I've recently been noticing that G Suite apps in Firefox tabs routinely grab more than 500MB of RAM; it's often more than 1GB just to have a Gmail window open. Just a few years ago this wasn't the case, and functionality hasn't improved to match. Does anyone have insight into how we got to a place where a Google Sheet with four entries can use more memory than the entire OS? I used to perform identical tasks on far weaker hardware, and faster, so how do these broad, regressive changes happen at the ground level?
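(For anyone who wants to sanity-check numbers like these: Firefox's about:memory and about:processes pages break usage down per site, and a quick script can total the browser's resident memory. A rough sketch using Python's psutil — the process-name matching and MiB math are just illustrative:)

    # Rough total of Firefox's resident memory across all of its processes.
    # Per-tab detail lives in Firefox's own about:memory / about:processes pages.
    import psutil

    total_rss = 0
    for p in psutil.process_iter(['name', 'memory_info']):
        name = (p.info['name'] or '').lower()
        mem = p.info['memory_info']
        if 'firefox' in name and mem:
            total_rss += mem.rss

    print(f"Firefox total RSS: {total_rss / 2**20:.0f} MiB")

about:memory will also attribute allocations to individual sites, which is usually the more useful view than the raw total.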
I blame the Software as a Service model. Clearly, you should just downgrade to the old version of Google Docs (or whatever) that used 10% as many resources and had a better UI. Except, of course, you can't:
Since you don't get to control which version you use, teams don't have to compete with last year's version. Their management therefore doesn't need to worry about serious regressions in performance or functionality, so nobody takes action to avoid them.
And since the service provider is also the development team, you're locked in and won't switch.
Usage metrics look good, management gets credit for shipping ${feature} and gets promoted. Repeat that a thousand times, and everyone ends up wishing the irreparably broken ${megacorp} would just go out of business already.
This pattern doesn’t play out to the same extent with hardware, where people have to pay big lump sums to upgrade, and can hold on to older models / switch vendors instead.