> The second is that I think the saying 'premature optimization is the root of all evil' is the root of all evil.
The greater evil is putting a one-sentence quote out of context:
"""
There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.
Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified. It is often a mistake to make a priori judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail. After working with such tools for seven years, I've become convinced that all compilers written from now on should be designed to provide all programmers with feedback indicating what parts of their programs are costing the most; indeed, this feedback should be supplied automatically unless it has been specifically turned off.
"""
Indeed, but I think even that advice, with context, is pretty debatable. Obviously one should prioritize critical sections, but completely ignoring those "small efficiencies" is certainly a big part of how we got to where we are today in software performance. A 10% jump in performance is huge, and it makes little difference whether it comes from a single 10% optimization or from a hundred 0.1% optimizations that compound to roughly the same gain.
To take a concrete example from Unreal Engine: they actually created a caching system for converting between a quaternion and a rotator (Euler rotation)! That conversion is never, in a million years, going to be anywhere near a bottleneck on its own; it's quite cheap on modern hardware, so the caching system probably buys the engine one of those 0.1% boosts at best. But there are literally thousands of these "small efficiencies" spread all throughout the code, and together they yield a final product that runs dramatically better than comparable engines.
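The general technique is easy to sketch. The following is a minimal illustration, not Unreal's actual implementation: the `Quat`, `Euler`, and `CachedConverter` names are hypothetical stand-ins (UE's real types are `FQuat` and `FRotator`), and the cache here simply remembers the last input so a repeated conversion skips the trigonometry.

```cpp
#include <cmath>

// Hypothetical minimal types; stand-ins for UE's FQuat and FRotator.
struct Quat {
    float x, y, z, w;
    bool operator==(const Quat& o) const {
        return x == o.x && y == o.y && z == o.z && w == o.w;
    }
};
struct Euler { float pitch, yaw, roll; };  // radians

// The plain conversion (ZYX convention) that the cache wraps.
Euler ToEuler(const Quat& q) {
    Euler e;
    e.roll  = std::atan2(2 * (q.w * q.x + q.y * q.z),
                         1 - 2 * (q.x * q.x + q.y * q.y));
    // Clamp to avoid NaN from asin when slightly out of [-1, 1].
    float s = 2 * (q.w * q.y - q.z * q.x);
    e.pitch = std::asin(std::fmax(-1.0f, std::fmin(1.0f, s)));
    e.yaw   = std::atan2(2 * (q.w * q.z + q.x * q.y),
                         1 - 2 * (q.y * q.y + q.z * q.z));
    return e;
}

// One-entry cache: recompute only when the quaternion actually changed.
struct CachedConverter {
    Quat  lastQ{0, 0, 0, 1};
    Euler lastE = ToEuler(lastQ);
    Euler convert(const Quat& q) {
        if (!(q == lastQ)) {
            lastQ = q;
            lastE = ToEuler(q);
        }
        return lastE;  // cached path: no trig calls
    }
};
```

In an engine, many systems query a component's rotation every frame while the underlying quaternion changes rarely, which is exactly the access pattern that makes a one-entry cache like this pay off.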