> Alternative theory: there will continue to be bad programmers, but there will also be good money in compilers that can deal with it and turn crap into optimized goodness. Midas compilers.
No amount of compiler optimization is going to fix algorithmic failures.
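A minimal sketch of why (function names are made up for illustration): both functions below give the same answer, but turning the first into the second is an algorithmic change, not a mechanical one. A compiler can make each comparison faster; it can't change how many comparisons the algorithm does.

```python
def has_duplicate_quadratic(items):
    # O(n^2): compares every pair of elements. An optimizer can speed
    # up each comparison, but not the number of comparisons.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(n): a different algorithm entirely (a hash set), not a
    # better-compiled version of the loop above.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

# Same answers, different complexity class.
for data in ([1, 2, 3], [1, 2, 1], []):
    assert has_duplicate_quadratic(data) == has_duplicate_linear(data)
```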
This has already happened with SQL databases. Performance tuning a database can be messy because sometimes you have to outsmart the query optimizer. But in the average case it does the Right Thing without you having to think about the engineering that has gone into it.
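To make that concrete, here's one particular optimizer (SQLite's planner) doing the Right Thing: the same declarative query is answered by a full table scan or an index search depending on what the planner finds available, with no change to the SQL itself. Table and index names are invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (id INTEGER, name TEXT)")
con.executemany("INSERT INTO users VALUES (?, ?)",
                [(i, f"user{i}") for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail);
    # the human-readable plan is in the last column.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT name FROM users WHERE id = 500"
before = plan(query)  # no index yet: the planner falls back to a scan

con.execute("CREATE INDEX idx_users_id ON users (id)")
after = plan(query)   # identical SQL, but now the planner uses the index

print(before)
print(after)
```

The query text never changed; the "crap into optimized goodness" step happened inside the black box.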
Demand for new data models is insatiable, and around each one a software ecosystem can develop with a similar degree of automation, tuned to its specific domain. It's a very, very black-boxed future.
Depends on what limits you place on the definition of 'compiler'.
Optimizing via better algorithmic choices is a semi-deterministic process performed by a human. Why can't a computer do it too?
Code is just your way of telling the computer your intentions. Once it can understand those deeply, well, it can choose a better way to achieve those intentions.