
The article is factually correct, but I feel that the analysis is missing some of the historical context. Here's a quite probably strained analogy to cosmology:

Microprocessors and CMOS upended the computer industry, to the extent that a few years later, all the big companies in computing, up to that point, were in precipitous decline (even IBM, which embraced the new world order with the PC, only delayed this reckoning.)

In those days, microprocessors and DRAM alone were the cutting edge of technology, and they opened up all sorts of possibilities (though this also depended on some additional special-purpose equipment, notably for graphics and networking.)

One might draw an analogy here to the Big Bang. What happened next was like a period of cosmic inflation, in which Moore's law and Dennard scaling (plus fiber optics and the wiring of the world) created exponential growth. As in cosmology, we end up with a much bigger but quite uniform universe, in this case because the growth of the basic technology alone was enough fuel for all the innovation we could come up with.

It was only after inflation that the universe became really interesting. It is still overwhelmingly hydrogen, but it has differentiated: there are also galaxies, stars, planets and people.

So, the long-delayed conclusion to this analogy: innovation in computing is now more mature, but it is not shrinking - it has diversified. There are still 10^n (for some large n) 8- and 16-bit microprocessors being made, and there are still hobbyists doing clever things with what is now basic, even primitive, hardware that, not so long ago, was unattainable. But we also have emerging technologies (machine translation and autonomous vehicles, for example) that quite a few people, not so long ago, assumed would be forever beyond the capabilities of mere machines.



I had a similar reaction to the article, and I appreciate your cosmic analogy.

The metaphor that came to my mind was the long persistence of steam power after electric motors were invented. Initially, 'going electric' meant replacing your one giant steam boiler with one giant electric motor, and doing everything else the same way. The factory remained organized around the drive shafts and pulleys and mechanical distribution systems. There was little advantage. It took decades of experimentation to gain the insights of how many small motors and task lighting could allow a factory to be optimized for task flow, not power distribution. And then it took longer for those insights to diffuse through slow human networks.

Hardware has developed so fast that software had no time or incentive to mature. Each decade's tech just gets ossified into the stack, because hardware was making the stack faster at a better rate than human insight could improve it.

Every few decades, we get lucky and an invention arrives when the ecosystem can use it... so we get compilers and SQL and automated tests. But other insights, like immutable data structures, just stay niche.

I hope that as hardware progress flattens, opportunities emerge for better software paradigms. Maybe this will coincide with the craft of software becoming introspective about its myriad social issues.



