
Thanks. Very helpful.

One of the wild things to me is how incredibly elaborate chipmaking has become over the years. Per Wikipedia, the 6502's layout was made with "a very manual process done with color pencils and vellum paper". That's the processor that launched the personal computing revolution, powering the Apple II, Commodore PET and VIC-20, the Acorn, and the BBC Micro. Nearly 50 years later, things have gotten so fiendishly complex that "flip it over and put some stuff on the back" is a major industry change requiring who knows how many billions in R&D.



Chip making is the first industry that was automated by computers. Much like how steam engines were first used in coal mines.


The discussion around steam engines is generally that coal mines were the appropriate place because early steam engines were very inefficient and transport was poor, so they could only really go in places with a plentiful local supply of coal. (There is also a requirement that coal be in sufficient demand for the steam engines to be worth it. To some extent, poor transport meant that a lot of the price of coal was transport cost, so coal at the mine head was cheaper for the steam engine since it didn't include that transport.)

I don’t really think chip making is much like that. I wouldn’t say that chip making was the first industry to be automated by computers either.


Not making nuclear weapons or banking?


"Chip making is automated."

Is not a true statement.


“Automated” != “Fully Automated”/“Autonomous” IMO. I took one class on this years ago and didn’t pay a lot of attention so disregard me if I’m super off base, but silicon does rely heavily on computer-aided design, no? I think a more charitable reading of the parent in that light makes it an actually quite insightful comment.


I can't think of a single aspect of chipmaking that is not automated to some degree. Manufacturing of semiconductors is probably the most automated thing that humans have ever managed to do in the physical world.


I mean, steam engines didn't fully automate mining either, but I still get his point.


Chip making is in a really sad state of automation. There's lots of money poured into very ingenious products, but the major incumbents have no real incentive to standardize, so you still have unstructured file formats (SPICE) with different dialects that are very hard to parse correctly, you have ad-hoc APIs that vary by vendor and software version and all try to capture you into their ecosystem, and you have half-assed attempts at standards (à la xkcd/927) that you can't rely on.
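To make the dialect problem concrete, here's a minimal sketch (a hypothetical parser, not any real tool's API) of just parsing a single SPICE numeric value. Suffixes are case-insensitive, so "M" means milli and mega has to be spelled "MEG" -- a classic silent-error trap that tools handle inconsistently:

```python
# Scale suffixes from classic SPICE. Case-insensitive, so '1M' is
# 1 milliohm, not 1 megaohm -- mega must be written 'MEG'.
SUFFIXES = {
    "t": 1e12, "g": 1e9, "meg": 1e6, "k": 1e3,
    "m": 1e-3, "u": 1e-6, "n": 1e-9, "p": 1e-12, "f": 1e-15,
}

def parse_spice_value(token: str) -> float:
    """Parse a SPICE-style numeric token such as '4.7k' or '1MEG'."""
    token = token.strip().lower()
    # Check longest suffixes first so 'meg' is not misread as 'm'.
    for suffix in sorted(SUFFIXES, key=len, reverse=True):
        if token.endswith(suffix):
            return float(token[: -len(suffix)]) * SUFFIXES[suffix]
    return float(token)

print(parse_spice_value("4.7k"))   # 4700.0
print(parse_spice_value("1MEG"))   # 1000000.0
print(parse_spice_value("1M"))     # 0.001 -- milli, not mega!
```

And that's before you get to the parts that genuinely differ between dialects: inline comment characters, continuation lines, expression syntax in `.param` statements, and so on.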

And this sad state of affairs shows no sign of evolving favorably. Closed-source software and corporate interests at their finest.


Also just the state of highly sophisticated but niche software (ecosystems) in general. It usually winds up super messy, held together just enough to function for its small number of users.


Why would better standards be “better”, beyond just platonic affection for more standard things?


Wow so we still design CPUs on silk screens by hand?


I'm writing this comment on a computer, not with pencil and paper. Does that mean it was automated?


Have you ever published anything without using a computer? Comparing that process versus this, I think it's obvious the answer is yes.


Somewhat, IMO. For example, did you use spellchecking or autocorrect? (Where ‘use’ may mean you spent a tiny bit less time/attention on spelling, trusting your spellchecker to add wiggly lines to words you might misspell, even if you don’t make any typos.)


Are you nailing it to a church door when you're done?



