I’m always a little sad that I missed the golden age of computing. I read my first Byte Magazine in the mid 1990s, a couple of years before they stopped publishing. Even then there was a ton of substance about the upcoming Windows 95 release, various detours in OS design that never panned out like Taligent, and how RISC was going to take over the world. It’s rare to read articles of that depth anymore, even on the web much less in published media.
Oh! Don't feel too bad. While the late 70s and early 80s were a great time to be alive (and interested in small computers), there was still that nagging feeling that if you could only get a system with just 64k more memory, you could do WONDERFUL things, instead of the mundane things you were working on.
But I will say... the thing that made it interesting to me in retrospect is none of us knew what we were doing. I mean sure, we could write code, but we wrote it in languages that let us shoot ourselves in the feet. Repeatedly. The first time I saw Smalltalk was like a religious experience. The day I realized why I couldn't use Smalltalk for "real" systems was like losing my religion.
On the hardware side... it seemed there was a new peripheral every week: 3" floppies, 2.8" floppies, 3.5" floppies, hey! an affordable hard drive!, voice recognition, head mounted displays, etc.
No one seemed to have figured out what the market would want to buy and certainly not how to make it at a profit.
And to me... that spirit of adventure... the "hey, let's just try this new thing and see if it works," was what characterized that era.
And you, right now, can have the same experience. Just go out and learn about something halfway new (like how to use LISP in a modern web stack; okay, maybe it's not that new) and try to find a group that's experimental.
The hardware and software tools we have now are INSANELY better than when I was a kid. Approach the market with that "beginners mind" and it'll be great.
And from the description of the timing of your arrival in computing, you're probably senior enough that people will need to take you seriously.
>Oh! Don't feel too bad. While the late 70s and early 80s were a great time to be alive (and interested in small computers), there was still that nagging feeling that if you could only get a system with just 64k more memory, you could do WONDERFUL things, instead of the mundane things you were working on.
Which eventually turned out to be false. Instead we got Slack, Facebook, and Electron apps using those gigabytes for things one could do on a PDP-11.
> ...there was still that nagging feeling that if you could only get a system with just 64k more memory, you could do WONDERFUL things, instead of the mundane things you were working on.
To me, that feeling only really went away in very recent times, when common desktop platforms stopped being starved for RAM. Before we could have 2GB+ RAM on our computers and not even think about it, it really was the case that adding more would make for a "wonderful" experience; there's just no comparison between running with free RAM vs. without! The transition to 64-bit compute everywhere also helped here, of course - no more address space constraints.
On the peripherals side, USB was a huge advance in retrospect. It did come at a bit of a price, in that every peripheral now has to run a fairly complex microcontroller just to deal with the high-level communication protocol (and in turn this led to the rediscovery of GPIO as a "thing" - it used to be the case that GPIO was how you did device interconnect!)
The next real advance will probably be software that actually uses heavily-multicore systems and even GPU compute effectively, for general-purpose tasks, not niche special cases. Basically an equivalent to Firefox Quantum, for everything else that we do on our machines.
(Oh, and of course we're still waiting for a mobile platform that runs a genuine general-purpose OS. But hopefully we'll solve that shortly anyway, thanks to efforts like Purism and pmOS. And even the ubiquity of "smart" mobile hardware is in fact somewhat recent. This is linked to the upcoming advance I was just discussing, because power-efficient platforms like mobile are big on multicore compute and the use of GPU.)
Added: I just saw a sibling reply that talked about the crappiness of Windows 10 and recent OS X versions as proof that hardware innovation is dead, and a reply that in turn blamed the end of Moore's law. I think that both are missing the point quite substantially! Linux works just as well as it always did, and surely it should be the real standard if we're talking about innovation!
Personally, I think there was nothing golden-age about that time. I stuck with Physics. Monthly breakthroughs, yes. The giants whose shoulders we stand on, walking the earth, yes. But opportunities and platforms were minuscule compared to now. IBM’ers sneered at everyone else. This is the golden age of computing.
I disagree. In 1995, I picked up an issue of BYTE and marveled at the coverage of microkernel operating systems and RISC processors. New and exciting things were on the horizon! Now, I see only decadence. Hardware has stagnated for a decade. Software has less functionality while using exponentially more resources. Compare the Windows 10 Settings app to the Control Panel, which debuted with Windows 95. Or the OneNote UWP app, which is now discontinued, to previous versions of OneNote. Key functionalities have gotten worse. Chrome's PDF handling is so awful that I spent weeks searching for a PDF viewer that's as good as Acrobat 10. To my shock, everything was markedly worse--slower, less reliable searching, etc. I finally installed PDF XChange Editor and it was like a breath of fresh air. Not only does it search PDFs correctly, but it highlights the results and shows indicators of hits on the scrollbar! (Apps these days don't even have scrollbars.)
Apple has given up on the Mac, and now it only gets iOS hand-me-downs. One of my great regrets is that I didn't get my first Mac until 2007, and missed out on a big part of the Mac's heyday. I remember when each new release of OS X came out, I'd carefully read John Siracusa's intensive review on Ars Technica. The PDF imaging model in OS X 10.0, hardware accelerated compositing in 10.2, fine-grained locking in 10.4. That was 15 years ago! Kids who were born then are stealing alcohol from their parents' liquor cabinets now and the only cool thing to happen to OS X in their lifetime is APFS.
There are highlights, no doubt. Rust is a breakthrough. Apple's Ax processors bring desktop computing power into impossibly small form-factors. Pervasive mobile broadband has enabled a lot of new applications. (But the best software stack for leveraging that capability is a direct descendant of NeXTSTEP.) But they're few and far between now.
There is a very simple reason for that, which I learned when I was in college: the end of Moore's law. During the 80s and 90s Moore's law was in full force, and every few months hardware doubled in speed and halved in price. This was the heyday of computing! With the end of Moore's law we won't see that happening again anytime soon; the future is incremental improvements in computer architectures, unless we find something fundamental to fuel the next generation of computing.
You're gonna get the "lots of progress in hardware" part back soon, even without Moore's law :|
...but you're likely not going to like it: it will be a rain of exotic heterogeneous computing platforms first, with various ML/AI accelerators all tailored to specific areas, at first used in IoT and mobile, then everywhere else. Things like the NN accelerators you now see in some mobile chips and "secondary chips" like the T2, and whatever comes next, will become the dominant sources of computing power in systems, dwarfing the general-purpose-and-portable CPUs. And all software interacting with these will stop being portable in the way we know it, and as a result of network effects apps will no longer be portable across the many OS flavours we'll have.
The second wave will be when ML gets to the point where it is heavily used to design hardware: you'll have an explosion of hardware architectures with non-(human-brain-comprehensible) instruction sets (plus maybe even analog computing modules), to the point that compilers will likely not even be able to be coded by hand in "assembler"; we'll need evolved compiler-generators and compiler-generator-generators too, etc.
Progress is coming back, and it will accelerate... just that it will not be the ape-brain-comprehensible kind of progress we knew before...
Moore's law was that transistor density doubles roughly every two years. That 'law' has not ended (14nm -> 7nm, for example), it's just that the gains are being used for things other than clock speed.
It seems golden now because you have access to good information on how everything worked. Back then getting any information about the computer was difficult, and problems were very easy to get stuck on since there was nowhere to go for help.
Many people don’t realize this but a lot of the things we consider cutting edge have been available on mainframes for decades - the only real cutting edge part is that we can do it on a consumer’s budget now. Kubernetes for instance has no resource accounting/costing and security is obviously a bolt-on; it's probably going to take them another decade to get that to where IBM was forty years ago.
In the “golden” days you also spent a lot more effort on getting hardware and software from different vendors to work together. Mechanically, electrically, and in software. People used to joke how ironic it was that a PowerBook could talk to the alien mothership in Independence Day since Macs couldn’t communicate with anything on Earth - but that joke is no longer true, good job Apple.
Probably the last thing that differs from the “golden” days is the amount of pioneering that goes on. Today you are pretty much limited to whatever the various hardware consortiums feel like producing, or what Intel’s roadmap allows, etc. So even if there is some amazing alternative to the von Neumann architecture, you're unlikely to see it. The Soviets experimented with ternary-based computers for years and I would have loved to see that go further - unfortunately all that got killed off due to one guy’s indiscretion.
The golden age of computing is still accessible. Just dig out an old machine from the attic and get it up and running, then get the BYTE mags up on your modern screen, and have at it... there is still much to be learned from old computers.
Maybe not golden but at least direct. Very few layers between you and the machine, and to some extent infinite possibilities. On the other hand you could easily hit resource walls. OTOH, microcontrollers haven’t changed much since then - at least for me (8051, and similar). Still a great time!
If you think LISP is cool (here comes a shameless plug), I think you might be interested in Tree Notation (http://treenotation.org), which you could call LISP without parentheses. In the same way that a Tesla is a car without gas, LISP without parentheses can be drastically better. Imagine you get the power of LISP but you also get clean data, program synthesis, visual programming and more...
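To make the "without parentheses" claim concrete, here is a rough sketch of the core idea: indentation can carry the same nesting information that parentheses do. This is only an illustration in Python under my own assumptions; the parse_indented helper and the tiny +/* example are hypothetical and are not Tree Notation's actual syntax or implementation.

    # Sketch only: indentation encodes the nesting that parentheses
    # normally encode. Not actual Tree Notation syntax or code.
    def parse_indented(text):
        """Parse indentation-structured lines into nested lists."""
        root = []
        stack = [(-1, root)]  # (indent level, children list)
        for line in text.splitlines():
            if not line.strip():
                continue
            indent = len(line) - len(line.lstrip(" "))
            node = [line.strip()]
            # Pop back to the nearest ancestor with a smaller indent.
            while stack[-1][0] >= indent:
                stack.pop()
            stack[-1][1].append(node)
            stack.append((indent, node))
        return root

    source = "+\n 1\n *\n  2\n  3\n"   # the parenthesized form would be (+ 1 (* 2 3))
    print(parse_indented(source))      # [['+', ['1'], ['*', ['2'], ['3']]]]

The point of the sketch is just that the tree structure survives without a single bracket, which is what makes the "clean data" pitch plausible.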
I just went over a sample of the companies advertising in this issue. Some of them were acquired, most of them no longer exist, with the exception of Apple and Xerox.
It might not be surprising, after all it's been 40 years, but it is a strong testament to the fact that companies need to constantly evolve together with their markets. Being in the right place, at the right time, with the right product, is just the beginning.
Wow, it was that issue that inspired me to save every dime for a PC. Took me a couple years but eventually got an Osborne (!) and ordered a Lisp interpreter for it...I forget the name of the vendor.
Logo was always regarded as a "kids language" by association with schools and turtles. It was very much a niche even in the mid 1980s when I learned it. Of course it was in reality a fairly sophisticated LISP relative, but programming languages are driven by fashion rather than facts.
Logo was designed to be a children’s language, so the direction of causality between it being regarded as a “kid’s language” and its association with schools and turtles goes the other way!
It used a turtle because Wally Feurzeig, Cynthia Solomon, and Seymour Papert thought this was a good device for teaching about geometry, and Logo was associated with schools (and Lego!!!) because they were trying to change education.
Of course, you are also right in that once it has momentum along these lines, people tend to put it in a little box marked “children’s language” and not build on it in a more general-purpose language direction.
A few folks have tried, but like Squeak and Pharo and so many other dialects of Lisp and Smalltalk, it has never gained serious traction outside of its original purpose.
The demand for advertisements targeting readers interested in personal computers grew very quickly in 1979 and the early 1980s. Byte was able to capture much of the quickly growing advertising revenue because it took years for high-growth-capable competitors to Byte to get off the ground.
(Byte was mostly ads back then.)
PC Magazine started in Feb 1982. Its issues quickly became as thick or almost as thick and ad-filled as Byte's.
There were other magazines in the same market in the 1970s, e.g., Dr Dobb's Journal, but for some reason they never expanded their "portfolio" (terminology?) of advertisers the way that Byte did.
I guess the mainstream magazine publishers considered (probably rightly so at that stage of the industry) personal computers too niche to be worth their while. PC Magazine for example was started by an enthusiast, not a mainstream publisher.
80s "Personal Computer World" was pretty much the european equivalent to byte, and it was usually around 900 pages and 3/4" thick too - although most of that was adverts, naturally.
Also there weren’t so many alternatives back then for getting articles and ads out. So the release schedule probably made sense and it was possible to make some profit off magazines. (Not that much though I think.)
These days the dynamics have changed a lot. Today many ads and articles would be outdated if they were written many months before being published.
Looking at these old BYTE magazines makes me think that magazines these days simply look much better. Graphic design has gotten better. Not sure if this is just my subjective opinion but somehow I believe there must be some truth to that.
Holy cow, I like this old BYTE more. Like Computer Chronicles. I like how direct and no-bullshit the layout and all the graphics are. Ads are creative and fun, and the content is rich in ideas. Modern magazines often lack substance and soul, including in their graphic design. What modern computing magazine can compare, is there a single example?
Looking at this BYTE mag pretty much confirms to me the old adage that the Earth's population is increasing but the IQ levels are staying the same.
Graphic design was done by hand in those days. They literally cut and pasted images and text for the pages. My parents were both graphic designers and I watched them do this as a child before computers took over the industry. You can imagine how digital publishing tools made this process easier to iterate on.
They do look better, but I must say BYTE and friends packed one hell of a punch.
I've almost zero interest in magazines today, but I would still subscribe to BYTE, COMPUTE and a couple others, given their editorial focus was similar to what made them great then.
That is why I wrote, "what made them great", and that was true of BYTE for more than a year. Adding consumer material did not displace enough great material for a while.
On page 158 of the linked issue, it points out that they just started using digital typesetting. (I totally disagree with you, by the way; I like my content front and center, and my design well to the rear, which I think BYTE does very well indeed.)
What amazes me the most is that half of it is ads. The ads and the content seem to have very similar production values, which did muddle the reading a bit...
Ads of today are much more invasive. And even the articles themselves today are often ads in disguise or direct copies of press releases. Haven't read a computer magazine in at least a decade, but this I would read.
The laser magazines, Laser Focus World, and Photonics Spectra, were historically and are today plagued by the “ads in disguise” problem. They have gotten a little better.
They don't call it desktop publishing anymore because all publishing is digital. It would be like if we still referred to cars as "horseless carriages".
Which I would, by the way, totally love to do. It's kind of like the fad among some early- to mid-20th Century writers of using only words of Germanic derivation. I love quirky, old-fashioned language.
One more historical/nostalgic tidbit: if you flip to page 156, you have Carl Helmers (or possibly Gary Kildall) trying to describe a spreadsheet to readers. In 1979, the concept of a spreadsheet was still new...
Well, several magazines would be like that until the age of the internet. The 90s was the last time when we saw a flourishing magazine market. Nowadays the very few magazines that survive are nothing more than leaflets.
Awful? I think you might need to get your outrageometer checked.
I’m old enough to remember being lectured on how prudish we are in the States not to have topless women in ads like the enlightened Europeans and Japanese.
I looked through the pages and saw two ads that had a woman standing with the product. Seems reasonable and unoffensive, especially for 40 years ago and compared to other industries/magazines.