An artist named John Whitney made the world's first computer motion graphics in the 1950s using an analog computer that was converted from a WWII anti-aircraft gun control system. The results looked like this:
In the 1960s IBM hired him to work on digital computer animation concepts, and he gave up on the analog system. But it's interesting how the technical complexity of the work dipped for years while he had to wait for IBM's general-purpose digital systems to catch up with the capabilities of the analog computer setup.
I read a book[0] a few years ago that spends some time on the rise and fall of analog computing. Of course in the early electrical computer era it was not obvious that digital computing would be the best way forward.
Digital modeling is a pretty block-headed way to go about building a simulation. And when computers were slow and tubes and memory were very expensive, it seemed even more block-headed. The main advantage was flexibility, so as digital computers got faster and cheaper they ate more and more domains where analog computers might have seemed better suited.
I love German engineering jargon. They (correctly) wrote Moduln instead of the 'Module' that is so common nowadays. Mutterplatine (motherboard) feels forced, but I'll let it pass. Some words for which there are no direct equivalents they took from English (like Strobe); some are just as bad and contrived as the English word (Aussprung for Escape).
It is very weird. But in the old days you got books translated into German.
I've read most of my programming books in English, as most of the online resources were in English. Then I read a German book about C++ that kept talking about "Vorlagen". I didn't know what they were talking about until I used a dictionary and found out that "Vorlagen" are "templates". It was pretty funny that it was easier for me to read programming books in English than in my mother tongue.
The same goes for me - I can say I learned English by reading tons of documentation and tutorials over the years. And because of the strange, otherworldly translations I also set the language of all my systems to English. If I ever hit a problem, I have the right words to google for right in front of me and can benefit from a far larger pool of solutions.
Yes! The German contribution to computing history gets overlooked. History is written by the winners, so the history of computing is American, and we Brits get miffed when Turing, Bletchley Park, the Colossus, and Atlas at Manchester are ignored. There should be more recognition for Konrad Zuse [1].
Speaking of German technical jargon, one thing that bothered me probably more than it should have was that in my Norwegian-German dictionary, "computer" ("datamaskin" in Norwegian) was translated as "Festplatte". Obviously "Festplatte" means "hard drive", not "computer".
Analog computers perform a few specific operations at full speed; doing the same thing digitally takes quite a few components, all limited to a specific clock rate. Combining them with digital computers that feed them problems and measure the answers can be quite powerful. See especially the math coprocessor at the first link. The neural simulation at the second is illustrative: one wafer (300+ chips) of analog NNs took 294,912 cores to simulate, and that's full-custom, high-end digital hardware rather than average stuff. Analog also gets used a lot in mixed-signal ASICs, where specific functions are cheaper or lower power per unit in analog, so they mix digital and analog.
EDIT to add: The paper below is a short history of them that gives examples whose benefits are consistent with other stuff I linked to. Some really cool stuff there, too. :)
The question is the level of simulation. It's common to use a large number of computers to run a VLSI simulation before fabbing a chip. Nvidia has been known to use 50 racks of 1U servers to run a gate-level simulation of a CPU. The machines are running C code generated from VHDL. You could do the job of the GPU with far less CPU power; a gate-level simulation executes multiple host instructions for each bit transition in a gate.
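To make the "multiple instructions per bit transition" point concrete, here's a toy sketch of what code generated from an HDL looks like conceptually. The structure and names are made up for illustration, not anything from Nvidia's actual flow: every net becomes a variable, and one simulated cycle re-evaluates every gate with a handful of host instructions.

    #include <cstdio>
    #include <cstdint>

    // One simulated net per variable; one "cycle" re-evaluates every gate.
    struct Nets {
      uint8_t a, b, sum, carry;
    };

    // Conceptual stand-in for generated code: each gate becomes a line of
    // host code, i.e. several machine instructions per bit transition.
    static void eval_half_adder(Nets &n) {
      n.sum   = n.a ^ n.b;   // XOR gate
      n.carry = n.a & n.b;   // AND gate
    }

    int main() {
      Nets n{1, 1, 0, 0};
      eval_half_adder(n);
      std::printf("sum=%d carry=%d\n", n.sum, n.carry);  // expect sum=0 carry=1
    }

Scale that up to every gate in a modern chip, evaluated every simulated clock edge, and the 50 racks stop sounding excessive.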
The level of detail is part of it. The inherent speed of analog over a large range of numbers is the other part. I recall early efforts to model analog circuits (even something as simple as digital gates at the transistor level) overwhelming the computer. One circuit with 9 transistors took over 50,000 equations or something to check. The analog circuit just did that much.
This is only true when they're doing the things they directly implement, like filters or differential equations. Then they run faster at a fraction of the size and power.
A modern GPU (10 TFlops) can simulate a million nodes at 1 MHz, far more than any analog computer. The article was written in 2005, when that might still have been true.
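For what it's worth, the arithmetic behind that claim is just a back-of-the-envelope budget per node update (numbers taken straight from the comment above, nothing else assumed):

    #include <cstdio>

    int main() {
      const double flops = 10e12;   // 10 TFLOPS, as quoted above
      const double nodes = 1e6;     // one million simulated nodes
      const double rate  = 1e6;     // each updated at 1 MHz
      // About 10 floating-point operations per node per update.
      std::printf("FLOPs per node per update: %.1f\n", flops / (nodes * rate));
    }

Whether 10 FLOPs per update is enough obviously depends on how much each "node" is supposed to do.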
I call BS. I can't believe the claims they're making about this chip: 50 times better battery life and a desktop-level GPU in a button. I'm sure about that.
"I am quite desperately looking for an EPROM programmer capable of reading and writing 2708 type EPROMs since I am afraid that the 2708 EPROMs on this board will loose their contents - the are already about 20 years old."
It's not very hard to build your own programmer/verifier (reader) ... if you can't find schematics then message me. But this is very important: those EPROMs are erased by shining UV light through the window in the package ... get some UV-opaque material over those windows immediately and stop leaving them in the light.
In the meantime, I'll dig around in my basement and see if I still have the correct programmer for that part (I vaguely remember using it to restore my COSMAC ELF to operation).
I used to use 2708 and 2716 EPROMs for 6502 embedded controller work in the early 80s. The programmer was just a couple of buffers and half a dozen discrete components plugged into an Apple II.
Unfortunately I don't have the circuit, but it was a trivial thing to put together and was printed in one of the many hardware magazines of the time; perhaps someone else remembers it.
It should be pretty easy to do now with an Arduino or similar. The only odd thing you need is a relatively high voltage for the actual write (it was a long time ago, but I seem to remember 30V).
A reader would be easy to do from a microcontroller with 20 lines of code. The writer part, with the weird voltage and UV regimen, can wait until you need to make a copy.
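As a rough illustration of how little code the reader side needs, here's a hedged Arduino-style C++ sketch that dumps a 2708 (1K x 8) over serial. The pin numbers are arbitrary placeholders, it assumes a board with enough free I/O pins (e.g. a Mega), and it assumes the chip is already on its proper supply rails - the 2708 needs +5 V, -5 V and +12 V, unlike later single-supply parts.

    #include <Arduino.h>

    const int ADDR_PINS[10] = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11};  // A0..A9
    const int DATA_PINS[8]  = {22, 23, 24, 25, 26, 27, 28, 29};  // D0..D7
    const int CS_PIN = 30;                                       // chip select, active low

    void setup() {
      Serial.begin(9600);
      for (int i = 0; i < 10; i++) pinMode(ADDR_PINS[i], OUTPUT);
      for (int i = 0; i < 8; i++)  pinMode(DATA_PINS[i], INPUT);
      pinMode(CS_PIN, OUTPUT);
      digitalWrite(CS_PIN, LOW);         // keep the chip selected for reading

      for (unsigned int addr = 0; addr < 1024; addr++) {
        for (int i = 0; i < 10; i++)     // put the address on A0..A9
          digitalWrite(ADDR_PINS[i], (addr >> i) & 1);
        delayMicroseconds(5);            // 2708 access time is well under this
        byte value = 0;
        for (int i = 0; i < 8; i++)      // read D0..D7
          value |= digitalRead(DATA_PINS[i]) << i;
        Serial.println(value, HEX);      // dump one byte per line
      }
    }

    void loop() {}

A shift register or two on the address lines would let this fit on a smaller board, but the idea is the same: the read side is trivial compared to the write side.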
I don't know if there is one for the 2708, but there were some functionally compatible EPROM replacements that used more modern technology.
I used EAI's analog computers when getting my EE bachelor's degree. It was quite interesting; we even had a hybrid model that had the professors scratching their heads to make it work, but the purely analog one was great for simulating integrals and such.
Fuzzy logic, for one thing, but usually you do not use analogue computers for digital tasks, since that's a very inefficient use of their capabilities. Instead, they can be used to directly calculate differential and integral equations, "simulate" analogue filters of all kinds, and so on. This works by recreating the system in question using analogue integrators, differentiators, amplifiers, log converters, etc.
Input variables and output variables become voltages.
This is very different from digital, quantized computers. It's also extremely useful, because a whole lot of things can be described by differential equations, and analogue computers have no FLOPS limit, only noise (and dynamic range) and bandwidth limits.
PS: There were also mechanical analogue computers (using gears but also more complex constructions) and hydraulics based computers (but I think these were not used practically, just for demonstration).
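If it helps to see the "recreate the system out of integrators and summers" idea concretely, here is a hedged sketch of the patch for a damped oscillator, with a crude forward-Euler loop standing in for the continuous integrators. The real analog machine integrates continuously and in parallel; the coefficients here are arbitrary "potentiometer settings" chosen just for illustration.

    #include <cstdio>

    int main() {
      const double k = 1.0, c = 0.2;   // pot settings: stiffness and damping
      const double dt = 0.001;         // digital stand-in for continuous time
      double x = 1.0, v = 0.0;         // integrator outputs: position and velocity

      for (int i = 0; i < 10000; i++) {
        double a = -k * x - c * v;     // summer: x'' = -k*x - c*v
        v += a * dt;                   // first integrator: v = integral of a
        x += v * dt;                   // second integrator: x = integral of v
        if (i % 1000 == 0) std::printf("t=%.1f x=%.3f\n", i * dt, x);
      }
    }

On the analog machine the two integrators, the summer, and the coefficient pots are physical blocks wired together, and x and v are literally voltages you can watch on a scope.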
My understanding is that all electronics are essentially analog, but we make them digital by assigning digital meanings to arbitrary thresholds (e.g. everything below the threshold is a 0 and everything above it is a 1).
It's not that straightforward ... the components in these analog computers have very specific characteristics over their operating range. Some may be very accurately linear, some may be logarithmic, etc.
Digital components, on the other hand, are purposely non-linear in a way that causes the gates to be definitely a zero or a one. Some parts include Schmitt-trigger inputs so that there is hysteresis when input signals might be noisy.
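A quick sketch of what that hysteresis buys you, with made-up example thresholds for 5 V logic: the output only flips when the input crosses the upper or lower threshold, so a noisy signal wandering around the middle doesn't make it chatter.

    #include <cstdio>

    int main() {
      const double V_HIGH = 3.0, V_LOW = 2.0;  // upper/lower switching thresholds
      int out = 0;                             // current logic state
      double samples[] = {0.5, 2.4, 2.6, 3.1, 2.7, 2.2, 1.8, 2.4, 3.2};

      for (double v : samples) {
        if (!out && v > V_HIGH) out = 1;       // only flip high above the upper threshold
        else if (out && v < V_LOW) out = 0;    // only flip low below the lower threshold
        std::printf("in=%.1f out=%d\n", v, out);
      }
    }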
One interesting fact about 4000-series CMOS parts - you can add a feedback resistor across an inverter and force the part into a narrow analog range. It's not good enough for audio or anything but you can make an RC oscillator, threshold detector, etc.
Not "good enough for audio" in the sense of being musically pleasant, but you can make a dead simple audible square wave oscillator out of a schmitt trigger inverter. Great project for kids or anyone getting started with a breadboard.
I've seen this view come up before on HN, with the same degree of confidence. I don't know enough about QC to discuss it myself, but from a quick search it seems that quantum computers are generally considered digital:
https://youtu.be/TbV7loKp69s