
> It is like witchcraft seeing someone produce VGA out with some raw/more tangible chips.

I think what you're missing is: VGA was designed in the era when this was A Thing™. Monochrome/NTSC/CGA/EGA/VGA displays are all about "bit banging," sending signals at the right time. If you can toggle 1's and 0's faster than the analog receiver can respond, you can "fake" intermediate voltage levels. I say "fake" because that was actually a way to do it before digital-to-analog converters were easy to implement. Today we can easily produce custom chips for the purpose; "in the beginning," though, it really was all about timing.

The witchcraft for me was that while older cards used bit-banging to get signals out the door, they were generally designed for a specific purpose (and thus specific timings). If you can get access to the underlying timing control, it [opens a whole new world that will surprise people today](https://www.youtube.com/watch?v=-xJZ9I4iqg8).
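
To make the "it's all timing" point concrete, here's a rough sketch of bit-banging one 640x480@60Hz VGA scanline. gpio_write() and delay_ns() are hypothetical stand-ins for whatever port I/O and delay primitives your hardware provides; real code would count cycles instead of calling a delay routine.

    // Hedged sketch: one 640x480@60Hz VGA scanline, bit-banged.
    // gpio_write() and delay_ns() are hypothetical platform primitives.
    extern void gpio_write(int pin, int level);
    extern void delay_ns(int ns);

    const int HSYNC_PIN = 0;  // hypothetical pin assignments
    const int RGB_PIN   = 1;

    // One scanline is 800 pixel clocks at 25.175 MHz = 31.78 us total.
    void scanline(const bool pixels[640]) {
        gpio_write(HSYNC_PIN, 0);        // sync pulse: 96 px = 3.81 us (active low)
        delay_ns(3813);
        gpio_write(HSYNC_PIN, 1);
        delay_ns(1907);                  // back porch: 48 px = 1.91 us
        for (int x = 0; x < 640; x++) {  // active video: 640 px = 25.42 us
            gpio_write(RGB_PIN, pixels[x]);
            delay_ns(40);                // ~one 25.175 MHz pixel clock
        }
        gpio_write(RGB_PIN, 0);
        delay_ns(636);                   // front porch: 16 px = 0.64 us
    }
    // Do this 525 times per frame (480 visible lines plus 45 blanked,
    // two of which carry the vsync pulse) and a VGA monitor will lock on.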






Display controllers from the 8-bit era were conceptually simple but had a huge parts count, particularly because they need memory-access logic very similar to what is in the microprocessor. The earliest home computers (TRS-80 Model I, Apple II) had a large parts count, which was reduced in the next generation (TRS-80 Color Computer, VIC-20) because the glue logic and display controllers got the same LSI [1] treatment as the CPU.

People who build modern real-hardware fantasy computers [2] struggle with the cost of the display controller if it is done in an authentic style, so they wind up using an FPGA or a microcontroller (amazingly easy to do with an ESP32 [3]).
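
For reference, driving VGA from an ESP32 with FabGL [3] comes down to very little code. This is a rough sketch from memory of FabGL's Arduino examples, so treat the exact class, method, and constant names as approximate:

    #include "fabgl.h"

    fabgl::VGAController display;     // bit-bangs VGA using the ESP32's DMA hardware
    fabgl::Canvas canvas(&display);

    void setup() {
        display.begin();              // default GPIO-to-VGA pin assignment
        // Note: the "resolution" is literally a modeline, i.e. raw timings.
        display.setResolution(VGA_640x480_60Hz);
        canvas.setPenColor(Color::BrightGreen);
        canvas.drawText(10, 10, "Hello, VGA");
    }

    void loop() {
    }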

This thing addresses the problem by reusing many parts between the CPU and the display controller; the contrast is also less stark because the CPU's own part count is greater than 1, unlike in the typical retrocomputer.

It's fascinating! It's a minicomputer in the sense that it is built out of low-integration parts, but it is like a microcomputer in important ways, particularly in having a closely integrated display controller.

[1] https://vaibhav-pawale19.medium.com/integrated-circuits-ssi-...

[2] http://www.commanderx16.com/

[3] https://github.com/fdivitto/FabGL


Historically, the best example of going it alone without display hardware was the ZX80.

https://www.8bity.cz/files/zx80_schema.pdf

VIDEO comes out of the block of gates roughly in the center of the schematic.

SYNC comes out of the gates on the center right.
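
What those gates are doing, roughly: the ZX80 has no display controller at all. The Z80 itself "executes" the screen memory; the gates jam NOP onto the data bus so the CPU marches on, while the character code it fetched, the I register, and a 3-bit line counter address the character ROM during the refresh half of the same M1 cycle, and a shift register serialises the glyph row. A loose C++ paraphrase (address wiring simplified; my recollection, not the actual netlist):

    #include <cstdint>

    extern uint8_t char_rom[0x10000];       // the ROM, glyph bitmaps included
    extern uint8_t display_file[32 * 24];   // the screen memory the CPU "runs"
    extern void shift_out(uint8_t bits);    // hypothetical: shift register -> video

    // Generate the 256 pixels of one text row's scanline (line = 0..7).
    void scan_text_row(uint8_t i_reg, int row, int line) {
        for (int col = 0; col < 32; col++) {
            uint8_t code = display_file[row * 32 + col]; // opcode fetch; bus forced to NOP
            // Refresh cycle: I register supplies the high address bits,
            // the character code and line counter the low ones.
            uint16_t addr = (uint16_t(i_reg) << 8) | (code << 3) | line;
            shift_out(char_rom[addr]);                   // 8 pixels at the dot clock
        }
    }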

And, of course, the ZX81 merged those logic chips into one ULA.

There is also a kit that goes the other way for the ZX Spectrum, splitting the ULA logic back out into discrete chips.



That demo looked cool; it would probably be more impressive if I had grown up with that kind of tech.

I wonder how simple an HDMI-to-VGA adapter is: direct connections, or some chip in the middle?

Yeah, for end-of-days scenarios, being able to go back to VGA or those analog plugs (e.g. the red/white/yellow composite set) would be good.


You need a chip for VGA->HDMI, but they exist, and you can buy simple adapters. I think HDMI->VGA adapters might be cheaper (I have one in a drawer somewhere). One of the trickier points with HDMI is that sinks are stricter about what they call a valid image and make assumptions like "all your pixels are the same width."

A CRT can make do with signals that say "go to the next line now" and "go back to the top now," and then it just outputs whatever is coming in on the colour signal. That really means there is no concept of a display mode; it's all just in the timing of the signals on the wires. Plenty of modern hardware with digital internals looks at a lot of that and just says, "that's not normal, so I quit."

Analog devices may make a high-pitched whine and then explode, but at least they'll attempt the task they have been given.
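
To put numbers on "it's all just in the timing": the classic ~15.7 kHz "240p" signal that old game consoles produced is nothing but a handful of counts, sketched below. A CRT locks onto whatever sync arrives; a typical digital sink compares the numbers against its list of blessed modes and gives up. (Ballpark figures, not any one device's exact timings.)

    // A display "mode" is just timing numbers.
    struct Timing {
        double pixel_clock_mhz;
        int h_active, h_front, h_sync, h_back;   // in pixels
        int v_active, v_front, v_sync, v_back;   // in lines
    };

    // ~15.75 kHz line rate, ~60 Hz frame rate ("240p"):
    // 6.3 MHz / 400 pixels per line = 15.75 kHz; 15.75 kHz / 262 lines = ~60.1 Hz.
    const Timing ntsc_240p = {
        6.3,
        320, 16, 32, 32,   // 400 pixel clocks per line
        240, 4, 3, 15      // 262 lines per frame
    };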



