> A NES with composite compared to the AVS is a stunning difference in picture quality.
So one counter-intuitive complaint of many modern emulation systems is that they provide "pixel-perfect" rendering. I.e. what the console outputs is what you get on screen.
Except that's not how the games looked back then! The video artifacts of TVs at the time, composite signal and NTSC, meant you had color bleed and slightly blurry pixels that weren't even square. It's not just scanlines. For practical examples, see this Gamasutra article [1]. Most telling to me are the comparisons of Link (from Zelda 2) therein.
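To make the color-bleed claim concrete, here's a toy sketch (my own simplification, not anything the AVS or real NTSC decoders actually do): composite video carries chroma at much lower horizontal bandwidth than luma, so blurring only the chroma channels of an image smears a hard color edge while the brightness edge stays sharp.

```python
import numpy as np

# Standard-ish RGB <-> YIQ matrices (values approximate; YIQ is the NTSC
# luma/chroma space, with Y = luma and I/Q = chroma).
RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])
YIQ_TO_RGB = np.linalg.inv(RGB_TO_YIQ)

def composite_bleed(rgb, chroma_taps=7):
    """Low-pass only the chroma (I/Q) horizontally; luma (Y) stays sharp."""
    yiq = rgb @ RGB_TO_YIQ.T
    kernel = np.ones(chroma_taps) / chroma_taps
    for ch in (1, 2):  # I and Q only; Y (index 0) is untouched
        yiq[..., ch] = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, yiq[..., ch])
    return np.clip(yiq @ YIQ_TO_RGB.T, 0.0, 1.0)

# A hard red/blue vertical edge smears into purple near the boundary.
frame = np.zeros((8, 16, 3))
frame[:, :8, 0] = 1.0   # left half pure red
frame[:, 8:, 2] = 1.0   # right half pure blue
bled = composite_bleed(frame)
```

After the blur, pixels just left of the boundary pick up a visible blue component even though the source pixel is pure red, while pixels further away stay clean.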
The AVS page doesn't mention any of this, so am I to deduce it forgoes emulating these artifacts?
RetroUSB and the scene are all very well aware of this. The problem is that emulating pixel blur and defects isn't accurate either. There will be no perfect "defects preserved" mode, because people played these games on a hundred or more different TV technologies from the '80s and '90s.
People after that old look who try the scanline mode on the NES Classic find that it's actually not that cool to have it running. Best to appreciate the game as it plays now. Purists who try to "replicate" classic gaming are gobbling up all the Sony PVM/BVM tube medical and broadcast monitors now. Unfortunately, they are wrong there too. No kid playing Nintendo back in the day had that crisp of a monitor either.
This is futility played out in this hobby, and it's better to look forward and appreciate the amazing color and pixel density of today's technology, with the game scaled to the best of our abilities. Are we to add a simulated front-light glare to the TVs as well, so we can see a fake reflection of the table lamp on our games? It's preposterous.
If people want a CRT to get a faster response time, good on them. And just so you know, this isn't emulation. So it's not "forgoing" emulating artifacts, because those artifacts weren't something the original systems produced. This is the system running on a chip, totally simulated with the same behaviors but with different output (HDMI).
I think these shader efforts are totally worthwhile. You said it yourself - a hundred or more TV technologies. And all those CRTs are leaving us for the landfills. Why not preserve as many of them as possible in a simulated form so that people can continue to experience them?
Well, even better: when a CRT still works, don't throw it in a landfill. Keep it as a perfect retro monitor, with a hard power switch so it consumes 0 watts when you're not playing.
I attended Powerfest '90 (The Nintendo World Championships)[0], and every game on display was shown over an S-Video connection. The beta Final Fantasy's pixels looked sharp and defined (I spent an hour playing it, though it reset every 15 minutes). They explicitly used the best connections possible to ensure video fidelity and minimize color bleeding.
Magazines were the same; outside of the "Take a picture of your TV Screen with a Polaroid Camera" section of the magazine (scoreboards), Nintendo Power always had screenshots of the game using a Super Wide Boy, which outputted the screen to a crisp video feed perfect for articles. EGM used the same device. Nobody wanted blurred pixels or color bleeding. They certainly didn't want cathode ray distortion.
There is nothing but anecdotal information that the developers of NES games intended the blurring and bleeding to affect their game designs. The only purveyors of the "CRT Experience" seem to be millennials born after the NES's reign.
I didn't have the best TV for my NES, but it looked tons better than some of these scanline effects. If you were using a CGA monitor to play NES games, then maybe the shaders were accurate, but even the black and white TV I had for a while was a superior experience to the scanline shaders available in the NES mini or RetroArch.
Playing Bionic Commando: Return of Hitler[1] on the NES right now, with no shaders. You should too.
If they were using S-Video in 1990, they were either converting to S-Video from composite, which would negate the benefit of S-Video, or they were using a PlayChoice-10 arcade unit and converting the RGB to S-Video, in which case the pixels would be sharper but the colors would be wrong, because the arcade PPU had a different color palette to accommodate direct RGBS arcade monitors. I sort of doubt they were using a PlayChoice unit, because back then pretty much all projectors could take an RGBS arcade signal directly (I am doing this today, but with an ordinary CRT) with even less loss, so they would have used that. But it was common back then to convert composite to S-Video not to improve the picture quality but to reduce the picture quality loss from long cabling, so I'd guess that's what they were doing. I don't doubt you saw a good picture, but it probably had more to do with the projector. There were projectors back then that could do a thousand vertical lines of resolution.
Subpixel rendering was a necessity for making recognizable faces on tiny sprites; this is common knowledge in the pixel art community, and it pretty much relied on the blurring effect of the CRT. Arcade monitors are sharper, but they're still very low-res CRTs.
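As a crude illustration of what pixel artists were exploiting (a toy model, not how any actual CRT or shader works): a checkerboard dither of two palette values reads as their average once the display smears adjacent pixels together.

```python
import numpy as np

def box_blur(img, taps=3):
    """Separable box blur as a stand-in for CRT spot spread (an assumption;
    real CRTs have a Gaussian-ish, non-square beam profile)."""
    k = np.ones(taps) / taps
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

# Alternate 0.0 and 1.0 pixels, i.e. dithering between two palette entries.
dither = np.indices((12, 12)).sum(axis=0) % 2.0
blurred = box_blur(dither)

# Away from the image edges, the blurred checkerboard settles near 0.5:
# the eye-plus-CRT system "mixes" a third color the palette doesn't have.
center = blurred[4:8, 4:8].mean()
```

On a pixel-perfect display the same dither stays a visible checkerboard, which is why some dithered art looks noisier today than it did on the hardware it was drawn for.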
Now that I have expressed the limits of my video knowledge from that era, some grey AV nerd with even more knowledge please come out and put me in my place.
It's not anecdotal, it's a comment that showed up in design documents of games by experienced developers. Trade shows and magazines used the best quality possible, because it was marketing.
However, I'm not aware of any GDDs that specified coloring as a technical challenge; most of the concerns were around overscan and making sure games weren't unplayable on B/W TVs.
I hate being unable to put my money where my mouth is, but I read a bunch of early 3D game GDDs in college as part of understanding and overcoming hardware limitations in computer graphics. It wasn't so much "don't do this" as justifications such as "we did this because many TVs don't have stereo sound" or "using colors of the same hue is unviewable on black-and-white TVs and our portable systems".
By this "logic" you should watch everything on a 14" CRT for the rest of your life, because that's how TV and movies looked when you were young. Enjoy that VHS Empire Strikes Back.
No, the rest of the world didn't suffer through RF-modulated Never The Same Color. What you remember is not reality, just your crappy version of it.
The Japanese transliteration of "computer" (コンピュータ) is written with the "n" (ン) character, but it's pronounced more like an "m" sound because the lip movement for the following sound modifies it. The same dichotomy is seen with shinbun/shimbun, for example. Of course, "Famicom" is the official way to romanize it, so it's usually preferred.
I pronounce "incredible" with an n sound as well. I included it to contrast with "important," showing how the following consonant influences the pronunciation. When enunciating, I pronounce "input" with an n; when speaking conversationally, with an m. Different dialects are of course a factor here as well.
I love these kinds of projects. I'd love it even more if it were open source.
Preservation of the software is the most important thing and I'm glad the game pirates are very fastidious about it, but preservation of hardware like this is - if not as important - at least as fascinating!
And still, I won't buy one. I doubt I'll ever earn enough money to make retro game collecting feasible, so I'll stick with collecting retro game controllers and connecting them to PCs for faithful emulation ;)
Interesting project - I wonder if this uses the 6502 core from Visual6502.
The price point might be tough for people. I think you can buy original NES systems for less. They aren't HDMI-compatible, but I think those adapters will run you $20 on Amazon.
Yeah, I just did some basic googling and it looks like there are very few options. I suppose it all depends on how lag-free you want your retrogaming to be. If you want it as low-latency as possible and you want to use your flatscreen TV (assuming it doesn't have a ton of lag!), this seems like a much better deal.
I'm deep into this hobby and I can tell you that the AVS is one of the better solutions for the average person looking to play NES games. There are various mods you can do to original hardware for use on modern displays, but they can be pretty pricey, especially if you need someone else to do the [non-trivial] installation.
Furthermore, original hardware can be finicky; the NES has a notoriously unreliable connector for reading carts and requires a bulky AC adapter. Meanwhile, the AVS has an improved design, new/clean cart connector and runs off 5V from a USB port on the back; most people can plug it into their TV, receiver, or any other device in their entertainment system with a spare USB port (like an Xbox or PlayStation).
It's a superb value for someone with the cash to burn. I own both the NES and the AVS and think both are fantastic, but would recommend the AVS to friends and the NES to purists.
On eBay and Amazon, it looks like the front-load consoles are going for $75 and top-load ones for about 30% more. When I bought mine (front load) a couple of years ago, I think I spent $20-$30 on it, so those numbers are a bit of a shock to me.
Hmm. Is this a sign of the publicly available stock dwindling? I guess we could see the market for these original systems start to really dry up and prices increase dramatically.
> Is this a sign of the publicly available stock dwindling?
Could be. The front loaders tend to wear out their cart connector due to the weird VCR loading mechanism. It's easily fixable now but at the time it probably doomed them to the trash.
The top loaders were always sorta rare because they came out way late in the NES's lifetime (post SNES and Genesis) and, IIRC, only sold for a couple of years.
Wow. I see it around $35 on Amazon, and I know that I didn't pay that much for it around 6 years ago (can't find the order though, so I probably bought it in a large lot of games from Ebay).
A Link to the Past: Bought for $17.56 about 5 years ago, and it seems that I'd be paying around $45-$50 to get it now.
Zelda 2: Adventure of Link: bought for about $5 in 2010, selling for about $15 now.
Oof! This one's the biggest jump I've seen so far. Metal Warriors on SNES. I bought that in 2009 for $40, and the cheapest listing for it on Amazon right now is for $200.
Fear not. In less than 5 years, they'll be back to $20. This is a bubble. Soon, N64 consoles will see similar jumps in prices. Gotta cater to the mid-30's gamer with extra income.
[1] http://www.gamasutra.com/blogs/KylePittman/20150420/241442/C...