GP here. I don't want to repeat the lengthy technical explanation I already posted in another response downthread, so please refer to that: https://news.ycombinator.com/item?id=42817006
> The reality is that different CRT's had wildly different characteristics in terms of sharpness, color, and other artifacts -- and it also varied tremendously depending on the type of video output/input.
As a video engineer old enough to have started my career in the analog era, I fully agree that composite video displayed on consumer TVs could vary wildly. I already explained the technical point about decoding the signal information properly in my other post, but since you're making a different point about variability, I'll add that just because TVs could be mis-adjusted (or even broken) doesn't mean there isn't a technically correct way to display the encoded image data. This is why we used color bars to calibrate TVs.
> I definitely appreciate wanting to blur the image in an emulator to remove the jaggies
But that's not my point: blur was an undesirable artifact of the composite video standard. 80s and 90s 2D pixel art was hand-crafted knowing that the blur would blend some colors together, minimize interlace artifacts and soften hard edges. However, I only use shaders that model a small amount of blur, and I run my CRT in analog RGB instead of composite, which can be quite sharp. My goal is not nostalgia for the analog past or to degrade a game's output as much as my parents' shitty 1970s living room TV did. I had to engineer analog video in that past - and I hated its shortcomings every day. When I play 80s and 90s video games, whether via shaders or on my analog RGB CRT, it probably looks quite a bit sharper and clearer than the original artists ever saw it - but that's not due to some subjective up-res or up-scaling - it's due to accurately decoding and displaying the original content to the technical standard it was created to comply with (even if many consumer TVs didn't live up to that standard).
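For anyone wondering what "a small amount of blur" means in practice, here's a rough sketch of the idea in plain Python/numpy - not any real emulator's shader, and the kernel widths are illustrative guesses rather than measured NTSC bandwidth figures. The point is simply that composite limited horizontal bandwidth, and limited chroma far more than luma, so a faithful softening filter blurs color harder than brightness along each scanline:

    # Rough illustration only: horizontal low-pass per scanline, with chroma
    # blurred more than luma -- loosely what composite bandwidth limiting did.
    # Kernel widths are illustrative, not real NTSC figures.
    import numpy as np

    def box_blur_rows(channel, width):
        """Blur each scanline horizontally with a simple box kernel."""
        kernel = np.ones(width) / width
        return np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, channel)

    def soften(frame_rgb):
        """frame_rgb: HxWx3 float array in [0, 1]."""
        # Very rough RGB -> luma/chroma split (BT.601-ish weights).
        r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
        y = 0.299 * r + 0.587 * g + 0.114 * b
        cb = b - y
        cr = r - y
        y = box_blur_rows(y, 3)    # luma: keep most of the detail
        cb = box_blur_rows(cb, 9)  # chroma: much lower bandwidth
        cr = box_blur_rows(cr, 9)
        r2 = y + cr
        b2 = y + cb
        g2 = (y - 0.299 * r2 - 0.114 * b2) / 0.587
        return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 1.0)

Real CRT shaders do this in the display pipeline (plus masks, scanlines, etc.), but the luma/chroma asymmetry is the part that matters for how the original art was designed to blend.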
In the 90s I worked at a TV station and after hours we'd bring in consoles just to play them on the $3,000 Sony BVM broadcast reference monitor. And they looked great! That's what I'm after. Accurately reflecting the original artist's intent in the maximum possible quality - without slipping over the line into editorializing colors or pixels that were never in the original data in the first place. I want to play the game as it would have looked back in the day on the best video output available, through the best cable available and on the best screen money could buy. And via emulation and shaders, now everyone can have that experience!