Originally Posted by Nielo TM
Instead of trying to simulate CRT, why not focus on the deinterlacing and scaling to make sure the errors are kept to a minimum?
As far as 2D gaming is concerned, which is where most of this seems to be focused, that's already possible: nearest-neighbour scaling by an integer factor is perfect, and as has been mentioned here, most of those games ran at 240p, so deinterlacing is unnecessary. Via emulation you can render the games at whichever resolution you choose, rather than relying on an external scaler (most of which are built for video rather than games). In my experience, even a box like the XRGB doesn't do as good a job as emulating the games will, and the external scaler adds input lag on top of what the display already has. That said, most emulators, with the exception of bSNES, are not perfect emulations: they use hacks to speed up the emulation process, and the biggest difference is usually that the sound isn't quite right compared to the original hardware.
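Why integer-factor nearest-neighbour is "perfect" is easy to see in code: every source pixel maps to an exact N×N block, so nothing is blended or unevenly stretched. A minimal sketch (the frame and factor here are just toy values for illustration):

```python
import numpy as np

def integer_scale(frame, factor):
    """Nearest-neighbour upscale by an integer factor: each source
    pixel becomes an exact factor x factor block, so no pixel is
    distorted and no interpolation/blending is introduced."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A 240-line frame scaled 3x lands neatly inside 720p; 1080p would
# need 4.5x, which is why non-integer factors produce uneven pixels.
frame = np.arange(12).reshape(3, 4)   # tiny stand-in for a 240p frame
scaled = integer_scale(frame, 3)
print(scaled.shape)                   # (9, 12)
```

Each value in `frame` ends up as an unbroken 3×3 square in `scaled`, which is exactly the "giant square pixels" look discussed below.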
But "perfect" video doesn't suit the pixel art in most of these games very well at all, especially on the larger screens we have today.
Personally, I don't like the filters which try to emulate a composite connection, a bad CRT, or distortion, but I wouldn't mind something that gets the "texture" of a CRT right (the shadow mask, the scanlines, etc.). The biggest problem is that most go too far, or we simply don't have enough resolution yet. Most filters which do a good job of emulating the scanlines or phosphor structure tend to dim the image a lot as a result.
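The dimming isn't a bug in any particular filter; it falls straight out of the arithmetic. A naive sketch (not any specific shader, just darkening one row per scaled block to stand in for the gap between scanlines) shows how much average brightness a scanline overlay costs:

```python
import numpy as np

def scanline_filter(frame, factor=3, dark=0.25):
    """Naive scanline overlay: integer-upscale the frame, then dim the
    last row of each factor-high block to mimic the dark gap between
    scanlines. 'dark' is the brightness kept in the dimmed rows."""
    out = np.repeat(np.repeat(frame.astype(float), factor, axis=0),
                    factor, axis=1)
    out[factor - 1::factor, :] *= dark   # dim every 3rd output row
    return out

frame = np.full((240, 320), 200.0)       # flat grey 240p test frame
filtered = scanline_filter(frame)
loss = 1 - filtered.mean() / 200.0
print(f"average brightness lost: {loss:.0%}")   # 25% with these numbers
```

One third of the rows drop to a quarter brightness, so the whole image loses a quarter of its light; real shadow-mask patterns dim both axes, which is why those filters look so dark unless the display can compensate.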
Even then, they're not going to get the phosphor glow/afterglow right, but I think it'll be good enough at that point.
CRTs have more or less been overtaken now when it comes to video (or at least they should be in a year or two, with the first and second generations of OLED displays), but a CRT is absolutely the best way to play these games. The problem is that they're dying out and no-one makes them any more. I'm sure televisions aren't quite so hard to get hold of, but arcade monitors are in short supply, and I finally had to move to an LCD after I simply couldn't find a second-hand computer monitor of sufficient quality for widescreen gaming (in my case it was for modern gaming at 720/1080p, as I was done with retro stuff at the time).
Unfortunately, a lot of the filters these days also seem to have a "CRT gamma" component, which is probably fine if you're on an uncalibrated LCD, but if your display already has CRT-like gamma (I use 2.40), then it just makes things look even dimmer. That's the case here, but it seemed like a fairly good comparison between "perfect" emulation/scaling and "degraded" (just to be clear, anything I've posted in this topic has been sourced online; they're not my images).
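The "even dimmer" effect is just the gamma curve being applied twice. Purely illustrative arithmetic, assuming the filter bakes a 2.4 power curve into the image and the display then decodes at 2.4 as well:

```python
# Illustrative only: a mid-grey signal passed through a filter's
# baked-in "CRT gamma" of 2.4, then through a display that already
# decodes at 2.4, gets the curve applied twice.
signal = 0.5                        # mid-grey input level (0..1)
display = signal ** 2.4             # light output on a 2.4 display
doubled = (signal ** 2.4) ** 2.4    # filter's gamma stacked on top
print(f"single: {display:.3f}  doubled: {doubled:.3f}")
```

The double-encoded midtone comes out roughly an order of magnitude darker than the correctly decoded one, which matches how crushed these filters look on an already-calibrated display.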
And that's only scaled to 720p; when scaling to 1080p, the "perfect" image looks even starker. The lines are too clean, a pixel becomes a giant square, and what would have blended on a CRT just looks aliased on a modern display.
It's probably difficult to understand if you are more of a videophile and not really a retro gamer. As good as modern displays can be, and as good as OLED is going to be, I still miss having CRTs. I thought I was over it at this point (for a while, I was trying to source an affordable Sony PVM/BVM for retro stuff) but all this talk lately is giving me the itch again...