Originally Posted by Puwaha
How is advanced post-processing a bad thing? Not everyone's display is set up ideally. Not everyone's viewing room is ideal. Not everyone's viewing habits or preferences are the same.
People want the best video quality, and AMD helps them get it.
If the video was supposed to look any different, why wouldn't the mastering engineer who produces, for example, a Blu-ray adjust the image to look different?
I'm not saying that it's generally a bad thing, I just think assuming that everyone wants/needs it is a bad thing.
Like I said, a lot of people like over-saturated images, because it looks "better" to them, and AMD exploits this by enabling all these settings by default.
Preferences are rather subjective, however, and I hate overly sharpened and over-saturated images (as do quite a few people who value image quality).
All these algorithms don't generally "improve" the image, they just change it (improvement is subjective).
Of course it may seem better to you (increasing the quality for you), but it may seem worse to someone else.
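To make the point concrete, here is a minimal sketch of a saturation adjustment, the kind of post-processing being discussed. This is an illustrative example, not any vendor's actual algorithm: it mixes each channel toward the pixel's luma (using the Rec. 709 weights), so a factor above 1.0 over-saturates and a factor below 1.0 desaturates. Nothing about the operation is inherently an "improvement" in either direction; it just moves pixels closer to or further from gray.

```python
def adjust_saturation(rgb, factor):
    """Scale saturation of an RGB pixel (components in 0.0-1.0).

    factor == 1.0 leaves the pixel unchanged, > 1.0 over-saturates,
    < 1.0 desaturates (0.0 yields grayscale).
    Illustrative sketch only, not any driver's real algorithm.
    """
    r, g, b = rgb
    # Rec. 709 luma weights
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    clamp = lambda v: max(0.0, min(1.0, v))
    # Move each channel away from (or toward) the pixel's gray value
    return tuple(clamp(luma + factor * (c - luma)) for c in (r, g, b))

muted_red = (0.6, 0.4, 0.4)
boosted = adjust_saturation(muted_red, 1.5)   # pushed further from gray
grayscale = adjust_saturation(muted_red, 0.0) # all channels collapse to luma
```

Whether `boosted` or `muted_red` looks "better" is exactly the subjective question at issue; the math only changes the image.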
This is my whole point. No matter which subjective level of post-processing you prefer, if you want to compare quality, at least be smart about it and compare on equal settings.
All the vendors expose settings to change post-processing. The only difference is that AMD has them on by default and NVIDIA has them off, but that's it - all the settings are there.
I dislike post-processing, but even so I wouldn't say AMD gives me worse quality, because I can just turn everything off.
So in the same spirit, just turn on some post-processing in the NVIDIA control panel, and enjoy.