Originally Posted by AlanBuck
I feel like I need new glasses when watching many programs... I swear by what I'm about to write: everything sent to my Kuro 500M, including the Xbox, PS3, and cable TV (all over HDMI), looks fuzzy when the sharpness is at -15. The only source that is 100% clear [at -15 sharpness] is my PC input (VGA or HDMI).
I promise, I sat for three hours in front of all sorts of different source material, and everything looked noticeably fuzzy if it wasn't from a true 1080p PC source. ...So what sharpness setting delivers a crisp picture from the Xbox, PS3, and cable TV? In every test, with any source [except PC], +15 sharpness looked the best. I still test it weekly just to try to understand why people use -15 sharpness. Even Avatar on Blu-ray looks far better at +15.
People often argue that raising the sharpness above the minimum isn't representative of the source material; well, I'm here to tell you that +15 is a drastic improvement. ...Pausing a movie and switching from -15 to +15 makes it blatantly obvious: at -15 the definition of an actor's eyeball is fuzzy at best, but at +15 you can see the blood vessels on the white of the eye. Even switching between +10 and +15 shows a noticeable improvement.
...If that's not a solid enough example of the difference between minimum and maximum sharpness, then I'll take pictures of what I'm talking about to support my claim! I want people to stop following the herd and realize that -15 sharpness is not what any Pioneer should be set to (unless you're using a PC source). ...Go ahead and test this yourself, see the difference with your own eyes, and I'll be here to discuss it.