Originally Posted by buzzard767
On high??? lol
Sharpness is a part of calibration. Calibration is getting the set as close to Rec. 709 HD standards as possible, and this does not vary from model to model. There is only one goal: a picture that the director intended for you to see. Sharpness settings that leave white ringing around edges are incorrect, and the content is distorted.
It seems to me that of all the parameters involved in the images we view on these screens, clarity is the one and only parameter that anyone with normal vision can judge as correct or incorrect, assuming of course that the "director" intended the image to be in focus and as crisp as possible.
How do we know whether the lightness of a given image, or any of the colors rendered, is accurate, or, when it isn't a reproduction of the reality in front of the camera (which is quite commonly the case), whether it was modified to suit the director, cinematographer, etc.? All we can do is trust that our electronics and calibrations get it right, because we can know neither what shade of green the grass actually was, nor what shade of green the director wanted it to look like. And of course the director was making all these decisions in a setting quite different from the one many of us will be in when we view his or her work.
I'm quite certain that the directors of televised NFL games want the image to be as clear and in focus as possible.
I don't know about other TVs out there, but our Samsungs can A/B test different picture modes via the TOOLS button. One can set everything as close to equal as possible between Standard and Movie and then try different Sharpness settings on actual content. On my PN51D8000, Sharpness settings of 0-10 look blurred in comparison to settings of 20 or higher. This is simply an observation: a "bottom-up," fairly scientific observation.
I say fairly scientific because I have done this A/B comparison between all possible pairs of video modes/color temps, looking at DirecTV content and Blu-ray (24p) content, and have adjusted Sharpness for each of these combinations both ways (Mode A given a higher Sharpness setting than Mode B the first time, and the Sharpness settings reversed the second time).
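For anyone who wants to see what I mean by "both ways," here's a rough Python sketch of the counterbalanced pairings. The mode names beyond Standard and Movie, and the two Sharpness values, are just placeholders for illustration, not anything pulled from the TV's menus:

[code]
# Hypothetical sketch of the counterbalanced A/B plan described above.
# Mode names and Sharpness values are illustrative placeholders.
from itertools import combinations

modes = ["Standard", "Movie", "Natural", "Dynamic"]   # picture modes to compare
color_temps = ["Warm1", "Warm2", "Normal"]            # color-temperature options
low_sharp, high_sharp = 5, 25                         # "low" vs "high" test values

trials = []
for mode_a, mode_b in combinations(modes, 2):
    for temp in color_temps:
        # Run each pairing twice, swapping which mode gets the higher
        # Sharpness, so the judgment isn't biased toward one slot.
        trials.append((temp, (mode_a, high_sharp), (mode_b, low_sharp)))
        trials.append((temp, (mode_a, low_sharp), (mode_b, high_sharp)))

for i, (temp, a, b) in enumerate(trials, 1):
    print(f"Trial {i:2d} [{temp}]: A={a[0]} @ Sharpness {a[1]}  vs  B={b[0]} @ Sharpness {b[1]}")
[/code]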
It always comes out the same. A Sharpness setting of 20 or higher is ALWAYS superior to settings of 10 and lower. Settings higher than 20 add only small increments of clarity. It so happens that the point at which the ghosting disappears at the edges of the black-line-on-gray-background test fields on the AVS and Spears & Munsil discs is also right at 20 when viewed from my normal seating position of ~10 feet. Up close to the screen, the white ghosting disappears right at 10. A Sharpness setting of 10 yields a slightly less clear image on my TV, and it is the setting I lived happily with for 3 months. Now I'm going with 25.
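For what it's worth, the white ringing buzzard mentions is easy to model. Below is a toy Python/numpy sketch of a generic unsharp-mask style sharpener applied to a step edge (think of the edge of a black line on a gray field). This is NOT Samsung's actual Sharpness processing, just an illustration of how overshoot and undershoot, the bright and dark halos, grow as the sharpening amount goes up:

[code]
# Generic unsharp-mask illustration of edge ringing.
# NOT the TV's actual algorithm -- just a toy model on a 0-255 scale.
import numpy as np

def unsharp_mask(signal, amount):
    # Blur with a 3-tap kernel, then add back the scaled high-frequency difference.
    padded = np.pad(signal, 1, mode="edge")              # repeat end values to avoid boundary artifacts
    kernel = np.array([0.25, 0.5, 0.25])
    blurred = np.convolve(padded, kernel, mode="valid")  # same length as `signal`
    return signal + amount * (signal - blurred)

# A step edge between a darker and a lighter area.
edge = np.array([50.0] * 5 + [200.0] * 5)

for amount in (0.0, 0.5, 2.0):
    out = unsharp_mask(edge, amount)
    # Higher amounts push values above 200 and below 50 right next to the edge,
    # which on screen shows up as the bright/dark halos ("ringing") around the transition.
    print(f"amount={amount}: min={out.min():.1f}, max={out.max():.1f}")
[/code]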
By the way, I looked in some detail at the BT.709 standard and couldn't find where standards applicable to sharpness are mentioned. I hope others will test this; it's why I brought it here. Sharpness is perhaps the ONE setting that can be scientifically evaluated with just our eyes. And, for the record, I have 20/15 vision. I don't think the test patterns for sharpness work as well as viewing actual content, and I don't mean trying to judge it from something in motion. There is always something relatively static in the background, and the difference is easily seen, at least in my sample size of one.