Originally Posted by darkthrone
did anyone find out why hd programming doesn't look so good on this model?
There could be any number of reasons/factors.
Just as a reference, I worked out some pixel-density numbers for 1080p and 720p sets to estimate how densely the pixels are packed at each screen size. (I am no expert on this, so if my figures or calculations are off, please excuse me.) If you are coming from a smaller 1080p set, you might notice some difference in pixel density, though that alone is unlikely to stand out. However, any distortion in your HD feed may become more noticeable on a larger display: a smaller display packs the same number of pixels into less area, which tends to blend that distortion together and make it harder for your eye to pick up. (Or at least that is how I figure it, thinking it through.)
Refer to the following pixel-density figures I have calculated. Note that these are pixels per square inch of screen area (px/in²), not the linear pixels-per-inch that "PPI" usually refers to. (A quick way to reproduce these numbers follows the list.)
37" @ 1080p = 3,546 PPI
40" @ 1080p = 3,034 PPI
42" @ 1080p = 2,752 PPI
46" @ 1080p = 2,294 PPI
52" @ 1080p = 1,795 PPI
37" @ 720p = 1,576 PPI
40" @ 720p = 1,348 PPI
42" @ 720p = 1,223 PPI
46" @ 720p = 1,019 PPI
52" @ 720p = 798 PPI
Now, pixel density alone does not determine picture quality; there are a host of other factors. It is, however, something to consider when sizing up a set's capabilities. Have you checked the channel/info banner on the Sharp 52" when viewing HD content from your HD box? It should show the resolution of the incoming feed in the upper right-hand corner, and you may find it is 720p (or 1080i) rather than 1080p; most broadcast HD is one of those two formats. As I understand it, when the feed is 720p the set has to scale it up to its native 1080p grid, and that interpolation can soften the image somewhat. Beyond that, I would say the quality of your image depends largely on the quality of your source. HD services have come a long way, but I still feel they are not the "true HD" they claim to be, and I work for a cable company.
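To put rough numbers on that scaling point (my own sketch, nothing Sharp-specific):

Code:
# Scale factors a native 1080p panel must apply to common HD inputs.
panel_w, panel_h = 1920, 1080
sources = {"720p": (1280, 720), "1080i/1080p": (1920, 1080)}

for name, (w, h) in sources.items():
    sx, sy = panel_w / w, panel_h / h
    note = "1:1, no scaling" if sx == sy == 1.0 else f"{sx:.1f}x per axis, interpolation required"
    print(f"{name}: {note}")

That 1.5x factor means a 720p source pixel never lands cleanly on one panel pixel, so the set has to interpolate, which is one reason 720p material can look a touch soft on a 1080p panel.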
There is a wide range of factors behind why HD feeds fall short of what I would consider a true HD standard, and it will probably be some time before they ever get there.
That is to say, even if your HD box is sending the best signal it can to your set, plenty can distort that feed somewhere on your provider's network, or even in the conversions for display over HDMI at the set itself. Network feeds are delivered in many different ways, and there is a ton of combining, re-combining, and signal conversion happening upstream of the box installed in a home.
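To put some rough numbers on that, the bandwidth a provider allocates per channel matters as much as the resolution. These bitrates are typical published ballparks, not measurements from my company's plant:

Code:
# Rough bits available per pixel per frame at various feed bitrates.
# Bitrates are typical published ballparks, not measured values.
feeds = {
    "cable HD channel (~12 Mbps MPEG-2)": 12e6,
    "heavily multiplexed HD (~8 Mbps)": 8e6,
    "Blu-ray video (up to ~40 Mbps)": 40e6,
}
w, h, fps = 1920, 1080, 30  # 1080i is roughly 30 full frames per second

for name, bps in feeds.items():
    print(f"{name}: {bps / (w * h * fps):.2f} bits/pixel/frame")

Same resolution on the screen, wildly different amounts of data behind it, which is part of why a "1080i" cable feed and a Blu-ray can look nothing alike.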
In short, your HD service, although worlds above basic analog or digital cable, is perhaps not the best source of HD video. Some sets compensate for IQ (image quality) loss better than others; I am not certain why, though differences in their internal scaling and video processing likely play a role.
For instance: some HD sets hooked straight to coax for basic analog cable will display those channels at noticeably higher IQ than the same set watching the exact same analog channel routed through an HD box and HDMI. Some kind of scaling change in that chain dramatically affects the IQ of the feed; presumably the box converts and scales the signal first and the set then scales it again, instead of the set handling the raw signal once. Those cascaded changes can wreak havoc on the picture, and I wouldn't know where to begin pinning down the true culprit.
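As a toy illustration of why an extra scaling pass hurts (a one-dimensional sketch with made-up numbers, not a model of any actual box or set):

Code:
import numpy as np

# Compare one scaling pass against two cascaded passes on a test signal.
def resample(signal, n_out):
    # Simple linear interpolation onto a new uniform grid.
    x_in = np.linspace(0.0, 1.0, len(signal))
    x_out = np.linspace(0.0, 1.0, n_out)
    return np.interp(x_out, x_in, signal)

x480 = np.linspace(0.0, 1.0, 480)
x1080 = np.linspace(0.0, 1.0, 1080)
truth = np.sin(300 * x1080)   # "ideal" fine detail at panel resolution
src = np.sin(300 * x480)      # the same detail as a 480-sample source

one_pass = resample(src, 1080)                 # set scales the feed once
two_pass = resample(resample(src, 720), 1080)  # box scales, set rescales

print("one-pass mean error:", np.mean(np.abs(one_pass - truth)))
print("two-pass mean error:", np.mean(np.abs(two_pass - truth)))

The two-pass route comes out measurably worse: every extra resampling step smears fine detail a little more.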
So, as I mentioned on the SlickDeals forum, the quality of the image on your set depends on the quality of the source. Regardless of how good you think your HD box or your service provider is, I would not use HD TV broadcasts as a yardstick for a television set's quality or capabilities. Then again, maybe others would use them for exactly that reason, to see how a set handles feeds of varying quality. I just don't feel it's a level playing field to work with.
Last night I was watching HD TV on my 52" Sharp, made a few adjustments to some levels, and the picture quality now seems adequate. I still have to discuss some box-related options with my tech buddies here in the cable industry, but I really don't believe it will get much better than what I am seeing. Watching a local network in HD, the picture was quite sharp and clean; but when that station cut in other networks' feeds for some of the stories on the news, the IQ would instantly drop and I would pick up distortion and other issues. This is a prime example of how many hand-offs happen in the programming you watch; even commercials often drop in IQ.
My Settings Thus Far:
A/V Mode: Standard
View Mode: Dot by Dot (When Available: Seems to be Source Compliant)
Fine Motion Enhanced: Off
Active Contrast: On
Film Mode: Standard (When Available: Seems to be Source Compliant)
Digital Noise Reduction: Off
Range of OPC: [N/A]
Color Temp: Low/Med (Still playing with it.)
Power Control menu:
Power Saving: Off
I have yet to mess with the hue/saturation levels and may play with those later, after the set breaks in more.
I haven't had much time to run more tests, but I should be able to break out some Blu-ray movies shortly and put the set through a fairer standard.
PS3 Games look fantastic.