I posted about this before, but it seems to have been lost when AVSForum went down.


In both the 1920x1080i and 1280x720p modes, my HiPix exhibits pattern dithering, as if it is using 16-bit color. In 1920x1080i this creates a moire effect on my monitor, and in 1280x720p I can clearly see the pixel-checkerboard dithering pattern.


The moire is annoying, but doesn't bother me excessively. The 1280x720p dithering, however, is genuinely destructive. In many places it has an apparent resolution-halving effect on the picture, quantizing the horizontal borders between colors to every other pixel.


I would very much like to watch 720p-native programming at its proper resolution, rather than scaling it to 1080i. But the dithering actually makes 720p-native programming look worse at its native resolution, and scaling to 1080i totally defeats the purpose of networks like ABC using 720p in the first place.


I strongly suspect that the HiPix exhibits this dithering because it uses 16-bit color video modes. This seems to suggest that it is incapable of anything higher than 16-bit color; otherwise, why wouldn't it use 24- or 32-bit color? That would be a significant flaw in its hardware design.
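To make the theory concrete: a 16-bit (RGB565) mode only has 5 or 6 bits per channel, so shallow gradients can't be represented directly and the hardware or driver typically adds an ordered dither, which is exactly the kind of per-pixel checkerboard I'm seeing. Below is a minimal Python sketch of that idea; it is not HiPix code, just an illustration of how 2x2 ordered dithering of a shallow gradient produces an alternating pixel pattern.

```python
# Sketch only: shows why quantizing 8-bit color to a 5-bit channel (as in
# RGB565) with a 2x2 ordered dither produces a pixel-checkerboard pattern
# on shallow gradients. This is an illustration, not the HiPix's actual logic.

import numpy as np

# 2x2 Bayer threshold matrix, expressed as fractions of one quantization step.
BAYER_2X2 = np.array([[0.00, 0.50],
                      [0.75, 0.25]])

def quantize_with_ordered_dither(channel_8bit, bits=5):
    """Quantize an 8-bit channel down to `bits` bits using 2x2 ordered dithering."""
    h, w = channel_8bit.shape
    step = 256 / (2 ** bits)                      # size of one quantization step
    threshold = np.tile(BAYER_2X2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    dithered = channel_8bit + threshold * step    # nudge values across step boundaries
    levels = np.floor(dithered / step)
    return np.clip(levels * step, 0, 255).astype(np.uint8)

# A shallow horizontal gradient: adjacent pixels differ by far less than one
# 5-bit step, so the dither alternates per pixel -> checkerboard texture.
gradient = np.tile(np.linspace(100, 108, 64), (4, 1))
out = quantize_with_ordered_dither(gradient)
print(out[:2, :8])   # note the values alternating along rows and columns
```

If the HiPix really is limited to a 16-bit mode, this kind of alternating pattern would also explain the apparent resolution halving at color borders: the edge position gets rounded to whichever of two neighboring pixels the dither pattern favors.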


Has anybody besides me noticed the HiPix's dithering? Also, the AccessDTV uses a similar chipset; has anybody noticed dithering in the ADTV? Any thoughts on my 16-bit color theory?