
MonkeyMafia · Registered · 772 Posts · Discussion Starter · #1
I've seen this come up time and time again.


There seems to be some confusion regarding resolution and perceived video quality. People are tempted to judge video quality by resolution first (meaning, a higher resolution display will always look better than a lower resolution display). That would only be true if all sets were made equal, using the same parts. But of course they aren't, and some of the other variables take greater precedence than resolution.


My 21" GFM-F500 Sony monitor running 1600x1200p does NOT look as clear when playing DVDs as my 34" Sampo, which has a much lower resolution but has a much higher contrast ratio due to its loose aperture grill. The monitor, although running at an extremely high bandwidth output of 1600x1200p, has a tight grille pitch and just looks dim and washed out in comparison. The higher contrast ratio on the Sampo enhances edges and makes video look more 3D. And my SGI 1600SW flat panel running 1600x1024p blows away both with its exceptionally high contrast ratio AND resolution, thanks to its lcd display technology.


Think about plasma displays. The Panny TH-42PWD3U is a much lower resolution display than the newer 50" plasmas, which are capable of much higher resolutions. Nevertheless, the construction and plasma display technology used in the Panny are superior in terms of contrast ratio, etc., to the point where it has become the benchmark for picture quality. Just go into the Plasma discussion forum and see for yourself.


So, just because X has greater resolution than Y does NOT mean that X has better picture quality. Display technology can matter more, whether it's CRT (beam spot size!), RP, FP, LCD/FLCD, plasma, DLP, DILA, etc.


AND, as in the case of the Panny plasma, we see that this applies even within a single display technology.


MMAfia



 

Registered · 24 Posts
Totally on the button.


Just tonight I had to play a 3rd generation VHS (a copy from another copy) on a 27" Pana Gaoo and then on a cheap 20" Citizen. It was o.k. on the Citizen but awful on the Gaoo.
 

Registered · 133 Posts

Originally posted by MonkeyMafia:


And my SGI 1600SW flat panel running 1600x1024p blows away both with its exceptionally high contrast ratio AND resolution, thanks to its lcd display technology.


Just curious, but what video card are you using with the MultiLink Adapter? Any support for Win 2K? I have the 1600SW and the original Number 9 card. I would like to upgrade to a better card but fear incompatibilities at the panel's full width. Also, are you able to view anamorphic widescreen DVD content with your setup?


------------------

STOP DFAST NOW!!!
 

MonkeyMafia · Registered · 772 Posts · Discussion Starter · #4
You need the dual-head version of the GeForce2, which has a digital DVI output that hooks into the MultiLink Adapter. I've heard of a few success stories with the Radeon DVI using custom T&R, but the GeForce2 route is the safer way... albeit with *slightly* poorer video playback quality.


The original #9 rev IV SUCKED for overlay playback quality. 2D was good, but that was it. 3D sucked, and so did video playback. They're not even a company anymore (bought by ATI).


When I got my 1600SW, there was one dead pixel, and it was stuck in the bad state, if you know what I mean. Corporate policy required something like 7 or more dead pixels before a panel was considered defective and warranted a replacement. I was p-oed, so I called them, and they gave me a replacement unit (although they could not guarantee that it would be perfect), and it was perfect.


The thing still blows me away. It is still the best display in terms of quality of everything I have seen (and I've seen quite a bit). 1600x1024 on only 17.3" of screen real estate would look very soft and blurry on a CRT, but this is the unit's NATIVE res. The Mac Cinema Display runs the same res, but on a much larger screen and thus at a lower dpi.
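If you want to sanity-check the dpi comparison, here's a quick Python sketch. The 17.3" diagonal is from above; the 22" Cinema Display diagonal is from memory, so treat it as approximate:

Code:

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch: diagonal pixel count divided by diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1600, 1024, 17.3)))  # SGI 1600SW: ~110 ppi
print(round(ppi(1600, 1024, 22.0)))  # Cinema Display: ~86 ppi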


MMAfia



 

AJSJones · Registered · 887 Posts
MonkeyMafia, good post, and I agree with your position. I think the key link (or discontinuity!) here is between intuition and measurement. Intuition says (to many of the folks you're addressing) that with more resolution, the set's ability to show details will be higher. The point you make is that the contrast range plays a big role. The problem is that resolution is not generally measured or claimed under consistent conditions that take contrast into account.

I think there is an ANSI spec on how to claim "ANSI pixels". For example, my 8500LC has specs for 2500x2000 addressable pixels but 'only' 1300x1000 or so ANSI pixels. I have taken this to mean something like: the bandwidth of the circuitry can handle signals and keep them distinct, or addressable, up to 2500x2000, but when it comes to the display being able to resolve them BY A STANDARDIZED AMOUNT, you get the 1300x1000. This standardization requires that the contrast between 'resolved' pixels differ by at least some minimum amount - a figure of 30% modulation is stuck in my mind* - and this is a lot more stringent than the eye's ability to see a 'bit of a difference' between adjacent pixels.

If you can see a distinction between a light gray and a slightly darker gray, is that "resolved"? If they're supposed to be black and white, no it isn't; but if they are supposed to be a light gray and a slightly darker gray, then it is. So the difference in grayness is a critical parameter to specify when claiming resolution.
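To make the "resolved by a standardized amount" idea concrete, here's a little Python sketch. The 30% threshold is just the figure stuck in my mind, used as a parameter, not a quote from the actual ANSI spec:

Code:

def modulation(l_light, l_dark):
    # Modulation depth between two adjacent luminances, ranging 0..1.
    return (l_light - l_dark) / (l_light + l_dark)

def is_resolved(l_light, l_dark, threshold=0.30):
    # Adjacent pixels count as resolved only if modulation clears the threshold.
    return modulation(l_light, l_dark) >= threshold

print(is_resolved(60.0, 40.0))        # 0.20 modulation -> False at 30%
print(is_resolved(60.0, 40.0, 0.10))  # True at a looser 10% (see footnote)
print(is_resolved(90.0, 10.0))        # 0.80 modulation -> True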


If resolution claims were made using something like ANSI conditions, they would 1) be more likely to allow comparisons "by the numbers" and 2) take your issue into account - i.e., contrast matters. The audio industry went through something like this many years ago: remember "Music Power" watts per channel? Meaningless. WPC RMS at X% distortion at Y Hz is far more meaningful and comparable.


I'm not wedded to ANSI or any other particular set of conditions, but I would like to see the manufacturers and reviewers standardize on something. It would be a great service to consumers, who cannot possibly go look at every candidate purchase themselves, to be able to rely on good information rather than first-hand experience in deciding what to buy and why. It's still a good idea to look at the unit before you purchase, but being confident you had narrowed the choices based on good information would make it a lot easier.


Andy


*If this number were, say, 10%, the specs for resolved pixel count would be much higher, but that picture of black and white lines would look "washed out".


 