Apple Cinema Display - Nothing better? - AVS Forum
post #1 of 21 Old 05-16-2010, 11:15 PM - Thread Starter
TaxSux - Member (Join Date: Oct 2008, Posts: 34)
I have a coworker who just bought a new 30 inch Apple Cinema HD Display. It cost him over 2000 CAD. He says he uses it as his HDTV and also swears there is nothing that comes close to its picture quality. Are computer monitors supposed to have better picture quality than HDTVs? He says the native resolution is 2560 x 1600, which he explained is better than 1080p HDTV. I thought 1080p was full HD; I didn't know there was even more HD than that. I'm just a little surprised because he is the first person who has told me high-end computer monitors are better for HD movies than real HDTVs. I am talking strictly picture quality. Can anyone confirm this one way or the other?
post #2 of 21 Old 05-17-2010, 01:12 AM
cavu - AVS Special Member (Join Date: Apr 2005, Location: CANADA, Posts: 6,885)
Almost everything he said is true ... but ... there are other similar monitors that may be a better bang for the buck. LCD monitors tend not to have great contrast. Caveat emptor. And the Apple Cinema displays always carry the Apple "premium" price.

My son got the Samsung 305T for his Mac Pro instead of the Cinema and he is thrilled with it. Check NCIX.


"The greatest obstacle to discovery is not ignorance  it is the illusion of knowledge." - Daniel Boorstin
"Our lives begin to end the day we become silent about things that matter.
" - MLK

post #3 of 21 Old 05-17-2010, 02:04 AM
shaddix - AVS Special Member (Join Date: Mar 2006, Posts: 1,436)
Quote:
Originally Posted by TaxSux View Post

He says the native resolution is 2560 x 1600, which he explained is better than 1080p HDTV. I thought 1080p was full HD; I didn't know there was even more HD than that. I'm just a little surprised because he is the first person who has told me high-end computer monitors are better for HD movies than real HDTVs. I am talking strictly picture quality. Can anyone confirm this one way or the other?

The extra resolution will not give it a better picture than a 1080p hdtv because there is no content available to watch that can take advantage of a higher resolution than 1920x1080.
In fact, watching 1920x1080 content on a 2560x1600 display could decrease picture quality since the pixels of the source are no longer exactly matched to each pixel of the display.
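To put rough numbers on that mismatch (a quick Python sketch, nothing display-specific assumed):

Code:
src_w, src_h = 1920, 1080      # Blu-ray frame
panel_w, panel_h = 2560, 1600  # the 30" panel's native grid

print(panel_w / src_w)   # 1.333...  horizontal stretch
print(panel_h / src_h)   # 1.481...  vertical stretch (16:9 into 16:10)
# Neither factor is an integer, so every output pixel is an interpolated
# blend - and keeping the aspect ratio means scaling to 2560x1440 and
# leaving 80-pixel black bars top and bottom.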

As far as comparing it to another display, a good plasma will have a better picture ;D. Don't tell him that though as I know those things are expensive as hell!
post #4 of 21 Old 05-17-2010, 01:30 PM
rgb32 - Advanced Member (Join Date: Jul 2007, Posts: 920)
Quote:
Originally Posted by shaddix View Post

The extra resolution will not give it a better picture than a 1080p hdtv because there is no content available to watch that can take advantage of a higher resolution than 1920x1080.
In fact, watching 1920x1080 content on a 2560x1600 display could decrease picture quality since the pixels of the source are no longer exactly matched to each pixel of the display.

As far as comparing it to another display, a good plasma will have a better picture ;D. Don't tell him that though as I know those things are expensive as hell!

1:1 pixel modes are common on the 30" models, so scaling is optional. Any half decent PC game developed within the past 6 years can be set to render and display at 2560x1600.

Red plasma herring! You'd see a ton of noise on any plasma if you were to view one from the same distance as one of the 30" LCD monitors... not to mention phosphor lag!
post #5 of 21 Old 05-17-2010, 02:59 PM
scorrpio - Senior Member (Join Date: Mar 2009, Posts: 405)
HD movies are 1080p. Interpolating 1080 lines into 1600 physical pixels doesn't sound like a good idea. Besides, movie watching is a lean-back activity, and 30" in my book is too darn small to enjoy from a couch. Gimme a 60" plasma - in fact, you can get one for the price of that Apple.
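For a sense of scale, the common 1080p viewing-distance rule of thumb (roughly 1.6x the screen diagonal - treat the factor as an approximation, not gospel) works out like this:

Code:
# Rule-of-thumb viewing distance for resolving 1080p detail.
FACTOR = 1.6  # approximate guideline, not gospel
for diag_in in (30, 60):
    print(f'{diag_in}" screen -> about {FACTOR * diag_in / 12:.1f} ft')
# 30" screen -> about 4.0 ft  (desk distance)
# 60" screen -> about 8.0 ft  (couch distance, which is the point)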
2560x1600 monitors are generally pro tools for serious work: design, modeling, CAD, etc., not multimedia devices.
post #6 of 21 Old 05-17-2010, 03:15 PM
andy2000 - Member (Join Date: Oct 2004, Posts: 165)
One problem with using an Apple monitor as a TV is that they are fixed-frequency, and any image scaling has to be handled by the computer's video card. They will not work directly with something like a Blu-ray player or cable box. Most other computer monitors have built-in image processing that can accept a wide range of resolutions and scale them to the panel. Also, the 30" Cinema Display lacks HDCP, which will make watching Blu-ray difficult even when it's connected to a PC.
post #7 of 21 Old 05-17-2010, 03:59 PM
shaddix - AVS Special Member (Join Date: Mar 2006, Posts: 1,436)
Quote:
Originally Posted by rgb32 View Post

1:1 pixel modes are common on the 30" models, so scaling is optional. Any half decent PC game developed within the past 6 years can be set to render and display at 2560x1600.

Red plasma herring! You'd see a ton of noise on any plasma if you were to view one from the same distance as one of the 30" LCD monitors... not to mention phosphor lag!

You could sit back on a nice couch with a plasma. And using 1:1 pixel mode on a fixed-pixel display of that resolution will give you a tiny image lol.
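How tiny? A quick back-of-the-envelope sketch:

Code:
import math

# 1920x1080 shown pixel-for-pixel on a 30" 2560x1600 panel
panel_diag_px = math.hypot(2560, 1600)  # ~3019 px
image_diag_px = math.hypot(1920, 1080)  # ~2203 px
print(30.0 * image_diag_px / panel_diag_px)  # ~21.9" picture, black border around the rest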

The point was that no, increasing the resolution past the source content won't improve the picture quality, and could quite possibly make it worse.
post #8 of 21 Old 05-17-2010, 05:56 PM - Thread Starter
TaxSux - Member (Join Date: Oct 2008, Posts: 34)
So regardless of the subjective factors, like being able to sit back and view a bigger screen from the couch, are the best plasmas and LCDs playing second fiddle to high-end computer monitors for watching HD movies? I couldn't find any comparisons online. If these monitors are so much better, why isn't this talked about more in the review magazines or anywhere else? Again, I am only speaking of watching HD movies, NOT gaming.
On a side note, I did find that there are other computer monitors that would easily compare to or outdo the Apple. NEC and Eizo are two brands considered to be the standards in the industry. So this question isn't just about Apple monitors; it's about high-end computer monitors in general.
post #9 of 21 Old 05-17-2010, 06:37 PM
shaddix - AVS Special Member (Join Date: Mar 2006, Posts: 1,436)
Quote:
Originally Posted by TaxSux View Post

So regardless of the subjective factors, like being able to sit back and view a bigger screen from the couch, are the best plasmas and LCDs playing second fiddle to high-end computer monitors for watching HD movies?

No.

The computer monitors that can outdo HDTVs on picture quality are not going to be LCDs. They are going to be CRTs like the Sony FW900.
Perhaps if SED/FED ever comes to pass, we will see thin computer displays that can compete with local dimming LEDs and plasmas. OLED is supposed to be able to fill that gap too.

Current LCD computer displays can be extremely accurate when it comes to color, of course. But on motion and black level they fall short of a good plasma or local dimming LCD.
post #10 of 21 Old 05-17-2010, 09:41 PM - Thread Starter
TaxSux - Member (Join Date: Oct 2008, Posts: 34)
Quote:
Originally Posted by shaddix View Post

No.

The computer monitors that can outdo HDTVs on picture quality are not going to be LCDs. They are going to be CRTs like the Sony FW900.
Perhaps if SED/FED ever comes to pass, we will see thin computer displays that can compete with local dimming LEDs and plasmas. OLED is supposed to be able to fill that gap too.

Current LCD computer displays can be extremely accurate when it comes to color, of course. But on motion and black level they fall short of a good plasma or local dimming LCD.

Thanks.
post #11 of 21 Old 05-17-2010, 10:25 PM
dlplover - AVS Special Member (Join Date: Nov 2006, Posts: 2,178)
It's worth bearing in mind who these displays are targeting, which is the productivity market - office professionals, designers, architects, artists, etc. - all of whom use them to design content, not to play it back.

You get very accurate grayscale and color, but the black levels are nothing impressive and the motion handling isn't quite as good. The one nice thing about the high-end "productivity" IPS monitors is that some of them actually have consistent backlights, unlike the vast majority of LCDs. It's still something of a toss-up from panel to panel, though. A plasma will have deeper black levels and a consistent backlight without your having to return or exchange panels as you might with a high-end LCD.
post #12 of 21 Old 05-18-2010, 04:33 AM
HogPilot - AVS Club Gold (Join Date: Jun 2006, Location: Good Ol' US of A, Posts: 2,870)
Quote:
Originally Posted by shaddix View Post

The extra resolution will not give it a better picture than a 1080p hdtv because there is no content available to watch that can take advantage of a higher resolution than 1920x1080.
In fact, watching 1920x1080 content on a 2560x1600 display could decrease picture quality since the pixels of the source are no longer exactly matched to each pixel of the display.

As far as comparing it to another display, a good plasma will have a better picture ;D. Don't tell him that though as I know those things are expensive as hell!

Just one small point - I respectfully disagree with your assertion that loss of 1:1 pixel mapping automatically degrades an image. In fact, this flies in the face of the basic idea behind "upscaling" lower resolution sources - such as SD cable or DVD - to display on HDTVs. A good scaling algorithm - Lumagen's being a good example - can create a better picture on a display that has a higher resolution than the source material. The difference in resolution and the quality of the scaling algorithm are key to determining whether "upscaling" an image to view on a higher resolution display will yield better results.
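If anyone wants to see the algorithm difference for themselves, here's a minimal Pillow sketch (the file name is hypothetical, and Lanczos is just a stand-in for a good windowed-sinc scaler - not a claim about Lumagen's proprietary algorithm):

Code:
from PIL import Image  # Pillow

src = Image.open("dvd_frame_720x480.png")        # hypothetical 480p frame grab
naive = src.resize((1920, 1080), Image.NEAREST)  # crude pixel repetition
good = src.resize((1920, 1080), Image.LANCZOS)   # windowed-sinc interpolation
naive.save("upscaled_nearest.png")
good.save("upscaled_lanczos.png")
# Same source, same output resolution - the scaler makes the difference.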

Of course, given the display size, resolutions, and seating distances we're talking about here, the increase in Blu-ray quality will probably be negligible at best. The only thing you'll buy is more desktop space for regular computer usage, but I'd most definitely rather own a larger 1080p plasma or local dimming LCD than the aforementioned monitor.

There are 10 types of people: those who understand binary, and those who don't.

post #13 of 21 Old 05-18-2010, 05:07 AM
8mile13 - AVS Special Member (Join Date: Nov 2009, Posts: 3,708)
Quote:
Originally Posted by HogPilot View Post

Just one small point - I respectfully disagree with your assertion that loss of 1:1 pixel mapping automatically degrades an image. In fact, this flies in the face of the basic idea behind "upscaling" lower resolution sources - such as SD cable or DVD - to display on HDTVs. A good scaling algorithm - Lumagen's being a good example - can create a better picture on a display that has a higher resolution than the source material. The difference in resolution and the quality of the scaling algorithm are key to determining whether "upscaling" an image to view on a higher resolution display will yield better results.

Of course, given the display size, resolutions, and seating distances we're talking about here, the increase in Blu-ray quality will probably be negligible at best. The only thing you'll buy is more desktop space for regular computer usage, but I'd most definitely rather own a larger 1080p plasma or local dimming LCD than the aforementioned monitor.

+1
post #14 of 21 Old 05-18-2010, 06:14 AM
shaddix - AVS Special Member (Join Date: Mar 2006, Posts: 1,436)
Quote:
Originally Posted by HogPilot View Post

Just one small point - I respectfully disagree with your assertion that loss of 1:1 pixel mapping automatically degrades an image. In fact, this flies in the face of the basic idea behind "upscaling" lower resolution sources - such as SD cable or DVD - to display on HDTVs. A good scaling algorithm - Lumagen's being a good example - can create a better picture on a display that has a higher resolution than the source material. The difference in resolution and the quality of the scaling algorithm are key to determining whether "upscaling" an image to view on a higher resolution display will yield better results.

I think I see what you're saying. Please correct me if I have the basic idea wrong:
A good scaling algorithm would interpolate more pixels for the higher resolution display instead of just trying to find places for the source pixels to fit.
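Something like this, for a single column of pixels (a toy numpy sketch - plain linear interpolation standing in for the fancier algorithms):

Code:
import numpy as np

column_1080 = np.random.rand(1080)      # stand-in for one column of a 1080p frame
positions = np.linspace(0, 1079, 1600)  # where the 1600 panel rows land on the source
column_1600 = np.interp(positions, np.arange(1080), column_1080)
# Each of the 1600 outputs is a blend of neighboring source pixels,
# not a copy of any single one.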

Like how running 8xAA in a video game at a lower res (what you're saying) can look better than no AA at a slightly higher resolution (what I was saying).

What led me to make my statement was the common assumption that a native 720p display is going to show a 720p source better than a 1080p display can. I guess that also makes some other assumptions about seating distance, though.
post #15 of 21 Old 05-18-2010, 07:33 AM
HogPilot - AVS Club Gold (Join Date: Jun 2006, Location: Good Ol' US of A, Posts: 2,870)
Quote:
Originally Posted by shaddix View Post

I think I see what you're saying. Please correct me if I have the basic idea wrong:
A good scaling algorithm would interpolate more pixels for the higher resolution display instead of just trying to find places for the source pixels to fit.

Like how running 8xAA in a video game at a lower res (what you're saying) can look better than no AA at a slightly higher resolution (what I was saying).

What led me to make my statement was the common assumption that a native 720p display is going to show a 720p source better than a 1080p display can. I guess that also makes some other assumptions about seating distance, though.

Yep, that's pretty much what I'm saying.

There are 10 types of people: those who understand binary, and those who don't.

post #16 of 21 Old 05-18-2010, 08:56 AM
dlplover - AVS Special Member (Join Date: Nov 2006, Posts: 2,178)
The way of the future is upscaling or quite possibly down-capturing:

http://www.wired.com/magazine/2010/02/ff_algorithm/

There is still something to be said for watching things at their native resolution... at least until those techniques become standardized in AV equipment.
post #17 of 21 Old 05-18-2010, 09:50 AM
cavu - AVS Special Member (Join Date: Apr 2005, Location: CANADA, Posts: 6,885)
Quote:
Originally Posted by HogPilot View Post

I respectfully disagree with your assertion that loss of 1:1 pixel mapping automatically degrades an image.

Information theory proves that ANY scaling operation of an image (up or down) introduces distortion. Period. It isn't a debate. You cannot get gold from straw.

Research "moire pattern" and "aliasing".
Quote:
Originally Posted by HogPilot View Post

this flies in the face of the basic idea behind "upscaling" lower resolution sources - such as SD cable or DVD - to display on HDTVs.

With digital displays, "the basic idea behind 'upscaling' lower resolution sources - such as SD cable or DVD - to display on HDTVs" is not a choice, it is a requirement! Digital displays can only produce an image at their native (physical) resolution. Using an image source that does not match that native resolution REQUIRES that the source image be rescaled. ALL digital displays contain the appropriate scaling circuitry but, in a few cases, external stand-alone circuitry MAY do a better job.

A 480p source bit-mapped 1:1 will produce a purer image than the same 480p image remapped to 720p or 1080p. But, unfortunately, most people have only one display, and all other sources must be forced to fit it.

"The greatest obstacle to discovery is not ignorance  it is the illusion of knowledge." - Daniel Boorstin
"Our lives begin to end the day we become silent about things that matter.
" - MLK

post #18 of 21 Old 05-18-2010, 10:39 AM
TheMarco - Senior Member (Join Date: Apr 2010, Posts: 272)
Quote:
Originally Posted by TaxSux View Post

I have a coworker who just bought a new 30 inch Apple Cinema HD Display. It cost him over 2000 CAD. He says he uses it as his HDTV and also swears there is nothing that comes close to its picture quality. Are computer monitors supposed to have better picture quality than HDTVs? He says the native resolution is 2560 x 1600, which he explained is better than 1080p HDTV. I thought 1080p was full HD; I didn't know there was even more HD than that. I'm just a little surprised because he is the first person who has told me high-end computer monitors are better for HD movies than real HDTVs. I am talking strictly picture quality. Can anyone confirm this one way or the other?

Nonsense. I've used the 30" Cinema Display myself. The whole upscaling issue has already been mentioned, but there's another problem:

This display has horribly uneven backlighting, and its black levels are not great (to say the least). In a (mostly) dark scene you'll see lighter and darker areas. Even mid-range LCD TVs do a better job here than this Apple screen.

Don't get me wrong, the screen is pretty awesome to use as a computer screen, but I'd never even consider it a viable alternative to a TV.
post #19 of 21 Old 05-18-2010, 11:03 AM
dlplover - AVS Special Member (Join Date: Nov 2006, Posts: 2,178)
These 30" monitors are hit or miss for the backlight. Sometimes you need to exchange a few times to get one with a consistent backlight.

I would not call the PQ better than a kuro, but it is definitely better than cheapo LCD HDTVs which he may have been comparing it to.
post #20 of 21 Old 05-18-2010, 01:51 PM
Gary McCoy - AVS Special Member (Join Date: Jul 1999, Location: San Jose, California, USA, Posts: 6,233)
I have seen the Apple Cinema Display, the Samsung 305T, and the Dell UltraSharp monitor, and I think all three are above-average displays - and the Samsung is the value-priced unit.

One display that is clearly superior to my eyes is the HP "DreamColor" professional graphics display, model number LP2480zx. It is about $2,348 with the color calibration software (which is needed).

You will also need a video card that supports 30-bit color depth, but those are no longer hard to find or expensive.

The HP display is only 24", and even that is 16:10; if you use a 16:9 video frame and run a news ticker or something along the bottom, you effectively have a 23.6" video image.

The HP DreamColor display also uses an IPS panel, which virtually eliminates the off-axis viewing problems associated with LCDs: you see no visible image degradation until you are far enough off-center to introduce severe geometric distortion.

Pity it's only a 24" display. It's the best image I have seen this side of a $6K Sony professional grade CRT monitor.

Gary McCoy
The United States Constitution ©1791. All Rights Reserved.

post #21 of 21 Old 05-18-2010, 02:16 PM
HogPilot - AVS Club Gold (Join Date: Jun 2006, Location: Good Ol' US of A, Posts: 2,870)
Quote:
Originally Posted by cavu View Post

Information theory proves that ANY scaling operation of an image (up or down) introduces distortion. Period. It isn't a debate. You cannot get gold from straw.

Research "moire pattern" and "aliasing".

Firstly, I'm quite familiar with moire and aliasing, although the latter is not always an undesirable artifact of scaling. There are plenty of demonstrations of how aliasing can trade a small amount of sharpness for a much higher perceived resolution and a smoother image. In theory, you are correct - there's no such thing as a free lunch. However, in execution - as I said, with a properly designed algorithm - the positives far outweigh the negatives, and it's quite possible to get a better result from upscaling an image on a higher resolution display.

Quote:
Originally Posted by cavu View Post

With digital displays, "the basic idea behind 'upscaling' lower resolution sources - such as SD cable or DVD - to display on HDTVs" is not a choice, it is a requirement! Digital displays can only produce an image at their native (physical) resolution. Using an image source that does not match that native resolution REQUIRES that the source image be rescaled. ALL digital displays contain the appropriate scaling circuitry but, in a few cases, external stand-alone circuitry MAY do a better job.

It's only a requirement when one source pixel can't be mapped to an integer multiple of pixels on the display. In the case of 720p and 1080p displays, neither resolution is an integer multiple of 480, so scaling must be performed for a 720x480 image to completely fill either higher resolution display. Then again, any anamorphically encoded 480 source had to be scaled to 853x480 to be viewed on a 16:9 480p display anyway, so there goes our 1:1 pixel mapping right out the window. Again, the benefits of scaling are clearly visible and have been discussed ad nauseam here in the past. And yes, all digital displays contain some sort of scaling chip, most of which are cheap solutions that can't compare to a more expensive standalone unit.
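That integer-multiple condition is easy to express (a quick sketch):

Code:
def needs_interpolation(src_px: int, native_px: int) -> bool:
    """True when each source pixel can't map to a whole number of panel pixels."""
    return native_px % src_px != 0

print(needs_interpolation(480, 720))   # True  - 720/480 = 1.5
print(needs_interpolation(480, 1080))  # True  - 1080/480 = 2.25
print(needs_interpolation(540, 1080))  # False - exact 2x, simple pixel doubling works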

Quote:
Originally Posted by cavu View Post

A 480p source bit-mapped 1:1 will produce a purer image than the same 480p image remapped to 720p or 1080p. But, unfortunately, most people have only one display, and all other sources must be forced to fit it.

Again, it completely depends on the size of the display, your viewing distance, and the differential between the source rez and the display rez. The benefits of scaling a 480 source to 1080p on a large display are significant when you sit inside the distance at which pixel structure becomes noticeable on a 480 display.

In this case - as I previously said - given the relatively small size of the display and the viewing distance, the OP would realize more benefit from getting a significantly larger 1080p display.

There are 10 types of people: those who understand binary, and those who don't.
