Are monitors/laptops better calibrated out of the box than TVs? - AVS Forum
post #1 of 40 Old 01-19-2012, 01:22 PM - Thread Starter
Member
 
sefmiller
Join Date: Nov 2011
Posts: 86
I know Macs/Windows have calibration software that alters contrast, brightness & gamma. There are no options for sharpness, colour, black level etc. So, I assume (probably wrongly) that monitors/laptops are closer to calibrated.

I have only ever had basic LCD monitors with very few customisable options. They always have the colour temperature at 6500K. This again makes me think they have been pre-calibrated. My LCD HDTV, on the other hand, is nowhere near calibrated.
sefmiller is offline  
post #2 of 40 Old 01-19-2012, 01:30 PM
AVS Special Member
 
derekjsmith
Join Date: Oct 2003
Location: Mukilteo, WA
Posts: 1,890
Quote:
Originally Posted by sefmiller View Post

I know Macs/Windows have calibration software that alters contrast, brightness & gamma. There are no options for sharpness, colour, black level etc. So, I assume (probably wrongly) that monitors/laptops are closer to calibrated.

I have only ever had basic LCD monitors with very few customisable options. They always have the colour temperature at 6500K. This again makes me think they have been pre-calibrated. My LCD HDTV, on the other hand, is nowhere near calibrated.

Yes and no. The higher-end monitors are more often closer to calibrated. But even then they tend to drift, and if you are running VGA those values can change as well with the DAC.

Derek

CTO / Founder - SpectraCal Inc.
derekjsmith is offline  
post #3 of 40 Old 01-19-2012, 02:15 PM
Advanced Member
 
PE06MCG
Join Date: Jun 2010
Location: West Yorkshire, UK
Posts: 729
Quote:
Originally Posted by derekjsmith View Post

Yes and no. The higher-end monitors are more often closer to calibrated. But even then they tend to drift, and if you are running VGA those values can change as well with the DAC.

I very recently bought your CalPC Client to run as an add-on to CalMAN v4 Pro, and after a few early learning problems regarding the way in which it operates, I can say it works extremely well with my Sony Vaio laptop.

Early days yet, but I'm impressed with its simple yet effective automatic operation using my OEM D3.
PE06MCG is offline  
post #4 of 40 Old 01-19-2012, 04:42 PM
Advanced Member
 
Smackrabbit
Join Date: Sep 2001
Location: Portland, OR, USA
Posts: 893
Quote:
Originally Posted by sefmiller View Post

I know Macs/Windows have calibration software that alters contrast, brightness & gamma. There are no options for sharpness, colour, black level etc. So, I assume (probably wrongly) that monitors/laptops are closer to calibrated.

I have only ever had basic LCD monitors with very few customisable options. They always have the colour temperature at 6500K. This again makes me think they have been pre-calibrated. My LCD HDTV, on the other hand, is nowhere near calibrated.

Are you saying that they are measuring 6500K out of the box, or that they offer 6500K as a preset? I've tested quite a few with a 6500K preset and never found them to be very close to that value, even when using the manufacturer-provided profile. I've had monitors with lots of options, and many with few options, but on a typical consumer monitor, I'm happy if I get an average dE of 8 on the GretagMacbeth color chart with no calibration.
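
For anyone wondering how an "average dE" figure like that is arrived at, here is a minimal sketch that averages CIE76 colour differences over a few patches. The Lab values below are made-up placeholders rather than real GretagMacbeth measurements, and reviewers may use dE94 or dE2000 rather than the simple dE76 shown here.

```python
# Minimal sketch: average colour error over a set of test patches.
# The (reference, measured) Lab values are invented examples, NOT real
# ColorChecker data; dE76 (plain Euclidean distance in L*a*b*) is used
# for simplicity.

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b*."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

patches = [
    ((37.99, 13.56, 14.06), (40.1, 10.2, 12.9)),    # "dark skin" (illustrative)
    ((65.71, 18.13, 17.81), (63.0, 22.5, 20.0)),    # "light skin" (illustrative)
    ((49.93, -4.88, -21.93), (47.5, -1.0, -18.0)),  # "blue sky" (illustrative)
]

avg_de = sum(delta_e_76(ref, meas) for ref, meas in patches) / len(patches)
print(f"average dE76 over {len(patches)} patches: {avg_de:.1f}")
```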

Chris Heinonen
Senior Editor, Secrets of Home Theater and High Fidelity
Displays Editor
Contributor
ISF Level II Certified Calibrator
Smackrabbit is offline  
post #5 of 40 Old 01-19-2012, 04:42 PM
Member
 
ZandarKoad
Join Date: Feb 2009
Posts: 136
My understanding is that monitors (including those built into laptops) don't need to have calibration settings themselves, as the video processing for those display devices is handled by the computer. Rather, you'd be looking for calibration settings in the video drivers / video software. Using those adjustments, you can perform a calibration and build look-up tables to feed the video processor, which allows it to display correct colors... So I imagine that even with an absolute crap monitor, as long as you had a good video card with good drivers, you could get a good, accurate calibration and picture (within the physical limitations of the monitor).

It's like having a TV with terrible calibration controls: it wouldn't matter at all if you added a video pre-processor that had all those controls... This is all based on what I've read. I could be wrong.
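
A minimal sketch of that idea, assuming a hypothetical panel whose native response is roughly gamma 2.4: the computer builds a per-channel correction ramp (a 1D LUT) that the graphics driver could load, so the combined response lands near a 2.2 target. Both gamma numbers are assumptions for illustration.

```python
# Sketch: build an 8-bit correction ramp ("gamma ramp" / 1D LUT) that the
# graphics driver could load to pull a hypothetical gamma-2.4 panel toward
# a 2.2 target. Both gamma values are assumed for illustration; a real
# calibration would derive the curve from meter readings.

TARGET_GAMMA = 2.2
MEASURED_GAMMA = 2.4  # assumed native behaviour of the panel

def build_ramp(levels=256):
    ramp = []
    for i in range(levels):
        x = i / (levels - 1)
        # pre-distort so that the panel's response to the corrected value
        # ends up at x ** TARGET_GAMMA
        corrected = x ** (TARGET_GAMMA / MEASURED_GAMMA)
        ramp.append(round(corrected * (levels - 1)))
    return ramp

ramp = build_ramp()
print(ramp[:8], "...", ramp[-4:])
```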
ZandarKoad is offline  
post #6 of 40 Old 01-26-2012, 03:18 AM
Senior Member
 
undermined
Join Date: Jun 2005
Posts: 211
Quote:
Originally Posted by ZandarKoad View Post

My understanding is that monitors (including those built into laptops) don't need to have calibration settings themselves, as the video processing for those display devices is handled by the computer. Rather, you'd be looking for calibration settings in the video drivers / video software. Using those adjustments, you can perform a calibration and build look-up tables to feed the video processor, which allows it to display correct colors... So I imagine that even with an absolute crap monitor, as long as you had a good video card with good drivers, you could get a good, accurate calibration and picture (within the physical limitations of the monitor).

It's like having a TV with terrible calibration controls: it wouldn't matter at all if you added a video pre-processor that had all those controls... This is all based on what I've read. I could be wrong.

Nope, most video cards still have an 8-bit LUT, so if the display has a bad gamma curve and its colors are off, using just the video card won't help much, and you soon get "posterized" colors and lose detail in crushed blacks and clipped whites.
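
A rough illustration of that posterisation point: when a correction is baked into an 8-bit LUT, 256 input codes get remapped onto fewer than 256 distinct output codes, so some steps collapse together. The correction gammas below are arbitrary examples.

```python
# Count how many distinct output codes survive when a gamma-style correction
# is stored in an 8-bit (256-entry, 8-bit output) LUT. The correction gammas
# are arbitrary examples.

def levels_after_8bit_lut(correction_gamma):
    lut = [round(((i / 255) ** correction_gamma) * 255) for i in range(256)]
    return len(set(lut))

for g in (1.0, 1.2, 1.5, 2.0):
    print(f"correction gamma {g}: {levels_after_8bit_lut(g)} distinct levels remain")
```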

You never, ever use the video card controls for proper color. You use whatever controls the monitor has, then use a colorimeter to profile the monitor and make a custom ICC profile that the OS will use to map the sRGB content PCs use onto your display.

The big difference between a monitor and a TV is that monitors are set up to display RGB 4:4:4 (0-255) and normally have controls to adjust the whitepoint. sRGB, the colorspace standard used on Windows and Mac OS, is essentially the same as Rec.709 with 2.2 gamma and a D65 whitepoint.
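
As a quick check of the "essentially 2.2 gamma" point, here is a small sketch comparing the piecewise sRGB encoding curve against a plain 2.2 power law; the sample levels are arbitrary.

```python
# Compare the piecewise sRGB encoding curve with a plain gamma-2.2 power law
# at a few arbitrary sample levels.

def srgb_encode(linear):
    """sRGB transfer function: linear light (0..1) to encoded value (0..1)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

for lin in (0.01, 0.05, 0.18, 0.5, 0.9):
    print(f"linear {lin:4.2f}: sRGB {srgb_encode(lin):.3f}  vs  gamma 2.2 {lin ** (1 / 2.2):.3f}")
```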

TVs try to look good in a big-box store. Monitors sort of try to look normal while displaying sRGB (although many displays boost the saturation to crazy levels), and their controls are by design there to fix deviations from the standard, whereas most TVs' controls are there to alter the video to look good, unless you get into a service menu.
undermined is offline  
post #7 of 40 Old 01-26-2012, 04:44 AM
AVS Special Member
 
Chronoptimist
Join Date: Sep 2009
Posts: 2,583
Apple calibrates their displays to be close to D65 out of the box, and most of their displays have an sRGB-like gamut, which means that colour is generally quite good out of the box.

With just about anything else, other than high-end graphics monitors from companies like Eizo, you can't count on them being close at all (even if they offer a "6500K" preset).

Most TVs generally have a pretty good film (or THX) mode these days as well though, so while it's not quite "out of the box", you can get good results on most without a proper calibration.


As for monitors which "boost the saturation to crazy levels", it's not so crazy. Most photo/video editing packages support colour management, as do some video players/renderers. Firefox supports full colour management for web browsing (the relevant setting must be set to 1), while Safari & Internet Explorer have partial support (tagged images only).

Digital cameras can capture colour far outside the sRGB gamut, and so you actually need a wide-gamut display for accurate image reproduction. The better cameras can even capture colour outside the Adobe RGB colourspace, which is why the ProPhoto colourspace exists.



As has been mentioned though, PCs only send 8-bit data to the display, so you don't want to do much in the way of calibration through the video card. High-end graphics monitors have 10-bit or greater internal LUTs, which can be calibrated with software such as ColorEyes.
Chronoptimist is offline  
post #8 of 40 Old 01-26-2012, 08:02 AM
AVS Special Member
 
rovingtravler
Join Date: Nov 2010
Location: Clovis, NM
Posts: 1,262
If you look at a site like Tom's Hardware or AnandTech you will see how far off most computer monitors and laptops are. Couple this with the fact that the average laptop only covers 60% of the Adobe RGB color space, and most monitors are in the 80-90% range, and you end up with bad colors.

David

"You buy a Ferrari when you want to be somebody. You buy a Lamborghini when you are somebody." - Frank Sinatra
rovingtravler is offline  
post #9 of 40 Old 01-26-2012, 11:55 AM
Advanced Member
 
Smackrabbit
Join Date: Sep 2001
Location: Portland, OR, USA
Posts: 893
Quote:
Originally Posted by rovingtravler View Post

If you look at a site like Tom's Hardware or AnandTech you will see how far off most computer monitors and laptops are. Couple this with the fact that the average laptop only covers 60% of the Adobe RGB color space, and most monitors are in the 80-90% range, and you end up with bad colors.

As the displays editor at AnandTech, I can tell you that most desktop LCDs are around 65-70% of AdobeRGB once calibrated and start out with an average dE of 8-10. A display is far more likely to have 100% of AdobeRGB than 80-90%, because once you hit that point you are way past sRGB (72% or so) and may as well go the full way. I don't test laptops so I can't comment on them.
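
For anyone curious where a "% of AdobeRGB" number comes from, here is a minimal sketch that treats each gamut as a triangle in CIE xy chromaticity and compares areas. This simple xy area ratio comes out near 74%; quoted figures such as the 72% above depend on the diagram and method used.

```python
# Estimate gamut coverage as a ratio of triangle areas in CIE xy chromaticity.
# (A crude approximation: reviewers often work in u'v' and intersect the
# gamut triangles instead of taking a plain area ratio.)

def tri_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb      = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # sRGB / Rec.709 primaries
adobe_rgb = [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)]  # Adobe RGB primaries

coverage = tri_area(*srgb) / tri_area(*adobe_rgb) * 100
print(f"sRGB covers roughly {coverage:.0f}% of AdobeRGB by xy area")
```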

Chris Heinonen
Senior Editor, Secrets of Home Theater and High Fidelity
Displays Editor
Contributor
ISF Level II Certified Calibrator
Smackrabbit is offline  
post #10 of 40 Old 01-26-2012, 12:05 PM
AVS Special Member
 
rovingtravler
Join Date: Nov 2010
Location: Clovis, NM
Posts: 1,262
Quote:
Originally Posted by Smackrabbit View Post

As the displays editor at AnandTech, I can tell you that most desktop LCDs are around 65-70% of AdobeRGB once calibrated and start out with an average dE of 8-10. A display is far more likely to have 100% of AdobeRGB than 80-90%, because once you hit that point you are way past sRGB (72% or so) and may as well go the full way. I don't test laptops so I can't comment on them.

Thanks for the correction. I was going off a Tom's Hardware article on non-pro monitors. I have to say I enjoy AnandTech much more.

David

"You buy a Ferrari when you want to be somebody. You buy a Lamborghini when you are somebody." - Frank Sinatra
rovingtravler is offline  
post #11 of 40 Old 01-26-2012, 02:38 PM
703
Member
 
Join Date: Mar 2006
Location: New Zealand
Posts: 167
What we have to keep in mind is that Windows and Mac OS feature a device-independent CMS.

As long as your monitor is profiled accurately and has been calibrated via a 3D LUT, it does a good job at image reproduction for applications that support CMS (including all modern web browsers and image/video editing applications).

The beauty of the CMS is that you can switch between working colour spaces quite easily, then output to a colour space like Rec.709, DCI-P3, Adobe RGB etc.

Founder | BullsEye Calibration
703 is offline  
post #12 of 40 Old 01-26-2012, 02:49 PM
AVS Special Member
 
sotti
Join Date: Aug 2004
Location: Seattle, WA
Posts: 6,621
In Windows there is no system-wide implementation of ICC, so every application has its own CMM implementation.

There are video card LUTs that can provide gamma and whitepoint correction, but in order to adjust the gamut you need to be using ICC, and ICC support is not pervasive.

Of course to do any of that you'd need to have a meter and some software to create the ICC profiles.

Joel Barsotti
SpectraCal
CalMAN Lead Developer
sotti is offline  
post #13 of 40 Old 01-26-2012, 05:22 PM
703
Member
 
Join Date: Mar 2006
Location: New Zealand
Posts: 167
Sotti - do you know if there are any improvements in Win 7's WCS compared to Vista's WCS?

I haven't yet seen any applications or devices using native WCS profiles. The industry doesn't seem to care much about this new architecture.

Founder | BullsEye Calibration
703 is offline  
post #14 of 40 Old 01-27-2012, 04:07 PM
Member
 
Guspaz
Join Date: Dec 2007
Posts: 31
Dell factory calibrates their higher-end screens (like the U2711), and ships them with a rather detailed benchmark report showing the results. It's about as good as you're going to get out-of-box, although that doesn't mean you can't achieve better results with other monitors if you DIY. It's convenient for the lazy, like me :P
Guspaz is offline  
post #15 of 40 Old 02-02-2012, 05:09 PM
Senior Member
 
mcantu1
Join Date: Dec 2009
Posts: 471
Is it true that PC monitors tend to be set to 7500K? I've always heard that but don't know where it came from...
mcantu1 is offline  
post #16 of 40 Old 02-02-2012, 08:23 PM
Advanced Member
 
Smackrabbit
Join Date: Sep 2001
Location: Portland, OR, USA
Posts: 893
Quote:
Originally Posted by mcantu1 View Post

Is it true that PC monitors tend to be set to 7500K? I've always heard that but don't know where it came from...

It really varies from model to model. Even ones with an sRGB mode are rarely that close to D65.

Chris Heinonen
Senior Editor, Secrets of Home Theater and High Fidelity
Displays Editor
Contributor
ISF Level II Certified Calibrator
Smackrabbit is offline  
post #17 of 40 Old 02-03-2012, 03:37 AM
AVS Special Member
 
Mr.D
Join Date: Dec 2001
Location: UK
Posts: 3,307
Quote:
Originally Posted by undermined View Post

Nope, most video cards still have an 8-bit LUT, so if the display has a bad gamma curve and its colors are off, using just the video card won't help much, and you soon get "posterized" colors and lose detail in crushed blacks and clipped whites.

Nope, the precision of a LUT is not necessarily something that will impact the actual precision of the image with regard to posterisation. Most video cards work internally at greater than 8-bit these days (whether they are actually allowed to by certain drivers, or utilised this way by certain software, is another matter).

An 8-bit LUT itself does not mean that the image path will be limited to that precision. 8-bit LUTs are actually quite "large" as far as LUTs go; you can get away with 6-bit LUTs for high-end colour work, especially if you need good interactivity.
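
A small sketch of that distinction: a 64-entry ("6-bit") LUT applied with interpolation in a floating-point pipeline still produces far more than 64 distinct output levels, so the LUT's entry count does not by itself set the precision of the image path. The gamma curve stored in the LUT is just an example.

```python
# Apply a coarse 64-entry LUT with linear interpolation at float precision
# and count the distinct outputs; the stored curve (gamma 2.2) is only an
# example.

def apply_lut_interpolated(x, lut):
    """Apply a 1D LUT to a float value in [0, 1] with linear interpolation."""
    pos = x * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

lut = [(i / 63) ** 2.2 for i in range(64)]  # 64-entry ("6-bit") LUT, float entries

outputs = {round(apply_lut_interpolated(i / 4095, lut), 6) for i in range(4096)}
print(f"distinct outputs from a 64-entry LUT with interpolation: {len(outputs)}")
```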

digital film janitor
Mr.D is offline  
post #18 of 40 Old 02-03-2012, 03:42 AM
AVS Special Member
 
Mr.D
Join Date: Dec 2001
Location: UK
Posts: 3,307
Quote:
Originally Posted by undermined View Post

RGB 4:4:4

I really wish people would stop saying this. It's a misnomer to use a chroma sampling rate designation with a native RGB colorspace; it only applies meaningfully to component-type encoded signals, and even then saying 4:4:4 is pretty much redundant, as it's actually saying there is no chroma subsampling, so why state it in the first place?


And yes I'm being pedantic.

digital film janitor
Mr.D is offline  
post #19 of 40 Old 02-03-2012, 04:04 AM
AVS Special Member
 
Mr.D
Join Date: Dec 2001
Location: UK
Posts: 3,307
Quote:
Originally Posted by Chronoptimist View Post


As has been mentioned though, PCs only send 8-bit data to the display, so you don't want to do much in the way of calibration through the video card. High end graphics monitors have 10-bit or greater internal LUTs

This is not true; some cards can output 10- and even 12-bit YCbCr, although it will be 4:2:2 at best (I think you can go 4:4:4 with 10-bit though).

However, I've yet to see anyone qualify a card as being completely transparent in this regard; most do an intermediate conversion to RGB for processing and then back to YCbCr for output (or even back and forth again). Depending on how it's done you might get a benefit though.

Lastly, I don't think I've ever seen a consumer digital panel that was truly transparent with 8-bit video at the screen, regardless of the stated precision of the processing; most of them seem to be about 6-bit, albeit with a non-linear mapping of that precision.

The only thing I'd stick my neck out for would be a pro Barco DCI cinema projector, to be honest.

As for 3D LUTs, you can get this capability with certain PC routes (Upsilon mixer and MadVR, for example), but it's dependent on the playback software too.

So it's a bit of a mess, with no real reason behind it other than software and hardware devs not using the PC platform to its fullest potential.

Shame really; I hope it's something that gets sorted in the future (then again, they could have sorted it over 10 years ago).

digital film janitor
Mr.D is offline  
post #20 of 40 Old 02-03-2012, 04:22 AM
AVS Special Member
 
Mr.D
Join Date: Dec 2001
Location: UK
Posts: 3,307
Quote:
Originally Posted by Smackrabbit View Post

It really varies from model to model. Even ones with an sRGB mode are rarely that close to D65.

Never in my experience. And remember that calibration has to factor in the source, so you can't really say a monitor is calibrated unless you appraise it with a given source device and quantify the whole playback chain in terms of the end result on the screen.

I would suspect the number of instances of a given source with a given display producing a totally accurate result, without objective calibration of the chain, is zero.

digital film janitor
Mr.D is offline  
post #21 of 40 Old 02-03-2012, 04:23 AM
AVS Special Member
 
Chronoptimist
Join Date: Sep 2009
Posts: 2,583
Quote:
Originally Posted by Mr.D View Post

This is not true; some cards can output 10- and even 12-bit YCbCr, although it will be 4:2:2 at best (I think you can go 4:4:4 with 10-bit though).

I believe the pro AMD cards can output 10-bit RGB, but there is virtually no software which actually sends them anything more than 8-bit data, at least on the consumer side. I'm sure there is probably some pro-level colour grading hardware/software that can do this though.

I think even some of the consumer-grade AMD cards now report sending 10-bit RGB data to some displays over HDMI, but that just means the card is "upsampling" the data (input value x4 = "10-bit" value); there isn't actually any more precision, even when you're working in programs that use 16-bit or greater internal precision (such as Photoshop etc.).

Quote:
Originally Posted by Mr.D View Post

Lastly, I don't think I've ever seen a consumer digital panel that was truly transparent with 8-bit video at the screen, regardless of the stated precision of the processing; most of them seem to be about 6-bit, albeit with a non-linear mapping of that precision.

The only thing I'd stick my neck out for would be a pro Barco DCI cinema projector, to be honest.

That's interesting to hear. I've been of that opinion as well, though my only reference for how good an 8-bit signal should look has been CRT monitors, rather than pro-level hardware.
Chronoptimist is offline  
post #22 of 40 Old 02-03-2012, 06:54 AM
AVS Special Member
 
Mr.D
Join Date: Dec 2001
Location: UK
Posts: 3,307
Quote:
Originally Posted by Chronoptimist View Post


I think even some of the consumer-grade AMD cards now report sending 10-bit RGB data to some displays over HDMI, but that just means the card is "upsampling" the data (input value x4 = "10-bit" value) there isn't actually any more precision, even when you're working in programs that use 16-bit or greater internal precision. (such as Photoshop etc.)

Any processing that happens in the card, such as applying LUTs, will benefit from being performed at a higher precision. If this is then carried through to the output as 10-bit, it mitigates against the processing generating additional posterisation from rounding errors, as compared with deflating back to 8-bit. This is why people are so keen to output 10-bit with standalone CMS boxes.

If you had a 16-bit image and passed it to a graphics card, it would be deflated down to the output depth of the graphics card. 8-bit output will likely show some banding in certain ranges even if it is not in the actual image; 10-bit output would benefit from greater transparency in this regard, and I'd say it's unlikely you would see any banding on a monitor that is notionally transparent to 10-bit.

As most digital displays within the reach of the average consumer are not transparent to 8-bit, let alone 10-bit, it's unlikely to make much difference, but... it might just have less banding in certain areas.

I often find analogue RGBHV (VGA) from a decent graphics card has less visible banding and generally looks cleaner (especially to a plasma), as the 10-bit-plus processing becomes apparent through the DAC back into analogue. The trade-off is that it may be softer (I've not really noticed this) and it may be slightly noisier (which actually aids in hiding posterisation anyway).

digital film janitor
Mr.D is offline  
post #23 of 40 Old 02-03-2012, 07:16 AM
AVS Special Member
 
Chronoptimist
Join Date: Sep 2009
Posts: 2,583
Quote:
Originally Posted by Mr.D View Post

Any processing that happens in the card, such as applying LUTs, will benefit from being performed at a higher precision. If this is then carried through to the output as 10-bit, it mitigates against the processing generating additional posterisation from rounding errors, as compared with deflating back to 8-bit. This is why people are so keen to output 10-bit with standalone CMS boxes.

Yes, I agree with you here; the problem is that almost nothing actually sends more than 8-bit data to the video card.

Quote:
Originally Posted by Mr.D View Post

If you had a 16-bit image and passed it to a graphics card, it would be deflated down to the output depth of the graphics card. 8-bit output will likely show some banding in certain ranges even if it is not in the actual image; 10-bit output would benefit from greater transparency in this regard, and I'd say it's unlikely you would see any banding on a monitor that is notionally transparent to 10-bit.

If it was being done on the GPU like that, yes, but it's not. The program itself is working with 16-bit (or more) precision internally, but only passing 8-bit data out to the GPU. This is the same for virtually all applications due to how the OS is architected.

It's my understanding that your program needs to use a specific 10-bit (or more) output mode to actually get more than 8-bit data to the card. I believe it needs an OpenGL or Direct3D output? I'm not too clear on the specifics, I just know that even if your GPU is reporting "10-bit" colour, in most if not all cases, you aren't seeing anything more than 8-bit data.

Photoshop might actually be a bad example, as I think they do now have the option of running things on the GPU.
Chronoptimist is offline  
post #24 of 40 Old 02-03-2012, 08:46 PM
AVS Special Member
 
Mr.D
Join Date: Dec 2001
Location: UK
Posts: 3,307
Quote:
Originally Posted by Chronoptimist View Post

It's my understanding that your program needs to use a specific 10-bit (or more) output mode to actually get more than 8-bit data to the card. I believe it needs an OpenGL or Direct3D output? I'm not too clear on the specifics, I just know that even if your GPU is reporting "10-bit" colour, in most if not all cases, you aren't seeing anything more than 8-bit data.

Photoshop might actually be a bad example, as I think they do now have the option of running things on the GPU.

In my experience, things like Photoshop, Shake and Nuke (not sure about After Effects, but I consider it prosumer) have always demonstrated cleaner visual output when moving to 10-bit or greater processing cards (this is coming at it from when the cards only did 8-bit processing, even high-end ones).

In terms of the level of precision I generally use when appraising these sorts of attributes, I don't think I'm merely looking at 8-bit data hidden in 10-bit or greater dither. And this is when creating imagery that has to stand up to display on hardware that is truly transparent to at least 10-bit, if not 12.

digital film janitor
Mr.D is offline  
post #25 of 40 Old 02-03-2012, 09:06 PM
703
Member
 
Join Date: Mar 2006
Location: New Zealand
Posts: 167
Quote:
Originally Posted by Chronoptimist View Post

I believe the pro AMD cards can output 10-bit RGB, but there is virtually no software which actually sends them anything more than 8-bit data, at least on the consumer side. I'm sure there is probably some pro-level colour grading hardware/software that can do this though.

I think even some of the consumer-grade AMD cards now report sending 10-bit RGB data to some displays over HDMI, but that just means the card is "upsampling" the data (input value x4 = "10-bit" value); there isn't actually any more precision, even when you're working in programs that use 16-bit or greater internal precision (such as Photoshop etc.).

That's interesting to hear. I've been of that opinion as well, though my only reference for how good an 8-bit signal should look has been CRT monitors, rather than pro-level hardware.

AMD and Nvidia cards can output 10-bit RGB via HDMI/DisplayPort without upsampling. E.g. MadVR has a 16-bit internal processing queue, which is then dithered down to the 10-bit RGB output bitdepth.

OpenGL supports 10-bit in Windowed and Full Screen Modes, while Direct3D supports 10-bit in Full Screen mode only. MadVR uses Direct3D.

In fact, Nvidia cards are quite popular in post-production because of CUDA, which helps a lot in processing 2K and 4K material.

Founder | BullsEye Calibration
703 is offline  
post #26 of 40 Old 02-03-2012, 09:54 PM
AVS Special Member
 
sotti
Join Date: Aug 2004
Location: Seattle, WA
Posts: 6,621
Quote:
Originally Posted by 703 View Post

AMD and Nvidia cards can output 10-bit RGB via HDMI/DisplayPort without upsampling. E.g. MadVR has a 16-bit internal processing queue, which is then dithered down to the 10-bit RGB output bitdepth.

OpenGL supports 10-bit in Windowed and Full Screen Modes, while Direct3D supports 10-bit in Full Screen mode only. MadVR uses Direct3D.

In fact, Nvidia cards are quite popular in post-production because of CUDA, which helps a lot in processing 2K and 4K material.

How can you force that and verify it?

I have a monitor that I can feed 10-bit RGB via DVI, but only when running fullscreen D3D. I saw that I could do that in MadVR, but I couldn't verify the rendered bit depth with the HUD (Ctrl-J).

Joel Barsotti
SpectraCal
CalMAN Lead Developer
sotti is offline  
post #27 of 40 Old 02-04-2012, 01:40 AM
AVS Special Member
 
Chronoptimist
Join Date: Sep 2009
Posts: 2,583
Quote:
Originally Posted by Mr.D View Post

In my experience, things like Photoshop, Shake and Nuke (not sure about After Effects, but I consider it prosumer) have always demonstrated cleaner visual output when moving to 10-bit or greater processing cards (this is coming at it from when the cards only did 8-bit processing, even high-end ones).

In terms of the level of precision I generally use when appraising these sorts of attributes, I don't think I'm merely looking at 8-bit data hidden in 10-bit or greater dither. And this is when creating imagery that has to stand up to display on hardware that is truly transparent to at least 10-bit, if not 12.

Well the newest version of Photoshop does now support 10-bit output, now that it can render the image on the GPU, but it requires a Quadro or FirePro card.

There is no support for 2D 10-bit output in Windows at all; it requires D3D or OpenGL rendering. OS X doesn't have any native support for 10-bit colour, even on Quadro or FirePro cards.

I don't believe Shake has 10-bit support, and I'm not sure about Nuke. (a quick search suggests that it does not)

That's not to say that having a 10-bit LUT won't be beneficial at all. I suspect that if you are using ICC profiles, the data in the video card LUT will at least have greater precision there, but it's certainly not going to be transparent to 10-bit if you are doing that. (though it may be transparent to 8-bit?)

Quote:
Originally Posted by 703 View Post

AMD and Nvidia cards can output 10-bit RGB via HDMI/DisplayPort without upsampling. E.g. MadVR has a 16-bit internal processing queue, which is then dithered down to the 10-bit RGB output bitdepth.

That's not true at all. MadVR does use 16-bit internal precision and has a D3D output, but it does not support 10-bit output; it only passes dithered 8-bit data to the display.

There has been talk that it may get 10-bit output added to the Fullscreen Exclusive mode in a future update, but as of v0.80, it does not. This is what I mean about not believing what your display is telling you. With the exception of a very limited number of specialist applications, almost nothing actually passes more than 8-bit data to the GPU. It's not as simple as a developer flipping a switch and enabling support; the application has to use a specific OpenGL or Direct3D output to enable it.
Chronoptimist is offline  
post #28 of 40 Old 02-04-2012, 05:30 AM
AVS Special Member
 
madshi
Join Date: May 2005
Posts: 5,467
Quote:
Originally Posted by Chronoptimist View Post

MadVR does use 16-bit internal precision and has a D3D output, but it does not support 10-bit output; it only passes dithered 8-bit data to the display.

Correct. I do have 10-bit output on my to-do list, but it doesn't have a high priority at the moment, because I believe that dithering the internal calculation bitdepth (16-bit+) down to 8-bit should produce results that are virtually indistinguishable from 10-bit output. The only gain you get with 10-bit output is a lower noise floor.
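
A minimal sketch of that trade-off: quantising a slow 16-bit gradient down to 8 bits by plain rounding leaves flat steps, while adding random dither before rounding keeps the local average on target at the cost of a small noise floor. The ramp, window size and uniform dither are arbitrary stand-ins for what a real renderer would do.

```python
# Quantise a slowly changing 16-bit ramp to 8 bits, with and without dither,
# and compare how well the local averages track the original signal. The
# ramp values, window size and simple uniform dither are illustrative only.

import random
random.seed(0)

def to_8bit(value16, dither=False):
    x = value16 / 65535 * 255
    if dither:
        x += random.uniform(-0.5, 0.5)  # crude stand-in for proper dithering
    return min(255, max(0, round(x)))

ramp16 = [1000 + i // 2 for i in range(2000)]  # a dark, slowly rising gradient

plain = [to_8bit(v) for v in ramp16]
dith = [to_8bit(v, dither=True) for v in ramp16]

for name, q in (("plain rounding", plain), ("with dither", dith)):
    worst = 0.0
    for s in range(0, len(ramp16), 200):  # 200-sample windows
        true_mean = sum(v / 65535 * 255 for v in ramp16[s:s + 200]) / 200
        worst = max(worst, abs(sum(q[s:s + 200]) / 200 - true_mean))
    print(f"{name}: {len(set(q))} distinct codes, "
          f"worst local-average error {worst:.3f} 8-bit steps")
```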
madshi is online now  
post #29 of 40 Old 02-04-2012, 05:51 AM
Advanced Member
 
janos666
Join Date: Jan 2010
Location: Hungary
Posts: 594
Laptops - HELL NO!

Cheap LCDs - no; they are rarely close to a D65 white, very rarely offer standard color spaces, and almost never offer a usable 3D CMS in hardware. (But at least they are often additive, so a decent software CMS can do a nice job if the display is calibrated and profiled correctly with a proper sensor and software package.)

Expensive semi-professional and real professional monitors (with individually calibrated 3D LUTs) - oh, YES! (No current consumer-grade HDTV can reach that level of color accuracy, whether you compare out-of-box to out-of-box, calibrated to calibrated, or even an out-of-box monitor to a calibrated HDTV.)

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list)
janos666 is offline  
post #30 of 40 Old 02-04-2012, 05:56 AM
Advanced Member
 
janos666
Join Date: Jan 2010
Location: Hungary
Posts: 594
Quote:
Originally Posted by madshi View Post

Correct. I do have 10-bit output on my to-do list, but it doesn't have a high priority at the moment, because I believe that dithering the internal calculation bitdepth (16-bit+) down to 8-bit should produce results that are virtually indistinguishable from 10-bit output. The only gain you get with 10-bit output is a lower noise floor.

Good to hear you didn't forget about that. I am still waiting to check whether it's useful with plasma TVs, which are very noisy to start with (by design, because they have very few real gradation steps to work with).

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list)
janos666 is offline  