HOW-TO: Calibrating Display to Match HTPC Output - Page 16 - AVS Forum
post #451 of 486 Old 05-12-2008, 03:44 AM
mbmoler - Member
Join Date: Mar 2008
Posts: 45
Hi, I have an ATI Radeon HD2400XT graphics card connected to my projector. My problem is that I can see BTB (blacker-than-black) bars even when I am not supposed to. There is a setting on my projector where BTB should not be visible, and with my Denon DVD player it is not, but when using the PC it is. So I guess I have to set the PC to output video levels, but I am using PowerDVD Ultra to watch Blu-ray from hard disk. How can I get it to output video levels?
post #452 of 486 Old 05-25-2008, 07:35 PM
 
jane1043
Join Date: Mar 2008
Posts: 3
AHh, so follows hot
post #453 of 486 Old 05-27-2008, 05:12 AM
IanD - AVS Special Member
Join Date: Apr 2000
Posts: 1,833
Quote:
Originally Posted by maxleung View Post

All calibration I do is done on the display if possible.

(Calibration devices like a Spyder2 or other colorimeter could help in your case, at least for desktop color.)

I just calibrated my CRT monitor with a Spyder2 and it made a huge difference to the desktop in removing a greenish tint that pervaded neutral colours. My understanding is that it accomplished this by creating a profile for the graphics card which is loaded when Windows starts.

However, when playing any sort of video with PowerDVD, I still have the same greenish tint within the video window.

I think I have heard on this forum that video bypasses the colour profiles.

Consequently, is there a way to calibrate video via the driver (eg adjusting RGB gamma sliders) in a similar way to how the desktop is calibrated via a colorimeter (initial RGB balancing with the help of the colorimeter then fine tuning with the profile)? It certainly made a difference to the colour accuracy of the desktop and I would really like to achieve the same thing for video playback.
post #454 of 486 Old 06-17-2008, 02:01 AM
kkendall - Newbie
Join Date: Jun 2008
Posts: 5
Quote:
Originally Posted by Entrecourt View Post

This is a very informative thread.

I understand that one should adjust picture settings primarily on the TV display; however I, like many others, am running multiple input sources (e.g. H264 DVB-S2, MPEG2 DVD) with different codecs (NVIDIA, CoreAVC) outputting via a single HDMI cable. PQ across the sources is not uniform, so I believe I need to adjust the picture settings for each codec.

What's the recommended best practice in such situations?

Thanks.


Yeah, that is my situation too.
I'd really like to know how I can make sure my progressive 24 fps files are sent to my display at 24 Hz, my interlaced 25 fps sources at 50 Hz, progressive 25 fps at 25 or 50 Hz, progressive 29.97 fps at 29.97 Hz or 59/60 Hz, etc.

Is there a way to set this on the videocard, player, or somewhere else?
post #455 of 486 Old 08-29-2008, 12:04 PM
rajdude - AVS Special Member
Join Date: May 2005
Location: Woodbridge, VA
Posts: 2,102
I find this tool very useful - much better than anything else I have seen.

Now, is there a DVD/Blu-ray/video version of this same pattern available?

I know ramps are on many discs, but this type of pattern with BTB and WTW ramps is really good!

Quote:
Originally Posted by cyberbri View Post

Another good tool to use is the pair of image files created by 3no in this post. One represents the black end of the spectrum, and the other represents the white. The backgrounds are 0 and 255, respectively, IIRC, and each has 25 vertical bars from 1-25 and 230-255. If I need to quickly verify my brightness level on my DVI/HTPC, this is what I use.


-Rajiv
post #456 of 486 Old 09-16-2008, 06:57 AM
Torch11 - Newbie
Join Date: Sep 2008
Posts: 11
Great Info.
post #457 of 486 Old 12-05-2008, 09:42 AM
larry2007 - Member
Join Date: Jan 2007
Posts: 16
I am trying to figure out the effects of the various pieces that might be used in an HTPC video chain, and the language. I am not in any way trying to be annoying. I just think in an engineering way, and sometimes the way others think and communicate doesn't really clear things up for me. I hope this inspires those who maintain the threads to take these ideas and language and add this kind of clarity, and a table, to the top-level postings.

At the top of my confusion list is the whole digital video levels range thing. I see this as a video dynamic range. Based on my reading, it seems that PC digital outputs tend to output from 0 to 255 (00h-FFh), yet the rest of the industry seems to use 16 to 235 (10h-EBh), video levels. It seems to me that if a source is encoded over 0-255 and then finally displayed at this range, it will have a dynamic range of 256 levels. If the source is created over 16-235 and displayed at 16-235, it has a dynamic range of 220 levels. If the same 16-235 source is displayed at 0-255 by a digital look-up-table expansion of some algorithm, it has no more information, and can be collapsed and expanded as many times as desired without any further loss of accurate data, always having a dynamic range of 220 levels. If, however, it started life as a 0-255 source and was then compressed digitally to 16-235, it would have a one-time loss of data (256 levels reduced to 220 levels), but after that initial loss it can be expanded and compressed digitally as many times as desired, always having 220 levels of dynamic range data. This seems to be slightly complicated by the fact that in the video world, care is not taken to have the lowest level (black) and the highest level (white) stay within the 16 and 235 boundaries as hard limits, relying instead on the flexible and dynamic nature of the edges in an analog display world, whereas in the 0-255 world this boundary condition is rigid (nothing less than 0 nor greater than 255). This seems silly but is perhaps a reality, and is not the focus of my confusion.

I record HDTV signals from both cable and over the air to my PC (Vista MC), and then display them digitally from the video card's (NVIDIA) DVI interface to the HDMI input on the plasma monitor. I am relatively sure that the signal is being sent with a 0-255 range. I am relatively sure that the monitor treats 16 as all black, and anything less than 16 as just as black as 16 (my blacks are squashed or blended together when watching HDTV video). It treats white similarly (235-255). I suspect that a DVD played on this same HTPC will also output a 0-255 level range. I am also relatively sure that the same DVD (let's say an Avia test disc) will output a 16-235 level range when played in my PS3 into HDMI2 on the same plasma display (or both HDMI inputs into a receiver and one switched output from the receiver into the plasma). Is there an obvious piece of equipment that I can use to see the digital (hex) bit stream to the monitor?

Many threads use the term clipping. What does a 12 (or anything from 0-15) look like when it is clipped in the bit stream? Is it a 16 (black) or a 235 (white)? I.e., does blacker-than-black stay black, or does it flip to white (or whatever a pixel looks like with no signal) on a digital display (not CRT)? My guess is that it always stays black, or people would be really complaining (0-15 all turn into 16). White would likely be handled similarly. What is the term or phrase for the opposite look-up-table digital translation, where we would expand the effective dynamic range to make a level-16 black into a level-0 black in the bit stream (and a 235 into a 255 white)? I have never seen a thread clearly describe this change, even though I have read paragraph after paragraph try, and I have read endless debates over what should be done. Another term is overlay, which sounds like a look-up table (LUT). What level of step causes stair-stepping/banding?

Input range | No change / clipped result            | LUT expand result
0-15        | 16 (squashed to black)                | 0 (data lost)
16-235      | 16-235 (grey looks black, white grey) | 0-235 (220 levels)
236-255     | 235 (white compressed)                | 255 (data lost)
16-235      | 16-235 (no change)                    | 0-255 (220 levels, no loss)
0-15        | passed through                        | 0-15 (16 levels of data)
16-235      | passed through                        | 16-235 (220 levels of data)
236-255     | passed through                        | 236-255 (20 levels of data)




Related, but seemingly different, is the concept of IRE. Is this a number from 1-100 that represents a percentage of the ranges described above (e.g. 16-235)? Or is it the actual analog voltage in a CRT for the intensity? Is each step the same (linear), or is it a logarithmic scale made to appear the same? Does this variable (IRE) matter for digital signals (HDMI)? If it is an analog voltage, why not just use the voltage (e.g. 350 mV) instead of confusing things with another term (IRE)? If it is a reference level to an A/D, then there is a mapping from an IRE number to a digital level, perhaps with an offset. Why does Avia use IRE on the DVD if it is a voltage, as clearly there can be no voltages recorded as holes in a plastic disc, though I guess it could be a digital number representing the mV output from an 8-bit D/A.

The last related term that needs some clarity is gamma. I assume that there is an exponential equation for gamma based on the difference between two numbers, where gamma is in the exponent, and which mathematically explains the step changes (1% changes, or 1 on the delta of the 220-level dynamic range 16-235, gets some amount brighter at each step). If this is the case, I would think that getting gamma right in the middle of the range, 50% (~6Dh) +/- 20%, would be more important than getting the step from 30% to 100% (recommended by Tom Huffman) right. In other words, getting the % step from 40% to 50% intensity (Y) to be a step of gamma 2.2 (e.g. L = lo + ((V-vo)/0.7)^gamma).

Enough with the definitions and detail. The heart of my post is to fix and understand my (and, I am sure, a lot of others') range of solutions for fixing the PC/video level issue.

How do we set up the PC to always (source = any: HD tuner, analog tuner, DVD, web downloads, mp4, avi, etc) output levels of 16-235 out the DVI/HDMI interface, so that it matches the calibration set by any other video source (PS3) into the receiver or monitor, where they use 16-235?

Do we adjust the video card (big changes to everything; a bad idea in my mind)? Is there a normal output setting/mode for the video card (NVIDIA) that does this (0-255 to 16-235), requiring only very minor adjustments after the display is calibrated to the PS3 DVD HDMI test source? Is there a software translator or renderer (codec?) that can be used, requiring only very minor adjustments after the display is calibrated to the alternate source (PS3 DVD HDMI test source)? Does a compromise have to be made at the PC, where one signal source (HD tuner, DVD, etc.) needs to be chosen as the master, and the rest will suffer? If so, I would choose the HD tuner first, and use the PS3 as the DVD source.

-Larry
post #458 of 486 Old 12-05-2008, 06:18 PM
stanger89 - AVS Addicted Member
Join Date: Nov 2002
Location: Marion, IA
Posts: 17,359
Quote:
Originally Posted by larry2007 View Post

I am trying to figure out the effects of the various pieces that might be used in an HTPC video chain, and the language. I am not in any way trying to be annoying. I just think in an engineering way,

You're not the only one

Quote:


and sometimes the way others think and communicate doesn’t really clear things up for me. I hope this inspires those that maintain the threads to take these ideas and language and add this form of clarity and table to the top level postings.

Welcome to the world of the internets.

Quote:


At the top of my confusion list is the whole digital video levels range thing. I see this as a video dynamic range. Based on my reading, it seems that PC digital outputs tend to output from 0 to 255 (00h-FFh), yet the rest of the industry seems to use 16 to 235 (10h-EBh), video levels. It seems to me that if a source is encoded over 0-255 and then finally displayed at this range, it will have a dynamic range of 256 levels. If the source is created over 16-235 and displayed at 16-235, it has a dynamic range of 220 levels.

So far so good. For a little background, digital "images" (let's call them) are most often represented by 8-bit/channel integers, where there are three channels, either RGB or luminance plus colour difference. The Windows desktop is 8-bit; the "32-bit color" label comes from the 8-bit RGB channels plus an 8-bit alpha (transparency) channel. Movies delivered to end users are also coded at 8 bits/channel.

The difference between "PC" and "Video" levels is which of those 8-bit integer values are defined as black and white. For "PC", 0 is black and 255 is white; there is no accounting for head/toe room because PCs generate the content they display. For "Video", 16 and 235 were chosen for black and white (respectively); the head/toe room is there to account for under/overshoot of the analog signals that are sampled to create digital video content.

And yes, PC content has an 8-bit dynamic range, while video has about a 7.8-bit dynamic range. One thing I will say is that that doesn't mean 256:1 or 220:1; both are capable of "infinite" dynamic range, because the minimum value represents no light (0 ftL or cd/m^2 or whatever other measurement), hence how displays can (validly) reproduce dynamic ranges of many thousands to one from those ~8 bits.

Quote:


If the same 16-235 source is displayed at 0-255 by a digital look-up-table expansion of some algorithm, it has no more information, and can be collapsed and expanded as many times as desired without any further loss of accurate data, always having a dynamic range of 220 levels.

Unfortunately that is not the case. "Video levels" cannot be "cleanly" converted into PC levels; the conversion results in banding due to the requisite rounding errors when mapping 220 levels onto 256.

Quote:


If, however, it started life as a 0-255 source and was then compressed digitally to 16-235, it would have a one-time loss of data (256 levels reduced to 220 levels), but after that initial loss it can be expanded and compressed digitally as many times as desired, always having 220 levels of dynamic range data.

Yes, but with the same caution/note above, the conversion is not "lossless".
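To see why that compression step is lossy, here is a quick Python sketch (my own illustration, not from the thread; it uses the exact 219/255 scale factor, of which 220/256 is the usual approximation). Squeezing 256 input values onto 220 output values forces some neighbouring inputs to collide:

```python
def compress_to_video(x):
    """Map an 8-bit PC-level value (0-255) onto video levels (16-235)."""
    return round(x * 219 / 255) + 16

# 256 inputs land on only 220 outputs, so neighbours must collide;
# once they do, the original distinction is gone for good:
print(compress_to_video(3), compress_to_video(4))  # both map to 19
```

After a collision like this, no later expansion can recover whether the source pixel was 3 or 4, which is exactly the one-time loss being described above.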

Quote:


I record HDTV signals from both cable and over the air to my PC (Vista MC), and then display them digitally from the video card (nVidea) DVI interface to HDMI input on the Plasma monitor. I am relatively sure that the signal is being sent with a 0-255 range.

The signal you are receiving from cable and OTA is an 8-bit/channel (256-value) signal, but it contains "video" information, meaning the data it carries defines black as 16 and white as 235. It is a 16-235 video signal.

Beyond that, all commercial/official video (that I'm aware of) is coded to video levels (black=16, white=235), DVD, Blu-ray, ATSC, QAM Cable, Satellite. The only exception is perhaps some "for PC" video (like PC trailers).

Quote:


I am relatively sure that the monitor treats 16 as all black, and anything less than 16 as just as black as 16 (my blacks are squashed or blended together when watching HD TV video). It similarly treats white that way (235-255).

That will depend on the TV and/or settings. Many TVs have an option of whether to interpret the input as "PC" (0-255) or "Video" (16-235). Unfortunately there is no standard for this definition/naming. You are correct though about what would happen at the display when it is set to "Video" levels, and is being fed "PC" levels.

Quote:


I suspect that a DVD disk played in this same HTPC will also output a 0-255 level range.

This will depend on the player and video drivers you are using (as well as the settings thereof).

Quote:


I am also relatively sure that the same DVD (lets say an Avia test disk), will output a 16-235 level range when placed in my PS3 player into HDMI2 into the same plasma display (or both HDMI inputs into a receiver and one switched output from the receiver into the Plasma).

That is very likely.

Quote:


Is there an obvious piece of equipment that I can use to see the digital (hex) bit stream to the monitor?

I'm sure they exist, but they would not be cheap.

Quote:


Many threads use the term clipping. What does a 12 (or anything from 0-15) look like when it is clipped in the bit stream? Is it a 16 (black) or a 235 (white)? I.e., does blacker-than-black stay black, or does it flip to white (or whatever a pixel looks like with no signal) on a digital display (not CRT)? My guess is that it always stays black, or people would be really complaining (0-15 all turn into 16). White would likely be handled similarly.

Clamping is maybe a more illustrative term for what happens; think of it like this when converting levels:

output(x) = min(max((x-16)*256/220,0),255)

So output(12) = (12-16)*256/220 = -4.6, max(-4.6,0) = 0, min(0,255) = 0
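As a runnable sketch of that clamping behaviour (the function name is mine; it uses the exact 255/219 scale, of which the 256/220 above is the usual approximation):

```python
def expand_to_pc(x):
    """Expand an 8-bit video-level value to PC levels, clamping
    out-of-range input rather than letting it wrap."""
    return min(max(round((x - 16) * 255 / 219), 0), 255)

# Below-black input clamps to black; it does not flip to white:
print(expand_to_pc(12))   # 0
print(expand_to_pc(16))   # 0
print(expand_to_pc(235))  # 255
print(expand_to_pc(255))  # 255 (whiter-than-white clamps too)
```

The inner max() is what guarantees a 12 comes out as black (0), answering the "does it flip to white?" question above.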

Quote:


What is the term or phrase for the opposite look up table digital translation, where we would expand the effective dynamic range to make a level 16 (black) into a level 0 black in the bit stream (and a 235 into a 255 white)?

That's usually referred to as "expanding" to PC levels.

Quote:


Another term is “overlay”, which sounds like a look up table (LUT).

"Overlay" is a hardware device that mixes video content with static desktop content, it's a "renderer" that inserts video into the PC display output. It's been replaced/superceeded by VMR (Video Mixing Render), and EVR (Enhanced Video Renderer). Overlay was special, dedicated hardware on the graphics chip that replaces a specific color in a window with the video content (think green-screen on the weather, I believe "blitting" is the technical term for what overlay does), where as VMR/EVR are more software solutions.

Quote:


What level of step causes stair-stepping/banding?

The cause of banding is as I mentioned above: the 220 values of video levels cannot be uniquely/consistently mapped to the 256 values of PC levels. When the mapping occurs, due to the coarseness of the values, "gaps" appear. For example, in the conversion from video to PC levels nothing gets mapped to 4: 16,17,18,19,20 are mapped to 0,1,2,3,5.

8-bit video is just "barely" enough, under ideal circumstances to avoid banding, so the skipping of those levels causes banding.
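A few lines of Python make those gaps visible (my own illustration, using the exact 255/219 expansion):

```python
# Expand every video-level value (16-235) to PC levels and collect
# the distinct outputs; 220 inputs cannot cover all 256 outputs.
mapped = sorted({round((x - 16) * 255 / 219) for x in range(16, 236)})
gaps = [v for v in range(256) if v not in set(mapped)]

print(len(mapped))  # 220 distinct output levels
print(len(gaps))    # 36 levels that nothing maps to
print(gaps[:5])     # the first few skipped levels; 4 is among them
```

Those 36 unreachable levels, spread roughly every seventh step, are where the bands appear in what should be a smooth gradient.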

Quote:


Related, but seemingly different, is the concept of IRE. Is this a number from 1-100, that represents a percentage of the ranges described above (e.x. 16-235)? Or is it the actual analog voltage in a CRT for the intensity? Is each step the same (linear) or is it a logarithmic scale to appear to be the same? Does this variable (IRE) matter in digital signals (HDMI)? If it is an analog voltage, why not just use the voltage (ex 350mv) instead of confusing things with another term (IRE)? If it is a reference level to an AD, then there is a mapping from an IRE number to a digital level, perhaps with an offset. Why does Avia use IRE on the DVD, if it is a voltage, as clearly there can be no voltages recorded as a hole in a plastic disk, though I guess it could be a digital number representing the mv output from an 8 bit D/A.

IRE is a unit of measure for voltage (it really has no meaning in the digital realm). 1 IRE = 1/140 V; it comes from the measurement of composite video signals. In the US it's defined as such:
Quote:


Black = 7.5 IRE * 1000 mV /140 IRE =~ 53.57 mV
White = 100 IRE * 1000 mV /140 IRE =~ 714.29 mV

See here for a detailed explanation of IRE:
http://archive.avsforum.com/avs-vb/p...postid=4030461
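The arithmetic behind those two values is simple enough to sketch (the helper function is my own, for the US definition quoted above):

```python
def ire_to_mv(ire):
    """Convert IRE to millivolts for a US (NTSC) composite signal:
    the 1 V peak-to-peak signal spans 140 IRE (-40 sync tip to 100 white)."""
    return ire * 1000.0 / 140.0

print(round(ire_to_mv(7.5), 2))  # 53.57 mV, the NTSC black (setup) level
print(round(ire_to_mv(100), 2))  # 714.29 mV, white
```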

Quote:


The last related term that needs some clarity, is gamma. I assume that there is an exponential equation for gamma based on the difference between two numbers, where gamma is in the exponent, and mathematically explains the step changes (1% changes or 1 on the delta of the 220 bit dynamic range 16-235 gets some amount brighter at each step). If this is the case, I would think that getting gamma right in the middle of the range 50% (~6Dh) +/- 20% would be more important than getting the step from 30% to 100% (recommended by Tom Huffman) right. In other words, getting the % step from 40% to 50% intensity (Y) to be a step of gamma of 2.2. (ex: L = lo + ((V-vo)/0.7)^gamma)

I'll let Poynton explain gamma:
http://www.poynton.com/notes/colour_...mma_correction
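As a one-line illustration of the power-law idea Poynton describes (this is a deliberately simplified model; real transfer functions such as BT.709's include a linear toe segment):

```python
def display_luminance(signal, gamma=2.2):
    """Relative luminance (0.0-1.0) produced by a normalized signal
    level on a display modeled as a simple power law."""
    return signal ** gamma

# A 50% signal produces only about 22% of peak luminance at gamma 2.2,
# which is why the mid-range steps matter so much perceptually:
print(round(display_luminance(0.5), 3))  # 0.218
```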

Quote:


Enough with the definitions and detail. The heart of my post is to fix and understand my (and I am sure a lot of others) range of solutions to fixing the PC/Video level issue.

How do we set up the PC to always (source = any: HD tuner, analog tuner, DVD, web downloads, mp4, avi, etc) output levels of 16-235 out the DVI/HDMI interface, so that it matches the calibration set by any other video source (PS3) into the receiver or monitor, where they use 16-235?


Quite unfortunately, you can't. The problem is that not all video we see on the PC is encoded correctly (16-235). If we restrict ourselves to just "correct" content (which most content obtained through "legitimate" channels is), it's pretty simple.

All you need to do is ensure your video player and video drivers do not expand the 16-235 video to 0-255. This is quite easy to check by using test patterns with below-black (PLUGE) and above-white content. You should be able to see the below-black and above-white bars on the PC monitor.
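If you don't have a test disc handy, a pattern like that is easy to generate yourself. This sketch (mine, not from the thread; the filename is arbitrary) writes a binary PGM grayscale ramp with one column per 8-bit level, so the 0-15 and 236-255 regions are included:

```python
def write_level_ramp(path, height=64):
    """Write a 256-wide binary PGM: a horizontal ramp with one column
    per 8-bit level, including below-black (0-15) and above-white
    (236-255) columns for a quick PLUGE-style check."""
    row = bytes(range(256))  # one column per level, 0 through 255
    with open(path, "wb") as f:
        f.write(b"P5\n256 %d\n255\n" % height)  # binary PGM header
        f.write(row * height)

write_level_ramp("level_ramp.pgm")
```

Displayed full-screen on a chain that preserves video levels, the columns below 16 should all look equally black; if they show distinct steps, something upstream has expanded the signal to PC levels.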

Quote:


Do we adjust the video card (big changes to everything, bad idea in my mind)? Is there a normal output setting/mode for the video card (nVidia) that does this (0-255 to 16-235), requiring only very minor adjustments after the display is calibrated to the PS3 DVD HDMI test source?


To the best of my knowledge, NVIDIA has a registry setting that controls whether its drivers expand video to PC levels or not. ATI has actually added a very nice "Pixel Format" option in their recent drivers that allows choosing how the entire PC output is handled. There's a "Studio RGB" option which (appears to) compress the desktop into 16-235, thus making the entire PC output video-level compatible.

Quote:


Is there a software translator or rendering (codec?) used requiring only very minor adjustments after the display is calibrated to the alternate source (PS3 DVD HDMI test source)?


Usually on NVIDIA, with the right registry setting, video levels are retained. If you have a choice of renderer, VMR9 or EVR will probably retain your levels.

Quote:


Does a compromise have to be made at the PC, where one signal source (HD tuner, DVD, etc) needs to be chosen as the master, and the rest will suffer?


It really depends on what all your sources are. HDTV, DVD, and Blu-ray all use video levels (as dictated by ITU-R BT.601 and BT.709). It's really only a problem for analog tuners (but those can be calibrated to be correct) or random videos, usually the result of bad encoding.

See what an anamorphoscopic lens can do, see movies the way they were meant to be seen
post #459 of 486 Old 12-06-2008, 08:29 PM
larry2007 - Member
Very nice. I found this very helpful, and I expect others will as well. I think the gist of it should make its way into an FAQ.

Unfortunately, it sounds like since my plasma (Panasonic TH-50PHD8UK) does not have a control for input video range selection on any of its inputs, the expanded video signal from HDTV tuner sources cannot be solved there, either with adjustments or by adding a second HDMI card ($$). It also looks like it may be difficult to solve at the PC, either on my older NVIDIA HTPC or on my newer Intel G33 chipset box. There was a recent update to the Intel driver, but after installing it there still seem to be no level selection options for the DVI output (converted to HDMI to the plasma), even though this board is designed for media center/HTPC use. I wonder if the newer G45 chipset has that in its driver.

I searched the threads but did not find anything relevant to fixing this level expansion for the Intel G33. Unless there is something at the driver tweak level, it seems like my only option is to change the video feed to VGA and hope the plasma assumes PC levels for that input type.
post #460 of 486 Old 12-06-2008, 08:51 PM
stanger89 - AVS Addicted Member
Quote:
Originally Posted by larry2007 View Post

Unfortunately, it sounds like since my plasma (Panasonic TH-50PHD8UK) does not have a control for input video range selection on any of its inputs, the expanded video signal from HDTV tuner sources cannot be solved there, either with adjustments or by adding a second HDMI card ($$).

Just to be clear, the HD tuners don't expand the video signal; they capture exactly what is in the broadcast bitstream. If you want to prove it to yourself, you can open one of your captures/recordings in something like dgindex, which allows saving individual frames as picture files, and then open those up in an editor and check individual pixel values. You'll find that black is indeed 16.

Any expansion that is being done is being done by the decoders/player or video drivers. That is usually fixable in one way or another.
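One way to do that check without an image editor is to read the pixel data directly. This is my own sketch, assuming the frame was saved as a plain uncompressed 24-bit BMP (other BMP variants would need more handling):

```python
import struct

def bmp_pixel_range(path):
    """Return the (min, max) channel values in an uncompressed 24-bit BMP.

    If black in a captured frame really sits at video level 16, the
    minimum should hover around 16 rather than 0."""
    with open(path, "rb") as f:
        data = f.read()
    if data[:2] != b"BM":
        raise ValueError("not a BMP file")
    pixel_offset = struct.unpack_from("<I", data, 10)[0]
    width, height = struct.unpack_from("<ii", data, 18)
    bpp = struct.unpack_from("<H", data, 28)[0]
    if bpp != 24:
        raise ValueError("only uncompressed 24-bit BMPs handled here")
    row_size = (width * 3 + 3) & ~3        # each row is padded to 4 bytes
    lo, hi = 255, 0
    for row in range(abs(height)):
        start = pixel_offset + row * row_size
        row_bytes = data[start:start + width * 3]  # skip the pad bytes
        lo = min(lo, min(row_bytes))
        hi = max(hi, max(row_bytes))
    return lo, hi
```

Running it on a frame grabbed from a dark scene should report a minimum near 16 if the decode chain preserved video levels, and near 0 if something expanded them.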

Quote:


It also looks like it may be difficult to solve at the PC either on my older nVidia HTPC or on my newer Intel G33 chipset box. There was a recent update to the Intel driver, but after installing it, there still seems to be no level selection options for the DVI output (converted to HDMI to the plasma), even though this board is designed for media center / HTPC use. I wonder if the newer G45 chipset has that in the driver.

For NVIDIA, try a search for "vmrcccsstatus"

It also looks like the 177.66 drivers added a colorspace control option (from Googling, at least).

Quote:


I searched the treads but did not find anything relevant to fixing this level expansion for the Intel G33. Unless there is something at the driver tweak level, it seems like my only option is to change the video feed to VGA, and hope the plasma assumes that the levels are PC levels for that input type.

Before ATI fixed its level issues, users went into the ATI control panel and adjusted the desktop color settings (brightness/contrast) to "undo" the expansion.

post #461 of 486 Old 12-13-2008, 06:55 AM
larry2007 - Member
I attempted to test the HD tuner levels as you mentioned, but I have inconclusive results. Unfortunately, video editing is way outside my field of expertise. There appear to be a million image formats, encodings both still and moving (video), gobs of converters and decoders, etc. In other words, a big dark rabbit hole.

That being said, I used the tool referenced above, dgindex. In order to do so, I needed to convert the VMC recording to DVD (mpg) format, as the dgindex utility appeared not to like the native format used by Vista for Media Center recordings. I used Ultimate DVD video converter. I am not sure that this did not change things, but I suspect that it did not. From there I was able to stumble my way to a useful frame, and then output a bitmap from the main menu. I assume this is the intended technique and output.

The next step seemed to be using an editor of some sort to view the pixel data. After stumbling around for a bit, I got both Gimp and ACD to reveal that data. In both cases, they revealed that the RGB data was actually at and around 0,0,0 rather than 16,16,16. See the included clip (test_frame.bmp) of a Gray's Private Practice HD fade scene.

Going further down that dark hole, I then tried to grab a screenshot of a pattern from DVE (as I believe that Avia (1 or 2, not Pro) does not have extended-level image data, BTB/WTW). I tried and tried, but was never able to get this data. The dgindex utility was unable to open any patterns from this disc, at my command at least. I suspect that it is due to the way the disc is laid out (titles vs. one tree).

I thought I would be successful by grabbing a screenshot by pausing the video, pressing Shift+PrtScrn, then pasting into MS Paint, Word, or Gimp. No technique or utility seemed to actually work. The paste seemed to almost work, but when saved, or when the original source DVD was stopped, the pasted image disappeared from the buffer and the utility. I don't understand.

Is bitmap the best (or simplest useful) output format to attempt when grabbing a video image or frame? Is there an easy way to grab a video frame, specifically from a DVD like the DVE calibration disc? Is there an easy way to just pause the output and look at the levels directly with an editor, so that all of the interpretation and conversion ambiguity goes away?

In an effort to figure out if either of these converter tools adjusted the image, last night I took another crack at it. It appears that dgindex has conversion settings in the Video menu, where one has to choose to convert to PC or to TV levels. The out-of-the-box setting is to convert to PC (YUV->RGB). I changed this to convert to TV, and lo and behold, the image editor now reported levels around 16,16,16 in those dark areas on the new bitmap. Unfortunately, this seems no more definitive than the last set of tests, as the tool seems to expand or compress the image in all cases, not having a "leave it as it is" setting.

I am almost certain that if I were to view a PLUGE pattern that had data at and around digital level 32, I would see that as the first visible black level on my Pany plasma on the HDMI input from the VMC HTPC. If I interpret the earlier data correctly, this would be at a 7-8% window, or just below a 10% window. Strangely, I do not see this on any of the discs. I cannot help but assume that this issue is very common indeed, based on my own experience and the hundreds of threads that I have read.

I examined the NVIDIA driver update info, but have not tried it yet as I am currently messing with the Intel G33 setup; based on what I read, I believe it will work for that system. It is about time. Why is this not in Intel's driver (rhetorical question)? Is there another solution for Intel? I will check whether the new G45 board has the proper driver support on that thread.

What does ffdshow do (very short answer)? Is that a solution for VMC?

Suggestions?

 

Attached: test_frames.zip (203.6 KB)
post #462 of 486 Old 12-13-2008, 09:01 AM
stanger89 - AVS Addicted Member
Quote:
Originally Posted by larry2007 View Post

The next step seemed to be using an editor of some sort to view the pixel data. After stumbling around for a bit, I got both Gimp and ACD to reveal that data. In both cases, they reviled that the RGB data was actually at and around 0,0,0, rather than 16,16,16. See included clip (test_frame.bmp) of a Gray's Private Practice HD fade scene.

Going further down that dark hole, I then tried to grab a screenshot of a pattern from DVE (as I believe that Avia (1 or 2, not Pro) does not have extended-level image data, BTB/WTW). I tried and tried, but was never able to get this data. The DGIndex utility was unable to open any patterns from this disc, at my command at least. I suspect that it is due to the way the disc is laid out (titles vs. one tree).

For DVE, you just need to load up the right VOB (or all of them) and move the slider to the right test pattern. DGIndex doesn't support menus.

Quote:


I thought I would be successful grabbing a screenshot by pausing the video, pressing Shift+PrtScrn, then pasting into MS Paint, Word, or Gimp. No technique or utility seemed to actually work. The paste seemed to almost work, but when saved, or when the original source DVD was stopped, the pasted image disappeared from the buffer and the utility. I don't understand.

That means you're using overlay, and not a newer renderer.

Quote:


Is bit map the best (or simplest useful) output format to be attempting to use when grabbing a video image or a frame?

It works. For these purposes you could save a JPG as well.

Quote:


Is there an easy way to grab a video image of a frame, specifically from a DVD like the DVE calibration disk?

See above. But grabbing a frame from DVE with DGIndex wouldn't be terribly useful since we already know what the DVD will be.

Quote:


Is there an easy way to just pause the output and look at the levels with an editor directly, so that all of the interpretation and conversion ambiguity goes away?

Yes, just save an image like you did in DGIndex.

Quote:


In an effort to figure out if either of these converter tools adjusted the image, last night I took another crack at it. It appears that DGIndex has conversion settings in the Video menu, where you have to choose to convert to PC or convert to TV levels. The out-of-the-box setting is to convert to PC (YUV→RGB). I changed this to convert to TV, and lo and behold, the image editor now reported levels around 16,16,16 in those dark areas on the new bitmap. Unfortunately, this seems no more definitive than the last set of tests, as the tool seems to expand or compress the image in all cases, not having a "leave it as is" setting.

Actually it is definitive: it means the recording is "right", i.e. has the correct, as-broadcast video levels.

Quote:


I am almost certain that if I were to view a PLUGE pattern with data at and around digital level 32, I would see that as the first visible black level on my Panny plasma on the HDMI input from the VMC HTPC. If I interpret the earlier data correctly, this would be at a 7-8% window, or just below a 10% window. Strangely, I do not see this on any of the discs. I cannot help but assume that this issue is very common indeed, based on my own experience and the hundreds of threads that I have read.

You should be able to view everything from 16-235, or 0%-100%, regardless of whether BTB/WTW are being passed. The only way you won't be able to see them is if your PC is expanding video to PC levels, and then your display is accepting that expanded video as PC levels and expanding again.
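That double-expansion failure mode can be sketched numerically (illustrative Python, assuming the standard 16-235 ↔ 0-255 remap):

```python
def expand(v):
    """Expand 16-235 video levels to 0-255 PC levels, clipping BTB/WTW."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

# The PC expands once; a display that then treats the result as video
# levels expands again. Everything the source encoded at or below ~30
# collapses to black, matching the "first visible black near 32" symptom.
doubly_expanded = [expand(expand(v)) for v in (16, 20, 30, 32, 40)]
```

With a single expansion, 16 lands exactly at 0 and nothing above it is lost; it is only the second expansion that crushes the bottom of the range.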

Quote:


What does ffdshow do (very short answer)?

Basically, it allows you to do all sorts of processing to your video.

Quote:


Is that a solution for VMC?

Depends on what you're trying to do; for levels, not really.

See what an anamorphoscopic lens can do, see movies the way they were meant to be seen
stanger89 is online now  
post #463 of 486 Old 12-13-2008, 07:41 PM
Member
 
larry2007's Avatar
 
Join Date: Jan 2007
Posts: 16
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:


For DVE, you just need to load up the right VOB (or all of them) and move the slider to the right test pattern. DGIndex doesn't support menus.

I would like to know what you are doing differently than I am. I tried this before, but it still does not work. Do you have the disc, and have you actually tried this? Can you really get past the VIDEO_TS.VOB (which works)? This procedure works for all of Avia, but not for DVE.

Quote:


Actually it is, it means the recording is "right", i.e. has the correct, as-broadcast video levels.

I am not sure how you can draw the conclusion that the source data is at either PC or video levels if the tool does not have a setting that leaves the output unchanged. Perhaps you know that that is how it is coded (e.g. the TV setting leaves the source alone, keeping a 0 as 0 and a 16 as 16, whereas the PC setting expands everything: 16 moves to 0 and 0-15 gets clipped). Is this the case?

Quote:


You should be able to view everything from 16-235 or 0%-100%, regardless of if BTB/WTW are being passed. The only way you won't be able to see them, is if your PC is expanding Video to PC levels, and then your display is accepting that expanded video as PC levels and expanding again.

I think that you may have miscommunicated this last bit. If the PC is expanding the range, 16 becomes 0, right, as previously discussed. If the TV display is doing nothing to the data stream, but its input display range starts at 16, then from the standpoint of the signal source the first bit of black would be around 32 (now at 16 at the input to the display), crushing the first 16 or so levels of data. The set does not need to do any further expansion to lose this data. This cannot be fixed by increasing black level, as there is effectively nothing below 16 any longer; we end up making 16 brighter, but never seeing 15 and below, since it is chopped at the input to the monitor.

As I see it, if the input cannot be set for the expanded range of levels, then the output needs to be left unexpanded. I would gather that increasing the brightness from the PC (Intel utility) would also have no positive effect, but I guess it is time to try some stuff there anyway, as I should be able to reset to defaults and I have calibration gear.

One big source of confusion in the driver settings area (it seemed about the same with nVidia, and even before that with my ATI card) is that there are at least three locations or menus that set brightness, contrast and gamma, not counting the advanced sections of those three where the individual R, G and B portions of each can often be set, and then there is that little square with the diagonal line that can often be dragged into funny shapes. If that is not bad enough, every driver release seems to change the look and feel of all of these areas. Basically, it looks like a nearly infinite number of ways to mess things up to me.

-Larry
larry2007 is offline  
post #464 of 486 Old 12-14-2008, 08:47 AM
AVS Addicted Member
 
stanger89's Avatar
 
Join Date: Nov 2002
Location: Marion, IA
Posts: 17,359
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 86 Post(s)
Liked: 129
Quote:
Originally Posted by larry2007 View Post

I would like to know what you are doing differently than I am. I tried this before, but it still does not work. Do you have the disc, and have you actually tried this? Can you really get past the VIDEO_TS.VOB (which works)? This procedure works for all of Avia, but not for DVE.

You've got to find the right VTS_x_y.vob that has those test patterns in it.

Quote:
I am not sure how you can draw the conclusion that the source data is either at PC or Video levels if the tool does not have a setting to not change (or set) the output. Perhaps you know that that is how it is coded (ex video leaves the source alone, leaving level 0 as a 0 and a 16 as a 16, where as PC expands all things 16 moves to 0 and 0 -15 gets cropped). Is this the case?

"TV" is the "don't change" setting. The TV setting doesn't convert PC to TV levels; it just leaves them alone. Only the PC setting modifies the output.

Quote:
I think that you may have miscommunicated this last bit. If the PC is expanding the range, 16 becomes 0, right, as previously discussed. If the TV display is doing nothing to the data stream, but its input display range starts at 16, then from the standpoint of the signal source the first bit of black would be around 32 (now at 16 at the input to the display), crushing the first 16 or so levels of data.

That's what I was trying to say in the second sentence.

Quote:
The set does not need to do any further expansion to lose this data.

Conceptually (if not actually) it is expanding levels. As you describe, if the TV is expecting video levels but is receiving PC levels, it will clip "real" black and white.

Quote:
This cannot be fixed by increasing black level, as there is effectively nothing below 16 any longer; we end up making 16 brighter, but never seeing 15 and below, since it is chopped at the input to the monitor.

Depends on the display; often the PC/TV toggle is nothing more than a brightness/contrast preset, and you can adjust it either way. Of course, some displays don't give the adjustment range necessary to calibrate with the wrong assumed input levels.

Quote:
As I see it, if the input cannot be set for the expanded range of levels, then the output needs to be left unexpanded. I would gather that increasing the brightness from the PC (Intel utility) would also have no positive effect, but I guess it is time to try some stuff there anyway, as I should be able to reset to defaults and I have calibration gear.

You kind of lost me there.

Adjusting brightness/contrast in the display drivers can (on ATI at least) be used to "undo" the TV-PC expansion.

stanger89 is online now  
post #465 of 486 Old 12-17-2008, 07:24 PM
Member
 
EVGA's Avatar
 
Join Date: May 2008
Posts: 33
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Are there any new updates for the new video cards for this thread?

I have an EVGA 8800 GTX Super Clocked
EVGA is offline  
post #466 of 486 Old 05-18-2009, 10:43 AM
Senior Member
 
floepie's Avatar
 
Join Date: Dec 2006
Posts: 211
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
So, is EVR in Vista/Windows 7 similar to VMR9 in that all values 0-255 are passed along to the display with no expansion? If I set black to 16 and white to at least 235, then it will pretty much prevent me from being able to properly adjust levels in photos without changing profiles at the monitor, won't it?
floepie is offline  
post #467 of 486 Old 05-27-2009, 08:31 AM
Member
 
djphatic's Avatar
 
Join Date: Jan 2007
Posts: 52
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I was also going to ask the same thing in relation to EVR.

Also, the latest graphics drivers offer options to output 0-255 or 16-235, but I don't really know whether this affects only video playback or the desktop as well.

I also use my PC for gaming along with video. Would calibrating to 16-235 mean clipping during gaming, or do the pixel format options available in CCC take care of this?

The introduction of the pixel format options in the drivers is confusing when trying to think about how best to calibrate my PC.
djphatic is offline  
post #468 of 486 Old 05-27-2009, 10:06 AM
Senior Member
 
floepie's Avatar
 
Join Date: Dec 2006
Posts: 211
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
After looking into this further, this is the way I understand it, and I'm pretty sure this is accurate. First, I do not see the setting you refer to in my version of CCC; it could be that I'm running one of the first versions of CCC for Windows 7. ATI's AVIVO video system will always expand video playback to PC levels before output. So your desktop at level 0 will be properly black, and your videos at 16 will be the same color as your desktop at level 0 - both black, provided your display is properly calibrated.

Now, the setting that you are referring to adjusts whether the expanded video and desktop (both at 0-255) are emitted at those levels (uncompressed, 0-255) or compressed to 16-235. So the setting will not affect whether something gets clipped or not. If you output at 16-235, your video will first have been expanded (clipping BTB and WTW) and then compressed, and your desktop will simply be compressed to 16-235. There is no clipping when levels are compressed; however, there may be some slight banding, as the full desktop gamut cannot be fully displayed. I would recommend leaving output at 0-255 if possible; if your display really expects levels at 16-235, its brightness (black level) will have to be turned up significantly and its contrast (white level) turned down in order to avoid clipping at the display. If your display cannot be properly calibrated to avoid clipping levels, then you may have to revert that setting to video levels (16-235).
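The expand-then-compress path described here can be sketched with the standard studio-swing math (illustrative Python; the helper names are mine):

```python
def expand(v):
    """Video 16-235 -> PC 0-255; BTB/WTW clip at the ends."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def compress(v):
    """PC 0-255 -> video 16-235; no clipping involved."""
    return round(v * 219 / 255 + 16)

# The round trip for in-range values is essentially lossless...
assert compress(expand(16)) == 16 and compress(expand(235)) == 235
# ...but BTB/WTW are already gone before the compression happens:
assert compress(expand(4)) == 16      # BTB collapsed to reference black
assert compress(expand(250)) == 235   # WTW collapsed to reference white
```

The banding concern comes from squeezing 256 desktop codes into 220 video codes: some adjacent input levels necessarily merge.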
floepie is offline  
post #469 of 486 Old 05-29-2009, 01:29 AM
 
hammel's Avatar
 
Join Date: May 2009
Posts: 1
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
It gives me max contrast and after trying both, it works for me.
hammel is offline  
post #470 of 486 Old 08-12-2009, 05:23 PM
Member
 
Polcius's Avatar
 
Join Date: May 2008
Posts: 86
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:


To calibrate your display to show video levels of 16-235, as they should be for DVD playback in VMR9 (Zoom Player), adjust the brightness (black level) control on the display DOWN so that the greyscale block below the white dots is black and the spot in the greyscale gradation right at the white dots is black; you can still see grey to the left of the dots, and everything to the right of the dots appears all black. Then adjust contrast (white level) UP the same way, so that the greyscale block below the dots is white and the point in the greyscale gradation at the dots is white, with visible grey to the right of it and all white to the left of it. (NOTE that I'm referring to the top half of the shot. You will probably need to open up the ORIGINAL size, save it, and show it on a screen with a black background to be able to see the black end of the scale more easily.) Note that if you try to view any image/picture, etc. that has important black/dark detail down below 16, that detail will be lost at this video-levels 16-235 setting.

I adjusted the black levels, but I have problems with the contrast.

The two white squares don't "blend" no matter how much I turn the contrast up. I have set it to 100 (max.), but the intensity of the white seems to hurt my eyes in some scenes. Do I just have to get used to it, or am I doing something wrong?

Also, I don't know how I should set the gamma.

CCC says that my gamma is at "1".
Polcius is offline  
post #471 of 486 Old 09-05-2009, 03:52 AM
Member
 
Polcius's Avatar
 
Join Date: May 2008
Posts: 86
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Ok, now I have other problems.

I have a PC running Win7, hooked up to a Samsung HDTV via VGA (Nvidia 9500GT graphics card). Mainly, I use it to watch DVDs and some TV shows in .mkv.

On the TV, I can't select between the "Auto" or "Native" colour spaces because they are greyed out.

In MPC-HC, in the renderer settings, I can select output of 0-255 or 16-235. But in the Nvidia Control Panel I can ALSO select between 0-255 and 16-235.

My question is which way I should set it up, because the levels can be changed in so many places.

Thanks, and sorry for my English...

PS: Nvidia settings are off.

In MPC HC. "View-> Renderer Settings-> Output Range". I'm using EVR Custom.

With 0-255, I see 2 bars.

With 16-235, I see 3 bars.

My problem is that when using 0-255 (blacks/whites clipped), the colors look good and vivid, but the whites seem to "pop out" a lot. And when using 16-235, the white/black level is fine, but the colors seem washed out.
Polcius is offline  
post #472 of 486 Old 09-09-2009, 04:18 PM
Member
 
Polcius's Avatar
 
Join Date: May 2008
Posts: 86
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Which graphics card will give me the most help with the 0-255 and 16-235 adjustment problems? ATI or NVIDIA?

Based on some threads, ATI seems to handle it better in the latest Catalyst versions, but I'm not sure.
Polcius is offline  
post #473 of 486 Old 11-24-2009, 11:42 AM
Senior Member
 
kevm14's Avatar
 
Join Date: Jan 2009
Posts: 471
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I have calibrated my HTPC to my TV with a Pantone Huey. I am using an ATI 4670 with component output at 1080i to my CRT TV. CCC 9.9 I think, and Win 7 64. I have set the ehPresenter.dll key in my registry which appears to have corrected what looked like a double compression on the whites and blacks, but seemingly introduced a black crush.

After running Huey with all room lights off, I ended up setting the software to the preset "Game" which gives me good results in the Avia II calibration DVD I have. Interestingly my TV needed the color and sharpness boosted over the nominal settings, per the DVD. I can calibrate the TV using Avia II, then go back and re-run Huey and everything matches.
kevm14 is offline  
post #474 of 486 Old 12-13-2009, 08:45 AM
Member
 
jiddu's Avatar
 
Join Date: Aug 2009
Posts: 115
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Very noob question, I REALLY would appreciate some feedback as I'm very confused here:

I have my laptop with an HD4650 GPU hooked up via HDMI to a Panny G15 plasma. When displaying the different test patterns linked in the first post, I can usually see black and white beyond the references of 16 for black and 235 for white. Initially I took this to mean that my TV is accepting and displaying the full 0-255 colorspace.

My laptop's HDMI does not support HDCP according to the manual and some internet specs, so I'm thinking maybe I don't get the conversion from 0-255 to 16-235 that is supposed to happen when you use HDMI out. Another thing: when adjusting brightness I can get the TV to differentiate all the levels of black down to #1, but with contrast and white I can never differentiate between the last 3-5 shades no matter what I do. This could just be a weakness of my display (or of my calibration skills).

Then I got to thinking (because I'm confused): if HDMI out on laptops/HTPCs causes the 0-255 colorspace to be converted to 16-235, those test patterns would be useless, as 1 (reference black in PC levels) would just get converted to 16 and you'd still be able to see it - unless, of course, instead of being crushed it just gets clipped in that situation. If that's true, then am I seeing 0-255 on my display?

I can't believe it, after all the reading I've done I thought I had it all figured out, then I connect the cables and I'm stumped and as lost as ever.

Please help
jiddu is offline  
post #475 of 486 Old 12-13-2009, 07:32 PM
Senior Member
 
floepie's Avatar
 
Join Date: Dec 2006
Posts: 211
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Generally, the video card drivers are responsible for expanding video levels to PC levels at the output. I'm not sure whether detection of a specific connection, i.e. HDMI, would trigger the expansion of levels, but I would seriously doubt it. For instance, I use DVI output with video expansion occurring at the output.

Now, when you are calibrating to stills, i.e. JPGs and the like, you are sending PC levels to your display, and thus you would calibrate your display to barely differentiate the lowest-level blacks (1-2) using brightness and the highest whites (254-255, if possible) using contrast. If you are calibrating to a video disc of some sort, you will not be able to see the level 1-15 blacks, as they are now "BTB" and crushed out of existence by the levels expansion. Similarly, the highest levels are obliterated, with differences apparent only at levels below 235. To recap: the PC is always sending "PC levels" to the display. Video is expanded; the PC side of things, i.e. your desktop, is not.
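The asymmetry described above (stills pass through untouched, video gets expanded) can be sketched with the same standard remap (illustrative Python, not any driver's actual code):

```python
def expand(v):
    """The driver's video->PC expansion (16-235 -> 0-255), with clipping."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

# A still image's levels 1-15 reach the display unchanged, so brightness
# can be set to barely distinguish them. The same code values on a video
# disc are all crushed to 0 by the expansion before they ever leave the PC:
video_blacks = {expand(v) for v in range(1, 16)}   # every BTB level -> 0
```

This is why a JPG-based calibration and a disc-based calibration disagree about whether anything below reference black should be visible.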
floepie is offline  
post #476 of 486 Old 12-14-2009, 07:17 PM
Member
 
jiddu's Avatar
 
Join Date: Aug 2009
Posts: 115
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by floepie View Post

Generally, the video card drivers are responsible for expanding video levels to PC levels at the output. I'm not sure whether detection of a specific connection, i.e. HDMI, would trigger the expansion of levels, but I would seriously doubt it. For instance, I use DVI output with video expansion occurring at the output.

Now, when you are calibrating to stills, i.e. JPGs and the like, you are sending PC levels to your display, and thus you would calibrate your display to barely differentiate the lowest-level blacks (1-2) using brightness and the highest whites (254-255, if possible) using contrast. If you are calibrating to a video disc of some sort, you will not be able to see the level 1-15 blacks, as they are now "BTB" and crushed out of existence by the levels expansion. Similarly, the highest levels are obliterated, with differences apparent only at levels below 235. To recap: the PC is always sending "PC levels" to the display. Video is expanded; the PC side of things, i.e. your desktop, is not.

Oh, OK, I think I get it, thanks. If I'm playing video content from a file off the HD, like an mkv for instance, then it will show correctly if my display is calibrated with the .jpeg slides in this thread.

The reason I brought up HDMI is that I've read many threads where other members stated that over HDMI the desktop (because of Windows or the GPU drivers, I can't remember which) was getting compressed from 0-255 to 16-235 levels, and that any video played at 16-235, like DVD discs, would as a result get crushed even further if it wasn't expanded in the player to 0-255 first (losing BTB/WTW in the process - I guess the issue was that they wanted to preserve BTB/WTW). I believe this has something to do with the protected path, as when playing Blu-ray discs people would get proper 16-235 levels with BTB/WTW preserved. This is also why I mentioned HDCP and the fact that my laptop supposedly does not support it: I figured this meant my laptop would send PC levels (0-255) without the conversion or "crush" happening to the desktop, much as if I were using DVI (I think there is some registry hack people do to get the GPU to treat HDMI video out like a DVI out). So, for some dumb reason, I thought I could easily check whether I'm getting the same crush by feeding the .jpeg slides to the TV, which I now realize can't work unless I use a calibration DVD instead.

Anyway, now I am only worried about playing video files off the HD. I wanted to save money by getting a laptop with a BD drive so it could do it all, but then standalone players dropped so much in price that I picked one up and skipped the BD drive in the laptop. I'm glad, because for me it is easier this way and also more convenient. I just wanted to understand what was happening when I hooked the laptop up to my display.

Thank you for the reply.
jiddu is offline  
post #477 of 486 Old 12-15-2009, 08:29 PM
Member
 
jiddu's Avatar
 
Join Date: Aug 2009
Posts: 115
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Well, I've tried to accomplish one more thing written in this guide and failed. I played some DVDs and .mkv files (in Zoom Player, like the OP), and I tried using Overlay and then VMR9 to see if I could get 16-235, but saw no difference (I did restart the player between changes).

Any ideas why? Anything else I should do/check?
jiddu is offline  
post #478 of 486 Old 12-16-2009, 07:46 PM
Member
 
jiddu's Avatar
 
Join Date: Aug 2009
Posts: 115
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Ugh... I don't know what it is; I've messed with every setting I could think of. Could it be something with my video card?
jiddu is offline  
post #479 of 486 Old 04-27-2010, 09:37 AM
Member
 
jediman's Avatar
 
Join Date: Apr 2010
Posts: 50
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I have recently hooked my Panasonic S2 plasma to my nVidia graphics card (on a Windows 7 computer) via the DVI port, using PowerDVD 9 to play Blu-ray. I am having a hard time interpreting what is going on, and I am led to think that something is correcting my color automatically, because when I try to correct things manually they look wonky.

First, I would assume that anything the computer generates without modification should be in the range 0-255, and that the TV should clip any blacks between 0 and 15 to the same base black. However, when I display a 16 black and a 0 black on the TV via the computer, I see a very clear difference between the two (the color setting in the nVidia Control Panel is set to 0-255). If I change the control panel to 16-235, I still see a difference between the two (as I should), but the 0 value (now changed to 16) has become noticeably brighter. Is the display or OS correcting the output for me, and are my manual correction attempts essentially applying the correction twice, compressing the range more than is needed?
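The "correction applied twice" suspicion is easy to check numerically. A hypothetical sketch of the 0-255 → 16-235 compression, applied once versus twice:

```python
def compress(v):
    """Remap PC levels (0-255) into the video range (16-235)."""
    return round(v * 219 / 255 + 16)

once  = compress(0)            # 16 - correct video black
twice = compress(compress(0))  # ~30 - noticeably brighter than black
```

So if the driver (or OS) is already compressing and a manual correction compresses again, black ends up near level 30, which would look exactly like the washed-out result described.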
jediman is offline  
post #480 of 486 Old 07-11-2010, 04:37 PM
Advanced Member
 
RocShemp's Avatar
 
Join Date: Nov 2001
Location: Puerto Rico
Posts: 662
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 8 Post(s)
Liked: 13
Is there no way to make the desktop and video output match (other than setting my video to full rather than limited)?
RocShemp is offline  