
New HDMI Colour Space w/NXE - Page 2

post #31 of 209
Seems like all RGB all the time is the best choice for most people. I'll bet the Xbox properly converts YCbCr to RGB. Unfortunately, many HDTVs do not. Since the reference levels are still separately selectable, this doesn't really change the current 0-255 vs 16-235 issue.
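For reference, the YCbCr-to-RGB conversion being talked about here boils down to a 3x3 matrix plus the level offsets. Below is a minimal sketch, assuming BT.709 coefficients and 8-bit video-range input; the function name and the plain rounding are only illustrative, real hardware would typically dither.

Code:
def ycbcr709_to_rgb_full(y, cb, cr):
    """Convert one 8-bit BT.709 YCbCr pixel at video levels (Y 16-235,
    Cb/Cr 16-240) to full-range 0-255 RGB. Illustrative sketch only."""
    # Remove the offsets and normalise: luma spans 219 steps, chroma 224,
    # with chroma centred on 128.
    yn = (y - 16) / 219.0
    cbn = (cb - 128) / 224.0
    crn = (cr - 128) / 224.0
    # Standard BT.709 conversion coefficients (Kr = 0.2126, Kb = 0.0722).
    r = yn + 1.5748 * crn
    g = yn - 0.1873 * cbn - 0.4681 * crn
    b = yn + 1.8556 * cbn
    # Scale to 0-255 and clamp; this clamp is where any blacker-than-black
    # or whiter-than-white detail gets discarded.
    return tuple(max(0, min(255, round(v * 255))) for v in (r, g, b))

# Video black and white land on full-range black and white:
print(ycbcr709_to_rgb_full(16, 128, 128))   # (0, 0, 0)
print(ycbcr709_to_rgb_full(235, 128, 128))  # (255, 255, 255)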
post #32 of 209
Quote:
Originally Posted by chrisherbert View Post

Seems like all RGB all the time is the best choice for most people.

Best choice for games: RGB and expanded (PC levels)
Best choice for video: YCbCr and standard (Video levels)

If you choose to output RGB with PC levels, when you watch a video its levels are expanded, so 16 becomes 0 and 235 becomes 255; anything below 16 or above 235 is cut off, and all the levels in between are remapped. It also converts from the YCbCr colour space to RGB. When you play video games it outputs with no conversion.

If you choose to output YCbCr with video levels, when you play a video game the range is compressed, so 0 becomes 16 and 255 becomes 235, and the levels in between are remapped. Nothing ends up below 16 or above 235, so those levels are wasted. It also converts from the RGB colour space to YCbCr. When you watch a video it outputs with no conversion.

The choice is yours. If it had an automatic levels setting you wouldn't need to compromise.
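To make those two remaps concrete, here is a minimal sketch of the 8-bit arithmetic being described, assuming a plain linear stretch; the function names are mine, and real hardware would normally dither rather than just round.

Code:
def expand_video_to_pc(v):
    """Video levels (16-235) -> PC levels (0-255).
    Anything below 16 or above 235 is clipped away, the rest is stretched."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def compress_pc_to_video(v):
    """PC levels (0-255) -> video levels (16-235).
    Nothing ends up below 16 or above 235, so those codes go unused."""
    return round(16 + v * 219 / 255)

print(expand_video_to_pc(16), expand_video_to_pc(235))     # 0 255
print(compress_pc_to_video(0), compress_pc_to_video(255))  # 16 235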
post #33 of 209
Quote:
Originally Posted by chrisherbert View Post

I'll bet the Xbox properly converts YCbCr to RGB. Unfortunately, many HDTVs do not.

Mine does, and with no banding at all. I'd rather my TV did it than the Xbox.
post #34 of 209
I know what the probable outcome will be, but I am not going to comment further until I have the NXE here for testing. We should all refrain from speculation until somebody with a very solid grasp actually has the NXE and can dig into it (it would be great to get Stacy Spears or Kris Deering in on this one).

- Jason
post #35 of 209
Thread Starter 
Quote:
Originally Posted by cybersoga View Post

Best choice for games: RGB and expanded (PC levels)
Best choice for video: YCbCr and standard (Video levels)

If you choose to output RGB with PC levels, when you watch a video its levels are expanded, so 16 becomes 0 and 235 becomes 255; anything below 16 or above 235 is cut off, and all the levels in between are remapped. It also converts from the YCbCr colour space to RGB. When you play video games it outputs with no conversion.

If you choose to output YCbCr with video levels, when you play a video game the range is compressed, so 0 becomes 16 and 255 becomes 235, and the levels in between are remapped. Nothing ends up below 16 or above 235, so those levels are wasted. It also converts from the RGB colour space to YCbCr. When you watch a video it outputs with no conversion.

The choice is yours. If it had an automatic levels setting you wouldn't need to compromise.

I think it does when it's set to auto.
I set mine to auto and expanded yesterday, and if I go from the dashboard to a video the picture flickers and it adjusts something. You also see the dashboard briefly in strange colours.
post #36 of 209
Quote:
Originally Posted by Ripeer View Post

I think it does when it's set to auto.
I set mine to auto and expanded yesterday, and if I go from the dashboard to a video the picture flickers and it adjusts something. You also see the dashboard briefly in strange colours.

Ah, more juicy details.

So some will have to manually switch their displays (some do not do RGB/YCbCr switching automatically) once the 360 does its AUTO mode selection. Not a deal breaker for me if it does things properly.

Thanks for the added information, Ripeer.

- Jason
post #37 of 209
Do we know Xbox 360 games are actually RGB PC levels?

From what I read, PC levels are also darker, so you will need to compensate for that if you are calibrated for video levels.

And secondly, I don't know if you should be using PC levels unless your display device is calibrated for it.

Mine is calibrated for video levels, naturally. If yours is also calibrated for video levels, as I figure a lot of us non-HTPC users are, I believe you should stay with video levels.

Unless you're gaming on a monitor or you have a display which supports PC levels (I don't know if many do, but I have always read many don't), you should not use expanded levels. Unless you want crushed blacks.

So I would say use it if:

1. You know your display supports PC levels and not just video levels.
2. You calibrated for PC levels.
post #38 of 209
Quote:
Originally Posted by Murilo View Post

Do we know Xbox 360 games are actually RGB PC levels?

From what I read, PC levels are also darker, so you will need to compensate for that if you are calibrated for video levels.

And secondly, I don't know if you should be using PC levels unless your display device is calibrated for it.

Mine is calibrated for video levels, naturally. If yours is also calibrated for video levels, as I figure a lot of us non-HTPC users are, I believe you should stay with video levels.

Unless you're gaming on a monitor or you have a display which supports PC levels (I don't know if many do, but I have always read many don't), you should not use expanded levels. Unless you want crushed blacks.

So I would say use it if:

1. You know your display supports PC levels and not just video levels.
2. You calibrated for PC levels.

Any reasonable HDMI/DVI display device expects RGB color to be at 0-255 levels and YCbCr color to be at 16-235 levels, all automatically. No adjustment or undue worrying by the user should be required.

P.J.
post #39 of 209
Hmmm, you're sure? Why exactly would it not accept RGB studio levels? Only a monitor, I would think, would be expecting RGB PC levels.

Still, I asked in the calibration forum, and they said it depends on what you calibrate for.

Again, I'm guessing many of our displays are calibrated for Rec. 709 video levels.

People would need to calibrate for PC RGB levels if they want to use it.

http://www.avsforum.com/avs-vb/showthread.php?t=967106

Here is a great thread you should all read.

I'm going to have to disagree with Valence01; the issue in this thread is, again, that not all displays do proper RGB PC levels. There is a test in that thread to make sure that your set does proper RGB PC levels. One person noted their XBR4 on auto did not properly match up PC RGB levels.

So again I am going to say:

1. Make sure your set can properly display RGB PC levels; consult the test in the thread linked above.
2. Make sure you calibrate for RGB PC levels, but this will screw up Blu-ray/DVD viewing. You will need multiple settings in this case.

Most believe, from what I read and from that thread, that the difference between RGB PC levels and video levels is small and maybe not noticeable.

Also, the answer to what video games use is not clear; the thread states that although it is believed games use RGB PC levels, developers are all over the map regarding the RGB level.

I'm going to use Rec. 709 since I calibrated mine with Blu-ray.
post #40 of 209
Quote:
Originally Posted by Murilo View Post

Hmmm, you're sure? Why exactly would it not accept RGB studio levels? Only a monitor, I would think, would be expecting RGB PC levels.

Still, I asked in the calibration forum, and they said it depends on what you calibrate for.

Again, I'm guessing many of our displays are calibrated for Rec. 709 video levels.

People would need to calibrate for PC RGB levels if they want to use it.

http://www.avsforum.com/avs-vb/showthread.php?t=967106

Here is a great thread you should all read.

I'm going to have to disagree with Valence01; the issue in this thread is, again, that not all displays do proper RGB PC levels. There is a test in that thread to make sure that your set does proper RGB PC levels. One person noted their XBR4 on auto did not properly match up PC RGB levels.

So again I am going to say:

1. Make sure your set can properly display RGB PC levels; consult the test in the thread linked above.
2. Make sure you calibrate for RGB PC levels, but this will screw up Blu-ray/DVD viewing. You will need multiple settings in this case.

Most believe, from what I read and from that thread, that the difference between RGB PC levels and video levels is small and maybe not noticeable.

Also, the answer to what video games use is not clear; the thread states that although it is believed games use RGB PC levels, developers are all over the map regarding the RGB level.

Here is another quote from that thread, just as a warning:


I'm going to use Rec. 709 since I calibrated mine with Blu-ray.

The number of monitors out there factory calibrated to "Studio RGB Levels" is very low, and most TVs and PC monitors cannot be re-calibrated to "Studio RGB Levels" in the user menu, if at all. It's a non-standard that no user should ever need to deal with... I firmly believe Microsoft made the wrong decision in trying to push "Studio RGB Levels" with the introduction of the VGA cable, which now continues to be an issue with HDMI and DVI.

HDMI TVs and monitors have a video processor that converts whatever is input into whatever the screen uses natively. With most TVs you calibrate the input and it affects everything that's plugged into that input; it doesn't have different memories for different colour spaces and reference levels, so you should only need to calibrate that input once and it'll look right with both RGB at PC levels and YCbCr at video levels.
post #41 of 209
Quote:
Originally Posted by cybersoga View Post

The number of monitors out there factory calibrated to "Studio RGB Levels" is very low, and most TVs and PC monitors cannot be re-calibrated to "Studio RGB Levels" in the user menu, if at all. It's a non-standard that no user should ever need to deal with... I firmly believe Microsoft made the wrong decision in trying to push "Studio RGB Levels" with the introduction of the VGA cable, which now continues to be an issue with HDMI and DVI.

HDMI TVs and monitors have a video processor that converts whatever is input into whatever the screen uses natively. With most TVs you calibrate the input and it affects everything that's plugged into that input; it doesn't have different memories for different colour spaces and reference levels, so you should only need to calibrate that input once and it'll look right with both RGB at PC levels and YCbCr at video levels.

I pretty much agree with your post, that we need to send what we calibrate for; that's what I am trying to say.

However, I do agree that most monitors naturally use PC RGB levels, but I am of the understanding that most TVs expect studio RGB levels.

As for your last sentence, I agree if you're saying that it will look right with either RGB at PC levels OR YCbCr at video levels. If you calibrate that input for RGB PC levels, it won't look right if you feed it YCbCr at video levels.
post #42 of 209
Studio and video RGB levels are both 16-235, just different names.

Were you trying to say most TVs accept computer RGB levels? I think maybe since you're from the UK things are different. On my BenQ W5000, overseas users set the machine to 0 IRE, while we in North America set it to 7.5.
post #43 of 209
Quote:
Originally Posted by Murilo View Post

Studio and video RGB levels are both 16-235, just different names.

Were you trying to say most TVs accept computer RGB levels? I think maybe since you're from the UK things are different. On my BenQ W5000, overseas users set the machine to 0 IRE, while we in North America set it to 7.5.

TVs that accept RGB through VGA, DVI and HDMI usually expect PC levels. TVs also require YCbCr to be at video levels. HDMI can accept either on the same input without the need for re-calibration.
post #44 of 209
Great news.
Thanks for the update.
post #45 of 209
Cybersoga, you live in the UK, where 0 IRE is the standard. Here our TVs expect 7.5 IRE, or 16-235 even for RGB; from what I researched, this is why the VGA cables had issues before the expanded levels option was added. PAL is different; 0 IRE is the standard there.
post #46 of 209
Actually, never mind cybersoga, I think you're right. That may only apply to YCbCr.

I have a flea, which converts RGB to PC levels automatically. Some people complained about it because our satellite receivers, for some reason, output RGB at 16-235.

So anyway, I took the flea out so the Xbox 360 would send 16-235 to my projector, and boy did it look washed out. It did not look nearly as good. The only way I could get it to look proper again was to set it to expanded.
post #47 of 209
Quote:
Originally Posted by Murilo View Post

Cybersoga, you live in the UK, where 0 IRE is the standard. Here our TVs expect 7.5 IRE, or 16-235 even for RGB; from what I researched, this is why the VGA cables had issues before the expanded levels option was added. PAL is different; 0 IRE is the standard there.

I wasn't going to bother mentioning this, but there is one situation specific to the UK where RGB is expected to be at video levels, and that is RGB SCART, an ageing connector that's limited to 576-line interlaced.
post #48 of 209
Just to add to the complication: from my research, it seems expanded levels are only 0 IRE. It's not PC levels. An expert on the calibration forum posted this.

"Technically, PC Levels are 0-255 for 0-100 IRE and as thus you do not get a blacker than black level with PC levels (as you cannot go below 0 and I assume you know you can only have 255 numbers).

For non-PC, HDTV Video Programming, the Video Levels are 16-235 for 0-100 IRE, so blacker than black levels are possible.

The problem is that some DVDs and Games output 0 for black and some output 7.5 IRE. You need to know what the source (DVD or Game) is using as reference before you can set it accordingly."

Now, according to this, expanded is simply 0 IRE black:

http://www.ps3forums.com/showthread....96706&posted=1

And what I get from all this is that the Xbox 360 outputs video levels, 16-235, but gives you the choice of black level IRE.
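Put as arithmetic, the distinction the quoted expert is drawing looks roughly like this, assuming the usual 219-step video range; a sketch only, not anything measured from the 360.

Code:
def ire_to_code_pc(ire):
    """PC levels: 0 IRE -> 0, 100 IRE -> 255. No room below black."""
    return round(ire / 100 * 255)

def ire_to_code_video(ire):
    """Video levels: 0 IRE -> 16, 100 IRE -> 235.
    Codes 1-15 remain for blacker-than-black, 236-254 for whiter-than-white."""
    return round(16 + ire / 100 * 219)

print(ire_to_code_pc(0), ire_to_code_pc(100))        # 0 255
print(ire_to_code_video(0), ire_to_code_video(100))  # 16 235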
post #49 of 209
I also found this quote online


"People need to be aware that the 360 outputs video levels (black=16/white=235). PC Monitors, and most TVs with a VGA connection, expect PC Levels (black=0/white=255). If your monitor/TV is calibrated for PC levels it will look washed out, as you will never achieve true black or true white.

You need to recalibrate your monitor/TV to output video levels. This will make white and black appear truer, but colors still won't be as vibrant because the display is expecting a wider range color palette (100% red would be 255/0/0 rather than 235/0/0, making it look dull). Some TVs have the ability to switch between PC/Video Levels when using a VGA connection, but most don't.

It seems like it would be a simple fix on the surface, just expand the range. But the problem is all the games are made for video levels. When you expand 16-235 to 0-255 you clip whiter-than-white and blacker-than-black information (dynamic range) and introduce banding as well."

This just gets more confusing.
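The clipping and banding that quote warns about are easy to see with a quick count. A rough sketch, assuming a plain linear expansion with rounding; no claim that any particular console does it exactly this way.

Code:
# Expand every possible 8-bit input from video range to PC range and see
# what survives. Only a rough illustration of the quote above.
def expand(v):
    return max(0, min(255, round((v - 16) * 255 / 219)))

out = [expand(v) for v in range(256)]

used = sorted(set(out))
print(len(used))                 # 220 distinct output codes, not 256
gaps = [c for c in range(256) if c not in used]
print(len(gaps))                 # 36 codes are never produced -> banding
print(out[0:16].count(0))        # all 16 blacker-than-black inputs clip to 0
print(out[236:].count(255))      # all 20 whiter-than-white inputs clip to 255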
post #50 of 209
The thing is that the games render at whatever the developer decides to render at. It's not real imagery, so black and white are whatever the developer decides they are. The same goes for gamma and colors. This is why so many games have sliders in their menus to adjust the video output separately from your TV.
post #51 of 209
Quote:
Originally Posted by Murilo View Post

Cybersoga, you live in the UK, where 0 IRE is the standard. Here our TVs expect 7.5 IRE, or 16-235 even for RGB; from what I researched, this is why the VGA cables had issues before the expanded levels option was added. PAL is different; 0 IRE is the standard there.

Indeed PAL and NTSC are different, but of course those are both broadcast standards, which have nothing much to do with HDTV transmission over cables. For YCbCr or YPbPr component, I agree that every TV built for the U.S. market expects 16-235 video levels. For RGB over HDMI/DVI or VGA, I would beg to differ. I've never seen any set, in any country for that matter, that expected 16-235 levels for RGB via HDMI, DVI, or VGA. You can of course calibrate your set to whatever you like, but that's beside the point. Computers spit out RGB at 0-255 levels, and most sets' VGA inputs are designed with computer connection in mind. Users would be upset with poor performance if manufacturers had them adjusted for 16-235 levels out of the box. The same goes for RGB over HDMI or DVI. It is most common for ordinary video devices like Blu-ray or upconverting standard DVD players to spit out YCbCr component video over such a digital connection, and this is done at 16-235 levels. So when a TV receives RGB over HDMI or DVI, the assumption is that the device must be a computer, and the set then expects 0-255 levels.

The only significant exception to the above is what Microsoft did with RGB over VGA on the original Xbox 360 release. That is, they inadvertently output 16-235 level RGB over VGA. Perhaps there was confusion with SCART RGB? Everybody that tried VGA complained, and eventually Microsoft discovered the error and corrected it, sort of. They now have the ability to do both 0-255 and 16-235 levels for RGB over VGA, even though there is no known TV or monitor that expects VGA RGB at 16-235 levels. I suppose that some idiot will have his TV calibrated for 16-235 levels on the VGA input, and the option in the 360 will keep him from complaining.

P.J.
post #52 of 209
Quote:
Originally Posted by Valence01 View Post

Indeed PAL and NTSC are different, but of course those are both broadcast standards, which have nothing much to do with HDTV transmission over cables. For YCbCr or YPbPr component, I agree that every TV built for the U.S. market expects 16-235 video levels. For RGB over HDMI/DVI or VGA, I would beg to differ. I've never seen any set, in any country for that matter, that expected 16-235 levels for RGB via HDMI, DVI, or VGA. You can of course calibrate your set to whatever you like, but that's beside the point. Computers spit out RGB at 0-255 levels, and most sets' VGA inputs are designed with computer connection in mind. Users would be upset with poor performance if manufacturers had them adjusted for 16-235 levels out of the box. The same goes for RGB over HDMI or DVI. It is most common for ordinary video devices like Blu-ray or upconverting standard DVD players to spit out YCbCr component video over such a digital connection, and this is done at 16-235 levels. So when a TV receives RGB over HDMI or DVI, the assumption is that the device must be a computer, and the set then expects 0-255 levels.

The only significant exception to the above is what Microsoft did with RGB over VGA on the original Xbox 360 release. That is, they inadvertently output 16-235 level RGB over VGA. Perhaps there was confusion with SCART RGB? Everybody that tried VGA complained, and eventually Microsoft discovered the error and corrected it, sort of. They now have the ability to do both 0-255 and 16-235 levels for RGB over VGA, even though there is no known TV or monitor that expects VGA RGB at 16-235 levels. I suppose that some idiot will have his TV calibrated for 16-235 levels on the VGA input, and the option in the 360 will keep him from complaining.

P.J.

I agree with everything you say. About the same time Microsoft brought out the VGA cable for the Xbox 360, MS also started using video levels for Windows Media Center, so you'd have your Windows desktop at PC levels and when you went into Media Center the picture would be washed out because it was using video levels! When I complained about it, they said that you should be able to see the below-black and above-white details on DVDs, and you were to adjust your display if you didn't like it!

Black at level 0 in PC levels should look the same to the end user as black at level 16 in video levels.
post #53 of 209
My problem is with this statement

"They now have the ability to do both 0-255 and 16-235 levels for RGB over VGA, even though there is no known TV or monitor that expects VGA RGB at 16-235 levels"

I thought 0 IRE is 0-255 and 7.5 IRE is 16-235, but apparently not so.

Unless my flea and Microsoft are lying, expanded is still 16-235 but with a 0 IRE black level. Even that link from the Microsoft chief I provided only stated that expanded is 0 IRE, nothing to do with PC levels. And that other quote I found on the internet stated the Xbox and video games output 16-235.

I wish someone could confirm whether the Xbox does anything with the levels or not.
post #54 of 209
Quote:
Originally Posted by Murilo View Post

My problem is with this statement

"They now have the ability to do both 0-255 and 16-235 levels for RGB over VGA, even though there is no known TV or monitor that expects VGA RGB at 16-235 levels"

I thought 0 IRE is 0-255 and 7.5 IRE is 16-235, but apparently not so.

Unless my flea and Microsoft are lying, expanded is still 16-235 but with a 0 IRE black level. Even that link from the Microsoft chief I provided only stated that expanded is 0 IRE, nothing to do with PC levels. And that other quote I found on the internet stated the Xbox and video games output 16-235.

I wish someone could confirm whether the Xbox does anything with the levels or not.

IRE 0 / 7.5 IRE is a brightness level shift with component video and isn't the same thing as video levels/PC levels. It only applies to YPbPr component video. If your device is putting out 7.5 IRE you should have your TV adjusted to expect 7.5 IRE. If your device is putting out 0 IRE you should have your TV adjusted to expect 0 IRE. If you don't get it right the brightness will either be too high or too low. The best setting is 0 IRE as it gives you the widest dynamic range. I think the "intermediate" reference levels setting on the Xbox 360 has something to do with this.

It's not applicable to RGB video.
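As a rough numeric sketch of that brightness shift, assuming the usual 700 mV analog luma swing (the exact millivolt figures vary by spec and are only illustrative here):

Code:
# For analog YPbPr, "7.5 IRE setup" just raises the voltage that means black.
# 100 IRE of picture is roughly 700 mV above blanking on the Y channel.
MV_PER_100_IRE = 700.0

def black_level_mv(setup_ire):
    """Voltage used for black with a given setup (0 or 7.5 IRE)."""
    return setup_ire / 100 * MV_PER_100_IRE

print(black_level_mv(0))    # 0.0 mV  -> full swing available for picture
print(black_level_mv(7.5))  # 52.5 mV -> black is raised, less dynamic range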
post #55 of 209
Quote:
Originally Posted by Murilo View Post

My problem is with this statement

"They now have the ability to do both 0-255 and 16-235 levels for RGB over VGA, even though there is no known TV or monitor that expects VGA RGB at 16-235 levels"

I thought 0 IRE is 0-255 and 7.5 IRE is 16-235, but apparently not so.

Unless my flea and Microsoft are lying, expanded is still 16-235 but with a 0 IRE black level. Even that link from the Microsoft chief I provided only stated that expanded is 0 IRE, nothing to do with PC levels. And that other quote I found on the internet stated the Xbox and video games output 16-235.

I wish someone could confirm whether the Xbox does anything with the levels or not.

Let me elaborate a bit. The VGA interface consists of 5 signals: R, G, B, H, V. The RGB part I think everyone understands, but what are the H and V? They're horizontal and vertical sync. They're carried completely separately from the RGB color information on a VGA interface. Because sync is carried separately, the RGB output levels use the full range allowed (0-255). If you do not have H and V channels and you combine sync on the green channel (consumer) or sync on all 3 (pro), then you must make accommodation for that by creating a "blanking pedestal": raising the voltage level, during the time when the display should be at black level, from 0 to 16. That way, the TV set can distinguish between black video output (16) and sync level (0). An RGB-only interface, with no separate H and V channels, is NOT VGA!

OK now, where does IRE fit into all this? IRE is a term originating in the broadcast industry, and it refers to the modulation level of the RF transmitter. 0 IRE means 0% modulation and 100 IRE means 100% modulation level. So, when the broadcaster wants to send out video info at black level, the level is set to 16, or 7.5 IRE modulation (roughly 7.5% of 235-16). When the broadcaster wants to send out sync, he sets the level to 0, or 0 IRE modulation. If the broadcaster wants to send out pure white, he sets the level to 235, or 100 IRE modulation. So, when someone says black is at 7.5 IRE, that's 16 on the 0-255 scale. Similarly, 100 IRE is at 235 on the 0-255 scale. Of course, all of this is (usually) irrelevant to a full 5-channel RGBHV VGA interface, since there is no need to make room for sync on the RGB channels and the full 0-255 range is used, with black = 0 and white = 255. The concept of IRE doesn't figure here, since we're not broadcasting anything. Microsoft, in their infinite wisdom, made this not irrelevant by spitting out 16-235 levels over the full RGBHV VGA interface. I suppose they had difficulty reading the spec. With one of the dashboard updates, they now allow the user to choose between what they call standard (16-235) and expanded (0-255) levels for RGBHV VGA. I stand by my previous statement that there is no known TV or PC monitor that is expecting RGBHV over VGA at 16-235 levels, out of the box. Perhaps there are professional RGB-with-embedded-sync devices out there. I don't know much about SCART and its use of RGB, but the few references I checked state that sync is not embedded on the RGB channels, but is carried by the composite signal, and there is no blanking pedestal. So, I would expect SCART RGB to use the full 0-255 level for RGB.

After the above discussion, it should be obvious that the concept of "expanded" levels (0-255) for YPbPr analog component video is a contradiction, and here again there are no consumer monitors or TV sets out there that use 0-255 levels for YPbPr component video, since sync is embedded in the Y channel and there would be no way for the monitor to distinguish between sync level and black level. 0 is the lowest voltage level that the DAC can output.

Full digital formats such as YCbCr and RGB over HDMI or DVI are not limited by the video output DACs, and there again, 0-255 levels are the norm for RGB carried via a digital interface. While it is possible to carry 0-255 levels for YCbCr via a digital interface, I'm extremely skeptical that there are any consumer monitors or TV sets that expect it. As such, even if the Xbox allows you to set expanded (0-255) levels for YCbCr over HDMI/DVI, the monitor would simply clip anything less than 16 to black and anything over 235 to white. It may not even be possible to recalibrate to correct that.

P.J.
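To illustrate the distinction P.J. is drawing between separate sync and sync embedded on the video channels, here is a toy sketch; the numbers and structure are entirely schematic and are not taken from any actual interface spec.

Code:
# Toy illustration of why embedded sync forces black up to 16 while
# separate-sync RGBHV (VGA) can put black at 0. Purely schematic.

SYNC_LEVEL = 0      # reserved for the sync pulse when sync rides on the video
PEDESTAL   = 16     # lowest legal picture level in that case ("black")

def scanline_with_embedded_sync(pixels):
    """Sync shares the channel, so picture data must stay at or above 16."""
    clipped = [max(PEDESTAL, p) for p in pixels]
    return [SYNC_LEVEL] * 8 + clipped        # 8 samples of sync, then picture

def scanline_with_separate_sync(pixels):
    """H and V travel on their own wires, so the full 0-255 range is picture."""
    return list(pixels)

line = [0, 16, 128, 235, 255]
print(scanline_with_embedded_sync(line))   # [0]*8 + [16, 16, 128, 235, 255]
print(scanline_with_separate_sync(line))   # [0, 16, 128, 235, 255]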
post #56 of 209
Like I stated previously, discussing the NXE at this point is speculation on our part until exactly what the 360 is doing has been tested and measured by qualified people.

I had the chance to see the new NXE on a VGA-connected 360 and the extra settings beyond just STD/INT/EXP were not there (not that they are needed with VGA), so the extra settings must only be present when using HDMI.

- Jason
post #57 of 209
Could one really tell the difference between color spaces? I have output my PS3 at RGB for Blu-ray and didn't notice a difference. I keep it at YCbCr just to keep it consistent.

I have also really been referring to HDMI this whole time; I own an HDMI version.
post #58 of 209
Quote:
Originally Posted by Murilo View Post

Could one really tell the difference between color spaces? I have output my PS3 at RGB for Blu-ray and didn't notice a difference. I keep it at YCbCr just to keep it consistent.

I have also really been referring to HDMI this whole time; I own an HDMI version.

It's far easier to tell if the levels are wrong than any difference between colour spaces...
post #59 of 209
Quote:
Originally Posted by DaGamePimp View Post

I had the chance to see the new NXE on a VGA-connected 360 and the extra settings beyond just STD/INT/EXP were not there (not that they are needed with VGA), so the extra settings must only be present when using HDMI.

Why aren't they needed with VGA? I'm still going to be doing video and games with the 360 so I would think I would need "auto" just like HDMI.
post #60 of 209
Quote:
Originally Posted by turls View Post

Why aren't they needed with VGA? I'm still going to be doing video and games with the 360 so I would think I would need "auto" just like HDMI.

Well, some displays would need it while others would not (most displays with a VGA port are expecting PC levels and some will not auto-adjust to what is being sent), thus the reason for having a choice of settings; it is display dependent. But as I said, we'll have to wait and see what happens once the NXE is out and some in-depth testing can be done.

- Jason