
Sharp Aquos LCD 2007 speculation thread: D92? TruD? HDMI 1.3? - Page 5

post #121 of 3473
So that raises some questions for me:

1) If the current LCD is 24-bit color and you don't care about the lip-sync features in HDMI, then HDMI 1.2 is just fine?

2) My guess is that the new LCD will also be 24-bit color, so still no need for HDMI 1.3? Except that it will presumably be 120Hz, and if that is the case it must need 1.3, since that is new in the new spec.

3) If the display has inherent limitations like 24-bit color or 60Hz inputs, it seems that HDMI 1.3 would provide little advantage for it? Any thoughts on the value of 1.3 for the 92 series?

4) I read a blurb in an article that said current technologies are inherently capable of much more than 24-bit depth. Is this true? If so, that would negate 1) and 2).

5) I also read that the human eye is limited to about 30-bit color depth? If so, HDMI 1.3's extra color depth is still valuable, but not the full 48-bit.

--prosumer

****
Interesting link to an article on HDMI audio specs, etc.:
http://www.hdtvmagazine.com/articles...i_versions.php

This discussion of color depth and video bandwidth also brings me back to banding issues. Here is a link discussing Dell 24" LCD banding issues:
http://www.engadget.com/2006/04/26/2...nts/3#comments

Interesting thread on HDMI 1.3:
http://forums.avguide.com/viewtopic....e2198c38e0d746

Here is a banding test image:
http://home.comcast.net/~rrighetti/CheckeMON.jpg
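In case that image link dies, here's a minimal sketch to generate a similar test ramp yourself; it's my own throwaway script (the filename is made up) and writes plain ASCII PPM, so any viewer that opens PPM will do. On a clean 8-bit pipeline the 256 steps should read as a smooth sweep; hard vertical bands point at the display's processing, not the interface.

Code:
# Write a 1024x256 horizontal grayscale ramp as ASCII PPM (banding test).
WIDTH, HEIGHT = 1024, 256

with open("ramp_test.ppm", "w") as f:
    f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")
    for _ in range(HEIGHT):
        for x in range(WIDTH):
            level = x * 255 // (WIDTH - 1)   # 0..255, left to right
            f.write(f"{level} {level} {level}\n")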
post #122 of 3473
What say ye, gentlemen?

--prosumer
post #123 of 3473
Quote:
Originally Posted by prosumer View Post

What say ye, gentlemen?

--prosumer


Ok, I'll take a shot.

1. Yes, unless you want to pass Dolby TrueHD over HDMI.
2. You don't need HDMI 1.3 to take advantage of the 120Hz. A 120Hz set should allow you to input a 1080p24 source and display each frame 5 times, instead of having to do a 3:2 pulldown as in 60Hz sets (rough cadence sketch below). PC input is a different story.
3. It seems there would be little advantage unless they support deep color, and even then I'm not sure there is any source that will provide deep color. Again, PC input is a different story.
4. Possible, but that brings us back to the question of source material.
5. 24-bit looks darn good. When people talk about the limits of human detection, they're usually talking about test material constructed specifically to help the eye detect the difference.

If anyone has any additions or corrections, please jump in.
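Rough cadence arithmetic behind point 2 above; this is just my own illustration of the 5:5 vs. 3:2 math, not anything specific to the Sharp sets.

Code:
import math

def repeat_pattern(source_fps, panel_hz):
    """How many refreshes each source frame gets, over a two-frame cycle."""
    per_frame = panel_hz / source_fps
    if per_frame.is_integer():
        return [int(per_frame)] * 2          # even cadence, no judder
    return [math.ceil(per_frame), math.floor(per_frame)]   # e.g. classic 3:2

print("1080p24 on a 120Hz panel ->", repeat_pattern(24, 120))   # [5, 5]
print("1080p24 on a  60Hz panel ->", repeat_pattern(24, 60))    # [3, 2] pulldown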
post #124 of 3473
Just look at the Canon 30D sample gallery at dpreview on your current 24-bit computer monitor

http://www.dpreview.com/gallery/canoneos30d_samples/

Do you see any color banding? After seeing those, most people will realize they have no need for more than 24-bit color; rather, they need better-quality source material.
post #125 of 3473
Quote:
Originally Posted by 2Channel View Post

Ok, I'll take a shot.

1. Yes, unless you want to pass Dolby TrueHD over HDMI.
2. You don't need HDMI 1.3 to take advantage of the 120Hz. A 120Hz set should allow you to input a 1080p24 source and display each frame 5 times, instead of having to do a 3:2 pulldown as in 60Hz sets. PC input is a different story.
3. It seems there would be little advantage unless they support deep color, and even then I'm not sure there is any source that will provide deep color. Again, PC input is a different story.
4. Possible, but that brings us back to the question of source material.
5. 24-bit looks darn good. When people talk about the limits of human detection, they're usually talking about test material constructed specifically to help the eye detect the difference.

If anyone has any additions or corrections, please jump in.

HDMI 1.3 will broaden the color depth, which will alleviate color banding because the gradation scale increases. But the greatest improvement, at least at the outset, will be with the PC.
Here is why I am waiting for the 92u series.
I plan on getting the PS3 for gaming and as a DVD player. My media PC will also be my home base. HDMI 1.3, along with the 120Hz refresh rate and 1:1 pixel mapping, is a killer setup. You jump up to a 48-bit color gamut along with the smoothness of 120Hz, and you are the envy of millions. So, those of you incorporating your PC into your home theater system will be set for years.
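For scale, here's the raw arithmetic behind those bit-depth labels (my own back-of-envelope sketch; whether any source actually delivers more than 8 bits per channel is argued over later in the thread).

Code:
# Total representable colors = 2 ** (bits per channel * 3 channels).
for bits_per_channel in (8, 10, 12, 16):
    total_bits = bits_per_channel * 3
    print(f"{total_bits:2d}-bit ({bits_per_channel} per channel): {2 ** total_bits:,} colors")

# 24-bit:  16,777,216 colors
# 30-bit:  1,073,741,824 colors
# 36-bit:  68,719,476,736 colors
# 48-bit:  281,474,976,710,656 colors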
post #126 of 3473
because new technology is all about people envying you :P
post #127 of 3473
Quote:
Originally Posted by lipcrkr View Post

HDMI 1.3 will broaden the color depth, which will alleviate color banding because the gradation scale increases. But the greatest improvement, at least at the outset, will be with the PC.
Here is why I am waiting for the 92u series.
I plan on getting the PS3 for gaming and as a DVD player. My media PC will also be my home base. HDMI 1.3, along with the 120Hz refresh rate and 1:1 pixel mapping, is a killer setup. You jump up to a 48-bit color gamut along with the smoothness of 120Hz, and you are the envy of millions. So, those of you incorporating your PC into your home theater system will be set for years.

Two questions

1. Do you plan to use your HTPC for HD-DVD playback? If so, which video card and optical drive are you thinking of using?

2. Does BR or HD-DVD actually have the ability to take advantage of deep color? Isn't this material at 24-bit color depth anyway?
post #128 of 3473
Man, I hope all this that people are talking about is true 'cause I really want this TV now, lol
post #129 of 3473
Quote:
Originally Posted by mchamblissII View Post

Man, I hope all this that people are talking about is true 'cause I really want this TV now, lol

It's mostly speculation at this point. We probably shouldn't get too excited until we see more concrete information.
post #130 of 3473
Quote:
Originally Posted by 2Channel View Post

1. Do you plan to use your HTPC for HD-DVD playback? If so, which video card and optical drive are you thinking of using?

You didn't ask me, but I am gonna chime in. At the current point in time your best solution for video processing on the PC is an NVidia 7800/7900 card and the PureVideo software. http://firingsquad.com/hardware/buil...htpc/page5.asp They do region-based motion-adaptive processing for all your basic tasks and have visibly better quality than ATI today. For drives, the Plextor drives are fast and very quiet but are rarely the best value. http://www.cdrinfo.com/Sections/Revi...rticleId=16685 For an HTPC don't skimp on the processor; upscaling and/or noise reduction with software like ffdshow will bring any CPU to its knees. Right now the Intel Core 2 Duo is tops and, oddly, the best value.
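For a rough feel for why that kind of filtering is so CPU-heavy, here's a quick back-of-envelope calculation; the frame size and byte counts are generic assumptions of mine, not a benchmark of ffdshow or any particular filter chain.

Code:
width, height, fps = 1920, 1080, 60
bytes_per_pixel = 3                     # 8-bit 4:4:4 RGB after decode

pixels_per_sec = width * height * fps
print(f"{pixels_per_sec:,} pixels per second")                          # ~124 million
print(f"{pixels_per_sec * bytes_per_pixel / 1e6:.0f} MB/s just to touch each pixel once")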

/flex! Puters are my strong suit.
post #131 of 3473
double post
post #132 of 3473
More on topic: so the 92u is replacing this LC-57D90U with an LC-57D92U? http://www.sharpusa.com/products/Mod...8,1653,00.html That would be nice, because 57" is more the size I am looking for.

On the 65" Panny plasma thread they were speaking poorly of the 65" Sharp LCD. Certainly these features we are wishing for would bring the Sharp back up to par or better. Right now the 90U and the 62U seem pretty close and probably shouldn't be?

--prosumer
post #133 of 3473
Quote:
Originally Posted by 2Channel View Post

Two questions

1. Do you plan to use your HTPC for HD-DVD playback? If so, which video card and optical drive are you thinking of using?

2. Does BR or HD-DVD actually have the ability to take advantage of deep color? Isn't this material at 24-bit color depth anyway?

Sorry for the delay. The PS3 rolling out soon will have 1.3; however, it may not have the ability to increase to 48-bit, at least at the outset. It's still not finalized. It looks like spring will be the time frame for the full 1.3 capability. This coincides with the 92u series. So my plan is to get the 92 series and then the PS3 when they have the upgraded 1.3. I know this sounds confusing, but what I mean is that the PS3 and other devices coming out now with 1.3 specs may not be able to take full advantage of the specs right now, sort of like the first 1080p displays not having 1080p inputs. It looks like spring is when Blu-ray will be fully 1.3. So, to answer your question, yes, BR will be able to take advantage of 48-bit.
Also, the new 7-series nVidia cards look like a winner, after of course you install the drivers for HD. I currently use my PC for DVDs and gaming with my ATI 9600XT card, but I need tons more memory. So my plan is to use the PS3 for gaming and DVDs to lighten the load on my CPU. The Sharp, because it can do 1:1, will be used for HD content and as a computer monitor, since text will be readable at 8 feet or so.
Also,
post #134 of 3473
Quote:
Originally Posted by lipcrkr View Post

It looks like spring is when Blu-ray will be fully 1.3. So, to answer your question, yes, BR will be able to take advantage of 48-bit.

Umm, really? What source are you talking about? The content specs don't specify 48-bit content storage, so you will be stuck with lowest-common-denominator discs regardless of what revision of HDMI is in play. Not that it makes any difference, as 48-bit is a waste of bits where almost no one will be able to tell any difference.
post #135 of 3473
Quote:
Originally Posted by lipcrkr View Post

HDMI 1.3 will broaden the color depth, which will alleviate color banding because the gradation scale increases. But the greatest improvement, at least at the outset, will be with the PC.

Any color banding you may see today is not the result of 24-bit limitations.

Look at these pictures on your existing 24-bit display and tell me where you see color banding:
http://www.dpreview.com/gallery/canoneos30d_samples/
post #136 of 3473
Quote:
Originally Posted by sfhub View Post

Umm, really? What source are you talking about? The content specs don't specify 48-bit content storage, so you will be stuck with lowest-common-denominator discs regardless of what revision of HDMI is in play. Not that it makes any difference, as 48-bit is a waste of bits where almost no one will be able to tell any difference.

Dude, the color gamut is extended to xvYCC, a new standard that has already begun production in LED displays. It doubles the per-channel bits from 8 to 16. What this means is that virtually every color perceivable to the human eye will be available. Also, the gradation scale is extended, giving you more steps, so that a blue sky, for example, will be smoother, since the colors are transitioning more often, resulting in a more natural picture. This is one of the reasons I love the industrial Panny plasmas: using the HDMI blade results in over 4000 steps of gradation. I suggest you watch "Master & Commander" on a standard 256-step display and on a Panny 8UK/9UK plasma and pay close attention to the fog scenes.
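For reference, the step counts behind the gradation argument are simple to compute: an n-bit channel gives 2^n levels. The aside about the Panasonic blade is my own inference (4096 = 2^12 lines up with "over 4000 steps"), not a spec citation.

Code:
for bits in (8, 10, 12, 16):
    print(f"{bits:2d} bits per channel -> {2 ** bits:,} gradation steps")

# 8 -> 256, 10 -> 1,024, 12 -> 4,096, 16 -> 65,536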
post #137 of 3473
Quote:
Originally Posted by lipcrkr View Post

Also, the gradation scale is extended, giving you more steps, so that a blue sky, for example, will be smoother, since the colors are transitioning more often, resulting in a more natural picture. This is one of the reasons I love the industrial Panny plasmas: using the HDMI blade results in over 4000 steps of gradation. I suggest you watch "Master & Commander" on a standard 256-step display and on a Panny 8UK/9UK plasma and pay close attention to the fog scenes.

Dude, you are confusing video processing of the display with the transfer interface.

The Master & Commander DVD is mastered 8-bit YCbCr 4:2:0 regardless of how many bits you wish to use to transfer it from the DVD player to the display. Even though current HDMI is capable of 12-bit, I highly doubt your player is sending out anything other than 8-bit YCbCr 4:2:2 over HDMI. That means any reduction of banding you claim is happening is being done by the display *despite* an 8-bit HDMI interface.

Look at the pictures I linked to on your current 24-bit (8-bit per color) display. If 10-bit and 12-bit are so critical for your viewing pleasure, then you should be able to point out obvious color gradation in the pictures I posted, because your laptop display is 24-bit (or less).

I'll gladly entertain your assertion if you can point out the banding in those pictures. There's quite a variety of different scenes.

Whether you want to wait for HDMI 1.3 is your call, but setting up expectations that a 12-bit HDMI transfer interface is going to solve problems that are in the source material is not being realistic.

I'm not disputing there is color banding in various material people see. I'm just telling you those are in the source material or video processing and not something introduced by an 8-bit HDMI transfer interface.
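For anyone not following the 4:2:0 / 4:2:2 / 4:4:4 shorthand in this exchange, here's a small arithmetic sketch of how much data each layout carries for one 1080p frame at 8 bits per sample; the numbers are generic, not measurements of any particular player.

Code:
width, height = 1920, 1080
luma_samples = width * height

layouts = {
    "4:2:0 (disc mastering)":      luma_samples * 1.5,  # chroma halved both ways
    "4:2:2 (typical HDMI output)": luma_samples * 2.0,  # chroma halved horizontally
    "4:4:4 (full chroma)":         luma_samples * 3.0,  # chroma at full resolution
}

for name, samples in layouts.items():
    print(f"{name}: {samples / 1e6:.1f} million samples (= MB at 8 bits each) per frame")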
post #138 of 3473
Quote:
Originally Posted by lipcrkr View Post

Dude, the color gamut is extended to xvYCC, a new standard that has already begun production in LED displays. It doubles the per-channel bits from 8 to 16. What this means is that virtually every color perceivable to the human eye will be available. Also, the gradation scale is extended, giving you more steps, so that a blue sky, for example, will be smoother, since the colors are transitioning more often, resulting in a more natural picture. This is one of the reasons I love the industrial Panny plasmas: using the HDMI blade results in over 4000 steps of gradation. I suggest you watch "Master & Commander" on a standard 256-step display and on a Panny 8UK/9UK plasma and pay close attention to the fog scenes.

Uhhh, you are aware that the current Panny HDMI blade (TY-FB8HM) is still only v1.1, right? I envy you already!


ron
post #139 of 3473
Quote:
Originally Posted by sfhub View Post

The Master & Commander DVD is mastered 8-bit YCbCr 4:2:0 regardless of how many bits you wish to use to transfer it from the DVD player to the display. Even though current HDMI is capable of 12-bit, I highly doubt your player is sending out anything other than 8-bit YCbCr 4:2:2 over HDMI. That means any reduction of banding you claim is happening is being done by the display *despite* an 8-bit HDMI interface.

Thanks, sfhub! I've asked this question on various threads and never gotten confirmation of my assumption. So all the HD-DVD and BR releases are mastered at 24-bit color depth anyway. Therefore, for movie purposes, there is no way for a future 92u to take advantage of the deep color features in HDMI 1.3. It's just not in the movie source material.
post #140 of 3473
It's not in the current HD specification; however, adding it is "simply" a matter of adding it to the specification - though of course that requires the agreement of the committee that controls that specification. The movie source is already downgraded to 3x8-bit color to put it on disc, and future HD discs can "easily" have an optional extra 8 bits per color, which will only be output with HDMI 1.3.
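Purely to illustrate that "optional extra bits" idea, here's a toy sketch of a 16-bit master value split into the 8 bits today's discs keep plus an 8-bit enhancement that only a deeper pipeline would read. The layout is invented for the example; it is not anything from the HD DVD or Blu-ray specs.

Code:
def split(sample16):
    base = sample16 >> 8              # the 8 bits an ordinary disc would keep
    extra = sample16 & 0xFF           # hypothetical enhancement layer
    return base, extra

def rebuild(base, extra=0):
    return (base << 8) | extra        # legacy path just leaves the extra bits at 0

master = 0xA37C                       # some 16-bit studio-side value
base, extra = split(master)
print(hex(rebuild(base)))             # 0xa300 -- 8-bit-only reconstruction
print(hex(rebuild(base, extra)))      # 0xa37c -- full precision restored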
post #141 of 3473
Quote:
Originally Posted by thrill View Post

It's not in the current HD specification; however, adding it is "simply" a matter of adding it to the specification - though of course that requires the agreement of the committee that controls that specification. The movie source is already downgraded to 3x8-bit color to put it on disc, and future HD discs can "easily" have an optional extra 8 bits per color, which will only be output with HDMI 1.3.

Yeah, that's what I was alluding to in my original post when I said the full 1.3 capability might not be functional at the outset. It doesn't matter what the DVD is mastered at; a DVD can be "upgraded" to whatever. But I'm not going to argue with these people; it's a waste of time.
post #142 of 3473
Quote:
Originally Posted by thrill View Post

It's not in the current HD specification; however, adding it is "simply" a matter of adding it to the specification - though of course that requires the agreement of the committee that controls that specification. The movie source is already downgraded to 3x8-bit color to put it on disc, and future HD discs can "easily" have an optional extra 8 bits per color, which will only be output with HDMI 1.3.

If you think there is going to be an effort to do this feel free to speculate.

Current HDMI is *already* capable of transmitting 12-bit when you choose YCbCr 4:2:2. In fact it is *always* transmitting 12-bit when you choose 4:2:2. It is just the low-order bits are zeroed out. See how many content producers have taken advantage of this.

So there are 2 issues.
1) Will you notice any difference with > 24-bit?

You can believe what you want, maybe you have super amazing eyes.

I look at high quality digicam pictures on a 24-bit display and I just don't see any color banding. I bring up color wheels in photoshop and I don't see any color banding. I watch crappy mpg encodings with limited bitrate and I see color banding. Where do you think the problem is, HDMI interface or crappy source?

2) Will content producers decide to waste space on 10-, 12-, or 16-bit encodings that users will not be able to notice? You won't be able to notice because 99% of the TVs out there cannot process and display the data, and even if they could process the data, most people will not be able to see any difference.

Content producers don't like to use formats that aren't mandatory in the specs because the value of these advancements is limited. Inevitably, most people's equipment will not be able to process the higher bit depth and the data is just wasted. Maybe if it is Sony, they will try selling you multiple copies of the same movie: the regular BD version, then the deep-color version, then the Superbit deep-color version. Damn, how many copies of Fifth Element can one person own?

Deep color will go down as one of the less useful upgrades. They might as well spend their time fixing HDMI/HDCP negotiation problems between devices. That will actually be something that benefits users.

Everyone wants to get the last word in, then say I'm done arguing, blah blah, so I'll do the same.
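A tiny sketch of the zero-padding point above, as I understand it (my own illustration, not code from any HDMI implementation): an 8-bit sample riding in a 12-bit 4:2:2 slot is just shifted up with the low four bits left at zero, so nothing extra shows up on the wire.

Code:
def pack_8bit_into_12bit(sample8):
    return (sample8 << 4) & 0xFFF     # low-order four bits stay zero

for v in (0, 16, 128, 235, 255):
    packed = pack_8bit_into_12bit(v)
    print(f"8-bit {v:3d} -> 12-bit container {packed:4d} (0x{packed:03X})")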
post #143 of 3473
Quote:
Originally Posted by sfhub View Post

If you think there is going to be an effort to do this feel free to speculate.

Current HDMI is *already* capable of transmitting 12-bit when you choose YCbCr 4:2:2. In fact it is *always* transmitting 12-bit when you choose 4:2:2. It is just the low-order bits are zeroed out. See how many content producers have taken advantage of this.

So there are 2 issues.
1) Will you notice any difference with > 24-bit?

You can believe what you want, maybe you have super amazing eyes.

I look at high quality digicam pictures on a 24-bit display and I just don't see any color banding. I bring up color wheels in photoshop and I don't see any color banding. I watch crappy mpg encodings with limited bitrate and I see color banding. Where do you think the problem is, HDMI interface or crappy source?

2) Will content producers decide to waste space on 10-, 12-, or 16-bit encodings that users will not be able to notice? You won't be able to notice because 99% of the TVs out there cannot process and display the data, and even if they could process the data, most people will not be able to see any difference.

Content producers don't like to use formats that aren't mandatory in the specs because the value of these advancements is limited. Inevitably, most people's equipment will not be able to process the higher bit depth and the data is just wasted. Maybe if it is Sony, they will try selling you multiple copies of the same movie: the regular BD version, then the deep-color version, then the Superbit deep-color version. Damn, how many copies of Fifth Element can one person own?

Deep color will go down as one of the less useful upgrades. They might as well spend their time fixing HDMI/HDCP negotiation problems between devices. That will actually be something that benefits users.

Everyone wants to get the last word in, then say I'm done arguing, blah blah, so I'll do the same.

These are all good points, sfhub. I was curious about it because although 24-bit color is fantastic, there can (I believe) be small incremental improvements realized with HDMI 1.3. Here is a post on the subject from Greg Rogers of AccuPel, which I found reposted by rdjam.

Originally Posted by gregr
The HDMI interface is just an interconnect between components (video processing engines). The 12-bit interface is currently adequate to connect the results of one processing engine to another when the original consumer source is only 8-bits. The individual processing engines can use as many bits as they want for processing headroom. It is advantageous to maintain 10 bits after scaling, and arguably perhaps even 12 bits, although I don't know of any video processing chips (other than those specialized for gamma expansion as display drivers) that currently output or input more than 10 bits. If the scaler performs 4:2:2 processing the current 12-bit HDMI video interface seems more than adequate for practical consumer video applications. Even professional SDI interfaces are only 4:2:2 10-bit.

But the Deep Color pixel packing is to provide greater than 8-bit RGB 4:4:4 and YCbCr 4:4:4 video transmission. Therefore, if an external scaler converts the 8-bit 4:2:2 YCbCr video to 4:4:4 prior to scaling, and then does 4:4:4 scaling, it becomes advantageous to transmit the results of that scaling as 4:4:4 10-bit or 12 bit video if the display will accept and maintain the video in the higher bit resolution 4:4:4 format and not convert it back to 4:2:2 for internal processing. If all of that takes place, then the HDMI 1.3 spec will eventually provide an advantage over the current spec.
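To make gregr's point concrete, here's a toy sketch of my own (made-up ramp, plain linear interpolation, no specific scaler implied): once a scaler interpolates, the intermediate values land between 8-bit codes, and a 10-bit link can carry them exactly where an 8-bit link has to round them away.

Code:
src = list(range(0, 256, 5))                 # coarse 8-bit ramp, steps of 5

scaled = []                                  # 2x upscale by linear interpolation
for a, b in zip(src, src[1:]):
    scaled.append(float(a))
    scaled.append((a + b) / 2.0)             # half-way sample, e.g. 2.5, 7.5, ...
scaled.append(float(src[-1]))

err_8bit  = max(abs(v - round(v))         for v in scaled)   # snap to 8-bit codes
err_10bit = max(abs(v - round(v * 4) / 4) for v in scaled)   # 4 sub-steps per code

print(f"worst rounding error sent at  8-bit: {err_8bit:.3f} of an 8-bit code")
print(f"worst rounding error sent at 10-bit: {err_10bit:.3f} of an 8-bit code")
# -> 0.500 vs 0.000: the interpolated half-codes fit exactly at 10-bit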
post #144 of 3473
Quote:
Originally Posted by 2Channel View Post

These are all good points, sfhub. I was curious about it because although 24-bit color is fantastic, there can (I believe) be small incremental improvements realized with HDMI 1.3. Here is a post on the subject from Greg Rogers of AccuPel, which I found reposted by rdjam.

Originally Posted by gregr
...
But the Deep Color pixel packing is to provide greater than 8-bit RGB 4:4:4 and YCbCr 4:4:4 video transmission. Therefore, if an external scaler converts the 8-bit 4:2:2 YCbCr video to 4:4:4 prior to scaling, and then does 4:4:4 scaling, it becomes advantageous to transmit the results of that scaling as 4:4:4 10-bit or 12 bit video if the display will accept and maintain the video in the higher bit resolution 4:4:4 format and not convert it back to 4:2:2 for internal processing. If all of that takes place, then the HDMI 1.3 spec will eventually provide an advantage over the current spec.

I am a fan of fewer conversions also and that is essentially what the quote is saying. From a data flow standpoint I agree with what they are saying.

However, from a reality standpoint, my question is what problem are they really trying to solve, and will you be able to see the difference? How much of a difference are you seeing, how many people can see it, and how much is that worth to you?

If I can use the existing 24-bit interface to transfer data from source to sink in a way where no banding is visible, what value are the additional bits when I can't detect the banding at 24 bits?

Maybe there are some round-off errors that can occur in the video processing, but if I'm going to buy into it, I want it demonstrated with real equipment rather than marketing brochures or PowerPoint, because there have been various technologies touted as the next revolution that just fizzled because they were overhyped.

Take the cloud scene mentioned by lipcrkr earlier which I assume he mentioned because you can see banding, false contours, whatever. That is in the mpg source. If your video processing wants to clean that up, the question is can it do so effectively with a 24-bit transfer interface to the display. I contend it can because I can freeze that frame, run photoshop, clean up the contours, then send it over 24-bit interface and it will look fine.

Maybe for the video processor it will be less resource intensive if it could do the processing once with higher bits and not have to convert back down, and I can believe that, but if they want to spend the cycles processing, they can send something over 24-bit that looks perfectly fine. That is my assertion.

As an aside, I have no issues with a post like gregr's (reposted by rdjam). There is some basis behind the assertions. The ones I mainly object to are the posts pointing at bad source material with such obvious contouring/banding that you know the problem is in the source and saying boom, HDMI 1.3 is going to fix that. I simply point out that the problem you are seeing is in the source or the video processing; it isn't evidence that the current HDMI is limiting the picture quality.

With a better source and current HDMI, that problem is resolved.
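As a concrete stand-in for the "clean it up and send it over 24-bit" argument, here's a toy sketch: a ramp that bands badly when truncated to a coarse set of levels looks smooth (noisy, but band-free) if it's dithered before the truncation, at the same output depth. The 16-level quantizer and the filenames are my own choices to exaggerate the effect; a real video processor would do something far more sophisticated.

Code:
import random

random.seed(1)
WIDTH, HEIGHT, LEVELS = 512, 128, 16          # 16 output levels exaggerate banding
STEP = 255 / (LEVELS - 1)

def quantize(value):
    idx = max(0, min(LEVELS - 1, round(value / STEP)))
    return round(idx * STEP)

for name, dither in (("banded.pgm", False), ("dithered.pgm", True)):
    with open(name, "w") as f:
        f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
        for _ in range(HEIGHT):
            row = []
            for x in range(WIDTH):
                v = x / (WIDTH - 1) * 255                     # ideal smooth ramp
                if dither:
                    v += random.uniform(-STEP / 2, STEP / 2)  # noise before quantizing
                row.append(str(quantize(v)))
            f.write(" ".join(row) + "\n")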
post #145 of 3473
Quote:
Originally Posted by sfhub View Post

I am a fan of fewer conversions also and that is essentially what the quote is saying. From a data flow standpoint I agree with what they are saying.

However, from a reality standpoint, my question is what problem are they really trying to solve, and will you be able to see the difference? How much of a difference are you seeing, how many people can see it, and how much is that worth to you?

If I can use the existing 24-bit interface to transfer data from source to sink in a way where no banding is visible, what value are the additional bits when I can't detect the banding at 24 bits?

Maybe there are some round-off errors that can occur in the video processing, but if I'm going to buy into it, I want it demonstrated with real equipment rather than marketing brochures or PowerPoint, because there have been various technologies touted as the next revolution that just fizzled because they were overhyped.

Take the cloud scene mentioned by lipcrkr earlier which I assume he mentioned because you can see banding, false contours, whatever. That is in the mpg source. If your video processing wants to clean that up, the question is can it do so effectively with a 24-bit transfer interface to the display. I contend it can because I can freeze that frame, run photoshop, clean up the contours, then send it over 24-bit interface and it will look fine.

Maybe for the video processor it will be less resource intensive if it could do the processing once with higher bits and not have to convert back down, and I can believe that, but if they want to spend the cycles processing, they can send something over 24-bit that looks perfectly fine. That is my assertion.

As an aside, I have no issues with a post like gregr's (reposted by rdjam). There is some basis behind the assertions. The ones I mainly object to are the posts pointing at bad source material with such obvious contouring/banding that you know the problem is in the source and saying boom, HDMI 1.3 is going to fix that. I simply point out that the problem you are seeing is in the source or the video processing; it isn't evidence that the current HDMI is limiting the picture quality.

With a better source and current HDMI, that problem is resolved.

I totally agree with you re your 24-bit color v HDMI 1.3 deep color arguments. We really should be focussing our attention on the video "source" providers to get them to improve their cr@ppy, over-compressed, low-bitrate signals, rather than introduce new technology which, until the programme source material is sorted out, won't give enough benefit to be worthwhile.

I have no doubt at all that some of the features in HDMI 1.3 will eventually deliver an improved viewing (and audio) experience, but we need to get the content providers to get their act together first.

Graham.
post #146 of 3473
Quote:
Originally Posted by sfhub View Post

I am a fan of fewer conversions also and that is essentially what the quote is saying. From a data flow standpoint I agree with what they are saying.

However, from a reality standpoint, my question is what problem are they really trying to solve, and will you be able to see the difference? How much of a difference are you seeing, how many people can see it, and how much is that worth to you?

If I can use the existing 24-bit interface to transfer data from source to sink in a way where no banding is visible, what value are the additional bits when I can't detect the banding at 24 bits?

Maybe there are some round-off errors that can occur in the video processing, but if I'm going to buy into it, I want it demonstrated with real equipment rather than marketing brochures or PowerPoint, because there have been various technologies touted as the next revolution that just fizzled because they were overhyped.

Take the cloud scene mentioned by lipcrkr earlier which I assume he mentioned because you can see banding, false contours, whatever. That is in the mpg source. If your video processing wants to clean that up, the question is can it do so effectively with a 24-bit transfer interface to the display. I contend it can because I can freeze that frame, run photoshop, clean up the contours, then send it over 24-bit interface and it will look fine.

Maybe for the video processor it will be less resource intensive if it could do the processing once with higher bits and not have to convert back down, and I can believe that, but if they want to spend the cycles processing, they can send something over 24-bit that looks perfectly fine. That is my assertion.

As an aside, I have no issues with a post like gregr's (reposted by rdjam). There is some basis behind the assertions. The ones I mainly object to are the posts pointing at bad source material with such obvious contouring/banding that you know the problem is in the source and saying boom, HDMI 1.3 is going to fix that. I simply point out that the problem you are seeing is in the source or the video processing; it isn't evidence that the current HDMI is limiting the picture quality.

With a better source and current HDMI, that problem is resolved.

If you are going to comment on my posts, it would be nice to know you are being accurate in your assessment. I never mentioned the cloud scene, nor did I mention anything about the picture that was posted.
Color banding can appear, especially on LCDs, for a number of reasons. But the color banding I'm talking about is related to the color gamut. 24-bit supports 16 million to 16.7 million colors; HDTVs are limited to 24-bit, or 8 bits per color channel. The increased bandwidth of 1.3 allows a deep color display to do 30-, 36-, or 48-bit color spaces, in which virtually every color perceivable to the human eye can be achieved. It goes from millions of colors to billions of colors. At 30-bit you add at least 4 more shades of gray. You put these numbers on a computer screen and do the test patterns and it blows the roof off. Now, here is what I'm getting at as far as banding. Take color banding in a blue sky, for example, where blue is the only color: banding appears due to a more acute contrast between those blue colors when viewing a 24-bit display. When you are missing some "steps," due to the limitations of the color gamut, banding can occur because the display is not able to blend the colors the way a higher-bit display can. A broader color gamut achieved with a higher-bit deep color display allows more gradations, or steps, in that blue, so it smooths the blue out, alleviating the color banding caused by the limited amount of blue in a lower-bit display. Same with fog scenes, where gray is the prevalent color: those extra shades of gray in a deep color display smooth the transitions between the various shades of gray available in that particular scene. You can have a clean, uncompressed signal and still have banding. The limitations set forth to the display makers with 1.1 and 1.2 are lifted with 1.3. The sky's the limit (no pun intended).
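For anyone who wants to put numbers on the "steps across a sky" argument from either side, here's a neutral little sketch: it just counts how many distinct codes an 8-, 10-, or 12-bit channel has across one narrow slice of its range (the 0.55-0.70 slice is an arbitrary stand-in for a patch of sky). Whether those extra codes are visible after mastering and processing is exactly what's being debated above.

Code:
low, high = 0.55, 0.70            # sky-like slice: 15% of the channel's range

for bits in (8, 10, 12):
    top = 2 ** bits - 1
    available = int(high * top) - int(low * top) + 1
    print(f"{bits:2d}-bit channel: {available:4d} distinct codes across that slice")

#  8-bit:  39 codes
# 10-bit: 155 codes
# 12-bit: 615 codes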
post #147 of 3473
Does anybody know what this TV might actually look like?
post #148 of 3473
Quote:
Originally Posted by snaithg View Post

I totally agree with you re your 24-bit color v HDMI 1.3 deep color arguments. We really should be focussing our attention on the video "source" providers to get them to improve their cr@ppy, over-compressed, low-bitrate signals, rather than introduce new technology which, until the programme source material is sorted out, won't give enough benefit to be worthwhile.

I have no doubt at all that some of the features in HDMI 1.3 will eventually deliver an improved viewing (and audio) experience, but we need to get the content providers to get their act together first.

Graham.


Most won't see it, but feel it.
post #149 of 3473
Quote:
Originally Posted by DEAC View Post

The Sharp D92 series was presented at IFA 2006 in Berlin, maybe:

In Europe the D62 is called XD1 (2000:1 contrast ratio)

At IFA, the new HD1 series was announced for spring 2007 (2500:1 contrast ratio, integrated HDD, integrated HD-tuner, HDMI 1.3) , which could be the D92


Here are some pictures a friend took at IFA:





I have also set up 2 picture galleries from IFA 2006:

http://www.aixess.de/ifa/2006a.html
http://www.aixess.de/ifa/2006b.html

That's what folks think is the D92!
post #150 of 3473
I think the D92, or whatever it will really be called, will have its own attributes. It might have some of the same electronics as the XD1s, but I still think it will have its own identity.
Has Europe become the testing ground for panels that will ship to the U.S.? I don't think so!