1 - 12 of 12 Posts

Registered · 30 Posts · Discussion Starter · #1
Does the deep color capability of HDMI 1.3 offer a NOTICEABLE improvement in picture quality over HDMI 1.1?
 

Banned · 20,735 Posts
It depends on whether you have any sources that need the extra bit-depth (you don't), or whether you have video processing in the chain before the digital link that could benefit from the higher bit-depth, *IF* you can actually send that to the display AND the display itself has the bit-depth to benefit from it (unlikely, but possible). Given that almost no displays can really take advantage of this unless it's a CRT that you're going back to analog with, a professional display, or otherwise a very high-end display, I suspect there isn't any difference for the vast majority of users. Most digital displays don't even have enough bit-depth to keep up with 8-bit content without banding, since they usually max out at 10-bit linear, which is not enough. Many are worse than that.
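To put rough numbers on the banding point, here is a small illustrative Python sketch (not from any real video pipeline; it just quantizes an ideal gray ramp at a few bit depths and reports how wide the resulting flat steps are):

```python
# Illustrative only: quantize an ideal 0-to-1 horizontal gradient at a given
# bit depth and see how wide each flat "band" of identical codes ends up.
# Wider bands mean more visible banding on a smooth ramp.

def band_width_px(width_px: int, bits: int) -> float:
    levels = 2 ** bits
    # Distinct output codes actually used across the ramp after quantizing
    codes = {round(x / (width_px - 1) * (levels - 1)) for x in range(width_px)}
    return width_px / len(codes)  # average width of each flat band, in pixels

for bits in (8, 10, 12):
    print(f"{bits}-bit ramp over 1920 px: bands ~{band_width_px(1920, bits):.1f} px wide")
```

An 8-bit ramp works out to bands about 7.5 pixels wide, which is the kind of contouring being described; at 10 and 12 bits the steps are down to a pixel or two.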
 

Registered · 30 Posts · Discussion Starter · #3
I have a Panasonic Blu-ray player and a Panasonic plasma, both 1.3 capable. The problem is I'm running the HDMI through my Denon receiver, which is 1.1 and therefore converts the 1.3 in to 1.1 out to the TV, as I understand it. I'm thinking about upgrading my receiver if you think there would be a vast improvement in picture quality.
 

Moderator · 23,032 Posts
You will see no difference in PQ unless the receiver is modifying the video in a way that makes it worse than the source video. HDMI versions really have no bearing on the PQ of the video.


larry
 

Registered · 1,755 Posts
The Denon just extracts the audio and passes the video signal through without change. It should pass a 1.3 video signal even if it was built when 1.1 was the standard. You can test this by bypassing the Denon and going directly from the Blu-ray player to the TV.


The one thing a new receiver will have is decoding for the lossless Blu-ray audio formats.
 

Registered · 30 Posts · Discussion Starter · #6
I spoke with Denon support today and they told me that it can't pass the 1.3 through but it will send it out through 1.1. Not sure which is correct...
 

One-Man Content Creator · 25,424 Posts

Quote:
Originally Posted by sdrph /forum/post/15546616


I spoke with Denon support today and they told me that it can't pass the 1.3 through but it will send it out through 1.1. Not sure which is correct...

Ok, but remember: there are no deep color sources (other than some camcorders). Neither Blu-ray nor DVD uses deep color. There are no broadcast sources for it.


-Bill
 

Registered · 1,755 Posts

Quote:
Originally Posted by sdrph /forum/post/15546616


I spoke with Denon support today and they told me that it can't pass the 1.3 through but it will send it out through 1.1. Not sure which is correct...

Perhaps they are referring to the higher bandwidth possible with 1.3, which the Denon receiver is apparently not capable of retransmitting. Since the higher bandwidths are only required for resolutions greater than 1080p, or for deep color at 1080p (see http://en.wikipedia.org/wiki/High-De...edia_Interface ), and neither of these has a commercial source at present (except for some camcorders), it's a moot point.
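For the bandwidth side of that, a back-of-the-envelope sketch may help (the 165 MHz and 340 MHz figures are the commonly quoted single-link TMDS clock limits for HDMI 1.0-1.2 and 1.3; treat this as an approximation, not a spec excerpt):

```python
# Rough arithmetic for why 1080p deep color needs the extra HDMI 1.3 bandwidth.
# Figures are the commonly quoted limits, used here only for illustration.

PIXEL_CLOCK_1080P60_MHZ = 148.5   # standard 1080p/60 pixel clock
TMDS_LIMIT_HDMI_1_2_MHZ = 165.0   # single-link limit through HDMI 1.2
TMDS_LIMIT_HDMI_1_3_MHZ = 340.0   # raised limit in HDMI 1.3

def tmds_clock_mhz(pixel_clock_mhz: float, bits_per_component: int) -> float:
    # In RGB / 4:4:4 modes the TMDS clock scales with bit depth (8-bit = 1x)
    return pixel_clock_mhz * bits_per_component / 8

for bits in (8, 10, 12):
    clk = tmds_clock_mhz(PIXEL_CLOCK_1080P60_MHZ, bits)
    print(f"1080p60 {bits}-bit RGB: {clk:.1f} MHz TMDS clock "
          f"(fits pre-1.3 link: {clk <= TMDS_LIMIT_HDMI_1_2_MHZ}, "
          f"fits 1.3 link: {clk <= TMDS_LIMIT_HDMI_1_3_MHZ})")
```

Plain 8-bit 1080p60 sits comfortably under the older limit; 10- and 12-bit RGB at 1080p60 push the clock past 165 MHz, which is the case the higher 1.3 bandwidth actually covers.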
 

Banned · 20,735 Posts

Quote:
Originally Posted by crutschow /forum/post/15544859


The Denon just extracts the audio and passes the video signal through without change. It should pass a 1.3 video signal even if it was built when 1.1 was the standard. You can test this by bypassing the Denon and going directly from the Blu-ray player to the TV.


The one thing a new receiver will have is decoding for the lossless Blu-ray audio formats.

Depends on the Denon. We don't know which model it is, and thus what it's doing. Many are not just switches.
 

Banned · 20,735 Posts

Quote:
I have a Panasonic Blu-ray player and a Panasonic plasma, both 1.3 capable. The problem is I'm running the HDMI through my Denon receiver, which is 1.1 and therefore converts the 1.3 in to 1.1 out to the TV, as I understand it. I'm thinking about upgrading my receiver if you think there would be a vast improvement in picture quality.

No. Again, the difference is ONLY there if you need higher bit-depth because you have a video processor in the chain which would benefit from being able to work without an 8-bit bottleneck at its output.


This is not the case in your system since you're not processing the video. So there is no benefit.


And in any case, the benefit is not in any way vast, and even in the possible situation I describe, it is relegated to video engineers and anally retentive people (like myself) who sit around with test patterns and displays with the bit-depth capabilities to even benefit from this (which you don't have either).


And further, you can get 10- and 12-bit video with HDMI versions prior to 1.3; you just needed to do it at 4:2:2. What 1.3 adds is the ability to do it in 4:4:4 or RGB. Again, not really an issue for you in any way whatsoever, I suspect. If you have to ask the question, then NO, going to 1.3 gets you absolutely nothing in terms of *video.* Audio may be a different story entirely.
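A quick sketch of the bits-per-pixel arithmetic behind that 4:2:2 point (illustrative averages only, not the exact HDMI packing):

```python
# Average bits per pixel for a few pixel formats; illustrative arithmetic only.

def bits_per_pixel(bits: int, chroma: str) -> int:
    if chroma == "4:4:4":   # every pixel carries all three components
        return 3 * bits
    if chroma == "4:2:2":   # one luma plus one (alternating) chroma sample per pixel
        return 2 * bits
    raise ValueError(f"unsupported chroma format: {chroma}")

print("8-bit  4:4:4:", bits_per_pixel(8, "4:4:4"), "bits/pixel")   # 24
print("12-bit 4:2:2:", bits_per_pixel(12, "4:2:2"), "bits/pixel")  # 24, same link rate
print("12-bit 4:4:4:", bits_per_pixel(12, "4:4:4"), "bits/pixel")  # 36, needs deep color
```

Because 4:2:2 shares chroma between pixel pairs, 12-bit 4:2:2 averages the same 24 bits per pixel as 8-bit 4:4:4 and fits the older link; 12-bit 4:4:4 or RGB needs 36 bits per pixel, which is what the 1.3 deep color modes make room for.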
 

Registered · 1 Posts
Interestingly, I was in a Sony Centre shop (Stockbridge, Edinburgh, Scotland) and they had two LCD sets playing the same video from the same model of Blu-ray player. They were set up to show the supposed difference between two cables. The difference was very noticeable but difficult to quantify.


I was sceptical but didn't know anything about cables at the time. I'm even more sceptical now after doing some reading. Barring the obvious possibility that they had configured the sets differently, does anyone have any suggestions as to what the explanation for the difference would be?


Thanks

Stephen
 

Registered · 1,755 Posts

Quote:
Originally Posted by s_g_robertson /forum/post/15977351


Interestingly, I was in a Sony Centre shop (Stockbridge, Edinburgh, Scotland) and they had two LCD sets playing the same video from the same model of Blu-ray player. They were set up to show the supposed difference between two cables. The difference was very noticeable but difficult to quantify.


I was sceptical but didn't know anything about cables at the time. I'm even more sceptical now after doing some reading. Barring the obvious possibility that they had configured the sets differently, does anyone have any suggestions as to what the explanation for the difference would be?

If they were both HDMI cables (not component), then I believe any differences you saw had nothing to do with the cable. If an HDMI cable does not work properly, you get serious picture artifacts (sparkles, dropouts, etc.). The degradation is never subtle.


They probably just had the sets set up differently, as you suggest.
 