
HDMI in Receivers, why? - Page 2

post #31 of 126
Quote:
Originally Posted by oztech View Post

This may be why the resurgence in high end stereo people may be fed up with the fork
lift overhauls they are having to go through with a weakening economy.

English please.
post #32 of 126
Quote:
Originally Posted by RedHot View Post

English please.

What part did you not understand?
post #33 of 126
Thread Starter 
Quote:
Originally Posted by abelincoln View Post

Slaughter, I think you've really hit upon something.

I purchased an HDMI switcher since I was running out of HDMI ports on my television, and it occurred to me that I would lose my per-input settings. On my TV (a Panny TH-50PH10UKA) I don't think there's any way to make a Harmony macro that would properly switch to a certain preset.

The only feasible solution I see is for receiver manufacturers to start providing comprehensive video adjustments for each hdmi input.

I agree. Either television or receiver manufacturers need to do something; in some cases it's the same company (Sony, for example). Anyone care to write one of them and try to get a response? I will as well.

I guess an HDMI switcher would partially solve the problem as well, but that's just a workaround for a bad problem.
post #34 of 126
I think with receivers that have excellent scalers in them, HDMI is a great idea.
post #35 of 126
Thread Starter 
Quote:
Originally Posted by Woodshed View Post

I think with receivers that have excellent scalers in them, HDMI is a great idea.

I'm not sure I would count on my receiver having the best scaler unless it's a high-end receiver, and that still doesn't solve the issue of multiple sources with different colors and levels.
post #36 of 126
Either get a BD player that decodes the HD formats and has analog outs, or only use the HDMI input on your receiver for Blu-ray. Everything else can be connected directly to your display. Calibrate each input as you see fit and you're set.
post #37 of 126
Quote:
Originally Posted by Slaughter View Post

Am I the only one that sees HDMI in receivers as a terrible thing? Why are we being forced to route our audio and video through the receiver to take advantage of the new audio formats? Who in the world would want all their video sources routed to a receiver and output to a single video input on the television? Last time I checked, all sources output different colors, brightness, etc. Calibration would be useless at that point.

Is there a chance that TV manufacturers will eventually add an HDMI audio output so we can use more than one video input on our television?

Quote:
Originally Posted by MichaelJHuman View Post

This may be a dumb question, but why the compelling need to customize per input, when it's digital? There's no analog circuit to screw it up.

Once you set the brightness and color as best you can, if the colors are wrong or not to your liking for a given source, it's the source's fault. For example, some games are really dark and others may be overly "colorful" but that's not an issue with the TV. Also, it's likely going to be on a per movie/game/show basis, so what help is adjusting settings per input?

I was about to say the same thing. The signals are all digital now. You don't NEED to do any "calibration" except once for the set itself (as far as the digital signal sources are concerned, which is where everything is headed). If you're still using a VCR, that analog source could warrant a special calibration of the TV, but seriously, are you even going to notice when the source is a VCR or other lo-def analog source? The only other things I can think of might be a game system or an old Laserdisc player, analog sources. In the case of the game system I find myself asking, "does it really matter?" Create a second calibration for the Laserdisc source, and simply use that calibration for all the analog sources. Create the primary calibration for the digital sources like cable boxes and DVD over HDMI.

Eventually there isn't going to be any need to calibrate except for the set itself. Don't be held to old schools of thought; digital is going to mostly eliminate the need for those old-school calibrations.

HDMI (Digital) for the win.
post #38 of 126
Quote:
Originally Posted by Slaughter View Post

I am not positive about this, but if televisions had say 4 totally customizable profiles per input and you could select these profiles using a harmony remote macro, then it wouldn't be an issue and I would stop complaining. Not to turn this into a display thread, but are there any tv's with this capability?


This is a great idea!

Quote:
Originally Posted by Slaughter View Post

Not to turn this into a display thread, but are there any tv's with this capability?

Just find a display with four or five inputs and use a splitter (since most receivers only have one or two outputs) with one input and four or five outputs. You could then calibrate each picture (source) individually. Could get pricey buying all those HDMI cables........
post #39 of 126
Thread Starter 
Quote:
Originally Posted by Tspeer View Post

I was about to say the same thing. The signals are all digital now. You don't NEED to do any "calibration" except once for the set itself (as far as the digital signal sources are concerned, which is where everything is headed). If you're still using a VCR, that analog source could warrant a special calibration of the TV, but seriously, are you even going to notice when the source is a VCR or other lo-def analog source? The only other things I can think of might be a game system or an old Laserdisc player, analog sources. In the case of the game system I find myself asking, "does it really matter?" Create a second calibration for the Laserdisc source, and simply use that calibration for all the analog sources. Create the primary calibration for the digital sources like cable boxes and DVD over HDMI.

Eventually there isn't going to be any need to calibrate except for the set itself. Don't be held to old schools of thought; digital is going to mostly eliminate the need for those old-school calibrations.

HDMI (Digital) for the win.

Are you serious? Even with all digital sources, they all use different processing. The same game on an Xbox using HDMI and a PS3 using HDMI looks totally different. Why? Different processing. I would never think of using those two on the same input. Calibration will always be needed unless all source material is captured the same way with the same equipment and the hardware uses the exact same processing. Not going to happen. I guess you are lucky to have bad eyes or not care a lot about picture quality.
post #40 of 126
I am still curious about this. I don't mean to hammer the point, but I don't really understand the need for per source video settings with digital devices.

If the digital source device works properly, it should be able to move its pixels over HDMI with no issues. I am not sure how colors are translated, say off of a DVD, but the colors should have predictable values, and any color space translation should be correct if the device works correctly.

An analog/digital device such as a cable box with HDMI tuned to an analog channel could be different due to the nature of analog video. I could see maybe being able to compensate for that. But analog channels suck, and each channel/program likely has its own serious issues.

Any device may send an objectionable video signal, but it seems likely the problem is going to be with that game, disc, channel. In which case what good is a per device setting going to do you? The problem is with the source, or your perception of the source.

You mention processing. What kind of processing? Scaling could mess up your colors, but only if it's buggy. Deinterlacing should not affect the colors. Any change to colors from any video processor implies a buggy VP; I would turn it off or replace it if possible. You also mention PS3 vs. Xbox 360 differences. I would like to see that. If you could make per-source changes to your settings and have those games look alike, and then change games and have those look alike again, I will concede your point. But in that case, either the PS3 or the Xbox 360 has flaws in its video output.

If your DVD player has a bug in it, a per input setting could help maybe. But that seems unlikely.
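
To make the color space translation concrete: here is a rough sketch (plain Python, full-range coefficients, ignoring the 16-235 video-level scaling) of the same 8-bit YCbCr pixel decoded with the BT.709 matrix versus the BT.601 matrix. If a source encodes with one and the device downstream assumes the other, identical digital values land on different colors, which is one legitimate reason two "all digital" sources can look different.

Code:
def ycbcr_to_rgb(y, cb, cr, bt709=True):
    # Decode one 8-bit YCbCr sample to RGB. Full-range math for brevity;
    # real video levels (16-235) add one more scaling step.
    cb -= 128.0
    cr -= 128.0
    if bt709:                                  # HD matrix (Blu-ray, most HDMI game output)
        r = y + 1.5748 * cr
        g = y - 0.1873 * cb - 0.4681 * cr
        b = y + 1.8556 * cb
    else:                                      # BT.601 matrix (SD/DVD material)
        r = y + 1.4020 * cr
        g = y - 0.3441 * cb - 0.7141 * cr
        b = y + 1.7720 * cb
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

pixel = (110, 90, 170)                         # one arbitrary, fairly saturated sample
print(ycbcr_to_rgb(*pixel, bt709=True))        # -> (176, 97, 39)
print(ycbcr_to_rgb(*pixel, bt709=False))       # -> (169, 93, 43)  same bits, different color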
post #41 of 126
Ever hear of Deep Color? Some sources support it, some don't. If you want to use it, you have to enable it on your display for sources that use it. You have to disable it on the display for sources that don't support it.

Deep Color: HDMI 1.3 supports 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths, up from the 8-bit depths in previous versions of the HDMI specification, for stunning rendering of over one billion colors in unprecedented detail.
Broader color space: HDMI 1.3 adds support for “x.v.Color™” (which is the consumer name describing the IEC 61966-2-4 xvYCC color standard), which removes current color space limitations and enables the display of any color viewable by the human eye.
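
To put those bit depths in perspective, the color counts quoted above follow directly from three channels per pixel; a quick back-of-the-envelope sketch:

Code:
# Three channels per pixel, so total colors = 2 ** (3 * bits_per_channel).
for bits in (8, 10, 12, 16):
    print(f"{bits:2d}-bit per channel -> {2 ** (3 * bits):,} colors")

#  8-bit ->              16,777,216 colors  (standard HDMI / Blu-ray)
# 10-bit ->           1,073,741,824 colors  (the "over one billion" figure)
# 12-bit ->          68,719,476,736 colors
# 16-bit ->     281,474,976,710,656 colors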


Mark

Quote:
Originally Posted by MichaelJHuman View Post

I am still curious about this. I don't mean to hammer the point, but I don't really understand the need for per source video settings with digital devices.

If the digital source device works properly, it should be able to move its pixels over HDMI with no issues. I am not sure how colors are translated, say off of a DVD, but the colors should have predictable values, and any color space translation should be correct if the device works correctly.

An analog/digital device such as a cable box with HDMI tuned to an analog channel could be different due to the nature of analog video. I could see maybe being able to compensate for that. But analog channels suck, and each channel/program likely has its own serious issues.

Any device may send an objectionable video signal, but it seems likely the problem is going to be with that game, disc, channel. In which case what good is a per device setting going to do you? The problem is with the source, or your perception of the source.

You mention processing. What kind of processing? Scaling could mess up your colors, but only if it's buggy. Deinterlacing should not affect the colors. Any change to colors from any video processor implies a buggy VP; I would turn it off or replace it if possible. You also mention PS3 vs. Xbox 360 differences. I would like to see that. If you could make per-source changes to your settings and have those games look alike, and then change games and have those look alike again, I will concede your point. But in that case, either the PS3 or the Xbox 360 has flaws in its video output.

If your DVD player has a bug in it, a per input setting could help maybe. But that seems unlikely.
post #42 of 126
Quote:
Originally Posted by Slaughter View Post

I guess you are lucky to have bad eyes or not care a lot about picture quality.

That is really not fair... You came here asking for opinions and you're getting them. Be respectful of those that chime in.


Video from digital sources is for the most part decently calibrated these days, and it's getting better, not worse, all the time. The display's saved user presets are the ticket for those that want more calibration.

I see no difference, for you or anybody else that uses your system, between having to select a different input on the display vs. having to select a saved user picture-controls preset.

Not sure what else to say on it all.

All various opinions and choices of setup are valid on this one.
post #43 of 126
Quote:


Is there a chance that TV manufacturers will eventually add an HDMI audio output so we can use more than one video input on our television?

Obviously you forget about whole-house video solutions. Why do I want more than ONE cable going to any of my 7 plasmas or my projection system?

For those who have single-room setups, it's better to connect all devices to one switch and then send one cable to the display device.

Please remember that some of us do not put our gear beside our display devices, instead we have it in AV racks hidden maybe 10, 20, 50 feet away.
post #44 of 126
Quote:
Originally Posted by Slaughter View Post

Are you serious? Even with all digital sources, they all use different processing. The same game on an Xbox using HDMI and a PS3 using HDMI looks totally different. Why? Different processing. I would never think of using those two on the same input. Calibration will always be needed unless all source material is captured the same way with the same equipment and the hardware uses the exact same processing. Not going to happen. I guess you are lucky to have bad eyes or not care a lot about picture quality.

You are mistaken.

An Xbox 360 or PS3 with HDMI output sends the signal digitally; there is nothing interfering with it. It's coming down the HDMI input in as pure a form as it can get, just as the game designers wanted it. Games looking different between PS3 and Xbox is a whole different topic of discussion, which 'calibration' has nothing to do with.

My eyes are fine.
post #45 of 126
Quote:
Originally Posted by mdv View Post

Ever hear of Deep Color? Some sources support it, some don't. If you want to use it, you have to enable it on your display for sources that use it. You have to disable it on the display for sources that don't support it.

Deep Color: HDMI 1.3 supports 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths, up from the 8-bit depths in previous versions of the HDMI specification, for stunning rendering of over one billion colors in unprecedented detail.
Broader color space: HDMI 1.3 adds support for “x.v.Color™” (which is the consumer name describing the IEC 61966-2-4 xvYCC color standard), which removes current color space limitations and enables the display of any color viewable by the human eye.


Mark

In my lowly opinion, Deep Color is so far one of the biggest marketing jokes ever dreamed up.

Nothing uses it; there is no source material. Even Blu-ray is 8-bit.
post #46 of 126
Quote:


I guess you are lucky to have bad eyes or not care a lot about picture quality.

Not me. I'm lucky not to be ultra fussy, and I do not have to worry about seeing things that do not really exist. I enjoy HD in 720p over component as much as I do watching 1080p over HDMI. People that think there is a huge difference definitely have some issues, and I feel sorry for the people around them.

Quote:


The same game on an Xbox using HDMI and a PS3 using HDMI looks totally different.

Totally different??? Who cares if you think it's totally different? You should just enjoy it; we are not you. We do not want to be like you; we like what we see already and we are happy.
post #47 of 126
Quote:


In my lowly opinion, Deep Color is so far one of the biggest marketing jokes ever dreamed up.

Nothing uses it; there is no source material. Even Blu-ray is 8-bit.

hehe...shh!!!

let them believe it's so awesome
post #48 of 126
I don't think he wanted discussion on it. I think he just wanted to argue with anybody that chooses to hook their system up in a different manner than he chooses to.
post #49 of 126
Quote:
Originally Posted by MichaelJHuman View Post

I am still curious about this. I don't mean to hammer the point, but I don't really understand the need for per source video settings with digital devices.

If the digital source device works properly, it should be able to move it's pixels over HDMI with no issues. I am not sure how colors are translated, say off of a DVD, but the colors should have predictable values and any color space translation should be correct if the device works correctly.

An analog/digital device such as a cable box with HDMI tuned to an analog channel could be different due to the nature of analog video. I could see maybe being able to compensate for that. But analog channels suck, and each channel/program likely has it's own serious issues.

Any device may send an objectionable video signal, but it seems likely the problem is going to be with that game, disc, channel. In which case what good is a per device setting going to do you? The problem is with the source, or your perception of the source.

You mention processing. What kind of processing? Scaling could mess up your colors, but only if it's buggy. Deinterlacing should not effect the colors. Any change to colors from any video processor implies a buggy VP. I would turn it off or replace it if possible. You also mention PS3 vs XBox 360 differences. I would like to see that. If you could make per source changes to your settings have have those games look alike, and then change games and have those look alike again, I will concede your point. But in that case, either the PS3 or the XBox 360 has flaws in their video output.

If your DVD player has a bug in it, a per input setting could help maybe. But that seems unlikely.

MICHAEL..
You are missing the point...

Within an AVR's video processor chip there are multiple functions/modes; these include:
1. Upscaling
2. De-interlacer
3. Noise reduction
4. Video edit
5. Edge correction
6. Pixel enhancement
7. Aspect Ratio
8. Cross Color Suppressor
9. Gamma gain/correction

These are in addition to the transcoding of formats and GUI/OSD functions.
Since there is such a diversity of sources, and of respective native video output streams, connected to a typical AVR, certain modes are required for some streams and not required for others. If all of the sources were outputting the same stream then a single global setting could be justified, but that is not the case.
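
Roughly, per-input profiles could look something like the sketch below. Every name in it is made up for illustration (it is not any real AVR's menu or API); the point is just that the sensible mix of modes differs by source, so one global setting can't fit them all.

Code:
# Hypothetical per-HDMI-input profiles -- names invented for illustration only.
PROFILES = {
    "blu_ray":      {"upscale": None,    "deinterlace": False, "noise_reduction": 0,
                     "edge_enhance": 0},   # native 1080p24: pass it through untouched
    "cable_box":    {"upscale": "1080p", "deinterlace": True,  "noise_reduction": 2,
                     "edge_enhance": 1},   # 1080i broadcast: deinterlace plus some NR
    "game_console": {"upscale": None,    "deinterlace": False, "noise_reduction": 0,
                     "edge_enhance": 0},   # keep processing minimal to avoid input lag
}

DEFAULT = {"upscale": None, "deinterlace": False, "noise_reduction": 0, "edge_enhance": 0}

def settings_for(hdmi_input):
    # Return the saved profile for a source, falling back to a neutral default.
    return PROFILES.get(hdmi_input, DEFAULT)

print(settings_for("cable_box"))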

Hope that clears up any confusion.

Just my $0.04..
post #50 of 126
Quote:


Are you serious? Even with all digital sources, they all use different processing. The same game on an Xbox using HDMI and a PS3 using HDMI looks totally different. Why? Different processing. I would never think of using those two on the same input. Calibration will always be needed unless all source material is captured the same way with the same equipment and the hardware uses the exact same processing. Not going to happen. I guess you are lucky to have bad eyes or not care a lot about picture quality.

Lets end the discussion.

You are not the majority that TV companies care one bit about. You are less than 1% of the population, so why would ANY manufacturing company care about your needs? J6P will use an AVR or a matrix switch and have ONLY one cable run to their TV.

You don't see that the number of cables is more important to J6P than your perceived difference between Xbox HDMI and PS3 HDMI. You will just have to deal with the issues that YOU HAVE. I own both and I do not have any issues you are talking about, so it is truly just you, sorry!

You mention bad eyesight? I can read the copyright at the bottom of eye charts during eye tests. I took flying lessons once and needed to pass some tough eye tests. Did not pass that whole small plane, no room to move, flying stuff though.
post #51 of 126
Quote:


Within an AVR's video processor chip there are multiple functions/modes; these include:
1. Upscaling
2. De-interlacer
3. Noise reduction
4. Video edit
5. Edge correction
6. Pixel enhancement
7. Aspect Ratio
8. Cross Color Suppressor
9. Gamma gain/correction

These are in addition to the transcoding of formats and GUI/OSD functions.
Since there is such a diversity of sources, and of respective native video output streams, connected to a typical AVR, certain modes are required for some streams and not required for others. If all of the sources were outputting the same stream then a single global setting could be justified, but that is not the case.

Hope that clears up any confusion.

Just my $0.04..


Good points, so some sources send a different digital signal and therefore some people feel the need to tweak just that signal?

I don't know. I pass in HDMI, I pass out HDMI to my projection system or my plasma, and it looks good to me; looks good to all my friends and family too....damn, we all must have bad eyesight....oh wait, we are J6P, and general TV design is for us, not for very fussy people. Those people can spend ten times the money to get what they want, because we DO NOT want EXTRA COSTS IN OUR TVS!!!!
post #52 of 126
Quote:
Originally Posted by Slaughter View Post

Everdog - The audio never has to be touched in your tv. It's digital. Just pass it on to my receiver as 1s and 0s.

Let's play with the wording here a bit....

The (audio) video never has to be touched in your (tv) AVR . It's digital. Just pass it on to my (receiver) TV as 1s and 0s...

Am I missing something??
post #53 of 126
Thread Starter 
I didn't post this to argue; I wanted confirmation and solutions. The thing about current devices is that you get component and HDMI video, and which is better depends on the equipment. You get composite, coax, and optical audio. My point is that currently we are free to connect devices as we see fit, but with HDMI switching and HD audio formats, we are being forced to connect our equipment only one way. It's the Apple mentality that simple is better, which is rarely the case.

Erik - I hear what you are saying, but you would have a hard time telling the difference between 5.1 audio from your cable box for a given movie and 5.1 from a DVD of that same movie. I guarantee the video looks different.

MichaelJ / Tspeer - You are assuming that the video is the same coming from different sources. What is so hard to understand? Games looking different is exactly the reason for needing different profiles for each input.

Penngray - If you are using that advanced a setup, hopefully you are not using a receiver and are using much better equipment. But if you are using a receiver, I would hope that you would want the best picture quality from each source, and HDMI switching does not allow for that. I might be 1%, but that is out of only the 5% who own an HDTV and buy a receiver (my opinionated math, of course).

JohnnDenver - My users don't have to select an input; my Harmony remote does it for them. Would you be happy with a Blu-ray player being calibrated the same as SD programming from your cable box? One of the two would look bad; there is no way around it.
post #54 of 126
How does Slaughter want to send HD audio into the AVR?

We need to connect the HDMI to the receiver to do so, and the last I checked all these sources have just ONE HDMI output, so even if he has a TV with multiple HDMI inputs he will still need an HDMI switch to send the HDMI audio portion to the AVR and also send the HDMI video portion to the TV. Maybe I missed something, but it just seems to be a huge, expensive waste of time for any perceived PQ difference.
post #55 of 126
Quote:
Originally Posted by Slaughter View Post

I didn't post this to argue; I wanted confirmation and solutions. The thing about current devices is that you get component and HDMI video, and which is better depends on the equipment. You get composite, coax, and optical audio. My point is that currently we are free to connect devices as we see fit, but with HDMI switching and HD audio formats, we are being forced to connect our equipment only one way. It's the Apple mentality that simple is better, which is rarely the case.

Erik - I hear what you are saying, but you would have a hard time telling the difference between 5.1 audio from your cable box for a given movie and 5.1 from a DVD of that same movie. I guarantee the video looks different.

MichaelJ / Tspeer - You are assuming that the video is the same coming from different sources. What is so hard to understand? Games looking different is exactly the reason for needing different profiles for each input.

Penngray - If you are using that advanced a setup, hopefully you are not using a receiver and are using much better equipment. But if you are using a receiver, I would hope that you would want the best picture quality from each source, and HDMI switching does not allow for that. I might be 1%, but that is out of only the 5% who own an HDTV and buy a receiver (my opinionated math, of course).

JohnnDenver - My users don't have to select an input; my Harmony remote does it for them. Would you be happy with a Blu-ray player being calibrated the same as SD programming from your cable box? One of the two would look bad; there is no way around it.


I'm just playing devil's advocate here, because audio can be considered just as big a portion of the HT experience as the video, and there would be others who would question whether every HDTV would truly pass audio untouched or unaltered to the AVR.

So you're damned if you do and damned if you don't.

Also, will all HDTVs truly pass 7.1 PCM or bitstreamed audio via HDMI to an AVR if you put the HDTV first in the chain? I thought some would only pass 2 channel.
post #56 of 126
Quote:


Penngray - If you are using that advanced a setup, hopefully you are not using a receiver and are using much better equipment. But if you are using a receiver, I would hope that you would want the best picture quality from each source, and HDMI switching does not allow for that. I might be 1%, but that is out of only the 5% who own an HDTV and buy a receiver (my opinionated math, of course).

Kudos, I like that argument: 1% of 5%. Seriously, you are definitely right there!

Honestly though, I have matrix switches and I have AVRs. I run 720p and 1080i throughout my house over component video; my one Xbox actually does 1080p over component video. I have several HDMI devices that I connect back to my component video matrix solution using an HDFury. I also have a PS3 local in my HT room (Panny AE900U). I have done 720p from my system, 1080i from my system, and 1080p locally from the PS3, and of course there is a difference, but it's honestly not meaningful enough for me to ever consider wanting to run any source directly to any of my TVs. It's not worth it, because the difference is again a "splitting hairs" difference that few will ever care about, and I'm very happy I don't think it's much.

btw, I can not tell the difference with the following
PS3 -> JVC 61" 1080p vs PS3 -> V663 -> JVC 61" 1080P

In the end J6P will never care about what you are describing.

I guess what I'm saying is that those seeking the extreme level of PQ just have to pay extreme amounts of money to get their needs satisfied.


You even pointed out the 5%, so J6P doesn't even have HD yet, and just having PS3 -> V663 -> HDTV is a dream come true.


I know that my house gets rave reviews from family and friends because I have HD throughout the house, even a 50" plasma poolside (I send HD to it over Cat5e!!). That HD is still component video, and the picture is stunning for me, my family, and ANYONE that visits.
post #57 of 126
I will buy into the fact that devices which process HDMI signals could muck up the colors.

I have not noticed any issues with an all-HDMI system, but I am not a videophile either.

I also would turn off any video processor that would modify HDMI. My receiver, thankfully, only processes analog inputs. Its component upconversion process resulted in perfectly fine-looking Xbox 360 images being sent to the TV.

I guess I was just trying to understand why per input settings would be useful when you may find any game, movie or program to have objectionable color.
post #58 of 126
Quote:


Its component upconversion process resulted in perfectly fine-looking Xbox 360 images being sent to the TV.

another great point.....We want HDMI from the receiver to the TV because we then have ONE cable. We then connect component video to our receiver and can upconvert to 1080i....it's worth it just for that!!
post #59 of 126
Thread Starter 
Penngray - You nailed it; an HDMI switcher might be the way to go. I just know the HD from my cable box and Blu-ray look different enough that I couldn't live with the PQ knowing it could be better. I can't stand going to someone's house and seeing blown highlights and red push....I want to grab their remote....errrr. My cable box is set up as HDMI to the TV and coax audio to the receiver. I want to keep them separate and probably can for a while, until HDCP and HD audio take over or die like SACD.

MichaelJ - You make a valid point about content having different color, levels, etc. even from a single source, but typically if you calibrate your DVD player, most if not all DVDs will look good. Use those same calibrated settings for HD cable and the picture will not look as good or the same. Once we get to a point where everything is HD, this probably won't matter too much, but that is probably a long way away.

Penngray - On your last point about the 360: your TV is going to upconvert to 1080p anyway even if you send it a 1080i signal, so unless the receiver has a better deinterlacer, you are doing unnecessary conversion, correct?
post #60 of 126
For all of the people who say that digital is digital and the colors are therefore best left alone (they were perfect to start with)...

Have you ever hooked up a colorimeter to your display? I'm not meaning to be antagonistic here but when I calibrated my oppo 980, xbox, and htpc, they all required different contrast, black level, and color balance adjustments. My monitor is a Panasonic 50ph10uka.

Maybe it's just that my equipment sucks and is somehow mucking up the HDMI signal. But calibrating made a very noticeable improvement for me (even though I'm not a picky videophile), and I wouldn't be happy with losing my settings.
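
For anyone curious what "required different adjustments" looks like in numbers, here is a rough sketch that fits an effective gamma to colorimeter grayscale readings. The readings are made up for illustration, not my actual measurements; the idea is just that two sources feeding the same display settings can track noticeably different curves.

Code:
import math

def effective_gamma(readings, white_nits):
    # Average per-step gamma from (stimulus %, measured luminance) pairs.
    gammas = [math.log(nits / white_nits) / math.log(stim / 100.0)
              for stim, nits in readings if 0 < stim < 100]
    return sum(gammas) / len(gammas)

# Illustrative (made-up) colorimeter readings from two sources on the same input:
source_a = [(20, 4.1), (40, 15.0), (60, 33.5), (80, 61.0), (100, 97.0)]   # e.g. disc player
source_b = [(20, 5.6), (40, 18.2), (60, 38.0), (80, 66.5), (100, 97.0)]   # e.g. game console

print(round(effective_gamma(source_a, 97.0), 2))   # ~2.04
print(round(effective_gamma(source_b, 97.0), 2))   # ~1.78 -- visibly lighter shadows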

Anybody interested in calibration should check out this tutorial:

http://www.curtpalme.com/forum/viewtopic.php?t=10457