HDMI splitter question - AVS Forum
post #1 of 6, 06-03-2012, 12:51 AM - Thread Starter
sstephen (Advanced Member)
Join Date: Dec 2001 | Location: Edmonton, Alberta | Posts: 539
I have a problem and am looking for a solution. This is probably going to be long-winded. Sorry.

I bought a new Ivy Bridge motherboard and processor: a Z77 board with an Intel 3570K, which means HD 4000 graphics.
Now I find out that the geniuses at Intel can't, or more likely won't, write HD 4000 drivers that will communicate with a Denon receiver, which is what I have (of course).

With older chipsets, people could just do an EDID override and all was good.
The drivers on my computer will not even acknowledge the existence of my receiver/projector.
For example, if I try to force detection, I get some "no monitor detected" type of message.
If I run MonInfo, a utility that reads EDID info from a monitor, it reports just fine, so it isn't a cable issue. Linux can also extend the desktop onto that display, and my old NVIDIA card had no problem.
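
For reference, what MonInfo (or Linux) is reading is just a 128-byte block with a fixed header, a manufacturer ID packed into bytes 8-9, and a checksum as the last byte. Something like the rough sketch below would sanity-check it on the Linux side; the sysfs path and connector name are only a guess and vary by machine.

# Rough sketch: read and sanity-check the EDID base block on Linux.
# The connector name "card0-HDMI-A-1" is an assumption; list /sys/class/drm/
# to find the real one on a given machine.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-1/edid")  # hypothetical connector name

def manufacturer_id(edid: bytes) -> str:
    # Bytes 8-9 pack three 5-bit letters, big-endian, with 'A' encoded as 1.
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))

def check_edid(edid: bytes) -> None:
    assert edid[:8] == bytes.fromhex("00FFFFFFFFFFFF00"), "bad EDID header"
    assert sum(edid[:128]) % 256 == 0, "base block checksum mismatch"
    print("Manufacturer ID:", manufacturer_id(edid))
    print("Extension blocks:", edid[126])

if __name__ == "__main__":
    check_edid(EDID_PATH.read_bytes())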

If I remove the receiver from the loop, the computer will detect the projector just fine.
Now, the projector is an Optoma HD81, which includes a separate box for switching and scaling. That unit has the following HDMI connectors:
3 switchable inputs
1 output, which would normally go to your receiver
1 input from the receiver to the scaler
1 output from the scaler to the projector
So the chain, if you are using the switcher, would be:
source (BD/computer/satellite) -> switcher -> receiver -> (back to) scaler -> projector

But you don't need to use the switcher if your receiver can switch inputs (which they can), so the NORMAL way to hook up is to bypass the switcher and do:
source (BD/computer/satellite) -> receiver -> scaler -> projector

What I'm hoping I can do is:

computer -> splitter -+-> receiver -> switcher (input 1) -> scaler -> projector
                      +-> switcher (input 2) -> scaler -> projector

So for BD/satellite, I use input 1: the splitter never enters the equation. The receiver splits out and plays the audio, and passes the video on to the switcher/scaler/projector.
For the computer, I use input 2: the splitter passes audio and video to the receiver; the receiver plays the audio from the computer, and the video just gets thrown away. The splitter also passes video and audio to the switcher; the switcher passes the video to the scaler, and the audio gets thrown away.

What I don't know is how those kinds of passive splitters are going to handle the handshaking that goes on. You now have two devices, the receiver and the switcher, trying to communicate with the computer. Both will tell it they can play video (since the receiver thinks it will be passing it along). I really don't know if the switcher will tell the computer it can play audio when a receiver is not hooked up.
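
To make the worry concrete: the splitter can only present one EDID upstream, so presumably it either forwards one output's EDID or synthesizes some lowest common denominator of the two. Purely as an illustration of that second case (the capability lists below are made up, not parsed from real EDIDs):

# Purely illustrative: how a "lowest common denominator" EDID policy would behave.
# The capability sets are hypothetical; a real splitter works on raw EDID bytes.

def common_edid(sink_a: dict, sink_b: dict) -> dict:
    """Advertise only what BOTH downstream devices claim to support."""
    return {
        "video_modes": sorted(set(sink_a["video_modes"]) & set(sink_b["video_modes"])),
        "audio_formats": sorted(set(sink_a["audio_formats"]) & set(sink_b["audio_formats"])),
    }

denon_receiver = {  # hypothetical values
    "video_modes": ["1080p60", "1080i60", "720p60"],
    "audio_formats": ["PCM 2ch", "PCM 8ch", "DTS", "Dolby Digital"],
}
optoma_switcher = {  # hypothetical values
    "video_modes": ["1080p60", "1080i60", "720p60"],
    "audio_formats": [],  # the scaler box may advertise no audio at all
}

print(common_edid(denon_receiver, optoma_switcher))
# Video overlaps fine, but the merged EDID could advertise no audio at all,
# which is exactly the "one says no video, the other says no audio" failure mode.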

Now, from the looks of it, some people have two TVs hooked up and it doesn't appear to cause communication problems. Other people can't get it to work.

Passive or powered: in either case, can both devices tell the computer what they are capable of and have it work? In my case, is one device going to say "I can't play video" and the other say "I can't play audio", leaving me with no output? Is Intel's second-rate driver still going to see a "Denon" on the wire somewhere and just send out nothing?

The computer is not going to be used for material requiring HDCP, but of course everything in the chain is capable of using it.

Will a powered splitter be any better?

Thanks

Scott Stephens
post #2 of 6, 06-03-2012, 10:21 PM
Colm (AVS Special Member)
Join Date: Aug 2002 | Posts: 4,652
There is no such thing as a passive HDMI splitter. They are all active devices. The only question is where the power comes from. So-called passive splitters take their power from the +5 V line of the HDMI cable. The problem is that that line wasn't really intended for that kind of use. The HDMI spec only requires source devices to provide 55 mA, and the line is intended to be used for hot plug detect. It is quite possible that insufficient voltage and/or current will be available for the splitter to work properly. Splitters should have their own power supplies.
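
To put rough numbers on it (the splitter draw below is a made-up figure, just for illustration):

# Back-of-the-envelope budget for a bus-powered ("passive") splitter.
voltage_v = 5.0               # HDMI +5 V pin
min_source_current_a = 0.055  # 55 mA, the minimum the spec requires a source to supply

budget_w = voltage_v * min_source_current_a
print(f"Guaranteed power budget: {budget_w:.3f} W")  # 0.275 W

assumed_splitter_draw_w = 0.5  # hypothetical draw for a small 1x2 splitter
print("Fits in the budget:", assumed_splitter_draw_w <= budget_w)  # False -> use its own supply
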
post #3 of 6, 06-04-2012, 12:20 AM - Thread Starter
sstephen (Advanced Member)
Thanks, but the rest of my concerns remain. Is there much hope it will do what I want?

Scott Stephens
post #4 of 6, 06-04-2012, 06:21 AM
alk3997
Join Date: Jan 2004 | Posts: 3,722
Quote: Originally Posted by sstephen

I have a problem and am looking for a solution. This is probably going to be long-winded. Sorry.

I bought a new Ivy Bridge motherboard and processor: a Z77 board with an Intel 3570K, which means HD 4000 graphics.
Now I find out that the geniuses at Intel can't, or more likely won't, write HD 4000 drivers that will communicate with a Denon receiver, which is what I have (of course).

...

Thanks

I have almost the same setup as you (ASUS Ivy Bridge instead) and I talk to a Denon receiver to send the main computer screen out. The difference between the way you're set up and mine is that I use a separate video card (a Sapphire fanless) and have no problems with handshaking. The Intel onboard video is disabled.

Your only choices without a new card are to either find an HDMI option (such as CEC) that *may* be preventing proper handshaking or to hope one of the two companies changes their interface sufficiently for the other to recognize them. Also make sure you are outputting a video format that exactly matches what the Denon is expecting (including resolution and frame rates). Check the Denon owner's manual for a list of supported resolutions (it changes by receiver model, and you didn't say which one you have).

Also keep in mind that audio and video are always sent. There is no such thing as a switcher saying whether audio should be sent. What changes is the type of audio or the type of video allowed. When EDID handshaking fails, nothing gets sent or what is sent isn't compatible.
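
If it helps to see where "this sink accepts audio" actually lives: it's a flag in the CEA-861 extension block of the EDID, not a separate negotiation. A rough sketch, assuming you already have a full EDID dump (base block plus extensions) in hand:

# Rough sketch: does the first CEA-861 extension block advertise basic audio?
# Assumes `edid` holds the complete dump; real devices may carry more blocks.

def supports_basic_audio(edid: bytes) -> bool:
    if edid[126] == 0 or len(edid) < 256:
        return False              # no extension block at all
    ext = edid[128:256]           # first extension block
    if ext[0] != 0x02:            # 0x02 = CEA-861 extension tag
        return False
    return bool(ext[3] & 0x40)    # bit 6 of byte 3 = "basic audio supported"
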
post #5 of 6, 06-04-2012, 09:16 AM - Thread Starter
sstephen (Advanced Member)
First, to be clear: in my second post, I meant assuming I give up on the "passive" solution and try for an active one only.

Quote:
Your only choices without a new card are to either find an HDMI option (such as CEC) that *may* be preventing proper handshaking or to hope ...

I just did a search, and CEC is the protocol that allows remotes to turn on or off other devices? I'll check, but I'm pretty sure that is already disabled in my Denon. I don't think the Optoma supports it, but I'll check on that also.

Quote:
... one of the two companies changes their interface sufficiently for the other to recognize them.

I'm thinking that is probably my only hope, and I have nearly none.

Quote:
Also make sure you are outputting a video format that exactly matches what the Denon is expecting (including resolution and frame rates).

Since the HD 4000 driver will not even acknowledge the existence of my Denon, I don't know if that will help, but 1920x1080p is supported by everything in the chain, and that is what I was trying to use. I also tried 1920x1080i (which is what Optoma advertises as best), but still no go.

Quote:
Also keep in mind that audio and video are always sent. There is no such thing as a switcher saying whether audio should be sent.

Yes, I was aware of that.

Quote:
When EDID handshaking fails, nothing gets sent or what is sent isn't compatible.

And unfortunately that is what I expect will happen if I put in an active splitter. I was hoping someone more knowledgeable than myself would tell me I was wrong and that an active splitter would solve all my problems.

I'll send some whiny messages to Gigabyte, who makes my motherboard, to tell them I want it to work with my receiver, but I expect no changes in anyone's policy. I have resigned myself to putting in a separate video card, or running six analog audio signals to my amp. I'd rather the amp did the D/A, since its converters are better, but I think this is my only solution unless I want to buy yet another video card. I have at least three still hanging around doing nothing. I know one will work (a GT 240), but it uses 20-30 watts extra, and it ticks me off that I have to waste it just because Intel or Denon or both appear to be acting like children who can't get along.

Scott Stephens
post #6 of 6, 06-04-2012, 09:32 AM
alk3997
One other choice is the HDMI Detective or a matrix splitter that will force an EDID value. The matrix is more expensive than a video card, while the HDMI Detective is about the same. The HDMI Detective would require a compatible sink; you then store that sink's EDID - kind of a very rough version of what you were doing with your older computers.
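
One caution if you ever hand-edit a stored EDID the way the old override trick did: every 128-byte block has to sum to zero mod 256, so the final checksum byte needs recomputing. A quick sketch (the file names are just placeholders):

# Sketch: fix up the checksum of a (possibly hand-edited) 128-byte EDID block.
# The last byte must make the whole block sum to 0 mod 256.

def fix_checksum(block: bytes) -> bytes:
    assert len(block) == 128, "EDID blocks are exactly 128 bytes"
    body = block[:127]
    return body + bytes([(-sum(body)) % 256])

# Example: patch a captured base block and write it back out (file names are hypothetical).
with open("captured_edid.bin", "rb") as f:
    edid = bytearray(f.read())
edid[:128] = fix_checksum(bytes(edid[:128]))
with open("captured_edid_fixed.bin", "wb") as f:
    f.write(edid)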