Hi, I've been having a strange issue with a TV signal sent over Cat5. I think some of my findings and problems may be useful for others to read.
I am currently running an HDMI signal over an old HDMI-over-Cat5e extender (which uses two Cat5e cables). As far as I remember, this converter can output a maximum of a 1080i signal. I have tried a DVD player (with the resolution lowered to 720p, and even lower to test), a Sky satellite receiver, and a Denon 3312 amp (the newest model, which supports HDMI 1.4). Normally all my devices plug into the amp, which upconverts the signal to a resolution of my choosing and feeds it into the HDMI-to-Cat5 adaptor; from there the cables run about 10-15 metres to the roof deck, where a plasma TV sits.
The problem starts after around 10 minutes... the plasma (which uses DVI, so I have a DVI-to-HDMI adaptor) loses the signal and shows "No signal" until I turn off the source (e.g. DVD, Sky box, etc.). I tested with a small LCD TV: around 10 minutes after I first play the source, I see a flash on the screen where the signal changes (the screen goes dark for a second), but the image comes back a second later. So it seems the LCD recovers from the signal change but the plasma doesn't.
Two reasons I can think of:
1) The HDMI-to-Cat5 adaptor sits near the back of the AV rack that holds all my equipment, so it may be picking up electromagnetic interference (EMI), although the same adaptor has been in use for two years (the Denon amp is new, replacing an older 2008-model Denon amp). The roof-deck plasma is also new. The old one never had this problem (though I may simply not have noticed, so it may have behaved like the LCD I tested).
or 2) The HDMI-to-Cat5 adaptor is old and needs replacing. I have seen a newer model that sends 1080p over a single Cat5 cable (mine runs over two).
Could this solve my problem? It doesn't explain why it happens at around the 10-minute mark, though...
Someone please help!