Originally Posted by dlm10541
Can someone explain how a HDMI cable from box to display will cause missing channels. Someone better go back to school and not accept corporate BS.
Do you frequently install these converters in homes on various brands and models of televisions? It seems like it shouldn't matter which output method you use, right? Yeah, I get that, and I hear that a lot. Keep in mind, though, that HDMI is a DIGITAL signal. In fact, HDMI stands for High Definition Multimedia Interface (interface being the key word). It carries uncompressed video, or video and audio, and it needs software support to work right.
Component cables provide ANALOG High Definition, and the picture clarity is the same at 1080i. I've done some research, and quite a few cable providers are experiencing issues with HDMI, particularly on DVRs. Cisco and Samsung produce the cable DSTs and DVRs for all of them. It's clearly a software issue.
Here's a good comparison. If you have a new video card installed in your PC and you don't have the right drivers for it, is it going to work right? It may or it may not. There will probably be errors, certain resolutions will not be available, and certain outputs may not work right or may not work at all. I agree it seems crazy to think that using one output method over another would make or break your ability to record. But I'm not an engineer. I can tell you from repeated experience that when the HDMI cable comes off and another output method is used, the box will (provided there are no other issues, network or signal) begin working properly again.
For all intents and purposes, a DVR is a computer. It has a motherboard, a hard drive, memory, a video output, and an "operating system". It's really more of a workstation following commands from a server, but it is a computer. If the software isn't right, it's not going to operate properly.
I can understand why people are frustrated at not being able to use the output they want, because HDMI is new, clean, and hip... but really, if component cables do the same job just as well, there's no harm in using them. The only people who really have a problem, then, are the ones whose HDTVs lack component input(s). Two Black Fridays ago I came across many sets sold that only had HDMI and cable inputs on the back; some of them didn't even have composite inputs. In that case you would need to run component into a receiver and HDMI out to the TV. I've seen a lot of success doing it this way when people were previously having issues with HDMI.
I personally had 3 Samsung 3270s in my home at one time. Every one of them had the same issues. I would go to record something, and when playback time came it would be bits and pieces (namely only the first or last 3 minutes of the program). Tuning into certain channels was problematic as well. It would also tell me I had less storage space available than was actually there. Eventually I went back to Cisco 8640s, and they had HDMI issues too. The picture would flicker in and out on my Vizio 32" 1080p, and it would do similar things on my 47" Philips 1080p. After having had enough, I switched to component -- ZERO problems since.