Status
Not open for further replies.

Best way to feed 720p/1080i to a Marquee 8500?

2K views 55 replies 13 participants last post by  ChrisWiggles 
#1 ·
From a DirecTV HR10-250.


It has HDMI or component video outs... would I be better off getting a DVI card and doing HDMI-DVI from the receiver to the projector, or going with some kind of transcoder to go from the component to RGBHV?


If the latter, can anyone recommend a transcoder that supports 720p and 1080i?


Or, maybe I'm crazy - can the Marquee 8500 take component? If so, is it recommended?
 
#3 ·
Thanks for the reply Rolf.


Anyone else want to chime in here?
 
#5 ·
Here's another question then - I also want to be able to connect my Xbox (not 360) from time to time. It's component only, 480i/p, 720p, 1080i. How would I do that?


Thanks!
 
#7 ·
OK, Moome adapter it is then.


Thanks!
 
#8 ·
One more question for you guys - how would I switch inputs on the Moome interface? Does the Marquee see the component and DVI inputs as two different inputs?
 
#9 ·
Quote:
Originally Posted by dloftis
One more question for you guys - how would I switch inputs on the Moome interface? Does the Marquee see the component and DVI inputs as two different inputs?
Moome's card automatically switches between Component and DVI depending on which one has a signal. There is a switch on the faceplate of the card to tell it which one you want to be primary. So, say you mostly want to use the DVI port but have it automatically switch to component for your Xbox: you would set the switch so that component is the primary, since the primary input wins whenever it has a signal and the card falls back to the other input when it doesn't.


HTH
 
#11 ·
Dloftis, the very best way to connect your HR10-250 to your projector is to use the broadcast high definition digital format. This is a substantial improvement over both DVI and analog connections, especially with a CRT projector. The picture has more detail, punch, 3-dimensional look and more accurate colors. The downside is this is quite a bit more expensive.


Robert Zuch

Reference Imaging
 
#12 ·
Umm... what? Robert, the HR10-250 can either do component or HDMI. It's an HD Tivo for DirecTV. What the heck are you talking about?
 
#14 ·
I don't think you can mod the Tivo with HD-SDI, but I could be wrong. And even then you'd have to get an HD-SDI input card, and Casper hasn't finished them ;).
 
#15 ·
Quote:
Originally Posted by Robert Zuch
Dloftis, the very best way to connect your HR10-250 to your projector is to use the broadcast high definition digital format. This is a substantial improvement over both DVI and analog connections, especially with a CRT projector. The picture has more detail, punch, 3-dimensional look and more accurate colors. The downside is this is quite a bit more expensive.


Robert Zuch

Reference Imaging
You've piqued my interest... now show me how to do it ;)
 
#16 ·
We have an HR10-250 in our showroom and have compared different ways to connect to CRT projectors. For best picture quality we use broadcast-grade equipment with the existing outputs on the HR10-250 as well as other sources. We convert the video signal to HD-SDI and then back to RGBHV at the projector. The connection is a single coaxial cable, and very long run lengths can be used while maintaining the best picture quality for high-def signals. The difference is huge compared to DVI or HDMI.


We have tested many different types of equipment/manufacturers for the signal conversion, cables, etc. Our approach will vary depending on the specifics of the system. I can't go into much more detail but if someone is interested PM me your system info/email address and I can make suggestions.


Robert Zuch

Reference Imaging
 
#17 ·
Quote:
Originally Posted by Robert Zuch
We have a HR10-250 in our showroom and have compared different ways to connect to CRT projectors. For best picture quality we use broadcast-grade equipment with the existing outputs on the HR10-250 as well as other sources. We convert the video signal to HD-SDI and then back to RGBHV at the projector.
So you're not actually getting HD-SDI directly from the DVR, you're taking the component or DVI outputs and using an external converter to make that HD-SDI?
 
#18 ·
You are taking the standard output of the HD Tivo. There's only so much you can do with it at that point. The signal has already passed from digital to analog form if you're using component. I'd believe there was a significant advantage if you had managed to add an actual HD-SDI output to the Tivo itself, but since you haven't, the internal processing of the Tivo has already occurred. Yes, you can feed it to a professional-grade scaler or something to perform the transcoding and possibly even manipulate the image quality itself. But it's not like it was a pure SDI output to start with.


And let's be real clear... it's D*. You're working with bit-starved 1280x1080 (at best) signals to start with. I just can't believe the difference is that huge.
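To put a rough number on the bit-starved point (assuming D*'s widely reported 1280x1080 downres of 1080i, which is my understanding of what they do):

```python
# DirecTV HD reportedly downrezzes 1080i from 1920x1080 to 1280x1080;
# treat these as round numbers since it can vary by channel.
full_hd = 1920 * 1080   # 2,073,600 pixels per frame
directv = 1280 * 1080   # 1,382,400 pixels per frame
ratio = directv / full_hd
print(f"{ratio:.0%} of full-HD pixels before compression even starts")
```

And that's before the heavy MPEG compression on top of it.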
 
#19 ·
That's because you haven't seen it.


There is a tendency to make decisions based on theory. We have been led to believe by the manufacturers that digital is better than analog. So it seems to make sense that an all-digital DVI or HDMI connection would be best. But the reality is that DVI and HDMI are low-cost formats designed for consumer electronics. The cost to add these inputs/outputs must be kept very low; these are mass-market items. These formats often give acceptable picture quality when driving a digital display, but fall considerably short when driving an analog display (e.g. a CRT). One of the shortcomings is the D/A conversion of the DVI signal. Timing jitter in the DVI signal results in inaccuracies in the analog RGBHV signal which feeds the CRT projector.


Just a few years ago D/A and A/D conversion were not transparent, often introducing subtle distortions in all but the most expensive converters. Now top-quality conversion is available at much lower cost. If the proper equipment is selected, these conversions are now completely transparent. If the digital format being used is accurate and robust (e.g. SDI) then the best picture quality can be achieved. The DVI/HDMI implementations are much too limiting for reference-quality images.


Robert Zuch

Reference Imaging
 
#20 ·
Robert,


So are you taking analog out of the STB? If so, you're relying on the crap mass market D/A converters that you're talking about. If you were to take DVI/HDMI out, and then convert it externally to some other format (be it RGBHV or HD-SDI, etc) then you'd actually be bypassing any cheap mass-market A/D processes altogether. What am I missing?


I'm not disagreeing with you that different ways of doing things can make for a better picture... just trying to understand your point of view.
 
#21 ·
Robert, I'm sorry but you're just not making sense. I am not arguing that digital output is better. In fact, in the last few days we had a similar debate and the reality is it is ENTIRELY dependent on the source component. In this case I happen to agree that the component output on the HD Tivo is stronger than the HDMI. But you have, at that point, relied on the internal consumer-quality digital to analog conversion process. On the one hand you say that these consumer level conversions are not good. On the other you say this:


"If the digital format being used is accurate and robust (e.g. SDI) then best picture quality can be achieved"


You are not using a digital format. You are using an analog format that you are converting to digital. It has already been subjected to that inferior conversion process. And you are also working from an inferior source to begin with in D* HD.


This isn't based on theory by the way. I use what is generally considered to be a good quality scaler (HD Leeza) to do 1080i-1080p.
 
#22 ·
I did not refer to "crap mass market D/A converters". D/A integrated circuits with excellent performance are available at low cost. This was not the case a few years ago. High quality analog outputs are common with the newer set top boxes, and this would include any box with high definition video. On the other hand, DVI, especially when converted to analog, results in a lower quality image. This is not because of the quality of the D/A converter used; it is the DVI signal causing the problems. This may be caused by jitter, poor design, or something else.


To assume that converting a DVI/HDMI output would bypass any quality issues, especially when compared to analog outputs, is a common fallacy. Digital is not necessarily better than analog. If you prefer CRT projectors to digital projectors then you have an excellent example of this.


The bottom line is that analog outputs for high definition video will almost always outperform DVI/HDMI for analog displays, and this difference is clearly observable.


Robert Zuch

Reference Imaging
 
#23 ·
Really? Because the component outputs on devices like the JVC 30K (DVHS deck) are notoriously weak, and that's certainly a box with high definition video. HD Tivos are what, 2.5 years old now? They aren't exactly first generation anymore ;). Seriously Robert, I just feel like there's more to your statements than meets the eye. If you are doing some post-processing of the signal beyond just conversion from analog to SDI, then sure, it could be better. But just the act of taking the analog signal and converting it to SDI isn't doing anything but putting yet another conversion in the chain. Why not just leave it analog?
 
#24 ·
Actually we have used the JVC 30K for demos with excellent results using HD-SDI conversion, including the DTS exhibit at CES. But that is not a good example for this discussion as the JVC only has component HD outputs; DVI is not available for comparison.

taking the analog signal and converting it to SDI isn't doing anything but putting yet another conversion in the chain


Our research shows otherwise. Analog runs, especially for high def. video, always result in some degradation even with the most expensive wire. A typical run length from a source in the equipment rack to the projector is 15-25 feet. Let's assume we have three options: 1) run analog from the source to the projector, 2) convert the analog HD component output to HD-SDI in the equipment rack and convert the HD-SDI signal to RGBHV at the projector, and 3) connect DVI from the source to the projector. Let's also assume we're using a CRT projector. Which of these three options would you expect to give the best result?


Many installers would use the DVI connection. After all it is digital and it is convenient. But with CRT projectors the analog connection (through a transcoder) will almost always outperform the DVI connection (even with a 25' run assuming you use a decent quality RGBHV wire). This is because the DVI to RGBHV conversion is the weak link. Not because of the D/A converter hardware, but because of the DVI signal issues that exist before the D/A conversion takes place.


If we use option 2 we add two conversions and more expense, but we get a small bonus: no transcoder is needed, and the RGBHV wire is replaced with a single coax cable. More importantly, we get improved picture quality compared to the analog run because we eliminate analog signal degradation. Why doesn't adding two conversions result in the same or more degradation than the analog run? The reason is that the conversion to/from broadcast formats is good enough to be transparent, whereas running an HD analog signal 15-25' (or more) results in at least some observable degradation. Comparing option 2 (HD-SDI) to option 3 (DVI) is a huge difference in PQ.
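To lay the three paths out side by side (the stage names are my own shorthand for this discussion, not measurements):

```python
# The three connection options, written out stage by stage.
paths = {
    "1: analog run": ["STB internal D/A", "15-25 ft analog component run",
                      "transcoder to RGBHV"],
    "2: HD-SDI":     ["STB internal D/A", "A/D to HD-SDI at the rack",
                      "single coax run", "D/A to RGBHV at the projector"],
    "3: DVI direct": ["STB DVI out", "DVI run",
                      "DVI-to-RGBHV conversion at the projector"],
}
# Option 2 has the most stages, but the added conversions are
# transparent, while the long analog run (1) and the DVI-to-RGBHV
# conversion (3) are the stages that visibly degrade the picture.
for name, stages in paths.items():
    print(name, "->", " | ".join(stages))
```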


This configuration works great with DVD players. A DVD player with an SDI output connects to a scaler (SDI in, HD-SDI out). So the scaled DVD video shares the single coax cable to the projector. This is an example of an all-digital path using the broadcast-quality digital format (the same format used by TV studios, broadcast facilities and post-production houses). A DVD player's DVI/HDMI output can't compete with this.


Robert Zuch

Reference Imaging
 
#25 ·
So really what you're saying is that you convert to SDI because it's better for longer cable runs, not that the conversion to SDI somehow is superior to the analog signal. Ok, that I can buy. The problem is you're making assumptions about the length of the cable run when you say that it is "better" to do it that way. Not everyone has a 25' run between the source and the projector. At what point does it become a diminishing return? 12'? 6'? I agree completely btw about a competently modded SDI DVD player being superior to using the DVI output for so many reasons (lack of HDCP, crush, etc). That's what I use myself.
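Just to illustrate what I mean by diminishing returns, here's a toy model (the loss figure is purely made up for illustration, not a real cable spec):

```python
# Toy model: analog HD loss grows roughly with run length, so the
# benefit of going HD-SDI shrinks as the run gets shorter.
LOSS_DB_PER_FOOT = 0.05  # hypothetical figure, for illustration only

def analog_loss_db(run_ft: float) -> float:
    """Estimated analog signal loss over a run of the given length."""
    return LOSS_DB_PER_FOOT * run_ft

for run_ft in (6, 12, 25):
    print(f"{run_ft:>2} ft -> {analog_loss_db(run_ft):.2f} dB")
```

At 6' the loss may simply be below what anyone can see, which is my whole point about qualifying the claim.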


I just feel like your initial statements weren't entirely complete and should be qualified to say "If you have a longer analog cable run then conversion to SDI will decrease the signal degradation and improve the picture quality". Otherwise it looks like you're saying that merely converting to SDI is performing some kind of black magic and making the signal better.


-MP
 