
Why use DVI on a CRT?

1673 Views 29 Replies 14 Participants Last post by  ChrisWiggles
I am seeing lots of talk about DVI input boards. What is the advantage of that?


~Jay
HD via disc will probably be HDMI with HDCP encryption... Therefore NO analog HD allowed :eek:
I think one advantage is to keep the signal digital as long as possible, since an analog signal can degrade over long runs. Another would be more flexibility, since most of today's components have DVI outputs.
Quote:
Originally Posted by Jay M
I am seeing lots of talk about DVI input boards. What is the advantage of that?
1. No cable losses that cause a soft image at 1920x1080p


2. You can get 1920x1080p instead of no better than 960x540p when we get some form of high-definition DVD and the software providers turn on the constrained output flag.
Quote:
Originally Posted by Tom.W
HD via disc will probably be HDMI with HDCP encryption... Therefore NO analog HD allowed :eek:
I believe that we'll get constrained output (analog outputs are allowed, but with an equivalent resolution no better than 960x540p) instead of no analog output, so that the manufacturers don't suffer public relations and sales problems due to new HD DVDs being incompatible with early adopters' HD sets that only have analog inputs.
Quote:
Originally Posted by Drew Eckhardt
I believe that we'll get constrained output (analog outputs are allowed, but with an equivalent resolution no better than 960x540p) instead of no analog output, so that the manufacturers don't suffer public relations and sales problems due to new HD DVDs being incompatible with early adopters' HD sets that only have analog inputs.
But Drew, 540p is still not HD!!
Quote:
Originally Posted by GEBrown
But Drew, 540p is still not HD!!
Right - you will need a DVI or HDMI input with HDCP that has not had its keys revoked to get HD.


If you don't have it, things just probably won't suck as bad as having no picture at all.
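
To make the decision Drew describes concrete, here is a minimal sketch of the source-side check: full-resolution HD goes out the digital connection only if the HDCP handshake succeeds and the display's key has not been revoked. The function and variable names below are purely illustrative, not from any real player API.

Code:
def full_hd_allowed(handshake_ok: bool, sink_ksv: str, revoked_ksvs: set) -> bool:
    """Return True if the source may send full-resolution HD to this display."""
    # Both conditions must hold: HDCP negotiated successfully, and the
    # display's key selection vector (KSV) is not on the revocation list
    # carried by newer discs.
    return handshake_ok and sink_ksv not in revoked_ksvs

# A display whose keys were revoked gets downgraded even though the handshake worked.
print(full_hd_allowed(True, "ksv-revoked-example", {"ksv-revoked-example"}))  # False
print(full_hd_allowed(True, "ksv-good-example", {"ksv-revoked-example"}))     # True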
Quote:
Originally Posted by Jay M
I am seeing lots of talk about DVI input boards. What is the advantage of that?


~Jay
Jay,


Getting back to your original question - if you read through the Moome card thread(s), for example, almost everyone says the picture they get from DVI is better than what they had been viewing. The problem is that without a DVI card that completes the HDCP handshake, you can't feed the DVI outputs of your source equipment into an analog CRT projector. Most people have had to use RGBHV or component outputs up to this point, and both of these are analog signals.


So even if you don't plan on buying an HD-DVD or Blu-Ray player, there are still, and will continue to be, DVI (and HDMI) sources that will need a DVI card in order to be used with a CRT projector.


HTH
I am using a DVI card on my PJ so I can use the digital out of my scaler. This way I can keep the signal in the digital domain as long as possible.
I'm sure someone will correct me if what I say is wrong, but isn't DVI limited to just a few meters between source and projector? I thought I read somewhere that it uses only twisted-pair cables. You can get much longer distances between your components with coax cable.


You have to have dual link DVI for 1080p. One link doesn't have the bandwidth.


One of the big reasons for DVI is that the projector can inform the source of its native resolution/special settings. Kinda like plug and play. It makes life with digital projectors easier. CRT projectors don't need that kind of stuff.


The movie makers want a completely encrypted, digital signal chain. They are being hurt by piracy. They would rather have no analog signal anywhere in the chain. I think if they had their way we CRTers would have no video sources at all.


I don't see any reason for DVI and CRT at all except that's the only way you are going to get any modern video. It is no improvement over a decent analog source for your CRT projector.
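
The "plug and play" behavior mentioned above is EDID: the display exposes a 128-byte data block over the DVI/HDMI DDC wires, and the source reads the preferred mode out of the first detailed timing descriptor. A rough sketch of that read-out, assuming `edid` already holds the raw 128 bytes:

Code:
def preferred_mode(edid: bytes):
    """Return (pixel clock in MHz, horizontal active, vertical active)."""
    dtd = edid[54:72]                                            # first detailed timing descriptor
    pixel_clock_mhz = int.from_bytes(dtd[0:2], "little") / 100   # stored in 10 kHz units
    h_active = dtd[2] | ((dtd[4] >> 4) << 8)                     # low byte + upper nibble
    v_active = dtd[5] | ((dtd[7] >> 4) << 8)
    return pixel_clock_mhz, h_active, v_active

# A native 1080p display would report roughly (148.5, 1920, 1080) here.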
Single link can do 1080p.


DVI over long distances may have problems; HDMI seems to be better in this regard, based on what people have reported.

Quote:
I don't see any reason for DVI and CRT at all except that's the only way you are going to get any modern video. It is no improvement over a decent analog source for your CRT projector.
Yup, basically.


The one issue, though, is that you can do analog poorly and not realize it if you aren't aware: you can get ghosting, signal loss, or other problems and not notice the degradation because it can be gradual/small. With digital interfaces it generally works or it doesn't; you're not going to have a slow degradation of the signal, though you can get sparklies and things like that. But with analog RGB done well, with good cabling and circuitry and such, it shouldn't really be any different at all.

Quote:
Originally Posted by ChrisWiggles
Single link can do 1080p.
Ah, that jogs my memory. You would need dual link for 2048 x 1536 @ 60Hz. I did some investigation at VDC a couple of years ago and afterwards we decided that a DVI card made little sense for the majority of the Marquee customers. Even with fiber-optic transmitters and receivers to increase the range, it didn't gain anyone anything over co-ax cable runs.
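
The single-link versus dual-link question comes down to pixel clock: single-link DVI tops out at a 165 MHz TMDS clock. A quick back-of-the-envelope check, using the standard CEA-861 totals for 1080p60 and approximate GTF blanking for 2048x1536 (the exact blanking figures are assumptions):

Code:
SINGLE_LINK_MAX_MHZ = 165.0   # single-link DVI TMDS clock ceiling

modes = {
    "1920x1080p60 (CEA-861 totals)": (2200, 1125, 60),   # h_total, v_total, refresh
    "2048x1536@60 (approx. GTF)":    (2800, 1589, 60),
}

for name, (h_total, v_total, hz) in modes.items():
    pixel_clock_mhz = h_total * v_total * hz / 1e6
    link = "single link" if pixel_clock_mhz <= SINGLE_LINK_MAX_MHZ else "dual link"
    print(f"{name}: {pixel_clock_mhz:.1f} MHz -> {link}")

# 1080p60 comes out at 148.5 MHz (fits single link); 2048x1536@60 needs ~267 MHz (dual link).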
Quote:
Originally Posted by Drew Eckhardt
I believe that we'll get constrained output (analog outputs are allowed, but with an equivalent resolution no better than 960x540p) instead of no analog output, so that the manufacturers don't suffer public relations and sales problems due to new HD DVDs being incompatible with early adopters' HD sets that only have analog inputs.
Pretty much, but not exactly. Neither HD-DVD nor BD requires down-rezzed analog output; they support it. Both support full-resolution output via analog. Both also implement a down-rez flag that, if set, forces down-rezzed analog output. In other words, it is completely up to the studio whether you get full-rez or down-rezzed analog. A couple of studios have already said they will support full-rez analog output. Rumor has it that there are a few studios that will do full-rez analog output because they do not want to punish early adopters. Rumor also has it that there are some studios that are very anti-full-rez analog and plan to use the flag.
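
A minimal sketch of the per-title behavior described above, where the studio's flag, not the format, decides what the analog outputs carry (the names below are illustrative only, not from either spec):

Code:
FULL_RES    = (1920, 1080)
CONSTRAINED = (960, 540)     # "no better than 960x540" on the analog outputs

def analog_output_resolution(title_sets_down_rez_flag: bool):
    """Analog output resolution for a title: full HD unless the studio set the flag."""
    return CONSTRAINED if title_sets_down_rez_flag else FULL_RES

print(analog_output_resolution(False))  # (1920, 1080) - studio allows full-rez analog
print(analog_output_resolution(True))   # (960, 540)   - studio uses the down-rez flag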


The simple fact is that not allowing full-rez analog will do nothing to stop real pirating operations. It would not be that hard to buy an HD CRT direct-view set and use its guts to get an analog signal to record (once the required 4 GHz processors are available). This of course assumes HDCP is not cracked, which according to one encryption expert is not that hard to do at all.


Bottom line, disallowing full-rez analog output won't stop real pirating operations. Mostly it will just hurt us, the honest consumers who don't yet want to use an inferior display technology.


For more info about all the issues connected with HD on disc see:
http://www.dvdsite.org/


Dave
Quote:
Originally Posted by tse
I don't see any reason for DVI and CRT at all except that's the only way you are going to get any modern video. It is no improvement over a decent analog source for your CRT projector.
True, but the jury has not returned on the present state of DVI-to-analog conversion. Many have claimed that they are getting "great" images from DVI-to-RGB conversion. What we've seen, and it pretty much lines up with what Curt had reported from his evaluation, is that the image is worse than analog RGB.


In my evaluations (as also witnessed by many others), this was observed with each of the units tested. And we looked at almost every known one of them out there. The DVI idea/design is a good one, but I think it needs some time to get better, because it should at least be better than analog RGB. Curt had mentioned color saturation and hues. This was also my observation, along with a lack of real detail in backgrounds. We even noticed this at Clarence's on his G90, and he had also mentioned in his post on my visit that his DVI converter (not Moome's unit) did not do too well with the HDTV resolution pattern. I know things are changing and that we'll more likely have to go all the way with DVI conversion, but I sure hope someone does something better with the conversion.


I've yet to look at Moome's unit via DVI (but hope to when JBJR comes down). I have, however, seen it component to RGB. Other than what Curt had reported, I've read very good things so far. I only hope that the colors are truly correct, along with some of the other things I've noticed with the other units.


The best HDTV I've seen so far seems to have come from a laptop playing TS files. Everything about that image was perfect. Colors were vibrant, as they should be with HDTV, and the background detail was very clear and distinct, even at the higher resolutions. The true essence of HDTV is fine detail and colors.


So for now, they seem to be OK, affordable, and a very good option for DVI conversion. However, I do hope Moome's is better than what I've already seen so far, because we'll need something really good for high def on our high-performing 9" CRT projectors. It really would be great to get around the shortcomings of analog cabling/etc.
Well, I would like to add that DVI performance does seem to depend on the source as well (as one might suspect).


Using the Moome card in my Sony 1272, I have noticed that:


When I hooked up the Comcast Moto 6412 box via HDMI, it was a bit less saturated than component out, but when I use the JVC 5U D-VHS deck with HDMI, the colors are noticeably more saturated and sharper than component. (Note: component looks the same between the two boxes.)


I do look forward to what you find when you do get to try out one of the input cards, Mike!


Thanks!


Brian
DVI input for CRTs is not a performance upgrade. It should not even be thought of as such. It is a method to enforce content protection, plain and simple. At the edge of the CRT limits that HD-SDI inputs can take you to (Stephens/Teranex-modded 9500LC), you see all-out performance. DVI is not about that.


Your display is either relevant or it is not in the coming High Def DVD copy protected world. DVI inputs address this.
Well put, Mark! It's do or die for CRTs. The MP-5 via component is VERY hard to beat, but I'm afraid the analog hole is closing... :(
Quote:
Originally Posted by damon
DVI input for CRTs is not a performance upgrade. It should not even be thought of as such
:eek:


This is really bad news for us high-end CRT users.. :eek: I was hoping things would not turn out this way with DVI conversion.. :mad: I'm also hoping that some chip manufacturer will take us to the next level with higher-performance chips, and for some reason I'm very optimistic that it will happen for us, because what I've been seeing is NOT what I want to be seeing in my HT. With HDTV, the real virtues are bandwidth and resolution. For video (movies, etc.) this translates to detailed and sharp backgrounds, and improved colors. Yes, resolution and bandwidth have a big effect on colors:


With good DVD and other SDTV-quality video, when there are six people in a room of the same race or complexion, you should be able to see the differences in at least three of their complexions.


With good HDTV, when there are six people in a room of the same race or complexion, you should be able to see the differences in at least five of their complexions, sometimes even down to the makeup on their faces.


Without the ability to discern the super detail in the backgrounds, and the amazing palette of pastel colors that's so prominent with HDTV, we're not really getting TRUE HDTV.. :eek: :mad:



BTW, the MP-5 will not be listed on my website after the site changes next month. Our focus next year will shift in another direction.
A few of us were pinning high hopes on Reinhold's HD-SDI/HDMI input board with 12-bit resolution, using the Analog Devices 7321 with NSV, rated at 216 MHz, versus the Silicon Image converters that are rated at 10 bits (170 MHz), and hopefully soon we shall see this happen. :)
HD-SDI would be the way to go, so I am with you in hoping that Reinhold can deliver. Think of it: a digital input designed and implemented strictly for all-out performance. The fact that you could then run one single, fairly cheap run of coax to your PJ would just be icing on the cake!!
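
For a rough sense of what the two converter specs quoted above mean in practice, here is a small comparison of output code levels and sample-rate headroom over the 148.5 MHz pixel clock of 1080p60 (the part figures are taken from the posts above, not verified independently):

Code:
P1080P60_CLOCK_MHZ = 148.5

dacs = {
    "12-bit / 216 MHz (Analog Devices part)": (12, 216.0),
    "10-bit / 170 MHz (Silicon Image part)":  (10, 170.0),
}

for name, (bits, max_mhz) in dacs.items():
    levels   = 2 ** bits                       # number of output code levels
    headroom = max_mhz / P1080P60_CLOCK_MHZ    # sample-rate margin over 1080p60
    print(f"{name}: {levels} levels, {headroom:.2f}x headroom over 1080p60")

# 12-bit/216 MHz: 4096 levels, ~1.45x headroom; 10-bit/170 MHz: 1024 levels, ~1.14x.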