
Registered · 498 Posts · Discussion Starter · #1
Hi,


I would appreciate it if someone would take the time to explain what DVI is all about, and, more importantly, the consequences of buying a new projector (the Sharp, for example) with no DVI on board.


Thank you

Ran

 

Registered · 457 Posts
Ran,

Here's the short answer. DVI is basically another interconnect between a video source (DVD player, STB, etc.) and a video display (TV, monitor, projector, etc.). The main benefit is that it is a purely digital connection, as opposed to analog connections like RGB, Y-Pb-Pr, etc.


The scuttlebutt about DVI is that content producers (Hollywood, the MPAA) want to use this "perfect" connection as a method of enabling copy protection on high-definition signals. The fallout is that if the content has the copy-protection flag set, the high-definition signal can only be sent through the DVI connector. All other outputs (component, RGB, etc.) would be down-converted to 480p.
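To make that per-output behavior concrete, here is a tiny hypothetical Python sketch of the decision a source device would make; the names and resolutions are mine for illustration, not from any actual device or spec.

Code:
FULL_RES = "1080i"
DOWN_RES = "480p"

def output_resolution(output_type, content_protected):
    """Resolution a source device would send on a given output."""
    if not content_protected:
        return FULL_RES              # unflagged content passes everywhere
    if output_type == "DVI_HDCP":
        return FULL_RES              # protected path keeps high definition
    return DOWN_RES                  # component/RGB outputs get down-converted

for out in ("DVI_HDCP", "DVI", "Component", "RGB"):
    print(out, "->", output_resolution(out, content_protected=True))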


The reason this has upset so many people (including yours truly) is that we have made some significant investments in equipment that has been advertised as "HDTV compatible", only to find now that the content providers won't "provide" the content unless our display equipment has the new connector.


In fact, just having the DVI connector won't be enough (some equipment already has it); your equipment will need to have the special version of it with the copy-protection scheme (HDCP) implemented.


OK. So that's the short answer. Although I am definitely biased (see my sig), I tried to give an objective explanation of DVI and why it's such a HOT topic. If you're really interested, I would do a search for "DVI HDCP" or "DVI copy protection" in the HDTV forums. Then sit back, on a rainy afternoon, and read until your eyes bleed.



------------------

DVI/HDCP makes your HDTV not ready
 

Registered · 441 Posts
Ran:


DVI stands for digital video interface. To take advantage of this technology you must have an HTPC with a video card that has a DVI output; I am not aware of any DVD players with DVI outputs. The benefit today is that it allows you to bypass the DACs (digital-to-analog converters) in both the DVD player (inside the computer) and the projector. The result should be a more stable image with no motion artifacts. Whenever you can bypass a conversion process, it is a good thing. DVD playback through a computer, therefore, will benefit from DVI immediately, regardless of future standards.
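To illustrate why skipping the conversion matters, here is a rough Python sketch of an 8-bit video level making the D/A-to-A/D round trip versus staying digital; the noise amount is invented purely for illustration.

Code:
import random

def dac_adc_round_trip(pixels, noise_lsb=1.5):
    """Simulate 8-bit levels sent over an analog link and redigitized."""
    out = []
    for p in pixels:
        analog = p + random.uniform(-noise_lsb, noise_lsb)   # DAC/cable noise
        out.append(max(0, min(255, round(analog))))          # ADC requantizes
    return out

source = [0, 255, 0, 255, 128, 129]       # fine single-pixel detail
print("digital (DVI):", source)           # bit-exact, untouched
print("analog (VGA) :", dac_adc_round_trip(source))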


There are two types of DVI connections, DVI-I and DVI-D. The DVI-I (integrated) connection allows you to connect a DVI device to an analog device by way of an inexpensive adapter. The DVI-D (digital) connection will only allow you to connect a digital device to a digital device.
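As a quick reference, here is my own summary sketch of the connector variants (DVI-A, the analog-only variant, comes up later in this thread); this is a reading aid, not an official pinout.

Code:
DVI_VARIANTS = {
    "DVI-D": "digital only (TMDS)",
    "DVI-A": "analog only (VGA-level signals)",
    "DVI-I": "integrated: digital and analog pins on one connector",
}

def passive_vga_adapter_works(variant):
    """A cheap passive DVI-to-VGA adapter needs the analog pins present."""
    return "analog" in DVI_VARIANTS[variant]

for v, desc in DVI_VARIANTS.items():
    print(v, "-", desc, "| passive VGA adapter:", passive_vga_adapter_works(v))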


What many are concerned about is future compatibility with HDTV sources and perhaps future high-resolution DVD players. As Jhill32 says, the powers that be are worried about pirating the excellent HDTV signal. Several connections that would work in conjunction with copy-protection software have been suggested, two of which are DVI/HDCP and IEEE 1394 FireWire. The current problem is that I am not aware of any current production projectors that are compatible with the DVI/HDCP standard. The DVI connection itself is not a problem, as it is just a means to send and receive a digital signal; the problem will be with the chipset associated with the DVI connection.

I personally am not concerned about it because there is currently so little HDTV material, at least in my part of the world, that DVD is my primary source. I also expect that when a copy-protection method is finally adopted, it will be implemented in a box between the satellite receiver and the projector, or in the satellite receiver itself, which will decode the signal and, like all current encryption schemes, send either a digital or an analog signal that all DVI-input or analog projectors can read. Otherwise, current DVI projectors and, more importantly, the millions of analog projectors out there would be obsolete. There is no way the industry could afford or allow this to happen if they want people to accept the new format and use their products.
 

Registered · 498 Posts · Discussion Starter · #4
jhill and knuck,


Thank you both for taking the time to answer my question, which has become very relevant regarding the new Sharp.

So, if I understand correctly, there is still some time before the MPAA finalizes the copy-protection method, which will affect only HD and HD DVD when available, and even then it's hardly likely that all our projectors will become obsolete. If so, and taking into account that here, in Israel, I don't foresee HD in the next couple of years, the new Sharp remains on my short list.

Thank you

Ran
 

Registered · 87 Posts
IMHO, no-DVI should not be a deal breaker. I run a 25-foot VGA cable to my LT150 with no blurring or ghosting at all. Fast action scenes are crisp.


Like any added extra, if you can get DVI for little or no cost, then of course go for it.


Thomas
 

Registered · 1,587 Posts
Quote:
Actually, I think HDCP has been accepted by the consortium. I don't know if there are any announcements by manufacturers yet, though.


I just hope they don't design away the analog outputs before they realize what they've done.
That's what I have heard. It is a done deal, so I guess we have to live with it. We will start getting this on DVD as the digital out, so the same copy rules apply.


I do not think the analogs will go the way of T-Rex just yet. But I want a pure digital connection for my new DLP, and if DVI is it, then that's great.


I would not want to spend $10K and not have the best signal available for my new projector, for both HD and DVD.


DavidW


[This message has been edited by David Wallis (edited 09-21-2001).]
 

Registered · 5,211 Posts
I have about ten different display devices with DVI inputs available in my projection lab. I have done a lot of testing using a PC with test pattern software ( http://www.displaymate.com ) and I can conclusively say that there is little to no difference between analog (VGA) and digital (DVI).


This is based on about 60 people over the last few months, all of whom work in the AV field, taking a look and deciding. Not one person was able to make a conclusive decision that one was better than the other. With a few older display devices it is fair to say that DVI had a SLIGHT sharpness/contrast advantage. Nobody could quantify that advantage in any terminology other than the type you find in the description for some audio cables. The only thing you could say is that the DVI was a bit punchier (is that a word?). But there was no pattern where we could say... ha, there you go.


On the newer devices... the NEC GT1150, the Epson 810p... no difference at all. They were both so sharp that if you put up fine dither patterns it actually hurt to look at them (I am serious). In a blind comparison nobody could tell the difference.
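As a side note on method, the logic of this kind of blind comparison can be sketched in a few lines of Python; this is a toy simulation added for illustration, not the actual test setup.

Code:
import random

def blind_trial(viewers=60, can_tell=False):
    """Count how many viewers correctly identify which input is shown."""
    correct = 0
    for _ in range(viewers):
        shown = random.choice(["VGA", "DVI"])
        guess = shown if can_tell else random.choice(["VGA", "DVI"])
        correct += (guess == shown)
    return correct

# If nobody can really tell the difference, picks hover around chance (~30 of 60).
print("correct picks:", blind_trial(), "out of 60")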
 

Registered · 2,971 Posts
Which patterns were used in the comparison?


I'm particularly interested in knowing whether a pattern comparing the brightness of single-pixel horizontal and vertical lines was used. What I mean by a single-pixel line pattern is one in which single-pixel-thick white lines sit next to single-pixel-thick black lines. I note a significant difference in the brightness of vertical vs. horizontal lines when a VGA signal is used rather than digital DVI. The digital DVI has equal brightness for the vertical and horizontal every-other-line patches, indicating no bandwidth losses. The VGA connections had dimmer vertical-line patches than horizontal ones, indicating bandwidth losses in the signal path. I'm surprised you didn't note a difference.
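For anyone who wants to generate that kind of patch, here is a rough Python sketch; it is my own construction, not the actual DisplayMate or AVIA pattern.

Code:
def line_patch(width, height, vertical=True):
    """2D list of 0/255 pixels: single-pixel alternating black/white lines."""
    return [[255 if ((x if vertical else y) % 2 == 0) else 0
             for x in range(width)]
            for y in range(height)]

def mean_level(patch):
    flat = [p for row in patch for p in row]
    return sum(flat) / len(flat)

# Vertical lines stress horizontal bandwidth: a band-limited analog path
# blurs the white lines so they never reach full level, and the patch
# looks dimmer, which matches the observation above.
v = line_patch(8, 8, vertical=True)
h = line_patch(8, 8, vertical=False)   # one line per scanline; much easier
print("ideal mean levels:", mean_level(v), mean_level(h))   # both 127.5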


------------------

Guy Kuo
www.ovationsw.com
Ovation Software, the Home of AVIA DVD


 

Registered · 3,102 Posts
I have tried both VGA and DVI and can see a difference. Not in ghosting or artifacting, but in color saturation: the colors are more vibrant. If you don't have a projector yet, I suggest getting one with a DVI input and an HTPC with DVI out. Since most video cards that have DVI also come with VGA, you can do your own test.
 

Registered · 3,653 Posts
Quote:
Originally posted by Dizzman:
and i can conclusivly say that there is little to no difference between Analog (VGA) and Digital (DVI)

That's a bit like saying there is no difference between component and SDI; you should go over to the scaler forum and tell them to stop wasting their money. :) Whether you see a difference depends on quite a lot of things. Why do you think loads of HTPC guys made the RF filter mods to their video cards?


Jeff
 

Registered · 5,211 Posts
I have been a professional projectionist for the last 12 years. I have seen and set up every type of device you can name. And these observations are not just me looking; they are based on many people doing comparisons. As to the alternating-pixel chart you refer to, I have looked at ALL of the patterns... EVERY SINGLE ONE, numerous times at that. Blind comparisons, non-blind comparisons.


Some projectors have a bigger difference, some have a smaller difference. I will qualify things a little more by saying that all VGA cable distances were very short, and the graphics cards were reference quality. I did these tests on a lot of different devices to make this statement.


I would not suggest that people in the scaler forum stop wasting their money, as, first of all, the company I work for makes scalers... :P Secondly, there is a big difference in the SDI category (theoretically), as a scaler has one very large step removed from its processing path: the decode/A-to-D conversion. You no longer need to be concerned with which chip a manufacturer is using.


What I AM saying is that DVI was introduced to reduce the cost and complexity of desktop monitors. THERE ARE NO OTHER MAJOR REASONS FOR ITS DEVELOPMENT. It was simply to give LCD monitors an advantage (greatly reducing the internal circuitry required) in order to reduce their costs and start getting them on desktops. The reality of the situation is that due to the many different versions of DVI (DVI, DVI-D, DVI-I, DVI-A, M-1), due to the constant improvements of the A-to-D converters for display devices, due to the lower cost of these chipsets, and due to the increased cost of adopting a new technology... DVI is entering the marketplace FAR slower than anybody (and just about everybody) anticipated.


All I am saying is that there is little difference (in a practical viewing sense) between the analog and digital interfaces available to us. Also, there are issues such as the fact that a lot of the advantages get lost in the somewhat inferior image quality of a lot of the devices using DVI.


Just my $.02 worth.
 

Registered · 388 Posts
People should realize that there's not going to be a huge difference between DVI and VGA since the VGA on many projectors is already pretty darn good to begin with. And the problems that people do complain about with their projectors' images generally have nothing to do with the signal having gone through a D/A/D conversion.


Still, there is a difference. I use a Radeon DVI output to drive my MP1600. I haven't noticed any difference with moving images; there may be some but trying to discover what it is would be a waste of time, in my opinion.


There is a definite difference in static images, however. On a VGA image, even if it looks excellent with no ghosting, etc., the display will exhibit some tiny bits of motion or flicker when you get up very close to the screen and look closely. I wouldn't have realized this if I didn't have the DVI signal to compare it to. But the DVI signal gives an incredibly rock solid, stable image. You can go up to within inches of the screen and there is absolutely no change in the image at all. It's perfect.


This isn't true with VGA. With VGA, it's like there's a very subtle flicker, invisible unless you're right up at the screen. Also, around the outlines of a piece of static text on VGA, you very occasionally see a stray dark pixel for a tiny fraction of a second in the white surrounding the black of the text. Nothing like this happens with DVI.


I don't know whether some of what I see with VGA has to do with the analog color or position information varying up or down by a bit or so every ten-thousandth (millionth? billionth?) sample, though that sounds like what might cause the very slight instability I see when comparing VGA to DVI.
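For what it's worth, that guess can be turned into a back-of-the-envelope Python sketch; every probability and threshold below is made up purely for illustration.

Code:
import random

def sampled_pixel(true_level, drift_lsb=1, flip_chance=0.0001):
    """Level an ADC might report for one refresh of one pixel."""
    if random.random() < flip_chance:                 # rare larger excursion
        return max(0, min(255, true_level - 128))
    return max(0, min(255, true_level + random.randint(-drift_lsb, drift_lsb)))

white_next_to_text = 255
flickers = sum(1 for _ in range(72 * 600)             # ~10 minutes at 72 Hz
               if sampled_pixel(white_next_to_text) < 200)
print("visible stray-pixel events in ~10 minutes:", flickers)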


In the almost two years I've had the projector, I've never really bothered to do A/B'ing of DVI vs. VGA at ordinary viewing distances. I expect there's next to no difference, but I do choose to use the DVI connection just because if there is a difference, then I'm pretty confident that it's the DVI that's better. I actually have two MP1600's right now, so I guess I could do a true A/B test.


[This message has been edited by hsitz (edited 09-21-2001).]
 

Registered · 1,652 Posts
To clarify the acronym DVI, if it hasn't already been mentioned: DVI does not stand for Digital Video Interface; it means Digital Visual Interface. To those who have spent a lot of energy trying to say there is little difference between the analog and digital interfaces, my experience and the experience of others, including Evan Powell of ProjectorCentral, would say you are wrong. In my case, when I owned my LP350, I saw a very significant improvement in image quality when switching from analog output to DVI on my dual-head GeForce card. Other LP350 owners have reported the same results, as have Compaq projector owners who have DVI. The difference was so significant that running an analog input when there was an option for DVI would have to be called foolish.


Lenny Eckian
 

Registered · 75 Posts
Can someone say how much of a real difference the DVI connector makes compared to the standard 15-pin VGA connector?


The reason I ask is that I'm looking at the Sharp too, and I also want to use an HTPC. I know the DVI connector is going to be theoretically better, but I assume we're not talking about a night-and-day difference here?


Thanks
 

Registered · 457 Posts
Quote:
Originally posted by Ran:


So, if I understand correctly, there is still some time before the MPAA finalizes the copy-protection method, which will affect only HD and HD DVD when available,
Actually, I think HDCP has been accepted by the consortium. I don't know if there are any announcements by manufacturers yet, though.


Many people feel that it has more to do with the MPAA trying to break into a "pay-per-view" model than simply trying to stop illegal copying. I believe that marketing plan will ultimately fail - the precedent has already been set. When I buy a DVD, I expect to be able to play it forward, backward, scene-by-scene, as many times as I want.


I just hope they don't design away the analog outputs before they realize what they've done.





------------------

DVI/HDCP makes your HDTV not ready
 

Registered · 1,587 Posts

Quote:
The difference was so significant that running an analog input when there was an option for DVI would have to be called foolish.
Totally agree. Now that I have seen it, DVI is just so much better that I could only ever recommend that a new projector purchase include this digital connection.

It's like using analog audio connections when you could connect via coaxial digital.



DavidW
 

Registered · 75 Posts
Unfortunately, even though a DVI connector may be better, the latest projectors I'm interested in (the Sharp 9000, the Seleco HT300) don't have one. The new Yamaha has one, but that one apparently suffers from more rainbows.


So even though I want to use an HTPC, the Sharp still remains top of my list for all the other good things I've heard about it. If I keep waiting for the perfect projector, I'm never going to buy one :)
 

Registered · 1,587 Posts
Quote:
Chris Stephens -- ex:Ultimate Entertainment Digital Guy

One thing is for sure: DVI is about to become a de facto standard for all things video in the consumer market. You should at least be wiring for DVI to the projector, or you lose.
As Chris well knows (and as he has stated on the AVS special guest forum above), DVI is a must-have, and this connection has become the new digital standard for connecting digital video such as DVD and HD.


To me at least, it would be very foolish to be looking at new equipment at this late stage that could not handle DVI.

At least IMHO.


Judging by my phone conversations with industry people (manufacturers), all have said we will see a flood of DVI gear at the January CES, because the copy rules are now set in stone.


DavidW


[This message has been edited by David Wallis (edited 09-23-2001).]
 

Registered · 1,244 Posts
Although this is not FP-specific, I will shortly be getting a 19" Iiyama 453 CRT PC monitor and a Radeon VE DVI/analog video card for my new PC system. I've read that this new Iiyama model is the first PC CRT to have both a DVI AND an analog VGA interface (DVI connectors, of course, are usually only found on medium/high-end flat-panel PC LCDs). I'll compare and contrast the image quality here when I get them.


P.S. If anyone knows where I can get the Iiyama 453 for less than $440 shipped (zip=32828) then please let me know.


[email protected]


[This message has been edited by nowknown (edited 09-23-2001).]
 