
Discussion Starter · #1 ·
First off, I don't know much about playing Blu-ray on a computer, but I do know quite a bit about computers and hardware in general. If I had an MKV file at 1080p resolution and sent it over a VGA cable from a Dell Latitude D630 (integrated graphics) to a 1080p 40" Sony Bravia, would that be an accurate representation of a Blu-ray movie (in terms of video, not audio)? I also own a desktop with a BFG GeForce GTX 260 GPU and an AMD Athlon X2 6000+ CPU. Would that produce a better picture connected to the Sony Bravia via VGA? Also, would DVI produce a better picture? Thanks.
 


Quote:
Originally Posted by MovieSwede /forum/post/15455569


DVI is better since it doesn't need a D-A-D (digital-to-analog-to-digital) conversion.

Be sure that it is DVI-D and not DVI-I (analog, which can be converted to/from VGA with a simple adapter). DVI-D can also optionally do an HDCP handshake for copy protection.
 


Quote:
Originally Posted by bobgpsr /forum/post/15455871


Be sure that it is DVI-D and not DVI-I (analog, which can be converted to/from VGA with a simple adapter). DVI-D can also optionally do an HDCP handshake for copy protection.

DVI-I can carry digital video as well; a device with a DVI-I port behaves differently depending on what it is connected to. DVI-A, however, is always analog.
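For what it's worth, a quick sanity check on bandwidth (numbers from the DVI and CEA-861 specs, not from this thread): single-link DVI-D tops out at a 165 MHz pixel clock, while 1080p at 60 Hz with standard CEA-861 blanking (2200 x 1125 total pixels per frame) needs about 148.5 MHz, so a single DVI-D link easily handles a 1080p MKV:

```python
def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (including blanking) x refresh rate."""
    return h_total * v_total * refresh_hz

# 1080p60 with CEA-861 timing: 2200 x 1125 total, 60 Hz refresh
clock = pixel_clock_hz(2200, 1125, 60)
print(f"1080p60 pixel clock: {clock / 1e6:.1f} MHz")   # 148.5 MHz
print("Fits single-link DVI:", clock <= 165e6)          # True
```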
 