But I wanted to ask something based on this setup:
Sapphire Radeon 2600 Pro AGP w/512MB, using a DVI-to-VGA adapter, then a VGA cable to the VGA port on my Vizio P50HDTV. I then play a Blu-ray movie via PowerDVD Ultra and a LiteOn Blu-ray ROM drive.
Given this setup, when I play a Blu-ray movie, am I getting HD, or is it downscaling to DVD quality due to HDCP or some other setting I'm unaware of?
Is there software I can use to verify I'm getting HD quality?
I know the TV I have only does 720p/1080i. Is there any way to tell which of these resolutions I'm getting?
Sorry about all the questions, but I'm really curious at this point.
So is content protection on Blu-ray not functional when you use VGA or DVI as your output method?
Okay, so I'm outputting video at 1366x768 (I think). Does that mean I'm getting only 720p resolution, or is my conversion way off?
If your resolution is set to 1366x768, then you're getting 768p resolution, which is slightly more than 720p. The computer will downscale the 1080p source to 1366x768.
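If it helps to see the numbers, here's a quick sketch of the scaling math (the resolutions are just the ones mentioned above; the scale-factor calculation is standard arithmetic, not anything specific to PowerDVD):

```python
# Blu-ray source resolution (1080p)
src_w, src_h = 1920, 1080

# Native panel resolution of many "720p" HDTVs, including this setup
disp_w, disp_h = 1366, 768

# Vertical scale factor applied when the player downscales to the desktop resolution
v_scale = disp_h / src_h
print(f"Vertical scale: {v_scale:.3f}")        # ~0.711, i.e. ~71% of the source lines kept

# Compare against a true 720p downscale
print(f"768p keeps {disp_h - 720} more lines than 720p")
```

So at 1366x768 you keep a bit more vertical detail than a strict 1280x720 output would, but either way it's well above DVD's 480 lines.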
Content protection does work, but right now no Blu-ray disc has the ICT flag enabled, so all will output full resolution over analogue connections. If you're using DVI, then your TV and video card will both have to be HDCP enabled, or you'll have to run AnyDVD HD.