AVS Forum banner
1 - 1 of 1 Posts

Discussion Starter · #1 ·
I have a Samsung TV connected to an older Windows 7 HTPC. The HTPC has an integrated ATI Radeon 3000 graphics chip that outputs DVI through an adapter to an HDMI cable into the TV. Whenever I play video from the HTPC, the TV displays an info window showing a resolution of 1920x1080 at 60Hz. I use the HTPC for downloaded media (no Blu-ray) and have always assumed the quality I get on screen is determined by the resolution of the files.

However, the other day I tried streaming a file from my phone to the TV using DLNA and was shocked to see a genuinely HD picture. I then copied the same file to my HTPC and played it. The quality was as usual - OK, but not HD. I cannot figure out why the HTPC outputs such low visual quality compared to the same file streamed over Wi-Fi from my phone. What is happening here?

-The Radeon 3000 chipset, while older, is capable of resolutions up to 2560x1600 @ 60Hz, so raw output capability shouldn't be the problem.
-The video clarity looks the same no matter what media player I use on the PC
-Is the TV upscaling the Wi-Fi stream somehow? If so, why doesn't it upscale the DVI/HDMI input from the HTPC the same way?
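One way to frame the upscaling question above: if a file's native resolution is below the panel's 1920x1080, some component (the PC media player, the GPU, or the TV's internal scaler) must enlarge it, and the quality of that scaler varies a lot between devices. A minimal sketch of the arithmetic involved (the function name and example numbers are illustrative, not from the post):

```python
# Illustrative only: how much linear upscaling a source needs to fit a display.
# A DLNA stream decoded by the TV uses the TV's scaler; the same file played on
# the HTPC is scaled by the player/GPU before being sent out as 1080p.
def upscale_factor(src_w, src_h, disp_w=1920, disp_h=1080):
    """Return the linear scale factor needed to fit the source on the display,
    preserving aspect ratio (the smaller of the two axis ratios)."""
    return min(disp_w / src_w, disp_h / src_h)

# A 480p (720x480) file needs 2.25x linear upscaling to fill a 1080p screen:
print(upscale_factor(720, 480))   # -> 2.25
# A native 1080p file needs none:
print(upscale_factor(1920, 1080)) # -> 1.0
```

The point of the sketch: whichever device performs that 2.25x enlargement determines the perceived sharpness, which could explain why the same file looks different over DLNA versus HDMI.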

Thanks for any direction. I'm not sure which part of the system to look at first to understand this.