Is there a way to always output the movie's original resolution and refresh rate?
post #1 of 12 - 08-13-2012, 10:47 AM - Thread Starter
danbez (Member, Seattle)

I know that with TMT5 I can use the auto refresh-rate feature to output Blu-ray movies at 24 Hz to my TV/projector. I wonder if there is a solution that changes both the resolution and the refresh rate based on the source.

For example, I would like my desktop/WMC set to 1080i by default, since that is the resolution most US TV broadcasts use. When playing a DVD (I use MediaBrowser with TMT and/or the WMC player), something would switch the output to 480p/59 Hz and then return it to 1080i once finished. The same for Blu-rays, using 1080p/24 Hz.

Right now I just leave WMC set to 1080p, but unless I send the TV the original live-TV resolution (1080i), I see dropped frames here and there, which is quite annoying. But of course I don't want to watch Blu-rays at 1080i :-)
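
(Purely to illustrate the idea: a minimal sketch of such a wrapper on Windows, using the Win32 EnumDisplaySettings/ChangeDisplaySettings APIs via Python's ctypes. The player path and target mode are placeholders, not anything TMT/WMC actually exposes, and the DEVMODE layout is the display variant of the structure only.)

import ctypes
import subprocess
from ctypes import wintypes

user32 = ctypes.windll.user32

ENUM_CURRENT_SETTINGS = -1        # ask the driver for the mode currently in use
DM_PELSWIDTH = 0x00080000         # flags marking which DEVMODE fields we are setting
DM_PELSHEIGHT = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODE(ctypes.Structure):
    # Win32 DEVMODEW structure (display variant of its internal union).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),   # the interlace bit (0x2) lives here
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def current_mode():
    dm = DEVMODE()
    dm.dmSize = ctypes.sizeof(DEVMODE)
    user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
    return dm

def set_mode(width, height, hz):
    dm = current_mode()
    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency = width, height, hz
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
    return user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0) == DISP_CHANGE_SUCCESSFUL

saved = current_mode()                   # remember the desktop mode (e.g. 1080i)
if set_mode(1920, 1080, 24):             # Blu-ray playback: 1080p/24
    subprocess.call([r"C:\path\to\player.exe"])   # hypothetical player path; blocks until exit
    user32.ChangeDisplaySettingsW(ctypes.byref(saved), 0)   # restore the previous mode

(Switching to a true interlaced mode such as 1080i would additionally need the interlace bit set in dmDisplayFlags, and whether the driver exposes modes that way varies.)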

My video card is a fanless Asus GT430 and my TV is an LG 6700.

Thanks!
post #2 of 12 - 08-13-2012, 11:19 AM
fitbrit (AVS Special Member)

I know JRiver Media Center does this, but I have no idea about other software.
post #3 of 12 - 08-13-2012, 12:47 PM
Mrkazador (AVS Special Member)

Your TV is an LCD with a native resolution of 1920x1080. That means whatever resolution you input, the TV will internally upconvert it to 1920x1080. It is a waste of time to feed it all these different resolutions when, in the end, the TV will upconvert them anyway. You need to find a video decoder that works well for TV broadcasts, so that it deinterlaces properly and you get progressive video.

 

madVR and MPC-HC can change the resolution/refresh rate depending on the video.
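
(For what it's worth, if memory serves, the relevant switch in madVR is in its settings tree; the exact labels and mode-list syntax below are from memory and may differ between versions:)

madVR settings -> devices -> <your display> -> display modes
    [x] switch to matching display mode... when playback starts
    list of modes: 480p59, 720p59, 1080p23, 1080p59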

post #4 of 12 - 08-13-2012, 01:30 PM - Thread Starter
danbez (Member, Seattle)

Quote:
Originally Posted by Mrkazador

Your TV is an LCD with a native resolution of 1920x1080. That means whatever resolution you input, the TV will internally upconvert it to 1920x1080. It is a waste of time to feed it all these different resolutions when, in the end, the TV will upconvert them anyway. You need to find a video decoder that works well for TV broadcasts, so that it deinterlaces properly and you get progressive video.

madVR and MPC-HC can change the resolution/refresh rate depending on the video.

Unfortunately, it's impossible to use anything but the default MS decoders for live TV in WMC.
post #5 of 12 - 08-13-2012, 05:10 PM
walford (AVS Addicted Member, Orange County, CA)

The de-interlacing logic in today's TVs has come a long way in the last couple of years, as have the upscaling algorithms.
As stated above, there is no way to display anything but the native resolution on a fixed-pixel TV (LCD/LED etc.).
Many TVs today also automatically detect film-sourced video and will then use inverse-telecine logic to convert it back to 24 fps so it can be converted to 1080p/60 without suffering any frame-rate-conversion judder.
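
(A simplified sketch of what that inverse-telecine logic undoes: 3:2 pulldown spreads four film frames across ten 60 Hz fields. Plain Python, frame labels purely illustrative.)

film_frames = ["A", "B", "C", "D"]   # four film frames = 1/6 second at 24 fps
fields = []
for i, frame in enumerate(film_frames):
    # 3:2 pulldown: frames alternately contribute 3 fields, then 2
    fields += [frame] * (3 if i % 2 == 0 else 2)
print(fields)
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -> ten fields = 1/6 second at 60 Hz
# Inverse telecine detects this repeating cadence and rebuilds the four progressive frames.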
post #6 of 12 - 08-18-2012, 04:25 AM
sneals2000 (AVS Special Member, UK)

Quote:
Originally Posted by walford

The de-interlacing logic in today's TVs has come a long way in the last couple of years, as have the upscaling algorithms.
As stated above, there is no way to display anything but the native resolution on a fixed-pixel TV (LCD/LED etc.).
Many TVs today also automatically detect film-sourced video and will then use inverse-telecine logic to convert it back to 24 fps so it can be converted to 1080p/60 without suffering any frame-rate-conversion judder.

3:2 and IVTC aren't an issue on this side of the pond - but surely you'd want to output 1080/24p content (mainly Blu-ray, though increasingly IPTV streams as well) as a 1080/24p signal and display it at a multiple of 24 Hz to avoid 3:2 judder.

I'd hope that 120/240 Hz etc. displays don't convert 24p content recovered from 1080/60i by 3:2 pulldown removal to 1080/60p - surely you'd want them to convert to an n x 24 Hz refresh rate to avoid 3:2 cadence judder, wouldn't you? (You'd go 5:5 at 120 Hz or 10:10 at 240 Hz.)
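
(The arithmetic behind those cadence figures, as a quick Python sketch:)

def cadence(refresh_hz, fps=24):
    # How many refreshes each film frame is held for at a given refresh rate.
    base, extra = divmod(refresh_hz, fps)
    if extra == 0:
        return f"{base}:{base}"      # even cadence: every frame held equally long
    return f"{base + 1}:{base}"      # uneven cadence: alternating hold times = judder

for hz in (60, 120, 240):
    print(f"{hz} Hz -> {cadence(hz)}")
# 60 Hz -> 3:2 (judder), 120 Hz -> 5:5, 240 Hz -> 10:10 (even, no pulldown judder)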
post #7 of 12 - 08-18-2012, 04:27 AM
sneals2000 (AVS Special Member, UK)

Quote:
Originally Posted by danbez

Unfortunately, it's impossible to use anything but the default MS decoders for live TV in WMC.

But don't the MS H.264 decoders offload both H.264 decoding and de-interlacing to DXVA? So the quality of the result depends on the implementation - i.e. you can benefit from vector-adaptive de-interlacing on higher-end video cards, which is likely to outperform quite a lot of HDTV de-interlacing implementations.
post #8 of 12 - 08-18-2012, 11:29 AM
whiteboy714 (AVS Special Member)

Quote:
Originally Posted by sneals2000

Quote:
Originally Posted by danbez

Unfortunately, it's impossible to use anything but the default MS decoders for live TV in WMC.

But don't the MS H.264 decoders offload both H.264 decoding and de-interlacing to DXVA? So the quality of the result depends on the implementation - i.e. you can benefit from vector-adaptive de-interlacing on higher-end video cards, which is likely to outperform quite a lot of HDTV de-interlacing implementations.
I always figured the TV's deinterlacing would be better - not that I send it 1080i. The i3-550 handles it fine for me.
post #9 of 12 - 08-19-2012, 07:14 AM - Thread Starter
danbez (Member, Seattle)

Sometimes it is just frame drops here and there. Other times I get a "film-like" output for one or two seconds when watching video content (like a news program).

I will try disabling Aero and see if it still happens. If not, my plan is to set the desktop resolution to 1080i (most broadcasts use it) and let madVR change it to 1080p based on content... in this situation, what would happen with DVD content? Would madVR also change the output to 1080p, or leave it at 1080i?
post #10 of 12 - 08-19-2012, 09:38 AM
sneals2000 (AVS Special Member, UK)

Quote:
Originally Posted by danbez

Sometimes it is just frame drops here and there. Other times I get a "film-like" output for one or two seconds when watching video content (like a news program).
The 'few seconds of film' effect usually happens with motion-adaptive or simple bob/weave-switched de-interlacing, when the motion detection hasn't realised that the source is native interlaced (i.e. video) rather than native progressive (i.e. film). It shouldn't happen with decent vector-adaptive de-interlacing.
Quote:
I will try disabling Aero and see if it still happens. If not, my plan is to set the desktop resolution to 1080i (most broadcasts use it) and let madVR change it to 1080p based on content... in this situation, what would happen with DVD content? Would madVR also change the output to 1080p, or leave it at 1080i?

I don't think there are many decent ways of taking a 1080i signal into Windows and keeping it 1080i for output - almost all systems de-interlace to 1080p and re-interlace, ISTR. I suspect 480i content would be de-interlaced, scaled to 1080p and then re-interlaced to 1080i (though that may well be better than doing a 240-to-540-line field scale anyway).
post #11 of 12 - 08-19-2012, 09:44 PM
Dodgexander (Advanced Member, Nr Brighton, England)

Compare the quality manually. Set 480i or 480p in your graphics card properties. Do you notice improved scaling or deinterlacing with 480 sources?

If you do, then it's probably worth using madVR. Otherwise convenience wins, and leaving it at 1080i will be fine - as long as you don't watch any progressive content, since then you would go from p to i to p.

Sent from my Blade S using Tapatalk 2

Natalya Simonova: Do you destroy every vehicle you get into?
James Bond: Standard operating procedure. Boys with toys.
Goldeneye
post #12 of 12 - 08-19-2012, 09:53 PM - Thread Starter
danbez (Member, Seattle)

Quote:
Originally Posted by sneals2000

The 'few seconds of film' effect usually happens with motion-adaptive or simple bob/weave-switched de-interlacing, when the motion detection hasn't realised that the source is native interlaced (i.e. video) rather than native progressive (i.e. film). It shouldn't happen with decent vector-adaptive de-interlacing.

Interesting. The problem is that the NVIDIA drivers don't let me set the de-interlacing mode the way the ATI drivers do. It seems to be fully automatic, and I have no idea what type of de-interlacing the driver has enabled. All I can see is that by setting the output to 1080i, most of the film-like issues disappear. It is also interesting that on an older TV (a 720p JVC LCD) I don't see any of that, using the same HTPC.