Originally Posted by bobof
It's just so backwards, though, that it makes me not even want to bother with the device if that is their proposed solution to frame-rate switching: the end user is expected to know the correct frame rate of some piece of content and adjust to suit.
Unless I'm mistaken, there isn't actually any easy way to work out what the correct frame rate is on the Shield for a given piece of playing video - or does it tell you what the currently playing video's frame rate is when you pull up the menu (which I guess it could do to ease the pain)?
Originally Posted by bobof
Can you please describe exactly how it works, as I've already gone through the pain of returning a 2017 Shield last year after being told that it did auto rate switching out of the box with Netflix.
Is it a matter of playing content and then pressing a single button on the remote handset? Which button? Or do you play content, bring up a menu, navigate to a function and then select it? Is no frame-rate selection necessary at all - is it fully automatic, and does it get 24 vs 23.976, 60 vs 59.94 and 30 vs 29.97 right?
Out of interest, what does it do for 30 and 25 - does it use those rates or double them like Appletv does?
This is why I'd suggest using the Refresh Rate app instead of Nvidia's beta feature, which I found unreliable. Refresh Rate requires more setup and configuration, but once you figure out each app's idiosyncrasies, you have a solution that covers every scenario, if a bit inconsistently.
Here's how I have my apps set up:
Netflix works automatically. I start playing a video and after a few seconds, Refresh Rate detects the frame rate and switches the refresh rate automatically if it's different from the current refresh rate. The complication is that Netflix plays those stupid video previews as you browse and they trigger Refresh Rate also, which can be annoying (the audio drops and my receiver pops a bit).
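The detection step boils down to matching the measured content frame rate against the refresh rates the display advertises, which is how an app can get 23.976 vs 24 (and 59.94 vs 60) right. A minimal sketch of that nearest-match logic; the class name and rate list are my own illustration, not Refresh Rate's actual code:

```java
import java.util.List;

public class RateMatcher {
    // Pick the supported refresh rate closest to the detected content frame rate.
    // This is how 23.976 fps content lands on a 23.976 Hz mode rather than 24 Hz.
    static double closestRate(double contentFps, List<Double> supportedRates) {
        double best = supportedRates.get(0);
        for (double rate : supportedRates) {
            if (Math.abs(rate - contentFps) < Math.abs(best - contentFps)) {
                best = rate;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Rates a typical Shield/display combination might advertise (illustrative)
        List<Double> modes = List.of(23.976, 24.0, 25.0, 29.97, 30.0, 50.0, 59.94, 60.0);
        System.out.println(closestRate(23.976, modes)); // 23.976
        System.out.println(closestRate(59.94, modes));  // 59.94
    }
}
```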
The automatic rate switching doesn't seem to work so well with Criterion; I normally need to start the video, back out, and restart it a couple of times for it to work. Instead, I set the app to always start at 1080p/23.976, since all their film-based content is in that format. That way, there's no rate switching while I use the app and my display gets to do the upscaling.
Amazon Prime is the buggiest one. As soon as Refresh Rate triggers a rate change, the app quits playing the video because of an HDCP error. This is another one where it makes sense to set a default resolution/rate and then change it manually for exceptions. Refresh Rate provides an overlay menu that lets you select a specific refresh rate in cases like this. What's nice is you can set Refresh Rate to display the detected refresh rate for any content when the video starts playing, so if you do need to switch it manually, you don't have to guess what rate to use.
This is a distinctly Android-like experience, with more options and fiddling than Apple. But the bottom line is that you can't get proper native rates on the ATV for a variety of content, and Apple has made no movement on that for years, so if you're a videophile who insists on smooth frame rates, the Shield is the best option there is right now.
I'm still holding on to my ATV though in case Apple comes through with a better solution.
Originally Posted by Keenan
Nvidia, via a posting in their forum, has stated there will be no auto-refresh-rate support coming; the beta feature is as good as it's going to get.
I'm starting to get the feeling that enabling this feature in the OS as a global setting is a complex undertaking and direct app support is necessary. Refresh Rate is sniffing the network stream (which involves enabling Developer Tools and some other advanced settings) and changing the video signal mid-stream, and as Prime shows, messing with a stream in progress can have weird consequences. Kodi and Plex both do automatic rate switching on the Shield, so that proves that the app makers themselves can support it if they want.
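Kodi and Plex can do this because Android exposes the mechanism to apps directly: enumerate the modes from Display.getSupportedModes() and set the chosen one on WindowManager.LayoutParams.preferredDisplayModeId before playback starts. A hedged sketch of just the selection step, using a plain record in place of the framework class (the doubling fallback and the tolerance are my assumptions, not any particular app's logic); the doubling is the same trick the Apple TV uses for 25 and 30 fps content:

```java
import java.util.List;
import java.util.Optional;

public class ModePicker {
    // Stand-in for android.view.Display.Mode; on a real device you would
    // enumerate display.getSupportedModes() and assign the chosen mode's id
    // to WindowManager.LayoutParams.preferredDisplayModeId.
    record Mode(int modeId, double refreshRate) {}

    // Prefer an exact refresh-rate match; otherwise accept an integer double
    // (25 fps -> 50 Hz, 30 fps -> 60 Hz). If neither exists, return empty
    // so the caller can leave the signal alone.
    static Optional<Mode> pick(double contentFps, List<Mode> supported) {
        for (Mode m : supported) {
            if (Math.abs(m.refreshRate() - contentFps) < 0.01) return Optional.of(m);
        }
        for (Mode m : supported) {
            if (Math.abs(m.refreshRate() - 2 * contentFps) < 0.01) return Optional.of(m);
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        List<Mode> modes = List.of(new Mode(1, 23.976), new Mode(2, 24.0),
                                   new Mode(3, 50.0), new Mode(4, 60.0));
        // 25 fps content has no exact mode here, so the doubled 50 Hz mode wins
        System.out.println(pick(25.0, modes).map(Mode::modeId).orElse(-1)); // 3
    }
}
```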