AVS Forum
1 - 7 of 7 Posts

Registered · 13 Posts · Discussion Starter · #1
I have an HTPC connected to my Optoma HD20, and my question is about graphics card refresh rates. All my Blu-rays are ripped to the computer; they are 23.976 fps and 1080p. My card is an ATI HD 3850. I originally had the refresh rate set to 60 Hz, and movie playback was OK, with some detectable judder. The Optoma claims to do 24 fps (I'm assuming that really means native 23.976 Hz), so I used the ATI CCC to set the refresh rate to 24 Hz, but the playback was AWFUL! I then tried 23 Hz with the same result. Next I downloaded PowerStrip and set the vertical refresh rate to 23.976 Hz. For the first few seconds the playback is awful, but after about 10 seconds it smooths out and plays pretty well, with a stutter maybe once every minute or two.

My questions: does the HD20 really natively display 23.976 fps, or is it doing something else? Also, in PowerStrip, when I set the vertical refresh rate to 23.976, the horizontal scan rate is something like 24.000 Hz. Should I use a value that puts BOTH horizontal and vertical closer to 23.976, or just the vertical? Thanks!
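For anyone following along, here is a minimal sketch (my own illustration, not from the thread) of why 23.976 fps content judders on a 60 Hz display but plays evenly at a matching 23.976 Hz refresh:

```python
# Why 23.976 fps content judders at 60 Hz:
# 60 / 23.976 is ~2.5, so each film frame cannot be shown for the
# same whole number of refreshes; players alternate 3 and 2
# refreshes per frame (3:2 pulldown), giving uneven hold times.
film_fps = 24000 / 1001          # exact NTSC film rate, ~23.976 fps
refresh = 60.0
ratio = refresh / film_fps       # ~2.5025 -> not an integer, hence judder
print(ratio)

# 3:2 cadence: frames are held for 3, 2, 3, 2, ... refreshes
durations_ms = [n / refresh * 1000 for n in (3, 2, 3, 2)]
print(durations_ms)              # alternates ~50.0 ms and ~33.3 ms

# At a 23.976 Hz refresh, every frame is held exactly one refresh:
print(1000 / film_fps)           # ~41.7 ms per frame, perfectly even
```

The alternating ~50 ms / ~33 ms hold times are what shows up as judder on camera pans; matching the refresh to the source removes it.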
 

Registered · 133 Posts
I'm running my HTPC at 1920x1080 @ 60 Hz for the desktop. I use XBMC for movie playback, and it switches to 1080p/24 automatically; the movies are near perfect.
 

Registered · 211 Posts
Don't quote me on this, but another HD20 user, Xenon, and I discussed the HD20's 24 Hz mode here a while back; we thought it was taking a 24 Hz signal and using a 2:2 pulldown to display it at 48 Hz. Have you updated your video card drivers and such, just to make sure? Are you going to the HD20 with VGA or HDMI?


Now, in my experience with XBMC, I thought it applied a 2:3 pulldown to 24 fps movies streamed from an HTPC and took them to 60 Hz. I could be wrong, though.
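A rough sketch of the two cadences being discussed (my illustration, assuming simple frame repetition): 2:2 pulldown repeats every frame twice (24 fps -> 48 Hz), while 2:3 alternates two and three repeats (24 fps -> 60 Hz):

```python
def pulldown(frames, cadence):
    """Repeat each source frame per the cadence pattern (cycled)."""
    shown = []
    for i, frame in enumerate(frames):
        shown.extend([frame] * cadence[i % len(cadence)])
    return shown

src = [0, 1, 2, 3]            # four film frames
print(pulldown(src, (2, 2)))  # [0, 0, 1, 1, 2, 2, 3, 3] -> 48 Hz
print(pulldown(src, (2, 3)))  # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3] -> 60 Hz
```

Note that 2:2 holds every frame for the same time (no judder, just doubling), while 2:3 holds alternate frames for different times, which is the judder the OP is seeing at 60 Hz.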
 

Registered · 13 Posts · Discussion Starter · #5

Quote:
Originally Posted by DrNegative /forum/post/20854341


Don't quote me on this, but another HD20 user, Xenon, and I discussed the HD20's 24 Hz mode here a while back; we thought it was taking a 24 Hz signal and using a 2:2 pulldown to display it at 48 Hz. Have you updated your video card drivers and such, just to make sure? Are you going to the HD20 with VGA or HDMI?

I am going to it via HDMI. The projector doubling the frames is not a problem; it is when my computer has to do a 3:2 pulldown to output 60 Hz that causes the judder. My question is: can my HD20 and my computer output the native refresh/frame rate? Has anyone figured this out?
 

Registered · 133 Posts

Quote:
Originally Posted by roberto188 /forum/post/20854295


So what you're saying is that XBMC will automatically switch my graphics card refresh rate to match the source material frame rate?

It does for me. I think there is something in the settings to tell it to do that.
 

Registered · 13 Posts · Discussion Starter · #7
I went into the CCC settings and added a 1080p 24 Hz profile to the refresh rate settings. That in turn added a 23 Hz option to my standard Windows refresh rates. Played it, and again for the first 5 seconds it stutters, but once it settles down it plays flawlessly. No stutter. Thanks for the help, guys!
 