AVS › AVS Forum › Video Components › Home Theater Computers › Let's set this straight - No one can do 24p consistently well

Let's set this straight - No one can do 24p consistently well - Page 34

post #991 of 1281
Quote:
Originally Posted by stanger89 View Post

There's no reason to output anything higher than 8-bit per channel, since we have access to no video sources with greater than 8-bit per channel information. Now, if you've got a high-end video processor that supports internal operations at greater than 8 bits per channel, there's a good case to be made for maintaining the higher precision, but I don't think PCs or streamers fall into that category (well, at least PCs can't output greater than 8 bits per channel).

With madVR, your PC is a video processor. But unlike dedicated ones, it's stuck at 8-bit dithered output. Even simple streamers and BD players need at least some processing to do chroma upsampling and deinterlacing. It helps that better models don't have to downsample back to 8-bit after this processing.

Seems kind of ridiculous that every Nvidia card since the 8000 series claims to support deep color according to their spec page, and yet there is no way to use it. I doubt their hardware even supports it. False advertising.

Has anyone ever gotten 10-bit or higher output from an ATI card? Most AV receivers have an HDMI status/info display where you can check the signal format.
post #992 of 1281
Quote:
Originally Posted by Wizziwig View Post

....
Has anyone ever gotten 10-bit or higher output from an ATI card? Most AV receivers have an HDMI status/info display where you can check the signal format.

You mean this? ATI 6770, MPC-HC, madVR, LAV Filters and an Epson 6500UB.

post #993 of 1281
Nice. I'll have to pick up an ATI card to experiment. All I own right now are several Nvidia cards and Intel IGPs - all of which failed to output a 10-bit signal like my stand-alone BD player does.
post #994 of 1281
I'm not sure of the point of trying to get 10-bit when Blu-ray is only 8-bit.
post #995 of 1281
My TV claims it gets a 10-bit signal as well, but I know for a fact that Windows only outputs 8 bits; it's just padded by the GPU to 10-bit for transmission - there is no additional information present.
If you want real 10-bit output from a PC, you need a "pro" card, like an NVIDIA Quadro or AMD FirePro.

Like mentioned above though, there really is no 10-bit content.
- DVD/Blu-ray are 8-bit
- TV Broadcast is 8-bit
- Most Internet video is 8-bit

Only some anime releases use 10-bit encoding, but they use it only because it can make the files 10-20% smaller; the source of those encodes is still just an 8-bit video.
post #996 of 1281
Quote:
Originally Posted by Wizziwig View Post

Nice. I'll have to pick up an ATI card to experiment. All I own right now are several Nvidia cards and Intel IGPs - all of which failed to output a 10-bit signal like my stand-alone BD player does.

NVIDIA dGPUs usually do 12-bit. Intel uses 8-bit unless xvYCC is enabled.
post #997 of 1281
Quote:
Originally Posted by RapalloAV View Post

I'm not sure of the point of trying to get 10-bit when Blu-ray is only 8-bit.

I probably should mention, I didn't do anything specific to set that up, nor do I think I get any additional benefit. I just noticed that it said 10-bit when I brought up my Epson's signal info menu.
post #998 of 1281
I think we are getting a little off-topic here.
post #999 of 1281
24p finally fixed for me!

I've tried for months to correct my 24p issues in XBMC and I can report I have "mine" fixed - FINALLY! I realized the A/V sync method [Audio Clock, Video Clock (resample audio) or Video Clock (Drop/Dupe Audio)] is broken; after turning it off, all my sync issues have gone away! The lip sync is perfect and never wanders like it used to. I have my Integra 80.3 set to 180 ms and nothing in XBMC, and it's absolutely perfectly in sync, the same as my two Oppo BDP-95s. I'm really happy, but it was a very frustrating time.

I have tried for weeks to mention this on the XBMC forum but they still have not approved my membership, even after resubmitting it many times.
Much, much easier to become a member on AVS Forum than over there.
post #1000 of 1281
I'm interested in what anyone thinks of this:

My Celeron G530 renders anything I throw at it perfectly, with 0 dropped frames, with the Intel control panel set to 60Hz and the TV set to 24p with EVR Custom. If I set Intel to 23 or 24Hz, everything falls apart - dropped frame after dropped frame. Anyone know why (even though I'm perfectly satisfied with 60Hz)?
post #1001 of 1281
Quote:
Originally Posted by RapalloAV View Post

24p finally fixed for me!
I've tried for months to correct my 24p issues in XBMC and I can report I have "mine" fixed - FINALLY! I realized the A/V sync method [Audio Clock, Video Clock (resample audio) or Video Clock (Drop/Dupe Audio)] is broken; after turning it off, all my sync issues have gone away! The lip sync is perfect and never wanders like it used to. I have my Integra 80.3 set to 180 ms and nothing in XBMC, and it's absolutely perfectly in sync, the same as my two Oppo BDP-95s. I'm really happy, but it was a very frustrating time.
I have tried for weeks to mention this on the XBMC forum but they still have not approved my membership, even after resubmitting it many times.
Much, much easier to become a member on AVS Forum than over there.

Glad it's working for you eventually.

AVSync doesn't work with bitstreaming (I was sure I'd mentioned that earlier) - it never has and wasn't built for that purpose AFAIK - in the same way that neither ReClock nor JRiver's derivation of it does.

Regarding your XBMC membership problems - I offered assistance on that earlier in the thread, but got no response (sound familiar?)
post #1002 of 1281
Quote:
Originally Posted by steelman1991 View Post

Glad it's working for you eventually.
AVSync doesn't work with bitstreaming (I was sure I'd mentioned that earlier) - it never has and wasn't built for that purpose AFAIK - in the same way that neither ReClock nor JRiver's derivation of it does.
Regarding your XBMC membership problems - I offered assistance on that earlier in the thread, but got no response (sound familiar?)

I wonder how many others know that it doesn't work with bitstreaming then, since there seems to be a huge number of people complaining about 24p.

It seems strange to me that if it doesn't work with bitstreaming, the manual doesn't tell users; that's not too helpful.

A/V sync method [Audio Clock, Video Clock (resample audio) or Video Clock (Drop/Dupe Audio)]
Audio has to stay in sync. This can be done by resampling, by skipping/duplicating packets, or by adjusting the clock if it gets too far out of sync. Resampling has the advantage that the speed of the video can be changed considerably, so 24 fps can be sped up to 25 fps to play at PAL speed. The disadvantage of resampling is that it doesn't work with passthrough, and there is a slight loss of audio quality. Skipping/duplicating audio packets has no loss of audio quality, but the speed of the video can only be changed a little to avoid doing a skip/duplication too often; most of the time it's inaudible, but it can produce a very audible click. Adjusting the clock has the best audio quality, but some extra video jitter can occur, and the speed of the video can't change much, as the audio will sync the clock more often the more the speed of the video is changed.

I read that you were going to contact the mods at XBMC about why they hadn't approved my membership, and I waited, but they still never did. I kept re-applying but nothing was ever approved; looks like they don't update their database too often. Anyway, it looks like I don't need their help now. I resolved my sync issues and thought I would try to help anyone else in the same boat on AVS Forum.
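The trade-offs in the sync-method description quoted above can be put into rough numbers; a minimal sketch (the 100 ppm clock-drift figure is just an illustrative assumption, not from the thread):

```python
def resample_ratio(source_fps, target_fps):
    # Resampling: audio is played faster/slower by the same factor as the
    # video. 24 -> 25 fps is the classic PAL speed-up (~4.2%, slight pitch shift).
    return target_fps / source_fps

def skip_dup_interval_s(drift_ppm, sample_rate_hz=48000):
    # Skip/duplicate: one audio sample is dropped or repeated each time the
    # clocks drift apart by one sample period; if corrections come too often,
    # they can become an audible click.
    samples_off_per_second = sample_rate_hz * drift_ppm / 1e6
    return 1.0 / samples_off_per_second

print(resample_ratio(24, 25))    # ~1.0417 (PAL speed-up factor)
print(skip_dup_interval_s(100))  # ~0.21 s between corrections at 100 ppm drift
```

Neither adjustment is possible when the audio is bitstreamed untouched to the receiver, which is why the passthrough caveat matters here.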
post #1003 of 1281
No idea, but this might just give an indication: "The disadvantage of resampling is that it doesn't work with passthrough" - though I agree it might have been prudent to extend that to the other options.

The no-response comment I made was aimed at you - not the XBMC mods. I couldn't contact the devs, because you never PM'd me with your login details - therefore I had no identity details for them to check.
post #1004 of 1281
Quote:
Originally Posted by steelman1991 View Post

No idea, but this might just give an indication: "The disadvantage of resampling is that it doesn't work with passthrough" - though I agree it might have been prudent to extend that to the other options.
The no-response comment I made was aimed at you - not the XBMC mods. I couldn't contact the devs, because you never PM'd me with your login details - therefore I had no identity details for them to check.

Well, I don't think many would read "passthrough" to mean bitstreaming; I didn't. I know a number of people who had done as I did - set up the A/V sync with bitstreaming expecting it to work, not realizing it was going to throw the sync all over the place. It's such a shame they don't document this, as it would have saved a lot of frustration. Anyway, I have it fixed finally with the A/V sync turned off, and that's the main thing.

Never mind... I forgot to PM you, but it's all too late now as I don't need the XBMC forum anymore. It just seems odd they never approved my membership on the XBMC forums after three weeks of trying; they don't have any reason not to. Maybe they're just too busy.
post #1005 of 1281
So was that the one change that ultimately resolved the issue for these 'other known' users as well? If so, perhaps it should be more prominently advertised.

Just had a look at your cinema, by the way - cracking build, I'm quite jealous.
post #1006 of 1281
Quote:
Originally Posted by steelman1991 View Post

So was that the one change that ultimately resolved the issue for these 'other known' users as well? If so, perhaps it should be more prominently advertised.
Just had a look at your cinema, by the way - cracking build, I'm quite jealous.

Thank you for that, I do spend a lot of time in that room.

I only made that change, nothing else...
If it helps some others, I would really like to know; if so, it does need to be promoted!
post #1007 of 1281
Quote:
Originally Posted by RapalloAV View Post

Well, I don't think many would read "passthrough" to mean bitstreaming; I didn't. I know a number of people who had done as I did - set up the A/V sync with bitstreaming expecting it to work, not realizing it was going to throw the sync all over the place. It's such a shame they don't document this, as it would have saved a lot of frustration. Anyway, I have it fixed finally with the A/V sync turned off, and that's the main thing.
Never mind... I forgot to PM you, but it's all too late now as I don't need the XBMC forum anymore. It just seems odd they never approved my membership on the XBMC forums after three weeks of trying; they don't have any reason not to. Maybe they're just too busy.

As I said many times, turning off the broken thing is not a fix for the problem, but as long as you are happy, that's a good thing.
post #1008 of 1281
Quote:
Originally Posted by renethx View Post

The above setup is just the standard CEA timings. You won't get a refresh rate closer to 23.976Hz. Here are sample timings for 23.976Hz:

[sample custom-timing screenshots]

Play a movie with MPC HomeCinema + madVR (Ctrl+J) and test the timings; the new refresh rate will be seen immediately in the OSD. BTW, at "23.97601Hz", "1 frame drop every 9.93 hours" is not correct: 23.97601 - 23.976 = 0.00001 s^(-1), and 1/0.00001 = 100000 s = 27.7 h; so 1 frame repeats every 27.7 hours.

A couple of days ago my wife and I replaced the failed nVidia GeForce 8500 GT video card in our 4-year-old HP Slimline s3330f HTPC (which, if you are interested, is described by HP on the following link:
http://h10010.www1.hp.com/wwpc/ca/en/ho/WF06b/12132708-12133156-12133158-12133158-12133158-81136177-81574148.html?dnr=1 )

We have used this once-cutting-edge HTPC as a source for a JVC RS1. HP designed and built the thing as a device to play HD-DVD and BD movies and we have used, and will continue to use, the system exclusively to watch HD-DVD and BD movies, period. Up until the video card failure, it had performed its duty marvelously. The 8500 GT's control panel allowed the selection of either 23 or 24 Hz so that movies played very smoothly.

I replaced the now-impossible-to-find, low-profile 8500 GT with a low-profile Asus GeForce 210 Silent card. In short, so far it works great, but there is no 23 or 24 Hz option - only 60, 59, 30 or 25. However, I have noticed the "create custom resolution" pages, which are virtually identical to the ones renethx posted above, so I'd also like to create a custom 23.976 resolution for output to our RS1. But I'm not sure if the settings in the above screen captures are appropriate for the RS1 and, if they are not (if the RS1 states that "the frequency is out of range"), exactly how I should tweak the settings so that they will be in range.

Thanks very much for any assistance. And renethx, thanks for starting this absolutely fantastic thread.
post #1009 of 1281
That card may not have a fast enough shader clock to provide stutter-free playback for some content, but yes, you should be able to set it up as shown above. Keep in mind that you need to tinker with the pixel count to get 23.976 exactly. I used what is shown above and got 23.9758, which works fine for me, with frame drops/repeats only once every several hours.
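For anyone tweaking those custom-resolution pages: the resulting refresh rate is just the pixel clock divided by the total pixel count per frame, which is why nudging the horizontal/vertical totals moves it. A quick sketch using the standard CEA 1080p/24 totals:

```python
def refresh_hz(pixel_clock_hz, h_total, v_total):
    # Refresh rate = pixel clock / (total pixels per scanline * total lines),
    # blanking included.
    return pixel_clock_hz / (h_total * v_total)

# CEA 1080p/24 timing: 74.25 MHz clock, 2750 x 1125 totals -> exactly 24.000 Hz.
# Dividing the clock by 1.001 (the NTSC-film relation) lands on ~23.976 Hz;
# adjusting h_total/v_total instead is the "tinker with the pixel count" route.
print(refresh_hz(74.25e6, 2750, 1125))          # 24.0
print(refresh_hz(74.25e6 / 1.001, 2750, 1125))  # ~23.976
```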
post #1010 of 1281
I have 23.97542, and it usually shows a day or so until a drop or repeat.
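For what it's worth, these drop/repeat intervals follow directly from the gap between the display and source rates; a quick sketch using the 23.97601 Hz figure renethx quoted earlier:

```python
def repeat_interval_hours(source_hz, display_hz):
    # The display gains (or loses) abs(display - source) frames per second
    # relative to the source; one whole frame of accumulated drift forces
    # one frame repeat (or drop).
    drift_frames_per_s = abs(display_hz - source_hz)
    return 1.0 / drift_frames_per_s / 3600.0

print(repeat_interval_hours(23.976, 23.97601))  # ~27.8 hours per repeated frame
```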
post #1011 of 1281
Quote:
Originally Posted by Sammy2 View Post

That card may not have a fast enough shader clock to provide stutter-free playback for some content, but yes, you should be able to set it up as shown above. Keep in mind that you need to tinker with the pixel count to get 23.976 exactly. I used what is shown above and got 23.9758, which works fine for me, with frame drops/repeats only once every several hours.

Thanks for the help.

I used those settings and, sure enough, the custom resolution was accepted by the RS1 and a new custom resolution was created.

I then played "Valkyrie" in an older version of MPC-HC and it reported 23.97, 23.98, 23.96, 23.99 (only two decimals in this version). Jitter of around 14 ms. No frame drops during at least 15 minutes. ReClock reported a source frame rate of 23.976, a video device frequency of 23.975 and a display device rate of "1080p @ 24 (1/2)". (Older version of ReClock, too.)

I have noticed that, after its creation, I am not able to delete the new custom resolution. I first change the display resolution to 1080p @ 60. I then try to delete the custom resolution, but it simply will not delete. Anyone else experience this?

Thanks for the help.
post #1012 of 1281
Quote:
Originally Posted by Herve View Post

Thanks for the help.
I used those settings and, sure enough, the custom resolution was accepted by the RS1 and a new custom resolution was created.
I then played "Valkyrie" in an older version of MPC-HC and it reported 23.97, 23.98, 23.96, 23.99 (only two decimals in this version). Jitter of around 14 ms. No frame drops during at least 15 minutes. ReClock reported a source frame rate of 23.976, a video device frequency of 23.975 and a display device rate of "1080p @ 24 (1/2)". (Older version of ReClock, too.)
I have noticed that, after its creation, I am not able to delete the new custom resolution. I first change the display resolution to 1080p @ 60. I then try to delete the custom resolution, but it simply will not delete. Anyone else experience this?
Thanks for the help.

Nvidia's custom refresh selection sucks; I have the same issues with it. Drives me crazy.
post #1013 of 1281
I'm using Autofrequency and have zero issues on my G530. 23.976 is 23.976, no custom resolution. The Intel control panel also allows a manual change to 23Hz; Autofrequency just automates it. No need for complicated ReClock or custom resolution options.
post #1014 of 1281
BTW, exactly how does one get all of that playback data to display on MPC-HC + MadVR?

Thanks.
post #1015 of 1281
Quote:
Originally Posted by Herve View Post

Thanks for the help.
I used those settings and, sure enough, the custom resolution was accepted by the RS1 and a new custom resolution was created.
I then played "Valkyrie" in an older version of MPC-HC and it reported 23.97, 23.98, 23.96, 23.99 (only two decimals in this version). Jitter of around 14 ms. No frame drops during at least 15 minutes. ReClock reported a source frame rate of 23.976, a video device frequency of 23.975 and a display device rate of "1080p @ 24 (1/2)". (Older version of ReClock, too.)
I have noticed that, after its creation, I am not able to delete the new custom resolution. I first change the display resolution to 1080p @ 60. I then try to delete the custom resolution, but it simply will not delete. Anyone else experience this?
Thanks for the help.

It is a bug; don't try to delete it or edit it. If you want to adjust the timings for 23 Hz, just make a new custom resolution and it will replace the existing one.
post #1016 of 1281
Quote:
Originally Posted by Herve View Post

BTW, exactly how does one get all of that playback data to display on MPC-HC + MadVR?
Thanks.

Ctrl J
post #1017 of 1281
Thank you all!

Edit:
Ctrl+J tells me that I'm getting 23.9765 and a repeated frame every couple of days. Playback looks very, very good, except for an occasional small tear near the top right of the screen when playing back HD-DVD files. No biggie. I can just use DVD Play (PowerDVD) to play those.

Using nVidia CUVID for hardware acceleration, I see typical CPU use of around 20-30% on VC-1. When I play back either the disc or the image using PowerDVD, I see CPU use of between 7 and 13%. PowerDVD must have some tremendous hardware acceleration. But I don't think the PQ using PowerDVD is quite as good as MPC-HC with madVR.

I'm once again very pleased with our "old" Slimline dual-format HTPC.
Edited by Herve - 9/22/12 at 2:20pm
post #1018 of 1281
So I was curious about ATI's 10-bit deep color output support and borrowed a cheap Radeon 7570 card that Dell bundles with all their desktops. To my surprise, the card really does output 10 bits of actual color data. I confirmed it with a 10-bit gradient pattern generated inside a pixel shader. You could easily see distinct steps/bands in the pattern when outputting 8-bit, but it was very smooth with 10-bit output enabled. I tried the same experiment on 2 Nvidia cards (210 and 460) but saw lots of banding - although definitely different from plain 8-bit output. My projector indicated that Nvidia was always sending 36-bit output vs. 30-bit for ATI. It's possible that my projector only has 10-bit internal processing and does not like the Nvidia 12-bit input. I don't have another deep-color-compatible display to confirm whether Nvidia is actually passing more than 8 bits of data.
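A banding check like the one described can be sketched without a shader: quantize a full-width ramp to the output bit depth and count the distinct code values (the 1920 px width here is just an example, not the tester's actual pattern):

```python
def ramp_levels(width_px, bit_depth):
    # Quantize a horizontal 0..1 ramp to the given output bit depth and count
    # the distinct code values - fewer codes means wider, more visible bands.
    max_code = 2 ** bit_depth - 1
    codes = {round(x / (width_px - 1) * max_code) for x in range(width_px)}
    return len(codes)

print(ramp_levels(1920, 8))   # 256 codes -> bands ~7.5 px wide, visible steps
print(ramp_levels(1920, 10))  # 1024 codes -> bands under 2 px, looks smooth
```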

As others have pointed out, BD/DVD is 8-bit, but that's not the point here. One can use the extra precision for processing such as chroma upsampling, deinterlacing, color correction, scaling, etc. You can also get native output higher than 8 bits from video games that support it.

Now for something actually relevant to the topic of this thread:

While I had the ATI card in there, I measured its A/V clock drift. It's actually extremely accurate - the best I've seen on a PC. Just using ATI's presets for 59.940, 23.976, etc., I got a clock desync of about 24 ms over a 3-hour period. That means you would not experience any dropped frames over the course of any 23.976 movie (frames are ~42 ms) and maybe 1 drop when watching 2+ hours of 59.940 content (~16 ms frames). Now I see why ATI does not bother to include custom resolutions in their control panel - they would serve no purpose. I was never able to achieve such high clock accuracy no matter how much I played with Nvidia's custom resolutions. I wonder if the Kepler 600 series does any better.
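That desync figure converts to drop intervals with simple arithmetic; a quick sketch of the numbers in this post:

```python
def hours_per_dropped_frame(desync_ms, measured_over_hours, fps):
    # Time until the accumulated clock drift equals one frame duration,
    # i.e. the point at which a frame must be dropped or repeated.
    frame_ms = 1000.0 / fps
    drift_ms_per_hour = desync_ms / measured_over_hours
    return frame_ms / drift_ms_per_hour

print(hours_per_dropped_frame(24, 3, 23.976))  # ~5.2 h: no drops in any movie
print(hours_per_dropped_frame(24, 3, 59.94))   # ~2.1 h of 59.94 content
```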
post #1019 of 1281
Speaking of 23.976, I've given up trying to get perfect frames on my setup. I leave my G540 on 50Hz permanently. I don't notice any judder, stuttering or frame drops. Then again, I know Sandy Bridge can't display it properly; I've given up on caring. I can't see a difference.
post #1020 of 1281
Quote:
Originally Posted by Wizziwig View Post

While I had the ATI card in there, I measured its A/V clock drift. It's actually extremely accurate - the best I've seen on a PC. Just using ATI's presets for 59.940, 23.976, etc., I got a clock desync of about 24 ms over a 3-hour period. That means you would not experience any dropped frames over the course of any 23.976 movie (frames are ~42 ms) and maybe 1 drop when watching 2+ hours of 59.940 content (~16 ms frames). Now I see why ATI does not bother to include custom resolutions in their control panel - they would serve no purpose. I was never able to achieve such high clock accuracy no matter how much I played with Nvidia's custom resolutions. I wonder if the Kepler 600 series does any better.

The problem is that it varies between their dGPUs - higher-end cards do much, much better, whereas a Zacate is pretty bad - something that might be correctable if they offered the option.