Originally Posted by dle
I've read that many 6150 boards have inaccurate audio timing. Using a PCI sound card should fix that problem.
I also run MythTV on an Asus M2NPV-VM and I too experience this stuttering problem (approximately one skip every 11 seconds due to the audio getting ahead). However, I have the integrated audio disabled and am using an Audiophile 2496, which has very accurate timing, so inaccurate onboard audio timing must not be the cause. By setting Mythfrontend to Use Video As Timebase I too can band-aid the problem.
I do not think this problem is specific to bob deinterlacing, and here is why. It is true that with one-field deinterlacing (or any of the other deinterlacers) the skipping disappears, but I believe that is because those deinterlacers output images at only ~30 fps while the video card refreshes at ~60 Hz. Mythfrontend can therefore compensate for the audio getting ahead by speeding up the playback of the images very subtly and smoothly to resync: it has two video card refreshes to work with for every image. When an image is displayed at every refresh (i.e. 60 fps, which is what bob does), there are no spare refreshes between images, so resyncing the video with the audio requires an abrupt change, i.e. a sharp jump ahead by dropping frames; there is no room to fudge smoothly.
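To make the headroom argument concrete, here is a small sketch (my own illustration, not MythTV code) of how many display refreshes each output frame gets at a 59.94 Hz refresh:

```python
# Refreshes available per displayed frame at a 59.94 Hz display refresh.
# With ~30 fps output (one-field deinterlace) each frame gets ~2 refreshes,
# so playback can be nudged smoothly by holding a frame one refresh longer;
# with bob at ~60 fps each frame gets exactly 1, leaving no slack to resync
# without abruptly dropping frames.
refresh_hz = 59.94

for label, output_fps in [("one-field (~30 fps)", 29.97), ("bob (~60 fps)", 59.94)]:
    refreshes_per_frame = refresh_hz / output_fps
    print(f"{label}: {refreshes_per_frame:.2f} refreshes per frame")
```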
That being said, I think the problem lies in the actual playback rate that Mythfrontend employs. Checking -v playback logs, I have discovered with exceptional consistency that video with an actual framerate of 29.97 fps (or 59.94 fps for 720p) plays back at 29.88 fps (or 59.76 fps for 720p). Playing the video back 0.09 fps slower than the audio is mathematically consistent with the video lagging ~3 frames behind the audio after about 30 seconds of playback, which is exactly what is observed:
'video_output' mean = '33448.02', std. dev. = '305.78', fps = '29.90'
'video_output' mean = '33462.99', std. dev. = '283.90', fps = '29.88'
'video_output' mean = '33461.73', std. dev. = '230.68', fps = '29.88'
2008-01-21 02:23:47.192 NVP: 400 interlaced frames seen.
'video_output' mean = '33458.79', std. dev. = '263.56', fps = '29.89'
'video_output' mean = '33461.59', std. dev. = '253.78', fps = '29.89'
'video_output' mean = '33462.91', std. dev. = '292.96', fps = '29.88'
'video_output' mean = '33460.53', std. dev. = '258.86', fps = '29.89'
2008-01-21 02:24:00.578 NVP: 800 interlaced frames seen.
'video_output' mean = '33463.27', std. dev. = '181.08', fps = '29.88'
2008-01-21 02:24:03.793 NVP: Video is 3.00198 frames behind audio (too slow), dropping frame to catch up.
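A quick back-of-the-envelope check (my own arithmetic, not from the logs) shows the 29.88 vs. 29.97 fps deficit lines up with the "3 frames behind" message:

```python
# Video plays at 29.88 fps while audio is clocked at 29.97 fps, so the
# video falls behind by 0.09 frames every second. A 3-frame deficit
# therefore accumulates in roughly 33 seconds, matching the observed
# "3.00198 frames behind" drop about every half minute.
nominal_fps = 29.97   # audio / true video rate
actual_fps = 29.88    # measured playback rate

deficit_per_second = nominal_fps - actual_fps     # frames lost per second
seconds_to_three_frames = 3 / deficit_per_second  # ~33 s
print(f"{deficit_per_second:.2f} frames/s deficit -> "
      f"3 frames behind after {seconds_to_three_frames:.0f} s")
```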
This doesn't depend on the timing method being used, realtime priority, or any of the various options available in nvidia-settings. It doesn't even seem to depend on the refresh rate or resolution of the video card (I tested it at several different refresh rates around 60 Hz, and at both the 1080i and 720p resolutions). Total CPU usage is about 30% (of one core of a dual-core system) in all tests, i.e. the CPU is not the issue.
I am using Debian Etch (amd64) with Myth .20-fixes from svn (compiled with enable-proc-opt and enable-opengl-vsync, and with the bob-deint refresh rate patch applied). I am running the latest Nvidia driver (169.07) and driving my Sony CRT via DVI (with ExactTimingsDVI and UseEvents enabled; I did test disabling each of those one at a time and then both together, but the actual playback fps was still reported as 29.88). My modeline is:
ModeLine "ATSC-1080i" 74.176 1920 2008 2052 2200 1080 1085 1095 1125 +hsync +vsync Interlace
The 720p modeline I tested just to make sure this wasn't an issue with interlacing was:
ModeLine "ATSC-720p" 74.176 1280 1390 1430 1650 720 725 730 750 +hsync +vsync
Both modelines have 59.94 Hz refresh rates.
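The 59.94 Hz figure can be verified from the modeline numbers themselves (refresh = pixel clock / (htotal * vtotal), doubled for an interlaced mode since it delivers two fields per frame); the function below is just my illustration of that arithmetic:

```python
# Sanity-check the modeline refresh rates. For "ATSC-1080i" the pixel
# clock is 74.176 MHz with htotal=2200 and vtotal=1125 (Interlace flag
# set); for "ATSC-720p" it is 74.176 MHz with htotal=1650, vtotal=750.
def field_rate(pixclk_mhz, htotal, vtotal, interlaced=False):
    rate = pixclk_mhz * 1e6 / (htotal * vtotal)
    return rate * 2 if interlaced else rate  # interlaced: two fields/frame

print(f"1080i: {field_rate(74.176, 2200, 1125, interlaced=True):.2f} Hz")
print(f"720p:  {field_rate(74.176, 1650, 750):.2f} Hz")
```

Both work out to 59.94 Hz, so the modelines themselves are not the source of the 29.88 fps rate.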
Mythfrontend with -v playback reports the refresh rate as 16683 (microseconds, i.e. 59.94 Hz) and the frame interval as 33366 (i.e. 29.97 fps; with bob on, the display interval matches the refresh interval), yet the actual playback rate is reported as 29.88 fps:
'video_output' mean = '33459.33', std. dev. = '140.43', fps = '29.89'
'video_output' mean = '33461.81', std. dev. = '194.72', fps = '29.88'
'video_output' mean = '33462.45', std. dev. = '241.12', fps = '29.88'
'video_output' mean = '33463.24', std. dev. = '216.54', fps = '29.88'
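Converting those reported intervals back to rates (my own arithmetic) makes the discrepancy plain:

```python
# Mythfrontend reports intervals in microseconds; rate = 1e6 / interval.
# The nominal refresh and frame intervals give exactly 59.94 Hz and
# 29.97 fps, but the measured mean frame interval of ~33462 us works
# out to only ~29.88 fps.
def interval_to_rate(us):
    return 1e6 / us

print(f"refresh 16683 us  -> {interval_to_rate(16683):.2f} Hz")
print(f"nominal 33366 us  -> {interval_to_rate(33366):.2f} fps")
print(f"measured 33462 us -> {interval_to_rate(33462):.2f} fps")
```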
Does anyone have any insight into why the playback rate is consistently 29.88 fps for 1080i material and 59.76 fps for 720p material? Or, even better, a fix to make it play back correctly at 29.97 fps? It would be fantastic to be able to run mythfrontend without having to choose between dropping frames every 11 seconds and the drawbacks of the Use Video As Timebase feature.