Originally Posted by Wizziwig
Interesting. Our current graphics engine only supports OpenGL on the PC (for easier porting to Mac, iOS, Android, etc.). I'll be adding a D3D backend renderer in a few weeks and will try to reproduce your problem. Have you tried increasing the maximum latency with SetMaximumFrameLatency()? Maybe the default queue size is too low for really intensive CPU/GPU apps. You could also try disabling Nvidia's power management - I've seen cases where it would randomly toggle into a low power state where the card was too slow to keep up with our game. I also remember something like what you describe happening when Nvidia first released their 260.xx series of drivers. I've been doing mostly console games the past few years so don't remember if they ever fixed that on current D3D drivers.
As a general observation, Windows is a really poor platform for a media player! For any "real-time" application, it's difficult to get 100% reliable results from the task scheduler. We've had much better luck in Linux - it's almost as reliable as our console projects. Might be something to consider for a future madvr version.
I'm already using SetMaximumFrameLatency(). One of the madVR "tweak" options (specifically designed to help with the NVidia glitching problem) defines whether I'm calling SetMaximumFrameLatency with the exact number of backbuffers, or with backbuffers + 2. I'm quite sure that one of the many NVidia users affected by this problem has already tried disabling the power management, so I don't think it will help here, either. Both good suggestions, though.
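To illustrate the tweak described above, here is a minimal sketch of the choice between reporting the exact back buffer count versus back buffers + 2 to the driver. Function and parameter names are illustrative (not madVR's actual code); on Windows the result would be passed to `IDirect3DDevice9Ex::SetMaximumFrameLatency()`.

```cpp
// Hypothetical helper modeling the madVR "tweak" option described above:
// report either the exact back buffer count, or back buffers + 2,
// as the maximum frame latency. Names here are illustrative only.
unsigned chooseMaxFrameLatency(unsigned backBuffers, bool addTwoTweak) {
    // A larger latency queue lets the driver buffer more pre-rendered
    // frames, which can hide scheduling hiccups at the cost of added
    // presentation latency.
    return addTwoTweak ? backBuffers + 2 : backBuffers;
}

// On Windows this would feed the D3D9Ex device, e.g.:
//   device->SetMaximumFrameLatency(chooseMaxFrameLatency(4, tweakEnabled));
```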
I think the problem has to do with synchronization: it seems to occur if the GPU is busy doing something specific (maybe copying to the backbuffer or whatever) while a VSync occurs. In that case it seems that NVidia's driver sometimes isn't able to flip the page, probably because it fails to get access to some critical section or something like that. That's why another madVR tweak option checks the current scanline position and avoids starting a new rendering pass when the raster is close to the next VSync interrupt. And that does help a lot in avoiding those presentation glitches.
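A rough sketch of that scanline-based guard, under stated assumptions: on Windows the current raster position would come from `IDirect3DDevice9::GetRasterStatus()` (the `ScanLine` member of `D3DRASTER_STATUS`); here the decision logic is factored into a standalone function, and `safetyMargin` is an illustrative tuning knob, not a madVR parameter.

```cpp
#include <cassert>  // for callers that want to sanity-check the logic

// Hypothetical sketch of the guard described above: given the current
// scanline, the total scanlines per frame, and a safety margin (in
// scanlines), decide whether it is safe to kick off a new rendering
// pass before the next VSync interrupt fires.
bool safeToStartRenderPass(unsigned scanLine, unsigned totalLines,
                           unsigned safetyMargin) {
    // Too close to the next VSync: the pass might still be executing
    // on the GPU when the page flip is due, risking a presentation
    // glitch, so hold off until the next frame interval.
    return scanLine + safetyMargin < totalLines;
}
```

In a real renderer this check would run just before each render pass, with the margin tuned to roughly cover the pass's expected GPU time.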
Yeah, I can well imagine that Linux might be a more stable platform for time critical stuff. But really, D3D exclusive mode *should* in theory solve this problem because you can present multiple frames in advance and the pre-presented frames are supposed to be flipped by the VSync interrupt in driver land. That should take out all timing problems. In theory. It seems to work well enough with ATI and Intel, just not with NVidia at the moment.
Originally Posted by whiteboy714
I think he was referring to the 650, which is not released yet. The 550ti is from Nvidia's previous generation and has been out for a while. I would hope it would be plenty for madvr, but who knows down the road. In terms of gaming power, the 550ti sits between AMD's 7750 and 7770.
Yeah, sorry for the confusion, I meant the 650ti.