I'm having an issue when I try to launch games on my TV vs. my Monitor. The error I'm getting is:
"Failed to create the D3D9 Device! This can happen if the desktop is locked."
Only some games give me this error. I think I've isolated it mainly to Unreal Engine 3 games.
They work fine on my monitor (BenQ 1080p@60Hz), but they give me the above error when I try to run them on my TV (Sony KDL-46W3000 Bravia 1080p@60Hz).
Here's another wrinkle: I had this setup working fine for years with an AMD Radeon 5870 running HDMI (40 ft) directly from my card to my receiver (Onkyo TX-SR606). When I switched the output to my TV in CCC, I would have to manually add 1080p@60Hz and adjust overscan to scale the screen size, but that was a one-time fix and everything worked fine after that. Why I had to manually add 1080p@60Hz is unknown to me - for some reason it detected 1080i@30Hz as native, even though the TV is supposed to support 1080p@60Hz natively.
Last week, I bought an Nvidia 670. When I switched the output to my TV, I had to manually add 1080p@60Hz (it was detecting 1080i@30Hz native again), and the desktop displayed fine after that - no need to mess with overscan this time. But now when I try to run XCOM, for example (an Unreal Engine 3 game), I get the error above and it crashes.
Games that don't crash seem to change resolution at launch as I notice the screen goes black for a few seconds as it would when I change resolutions (possibly back to the native 1080i@30Hz that was initially being detected - though I'm not sure).
After reading various forums and messing around myself for several hours I've tried the following:
-Launching the problem games in windowed mode - it works, but that's hardly a solution
-Launching the problem games at the natively detected 1080i@30Hz - again, it works, but that's not ideal either...
And before you suggest, yes I'm running the latest drivers and DirectX version in Win7 64-bit. I've even tried the latest beta drivers from nVidia, and I've run driver cleaner between installs (and cleared out all the old AMD drivers).
Has anyone else run into this issue, and is there a way to resolve it? I'm really banging my head against the wall the last few days. Any input is appreciated.
Are you passing the signal through your receiver still, or is it direct from your card to your display?
Also, I'd focus on DirectX rather than your GPU drivers. Perhaps try reinstalling the problem games and/or DirectX. And sometimes MS does weird things with DirectX (like not changing the release number even though they've updated the runtime files).
EDIT: also, a quick google search turned up a couple of possible solutions in this Steam thread (assuming you're using Steam): http://forums.steampowered.com/forums/showthread.php?t=2759841
My Samsung TV has a button called "Info," I believe. It shows what settings the current source is displaying - it should tell you the resolution and refresh rate.
Have you googled this error? Seems like a pretty common error. https://support.steampowered.com/kb_article.php?p_faqid=772
I'm still passing the signal through my receiver, yes.
As for DirectX, I tried running the installer and it said I'm up to date... not sure what else to do there.
And thanks for the link, I had already read it and tried the solutions - none of which worked unfortunately.
My theory on the cause of the error is that the resolution/refresh rate you last launched the game with is unsupported - so for most people the fix is as simple as changing the resolution in a config file, as stated in all those Steam forum posts. But that's not a solution in my case, as I'm already running at 1920x1080. The resolution is correct...
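For anyone finding this thread later: the config-file fix people mention for UE3 games usually means editing the [SystemSettings] section of the game's *Engine.ini under Documents\My Games. A rough sketch of what that edit looks like - the exact path, file name, and keys vary from game to game, so treat the XCOM path below as an assumption, not gospel:

```ini
; Hypothetical example - check your own game's config folder for the real path,
; e.g. something like Documents\My Games\XCOM - Enemy Unknown\XComGame\Config\XComEngine.ini
[SystemSettings]
ResX=1920
ResY=1080
Fullscreen=True
```

Back up the file first; some games regenerate or overwrite these values on launch.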
Since you're passing video through your receiver, have you tried using a direct connection? Checked to make sure you're not using a finicky HDMI port? Tried different boot-up sequences? Etc.?
Is there a point to trying a direct connection? As I stated, this setup was working for years with a different video card, so it's not the port, the HDMI cable, or the boot-up sequence. I was going to swap the AMD Radeon 5870 back in tonight just to test everything again and make sure it was all working as I thought before I made the change, but I don't anticipate learning anything new from that. Man, this is frustrating.
Using a direct connection is just a way of narrowing down the source of the problem, and it's a lot easier than swapping cards. There are too many variables with your current setup - especially if you've already tried everything on the software side.
Also, swapping between an old AMD card and a new Nvidia card introduces too many other variables on top of that IMO. If it works, it doesn't help you any. If it doesn't, it still doesn't help you any. The Nvidia card and drivers have their own compatibility quirks (with the other components in your PC as well as whatever else the signal is passing through--like your receiver and your display). Could be hardware related, could be driver related, could be a handshake issue (originating from the card, the receiver, or the display). If I were you, I'd try to keep it simple until you get a handle on where the problem's coming from. But that's just my opinion. Others may have different suggestions.
If you've already uninstalled and reinstalled all the relevant drivers, games, and so on, I'd move on to narrowing down a hardware culprit. Cutting out the receiver seems like the easiest place to start. From there, I'd try different HDMI connections/cables, then a direct motherboard connection (bypassing the GPU). Hopefully, by then you'd at least know where the problem is coming from (though not necessarily whether it's hardware or software related).
But just based on the error message you're getting, it still sounds software related. Switching between AMD and Nvidia cards can shake loose a whole bunch of potential software issues--especially if your computer had been using one for a long time. That's years of updates and customizations built around one card. Which is why uninstalling and reinstalling drivers and programs is the place to start.