Just thought I'd chime in with a bit of info. I use a DVI-to-HDMI adapter and read that the EDID info is sent through pins 6 & 7 of the DVI plug, so seeing as I had two of these adapters I figured one could be sacrificed for experimentation.
I removed pin 7 in the hope that the EDID from the TV would not be found and Windows would be forced to use the data from the modified INF. Unfortunately, after removing the pin the screen no longer received a signal - I believe this is because the graphics card couldn't detect an EDID and so assumed there was no display attached at all.
Another option that will probably have more success is for people using VGA - you can remove pin 12 on the VGA side of a DVI-to-VGA adapter and this should stop the EDID from passing to the graphics card. Windows should detect it as a non-PnP monitor, at which point you would update the display driver with your modified INF that has the extension block removed. If anyone has a couple of spare DVI-to-VGA adapters lying around that they don't mind wrecking then it would be interesting to hear the results of this.
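On the "extension block removed" INF: as far as I understand it, that trick boils down to a tiny edit of the raw EDID bytes before they go into the INF. Here's a minimal Python sketch of what I mean - this assumes the standard EDID 1.3 layout (128-byte base block, byte 126 holds the extension-block count, byte 127 is a checksum chosen so the whole block sums to 0 mod 256); the function name is just mine:

```python
def strip_extension_blocks(edid: bytes) -> bytes:
    """Return only the 128-byte EDID base block, with the extension
    count zeroed and the checksum recomputed to match."""
    base = bytearray(edid[:128])   # drop any extension blocks (CEA etc.)
    base[126] = 0                  # advertise zero extension blocks
    # byte 127 must make the 128 bytes sum to 0 mod 256
    base[127] = (-sum(base[:127])) % 256
    return bytes(base)
```

The resulting 128 bytes are what you'd paste into the INF's EDID override section in place of the original 256-byte dump. No guarantee the driver honours it, of course - that's the whole question.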
The reason for trying these options is that Butmuncher has reported successful images with HDMI 1.3 displays, and although I've tried a modified INF I'm not convinced that Windows 7 64-bit combined with an nVidia graphics card is actually using the data from the INF - I believe it's still using the EDID passed from the TV. I've read that you can force a different EDID with ATI software but not with nVidia, so the only way to be sure is to stop the EDID getting through at all. I think VGA should work for this because it shouldn't require an EDID to function (otherwise older generic CRTs wouldn't work).
Anyway, I'd be interested to read if anyone tries it - maybe I'll have a root around for some VGA adapters and give it a go myself. BTW, it was a bitch removing the pin from the DVI adapter, but if you've got needle-nose pliers with a good grip you should be able to yank it out cleanly.