The fix is in the video card: any reasonably recent Nvidia-based or ATI-based discrete video card (an actual card, not something built into the motherboard, i.e. "onboard" video) should have the feature I'm about to describe. Onboard video may or may not provide this HDTV-overscan-specific feature:
My Nvidia video card has some HDTV settings in its control panel (under Advanced settings). One setting is "Resize my desktop to fit the screen" (or similarly worded), which displays a 1920x1080 image with big green arrows in each of the four corners.
The arrow heads go far into overscan land.
There is a vertical slider and a horizontal slider that you drag to bring those green arrows in, out of the overscan area, until the arrow heads point perfectly at the four corners of the visible screen. (OK, you might have to use the Mits' own pixel shift to get it perfect, but that shift is secondary and not related to this video card feature.)
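(If you don't have such a driver tool, here is a quick DIY stand-in, not the Nvidia tool itself: a short Python script, assuming the Pillow imaging library is installed, that renders a 1920x1080 test pattern with green markers in the four corners. Display it full screen; any marker you cannot see is sitting in overscan land.)

```python
# DIY overscan test pattern -- a stand-in for the driver's green arrows.
# Assumes Pillow is installed (pip install Pillow).
from PIL import Image, ImageDraw

W, H = 1920, 1080
M = 40  # marker size in pixels

img = Image.new("RGB", (W, H), "black")
draw = ImageDraw.Draw(img)

# One green square flush against each of the four corners.
for x, y in [(0, 0), (W - M, 0), (0, H - M), (W - M, H - M)]:
    draw.rectangle([x, y, x + M - 1, y + M - 1], fill="green")

img.save("overscan_test.png")  # show this full screen on the TV
```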
Now, when those arrows are being drawn in by the sliders and the image seems to be getting smaller, the video card is NOT scaling the image. It is not down-sampling the original 1920x1080 output to fit into a smaller rectangle; it is actually chopping pixels off the left, right, top, and bottom of the 1920x1080 image (a crop instead of a resize).
From how far you had to drag the sliders to make the green arrows point to the four corners of the TV, the video card driver determines the amount of overscan your TV has, in pixels, and chops your 1920x1080 resolution down by that many pixels: down to 1860x1036 in my case.
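To make the arithmetic concrete, here is a minimal Python sketch of that crop math. The per-edge figures of 30 and 22 pixels are my assumption, picked only because they reproduce the 1860x1036 my driver arrived at; your sliders will land wherever your TV's overscan actually is:

```python
# Crop math: the driver subtracts the measured overscan from each edge.
FULL_W, FULL_H = 1920, 1080

overscan_x = 30  # assumed pixels lost on EACH of the left and right edges
overscan_y = 22  # assumed pixels lost on EACH of the top and bottom edges

desktop_w = FULL_W - 2 * overscan_x  # 1920 - 60 = 1860
desktop_h = FULL_H - 2 * overscan_y  # 1080 - 44 = 1036

print(f"Driver reports {desktop_w}x{desktop_h} to Windows")
```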
The video driver then reports that lower resolution to Windows as the desktop resolution ("hey Windows, we don't have a 1920x1080 display after all; it's actually 1860x1036"), so all apps are resized to that, including the desktop, just as if you were connected to a lower-resolution monitor.
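You can check what Windows now believes the desktop size to be. Here is a Windows-only Python sketch using the standard Win32 GetSystemMetrics call through ctypes; after the driver's overscan compensation it should print the cropped size (1860x1036 in my case), not the panel's native 1920x1080:

```python
# Ask Windows for the primary display resolution it believes it has.
import ctypes

user32 = ctypes.windll.user32
SM_CXSCREEN, SM_CYSCREEN = 0, 1  # standard Win32 metric indices

width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)
print(f"Windows sees a {width}x{height} desktop")
```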
Then, the key technology here:
While the driver reports the smaller 1860x1036 resolution to Windows, the video card actually outputs the full 1920x1080 to the TV, but as an 1860x1036 desktop rectangle drawn in the center of a full 1920x1080 black rectangle. In other words, an 1860x1036 image with black borders on every side, making up a full 1920x1080 frame.
That full 1920x1080 image is rendered as usual by the TV, still at a 1-to-1 pixel ratio, but this time the part of the image that goes into overscan land is those black borders, not your taskbar and Start button.
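Here is a rough Python/NumPy sketch of that compositing step, purely to illustrate the geometry (the real work happens inside the driver, of course; the gray rectangle stands in for your desktop contents):

```python
# Center the cropped desktop inside a full-size black 1080p frame.
import numpy as np

FULL_W, FULL_H = 1920, 1080
DESK_W, DESK_H = 1860, 1036

frame = np.zeros((FULL_H, FULL_W, 3), dtype=np.uint8)        # all-black 1080p frame
desktop = np.full((DESK_H, DESK_W, 3), 128, dtype=np.uint8)  # stand-in desktop image

x0 = (FULL_W - DESK_W) // 2  # 30 px of black border left and right
y0 = (FULL_H - DESK_H) // 2  # 22 px of black border top and bottom
frame[y0:y0 + DESK_H, x0:x0 + DESK_W] = desktop

# `frame` is what the TV receives: a full 1920x1080 signal, 1:1 pixel
# mapped, with only black pixels in the overscanned edges.
```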
It's a very cool solution indeed. All well-behaved Windows applications support arbitrary desktop sizes, and many games now do too, because most games build their list of supported resolutions by polling Windows for it. So if Windows reports 1860x1036 as a valid desktop resolution, that resolution will show up in the game's pick list.
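For the curious, this is roughly how a game builds that pick list: it asks Windows to enumerate the display modes the driver advertises. Here is a Windows-only Python sketch using the Win32 EnumDisplaySettingsW call through ctypes (the DEVMODEW structure below follows the standard Win32 layout); once the driver advertises the cropped mode, it shows up in this list too:

```python
# Enumerate the display modes the driver advertises, as a game would.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName",       wintypes.WCHAR * 32),
        ("dmSpecVersion",      wintypes.WORD),
        ("dmDriverVersion",    wintypes.WORD),
        ("dmSize",             wintypes.WORD),
        ("dmDriverExtra",      wintypes.WORD),
        ("dmFields",           wintypes.DWORD),
        ("dmUnion",            ctypes.c_byte * 16),  # position/orientation union, unused here
        ("dmColor",            ctypes.c_short),
        ("dmDuplex",           ctypes.c_short),
        ("dmYResolution",      ctypes.c_short),
        ("dmTTOption",         ctypes.c_short),
        ("dmCollate",          ctypes.c_short),
        ("dmFormName",         wintypes.WCHAR * 32),
        ("dmLogPixels",        wintypes.WORD),
        ("dmBitsPerPel",       wintypes.DWORD),
        ("dmPelsWidth",        wintypes.DWORD),
        ("dmPelsHeight",       wintypes.DWORD),
        ("dmDisplayFlags",     wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod",        wintypes.DWORD),
        ("dmICMIntent",        wintypes.DWORD),
        ("dmMediaType",        wintypes.DWORD),
        ("dmDitherType",       wintypes.DWORD),
        ("dmReserved1",        wintypes.DWORD),
        ("dmReserved2",        wintypes.DWORD),
        ("dmPanningWidth",     wintypes.DWORD),
        ("dmPanningHeight",    wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

modes = set()
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):  # None = primary display
    modes.add((mode.dmPelsWidth, mode.dmPelsHeight))
    i += 1

for w, h in sorted(modes):
    print(f"{w}x{h}")
```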
I now see a resolution option of 1860x1036 in the Video Settings of my Lara Croft Anniversary game. And, my, does she look good on that 65" screen.
(Even at slightly less than 1080p resolution. The fact that it crops rather than scales is also cool: it avoids blurry output and preserves 1:1 pixel rendering.)
I am still interested in mechanically moving the light engine/projector closer to the screen to defeat overscan, because I don't like losing the edges of my movies. The trick above works for Windows and DirectX games, but it cannot work for Blu-ray movies. HD is 1080p, period; scale it or chop it, and it's no longer 1080p.
By the way:
A great way to check how much overscan you have, and to weigh it against real-world HD video, is to stream Netflix on an Xbox. The Xbox Netflix video player lets you change the video size between letterbox, stretch-to-screen, and others, but most interestingly, to "native" resolution, which shows a streamed 720p HD video (all they stream) at an actual 720 pixels high (by 1280 wide?), centered on the screen. By flipping back and forth between that much smaller image and the full size, you can see what you are losing at the edges due to overscan.