
Registered · 217 Posts · Discussion Starter · #1
So...


I have seen other people have this problem, and I've never seen a definitive solution on how to force a video card to display the TV's native resolution of 1366x768 over DVI/HDMI or VGA.

I would even settle for the slightly less ideal 1360x768, but somehow both the nVidia and ATI drivers are detecting the TV's native resolution as 1360x768 @ 30i. Interlaced... WTF.

If I try to create a custom resolution of 1360x768 @ 60 Hz or 1366x768 @ 60 Hz, the nVidia drivers say "Test Failed" and refuse to try the resolution at all.

I remember this wasn't an issue about 10 driver revisions ago in Vista. I actually had 1:1 pixel mapping on this TV with a 7600GT and Vista 64-bit, but since switching to Win7 I can no longer get it to work despite agonizing over it. Amazingly, switching back to Vista doesn't solve it either...

What am I to do here... 1360x768 @ 30i flickers and drives me nuts, and 1280x720 doesn't fit the screen correctly even in 720p Home Theater mode or whatever. There has to be a solution for this.
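For anyone else poking at the custom resolution dialog, here is a rough sketch (plain Python; the blanking numbers are illustrative guesses, not values from any particular EDID) of how the pixel clock a mode needs falls out of the active and blanking values you type in:

```python
# Rough sketch: derive the pixel clock a custom mode would need.
# The blanking numbers used below are illustrative placeholders, NOT values
# read from any particular TV's EDID -- plug in your own.

def pixel_clock_mhz(h_active, h_front, h_sync, h_back,
                    v_active, v_front, v_sync, v_back,
                    refresh_hz):
    """Pixel clock (MHz) = total pixels per frame * refresh rate."""
    h_total = h_active + h_front + h_sync + h_back
    v_total = v_active + v_front + v_sync + v_back
    return h_total * v_total * refresh_hz / 1e6

# Example: a 1360x768 mode at 60 Hz with guessed blanking intervals.
clk = pixel_clock_mhz(1360, 64, 112, 256, 768, 3, 6, 18, 60)
print(f"needed pixel clock: {clk:.1f} MHz")   # ~85.5 MHz with these numbers
```

If the clock comes out above what the TV (or the cable/transmitter) will actually accept, the driver's "Test Failed" at least starts to make more sense.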

 

Registered · 1,110 Posts
It is my experience that 1:1 pixel mapping on 720p is a pain in the ass. Most TVs with HDMI always assume 720p or 1080i/p and will not accept their native resolution. Forget 1:1 pixel mapping; it is hard enough just to defeat the overscan.

I'm surprised that you have trouble with VGA, though. Do you do most of the testing with HDMI or VGA?
 

Registered · 4,696 Posts
My LG 720p 42LH20 picked it up right away over HDMI using Windows 7. Not all 720p sets have this issue, but third-tier brand sets don't play nice. Even the remote codes on third-tier sets can be a pain for universal remotes. Odd that it worked fine in Vista...
 

Registered · 217 Posts · Discussion Starter · #5

Quote:
Originally Posted by pixelation /forum/post/18183304


It is my experience that 1:1 pixel mapping on 720p is a pain in the ass. Most TVs with HDMI always assume 720p or 1080i/p and will not accept their native resolution. Forget 1:1 pixel mapping; it is hard enough just to defeat the overscan.

I'm surprised that you have trouble with VGA, though. Do you do most of the testing with HDMI or VGA?

To be honest, most of the tests have been with HDMI, but I just ordered a GeForce 220 to replace my aging 7600GT, and once I get it installed and running I plan to revisit the VGA question with a new round of testing for custom resolutions.

I think the EDID data works differently for DVI than it does for VGA. I think the nVidia drivers give you more leeway to try resolutions.

I do hate that VGA is noticeably less crisp, though.
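For what it's worth, you can peek at what the TV is actually advertising. Here's a minimal sketch that decodes the first detailed timing descriptor out of a raw 128-byte EDID dump; "edid.bin" is just a hypothetical filename for whatever your dump tool spits out:

```python
# Minimal sketch: decode the first detailed timing descriptor (DTD) from a
# raw 128-byte EDID dump.  "edid.bin" is a hypothetical filename -- dump the
# EDID with whatever utility your driver or OS provides first.

def first_dtd(edid: bytes):
    d = edid[54:72]                       # first DTD lives at offset 54
    clock_mhz = (d[0] | d[1] << 8) / 100  # stored in units of 10 kHz
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return clock_mhz, h_active, v_active

with open("edid.bin", "rb") as f:
    clk, w, h = first_dtd(f.read(128))
print(f"first detailed mode: {w}x{h}, pixel clock {clk:.2f} MHz")
```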
 

Registered · 4,696 Posts

Quote:
Originally Posted by pixelation /forum/post/18184044


What resolution did you get on your LG? I have no idea how a TV could accept 1366x768 at 720p.

1360x768, and it's not at 720p. Windows auto-detected that resolution through HDMI. I am missing 3 pixels on each side, but you don't notice unless you are right next to the TV. Using an 8600 video card in dual display mode (the other display is an old 1440x900).
 

Registered · 16,749 Posts
All "720p" TVs manufactured today have a native resolution of 1366x768 so the commercial users of the panels can easily program them. Many driver or interfaces will not accept 1366x768 as a resolution but will accept 1360x768 since 1360 is diviable by 8 and 1366 is not. The reason for this is that that basic font size for PC monitors has always been 8 pixels wide by 12 pixels high.
 

Registered · 217 Posts · Discussion Starter · #8

Quote:
Originally Posted by walford /forum/post/18185986


All "720p" TVs manufactured today have a native resolution of 1366x768 so the commercial users of the panels can easily program them. Many driver or interfaces will not accept 1366x768 as a resolution but will accept 1360x768 since 1360 is diviable by 8 and 1366 is not. The reason for this is that that basic font size for PC monitors has always been 8 pixels wide by 12 pixels high.

If that's the case, why does 1360x768 over HDMI extend beyond the edges of the screen?

I'm guessing it's because the TV has overscan built in for some reason, but I don't know if that's defeatable.
 

Registered · 1,375 Posts
Not all sets will accept their native resolution on all (or even any) inputs. Check your TV manual and see if there is a setting for 'dot for dot' or 'just scan' or '1:1' or something similar to turn off the overscan on the TV. Many 720p sets will accept only a standard HD resolution (i.e., 1280x720) over HDMI and overscan it. Most, but not all, will accept 1360 (or 1368) x 768 over VGA with no over/underscan: 1360 gives a 3-pixel black bar on each side, as servicetech sees, while 1368 loses one column of pixels on each side (see the quick arithmetic below). Some sets will not allow anything but standard 1024x768 over VGA.
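To put numbers on the 1360 vs. 1368 trade-off (assuming the set centers the image rather than scaling it):

```python
# Illustrative arithmetic for running a near-native width on a 1366-wide panel,
# assuming the set centers the image instead of scaling it.
NATIVE_WIDTH = 1366
for width in (1360, 1368):
    diff = width - NATIVE_WIDTH
    if diff < 0:
        print(f"{width}: {-diff // 2}-pixel black bar on each side")
    else:
        print(f"{width}: {diff // 2} column(s) of pixels lost on each side")
```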


This all gets compounded if you are using ATI CCC, which by default is set to underscan by ~15%.

To get 1:1 pixel mapping, the trick is to turn off the TV's overscan on the input you are using, set the video card overscan to 0, and set the desktop resolution to the TV's native resolution. And don't disregard the VGA input.


BB
 

Registered · 324 Posts
I have a Vizio VX42L 720p, which has a native resolution of 1366x768. I have an ATI 4350 card, and when I use an HDMI-to-HDMI connection the image is not centered. If I use DVI the image is centered; I can then turn off the scaling and the image fills the entire screen. Is this also a pixel mapping issue?
 

Registered · 1,110 Posts
How do you turn off scaling? I have tried a 32" 720p Vizio and could not find an option to defeat the overscan over HDMI. If I set the resolution to 1366x768, the video card switches to 1080p and again there is overscan. Only VGA works.

If you turn off ClearType and your fonts look sharp, you most likely have 1:1 pixel mapping.


What resolution does the TV say? Is it 720p or 1080p or 1080i?
 

Registered · 16,749 Posts
Yes, the Vizio is overscanning over its component and digital interfaces, the same as almost all HDTVs do, to emulate a standard CRT TV. This also lets the physical size of the meaningful content in a broadcast or movie be larger, since they all assume overscanning will occur and therefore put no meaningful content within about 15% of the frame borders.

PC monitors never overscanned, which is why the TV does not overscan when using the VGA interface.

My current-generation Vizio has vertical and horizontal size and centering controls, so I can eliminate the overscan in those picture modes where I do not want it.
 

Registered · 1,110 Posts
On the 720p Vizio I tried, it will only let me increase the overscan. The scaling defaults to 0 and the image gets larger (more overscan) as I increase the scaling. Very frustrating.

I can't seem to find an option to underscan with the nVidia driver. It has a way to scale the desktop, but that actually reduces the resolution of the usable area of the desktop; i.e., in 1280x720 mode, if I adjust the scaling to fit border to border, the resolution drops to 11xx by 6xx. Absolutely useless. On top of that, not all resolutions support scaling; 1366x768 does not.


Basically, I cannot get 1:1 pixel mapping.


I don't know how you guys are getting 1:1 pixel mapping with 720p TVs over HDMI, as 99% of them have a native resolution of 768 lines. The best I can think of is to let the video card upscale 768p to 1080p and then have the TV downscale it back to 768p. That is just disgusting.
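That 11xx by 6xx figure is roughly what you'd expect from shrinking the desktop to dodge the overscan. A rough sketch of the arithmetic (the overscan percentage is a guess for illustration; every set is different):

```python
# Rough sketch: usable desktop left after underscanning to dodge overscan.
# The 10% figure is only a guess for illustration; actual overscan varies by set.
def underscanned(width, height, overscan_pct):
    scale = 1 - overscan_pct / 100
    return round(width * scale), round(height * scale)

print(underscanned(1280, 720, 10))   # -> (1152, 648), i.e. "11xx by 6xx"
print(underscanned(1366, 768, 10))   # -> (1229, 691), if the driver allowed it
```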
 

Registered · 1,942 Posts
I get true 1:1 pixel mapping using the VGA input on my Vizio. PC is set to [email protected] I tried HDMI but the overscan was horribly bad. Once I scaled my Intel graphics back enough to show everything on the screen it was horribly blurry.
 

Registered · 16,749 Posts
It is much better to use the true native resolution over VGA and get true 1:1 pixel mapping than it is to underscan the output (by deleting content) so that the TV has only the appearance of 1:1 mapping; when it overscans the input it has to invent the content of the extra pixels required to fill the screen, which causes blurriness in PC application text.
 