
what video card can do 1366x768 rez? - Page 2

post #31 of 144
Quote:
Originally Posted by dchester View Post

I'm interested in doing 1366 x 768 (although 1360 or 1368 would be acceptable) via the VGA connection, to my Panny plasma TV. Will an ATI X300 video card support this? I have a chance to get a good deal on a PC with this card, but I'll pass if it won't work. Any advice is appreciated.

The ATI cards don't support 1366; they only support resolutions that are multiples of 8, over both VGA and DVI.

The new nVidia cards can handle a 1366 resolution, but only via DVI.
post #32 of 144
Quote:
Originally Posted by Naylia View Post

The ATI cards don't support 1366; they only support resolutions that are multiples of 8, over both VGA and DVI.

The new nVidia cards can handle a 1366 resolution, but only via DVI.

Could you tell me how to set up 1366 on an EVGA 6800 card with the latest nForce drivers? Thanks.
post #33 of 144
Can't walk you through it as unfortunately I'm still on a 9800 Pro, but you should be able to adjust your resolution pixel by pixel in one of the advanced areas of the control panel if you are using DVI. My friend has his 50" plasma pixel perfect at 1366 over DVI and it's gorgeous. Maybe someone else can weigh in with buttonlogoy for you.
post #34 of 144
My 7900GT video card will output a 1368x768 resolution, but it looks terrible on my Sony HS20 and the software sets it as a 1080i display type. I think the reason my TV and many others don't work properly at native resolution is that the EDID doesn't include it (don't ask me why). So the EDID sets the resolution as 1280x720, and when overscan is adjusted it ends up as 1176x664. The output looks good, but of course it's much lower than its native resolution.

So, even if you can output that resolution you should check to see if your TV supports it in the EDID.
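If you're curious, you can check this yourself: dump the raw 128-byte EDID with an EDID reader utility (or pull it from the registry) and look at the detailed timing descriptors. Here's a minimal sketch in Python; the file name "edid.bin" is just a placeholder, not something any particular tool produces.

Code:
# edid_modes.py - rough sketch: list the resolutions advertised in the four
# 18-byte detailed timing descriptors of a raw 128-byte EDID dump.
# Assumes you've already saved the EDID to "edid.bin" (placeholder name).

def detailed_timings(edid):
    for offset in (54, 72, 90, 108):          # descriptor locations in the base block
        d = edid[offset:offset + 18]
        pixel_clock = d[0] | (d[1] << 8)      # in 10 kHz units; 0 = not a timing descriptor
        if pixel_clock == 0:
            continue
        h_active = d[2] | ((d[4] & 0xF0) << 4)   # horizontal active pixels
        v_active = d[5] | ((d[7] & 0xF0) << 4)   # vertical active lines
        yield h_active, v_active

with open("edid.bin", "rb") as f:
    edid = f.read(128)
for w, h in detailed_timings(edid):
    print("%dx%d" % (w, h))

If 1366x768 (or 1360/1368x768) never shows up in the list, the set is only advertising the usual HD modes and the driver will normally fall back to those.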
post #35 of 144
Quote:
Originally Posted by Naylia View Post

Maybe someone else can weigh in with buttonlogoy for you.

Good phrase. I just did this for my 6600GT last night so it's reasonably fresh.

With the plasma selected as primary monitor, go into the Nvidia control panel and add a custom resolution of 1368x768.

In the advanced timings control panel you will need to enable flat panel scaling and then set the "back-end active" to 1365 or 1366, depending on your panel.

A picture of the panel is available here:

http://www.avsforum.com/avs-vb/showt...ia#post7184461
post #36 of 144
Quote:
Originally Posted by jvincent View Post

Good phrase. I just did this for my 6600GT last night so it's reasonably fresh.

I actually missed an 'o' in the spelling. It should be 'buttonology'

I really need to snag an nVidia card that can do 1366x768, but I'm holding off on any video card upgrades until I can get something from nVidia that supports HDCP... just in case.
post #37 of 144
Quote:
Originally Posted by Naylia View Post

I really need to snag an nVidia card that can do 1366x768, but I'm holding off on any video card upgrades until I can get something from nVidia that supports HDCP... just in case.

I was in the same boat as you, but the way things are going I figure it's going to be at least a year or so before the whole DRM/HDCP/Vista/BluRay/HDDVD/..... situation even begins to stabilize.

Given that you can get 6600GTs and 7600GSs relatively cheap right now I figured it was worth pulling the trigger now.
post #38 of 144
It's really strange how there are lots of projectors and plasmas with WXGA (1366 x 768) as the native resolution, but there don't seem to be a lot of video cards that do that resolution for PCs. What's even stranger is that there are plenty of laptops (like from Sony or Toshiba) that will do WXGA, so the reason doesn't appear to be a technical one. While I suppose it's cheaper to get a PC and perform some hack to get it to work at 1366x768, I'm wondering if I should just get a laptop that does it right out of the box, so I don't have to fool around any more than is necessary.
post #39 of 144
There are tons of video cards that do 1366x768; every card on the market for the past 5 years can do it.

What people aren't understanding is that it is the display makers who are imposing this limitation. The EDID value, the list of resolutions that the display will accept, is exchanged between the HDMI/DVI port on the display and the video card. Most of the time this list contains HD resolutions only - 1080i/720p/480p - because this port is expected to be used for HD.
post #40 of 144
Quote:
Originally Posted by almostinsane View Post

There are tons of video cards that do 1366x768; every card on the market for the past 5 years can do it.

What people aren't understanding is that it is the display makers who are imposing this limitation. The EDID value, the list of resolutions that the display will accept, is exchanged between the HDMI/DVI port on the display and the video card. Most of the time this list contains HD resolutions only - 1080i/720p/480p - because this port is expected to be used for HD.

I plan on using the VGA port on my Panny TH-50px500u. Are you saying that any video card (made in the past 5 years) should be able to do 1366 x 768 (without using powerstrip, or some other hack)?
post #41 of 144
Quote:
Originally Posted by dchester View Post

I plan on using the VGA port on my Panny TH-50px500u. Are you saying that any video card (made in the past 5 years) should be able to do 1366 x 768 (without using powerstrip, or some other hack)?

He is right that many cards can support just about any WXGA resolution that the display device indicates it can support. The problem is that the older boards have no way to produce resolutions (overscan and custom resolutions) that the display device does not report. I think the newer boards and PowerStrip do some fudging: they actually use a higher supported resolution than the one desired and then apply a large amount of overscan, so that only the desired resolution lands on the screen (the rest falls outside the visible area). The card still allocates the larger memory map but only puts data in the area to be displayed. As an example, if 1366x768 is desired and the only resolutions the display device supports are 1280x720 and 1920x1080, the board would use the 1920x1080 mode and create enough overscan that the desired 1366x768 fills the screen. A 1920x1080 map is created, but only the center 1366x768 is used for displayed data (see the rough arithmetic sketch below). You'll notice that all custom resolutions above 1280x720 are always indicated as a 1080i resolution on the TV; the card has to use one of the defined resolutions from the display device, since that is all the display device supports. I hope my assessment is correct.
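Rough arithmetic for that centering idea, in case it helps (the numbers are purely illustrative; no driver is guaranteed to do exactly this):

Code:
# Scan out a mode the EDID accepts (here 1920x1080) but only draw into a
# centered 1366x768 window; everything outside it falls off the visible area.
def centered_window(big, small):
    bw, bh = big
    sw, sh = small
    return (bw - sw) // 2, (bh - sh) // 2    # left/right and top/bottom offsets

print(centered_window((1920, 1080), (1366, 768)))   # (277, 156)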
post #42 of 144
Quote:
Originally Posted by almostinsane View Post

There are tons of video cards that do 1366x768; every card on the market for the past 5 years can do it.

What people aren't understanding is that it is the display makers who are imposing this limitation. The EDID value, the list of resolutions that the display will accept, is exchanged between the HDMI/DVI port on the display and the video card. Most of the time this list contains HD resolutions only - 1080i/720p/480p - because this port is expected to be used for HD.

Yeah, this is my problem using the Sony HS20 TV. My 7900GT video card will output that resolution but the damn TV EDID doesn't support it. So, I'm only able to get 1368x768 after jumping through many hoops and then after a reboot it won't keep those settings. The problem with many display devices is the EDID, which I think is often blamed on the video card.
post #43 of 144
Quote:
Originally Posted by mnn1265 View Post

The problem with many display devices is the EDID, which I think is often blamed on the video card.

I'm new to the Nvidia control panel, but isn't there an "ignore EDID" setting in it somewhere?
post #44 of 144
Quote:
Originally Posted by Naylia View Post

The ATI cards don't support 1366; they only support resolutions that are multiples of 8, over both VGA and DVI.

The new nVidia cards can handle a 1366 resolution, but only via DVI.

Thanks for all your posts on the subject. What you have said is consistent with what I've seen in my testing. It seems like every time I see a good buy on a PC (that would fit in my entertainment center), it never has an nVidia card.


BTW, you wouldn't happen to know if the Intel Graphics Media Accelerator 950 does 1366 x 768? I saw a good price on a Dell with that.
post #45 of 144
unlikely, the drivers/capabilities of the intel onboard processors are a good ways behind both ati and nvidia. the solution may be to just drop a 6600GT (or better) in the pc yourself if you're set on a prebuilt

or another option would be to look at a smaller company like 2PartsFusion (i think that's their name, they advertise on htpcnews.com) to build you a pc
post #46 of 144
Quote:
Originally Posted by jvincent View Post

I'm new to the Nvidia control panel, but isn't there an "ignore EDID" setting in it somewhere?

It can be ignored for many video drivers in Windows.

1) Go into Display Properties -> Settings, and then click on "Advanced...".

2) When the next dialog box comes up, select the Monitor tab, and then uncheck the box that says "Hide modes that this monitor can not display" (and of course click on the "Apply" button).

3) Then click on the Adapter tab, and click on the "List All Modes" button.

I attached pictures for steps 2 & 3.
post #47 of 144
Quote:
Originally Posted by almostinsane View Post

There are tons of video cards that do 1366x768; every card on the market for the past 5 years can do it.

What people aren't understanding is that it is the display makers who are imposing this limitation. The EDID value, the list of resolutions that the display will accept, is exchanged between the HDMI/DVI port on the display and the video card. Most of the time this list contains HD resolutions only - 1080i/720p/480p - because this port is expected to be used for HD.

Actually the EDID value isn't normally the problem; that can be ignored by the driver. I came across this in the FAQ for PowerStrip while trying to get 1:1 mapping for a Syntax Olevia LCD (1366x768). A quick sketch of the character-clock arithmetic is at the end of this post.
Quote:
Originally Posted by Rik Wang View Post

A common question among, e.g., plasma owners, is why horizontal timings cannot be specified in terms of individual pixels - i.e., why 848x480 instead of an optimal 852x480, or 1368x768 instead of the native 1366x768?

The reason is that graphics cards almost invariably do horizontal timing in terms of character clocks of 8 pixels, rather than in terms of individual pixels - a legacy of the original Motorola 6845 CRT controller. At the hardware register level, 848x480 is actually programmed as 106 character clocks x 480 lines - 106 characters of 8 pixels each equals 848 total pixels. Likewise, 1366 pixels isn't possible - the closest possible values are 1368 (171 character clocks) or 1360 (170 character clocks).

Hence a horizontal resolution that isn't evenly divisible by an 8 pixel character clock is not possible on display hardware that claims compatibility with VGA (and pre-VGA) standards. You either live with a couple of pixels of overscan, or settle for a couple of pixels blank border.

One exception to this horizontal timing rule is ATI's new X1K series, along with Matrox's Parhelia and P-series, which use pixels rather than character clocks to generate horizontal timings.

Another exception is the old Kyro 2 - still used on special purpose graphics cards from some manufacturers. The Kyro 3 is also capable of pixel-perfect resolutions at the hardware level, but the generic drivers do not allow this.

Also: for digital (not analog) timing, NVidia cards have always used individual pixels rather than character clocks of 8 pixels, even though effective changes have tended to be in character clocks anyway. Again: this only applies to DVI-D/HDMI, not analog VGA or DVI-A connections.

Brandon
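To make that concrete, here's a quick sketch of the character-clock arithmetic (purely illustrative Python, not anything from the PowerStrip FAQ itself):

Code:
# Nearest horizontal widths an 8-pixel character clock can actually generate.
def char_clock_widths(target, char=8):
    lower = (target // char) * char                       # e.g. 170 clocks -> 1360
    upper = lower if lower == target else lower + char    # e.g. 171 clocks -> 1368
    return lower, upper

print(char_clock_widths(1366))   # (1360, 1368) - 1366 itself is unreachable
print(char_clock_widths(852))    # (848, 856)   - hence 848x480 instead of 852x480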
post #48 of 144
Quote:
Originally Posted by dchester View Post

It can be ignored for many video drivers in Windows.

1) Go into Display Properties -> Settings, and then click on "Advanced...".

2) When the next dialog box comes up, select the Monitor tab, and then uncheck the box that says "Hide modes that this monitor can not display" (and of course click on the "Apply" button).

3) Then click on the Adapter tab, and click on the "List All Modes" button.

I attached pictures for step 2 & 3.

My problem with EDID isn't up to that point (which I can get to) but after. Once I go to "List all modes" and choose the 1368x768 @ 56Hz option (which is in there), I get a windowed (1024x768) output. I have to go back into the "GeForce 7900 GT" tab, under "Advanced Timings", and in the "Timing standard" dropdown select either DMT, GTF or CVT to get a nice 1368x768 output. It looks just great. The problem comes after a reboot, at which point I'm right back to a 1024x768 window.

Perhaps I'm attributing this to the EDID falsely but I can't think of what else could be causing the problem (as the EDID contains the 1024x768 setting). Maybe it's just coincidence but it's a tricky one.

Edit: Forgot to mention. This wouldn't be a problem if I were able to just reapply the timings setting and be good to go, but whenever I switch to a game or any other app that uses a custom resolution setting, it resets the whole shebang back to 1024x768. Well, that and the fact that games only detect the option to do 1024x768 and not the higher 1368x768. If it's not the damn EDID, then what could be the cause?
post #49 of 144
mnn1265, have you tried adding the custom resolution in the Nvidia control panel as shown in the picture I linked to earlier in the thread?

I'm pretty sure for this to work you have to have the DVI connected panel as the primary display BTW.

Once you do that you should be able to get 1366x768 to your display, although you will need to actually select 1368x768 in the control panel.
post #50 of 144
Quote:
Originally Posted by jvincent View Post

mnn1265, have you tried adding the custom resolution in the Nvidia control panel as shown in the picture I linked to earlier in the thread?

I'm pretty sure for this to work you have to have the DVI connected panel as the primary display BTW.

Once you do that you should be able to get 1366x768 to your display, although you will need to actually select 1368x768 in the control panel.

Yes, and thanks, as that's how I actually managed to get the 1368x768 to display in the first place. As I mentioned though, it only works if I change the timing selection in the timings drop-down menu, and then the setting doesn't keep.

The 1368x768 setting continues to be applied in the control panel the entire time whether it's a 1368x768 or a windowed 1024x768 output. Curiously though, when the 1368x768 is actually being displayed the TV shows it to be 1368x768 and when the 1024x768 is being displayed that's what it shows as the output.

It is the only (single) display connected via DVI.
post #51 of 144
Hmm. Got me stumped.
post #52 of 144
Thanks for getting me as far as you did... at least now I know how nice 1368x768 actually looks on my HS20. My guess is some smart person has figured out a way around this problem and hopefully I'll get it working eventually.

What I've been doing the last year or so is using "Video RGB" mode with a resolution of 1280x720 and then using the overscan compensation feature in the NVidia driver control panel. It looks fine and works well with games and such, but unfortunately it only nets me a 1177x667 resolution. Under this mode NVidia recognizes the "Sony HS20" monitor driver.

Using the "Computer" setting I'm able to get the 1367x768 resolution to work but with the problems I've described (can't maintain the res after reboot etc.) and NVidia recognizes only a "Plug and play" monitor. I think that driver w/ the Sony EDID is probably causing the problem. I think I'll go dust-off powerstrip and see if I can make a custom monitor driver for it and see if that helps. Can't think of anything else to try at this point.

Thanks again for the help.
post #53 of 144
mnn1265:

If your monitor is being recognized as "Plug and Play" then you are using automatic detection. This is bad. Never, never use autodetect. Here is the procedure to fix it:

1. Download the following file:

davemon.INF

Here's the comments from the first section of the INF:

; DAVEMON.INF
;
; This is a Setup information file for monitors
; supported in the Windows 98 product.
; It was hacked out of MONITOR6.INF
; I was annoyed that my plug & play monitor was only supported up to 1600x1200 after I discovered
; accidentally that it could display well over 1900x1200. I found this from the default driver
; but that only supported refresh rates of 60Hz.
;
; I decided to create my own "no limits" driver, which simply enables all the modes of my GeForce2Pro card.
;
;
; Copyright (c) 2002, David Pietromonaco



2. Go to your display properties.
3. Click Settings tab
4. Click Advanced button
5. Click Monitor tab
6. Now, find the check box beside "Hide modes that this monitor cannot display", and if there's a check in it, UNCHECK it.
7. Next, the first section at the top will say "Monitor Type". Click the Properties button.
8. In the window that appears, click the Driver tab
9. Click Update Driver
10. In the window that pops up, click "Install from a list or specific location (Advanced)"
11. Now click Next
12. Now click the second radio button titled "Don't search. I will choose the driver to install."
13. Click Next
14. Uncheck "Show compatible hardware". When you do this, you'll see lots of manufacturers and models appear.
15. Click the Have Disk button.
16. In the window that pops up, click the Browse... button.
17. In the window that pops up, go to the location where you saved davemon.INF and select it, then click the Open button.
18. Now back at the previous window, click the OK button.
19. You will now be back at the "Hardware Update Wizard" window, in which a single Model will now be listed, namely "NoLimits monitor for GeForce2Ti/Pro/GTS". Click the name once to select it.
20. Click the Next button. A window will/may pop up warning you that the driver isn't signed. Click the Continue Anyway button.
21. Now click the Finish button.
22. Back at the previous window, click the Close button.
23. Back at the previous window, click the OK button.
24. And finally back at the Display Properties window, click the OK button.

OK, phase 1 complete. You have to OK all the windows before the new device's modelist will become active. Now we'll go back in there and see all the modes your adapter will do:

25. Go to your display properties.
26. Click Settings tab.
27. Click Advanced button.
28. Click the Adapter tab.
29. Click the List All Modes... button.

Now you can pick a mode to your heart's content. If you have an NVidia card, you can also create custom mod8 resolutions. If you have a DVI device, you may even be able to create non-mod8 ones. Remember though that DivX only does mod4 horizontal resolutions. This does not include 1366, unfortunately.
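For what it's worth, the reason a permissive INF like this helps is that Windows filters the adapter's mode list against the sync ranges the monitor driver advertises; a "no limits" INF simply advertises very wide ranges. Here's a rough sketch of that filtering idea - my own illustration, not Microsoft's actual logic, and the 5% vertical blanking allowance is an assumption:

Code:
# Rough model of how advertised sync ranges gate the mode list (illustrative only).
def mode_allowed(height, refresh_hz, h_range_khz, v_range_hz, v_blank=1.05):
    # horizontal scan rate = total lines per frame (active + ~5% blanking) x refresh
    h_khz = height * v_blank * refresh_hz / 1000.0
    return (h_range_khz[0] <= h_khz <= h_range_khz[1]
            and v_range_hz[0] <= refresh_hz <= v_range_hz[1])

narrow = ((30, 60), (59, 61))             # a conservative monitor driver
print(mode_allowed(768, 60, *narrow))     # True  (~48.4 kHz, 60 Hz)
print(mode_allowed(1200, 85, *narrow))    # False (~107 kHz and 85 Hz are out of range)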
post #54 of 144
Isochroma, thanks for your help. Even though I had already created a custom .inf file for the HS20 in PowerStrip (one that allowed larger resolutions), I tried DAVEMON.INF. It worked just fine, except I ended up with the same problem... namely, the 1368x768 (or any higher resolution I attempted) was displayed in a 1024x768 window that required me to pan with the mouse to see the rest of the desktop.

I just can't seem to get past the restrictions in the EDID of 1024x768 even with a driver that allows me to go higher.
post #55 of 144
Yes, there's a secondary limitation that is imposed at the hardware/driver level when connecting via DVI/HDMI. The recommendation is to do what I will do in 8 months' time when the 37HL95 arrives: use VGA!

VGA will always work (apart from length limitations; VGA extender cables are available), and considering the low bandwidth of 1368x768 at 60/72/75 Hz, the analog fuzziness should not be visible even with an extender.

Regarding extra length, I tested a 6' extension cable with my current 21" ViewSonic P220f monitor at a resolution of 1600x1200 @ 88 Hz (a bandwidth of 253.4 MHz) and noticed very, very slightly less sharpness on close inspection (less than 1 foot), with a total length of ~12 ft. including the standard cable.

Now, using an LCD at 1368x768 @ 60 Hz gives us 94.6 MHz, or 1/2.679th the bandwidth (37.3% of my test). This means any analog blurring should be absolutely invisible with 12' of VGA cable.
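Those figures line up with a simple "active pixels plus roughly 50% blanking" estimate; here's the back-of-the-envelope version (the 1.5 blanking factor is my assumption, not an exact GTF/CVT timing calculation):

Code:
# Rough pixel-clock estimate: active pixel rate plus ~50% blanking overhead.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.5):
    return width * height * refresh_hz * blanking / 1e6

crt_test = approx_pixel_clock_mhz(1600, 1200, 88)   # ~253.4 MHz
lcd_mode = approx_pixel_clock_mhz(1368, 768, 60)    # ~94.6 MHz
print("%.1f MHz vs %.1f MHz (%.1f%% of the test)" %
      (crt_test, lcd_mode, 100.0 * lcd_mode / crt_test))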

So long as your card has a DVI port that also outputs analog, you just plug in a cheap ($25) DVI->VGA adapter plug and away you go! Analog displays do send DDC info to the card, but modes are not enforced, ever.

One disadvantage is you will be SOL for HDCP, but it's unlikely your card/display both support it anyway.

Finally, after long reading of this forum, it is my opinion that DVI/HDMI are not a good idea for connecting computers to TVs. I'm not dissing the interfaces per se, but their implementations in TV-class devices tend to be very rigid and mode-limited. You may luck out and get a functional, pixel-perfect sync between your PC and such a device, but this is not likely to occur at the resolution you'd want.

In contrast, the flexible nature of VGA means devices that use a VGA-interface chip tend to support many resolutions and refresh rates. Also, no video card on the market restricts output modes when connected via VGA, while almost all do through DVI.

If you're lucky enough to own a Westinghouse or another model where the manufacturer has specifically made efforts to accommodate users, then you could get a nice 1:1 1920x1080, etc. Such manufacturers can be counted on half the fingers of one hand, unfortunately.
post #56 of 144
For people with ATI cards, Catalyst 6.4 drivers are now out.
No, I haven't even downloaded them yet
post #57 of 144
I have an ATI 9500 Pro with a Dell W2306C LCD display (1366x768 native resolution). Am I missing anything by having it set to 1360x768? It looks clear enough and I'm not dealing with any over- or underscan (that I am aware of). It's currently running on VGA.

I've noticed though that the display will detect and correct overscan at startup only if whatever is displayed is reasonably bright. If the right area of the screen happens to be dark, it will detect the wrong area and chop off part of the image. You then need to tell it to detect again or cycle it to a component and back to VGA. Is this kind of thing normal? Not a deal breaker, just a minor annoyance.

I'm just curious if there's some untapped potential in that display that I'm not using because it's not set to 1366x768.
post #58 of 144
I own a Westinghouse 27w7 and also use 1360x768 even though the panel is listed as 1366x768 native. There is no underscan or overscan, so maybe it truly is 1360 pixels wide?

I think this entire thread is a good lesson to all. Only buy TVs with VGA inputs! It will save you hours of headaches!
post #59 of 144
that's the same conclusion i came to as well. i have a samsung 32" LNS3251D and i can only get 1:1 mapping on vga. 1280 gives me overscan, 1366 gives me horrible underscan, and messing with custom resolutions is a headache. my main issue with vga on my display is the sharpness and detail compared to dvi>hdmi. vga seems washed out and not as vibrant. right now im still trying to figure out if maybe going with an nvidia card (7600gs) will fix my problem. i hear some people say it can't be done because of the edid, and others say it can be done in the nvidia drivers. i've picked up a few tips along the way, so i think i'll fire up my box tomorrow and give it a shot again.
post #60 of 144
Quote:
Originally Posted by Isochroma View Post

mnn1265:

If your monitor is being recognized as "Plug and Play" then you are using automatic detection. This is bad. Never, never use autodetect. Here is the procedure to fix it:

1. Download the following file:

davemon.INF

Here's the comments from the first section of the INF:

; DAVEMON.INF
;
; This is a Setup information file for monitors
; supported in the Windows 98 product.
; It was hacked out of MONITOR6.INF
; I was annoyed that my plug & play monitor was only supported up to 1600x1200 after I discovered
; accidentally that it could display well over 1900x1200. I found this from the default driver
; but that only supported refresh rates of 60Hz.
;
; I decided to create my own "no limits" driver, which simply enables all the modes of my GeForce2Pro card.
;
;
; Copyright (c) 2002, David Pietromonaco



2. Go to your display properties.
3. Click Settings tab
4. Click Advanced button
5. Click Monitor tab
6. Now, find the check box beside "Hide modes that this monitor cannot display", and if there's a check in it, UNCHECK it.
7. Next, the first section at the top will say "Monitor Type". Click the Properties button.
8. In the window that appears, click the Driver tab
9. Click Update Driver
10. In the window that pops up, click "Install from a list or specific location (Advanced)"
11. Now click Next
12. Now click the second radio button titled "Don't search. I will choose the driver to install."
13. Click Next
14. Uncheck "Show compatible hardware". When you do this, you'll see lots of manufacturers and models appear.
15. Click the Have Disk button.
16. In the window that pops up, click the Browse... button.
17. In the window that pops up, go to the location where you saved davemon.INF and select it, then click the Open button.
18. Now back at the previous window, click the OK button.
19. You will now be back at the "Hardware Update Wizard" window, in which a single Model will now be listed, namely "NoLimits monitor for GeForce2Ti/Pro/GTS". Click the name once to select it.
20. Click the Next button. A window will/may pop up warning you that the driver isn't signed. Click the Continue Anyway button.
21. Now click the Finish button.
22. Back at the previous window, click the Close button.
23. Back at the previous window, click the OK button.
24. And finally back at the Display Properties window, click the OK button.

OK, phase 1 complete. You have to OK all the windows before the new device's modelist will become active. Now we'll go back in there and see all the modes your adapter will do:

25. Go to your display properties.
26. Click Settings tab.
27. Click Advanced button.
28. Click the Adapter tab.
29. Click the List All Modes... button.

Now you can pick a mode to your heart's content. If you have an NVidia card, you can also create custom mod8 resolutions. If you have a DVI device, you may even be able to create non-mod8 ones. Remember though that DivX only does mod4 horizontal resolutions. This does not include 1366, unfortunately.

I have loaded this and get many more options but NOT 1368x768. What do I need to do to achieve this? I'm using an ATI Radeon x850xt.