
Registered · 7 Posts · Discussion Starter · #1
So, first of all, I am new here, though I lurk on occasion, especially when I'm shopping for a new television or computer monitor! I know only a little about video technology, so please forgive me if I'm misusing terminology or getting some details wrong.


I recently purchased a Vizio M320VT HDTV to use as a computer monitor. I had a lot of apprehensions about the brand, but it was well reviewed (and very cheap for an LED-backlit 1080p display), so I went for it. The picture quality isn't really anything to write home about, but the colors and contrast are decent enough to keep me satisfied after some calibration.


When I first got the TV I had it hooked up via VGA while I waited for my DVI-to-HDMI cable to arrive, and at that point it looked fine. But once the HDMI cable came, I noticed strange-looking artifacting on the screen, which I can only describe as similar to JPEG compression. It's most noticeable on text against a colored background; honestly, it kind of resembles a bad, blurry CRT. I always thought HDMI was supposed to be uncompressed, and 1080p IS the native resolution of this display, so it can't be a scaling artifact.

Does anyone have any idea what could be causing this? Adjusting the television's picture settings does little to change it, including the sharpness setting, which I figured might have been the culprit; the only thing with even a minimal effect is the backlight, and that really just makes it less noticeable instead of eliminating it. Could this be caused by my cable? I know some HDMI cables supposedly can't do true 1080p, but I made sure mine can. Am I just better off with VGA?


Here are some photos I took with my phone. I don't know if they'll be useful, but you can kind of see what I'm talking about in the pixels above and below the text that says "note in reader":


 

Registered · 12,697 Posts
Is there a PC mode or PC label that can be used/applied to tell the TV it is receiving a PC signal through the HDMI input? I know you're supposed to label the input as "PC" on the Samsungs to get best results.
 

Registered · 7 Posts · Discussion Starter · #3
There is a PC Settings menu, but it is only available when using the RGB input (which is what this TV calls the VGA input).
 

Registered · 12,697 Posts
Are you using the HDMI YCC color space or the HDMI RGB color space on your PC?
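
The reason I ask: YCC video is usually carried with the two chroma channels at reduced resolution (4:2:2 or 4:2:0), and on colored text that can look a lot like JPEG artifacts, while RGB keeps all three channels at full resolution. Here's a rough Python simulation of the 4:2:0 case so you can see the effect on a screenshot; this is just a sketch of the general technique, not the TV's actual pipeline, and the filenames are placeholders.

Code:
import numpy as np
from PIL import Image

def simulate_420(img):
    # Round-trip an RGB image through YCbCr with the chroma planes
    # held at half resolution (4:2:0), as many TV pipelines do.
    ycc = np.array(Image.fromarray(img).convert("YCbCr"))
    for c in (1, 2):                                # Cb and Cr planes
        sub = ycc[::2, ::2, c]                      # 2x2 subsample
        up = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)
        ycc[..., c] = up[:ycc.shape[0], :ycc.shape[1]]
    return np.array(Image.fromarray(ycc, mode="YCbCr").convert("RGB"))

# "text_screenshot.png" is a placeholder for any screenshot with
# colored text; compare the output against the original.
src = np.array(Image.open("text_screenshot.png").convert("RGB"))
Image.fromarray(simulate_420(src)).save("text_420.png")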
 

Registered · 7 Posts · Discussion Starter · #5
I'm not completely sure how to figure that out, but it says "RGB" under my NVIDIA Control Panel color settings menu. Switching to YCC actually makes the problem much worse!


edit: and now it won't let me switch back :'(

edit 2: never mind, I just had to switch the input off of HDMI and back for it to accept the RGB setting.
 

Registered · 12,697 Posts
Is your video card outputting 1920x1080 at 60Hz? Are there any settings in the NVIDIA Control Panel that might fix the issue?
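
If you want a second opinion on what Windows itself reports, independent of the driver panel, a short ctypes script against the Win32 EnumDisplaySettings API will print the current mode. This is just a quick sanity-check sketch (the DEVMODE layout below flattens the display half of the struct's union), not polished code.

Code:
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

class DEVMODEW(ctypes.Structure):
    # Win32 DEVMODEW with the printer/display union flattened to the
    # display members; field order and sizes match the SDK header.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency}Hz")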
 

Registered · 7 Posts · Discussion Starter · #7
It is. Strangely, when I run the television optimization wizard, it only goes as high as 1080i... and then the 1080p resolution is removed from the list, and I have to add a custom resolution to get it back to 1080p. There are all sorts of weird-looking timing settings that I'm too scared to mess with; one of those might fix it, but I have no clue what any of them do and don't really know where to start.
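
For the record, after some searching these look like the standard CEA-861 numbers for 1080p/60; the little script below is just me checking that they multiply out to the 148.5 MHz pixel clock the custom resolution dialog shows. Can anyone confirm these are what I should be entering?

Code:
# Standard CEA-861 timing for 1080p/60 (the "custom resolution" fields)
h_active, h_front, h_sync, h_back = 1920, 88, 44, 148
v_active, v_front, v_sync, v_back = 1080, 4, 5, 36
refresh_hz = 60

h_total = h_active + h_front + h_sync + h_back   # 2200 pixels per line
v_total = v_active + v_front + v_sync + v_back   # 1125 lines per frame
pixel_clock = h_total * v_total * refresh_hz     # 148,500,000 Hz

print(f"{h_total} x {v_total} total, {pixel_clock / 1e6} MHz pixel clock")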
 

Registered · 7 Posts · Discussion Starter · #8
This leads me to think that the cable, for whatever reason, might only actually be capable of 1080i (despite being advertised otherwise), and the TV is poorly converting 1080i to 1080p/60Hz?


I don't think I mentioned that it's actually a DVI-to-HDMI cable. Maybe something is going wrong in the conversion?
 

Registered · 16,749 Posts
DVI and HDMI use exactly the same video protocol and signals, so nothing can go wrong with the video when converting between the DVI and HDMI connectors and pinouts.

Your display resolution should be set to 1080p/60 if possible; otherwise set it to 1080i, since many 1080p HDTVs have receiver chips that will not accept 1080p but will accept 1080i (1080i/60 is an ATSC resolution). The TV then de-interlaces the 1080i to 1080p for display.

All HDMI cables are capable of 1080i/60 or 1080p/60.
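
Roughly speaking, the simplest form of de-interlacing just weaves the two 540-line fields of each 1080i frame pair back into one 1080-line frame; real TVs layer motion-adaptive processing on top of that. A toy numpy sketch of the idea:

Code:
import numpy as np

def weave(top_field, bottom_field):
    # Interleave two 1920x540 fields into one 1920x1080 frame:
    # the top field carries lines 0, 2, 4, ... and the bottom
    # field carries lines 1, 3, 5, ...
    frame = np.empty((1080, 1920, 3), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

# dummy fields stand in for what arrives over the 1080i link
top = np.zeros((540, 1920, 3), dtype=np.uint8)
bottom = np.full((540, 1920, 3), 255, dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920, 3)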
 

Registered · 7 Posts · Discussion Starter · #10
Well, it's set to 1080p at 60Hz for sure. When I set it to 1080i, it's the same picture but it flickers, so I don't think it's expecting a 1080i signal...
 

Registered · 16,749 Posts
Your TV always displays 1080p/60 since that is its native resolution. It will accept 1080i/60 from your PC and de-interlace it. If you send it 1080p/24, it will flicker.

Since your objective is to use it as a PC monitor and not as a TV, I suggest you use the VGA (PC) input with a standard PC monitor cable.

DVI/HDMI does not provide any screen-resolution improvement over VGA when used with a PC monitor, and most users never notice any color improvement with video, either.
 