Originally Posted by sofakng
Most televisions have a "Game Mode" that disables a lot of video processing to improve input/display latency, and they typically seem to work really well. But what is the disadvantage of these modes?
For example, my setup has only an HDMI matrix and AVR connected, with a single HDMI cable running to my TV. This means I can only choose one "picture mode" unless I want to go into the TV menu every time I switch between movies and games.
Could I just leave "Game Mode" turned on all the time and calibrate it with a colorimeter? Or is the picture probably going to be worse?
I would think "Game Mode" is a good thing if it disables most of the video processing, because isn't most of that processing actually making things worse? (e.g. 120 Hz "smooth motion," noise reduction; it just doesn't give you the raw image from the Blu-ray, etc.)
Game Modes generally turn off all advanced video processing, and the picture does suffer a little. You can do some minor tweaking, at least on Samsungs. What TV do you have? I have a Samsung UN65F9000 in the same kind of setup you're describing. I used to dig through the menu and turn on Game Mode every time, but now I just choose my input, long-press my remote's select button, change the input name to "PC," and play. It's the best way to reduce lag, and it's pretty quick. Changing the input to PC still lets you do a few calibration adjustments, but not many.
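To put the lag reduction in perspective, here's a quick back-of-the-envelope sketch. The lag figures used are hypothetical round numbers for illustration, not measurements of any TV mentioned above; the arithmetic just converts milliseconds of input lag into frames at a given refresh rate.

```python
def lag_in_frames(lag_ms: float, refresh_hz: float = 60.0) -> float:
    """Return how many displayed frames a given input lag corresponds to."""
    frame_time_ms = 1000.0 / refresh_hz  # one frame at 60 Hz is ~16.7 ms
    return lag_ms / frame_time_ms

# Hypothetical example: ~100 ms with full processing vs ~30 ms in Game/PC mode.
print(lag_in_frames(100.0))  # 6.0 frames behind at 60 Hz
print(lag_in_frames(30.0))   # 1.8 frames behind at 60 Hz
```

Even a few frames of difference is easy to feel in a fast game, which is why people bother switching modes at all.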
UN65F9000AFXZA + SEK2500U/ZA firmware version 1151.4
Denon AVR-X5200W, Apple TV, Sony BDP-S6200, PS4, Xbox One
(2) Klipsch RF-82IIs, (2) RS-62IIs, (4) RS-52IIs, (2) RS-42IIs, (1) RC-62II, (2) RB-61IIs
(1) SVS PB12-Plus, (1) Emotiva XPA-100, (1) XPA-200, (1) UPA-200
UN55HU8550, AVR-S900W, Ceton Echo, Apple TV, Panny BDT460, PC
(1) Klipsch G-42, (2) G-16s, (2) Bose 161, (1) BIC Acoustech PL-200