I hear about how it makes things smoother, yet at the same time I hear that it has no effect on HD gaming. Time to settle it here: does 120Hz matter to gamers?
I saw plenty of these while shopping for my HDTV and not once did I consider buying one. In fact, 120Hz was a factor in NOT selecting a model! I didn't like the way it affected the presented material; it just seemed a little too weird. They say it's more lifelike, but I doubt it. My girlfriend, who doesn't really give a rip about these new technologies, said she wouldn't watch it if we bought one.
I'm not sure if the "3D" effect is due to 120Hz or to other motion processing, but I definitely don't like it. Pretty sure that ain't what it looked like in the theater...
Quote:
Originally Posted by ebackhus /forum/post/13241196
I saw plenty of these while shopping for my HDTV and not once did I consider buying one. In fact, 120Hz was a factor in NOT selecting a model! I didn't like the way it affected the presented material; it just seemed a little too weird. They say it's more lifelike, but I doubt it. My girlfriend, who doesn't really give a rip about these new technologies, said she wouldn't watch it if we bought one.
I suppose it's mainly due to the frame rate my eyes have been used to for the past 25 years.
In some respects it's odd because the material being presented wasn't originally created at this higher frame rate, so the added frames, or quasi-interpolation, make it feel off. We watched Pirates of the Caribbean: At World's End and didn't much like it. It seemed too smooth to be a movie, and we felt that it didn't quite seem lifelike. It seemed more... rendered?
They also showed a demo with a stock ticker running at the bottom of the screen. On the side that had the 120Hz enhancement the text was slightly sharper. This seems to be a feature that would only benefit LCD television sets which aren't even on my list of potential buys. Contrast ratio! Blurring! Dead pixels! Oh my!
To OP: I have a Samsung 4071F, and with the 120Hz thing on, I do see a difference. For an easy test, rotate the in-game camera (easier to see in third-person games) in circles and you can definitely see that the motion is smoother than when you turn it off.
I am guessing that 120Hz mode blends two frames and creates a motion-blurred in-between frame.
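Roughly, that guess amounts to something like the sketch below (assuming two 8-bit grayscale frames as NumPy arrays; real sets do motion-compensated interpolation rather than a simple cross-fade, so this is only the naive version the post describes):

```python
import numpy as np

def naive_tween(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Return a 50/50 blend of two frames as the 'in-between' frame."""
    mixed = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return mixed.astype(np.uint8)

# Example: a bright square that moves 8 pixels to the right between frames.
a = np.zeros((64, 64), dtype=np.uint8); a[24:40, 8:24] = 255
b = np.zeros((64, 64), dtype=np.uint8); b[24:40, 16:32] = 255
tween = naive_tween(a, b)  # the overlap stays bright, the rest turns half-gray,
                           # which is the blurred "ghost" look people describe
```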
If you haven't seen 120Hz, I suggest you take a look. I personally did not like it, nor did my girlfriend. Since we spend the most time in front of the TV (our roommate has a TV upstairs and cats don't count), we went with what we felt best fit our budget, size preference, and picture preference.
Under fast motion you will soon begin to notice nasty artifacts of the interpolation. I was playing Halo 3 with the Samsung AMP on and my wife thought I had a "neat force field" around my character. It was actually just a weird-looking aura that was the result of poor interpolation.
The TV went back since at that time the AMP feature could not be turned off. I think they've since fixed that via firmware.
Quote:
Originally Posted by geister /forum/post/13274274
If you're the type that obsesses about aspect ratios, directors' intentions or experiencing a film as it was originally intended (whatever that means), then you will not prefer the 120hz/anti-judder tech because of the complaint that "film no longer looks like film".
It looks "alive". After adjusting to this effect, the traditional look of film appears somewhat less dimensional, lifeless or outdated by comparison.
Like color, now that this technology is available, it isn't worth buying an HDTV without it, IMO.
Would you say that any preference may also be due to the way some people perceive the images being produced? For instance, I'm sensitive to the rainbow effect, and that has been linked to having a higher flicker fusion threshold. My girlfriend and roommate claim they don't see the rainbow effect. In fact, nobody in my family sees it but me. Could it be that a higher threshold also makes the 120Hz "enhancement" less enjoyable? I do agree that film no longer looks like film, but I can't quite say it looks alive. When I'm out and about I tend to carefully examine my world (mark of a graphics/3D artist, I suppose) and have a keen sense for how motion and vision work in it. When I watch a television show that was shot at 29.97fps, I can tell. When I watch a movie at the theater that uses film, I can tell it's 24fps. When I play an older PlayStation game, I can tell that the video clips are running at 15fps.
Quote:
Originally Posted by kirknelson /forum/post/12088860
Most people can't see any difference beyond 24fps (48Hz), which is what movies are shot at, and I haven't heard of anybody who can really tell a difference at more than 30fps (60Hz). So I can't imagine that it would have any effect on anything except your electric bill.
If you think you can see a difference at 120Hz, the flickering of a fluorescent light must drive you crazy.
Quote:
Originally Posted by jhoff80 /forum/post/12107051
I've never actually seen one of these new TVs, but from a purely technical standpoint... how? (PC gaming is excluded from this, because PC games actually have the capability of running at refresh rates other than 60Hz.)
The source material stays the same framerate. So for example, if you're watching 30fps material, you're going to have to show each frame for four cycles on the monitor. I don't see how it could possibly have any effect when the net result is the same. Even for material recorded at 24fps (movies), 24 goes evenly into 120, so each frame just gets shown for five cycles; it's a 60Hz set that has the messier job, showing some frames for three cycles and some for two (3:2 pulldown). Either way, no new picture information is being added.
For PC games at least, you could set the refresh rate of your computer to 120hz, and if you have a fast enough video card, it is actually possible to have games running at 120 frames per second. Maybe then I could see the refresh rate making a difference, but it seems impossible in any other circumstances.
I'm not trying to just argue with you, but if there's some plausible explanation I'm missing out on, I'd love to hear it.
Isn't it just as possible that the difference you're seeing is akin to that of an older model TV compared to the newer generation, or quality differences between the various manufacturers?
When running games on your PC, let's take an example. You see a picture on the screen and move your mouse to make an alteration. At 60Hz, up to 16ms later a refresh is made to the display with the new coordinates; the response time of your display is, let's say, 6ms for the given color transition in the pixel we're looking at, which adds up to a total of about 22ms. Now let's assume we're running 120Hz: we'd end up with a total of about 14ms. That's pretty nice.
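Putting those numbers into a tiny formula (the 16ms and 6ms figures above are assumed values, not measurements): one refresh interval of waiting plus the panel's pixel response time.

```python
def worst_case_latency_ms(refresh_hz: float, pixel_response_ms: float) -> float:
    """Refresh interval plus pixel response: rough time from new frame to finished pixel."""
    return 1000.0 / refresh_hz + pixel_response_ms

print(worst_case_latency_ms(60, 6))   # ~22.7 ms, close to the 22 ms figure above
print(worst_case_latency_ms(120, 6))  # ~14.3 ms, close to the 14 ms figure
```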
I'm sure a lot of people realize this, but I'd like to mention it for those who may have overlooked it. The fact that 120Hz is divisible by both 24 and 30, allowing exact division for 24fps, 30fps, and 60fps playback material, makes sense, aside from the interpolation issues on the in-between frames. (In 3D applications, and hand-drawn animation for that matter, creating the in-between frames is called "tweening," by the way.)
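A quick back-of-the-envelope check of that divisibility point (just a sketch over the usual film/video rates):

```python
def cadence(source_fps: int, refresh_hz: int) -> str:
    """Describe how many refresh cycles each source frame gets on a given panel."""
    if refresh_hz % source_fps == 0:
        return f"{source_fps} fps on {refresh_hz} Hz: every frame held {refresh_hz // source_fps} cycles (even)"
    return f"{source_fps} fps on {refresh_hz} Hz: uneven cadence (3:2 pulldown territory)"

for fps in (24, 30, 60):
    for hz in (60, 120):
        print(cadence(fps, hz))
# 24 fps is the interesting case: uneven on 60 Hz, but an even 5 cycles per frame on 120 Hz.
```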
However, correct me if I'm wrong, but as far as gaming is concerned, since video games are always pushing the envelope against GPUs (even people with extreme SLI rigs push the settings up to use all that power), the framerate of a PC game varies during play depending on how much stress any particular scene puts on the hardware. So even if you could set your framerate to 120fps or enable vsync, it wouldn't keep perfect time with the refresh rate, especially during dips below 120fps. Game framerates usually run as fast as they can for any given scene, and that rate varies with the scene's complexity (number of polygons being displayed, 'special effects' happening, etc.). Perhaps if your framerate never dipped below 120fps and was synced to the refresh rate this would work, but I don't know many people running modern games at the max settings their PC can handle whose framerate never goes below 120fps. If you artificially cap it at 60fps you would get 'tweened' interpolated frames and the artifacts people are describing. And then you would have to consider what framerate consoles run at and how they implement it if you play console games.
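To illustrate the vsync point (the render times below are made-up numbers, not benchmarks): with vsync on, a frame that misses a refresh window waits for the next one, so the displayed rate drops in whole steps rather than gradually.

```python
import math

def displayed_interval_ms(render_ms: float, refresh_hz: float) -> float:
    """Smallest whole number of refresh intervals that covers the render time (vsync on)."""
    interval = 1000.0 / refresh_hz
    return math.ceil(render_ms / interval) * interval

print(displayed_interval_ms(8.0, 120))  # 8.33 ms -> the 120 fps pace holds
print(displayed_interval_ms(9.0, 120))  # 16.67 ms -> that frame effectively runs at 60 fps
```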
I'd also like to say that blurring and afterimages are ugly on many LCDs, so anything that would eliminate them would be welcome if it didn't produce artifacts. I recently bought a 2ms desktop LCD screen to replace a 12ms one, and the blurring is gone now. I hooked up Rock Band (PS3) via HDMI to the 12ms LCD last night and the fast-scrolling lyrics were ghosting, which is very annoying.
Quote:
Originally Posted by bkchurch /forum/post/12150592
No, I'm sorry, but an explanation is definitely required. I admittedly have not seen a 120Hz panel in action yet, but there is no logic (that I can see) as to why the image would be "more 3D". I'm being condescending because in the world of tech "it just does" is not a relevant answer, and I'm calling BS. I'm not trying to insult you, but people do have a tendency to create reasons to justify their purchases, and I'm not calling you a liar because, more often than not, people honestly believe the BS they have conjured up; it's human nature.
I remember back when 1080p was the new big thing, a lot of people were running around spouting off BS about how the higher pixel density created better-looking colors and superior viewing angles, which is obviously a bunch of bull. I'll say again that I'm not calling you a liar, but the idea that adding nonexistent frames to an image makes it look "more 3D" (a pretty vague description, I might add, but I assume you mean the picture appears to have more depth) just doesn't make sense. That's why I'd like a technical explanation before I believe it.
Edit: I'm also going to add that saying anything makes as big a difference as the jump from SD to HD is an incredibly lofty claim. I don't think I'm alone in not believing a 120hz display could possibly make that kind of difference.
Wow, you have to love people who haven't even seen this for themselves telling people they're wrong. Everyone I have shown this to on my Sony can't believe how different it looks.
The motion interpolation feature will cause input lag. The ideal is to display each frame as soon as it is rendered. However, when interpolation is on, the display has to wait and see the next frame in order to render the "tween" frame. Now, whether this lag is noticeable or not will depend on the game and how sensitive the person playing is to it.
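As a ballpark for that extra wait (assumed numbers; actual sets may buffer more than one frame and add their own processing time): the set has to hold the current frame until the next source frame arrives before it can build the tween.

```python
def interpolation_lag_ms(source_fps: float, processing_ms: float = 0.0) -> float:
    """Minimum added delay: one source-frame interval plus any processing time."""
    return 1000.0 / source_fps + processing_ms

print(interpolation_lag_ms(30))      # >= ~33 ms added for a 30 fps game
print(interpolation_lag_ms(60, 10))  # >= ~27 ms for a 60 fps game plus 10 ms of processing
```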
As for what is objectionable about it, it's a purist issue, just like colorizing black and white movies or processing stereo sound into multichannel. When it is turned on, it alters the source material and movies cease to look like movies. That's what I don't like about it, but that's just my opinion and I know some people love this feature.
Also, I wish people would stop thinking that 120Hz is synonymous with the interpolation. A 120Hz refresh rate is useful for reducing motion blur (even without interpolation) and for doing 5:5 pulldown to get rid of judder. However, some sets don't do the 5:5 pulldown, unfortunately.
Quote:
Originally Posted by greenman /forum/post/12594277
To the people who think they can tell a difference: can you see your light bulbs flicker 50-60 times a second? If so, then maybe you would benefit from a 120Hz TV.
I would say a light bulb's flicker period is longer than a 4-6ms panel response time. And if you really need to know, some people get migraines from working in environments lit by fluorescent lightbulbs.
PS: this is waaaaay off-topic, but do you know why fluorescent lights are avoided in machine rooms? Machine parts rotating at the same frequency as the flicker of the fluorescent lights may appear NOT to be moving. People have lost limbs because of this effect!
The 120Hz itself doesn't add the smoothing; it's the frame interpolation options like Auto Motion Plus and Motionflow.
With those on, it DOES make 30fps games look more like 60fps. For games that are already 60fps, it doesn't add anything new, except lessening the blur when panning the camera, and not by much.
With interpolation on, yes, 120Hz will smooth motion depending on the framerate of the game. However, the processing required for the interpolation will result in lag between button pushes and action on-screen. Frame interpolation ("AMP", "Motionflow"...) is intended for movies/TV and not recommended for gaming.
With interpolation off, console gaming on a 120Hz set isn't much different than gaming on a 60Hz TV.
Quote:
Has anyone tested this with their 120hz set on Console games?
Game Mode on the 120Hz sets I have seen reverts the display to 60Hz, making the 120Hz useless.
As for using frame interpolation in games, I had NO issues while playing the majority of games, even with the slight delay. It's not a huge lag. Again, I find it ridiculous that people would even complain about button lag with frame interpolation on. I played game consoles online with no problem.
It might be an issue with time sensitive games like Guitar Hero, but for most games, no.
Quote:
Originally Posted by Shin CZ /forum/post/15469768
As for using frame interpolation in games, I had NO issues while playing the majority of games, even with the slight delay. It's not a huge lag. Again, I find it ridiculous that people would even complain about button lag with frame interpolation on. I played game consoles online with no problem.
I think we have the same TV (Samsung 650?), and I really don't know how you can say this! ^^^
I noticed the lag right off the bat after getting this 120Hz TV: the first thing I did was hook up a PS3 via HDMI and play Ratchet and Clank at the input's default setting. There was a very apparent timing gap between pushing "X" and Ratchet jumping. The gap is less than a second, but the level I was on was the one where Ratchet slides along a rail and jumps over obstacles, requiring split-second timing. This was very difficult with AMP on at the default setting. Once I turned game mode on, which turns AMP and the processing off, it became much easier, and jumps happened right when I pushed the button, not after.
I'll agree that for other parts of the game it doesn't make that much of a difference: average movement on the screen with interpolation doesn't seem too laggy. But for any game that requires a button push at a precise moment (not all games require this, but most do, at least occasionally), frame interpolation will kill that gaming experience.
I have a 52" Philips 120Hz LCD and use it to play Xbox 360, PS3, and PC. I live in a single apartment and watch/play from 6-8 ft away. I never had any problems before, and honestly my previous plasma had a better image (no surprise). Buuuut... size matters, and the DNM feature really works well in games. I don't know if they have this feature on 60Hz sets, but the "soap opera" effect makes the graphics stand out. GT5 Prologue and MGS4 look beautiful. COD4 never looked better.