
1 - 20 of 132 Posts

·
Registered
Joined
·
65 Posts
Discussion Starter · #1 ·
I just spent the weekend helping my father-in-law set up his new 55" Samsung Series 8000 LED 240Hz TV. The TV itself is very nice, but let me say two things that have nothing to do with Samsung specifically:


1) 240Hz is HORRIBLE!!! If you are familiar with the history of television, there was a process before videotape called kinescope: TV shows were shot with video cameras, but to save and archive what was shot, film cameras were pointed at TV monitors and filmed the screen. This was the only way to preserve a show before videotape existed, and the result was a strangely dimensional image. That is what 240Hz looks like to me. The Samsung has settings to adjust the 240Hz processing: Off, Clear, Normal, Smooth, Custom and Demo. Clear is the lowest setting, which was barely acceptable but still had the artifact. Set to Normal or Smooth, the TV image was so bad it drove me nuts. Some people say 240Hz makes a show look like it was shot with a home video camera, kind of like a cheap soap opera. I totally agree. I don't think I've read any review of 240Hz TVs that mentions this artifact. I noticed it on film-shot content, whether from DVD/Blu-ray disc or a film on Turner Classics; HD video-shot content looked OK. Anyway, I do not like the 240Hz feature.


2) The Series 8000 is an edge-lit LED. I personally didn't like what LED does to colors; the light from LEDs does not look natural, and to me the colors do not look pleasant. Have you seen a string of LED Christmas lights at night? Yes, they are bright, but to me the colors are cold, harsh and unnatural. The Samsung had an exceptionally bright display, which is great, and HD channels with video-shot content looked good (Discovery HD's "Planet Earth"), but LED colors are way over the top. I watched a few NBA games Sunday in HD, and I am very familiar with the uniform colors; LED made the royal blue of the LA Clippers uniforms look like cobalt blue. Oh, by the way, I did use Video Essentials to help set up the TV as best I could. Maybe there is something I didn't do, but I tried everything.
 

·
Registered
Joined
·
360 Posts
Your first complaint has nothing to do with 240Hz itself. You can adjust the judder and blur reduction independently to eliminate the cartoonish look and/or artifacts. As for edge-lit sets, we're in total agreement: edge-lit LED backlighting makes colors look unquestionably like crap, especially on large LCD panels.
 

·
Registered
Joined
·
351 Posts

Quote:
Originally Posted by Dan Filice /forum/post/18058697


I just spent the weekend helping my father-in-law setup his new 55" Samsung Series 8000 LED 240hz TV. The TV itself is very nice, but let me say two things that have nothing to do with Samsung:


1) 240hz is HORRIBLE!!! If you are familiar with the history of television, there was a process before videotape called Kinescope, where TV shows were shot with video cameras, but to save and archive what was shot, film cameras were used to film the TV monitors. This was the only way to preserve the TV show as videotape wasn't available yet. The result was a strangely dimensional image. This is what 240hz looks like to me. The Samsung has settings to adjust the 240hz: Off, Clear, Normal, Smooth, Custom and Demo. Clear is the lowest setting, which was barely acceptable, but still had the artifact. When set to Normal or Smooth, the iTV image was so bad it drove me nuts. Some people say 240hz makes the TV show look like it was shot with a home video camera, kind of like a cheap soap opera. I totally agree. I don't think I've read any review of 240hz TV's that mention this artifact. I noticed this artifact on film-shot content, whether from DVD/BR disc or watching a film on Turner Classics. HD video-shot content looked OK. Anyways, I do not like the 240hz feature.

I noticed this very thing on a Sony flat screen I was looking at a while ago. It wasn't 240Hz, but still. One of the National Treasure movies was playing on it, Blu-ray I believe, and it looked like it was shot with a video camera. I hated it. It was super sharp and super clear, but it just didn't look natural; it wasn't right. I took the remote and turned all the "enhancements" off, and all was well: it looked like a movie again. Why is it that the settings on these TVs that are supposed to make the picture better only ever seem to make it worse?
 

·
Registered
Display: TCL 65R617, AVR: Denon X2300W, Source: PS4 Pro, Fronts: Infinity Primus 250
Joined
·
2,913 Posts

Quote:
Originally Posted by loki993 /forum/post/18060689


I noticed this very thing on a Sony Flat Screen I was looking at a while ago, not 240hz, but still. One of the National Treasure movies was playing on it, Blu Ray I believe, and it looked like it was shot with a video camera, I hated it. I mean it was super sharp and super clear, but it just didn't look natural, it wasn't right. I took the remote and turned all the "enhancements" off and all was well, it looked like a movie again. Why is it that the settings on these TVs that are supposed to make the picture better only ever seem to make it worse?

Because the general consumer isn't an A/V enthusiast and has no idea what's supposed to look "proper." They just go by their eyes and what looks good to them. And what looks good to the majority is extremely bright whites, overly saturated colors (I admit, I like those to an extent... anime fan here) and very clear, sharp pictures, even if it compromises the look of film.


Manufacturers build these displays for the average consumer. They need to do so in order to sell as many units as possible. It costs a lot of money to run these giant corporations.


I don't blame them. You know why? Because true enthusiasts have the means at their disposal to tune these displays to their specifications. It's built into the sets. You can take that edge-lit LED and have it calibrated to ISF specs if you want to; the only thing stopping you is a few hundred dollars. Besides, don't you love unboxing a new TV, plugging it in, cringing at the default settings, then tinkering with it for hours on end to get it just right?


I know I do!
 

·
Registered
Joined
·
65 Posts
Discussion Starter · #6 ·
The Samsung has settings within "240Hz." The last one is called "Custom," which lets you use sliders to adjust how much motion-blur and judder reduction you want the TV to apply. I messed around with these for hours to no avail.


I agree with the comments that these TVs are manufactured for the "average" consumer. When my father-in-law saw me mulling over the settings on his new TV, he said he didn't notice anything odd or strange, and he wanted his new TV to give him the "maximum of everything possible." So... I just set everything back to factory default and let it ride. The Geek Squad folks came out Tuesday and were supposed to help adjust everything. Right. If this were my TV, I would have done the ISF calibration thing.


I have two 120Hz TVs and I do not see any weird imaging on my sets. Maybe I have the level of motion-blur processing tweaked down correctly. I just could not find any setting on the Samsung that corrected the strange image processing.
 

·
Registered
Joined
·
190 Posts

Quote:
Originally Posted by NuSoardGraphite /forum/post/18060768


Besides, don't you love unboxing a new TV, plugging it in, cringing at the default settings, then tinkering with it for hours on end to get it just right?


I know I do!

One of my favorite things about getting new stuff, be it music gear, audio equipment, video gear, or any component for my computer, is the setup and tinkering.
I am definitely in that camp; getting it just right is half the fun!
 

·
Registered
Joined
·
6,225 Posts
There are two factors at work here. The first is your unfamiliarity with the settings for the frame interpolation engine, which is pretty much covered in the messages above. If you spend some more time and do a basic calibration, I think you will find the result more pleasing.


The second effect is more psychological than anything else. You grew up watching CRT direct-view televisions which had the following problems:


1) Low resolution images.

2) 24fps film source (predominantly), telecined to 60Hz and interlaced. (Some video.)

3) Colored light produced by glowing phosphors.

4) Artifacts aplenty from interlacing and telecining. (Definitely present, but difficult to see on a low-resolution TV.)


Now you have an HDTV which has:


1) Much higher source resolution.

2) Film, videotape, and digital source material.

3) Pure colored light produced by semiconductor junctions on LEDs.

4) A much better view of all the artifacts you had before on SD material, plus resolution scaling artifacts. You also have frame interpolation artifacts on both SD and HD material. Some artifacts are added by your HDTV, some are present in the HD signal, and you can see all of them much clearer than before.


But the point is there are lots of changes, and they are unrelated to 240Hz. If you want my opinion after viewing many HDTVs, the 60Hz sets, old and new, look the worst. But there is a paradigm shift needed to appreciate frame-interpolating sets, which many simply cannot make, or, to be more precise, are unwilling to try to make.
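As an aside, the 3:2 telecine cadence mentioned in point 2 can be sketched in a few lines. This is an illustrative simplification (real telecine splits frames into interlaced fields, not whole-frame repeats), but it shows how 24 film frames map onto 60 fields per second:

```python
# 3:2 pulldown: each pair of 24fps film frames becomes 5 fields (3 + 2),
# so one second of film (24 frames) yields 60 fields, i.e. 60Hz interlaced.

def telecine_32(frames):
    """Repeat frames in a 3, 2, 3, 2, ... cadence and return the field list."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

fields = telecine_32(list(range(24)))  # one second of 24fps film
print(len(fields))  # 60 fields per second
```

The uneven 3/2 repetition is exactly where judder comes from: alternate film frames are held on screen for different lengths of time.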
 

·
Registered
Display: TCL 65R617, AVR: Denon X2300W, Source: PS4 Pro, Fronts: Infinity Primus 250
Joined
·
2,913 Posts

Quote:
Originally Posted by Gary McCoy /forum/post/18062281




The second effect is more psychological than anything else. You grew up watching CRT direct-view televisions which had the following problems:


1) Low resolution images.

2) 24fps film source (predominantly), telecined to 60Hz and interlaced. (Some video.)

3) Colored light produced by glowing phosphors.

4) Artifacts aplenty from interlacing and telecining. (Definitely present, but difficult to see on a low-resolution TV.)


Now you have an HDTV which has:


1) Much higher source resolution.

2) Film, videotape, and digital source material.

3) Pure colored light produced by semiconductor junctions on LEDs.

4) A much better view of all the artifacts you had before on SD material, plus resolution scaling artifacts. You also have frame interpolation artifacts on both SD and HD material. Some artifacts are added by your HDTV, some are present in the HD signal, and you can see all of them much clearer than before.


This is exactly right. A lot of people don't consider the fact that they have just moved up from 307,200 pixels (on a 4:3 SDTV) to 2,073,600 pixels (at 1080p), almost 7 times the resolution. You're going to see things you weren't able to see before. This is why cable looks so crappy: all the compression artifacts have always existed but were invisible at the lower resolutions we used to view them at. The absolute best way to see what issues your TV has is to get a good-quality Blu-ray or HD DVD player and play a high-def disc at your TV's native resolution. At that point you can't blame anything on compression, resolution/upscaling issues or source issues.
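The pixel math above is easy to verify. A quick back-of-the-envelope check, assuming the common 640x480 figure for square-pixel 4:3 SD:

```python
# Pixel-count comparison between a 4:3 SD image and a 1080p HDTV panel.
sd_pixels = 640 * 480        # 307,200 pixels (4:3 SDTV)
hd_pixels = 1920 * 1080      # 2,073,600 pixels (1080p)

ratio = hd_pixels / sd_pixels
print(sd_pixels, hd_pixels, ratio)  # ratio is 6.75, i.e. "almost 7 times"
```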


My TV plays Blu-ray movies perfectly. The picture is phenomenal, so any problems from other sources are inherent in those sources and not my TV. I don't agonize over how horrible cable looks anymore.


However, in the case of frame interpolation, as others have said, it's a matter of taste, really. You will either get used to it or hate it forever. If you hate it, turn it off. If you can't turn it off (is there a TV where you can't?), then a new TV is in order, I think.
 

·
Registered
Joined
·
65 Posts
Discussion Starter · #11 ·

Quote:
However in the case of Frame Interpolation, as others have said, its a matter of taste really. You will either get used to it, or hate it forever. If you hate it, turn it off.

The Samsung was my father-in-law's TV. Yes, I could turn off the 240Hz processing, and I would have done so if it were my TV, but he just spent $3000 on this TV and he wanted the "maximum of everything he paid for." So I had no choice in the matter. The point that is sadly missing here is that many of the new TVs are promoting 240Hz as the greatest invention. Turning off a feature the TV companies are spending zillions to promote isn't what they want people to do with their TVs. My point is that this promoted feature has flaws.


In regard to having a much better view of artifacts with HDTV components: I have 3 Blu-ray players and 3 HDTVs, and I see all of the artifacts you and the other poster mention, and I accept them or have eliminated them with tweaking. My components do not give me bad artifacts that I can't eliminate or adjust. It seems the only solution for the 240Hz artifacts is turning the feature off. But that would piss off my father-in-law, since he just spent a bundle on his TV and he wants the 240Hz. He is the "average" consumer these TVs are being marketed to.
 

·
Registered
Joined
·
190 Posts
They always market stuff that gets you farther from a true picture. That's just what they do. There are only two real avenues for them to go down: make the backlights/panels/supporting hardware better (which they do) and add more features (which they also do). It doesn't matter if the features are silly and make things look cartoonish (they are, and they do); as soon as one manufacturer touts a new feature, EVERYONE had better have it or they're "behind." So it goes. At least they let us turn the silly me-too stuff off.


Is it easier to market "Improved scaler processing efficiency for 15ms lower input lag on non-native sources compared to the previous generation, with improved pixel matrix architecture on our current generation of LCD panels for a small but real improvement in viewing angles!" or "MotionPlus technology for lifelike motion!"?


Tell your father-in-law he shouldn't let them determine his preferences for him. They already have his money; they don't need his mind, too. If he gives them that, he's just being way too generous.


Edit: Of course, there are really three avenues for them to go down, the third being "and do it cheaper," which they always try to do, sometimes with really disastrous results... but that's another discussion.
 

·
Registered
Display: TCL 65R617, AVR: Denon X2300W, Source: PS4 Pro, Fronts: Infinity Primus 250
Joined
·
2,913 Posts

Quote:
Originally Posted by Dan Filice /forum/post/18063907


The Samsung was my father-in-law's TV. Yes, I could turn off the 240hz processing and I would have done so if it was my TV, but he just spent $3000 on this TV and he wanted the "maximum of everything he paid for." So, I had no choice in the matter. The point that is sadly missing here is that many of the new TV's are promoting 240hz as the greatest invention. Turning off a feature that the TV companies are spending zillions on to promote isn't what they want people to do with their TVs. My point is that this promoted feature has flaws.

I have a buddy who has the Samsung A650, and he has his AutoMotion Plus turned on. I turned it down to "low" when I adjusted his settings for him and didn't tell him I did it. I noticed that the frame interpolation definitely had a negative impact on film-based movies: we watched Back to the Future with the AMP on, and it made the movements in the film wonky. However, my buddy loves hockey, and the AMP (on low) didn't have a negative impact on the movement... in fact, it smoothed out the motion just enough to reduce blurring to the point where it was unnoticeable. In some cases the frame interpolation that comes with 120+Hz TVs can be beneficial, and in a lot (most?) of cases it won't be. Hopefully one day they will make a smart TV that can tell the difference between film- and video-based programming and engage interpolation automatically based on what's playing.

Quote:
In regards to having a much better view of artifacts if using HDTV components, I have 3 BR players and 3 HDTV's, and I see all of the artifacts you and the other poster mention, and I accept them or have elimininated them with tweaking. My components do not give me bad artifacts that I can't eliminate or adjust. It seems the only solution for the 240hz artifacts is turning it off. But that would piss off my father-in-law since he just spent a bundle on his TV and he wants the 240hz. He is the "Average" consumer that these TVs are being marketed to.

Well, keep in mind that the frame interpolation in 120+Hz sets was created for two reasons: #1, to reduce judder from 24fps material, and #2, to reduce the motion blur inherent in LCD HDTVs. Frame interpolation is very proficient at both of those things, at the expense of sacrificing the integrity of film-based content. The best combination is modest amounts of frame interpolation plus video-based material; those work great together. But frame interpolation plus film equals the soap-opera effect (making film look like video), and too much frame interpolation destroys any sense of realistic movement in both film and video. Maybe you can try explaining this to your father-in-law, but I suppose it would probably be as pointless as such an explanation would be to my buddy.
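For what it's worth, the basic idea behind interpolation can be sketched in a few lines. This is a deliberate oversimplification: real engines estimate per-block motion vectors, while the plain linear blend below only illustrates the concept of synthesizing frames that never existed in the source.

```python
# Conceptual sketch of frame interpolation: generate intermediate frames
# between two real source frames. NOT how any actual TV implements it;
# real sets use motion-compensated interpolation, not pixel blending.

def interpolate(frame_a, frame_b, t):
    """Blend two frames (flat lists of pixel values) at position t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Two neighboring 24fps film frames; insert synthetic steps between them
# to fill a higher-rate cadence. Only the first and last are "real".
f0 = [0, 0, 0]
f1 = [100, 50, 10]
sequence = [interpolate(f0, f1, i / 4) for i in range(5)]
```

The synthetic in-between frames are what smooth out judder, and they are also exactly what makes 24fps film stop looking like film.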
 

·
Registered
Joined
·
65 Posts
Discussion Starter · #14 ·

Quote:
Maybe you can try explaining this to your Father-in-Law but I suppose it would probably be pointless as such an explanation would be to my buddy.

Agreed, and what I may do is quietly change his TV settings without telling him. He may not notice the improvement, but I'll be able to sleep.
 

·
Registered
Joined
·
1,170 Posts

Quote:
Originally Posted by Dan Filice /forum/post/18061348


The Samsung has settings within "240hz" which allows one to use sliders to adjust the amount of motion blur and judder reduction you want the TV to process.
Quote:
Originally Posted by Dan Filice /forum/post/18063907


Yes, I could turn off the 240hz processing and I would have done so if it was my TV, but he just spent $3000 on this TV and he wanted the "maximum of everything he paid for." So, I had no choice in the matter.


if the effect is fully adjustable and/or can be turned off i don't understand what the gripe is?

you "have no choice in the matter" because your dad prefers watching his tv that way?...so blame samsung?

options are good unless you're running a dictatorship. make your case and then let him decide.
 

·
Registered
Joined
·
6,225 Posts
"They always market stuff that gets you further from a true picture."?


Have you not noticed that lots of movies don't exist on film today? They are shot on videotape, processed on digital servers, and shown on digital projectors. Not to mention animated movies, videotaped HD programs, Web content, or video games, which never were film. With sources like that, anything that gets you further from the look of film, which is what you are actually talking about, is not a bad thing.


This is not the world of SDTV anymore, where film was the medium most used. Digital media looks best on fast frame rate displays IMHO. Meaning it looks more like real life and less like the classic look of film.
 

·
Premium Member
Joined
·
2,787 Posts

Quote:
Originally Posted by Gary McCoy /forum/post/18064700


"They always market stuff that gets you further from a true picture."?


Have you not noticed that lots of movies don't exist on film today? They are shot on videotape, processed on digital servers, and shown on digital projectors. Not to mention animated movies, videotaped HD programs, Web content, or video games, which never were film. With sources like that, anything that gets you further from the look of film, which is what you are actually talking about, is not a bad thing.


This is not the world of SDTV anymore, where film was the medium most used. Digital media looks best on fast frame rate displays IMHO. Meaning it looks more like real life and less like the classic look of film.

You continue to confuse things that ultimately have nothing to do with one another. Most movies are still shot on film first, and there's loads of information on IMDb to support that statement. Many prime-time shows are also shot on film. Media that originates digitally is still in the very small minority. Not to mention that no one is capturing material above 30 progressive frames/sec or 60 interlaced fields/sec, so your entire argument for the appropriateness of "fast frame rate displays" is complete nonsense.


Either way, you seem to be arguing for some sort of connection between "old" capture methods and "old" display methods versus "new" capture methods and "new" display methods where none exists. Display technology isn't driving directors to use different media any more than different media is driving the use of new display technology. Any display should simply reproduce what's fed to it, regardless of where that material originated, and yet no single display does this.
 

·
Registered
Joined
·
828 Posts

Quote:
Originally Posted by Gary McCoy /forum/post/18064700


"They always market stuff that gets you further from a true picture."?




This is not the world of SDTV anymore, where film was the medium most used. Digital media looks best on fast frame rate displays IMHO. Meaning it looks more like real life and less like the classic look of film.

Again, you keep touting 240Hz processing as more "life-like." Besides the fact that this is only your opinion (and you are in the vast minority), your comment also assumes that video on a plasma doesn't look life-like. Well, IMO, video on plasma looks more life-like than on an LCD, especially ones with 240Hz processing.


60Hz is more than enough for a person not to notice motion blur on a capable TV (CRT, plasma, etc.) while maintaining full HD resolution. Because LCD cannot even handle this properly, it employs 120Hz or 240Hz processing to reduce motion blur. This, in and of itself, doesn't produce any major distractions (except of course decreased motion resolution on an LCD), but displaying something twice as fast on an LCD, when it already looks smooth on a plasma, doesn't help anything either.


So what's the solution? Frame interpolation. Frame interpolation is the real culprit in this nonsense processing. It artificially generates frames to link the actual frames together to make motion look smoother (and more life-like, according to you).


NOTHING on TV, IMO, looks as fake as this BS processing. Smoother? Yes (for films), but it makes films look ridiculously fake and un-film-like (sorry, film should look like film). Smoother on video? Not really, because, again, 60Hz is plenty on a good set without motion blur issues; that's why video is SHOT at 60fps. So why bother with frame interpolation? To make video look more video-like?


So here are the options:


1) Plasma, which has NO issues at all with motion blur or motion resolution on video or film, and looks far more "life-like," IMO, than any LCD I've ever seen


2) LCD, which has issues with motion blur and motion resolution on everything at 60Hz, film and video


or


3) LCD with 120/240Hz processing, which still doesn't achieve full motion resolution (excerpt from S&V's review of the Panasonic 58V10: "Motion resolution was superb as well, with the TV retaining full motion HD resolution on scrolling test patterns - a result I've never seen with any LCD TV, even ones with a 120Hz refresh rate")


or


4) LCD with 120/240Hz processing AND frame interpolation, which makes video look like what it already looks like on a plasma and makes film look like it was shot on an Insignia home video camera.



Take your pick.



P.S. EVERYTHING HogPilot wrote above is also true
 

·
Registered
Joined
·
190 Posts

Quote:
Originally Posted by Gary McCoy /forum/post/18064700


"They always market stuff that gets you further from a true picture."?


Have you not noticed that lots of movies don't exist on film today? They are shot on videotape, processed on digital servers, and shown on digital projectors. Not to mention animated movies, videotaped HD programs, Web content, or video games, which never were film. With sources like that, anything that gets you further from the look of film, which is what you are actually talking about, is not a bad thing.


This is not the world of SDTV anymore, where film was the medium most used. Digital media looks best on fast frame rate displays IMHO. Meaning it looks more like real life and less like the classic look of film.

Fidelity to the source is the best we can hope for. Beyond that, you get into the realm of totally arbitrary adjustments made however you feel. Which is fine, but any source that actually had artistic vision or direction behind it is compromised by your fiddling. If you don't care, have at it. If you do, a calibration is in order, and take care to disable most of those fidelity-compromising processing options.
 

·
Registered
Joined
·
6,229 Posts
Gary pops into many threads spouting the same crap about video. Face it, the dude likes stuff shot digitally, and digitally only. He and several others are unaware of the nuance film gives to the storytelling. If it's not razor-sharp and video-flat looking, they are cranking their sets' adjustments to make it so. That's what those worthless controls are for: people who like neon colors and videotape. They are welcome to it, but don't insist that it's the PROPER way to watch anything but cartoons.
 