
Is the 60Hz power grid related to HDTVs' 60Hz refresh rates?

post #1 of 22
Thread Starter 

Lately I have been reading a lot of what I believe is misinformation, such as claims that HDTVs support native 24p playback. In fact, a trick called pulldown is applied when that kind of content is played back.

 

However, after researching, I can't find an answer to this anywhere: is the 60Hz refresh rate of an HDTV related or linked to the 60Hz AC power grid, as it was with CRT tubes? Or is the 60Hz refresh rate just something that became standard for TV footage?

post #2 of 22
Thread Starter 

The claims I consider misinformation are that 24p can be played back natively, without any pulldown. No TV can do that, right? The 24p support is the pulldown you are mentioning, so it is not natively playing back frame by frame as in A, B, C, D, E...

 

But my question is in another context: in CRTs, the power grid was used to sync the timing signal on the tube, so TVs had a refresh rate of 60Hz (or 50Hz). The question is whether this applies to modern HDTVs: do HDTVs run at 60Hz because of any relation to the power grid, or because it became a standard refresh rate in the CRT era?

 

... and if there is any relation between 60Hz AC and the refresh rate of an HDTV (LCD), what exact kind of relation would it be?

post #3 of 22
Thread Starter 

OK, so modern TVs follow the 60Hz refresh rate as a convention and to maintain compatibility with most content.

My question was whether the power grid still has an influence on the refresh rate of LCD TVs, as it did with CRT tubes.

post #4 of 22
Modern TVs are DC internally, so 60Hz AC power really has no bearing on them as far as displaying a picture goes. CRTs operated on AC, so their refresh had to align with the frequency of the utility power.
post #5 of 22
Quote:
Originally Posted by benes View Post

Actually there are TVs that have a native 24Hz playback mode. And when you turn this mode on, you'll realize why it's a very, very bad idea. A frame rate that low HAS to be converted to a higher refresh rate, otherwise it would be unwatchable. Again, this is exactly what happens with film projection. Nothing is ever shown at native 24Hz.

LCDs don't flicker like film projectors. The image is held for 1/24 of a second. In a film projector, the image is flashed 2 or 3 times per frame, not held.
post #6 of 22
Quote:
Originally Posted by benes View Post

Most LCDs today are 120Hz or higher.

No LCD HDTV today is capable of more than 120Hz reliably; pixel response isn't that fast yet.
And pretty much only 3D-capable LCD TVs have real 120Hz; everything else is only 60Hz.
The fastest gaming-oriented monitors are 120/144Hz.

There's no relationship between the power grid and LCD refresh rate.
post #7 of 22
Thread Starter 

OK, the guy who explained that modern TVs are DC internally pretty much hit the nail on the head about how it bears no relation to the power grid.

post #8 of 22
My TV is capable of 24p playback for Blu-rays, so it maintains the natural 5:5 cadence without any pulldown.
post #9 of 22

Top: 24fps original frames. Next: the same film projected, with a shutter cutting the light to the screen. Next: a traditional 60Hz TV with 2:3 pulldown. Next: 72Hz and 120Hz displays, which show multiples of 24.

http://reviews.cnet.com/8301-33199_7-57588438-221/what-is-1080p24/
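
To make the 2:3 cadence in that graphic concrete, here is a small sketch (purely illustrative; the function name and frame labels are made up) of how 24 film frames map onto 60 display refreshes:

```python
# Illustrative sketch of 2:3 pulldown: alternate film frames are held for
# 2 and then 3 display refreshes, so 24 frames become 24 * 2.5 = 60 refreshes.

def pulldown_2_3(film_frames):
    """Return the refresh-by-refresh sequence produced by 2:3 pulldown."""
    refreshes = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3
        refreshes.extend([frame] * repeats)
    return refreshes

frames = ["A", "B", "C", "D"]            # four film frames = 1/6 s of 24fps film
print(pulldown_2_3(frames))              # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(pulldown_2_3(frames)))         # 10 refreshes = 1/6 s of a 60Hz display
```

The alternating 2- and 3-refresh holds are what produce the slightly uneven motion (judder) of pulldown.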
post #10 of 22
^^^^ nice graphic.
post #11 of 22
It's also important to note that the limitation of CRTs to a multiple of line frequency existed only in the beginning. Technology obviously progressed since then, as computer monitors had no such restriction. It was just the technology of the time that made a multiple of line frequency necessary.

So, a technological limitation gave birth to a video standard, which then imposed that limit back onto new technology.
post #12 of 22
Thread Starter 
Quote:
Actually there are TVs that have a native 24Hz playback mode. And when you turn this mode on, you'll realize why it's a very, very bad idea. A frame rate that low HAS to be converted to a higher refresh rate, otherwise it would be unwatchable. Again, this is exactly what happens with film projection. Nothing is ever shown at native 24Hz.

 

Ok, I need further clarification on this...

 

It doesn't make any sense for TVs to let the user bring the display down to 24Hz just to see that native 24Hz is horrible to watch. Actually, you are saying it is UNwatchable for anyone, right? Because some folks do claim that they "prefer it that way", that they "don't notice one thing", and that it is "even better than any other rate". Yes, I am coming across these types of folks on the net, and they're pretty irritating. You do mean that native 24Hz is UNwatchable for any human being, right? These guys are out of their minds.

 

I tried bringing my PC panel down to 24Hz to see what it was like, and I can tell you it is hard even to look at - fonts go shaky and flicker reigns. So I guess the effect would be the same on an HDTV LCD panel.

 

But then again...

 

If something HAS to be converted to a higher refresh rate, why do HDTVs allow the user to bring it down to "see the effect"? Wouldn't that kind of thing even be flagged as a risky operation in the user manual?

 

And you end by saying that nothing is ever shown at native 24Hz - well, only as long as the user doesn't turn that mode on, correct?

post #13 of 22
You seem confused. What source are you referring to: broadcast TV, Blu-rays, cable/sat?
post #14 of 22
Thread Starter 

I am specifically talking about the ability of Blu-ray players to send 24p output to the HDTV when playing a BD movie. It seems that in the past we didn't have that option, and now folks do. That is the context of what I am saying.

post #15 of 22
Quote:
Originally Posted by kraftytwo View Post

The claims I consider misinformation are that 24p can be played back natively, without any pulldown. No TV can do that, right? The 24p support is the pulldown you are mentioning, so it is not natively playing back frame by frame as in A, B, C, D, E...

But my question is in another context: in CRTs, the power grid was used to sync the timing signal on the tube, so TVs had a refresh rate of 60Hz (or 50Hz). The question is whether this applies to modern HDTVs: do HDTVs run at 60Hz because of any relation to the power grid, or because it became a standard refresh rate in the CRT era?

... and if there is any relation between 60Hz AC and the refresh rate of an HDTV (LCD), what exact kind of relation would it be?
Actually, the 60Hz power line has had absolutely nothing to do with TV since the 1950s, when color came along. NTSC started out at 60Hz so that if there was any hum on the video signal, it would appear as a stationary line and not roll through the picture. Below is an extreme example of 60Hz hum on a 59.94Hz video signal:
post #16 of 22
I'm confused by several of the replies on this thread.

I would have thought that the limitation to a 60Hz refresh rate on US TVs was tied to flicker (or beating) with other light sources. Any light source in the US is putting out light that varies at 60Hz. If a TV is refreshing and/or changing light output at a frequency of 60Hz (or a multiple thereof), there will not be any beating with other 60Hz light sources. But if the TV is refreshing and/or changing light output at a frequency other than 60Hz (for example 50Hz), I would have thought that would cause noticeable beating/flickering.

Also, aside from the native refresh rate, there is the entire question of scanning backlights - if a backlight is strobing at anything other than 60Hz (or a multiple of 60Hz), I'm pretty sure that would cause noticeable beating/flickering with other light sources.

I may be completely wrong about this, which is why I am asking - why would light sources varying at differing frequencies not cause a noticeable beating/flickering effect?

-fafrd
post #17 of 22
Refresh rate for digital displays is a slippery thing, because it really doesn't work the same way as it did for analog technologies. A CRT refreshed its interlaced image 60 times every second. It actually had to redraw the image even if it was unchanged.

Digital displays don't work that way. On an LCD, if a particular pixel doesn't change color from frame to frame, it just continues to illuminate (most of the time; more on that later). So, when we get into the digital domain, "refresh rate" doesn't really mean the same thing as it did in the past, and honestly doesn't have a hard and fast definition. It CAN mean how often the TV is able to change the picture information: a panel refresh rate of 240Hz would mean it can change the picture 240 times a second, 120Hz 120 times, 60Hz 60 times. However, that isn't a hard and fast rule. Some 120Hz TVs are only 60Hz panels with backlight scanning or some sort of blanking frame inserted. Those TVs insert a black frame (either by turning off the backlight or by just putting up a black frame) once per cycle. They aren't capable of changing the information on the display more than 60 times a second, but the inserted blanking frame technically means they are delivering 120 frames to your eye per second. If a TV is marketed as a 120Hz display but can't deliver correct film cadence, this is likely the reason. The same thing happens with 240Hz as well: some TVs are capable of changing the picture that often, and some only use a blanking frame on a 120Hz panel to double the effective frame rate.
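
As a rough illustration of that distinction (a toy sketch, not any particular TV's implementation; the names are invented), a 60Hz panel with black-frame insertion delivers 120 images per second while still only changing the picture 60 times:

```python
# Toy sketch: a 60Hz panel with black-frame insertion vs a true 120Hz panel.

def bfi_on_60hz_panel(picture_frames):
    """Sequence delivered to the eye when each picture frame is followed by one black frame."""
    return [img for frame in picture_frames for img in (frame, "BLACK")]

one_second_of_video = [f"frame{i}" for i in range(60)]    # 60 distinct pictures per second
delivered = bfi_on_60hz_panel(one_second_of_video)

print(len(delivered))                                      # 120 images reach the eye per second...
print(sum(1 for img in delivered if img != "BLACK"))       # ...but only 60 carry picture information
```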

So, a "24hz" LCD wouldn't necessarily have flicker or be hard to look at as long as it wasn't using backlight scanning. It just would be incapable of displaying anything above 24fps.

24p mode for Blu-rays is used on (true) 120Hz and 240Hz displays to show movies without introducing any motion artifacts. On a 120Hz TV, it would simply display each frame 5 times; on a 240Hz display, it would show each frame 10 times. 'Show' is a tricky term, though, because it implies that the same frame is being "redrawn" multiple times, which really isn't a thing on digital displays. It could simply be persisting on the screen through 5 or 10 display cycles. The display could also be using backlight scanning to insert black intervals to reduce motion blur. Regardless, this would be correct cadence for 24fps film. You shouldn't really notice any flicker, and motion should be smooth but a bit blurry, since motion resolution isn't very high at that frame rate.
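
A quick arithmetic check of those cadences (illustrative only):

```python
# How many refresh cycles each 24fps film frame occupies on common panel rates.

film_fps = 24
for panel_hz in (60, 72, 120, 240):
    repeats = panel_hz / film_fps
    if repeats.is_integer():
        n = int(repeats)
        print(f"{panel_hz:>3}Hz panel: each film frame held for {n} refreshes ({n}:{n} cadence)")
    else:
        print(f"{panel_hz:>3}Hz panel: {repeats} refreshes per frame on average (uneven, needs 2:3 pulldown)")
```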

Now, you can stack motion interpolation on top of that to increase motion resolution. So, instead of displaying the same frame 5 or 10 times, the TV creates new in-between frames. This is what causes the "soap opera effect."
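
A minimal sketch of the in-between-frame idea (real TVs use motion-compensated interpolation; the simple blend below is only to show where the extra frames come from, and the function name is made up):

```python
# Illustrative only: synthesize in-between frames by blending two source frames.

def inbetween_frames(frame_a, frame_b, count):
    """Blend two frames (lists of pixel values) into `count` synthesized in-between frames."""
    blended = []
    for i in range(1, count + 1):
        t = i / (count + 1)
        blended.append([a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)])
    return blended

frame_a = [0.0, 0.2, 0.4]        # toy 3-pixel frames
frame_b = [1.0, 0.2, 0.0]
# Going from 24fps to 120fps means 4 synthesized frames between each source pair.
for f in inbetween_frames(frame_a, frame_b, 4):
    print([round(p, 2) for p in f])
```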

Again, it's all very slippery and depends on how a particular TV handles motion and what the native refresh rate of the panel is.

Film projection used either a 2-blade or a 3-blade shutter on 24fps film, which is essentially the same thing that backlight scanning is trying to accomplish. With a 3-blade shutter, 24fps film was shown as 72 flashes per second. Motion resolution wasn't any higher, but the increased flash rate reduced the perception of flicker.

Modern digital cinema works similarly to TVs.
post #18 of 22
Quote:
Originally Posted by fafrd View Post

I'm confused by several of the replies on this thread.

I would have thought that the limitation to a 60Hz refresh rate on US TVs was tied to flicker (or beating) with other light sources. Any light source in the US is putting out light that varies at 60Hz. If a TV is refreshing and/or changing light output at a frequency of 60Hz (or a multiple thereof), there will not be any beating with other 60Hz light sources. But if the TV is refreshing and/or changing light output at a frequency other than 60Hz (for example 50Hz), I would have thought that would cause noticeable beating/flickering.

Also, aside from the native refresh rate, there is the entire question of scanning backlights - if a backlight is strobing at anything other than 60Hz (or a multiple of 60Hz), I'm pretty sure that would cause noticeable beating/flickering with other light sources.

I may be completely wrong about this, which is why I am asking - why would light sources varying at differing frequencies not cause a noticeable beating/flickering effect?

-fafrd

It was originally 60Hz simply because vacuum tube tech back in the '20s and '30s really wasn't up to the task of operating at a frequency different from the power line without difficulties (such as hum lines).

Until recently, competition with other light sources wouldn't have been an issue. Incandescent lights run on AC power, true enough, but they don't have a "refresh rate" like fluorescent lights, since their light is generated by heat; incandescents have essentially no flicker at all. So there really wouldn't have been competition with ambient lighting until the past 10 years or so. Even then, it's not an issue with CFLs, since they use an electronic ballast; only magnetic-ballast fluorescent lights are tied to the line frequency (at double it, so 120Hz). CFLs refresh at around 20 kHz, and since that is quicker than the phosphor persistence, there's no perceivable flicker.

It's simply a matter of "this is the way it's always been so we'll keep doing it that way."
post #19 of 22
Quote:
Originally Posted by fafrd View Post

I'm confused by several of the replies on this thread.

I would have thought that the limitation to a 60Hz refresh rate on US TVs was tied to flicker (or beating) with other light sources. Any light source in the US is putting out light that varies at 60Hz. If a TV is refreshing and/or changing light output at a frequency of 60Hz (or a multiple thereof), there will not be any beating with other 60Hz light sources. But if the TV is refreshing and/or changing light output at a frequency other than 60Hz (for example 50Hz), I would have thought that would cause noticeable beating/flickering.

Also, aside from the native refresh rate, there is the entire question of scanning backlights - if a backlight is strobing at anything other than 60Hz (or a multiple of 60Hz), I'm pretty sure that would cause noticeable beating/flickering with other light sources.

I may be completely wrong about this, which is why I am asking - why would light sources varying at differing frequencies not cause a noticeable beating/flickering effect?

-fafrd
It was originally, but as I said up there, with the advent of NTSC colour they took the 'F' out of 60Hz TV. That's right, there ain't no 'effen' 60Hz; it's 59.94Hz now. There are also very few 60Hz-flickering lights now - mostly old fluorescent fixtures with magnetic ballasts.
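
As a rough back-of-the-envelope illustration of what that small difference means for the hum bar mentioned earlier (assuming the standard 60Hz and 60/1.001Hz figures; the script is only a sketch):

```python
# A hum bar drifts through the picture at the beat frequency between the
# 60Hz mains and the video field rate.

mains_hz = 60.0
for label, field_rate in [("B&W NTSC, 60.00Hz", 60.0),
                          ("Color NTSC, 59.94Hz", 60.0 / 1.001)]:
    beat_hz = abs(mains_hz - field_rate)        # how fast the bar drifts
    if beat_hz == 0:
        print(f"{label}: a hum bar sits still in the picture")
    else:
        print(f"{label}: a hum bar takes about {1 / beat_hz:.0f} seconds to roll through once")
```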
post #20 of 22
Quote:
Originally Posted by benes View Post

Only if you had a 24Hz LCD. Most LCDs today are 120Hz or higher. That means the image is refreshed 120 times a second even if it's showing a still picture.

Yes, but the refresh is invisible. Not just to the human eye, but also to a high speed camera. It's not flashing like a film projector.
post #21 of 22
Quote:
Originally Posted by Luke M View Post

Yes, but the refresh is invisible. Not just to the human eye, but also to a high speed camera. It's not flashing like a film projector.

http://www.youtube.com/watch?v=hD5gjAs1A2s
post #22 of 22
Quote:
Originally Posted by Luke M View Post

Quote:
Originally Posted by benes View Post

Only if you had a 24Hz LCD. Most LCDs today are 120Hz or higher. That means the image is refreshed 120 times a second even if it's showing a still picture.

Yes, but the refresh is invisible. Not just to the human eye, but also to a high speed camera. It's not flashing like a film projector.

Unless it has a scanning or strobing backlight...

-fafrd