Hello... I'm very confused about what tech to buy.
I need to know something about the HDMI 1.4 3D standards. The older HDMI 1.3 3D standards were designed around 120hz frame-sequential 3D and the other early formats, correct?
That meant the display had to have 120hz input for that 3D format. This included and still includes PC monitors and Projectors... Correct? NOW
let me Ask this right...
The new 3D standards only require a 60hz input, since they use FRAME-PACKED 3D and split/compressed formats like Top/Bottom and Side By Side that squeeze both eyes' views into one single frame. All packed into 1 frame for both eyes... This makes the 120hz requirement no longer needed for 60hz-max input sources... Correct?
THIS means that ALL 3DTVs on the market accept the 60hz input signal and that is the requirement for these new standards... correct?
This also means that ALL TV sets in general only display 60hz at all times unless they have special "Frame Interpolation" or scanning-backlight tech turned on. So in FACT... NO TV or 3DTV on the market is even 120hz... They are all 60hz... Correct?
They all are 60hz when in normal operation... That is why they can do 3D on the HDMI 1.4 formats and not exceed bandwidth... correct?
I have looked all over and finally decided to ask you.
You are getting confused with a lot of terms.
First of all, bandwidth is total data throughput, and while it's often rated as something like "120hz", what that really means is enough data to send 120 1080p frames per second.
If you reduce the frame rate, you could theoretically send a higher resolution image and vice versa.
So while frame packing only sends 60 frames per second, each of those frames is actually two full 1920x1080 images (with some staggered blanking between them, I believe), so it's bigger than a single 1080p frame at 60hz.
So for SBS, OU and Checkerboard you really only need bandwidth for 60 1080p frames per second, since both eyes are squeezed into one 1080p frame.
But for frame-packed I believe you need more, as while it's 60 fps, the frames are bigger than 1080p.
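To put rough numbers on that, here's a back-of-envelope sketch in Python. It counts active pixels only and ignores blanking intervals and HDMI encoding overhead, so real link rates are higher — this is just to show the relative sizes:

```python
# Rough pixel-rate comparison (active pixels only; real HDMI signals
# add blanking intervals and encoding overhead on top of this).

def pixel_rate(width, height, fps, eyes=1):
    """Active pixels per second for a given video mode."""
    return width * height * fps * eyes

sbs_1080p60  = pixel_rate(1920, 1080, 60)           # both eyes packed in one 1080p frame
packed_1080p = pixel_rate(1920, 1080, 60, eyes=2)   # frame packing: two full 1080p images per frame
native_120hz = pixel_rate(1920, 1080, 120)          # 120 separate 1080p frames per second

print(f"SBS/OU/Checkerboard @60hz: {sbs_1080p60 / 1e6:.0f} Mpixels/s")
print(f"Frame-packed 1080p @60hz:  {packed_1080p / 1e6:.0f} Mpixels/s")
print(f"Frame-sequential @120hz:   {native_120hz / 1e6:.0f} Mpixels/s")
# Frame packing needs roughly twice the pixel rate of SBS -- about the
# same as sending 120 full 1080p frames per second.
```

So the "60hz" frame-packed signal carries about as much pixel data as a 120hz frame-sequential one; the split/compressed formats are the ones that actually fit in a plain 1080p60 pipe.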
The need for 120hz displays was for active panels, and only necessary to achieve 60 effective frames per eye per second (ie you could use a 60hz panel and do active 3d but each eye would only get 30 fps and the blanking window would cause significant flicker).
With passive, 60hz is currently fine as there is no blanking/flicker that will occur. In theory passive 3D will be just as acceptable (flicker-wise) as 2D at the same refresh rate.
As for panel and hz, this is where it's important to understand the difference between refresh rate and frame rate.
Refresh rate is the rate at which the image elements can change; frame rate is the number of frames shown in a time frame (usually a second).
A 120hz panel means that the pixels can turn off and on 120 times per second. This means theoretically they could show 120 separate frames in 1 second.
In a 60hz panel the pixels turn off and on more slowly and thus could only show 60 individual frames per second.
Similar idea for 240hz etc.
Plasma says 600hz subfield, but my understanding is they are taking 200hz sub-fields of each primary color, counting 3 per pixel and calling it 600hz.
As for a strobing backlight, it's a gimmick that has some function but is not the same as a higher refresh rate. LG uses strobing backlights to call 60hz panels "120hz TruMotion", but the pixels do not turn on and off any faster than on any other 60hz panel; the strobing backlight is just used to "crisp up" the image motion, if you will (my words), and in theory help remove judder.
Now frame rate is how many individual frames there are per second, with standard movies being 24fps, NTSC video being 60 fields (roughly 30 full frames) per second, and PAL being 50 fields per second.
The trick is understanding that a panels refresh rate means it can show up to that many unique frames per second.
If the number of unique frames from the source is less than the panel can display, the panel may process the image to show each frame multiple times (i.e. 30fps content on a 60hz display could just have each frame shown twice; 24fps conversion to a 60hz display involves 3:2 pulldown, with some frames being shown more often than others).
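The 3:2 pulldown idea can be sketched in a few lines of Python — a toy illustration of the repeat pattern, not how any real scaler is implemented:

```python
# Toy sketch of 3:2 pulldown: map 24 film frames onto 60 display
# refreshes by alternating 3 repeats and 2 repeats per source frame.

def pulldown_32(frames):
    """Repeat source frames in a 3,2,3,2,... pattern (24fps -> 60hz)."""
    out = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        out.extend([frame] * repeats)
    return out

film = list(range(24))        # one second of 24fps content
displayed = pulldown_32(film)
print(len(displayed))         # 24 frames * 2.5 average repeats = 60 refreshes
```

The uneven 3-vs-2 repetition is exactly where pulldown judder comes from: some source frames stay on screen 50% longer than their neighbors.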
Also there is a thing called frame interpolation (often with its own brand name per manufacturer) which tries to create fake frames between the real frames to make use of higher-refresh-rate displays.
For instance, rather than simply showing every frame twice on a 120hz display with 60hz content, frame interpolation may average two frames and stick the averaged frame in between instead of just doubling. This can result in smoother motion, but also may result in image degradation.
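Here's a naive Python illustration of doubling versus interpolating. Real "motion smoothing" uses motion-vector estimation, not a plain average of pixel values — this only shows the basic idea of inserting created frames between original ones:

```python
# Naive illustration: 60fps content on a 120hz panel can either repeat
# each frame, or insert a synthetic in-between frame.

def double_frames(frames):
    """Simple repetition: 60fps -> 120hz by showing each frame twice."""
    return [f for frame in frames for f in (frame, frame)]

def interpolate_frames(frames):
    """Insert an averaged 'fake' frame between each pair of real frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2)  # the created in-between frame
    out.append(frames[-1])
    return out

source = [0, 10, 20, 30]           # pretend brightness values of 4 frames
print(double_frames(source))       # [0, 0, 10, 10, 20, 20, 30, 30]
print(interpolate_frames(source))  # [0, 5.0, 10, 15.0, 20, 25.0, 30]
```

In the interpolated output, half the values never existed in the source — which is exactly why it can look smoother yet also introduce artifacts.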
Lastly, most displays do not have input scalers that can handle more than 60fps. This means no matter what your panel speed, you are never seeing more than 60 frames of original data per second in 2D (potentially 120 frames in 3D, with each eye getting its own frame 60 times per second).
So if you turn on frame interpolation on your 120hz set and feed it 60hz content, you are indeed seeing 120 unique images per second, but only half of them are original image data and half are processed/created.
So short answer:
Yes some display panels ARE faster than 60hz. Strobing backlight is NOT the same as a faster panel speed. And not all 3D formats are just 60 1080p frames per second.
I OWE YOU BIG TIME MAN.
Finally it's explained! :')
So I was able to force refresh rates of 70hz-120hz on my Panasonic Plasma with custom EDID timings... Anything higher than 70-85hz on progressive formats caused screen waves and frequency-shaking "distortion".
1920x1080i (Interlaced) @ 120hz
1280x720i (Interlaced) @ 120hz
1920x1080p (Progressive) @ 70-85Hz
1280x720p (Progressive) @ 70-85Hz
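That pattern actually makes sense if you compare the pixel rates of those modes. A quick sketch (active pixels only — real EDID timings add blanking on top, so actual pixel clocks are higher; the point is the relative weight of each mode):

```python
# Active-pixel-rate comparison of the modes above. Interlaced fields
# carry half the lines per scan, which is why 1080i @ 120hz is actually
# a *lighter* signal than 1080p @ 85hz.

def active_pixel_rate_mhz(width, height, rate_hz, interlaced=False):
    """Millions of active pixels per second for a given mode."""
    lines_per_scan = height // 2 if interlaced else height
    return width * lines_per_scan * rate_hz / 1e6

modes = [
    ("1920x1080i @ 120hz", 1920, 1080, 120, True),
    ("1280x720i  @ 120hz", 1280,  720, 120, True),
    ("1920x1080p @ 85hz",  1920, 1080,  85, False),
    ("1280x720p  @ 85hz",  1280,  720,  85, False),
]
for name, w, h, hz, interlaced in modes:
    rate = active_pixel_rate_mhz(w, h, hz, interlaced)
    print(f"{name}: ~{rate:.0f} Mpixels/s")
```

By this count 1080p @ 85hz pushes the most pixels per second of the four, so it's plausible that the progressive modes hit the set's bandwidth ceiling (hence the waves/distortion) well before the interlaced 120hz ones do.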
So now I wonder to myself something...
I wonder if the nVidia 3D vision Kit I am getting soon would work on this Plasma!
All nVidia says you need is a 120hz capable display + 3D Vision kit and glasses + nvidia GPU + PC
MAYBE! I just wonder If I could get Nvidia 3D vision to accept the 1080i,720i 120hz signal?
MAYBE! if it could just accept the beefier 70-85hz modes for 1080p,720p?!
Honestly my Panasonic has great interlacing/deinterlacing performance... I couldn't really tell the difference between the 1080i 120hz and 1080p 85hz modes as far as picture quality goes.
Check this site out. LOTS of people are testing their TV capabilities with EDID edits and custom timings only to find hidden supported refresh rates!
Any thoughts on this and if there is a way to make custom resolutions and timings for Nvidia 3D Vision?