
4k by 2k or Quad HD...lots of rumors? thoughts? - Page 79

post #2341 of 3670
Quote:
Originally Posted by mrsmith View Post

One of the great disappointments this year in HDTV-land. Today I saw LG's $17,000 84-inch 4K display, showing a high-end travelogue: other than black level (a little better, but not all that much), the resolution was about equal to the 1080p Ultra-BluRay boxed travelogue discs which have been available for a year. So what's the big deal? The resolution on an 84-inch display is about equal to what you'd get with a high-end source on a 65-incher like mine, and that is a genuine benefit, but will 4K become the norm? If the cost gets down to what we now pay for 1080p, why not? But at the premium price being asked today? Gimme a break.

There is little advantage to 4K compared to 2K, especially if the 2K is compressed to the same bit rate as the 4K. As displays approach 100" and more, 4K becomes more justifiable. However, there is a trend towards high-density displays. One will see many new smartphones coming with full 2K resolution next year, and 4K computer monitors too. If a pocket device has 2K, then one can ask: why not 4K, or even better 8K, for TV displays?
8K in particular opens possibilities for scenarios going beyond current TV viewing, e.g. immersive curved displays.
post #2342 of 3670
Quote:
Originally Posted by irkuck View Post

There is little advantage to 4K compared to 2K, especially if the 2K is compressed to the same bit rate as the 4K. As displays approach 100" and more, 4K becomes more justifiable. However, there is a trend towards high-density displays. One will see many new smartphones coming with full 2K resolution next year, and 4K computer monitors too. If a pocket device has 2K, then one can ask: why not 4K, or even better 8K, for TV displays?
8K in particular opens possibilities for scenarios going beyond current TV viewing, e.g. immersive curved displays.

Actually, wouldn't it be 2K compressed with the same compression algorithm to use the same space as the 4K, at whatever bit rate it's at? For example, if the 4K was using H.265 at a 25Mbps bit rate taking up 50GB of space, that same H.265 might be able to encode 2K at 25Mbps into 25GB of space. To be fair though, there's an extra 25GB of unused space on the 2K disc. So instead, why not have it be 2K using H.265 at a 50Mbps bit rate taking up 50GB? The question would then be: does 2K compressed at 50Mbps look better than 4K compressed at 25Mbps?

I'm totally making these numbers up out of thin air btw, just to make my point.

Chuck
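For anyone who wants to sanity-check that kind of arithmetic, here's a quick sketch in Python (using Chuck's admittedly made-up figures): at a constant bitrate, disc space is just bitrate times runtime, regardless of resolution.

```python
# Disc space consumed by a constant-bitrate stream over a given runtime.
# At a fixed bitrate, file size depends only on runtime, not on whether
# the stream is 2K or 4K; the numbers below mirror the made-up ones above.

def disc_space_gb(bitrate_mbps: float, runtime_hours: float) -> float:
    bits = bitrate_mbps * 1e6 * runtime_hours * 3600
    return bits / 8 / 1e9

print(disc_space_gb(25, 2))  # 2-hour movie at 25 Mb/s -> 22.5 GB
print(disc_space_gb(50, 2))  # same movie at 50 Mb/s -> 45.0 GB
```

So a 25Mbps encode of a 2-hour film is nearer 22.5GB than 50GB, but the relative point stands: halving the bitrate halves the space, whatever the resolution.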
post #2343 of 3670
Quote:
Originally Posted by irkuck View Post


There is little advantage to 4K compared to 2K, especially if the 2K is compressed to the same bit rate as the 4K. As displays approach 100" and more, 4K becomes more justifiable.  ....

Here I am in full agreement with you: I don't think that anything beyond 1080p (2K) makes any difference for normal TVs, i.e., ≤ 60 to 70 in. diag. 4K will be an improvement for projector setups, though even here mainly for persons who like to sit close (< 3 PH).
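The "< 3 PH" figure lines up with the standard 1-arcminute acuity rule of thumb. A back-of-envelope sketch (my own arithmetic, not from any of the posts):

```python
import math

# Viewing distance, in picture heights (PH), at which one pixel subtends
# 1 arcminute (the usual 20/20-acuity threshold). Sit further back than
# this and the extra resolution is generally held to be invisible.

def max_useful_distance_ph(vertical_pixels: int) -> float:
    one_arcmin_rad = math.radians(1 / 60)
    return 1 / (vertical_pixels * one_arcmin_rad)

print(round(max_useful_distance_ph(1080), 2))  # 1080p: ~3.18 PH
print(round(max_useful_distance_ph(2160), 2))  # 4K:    ~1.59 PH
```

By this rule 4K only starts to pay off inside about 1.6 picture heights, which is roughly why it mainly helps projector setups and people who sit close.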

post #2344 of 3670
Quote:
Originally Posted by millerwill View Post

Here I am in full agreement with you: I don't think that anything beyond 1080p (2K) makes any difference for normal TVs, i.e., ≤ 60 to 70 in. diag. 4K will be an improvement for projector setups, though even here mainly for persons who like to sit close (< 3 PH).

Given that 4K will be pushed to consumers as the Next Big Thing, I wonder how they are going to make people excited over a difference so subtle. Anyone could see the difference Hi-Def made vs what they were used to with NTSC. Or the flatness of a flat panel vs a CRT. Or the difference between 2D and 3D. But what do they do when the promoters of 4K say "Look at this! Pretty amazing huh? Don't you want it?" and most people say "Uh...I can't really see a difference," or "I thiiiink I might be able to see what you are talking about..."? (And, especially, a difference at anything like the distance most people prefer to sit from their displays.)
post #2345 of 3670
Yeah... I think we've been through this and we've heard opinions from people who saw it.
post #2346 of 3670
Quote:
Originally Posted by R Harkness View Post

Given that 4K will be pushed to consumers as the Next Big Thing, I wonder how they are going to make people excited over a difference so subtle. Anyone could see the difference Hi-Def made vs what they were used to with NTSC. Or the flatness of a flat panel vs a CRT. Or the difference between 2D and 3D. But what do they do when the promoters of 4K say "Look at this! Pretty amazing huh? Don't you want it?" and most people say "Uh...I can't really see a difference," or "I think I might be able to see what you are talking about..."? (And, especially, a difference at anything like the distance most people prefer to sit from their displays.)

My guess is it will go hand in hand with larger displays.

It may not add much cost to the LCD, so it will be in the upper end product.

I have a hard time getting jazzed about this when so few displays are accurate with good black levels.
Twice the pixels with middling black levels, poor off-angle viewing, and inaccurate color and gamma: Sign me up tongue.gif

- Rich
post #2347 of 3670
4K is needed for glasses-free 3D (Dolby 3D) and flicker-free passive 3D in 1080p, and this is a huge improvement over current 3D-TV solutions.
post #2348 of 3670
Quote:
Originally Posted by ALMA View Post

4K is needed for glasses-free 3D (Dolby 3D) and flicker-free passive 3D in 1080p, and this is a huge improvement over current 3D-TV solutions.

Next along this line will be: "8K is needed for glasses-free 3D @ 4K for a huge improvement over 1080p 3D" biggrin.gif

Anyway, is anybody willing to bet what the price of the 110" 4K LCD will be??? On one hand, if the price is upward-proportional wrt those mini 84-inchers, then the 25-30K range can be seen. On the other hand, if it is downward-proportional, Chinese-style pricing, then something close to the 10K range may show up.
post #2349 of 3670
Quote:
Originally Posted by irkuck View Post

Next along this line will be: "8K is needed for glasses-free 3D @ 4K for a huge improvement over 1080p 3D" biggrin.gif
More than 8K would be needed for glasses free 3D displays @ 4K in 3D mode if they keep about the same number of viewing zones as the Toshiba one (the ZL2).

Toshiba's 4K glasses-free display gives 720p resolution in 3D mode with 9 viewing zones.
(3840*2160)/(1280*720)=9

An "8K" (7680x4320) glasses-free display with 9 viewing zones would have about 2.5K (2560x1440) in 3D mode (1440p isn't a huge improvement over 1080p).
(7680*4320)/(2560*1440)=9

Though with autostereoscopic TVs, you could have each view show a different perspective (like the Panasonic ones did), something you don't get with normal stereoscopic 3D using glasses.
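The zone arithmetic above generalizes easily; a small sketch, assuming (as in the figures above) that panel pixels are simply divided evenly among the views:

```python
# Number of viewing zones an autostereoscopic panel can carry at a given
# per-view resolution, assuming panel pixels are split evenly among views.

def viewing_zones(panel_w: int, panel_h: int, view_w: int, view_h: int) -> int:
    return (panel_w * panel_h) // (view_w * view_h)

print(viewing_zones(3840, 2160, 1280, 720))   # 4K panel, 720p views  -> 9
print(viewing_zones(7680, 4320, 2560, 1440))  # 8K panel, 1440p views -> 9
```

Which is exactly why an "8K" panel with the same 9 zones as the ZL2 only buys you 1440p per view in 3D mode.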
Edited by Joe Bloggs - 12/11/12 at 11:16am
post #2350 of 3670
NHK's projected circa 2050 autostereoscopic TV technology is neither zone count nor head orientation limited . . . so you can lay your head down on the sofa to watch 3D TV without the need to keep both eyes in the same horizontal plane! biggrin.gif
_
post #2351 of 3670
Quote:
Originally Posted by SoundChex View Post

NHK's projected circa 2050 autostereoscopic TV technology is neither zone count nor head orientation limited . . . so you can lay your head down on the sofa to watch 3D TV without the need to keep both eyes in the same horizontal plane! biggrin.gif
_
It uses microlenses.
Their system in 2010 only had a resolution of 400x250, even though they were using a Super Hi-Vision "8K" projector.
Edited by Joe Bloggs - 12/11/12 at 12:35pm
post #2352 of 3670
From September:
Faster 4Kx2K, Slower AMOLED TV?

Here we can see where the different companies source their 4K/UHD panels.


Interesting to note that the Sony 84" is made by LG.
Quote:
If the productivity and yield rate are well-controlled, 4Kx2K will enable panel makers to increase profitability. A 50” 4Kx2K panel is priced at $800, compared to $400 for a full HD panel with slim type LED backlight, while an 84” 4K panel will be priced over $5,000.
So now we know why the Sony 84" is $5000 more expensive than the LG 84".

Strangely, though this chart comes from the "Weekly TV Supply Chain Executive Briefing" report, they have not managed to include mainland China 4K panel manufacturers and brands like Skyworth, supposedly the largest TV brand in China, and Konka Group, which is Skyworth's largest rival in China. Both are releasing 4K TVs.
As mentioned before in this thread, China Star Optoelectronics Technology (CSOT), a business group of TCL Corporation, announced they have a 55" 4K TV and the 110" 4K TV. They also claim "The success of CSOT has enabled TCL Corporation to become the first company in China to produce LCD TV sets all by itself."

This report from October also sums up some points of why 4K TVs will be an easier sell than 3D TV was, based on much of the same info: What If They Gave A Party, But No One Came – 3D vs. 4K TV?

An NPD DisplaySearch November report claims OLED TVs to Start Shipping by the End of 2012, which probably won't happen. They want payment for a prediction report that we at AVS Forum know will miss the target? tongue.gif

An interesting tidbit from a Ceatec report from October 1. Japan CEATEC: Ultra high-def TVs
Quote:
Panasonic said it will show an 8K model, while Sharp will show a new concept TV that uses an impressive technology called ICC (Integrated Cognitive Creation), which it says "reproduces the cognitive process by which the human brain interprets light stimuli."

Someone posted somewhere else that a well-known brand name will show a 55" 4K at CES that will be in the shops in March costing $3300, and an 84" by summer. Could it be Toshiba?

Let the price wars on 4K TVs and Monitors begin. cool.gif
post #2353 of 3670
Quote:
Originally Posted by coolscan View Post


Someone posted somewhere else that a well-known brand name will show a 55" 4K at CES that will be in the shops in March costing $3300, and an 84" by summer. Could it be Toshiba?
Let the price wars on 4K TVs and Monitors begin. cool.gif

Yes, Toshiba already has a 55" 4K ZL2 model selling in Japan using the 55" AUO panel, and will use the LG 84" panel for their 84" 4K sets. Not included in the Weekly TV Supply Chain Executive Briefing report are AUO's all-new 65" IGZO 4K panel and the Sharp 32" 4K computer monitor. Also not included was the Sharp 70" 4K panel shown at CES. Don't know if the Sharp 60" and 70" 4K demo units were using IGZO at the time, but since neither is in production yet, I assume they will. The Sharps could also get the new MothEye screens and the ICC-4K reality engine. Just hope that whatever they show at CES this year does not become more vaporware like last year.

http://www.auo.com/?sn=107&lang=en-US&c=10&n=1459
post #2354 of 3670
Quote:
Originally Posted by chucky2 View Post

Actually, wouldn't it be 2K compressed with the same compression algorithm to use the same space as the 4K, at whatever bit rate it's at? For example, if the 4K was using H.265 at a 25Mbps bit rate taking up 50GB of space, that same H.265 might be able to encode 2K at 25Mbps into 25GB of space. To be fair though, there's an extra 25GB of unused space on the 2K disc. So instead, why not have it be 2K using H.265 at a 50Mbps bit rate taking up 50GB? The question would then be: does 2K compressed at 50Mbps look better than 4K compressed at 25Mbps?
I'm totally making these numbers up out of thin air btw, just to make my point.
Chuck

Hah, you've got to the right question. First, one has to be aware of the insidious cheating if the 4K is compressed to the 50GB space and compared with the 2K compressed to much less space. The right comparison is obviously for both formats compressed to the same space of 50GB. This is however not done, since it would make it patently clear that 4K makes no sense.

Now, with extremely compressed sources it is always better to start with a lower res if the bit budget is the same. Why? Simply because you have to shave more detail @ higher res to get down to the budgeted bits, which visually makes a difference. This can be observed by comparing current 720p and 1080p sources compressed to a broadcast rate of 10 Mb/s with H.264. Even when enlarged to fit the 1080 display, the 720p has more detail. This is because 720p has roughly 1 mln pixels and 1080p has 2 mln pixels. This observation is not changed by the fact that 720p has 60 fps and 1080p has 30 fps, since predictive coding effectively eliminates the differences here.

The situation is obviously not the same if the 1080p is compressed @ the Blu-ray rate of 25-30 or even 50 Mb/s. This is the region where the details of 1080p show up. But again, when trying to squeeze 4K into the same bit rate, one has to shave off details.

Regarding the magic of H.265, one has to be careful when talking about extreme compression. A modern compression system is a toolbox which can fit compression to almost any bit rate in the sense of not breaking up, even able to produce cartoon-like animation from the original video. This is watchable, but far from the original.
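The 720p-vs-1080p broadcast comparison comes down to bits per pixel per frame; a minimal sketch of that arithmetic (illustrative figures only):

```python
# Bits available per pixel per frame at a fixed broadcast rate, for the
# 720p60 vs 1080p30 comparison above. More bits per pixel means less
# detail has to be shaved off to hit the budget.

def bits_per_pixel(bitrate_mbps: float, w: int, h: int, fps: int) -> float:
    return bitrate_mbps * 1e6 / (w * h * fps)

print(round(bits_per_pixel(10, 1280, 720, 60), 3))   # 720p60  @ 10 Mb/s -> 0.181
print(round(bits_per_pixel(10, 1920, 1080, 30), 3))  # 1080p30 @ 10 Mb/s -> 0.161
```

Note that even with twice the frame rate, 720p60 has a slightly larger per-pixel budget than 1080p30 at the same 10 Mb/s, so the raw arithmetic at least points the same way as the observation above.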
post #2355 of 3670
Quote:
Originally Posted by irkuck View Post

Hah, you've got to the right question. First, one has to be aware of the insidious cheating if the 4K is compressed to the 50GB space and compared with the 2K compressed to much less space. The right comparison is obviously for both formats compressed to the same space of 50GB. This is however not done, since it would make it patently clear that 4K makes no sense.
Now, with extremely compressed sources it is always better to start with a lower res if the bit budget is the same. Why? Simply because you have to shave more detail @ higher res to get down to the budgeted bits, which visually makes a difference.
An important thing to note: this and the rest of your claims are your personal theories created out of thin air. rolleyes.gif
All reports say that a 4K source that is compressed to the same space as a compressed 2K source has much better image quality than the compressed 2K. The higher-detailed source will always compress better, and can be compressed harder than a low-resolution source while still retaining more detail and better image quality.

What is continuously forgotten, both in compression discussions based on old theories and in seating-distance discussions for higher-resolution displays, is that the higher-resolution cameras (4K++) have already captured more detail than 2K cameras (which mostly don't resolve 2K of detail), and are therefore able to give the compression tools a higher number of details from which to choose what to discard and what to keep.

The various new HEVC compression tools are not the old tool basis just scaled to compress harder. They are based on newly evolved science, with a new understanding of how image data can be treated.
Edited by coolscan - 12/12/12 at 2:42am
post #2356 of 3670
Quote:
Originally Posted by coolscan View Post

An important thing to note: this and the rest of your claims are your personal theories created out of thin air. rolleyes.gif
All reports say that a 4K source that is compressed to the same space as a compressed 2K source has much better image quality than the compressed 2K. The higher-detailed source will always compress better, and can be compressed harder than a low-resolution source while still retaining more detail and better image quality.
Is this with any codec (including H264), or only with H265, which isn't finished yet? If, at the same bitrate, the higher-resolution one is always better, does that mean that at 1 Mbps a 4K video file at 60 fps is going to show a football match in better quality than encoding it at 480p60, 720p60 or 1080p60? Do you have any links which show this? Surely you'd get to a point where there just isn't the bandwidth to encode the higher-resolution one without visible artefacts or without dropping frames etc., and you'd get a better result with a lower-resolution video encoding.
Quote:
The various new HEVC compression tools are not the old tool basis just scaled to be able to compress harder.
The HEVC/H265 codec is very much like the H264 one, but with different parameters, e.g. able to use different size blocks etc.

Also, I agree it's best to start with a higher-resolution source, but that doesn't mean that encoding a video file with the same codec and bitrate will mean the highest-resolution one will always be better.
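To make the deliberately extreme 1 Mbps example concrete: at a constant bitrate the per-frame budget is fixed, so the bits available per pixel collapse as resolution rises. A sketch with my own illustrative numbers:

```python
# At 1 Mb/s and 60 fps there are only ~16,667 bits (~2 KB) per frame,
# whatever the resolution; dividing by pixel count shows how starved
# each candidate resolution would be.

FRAME_BITS = 1e6 / 60  # bits per frame at 1 Mb/s, 60 fps

for name, w, h in [("480p", 854, 480), ("720p", 1280, 720),
                   ("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{name}: {FRAME_BITS / (w * h):.5f} bits per pixel")
```

At roughly 0.002 bits per pixel for 4K there is essentially nothing left for detail, which is the regime where a lower resolution encode has to win.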
Edited by Joe Bloggs - 12/12/12 at 6:36am
post #2357 of 3670
Quote:
Originally Posted by Joe Bloggs View Post

Is this with any codec (including H264) or only with H265 which isn't finished yet?
I kind of call all the new codecs HEVC. I have never heard H.264 called HEVC?
The HEVCs I know about in development are H.265, MPEG-H (possible name) and .RED.
Quote:
If, at the same bitrate, the higher-resolution one is always better, does that mean that at 1 Mbps a 4K video file at 60 fps is going to show a football match in better quality than encoding it at 480p60, 720p60 or 1080p60? Do you have any links which show this? Surely you'd get to a point where there just isn't the bandwidth to encode the higher-resolution one without visible artefacts or without dropping frames etc., and you'd get a better result with a lower-resolution video encoding.
The HEVC/H265 codec is very much like the H264 one, but with different parameters, e.g. able to use different size blocks etc.
At some point I guess the bitrate is constrained so much that it doesn't matter what the original source was, so let's keep it to realistic bitrates.
By the way: the RedRay .RED codec is spec'd at 20Mb/s for 4K. The demos that RED runs regularly for interested people at RED Studios feature 4K material at 11Mb/s, projected with a Barco 4K and a Sony VW1000, with impressive results.

All reports from people that have tested a 4K source downscaled to 2K and compressed to BD-equivalent bitrates say that it looks better than BD. Can't remember if anybody said what codec they used, but this has been claimed for a long time, so I guess it must have been H.264-type codecs.
Which is also partly confirmed by BD re-releases that have used a 4K source.
post #2358 of 3670
Quote:
Originally Posted by coolscan View Post

I kind of call all the new codecs HEVC. I have never heard H.264 called HEVC?
I wasn't calling it HEVC. H264=what we have now on Blu-ray. H265 and HEVC=the same thing.
Quote:
At some point I guess the bitrate is constrained so much that it doesn't matter what was the original source, So let's keep it to the realistic bitrates.
I went down to 1 Mbps to prove the point, to make it obvious that at a certain point you'd get better quality (less visible artefacts etc.) with a lower resolution.
Quote:
All reports from people that have tested a 4K source downscaled to 2K and compressed to BD-equivalent bitrates say that it looks better than BD.
That's not the same as a 4K video file. That's just saying that oversampling will give you higher quality, which it does.
Quote:
Which is also partly confirmed on BD re-releases that have used a 4K source.
Again oversampling. Not the same as compressing a 4K video at the same bitrate as a 2K video with the same codec and the 4K one always being better quality. There will be a bitrate where it will be better, but under that, the 2K will be better.
post #2359 of 3670
Quote:
Originally Posted by coolscan View Post

An important thing to note: this and the rest of your claims are your personal theories created out of thin air. rolleyes.gif
All reports say that a 4K source that is compressed to the same space as a compressed 2K source has much better image quality than the compressed 2K. The higher-detailed source will always compress better, and can be compressed harder than a low-resolution source while still retaining more detail and better image quality.

What you promote here is pure absurdity. If higher res = better compressed PQ, then 8K would be better than 4K compressed to the same space, 16K would be better than 8K, and in the end GigaK would be better than 2K.
Quote:
Originally Posted by coolscan View Post

What is continuously forgotten, both in compression discussions based on old theories and in seating-distance discussions for higher-resolution displays, is that the higher-resolution cameras (4K++) have already captured more detail than 2K cameras (which mostly don't resolve 2K of detail), and are therefore able to give the compression tools a higher number of details from which to choose what to discard and what to keep.
The various new HEVC compression tools are not the old tool basis just scaled to compress harder. They are based on newly evolved science, with a new understanding of how image data can be treated.

I know those things very well, and you do not. Compression works by shaving off details; at extreme compression, e.g. 1080p broadcast at 10 Mb/s H.264, there are just barely enough details left to cover the standard TV viewing scenario. If one squeezes 4K to 20 Mb/s using H.265, the PQ will be at most equivalent to the 1080p at 10 Mb/s.
post #2360 of 3670
Quote:
Originally Posted by coolscan View Post

From September:
Faster 4Kx2K, Slower AMOLED TV?
Here we can see from where the different companies source their 4K/UHD panels.

Interesting to note that the Sony 84" is made by LG.

I posted that in August:

http://www.avsforum.com/t/1309492/4k-by-2k-or-quad-hd-lots-of-rumors-thoughts/1590#post_22335422

(And, yes, to be clear, I concluded the fact then without official confirmation.)
post #2361 of 3670
Quote:
Originally Posted by irkuck View Post

What you promote here is pure absurdity.
No it isn't.
This has already been proven and demonstrated for many. Just wait and see when people get their RedRay players and can compare 4K source material at 20Mb/s, down-converted to 2K for an HD screen on the fly, to a source like BD.
Quote:
If higher res=better compression PQ than 8K would better than 4K compressed to the same space, 16K would be better than 8K, and in the end GigaK would be better than 2K.
Not quite sure I understand what you mean. But as we don't have any real 8K or 16K video source, we don't know at what sort of compression the source starts to lose its detail.
At some point it will also be meaningless to continue compressing ever-higher resolutions and downscaling everything to 2K.
Quote:
I know those things very well and you not. Compression works by shaving off details, at extreme compression e.g. 1080p broadcast at 10 Mb/s H.264 there are details left just to barely cover standard TV viewing scenario. If one squeezes 4K to 20 Mb/s using H.265 the PQ will be at most equivalent to the 1080p at 10 Mb/s.
Compression partly shaves off detail, but there is much other "stuff" in a moving image that can also be discarded to reach low bitrates. If the source has an abundance of information for the compression algorithms to choose from, they can make a better selection of important things to keep and unimportant things to throw out. This is what the new compression codecs do better than the old codecs.
It is an easy principle to understand. Not as easy tools to make. Soon we can compare the new tools to see which of them does it best.
post #2362 of 3670
Quote:
Originally Posted by coolscan View Post

No it isn't.
This has already been proven and demonstrated for many. Just wait and see when people get their RedRay players and can compare 4K source material at 20Mb/s, down-converted to 2K for an HD screen on the fly, to a source like BD.

It is. At extreme compression ratios you can only get so much. Think, e.g., why DCI uses only intraframe with mild compression for cinema apps.
Quote:
Originally Posted by coolscan View Post

Not quite sure I understand what you mean. But as we don't have any real 8K or 16K video source, we don't know at what sort of compression the source starts to lose its detail.

You claim that 4K gives better results than 2K in the same space. This could then be continued: 8K is better than 4K, 16K better than 8K, and so on ad infinitum. But then it follows that 8K is better than 2K, 16K is better than 2K, and in the end infinite-res compression is better than 2K. Hopefully you notice now that your argument leads to a reductio ad absurdum.
Quote:
Originally Posted by coolscan View Post

Compression partly shaves off detail, but there is much other "stuff" in a moving image that can also be discarded to reach low bitrates. If the source has an abundance of information for the compression algorithms to choose from, they can make a better selection of important things to keep and unimportant things to throw out. This is what the new compression codecs do better than the old codecs.
It is an easy principle to understand. Not as easy tools to make. Soon we can compare the new tools to see which of them does it best.

Motion is another redundancy source for compression, but it is exploited using motion compensation, which shaves detail. This is not a problem when the eyes are static. But when the eyes are following motion, the lack of detail becomes apparent. This is becoming a serious issue with huge displays at low viewing distances.
The end result of this is that at extreme compression rates one can achieve barely sufficient PQ. This is why Blu-ray is better than HD broadcast. This is why 4K compressed @ 20 Mb/s may not stand up to 2K @ the very same 20 Mb/s upconverted to 4K.
post #2363 of 3670
Back to reality: Sharp's "Integrated Cognitive Creation" 60-inch 4K LCD: branded the ICC PURIOS, it brings a new premium level above previous AQUOS models, thanks not only to the 3,840 x 2,160 resolution but also to professional-quality image processing that Sharp says brings "unparalleled realism and excitement".
On top of this it is build-to-order, and the price suggests... a golden frame with diamond incrustation confused.gif eek.gif
post #2364 of 3670
Quote:
Originally Posted by irkuck View Post

It is. At extreme compression ratios you can only get so much. Think, e.g., why DCI uses only intraframe with mild compression for cinema apps.
You claim that 4K gives better results than 2K in the same space. This could then be continued: 8K is better than 4K, 16K better than 8K, and so on ad infinitum. But then it follows that 8K is better than 2K, 16K is better than 2K, and in the end infinite-res compression is better than 2K. Hopefully you notice now that your argument leads to a reductio ad absurdum.
Motion is another redundancy source for compression, but it is exploited using motion compensation, which shaves detail. This is not a problem when the eyes are static. But when the eyes are following motion, the lack of detail becomes apparent. This is becoming a serious issue with huge displays at low viewing distances.
The end result of this is that at extreme compression rates one can achieve barely sufficient PQ. This is why Blu-ray is better than HD broadcast. This is why 4K compressed @ 20 Mb/s may not stand up to 2K @ the very same 20 Mb/s upconverted to 4K.
Please tell RED that their effort in making a compression codec, compression software and playback hardware is futile because you said so. Their claims and demonstrations of 4K material playing back at 20Mb/s, delivered over the net by ODEMAX distribution, are unobtainable because they go against your personal theories of what is possible. Again we must declare RED's product a scam (even though it has been proven otherwise) because an armchair poster on AVS is unable to wrap his head around the fact that his theories don't fit the real world. smile.gif

Sometimes I wonder if you are just a provocateur, or trolling just for the fun of it. You made the same protest about the advantage of 4K in the face of all contrary reports. Now you have cooled somewhat on that, but have picked up protesting against 4K and higher resolutions at low bitrates, against all the reports proving you wrong again. rolleyes.gif
Edited by coolscan - 12/13/12 at 8:00am
post #2365 of 3670
Has the Red codec been independently tested against the major codecs, such as mpeg2, H264, H265? Do you have any links to the results of these tests?
post #2366 of 3670
Quote:
Originally Posted by Joe Bloggs View Post

Has the Red codec been independently tested against the major codecs, such as mpeg2, H264, H265? Do you have any links to the results of these tests?
RED has not released any data whatsoever for the .RED codec, except various playback bitrates for material they use when they do demos.
One reason is that the image quality was locked in just a week or two ago. Now they are working on optimising things like the efficiency and speed of the encoder.
I am sure we will see comparison tests when the other HEVCs are ready for prime time. I guess RED is guarding their codec so as not to lose any advantages to the teams working on the other HEVC codecs.
Each RedRay player will come with one .RED encoder licence. Additional licences will cost $20. The RedRay encoder is a plug-in for the Redcine-X editing software.
Everybody that has access to any 4K material can use the software to encode their own .RED material and compare. (You need a RedRay player to play back .RED material.)

post #2367 of 3670
Quote:
Originally Posted by irkuck View Post

Hah, you've got to the right question. First, one has to be aware of the insidious cheating if the 4K is compressed to the 50GB space and compared with the 2K compressed to much less space. The right comparison is obviously for both formats compressed to the same space of 50GB. This is however not done, since it would make it patently clear that 4K makes no sense.
I think you are vastly overestimating how bad compression artifacts currently are, and if they were that bad then wouldn't REDRAY look horrible, since it is encoded at 20 Mbps? I don't like that REDRAY is using a proprietary video standard, but I doubt that RED would use horrible-looking compression. The people who care about video quality are the ones that would be interested in REDRAY.

Quote:
Originally Posted by irkuck View Post

Now, with extremely compressed sources it is always better to start with a lower res if the bit budget is the same. Why? Simply because you have to shave more detail @ higher res to get down to the budgeted bits, which visually makes a difference. This can be observed by comparing current 720p and 1080p sources compressed to a broadcast rate of 10 Mb/s with H.264.
If that was true then there would be no such thing as 1080p24 video streaming at 4 to 5 Mbps. I am not saying that 5 Mbps delivers great-looking 1080p24, but from everything I have read, 20 Mbps MPEG-4 AVC can deliver acceptable-looking 2160p24.
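The 5 Mbps and 20 Mbps figures line up with simple pixel-count scaling, a common rough rule of thumb (my own sketch, not a claim from any codec spec):

```python
# If bitrate is scaled linearly with pixel count, the cited 1080p24 @ 5 Mb/s
# figure maps directly onto 2160p24: four times the pixels, four times the rate.

def scaled_bitrate_mbps(base_mbps: float, base_px: int, target_px: int) -> float:
    return base_mbps * target_px / base_px

print(scaled_bitrate_mbps(5, 1920 * 1080, 3840 * 2160))  # -> 20.0 Mb/s
```

In practice encoders tend to need somewhat less than a linear scale-up at higher resolutions (larger areas of spatial redundancy), so 20 Mbps for 2160p24 being "acceptable" is at least plausible by this arithmetic.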

Quote:
Originally Posted by coolscan View Post

I kind of call all the new codecs HEVC. I have never heard H.264 called HEVC?
The HEVCs I know about in development are H.265, MPEG-H (possible name) and .RED.
If you call all new video standards HEVC that will confuse people. HEVC is a specific video standard that is being developed by a large number of companies. The RED video standard was only developed by RED. It is quite possible that the RED video standard will be better than MPEG-4 AVC, which came out in 2003, but I think that HEVC will be better since a lot more work has gone into it.

Quote:
Originally Posted by irkuck View Post

It is. At extreme compression ratios you can only get so much. Think, e.g., why DCI uses only intraframe with mild compression for cinema apps.
Apples and oranges. Intra-frame compression compresses each individual frame, so the bit rate is 10 to 30 times higher than inter-frame compression with current video standards. Intra-frame compression comes at a high cost, but makes for easier video editing and is more resistant to errors (since an error only affects that particular frame of video). Also, even when DCI is limited to 2K resolution, it still uses 12-bit video, no chroma subsampling, and a much larger color space.

Quote:
Originally Posted by irkuck View Post

Motion is another redundancy source for compression, but it is exploited via motion compensation, which shaves detail. This is not a problem when the eyes are static, but when the eyes are following motion the lack of detail becomes apparent. This is becoming a serious issue with huge displays at low viewing distances.
More advanced video standards don't just remove more detail. If that were true, then consumer video would still be using MPEG-1.
post #2368 of 3670
Quote:
Originally Posted by coolscan View Post

Please tell RED that their effort in making a compression codec, compression software, and playback hardware is futile because you said so. Their claims and demonstrations of 4K material playing back at 20 Mb/s, delivered over the net by ODEMAX distribution, are unobtainable because they go against your personal theories of what is possible. Again we must declare RED's product a scam (even though it has been proven otherwise) because an armchair poster on AVS is unable to wrap his head around the fact that his theories don't fit the real world. smile.gif

There have been tons of codecs promising wonders which have not materialized. RED is playing a game of trying to lock people into their proprietary tech. I am not saying RED is a scam. What I am saying is: compare RED 4K @ 20 Mb/s to plain old H.264 2K @ 20 Mb/s plus optimized 4K upconversion. It would not hurt to also compare RED 2K @ 10 Mb/s with the best broadcast H.264 codecs in a real-time scenario, and RED 2K at Blu-ray rates with Blu-ray itself. What I am predicting is that the differences would be at best imperceptible in a standard TV viewing scenario.
Quote:
Originally Posted by coolscan View Post

Sometimes I wonder if you are just a provocateur, or trolling for the fun of it. You made the same protest about the advantages of 4K in the face of all contrary reports. Now you have cooled somewhat on that, but you pick up the protest against 4K and higher resolutions at low bitrates, against all the reports proving you wrong again. rolleyes.gif

I have not been proven wrong in anything thus far, and I have not changed my position on the major issue: that 4K is and will remain nonsense from the perceptual point of view of the standard TV viewing scenario, in line with the authoritative reports on this subject. That is especially true in the extreme-compression regime, considering what is said above about RED. The 4K game is just a marketing ploy by the display industry to fool ignorant consumers, and you are effectively their mouthpiece.

Now, to be fair, there are some factors in favor of 4K. One is the appearance of displays in the 100" range, which will very likely be watched from closer distances. Another is computer monitors, where 4K might be considered necessary at 32" and above. There is also the possibility of new viewing scenarios (e.g. curved surround displays) which suggest that even 8K might be needed there. From this point of view, 8K is better future-proofed and should be prepared for a later launch instead of rushing out 4K.
Quote:
Originally Posted by Richard Paul View Post

I think you are vastly overestimating how bad compression artifacts currently are. If they were that bad, then wouldn't REDRAY look horrible, since it is encoded at 20 Mbps? I don't like that REDRAY uses a proprietary video standard, but I doubt that RED would ship horrible-looking compression. The people who care about video quality are the ones who would be interested in REDRAY.
If that were true, then there would be no such thing as 1080p24 video streaming at 4 to 5 Mbps. I am not saying that 5 Mbps delivers great-looking 1080p24, but from everything I have read, 20 Mbps MPEG-4 AVC can deliver acceptable-looking 2160p24.

The handling of artefacts in compression is nowadays very sophisticated. One does not see horrible artefacts even down to a very low bit budget. What one sees is a diminishing overall subjective perceptual quality, which ordinary viewers even have difficulty quantifying. Experts will point to edge-preserving low-pass-filtering types of PQ reduction, e.g. fine skin detail missing here or there. Everything is done to fool the visual system, using its weaknesses to mask the lack of information, for a standard TV viewing scenario of 3-4 PH. There is thus no wonder you can still see acceptable HD video at 4 Mb/s, and quite acceptable 4K at 20 Mb/s using H.264. However, if 4K makes sense at all, it is only with perfect PQ that can stand viewing conditions of 2-2.5 PH, where the differences compared to 2K become apparent. You won't get the level of detail required for that at 20 Mb/s.
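The PH (picture height) figures being traded here can be sanity-checked with the standard 20/20-acuity rule of thumb of roughly 60 pixels per degree of visual angle; this sketch computes how many vertical lines the eye can actually use at a given distance in picture heights.

```python
# Vertical resolution the eye can exploit at a given viewing distance
# in picture heights (PH), assuming 20/20 acuity (~1 arcminute per
# pixel, i.e. the common ~60 pixels-per-degree rule of thumb).
import math

def useful_lines(distance_ph, pixels_per_degree=60):
    # Vertical field of view subtended by the screen at this distance.
    fov_deg = math.degrees(2 * math.atan(0.5 / distance_ph))
    return pixels_per_degree * fov_deg

for ph in (3.0, 2.0, 1.5):
    print(f"{ph} PH: ~{useful_lines(ph):.0f} useful lines")
# ~1135 lines at 3 PH (1080p already saturates the eye),
# ~1684 at 2 PH, ~2212 at 1.5 PH (where 2160-line 4K pays off).
```

This lines up with the argument above: at the 3-4 PH of typical living-room viewing, 1080p already fills the eye's budget, and 4K only earns its pixels around 1.5-2 PH.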

Quote:
Originally Posted by Richard Paul View Post

If you call all new video standards HEVC, that will confuse people. HEVC is a specific video standard being developed by a large number of companies, whereas the RED video standard was developed by RED alone. It is quite possible that the RED video standard is better than MPEG-4 AVC, which came out in 2003, but I think HEVC will be better still, since a lot more work has gone into it.
Apples and oranges. Intraframe compression compresses each frame individually, so the bit rate is 10 to 30 times higher than interframe compression with current video standards. Intraframe compression comes at a high cost, but it makes for easier video editing and is more resistant to errors, since an error only affects that particular frame of video. Also, even when DCI is limited to 2K resolution, it still uses 12-bit video, no chroma subsampling, and a much larger color space.
More advanced video standards don't just remove more detail. If that were true, then consumer video would still be using MPEG-1.

Plums and peaches. The DCI standard is for distribution in cinemas; it has nothing to do with video editing. They selected intraframe coding since, on big displays at small viewing distances, it is next to impossible to eliminate the subtle but annoying artefacts of prediction and motion compensation. An honest 4K consumer standard should go in this direction rather than toward extreme compression. After all, there is 10 Mb/s HD broadcast and there is 25-50 Mb/s Blu-ray, which in addition is non-real-time, optimized, multipass, both based on H.264. If compression is so fine, why do they not use 10 Mb/s for Blu-ray? The equivalent for 4K should be no less than 100 Mb/s, at 120 Hz and 10-bit. Then one could at least say: OK, this is something almost-ultimate (8K would be the real ultimate smile.gif). But no, what we have now is a rush to 4K displays only. In short order we will thus see those standard 4K displays and the usual complaints about clouding, banding, light leakage, and viewing angles of the super-duper 4K technology, plus long debates about 4K's improvement over 2K.
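The 100 Mb/s floor proposed above can be checked against Blu-ray's per-pixel budget with the same simple arithmetic (illustrative figures only; a mid-range 40 Mb/s is assumed for Blu-ray video).

```python
# Per-pixel bit budget: Blu-ray 1080p24 vs a proposed 100 Mb/s
# floor for a 4K consumer format. Codec efficiency is ignored.
def bpp(mbps, width, height, fps):
    return mbps * 1e6 / (width * height * fps)

bluray = bpp(40, 1920, 1080, 24)     # assumed mid-range Blu-ray rate
uhd100 = bpp(100, 3840, 2160, 24)    # proposed 4K floor, at 24 fps

print(f"Blu-ray 1080p24 @ 40 Mb/s: {bluray:.2f} bits/pixel")
print(f"4K24 @ 100 Mb/s:           {uhd100:.2f} bits/pixel")
# Even 100 Mb/s gives 4K24 a smaller per-pixel budget (~0.50 bpp)
# than a 40 Mb/s Blu-ray (~0.80 bpp); at 120 Hz and 10-bit the same
# 100 Mb/s would be stretched another 6.25x thinner.
```

In other words, 100 Mb/s is not an extravagant number for 4K; matching Blu-ray's per-pixel budget at 4K/120/10-bit would take several hundred Mb/s.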
post #2369 of 3670
Quote:
Originally Posted by irkuck View Post


 ...  4K is still and will remain nonsense from standard TV viewing scenario perceptual point of view, in line with authoritative reports on this subject....
Now, telling truth about this, there are some factors in favor of 4K. One is appearance of displays in the 100" range which very likely will be watched from closer distance. ...

I fully agree with you here: 4K has very little to offer for standard TV viewing (on standard TVs, e.g. 60" diagonal or less) at typical viewing distances, and this is even more true for 8K.

 

And even for large screens and projectors, where 4K will be a noticeable improvement, most analyses that I've read indicate that 8K will be a barely perceptible improvement, even at viewing distances between 1.5 and 2.0 PH.

 

With 2K, 4K, or 8K, improvements in video processing, color space, motion handling, etc., can of course always contribute significantly to PQ.

post #2370 of 3670
The new AMD Fusion GPU is 30% faster than the Nvidia 580, and the 720 will have a dual-core GPU, so 4K resolution at 60 fps should be no problem. They are making these future-proof to a degree.

AMD will show off the new 7000 series at CES, a month away, with live demos.
Edited by avsforumsdsd - 12/22/12 at 4:28am