Does 4k upscaling really do anything? - AVS Forum


joka.
08:14 PM Liked: 10
post #1 of 5
11-21-2011 | Posts: 2
Joined: Nov 2011
I'm looking to get an Onkyo or Integra A/V receiver and I noticed they have "4K upscaling" [3840 x 2160 or 4096 x 2160], which is something other receiver companies do not have [yet?]. They have the Marvell Kyoto G2H chip. I've been reading around and I'm trying to clarify some of the things I've read. Although the chip is able to "guess" the missing information to upscale the resolution to 4K, does it make a difference if your TV's max resolution is 1080p? Would this technology be used mostly for projectors, or can it be used for TVs too?

Quote:


4K upscaling means taking the native resolution of your source (be it standard definition or one of the many high-definition formats such as 1080i) and upscaling it to a full resolution of 3,840 x 2,160 pixels. Bearing in mind that 1080p (the usual maximum resolution) is only 1,920 x 1,080 pixels, you can see that 4K upscaling is four TIMES the resolution.

This is an impressively large resolution, but will it mean a better picture? Well, the thing to bear in mind here is source and screen. If your source is 1080p, then the final image, even upscaled, is still based on a 1080p source - you cannot add additional information. However, with a good quality video processing chip, you can 'guess' the missing information, so theoretically yes - you can improve the picture.

However, more important is your screen. Most televisions will reach a maximum of 1080p as that is the current standard, and whilst some projectors may be able to reach these heady heights, it's not the norm. Therefore, 4K upscaling will do pretty much nothing in these instances. It's best to output at the native resolution of your screen, so check your screen's resolution and choose the right format for it - of course, the Onkyo will let you do that!
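
The quoted "four times" claim is easy to sanity-check on pixel counts alone - a quick Python check, using nothing beyond the resolutions quoted above:

Code:
# Pixel counts for the resolutions discussed in the quote above.
hd  = 1920 * 1080      # 1080p:    2,073,600 pixels
uhd = 3840 * 2160      # 4K UHD:   8,294,400 pixels
dci = 4096 * 2160      # DCI 4K:   8,847,360 pixels

print(uhd / hd)        # 4.0  -> four times the pixels (2x per axis)
print(dci / hd)        # ~4.27 for the wider cinema flavor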


ccotenj
08:20 PM Liked: 90
post #2 of 5
11-21-2011 | Posts: 21,915
Joined: Mar 2005
^^^

if your display isn't a 4k display, it doesn't do you any good... and 4k displays won't be coming to a store near you any time soon... the new sony 4k projector msrp's at $25k, and that's as cheap as it gets...

fwiw, "true" 4k is 4096x2160...

edit: and fwiw, guess what a 4k display will have in it?

MichaelJHuman
10:23 PM Liked: 118
post #3 of 5
11-21-2011 | Posts: 18,955
Joined: Nov 2006
Even if you DID have a TV with 4K, it would have its own 4K scaler, which is likely indistinguishable from the receiver's VP (even if the receiver's VP is better). But that's speculation on my part, from having seen little visible improvement between scalers.

Audio-Projekt
05:41 PM Liked: 10
post #4 of 5
11-15-2012 | Posts: 1
Joined: Nov 2012
I totally agree. The importance of the scaler itself is much lower than generally thought; it's deinterlacing that is critical. In the case of 1080p -> 4K scaling there is nothing complicated to be done - the TV or projector will do the job. Besides, 4K puts much tougher demands on the HDMI signal transfer, so it does not make sense to scale at the source level!
amirm
07:21 PM Liked: 547
post #5 of 5
11-15-2012 | Posts: 18,375
Joined: Jan 2002
Quote:
Originally Posted by joka.

I'm looking to get an Onkyo or Integra A/V receiver and I noticed they have "4k upscaling" [3840 x 2160 or 4096 x 2160] which is something other receiver companies do not have [yet?]. They have the Marvell Kyoto G2H chip. I've been reading around and I'm trying to clarify some of the things I've read. Although the chip is able "guess" the missing information to upscale the resolution to 4k, does it make a difference if your tv's max resolution is 1080? Would this technology be used mostly for projectors or can it be used for tv's too?
It would be great if the chip could really guess what used to be in the image. Alas, it can't do that. If I took a picture and tore it in half, could you guess what the other half held? While there are techniques to generate "super resolution" images out of a series of stills, they can generate a ton of artifacts and, at any rate, are not remotely economical to build into consumer electronic devices.

All these chips do is take the *same* image and represent it with more pixels. The image will not have any more detail. It will have, in the best-case scenario, all the pixels you put into it. In the less-than-best case, they actually add some artifacts of their own!
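
To make that concrete, here is what "representing the same image with more pixels" amounts to - a minimal bilinear upscaler in Python/NumPy (my own illustrative sketch, not what the Marvell chip actually runs):

Code:
import numpy as np

def bilinear_upscale(img, factor=2):
    """Spread the existing samples of a grayscale image across more
    output pixels. Every output value is a weighted average of input
    samples -- no new detail is created."""
    h, w = img.shape
    # Map each output pixel back to a (fractional) source coordinate.
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]            # vertical blend weights
    wx = (xs - x0)[None, :]            # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

frame_1080p = np.random.rand(1080, 1920)   # stand-in for one video frame
frame_4k = bilinear_upscale(frame_1080p)   # 2160 x 3840 -- same content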

The only application of 4K is to reduce the pixel size of the display. If you have a 4K display and a 1080p source (slightly less than 2K), then you have no choice but to manufacture new pixels to feed it. The manufactured pixels are interpolated, meaning they represent the larger pixels of the source faithfully. If you don't see the pixels on your current display, then that reason for needing or having 4K goes out the window.

In your case, the scenario does not work at all. A current display will not accept a 4K feed. And even if it did, it would have to chop it back down to 1080p anyway, since that is all the pixels it has. If you actually did the double conversion of 1080p to 4K in the AVR and 4K back to 1080p, all you would have is some degradation of the video due to the artifacts of each scaling step, and nothing more.
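
You can even put a number on that round trip. Continuing the sketch above (box averaging here is just a hypothetical stand-in for whatever downscaler the display uses):

Code:
def box_downscale(img, factor=2):
    """Average each factor-by-factor block -- a crude stand-in for the
    display chopping a 4K feed back down to its native 1080p grid."""
    h, w = img.shape
    return img.reshape(h // factor, factor,
                       w // factor, factor).mean(axis=(1, 3))

round_trip = box_downscale(bilinear_upscale(frame_1080p))
print(np.abs(round_trip - frame_1080p).mean())  # > 0: error added, no detail gained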

BTW, here is the spec sheet for the Marvell part: http://www.marvell.com/digital-entertainment/assets/qdeo_extended_technology_brief.pdf. Note how there is nothing in their marketing material there about guessing anything, or inventing new pixels. What is there is a set of perceptual techniques for making it look like there is more resolution than there really is. Those techniques can be applied just the same to 1080p images (just turn up the sharpness on your TV, as an example).
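
For what it's worth, "sharpness" in that sense is usually some variant of unsharp masking: boosting the difference between the image and a blurred copy of itself. A minimal sketch, continuing the NumPy example above (illustrative only, not the actual QDEO pipeline):

Code:
def sharpen(img, amount=0.5):
    """Unsharp mask: add back a scaled copy of the detail a small blur
    removes. Edges look crisper, but no real resolution is gained."""
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    # 3x3 box blur built from the nine shifted views of the padded image.
    blur = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return img + amount * (img - blur)

crisper = sharpen(frame_1080p)   # "more detail" perceptually, none actually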