I worked in this industry when HD was first introduced, c. 1998, and I remember an almost universal gripe among consumers: although the new HD material looked great on HD displays, the older standard-def (SD) material, still very common for many years, looked grainier and almost murkier than it did on the older displays that could ONLY handle SD. I attributed this to the extra processing, scaling, and pixel mapping needed to upscale SD content to an HD display, which introduced artifacts and limitations. In other words, if you want to see an image at its best, you should view it on a display whose native resolution matches the incoming signal, so the image needs as little processing as possible. With no scaling at all, I believe this is referred to as "1:1 pixel mapping", a mode found on my LG HDTV, for example, called "Just Scan".
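If anyone wants to see for themselves what the scaler is doing, here is a minimal sketch in Python using Pillow (assuming a recent Pillow install; the file name and the 720x480 source are just for illustration). SD to 1080p is a fractional ratio, so every filter has to invent pixels one way or another:

```python
# Minimal sketch: upscale an SD frame to 1080p with two different
# resampling filters, to compare the artifacts each introduces.
# Assumes Pillow is installed (pip install Pillow); "sd_frame.png"
# is a hypothetical 720x480 source image.
from PIL import Image

src = Image.open("sd_frame.png")   # e.g. 720x480 SD frame
hd_size = (1920, 1080)             # non-integer scale: ~2.67x by 2.25x

# Nearest-neighbor keeps original pixel values but produces uneven
# blockiness at non-integer ratios; Lanczos is smoother but softens
# fine detail. Neither is "free" -- that's the point.
src.resize(hd_size, Image.Resampling.NEAREST).save("sd_to_hd_nearest.png")
src.resize(hd_size, Image.Resampling.LANCZOS).save("sd_to_hd_lanczos.png")
```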
I've only casually viewed 4K TVs in stores, usually playing 4K content, but my question is this:
Do 4K TVs also have the same issue, i.e., that when playing normal (not 4K) HD material, the image is inferior to that same material viewed on a regular HD TV?
Answers which include a substantiating link to a disinterested, third party, professional reviewer discussing this concept will win bonus points! Thanks.
In which production month exactly did they all switch to the new, better scalers, and would you happen to know the manufacturer of these new, better circuit boards?
I wouldn't want to buy one of the old-production, clunkier designs that have this problem I speak of.
Although the full article is behind a paywall, I see Consumer Reports mentions that HD-to-4K scaling quality differs between sets and agrees with me that it is very important:
"How will 1080p shows look?
We also want to see how well the TV upconverts regular high-definition programs—720p, 1080i, and 1080p—to the set's higher native 3840x2160 resolution. (Vizio calls its upconversion technology the Spatial Scaling Engine.) This feature is important because with limited amounts of 4K content currently available, most of the time we'll be watching upconverted regular high-definition content on our UHD TVs. How well a TV can perform this critical function is a key differentiator among sets and brands." [emphasis mine]
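One thing worth adding (my own back-of-the-envelope math, not from the article): 1080p to UHD is an exact 2x integer scale in both directions, unlike the fractional ratios of the SD-to-HD era, so in principle each 1080p pixel can map cleanly onto a 2x2 block of UHD pixels. A quick check:

```python
# Scale factors for common source -> display pairs
# (square-pixel approximation; SD pixel aspect ratios ignored).
pairs = [
    ("480p -> 1080p", (720, 480), (1920, 1080)),
    ("720p -> 1080p", (1280, 720), (1920, 1080)),
    ("1080p -> UHD",  (1920, 1080), (3840, 2160)),
]
for name, (sw, sh), (dw, dh) in pairs:
    print(f"{name}: {dw / sw:.3f}x horizontal, {dh / sh:.3f}x vertical")

# 480p -> 1080p: 2.667x horizontal, 2.250x vertical  (fractional)
# 720p -> 1080p: 1.500x horizontal, 1.500x vertical  (fractional)
# 1080p -> UHD:  2.000x horizontal, 2.000x vertical  (exact integer)
```

Of course, real TV scalers do more than pixel replication, which is exactly why Consumer Reports says the upconversion quality still differs between sets.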
Over the last 5 months (almost), I've watched lots of 1080i video from DirecTV satellite on a Samsung JS9000, and for the best 10% of the DirecTV HD channels, it is great. So much better than on my previous 2013 plasma TV. Of course, we go from 2K to 4K, but also scaled up are 8-bit to 10-bit color (I think this is Samsung's "depth enhancer"), 100-nit peak white to 600-nit peak white (probably Samsung's "peak illuminator"), and the Rec. 709 color gamut to a wider 92% of P3 ("native color space").
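To give a sense of what the bit-depth step involves (my simplified take, not Samsung's actual "depth enhancer" algorithm): a naive 8-bit to 10-bit expansion just rescales the code values, which adds headroom but no new gradation, so the real processing presumably has to interpolate or dither to smooth out banding:

```python
# Naive 8-bit -> 10-bit expansion (illustrative only; not what
# Samsung's "depth enhancer" actually does). Rescaling maps the
# 0..255 range onto 0..1023 but creates no new in-between shades.
def expand_8_to_10(code8: int) -> int:
    return round(code8 * 1023 / 255)

print(expand_8_to_10(0), expand_8_to_10(128), expand_8_to_10(255))
# -> 0 514 1023; neighboring 8-bit codes stay ~4 levels apart,
#    so any banding in the source survives a naive conversion.
```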
I've owned a 4K/UHD display for over 2 years: first the Samsung F9000, and now I also have the JS9500. Prior to that, I owned a Panasonic VT plasma. I get a better picture from HD resolutions on the Samsungs than I ever had on the Panasonic.
I seriously doubt you will find any owner of a top-end UHD display who has any problem with 720/1080 input. I'm in my 70s and therefore have an excuse for being afraid of change. Don't know about the rest of you.
I assume it is a photograph, not a still from a video, correct?
What processing has been applied? Any sharpening, curves, levels, color adjustments, etc.?
The source photo was a random low-res PNG I had lying around. The image on the right is an upscale: exactly 2x, on the auto setting, with no processing other than the resize applied.
Honestly, I probably could have gotten a better upscale with better tools, but I thought the resulting image was sufficiently impactful as it was.
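For reference, since the tool and its "auto" filter aren't specified, a plain 2x resize is a one-liner in Python/Pillow. At an exact integer ratio, nearest-neighbor just replicates each pixel into a 2x2 block, which is about the closest you can get to upscaling without inventing detail:

```python
# Hypothetical reproduction of a plain 2x upscale; the original
# poster's tool and its "auto" filter are unknown.
from PIL import Image

src = Image.open("lowres.png")
w, h = src.size
# At an exact 2x ratio, NEAREST simply duplicates each pixel into
# a 2x2 block -- no interpolation, no invented detail.
src.resize((2 * w, 2 * h), Image.Resampling.NEAREST).save("lowres_2x.png")
```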