Official 4:4:4 / Chroma Subsampling Thread - Page 20 - AVS Forum
post #571 of 582 Old 06-10-2017, 01:17 PM
RLBURNSIDE
Quote:
Originally Posted by NintendoManiac64
It actually could still be related to the PC. Monitors in particular have much more basic scalars
Minor correction: scal-ers.

A scalar is a single-valued number, a mathematical term.
A scaler is a chip whose purpose is to scale images.

People get this wrong all the time.

Two different words with completely different meanings. It doesn't matter if some manual misspells it either; it's not proper English. Apologies for the pedantry, but we're on a science website.
post #572 of 582 Old 06-12-2017, 08:18 AM
pushqrdx (Newbie)
Quote:
Originally Posted by NintendoManiac64
Did you actually try setting the video output format to RGB in your Nvidia control panel, and/or making a custom resolution of 1920x1072?

Or do you need help knowing how to do these things? (I can do that, just not this instant, as I'm currently on my AMD-powered laptop rather than my Nvidia-powered HTPC.)
I already tried that. The help I want is actually solving the issue, as none of that helped.
post #573 of 582 Old 06-13-2017, 12:59 AM
showmak (Senior Member)
When I get home I will post the results from my 1080p Sony XBR-55HX929.

__________________________________________________
Sony BRAVIA XBR55HX929, Yamaha Aventage RX-A3050, Front Mains: Jamo C109, Surrounds: Jamo C103, FH/RH: Jamo C93, Center: Jamo C10 CEN and Subwoofer: Jamo J112 SUB (5.1.4)
 
post #574 of 582 Old 06-13-2017, 09:09 AM
allanmac (Member)
Quote:
Originally Posted by thuety
if anyone could do UHD 60 4:4:4 with an Intel HD graphics + DP-HDMI converter
Yes, it works for me.

New ECS Liva Z (N4200+ 2x4GB/1866).

The Liva Z has DP 1.2 out via an Apollo Lake HD505.

The test setup is ECS > DP 1.2 > "Plugable Active Mini DisplayPort to HDMI 2.0 Adapter" > Denon x1300W Game Port > Vizio 2017 E55-E1.

I verified that 4K/60Hz/4:4:4 worked fine -- no chroma noise on fine text when looking at a bitmapped text test pattern.
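
For anyone who wants to make this kind of pattern themselves, here's a minimal sketch (Python with Pillow; my own quick construction, not a standard test file) of alternating single-pixel red/blue columns -- they stay crisp at 4:4:4 and smear toward purple under chroma subsampling:

Code:
# Single-pixel red/blue grille for eyeballing chroma subsampling.
# View it at 100% zoom with no scaling so the display gets pixels 1:1.
from PIL import Image

W, H = 256, 128
img = Image.new("RGB", (W, H))
px = img.load()
for y in range(H):
    for x in range(W):
        px[x, y] = (255, 0, 0) if x % 2 == 0 else (0, 0, 255)
img.save("chroma_grille.png")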

The driver, monitor and Denon all reported 4K/60Hz.

I believe the Plugable uses a Parade chip (you can dig up the model # online).

I won't be able to verify HDCP 2.2 (unlikely) or Netflix compatibility until this weekend.

-A

Last edited by allanmac; 06-13-2017 at 04:28 PM.
post #575 of 582 Old 06-14-2017, 12:23 AM
NintendoManiac64 (AVS Forum Special Member)
Quote:
Originally Posted by allanmac
no chroma noise on fine text when looking at a bitmapped text test pattern
How about with the madVR chroma test pattern? (I typically recommend opening the image in MS Paint since no scaling is applied by default):

http://forum.doom9.org/showthread.ph...99#post1640299

Last edited by NintendoManiac64; 06-14-2017 at 12:26 AM.
post #576 of 582 Old 07-01-2017, 01:42 PM
allanmac (Member)
Quote:
Originally Posted by NintendoManiac64
How about with the madVR chroma test pattern? (I typically recommend opening the image in MS Paint since no scaling is applied by default):

http://forum.doom9.org/showthread.ph...99#post1640299
Sorry for the delay.

The madVR test seems to pass.

Although you can't see it in the attached photo, the 4:4:4 pattern is crisp, and the 4:2:2 pattern is slightly visible, which, according to posts on that site, is expected. The detail is there on my iPhone but not in Chrome, so I assume there are some color-format disagreements between the two.

I also included the "quick brown fox" red/blue lines.

Update: I can also confirm that a Netflix "4 Screen Ultra HD" account does not play back at 4K on the Apollo Lake chip with the DP-to-HDMI 2.0 active adapter, despite Windows 10 running at 4K/60Hz/8bpp.

Apollo Lake supports HEVC decode but probably does not support HDCP 2.2 (the Intel docs are vague, but most sites suggest there is no support), and it's my understanding that an active HDMI 2.0 adapter is just a pass-through device when it comes to HDCP handshaking. I don't have a handy way to inspect the HDCP status via the Denon X1300W, Windows, or the Vizio M50-E1. Any tips?
Attached Thumbnails: IMG_4646_25.jpg, IMG_0436_25.jpg

Last edited by allanmac; 07-01-2017 at 02:07 PM.
post #577 of 582 Old 07-01-2017, 02:40 PM
NintendoManiac64 (AVS Forum Special Member)
Quote:
Originally Posted by allanmac
Update: I can also confirm that a Netflix "4 Screen Ultra HD" account does not play back at 4K on the Apollo Lake chip with the DP-to-HDMI 2.0 active adapter, despite Windows 10 running at 4K/60Hz/8bpp.

Apollo Lake supports HEVC decode but probably does not support HDCP 2.2 (the Intel docs are vague, but most sites suggest there is no support)
Even if Apollo Lake does in fact support HDCP 2.2, DisplayPort 1.2 does *not* (and no Intel iGPU currently supports DisplayPort 1.3). So as long as you're using a DP-to-HDMI adapter on Intel graphics, you won't have HDCP 2.2.

Don't you just love DRM?
post #578 of 582 Old 07-01-2017, 04:42 PM
allanmac (Member)
Quote:
Originally Posted by NintendoManiac64
Even if Apollo Lake does in fact support HDCP 2.2, DisplayPort 1.2 does *not* (and no Intel iGPU currently supports DisplayPort 1.3). So as long as you're using a DP-to-HDMI adapter on Intel graphics, you won't have HDCP 2.2.

Don't you just love DRM?
Argh... Thanks for clarifying how that works.

So I assume the just-announced ASRock Beebox (or equivalent ITX boards) with an onboard HDMI 2.0 transceiver connected to the Apollo Lake's eDP will also lack HDCP 2.2?

Yeah, DRM is a pain but I suppose Netflix is deeply concerned about pirates capturing the 15+ Mbps HEVC+HDR stream. The quality is pretty good for streaming.

Last edited by allanmac; 07-01-2017 at 04:45 PM.
post #579 of 582 Old 10-13-2017, 07:29 AM
tylerk86 (Senior Member)
Hi Everyone,
I'm hoping someone can weigh in on the workflow I've posted below. I'm trying to learn about chroma subsampling and bit depth and how they interact between my device and display, so feel free to poke holes in it. I'm using an Apple TV 4K and an LG C7. I try to avoid any "artificial" enhancement of data, meaning adding data (on the device) that is not present in the source file. I'm using 60Hz for everything because it runs the UI smoothly, and I can accurately reverse the 3:2 pulldown to 24Hz using the Real Cinema controls.

Firstly, my understanding is that a typical SDR file is delivered to the content provider with 4:2:0 chroma subsampling and 8-bit color depth, while a typical HDR file is delivered with 4:2:0 chroma subsampling and 10-bit color depth.

My Apple TV 4K allows me to pick 1080p SDR@60Hz, YCbCr 4:4:4 for SDR content, and this format would probably be my first inclination since the resolution isn't being upscaled right off the bat. However, the chroma is being upsampled; my C7 apparently cannot "properly" display a 4:4:4 signal outside of PC mode anyway (I haven't been able to find out exactly what goes wrong here); and this mode locks the refresh rate at 60Hz, which I won't choose because I don't enjoy the look of 3:2 pulldown. That brings me to my next option for SDR: 4K SDR@60Hz, YCbCr 4:2:0. This seems to be the best option for my setup, since I can choose the native 4:2:0 chroma, which my display has no issues with, and it should still be an 8-bit signal, so no device-added data.
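To check my own understanding of what upsampling does, I put together a toy model (Python/numpy; my simplification, not any real device's pipeline) of 4:2:0 -- one chroma sample is kept per 2x2 block, and upsampling just spreads it back out, so detail that was never encoded can't come back:

Code:
# Toy model of 4:2:0 chroma: keep one sample per 2x2 block, then upsample.
import numpy as np

def subsample_420(chroma):
    """Average each 2x2 block of a full-res chroma plane."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample_nearest(chroma_low):
    """Spread each stored sample back over its 2x2 block."""
    return chroma_low.repeat(2, axis=0).repeat(2, axis=1)

rng = np.random.default_rng(0)
cb = rng.random((8, 8))                      # a full-res chroma plane
restored = upsample_nearest(subsample_420(cb))
print(np.abs(cb - restored).max())           # nonzero: lost detail stays lost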
As far as HDR goes, I can choose 4K HDR@60Hz, YCbCr 4:2:0 or 4:2:2. I believe 4:2:0 to be the correct chroma format here, since the file is encoded as such, and this format is going to give me 10-bit color depth.
Next up would be Dolby Vision, which I have no choices for (though I believe it's a 4:2:0, 12-bit file?).
Is there ever a reason, as far as source bit-depth fidelity goes, to choose a 24Hz refresh rate? I'm in the process of testing some chroma patterns on my set to get the best result -- just looking for some insight into my thought process and understanding of these topics.

LG OLED55C7P
Apple TV 4K
Nintendo Switch
post #580 of 582 Old 10-13-2017, 10:13 PM
NintendoManiac64 (AVS Forum Special Member)
You would only ever use 24Hz for two things:

1. to match the frame rate of a 24fps video

2. to reduce the bandwidth used so that you can use higher resolutions and color depth / chroma, etc. (see the rough numbers sketched below)
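
Some back-of-the-envelope numbers for point 2 (my own rough arithmetic, ignoring blanking intervals and signaling-overhead details, so treat them as approximate) against HDMI 2.0's ~14.4 Gbit/s of usable data rate:

Code:
# Rough sketch: active-pixel data rates vs. HDMI 2.0's usable bandwidth.
HDMI20_DATA_GBPS = 14.4  # ~18 Gbit/s TMDS minus 8b/10b encoding overhead

def gbit_per_s(w, h, hz, bits, chroma):
    # Average channel count per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return w * h * hz * bits * channels / 1e9

for w, h, hz, bits, chroma in [
    (3840, 2160, 60, 8,  "4:4:4"),
    (3840, 2160, 60, 10, "4:4:4"),   # the one that doesn't fit
    (3840, 2160, 60, 10, "4:2:0"),
    (3840, 2160, 24, 12, "4:4:4"),   # dropping to 24Hz buys headroom
]:
    rate = gbit_per_s(w, h, hz, bits, chroma)
    status = "fits" if rate <= HDMI20_DATA_GBPS else "over budget"
    print(f"{w}x{h} @ {hz}Hz {bits}-bit {chroma}: {rate:4.1f} Gbit/s ({status})")

The exact limits depend on the video timings, but it shows why 4K60 10-bit 4:4:4 needs more than HDMI 2.0 while 24Hz leaves plenty of headroom.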
post #581 of 582 Old 10-14-2017, 06:13 AM
tylerk86 (Senior Member)
Quote:
Originally Posted by NintendoManiac64
You would only ever use 24Hz for two things:

1. to match the frame rate of a 24fps video

2. to reduce the bandwidth used so that you can use higher resolutions and color depth / chroma, etc.

My understanding is that I can get the correct color depth (correct being what the content is encoded at) while using 60Hz, for both SDR and HDR content, right? I know what you're saying, and if I drop the refresh rate I can have the device output 4:4:4 for SDR, but if it wasn't encoded that way, I'd rather let my TV upsample the chroma.

LG OLED55C7P
Apple TV 4K
Nintendo Switch
post #582 of 582 Old 10-14-2017, 01:34 PM
NintendoManiac64 (AVS Forum Special Member)
I personally would stick with 60Hz because, if you watch something that's encoded at 30fps, it won't be a judder fest.

Also, if you manage to find something that's in 25fps, there should be less noticeable judder at 60Hz than at 24Hz.
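
Here's a quick way to see the cadences I mean -- a little Python sketch of my own (just an illustration, not anything from a spec) showing how many refreshes each source frame gets held for; an uneven pattern is what reads as judder:

Code:
# How many display refreshes each source frame is held for.
def cadence(source_fps, display_hz, frames=10):
    shown, counts = 0, []
    for i in range(1, frames + 1):
        target = (i * display_hz) // source_fps  # refreshes elapsed by frame i
        counts.append(target - shown)
        shown = target
    return counts

print(cadence(24, 60))  # [2, 3, 2, 3, ...]   -> the regular 3:2 pulldown
print(cadence(25, 60))  # [2, 2, 3, 2, 3, ...] -> irregular, mild judder
print(cadence(25, 24))  # contains a 0 -> one frame per second gets dropped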