Advanced MPC-HC Setup Guide - Page 157 - AVS Forum | Home Theater Discussions And Reviews
post #4681 of 4716 Old 08-09-2019, 12:53 AM
Quote:
Originally Posted by mightyhuhn
A full range chain is technically the best.
So should I leave the options as they are? I ask because I read that TV levels (limited RGB) are better for movies.
lazostat is offline  
post #4682 of 4716 Old 08-09-2019, 07:37 AM
Yes, if your end device properly supports full range.
mightyhuhn is offline  
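For reference, "full range" versus "TV levels" is only a question of how code values are mapped; some stage in the chain has to perform the limited-to-full expansion exactly once. A minimal numpy sketch of that mapping (the function name is illustrative; this is not madVR's internal code):

```python
import numpy as np

def limited_to_full(y8: np.ndarray) -> np.ndarray:
    """Expand 8-bit limited-range (TV, 16-235) code values to full range (PC, 0-255)."""
    y = y8.astype(np.float64)
    full = (y - 16.0) * (255.0 / 219.0)   # 219 usable steps (235 - 16) stretch to 255
    return np.clip(np.rint(full), 0, 255).astype(np.uint8)

# TV-levels black (16) and white (235) land exactly on PC-levels 0 and 255:
print(limited_to_full(np.array([16, 128, 235], dtype=np.uint8)))   # -> [  0 130 255]
```

If this expansion happens twice, or never, somewhere between decoder and panel, you get washed-out or crushed blacks, which is why the answer depends on what the end device actually supports.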
post #4683 of 4716 Old 08-09-2019, 02:56 PM
I'm new to this and was wondering what the best settings are for an RTX 2080 with a 1440p 165 Hz monitor.
I tried to follow the guide, but some options do not appear for me; I'm using MPC-HC 1.7.13 and madVR 0.92.17.
wizixutofo is offline  
post #4684 of 4716 Old 09-06-2019, 11:48 AM
Hello everybody!
I'm looking for the best settings for madVR with a Gigabyte 1050 Ti, Core i5-7400, and 8 GB of RAM. Thanks in advance.
booleano is offline  
post #4685 of 4716 Old 09-07-2019, 02:42 AM
Quote:
Originally Posted by booleano
Hello everybody!
I'm looking for the best settings for madVR with a Gigabyte 1050 Ti, Core i5-7400, and 8 GB of RAM. Thanks in advance.
There are no "best" settings; it comes down to the user's personal preference. No GPU can run madVR with all the settings at best quality, so compromises have to be made, especially with a 1050 Ti.

Link to madvr guide:
https://forum.kodi.tv/showthread.php?tid=259188
Link to madvr thread:
https://www.avsforum.com/forum/26-ho...thread-42.html
noob00224 is online now  
post #4686 of 4716 Old 09-11-2019, 06:17 PM
Edited!

Last edited by scarface717; 09-12-2019 at 11:39 AM.
scarface717 is offline  
post #4687 of 4716 Old 09-12-2019, 11:31 AM
@scarface717

The HDD should be able to read a file with that bitrate.
Check the render time in the madVR OSD; it should stay under 41 ms (roughly one frame at 24 fps).

LE: You can post a screenshot of the OSD.
Also, this is the thread for MPC-HC; the madVR thread is this one:
https://www.avsforum.com/forum/26-ho...thread-43.html

Last edited by noob00224; 09-12-2019 at 11:45 AM.
noob00224 is online now  
post #4688 of 4716 Old 12-01-2019, 07:12 AM
Hi,
I am wondering which line in the OSD tells me that it is displaying 10-bit HDR on the TV.
I can change what the first line says (NV HDR, 8bit, RGB, etc.) by changing the Nvidia color settings in its control panel so that it outputs 12bit, YCbCr, and so on. Does this change anything at all?
I assume that the line "D3D11 fullscreen windowed (10bit)" tells me that I actually output HDR; is that assumption correct?
I cannot judge HDR by eye, since I've never seen it, so I want to make sure it is being output.

I run a GTX1060 with a Samsung Q6FN. The TV tells me it is in HDR mode once I open the movie in MPC.

Thanks in advance.
[Attached thumbnail: hdr.PNG (madVR OSD screenshot)]
tallica is offline  
post #4689 of 4716 Old 12-01-2019, 10:42 AM
You are currently outputting 8-bit HDR (line 1), while madVR is outputting 10-bit HDR (line 4).
Just change madVR to 8 bit and move on.

By the way, the file you are playing there is broken and is missing metadata.
mightyhuhn is offline  
post #4690 of 4716 Old 12-01-2019, 01:44 PM
Thanks for answering!

I do not quite understand: if I change madVR to 8 bit, does that mean I do not get HDR? In my mind, HDR is supposed to be synonymous with 10 bit. So if I set madVR to 8 bit, is it still "HDR"? Can you clear that up?
I can make the top line say 12bit (by changing settings in the Nvidia control panel), and I really don't get the meaning of that.
Thanks!
tallica is offline  
post #4691 of 4716 Old 12-01-2019, 02:25 PM
HDR is the gamma, the dynamic range of the video levels; 10 bit is WCG = Wide Color Gamut. My Sony PXW-Z90 video camera shoots 8-bit 4K HLG HDR video, but also shoots 10-bit HD SDR video. Bit depth only relates to banding in colored areas of an image (yes, I know you can have greyscale banding also). The two are not synonymous.

Paul

Anderegg is offline  
post #4692 of 4716 Old 12-01-2019, 03:00 PM
The benefit of 10 bit for encoding is compression efficiency.
The benefit of 10 bit for presentation, when done correctly, is just the noise level, nothing else.
If dithering is handled correctly, banding has nothing to do with bit depth at the presentation level once you have at least 6 bits; at UHD with 6 bits, the noise level just gets really high.
mightyhuhn is offline  
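That claim is easy to reproduce. Below is a minimal sketch (illustrative code, not madVR's actual dithering algorithm) that quantizes a smooth gradient to 6 bits with plain rounding versus with random dither, then measures how wide the flat steps are:

```python
import numpy as np

# Smooth horizontal ramp at high precision, a stand-in for the 16-bit gradient test image.
grad = np.tile(np.linspace(0.0, 1.0, 3840), (100, 1))

def quantize(x: np.ndarray, bits: int, dither: bool) -> np.ndarray:
    """Reduce 0..1 floats to the given bit depth, optionally adding dither noise first."""
    levels = (1 << bits) - 1
    noise = np.random.uniform(-0.5, 0.5, x.shape) if dither else 0.0
    return np.clip(np.round(x * levels + noise), 0, levels) / levels

def longest_flat_run(row: np.ndarray) -> int:
    """Length in pixels of the widest stretch of identical adjacent values."""
    changes = np.flatnonzero(np.diff(row) != 0)
    edges = np.concatenate(([0], changes + 1, [row.size]))
    return int(np.diff(edges).max())

banded   = quantize(grad, 6, dither=False)   # wide, visible bands
dithered = quantize(grad, 6, dither=True)    # same 64 levels, bands traded for fine noise

print(longest_flat_run(banded[0]), longest_flat_run(dithered[0]))   # e.g. ~61 vs ~4
```

Both outputs are restricted to the same 64 values; dithering just trades wide visible bands for fine per-pixel noise, which is exactly the noise-for-banding trade-off described above.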
post #4693 of 4716 Old 12-01-2019, 03:05 PM
Quote:
Originally Posted by tallica
Thanks for answering!

I do not quite understand: if I change madVR to 8 bit, does that mean I do not get HDR? In my mind, HDR is supposed to be synonymous with 10 bit. So if I set madVR to 8 bit, is it still "HDR"? Can you clear that up?
I can make the top line say 12bit (by changing settings in the Nvidia control panel), and I really don't get the meaning of that.
Thanks!
8 bit is fine for HDR.

Quote:
Originally Posted by Onkyoman
I would recommend using 8-bit RGB with madVR anyway, as it is visually identical to 10 bits because of madVR's dithering.
noob00224 is online now  
post #4694 of 4716 Old 12-01-2019, 03:15 PM
Quote:
Originally Posted by mightyhuhn
The benefit of 10 bit for encoding is compression efficiency.
The benefit of 10 bit for presentation, when done correctly, is just the noise level, nothing else.
If dithering is handled correctly, banding has nothing to do with bit depth at the presentation level once you have at least 6 bits; at UHD with 6 bits, the noise level just gets really high.
That is nonsense... I work with 10-bit video every single day, for a TV station. 10 bit provides a wider color gamut, which for broadcast allows better green screen, such as placing a weather person in front of a map. 10 bit will absolutely reduce banding: in a shot with the sky, you may get 4 bands with 8 bit but 20 bands with 10 bit, because you have more gradations. Additionally, 10 bit with a low-quality source can create additional noise that 8 bit can mask through its inability to finely resolve video grain/noise. The fact that you think 6-bit video would look the same as 8 or 10 bit shows you really do not have even a rudimentary understanding of bit depth. (Encoding 8-bit Blu-rays and DVDs as 10 bit doesn't count as understanding the topic.) :-)

Paul

Anderegg is offline  
post #4695 of 4716 Old 12-01-2019, 03:26 PM
He is just referring to the lossless use of 10-bit data on UHD Blu-rays by dithering it when rendering it or displaying it as video on a typical consumer display. People have been doing this for years with madVR without seeing any difference.

Mastering or color grading is a different story. That would also be an important consideration beyond simply compressing a finished source with a high initial bit depth so it is distributed without banding.
Onkyoman is online now  
post #4696 of 4716 Old 12-01-2019, 03:32 PM
Quote:
Originally Posted by Onkyoman
He is just referring to the lossless use of 10-bit data on UHD Blu-rays by dithering it when rendering it or displaying it as video on a typical consumer display. People have been doing this for years with madVR without seeing any difference.

Mastering or color grading is a different story. That would also be an important consideration beyond simply compressing a finished source with a high initial bit depth so it is distributed without banding.
Yeah, that's why I mentioned the encoding of Blu-rays... and dithering is basically a form of gradation banding. My Sony TV has an 8-bit-to-10-bit effect that dithers everything. Dithering can create resolution loss and video softening; it's best to stick to your content's bit depth, assuming your signal chain can maintain it. And there is nothing efficient about 10 bit compared to 8 bit: every bit is more data.

Paul

Anderegg is offline  
post #4697 of 4716 Old 12-01-2019, 03:35 PM
I said for presentation only, i.e. just rendering the final image on the screen.

Not processing or mastering, where high bit depth is absolutely necessary to avoid banding. I mean, how do you even dither something if you don't have a higher bit depth to dither from...

Quote:
10 bit provides a wider color gamut
Bit depth has nothing to do with gamut. Absolutely nothing.

Quote:
The fact that you think 6-bit video would look the same as 8 or 10 bit shows you really do not have even a rudimentary understanding of bit depth.
I said the difference is noise; that is clearly not saying they look the same, by the way. I can prove this.
But why should I prove something that has been proven over and over again when you can just test it yourself:
http://www.bealecorner.org/red/test-...ient-16bit.png

I don't have to tell you how to do the test with madVR, right? Since you think I have so little understanding of this topic.
mightyhuhn is offline  
post #4698 of 4716 Old 12-01-2019, 03:40 PM
Quote:
Originally Posted by Anderegg
Yeah, that's why I mentioned the encoding of Blu-rays... and dithering is basically a form of gradation banding. My Sony TV has an 8-bit-to-10-bit effect that dithers everything. Dithering can create resolution loss and video softening; it's best to stick to your content's bit depth, assuming your signal chain can maintain it. And there is nothing efficient about 10 bit compared to 8 bit: every bit is more data.

Paul
OK, let the fun start.
How do you preserve the original 8/10-bit YCbCr with RGB processing?

Do I really need to go back to the countless tests of 10-bit H.264 versus 8-bit H.264, specifically under bitrate starvation?
And next I'll have to prove that 4:4:4 encoding is more often than not more efficient than 4:2:0. Guess what has been proven countless times by now.
mightyhuhn is offline  
post #4699 of 4716 Old 12-01-2019, 03:43 PM
Quote:
Originally Posted by mightyhuhn
I said for presentation only, i.e. just rendering the final image on the screen.

Not processing or mastering, where high bit depth is absolutely necessary to avoid banding. I mean, how do you even dither something if you don't have a higher bit depth to dither from...

Bit depth has nothing to do with gamut. Absolutely nothing.

I said the difference is noise; that is clearly not saying they look the same, by the way. I can prove this.
But why should I prove something that has been proven over and over again when you can just test it yourself:
http://www.bealecorner.org/red/test-...ient-16bit.png

I don't have to tell you how to do the test with madVR, right? Since you think I have so little understanding of this topic.
I was responding to a question as to whether 10 bit is a requirement for HDR; it is not.

Paul

Anderegg is offline  
post #4700 of 4716 Old 12-01-2019, 04:00 PM
Quote:
Originally Posted by mightyhuhn
OK, let the fun start.
How do you preserve the original 8/10-bit YCbCr with RGB processing?
.
Um...magic? I have no clue, not even a rudimentary understanding of that.

Anderegg is offline  
post #4701 of 4716 Old 12-01-2019, 04:09 PM
Me neither, because you end up with floating-point numbers.

And if you really want to be smart here, you can bring up the well-known exception to that?
mightyhuhn is offline  
post #4702 of 4716 Old 12-02-2019, 11:06 AM
OK thanks guys, I had a hard time understanding what you were writing:
Quote:
Originally Posted by mightyhuhn
The benefit of 10 bit for encoding is compression efficiency.
This does not make any sense to me since 2^8 = 256 and 2^10 = 1024, which is 4x more data.

Quote:
Originally Posted by noob00224
8 bit is fine for HDR.
1) So is that under the assumption that madVR applies dithering to the 10-bit image and, furthermore, that this looks the same as 10 bit? But why should I even bother dithering 10 bit down to 8 bit if I can natively display 10 bit?
2) This would also be possible with a native 8-bit TV, correct?

3) So the first line in madVR's OSD tells me what is output from the graphics card, and the "D3D11" line tells me what madVR is outputting to the graphics driver before that?
4) If 3) is correct, is the graphics driver somehow converting the 10-bit range to 8 bit, or how can it produce anything that makes sense?

Thanks!
tallica is offline  
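A quick aside on the arithmetic in that objection: 2^10 = 1024 is the number of levels per sample, but raw storage grows with the bit count, not the level count. For an uncompressed 3840x2160 4:2:0 frame (1.5 samples per pixel):

\[
\frac{\text{10-bit frame}}{\text{8-bit frame}}
= \frac{3840 \times 2160 \times 1.5 \times 10\ \text{bits}}{3840 \times 2160 \times 1.5 \times 8\ \text{bits}}
= \frac{10}{8} = 1.25,
\qquad \text{not} \quad \frac{2^{10}}{2^{8}} = 4.
\]

So 10 bit costs at most 25% more raw data, and after lossy compression even that can shrink, which is the efficiency argument made above.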
post #4703 of 4716 Old 12-02-2019, 03:48 PM
Neither an 8-bit source nor a 10-bit source will result in 8- or 10-bit output that is fine without dithering.
There is only one reason to disable dithering: to prove that you should not disable it.

No display can show the image exactly as it is, and just the step of converting it to something it can display produces "unlimited" bit depth floating-point numbers; this alone is enough to need dithering.
When using 10-bit windowed fullscreen output combined with 8-bit output at the GPU, you will get a lot of banding, because of a bug in the Nvidia driver that is pretty old now. This bug is pretty much ignored because it is just a bad setup.

The next problem is the TVs themselves: they generally fail at this job and add banding. We can't look inside them to know why for sure, so we have to guess: is it low-bit-depth internal processing at, say, 10-12 bits? Do they not dither? We don't know, but we do know that most TVs produce banding with 10-bit input but not with 8-bit.
PC monitors usually don't have this issue.
Quote:
This does not make any sense to me since 2^8 = 256 and 2^10 = 1024, which is 4x more data.
These files are heavily lossy-compressed, and giving the encoder more bit depth has proven to be more efficient. The benefit for HEVC is limited compared to H.264 but still present, mostly because HEVC works internally in 16 bit for all 8-12 bit output; builds that worked with 8 bit internally were discontinued very fast, at least for x265, since they were not competitive anyway.
mightyhuhn is offline  
post #4704 of 4716 Old 12-03-2019, 04:51 AM
Trying to figure out two things:
1. The correct configuration for the setup below.
2. How can WASAPI Exclusive be enabled in MPC-HC/BE, without ReClock?

1. Is this setup considered bitstreaming?

Setup: Xonar DGX > S/PDIF > TOSLINK cable > DAC > headphones.
The DAC only supports PCM and DSD:
1 x TOSLink optical capable of playing 44.1KHz to 192KHz PCM and DSD64 in DoP format.
The USB and COAX S/PDIF inputs also only accept PCM and DSD.

DGX:
The S/PDIF protocol specification (IEC-60958) can only carry 2-channel PCM data or non-PCM AC3/DTS data. So, when a user selects PCM output for S/PDIF, the Xonar sound card will always deliver 2-channel PCM data through the S/PDIF output port.

Optical TOSLINK digital output port. Connects to external digital decoder or digital speaker systems, Home Theater systems, AV receivers for outputting digital audio including PCM, DTS Digital, DTS, WMA-Pro, etc.

In the control panel the S/PDIF passthrough is disabled, but there is audio regardless (see attached screenshots):
LAV filters:

Should any of the bitstreaming options be enabled in LAV?

2.

In MPC-HC > Options > Internal Filters > Audio Renderer:

Is there such an option in MPC-BE?

Should Exclusive mode be checked?
How can I tell if WASAPI mode is working?

If I enable Exclusive mode and there is something else playing in the background, that sound does not cut off.
Shouldn't Exclusive mode silence all other sounds?

Not sure if this matters, but Waves NX is also being used.

When it needs to be used:

In MPC-HC > Options > Playback > Output, the Audio Renderer needs to be changed to NX.
Changing it in Internal Filters > Audio Renderer does nothing.

In MPC-BE > Options > Audio, the Audio Renderer needs to be changed to NX.

In Waves NX itself, the output device is the Xonar:
[Attached screenshots: 07.JPG, 05.JPG, 01.JPG, 04.JPG, 02.png, 08.png]
noob00224 is online now  
post #4705 of 4716 Old 12-03-2019, 06:05 AM
For WASAPI, use sanear; and there is no point in WASAPI if you don't want exclusive mode.

What would be the point of bitstreaming for a 2-channel device,
for a DAC that doesn't support bitstreaming,
in a PC set up for analog audio output?

Quote:
Should any of the bitstreaming options be enabled in LAV?
Only if you have a device that supports bitstreaming, and you haven't shown one.

Waves NX wouldn't really work with bitstreaming.
mightyhuhn is offline  
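One way to sanity-check whether exclusive mode actually engages, outside of any player: a sketch using the third-party python-sounddevice package on Windows. This is a test under assumptions, not an MPC feature; the device index must be the WASAPI entry for your output (see sd.query_devices()). While the tone plays in exclusive mode, all other application audio should cut out:

```python
import numpy as np
import sounddevice as sd

fs = 48000
t = np.arange(2 * fs) / fs                                      # two seconds
tone = (0.2 * np.sin(2 * np.pi * 440 * t)).astype(np.float32)   # quiet 440 Hz test tone

# print(sd.query_devices())   # uncomment to find the WASAPI entry for your output

# Request WASAPI exclusive mode; playback fails if the device refuses the format.
wasapi_exclusive = sd.WasapiSettings(exclusive=True)
sd.play(tone, fs, extra_settings=wasapi_exclusive)   # add device=<WASAPI index> if needed
sd.wait()
```

If background audio keeps playing during the tone, the stream silently went through a shared-mode path (wrong device/host API), which matches the symptom described above.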
post #4706 of 4716 Old 12-03-2019, 08:05 AM
Quote:
Originally Posted by mightyhuhn
For WASAPI, use sanear; and there is no point in WASAPI if you don't want exclusive mode.

What would be the point of bitstreaming for a 2-channel device,
for a DAC that doesn't support bitstreaming,
in a PC set up for analog audio output?

Only if you have a device that supports bitstreaming, and you haven't shown one.

Waves NX wouldn't really work with bitstreaming.
Thanks.
Sanear worked.

I don't want to bitstream; the external DAC (a Chord Mojo) only has digital inputs (USB, COAX and TOSLINK S/PDIF).

So is this bitstreaming, or, if the signal is converted to PCM by the Xonar, is that no longer the case?
noob00224 is online now  
post #4707 of 4716 Old 12-03-2019, 08:24 AM
It is not bitstreaming. Bitstreaming over S/PDIF is done when using AC3 or DTS, not PCM; there is no other way to output spec-conformant surround sound over S/PDIF.
Rather than the digital output on the Asus card, use the digital output of your system:
when present, the onboard digital audio output is even better, since it avoids the Asus driver.
mightyhuhn is offline  
post #4708 of 4716 Old 12-03-2019, 08:46 AM
Quote:
Originally Posted by mightyhuhn
It is not bitstreaming. Bitstreaming over S/PDIF is done when using AC3 or DTS, not PCM; there is no other way to output spec-conformant surround sound over S/PDIF.
Rather than the digital output on the Asus card, use the digital output of your system:
when present, the onboard digital audio output is even better, since it avoids the Asus driver.
The board is a TUF B360-PLUS GAMING and it has a 4-1 pin SPDIF_OUT.

There seem to be some adapters and high-definition front panel audio modules, but I haven't researched them yet.

Why avoid the Asus driver? I know there have been issues, but I haven't had any.


Should I use the MB's digital out to get a digital Coaxial output?

However, the DAC does not support DTS/AC3, just PCM and DSD64/128/256 in DoP. And the final output is headphones.

What's the best option?
noob00224 is online now  
post #4709 of 4716 Old 12-03-2019, 02:57 PM
Quote:
Originally Posted by mightyhuhn
Neither an 8-bit source nor a 10-bit source will result in 8- or 10-bit output that is fine without dithering.
There is only one reason to disable dithering: to prove that you should not disable it.

No display can show the image exactly as it is, and just the step of converting it to something it can display produces "unlimited" bit depth floating-point numbers; this alone is enough to need dithering.
When using 10-bit windowed fullscreen output combined with 8-bit output at the GPU, you will get a lot of banding, because of a bug in the Nvidia driver that is pretty old now. This bug is pretty much ignored because it is just a bad setup.

The next problem is the TVs themselves: they generally fail at this job and add banding. We can't look inside them to know why for sure, so we have to guess: is it low-bit-depth internal processing at, say, 10-12 bits? Do they not dither? We don't know, but we do know that most TVs produce banding with 10-bit input but not with 8-bit.
PC monitors usually don't have this issue.

These files are heavily lossy-compressed, and giving the encoder more bit depth has proven to be more efficient. The benefit for HEVC is limited compared to H.264 but still present, mostly because HEVC works internally in 16 bit for all 8-12 bit output; builds that worked with 8 bit internally were discontinued very fast, at least for x265, since they were not competitive anyway.
OK. Then what's the point of HDR TVs at all?
tallica is offline  
post #4710 of 4716 Old 12-03-2019, 04:07 PM
Quote:
Originally Posted by noob00224
The board is a TUF B360-PLUS GAMING and it has a 4-1 pin SPDIF_OUT.

There seem to be some adapters and high-definition front panel audio modules, but I haven't researched them yet.

Why avoid the Asus driver? I know there have been issues, but I haven't had any.

Should I use the MB's digital out to get a digital Coaxial output?

However, the DAC does not support DTS/AC3, just PCM and DSD64/128/256 in DoP. And the final output is headphones.

What's the best option?
Your board doesn't have S/PDIF and it's not worth spending any money on this.
The Asus driver has jitter problems and DPC issues, and it adds nothing if you are using S/PDIF.
If your headphones don't need an amp, you are fine with just

Quote:
OK. Then what's the point of HDR TVs at all?
I guess the ability to handle PQ, a bigger color space, and increased brightness; HDR stuff like this.

I still don't even understand how you came up with 10 bit being 4 times the data of 8 bit. Just no...
mightyhuhn is offline  
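For the record, "PQ" here is the SMPTE ST 2084 transfer function used by HDR10; it maps a code value E' in [0,1] to absolute luminance:

\[
Y = 10000 \left( \frac{\max\!\left(E'^{1/m_2} - c_1,\ 0\right)}{c_2 - c_3\, E'^{1/m_2}} \right)^{1/m_1} \text{cd/m}^2,
\]
\[
m_1 = \tfrac{2610}{16384},\quad
m_2 = \tfrac{2523}{4096} \times 128,\quad
c_1 = \tfrac{3424}{4096},\quad
c_2 = \tfrac{2413}{4096} \times 32,\quad
c_3 = \tfrac{2392}{4096} \times 32.
\]

That curve, together with the wider gamut and higher brightness, is what makes a signal HDR; bit depth only decides how finely the curve is sampled before dithering.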