2019 C9–E9 Owner's Thread (No Price Talk) - Page 32 - AVS Forum | Home Theater Discussions And Reviews
post #931 of 1111 | 04-15-2019, 01:57 PM | avernar (Member)
Quote:
Originally Posted by VindicatorDX View Post
If the cable was the limiting factor, I don't believe 60Hz would be a choice for 4K with HDR turned on in the options, but it always is. I do, however, think the Xbox could limit the bit depth to 8-bit vs. 10 on its own, because some research suggests 4K 10-bit HDR 60Hz is beyond the limit of HDMI 2.0b, at about 22Gbit/s. So that's the scenario where the Xbox would likely force 8-bit color if the cable and TV weren't up to par and 4:2:2 mode was unchecked.
You can't do 4K/60p 4:4:4 HDR10 (20.05Gbps) with HDMI 2.0b. You can do 4K/60p 4:2:2 HDR10 (17.82Gbps) and 4K/60p 4:2:0 HDR10 (11.14Gbps) just fine. You can't drop to 8-bit with HDR, so if you don't allow 4:2:2 it will drop to 4K/60p 4:2:0 HDR10 or to 4K/30p 4:4:4 HDR10. It's easy enough to see on your TV's info screen whether it's 30p or 60p.


That's why the C9 with its 48Gbps ports is of interest to me. Actually, I'm going to buy one soon. With the next gen consoles we'll be able to do the full 4K/60p 4:4:4 HDR10.
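For anyone who wants to sanity-check those figures, here is a rough Python sketch using the standard CTA-861 4K/60 timing (4400 x 2250 total pixels including blanking). Exact numbers vary between calculators depending on their assumptions (the 4:4:4 10-bit figure comes out a bit higher here than the 20.05Gbps quoted above), but the conclusion is the same: 4K/60p 4:4:4 10-bit doesn't fit in HDMI 2.0's 18Gbps, while 4:2:2 and 4:2:0 do.

```python
HDMI_20_MAX_GBPS = 18.0  # 600 MHz TMDS clock x 3 lanes x 10 bits per character

def tmds_gbps(h_total, v_total, fps, bits_per_component, chroma):
    """Approximate TMDS bit rate in Gbps for a given video format."""
    pixel_clock = h_total * v_total * fps  # total pixels include blanking
    if chroma == "4:2:2":
        factor = 1.0  # HDMI carries 4:2:2 (up to 12-bit) in the 8-bit clock container
    elif chroma == "4:2:0":
        factor = 0.5 * bits_per_component / 8  # half the samples per pixel clock
    else:  # "4:4:4" / RGB
        factor = bits_per_component / 8
    # 3 TMDS lanes, 10 bits transmitted per 8-bit character (8b/10b)
    return pixel_clock * factor * 3 * 10 / 1e9

for chroma in ("4:4:4", "4:2:2", "4:2:0"):
    rate = tmds_gbps(4400, 2250, 60, 10, chroma)
    verdict = "fits" if rate <= HDMI_20_MAX_GBPS else "exceeds"
    print(f"4K/60 {chroma} 10-bit: {rate:.2f} Gbps ({verdict} HDMI 2.0)")
```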
avernar is offline  
post #932 of 1111 | 04-15-2019, 02:22 PM | pelleran (Member)
Quote:
Originally Posted by VindicatorDX View Post
So I think the problem here is Xbox capabilities vs. cable bandwidth. I tested Forza Horizon 4 just now in 4K mode with 4:2:2 disabled. The game runs smoothly if "Performance" is selected in Forza's options menu. If Quality is selected, then it's ~30fps and stuttery. Xbox One X hardware can't handle 4K @ 60Hz on a lot of games, so this is the choice many games offer to get whichever you value more, performance or quality.

If the cable was the limiting factor, I don't believe 60Hz would be a choice for 4K with HDR turned on in the options, but it always is. I do, however, think the Xbox could limit the bit depth to 8-bit vs. 10 on its own, because some research suggests 4K 10-bit HDR 60Hz is beyond the limit of HDMI 2.0b, at about 22Gbit/s. So that's the scenario where the Xbox would likely force 8-bit color if the cable and TV weren't up to par and 4:2:2 mode was unchecked.
Thank you for testing. It still does not make sense to me: why would it stutter so badly at 30fps with HDR in 4:4:4 and 4K? Quality mode (30fps) with HDR 10-bit 4:4:4 should still be within the 18Gbps of HDMI 2.0b (that is, the Xbox One X port). By the way, if you disable "instant game response" (game mode) from the TV menu (not from the picture mode, but from the additional options of the HDMI port), you will see that the stutter goes away even in Quality mode. I suspect that in game mode the TV "expects" to operate at 60fps, and so 4:4:4 HDR 10-bit does not cut it. So the only way is to allow 4:2:2. Other games have the same issue if run at 4K HDR in game mode without 4:2:2 enabled. What does Performance mode do in Forza? Is it still 4K?
My guess is that the "feature enabled" 2.1 port on the Xbox does not handshake well with the fully 2.1 compliant C9 port.
This is all very confusing to me, please let me know what you think.
pelleran is offline  
post #933 of 1111 | 04-15-2019, 03:27 PM | VindicatorDX (Member)
Quote:
Originally Posted by jmpage2 View Post
Do you have the capability of testing eARC on the C9?
My receiver doesn't even have HDMI
VindicatorDX is offline  
post #934 of 1111 | 04-15-2019, 03:30 PM | VindicatorDX (Member)
Quote:
Originally Posted by pelleran View Post
Thank you for testing. Still does not make sense to me: why would it stutter so badly at 30fps with HDR in 4.4.4 and 4k? ...
Performance mode is not 4K, meaning that Forza Horizon 4 does not support more than 30fps at 4K.
VindicatorDX is offline  
post #935 of 1111 | 04-15-2019, 03:33 PM | VindicatorDX (Member)
Quote:
Originally Posted by avernar View Post
You can't do 4K/60p 4:4:4 HDR10 (20.05Gbps) with HDMI 2.0b. ...
Ah, my bad, I forgot that the 10 in HDR10 meant 10-bit. Makes sense that you couldn't bump it down to 8-bit.
VindicatorDX is offline  
post #936 of 1111 | 04-15-2019, 03:38 PM | Cleveland Plasma (AVS Forum Addicted Member, Industry Insider)
Quote:
Originally Posted by ma1746 View Post
I watched the 4K Blu-ray of Schindler's List on my C9 today, wow... if a better disc exists to show off this new set, I'd like to see it. Absolutely a treat to watch. I watched it in DV Cinema Home at default settings and the blacks were perfect.


Sent from my iPhone using Tapatalk
You would be better off listing discs that do not look good. It would be a short list...
ma1746 likes this.
Cleveland Plasma is online now  
post #937 of 1111 | 04-15-2019, 03:42 PM | Lunatic_Gamer (Newbie)
Quote:
Originally Posted by gorman42 View Post
Thank you. I'll eagerly wait for your update after the email.


So got the email from LG today and it is kinda... vague. Mostly it just repeated what the LG representatives have previously told me. That they ran into a small engineering issue regarding the BFI tuning option. And that was after the product had been demoed at CES 2019. They said the feature is still coming and will be implemented at a later date via FW update.
Personally, I'm really disappointed with this whole situation and sincerely hope it can be fixed in the near future.
I also urge more people to contact LG, as I have multiple times, and put some more pressure on them on this issue.


Lunatic_Gamer is offline  
post #938 of 1111 | 04-15-2019, 04:25 PM | fafrd (AVS Forum Addicted Member)
Quote:
Originally Posted by Lunatic_Gamer View Post
So got the email from LG today and it is kinda... vague. ...
Would love to see a cut and paste of that email from LG.

If LG has stated that 120Hz BFI will eventually be supported on the C9 through FW update, that's not vague to me at all and will lead me to put a 77C9 back in my future AV plans.

Of course, I won't actually purchase until LG has delivered, so I suppose I am a 'trust but verify' sort of videophile.
fafrd is offline  
post #939 of 1111 | 04-16-2019, 12:04 AM | magic_carpet (Newbie)
Quote:
Originally Posted by avernar View Post
You can't do 4K/60p 4:4:4 HDR10 (20.05Gbps) with HDMI 2.0b...

Sigh... That's all I want from this TV and my 1080Ti...
I think I'll wait for the next generation of 2.1 video cards (Navi or 3080).
Anyway, it would be interesting if anyone with an AMD video card could try VRR.
It should be compatible with AMD FreeSync, or not?
magic_carpet is offline  
post #940 of 1111 | 04-16-2019, 06:32 AM | avernar (Member)
Quote:
Originally Posted by magic_carpet View Post
It should be compatible with AMD freesync, or not?
Not quite the same but I've read that AMD is eventually adding VRR to their Radeon RX GPUs in a driver update. It will be a while before Nvidia supports it.
avernar is offline  
post #941 of 1111 | 04-16-2019, 07:40 AM | BAC05 (Newbie)
I just purchased a 65 inch LG C9. Right now I've got a Sony HT-ST5000 soundbar. Obviously it doesn't have integration with the TV. Would it be beneficial for me to just purchase a new LG SL10YG soundbar? Would anyone know if there's a quality difference? I know they are new, but I'm not quite sure how that would work with my C9.
BAC05 is offline  
post #942 of 1111 | 04-16-2019, 08:10 AM | TechNerd666 (Member)
Quote:
Originally Posted by BAC05 View Post
I just purchased a 65 inch LG C9. Right now I’ve got a Sony HT – ST 5000 sound bar. ...
There’s no need for that. Your current soundbar should work just fine.
TechNerd666 is offline  
post #943 of 1111 | 04-16-2019, 08:44 AM | New_to_4K (Advanced Member)
Quote:
Originally Posted by BAC05 View Post
I just purchased a 65 inch LG C9. Right now I’ve got a Sony HT – ST 5000 sound bar. ...
The HT-ST5000 now supports eARC, like your C9. Give it a try first.
New_to_4K is online now  
post #944 of 1111 | 04-16-2019, 08:51 AM | magic_carpet (Newbie)
Quote:
Originally Posted by avernar View Post
Not quite the same but I've read that AMD is eventually adding VRR to their Radeon RX GPUs in a driver update. It will be a while before Nvidia supports it.

Hence, if I have a current-gen AMD video card, VRR is not supported?
Wow, this is a mess...
magic_carpet is offline  
post #945 of 1111 | 04-16-2019, 09:01 AM | guitarguy316 (AVS Forum Special Member)
Quote:
Originally Posted by magic_carpet View Post
Hence, if I have a current-gen AMD video card, VRR is not supported?
Wow, this is a mess...
Nobody knows until someone tries it. With the Xbox there are very few games with FreeSync support.
guitarguy316 is online now  
post #946 of 1111 | 04-16-2019, 10:10 AM | SED <--- Rules (AVS Forum Special Member)
Quote:
Originally Posted by Lunatic_Gamer View Post
So got the email from LG today and it is kinda... vague. ...
I'm glad that they're working on the BFI. For many here, a properly working BFI implementation (low, med, high) will push them to buy the TV rather than hold back. And to be fair, engineering problems happen in the technology business. If you know anything about the big delay JVC had with their projectors, you can understand that these things happen. But like any problem, if it can be fixed then it's all good. I hope LG is diligently working on the situation.
Lunatic_Gamer likes this.
SED <--- Rules is offline  
post #947 of 1111 | 04-16-2019, 10:10 AM | CPTuell (Senior Member)
Quote:
Originally Posted by helvetica bold View Post
So cable is bit depth starved, correct? Is this much dithering normal in dark scenes? I guess this is where Sony's processing helps.
FYI, this is from a Spectrum cable feed, so I assume it's pretty crappy quality.

I DVR'd the show from DirecTV and watched it on my 5yr old Samsung HU9000 with SEK3500 One Connect. Went back and looked at those scenes again and they were smooth as can be. I'm leaning toward buying the C9 in a couple of weeks. I have a hard time believing it would look worse than 5yr old Sammy. I would have to say it is definitely the source causing the issue.

CPTuell
" I'm a man, I can change, If I have to......I guess." Words to live by. From the Red Green Show...The Man's Prayer
UN65HU9000/SEK3500/US02,DirecTv HR-54 Genie w/4K MiniGenie, Denon AVR-X3300W, Roku Ultra 4K/HDR, Chromecast Ultra 4K/HDR, Sony UBP-X800
CPTuell is online now  
post #948 of 1111 | 04-16-2019, 10:54 AM | jmpage2 (AVS Forum Addicted Member)
Quote:
Originally Posted by SED <--- Rules View Post
I'm glad that they're working on the BFI. ...

I don't think anyone should buy this TV assuming that 3.5ms BFI is going to show up. LG could cite any one of a number of reasons for it never happening, and all we have is some "vague emails" (that haven't been shared in this thread) as an indication the feature will make it to this generation. Contrast that with a well-known calibrator working at a major electronics seller who was told by LG flatly that no 2019 OLEDs will have it due to technical limitations.
jmpage2 is offline  
post #949 of 1111 | 04-16-2019, 11:04 AM | fafrd (AVS Forum Addicted Member)
Quote:
Originally Posted by jmpage2 View Post
I don't think anyone should buy this TV assuming that 3.5ms BFI is going to show up. ...
I'm with you - I'll believe it only when I see it.
fafrd is offline  
post #950 of 1111 | 04-16-2019, 11:21 AM | helvetica bold (AVS Forum Special Member, Thread Starter)
Quote:
Originally Posted by CPTuell View Post
I DVR'd the show from DirecTV and watched it on my 5yr old Samsung HU9000 with SEK3500 One Connect. ...


I snapped some pics from Apple TV a few posts later and it looks great! Spectrum Cable is at fault, it's just poor quality.
The C9 is great, you won't be disappointed.
CPTuell likes this.
helvetica bold is online now  
post #951 of 1111 | 04-16-2019, 11:26 AM | helvetica bold (AVS Forum Special Member, Thread Starter)
Quote:
Originally Posted by guitarguy316 View Post
Nobody knows until someone tries it. With the Xbox there are very few games with freesync support.


Gears of War 4 and The Division 2 support VRR on Xbox One X. They look stunning on the C9 and run smoothly, but I can't say VRR is a game changer. It's subtle. I'm more impressed with how the C9 handles HDR.

Today's news from Wired magazine is that the PS5 will support 8K upscaling! I can't keep up.
anwsmh likes this.
helvetica bold is online now  
post #952 of 1111 | 04-16-2019, 12:11 PM | lsorensen (AVS Forum Special Member)
Quote:
Originally Posted by tbonestl View Post
My new C9 has eARC. The Onkyo receiver I'm getting does as well. The receiver does not support other HDMI 2.1 features. Would my best option be to run the Xbox One X directly to the TV, and another cable from my receiver to the TV using the eARC HDMI port on the TV? Then would the receiver get the HD audio formats from my Xbox through eARC? Just trying to figure out my final setup. Thanks!
Going Xbox to TV and eARC from TV to AVR would be ideal. You get best audio and best video options, and you get to keep the Xbox input in game mode with the AVR input in video mode if you want, without having to change modes when playing games. Seems ideal to me.

Len Sorensen

Sony XBR55A1E, Marantz SR6012, Benq W7000, Oppo BDP-93, PSB Image T5/C5/B4/Subseries 200
lsorensen is offline  
post #953 of 1111 | 04-16-2019, 02:00 PM | emmapeel159 (Member)
Quote:
Originally Posted by helvetica bold View Post
Gears of War 4 and Division 2 support VRR on Xbox One X. They look stunning on the C9 and run smoothly but I can’t say VRR is a game changer. ...
Game dev here. VRR is a game changer, but the nature of the technology is such that you won't actually ever notice it working. It's just going to make the games you play a whole lot smoother and get rid of occasional fps drops (it does not really get rid of them, just minimizes you noticing them) that all games suffer from.

This will not help all games, as VRR is usually range bound and won't work past the range. I am not sure what the VRR ranges are for the LG OLEDs (each resolution might, and probably will, have a different VRR range), but to give an example, some FreeSync monitor tables I found support 4K VRR from 48fps to 60fps. The problem with this is that most 4K games currently only do 30fps (don't expect this to change for current gen consoles, and I also wouldn't expect many 60fps 4K games on next gen). VRR will do nothing for a 30fps 4K game that drops frames when the minimum of the VRR range is 48.

I also wouldn't expect much on the "above 60fps" VRR front for a while when attached to a console either, not for this gen. Next gen is even questionable, because having a "performance" option in console games is not something that's culturally acceptable/understood right now for most console gamers. It's really hard getting people to understand that setting your console resolution to 1080p instead of 4K will usually get you a vastly better play experience (vs. visual experience). The 4K marketing machine has sold people on the idea that it's just flat-out better all around, which is not necessarily true for games.

Also, specifically with VRR on the Xbox One X: each game developer has to enable this internally. If the game was not made to handle VRR, it will not currently work, and likely it never will be updated to support this. I am not sure of the amount of work involved on the engine side to make VRR work (it might be wildly different depending on the game engine), but it might be something that most game makers DO NOT support if it's tricky or steals resources from non-VRR work. It might not be something supported by most games this console generation.

However, all that being said, as a game dev and a game player I am really excited for VRR. I think this is one of the most revolutionary hardware developments for games on the display side in a long, long time, along with HDR. It is the future; it might just take a while for broad adoption.
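The range-bound behavior described above can be put in a toy model. The 48-60Hz window is the hypothetical FreeSync example from this post, not a measured figure for the C9:

```python
# Toy model of range-bound VRR: the panel tracks the game only inside the window.
VRR_MIN, VRR_MAX = 48, 60  # hypothetical 4K FreeSync window

def effective_refresh(game_fps):
    """Return (panel_refresh_hz, vrr_active) for an instantaneous frame rate."""
    if VRR_MIN <= game_fps <= VRR_MAX:
        return game_fps, True   # panel refresh tracks the game: no tearing, no judder
    if game_fps > VRR_MAX:
        return VRR_MAX, False   # above the window: capped, vsync/tearing again
    return VRR_MAX, False       # below the floor: fixed refresh, drops are visible

print(effective_refresh(55))  # inside the window: panel follows the game
print(effective_refresh(30))  # a 30fps 4K game gains nothing from a 48-60 window
```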

Anyway, sorry to go off topic with that. Back on topic: I bought my first HDR TV (LG C965) last week and I am blown away by it all. We have been using LG OLEDs for quite a while at work, but it's a completely different experience sitting in the comfort of your own home comparing it to your old setup (Pioneer Kuro 500M). I am also marveling at all the modern touches which I did not get on the old TV, simple things like the TV sources auto-recognizing what devices I have attached and auto-turning on my PS4 when I switch to that input.

Last edited by emmapeel159; 04-16-2019 at 02:27 PM.
emmapeel159 is offline  
post #954 of 1111 | 04-16-2019, 03:19 PM | NintendoManiac64 (AVS Forum Special Member)
Quote:
Originally Posted by emmapeel159 View Post
VRR will do nothing for a 30fps 4K game that drops frames when the minimum VRR range is 48.
Back in the day on the N64 with 20fps games, I recall that games like Zelda: Ocarina of Time rendered menus at 30fps and the game at 20fps, but the frame buffer always updated at 60Hz with a 240p output regardless, meaning the 30fps situations were frame-doubled while the 20fps situations were frame-tripled.

Therefore, couldn't the game dev cheat the system and simply do a similar sort of frame-doubling on their own, rather than relying on the console itself to do it, thereby resulting in the Xbox seeing a 30fps game as being like a native 60fps game, where any frame drop to, say, 28fps would be treated as 56fps?

As a whole this methodology is similar to the low-framerate compensation (LFC) mode in FreeSync, except that LFC requires the maximum to be more than 2x the minimum (which the Xbox can do at least at 1440p); on the PC, low-framerate compensation is done at the driver level and therefore occurs at a lower level than the actual code running inside of the game.
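The frame-multiplication idea can be sketched like so. This is essentially what FreeSync's LFC does at the driver level; the 40-120Hz window here is hypothetical, chosen only because LFC needs max > 2x min:

```python
# Present each rendered frame an integer number of times so the refresh rate
# lands back inside the VRR window (the mechanism behind FreeSync LFC).
VRR_MIN, VRR_MAX = 40, 120  # hypothetical LFC-capable window (max > 2x min)

def presented_rate(game_fps):
    """Smallest integer multiple of game_fps at or above the VRR floor."""
    multiple = 1
    while game_fps * multiple < VRR_MIN:
        multiple += 1
    return game_fps * multiple, multiple

print(presented_rate(28))  # a 28fps dip is shown doubled, still inside the window
print(presented_rate(55))  # already inside the window: shown as-is
```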


Quote:
Originally Posted by emmapeel159 View Post
as a game dev and a game player I am really excited for VRR.

bought my first HDR tv (LG C965)
Surely then you must have a semi-modern AMD GPU (whether discrete or integrated) lying around, no?

Thus far nobody in this thread has had such a thing, and therefore nobody has been able to check what the FreeSync range is via something like CRU (Custom Resolution Utility) at the likes of 720p, 1080p, 1440p, 4K, etc.

(And heck, maybe one could even see if CRU can be used to extend the FreeSync range farther.)
NintendoManiac64 is offline  
post #955 of 1111 | 04-16-2019, 03:44 PM | helvetica bold (AVS Forum Special Member, Thread Starter)
Quote:
Originally Posted by emmapeel159 View Post
Game dev here. VRR is a game changer but the nature of the technology is such that you wont actually ever notice it working. ...


Thanks for the insight! Very interesting post, and honestly I don't know much about VRR. You hit the nail on the head regarding "won't actually ever notice it working". I've been playing a decent amount of The Division 2, which supports VRR according to Xbox.com.
The Division 2 runs at a mostly native (dynamic) 4K @ 30fps. I'm not sure how VRR works with the C9; is that within the VRR range (HDMI 2.1)? I will say the game feels very smooth. This also goes for Gears of War 4, very smooth and stunning with HDR.
helvetica bold is online now  
post #956 of 1111 | 04-16-2019, 04:55 PM | convexmacrolabs (Member)
Quote:
Originally Posted by emmapeel159 View Post
Game dev here. VRR is a game changer but the nature of the technology is such that you wont actually ever notice it working.

If you play a game without vsync or triple buffering on, it will be very noticeable whether it is working or not because of screen tearing. I don't know whether any game fits the bill on XB1X, but if someone had a PC with a GPU that supported Freesync it would be pretty easy to confirm whether or not it was actually working properly.
convexmacrolabs is offline  
post #957 of 1111 | 04-16-2019, 05:09 PM | emmapeel159 (Member)
Quote:
Originally Posted by convexmacrolabs View Post
If you play a game without vsync or triple buffering on, it will be very noticeable whether it is working or not because of screen tearing. ...
For sure, a savvy user could determine whether it's working or not, but for most people it would not be something they notice, especially if the game is already pretty optimal (mainly running at 60, for example).

I was simply agreeing that the results are very subtle, unlike say HDR, which hits you over the head with a noticeable difference. For example, my significant other notices the HDR on the TV but would never notice VRR working.
emmapeel159 is offline  
post #958 of 1111 | 04-16-2019, 05:20 PM | emmapeel159 (Member)
Quote:
Originally Posted by helvetica bold View Post
Thanks for the insight! Very interesting post and honestly I don’t know much about VRR. ...
It would be nice to get more information about the VRR support for these games (and the TV) and whether there are any caveats. They might only support VRR in certain modes (like 1080p). I have no knowledge of how these games work.

I have tried looking for VRR info for the LG and I cannot seem to find anything.
emmapeel159 is offline  
post #959 of 1111 | 04-16-2019, 05:34 PM | NintendoManiac64 (AVS Forum Special Member)
Quote:
Originally Posted by emmapeel159 View Post
I have tried looking for VRR info for the LG and I can not seem to find anything.
As I alluded to, CRU (Custom Resolution Utility) combined with a FreeSync-supporting AMD GPU connected over HDMI can tell you the FreeSync range of a given display.
NintendoManiac64 is offline  
post #960 of 1111 | 04-16-2019, 05:42 PM | emmapeel159 (Member)
Quote:
Originally Posted by NintendoManiac64 View Post
Back in the day on the N64 with 20fps games, I recall games like Zelda Ocarina of Time rendered menus at 30fps and the game at 20fps, but the frame buffer always updated at 60Hz with a 240 progressive output regardless... ...
So I don't know anything about N64 development, as I never worked on that specifically, but consoles currently work that way (minus the 20/30Hz UI/game split). If a game is rendering at 30Hz, the TV essentially renders the same frame twice, because the consoles are always outputting at 60Hz regardless (non-VRR). The engine puts the new frame in the buffer at 30Hz, but the console pumps it to the TV at 60Hz, so you get twice the frames.

I don't know enough about VRR to know why you would not get the benefit from this as you suggest, but there must be a reason, as you don't get the benefit with current implementations. I suspect, knowing how many brainiacs are in the game industry and graphics hardware industry, that if there were any large-scale improvements to be made this way, they already would have been.

Unfortunately, card-wise I only have an Nvidia card (RTX 2080 Ti), so I can't run any tests for you. My work as well is Nvidia-only (everyone is standardized on just a couple of cards).
Although, I remember hearing recently that Nvidia was going to start supporting FreeSync.
emmapeel159 is offline  