Originally Posted by Blue Sun
Well that’s no good at all about the crushed blacks. Was this in PC mode?
Yes, everything has been in PC mode. It can achieve pretty good colors and deep blacks up to a point, but to get past that (thanks to denpom for pointing it out) I can bring Shadow Detail back to -1 and Gamut to +1 to preserve shadow detail without losing much at all, for a richer picture.
Originally Posted by denpom
Your posts are a little bit confusing, but nevertheless, I'm not sure why you're always in PC mode and how exactly you're trying to calibrate it. Most people on these forums don't use PC mode, so it's very hard to give any kind of advice to someone who's trying to use this TV for something other than one of its main purposes.
Apologies, I had assumed you had been reading my posts, since I already explained why: I use my TVs mostly for gaming (aside from Netflix/Hulu at times), so I'm providing feedback for other gamers and users in the same category. As I already stated, I'm well aware I'm within the small percentage who use it outside of movie viewing. Its main purpose is to be used as a display, that is correct; it's up to the user how. I've been doing this on the forums since 2017, and this is only the second TV to ever give me issues, the other being Samsung's blunder on the 2018 Q9, where the gamut tanked heavily between the Q7 and the Q9.
As for my calibration tools, I use the following
SpectraCal C6 &
X-Rite i1Display Pro colorimeter, along with DisplayCAL software. Why both? I owned the X-Rite prior and like to run comparisons.
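If you want to quantify how the two meters compare rather than eyeballing it, one simple approach (a hypothetical sketch, not tied to either vendor's software; the patch readings below are made-up numbers) is to take each instrument's L*a*b* readings of the same test patch and compute the CIE76 color difference:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings of the same gray patch from the two colorimeters
c6_reading = (50.2, 0.4, -0.3)    # SpectraCal C6
i1d3_reading = (50.9, 0.1, 0.2)   # X-Rite i1Display Pro

de = delta_e_76(c6_reading, i1d3_reading)
print(f"dE76 between meters: {de:.2f}")
```

A rule of thumb is that CIE76 differences under about 1.0 are barely visible, so agreement at that level between two meters is usually good enough for settings tweaking.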
Brightness is this year's Backlight, Shadow Detail is this year's Brightness.
Noted thanks, I'll use that information going forward. This is most helpful.
Regarding the fake "HDR+" mode from previous years: again, I'm not sure why you need it, because like I said
I'm confused; I quoted you earlier answering that question, though?
…it was just a Picture settings preset. The TV automatically goes into true HDR mode when an HDR signal is fed to it, just like with previous models. Once you're in HDR mode, you can adjust settings just for HDR, as the TV treats it as a separate source. Again, this is no different from previous years.
Did you mean no different than LAST YEAR'S MODEL? Because you stated it was changed last year, but existed previously on the 2018 and 2017 models. Though if you don't know the presets, that's OK; my previous post was asking whether you knew, or could direct me to a guide on that.
I'm not sure what "flared" whites are, but I'm not experiencing any crushed blacks on the Q80T (and I don't see how the Q90T can be any worse). But again, you're in PC mode with unknown settings and source material, so I can't really help there.
I forget the term used for whites that flare out in intensity; I'm not sure how best to describe it, but it hurts to look at and disrupts its surroundings. It's been a while, so I've forgotten some terms.
As for the other point: again, I quoted the person who also games, giving them my feedback as an owner; it's not asking for advice, though you answered it yourself in the second half. I don't understand where the confusion is here, but if you have questions for me, please feel free to ask. Statements like "I don't see why anyone would ever" or "why would you even" are pointless and sometimes feel toxic. The forum is for information, not for looking down on people (remember, it's hard to read a person's tone on the internet unless you know them well). I get that some toxic individuals frequent here from time to time, but not everyone is the same. I'm genuinely excited for a TV that supports HDMI 2.1, and I'm an early adopter like many here, given this just released recently.

As I already explained, stores will NOT let people in to look, so I'm unable to get a side-by-side comparison. If using it with a PC is looked down on, that's fine; however, if it's not against the ToS here, then I'll continue to post. Just please do me the courtesy of reading prior posts, is all I ask (IF you're going to comment).
You're sharing your insights based on what you refer to as "intended/normal" use; I'm sharing mine based on MY type of use (for the gaming community that utilizes TVs as their main display), which I've been doing since the 2014 models.
My question to the forum above was just about the labeling, to clarify whether that was the intended meaning, i.e. Q7 = Q6, etc.
Also, as a reminder, I'm giving my feedback based on my progression with this. Since things have changed quite a bit since the 2018 line, I'm learning the new ins and outs; once done, I'll be able to give an actual report, along with the differences between sitting 3 feet in front vs. 8-10 feet away. Again, FOR PC GAMERS.
Hold on, you want to downgrade your TV to be "closer" to an even more downgraded TV from the previous year? I mean, what?
I never said that. Again, are you even reading my posts, or just responding to keywords? I said I would grab the Q8 to test alongside it, since I had a similar experience in 2018 where the Q7 was a better choice than the Q9 in terms of color, as well as a few other areas for gaming, which were very apparent side by side (since I had both and had to take one back). My comment was about a comparison, and I did state that I hope it's just a BAD panel, given the tinting I'm noticing.
Originally Posted by Dan Daigle
I guess it all depends on the person's calibration, but I can only speak for myself: I have not noticed any blooming or black crushing yet. I have it on Movie mode and I'm still trying to find the sweet spot with calibration. I'm just going by eye, but will definitely have it professionally calibrated when Covid-19 lets up a bit.
Again, yeah, I'm speaking SOLELY about PC MODE, which DOES NOT have Movie mode available. My posts are aimed at PC gamers, which is an entirely different beast to tackle. I hope this clears up any confusion on that.
*Edit* Given the addition of HDMI 2.1 ports, you can expect to see more enthusiasts checking these TVs out, since the 4K 120 Hz option at 4:4:4 10-bit just sounds amazing. Whether it will actually work remains to be tested, but versus spending 2-3x more for a "gaming" gimmick TV with fake HDR and a DisplayPort input instead, Samsung (and LG) are offering much more viable options, especially for AMD users, though recent updates have allowed Nvidia cards to take advantage of VRR as well.

I know my rig can handle it currently (IF I had an adapter), but I plan to upgrade when Nvidia's 3000 series hits regardless, since those cards SHOULD have HDMI 2.1 ports on them, versus having to use a DP 1.4 -> HDMI 2.1 adapter, which supposedly exists but isn't available (different from the link posted earlier). I'm also willing to bet the cards will release first anyway, since they're due to be announced next month.
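For anyone wondering why that 4K 120 Hz 4:4:4 10-bit target specifically needs HDMI 2.1, the raw pixel-data math is easy to sketch. This is a back-of-the-envelope estimate that ignores blanking intervals and link-layer encoding overhead, so the real on-the-wire figure is somewhat higher, but it shows why HDMI 2.0 can't carry it without chroma subsampling:

```python
# Back-of-the-envelope bandwidth estimate for 4K 120 Hz 4:4:4 10-bit video.
# Ignores blanking intervals and TMDS/FRL encoding overhead, so the real
# signal bandwidth is higher than this raw active-pixel figure.
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 3 * 10  # 4:4:4 = three full-resolution channels at 10 bits each

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"Raw active-pixel data rate: {raw_gbps:.1f} Gbit/s")

# HDMI 2.0 carries about 14.4 Gbit/s of effective video data (18 Gbit/s
# link rate minus 8b/10b overhead); HDMI 2.1's 48 Gbit/s FRL link yields
# roughly 42.7 Gbit/s effective after 16b/18b encoding.
print("Fits in HDMI 2.0 (~14.4 Gbit/s effective)?", raw_gbps <= 14.4)
print("Fits in HDMI 2.1 (~42.7 Gbit/s effective)?", raw_gbps <= 42.7)
```

The raw payload alone comes out near 30 Gbit/s, roughly double what HDMI 2.0 can deliver, which is why previous-generation sets had to drop to 4:2:0 or 60 Hz for 4K gaming.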