What really determines a video card's 4K playback ability?
This is something I've had a hard time figuring out, so I've decided to stop by and ask the smart folks here what actually makes it work. What is it that actually determines whether or not a video card can output 4K video well? I mean streaming Netflix as well as local video file playback; pretty much everything you'd want to do with 4K.
I've talked to some people who say, "Oh, you're just bypassing most of the video card's processing capabilities, since they're largely made for gaming. Most of the big, expensive video cards out there are built to play new, demanding games at high graphics settings without bogging down, which is a big job in computing terms. But if you use a high-end graphics card to stream or play 4K video files, you're really not making it work very hard."
But what about a low-end card, like an Nvidia GT 1030 or an AMD RX 550? Why do those cards seem to struggle with 4K playback when they supposedly have the processing power to play most modern games on low to medium settings? They seem like pretty capable little cards, all things considered, but people will tell you not to use them for anything 4K because they're just not up to the task: I've heard they'll "drop frames," or that they simply can't do it. Essentially, don't waste your time. So what specs do I need to look at to determine whether a card can handle 4K video well? I won't be gaming with the card; this question is only about the kinds of 4K video playback you might do these days.
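If it helps, I'm happy to run tests on my end. My understanding is that a rough sketch like the one below (assuming ffmpeg is installed and on the PATH; "sample_4k.mp4" is just a stand-in for whatever 4K clip you have handy) would show what a card can hardware-decode and whether it keeps up:

    # Rough sketch, assuming ffmpeg is installed and on the PATH.
    # "sample_4k.mp4" is a placeholder for any local 4K test clip.
    import subprocess

    # Ask ffmpeg which hardware-decode backends this build supports
    # (e.g. cuda on NVIDIA, vaapi on AMD/Intel under Linux,
    # dxva2/d3d11va on Windows).
    hwaccels = subprocess.run(
        ["ffmpeg", "-hide_banner", "-hwaccels"],
        capture_output=True, text=True,
    ).stdout
    print("Hardware decode methods reported:", hwaccels)

    # Decode the clip as fast as possible with hardware acceleration,
    # discarding the output; -benchmark prints timing so you can see
    # whether the card decodes faster than real time.
    subprocess.run([
        "ffmpeg", "-hide_banner", "-benchmark",
        "-hwaccel", "auto",
        "-i", "sample_4k.mp4",
        "-f", "null", "-",
    ])

If the decode finishes faster than the clip's actual running length, I'd assume playback would be smooth, and if it can't keep up, I'm guessing that's where the dropped frames come from. Is that the right way to think about it?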