No way. Any time you transcode you lose quality, like a photocopy of a photocopy, even if you're upscaling. It's largely pointless unless you're upscaling to UHD using some fancy TensorFlow machine learning tech. It mostly makes sense to transcode to save space or for codec compatibility.
madVR has excellent real-time upscaling tech, but it needs a hefty GPU to do it all at the highest possible settings.
I would leave your 1080p content as is, unless you are ripping your own 1080p Blu-rays, in which case it actually makes great sense to transcode high-bitrate AVC to HEVC in 10-bit. Transcoding to 10-bit, even from 8-bit source content, results in fewer quantization errors, thus less banding, thus less dithering required, thus better compression (dithering in static frames looks like noise to a video codec, much like film grain). Noise compresses poorly.
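To see why the extra bits help, here's a toy sketch of the rounding-error argument (pure Python, not any encoder's actual pipeline): quantize a smooth gradient at 8-bit and 10-bit depth and compare the worst-case error. The 10-bit steps are about 4x finer, so there's far less banding to dither away.

```python
# Toy illustration of quantization error at 8-bit vs 10-bit.
# Not how an encoder works internally -- just the rounding-error argument.

def quantize(value, bits):
    """Round a value in [0, 1] to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels) / levels

# A smooth gradient, like a sky or a dark fade -- banding's worst enemy.
gradient = [i / 4096 for i in range(4097)]

err8 = max(abs(v - quantize(v, 8)) for v in gradient)
err10 = max(abs(v - quantize(v, 10)) for v in gradient)

print(f"max error @ 8-bit:  {err8:.6f}")
print(f"max error @ 10-bit: {err10:.6f}")
print(f"ratio: {err8 / err10:.1f}x")  # roughly 4x finer steps at 10-bit
```

Smaller quantization steps mean the encoder sees smooth gradients instead of dither noise, which is exactly what compresses well.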
There is also a new NVEnc setting to remove banding, which is definitely something I would suggest trying, unless you own a high-end Sony TV that does de-banding / bit-depth upconversion from 8-bit to 10-bit internally.
NB: I worked in VR video production, which pushed the limits of the hardware encoding/decoding capabilities of all these video cards. NVIDIA works much better, IMO, although the AMD 480 is decent too. I want to buy an AMD Vega 10 for my next video card, but I'm worried that it will never support Netflix or UHD Blu-ray playback, so I'm going to wait until all this shakes out before upgrading.
I really want an HDMI 2.1 video card with VRR and HBM2 memory and excellent hardware HEVC/VP9 decoding, so that I can do FI (frame interpolation) on my PC. DmitriRender is the best: hardly any visible artifacts, and way more stable than SVP. I highly recommend HTPC users give it a try. One of the best reasons to decode video on a PC is to add FI (if you don't already have it) and then take advantage of 60 fps all the time.
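For anyone curious what FI actually does: tools like DmitriRender and SVP use motion-compensated interpolation, but the simplest possible version (a naive pure-Python sketch, nothing like their real algorithms) just doubles the frame rate by inserting a blend of each adjacent pair of frames:

```python
# Naive frame interpolation by blending adjacent frames.
# Real interpolators (SVP, DmitriRender) do motion estimation instead,
# which avoids the ghosting this simple blend produces on fast motion.
# "Frames" here are just flat lists of luma values, for illustration.

def interpolate(frames):
    """Double the frame rate: insert the midpoint of each adjacent pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(x + y) / 2 for x, y in zip(a, b)])  # blended in-between frame
    out.append(frames[-1])
    return out

# 3 input frames become 5 output frames (e.g. 30 fps toward 60 fps)
clip = [[0, 0], [10, 20], [20, 40]]
print(interpolate(clip))  # [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

The blend looks fine on slow pans but smears fast motion, which is why the motion-vector-based approaches are worth paying for.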