Bit late to the party but ...
I think with video transcoding the video can be decoded only as far as the DCT coefficients and then re-encoded, so it doesn't need to go fully to baseband. I don't know whether the motion vector information from the source is reused in any way. Of course, this assumes going from one DCT-based compression scheme to another; DWT-based schemes (like JPEG 2000) are another matter.
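The "not going fully to baseband" idea can be sketched as requantizing the DCT coefficients directly, without ever computing an inverse transform. This is a toy illustration only; the function name, block values, and quantizer steps are all made up and don't correspond to any real codec:

```python
# Toy sketch of DCT-domain requantization: take already-quantized DCT
# coefficients and move them to a coarser quantizer step, staying in the
# frequency domain the whole time (no inverse DCT, no pixels).
import numpy as np

def requantize(coeffs, old_step, new_step):
    """Rescale quantized DCT coefficients to a coarser quantizer step."""
    dequant = coeffs * old_step            # recover coefficient values
    return np.round(dequant / new_step)    # requantize in the DCT domain

# A made-up quantized coefficient block (DC term plus a few low frequencies).
block = np.array([[240, 12, 0],
                  [  8,  4, 0],
                  [  0,  0, 0]])
out = requantize(block, old_step=4, new_step=8)
```

A real transcoder has far more to worry about (motion vectors, prediction drift, rate control), but this is the core of why a frequency-domain transcode can be cheaper than a full decode/re-encode.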
As for those fond memories of CRT-based displays, I'm glad we're finally past them. They did have some advantages, such as the inherent power-law gamma with good dark-area performance. The downsides included that most consumer sets didn't have good phosphor colorimetry. It's common now for displays to have a wider gamut than the 709/601 specifications. Those RCA CRTs had poor enough scanning linearity that one could see the image stretch and compress on pans. Blooming and mis-registration were common issues on consumer sets, and even getting that near perfect on professional ones was a challenge. Lifetime on CRTs seemed to decrease once HD started, and on the professional monitors (such as the 32" Sony) was around three years, with the replacement being very costly. CRTs also had lag, though not nearly as bad as some LCDs that were released. Phosphor burn was an issue, though not as bad as on plasma. My own opinion is that I've never really liked plasma, and I'm happy to see it fading away.
The newer technologies can have many advantages over CRTs. One mentioned earlier is expanded gamut. Even if this expanded area (deeper colors) isn't utilized, it still provides a greater area in which to use a matrix or LUT to create exact chromaticity coordinates. Since most displays require power-law gamma correction, this can be used to create accurate gamma (assuming enough bits are used to avoid quantization errors). Mis-registration is a thing of the past on direct-view and sequential-color displays. Consumer 4K is now around the corner. Lag on the newer LCDs is minimal, and newer technologies such as OLED are even better.
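The gamma-correction and bit-depth point can be made concrete with a small LUT sketch. The 2.4 exponent and the bit depths here are illustrative choices, not values from any standard the post cites:

```python
# Minimal sketch of a display gamma-correction LUT, showing why bit depth
# matters: with too few bits, the dark end of the curve quantizes coarsely.
import numpy as np

def gamma_lut(bits, gamma=2.4):
    """Build an encode LUT mapping linear light [0,1] to display code values."""
    levels = 2 ** bits
    x = np.linspace(0.0, 1.0, levels)
    # Apply the inverse of the display's power-law response, then quantize.
    return np.round((x ** (1.0 / gamma)) * (levels - 1)).astype(int)

lut8 = gamma_lut(8)    # 256 entries; note the big first step out of black
lut10 = gamma_lut(10)  # 1024 entries; much finer steps near black
```

With 8 bits, the very first code step out of black already lands around code 25, which is exactly the kind of quantization error in dark areas that more bits (or dithering) avoid.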
To me, the ultimate is the LED display Sony demonstrated at CES this year. If it had flaws, I didn't see them: great color, no lag at all, fantastic blacks. Of course it isn't ready even for professional markets yet, but it does show that the best is yet to come. For now, OLEDs look to offer improved images.
I will agree that both broadcast and most of the providers have compromised quality for quantity. However, there have been improvements. DirecTV was widely criticized on this site for its poor HD quality, but it improved when it went to MPEG-4. NBC also improved very noticeably when it switched to MPEG-4 network distribution, not only in the image but in the audio too. The 5.1 audio is all in time now (no echoes or hollow sound) and in lip sync. Live cameras are better than in the early days. 4K film scanning allows better transfers. I don't know if the move to electronic acquisition for episodics and features has necessarily been a good thing, but to be sure, film is fading.
We're still in a state of transition from SD and MPEG-2, and this wastes providers' bandwidth. It will probably be quite some time before OTA goes to an improved codec, but providers can take better advantage of these advances. I still think the providers should source their station feeds directly from the broadcaster in baseband with better-than-OTA encoding, and then advertise "better than antenna quality". Internet streaming is better positioned to take advantage of improved codecs, though it could be argued that this will just allow the same (or poorer) quality to be sent at lower bitrates. With both bandwidth and codecs improving, there may be reason for optimism, at least on VOD material. The recent server overloads during the O'Reilly-Stewart debate demonstrate there are still problems with live streaming.
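The bandwidth argument behind the codec transition is simple arithmetic. The numbers below are illustrative: the mux payload is roughly the ATSC figure, the per-channel rates are made up, and the "half the bitrate for similar quality" factor is a commonly cited rule of thumb, not a measurement:

```python
# Back-of-envelope sketch: if a newer codec needs roughly half the bitrate
# of MPEG-2 for similar quality, a fixed multiplex carries more channels.
mux_capacity_mbps = 19.4           # approx. ATSC payload
mpeg2_hd_mbps = 15.0               # illustrative MPEG-2 HD channel rate
h264_hd_mbps = mpeg2_hd_mbps * 0.5  # rule-of-thumb ~50% savings

channels_mpeg2 = int(mux_capacity_mbps // mpeg2_hd_mbps)
channels_h264 = int(mux_capacity_mbps // h264_hd_mbps)
```

The same arithmetic cuts both ways, which is the pessimistic reading above: a provider can spend the savings on more channels at the same quality, or on the same channel count at lower bitrates.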