How to upconvert composite RCA on Onkyo TX-NR929 - AVS Forum

post #1 of 8 Old 10-13-2014, 09:17 AM - Thread Starter
YDR05 (Member; Join Date: Sep 2013; Location: Polkton, Onkyork; Posts: 94)
How to upconvert composite RCA on Onkyo TX-NR929

Guys,


I downgraded my cable to get rid of all the useless stuff and save some money.


I still have a non-HD Cisco RNG100 box from Comcast with composite RCA out (480i). When I plug this into the receiver, shouldn't I be able to upconvert it to 1080p?


I am setting HDMI Out Main - Resolution - 1080p, but I don't see any improvement in picture quality after doing this. Any idea what is going on?

post #2 of 8 Old 10-13-2014, 09:40 AM
lovinthehd (Join Date: Dec 2007; Location: OROR; Posts: 16,231)
Searching the manual for "upconversion" I only found this: "If you've connected your TV to the AV receiver with an HDMI cable, composite video and component video sources can be upconverted and output by the HDMI output", and the accompanying picture seems to confirm it. Maybe it's just the quality of your source....

Last edited by lovinthehd; 10-13-2014 at 09:45 AM. Reason: Looked at pic in manual....

post #3 of 8 Old 10-13-2014, 12:09 PM
Ratman (AVS Forum Addicted Member; Join Date: May 2002; Location: Collingswood, N.J.; Posts: 19,472)
Why would you expect 480i video to "magically" become 1080p?

If you take a Kia through a car wash, would you expect a Bentley to come out?

post #4 of 8 Old 10-13-2014, 09:25 PM
Mark12547 (AVS Forum Special Member; Join Date: Nov 2013; Location: Salem, Oregon, United States, Earth; Posts: 2,304)
I started a long reply, but decided to cut it down to specifically cable.

Somewhere between 3 out of 4 HD channels (based on the HD stations I watch) and 4 out of 5 HD channels (from examining this Wikipedia page) transmit 1080i (1920x1080, interlaced); the remaining 1 out of 4 or 1 out of 5 transmit 720p (1280x720, progressive). In both cases, the aspect ratio is exactly 16:9, or approximately 1.78:1. (Both the 1920x1080 and 1280x720 formats use square pixels, in contrast to SD, which I'll get to below.)

Your 1080p display device is probably also 1920x1080 (pixels per line x number of lines), so if you are watching a 1080i station, no scaling takes place. (Deinterlacing does take place, but if there is little movement, you are effectively watching 1920x1080.) If you are watching a 720p station, some upscaling has to take place, expanding the incoming signal by 50% horizontally and 50% vertically. Yes, that means a lot of the image on the screen is interpolated, but there is enough data that the scaler in most TVs does a decent job of displaying 720p content on a TV with a native resolution of 1080.

A standard-definition signal, such as from a DVD or a standard-definition cable box, is 480i (720x480, interlaced), and usually has an aspect ratio of 4:3, or approximately 1.33:1. (But 720/480 is not 1.33! True. From what I read, only 704 of the 720 pixels per line are used [the missing 16 pixels per line being an "overscan" margin to make sure part of the original analog image wasn't lost], and each pixel represents a slightly skinny rectangle of the picture, its width being 10/11ths of its height; 704/480 * 10/11 is the 1.333... we were expecting in order to fill the screen of an old-fashioned 4:3 standard TV.) So we should really call this 704x480, not 720x480.

(Some DVDs are anamorphic widescreen. They do this by using "fat" pixels whose width is 40/33rds of their height, so 704/480 * 40/33 = 1.777..., which fills the screen of a modern 16:9 HDTV. That means both 4:3 DVDs and 16:9 DVDs use the same number of pixels, so they use the same amount of space on the disc.)
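
If you want to check that pixel-geometry arithmetic yourself, here is a quick sketch (Python purely for illustration; nothing here is specific to the Onkyo or the cable box):

Code:
# Display aspect ratio = (active pixels per line / number of lines)
#                        * pixel aspect ratio (pixel width / pixel height)
active_width, lines = 704, 480

# 4:3 SD: each pixel is 10/11 as wide as it is tall.
print(active_width / lines * 10 / 11)   # 1.3333... = 4:3

# Anamorphic 16:9 DVD: each pixel is 40/33 as wide as it is tall.
print(active_width / lines * 40 / 33)   # 1.7777... = 16:9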

To boil this down, looking at resolution alone, we have:

Source                  Resolution     Scaling Factor (hor. x vert.)
(the 1080 HDTV)         1920 x 1080    1 x 1 (included for reference)
1080p Blu-ray           1920 x 1080    1 x 1 (no scaling)
1080i stations          1920 x 1080    1 x 1 (no scaling, Note 1)
720p stations           1280 x 720     1.5 x 1.5
SD cable box            704 x 480      2.05 x 2.25 (Note 2)
SD cable box zoomed     704 x 360      2.73 x 3 (Note 3)
fullscreen (4:3) DVD    704 x 480      2.05 x 2.25 (Note 2)
anamorphic (16:9) DVD   704 x 480      2.73 x 2.25


Note 1: While 1080i doesn't require rescaling to 1080, it does require deinterlacing, since LCD and plasma displays (and some other display technologies) are progressive-only.
Note 2: Expansion is to 1440 x 1080, not 1920 x 1080, because the incoming signal's aspect ratio is 4:3, so it needs only 75% of the width of the 16:9 display to preserve that aspect ratio.
Note 3: If viewing a 16:9 HD channel letterboxed into a 4:3 SD channel on an SD box, and then zooming that to fill an HD screen, 120 of the 480 lines contain the letterbox bars; the picture is contained in the remaining 360 lines, which then have to fill the 1080 lines of the display.


The "Scaling Factor" indicates how much data the scaler would have to extrapolate, from not extrapolating (1x1), to generating 78% of pixels on the screen (SD cable box), to generating 88% of the pixels on the screen (SD cable box zoomed 25% so letterboxed HD picture fills the screen). As you can imagine, if every dot on the screen comes from the original source, the resulting picture is far better than if the scaler in the TV (or Receiver, or Blu-ray player) has to create lots of dots of the picture between each received dot, and the worst case I presented with the SD cable letterboxed image of a HD channel forcing the scaler to produce 88% of what you see on the screen produces the least amount of detail and does the maximum magnification of any compression artifacts.

The scaler cannot recreate the original dots that are missing from the picture. Instead, it has to interpolate the picture data from the information it receives; the less information, the more interpolation, and the less accurate the resulting image will be. This is why, on a close-up head shot on TV, if the source is Blu-ray or an HD TV channel, you might be able to see every individual strand in a lock of hair on the head; but for a DVD or an SD TV channel, for the same size head shot, you can see curls and locks, but not the individual hairs in a lock of hair. Or, when watching the Planet Earth series, on the Blu-ray you can see the fine details in the beauty of nature, but on DVD, without as much detail for the TV to work with, the fine detail is more of a slight blur.
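
For the curious, the basic idea of interpolation fits in a few lines. Below is a minimal bilinear upscaler; real scalers in TVs, receivers, and Blu-ray players use far more sophisticated filters, so treat this only as a sketch of how in-between dots are averaged from their neighbors:

Code:
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Upscale a 2-D grayscale image with bilinear interpolation."""
    in_h, in_w = img.shape
    # Map each output pixel back to a fractional source coordinate.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]          # vertical blend weights
    wx = (xs - x0)[None, :]          # horizontal blend weights
    # Blend the four surrounding source pixels for every output pixel.
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Example: blow a 480-line frame up to 1080 lines (and 704 -> 1440 wide).
sd_frame = np.random.rand(480, 704)
hd_frame = bilinear_upscale(sd_frame, 1080, 1440)
print(hd_frame.shape)  # (1080, 1440); ~78% of these values are interpolated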

Why, then, don't standard-definition cable or satellite channels look as sharp as DVD, especially in action scenes or other quickly-changing scenes? Because cable operators and satellite TV services apply a lot more compression to squeeze more channels onto the cable or into the bandwidth the satellite service is allocated. The result is that there can be compression artifacts, like macroblocking (as bright explosions happen while the Japanese army fires at Godzilla in "Godzilla 2000", the flashes briefly break into a bunch of bright square regions before resolving into the bright bolts they should have been), pixelation (sometimes a staircase or jagged-edge effect on what should have been a smooth line or curve), and, on one subchannel, I have noticed at times, with just one show, brief flashes of purple along the bottom edge.

I have noticed compression artifacts far more often on SD channels than on HD channels. Maybe my local cable company figures that those watching SD are doing so on smaller TVs, or maybe it's that a macroblock is a smaller piece of a 1920x1080 image than of a 704x480 image, so when the scaler scales up a 704x480 image, each macroblock ends up 2.05 times wider and 2.25 times taller on the TV than a macroblock of a 1920x1080 image.
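
Rough arithmetic on that last point, using a nominal 16x16 macroblock (actual block sizes depend on the codec):

Code:
# On-screen footprint of a 16x16 macroblock on a 1920x1080 panel.
mb = 16
print(mb, "x", mb)  # from a 1920x1080 source: 16 x 16 screen pixels
print(round(mb * 1440 / 704), "x", round(mb * 1080 / 480))
# From a 704x480 source: about 33 x 36 screen pixels, over 4x the area.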

The larger scaling factor and the larger screen both act as a magnifying glass on visual defects, making any visual defects and loss of detail more painfully obvious. (That's why DVDs of some 1950s shows or some VHS tapes looked good on a 21-in SD screen, but look terrible on a 50-in screen.)

I guess that is a lot of writing just to say that an SD cable channel is no substitute for HD cable, unless you have a really small TV. And the best scaler and the best TV in the world won't make up for having an SD source.

Maybe one thing would substitute: if your area gets good over-the-air TV reception, using an actual antenna and tuning to a station will get that station's HD channel (usually the .1 subchannel) and possibly other subchannels (e.g., .2 or .3 as an SD channel for alternate programming or a digital multicast network affiliated with that station for broadcast purposes). (Some stations may have both the .1 and .2 subchannels be HD and the rest be SD; one of my local stations has nothing but SD subchannels, 5 of them. It is up to the station to decide how to divide up the bandwidth it has been allocated, but the more they try to transmit in their allocated bandwidth, the more compression artifacts there will be, especially in action scenes or other quickly-changing scenes.)

PS: I just did some checking of SD vs. HD channels on my cable system where there is HD content. For local channels, some apparently do center-cut or pan-and-scan (I couldn't tell which with my quick checking) and some letterbox the HD channel down into the 4:3 image of the SD channel. For the non-local HD channels, it appears that some just letterbox HD down to the SD channel, and some have separate feeds; but it looks like where the HD channel has 16:9 program content, that content is letterboxed down to the SD channel, with some channels keeping their logo and other annoying advertising overlays inside the letterboxed picture (e.g., Cartoon Network), and some moving the logo and part of the advertising overlay into the letterbox bars (e.g., The History Channel).

In any case, SD channels just will not give the picture quality that HD channels do, and no amount of massaging the data will create the detail that was lost by downscaling the picture to an SD channel.

post #5 of 8 Old 10-14-2014, 06:11 AM
Ratman (AVS Forum Addicted Member; Join Date: May 2002; Location: Collingswood, N.J.; Posts: 19,472)
I'm glad you cut it down.

post #6 of 8 Old 10-14-2014, 10:58 AM - Thread Starter
YDR05 (Member; Join Date: Sep 2013; Location: Polkton, Onkyork; Posts: 94)
Mark, thanks much for all the information.


So long story short, a $1000 AVR cannot help me save $10/month on the HD/DVR box from Comcast! Just kidding, I know that is not my AVR's main purpose.
One big reason I was hoping this would work is that I still have a subscription to On Demand and HBO. I cannot imagine how HBO HD/3D would look in 480i!
Maybe a TiVo Roamio without the subscription (not sure if they offer that) is the way to go. $150 for a TiVo would pay for itself in 15 months versus $10/month to Crapcast.

post #7 of 8 Old 10-14-2014, 07:59 PM
Mark12547 (AVS Forum Special Member; Join Date: Nov 2013; Location: Salem, Oregon, United States, Earth; Posts: 2,304)
Quote:
Originally Posted by Ratman
I'm glad you cut it down.
But I may have cut it down too much! I had overlooked that this is a composite video connection. Composite video involves modulating the chroma (color information) and adding it to the line carrying the luma (brightness) information, and then, at the receiving piece of equipment, demodulating the color information. It is possible to have "dot crawl" from crosstalk between the chroma and luma information, or to see some false color fuzz along the edge between two extremely different brightnesses (e.g., white printing on black, or black printing on white). Some equipment uses a low-pass filter on the luma signal to keep the modulated chroma signal from adversely affecting the luma detail, but at the cost that some of the fine detail may get filtered out; a "comb filter" reduces the false color fuzz. Another problem with composite video in NTSC countries is that, unlike PAL, there is no line-by-line phase alternation to cancel out phase errors, so a bit of drift between the modulation and demodulation of the chroma information results in a hue shift.
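
To illustrate the low-pass trade-off, here is a toy sketch (nothing like a real analog video filter; just a 3-tap average standing in for the luma filtering some equipment applies to suppress chroma crosstalk) applied to one scan line of the finest possible detail:

Code:
import numpy as np

# One scan line of alternating black/white pixels (the finest possible detail).
luma = np.tile([0.0, 1.0], 8)

# A crude 3-tap moving-average low-pass filter.
filtered = np.convolve(luma, np.ones(3) / 3, mode="same")

print(luma)      # crisp 0/1 alternation
print(filtered)  # values pulled toward 0.33/0.67: fine detail softened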

The scaler would then take that distorted picture (with softened edges, or with dot crawl or false color fuzz) and try to expand it to fill the TV screen, and, alas, magnifying garbage results in worse-looking garbage.

Even in the old analog days, it was considered better to keep the luma and chroma signals separate: hence S-Video (which still modulates the chroma signal but keeps it separate from the luma signal, though modulating the color information can still introduce errors in separating the color components), and, best in the consumer analog world, component video (luma, B-Y, and R-Y on separate conductors), where neither luma nor chroma gets modulated or demodulated and so both remain closest to their original values.

Even better is HDMI: modern HD cable channels carry the information in digital form, and by using HDMI the signal can remain digital from the cable box to the TV, eliminating possible luma and chroma distortion from converting to analog and then (in the TV or receiver) back to digital. But I haven't heard of an HDMI connector on an SD cable box. (There might be one, but I haven't heard of it.)

Now, I don't know if I have actually seen composite artifacts, since the only time I have watched composite on a large HDTV was watching VHS cassettes recorded in EP mode (6 hours on a 2-hour tape) from analog cable for time-shifting purposes, and the tapes were old, so the picture had lots of distortions anyway. And, as I mentioned in my other brief message, these artifacts aren't as apparent on a small TV, such as the 21-in color TV I had before I purchased my first big HDTV.


post #8 of 8 Old 10-15-2014, 06:56 AM
Ratman (AVS Forum Addicted Member; Join Date: May 2002; Location: Collingswood, N.J.; Posts: 19,472)
No matter how you slice or dice it, you cannot take a 480i video signal (composite or S-Video) and transform it into "HD" quality. Sure, some AVRs can upscale/upconvert composite/S-Video to 1080i/p, but it will still never be true "HD" quality.

OTOH, if you have a 1080p TV (LCD/plasma) and it has composite/S-Video inputs, the TV will do the upscaling/upconversion for its native display anyway. So either the AVR does the conversion or the TV does the conversion. Either way, upscaled 480i will never be true HD quality. As they say, "Garbage in, garbage out".