
·
AVS Forum Special Member
Joined
·
11,139 Posts
Looks like it would deliver more than many sets, assuming that spec is accurate. I pointed out in this thread (last, 8/22 post^) that set filtering trims received resolution. Earlier, in another thread (6/09 post), I ran through some estimates of what you might typically achieve resolution-wise with HDTV. Since that last June post, BTW, I've concluded that live 1080i HDTV broadcasts are typically a true 1920 x 1080. -- John
^While this post, now indicated as 8/22 (?), started this thread, it mysteriously ended up at the end of it after the forum transition to new servers in October.
 

·
Registered
Joined
·
90 Posts
Discussion Starter · #3 ·
John, thanks so much for the info. But just to clarify: since you said that live HDTV broadcasts are a true 1920x1080, and since this set only supports 1440x1080, does this mean that this TV will not handle the full potential of the signal?
 

·
AVS Forum Special Member
Joined
·
11,139 Posts
As that first thread (FCC experts' data) indicates, you should knock about 20% off 1920 for available horizontal resolution, then compare that to 1440. The small difference, about 100 pixels, couldn't be readily perceived. Also, at these high resolution limits, the contrast and focus of any set must be excellent. -- John
 

·
Banned
Joined
·
3,772 Posts
On very high quality 1080i material I _can_ see a subtle difference between 1920x1080i and 1440x1080i as displayed on my Sony W900 monitor. The HiPix card has a 1440x1080i output setting, so I can easily click back and forth between 1920x1080i and 1440x1080i.

Admittedly, the scaler in the HiPix could explain the difference I see, but I do think that some material has a little bit of detail beyond 1440.

In terms of today's sets, 1440x1080 is about as good as it gets. Many are only capable of around 1200x700, so I wouldn't take the 1440x1080 as a negative.

The difference I can see between 1920x1080 and 1440x1080 is only observable on scenes with very complex backgrounds, like trying to see blades of grass at a golf game, or the detail in the brickwork of a large public square.
 

·
Registered
Joined
·
447 Posts
The reason these sets do 1440x1080 is simple...


1440:1080 == 4:3


They are using run of the mill CRTs with aspect-correcting lens elements.


-- Robert
 

·
Registered
LG 55" C9 OLED, Yamaha RX-A660, Monoprice 5.1.2 Speakers, WMC HTPC, TiVo Bolt, X1
Joined
·
45,683 Posts
Quote:
Originally posted by DevoX
The reason these sets do 1440x1080 is simple...


1440:1080 == 4:3


They are using run of the mill CRTs with aspect-correcting lens elements.


-- Robert
Nice try, but it's a lot more complicated than that, and the lens elements don't correct or alter the AR; that's done electronically.
 

·
Registered
Joined
·
233 Posts
Since this specification is "1440 x 1080 lines" (at the RCA web site), we could assume that the spec is in TVL (TV lines), not pixels. Converting to pixels (1440 x 16/9) gives 2560 pixels along the horizontal dimension. The 1920x1080 HDTV spec is in pixels. So 1440 x 1080 TVL would be more than enough for a nice picture. If for some reason the person at RCA publishing the spec got confused about the difference between pixels and TVL, the set still has enough resolution to display most broadcast signals. When HDTV came along, it would have been nice if the TV industry had converted from TVL to pixels, since the confusion factor is even greater with the 16:9 format.
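
If you want to sanity-check that conversion, here is a minimal sketch (assuming the usual convention that TVL counts lines per picture height, so horizontal pixels = TVL x width/height; the function name is just for illustration):

# TVL (TV lines) counts resolvable lines per picture height,
# so converting to horizontal pixels means scaling by the display's aspect ratio.
def tvl_to_horizontal_pixels(tvl, aspect_w=16, aspect_h=9):
    return tvl * aspect_w / aspect_h

print(tvl_to_horizontal_pixels(1440))  # 2560.0 -- well above the 1920-pixel HDTV spec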


Ernie
 

·
AVS Forum Special Member
Joined
·
11,139 Posts
Quote:
Originally posted by PVR
On very high quality 1080i material I _can_

see a subtle difference between 1920x1080i

and 1440x1080i as displayed on my Sony W900

monitor.
Appreciate your post, PVR. It's great to have someone with your hardware setup to keep folks honest here. Belatedly, I thought I'd try to clear up what might appear to be a conflict in what I said initially about the differences between a true 1920 HDTV image and the supposed 1440 capability of the RCA set.


What I wrote was: As that first thread (FCC experts' data) indicates, you should knock about 20% off 1920 for available horizontal resolution, then compare that to 1440. The small difference, about 100 pixels, couldn't be readily perceived.


As you point out, and I agree, you certainly should be able to tell the difference between a 1920 image and a 1440 image. But I didn't indicate above that you shouldn't. I probably should have elaborated more in my response to make it clear where I got that ~100-pixel difference.


All these numbers, of course, aren't absolutes, and at these very high resolutions good contrast is crucial for accurate comparisons of the B&W stationary test patterns used. To amplify my original number crunching a bit:


I assumed, as the FCC's experts did (see link above), a ~20% filtering of a true 1920-pixel signal. That trims the resolution by 384 pixels, down to 1536. Then, assuming the RCA is displaying its maximum resolution, 1440, I noted that's only a ~96-pixel difference (I rounded off to 100). For the RCA to be displaying a 1440-pixel signal (measured resolution), it would need to be receiving a 1728-pixel signal (1440 + 20%). Again, I'm assuming the input signal is being sent to the display monitor, like a broadcast HDTV program, so it passes through a receiver filter that trims horizontal resolution by 20%. If you go through the measured resolution results of the FCC experts for 720p and 1080i, as I did in a thread a while back, their resolution reductions actually covered a wide range, reaching a maximum of only 17%. -- John
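
A minimal sketch of that number crunching, in Python (the 20% trim is the assumption borrowed from the FCC experts' data; the variable names are just for illustration):

# Worked version of the filtering arithmetic above.
# Assumes the ~20% horizontal-resolution trim attributed to HDTV receiver filtering.
source_pixels = 1920
rca_display = 1440
filter_loss = 0.20

filtered = source_pixels * (1 - filter_loss)    # 1536 pixels left after the receiver filter
difference = filtered - rca_display             # ~96 pixels, rounded to ~100 above
needed_input = rca_display * (1 + filter_loss)  # 1728 pixels in, per the "1440 + 20%" reasoning

print(filtered, difference, needed_input)       # 1536.0 96.0 1728.0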
 

·
Banned
Joined
·
3,772 Posts
I should also mention that I removed the RFI filter circuit from my HiPix (and Radeon card), so it is possible that my configuration is able to show "1920" detail that would be lost in the noise filter on other people's systems.

Also - as I said - only very few - very high quality - 1080i clips show any difference (that I can see). I would say that most current material looks basically the same at 1920x1080i or 1440x1080i.
 

·
Registered
Joined
·
961 Posts
Quote:
Originally posted by vectorzsigma
John, thanks so much for the info. But just to clarify: since you said that live HDTV broadcasts are a true 1920x1080, and since this set only supports 1440x1080, does this mean that this TV will not handle the full potential of the signal?
To answer your question directly:

That's right: a monitor with a resolution of 1440x1080 would not display the full potential of the 1920x1080 HD signal. But, as far as I know, no consumer displays are capable of 1920x1080... yet.
 

·
AVS Forum Special Member
Joined
·
11,139 Posts
Interesting about your filter removal, PVR. Reminded me of what some Quadscan video processor owners have been doing with their hardware. I concluded, though, that this differs from removing or bypassing that portion of the circuit that trims horizontal resolution by 20% in HDTV receivers. The Quadscan is for NTSC, of course, and I'm not certain what filtering it undergoes. Also, the Quadscan is a processor of video signals, such as DVD, not NTSC RF. This filtering of HDTV signals seems fairly obscure and perhaps needs detailing by someone expert in HDTV receiver design.


It's interesting to hear about different impressions of HDTV video images on different hardware. But until we get something like an HDTV Avia test disc and hardware, with complete B&W resolution patterns, most of this remains just that--impressions. A while back I suggested that CBS ought to work an HDTV resolution test pattern into the HDTV graphics it periodically displays (for example, "brought to you by Panasonic"). ABC could do the same with 720p. While it wouldn't appear for long on viewers' screens, you could still approximate your system's resolution because the pattern would be repeated frequently. -- John
 

·
Registered
Joined
·
99 Posts
In my humble opinion, I have to say this thread sounds like a lot of engineers (yes, I'm one too) discussing a lot of theory.


In theory broadcasters might care enough to send out a pristine 1920x1080 signal, but then they have to compress it through MPEG to fit into a standard TV channel, and your set has to reconstruct a lossy picture from that. Complex scenes with lots of motion or changes are not going to hold up well to that theoretical limit.
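
To put a rough number on that squeeze (a back-of-the-envelope sketch; 19.39 Mbps is the standard ATSC channel payload, and the 8-bit 4:2:0 sampling and ~30 frames/sec are assumed typical values for 1080i):

# Rough estimate of how hard MPEG has to compress a 1080i broadcast.
# Assumes 8-bit 4:2:0 sampling (1.5 bytes/pixel) and ~30 frames/sec of 1920x1080.
atsc_payload_mbps = 19.39                      # MPEG-2 payload of one ATSC channel
raw_mbps = 1920 * 1080 * 1.5 * 8 * 30 / 1e6    # ~746 Mbps uncompressed
print(raw_mbps, raw_mbps / atsc_payload_mbps)  # roughly a 38:1 compression ratio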


Ultimately you have to use your own eyeballs and deal with many other viewing impediments, such as eyeglasses, viewing distance, room lighting, etc. Besides, too often the camera is noticeably NOT in focus for the shot. Or the actors have insisted on a soft-focus filter.


From what I've seen so far, any HDTV set on the market today is an incredibly big step up from what has been available in the past. I thank my lucky stars every time I am able to flip to a program in any kind of digital resolution -- even 480p looks astonishingly clear with no color noise compared to the old NTSC. Yes, 1080i looks a bit sharper to me than ABC's 720p, but they all look better than UPN, AT&T cable or VHS. And hardly anyone has mentioned the absolutely wonderful extra punch added by Dolby 5.1 surround sound to those programs that care about their audio.


I understand some people are techno-perfectionists; we need more of them in the backrooms of the TV stations and behind the HD cameras! But for the average viewer, HDTV is all a big gravy train of eye candy and digital surround sound coming our way! Hooray!! At last!! (What took so long?)
 