
How to use HDNET calibration... - Page 2

post #31 of 79
Quote:
Originally Posted by SomeUser4321 View Post

Do all HDTV's (LCD type, Plasma, and Rear-projectors) need calibration?

Yes! If you read these posts (including the "stickies" at the top of this forum), you will discover why this is true. Most displays of the types you mentioned are set way too bright, with exaggerated colors, to grab customers' attention and sell in the stores.

Some displays' default settings are better than others, but they all need calibration to some degree. Also, each input source has to be calibrated (your DVD player will have different calibration settings from your TV source, etc.).
post #32 of 79
Quote:
Originally Posted by Ken Ross View Post

I'm not sure what you mean by a 'sweeping statement'. If you're referring to my overall assessment of PQ on FIOS, I am in the overwhelming majority on this point. We also have an owner of a 1080p display and FIOS user reporting that he did indeed get a horizontal resolution of 1920 from one of his two measured channels. So FIOS is indeed passing the full rez signal when it's provided by the broadcaster.

There's no way I would lump FIOS with the typical 'cable TV technology' since their bandwidth is almost unlimited relative to typical cable systems. This enables FIOS to provide a greater bit rate, better resolution and an overall better PQ. If you read the myriad of posts from people that switched from "XXX' system to FIOS, you'll find the overwhelming majority reporting the same thing.

Ken, I don't doubt at all the many reports of improved PQ from FIOS, although as I recall a number of them emphasize SD. They've all made me eager from day 1 to try it out, too, although Verizon may not reach my location for a while. 'Sweeping statement' refers to, in effect, suggesting (with all the reasons given) that folks with the right displays can't measure HDNet's resolution wedges and report effective resolution accurately. 'Right display' means 1080p or equal if you assume HDNet's resolution wedges originally (satellite uplink/downlink) provide ~1920X1080, and you further assume no limitation of that 1920X1080's resolvable detail by your program delivery source, tuner, and display hardware. And still, if FIOS effective rez is typically ~1300 lines, displays with ~1366-pixel horizontal resolution should be adequate.

I've only seen one report (above) of measured effective resolution from FIOS with HDNet: GeekGirl's 1333-line measurement. Registering (not measuring) a FIOS-delivered format resolution of 1920X1080 versus 1280X720p (etc.) for any number of channels, as we all know, is entirely different. A full 1920X1080 HD-format signal can deliver zero effective resolution if it's an all-gray screen--or any resolution between zero and 1920. Mixing DirecTV's HDLite into the discussion seems misleading here because their reformatting to 1280X1080 totally lops off any details >1280.

Making this HDNet resolution-wedge measurement is no more difficult than taking a temperature reading--well, maybe in C with conversion to F. So FIOS or cable subscribers with DVRs can record HDNet's Sunday* 6:50 am ET pattern, watch the last 4 mins (of 10), observe the 'grayout' merge point where vertical converging wedge lines can no longer be resolved, then multiply the number beside the wedge by 100 (lines/picture height), then by 1.78 to obtain lines/16X9 picture width.
*(Or maybe Tuesday same time? They seem to switch. Check at HD.Net).
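For the arithmetic-averse, the conversion just described can be scripted (a minimal sketch; the 7.5 reading is only an example value):

```python
def wedge_to_lines(wedge_number, aspect=16 / 9):
    """Convert an HDNet vertical-wedge reading to lines/16X9 picture width.

    wedge_number -- the figure beside the wedge's 'grayout' merge point,
                    in hundreds of lines/picture height.
    """
    lines_per_height = wedge_number * 100
    return round(lines_per_height * aspect)

# A grayout point at the 7.5 mark on the vertical wedge:
print(wedge_to_lines(7.5))  # 1333 lines/picture width
```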

If other typical FIOS readings are ~1300 lines, maybe the cable-type STBs used are a limitation. And perhaps if FIOS assigns higher bit rates to both SD and HD, and presumably doesn't rate shape with requantization like bandwidth-limited delivery systems, that accounts for the better PQ with motion video compared to static test-pattern images that have very limited bit rate requirements. -- John
post #33 of 79
I'm not sure the Samsung HL-R6768W is an ideal display for measuring horizontal resolution since it only has 960 horizontal pixels on the chip. Does anyone have any data on whether it can actually achieve an effective resolution of 1920 lines/picture width using the "wobulation" technique?

Btw, the method of multiplying the vertical wedge measurement by 16/9 to get # of lines/picture width only works if you have square pixels. My display does not (16x9 displayed on 1024x768), so when I measure the vertical wedge I get 7.5x100x1.33 = 1000, which is what I should get.
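In other words, the correct multiplier is the panel's pixel-grid ratio, not automatically 16/9. A quick sketch of that correction (the display dimensions are just the examples discussed here):

```python
def wedge_multiplier(native_h, native_v):
    """Vertical-wedge multiplier for converting lines/picture height to
    lines/picture width: use the panel's pixel-grid ratio, which equals
    16/9 only when the pixels are square."""
    return native_h / native_v

print(round(wedge_multiplier(1920, 1080), 2))  # 1.78 (square pixels)
print(round(wedge_multiplier(1024, 768), 2))   # 1.33 (non-square pixels)
# zoyd's reading: 7.5 x 100 x 1.33 = 1000
print(round(7.5 * 100 * wedge_multiplier(1024, 768)))  # 1000
```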

edit: An additional note about FiosTV: the HDNET stream is 5C encrypted, so I can't probe the mpeg stream for format, but unencrypted streams like FOODHD and MTVHD delivered at the same bit rate (18 Mbps) are 1920x1080.
post #34 of 79
Quote:
Originally Posted by zoyd View Post

I'm not sure the Samsung HL-R6768W is an ideal display for measuring horizontal resolution since it only has 960 horizontal pixels on the chip. Does anyone have any data on whether it can actually achieve an effective resolution of 1920 lines/picture width using the "wobulation" technique?

A good point that also occurred to me earlier. One of the few 1080p display (RPTV) reviews I've read mentioning failure to resolve 1920 lines horizontally was a Mits model tested by CNET. But a few threads, 1-2 years back in the RPTV forum, included screen shots of full 1920-line resolution with a number of wobulation-chip DLP displays. And all the more recent reviews I've seen of wobulation or SmoothPicture (TI's term) 1080p displays have published 1920-line readings. Maybe owners, or anyone interested in pinning down the Samsung's capabilities, can confirm from reviews whether it, too, has 1920 resolvability (no, it doesn't; see my edit below). And of course a DIY confirmation of 1920X1080 resolution for any display is possible with the latest 1080i test DVDs mentioned in this forum, or using the variety of computer software programs generating 1920-line patterns. One reading of 1333-line effective resolution using FIOS should only be a starting point and clearly needs confirmation with additional measurements.

Quote:


Btw, the method of multiplying the vertical wedge measurement by 16/9 to get # of lines/picture width only works if you have square pixels. My display does not (16x9 displayed on 1024x768), so when I measure the vertical wedge I get 7.5x100x1.33 = 1000, which is what I should get.

Interesting point. Multiplying by 1.33 is also the factor used for converting 4X3 NTSC resolution measurements/picture height into resolution lines/picture width. Seems best, then, to use a 1920X1080i/p display for confirming full (test-pattern) resolution, or at least 1366X768 displays for measuring suspected effective test-pattern resolutions of typically ~1300 lines for STBs and/or delivery systems. If HDNet's pattern were very crisp out to 1366 lines, then you'd want to try a higher-resolution display to measure STB limitations. (DirecTV's HDNet feed, reformatted to 1280X1080 from 1920X1080, isn't suitable either.)

Presumably, then, the 7.5X100X1.33 reading (overlooking scaling) would correspond, with a square-pixel 16X9 display, to 1335 lines (7.5X100X1.78). If so, that's a typical cable STB reading, especially using an HDMI hookup with a 1080p display (as here). Then again, measuring 1335 with a 1024-limited display doesn't make sense!

Quote:


edit: An additional note about FiosTV: the HDNET stream is 5C encrypted, so I can't probe the mpeg stream for format, but unencrypted streams like FOODHD and MTVHD delivered at the same bit rate (18 Mbps) are 1920x1080.

Assume most know that the format resolution of 1920X1080, reported on many STBs or cable-STB diagnostic modes, differs entirely from measured effective resolution. Always useful to confirm a program or delivery source is using the standard 1920X1080, of course. -- John

EDIT: Here's an extract from CNET's review of the Samsung display used for the FIOS effective resolution measurement:
Quote:


Resolution is a mixed bag, as the Samsung HL-R6768W didn't actually deliver all 1,920 lines of horizontal resolution. However, unlike many HDTVs, the Samsung does process HD accurately, retaining the resolution in the signal.

So, as mentioned earlier, test disc, software, or pattern-generator confirmation of the actual limitations would be useful, as well as FIOS effective resolution measurements with other displays with HDNet's patterns.

The Mits model linked above measured ~25% too low by CNET, or ~1440 lines. But even that would be adequate if a STB and/or delivery system was limiting effective resolution to ~1300 lines.
post #35 of 79
Quote:
Originally Posted by John Mason View Post

The Mits model linked above measured ~25% too low by CNET, or ~1440 lines. But even that would be adequate if a STB and/or delivery system was limiting effective resolution to ~1300 lines.

I think it's even more complicated than that. To measure the line frequency shouldn't you oversample by at least 2 pixels (Nyquist)?
post #36 of 79
Quote:
Originally Posted by zoyd View Post

I think it's even more complicated than that. To measure the line frequency shouldn't you oversample by at least 2 pixels (Nyquist)?

Greetings, zoyd. You might have missed my addition above about your 1000 line reading and whether that equals ~1335 with a square pixel 16X9 display; not sure if that was from HDNet or not. Doesn't seem feasible, BTW, to measure 1335 lines with a 1024-pixel-limited display.

Interesting point about the Nyquist oversampling. That 1440 number was effective resolution (~25% off 1920).

Guess I've assumed HDNet's test patterns are electronically or computer generated. Recall years back speculating here whether it's more useful for testing HD delivery (matching TV camera or telecine 74-MHz HD sampling) to point a camera at test patterns rather than using pattern generators. Using a pattern generator or computer software at home avoids the Nyquist limitation of 74-MHz sampling, about twice that of the highest HD frequency, which boils down to a limiting resolution of ~1700 lines horizontal rez for 1920X1080. Since folks have reported measuring ~1920 lines from HDNet in the past, it seems they're not using a TV-camera-sampled source. (Or, on 2nd thought, perhaps you're getting into DLP mirror and wobulation sampling.)-- John
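A back-of-envelope check on that ~1700-line figure (the 74.25 MHz rate and 1920 active samples/line are the standard SMPTE 274M values; the 33 MHz anti-alias cutoff is an assumed illustration, not a measured filter spec):

```python
SAMPLE_RATE = 74.25e6     # Hz, standard HD luma sampling (SMPTE 274M)
ACTIVE_SAMPLES = 1920     # active luma samples per line

active_line_time = ACTIVE_SAMPLES / SAMPLE_RATE  # seconds of active video

def lines_per_width(bandwidth_hz):
    """TV lines/picture width resolvable at a given luma bandwidth:
    each cycle of the highest frequency resolves two lines."""
    return round(2 * bandwidth_hz * active_line_time)

print(lines_per_width(SAMPLE_RATE / 2))  # 1920: the Nyquist ceiling
print(lines_per_width(33e6))             # 1707: with an assumed practical
                                         # anti-alias cutoff near 33 MHz
```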
post #37 of 79
Quote:
Originally Posted by John Mason View Post

Greetings, zoyd. You might have missed my addition above about your 1000 line reading and whether that equals ~1335 with a square pixel 16X9 display; not sure if that was from HDNet or not. Doesn't seem feasible, BTW, to measure 1335 lines with a 1024-pixel-limited display.

Yes, it was from HDNET and my point was that my set limits me to measuring at most 512 cycles/picture width due to the sampling frequency. So for me the set and not HDNET source/STB decoding/etc... may be the limiting factor.

Quote:


Interesting point about the Nyquist oversampling. That 1440 number was effective resolution (~25% off 1920).

Guess I've assumed HDNet's test patterns are electronically or computer generated. Recall years back speculating here whether it's more useful for testing HD delivery (matching TV camera or telecine 74-MHz HD sampling) to point a camera at test patterns rather than using pattern generators. Using a pattern generator or computer software at home avoids the Nyquist limitation of 74-MHz sampling, about twice that of the highest HD frequency, which boils down to a limiting resolution of ~1700 lines horizontal rez for 1920X1080. Since folks have reported measuring ~1920 lines from HDNet in the past, it seems they're not using a TV-camera-sampled source. (Or, on 2nd thought, perhaps you're getting into DLP mirror and wobulation sampling.)-- John

Thanks for the pointer back to that advisory committee report and past discussions. I prefer their nomenclature (cycles/spatial dim. for "resolution" and "scanning format" for pixel pitch). So from that, should I conclude that the best one could hope for from a static B&W test pattern would be (1638x800) x 0.85 (mpeg decode hit) = 1392x680 (~340 cycles/picture height)? Or do we only apply the mpeg degradation to dynamic images?

edit:

Well, the vertical resolution part of my conclusion seems to be contradicted by GeekGirl's measurements:
"Vertical resolution (horizontal wedge): 950 = 9.5 (playing) * 100, 750 = 7.5 (pause) * 100"
Although this doesn't agree with the overall behavior of the measurements in the technical report, which all showed a lower vertical resolution than horizontal resolution. From the report:

Hres=460 (lines=460x2x16/9 =1636)
Vres=400 (lines=400x2=800)

I also found it interesting from that report that color 720p has significantly better vertical resolution than 1080i.
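The report's cycles-based numbers convert to the TV-lines nomenclature used in this thread at two lines per cycle (a sketch of the arithmetic above):

```python
def cycles_to_lines_width(cycles_per_ph, aspect=16 / 9):
    """Horizontal resolution: cycles/picture height -> lines/picture width."""
    return round(cycles_per_ph * 2 * aspect)

def cycles_to_lines_height(cycles_per_ph):
    """Vertical resolution: cycles/picture height -> lines/picture height."""
    return cycles_per_ph * 2

print(cycles_to_lines_width(460))   # 1636 lines/picture width
print(cycles_to_lines_height(400))  # 800 lines/picture height
```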
post #38 of 79
Quote:
Originally Posted by zoyd View Post

Thanks for the pointer back to that advisory committee report and past discussions. I prefer their nomenclature (cycles/spatial dim. for "resolution" and "scanning format" for pixel pitch). So from that, should I conclude that the best one could hope for from a static B&W test pattern would be (1638x800) x 0.85 (mpeg decode hit) = 1392x680 (~340 cycles/picture height)? Or do we only apply the mpeg degradation to dynamic images?

Never did like the cycles/PH, etc. outlined in the ATSC committee's Resolution intro paragraph for its measurements (table 2.3), so that's why I converted it for this table oriented to 16X9 HD displays.

It's been a few years since I mulled all this over, but I'm fairly sure you'd have to distinguish between non-sampled test pattern images and those made with a camera (74-MHz sampling). The HDNet wedge patterns seem to be non-sampled and should provide close to ~1920X1080. Recall that Gary Merson's review of the first 1080p RPTV, with HDNet via pre-HDLite DirecTV a few years back, measured close to that, and he wrote that his signal generator supplied a full 1920X1080 resolution.
Quote:


edit:

Well, the vertical resolution part of my conclusion seems to be contradicted by GeekGirl's measurements:
"Vertical resolution (horizontal wedge): 950 = 9.5 (playing) * 100, 750 = 7.5 (pause) * 100"
Although this doesn't agree with the overall behavior of the measurements in the technical report, which all showed a lower vertical resolution than horizontal resolution. From the report:

Hres=460 (lines=460x2x16/9 =1636)
Vres=400 (lines=400x2=800)

Never pinned it down, but the ATSC-approval measurements may imply using sampled signals as well as interlaced CRTs (for 1080i) in the mid-'90s. The test committee has a series of notes (1--5) under table 2.3 (link just above) qualifying some of the readings, including vertical resolution (800 lines static). All static-pattern Vres readings of ~1080 from HDNet obviously don't involve the standard Kell factor loss of ~0.7 X format line count, or the additional vertical filtering of 1080i--smearing lines vertically--that further reduces Vres. (BTW, figure A3 of a recent European report on 720p/1080i compares expected readings with both the Kell factor and more filtering.) Something to factor in with 1080p displays might be how 540p bobbing--using 540-line TV fields to create 1080p frames--may be influencing Vres; it's often used with 1080p displays as Merson's tests/lists show.

The loss from MPEG-2 decoding I mentioned in my earlier linked table, I've always assumed, was part of the ~20% loss in Hres the video experts predicted in their resolution 'graph intro, even though it appears their measured loss from 1920 (table 2.3) is closer to 15%. Again, their table 2.3 notes 1-5 fudge it a bit more.

Quote:


I also found it interesting from that report that color 720p has significantly better vertical resolution than 1080i.

Can't find them now, but I have posted a variation of my simplified table in which the three measurements with higher 720p values than 1080i are colored. But they're easy to spot visually. Whether the report's table 2.3 notes 1--5 fudge those values isn't clear.

What's needed IMO are some new, independent, authoritative re-measurements of test patterns--as well as measured values from various major cable head ends, DBS, C-band dishes, etc. Some spectrum-analyzer comparisons of program material instead of test patterns would be interesting, too. Suggested this to the SMPTE but so far no reports of pending tests. -- John
post #39 of 79
Thanks for the great info; I have been interested in this subject but until recently haven't had time to go poking around. So the HDNet patterns shouldn't suffer from any camera sampling/filtering degradation because they are computer generated, but doesn't the de-interlace at the panel introduce smearing (vertically)? And if you don't have 1:1 pixel mapping, you should throw in the Kell factor also, shouldn't you?
post #40 of 79
Quote:
Originally Posted by zoyd View Post

So the HDNet patterns shouldn't suffer from any camera sampling/filtering degradation because they are computer generated, but doesn't the de-interlace at the panel introduce smearing (vertically)? And if you don't have 1:1 pixel mapping, you should throw in the Kell factor also, shouldn't you?

Sure seems as though HDNet patterns are either computer-generated or produced with pattern generators. AIUI, interlaced 480i or 1080i has additional vertical filtering added at transmission (or earlier) to prevent 'twittering' of finer details when they're displayed partially with one TV field, then the other field 1/60 sec later (on interlaced CRTs). Static test patterns may not need this vertical smearing (filtering) of lines. Progressive displays, of course, either combine the half-frames (fields) into frames--or 'bob' each field separately into frames, losing vertical resolution.

Consultant Michael Robins, in an article outlining 720p, points out that the 0.7 X line count (Kell factor) lowering vertical resolution applies to progressive images unless they originate from computers. So, with HDNet's patterns, reportedly measured close to 1080 lines with some 1080p displays, the Kell factor doesn't enter. Best I've measured for Vres with my year-2000 9"-gun 64" CRT 1080i RPTV is ~800 lines (using HDNet).

Don't have any data on scaling and 1:1 mapping. But, somewhat aside, it may be that viewing telecined movies on 1024X768 displays delivers most of the resolvable detail present. As quotes and sublinks in this post point out, 800-1000 lines (motion-video equivalent) maximum effective horizontal resolution is very typical of 1080/24p master tapes used for HD-delivered movies. Whether newer 4k scans downconverted to 1080/24p and used for some HD discs provide far more Hres hasn't been measured yet, although many reports cite 'crisper' images. -- John
post #41 of 79
Quote:
Originally Posted by zoyd View Post

I'm not sure the Samsung HL-R6768W is an ideal display for measuring horizontal resolution since it only has 960 horizontal pixels on the chip. Does anyone have any data on whether it can actually achieve an effective resolution of 1920 lines/picture width using the "wobulation" technique?

And the other postings after that. Interesting discussion, but the point is that I'm trying to measure the resolution of my Samsung display, not vice versa. As long as the test pattern has better resolution than the "Display Under Test" why is there a problem?

Perhaps the unanswered question is how the "wobulation" works to trick the human eye. Popular Science has an interesting article that shows the technique: http://www.popsci.com/popsci/bown/20...767810,00.html. It's not that technical, but it shows the concept. My thinking is that since the mirror moves at 2x60 Hz, the processor is putting pixels on top of, and adjacent to, each other in subsequent mirror flips. The (native?) resolution should be 960 and the effective resolution somewhere between 960 and 1,920. As nature works in logarithmic ratios (like human eye sensitivity), I would take the geometric mean of the two: sqrt(960 * 1920) = 1,358, which is close to my measurement.
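That geometric-mean guess, written out (a heuristic for a wobulated panel, not an established model):

```python
import math

def wobulation_estimate(native_pixels, addressed_pixels):
    """Geometric-mean guess at a wobulated DLP's effective horizontal
    resolution, between the chip's native and addressed pixel counts."""
    return round(math.sqrt(native_pixels * addressed_pixels))

print(wobulation_estimate(960, 1920))  # 1358, close to the 1333 reading
```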

Data repeated below. Multiply by 16/9 or whatever ratio you think applies. I still see 7.5 horizontal, but it's a judgment call (7 to 8, depending on where the lines blur together).

Display: Samsung HL-R6768W (67" DLP), room lighting = low ambient
Source: Motorola QIP6416-2 HD-DVR, Verizon FiOS, 1080i, HDMI interconnect.

Vertical resolution (horizontal wedge): 950 = 9.5 (playing) * 100, 750 = 7.5 (pause) * 100
Horizontal resolution (vertical wedge): 1,333 = 7.5 (pause or playing) * 100 * (16/9)

- Can someone tell me why I can't see the "10" using a YCC 4:4:4 color space, but it's just fine using RGB color space in the STB? Still confused on this.
post #42 of 79
Quote:
Originally Posted by GeekGirl View Post

And the other postings after that. Interesting discussion, but the point is that I'm trying to measure the resolution of my Samsung display, not vice versa. As long as the test pattern has better resolution than the "Display Under Test" why is there a problem?

The raw test pattern has enough resolution but the discussion was geared towards what happens during transmission, i.e. can you expect the line frequency to be maintained by the time it hits your display.

Quote:


- Can someone tell me why I can't see the "10" using a YCC 4:4:4 color space, but it's just fine using RGB color space in the STB? Still confused on this.

I did comment on this in the 6416 thread; I don't see this behavior with my moto--I can see the "10" in either RGB or YCC settings.
post #43 of 79
zoyd - Thanks. Missed yesterday's post. I asked here since I wasn't sure if it was an STB problem or something related to a pure color space concern (for this forum).
post #44 of 79
Perhaps the best method for establishing wobulation-display (or others) maximum effective resolution is using HD test discs, the resolution wedges on Blu-ray discs, pattern generators like the AccuPel, or computer software. Published reviews help. Then less-certain sources, such as HDNet via FIOS or standard-cable systems, can be measured for a maximum. -- John
post #45 of 79
Just ran across this statement at filmbug.com

"Due to technical reasons having to do with the video equipment, recording technologies, and the 19.2 Mbit/s-limited ATSC channel, some HDTV signals will not reach their nominal resolution. Most notably, 1080i60 is impossible to broadcast without artifacts at this bandwidth using ATSC. Most 1080i broadcast signals actually are filtered to 1440 horizontal samples to allow adequate compression"

If this is the case, then the HDNET test pattern will not be adequate for horizontal resolutions > 1440. I've got a 1920 LCD display I can hook my moto box up to this weekend, and I'll try to verify.
post #46 of 79
Spotted similar comments about a 1440-line limitation almost from day 1 of HD broadcasting. Gets murky as well as tricky IMO: Don't doubt that the ATSC broadcast format of 19.39 Mbps (~17 Mbps video payload) can't deliver full-motion 1920X1080i with the same effective resolution as this standard-format resolution. But keep in mind that HD's standard 74-MHz sampling (cameras, telecines) requires Nyquist filtering to prevent aliasing, reducing 1920 lines to ~1700 lines effective resolution. To obtain an effective resolution of 1920X1080 matching the format resolution, sampling at ~148 MHz (double) is needed, with downconversion to 1920X1080, in order to boost the limiting resolution (~1700) to ~1920.

The ATSC MPEG-2 codec uses lossy compression, so that as motion increases in images, fine details are 'tossed out' to achieve a ~17 Mbps payload (or often less). So prefiltering at some stations may well filter effective resolutions to ~1440 lines, although the broadcast format would still be 1920X1080i. Encoding hardware efficiency varies. Also, widely used HDCAM recorders deliberately limit 1080 resolution to <1440 lines, although tape signals are upscaled to 1920X1080 before broadcasting. The prefiltering of frequencies between 1440 and 1920 eliminates noise and sometimes film grain, which makes HDCAM's ~140 Mbps bit rate and tape speed possible. Networks now use newer HDCAM-SRs without 1440 limitations, although the earlier HDCAM is still used for production, too.

Test patterns such as HDNet's delivered at 17 Mbps video payload are mostly static images, requiring very little of the full bit rate, so resolution wedges achieving 1920-line effective resolution are possible.

Guess this all relates to calibration and the thread topic by trying to pinpoint HDNet's current maximum test pattern resolution from typical sources (FIOS, standard cable, C-band, DBS, although not D*'s HDLite, or even the few reported OTA stations relaying HDNet). Earlier above, believe I mentioned the typical max HDNet test pattern readings most viewers have been reporting at AVS (~1300 lines), with one claiming ~1920 lines from HDNet with a 1080p LCD RPTV on a smaller cable system, using a SA8300HD STB. -- John
post #47 of 79
Quote:
Originally Posted by John Mason View Post

The ATSC MPEG-2 codec uses lossy compression, so that as motion increases in images, fine details are 'tossed out' to achieve a ~17 Mbps payload (or often less). So prefiltering at some stations may well filter effective resolutions to ~1440 lines, although the broadcast format would still be 1920X1080i.

I don't think they filter the horizontal information in the way you are thinking. I've noticed this on some of my stream captures of HD stations that are not 5C protected. The format is 1920x1080 with an anamorphic 1440x1080 payload. That way 33% of the image is black leading to a higher compression ratio. I guess the mpeg2 decoder must know to remap the 1440 back to 1920 horizontal pixels.

edit: Did subsequently find that wrapping a smaller anamorphic image within a larger format is done all the time in the dvd world (anamorphic 4:3 displayed as 16:9). The mpeg header carries the image offsets and aspect ratio so that the display can reconstruct the proper image.
post #48 of 79
Quote:
Originally Posted by zoyd View Post

I don't think they filter the horizontal information in the way you are thinking. I've noticed this on some of my stream captures of HD stations that are not 5C protected. The format is 1920x1080 with an anamorphic 1440x1080 payload. That way 33% of the image is black leading to a higher compression ratio. I guess the mpeg2 decoder must know to remap the 1440 back to 1920 horizontal pixels.

Hadn't heard that before. Only encountered mention of anamorphic-type compression with Fox's 480i/p, and there I believe it was only from the network to stations in anamorphic form. Suggest posting such a finding in the HD programming forum for some of the gurus. -- John

Edit: A bit of Googling suggests this 1440 technique is used in HDV, the prosumer tape format.
post #49 of 79
Quote:
Originally Posted by John Mason View Post

Edit: A bit of Googling suggests this 1440 technique is used in HDV, the prosumer tape format.

yes, it allows you to get an hour onto a minidv format tape. I'm going to get some frame grabs of several of the in-the-clear HD channels to see whether some are using anamorphic compression, and also measure the HDNET pattern using a true 1920 display. I have read elsewhere on the forum that the PQ of HDNET movies is not as good as when watching the same movie on the premium HD channels; maybe this is the reason. I'll post any results I get to the HDTV forums.
post #50 of 79
Quote:
Originally Posted by John Mason View Post

A bit of Googling suggests this 1440 technique is used in HDV, the prosumer tape format.

Is that an issue of pre-filtering, or an issue of the sensors only producing a max of 1440 pixels in the 1080i camcorders (specifically Sony)? I do know that the JVC prosumer cameras (being adopted by several stations) have full 720p x1280 resolution onto HDV, and that's roughly the same bitrate as 1080i x 1920.
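A rough uncompressed pixel-rate comparison behind the "roughly the same bitrate" remark (frame rates assumed here: 60 frames/s for 720p, 30 full interlaced frames/s for 1080i):

```python
def pixel_rate(width, height, frames_per_sec):
    """Uncompressed pixels/second for a given scanning format."""
    return width * height * frames_per_sec

p720 = pixel_rate(1280, 720, 60)      # 720p60
i1080 = pixel_rate(1920, 1080, 30)    # 1080i60 = 30 full frames/s
hdv1080 = pixel_rate(1440, 1080, 30)  # HDV's 1440-limited 1080i

print(p720, i1080, hdv1080)  # 55296000 62208000 46656000
```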
post #51 of 79
Quote:
Originally Posted by davehancock View Post

Is that an issue of pre-filtering, or an issue of the sensors only producing a max of 1440 pixels in the 1080i camcorders (specifically Sony)? I do know that the JVC prosumer cameras (being adopted by several stations) have full 720p x1280 resolution onto HDV, and that's roughly the same bitrate as 1080i x 1920.

Mulling it over a bit more, I believe the prosumer HDV format is specified at 1440X1080. Camcorder sensors might be capable of higher resolutions, but tape size and speed require such limits because of the restricted bandwidth. AIUI, a 1440X1080 signal fed to some displays would be automatically upscaled to 1920X1080i/p (or downscaled, as needed). Or, more likely, the HDV playback hardware could perform the upscaling to 1920X1080i.

Sony's prefiltering for its pro HDCAM hardware is similar in that 1440 is a cutoff resolution (for the tape, not the camera). During production processing, or before transmission, that 1440X1080 is upscaled to 1920X1080i. Perhaps, to speculate, both HDV and HDCAM are using FIR (finite impulse response) filtering that averages and samples 'missing' resolutions/frequencies (1440-1920), and these stored samples can then be used to enhance PQ during upscaling to full 1920. Codec engineer dr1394, who wrote that linked post, mentioned ATSC doesn't permit such filtering.

Also, Fox's network-to-station delivery of so-called 'anamorphic' 480i (before their switch to 720p), which I mentioned above, was meant for expansion to 16X9 480p from 4X3 480i, as I recall. Seems to be an electronic adaptation of the optical meaning of anamorphic; that is, using a special lens that horizontally squeezes images while recording them, then a different lens to widen the optically compressed images during playback. -- John
post #52 of 79
The contrast between 1920X1080i measurements made during the mid-'90s approval of the ATSC HD formats and readings from HDNet's 1080i test patterns is significant. The reasons aren't entirely clear. The ATSC approval final report appears at the ATSC site; table 2.3 summarizes measurements, while the "Resolution" intro paragraph summarizes their approach. As mentioned above, I posted this simplified version of table 2.3's data.

Here's what one reviewer, Gary Merson, using HDNet's resolution-wedge patterns to test Toshiba's now-discontinued 1080p RPTV for the May/June '03 "The Perfect Vision," wrote:
Quote:


...While the test transmission clearly showed the 1080 lines of resolution vertically, the broadcast's horizontal resolution fell a tad short. (Resolution is determined by taking the highest number of lines viewable and multiplying it by the picture width--in this case, 1000 lines times 1.777, which equals 1770 pixels or lines.) My test generator confirmed that the few missing lines were likely caused by a limitation of the camera and/or compression and not by the Toshiba display.

Merson used a pre-HDLite (1920X1080 NOT reformatted to 1280X1080) DirecTV receiver. About that time another AVSer measured HDNet's res patterns and concluded they were computer generated, not from a camera. Earlier above I linked an authoritative article pointing out that non-sampled computer-generated images needn't undergo the sharp (0.7 X line count) vertical resolution loss from the Kell factor of sampled (camera, telecine) images. One other member reported measuring >1900 lines from HDNet with an LCD 1080p via his TWC head end and an 8300HD. Most have posted measuring ~1300 lines maximum effective lines of horizontal resolution from HDNet. Other HDNet readings, such as those planned by zoyd above, would help pin down differences.

The ATSC measurements, presumably using the best-quality HD CRT monitors available at the time, were 1638X800 for the static B&W pattern test. Whether these differences with HDNet are just a difference in years, pattern source (computer vs. TV camera), or display hardware is still unknown. -- John
post #53 of 79
Quote:
Originally Posted by John Mason View Post

Mulling it over a bit more, I believe the prosumer HDV format is specified at 1440X1080. Camcorder sensors might be capable of higher resolutions, but tape size and speed impose such limits because of the restricted bandwidth.

I checked some documentation (from Steve Mullen) that I have which confirms what you are saying. The HDV format is limited to 1080i with 1440 pixels/line OR 720p with 1280 pixels/line. I had come to my conclusion that it was limited by the cameras based on the first 3-chip Sony prosumer camcorder (HDR-FX1), which had, I believe, 1080 x 960 sensors. By using a green pixel shift, these were able to produce a luminance signal with 1080 x 1440 pixels. Hence my association of 1440 with the sensors, not the standards.

Good discussion, though.
post #54 of 79
Saturday, June 23rd, at 6:30 am ET
http://www.hd.net/program_search_res...tosearch=title

Sunday or Tuesday mornings were slated for airings previously. To locate upcoming changes go to: hd.net, select schedules, then search programming, then HDNet, then use TEST as a TITLE search word ("HDNet test patterns"). -- John
post #55 of 79
Has anyone tried to use the HDnet Test Pattern for white-balance calibration or gamma measurement? Specifically, the "resolution pattern" at the 4-min (remaining) mark? I called HDnet and talked to one of their techies, who assured me that the grayscale blocks at the bottom of that pattern go from 0% to 100% in 10% steps. Any pitfalls in using this pattern to set color temp and to read/compute luminance and gamma? In particular, is the APL of this pattern too high to get accurate readings on a plasma? The gray "background" of the majority of this pattern is identical to the 60% gray block. And are the steps accurate?

I've used this pattern to dial in the white balance using the User Mode controls of my Panny TH-50PH9UK, and to my eye, it took the panel's color accuracy from obviously wrong to extremely sweet, especially judging from sports uniforms, the kind of things I've seen live enough times to know what color they should be. I get screwy numbers when trying to measure gamma, though, which makes me skeptical of the accuracy of the grayscale steps, though this could have been pilot error as well.

Just curious what the experts think about this particular pattern for this particular application.

Thanks,
Jim S.

P.S. I should add that I'm using a Spyder2 and HCFR to do the calibration(s).
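For what it's worth, here's a minimal sketch of how gamma could be computed from meter readings of those 10% grayscale steps, assuming the blocks really are linear 0-100% video steps. The luminance values below are invented for illustration, not measurements from any display.

```python
import math

# Illustrative meter readings (cd/m^2) for the 10%-90% grayscale blocks;
# these numbers are made up for the example, not real measurements.
stimulus = [0.1 * i for i in range(1, 10)]  # 10% .. 90% video level
readings = [0.9, 3.9, 9.5, 18.0, 30.0, 46.0, 66.0, 91.0, 120.0]
white = 150.0  # reading off the 100% block

# If Y/Ywhite = stimulus ** gamma, then gamma = log(Y/Ywhite) / log(stimulus).
for s, y in zip(stimulus, readings):
    gamma = math.log(y / white) / math.log(s)
    print(f"{int(round(s * 100)):3d}%: gamma ~ {gamma:.2f}")
```

One caveat on a plasma: a high-APL pattern like this can trigger the ABL and depress the 100% reading, which would skew every per-step gamma figure, so "screwy numbers" wouldn't necessarily mean the steps themselves are inaccurate.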
post #56 of 79
Quote:
Originally Posted by John Mason View Post

Saturday, June 23rd, at 6:30 am ET http://www.hd.net/program_search_res...tosearch=title Sunday or Tuesday mornings were slated for airings previously. To locate upcoming changes go to: hd.net, select schedules, then search programming, then HDNet, then use TEST as a TITLE search word ("HDNet test patterns"). -- John

For Verizon FiOS, the program guide shows up as "Off Air" - transmitter maintenance. No hint of test patterns whatsoever.
post #57 of 79
Quote:
Originally Posted by GeekGirl View Post

For Verizon FiOS, the program guide shows up as "Off Air" - transmitter maintenance. No hint of test patterns whatsoever.

I have found that HDNet is not too hot on getting programming updates to the data service providers (TVGuide & Tribune) that all the cable/satellite providers use. Titan simply lists HDProgramming for Saturday morning, but TW, here in Rochester, lists the test patterns for Sat AM OK. It will probably show up for you in a day or two.
post #58 of 79
Probably not. It's been a few weeks and "Off Air" is the only description shown. Program times are OK. A TiVo owner on Broadband Reports said that his program guide listed the title as "Test Patterns", but the description as "Vintage episodes of Canada's favorite game shows." http://www.dslreports.com/forum/remark,18411359
post #59 of 79
Quote:
Originally Posted by GeekGirl View Post

For Verizon FiOS, the program guide shows up as "Off Air" - transmitter maintenance. No hint of test patterns whatsoever.

Interesting. Thanks. Often my TWC guide, for the earlier Sunday/Tuesday HDNet test pattern showings, wasn't accurate until about one day before the showings. If Verizon's FIOS doesn't deliver the Saturday June 23 6:30 am patterns (HDNet search link above).... ! -- John
post #60 of 79
Quote:
Originally Posted by jdsolomon View Post

Has anyone tried to use the HDnet Test Pattern for white-balance calibration or gamma measurement? Specifically, the "resolution pattern" at the 4-min (remaining) mark? I called HDnet and talked to one of their techies, who assured me that the grayscale blocks at the bottom of that pattern go from 0% to 100% in 10% steps. Any pitfalls in using this pattern to set color temp and to read/compute luminance and gamma? In particular, is the APL of this pattern too high to get accurate readings on a plasma? The gray "background" of the majority of this pattern is identical to the 60% gray block. And are the steps accurate?

I've used this pattern to dial in the white balance using the User Mode controls of my Panny TH-50PH9UK, and to my eye, it took the panel's color accuracy from obviously wrong to extremely sweet, especially judging from sports uniforms, the kind of things I've seen live enough times to know what color they should be. I get screwy numbers when trying to measure gamma, though, which makes me skeptical of the accuracy of the grayscale steps, though this could have been pilot error as well.

Just curious what the experts think about this particular pattern for this particular application.

Thanks,
Jim S.

P.S. I should add that I'm using a Spyder2 and HCFR to do the calibration(s).

Yes, I used the pluge pattern inset into the resolution-wedge screen a few years ago for checking my 64" CRT RPTV. As I recall, using a few live 1080i sports programs sourced in bright sunlight, I then slightly readjusted brightness/contrast and even tint/color for the most realistic flesh tones and other colors. I'm not really a calibration expert, but since shadow detail was too low for the live 1080i, I adjusted user controls until images looked 'normal'. Most programming has appeared normal since, and when color is off (rarely) I assume something's misadjusted at the program source, not in my display. There's never been a huge difference between the HDNet pluge settings and what appears better to my eyes. (I'm not a color-TV neophyte, having built and adjusted three color TVs from kits, and adjusted/repaired an Advent 1000 CRT FP for about 21 years.)

About the time I measured 1280-line effective resolution from HDNet's resolution wedges (and 1335 later with another cable system and different STB), a local ISFer measured and posted 1290 effective lines using a 1366X768 plasma like yours. Some time later a local Ruby 1080p FP owner measured ~1335 lines from TWC. Some other HDNet pattern users, besides the one who claims seeing >1900 lines from a smaller Calif. cable system, may also be getting roughly full resolution. If so, I suspect a measurement with a 1366X768 display would show clearly resolvable detail out to ~1366 lines (~7.67 on the wedges), indicating a higher-rez display would show more.

That's with test patterns, of course, while programming, going by AVS and other estimates posted for years, may be limited to ~1450 lines equivalent maximum effective horizontal resolution with good HD sources, but only 800-1000 effective lines from non-HD-disc movie sources. -- John