
·
Registered
Joined
·
6,903 Posts
Quote:
Originally posted by Bill Johnson
I'm sure Fox execs have eyes and may be asking themselves whether they too should go to 1080i.
Why did FOX choose 720P? Was there an explanation anywhere? I think I just missed it.
 

·
Registered
Joined
·
3,325 Posts
They believed it to be technically superior. Regardless of whether 720p or 1080i offers a sharper picture, there is something to be said for having a picture that's free from interlacing artifacts.
 

·
Registered
Joined
·
14,599 Posts
Quote:
Originally posted by mx6bfast
Why did FOX choose 720P? Was there an explanation anywhere? I think I just missed it.
For the same reason ABC chose it. Fast action of sports. Progressive handles the fast motion better than interlace. It is that simple.
 

·
Registered
Joined
·
67 Posts
Quote:
Originally posted by foxeng
For the same reason ABC chose it. Fast action of sports. Progressive handles the fast motion better than interlace. It is that simple.
I keep hearing this, but CBS football and INHD baseball are smooth as silk all game long. So if 1080i can handle sports without any issues, it seems to me the way to go. I keep hearing how 720p handles the fast action of sports better, but if 1080i is having no problems with fast action, the point becomes moot.
 

·
Registered
Joined
·
4,746 Posts
Quote:
Originally posted by foxeng
For the same reason ABC chose it. Fast action of sports. Progressive handles the fast motion better than interlace. It is that simple.
If it's that simple, why are there NO artifacts on CBS's coverage?


I don't understand this whole 720p and 1080i thing, because we have THREE networks broadcasting 720p (FOX is without a doubt the worst) and we have CBS in 1080i, and CBS smokes the other three.
 

·
Registered
Joined
·
67 Posts
Quote:
Originally posted by Rakesh.S
If it's that simple, why are there NO artifacts on CBS's coverage?


I don't understand this whole 720p and 1080i thing, because we have THREE networks broadcasting 720p (FOX is without a doubt the worst) and we have CBS in 1080i, and CBS smokes the other three.
I think that Fox, ABC, and ESPN made a big mistake going with 720p over 1080i, and it's too late to do anything about it (it's not like they would admit it even if they thought they screwed up). I guess we just have to feel lucky that everything else (CBS, NBC, PBS, HDNet, INHD, Discovery, HBO, Showtime, and a host of others) has gone 1080i.
 

·
Registered
Joined
·
553 Posts
Quote:
Originally posted by foxeng
No, since you know more than the rest of us engineers, you'd better do it. We are just idiots and shills.
foxeng,

I for one appreciate your knowledge and contributions to this forum. What do you think of the PQ of Fox's NFL HD coverage compared to CBS's NFL HD PQ? If you agree with others on this board that Fox might be a little softer and less detailed, what can Fox do to change this? Is it related to all the talk about variable bit rates?
 

·
Premium Member
Joined
·
10,517 Posts
Quote:
Originally posted by balazer
Word from the FOX network is that the bandwidth will be increased after they sort out a problem with their satellite feeds, probably around the end of the year. There was no indication of what the higher bit rate would be.
The current transponder bitrate is 55.294 Mbps. That supports eight ATSC MPEG-2 streams, half of which are SD and the other half HD. They ultimately want to go to 73.276 Mbps.


Their main transponder on G3R, 18, is experiencing ingress from an adjacent satellite, resulting in too low an Eb/No when used at 73.276 Mbps.


How they will be allocating the 73.276 Mbps is unknown.
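
For concreteness, here's that arithmetic as a small Python sketch. Only the 55.294 and 73.276 Mbps totals and the four-SD/four-HD split come from the post; the 4 Mbps per SD stream is a made-up illustration, since the actual allocation is unknown:

```
# Back-of-the-envelope math on the transponder rates quoted above.
CURRENT_MBPS = 55.294   # current transponder payload
TARGET_MBPS = 73.276    # planned payload once the G3R ingress problem is fixed
NUM_HD, NUM_SD = 4, 4   # eight ATSC MPEG-2 streams, half SD and half HD

SD_MBPS = 4.0           # hypothetical per-SD-stream allocation
hd_now = (CURRENT_MBPS - NUM_SD * SD_MBPS) / NUM_HD
hd_target = (TARGET_MBPS - NUM_SD * SD_MBPS) / NUM_HD

print(f"per-HD-stream budget now:    {hd_now:.2f} Mbps")     # ~9.82
print(f"per-HD-stream budget target: {hd_target:.2f} Mbps")  # ~14.32
```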
 

·
Registered
Joined
·
32,172 Posts
"For the same reason ABC chose it. Fast action of sports. Progressive handles the fast motion better than interlace. It is that simple."


In theory it was. And that was the mistake made many years ago by ABC -- a belief that it was automagically going to be better for sports.


It isn't.


And I say that as a believer in the mantra progressive is better.


The fact, however, is that 1080i carries 50% more scanning lines (1080 vs. 720), which is hugely valuable.


And CBS has, indeed, proved that interlace artifacts are not some inherent requirement of 1080i sports.


But it is, indeed, a done deal for FOX and ABC.
 

·
Registered
Joined
·
1,235 Posts
After more than two years of HD viewing, I can rather confidently say that, at least on my Loewe Aconda, 1080i is basically uniformly superior to 720p at this juncture. Granted, overall picture quality varies greatly from broadcast to broadcast (Sunday night football looks considerably better than Fox's football games, and CBS's games look better than both without fail), and there are many factors that affect the net quality of the picture. But at the end of the day, this much is true: Fox football looks great. Even the overcompressed stuff looks noticeably better than 480p and has an overall pleasing picture. But for that "looking through a window" sensation, you NEED that CBS 1080i. The best 720p looks like really great digital video. The best 1080i looks almost 3-D. The worst 720p doesn't look much better than 480p, and the worst 1080i (I'm thinking "E.R.") is about the same. When both formats are done well -- and usually they are -- 1080i still wins. Can't wait to see high-bitrate 1080p someday, though.
 

·
Registered
Joined
·
742 Posts
Quote:
Originally posted by Rakesh.S
If it's that simple, why are there NO artifacts on CBS's coverage?


I don't understand this whole 720p and 1080i thing, because we have THREE networks broadcasting 720p (FOX is without a doubt the worst) and we have CBS in 1080i, and CBS smokes the other three.
I agree with you. If 720p is so much better for sports, why does CBS's coverage look so much better than FOX's, ABC's, and ESPN's? I will say that ESPN and ABC do look much better than FOX.


I would love to hear from the more technically savvy on this question.
 

·
Registered
Joined
·
3,325 Posts
If you don't see interlacing artifacts, then you don't know what to look for, or your TV is not very big. The interlacing artifacts are always there, unless they dramatically sacrifice the vertical resolution. The artifacts are there. I always see them.


Say what you will about which system is sharper or which looks better to your eyes. There is something to be said about having a system that delivers pictures without these artifacts.


Also, let me throw this out there: 720p has potential yet to be realized. Wait until you see oversampling cameras and TVs. Did it ever surprise you how good a 480p DVD can look on an HDTV, compared to a 480i TV? That's the result of an oversampling display with an oversampled progressive source.
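
As a rough illustration of the oversampling point, here's a toy one-dimensional Python sketch of my own, with made-up numbers: detail finer than the delivery resolution folds back as a false pattern when sampled directly, while an oversampling capture that low-pass filters before decimating merely attenuates it:

```
import numpy as np

DELIVERY = 720                   # delivery samples; a stand-in for 720 lines
OVER = 4                         # 4x oversampled capture

x = np.linspace(0, 1, DELIVERY * OVER, endpoint=False)
scene = np.sin(2 * np.pi * 500 * x)  # 500 cycles: finer than DELIVERY/2

# Direct capture at the delivery rate: the unresolvable detail aliases
# into a full-amplitude false pattern.
naive = scene[::OVER]

# Oversampled capture: low-pass (a crude 4-tap average) before decimating,
# so the unresolvable detail is attenuated instead of aliasing.
smoothed = np.convolve(scene, np.ones(OVER) / OVER, mode="same")
oversampled = smoothed[::OVER]

print("energy after direct capture:     ", round(float(np.sqrt(np.mean(naive ** 2))), 3))
print("energy after oversampled capture:", round(float(np.sqrt(np.mean(oversampled ** 2))), 3))
```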
 

·
Registered
Joined
·
4,039 Posts
Quote:
Originally posted by balazer
If you don't see interlacing artifacts, then you don't know what to look for, or your TV is not very big. The interlacing artifacts are always there, unless they dramatically sacrifice the vertical resolution. The artifacts are there. I always see them.


Say what you will about which system is sharper or which looks better to your eyes. There is something to be said about having a system that delivers pictures without these artifacts.


Also, let me throw this out there: 720p has potential yet to be realized. Wait until you see oversampling cameras and TVs. Did it ever surprise you how good a 480p DVD can look on an HDTV, compared to a 480i TV? That's the result of an oversampling display with an oversampled progressive source.
Oversampling cameras are already in use. That is what Fox uses, along with many production companies; those are Thomson/Grass Valley cameras. I believe that CBS uses them now too. 720p will not look as good as natively captured 1080i, because it has less horizontal resolution, which is largely responsible for the detail. What will eventually make 720p look as good as 1080i is the move to 1080p for production and capture. Until then, 1080i will always be the king.
 

·
AVS Forum Special Member
Joined
·
11,139 Posts
Quote:
Originally posted by CKNA

What will eventually make 720p look as good as 1080i is the move to 1080p for production and capture. Until then, 1080i will always be the king.
I believe I'm missing something. If a 720p network uses 1080p for production and capture, won't that just (potentially) bump resolvable horizontal detail from the current ~1138 samples/lines/pixels up to a full 1280? There is still the format limit. Or do you mean 720p sources converting all the way to 1080p? -- John
 

·
Registered
Joined
·
4,746 Posts
Quote:
Originally posted by balazer
If you don't see interlacing artifacts, then you don't know what to look for, or your TV is not very big. The interlacing artifacts are always there, unless they dramatically sacrifice the vertical resolution. The artifacts are there. I always see them.


Say what you will about which system is sharper or which looks better to your eyes. There is something to be said about having a system that delivers pictures without these artifacts.


Also, let me throw this out there: 720p has potential yet to be realized. Wait until you see oversampling cameras and TVs. Did it ever surprise you how good a 480p DVD can look on an HDTV, compared to a 480i TV? That's the result of an oversampling display with an oversampled progressive source.
I have a 55" TV. I do not look for interlacing artifacts.


The picture just flat-out looks better on CBS. As many have described, you get a 3-D effect from 1080i that just isn't there with 720p.
 

·
Registered
Joined
·
2,816 Posts
Quote:
i don't understand this whole 720p and 1080i thing because we have THREE networks broadcasting 720p(fox is without a doubt the worst) and we have CBS in 1080i and CBS smokes the other three
OK, time to hear from an expert. Television Engineer Mark Schubin has provided the following explanation. I think we could all learn from it.



--------------------------------------------------------------------------------------------------------------

There are many factors that affect picture quality aside from format. These include lens design, lens condition, lens mounting, camera type, camera design, camera condition, and camera setup. Those can have huge effects on picture quality. But I'll concentrate on image format.

An interlaced format has a number of drawbacks relative to a progressive format. Regardless of image, there is a "pi" effect when scanning lines are visible that can draw attention, but that's unlikely to be an issue on an HDTV consumer display viewed at normal distances.

There is another reduction in vertical resolution, referred to as the "interlace coefficient," which researchers have placed anywhere from 0.5 (half the resolution) to 1.0 (no reduction). In NHK's early HDTV testing, they found 731 (total) lines progressive to be visually equivalent to 1125 (total) lines interlaced for still images, but that research was conducted in an early era of tube cameras.

Another deficiency that can appear in interlace even in still images is called interline flicker or twitter. If a horizontal line in an image appears in only one scanning line, it will appear only 29.97 times per second instead of 59.94. The smaller number is below the human vision flicker threshold for most viewing conditions. The line, therefore, will appear to flicker. That's a problem for graphics, but most cameras are normally set up to do line averaging (two rows on the image sensor are added to create a scanning line; in the next interlaced field, each of those rows has a different partner). That reduces the effect.
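
To make that twitter-and-averaging description concrete, here's a toy Python sketch of my own (not Schubin's), using an eight-row frame with a single one-row-high line:

```
import numpy as np

frame = np.zeros(8)
frame[4] = 1.0                     # a detail exactly one scanning line tall

# Plain interlace: the even field carries rows 0,2,4,6; the odd field 1,3,5,7.
even_field = frame[0::2]
odd_field = frame[1::2]
print("plain interlace:", even_field.max(), odd_field.max())
# -> 1.0 and 0.0: the line shows up in only one of every two fields,
#    i.e. 29.97 times per second, which the eye sees as twitter.

# Line averaging: each scanning line is the mean of two sensor rows, and
# the next field pairs each row with its other neighbor.
avg_even = (frame[0::2] + frame[1::2]) / 2               # rows (0,1),(2,3),...
avg_odd = (frame[1::2] + np.roll(frame, -1)[1::2]) / 2   # rows (1,2),(3,4),...
print("line averaged:  ", avg_even.max(), avg_odd.max())
# -> 0.5 and 0.5: half the amplitude, but present in both fields, so the
#    detail refreshes 59.94 times per second and the twitter disappears.
```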


The other deficiencies of interlace all relate to moving images. When vertical motion is at a rate that is a multiple of one scanning line per field, vertical resolution is halved, because what was in a scanning line in one interlaced field appears in a different scanning line in the next interlaced field, and the adjacent interlaced scanning line never sees the additional detail in the source. That's a serious issue in graphics (consider credits at the end of a show), but, thanks to gravity, it's less of an issue in football.

In horizontal motion, the fields get separated. That's a problem in signal compression and processing, which is why progressive scanning has been said to have a compression-efficiency advantage over interlace roughly equivalent to the interlace coefficient. It's obviously a big problem in still images. It can be a problem for displays that show all lines at once, such as LCD, DLP, and some plasma displays. But it is not a problem for normally interlaced displays like CRTs (direct-view or projection), other than any interlace-coefficient losses.

So, all else being equal, progressive should look better than interlace. But all else is absolutely NOT equal. Even ignoring lens, camera, maintenance, and setup issues, interlace has one very significant advantage over progressive: it has half the information rate.

A 1280 x 720 progressive camera has a little under a million pixels per frame. At 59.94 frames per second, it approaches sixty million pixels per second. An interlaced camera has only 29.97 frames per second, so it can use roughly twice as many pixels per frame and achieve the same number of pixels per second. 1920 x 1080 is roughly two million pixels per frame.
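
That pixel-rate arithmetic, spelled out as a trivial check (using only the numbers in the paragraph above):

```
# 720p sends whole frames 59.94 times a second; 1080i sends half-frames
# (fields) 59.94 times a second, i.e. 29.97 full frames per second.
progressive = 1280 * 720 * 59.94        # ~55.2 million pixels per second
interlaced = 1920 * 1080 * (59.94 / 2)  # ~62.1 million pixels per second

print(f"720p59.94:  {progressive / 1e6:.1f} Mpixel/s")
print(f"1080i59.94: {interlaced / 1e6:.1f} Mpixel/s")
```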


If we assume NHK's research still holds true today, then the 720 lines of a progressive camera will actually provide slightly better vertical resolution than the 1080 of an interlaced camera. But there's no question that the 1920 pixels per line of the interlaced camera are far more than the 1280 of a progressive camera.

That's a limiting-detail discussion. There's also sharpness. The psychovisual sensation of sharpness is proportional to the square of the area under a curve plotting contrast ratio against detail fineness. All such curves (normally called "modulation-transfer function" or MTF curves) have a shape somewhat like the right side of a bell-shaped curve, i.e., high at the left, sloping down slightly on a "shoulder," dropping faster after the shoulder, and then flaring out at the bottom in a "toe." The shoulder area is what is most significant for sharpness. If the shoulder can be made higher and broader, sharpness increases even when images are viewed after recording on an analog VHS cassette. The toe area, being low in contrast ratio, is relatively insignificant, which is how Sony got away with dropping all resolution over 1440 pixels per line in the professional HDCAM format (JVC and Panasonic do similar in D9 HD and DVCPRO HD, respectively).
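
Here's a toy numeric sketch of that sharpness rule. The Gaussian fall-off shape and the 0.6 cutoff factor are invented purely for illustration; only the square-of-the-area-under-the-MTF-curve rule and the sensor widths come from the explanation above:

```
import numpy as np

f = np.linspace(0, 1920, 4000)     # detail fineness, in pixels per line

def mtf(f, pixels_per_line):
    # Invented model of the right half of a bell curve: contrast falls
    # off as detail approaches the sensor's pixel count, so more pixels
    # per line means a higher, broader shoulder.
    return np.exp(-((f / (0.6 * pixels_per_line)) ** 2))

for sensor in (1280, 1920, 4000):  # 720p sensor, 1080i sensor, wish-list
    area = np.trapz(mtf(f, sensor), f)
    print(f"{sensor:>4} px/line sensor: relative sharpness {area ** 2 / 1e6:.2f}")
```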


It has LONG been known that more pixels in the camera make a broader shoulder. Ordinary high-end standard-definition cameras intended for use in analog broadcasts (which, in the U.S., cannot carry more than about 440 pixels per line) have typically had about 1300 pixels per line for the purpose of raising the shoulder of the MTF curve. It works. That's why the pictures from those cameras look better than pictures from older cameras, even when viewed off VHS cassettes recorded off analog broadcasts.

1080-line HDTV cameras typically have 1920 x 1080 sensors. The pictures would look better if they had, say, 4000 x 1080, but the technology hasn't really been available to do that economically yet. Unfortunately, most 720-line HDTV cameras typically have 1280 x 720 sensors. 1280 is fewer pixels per line than in even some high-end SDTV cameras. It makes for a shortened, lowered shoulder and, therefore, significantly less sharpness than in a typical 1080-line camera. The 720-line format does not preclude more pixels per line in the camera; it just hasn't been done until very recently.

Finally, let me discuss format conversion or scaling. ABC the network distributes 720p signals, but not all ABC affiliates broadcast it. WFAA in Dallas, for example, uses 1080i in house. So whatever ABC distributes goes through a format-conversion stage that is likely to reduce image quality prior to transmission. I don't know what Comcast's Dallas-area cable systems are doing (I suspect passing whatever the broadcast is), but, back when they were AT&T's, an executive of the company told Congress they would convert any 1080i to 720p for the aforementioned compression efficiency. Add, say, a Pioneer 720p plasma panel and an early Pioneer set-top ATSC receiver that could only emit 1080i, and you could come up with this bizarre format-conversion scenario:

- ABC decides to show a clip from CBS in a "Monday Night Football" show and converts it from 1080i to 720p.
- WFAA in Dallas converts ABC's 720p to 1080i.
- Hypothetically, a Dallas-area cable operator converts WFAA's 1080i to 720p.
- The Pioneer set-top box (back in the days when most cable operators used 8-VSB for DTT retransmission) converts the 720p to 1080i.
- And the Pioneer plasma display converts the 1080i back to 720p.

That's FIVE passes through format converters, regardless of lens, camera, maintenance, setup, and production issues.
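
For a rough feel of what five scaler passes can do, here's a toy sketch, entirely my own: it models each format by line count alone (ignoring interlace) and uses a naive linear resampler, far cruder than a real converter:

```
import numpy as np

def resample(signal, new_len):
    # naive linear-interpolation scaler; real converters filter better,
    # but no pass through a scaler is lossless
    old_x = np.linspace(0.0, 1.0, len(signal))
    new_x = np.linspace(0.0, 1.0, new_len)
    return np.interp(new_x, old_x, signal)

detail = np.random.default_rng(0).standard_normal(1080)  # stand-in content

signal = detail
for n in (720, 1080, 720, 1080, 720):  # the five passes described above
    signal = resample(signal, n)

direct = resample(detail, 720)         # a single clean 1080 -> 720 pass
loss = np.sqrt(np.mean((signal - direct) ** 2))
print(f"extra error from five passes vs. one: {loss:.3f} RMS")
```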


So, to sum up, the only advantage 1080i, as a format, has over 720p is 1920 pixels per line, but those above 720p's 1280 fall in the MTF toe and are not, therefore, very significant. Unfortunately, 1080i CAMERAS have 1920 pixels per line and most 720p cameras do not (although they COULD). That affects the shoulder of the MTF curve and gives 1080i a big advantage in sharpness.

Many in the industry eagerly await Sony's 1080p camera, which should make lovely 720p pictures.

--------------------------------------------------------------------------------------------------------------
 

·
AVS Forum Special Member
Joined
·
11,139 Posts
Hadn't seen that published, Rich. Thanks for the post. Not sure about the final 'graphs indicating that 1080's detail above ~1280 isn't "very significant," though. Elsewhere Schubin has written that live 1080 can deliver >1600 lines of horizontal resolvable detail (in nearly static scenes). Most here point out the obvious: 1080i does look better than 720p, if your display is capable. On my display, most 1080i is strikingly better looking, from a detail standpoint, than 720p.


Also, this MTF-curve 'sharpness' thing versus finer HD resolvable detail gets a bit subtle. As I understand Schubin's point, which he also outlined recently here, sharpness resulting from more area under an MTF curve isn't necessarily the same thing as more resolvable detail made possible by HDTV. It's just an enhancement to resolution/detail that might fall well beneath HD's potential resolutions. If you spend $10k on a new 1080p set, would you rather display
 

·
Registered
Joined
·
742 Posts
Based on Mark Schubin's response, the phenomenon many on this thread have observed is attributable to MTF. I am happy now.
 