
·
Registered
Joined
·
7 Posts
Discussion Starter · #1 ·
I've been reading AVS for a couple of months now, and I'm very impressed with this group. There seems to be one ultimate question, however, that hasn't been answered: the Holy Grail question.


The Holy Grail, as I call it, is creating an all-digital consumer video experience: a system which sources a video signal in a digital format and gets it all the way to the display device with pixel-for-pixel accuracy and no image degradation through scaling. Below are a few practical questions related to this that I am struggling with. I hope some of you experts in this forum can set me straight.


As I see it, there are a few key steps to attain the Holy Grail:


1) Obtain a clean HD digital feed in a given format (OTA or DBS, for the moment)

2) "Tune that signal in" using an ATSC or D* tuner

3) Transport that signal in the *digital* domain from the tuner to a native display (LCD / DLP, etc.) using a true digital interface (DVI or firewire)

4) Display that signal digitally in its native resolution (no scaling)


Until recently, the above was only a dream. As we have all heard, however, "wonder of wonders, miracle of miracles", D* is requiring DVI outputs on all new HD tuners, so it seems at last we will have access to tuners that will give us native digital signals. [As we have also heard, there are a few FireWire solutions emerging here and there, and those may work too (notice, I'm neatly side-stepping that part of the debate!)]


Let us use DVI for our example. With one of these DVI HD STBs, the Holy Grail SHOULD be possible. For a few years now, a number of projectors have been able to accept DVI (or its earlier incarnation, DFP (TMDS)) signals, so a few of us should be in a position to make all this work. Assuming for the moment that the source material is not HDCP protected, one could theoretically receive a 480p, 720p, 1080i or whatever signal, shoot it across the DVI interface, and display it accurately, pixel for pixel, with one of these projectors. Theoretically.


How are these signals likely to interact with the screens we have? For instance, I have a Sony VPL-PX30, a 2400-lumen unit originally designed for auditorium use. It claims to handle 720p and 1080i input signals, and it has a DFP port which, with a handy Molex adapter cable, can accept a signal from a DVI output. I drive it from a PC now and then with this type of signal, and it's GORGEOUS. But if I finally get one of these new DVI STBs, and I tune in some source material in 1080i format (assuming HDCP is not flagged for the content), what are they going to send down the DVI pipe to my projector? '1080i'? What do the manufacturers really mean by that?


I played with a Sensory Science HDT-100 recently (this is an ATSC-only OTA DVI tuner) that claims to receive '1080i' signals. Much to my dismay, its output (in this case a VGA RGB port), in '1080i' mode mind you, is an 800x600 pixel source image, which my VPL-PX30 then butchers into some form of 1024x768 image. Two nasty scaling effects compounded. Yuck. What's up with that? I imagine the old processor in the HDT-100 just wasn't up to 1920 x 1080, and nothing back when they made that tuner could display such a resolution anyway, so they didn't bother. Excuses, excuses. Will the DVI STB manufacturers do the same thing this year? After all, aside from a few very high-end CRTs that probably don't have DVI ports on them, nothing can display 1920 x 1080 today either. What signals are they really going to give us?


Do people think these new DVI-equipped STBs are going to output 1920x1080 pixel streams? If not, they are scaling down 'true' 1080i images to something less than that. What resolutions ARE they going to output?


More importantly, what implications does all of this have for the optimal resolution of one's digital display? Given the mix in the market of 480p (from DVD, using 3:2 pulldown), 480p and 480i from SD DTV, and 720p and 1080i from HD signals, all arriving in various aspect ratios, one has a huge number of potential pixel fields that need to be displayed. What 'native' resolution makes the most sense for us today, if we want to create an all-digital system?


It seems to me that many people ignore the complexity underlying this question and leave the problem up to the scaler to solve. But a scaler taking, say, a 1920 x 1080 image and trying to cram it into a 1024 x 768 pixel screen could do several possible things, depending on how smart its programmer was. It could do a 'clean' downsampling, giving you a 960 x 540 image in which each pixel represents exactly four original pixels (at the expense of losing some pixel real estate), which would be optimal. Or, quite disturbingly, it could choose to somehow come up with an image that is a full 1024 pixels wide, cramming in 64 new columns of pixels with who-knows-what video content.
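

To pin down what I mean by 'clean', here is a minimal sketch (assuming a decoded frame held as a NumPy array; the names are just for illustration): every output pixel is the average of exactly one 2x2 block of source pixels, with no invented lines.

Code:
import numpy as np

# Clean 2x downsample: 1920x1080 -> 960x540. Each output pixel
# averages exactly one 2x2 block of source pixels.
frame = np.random.rand(1080, 1920, 3)   # stand-in for one decoded RGB frame

h, w = frame.shape[0] // 2, frame.shape[1] // 2
clean = frame.reshape(h, 2, w, 2, 3).mean(axis=(1, 3))
assert clean.shape == (540, 960, 3)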


Has anyone worked out a map of all the various pixel fields one is likely to encounter, the 'clean' up- and down-sampling options available for each, and therefore the optimal screen resolution for a Holy Grail display, given the constraints of today's technology? Obviously one would want a display that shows all of those sources either in their native resolution or at least in one which is a clean multiple up or down from the original, so the image maintains its fidelity, but what resolution is that? Have the display manufacturers worked this out? I'm not sure they have.
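

As a rough sketch of what such a map might look like (the source and panel lists below are illustrative, not exhaustive), one could flag every pairing whose scaling ratio is an integer multiple up or an integer divisor down, the same in both dimensions:

Code:
from fractions import Fraction

sources = {"480p": (720, 480), "720p": (1280, 720), "1080i": (1920, 1080)}
panels = [(960, 540), (1024, 768), (1280, 720), (1366, 768), (1920, 1080)]

for name, (sw, sh) in sources.items():
    for pw, ph in panels:
        rw, rh = Fraction(pw, sw), Fraction(ph, sh)
        # 'Clean' means the same ratio on both axes, and that ratio is
        # either a whole number or one over a whole number.
        if rw == rh and 1 in (rw.numerator, rw.denominator):
            print(f"{name} -> {pw}x{ph}: clean scale by {rw}")

Notably, a 720x480 source matches none of those panels cleanly, which is exactly the sort of gap such a map would expose.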


Next, once one gets this 'optimal' projector, where should scaling occur (in the STB or in the projector) in order to get 'rational' scaling as in my first example above, and avoid ugly scaling? Can anyone tell me if 'smart' scaling technology (respecting digital sources and digital display constraints) has been incorporated into these new STBs or into digital displays?


It seems to me that we lovers of high resolution digital video are very, very close today to beautiful, native, high resolution video images that are both highly available and reasonably affordable, but there are a few stumbling blocks left that I'm a bit worried the manufacturers are going to trip over.


Thanks for your advice.
 

·
Registered
Joined
·
532 Posts
There should be no compression in such a utopia, but that'll never happen.
 

·
Registered
Joined
·
192 Posts
The textbook method for resampling a signal involves the use of Fourier transforms, or else time-domain impulse response filters. A commonly occurring example is the conversion of a 48 kHz audio source in a recording studio to the 44.1 kHz rate for CD mastering. One way to do that conversion is to perform a real-time fast Fourier transform on the input stream and then apply a so-called "brick wall" low-pass filter to the frequency spectrum to eliminate all frequency components above one half of the target sampling frequency, which in this case would be 22.05 kHz.


Given the remaining, truncated frequency components, one could perform Fourier re-synthesis of the signal, allowing interpolation in the time domain with sufficient resolution to allow re-sampling at any desired sampling rate. If this were all being done with analog circuitry, one would have simply generated an analog signal from the 48 kHz samples, passed that analog signal through a "brick-wall" low-pass filter, and then run it back through the final digitizer to give the resampled output. For audio, this process is nearly perfect.
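

To make that concrete, a minimal sketch of the FFT version (48 kHz in, 44.1 kHz out; a synthetic one-second signal stands in for real audio):

Code:
import numpy as np

src_rate, dst_rate = 48000, 44100
x = np.random.randn(src_rate)              # one second of 48 kHz 'audio'

# Transform, then brick-wall low-pass: zero every component above
# half the target sampling rate (22.05 kHz).
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1.0 / src_rate)
X[freqs > dst_rate / 2] = 0.0

# Fourier resynthesis at the new rate: keep the surviving bins and
# inverse-transform to one second's worth of 44.1 kHz samples.
y = np.fft.irfft(X[: dst_rate // 2 + 1], n=dst_rate) * (dst_rate / src_rate)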


What is more likely to be done is that one would perform this process once, in the digital domain, for an ideal, theoretical single pulse, and use the sample amplitudes of that output as a convolution mask to be applied to actual signals; basically creating a sort of pattern of pre-echoes and post-echoes that performs the sampling rate transition. That's convolution.
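

Sketched out, that mask is just a truncated, windowed sinc, the brick-wall filter's impulse response, and applying it is plain convolution (tap count and window choice here are illustrative):

Code:
import numpy as np

taps = 63
cutoff = (44100 / 2) / 48000               # 22.05 kHz, normalized to 48 kHz
n = np.arange(taps) - taps // 2
kernel = 2 * cutoff * np.sinc(2 * cutoff * n)   # ideal brick-wall response
kernel *= np.hamming(taps)                 # window to tame the pre/post-echoes
kernel /= kernel.sum()                     # unity gain at DC

x = np.random.randn(48000)
filtered = np.convolve(x, kernel, mode="same")
# Rate conversion then interpolates output samples from 'filtered'.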


Well, with MPEG-encoded video we don't actually have to perform Fourier transforms in the receiver to do all of this brick-wall filtering, and then inverse transforms to get our resampled video out. That's because the MPEG encoders use the discrete cosine transform in the encoding process as a part of the compression algorithm. Hence at the receiving side we are already being delivered frequency spectral data, and that means that half of the resampling work has already been done for us! All that remains is for the decoder to perform the re-synthesis of the pixel information in an intelligent way!
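

A sketch of that 'free' rescaling at the level of a single 8x8 block (using SciPy's orthonormal DCT routines as a stand-in; MPEG's actual coefficients arrive quantized and entropy-coded, but the principle is the same):

Code:
import numpy as np
from scipy.fft import dctn, idctn

block = np.random.rand(8, 8) * 255         # stand-in for one decoded 8x8 block

# The encoder already did the forward transform; keeping only the low
# 4x4 spatial frequencies is a brick-wall filter in the DCT domain.
coeffs = dctn(block, norm="ortho")
low = coeffs[:4, :4]

# Inverse-transform at the smaller size: a 2x downscale. The orthonormal
# convention grows the amplitude by sqrt(2) per halved axis, so
# compensate with a factor of 1/2.
half = idctn(low, norm="ortho") * 0.5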


The problem with convolutional algorithms is that they tend to produce undesirable ringing or fringing effects surrounding solid lines and edges. That's a theoretical, built-in part of the beast. So really a re-scaling decoder needs to make a reasonable effort to recognize and respect the overall structural aspects of the picture. It is probably impossible to do this in a perfect way, short of having an artist retouch the image frame by frame; but I would think that a decent decoder that is already doing other things like motion-adaptive interpolation should have an easy time recognizing edges and other structure, and thus be able to perform resynthesis in an intelligent way.


In other words, you get re-scaling thrown in free as a part of the MPEG algorithm. I'm not so sure what to say about the external resamplers that so many others seem to swear by. The complexity of actually doing this correctly is, I would suppose, similar to the complexity of a full-blown real-time MPEG encoder, and as I understand it, the price of external scalers is likewise up there with the price of real-time encoding hardware.
 

·
Registered
Joined
·
7 Posts
Discussion Starter · #4 ·
You've highlighted the problem: all this fancy scaling (in today's systems) takes place TWICE, first in the STB/tuner and second in the display, often with mucky analog signals and A/D converters getting in the way along the road.


As we move into the world of DVI (correct me if I'm wrong), the video signal being transferred out of the STB to the display is NOT MPEG. It is an uncompressed raw pixel field of video information at whatever resolution the STB chooses to output. This presents an opportunity.


If we could get the STB to output the right resolution for our digital display, the display wouldn't have to do any work, and there would be no extra layers of processing in the display to muck up the signal. Alternatively, we could have the STB send the display the 'native' source resolution as originally broadcast, without ANY modification in the STB, and ask the display to do the work of figuring out the best and clearest way to display it.
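

A hypothetical sketch of that policy, just to pin it down (the function and names are made up for illustration): scale exactly once, and never in the display if the STB already matches the panel.

Code:
def choose_output(source_res, panel_native):
    # If the source already matches the panel's pixel grid, pass it
    # through untouched and let the display do zero processing.
    if source_res == panel_native:
        return ("passthrough", source_res)
    # Otherwise rescale exactly once, in the STB, straight to the
    # panel's native grid -- never a second time in the display.
    return ("scale_once_in_stb", panel_native)

# e.g. a 1080i broadcast into a native 1280x720 panel:
print(choose_output((1920, 1080), (1280, 720)))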


Does anyone out there know:


1) Is there a DVI-capable HD DTV STB that outputs clean signals (e.g. 1080i in 16:9 is really a 1920x1080 pixel field), cutting out one rescaling step, and


2) If not, what resolutions are the DVI-capable HD DTV STBs actually capable of outputting, and how do they arrive at the particular scaled image they output?


3) Is there a digital display or projector that can take in 'true' 1080i full-pixel-field signals and does 'smart' rescaling, using whole-number downsampling multiples where possible to give clean images?


4) Can someone tell me where I can find the actual (by actual, I mean broadcast) pixel dimensions of the output signals in 480p, 720p and 1080i in each of 4:3 and 16:9 (and any other popular) aspect ratios?


Am I on the wrong track by pushing this line of thinking?


Thanks!
 

·
Registered
Joined
·
1,096 Posts
Hi All

DVI will only work better when there are directly pixel-driven displays such as DLP, LCD, etc. For CRT displays there is still a D/A conversion required to drive the CRT tubes.


The best current solution is to put the MPEG decoder in the display and use FireWire as the transport to send the compressed digital signal from the STB, DVD, etc. to the display. Then and only then should the digital signal be decoded and routed directly to the CRTs.


Using a FireWire network is the best solution for home theater in that it allows all devices to communicate directly with each other.
 

·
Registered
LG 55" C9 OLED, Yamaha RX-A660, Monoprice 5.1.2 Speakers, WMC HTPC, TiVo Bolt, X1
Joined
·
45,616 Posts
Quote:
Originally posted by rowe
Am I on the wrong track by pushing this line of thinking?
For now, I would say yes.


At a minimum, until digital displays are capable of much higher quality (full HDTV resolutions, the ability to render true black, no digital artifacts, etc.), it's of no value to anyone (save the MPAA) to have a fully digital signal path from source to display.


I will agree in theory that keeping the HDTV signal in the digital domain as much as possible is a good idea, but the current state of digital video processing is much less mature than high end analog signal processing.


For many years, analog ultra high bandwidth computer graphics video has been distributed, switched, routed, amplified, displayed, etc., with excellent results. The technology that does this is supported by an entire industry specializing in this discipline. And these signals are much more demanding than HDTV. For now, and a good while yet, if HDTV signals are handled properly it's very unlikely a consumer display would benefit from using DVI or other digital technology from stem to stern.


Not to mention DVI is the digital poster child of those who would have us lose our 'Fair Use' rights.
 

·
Registered
Joined
·
243 Posts
If Toshiba ships a true 1920 x 1080p RPTV this fall with DVI, will this in fact be the Holy Grail? Since most HDTV signals are 1080i, can someone explain what, if any, scaling is required? What about 480i and 720p signals? I understand the film industry is moving toward 1080p/24. How would these new displays like the Toshiba 57HLX82 handle that signal?
 

·
Registered
Joined
·
3,804 Posts
The CCDs used in TV cameras are analog devices, so the chain is broken right at the start. The only images that reach your screen that didn't start out as analog are computer graphics.
 

·
Registered
Joined
·
7 Posts
Discussion Starter · #9 ·
Has anyone out there got an all-digital system actually working, with re-scaling taking place at only one spot in the chain, and no analog anywhere (except at the original point of capture)?


Has anyone got one of these DVI STBs yet?


With all respect to those downplaying the power of an all-digital system, I recall someone on this board having seen a system at CES this year using a Korean prototype DVI STB from a place called MIT driving a native 1920 x 1080 FP display, and they said viewing it was like "looking through a window".
 

·
Registered
LG 55" C9 OLED, Yamaha RX-A660, Monoprice 5.1.2 Speakers, WMC HTPC, TiVo Bolt, X1
Joined
·
45,616 Posts
Quote:
Originally posted by rowe


Has anyone got one of these DVI STBs yet?
No.


The 'looking thru a window' thing is a subjective experience; many members here use the same term now when watching HDNet or PBS.


Having said that, I've seen the difference between a DVI feed and an analog feed to a 3-chip professional DLP projector, and it does make a difference. I'm not downplaying an all-digital video experience, but until consumer displays are up to the task, it's a moot point. Even when the display and digital interface are ready, it will take time to make the digital signal path transparent enough for the 'Holy Grail' video experience. The biggest barrier will be the cost of the processing itself; it's not cheap now, and I think we're looking at a few years yet before the technology will all come together.


The same can be accomplished using an IEEE1394 interface, as opposed to DVI.
 

·
Registered
Joined
·
306 Posts
The digital data from satellite or an OTA signal is in MPEG-2 format.



Here is what I think we would like to see:


If the display device accepts digital input, the original MPEG-2 data should be forwarded without alteration. The display device should decode the MPEG-2 data and do any conversion, scaling, and other video processing.


If the display device is analog, then the STB will need to have an MPEG-2 decoder and NTSC video generator (or PAL in other parts of the world).
 

·
Registered
Joined
·
7 Posts
Discussion Starter · #13 ·
Thank you, Ken H, for putting this issue in some perspective. I agree with you that a number of challenges remain.


Let's hope that, as settopguy suggests, STB and display manufacturers will learn how to pass digital signals along the path without redundant scaling and encoding/decoding steps, allowing us to get high-fidelity video that has the price/performance of digital audio playback (i.e., cheap and good).
 

·
Registered
Joined
·
32,172 Posts
I totally disagree on the value of putting the MPEG decoder in the display. In fact, I think it's kind of nutty. Video processing electronics are really best left to devices that can be upgraded/changed out/replaced/specialized, and displays are best left to display makers. I am not going to put $10K into a wall-mounted flat panel or a projector and then have to live with its as-good-as-they-want electronics. Scalers/HTPCs/next-gen DVD players/etc. can all be built to deliver native-rate images for standard displays. Those displays can then be brain dead. Much, much more logical.


Mark
 

·
Premium Member
Joined
·
11,868 Posts
Quote:
Originally posted by rogo
I totally disagree on the value of putting the MPEG decoder in the display. In fact, I think it's kind of nutty. Video processing electronics are really best left to devices that can be upgraded/changed out/replaced/specialized, and displays are best left to display makers. I am not going to put $10K into a wall-mounted flat panel or a projector and then have to live with its as-good-as-they-want electronics. Scalers/HTPCs/next-gen DVD players/etc. can all be built to deliver native-rate images for standard displays. Those displays can then be brain dead. Much, much more logical.


Mark
Mark,

the main problem with your above thinking is that an MPEG decoder is usually the most expensive part of most digital set-top boxes and peripherals. Having to pay for multiple devices with MPEG decoders incorporated can be very costly, unnecessarily and significantly raising the price of putting together a home theater system. There is no good reason to have multiple MPEG decoders. You the consumer will end up paying more if DVI prevails, because DVI demands multiple MPEG decoders in digital set-top boxes and peripherals.
 

·
Registered
Joined
·
32,172 Posts
And, again, Kipp, I respectfully disagree. MPEG encoders are very expensive. MPEG decoders are a dime a dozen. They are so cheap that you can find them working beautifully in $99 DVD players. They are so cheap that they are going to appear in handheld video players. They are so cheap that the cheapest set-tops from Dish and DirecTV can be had for $49 (I realize there is a bit of subsidy, but I'm talking add-on boxes with little subsidization) with MPEG decoders.


Digital set tops are overpriced for a lot of reasons, most especially lack of volume. When they are in volume and competitive, I'm sure we'll see lots of cheaper boxes.


Anyhoo, I would rather invest a few hundred dollars in MPEG decoders in my set-tops, just like I'm already doing in my 2 TiVos, 1 Replay, 2 other DirecTV boxes (including an HD one), PS2, et al. I would rather not have a display that needs to handle all the video processing and be stuck with "well, that's the best your damn set is ever going to look." Let's get the sets pixel-perfect images, let the plasma/DLP/CRT/LCD folks work solely on rendering a pixel with perfect color in the right spot, and leave the rest to folks who know much more about scaling/deinterlacing like Leeza/Faroudja/ATI/the dScaler team/et al. I shudder to imagine how bad TV would be if we all had to live with Sony's idea of scaling and deinterlacing because the display only took the MPEG feed and had to handle it from there.


Mark
 

·
Premium Member
Joined
·
11,868 Posts
Mark,

point taken. I can understand your position, and for some that may work. I am coming from the mainstream's point of view. When HD goes mainstream, many will not want to fuss with all of the extra devices. A good example is the DVD home-theater-in-a-box. These setups have been a big hit. Many consumers just want to be done with the whole setup and get to watching their movies. They do not want to worry about all of the little details that many of us here, including myself, pay attention to.


Going the route of, say, 1394, with the simplicity of utilizing a daisy chain for connectivity, will be very attractive to many consumers. Has anyone here had a friend or relative look behind our sets and say, "Oh my God, how do you figure out what goes where?" This can be very discouraging to the average Joe. Obviously for most of us this is a no-brainer.


DVI, with its length limitations and its requirement of a separate cable for each individual device, is a major drawback. Again, the additional cost of the MPEG decoders will make the choice for the majority of mainstream consumers via simple economics: what is cheaper is what they will buy. That particular policy is not the route I or many here go by, but that is how the uninformed consumer makes purchasing decisions.
 

·
Registered
Joined
·
9,884 Posts
I am not saying DVI is the answer, and certainly not the copy-protected variety, but it is important to me to be able to drive my display from a computer. And it seems important also to be able to drive it from STBs that use MPEG-4 or other newer non-MPEG-2 technologies that may offer better compression or new features.


Otherwise how will Blu-Ray, FMD, or Corona DVDs be displayed? Maybe Blu-Ray will use standard MPEG-2, but I'd hate to lock that door already on a ten-thousand-dollar display.


Giving the MPEG-LA folks a monopoly on the digital display standard seems generally a bad idea. So I would certainly like some digital standard that can just send pixels.


- Tom
 

·
Registered
Joined
·
32,172 Posts
Kipp: I'd love FireWire cables as my only cables back there! I'd love it.


You make great points, but I think you neglect to accept that just about everyone is boarding the DVI train. Few are boarding the FireWire train. I think FireWire does eventually give us the interface we need for digital recorders -- and I hope it catches on elsewhere -- but I struggle to imagine how the display interface is *not* going to be DVI given what's already happening with the manufacturers (Sony, Toshiba, JVC, Samsung, Zenith to name a few).


Mark
 

·
Premium Member
Joined
·
11,868 Posts
Mark,

when you say everyone is boarding the DVI train, I am assuming that you are talking about manufacturers...correct me if I am wrong. Mitsubishi, who has over 50% of the market, is completely against DVI, as are the majority of people here.


I understand your position on the DVI standard, but I am holding out hope that DVI/HDCP will be rendered pointless by Hollywood being found to have too much control over what we can and cannot watch. DVI as a connection itself is not bad, even with its limitations. I really feel things will go 1394 with 5C, and DVI/HDMI or whatever it will be called next week will die a slow death. Thanks for the intriguing discussion.:)
 