
4k may be sooner than you think, at least DirecTV thinks so.

14K views 92 replies 30 participants last post by  sneals2000 
#1 ·
From PC Mag.com

Quote:
DirecTV Preps for Ultra HDTV Signals


By Mark Hachman


DirecTV has begun the groundwork for a rollout of ultra HDTV (UHDTV), according to a report.


Advanced Television.com said that Philip Goswitz, senior vice president of space and communications and technology development for DirecTV, is preparing for 4,000- and 8,000-line services, although Goswitz did not say when.


Reports have indicated, however, that the format could be ready by as early as 2016, with 2018 to 2020 seen as a more likely timeframe. In any event, DirecTV is looking toward the future.


"4,000- and 8,000-line services are great for the satellite industry, and will ensure that satellite broadcasting continues to distinguish itself for image quality of service," Goswitz reportedly said, according to Advanced Television.com. "We see this as a key strategic advantage for us."


DirecTV representatives could not be reached Friday for comment.


Current 1080p signals use 1,080 horizontal lines of resolution. The so-called "4K" format used by the digital cinema industry, by contrast, is named for its horizontal pixel count: digital cinema resolution, for example, is commonly 4096-by-1714 pixels.


It's possible that Goswitz referred to so-called QFHD, which basically doubles the 1080p HDTV standard in the vertical and horizontal dimensions: 3,840-by-2,160. But those still don't come close to 4,000 lines of vertical resolution.


True ultra HD or UHDTV, however, has been proposed by NHK, also known as Super Hi-Vision. At 7,680-by-4,320, or 4320p HDTV, the resolution far exceeds conventional HDTVs and digital cinema. Uncompressed, the video would require massive bandwidth and storage space; in 2006, however, NHK demonstrated a compressed version, using an NHK codec that compressed the video signal from approximately 24 Gbit/s down to 180-600 Mbit/s and the audio from 28 Mbit/s to 7-28 Mbit/s. Prototype TVs from LG (pictured) have also been shown.
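Those bandwidth figures are easy to sanity-check. The sketch below is a back-of-envelope calculation, assuming 60 fps and 8-bit 4:2:0 sampling (my assumptions, chosen because they reproduce the ~24 Gbit/s figure; NHK's actual sampling parameters may have differed):

```python
# Back-of-envelope check of the Super Hi-Vision (7680x4320) bandwidth
# figures quoted above. 60 fps and 8-bit 4:2:0 sampling are assumptions.

WIDTH, HEIGHT = 7680, 4320
FPS = 60
BITS_PER_PIXEL = 12          # 8-bit 4:2:0: 8 luma + 4 chroma bits per pixel

uncompressed_bps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL
print(f"uncompressed: {uncompressed_bps / 1e9:.1f} Gbit/s")   # ~23.9 Gbit/s

# Compression ratios implied by the 180-600 Mbit/s demo figures:
for mbps in (180, 600):
    ratio = uncompressed_bps / (mbps * 1e6)
    print(f"at {mbps} Mbit/s -> {ratio:.0f}:1 compression")
```

At the demoed rates, that works out to roughly 40:1 to 130:1 compression of the video signal.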


To enable the transition, DirecTV sees itself migrating from Ku-band to Ka-band satellites. Ka-band satellites offer higher bandwidth than Ku-band satellites, but reportedly are more susceptible to so-called "rain fade," or a degradation of performance during rainy conditions.


"At DirecTV we see a couple of things happening," Goswitz said. "First, our subscribers are migrating away from Ku-band, and upgrading themselves to Ka-band and its HDTV services. In four or five years, our Ku-band [transmissions] could end. We are also developing the so-called Reverse Band for DBS services, and these are on our Road Map for future international services. 4000-line is exciting to us because of its image quality, and the potential for glasses-free 3D."
http://www.pcmag.com/article2/0,2817,2401711,00.asp
 
#27 ·
The QAM system will die long before 4k HDTV channels are rolled out. IPTV is obviously the future; offering vast flexibility as long as it's not constrained to horrible infrastructure (like AT&T Crapverse trying to push HDTV over centuries-old telephone lines... ridiculous). I believe IPTV has great potential over true fiber optics like FiOS and the cable companies' lines.


Now back to 4k... quite frankly, I don't see what the big deal about 4k is anyway. Talk about jumping the gun. It's only four times the resolution of 1080p. Do we really need this temporary distraction to further segment the market between SD, HD, and the upcoming UHDTV?


I'd rather skip the baby steps and just wait for Ultra High Definition Television to become feasible. Now here's a technology which will truly approach the limits of human vision. 8k is 16 times the resolution of 1080p; 4k is only 4 times. 4k is like the leap to 720p from 480p; 8k will be more like the leap from 480p to 1080p.
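The multiples quoted above are straightforward pixel counts, using the consumer QFHD/UHDTV frame sizes (1920x1080 doubled and quadrupled in each dimension):

```python
# Pixel-count multiples behind the "4x" and "16x" claims above.
def pixels(w, h):
    return w * h

hd    = pixels(1920, 1080)   # 1080p
uhd4k = pixels(3840, 2160)   # QFHD / consumer "4k"
uhd8k = pixels(7680, 4320)   # UHDTV / "8k"

print(uhd4k // hd)   # 4
print(uhd8k // hd)   # 16
```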


I believe UHDTV will be the final resolution leap before we move onto a different form of video, one without a set resolution... holographics, perhaps. Something similar to the holodeck in Star Trek.
 
#28 ·

Quote:
Originally Posted by scorpiontail60 /forum/post/21793324


I believe UHDTV will be the final resolution leap before we move onto a different form of video, one without a set resolution... holographics, perhaps. Something similar to the holodeck in Star Trek.

Just called DirecTV about when I would be able to get holographic programming. Good news, the answer is "soon."
 
#29 ·
In corporate driven America ("Corprica") it is all about convincing you that what you have is obsolete and you MUST have the latest thing. Look at all the people who already had iPads line up for the latest one even though the one they had works fine. And of course many home theater owners are more about bragging rights than anything else.
 
#30 ·

Quote:
Originally Posted by Ken H /forum/post/21788899


Broadcasters can't deliver Blu-ray quality with the system they currently use, you know this.

Then the question should be, "Why would a provider discuss delivering higher resolution when they can't deliver the existing resolution well enough?" I'd expect them to do an even worse job.


It's like when AT&T was talking about video phones connected to fiber optic networks back in the 70's when most people didn't yet have touch tone phones.

 
#31 ·
not sure i get all the D* hate in this thread
 
#34 ·
4k sets are simply not needed unless a person has a 100 inch screen sitting 6 feet away. Besides, providers can't even deliver the bandwidth to display the channels at 1080i correctly without compression artifacts, thanks to bandwidth wasted on analog channels, outdated codecs, and the like.
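That screen-size/distance claim lines up with the usual back-of-envelope acuity math. The sketch below assumes the common ~1 arcminute figure for normal (20/20) vision; real-world acuity varies:

```python
import math

# Rough check of the "100-inch screen at 6 feet" claim, assuming
# ~1 arcminute visual acuity (an assumption, not a hard limit).
def max_useful_distance_in(diagonal_in, horiz_px, aspect=16/9):
    """Farthest distance (inches) at which one pixel still subtends 1 arcmin."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pitch_in = width_in / horiz_px
    return pitch_in / math.tan(math.radians(1 / 60))

d = max_useful_distance_in(100, 3840)
print(f"{d:.0f} in (~{d/12:.1f} ft)")   # ~78 in, about 6.5 ft
```

Beyond roughly six and a half feet from a 100-inch 4k screen, individual pixels fall below the 1-arcminute threshold, which is essentially the poster's point.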


As to 1080p, there's no practical benefit over 1080i except perhaps in live sports, where the feed is broadcast at 60fps.


It's good to see Mediacom ditching analog. I didn't realize analog took up so much bandwidth. I just wish they'd send the signals unencrypted so we didn't need boxes. I'm not going to order premium channels, so I don't need the boxes. Let the TV do the converting.
 
#35 ·
The benefit of 1080p is having more than twice the resolution on motion (twice just from not throwing away half the fields, then more because there would be no need to apply anti-twitter filtering beforehand). It would also not have artifacts introduced by deinterlacers, whether due to poor design or just the inherent inability to perfectly reverse the folding of space and time that is interlacing. The real problem is that there is no way to monetize 1080p from a provider perspective when Joe Six Pack doesn't know the difference between i and p, and thinks he's already getting 1080p because he has a 1080p TV. 4k gives them a bigger number to sell, even though no one sits close enough to really tell the difference.
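For readers unfamiliar with what a deinterlacer actually does, here is a toy sketch of the two classic strategies, weave (full static detail, but combing on motion) and bob (no combing, but half the vertical detail), on a 4-line "frame". This is an illustration only, not how any particular TV implements it:

```python
# Toy sketch of the two classic deinterlacing strategies.

def weave(top_field, bottom_field):
    """Interleave odd/even lines from two fields into one full frame."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)
        frame.append(b)
    return frame

def bob(field):
    """Line-double a single field into a full-height frame."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)   # repeat the line (a real bob interpolates)
    return frame

top    = ["A0", "A2"]   # lines 0, 2 captured at time t
bottom = ["B1", "B3"]   # lines 1, 3 captured at time t + 1/60 s

print(weave(top, bottom))   # ['A0', 'B1', 'A2', 'B3']
print(bob(top))             # ['A0', 'A0', 'A2', 'A2']
```

If the two fields show motion, weaving them produces the "combing" artifact; bob avoids it at the cost of vertical resolution, which is exactly the trade-off the post describes.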
 
#37 ·

Quote:
Originally Posted by Ken H /forum/post/21788899


Broadcasters can't deliver Blu-ray quality with the system they currently use, you know this.

Locally, OTA is rather painful too. Last night I viewed TiVo on my 120 inch screen (1080p projector) and the content mostly looked terrible. Take the Late Show with David Letterman, for example: the studio image was a complete mess, as if someone had turned the sharpness all the way up to the point of creating SD. When they showed video shot outside the studio, it often looked fine.


So it's more than the resolution; the production value has to be there to even take advantage of what's available. When I noticed edge enhancement on our local news I wrote them, and they responded that they have it turned down to one of the lowest settings, at or below the recommendation.
 
#38 ·
The Late Show With David Letterman uses awful cameras. Letterman, along with Kimmel, must be using old hand-me-downs. The worst, though, is Chelsea Lately, which is an obvious 16:9 upscale. E! has been passing that one off as HD for years now. Yeah, right.


If you want an example of amazing quality look at a proper late night show - Late Late Show With Craig Ferguson. Leno, Fallon, and Conan all have better cameras too.
 
#39 ·
Leno has absolutely zero skin texture; everyone looks like a plastic doll. If I remember correctly they use a special lens/filter/whatnot to achieve that. It's roughly the polar opposite of Dave's show.
 
#40 ·

Quote:
Originally Posted by coyoteaz /forum/post/21811364


The real problem is that there is no way to monetize 1080p from a provider perspective when Joe Six Pack doesn't know the difference between i and p, and think he's already getting 1080p because he has a 1080p TV. 4k gives them a bigger number to sell, even though no one sits close enough to really tell the difference.

Sadly, this is true.
 
#41 ·

Quote:
Originally Posted by coyoteaz /forum/post/21811364


The benefit of 1080p is having more than twice the resolution on motion (twice just because of not throwing away half the fields, then more because there would be no need to apply anti-twitter filtering beforehand).

The problem is that there is only a small range of slow motion that doesn't end up as motion blur in video, so the resolution to display it goes to waste. This is limited mostly to slow camera motion, such as very slow pans.


Also, if you've ever seen how motion is displayed in even full bit rate OTA HDTV by stepping through it frame by frame, you'll see that interlacing artifacts are the least objectionable artifacts generated by motion compression. Most motion is a pixelized mess, but fortunately our vision isn't good enough to detect it.

Quote:
It would also not have artifacts introduced by deinterlacers, whether due to poor design or just the inherent inability to perfectly reverse the folding of space and time that is interlacing.

The deinterlacer in an average HDTV these days is incredible. Run the deinterlacer "torture tests" that you can find on the Internet. These are designed to expose every flaw in every deinterlacing method invented. My HDTV passed all of them except for one which caused flashing boxes.


You can try all these things to see what I'm talking about. It's not theoretical at all.
 
#42 ·

Quote:
Originally Posted by stockwiz /forum/post/21811112


4k sets are simply not needed unless a person has a 100 inch screen sitting 6 feet away...

If you had told me twenty years ago I'd be watching a 46 inch screen from six feet away, I would have thought you were insane.


Don't underestimate our vision. Photographers require five to seven times the resolution of HDTV to make smaller acceptable prints. Have you ever seen a "tiny" 24x36 print from six feet away? The detail makes an HDTV look like a bunch of blurry flickering light bulbs.
 
#43 ·
Isn't the new iPad's screen resolution double the older model's in each dimension? With only a 9.7 inch display, all you hear about is how much better images look. I realize you are fairly close to the display, but holding it at two times the image's width (a little more than where I sit from the 120 inch projector's image), I'm guessing you could easily tell the difference between it and the old model, and more resolution wouldn't be wasted in a lot of conditions.
 
#45 ·

Quote:
Originally Posted by NetworkTV /forum/post/21790729


Further, most TVs wouldn't even have the ability to display a 1080p60 signal. They would either convert it to 1080p30 or show you a blank screen.

Every HDTV I have owned has been perfectly capable of accepting a 1080p60 or 1080p50 input and displaying it at 60Hz or 50Hz. My first HD set didn't accept 1080p24, and would only accept 1080p50 or 1080p60 via HDMI (component was limited to 1080i and below), but my more recent ones accept 1080p via component AND 1080p sources with a 24Hz refresh rate.


My HTPC outputs 1080p50 whether I'm viewing 576i25, 720p50 or 1080i25 (aka 1080/50i) broadcast TV, either scaling (720p), de-interlacing (1080i), or both (576i) to generate a 1080p50 source to feed my TV via HDMI.


All Freeview HD set-top boxes sold for the UK OTA HD standard are mandated to support 1080p50 output, as this has benefits for IPTV (which they are also mandated to support, and which is often 720p or 1080p and could suffer if interlaced and de-interlaced en route to the display when carried as 1080i), receiver-rendered text services, etc. Sure, the UK OTA standard doesn't include 1080p50 broadcasts, but the STBs are able to de-interlace 1080i25 to 1080p50. (De-interlacing to 1080p25 would halve the temporal resolution of a 1080i25 broadcast, as it has 50 fields - potentially captured at 50 different points in time - every second.)


Not sure where you get the idea that 1080p50 or 1080p60 are not widely supported. Many people running HTPCs feeding HDMI into their HDTVs are running 1080p at 24/50/60Hz progressive (NOT 25/30Hz or interlaced)


My PS3 has always output 1080p50 or 1080p60 via HDMI. My XBox 360 (first gen) happily outputs 1080p50 or 1080p60 via Component (it doesn't have HDMI)


These are NOT frame doubled sources - they have full 50/60Hz motion.
 
#46 ·

Quote:
Originally Posted by Ken H /forum/post/21788657


From Advanced Television.com


http://advanced-television.com/index...ng-for-u-hdtv/

Me thinks he is confusing horizontal and vertical lines...


I suspect when he talks about 4k he means 3840x2160 (as that is nearly 4k horizontally) and when he talks about 8k he means 7680x4320 (as that is nearly 8k horizontally)


7680x4320 is also known as SuperHiVision or UltraHDTV and is being standardised by NHK in Japan, with input from the BBC in the UK and RAI in Italy, amongst others. (The current plan is for 120fps progressive, I believe, though originally 60fps progressive was planned.)


There will be some 7680x4320 broadcasts from the UK Summer Olympics this year - to screens in viewing areas in a couple of UK locations, and also in Japan, I believe.


I saw some early 7680x4320/60p demonstrations a couple of years ago - and it was breathtaking. However the size of screen required to truly appreciate that resolution is significant.


However I suspect 3840x2160 could be more sensible for mainstream viewing.


RF modulation has come quite a way since DVB-T and ATSC 8VSB were standardised in the mid-to-late 90s.


The UK went with DVB-T2 to go HD - switching from an 18Mbit/s DVB-T mux to a 40Mbit/s DVB-T2 mux in the same RF bandwidth. (Though that mux could have run at 24Mbit/s with less robustness using DVB-T.)
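Those mux figures imply the following spectral efficiencies, assuming the standard 8 MHz UK UHF channel width:

```python
# Payload spectral efficiency implied by the mux bitrates above,
# assuming an 8 MHz UK UHF channel.
def bits_per_hz(mbps, channel_hz=8e6):
    """Payload bit/s per Hz of RF channel bandwidth."""
    return mbps * 1e6 / channel_hz

for name, mbps in [("DVB-T (18 Mbit/s UK mux)", 18),
                   ("DVB-T (less robust)", 24),
                   ("DVB-T2", 40)]:
    print(f"{name}: {bits_per_hz(mbps):.2f} bit/s/Hz")
```

Going from 2.25 to 5 bit/s/Hz in the same channel is what made OTA HD viable in the UK without extra spectrum.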
 
#47 ·

Quote:
Originally Posted by John Mason /forum/post/21816542


A recent NY Times piece mentions the new iPad has a 2,048-by-1,536-pixel 'retina' display, which AIUI means max useful rez at typical iPad viewing distance. Haven't seen older model rez.

Today at Fry's I compared the old and new models... I found the difference similar to that between 720p and 1080i/p: both look great, but 1080 just looks smoother, and to some degree 720 appears sharper, as in the edges are more defined. As I can clearly see pixel alignment issues from my viewing distance with both the set in my den and my projector, I'm sure I would benefit from higher resolutions.
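The pixel densities of the two panels being compared are easy to work out from the published resolutions (both are 9.7-inch 4:3 displays):

```python
import math

# Pixel density of the two iPad panels: 2048x1536 "retina" vs 1024x768.
def ppi(w, h, diagonal_in):
    """Pixels per inch along the diagonal."""
    return math.hypot(w, h) / diagonal_in

print(f"new iPad: {ppi(2048, 1536, 9.7):.0f} ppi")   # 264 ppi
print(f"old iPad: {ppi(1024, 768, 9.7):.0f} ppi")    # 132 ppi
```

Doubling the resolution in each dimension doubles the linear pixel density, which is why the difference is so visible at arm's length.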
 
#49 ·
I think 4k is coming sooner too. 4K acquisition and post is common now, so there is already past and future material. Display manufacturers are ready to produce consumer models. Improved compression such as H.265 is on the horizon, and I think we'll start seeing hardware acceleration support. Broadcasting looks doubtful any time soon, but IPTV, cable and satellite may offer some premium material; VOD would seem the best candidate. Even YouTube has announced 4K support. Inexpensive 4K cameras will start appearing, and there is already one from JVC at around $5k.


HD had been around for roughly two decades before it finally came to US consumers in late 1998. During most of that time HD was considered an interesting experiment without much of a future, and despite its promise of much higher quality images (especially in the '80s, compared to what was typically used for SD) most predicted its demise. Thirteen years later, HD production is the norm and 4k is common at the higher end, such as features. While it's true that we still have issues just trying to get good HD to the home, technology does not progress in stair steps but as a gradation. It may not be that far in the future when we see 4K displays and even consumer camcorders at Best Buy - assuming, that is, that Best Buy is still around.


As for the interlace question, there is a misconception that there are merely two 540-line fields, which would create half the resolution of progressive. Another is that interlace conveys motion at half the frame rate. In fact, motion in interlaced acquisition is captured at twice the frame rate, with alternating odd lines (or maybe they should be called rows now) and even lines "leapfrogging", with shutter time at half the frame duration. For 30 fps, each field is normally captured at 1/60th second (slightly less due to CCD readout), which is the same as each frame in 60p systems. It's the fact that each field is captured at a different moment in time that makes de-interlacing an issue.

The idea is that static scenes have full vertical resolution while areas with motion drop to half. There is normally a loss of resolution on motion due to motion blur anyway, so interlace is not the sole contributor to resolution loss on motion. Even though static areas make full use of the vertical lines, there is some loss in vertical resolution because the vertical size of the imager elements is essentially doubled to suppress aliasing and jagged diagonals on motion. The reason is that on areas with motion each field does not have the benefit of the other field's information and there would be a gap between the lines, so the lines of each field need to effectively touch each other top and bottom. This creates a ~200% fill area vertically, which causes faster rolloff of higher vertical frequencies. The larger vertical size does have the advantage of making a live camera around twice as sensitive to light (1 stop) compared to progressive mode.

Better de-interlacers can perform reverse pulldown on 24p, 25p and 30p material, so the system becomes pseudo-progressive. Interlaced CRT displays showed each field offset in time, so vertical frequencies higher than 1/2 the vertical resolution could flicker; converting to progressive for newer displays has largely resolved that problem. Newer codecs encode progressive nearly as efficiently as interlaced, so the bandwidth advantage of interlace will be largely if not completely lost.
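The reverse pulldown mentioned above undoes the 3:2 cadence used to fit 24p film into 60i. A toy sketch (simplified: a real inverse-telecine matches actual image fields rather than labelled frames):

```python
# Toy sketch of 3:2 pulldown and its inverse.

def telecine_32(frames):
    """Spread 24p frames across 60i fields in the repeating 3:2 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 3 if i % 2 == 0 else 2   # alternate 3 fields, then 2
        fields.extend([frame] * copies)
    return fields

def inverse_telecine(fields):
    """Recover the unique film frames from a 3:2 field sequence."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:   # drop repeated fields
            frames.append(f)
    return frames

film = ["F1", "F2", "F3", "F4"]          # 4 film frames at 24 fps ...
fields = telecine_32(film)               # ... become 10 fields (24 -> 60)
print(len(fields))                       # 10
print(inverse_telecine(fields) == film)  # True
```

Every 4 film frames map to 10 fields, which is exactly the 24-to-60 ratio; a deinterlacer that detects the cadence can reassemble the original progressive frames losslessly.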


As a side note, if Sony's Crystal LED display makes it to market any time soon, is not outrageously expensive, and doesn't have lifetime issues, then I think it's over for LCD and plasma. I think it's pretty much over for plasma as it is. I don't think I have ever seen a display as good, though I was just viewing demo material. 4k on that would just be stunning, and could kill theaters.
 
#50 ·

Quote:
Originally Posted by TVOD /forum/post/21819363


I think 4k is coming sooner too. 4K acquisition and post is common now so there is already past and future material. [...]


As for the interlace question, there is the misconception that there are merely two 540 line fields which creates half the resolution of progressive. [...] there is some loss in vertical resolution because the vertical size of the imager is essentially doubled to suppress aliasing and jagged diagonals on motion. [...]

Excellent post - particularly on the widespread misconceptions about interlace.
 
#51 ·

Quote:
Originally Posted by sneals2000 /forum/post/21820838


Excellent post - particularly on the widespread misconceptions about interlace.

Except that I said the imager is doubled in vertical size. It's the imager element (individual pickup) that's effectively doubled in vertical size. In the words of a great wise man ... ooops.


I don't think we'll see 4k interlaced. But I do think the 0.1% pulldown (59.94 vs 60, 29.97 vs 30, 23.98 vs 24) will survive on 60Hz systems. The creators of NTSC color were too smart (and cautious) for their own good, worrying about potential interference between the color and sound carriers.
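For reference, the exact rates behind that 0.1% offset all come from the NTSC 1000/1001 factor:

```python
from fractions import Fraction

# The "0.1% pulldown": NTSC color scaled the nominal 60/30/24 Hz rates
# by exactly 1000/1001 to avoid color/sound carrier interference.
NTSC = Fraction(1000, 1001)

for nominal in (60, 30, 24):
    exact = nominal * NTSC
    print(f"{nominal} Hz nominal -> {float(exact):.5f} Hz ({exact})")
```

Hence the familiar 59.94, 29.97 and 23.976 figures; they are exact rationals, not rounded decimals.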
 