Why HDMI is pointless! - AVS Forum
Old 09-01-2010, 03:01 PM - Thread Starter
Member
 
blakehew
 
Join Date: Dec 2001
Location: Utah
Posts: 50
I got thinking about this the other day, and I came to the conclusion that HDMI, DVI, and all the other uncompressed consumer video interfaces are really pointless.

Let's be honest: there is no such thing as uncompressed video for the end consumer. Virtually all video content a consumer watches comes from a compressed source. DVD, Blu-ray, cable/satellite TV, and even digital home videos and pictures are all compressed. So why do we need cables that carry uncompressed video? The much higher bandwidth it requires makes the cables more expensive and imposes strict length limits.
Why not just do the decoding in the display? Instead, we decode on the player, send the uncompressed video over a cable with enough bandwidth to carry it, and then display it. Sending the compressed video directly to the display would save us from having to build complex, expensive cables.
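To put rough numbers on the bandwidth gap in the argument above (a back-of-the-envelope sketch; the ~40 Mbit/s Blu-ray ceiling is a typical published figure, not something from this thread):

```python
# Back-of-the-envelope: uncompressed 1080p60 vs. a typical Blu-ray stream.
width, height = 1920, 1080
fps = 60
bits_per_pixel = 24  # 8-bit RGB (4:4:4)

uncompressed_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed 1080p60: {uncompressed_bps / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s

bluray_bps = 40e6  # Blu-ray video maxes out around 40 Mbit/s
print(f"Compression ratio: {uncompressed_bps / bluray_bps:.0f}:1")  # ~75:1
```

So the cable is being asked to carry roughly 75 times the bandwidth of the data actually stored on the disc, which is exactly why uncompressed links need high-bandwidth cabling and short runs.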

While writing this post, I did find one small flaw in my argument. There are a few types of sources that are natively uncompressed: ones where the content is generated dynamically by the system itself. Computers and game consoles come to mind.

The point is that it seems much more logical to me to send the source as-is to the end device. Stop sending decoded data to the display when nearly all of that data is compressed to start with.

Life is like a roll of toilet paper, Long and useful.
Old 09-01-2010, 03:03 PM
Member
 
Sr20kidD
 
Join Date: Aug 2005
Posts: 55
It's so we as the consumers can feel appreciated and have some bragging rights!
Old 09-01-2010, 03:55 PM
AVS Special Member
 
Naylia
 
Join Date: Feb 2005
Location: San Jose, CA
Posts: 1,818
Requiring the display to do the decompression would obsolete your display every couple of years. As compression algorithms evolve, you wouldn't be able to hook new equipment up to old TVs.

To support a new format, though, you always need to buy a new playback device anyway, which is exactly why that device is the one that decompresses the video.
Old 09-01-2010, 04:18 PM
AVS Special Member
 
Colm
 
Join Date: Aug 2002
Posts: 4,652
You are thinking way too hard about HDMI. HDMI is an offshoot of DVI, a monitor interface. The plan from day one was to allow the display device to be as dumb as possible. It just had to display what was fed to it, a raster line at a time. It is what it is.
Old 09-01-2010, 10:29 PM
 
ChrisWiggles
 
Join Date: Nov 2002
Location: Seattle
Posts: 20,730
Because then you'd need a display that could decode every conceivable video and compression format you could ever throw at it. That would be absurdly expensive, or would basically require building in a robust computer with software that could be updated constantly.

HDMI isn't stupid for the reasons you point out at all. HDMI is stupid because there is a professional video interface, namely SDI, that accomplishes everything HDMI does (and more) over coax, which is cheap, readily available, easily terminated, very rugged, and extraordinarily reliable over very long distances and through video routers for very robust distribution.
Old 09-02-2010, 12:49 AM
Advanced Member
 
btmoore
 
Join Date: Jan 2001
Location: Oakland, CA
Posts: 836
On the uncompressed front there's HD-SDI: one standard coax cable, one standard 75-ohm BNC connector, and standard HD-SDI switches, DAs, and matrix switches, versus all the oddball HDMI crap. Even better, the 3G-SDI interface consists of a SINGLE 2.97 Gbit/s serial link, is standardized in SMPTE 424M, and supports 4:4:4 at 2K resolution on one 75-ohm coax cable with a BNC connector. All of this is standard broadcast industry equipment; the HD you are watching very likely got shipped around over HD-SDI at some point.
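A quick sanity check on where that 2.97 Gbit/s figure comes from (a sketch; 2200×1125 is the standard full 1080p raster including blanking intervals):

```python
# 3G-SDI (SMPTE 424M) is a 2.97 Gbit/s serial link. Where does that come from?
# 1080p60 over SDI clocks out the full raster, blanking included.
total_width, total_height = 2200, 1125  # 1080p raster incl. blanking
fps = 60
bits_per_pixel_clock = 20  # 4:2:2 at 10 bits: 10-bit Y + 10-bit alternating Cb/Cr

link_bps = total_width * total_height * fps * bits_per_pixel_clock
print(link_bps)  # 2970000000 -> exactly 2.97 Gbit/s
```

So 1080p60 4:2:2 at 10 bits exactly fills a single 3G link; heavier formats such as 4:4:4 fit at lower frame rates (or take a dual link).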

To Chris's point, on the compressed front I don't see any reason the existing modulation standards (ATSC, QAM, etc.) and codecs (MPEG-2/4, etc.) over standard coax wouldn't have worked well, or been adapted to CE displays and equipment, instead of HDMI. A good basic example of this was the old ATSC module that Dish used to sell for their HDTV STB. You had your Dish HD STB, and instead of hooking up a component cable, you just ran one standard coax with F connectors into the antenna F connector on your TV, and you could feed multiple displays that had ATSC demodulators using basic cable splitters. I understand that the content providers had issues with this, but I don't understand why they couldn't have just developed an encryption wrapper. Oh, that's right, they did: encrypted QAM.

Most if not all of this existed and predated HDMI, worked with standard gear, and just needed to be cleaned up a bit to be more CE compatible. Instead, it feels like we got the HDMI mess.
Old 09-02-2010, 08:45 AM - Thread Starter
Member
 
blakehew
 
Join Date: Dec 2001
Location: Utah
Posts: 50
Thanks for all the great comments; this is a very interesting discussion to me. I guess it's marketing and the almighty $$ that really drive the industry. It's rare that common sense and simplicity win out, and the end consumer always seems to get screwed in the end.

Old 09-02-2010, 01:37 PM
Member
 
Chris Friesen
 
Join Date: May 2009
Posts: 20
I suspect that there are a few reasons why they didn't go with HD-SDI:

1) No support for copy protection.
2) No compatibility with existing DVI equipment.

DVI was originally intended as a computer monitor interface, so likely they didn't plan on needing to support long runs. The desire to support both analog and digital (as well as high resolutions via dual-link) over a single cable likely played into the many-conductor design.

When DVI expanded into the consumer electronics arena, it ran into the same length limitations we're now seeing with HDMI, but as a less popular option it was less of an issue.

With HDMI now common, more people are running into cable length problems.
Old 09-06-2010, 11:43 PM
AVS Special Member
 
crutschow
 
Join Date: Feb 2008
Location: USA
Posts: 1,774
If the TV did all the decompression, it would have to decode MPEG-2, MPEG-4, and VC-1 (and possibly others), on top of demodulating ATSC and QAM. Obviously that means building a lot of complex codec hardware into the TV.

Carl
Curmudgeon Elektroniker
 