Any CES Buzz on New Projectors Accepting Ultra Hi-Def or 4K? - Page 3 - AVS Forum
post #61 of 78 Old 01-13-2013, 05:48 PM
AVS Special Member
 
Join Date: Jul 2007
Location: Schenectady, New York
Posts: 3,652
Quote:
Originally Posted by Kris Deering View Post

Considering that theaters can't even do 4K 48fps, why would we think that anything else will? There isn't a single theater out there with a projector that can do 4k 48fps 3D video. All of them are doing 2K to each eye. The Sony D-Cinema projectors can't even do full 4K with standard 3D, the end resolution ends up being less than 2K per eye! But you don't see people screaming about pixel structure with that.

I highly doubt that they'll use the 4K vehicle with any future format to push higher bit depth and color resolution. Just like the existing 1080p standards they will find every way to shortcut the video to gain the most compression efficiency. Personally I'd rather see the new higher-efficiency compression added to the existing Blu-ray spec and get true 10-bit or 12-bit 4:4:4 color resolution and D-Cinema color gamut with 1080p transfers than 4K. I don't care what the TVs at CES looked like.

You're most likely only going to get 10-bit color resolution. The reason is that almost all LCDs are limited to 10 bits of color per pixel. I'm not an engineer, so I don't know whether the limit is in the controller or the panels themselves, but what I've heard is that most LCDs these days are 10-bit native and won't pass through anything higher than that. Apart from LCDs, I think most consumer processors/displays can only pass through and display 10-bit color. So the situation will be similar to 3D in that if you want 12-bit color, you're going to need to upgrade your display device to one that can actually display 12-bit color. But I highly doubt 12-bit color will fly, and 10 is the number they'll land on, due to compatibility.
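
For scale, here's a trivial sketch (Python, nothing display-specific assumed) of what those bit depths buy you per channel; each extra bit doubles the number of gradations, which is what smooths out banding in gradients:

Code:
# Shades per color channel and raw per-pixel payload at common bit depths (RGB / 4:4:4).
for bits in (8, 10, 12):
    levels = 2 ** bits  # gradations per channel: 256, 1024, 4096
    print(f"{bits}-bit: {levels} levels per channel, {bits * 3} bits per 4:4:4 pixel")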

---------------------------------------

Look What AVS Made Me Do!
Seegs108 is offline  
post #62 of 78 Old 01-13-2013, 07:10 PM
AVS Addicted Member
 
Join Date: Apr 2004
Location: Berkeley, CA
Posts: 11,347
Quote:
Originally Posted by mark haflich View Post

 ......

Bill. Are you giddy with the SF win?

Yep, was exciting to see the Niners handle the Packers; I didn't think they would be able to do it.

 

Would be great to see a Niners-Pats SB, for my son-in-law is a die-hard Pats fan (and I actually like them too). I think the Pats will certainly make it, but it looks like the Niners may have a real test with Atlanta.

 

PS: And watching it all this weekend on the VW1000, with a 136x72 pic on an HP2.4 screen, 11 ft away, still makes me giddy. I am still in your debt for talking me into buying this projector!

millerwill is offline  
post #63 of 78 Old 01-13-2013, 07:19 PM
AVS Special Member
 
Join Date: Oct 2002
Location: The Pacific Northwet
Posts: 6,963
Quote:
Originally Posted by Seegs108 View Post

You're most likely only going to get 10-bit color resolution. The reason is that almost all LCDs are limited to 10 bits of color per pixel. I'm not an engineer, so I don't know whether the limit is in the controller or the panels themselves, but what I've heard is that most LCDs these days are 10-bit native and won't pass through anything higher than that. Apart from LCDs, I think most consumer processors/displays can only pass through and display 10-bit color. So the situation will be similar to 3D in that if you want 12-bit color, you're going to need to upgrade your display device to one that can actually display 12-bit color. But I highly doubt 12-bit color will fly, and 10 is the number they'll land on, due to compatibility.

All of the displays in my house currently will accept a 12-bit signal. Whether the panels or displays actually keep the signal 12-bit I don't know, but they'll all accept it.

Contributing Editor/Writer
Sound And Vision Magazine

Click Here To See My Current Setup
Kris Deering is offline  
post #64 of 78 Old 01-13-2013, 07:21 PM
AVS Special Member
 
Join Date: Jul 2007
Location: Schenectady, New York
Posts: 3,652
Yeah, anything HDMI 1.3 or higher will accept it. Most processors and the panels themselves can only do 8- or 10-bit, though.

---------------------------------------

Look What AVS Made Me Do!
Seegs108 is offline  
post #65 of 78 Old 01-13-2013, 07:29 PM
AVS Special Member
 
Join Date: Jul 2007
Location: Schenectady, New York
Posts: 3,652
Without any 10- or 12-bit video material on the consumer side of things (other than PC-generated stuff, perhaps), there hasn't been a need to produce displays that handle it. Maybe if that becomes the standard, displays will be upgraded to meet the spec?

---------------------------------------

Look What AVS Made Me Do!
Seegs108 is offline  
post #66 of 78 Old 01-13-2013, 08:09 PM
AVS Special Member
 
Join Date: Feb 1999
Location: Eden NY
Posts: 6,007
Audiophiles wanted SACD and DVD-Audio, but the masses like MP3s. The same thing will happen with 4K. Store shelves are lined with DVDs, which outnumber Blu-rays by a considerable amount. The majority of LCDs accept 1080p, yet DVDs must be feeding most of them, given the ratio of DVD to Blu-ray titles I saw the last time I walked through Best Buy. If the masses are satisfied with DVDs, where's the logic that they'll buy 4K Blu-rays? If there is a relatively low-cost development path to 4K it may happen, but I'm not sure it makes business sense to spend a bundle of dough developing and releasing 4K discs and 4K players for a relatively small market. The other thing is how many movies would really benefit from 4K. Are film-based flicks really going to look a lot better at 4K resolution? It seems to me there is going to be a limited subset of content that will shine in 4K, so not only is the market going to be small, I expect content choices will also be limited.

Geof
Geof is offline  
post #67 of 78 Old 01-14-2013, 12:13 AM
AVS Special Member
 
Join Date: May 2005
Posts: 5,414
Quote:
Originally Posted by Ron Jones View Post

Very little info has been officially released about what the new capabilities will be, but they did have a goal of supporting 4K at 60 Hz. To what extent 3D (at higher than 24 Hz), deep color and increased color resolution (e.g., 4:2:2 or 4:4:4) may be supported has not yet been made public (as far as I know). I do expect we will hear some details within the next few months.

Of course Wikipedia could be wrong, but it says: "Based on HDMI Forum meetings it is expected that HDMI 2.0 will increase the maximum TMDS per channel throughput from 3.4 Gbit/s to 6 Gbit/s which would allow a maximum total TMDS throughput of 18 Gbit/s". Let's do some math based on that. My numbers don't match exactly what Wikipedia has calculated, so maybe there's a small percentage error in mine, but they should be roughly correct:

Maximum throughput (Gbit/s) with 8b/10b overhead removed:
HDMI 1.4: 8.16 Gbit/s
HDMI 2.0: 14.4 Gbit/s

3840*2160, 4K, 24fps:
2D, 4:4:4, 8bit -> 4.8 Gbit/s - ok
2D, 4:4:4, 10bit -> 6.0 Gbit/s - ok
2D, 4:4:4, 12bit -> 7.2 Gbit/s - ok
3D, 4:4:4, 8bit -> 9.6 Gbit/s - ok
3D, 4:4:4, 10bit -> 11.9 Gbit/s - ok
3D, 4:4:4, 12bit -> 14.3 Gbit/s - ok

3840*2160, 4K, 48fps:
2D, 4:4:4, 8bit -> 9.6 Gbit/s - ok
2D, 4:4:4, 10bit -> 11.9 Gbit/s - ok
2D, 4:4:4, 12bit -> 14.3 Gbit/s - ok
3D, 4:4:4, 8bit -> 19.1 Gbit/s - fail
3D, 4:4:4, 10bit -> 23.9 Gbit/s - fail
3D, 4:4:4, 12bit -> 28.7 Gbit/s - fail

3840*2160, 4K, 60fps:
2D, 4:4:4, 8bit -> 11.9 Gbit/s - ok
2D, 4:4:4, 10bit -> 14.9 Gbit/s - fail
2D, 4:4:4, 12bit -> 17.9 Gbit/s - fail
3D, 4:4:4, 8bit -> 23.9 Gbit/s - fail
3D, 4:4:4, 10bit -> 29.9 Gbit/s - fail
3D, 4:4:4, 12bit -> 35.8 Gbit/s - fail
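
For anyone who wants to check or extend these figures, here's a minimal Python sketch of the same arithmetic (pixel data only, no blanking/audio overhead, with the 8b/10b coding factor removed from the link caps, so it reproduces the rough numbers above):

Code:
# Rough HDMI payload check: pixels * bits per pixel * fps (* 2 for 3D).
# Ignores blanking/audio overhead, so the results are slightly optimistic.
LINK_CAPS_GBPS = {"HDMI 1.4": 10.2 * 0.8, "HDMI 2.0 (expected)": 18.0 * 0.8}

def required_gbps(width, height, fps, bit_depth, three_d=False, subsampling=(4, 4, 4)):
    a, b, c = subsampling  # (4,4,4), (4,2,2) or (4,2,0)
    bits_per_pixel = bit_depth * (a + b + c) / 4.0
    return width * height * fps * bits_per_pixel * (2 if three_d else 1) / 1e9

for fps in (24, 48, 60):
    for three_d in (False, True):
        for depth in (8, 10, 12):
            need = required_gbps(3840, 2160, fps, depth, three_d)
            verdicts = ", ".join(f"{name}: {'ok' if need <= cap else 'fail'}"
                                 for name, cap in LINK_CAPS_GBPS.items())
            print(f"4K {fps}fps {'3D' if three_d else '2D'} {depth}bit 4:4:4 -> "
                  f"{need:.1f} Gbit/s ({verdicts})")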

Quote:
Originally Posted by Kris Deering View Post

Considering that theaters can't even do 4K 48fps, why would we think that anything else will? There isn't a single theater out there with a projector that can do 4k 48fps 3D video. All of them are doing 2K to each eye. The Sony D-Cinema projectors can't even do full 4K with standard 3D, the end resolution ends up being less than 2K per eye! But you don't see people screaming about pixel structure with that.

Yes, but I hate it when specs (or connectors) limit what we can do. I also hate that we have to update receivers etc. every other year because of HDMI version bumps. I would expect HDMI 2.0 to have enough bandwidth to do everything we could ever need/want in the next 10 years. Unfortunately that doesn't appear to be the case. From what I've heard, the RED projectors will be able to do high-bitdepth 4K 3D at up to 60fps. Their codec can do that, too, and so can their delivery platform, I think. Whether 4K 3D is really necessary is a question worth asking, but if you're creating a new spec / delivery platform and can pull out all the stops, why not? I think that's a *much* better approach than what we're used to from CE companies and film studios.

Quote:
Originally Posted by Kris Deering View Post

I highly doubt that they'll use the 4K vehicle with any future format to push higher bit depth and color resolution. Just like the existing 1080p standards they will find every way to shortcut the video to gain the most compression efficiency.

You may be right about that. I sure hope not. If they don't support bigger color spaces and higher bit depth with 4K Blu-ray, then I would consider that really, really stupid.
Elix likes this.
madshi is offline  
post #68 of 78 Old 01-14-2013, 12:19 AM
AVS Special Member
 
Join Date: Jul 2007
Location: Schenectady, New York
Posts: 3,652
The problem is that HDMI is basically single-link DVI reconfigured into a different pin/port layout. I think it's safe to say they're really pushing the technology to its limits. It might be better to create something new if they can't make a new configuration of it work with 4K in all of its possible forms.

---------------------------------------

Look What AVS Made Me Do!
Seegs108 is offline  
post #69 of 78 Old 01-14-2013, 01:57 AM
AVS Special Member
 
Join Date: Jun 2011
Location: Dungeon, Pillar of Eyes
Posts: 1,161
Who's obliged to use HDMI anyway? There's DisplayPort/Thunderbolt, for example, which is free for makers of electronic devices (unlike HDMI). Can someone please explain why everything still uses HDMI?
Elix is offline  
post #70 of 78 Old 01-14-2013, 02:19 AM
AVS Special Member
 
Join Date: Aug 2001
Location: Location, Location
Posts: 1,012
All these bandwidth specs are designed around cost/performance trade-offs: they look at what sort of signal processing is currently affordable and design around that. What was unaffordable, state-of-the-art technology back in 1998, when DVI was being designed, is now not only dirt cheap but obsolete. Thanks to Moore's Law, the transceiver chips in HDMI ports can be a couple of orders of magnitude more powerful than they were 15 years ago for the same price. So it isn't really a case of pushing the limits.

Single-link DVI is ~5 Gbps, HDMI 2.0 is projected to be 18 Gbps. A less-than-4x increase over 15 years isn't particularly aggressive. We saw a 10x speed increase in Ethernet in just four years (100 Mbps in 1995 to 1 Gbps in 1999) - although to be fair, even though the cable standard stayed the same, there were unused wires in the 100 Mbps spec that the 1 Gbps spec made use of.
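
To put that in perspective, here's a hedged back-of-the-envelope comparison against a Moore's-Law-style doubling every two years (the usual rule of thumb, not anything from the HDMI spec), using the figures above:

Code:
# Actual link-rate growth vs. a hypothetical "double every 2 years" silicon curve.
dvi_gbps, hdmi2_gbps, years = 5.0, 18.0, 15        # numbers quoted in the post above
actual_growth = hdmi2_gbps / dvi_gbps              # ~3.6x
silicon_pace = 2 ** (years / 2.0)                  # ~181x, i.e. a couple orders of magnitude
print(f"actual: {actual_growth:.1f}x, Moore's-Law pace: {silicon_pace:.0f}x")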

Also, the reason they want to use HDMI is backwards compatibility. Adapters are a PITA. HDMI has 100% of the consumer entertainment market - DisplayPort and the rest are niche specs for PC monitors, with 0% of consumer entertainment devices. Any consumer entertainment device that implemented DisplayPort would still have to include HDMI for backwards compatibility.

Copyright is not property, it is merely a temporary loan from the public domain.
JerryW is offline  
post #71 of 78 Old 01-14-2013, 04:28 AM
AVS Special Member
 
Join Date: Jan 2006
Posts: 1,786
I am surprised that HDMI has survived this long. Must be the worst connecting plug design in the history of electronics.

There is some hope that HDBaseT Alliance, founded in 2010 by LG Electronics, Samsung Electronics, Sony Pictures Entertainment, and Valens Semiconductor, will be the alternative in the future. They have taken their time to get this going, but there is some movement. HDBaseT Announced at CES 2013 That Onkyo & Pioneer Have Joined Alliance.

Separate HDBaseT extender boxes exist, but HDBaseT built into equipment as an alternative transmission standard to HDMI would be a much better solution. If it were upgradeable with new firmware, we wouldn't need to buy new equipment every time the standard was updated.

HDBaseT uses a standard RJ-45 plug and a Cat5e/6 LAN cable.

The compelling features of HDBaseT technology include:
>Uncompressed video/audio up to 10.2 Gbps.
>Maximum cable length of 100m, including support of multiple hops, up to 8 x 100m
>Low cost standard Cat5e/6 LAN cable
>Utilizes a standard RJ-45 connector
>Supplies up to 100W of power – which can be utilized to power a remote TV
>Support for 100Mbps Ethernet
>Easy installation utilizing existing in-wall Ethernet connectivity
>USB support
>Supports HDCP
>Networking support including extended-range daisy chain and star topologies

There is also a similar "Chinese" system under development called DiiVA, founded (allegedly) by, among others, Sony, Samsung, Panasonic, Sharp, LG, and Toshiba plus a multitude of Chinese companies. But as of today, of the Chinese and Japanese companies that have adopted it, only Toshiba and Foxconn are well-known names.

Doesn't look like there is much difference between HDBaseT and DiiVA.
coolscan is offline  
post #72 of 78 Old 01-14-2013, 05:50 AM
AVS Addicted Member
 
Join Date: Dec 2000
Location: brookeville, maryland, usa
Posts: 19,195
Quote:
Originally Posted by Geof View Post

Audiophiles wanted SACD and DVD-Audio, but the masses like MP3s. The same thing will happen with 4K. Store shelves are lined with DVDs, which outnumber Blu-rays by a considerable amount. The majority of LCDs accept 1080p, yet DVDs must be feeding most of them, given the ratio of DVD to Blu-ray titles I saw the last time I walked through Best Buy. If the masses are satisfied with DVDs, where's the logic that they'll buy 4K Blu-rays? If there is a relatively low-cost development path to 4K it may happen, but I'm not sure it makes business sense to spend a bundle of dough developing and releasing 4K discs and 4K players for a relatively small market. The other thing is how many movies would really benefit from 4K. Are film-based flicks really going to look a lot better at 4K resolution? It seems to me there is going to be a limited subset of content that will shine in 4K, so not only is the market going to be small, I expect content choices will also be limited.

Pretty much what I have been saying. The market is such that 4K content will be very niche and will be served by servers. Content will be more expensive than Blu-rays and will eventually be network based, but limited distribution could be by memory card or hard drive, and there is some movie content now (Time or some title like that) that you can buy in true 4K for about $300 a copy on a hard drive.

Mark Haflich
markhaflich@yahoo.com
call me at: 240 876 2536
mark haflich is offline  
post #73 of 78 Old 01-14-2013, 05:53 AM
AVS Addicted Member
 
Join Date: Dec 2000
Location: brookeville, maryland, usa
Posts: 19,195
Quote:
Originally Posted by coolscan View Post

I am surprised that HDMI has survived this long. Must be the worst connecting plug design in the history of electronics.

There is some hope that HDBaseT Alliance, founded in 2010 by LG Electronics, Samsung Electronics, Sony Pictures Entertainment, and Valens Semiconductor, will be the alternative in the future. They have taken their time to get this going, but there is some movement. HDBaseT Announced at CES 2013 That Onkyo & Pioneer Have Joined Alliance.

Separate HDBaseT extender boxes exist, but HDBaseT built into equipment as an alternative transmission standard to HDMI would be a much better solution. If it were upgradeable with new firmware, we wouldn't need to buy new equipment every time the standard was updated.

HDBaseT uses a standard RJ-45 plug and a Cat5e/6 LAN cable.

The compelling features of HDBaseT technology include:
>Uncompressed video/audio up to 10.2 Gbps.
>Maximum cable length of 100m, including support of multiple hops, up to 8 x 100m
>Low cost standard Cat5e/6 LAN cable
>Utilizes a standard RJ-45 connector
>Supplies up to 100W of power – which can be utilized to power a remote TV
>Support for 100Mbps Ethernet
>Easy installation utilizing existing in-wall Ethernet connectivity
>USB support
>Supports HDCP
>Networking support including extended-range daisy chain and star topologies

There is also a similar "Chinese" system under development called DiiVA, founded (allegedly) by, among others, Sony, Samsung, Panasonic, Sharp, LG, and Toshiba plus a multitude of Chinese companies. But as of today, of the Chinese and Japanese companies that have adopted it, only Toshiba and Foxconn are well-known names.

Doesn't look like there is much difference between HDBaseT and DiiVA.

Very interesting. Thanks for sharing.

Mark Haflich
markhaflich@yahoo.com
call me at: 240 876 2536
mark haflich is offline  
post #74 of 78 Old 01-14-2013, 06:49 AM
AVS Special Member
 
Join Date: Oct 2007
Location: Denmark
Posts: 1,242
Quote:
Originally Posted by madshi View Post

Yes, but even the next-generation HDMI spec (I thought the name was HDMI 2.0?) will *not* be able to support what we need for optimal "Hobbit" playback. Meaning, it will not have enough bandwidth to do 4K with 48fps 3D with 10/12-bit and 4:4:4. It won't even have remotely enough bandwidth for that, from what I've heard.
Generally I agree with you. However, there are still 2 arguments for 4K Blu-Ray:

(1) The studios couldn't really sell "higher bitdepth" and "bigger color spaces" to normal consumers. They need more megapixels for marketing. So I don't think there's any chance that we'll see a new spec for 1080p Blu-ray with support for higher bitdepth and bigger color spaces. I do hope that when they touch the spec to add 4K support, they will also add support for higher bitdepth, bigger color spaces and 4:4:4 at the same time. So 4K might be the vehicle for us to get what we really need. That said, I really wouldn't mind getting 4K resolution. I'm sitting quite near my projection screen, so I should probably be able to see the difference...

(2) I hope studios won't sink so low as to use old 2K DVD masters for 4K Blu-Rays. So 4K Blu-Rays would probably result in us getting new scans/masters for some movies, which could potentially *greatly* improve image quality even for 1080p playback.


OT

But AFAIK the "Hobbit" is not 4K, it's "only" 2K 3D HFR (48p), or am I wrong?

dj
d.j. is online now  
post #75 of 78 Old 01-14-2013, 06:56 AM
AVS Special Member
 
Join Date: Jul 2000
Location: Florida and West Virginia, USA
Posts: 5,686
Quote:
Originally Posted by d.j. View Post

OT

But AFAIK the "Hobbit" is not 4K, it's "only" 2K 3D HFR (48p), or am I wrong?

dj

The Hobbit was shot with Red cameras in native 4K (or perhaps 5K) resolution. However, the current digital cinema standards do not support full 4K resolution for HFR 3D, and therefore I assume it is shown at 2K for the HFR 3D presentations. With the next generation of HDMI, even if it doesn't have the bandwidth to support the most demanding 4K 3D format, industry implementers could get creative and define a "standard" configuration in which HFR 3D 4K with 12-bit depth and 4:4:4 would use two HDMI connections, one for the right and the other for the left video stream. Perhaps not very likely, but possible.
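
A quick sanity check of that two-cable idea, reusing madshi's rough numbers from above (again ignoring blanking overhead, and assuming one eye per link):

Code:
# Would one eye of 4K 48fps 12-bit 4:4:4 fit on a single projected HDMI 2.0 link?
per_eye_gbps = 3840 * 2160 * 48 * 36 / 1e9   # ~14.3 Gbit/s for one eye
hdmi2_payload_gbps = 18.0 * 0.8              # ~14.4 Gbit/s after 8b/10b coding
print(f"{per_eye_gbps:.1f} vs {hdmi2_payload_gbps:.1f} Gbit/s -> "
      f"{'fits (barely)' if per_eye_gbps <= hdmi2_payload_gbps else 'does not fit'}")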

Ron Jones
Blog + Reviews + Articles: projectorreviews.com
Ron Jones is offline  
post #76 of 78 Old 01-14-2013, 07:37 AM
AVS Special Member
 
Join Date: Jan 2006
Posts: 1,786
The Hobbit at the moment exists only in a 2K version. The reason, and this goes for all movies, is that creating and rendering VFX/CGI in 4K is much more expensive and takes much more time than doing VFX/CGI in 2K.
For The Hobbit there simply wasn't time to do the VFX/CGI in 4K. The Hobbit edit was locked just 2-3 days before the world premiere in New Zealand.
That is also why we will never see a true 4K version of Avatar.
They are now working on creating a 4K version of The Hobbit part 1.
My guess is that we will see a lot of 4K movies where the VFX/CGI is just up-converted from 2K and recombined with the live-action 4K camera capture. (The Hobbit was shot in 2x5K.)
Or just up-converted movies that are shot on cameras with lower resolution than 4K, like Skyfall.
coolscan is offline  
post #77 of 78 Old 01-14-2013, 08:14 AM
AVS Special Member
 
Join Date: Dec 2002
Posts: 2,033
Quote:
Originally Posted by coolscan View Post

The Hobbit at the moment exists only in a 2K version. The reason, and this goes for all movies, is that creating and rendering VFX/CGI in 4K is much more expensive and takes much more time than doing VFX/CGI in 2K.
For The Hobbit there simply wasn't time to do the VFX/CGI in 4K. The Hobbit edit was locked just 2-3 days before the world premiere in New Zealand.
That is also why we will never see a true 4K version of Avatar.
They are now working on creating a 4K version of The Hobbit part 1.
My guess is that we will see a lot of 4K movies where the VFX/CGI is just up-converted from 2K and recombined with the live-action 4K camera capture. (The Hobbit was shot in 2x5K.)
Or just up-converted movies that are shot on cameras with lower resolution than 4K, like Skyfall.


Because of how clean FX can look when rendered, they have to be post-processed with noise, grain, etc. to match the scene. With the up-conversion of 2K to 4K, depending on how it is done, you may not notice a difference.


Also, it depends on how they did the Hobbit. Once you properly de-Bayer Red 5K footage you are left with around 2.5K.
space2001 is offline  
post #78 of 78 Old 01-14-2013, 10:04 AM
AVS Special Member
 
Join Date: Jan 2006
Posts: 1,786
Quote:
Originally Posted by space2001 View Post

Because of how clean FX can look when rendered, they have to be post-processed with noise, grain, etc. to match the scene. With the up-conversion of 2K to 4K, depending on how it is done, you may not notice a difference.
I agree. There is also the fact that the VFX elements are often just a part of the frame, so they don't have to have the full resolution of the frame.
But the time-consuming (and thereby expensive) part is that when a 4K live-action frame is to be combined with a VFX element originally created for a 2K finish, the element has to be up-converted and rendered out first to match the finishing resolution. You cannot just take the original 2K VFX/live-action composite frame and up-convert it, because it will look different from the rest of the original live-action frames that don't have VFX.
Quote:
Also, it depends on how they did the Hobbit. Once you properly de-Bayer Red 5K footage you are left with around 2.5K.
I think you misspelled that, or are thinking of the Red One or Scarlet, which resolve about 3.5K after debayering. When you debayer Red Epic 5K footage you have about 4.5K of real resolved resolution.
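
The disagreement really comes down to what fraction of the photosite count you credit after demosaicing. A tiny sketch with the efficiency factor left as the open variable (the exact value is what's being debated here; roughly 0.5 would give space2001's figure, roughly 0.9 mine):

Code:
# Effective resolved width after Bayer demosaic = photosite width * efficiency factor.
photosite_width = 5120                    # a "5K" Bayer sensor
for factor in (0.5, 0.7, 0.8, 0.9):       # the debated range
    print(f"factor {factor}: ~{photosite_width * factor / 1000:.1f}K resolved")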
coolscan is offline  