Is anyone else bothered by "4K" being 3840? Why not use DisplayPort? - AVS Forum
post #1 of 30 - 10-19-2013, 06:16 PM - pqwk50 (Member, Thread Starter)

If the industry is avoiding DisplayPort because of DRM, then it's official that DRM is ruining home theater.

Wasn't the reason "4K" was knocked down to 3840 that HDMI couldn't handle the bandwidth? DisplayPort can, and yet they still go ahead and bastardize the home theater experience just to be able to keep using HDMI?

The HDMI 2.0 spec is done and says it supports 4K. But is that 4096 horizontal resolution or 3840? Can HDMI 2.0 handle 4096 horizontal?

Listening to Home Theater Geeks episode 158 really ticked me off that the industry isn't going with 4096 over DisplayPort.

post #2 of 30 - 10-20-2013, 01:11 AM - Joe Bloggs (AVS Special Member, UK)

Quote:
Is anyone else bothered by "4K" being 3840?
Yes, since it's less than 4K. Though the lack of high frame rates is probably more of an issue.

HDMI 1.4 could handle 4096x2160 at 24 fps, and 3840x2160 at 24 or 30 fps.

HDMI 2.0 increases "4K" support to up to 60 fps. The official HDMI site doesn't say which "4K" that is.
edit: Though the Home Theater Geeks 179 video says HDMI 2.0 supports up to 4096x2160p60.
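
For a rough sense of the numbers being argued about, here's a back-of-the-envelope sketch (pixel data only; real links also carry blanking, audio and encoding overhead, so the cable has to move a fair bit more than this):

Code:
# Uncompressed pixel-data rates for the resolutions discussed above.
# 8-bit RGB assumed (24 bits per pixel); blanking and audio ignored.
def raw_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

for w, h, fps in [(3840, 2160, 24), (3840, 2160, 60), (4096, 2160, 60)]:
    print(f"{w}x{h} @ {fps} fps: {raw_gbps(w, h, fps):.2f} Gbit/s of pixel data")

# Commonly quoted effective video payload limits, for comparison:
#   HDMI 1.4 ~ 8.2 Gbit/s, HDMI 2.0 ~ 14.4 Gbit/s, DisplayPort 1.2 ~ 17.3 Gbit/s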
post #3 of 30 - 01-08-2014, 09:57 PM - Mr.D (AVS Special Member, UK)

No.

3840 is a better fit for upscaling 1080p material, which will be the bulk of material available to view on a "4K" display for the foreseeable future.

The visual difference between 3840 and 4096 is tiny (especially if 4096 material is mastered sensibly, i.e. cropped to 3840 rather than rescaled).
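
A quick way to see why the integer multiple matters for upscaling (an illustrative sketch, not what any TV actually runs):

Code:
# Scale factors from 1920-wide (1080p) material to the two "4K" widths.
for target_width in (3840, 4096):
    factor = target_width / 1920
    note = ("integer: each source pixel maps to a clean 2x2 block"
            if factor.is_integer()
            else "non-integer: every output pixel has to be interpolated")
    print(f"1920 -> {target_width}: x{factor:.4f} ({note})")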

post #4 of 30 - 01-08-2014, 11:26 PM - Joe Bloggs (AVS Special Member, UK)

Quote:
Originally Posted by Mr.D

No.

3840 is a better fit for upscaling 1080p material, which will be the bulk of material available to view on a "4K" display for the foreseeable future.

The point is also the name. Wouldn't it matter if TV companies advertised their TVs as 4,000 pixels wide when they weren't? E.g. "buy our new TV - it has 4,000 pixels across" - you buy it, and the Sale of Goods Act says that goods must be as described, but it is actually 3,840 pixels across. You've been misled by the advertisement or description of the TV. 4K is short for 4,000, so advertising a set as 4,000 (4K) is misleading if you are actually getting only 3,840 pixels across, and surely then the goods aren't "as described" in the sense of the Sale of Goods Act.

The Sale of Goods Act says:
Quote:
Where there is a contract for the sale of goods by description, there is an implied [term] that the goods will correspond with the description.

..For the purposes of this Act, goods are of satisfactory quality if they meet the standard that a reasonable person would regard as satisfactory, taking account of any description of the goods

There's also advertising law
https://www.gov.uk/marketing-advertising-law
Quote:
All marketing and advertising must be:

an accurate description of the product or service
legal
decent
truthful
honest
Is stating something is 4K (which is short for 4,000 - or has been for a long time) an accurate description of something which only has 3,840 pixels across?
post #5 of 30 - 01-08-2014, 11:58 PM - Mr.D (AVS Special Member, UK)

Quote:
Originally Posted by Joe Bloggs

The point is also the name. Wouldn't it matter if TV companies advertised their TVs as 4,000 pixels wide when they weren't? E.g. "buy our new TV - it has 4,000 pixels across" - you buy it, and the Sale of Goods Act says that goods must be as described, but it is actually 3,840 pixels across. You've been misled by the advertisement or description of the TV. 4K is short for 4,000, so advertising a set as 4,000 (4K) is misleading if you are actually getting only 3,840 pixels across, and surely then the goods aren't "as described" in the sense of the Sale of Goods Act.

The Sale of Goods Act says:

It's irrelevant. They could call it 100K; as a term it's meaningless. I personally would have preferred UHD or even SuperHD.

I take great exception to people referring to 1080p as "2K". It simply isn't, and "2K" as a term referring to specific imaging standards has ironically been around longer than "1080p". As has 4K, funnily enough.

With the advent of "4K" as a consumer format I've given up fighting on this point, but I certainly don't refer to 1080p as "2K" despite it becoming the norm for others.

And I still think the overriding concern for a home cinema enthusiast is that a 4K display handles 1080p upscaling to a very high standard, and sticking with an integer pixel multiple of 1080p is the way to go, considering that 1080p material is going to represent about 90% of even an enthusiast's source material for at least the next decade.

And it's my very firm opinion, based on having watched and created both 2K and 4K professional material for over 20 years, that most people will find the difference between 4K and well-displayed 1080p "subtle", even on uncommonly large screen sizes. I'd even suggest that without the facility to A/B 1080p against 4K, most people would not be able to reliably tell one from the other.

post #6 of 30 - 01-09-2014, 12:27 AM - Joe Bloggs (AVS Special Member, UK)

Quote:
Originally Posted by Mr.D

And it's my very firm opinion, based on having watched and created both 2K and 4K professional material for over 20 years, that most people will find the difference between 4K and well-displayed 1080p "subtle", even on uncommonly large screen sizes. I'd even suggest that without the facility to A/B 1080p against 4K, most people would not be able to reliably tell one from the other.

Does that depend on whether you are dealing with film scans or direct recordings from 4K/5K digital cameras (like the RED)? I.e. wouldn't direct captures from a digital camera (assuming no filtering that would soften the picture, and the best lenses and focusing) give more visible resolution, or be easier to differentiate on average, than most 35mm film scans? Wouldn't it also depend on shutter duration/frame rate - i.e. easier to notice the difference at a higher frame rate, with each frame clearer (less motion blur) and easier to track due to the higher frame rate?
post #7 of 30 - 01-09-2014, 12:44 AM - Mr.D (AVS Special Member, UK)

Quote:
Originally Posted by Joe Bloggs

Does that depend on whether you are dealing with film scans or direct recordings from 4K/5K digital cameras (like the RED)? I.e. wouldn't direct captures from a digital camera - assuming the best lenses and focusing - give more visible resolution, or be easier to differentiate on average, than most 35mm film scans?

In reality it's not that simple. It comes down to lenses, types of shot, content, depth of field. I generally encounter DOPs finding softness issues more readily in digitally captured footage than in film. For example, I have recently been asked by DOPs for my opinion on why certain shots in a feature film (not actually one I'm working on) were coming through soft from 5K capture. The only conclusion I could draw was that maybe they had hit just the right color combinations in parts of the plate to show up shortfalls in color resolution for those particular hues.

I'll be honest: I've never heard a DOP complain about film scans not being sharp enough, whilst I usually hear about sharpness issues on some material on every film I've worked on that was shot digitally (with the possible exception of Alexa, which is ironically regarded as slightly softer than ideal but is generally so well sampled that it responds very favourably to small amounts of sharpening in post). In general I like 35mm and Alexa. Red I hate because the gamut just never looks right to me, regardless of how it's graded. The F65 I've not shot enough of to make a firm decision on; in theory it should be everything the Alexa is but sharper, yet it seems a bit quirky and slightly unreliable in operation (it seems to crash more often than the others).

Quote:
Originally Posted by Joe Bloggs

Wouldn't it also depend on shutter duration/frame rate - i.e. easier to notice the difference at a higher frame rate, with each frame clearer (less motion blur) and easier to track due to the higher frame rate?

It would, but not with static shots, which make up the bulk of shots in a movie - so if it were a significant difference you would see it without 48 fps capture. 24 fps playback with a 1/48th-second capture interval (standard 24p capture with a 180-degree shutter angle) is something I like anyway. The 48 fps presentations I saw were not pleasing to me aesthetically, regardless of their mooted benefits. I'm not mad about 3D either, with the exception of gaming.
post #8 of 30 - 01-27-2014, 08:11 AM - nathanddrews (Advanced Member, Minnesota)

1. I wish DP adoption were higher. HDMI is and always has been less capable. When DP 1.3 arrives, support will expand to full 8K as well as 4K 3D and HFR. I'm not sure if this will be fully compatible with Super Hi-Vision 8K 120 fps, but I'd settle for 4K at 120 fps. ;) 8K desktop!

2. While I know that consumer 4K is technically less than "true 4K", it's rather difficult to settle on just a single 4K standard. Just like the whole 1080p/2K argument, I think it's appropriate that the 4K consumer devices are slightly less than the max spec. Since 4K digital cameras range in resolution from 3840x2160 to 4096x2304, you'd be forced to have either pillarboxes or slight blow-ups for anything shot on sensors under "true 4K". Not to mention safe areas or edits... I'd rather have higher-res content shrunk down or cropped by a couple of pixels than deal with awkward 6% scaling. I think someone else already mentioned that 1080p content scales better as well.
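
To put numbers on that "6% scaling", here's a tiny sketch of the two options for fitting a 4096-wide master onto a 3840-wide panel:

Code:
# Crop vs. rescale when a 4096-wide master meets a 3840-wide (UHD) panel.
master_w, panel_w = 4096, 3840

crop = master_w - panel_w
scale = panel_w / master_w
print(f"crop:    lose {crop} columns ({crop / master_w:.2%} of the frame width)")
print(f"rescale: shrink by x{scale:.4f} (about {1 - scale:.2%}), resampling every pixel")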
post #9 of 30 - 01-27-2014, 09:06 AM - jamesbrummel100 (Newbie)

You probably all know this, but few of the labels on tech stuff are accurate. 1 MB of RAM is really 1.024 MB, the term "up to" can mean anything from zero to whatever number is being sold, gas is 9/10 of a cent more per gallon than you think it is, this is not the world's best coffee, etc.

A band called Cop Shoot Cop has an album called "Consumer Revolt". Food for thought.
post #10 of 30 - 01-27-2014, 11:16 AM - Joe Bloggs (AVS Special Member, UK)

Quote:
Originally Posted by jamesbrummel100

1 MB of RAM is really 1.024 MB.

Not really. There are multiple definitions of megabytes, kilobytes etc., based on either 1000 or 1024.

E.g. type "kilobytes in a megabyte" into Google and it says 1024; type the same thing into Wolfram Alpha and it says 1000 - see also Wikipedia for the two definitions.
http://en.wikipedia.org/wiki/Megabyte
For RAM it will be using the 1024 definition.

It would be okay if you were actually getting more (e.g. if they were giving you 1024 when they're telling you 1000) - it's different when you're getting far fewer pixels across than they're telling you you're getting.
And for 2.40:1 films they still say 1080p resolution, just like you won't be getting the full 2160p / UHD resolution when they release them at (less than) "4K".
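
For anyone who wants the two conventions side by side, a quick sketch:

Code:
# Decimal vs. binary byte units, in bytes.
MB, MiB = 1000 ** 2, 1024 ** 2
GB, GiB = 1000 ** 3, 1024 ** 3

print(f"binary 'megabyte' is {(MiB - MB) / MB:.1%} larger than the decimal one")   # ~4.9%
print(f"binary 'gigabyte' is {(GiB - GB) / GB:.1%} larger than the decimal one")   # ~7.4%
# Which is why a "1 TB" drive shows up as roughly 931 "GB" in an OS that counts in 1024s:
print(f"1 TB (decimal) in GiB: {10 ** 12 / GiB:.0f}")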
post #11 of 30 - 01-27-2014, 11:40 AM - jamesbrummel100 (Newbie)

Binary uses base 8, not base 10. So 1024 is the round number; 1000 would be like a baker's dozen. We humans like that better, but our binary friends don't.
post #12 of 30 - 01-27-2014, 11:47 AM - Joe Bloggs (AVS Special Member, UK)

Quote:
Originally Posted by jamesbrummel100

Binary uses base 8, not base 10. So 1024 is the round number; 1000 would be like a baker's dozen. We humans like that better, but our binary friends don't.

Binary actually uses base 2. :) Though I know what you mean - 8 bits in a byte.

There are still two definitions of gigabyte/megabyte/kilobyte etc. in use, based on 1000 or 1024, as said above. Though I agree that in computing it will normally be 1024 (people selling a hard disk are likely to use the 1000 definition to give you less), which would still mean "1 MB of RAM is really 1.024 MB" is incorrect, since in the usual 1024 definition it's still 1 MB, not 1.024 MB.
post #13 of 30 - 01-27-2014, 06:20 PM - coolscan (AVS Special Member)

Quote:
Originally Posted by nathanddrews

Since 4K digital cameras range in resolution from 3840x2160 to 4096x2304, you'd be forced to have either pillarboxes or slight blow-ups for anything shot on sensors under "true 4K". Not to mention safe areas or edits... I'd rather have higher-res content shrunk down or cropped by a couple of pixels than deal with awkward 6% scaling. I think someone else already mentioned that 1080p content scales better as well.

Not entirely correct.
The digital cameras used to shoot feature films have more resolution than 4K, which is then oversampled down for a 4K finish in post for better quality - if they do it right.

Sony F65:
Claimed by Sony to be an 8K camera; it later got new firmware that did "8K" with some pixel-shift magic and was promoted by Sony as now being a "real 8K" camera.
The truth is that the camera has roughly a 20-megapixel sensor (6K), which is far from 8K - you could add more than an entire 4K sensor's worth of pixels on top of that before you reach 8K.

The sensor uses only 17.6 megapixels (5782 x 3060) of recording area.
This is processed and downsampled in camera to 4K, or the sensor data is sent to an external recorder as RAW.

The camera outputs a very good 4K image but is not that popular because of the size and weight of the camera body.
Used on After Earth and Oblivion.

Red Epic-MX: 5120 x 2700 sensor (5K).
Used on The Hobbit, Prometheus, Elysium, Pacific Rim, Lone Survivor, Transformers: Age of Extinction, etc.

Red Epic Dragon (new, just released): 6144 x 3160 sensor (6K).

Shooting those cameras with the intended aspect ratio as a safe area leaves room for re-framing and stabilization in post.

LG, Samsung and Toshiba's new 21:9 TVs have a resolution of 5,120 x 2,160. It would be nice if movies were finished in that resolution for owners of those TVs. ;)
Never going to happen though. :P

But if the thinking behind the original CinemaScope format had been followed, movies in 21:9 should always be larger than movies in 16:9, including in resolution, with Constant Image Height (CIH) between them.
Not like now, when 21:9 movies are cropped down to less vertical resolution than 16:9.
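
To put numbers on the CIH point for a typical 2.39:1 scope film (illustrative arithmetic only, not a mastering spec):

Code:
# Letterboxing on a 16:9 UHD panel vs. Constant Image Height on a 21:9 panel.
SCOPE = 2.39                              # typical "scope" aspect ratio
uhd_w, uhd_h = 3840, 2160

letterboxed_h = round(uhd_w / SCOPE)      # active picture height on the 16:9 panel
cih_w = round(SCOPE * uhd_h)              # width needed to keep the full 2160 height

print(f"on 3840x2160:     {uhd_w} x {letterboxed_h} active pixels (vertical resolution drops to ~{letterboxed_h})")
print(f"CIH at 2160 high: {cih_w} x {uhd_h} (close to those 5120x2160 21:9 panels)")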

post #14 of 30 - 01-28-2014, 04:43 AM - nathanddrews (Advanced Member, Minnesota)

I'm not talking about resolutions beyond 4K. I specifically called out only cameras marketed as "4K digital cameras", not 5K, etc. The point is that the 4K spec, like other resolution specs, is not simply one set of values but a range that varies with several factors. This is why I'm not bothered by consumer-level 4K (UHD) being at the bottom end of the range, since all other resolutions can easily be scaled or cropped down.

Since you brought up 21:9 displays, I'm itching to get a 5120x2160 21:9 display when they come down in price - the only caveat is whether or not the BDA supports anamorphic with their next 4K BD spec. Otherwise I'll probably just stick with good ol' 16:9. I recently bought (and returned) an Asus monitor for my PC to try it out. I have to say that I love what it did for games and the quasi-CIH effect for video content, but I wasn't a fan of the low horizontal resolution for everything else.
post #15 of 30 - 01-28-2014, 05:59 AM - coolscan (AVS Special Member)

Quote:
Originally Posted by nathanddrews

I'm not talking about resolutions beyond 4K. I specifically called out only cameras marketed as "4K digital cameras", not 5K, etc. The point is that the 4K spec, like other resolution specs, is not simply one set of values but a range that varies with several factors. This is why I'm not bothered by consumer-level 4K (UHD) being at the bottom end of the range, since all other resolutions can easily be scaled or cropped down.

You're just misunderstanding the reason for, and use of, cameras with higher-resolution sensors.

A camera with a 4K sensor doesn't resolve full 4K resolution, and the same goes for a camera with a 2K sensor: it doesn't resolve 2K.
This is because when a CMOS sensor's RAW data is debayered, it loses captured resolution.

A camera with a 2K sensor will resolve approx. 1.5K.
A camera with a 4K sensor will resolve approx. 3.5K.

This is the reason the top-end cameras in the field have sensors with more pixels than the intended output format.

Example: the most-used digital cinema camera, the Arri Alexa, has a 2880×1620 sensor; the data is debayered and oversampled down in camera, and it records/outputs 2K in the ProRes format.
In this way the image contains a fully resolved 2K.

Something similar is done with the Sony F65, where the 20-megapixel sensor (17.6 MP effective) is downsampled in camera to output fully resolved 4K images.
The RED cameras record only RAW, so there the downsampling happens in post-production.

The same principle is/should be applied to film scanning, where a film should be scanned at 6K or higher for a top-quality 4K master.
70mm and IMAX are typically scanned at 8K, or sometimes 11K, even though the release format is only 4K.

The full sensor resolution of these cameras/scanners, after debayering but before downsampling, is only used for VFX plates, where the downsampling happens after the elements are integrated into the movie, or when they need to upscale the files for a master output/release at higher than sensor resolution.

This means that when we eventually get to an 8K format of 33 megapixels, the high-end 8K cameras will need 60+ megapixel sensors to be able to fully resolve 8K.

To repeat: when you see that a future movie has been shot with a camera that has only a 4K sensor and is released as a 4K movie, it doesn't really contain fully resolved 4K detail.
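
As a back-of-the-envelope sketch of that argument, using a single rule-of-thumb resolve factor (the figures above imply roughly 0.75-0.85; the real number varies with the sensor and the debayer algorithm):

Code:
# Rough oversampling arithmetic; RESOLVE_FACTOR is an assumed rule of thumb,
# not a measured value for any particular camera.
RESOLVE_FACTOR = 0.8

def resolved_width(sensor_width, factor=RESOLVE_FACTOR):
    return round(sensor_width * factor)

def sensor_width_needed(delivery_width, factor=RESOLVE_FACTOR):
    return round(delivery_width / factor)

print(resolved_width(4096))        # ~3277: why a "4K sensor" is not a full 4K resolve
print(sensor_width_needed(4096))   # 5120:  why ~5K-6K sensors are used for a 4K finish
print(sensor_width_needed(7680))   # 9600:  the same logic applied to an 8K delivery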


Cam Man has recently started a Cinematography thread where camera technology, cinematography and many other aspects of cinema will be discussed: http://www.avsforum.com/t/1513529/cam-mans-cinematography-thread
post #16 of 30 - 01-28-2014, 07:20 AM - bralas (Lafayette, LA)

Quote:
Originally Posted by Mr.D

I'll be honest: I've never heard a DOP complain about film scans not being sharp enough, whilst I usually hear about sharpness issues on some material on every film I've worked on that was shot digitally...

Greetings Mr. D,

That likely correlates to characteristics of the film capture itself, i.e. film tracks color logarithmically like the human eye, coupled with significant latitude/headroom. Also, S16mm and 35mm film have unmatched color saturation. Very desirable ingredients in the industry.

post #17 of 30 - 01-28-2014, 11:10 AM - Joe Bloggs (AVS Special Member, UK)

Quote:
Originally Posted by bralas

Also, S16mm and 35mm film have unmatched color saturation. Very desirable ingredients in the industry.

The BBC doesn't want Super 16 - at least in HD - it's too grainy. They'll accept it now in SD but not HD, because of the grain.
post #18 of 30 - 01-28-2014, 11:19 AM - bralas (Lafayette, LA)


Personally I'm not concerned with the BBC's perspective. 2K is a nice fit for S16mm, and 4K is not overkill for 35mm.

 

http://en.wikipedia.org/wiki/Category:Films_shot_in_Super_16

 

'Moonrise Kingdom' is a great example IMO;)

post #19 of 30 - 01-28-2014, 12:20 PM - bralas (Lafayette, LA)


https://www.youtube.com/watch?v=7N8wkVA4_8s

 

'Moonrise Kingdom' S16mm

post #20 of 30 - 01-28-2014, 12:46 PM - bralas (Lafayette, LA)


http://www.imdb.com/title/tt1125849/

 

'The Wrestler' S16mm

 


post #21 of 30 - 01-28-2014, 04:30 PM - Joe Bloggs (AVS Special Member, UK)

If they're good films, they would have been better shot on something that doesn't give bad picture quality because of all the grain. Super 16 = grainy. This is the HDTV section, and since the BBC will only allow S16 in SD because of the grain, it will only be shown in SD quality on their channels. If you look at the reviews of the above films, you'll see them talking about the PQ problems caused by the grain.
post #22 of 30 - 01-28-2014, 04:49 PM - bralas (Lafayette, LA)


Sure:)

post #23 of 30 - 01-28-2014, 04:59 PM - bralas (Lafayette, LA)


Maybe darinp2 :confused: knows why the BBC shows 2K content as Std Def only?

post #24 of 30 - 01-29-2014, 10:54 AM - Joe Bloggs (AVS Special Member, UK)

Quote:
Originally Posted by bralas

Maybe darinp2 :confused: knows why the BBC shows 2K content as Std Def only?

It's that they show bad-quality, grainy content in SD because down-converting averages out the grain. If it were quality 2K or full-HD content (unlike S16), they'd accept it in HD.

http://www.tvbeurope.com/main-content/full/super16mm-an-sd-delivery-for-hd
Quote:
...it also stipulated that the format be classified as SD for delivery.
Quote:
The tests proved conclusively that the TX chain can cope if an SD version is ingested, rather than an HD version, because in the SD version much of the grain has been stripped away.
Quote:
Producers wishing to originate in 16mm are still asked to scan and post produce in HD or 2K but to deliver an SD master
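
A toy illustration of why the SD down-conversion strips grain away (crudely assuming grain behaves like independent per-pixel noise, which real film grain only approximates):

Code:
# Downscaling with a box filter averages several grain samples into each
# output pixel, cutting the grain's standard deviation by roughly the scale factor.
import numpy as np

rng = np.random.default_rng(0)
hd = rng.normal(0.0, 1.0, size=(1080, 1920))     # pretend pure "grain", std = 1.0

def box_downscale(img, fy, fx):
    h, w = img.shape
    return img[:h - h % fy, :w - w % fx].reshape(h // fy, fy, w // fx, fx).mean(axis=(1, 3))

sd = box_downscale(hd, 2, 2)                     # crude 2x reduction toward SD-ish size
print(round(hd.std(), 2), round(sd.std(), 2))    # ~1.0 vs ~0.5: the grain is averaged down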
post #25 of 30 - 01-31-2014, 06:29 AM - John Mason (AVS Addicted Member, New York, NY)

I recalled this earlier AVS post from a BBC producer about why they decided not to use S16 for UK HDTV.

I own the Blu-ray set of "Pride and Prejudice", captured with S16 for the BBC, and it looks superb - but only after undergoing special, newly developed (at the time) telecine processing from the original negative. The technique was outlined, with an online DVD/Blu-ray comparison video, in the review of the Blu-ray in the Blu-ray software section here, although the comparison video (from the Blu-ray technical extra) may no longer have the same URL, if any. -- John

post #26 of 30 - 01-31-2014, 06:41 AM - bralas (Lafayette, LA)

Quote:
Originally Posted by Joe Bloggs

It's that they show bad-quality, grainy content in SD because down-converting averages out the grain. If it were quality 2K or full-HD content (unlike S16), they'd accept it in HD.

S16mm is 2K; the content is more than 1920 x 1080 :)

The color depth of S16mm is way more than the obligatory (HDTV) Rec. 709.

post #27 of 30 - 01-31-2014, 06:55 AM - bralas (Lafayette, LA)


From wiki:

 

"...In particular, Scrubs has been shot on Super16 from the start and is aired either as 4:3 SD (first 7 seasons) or as 16:9 HD (seasons 8 and 9). John Inwood, the cinematographer of the series, believed that footage from his Aaton XTR Prod camera was not only sufficient to air in high definition, it "looked terrific."

 

Empirical evidence? A BBC opinion does not have global significance that I'm aware of. And the BBC deviated from the universal-standard SMPTE C phosphors for a technical reason?

post #28 of 30 - 01-31-2014, 06:57 AM - Joe Bloggs (AVS Special Member, UK)

Quote:
Originally Posted by bralas

S16mm is 2K; the content is more than 1920 x 1080 :)

Not when you broadcast it - it is not more than, or even equal to, 1920x1080; it's SD when you have to deliver it to the broadcaster in standard definition, because Super 16 is such a poor medium that it will not compress well due to all the grain. If it were accurately capturing scene information in each of the 1920x1080 elements, it would be as easy for the compressor to compress as 35mm-shot content. So while there may be some full-HD information resolved in amongst all the random grain, each 1920x1080 element isn't accurate in Super 16; the random added grain makes it difficult to compress without artefacts at the bitrates used by the BBC HD channels, unlike 35mm or normal 1920x1080 video-shot content, which is accurate.

So the fact that it doesn't compress as well as normal 1920x1080 video or 35mm film at the bitrates being used is evidence that it isn't accurate 1920x1080 (where each pixel should represent image information from the scene, not just random grain) - never mind more than 1920x1080.
post #29 of 30 - 01-31-2014, 07:03 AM - bralas (Lafayette, LA)

Quote:
Originally Posted by Joe Bloggs

...because Super 16 is such a poor medium...

Sure it is:rolleyes:

post #30 of 30 - 01-31-2014, 07:25 AM - Joe Bloggs (AVS Special Member, UK)

I wonder if Super 35 will have the same issues when broadcast at 3840x2160 as Super 16 has at 1920x1080. Maybe when we have UHDTV1 3840x2160 they'll only allow Super 35 films to be sent at no more than 1920x1080 because of the grain. It will depend on the bitrates used, and on how good H.265 is if that is used, but it still might not be as easy to compress while maintaining the full resolved 2160p resolution as other formats.
Quote:
In particular, Scrubs has been shot on Super16 from the start and is aired either as 4:3 SD (first 7 seasons) or as 16:9 HD (seasons 8 and 9). John Inwood, the cinematographer of the series, believed that footage from his Aaton XTR Prod camera was not only sufficient to air in high definition, it "looked terrific."
So the person who shot it thinks it looks "terrific"? Isn't he a bit biased, being the one who shot it?

Here's what highdefdigest says about the Blu-ray, which will be encoded at a lot higher bitrate than broadcast TV, and with non-realtime encoding:
http://bluray.highdefdigest.com/2357/scrubs_s8.html
Quote:
the transfer still suffers from slight mediocrity. Detail is never sharp. Every shot takes on a feel of softness. Colors are plentiful, but subdued. Nothing ever pops off the screen, rather it all seems just a bit flat.
Not exactly "terrific" PQ, is it?