Blu ray played on pc, is 2k resolution anti-aliased to 1080p? - AVS Forum
post #1 of 11 Old 11-25-2009, 06:00 PM - Thread Starter
8:13 (AVS Special Member)
I read on Wikipedia that anti-aliasing is when a higher-resolution image is fit to a smaller resolution.

The video card's frame buffer/RAM needs to be large enough for the card to do anti-aliasing properly.

This frame buffer/RAM isn't so important with PCIe 2.0 or 2.1, but with PCIe 1.0 x16 the frame buffer on the video card is the only RAM available; it holds the video frame and enables things like anti-aliasing to be done to the frame.

With PCIe 1.0 x16, if you have a 2K source frame being anti-aliased to 1080p and you don't have enough video card RAM to do this properly, you will have anti-aliasing issues.

So my question is: is Blu-ray 2K on the disc, or do they anti-alias it to 1080p at the factory?


There is new, and then you are new.
This is a moral of the bears and their cereal.
post #2 of 11 Old 11-25-2009, 06:28 PM
demonfoo (AVS Special Member)
I've never seen any BD at a resolution other than 1080 (either i or p). I believe 720p is technically supported as part of the BD spec, and some bonus material is 480i/p SD, but 1920x1080 is typical. BD doesn't support any higher resolution, and HDMI prior to 1.3 didn't support resolutions higher than 1920x1080. HDMI 1.3 does support 2560x1440 (though to my knowledge no display hardware does at present).
post #3 of 11 Old 11-25-2009, 10:07 PM
CRT Dude (Advanced Member)
The article was probably about gaming. Supersampling, the form of AA you're talking about, is when you render at a higher resolution and then downscale. This gets rid of jaggies and shimmering. CGI does this (TF2's IMAX scenes were rendered at 8K, IIRC).
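The supersample-then-downscale idea is easy to sketch. This toy example is purely illustrative (the "scene" is a made-up hard diagonal edge, and real GPUs do all of this in hardware): it renders at 4x the target resolution, then box-filters each 4x4 block down to one pixel, so pixels straddling the edge come out as intermediate grey values instead of a hard jaggy step.

```python
# Toy supersampling (SSAA) sketch: render at a higher resolution,
# then box-filter down to the target size. The scene is a made-up
# hard diagonal edge, purely for illustration; real GPUs do this
# in fixed-function hardware.

FACTOR = 4    # supersampling factor
W, H = 8, 8   # tiny target resolution, for demonstration

def coverage(x, y):
    """Hard-edged scene: 1.0 above the diagonal y = x, else 0.0."""
    return 1.0 if y > x else 0.0

def render_supersampled(w, h, factor):
    hi_w, hi_h = w * factor, h * factor
    # Render at the higher resolution first...
    hi = [[coverage(x / hi_w, y / hi_h) for x in range(hi_w)]
          for y in range(hi_h)]
    # ...then average each factor-by-factor block down to one pixel.
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            block = [hi[y * factor + j][x * factor + i]
                     for j in range(factor) for i in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

img = render_supersampled(W, H, FACTOR)
# Pixels straddling the edge end up with fractional grey values
# (anti-aliased) instead of jumping straight from 0.0 to 1.0.
partial = [v for row in img for v in row if 0.0 < v < 1.0]
print(len(partial))  # number of partially covered edge pixels
```

The averaging step is why the technique needs so much memory: the intermediate image is factor-squared times larger than the output.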
post #4 of 11 Old 11-25-2009, 10:17 PM
Joe Bloggs (AVS Special Member)
I think they often scan at 2K and then usually crop to 1920x1080 for Blu-ray (they might resample down from 2K to 1920x1080, but I think they usually crop). Some films are scanned at 4K or higher.

The Blu-ray spec doesn't support any resolution higher than 1920x1080 (though I suppose you could put a higher-resolution video file onto a Blu-ray disc and have it playable on a PC).
post #5 of 11 Old 11-26-2009, 01:12 AM - Thread Starter
8:13 (AVS Special Member)
I use the ATI overscan slider in the Catalyst settings for my LCD TV.
This makes the picture bigger so it fills my TV screen.
I want to know if this changes the way the UVD handles the resolution.

Since the size of the picture is being changed by the "scaling options" slider, does this also affect how the ATI UVD treats the resolution, so that it may need anti-aliasing for some technical reason I can't think of right now?


post #6 of 11 Old 11-26-2009, 01:52 AM
Mr.D (AVS Special Member)
Most film scanners are natively 4K or higher. Even if the scans come off as 2K, they have usually been scanned initially at 4K and downsampled to 2K.

It's already anti-aliased; however, there is usually still some aliasing at 2K (even 4K in certain situations) if you really look for it.

digital film janitor
post #7 of 11 Old 11-26-2009, 02:19 AM - Thread Starter
8:13 (AVS Special Member)
I found the answer to my question in post #5. If I leave the ATI Catalyst "scaling options" at the default instead of using them to make the picture fill the screen, I get no aliasing when I play the Blu-ray movie Rambo at the 6-minute and 8-minute marks.

Six minutes into the movie he gets off the boat, and depending on the scaling setting I either see aliasing on the wooden deck or I don't. Eight minutes in, the shot cuts from Rambo to the preachers, and the man's face either shows aliasing or it doesn't.

I have 256 MB on my ATI 2600 XT PCIe 1.0 x16 video card.
It looks like scaling the picture to fill the screen pushes the resolution past 1080p, and the picture then needs to be anti-aliased during Blu-ray playback; if you don't have enough RAM you will see aliasing like I do, which goes away if you don't scale the screen.
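For what it's worth, a back-of-envelope calculation shows how quickly frame sizes add up in a 256 MB card. The numbers below are illustrative assumptions (32-bit surfaces, a hypothetical 2560x1440 upscaled intermediate), not anything specific to the 2600 XT or UVD:

```python
# Back-of-envelope frame-buffer arithmetic. Assumptions for
# illustration only: 32-bit RGBA surfaces and a hypothetical
# 2560x1440 upscaled intermediate, not ATI/UVD specifics.

BYTES_PER_PIXEL = 4  # assume 32-bit surfaces

def frame_mb(width, height, bpp=BYTES_PER_PIXEL):
    """Size of one uncompressed frame in MiB."""
    return width * height * bpp / (1024 * 1024)

print(round(frame_mb(1920, 1080), 1))  # one 1080p frame: ~7.9 MB
print(round(frame_mb(2560, 1440), 1))  # one upscaled frame: ~14.1 MB
```

A player keeps several decoded reference frames plus scaling intermediates resident at once, so these per-frame figures multiply quickly against a 256 MB budget.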


post #8 of 11 Old 11-26-2009, 02:30 AM
Joe Bloggs (AVS Special Member)
Quote:
Originally Posted by Mr.D

Most film scanners are natively 4K or higher. Even if the scans come off as 2K, they have usually been scanned initially at 4K and downsampled to 2K.

It's already anti-aliased; however, there is usually still some aliasing at 2K (even 4K in certain situations) if you really look for it.
Why is 3K never used in film scanning/digital intermediates? (There are digital cameras that can shoot 3K.) Is it because it's not an even multiple of 2K or 4K, so it's harder to up/downscale cleanly, or doesn't produce as good results when scaled? Uncompressed 4K scans take 4x the data of 2K, and rendering could take around 4x as long, while 2K shown on a digital cinema screen is more likely to have visible pixels. So why is 3K never used? It would be higher resolution than 2K, yet use less data and storage, and should be quicker to render and process, than 4K.
post #9 of 11 Old 11-26-2009, 03:10 AM
Mr.D (AVS Special Member)
You're looking at it the wrong way round. Contrary to popular belief, the first standardized, commercially available film scanners (not some Heath Robinson internal creation) were 4K native.

I've never actually seen a film scanner that was natively 2K. When I started back in '95 they were already 4K.

Everyone started scanning at 4K, and then the implications of moving 4K data around a 90s-spec network to production deadlines meant something had to give. 2K was about as low as they could go while preserving reasonable picture quality. 2K was originally referred to as "half res" and 4K as "base res".

(I suspect 3K probably doesn't look any sharper, but it means you have to carry more data around for roughly the same image quality as 2K.)
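The data-volume point is easy to check against the usual full-aperture scan dimensions, assuming the classic Cineon/DPX packing of three 10-bit channels into one 32-bit word (4 bytes per pixel). These are the conventional figures, not numbers quoted in this thread:

```python
# Per-frame sizes for the usual full-aperture scan resolutions,
# assuming classic Cineon/DPX packing: three 10-bit channels in
# one 32-bit word, i.e. 4 bytes per pixel. Conventional figures,
# given here for illustration.

BYTES_PER_PIXEL = 4

sizes = {"2K": (2048, 1556), "3K": (3072, 2334), "4K": (4096, 3112)}

for name, (w, h) in sizes.items():
    mb = w * h * BYTES_PER_PIXEL / (1024 * 1024)
    print(f"{name}: {mb:.1f} MB/frame")
```

Since 3K is 1.5x the linear resolution of 2K, it carries exactly 2.25x the data per frame, while 4K carries exactly 4x.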

post #10 of 11 Old 11-26-2009, 06:19 AM
John Mason (AVS Addicted Member)
Can't follow the Wikipedia interpretation of anti-aliasing, but Arri engineer Dr. Hans Kiening provides a good explanation of both aliasing and anti-aliasing, with photos and graphs, in his PDF tutorial, starting on page 12. The paper recently appeared in slightly different form in SMPTE's Motion Imaging Journal. -- John
post #11 of 11 Old 11-28-2009, 09:16 PM
Kilian.ca (AVS Special Member)
Quote:
Originally Posted by 8:13

I read on Wikipedia that anti-aliasing is when a higher-resolution image is fit to a smaller resolution.

The video card's frame buffer/RAM needs to be large enough for the card to do anti-aliasing properly.

Quote:
Originally Posted by 8:13

...It looks like scaling the picture to fill the screen pushes the resolution past 1080p, and the picture then needs to be anti-aliased during Blu-ray playback; if you don't have enough RAM you will see aliasing like I do, which goes away if you don't scale the screen.
No, you're only scaling from sub-1080 (around 800 lines for scope) up to 1080 lines, that is, from a lower to a higher resolution, which is the opposite of what the first post said about anti-aliasing. With upscaling you'd expect to see more artefacts. I don't know the precise mechanism of hardware rendering in upscaling, but discussion of computer graphics card capabilities isn't really within the scope of this section of the forum. In any case, I presume most people watch in OAR without zooming.

Audiosceptics accept audio trials using 25 people. A recent Oxford study with over 353,000 patient records from 639 separate clinical trials shows for every 1,000 people taking diclofenac or ibuprofen there would be 3 additional heart attacks, 4 more cases of heart failure and 1 death every year.
