Originally Posted by ITemplate
Well, I'm using Blu-ray ISOs, not MKVs, so that might change how easy it is?
Second, I'm not an expert at all, and I can't say I understand quite what you are saying. But I can see a Wiki page
that has a nice table showing what Blu-ray supports.
Indeed, that format is listed. Also, I have downloaded BDInfo, which says that my BD is "1080i / 29.970 fps / 16:9". This information is extracted from the Blu-ray disc using the "CLIPINF" files, without the graphics card being involved at all.
So...either you are wrong or they are. Or have I got it all mixed up so that you are both right?
EDIT: Just read your post again. Trying to understand. You say "So the final video stream sent to the display is a sequence of frames with interval 16.68ms, i.e., 60 frames per second
". But you also say "BD movie is always either 23.976 frames/sec or 24 frames/sec
". How can both of those statements be true?
OK, I understand your point. Wiki
("Frame rate") covers this. A simple conversion: "60 fields per second" = "30 frames per second". When you write resolution / progressive-or-interlaced / rate in the format "1080[p/i]xx", the number xx is by convention:
- xx = frame rate
for progressive content (e.g. 1080p24), in "fps"
- xx = field rate
for interlaced content (e.g. 1080i60); "fps" is still used with "f" = "frame", so the real frame rate is xx/2 fps (e.g. "1920 pixels x 1080 pixels, 29.970 fps, interlaced"). The best practice is to always add the unit "fps" and write it this way: "1080i/30fps", "1080i@30fps", "1080 interlaced 30fps", etc. (The same wiki says, "Some manufacturers will list field rate for interlaced material, but this is incorrect industry practice. To avoid confusion, only FRAME rates should ever be listed.")
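The convention above can be made concrete with a small sketch. This is not any real tool's parser, just a toy function (name and return format are my own) that interprets labels like "1080p24" and "1080i60" and reports the true frame rate:

```python
import re

def rates(label):
    """Interpret a label like '1080p24' or '1080i60'.

    By the convention described above, the trailing number is a frame
    rate for progressive content but a FIELD rate for interlaced
    content, so the true frame rate of '1080i60' is 30 fps.
    """
    m = re.fullmatch(r"(\d+)([pi])(\d+(?:\.\d+)?)", label)
    if not m:
        raise ValueError(f"unrecognized label: {label}")
    scan, rate = m.group(2), float(m.group(3))
    if scan == "p":
        return {"fields_per_s": None, "frames_per_s": rate}
    return {"fields_per_s": rate, "frames_per_s": rate / 2}

print(rates("1080p24"))  # 24 frames/s, no fields
print(rates("1080i60"))  # 60 fields/s = 30 frames/s
```

So "1080i60" and "1080i/30fps" describe the same signal; only the unit differs.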
There are two types of video:
- Film-based, shot by a movie camera, usually at 24 frames/s.
- Video-based, shot by a video camera, at 60 fields/s (NTSC), 50 fields/s (PAL), or 30 or 60 frames/s (recent commercial camcorders). "Dire Straits Alchemy Live" is a typical example. (So the word "video" has two meanings: video in general, and video shot by a video camera.)
Video content at 60/50 fields/s is interlaced
(only half of the complete frame [i.e. the top or the bottom field] is recorded at a time, at a uniform interval of 1/60 s ≈ 16.7 ms — 16.68 ms at the exact NTSC rate of 59.94 fields/s — or 1/50 s = 20 ms). At playback of such content, a whole frame is created from neighboring fields by guesswork (video-mode deinterlacing:
weave, bob, adaptive, motion adaptive, vector adaptive, etc.) and the graphics card outputs a video stream at 60/50 frames/s
to the display. There is no perfect deinterlacing because the original information is incomplete.
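To see why video-mode deinterlacing is guesswork, here is a toy sketch (my own simplification, not any real player's implementation) of the two simplest methods, weave and bob, on frames represented as lists of rows:

```python
def weave(top_field, bottom_field):
    """Weave: interleave two consecutive fields into one full frame.
    Perfect for static scenes, but moving objects show 'combing'
    because the two fields were captured ~16.7 ms apart."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)  # even line, from the top field
        frame.append(b)  # odd line, from the bottom field
    return frame

def bob(field):
    """Bob: build a full frame from a single field by line-doubling.
    No combing on motion, but vertical resolution is halved.
    (A real bob deinterlacer would interpolate, not just repeat.)"""
    frame = []
    for row in field:
        frame.append(row)
        frame.append(list(row))  # duplicated line stands in for the missing one
    return frame

top = [[1, 1], [3, 3]]     # lines 0 and 2 of a 4-line frame
bottom = [[2, 2], [4, 4]]  # lines 1 and 3, captured a field-interval later
print(weave(top, bottom))  # full frame: lines 1, 2, 3, 4
print(bob(top))            # full frame with lines 1 and 3 doubled
```

Both methods have to invent half of every output frame; the fancier methods (motion adaptive, vector adaptive) just make better guesses.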
There is also "film-mode deinterlacing
". This mode is applied to movies on DVD and in SD/HD broadcasts (the original film, a progressive video at 24/25 fps, is converted to an interlaced stream at 60/50 fields/s to be stored in DVD or broadcast format: how
; perhaps you've heard that DVD is always interlaced, and that's true) to restore the original film at 24/25 fps. Unlike video-mode deinterlacing, film-mode deinterlacing can restore the original film perfectly. Keywords related to this topic are "inverse telecine (IVTC)
" (a synonym for film-mode deinterlacing) and "pulldown detection
" (detecting the correct mode, film or video, and the pulldown or cadence pattern). Movies on BD are always progressive, so (film-mode) deinterlacing is unnecessary.
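The reason film-mode deinterlacing can be perfect is that 2:3 pulldown only repeats fields; it never discards anything. A toy sketch (my own, assuming an unbroken cadence starting on a "2" frame, and ignoring top/bottom field order) of NTSC 2:3 pulldown and its inverse:

```python
def pulldown_23(frames):
    """2:3 pulldown: spread 24 fps film frames over 60 fields/s by
    having frames contribute alternately 2 and 3 fields each
    (4 film frames -> 10 fields -> 1/6 second of 60i video)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

def ivtc(fields):
    """Inverse telecine: collapse the 2:3 cadence back to the original
    progressive frames by skipping the repeated fields."""
    frames, i, n = [], 0, 0
    while i < len(fields):
        frames.append(fields[i])
        i += 2 if n % 2 == 0 else 3
        n += 1
    return frames

film = ["A", "B", "C", "D"]   # 4 frames of 24 fps film
fields = pulldown_23(film)
print(fields)                 # ['A','A','B','B','B','C','C','D','D','D']
print(ivtc(fields) == film)   # True: the film is restored exactly
```

The hard part in practice is "pulldown detection": real discs and broadcasts break the cadence at edits, so a player must first detect where the 2:3 pattern starts before IVTC can be applied.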
These are basic facts that everybody should know, but somehow a good summary is hard to find. If you read the wiki articles, you will probably lose the global picture quickly: too many technical terms, and explanations that are too broad and too superficial.