It is just a fancy word for an "error" or "anomaly" in the picture. There are different kinds of artifacts caused by different kinds of processing. For example, interlaced images often suffer from a "flickering" artifact, since only half of the image is updated during each screen refresh. Computer graphics often suffer from stair-step "jaggies" caused by the limited resolution of the pixel grid. Film suffers from "motion blur" caused by the relatively slow frame rate of 24 frames per second. (Do not believe people who tell you humans cannot distinguish anything faster than 24 fps; hardcore gamers know better.)
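The stair-step "jaggies" come from snapping a smooth line onto an integer pixel grid. A minimal sketch using Bresenham's line algorithm (the classic rasterizer) shows the effect: the y coordinate can only advance in whole-pixel jumps, so a shallow diagonal comes out as visible steps.

```python
def rasterize_line(x0, y0, x1, y1):
    """Bresenham's line algorithm: picks the nearest integer pixel at
    each step, which is exactly what produces stair-step 'jaggies'."""
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return points

# A shallow diagonal: y advances in discrete jumps (steps), not smoothly.
print(rasterize_line(0, 0, 8, 3))
```

Antialiasing fights this by shading partially covered pixels instead of making an all-or-nothing choice.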
Every type of existing image encoding and display system suffers from one type of artifact or another. The quest is to minimize artifacts without using up a ridiculous amount of resources (bandwidth, power, processing power, cost, storage, etc.). Engineering is always a game of trade-offs.
[This message has been edited by dagman (edited 05-10-2001).]
Artifacts are undesired visual elements that were never intended to be part of an image. The most "popular" (most talked-about) artifacts in home theater are motion artifacts and compression artifacts.
Motion artifact is usually related to interlacing: mild artifacts introduced by interlacing itself, or much worse artifacts from poor deinterlacing. Depending on the type of problem, you get different visual effects. This alone would take pages to expound on... but for movies, it boils down to 3:2 pulldown (telecine) issues. Please see my post in the following thread: http://www.avsforum.com/ubb/Forum11/HTML/013752.html
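For reference, 3:2 pulldown is just a repeating cadence: alternate film frames are held for two or three video fields so that 4 film frames fill 10 fields (24 fps becomes 60 fields per second). A minimal sketch of the cadence:

```python
def telecine_32(frames):
    """3:2 pulldown sketch: alternate film frames are held for 2 or 3
    interlaced video fields, so 4 film frames -> 10 fields
    (24 fps film -> 60 fields/s NTSC video)."""
    fields = []
    for i, frame in enumerate(frames):
        hold = 2 if i % 2 == 0 else 3   # the repeating 2-3 cadence
        fields.extend([frame] * hold)
    return fields

# Four film frames A B C D become ten fields: AA BBB CC DDD
print(telecine_32(["A", "B", "C", "D"]))
```

A good film-mode deinterlacer detects this cadence and reassembles the original frames; a poor one combines fields from two different film frames, producing the combing/judder artifacts discussed above.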
Compression artifacts are caused by lossy compression (the discrete cosine transform is currently the flavor of the day), which shrinks video data by discarding information from areas where it judges the loss is least likely to be noticed. However, at the extreme compression levels used by DBS (mini-dish satellite), these artifacts range from nuisance to major annoyance. They appear as blocky patches resembling pieces of fiberglass, usually in areas of "solid" color.
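The mechanism can be shown in miniature. Below is a hedged sketch (a naive 1-D discrete cosine transform, the one-dimensional analog of the 8x8 DCT used by JPEG/MPEG; the sample values and quantizer step are made up): transform a block of pixel values, coarsely quantize the coefficients (the lossy step), and invert. The reconstruction no longer matches the original, and when every 8-pixel block errs independently, the errors show up as visible block boundaries.

```python
import math

def dct(block):
    """Naive DCT-II of a block of samples (1-D analog of the 8x8 DCT)."""
    n = len(block)
    return [sum(x * math.cos(math.pi * (i + 0.5) * k / n)
                for i, x in enumerate(block)) * (2 / n)
            for k in range(n)]

def idct(coeffs):
    """Inverse transform (DCT-III) matching dct() above."""
    n = len(coeffs)
    return [coeffs[0] / 2 + sum(coeffs[k] * math.cos(math.pi * (i + 0.5) * k / n)
                                for k in range(1, n))
            for i in range(n)]

def quantize(coeffs, step):
    """Coarse quantization: rounding away detail is the lossy step
    that creates compression artifacts."""
    return [round(c / step) * step for c in coeffs]

block = [52, 55, 61, 66, 70, 61, 64, 73]        # one row of pixel values
rough = idct(quantize(dct(block), step=20))      # heavy quantization
print([round(v, 1) for v in rough])              # no longer matches the input
```

With a gentle quantizer the loss is invisible; crank the step up (as heavily compressed satellite feeds effectively do) and the reconstruction flattens out, which is why the artifacts are most obvious in smooth, "solid" color areas.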
Dagman, you're absolutely right, the human eye IS capable of registering more than 24 images per second, but not MUCH more. Indeed, even 24 fps film is projected at 48 flashes per second (2 flashes per frame) in theaters to reduce the flicker of 24 fps material. Studies of monitor flicker have shown that refresh rates above 60 Hz make visible flicker disappear, so movie projectors running at 3 flashes per frame (72 flashes per second) may improve the flicker situation even further in theaters. Indeed, on my CRT projector at home I have noticed BETTER than theater smoothness at a 72 Hz refresh, though some stuttering of motion is still noticeable simply because the film source was shot at 24 fps.
First, motion blur is not technically an artifact; it's an optical property (movement of a light source while the shutter of a camera is open, causing a blurred image on film).
Our eye has a certain refresh rate as well. If you have watched a spinning wheel slow down, you'll see it first as a complete blur; then it will appear to spin in one direction, then the other, until you start to see the clear outline of the spokes. This is a function of the wheel's position as it spins relative to your eye's natural refresh rate, until the movement is slow enough to resolve the solid object.
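The direction reversals described above are temporal aliasing, and the arithmetic is simple enough to sketch. Below is a hedged toy model (the spoke spacing and rotation rates are made-up numbers): because spokes are indistinguishable from one another, the perceived rotation per sample is the true rotation folded into a half-spacing window, so a fast wheel can appear to creep backwards.

```python
def apparent_step(rot_per_sample_deg, spoke_spacing_deg=45.0):
    """Perceived rotation per sample for a wheel whose spokes look
    identical every `spoke_spacing_deg` degrees. The true step is
    aliased into the range (-spacing/2, +spacing/2]."""
    step = rot_per_sample_deg % spoke_spacing_deg
    if step > spoke_spacing_deg / 2:
        step -= spoke_spacing_deg       # alias: wheel appears to run backwards
    return step

# A wheel slowing down, sampled at a fixed "refresh rate":
for true_step in (44, 40, 30, 23, 10, 4):
    print(true_step, "->", apparent_step(true_step))
```

As the true step shrinks, the perceived motion flips from backwards to forwards and finally matches reality, which is the sequence you see as the wheel slows.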
In many cases motion blur is a desirable feature, as it fools the eye into perceiving smooth motion rather than 24 completely still images per second. Indeed, if you look at still frames from motion sequences in Pixar's films (Toy Story, A Bug's Life, etc.), you will notice motion blur effects added to enhance realism (this does not occur naturally in 3D rendering software; it must be added deliberately).
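One common way renderers add that blur deliberately is accumulation: sample the scene at several instants across the simulated shutter interval and average the results. A hedged one-dimensional sketch (averaging an object's position rather than full images, and `position_at` is a hypothetical function, not any renderer's API):

```python
def motion_blurred_sample(position_at, t0, t1, subsamples=8):
    """Accumulation-style motion blur sketch: average several sub-frame
    samples across the shutter interval [t0, t1), mimicking what a real
    camera shutter does optically. `position_at` is a hypothetical
    function mapping time -> object x-position."""
    total = 0.0
    for i in range(subsamples):
        t = t0 + (t1 - t0) * i / subsamples
        total += position_at(t)
    return total / subsamples      # a smear across the interval, not an instant

# Object moving at 240 units/s over one 24 fps frame (shutter open 1/24 s):
blurred = motion_blurred_sample(lambda t: 240 * t, 0.0, 1 / 24)
print(blurred)   # 4.375: the average over the interval, not a frozen instant
```

A real renderer averages whole rendered images the same way (or uses per-pixel velocity buffers as a cheaper approximation); the principle of integrating over the shutter interval is identical.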
Regarding fluidity in 3D games, one issue is the variability in scene complexity, which necessitates some "headroom" in your video card: if your card cruises along at 30 fps in simple scenes, it will get choppier in complex scenes with multiple objects/characters. But if you cruise along at 60 fps and only occasionally slow to 30 fps, you'll maintain fluidity throughout. That said, the lack of fluidity in 3D FPS games may have more to do with the lack of motion blur than with actual frame rate; as more 3D accelerators and games start to support hardware motion blur, even 30 fps will appear perfectly smooth. I think frame rates have been overemphasized; I predict the biggest gain in fluid motion (or at least the perception of it) in 3D games will come with the widespread adoption of hardware motion blur (along with hardware antialiasing at modest resolutions, rather than just brute-force resolution), at which point the only remaining issue will be enhancing the realism of object shapes, textures, and animation.
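The headroom argument comes down to per-frame time budgets. A hedged sketch (the per-frame render times are invented numbers for illustration): a card averaging ~30 fps has no slack when a complex scene spikes past the 33 ms budget, while a card averaging ~60 fps absorbs even a 2x spike without missing a beat.

```python
def dropped_frames(frame_times_ms, budget_ms):
    """Count frames whose render time blows the per-frame budget
    (e.g. ~33.3 ms for a 30 Hz target, ~16.7 ms for 60 Hz)."""
    return sum(1 for t in frame_times_ms if t > budget_ms)

# Hypothetical per-frame render times in milliseconds:
fast_card = [15, 16, 15, 17, 16]   # ~60 fps: even a 2x spike fits in 33 ms
slow_card = [33, 34, 31, 45, 38]   # ~30 fps average, but spikes cause stutter

print(dropped_frames(fast_card, 33.3))   # stays fluid throughout
print(dropped_frames(slow_card, 33.3))   # several missed frames = visible hitching
```

This is why average frame rate alone is misleading: fluidity is governed by the worst frames, not the mean.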