Presently, nearly all flat-panel displays have worse motion resolution than a CRT from the perspective of full-framerate (60fps @ 60Hz) video gaming. A CRT always looks really fluid for this.
Plasma is probably a reasonable bet for gaming, if you don't mind a slight input lag -- not important for solo gaming if you can't notice it. Some high-end TVs, such as the Elite LCD or one of the Sony "XR 960" / Samsung "CMR 960" displays, sometimes have a mode where motion interpolation is disabled but the scanning backlight stays enabled, allowing approximately a 75% motion-blur reduction relative to a regular LCD without using motion interpolation. However, such a display will flicker quite a lot (like a 60Hz CRT), since most of the video-game-compatible scanning-backlight modes operate only at 60Hz, and it will cost a lot more than a plasma display.
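To put a rough number on that 75% figure: perceived motion blur on a sample-and-hold display is approximately pixel visibility time multiplied by how fast your eye is tracking motion across the screen. Here is a minimal back-of-envelope sketch; the 25% backlight duty cycle (~4ms) is an assumed illustrative value, not a published spec for any of those TVs:

```python
# Rough back-of-envelope for eye-tracking motion blur on a 60Hz display.
# Assumption: perceived blur (in pixels) ~= pixel visibility time * panning speed.

def blur_px(persistence_s: float, speed_px_per_s: float) -> float:
    """Approximate length of the blur trail, in pixels."""
    return persistence_s * speed_px_per_s

speed = 960.0                      # object panning at 960 pixels/second
sample_and_hold = 1.0 / 60.0       # regular LCD: pixel visible the full ~16.7ms frame
strobed = sample_and_hold * 0.25   # assumed scanning backlight: lit ~25% of the frame (~4.2ms)

print(f"Regular LCD:        {blur_px(sample_and_hold, speed):.1f} px of blur")
print(f"Scanning backlight: {blur_px(strobed, speed):.1f} px of blur")
print(f"Reduction:          {1 - strobed / sample_and_hold:.0%}")
```

The same arithmetic also explains the flicker: to cut blur by ~75%, the backlight has to be dark for ~75% of every 60Hz refresh, which is exactly what makes it look like a 60Hz CRT.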
Once input lag is low enough, further improvements really only matter during competitive gaming. Input lag, even when the human can't notice it, can still be a disadvantage during fast-reaction network gaming. Like the 100-meter race at the Olympics, every millisecond matters: the person who shoots 1 millisecond sooner is the one who wins. Even though you cannot notice the millisecond, the game's calculations will, and the person who shoots first wins, even if it's only by one millisecond. Game calculations do not bias/handicap players who have low input lag, so low input lag is a way to gain some advantage between two very good video gamers in a videogaming competition.
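A toy model of why the game, not the eye, is the arbiter: if two equally fast players press fire at the same physical instant, the engine still registers the input from the lower-lag setup first. Everything below is hypothetical for illustration, not any real game's netcode:

```python
# Toy model of duel resolution: the engine simply compares when each
# player's "fire" input registers. Names and values are hypothetical.

from dataclasses import dataclass

@dataclass
class Shot:
    player: str
    reacted_at_ms: float   # when the player physically pressed the trigger
    input_lag_ms: float    # display + controller + processing latency

    @property
    def registered_at_ms(self) -> float:
        # The engine only sees the input after the whole latency chain.
        return self.reacted_at_ms + self.input_lag_ms

# Both players react at the exact same instant; only their input lag differs.
a = Shot("low-lag player", reacted_at_ms=0.0, input_lag_ms=17.0)
b = Shot("high-lag player", reacted_at_ms=0.0, input_lag_ms=18.0)

winner = min((a, b), key=lambda s: s.registered_at_ms)
print(f"{winner.player} wins by {abs(a.registered_at_ms - b.registered_at_ms):.0f} ms")
```

Even a 1ms difference, invisible to both players, deterministically decides the exchange in this model.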