An animated fight scene
On its own the brain perceives the motion of an edge, but only if it moves not too far, and not too fast, from the first frame to the second. Like persistence of vision, this is a real effect, called apparent motion. It's interesting, but it's not the explanation I like so much. Classic cel animation—of the old ink-on-celluloid variety—relies on the apparent-motion phenomenon. The old animators knew intuitively how to keep the successive frames of a movement inside its "not too far, not too fast" boundaries. If they needed to exceed those limits, they had tricks to help us perceive the motion—like actual speed lines and a poof of dust to mark the rapid descent of Wile E. Coyote as he steps unexpectedly off a mesa in hot pursuit of that truly wily Road Runner.

Read the secret: "Why Do Movies Move?," Edge, 2012 ("What is your favorite deep, elegant or beautiful explanation?").
Exceed the apparent-motion limits—without those animators' tricks—and the results are ugly. You may have seen old-school stop-motion animation—such as Ray Harryhausen's classic sword-fighting skeletons in Jason and the Argonauts—that is plagued by an unpleasant jerking motion of the characters. You're seeing double, at least—several edges of a skeleton at the same time—and correctly interpret it as motion, but painfully so. The edges stutter, or "judder," or "strobe" across the screen—the words reflect the pain inflicted by staccato motion.
Why don't live-action movies stutter? (Imagine directing Uma Thurman to stay within "not too far, not too fast" limits.) Why don't computer-animated movies à la Pixar judder? And, for contrast, why do videogames, alas, strobe horribly today? All are sequences of discrete frames. There's a general explanation that works for all three. It's called "motion blur," and it's simple and pretty.
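The idea can be made concrete with a minimal sketch. A common way renderers approximate motion blur is temporal supersampling: instead of drawing the object at a single instant, draw it at several instants spread across the frame's "shutter" interval and average the results, so a fast edge smears along its path rather than strobing from one position to the next. The function names and the one-dimensional "image" here are hypothetical, chosen only to illustrate the averaging step.

```python
def render_dot(position, width=32):
    """Render one frame of a bright 'edge' at a single instant:
    a 1-D strip of pixels with a spike at the dot's position."""
    frame = [0.0] * width
    frame[int(position) % width] = 1.0
    return frame

def motion_blurred_frame(x_start, x_end, samples=8, width=32):
    """Average `samples` sub-frame renders taken across the shutter
    interval, smearing the dot's brightness along its path."""
    frame = [0.0] * width
    step = (x_end - x_start) / samples
    for i in range(samples):
        pos = x_start + i * step          # instant within the interval
        frame[int(pos) % width] += 1.0 / samples
    return frame

# A dot jumping 8 pixels in one frame: rendered at a single instant it
# is one hard spike (a strobing edge); with blur, the same total
# brightness is spread across the 8 pixels it traversed.
sharp = render_dot(4)
blurred = motion_blurred_frame(4, 12)
```

The blurred frame carries the same total light as the sharp one, just distributed along the motion path; that streak is exactly what a real camera's open shutter records, which is why live-action film does not judder.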