Reread my post and you’ll see I was already aware that film captures temporal data. I think it was amusing to you because what I said went over your head; had you understood me, you wouldn’t be repeating it back to me. So I’ll explain it, since by now I’m very used to this. Video games render each frame as if it were taken by a camera with a 1/∞ shutter speed: an instantaneous moment in time. Things like water drops and fan blades appear frozen and fully detailed. Stringing 24 of those frames together in a second makes a very disjoint (laggy) animation. A much smoother view requires 30-60 frames per second; beyond that, the motion no longer appears disjoint and nothing distracts from the sense of continuous animation. Film, on the other hand:
Anything moving will be a blur, but animated at 24fps it seems natural. Each film frame represents light through time with that blur, integrating up to 1/24th of a second (in practice about 1/48th, with a typical 180° shutter). As a still, such a frame lacks discernible detail; it would make a bad photograph or desktop wallpaper, because we’re viewing light smeared across time. Yet in a 24fps animation of those blurs the motion reads as fluid, and no one thinks it looks disjoint or too slow. If these facts weren’t true, we would be using much higher frame rates now that the industry is strong, compared to the early 20th century when film was emerging.
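To make that concrete, here’s a minimal sketch of the difference. It’s my own illustration, not code from any real engine: render_scene is a made-up stand-in for a game renderer that returns a perfectly sharp instant, and film_like_frame approximates a film shutter by averaging many of those instants across the exposure.

```python
import numpy as np

def render_scene(t, size=64):
    """Hypothetical instantaneous render: a bright dot sweeping across the
    screen, frozen perfectly sharp at time t (the game's 1/inf shutter)."""
    frame = np.zeros((size, size))
    x = int(t * size * 4) % size   # dot crosses the screen 4 times per second
    frame[size // 2, x] = 1.0
    return frame

def film_like_frame(start, exposure, subsamples=32, size=64):
    """Approximate a real shutter: average many instants over the exposure,
    smearing the motion into a blur the way a film frame does."""
    instants = np.linspace(start, start + exposure, subsamples)
    return np.mean([render_scene(t, size) for t in instants], axis=0)

fps = 24.0
interval = 1.0 / fps        # 1/24 s between frames
exposure = interval / 2.0   # ~1/48 s of light gathering (180-degree shutter)

sharp = render_scene(0.0)                  # one lit pixel: frozen detail
blurred = film_like_frame(0.0, exposure)   # a dim streak: light through time
print(np.count_nonzero(sharp), "pixel lit in the instant frame")
print(np.count_nonzero(blurred), "pixels lit in the film-like frame")
```

Averaging the instants is exactly the “light through time” integration a film frame performs: the game-style frame lights a single pixel, while the film-like frame smears the same motion into a streak.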
I know, I’m just a Homo sapiens. All those other forums always bring me to this conclusion.
I’ve played games before at 200fps+ on a 60 Hz CRT and “noticed” the “clarity” of it, because a true 60fps was being achieved and there were absolutely no troughs down to 10 or 15fps. But personally I believe 40fps is good enough not to impact player response. FPS is still an average; it doesn’t mean every frame arrives on a fixed, even cadence the way film frames do.
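A quick illustration of that last point, using made-up frame times rather than measurements from any real game: two runs can report the identical average FPS while one of them stutters badly.

```python
# Two runs, same total time and frame count, so the same "average FPS",
# but very different pacing. Frame times are in milliseconds.
even   = [25.0] * 40                   # steady 40 fps
uneven = [10.0] * 36 + [160.0] * 4     # mostly fast, with 4 big hitches

for name, times in [("even", even), ("uneven", uneven)]:
    avg_fps = 1000.0 * len(times) / sum(times)
    worst = max(times)
    print(f"{name}: avg {avg_fps:.1f} fps, worst frame {worst:.0f} ms "
          f"({1000.0 / worst:.1f} fps at the trough)")
```

Both runs average 40 fps, but the uneven one dips to an effective 6 fps during its hitches, which is why a frame-time plot tells you more than an FPS counter.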