
Human Eye's Oscillation Rate Determines Smooth Frame Rate

jones_supa writes: It should be safe to conclude that humans can see frame rates greater than 24 fps. The next question is: why do movies at 48 fps look "video-y," and why do movies at 24 fps look "dreamy" and "cinematic"? Why are games more realistic at 60 fps than at 30 fps? Simon Cooke from the Microsoft (Xbox) Advanced Technology Group has an interesting theory to explain all of this. Your eyes oscillate by a tiny amount, at a rate ranging from 70 to 103 Hz (83.68 Hz on average). So here's the hypothesis: the ocular microtremors wiggle the retina, allowing it to sample at approximately 2x the resolution of its sensors. Showing someone pictures that vary at less than half the rate of the oscillation means we're no longer receiving a signal that changes fast enough for the supersampling operation to happen. So we're throwing away a lot of perceived-motion data, and a lot of detail as well. Some of the detail can be restored with temporal antialiasing and by simulating real noise, but ideally Cooke suggests going with a high enough frame rate (over 43 fps) and, if possible, a high resolution.
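As a rough illustration of where that "over 43 fps" figure comes from, you can treat the tremor as a sampling clock and apply a Nyquist-style bound: the display has to change faster than half the tremor rate for the supersampling effect to survive. A minimal back-of-the-envelope sketch in Python, using only the frequencies quoted above (the helper name is mine, not Cooke's):

    # Back-of-the-envelope: lowest frame rate that still changes faster than
    # half the ocular microtremor frequency (Nyquist-style bound assumed above).
    def min_smooth_fps(tremor_hz):
        return tremor_hz / 2.0

    for tremor_hz in (70.0, 83.68, 103.0):  # range and average quoted in the summary
        print(f"tremor {tremor_hz:6.2f} Hz -> frame rate should exceed ~{min_smooth_fps(tremor_hz):.1f} fps")

    # The average case (83.68 Hz) gives ~41.8 fps, in the same ballpark as
    # Cooke's suggested "over 43 fps".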
  • Bring on HFR (Score:5, Interesting)

    by Anonymous Coward on Wednesday December 24, 2014 @12:02PM (#48666999)

    I, for one, am sick of slow (seconds-long!) pans across scenery that *still* end up with judder and motion blur.

    HFR isn't a gimmick like migraine-inducing stereoscopic "3D"; it's more akin to adding color instead of relying solely upon greyscale for film presentation.

    Like all tools, I'm sure it can be used for both good and evil. Blame evil, jump-cutting directors if the dark side is channeled.

    • I personally liked the effect in The Hobbit; it made the movie seem much more "real", even in the ridiculously goofy parts like Radagast and the Disney-style adventure through the Misty Mountains.

  • It's in the image (Score:5, Interesting)

    by Diddlbiker ( 1022703 ) on Wednesday December 24, 2014 @12:08PM (#48667029)
    Movies tend to be shot at around a 1/50 s shutter speed, and that creates motion blur. The motion blur actually helps us see the animation as smooth, even at "only" 24 fps. Games, on the other hand, are razor sharp and will hence look much more like a staccato sequence of images than like an animation.

    Or so I was told by a moviemaker.
    • by itzly ( 3699663 )
      Motion blur isn't perfect, though, because it only works if your eyes don't track moving objects, which is something that happens a lot in action games.
    • Also, video games tend to have no depth of field. How could they? The whole screen needs to be in focus so that you can look at any part of it. Movies have the luxury of deciding what you should pay attention to and focusing (literally) on that.
    • Movies don't look smooth. They look like a staccato of motion-blurred still frames. 24fps was simply the minimum (read: cheapest) frame rate that most of the population would perceive as mostly motion-like. Motion blur helps, but it hardly makes up for the deficiency.

      Technology has advanced quite a bit since the advent of motion picture cameras, to the point that the "film" is pretty far from the most expensive line item in the budget. Why not record at a more natural frame rate?

      The conceit of the m

      • Re: (Score:3, Interesting)

        The mayonnaise you like is the mayonnaise you grew up with ...

        Films are shot at 24 fps but displayed [in theaters] at 48 fps; each frame is displayed twice: f0, black, f0, black, f1, black, f1, black, f2, ...

        According to one study, when test audiences were shown true 1-to-1 48 fps film, they actually preferred the 24 fps.

        The same is true for audio. Those who grew up on 128 kbps MP3s preferred them over higher-fidelity formats.

        The human optic nerve has [surprisingly] low bandwidth. I worked for a compan

    • There has been a trend in the last decade or so for the shutter speeds to be faster, plus the heavy use of CGI has resulted in razor-sharp images where motion blur would have been better.

      To demonstrate this, and that it spans genres, Blade Trinity and Drumline come to mind immediately. In the former case, the CGI was too razor-sharp when viewed in a cinema, and it made it difficult to watch. In the latter, there are scenes where the band are performing in bright daylight, and the fast shutter speeds made th

  • ... is why cinema frame rates seem generally comfortable to watch, while video game frame rates at around the same number are problematic?
    • by Rashkae ( 59673 )

      I don't really know, but I can throw out a couple of educated guesses from experience. There are two reasons:

      1. Motion blur. This is even simulated in high-end animated movies. (Look at a scene in a movie like Shrek or How to Train Your Dragon, and watch frame by frame where there is motion.)

      2. Consistency. 24fps looks OK so long as it is consistent, either because of how the brain receives the image naturally, or just as a matter of conditioning since we've been watching movies at 24fps for so long. I know when I

      • 3) History

        Film has traditionally not been HFR and thus people used to classic film are going to bitch for a while.

    • Simple - game frame rates aren't consistent.

      The effect of showing some frames for 16ms and others for 33ms (what will happen if your game is running at somewhere between 30 and 60 fps) is much more jarring to you than the effect of showing all frames for 42ms.

      This is why hitting 60fps (or slightly above) is the magic number at which it all looks smooth again - at that point, all frames are rendered before the screen refreshes, and you get an absolutely smooth 60fps with 16ms frame times across the board. T

      • by Twinbee ( 767046 )
        Just to say that Nvidia's new Gsync will allow any frame rate to be displayed perfectly consistently, as long as you have a new Gsync monitor.
        • by Mashiki ( 184564 )

          Gsync is already DOA; VESA has adopted AMD's Freesync over it. [techpowerup.com] Freesync comes with no licensing charges, which basically means that compared to Gsync, which Nvidia wanted to charge for, they're screwed.

        • Not quite. What gsync/freesync do is allow the monitor to be notified when a frame has been produced, and to trigger a swap exactly then. That doesn't mean that all frames are displayed for an equal amount of time.

          Sure, it means that if one frame is late, it doesn't wait until the next vertical blank, but it does still mean that the previous frame is on screen for longer. If, for example, a series of frames takes 12, 14, 30, and 16 ms to render, normally you'll see the frame before first for 16 ms, the next
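          To put numbers on that, here is a small Python sketch (the 60 Hz refresh interval and the function names are my own assumptions; the 12/14/30/16 ms render times come from the example above) comparing how long each frame stays on screen with plain vsync versus an adaptive-sync style swap-on-completion:

          import math

          REFRESH_MS = 1000.0 / 60.0    # assumed 60 Hz panel: one refresh every ~16.7 ms
          render_ms = [12, 14, 30, 16]  # per-frame render times from the example above

          def present_times_vsync(render_ms):
              # With vsync, a finished frame waits for the next refresh boundary.
              finish, shown = 0.0, []
              for r in render_ms:
                  finish += r
                  shown.append(math.ceil(finish / REFRESH_MS) * REFRESH_MS)
              return shown

          def present_times_adaptive(render_ms):
              # Adaptive sync (the G-Sync/FreeSync idea): swap as soon as the frame is done.
              finish, shown = 0.0, []
              for r in render_ms:
                  finish += r
                  shown.append(finish)
              return shown

          for name, shown in (("vsync", present_times_vsync(render_ms)),
                              ("adaptive", present_times_adaptive(render_ms))):
              durations = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
              print(name, "on-screen durations (ms):", durations)

          # Either way, the frame before the slow 30 ms one stays up longer than the
          # others; adaptive sync just removes the extra wait for a refresh boundary.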

  • What about our eyes is oscillating?

    • by sribe ( 304414 ) on Wednesday December 24, 2014 @12:29PM (#48667195)

      What about our eyes is oscillating?

      The whole eye. Our eyes actually cannot detect a static edge, only transitions. The reason we can see non-moving objects is that the oscillations of the eye provide the transitions. There's a simple experiment from long ago which illustrates this vividly: put a black square on a white background, track a subject's eye motion and move that target with the eye motion so that the image is always hitting the retina at the same location, and voila, the subject cannot see that target.

      • by tlhIngan ( 30335 ) <slashdot.worf@net> on Wednesday December 24, 2014 @01:18PM (#48667547)

        The whole eye. Our eyes actually cannot detect a static edge, only transitions. The reason we can see non-moving objects is that the oscillations of the eye provide the transitions. There's a simple experiment from long ago which illustrates this vividly: put a black square on a white background, track a subject's eye motion and move that target with the eye motion so that the image is always hitting the retina at the same location, and voila, the subject cannot see that target.

        The other reason is that the "sensors" we have are quite poor - the eyeball itself is actually a very low resolution device - the high resolution center part of the eye covers such a narrow field of view that it would be practically useless as a fixed camera, while the peripheral vision is so low-res it's unusable.

        Instead, what happens is we evolved a gigantic amount of wetware to process the image into a high-resolution image we perceive - the brain does a lot of visual processing, and the eyes rapidly move (or oscillate) to move the sharp high-res center vision around to give you a much higher "virtual resolution" than the actual Mk. 1 Eyeball can achieve.

        Of course, this visual processing comes at a price - optical illusions abound because it's very easy to trick the wetware into seeing things that aren't there, because the information is often interpolated, shifted in time, etc.
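        The "higher virtual resolution" idea described above is essentially sub-pixel super-resolution, which is easy to sketch in one dimension. The signal, the half-pixel shift and the interleaving below are toy assumptions of mine, not anything from Cooke's post:

          import numpy as np

          # Toy 1-D "tremor supersampling": a coarse sensor reads a fine signal twice,
          # offset by half a sensor pixel, and the two readings are interleaved.
          fine = np.sin(np.linspace(0, 4 * np.pi, 400))  # the "scene" at high resolution
          coarse = fine[::8]        # one reading: every 8th sample (50 values)
          shifted = fine[4::8]      # second reading, shifted by half a "pixel" (50 values)

          supersampled = np.empty(coarse.size + shifted.size)
          supersampled[0::2] = coarse
          supersampled[1::2] = shifted   # interleaved: ~2x the sensor's native resolution

          print(coarse.size, "samples per reading ->", supersampled.size, "after interleaving")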

  • It would take hundreds of frames per second to truly fool the eye. We tend to have a long decay, which I believe offers a hardware solution for "where is it coming from?", but the human "attack" FPS is much higher than you might think.

  • This makes sense. One of the things that drives me nuts about those cheap Chinese no-name Android tablets is the display. They draw every other pixel in a checkerboard type fashion, and if your eye is totally still then you don't notice. However if you move your eye quickly back and forth you can clearly see that only half the pixels are drawn at a time. So there's something about the motion that doesn't allow enough processing time to smooth that out. It's amazing how much our visual processing smooth

  • The smoothest frame rate would be no frames. 2,073,600 analog tracks, one for each pixel of a 1080p display, would smoothly transition from one color to another. OK, so maybe the recording tape would be 3 feet wide. But that's a small price to pay for smooth video.
    • Re: (Score:2, Interesting)

      All recording media, even tapes and records, are digital if you look closely enough. There is a limit to how fine a change you can have even in a record groove. So the fact of the matter is, eventually digital will be able to surpass any conceivable analog source in sampling rate.

      • by Megane ( 129182 ) on Wednesday December 24, 2014 @01:45PM (#48667717)
        That's like saying laserdisc is digital "because it's got pits and non-pits". Except that the length of the pits and non-pits is very much analog. (It's a full-bandwidth FM signal driven to maximum overmodulation. VHS does a similar thing.) In other words, the digital-ness becomes analog if you look even closer.
          • Or the analog becomes digital if you turn it around. The point is that the only faithful reproduction of how a trumpet causes waves of compressed air is a trumpet. No series of microphone -> vacuum tubes -> magnets -> flexible diaphragm will produce exactly the same set of waves as the trumpet (or piano strings or vocal cords, etc.) You get to the point that the difference is not perceivable by whatever instrument you choose to measure with, and then you call it perfect. A great number of people choo

          • by qwak23 ( 1862090 )

            We could also make the argument that the sound from the trumpet is itself digital, with an exceptionally fine grained sampling.

    • by zlives ( 2009072 )

      make it so!!

  • Movie FPS (Score:5, Interesting)

    by meustrus ( 1588597 ) <meustrus@NospAm.gmail.com> on Wednesday December 24, 2014 @12:27PM (#48667179)

    why do movies at 48 fps look "video-y," and why do movies at 24 fps look "dreamy" and "cinematic."

    For the same reason children are picky eaters. They say that people have to take three bites of a new flavor to really know if they like or dislike it. I have personally experienced that, going from "wtf this is so wrong" to "ok it's not so bad and I might actually like this" between bite 1 and bite 3. Well, we all grew up consuming 24 fps movies, and anything higher is new and different. Rather than "take three bites", though, so many of us recoil from the different experience and immediately start talking to all our friends about how it looks wrong, concluding that high FPS just looks bad. Try. Three. Bites.

    • 120Hz TVs make movies look like they were shot on a daytime TV cam to me. I saw Braveheart on one of those modern TVs, and I suppose the lighting was more natural, but it was considerably less dramatic. It just killed it for me. Also (and this is mostly just me) I can perceive noticeable drops in framerate on those newer TVs. The rate goes up and down like crazy. Drives me nuts.
      • 120hz tvs make movies look like they were shot on a daytime TV cam to me.

        That's what I remember being the original complaint: higher FPS looks like daytime TV (or home video). That's an understandable prejudice, but one that I think we can get over by watching more normal TV in higher FPS. For some reason though, after the initial wave of honest people like you there's been hack after hack trying to explain why low FPS is actually better for smarter sounding reasons. It hasn't exactly been great for the TV industry; real 120 hz isn't as available in mid-range TVs as it was a cou

    • by Myrmi ( 730278 )
      Is this the real reason they made three Hobbit movies?
  • Anyone who has done mushrooms can tell you that seeing the world at the frame rate the brain is capable of processing is a load of fun. I have no idea how psilocybin affects the visual processing center of the brain -- or hell, it may affect the eye itself -- but what I do know is that stepping out into a room and looking around without the brain discarding the frames it doesn't feel like processing is amazing. However, it does look completely fake. It is too clear / crisp. Our brains aren't used to seeing

    • by itzly ( 3699663 ) on Wednesday December 24, 2014 @01:06PM (#48667449)
      Unless, of course, the mushrooms just make you think this is happening.
    • by lannocc ( 568669 )

      As a test, pan your head from left to right and notice the "jumpiness" that is reality. Now, eat about a half gram of shrooms, and do the same thing. It is no longer jumpy, and you get a REAL smooth pan.

      Do the same test, but hold out your finger a foot or two in front of your eyes and move your finger with your head. Focusing on the finger, you now see a smooth transition of the background. The "jumpiness" you experience without it is simply your eyes trying to fix on any number of objects as you are panning (my layman's interpretation). I can get the smooth transition without using the finger crutch, by "defocusing" my eyes, but only panning left to right... I can't do it right to left. My guess is it has

      • by r_naked ( 150044 )

        The "jumpiness" you experience without it is simply your eyes trying to fix on any number of objects as you are panning (my layman's interpretation).

        That is actually my understanding as well. The brain wants to focus on what it has been trained to see as the most pertinent objects, so it just jumps from object to object. I am fairly certain this is why people who are put under deep hypnosis can recall details of a situation that they can't recall consciously. I don't think the brain discards that information; I think it is like off-screen rendering that still gets saved to RAM :)

        I tried your finger trick, and while it was definitely smoother, it still was not

  • The old Bell & Howell 16mm projectors (24 fps) actually projected at 72 fps (call it flashes per second). The projector had a 3 bladed circular shutter. The film was pulled down one frame while the shutter was closed with one blade, then as the shutter rotated the frame was 'flashed' 3 times, then the next frame pulled down and the process repeated. The human eye could see flicker at 24 fps but not so much at 72. The video experts here can correct me, but I believe standard NTSC video was 30 fps of
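    The flash rate is just frames per second times shutter blades; a tiny Python sketch (the two-blade case is my own added example, not something from the comment above):

      # Flashes per second = film frame rate x number of shutter blades.
      def flash_rate(fps, blades):
          return fps * blades

      print(flash_rate(24, 3))  # 3-blade shutter, as on the Bell & Howell 16mm: 72 flashes/s
      print(flash_rate(24, 2))  # a 2-blade shutter would give 48 flashes/s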

    • It's not about flicker, it's about the smoothness of motion.

      • It's still good to ponder what flicker could tell us. For example, the 200 Hz LCD backlight PWM frequency causes sore eyes and headaches for many people, so certainly the eyes and/or nervous system are sensing something? Put some white text on a black background on a computer like that and you can see multiple images of the text [tftcentral.co.uk] when rapidly turning your eyes horizontally. Of course, as you say, the picture appearing and disappearing is largely a different discussion when compared to motion sensing, but there might
  • At 48Hz, you’re going to pull out more details at 48Hz from the scene than at 24Hz, both in terms of motion and spatial detail.

    Motion yes, but spatial? I don't get that bit.

    [at 24Hz] We’re no longer receiving a signal that changes fast enough to allow the super-sampling operation to happen.

    Err, what? You're not supersampling if the data has changed between the two samplings.

    To answer the question posed in the headline:

    Why movies look weird at 48fps

    Because it's not what we're used to when we go to the movies. That's all.

  • I say the human eye does see more than 24 fps: pan your head back and forth; there's no blurring like you get when panning a camera. OK, so I haven't RTFA, but I recently read/searched for info on frame rates. From what I gather, 24 fps came about particularly when the talkies became standard for motion pictures: they settled on enough fps to have smooth action and matching audio, but not too much, as film is/was very expensive. But each frame is shown twice (the refresh rate in movie theatres is 48 Hz). I read 24

  • "It should be safe to conclude that humans can see frame rates greater than 24 fps."

    We can go even faster than that.

    While this video I just shot [youtube.com] won't show it very well due to FPS limitations, you can easily perceive much faster than anyone here assumes. In the frequency range I'm playing in, you've got THOUSANDS of hertz in difference on some of these notes. The LED setup makes it REALLY easy to see in real time.

  • The ear follows Nyquist's rules about sample rates (i.e. there are hairs in your ear that are tuned to 40 kHz even though you can't hear that high). There is no reason the eye can't be doing the same thing.
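    For reference, the Nyquist rule says you need a sample rate above twice the highest frequency you want to capture; sample slower and the tone folds down to a lower (alias) frequency. A small Python sketch (the 96 kHz and 44.1 kHz rates are my own example values):

      # Nyquist: capturing a component at f_signal needs a sample rate above 2 * f_signal.
      # Sampling slower folds (aliases) the tone down to a lower apparent frequency.
      def alias_of(f_signal, f_sample):
          k = round(f_signal / f_sample)   # nearest multiple of the sample rate
          return abs(f_signal - k * f_sample)

      f_signal = 40_000.0  # the 40 kHz figure from the comment above
      for f_sample in (96_000.0, 44_100.0):
          print(f"sampled at {f_sample/1000:.1f} kHz -> appears as {alias_of(f_signal, f_sample)/1000:.2f} kHz")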

  • by MtViewGuy ( 197597 ) on Wednesday December 24, 2014 @11:19PM (#48671033)

    I think the problem is that because we're so used to 24 fps on theatrical motion pictures, going to 48 fps can be quite jarring, since everything looks so much "clearer" that you have to rethink set design, costume design and even the use of special effects to be less obtrusive at 48 fps. (Indeed, this became a huge issue with Peter Jackson's "Hobbit" trilogy because everything looked TOO clear.)

    The late Roger Ebert liked the 48 fps "Maxivision" analog film format, but that idea never took off due to the need to use a lot more physical film and the increased stress of running a film projector at twice the speed of regular projectors. But with modern digital movie cameras, 48 fps is now much more viable.
