frame-rate

robosmack asked you:

What is the sampling rate of the human eye? And do we actually need TVs that refresh at 240+ Hz?

This is a great question, and I've been meaning to get to it. You guys fill my inbox with so many good questions that I just can't get to them all, but I'll keep trying.

The human eye doesn’t work like a television. But that’s not to say that there isn’t some signal rate that our brains find important. First, TVs …

Early in TV history, the refresh rate of the screen (literally how many times a second the cathode ray redrew the lines that make up the image) was tied to the vacuum tube technology inside the set and to the AC power frequency coming through the wires (60 Hz). On more advanced displays like LCD screens, the image is no longer redrawn line by line the way it was on old tube monitors; instead, each pixel is refreshed a certain number of times per second (240 times per second for a 240 Hz TV).
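
To put those numbers in time terms, here's a quick back-of-the-envelope sketch (Python, with a purely illustrative list of rates) of how long a single refresh lasts:

```python
# How long each refresh lasts at some common display/film rates.
for hz in (24, 30, 60, 120, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per refresh")
# 240 Hz works out to about 4.17 ms per refresh, versus ~41.67 ms at 24 Hz.
```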

How does that affect our eyes? Early motion-picture experiments showed that anything much below about 24 frames per second makes film flicker and makes the audience uncomfortable. Today, 24 FPS remains the standard minimum for 35mm film.

Your visual cortex doesn't work like a TV screen, though. It isn't refreshed frame by frame; it's a continuous-flow system. What matters is how long a single image persists from the time it hits your retina to the time the brain recognizes it. We can discern roughly 10-12 separate images a second, but going faster than that doesn't automatically produce the perception of smooth video.

Take alternating frames of black and white. Below about 30 FPS the image flickers between black and white; somewhere between 30 and 60 FPS, depending on the person, it starts to fuse into a steady gray. Another effect of frame rate on the human visual system is the so-called "wagon-wheel effect," demonstrated in this video.
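
That wagon-wheel effect is really just temporal aliasing: what you perceive depends on how far the wheel turns between samples. Here's a rough sketch of the math (the spoke count and speeds are made up for illustration):

```python
def apparent_rev_per_frame(rev_per_sec, fps, spokes=8):
    """Apparent rotation per sampled frame for a wheel with identical spokes.

    A wheel with N identical spokes looks the same every 1/N of a turn, so the
    sampled motion is only defined modulo that symmetry; steps past the halfway
    point read as motion in the opposite direction (temporal aliasing).
    """
    step = rev_per_sec / fps * spokes        # spoke-spacings moved per frame
    wrapped = (step + 0.5) % 1.0 - 0.5       # wrap into [-0.5, 0.5)
    return wrapped / spokes                  # back to revolutions per frame

# A wheel spinning forward at 2.9 rev/s, filmed at 24 fps with 8 spokes,
# appears to creep slowly backwards:
print(apparent_rev_per_frame(2.9, 24))       # ~ -0.0042 rev per frame
```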

So do we need 240 Hz (or higher) TVs? It only becomes an issue as TVs get bigger. When an X-wing darts quickly across a 30-inch LCD at 120 Hz against the black of space toward the Death Star, the refresh rate of the pixels gives us smooth, Rebel motion. But try that same refresh rate on an 80-inch TV, and you'll see some pixels misfire or fail to refresh fast enough, making the motion look blurry and uncomfortable. We don't see like TVs do, but we can pick up some of their refresh-rate artifacts because of how our nervous system works.

mariothewizard asked:

I have been watching some YouTube videos that talk about how next-gen console games usually run at 30 fps, whereas 8-9 years ago console games ran at 60 fps. A few months ago controversy arose when the developers of the upcoming The Order: 1886 said they are intentionally keeping the game at 30 fps to emulate the feeling of a movie. Is there ever a valid design reason for a game today to run at 30 fps? And why are more games running at 30 fps, such as the upcoming Wii U game Bayonetta 2?

As you can probably guess, the main reason frame rates are so often set at 30 is technology. When you need that much data processed each second just for video, you run into hardware limits once you pile on everything else: animation data, pathfinding algorithms, artificial intelligence, and so on. But why would somebody purposely adjust the frame rate down? Well…
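
As a rough illustration of that budget problem, here's a sketch with entirely made-up per-task costs (they don't come from any real engine):

```python
# Hypothetical per-frame costs in milliseconds -- purely illustrative numbers.
costs_ms = {
    "rendering": 18.0,
    "animation": 4.0,
    "pathfinding": 3.0,
    "AI": 3.5,
    "physics": 3.0,
}

for fps in (30, 60):
    budget = 1000.0 / fps                 # 33.3 ms at 30 fps, 16.7 ms at 60 fps
    total = sum(costs_ms.values())
    slack = budget - total
    verdict = "fits" if slack >= 0 else "over budget"
    print(f"{fps} fps: budget {budget:.1f} ms, work {total:.1f} ms -> {verdict} ({slack:+.1f} ms)")
```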

Meet Guilty Gear Xrd SIGN. This fighting game, nonsensical name aside, uses a 3D engine to make a 2D game in order to take advantage of all the benefits 3D offers over the studio's traditional 2D sprite work. The game actually runs at 60 frames per second most of the time for purposes of gameplay and timing, but the developers purposely eschew super-smooth character animation because they want the characters to look as if they're running at a lower frame rate. Why? Because the entire game is styled like a long-running anime, and to stay true to those roots, the characters should appear to animate at 24 frames per second, like the source material.
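
Here's a minimal sketch of that technique as I understand it (my own illustration, not Arc System Works' actual code): the game loop keeps ticking at 60 Hz for input and timing, while the visible character pose only advances every few ticks, mimicking anime-style limited animation.

```python
GAME_HZ = 60      # gameplay, input, and hit timing still tick at 60 Hz
ANIM_STEP = 3     # only advance the visible pose every 3rd tick (illustrative)

def run(ticks=12):
    pose = 0
    for tick in range(ticks):
        # ... input, hitboxes, and game logic update every single tick here ...
        if tick % ANIM_STEP == 0:
            pose += 1   # the on-screen pose changes far less often
        print(f"tick {tick:02d}: pose {pose}")

run()
```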

You've probably heard of this movie, right? Peter Jackson filmed The Hobbit at 48 frames per second, and a lot of moviegoers who saw it at that frame rate complained about it: it made them uncomfortable. It's hard for the layman to put a finger on exactly why, but to many viewers the higher frame rate just feels foreign.

The main reason most of the new generation of games runs at 30 frames per second is that developers are still getting used to the new technology. A lot of the software and driver support for the new platforms isn't as robust or efficient as the old stuff, and most new games use 1080p as their native resolution. Each step up in resolution increases the amount of video data that has to be processed by a factor of roughly 2-3x. As the technology improves and becomes more efficient, we'll see more games ship with 60 FPS as standard.
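
Some rough pixel arithmetic makes the point (raw, uncompressed framebuffer sizes at 4 bytes per pixel; a real rendering pipeline involves far more than this, so treat these as ballpark numbers only):

```python
# Raw framebuffer throughput -- a crude proxy for "video data per second".
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

for name, (w, h) in resolutions.items():
    for fps in (30, 60):
        mb_per_s = w * h * 4 * fps / 1e6     # 4 bytes per pixel
        print(f"{name} @ {fps} fps: {mb_per_s:7.1f} MB/s of raw pixels")

# 1080p has 2.25x the pixels of 720p, so 1080p at 60 fps moves roughly
# 4.5x as much raw pixel data as 720p at 30 fps.
```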

As for why Bayonetta 2 specifically runs at 30 frames per second? I don't know for certain, but I'm pretty sure it's because the Wii U is effectively a PS3 or Xbox 360 in terms of hardware. I've heard developers describe the Wii U as "two Wiis duct-taped together," which really isn't far from the truth, especially since the Wii itself was described as "two GameCubes duct-taped together." That said, it's all hearsay on my part, since I've never actually worked on the Wii U, just the Wii (and the GameCube); the "two GameCubes" comparison, at least, is pretty accurate.

Sorry, Pikachu, but you know it’s true. If there is interest, I could probably write something up about the base technical requirements for frame rates at the different resolution steps. I actually started writing that as part of the answer to this question, but I decided against it since it would have made this post too long.

Have you seen this amazing high frame rate Green Day concert?

"The Hobbit": Peter Jackson annoys film buffs with higher frame rate
  • cause Director Peter Jackson’s upcoming take on “The Hobbit,” on top of being created in 3D, uses a technique unusual for mainstream films — it’s shot at 48 frames per second, double what most films use.
  • reaction Many fans who saw an early ten-minute screening of the film at CinemaCon found the frame rate to be a significant change, with some saying that the film felt artificial and fairly jarring to watch.
  • rebuttal Jackson stood his ground: “It’s literally a new experience, but you know, that doesn’t last the entire experience of the film — not by any stretch, [just] 10 minutes or so. …  you settle into it.” Good move? source

Consider Edison’s early motion picture experiments. The inventor recommended 46 frames per second as the ideal frame rate, concluding that “anything less will strain the eye.”

Instead, 24 fps became the norm based on a simple calculation: Celluloid costs money. Less film used means lower production costs. “They wanted to see how little film could you get away with feeding into the camera, because it was a resource and 24 was the minimum,” Watro says. “We’ve been at 24 [fps] for 80 years probably because you had some bean counter saying, ‘If people watch 24 without vomiting, then let’s go with that.’”

– Today in Unexpected Things I Learned About The Movies (via Wired)

What causes 24p video to look jumpy?

Everyone wants to shoot 24p, but it only looks "natural" when viewed at a higher framerate:

A large amount of content is produced in 24p. In theaters, 24p is the standard, but narrative television is also often produced in 24p. Yet, we don’t experience any obvious jumpiness when watching 24p on television or in theaters. Why not? One of the reasons is that we don’t often actually see 24p in either of those environments. In the U.S., we broadcast all video at 60 Hz. NTSC video is broadcast at 60i (59.94 interlaced fields per second), 1080 video is also broadcast at 60i, and 720 at 60p (59.94 progressive frames per second). To show 24p in the 60 Hz world, we need to convert it using a 2:3 pulldown method. This process not only conforms the video to the standard, but also has a smoothing effect. DVDs are mastered this way, as well, giving the same smooth result. Home televisions also often double their display rates (a 120 Hz TV is easy to find at your local electronics store), and this smoothens video on home screens even further.

24p film doesn’t have pulldown, so why isn’t it jumpy? Well, there’s something special happening that helps reduce the effect. Thomas Edison determined that for comfortable viewing in theaters, 46 fps was the minimum display rate of projectors. Projectors used a multiple-bladed shutter to show 24 fps film at 48 times a second, doubling the display of each frame. Many modern projectors actually will show each frame three times, giving 72 frames per second on screen. This has the same smoothing effect to our eyes that we see on TV sets. In fact, some of the only places we don’t see this smoothing effect is on production monitors and computer screens, so we can understand why cinematographers and editors may get a little uneasy about the 24p jumpiness.

Read the rest at HDVideoPro
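
Here's a small sketch of the 2:3 cadence described in that excerpt (illustrative only; real pulldown alternates top and bottom fields and runs at 59.94 Hz rather than a clean 60):

```python
def two_three_pulldown(frames):
    """Map 24p frames onto 60i fields with the 2:3 cadence (A gets 2 fields, B gets 3, ...)."""
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)
    return fields

# Four film frames become ten fields, since 60 / 24 = 2.5 fields per frame.
print(two_three_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']

# Projectors do something similar in time: a multi-bladed shutter flashes each
# 24 fps frame two or three times, giving 48 or 72 flashes per second on screen.
print(24 * 3)   # 72 flashes per second with a triple-bladed shutter
```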

anonymous asked:

When it comes to PC games, do you think framerate is more important than resolution?

Honestly, when it comes to these technical visual things, I don’t know much and I don’t care much, either. I want games to look “good,” and for me, that’s mostly about art style, animations, effects, etc. I play mostly on a mid-range laptop, and have to toggle graphics settings down (often to the lowest setting) pretty regularly, and it doesn’t bother me one bit. There are lots of people for whom things like framerate and resolution are super important, and that’s cool, but I’m just not one of them.

There were some complaints that my 60fps vs 30fps comparison was unfair, so here's one final example.

Dark Souls

Open in a separate tab and let the image fully load.

One is near 60fps, the other near 30fps.  

Both are identical animations running at identical speeds.  

It's in-game, in motion, in combat with real gameplay, and with a full background and backdrop behind it.

Left is higher frame rate.

Album of GIFs

24p:

24p is a progressive format and is now widely adopted by those planning to transfer a video signal to film. Film and video makers use 24p even if their productions are not going to be transferred to film, simply because the on-screen "look" of the (low) frame rate matches that of native film. When transferred to NTSC television, the rate is effectively slowed to 23.976 FPS (24×1000÷1001 to be exact), and when transferred to PAL or SECAM it is sped up to 25 FPS. 35 mm movie cameras use a standard exposure rate of 24 FPS, though many cameras offer rates of 23.976 FPS for NTSC television and 25 FPS for PAL/SECAM. The 24 FPS rate became the de facto standard for sound motion pictures in the mid-1920s.[2]

Practically all hand-drawn animation is designed to be played at 24 FPS, but actually drawing 24 unique frames per second ("1's") is costly. Even big-budget films usually shoot hand-drawn animation on "2's" (one drawing is shown twice, so only 12 unique frames per second),[4][5] and a lot of animation is drawn on "4's" (one drawing is shown four times, so only six unique frames per second).
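
A quick arithmetic check of the numbers in that excerpt (nothing new here, just restating what's quoted above):

```python
# The NTSC-friendly rate quoted above: 24 * 1000 / 1001
print(f"{24 * 1000 / 1001:.3f} fps")        # 23.976 fps

# Hand-drawn animation on "1's", "2's", or "4's" at a 24 fps playback rate:
for n in (1, 2, 4):
    print(f"on {n}'s: {24 // n} unique drawings per second")
# on 1's: 24, on 2's: 12, on 4's: 6
```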