frame-rate

[The three comparison GIFs]

Frame rate test.

Each of the three images above shows the same animation side by side: one side is near 60fps and the other near 30fps.

Guess which side you think is which, then check your answers at the bottom of this post.

There’s a constant debate about whether you can see a difference between 60 and 30 fps. I made these images so people can test whether they’re able to recognize the difference.

.

.

.

.

.

.

.

.

.

1. Left is higher frame rate

2. Left is higher frame rate

3. Right is higher frame rate

Album of GIFs

Follow if interested in more content like this.

[Embedded YouTube video]

Here’s the second video we’ve made in partnership with LG for our “How To Make Your Mobile Videos Cinematic” series. This one’s all about frame rate, and the cinematic quality of shooting at 24 frames per second. Just about every movie you’ve ever seen was shot at this frame rate, and I personally feel there’s a certain magic to the flicker of film. Nowadays a lot of video is shot at higher frame rates, and I think it lacks the cinematic quality of 24.

But I’m actually curious how you feel. Do you prefer 24? Do you like the higher frame rates? Do you think those higher frame rates still look cinematic? I really am curious to know what you think.

Thanks again <3

J

anonymous asked:

Hi AskAGameDev. First of all, thank you for your blog. It's nice to have such a down-to-earth insight into the gaming industry. As a guy who plays on both PC and consoles, I'm curious why 30 frames per second has become the de-facto standard for console gaming. Aside from the obvious reasons (like marketing and customers' expectations of graphical fidelity), are there other reasons, from the game developers' point of view, that determine this choice (if it can be considered a choice)?

So there are two main questions here. Let me quickly break them down:

#1: Why is 30 frames per second the de-facto standard?

Once upon a time, movies were silent. These movies were actually hand-cranked, so the frame rate was variable, typically running anywhere from 16 to 26 frames per second. Then, one day, some really smart people added sound to film. This changed the way that films had to be run - human hearing is extremely sensitive to changes in frequency, so in order to keep the sound in sync, they had to settle on a fixed frame rate. They chose 24 frames per second, and that’s been the “cinematic” standard ever since.

This is the reason that games are 30 frames per second (instead of some other number) - 30 keeps the same cadence as the cinematic rate of 24 fps, with exactly five frames everywhere a film would have four. Other frame rates along the same progression could also work - 36, 42, 48, 54, or 60 - but 30 was close enough to 24 that it still feels “cinematic” to the human eye, since we’re so used to watching things at 24 frames per second. Going to 36 or more smooths out the motion too much, and it starts to feel like you’re watching a soap opera or something.
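
To make that cadence concrete, here's a quick arithmetic sketch (plain Python, my own illustration rather than anything from the post) that reduces each candidate rate against 24 fps:

```python
from fractions import Fraction

FILM_RATE = 24  # the cinematic standard discussed above

# Candidate rates from the answer: 24 plus steps of 6.
for rate in (24, 30, 36, 42, 48, 54, 60):
    ratio = Fraction(rate, FILM_RATE)
    print(f"{rate} fps shows {ratio.numerator} frames for every "
          f"{ratio.denominator} frames of 24 fps film")

# 30 fps -> 5 frames for every 4 frames of film, the
# "five frames everywhere a film would have four" cadence.
```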

#2: Why are some games locked at 30 frames per second instead of 60?

Sometimes it’s because the team’s leadership wants that cinematic feeling. Practically speaking, it’s just easier to do all of the calculations you need for each frame in a 33 millisecond window than in a 16 millisecond window. At some level, it takes a fixed amount of time to do anything. Calculating the positions of all of the animation bones to display, going through all of the gameplay logic, doing all of the lighting calculations, reading from and writing to different sections of memory, and so on and so forth all require some amount of processor time. Some of these tasks can be streamlined so that you can run them in parallel, but not all of them. The total critical-path processing time in the worst case is what determines your worst-case frame rate. This is, of course, dependent on the hardware available and how well-optimized the software is. Most of the time the engineers can get it to within 33 milliseconds pretty easily, but cutting it down to half that can be problematic, especially in the olden days when console hardware was kind of lacking. If you can’t get your critical path below 17 milliseconds, you’re not going to be able to keep a constant 60 fps… and if you can’t do that, you drop to 30.
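
To put numbers on those budgets, here's a minimal sketch (the stage names and millisecond figures are invented for illustration, not taken from any real engine) comparing a worst-case critical path against the 60 fps and 30 fps frame windows:

```python
# Hypothetical worst-case timings (in milliseconds) for one frame's critical path.
worst_case_ms = {
    "gameplay logic": 4.0,
    "animation bones": 5.5,
    "lighting": 8.0,
    "memory reads/writes": 3.5,
    "render submission": 6.0,
}

critical_path_ms = sum(worst_case_ms.values())  # 27.0 ms in this made-up example

for target_fps in (60, 30):
    budget_ms = 1000.0 / target_fps  # ~16.7 ms at 60 fps, ~33.3 ms at 30 fps
    verdict = "fits" if critical_path_ms <= budget_ms else "blows"
    print(f"{target_fps} fps budget is {budget_ms:.1f} ms -> "
          f"worst case of {critical_path_ms:.1f} ms {verdict} it")

# Here the worst case fits the 33 ms window but not the 16.7 ms one,
# which is exactly the situation that pushes a game down to 30 fps.
```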

Remember, whenever a game is locked at 30 frames per second, it isn’t because it can’t reach 60 frames per second in some parts. It’s that the developers cannot guarantee 60 frames per second for the entirety of the experience, but they can guarantee 30. If you’re playing at 60 frames per second for a while and then the game suddenly drops to 30, it doesn’t feel good. It feels inconsistent, jerky, and suddenly unresponsive. Developers want to avoid that jarring shift, which wouldn’t have happened at a consistent 30 frames per second (since players get used to it early on). In order to ensure that consistency of experience, most developers choose to keep it at 30 the entire way through if they do not think they can optimize enough by launch.
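
And here's a bare-bones sketch of what "locking" to 30 can look like in code. This is a generic illustration in Python, not how any particular console SDK does it; real engines usually sync to the display's vertical blank rather than sleeping, and the `update_and_render` callback is a stand-in for the whole frame's work:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_locked_loop(update_and_render, num_frames=300):
    """Run a loop capped at TARGET_FPS so frame pacing stays consistent."""
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        update_and_render()                     # simulate + draw one frame
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # burn off the leftover budget
        # If elapsed exceeds the budget we simply run late; a real engine
        # would also decide how much simulation time to catch up on.

# Example usage with a stand-in 10 ms workload:
run_locked_loop(lambda: time.sleep(0.010), num_frames=60)
```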

CONTRIBUTE HERE

==

CINEMATOGRAPHERS: Watch THIS ROUGH CUT and shoot cutaways for it. You can find a shot list of all the elements we need HERE.

  • It would be great for you to set up a camera on a tripod (or have an additional camera person on hand) to get footage of a subject recording with their mobile device. If you’re able to do this, it would be great to have two separate uploads, one from each of the recording devices.
  • If possible, please film all your footage for this video in 24fps.
  • If you film people experiencing cinema in any way, please make sure there is no copyrighted material included in your footage. Thanks!

VISUAL ARTISTS: Design Graphic Cards for this How-To Video. Watch the slugged reference edit HERE.

ANIMATORS: Animate demonstrational cutaways for this How-To Video. Check out THIS SHOT LIST for more direction.

ACTORS: Find a moment in the video, and act or react to what RegularJOE is saying.

==

* The deadline for this request is Monday, October 19th


Adam Driver and Daisy Ridley demonstrating that size isn’t everything behind the scenes of Star Wars: The Force Awakens (2015)

Bonus:

robosmack asked you:

What is the sampling rate of the human eye? and do we actually need TVs that refresh at 240+ Hz?

This is a great question, and I’ve been meaning to get to it. Again, you guys just fill my inbox with such great questions that I just can’t get to them all. But I’ll keep trying.

The human eye doesn’t work like a television. But that’s not to say that there isn’t some signal rate that our brains find important. First, TVs …

Early in TV history, the refresh rate of the screen (literally how many times a second the cathode ray redrew the lines that make up the image) was dependent on the vacuum-tube technology of the display and the AC power frequency coming through the wires (60 Hz). On more advanced displays like LCD screens, the image is no longer redrawn line by line the way it was on old tube monitors. Instead, each pixel is refreshed a certain number of times per second (240 times/s for a 240 Hz TV).

How does that affect our eyes? Early experiments in motion pictures told us that anything less than about 24 frames per second would make film flicker and make the audience uncomfortable. 24 FPS remains the minimum standard for 35mm film today.

Your visual cortex doesn’t work like a TV screen though. It isn’t refreshed frame by frame, rather it’s a continuous flow system. What matters is how fast a single image is held from the time it hits your retina to the time it is recognized by the brain. We can discern 10-12 single images a second. However, moving faster than that doesn’t automatically make smooth “video” come alive.

Take alternating frames of black and white. Below roughly 30 FPS, it flickers between black and white. Somewhere between there and 60 FPS, depending on the person, it starts to look like steady gray. The effect of changing frame rates on the human visual system is demonstrated by the so-called “wagon-wheel effect”, as in this video.
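
The wagon-wheel effect is really just sampling aliasing, and you can work it out with a little arithmetic. The sketch below is my own illustration (not taken from the linked video); it computes how far a spoked wheel appears to turn between frames once the camera's frame rate folds the true rotation back into a small range:

```python
def apparent_rotation_deg(true_rev_per_s, camera_fps, spokes=8):
    """Degrees per frame a wheel with identical spokes appears to rotate."""
    spoke_period = 360.0 / spokes                       # spokes are indistinguishable
    true_step = 360.0 * true_rev_per_s / camera_fps     # real rotation between frames
    # Alias the step into the range (-spoke_period/2, +spoke_period/2]:
    return (true_step + spoke_period / 2) % spoke_period - spoke_period / 2

# An 8-spoke wheel filmed at 24 fps:
print(apparent_rotation_deg(3.05, 24))  # +0.75 deg/frame -> crawls slowly forward
print(apparent_rotation_deg(2.95, 24))  # -0.75 deg/frame -> appears to spin backwards
```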

So do we need 240 Hz (or higher) TVs? It only becomes an issue as TVs get bigger. When an X-wing darts quickly across a 30-inch LCD at 120 Hz against the black of space toward the Death Star, the refresh rate of the pixels gives us smooth, Rebel motion. But try that same refresh rate on an 80-inch TV, and you’ll see some pixels misfire or fail to refresh fast enough, making motion blurry and uncomfortable. We don’t see like TVs, but we can pick up some of their refresh-rate artifacts because of how our nervous system works.

mariothewizard  asked:

I have been watching some YouTube videos and they talk about how next-gen console games usually run at 30 fps, whereas 8-9 years ago console games ran at 60 fps. A few months ago controversy arose when the developers of the upcoming The Order: 1886 said they are intentionally keeping the game at 30 fps to emulate the feeling of a movie. Is there ever a valid design reason for a game today to run at 30 fps? And why are more games running at 30 fps, such as the upcoming Wii U game Bayonetta 2?

As you can probably guess, the main reason that frame rates are set at 30 so often is because of technology. When you need that much data processed each second just for video, you run into hardware limits when you have to pile other things on like animation data, pathfinding algorithms, artificial intelligence, and so on. But why would somebody want to purposely adjust the frame rate down? Well…

Meet Guilty Gear Xrd SIGN. This fighting game, nonsensical name aside, uses a 3D engine to make a 2D game in order to take advantage of all the benefits 3D offers over traditional 2D sprite work. The game actually runs at 60 frames per second most of the time for purposes of gameplay and timing. But the developers purposely eschew super-smooth character animation because they want the characters to look as if they are running at a lower frame rate. Why? Because the entire game is like a long-running anime story, and to be faithful to its anime roots, the character animation should look like it’s running at 24 frames per second, just like the source material.
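
As an illustration of that "run the game at 60, animate the characters slower" idea (a generic sketch, not Arc System Works' actual code), the trick boils down to holding each pose for several game frames:

```python
def pose_index(game_frame, hold=3):
    """Which animation pose to show on a given 60 fps game frame.

    hold=3 keeps each pose on screen for 3 game frames, so the character
    animates at 20 poses per second even though gameplay ticks at 60 fps;
    hold=2 gives 30, and a 24-ish cadence is approximated the same way.
    """
    return game_frame // hold

# Input and hitboxes still update every single frame:
for frame in range(9):
    print(f"game frame {frame} -> gameplay updated, showing pose {pose_index(frame)}")
```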

You’ve probably heard of The Hobbit, right? Peter Jackson filmed it at 48 frames per second, but a lot of moviegoers who saw it at that frame rate complained about it. They didn’t like it; it just made them feel uncomfortable. There have been a good number of complaints about this filmmaking decision - it’s hard for the layman to put a finger on it, but it just feels foreign to viewers.

The main reason most new-generation games are running at 30 frames per second is that developers are still getting used to the new technology. A lot of the software and driver support for the new platforms isn’t as robust or efficient as the old stuff, and all of the new games are running at 1080p as their native resolution. Each step up in resolution raises the amount of video data to process by a factor of about 2-3x. As the technology improves and becomes more efficient, we’ll see more games come with 60fps standard.
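
Here's a quick back-of-the-envelope calculation (my own, just to put numbers behind that claim) of how many pixels per second have to be filled at each resolution and frame rate:

```python
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
}

for name, (width, height) in RESOLUTIONS.items():
    pixels = width * height
    for fps in (30, 60):
        print(f"{name} @ {fps} fps: {pixels * fps / 1e6:.0f} million pixels per second")

# 1080p has 2.25x the pixels of 720p (the 2-3x-per-step jump described above),
# and doubling the frame rate doubles the throughput again.
```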

As for why Bayonetta 2 specifically is running at 30 frames per second? I don’t know for certain, but I’m pretty sure it’s because the Wii U is effectively a PS3 or X360 in terms of hardware. I’ve heard the Wii U described by developers as “two Wiis duct taped together”, which really isn’t too far from the truth, especially since the Wii itself was accurately described as “two Gamecubes duct taped together”. That said, it’s all hearsay to me, since I’ve never actually worked on the Wii U, just the Wii (and the Gamecube).

Sorry, Pikachu, but you know it’s true. If there is interest, I could probably write something up about the base technical requirements for frame rates at the different resolution steps. I actually started writing that as part of the answer to this question, but I decided against it since it would have made this post too long.

Consider Edison’s early motion picture experiments. The inventor recommended 46 frames per second as the ideal frame rate, concluding that “anything less will strain the eye.”

Instead, 24 fps became the norm based on a simple calculation: Celluloid costs money. Less film used means lower production costs. “They wanted to see how little film could you get away with feeding into the camera, because it was a resource and 24 was the minimum,” Watro says. “We’ve been at 24 [fps] for 80 years probably because you had some bean counter saying, ‘If people watch 24 without vomiting, then let’s go with that.’”

– Today in Unexpected Things I Learned About The Movies (via Wired)

"The Hobbit": Peter Jackson annoys film buffs with higher frame rate
  • Cause: Director Peter Jackson’s upcoming take on “The Hobbit,” on top of being created in 3D, uses a technique unusual for mainstream films — it’s shot at 48 frames per second, double what most films use.
  • Reaction: Many fans who saw an early ten-minute screening of the film at CinemaCon found the frame rate to be a significant change, with some saying that the film felt artificial and fairly jarring to watch.
  • Rebuttal: Jackson stood his ground: “It’s literally a new experience, but you know, that doesn’t last the entire experience of the film — not by any stretch, [just] 10 minutes or so. …  you settle into it.” Good move? source

Follow ShortFormBlog • Find us on Twitter & Facebook

What causes 24p video to look jumpy?

Everyone wants to shoot 24p but it only looks “natural” when viewed at a higher framerate:

A large amount of content is produced in 24p. In theaters, 24p is the standard, but narrative television is also often produced in 24p. Yet we don’t experience any obvious jumpiness when watching 24p on television or in theaters. Why not? One of the reasons is that we rarely see true 24p in either of those environments. In the U.S., we broadcast all video at 60 Hz. NTSC video is broadcast at 60i (59.94 interlaced fields per second), 1080 video is also broadcast at 60i, and 720 at 60p (59.94 progressive frames per second). To show 24p in the 60 Hz world, we need to convert it using a 2:3 pulldown method. This process not only conforms the video to the standard, but also has a smoothing effect. DVDs are mastered this way as well, giving the same smooth result. Home televisions also often double their display rates (a 120 Hz TV is easy to find at your local electronics store), which smooths video on home screens even further.
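
Here's the 2:3 pulldown cadence written out as a tiny sketch (illustrative only): four film frames become ten interlaced fields, which is how 24 frames stretch evenly into 60 fields every second.

```python
def two_three_pulldown(frames):
    """Map progressive 24p frames onto 60i fields using the 2:3 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3   # alternate: 2 fields, then 3 fields
        fields.extend([frame] * repeats)
    return fields

print(two_three_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# 4 film frames -> 10 fields, so 24 frames -> 60 fields per second.
```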

24p film doesn’t have pulldown, so why isn’t it jumpy? Well, there’s something special happening that helps reduce the effect. Thomas Edison determined that for comfortable viewing in theaters, 46 fps was the minimum display rate for projectors. Projectors used a multi-bladed shutter to flash each frame of 24 fps film twice, showing 48 images a second. Many modern projectors will actually show each frame three times, giving 72 flashes per second on screen. This has the same smoothing effect to our eyes that we see on TV sets. In fact, about the only places we don’t see this smoothing effect are production monitors and computer screens, so we can understand why cinematographers and editors may get a little uneasy about 24p jumpiness.

Read the rest at HDVideoPro

24p:

24p is a progressive format and is now widely adopted by those planning on transferring a video signal to film. Film and video makers use 24p even if their productions are not going to be transferred to film, simply because the on-screen “look” of the (low) frame rate matches native film. When transferred to NTSC television, the rate is effectively slowed to 23.976 FPS (24×1000÷1001 to be exact), and when transferred to PAL or SECAM it is sped up to 25 FPS. 35 mm movie cameras use a standard exposure rate of 24 FPS, though many cameras offer rates of 23.976 FPS for NTSC television and 25 FPS for PAL/SECAM. The 24 FPS rate became the de facto standard for sound motion pictures in the mid-1920s.[2]

Practically all hand-drawn animation is designed to be played at 24 FPS. Actually hand-drawing 24 unique frames per second (“1’s”) is costly. Even big-budget films usually shoot their hand-drawn animation on “2’s” (one hand-drawn frame is shown twice, so only 12 unique frames per second),[4][5] and a lot of animation is drawn on “4’s” (one hand-drawn frame is shown four times, so only six unique frames per second).
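
Two of the numbers in that passage, worked out explicitly in a small sketch (nothing here beyond the arithmetic the excerpt already states):

```python
# NTSC slowdown: 24 x 1000 / 1001
ntsc_rate = 24 * 1000 / 1001
print(f"24p slowed for NTSC: {ntsc_rate:.3f} fps")   # 23.976 fps

# Unique drawings per second when animating "on 1's", "on 2's", or "on 4's"
for shoot_on in (1, 2, 4):
    print(f"On {shoot_on}'s: {24 // shoot_on} unique drawings per second")
```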


Papyrus Pranks Sans: An Epic Tale~ [Mega Gif Post!]

See the sound-less video version [HERE]

This is showing off every separate flipnote I created to string this story together. <3 

I have never worked this hard on a series of flipnotes before, and I think the love really shows in this!

I /adore/ animating Grillby after this. xD He is so fun to animate since his head is /literally fire/ with glasses pasted over it. xD

The scene where Sans walks into the bar, the door opens and rings the bell, and then everyone in the bar stops what they’re doing to look at Sans is just the best. <3

I noticed the video version did not do the gifs much justice, since the frame rate slowed down during some scenes… (You can tell parts of this have faster frame rates than others… xD)

I am in the process of adding background music, sound effects, etc. to really make it like a full-blown Undertale Cartoon! <3

I hope you all enjoy this!

There were some complaints that my 60fps vs 30fps comparison was unfair, so here's one final example.

[Comparison GIF]

Dark Souls

Open in a separate tab and let the image fully load.

One is near 60fps, the other near 30fps.  

Both are identical animations running at identical speeds.  

It’s in-game, in motion, in combat with real gameplay, and with a full background behind it.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

Left is higher frame rate.

Album of GIFs

Follow if interested in more content like this.