It’s a truly exciting time to be a gamer. More so than other forms of media, videogames live and die not just by pushing creative boundaries, but technological ones. Today, we live in the age of motion-sensing peripherals, eye-watering visuals, interconnectivity, and, pretty soon, virtual reality. So it’s all the more puzzling to me that there’s even a debate as to whether all videogames should aspire to run at, at the very least, 60 frames per second.
The release of 2012’s The Hobbit: An Unexpected Journey, which offered viewers a version filmed at 48 frames per second, really opened up discussion on whether the traditional framerates of 24 and 30 should remain the norm. Many argue that a high framerate in both movies and games takes the cinematic feel out of them, and some even push the absurd notion that human eyes aren’t capable of seeing past 30 fps.
Lately, with the release of the Xbox One and PlayStation 4, the frames-per-second debate has been happening a lot more, largely in the context of game developers intentionally capping their games at 30 fps. This was especially the case with Bethesda’s The Evil Within, which not only capped the framerate on both consoles and PC, but also letterboxed the screen. Along with an optional grain filter, it was plainly obvious that developer Tango Gameworks wanted to emulate an old, 1970s film look. After an outcry from the PC community, the developer patched the game to both uncap the framerate and remove the letterbox.
This brings us to a larger question: why *should* games aspire to look cinematic? The cinematic feel that game developers and publishers speak of is nothing more than an artifact of old technology. In their earliest days, film cameras couldn’t shoot past 16 fps. As time and technology progressed, that number climbed to 20, 24, and 26. Today, movies shot on film are tied to 24 fps because it was the slowest speed that allowed for clear sound reproduction, and many of our beloved films, from Star Wars to The Silence of the Lambs to The Matrix, have this look.
There is nothing about a low framerate that is aesthetically pleasing; it only seems that way because many of the finest works of art in human history have it. It is only because of the impact of the aforementioned films on our creative minds that some developers aspire to have their games adopt this look. There are developers who limit the framerate because of technological limitations, and that is perfectly understandable. However, for developers to limit the framerate for “cinematic” reasons without giving players the option to uncap it is downright ridiculous. Videogames today are easily capable of both glorious visuals and high framerates, and there is no reason they shouldn’t be presented at the framerate our eyes are capable of, and meant, to view.
Aside from the obvious fact that controls in games become smoother and more responsive, a high framerate gives action-packed scenes an increased sense of urgency and puts less strain on the eyes. It also removes any barrier between the images on the screen and our eyes: because our brains don’t have to do any work filling in the gaps between frames, we can better appreciate and scrutinize a game’s visuals.
Of course, the fact that this debate is even taking place is a sign that 60 fps will indeed become the norm. For example, YouTube recently announced that it will support videos at higher framerates solely because of demand from the gaming community for smoother gameplay videos. But perhaps the herald that will truly force 60 fps gameplay upon videogaming is the Oculus Rift. As a virtual reality headset, the Rift syncs a player’s head movements with the player character’s, and as such it practically demands that a game maintain a seamless framerate. Can you imagine using your own head to look around and seeing things in the juddery hell of 30 frames per second? Exactly.
Videogames have come a long way toward looking more and more like real life. There’s a grocery list of technologies that have been implemented for the sole purpose of bringing graphics closer to reality, like parallax mapping, tessellation, face mapping, and AMD’s TressFX. So if we’re working toward bringing videogame graphics closer to, and arguably better than, real life, does it make sense to turn around and take a shotgun blast to that idea by purposely halving framerates? Today’s technology is making it easier and easier to present visually arresting games at the framerate our eyes were *always* meant to view, so there’s no reason why developers shouldn’t aim to make *all* their games run at 60 fps. Whether anyone likes it or not, gaming has to continuously push technological boundaries to stay alive, and I’m hoping that once the next generation of gaming consoles arrives, there won’t be any reason to debate framerates.
Leave a comment below and tell us what you think about the article. Should all games be 60 fps?