Frames per second (fps) have been a topic of heated discussion in both videogames and cinema for some time now. In movies, the conventional frame rate is 24 fps, meaning that one second of film is made up of 24 still frames shown in sequence. The same principle can be seen in an old toy, the zoetrope: a cylinder with drawings inside that, when spun on its rotating base, gives the illusion of movement. Nowadays movies can be filmed at up to 60 fps; director James Cameron, for example (who directed, among others, Avatar and Titanic), is a supporter of 60 fps movies. Nevertheless, even such a big name did not manage to break away from the conventional 24 fps: even movies filmed at higher frame rates were still shown at the old standard. In general, this is because people, on average, think 24 fps simply looks nicer.
Fps in movies
In the early days of cinema, when sound could first be recorded alongside the image, filmmakers had a hard time synchronizing the two because they were often recorded at different speeds. This lasted until 1927:
“In September 1927, the SMPTE’s Standards and Nomenclature Committee undertook a fact-finding exercise in order to establish what speeds the emerging sound systems were using. The two which were entering commercial use (Vitaphone and Movietone) both used 24fps. The RCA variable area system, still in development at this point, used 22fps, while de Forest Phonofilms (which had virtually ceased production by that point) ran at 20fps. Accepting that, trends in exhibition practice over the previous decade and decisions made by the designers of the two most successful sound systems had effectively standardized 24fps by default.” Taken from the book Moving Image Technology: From Zoetrope to Digital
Since then, virtually every movie has been recorded and shown at 24 fps. When technology allowed for higher frame rates, those were usually associated with home (and amateurish) movies made with the newly invented VCR camcorders, so higher fps became linked with lower-quality productions. When higher frame rates were tested once again by modern directors, audiences still hated them, for one reason or another. Plenty of criticism can be found of the High Frame Rate (HFR) version of ‘The Hobbit’ by director Peter Jackson, which was filmed at 48 fps.
So, when it comes to movies, 24 fps has been the accepted norm for almost 100 years, and it might be for that reason that the public does not enjoy HFR movies: we are simply not used to them.
Fps in videogames
When it comes to videogames, though, the discussion is the opposite. In modern videogames, especially fast-paced ones such as First Person Shooters (FPS), the public always seems to want more frames.
In general, this is because gaming requires active participation: in an FPS you move your character with one set of controls and perform specific actions (such as shooting, a very popular action in FPS games) with another. While you move and act, your mind is constantly working out what to do, when to do it, and where to go. At the same time, it expects specific feedback from your eyes; if that feedback is glitchy or choppy, your brain will not like it. For this reason, regardless of familiarity with a given frame rate, playing a videogame at 60 fps will always look better to you than playing it at 30 fps: it is simply more fluid.
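The fluidity difference comes down to frame time: how long each frame stays on screen before the next one replaces it. A quick back-of-the-envelope calculation (a sketch; the helper name is my own, not from the text):

```python
def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds (1000 ms / fps)."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

At 30 fps each frame lingers for about 33.3 ms, while at 60 fps it is only about 16.7 ms, so both the motion on screen and the response to your inputs update twice as often.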
The quest for higher fps in videogames has now reached absurd levels. To understand why, let’s first look at how frames are created:
“The frame rate is how many of these images are displayed in one second. To produce, or render, a new frame your CPU and GPU work together to determine the actions of the AI, the physics, positions, and textures of the objects in the scene to produce an image. Then your GPU cuts this image into pixels at the resolution you set and sends this information to the display. The more powerful your CPU and GPU, the more frames they are able to generate per second.” Taken from an AVADirect blog post by Nikita Fedorov
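The pipeline described in the quote can be sketched as a simple game loop, with the simulation step standing in for the CPU work and the render step for the GPU work. This is a simplified illustration with hypothetical stub functions, not how any real engine is implemented:

```python
import time

def update_simulation(dt: float) -> None:
    """CPU side: advance AI, physics, and object positions by dt seconds (stub)."""
    pass

def render_frame() -> None:
    """GPU side: turn the scene into pixels at the display resolution (stub)."""
    pass

def game_loop(duration_s: float = 1.0) -> int:
    """Run the loop for duration_s seconds and return how many frames were produced."""
    frames = 0
    start = last = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        now = time.perf_counter()
        update_simulation(now - last)  # decide what this frame should show
        render_frame()                 # draw it
        last = now
        frames += 1
    return frames  # frames produced per duration_s -> the frame rate
```

The faster the two steps complete, the more iterations fit into each second, which is exactly why a more powerful CPU and GPU yield more frames per second.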
A powerful gaming PC is usually paired with an ultra-high-quality monitor; otherwise, how are you going to take advantage of your new graphics card if your monitor cannot display the image as neatly as your GPU produces it?
The thing is, a monitor’s quality is measured by its resolution (1080p, 4K, etc.) and its refresh rate, expressed in hertz (Hz). The latter refers to how many times per second your monitor redraws (or refreshes) the image.
The absurdity of the current fps quest is that some computers can run some games at upwards of 400 fps. The problem is that, to this day, no monitor can refresh the image 400 times a second (the current maximum is 360 Hz at 1080p), making the extra computing power useless: the computer is limited by the monitor.
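The bottleneck is easy to express: the screen can never show more frames per second than it refreshes, so the displayed rate is the minimum of the two. A one-line sketch (the function name is my own):

```python
def displayed_fps(rendered_fps: int, refresh_hz: int) -> int:
    """The monitor can only show as many frames as it refreshes per second."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(400, 360))  # 40 rendered frames per second never reach the screen
print(displayed_fps(240, 360))  # here the monitor is no longer the bottleneck
```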
Furthermore, to the average human eye, everything above 60 fps looks practically the same, so what is even the point of chasing ever-higher frame rates?
Enticknap, L. D. G. (2005). Moving image technology: From zoetrope to digital. Wallflower Press.