PC Gaming Part 2 – Frames & time
Heyho lovely Artsygamer crowd!
As promised in part 1 of my article series about PC gaming, I want to talk about some of the common tech terms, how gaming can be influenced by them and how the PC platform is great for people who care about it.
When talking about framerate, the discussion is usually about 30 frames per second versus 60 frames per second and which of the two is “enough”. The problem is that through false analogies and poor information, some people even think the human eye can't perceive more than 30 frames per second. But let's take a step back first and see why “frames per second” actually tells you less than you would expect, what people actually mean when talking about frames per second (even without being aware of it) and what the better metric is.
So, framerate is basically the rate at which your system displays its content on its output. In the case of a game console, this could be what is happening in a game being displayed on your TV. Framerate is a time-dependent metric, and seconds are the only time unit that makes sense here, so we talk about frames per second. The reason this metric is a pretty bad one is that it tells you nothing about the distribution of the frames. If a game renders 59 frames in 0.01 seconds each but the next frame takes 0.41s, the game technically runs at 60 frames per second, yet it will feel unplayable because one frame stays on screen for over a third of a second, every second.
What people automatically assume when talking about frames per second is that those frames are distributed equally over one second, which is not always the case. People have gotten used to frames per second because framerate as a metric is much older than games. I leave it up to you to read up on movies, filmmaking and frames per second, but the conclusion is: framerate is a metric that is only sufficient to express a game's performance in the best case. Since the load a game puts on a machine varies and depends a lot on how the player behaves in the game, it's actually rare for a game to deliver frames with constant timing by itself. Thus the better metric is the time each frame stays displayed until the next one arrives (which also expresses the time needed to render that next frame): the frametime.
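If you like tinkering, you can see this effect in a few lines of Python. This little sketch uses the made-up frame distribution from above (59 quick frames plus one very slow one): the average looks like a smooth 60 frames per second, while the frametimes reveal the stutter.

```python
# The hypothetical frame distribution from the text: 59 quick frames
# plus one very slow frame, all within a single second.
frametimes = [0.01] * 59 + [0.41]  # seconds per frame

total_time = sum(frametimes)                  # adds up to ~1.0 second
average_fps = len(frametimes) / total_time    # 60 frames in 1 second
worst_frametime = max(frametimes)             # the stutter the average hides

print(f"average: {average_fps:.0f} fps")            # looks like a smooth 60 fps
print(f"worst frametime: {worst_frametime}s")       # but one frame hangs for 0.41s
```

Same average framerate, completely different experience: that's exactly why frametime is the more honest metric.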
So when talking about frametime, we're getting closer to two topics important for the conversation: the refresh rate of your display and your ability to perceive. We'll cover refresh rate first because it's pretty easy: most displays update at 60Hz (Hz stands for Hertz, the unit used in physics to express frequency). Basically, 1 Hertz is 1 unit per second, so 60 Hertz means 60 units per second. In the case of a 60Hz display, it means the display will update the displayed image 60 times per second. There are also displays out there that can handle higher frequencies; the more common ones are 144Hz displays.
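Converting a refresh rate into its per-frame time budget is just taking the reciprocal; a tiny sketch:

```python
# Refresh rate in Hz is updates per second, so the time budget per
# frame is simply its reciprocal.
for hz in (30, 60, 144):
    print(f"{hz:>3} Hz -> {1 / hz * 1000:.2f} ms per frame")
```

This prints 33.33 ms for 30Hz, 16.67 ms for 60Hz and 6.94 ms for 144Hz, which are exactly the frametimes we'll keep running into below.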
But can you actually perceive the difference between 30 and 60 frames per second (or, to stay in our newly discovered metric of frametime: 0.0333s per frame and 0.0166s per frame)? Well, yes and no. We need to be more precise here and also shouldn't generalise too much, because there will always be exceptions. If we're talking about changes in brightness, most humans can't perceive changes faster than 0.1s per change. This is used in modern lighting, where very bright light is emitted at a high frequency rather than less bright light being emitted constantly. The human eye can't tell the difference most of the time, but it saves power. If we're talking about movement, however, the human eye is incredibly capable. The problem here is that the human brain is equally great at filling in missing information, so it's hard to tell where the limit is, but 0.002s should be perceivable for most human beings (that's 2 milliseconds!).
So if human beings are capable of perceiving 60 frames per second with 0.0166s of display time per frame, and displays are capable of displaying 60 frames per second (not just capable, actually: they do it whether you want it or not), why do games (at least on consoles) so often aim for a frametime target of 0.0333s, resulting in 30 frames per second? Well, the answer is a mixture of hardware capability and game design focus. Games today often focus on larger areas or “open worlds”, and the problem is that while the GPU power in consoles is actually enough to support those quite easily, it's the CPU power holding these games back and limiting them to 30 frames per second. Of course all objects in the scene are rendered by the GPU, but some instance needs to tell the GPU which objects to render. This instance is the CPU, and the whole process of telling the GPU to render something is called a “draw call”, because the CPU calls out to the GPU to draw something on the screen. Doubling the framerate therefore means at least doubling the number of draw calls the CPU must issue per second. And we haven't even talked about physics calculations, NPC behaviour etc. yet; many of those calculations run on the CPU as well. But how does a longer frametime affect your enjoyment of the game?
Latency is basically the amount of time between a cause A and its corresponding effect B. When talking about latency in our context, I want to emphasise two things. First, the latency between your action as a player and the outcome of that action being presented on the screen. This depends on multiple factors. The first is the signal processing of your input device, which is usually quite fast (less than 2 milliseconds). The next is the part of the game's code that processes the input. This of course varies from game to game, but it shouldn't be complex enough to take up a noticeable amount of processing time. After processing the input, all the code that generates and renders the picture runs; the whole runtime of that code makes up our frametime. After that, the rendered image needs to be shown by the display. And here another, mostly unknown, topic strikes: the enhancement features of TVs. See, most TVs automatically take the images they receive and try to enhance them by applying certain filters and algorithms. This takes time, so although the system has already finished putting out the frame, you won't get to see it yet. Good TVs offer a specific gaming mode in which these enhancements are disabled. This cuts the display lag from otherwise up to 300 milliseconds (0.3s) down to around 20 milliseconds (0.02s). PC monitors are generally between 1 and 8 milliseconds (0.001s and 0.008s). So in a gaming setup with a TV, the latency not caused by code (which we can call constant latency) is roughly 22 milliseconds (0.022s). If our game now runs at 30 frames per second with a constant frametime of 0.033s, we end up at a total latency of 0.055s. If the game runs at 60 frames per second with a constant frametime of 0.016s, we end up at 0.038s. This means lower frametimes make the game react more promptly to your actions.
The second thing is your ability to react to what happens in the game, which also depends on the latency. Taking our figures, a game running with frametimes of 0.016s lets you react about 30% faster to changes in the game. This improves further if your display can handle lower frametimes: a 144Hz display can be served with a 0.0069s frametime, resulting in a total lag of 29 milliseconds (about 48% faster than our 30 frames per second case). Of course this raises the question of whether you need such fast reaction times, which in turn depends on the game and its mechanics. A fast-paced competitive shooter is a scenario that benefits from very low frametimes, while a tactical turn-based RPG doesn't need low frametimes to ensure perfect playability.
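If you want to play with these numbers yourself, here is a small Python sketch of the latency model above. The figures (2ms input processing, 20ms TV game-mode lag) come straight from the text; everything else is just arithmetic, so the percentages come out slightly more precise than the rounded ones in the prose.

```python
# "Constant" latency from the text: input device processing plus the
# display lag of a TV in gaming mode, both in seconds.
CONSTANT = 0.002 + 0.020

# Frametime per scenario, as used in the article.
cases = {"30 fps": 0.033, "60 fps": 0.016, "144 Hz": 0.0069}

baseline = CONSTANT + cases["30 fps"]  # the 30 fps case we compare against

for name, frametime in cases.items():
    total = CONSTANT + frametime
    faster = (baseline - total) / baseline * 100
    print(f"{name}: {total * 1000:.0f} ms total latency ({faster:.1f}% faster than 30 fps)")
```

The totals match the article: 55 ms at 30 fps, 38 ms at 60 fps and 29 ms on a 144Hz display.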
So far we've focused on the 0.033s and 0.016s frametime examples, but why is that? I mentioned earlier that frametimes are usually variable, so wouldn't it make sense to just let the game run as fast as it can, ending up with more than 30 frames per second in most cases? Well, let's enter the realm of image quality.
I mentioned that displays typically update at 60Hz, which means the display will pull an image from the system every 0.0166s. If our game runs with a 0.033s frametime, the display will pull the same image twice before the next one is ready. But if our game now runs faster, say with a frametime of 0.02s, the image pulled by the display will not be fully updated. Instead, a fraction of the old image will be displayed, with the other fraction being the new image. The frame is torn. The result is a flickering line that often appears to travel across the screen. This means image quality will only stay intact if the frametimes stay at a multiple of the display's refresh interval (including multiples of 0.5, 0.25, etc.). That's why frametimes of 0.033s are used. The technique that makes sure the game doesn't run faster is called V-Sync (which stands for vertical synchronisation; vertical because updating the display happens vertically). V-Sync essentially makes the game wait to deliver the image, so that the display never pulls an incomplete frame.
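Here's a toy Python model of that mismatch (not how a real GPU and display actually talk to each other, just the arithmetic): with a 0.02s frametime on a 60Hz display, most buffer swaps land in the middle of a scanout, so most refreshes would show a torn image; only every fifth swap happens to line up with a refresh.

```python
from fractions import Fraction  # exact arithmetic avoids float rounding noise

REFRESH = Fraction(1, 60)     # the display scans out a new image every 1/60 s
FRAMETIME = Fraction(2, 100)  # the game finishes a frame every 0.02 s, as in the text

for i in range(1, 7):
    t = FRAMETIME * i               # moment the game swaps in its new frame
    torn = (t % REFRESH) != 0       # a swap mid-scanout means a torn refresh
    print(f"swap at {float(t) * 1000:.0f} ms: {'torn' if torn else 'clean'}")
```

The swaps at 20, 40, 60, 80 and 120 ms come out torn; only the swap at 100 ms (exactly six refreshes) is clean. V-Sync avoids this by delaying each swap until the next refresh boundary.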
It would be easy to jump to the conclusion that the PC has an advantage because of the potential increase in power and the resulting performance delivering shorter frametimes. That is just one of the factors in favour of the PC, though. The main advantage is the flexibility. Players can tweak every game to their liking, deciding what is more important to them for each game. On a console, this decision is taken away from you because the developers created an experience targeted specifically at the hardware at hand. On the PC, you could even decide to accept torn frames for better frame timing if your PC can't quite handle the game at the visual settings you want.
End of part 2
So that’s it. I’ll admit the article was more about explaining some technical details than about PC gaming itself. Sorry ;) But if you’ve made it this far, congratulations! I hope you’ve learned a thing or two that helps you understand why your games behave the way they do. Maybe you even jumped up from your seat to check whether your TV has a gaming mode. That’s great! If you have any more questions or want a certain topic covered, let me know: I’m thinking about part 3 of the series and what it should be about, so your input would help me a lot.
Until next time and game on!