A respectable frame rate is essential to the enjoyment of video games. By respectable, we don't mean a target number like 100 or 120; we mean maintaining a smooth, consistent frame time so the experience feels pleasant. A lot of factors affect your FPS (frames per second), and misconceptions about in-game settings make things even worse. Many gamers fall prey to these misunderstandings, costing themselves performance and sometimes even worsening their gameplay experience. This is unnecessary, especially considering the correct information is a Google search away. Our article covers a few of the most common misconceptions surrounding in-game settings.
1. You Should Play at Ultra Settings
A lot of people will tell you to play every game on Ultra settings. In reality, that slightly sharper texture on a leaf you'll never look at while playing won't make the game more beautiful; nothing will really differ except the massive FPS drop that might ruin your experience. We recommend optimizing your game so that it both looks good to you and delivers a frame rate you are comfortable with. That may well mean playing at Ultra settings, but remember, it is not a hard rule.
2. Ray Tracing Makes Games Look Beautiful
The community has a lot of misconceptions about ray tracing, so let's define it briefly. Ray tracing uses your CPU or GPU to trace the paths of light rays through a scene and render them accurately, producing the most realistic lighting possible. In short, ray tracing recreates real-world lighting. While this may sound neat and a must-have, it requires a lot of processing power. Nvidia pioneered hardware-accelerated ray tracing in 2018 with its RTX lineup of graphics cards, which include dedicated RT cores alongside the general-purpose CUDA cores; these specialized units accelerate the ray-intersection math, but they also add to the cost of the hardware.
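To make the idea concrete, here is a minimal, hypothetical Python sketch of what "tracing light paths" means at its simplest: one ray is fired through every pixel and tested against a single sphere. The camera, resolution, and scene values are invented for illustration; real engines fire far more rays per pixel, with bounces for reflections, shadows, and global illumination, which is where the enormous processing cost comes from.

```python
# A minimal, hypothetical sketch of the core ray-tracing idea: fire one ray per
# pixel and test it against scene geometry. Camera, resolution, and sphere values
# are invented for illustration only.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses."""
    ox = origin[0] - center[0]
    oy = origin[1] - center[1]
    oz = origin[2] - center[2]
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c            # direction is assumed normalized, so a = 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

width, height = 320, 180              # at 1920x1080 this would already be about
center, radius = (0.0, 0.0, -5.0), 1.0  # 2 million primary rays, before any bounces
hits = 0
for y in range(height):
    for x in range(width):
        # Map the pixel to a direction through a simple pinhole camera
        dx = (x / width - 0.5) * (width / height)
        dy = 0.5 - y / height
        norm = math.sqrt(dx * dx + dy * dy + 1.0)
        direction = (dx / norm, dy / norm, -1.0 / norm)
        if ray_sphere_hit((0.0, 0.0, 0.0), direction, center, radius) is not None:
            hits += 1
print(f"{hits} of {width * height} primary rays hit the sphere")
```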
Before the advent of RT cores, however, games faked lighting on top of rasterization, using tricks such as reflection probes, baked lightmaps, and shadow maps. Rasterization projects the scene's geometry onto the screen and then shades the resulting pixels to approximate the lighting conditions the real world would have produced; simply put, the technique fakes lighting rather than simulating it. It became the standard for consumer 3D games in the '90s, when computers weren't remotely powerful enough to handle ray-traced environments, and with decades of fine-tuning and improvements, it has become superb. At times, it even beats ray-traced environments.
Look, games are a form of art. The point is not to make them look realistic; it is to make them look good, even better than reality if possible. Ray tracing, therefore, might not be the killer feature developers should chase. Additionally, the technology is still in its early stages, and ray-tracing-capable hardware runs hot and is quite expensive. That, coupled with massive frame drops in exchange for improvements many people can't even notice, makes it more of a gimmick, at least for now.
3. Misconceptions Surrounding In-Game V-Sync Settings
V-Sync is supported across almost all graphics card lineups these days, and the option is usually available as a toggle in the in-game settings. Although V-Sync can help with screen tearing, it won't help when the game is struggling to maintain a frame rate at or above the monitor's refresh rate; with traditional double buffering, the frame rate then snaps down to an integer divisor of the refresh rate (60 Hz becomes 30, then 20, and so on).
V-Sync is more of a preventative technology. Simply put, it stops the graphics card from presenting a new frame before the monitor has completed its current refresh cycle. It therefore caps the card's output at the monitor's refresh rate, which eliminates tearing and, when the GPU can comfortably keep up, produces smoother gameplay.
However, V-Sync has limitations, and GPU manufacturers and game developers were well aware of them even when it was created. When your graphics card struggles to maintain a frame rate greater than or equal to your monitor's refresh rate, V-Sync introduces input lag. It has also been reported to cause stutters and even screen freezes. A good example is Call of Duty: Modern Warfare, which would freeze completely if V-Sync was turned on during the opening cinematic of the mission Proxy War.
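To illustrate why falling below the refresh rate hurts, here is a rough, hypothetical model of double-buffered V-Sync (not any vendor's actual implementation): a finished frame has to wait for the next vblank, so as soon as the GPU misses one refresh interval, the effective frame rate snaps down to the next divisor of the refresh rate, and the extra waiting is what you feel as input lag.

```python
# A rough, hypothetical model of double-buffered V-Sync: a finished frame can only
# be shown at the next vblank, so the effective frame time snaps to a whole
# multiple of the refresh interval.
import math

def vsync_fps(render_ms, refresh_hz=60):
    interval_ms = 1000.0 / refresh_hz                   # time between vblanks
    intervals_needed = math.ceil(render_ms / interval_ms)
    return 1000.0 / (intervals_needed * interval_ms)

for render_ms in (10.0, 15.0, 17.0, 20.0, 34.0):
    raw_fps = 1000.0 / render_ms
    capped = vsync_fps(render_ms)
    print(f"GPU could do {raw_fps:5.1f} fps -> {capped:4.1f} fps with V-Sync at 60 Hz")
```

Running this shows that a card capable of 58 fps gets forced down to 30 fps the moment it misses a single 16.7 ms refresh window, which is exactly the kind of drop players notice.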
Bottom line: V-Sync is not a feature you should simply leave turned on, as it can hurt your experience. There have been improved successors to V-Sync, namely Nvidia's G-Sync and AMD's FreeSync. G-Sync is proprietary, while FreeSync is built on the royalty-free Adaptive-Sync standard, so if you are shopping for a monitor, a FreeSync-enabled display is worth considering. These adaptive-sync technologies dynamically adjust the display's refresh rate to match the frame rate the GPU is actually outputting, which improves the experience regardless of how powerful your hardware is.
4. Increasing your Field of View Increases FPS
We have heard people claim that increasing your Field of View (FOV) increases your FPS. The rationalization is that a higher FOV makes every on-screen model smaller, so the game can load smaller textures for each one and the frame rate goes up. This does not make sense: an increased FOV means your computer has to render more models, so if anything it tends to cost a little performance.
This setting widens the angle of view; it has nothing to do with view distance. As an analogy, most modern smartphones pack both a wide and an ultrawide lens, letting users capture a wider shot simply by switching to the ultrawide lens, without physically moving the phone closer to or farther from the subject.
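If a picture helps, here is a tiny, made-up sketch of the geometry: FOV is just a viewing angle, and widening that angle pulls more objects into the view frustum, so there is more for the GPU to render, not less. The object layout is invented for the example.

```python
# A toy, made-up illustration: FOV is a viewing angle, and widening it pulls more
# objects into the view frustum, so there is more to render, not less.
import math

def visible_count(objects, fov_degrees):
    """Count objects whose angle from the view direction fits inside half the FOV."""
    half_fov = math.radians(fov_degrees) / 2.0
    count = 0
    for x, z in objects:                      # camera at the origin, looking down +z
        if z <= 0:
            continue                          # behind the camera
        if abs(math.atan2(x, z)) <= half_fov:
            count += 1
    return count

# A ring of 36 objects placed around the camera, 10 units away
objects = [(10.0 * math.sin(math.radians(a)), 10.0 * math.cos(math.radians(a)))
           for a in range(0, 360, 10)]

for fov in (60, 90, 110):
    print(f"FOV {fov:>3} degrees -> {visible_count(objects, fov)} of {len(objects)} objects in view")
```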
To sum it up, simply play at the FOV you are content with. Do not alter it to “get more FPS”.
5. Anti-Aliasing Increases Render Resolutions
While playing games, a smooth curved line can appear jagged, especially at lower resolutions like 900p or 1080p. The underlying problem is that displays are made up of square boxes of color called pixels, so a horizontal or vertical line looks perfect, but a diagonal or curved one looks like a staircase. This can look increasingly jarring in hyperrealistic game worlds, so game developers came up with a technology to fight the issue and called it Anti-Aliasing.
There are several forms of Anti-Aliasing, and while they differ in how they work, they share the same goal. Multi-Sampling Anti-Aliasing (MSAA) takes several samples per pixel along polygon edges, while Fast Approximate Anti-Aliasing (FXAA) detects jagged edges in the finished image and blends the colors across them. Either way, the missing information is filled in to produce a nearly smooth curve, and neither technique raises the render resolution.
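To see what "gathering samples" looks like in practice, here is a small, hypothetical sketch in the spirit of supersampling (not any specific engine's MSAA or FXAA implementation): each pixel crossing an ideal diagonal edge is tested at several sub-pixel points, and the averaged coverage turns the hard staircase into a gradient while the render resolution stays the same.

```python
# A small, hypothetical sketch of the sampling idea behind AA: pixels crossing an
# ideal diagonal edge are tested at several sub-pixel points and the results are
# averaged, turning a hard black/white staircase into intermediate shades.
def coverage(px, py, samples_per_axis):
    """Fraction of sub-samples in pixel (px, py) that fall below the line y = x."""
    n = samples_per_axis
    inside = 0
    for sy in range(n):
        for sx in range(n):
            x = px + (sx + 0.5) / n           # sub-sample position inside the pixel
            y = py + (sy + 0.5) / n
            if y < x:                          # "below" the diagonal edge
                inside += 1
    return inside / (n * n)

size = 6
for py in range(size):
    # 1 sample per pixel = aliased staircase; 4x4 sub-samples = soft gradient on the edge
    aliased = "".join("#" if coverage(x, py, 1) >= 0.5 else "." for x in range(size))
    smoothed = " ".join(f"{coverage(x, py, 4):.2f}" for x in range(size))
    print(aliased, "  ", smoothed)
```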
That said, the misconception is not entirely baseless. Apple's Retina displays, for example, render on-screen content at a higher resolution and then scale it down to the panel, packing in more information than strictly required. Unless you are gaming on such a display, though, Anti-Aliasing is the way to fix jagged edges without actually stepping up your render resolution.
6. 30 FPS Gaming is Bad
A lot of gamers will state without a second thought that 30 fps gaming should be avoided. While 60 fps does feel a lot smoother and is the recommended minimum for competitive games like Overwatch and Fortnite, 30 fps gaming is still perfectly satisfactory.
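For some perspective, here is the simple frame-time arithmetic behind that comparison.

```python
# Simple frame-time arithmetic: a frame rate really describes how long each frame
# stays on screen before the next one arrives.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms: 60 fps halves the time
# between frames, which is why it feels smoother, not because 30 fps is broken.
```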
Gaming at 60 frames per second even comes with a "problem": the extra smoothness and responsiveness can take away the cinematic feel of a game. That effect is unwelcome in titles that lean heavily on narrative and cinematics, such as A Plague Tale: Innocence or Life is Strange: True Colors, which is why plenty of in-game cinematics are pre-rendered at 30 fps regardless of the frame rate the game runs at during gameplay.
Although 60 fps and higher frame rates have their advantages, 30 fps gaming never was, and never will be, terrible. That is something many people aren't ready to accept.
Final word
These are some of the most common misconceptions in the gaming community. There are plenty more where they came from, but covering them all would make this article far too long. Gaming is all about the experience, so our crucial bit of advice to gamers is to play at the settings they find the most satisfying, instead of chasing tweaks that might ruin the experience.