Why do we talk about console performance in teraflops, but use other metrics for PC performance measurement?

I’ve always heard people compare console performance using teraflops, but I’ve never heard anyone mention that metric with regards to PCs. Why?

In: Technology

4 Answers

Anonymous

Ideally, to compare gaming machines (PC or console) you want to run a certain scene (or several) in a game at the same resolution and quality settings and compare the frame rates. Higher fps means that machine is better at playing that game. Repeat this with a bunch of popular games and you end up with an idea of each machine’s overall performance.
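The scoring step above can be sketched in a few lines. This is a minimal illustration, assuming we have already captured per-frame render times (in milliseconds) for one benchmark run; the function names are made up for the example:

```python
def average_fps(frame_times_ms):
    """Average frames per second over a captured run."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def one_percent_low_fps(frame_times_ms):
    """FPS of the slowest 1% of frames -- a common stutter metric
    reviewers report alongside the average."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 / (sum(worst[:n]) / n)

# A steady 16.67 ms frame time is ~60 fps; one 33.33 ms frame is a stutter.
run = [16.67] * 99 + [33.33]
print(round(average_fps(run)))        # ~59: the stutter barely moves the average
print(round(one_percent_low_fps(run)))  # ~30: but the 1% low exposes it
```

This is why reviewers quote both numbers: the average hides occasional slow frames that the 1% low makes visible.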

On PC, this is very easy: you can choose the exact settings and resolution the game runs at, and you can unlock the frame rate. On console, you typically can’t choose the exact settings, and the frame rate is capped at 30 or 60 fps. To make things worse, console settings and resolution are often dynamic, so a game might silently drop from 4K to 1080p when it has trouble maintaining 30 fps (which is great if you’re just playing the game, but terrible when benchmarking). These dynamic settings make it difficult even to compare generations of consoles, because a PS4 game on a PS5 might just run at a higher resolution instead of a higher frame rate.

Because of this, consoles have to be compared in other ways. As mentioned by other users, teraflops are not a great way to measure gaming performance. [Compare AMD’s RX 5500 XT graphics card (5.2 TFLOPs) with their RX 580 (6.2 TFLOPs)](https://www.anandtech.com/bench/product/2524?vs=2577). The RX 5500 XT performs better in many games despite having fewer TFLOPs.
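Part of the problem is that teraflops is a theoretical paper spec, not a measurement: it is just shader cores × clock speed × 2 (each core can do one fused multiply-add, i.e. two floating-point operations, per cycle). A quick sketch using AMD's published specs for the two cards above (core counts and boost clocks as listed by AMD; the function is made up for illustration):

```python
def theoretical_tflops(shader_cores, boost_clock_ghz, flops_per_cycle=2):
    """Theoretical peak TFLOPs: cores x clock x FLOPs-per-cycle.
    flops_per_cycle=2 assumes one fused multiply-add per core per cycle."""
    return shader_cores * boost_clock_ghz * flops_per_cycle / 1000.0

rx_580    = theoretical_tflops(2304, 1.340)  # ~6.2 TFLOPs
rx_5500xt = theoretical_tflops(1408, 1.845)  # ~5.2 TFLOPs
print(round(rx_580, 1), round(rx_5500xt, 1))
```

Nothing in that formula captures memory bandwidth, cache sizes, or architectural efficiency, which is exactly why the newer, "slower on paper" card can win in real games.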
