I understand gigahertz and gigabytes, but not gigaflops.

Thanks
Floating-point operations per second. Basically a number expressing how powerful a computer is: the more operations a computer can do per second, the faster/more powerful it is. And floating point is a way computers often handle numbers.
Let’s decipher what those letters mean – FLOPS – Floating Point Operations Per Second.
Floating point is a type of number that a processing unit can do calculations with. It’s a specially formatted decimal value: in contrast to integers, it can cover a very large range, from very small numbers to very large ones.
With FLOPS we describe how many calculations with these “decimal” numbers a computer can do per second.
Everyone else has pretty much explained the basics of what FLOPS are. Floating point operations per second.
It’s a quick and dirty way of expressing the computing power of a computer or processor. Typically it’s used in advertising as a quick and simple way of saying “Hey, this GPU/graphics card is this many times more powerful than that other one, because the number of FLOPS is bigger!”
Some processors and GPUs can do floating point arithmetic (fractions) in hardware, which is fast. Some chips have to do floating point arithmetic using integers, which is much slower.
Many applications, like video games, require floating point arithmetic, so a good measure of how useful the chip will be is floating point operations per second, or flops.
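You can get a feel for what "floating point operations per second" means with a toy benchmark. This Python sketch times a loop of multiply-adds; note that interpreter overhead dominates, so the number it reports is far below what the hardware can actually do (the loop structure, not the floating-point math, is the bottleneck):

```python
import time

def measure_flops(n=1_000_000):
    """Rough FLOPS estimate: time n iterations of a multiply-add.

    Each iteration does two floating-point operations (one multiply,
    one add), so the estimate is 2*n divided by the elapsed time.
    """
    x = 1.000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n):
        acc = acc * x + 1.0   # two floating-point ops per iteration
    elapsed = time.perf_counter() - start
    return 2 * n / elapsed

print(f"~{measure_flops():.2e} FLOPS (interpreter-limited)")
```

Real benchmarks (like LINPACK, used for supercomputer rankings) run tuned linear-algebra kernels in compiled code to measure what the chip itself can sustain.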
A “flop” is short for “floating point operation”. Floating point numbers are a type of number representation in computers, similar to scientific exponential notation (such as 3600 = 3.6e3).
FLOPS (or FLOP/s) tells you how many operations on floating point numbers a computer can do per second. A gigaflop is 1 billion flops, a teraflop is 1 trillion. The prefixes work the same for every unit.
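The prefix arithmetic is just powers of ten. A small sketch with a hypothetical 10-gigaflop machine:

```python
# SI prefixes work for FLOPS the same as for any unit.
KILO, MEGA, GIGA, TERA = 1e3, 1e6, 1e9, 1e12

gflops = 10.0         # hypothetical machine: 10 gigaflops
ops = gflops * GIGA   # = 1e10 operations per second

# How long would it take to do one teraflop's worth of work
# (one trillion floating-point operations)?
seconds = 1 * TERA / ops
print(seconds)  # 100.0
```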