A gigabit is simply a count of binary digits (bits): a billion of them. “Per second” turns that count into a rate, i.e. a stream of serial data.
A CPU also runs on binary, but in parallel. Simplistically speaking, a single-core, single-thread 32-bit processor “works” with 32 bits at one time. It looks into its in-tray and pulls 32 of those bits at once as a “word”. Whilst inaccurate, 32 bits at 5GHz works out to 32 x 5 billion, or 160 billion bits per second.
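To make the comparison concrete, here is that back-of-envelope arithmetic as a tiny sketch. The 32-bit word and 5GHz clock are the same toy numbers used above, not a claim about any real chip:

```python
# Toy numbers from the explanation above -- a simplification, not a real CPU model.
word_bits = 32                  # bits the CPU "works with" per cycle
clock_hz = 5_000_000_000        # 5 GHz clock

cpu_bits_per_sec = word_bits * clock_hz      # 160 billion bits/s
link_bits_per_sec = 100_000_000_000          # a 100 Gbit/s link at full capacity

print(cpu_bits_per_sec)                       # 160000000000
print(cpu_bits_per_sec / link_bits_per_sec)   # 1.6
```

So even by this crude count, the CPU chews through raw bits faster than a 100Gbit link can deliver them, which is the point of the comparison.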
Also, 100Gbit internet (ethernet?) is a statement of capacity, and that rate is rarely sustained for long periods. It *can* run at up to 100Gbit.
This is a bit of an oversimplification, but it illustrates the numbers. In reality, the CPU is not working hands-on with the network data; that is managed separately. Consider the CPU the boss, whilst the network interface is a delegated expert on network stuff, including turning serial data into usable “words” or bytes.