If computer clocks max out somewhere around 5GHz, how is it possible for 100Gbit internet to exist? How does the computer possibly transfer that much data per second?
100 Gbps is 12.5 gigabytes per second (8 bits to a byte). Many modern servers can move tens of GB/s internally. PCIe 4.0, a common standard for communication between the CPU/memory and I/O cards, runs at up to about 32 gigabytes per second in a x16 slot because it spreads the data across 16 lanes working in parallel, so no single signal has to switch anywhere near 100 GHz.
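If it helps, here is the arithmetic behind those numbers as a small sketch. The per-lane figure for PCIe 4.0 is a rounded assumption (roughly 2 GB/s per lane per direction); the rest is just unit conversion.

```python
# Rough bandwidth arithmetic behind the answer (illustrative, rounded figures).

LINK_GBPS = 100                     # 100 Gbit/s network link
link_gbytes = LINK_GBPS / 8         # bits -> bytes: 12.5 GB/s

# Assumed: PCIe 4.0 carries roughly 2 GB/s per lane in each direction.
# A x16 slot has 16 lanes running in parallel, so the total adds up
# without any one wire toggling at 100 GHz.
PCIE4_GBYTES_PER_LANE = 2
LANES = 16
pcie_gbytes = PCIE4_GBYTES_PER_LANE * LANES   # ~32 GB/s

print(f"100 Gbit/s link   ~ {link_gbytes:.1f} GB/s")
print(f"PCIe 4.0 x16 slot ~ {pcie_gbytes} GB/s")
```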