Because there never was a reason for this to happen. Why do you believe it should?
Moore’s law is not really a law, just an observation. And it only talks about the transistor count on a chip, not how fast the chip runs.
At some point, engineers ran out of new stuff to put on a chip. We added more instruction sets, tons of cache, the whole northbridge, then memory controllers, then multiple cores.
Some chips are now complete systems on a single chip (called SoCs, for “system on chip”).
Now, to make a chip run faster, you need a faster clock; but somewhere above 5 GHz the speed of light itself starts to be a problem in ordinary operation, because a signal can barely cross the chip within a single clock tick. So we can’t just crank the clock. Another way is to run colder, but not everyone can run their chips near 0 K (absolute zero).
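To put a rough number on that speed-of-light limit, here is a small back-of-the-envelope sketch in Python (not from the original answer; the clock values are just examples, and real on-chip signals travel noticeably slower than light):

```python
# Rough upper bound: how far could any signal travel in one clock cycle,
# if it moved at the speed of light? Real on-chip signals are slower.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def max_distance_per_cycle_cm(clock_hz: float) -> float:
    """Upper bound on how far a signal can move in one clock tick, in cm."""
    return SPEED_OF_LIGHT_M_PER_S / clock_hz * 100  # metres -> centimetres

for ghz in (1, 3, 5, 10):
    d = max_distance_per_cycle_cm(ghz * 1e9)
    print(f"{ghz:>2} GHz: at most {d:.1f} cm per cycle")

# At 5 GHz the bound is about 6 cm, roughly the size of a CPU package,
# so there is very little slack for signals to cross the chip and settle.
```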
Also, smaller transistors have been better, as they waste less energy per switch, but they also make chips smaller, which is a problem when you want to dissipate heat: the same heat comes out of a smaller area. And we are at a point where we can hardly make them smaller.
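As a rough illustration of why raising the clock runs straight into the heat problem, the textbook approximation for dynamic switching power is P ≈ C·V²·f: power grows with frequency and with the square of voltage, and higher clocks usually need more voltage. The capacitance and voltage figures in this sketch are made up purely for illustration, not measurements of any real CPU:

```python
# Illustrative sketch of the classic dynamic-power approximation
# P ≈ C * V^2 * f. The numbers are hypothetical; the point is how
# power scales when you push the clock, not the absolute values.

def dynamic_power_watts(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
    """Approximate dynamic switching power of a chip."""
    return capacitance_f * voltage_v ** 2 * freq_hz

BASE_C = 1e-9  # effective switched capacitance in farads (hypothetical)

scenarios = [
    ("3 GHz @ 1.0 V", 1.0, 3e9),
    ("5 GHz @ 1.2 V", 1.2, 5e9),  # higher clocks usually need more voltage
    ("7 GHz @ 1.4 V", 1.4, 7e9),
]

base = dynamic_power_watts(BASE_C, *scenarios[0][1:])
for name, v, f in scenarios:
    p = dynamic_power_watts(BASE_C, v, f)
    print(f"{name}: ~{p:.0f} W ({p / base:.1f}x the 3 GHz baseline)")
```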
For several years now, research has focused on those limits and on reducing power consumption instead, and the performance-per-watt ratio has skyrocketed.
Science works that way. When something is discovered, it opens a completely new field, so there is a lot to explore. Once most things have been explored and well understood, only fringe cases remain, so progress is slow, until something else major is discovered.