It is not quite like that. If we use the proper temperature scale, there are simply *no negative temperatures*.
Then the question becomes: why, in our everyday life, is the environment mostly at 300 Kelvin, plus or minus a few tens of degrees, even though there are places in the universe where it is much hotter?
And the answer to that would require talking about our biology and so on; it is not a question of fundamental physics.
The simplest answer is that your thermometer is stupid. When Fahrenheit and Celsius were invented, we didn’t know how cold the coldest cold could be, so the inventors picked an arbitrary temperature and said “This is zero”. They could have picked anything. Celsius picked the freezing point of water because, hey, water is everywhere, and decided that 100 would be the temperature at which water boils. Then the range in between was divided into 100 equal “degrees,” an arbitrary unit of temperature.
Later, we figured out that there is a limit to how cold the coldest cold can be, and a new temperature scale called Kelvin was invented. The absolute coldest a thing can ever theoretically be is 0 Kelvin. From there you work your way up in degrees (a Kelvin is the same size as a Celsius degree) and eventually reach the temperatures we normally experience, roughly 250–350 Kelvin.
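If it helps to see the arithmetic, here is a minimal sketch in Python (assuming the standard offsets: 0 Kelvin = -273.15 °C = -459.67 °F) that converts everyday readings to Kelvin:

```python
# Minimal sketch: converting Celsius and Fahrenheit readings to Kelvin.
# Assumes the standard offsets (0 K = -273.15 C = -459.67 F).

def celsius_to_kelvin(c):
    return c + 273.15

def fahrenheit_to_kelvin(f):
    return (f - 32) * 5 / 9 + 273.15

print(celsius_to_kelvin(0))      # 273.15 -- water freezes
print(celsius_to_kelvin(100))    # 373.15 -- water boils
print(fahrenheit_to_kelvin(32))  # 273.15 -- same freezing point, different label
```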
So, if your thermometer were smart, it would use Kelvin, start at zero, and only ever go up: no negative temperatures. Since temperature is a measure of how much/fast atoms are moving around (average kinetic energy), negative temperature doesn’t really make sense anyway. It’s not like you can move slower than being completely stopped, which is the state at 0 Kelvin.
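To put a rough number on that “average kinetic energy” idea, here is a small sketch assuming the simplest textbook model (an ideal monatomic gas, where the average kinetic energy per atom is (3/2) × k_B × T), showing that the energy scales directly with the Kelvin reading and bottoms out at zero:

```python
# Rough sketch, assuming the ideal monatomic gas model: average kinetic
# energy per atom is (3/2) * k_B * T, which scales with the Kelvin
# temperature and reaches zero at 0 K.
BOLTZMANN = 1.380649e-23  # k_B, in joules per Kelvin

def avg_kinetic_energy_joules(kelvin):
    return 1.5 * BOLTZMANN * kelvin

print(avg_kinetic_energy_joules(300))  # ~6.2e-21 J at room temperature
print(avg_kinetic_energy_joules(0))    # 0.0 -- classically, motion stops
```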
So, in short, your question highlights the fact that the commonly used temperature scales were chosen arbitrarily. Because of this, they stop making sense once you get outside the range of temperatures we normally experience in weather.
Temperatures that tend to be useful for humans are, in the grand scheme of things, remarkably cold. There’s no need for our “human useful” scales to be calibrated to such high numbers. To humans, there isn’t really a difference between 10,000 °C and 1,000,000 °C: they’re both too dang hot. That’s the same reason scientists use Kelvin; starting from zero makes the most sense in experiments.