If 100°C is equal to 212°F then why is 1°C not equal to (212/100) or 2.12°F but instead equal to 33.8°F?



You mistakenly assumed that 0 C = 0 F. That isn’t true. If it were, you’d be right: 1 C *would* be 2.12 F. But it isn’t.

0 C = 32 F. Why? The Celsius scale pegged 0 to the freezing point of water. The Fahrenheit scale didn’t. The degrees are also different sizes: the 100 Celsius degrees between freezing and boiling span the same range as 180 Fahrenheit degrees (32 to 212), so each Celsius degree is 1.8 Fahrenheit degrees. Put together, F = 1.8 × C + 32, which is why 1 C comes out to 33.8 F.
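If it helps to see the arithmetic spelled out, here’s a minimal Python sketch of that conversion (the function name is just illustrative):

```python
def celsius_to_fahrenheit(c):
    # Scale by 9/5 (each Celsius degree spans 1.8 Fahrenheit degrees),
    # then shift by the 32-degree offset between the two zero points.
    return c * 9 / 5 + 32

print(celsius_to_fahrenheit(0))    # 32.0  -- freezing point of water
print(celsius_to_fahrenheit(1))    # 33.8  -- not 2.12, because of the offset
print(celsius_to_fahrenheit(100))  # 212.0 -- boiling point of water
```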

What is the zero point of the Fahrenheit scale pegged to, then? No one is really sure; Fahrenheit was a secretive sort. Some people say it’s the temperature of a certain brine solution, but that brine recipe probably isn’t why the zero point landed where it did. More likely, the brine was devised after the fact as a reproducible way to hit that chosen temperature when calibrating newly made thermometers. The zero point itself was most likely just “the temperature of some basement on the coldest day of winter” or something similarly mundane.

Why such an arbitrary choice? Well, you have to give context some credit here: this was still an era when negative numbers were considered loathsome to work with, since they created pesky clerical errors in record-keeping and calculation. So in a society without refrigeration technology, pegging the zero point of your scale to the coldest natural temperature you’d expect to encounter was an attractive way to avoid negative numbers altogether.
