Why isn’t water temperature taken into account when calculating evaporation rate?

Basically the title. First I looked up a calculator, and it had different variables, even air temperature, but not water temperature. I always thought hotter water means faster evaporation. If that's true, how is it that water temperature doesn't matter when calculating the evaporation rate? Is it because evaporation requires a lot of heat, and no matter how much heat the water has, it's not enough to affect the evaporation rate?

In: Physics

3 Answers

Anonymous 0 Comments

No, the water temperature DEFINITELY affects evaporation rate. Whatever calculator you were looking at must be assuming that the water is at the same temperature as the air temperature you input. If the water were being heated to a temperature above the air (e.g. 60°C water in 25°C air), the calculation would be more complex and you'd have to use a different calculator / model.
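For a rough feel of how water temperature would enter such a model, here's a sketch in Python using a common empirical pool-evaporation correlation. The (25 + 19v) coefficient and the Magnus vapour-pressure approximation are illustrative assumptions for this sketch, not whatever that particular calculator uses:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Magnus approximation of saturation vapour pressure over water, in Pa."""
    return 610.94 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def humidity_ratio(vapor_pressure, total_pressure=101325.0):
    """kg of water vapour per kg of dry air at a given vapour partial pressure."""
    return 0.622 * vapor_pressure / (total_pressure - vapor_pressure)

def evaporation_rate(water_temp_c, air_temp_c, rel_humidity, air_speed, area_m2):
    """Rough evaporation rate in kg/h from an open water surface.

    Uses the empirical correlation g = (25 + 19*v) * A * (x_s - x),
    where x_s is the saturation humidity ratio evaluated at the *water
    surface* temperature -- that's where water temperature enters.
    """
    x_s = humidity_ratio(saturation_vapor_pressure(water_temp_c))
    x = humidity_ratio(rel_humidity * saturation_vapor_pressure(air_temp_c))
    theta = 25 + 19 * air_speed  # kg/(m^2 h) per unit humidity-ratio difference
    return theta * area_m2 * (x_s - x)

# 1 m^2 surface, 50% relative humidity, light 0.5 m/s airflow
print(evaporation_rate(25, 25, 0.5, 0.5, 1.0))  # water at air temperature: ~0.35 kg/h
print(evaporation_rate(60, 25, 0.5, 0.5, 1.0))  # heated water: roughly 5 kg/h
```

The point of the sketch is just that heated water pushes up the saturation humidity ratio at the surface a lot, so the same formula gives a much larger rate.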

Anonymous 0 Comments

The assumption will be that the air and the water are at the same temperature. This is a good assumption if the water has been left out for a long time and isn't being heated directly. If the water is being heated, or hasn't been allowed to cool to air temperature, there will be other effects to take into account (e.g. convection in the water and in the air) that make the calculation significantly more complicated.

Anonymous 0 Comments

Water has a Heat of Vaporization of ~2260 J/g and a Specific Heat of ~4.2 J/g°C.

That means that heating 1.0 g of water from 5°C to 100°C takes ~400 J but boiling it away takes another ~2260 J.
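A quick sanity check of those numbers, using the rounded constants above:

```python
specific_heat = 4.2          # J/(g*degC)
heat_of_vaporization = 2260  # J/g

heating = specific_heat * (100 - 5) * 1.0  # warm 1 g of water from 5 C to 100 C
vaporizing = heat_of_vaporization * 1.0    # then turn that 1 g into vapour

print(heating)     # ~399 J
print(vaporizing)  # 2260 J
```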

Actively boiling water is somewhat different from passive evaporation, but the fact that the Heat of Vaporization is so much larger than the Specific Heat still comes into play: corrections to the evaporation rate based on the exact temperature of the water end up being relatively minor compared to everything else.