# Eli5. Why is meteorology always so off base?


I can literally check the weather a day before and it ends up being 80% incorrect. It will say it’s going to rain tomorrow and then not a drop.

It will call for storms all week and then end up being a small storm for 3 hours on Wednesday.

In: Earth Science

The issue here is that you're thinking of the forecast as being solely about wherever you happen to be on that day, when it's actually about the chance of rain over an entire forecast area, and that area is probably quite large, not just your suburb or town.

So, for example, if there's an 80% chance of rain over the area but you live in a rain shadow within that area, then chances are it's not going to rain much where you are, no matter how high the percentage is, until it gets high enough that it has to include your spot too.
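For context, the US National Weather Service actually defines the probability of precipitation as forecaster confidence multiplied by the fraction of the area expected to get rain, which is why a high percentage can still leave your particular spot dry. A minimal sketch (the function and variable names here are mine for illustration, not any real API):

```python
# NWS definition of probability of precipitation: PoP = C * A, where
#   C = confidence that measurable rain falls SOMEWHERE in the forecast area
#   A = fraction of that area expected to receive the rain

def pop(confidence: float, area_fraction: float) -> float:
    """Probability of precipitation for the forecast area as a whole."""
    return confidence * area_fraction

# "80% chance of rain" can mean: certain it rains, but over 80% of the area...
print(pop(1.0, 0.8))   # 0.8
# ...or a coin-flip chance of rain that would cover the whole area if it comes.
print(pop(0.5, 1.0))   # 0.5
```

Either way, a dry day at your house doesn't make an 80% forecast wrong.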

Because meteorology is the science of educated guessing.

Though I'm inclined to disagree with your experience of it being 80% wrong the day before; I would say in my experience it's usually 80% right the day before. So this could be an example of a subconscious bias (confirmation bias), where you're just more likely to notice the times the forecast was wrong.

It's more likely to be wrong the further out from the date you are, though, so a week ahead of time, yeah, it'll be off.

This is because meteorology is based on two things.

Models, which are built off of decades of collected data.

And then current reports from cities/towns/airports/weather stations dozens to hundreds of miles away.

Essentially, meteorologists take all the data that they have from storms and weather in the past, and then compare that to the data that they collected that day.

Essentially you go "well, it's 80% humidity, 70 degrees, 1.015 atm pressure with a 15 mph wind coming from the west here now, and it's currently raining in a city 100 miles to our west" (these are just random numbers I'm making up for the example).

Then you can punch all that information into a computer model that compares your current conditions to times when similar conditions have occurred in the past, to make an educated guess about what will happen to you now.
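A toy sketch of that "compare to past conditions" idea: find the historical day whose measurements best match today's and use what happened next as the forecast. The data and numbers below are made up for illustration; real operational models integrate the physics of the atmosphere rather than doing a simple lookup like this:

```python
import math

# (humidity %, temp °F, pressure atm, wind mph) -> what happened the next day
# (hypothetical records, invented for this example)
history = {
    (80, 70, 1.015, 15): "rain",
    (40, 85, 1.020, 5):  "clear",
    (75, 65, 1.010, 20): "storms",
}

def forecast(today):
    """Return the outcome of the closest past day (Euclidean distance)."""
    best_match = min(history, key=lambda past: math.dist(past, today))
    return history[best_match]

print(forecast((78, 71, 1.014, 14)))  # "rain" -- today most resembles a rainy day
```

Even this crude version captures the core idea: the forecast is only as good as how well today's conditions match something we've seen before.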

This is why, when you see the "path of a hurricane" on the news, you'll either see a big cone-shaped thing showing its general path.

OR

You’ll see a map with what looks like 2 dozen squiggly lines on it. Each of those squiggly lines is a different Computer Model that is trying to predict the path of the storm 5 days to a week or two ahead of time by comparing the current conditions of the storm to the hundreds of hurricanes we’ve tracked and gathered data from over the last several decades.
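Those squiggly lines fan out because the atmosphere is chaotic: tiny differences in the starting conditions grow exponentially over time. A minimal sketch of that effect, using the logistic map as a stand-in for a real weather model (the numbers are purely illustrative, not meteorological):

```python
def run(x, steps):
    """Iterate the logistic map in its chaotic regime (r = 3.9)."""
    for _ in range(steps):
        x = 3.9 * x * (1 - x)
    return x

# Five ensemble "members" with nearly identical starting conditions.
members = [0.3 + i * 1e-6 for i in range(5)]

short = [run(x, 5) for x in members]    # short-range forecast
long_ = [run(x, 50) for x in members]   # long-range forecast

spread_short = max(short) - min(short)  # members still agree closely
spread_long = max(long_) - min(long_)   # members have fanned way out
print(spread_short, spread_long)
```

When the lines (members) stay bunched together, forecasters trust the prediction; when they diverge like this, the honest answer is a wide cone of uncertainty.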

Edit: an extra fun fact: during WW2, Nazi Germany got cut off from most of its weather data. In the Northern Hemisphere, weather typically moves from west to east, and since the UK, Canada, and the USA were to Germany's west and were not about to share weather data with an enemy, the Germans had to gather their own.

So, they sent a submarine with weather equipment all the way across the Atlantic to the coasts of Canada and Greenland and set up remote automated weather stations that would collect data and radio the information back to Germany, so they could try to predict the weather.

Your definition of accuracy is probably not achievable. NOAA, which gathers and distributes virtually all of the data used by weather forecasters, has accuracy criteria, makes forecasts, and grades them as well.
Their criteria apply to forecast areas, of which there are over a thousand in the US. That's a lot, but each one is still a large chunk of territory, with lots of potentially variable microclimates within it. Go look at the boundaries for your location and see how well or poorly they have done.
More importantly, how often have they failed to warn you of impending storms or violent changes in pressure? In the 19th century there was no way of knowing a hurricane was coming. Until satellites went up in the 1960s, radio reports from ships were the only warning. Now we know days ahead.