Main question: Why does a particular highway have roughly the same number of automobile-related deaths every year if a car crash is unexpected and dependent on the actions of individual drivers?
More detail: in Tennessee, they have those electronic signs on the interstate that occasionally show the number of roadway fatalities in the state for the current year, and beneath that it will show the previous year's number. The numbers are almost always close, within a reasonable margin of error, even accounting for slightly more drivers on the road each year due to population growth.
Having been in a serious car crash, I understand the seemingly arbitrary way a sequence of events can play out depending on multiple factors such as driver awareness, vehicle reliability, roadway conditions, weather, etc. Even a highly skilled and fully focused driver in a perfectly functional car on a road with no issues can be involved in a crash, say due to a different driver’s condition, a deer running out into the road, or a sudden gust of wind blowing a newspaper onto their windshield and obscuring their vision for a few seconds.
This may be a multi-part question, but it’s been in the back of my mind for years now, and I woke up today wondering about radioactive decay, so I started reading about isotope decay. That’s not really what this question is about, but the seemingly random probability of an isotope decaying in an independent manner (not related to the actions of any other isotopes nearby) made me connect these two ideas and start to wonder about it again.
So getting back to the roadway fatalities, in this case for the state of Tennessee, I found the 20 year statistical data at the site linked below. It shows some ups and downs over the years, presumably with the reductions coming from improved safety features in cars, yet the total number for 2020 (1,221) is closer to the total for 2001 (1,251) than it is to the total for even the previous year 2019 (1,148).
So despite population growth and enhanced safety features, we are kind of right back where we started, or rather, where we’ve always been.
I could also expand this question to cover other events that should ideally never occur, such as murder. Why does something as abominable and world-shattering as murder, at least from an individual perspective, happen with roughly the same frequency and rate when looking at a large sample size? Shouldn’t something like that be the exception and not the norm? Is it somehow related to density?
This has me wondering about probability, fate, design, and all sorts of things both rational and irrational.
Anyway, thanks for reading this. Even if nobody responds, I think it’s helped to just get it out in writing for the next time I think about this in a few months.
TLDR: Why do independent actions and events that deviate from the norm happen with almost certain predictability?
In: Mathematics
It’s known formally as the law of large numbers. Over a large number of random events, the observed frequency will always converge to the true probability.
It’s easiest to talk about a coin toss for simplicity. What is the true probability of a coin toss? 50.000000000…% heads. We’ll assume a fair coin, no funny business. Can you predict a coin toss? Absolutely not. There are way too many factors going on. It’s very complicated, and even a slight deviation can cause weird bounces that drastically change what happens. Either way, it’s still going to result in a heads or a tails. You’re not predicting that a 40-year-old, left-handed male flips it with his thumb, the coin flies 2 m into the air, spins 20 times about the northwest axis at a 3.6° tilt, hits the table at 30°, bounces east, flies 0.5 m sideways, slides off the table, falls 1.5 m, lands on its edge at 1 m/s, rolls 3 m along a curve with a 5 m radius, and then falls on its heads side, and that it’s a Mexican peso from 2015 and the side with the eagle eating a snake is what you decided was heads. Those details don’t matter. It’s heads.
Toss it once. It’s either heads or tails. Let’s pretend each of these is its own timeline; if you’re in one of them, that is all you see. You don’t know about the other cases. So we have 100% heads or 0% heads as timelines. So far, not looking good. Nowhere near the true probability in either of the two possible timelines. Tossing a coin once is not a good way to find out its true probability.
Toss it two times. What are our options? HH, HT, TH, and TT. Well, we could have 0% heads, 100% heads, or 50% heads. So now 2 of 4 timelines have converged to the true probability. Looking better. And we’re only at 2 tosses.
Toss it four times. What are our options? Well, there are sixteen of them. I’m not writing them all out, but the 50% cases are HHTT, HTHT, HTTH, THHT, THTH, and TTHH. So 6 of 16 are at 50%, or about 38% of timelines. It’s the most common outcome, but it dropped from 50% last time. However, there are a lot of cases that aren’t 50% but also aren’t as bad as 0% or 100%. HTTT, for example, is 25% heads. The 0% and 100% cases are now only 2 of 16.
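If you want to check that 6-of-16 count rather than take my word for it, a quick sketch in Python can enumerate every four-toss "timeline":

```python
# Enumerate all 2^4 = 16 outcomes of four fair coin tosses and count
# how many land on exactly 50% heads, versus all-heads/all-tails.
from itertools import product

outcomes = list(product("HT", repeat=4))
exactly_half = [o for o in outcomes if o.count("H") == 2]
all_or_nothing = [o for o in outcomes if o.count("H") in (0, 4)]

print(len(outcomes))        # 16 timelines total
print(len(exactly_half))    # 6 of them are exactly 50% heads
print(len(all_or_nothing))  # only 2 are 0% or 100% heads
```

Bump `repeat=4` up to larger numbers and the extreme timelines shrink away even faster.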
Toss a coin 1 million times, and the number of timelines at 50% (with rounding) is going to be astronomically high, well over 99.9999% of timelines. The 0% heads case is going to be 1 in 2^1000000. That’s an absurdly large number. It’s about 300,000 digits long if you were to write it out.
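You can watch this convergence happen with a quick simulation (the seed and toss counts here are arbitrary, just to make the run repeatable):

```python
# Law of large numbers sketch: the fraction of heads drifts toward 0.5
# as the number of tosses grows.
import math
import random

random.seed(0)  # arbitrary seed so the run is repeatable
for n in (10, 1_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} tosses: {heads / n:.4f} heads")

# Size of 2**1_000_000, computed without building the number itself:
digits = math.floor(1_000_000 * math.log10(2)) + 1
print(digits)  # 301030 -- the "about 300,000 digits" from above
```

The 10-toss line can land anywhere; the million-toss line will sit within a fraction of a percent of 0.5 essentially every run.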
So, on to highways. Yes, it’s very complicated. Even more so than a coin toss. There’s no way to predict a crash. However, you’ve narrowed all those factors down into two outcomes, just like a coin toss: fatality or no fatality.
Well, clearly the odds of crashing and dying are way less than a coin toss. If they were 50%, we wouldn’t drive. It’s about 1.3 deaths per 100 million vehicle miles in the US. So driving one mile and trying to estimate the probability of dying is clearly not going to get you anywhere close to the true probability. However, there are a LOT of miles driven in the US, about 3 trillion a year. So yes, you’re converging to a meaningful and consistent probability that is not lost in noise. It doesn’t matter how complex any given crash is; enough miles are driven that the true probability is being approached. This probability hides all the complexities that can’t be predicted, just like the probability of a coin toss hides how hard it is to predict a given coin toss.
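The back-of-the-envelope version of that argument, using the rough figures above (not exact statistics), looks like this:

```python
# Rough sketch: ~1.3 deaths per 100 million vehicle miles,
# ~3 trillion miles driven per year in the US.
import math

rate_per_mile = 1.3 / 100_000_000
miles_per_year = 3_000_000_000_000

expected = rate_per_mile * miles_per_year  # expected deaths per year
std_dev = math.sqrt(expected)              # Poisson noise scales as sqrt(mean)

print(round(expected))              # 39000 expected deaths per year
print(round(std_dev))               # 197, the typical statistical wobble
print(f"{std_dev / expected:.2%}")  # ~0.5% relative year-to-year noise
```

That last line is the whole point: with numbers this large, pure chance only moves the yearly total by about half a percent, so any bigger swing reflects a real change in the underlying risk.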
For a given highway, the data is going to be less robust, as you have smaller numbers, so it will jump around more from year to year, just like tossing a coin 10 times has a real chance of not landing close to 50%. The highway in question is probably not at exactly 1.3 per 100 million vehicle miles, and you might see swings of tens of percent each year. The busier the road, the more consistent the data will be, because you have more numbers.

Same with things like murder rates. The murder rate for a small island country is always going to look really good or really bad, as one random murder swings the numbers. For the US or Brazil, the stat is going to be stable (and changes over time are likely to reflect actual changes to the true probability).

For radioactive decay, the 2^1000000 I had for 1 million coin tosses is child’s play by comparison, since even a tiny sample contains an astronomical number of atoms. You’ll never not get the half-life. It’s just so laughably improbable; the numbers are mind-boggling.
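You can see the quiet-road-versus-busy-road effect in a toy simulation. The numbers here are made up for illustration, and the yearly counts are drawn with a normal approximation to Poisson noise, which is close enough for this purpose:

```python
# Sketch: why busier roads give steadier statistics. Same per-mile risk,
# different traffic volume. Yearly fatality counts are approximated as
# normally distributed around the mean with sqrt(mean) noise (Poisson-like).
import math
import random

random.seed(1)  # arbitrary seed so the run is repeatable

def yearly_count(mean: float) -> int:
    """One simulated year of fatalities for a road averaging `mean` per year."""
    return max(0, round(random.gauss(mean, math.sqrt(mean))))

for label, mean in (("quiet road, ~4/yr  ", 4), ("busy road, ~400/yr ", 400)):
    counts = [yearly_count(mean) for _ in range(5)]
    spread = max(counts) - min(counts)
    print(label, counts, f"spread = {spread / mean:.0%} of the mean")
```

The quiet road's totals routinely swing by large fractions of its average, while the busy road's totals stay within a few percent, exactly the small-island versus big-country effect.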