This is one of the core concepts of calculus, although I don't know if you're asking about calculus.
Normally a rate of change is calculated between two points (say, your speed between t=1 and t=2); this is an average over the span between those two points. You can make the points closer and closer together, closing in on a single point (t=1.499 and t=1.500), but it's still an average rate of change between those points, no matter how close they are.
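If a concrete demo helps, here's a minimal Python sketch (my own illustration, not from the question; the position function f(t) = t² is an assumption chosen for simplicity) showing those averages settling down as the interval shrinks:

```python
def f(t):
    return t ** 2  # hypothetical position function, chosen for illustration

t = 1.5
for h in [1.0, 0.1, 0.01, 0.001, 0.0001]:
    avg_rate = (f(t + h) - f(t)) / h  # average rate over [t, t + h]
    print(f"h = {h:<8} average rate = {avg_rate}")
```

Every printed value is still an average, but they march toward 3.0, the instantaneous rate (the derivative) of t² at t = 1.5.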
Calculus (specifically the derivative) is the mathematical process of finding what the rate of change approaches as that interval shrinks toward 0. I think an intro to calculus concepts is probably beyond ELI5, but you can find lots of Calculus 101 pages with a quick Google search.
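In symbols (this is the standard textbook definition, not something spelled out above):

$$f'(t) = \lim_{h \to 0} \frac{f(t + h) - f(t)}{h}$$

The fraction is exactly the average rate over the interval from t to t+h; the limit asks what those averages approach as h shrinks, without ever setting h = 0.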
This is actually a very important question, and the very question that led to a lot of conflict in the math community for many decades over whether calculus is valid math or means anything at all. You can even argue it's been a debate lasting literal millennia if you count things like Zeno's paradoxes.
The entire notion of an “instantaneous rate” is a huge paradox if you think about the meaning of those words: a rate needs a span of time, and an instant has none. But we can compute rates over spans as small as we like, just never over no span at all. To resolve the problem (which was no small feat; it quite literally took over a hundred years to come up with a rigorous answer), we invented the limit and rebuilt calculus rigorously on top of it. Even then, defining new machinery just to make the math work can be thought of as a stretch, though that is how all math works.
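For the curious, that rigorous answer is the ε–δ definition of a limit (standard material, included here for reference, not part of the original answer):

$$\lim_{x \to a} g(x) = L \iff \forall \varepsilon > 0 \; \exists \delta > 0 : 0 < |x - a| < \delta \implies |g(x) - L| < \varepsilon$$

Notice it never evaluates anything “at the instant” itself: the condition 0 < |x - a| explicitly excludes the point x = a, which is exactly how it sidesteps the paradox.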