I think I understand that driving faster increases drag because there’s more air pushing against your vehicle, but why is the drag for that distance greater at higher speeds? If a car is driving slower but across the same distance, wouldn’t the total impulse created by the drag be the same as going faster because it’s delivered over a greater time, even though it’s a smaller force at any given moment?
You're assuming a linear drag curve. However, parasite drag increases with the square of speed.
D = 0.5 * rho * S * Cd * V^2

rho = air density
S = surface area
Cd = drag coefficient, based on surface shape
V = velocity
The important thing to note is velocity is squared. So if you double your speed you quadruple the drag.
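A few lines of Python make that scaling concrete. The density, area, and Cd values below are illustrative placeholders, not figures from this answer:

```python
# Parasite drag: D = 0.5 * rho * S * Cd * V^2
def drag(v, rho=1.225, S=2.2, Cd=0.30):
    """Drag force in newtons; rho in kg/m^3, S in m^2, v in m/s (made-up car-like values)."""
    return 0.5 * rho * S * Cd * v ** 2

print(drag(60.0) / drag(30.0))  # doubling the speed quadruples the drag: 4.0
```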
Impulse causes change in momentum, but this is about energy. The important quantity is work = force times distance. The distance of your trip is the same no matter how fast you go, but higher speed means more drag force means more work done and more energy lost to drag.
This would be true regardless of how force increased with speed: the fact that it increases with the square of speed, as mentioned by others, makes it worse.
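A short sketch shows how work and impulse come apart over the same trip. The drag constant k and the metric trip length are assumed for illustration:

```python
# Energy vs impulse for a fixed trip at two speeds.
k = 0.4             # lumps together 0.5 * rho * S * Cd (illustrative value)
distance = 48280.0  # 30 miles in metres

def trip_numbers(v):
    force = v ** 2 * k              # drag force grows with the square of speed
    work = force * distance         # energy lost = force * distance
    impulse = force * (distance / v)  # impulse = force * time, time = distance / v
    return work, impulse

w_slow, j_slow = trip_numbers(13.4)  # ~30 mph
w_fast, j_fast = trip_numbers(26.8)  # ~60 mph
print(w_fast / w_slow)  # 4.0: energy lost quadruples
print(j_fast / j_slow)  # 2.0: impulse only doubles (4x force, half the time)
```

Either way you slice it, the faster trip costs more: even the impulse, which the question hoped would cancel out, goes up.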
No matter what the gearing is, the increase in RPM (and thus in fuel sprayed per unit of time) will always be greater than the increase in distance covered by the wheels.
So say you increase the engine speed from 2,000 to 3,000 RPM: your consumption increases by 50%, but your car only goes from 60 to 75 mph (an increase of 25%). You are burning more fuel than you are gaining speed for it.
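The arithmetic in that example can be checked in a few lines (the RPM-to-speed mapping is this answer's made-up example, not real gearing data):

```python
# Gearing example: RPM up 50%, road speed up only 25%.
rpm_low, rpm_high = 2000, 3000
mph_low, mph_high = 60, 75

fuel_increase = rpm_high / rpm_low - 1    # 0.5  -> 50% more fuel burned per unit time
speed_increase = mph_high / mph_low - 1   # 0.25 -> only 25% more speed

print(fuel_increase, speed_increase)  # 0.5 0.25
```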
You can now see why manufacturers focus on gearing the transmission so that 60 mph lines up with the engine's lowest drivable RPM (you still have to pass cars and go up hills, so it can't be too low). A car could have another overdrive gear that would be useful at higher speeds and save even more fuel at 60 mph, but the engine would then be turning at such a low RPM that it would lack the power to accelerate comfortably.
Imagine sticking your hand in a tub full of water. If you move it through the water very slowly, there's almost no resistance at all. If the water were exactly skin temperature and you closed your eyes, you'd scarcely even know that your hand was moving through water instead of air. Now move your hand as fast as you can through the water. The resistance is now much, much greater. Clearly, it takes far more energy per stroke to move your hand through the water fast. It's the same for the car moving through the air. Even though it only goes the same distance, if it's going faster it takes more energy. Since that energy comes from burning fuel, you have to burn more fuel to go faster.
Drag increases with the square of your speed. If you double your speed, the drag *quadruples*. So yeah, you also get there twice as fast, but the total impulse from the drag force is still double what it would have been, had you driven at half the speed and taken twice as long.
For instance, let’s say at 30 mph you experience a drag of 10 (these are made-up numbers so I’m not going to dignify them with units), and you’re taking a 30-mile trip. The trip takes you 1 hour = 3600 seconds, yielding a total impulse of 36,000. Now let’s say you double your speed to 60 mph. This quadruples your drag to 40, while cutting your travel time in half to 1800 seconds, for a total impulse of 40*1800=72,000.
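Those made-up numbers check out in a few lines:

```python
# Reproducing the made-up numbers above (drag in unspecified units).
drag_30 = 10
trip_miles = 30

t_30 = trip_miles / 30 * 3600   # 3600 s at 30 mph
impulse_30 = drag_30 * t_30     # 10 * 3600 = 36,000

drag_60 = drag_30 * 4           # doubling speed quadruples drag
t_60 = t_30 / 2                 # 1800 s at 60 mph
impulse_60 = drag_60 * t_60     # 40 * 1800 = 72,000

print(impulse_60 / impulse_30)  # 2.0: total impulse doubles despite the shorter trip
```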
This is part of the story. Another part is due to the efficiency of your engine, which isn’t the same for all RPMs. There’s some optimal range. If you’re in a low gear and getting above the optimal RPM range, you can shift up a gear. But once you’re in your car’s top gear, you can’t do that any more. So, once you’re in the top gear and in the optimal RPM range, going any faster will inevitably make your car operate less efficiently.