You’re on the playground during recess and you’re discussing with your friend what 14 times 95 is (as cool kids do).
Your friend is pretty good with computers and could run back inside to Google what the answer is. (And this would still work even for harder problems!) That’s pretty much guaranteed to get the right answer, but it’s a pain for your friend to run all the way there, wait for a turn on the computer, Google it, and run all the way back.
But your friend is also good at mental math and can multiply it out in her head. She might not get the answer as fast as the computer would, but it’s still faster than running inside.
The second case is like edge computing: the “less powerful” devices right next to you, like your smartphone (or your friend, in my analogy), can do calculations directly so you don’t need to send the question to the more powerful cloud (the computer inside the school). Even though the cloud could do the actual calculation faster, you still save time overall for relatively simple questions, because you’re not spending time shipping the data to the cloud and back (or running back inside). With real edge computing, you also save on bandwidth and data costs, since you don’t need to contact the cloud at all.
But if a question is super complicated, like 1454 times 7395, then your edge-computing friend might not be able to do it in her head, so she’ll still have to run inside and wait her turn to ask the cloud-computing school computer for the answer. That’s why cloud computing is still important.
Maybe for a medium-difficulty question she doesn’t have to run all the way inside and can ask a nearby teacher instead, which still takes some time but less than going all the way in. That’s like another layer of edge computing sitting between you and the cloud.
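If it helps to see the same idea as code instead of a playground, here’s a toy sketch of the decision a device might make: do the work itself, hand it to a nearby edge node (the teacher), or send it all the way to the cloud (the school computer). All the numbers and names are made up for illustration and don’t come from any real system.

```python
# Toy illustration of the edge-vs-cloud tradeoff. The numbers are invented;
# a real system would measure travel time and compute speed instead.

# Each tier: (round-trip travel time in seconds, compute speed in work units per second)
TIERS = {
    "device": (0.0,  1),     # your friend: no travel at all, but slow at math
    "edge":   (0.05, 50),    # the nearby teacher: short walk, decent speed
    "cloud":  (0.5,  5000),  # the school computer: long run inside, very fast
}

def best_tier(work_units: float) -> str:
    """Pick the tier with the lowest total time (travel + compute)."""
    def total_time(tier: str) -> float:
        travel, speed = TIERS[tier]
        return travel + work_units / speed
    return min(TIERS, key=total_time)

print(best_tier(0.04))    # tiny problem (think 14 x 95)    -> "device"
print(best_tier(10))      # medium problem                  -> "edge"
print(best_tier(10000))   # huge problem                    -> "cloud"
```

The point of the sketch is just that travel time is a fixed cost: for small jobs it dominates, so staying local wins, and only big jobs are worth the trip to the faster machine.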
—
The other comment gives a pretty good real-world example of these problems (video recognition), so I won’t repeat that here. 🙂