It’s a bit of a mixed bag.
Internet traffic is handled by requests. You ask some server for information, and it decides whether or not to give it to you.
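To make that concrete, here's roughly what a single request looks like in code. This is a minimal sketch in Python using only the standard library; the URL is just a placeholder:

```python
# One request: ask a server for a page, see whether it answered.
import urllib.request

with urllib.request.urlopen("https://example.com/") as response:
    print(response.status)       # 200 means the server decided to answer
    print(response.read()[:80])  # the first few bytes of what it sent back
```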
When you’re on a web page, you might make a new request every few seconds/minutes through normal scrolling/viewing/reading.
A simple bot can make thousands of requests per second.
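The difference is that a bot skips all the reading and scrolling in between. A rough sketch (same placeholder URL as above) of the same request fired in a loop:

```python
# A bare-bones bot: the request from above, fired back to back.
# No scrolling, no reading, no pauses in between.
import urllib.request

for _ in range(1000):
    urllib.request.urlopen("https://example.com/")  # one full request per loop
# A single thread like this manages hundreds of requests per second;
# run a few dozen in parallel and you're into the thousands.
```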
And that’s not always in a nefarious sense.
Automoderators will make a request (maybe multiple) for every single comment made on Reddit.
So if you think about it like that, every comment you make has a bot reading it: your one request to post the comment is matched by the bot's request to read it, so half of the requests around that comment came from a bot.
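Simplified, an automoderator is just a loop like this. The endpoint, the JSON shape, and the rule are all made up for illustration:

```python
# A toy automoderator: poll for new comments and check each one.
# Every human comment produces at least one bot request this way.
import json
import time
import urllib.request

API = "https://example.com/api/new-comments"  # hypothetical endpoint

def check_against_rules(comment):
    """Stand-in for real moderation rules."""
    if "banned phrase" in comment.get("body", ""):
        print("would remove comment", comment["id"])

seen = set()
while True:
    with urllib.request.urlopen(API) as resp:  # one bot request per poll
        for comment in json.load(resp):
            if comment["id"] not in seen:      # handle each comment once
                seen.add(comment["id"])
                check_against_rules(comment)
    time.sleep(2)                              # then poll again
```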
Then you have marketing agencies making API requests en masse.
And then there's normal maintenance and debugging, which relies on automated checks pinging services around the clock.
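An uptime monitor, for example, is nothing more than a scheduled request. A sketch only; the health URL is a placeholder:

```python
# An automated health check: ping the service once a minute, forever.
import time
import urllib.request

while True:
    try:
        with urllib.request.urlopen("https://example.com/health") as resp:
            up = resp.status == 200
    except OSError:             # connection refused, timeout, HTTP error...
        up = False
    print("service is up" if up else "service is DOWN")
    time.sleep(60)              # 1,440 bot requests a day, from one check
```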
So while half of all internet traffic comes from bots, that doesn't mean they're all repost/astroturfing bots.
Automated requests are one of the cornerstones of keeping the internet functioning, and since computers process requests many times faster than we do, relatively few of them can generate the same amount of traffic as hundreds of real users.