Bot applications aren't used in the final stage, when the actual discussions take place. That stage is typically handled by real humans, using accounts that were created, aged, and made to look like real people by software programs.
You typically won't take on board claims that Ukraine is eating Russian children when the account making them is brand new and the first thing it ever posted is memes of Ukrainian cannibals.
However, an old, well-established account with 100k karma and lots of other interests may be able to get you to see the Russian "alternative view" by just asking questions and doubting the authorities.
In order for these teams of people to create an astroturf movement, they need lots and lots of real-seeming accounts. The bot applications create those accounts for them.
In a social network, if an attacker's bots predominantly interact only with other bots run by the same attacker, it's easy to identify and ban them all. It's especially easy if the bots that interact with each other are all doing the same thing: creating posts about the same topic and upvoting each other's posts.
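To make that concrete, here's a toy sketch of that detection idea: flag accounts whose interactions go almost entirely to a suspected cluster. The account names, interaction counts, and 90% threshold are all made up for illustration; real platforms use far more sophisticated signals.

```python
# Toy version of the heuristic: if nearly all of an account's
# interactions target one suspected cluster, flag it.
# All names, counts, and the 0.9 threshold are invented examples.

# interactions[a][b] = how many times account a upvoted/replied to b
interactions = {
    "bot_1": {"bot_2": 40, "bot_3": 38, "real_user": 1},
    "bot_2": {"bot_1": 41, "bot_3": 39},
    "bot_3": {"bot_1": 37, "bot_2": 42, "real_user": 2},
    "real_user": {"bot_1": 1, "news_acct": 20, "friend": 15},
}

def in_group_fraction(account, group):
    """Fraction of this account's interactions aimed at the group."""
    targets = interactions.get(account, {})
    total = sum(targets.values())
    inside = sum(n for t, n in targets.items() if t in group)
    return inside / total if total else 0.0

suspected = {"bot_1", "bot_2", "bot_3"}
for acct in interactions:
    frac = in_group_fraction(acct, suspected)
    flag = "FLAG" if frac > 0.9 else "ok"
    print(f"{acct}: {frac:.0%} of interactions inside cluster -> {flag}")
```

Running this flags all three bots (97-100% of their activity stays inside the cluster) while the real user, at about 3%, passes.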
A way to make this detection harder is called a Sybil attack: the attacker creates far more bots, but sets many of them to do more innocent things, making it murkier and harder to know for sure whether the accounts interacting with any particular bot are themselves bots.
The rub with this strategy is that you'd need those extra accounts to do something, ideally something easy to automate in a somewhat authentic-looking way. The Sybils aren't directly helping with the bot network's real goal, so they should be doing things that can be set up once and left to run without much ongoing effort.
Memes are really good for this: they've been around for a while, so there's a huge back catalog to pull from, and they don't need to relate to current events, so reposting them stays viable with almost no upkeep.
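Reusing the toy detector above (again, with entirely made-up names and numbers), you can see why the padding works: once each bot's activity is mostly meme reposts and upvotes on random popular accounts, its in-cluster fraction drops well below the threshold.

```python
# Continuing the toy example: the attacker pads each bot's activity
# with "innocent" interactions (meme reposts, upvotes on random
# popular accounts), and the same 0.9 threshold no longer trips.
# All names and numbers are invented for illustration.

padded = {
    "bot_1": {"bot_2": 40, "bot_3": 38, "meme_sub": 120, "sports_fan": 60},
    "bot_2": {"bot_1": 41, "meme_sub": 150, "cat_pics": 90},
    "bot_3": {"bot_1": 37, "bot_2": 42, "meme_sub": 110, "news_acct": 55},
}

suspected = {"bot_1", "bot_2", "bot_3"}
for acct, targets in padded.items():
    total = sum(targets.values())
    inside = sum(n for t, n in targets.items() if t in suspected)
    frac = inside / total
    print(f"{acct}: {frac:.0%} inside cluster -> "
          f"{'FLAG' if frac > 0.9 else 'slips through'}")
```

With the padding, each bot's in-cluster fraction falls to 15-32%, so the simple threshold check lets them all slip through, which is exactly the point of the Sybil accounts.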
I think it's two parts:
1. They are low-effort attempts at farming karma; the karma-farmed accounts then have value either as advertising tools or as "nodes" in an information campaign.
2. By "information campaign" I mean that I believe there are forces online seeking to control the flow of information by using AI and targeting communities to influence societal conversation topics. I think it's quite clear that the way people perceive the world is heavily influenced by the social environment presented to them online, and by making even the slightest manipulations to what they are exposed to, you can shift the general population's concerns in ways that suit you, if you have a broad enough reach. I think a lot of fake profiles are being used to propagate beliefs and instigate conflicts around specific topics.
It may sound like a conspiracy theory, but when you consider how significant the stakes are, it's more inevitable than far-fetched. People put more thought than that into how to get people to drink an energy drink; there's no way in hell there aren't people working tirelessly at this. I mean, people already think the US presidency is decided by how candidates are perceived to feel about gay and trans people. It's not hard to shift people's focus to ineffectual topics by nudging the conversation. Right now I think it's a battle for volume, which is why there are so many bots fighting for the spotlight.
Accounts with high karma can post in more subs and are taken more seriously by other posters.
If a random account shows up in some political sub and starts picking fights about some particular controversial subject, for example, and you check their profile and see the account has only existed for 3 days and only ever talks about that one subject, you're likely to brush them off as a fake account.
But if the account has existed for a while and has high karma from reposting lots of random stuff, you're more likely to think it's a real person, engage with it, and thereby help their actual agenda.
But these days, it's mostly middlemen. State and corporate actors driving agendas will pay very good money for accounts with high karma and a long post history: sometimes $100+ each, if you really look at the marketplaces for this stuff. So it's become a good business to bot-farm karma, then flip the accounts en masse to state, corporate, and political actors.