The Indian influencer ER Yamini has never tweeted in her life; she prefers to cultivate her large following on Instagram and YouTube.
But in early March, a Twitter account using her photo tweeted: “#IStandWithPutin. True Friendship”, accompanied by a video showing two men embracing, one representing India and the other Russia.
Yamini says she does not support either country in the war between Russia and Ukraine, and she worries about her followers.
“If they see that tweet, what will they think of me?” she wonders. “I wish they wouldn’t use my photo on that profile.”
The fake account is part of a network promoting Russian President Vladimir Putin on Twitter, which used the hashtags #IStandWithPutin and #IStandWithRussia on March 2 and 3.
The hashtags became trending topics in several regions, particularly in the global south, in countries such as India, Pakistan, South Africa and Nigeria, apparently showing support for the war.
Part of the activity tracked was organic, in other words, produced by real people, reflecting genuine support in some countries for Putin and Russia.
But many other profiles appear to have been inauthentic. They retweeted messages in large numbers, produced few original messages, and were created very recently.
“They were probably produced by bots, fake profiles, or compromised accounts, artificially amplifying support for Putin in these countries,” says Carl Miller, co-founder of CASM Technology, a company that investigates online harm and misinformation.
CASM tracked 9,907 profiles that promoted support for Russia on March 2 and 3, in several different languages, and found that more than 1,000 of those accounts had spam-like characteristics.
The BBC investigated hundreds of these apparently fake profiles. Our investigation confirms Miller’s suspicions: they try to pass as genuine, but are actually fake.
Through reverse image search, we discovered that the images used by these profiles were copied from celebrities, influencers and ordinary users, who had no idea that their faces were being used to support Russia in its war against Ukraine.
We have not been able to determine who created the accounts or if they have any connection to the Russian government.
An account called Preety Sharma, for example, states in her biography that she is a “model and businesswoman” originally from India, now in Miami.
It was created on 26 February, two days after Russia’s invasion. “Putin is a good person,” reads one of its retweets.
But the woman depicted in the account’s profile photo is on the other side of the world.
Nicole Thorne is an Australian social media influencer who has 1.5 million followers on Instagram and only occasionally uses her own profile on Twitter.
Another account tries to impersonate the Indian singer Raja Gujjar. Its first tweet was published on 24 February, the first day of the invasion. And all 178 of the account’s posts are retweets, a strong indicator of automation.
The BBC contacted Thorne and Gujjar, and both confirmed that these accounts were not theirs.
However, although some of the investigated accounts were very bot-like, not all of them were fake.
When performing a reverse image search on one of the profiles, created in February 2019, with tweets from March 2 and no followers, the BBC traced the account to a young Indian man on LinkedIn.
It turned out to be authentic, belonging to Senthil Kumar, an aeronautical engineer. We asked him why he had created an account solely to retweet messages in favor of Russia.
“Usually I open Twitter and see the trends. So I saw these posts and just retweet them,” he said.
He thinks Russia has supported India in the past.
Indicators
The accounts tweet a mix of criticism of Western countries, expressions of solidarity among the so-called Brics countries (Brazil, Russia, India, China, South Africa) and direct support for Putin.
“We default to the idea that information campaigns will target the West. However, none of the accounts were aimed at the West or claimed to be from the West,” says Miller.
To identify what could be a group of inauthentic accounts, he adds, researchers analyze account creation dates, an “inhuman” tweeting pattern (such as an account that tweets at all hours of the day) and the variety of topics tweeted.
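As an illustration, the indicators Miller describes, a recent creation date, a retweet-heavy feed, and round-the-clock posting, can be combined into a simple scoring heuristic. The following sketch is hypothetical: the field names, thresholds, and the activity figure for the example account are assumptions, not CASM’s actual methodology.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Account:
    created: date      # account creation date
    total_posts: int   # all posts observed
    retweets: int      # how many of those posts are retweets
    active_hours: int  # distinct hours of the day with activity (0-24)

def bot_score(acct: Account, invasion_day: date = date(2022, 2, 24)) -> int:
    """Count how many inauthenticity indicators an account trips.

    Thresholds are illustrative assumptions only.
    """
    score = 0
    # Indicator 1: account created on or after the invasion began
    if acct.created >= invasion_day:
        score += 1
    # Indicator 2: almost no original content, like the 178-retweet account above
    if acct.total_posts and acct.retweets / acct.total_posts > 0.9:
        score += 1
    # Indicator 3: "inhuman" pattern, active at nearly every hour of the day
    if acct.active_hours >= 20:
        score += 1
    return score

# The fake "Raja Gujjar" account: created 24 February, 178 posts, all retweets
# (active_hours here is a made-up value for illustration)
suspect = Account(created=date(2022, 2, 24), total_posts=178,
                  retweets=178, active_hours=21)
print(bot_score(suspect))  # 3
```

Real investigations weigh many more signals, but the design idea is the same: no single indicator proves inauthenticity, so accounts are flagged only when several trip at once.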