Twitter has launched an investigation after removing a network of fake accounts from its platform this week that claimed to be Black people supporting President Donald Trump.
The social media giant said the network of more than two dozen similar accounts violated its rules against manipulating the site and spreading spam, The Washington Post reported.
“Our teams are working diligently to investigate this activity and will take action in line with the Twitter Rules if Tweets are found to be in violation,” Twitter said in a statement.
The tweets posted from the accounts used scripted language such as “YES I’M BLACK AND I’M VOTING FOR TRUMP.”
Some of the accounts used fake profile pictures of Black people lifted from real individuals elsewhere on the internet, a practice known as “digital blackface,” according to Darren Linvill, an associate professor at Clemson University. Linvill said he tallied more than 24 such accounts, which had been retweeted or mentioned more than 265,000 times.
One account that became active on Twitter last week featured an image of a Black police officer, President Donald Trump and the words “VOTE REPUBLICAN.” In six days, the account tweeted just eight times but garnered 24,000 followers, and its most popular tweet was liked 75,000 times, The Post reported. Twitter suspended the account on Sunday.
Researchers alerted Twitter to the network of accounts.
With the election less than three weeks away, Trump is struggling to broaden his support among Black voters across the country. Only 10 percent of Black voters nationally plan to vote for him, according to FiveThirtyEight, a website that focuses on opinion poll analysis.
Social media companies have recently moved to crack down on disinformation in an effort to blunt the kind of foreign interference that influenced the 2016 election.
Under pressure from lawmakers and advertisers, Facebook CEO Mark Zuckerberg has reversed his previous refusal to take action on inflammatory posts by President Trump, including posts that spread misinformation about voting by mail and suggested violence against George Floyd protesters nationwide.
In August, Facebook removed hundreds of fake accounts that also claimed to be Black Americans proudly supporting Trump; the accounts were later found to be connected to a “pro-Trump foreign troll farm” in Romania.
Facebook has since announced that it will not allow new ads in the week leading up to Election Day, though ads already running will remain and will not be fact-checked, according to reports.
The platform is also on the lookout for misinformation related to COVID-19, QAnon and Holocaust denial.
In July, Twitter removed thousands of accounts for promoting content related to the far-right QAnon conspiracy theory, which is based on the absurd notion that a secret “deep state” campaign is trying to undermine Trump’s reelection, among many other unproven claims.
Twitter has since banned all political ads and is prominently flagging posts, including those by the president, that contain questionable or misleading content.