Fearing chaos on its platforms around the US presidential election on November 3, Facebook is planning to deploy internal tools it has previously used in 'at-risk' countries like Sri Lanka and Myanmar.
According to a report in The Wall Street Journal citing people familiar with the matter, the emergency measures “include slowing the spread of viral content and lowering the bar for suppressing potentially inflammatory posts”.
The plan reportedly includes slowing the spread of posts as they begin to go viral and altering the news feed algorithm to change what content users see.
The tools would only be used in the event of election-related violence or other serious circumstances, according to the report on Sunday.
"But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy," some of the people told the Journal.
Facebook CEO Mark Zuckerberg said last month that the US presidential election “is not going to be business as usual.”
He said he was "worried that with our nation so divided and election results potentially taking days or weeks to be finalized, there could be an increased risk of civil unrest across the country".
The social network removed over 2.2 million advertisements and 120,000 posts on Facebook and Instagram that attempted to obstruct the US presidential election, Facebook's head of global affairs Nick Clegg revealed last week.
"In 2020, the increase in the misuse of our platform comes from inside, from the US. This is the biggest change. Here too, we are adapting and taking action: we have just suppressed all the accounts, pages and groups linked to the QAnon movement," the Facebook executive said.
Earlier this month, Facebook removed more malicious networks from its platform — particularly in the US and Myanmar, both of which face elections in November. One such network has been linked to Rally Forge, a US marketing firm working on behalf of Turning Point USA and Inclusive Conservation Group.
Facebook said it has removed 17 Pages, 50 Facebook accounts and six Instagram accounts that originated in Myanmar for violating its policies.
Facing flak for failing to prevent hate speech from spreading on its platform in Myanmar, Facebook also announced additional steps aimed at protecting the integrity of the country's November elections.
The social network, which had admitted it was "too slow" to prevent the spread of misinformation and hate speech in Myanmar, said it will significantly reduce the distribution of content that its proactive detection technology identifies as likely hate speech.