Weeks before the US elections, Facebook is tightening its political ad rules. The social media giant has formed teams to manage election-related posts, deploying internal tools originally designed for “at-risk” countries such as Sri Lanka and Myanmar.
Facebook’s emergency measures include controlling viral content and inflammatory posts. Executives said they will deploy these emergency tools only in extreme situations, such as the threat of election-related violence.
Adjusting the Rules
Facebook’s playbook includes adjusting how posts spread. The company will slow down posts across the board once they start going viral. It will also tweak the news feed to control what types of content users see, and lower the threshold for content deemed “dangerous.”
Once these measures are in place, Facebook hopes to shield users from sensationalized posts. It also hopes to limit any opportunities to incite violence or spread misinformation. Spokesman Andy Stone said the company has “spent years building for safer, more secure elections. We’ve applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios.”
Safety or Suppression?
The measures do not come without risks. Healthy discourse can get swept up alongside suspected inflammatory content, and that is what makes people uneasy. Republicans and Democrats alike are critical of the company’s policies, and any further efforts to regulate content are often seen as meddling.
Republicans, in particular, feel that the platform isn’t a level playing field. Senior GOP members, including President Donald Trump, criticized the site for bias. They complained that Facebook suppressed the New York Post story on Hunter Biden, who allegedly used his position to introduce a Ukrainian executive to his father, Joe Biden.
Facebook denied any bias, saying it applied the same rules it uses to prevent election interference. Because it deemed the NY Post’s evidence dubious, it restricted the spread of the story. Twitter also blocked the story.
Despite the Biden episode, Democrats continue to complain about Facebook. They say the company isn’t doing enough to prevent misinformation on its site, and they accuse it of becoming too deferential to the Republican right.
Regular Algorithm Updates
Facebook makes periodic changes to its algorithms to improve its services, often to increase user engagement and curb harmful activity. But Facebook carries out these changes without informing the public beforehand.
Last month, CEO Mark Zuckerberg confirmed the company’s action plans. “We need to be doing everything that we can to reduce the chances of violence or civil unrest in the wake of this election,” he said. On a conference call last week, a source said Facebook is limiting speech more than it would like, citing the pandemic and the coming election as the reasons. Zuckerberg said a clear-cut victory for either candidate “could be helpful,” as a decisive win at the polls can reduce the risk of unrest.
Facebook communications head Nick Clegg said the company is preparing contingencies, with “break glass tools” ready for use in case of a crisis. Clegg refused to provide details on these tools, saying, “it will no doubt elicit a greater sense of anxiety…”
How Do You Regulate Speech?
Over the years, the company has continued to take action against misinformation. Even so, the UN faulted Facebook for its inaction on hate speech and calls for violence in other countries. In one incident, Myanmar’s Rohingya Muslims became the target of hate groups.
Chloe Poynton, a Facebook human-rights consultant, also noted the company’s inaction. By doing nothing, Facebook helped escalate the 2018 atrocities in Sri Lanka, allowing hate speech and false rumors against Muslims to spread widely.
Which Ones are Safe?
For the 2018 US midterm elections, the company deployed tools to control content. Facebook also disabled the recommendation option for its groups to prevent posts from going viral.
The question remains: who decides which content is inflammatory and which is safe? How can we be sure that Facebook will act impartially and not according to its own leanings? Unless the company makes that very clear, it won’t run out of critics who think it has its own agenda.
Watch as CBS’ “Good Morning” reports on Facebook’s initiatives to combat 2020 election misinformation and limit political ads:
Do you post your opinions and ideas on social sites like Facebook? Do you believe that anything you say is covered by the First Amendment? Or do you think some restraint is necessary to prevent the spread of misinformation? Let us know what you think of Facebook actively policing its site.