
Five Questions About Donald Trump's Facebook Ban

Facebook's move raises more questions than it answers: it is unclear who made the decision and how rule-breaking political content will be handled in the future. Moreover, the ban could end just as the next US electoral cycle begins.


Last Friday, Facebook announced that it would suspend former US President Donald Trump from its social network for two years, until at least January 7, 2023, stating that he would be reinstated only "if conditions permit."


The statement comes in response to last month's recommendations from Facebook's recently created Oversight Board. The company had expected the board to decide how to handle Trump's account, but while the board upheld the company's initial decision to ban the former president for inciting the violence of January 6, it noted that the long-term decision belonged to the company's own executives in Palo Alto.


The news that Trump would be banned from Facebook for another 19 months was expected to provide some answers about the platform's relationship with the former president, but instead it leaves many questions open.


Who is this decision supposed to please?

Although the announcement offers some rules for how politicians can use Facebook, and some indication of how those rules will be enforced, the decision to ban Trump for at least two years is unlikely to please anyone. Activist groups such as Ultraviolet and Media Matters, which had long lobbied Facebook to ban Trump, issued statements saying that anything short of a permanent ban is insufficient.


Meanwhile, people who believe that any enforcement of the rules against conservative politicians is proof that Facebook penalizes conservative content will continue to think so, despite a wealth of evidence to the contrary. And the possibility remains open that Trump will be back online just in time for the 2024 election cycle in the United States.


What does "newsworthy" mean now?

Many platforms, including Facebook, have used the "newsworthiness" exception to avoid enforcing their own rules against politicians and world leaders. Facebook's announcement comes with some changes to how it will use that loophole in the future. First, Facebook says it will post a notice every time it applies the exception to an account. Second, when enforcing its rules, it "will not treat content posted by politicians differently than content posted by anyone else," which essentially means weighing whether the public interest in rule-breaking content outweighs the potential harm of keeping it online.


Facebook officially introduced this policy in late 2016, after censoring an iconic photo from the Vietnam War because it contained nudity. Over time, however, the newsworthiness exception became a blanket rule for politicians, including Trump, whose rule-violating content was allowed to remain online because it was considered to be in the public interest by default. While this announcement appears to end that blanket protection, it does not remove the exception entirely, nor does it detail how Facebook will determine whether something qualifies for it.


Who made this decision?

The announcement was written by the company's vice president of global affairs, Nick Clegg, but it refers throughout to "we." It does not specify who at Facebook was involved in making these decisions, which matters for transparency and credibility, given how controversial the decision is.


"We know that today's decision will be criticized by many on opposite sides of politics, but our job is to make decisions in the most balanced, fair and transparent way possible," Clegg wrote.


Where will Facebook get its advice?

The statement also says the company will consult "experts" to "assess whether the risk to public safety has decreased," without specifying who those experts would be, what expertise they would bring, or how Facebook (or, again, who at Facebook) would have the authority to act on their findings. The Oversight Board, which was designed in part as a way to outsource controversial decisions, has already indicated that it does not want to play that role.


This makes it especially important to know whose opinions will matter to Facebook and who will have the authority to act on the advice, particularly given how much is at stake. Conflict assessment and violence analysis are specialized fields, and ones in which Facebook's previous responses have not built much trust.


Three years ago, for example, the United Nations accused the company of being "slow and ineffective" in responding to the spread of online hate that led to attacks on the Rohingya minority in Burma. Facebook commissioned an independent report from the nonprofit Business for Social Responsibility that confirmed the UN's claims. That report, released in 2018, pointed to the possibility of violence in the 2020 US elections and recommended steps the company could take to prepare for such contingencies. Facebook executives at the time acknowledged that "we can and should do more." But during the 2020 election campaign, which Trump lost, and in the run-up to January 6, the company made few attempts to act on those recommendations.


What will happen in 2023?

Then there is the question of the ban's time limit, and the fact that it could simply postpone this debate to a moment when it is even more fraught than it is today. Unless Facebook decides to extend the ban further under its "if conditions permit" standard, it will end just in time for the primaries of the next US presidential election cycle. What could go wrong?
