Facebook’s moderation of disinformation is regularly in the spotlight. With respect to COVID-19, however, the social network was very responsive and still adheres to the strict policies it put in place at the start of the pandemic. But today, given the evolving health situation, Meta is wondering whether it should keep applying the same policy on Facebook.
False news on COVID-19: Meta consults its Oversight Board
In a recent press release, Meta’s Oversight Board indicated that the company had referred the matter to it. Mark Zuckerberg’s company is asking the board for advice on whether it should continue to remove misinformation about COVID-19 or instead adopt a less restrictive approach more in line with its values and its responsibilities in the field of human rights. As a reminder, the Oversight Board is an independent entity created by Meta, often described as the equivalent of a Supreme Court for moderation matters. It can be called upon by users who are dissatisfied with Meta’s moderation decisions, but also by the company itself when it needs guidance on its moderation policy. Overall, the company’s approach to fake news is “primarily based on contextualizing and narrowing the scope of potentially false claims, rather than removing content,” according to the Oversight Board. Indeed, the company believes that “because it is difficult to pinpoint exactly what constitutes disinformation on a range of topics, suppressing disinformation on a large scale is fraught with the potential for inappropriate interference with user experience.” But with COVID-19, Meta took a different approach as early as January 2020. Instead of betting on “contextualization,” the company removed the content outright.
A strict policy justified by the urgency and gravity of the situation
Why? According to the Oversight Board, which cites Meta’s request, the decision was made because “outside health experts have told us that misinformation about COVID-19, such as false claims about treatment, masking, social distancing, and the transmissibility of the virus, could contribute to the risk of imminent physical harm.” If Meta is now wondering whether it should still apply the same measures, it is because the situation has changed. First, according to its request to the Oversight Board, there was a lack of authoritative sources of information at the beginning of the pandemic, which created a vacuum conducive to the spread of disinformation. Today, however, people have better access to reliable information. On the health front, Meta also believes that the development of vaccines and treatments, along with the evolution of the virus, has made COVID-19 less deadly. In addition, the company indicated that “public health authorities are actively evaluating whether COVID-19 has progressed to a less severe condition.”
Will Meta’s moderation policy change?
However, Facebook’s parent company also acknowledges that the evolution of the pandemic will vary from country to country, depending on parameters such as vaccination rates, healthcare system resources, or citizens’ trust in their governments. For the moment, it is unclear what will change in Meta’s policy on COVID-19 disinformation. But if the group gets approval from its Oversight Board to adopt a “less restrictive” policy, Facebook could, for example, treat disinformation about the virus the same way it treats other fake news. As mentioned above, Meta’s general disinformation policy is to contextualize posts and reduce their reach instead of removing them, so as not to infringe on users’ freedom of expression.