Meta thinks Facebook may need more “harmful health misinformation” [Updated]

The US continues to struggle with pandemic management. In places where cases are rising, some cities and counties are considering reinstating mask mandates, and many hospitals are confronting a chronic nursing shortage.

Despite new concerns and a recent uptick in daily deaths recorded in the US and globally, however, Meta is already thinking about what a return to normal might look like. That includes recently speculating that normalcy might mean a return to the company’s heyday of allowing health misinformation to spread through posts on Facebook and Instagram.

On Tuesday, Meta’s president of global affairs, Nick Clegg, wrote in a statement that Meta is considering whether Facebook and Instagram should continue to remove all posts promoting falsehoods about vaccines, masks, and social distancing. To help it decide, Meta is asking its oversight board to weigh whether the “current COVID-19 misinformation policy is still appropriate” now that “extraordinary circumstances at the onset of the pandemic” have passed and many “countries around the world seek to return to more normal life.”

Clegg says that during the pandemic, Meta for the first time began removing entire categories of information from its platforms, creating a tension it’s now trying to resolve between two of the company’s values: protecting the “free expression and safety” of users.

“We are requesting an advisory opinion from the Oversight Board on whether Meta’s current measures to address COVID-19 misinformation under our harmful health misinformation policy continue to be appropriate, or whether we should address this misinformation through other means, like labeling or demoting it either directly or through our third-party fact-checking program,” Clegg wrote.

The oversight board has already accepted Meta’s request and is fielding public comments. The board is expecting “a large volume of submissions.” Once the board has considered all input and issued its policy advisory, Meta will have 60 days to respond publicly and explain how it will or will not act on the recommendations.

Meta doesn’t have to abide by any decisions that the oversight board makes, though, and even if a shift to less extreme content moderation is approved, critics are likely to interpret the move as Meta seeking a scapegoat so that loosening restrictions is not perceived as an internal decision.

Why change the policy now?

Clegg told The Verge that Meta is seeking guidance from the oversight board now because “the Oversight Board can take months to produce an opinion,” and the company wants that feedback in hand so it can act “more thoughtfully” when moderating content during future pandemics.

Well before changing its name to Meta, Facebook spent the year leading up to the pandemic “taking steps” to crack down on the spread of anti-vax misinformation. Those steps are similar to the ones Clegg now suggests are appropriate to revert to: in 2019, the company started fact-checking more posts containing misinformation, limiting the reach of some, and banning ads containing misinformation.

Then the pandemic started, and research found that despite those steps, anti-vax content on Facebook increased and, compared with official information, spread more rapidly to neutral audiences who had not yet formed an opinion on COVID-19 vaccination. Bloomberg reported that this dangerously boosted vaccine hesitancy during the pandemic and that Facebook knew it was happening but was motivated by profits not to respond swiftly. One study showed that the pages with the furthest reach in neutral newsfeeds belonged to “people who sell or profit off of vaccine misinformation.”

Eventually, Congress investigated, and Facebook changed its name and then its policy, deciding that “some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate.” The company made it official policy to remove “misinformation on an unprecedented scale,” deleting 25 million pieces of content that it likely would otherwise have left up under its free speech policies.

Now, Clegg says, Meta has a duty to reconsider whether it acted rashly by unilaterally deciding to remove all those posts, so that the next time there’s a pandemic, clearer guidance is on hand that adequately weighs free speech against the harms of misinformation. The idea is that Meta’s harmful health misinformation policy should be used to limit the spread of misinformation only when official sources of information are scarce, as they were at the start of the pandemic but are not now.

Meta is essentially asking the oversight board to consider: in times when there are obvious official sources of information, should tech companies have less obligation to limit the spread of misinformation?

As more people across the US prepare to mask up to help limit transmission, and as vaccine hesitancy remains a force driving that transmission, the question feels premature coming from a platform that has already proven how hard misinformation is to control even under a total ban on harmful misinformation.

Meta did not immediately respond to Ars’ request for comment. (Update: A Meta spokesperson tells Ars that “under our Community Standards, we remove misinformation during public health emergencies when public health authorities conclude that the information is false and likely to directly contribute to the risk of imminent physical harm.” During the pandemic, “COVID-19 was declared a Public Health Emergency of International Concern (PHEIC),” so Meta “applied this policy to content containing claims related to COVID-19 that, according to public health authorities,” are either false or “likely to contribute to imminent physical harm.” Now Meta is seeking input from the Oversight Board to examine “current policies before a potential future pandemic so we can adjust those policies appropriately.” This month, a World Health Organization COVID-19 emergency committee “unanimously agreed that the COVID-19 pandemic still meets the criteria of an extraordinary event that continues to adversely impact the health of the world’s population.”)
