Sunday, Nov 17, 2024

Facebook is having a tougher time managing vaccine misinformation than it is letting on, leaks suggest


But the internal Facebook documents suggest a disconnect between the company’s public statements about its overall response to Covid-19 misinformation and some of its employees’ findings on the issue. “We have no idea of the scale of the problem with comments,” stated an internal report posted to Facebook’s internal website in February 2021, one year after the pandemic began. The report said that “our internal systems are not yet identifying and degrading and/or removing antivaccine comments often enough.”

Additional reports a month later raised concerns about the prevalence of vaccine hesitancy — which in some cases may amount to misinformation — in comments, which employees said Facebook’s systems were less equipped to moderate than posts. One of the March 2021 reports stated that “our ability to detect vaccine hesitancy comments is poor in English and basically nonexistent elsewhere.” These documents were included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen. The redacted versions were reviewed by a consortium of news organizations, including SME.

In the early stages of the pandemic, the World Health Organization (WHO) began calling Covid-19 misinformation an “infodemic,” amid a flood of social media posts pushing conspiracy theories, dubious advice about vaccines and dangerous misinformation about the virus’ origins. The organization asked big tech companies to give it a direct line so that it could flag posts on their platforms that could pose a threat to people’s safety.

Facebook CEO Mark Zuckerberg posted on March 3, 2020, that his company was collaborating with the WHO and other leading health organizations to provide accurate information about the virus. At the time there were approximately 90,000 cases worldwide and around 3,100 deaths, most of them in China, and approved vaccines were still months away. The company was already struggling with misinformation about Covid-19. “As our community guidelines make clear, it is not okay to share anything that puts people in harm’s way,” Zuckerberg wrote. “So we are removing false claims or conspiracy theories that have been flagged by global health organizations.” He said the WHO would be given “as many free advertisements as they need for coronavirus response together with other in-kind support,” and that Facebook would also provide “millions of dollars in ad credits to other authoritative organisations.”

However, many comments on Facebook raised questions and legitimate concerns about vaccines, and in some cases health organizations did not want the free help. One of the March 2021 internal reports stated that vaccine hesitancy in comments on Facebook posts was so prevalent that “authoritative health actors like UNICEF or the WHO would not use the free ad spend that we provide to them to promote provaccine content because they don’t want to encourage anti-vaccine commenters who swarm their Pages.”

Facebook employees were concerned that the company’s AI systems could detect misinformation in posts but not in comments, documents show — a gap that matters because comments may be more likely to contain vaccine-hesitant material.


“While comments may be more prevalent than posts, we have under-invested in preventing hesitancy in comments compared with our investment in content,” another report from that March stated.

“One flag from UNICEF was a disparity between FB & IG,” one comment stated, quoting the agency: “One way we manage these situations on Instagram is through pinning top comments. Pinning allows us to highlight our most important comment (which will almost always link to useful vaccine information) and highlight other top comments which support vaccination.” UNICEF, the WHO and other agencies did not respond to requests for comment.

A Facebook spokesperson said the company has improved on the issues raised in the internal memos. “We approach misinformation in comments using policies that help us remove or reduce the visibility of false or potentially misleading information while also promoting reliable information and giving people control over the comments on their posts,” she said. “There is no one-size-fits-all solution to stopping the spread of misinformation, but we’re committed to building new tools and policies that help make comments sections safer.”

Among other efforts since the pandemic began, Facebook — along with fellow social media giants Twitter and YouTube — has added Covid-19 misinformation to its “strike” policy, under which users can be suspended (and potentially removed) for posting violating content. The platforms have also labeled content related to Covid-19 in order to direct users toward authoritative sources.

Facebook halted the public release of a “transparency report” earlier this year after it revealed that the most-viewed link on the platform in the first quarter of 2021 was a news story claiming that a doctor died after receiving the coronavirus vaccine. The New York Times reported that the social media giant had pulled the report.

Facebook also appears to reward sensationalist and irresponsible news coverage of the purported dangers of Covid-19 vaccines: a February internal memo noted that a tabloid story on vaccine deaths had been shared over 130,000 times on the platform. But the company’s problems have not been limited to comments or news articles. According to a May 18, 2020, internal Facebook post, the “most active” civic groups in the United States were hundreds of anti-quarantine groups, along with standard groups such as those devoted to Trump 2020 and Tucker Carlson, “that have been active for months/years.”

The author of the post wrote that these groups were full of Covid-19 misinformation, and noted that their content featured prominently in the Facebook feeds of “the tens of millions of Americans who are now a member of them.”

A Facebook spokesperson told SME that the company has added safety controls to groups since the May 2020 internal post.

In July 2021, President Joe Biden said that platforms like Facebook were “killing people” with Covid-19 misinformation. Biden later backtracked on that claim, but not before a Facebook executive posted a strong rebuke of the President.

“At a time when COVID-19 cases are rising in America, the Biden Administration has chosen to blame a handful of American social media companies,” wrote Guy Rosen, Facebook’s vice president of integrity. “While social media plays an important role in society, it is clear that we need a whole-of-society approach to end this pandemic. And facts — not allegations — should help inform that effort.”

He said that Facebook takes action against misinformation about Covid-19 vaccines, and cited research conducted with Carnegie Mellon University indicating that the majority of US Facebook users have been vaccinated.

However, the February 2021 internal report on anti-vaccine and vaccine-hesitant comments suggested that anti-vax sentiment was overrepresented in Facebook comments relative to the wider populations of the United States and the United Kingdom.

“This overrepresentation could convey that it is normal to be hesitant about the Covid-19 vaccination and encourage greater vaccine hesitancy,” the report said.

The post Facebook is having a tougher time managing vaccine misinformation than it is letting on, leaks suggest appeared first on Social Media Explorer.