Document Leaks Reveal Vaccine Misinformation Battle
November 5, 2021
According to CNN Business, leaked internal documents from Facebook shed light on how dire the vaccine misinformation situation is.
The documents show that Facebook's ability to find, demote, or delete vaccine misinformation within comments is severely lacking: “We have no idea about the scale of the [Covid-19 vaccine hesitancy] problem when it comes to comments.”
The documents also revealed that comments could play a bigger role in vaccine hesitancy than posts themselves. Yet an internal report from March 2021 stated that the company underinvested in efforts targeting this problem. Another March report claimed that anti-vax comments on the site are so prolific that healthcare organizations—including UNICEF and the WHO—are declining free ad spend offered by Facebook to avoid attracting such comments.
A new report by NewsGuard also revealed that accounts, pages, and groups spreading misinformation about COVID-19 and vaccines—more than a dozen in total—gained 370,000 followers over the last year.
What's the impact?
The public health impacts of misinformation are not yet fully understood. But preliminary research suggests that exposure to misinformation directly affects an individual’s intent to vaccinate. At scale, this likely has a significant impact on transmission rates and healthcare systems.
Exposure to misinformation can also fuel social dissent, leading to further harm. Those experiencing social or financial impacts surrounding the pandemic and vaccine mandates may be more vulnerable to content appealing to their grievances. Misinformation can facilitate connections between these individuals and even encourage more radical beliefs or actions.
Insider threats, demonstrations, and vandalism have already affected healthcare organizations involved in vaccine distribution. Government, healthcare, big tech, and media organizations may be especially vulnerable to similar security impacts of COVID-19 misinformation as it proliferates online.
What can you do?
The leaked internal documents claimed that Facebook’s AI systems can detect misinformation in posts, but not comments. This suggests that misinformation detection technologies have not kept up with the proliferation of misleading content.
If misinformation impacts your organization, detecting and addressing this content requires the right tools and technologies. You may not be responsible for addressing misinformation on the scale of Facebook—but OSINT tools can help you:
- Find mentions of your organization or assets in connection with misleading content or misinformation accounts
- Identify new misinformation before it gains traction
- Monitor covert social sites where misinformation may fuel planning or intent that threatens your organization or assets
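To illustrate the first capability above, here is a minimal sketch of keyword-based mention flagging. The organization names and misinformation-related terms are illustrative assumptions, not drawn from the article, and real OSINT tooling would use far more sophisticated matching and classification:

```python
import re

# Hypothetical watchlists for illustration only: in practice these would be
# maintained term lists or model-driven classifiers, not hardcoded sets.
ORG_TERMS = {"acme health", "acme clinic"}
MISINFO_TERMS = {"microchip", "plandemic", "depopulation"}

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace for simple substring matching."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def flag_mentions(posts):
    """Return posts that mention a watched organization alongside a
    misinformation-related keyword, for analyst review."""
    flagged = []
    for post in posts:
        text = normalize(post)
        if any(org in text for org in ORG_TERMS) and any(
            term in text for term in MISINFO_TERMS
        ):
            flagged.append(post)
    return flagged
```

A screen like this would only surface candidates for human review; deciding whether a flagged post is actually misleading still requires analyst judgment.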
Connect with us to learn more about the impacts of misinformation and how OSINT solutions can help protect your organization against fallout.