Researchers say large Facebook accounts are still condemning vaccinations, while anti-vaxxers banned from Facebook have fled to Instagram.
On Facebook and Instagram, conspiracy theories and rumors about the coronavirus vaccine are still circulating, more than a month after Facebook pledged to remove them.
Under pressure to curb an epidemic of falsehoods, Facebook announced on Dec. 3 that it would ban unfounded claims about the safety and efficacy of the vaccines now being rolled out worldwide. Between March and October, the company said, it removed more than 12 million pieces of content from Facebook and Instagram and worked with fact-checkers to label an additional 167 million pieces.
But researchers say large Facebook accounts, some with more than half a million followers and long records of promoting falsehoods, are still openly churning out new posts that cast doubt on the vaccine.
Meanwhile, influential anti-vaxxers barred from Facebook continue to spread disinformation on Instagram, which Facebook owns, to hundreds of thousands of users.
The social network says it has restricted the reach of some popular anti-vaxx Facebook pages and that few people see some of the newer misinformation about the coronavirus.
Yet disinformation experts say the site's actions are too little, too late.
The Center for Countering Digital Hate (CCDH), which tracked the rapid growth of the anti-vaccine movement during the pandemic in a December study, argued that tech platforms are overdue for more forceful action.
“Anything less than taking down these individuals’ profiles, pages and groups and permanently blocking their service now that they know what’s happening is willing acquiescence,” the study said.
Same disinformation, separate channels
Even before the coronavirus pandemic, the World Health Organization had identified “vaccine hesitancy” – the reluctance or refusal to vaccinate despite the availability of vaccines – as one of the top ten threats to global health.
Experts agree that last year brought a troubling expansion of an anti-vaccine movement that had already flourished on social media. Anti-vaxx advocates used private Facebook groups to warn mothers against vaccinating their children and to organize campaigns of social media abuse against physicians who discussed the medical benefits of vaccination.
According to a CCDH study, large anti-vaccine accounts on social media sites have gained more than 10 million new followers since 2019, including 4 million more followers on Instagram and 1 million on Facebook.
Vaccines from Pfizer and Moderna, which were authorized for emergency use in the United States in December, each underwent a series of comprehensive clinical trials.
In the final phase of those trials, more than 15,000 individuals received each vaccine, and both were found to be more than 90% effective at preventing Covid-19, with no significant safety concerns.
A few recipients have reportedly experienced allergic reactions after being vaccinated, but these events were not serious.
All possible vaccine adverse reactions are closely monitored as part of an intensive, ongoing protocol to ensure the safety of new vaccines.
But the rapid timeline and intense political pressure to deliver a coronavirus vaccine left people around the world questioning whether to trust the new vaccines and searching for truthful answers, a situation anti-vaxx organizations were well prepared to exploit.
Leading anti-vaccine activists held a private online conference in October to strategize about how to exploit public concerns and spread doubt about vaccination during the coronavirus pandemic, according to CCDH, whose December study reported on the conference's speeches and conversations.
At the conference, a prominent U.S. anti-vaxx activist, Del Bigtree, summarized a three-point strategy to undermine public confidence: “It’s risky. It’s not important for you. And herd immunity is your mate,” he said, according to the study.
In July, YouTube disabled Bigtree’s channel, which reportedly had more than 15 million views, and in November, according to a Facebook spokesperson, Facebook took down Bigtree’s Facebook page, which had more than 350,000 followers, for repeatedly publishing Covid misinformation.