Nearly a year into the Covid-19 pandemic, Facebook is taking its strictest stance yet against vaccine misinformation: banning it altogether. The ban applies not only to false information about Covid-19 vaccines. It means, for example, that posts claiming vaccines cause autism, or that measles can't kill people, are no longer allowed on Facebook. At the same time, the platform will also encourage Americans to get vaccinated, and will point people to information about when it's their turn to receive a Covid-19 vaccine and how to find an available dose.
These moves, which are part of a broader push by the company, matter because Facebook, with nearly 3 billion users, is one of the most influential social media networks in the world. And as vaccinations roll out around the world, many worry that misinformation — including misinformation on Facebook — could worsen vaccine refusal and hesitancy.
In a blog post published Monday, Facebook explained that these changes are part of its largest global campaign to promote authoritative information about Covid-19 vaccination. The effort is being developed in consultation with health authorities such as the World Health Organization, and will feature reliable information from organizations including the United Nations and various health ministries. (A list of banned vaccine claims, compiled with the help of health authorities, is available here.) The overall approach resembles Facebook's US voter registration initiative, which the company said helped several million people sign up to participate in the November election.
“A year ago, Covid-19 was declared a public health emergency, and since then we have helped health authorities reach billions of people with accurate information and supported health and economic relief efforts,” Kang-Xing Jin, Facebook’s head of health, wrote on Monday. “But there is still a long way to go, and in 2021 we are committed to supporting health leaders and public officials in their work to vaccinate billions of people against Covid-19.”
A major caveat to the new policy: just because Facebook says it is changing its rules on vaccine misinformation doesn't mean that such misinformation won't still end up on the platform anyway. Changing rules and enforcing them are two different things. Despite Facebook's earlier rules banning false information specifically about Covid-19 vaccines, images suggesting that coronavirus vaccinations had extreme side effects could still go viral on the platform, and some racked up tens of thousands of likes before Facebook removed them.
A Facebook spokesperson told Recode that the company will enforce its expanded rules as it becomes aware of content that violates them, whether that content has already been posted or is posted in the future. The spokesperson did not say whether Facebook would increase its investment in content moderation given the broader scope of anti-vaccine misinformation it now covers, but told Recode that expanding enforcement will take time, as the company trains its content moderators and systems.
Monday's changes are still significant because Facebook CEO Mark Zuckerberg, who has repeatedly defended principles of free expression, now says the company will pay particular attention to pages, groups, and accounts on Facebook and Instagram (which Facebook owns) that regularly share vaccine misinformation, and may remove them entirely. The company is also adjusting its search algorithms to reduce the prominence of anti-vax content.
As with other enforcement actions Facebook has taken — on everything from the right-wing, antisemitic QAnon conspiracy theory to incitement to violence by Donald Trump — some believe the company has moved too slowly. “This is a classic case of Facebook acting too little, too late,” Fadi Quran, a campaign director at the nonprofit Avaaz who leads its disinformation team, told Recode. “Facebook has been at the center of the misinformation crisis exacerbating the pandemic for more than a year, and the damage has already been done.” He said much more needs to be done at this stage to reach users who have already seen vaccine misinformation.
Facebook’s announcement comes as major technology platforms grapple with their role in the Covid-19 crisis. Back in the fall, experts warned that social media platforms had a fine line to walk when it came to global vaccination: while social networks should promote accurate information about Covid-19 vaccines, they should also leave room for people to express honest questions about these relatively new vaccines.
“We have a new virus along with a new vaccine and a new way of life – it’s too much new for humans,” Ysabel Gerrard, a digital sociologist at the University of Sheffield, told Recode at the time. “I think the backlash against a Covid-19 vaccine is going to be on a scale we’ve never seen before.”
How well Facebook will enforce its new rules, and how much the platform will actually help people get vaccinated, remains unclear. The changes announced Monday come after experts repeatedly warned about Facebook's role in promoting vaccine conspiracy theories. For years, researchers have flagged Facebook as a platform where false and misleading claims about vaccines have flourished, including the idea that vaccines are linked to autism.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.