Facebook tries to counter the vaccine misinformation that spreads on Facebook

On Monday, Facebook unveiled a plan aimed at getting 50 million people vaccinated, the latest in a series of attempts by the social media company to combat the Covid-19 pandemic and the misinformation that thrives on its platform. The campaign follows years of criticism of Facebook for not doing enough to counter the dangers of the anti-vaccination movement.

The plans, first introduced by Facebook CEO Mark Zuckerberg, include launching a tool to help people make appointments at local vaccination sites, boosting credible vaccine information from health officials, and adding labels to posts about the coronavirus that point people to information from the World Health Organization. The company is also expanding official WhatsApp chatbots to help people sign up for vaccinations, and is offering new stickers on Instagram so people can encourage others to get vaccinated. (WhatsApp and Instagram are owned by Facebook.)

On top of that, and perhaps more critically, Facebook is doing something it hates: limiting the spread of information. The company announced that it will temporarily reduce the distribution of content from users who violate its Covid-19 and vaccine misinformation policies, or who repeatedly share content that its fact-checking partners have debunked. That is a tricky task: it is difficult to determine what is misinformation and what is not, and difficult to tell the difference between people who deliberately mislead others and those who have legitimate questions.

These efforts build on promises Facebook has already made. In February, Facebook announced that it would take down false anti-vaccine information and use its platform for what it called the world's largest Covid-19 vaccine information campaign, whose launch was announced this week. The social media company has also worked with public health researchers to understand the reasons behind vaccine hesitancy, and how to combat it, through surveys on the platform.

Critics say Facebook's efforts are not enough to counter an enormous problem that the platform itself helped create.

Anti-vaccination rhetoric has flourished on the platform for years, with Facebook providing a safe space for anti-vaccine groups and even recommending such groups to users. According to David Broniatowski, a professor at George Washington University who researches anti-vaccine communities, much of the content that drives vaccine hesitancy would not be classified as misinformation but rather as opinion.

“People who oppose vaccinations do not primarily make arguments based on science or facts, but on values such as freedom of choice or civil liberties,” Broniatowski told Recode. “These are opinions, but very pointed opinions.”

For example, a post that says, “I do not think vaccines are safe, do you?” would probably not be flagged as misinformation, even though its tone is insinuating.

Facebook is aware that posts like these, which do not violate its rules, can still cause vaccine hesitancy, according to a recent report from the Washington Post. “While research is very early, we are concerned that the harm from non-violating content could be substantial,” the story quotes from an internal Facebook document.

While Broniatowski praises Facebook's efforts to partner with health organizations and promote vaccine facts, he thinks the company could do something more effective: allow public health officials to target vaccine-hesitant groups with arguments as compelling as those pushed by anti-vaccine activists. He noted that a relatively small share of Facebook users with outsized influence drives vaccine hesitancy, and that a similarly small group of public health experts could be deployed to counter it.

“There are some very sophisticated actors making a range of arguments, whether or not they hold up, to discourage people from getting vaccinated,” he said. “We need a more nuanced response that speaks more directly to people's real concerns.”

Facebook did not immediately respond to a request for comment.

People who are reluctant to get vaccinated have a wide variety of reasons, according to data released this week by the Delphi Group at Carnegie Mellon University in partnership with Facebook. Of those surveyed, 45 percent said they did not want to be vaccinated because of fear of side effects, and 40 percent expressed concern about the safety of the vaccines. Smaller shares of respondents cited distrust of vaccines or of the government. Addressing these concerns directly could have a significant impact on people's willingness to get vaccinated.

Facebook also needs to show that its efforts to curb Covid-19 misinformation are more than just its latest public relations campaign, Imran Ahmed, CEO of the Center for Countering Digital Hate, said in a statement to Recode.

“Since Facebook’s last announcement more than a month ago that it was planning to combat anti-vaccine misinformation, almost no progress has been made,” Ahmed said.

“Facebook and Instagram still fail to remove the vast majority of posts reported to them for containing dangerous misinformation about vaccines,” he said. “The leading spreaders of anti-vaccine lies all still have a presence on Instagram or Facebook, despite promises to remove them.”

Since announcing its crackdown on vaccine misinformation in February, the company says it has removed an additional 2 million pieces of content from Facebook and Instagram. Whether this and the new measures will help get another 50 million people vaccinated remains to be seen.
