Small number of Facebook users responsible for most skepticism about Covid vaccination – report

According to early results from an internal Facebook study, a small portion of Facebook users are responsible for the majority of content that expresses or encourages skepticism about Covid-19 vaccines.

The study, first reported by the Washington Post, confirms what researchers have long argued: the echo chamber effect can reinforce certain beliefs within social media communities. It also shows how content that stops short of outright vaccine misinformation – which is banned on Facebook – can still contribute to vaccine hesitancy.

A document outlining the study – which has not been made public – was obtained by the Washington Post. Facebook researchers divided users, groups and pages into 638 “population segments” and are studying them for vaccine-hesitant beliefs, according to the Post. That could include language such as ‘I’m worried about the vaccine because it’s so new’ or ‘I don’t know if the vaccine is safe’, rather than outright misinformation.

Each ‘segment’ could include as many as 3 million people, meaning the study could be examining the activity of more than 1 billion people – less than half of Facebook’s roughly 2.8 billion monthly active users, the Post reports. The scale of the study also highlights how much information can be gleaned from Facebook’s user base, and how the company can use that trove of data to examine health outcomes.

The Post reports that in the population segment with the most vaccine hesitancy, just 111 users were responsible for half of all flagged content in that segment. The study also found that just 10 of the 638 population segments contained 50% of all flagged vaccine-hesitancy content.

Facebook spokesperson Dani Lever said the company’s research into vaccine hesitancy is part of an ongoing effort to support public health campaigns.

“We regularly study things like voting, bias, hate speech, nudity and Covid – to understand emerging trends so we can build, refine and measure our products,” Lever said.

Meanwhile, over the past year, Facebook has partnered with more than 60 global health experts to provide accurate information about Covid-19 and vaccines. It announced in December 2020 that it would ban all misinformation about vaccines, suspend users who violate the rules, and eventually ban them if they continue to violate the policy.

The study is just the latest to illustrate the outsized effect a small number of actors can have on the information ecosystem. It comes on the heels of a report from the Election Integrity Partnership which found that a handful of right-wing “super-spreaders” on social media were responsible for most of the misinformation in the run-up to the Capitol attack. In that report, experts laid out a number of recommendations, including the removal of ‘super-spreader’ accounts.

The Facebook study also found significant overlap between users exhibiting anti-vaccination behavior on Facebook and supporters of QAnon, an unfounded conspiracy theory centered on a ‘deep state’ cabal of Democrats and Hollywood celebrities engaged in pedophilia and sex trafficking.

The overlap shows another long-term effect of the rise of QAnon, which has also been linked to the January insurrection at the Capitol. Many far-right actors, including QAnon supporters and adherents, understand how to manipulate social media algorithms to reach wider audiences, said Sophie Bjork-James, a professor of anthropology at Vanderbilt University who researches the white nationalist movement in the US.

“QAnon is now a threat to public health,” Bjork-James said. “Over the past year QAnon has spread widely in the online anti-vaccination community, in addition to the alternative health community. The Facebook study shows we will likely be living with the consequences of this for some time to come.”