YouTube extends fact-checking feature to US video searches during COVID-19 pandemic

(Reuters) – YouTube, Alphabet Inc’s Google video service, said on Tuesday it would begin showing text and links from third-party fact-checkers to U.S. viewers, as part of efforts to curb misinformation on the site during the COVID-19 pandemic.

The information panels, which were launched in Brazil and India last year, will display fact-checked articles from third parties above search results for topics prone to misinformation, such as “covid and ibuprofen,” for false claims such as “COVID-19 is a bio-weapon,” as well as for specific breaking-news searches such as “a tornado hit Los Angeles.”

Social media sites, including Facebook Inc and Twitter Inc, are under pressure to combat misinformation about the pandemic caused by the new coronavirus, ranging from fake cures to conspiracy theories.

YouTube said in a blog post that more than a dozen U.S. publishers participate in its fact-checking network, including FactCheck.org, PolitiFact and The Washington Post Fact Checker. The company said it could not share a complete list of fact-checking partners.

In 2018, YouTube began using information panels to surface links to sources such as Encyclopaedia Britannica and Wikipedia for topics prone to misinformation, such as “flat earth” theories. Tuesday’s blog post said the panels would now also help tackle misinformation in a fast-moving news cycle.

The site also recently added links to the World Health Organization, the Centers for Disease Control and Prevention, or local health authorities on videos and searches related to COVID-19.

YouTube did not indicate in the blog post how many search terms would trigger the fact-check boxes. It said it would “take a while before our systems are fully deployed” as it expands the fact-checking feature.

The feature will appear only in search results, although the company has previously said that its recommendation feature, which steers viewers toward videos similar to ones they have spent time watching, accounts for the bulk of total “viewing time” on the site.

In January, YouTube said it had begun reducing recommendations of borderline content, or videos that could misinform users in harmful ways, such as “videos that promote a false miracle cure for a serious illness.”

Major social media companies, which have emptied their offices during the pandemic, have warned that their content moderation could be affected by a greater reliance on automated software. In March, Google said this could lead to an increase in videos being removed in error for policy violations.

Reporting by Elizabeth Culliford; additional reporting by Paresh Dave. Edited by David Gregorio and Marguerita Choy
