(Reuters) – YouTube, Alphabet Inc’s video service, said on Tuesday it would begin showing U.S. viewers text and links from third-party fact-checkers as part of its efforts to curb misinformation on the site during the COVID-19 pandemic.
The information panels, which launched in Brazil and India last year, will display vetted third-party articles above search results for topics such as “covid and ibuprofen,” false claims such as “COVID-19 is a bio-weapon,” and specific searches such as “a tornado hit Los Angeles.”
Social media sites, including Facebook Inc and Twitter Inc, are under pressure to combat misinformation about the pandemic caused by the new coronavirus, from false cures to conspiracy theories.
YouTube said in a blog post that more than a dozen U.S. publishers participate in its fact-checking network, including FactCheck.org, PolitiFact and The Washington Post Fact Checker. The company did not share a complete list of its fact-checking partners.
In 2018, YouTube began using information panels that surfaced links to sources such as Encyclopedia Britannica and Wikipedia for topics prone to misinformation, such as “flat earth” theories. Tuesday’s blog post said the panels will now also help address misinformation in a fast-moving news cycle.
The site also recently began adding links to the World Health Organization, the Centers for Disease Control and Prevention, or local health authorities on videos and searches related to COVID-19.
YouTube did not say in the blog post how many search terms would trigger the fact-check panels. It said it would take some time for its systems to fully ramp up as it expands the fact-checking feature.
The feature will appear only in search results, though the company has previously said that its recommendation feature, which suggests videos similar to those a user has spent time watching, drives the bulk of total viewing time.
In January, YouTube said it had begun reducing recommendations of borderline content, or videos that could misinform users in harmful ways, such as videos promoting a phony miracle cure for a serious illness.
Major social media companies, which emptied their offices during the pandemic, have warned that their content moderation could suffer from greater reliance on automated software. In March, Google said this reliance could lead to an increase in videos being incorrectly removed for policy violations. [nL1N2BA01A]
Reporting by Elizabeth Culliford; additional reporting by Paresh Dave; editing by David Gregorio and Marguerita Choy