
Nopparat Khokthong / Shutterstock
By Jeffrey Brainard
Scientists eager to spread the word about a brand-new study often share a link on Twitter. But Twitter links rarely attract eyeballs to papers, a recent study finds.
A survey of 1.1 million Twitter links to scientific articles found that half did not attract any clicks, and an additional 22% attracted only one or two. Only about 10% of the links received more than 10 clicks, according to the study, published on 23 January in the Journal of the Association for Information Science and Technology.
Such meager click-through rates are not uncommon, other Twitter studies have found; tweets highlighting stories in the news media do not fare much better on average. And although links to most of the research articles in the new study drew few or no clicks, a small minority went viral: links to an article on freshwater fish contaminated with radioactive cesium released by the 2011 Fukushima nuclear disaster received more than 25,000 clicks.
The Twitter study broke new ground as one of the first to measure how users of the social media platform respond to tweeted scientific articles using a measure other than clicking the like or retweet buttons. Other research has shown that, because of Twitter’s 280-character limit, many tweets about papers display only the title, so a like or retweet can be merely a fleeting gesture of interest based on limited information. Clicking on a link, in contrast, is a sign—although not proof—that someone has read the paper, say the authors of the new study, led by Zhichao Fang of Leiden University.
Because of technical constraints, the researchers were only able to examine links to papers published from 2012 to 2017. They therefore could not assess whether clicks have increased since the start of the COVID-19 pandemic, during which many researchers turned to Twitter to share papers and commentary.
Another data limitation: the team examined only a subset of all links—those created by the bit.ly link-shortening service, which lets social media users compress longer URLs. About 15% of all tweets during the study period contained bit.ly links. (Twitter introduced its own link-shortening feature in 2017, but click data for those links are not widely available.)
Even with such limitations, Fang’s team found that even the hottest tweets, as measured by clicks, did not have much impact on subsequent scholarship. Papers mentioned in popular tweets, for example, did not receive significantly more citations. This may reflect that tweets are usually posted quickly with little deliberation, whereas citations are often chosen after careful consideration, other research indicates. “Science and social media are two different spaces of engagement,” says Rodrigo Costas Comesana of Leiden University, who co-authored the new study. “Each has its own rules.”
Costas Comesana and Fang say that if Twitter were willing to provide them with more information about links, it could help them better understand why scientists—and nonscientists—click on some tweeted articles but not others. They wonder, for example, whether the prominence of the tweeter, or of the journal where the article is published, makes a difference. (Using existing data, the research team was unable to determine how many scientists click on links.)
The new study contributes to the understanding of how science is communicated on Twitter, says Nicolás Robinson-García of the University of Granada, who was not involved in the study. He and colleagues published a separate analysis in 2017 which found it was “impossible” to use only the content of a tweet to “infer that there was any form of engagement with the paper itself,” he noted in an email. Examining link clicks, he wrote, can instead provide a clearer, though not complete, picture of what users are doing.
Robinson-García’s own work suggests Twitter is not an effective medium for catalyzing meaningful, sustained consideration of new findings. In a 2017 analysis, he and colleagues examined the content of 8247 tweets about 4358 articles published in dental journals. They found that many tweets were merely retweets or duplicates sent from the same account, probably by bots. Only 6% of the tweets—coming from just 1% of the Twitter accounts studied—offered evidence that the tweeter had read the paper, as indicated by comments in the tweet about the article’s conclusions or other aspects.
It would be interesting, they wrote, “to identify the tweets and accounts that are truly informative, relevant, and indicative of the reception and discussion of research.”