Open Science: Sharing Data Do Not Indicate Twitter Significantly Augments Article-Level Citation Impact of Recent Research Results

Open Science: Sharing Data Do Not Indicate Twitter Significantly Augments Article-Level Citation Impact of Recent Research Results. “Guest-authoring a post, published on June 12, 2018, for the Altmetric Blog, Stefanie Haustein, an information science scholar from the University of Ottawa and Université du Québec à Montréal, Canada, has drawn attention to the mixed findings on the connection between Twitter mentions and citation counts of recently published articles. While social media, such as Facebook, can be assumed to contribute to the visibility of scientific research results, the collection of essays on Internet-based indicators for the impact of science edited by Wolfgang Glänzel, Henk Moed, Ulrich Schmoch and Mike Thelwall, to be published later in 2018, incidentally opens the discussion on the degree to which altmetrics can be helpful for the assessment of article-level impact.”

VOX EU: Effects of copyrights on science

VOX EU: Effects of copyrights on science. “Copyrights grant publishers exclusive rights to content for almost a century. In science, this can involve substantial social costs by limiting who can access existing research. This column uses a unique WWII-era programme in the US, which allowed US publishers to reprint exact copies of German-owned science books, to explore how copyrights affect follow-on science. This artificial removal of copyright barriers led to a 25% decline in prices, and a 67% increase in citations. These results suggest that restrictive copyright policies slow down the progress of science considerably.”

LSE Impact Blog: The academic papers researchers regard as significant are not those that are highly cited

LSE Impact Blog: The academic papers researchers regard as significant are not those that are highly cited. “For many years, academia has relied on citation count as the main way to measure the impact or importance of research, informing metrics such as the Impact Factor and the h-index. But how well do these metrics actually align with researchers’ subjective evaluation of impact and significance? Rachel Borchardt and Matthew R. Hartings report on a study that compares researchers’ perceptions of significance, importance, and what is highly cited with actual citation data. The results reveal a strikingly large discrepancy between perceptions of impact and the metric we currently use to measure it.”

EurekAlert: How social media helps scientists get the message across

EurekAlert: How social media helps scientists get the message across. “Analyzing the famous academic aphorism ‘publish or perish’ through a modern digital lens, a group of emerging ecologists and conservation scientists wanted to see whether communicating their new research discoveries through social media–primarily Twitter–eventually leads to higher citations years down the road. Turns out, the tweets are worth the time investment.”

Medium: What are the ten most cited sources on Wikipedia? Let’s ask the data.

Medium: What are the ten most cited sources on Wikipedia? Let’s ask the data. “Citations are the foundation of Wikipedia’s reliability: they trace the connection between content added by our community of volunteer contributors and its sources. For readers, citations provide a mechanism to validate and check for themselves that what Wikipedia says is sound and trustworthy: they act as a gateway towards a broader ecosystem of reliable knowledge. In an effort to spearhead more research on where Wikipedia gets its facts from, and to celebrate Open Citations Month, we asked ourselves: what are the most cited sources across all of Wikipedia’s language editions?”

Nature: Science search engine links papers to grants and patents

Nature: Science search engine links papers to grants and patents. “The marketplace for science search engines is competitive and crowded. But a database launched on 15 January aims to provide academics with new ways to analyse the scholarly literature — including the grant funding behind it. Dimensions not only indexes papers and their citations, but also — uniquely among scholarly databases — connects publications to their related grants, funding agencies, patents and clinical trials. The tool ‘should give researchers more power to look at their fields and follow the money’, says James Wilsdon, a research-policy specialist at the University of Sheffield, UK.”

Retraction Watch: The “phantom reference:” How a made-up article got almost 400 citations

Retraction Watch: The “phantom reference:” How a made-up article got almost 400 citations. “Pieter Kroonenberg, an emeritus professor of statistics at Leiden University in The Netherlands, was puzzled when he tried to locate a paper about academic writing and discovered the article didn’t exist. In fact, the journal—Journal of Science Communications—also didn’t exist. Perhaps Kroonenberg’s most bizarre discovery was that this made-up paper, ‘The art of writing a scientific article,’ had somehow been cited almost 400 times, according to Clarivate Analytics’ Web of Science.”