Elsevier: Combating image misuse in science: new Humboldt database provides “missing link”. “How do researchers use and change images to make their results look more consistent or convincing? What is considered ‘appropriate’ image manipulation, and when does a scientist cross the line? These are some of the questions I’ve been trying to answer since I started writing my PhD thesis on scholarly image manipulation back in 2013. Inappropriate image manipulation is not good for the ecosystem of science. Science builds on science, and if there’s something wrong with a published paper, then you are poisoning that well.” This is a much deeper dive than a simple new resource announcement.
Nature: Huge peer-review study reveals lack of women and non-Westerners. “Women are inadequately represented as peer reviewers, journal editors and last authors of studies, according to an analysis of manuscript submissions to an influential biomedical journal. The study looked at all submissions made to the open-access title eLife from its launch in 2012 to 2017 — nearly 24,000 in total. It found that women worldwide, and researchers outside North America and Europe, were less likely to be peer reviewers, editors and last authors. The paper — which hasn’t itself yet been peer-reviewed — was posted on the preprint server bioRxiv on 29 August.”
Nature: Dutch publishing giant cuts off researchers in Germany and Sweden. “Elsevier last week stopped thousands of scientists in Germany from reading its recent journal articles, as a row escalates over the cost of a nationwide open-access agreement. The move comes just two weeks after researchers in Sweden lost access to the most recent Elsevier research papers, when negotiations on its contract broke down over the same issue.”
Science Business: Free access to research papers by 2020? ‘Impossible without radical steps’, says EU official. “A senior European Commission official called for ‘radical steps’ to speed up making publicly funded research in Europe freely available to readers, rather than locked behind publishers’ paywalls. ‘We are today at 20 per cent full open access,’ said Robert-Jan Smits, former director-general for research and innovation at the Commission. ‘Fifteen years ago, we were at 15 per cent open access,’ and in 2016 the EU set a target that all publicly funded research be open, free, to readers by 2020.”
Open Science: Sharing Data Do Not Indicate Twitter Significantly Augments Article-Level Citation Impact of Recent Research Results. “Guest-authoring a post, published on June 12, 2018, for the Altmetric Blog, Stefanie Haustein, an information science scholar from the University of Ottawa and Université du Québec à Montréal, Canada, has drawn attention to the mixed findings on the connection between Twitter mentions and citation counts of recently published articles. While social media, such as Facebook, can be assumed to contribute to the visibility of scientific research results, the collection of essays on Internet-based indicators for the impact of science edited by Wolfgang Glänzel, Henk Moed, Ulrich Schmoch and Mike Thelwall, to be published later in 2018, incidentally opens the discussion on the degree to which altmetrics can be helpful for the assessment of article-level impact.”
LSE Impact Blog: Introducing the Free Journal Network – community-controlled open access publishing. “Discontent with the scholarly publishing industry continues to grow, as the prevailing subscription model appears increasingly unsustainable and open access big deals, one mooted alternative, unlikely to lead to optimal outcomes either. The Free Journal Network was established earlier this year in order to nurture and promote journals that are free to both authors and readers, and run according to the Fair Open Access Principles. Mark C. Wilson describes the progress the network has made so far, why community ownership is a crucial and underappreciated issue, and what research libraries can do to help.”
LSE Impact Blog: The academic papers researchers regard as significant are not those that are highly cited. “For many years, academia has relied on citation count as the main way to measure the impact or importance of research, informing metrics such as the Impact Factor and the h-index. But how well do these metrics actually align with researchers’ subjective evaluation of impact and significance? Rachel Borchardt and Matthew R. Hartings report on a study that compares researchers’ perceptions of significance, importance, and what is highly cited with actual citation data. The results reveal a strikingly large discrepancy between perceptions of impact and the metric we currently use to measure it.”