BACKGROUND Improving rigor and transparency measures should lead to improvements in reproducibility across the scientific literature; however, the assessment of measures of transparency tends to be very difficult if performed manually. OBJECTIVE This study addresses the enhancement of the Rigor and Transparency Index (RTI, version 2.0), which attempts to automatically assess the rigor and transparency of journals, institutions, and countries using manuscripts scored on criteria found in reproducibility guidelines (eg, Materials Design, Analysis, and Reporting checklist criteria). METHODS The RTI tracks 27 entity types using natural language processing techniques such as Bidirectional Long Short-term Memory Conditional Random Field–based models and regular expressions; this allowed us to assess over 2 million papers accessed through PubMed Central. RESULTS Between 1997 and 2020 (where data were readily available in our data set), rigor and transparency measures showed general improve...
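The regular-expression side of a pipeline like the one the abstract describes can be sketched in a few lines. This is a minimal illustration, not the RTI's actual implementation: the pattern names and keyword lists below are assumptions for demonstration, and the BiLSTM-CRF models the RTI also uses are beyond what a regex sketch can reproduce. The RRID pattern follows the documented `RRID:AB_2107436`-style syntax.

```python
import re

# Hypothetical patterns illustrating regex-based criteria detection.
# Names and keyword choices are illustrative, not the RTI's own.
PATTERNS = {
    # RRIDs follow a documented syntax, e.g. "RRID:AB_2107436"
    "rrid": re.compile(r"RRID:\s?[A-Z]+[_:]?[A-Za-z0-9_-]+"),
    "blinding": re.compile(r"\bblind(?:ed|ing)?\b", re.IGNORECASE),
    "randomization": re.compile(r"\brandomi[sz](?:ed|ation)\b", re.IGNORECASE),
    "power_analysis": re.compile(r"\bpower (?:analysis|calculation)\b", re.IGNORECASE),
}

def screen_methods(text: str) -> dict:
    """Report which rigor criteria are mentioned in a methods section."""
    return {name: bool(p.search(text)) for name, p in PATTERNS.items()}

methods = ("Mice were randomized into groups and scorers were blinded. "
           "Antibody: anti-GFAP (RRID:AB_2107436). "
           "A power analysis determined the sample size.")
print(screen_methods(methods))
```

Note that simple keyword matching cannot distinguish "we performed a power analysis" from "no power analysis was performed", which is one reason trained sequence models are used alongside regular expressions.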
SciScore, an automated screening tool, gives studies a mark out of ten for 'rigour and transparency'. SciScore searches methods sections for pieces of key information, which act as proxies for how rigorous the experiments are and how easy it would be for other researchers to reproduce them. The software can flag where authors have specifically identified the reagents they use, such as antibodies, software tools, cell lines, or transgenic organisms. It also checks whether they have discussed factors such as sample sizes, how tests have been blinded, or the sex of animals used. Last month we launched version 2 of SciScore, which features 55 algorithms and a brand-new MDAR report that can be used as the author's checklist. We have also extended the rigor checklist in the SciScore report so that it will be of added value for publishers, institutions, and funders alike.
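To make the "mark out of ten" idea concrete, here is one way binary criteria checks could be aggregated into a 0–10 score. This is purely illustrative and is not SciScore's actual scoring algorithm (which combines 55 algorithms); the criteria names and the equal weighting are assumptions for the sketch.

```python
# Illustrative criteria only; real tools use many more checks with
# differing weights. Each criterion is a yes/no finding from screening.
CRITERIA = [
    "antibodies_identified", "software_identified", "cell_lines_identified",
    "organisms_identified", "sample_size_stated", "blinding_addressed",
    "sex_reported", "randomization_addressed", "power_analysis",
    "ethics_statement",
]

def rigor_score(found: set) -> float:
    """Scale the number of satisfied criteria to a mark out of ten."""
    satisfied = sum(1 for c in CRITERIA if c in found)
    return round(10 * satisfied / len(CRITERIA), 1)

print(rigor_score({"antibodies_identified", "blinding_addressed",
                   "sample_size_stated"}))  # 3.0
```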
- What can a journal do with altmetrics?
- Using Altmetrics to Encourage and Track the Dissemination of a Potentially High Impact Paper
- Monitoring the broad impact of the journal publication output on a country level: A Case Study for Austria
During the Rubber-meets-the-road panel at the 2019 STM Conference in London, Anita Bandrowski and Martijn Roelandse presented new means to measure the transparency and reproducibility of biomedical journals. These means are based on SciScore, a tool that can evaluate whether the authors have addressed blinding, sex, and randomization of subjects into groups, power analysis, as well as key resources. These are all difficult and tedious things for humans to check, but critical if we want to measure — and ultimately improve — the quality of the science being conducted and published. The first results were presented in London.
My slides for "Seminar 4: Innovation in Scholarly Book Publishing: What Have We Achieved and What More Is Needed?" at #SSP2017
- Annotation in Researcher and Publishing Workflows
- Annotation in Publisher and Researcher Workflow
- Annotation and Metrics on Cambridge Core ...
EXPLORE THE IMPACT OF YOUR BOOKS
Bookmetrix brings together a collection of performance metrics, helping you to see how your books are being discussed, cited, and used around the world.
- Timo Hannay of Digital Science opened the session with a brief introduction. He pointed out that journal-level metrics highlight only one dimension of impact. Furthermore, article-level metrics are "much more than metrics".
- After introducing himself, Hans Zijistra illustrated the work done by Elsevier in the field of altmetrics. His presentation was very interesting for me: he spoke with passion and enthusiasm about bridging the communication gaps between old and new researchers on the issue of alternative metrics. For him, altmetrics are "another angle on a paper, not the only way to impact". This is a very important aspect for those who, like me, are trying to introduce altmetrics in their institutions. Hans pointed out how often the term altmetrics "confuses people". He suggested promoting, educating about, and exploring these metrics, and making them known to scientists with a "holistic approach". But, in the meantime, what is Elsevier doing? They have been running a pilot across Scopus and ScienceDirect in which they are testing different versions of the Altmetric donut, and they will be announcing further developments soon. At the end of his presentation, Hans highlighted the importance of quality over quantity, and that when you click the donut to reveal the full original data, altmetrics become "more sexy"!
- Jennifer Lin brought the experience of open access publishers. PLOS was the first publisher to introduce article-level metrics for its journals. She started her presentation with an Albert Einstein quotation: "Not everything that can be counted counts, and not everything that counts can be counted". Jennifer explained that constructing such a system of metrics is very difficult. Researchers want to find the relevant papers, the most bookmarked, and navigate through the literature. In other words, what happens behind the numbers and the data is altmetrics. PLOS is focusing on article usage by subject area and use across all the materials. The [...]
Recognizing that the use of ORCID iDs and their associated metadata records — by authors, publishers, and others in the publishing ecosystem — has thus far been almost exclusively in the context of journals, ORCID contracted with Apex to explore the extent to which ORCID is used in book workflows and to make recommendations to promote and facilitate this. In the context of virtual meetings with the ORCID in Books Community Working Group (CWG), it was determined that this could best be accomplished by two vehicles:
- A survey that could be sent out to a wide variety of recipients and which would contain questions designed for systematic tabulation and analysis.
- A set of some 25 interviews with the CWG members and a carefully selected group of publishers and related organizations that could provide more wide-ranging and nuanced insights.
The reproducibility crisis in science is a multifaceted problem involving practices and incentives, both in the laboratory and in publication. Fortunately, some of the root causes are known and can be addressed by scientists and authors alike. After careful consideration of the available literature, the National Institutes of Health identified several key problems with the way that scientists conduct and report their research and introduced guidelines to improve the rigor and reproducibility of pre-clinical studies. Many journals have implemented policies addressing these same criteria. We currently have, however, no comprehensive data on how these guidelines are impacting the reporting of research. Using SciScore, an automated tool developed to review the methods sections of manuscripts for the presence of criteria associated with the NIH and other reporting guidelines, e.g., ARRIVE, RRIDs, we have analyzed ~1.6 million PubMed Central papers to determine the degree to which article...
This report is the collective product of world-leading experts working in the branches of integrative medicine by predictive, preventive and personalised medicine (PPPM) under the coordination of the European Association for Predictive, Preventive and Personalised Medicine. The general report has been prepared as the consortium document proposed at the EPMA World Congress 2011, which took place in Bonn, Germany. This forum analyzed the overall deficits and trends relevant for top science and daily practice in PPPM focused on the patient. Follow-up consultations resulted in a package of recommendations for consideration by research units, educators, the healthcare industry, policy-makers, and funding bodies to cover the current knowledge deficit in the field and to introduce integrative approaches for advanced diagnostics, targeted prevention, treatments tailored to the person, and cost-effective healthcare.
Papers by Martijn Roelandse