PEERS - an open science “Platform for the Exchange of Experimental Research Standards” in biomedicine

Annesha Sil, Anton Bespalov, Christina Dalla, Chantelle Ferland-Beckham, Arnoud Herremans, Konstantinos Karantzalos, Martien J. H. Kas, Nikolaos Kokras, Michael J. Parnham, Pavlina Pavlidi, Kostis Pristouris, Thomas Steckler, Gernot Riedel, Christoph H Emmerich* (Corresponding Author)

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Laboratory workflows and preclinical models have become increasingly diverse and complex. Confronted with a multitude of information of ambiguous relevance to their specific experiments, scientists run the risk of overlooking critical factors that can influence the planning, conduct and results of studies and that should have been considered a priori. To address this problem, we developed ‘PEERS’ (Platform for the Exchange of Experimental Research Standards), an open-access online platform built to aid scientists in determining which experimental factors and variables are most likely to affect the outcome of a specific test, model or assay and therefore ought to be considered during the design, execution and reporting stages.

The PEERS database is categorized into in vivo and in vitro experiments and provides lists of factors, derived from the scientific literature, that have been deemed critical for experimentation. The platform is based on a structured and transparent system for rating the strength of evidence related to each identified factor and its relevance for a specific method/model. In this context, the rating procedure will not be limited solely to the PEERS working group but will also allow for community-based grading of evidence. Here we describe a working prototype using the Open Field paradigm in rodents and present the selection of factors specific to each experimental setup and the rating system. PEERS not only offers users the possibility to search for information that facilitates experimental rigor, but also draws on the engagement of the scientific community to actively expand the information contained within the platform. Collectively, by helping scientists search for specific factors relevant to their experiments, and to share experimental knowledge in a standardized manner, PEERS will serve as a collaborative exchange and analysis tool to enhance data validity and robustness as well as the reproducibility of preclinical research. PEERS offers a vetted, independent tool by which to judge the quality of information available on a certain test or model, identifies knowledge gaps and provides guidance on the key methodological considerations that should be prioritized to ensure that preclinical research is conducted to the highest standards and best practice.
Original language: English
Article number: 755812
Number of pages: 9
Journal: Frontiers in Behavioral Neuroscience
Volume: 15
Early online date: 21 Oct 2021
DOIs
Publication status: Published - 21 Oct 2021

Bibliographical note

Funding
The PEERS Consortium is currently funded by Cohen Veterans Bioscience Ltd and grant COH-0011 from Steven A. Cohen.

Acknowledgements
We would like to thank IJsbrand Jan Aalbersberg, Natasja de Bruin, Philippe Chamiot-Clerc, Anja Gilis, Lieve Heylen, Martine Hofmann, Patricia Kabitzke, Isabel Lefevre, Janko Samardzic, Susanne Schiffmann and Guido Steiner for their valuable input and discussions during the conceptualization of PEERS and the initial phase of the project.

Keywords

  • reproducibility
  • study design
  • neuroscience
  • transparency
  • platform
  • quality rating
  • study outcome
  • animal models

