Reproducibility

Reproducibility is a cornerstone of scientific research, ensuring the verifiability and trustworthiness of findings. Its importance is particularly emphasized in domains such as computer science and AI. While reproducibility initiatives have emerged across different communities, practical implementation still faces significant barriers, including technological limitations as well as social and governance challenges.

Reproducibility stands as a fundamental foundation of scientific research. It can be defined as the capacity to reproduce the findings of scientific papers, and it is key for verifying the outcomes reported in scientific work. The difficulty of reproducing the experiments of scientific research has been a matter of strong debate within the computer science and AI communities in recent years. Ensuring the robustness and trustworthiness of research that relies on computational methods and data is of utmost importance, since the inability to replicate experimental outcomes may have serious consequences, including the retraction of publications and harm to the reputation of both the authors and the conference or journal that disseminated the work. Moreover, promoting reproducibility facilitates the dissemination, comparison, and adoption of research, thereby potentially augmenting the influence and prominence of scholarly articles.

In response to the growing need for rigor and transparency in research, numerous reproducibility initiatives have emerged in the software engineering community, the supercomputing community, and more recently in the machine learning community. These initiatives establish and streamline the governance needed to assess the reproducibility of submitted papers. Nonetheless, reproducibility in practice is in its infancy, and numerous barriers must still be addressed. These barriers range from the challenges and constraints imposed by current technologies to the social and governance aspects that necessitate a shift in our research practices. While certain technical barriers have been analyzed previously, we revisit them in Task 4.3, alongside an analysis of the social and governance obstacles; these latter barriers represent the challenges encountered in real practice. Furthermore, we introduce a reproducibility process aimed at mitigating these barriers, which we presented at ECAI’23.

This analysis of current reproducibility in practice is essential to understand the requirements that the platform tools, resources, and services of the AIoD need to address. Our main objective is to make these efforts as useful as possible for the European AI community. Regarding our reproducibility analysis, our contributions encompass the following points:

  • Revision and extension of the technical barriers to reproducibility.
  • Identification of social barriers to reproducibility.
  • A reproducibility process, derived from our findings, to mitigate these barriers.

We leverage existing know-how in reproducibility to specify a concrete reproducibility process that has been implemented in practice at MLSys, SC23, ICPP23, TPDS, and ICPP24. The process goes beyond existing recommendations and standardizes the contents of the documentation (i.e., how the software and data artifacts need to be documented). By ensuring that artifact documentation follows a standardized format, the process makes it easier for authors to adhere to documentation practices, and gives reviewers and third-party researchers more consistent content across submissions.
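To make the idea of standardized artifact documentation concrete, a descriptor along the following lines could accompany each submission. This YAML sketch is purely illustrative: the field names and values are our assumptions, not the actual template used by the venues above.

```yaml
# Hypothetical artifact descriptor -- field names are illustrative
# assumptions, not the official template of MLSys, SC, ICPP, or TPDS.
artifact:
  title: "Software and data artifacts for <paper title>"
  license: "Apache-2.0"
  archive: "permanent repository with a DOI (e.g., Zenodo)"
  environment:
    os: "Ubuntu 22.04"
    dependencies:
      - "python >= 3.10"
      - "numpy"
    hardware: "1 node, 2x GPU (16 GB memory each)"
  setup:
    - "pip install -r requirements.txt"
  experiments:
    - name: "main-results"          # maps to a figure/table in the paper
      command: "python run_experiment.py --config configs/main.yaml"
      expected_runtime: "~30 min"
      output: "results/main.csv"
```

A fixed schema like this lets reviewers locate the setup steps, resource requirements, and expected outputs in the same place for every submission, rather than parsing free-form README files.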