Pilot 1: Open Peer Review for Conferences


The goal of the pilot study was to test an open peer review workflow in a conference setting. For this purpose, we adapted and evaluated an existing conference management system (CMS) that includes features such as:

  • open report: the review report is published alongside the publication,
  • open identity: authors and/or reviewers are aware of each other’s identity, or
  • open pre-review: early versions of material are public before the review.

Key Results

We tested the practicability and impact of Open Peer Review (OPR) at two conferences. The first venue for Pilot 1 was the Second European Machine Vision Forum 2017 (EMVA 2017) and the second venue was the eHealth2018 Master Student Competition. The conference organisers agreed to test the four OPR principles “Open Identity”, “Open Participation”, “Open Report” and “Open final-version comments” for papers submitted through the submission system.

The most important changes to the traditional workflow of paper submission, reviewing and voting included:

  • After the submission deadline, all participants can see all submissions and discuss them with the authors.
  • Instead of full formal reviews, shorter comments of about a paragraph summarize each reviewer’s opinion and suggestions for improvement.
  • All identities connected to comments and submissions are visible to all participants.
  • In the final voting, every participant has four votes; project committee members have ten.
  • The sum of all votes from all members determines the final acceptance decision.
  • Discussions can continue in the CMS interface after the conference has finished.
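The weighted voting described above can be sketched as a simple tally. This is an illustrative example only; the function and data-structure names are hypothetical and not taken from the actual CMS implementation.

```python
# Illustrative sketch of the voting scheme: regular participants have a
# budget of 4 votes, project committee members have 10, and the final
# ranking is the sum of all votes each submission received.

VOTES_PER_PARTICIPANT = 4
VOTES_PER_COMMITTEE_MEMBER = 10

def tally(ballots):
    """Sum the votes each submission received across all ballots.

    `ballots` is a list of (is_committee_member, votes) pairs, where
    `votes` maps a submission id to the number of votes that person
    allocated to it. A ballot exceeding the voter's budget is rejected.
    """
    totals = {}
    for is_committee, votes in ballots:
        budget = VOTES_PER_COMMITTEE_MEMBER if is_committee else VOTES_PER_PARTICIPANT
        if sum(votes.values()) > budget:
            raise ValueError("ballot exceeds vote budget")
        for submission, n in votes.items():
            totals[submission] = totals.get(submission, 0) + n
    # Final ruling: submissions ranked by total votes, highest first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

ballots = [
    (False, {"paper-A": 3, "paper-B": 1}),   # regular participant
    (False, {"paper-B": 4}),                 # regular participant
    (True,  {"paper-A": 6, "paper-C": 4}),   # committee member
]
print(tally(ballots))
```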

For the eHealth2018 student competition, we tested three OPR principles, “Open Identity”, “Open Participation”, and “Open Report”, but with a different setup:

  • After the submission deadline, all submissions stayed hidden. However, each participant in the challenge had to write two ‘lay-man’s reviews’ (“Open Participation”). These initially double-blind reviews were complemented by traditional ‘expert reviews’ from externally assigned reviewers.
  • The rebuttal phase allowed each participant to withdraw their submission based on the reviews received. In that case the contribution remained hidden and all involved persons (reviewers and authors) stayed anonymous. All participants still in the race at the end of the rebuttal phase moved to an “Open Identity” status: all submissions, the reviews (“Open Report”), reviewers’ names and authors’ names became visible to all conference visitors.
  • The program committee used all available reviews (lay-man and expert) to decide on the winner of the competition.
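The submission lifecycle above can be summarized as a small state machine: a submission stays hidden through double-blind review and, at the end of the rebuttal phase, either becomes fully open or is withdrawn and remains anonymous. The state and function names below are illustrative assumptions, not identifiers from the actual system.

```python
# Hypothetical sketch of the eHealth2018 submission lifecycle.
from enum import Enum

class Status(Enum):
    HIDDEN = "hidden"            # after submission deadline, double-blind review
    IN_REBUTTAL = "in_rebuttal"  # reviews delivered, authors may withdraw
    WITHDRAWN = "withdrawn"      # stays hidden, everyone stays anonymous
    OPEN = "open"                # open identity: submission, reviews, names public

TRANSITIONS = {
    Status.HIDDEN: {Status.IN_REBUTTAL},
    Status.IN_REBUTTAL: {Status.WITHDRAWN, Status.OPEN},
    Status.WITHDRAWN: set(),     # terminal
    Status.OPEN: set(),          # terminal; program committee picks the winner
}

def advance(current, target):
    """Move a submission to `target`, enforcing the allowed transitions."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.value} to {target.value}")
    return target

s = advance(Status.HIDDEN, Status.IN_REBUTTAL)
s = advance(s, Status.OPEN)
print(s)  # Status.OPEN
```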

To support the specific mix of new OPR features needed, we created our own CMS solution by adapting the popular HotCRP system. The resulting source code has been released to the public under an open source license at the GitHub repository: https://github.com/mthz/hotcrp.

Feedback from the researchers involved in the OPR process at the EMVA conference and eHealth 2018 student competition was positive. Overall, the participants expressed a strong acceptance of the proposed OPR process and would support it again. The participants’ greatest fears associated with OPR included: biased/whitewashed reviews due to non-anonymity; backlash for bad reviewing (e.g. over other channels/private email); and added effort and risk for reviews outside one’s own expertise (lay-man reviews). Also, the conference organisers of the EMVA are willing to continue applying the OPR approach for the next conference.

Read the full evaluation report here.

Contact: Oliver Zendel, AIT Austrian Institute of Technology, oliver.zendel@ait.ac.at