Review Strategy: Two-Stage Streamlining Review

Case Study: RFA-16-022: Human Tissue Models for Infectious Diseases (U19)

Review 2016, SRO contact: Brenda Lange-Gustafson. Team members: Amir Zeituni, Susana Mendez, Duane Price, Lynn Rust, Carlton Boyd, Lucy Dickson, Nicole Slade-Acty, Voracia Oliver, Greg Jarosik.

Table of Contents:

Abstract

Driving Factors for Strategy Choice

Overview

Detailed Approach

Unique Features of the Strategy

Lessons Learned

Technical Challenges

Timeline

Questions and Answers

Abstract

A two-stage review strategy was employed to handle the large response of U19 applications (28, with ~154 projects/cores) when only a small number of awards (3-4, totaling $15-20M) were expected to be made. Reviewers reviewed the entire U19s to which they were assigned and submitted critiques for the projects, cores, and overall program 2 weeks prior to the FACA Streamlining teleconference meeting (Stage 1). The face-to-face Stage 2 meeting was scheduled 2 weeks after the Stage 1 meeting and discussed the most meritorious applications by overall program only (not by project/core), with the reviewers noting particular strengths and weaknesses from the projects and cores. Strengths of the strategy included the increased time reviewers had with the critiques prior to the Stage 1 streamlining meeting and with the to-be-discussed critiques prior to the Stage 2 face-to-face meeting; the requirement that each reviewer review the entire U19 application to which they were assigned, which allowed a full understanding of the program project; the participation of ~10 unassigned reviewers per application in the discussion, since they had had 2 weeks to focus on those applications; the resulting holistic and comprehensive discussion by overall program only; and the ability to write only one overall resume for the multi-project Summary Statement (SS) while still providing additional feedback, since all reviewers had provided overall program critiques as well as the usual critiques. Disadvantages of the strategy were the prolonged review timeline; limited flexibility in replacing lost reviewers, since each reviewer was assessing an entire application (thus, have extra reviewers on hand who are lightly assigned, with only one U19); and the challenges associated with transferring data after the close of the first FACA meeting. Not related to the strategy, but a lesson learned: the size of the panel was a challenge and actually should have been larger, since some reviewers were given 3 entire U19s to review, which was too heavy a load; 2 U19s should be the maximum if reviewers are reviewing the entire application. Also, the number of reviewers needed for the face-to-face meeting did not decrease substantially (from 68 to 54), so one should not count on a large reduction when choosing a potential room.

Driving Factors for Strategy Choice

The large response of U19 applications (28, with ~154 projects/cores) would be impossible to review in a single meeting due to volume and complexity. Given that only 3-4 applications would be funded, it was important to focus on differentiating among the most meritorious. Amir Zeituni and I developed the following plan and presented it to our branch chief, Gregory Jarosik. Then we presented it to the Program Officer, who very much liked the idea of review by overall program.

Overview

RFA-16-022: Human Tissue Models for Infectious Diseases (U19) required an applicant to develop an in vitro 3D human tissue model advanced enough to address the ground-breaking, obstacle-defeating questions in an infectious disease field (excluding HIV and prions). Thus, some applications used one model and tested various infectious diseases; others had multiple models with multiple pathogens.

This type of application lent itself well to being discussed by overall program only (vs. by project, then core, then overall program), since one was basically asking “are the models good and appropriate for the disease questions being asked?” and “are the disease questions the important ones?” We held a pre-review teleconference highlighting the unique issues of the RFA as well as the usual logistics.

We had essentially two types of reviewers (model people and pathogen people, although a good number were both), and we wanted each reviewer to review the entire U19 from their perspective. Thus, the pathogen reviewers assessed whether the important questions were being asked and whether the model had the component parts needed to answer those questions. Model people assessed whether the model held together for what it was being asked to do in all sections. Theoretically, for an application to pass streamlining, it would have to be strong in both areas.

Since we had the two types of reviewers, we wanted reviewers to have substantial time with the critiques prior to streamlining decisions so they could see comments from the other areas and take all comments into account prior to recommending that an application be streamlined. Thus, critiques were posted 2 weeks prior to the FACA Streamlining teleconference meeting (Stage 1). Several days prior to the streamlining meeting, streamlining data were presented to the panel in 3 ways of assessment (see below). It ended up that 11 applications were in the top categories in all 3 assessments (although for a few, the order changed). The panel thus streamlined 17/28 U19s (61%). This left a leeway of 3 applications that probably should have been streamlined.

After streamlining, reviewers then had 2 weeks to re-read the critiques for those 11 applications and to access the applications if necessary. Unassigned reviewers were given the option of posting critiques for those 11 applications as well as providing preliminary scores, which had no bearing on the application's score but indicated their preliminary thoughts about the application. Although we had been thinking that streamlining would decrease the panel size substantially, it did not: we still needed 54 unique reviewers for the 11 applications. Nine reviewers, whose applications were all streamlined, still requested to attend, and they were welcomed due to the extreme amount of work they had put into this review and the value they could add to the discussions.

At the face-to-face Stage 2 of the review, 11 applications were discussed by overall program, with the reviewers noting strengths and weaknesses from the projects and cores. Each review lasted approximately 1-1.5 hours.

Detailed Approach

Amir Zeituni and I developed the following plan and presented it to our branch chief, Gregory Jarosik. Then we presented it to the Program Officer, who very much liked the idea of review by overall program.

  • Stage 1
  • Administrative Review
  • SRDMS was used.
  • Duane made the preliminary COI list by a “highlighting multiple words using pdf” process (see attached). Susana ensured all COIs were included. (One scripted way to perform a similar keyword scan is sketched after this sub-section.)
  • Three SROs (Amir, Susana, BLG) performed administrative review and entered expertise terms into SRDMS.
  • SROs were provided a cheat sheet of terms to identify for each application, at minimum: pathogen(s), disease (if any), body part, type of model, stem cells, specialized needs, and heavy use of genomics, immunology, pathology, or imaging.
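
The attached process relies on highlighting words in the PDF by hand. As a rough illustration only, not the team's workflow, a similar keyword scan could be scripted; the sketch below assumes the pypdf library, a hypothetical application file name, and a made-up list of reviewer-related terms, and it simply reports the pages where each term appears so an SRO can follow up manually.

```python
# Minimal sketch (not the attached PDF-highlighting workflow): flag pages of an
# application PDF that mention reviewer-related terms, as a starting point for COI checks.
# The file name and term list below are hypothetical examples.
from pypdf import PdfReader

reviewer_terms = ["Jane Doe", "Example University"]  # hypothetical reviewer names/institutions

def flag_potential_coi(pdf_path, terms):
    """Return {term: [page numbers where the term appears]} for manual COI follow-up."""
    reader = PdfReader(pdf_path)
    hits = {term: [] for term in terms}
    for page_num, page in enumerate(reader.pages, start=1):
        text = (page.extract_text() or "").lower()
        for term in terms:
            if term.lower() in text:
                hits[term].append(page_num)
    return {term: pages for term, pages in hits.items() if pages}

# Example: flag_potential_coi("U19_application.pdf", reviewer_terms)
```
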
  • Recruitment
  • Review dates for both meetings were chosen prior to recruitment due to the expected large size of the panel.
  • The initial recruitment letter notified reviewers that this was a 2-stage review and that they had to commit to both meetings.
  • The initial recruitment letter also explained that these were multi-project applications with an average of 3 projects and 3 cores and that reviewers would be assigned 2-3 of these applications. Approximately a third to a little under one half of the reviewers did not really understand what this meant until they saw the assignment sheet with 12-16 assignments (including the overall application)!
  • For the initial recruitment log, BLG used Bruce Sundstrom’s QVR LIKE matrix, which identified ~ 600 potential reviewers. BLG narrowed this down to 130 for the initial mail merge.
  • BLG handled recruitment and received additional names from SROs on the team.
  • The initial mail merge went to only ~twice the number of panel members needed. It is suggested that it be three times, since once assignments are made from the large group of “yes” reviewers, it is difficult to determine the best search terms for another mail-merge group. At that point, recruiting is almost 1:1, since the expertise has already been spread out as best as possible across the initial group. It is better to start with too many than to try to fill in later.
  • Assignments
  • The Reviewer Assignment Form was exported from SRDMS to Excel, and all expertise terms were combined into one column; one way to script this step and the reviewer-load tally is sketched after this sub-section. (BLG had made an Excel summary sheet of all applications with the Goals, Specific Aims, and major expertise needed. She made sure all of that was included also.)
  • The assignments were made on this SRDMS-exported Reviewer Assignment Form by three SROs, who used the reviewers’ Attachment 4 expertise lists to match with the needs on the assignment form.
  • Reviewers were assigned to the entire U19, but they were not primary on all sections; their reviewer level varied.
  • Five to six reviewers were assigned to the projects, scientific cores, and overall program. Three reviewers were assigned to the administrative core.
  • A companion document was made to track reviewer load.
  • Once fully recruited and assigned, two SROs input the assignments into IMPAC and double checked them.
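
As a rough sketch only (the actual SRDMS export layout may differ), combining the expertise-term columns into a single column and tallying reviewer load can be scripted with pandas. The file name and column names below are assumptions for illustration, not the real export format.

```python
# Minimal sketch, assuming a hypothetical layout for the SRDMS-exported assignment form:
# one row per reviewer-section assignment, with several expertise-term columns to merge.
import pandas as pd

assignments = pd.read_excel("reviewer_assignment_form.xlsx")  # hypothetical file name

# Combine the separate expertise-term columns into one "All Expertise Terms" column.
expertise_cols = [c for c in assignments.columns if c.startswith("Expertise")]
assignments["All Expertise Terms"] = assignments[expertise_cols].apply(
    lambda row: "; ".join(t for t in row.dropna().astype(str) if t.strip()), axis=1
)

# Companion load table: applications and sections assigned per reviewer
# (each U19 carries roughly 12-16 sections, so load adds up quickly).
load = (
    assignments.groupby("Reviewer")
    .agg(applications=("Application", "nunique"), sections=("Application", "size"))
    .sort_values("sections", ascending=False)
)
print(load)
```
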
  • Mailout
  • Mailout was completed 2 months before critiques were due.
  • Federal Register Notices
  • Federal Register Notices need to go out at least 1 month prior to the meeting. There was less than a month between Stage 1 and Stage 2 meetings, but the second meeting can be created (without the applications in it) with enough information for the FRN (e.g., date, location, title, roster number).
  • Pre-review teleconference
  • One Pre-review Teleconference was held 3 weeks after mailout. Recordings were sent to those who could not attend (52 attended/16 received recordings).
  • We stressed to reviewers that, since this is a 2-stage review, they might receive inquiries from applicants, because applicants may see the new meetings in eRA Commons. We stressed that reviewers should not answer any questions but should refer them to the SROs. Reviewers asked questions, and we gave them some stock answers. If emailed: “I am not able to discuss anything related to the meeting; please contact the SRO.” If pushed in person: “I am uncomfortable with this discussion since I am unable to discuss anything related to the meeting; please contact the SRO.”
  • Critique Submission
  • Reviewers submitted critiques for each project, core, and the overall program with preliminary scores for each.
  • Critiques were due 2 weeks prior to the streamlining meeting. The reviewers had these 2 weeks to read critiques.
  • Reviewers did not edit critiques prior to the meeting.
  • Travel

We had the reviewers wait until after the Stage 1 streamlining meeting to make travel reservations.

  • Preparation for Streamlining meeting
  • Reviewers were provided scoring data in 3 different ways:
  • Preliminary Overall scores
  • Percent of projects, cores, and overalls in the high-impact range
  • Percent of projects and overalls in the high-impact range
  • Yong Gao provided a table (see attached) that he used to display data in this way, which was very helpful. It also provides all of the raw scores, so the reviewers have all of the data in one document.
  • Eleven applications clearly were at the top in all of the displayed data.
  • A reviewer also made a scatter plot with the mean and standard error of the mean that some reviewers liked very much. (One way to compute these summaries is sketched after this list.)
  • We emphasized often that the streamlining results would be final; there would be no changing them at the Stage 2 meeting. Holding 2 separate FACA meetings is the only way to ensure streamlining cannot be revisited at the second meeting.
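
For illustration only (the attached table and the reviewer's scatter plot were the actual artifacts), the three summaries above plus a mean ± SEM view could be computed from a long-format score table along the following lines. The file name, column names, and the high-impact cutoff are assumptions.

```python
# Minimal sketch, assuming one row per reviewer score with columns
# Application, ComponentType (Project/Core/Overall), and Score (1-9, lower is better).
import pandas as pd

scores = pd.read_csv("preliminary_scores.csv")  # hypothetical long-format export
HIGH_IMPACT_CUTOFF = 3  # hypothetical threshold for the "high impact range"

def pct_high_impact(df, types):
    """Percent of scores for the listed component types that fall in the high-impact range."""
    subset = df[df["ComponentType"].isin(types)]
    return (subset["Score"] <= HIGH_IMPACT_CUTOFF).groupby(subset["Application"]).mean() * 100

overall = scores[scores["ComponentType"] == "Overall"].groupby("Application")["Score"]
summary = pd.DataFrame({
    "Overall mean": overall.mean(),
    "Overall SEM": overall.sem(),
    "% high (projects+cores+overall)": pct_high_impact(scores, ["Project", "Core", "Overall"]),
    "% high (projects+overall)": pct_high_impact(scores, ["Project", "Overall"]),
}).round(2).sort_values("Overall mean")
print(summary)
```
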
  • Stage 1 streamlining meeting
  • The FACA Stage 1 meeting lasted about 1 hour.
  • Program preferred to have 12 applications reviewed, so the top 11 applications were in that range and were chosen for discussion. There was a clear cut-off at 8 applications, which the panel could have gone with, but the 11 were chosen.
  • Stage 1 scores
  • No scores were released until after Stage 2. When applicants asked about a change in the date of the meeting, etc., they were told that this RFA had two meeting dates associated with it and that scores would be released after the last meeting date.
  • Transferring Stage 1 data to a new Stage 2 meeting in IMPAC
  • Since all assignments, COIs, critiques, and scores must be removed for any application you want to move to a new FACA meeting, you must download and re-upload a number of documents:
  • Download all critiques as you normally would for the concatenation macro so you have the ND critiques and criterion scores
  • Download the overall preliminary scores for re-upload (criterion scores will be in the PSS).
  • Keep copies of assignments and COIs since they will have to be re-entered.
  • Download the Pre Summary Statement (PSS) from IAR after you have done the above since you need to re-upload these as the critiques in Meeting Materials
  • Amir made an abstract book for Stage 2 applications using the Abstract Book feature under the SRDMS Administrative Reports tab.
  • Also, we took apart the Abstract Book and made a pdf file of the abstract and the PSS for each to-be-discussed application.
  • Once everything is downloaded and removed for the to-be-discussed applications, Lucy can make a new FACA meeting and move the applications to that meeting.
  • Creating the Stage 2 Meeting in IMPAC
  • In IMPAC II
  • Re-enter COIs
  • Re-enter assignments
  • In IAR
  • Upload, into Meeting Materials, the PSS/Abstract book pdf for each application as application-specific documents. This preserves the COI restrictions.
  • This is now where reviewers look to see the critiques.
  • To input the original preliminary overall scores, I had to upload a blank critique.
  • Enable reviewers to the new meeting and explain document locations.

We used the 2 weeks until the Stage 2 meeting to have staff work on the ND SSs, but did not release them.

  • Stage 2
  • Meeting
  • The Stage 2 face-to-face SEP to discuss the competitive applications was held over 2 days.
  • Two chairpersons led the Stage 2 meeting.
  • We needed 54 of the 68 unique reviewers for the 11 applications. Nine reviewers, whose applications were all streamlined, still requested to attend, and they were welcomed due to the extreme amount of work they had put into this review and the value they could add to the discussion.
  • At the face-to-face Stage 2 of the review, 11 applications were discussed by overall program, with the reviewers noting particular strengths and weaknesses from the projects and cores.

Summary Statements

  • The summary statement for Stage 2 applications consisted of:
  • An SRO-written resume of the overall program
  • A list of the project/core titles and leader names, each with a statement saying “Based upon the evaluation of scientific and technical merit, Project 1 received Overall Impact scores averaging XX,” etc.
  • The reviewers’ Overall Program critiques (no criterion scores)
  • The Project critiques (with criterion scores)
  • Core critiques (no criterion scores)
  • The Summary Statement for Stage 1 (ND) applications consisted of:
  • A list of the project/core titles and leader names, each with a statement saying “Based upon the evaluation of scientific and technical merit, Project 1 received Overall Impact scores averaging XX,” etc.
  • The reviewers’ Overall Program Critiques (no criterion scores)
  • The Project critiques (with criterion scores)
  • Core critiques (no criterion scores)
  • A team of SRP staff was assembled (RSSs and SROs) to work on summary statements for the Not Discussed (ND) applications soon after the Stage 1 meeting.
  • Prior to the Stage 1 meeting, Carlton made an Excel table with all the titles, PIs, specific aims, etc., so these could be cut and pasted into the SS, completing the first part of the resume. Thus, the SRO only had to write the strengths and weaknesses after the meeting. (One way to script this kind of table-to-text assembly is sketched after this list.)
  • Prior to the Stage 1 meeting, BLG wrote all the footers and headers based on SRO notes. These were edited if necessary after the meetings.
  • All summary statements (Discussed and ND) were released ~ 1 month prior to council.
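
As an illustration only (Carlton's actual table was used for manual cut-and-paste), the fixed portion of each resume could also be assembled by script from such a spreadsheet. The file name and column names below are assumptions.

```python
# Minimal sketch, assuming a hypothetical spreadsheet with one row per project/core:
# columns Application, Component, Title, Leader, MeanScore.
import pandas as pd

components = pd.read_excel("ss_component_table.xlsx")  # hypothetical file name

def boilerplate(row):
    """Fixed sentence for one project/core, leaving strengths/weaknesses to the SRO."""
    return (
        f"{row['Component']}: {row['Title']} ({row['Leader']}). "
        "Based upon the evaluation of scientific and technical merit, "
        f"{row['Component']} received Overall Impact scores averaging {row['MeanScore']:.0f}."
    )

# Print the boilerplate grouped by application, ready to paste into the SS.
for app, group in components.groupby("Application"):
    print(f"--- {app} ---")
    for _, row in group.iterrows():
        print(boilerplate(row))
```
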

Unique Features of the Strategy