Computer-Based Assessment at UCSF, November 15, 2012

Notes from Discussion

Faculty Panelists:

Tracy Fulton, School of Medicine

Pilar Bernal de Pheils, School of Nursing

JoAnne Saxe, School of Nursing

Wade Smith, School of Medicine

Motivations for moving to computer-based assessments

  • Effective way to assess knowledge of rote facts, which is still a portion of the curriculum
  • Desire to get students comfortable taking exams online in preparation for board exams
  • Ease of grading
  • Ability to pull out additional data for analysis of student performance
  • Desired feature: ability for faculty to generate these reports themselves and to capture data for longitudinal comparisons

Exam formats described

-Open-book, timed exams taken within a window of time; students can take them from anywhere

-On-campus timed exams; all students take the exam at the same time; closed book

-Multiple-choice, true/false, matching, and essay questions

-Multiple-choice exam taken on campus, followed by a timed essay portion completed at home

-Essay exams; interest in increasing these, but faculty want to keep some multiple-choice exams to prepare students for boards

Other types of assessment activities

-Most common: Regular (i.e. weekly) self-assessments with immediate feedback

-Self-assessments and ‘low-stakes exams’ discussed as important learning tools

-Asynchronous online discussions used as an assessment tool; students work in groups to respond to an instructor’s question regarding a clinical case

-Synchronous tele-consult activities with assessment component

Different Approaches to Common Issues

  • Open or closed book
      - Panelists were comfortable with open book on timed exams; there is not enough time to look everything up
      - Exams taken by students from home or off campus were all open book
      - Closed-book exams held on campus did not prevent students from keeping a short cheat sheet they could peek at on their computers, but panelists felt there was not enough time for students to really go digging around for answers
      - The Honor Code is also an important tool. Some schools have students sign a code of conduct that specifically addresses these types of issues.
  • Providing access to the answer key, and the potential to compromise future re-use of the same exam
      - An effective method has been to display the answer key on-screen for one hour during class time, allowing students to review their exams against the key. Students can also ask questions of faculty to understand their mistakes.
      - Pilar Bernal de Pheils has systematically revealed the correct answers to students as soon as they complete the exam. It is the students’ responsibility to compare their own answers to the correct answers, and students have the option to do extra work to demonstrate their understanding of the questions they missed. She has not seen any evidence that students have given the answers to others.
      - Desired feature: strong interest in being able to reveal the right answers to students immediately without having to worry about the key being compromised
  • Exam creation process – labor intensive
      - Collaborative exam creation is a feature of the SOM integrated curriculum; using Moodle made this easier than the paper process, but still not easy. Desired feature: a practical method for collaborating on exam design.
      - One course director collected 25 questions from subject matter experts and picked the best 10
  • Creating excellent exam questions and dealing with students’ criticisms of ‘bad’ questions
      - JoAnne Saxe has set expectations by telling students up front that an exam is experimental and that faculty are constantly working to improve the questions, so students should point out any problems they see
      - Faculty take very seriously the need to develop effective and valid exam questions; they are aware of the impact on a faculty member’s and a course’s reputation if students perceive an exam as irrelevant, poorly designed, etc.
      - SON faculty have made use of the Moodle Quiz Tool’s features for analyzing the effectiveness of questions
      - Desired feature: a place for students to comment on a question they found unclear, difficult to answer, or otherwise problematic

Key Issue: Linking curriculum with assessments, e.g., tying assessments to learning objectives

  • Current functionality: each question has a ‘title’ assigned to it, which can contain learning objectives or other keywords to help with tracking. This partially meets the need.
  • Desired features:
      - more granular tagging of questions with attributes such as learning objectives
      - reports to track performance on questions based on these attributes
      - a question bank that enables programs and schools to share questions addressing competencies common across programs, by searching on these attributes
  • Desired feature: ability to use assessments to track student progress in acquiring competencies (Where do I need help? Where do I need to put more effort?)

SOM and SOP considering pilots of ExamSoft

  • Eliminates dependence on network connection
  • Ability to block students from looking up answers
  • Exam-taking experience closer to paper and pencil, e.g. the ability to cross out answers that a student has eliminated
  • Additional features for analysis and tracking
  • Note: excellent timing to compare ExamSoft with the new Moodle 2 environment and quiz tool, to see whether Moodle can meet these needs

Assessments in 5-10 years

  • More collaborative assessment, with more faculty involved in the assessment process
  • Triple jump method: give students a problem to work on; they report back to faculty at different stages of their process
