Types of Assessment Measures: Examples, Pros, and Cons

Tests
  Examples: Locally designed pre-tests and post-tests (administered within a single session, over a course, or across a student’s college career); the ETS ICT Literacy Assessment (based on interpretation of scenarios); Project SAILS (a standardized multiple-choice exam); an instrument being developed by a coalition of four Midwestern liberal arts colleges with input from their faculty and institutional research directors.
  Pros: Provides easily communicated data; can allow for comparisons across a campus or among institutions; allows for demographic analysis so that groups can be targeted for special assistance.
  Cons: Limited to questions that have a single best answer; limited to fairly discrete skills or bits of information; logistically difficult to get students to participate; does not show information applied in a complex and authentic setting; a student who scores well on specifics may fail to integrate them in practice (and vice versa).

Student evaluation of teaching
  Examples: Short surveys soliciting responses to library sessions; may be distributed in class or after students’ research assignments are well underway. You need to choose whether to focus on evaluating the teacher’s skill, on what the students retained, or on some combination.
  Pros: Easy to conduct; provides quick and easily understood information; can be directly applied to future teaching situations.
  Cons: Indirect evidence based on students’ impressions; difficult to gain much in-depth information about what they learned or failed to learn; more telling about attitude than about learning; may blur the line between evaluating individual librarians and assessing programs.

Surveys
  Examples: A campus-wide survey conducted by students on behalf of the library; an online survey targeted to students through e-mail or a course management system.
  Pros: Focused on student perspectives; can include the perspective of students who are not library users.
  Cons: Generally more revealing about the experience of using the library (e.g. temperature, lighting, noise level) than about learning; typically more focused on service issues than on research or inquiry.

Classroom assessment techniques
  Examples: One-minute essay; classroom polls; muddiest point.
  Pros: Quick to administer; can influence teaching on the fly.
  Cons: Hard to aggregate across classes; tends to focus on small rather than holistic issues.

Interviews & focus groups
  Examples: Hold annual focus groups with the same students as they progress through four years; ask individuals about a particular research project; senior exit interviews.
  Pros: Provides in-depth information; opens up issues that you might not think to ask about.
  Cons: Time-consuming; need to transcribe and analyze results; limited numbers of students involved.

Ethnographic observation
  Examples: Observe students’ use of library spaces; compare social vs. individual uses of the library; observe how students use resources.
  Pros: Can provide insights about library facilities that challenge assumptions; can suggest where changes might be made.
  Cons: Need to avoid being too intrusive; results can be interpreted in different ways; may raise questions that have to be explored by other means.

Examining finished products using a rubric
  Examples: Examine the content and presentation of bibliographies alone or of entire works; can examine papers, poster sessions, or speeches, but each requires a different rubric.
  Pros: Provides direct evidence of student learning; filters out interpersonal issues or subjective factors that might cloud an indirect assessment (e.g. interviews).
  Cons: Time-consuming; developing a rubric takes several passes with multiple judges to make sure it works; assignments may not ask students to do what the rubric assesses.

Examining portfolios
  Examples: Create portfolios for a library-taught course; examine portfolios compiled for other departments or courses; examine portfolios that are a general graduation requirement.
  Pros: Provides direct evidence of student learning across time; shows how and when different skills develop, which may suggest developmental paths for an instruction program; may offer insight into what departments expect of students.
  Cons: Time-consuming; need to decide what to look for and then develop a rubric for finding it; portfolios compiled for other purposes may not include work that demonstrates information literacy.

Alaska Library Association Conference, 2006 / Barbara Fister