Learning Outcomes Assessment at Hamline University

AY 2013-2015 Assessment Report

July 30, 2015

Submitted by Stacie Bosley

Director of Assessment and Chair of the Committee on Learning Outcomes Assessment

Overview:

The Committee on Learning Outcomes Assessment (CLOA) exists to help generate a culture of program assessment at Hamline, to develop a framework for reporting assessment activity, and to provide training, feedback and advice on assessment practices. More broadly, CLOA’s long-run objective is to support the improvement of programs and student outcomes at Hamline. Such work ensures institutional relevance and viability, supporting successful reaccreditation or other external review. Furthermore, meaningful measurement systems provide evidence of value, where such value is defined around intellectual promises that are student-centered and authentic to Hamline.

CLOA’s membership has broadened since the committee’s creation and in 2013-2015 included the following:

  • two representatives from the College of Liberal Arts (CLA),
  • two representatives from the Hamline School of Business (HSB) – if desired, one representing undergraduate and another representing graduate programs,
  • two representatives from the Hamline School of Education (HSE) – if desired, one representing undergraduate and another representing graduate programs,
  • one representative from the School of Law,
  • two representatives from Student Affairs,
  • one representative from Bush Library,
  • one representative from the Center for Teaching and Learning (CTL),
  • one representative from Information Technology Services (ITS),
  • one representative from Institutional Research (IR), who co-chairs CLOA with the faculty Director of Assessment, and
  • the Associate Provost.

Committee members are chosen by their respective units, with approval from the Provost, and typically serve two-year terms.

The committee’s work in academic years 2013-14 and 2014-15 builds on significant accomplishments in past years. Beginning in 2007, the first phase of learning outcomes and assessment implementation created the foundation for program-level assessment, supporting individual programs as they created learning outcomes and began to build corresponding curriculum maps and measurement tools. CLOA continued this work in 2013-14, asking programs to move from assessment of individual learning outcomes to creation and implementation of an overarching Program Assessment Plan. Program assessment support (for individual academic programs as well as those in Student Affairs, Advising, etc.) will continue to be central to the work of the committee.

We have also moved into the second phase of implementation, building an assessment system for the “Hamline Plan” (or HP, Hamline’s version of a general education undergraduate program). The Hamline Plan recently went through a revision process that defined new learning outcomes for existing components, introduced one additional curriculum component (quantitative reasoning), and removed one component (computer-intensive, where responsibility for the associated learning outcomes now rests with individual academic programs). Select HP components were chosen for pilot assessment projects in AY 2014-15, including First Year Seminar (writing, research and discussion), Capstone Writing, Quantitative Reasoning and Formal Reasoning. CLOA will continue to work with the Undergraduate Curriculum Committee (UCC) to create and implement assessment plans for the remaining curriculum components. As this work proceeds, the next step will be to create structures that allow for examination of Hamline’s seven university-wide learning outcomes.

This report describes assessment committee priorities for the 2013-14 and 2014-15 academic years as well as the progress made in building learning outcomes culture and infrastructure at Hamline over that time period.

AY 2013-14 Goals:

In collaboration with CLOA members and academic leadership, the committee established the following goals for AY 2013-14:

Individual Program Assessment

1) Plan and pilot a new annual peer review process for program assessment.

2) Continue to support individual programs as they draft program assessment plans, through individual consultation and new professional development offerings.

3) Support Student Affairs as they move from individual unit learning outcomes to shared learning outcomes, and work with other academic support programs as they begin to plan for program assessment.

Hamline Plan Assessment

4) Develop Hamline Plan assessment plans, in concert with UCC, with special focus on pilot assessment plans for FYSem, LEAP (P), Formal Reasoning (R), and Quantitative Reasoning (M). Seek out professional development so that CLOA members can effectively support general education assessment planning and implementation at Hamline.

5) Support efforts to create the Quality Initiative proposal, needed for the HLC accreditation process.

Infrastructure

6) Continue to train faculty and staff on Blackboard Outcomes, with a goal of broad utilization in program and HP assessment.

7) Update the Learning Outcomes and Assessment website, providing more resources for faculty and staff.

AY 2013-14 Progress:

Individual Program Assessment - Support & Feedback

In AY 2013-14, CLOA launched a review process for program assessment, intended both to provide feedback to programs and to create dialogue and share knowledge across programs. Programs had submitted two documents to CLOA in the summer/fall of 2013: a Program Assessment Plan (see Program Assessment Plan Template), which outlined the long-run assessment plan for all program learning outcomes, and the AY12-13 Annual Program Assessment Report (see 2012-13 Annual Assessment Report Form), which described assessment activity that occurred over the past academic year. The review took the form of a peer review workshop, focused on the Program Assessment Plans that had been submitted. While not all programs had submitted a plan, all programs (both undergraduate and graduate, from all schools) were invited to attend.

Participants were placed in groups with programs from other units, though graduate programs were separated from undergraduate programs. Forty-seven faculty and staff attended the event and, of the 29 who provided event feedback, 45% indicated that the workshop was “very useful” and another 45% indicated that it was “somewhat useful.” While CLOA had expected this peer review to provide sufficient feedback to support programs in their assessment planning and implementation, we concluded that it was still important for all programs to receive feedback directly from CLOA. CLOA members therefore shared the responsibility of reviewing assessment documentation and summarizing progress and feedback for each program. Feedback was communicated to individual programs and to the respective unit’s dean.

In response to faculty and staff comments, our committee subsequently altered the submission deadlines, peer review structure, and review process, effective Fall 2014 (see Program Assessment Review Process). Going forward, annual peer review workshops will occur within units (administrative units within the CLA and corresponding units within other schools) and focus on the annual assessment report. CLOA will also begin a rotation, formally reviewing documentation and providing written feedback for one-third of all programs each year. This formal CLOA review will now occur after the program has received internal feedback through the unit’s peer review workshop. CLOA’s review report will also be sent to the program/department leader for comment and possible changes before it is officially communicated to the respective unit dean. CLOA will continue to review Program Assessment Plans on an as-needed basis, as new or revised plans are submitted. We believe that the new timeline will better support meaningful and collaborative program conversations, as programs will now be encouraged to reflect on the prior year’s assessment data and findings at the start of each new academic year.

As mentioned above, the Program Assessment Plan Peer Review Workshop was a university-wide event, aimed at providing feedback and increasing awareness and knowledge of assessment. CLOA also encouraged individual programs to meet with the most appropriate committee members to address questions and concerns, paying special attention to those programs that had not yet submitted a Program Assessment Plan. To further address the needs of this group, CLOA offered a Program Assessment Planning Workshop in concert with CTL. This event was targeted at programs that had not yet written a Program Assessment Plan, though some attendees came to review the basics of program assessment. Approximately 20 people attended, and feedback, though not formally captured, was extremely positive. The event was also recorded to serve as a future resource for faculty. This offering prompted additional individual meetings and allowed additional programs to submit Program Assessment Plans to CLOA. By the end of AY 2013-14, 65 percent of the 60 programs (largely in the academic arena) had submitted a Program Assessment Plan.

Programs in Student Affairs and other academic support areas are intended to follow the same process whenever possible: identifying learning outcomes, creating “maps” that connect these outcomes to program activities, developing measures, collecting data, closing the loop, and reporting annual activity. After reviewing the assessment plans for each unit of Student Affairs, connections across units were identified. The CLOA members from Student Affairs led a conversation to move toward common learning outcomes for the division, and such outcomes were drafted. Some initial work was done to establish possible measures and data that could be contributed by each unit. Certain units, such as the Wesley Center and the Career Development Center, decided to retain unique learning outcomes beyond the common division-wide outcomes. Next steps include developing a specific assessment plan for the division and implementing that plan. Beyond Student Affairs, Academic Advising and the HU Honors Program have started drafting learning outcomes and contemplating assessment approaches. Preliminary conversations began with Student Success, Financial Aid and other academic support units.

As programs set about the work of developing plans, rubrics and other assessment measures, CLOA recognized the need for clear communication of standards and expectations. CLOA members, in concert with the Provost’s office, crafted the “Hamline University Assessment: Plan and Standards” document (see HU Assessment Standards). This document sought to communicate the overarching purpose of HU assessment, outline expectations for faculty and staff, and express the standards needed for uniformity and aggregation. CLOA recognized that any single document is insufficient; continued communication and professional development offerings would be needed to support faculty and staff as they strive to understand and meet assessment expectations.

Hamline Plan Assessment

Progress was made on assessment planning for nearly every component of the Hamline Plan. The original goal was to develop plans for all HP components, with the work led by the Undergraduate Curriculum Committee (as leaders of the HP program) and supported by CLOA. UCC ultimately decided to stage the implementation of the newly revised HP curriculum across multiple years. This meant that only certain components (FYSem, Writing (W), Formal Reasoning (R), Quantitative Reasoning (M), and Liberal Education as Practice (P)) would be assessable in the upcoming year. Given the time pressure to complete this work, the most developed assessment plans were associated with those curriculum components. It was understood that further assessment planning would be needed for the remainder of the Hamline Plan in the upcoming year, including Cultural Diversity (D), Independent Critical Inquiry and Information Literacy (Q), Disciplinary Breadth (N, S, H and F) and Oral-Intensive (o) coursework. As First Year Seminar and Writing-Intensive courses were not significantly altered by the recent Hamline Plan revision, pilot assessment proceeded ahead of other curriculum areas.

First Year Seminar

FYSem assessment planning began in the Fall of 2013, led by the CLA Associate Dean (who also serves as the FYSem program director) with support from the Director of Assessment and the CTL Director. Over the Fall 2013 term, FYSem instructors participated in multiple facilitated group discussions in which each group considered a possible mode of assessment for critical thinking (common rubric, common assignment, and/or an external evaluation tool such as the CLA+). These instructors came to the collective decision to develop a common rubric, both to maintain flexibility in curriculum and pedagogy and to ensure that assessment would be authentically connected to Hamline’s FYSem approach. After recognizing the central role of critical thinking across the other learning outcomes, it was determined that the pilot project would instead embed critical thinking components within three individual rubrics measuring student discussion, research and writing. To pilot these rubrics, Fall 2014 FYSem faculty would be asked to adopt one or more of them and use Blackboard Outcomes to submit artifacts and rubric scores (unless an alternative entry process is required by individual circumstances).

Writing

A capstone writing assessment pilot project began in AY 2013-14, led by the Director of Writing Across the Curriculum and supported by the CTL Director. Faculty teaching writing-intensive capstone courses across all schools were asked to participate in this pilot assessment project. Participating capstone instructors (11 of 27 possible faculty, all from the CLA) supplied 124 student writing artifacts via Blackboard. Two assessment sessions were held in June and August of 2014, where faculty and staff from across the university used Blackboard Outcomes to rate randomly selected student work. Across the two sessions, 22 faculty and staff participated in the training and rating sessions. Training included an overview of purpose and best practices, followed by a norming workshop using sample artifacts. Each participant was then assigned multiple artifacts, and each artifact evaluated was rated by three individuals. Stipends and materials were supplied by CLOA and the Provost’s Office. A Fall 2014 session was planned to discuss results and next steps.

Assessment of first year undergraduate writing (based on student work in the required expository writing course) also continued under the leadership of the Director of First Year Writing. Assessment of incoming first year undergraduate students continued under the leadership of the Director of the Writing Center. While the latter assessment has been conducted for many years, AY 2013-14 was the first year it occurred within Blackboard Outcomes (our university’s assessment support system). Future steps should include consideration of ways to link these efforts (incoming, first year, and capstone) for maximum efficiency and value. As undergraduate writing assessment is the most developed of our university’s assessment efforts, much can be learned about how to process and act on data from multiple points in the developmental arc to most effectively improve teaching and learning.

Quantitative Reasoning, Formal Reasoning and LEAP

UCC/CLOA working groups developed assessment plans for each of these curriculum areas, and each planned for a pilot assessment project in AY 2014-15. LEAP assessment would be led by the LEAP program director, while R and M assessment would be coordinated by their respective UCC/CLOA working groups. Using different structures would help us identify the most appropriate approach for ongoing assessment of HP curriculum components.

All HP Components

To connect a larger group of faculty and staff with the planning process, CLOA worked with UCC to hold the first Hamline Plan Assessment Planning Workshop (during the afternoon of the January Professional Development Day). This event brought together faculty from various undergraduate programs to weigh in on the assessment plans being drafted for every component of the Hamline Plan. Similar events would potentially be needed in AY 2014-15 to finalize these plans and move to implementation.

Professional Development – General Education and Assessment

To improve institutional knowledge and capacity to support general education assessment, the Director of Assessment and one additional CLOA member (as well as the LEAP program director) attended the AAC&U General Education and Assessment conference. Attendees compiled a document of ideas, knowledge and next steps and, after returning to campus, met with other curriculum and assessment leaders to discuss it. We recognized that additional professional development would be beneficial, both for CLOA and UCC members, as we seek to enhance HP curriculum management systems.

Quality Initiative

In support of HLC review preparation, a subset of CLOA (the Director of Assessment, Director of CTL, Director of Institutional Research and Assessment, and Associate Provost) participated in discussions of the Quality Initiative (QI) project. As assessment is already a major improvement initiative at Hamline, the QI working group determined that it should be central to our proposal. The proposal will focus on “entry to exit” student experiences, with specific attention to student learning over time. Proposal writing and implementation are expected to continue into the next few academic years and will build on our ongoing work on FYSem and Capstone evaluation.