
Liberal Arts Core Assessment 2014-2015 Report

Fort Lewis College

Assessment Committee

Office of Assessment

Liberal Arts Core Assessment 2014-2015

Report prepared by:

Heidi Steltzer and Lisa Snyder

Assessment Committee Members (2014-2015)

Heidi Steltzer (Sciences, chair)

Erich McAlister (Sciences)

Shawn Meek (Arts and Humanities)

Michael Martin (Arts and Humanities)

Mika Kusar (Business)

Rebecca Austin (Social Sciences)

Kim Hannula (Associate Dean, ex-officio)

Lisa Snyder (Director of Assessment, ex-officio)

I. Background and Overview

In May 2014, the FLC Faculty Senate passed new Liberal Arts Core Learning Outcomes. These new learning outcomes, which were adapted from the Essential Learning Outcomes of the Association of American Colleges and Universities (AAC&U), paved the way for formal assessment of Fort Lewis College’s current general education program as well as the ongoing assessment of the new Liberal Arts Core, which remains in development.

In an effort to obtain guidance from experts in the fields of general education and assessment, a trans-disciplinary group attended AAC&U’s Institute on General Education Reform & Assessment (IGEA) in Burlington, Vermont. The results of the IGEA were an action plan for the reform of FLC’s General Education Program (now called the Liberal Arts Core) and an assessment plan with accompanying procedures. The structure of the 2014-2015 Liberal Arts Core assessment plan is based on consultations provided at AAC&U’s IGEA. This plan called for scoring to take place at the end of both the Fall 2014 and Spring 2015 terms, giving us two rounds of assessment data prior to our HLC accreditation visit in October 2015. Fall 2014 assessment was completed in April 2015, and Spring 2015 assignments for assessment have been collected. The second round of assessment will be completed in Fall 2015.

For both rounds of assessment, a scoring committee made up of faculty members of the LAC Council, the Assessment Committee, and faculty volunteers collected cross-disciplinary samples of students’ written work in their senior seminar / capstone courses. Scoring is based on the AAC&U’s critical thinking and problem solving rubrics. This approach was suggested for several reasons. First, there is no cross-disciplinary course from which to take work samples from students all at the same point in their academic careers. Second, critical thinking is included as an outcome for every program on campus; therefore, seniors should demonstrate this skill in their senior-level work. Finally, this approach would allow the Assessment Committee to work on the assessment process while the Liberal Arts Core Council focused on the reform of the Liberal Arts Core program.

The experience gained from assessment of Fall 2014 work samples has greatly improved the process for Liberal Arts Core assessment at Fort Lewis College, including the approaches used to collect samples and faculty familiarity with the process of submitting and scoring student work. The following report identifies the protocols created for the collection and de-identification of student work samples, the findings from scoring Fall 2014 work samples, and modifications to the assessment process.

II. Process and Protocols for the Collection and De-identification of Student Work Samples

Objective for 2014-2015 Academic Year: Assess critical thinking and problem solving for liberal arts core

Summary of method for liberal arts core assessment at Fort Lewis College

• Faculty teaching capstone senior courses submit writing samples for each student/group within their course at the end of the term.

• Writing samples are scored for critical thinking and problem solving using the AAC&U LEAP rubrics.

• Writing samples are selected through a stratified random sampling approach.

• Stratification is by area (Science, Social Science, Arts and Humanities, Business, Teacher Education) and, within areas, by faculty member to reduce the chance that all selected writing comes from a single instructor.

• Approximately 20% of the work samples are scored by faculty volunteers, including individuals from the assessment committee.

• Data will only be analyzed across areas and faculty.

Detailed method for liberal arts core assessment at Fort Lewis College:

Score writing assignments from the final capstone course (senior thesis) across all disciplines using the AAC&U LEAP rubrics for critical thinking and problem solving. Tasks are listed below by number, and the corresponding number is included in the table below to identify by whom and when each task needs to be completed.

1) Instructors were asked to select a minimum of two criteria for critical thinking and one criterion for problem solving by which their students’ assignments would be scored. All criteria could be selected if appropriate. Written assignments and cover sheets are due at the end of the term.

2) A protocol for de-identification was developed and will be used to ensure consistency and thoroughness across written assignments (see page 6).

  • Assignments and corresponding cover sheets will be coded to ensure cover sheet information (one per instructor) can be matched to assignments (more than one per instructor).
  • Two spreadsheets have been developed so that the process is double-blind. One spreadsheet is for Institutional Research, the Director of Assessment, and the Director of e-learning; it contains identifying information for students, departments, and faculty. The second spreadsheet, for the AC chair, does not contain this information and will be used to select writing assignments for scoring.

3) Data on submissions will be compiled and provided to the Chair of the Assessment Committee using an MS Excel spreadsheet developed for this purpose (template provided by the AC chair). This spreadsheet will be used to select writing assignments for scoring.

4) Data on submissions will be compiled and provided to the Director of Assessment and the Director of e-learning using an MS Excel spreadsheet developed for this purpose (template provided by the AC chair). This spreadsheet will be used, if needed, to provide departments with data or resources.

5) A stratified, random sampling design will be used to select theses for scoring.

  • Stratification will be by area (i.e., Science, Arts and Humanities, Social Science, Business, Teacher Education) and, if needed, by whether written assignments were completed as a group or by individuals. Stratification is needed because submissions are unbalanced across areas, yet all areas should be represented. Area will not be used as a variable by which data are analyzed.
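The selection step above can be sketched in code. This is a minimal illustration only, with hypothetical submission records and a made-up helper name (`stratified_sample`); the actual selection was performed by the AC chair from the de-identified submission spreadsheet:

```python
import random

# Hypothetical (code, area) pairs standing in for the de-identified
# spreadsheet rows provided to the AC chair.
submissions = [
    ("F1401.001", "Science"), ("F1401.002", "Science"),
    ("F1402.001", "Arts and Humanities"), ("F1402.002", "Arts and Humanities"),
    ("F1403.001", "Social Science"), ("F1403.002", "Social Science"),
    ("F1404.001", "Business"), ("F1405.001", "Teacher Education"),
]

def stratified_sample(records, per_area, seed=0):
    """Group records by area, then draw up to `per_area` records at
    random from each stratum so every area is represented."""
    rng = random.Random(seed)
    by_area = {}
    for code, area in records:
        by_area.setdefault(area, []).append(code)
    selected = []
    for area, codes in sorted(by_area.items()):
        selected.extend(rng.sample(codes, min(per_area, len(codes))))
    return selected

sample = stratified_sample(submissions, per_area=1)
```

Because sampling happens within each area separately, an area with many submissions cannot crowd out an area with few, which is the stated reason for stratifying.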

6) Assignment instructions or the grading rubric will be added to each written assignment selected for scoring by copying the text into the written assignment. The instructions or grading rubric will precede the written assignment.

7) Written assignments selected for scoring will be de-identified using the protocol.

8) Notes will be taken on additional steps needed and steps not needed, in order to improve the protocol. The notes will be used to revise the protocol. The revised protocol and de-identified assignments will be made available to the Director of Assessment and the Director of e-learning.

9) Volunteer scorers will be recruited from across areas. A system to track service has been developed by the AC.

10) Assignments will be assigned to scorers by area to the extent possible and uploaded to Canvas.

11) Scorers will be trained through a rubric norming session.

12) Scorers will have one month to complete scoring through Canvas.

13) Scores will be analyzed to assess critical thinking and problem solving for the liberal arts core across all areas. A report will be completed discussing results and process, e.g., what worked, what did not work, and steps for improving the process.

Phase 1: Fall senior seminar / capstone courses

Begin Oct 1, 2014; complete by Mar 20, 2015 (actual date of completion: April 2015).

A total of 50 theses will be selected and scored, each scored twice.

Ten volunteer scorers were recruited, of whom nine completed their scoring.

Service for assessment of the liberal arts core was recorded for those who completed scoring.

Who was responsible and when tasks were completed:

Task / Whose task / Needed by / Send to:
1) Submission of assignments and cover sheets / Coordinated by AC chair / Dec 23, 2014 / Office of Assessment
2) Develop de-identification protocol / Director of Assessment / Jan 27, 2015 / AC chair
3) Data on submissions compiled for AC / Office of Assessment / Jan 29, 2015* / AC chair
4) Data on submissions compiled for Director of Assessment and Director of e-learning / Office of Assessment / Jan 29, 2015 / Director of Assessment and Director of e-learning
5) Selection of assignments for scoring / AC chair / Jan 30, 2015* / Office of Assessment
6) Add instructions to assignments selected for scoring / Director of e-learning / Feb 6, 2015 / Post on Canvas
7) Use de-identification protocol to de-identify written assignments / Office of Assessment / Feb 6, 2015
8) Record notes and revise protocol for de-identification; provide revised protocol and assignments to Director of Assessment and Director of e-learning / Office of Assessment / Feb 6, 2015 / Director of Assessment and Director of e-learning
9) Recruit scorers / Coordinated by AC chair / Feb 6, 2015 / Director of Assessment and Director of e-learning
10) Assign assignments to scorers and upload to Canvas / Director of e-learning / Feb 13, 2015 / Director of Assessment and AC chair
11) Train scorers (rubric norming session) / Director of Assessment and AC chair / Feb 13, 2015 / AC chair and Director of e-learning
12) Scoring / Faculty scorers / Mar 13, 2015 / Enter on Canvas
13) Analyze data and complete report / AC chair and Director of Assessment / Mar 20, 2015 / Director of Assessment

*Task 5 requires that task 3 be completed on time. Also note that the spreadsheet for the AC chair does not include any identifying information for students or faculty.

Phase 2: Spring senior seminar / capstone courses

Begin Apr 1, 2015; complete by September 2015.

Phase 2 will be designed to improve on Phase 1, including the approach for assignment submission and tracking, the selection of rubric components for scoring, and the approach to scoring.

The Faculty Senate approved a procedure for involving faculty in scoring, which will be implemented in August for fall scoring (see the procedure for faculty service for liberal arts core assessment).

Due to the complexities involved in soliciting, collecting, organizing, coding, and uploading work samples to Canvas, all involved agreed that it would be more efficient to centralize the process within the Office of Assessment. The methods listed in this report include this revision to the process for assessment of the liberal arts core.

Protocol for collecting and de-identifying work samples

1) Faculty whose courses are selected to provide student work samples will be notified at the beginning of each semester. Notification will include corresponding rubrics and instructions.

2) At the end of the term, faculty will submit student work samples, the assignment instructions, the cover sheet, and the rubric components electronically, in the form of a zipped file, to the Assessment Coordinator, Office of Assessment.

3) The Assessment Coordinator will pool assignments across all courses/instructors; summary data on assignments will be compiled and provided to the Chair of the Assessment Committee using an MS Excel spreadsheet (template provided by the AC chair).

4) Selecting from the spreadsheet identified in #3, the Chair of the Assessment Committee will select a stratified, random sample to be used for scoring. Stratification will be by area (i.e., Science, Arts and Humanities, Social Science, Business, Teacher Education) and, if needed, by whether work samples were completed as a group or by individuals. Stratification is needed because submissions are unbalanced across disciplines, yet all disciplines should be represented. Discipline will not be used as a variable by which data are analyzed. The AC chair will give the sample list back to the Assessment Coordinator for de-identification of the samples.

5) The Assessment Coordinator will de-identify all work samples. This will include deleting the names of both students and faculty from the work samples, deleting acknowledgement sections, etc. Citations will not be changed, although they could identify faculty who mentored students; this will be less of an issue when work samples are not drawn from senior seminar / capstone courses. Each work sample with its corresponding cover sheet and assignment will be coded with corresponding, non-identifying codes (i.e., no faculty initials will be used) and saved as a file named with the code. For example, file F1401.001 will include the work sample, the cover sheet, and the assignment for a Fall 2014 work sample for faculty 01 and student 001. A second spreadsheet will be created that includes student names, discipline, and corresponding codes and will be provided to the Director of eLearning so that she may set up the files in Canvas. The addition of this spreadsheet also serves to double-blind the student work samples.
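The coding scheme in step 5 can be sketched as follows. This is an illustrative reading of the F1401.001 example only; the helper names (`make_code`, `parse_code`) are hypothetical and not part of the protocol:

```python
# Sketch of the non-identifying coding scheme: term letter + two-digit
# year + two-digit faculty number, then a three-digit student number.
# Faculty and student numbers here are hypothetical placeholders.

def make_code(term, year, faculty_num, student_num):
    """Build a code like F1401.001 from its parts."""
    return f"{term}{year % 100:02d}{faculty_num:02d}.{student_num:03d}"

def parse_code(code):
    """Split a code back into its term, year, faculty, and student parts."""
    prefix, student = code.split(".")
    return {
        "term": prefix[0],
        "year": 2000 + int(prefix[1:3]),
        "faculty": int(prefix[3:5]),
        "student": int(student),
    }

code = make_code("F", 2014, 1, 1)  # the example code from the protocol
```

Because the code carries no initials or names, only the holders of the second spreadsheet can map a file back to a student or instructor, which is what makes the process double-blind.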

6) When scoring is complete, the Director of e-learning will provide results from Canvas (all scoring will be done on Canvas) for each rubric component to the Director of Assessment and the AC chair.

7) The Chair of the Assessment Committee will provide a report of findings to the Assessment Committee and the Office of Assessment. The Assessment Committee will make recommendations for disseminating findings and acting on assessment results.

8) The Chair of the Assessment Committee and the Director of Assessment will collaborate on the assessment report for the academic year.

Faculty Volunteer Scorers Fall 2014

Michael Martin (Arts and Humanities)*

Michele Malach (Arts and Humanities)

Gordon Cheesewright (Arts and Humanities)*

Rebecca Austin (Social Sciences)*

Brad Clark (Social Sciences)

Cathy Hartney (Honors Program)

Gary Gianniny (Sciences)

Melissa Knight-Maloney (Sciences)

Erich McAlister (Sciences)*

Paul Clay (Business)

*Individuals also serving on the assessment committee or the liberal arts core council

III. Submissions, Findings, and Modifications Fall 2014 Liberal Arts Core Assessment

Submissions Tracking and Improvements

For Fall 2014 liberal arts core assessment, work samples were collected from 14 courses. All faculty from whom the assessment committee requested work samples submitted them, but through the process of requesting work samples it became clear that a system was needed to track submissions. A submission tracking system would enable us to determine which faculty had not yet submitted work samples, identify issues that arose in collecting the work samples and the reasons for these issues, and count the total number of submissions and the students represented by these submissions (some work samples were group assignments).

Area Participation for Fall 2014 Work Samples

The number of writing samples is affected by 1) the number of students/courses taught, 2) whether assignments are completed individually or as groups, and 3) whether a suitable assignment is included in the capstone course, i.e., written work in English from which critical thinking and problem solving can be assessed.

The primary issues that arose in collecting Fall work samples were identifying which faculty needed a reminder to submit work samples and determining whether a faculty member's submissions represented all student work in a course. To improve the submission process for Spring 2015, we developed spreadsheets shared by the Assessment Coordinator and the chair of the Assessment Committee. The spreadsheets made it possible to review submissions weekly and send reminders to faculty who still needed to submit work samples. Many faculty who did not submit work samples on time were unsure whether they needed to submit them. Familiarity with the assessment process we are designing will resolve this issue.

The issue of incomplete submissions was corrected by requiring on the cover sheet (see appendices) that faculty list the total number of students in the course, the number of work samples they are submitting, and an explanation for any difference. For example, faculty might have 14 students in a course but submit only 10 work samples if 2 students did not complete the assignment and some worked in a group. Alternatively, work sample submissions may be incomplete because an email may not have gone out. For liberal arts core assessment, faculty submitted work samples through email, but it is our intention to shift this process to a different system such as Taskstream or Canvas.

Area Participation for Spring 2015 Work Samples

Our improved system for work sample submissions allowed us to track the total number of writing samples and the number of students they represent (some students worked in groups). More work samples were submitted in spring because more programs teach an appropriate capstone course in spring terms. We will also increase the number of selected samples given the greater number of submissions.

In spring 2015, we also created a system to track the reasons that work samples were not submitted from senior seminar / capstone courses, in order to improve the process for requesting and receiving submissions. While nearly all requested work samples were submitted, we can improve by planning for scoring in Spanish, through greater faculty familiarity with the process, and through earlier notification by the Assessment Committee that work samples are needed in an electronic format.

Findings and Modifications for Liberal Arts Core Assessment

In our first round of assessment of the liberal arts core, the assessment committee chose to have faculty select two components of critical thinking (of 5 in the LEAP rubric) and one component of problem solving (of 6 in the LEAP rubric). This approach was chosen because assignments were not intentionally designed for these rubrics, and some components may not be well matched to an assignment. Through our rubric norming session for the faculty volunteer scorers, we learned that this approach had some limitations as well. Many faculty who provided their selection of components were either unfamiliar with the rubric components or found that the written explanation of a component was not sufficient to determine whether it was appropriate for an assignment.

During the rubric norming session, the 10 scorers became more familiar with the rubrics and identified approaches to using the rubrics for scoring assignments in their areas. For example, science faculty scorers created a system to match components of critical thinking and problem solving to common sections of scientific papers, which was the format for most submissions in the sciences. We agreed that all rubric components, not just those selected by the instructor, should be scored and that a 0 would be entered for any component for which there was no evidence. No evidence was often a result of the assignment design, or of student understanding of the assignment, not corresponding with a component of the rubric. For Spring 2015 liberal arts core assessment (to be completed Fall 2015), we will again use all components of the rubrics for Critical Thinking and Problem Solving and have not asked faculty to select components. For data analysis, we did not include scores of 0, as these do not reflect an assessment of student work.
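The zero-exclusion rule for analysis can be sketched as follows. The scores are hypothetical and the helper name (`mean_excluding_zeros`) is illustrative, not part of the committee's actual analysis tooling:

```python
# A 0 marks "no evidence" for a rubric component, so it is dropped
# before averaging rather than being treated as the lowest performance
# level, which would unfairly deflate the mean.

def mean_excluding_zeros(scores):
    """Average rubric scores, ignoring 0 ("no evidence") entries;
    return None if no component was scorable."""
    scored = [s for s in scores if s != 0]
    return sum(scored) / len(scored) if scored else None

component_scores = [3, 0, 2, 4, 0]  # hypothetical scores for one work sample
avg = mean_excluding_zeros(component_scores)
```

Treating 0 as missing data rather than as a score keeps the analysis focused on components the assignment actually elicited, consistent with the committee's decision above.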