Academic Affairs: Assessment
July 2010

CENTRAL WASHINGTON UNIVERSITY

2009-2010 Assessment of Student Learning Report

Feedback for the Department of Interdisciplinary Studies

Degree Award: BS Interdisciplinary Studies: Social Sciences
Program: Undergraduate

1. What student learning outcomes were assessed this year, and why?

Guidelines for Assessing a Program’s Reporting of Student Learning Outcomes (Target = 2)
Program Score: 3

Value / Demonstrated Characteristics
4 / Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. All outcomes are linked to department, college, and university mission and goals.
3 / Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. Some outcomes are linked to department, college, and university mission and goals.
2 / Outcomes are written in clear, measurable terms and include knowledge, skills, or attitudes. Outcomes may be linked to department, college, and university mission and goals.
1 / Some outcomes may be written as general, broad, or abstract statements. Outcomes include knowledge, skills, or attitudes. Outcomes may be linked to department, college, and university mission and goals.
0 / Outcomes are not identified.

Comments: The review evaluated three student learning outcomes (skills and knowledge) as approved by the HEC Board in 2002. These outcomes were written in specific terms and were related to university goals. Student attitudes (four related to learning goals) were also assessed in this year's review, which is a marked improvement from last year. Good job!

2. How were they assessed?

  1. What methods were used?
  2. Who was assessed?
  3. When was it assessed?

Guidelines for Assessing a Program's Reporting of Assessment Methods (Target = 2)
Program Score: 2

Value / Demonstrated Characteristics
4 / A variety of methods, both direct and indirect, are used for assessing each outcome. Reporting of assessment methods includes population assessed, number assessed, and, when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
3 / Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and, when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
2 / Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and, when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
1 / Each outcome is assessed using a single method, which may be either direct or indirect. Some assessment methods may be described in terms of population assessed, number assessed, and, when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
0 / Assessment methods are nonexistent, not reported, or include grades, student/faculty ratios, program evaluations, or other "non-measures" of actual student performance or satisfaction.

Comments: The three skills/knowledge outcomes under investigation were measured through a specific assignment, the "Final Essay" within each portfolio (a direct measure), which was scored against a rubric created for these learning outcomes. The number of students assessed is identified and discussed, which is positive and provides some indication of student success on each outcome. The percentage of students meeting the various levels was also presented. The program incorporated earlier suggestions to include indirect methods (surveys) to determine student perceptions of goal attainment and program satisfaction. Again, this is positive! The program should be commended for making improvements from past reports! The only area that was not obvious was the standard of mastery: it was unclear what standard each outcome was measured against. Unfortunately, this lowered the overall score for this section, which does not reflect the improvements made from last year. This should be a relatively simple fix for next year and will substantially improve the score.

3. What was learned (assessment results)?

Guidelines for Assessing a Program’s Reporting of Assessment Results (Target = 2)
Program Score: 2

Value / Demonstrated Characteristics
4 / Results are presented in specific quantitative and/or qualitative terms. Results are explicitly linked to outcomes and compared to the established standard of mastery. Reporting of results includes interpretation and conclusions about the results.
3 / Results are presented in specific quantitative and/or qualitative terms and are explicitly linked to outcomes and compared to the established standard of mastery.
2 / Results are presented in specific quantitative and/or qualitative terms, although they may not all be explicitly linked to outcomes and compared to the established standard of mastery.
1 / Results are presented in general statements.
0 / Results are not reported.

Comments: The results were presented in specific quantitative terms (numbers and percentages) and were linked to program outcomes. The indirect measures were also incorporated into the results. The conclusion was appropriately brief, as almost all students met the learning outcomes. The score above reflects the lack of identified "standards of mastery"; this should be easy to address in next year's report.

4. What will the department or program do as a result of that information (feedback/program improvement)?

Guidelines for Assessing a Program’s Reporting of Planned Program Improvements (Target = 2)
Program Score: 2

Value / Demonstrated Characteristics
2 / Program improvement is related to pedagogical or curricular decisions described in specific terms congruent with assessment results. The department reports the results and changes to internal and/or external constituents.
1 / Program improvement is related to pedagogical or curricular decisions described only in global or ambiguous terms, or plans for improvement do not match assessment results. The department may report the results and changes to internal or external constituents.
NA / Program improvement is not indicated by assessment results.
0 / Program improvement is not addressed.

Comments: Program improvement was described in specific terms related to curriculum and pedagogy. The department is utilizing both direct and indirect measures for program improvement. There seems to be a continued emphasis on individual sessions with students to further develop their individual programs. The changes mentioned seem positive and consistent with improving the quality of the student experience and the program's standing.

5. How did the department or program make use of the feedback from last year's assessment?

Guidelines for Assessing a Program’s Reporting of Previous Feedback (Target = 2)
Program Score: 2

Value / Demonstrated Characteristics
2 / Discussion of feedback indicates that assessment results and feedback from previous assessment reports are being used for long-term curricular and pedagogical decisions.
1 / Discussion of feedback indicates that assessment results and feedback from previous assessment reports are acknowledged.
NA / This is a first-year report.
0 / There is no discussion of assessment results or feedback from previous assessment reports.

Comments: The program acknowledged the changes it has made to improve its assessment process, as well as its continued commitment to outcomes-based assessment. This year's report indicates a more developed feedback cycle among the department, the faculty, and the students, as students are achieving the stated outcomes. The incorporation of more courses will strengthen the identified outcomes.

A challenge for the program will be identifying areas for change and improvement even when results are positive. With added effort, the program's continuous improvement process and its focus on student learning will continue to strengthen.

Please feel free to contact either of us if you have any questions about your score or the comments supplied in this feedback report, or if any additional assistance is needed with regard to your assessment efforts.

Dr. Tracy Pellett & Dr. Ian Quitadamo
Academic Assessment Committee Co-chairs
