Academic Affairs: Assessment August, 2010

Central Washington University

2009-2010 Assessment of Student Learning Report

Feedback for the Department of History

Degree Award: BA History; History Teaching; & MA History Program: History

1. What student learning outcomes were assessed this year, and why?

Guidelines for Assessing a Program’s Reporting of Student Learning Outcomes (Target = 2)
Program Score / Value / Demonstrated Characteristics
2 / 4 / Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. All outcomes are linked to department, college and university mission and goals.
3 / Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. Some outcomes are linked to department, college and university mission and goals.
2 / Outcomes are written in clear, measurable terms and include knowledge, skills, or attitudes. Outcomes may be linked to department, college and university mission and goals.
1 / Some outcomes may be written as general, broad, or abstract statements. Outcomes include knowledge, skills, or attitudes. Outcomes may be linked to department, college and university mission and goals.
0 / Outcomes are not identified.

Comments:

It was assumed that the department evaluated 4 student learning outcomes at the undergraduate level and 2 at the graduate level for the History programs (based on the assessment plans on file). The specific goals assessed should be spelled out in future assessment reports so that it is clear what is being analyzed. This same feedback has been provided twice. An improvement in this area this coming year would be helpful, as the yearly reports are used for more comprehensive HECB and NWCCU institutional reporting.

The review notes that the number of key student learning outcomes in the capstone courses was not included.

Previous reviews indicate, and the current one concurs, that the program is encouraged to meet with Dr. Quitadamo, Dr. Pellett, or Dr. Henderson to review and revise the program assessment plans and assist in reporting.

2. How were they assessed?

  1. What methods were used?
  2. Who was assessed?
  3. When were they assessed?

Guidelines for Assessing a Program's Reporting of Assessment Methods (Target = 2)
Program Score / Value / Demonstrated Characteristics
1 / 4 / A variety of methods, both direct and indirect, are used for assessing each outcome. Reporting of assessment method includes population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
3 / Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
2 / Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
1 / Each outcome is assessed using a single method, which may be either direct or indirect. Some assessment methods may be described in terms of population assessed, number assessed, and when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
0 / Assessment methods are nonexistent, not reported, or include grades, student/faculty ratios, program evaluations, or other “non-measures” of actual student performance or satisfaction.

Comments:

The review notes the use of direct measures (e.g., research paper rubric and thesis rubric) and indirect measures (survey) to measure student learning and growth at the undergraduate and graduate levels.

Previous reviews note, and the current one concurs, that the number of students assessed was not reported at all levels.

Again, as in previous reviews, the department is encouraged this next year to make the standard in the plan clearer, and possibly to consider a standard that includes the percentage of students expected to exceed expectations as well as the percentage expected to meet expectations.

3. What was learned (assessment results)?

Guidelines for Assessing a Program’s Reporting of Assessment Results (Target = 2)
Program Score / Value / Demonstrated Characteristics
1 / 4 / Results are presented in specific quantitative and/or qualitative terms. Results are explicitly linked to outcomes and compared to the established standard of mastery. Reporting of results includes interpretation and conclusions about the results.
3 / Results are presented in specific quantitative and/or qualitative terms and are explicitly linked to outcomes and compared to the established standard of mastery.
2 / Results are presented in specific quantitative and/or qualitative terms, although they may not all be explicitly linked to outcomes and compared to the established standard of mastery.
1 / Results are presented in general statements.
0 / Results are not reported.

Comments:

Previous reviews note, and the current one concurs, that the department references attached data sheets, but they were not included with these reviews. Future reports should include specific quantitative and/or qualitative data that has been summarized (preferably in a table or some other organized fashion), linked to outcomes, and compared to established standards of mastery or proficiency. Such data provides important context and makes the process clearer and more usable. Results were only presented in general terms. The department provided some interpretation and conclusions regarding the findings, which was positive. However, without any results to compare against the summary statements, it was hard to make much of a connection.

4. What will the department or program do as a result of that information (feedback/program improvement)?

Guidelines for Assessing a Program’s Reporting of Planned Program Improvements (Target = 2)
Program Score / Value / Demonstrated Characteristics
0 / 2 / Program improvement is related to pedagogical or curricular decisions described in specific terms congruent with assessment results. The department reports the results and changes to internal and/or external constituents.
1 / Program improvement is related to pedagogical or curricular decisions described only in global or ambiguous terms, or plans for improvement do not match assessment results. The department may report the results and changes to internal or external constituents.
NA / Program improvement is not indicated by assessment results.
0 / Program improvement is not addressed.

Comments:

As previous reviews note, and the current one concurs, there is no relationship between assessment results and program improvement. The replacement of faculty lines, though important, is not linked to any student learning outcome.

Previous reviews and program reports note planned faculty discussion of survey results, yet no record of those discussions is provided. Again, the department is encouraged to document its actions related to student learning in the yearly assessment report.

5. How did the department or program make use of the feedback from last year's assessment?

Guidelines for Assessing a Program’s Reporting of Previous Feedback (Target = 2)
Program Score / Value / Demonstrated Characteristics
2 / 2 / Discussion of feedback indicates that assessment results and feedback from previous assessment reports are being used for long-term curricular and pedagogical decisions.
1 / Discussion of feedback indicates that assessment results and feedback from previous assessment reports are acknowledged.
NA / This is a first year report.
0 / There is no discussion of assessment results or feedback from previous assessment reports.

Comments:

There was discussion of planned pedagogical changes (i.e., increased course and program credit) as well as increased requirements (prospectus creation). This is positive and should be continued in the future!

The department should be commended for having made some positive changes to its assessment regimen. However, there are areas where improvement and change are still needed. These changes should provide additional and useful information for the department in improving student learning. The department is encouraged to meet with Dr. Quitadamo, Dr. Henderson, or Dr. Pellett to discuss its assessment plan and to get help with writing next year's report. This meeting might help streamline some of the department's effort and improve the clarity of future reports.

Please feel free to contact either of us if you have any questions about your score or comments supplied in this feedback report, or if any additional assistance is needed with regard to your assessment efforts.

Dr. Tracy Pellett & Dr. Ian Quitadamo, Academic Assessment Committee Co-chairs