2018 Annual Assessment Report Planning Sheet for
Undergraduate Degree Programs (Optional)
NOT FOR SUBMISSION – Reports are collected online using Qualtrics due June 1, 2018
The 2018 Annual Assessment Reports for Undergraduate Degree Programs will be collected online using Qualtrics, due June 1, 2018. This document contains a copy of the questions from the 2018 Annual Assessment Report for Undergraduate Degree Programs in Qualtrics and is intended to be used as an optional planning sheet to help assessment coordinators and/or program leadership develop their reports prior to submission. The Office of Assessment of Teaching and Learning (ATL) is available to answer questions and help programs complete this planning sheet in advance of the June 1st deadline.
This document is not to be used for submission; please copy and paste your responses from this planning sheet into the Qualtrics report form. Each degree program’s assessment coordinator will receive an email from ATL with a unique link to their Qualtrics report form; if you misplace your link or have questions, please contact ATL () for assistance. You can start and stop working on your report in Qualtrics at any time using this unique link; the report form in Qualtrics will automatically save your work as you go and allow you to pick up where you left off from any computer or with any internet browser. Upon submission of the report in Qualtrics, the program chair/director will receive an email requesting that they review and approve the Qualtrics report. (Note: This will not apply in cases where the program chair/director is the person submitting the Qualtrics report.)
Please note that since the report is collected online using Qualtrics, questions may appear differently in the online format (however, the wording and order of questions are as follows). Some questions are intended to collect further information on a topic and only appear as applicable. Additionally, questions marked with asterisks (**) apply only to programs with degrees offered on multiple campuses or fully online. Please contact ATL if you have questions. The hyperlinks included throughout this planning sheet can be used to view a glossary with definitions for key assessment terms.
In August, ATL will generate and provide each program with a PDF copy of the report submitted in Qualtrics, for their archive.
Targets for Meaningful Assessment. WSU aims to have substantially all (≥ 90%) programs reporting that assessment elements and other indicators of quality assessment are in place. The university’s overarching goal is for assessment to be meaningful and useful to faculty and students. WSU’s approach encourages deeper involvement in assessment and increases in quality over time as programs make improvements to meet evolving assessment needs.
Annual Assessment Report for Undergraduate Degree Programs
Period: June 2017-May 2018
Scope and Audience for this Report: This report provides a summary of the academic assessment activities conducted by each undergraduate degree program and does not include all details or data. Unless otherwise noted, this report includes only activities occurring June 2017 - May 2018. Please provide clear and complete information -- with sufficient description for people outside your department and discipline, who are not familiar with your assessment processes. ATL compiles portions of these annual program assessment reports, and collects examples, to share with your college and the institution. Degree program assessment reports also help fulfill requirements to maintain WSU's regional accreditation under the Northwest Commission on Colleges and Universities.
Undergraduate Degree Title: ______
Which campuses/locations offered this undergraduate degree in the past year (AY 2017-18)? (select all that apply)
Note: Please select all campuses/locations where the undergraduate degree was offered.
□ Pullman / □ Spokane / □ Tri-Cities / □ Vancouver / □ Everett / □ Bremerton
□ Global (Online) / □ Other (please specify) ______
Optional comments:
**Which of these campuses/locations are included in the scope of this report? (select all that apply)
□ Pullman / □ Spokane / □ Tri-Cities / □ Vancouver / □ Everett / □ Bremerton
□ Global (Online) / □ Other (please specify) ______
**Optional comments:
Department Chair/School Director/Other Program Leadership Name(s): ______
Assessment Coordinator Name: ______
Report Prepared by Name(s): ______
Contact information for the person who should approve this report (i.e. Chair/Director/Other Program Leadership):
Name: ______
Phone: ______
Email: ______
Contact information for questions about this report: (Note: This person should be available to answer questions for at least 10 days after the report is approved by program leadership in Qualtrics.)
Name: ______
Phone: ______
Email: ______
**Name of Contact for Assessment by Campus/Location:
**Pullman: ______ / **Spokane: ______ / **Tri-Cities: ______ / **Vancouver: ______
**Everett: ______ / **Bremerton: ______
**Other: ______
Section 1. Quality Measures on Key Assessment Elements
Quality assessment follows an intentional and reflective process of design, implementation, evaluation, and revision. The Assessment Cycle (see graphic below) begins with student learning outcomes (SLOs) and questions about student learning in the curriculum. After reviewing the program’s SLOs and a curriculum map indicating where particular SLOs are highlighted, faculty select assessment measures to gather evidence of student learning. The evidence is analyzed, discussed by the faculty, and used to inform program decisions, including those about instruction, the curriculum, the assessment, and dialog about teaching and learning.
1.A. Program-level Student Learning Outcomes
Did this program have program-level student learning outcomes in place during the past year?
Yes, our program had program-level SLOs in place
Yes, but they were under revision or need revision
No, our program did not have program-level SLOs in place
Were your program-level student learning outcomes available on your department or program’s website?
Yes / No
Have your program-level student learning outcomes been approved (formally or informally) within the past three years by the majority of faculty who teach? (e.g. as part of a retreat or regular faculty meeting)
Yes / No
**Briefly describe how faculty who teach on all campuses/locations that offer the degree are included in approving (formally or informally) program SLOs. (e.g. as part of a regular meeting that includes faculty from all campuses)
Comments:
Please attach your program-level student learning outcomes:
1.B. Curriculum Map
Did this program have a curriculum map in place during the past year?
Yes, our program had a curriculum map in place
Yes, but it was under revision or needs revision
No, our program did not have a curriculum map in place
Has your curriculum map been approved (formally or informally) within the past three years by the majority of faculty who teach? (e.g. as part of a retreat or regular faculty meeting)
Yes / No
**Briefly describe how faculty who teach on all campuses/locations that offer the degree are included in approving (formally or informally) your curriculum map. (e.g. as part of a regular meeting that includes faculty from all campuses)
Comments:
Please attach your curriculum map:
1.C. Assessment Plan
Did this program have a current assessment plan in place during the past year?
Yes, our program had a current assessment plan in place
No, our program did not have a current assessment plan in place
Was your program’s assessment plan updated in the past year?
Yes / No
Did your assessment plan include a timeline that identified when measures were collected, analyzed, and shared?
Yes / Yes, but only for some measures / No
Comments:
Please attach your assessment plan:
Note: Please make sure that your assessment plan indicates the academic year(s) it covers.
1.D. Direct Measures
Did you collect a direct measure of student learning during the past year for use in program-level assessment?
Yes, our program collected one or more direct measures
Yes, but the only direct measure collected was a pilot or collected for the first time
No, our program did not collect a direct measure
Which type(s) of direct measure(s) were collected during the past year? (select all that apply)
Course-embedded assignment (e.g. project, paper, presentation, poster, portfolio, performance, thesis, or exhibition evaluation) / Course-embedded exam / Internship supervisor, preceptor, or employer evaluation of student skills and knowledge
National exam (e.g. certification or other standardized test)
Other direct measure (please specify) ______
Comments:
Did you collect a senior-level direct measure of student learning during the past year for use in program-level assessment? Note: a measure does not need to apply exclusively to seniors (i.e. it may also provide data about students at other academic levels).
Yes, our program collected one or more senior-level direct measures
Yes, but the only senior-level direct measure collected was a pilot or collected for the first time
No, our program did not collect a senior-level direct measure
Which type(s) of senior-level direct measure(s) were collected during the past year? (select all that apply)
Course-embedded assignment (e.g. project, paper, presentation, poster, portfolio, performance, thesis, or exhibition evaluation) / Course-embedded exam / Internship supervisor, preceptor, or employer evaluation of student skills and knowledge
National exam (e.g. certification or other standardized test)
Other direct measure (please specify) ______
Comments:
For ONE of these senior-level direct measures collected in the past year, please provide a description of the measure, including how the measure is collected and evaluated, and the student learning outcome(s) assessed. [see examples]
Select the statements below that best exemplify the senior-level direct measure described above.
We have collected it 3 or more times / We have collected it 1-2 times / It is still in the pilot phase
It includes adequate representation of seniors / We have not yet gotten representative results
**It includes seniors on all campuses/locations that offer the degree / **It does not include seniors on all campuses/locations that offer the degree
Results from the measure are shared with faculty / Results are not yet shared with faculty
We have results from this measure in our assessment archive, together with any related rubric or tool / Results from this measure are not yet in our assessment archive
We may make adjustments to collection or analysis to improve this measure / We are not planning to make adjustments to this measure
Please describe how this measure provides (or does not yet provide) adequate representation of students (i.e. how many seniors were assessed and why you took this approach).
1.E. Indirect Measures
Did you collect an indirect measure of student learning or experience during the past year for use in program-level assessment?
Yes, our program collected one or more indirect measures
Yes, but the only measure collected was a pilot or collected for the first time
No, our program did not collect an indirect measure
Which type(s) of indirect measure(s) were collected during the past year? (select all that apply)
Student Perspectives & Experience:
Focus groups
Interviews (e.g. exit or other)
Student review of portfolio or project / Survey, alumni
Survey, student (e.g. NSSE, exit or other)
Other (please specify) ______
Professional Perspectives & Input:
Advisory board (providing input on program)
Faculty review of curriculum, SLOs, syllabi, or assignment prompts
Feedback from external accreditors / Internship supervisor, preceptor, or employer feedback on student activities, motivation or behavior
Survey, employer (providing professional input on program)
Other (please specify) ______
Indicators of Progress, Success, Retention, etc:
Grades
Internal data (e.g. student demographics, retention) / Participation rates (research, internship, service learning, study abroad, etc.)
Other (please specify) ______
Comments:
Did you collect a senior-level indirect measure of student learning or experience during the past year for use in program-level assessment? Note: a measure does not need to apply exclusively to seniors (i.e. it may also provide data about students at other academic levels).
Yes, our program collected one or more senior-level indirect measures
Yes, but the only senior-level indirect measure collected was a pilot or collected for the first time
No, our program did not collect a senior-level indirect measure
Which type(s) of senior-level indirect measure(s) were collected during the past year? (select all that apply)
Student Perspectives & Experience:
Focus groups
Interviews (e.g. exit or other)
Student review of portfolio or project / Survey, alumni
Survey, student (e.g. NSSE, exit or other)
Other (please specify) ______
Professional Perspectives & Input:
Advisory board (providing input on program)
Faculty review of curriculum, SLOs, syllabi, or assignment prompts
Feedback from external accreditors / Internship supervisor, preceptor, or employer feedback on student activities, motivation or behavior
Survey, employer (providing professional input on program)
Other (please specify) ______
Indicators of Progress, Success, Retention, etc:
Grades
Internal data (e.g. student demographics, retention) / Participation rates (research, internship, service learning, study abroad, etc.)
Other (please specify) ______
Comments:
Direct and Indirect Measures, cont.
**Please indicate which senior-level direct and indirect measure(s) were collected in the past year for each campus/location that offered the degree. (select all that apply)
Note: the appropriate multiple choice response options and campuses will be automatically populated in Qualtrics.
**Comments:
Please describe how your assessment included online courses and students.
Have direct and indirect measures been approved (formally or informally) within the past three years by the majority of faculty who teach? (e.g. as part of a retreat or regular faculty meeting)
Yes, all measures approved by majority of faculty
Yes, some measures approved by majority of faculty
No, measures not yet approved by majority of faculty
**Briefly describe how faculty who teach on all campuses/locations that offer the degree are included in approving (formally or informally) your measures. (e.g. as part of a regular meeting that includes faculty from all campuses)
Comments:
Section 2. Use of Assessment Results to Inform Program Decisions
2.A. Focus on Use of Results from Student Learning Outcomes-Aligned Assessment
In successful assessment cycles (see page 2), degree programs use assessment results to inform decision-making to support student learning. In assessment cycles aligned with program-level student learning outcomes (SLOs), a degree program begins with one or more program-level SLOs, measures student performance on these outcomes using a combination of direct and indirect measures, and uses the results to inform program decisions, including those about instruction, the curriculum, the assessment, and dialog about teaching and learning. Actions can include choosing to make changes to a program, continue current effective practices, or build on strengths.
Did this program use assessment results aligned with program-level SLOs to inform decision-making to support student learning during the past year? Note: It is not expected that programs complete an assessment cycle every year, or that programs complete an entire assessment cycle for a particular SLO in one academic year (i.e. an action or change in the past year may have been informed by an assessment measure collected in previous academic years).
Yes, our program used assessment results from SLO-aligned assessment to inform decision-making
No, our program has collected one or more SLO-aligned assessment measures, but has not yet used the assessment results to inform decision-making
Our program has not yet collected a SLO-aligned assessment measure
IF YES: In the spaces below, illustrate how your program used SLO-aligned assessment results to inform decision-making by describing an action or change to the curriculum, teaching, faculty development, and/or the assessment process in the past year that was informed by a SLO-aligned assessment cycle.
Description of Use of Results from Student Learning Outcomes-Aligned Assessment [see examples]
Aligned with Program-level SLO:______
(Please specify the learning outcome(s) aligned with this assessment cycle)
Action or Change in the Past Year: Replace this text with a brief description of the action/change to curriculum, teaching, faculty development, the assessment process, and/or other decision-making.
Results that Informed Decision: Replace this text with a brief explanation of the assessment results that informed the decision-making (i.e. what your program learned from the assessment activity/activities).
Assessment Measure(s)/Activities: Replace this text with a brief description of the assessment measure(s) used to determine if students achieved the learning outcome(s). Where possible, indicate the number of students and faculty included in the assessment(s).
Please select the category corresponding to the type of action/change described above. (select all that apply)
Curriculum or instruction decision / Faculty/TA development decision / Assessment process decision / Other decision (please specify) ______
IF NO: If your degree program has not yet used SLO-aligned assessment results to inform decision-making in the past year, indicate where the program is in the assessment cycle for one learning outcome in the spaces below.
Description of Student Learning Outcomes-Aligned Assessment Cycle Process [see examples]
Aligned with Program-level SLO:______
(Please specify the learning outcome(s) aligned with this assessment cycle)
Assessment Measure(s)/Activities: Replace this text with a brief description of the assessment measure(s) used to determine if students achieved the SLO(s). Where possible, indicate the number of students and faculty included in the assessment(s).
Assessment Results: Replace this text with a brief explanation of the assessment results (i.e. what your program learned from the assessment activity/activities).
Next Steps: Replace this text with a brief description of how assessment results will be shared with faculty and used to inform program decision-making.
2.B. Other Assessment Activities and Uses of Assessment Results