Spring 2015 and Fall 2015
Student Learning Outcomes Assessment Plan and Report
College: ___College of Education______
Department: _Educational Leadership______
Name of Degree or Certificate Program/Stand Alone Minor/Online Distance Education Program: M.Ed. in Instructional Systems Technology
Reflection on the Continuous Improvement of Student Learning
1. List the changes and improvements your program planned to implement as a result of last year's student learning outcomes assessment data.
2. Were all of the changes implemented? If not, please explain.
3. What impact did the changes have on student learning?
No changes were needed, as all performance outcomes were met. Performance outcomes remained strong during 2015.
Student Learning Outcome 1
(knowledge, skill or ability to be assessed)
SLO 1 (revised 2015 report): Instructional Systems Technology candidates demonstrate an understanding of instructional technology standards and are able to apply knowledge and skills specific to their concentration area.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
In 2013, the College of Education accrediting body, the Council for the Accreditation of Educator Preparation (CAEP), released new standards for educator preparation programs. To better align with these standards, the College of Education faculty have worked collaboratively this year to revise our Student Learning Outcomes (SLOs). In addition, the UNC Charlotte Office of Assessment recommends that programs revisit SLOs every 3-5 years to ensure that SLOs accurately assess student learning. As a result, SLO 1 has been changed as indicated above.
To assess the revised SLO 1, an existing data source, the Internship Project, was identified. The specific indicators aligned with SLO 1 are listed below; they differ from those in the 2014 data report.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
Candidates were assigned a project used to measure knowledge and skills. The project was assessed using a rubric with a 10-point scale: Unacceptable if the total score is below 5; Acceptable if the total score is at least 5 with no standards rated "unacceptable"; Target if the total score is 8 or more points with no more than 2 standards rated "acceptable." The project examines the candidate's ability to demonstrate an understanding of the instructional technology standards and to apply them in the instructional technology concentration area.
Projects to assess knowledge and skills
The internship project is implemented during the Internship (EIST 6491) course. The internship requires the candidate to demonstrate in-depth knowledge of the field. Candidates demonstrate this knowledge and skill by designing conditions for learning, applying principles of instructional systems design, message design, instructional strategies, and learner characteristics. The faculty member responsible for the internship uses the common rubric to assess the work sample. The internship mentor provides feedback to the faculty member regarding student performance, and the faculty member assigns the grade using the internship evaluation rubric. For these reasons, the Internship Project is an appropriate measure for revised SLO 1.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
The internship project is implemented during the Internship course (EIST 6491). The course instructor uses the Internship Work Sample Assessment (IWSA) to score each student project. Assessments are administered at identified points during the internship. Work samples are scored using the designated method, and scores are collected and analyzed at the program level and reported to the Program Coordinator.

The program coordinator aggregates the data from all of the assessments and provides a report to the program faculty each year. Based on the faculty's review of the data, specific recommendations are made for possible modifications to the program for the following year. In addition, each faculty member provides the coordinator with descriptions of any course modifications made at the end of each academic year; these instructor-identified modifications are compiled by the program coordinator and shared in the report to program faculty.

Simple descriptive statistics are used to analyze the scores, and disaggregated findings are reported by semester or year at two levels (College and Program). Once a year, all results are disseminated to the program and department faculty and discussed during a final faculty meeting. All data reports created by the College of Education are housed on a secure website accessible to faculty within the College of Education. These data guide the faculty in identifying needed program and curricular changes.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects 90% or more of its candidates to score at the Acceptable level or better on each standard of the Internship Work Sample Assessment (IWSA) (possible range of scores: 0 to 2 points).
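As an illustration only (not part of the official reporting workflow), the performance-outcome check can be expressed as a short calculation. The sketch below assumes the IWSA's 0-2 scale, with scores of 1 (Acceptable) or 2 (Target) counting toward the 90% threshold; the function name and example scores are hypothetical.

```python
# Illustrative sketch of the performance-outcome check for one standard.
# IWSA scale assumed: 0 = Unacceptable, 1 = Acceptable, 2 = Target.
# The example scores are hypothetical, not actual program data.

def meets_outcome(scores, threshold=0.90):
    """Return True if the share of candidates scoring Acceptable (1)
    or better equals or exceeds the threshold."""
    acceptable_or_better = sum(1 for s in scores if s >= 1)
    return acceptable_or_better / len(scores) >= threshold

design_scores = [2, 2, 2, 1]  # hypothetical: three Target, one Acceptable
print(meets_outcome(design_scores))  # → True (100% at Acceptable or better)
```

A cohort with scores [0, 1, 2, 2] would fail the check (75% at Acceptable or better, below the 90% target).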
Assessment Data:
Spring 2014-Fall 2014 Assessment Data
Semester / Spring 2014 / Fall 2014
Count / 8
Internship Work Sample Assessment / 100%
Spring 2015-Fall 2015 Assessment Data
Instructional Systems Technology M.Ed., EIST 6491 - Internship
Rubric Criterion / Spring 2015 (DE only) / Fall 2015 (DE only)
Student Count / 4 / 4
Design / Average / 1.75 / 2.00
Count of 0 / 0 / 0
% Score of 0 / 0.00% / 0.00%
Count of 1 / 1 / 0
% Score of 1 / 25.00% / 0.00%
Count of 2 / 3 / 4
% Score of 2 / 75.00% / 100.00%
Development / Average / 1.75 / 1.75
Count of 0 / 0 / 0
% Score of 0 / 0.00% / 0.00%
Count of 1 / 1 / 1
% Score of 1 / 25.00% / 25.00%
Count of 2 / 3 / 3
% Score of 2 / 75.00% / 75.00%
Utilization / Average / 1.75 / 2.00
Count of 0 / 0 / 0
% Score of 0 / 0.00% / 0.00%
Count of 1 / 1 / 0
% Score of 1 / 25.00% / 0.00%
Count of 2 / 3 / 4
% Score of 2 / 75.00% / 100.00%
Management / Average / 1.75 / 2.00
Count of 0 / 0 / 0
% Score of 0 / 0.00% / 0.00%
Count of 1 / 1 / 0
% Score of 1 / 25.00% / 0.00%
Count of 2 / 3 / 4
% Score of 2 / 75.00% / 100.00%
Evaluation / Average / 1.75 / 1.75
Count of 0 / 0 / 0
% Score of 0 / 0.00% / 0.00%
Count of 1 / 1 / 1
% Score of 1 / 25.00% / 25.00%
Count of 2 / 3 / 3
% Score of 2 / 75.00% / 75.00%
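The descriptive statistics in the table above (average score and count/percentage at each score level) can be reproduced from raw rubric scores in a few lines. In this sketch the raw scores are reconstructed from the reported Spring 2015 Design counts (one score of 1, three scores of 2); the variable names are illustrative.

```python
from collections import Counter

# Spring 2015 "Design" scores reconstructed from the reported counts:
# one candidate scored 1, three scored 2 (n = 4).
scores = [1, 2, 2, 2]

average = sum(scores) / len(scores)
counts = Counter(scores)

print(f"Average: {average:.2f}")  # → Average: 1.75
for level in (0, 1, 2):
    pct = 100 * counts[level] / len(scores)
    print(f"Count of {level}: {counts[level]} ({pct:.2f}%)")
```

Running this yields the values reported in the table: an average of 1.75, with 25.00% of candidates at a score of 1 and 75.00% at a score of 2.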
Changes to be implemented Fall 2016: Based upon the 2015 assessment data included in this annual report, what changes/improvements will the program implement during the next academic year to improve performance on this student learning outcome?
Data indicated that candidates in the M.Ed. program met the targeted performance outcomes for revised SLO 1. However, the College of Education is committed to continuous improvement through data-based decision-making. Because scores cluster at the top of the current 0-2 scale, the data show little variability, which makes it difficult to identify areas for improvement. The program is therefore revising the rubric to use a 0-3 range.
Student Learning Outcome 2
(knowledge, skill or ability to be assessed)
Revised SLO 2: Instructional Systems Technology candidates apply research and evidence to provide leadership in the field of instructional technology and analyze, integrate, and implement technology‐rich learning solutions based on the needs of learners and instructional context.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
In 2013, the College of Education accrediting body, the Council for the Accreditation of Educator Preparation (CAEP), released new standards for educator preparation programs. To better align with these standards, the College of Education faculty have worked collaboratively this year to revise our Student Learning Outcomes (SLOs). In addition, the UNC Charlotte Office of Assessment recommends that programs revisit SLOs every 3-5 years to ensure that SLOs accurately assess student learning. As a result, SLO 2 has been changed as indicated above.
To assess the revised SLO 2, an existing data source, the Capstone Project, was identified. The specific indicators aligned with SLO 2 are listed below; they differ from those in the 2014 data report.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
Candidates are assigned the capstone project, which is used to measure knowledge and skills. The project examines the candidate's ability to design, develop, utilize, manage, and evaluate technological solutions to instructional problems. The rubric focuses on these five professional standards.
The project is implemented with the Capstone Master’s Degree Final Project. The Capstone or Final Project is completed during the last semester of coursework. Candidates are required to demonstrate their ability to collect and analyze data related to their work, reflect on their practice, and use research and technology to support and improve student learning. They demonstrate this knowledge and skill by developing instructional materials and experiences using print, audiovisual, computer-based, and integrated technologies. (See attached Rubric). For these reasons, the Capstone project serves as an appropriate measure of revised SLO 2.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
Assessments are administered at identified points during the program. Work samples are scored using the designated method, and scores are collected and analyzed at the program level. Simple descriptive statistics are used to analyze the scores, and disaggregated findings are reported by semester or year at two levels (College and Program). Once a year, all results are disseminated to the program and department faculty and discussed during a final faculty meeting. All data reports created by the College of Education are housed on a secure website accessible to faculty within the College of Education. These data guide the faculty in identifying needed program and curricular changes.
The capstone project is implemented within the Capstone course; the Capstone or Final Project is completed during the last semester of coursework. The faculty final project committee (three faculty members from the program and department) meets to review the student's final project using the Capstone Work Sample Assessment (CWSA). Each committee member rates the project individually, and the committee chair produces a score based on all three members' assessments of the final project. The chair of the committee enters this score in Taskstream.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects 90% of its candidates to score at the Acceptable level or better on each standard of the Capstone Work Sample Assessment (CWSA) (possible range of scores: 0 to 2 points).
Assessment Data:
Spring 2014-Fall 2014 Assessment Data
Semester / Spring 2014 / Fall 2014
Capstone Project Count / 8
Capstone Project / 100%
Spring 2015-Fall 2015 Assessment Data
Capstone Project / Fall 2015 (DE only)
Rubric Criterion / Student Count / 3
Design / Average / 2.00
Count of 0 / 0
% Score of 0 / 0.00%
Count of 1 / 0
% Score of 1 / 0.00%
Count of 2 / 3
% Score of 2 / 100.00%
Development / Average / 2.00
Count of 0 / 0
% Score of 0 / 0.00%
Count of 1 / 0
% Score of 1 / 0.00%
Count of 2 / 3
% Score of 2 / 100.00%
Utilization / Average / 2.00
Count of 0 / 0
% Score of 0 / 0.00%
Count of 1 / 0
% Score of 1 / 0.00%
Count of 2 / 3
% Score of 2 / 100.00%
Management / Average / 2.00
Count of 0 / 0
% Score of 0 / 0.00%
Count of 1 / 0
% Score of 1 / 0.00%
Count of 2 / 3
% Score of 2 / 100.00%
Evaluation / Average / 2.00
Count of 0 / 0
% Score of 0 / 0.00%
Count of 1 / 0
% Score of 1 / 0.00%
Count of 2 / 3
% Score of 2 / 100.00%
Changes to be implemented Fall 2016: Based upon the 2015 assessment data included in this annual report, what changes/improvements will the program implement during the next academic year to improve performance on this student learning outcome?