Spring 2015 and Fall 2015
Student Learning Outcomes Assessment Plan and Report
College: College of Education
Department: Special Education and Child Development
Name of Program: Online M.Ed. in Special Education—Academically/Intellectually Gifted (AIG)
Reflection on the Continuous Improvement of Student Learning
1. List the changes and improvements your program planned to implement as a result of last year's student learning outcomes assessment data.
2. Were all of the changes implemented? If not, please explain.
3. What impact did the changes have on student learning?
Although all student learning outcomes were met for 2014 and 2015, changes were made beginning in late 2015 to be phased in during 2016, as described in the sections below.
Student Learning Outcome 1
(knowledge, skill or ability to be assessed)
Revised SLO 1: Advanced program candidates are able to demonstrate and apply content knowledge and skills specific to their content area or discipline.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
In 2013, the College of Education accrediting body, the Council for the Accreditation of Educator Preparation (CAEP), released new standards for educator preparation programs. To better align with these standards, AIG program faculty have worked collaboratively this year to revise our Student Learning Outcomes (SLOs). In addition, the UNC Charlotte Office of Assessment recommends that programs revisit SLOs every 3-5 years to ensure that SLOs accurately assess student learning. As a result, SLO 1 is being changed.
Reporting for 2015 used existing data to address this student learning outcome. To assess the revised SLO 1 during 2015, one data source was used: the AIG Case Study Project in Differentiation. Data for 2015 are reported as scores of 80% or above on the overall Project rubric; beginning in 2016, candidates' performance will be reported using Taskstream and disaggregated within each response category of the Taskstream rubric.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
In the AIG Case Study Project in Differentiation, the candidate demonstrates and applies the ability to use pretest and posttest information to guide and differentiate instruction for an individual or a small group of students within a larger classroom setting. The Project rubric also addresses the ability to express content knowledge and skills in written form, and the ability to reflect on the application of content area and disciplinary knowledge to differentiate instruction effectively based on learners' presented needs. Together, these areas demonstrate the candidate's in-depth knowledge of AIG learners and pedagogy and the ability to effectively apply the knowledge and skills specific to their content area or discipline in addressing the learning needs of students identified as AIG.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
The AIG Case Study Project in Differentiation is the major course project and is completed by the M.Ed. candidate in SPED 6270: Gifted Assessment and Program Evaluation, taken in the second or third year of the eleven-course master's degree program. The Project is evaluated by the instructor using a set of rubric criteria that results in a grade. Point totals are converted to percentage scores: scores of 80% or higher earn a grade of B, and scores of 90% or higher earn a grade of A. For the purposes of this report, data reflect points earned at 80% or better (grades of A or B). An overall score of B or higher indicates a student product whose item-level rubric scores fall at or above 80%; these are recorded on the data management system's rubric as Proficient or Accomplished (scores of 4 or 5).
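To illustrate the conversion described above, the following is a minimal sketch (the point totals shown are hypothetical; the actual rubric point values are set by the course instructor):

    # Illustrative sketch of the point-to-grade conversion described above.
    # The point totals are hypothetical examples, not actual rubric values.
    def letter_grade(points_earned: float, points_possible: float) -> str:
        percentage = 100.0 * points_earned / points_possible
        if percentage >= 90.0:
            return "A"
        if percentage >= 80.0:
            return "B"
        return "below B"  # does not meet the 80% reporting threshold

    # Example: 43 of 50 rubric points = 86%, reported as a grade of B.
    print(letter_grade(43, 50))  # -> "B"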
Scores beginning in 2016 will be collected using the College's electronic data management system. Candidate performance is analyzed at the college and program levels. Simple descriptive statistics are used to analyze the scores, and disaggregated findings are reported by semester at three levels (College, Program, and Licensure Area). Once a year, results from all assessments administered by the programs are disseminated to the faculty in the College of Education. The data are discussed during a final faculty meeting, and next steps are determined to address any identified needs. All strategies determined during this closing-the-loop discussion are implemented during the next academic year. All data reports created by the College of Education are housed on a secure website accessible to faculty members within the College of Education.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects 80% of its M.Ed. candidates to score "Proficient" or better on each area of the rubric for the AIG Case Study Project in Differentiation. This is confirmed by scores of 4 or above in each rubric area on the Taskstream rubric, which uses a 5-point scale for each rubric item (0, 1, 4, or 5).
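The decision rule described here can be illustrated with a short sketch (the candidate names and item scores below are hypothetical; only the 0-1-4-5 scale is taken from this report):

    # Sketch of the performance-outcome rule: a candidate is counted as
    # proficient only if every rubric item is scored 4 (Proficient) or
    # 5 (Accomplished). Candidates and scores below are hypothetical.
    RUBRIC_SCALE = (0, 1, 4, 5)  # Not Observed, Emergent/Developing, Proficient, Accomplished

    def meets_standard(item_scores: list[int]) -> bool:
        return all(score >= 4 for score in item_scores)

    candidates = {
        "candidate_1": [4, 5, 4, 4],  # Proficient or better on every item
        "candidate_2": [5, 4, 1, 4],  # one Emergent/Developing item: not met
    }
    met = sum(meets_standard(scores) for scores in candidates.values())
    percent_met = 100.0 * met / len(candidates)
    print(f"{percent_met:.0f}% met the standard")  # target: 80% or higher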
Assessment Data: NOTE: our data sources and reporting procedures have changed; therefore, there are two tables here – one for Spring 2014-Fall 2014 data and one for Spring 2015-Fall 2015 data.
Spring 2014-Fall 2014 Assessment Data
Old SLO 1: SPED 6695/6696 Course grade of B or better

Program:         AIG M.Ed.     AIG M.Ed.     AIG M.Ed.
Semester:        Spring 2014   Summer 2014   Fall 2014
Total Count:     2             Not offered   Not offered
Met Standard:    2             N/A           N/A
Percentage Met:  100%          N/A           N/A
Spring 2015-Fall 2015 Assessment Data
SLO 1: SPED 6270 Case Study Project in Differentiation

Program:         AIG M.Ed.     AIG M.Ed.      AIG M.Ed.     AIG M.Ed.
Semester:        Fall 2015     Fall 2015      Spring 2015   Spring 2015
Format:          Distance Ed   Face to face   Distance Ed   Face to face
Total Count:     7             Not offered    Not offered   Not offered
Met Standard:    7             N/A            N/A           N/A
Percentage Met:  100%          N/A            N/A           N/A
Changes to be implemented Fall 2016: Based upon the 2015 assessment data included in this annual report, what changes/improvements will the program implement during the next academic year to improve performance on this student learning outcome?
Data indicated that candidates in the AIG M.Ed. program met the targeted performance outcomes for revised SLO 1. However, the College of Education is focused on continuous improvement through data-based decision-making. Based on the data presented here, faculty will review any rubric category in which the rate of meeting expectations is less than 100% across two or more groups. For 2015, all assessed candidates met the 100% target, so no changes are identified at this time.
Beginning with 2016, the evidence for SLO 1 will be the AIG Case Study Project in Differentiation, completed by candidates in SPED 6270: Gifted Assessment and Program Evaluation, replacing the previous SPED 6695/6696 course-grade evidence.
Student Learning Outcome 2
(knowledge, skill or ability to be assessed)
Revised SLO 2: Advanced program candidates use domain-specific research and evidence to demonstrate leadership in developing high quality learning environments.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
In 2013, the College of Education accrediting body, the Council for the Accreditation of Educator Preparation (CAEP), released new standards for educator preparation programs. To better align with these standards, the College of Education faculty have worked collaboratively this year to revise our Student Learning Outcomes (SLOs). In addition, the UNC Charlotte Office of Assessment recommends that programs revisit SLOs every 3-5 years to ensure that SLOs accurately assess student learning. As a result, SLO 2 has been changed as indicated above.
To assess the revised SLO 2, an existing data source was identified: the Gifted Workshop Project, completed by candidates in SPED 5211: Nature and Needs of Gifted Students. This replaced the Instructional Unit Plan from SPED 6241, which was used for the M.Ed. SLO 2 during 2014. Specific indicators aligned with SLO 2 are identified below; these indicators differ from those in the 2014 data report.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
In the SPED 5211 Workshop Project, M.Ed. candidates develop materials for a one-hour workshop for teachers at their school that addresses definitions of giftedness, characteristics of gifted students, North Carolina’s gifted legislation (Article 9B), issues gifted students face, and what programming for the gifted entails in the candidates’ school district. Candidates’ performance on this task is assessed by a rubric addressing their selection of data-based research supporting the topic of their workshop, their understanding of the learning and other needs of AIG learners, the inclusion of current policies, standards, and issues affecting the education of AIG learners, and candidates’ professional leadership as demonstrated in their ability to design collaborative learning activities and professional reflection experiences for the workshop’s intended audience. Candidates design the workshop to address needs they observe within their own school and then share these with colleagues. For these reasons, specific indicators on the Workshop Project Rubric align with the revised SLO 2.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
The Workshop Project is evaluated in SPED 5211: Nature and Needs of Gifted Learners using a rubric developed by faculty of the AIG program. The rubric items address the competencies described above under the heading Effectiveness Measure. The rubric uses a 5-point scale: 0 = Not Observed; 1 = Emergent/Developing; 4 = Proficient; 5 = Accomplished.
Scores of Proficient or Accomplished (4 or 5 points) are considered to meet the program's standards, while scores of 0 or 1 are considered Not Met. Students are provided in advance with a copy of the rubric and specific instructions for completing the Workshop Project. The rubric is completed by the course instructor. Point values on the rubric also serve as the grading scale for this assignment, which is why program faculty adopted the 5-point scale.
Data for 2015 were collected manually based on students' grades on this assignment, and overall performance was discussed by program faculty following each semester in which the course was offered. After 2015, rubric scores will be collected using the College's electronic data management system, Taskstream, using a revised rubric being prepared for this purpose. Scores will be provided to program faculty twice a year by the COED Office of Assessment and Accreditation. Simple descriptive statistics are used to analyze the scores, and disaggregated findings are reported by term at the college and program levels. All data reports created by the College of Education will be housed on a secure website accessible to faculty within the College of Education. The data collected in Taskstream will be discussed during AIG Program meetings as well as at the department's faculty meeting at least once per semester. In these meetings, next steps are determined to address any identified needs. Strategies determined during this closing-the-loop discussion will be implemented during the next academic year. These meetings are documented by program directors and department chairs, and candidate performance will be revisited at each subsequent meeting to monitor implementation progress.
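The simple descriptive statistics described here amount to counts and percentages disaggregated by term; a minimal sketch follows (the records shown are hypothetical, and the actual reports are produced from Taskstream by the COED Office of Assessment and Accreditation):

    # Sketch of the per-term disaggregation described above: for each term,
    # report the count assessed and the percentage meeting the standard
    # (overall rubric score of 4 or 5). Records below are hypothetical.
    from collections import defaultdict

    records = [
        {"term": "Fall 2015", "overall_score": 5},
        {"term": "Fall 2015", "overall_score": 4},
        {"term": "Spring 2016", "overall_score": 4},
    ]

    by_term = defaultdict(list)
    for r in records:
        by_term[r["term"]].append(r["overall_score"] >= 4)

    for term, met_flags in by_term.items():
        pct = 100.0 * sum(met_flags) / len(met_flags)
        print(f"{term}: n={len(met_flags)}, {pct:.0f}% met standard")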
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects at least 80% of its candidates to obtain a score of 4 ("Proficient") or higher, both overall and on each of the Workshop Project rubric indicators described above.
Assessment Data: NOTE: Because our data sources have changed, there are two tables here – one for Spring 2014-Fall 2014 data and one for Spring 2015-Fall 2015 data. Rubrics were not yet finalized when the 2015 SLO 2 assignments were collected, so overall scores are reported in lieu of performance on individual rubric items.