Spring 2014 and Fall 2014

Student Learning Outcomes Assessment Plan and Report

College: College of Education

Department: Educational Leadership

Name of Degree/Certificate Program/Stand Alone Minor/Online Distance Education Program:

Ed.D. in Educational Leadership

Reflection on the Continuous Improvement of Student Learning
1. List the changes and improvements your program planned to implement as a result of last year’s student learning outcomes assessment data.
2. Were all of the changes implemented? If not, please explain.
3. What impact did the changes have on student learning?
1.  All performance outcomes were met during 2013; thus, all suggested changes were based upon other mandates and program observations. The program continues to examine its candidates and outcomes, as well as external expectations, to produce excellent educational leaders.
2.  All planned changes were implemented.
3.  A review of implemented changes, based on data collected, indicates continued strong student performance across all SLO dimensions as measured by assessment rubrics.

(Document student learning outcomes assessment plans and assessment data for each undergraduate and graduate degree program and certificate program, stand-alone minor, and distance education program offered online only.)

Student Learning Outcome 1
(knowledge, skill or ability to be assessed)
SLO 1: Candidates for other professional school roles demonstrate an understanding of the professional and contextual knowledge expected in their fields; and use data, current research and technology to inform their practices.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
The Department of Educational Leadership (EDLD) program uses the following measures to assess SLO 1:
1.  Qualifying Examination. The Qualifying Exam may be taken after the candidate completes a minimum of 24 credit hours and before completing 36 hours. The exam has two parts: a written portion, completed over approximately 12 hours, followed by an oral defense of the written work. The six components of the rubric are applied to the candidate’s combined performance on the written and oral exams. Specific areas assessed are:
a.  Ability to recognize and articulate the problems at hand.
b.  Expression of the problems’ background and ability to employ critical analysis and relevant literature.
c.  Reasoning skills.
d.  An understanding and ability to apply appropriate research methods vis-à-vis problems posed during the exam.
e.  Apply critical reflection to knowledge gained from the academic program.
f.  Ability to effectively respond to scholarly questions.
2.  Proposal Defense. (The proposal is a draft of the first three chapters of the candidate’s dissertation.) Results of the proposal defense are used to estimate candidates’ ability to design a research project that answers important questions in their content area. Specific areas assessed are:
a.  A research problem which is clear, articulated and significant.
b.  Research methods which provide detailed description of (if applicable): subjects, design/approach, methods/procedures and analyses.
c.  Research methods and analyses that are appropriate to the research questions.
d.  A relationship between the research problem and the student’s role as an educational leader.
e.  A preliminary literature review that describes prior conceptual and research investigations of the research problem.
3.  Dissertation Defense. (The dissertation presents the completed research and includes five chapters.) Results from the dissertation defense are used to determine whether the candidate has the knowledge and skills to conduct a research project. Specific areas assessed are:
a.  Develops clear and appropriate research questions or hypotheses that guide the study.
b.  Demonstrates how research questions or hypotheses have been examined in previous studies.
c.  Analysis is comprehensive, complete, sophisticated, and convincing.
d.  All pertinent results are reported in a clear and concise manner. Tables/figures are labeled appropriately.
e.  Draws clear conclusions based on collected data that answer the research questions or test the hypotheses.
f.  Makes recommendations for further research that can build on this project.
g.  Provides reflection on problems or errors in the study and discusses how they could be avoided in subsequent studies.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
Qualifying Examination:
Written component. Candidates should take the written comprehensive examination as soon as possible after completing 24 credit hours of foundations and research coursework, and no later than enrollment in ADMN 8699 (Dissertation Proposal Seminar). The examination may occur at any time during the year and normally includes questions from six different doctoral courses, to be completed within twelve hours (six hours on each of two consecutive days). The questions require candidates to connect basic concepts from completed coursework and to apply what they have learned to different situations and educational contexts. A committee consisting of the candidate’s advisor and the faculty members who have instructed the candidate prepares and evaluates the written examinations. Steps in the Written Examination:
1) The candidate and advisor determine a date for the examination, which will be at least 60 days from the day of the decision.
2) The advisor notifies committee members that the student will take the examination and requests any materials/information (if appropriate) to guide the candidate’s preparation.
3) The candidate takes the examination in the department area on a department laptop computer; unless otherwise indicated by a faculty member, no materials or resources may be used during the examination.
4) The advisor gives the candidate’s responses to the examination questions to the appropriate faculty members for evaluation.
5) If the candidate’s performance on the written examination is unsatisfactory (Not Acceptable), in whole or in part, the candidate is allowed to re-take the failed portion(s) of the examination. A second failure will result in termination from the program.
The written examination is scored on the following scale: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2, across multiple dimensions.
Oral component. The oral examination normally occurs within 30 days of successful completion of the written examination. During the oral examination, the candidate’s advisor and committee engage in dialogue with the candidate about the written examination. The discussion has two purposes. First, it provides an opportunity for the candidate to address in more detail or to clarify responses to questions on the written examination. Second, it allows the committee to engage the student in a discussion of issues not addressed in the written examination but pertinent to the content. If the candidate’s performance on the oral examination is unsatisfactory, an additional oral examination may be scheduled and/or the candidate may be required to take additional coursework. Subsequent failure on the oral examination will result in termination from the program. Upon successful completion of the written and oral examinations, the student’s advisor and committee must sign and submit the Qualifying Examination/Comprehensive Examination Report for Doctoral Candidates. The oral exam is scored on the following scale: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2, across multiple dimensions. A rubric is used to evaluate the combined written and oral defense of the qualifying exam. It contains the six assessment dimensions mentioned earlier.
Dissertation Proposal Defense
The development and defense of a dissertation proposal is an important aspect of dissertation research. The proposal is a draft of the first three chapters of the candidate’s dissertation. The proposal defense is scored on the following scale: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2, across multiple dimensions. After the student/candidate “meets” or “exceeds” expectations on all dimensions, they are allowed to begin their research. A rubric is used to evaluate each of the five domains in the proposal defense.
Dissertation Defense
When the candidate’s dissertation committee believes that the dissertation is in satisfactory form, a final defense is scheduled. The dissertation defense is scored on the following scale: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2, across multiple dimensions. Students/candidates who do not meet expectations are provided feedback and another defense is scheduled. A rubric is used to evaluate each of the seven domains in the dissertation defense.
Assessments are administered at identified points during the program. Work samples are scored using the designated method and scores are collected and analyzed at the program level. Simple descriptive statistics are used to report the scores. Findings are discussed at monthly Doctoral Advisory Committee meetings and during department faculty meetings. Recommendations for changes and improvements are examined and adopted as deemed appropriate. All data reports created by the College of Education are housed on a secure website which is accessible to all faculty members within the College of Education.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects at least 80% of the students to score “1” or “2” (meet or exceed expectations) on each element of the Qualifying Exam, Proposal Defense, and Dissertation Defense. The results indicated that candidates’ performance exceeded this expectation.
Spring 2013-Fall 2013 Assessment Data / Spring 2014-Fall 2014 Assessment Data

Qualifying Examination: The percentages of candidates who “met” or “exceeded” expectations are reported in the table below.

Rubric Dimension / 2013: % Meets or Exceeds (n=7) / 2014: % Meets or Exceeds (n=10)
Ability to recognize and articulate the problems at hand. / 100% / 100%
Expression of the problems’ background; able to employ critical analysis and relevant literature. / 100% / 100%
Reasoning skills / 100% / 100%
An understanding and ability to apply appropriate research methods vis-à-vis problems posed during the exam / 100% / 100%
Apply critical reflection to knowledge gained from the academic program. / 100% / 100%
Ability to effectively respond to scholarly questions / 100% / 100%

Proposal Defense: The percentages of candidates who “met” or “exceeded” expectations are reported in the table below.

Rubric Dimension / 2013: % Meets or Exceeds (n=9) / 2014: % Meets or Exceeds (n=9)
A research problem which is clear, articulated and significant. / 100% / 100%
Research methods which provide detailed description of (if applicable): subjects, design/approach, methods/procedures and analyses. / 100% / 100%
Research methods and analyses that are appropriate to the research questions. / 100% / 100%
A relationship between the research problem and the student’s role as an educational leader. / 100% / 100%
A preliminary literature review that describes prior conceptual and research investigations of the research problem. / 100% / 100%

Dissertation Defense: The percentages of students who “met” or “exceeded” expectations are reported in the table below.

Rubric Dimension / 2013: % Meets or Exceeds (n=9) / 2014: % Meets or Exceeds (n=10)
Develops clear and appropriate research questions or hypotheses that guide the study. / 100% / 100%
Demonstrates how research questions or hypotheses have been examined in previous studies. / 100% / 100%
Analysis is comprehensive, complete, sophisticated, and convincing. / 100% / 100%
All pertinent results are reported in a clear and concise manner. Tables/figures are labeled appropriately. / 100% / 100%
Draws clear conclusions based on collected data that answer the research questions or test the hypotheses. / 100% / 100%
Makes recommendations for further research that can build on this project. / 100% / 100%
Provides reflection on problems or errors in the study and discusses how they could be avoided in subsequent studies. / 100% / 100%
Changes to be implemented Fall 2015: Based upon the 2014 assessment data included in this annual report, what changes/improvements will the program implement during the next academic year to improve performance on this student learning outcome?
Given the strong record of student success, no changes are anticipated.
Student Learning Outcome 2
(knowledge, skill or ability to be assessed)
SLO 2: Candidates for other school professions demonstrate professional behaviors consistent with fairness and the belief that all students can learn, including creating caring, supportive learning environments, encouraging student-directed learning, and making adjustments to their own professional dispositions when necessary.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
The Intern Summary Evaluation instrument is used during the candidates’ internship process. There are seven domains on which students/candidates are scored. The professional behaviors are scored on the following scale: Expectations Not Demonstrated-0; Developing-1; and Proficient-2. Domains include Strategic Leadership, Instructional Leadership, Cultural Leadership, Human Resource Leadership, Managerial Leadership, External Development Leadership, and Micro-political Leadership.
Additionally, all students/candidates must take the Collaborative Institutional Training Initiative course in the protection of human research subjects and must participate in a tutorial that assesses their knowledge of the procedures for protecting human research subjects and conducting research in the social, educational, and behavioral sciences. All students/candidates must correctly answer at least 80% of the items on the tutorial assessment, which is embedded within the tutorial itself, before they can conduct research at UNC Charlotte.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
During Internship: Students/Candidates are scored by two supervisors (i.e., the University Supervisor and Intern Site Mentor) on the Intern Summary Evaluation instrument during the internship experiences (Parts I and II). At the end of the internship, the two raters provide a summative evaluation based on the ratings.
Assessments (i.e., Intern Summary Evaluation and Collaborative Institutional Training Initiative exam) are administered at identified points during the program. The tutorial assessment is embedded in the training session and the user is not permitted to proceed until mastery is attained.
Work samples are scored using the designated method and scores are collected and analyzed at the program level. Simple descriptive statistics are used to report the scores. Findings are discussed at monthly Doctoral Advisory Committee meetings and during department faculty meetings. Recommendations for changes and improvements are examined and adopted as deemed appropriate. All data reports created by the College of Education are housed on a secure website which is accessible to all faculty members within the College of Education.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
At least 80% of students/candidates will score “2” (Proficient) across all domains on the Intern Summary Evaluation instrument, and 100% must pass (score 80% or higher) the Collaborative Institutional Training Initiative tutorial in order to conduct research at UNC Charlotte.
Spring 2013-Fall 2013 Assessment Data / Spring 2014-Fall 2014 Assessment Data

Professional Domains – Internship: The percentages of students who were determined to be “Proficient” (a score of 2, the highest score) are reported in the table below. Eighty-three percent of the students scored at the proficient level on all dimensions. These data combine the final internship report from Spring 2013 and the interim report from Fall 2013, since students complete the internship over a two-semester period.