
University of Cincinnati Educator Preparation Programs

Unit Assessment System

Transforming Lives, Schools, and Communities

The University of Cincinnati Educator Preparation Programs are committed to transforming lives, schools, and communities. We target the continuous improvement of the lives of the P-12 students with whom we work, our partner schools, the performance of our candidates, the quality of our programs, and the quality of our procedures and operations. We are accountable internally to our candidates, faculty, and clinical faculty, and externally to our specialized program associations, the state department of education, the students with whom our candidates work, our partner schools, and the community. As a Transformation Initiative institution, we are accountable to our field and to improving student outcomes through replicable efforts in teacher preparation. As an institution on the first annual list of institutions accredited by NCATE in 1954, we have a long tradition of self-study. We recognize the role of assessment and evaluation in decision-making and increased effectiveness. We recognize the need for multiple sources of data and have identified the need for an “assessment mosaic” focused on improving P-12 student outcomes and unit operations.

The culture of data-based decision making has long been established in the unit. The Assessment and Evaluation Board in the College of Education, Criminal Justice, and Human Services has been in place since 1996, evolving into an Assessment Advisory Board in 2010. Efforts to evaluate educator programs predate the NCATE 2000 standards, and systematic data collection, management, and application to the continuous improvement of programs and unit operations have been in place since 2002. The University Council for Educator Preparation, comprised of university-wide faculty members and administrators, public school teachers, and community members, monitors the assessment system. Implementation of the assessment system is managed by the Office of Assessment and Continuous Improvement, with Dr. James Vondrell as the director. Examples of initiatives he has directed include the shift to a paper-free system and more efficient field and clinical placements; candidate, mentor, and field site evaluation; an annual student satisfaction survey; and advisory panels comprised of principals, associate superintendents, and superintendents of our district partners.

As a unit, we are committed to a transparent system that promotes discussion with various stakeholders. We ground our efforts in research and evidence. As with any assessment cycle, our system is constantly under review for the power of the data it generates. In addition, our Transformation Initiative has prompted us to review the focus of our system.

Development of the Assessment System

The Assessment System was initiated in 2002 through a collaborative effort of five groups representing faculty members and the professional community. During the planning process, the work groups met individually and presented their plans to the Continuous Improvement Committee (now the University Council on Educator Preparation - UCEP). The committee then endorsed the plans for implementation at the program level.

The assessment system was based on several principles put forth by UCEP:

·  Data is gathered from candidates, faculty members, cooperating teachers/mentors, graduates, employers, district personnel, and other members of the professional community, as well as the students and clients with whom they work

·  Because of the broad base of data collected, members of these groups are participants in the design, implementation, and evaluation of the assessment system and its components

·  Data is gathered related to standards, proficiencies, and tenets of the Conceptual Framework as well as national and state standards

Various measures of the Unit Assessment Plan were used during the 2001-2002 academic year, with near-complete implementation during the 2002-2003 academic year. In our review of the data generated by these assessment efforts, we identified a need to develop more specific performance assessments for advanced programs. This need emerged concurrent with changes in graduate program policies reducing the number of credit hours required to earn master's degrees. As a result, programs were revised, and new performance assessments were developed for those programs and implemented during the 2004-2005 academic year. All aspects of the assessment system are institutionalized, though individual assessments undergo annual review to ensure the data are useful.

Several opportunities to improve the system and programs have presented themselves. These opportunities prompted even greater reliance on assessment data regarding programs, the unit, program operations, and P-12 student learning. These changes and opportunities include:

·  The shift from quarters to semesters beginning Fall 2012, which provided us the opportunity to use data from the assessment of programs and unit operations to completely rethink programs in view of seven years of program, unit, and operations data

·  Collaborating with Stanford University as one of four institutions in Ohio piloting the Teacher Performance Assessment (Ohio is a “fast track” state), prompting us to rethink our assessments of performance in clinical experiences

·  Being awarded the Woodrow Wilson Fellows program, providing us the opportunity to design a program for candidates with strong content knowledge and degrees in mathematics and science to become teachers in high-needs schools

·  The introduction of a series of formative assessment tools consistent with the Ohio Residency Program (evolved from work with the New Teacher Center)

·  Redesigning all programs in response to themes described in our Transformation Proposal

·  Our recognition that the system must be clearly aligned with best practices in assessment and evaluation

·  Our recognition that any system involved in preparing professionals must be related to the impact on the clients; in our case we must systematically collect, analyze, review, and use data related to the impact of our candidates and graduates on the students with whom they work

·  Our commitment to establish an “assessment mosaic” in which a wide range of assessment strategies and data sets, grounded in outcomes of p-12 students, are designed, evaluated, and used continuously to inform program and procedural improvements.

Relationship of Assessment System to Conceptual Framework

Our conceptual framework has evolved in view of our participation in the Transformation Initiative. Our Unit standards for performance expectations have become: Candidates of the University of Cincinnati are committed to transforming the lives of P-12 students, their schools, and their communities, and

·  Demonstrating foundation knowledge, including knowledge of how each individual learns and develops within a unique developmental context

·  Articulating the central concepts, tools of inquiry, and the structures of their discipline

·  Collaborating, leading, and engaging in positive systems change

·  Demonstrating the moral imperative and the responsibility to teach all students with tenacity

·  Addressing issues of diversity with equity and using skills unique to culturally and individually responsive practice

·  Using technology to support their practice

·  Using assessment and research to inform their efforts and improve outcomes

·  Demonstrating pedagogical content knowledge, grounded in evidence- based practices, committed to improving the academic and social outcomes of students

Our assessment system is organized around these institutional standards. To demonstrate our commitment to national professional standards (Ohio is a partnership state, and state and national standards are synonymous), all assessments in the system are explicitly aligned. This alignment has led us to discontinue our student teaching/internship performance assessment because Ohio has moved from Praxis III to become a “fast-track” Teacher Performance Assessment state.

Identifying our unit dispositions was the first task of our Unit-wide Continuous Improvement Committee. Our unit dispositions reflect our “Ways of Being.” Intrinsic to our dispositions is the notion of community and belonging. We appreciate each individual’s fundamental need for acceptance and belonging, and that a student’s fundamental need is to be successful and competent. We appreciate that we are members of a community, and that “none of us can find ourselves, know ourselves, or be ourselves, all by ourselves” (Binau, 2000). As educators transforming lives, schools, and communities, we aspire to the following:

·  initiative on behalf of all learners

·  responsibility to promote effort and excellence in all learners

·  rapport with students, peers, and others

·  a commitment to reflection, assessment, and learning as an ongoing process grounded in inquiry

·  collaboration with other professionals to improve the overall learning environment for students

·  acknowledging multiple perspectives

·  dedication to teaching the subject matter and in keeping informed and competent in the discipline and its pedagogy

·  appreciating both the content of the subject area and the diverse needs, assets, and interests of the students and valuing both short- and long-term planning

·  commitment to the expression and use of democratic values in the classroom

·  responsibility for making the classroom and the school a “safe harbor” for learning, in other words, a place that is protected, predictable, and has a positive climate

·  valuing opportunities to collaborate with parents

·  recognition of the fundamental need of students to develop and maintain a sense of self-worth, and that student misbehavior may be attempts to protect self-esteem

·  belief that all children can learn and persistence in helping every student achieve success

·  valuing all students for their potential and as people and helping them value each other

·  high ethical and professional standards.

Because of our intense commitment to these dispositions, we developed a unit-wide Candidate Dispositions Progress Report for the formal documentation of candidate dispositions. In addition, a Dispositions Brief Report was developed both to document exemplary dispositions and to identify areas of development for specific candidates. These reports identify candidates in terms of dispositions and general behavior, and are effective in documenting behavior that requires intervention and action plans. However, as formative assessment tools for classroom observation, these reports were less behavioral and measurable than we wished. As part of our Transformation Initiative, clear, specific, measurable descriptions of behaviors demonstrating our dispositions were generated. We are currently piloting and calibrating the Student-Teacher Performance Assessment Tool (Appendix A), with one set of pilot data collected. In this assessment, we used research to generate specific items that would support candidate development of appropriate interactions. A second issue that emerged was that of campus behavior. To provide candidates with more specific feedback in this area as well, a Classroom Disposition Assessment was developed. Both assessments are web-based.

All measures are aligned with institutional standards and candidate proficiencies. Our dispositions are measured and documented across the unit. In this way the University of Cincinnati Educator Preparation Programs, with the involvement of its professional community, is implementing an assessment system that reflects the conceptual framework(s).

Relationship of Assessment System to Professional, State, and Institutional Standards: Programs and Unit Operations

In addition to aligning our assessment system to our institutional standards, the system is aligned with the Ohio Standards for the Teaching Profession and the Model Core Teaching Standards (CCSSO, 2011) for initial programs and the National Board for Professional Teaching Standards for advanced programs. All licensure programs employ the standards of the appropriate specialized program associations. The Student-Teacher Performance Assessment Tool is being piloted to evaluate candidate performance in all professional field experiences, as required by our state. The assessment plan demonstrates this alignment in the presentation of data. By presenting our assessment efforts in this way, we are constantly reminded of our professional, state, and institutional standards.

Relationship of Assessment System to National Models for Assessment Systems

As we evaluated our assessment system, we identified our efforts as “purpose oriented” (Goodwin, Englert, & Cicchinelli, 2002). The overriding goal of a purpose-oriented system is improving student outcomes. This is consistent with our Transformation Initiative Proposal, which aims to improve outcomes for all students. In addition, this system is appropriate in that it is based in clear standards (professional, state, and institutional standards) flowing directly into assessments and multiple measures. Two aspects of this purpose-related accountability involve (a) evaluating the effectiveness of our efforts and reforms to support programs in making decisions and (b) monitoring learning and holding candidates and programs responsible for their student outcomes.

The shift to a new web-based application for our assessments (from ReMark to Qualtrics) has provided the impetus for us to examine the measurement aspects of our system. The National Institute for Learning Outcomes Assessment (NILOA, 2011) contends that learning outcomes must be useful and transparent. We want our system to be as useful as possible to programs and to communicate clearly to candidates, faculty members, administrators, P-12 partners, and the community. This alignment with the National Institute for Learning Outcomes Assessment supports our efforts in being evidence based. As we review our system, the six aspects of the transparency framework and examples of the activities in each area are:

Assessment Plans: Assessment processes, procedures, and activities
NILOA Activities / Our System / Example of Revision in Response to Review
Candidate Learning Outcomes Statements
Specific to Program Level / Alignment with SPAs, NBPTS, INTASC, Ohio / Align syllabi as well as assessments with standards
Prominently posted / Available to students / Available on every syllabus, handbook, assessment
Assessment Plans
Review what the measures are, how they are used, and their frequency of use (field coordinators meet to assess) / Handbooks and assessment website are reviewed by field coordinators of each program each year / In response to the Student-Teacher Performance Assessment Tool Pilot study, coordinators modified requirements for previous assessments
Review descriptions of the assessment to ensure that they are clear and understandable / Review all assessments for clarity, reading level, and transparency / Educator Impact Rubric has been repeatedly revised and finally replaced because of its complexity
Post or link assessments so they can be reviewed by all stakeholders / Review Office of Assessment and Continuous Improvement website / Recommendation for a "button" on the home page for easier access
Downloaded or accessed in a timely fashion / Data downloaded and shared / At program coordinators' request, all disposition assessments are downloaded and shared weekly; evaluations of or by university supervisors are shared prior to hiring deadlines
Receptive to feedback or comments / University Council for Educator Preparation, Field Coordinators Council, Licensure Council, Partnership Panels / Increased flexibility in scheduling Partnership Panel meetings with members of the P-12 school community
Evidence of Learning
Explained, analyzed, and interpreted in a way easily understood / Program development plans / When results are shared, a narrative is included
Presented in text and graphics / Data posted for programs / We have always relied on graphs; we will review the need for narrative
Disseminated and summarized for different groups / Website and emails used / Candidates are sent emails regarding rationales for changes in their programs; candidate outcomes are posted on website; explore changing language/format for additional groups
Use of Evidence
Examine the extent to which evidence is used to identify needed changes / Program development plans for each program / Programs review evidence from each assessment and design a response plan
Assessment Resources
Downloadable or accessible / Resources available on Office of Assessment and Continuous Improvement website; field coordinators Blackboard group / Handbooks explaining the Student-Teacher Performance Assessment are posted on Blackboard
Receptive to feedback or comments / Evaluation surveys with stakeholders / Cooperating Teacher Assessment, University Supervisor assessment, candidate assessments
Current Activities
Clearly communicated / Ongoing review of websites and handbooks / Candidates provide feedback on materials
Prominently Posted / Posted on website / Review ease of access with candidates

The Teacher Performance Assessment Consortium