Florida Charter Schools

Teacher and Leader Evaluation

Staff Development Presentation Materials

for Certified Trainers

Data Analysis Processes & Cause and Effect

Presented by

© 2013 The Leadership and Learning Center
Reproduction rights provided to Florida DOE and FL Charter Schools

Learning Activity 1

Respond to the question below in the space provided. Do this individually, and then share your definition with a partner.

How do you define cause and effect as it relates to the teaching and learning process?

Cause and effect is…

Learning Activity 2

Directions:

Think of a student achievement challenge we wish to conquer as a school, department, or grade level. For example: increase 3rd grade writing scores, improve Algebra EOC results, or improve overall reading results. Place an X in the quadrant below that best describes our status, and provide a justification for the quadrant you selected. Be prepared to share your thinking with the group in five minutes.

Lucky
High results, low understanding
Replication of success unlikely

Leading
High results, high understanding
Replication of success likely

Losing
Low results, low understanding
Replication of mistakes likely

Learning
Low results, high understanding
Replication of mistakes unlikely

Learning Activity 3

Implementation of Higher Order Questioning Techniques: Cause and Effect Data

Level of Frequency/Effectiveness / Highly Effective / Effective / Progressing / No Evidence
1.
Evidence of Thinking/Student Involvement / All students are engaged by the questioning. Students are prepared to interact with the teacher and classmates during discussion and questioning at any point. / The large majority of students are engaged in questioning and provide sound responses to the questions.
A few students are not ready to discuss or answer questions and need to be redirected. / Some students are actively engaged in the lesson and provide quality responses to the questions, but the majority of students have not moved to higher levels of thinking. / Students are unengaged in the lesson and/or questioning.
Number of Observations – 131 / 3 / 40 / 76 / 12
Percentage / 2% / 31% / 58% / 9%
2.
Question Complexity / Teacher uses multiple methods to facilitate learning at the conceptual level, and questioning leads students to transfer their learning to other disciplines and lessons. / The question complexity is evident and consistently requires students to analyze, apply, and evaluate. / Question complexity is not varied; recall questions (Who? What? When?) are used frequently, with a few higher-level questions included. Teacher misses opportunities to connect the discussion to critical thinking. / No evidence of teacher varying levels of questioning.
Number of Observations – 131 / 4 / 20 / 88 / 19
Percentage / 3% / 15% / 67% / 15%
3.
Academic Rigor/Curriculum Concepts and Connections / Questioning techniques are related to the conceptual foundations of the content and promote student curiosity about the content and previously learned material. / Questions are content-relevant and promote critical thinking for all students. / Some questions are designed to promote critical thinking, but the majority are based on recall rather than reasoning. / Questions do not match content objectives or are not asked at all.
Number of Observations – 131 / 10 / 27 / 76 / 18
Percentage / 8% / 21% / 58% / 13%
4.
Questioning Management / Teacher is deliberate about questioning for conceptual understanding. Teacher is actively assessing all students during discussion and is able to probe for misconceptions and redirect for deeper learning. / Teacher implements a strategic questioning plan.
Teacher engages all students in questioning and provides cues and wait time for quality responses. / Teacher does not have a set plan for questioning but lets class responses control the discussion.
Most students are actively participating in the questioning.
Teacher sometimes answers her own questions. / No plan for questioning is evident.
Questions are not purpose-driven or are not asked at all.
Number of Observations – 131 / 14 / 40 / 67 / 10
Percentage / 11% / 30% / 51% / 8%
5.
Feedback/
Reflection / Students are always given specific feedback on their question responses.
Classroom “Pulse Checks” are done to assess understanding. / Students are given specific feedback on most of their responses.
Teacher consistently checks for student understanding. / Students are given feedback on their responses; however, it is very general (“good,” “nice,” and “yes” are examples of the feedback).
Some checking for student understanding occurs. / No feedback is given on student responses.
No evidence of assessing student understanding.
Number of Observations – 131 / 25 / 50 / 43 / 13
Percentage / 19% / 38% / 33% / 10%
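The percentage rows in the chart are simply each rating's observation count divided by the 131 total walkthroughs, rounded to the nearest whole percent. As a quick illustration (a hypothetical helper, not part of the training materials), the row 1 counts convert like this:

```python
# Convert walkthrough observation counts into rounded percentages,
# using the row 1 (Evidence of Thinking/Student Involvement) counts above.
counts = {
    "Highly Effective": 3,
    "Effective": 40,
    "Progressing": 76,
    "No Evidence": 12,
}

total = sum(counts.values())  # 131 observations

# Each rating's share of the total, rounded to the nearest whole percent.
percentages = {rating: round(100 * n / total) for rating, n in counts.items()}

print(percentages)
# {'Highly Effective': 2, 'Effective': 31, 'Progressing': 58, 'No Evidence': 9}
```

The same division reproduces the percentage row for each of the five rubric components.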

Reflection Questions:

The data on the chart are cause data collected every nine weeks, along with data on student benchmark assessments. The hypothesis is that if teachers get better at the components of higher-order questioning, student learning will improve. To what extent do you connect your proficiency with high-effect-size practices to outcomes for students? How can we do a better job of this as we work to improve student learning?

Learning Activity 4

Review the 6-step data analysis process below. Which components do we consider a strength for our school, and which ones can we add to strengthen our process to improve student learning and build our instructional capacity?

Data Teams 6-Step Process

The Leadership & Learning Center
Step 1: Collect & Chart the Data
·  Teachers come to meetings with pre- or post-assessment data scored into 4 performance bands based on agreed-upon cut points.
·  1. Proficient & higher; 2. Close to proficiency; 3. Far to go, but likely to get there; 4. Needs intensive support and not likely to get there.
·  The grade-level data is charted by teacher and performance band and computed as a grade-level total at the bottom.
Step 2: Analyze to Prioritize
·  Overall student strengths are examined.
·  Challenges/ common errors/ obstacles/ misconceptions are noted.
·  Accurate inferences are made for each performance category.
·  Urgent needs are identified and prioritized for focus.
·  High-leverage points are discussed.
Step 3: Create SMART Goals
·  The percentages of students in each performance band drive the goal-setting phase of the process. This is formula-based, not guessing or hoping.
·  The team takes the number of students already proficient and adds the numbers in the close category and the far-to-go category to set the overall SMART goal.
·  Example: Pre-Assessment Data
100 students total – 25 proficient, 33 are close, 20 far to go but likely, 22 intensive
·  SMART Goal – Students scoring proficient or higher on main idea will increase from 25% to 78% by Oct. 30 as measured by a common formative assessment given on Sept. 28, 2014.
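The goal arithmetic above can be sketched as a short calculation (band names here are illustrative, not the Center's terminology):

```python
# Sketch of the Step 3 SMART-goal formula, using the pre-assessment
# example above: 100 students split across 4 performance bands.
bands = {
    "proficient_or_higher": 25,
    "close_to_proficiency": 33,
    "far_to_go_but_likely": 20,
    "needs_intensive_support": 22,
}

total = sum(bands.values())  # 100 students

# Baseline: students already proficient.
baseline_pct = round(100 * bands["proficient_or_higher"] / total)

# Goal target: proficient plus the "close" and "far to go but likely" bands.
goal_count = (bands["proficient_or_higher"]
              + bands["close_to_proficiency"]
              + bands["far_to_go_but_likely"])
goal_pct = round(100 * goal_count / total)

print(f"SMART goal: from {baseline_pct}% to {goal_pct}% proficient or higher")
# SMART goal: from 25% to 78% proficient or higher
```

This is where the 78% in the sample goal comes from: 25 + 33 + 20 of the 100 students.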
Step 4: Select Instructional Strategies
·  Teachers discuss the most effective, research-based instructional strategies, specifically for main idea.
·  They decide how to deliver instruction based on the needs of students in each performance band because the misconceptions of students will vary by group.
·  The frequency, duration, and delivery methods are decided upon as well.
Step 5: Determine Results Indicators: Cause and Effect
·  Expected adult behaviors are determined: what will we see teachers doing when implementing the strategy with fidelity?
·  Expected student behaviors/results are determined: what will students be doing to show us they are progressing in their learning? What will be the indicators of progress?
Monitoring Meetings
·  Teams discuss how things are going, how they are progressing with the instructional strategy, and what adjustments they feel they need to make.
Step 6: Results and Reflection: Determining Cause and Effect using the SMART Goal
·  Teachers come to the meeting with post-assessments scored into 4 performance bands based on previously established cut scores and definitions of proficiency.
·  Teams determine if they met their SMART goal and discuss the WHY behind their results, both positive and negative.
·  Teams review the impact of their actions on the students in each of the performance bands, especially in the intensive band: these students may not be proficient YET, but did they move to higher bands of performance? Teams discuss what strategies worked for this group and do more of them!
·  This is the most powerful part of the process – teams discover what works and why.
·  They discover and analyze the cause and effect of their work.
·  Then they replicate what works and eliminate what does not for their next unit of study.
The purpose of teacher data teams is to find best practices, based on results, and then replicate them!

Feedback and Concluding Thoughts

Based upon your experiences today…
