Los Alamitos Unified School District

Factors that Influence Student Achievement

The factors listed below help “attribute cause” for assessment results that measure student achievement. Once the data has been analyzed and summarized, it is important to identify the reasons for the assessment results so that improvement is purposeful. If we don’t know why the assessment results went up or down, it’s just plain luck – good or bad.

Until 100% of our students are proficient, we need action plans designed to improve student achievement that include one or more of these factors:

  1. Attitudes of staff members and students toward the testing program
  • Staff believes that all students can attain proficiency or better.
  • Staff believes the test is a reliable and valid measure of student learning.
  2. Test-taking skills
  • Skills should be explicitly taught throughout the year.
  • Assessments should be formatted to align with CSTs, district benchmarks, and local common assessments.
  3. Curriculum Taught (PLC: What Do We Want Students to Know?)
  • Curriculum Map should be aligned to CST blueprints with essential learnings identified.
  • Use Standards-based materials.
  • There should be congruence between standard, objective, lesson, and assessment.
  • Appropriate Study Skills should be taught (how to take lecture notes, how to use a textbook, how to prepare for exams, etc.).
  4. Explicit Direct Instruction at Grade Level or Above (PLC: How Will We Help Them Learn It?)
  • Give students a clear objective.
  • Preview/Review prior knowledge, academic vocabulary, and prerequisite skills.
  • Continually increase rigor as students demonstrate proficiency. (Treat all students as if they were GATE.)
  • Checking for Understanding – Every student responds to every question every time – during the initial instruction and guided practice.
  • Bell-to-Bell purposeful instruction with students on task.
  • I do it Alone, We do it Together, You do it Together, You do it Alone:
  • During I do it Alone, explain, model, and demonstrate metacognition (think aloud).
  • During We do it Together, all students participate (guided practice, checking for understanding).
  • During You do it Together, students collaborate on content.
  • During You do it Alone, teachers assign individual practice after students have demonstrated proficiency.

5. Non-fiction Writing Program in all content areas

6. Rigorous, robust vocabulary development

7. Frequent and timely feedback

8. Consequences other than an “F” or “0” for missing homework and assignments

9. Formative and Summative Assessments (PLC: How Will We Know When They’ve Learned It?)

  • Checking for Understanding
  • Guided Practice
  • Independent Practice
  • Local Common Assessments created by teachers through collaboration
  • District Benchmarks (essential agreements)
  • CSTs, CAHSEE, CELDT, Advanced Placement (No surprises!!)

10. Ongoing Analysis of Assessment Results using Data Director

  • Staff uses data to plan interventions and enrichments, modify instruction, and drive school and district improvement.

11. Intervention/Enrichment Opportunities (PLC: How Do We Respond When They Don’t Learn It? How Do We Respond When They Already Know It?)

  • Intensive instruction during the school day, based on assessment results, for underperforming students.

  • Extended learning time (after school intervention)
  • Enrichment opportunities for advanced students
  • Leveled grouping

12. Professional Learning Communities

  • The way we do business
  • The culture/context in which we strive for continuous improvement
  • Based on teachers collaborating to increase shared knowledge in order to benefit from one another’s expertise

  • Clear mission for student achievement with a focus on results
  • The umbrella under which we collaborate to create common assessments and subsequent interventions and enrichments

Adapted from Cox, 2000; Dembo, 2007; DuFour, DuFour, & Eaker, 2006; Fisher & Frey, 2008; Johnson, 2002; Marzano, 2003, 2006, 2010; Reeves, 2006, 2010; Stronge, 2005; Johnson, Kropp.

Revised 8/18/2010