SRCL National Performance Report: 2014–15


EDTASS: Striving Readers Comprehensive Literacy (SRCL)

U.S. Department of Education, Office of Elementary and Secondary Education

5.2 – National Performance Report: 2014–15

September 2016

Prepared for:

U.S. Department of Education, Office of Elementary and Secondary Education

400 Maryland Avenue, SW

Washington, DC 20202

Submitted by:

Applied Engineering Management Corporation

13880 Dulles Corner Lane, Suite 300, Herndon, VA 20171

Contract Number: ED-ODS-12-A-0019

Document Number: 5.2

Document Classification: Unclassified, For Official Use Only


Executive Summary

In 2011, the U.S. Department of Education awarded grants to six state education agencies (SEAs) through a national Striving Readers Comprehensive Literacy (SRCL) discretionary grant competition to establish and support a state literacy team and to assist the states in developing a comprehensive literacy plan. Awarded SEAs included the Georgia Department of Education, the Louisiana Department of Education, the Montana Office of Public Instruction, the Nevada State Department of Education, the Pennsylvania Department of Education, and the Texas Education Agency.

Each SEA is required to provide leadership activities and to award at least 95 percent of its grant funds to local education agencies or other eligible entities. SEAs distribute subgrantee awards, as required by the enabling legislation, for services and activities related to effective literacy instruction. These services and activities include professional development, screening and assessment, targeted interventions for students reading below grade level, and other research-based methods of improving classroom instruction and practice.

In addition, the U.S. Department of Education awarded five set-aside grants, reserved by Congress from the appropriation, for literacy instruction programs run by the Bureau of Indian Education (BIE) and four outlying areas (American Samoa, the Commonwealth of the Northern Mariana Islands, Guam, and the Virgin Islands).[1]

This SRCL National Performance Report explores the implementation and outcomes for the 2014–15 grant performance year. The findings for SEAs that received discretionary grant awards and BIE and outlying areas that received set-aside awards are presented in separate sections, since the differences in the size, scope, and funding in the two types of grants would limit meaningful comparisons of results.

Key Findings

SRCL programs are serving disadvantaged student populations.

  • During the 2014–15 year, 78 percent of students served in SRCL schools in each state were identified as disadvantaged. Grantees defined disadvantaged student populations differently based on the needs and characteristics of their respective states. However, disadvantaged students typically included economically disadvantaged students, students with limited English proficiency, and students with disabilities.

All grantees are making progress in implementing their SRCL program goals through their state comprehensive literacy plan.

  • All SEAs made progress in the implementation of their state comprehensive literacy plans, with the majority working to refine or improve future implementation activities. Grantees continually updated their state literacy plans to match changing state standards (i.e., college and career readiness standards). One grantee is vertically aligning its literacy plan to enhance common language, goals, and developmentally continuous benchmarks for literacy from birth through grade 12.
  • SEA grantees fostered the development of district- and school-level literacy plans, allowing strategies and programs to be tailored to the needs of the subgrantees.
  • In two states, the state comprehensive literacy plan created using SRCL funds has become part of larger literacy initiatives. One grantee’s state literacy plan has been adopted into state legislation and is now being disseminated throughout the state. Another grantee led the process of creating state preschool program standards.
  • Set-aside grantees are working toward implementing activities from their current plans, including principal and teacher professional development, technical assistance for program implementation, collaboration with childcare programs, resources for families, and ongoing student assessment.

All grantees provide systemwide professional development focused on evidence-based methods to improve instructional practice.

  • SEA grantees offered workshops and leadership modules to a range of stakeholders, such as teachers, school leaders, coaches, early childhood education providers, and parents.
  • High-quality, targeted, and multifaceted professional development included trainings, conferences, workshops, online modules, weekly electronic office hours, implementation coordinators, and learning institutes, as well as monitoring, meetings, reporting, communication, mapping, and a best practices guide. Professional development was provided in different formats and conducted both online and in person, with some activities occurring onsite.
  • Grantees customized professional development activities to the needs of the subgrantees and their student populations. For example, one grantee provided targeted professional development for teachers and school leaders who work with students with limited English proficiency. Activities involved analyzing data from the state English language proficiency system, understanding the information provided, and learning how it could assist with instruction.
  • Set-aside grantees implemented high-quality, job-embedded professional development activities, including programs targeted toward improving literacy among English language learners, developing authentic literacy and learning readiness skills, educating teachers and school leaders on the Common Core State Standards, and using assessment data to inform instruction.

All SRCL grantees reported increased data-based decision-making.

  • SEA activities included using data-based decision-making teams, providing guidance on how to use data for system change and improvement, monitoring data trends over time, training on assessments, creating tools for accuracy and consistency of data submissions, and creating data profiles. Grantees used data-based decision-making to improve school readiness, meet the academic needs of students, and increase literacy skills across the continuum.
  • Set-aside grantees encouraged the use of short-cycle, formative, curriculum-based assessment tools, including I-Station, which assessed students on a monthly basis; a minimal sketch of this kind of monthly data review follows this list.
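The sketch below illustrates, at a very small scale, what reviewing short-cycle assessment data for decision-making can look like: it tracks each student's monthly scores and flags those still below a benchmark. The field names, benchmark score, and records are invented for illustration and do not represent I-Station's actual data format or any grantee's data.

```python
# Hypothetical sketch of monthly formative-assessment review (invented data).
from collections import defaultdict

# Each record: (student_id, month, scaled_score) from a monthly screener.
records = [
    ("S01", "2014-09", 182), ("S01", "2014-10", 190), ("S01", "2014-11", 201),
    ("S02", "2014-09", 170), ("S02", "2014-10", 168), ("S02", "2014-11", 172),
]
BENCHMARK = 195  # hypothetical "on track" cut score

scores_by_student = defaultdict(list)
for student_id, month, score in sorted(records):
    scores_by_student[student_id].append(score)

for student_id, scores in scores_by_student.items():
    latest = scores[-1]
    change = latest - scores[0]  # change since the first administration
    status = "on track" if latest >= BENCHMARK else "flag for intervention"
    print(f"{student_id}: latest={latest}, change={change:+d}, {status}")
```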

SRCL grantees are implementing technology in their literacy programs.

  • SEA grantees are using technology to increase student engagement and teacher effectiveness. Activities include online data management systems, district technology fairs, and online professional development learning modules. Four grantees created online professional development that is available to all SRCL teachers, as well as to teachers who are not part of the program. Topics provide an introduction to and overview of each state’s literacy plan and are geared to specific grade levels.
  • Three of the five set-aside grantees have purchased technology devices (e.g., iPads, computers, and Kindles) through SRCL to help schools implement their programs effectively. However, challenges with technology infrastructure (e.g., bandwidth) remain for some grantees.

Four-year-old children made progress in their oral language skills from fall to spring, with all SEA grantees increasing the percentage of children making significant gains from the previous year.

  • During 2014–15, the percentage of four-year-old children achieving significant gains in oral language skills ranged from 14 to 89 percent; only three grantees achieved significant gains for more than 50 percent of children.

Grantees demonstrated improvement in the oral language skills of disadvantaged student populations.

  • Two grantees showed increases in the percentage of disadvantaged four-year-old children who made significant gains in oral language skills from the previous year.
  • Two grantees showed increases in the percentage of four-year-old children with limited English proficiency who made significant gains in oral language skills from the previous year.
  • Five grantees showed increases in the percentage of four-year-old children with disabilities who made significant gains in oral language skills from the previous year.
  • One grantee reported that its disadvantaged student populations had higher percentages achieving significant gains than the overall four-year-old group.

SEA student performance data demonstrated mixed results for 5th grade, 8th grade, and high school students when compared to the previous year.[2]

  • Every SEA grantee experienced a change in the state assessment over the course of the grant cycle that prevents an observation of trends from 2011–12 to 2014–15. Because of changes in assessments, comparisons are made only for grantees that have multiple years of data with a single assessment (a minimal sketch of this comparison logic follows this list).
  • The percentage of 5th grade students scoring proficient in reading during the 2014–15 year ranged from 29 percent to 78 percent. Only two grantees have multiple years of data. One grantee demonstrated an improvement in 5th grade reading proficiency from 2013–14 to 2014–15 (a growth of 7 percentage points) whereas the other grantee demonstrated a decrease of 4 percentage points.
  • The percentage of 8th grade students scoring proficient in reading during the 2014–15 year ranged from 32 percent to 75 percent. Only two grantees have multiple years of data. One grantee demonstrated an improvement in 8th grade reading proficiency from 2013–14 to 2014–15 (a growth of 8 percentage points) whereas the other grantee demonstrated a decrease of 6 percentage points.
  • The percentage of high school students scoring proficient in reading during the 2014–15 year ranged from 27 percent to 91 percent. Only three grantees have multiple years of data. Two grantees demonstrated an improvement in reading proficiency in high school from 2013–14 to 2014–15 (a growth of 11 and 4 percentage points, respectively) whereas the other grantee remained the same.
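As a concrete illustration of the comparison rule noted above, the sketch below computes a year-over-year change in the percentage of students scoring proficient only when a grantee has two years of results on the same assessment; otherwise no trend is reported. Grantee names, assessment names, and percentages are invented and do not correspond to any actual grantee.

```python
# Hypothetical sketch of the percentage-point comparison (invented data).
results = {
    # grantee: {year: (assessment_name, percent_proficient)}
    "Grantee A": {"2013-14": ("State Test v1", 45.0), "2014-15": ("State Test v1", 52.0)},
    "Grantee B": {"2013-14": ("Old Test", 60.0), "2014-15": ("New Test", 58.0)},
}

for grantee, years in results.items():
    prior, current = years.get("2013-14"), years.get("2014-15")
    if prior and current and prior[0] == current[0]:
        change = current[1] - prior[1]  # percentage-point change, not percent change
        print(f"{grantee}: {change:+.0f} percentage points on {current[0]}")
    else:
        print(f"{grantee}: assessment changed; no trend reported")
```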

Grantees demonstrated improvement in the oral language skills and reading proficiency of disadvantaged student populations.[3]

  • Disadvantaged Students: Two grantees showed increases in the percentage of 5th and 8th grade students scoring proficient in reading from the previous year. Four grantees showed increases in the percentage of high school students scoring proficient in reading from the previous year.
  • Students with Limited English Proficiency: Two grantees showed increases in the percentage of 5th grade students and high school students scoring proficient in reading from the previous year. Only one grantee showed increases in the percentage of 8th grade students scoring proficient in reading from the previous year.
  • Students with Disabilities: Only one grantee showed increases in the percentage of 5th and 8th grade students scoring proficient in reading from the previous year. Two grantees showed increases in the percentage of high school students scoring proficient in reading from the previous year.

Grantees are engaging in activities to sustain the SRCL program.

  • SEA grantees have conducted outreach and dissemination efforts, developed and refined progress monitoring tools and processes, and strategically aligned themselves with existing programs and new initiatives.
  • SEA grantees have focused on capacity building through professional development, increased use of technology, and the refinement and delivery of literacy resources and tools. Through the use of online modules, tools, or annual institutes, all SEA grantees sought to build capacity among administrators and staff.

Lessons Learned

As part of their annual performance reports (APRs), the grantees described activities that were successful during implementation of the SRCL program, along with potential challenges.

Dissemination activities are critical in building broad and collaborative system-wide support.

  • SEAs noted family and community support activities, integration of SRCL activities across state projects, innovation awards, and alignment of curriculum as efforts to build broad and system-wide support for the SRCL program.
  • Set-aside grantees reported collaborating with other programs to provide effective professional development and establishing literacy councils as steps taken to increase the visibility of literacy in the community.

Building administrator and teacher capacity through the use of tools, literacy coaches, and professional development is important.

  • SEA grantees found that engaging teachers directly in the creation of tools and working with literacy coaches played a critical role in implementing SRCL activities.
  • One set-aside grantee reported success in implementing professional development activities in restructuring schools. Observations by service providers and principals showed evidence of shifts in classroom instruction stemming from the job-embedded professional development provided by the SRCL program.

Grantees highlighted successes in implementing data-based decision-making activities that benefited districts, schools, and students.

  • SEAs reviewed the practices and implementation choices of highly successful schools, highlighted collaborative approaches to data-based decision-making, and established data retreats for subgrantees to learn how to better interpret and use data to refine and assess their local SRCL projects.

Use of up-to-date technology, establishment of school leadership teams, ongoing revisions to state literacy plans, and annual leadership summits can all support administrators and teachers.

  • To support administrators and teachers in implementing SRCL activities, SEAs upgraded schools’ technology infrastructure, established school leadership teams, made ongoing revisions to their state literacy plans, and held annual leadership summits.
  • Three of the five set-aside grantees purchased technology devices, such as iPads, Kindles, and computers, through SRCL to help schools implement their programs effectively.

Recommendations

The Literacy Education for All, Results for the Nation (LEARN) program is built upon the current SRCL program. Like SRCL, LEARN targets disadvantaged student populations, including students with limited English proficiency and students with disabilities. It also includes an emphasis on evidence-based programs and a focus on early childhood education. As the Department moves forward, the AEM team has some recommendations based on the current SRCL program evaluation activities:

Revise current SEA annual performance reporting efforts to include disaggregated Government Performance and Results Act (GPRA) targets.

  • Current GPRA reporting includes only program-level data, making it difficult to identify improvements at the school level.
  • Many SEAs have multiple cohorts of SRCL subgrantees, and current reporting efforts make it difficult to disaggregate data by cohort. Currently, different cohorts are combined into a single performance target. Disaggregation would allow the review of progress at different points of implementation.
  • In addition to reporting the percentage of students proficient, SEAs should include the percentage of students transitioning from one level to the next (e.g., from basic to advanced). One grantee reported changes in cut scores that affected proficiency levels. A minimal sketch of the cohort disaggregation and the transition metric follows this list.
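The sketch below works through the two reporting changes recommended here on invented data: it disaggregates a proficiency measure by cohort rather than pooling cohorts into a single target, and it reports the percentage of students who moved up at least one proficiency level between years. The cohort labels, proficiency levels, and records are hypothetical and not drawn from any grantee's reporting.

```python
# Hypothetical sketch of cohort disaggregation and a level-transition metric.
from collections import defaultdict

LEVELS = ["below basic", "basic", "proficient", "advanced"]
RANK = {level: i for i, level in enumerate(LEVELS)}

# Each record: (cohort, prior_year_level, current_year_level) -- invented data.
records = [
    ("Cohort 1", "basic", "proficient"),
    ("Cohort 1", "proficient", "proficient"),
    ("Cohort 2", "below basic", "basic"),
    ("Cohort 2", "basic", "basic"),
]

by_cohort = defaultdict(list)
for cohort, prior, current in records:
    by_cohort[cohort].append((prior, current))

for cohort, pairs in sorted(by_cohort.items()):
    n = len(pairs)
    proficient = sum(RANK[curr] >= RANK["proficient"] for _, curr in pairs)
    moved_up = sum(RANK[curr] > RANK[prior] for prior, curr in pairs)
    print(f"{cohort}: {proficient / n:.0%} proficient or above, "
          f"{moved_up / n:.0%} moved up at least one level")
```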

Streamline the reporting process.

  • SRCL program directors noted that the APR and quarterly monitoring report process was cumbersome and in many instances duplicative.
  • One grantee recommended an online form where the program director could access the report from a previous quarter or year and modify it as necessary, rather than re-entering the data.

Encourage SEAs to revise and resubmit previously reported performance data.

  • Several SEAs reported concerns or inaccuracies with previously submitted data, leading to inaccurate interpretations from one year to the next. Allowing for revision and resubmission of data from previous years would contribute to accurate and up-to-date assessments of program performance.
  • Additionally, this would allow for follow-up with states regarding data issues and for explanations of dramatic decreases in GPRA measures from one year to the next.

Promote the use and reporting of localized assessments to ensure continuity of data and to provide accurate achievement results for students with disabilities.

  • Each SEA experienced a change in the state assessments, making it difficult to compare student performance data over time. The use of local assessments or alternate data sources may allow for more consistent data and evidence of growth and improvement.
  • Across SEA grantees, achievement gaps were often largest for children with disabilities. Some students with disabilities are able to complete an assessment with accommodations, while others require an alternate assessment. These types of assessments are often not comparable. For example, one grantee replaced the modified state assessment in 2014–15 with a version that only allowed accommodations. The percentage of students with disabilities who met proficiency was 8 to 22 percentage points lower than the previous year. Alternate data sources would allow grantees to monitor literacy achievement of students with disabilities with greater accuracy.

Include measures of English language proficiency for limited English proficient students.

  • SEA performance data also showed achievement gaps for limited English proficient students, and these gaps widened as grade level increased. Moving forward, grantees could include measures of English language proficiency, such as Assessing Comprehension and Communication in English State-to-State for English Language Learners® (ACCESS for ELLs). An English language proficiency measure provides information about proficiency in listening, speaking, reading, and writing.
  • A measure of English language proficiency is important since LEARN includes literacy across content areas. Research has shown that the linguistic complexity of assessments may interfere with limited English proficient students’ ability to present a valid picture of what they know and are able to do.[4]

Encourage grantees to provide targeted technical assistance for disadvantaged student groups using evidence-based practices and/or programs.

  • Achievement gaps were largest for students with disabilities and students with limited English proficiency. Technical assistance activities should provide evidence-based strategies and/or programs for working with different student populations.
  • The Every Student Succeeds Act (ESSA) and LEARN place an emphasis on “evidence-based” interventions. This involves identifying evidence-based interventions for school improvement. However, some grantees or subgrantees may not have the capacity to determine the levels of evidence as identified in the legislation. The What Works Clearinghouse provides evidence-based practice guides and evidence ratings for programs. Moreover, grantees can leverage the resources available from several federally funded centers, including the Comprehensive Centers, Regional Educational Laboratories, and Equity Centers, to identify evidence-based programs or to receive high-quality technical assistance.

Encourage rigorous evaluations of program efficacy.