NATIONAL CENTER ON ACCESSIBLE INSTRUCTIONAL MATERIALS

Status of State Systems for the Provision of NIMAS/AIM in 2014

Executive Summary

2014 Survey of State Education Agencies conducted by the

National Center on Accessible Instructional Materials

Published: November 2014

This document was produced under U.S. Department of Education, Office of Special Education Programs Grant No. H327T090001. Michael Slade served as the project officer. The views expressed herein do not necessarily represent the positions or policies of the Department of Education. No official endorsement by the U.S. Department of Education of any product, commodity, service or enterprise mentioned in this publication is intended or should be inferred. This product is public domain. Authorization to reproduce it in whole or in part is granted.

While permission to reprint this publication is not necessary, the citation should be—

National Center on Accessible Instructional Materials (November 14, 2014). Status of State Systems for the Provision of NIMAS/AIM in 2014. Wakefield, MA: National Center on Accessible Instructional Materials.

National Center on AIM at CAST, 40 Harvard Mills Square, Suite 3, Wakefield, MA 01880-3233

Voice: (781) 245-2212 TTY: (781) 245-9320 Fax: (781) 245-5212 Web: aim.cast.org

Status of State Systems for the Provision of NIMAS/AIM in 2014

Executive Summary

The Office of Special Education Programs (OSEP) of the U.S. Department of Education has charged the National Center on Accessible Instructional Materials (AIM Center) with developing and reporting national snapshots of the current status of state systems to ensure the timely provision of accessible instructional materials (AIM) to students with disabilities who require such materials. To inform these snapshots, the AIM Center developed a survey aligned to the seven areas of the Critical Components of Quality Indicators for the Provision of Accessible Instructional Materials (see Appendix A of the full report) as well as collaboration across all areas. The first administration of the survey took place in 2010, the second administration in 2012, and the final administration in 2014. A total of 52 respondents completed the 2014 survey, including designees from each of 49 states and designees from three additional educational entities. To ensure anonymity, all respondents are referred to as “states,” “respondents,” or “participants” throughout this report.

Purpose and Data Collection

The main purpose of the 2014 survey was to gather data to assist with the development of the snapshot by identifying areas of progress as well as areas of continuing challenge in the development and implementation of coordinated systems to ensure the timely delivery of AIM to students with disabilities who need such materials. Quantitative data were gathered by questions requiring a single response (forced choice) and by questions allowing the selection of all items that apply (multiple answers). Quantitative data were analyzed by tabulating the aggregate responses from the entire sample and calculating the percentage of responses received for each option in the response array. Responses were compared, when possible, across the 2010, 2012, and 2014 data sets to determine areas where change had occurred and where ongoing focused attention and support are needed.
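As a concrete illustration of the tabulation approach described above, the following minimal sketch (Python with pandas) shows how option percentages might be computed for a forced-choice item and for a select-all-that-apply item. The column names, option labels, and data are hypothetical, not the survey's actual items or analysis code; note that multiple-answer percentages can sum to more than 100% because respondents may select several options.

```python
import pandas as pd

# Hypothetical 2014 responses: one row per responding state or entity.
responses = pd.DataFrame({
    # Forced-choice item: each respondent selects exactly one option.
    "implementation_rating": [
        "much better", "better", "about the same",
        "better", "much better", "excellent",
    ],
    # "Select all that apply" item: each respondent may select several options.
    "categories_served": [
        ["IDEA + copyright"], ["IDEA + copyright", "IDEA only"],
        ["IDEA + copyright"], ["IDEA + copyright", "504 plans"],
        ["IDEA + copyright", "IDEA only"], ["IDEA + copyright"],
    ],
})

# Forced-choice tabulation: option percentages sum to 100%.
forced_choice_pct = (
    responses["implementation_rating"]
    .value_counts(normalize=True)
    .mul(100)
    .round(1)
)

# Multiple-answer tabulation: percentage of respondents selecting each option;
# totals can exceed 100% because respondents may choose more than one option.
multi_answer_pct = (
    responses["categories_served"]
    .explode()
    .value_counts()
    .div(len(responses))
    .mul(100)
    .round(1)
)

print(forced_choice_pct)
print(multi_answer_pct)
```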

Analysis and Reporting

Data from the 2014 administration of the survey were analyzed and reported in two groups: 1) all states and 2) a disaggregated subset of 10 states receiving intensive targeted technical assistance (TTA) from the AIM Center. To focus attention on the national picture of AIM/NIMAS implementation, the first group consists of all 52 entities that responded to the survey, including the 10 states that received intensive TTA. Data submitted by the total sample in 2014 were compared to data submitted by the total sample in the 2010 and 2012 survey administrations. This group is referred to in the report as “All states.” To assist with determining the impact of intensive TTA, the second group includes only the disaggregated data submitted by the 10 states that have received intensive TTA from the AIM Center. Data submitted by these 10 states in 2014 were compared to data submitted by the same states in 2010 and 2012. This group is referred to in the report as “AIM TTA states.” Data sets for each group were primarily compared within the group across the three administrations of the survey. However, when pertinent, differences between “All states” and “AIM TTA states” are also noted in the report.
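A minimal sketch of the disaggregation logic described above is shown below, again in Python with pandas and using entirely hypothetical state identifiers, years, and a hypothetical yes/no item; it is not the report's actual data set. It computes the percentage of "yes" responses by survey year for the full sample and then for the TTA subset only, so the two trends can be compared within each group across administrations.

```python
import pandas as pd

# Hypothetical long-format survey data: one row per respondent per administration.
survey = pd.DataFrame({
    "state_id":          ["A", "B", "C", "A", "B", "C", "A", "B", "C"],
    "year":              [2010, 2010, 2010, 2012, 2012, 2012, 2014, 2014, 2014],
    "collects_aim_data": [False, False, True, False, True, True, True, True, True],
})

aim_tta_states = {"A", "B"}  # hypothetical intensive-TTA subset

def pct_yes(frame: pd.DataFrame) -> pd.Series:
    """Percentage of respondents answering 'yes' to the item, by survey year."""
    return frame.groupby("year")["collects_aim_data"].mean().mul(100).round(1)

# "All states": every responding entity, compared across 2010, 2012, and 2014.
all_states_trend = pct_yes(survey)

# "AIM TTA states": only the disaggregated TTA subset, compared across the same years.
tta_trend = pct_yes(survey[survey["state_id"].isin(aim_tta_states)])

print(all_states_trend)
print(tta_trend)
```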

Summary of Key Findings

Data across the three survey administrations suggest that progress has been made by states in the development of coordinated systems for the timely provision of AIM to students with disabilities. In some areas, particular progress is apparent among the states that have received targeted technical assistance (TTA) from the AIM Center over the past four years. In 2014, high percentages of respondents in both the total sample (75%) and the disaggregated AIM TTA states group (100%) reported that their state’s current implementation of AIM is much better or better than it was five years ago. While advancements have been made, there are still a number of areas in which additional work is necessary. It is both noteworthy and realistic that only a small percentage of respondents in the total sample (8%) and AIM TTA states (10%) reported that their state’s current implementation of AIM was excellent. Key areas of reported progress as well as areas in which additional work is needed are included in the Key Findings and Recommendations sections of the full Report and are highlighted here.

Notable Areas of Progress from 2010 to 2014

1. Increased reporting of efforts to serve a broader range of students who need AIM

Data submitted in 2014 suggest increased efforts are being made to provide AIM to a broader range of students, as indicated by notable increases in the following areas: categories of students served (see Table 3.1 and Table 3.2), learning opportunities topics (see Table 20.1 and Table 20.2), and allocation of resources (see Table 31.1 and Table 31.2). These data suggest that, while maintaining high levels of provision of AIM to students who are served under IDEA and meet copyright criteria, there is a growing awareness of the need to provide AIM to students who require accessible materials but do not meet copyright criteria, even when doing so is challenging. Because students who do not meet the copyright definition of “blind or other persons with disabilities” cannot receive AIM through the NIMAC or accessible media producers, the purchase of instructional materials that include features leading to wide usability across the full range of student variability becomes extremely important.

2. Increased collaboration with the NIMAC, AMPs, and state AT agencies

Data submitted in 2014 suggest that progress has been made with respect to collaboration between the states and the NIMAC and AMPs. Responses indicate that states have named more NIMAC authorized users (AUs) over time (see Table 7.1 and Table 7.2). Similarly, higher percentages of respondents in 2014 than in 2010 reported having named Bookshare and/or Learning Ally as NIMAC authorized users (see Table 8.1 and Table 9.1). Additionally, the percentage of respondents reporting that their state uses quota funds and additional APH services increased from 2010 to 2014 (see Table 11.1). It is also noteworthy that, from 2010 to 2014, there was an increase in the number of states reporting collaboration with their state assistive technology (AT) agency in both the total group (see Table 13.1) and the AIM TTA states group (see Table 13.2). This finding is important because IDEA requires states to work collaboratively, to the maximum extent possible, with the state agency responsible for AT programs. Moreover, the effective use of most specialized formats requires technology for students to perceive and interact with the content.

3. Increased provision of learning opportunities with respect to AIM

Data submitted in 2014 show increases in both the types of learning opportunities offered and the range of topics identified (see Table 19.1 and Table 20.1). This is important because high-quality learning opportunities can help facilitate the effective provision of AIM to students who need such materials by building the capacity of key stakeholders. It is noteworthy that all respondents in the AIM TTA group reported providing learning opportunities on the following five topics: 1) awareness of statutory requirements and limitations, 2) identification of the need for AIM, 3) identification of students with print disabilities as defined by copyright law, 4) selection of specialized formats and tools that address student needs, and 5) acquisition of AIM for students who qualify as having a print disability as defined by copyright statute. Furthermore, for each of the three survey administrations, the AIM TTA states reported providing learning opportunities on the acquisition of AIM for students who need such materials but do not meet copyright criteria.

Additional Areas in Which the AIM TTA States Reported Progress

1. Collaboration with general education departments

Because the instructional materials that form the basis of AIM are intricately connected to the general education curriculum, it is important for state AIM systems to include collaboration between special education and general education. Data submitted in 2014 show that high percentages of the AIM TTA states collaborate with various general education departments—namely, curriculum and instruction, instructional and information technology, materials procurement, and assessment (see Table 5.2).

2. Collaboration with families

The data show that from 2010 to 2014 there was an increase in the percentage of respondents in the AIM TTA states who reported that their state shares information, shares training calendars, and conducts joint training with their Parent Information Center (see Table 12.2). The AIM TTA states also demonstrated an increase over time in the percentage of respondents reporting that their state disseminates written guidelines to families (see Table 16.2) and the percentage of respondents reporting that they provide learning opportunities to families (see Table 18.2).

Areas in Which Greater Progress Is Needed for All States

1. Collection and Use of AIM-Related Data

A key facet of ensuring that coordinated systems for the timely provision of AIM are working as expected involves the collection and use of data to identify areas of concern and to improve AIM-related services and activities. Examination of data across the three survey administrations reveals that states continue to struggle with the collection and use of AIM-related data. Three areas were identified as needing improvement: the collection of data, the type of system used to collect data (see Table 23.1 and Table 23.2), and the type of data being collected (see Table 24.1). In 2014, while it is promising that a higher percentage of states indicated that some data are being collected, responses also indicated that those data are seldom integrated into the broader statewide student data collection system, making it difficult to determine the equity of AIM-related activities. Additionally, the types of data with the lowest reported rates of collection include changes in achievement for students who have AIM, the quality of AIM, and the use of AIM to improve learning (see Table 24.1), all of which are important to using AIM to improve access to learning materials and learning outcomes.

2. Incorporation of AIM into the State Systemic Improvement Process

In order for SEAs to move toward the scaling up of AIM-related systems and procedures, it is critical for them to understand the connection between AIM and other areas that are being addressed as part of the SSIP process (e.g., graduation rates, assessment). Data show that many states have not incorporated AIM into the SPP/APR process. This finding suggests that states either have limited awareness of the relationship between the provision of AIM and various SPP indicators (see Table 27.1 and Table 27.2) or lack the ability to collect and analyze relevant data. Similarly, respondents in the total sample and AIM TTA states demonstrated limited knowledge of whether and how AIM might be included in the SSIP process, also known as Indicator 17 (Part B) and Indicator 11 (Part C) (see Table 28.1 and Table 28.2).

3. Preference given to publishers who offer accessible materials

Data indicate that states could be doing more to provide incentives for the development and provision of accessible learning materials for purchase. Although respondents in the total sample and in AIM TTA states reported that their state provides guidance to LEAs about the need for purchasing contracts to include the requirement that NIMAS files be deposited in the NIMAC, a much smaller percentage reported that the SEA gives preference or recommends that LEAs give preference to publishers who offer accessible versions of print materials for purchase (see Table 14.1 and Table 14.2). Given the importance of accessibility considerations as part of procurement decisions, it is disappointing that only one third of respondents in the total sample in 2014 reported that the general education department “Materials Procurement” was involved in their state’s AIM-related activities (see Table 5.1). It is critical for SEAs to emphasize the importance of accessibility considerations as part of purchasing decisions that are made at all levels with respect to the acquisition of both curricular materials and technology.

Recommendations

Data collected over the course of the three survey administrations affirm that the investments of OSEP and the work of the AIM Center and related projects have supported SEAs and LEAs in the development of improved, coordinated systems for the selection, acquisition, and use of high-quality accessible educational materials in a timely manner. While progress has been made, it is clear that more work needs to be done. Data strongly suggest that ongoing assistance is needed to help SEAs continue to improve their AEM systems in a manner that meets the needs of a diverse group of stakeholders, including students and families, teachers, LEAs, educational publishers, software developers, accessible media producers, and distributors. Moreover, SEAs must learn how to adapt their AIM systems in the context of a rapidly changing landscape of educational materials.

On September 30, 2014, the AIM Center concluded its funding cycle, and the work of the National Center on Accessible Educational Materials for Learning (AEM Center) began. Building on the technical assistance and leadership provided by the AIM Center over the past five years, the new AEM Center will work to support stakeholders in scaling up their AEM-related implementation efforts with a particular focus on increasing the connection between the use of AEM and improved learning.

The remainder of this section presents a series of recommendations directed toward SEAs and LEAs, the new AEM Center, and OSEP. These recommendations are intended to facilitate continuation of the commitment to improving the quality and timeliness of delivery of AEM to all students who need such materials. In accordance with the change in language from “accessible instructional materials” (AIM) to “accessible educational materials” (AEM), all recommendations will reference materials as AEM.

Actionable Recommendations for SEAs and LEAs

The following recommendations for SEAs and LEAs will enhance their abilities to 1) build a comprehensive infrastructure for the timely delivery of high-quality AEM; 2) ensure that entities and individuals at all levels know their responsibilities related to the selection, acquisition, and use of high-quality AEM; 3) ensure that educators, families, and students are able to use the system to obtain and use high-quality AEM; 4) collect and use data that ensure the effectiveness of all services and inform the need for improvement; and 5) demonstrate how the use of high-quality AEM impacts student independence, participation, and achievement as critical indicators of college and career readiness and improved outcomes.

  • Continue to strengthen the implementation of a coordinated state system for the timely provision of AEM (Critical Component of Quality Indicator 1).
  • Improve and enhance the implementation of a comprehensive data system designed to ensure that students who need AEM receive them in a timely manner (Critical Components of Quality Indicators 2, 5, and 6).
  • Initiate a process to facilitate the inclusion of AEM into the state SSIP.
  • Include accessibility considerations as an important component of procurement processes for content and technology.
  • Improve the development and dissemination of written operational guidelines, learning opportunities, and technical assistance related to AEM.
  • Begin to explore more closely the connection between AEM and learning while providing support to students in developing the necessary skills to be able to advocate for their own AEM-related needs.

Actionable Recommendations for the AEM Center

Data suggest the following actions by the AEM Center will 1) support the efforts of administrators and AEM/NIMAS coordinators in the development of effective and efficient decision-making and distribution systems for AEM within their states; 2) assist key collaborators in the acquisition of the skills and knowledge needed to support the use of AEM across states, districts, and families; 3) increase the use of major AMPs for students meeting criteria for NIMAC-sourced materials; and 4) lead the development of new strategies for providing appropriate AEM-related services to all stakeholders from early learning through college and career.

  • Increase awareness of the Critical Components of Quality Indicators for the Provision of AEM to be used as a flexible tool to assist SEAs, LEAs, and others in the planning, implementation, and evaluation of coordinated systems for the timely acquisition and use of AEM.
  • Leverage the knowledge and skills of states and other entities that have made progress by working closely together to provide relevant, highly useful assistance to states that are struggling with AEM-related issues.
  • Provide the infrastructure and ongoing support needed to convene and sustain flexible, focused communities of practice that address a variety of AEM-related issues of importance to all states and other entities.

  • Work collaboratively with partners and others with knowledge and skills to develop high-quality, relevant, and useful products (e.g., learning opportunities, informational materials, guidance documents, checklists) and disseminate broadly through the Center’s website and other means.
  • Collaborate with data centers and researchers to identify useful and meaningful data to collect and use to determine the connection between the use of AEM and student outcomes.
  • Work closely with states and other entities to identify methods for collecting data relevant to determining the effectiveness of AEM systems, the use of AEM for learning, and the resulting impact on outcomes.
  • Collaborate with all federally funded NIMAS-/AEM-related projects and other providers so that guidance and services provided by all entities are aligned.
  • Support SEAs and LEAs in purchasing accessible learning materials and favoring publishers that offer accessible learning materials for purchase.

Actionable Recommendations for OSEP

OSEP has played an instrumental role in supporting the timely provision of accessible materials by investing in NIMAS- and AEM-related projects. The data in this report show that progress has been made as a result of these investments. In light of the fact that implementation of NIMAS/AEM varies widely across the nation, it is recommended that OSEP take the following actions.