BACKGROUND

Marcus Autism Center is a not-for-profit organization with a mission to provide information, services and programs to children with autism and related disorders, their families and those who live and work with them. Marcus Autism Center offers integrated advanced clinical, behavioral, educational and family support services through a single organization to reduce the stress for families that use our services.

The mission of the Marcus Autism Center Early Intervention program is to provide empirically based interventions for young children, 18 months to 8 years of age, who have been diagnosed with autism and related disorders. The goal of the Early Intervention program is to develop readiness skills, enabling children to transition successfully to community schools and kindergarten programs. With a child-to-staff ratio of 3:1, children work in individual, small-group and large-group environments to adequately prepare them for participating in community school settings.

The Early Intervention program uses applied behavior analysis (ABA) to document and evaluate each student’s progress in an individually designed intervention plan, including:

·  Training the essential introductory skills for learning.

·  Developing functional, spontaneous communication.

·  Developing social skills.

·  Developing functional and symbolic play skills.

·  Planning materials to develop cognitive skills.

·  Intervening in target problem behaviors.

·  Introducing academic skills.

·  Developing self-help and daily living skills.

The lead teacher in each classroom develops a weekly plan for each child and tracks progress at different levels daily, weekly and monthly. Progress reports are reviewed monthly by Center administrators. Accurate reporting of student achievement is critical to the Center in order to:

·  Gauge the effectiveness of the program overall (the Mission of the Center)

·  Continue to develop and improve the program

·  Understand the effectiveness of new programs and methods

·  Communicate the program’s effectiveness to key stakeholders to ensure future financial and volunteer support:

o  Student families

o  The State of Georgia Human Resources Department (public funding)

o  Private donors

o  Foundations

o  Potential new employees

In September 2009, the director of Early Intervention requested percent mastery for all student goals from the school year 2008-2009. Classrooms 1 and 2 had similar results in overall percent mastery. However, Classroom 3 had a significantly lower percentage. The discrepancies appeared at the same time a new teacher began work at the Center; therefore, they may indicate an administrative – rather than educational – issue to investigate and resolve.

The purpose of this Six Sigma project is to identify the source of the reporting discrepancies and develop tools to minimize or eliminate them in the future.

First, a thorough search of the Georgia Department of Education website was conducted to determine whether there were any standards for the storage and/or presentation of academic skills data. The search yielded no results. Next, a similar search was done on the Emory Autism Center website. This organization runs a program, the Walden Early Childhood Center, which is similar in structure to the Marcus Autism Center. This search also yielded no results.

A final general Google search for “Autism and Data Storage” was conducted. An abstract of a study currently in progress was found in which investigators are comparing data collection via mTrial to “pen and paper” data collection. When that study is complete, the possibility of electronic storage and analysis of student data could be investigated at the Marcus Autism Center. For now, the Marcus Autism Center Early Intervention program appears to be unique enough that no standards have been established for data storage and presentation.

DEFINE

Two process maps were created. The first documents the process teachers used to set goals for students (Figure 1).

The Director of the program has set Basic Goals to be achieved by all students, based on her specialized teaching experience. These are baseline goals and are not intended to be educationally representative or inclusive. Students typically achieve these basic goals early in their time at the center.

Issue 1: After basic goals are mastered, there is a lack of Standard Operating Procedures for how to select subsequent goals.

Result: An arbitrary selection of new goals with no uniformity of process between classrooms. Inexperienced teachers create goals with no clear purpose, e.g. “Giving a thumbs-up.” When goals do not fall within an educational category, one cannot measure student progress in broad areas such as math, communication, etc.

The second map documents the process teachers used to track student progress, and that administrative staff used to summarize student achievement into organizational metrics for reporting (Figure 2).

Issue 2: Inconsistent reporting between classrooms.

Result: Attempts to summarize classroom achievement levels were complicated by inconsistent documentation of the current status of goals, the number of goals worked, and the number of goals mastered. Results between classrooms differed dramatically; it was suspected that simple hand-tallying of goals was the reason for the apparent difference in performance between classrooms.

Table 1 lists the Key Output and Key Input Variables of interest in this study.
Figure 1: Process Map of Goal Selection for Students


Figure 2: Process Map of Student Goal Mastery Tracking by Teachers


Table 1: KOVs and KIVs

Key Output Variables:
·  Consistent Reporting per Student
·  Consistent Reporting per Classroom
·  Consistent Goal Definitions

Key Input Variables:
·  Reference tables
·  Standard Operating Procedures
·  Training
·  Standardized Spreadsheets
·  Linked Excel Tables

Preliminary Analysis:

Currently, student achievement plans and documentation of student development are not standardized. Teachers track plans daily on paper in order to know which tasks to work on and with which students.

The daily task achievement levels are updated weekly in a spreadsheet which summarizes the week’s activity. In turn, weekly activities are summarized monthly, and any mastering of tasks is noted and associated with a broader area of achievement. However, because the daily tasks are not standardized or directly associated with broader areas of achievement, this analysis is open to the interpretation of the analyst. This may be a source of discrepancies between classroom achievement levels. Also, the spreadsheets do not automatically summarize any information; goals are counted by hand.
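As a sketch of what automatic summarization could look like, the hand-counting described above can be scripted once goal statuses are standardized. The field names and status labels below are hypothetical, since no standard format existed at the time:

```python
# Hedged sketch: automatic goal summary from a standardized status column.
# The status labels ("Current", "Mastered", "N/A") are hypothetical; the
# Center's actual spreadsheets were not standardized this way.

def summarize_goals(rows):
    """Count goals worked and mastered; compute percent mastery.

    rows: list of (goal, status) tuples, one row per (sub-)goal.
    """
    worked = sum(1 for _, s in rows if s in ("Current", "Mastered"))
    mastered = sum(1 for _, s in rows if s == "Mastered")
    pct = 100.0 * mastered / worked if worked else 0.0
    return {"worked": worked, "mastered": mastered,
            "percent_mastery": round(pct, 1)}

# One hypothetical student's goals; "N/A" (not yet started) is excluded.
sheet = [("Match colors", "Mastered"),
         ("Count 1-10", "Current"),
         ("Request item", "Mastered"),
         ("Sort shapes", "N/A")]
summary = summarize_goals(sheet)
# summary == {"worked": 3, "mastered": 2, "percent_mastery": 66.7}
```

With one row per sub-goal and a single status vocabulary, the counts no longer depend on the reader's interpretation.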

The baseline analysis by teachers of classroom achievement levels using existing spreadsheet data is displayed in Table 2. (A re-classification of tasks indicated a much higher level of achievement for Classroom 3 than what is shown.)

Table 2: Percent Mastery by Classroom for School Year 2008-2009, Initial Summary

Classroom / Percent Mastery
Classroom 1 / 54.1%
Classroom 2 / 58.5%
Classroom 3 / 8.2%

MEASURE

In order to determine the source of the reporting discrepancies, a gage R&R study was undertaken. Four appraisers were selected. One person who had analyzed several spreadsheets before was considered to be the “reference.” Appraisers had varying levels of experience with the tracking spreadsheets. The tracking spreadsheets were considered to be the measurement tool, as they contained the data for each student’s goal mastery. One student spreadsheet was selected randomly. Appraisers were asked to summarize:

1.  Total number of goals worked

2.  Total number of goals achieved

These were considered to be the “parts” measured by the appraisers and the tool (the spreadsheet). Each appraiser measured the two parts twice; the two trials were taken days apart so that appraisers would not recall their measurements from Trial 1.


According to the Gage R&R analysis in Table 3, the MCI1 index for this study is 67%, clearly in the unacceptable range.

Table 3: Gage R&R Range Method Estimates for Goal Mastery Tracking Variation Source

Source / s-hat / Study Variation (SV = 6.0 × s-hat) / % of Total Variation (100 × s-hat / s-hat total)
% R&R / 34.9589 / 209.753 / MCI1 = 66.69%
% Repeatability / 3.0414 / 18.248 / 5.80%
% Reproducibility / 34.8264 / 208.958 / 66.44%
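The study-variation and percentage columns of Table 3 follow directly from the standard-deviation estimates; a quick arithmetic check (values taken from the study results):

```python
# Arithmetic check of Table 3 (s-hat values taken from the study results).
rr_sd = 34.9589      # s-hat, combined repeatability & reproducibility
repeat_sd = 3.0414   # s-hat, repeatability
reprod_sd = 34.8264  # s-hat, reproducibility
total_sd = 52.4202   # s-hat total (total study variation)

def study_var(sd):
    """Study variation: 6.0 standard deviations."""
    return round(6.0 * sd, 3)

def pct_total(sd):
    """% of total variation: 100 x s-hat / s-hat total."""
    return round(100.0 * sd / total_sd, 2)

sv_rr = study_var(rr_sd)           # 209.753
mci1 = pct_total(rr_sd)            # 66.69 -- the MCI1 ("% R&R") index
pct_repeat = pct_total(repeat_sd)  # 5.8
pct_reprod = pct_total(reprod_sd)  # 66.44
```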

According to the ANOVA analysis in Table 4, the only significant factor in the gage analysis is the interaction between operator and part number. By far, the interaction term accounts for most of the variation in the model. This interaction is understandable since the spreadsheets were not standardized in any way and were not automatically calculating anything. Appraisers used the tool differently to count parts because no standardized way of measuring was in place and the spreadsheet was not error-proofed.

The Gage R&R summary graphs (Figures 3, 4) indicate that the measurement variation was highest for Part 1 – total goals worked. This indicates that the tracking spreadsheets were not formatted consistently, making it difficult to determine the number of goals worked, and therefore the % achievement. The primary source of the discrepancy between classroom achievement scores, it seems, is significantly different calculations of total goals worked – the denominator of the critical “% Achievement” metric.


Table 4: Gage R&R Study of Student Achievement Sheets

ANOVA: Obs Measure versus Operator, Part number

Factor Type Levels Values

Operator random 4 A, B, C, Ref

Part number random 2 1, 2

Analysis of Variance for Obs Measure

Source DF SS MS F P

Operator 3 6946.0 2315.3 0.95 0.516

Part number 1 14641.0 14641.0 6.01 0.092

Operator*Part number 3 7305.0 2435.0 263.24 0.000

Error 8 74.0 9.3

Total 15 28966.0

S = 3.04138 R-Sq = 99.74% R-Sq(adj) = 99.52%

Source / Variance component / Error term / Expected Mean Square for Each Term (using unrestricted model)
1 Operator / -29.92 / 3 / (4) + 2 (3) + 4 (1)
2 Part number / 1525.75 / 3 / (4) + 2 (3) + 8 (2)
3 Operator*Part number / 1212.88 / 4 / (4) + 2 (3)
4 Error / 9.25 /  / (4)

Gage R&R

Source / VarComp / %Contribution (of VarComp)

Total Gage R&R 1222.13 44.48

Repeatability 9.25 0.34

Reproducibility 1212.88 44.14

Operator 0.00 0.00

Operator*Part number 1212.88 44.14

Part-To-Part 1525.75 55.52

Total Variation 2747.88 100.00

Source / StdDev (SD) / StudyVar (6*SD) / %StudyVar (%SV)

Total Gage R&R 34.9589 209.753 66.69

Repeatability 3.0414 18.248 5.80

Reproducibility 34.8264 208.958 66.44

Operator 0.0000 0.000 0.00

Operator*Part number 34.8264 208.958 66.44

Part-To-Part 39.0608 234.365 74.51

Total Variation 52.4202 314.521 100.00

Number of Distinct Categories = 1
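The variance components in the Minitab output above can be reproduced by hand from the mean squares. A minimal sketch, assuming the standard unrestricted-model expected-mean-square relations for a crossed two-factor random-effects design (the same relations listed in the output):

```python
# Reconstructing Table 4's variance components from the ANOVA mean squares,
# for a crossed random-effects design with a operators, b parts, n trials.
a, b, n = 4, 2, 2    # operators, parts, trials per operator-part cell
ms_op, ms_part, ms_int, ms_err = 2315.3, 14641.0, 2435.0, 9.25

var_err = ms_err                               # repeatability: 9.25
var_int = (ms_int - ms_err) / n                # Operator*Part number: 1212.88
var_op = max((ms_op - ms_int) / (b * n), 0.0)  # negative estimate -> 0.00
var_part = (ms_part - ms_int) / (a * n)        # Part-to-Part: 1525.75

gage_rr = var_err + var_int + var_op           # Total Gage R&R: 1222.13
total = gage_rr + var_part                     # Total Variation: 2747.88

pct_rr = round(100.0 * gage_rr / total, 2)     # %Contribution: 44.48

# Number of distinct categories = 1.41 * (part SD / gage SD), truncated
ndc = int(1.41 * (var_part ** 0.5) / (gage_rr ** 0.5))   # = 1
```

A number of distinct categories of 1 confirms that the measurement system cannot reliably distinguish one student spreadsheet from another.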


Figure 3: Minitab Gage R&R (Crossed) Range Method Analysis of Student Achievement

Figure 4: Bias Analysis for Goal Mastery


ANALYZE

Figure 5 displays an excerpt from a student’s Individualized Education Plan (IEP) before standardization. Investigation of the spreadsheets showed several issues that could cause variation.

Issue 1: Goals that have not been started yet (N/A) are included.

Result: IEPs are used as a general summary of the student’s progress. Including all possible goals that a student may work on during the school year makes the spreadsheet difficult to read and interpret.

Issue 2: Mastered and Current goals are mixed together.

Result: Presenting all goal statuses together makes it difficult to select only the goals which have been mastered versus goals on which the student is still working.

Issue 3: Combining sub-goals in one cell rather than separating them.

Result: Finding the number of goals on which the student is currently working versus the goals which have been mastered is left to the interpretation of the reader, so this metric would not be calculated consistently from appraiser to appraiser. For example, one appraiser may count each cell as one goal; in doing so, several sub-goals will not be counted, under-representing the total number of goals.

Figure 6 displays an excerpt from an Excel spreadsheet used to store and track data. Investigation of the spreadsheets revealed several issues that could cause variation.

Issue 1: Current and Cumulative percentages were not displayed.

Result: Constantly updating percentages is cumbersome and inefficient. Each time a percentage needs to be reported, the analyst must open the student’s Excel spreadsheet and manually calculate each one.

Issue 2: Sub-goals are grouped together.

Result: When sub-goals are grouped together it is difficult to accurately report the number of goals on which each student is working or has mastered. Similar to Figure 5, it can be interpreted differently by each appraiser.

Many of these causes of variation could be attributed to the lack of Standard Operating Procedures. There is no standard that details how the student Excel spreadsheet or IEP should be formatted. There are no clear operational definitions for when a goal is “mastered”; each teacher in each classroom measures achievement differently. There are no criteria for which types of goals to introduce after one goal is mastered. Teachers are choosing goals that are not clearly defined and are based on outside experience. Also, with no SOPs, reporting data is left to each teacher’s discretion.


Figure 5: Individualized Education Plan before standardization

Figure 6: Goal tracking before standardization

Simple Math Count and Compare (EXPRESSIVE)

Date / Session / BL / Post TX / Notes
1/12/2009 / 1 / 80% / #N/A / Cards 1-4
1/13/2009 / 2 / 89% / #N/A
1/21/2009 / 3 / 80% / #N/A
1/23/2009 / 4 / 100% / #N/A
1/26/2009 / 5 / 94% / #N/A
1/28/2009 / 6 / 75% / #N/A
2/2/2009 / 7 / 85% / #N/A
2/4/2009 / 8 / 90% / #N/A
2/6/2009 / 9 / 100% / #N/A
2/9/2009 / 10 / 100% / #N/A
2/13/2009 / 11 / 87% / #N/A / Cards 5-8
2/16/2009 / 12 / 95% / #N/A
2/18/2009 / 13 / 83% / #N/A

IMPROVE