Internal Audit Review

Oregon Department of Education

Key Performance Measures Data Reliability

Audit Committee received report on 8/10/09

John Hutzler, JD, CIA, CGAP, CCSA
Chief Audit Executive
Oregon Department of Education


Contents

Background

Objectives, Scope and Methodology

Objectives

Scope

Methodology

Results

Conclusion

Recommendations

Commendation

Management Response

Background

In 1993, the Legislative Assembly asked agencies to include benchmark-based planning in performance measurement and budget policy. In 2001, with the passage of House Bill 3358, the Progress Board, in collaboration with the Legislative Fiscal Office (LFO), the Office of the Secretary of State (SOS), and the Department of Administrative Services (DAS), was tasked with developing performance measure guidelines for state agencies. ORS 291.110(2)(a) states that each agency is required to develop key performance measures (KPMs) consistent with, and aimed at achieving, Oregon benchmarks, and shall “identify the mission, goals, and objectives of the agency and any applicable benchmarks to which the goals are directed.” KPMs are designed to help agencies accurately measure and report on key indicators.

All state agencies are required to propose a set of KPMs as part of their biennial budget request, and to report performance progress for approved KPMs in an Annual Performance Progress Report (APPR). The DAS Office of Budget and Management (BAM) maintains a KPM reporting system that state agencies use to report their KPMs as part of the biennial budget process. Oregon Administrative Rule 125-700-0050(5) requires each agency’s Chief Audit Executive to annually assess the integrity of the agency’s performance measurement system and to report the results to DAS as part of the agency’s risk assessment.

The Joint Committee on Ways and Means reviews and approves KPMs, and sets performance targets based on resources provided in an agency’s legislatively adopted budget. Because the Joint Committee on Ways and Means relies on reported KPM information during the budget process, the Legislative Fiscal Office was asked to conduct a review of KPM data quality.

LFO approached the Chief Audit Executive of DAS for assistance, and at her request, the Statewide Audit Advisory Committee (SAAC) formed a subcommittee to work with LFO and the BAM statewide performance measurement coordinator to define the scope and protocols for reviewing KPM data integrity.

The KPM system includes criteria for developing new KPMs and standardized reporting forms that have been refined over the last few biennia. State agencies are required to report the data sources for their KPMs; however, no standardized statewide processes are in place for periodically reviewing the data integrity of KPM information.

Objectives, Scope and Methodology

The subcommittee defined data integrity as comprising three elements: documentation, repeatability, and consistency. Verification of these elements focused on answering the following high-level questions:

  • Documentation: Are there appropriate documentation and controls in place to ensure consistency in reporting?
  • Repeatability: What conclusion might be drawn about the data quality of the data source, and can the data be accessed at any time to accurately replicate reported information?
  • Consistency: Was the methodology employed to calculate reported data appropriate and consistent over time?

The SAAC subcommittee took these high-level elements and developed a KPM audit/review template for evaluating data integrity. ODE internal audit used this template to conduct this KPM review. See Appendix A for a copy of the template.

The DAS Chief Audit Executive asked agencies with internal auditors to commit to reviewing 10-15% of their agency’s KPMs and to complete this work by September 30, 2008. ODE internal audit was unable to commit to that deadline due to previously scheduled audits in its 2008 audit plan. This review was included in the 2009 ODE Audit Plan approved by the ODE Audit Committee at its meeting on February 9, 2009.

Objectives

The objectives of this review were to determine the accuracy and reliability of the data reported in ODE’s APPR for fiscal year 2008 (FY2008), and to help ensure that the data reported in the APPR was:

  • Documented: appropriate information behind the measure exists;
  • Repeatable: the information can be accurately recreated; and
  • Consistent: the measure is reported the same way from year to year.

Scope

Agencies were asked to review 10 to 15 percent of their KPMs. In its FY2008 APPR, ODE reported on 25 KPMs. Internal audit selected three KPMs (12%) for inclusion in this review. I evaluated KPMs for inclusion in the review based upon the following factors:

  • Will ODE continue to report this measure? The usefulness of recommendations would be limited if a measure were being eliminated. In the 2008 APPR the department proposed ten of the reported KPMs for deletion.
  • Has ODE reported data for the measure for at least three data points? Without multiple data points, it would not be possible to evaluate consistency in reporting a measure. Thirteen of 25 KPMs met this criterion.
  • Do external controls exist that reduce the likelihood of inaccurate or unreliable data? Five KPMs involving data required for federal reporting were considered to be lower risk because of federal monitoring.
  • Is the KPM of particular interest to the State Legislature? ODE’s BAM analyst identified two KPMs as particularly significant to legislators.
  • Is the KPM of particular interest to ODE senior management? The Deputy Superintendent requested that at least one of the KPMs reviewed measure an area over which the department exerted primary control, as distinguished from measures of the overall performance of the K-12 system.

Based on these criteria, I selected the following three KPMs for review:

  1. KPM #1 – ACCESS TO PRE-KINDERGARTEN – Percentage of eligible children receiving Head Start/Oregon Pre-Kindergarten services.
  2. KPM #5 – HIGH SCHOOL GRADUATION – Percentage of secondary students who graduate, drop out or otherwise finish PK12 education.
  3. KPM #18 – TIMELY PUBLIC REPORTS – Number and percentage of key public reports released accurately and on time.

The scope of my review was limited to the accuracy and reliability of the current and historical data reported on these three measures in the 2008 APPR. However, in determining the scope of this review, I considered the definitions and documentation for all KPMs included in the 2008 APPR and shared my observations on other measures and the APPR overall in an end-of-survey conference with management.

Methodology

Work performed included:

  • Interviewing key personnel involved in KPM reporting;
  • Discussing various levels of risk involved with the performance measures;
  • Reviewing applicable policies and procedures related to KPM reporting;
  • Reviewing file documentation maintained by staff responsible for reporting on selected measures;
  • Obtaining APPR and supporting raw data and identifying proper data elements for each measure;
  • Reviewing measure definitions and calculations for accuracy;
  • Attempting to recreate values reported using calculation methodologies identified;
  • Examining databases containing data reported on;
  • Performing reconciliations of data held in databases to source documentation;
  • Comparing source documents to electronic records to test accuracy controls;
  • Determining if adequate controls are in place for measures;
  • Concluding on the verification of performance measure data; and
  • Reporting the results for each measure reviewed using the following definitions (a brief sketch applying these definitions follows this list):
      • Verified: The performance reported is consistently accurate within plus or minus five percent, and adequate controls are in place to ensure consistency and accuracy in collection of all supporting data and subsequent reports.
      • Verified with Qualifications: The performance reported is consistently accurate within plus or minus five percent, but adequate controls are not in place to ensure continued accuracy. The span of data is less than ideal or the performance measure definition is not followed, but the calculation remained within the five percent error range.
      • Factors Prevented Verification: Documentation is not available and controls are not adequate to ensure consistency and accuracy, or the performance measure definition is not followed and the correct measure results cannot be determined.
      • Inaccurate: The performance reported is commonly not within five percent of actual performance.
      • Not Applicable: Lack of adequate data exists for review for a justifiable reason, e.g. a new measure.
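To make the decision logic in these rating definitions easier to follow, the following is a minimal sketch in Python of how the categories relate to the underlying judgments. The function name and boolean inputs are illustrative assumptions, not part of the SAAC template; the actual ratings in this review were assigned by auditor judgment.

    def rate_measure(within_5_percent, adequate_controls, documentation_available,
                     definition_followed, result_determinable, justifiable_data_gap):
        """Illustrative mapping of review judgments to the rating categories above."""
        if justifiable_data_gap:
            return "Not Applicable"                      # e.g. a new measure
        if (not documentation_available and not adequate_controls) or \
           (not definition_followed and not result_determinable):
            return "Factors Prevented Verification"
        if not within_5_percent:
            return "Inaccurate"
        if adequate_controls and definition_followed:
            return "Verified"
        return "Verified with Qualifications"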

Results

  1. KPM #1 – ACCESS TO PRE-KINDERGARTEN – Percentage of eligible children receiving Head Start/Oregon Pre-Kindergarten services.

This measure has been included in ODE APPR reports since the 2005 APPR. The 2008 APPR includes data for this measure for the school years 2001-02 through 2007-08.

School year ending:   2002 / 2003 / 2004 / 2005 / 2006 / 2007 / 2008
Reported value:        62% / 61% / 59% / 60% / 60% / 57% / 62%

I selected this measure for review because seven data points were reported and because ODE’s BAM analyst indicated that this measure was very important to the Legislature, which increased funding for Pre-Kindergarten services in the 2007-09 budget.

The data owner for this performance measure has changed in each of the past three years.

Documentation of the measure indicates the following:

The numerator is the number of children receiving Oregon Head Start Pre-K services. This includes children funded through Oregon PreK, Region X Head Start, Region XI American Indian Head Start, and Region XII Migrant Seasonal Head Start for “seasonal farm workers”. The denominator is the number of children in Oregon in poverty who are eligible for Head Start. The number is calculated by Oregon Department of Education, Office of Student and Learning Partnerships, Early Childhood Section staff. It is based on the “3-4 Year Old Population” 2007 estimates from the Population Research Center, Portland State University, and the state “poverty rate” data obtained from the US Census Bureau, Small Area Income and Poverty Estimates. The rate used for 2007 (19.47%) is the average of the poverty rates for the 0-4 year old general population across three biennia. Subsequently, research conducted by the Children’s Institute (a private research organization) and the Population Research Center at Portland State University in summer 2008 indicated a different state poverty rate of 23.1%. This rate will be used in the 2008-2009 calculations.
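Read as a calculation, the documented methodology reduces to a single ratio. The following is a minimal sketch under the assumptions described above; the function and parameter names are illustrative, not ODE data elements.

    def kpm1_access_to_prek(funded_slots, population_3_4_estimate, poverty_rate):
        """KPM #1 as documented: funded HS/OPK slots divided by the estimated
        number of poverty-eligible 3-4 year olds."""
        eligible_children = population_3_4_estimate * poverty_rate   # denominator
        return funded_slots / eligible_children                      # e.g. 0.62 -> 62%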

The data reported for KPM #1 in the 2008 APPR is from the report Oregon Head Start Prekindergarten - Annual Estimate of 3 and 4 Year Olds, produced annually by the ODE Head Start program staff in January. The report is well documented, with clear instructions for HS/OPK programs around the state for counting the number of funded slots. The documentation explains that although “Only children with family income at or below the current federal poverty guideline are counted as ‘Eligible 3-4 Population,’” the denominator in the KPM, some “enrolled children come from families that make over the poverty guideline … who meet other needs criteria developed by the program.”

Although reports from programs throughout the state of the number of funded slots are not audited, they are reviewed by ODE HS/OPK staff, who follow up with programs on any reports that appear questionable, e.g. significant and unexplained differences from prior reports or federal reports. Each year’s statewide report, together with the program reports supporting the calculation of the numerator and the documentation supporting the population and poverty data sources used to calculate the denominator, is maintained in a notebook. Staff produced notebooks for every year of reported data except 2006. The missing notebook could not be located before this review was completed. I confirmed a random sample of counts from the statewide report to the program reports and confirmed the calculations of the numerator and denominator from the sources identified.
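The follow-up on questionable reports described above can be pictured as a simple year-over-year comparison. This is a hypothetical sketch, not the procedure HS/OPK staff actually use; the 10% threshold and the data structures are assumptions for illustration only.

    def flag_questionable_reports(current, prior, threshold=0.10):
        """Flag programs whose reported funded-slot counts moved more than the
        assumed threshold from the prior year, for staff follow-up."""
        flags = {}
        for program, count in current.items():            # current: {program: slots}
            previous = prior.get(program)
            if previous and abs(count - previous) / previous > threshold:
                flags[program] = (previous, count)
        return flags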

Although data reported are consistently accurate within plus or minus five percent based upon the documentation of the calculations involved, there are several problems with the controls over this performance measure:

First, the numerator is not actually the number of children receiving Oregon Head Start/Pre-Kindergarten services. Rather, it is the total number of funded HS/OPK slots. Since a slot may be filled by more than one child over the course of the year, the number of slots does not equal the number of children served. The operational definition of the measure does not match the statement of the measure, which could confuse policymakers.

Second, a limited proportion of slots may be filled by children who do not meet the poverty eligibility guidelines of the program, but are otherwise eligible for services under local program guidelines. Since the denominator is the estimated number of Oregon children in poverty, the numerator and denominator used to calculate the percentage do not match. Therefore, the percentage reported is not, in fact, the percentage of poverty-eligible children (the denominator) served.

Third, the measure is significantly influenced by the population and poverty estimate methodologies employed to calculate the denominator, and a consistent methodology has not been applied over the period for which data is reported. For example, had the 23.1% poverty rate proposed for 2008-09 been used to calculate 2007-08 performance, reported performance would have been 52% rather than 62%. Alternative methodologies for developing the eligible population have been considered in several years, but no policy or protocol for deciding between alternatives has been developed. The lack of such a control creates a risk that the calculation of the performance measure may be influenced by the message management wants to communicate to the Legislature.
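This sensitivity follows directly from the structure of the measure: with the slot count and population estimate held fixed, the reported percentage scales inversely with the poverty rate used in the denominator. A quick check of the figures above, as a sketch:

    # 2007-08 performance recomputed with the higher poverty rate
    print(0.62 * (19.47 / 23.1))   # ~0.52, i.e. 52% rather than the reported 62%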

Finally, neither the Data Owner nor HS/OPK staff is qualified to evaluate the population and poverty estimates provided by contracted demographers. HS/OPK staff disavowed any responsibility for the accuracy of the estimates of eligible children (the denominator). No quality assurance review of demographic data and calculations is performed by ODE. I found an error in one of the Excel spreadsheets provided by the PSU Population Research Center that should have been discovered by ODE staff had a basic formula audit been performed.
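A basic formula audit of that kind can be as simple as listing every formula in a workbook for inspection. The sketch below assumes Python with the openpyxl library and a placeholder file name; it is illustrative only and was not part of this review's procedures.

    from openpyxl import load_workbook

    def list_formulas(path):
        """Print every formula cell so references and ranges can be inspected;
        `path` is a placeholder, not the actual PSU spreadsheet name."""
        wb = load_workbook(path, data_only=False)      # keep formulas, not cached values
        for ws in wb.worksheets:
            for row in ws.iter_rows():
                for cell in row:
                    if cell.data_type == "f":          # cell holds a formula
                        print(ws.title, cell.coordinate, cell.value)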

For the above reasons, I consider this performance measure to be Verified with Qualifications. The performance reported is consistently accurate within plus or minus five percent, but adequate controls are not in place to ensure continued accuracy.

  2. KPM #5 – HIGH SCHOOL GRADUATION – Percentage of secondary students who graduate, drop out or otherwise finish PK12 education.

This KPM is composed of four separate measures. Only one, the percentage of students who graduate, was selected for review. The 2008 APPR includes data for this measure for the school years 1999-2000 through 2006-2007, reported in the table by the year in which the school year ended.

School year ending:   2000 / 2001 / 2002 / 2003 / 2004 / 2005 / 2006 / 2007
Reported value:        82% / 82% / 81% / 82% / 81% / 79% / 78% / 76%

I included it in the review because eight years of performance data are reported, and because ODE’s BAM analyst indicated that this was a very important measure to the Legislature. With planned changes in the requirements for the Oregon Diploma, this KPM will be of particular interest in coming years.

The data owner identified in the 2008 APPR for this performance measure is no longer at ODE. Her replacement, who was identified as the data owner for this measure when I began my review, is also no longer at ODE. The new data owner for this measure will be the fourth person to hold that responsibility in the past three years.

The data for this measure in the APPR is first reported annually in the Statewide Report Card as Graduation Rate Based on Enrollment. It consists of a count of regular diplomas for the school year divided by the count of 12th graders enrolled as of October 1 for the school year. The count of regular diplomas is taken from the High School Completers data collection and includes regular diplomas without a CIM code and regular diplomas with a CIM code granted to students under the age of 21 between September 1 and August 31. The count of 12th graders enrolled on October 1 is taken from the Fall Membership data collection.
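Expressed as a calculation, the measure is a ratio of the two collection counts described above. The following minimal sketch uses illustrative names, not ODE collection field names.

    def kpm5_graduation_rate(regular_diplomas, grade12_enrolled_oct1):
        """Graduation Rate Based on Enrollment: regular diplomas granted during the
        school year divided by 12th graders enrolled on October 1 of that year."""
        return regular_diplomas / grade12_enrolled_oct1   # e.g. 0.76 -> 76% for 2006-07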

Both of these collections are documented by data definitions and instruction guides for school personnel entering the source data. The Fall Membership Collection opens October 1st each year and closes in mid-November. Data is published in February after data quality checks have been performed to help ensure accurate and complete reporting. The data is used in subsequent reports, such as the Statewide Report Card and the APPR. The HS Completers Collection opens mid-May and closes at the end of October each year. The data is published in the spring after data quality checks have been performed and districts have validated the data ODE will use to make public reports. Much of the documentation for the KPM itself that was provided to me was developed only after this review was initiated.