U.S. Department of Education’s Administration of the Distance Education Demonstration Program

FINAL AUDIT REPORT

ED-OIG/A09-D0010

September 2004

Our mission is to promote the efficiency, effectiveness, and integrity of the Department’s programs and operations.

U.S. Department of Education
Office of Inspector General
Sacramento, California

Notice

Statements that managerial practices need improvements, as well as other conclusions and recommendations in this report, represent the opinions of the Office of Inspector General. Determinations of corrective action to be taken will be made by the appropriate Department of Education Officials.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the Office of Inspector General are available to members of the press and general public to the extent information contained therein is not subject to exemptions under the Act.


UNITED STATES DEPARTMENT OF EDUCATION

OFFICE OF INSPECTOR GENERAL

September 30, 2004

Memorandum

TO: Sally Stroup

Assistant Secretary

Office of Postsecondary Education

Lead Action Official

Theresa Shaw

Chief Operating Officer

Federal Student Aid

FROM: Helen Lew /s/

Assistant Inspector General for Audit

SUBJECT: Final Audit Report

U.S. Department of Education’s Administration of the Distance Education

Demonstration Program

Control Number ED-OIG/A09-D0010

Attached is the subject final audit report that covers the results of our review of the Department’s administration of the Distance Education Demonstration Program from July 1999 through July 2003. An electronic copy has been provided to your Audit Liaison Officers. We received your comments, in which you did not concur with our finding that the Department’s Second Report to Congress on the Distance Education Demonstration Program contained unsupported, incomplete, and inaccurate statements and did not fully agree with the related recommendation. You generally concurred with the other findings and recommendations presented in our draft report.

Corrective actions proposed (resolution phase) and implemented (closure phase) by your offices will be monitored and tracked through the Department’s Audit Accountability and Resolution Tracking System (AARTS). ED policy requires that you develop a final corrective action plan (CAP) for our review in the automated system within 30 days of the issuance of this report. The CAP should set forth the specific action items, and targeted completion dates, necessary to implement final corrective actions on the findings and recommendations contained in this final audit report.

In accordance with the Inspector General Act of 1978, as amended, the Office of Inspector General is required to report to Congress twice a year on the audits that remain unresolved after six months from the date of issuance.

In accordance with the Freedom of Information Act (5 U.S.C. § 552), reports issued by the Office of Inspector General are available to members of the press and general public to the extent information contained therein is not subject to exemptions in the Act.

We appreciate the cooperation given us during this review. If you have any questions, please call Gloria Pilotti at 916-930-2399.

Enclosure

400 Maryland Ave., S.W., Washington, D.C. 20202-0500

Our mission is to ensure equal access to education and to promote educational excellence throughout the Nation.

TABLE OF CONTENTS

EXECUTIVE SUMMARY

BACKGROUND

AUDIT RESULTS

FINDING NO. 1 – The Second Report Contained Unsupported, Incomplete, and Inaccurate Statements

FINDING NO. 2 – DEDP Participants Did Not Provide Complete and Consistent Information on Annual Reports

FINDING NO. 3 – Department Has Not Submitted Reports to Congress By the Statutory Due Dates

OBJECTIVES, SCOPE, AND METHODOLOGY

STATEMENT ON MANAGEMENT CONTROLS

Attachment 1 – Issues Identified in DEDP Reports Concerning Adherence to HEA Provisions and Federal Regulations
Attachment 2 – Results of OIG Review of Participants' Programs and Enrollment Data Reported on the 1998-1999 to 2001-2002 Annual Reports
Attachment 3 – Results of OIG Review of Participants' Student Outcome Data Reported on 2001-2002 Annual Reports
Attachment 4 – Department Comments on Draft Report

EXECUTIVE SUMMARY

The Distance Education Demonstration Program (DEDP), which is jointly administered by the Department’s Office of Postsecondary Education (OPE) and Federal Student Aid (FSA), has not met the statutory requirement to provide Congress with information on the specific Higher Education Act (HEA) and regulatory requirements that should be altered to provide students greater access to distance education programs.

Even though the Department[1] has not met the statutory requirement, it could take steps to improve the information in reports to Congress. We found that—

  • The Second Report to Congress on the Distance Education Demonstration Program (Second Report) contained statements that were not supported by information in the report or other documents provided for our review. The report also contained incomplete and inaccurate statements. Reliance on these statements could adversely affect policy decisions made by the Department and Congress.
  • DEDP participants did not provide complete and consistent information on their annual reports. Without complete and consistent data, the Department cannot properly evaluate DEDP participants and the impact of waiving HEA provisions and regulations.
  • The Department did not submit reports to Congress by the statutory due dates. The Congress was provided two-year-old data in the Second Report and has not received the information available from the DEDP participants for years 2001-2002 and 2002-2003 because the Department has not submitted the required reports to Congress for 2003 and 2004.

We recommend that the Department establish a review process for DEDP reports to Congress that ensures information and conclusions presented in the reports are supported, complete, and accurate. We also recommend that the Department enhance its efforts to obtain complete and consistent information in DEDP participants’ annual reports and disclose data deficiencies in its analyses and reports. In addition, we recommend that the Department establish firm timelines for the completion of data analyses, and DEDP report development, review, and issuance. We concluded that the Department provided the statutorily required oversight of DEDP participants.

In its response to the draft report, the Department did not agree with our finding that the Second Report contained unsupported, incomplete, and inaccurate statements and did not fully agree with the related recommendation. The Department generally agreed with the other findings and recommendations. The Department expressed concern with the accuracy of information presented in Attachment 1 of the report, but did not identify any inaccuracies in its comments. We thoroughly reviewed the support for our findings and are confident the attachment included in this final report is accurate. The Department’s comments and our response are summarized at the end of each finding and the introduction to Attachment 1. The comments are presented in their entirety in Attachment 4.

BACKGROUND

The Higher Education Act Amendments of 1998, enacted in October 1998, authorized the DEDP. One of the purposes of the DEDP is to determine the specific statutory and regulatory requirements that should be altered to provide greater access to high quality distance education programs. The legislation authorized the Secretary to exempt participating institutions from certain provisions in the HEA that inhibited their ability to offer distance education. The Secretary’s waiver authority included the 50 Percent Rules for telecommunications students and classes,[2] the minimum weeks of instruction in an academic year, and regulations implementing the general provisions of the HEA in Part G of Title IV.

The HEA allows demonstration programs that are strictly monitored by the Department to test the quality and viability of expanded distance education programs currently restricted under the HEA. The HEA mandated that the Department provide oversight and required the Department, on a continuing basis, to (1) assure compliance of institutions, systems or consortia with the Title IV requirements that had not been waived; (2) provide technical assistance; (3) monitor fluctuations in the student population enrolled in participating institutions, systems or consortia; and (4) consult with appropriate accrediting agencies or associations and appropriate State regulatory authorities.

Beginning July 1, 1999, the Secretary could waive requirements for up to 15 participants (first year participants). In the third year of the DEDP, which began July 1, 2001, the Department could expand the program to include additional participants (third year participants). At the conclusion of our fieldwork, there were 22 participants, involving over 100 institutions from 20 states and the District of Columbia.[3]

AUDIT RESULTS

The objectives of our review were to determine if 1) the DEDP is meeting the statutory requirement to provide information on specific statutory and regulatory requirements that should be altered for distance education programs, and 2) the Department provided the statutorily required oversight of DEDP participants. Our review covered the period from July 1999 (the beginning of the DEDP) through July 2003.

The Department took appropriate steps to initiate the DEDP, but has not met the statutory requirement to provide Congress with information on the specific HEA and regulatory requirements that should be altered to provide greater access to distance education programs. The Department issued two reports to Congress on the DEDP—the Report to Congress on the Distance Education Demonstration Programs (First Report), dated January 2001, and the Second Report, dated July 2003. Neither report identified changes in specific statutory and regulatory requirements that were needed to provide greater access to high quality distance education programs.

The First Report and the Second Report did identify issues related to adherence to the 50 Percent Rules and financial aid determinations for students who enrolled in a changing mix of courses delivered by different methods to complete their educational programs. However, as the Department disclosed in its Second Report, other identified issues related to the application of HEA and regulatory requirements were not unique to distance education. Attachment 1 provides a summary of the issues presented in the DEDP reports.

Even though the Department has not yet reached conclusions on specific statutory and regulatory requirements, it could take steps to improve the information provided in its reports to Congress on the DEDP. We found that the Second Report contained unsupported, incomplete, and inaccurate statements; DEDP participants did not provide complete and consistent information on annual reports; and the Department has not submitted reports to Congress by the statutory due dates.

We concluded that the Department provided the statutorily required oversight of DEDP participants, which was to, on a continuing basis, (1) assure compliance of participants with requirements of Title IV (other than the sections and regulations that are waived), (2) provide technical assistance to participants, (3) monitor fluctuations in participant student population, and (4) consult with accrediting agencies and state regulatory agencies.

FINDING NO. 1 – The Second Report Contained Unsupported, Incomplete, and Inaccurate Statements

The Department has not met the statutory requirement to provide Congress with information on the specific HEA and regulatory requirements that should be altered to provide students greater access to distance education programs. However, the Second Report contained statements that provide information that could be used in evaluating the need for changes in the HEA and Federal regulations. We found that several of the statements were not supported by information in the report or other documents provided for our review. We also found statements that were incomplete and inaccurate.

The Department issued Information Quality Guidelines that provide quality criteria for principal offices to use in the review of information products that they plan to disseminate to the public. The Guidelines are applicable to information that the Department disseminates on or after October 1, 2002. Two of the factors used by the Guidelines to assess information quality are utility and objectivity.

Utility refers to the usefulness of the information to its intended users.... To maximize the utility of influential information, care must be taken in the review stage to ensure that the information can be clearly understood and, where appropriate and to the extent practical, an external user of the information can reproduce the steps involved in producing the information.

Objectivity refers to the accuracy, reliability, and unbiased nature of information. It is achieved by using reliable information sources and appropriate techniques to prepare information products. Objectivity involves both the content and the presentation of the information. Content should be complete, include documentation of the source of any information used, as well as, when appropriate, a description of the sources of any errors in the data that may affect the quality of the information product. The presentation of the information should be clear and in a proper context so that users can easily understand its meaning.

Reported Median Retention Rates for Programs Delivered Solely Onsite and Solely Through Distance Education Were Meaningless

The segment titled “Student Success” included the following statements on median retention rates[4] for baccalaureate and graduate degree programs delivered solely with onsite courses and those programs delivered solely with distance education courses:

Data reported by the nine participants that offer baccalaureate degree programs…showed higher median retention rates for students enrolled in onsite programs than for distance education program enrollees. However, the gap between the two narrowed from ten percentage points after two years (66% onsite vs. 56% distance education) to just two percentage points after three years (50% vs. 48%). [Page 9 of the Second Report.]

***

[T]he median retention rate [for graduate degree programs] after two years for students enrolled in distance education programs is higher than for those enrolled in onsite programs (63.5% vs. 55%). This persists into the third year, where median retention rates were 63% and 51%, respectively. [Page 10 of the Second Report.]

The reported median rates were meaningless because the method used to derive them for the Second Report was flawed; thus, conclusions drawn from the median rates were unsupported.

  • The participant rates used in the analysis varied significantly among the participants. For example, the rates after two years for baccalaureate degrees offered through distance education ranged from 20 percent to 99 percent. Due to the variance, the median would not be a representative measure of student retention rates.
  • The analysis included participants that did not offer both onsite and distance programs for the degree type. As a result, the lists used to select the median rate did not have a corresponding percentage for the other delivery method.
  • The analyses included student groups (cohorts) that had few students in the initial enrollment. For example, one student cohort for graduate programs offered onsite reported only five students enrolled in the programs.
  • The participant rates were calculated using student outcome data that were incomplete and inconsistent. (Finding No. 2 of the report provides details on the incomplete and inconsistent data identified by our review.)

A more appropriate method for evaluating retention/completion for onsite and distance delivery methods would be to compare the retention/completion rates of participants that offer programs both solely onsite and solely through distance education, and to limit the analysis to participants or report years with complete, consistent data and to cohorts that have a minimum number of students.
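For illustration only, the following sketch shows one way such a paired comparison could be carried out. The participant data, field names, and minimum cohort size of 30 are hypothetical assumptions for this example; they are not drawn from DEDP annual reports or the Department’s analysis.

    # Illustrative sketch only (Python): participant data, field names, and the
    # minimum cohort size of 30 are hypothetical, not taken from DEDP annual reports.
    from statistics import median

    MIN_COHORT_SIZE = 30  # assumed minimum initial enrollment for a cohort to count

    # One record per participant and degree type: initial cohort size and two-year
    # retention rate for each delivery method (None where a method is not offered).
    records = [
        {"participant": "A", "onsite_n": 120, "onsite_rate": 0.66,
         "distance_n": 95, "distance_rate": 0.56},
        {"participant": "B", "onsite_n": 5, "onsite_rate": 0.80,     # cohort too small
         "distance_n": 60, "distance_rate": 0.48},
        {"participant": "C", "onsite_n": None, "onsite_rate": None,  # no onsite program
         "distance_n": 210, "distance_rate": 0.99},
    ]

    def paired_rate_differences(records, min_n=MIN_COHORT_SIZE):
        """Return onsite-minus-distance retention rate differences, keeping only
        participants that offer both delivery methods with adequately sized cohorts."""
        diffs = []
        for r in records:
            if r["onsite_rate"] is None or r["distance_rate"] is None:
                continue  # no corresponding rate for the other delivery method
            if r["onsite_n"] < min_n or r["distance_n"] < min_n:
                continue  # small cohorts yield unstable rates
            diffs.append(r["onsite_rate"] - r["distance_rate"])
        return diffs

    diffs = paired_rate_differences(records)
    if diffs:
        print(f"Participants compared: {len(diffs)}")
        print(f"Median onsite-minus-distance difference: {median(diffs):+.2f}")
    else:
        print("No participants met the comparison criteria.")

Pairing the two rates within each participant avoids comparing lists that have no corresponding percentage for the other delivery method, and the cohort-size screen addresses the small-cohort problem noted above.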

Conclusion on Impact of Distance Education Methods on Student Outcomes Was Unsupported

The Second Report included the following statements on the impact that correspondence and other distance education delivery methods had on student outcomes:

The mode of [distance education] delivery does not appear to be a salient factor in student outcomes [for] baccalaureate degree programs [for demonstration program participants]. The institutions offering full degrees through correspondence report retention rates well above the median. Those participants offering only online courses report retention rates that cluster around the median. The retention rates reported by participants offering programs through a mix of media (which include courses offered by correspondence, interactive video, videotape, and online) show much greater variation across institutions. [Page 9 of the Second Report.]

***

As with baccalaureate degree programs, [distance education] delivery mode does not seem to be a significant factor affecting retention in these graduate degree programs. [Page 10 of the Second Report.]

We were unable to locate data in the report or other documents provided during our review that supported the Department’s conclusions. DEDP participants’ annual reports did not provide a breakdown of outcome data by distance education delivery method (i.e., video, audio, internet, correspondence, and other). The retention rates for individual participants used as a basis for this conclusion were derived using student outcome data reported on annual reports for degree programs offered solely through distance education rather than specific distance education methods. Also, as noted in the prior section, the method used to derive the median rates was flawed, and as a result, the median rates were meaningless and conclusions drawn from those rates were unsupported.

Report’s Conclusions Lacked Sufficient Details

The “Conclusions” segment of the Second Report stated there was “growing consensus” in the following policy areas:
  • The current rules that define education delivered via telecommunications and videocassette or disc recordings as correspondence education if the total of such courses and correspondence meet or exceed 50% of the courses provided need to be revised or eliminated.
  • The definition of correspondence education needs to be revisited.

***

  • The quality of distance education programs should be assessed through the same accreditation process that governs on-campus programs.

The report did not explicitly state the Department’s position on these changes or explain “growing consensus.” Also, the report did not explain the basis or impetus for the above statements. According to the DEDP Director, the statements reflected the consensus of the Department. The DEDP Director’s supervisor stated the statements reflected the consensus of the entire higher education community (i.e., educational institutions, the Department, and Congress). Both stated that the statements were supported by the Department’s experiences in negotiated rulemaking and ongoing dialogue with the higher education community, DEDP participants, and other institutions.