Program Review Report Template – IR Data Guide

______

#1. Student Participation Measures (Demographics, Enrollment, & Retention)

Measure: Student Demographics by Gender, Race/Ethnicity, and Age

Years Reported: Demographics are reported as an average of the last three academic years: 2013-2014, 2014-2015, and 2015-2016. Three years of data are used to generate a more accurate picture of demographic trends and to create a larger sample size, since small samples can be problematic when disaggregating program data.

Definition/Data Source: Data Warehouse, STUDENT table. Demographics are self-reported by the student on the application form. The IR Office reports the gender, race/ethnicity, and age group the student reported in their first quarter in the program.

Measure: Annual Program FTE-S - Budgeted

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: RTC’s Annual Budget Book, from the Professional-Technical FTEs tables.

Measure: Annual Program FTE-S - Actual

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: Data Warehouse, CLASS table. One FTE, or Full Time Equivalent, is equal to 15 credits for a quarter, or 45 credits for a year. For example, a 5-credit course is equivalent to .33 FTE; if 10 students are enrolled in that course, its total FTE is 3.3. To calculate program-specific FTE each year, the student FTE is added up for every course with an institutional intent code of 21 (occupational preparatory courses) and a department code that matches the program. The total student FTE for the year is then divided by three to obtain the annual FTE.

EXAMPLE: Consider the example below for the Philosophy department. The course is worth 5 credits, or .33 FTE per enrolled student. The quarterly FTE totals of 4, 3.666, and 5.333 correspond to 12 students enrolled in fall 2015, 11 students in winter 2016, and 16 students in spring 2016. The quarterly totals are added up and then divided by 3 for an annual FTE of 4.3, as the table and the short sketch below illustrate.

Quarter / Department / Course / FTE Total / Credits
Fall 2015 / PHIL& / INTRO TO PHILOSOPHY / 4 / 5
Winter 2016 / PHIL& / INTRO TO PHILOSOPHY / 3.666 / 5
Spring 2016 / PHIL& / INTRO TO PHILOSOPHY HYBR / 5.333 / 5
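
A minimal sketch of this calculation in Python, using the Philosophy figures above (the function name is illustrative):

```python
# Sketch of the annual FTE-S calculation described above.
CREDITS_PER_FTE = 15  # one quarterly FTE = 15 credits

def annual_fte(sections):
    """sections: list of (headcount, credits) tuples, one per qualifying
    course section (intent code 21, matching department) during the year."""
    quarterly_fte = [students * credits / CREDITS_PER_FTE
                     for students, credits in sections]
    return sum(quarterly_fte) / 3  # three quarters per academic year

# Philosophy example: fall (12 students), winter (11), spring (16), 5 credits each.
print(round(annual_fte([(12, 5), (11, 5), (16, 5)]), 1))  # -> 4.3
```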

Measure: Average Quarterly Headcount

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: Data Warehouse, STUDENT table. The average number of students enrolled in the program in fall, winter, and spring quarters. This includes all degree-seeking intent students (excludes waitlist), regardless of whether or not they were new to the program.

Measure: Average % of Capacity

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: The capacities are obtained from the Student Load Count Reports. The average quarterly headcount is divided by the total capacity to calculate the average % of capacity. The average % of capacity across all programs is also provided in order to allow comparisons with the RTC average and to give some context for what this calculation means.
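
A minimal sketch, with hypothetical quarterly headcounts and a hypothetical capacity:

```python
# Sketch: average % of capacity from quarterly headcounts and program
# capacity. The figures are hypothetical; actual capacities come from
# the Student Load Count Reports.

quarterly_headcounts = [22, 20, 18]   # fall, winter, spring
capacity = 24                         # hypothetical program capacity

avg_headcount = sum(quarterly_headcounts) / len(quarterly_headcounts)
pct_of_capacity = avg_headcount / capacity * 100
print(f"{pct_of_capacity:.1f}%")  # -> 83.3%
```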

Measure: 1st to 2nd Quarter Retention

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: Data Warehouse, Student Achievement Initiative database and STUCLASS table. Retention is calculated for students who are in the PEP cohorts for each quarter. These cohorts consist of students who are new to RTC, with degree-seeking intents (excludes waitlist). The cohorts may not include ALL students who started in your program; on average, PEP cohorts represent 70% of new students in programs. Retention is calculated as the percentage of cohort students who returned the next quarter. If the program has only a one-quarter option, retention is not provided.
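
A minimal sketch of this calculation, with hypothetical student IDs:

```python
# Sketch: 1st-to-2nd-quarter retention for a PEP cohort.
# Student IDs are hypothetical; the real data come from the Student
# Achievement Initiative database and the STUCLASS table.

cohort = {"S01", "S02", "S03", "S04", "S05"}          # new, degree-seeking
enrolled_next_quarter = {"S01", "S02", "S04", "S99"}  # everyone enrolled next quarter

retained = cohort & enrolled_next_quarter             # cohort members who returned
retention_rate = len(retained) / len(cohort) * 100
print(f"{retention_rate:.0f}%")  # -> 60%
```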

#2. Student Success Measures (Completion & Labor Market Outcomes)

Measure: Completion

Years Reported: 2010-2011, 2011-2012, 2012-2013

Definition/Data Source: Data Warehouse, COMPLETION table. Completion is calculated for students who are in the PEP cohorts for each quarter. These cohorts consist of students who are new to RTC, with degree-seeking intents (excludes waitlist). The cohorts may not include ALL students who started in your program; on average, PEP cohorts represent 70% of new students in programs. Completion rates are provided separately for students who obtained only a certificate and those who also obtained a degree; the total completion rate is the sum of the degree and certificate completion rates. Students are only counted if they obtained their credential in your program. Completion is checked within three years of when the student started in the program, since most schools track completion within three or four years to account for part-time students who may need more time to finish. As a result, the most recent year with completion rates available is 2012-2013.
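
A minimal sketch, with hypothetical cohort counts:

```python
# Sketch: completion rates for a PEP cohort, checked within three years
# of program start. Counts are hypothetical; real data come from the
# COMPLETION table.

cohort_size = 40
degree_completers = 10      # earned a degree in the program
certificate_only = 14       # earned only a certificate in the program

degree_rate = degree_completers / cohort_size * 100       # 25.0%
certificate_rate = certificate_only / cohort_size * 100   # 35.0%
total_completion = degree_rate + certificate_rate         # 60.0%
print(f"{total_completion:.1f}%")
```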

Measure: Completion by Race/Ethnicity and Gender

Years Reported: 2010-2011, 2011-2012, 2012-2013

Definition/Data Source: Data Warehouse, COMPLETION table. The PEP cohorts, completion definitions, and three-year tracking window are the same as for the Completion measure above.

The average completion rate from 2010-2013 is provided for each demographic group. Three years of data are used to generate a more accurate picture of demographic trends and to create a larger sample size, since small samples can be problematic when disaggregating program data.

Measure: Estimated Placement Rate

Years Reported: 2012-2013, 2013-2014, 2014-2015

Definition/Data Source: Data Warehouse, Data Linking for Outcomes Assessment (DLOA) database.

What is the DLOA?

The DLOA is a database compiled on an annual basis to meet college and State Board for Community and Technical Colleges (SBCTC) needs for outcomes data related to the employment and further education of college students. The DLOA includes data for completers and leavers of vocational, academic, worker retraining, or apprenticeship programs who left the system during the previous academic year.

Who is Included?

Students are included in the DLOA database only after they have not been enrolled anywhere in the system for at least one year, whether or not they obtained an award. Only students with a valid social security number in their registration or completion records are included, since only those students can be matched to external databases. Students who re-enroll as “lifelong learners” in classes such as parent education or industrial first aid, or who do not enroll in 10 or more state or contract credits, are regarded as having left the college and are thus included in the DLOA file.

Students who continue in further training after completing a certificate or degree, or who transfer between two-year colleges, are not included in the DLOA, as they have not yet left the two-year system. Because international students do not have social security numbers and do not intend to work in the United States, they are not included in the DLOA.

Records in these files are de-identified to protect the identity of each student. Even so, special attention needs to be paid to the level of aggregation released for public consumption so that a reasonable person cannot deduce the identity of any student.

When is the Data Pulled?

The DLOA provides employment status and wages for students in the 3rd quarter after college. Research indicates that by the 3rd quarter, most community and technical college graduates have moved into the kinds of jobs consistent with their level of training. Prior to that time, many are employed, but may be continuing in the employment they had while in college or in jobs that are not consistent with their level of training. Due to the timing of the data pull, the data files for a given year are not released to the Institutional Research (IR) Offices until the following December. Therefore, the most recent database we have at Renton Technical College (RTC) is for the 2014-15 leavers and completers.

Data Standards

  • As indicated under the common data standards, the administrative data linking does not produce a count of all those employed, only of those who are found in Unemployment Insurance (UI) records.
  • The SBCTC indicates that the total employment rate can be estimated based on the percent of completers and leavers found in UI records by the 3rd quarter following college. If the placement rate is 90% or lower for students who completed a degree or certificate, an adjustment factor of 1.1 is applied. For this reason, the employment rates we provide at RTC are considered “estimated” job placement rates.
  • Median wages are reported as the most meaningful measure of estimated earnings.
  • Outcomes are only reported for groups that are sufficiently large that the hourly rates and quarterly earnings represent group rather than individual behavior. The smallest group for which data should be reported is 25 records. The data standards call for aggregating several years’ data for a given program in an effort to meet the “sufficiently large” criterion.

In order to meet the “sufficiently large” data standard, we aggregate three years of data for each program. Otherwise, we would not be able to provide any data, as most programs would not have at least 25 records each year. Estimated placement is only reported for students who completed either a certificate or a degree in the program. The average placement rate for programs included in the program review process is 86.5%.
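
A rough sketch of how such an estimate could be computed (the counts are hypothetical, and treating the 1.1 adjustment as a multiplier is one reading of the data standard above):

```python
# Sketch: estimated job placement rate from UI matches, aggregated over
# three years. Counts are hypothetical; the 1.1 adjustment factor is
# treated here as a multiplier, which is one reading of the standard.

completers = 52        # three years of completers (meets the 25-record minimum)
matched_in_ui = 41     # found in UI records by the 3rd quarter after college

raw_rate = matched_in_ui / completers * 100
if raw_rate <= 90:                              # adjust low UI match rates
    estimated_rate = min(raw_rate * 1.1, 100)   # never report above 100%
else:
    estimated_rate = raw_rate
print(f"{estimated_rate:.1f}%")  # -> 86.7%
```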

Measure: Estimated Median Annualized Wages and Median Annualized Earnings

Years Reported: 2012-2013, 2013-2014, 2014-2015

Definition/Data Source: Data Warehouse, Data Linking for Outcomes Assessment (DLOA) database. See the DLOA documentation above. Median annualized wages are calculated as the hourly rate multiplied by 2,080 (40 hours per week times 52 weeks per year). Median annualized earnings are the reported quarterly earnings multiplied by four.
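
A minimal sketch, with hypothetical median values:

```python
# Sketch: annualizing the DLOA wage measures as defined above.
# The hourly rate and quarterly earnings are hypothetical medians.

median_hourly_rate = 17.25        # hypothetical median hourly rate
median_quarterly_earnings = 7900  # hypothetical median quarterly earnings

annualized_wages = median_hourly_rate * 2080         # 40 hrs/week * 52 weeks
annualized_earnings = median_quarterly_earnings * 4  # four quarters per year
print(f"${annualized_wages:,.0f} wages, ${annualized_earnings:,.0f} earnings")
```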

In order to meet the “sufficiently large” data standard, we aggregate three years of data for each program. Otherwise, we would not be able to provide any data, as most programs would not have at least 25 records each year. Wages are only reported for students who completed either a certificate or a degree in the program. Across all RTC programs going through the program review process, the averages are $35,381 for annualized wages and $31,244 for annualized earnings.

Measure: Current Demand Status

Years Reported: Based on the most recent information available on the website. At the time the data was pulled, the website had been updated as of July 2016.

Definition/Data Source: Employment Security Department (ESD) website. The program’s CIP code is crosswalked with SOC codes to identify associated occupations. The identified SOC codes are entered into the search box for King County to obtain labor market information. The following website provides documentation as to how the ESD determines occupation demand:

Measure: Projected Job Growth (Average Annual Growth Rate)

Years Reported: Based on the most recent information available on the website. At the time the data was pulled, the projections were reported for 2014-2024.

Definition/Data Source: Employment Security Department website. The program’s CIP code is crosswalked with SOC codes to identify associated occupations. The identified SOC codes are entered into the search box for King County to obtain labor market information.

#3. Student Satisfaction (Student Evaluations)

Measure: Are this program’s policies, practices, and resources adequate?

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: Derived from the quarterly student evaluations. Students are asked ten questions about the program overall, and their responses are averaged for each year. For 2013-2014 and 2014-2015, responses were recorded on a scale from 1 (strongly disagree) to 4 (strongly agree); a neutral category was added for 2015-16, changing the scale to 1 to 5. As a result, interpretation of the results needs to account for the difference in scoring between these two timeframes. The ten questions are listed below, followed by a short sketch of the yearly averaging.

QUESTION / Strongly Agree / Agree / Neutral / Disagree / Strongly Disagree
The grading rules and policies were clearly stated in writing. /  /  /  /  / 
The objectives of each course were clearly explained. /  /  /  /  / 
The information presented in class related directly to the course objectives. /  /  /  /  / 
I had opportunities to practice what was taught in class. /  /  /  /  / 
The textbook(s) and other supplies were valuable. /  /  /  /  / 
Assigned reading, homework, and activities helped me learn what was taught in class. /  /  /  /  / 
My progress was evaluated in a variety of ways (tests, papers, projects, etc.). /  /  /  /  / 
I was given meaningful feedback on tests and other work. /  /  /  /  / 
Grades were assigned fairly. /  /  /  /  / 
Computers, equipment, and tools were sufficient to meet course objectives. /  /  /  /  / 
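
Given the scale change described above, yearly averages should be read against the scale in use that year. A minimal sketch with hypothetical responses:

```python
# Sketch: averaging evaluation responses per year and flagging the scale
# in effect. Response values are hypothetical.

scale_max = {"2013-2014": 4, "2014-2015": 4, "2015-2016": 5}
responses = {
    "2013-2014": [4, 3, 4, 3],    # 1-4 scale, no neutral option
    "2015-2016": [5, 4, 3, 4],    # 1-5 scale, neutral = 3
}

for year, values in responses.items():
    avg = sum(values) / len(values)
    print(f"{year}: {avg:.2f} (on a 1-{scale_max[year]} scale)")
```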

Measure: Is this program meeting your expectations?

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: Derived from the quarterly student evaluations. Only students who indicate they are in their first quarter of the program are asked this question. For 2013-2014 and 2014-2015, responses were recorded on a scale from 1 (strongly disagree) to 4 (strongly agree); a neutral category was added for 2015-16, changing the scale to 1 to 5. As a result, interpretation of the results needs to account for the difference in scoring between these two timeframes.

Measure: Overall rating of the program

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: Derived from the quarterly student evaluations. The campus-wide averages are provided for this question for comparison purposes and to give more context for interpreting the data.

#5. Program Advisory Committee (Annual Survey)

Measure: Advisory committee feedback

Years Reported: 2015-2016

Definition/Data Source: The annual Advisory Committee Survey, administered each spring quarter. Only 2015-2016 data is provided here, as the survey was revised and we have only one year of data so far. Committee members are asked to review ten aspects of the program: program outcomes, curriculum, program length, employability soft skills, technology/equipment, program resources, real-world learning opportunities, business/industry collaboration, innovation, and overall program quality. For each item, respondents are given detailed response options and asked to select the best fit. This feedback provides the faculty and dean team with areas of strength as well as areas for growth. Responses are scored on a scale from 1 to 4 and averaged for the program.

Example item:

Overall Program Quality
  • Don’t Know: I am not aware of how the program has addressed this area.
  • Minimal/Emerging: Compared to other programs in similar settings, this program is of low quality and has poor outcomes.
  • Quality: Compared to other programs in similar settings, this program appears to be performing at an average level; there are few aspects of this program that are worthy of replication.
  • Exemplary: Compared to other programs in similar settings, this program appears to be of high quality, and has many notable components that could be adapted in other settings.

Measure: Average number of respondents

Years Reported: 2015-2016

Definition/Data Source: The annual Advisory Committee Survey, administered each spring quarter. Only 2015-2016 data is provided here, as the survey was revised and we have only one year of data so far. In the future, this will be the average number of respondents each year, but for now it is just the number of responses for the spring 2016 administration.

Measure: Advisory committee representation

Years Reported: Report for the current advisory committee when you complete the template.

Definition/Data Source: This is not provided by the IR Office and needs to be completed by the faculty/dean team. A determination needs to be made as to whether or not the advisory committee representation is sufficient in each of these areas: gender, race, employer vs. employee, labor unions, and RTC alumni. This question needs to be completed even if there were no responses from the advisory committee. If you find that the advisory committee is not representative, it might be important to address this in the action items.

#6. Program Personnel (Faculty FTE & Student/Faculty Ratios)

Measure: Program Faculty FTE-F

Years Reported: 2013-2014, 2014-2015, and 2015-2016

Definition/Data Source: Data Warehouse, CLASS table. The Faculty FTE is not a count of how many full-time faculty teach in the program. Rather, the FTE-F, or Full Time Equivalent for Faculty, is a term used by the SBCTC and the colleges to refer to faculty workload. The SBCTC defines FTE-F as one instructional employee assigned to teach a full-time load of courses for nine months. The colleges define FTE-F as a unit of measurement representing the percentage of a full-time workload (or effort) required for an instructor to teach a class.

For example:

  • Instructor Jones (FTE-F of 100% in the Quarterly Staff Extract File) teaches four classes. Each of Instructor Jones’s classes will be assigned an FTE-F of 25%, or .25.
  • Instructor Smith (also FTE-F of 100% in the Quarterly Staff Extract File) teaches five classes. Each of Instructor Smith’s classes will be assigned an FTE-F of 20%, or .20.

Since FTE-F is based on a calculation which combines both instructor and class records, the FTE-F percentage can be different for different sections of the same class.
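
A minimal sketch of the per-class assignment, following the Jones and Smith examples above:

```python
# Sketch: assigning per-class FTE-F from an instructor's overall FTE-F.
# An even split is shown; actual per-class values can differ by section,
# as noted above.

def per_class_ftef(instructor_ftef, classes_taught):
    """Split an instructor's FTE-F (1.00 = 100%) evenly across classes."""
    return instructor_ftef / classes_taught

print(per_class_ftef(1.00, 4))  # Instructor Jones -> 0.25
print(per_class_ftef(1.00, 5))  # Instructor Smith -> 0.2
```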