Service Unit Continuous Improvement Process Workbook

Revised September 2014

For Questions Regarding this Material Contact:

Kathleen S. Fenton

Office of Institutional Effectiveness

972.985.3737

TABLE OF CONTENTS

Chapter 1: Collin College Administrative & Educational Support Unit Overview 1

Purpose 1

Focus 1

Institutional Effectiveness Characteristics 2

Identification of Administrative and Educational Support Services 2

Documenting a Continuous Improvement Plan 4

Chapter 2: Overview of Assessment 9

Purpose of this Chapter 9

Background on Assessment 9

Introduction to Assessment 10

Purposes of Assessment 12

Characteristics of Effective Assessment 12

Chapter 3: Defining Outcomes 13

Purpose of this Chapter 13

Definition of Outcomes 14

Developing Operational Outcomes 15

Chapter 4: Continuous Improvement Plan 23

Step 1 – Apply Timeline for CIP 24

Step 2 – Identify and Document Team/Stakeholders 25

Step 3 – Service Unit Information 26

Step 4 – Document institutional outcome statements, measure(s) and target(s) 27

Step 5 – Data 35

Step 6 – Findings/Conclusions 35

Step 7 – Service Unit Continuous Improvement Plan (CIP) – Combine Steps 39

Step 8 – Document the use of assessment results/findings to improve your unit’s effectiveness 43

Step 9 – Close the Loop 46

Step 10 – Strategic Planning Input 46

Appendix A: References 50

List of Exhibits

Exhibit 1: Template for a Continuous Improvement Plan 5

Exhibit 2: Example of a Completed CIP Plan 19

Exhibit 3: Entering Outcomes on the CIP document 29

Exhibit 4: Examples for Generating Ideas for Developing your Outcome Statements 30

Exhibit 5: Add a Measure and Target for Each Outcome 32

Exhibit 6: Entering your Findings/Conclusions 37

Exhibit 7: Service Unit CIP – Combine Steps 41

Exhibit 8: Document the Evidence of Improvement Example 45

Chapter 1: Collin College Administrative & Educational Support Unit Overview

Purpose

The purpose of the Institutional Effectiveness (IE) assessment initiative, which includes Administrative and Educational Support (AES) units, is to demonstrate the use of assessment results to improve the services provided to students in order to positively impact their learning outcomes. Our regional accrediting association, the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC), requires an accredited college to identify expected outcomes for its administrative and educational support services, assess whether it achieves those outcomes, and provide evidence of improvement based on analysis of the results (Comprehensive Standards 3.3.1.2 and 3.3.1.3).

Collin College conducts a 5-year Program Review cycle that includes two 2-year cycles of Continuous Improvement followed by a year of self-study and planning. Program Review is documented for support units in WEAVE to meet the SACSCOC requirements. Both administrative and educational service units review and revise their identified outcomes, select related measures, set targets to determine progress, analyze the prior year’s data and develop their continuous improvement plans by the end of December in alternate years. The spring semester is used to make logistical plans, including budgeting, and/or to begin to implement the improvement plan. Plan implementation begins no later than the next fall with assessment data collected during the spring. Improvement plans support institutional priorities and are integrated into the strategic planning and budgeting processes for the college.

Focus

Institutional effectiveness encourages and supports innovation by providing information that clearly focuses on the effectiveness of services to students, faculty, and staff, or on the end results of processes. Assessment activities are focused at the unit level and address improvement of services. AES services improvement activities focus on effectiveness through the fine-tuning or incremental improvement of services, based upon existing staff, resources, and procedures.

Institutional effectiveness planning is related to the institution's statement of purpose. It asks the basic questions, “How well are our students learning?” and “How effectively and efficiently are our administrative services functioning?” The answers are ends- or outcomes-oriented and focus upon the results of the units’ efforts (as opposed to the efforts or processes implemented). Note: Strategic planning includes implementation of new strategies, technology, etc., while IE focuses on the results of these strategies, not the strategies themselves.

Institutional Effectiveness Characteristics

Institutional effectiveness planning is characterized by:

·  Expected results (administrative objectives)

o  Measures of increased satisfaction, timeliness of response, courteousness, and knowledge

o  Direct measures of increased levels of service

o  Validation of services by external reviews

o  Impact on users, or users’ benefit(s) after receiving services

·  Means of assessment

·  Targets set as a means of gauging progress

·  Actual assessment results

·  The use of results to improve services

Ask, “Given the personnel and resources we currently have, how can our unit improve the impact of its services?”

Identification of Administrative and Educational Support Services

Identification of AES services is flexible, based on the criteria of separate budget or distinct services. Each service area that has either a separate budget or a distinct service, even if it falls under a larger unit umbrella, should develop its own separate IE assessment plan. Alternatively, each distinct service area should be represented by one or more administrative outcomes/means of assessment/use of results for improvement.

Administrative Units—provide services which maintain the institution and are essential to its operations, such as Accounting, Registrar, Physical Plant, Human Resources, etc.

Academic and Student Support Units—contribute directly to student learning or instruction, such as Academic Advising, Math and Writing Centers, Open Computer Labs, Career Counseling and the Library. Each educational support unit focuses upon providing services directly to students or contributes to the institution’s overall learning environment. These units have both “process” and “student” outcomes.

Differences between academic and workforce program assessment and AES services assessment

The primary difference between student outcomes assessment, as practiced in instructional programs, and assessment in AES services relates to the focus of the expected results. In the instructional arm of the college, expected results focus upon educational or student learning outcomes: what students will be able to think, know, or do when they have completed their program. Statements of each instructional program’s intended results focus on the results of student learning, not on what the faculty or department intends to do.

Service Outcomes: Overall Satisfaction, Efficiency, Effectiveness and Strategic Plan goal(s) such as Completers

On the other hand, with administrative and academic and student support units, statements regarding what services the unit intends to accomplish are entirely acceptable since many of these units are removed from direct contact with the learning environment. Outcomes for administrative units consist primarily of such “process oriented” statements describing the support process or service which the unit intends to accomplish. At Collin, our service unit outcome priorities focus on unit support of strategic goals, efficiencies, effectiveness and constituents’ satisfaction. Occasionally, “expansion of services” is an appropriate outcome. Such an outcome may arise when there are opportunities stemming from new technology, changes in state or federal regulations, a competitor’s actions that impact your service area, etc.

Components for Discussion in a completed AES Service Assessment/Improvement Plan

·  Identification of primary functions and services or products – how the service, as described in the unit’s mission statement, supports the college’s mission and statement of purpose

·  Outcomes - Intended administrative end-result or outcome statements related to primary functions

·  Measure(s) - to quantify the outcomes and the Target(s)–as criteria for success for each administrative objective

·  Results/Findings - A summary of the data actually collected when the planned assessment occurred and unit staff conclusions about its meaning

·  Improvement Plan - Description of how these data are used to improve services

Examining Results to Reach Conclusions and Determine Findings

Data-driven decision making requires you to find and examine data from all available sources to establish benchmarks and identify key areas to target for improvement.

These data may not be specifically designed for your unit, nor may all of them be quantifiable in a given cycle. Look for more than one type of data, and for data from more than a single source or perspective. Observations, anecdotes, and “borrowed” data from other units may help corroborate and increase confidence in your findings and conclusions.

Documenting a Continuous Improvement Plan

The following Continuous Improvement Plan template illustrates the components of an AES Unit plan and improvement actions. The Continuous Improvement Plan (CIP) provides documented evidence of a data-based improvement cycle. Collect and report data for each outcome, even though it is recommended that your unit focus on one outcome for improvement.

Why collect and report data for each outcome, if you are only going to focus on one for improvement? Think of the outcomes as key performance indicators like the vital signs used to monitor a person’s health: pulse, temperature, and blood pressure. If one or more of these vital signs goes awry, it is a signal to take action to bring the body’s system back into alignment. If you monitored only one of these signs, such as the pulse, and ignored a temperature fluctuation, you might miss an infection until it impacts the pulse rate, when it would be far more difficult to treat. So monitor all your outcomes on a regular basis but focus your improvement actions on the one with the greatest institutional priority or feasibility.

Exhibit 1: Template for a Continuous Improvement Plan

Continuous Improvement Plan (CIP) Documentation

Date: Name of Administrative or Academic and Student Support Unit:

Contact name: Contact email: Contact phone: Office Location:

Mission:

PART I: Might not change from year to year

·  A. Outcome(s) – Results expected in this department/program

·  B. Measure(s) – The instrument or process used to measure results

·  C. Target(s) – The level of success expected

PART II: For academic year (enter year, e.g., 2011-12)

·  A. Outcome(s) – Results expected in this department/program

·  D. Action Plan (Years 5 & 2) – Based on analysis of previous assessment, create an action plan and include it in the row of the outcome(s) it addresses

·  E. Implement Action Plan (Years 1 & 3) – Implement the action plan and collect data

·  F. Data Results Summary (Years 2 & 4) – Summarize the data collected

·  G. Findings (Years 2 & 4) – What does the data say about the outcome?

Use of Assessment Results to Improve Your Service or Unit’s Effectiveness or Efficiency

Close the assessment loop by analyzing the collected data as identified in the plan. Determine what the data mean relative to the intended outcome. This last step is to connect the collected data to the analysis as the basis for the direction and focus of improvement, in a repeating continuous improvement cycle.

It sometimes happens that you have an improvement action in mind even before formally considering the data. But decisions don’t happen in a vacuum. In such an instance, stop and articulate the reasoning (conclusions based on data) that prompted your unit to identify a specific improvement action. Work backwards to identify the data behind your unit’s conclusions. Think of when the idea for the improvement action arose. What factor(s) prompted it? It might be unanticipated data that came to your unit’s attention, including observations or input from others, such as a recently released public report or a new state regulation. These data might not have been known or relevant when the last Continuous Improvement Plan was made. It is appropriate to capture these types of data and integrate them into the analysis phase of the planning process. Then check whether the data as a whole support your conclusions, and confirm that the desired improvement action still retains priority, supported by the available data.

Did the unit meet the target for each intended outcome? If so, consider raising the target of success for that outcome. Or having met the target, choose a new outcome if appropriate. If your results fall short of the target, review the unit’s and the college’s related processes to identify an improvement action. It is recommended that the AES service or unit only focus on one intended outcome for improvement each academic year.

Implementation of AES Unit Assessment

Support units conduct institutional effectiveness planning cyclically, each year. Service and/or unit assessment may be based on institutional data and point-of-service unit surveys. The cumulative results of the annual assessments support incremental improvement with existing resources and provide information for decision making, including during institutional effectiveness planning. Since the budgeting process concludes in late spring, there is an opportunity to request supplemental funds to support expansion of services or conversion to new technologies, when appropriate and supported by evidence-based decision making.

The IE Support Service planning and assessment cycle is supported by two college-wide support services surveys: one for students and a second survey for faculty and staff. These surveys are administered through SNAP and emailed to students, faculty and staff, or all groups as appropriate to the service functions and constituents. Results are posted for unit review and analysis.

Timeline for Continuous Improvement Planning and Assessment

The continuous improvement process is cyclical. A 2-year cycle consists of one year of implementation and data collection followed by a year of data analysis to determine findings and development of an action plan for improvement. In the 5-year Program Review cycle, there are two 2-year continuous improvement cycles followed by a year of self-study and planning to prepare to begin again with the next two 2-year continuous improvement planning and assessment cycles.

Year One:

·  Implementation is usually during the academic year

·  Data collection occurs during the spring semester, at a minimum

Year Two:

·  Analyze available data and document improvements, beginning in the summer term and continuing through early fall.

·  Develop a new data-driven action plan and report results and action plan to Institutional Effectiveness Office by February 1.