Overview of the TEIP Program Assessment Protocol
The TEIP Program Assessment Protocol is designed to support health promotion practitioners in the use of Evidence-Informed Practice principles to enhance local programs.
Evidence-Informed Practice is defined as the "best available practice or policy based on available evidence for a specific group" (ref). This term expands on commonly known terms such as Best Practice, Better Practice, and Recommended Practice, and is intended to encompass the range of terminology used to describe effective health promotion programs.
Evidence Supporting the TEIP Program Assessment Protocol
The TEIP Program Assessment Protocol is based on research by the University of Waterloo's Population Health Research Group to identify "Best" and "Promising" Practices in Chronic Disease Prevention. The protocol does NOT aim to label a program as 'best' or 'promising'. Its purpose is to identify areas for enhancement across 19 criteria associated with exemplary community-based health promotion programs. In this regard, the Program Assessment Protocol is a quality assessment and capacity-building tool.
Through applying the 19 criteria, practitioners can use these basic elements of Evidence-Informed Practice to enhance local health promotion programs.
Steps in the TEIP Program Assessment Protocol
1. Select a program to assess
2. Complete the Program Survey
3. Three independent reviewers complete the Program Assessment Worksheet
4. Hold a Consensus Meeting using the Consensus Summary Sheet
5. Write a Final Consensus Report to communicate the final criteria ratings and suggestions for program enhancement
6. Make decisions about your program upon reviewing and discussing the Final Consensus Report
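As a purely illustrative aid, and not part of the protocol itself, a team that wants to track its progress through these six steps could record them in a simple checklist. The Python sketch below is hypothetical; the step names are abbreviated from the list above, and everything else is an assumption.

```python
# Hypothetical sketch only: a minimal tracker for the six protocol steps.
# Step names are abbreviated from the list above; the structure is invented.
from dataclasses import dataclass, field
from typing import Optional

STEPS = [
    "Select a program to assess",
    "Complete the Program Survey",
    "Reviewers complete the Program Assessment Worksheet",
    "Hold a Consensus Meeting",
    "Write a Final Consensus Report",
    "Make decisions about the program",
]

@dataclass
class ProtocolTracker:
    completed: set = field(default_factory=set)  # step numbers finished so far

    def complete(self, step_number: int) -> None:
        self.completed.add(step_number)

    def next_step(self) -> Optional[str]:
        """Return the first step not yet completed, or None when all are done."""
        for number, name in enumerate(STEPS, start=1):
            if number not in self.completed:
                return name
        return None

tracker = ProtocolTracker()
tracker.complete(1)
print(tracker.next_step())  # -> Complete the Program Survey
```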
Selecting a Program to Assess
The Program Assessment Protocol is particularly valuable in assessing community-based health promotion programs. Here are a few suggestions to keep in mind when choosing a program to assess:
The program to be assessed should have:
- An overall goal of addressing a specific aspect of health promotion
- At least some data accumulated and program materials available
- The ability to demonstrate some benefits
- Multiple program activities
- Motivated and committed program staff
The last criterion is critical. Staff involved with the program to be assessed must be motivated and must want the program to go through the assessment process. Ultimately, it is the program staff who will take part in the assessment and who will plan and implement the resulting improvements.
Selecting Program Reviewers
When selecting Program Reviewers, keep these characteristics in mind. A reviewer may:
- Have expertise in your specific area of health promotion
- Be able to objectively assess your program
- Work with a similar program to yours
- Be well known for their success in health promotion programming
- Be someone unfamiliar with your program to provide a fresh and objective perspective
Typically, external reviewers are recruited when communities or agencies agree to act as external reviewers for each other. It is strategic to recruit at least one external reviewer with expertise related to your program and/or intended audience, as this may result in better suggestions for program enhancement. However, the process of completing the survey itself can generate valuable ideas and direction for program enhancement, even before the Program Reviewers provide their input!
Organizing the Consensus Meeting
The Consensus Meeting is perhaps the most critical aspect of the Program Assessment Protocol. The main purpose is to develop consensus regarding the final assessment levels and suggestions for enhancement for each of the criteria. However, experience has shown that participation in the consensus meeting is where the real learning occurs.
To hold a successful Consensus Meeting, the following roles are recommended:
1. Consensus Meeting Coordinator
- Forwards appropriate documents to the Survey Respondent and Program Reviewers
- Communicates timelines and follows up to ensure the Survey and Program Assessment Worksheets are received and completed as required
- Coordinates timetables and logistics in order to schedule a Consensus Meeting (in person or via teleconference) within two weeks of assessment completion
- Summarizes the information from each of the three Program Reviewers onto the Consensus Summary Sheet and forwards a copy to each participant prior to the Consensus Meeting (an illustrative tabulation sketch follows the role descriptions below)
2. Independent Consensus Meeting Facilitator
- Uses facilitation skills to ensure all participants have equal opportunity to contribute to discussion
- Keeps the discussion within the established timelines (usually 90 minutes)
- Ensures that a Consensus Level and Suggestions for Enhancement are discussed and documented for each of the 19 criteria
3. Consensus Meeting Recorder
- Takes notes on the discussion and documents the final Consensus Levels and Suggestions for Enhancement for each criterion
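The protocol itself works from paper worksheets, but a Consensus Meeting Coordinator comfortable with scripting could tabulate the three Program Reviewers' ratings and flag criteria where they diverge before the meeting, as noted in the Coordinator's duties above. The Python sketch below is purely illustrative: the reviewer labels and the numeric rating scale are invented, and only the criterion names are drawn from the protocol.

```python
# Illustrative sketch: tabulate three reviewers' ratings per criterion and
# flag disagreements for discussion at the Consensus Meeting.
# Reviewer labels and the 1-3 rating scale are invented for this example;
# the criterion names appear in the TEIP Program Survey.

reviews = {
    "Reviewer A": {"Needs Assessment": 3, "Collaboration": 2, "Sustainability": 1},
    "Reviewer B": {"Needs Assessment": 3, "Collaboration": 3, "Sustainability": 1},
    "Reviewer C": {"Needs Assessment": 2, "Collaboration": 3, "Sustainability": 1},
}

# Collect every criterion mentioned by any reviewer.
criteria = sorted({c for ratings in reviews.values() for c in ratings})

for criterion in criteria:
    ratings = [reviews[name][criterion] for name in reviews]
    status = "agreed" if len(set(ratings)) == 1 else "DISCUSS"
    print(f"{criterion:<20} ratings={ratings} -> {status}")
```

Criteria flagged DISCUSS would then be natural focal points for the facilitated discussion.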
Writing the Final Consensus Report
The Final Consensus Report contains a concise, thoughtful and orderly summary of the most useful suggestions for program enhancement. It is a distillation of the best suggestions arising from the Consensus Meeting and from the Program Assessment Worksheets completed by the Program Reviewers.
The person selected to write the report should have excellent writing skills, be adept at synthesizing multiple sources of information, and be able to critically assess the usefulness of the suggestions for enhancement. Ideally, the writer has attended the Consensus Meeting (e.g. one of the Program Reviewers could write the Final Consensus Report).
Applying the Learnings
1. Do not expect to implement every suggestion
- Your team is the expert on your program and will have a good sense of whether or not a suggestion is feasible, workable, etc.
- Funding, staffing resources, ministry requirements and client characteristics must be considered in selecting enhancements to implement
2. Set priorities
- It will not be possible to implement all suggestions, even good ones
- Your team will need to set priorities
- Consult with your team and partners to select the most important and strategic areas for improvement (one illustrative scoring approach is sketched after this list)
3. Identify the research, resources and tools you will need
- You may require additional research to support your program or you may need to search out new resource manuals, materials or evaluation tools
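One illustrative way to support step 2, offered only as an illustration and not as part of the protocol: the team could score each suggestion for importance and feasibility and review the highest-scoring items first. All suggestion names, scores, and weights in the Python sketch below are invented.

```python
# Hypothetical sketch: rank suggestions for enhancement by team-assigned
# importance and feasibility scores (1 = low, 5 = high). All values invented.

suggestions = [
    ("Add a process evaluation survey", {"importance": 5, "feasibility": 4}),
    ("Develop a program logic model",   {"importance": 4, "feasibility": 5}),
    ("Recruit an opinion leader",       {"importance": 3, "feasibility": 2}),
]

def priority(scores: dict) -> float:
    """Composite score weighting importance slightly above feasibility."""
    return 0.6 * scores["importance"] + 0.4 * scores["feasibility"]

# Print suggestions from highest to lowest priority.
for name, scores in sorted(suggestions, key=lambda s: priority(s[1]), reverse=True):
    print(f"{priority(scores):.1f}  {name}")
```

The weights are arbitrary; the point is simply to make the team's trade-offs explicit before committing resources.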
For Further Information and Support
The TEIP On-Line Learning Module, Improving Health Promotion Using Evidence-Informed Practice, is a web-based audiovisual learning tool that provides a self-paced guide to the TEIP Program Assessment Protocol. It can be accessed from the TEIP website (http://teip.hhrc.net) or directly at http://teip.hhrc.net/resource/learning/TEIP_LM1.html.
© Copyright 2006 by Towards Evidence-Informed Practice, Heart Health Resource Centre, Toronto, Canada.
Permission to copy this resource is granted provided source is acknowledged.
How to Complete the Program Survey
Introduction
The purpose of the Program Assessment Protocol is to identify areas for enhancement to guide future program planning and refinement. The Program Survey consists of 19 questions designed to assess how well a program meets the 19 criteria associated with exemplary community-based health promotion programs.
Select someone familiar with the program to be the Survey Respondent and to assemble additional program documentation. The Survey Respondent and the Program Reviewers (who perform the Program Assessment) should not collaborate. This is to ensure integrity in the process, maintain independence of the reviewers and ensure all reviewers have access to the same information.
Guidelines for the Survey Respondent
1. Interpret the meaning of each survey question
- Always review the criterion's Definition and the Assessment Guide in the Program Assessment Worksheet for clues to interpreting the requirements of each criterion.
2. Seek assistance from other members of your program team
- Your program team as well as representatives from the intended audience may have pertinent information to contribute.
3. Formulate and write your response to each survey question
- Identify the key elements to include in your response
- Include data to support your answers where available
- Ensure that your responses are comprehensive, yet concise
4. Collect and attach supporting documentation for each response
- Attach copies of surveys, reports, program logic models, meeting minutes, program brochures, letters of support, etc. that support your responses to the survey questions.
5. Completing a Program Survey may take longer than expected
- A first-time Survey Respondent may require 1 to 3 days to complete the work. Experience in completing the survey shortens the time considerably.
- The more complete and better organized the program documentation, the easier it is to complete the survey
- A completed survey becomes an excellent source of program documentation.
Background Information
Name of Program:
Community:
Survey Respondent: Name: Email: Tel #:
Directions: Please provide approximately 1/2 page of information for each of the following questions.
Program Summary: Write a concise description of your program such that a reader can quickly grasp the reason for the program, the intended audience, the objectives, the major activities, how the program is delivered, who is involved and any other important or distinctive features.
Program Need
1. Needs Assessment
How was the need for this program identified? What data were collected and how recently? How were these data used? If a needs assessment was conducted, provide a summary or report, if available. Please provide examples of the data collection tools used if not included in the report.
2. Duplication Avoidance / Environmental Scan
Was a formal or informal scan of existing community programs conducted? How recently? Please describe how your program uniquely fills any identified gaps. If a scan was conducted, provide a summary or report, if available. Please provide examples of the data collection tools used if not included in the report.
Program Content
3. Theory and Literature Evidence
Describe any research undertaken to guide the development of the program (e.g. journal articles, relevant theories, conceptual frameworks, literature reviews, reports of best and promising practices). How recently was this information developed? Please describe relevant examples of how the research, including theory or a conceptual framework, informed program development and implementation.
4. Program Objectives and Logic Model
List all program-level and activity-level objectives, whether process- or outcome-related. Identify those which are SMART (i.e. Specific, Measurable, Achievable, Realistic and Timely). If a program logic model exists, please provide it.
5. Environmental Support
To what degree, if any, has your program addressed Environmental Support to create physical and/or social environments that support healthy behaviours (e.g. walking trails, bicycle racks at worksites, etc.)?
6. Policy
To what degree, if any, has your program addressed policy (i.e. changing the formal or informal rules of governing bodies to support healthy behaviours)? Some examples are advocacy to support by-law change or worksite policy development.
7. Sequencing
Does this program involve a sequence of activities designed to maximize population impact over time (i.e. activities in the program are sequenced to move from Awareness to Skill Building to Environmental Support to Policy Development)? Please explain.
Program Process
8. Collaboration
Please describe and/or list the partners involved in this specific program, whether they contribute to program planning and/or implementation and/or evaluation and what resources they contribute beyond regular committee meetings. You may use the headings below to organize the information (optional).
Suggested table columns:
- Group
- Involved Since When? (approximately)
- Role in Program (e.g. planning, implementation, evaluation)
- Active or Passive Involvement?
- Types of Resources Contributed (not amounts)
9. Mobilization of Community Resources
Please describe how the program has identified and accessed any additional community resources not already mentioned in Question 8 above.
10. Community Engagement
Please describe whether and how the intended audience is involved in aspects of program planning, implementation and evaluation. If data were reported elsewhere in the survey, simply indicate where (e.g. see # 9, paragraph 2).
11. Sustainability
How was sustainability of the issue, the program, the behaviour change and the partnerships addressed? To what extent is the program dependent on one-time or special funding?
12. Visibility
How was the program promoted within the community? Was the promotion based on a specific strategy? If yes, please describe it. If relevant, please provide a sample of program materials that supports a "common look" strategy.
13. Opinion Leader Support
Is an opinion leader or champion associated with your program? If so, please comment on who the leader or leaders are, their link to the issue, and their level of involvement.
Program Evaluation
14. Formative Evaluation / Pilot Testing / Focus Testing
How were program activities, materials and methods tested for relevance, comprehension and acceptability to the primary audience? How were the results used? If a formative evaluation was conducted, provide a summary or report, if available. Please provide examples of the data collection tools used if not included in the report.
15. Process Evaluation
Was a process evaluation conducted that assessed program participation and/or implementation? How recently? Describe other program data tracked during implementation. How was this information summarized, and how often? How has this information (both process evaluations and other tracked data) been used to modify programming? If a process evaluation was conducted, provide a summary or report, if available. Please provide examples of the data collection tools used if not included in the report.
16. Outcome Evaluation
What data have been collected to measure program outcomes? If a logic model is available for your program, please identify which of the outcome objectives were assessed. What control measures have been established to enhance the validity and reliability of the findings? How have results been summarized and reported, and how recently? Please provide examples of data collection tools and methodology (e.g. pre- and post-tests, participant feedback surveys) and any reports, if available.
17. Program Documentation
If other communities wanted to duplicate your program, what documentation is available? Please list it. How easily could they replicate all aspects of your program, from planning to implementation to evaluation, at both the agency and participant levels? What information is missing? Please provide examples of relevant documentation. Where impractical (e.g. a document is too large), please forward a table of contents instead.
18. Context Documentation