Choosing an Evaluator RSP 1/31/2013
Choosing an External Lead Evaluator
Steps Involved in Competitive Grant Situations:
1. Have a rough evaluation plan in mind
2. List all of the evaluation tasks (design, data collection, analysis, etc.) for various components of the plan
3. Consider the skills and costs of the personnel needed to perform the tasks
4. Determine which tasks can be performed in house (and budget for them)
5. Investigate external evaluation candidates and options
6. Interview/negotiate roles, tasks, costs with candidates & choose one
7. Share draft project plan with lead evaluator asap so s/he can draft evaluation plan for you (at no cost!)
8. If it’s terrible, choose another candidate
9. If it’s decent, collaborate with evaluator to make it excellent (well-integrated, detailed, budgeted)
10. Negotiate final subcontract/consultant budget as well as other in-house evaluation costs
11. Ask the evaluator (nicely) to draft your Human Subjects section and/or IRB submission
12. If funded, develop a contract that spells out mutual expectations for the working relationship
Office of Juvenile Justice and Delinquency Prevention. Evaluating Juvenile Justice Programs: A Design Monograph for State Planners. Washington, DC: Prepared for the U.S. Department of Justice, Office of Juvenile Justice and Delinquency Prevention by Community Research Associates, Inc.; 1989. pp. 56-57.
Recognizing a Good Evaluator:
Evaluators are made (often self-made) and not born, so it is possible to identify what they are made of. There are three critical qualities of a good evaluator: experience, skill, and brains. Each is reviewed below.
Program evaluators learn by doing, and the key to conducting good evaluations is knowing the ins and outs of the political and logistical aspects of program evaluation. Because there are few academic programs in the country that offer a program evaluation curriculum, a good evaluator is usually one who has conducted a number of evaluations and whose references will vouch for the work done.
Evaluation skill refers to knowledge of research design, methodology, and statistics. These are taught in academic programs, and they can also be readily identified in a review of written materials provided by a prospective evaluator. Like experience in evaluation, research design and methods skills are refined in practice, so they will most often be found in someone who has applied them in prior program evaluations.
Brains refers to the thought process behind program evaluation research. The best evaluations often involve unique or creative applications of research skills to the particularities of the program being evaluated. This may show up as a special sampling strategy, use of a measure borrowed from another discipline, creative use of archival records, or an explanation of statistical methods for non-technical readers that conveys competence and confidence in the subject matter.
You have three sources of information that will help you determine the experience, skills, and brains of an evaluator: (1) written examples of past work performed by the evaluator, (2) the resumes and references provided, and (3) the actual evaluation plan submitted for the program at hand. Careful consideration of each of these, along with further discussion with the potential evaluator when you feel it is necessary, will help you in the selection process.
U.S. Department of Housing and Urban Development, Office of Policy Development and Research. A Guide to Evaluating Crime Control Programs in Public Housing. Washington, DC: Prepared for the U.S. Department of Housing and Urban Development by KRA Corporation; 1997. (excerpted)
A Good Evaluator:
Is willing to work collaboratively to develop an evaluation plan that meets your needs.
Is able to communicate in simple, practical terms.
Has experience evaluating similar programs and working with similar levels of resources.
Has experience with statistical methods.
Considers cultural differences.
Has the time available to do the evaluation.
Has experience developing data collection forms or using standardized instruments.
Is willing to work with a national evaluation team (if there is one).
Will treat data confidentially.