Considerations for PRiSM Outcomes Reporting for FY07

May 2007

Will Snyder

413/545-3876

I. General Background on Outcomes

Outputs vs. Outcomes. Many staff have already begun to report individual outputs, or project activities, on PRiSM. In contrast to outputs (what we do), outcomes are reported in terms of actions taken or learning demonstrated by our program participants (what they do). The Logic Model helps to clarify the relationship between the two. It’s worth getting familiar with this tool.

Audience for PRiSM Reports of Outcomes. The audience for the reports generated by PRiSM is Extension’s director and program directors, for their use with internal (university) and external (federal, state, and community) stakeholders. Online reporting may seem mechanical and impersonal, but your readers will be living, breathing stakeholders. Keep these people in mind as you report.

How to Report Outcomes in General. There are two ways to report outcomes in PRiSM: 1) Indicators and 2) Narratives (with narratives taking two forms: Project Impact Statements, and Anecdotes and Testimonies). Here is a brief review:

1. PRiSM Indicators. These are specific, measurable statements about the people reached by our programs -- how many have gained particular skills and knowledge and have taken particular actions as a result. They “indicate” the extent to which an outcome in our five-year plan has been achieved.

In our reports, PRiSM requires that we provide actual numbers for each of the indicators specified in our individual plans. This reporting function is required for the new CSREES report. In PRiSM, we also need to describe how we arrived at those numbers, and our confidence in those numbers.

Individual reports on indicators are compiled by PRiSM into project reports as well as reports on the critical issues we outlined in our five-year plan. Reporting on indicators is required for all projects.

Reporting on indicators is discussed in more detail below.

2. Narrative Reports. These reports in text form can be used to demonstrate or explain the impact of a project in ways that cannot be adequately captured by the indicator numbers.

A. Project Impact Statements. These are similar to the information reported in the “Short Impact” section of CSREES reports in past years. They provide a fuller picture of the project for a range of stakeholders. Including Project Impact Statements is recommended for all projects. You can paste in text and attach documents.

Depending on the range of stakeholders who are interested in the project, you may decide to report on a variety of outcomes:

  • Educational outcomes, including but not limited to the Indicators for the CSREES report. This is also the place to report more complete background and context for the indicator numbers reported to CSREES.
  • Strategic outcomes, such as results of our actions to position programs for new funding, or evidence of strengthened relationships with partner organizations.
  • Management outcomes, such as results of new promotion or recruiting systems, or evidence of improved evaluation participation.

In all cases, it is essential to describe each outcome clearly and to provide supporting evidence. Keep in mind that these are reports of outcomes, not outputs.

B. Anecdotes and Testimonies. This PRiSM narrative space can be used to display what partners, stakeholders, participants, and others have said about our work. This is the place for attaching emailed thank-you notes, quotes from evaluations, and links to news articles related to our outcomes. This evidence can supplement, reinforce, and add some color to Project Impact Statements. Including this evidence is optional but recommended. Again, you can paste in text and attach documents.

II. Specifics for Reporting on Indicators

For every indicator that we included in our individual plans, PRiSM asks us to answer this question: “How many changed (in the way that was desired/anticipated) for each location?” In other words, to report, we must gather information about those who were affected by our project efforts: What did they learn? How did their behavior change? We also need to track the communities these participants are from.

How much and what kinds of evidence are necessary to report adequately on an indicator? This is essentially a judgment call, depending on the audience for the report. The requirements of evidence for a peer-reviewed research publication are different from the requirements of evidence needed for a legislator to decide whether a program has been worthwhile for the citizens of her district.

In PRiSM reporting, the audience is Extension’s director and program directors. The aim is to give them what they need to describe and explain Extension’s work to our various internal and external stakeholders, and to satisfy these stakeholders’ (especially the critics’) questions about our programs. Stakeholder standards of evidence can sometimes be quite high and can also sometimes be quite individual and particular.

The most efficient (and often the most convincing) program evaluations use evidence that is generated in the normal course of the program. Good evaluation does not require a lot of time and extra steps, just a thoughtful plan for gathering evidence. For example, 4-H project records already document the 4-H member’s learning and development in an authentic, detailed, and credible way.

Some questions to consider in assessing the quality of your evidence:

  • What portion of the participants does the evidence represent? If a sample, how representative is it?
  • Does the evidence indicate a change in knowledge? actual behavior change based on that learning? change in the community or environment as a result of these behaviors?
  • Can the learning and changes be linked to our programs? To what extent have our programs caused the change?

III. Practical Evaluation Tasks for FY07

Much of your team’s planning effort this summer will be spent developing new project plans (including evaluation plans) and individual plans for FY08. These plans will be due before your outcome reports for FY07 are due. However, you and your team may find it useful to do some of your outcomes reporting for this year first, so you can learn from your mistakes this year rather than next!

FY07 is our first year with this reporting system. Not all of us had a clear plan in mind for how we would measure indicators when we wrote or selected them from menus in PRiSM. If this was true for you, here is my suggestion for a way to proceed. I am available to help.

Consider these questions:

  • How do you get your own gut sense of whether this program is working or not? What “indicators” do you use, PRiSM or no PRiSM? If you could have a few hours with the Extension Director to show him or her the most compelling evidence that you are getting good outcomes, where would you go and whom would you have him or her talk to? Keep these common-sense answers in mind.
  • Look at the outcomes and indicators you chose for your PRiSM plan. What specific numbers and measurements do you already have in hand that you can use as evidence for your indicators? What additional evidence will be easy to collect?
  • What’s the quality of the evidence? Divide it roughly into two categories: 1) stronger (resulting from a plan that is well designed and carried out, using systematic and purposeful data collection) or 2) weaker (based on anecdotal, partial, inferred, or weakly documented evidence).
  • Is stronger evidence needed for the Extension Director to answer stakeholders, especially the skeptics? You may know your critics better than anyone else.
  • If so, what are the opportunities for efficiently gathering evidence for this indicator in the next few months? What opportunities will provide the most convincing evidence?

This is a key point: For FY07, the most important information in your report on indicators will not be the numbers for each indicator. It will be your description of how you arrived at those numbers, and your assessment of the validity of those numbers. This should be written up and entered into the “Notes and Documentation” text box in PRiSM.

Final FY07 PRiSM reports, including indicators with Notes and Documentation and any narrative reports, are due Friday, November 30.

I am available to work with you to answer these questions, and to plan ways to collect more evidence if this is needed.