NERP Monitoring and Evaluation Strategy

Section 1: Introduction

1.1 National Environmental Research Program

The National Environmental Research Program (NERP) is an ongoing program with around $20 million available per annum for environmental public good research. It builds on the lessons learned from the Commonwealth Environment Research Facilities program, replacing that program and focusing more specifically on biodiversity and improving research delivery to the Australian Government, other end users and stakeholders.

NERP is administered by the Department of Sustainability, Environment, Water, Population and Communities (the department). It currently has 4 years of identified funding approved between 2011 and 2014.

1.2 The Monitoring and Evaluation Strategy

Evaluation is an important part of the delivery of government programs and policies, supporting decisions on a number of elements, such as the appropriateness of a program's design and implementation and the achievement of its intended results.

The NERP Monitoring and Evaluation Strategy (the Strategy) process is broken down into four interrelated components:

»Monitoring – the biannual collection and analysis of information to assist timely decision-making. This ensures accountability and provides the basis for evaluation and learning.

»Evaluating – the biannual assessment of the impact, appropriateness, effectiveness, efficiency and legacy of NERP hubs and the program.

»Reporting – communication of the findings associated with the evaluation process.

»Improvement – the use of evaluation findings to inform each hub’s next Annual Work Plan and the ongoing revision of its Multi-Year Research Plan.

1.3 Reasons for having a Monitoring and Evaluation Strategy

The Strategy enables assessment of progress towards the four-year NERP outcomes. The Strategy also provides a framework for the collection and assessment of information to inform all stages of the program cycle and to facilitate continuous improvement.

The Strategy provides a roadmap for undertaking the Monitoring and Evaluation (M&E) activities required for outcomes reporting within the scope of NERP, rather than detailed instructions for doing so; those instructions are set out in the NERP M&E Plan.

While reflecting the requirements of the NERP, the Strategy takes into account Australian Government guidelines for outcomes, achievements and reporting. Australian Government guidelines identify best practice considerations for implementing initiatives, including the requirements for accountability and transparency in expenditure of public funds through reporting by outcomes.

The Strategy aligns with the reporting cycle outlined in each hub’s Multi-Year Research Plan (MYRP) and Annual Work Plan (AWP). It will enable each hub to provide information that answers two key questions:

»To what extent are each hub and the program making progress towards their stated project, annual and multi-year outcomes?

»How well are these outputs and outcomes being communicated to policy decision makers to allow them to effect change?

1.4 Structure of this document

The Strategy is intended for two user groups:

»The department, which is responsible for the delivery of the NERP, and

»NERP funding recipients.

To assist both groups in the implementation of the Strategy, this document is structured as follows:

»Section 2 describes the Strategy and how it integrates the four-year outcomes for the NERP with the annual work planning cycle.

»Section 3 sets out the Strategy requirements for recipients.

»Section 4 sets out the Strategy requirements for the department.

This Strategy is supported by a series of documents on M&E-related issues (Figure 1).

Figure 1: Relationship between the Monitoring and Evaluation Strategy and supporting documents

Section 2: Monitoring and Evaluation Strategy Methodology

2.1 Principles

The following principles provide overarching guidance for the development and conduct of NERP M&E activities. M&E activities and outputs will be:

»clear, meaningful, readily understandable, based wherever possible on existing documentation and processes, and linked to the relevant phase in the NERP’s four-year plan;

»consistent with international and ethical best practice for evaluation, and with Indigenous customary and scientific norms for information collection, analysis and use; and

»designed to measure the program’s impact against the policy priorities specified by the department.

2.2 Responsibilities

Responsibility for implementing the Strategy is shared between the research hubs and the department. Each hub has responsibility for preparing and implementing an M&E Plan in accordance with the requirements outlined in this Strategy and the associated NERP M&E Plan.

2.3 Framing the Strategy: using Program Logic and Key Performance Indicators

2.3.1 Program Logic

Developing a program logic is a key element of the evaluation process because it articulates the rationale behind an initiative. Program logic describes the relationships between activities and desired outcomes, showing a series of expected consequences, not just a series of events, at different outcome levels within the program logic hierarchy. The Strategy is underpinned by program logic, against which the key performance indicators for the initiative can be clearly articulated.

Program logic is applied at the overall NERP level, and at the hub and project levels. This enables planning for evaluations at the relevant scale as well as an understanding of the links between project activities, targets and four-year outcomes.

Figure 2 uses the program logic hierarchy to show the different levels of outcomes expected at the NERP whole-of-initiative scale and the broad cause-and-effect links between them.

Figure 2: NERP Program Logic

2.3.2 Milestone Reporting

Each hub will identify its project-specific annual milestone deliverables in its Annual Work Plan, under the “Activities and Milestones” column. Hubs do not need to duplicate this information in their M&E Plans. As part of the biannual progress reporting process, the NERP Team will send each hub a copy of the progress reporting template pre-filled with the information specified in its project milestones. Hubs will complete the template and return it to the NERP Team by the specified date.

2.3.3 Key Performance Indicators

Key Performance Indicators are the next level of hub performance measurement. A suite of common indicators, specified in the NERP M&E Plan, will be measured by all hubs. These standardised indicators will enable the department to collate and analyse program delivery and performance data. Hubs are able to define additional indicators specific to their own activities should they so choose.

Program-specific indicators will be used by the department to monitor its own activities in managing the program. These indicators, and the methodology for their application, are outlined in the NERP M&E Plan.

Section 3: Hub M&E requirements

NERP M&E requirements are identified in the funding agreements between hubs and the department. This includes the development of an M&E Plan for each hub.

3.1 Hub Monitoring and Evaluation Plans

Each NERP hub will prepare an M&E Plan to guide the monitoring, evaluation, reporting and improvement of its themes. The requirements for these plans are detailed in the NERP M&E Plan and include:

»project milestone reporting;

»project-specific key performance indicators for the three phases of the program;

»a timetable and process to review and update plans; and

»linkages to the Hub Science Communication Plan to disseminate information to other stakeholders.

Budgeting for, developing and implementing hub M&E will assist hub managers to evaluate, and progressively report on, progress towards the outcomes specified in their Multi-Year Research Plans.

3.2 Project progress and financial reporting

3.2.1 Biannual progress reporting

Biannual progress and final reports provide project performance information for all aspects of hub projects. These reports also provide information on the contribution of projects to meeting the specified policy needs. Project milestone reporting is the primary method to report progress. Key performance indicators, summarising the project milestone reporting, are the secondary method to report these outcomes.

Information from hub progress reports will be used by the department in addressing the following key evaluation questions:

»Did the projects achieve their intended project outcomes?

»What contributions have the projects made to NERP strategic policy questions and agreed policy outcomes?

»What results have the projects delivered? How do these results compare to those intended?

Reporting needs are addressed through the preparation and implementation of individual hub M&E Plans. The timing of project progress reporting is specified in funding agreements between recipients and the department.

3.2.2 Project financial reporting

Project financial reporting provides information on project governance, implementation and finances. These reports are among the primary inputs to biannual progress reporting, which forms the basis for making payments to recipients. Project financial reports enable the department to make payment decisions by monitoring progress in project implementation and expenditure, and provide information on the progressive acquittal of expenditure.

The timing for project financial reporting is as specified in the funding agreements between recipients and the Australian Government. In the case of NERP Tropical Ecosystem Hub research institutions this is specified in their contracts with the hub’s administrator.

3.2.3 Hub yearly achievement summary

Each hub will develop a yearly achievement summary to be published as a key mechanism for communicating progress to the Australian public. The achievement summary will present a simple summary of progress made towards the hub’s goals. The achievement summary will be based on data collected through M&E processes and any other sources considered appropriate by each hub.

Section 4: NERP M&E activities

4.1 Administration of projects

To meet its M&E requirements for evaluation, reporting and continuous improvement, the department will undertake a range of activities at the program level. Departmental M&E activities will vary during the life of the program. They will primarily draw on information from the NERP hubs and program-specific indicators.

The monitoring, evaluation, reporting and improvement by the department of its administration of NERP aims to ensure, among other things, that appropriate mechanisms are in place to provide the department with the information it needs to answer the following key evaluation questions:

»Did the hubs deliver their projects the way they agreed to?

»What results have the hubs delivered? How do these results compare to those intended?

»Did the hubs achieve their intended outcomes (i.e. policy impact)?

Administration of projects by the department may include verification and validation of project activities, reports and evaluations.

4.2 NERP program reporting

Responsibility for monitoring, evaluating, reporting and improving NERP as a whole lies with the department. This will ensure, among other things, that appropriate plans are in place to provide the department with the information it needs to answer the program’s strategic questions.

4.2.1 NERP annual reporting

Program-level reporting includes administrative and performance reporting through departmental Annual Reports.

4.2.2 M&E review

The NERP M&E Plan will be reviewed annually to ensure that it remains relevant and effective. Feedback from hubs regarding the M&E process will be a key input.

4.2.3 End of program evaluation

The NERP evaluation is an evaluation of the whole program at the conclusion of its first four years. It will determine:

»the extent of achievement of the program’s goals

»what changes could be made to improve the program

»whether the program’s format and processes were suitable for delivering its goals

The NERP evaluation should:

»focus on achievement and future directions

»aim to summarise the initiative and its achievements to date

»engage key stakeholders
