Terms of Reference

Undertaking Data Quality Assessment in Tanzania Mainland and Zanzibar

1.0  Introduction

The Legal Services Facility (LSF) has been operational since September 2011 and was registered as a non-profit company limited by guarantee in October 2013. It is a basket fund created to channel funding on an equal opportunity basis to organizations providing legal aid and paralegal services in Tanzania Mainland and Zanzibar. These services assist individuals to claim their rights, redress grievances and protect their fundamental human rights.

The LSF aims to promote and protect human rights for all, particularly for poor women, children, men and the vulnerable, including people living with HIV/AIDS. It works closely with the government at all levels, development partners, organizations involved in the provision of legal aid, including paralegal services, and other stakeholders. It strives towards the Government assuming its responsibility for regulating legal aid provision, including paralegal services, a responsibility which has been formalized through legislation and needs to be further institutionalized. Lastly, the LSF contributes to enhancing awareness of the role and importance of legal aid and paralegal services amongst public and private legal sector stakeholders.

In 2016, LSF launched its new strategic plan 2016-2020 with the aim of increasing access to justice for all, in particular for women in Tanzania. To realise this overall goal, LSF requires a good data management and reporting system in order to improve results-based management across the four key result areas articulated in the theory of change and results framework. The programme works to realise four major result areas, namely: (i) accessibility of quality legal aid services; (ii) legally empowered communities, in particular women; (iii) a conducive environment for legal aid provision; and (iv) institutional sustainability of legal aid services and LAPs.

2.0 Background and Justification

The LSF Secretariat commissioned an external evaluation of phase 1 (2012-2015), and throughout the report various suggestions were made on ways in which LSF practice from the 2012-2015 period could be improved. In relation to data quality assurance and credibility, the external evaluation report highlighted the need for an independent data validation body, because the numbers (data) presented by LSF are critical to government, donors and other stakeholders. Given the scope and scale of the issue of access to justice, it is critical that there can be 100% confidence in the data reported. LSF is continuing to improve the web-based system, but there are still significant challenges in achieving timely and accurate reporting, which means, for example, that extrapolations are used to try to get the most realistic picture over time.

The external evaluation recommended that it is worth considering a division of labour where LSF’s efforts are focused on capacity building towards better reporting and results collection, and a separate body is charged with independent validation of figures and of the assumptions behind any extrapolations used.

In response to the evaluation findings, LSF conducted its first external data validation assessment (part of a data quality assessment) covering 29 paralegal units in the sampled zones, to understand and compare the data reported by the paralegal units and to assess their strengths and weaknesses in terms of validity, reliability, precision, integrity, timeliness and documentation of results. Among other findings, the results indicated that many more legal aid services were provided to clients than had previously been reported in both the web-based system and the quarterly reports. The assessment further confirmed that the number of cases attended to was higher than the number LSF had recorded as received within the same reporting period.

LSF finds that the first data validation exercise needs to be extended into a series of data quality assessments to foster substantial improvement in data management systems, data use and data sharing by LSF, implementing partners and their respective paralegals. This will help the implementing partners and paralegals to have reliable and useful data for accountability purposes, through which LSF and stakeholders will be able to achieve quality results. As a fundamental part of data assurance, it is planned to conduct data quality assessments four times a year, on a quarterly basis. LSF therefore wishes to enter into a long-term contract with consultant(s) or a firm to carry out these data quality/validation assessments.

3. Objectives

Main objective

The objective of the assessment is to facilitate institutional accountability, transparency and credibility of the data that the organization uses in reporting and dissemination to stakeholders, through technical and external data assessors.

Specific Objectives

Specific objectives of this activity are:

1.  Assess the design and implementation of the program/project’s data management and reporting systems;

2.  Trace and verify reported data for key indicators at selected units (disaggregated by sex);

3.  Assess whether the reported data are actual (real) data by verifying their completeness, validity (suitability), availability, accuracy and timeliness against documented evidence;

4.  Develop an action plan for improvement; and

5.  Identify periodic improvements between zones and produce a comparative annual report.

4. Scope and Tasks

The LSF data quality assessment should cover six dimensions of data quality: reliability, completeness, timeliness, precision, integrity and confidentiality. The consultancy should additionally assess capacity needs and skills requirements for ensuring that sound data management is adhered to at all levels. This aligns with standard data quality assessment protocols, in which institutional analysis is as critical as the review of data collection, documentation and reporting.

The activity will be conducted in all regions of Tanzania Mainland and Zanzibar periodically, following selected samples. The sample will be clustered according to zones, with a total of 6-7 regions assessed per quarter and 1-2 region(s) drawn from each zone. The assessment is expected to be conducted by external consultant(s)/firms with full LSF support. It is estimated that the assignment may take up to 25 days (field work and reporting) per quarter. It will be done in January, April, July and October each year, starting in 2018; the January assignment will serve as an overall data quality assurance exercise for 2017. The actual timeframe will be finalized with the consultant upon selection. The audience/participants will include the LSF M&E unit, regional mentor organizations, paralegal organizations and clients at field level, whereby LSF itself will be assessed once per year, as will the other supported organizations/grantees.

The following key functional areas will guide the consultant during the assignment (an illustrative scoring sketch is provided after this list):

i.  M&E Structures, Functions and Capabilities

•  Do key M&E and data-management staff at all levels (units, RMOs and LSF) have clearly assigned responsibilities and duties?

•  Have the majority of key M&E and data-management staff received the required training(s)?

•  Are the key result areas monitored, in terms of data collected, at all levels (LSF, RMOs and paralegal units)?

•  To what extent are the data collected in line with program/project indicators?

ii.  Indicator Definitions and Reporting Guidelines

•  Are there operational indicator definitions meeting relevant standards that are systematically followed by all implementers?

•  Has the program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?

iii.  Data Collection and Reporting Forms and Tools

•  Are there standard data-collection and reporting forms that are systematically used?

•  Are the data reported drawn from the planned activities, or from other sources?

•  Is data recorded with sufficient precision/detail to measure relevant indicators?

•  Are data maintained in accordance with international or national confidentiality guidelines?

•  Are source documents kept and made available in accordance with a written policy?

•  Is a data flow map available that clearly defines how data flow from the source until they are reported?

iv.  Data Management Processes

•  Does clear documentation of data collection, aggregation and manipulation steps exist?

•  Are data quality challenges identified and are mechanisms in place for addressing them?

•  Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?

•  Are the reported data valid? For example, are legal education data reliable?

•  Are there clearly defined and followed procedures to periodically verify data source?

v.  Links with National Reporting System

•  Does the data collection and reporting system of the program/project link to the LSF Reporting System?
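
As a minimal illustration of how responses to the functional-area questions above could be rolled up into the system-strength statistic described in section 6, the sketch below scores a set of yes/partly/no answers per area. The three-point scale, the example answers and the Python structure are assumptions made for illustration only and do not represent the LSF DQA tool.

# Minimal sketch: score system-assessment responses per functional area.
# Scale assumption: yes = 2, partly = 1, no = 0 (not the official LSF scale).
SCALE = {"yes": 2, "partly": 1, "no": 0}

# Hypothetical responses, one entry per question in each functional area.
responses = {
    "M&E structures, functions and capabilities": ["yes", "partly", "yes", "partly"],
    "Indicator definitions and reporting guidelines": ["yes", "no"],
    "Data collection and reporting forms and tools": ["partly", "yes", "yes", "no", "partly", "yes"],
    "Data management processes": ["yes", "partly", "partly", "yes", "no"],
    "Links with national reporting system": ["partly"],
}

for area, answers in responses.items():
    # Express the area score as a share of the maximum attainable points.
    score = sum(SCALE[a] for a in answers) / (2 * len(answers))
    print(f"{area}: {score:.0%} of maximum")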

5. Methodology

5.1. Sample

Due to the nature of the program, the LSF has divided its area of operations into seven zones, including Zanzibar. Purposive sampling will be used to select regional mentor organizations and their respective paralegals within the clustered zones each quarter. LSF and the consultant will therefore agree on a sample of regions to be validated, with the intention of covering the whole country by the end of the assignment. The sampling might proceed from one zone to another, cover two zones per exercise, or use randomly sampled regions per exercise; however, the final DQA will include random sampling of regions from each zone. It is expected that one assignment will cover about 7 sampled regions. The audience will include the LSF M&E unit, regional mentor organizations, paralegal organizations and community members (clients, disaggregated by gender).
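
A minimal sketch of how such a quarterly sample could be drawn is shown below, assuming a simple clustered draw of 1-2 regions per zone until roughly seven regions are selected. The zone and region names, and the selection logic itself, are hypothetical placeholders rather than the agreed sampling procedure.

import random

# Hypothetical zone-to-region clusters; the real list comes from LSF's seven zones.
ZONES = {
    "Zone 1": ["Region 1A", "Region 1B", "Region 1C"],
    "Zone 2": ["Region 2A", "Region 2B", "Region 2C"],
    "Zone 3": ["Region 3A", "Region 3B"],
    "Zone 4": ["Region 4A", "Region 4B", "Region 4C"],
    "Zone 5": ["Region 5A", "Region 5B"],
    "Zone 6": ["Region 6A", "Region 6B", "Region 6C"],
    "Zanzibar": ["Region 7A", "Region 7B"],
}

def draw_quarterly_sample(zones, target=7, seed=None):
    """Draw 1-2 regions per zone, in random zone order, until about `target` regions are chosen."""
    rng = random.Random(seed)
    sample = {}
    for zone in rng.sample(list(zones), len(zones)):
        remaining = target - sum(len(v) for v in sample.values())
        if remaining <= 0:
            break
        k = min(rng.choice([1, 2]), remaining, len(zones[zone]))
        sample[zone] = rng.sample(zones[zone], k)
    return sample

if __name__ == "__main__":
    for zone, regions in draw_quarterly_sample(ZONES, seed=2018).items():
        print(zone, "->", regions)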

5.2. Approaches

The assessment will include:

i.  Off-site desk review of documentation provided by M&E unit at LSF and by regional mentor organization.

ii.  On-site follow-up assessments at the regional mentor organization M&E Unit and at paralegal organization/units

iii.  In-depth verifications at the paralegal organizations and follow-up verifications at the regional mentor organizations, looking into the linkages between the two actors and covering documentation review, tracing and verification of reported numbers, cross-checks of reports and spot checks (if feasible).

All the approaches mentioned should be employed in a highly professional manner, including the use of appropriate language, so as to foster a productive collaboration with the audience.

6. The deliverables of the assignment

This assignment is expected to take 25 working days per quarter; however, the first assignment will take longer than the others, an estimated 45 days, because it includes the contracting procedures provided in section 9 below. Upon the start of the assignment:

•  The consultant(s)/firm will finalize the research protocol and tools, i.e. the inception report;

•  Training of the research team and pre-testing of tools

In conducting the DQA, the assessment team will collect and document:

•  Evidence related to the review of the program/project’s data management and reporting system; and

•  Evidence related to data verification.

The documentation will include:

•  Completed procedures and templates included in the DQA Tool.

•  Write-ups of observations, interviews, and conversations with key data quality officials at the LSF M&E Unit, regional mentor organizations and paralegal organizations. Since reports from paralegals to RMOs normally pass through the WEO, it will be important to counter-check that this happens, for quality assurance purposes.

•  Preliminary findings and draft Recommendation Notes based on evidence collected in the procedures;

•  Final Assessment Report. The Final Assessment Report will summarize the evidence the Assessment Team collected, identify specific assessment findings or gaps related to that evidence, and include recommendations to improve data quality.

The report will also include the following summarized statistics, calculated from the system assessment and data verification procedures (an illustrative calculation follows this list):

•  Strength of the Data Management and Reporting System based on a review of the program/project’s data collection and reporting system, including responses to questions on how well the system is designed and implemented;

•  Accuracy, suitability, availability, completeness, timeliness and validity of reported data.
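
A minimal sketch of the data-verification side of these statistics is given below, assuming the common DQA convention of a verification factor computed as the verified (recounted) value divided by the reported value for each indicator. The indicator names and figures are hypothetical and for illustration only.

# Minimal sketch: compare reported values with verified (recounted) values per indicator.
# Assumption: verification factor = verified value / reported value (standard DQA practice).
reported = {"legal aid clients served": 1200, "cases attended": 450}
recounted = {"legal aid clients served": 1340, "cases attended": 480}

for indicator, reported_value in reported.items():
    verified_value = recounted[indicator]
    factor = verified_value / reported_value
    # factor > 1 means more was found on site than was reported (under-reporting).
    status = "over-reporting" if factor < 1 else "under-reporting" if factor > 1 else "exact match"
    print(f"{indicator}: reported={reported_value}, verified={verified_value}, "
          f"verification factor={factor:.2f} ({status})")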

7.  Terms of Payment

Quarter / Total percentage / Inception report / Final report
Quarter 1 / 20% / 15% / 5%
Quarter 2 / 20% / 15% / 5%
Quarter 3 / 20% / 15% / 5%
Quarter 4 / 40% / 30% / 10%
Total / 100% / 75% / 25%

The maximum available budget for the data quality assignment is TShs 100,000,000/= per year.
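
For illustration, assuming the full annual ceiling of TShs 100,000,000/= is contracted, the schedule above translates into the following payments: Quarters 1-3 each attract 20% × 100,000,000 = TShs 20,000,000 (TShs 15,000,000 on acceptance of the inception report and TShs 5,000,000 on acceptance of the final report), and Quarter 4 attracts 40% × 100,000,000 = TShs 40,000,000 (TShs 30,000,000 and TShs 10,000,000 respectively), giving TShs 100,000,000 in total for the year.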

8. Profile of the Candidate

General Qualifications:

·  Relevant degree(s) in project management, monitoring and evaluation, data quality management (including DQA and validation), social science or any other relevant field, or a combination of these.

·  Must have strong analytical skills and an understanding of reporting systems, data elements, data validation and data analysis as used by NGOs and donors.

·  Proven and demonstrable relevant experience of carrying out similar assignment(s) for reputable organizations; provide one sample.

·  He/she must demonstrate technical skills and knowledge in evaluation and data assessment; research experience is a requirement.

·  Experience in working with Non-Governmental Organizations of which experience in working with Legal Aid Providers will be considered a distinct added advantage.

·  Excellent communication and good report writing skills

·  Knowledge of both Kiswahili and English is desirable.

9. Consultant contracting process

No / Activity / Proposed timeline / Actual days
1 / Call for tender / Four weeks / 28
2 / Review of tenders / Wednesday of the 5th week / 1-2
3 / Contracting process (done once) / Monday of the 7th week / 1
4 / Inception report (written only once) / Monday, 22nd January 2018 / 5
5 / Tool development and pre-testing (done once) / Two weeks immediately after acceptance of the inception report / 10
6 / Field work and data collection / Two weeks / 10
7 / Data triangulation and interpretation / One week / 5
8 / Draft report preparation / Two weeks / 10
9 / Validation meeting / One day / 1
10 / Final reporting / Three days / 3
Total / / / 45