MANAGING for DEVELOPMENT RESULTS IN THE SOUTH AFRICAN SOCIAL SECURITY AGENCY (SASSA):
CASE STUDY: SOUTH AFRICA
Background
The South African Social Security Agency (SASSA) is a relatively new organization, created almost a year ago to deliver social grants to vulnerable groups (the elderly, people with disabilities, and children). In fulfilling its mandate, SASSA faces a myriad of challenges, among which are to (i) provide comprehensive social security services to vulnerable groups whose true social needs are often difficult to assess; (ii) deliver quality services to beneficiaries within the context of skills shortages; and (iii) overcome widespread fraud and leakages.
A Monitoring and Evaluation (M&E) framework and implementation plan for SASSA has been drawn up against the Agency's strategic objectives to support the implementation of activities towards the achievement of outcomes. The objective of this document is to provide a framework to monitor and evaluate the management and administration of social security programs within the context of Results-Based Monitoring and Evaluation (RBM&E). This thinking is consistent with the Presidency's Government-Wide Monitoring and Evaluation framework approved by the Cabinet of South Africa in 2004.
The framework has eight main strategic objectives, namely:
· to promote a culture of continuous learning and improvement within the Agency;
· to promote the effective and efficient deployment of resources for implementation and administration of social security policies and programs;
· to facilitate accountability at all management levels;
· to facilitate the utilization of reliable, timely and relevant information by all relevant stakeholders;
· to disseminate best practice findings for improved project and program performance;
· to strengthen evaluation capacity;
· to coordinate and standardize processes and procedures used for monitoring and evaluation; and
· to ensure the ability to consolidate information collected during the monitoring and evaluation process in order to provide global information.
PART ONE: MfDR Challenges facing SASSA
M&E activities: M&E activities are currently not integrated into the life of the Agency. Institutionalising M&E means that these activities have to become part of the culture and practice of the Agency, forming the basis on which SASSA plans its work, learns and improves. To institutionalise M&E successfully, the system should be designed to build on the information and reporting systems that already exist within SASSA. Capacity development and support for effective, institutionalised M&E are needed.
Effective monitoring and evaluation of performance: There is no integrated statistical system driven by strategic priorities with output and outcome indicators, no reliable reporting mechanism, and no requisite electronic infrastructure.
Utilization of evaluation information: There is no strong evaluation culture to create demand for the work of the Evaluation Unit. The M&E function must be institutionalised beyond supply-side preoccupations, such as having a well-designed evaluation architecture that provides quality monitoring information and data to improve budget choices, policy and decision-making, enhance accountability, and strengthen program and service delivery. Our principal objective is to conduct useful and influential evaluations whose findings can be used to influence resource allocation and budget decisions, facilitate efficient service delivery, and improve the targeting of service users.
Evaluation methods and strategies: The challenge is to design M&E tools and strategies that are robust and that will enable us to determine the real outcomes and impact of SASSA beyond simple numbers. A related challenge is whether the task of undertaking evaluations should be entrusted to internal or external evaluators or consultants. Whilst it is important to know the number of South Africans receiving social grants, the most useful information is whether the incomes received have resulted in improved economic and social circumstances for beneficiaries that can be attributed to the delivery of social grants. A true impact study may necessitate collaboration with other social agencies or research institutions, because social grants alone may not be enough to make a significant dent in poverty reduction.
Core competency of evaluation staff: The M&E Department has recruited core staff with a mixed skill set. Most have good academic qualifications but are yet to be tested in a programming and policy environment. Sound technical skills must be coupled with honed political acumen to be effective in meeting the multiple needs and requirements of senior management, line managers, service providers and beneficiaries.
Change management: SASSA needs a champion who can lead the advocacy for, development of, and sustainability of our M&E system, as well as an environment that is in a state of readiness in terms of incentives and demand for designing and building a results-based M&E system;
There are no clear lines of authority in terms of the roles and responsibilities of existing structures for assessing the performance of SASSA and thus enhancing accountability for good or non-performance;
Funding: The budget is inadequate for M&E institutionalisation and for formal links with other international, regional and national departments, appropriate line ministries, NGOs and research institutions to enhance and align operations and efforts;
Establishment of multi-sectoral working groups with the various departments, agencies and sectors is needed to build consensus on various aspects of M&E design and implementation;
Strong stakeholder participation and consultative processes are required to ensure that citizens continuously hold government accountable;
Knowledge management to ensure that lessons are shared with other projects, programs and policies across government, NGO and donor sectors; as well as
Substantial investment in engineering the system is needed, in relation to the skills, expertise and requisite knowledge to navigate the politics of monitoring and evaluation; under-engineering the system has the potential to undermine M&E initiatives.
Author: LM Bosch
17 October 2008