Common Framework for a Literacy Survey Project

Literacy Survey

Analytical Guidelines and Tabulation Plan

April 2014

Contents

Preface

1. Introduction

2. Uses of Literacy Assessment Data

2.1 Users of Assessment Data and Their Technical Competence

2.2 The Issues That Literacy Assessment Data Can Address

2.2.1 Understanding the Learning Needs of Adults at Various Levels of Literacy and Numeracy Skills and Determining Perceived Barriers to Improved Literacy Levels

2.2.2 Literacy’s Relationship to Inequalities in Social and Cultural Outcomes

2.2.3 The Quality of Education Provided by the Formal System

2.2.4 The Adequacy of Adult Learning Systems

2.2.5 Literacy as a Barrier to Achieving High Rates of Macro-Economic Growth

2.2.6 Literacy’s Relationship to Social Inequality in Economic Outcomes at Individual Level

2.2.7 The Relationship between Self-Declared Literacy and Individual Literacy Skills

2.2.8 Other Uses of Literacy Assessment Data

3. Responsibilities of the Countries

4. Tabulation Plan

Preface

The project “Common Framework for a Literacy Survey” was executed by the Caribbean Community (CARICOM) Secretariat under funding provided by the Inter-American Development Bank (IDB) Regional Public Goods Facility and was intended to design a common approach to the measurement of literacy across countries. This common framework is built upon international methodologies, fundamentally the International Survey of Reading Skills (ISRS), that enable more reliable measurement of literacy than presently exists in the Region.

The literacy assessment is designed to measure functional literacy. In other words, it determines an individual’s literacy level by employing a series of questions designed to demonstrate the use of their literacy skills. This involves two steps – the objective testing of an adult’s skill level and the application of a proficiency standard that defines the level of mastery achieved. The assessment measures the proficiency of respondents on three continuous literacy scales – prose, document and numeracy. In addition, it collects information on reading component skills, which are thought to be the building blocks upon which the emergence of reading fluency is based. Information on the reading component skills is collected only from people at the lower end of the literacy scale. The testing phase is preceded by a selection phase, which includes the administration of a Background or Household questionnaire; once a respondent has been selected from the household, an initial pre-assessment is undertaken using a filter test booklet to determine which type of assessment should be administered in the testing phase.

A consultant, Mr. Scott Murray of Canada, was engaged to provide services on this project. The CARICOM Secretariat (including the Regional Statistics and Human and Social Development Directorates) and the CARICOM Advisory Group on Statistics (AGS) were instrumental in the execution of the project throughout all phases. In addition, Member States and some Associate Members participated in the technical rollout of the instruments and documents.

This document is aimed at providing <country undertaking a Literacy Survey> with guidelines that can enable the analysis of the literacy survey data as recommended under the IDB-funded CARICOM project. A draft tabulation plan is also included.


1. Introduction

This document sets out analytical guidelines for a Literacy Assessment that supports the generation of several estimates including the following:

  • Average proficiency scores for prose literacy, document literacy and numeracy
  • Numbers and proportions of adults at each proficiency level for prose literacy, document literacy and numeracy
  • Scores on each of the reading components tests
  • The classification of adults with Levels 1 and 2 prose skills into groups sharing common patterns of strength and weakness on the reading components tests
  • Correlations between skills and variables thought to determine observed differences in skills within and between countries
  • Correlations between skills and variables thought of as labour market, education, health and social outcomes

These can be undertaken at the level of the entire adult population or for any sub-groups for which sufficient sample has been included to support reliable estimates.
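As a minimal illustration of the first two estimates, the sketch below computes a weighted mean proficiency score and the weighted proportion of adults at each proficiency level. The data and weights are hypothetical, and the use of a single score per respondent is a simplification: surveys of this kind typically report proficiency through plausible values and estimate sampling error with replicate weights, neither of which is shown here. The level cut-points follow the IALS convention (Level 1: 0-225, Level 2: 226-275, Level 3: 276-325, Levels 4/5: 326-500).

```python
# Illustrative sketch only: weighted mean proficiency and weighted
# proportions by level. All data below are made up for demonstration.

# IALS-style cut-points on the 0-500 scale; Levels 4 and 5 are
# combined here, as they often are in published tables.
LEVEL_BOUNDS = [(1, 0, 225), (2, 226, 275), (3, 276, 325), (4, 326, 500)]

def level_of(score):
    """Map a 0-500 proficiency score to its proficiency level."""
    for level, low, high in LEVEL_BOUNDS:
        if low <= score <= high:
            return level
    raise ValueError(f"score out of range: {score}")

def weighted_summary(records):
    """records: list of (score, survey_weight) pairs.
    Returns (weighted mean score, {level: weighted proportion})."""
    total_w = sum(w for _, w in records)
    mean = sum(s * w for s, w in records) / total_w
    props = {}
    for s, w in records:
        lvl = level_of(s)
        props[lvl] = props.get(lvl, 0.0) + w / total_w
    return mean, props

# Hypothetical respondents: (prose score, survey weight)
sample = [(210, 1.5), (250, 1.0), (290, 2.0), (340, 0.5)]
mean, props = weighted_summary(sample)
```

The same function applies unchanged to any sub-group for which the sample is large enough, simply by filtering the records before calling it.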

In addition to this document, further details can be found in several of the international reports, national reports and thematic reports from the IALS and ALL studies. A few useful examples include:

  • Coulombe, S., Tremblay, J.-F. and Marchand, S. (2004). Literacy Scores, Human Capital and Growth across Fourteen OECD Countries. Ottawa: Statistics Canada. Cat. No. 89-552-MIE, no. 11.
  • Coulombe, S. and Tremblay, J.-F. (2006). Human Capital and Canadian Provincial Standards of Living.
  • Coulombe, S. and Tremblay, J.-F. (2006). Migration, Human Capital, and Skills Redistribution across the Canadian Provinces. Working Paper 2006 D-07.
  • Green, D.A. and Riddell, W.C. (2001). Literacy, Numeracy and Labour Market Outcomes in Canada. Ottawa and Hull: Statistics Canada and Human Resources Development Canada.
  • Green, D.A. and Riddell, W.C. (2002). Literacy and Earnings: An Investigation of the Interaction of Cognitive and Unobserved Skills in Earnings Generation.
  • Green, D.A. and Riddell, W.C. (2007). Literacy and the Labour Market: The Generation of Literacy and Its Impact on Earnings. Ottawa: Statistics Canada and HRSDC.
  • Raudenbush, S.W. and Kasim, R.M. (2002). Adult Literacy, Social Inequality, and the Information Economy: Findings from the National Adult Literacy Survey. Ottawa and Hull: Statistics Canada and Human Resources Development Canada.
  • Rubensson, K. and Desjardins, R. (2007). Adult Learning in Canada: A Comparative Perspective. Statistics Canada and HRSDC.
  • Shalla, V. and Schellenberg, G. (1998). The Value of Words: Literacy and Economic Security in Canada. Ottawa: Statistics Canada and HRDC.
  • Statistics Canada and OECD (1995). Literacy, Economy and Society: First Results of the International Adult Literacy Survey. Ottawa and Paris.
  • Statistics Canada and HRDC (1996). Reading the Future: A Portrait of Literacy in Canada. Ottawa.
  • Statistics Canada and OECD (2000). Literacy Skills for the Information Age: Final Results of the International Adult Literacy Survey. Ottawa and Paris.
  • Statistics Canada and HRSDC (2004). Literacy Scores, Human Capital and Growth across Fourteen OECD Countries. Coulombe, Tremblay and Marchand, authors. Ottawa.


2. Uses of Literacy Assessment Data

At the highest level, the use of any data can be divided into two categories:

  • Uses that are ‘policy-related’, i.e. the data produced have no direct bearing on decisions taken about individual units but are used indirectly to formulate national policy, establish priorities, allocate funds and decide on implementation methodologies. These non-administrative uses have little direct bearing on the outcomes of individuals in the short run but may have a profound impact on outcomes in the long run because of their influence on policy, funding or practice.
  • Uses that are ‘administrative’ in nature, i.e. the data are used to take decisions that have a direct impact on an individual unit – be it a student, a teacher, an administrator, a school or some larger unit. For example, administrative uses involve using assessment data to guide instruction for a particular student or to signal mastery of a particular level, to determine programme eligibility for particular students, or to assign supplemental resources to particular schools.

However, in its basic form, the Literacy Assessment is designed to provide data for the former: to inform public policy debate, to formulate policy and to monitor policy impact. It is not designed to serve administrative uses, although the instruments could be applied to separate samples of students to serve these purposes.

Policy-oriented assessment systems can be classified within a framework devised to describe the uses of official statistics in multiple domains (Overgaag and Goddeburre, 1989). This framework proposes the following categories of use:

  • For knowledge generation i.e. to understand the causal structure of the domain(s) of interest and their relationship to key covariates;
  • To inform policy and programme design i.e. to identify the nature and scope of the problem to be fixed, the relative priority for action, the cost of inaction and key elements of the remedial intervention;
  • To monitor indicators of key outcomes with a view to identifying any unanticipated departures from established trends or relationships; and
  • To evaluate the impact of specific policy and/or programme interventions undertaken at the macro-level.

This Literacy Assessment is designed to provide the above information for the national population or possibly large sub-populations; it cannot assess individual literacy programmes, because the sample will not contain enough respondents from any one programme to give reliable results. However, national planners or administrators may wish to use the same instruments for such an assessment by organizing a separate survey of all, or a sample of, participants from a specific literacy programme.

2.1 Users of Assessment Data and Their Technical Competence

The fitness of any statistical data may be judged only in terms of two criteria:

  • The use to which the data will be put; and
  • The technical ability of the users.

At the risk of stating the obvious, users vary greatly in their interest and ability to understand and apply statistical information in their decision-making.

Statistical products that fail to respect this fact are likely to fail to reach maximum impact.

Similarly, statistical products and services that fail to support the uses foreseen by key users will fail to achieve maximum impact.

The potential users of literacy assessment data are numerous and extraordinarily diverse in their ability to deal with drawing inferences from complex statistical data. In many cases, the same user has a need for a range of products and services to meet a variety of uses of differing technical content.

Key user groups are as follows:

  • Citizens - Citizens need information to judge whether the education system is meeting its social, cultural and economic goals and whether it is doing so in an efficient and effective way. Most citizens have limited statistical acumen and little interest in details and nuance – they want and need a set of stylized facts about the performance of the system.
  • Educational administrators - Educational administrators at several levels need information for multiple purposes such as:

- Directors of literacy and non-formal education programmes - Directors of literacy and non-formal education programmes need information to reflect upon the performance of teachers in particular domains and on the performance of specific groups of students, to adjust teaching priorities and curricula, to formulate targeted in-service training for teachers, to design compensatory programmes and supports, to demonstrate performance to administrators higher up in the system and to argue for additional resources. Directors are generally reasonably comfortable with statistical data but have little time to undertake primary analysis themselves.

- Subject matter and diagnostic specialists - Subject matter and diagnostic specialists need information for the same reasons but also to reflect on the relative performance of programmes and to take action to improve them. As a group, they have mixed statistical skills – the specialists generally have advanced analytic skills and an interest in and ability to use statistical information.

- Administrators at the regional, provincial and national level - Administrators at the regional, provincial and national level, including specialists in particular assessment domains and those responsible for accountability measures and reporting, need information for the same reasons. As a group they have access to statistical expertise and the resources to apply it, but have a need for stylized facts about the performance of their part of the system. Their key clients are politicians, including the minister(s) responsible for education and learning, teachers and citizens.

- Community leaders - Community leaders, including local politicians, need information to assess whether literacy programmes are producing what the community needs to meet its social, cultural and economic goals. Most community leaders have very limited quantitative skills but can usually access what they need in the community.

- Training institutions - Training institutions responsible for the training of new teachers require information on the performance of current approaches to teacher training, curricula and instruction. As a rule their staffs have access to the statistical expertise required to use assessment data.

- Non-governmental agencies, research institutes and social advocates - Non-governmental agencies, research institutes and social advocates need information to monitor trends in educational outcomes and to argue for structural and policy changes. As a group these agencies have mixed ability to deal with statistical data.

- Politicians and policy-makers - Politicians and policy-makers in a variety of national ministries need information for several purposes. All ministers and ministries need information to understand the ability of their clients to use print and to adjust their communication strategies and channels accordingly.

  • Education ministries need information for a variety of purposes:
  1. to understand the performance of the current education system and what factors influence relative success;
  2. to adjust the level and distribution of available funds to achieve maximum return on investment;
  3. to adjust programme design, curriculum, instructional methods and delivery mechanisms to match learning needs;
  4. to argue for additional resources; and
  5. to inform pre-service and in-service training of instructors.
  • Labour ministers and their policy-makers need information to understand the quality and quantity of literacy and numeracy skills and the labour market needs for these skills.
  • Culture ministers and their policy-makers need information to understand the relative position of linguistic and cultural minorities either in the official language(s) or minority languages.
  • Health ministers and their policy-makers need information to understand the relationship of literacy to population health and to design appropriate communication strategies.
  • Tax officials and their policy-makers need information to understand the literacy levels of the taxpaying public so that they can engineer their reporting systems accordingly.
  • Social development ministers and their policy-makers need information to understand trends in literacy levels and the role that they play in creating social inequity in economic, educational, social and other outcomes.
  • Agriculture ministers and their policy-makers need information to understand the connections between literacy level and changes in agricultural practice.
  • Industry ministers and their policy-makers need information to monitor the supply of literate adults available to the workforce. As a rule, ministers and policy-makers do not have strong quantitative skills, nor do they have an interest in doing analysis themselves. They generally want only stylized facts provided by their own technicians.
  • The media - Print, radio and television play a central role in disseminating the results of any national Literacy Assessment as it is by this means that the main messages first reach many users. Failure to get the media to report on the assessment system in an objective way that encourages users to seek more information can doom even the best assessment systems to obscurity. Similarly, where media are critical of the assessment or create sensational messages that do not reflect the data, national assessment programmes will be threatened. At a minimum national study teams will spend a lot of time “fighting fires”.

As noted above, literacy assessment systems have the potential to create winners and losers. As a result, different groups of users will be predisposed to support, or to argue against, assessments, depending on what they perceive to be in their interest. Viewed from a communication standpoint, the goal of the national project team is:

- To maintain the support of users who are initially supportive;

- To win the support of additional users who may be neutral or mildly opposed; and

- To address the concerns of opponents in a balanced and neutral way in all publications and related analyses.

2.2 The Issues That Literacy Assessment Data Can Address

Having identified the general uses to which literacy assessment data will be put and the likely users, it is important to set out the issues of public policy upon which these methods have been designed to shed empirical light. These issues are important in the first instance because they provide the rationale for participating countries to invest scarce resources in implementing the study. They are also important because they provide a starting point for planning an analysis programme that will produce products and services that match the needs and technical competence of key users. Dissemination and communication programmes will ensure that these products and services reach the intended audience.

As currently designed, the literacy assessment approach can inform seven issues that are of central importance to policy development in all countries. These issues include:

2.2.1 Understanding the learning needs of adults at various levels of literacy and numeracy skills and determining perceived barriers to improved literacy levels;

2.2.2 Literacy’s relationship to inequalities in social and cultural outcomes;

2.2.3 The quality of education provided by the formal system;

2.2.4 The adequacy of adult learning systems;

2.2.5 Literacy as a barrier to achieving high rates of macro-economic growth;

2.2.6 Literacy’s relationship to social inequalities in economic outcomes at the individual level; and

2.2.7 The relationship between self-declared literacy and individual literacy skills.

2.2.1 Understanding the Learning Needs of Adults at Various Levels of Literacy and Numeracy Skills and Determining Perceived Barriers to Improved Literacy Levels

The design is such that it will provide data that can be used to serve educational ends. Specifically, the assessment will allow users to explore questions such as:

- What is the distribution of component reading skills in the adult population?

- How are component reading skills related to the emergence of fluency as defined in the literacy and numeracy measures?

- What do these patterns imply for the design of curricula, instruction and delivery?
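The second question above is, at its core, a question about association. The sketch below shows one minimal way to examine it: a Pearson correlation between a component test score and prose proficiency, computed on hypothetical paired scores. Both the data and the variable names are illustrative only; a real analysis of assessment data would work with plausible values, apply survey weights and use appropriate variance estimation.

```python
# Illustrative sketch only: Pearson correlation between a hypothetical
# reading-component score and prose proficiency for the same respondents.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up paired scores: (component test score, prose proficiency)
component = [40, 55, 60, 72, 80]
prose = [180, 210, 230, 260, 300]
r = pearson(component, prose)
```

A strong positive correlation would be consistent with the view that the component skills are building blocks of fluency, although establishing that relationship properly requires far more than a bivariate coefficient.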