U.S. IMPACT studies research overview
A research team led by Mike Crandall and Karen E. Fisher of the University of Washington Information School, with support from the Institute of Museum and Library Services and the Bill & Melinda Gates Foundation, is examining the impact of free access to computers and the Internet on the well-being of individuals, families, and communities.
Purpose
Public libraries have provided free access to computers and the Internet since the 1990s. Libraries have also provided digital resources, databases, networked and virtual services, training, technical assistance, and technology-trained staff. Past decision-making regarding public access computing services has been based on measures such as the number of users and sessions, the length of time computers are in use, the number of unfilled requests, and the results of satisfaction surveys (e.g., Jaeger, Bertot, & McClure, 2007). However, little research has examined the relationship between free access to computers and outcomes that benefit individuals, families, and communities.
Working with libraries, users, and communities, and an expert committee of library leaders, researchers, and public policy organizations, the IMPACT research team is
- documenting the positive and/or negative results from the presence or absence of public access computing resources and services in public libraries; and
- developing robust and broadly applicable indicators for the impact of free access to computers, the Internet, and related services.
The researchers are specifically interested in outcomes and indicators related to seven domains: (1) civic engagement, (2) eCommerce, (3) education, (4) eGovernment, (5) health, (6) employment, and (7) social inclusion. These domains are relevant to policy goals and consistent with the public library mission. The ultimate aim is for these indicators to guide decision-making and generate public support for public access computing in public libraries. According to Hatry (2006), to be useful, indicators need to be:
- specific (unique, unambiguous);
- observable (practical, cost effective to collect, measurable);
- understandable (comprehensible);
- relevant (measures important dimensions, appropriate, related to the program, of significance, predictive, timely);
- time bound (covering a specific period of time); and
- valid (providing reliable, accurate, unbiased, consistent, and verifiable data).
The IMPACT studies are testing and validating indicators to ensure their usefulness to libraries and policy makers, and will work toward developing an outcome evaluation system that is cost effective and easy to use.
Research summary
To identify key areas of public access computing (PAC) impact and build outcome indicators, the IMPACT team is currently engaged in two projects:
- A mixed methods analysis of PAC users, funded by the Institute of Museum and Library Services (IMLS), consisting of a nationwide telephone survey to generate generalizable findings and four case studies to contextualize the analytic findings and stimulate policy insights.
- A nationwide web survey administered in public libraries. This initiative is funded by the Bill & Melinda Gates Foundation and will extend the value of the IMLS telephone survey by augmenting the data collection, gathering responses from individuals commonly missed by conventional survey methods (e.g., low-income, young, and homeless patrons), and linking user outcomes to library resources.
Together, these two efforts will establish and test candidate indicators and provide valuable information about users of public access computing.
IMLS mixed methods study
Mixed methods research has been used in research and program evaluation in many different fields but has not been used extensively in library research (Fidel, 2008). Research involving both qualitative and quantitative methods provides the opportunity to increase the validity of research, better understand conditionalities and context, and counteract the biases inherent in any single research method.
The IMLS mixed methods study will generate two rich sources of data. The telephone survey will provide a representative picture of the prevalence of different types of people using public access computers and of how that use benefits them. The case studies will provide information about contextual influences on library outcomes, such as available resources or the policy environment, and will test the relevance of candidate indicators with library stakeholders. Analyzed together, they will capture a holistic picture of computer and Internet use in public libraries and test the validity of findings.
The telephone survey was developed through an iterative process with the research team and experts from the library, research, and public policy communities. The survey asks general questions about PAC use (frequency/alternative access), specific types of use across the seven domains, use on behalf of others, use of other library resources, and demographics. It has been extensively tested in field conditions and will be offered in English and Spanish.
The goal for the telephone survey is to complete 1,130 interviews with users of public access computing in libraries. This will enable us to estimate the number of people in the United States who use PAC resources with a margin of error of +/- 3.5% at the 95% confidence level. The survey will also provide a rich data source for examining how people use PAC and will allow us to evaluate candidate indicators.
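For orientation, the sketch below shows the standard margin-of-error formula for a proportion. It is a simplified illustration assuming simple random sampling and maximum variance (p = 0.5), not the study's actual variance estimation, which presumably accounts for its sampling design.

```typescript
// Margin of error (half-width of a confidence interval) for a proportion,
// assuming simple random sampling and maximum variance (p = 0.5).
function marginOfError(n: number, p = 0.5, z = 1.96): number {
  return z * Math.sqrt((p * (1 - p)) / n); // z = 1.96 for a 95% level
}

// Illustrative only: under these textbook assumptions, n = 1,130 yields
// roughly +/-2.9%; the study's wider +/-3.5% presumably reflects its
// actual sampling design (e.g., design effects), which is not detailed here.
console.log(marginOfError(1130).toFixed(3)); // "0.029"
```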
Four case study visits are being conducted at the Enoch Pratt Free Library in Baltimore, Maryland; the Blair Public Library in Fayetteville, Arkansas; the Oakland Public Library in California; and the Marshalltown Public Library in Iowa. One week is being spent at each site conducting observations, interviews, and focus groups with PAC users (ages 14 and up), librarians, administrators, IT staff, and persons from allied organizations (e.g., other community technology centers, city councils, senior centers, local schools and colleges), as well as mini-interviews with people at Internet cafes, tourist offices, hair salons, book stores, and other community focal points. In-depth library and community profiles gathered from additional data sources, along with extensive note-taking by members of the research team, round out the case study methodology. These case study instruments and techniques were field tested at the Mount Vernon City Library in Washington State in August 2008.
Public library web survey
The web survey is essentially the same instrument as the telephone survey, with only minor variations to account for the different platform, and has also been translated into Spanish. Web survey data are augmenting data gathered through the telephone survey and, coupled with data from the National Center for Education Statistics (NCES) on the selected libraries' resources, will enable analysis of the relationship between available PAC resources and user outcomes. Because Internet surveys are an emerging method and may be a low-cost alternative for gathering patron-level data, the results from the web and telephone surveys will also be compared to gauge the effectiveness of web surveys for future library research.
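As an illustration of what such a mode comparison could involve (the study's actual analysis plan is not described here), a simple two-proportion z-test can flag estimates that differ between the two samples:

```typescript
// Sketch: a two-proportion z-test, one simple way to compare the same
// estimate across the telephone and web samples. All numbers below are
// hypothetical; a real comparison would also account for survey weights
// and design effects.
function twoProportionZ(p1: number, n1: number, p2: number, n2: number): number {
  const pooled = (p1 * n1 + p2 * n2) / (n1 + n2); // pooled proportion
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2));
  return (p1 - p2) / se;
}

// e.g., hypothetical shares of respondents reporting employment-related use:
const z = twoProportionZ(0.37, 1130, 0.41, 80000);
console.log(Math.abs(z) > 1.96); // true would suggest a mode effect at the 5% level
```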
Library systems selected for participation are being asked to link to the web survey from their websites during a designated two-week period. Libraries are being provided a unique URL and varied methods for linking to the survey, including buttons, float-in/pop-up scripts, and HTML code. Participating libraries will receive a comprehensive report on the data collected through their public access systems if enough responses are obtained to ensure confidentiality and statistical validity of the results; if not, a comparative report of national averages for peer libraries will be provided instead.
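Purely as an illustration of the kind of embed involved (the project supplies the actual URL, markup, and scripts, none of which are reproduced here), a float-in invitation might be sketched as follows, with placeholder values throughout:

```typescript
// A minimal sketch of a float-in survey invitation; the URL, text, and
// styling are hypothetical placeholders, not the study's materials.
const SURVEY_URL = "https://example.org/impact-survey?lib=LIBRARY_ID";

function showSurveyInvite(): void {
  const banner = document.createElement("div");
  banner.style.cssText =
    "position:fixed;bottom:1em;right:1em;padding:1em;background:#fff;" +
    "border:1px solid #ccc;z-index:9999;";
  const link = document.createElement("a");
  link.href = SURVEY_URL;
  link.textContent = "Tell us how you use the library's computers";
  banner.appendChild(link);
  document.body.appendChild(banner);
}

window.addEventListener("load", showSurveyInvite);
```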
The web survey includes 636 randomly selected library systems (of the country's 9,198 administrative units across 50 states) and is expected to yield 80,000 completed surveys at an overall response rate of 12%. Once aligned with the telephone survey, data from the web survey are expected to be generalizable with a margin of error of +/- 2.2% at the 95% confidence level, providing substantial flexibility for developing a detailed portrait of PAC users.
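One hedged way to read these figures together: under the textbook assumptions sketched earlier, 80,000 simple random responses would give a far smaller margin than +/- 2.2%, so the quoted margin implies an effective sample size well below the raw count, consistent with responses clustering within the 636 participating systems.

```typescript
// Hypothetical back-of-envelope reading of the quoted figures, using the
// usual p = 0.5, z = 1.96 assumptions; the study's actual weighting and
// clustering details are not given here and may differ.
const n = 80000;                                  // expected completed surveys
const moe = 0.022;                                // quoted margin of error
const effectiveN = (1.96 ** 2 * 0.25) / moe ** 2; // ~1,984 respondents
const designEffect = n / effectiveN;              // ~40
console.log(Math.round(effectiveN), designEffect.toFixed(1));
```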
Outcomes and dissemination
Through our replicable, transportable, and triangulated methodology, we will identify measurable indicators of the social, economic, personal, and/or professional impact of free access to computers, the Internet, and related services at public libraries, and of negative impact where service is weak or absent.
We will also provide new, reliable data on the benefits to individuals, families, and communities of these services and resources at public libraries. This will enable the public library community to document and use data collected about the impacts of PAC to assist with improvements in services, support local and national advocacy and funding efforts, and provide a solid basis for future research efforts.
Dissemination efforts will begin in June 2009 with the release of preliminary analysis from the IMLS mixed methods study. Results from the U.S. IMPACT Study web survey will be disseminated in August 2009, and delivery of individualized library web survey reports is scheduled for September 2009. Beyond library and information science researchers, our target audience includes librarians and policymakers who can use our research for evaluation and policy decision making.
References & Resources
Fidel, R. (2008). Are we there yet?: Mixed methods research in library and information science. Library & Information Science Research, 30(4), 265-272.
Greene, J. C., & Caracelli, V. J. (1997). Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms. San Francisco, CA: Jossey-Bass.
Hatry, H. P. (2006). Performance measurement: Getting results. Washington, D.C.: Urban Institute Press.
Hatry, H. P., Lampkin, L., et al. (2003). Developing community-wide outcome indicators for specific services. Washington, D.C.: Urban Institute.
Jaeger, P. T., Bertot, J. C., & McClure, C. (2007). Public libraries and the Internet 2006: Issues, findings, and challenges. Public Libraries, 46(5), 71-78.
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24(4), 602-611.
Lampkin, L., Winkler, M., Kerlin, J., Hatry, H., Natenshon, D., Saul, J., et al. (2006). Building a common outcome framework to measure nonprofit performance. Washington, D.C.: Urban Institute.
Sue, V. M., & Ritter, L. A. (2007). Conducting online surveys. Los Angeles, CA: Sage Publications.
Thomas, S. J. (2004). Using web and paper questionnaires for data-based decision making: From design to interpretation of the results. Thousand Oaks, CA: Corwin Press.