Compon Media and Interview Data Protocols (11-10-2010; revised 3-31-11 jb)

1.  Overview of Compon Project

Purpose, Scope and Method:

Using discourse and relational network methods, the project aims to discover the underlying social factors affecting the diffusion and effectiveness, within national climate change mitigation efforts, of three stimuli (signals) that flow from the global (UN) level into the national level. The three stimuli are 1) new scientific information (centrally the IPCC), 2) norms concerning mitigation action (principally as represented in UNFCCC agreements), and 3) implementation burdens and schemes (voluntary action, carbon tax, cap and trade, CDM, REDD, targets and timetables, technology transfer, financing mechanisms). The project plans to test the explanatory capacity of different hypotheses that purport to identify the social factors that empower or disempower the effect of these three global stimuli upon national mitigation efforts under different country contexts. The analysis proceeds by first validating the combination of hypothesized factors that effectively empower or disempower each of the three stimuli in each distinct country case. In the next step, using the method of Qualitative Comparative Analysis (QCA), the project compares the mixtures of factors found effective in the distinct country cases, looking for sets or combinations of factors that are effective within a restricted context (such as “developing nations”) or across a broader range of national contexts. The ultimate purpose is to develop a social theory of global climate change mitigation.

Theory Construction: Network, social and political

Through discussion, Compon members will develop testable hypotheses to explain cross-national variation in national mitigation responses to global climate change. We can draw these hypotheses from the existing literature on politics, culture and society, from substantive observation, or from specialized fields such as formal network analysis. We already have a tentative list of hypotheses, which will be subject to revision and improvement through our discussions.

The field of social network analysis which we employ has developed generic theoretical principles that apply to networks of any scale and composition, such as those concerning the relationship between centrality and power. Our network data will allow us to test these theorized effects as they pertain to processes of social change and political decision-making. Network data such as ours (collected at the “meso” level, among the organizations involved in bringing about “macro” social change and policy decisions) provide a new, detailed and systematic examination of important aspects of the processes that comprise and bring about social change and policy decisions.

The hypotheses we develop will be couched in ways that permit their testing using network and other data. Since policy network analysis is a relatively new, cutting-edge method, our hypotheses will feed back into and help improve general theory about social change and political decision-making. Our results will also contribute to the actual formation of policies at national and international levels to help with the mitigation of climate change and its related challenges.

Development of Hypotheses

Deductive from Network Theory:

Drawn from network theory about the effects of centrality, brokerage and other positions in networks in producing the power to affect social change and political decision-making.

Deductive from Social and Political Theory:

Drawn from theory about the social, cultural, economic and political factors conducive for and against effective global agreements and national efforts on the mitigation of climate change.

Inductive from observations:

Principles thought to affect mitigation efforts drawn from case studies and other observations and commentary.

Analytical Design:

Within each case, the data will contribute to the analysis of social and political processes and theories about that country. At the same time, the comparison of multiple or all cases in cross-national fashion, or as they relate to international policy-formation networks and actors, will permit our project to draw more general and global conclusions.

Scientific Method:

The project must follow objective methods of data collection and analysis, so that in principle other researchers could repeat them and reach the same conclusions.

All hypotheses need to be stated in terms that permit falsification (finding them invalid) using the objective data.

Contribution to actual solutions:

The human capacity to manage the problem of global climate change depends upon the development of new institutions that provide sufficient incentives to induce all major producers of greenhouse gas (GHG) emissions and holders of GHG sinks (forests) to act so as to rapidly reduce the global atmospheric concentrations of GHG. Preventing the heating of the global atmosphere from exceeding the red line of two degrees Centigrade (above the pre-industrial level)[1] will require that developed nations reduce their annual GHG emissions by 80% by the year 2050, and that major developing countries reduce theirs by at least 30%, compared to the 1990 baseline. To attain this goal, total global annual emissions must start to decline by the year 2015. Attaining these reduction targets will require a historically unprecedented level of global cooperation, involving a great rearrangement of living, consumption, fiscal and other patterns in the developed countries and a great transfer of resources and support from them to the developing countries. The existing international treaty attempting to establish this direction of ecological redevelopment, the Kyoto Protocol, while an instructive start, has not produced much real mitigation and does not include the developing countries. The efforts to create a new international agreement, which peaked at COP15 in Copenhagen, Denmark, in December 2009, must craft a new type of institution that will enlist the willing efforts of all related nations. This massive undertaking faces the most enormous “problem of collective action” ever experienced by global humanity. The practical purpose of the Compon project is to elucidate the social principles that will maximize the capacity of new agreements to produce effective mitigation by different nations.

2.  Protocols for Data Collection

Sources of Data -- Three Types

·  Printed media (articles in mass media, government records and documents, organizational publications) at Levels 1, 2, 3 and possibly 4.

·  Semi-structured interviews (coded for meaning categories using Nvivo or by hand).

·  Policy network survey (quantitative data analyzed with network software).

Data Compatibility

All Compon cases should follow the same data collection procedures as much as possible in order to maximize the eventual production of a common data set for comparative analysis.

Relative weight of different types and sources of data

While all cases will try to follow a common format, to some degree the use of data sources and types will depend upon the relative quality of the data available in the specific case context. For instance, where newspaper articles are not available in online, downloadable format, the research team may have to reduce the number of years collected.

Treatment of Media Data

Newspaper and other media selection

Select one conservative, one progressive and one economic newspaper (if possible).

Can also use other media: television, radio, internet blogs--if useful and if transcripts are available (though Nvivo can code them in original audio and visual format without need for transcripts).

Collect legislative records if available; in digital format if possible, or scan for database and coding.

Selection of newspaper articles

By keyword search in media database.

Search for the keywords climate change (CC), global warming (GW) and Kyoto Protocol (KP) (or other indigenous words for the same phenomena). Use an OR search, so that an article is picked up if it mentions any one or more of the three keywords. Your search engine may offer “Boolean” search criteria that permit searching for articles containing CC OR GW OR KP.
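As an illustration, the OR-style keyword filter described above can be sketched in a few lines. The keyword list and sample articles below are hypothetical stand-ins; in practice the search runs inside the newspaper database's own search engine, with the local-language equivalents of the three keywords.

```python
# Hypothetical OR-keyword filter; real searches use the media database's engine.
KEYWORDS = ["climate change", "global warming", "kyoto protocol"]

def mentions_keyword(article_text: str) -> bool:
    """Return True if the article mentions any of the three keywords."""
    text = article_text.lower()
    return any(kw in text for kw in KEYWORDS)

# Invented sample articles for illustration.
articles = [
    "The Kyoto Protocol targets were debated in the Diet.",
    "Stock prices rose on strong earnings.",
    "Scientists warn that global warming and climate change are accelerating.",
]
keyword_articles = [a for a in articles if mentions_keyword(a)]
print(len(keyword_articles))  # 2 of the 3 sample articles match
```

The OR logic matters: an article counts once whether it mentions one keyword or all three, which is the basis of the Level 1 counts below.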

Level 1 coding of article numbers.

During this search for keyword articles,

·  Conduct counts of numbers of articles per year in each newspaper (A) mentioning the keywords.

·  Also record the total number of articles published per year in each newspaper (B).

·  Dividing A by B will produce one measure of “news share.”

·  If you can record the total number of words in the keyword articles per year (Aw) and the total number of words in the newspaper by year (Bw), and divide Aw by Bw, that will give an even more exact “news share” measure.

·  If possible, please do both measures.
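The two “news share” measures above reduce to simple ratios. A minimal sketch, with invented yearly counts standing in for a real newspaper's totals:

```python
# "News share" as defined above: A/B by article count, Aw/Bw by word count.
def news_share(keyword_count, total_count):
    """Share of keyword articles (A/B) or keyword words (Aw/Bw)."""
    return keyword_count / total_count

# Invented per-year counts for one newspaper: A, B, Aw, Bw.
counts_2007 = {"A": 420, "B": 21000, "Aw": 180000, "Bw": 12000000}

article_share = news_share(counts_2007["A"], counts_2007["B"])   # 0.02
word_share = news_share(counts_2007["Aw"], counts_2007["Bw"])    # 0.015
print(article_share, word_share)
```

The word-based measure (Aw/Bw) is the more exact of the two because it is not distorted by variation in article length.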

*Level 1 coding is on all newspaper articles (raw number) mentioning one or more keywords.

*Level 2, 3 and 4 coding is on the cleaned media database.

Constructing the Cleaned Media Database for Levels 2, 3 and 4 coding

Database Cleaning and Reduction

For coding the articles at Levels 2, 3 and 4, the team will need to construct a “cleaned” database. From the article database, remove all articles that are duplicates, that are notices for meetings, or that only mention the keyword(s) without any substantive discussion or information. Some articles will have a substantive discussion about energy or politics or some other relevant topic but mention CC/GW only once, without much discussion of it per se. In that case, keep the article in the cleaned database, but in Level 2 coding, code it as having CC/GW as a minor theme (as opposed to a major or middling theme).

This removal will reduce the size of the database. Keep articles reprinted from foreign sources, but they do not need to be coded using DNA. When coding articles from foreign sources, code them using the Excel variables, but you do not need to code them for Excel Variables 13 through 16 (about public debates or policy contents).
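The mechanical part of this cleaning (exact duplicates and bare meeting notices) can be sketched as follows. The "Notice:" prefix test is an assumed heuristic for illustration only; judging whether an article contains substantive discussion of CC/GW remains a human coding decision.

```python
# Sketch of mechanical database cleaning: drop exact duplicates and
# meeting notices. The notice heuristic is an assumption for illustration.
def clean_database(articles):
    """Return articles with duplicate texts and bare meeting notices removed."""
    seen = set()
    cleaned = []
    for art in articles:
        text = art.strip()
        if text in seen:  # exact duplicate of an earlier article
            continue
        seen.add(text)
        if text.lower().startswith("notice:"):  # meeting/lecture announcement
            continue
        cleaned.append(art)
    return cleaned

# Invented raw articles: one notice, one substantive article, one duplicate.
raw = [
    "Notice: lecture on climate change, Friday 3pm.",
    "Ministry debates global warming policy in detail.",
    "Ministry debates global warming policy in detail.",
]
print(len(clean_database(raw)))  # 1 substantive article remains
```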

Storage of Database:

Store all documents in one or more long text files or MS Word files (properly labeled for easy identification of contents). You can put the data into several text or DNA files if one gets too long, but a single file can hold hundreds or even thousands of documents.

Database Backup:

Keep backup copies of all text files (Compon will try to establish a common backup system on the web) and other materials in at least three separate physical places and update each one daily.

3.  Protocols for Coding the Media Data (Levels 2, 3 & 4)

These procedures address all levels of Compon media coding. They presume the team has produced the digital data set of newspaper articles and other media (e.g., legislative records) needed for coding. This data set should contain all articles concerning CC/GW published in the national newspapers (or a random sample if there are too many for coding), including articles reprinted from the foreign press.

As noted above, Level 1 coding consists of computerized keyword counts (climate change, global warming, Kyoto Protocol and equivalents) using all articles in the three newspapers for 1997 to 2008 that mention the keywords. You can count these numbers online without downloading the articles.

Level 2, 3 and 4 coding requires that you download and clean the articles so that you only retain articles that have some substantive discussion of CC/GW. You will discard articles that only mention CC/GW in passing, such as a notice for a lecture.

·  The Level 2 Article Frame Coding Procedure codes the themes and general approach of an entire article (using variables in the standardized Excel database).

·  The Level 3 Actor-Statement Coding Procedure uses the Discourse Network Analyzer (DNA) software supplied by Philip Leifeld (or its equivalent) to code the statements of actors cited or quoted in the newspaper according to the stances they adopt toward proposed solutions and concrete policy proposals.

·  Level 4 requires a careful reading of each article and coding of its content for themes using the Nvivo qualitative data analysis software. Level 4 produces the most sensitive coding of the nuances of how media and actors frame the issue of climate change, but it is very time-consuming, so it can be done only if sufficient research assistance is available.

The same two coding procedures can be applied to the coding of other documents, such as records of legislative (Parliamentary, Congressional, Diet, etc.) debates. Having identified the main themes, actors and stances through this coding, the teams can then conduct in-depth qualitative interviews with the actors to obtain information on their deeper motivations, such as rationales and beliefs (these can be coded using Nvivo QDA software).

The Article Frame Procedure codes the data into an Excel, SPSS or other spreadsheet database. An Excel spreadsheet accompanying this guide (on the Compon project website) contains all the variables. The Actor-Statement Procedure codes the data using the Discourse Network Analyzer (DNA) software package authored by Philip Leifeld (please use the latest version, downloaded directly from his website). Coding the Actor-Statement data using the DNA software will take some extra time during the coding process, but it has several advantages: automatically preparing the data in the matrix form needed by the network analysis programs (UCINet, Netminer, Connectrix, etc.), providing instant access to the actual quotations highlighted in the original text, and running special analyses provided by its author.
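For orientation, the matrix form that the coded actor-statement data takes for the network programs can be illustrated with a toy example. This is a sketch of the data structure only, not of DNA's actual interface; the actors, statement categories and stances below are invented.

```python
# Illustrative actor-by-statement-category matrix (not DNA itself):
# rows are actors, columns are statement categories, and cells record the
# coded stance: agreement (+1), disagreement (-1), or no statement (0).
coded_statements = [
    ("Ministry of Environment", "carbon tax", +1),
    ("Industry Federation", "carbon tax", -1),
    ("Industry Federation", "voluntary action", +1),
]

actors = sorted({a for a, _, _ in coded_statements})
categories = sorted({c for _, c, _ in coded_statements})
matrix = [[0] * len(categories) for _ in actors]
for actor, category, stance in coded_statements:
    matrix[actors.index(actor)][categories.index(category)] = stance

for actor, row in zip(actors, matrix):
    print(actor, row)
```

A two-mode (actor-by-category) matrix of this kind is what the network packages take as input; they can then derive actor-by-actor agreement networks from it.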

The two procedures of media coding (Level 2 Article Frame and Level 3 Actor-Statement) will permit the analysis of the discourse content of the media public sphere and legislative debates, the actors appearing in them, the positions they hold, and changes in all of those over time. Not all articles will contain Actor-Statement information. If the article lacks such information, please code it only with the Article Frame Procedure.

After the Level 2 and 3 coding is finished, the coded categories of public debate and policy issue (Level 2) and actor statement (Level 3) can be mapped into the issue typology in the appendix, for cross-national consistency.

All coding of actors, organizational names, categories and other data should be done in the native language of the team and the case. It is important to retain the natural meaning content of the codes, for ease of use by the team and for later team publications in their own language. Translations into English for joint publications can be done later.