Evolution of Evaluations for Critical, Reflective and Deliberative Discourse: National Issues Forums On-line
M.E. Holt, Ed.D., Associate Professor, Adult Education, University of Georgia
F. Rees, Ph.D., Program Planner and Evaluator, Athens, Georgia
J.D. Swenson, Ph.D., Associate Professor, School of Comm., Ithaca College
P.B. Kleiber, Ed.D., Department Head, Independent Study Program, UGA
Introduction
"The thinking that created the problems we face now cannot be used to solve the problems that thinking created" (Paraphrased from Albert Einstein, in Rothman and Sokoloff, 1997).
Interest in evaluation of deliberation
This project combines interests in the topics of assessment and public deliberation and uses National Issues Forums as a context for analysis of both. The evolution of evaluation tools across three years (1994-1996) will be traced and future assessment techniques will be described. As evaluators, our primary research question is "How can on-line deliberation be assessed?" As educators, our questions driving this on-going project are "Can deliberation occur on-line?" and "If so, can the deliberative process be learned through participation in an electronic forum?" As evaluators and educators employing instructional media, our purpose here is to share how our questions (and means for addressing them) have evolved as the technology has changed.
National Issues Forums
National Issues Forums is the name used to identify a public dialogue process and accompanying materials introduced in the United States in 1981 by the Charles F. Kettering Foundation and the Public Agenda Foundation. Each year these foundations and other contributing partners develop neutral, non-partisan materials that frame public policy issues around three or four choices for public consideration and potential action. A primary purpose of these forums is to advance people's thinking, through group conversations, from personal opinion to public judgment. Participation has increased dramatically each year, and forums occur in a wide spectrum of settings: colleges and universities, civic organizations, libraries, prisons, literacy projects, labor unions and presidential libraries, to name a few. Until recently, forums were almost entirely face-to-face exchanges, with occasional attempts to conduct them using television interface methods.
Public deliberation in cyberspace
For the past three fall academic terms, the authors have experimented with on-line National Issues Forums involving student participants enrolled at the University of Georgia and Ithaca College. In fall 1994 participants discussed "People and Politics" on-line, and in fall 1995, "Affirmative Action." For the first two years, the electronic forums were conducted via an e-mail listserv. In 1996, using the topic "How Should We Govern America?" and web-conferencing software (Facilitate.Com), the deliberations moved to the World Wide Web. Each of the three forums was held over a six-week period and followed the procedures and materials developed by National Issues Forums for face-to-face forums. Each week the moderators posted a choice, posed the options as questions, and asked participants to respond. While one technical moderator at each site helped participants with skills such as using the software and accessing the web, two on-line moderators took responsibility for the content and process of the deliberation. In addition, before the forum began, participants were provided at the web site with guidelines for on-line discussion, netiquette, and informed consent and research statements.
The evolution of evaluating on-line forums
Evaluations of on-line National Issues Forums (NIF) have occurred every fall from 1994 through 1996. This section will summarize the evaluation tools utilized to assess the deliberative process for each on-line forum. The next section will describe more fully the assessment techniques and our findings. The final section offers a discussion and recommendations for future research.
The evaluation tools used for our first on-line NIF consisted of a short (9-item) on-line summative evaluation, a summative (27-item) off-line evaluation, and moderators' journal notes. The second year the same assessment tools were used, but a discourse analysis was added. For the third NIF, evaluation tools consisted of a midway on-line evaluation, an off-line summative evaluation, a discourse analysis, a constant comparative analysis of students' journals logging their reflections on both content and process, technical and content moderators' plans and notes recorded as e-mail postings, and a protocol analysis of the web forum transcript. Table 1 outlines the types of evaluation used for these on-line forums over the three years.
In Kirkpatrick's (1994) model, our assessment tools have evolved from Level I evaluations to Level III. Level I evaluations may be thought of as self-reported reactions: did the learners enjoy the educational experience, and did they feel good about it? The first year's on-line and off-line summative evaluations were the least sophisticated to prepare and were cumbersome to analyze; however, they yielded rich data. Reactions from participants in the electronic forum proved instructive in guiding the development of each new forum. Further, the findings from the Level I evaluations contributed to the design and interpretation of higher-level evaluation methods: discourse, constant comparative and protocol analysis.
The lessons learned from the initial listserv experience served as the basis for The Electronic Forum Handbook: Study Circles in Cyberspace (Kleiber, Holt and Swenson, 1996). Analysis of the experience and evaluation data from the first forum is the focus of "The Changing Face of Distance Education," (Holt, Kleiber and Swenson, 1995) presented in Vancouver at the 1995 International Evaluation Association conference.
Summative evaluations were repeated in the second and third forums. They provide self-reports on the potential of NIF on-line to contribute to learning, to develop communication skills, and to build community in cyberspace. The results from these Level I evaluations allowed us to evaluate expectations versus experiences, participation patterns, technology as a mediating factor, moderator effectiveness, and the quality of on-line forums compared with face-to-face public deliberation.
Evaluation results from the first forum found little evidence of public deliberation. The "talk" can best be described as a wheel without a rim: participants' replies to moderator stimulus probes from the center hub resembled the spokes of a wheel, but there was very little dynamic interaction between participants.
Discourse analysis was added to the evaluators' toolbox in our assessment of the second forum. The extent of participant-to-participant interaction can be evaluated with methods of discourse analysis; by tracking interactivity, participants' engagement can be determined. Since "engagement" is a defining characteristic of public deliberation, this Level II evaluation method is powerful. Discourse analysis also distinguished recurrent themes and identified value conflicts and value convergence. The results from the second forum reveal that NIF on-line served more as an awareness-raising vehicle than as a site for decision-making (Rees, 1996/7).
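To make the interactivity tracking concrete, the following is a minimal sketch in Python (not the instrument used in this study) of how moderator-to-participant versus participant-to-participant exchanges might be tallied from an archived transcript. The Posting structure, role labels and sample data are hypothetical assumptions for illustration only; a rimless wheel would show almost no (participant, participant) links.

```python
from collections import Counter
from dataclasses import dataclass
from typing import Optional

@dataclass
class Posting:
    post_id: str
    author: str
    role: str                       # "moderator" or "participant" (assumed labels)
    reply_to: Optional[str] = None  # post_id of the message being answered, if any

def tally_interaction(postings):
    """Count exchange types: moderator-to-participant links are the 'spokes'
    of the wheel; participant-to-participant links form the 'rim'."""
    by_id = {p.post_id: p for p in postings}
    links = Counter()
    for p in postings:
        if p.reply_to and p.reply_to in by_id:
            source = by_id[p.reply_to]
            links[(source.role, p.role)] += 1
    return links

# Hypothetical example: two spokes and one rim segment.
sample = [
    Posting("1", "mod_A", "moderator"),
    Posting("2", "student_B", "participant", reply_to="1"),
    Posting("3", "student_C", "participant", reply_to="1"),
    Posting("4", "student_B", "participant", reply_to="3"),
]
print(tally_interaction(sample))
# Counter({('moderator', 'participant'): 2, ('participant', 'participant'): 1})
```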
For the third forum, the technology change from a listserv to the web required us to revise, refine and devise new evaluation methods in order to assess effectively whether deliberation occurred on-line. The most recent forum deployed summative evaluations again. The summative off-line evaluation was originally constructed as a 27-item questionnaire. It was increased by two items in the second year and to 64 items in the third year for the web-facilitated forum. Items came from reviews of the literature on on-line conferences, computer-mediated communication, and other educational electronic activities; from colleagues in the National Issues Forums network; and from the classroom experiences of the investigator-instructors. The large jump in the number of items in the third year was an intentional attempt to capture more precise information about the reflective and deliberative process from student-participants' perspectives. Items were both closed-ended and open-ended.
The on-line summative evaluation was a short 9-item survey administered at the conclusion of the first and second forums. For the third forum, the initial web-based one, an on-line midway evaluation was tried in order to troubleshoot concerns, needs and problems. In the first and second forums, moderators' journals contributed additional formative and summative assessment data. In the third forum, moderators, planners and instructors kept a reflective log through exchanging electronic mail postings. This archive offered additional evaluative information.
Two additional methods were added to the assessment mix in the third forum: first, a constant comparative method of analysis and, second, a protocol analysis. The first technique involved examining student-participants' journals thematically and is described more fully below. Like discourse analysis, the constant comparative technique is a Level II evaluation method. The second new methodology involved coding the transcript of participants' postings to the web forum for evidence of the six steps of the deliberation process. The technique of protocol analysis fits Kirkpatrick's (1994) definition of a Level III evaluation method in that performance of skills in deliberation is measured.
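To illustrate how a coded transcript might be summarized for the protocol analysis, here is a minimal Python sketch. Because the paper does not name the six steps, the step labels (STEP_1 through STEP_6), the tally function and the sample data are hypothetical placeholders; in practice the codes would be assigned by human coders reading the web forum transcript.

```python
from collections import Counter

# Hypothetical labels: the study refers to six deliberation steps but they are not named here.
DELIBERATION_STEPS = [f"STEP_{i}" for i in range(1, 7)]

# Each entry pairs a participant with the step codes a human coder assigned to one posting
# (entirely illustrative data).
coded_postings = [
    ("student_B", ["STEP_1"]),
    ("student_C", ["STEP_1", "STEP_3"]),
    ("student_B", ["STEP_4"]),
]

def step_coverage(coded):
    """Tally how often each deliberation step appears, overall and per participant,
    and list the steps for which the transcript shows no evidence."""
    overall = Counter()
    per_person = {}
    for person, codes in coded:
        overall.update(codes)
        per_person.setdefault(person, Counter()).update(codes)
    missing = [step for step in DELIBERATION_STEPS if overall[step] == 0]
    return overall, per_person, missing

overall, per_person, missing = step_coverage(coded_postings)
print(overall)   # frequency of each coded step across the transcript
print(missing)   # steps with no evidence, e.g. ['STEP_2', 'STEP_5', 'STEP_6']
```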
Our evaluation methods evolved from Level I "smile sheet" evaluations to Level II and Level III evaluations. Level II evaluations emphasize knowledge acquisition, which discourse analysis and constant comparative analysis of participants' journals effectively assess. Level III evaluations assess behavioral change through the application of skills in performance. Triangulating evaluative methods enhances our understanding of how and when deliberation occurs on-line.
Summary of Evaluation Methodologies and Findings
Summative Evaluations
By collecting student reactions each year, forum leaders have been better able to anticipate the concerns of future participants and to give direction to engaging learners in the forum. The results have added new information and insights for planners to fold into their strategic designs for advancing and modeling reflective and deliberative forms of critical thinking in electronic forums.
One example of information obtained that influenced the development of future forums is the importance of feedback. Unlike in face-to-face forums, persons participating in computer-mediated forums are "invisible." Several participants reported frustration when they posted a comment or question and it did not receive any replies. One said, "I need immediate feedback, which allows for clarification of content and follow-up questions or comments." This alerted planners to orient moderators and participants to the importance of responding, rather than allowing words to just hang in cyberspace and thereby creating and intensifying isolation. "I was a little disappointed in not knowing how most felt about what I was sending. I could not know who agreed or disagreed with my comments except by those who stated so. The social aspects are very limited," reported another participant.
A second example of how Level I evaluation data proved valuable came from a few participants who were not United States citizens. Issues of multicultural participation and diverse ways of knowing and thinking surfaced. Future consideration of these findings is key to plans for an international electronic forum.
Finally, planners learned in each forum that intergenerational demographics in group composition should not be ignored; rather, age diversity needs to be acknowledged and affirmed. These findings inform ways and approaches to achieve more constructive intergenerational dialogue. New techniques continue to be explored based upon reactions to the NIF on-line experiences reported in summative evaluations. "I was surprised at the involvement of the youth," reflected one Georgia participant.
Self-assessments disclosed in the summative off-line evaluations indicated that thinking evolved and changed across the life of the forum. These findings directed attention to patterns, stages and cycles in the progression of a forum. Research questions for the discourse, constant comparative and protocol analyses were formulated from the results of these Level I evaluations. New research questions continue to surface that can be explored further by triangulating methods.
To illustrate, one student replied to the question asking whether the Web conference encouraged communication: "I began to see communication as a real effort to facilitate change and believe in the importance of a collective voice. I started to ponder issues and talk with friends about them." Another said, "Socially, I appreciated hearing the other's opinions and greatly appreciated the ice-breakers. I felt somewhat intimidated by the subject and therefore took a while in warming up....I wanted to at least make some intelligent remarks that were accurate." Finally, although a small percentage felt their thinking had not changed, those who indicated that it had changed made quite strong statements: "Through all the negativity, I became more positive about America." "I think they [my responses] went from somewhat topical to in-depth." "My primary answers were broad; as time went on I became more critical and specific with suggestions."
Most dramatically, students in both the listserv and web forums reported the influence of these activities on their reflective thinking. Students appreciated being asked what they thought about the issues, observed that it was necessary to ponder the questions before responding, and said the process prohibited "knee-jerk" reactions. Illustrative comments included: "Absolutely, I think this is what the forum did the most for me. Most of the time my best responses came after I exited the forum." "This was perhaps the most evident area of a positive outcome because the forum could be entered or exited at will; there was time for reflection." "Yes, you couldn't just write anything and sound like a fool." "I think this was a strength of the forum." "This was its strongest attribute. I learned by analyzing others' opinions, the questions proposed and by evaluating my own beliefs and feelings."
Ehrmann (1995) argues that, in research about technology for teaching and learning, we have not asked the right questions. Humorously, he chides that too much research has attempted to see whether learning is better achieved by using technology than without it:
I've got two pieces of bad news about that experimental English comp course where students used computer conferencing. First, over the course of the semester, the experimental group showed no progress in abilities to compose an essay. The second piece of bad news is that the control group, taught by traditional methods, showed no progress either (1995:20).
Summative off-line evaluations are useful for initiating a research investigation in an area where little groundwork exists for asking the right kinds of questions about new applications of instructional technology. The findings reshaped our research inquiries and agenda, leading us to develop new tools to tackle these questions.
Discourse Analysis
One significant advantage of researching on-line forums is that the entire record of communicative acts is archived. Participants, moderators and researchers can revisit the exact words previously transmitted in a forum. At the same time, a major challenge is that much is absent: eye contact, body language, facial expressions and voice tone, for example. Electronic forums require participants to converse in written language, which enables students to characterize, concretize, and conceptualize their ideas, values, beliefs and attitudes in a comprehensible, rational and reasonable manner.
The theoretical underpinnings for discourse analysis of NIF on-line derive from critical language study. Discourse analysis can be used to investigate the interaction process that takes place when people communicate (Rees, Cervero, Moshi and Wilson, 1997). It reveals what Fairclough (1989, 1992) labeled the power "in" and "behind" language. The power behind language is its macro structure: the norms of interaction used by particular individuals in particular settings that have particular meanings. These norms of language use reflect cultural beliefs and reinforce cultural norms. Language use encodes the "culture" of a people, an organization, an idiosyncratic group, or any two friends. In this respect, language use is a cultural artifact. The power in language is the way language is actually practiced in dialogue to condition certain effects. Thus, the power in language can alter macro norms. The impact language use has on the effects of "talk" makes "reading" the electronic record of interaction dynamics both crucial and judicious. Discourse analysis allowed us to assess the transformative power of public deliberation.
Analysis of verbatim interaction entails marking who talks, when they talk, and what linguistic devices they use within and across transactions. Transactions are groups of utterances tied together thematically around a particular topic. When the macro and micro structure of speaker-response transactional exchange focuses the investigation, turn-taking becomes the unit of analysis rather than more traditional linguistic features such as lexicon or grammar. The macro structure of turn-taking serves as a yardstick against which micro patterns of exchange are compared (Hinds, 1979; Jones & Jones, 1979; Ochs, 1979). Both the micro and the macro structure of turn-taking come to be very rule-governed, with the consequence that the two structures interact to condition who speaks, when they speak, and the linguistic devices they use (Fairclough, 1989, 1992). This systematization is necessary for efficient meaning-making, or the mutual apprehension of communicative action (Lakoff, 1980). Systematization is also the reason why interpretation of communicative action is contextualized by cultural, social, and political values and beliefs. In fact, communicative action can have varied and sometimes contradictory meanings because context plays such a large part in the interpretive process.
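To make the unit of analysis concrete, here is a minimal Python sketch (not part of the original study) of how turn-taking within thematic transactions might be tabulated from an archived transcript. The transaction labels, speaker names and data format are hypothetical assumptions for illustration; in practice, identifying transactions requires a human reading of the archived record.

```python
from itertools import groupby

# Hypothetical (transaction_label, speaker) pairs listed in posting order; utterances are
# assumed to be grouped consecutively by transaction (a thematically linked set of postings).
utterances = [
    ("choice_1", "mod_A"),
    ("choice_1", "student_B"),
    ("choice_1", "student_C"),
    ("choice_2", "mod_A"),
    ("choice_2", "student_B"),
]

def turn_sequences(utts):
    """Return, for each transaction, the ordered list of speakers: the macro
    turn-taking structure against which micro patterns of exchange can be compared."""
    sequences = {}
    for label, group in groupby(utts, key=lambda u: u[0]):
        sequences[label] = [speaker for _, speaker in group]
    return sequences

print(turn_sequences(utterances))
# {'choice_1': ['mod_A', 'student_B', 'student_C'], 'choice_2': ['mod_A', 'student_B']}
```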