Primary Contact:

Curtis A. Keim

Dean of the Faculty

Moravian College

1200 Main Street

Bethlehem PA 18018-6650

610-861-1348 (office)

610-625-7784 (fax)

Project Title:

“Value Added Assessment of Programs of Intense Student-Faculty Interaction:

Developing Intentional Learners”

In this project, five national liberal arts institutions—Drew University, Moravian College (administrator), Muhlenberg College, Roanoke College, and Susquehanna University—will assess selected programs of intense student-faculty interaction for their contributions to the liberal education goal of developing intentional learners. The members of this collaborative have established a variety of academic programs intended to intensify the liberal arts experience, including first-year seminars, writing-across-the-curriculum, undergraduate research, and capstone experiences. In collaboration, the institutions will develop methods and tools to assess the value of such programs in promoting intentional learning.

Project Dates:

Start: August 1, 2006

End: July 31, 2009

Project Cost:

The Teagle Foundation: $300,360

Shared Institutional Costs: $415,950

Total Project Cost: $716,310

Value Added Assessment of Programs of Intense Student-Faculty Interaction:

Developing Intentional Learners

Drew University, Moravian College (administrator), Muhlenberg College,

Roanoke College, Susquehanna University

Project Rationale

As baccalaureate institutions committed to the ideals of liberal education, the members of this collaborative have, in recent years, established or expanded academic programs of intense student-faculty interaction. Consistent with our commitment to small classes and faculty accessibility, such programs are intended to improve the effectiveness and rigor of traditional pedagogies and curricular offerings. At our institutions, these programs include first-year seminars, capstone courses and senior experiences, writing across the curriculum, values and ethics courses, experiential learning, study abroad, and undergraduate research. We propose in this project to assess systematically the value added by such programs in terms of both program-specific learning goals and the development of intentional learners.

The development of intentional learners is a common goal of liberal education. The Greater Expectations program of the Association of American Colleges and Universities defines the goal as follows:

Becoming an intentional learner means developing self-awareness about the reason for study, the learning process itself, and how education is used. Intentional learners are integrative thinkers who can see connections in seemingly disparate information and draw on a wide range of knowledge to make decisions. They adapt the skills learned in one situation to problems encountered in another: in a classroom, the workplace, their communities, or their personal lives.[1]

In a Teagle Foundation-sponsored planning workshop, the collaborative teams discovered that each is particularly interested in how our institutions help students move from being passive recipients of information to active practitioners of intentional learning in a liberal education environment. Thus in our work, in addition to assessing program-specific goals, we will assess the role of our selected programs in fostering student awareness about the value and process of learning, student integration of knowledge from multiple disciplines, and student application of skills from one context to another.

A further theme of our proposal is the pursuit of assessment tools and methods in the context of liberal education. There is strong agreement among the collaborative participants that we need methods of assessment that better support our liberal arts missions and that better engage faculty members in the improvement of learning. This is not to imply that our current methods are unhelpful. Each institution currently administers national surveys such as the Cooperative Institutional Research Program (CIRP) and the National Survey of Student Engagement (NSSE) and uses the information gathered to make program improvements. In addition, with encouragement from our accrediting bodies, each institution is developing a variety of local assessment methods for courses, majors, departments, programs, and general education. Yet even as the sites and methods of assessment proliferate and our assessment skills improve, we perceive that we still do not understand our students holistically as learners.

We thus will explore ways to develop and integrate three categories of assessment information—national and local student surveys, student work, and results of qualitative face-to-face interactions. The key value-added assessment innovations will be the development of suitable assessment tools, the integration of different types of information, and the sharing of assessment knowledge and tasks among institutions. In the broadest sense, our goal is to foster a community of practice that promotes assessment-based inquiry and decision-making, a holistic understanding of our students, and the values of liberal education.

Project Description

Assessment Information

National and Local Surveys: Our institutions currently administer at least ten different national assessment surveys, which are listed below. We would like to improve our use of these surveys by integrating their data with other kinds of evidence and by collaborating to explore how the information might be understood. In addition, as we learn more about the relevant questions to ask, we will develop local surveys to help us understand the learning choices students make and how they perceive themselves as intentional learners.

Direct Evidence: Each project will incorporate direct evidence of student work, varying according to the needs of the specific project. In our pilot projects, we are analyzing assignments students already complete for courses. We are also developing rubrics for assessing student work relevant to the separate projects as well as to our common intentional learner project. As our expertise increases, some institutions will develop additional direct methods for analyzing student improvement (such as portfolios and the CLA or a CLA-type instrument) while maintaining a common rubric regarding intentional learning.[2]

Interactive Assessment: We will explore methods of “interactive assessment,” our term for a category of assessment methods that emphasizes face-to-face interactions between investigators and students. As background for our work, we are particularly impressed by the work of Richard Light and his colleagues in the Harvard Assessment Seminars. These seminars organized faculty members and students at twenty-five institutions to investigate various aspects of student development through personal interviews. Light writes that “these personal interviews paint an entirely different picture from the kind of information that comes from a large-scale, check-box style of survey questionnaire.”[3] The picture painted is complex and full of the depth, breadth, and vitality that other methods of assessment rarely portray. Moreover, developing such a complex assessment picture is especially interesting to faculty members.

A large-scale personal interview project is beyond our resources at this time, so we will develop interactive assessment information through various kinds of focus groups. For these focus groups, we will use standard research techniques for asking questions (with some questions that are the same for each institution), sampling, recording data (notes from a person who is not the leader and tape recordings as backup), and analysis (using rubrics to classify and count responses). However, the primary value of interactive assessment is the generation of ideas and questions that might be missed in surveys and analysis of student work. As we learn more about how and how much our students are growing, we can identify common themes and follow up with appropriate direct assessments and surveys.

First Steps

With the help of a Teagle Foundation “first step” grant, the collaborative has begun to develop its project. In the fall, Dr. Mary Allen led an outstanding two-day assessment workshop at Moravian College. We have attached the workshop program and Dr. Allen’s curriculum vitae.[4] At the workshop, twenty-one faculty members and administrators considered the range of value added assessment strategies and methods, as well as varieties of focus groups for interactive assessment.[5] We have engaged Dr. Allen on an as-needed basis to respond to e-mail questions. With the remainder of our Teagle Foundation grant, teams from the institutions will gather again in May at Drew University to discuss our pilot projects and continue our collaboration. Our goal for using our “first step” grant was to carry our ideas forward, whether we eventually received a larger grant or not. We are extremely pleased to have succeeded. Below we describe the early stages of our separate projects. These individual descriptions do not capture, however, the value we have already gained from collaboration in terms of sharing, energy, and acquisition of assessment knowledge.

The collaborative institutions are beginning projects as follows:

Drew University: First-year Seminar—A 25-year-old program with multiple advising, intellectual, and developmental goals, including the growth of students as intentional learners. Drew will design several direct and indirect ways to assess what the required seminars are contributing, both during the first year and over four years. Based on first-year NSSE data, Drew will develop questions and conduct pilot focus groups this spring and prepare baseline data for further studies. With respect to the student advising component of the first-year seminars, Drew will develop both quantitative and qualitative methods to assess the extent to which students learn to take an active and intentional role in planning the curricular and extracurricular components of their experience at Drew, and in planning for their lives after graduation. The goal is to use these data to improve the advising system at Drew.

Moravian College: First-year Writing and Writing-Intensive Courses in the Major—These required courses are intended to develop writing, information literacy, and critical thinking skills at the introductory and advanced levels. This spring, Moravian will assess first-year and senior work and conduct pilot focus groups at both levels with the goals of understanding whether students are learning to write for multiple audiences and are otherwise developing as intentional learners. The team will also study whether there are correlations of findings with NSSE and CIRP data.

Muhlenberg College: Capstone courses and student-faculty research—There is variability in the type of capstone experience that academic programs offer and in the availability of opportunities for students to engage in student-faculty research and study. By comparing outcomes for students with and without these experiences (using NSSE data, a student questionnaire, and focus groups with graduating seniors), Muhlenberg will begin this spring to study the effectiveness of the programs in fostering academic growth and intentional learning.

Roanoke College: Freshman orientation and first-year seminar—Roanoke initiated a new orientation approach in 2005 that paired orientation groups with faculty members who led activities, advised students, and facilitated discussion of the orientation reading assignment and other topics. Roanoke also is developing a first-year seminar that will provide greater interaction between first-year students and the faculty. Assessment of the new orientation process began this academic year and provided information to improve the program and build additional assessment processes.

Susquehanna University: First-year Core Perspectives and senior Capstone Experience—Susquehanna will assess how these new required courses encourage the development of intentional learners. This spring, course instructors are developing focus group questions using course, NSSE, and CIRP information, and in the fall they will apply the focus group template developed by Peter Hart for the AAC&U.

Timeline

With funding from The Teagle Foundation, our collaborative will continue to hold the fall and spring seminars begun in 2005-2006. These six gatherings will feature presentations by consultants and sharing among participants of assessment tools, information, and experiences. The seminars will rotate among the institutions, allowing partners to better understand each other’s students and distributing the opportunity to promote wider campus discussions. Our first-steps seminar at Moravian last fall allowed Moravian to invite Dr. Mary Allen to lead a separate workshop for non-Teagle-sponsored faculty members. Drew is planning a separate workshop on its campus when Dr. Allen leads our first-steps Teagle seminar in May.

In order to continue our conversations between seminars, we will create a collaborative listserv. We have also identified one or two participating or consulting faculty members at each institution who have expertise in quantitative and qualitative research, and we will continue to engage Dr. Mary Allen as an assessment consultant to provide advice for the overall project and assist collaborative partners on an as-needed basis.

The first year of our project will focus on defining valid measures for value added in the development of intentional learners, on improving our skills in applying the three assessment methods for pre- and post-experience assessment, on learning to integrate various kinds of data, and on developing assessment strategies that take advantage of our collaborative. We are considering, for example, cross-institution scoring of student work and developing one or several common instruments so that we can do some external benchmarking of aspects of intentional learning that are not assessed in the national surveys that we already administer. We are also considering seminar presentations on creating and applying assessment rubrics (a topic presented in outline by Dr. Allen), on student intellectual development, and on ways NSSE can be used to assess student growth. At our seminar at Drew this May, we will share what each institution has accomplished thus far and prepare concrete plans for the fall. We have engaged Dr. Allen for this workshop.

In the second year, as our understanding, skill, and collaboration mature, we will refine data collection techniques and analyze two years of data. We will identify patterns of successful teaching and learning within individual programs and across different kinds of programs, levels of students, and institutions. We will also begin to complete the assessment cycle by considering changes to our programs. At the present time we cannot predict what those specific changes will be (curricula, pedagogies, administrative structures,…) or how we will use our resources to effect them (course preparation, workshop organization or attendance,…), but we envision that by the summer between years two and three we will be planning and actually implementing changes.

In our third year, we will continue information collection and analysis as well as program change. By this time, our knowledge of assessment methods, our programs, and our students will be sufficiently developed that we can also begin to share that knowledge beyond our campuses. We will encourage faculty members involved in the investigations to present their findings in academic publications and conferences such as those sponsored by the Association of American Colleges and Universities, the American Council on Education, the Middle States Association, and the Southern Association of Colleges and Schools. We will also consult with specialists who can help us present our collaborative findings to a broad range of audiences beyond the academy.

Collaborative Team