September 29, 2004

M E M O R A N D U M

To: Participants of the Adding Value to the MSP Evaluations Conference

From: Norman Webb, Rob Meyer, and Paula White of the Adding Value Team

Subject: Summary of the Fourth Adding Value Conference

The fourth meeting of the Adding Value to the Mathematics and Science Partnership Evaluations Conference was held on September 16-17, 2004 at the Wisconsin Center for Education Research, University of Wisconsin-Madison. MSP evaluators, participants, and presenters were Terry Ackerman, Ruth Anderson, Joanne Bogart, Frank Davis, Jim Dorward, MaryAnn Gaines, Arlen Gullickson, Susan Millar, Judith Monsaas, Penelope Nolte, Beth Rodgers, Ben Sayler, Suzanne Sublette, and Valerie Williams. Persons in attendance from the Wisconsin Center for Education Research Adding Value Project were Janet Kane, Rob Meyer, Norman Webb, and Paula White. This memo summarizes progress made at the meeting.

Introductions and Review of Agenda for Conference

Norman Webb, Principal Investigator at the Wisconsin Center for Education Research, opened the conference by identifying the goals of the conference and the goals, principles, and activities of the Adding Value Project.

Conference Goals:

·  Further community among MSP evaluators

·  Address common issues

·  Provide assistance in analyses

Adding Value Project Goals:

·  Increase the knowledge of MSP evaluators about design, indicators, and conditions needed to successfully measure change in student learning over time

·  Develop useful tools and designs for evaluators to attribute outcomes to MSP activities

·  Apply techniques for analyzing the relationship between student achievement and MSP project activities to evaluate the success of MSP projects

Project Principles:

·  Build on what has been learned about evaluating large-scale systemic reform

·  Develop a learning community among the MSP evaluators

Project Activities:

·  Provide technical assistance to MSP evaluators regarding MSP evaluation challenges

·  Hold two-day meetings each spring and fall

·  Hold teleconference meetings to identify evaluation needs

·  Conduct site visits

·  Develop prototypes for value-added data analyses

·  Develop value-added and alignment tools

Site Round Robin

Norman Webb asked the evaluator or representative from each MSP or RETA present at the conference to provide a brief summary of the evaluation activities and issues associated with their project.

Black Hills Special Services Cooperative, Promoting Reflective Inquiry in Mathematics (PRIME): We have an MSP targeted grant focusing on mathematics. Our goal is for math teachers to complete 100 hours of professional development on content and inquiry-based instruction and then to implement inquiry-based instruction in the classroom. We have a large Native American population and a huge population gap; few Native Americans make it beyond the eighth grade. We’re working on keeping the students in high school and trying to close the achievement gap. The state standardized test is not well aligned with classroom instruction. We’re using performance assessments at grades 4, 8, and 11, and we also have standardized assessments at those grades. Inverness is serving as the external evaluator.

El Paso Collaborative for Academic Excellence: We have a comprehensive grant with 12 school districts participating. The resources are directed towards staff developers who spend a lot of time in the classroom to help teachers implement the curriculum frameworks designed jointly by the University of Texas-El Paso faculty. We lay out topics to be covered by grade and levels of cognitive demand. This is linked to the state standards. The department chairs are very involved in ensuring the success of a math-science partnership, which is the key to maintaining sustainability. We also have enhanced partnerships with business leaders, a parent involvement component, research projects for teachers, and a new teacher induction program.

Mathematics and Science Partnership of Southwest Pennsylvania: We have a comprehensive grant with 40 school districts involved. We administered the Survey of Enacted Curriculum online. We had to modify the survey to only include items related to the MSP initiatives. We had a 66 percent response rate. We’re also going to administer a principal survey at the K-8 level.

Texas Engineering Experiment Station, Alliance for Improvement of Mathematics Skills PreK-16 (AIMS): We have a targeted MSP in our second year. The project revolves around four goals: 1) to enhance professional learning of preK-16 administrators and teachers, 2) to provide challenging curricula, 3) to enhance applications of technology, and 4) to conduct research on the effectiveness of the interventions. We’re using an instructional content instrument. This year we added a higher education survey.

University of North Carolina, North Carolina Partnership for Improving Mathematics and Science (NC-PIMS): We have a comprehensive MSP working with 17 counties in North Carolina. They’re primarily the poorest counties and quite diverse. We’re trying to close the gap through lateral entry teachers from businesses who have content knowledge but lack pedagogical knowledge. We’re just starting our third year; it’s a cascade model. We are collecting benchmark data. We’re administering the CCSSO Survey of Enacted Curriculum this year. We’re also using a diagnostic model to provide a profile of information on skills students have mastered or not mastered.

University System of Georgia, Partnership for Reform in Science and Mathematics (PRISM): The focus is on P-12 and university collaboration around three goals: 1) to raise standards and expectations for students and to improve student achievement, 2) to improve the quality of teaching in mathematics and science through professional development for in-service teachers, and 3) to increase higher education involvement with P-12 schools. The evaluation challenges are determining what is and isn’t PRISM and setting up a tracking system to link the various variables. One strategy is learning communities for both higher education and P-12, and tracking the nature of those. Another strategy is to change the faculty reward system within all the universities in Georgia; this is interesting to evaluate.

University of Wisconsin, Madison, System-Wide Change for All Learners and Educators (SCALE): SCALE involves four school districts (Los Angeles, Madison, Providence, and Denver) and two institutions of higher education, UW-Madison and the University of Pittsburgh, plus the Institute for Learning (IFL). SCALE is organized around five goals: 1) the instructional system, 2) development of immersion units, 3) development of pre-service capacity in institutions of higher education, 4) equity and closing-the-gap issues, and 5) the research and evaluation team’s work, including the indicator line of work, targeted studies, case studies, and building partnerships.

Utah State University, Building Evaluation Capacity of STEM Projects: As a RETA project, we provide up to ten days a year of consulting to MSP projects. We are now assisting sites in responding to the feedback they received in site visit reports. We’re helping individual projects figure out how to provide evidence in their project evaluations to show that they’re changing. We’re developing an online logic tool and an online “Evaluation 101” course so that projects can steer their administrator, parent, and teacher groups toward making sense of evaluation efforts and findings. See www.usu.edu/cbec.

Vermont Institute for Science and Mathematics, Vermont Mathematics Partnership: We are a targeted project working on math. I’ve worked primarily with qualitative research on the evaluation, but a great deal of quantitative data has also been collected. The project grew out of a masters program to produce teacher leaders in math. We’ve done observations with high and low involvement. We’ve developed a “Math as a Second Language” course, and we’re looking at the norm-referenced test score data to see what we can say about the interventions. It’s been an evolving partnership with the university; we’ve doubled the number of participants in the project.

Western Washington University, North Cascades and Olympic Science Partnership: We are a comprehensive project with 28 school districts participating, covering a huge region of the state. The focus is on pre-service and in-service science indicators and a curriculum adoption program. We have a training institute for teachers with a focus on content immersion. We’re looking for partnership tools and ways of measuring connections and sustainability.

Assessing Partnerships Part I: Partnership Issues and Approaches

Norman Webb asked participants to have a “structured conversation” to think deeply about the issue of partnerships. The team structured questions to help get at the notions of partnership; the focus is not only on K-12 but on K-20.

Ø  What constitutes a partnership within the MSP solicitation? Who are the partners? What are the interactions and processes among the partners?

·  NSF’s concept of partnerships changes. We are subject to the new vision of partnerships as NSF and the projects become more definitive about what partnerships are and how they work.

·  Different conceptualizations of sustainability – see Jeanne Rose Century’s work on this: http://cse.edc.org/work/research/rsr/default.asp.

·  Partners and partnerships as defined in the proposals have both formal and informal dimensions and different forms of inclusion and exclusion. For example, STEM faculty and K-12 administrators make sense of the world differently and are not on the same page about the assumptions they make. Partnerships that span multiple cultures rarely have key players who are able to put on the table ideas requiring cross-cultural negotiation.

·  Partners have different stakes; something must be at stake for actors to see the value in a partnership, and there needs to be something at the table for everybody. Some sort of common experience is needed to bind people together.

·  Partners must both contribute and gain some advantage. Science and mathematics faculty may feel as though they are not always getting something out of the partnership, and may even see it as a hindrance.

Ø  How do you make people into “stakeholders” who see the value in the partnership?

·  Staff turnover: some parts of the country have an 80 percent teacher turnover rate. What is the nature of that partnership, then, and how can we get the initiative and professional learning in place “long enough” to sustain a partnership?

·  The partnership has to consider the individuals as well as the roles to be included in the collective.

·  The partnership involves the “NSF way” specified in the RFP; the other way involves how the work “really happens” as well as the expectations that people have (tenure track faculty member versus a principal or a teacher trying to meet standards).

·  The partnership needs to start with consensus, but also work long enough to operationalize and sustain it.

·  Concept of common vision or “minimal consensual cohesion.”

·  Idea of competing partnerships. “We’re not the only game in town.” Our partners have other partnerships that may have higher priority than our relationships.

·  Institutionalization of elements that already exist serves to help build sustainability (if the program ends, there are remnants left over).

·  Actors/partners need a common vision as well as the idea that benefits accrue to partners as a result. Notion of mandatory partners versus volunteer partnerships.

·  NSF is concerned with core math/science faculty supporting teachers; a large emphasis was placed on this during the site visit.

·  Partners are long-standing institutions that are officially defined, as the partners themselves do not really turn over.

·  NSF’s understanding is about how higher education can support K-12, but the focus groups highlight the other players and de-emphasize how higher education can do this. The leadership structure allows higher education to play a strong role.

·  Interaction needs to be two-way. There are different definitions of who the partners are: K-12 districts, higher education (universities, community colleges, levels of universities, Research I science centers), state agencies, the Chamber of Commerce, and professional organizations (as institutions).

·  The external evaluator is considered a partner, as is the professional development agency (“change agency”), a third entity that helps support the school districts and universities as they change.

Ø  Who within a partnership is subject to evaluation? (Goals, expectations, processes, etc.)

·  Evaluate the partnership to determine if it’s contributing to the changes in teaching and learning. We’re not looking at the nature of the partnership per se, but rather at the characteristics of the partnership.

·  We have institutions that are already actors and then there’s the idea of partnerships that says people can work together. How do people come together to solve a problem? The partnership gets at a new dimension addressing how to solve the problem—“what do people do to enable change?”

·  What else do we want to accomplish? NSF wants a change in the culture of institutions.

·  It is difficult to measure the “in between”; perhaps look at “ecosystem models” that would include formal and informal relationships as well as systems of reward.

·  We might need to build partnership profiles. Paying attention to simple things like the number of interactions and by whom might be worthwhile. There is also the idea of examining the character of interactions among people who come from different backgrounds; the result is “new knowledge” that is a product of the partners coming together (Sharon Derry, at WCER in Educational Psychology, has a model of this).

·  Evaluate the people involved, because individuals have “currency in their communities,” status, and “credibility indexes” within social networks. Are these people the “top dogs”?

·  Site visits are frustrating: NSF seems to want quantitative studies, but on the second visit one of the professors wanted a qualitative case study. It would be helpful if NSF had more explicit guidelines; we all define it our own way, and then different MSP program officers have different expectations of the methodology to use. We’re getting mixed messages.

Ø  If NSF is considered a partner, how are we going to get stable information?

·  The key thing to remember is that we’re operating under a cooperative agreement; it might not be in line with our site visit team, but we have to maintain some direction while remaining flexible. We need to focus on evaluating what is changing and what is particularly important in the individual partnership. We won’t all focus on the same change variables.

·  The Inspector General evaluated NSF; there is a need to develop a set of guidelines for projects. At the programmatic level, NSF can be assumed to be a partner, but a project partnership has lower-level partnerships as well (e.g., teacher teams).

·  The state agency needs to be involved; it’s a hierarchical model.

·  Working together versus building a partnership: looking at whether or not people are actually forming strong linkages. If NSF holds the purse strings, how can it be a fair and equitable partner?