May 21, 2008 Carnegie Affiliates Videoconference

PARTICIPANTS

Indian River College (IRC)

Deanna Voehl

Email

Deb Whiting

Email

Loyola Marymount University (LMU)

Jackie Dewar

Email:

Park University (PU)

"Dailey-Hebert, Amber"

Email

Thompson Rivers University (TRU)

Lyn Baldwin

Email:

Elizabeth Templeman

Email:

NOTES FROM THE VIDEOCONFERENCE

------

The conference began at 10 AM Pacific Time with LMU conveying greetings from Tony Ciccone, CASTL Director. Listed in APPENDIX 1 at the end of these notes are comments and suggestions that Tony sent on the discussion topics that had been proposed before the videoconference.

After introductions each institution described its recent SoTL work and posed one or more questions of interest. The videoconference ended with a discussion of the Affiliates suggestions for critical next steps for SoTL and Carnegie’s role.

Outcomes, resources to be shared and suggestions are summarized here:

·  Each Affiliate can link a campus website describing its SoTL work to the Carnegie website. Send the URL that you want linked to Zaineb Alkoraishi.

·  To find out whether any of the 14 Affiliates are sending representatives to the STLHE conference, the team leader at any Affiliate can send an email to the Affiliates listserv. It was emphasized that only the team leader at each Affiliate campus can send and receive email at this address.

·  TRU has developed a faculty survey regarding teaching and SoTL that it plans to administer now and again in about two years, after its SoTL initiatives. TRU expressed willingness to share the survey questions; if other institutions use them, it may be possible to make some comparisons across institutions.

·  Several institutions were working on revising student evaluation of teaching (SET) instruments and were at various stages of the process. LMU will share the summary of its work in the past year to develop a pedagogy-neutral form (Appendix 2) and the guidelines document it developed for interpreting SET data (accessible at http://www.lmu.edu/Page35750.aspx ). The following link leads to an online course evaluation form that anyone can use for formative assessment and that allows you to select questions from a large question bank organized by pedagogy type: http://www.ctd.ucsd.edu/resources/evaluations/index.htm

·  A background document on developing SETs was recently written by Ed Nuhfer, a colleague at Cal State University Channel Islands. Contact Ed Nuhfer (), who directs the CSUCI teaching/learning center, to request a copy.

·  There was interest in details of the regional symposium sponsored by Park University and seven other institutions. PU advised that limiting the number of attendees made the conference more manageable and allowed for smaller work groups.

·  There was interest in TRU’s model of bipartite and tripartite faculty contracts, under which faculty commit to two types of work (teaching and service) or three types of work (teaching, research, and service). The agreement outlining the different ranks can be found at http://cariboo.tru.ca/trufa/Collective_Agreement/TRUFA_Collective_Agreeement_2004-2010.pdf. Note that ranks are detailed beginning on page 27 and assigned workload is discussed beginning on page 85. If there are questions, don’t hesitate to contact Lyn Baldwin () or Elizabeth Templeman ().

·  IRC described the development work currently underway to create a web-based best-practices resource.

·  PU is interested in creating a website with resources, information and links to SoTL work that is structured in such a way as to be useful to beginners at SoTL as well as more experienced faculty.

·  PU now publishes a SoTL journal, InSight, which accepts submissions from outside PU. The next volume (#3) will be out in July.

DISCUSSION OF CRITICAL NEXT STEPS FOR SOTL AND CARNEGIE’S ROLE

The videoconference closed with a discussion of what we considered important next steps for SoTL and how Carnegie might play a role. Participants observed that the Carnegie role currently provides visibility and weight to this work and facilitates collaboration among institutions. Being able to say that our SoTL work is part of a larger initiative is important, and the CASTL name lends more emphasis than that of a professional society alone would. We all felt that maintaining and expanding collaborative linkages was very important, as was having the Carnegie site as a repository of SoTL work and SoTL participants, both individuals and institutions. This is one way it can continue to assist collaborations. All of the participants were interested in knowing more about plans for the gathering of institutional presidents and provosts prior to or at the Oct 22-25, 2009 ISSOTL conference.


APPENDIX 1: Previously proposed discussion topics with feedback and suggestions from Tony Ciccone


o Exploring synergies between SoTL and student outcomes assessment
The theme group facilitated by Cheryl Albers at Buffalo State has been working with the CLASSE instrument through NSSE and Alex McCormick on an interesting project that does just this. You may want to touch base with her in about a month, when she'll have some results to share.
o Taking a SoTL approach to assess the effectiveness of faculty development centers
Here the connection might be with Lin Langley and Mary Savina in the "cross-cutting themes group."
o Taking a scholarly approach to evaluating teaching and/or learning
We've started to think about the evaluation of teaching as a way into studying the impact of the SoTL "movement." I'd love to have this discussion with you or others.
o Exploring synergies between SoTL and the impact of service-learning courses on student learning
No particular work that I'm aware of going on here, but certainly an excellent path to follow.


APPENDIX 2: Loyola Marymount University SET report

Report to LMU Academic Leadership Conference from Committee on Student Evaluation of Teaching 4-25-08

Prepared by Jackie Dewar and Margaret Kasimatis

Context: After holding two workshops in the Center for Teaching Excellence (CTE) in Fall 2006 titled “How Can/Do We Evaluate Teaching at LMU” and conducting a follow-up survey, a report from CTE to Faculty Senate and the CAO recommended a committee be appointed to review/revise the student teaching evaluation form. This recommendation was reinforced in discussions at the 2007 Academic Leadership Workshop, in which participants expressed strong dissatisfaction with the current teaching evaluation instrument. A committee was convened in late Summer 2007 and began meeting every two weeks throughout the 2007-8 academic year.

Committee Members (Spring 2008):

Richard ‘Sonny’ Espinoza, Paul Humphreys (replaced Michael Miranda due to sabbatical), Linda Leon, Jennifer Pate-Offenberg, James Roe (Chair) with Jackie Dewar and Margaret Kasimatis serving as resource persons to the committee

Actions:

·  Developed and distributed guidelines for interpreting student evaluation form data to deans, chairs and the Committee on Rank and Tenure

·  Developed a pilot form that attempts to take into account best teaching practice, but remains pedagogically neutral

·  Pilot tested the form in Fall 2007 in 14 course sections, both graduate and undergraduate, representing all schools and colleges (total number of students = 278)

·  Ran an initial statistical analysis comparing particular questions on the old form with certain questions on the new form

·  Completed the analysis of responses to an open-ended question at the end of the pilot form asking students to comment on the pilot form as compared to the standard form

·  Held two focus groups in which students examined and compared the forms but did not actually use them (one with 8 grad students; one with 8 undergrad students)

·  Recruited additional pilot testers for Spring 2008 with the intent to increase the number of data points to at least 30 so as to be able to do additional testing (e.g., internal consistency)

Outcomes:

·  Initial statistical analysis shows that certain questions on the pilot form revised to be pedagogy-neutral still capture key instructional dimensions that were on the old form (such as organization and clarity)

·  The graduate focus group indicated a preference for the new form

·  The undergraduate focus group was more mixed

·  The initial qualitative analysis of the 278 forms indicates overwhelming preference for the new form by the actual users of the form

Next Steps:

·  Analyze data from the Spring pilot

·  Survey faculty who were in the pilots regarding their satisfaction with the form

·  Complete the data analysis and report and hold faculty forum(s) in early Fall 2008

·  Distribute guidelines for interpreting student evaluation form data to all faculty

·  Develop a pool of optional questions that are directed to particular pedagogies (e.g., lecture, performance class, etc.) for use by departments or colleges

·  In 2009, make a final recommendation to the Faculty Senate for action