L21 Steering Committee Minutes

3/18/13

Present: Andy Agar, Ehud Benor, Cathy Cramer, Karen Gocsik, Bob Gross, Alex Hartov, Josh Kim, Tom Luxon, Jennifer Taxman, Ellen Waite-Franzen, Barbara Knauff, Susan Zaslaw, Pat Coogan

1.  Spring Term Pilot Course Update – Barbara Knauff

Eight faculty have agreed to participate in the Spring Term pilot courses, teaching in Bb SP 11 or Canvas. As decided earlier, the pilot includes a number of smaller Writing and Rhetoric courses taught in either Canvas or Blackboard in order to allow a closer comparison of the systems.

Larger courses:

·  Gross, BIOL 4, Bb (31)

·  Somoff, RUSS 13, Canvas (40)

·  Goodman, SOCY 1, Bb (73)

·  Irwin, ECON 39, Canvas (2 sections, 70)

Writing and Rhetoric:

·  Gocsik, WRIT 11, Bb (16)

·  Compton, SPEE 30, Bb (14)

·  Anguiano, SPEE 20 & 27, Canvas (2 courses, 20)

·  Sargent, WRIT 43, Canvas (32)

Question: How advanced are the features being used by faculty teaching the pilot courses? Karen Gocsik (WRIT 11, Bb): plans to use the discussion board for peer review and the blog, and may try rubrics for students to apply to peer feedback on student presentations. She noted that she is not having difficulty importing Word documents into the course.

Bob Gross (BIOL 4, Bb): will use the discussion board and blog for continual peer review. Reported that the new Bb interface is like "night and day" compared to SP 7; he is accessing and using tools that were there all along but were less apparent.

2.  Review of Evaluation Activities and Discussion of Feedback Plan – Barbara Knauff

Barbara described evaluation activities during this phase of the LMS review, the intended audience, and the plan for gathering feedback from each (handout: L21 Evaluation Activities and Feedback Plan). Activities include mini courses, self-service exploration, Spring Term pilot courses, Town Hall demos, guided labs, and peer references.

Ehud expressed concern about the order of the rating scales used in the self-service testing environments: they run from negative to positive. Is that counter-productive? Are we testing for positivity or negativity? If possible, the team will adjust the rating scales to run from positive to negative.

Discussion: Are we asking the right questions?

Pilot courses:

Barbara explained that the team will collect feedback from pilot faculty and students at mid-term and at the end of the term. Because of the variability in the pilot environment (e.g., different courses, different faculty), pilot testing will not produce a one-to-one comparison of the platforms; rather, user feedback will provide information on different dimensions of usability. What information does the Steering Committee need?

Discussion: Suggested questions / areas of inquiry for pilot courses

For faculty:

·  Did you attempt or think about some change in your course design after evaluating this tool?

·  How has the system influenced your thinking about course design?

·  Did you find a feature in the LMS that you are testing that encouraged you to try something in your teaching?

·  Get a sense of which tools they used with each system and their level of expertise; this will reveal different views and senses of the system.

·  Canvas pilots: ask faculty to describe their experience with system upgrades while teaching the course.

·  Is this system integral to the course, embedded in the course, or is it an add-on?

·  The evaluation has many dimensions and it is difficult to ask faculty to compare tools when both systems are new to them. Are we testing the system or new ways of teaching the course?

·  Attempt to get a sense of the emotional reaction to the LMS. How did they feel? Prompt an emotional response in open-ended questions at the beginning of the survey.

For students:

·  Does the view in your course change how you see the course?

·  Describe any features used in the pilot that you had not used before.

·  Does the LMS increase your sense of ownership of your learning experience?

·  Do you think using the LMS increased your engagement and learning in this course?

·  Does the LMS allow you to see the logical progression of the material in the course?

·  Ask about experience with mobile apps.

·  Compare systems: Is your experience with the new Bb or Canvas better than your previous experience with Bb?

·  Is this system integral to the course, embedded in the course, or is it an add-on?

·  Use an approach similar to the National Survey of Student Engagement in question design.

·  Will analytics be evaluated?

·  Focus groups: ask faculty for permission to conduct short, in-class student focus groups in pilot courses, 30 minutes maximum. Bob and Karen supported this approach. Schedule toward the end of the term, frame as a teachable moment.

·  Attempt to get a sense of the emotional reaction to the LMS. How did they feel? Prompt an emotional response in open-ended questions at the beginning of the survey.

Mobile:

Josh asked about the evaluation plan for mobile. How will these features be tested?

The committee agreed that broader student representation in testing the mobile apps is the preferable approach.

Discussion on approaches to evaluating mobile:

·  In the pilot course orientation, Instructional Designers would be deliberate about pointing out mobile applications and would ask whether students intend to use mobile apps. In evaluation, ask whether students used the mobile apps, on what devices, and have them describe the experience.

·  Add links to the mobile apps to a visible place on the main page of each system.

·  Work mobile activities into the guided labs.

·  Develop Kresge training focused on using mobile devices. We need to consider which tasks to include, as not all tasks apply on mobile.

Organizations:

Tom asked whether the LMS support of organizations will be tested. Ellen replied that organizations need better sites of their own, not sites within an LMS.

Peer references:

Barbara provided a list of peer institutions that might be approached about either Canvas or Blackboard. What questions should we ask, and whom should we talk to?

Discussion and suggestions:

·  Involve faculty and students. Barbara noted that these individuals may be difficult to contact, but that instructional designers may provide good information on their experiences with faculty.

·  Ask for existing structured data and surveys about implementation.

·  Do you have any regrets about your decision?

·  Understand what they were using, and what the drivers were for looking at a new system.

·  How easy and how costly was implementation?

·  Was there an uptick in LMS use after adopting the new LMS?

·  Describe your change process and experiences.

·  We need to identify our goals for this process and ask each reference whether those were their goals as well.

·  Ask about the impact on demand for instructional design support.

·  SIS integration: Given their systems, how easy was it to integrate? How did it work out in real life?

·  How easy was it to work with the company? How responsive were they?

Can we make a recommendation before having all the Spring Term Course Pilot feedback?

Susan reviewed the feedback timeline (handout: Feedback Timeline).

Discussion: If a decision is made early, we will miss student review of materials; there is significant student experience toward the end of the course, and Grade Center data may not be fully utilized. Cathy indicated that she was hesitant to make a decision before a course is over.

Is the extra feedback from the end of the term that much different from the mid-term feedback? Is the information gathered mid-course, together with the other information the team has gathered, compelling enough to decide without waiting? Do we ask at the May 22 meeting whether we are ready to make a decision or want additional information?

A broader conversation is needed about change, and more information is needed about the faculty's willingness to change. The committee agreed that it is very important to have commitment from faculty that they are willing to change as often as Bb changes. And, if a recommendation were made to move to Canvas, do we know how faculty would react to constant incremental change? It is also important to look at the long-term relationship with the company; we should not be evaluating on functionality alone.

Follow-ups

1.  Barbara K.: draft a more definitive version of the survey and interview questions and circulate them to the committee by email.

2.  Explore the feasibility of offering the Kresge mini course on mobile.

3.  Address mobile apps questions to pilot courses.

4.  Change the order of questions in the pilot course survey.

5.  Ask faculty whether we could hold a student focus group during class or X hour.

Next Steering Committee meeting: Wednesday, April 10, 10:00 – 11:30
