Rating of LO/animation – OSCAR

Introduction

The purpose of this document is to propose a rating strategy for OSCAR LOs/animations and to implement it on the OSCAR website.

This document describes some of the most widely used strategies and techniques for evaluating LOs/animations. It was prepared from a survey of online LO repositories, evaluations of LOs/animations, and related papers and journals.

Evaluation results are generally published for two purposes:

  1. Searching the repositories: a published evaluation/review helps users (teachers, students or others) identify and choose quality LOs for a topic/subject.
  2. Improving the quality of LO/animation.

I have picked some of the general techniques in use; they are given in the subsequent pages.

When we design the rating, we have to consider the purpose for which we are asking for it. Four techniques are given in some detail, one per page. The last page gives a summary, with a sample of LO websites and the evaluation support they offer.

Please give some thought to the rating approach we should follow. When we meet on Thursday, can we also talk about this so that I can get the work started?

Developing and evaluating a strategy for learning from animations

Based on the cognitive process of learning from animations, two testing strategies are used.

Uwe Kombartzky, Rolf Ploetzner*, Sabine Schlag, Berthold Metz

The first strategy tests the design of the animation as interpreted by the learner (including the text, the sequence of frames, the animation sequence, etc.).

The second strategy tests what the learner has learned, using:

  • A question addressing factual knowledge
  • A question addressing conceptual knowledge
  • A question addressing rule-based knowledge

The paper used this strategy to improve the animations, which I feel is as good as rating an animation.

------

Developing and evaluating a strategy for learning from animations

Uwe Kombartzky, Rolf Ploetzner*, Sabine Schlag, Berthold Metz

Institute of Media in Education, University of Education, Kunzenweg 21, D-79117 Freiburg, Germany

Received 21 November 2008; revised 6 March 2009; accepted 8 May 2009

Based on current theories of multimedia learning, the paper develops a strategy for learning from animations.

Appendix A. The learning techniques employed in the strategy for learning from animations, grouped by cognitive process.

1. Orientation and development of expectations:

  • Observing the animation and formulating expectations

2. Selection and organisation:

  • Identifying and sketching important frames
  • Identifying and taking notes of important statements
  • Identifying and marking important regions in the frames
  • Identifying and marking important phrases in the statements
  • Labelling regions in the frames

3. Transformation and integration

  • Expressing relations between frames and statements in one’s own words
  • Summarising the displayed process in one’s own words

Appendix B. Three sample questions from the posttest

  1. Example of a question addressing factual knowledge: ‘‘Honey bees communicate the location of resources to other bees. Name two possibilities of how bees communicate with other bees.’’
  2. Example of a question addressing conceptual knowledge: ‘‘In the picture on the left hand side, you see the position of the comb, the resource, and the sun. In the picture on the right hand side, draw how the honey bee will dance on the comb in order to communicate the location of the resource to other bees.’’
  3. Example of a question addressing rule-based knowledge: ‘‘The comb and the resource remain at the same location, however, the position of the sun changes during the day. What is the time of day when the honey bee performs each of the two dances shown in the pictures below?’’

LORI (Learning Object Review Instrument)

(John Nesbit, Karen Belfer, Tracey Leacock)

What is LORI?

In evaluating a learning object with LORI, reviewers can rate and comment with respect to nine items. For each item, quality is evaluated on a rating scale consisting of five levels. If the item is judged not relevant to the learning object, or if the reviewer does not feel qualified to judge that criterion, then the reviewer may opt out of the item by selecting “not applicable”. The reliability of LORI was investigated by Vargo, Nesbit, Belfer and Archambault (2003). The nine items are:

  1. Content Quality: Veracity, accuracy, balanced presentation of ideas, and appropriate level of detail.
  2. Learning Goal Alignment: Alignment among learning goals, activities, assessments, and learner characteristics.
  3. Feedback and Adaptation: Adaptive content or feedback driven by differential learner input or learner modeling.
  4. Motivation: Ability to motivate and interest an identified population of learners.
  5. Presentation Design: Design of visual and auditory information for enhanced learning and efficient mental processing.
  6. Interaction Usability: Ease of navigation, predictability of the user interface, and quality of the interface help features.
  7. Accessibility: Design of controls and presentation formats to accommodate disabled and mobile learners.
  8. Reusability: Ability to use in varying learning contexts and with learners from differing backgrounds.
  9. Standards Compliance: Adherence to international standards and specifications.
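As a sketch (not a finalized design) of how OSCAR might store and aggregate LORI reviews: each review rates the nine items above from 1 to 5, with “not applicable” recorded as None and excluded from averages. The per-item averaging rule is our assumption, not part of LORI itself.

```python
# Sketch: aggregating LORI reviews for one learning object.
# Items are rated 1-5; None stands for "not applicable".
# Ignoring "not applicable" when averaging is our assumption.

LORI_ITEMS = [
    "Content Quality", "Learning Goal Alignment", "Feedback and Adaptation",
    "Motivation", "Presentation Design", "Interaction Usability",
    "Accessibility", "Reusability", "Standards Compliance",
]

def lori_averages(reviews):
    """reviews: list of dicts mapping item name -> rating (1-5) or None."""
    averages = {}
    for item in LORI_ITEMS:
        scores = [r[item] for r in reviews if r.get(item) is not None]
        averages[item] = sum(scores) / len(scores) if scores else None
    return averages

reviews = [
    {"Content Quality": 4, "Motivation": 5, "Accessibility": None},
    {"Content Quality": 5, "Motivation": 3},
]
avg = lori_averages(reviews)  # e.g. avg["Content Quality"] == 4.5
```

A table of these per-item averages could then be published alongside the LO, matching the site practice described later in this document.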

LOS-S Learning Object Survey – Students

(From “Investigating the Use of Learning Objects for Secondary School Mathematics”, Robin Kay and Liesel Knaack, University of Ontario Institute of Technology, Canada; Interdisciplinary Journal of E-Learning and Learning Objects, Volume 4, 2008)

Strongly Disagree 1, Disagree 2, Slightly Disagree 3, Neutral 4, Slightly Agree 5, Agree 6, Strongly Agree 7.

Learning

1. Working with the learning object helped me learn.

2. The feedback from the learning object helped me learn.

3. The graphics and animations from the learning object helped me learn.

4. The learning object helped teach me a new concept.

5. Overall, the learning object helped me learn.

Quality

6. The help features in the learning object were useful.

7. The instructions in the learning object were easy to follow.

8. The learning object was easy to use.

9. The learning object was well organized.

Engagement

10. I liked the overall theme of the learning object.

11. I found the learning object motivating.

12. I would like to use the learning object again.

13. What, if anything, did you LIKE about the learning object?

14. What, if anything, did you NOT LIKE about the learning object?
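A minimal sketch of how the three LOS-S constructs above could be scored, assuming OSCAR averages the 7-point responses within each construct. The grouping follows the survey headings (items 13 and 14 are open-ended and excluded); per-construct averaging is our assumption about how the results would be reported.

```python
# Sketch: per-construct scores from one student's LOS-S responses.
# Scale: 1 = Strongly Disagree ... 7 = Strongly Agree.
# The item-to-construct grouping follows the survey headings above.

CONSTRUCTS = {
    "Learning": [1, 2, 3, 4, 5],
    "Quality": [6, 7, 8, 9],
    "Engagement": [10, 11, 12],
}

def los_s_scores(responses):
    """responses: dict mapping item number (1-12) -> rating (1-7)."""
    return {
        name: sum(responses[i] for i in items) / len(items)
        for name, items in CONSTRUCTS.items()
    }

scores = los_s_scores({1: 6, 2: 5, 3: 7, 4: 6, 5: 6,
                       6: 4, 7: 5, 8: 6, 9: 5,
                       10: 7, 11: 6, 12: 5})
```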

Student Feedback Form

(Based on the SUS, proposed during the template design of OSCAR)

Participant's Name: Concept Name: Date:

Introduction:

This questionnaire is based on the System Usability Scale (SUS), developed by John Brooke while working at Digital Equipment Corporation. It is a simple, ten-item scale giving a global view of subjective assessments of the usability of the Learning Object (LO) for learning. To what extent do you agree with the following statements?

1 = Strongly Disagree, 2 = Disagree, 3 = Undecided, 4 = Agree, 5 = Strongly Agree

1. The concept appears simpler after viewing the animation.

1 2 3 4 5

2. I still need expert help in understanding the concept.

1 2 3 4 5

3. There is extra information that distracts from the LO's learning goals.

1 2 3 4 5

4. The various controls are intuitive and easy to use.

1 2 3 4 5

5. I am comfortable in navigating through the animation.

1 2 3 4 5

6. The glossary is helpful.

1 2 3 4 5

7. The questionnaire helps in evaluating knowledge of the concept.

1 2 3 4 5

8. The “Further Reading” button is effective.

1 2 3 4 5

9. The LO appears appealing.

1 2 3 4 5

10. The LO is complex.

1 2 3 4 5
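Standard SUS scoring could be applied to this form. The sketch below assumes 1 = Strongly Disagree through 5 = Strongly Agree, and treats items 2, 3 and 10 as negatively worded (an assumption based on their phrasing); each item is mapped to a 0–4 contribution and the total is scaled to 0–100, as in the SUS.

```python
# Sketch: SUS-style scoring for the 10-item OSCAR feedback form.
# Assumes 1 = Strongly Disagree ... 5 = Strongly Agree.
# Treating items 2, 3 and 10 as negatively worded is our assumption.

NEGATIVE_ITEMS = {2, 3, 10}

def sus_score(responses):
    """responses: dict mapping item number (1-10) -> rating (1-5)."""
    total = 0
    for item, rating in responses.items():
        if item in NEGATIVE_ITEMS:
            total += 5 - rating   # disagreeing with a negative item is good
        else:
            total += rating - 1   # agreeing with a positive item is good
    return total * 2.5            # scale the 0-40 total up to 0-100

score = sus_score({1: 5, 2: 1, 3: 2, 4: 4, 5: 4,
                   6: 3, 7: 4, 8: 4, 9: 5, 10: 2})
```

A score around 70 or above is conventionally read as acceptable usability, but any cut-off OSCAR adopts should be decided separately.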

How do some of the websites rate LOs/animations?

  1. Individual user ratings based on LORI, on a scale of 5. The individual ratings are published on the site on another page, in a table format.
  2. One site used the results of the assessments within the LO to rate the LO (I missed bookmarking this site). This is similar to the strategy of “Developing and evaluating a strategy for learning from animations”.
  3. Another site used the user's navigation through the LO, along with strategy (2) above, to rate the LO for its further improvement.
  4. Collaborative evaluation, used in workshops and with small groups of experts. In this model,
  • small evaluation teams are formed with participants representing relevant knowledge (e.g., subject matter expert, learner, instructional designer);
  • a team leader or moderator chooses objects for review, schedules a panel review activity, and invites team members to participate on the panel.

These reviews can be conducted online, or offline during workshops. They are published along with the LO/animation and are also used to further improve it.

Examples of Open-Access Repositories and their evaluation support.

  • TeleCampus (1997; 66,000 items): Online university courses and programs; mainly aggregation levels 3 and 4. No support for quality evaluation.
  • Apple Learning Interchange (1998; 21,000 items): Online resources for K-12 education; mainly aggregation levels 1 and 2. No support for quality evaluation.
  • MathForum (1996; 8,800 items): Online mathematics resources for K-12 and post-secondary education; mainly aggregation levels 1 and 2. No support for quality evaluation.
  • Alexandria/Careo (2001; 2,500 items): Online materials for post-secondary education; aggregation levels 1 and 2. No support for quality evaluation.
  • MERLOT (1997; 7,000 items): Online materials for post-secondary education (with some K-12). Support for user comments and peer reviews; uses LORI and collaborative evaluation.
  • Harvey Project (1999; 600 items): Online materials and courses on human physiology; university level; aggregation levels 1–3. Support for user comments and peer reviews; uses the data collected from assessments and user interactions on the LO to rate and improve the LOs. Also uses LORI.
  • Wisconsin Online Resource Center (1999; 600 items): Centralized storage of online resources supporting Wisconsin's technical colleges. Support for user comments; uses LORI.