How to Use a Systems Approach for eLearning eValuation Based Upon Using Enabling Objectives
Developed and Presented by
Jim (Mo) Moshinskie, PhD, CPT[1]
Contact: 254-756-7535
Baylor onsite workshops available from Dr. Mo:
E-Learning Made E-Z – an immersion seminar for trainers and subject matter experts on how to plan, design, develop, and evaluate eLearning.
eValuation – how to build eLearning using enabling objectives to design and evaluate defendable results.
Powered by Vuepoint Learning System
Before the Course Development Begins
Stage / Procedures (Steps to follow in order as shown) / Principles (Tips to consider as appropriate)
1.1 / DETERMINE THE CONTENT
- Determine exactly the right content to be taught based upon a thorough front-end needs analysis (FEA).
- Link it directly and clearly to measurable business needs.
- Discover how management defines success and get their approval / support of the eLearning content.
- Form the desired performance into a process (with discrete independent stages)[2]. Do not worry about the content within each stage right now, just confirm the process.
- Gather all the information that will become the content for the course.
- Once all the content has been identified and approved, separate and mark the need to know material from the nice to know material.[3]
Need to Know / Nice to Know
The minimum knowledge needed to perform the process exemplarily. / Extraneous information that can be used for reference and “tell me more” screens.
- Involve the management, SME[4], ID[5], and Community of Practice (COP) in selecting the needed topic using Business Analysis and Performance Analysis techniques.
- Consider also involving:
Information Technology (IT)
Product Development
HRD
Marketing
Sales
International
- Ensure that everyone’s expectations are clearly delineated.
- Confront any problems early.
- When gathering content, look for brochures, reports, sales information, manuals, job aids, slides, pictures, lectures, artwork, books, scientific data, research results, web searches, etc.
- Clearly mark the information found within each source above as to “need” and “nice” to aid the instructional designers.
1.2 / WRITE THE TERMINAL OBJECTIVE (COURSE MISSION)
- Write the terminal objective for the entire course. This terminal objective will appear within the course introductory screens.
- Get stakeholder approval.
- The terminal objective is written in two or three succinct sentences.
- Example:
At the end of this course, the learner will be able to follow the five-stage Star Selling Process to open and close a car sale successfully.
1.3 / FORM CONTENT INTO A PROCESS
- Form each separate stage within the performance process into short (10-15 screens) stand-alone modules.[6] A generic model of this course would look something like this:
Module 1 /
- Course introduction
- Course terminal objective (See 1.2 above)
- Menu of topics (modules) covered in the course
- How to navigate the interface
Module 2 /
- A very brief, high-level overview of the entire process (an ideal job for a FLASH animation with sound and interactivity).
- Pre-Test (with questions linked to specific objectives and modules so VLS can build a customized roadmap).
- Stress why this content is important.
Module 3 /
- Teach Stage 1 using the ROPES model
Module 4 /
- Teach Stage 2 using the ROPES model
Module 5 /
- Teach Stage 3 using the ROPES model
Module 6 /
- Build authentic scenarios that allow the learners to perform what they have learned and receive prompt and legitimate feedback. HINT: Start with easy scenarios, then move to harder ones.
Module 7 /
- Summarize the process again
- Give the post-assessment
- Print a completion certificate
- Design each teaching module to teach one stage specifically using the ROPES instructional design model.[7]
In the example above, modules 3 - 5 are the teaching modules. Each of these modules could be developed as a learning object and perhaps be used in other courses.
- Use the Zoom Principle in which you first teach the overall process then “zoom” in and teach what to do in each stage of the process.
- This gives the learners a useful “big-picture” of the desired performance, and shows how it breaks down into convenient, easy to remember stages.
1.4 / DETERMINE PROCEDURES AND PRINCIPLES
- Divide the content of each teaching module into sets of procedures (exact steps done in a certain order every time) and principles (supporting guidelines derived from best practices and lessons learned).
Note that this is the same method we are using in this paper: The first column numbers the stages, the second column names the stage and shows the sets of procedures to perform, and the third column lists principles to consider.
- Determine how to present any prerequisite support information that the learners will need to know so they can understand the procedures and principles better (This support information includes the prerequisite concepts[8] and facts[9].)
- Input from SME, exemplary workers, research, and best practices can be used to devise the needed procedures (the steps) and principles (the tips) for each stage.
Procedures = The Actual Steps
Principles = Tips and Guidelines
- Consider presenting concepts and facts as nice to know information in the glossary, library, mini-lessons, or hyperlinked to detailed reference screens.
- VLS provides a searchable library environment for this support information.
1.5 / WRITE THE ENABLING OBJECTIVES[10]
- Write a short enabling objective for each key procedure or principle identified in step 1.4.
- Assure that each enabling objective is measurable.
- Combine the enabling objectives into a single list (See exhibit 1).
Note: VLS Content Creator allows you to link learning objects together into courses. In this manner, any time you update the original learning object, you automatically update it wherever it appears in any course. /
- Example of enabling objectives:
At the end of module 2, you will be able to:
- Greet customers appropriately.
- Ask the correct discovery questions.
- Determine their specific needs.
- Overcome any objections, and
- Close the sale.
(The enabling objectives are displayed on the screen as a bulleted list, each starting with a measurable verb).
During the Actual Course Development
2.1 / CLASSIFY THE ENABLING OBJECTIVES (See Exhibit 1)
- Classify all of the enabling objectives listed in item 1.5 according to importance:
C = considered a critical, must-do objective.
Q = needs a question in the pre- and post-assessments.
- Sequence the enabling objectives within each module:
Hint: First teach the procedures, then the principles; first teach any concepts, then the specific facts.
- Involve the SME, COP, and ID.
- We suggest you first teach the procedures required to complete that stage because it sets the big picture in the learner’s mind, then present the principles with tips to perform the procedures better.
2.2 / WRITE THE ASSESSMENT QUESTIONS
- Write the Pre-Assessment questions using a variety of VLS templates. Write at least one question for every critical enabling objective.
- Link the questions to specific modules within the course so VLS software can automatically build a unique, customized roadmap consisting only of those modules in which the learner missed questions.
- Use good test writing procedures.
- Since you will collect statistics on the scores, you will need the right number of questions in the pre- and post-assessments. The rule of thumb is 18-20 questions.
- Consult an educational statistical analysis book to confirm your particular corporate needs.
2.3 / DESIGN EACH MODULE
- Design and build each module using ROPES.
- Build the “REVIEW & RELATE” screens to include the review and relate information.
- Build the “OVERVIEW” screen, displaying the key enabling objectives.
- In VLS, the course tree showing modules and screens conveniently appears on the left side of the Content Creator authoring screen. The order of screens can be easily changed, and modules and screens can be renamed as well.
- Use eye candy appropriately, such as animated movies made with Macromedia FLASH, to get the learners’ attention during the opening screens[12].
2.4 / DEVELOP THE “PRESENT” SCREENS
- Build the “P” (present and practice) screens.
- Design the screens to teach the enabling objectives in such a manner that clearly demonstrates linkage to the specific enabling objective. Use VLS templates to vary the screens.
- Include meaningful interactivity on as many screens as possible using a variety of VLS templates.
- Stress the “how to” of the desired performance.
- Title screens so that the linkage to specific enabling objectives is clear.
- A good rule of thumb when teaching a procedure or principle is to entitle the screen: “How to…”
- Research shows that the more entertaining the course, the better the results.[13]
2.5 / DEVELOP THE “EXERCISE” SCREENS
- Build the “E” (engage and exercise) screens using VLS interaction templates, FLASH templates, hot spots, rollovers, and FLASH movies.
- Incorporate authentic case studies, scenarios, and/or simulations that allow learners to apply the critical enabling objectives to realistic job situations. Interview SME, DW, COP, or exemplary employees, or read reports to devise authentic scenarios, simulations, and case studies.
- Give legitimate feedback.
- Case study = a detailed presentation of an actual situation followed by a chance for the learner to make decisions on how to handle it.
- Scenario = a short situation in which the learner chooses correct behaviors
- Simulation[14] = a graphic representation of a setting (e.g., sales office) in which learners interact with characters and/or objects on the screen (e.g., angry customer or ringing telephone). May be accented with audio.
2.6 / WRITE THE “SUMMARY” SCREEN
- Write the “SUMMARY” screen – Restate the key information from the critical enabling objectives in the course.
- Design the course so that when learners finish, they will be directed to the automatic Course Evaluation so they can evaluate the course immediately.
- Be brief in this summary.
- VLS provides a built-in evaluation tool that allows learners to respond to Level 1 questions about the course. The learners simply click their response to each question.
- VLS also allows the addition of open ended questions, i.e., How can we make this course better for your specific job needs?
2.7 / PREPARE THE POST-ASSESSMENT
- Write the post-assessment – Write a reliable, valid question for each critical enabling objective.[15] You can use the same pre-test as the post-test if you wish. Determine the passing score.
- Decide what to do with learners who fail:
A) Re-take the entire course, or
B) Direct to another personalized Roadmap that VLS builds based upon Post-assessment results.
- Publish the draft course to the server.
- Field test your course allowing the learners to use the VLS Comment feature to give suggestions and changes directly to the design team.
- Revise as necessary
- Get key stakeholder signoff
- Using the VLS Comment feature within the VLS interface, learners can type their suggestions, changes, and ideas and submit them directly to the design team.
- The design team can access and print these Comments from the VLS Administrator. Each Comment automatically includes who sent the message, when, and what screen is involved.
Following the Course and During Any Blended Solutions
3.1 / ANALYZE THE LEVEL 1 DATA
- Collect data for a pre-determined time period.
- Determine the range[16], mean, and standard deviation (SD) on Kirkpatrick Level 1 evaluations (how well the learners enjoyed the course).
Mean / Standard Deviation (SD) / Possible Indication
High / Low / Strength of training
Low / Low / Weakness of training
High / High / Tendency toward positive perception, but not everyone agrees
Low / High / Tendency toward negative perception, but not everyone agrees
- Evaluate the results. Standard deviations (SD) below .5 indicate agreement and those above 1.0 indicate disagreement.
- Use judgment for statements that fall in the middle (e.g., M = 3, SD = .75)
- VLS software automatically collects Level 1 evaluation scores.
- The chart shows possible indications from mean and standard deviation scores.
- As you become more experienced in interpreting results, you can fine-tune this chart to your specific topics and course.
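If the Level 1 ratings are exported from VLS as a plain list of 1-5 scores, a short Python sketch along these lines can apply the chart above to each statement. The ratings are made up, and the 4.0 / 2.5 mean cut-offs are borrowed from the rule of thumb in step 3.5:

```python
# Sketch only: interpret one Level 1 statement from a list of 1-5 ratings
# using the mean/SD chart in step 3.1. The ratings below are hypothetical.
from statistics import mean, stdev

def interpret_level1(ratings):
    m, sd = mean(ratings), stdev(ratings)
    high, low = m >= 4.0, m <= 2.5        # mean cut-offs (rule of thumb from step 3.5)
    agree, disagree = sd < 0.5, sd > 1.0  # SD cut-offs from step 3.1
    if agree and high:
        verdict = "strength of training"
    elif agree and low:
        verdict = "weakness of training"
    elif disagree and high:
        verdict = "tendency toward positive perception, but not everyone agrees"
    elif disagree and low:
        verdict = "tendency toward negative perception, but not everyone agrees"
    else:
        verdict = "middle ground - use judgment"
    return m, sd, verdict

m, sd, verdict = interpret_level1([4, 4, 5, 4, 4, 5, 4, 4])
print(f"M = {m:.2f}, SD = {sd:.2f}: {verdict}")   # e.g., M = 4.25, SD = 0.46: strength of training
```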
3.2 / ANALYZE LEVEL 2 DATA
- Compare the pre- and post-assessment scores statistically. (Consider using the paired t-test[17] test to statistically compare the results of the two assessments. Your goal is to determine if there was a significant difference between the two sets of scores.)
- Collect and analyze the means and standard deviations[18] for the pre- and post-assessment scores.
- VLS automatically collects pre- and post-assessment scores. This information is available by users, by regions, by departments, etc.
- VLS can print a certificate for learners who pass the course.
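For the paired t-test mentioned above, SciPy's ttest_rel can compare the matched pre- and post-assessment scores once they are exported per learner. A minimal sketch with hypothetical scores:

```python
# Sketch only: paired t-test on matched pre-/post-assessment scores (one pair per learner).
from statistics import mean, stdev
from scipy.stats import ttest_rel

pre  = [55, 60, 48, 70, 62, 58, 65, 50, 73, 61]   # hypothetical pre-assessment scores (%)
post = [78, 82, 70, 85, 80, 76, 88, 72, 90, 79]   # hypothetical post-assessment scores (%)

t_stat, p_value = ttest_rel(post, pre)
print(f"Pre : M = {mean(pre):.1f}, SD = {stdev(pre):.1f}")
print(f"Post: M = {mean(post):.1f}, SD = {stdev(post):.1f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below your chosen alpha (commonly .05) suggests the pre-to-post
# gain is statistically significant rather than chance variation.
```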
3.3 / CONDUCT THE BLENDED WORKSHOP
- Conduct an after-course blended workshop so learners can apply the enabling objectives in mock situations made as authentic as possible.
- Evaluate the learners[19] on their application of the critical enabling objectives during the blended workshop based upon a five-point Likert scale (1 = strongly disagree to 5 = strongly agree).
- Calculate a mean and standard deviation on the scores for the learners and on the workshop, and analyze results.
- The ROPES model can be used to develop the agenda for the blended workshop.
- Using a check sheet can help the workshop coaches evaluate the participants and collect data on performance.
- Input from SMEs, exemplary employees, line managers, etc. can help you devise realistic workshop exercises.
- Blended experience could occur online using VLS chat, push, and comment features.
3.4 / ALLOW THE LEARNERS TO SELF-ASSESS
- Ask learners to evaluate their confidence levels on performing the critical enabling objectives using a five point Likert scale (1 = strongly unconfident to 5 = strongly confident).
- Include open-ended questions such as:
Which performance areas do you need specific help on now?
- Examples:
I can change a tire.
I can close a sale.
I can manage my time.
- Compare these scores to the workshop evaluations (item 3.3 above) and later performance scores at the actual worksite (item 3.7).
3.5 / QUERY MANAGERS
- Select a pre-determined number of employees randomly[20] and query their immediate line managers on the employee’s implementation of the critical enabling objectives in their actual workplaces.
- Base this query on a five point Likert scale as the supervisor or coach evaluates each critical objective (1 = strongly disagree to 5 = strongly agree).
- Calculate means and standard deviations as described above, and analyze results.
- Ask a series of open ended questions, e.g., How could we train your staff better on this topic?
- Quantitative evaluation – numbers (Means that are 4.0 and above are high; those 2.5 and below are low).[21]
- Qualitative evaluation – Count the number of positive and negative statements; look for trends and similarities.
Hint: In qualitative analysis, consider throwing out the one best comment and the one worst comment before evaluating.
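For the quantitative side, a small sketch assuming the manager ratings have been exported per critical enabling objective (the objective names and scores are placeholders), flagging means of 4.0 and above as high and 2.5 and below as low:

```python
# Sketch only: flag manager ratings per critical enabling objective using the
# rule of thumb above (means of 4.0+ are high, 2.5 and below are low).
from statistics import mean, stdev

manager_ratings = {                                  # hypothetical placeholders
    "Greet customers appropriately":       [5, 4, 4, 5, 3],
    "Ask the correct discovery questions": [3, 2, 2, 3, 2],
    "Close the sale":                      [4, 3, 4, 3, 4],
}

for objective, scores in manager_ratings.items():
    m, sd = mean(scores), stdev(scores)
    flag = "HIGH" if m >= 4.0 else "LOW" if m <= 2.5 else "mid"
    print(f"{objective:40s} M = {m:.2f}  SD = {sd:.2f}  [{flag}]")
```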
3.6 / CORRELATE ANY POSSIBLE RESULTS
- Tie critical enabling objectives to actual customer surveys and correlate results (Consider using the Pearson R[22] correlation coefficient).
- Consider this when evaluating how well the course objective was achieved (See step 1.2) and predict future success.
- Consider correlating scores to learning styles if this information is collected.
- Consider 1-month, 6-month, and 1-year follow-ups.
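For the Pearson r suggested above, SciPy's pearsonr can correlate learners' mastery scores with a matched customer-survey measure tied to the same critical enabling objectives. The paired values here are hypothetical:

```python
# Sketch only: Pearson r between post-assessment scores and a customer-survey measure.
from scipy.stats import pearsonr

post_scores     = [78, 82, 70, 85, 80, 76, 88, 72, 90, 79]            # hypothetical
customer_survey = [4.1, 4.4, 3.6, 4.6, 4.2, 3.9, 4.7, 3.7, 4.8, 4.0]  # hypothetical

r, p_value = pearsonr(post_scores, customer_survey)
print(f"Pearson r = {r:.2f} (p = {p_value:.4f})")
# r near +1 suggests higher course mastery goes with better customer ratings;
# r near 0 suggests little linear relationship.
```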
3.7 / ANALYZE THE ON-THE-JOB PERFORMANCE
- Randomly select a pre-determined number of learners and evaluate results from a quality walk or shop.
- Base this evaluation on a five-point Likert scale (1 = strongly disagree to 5 = strongly agree).
- Calculate a mean and standard deviation.
- Use someone like a team coach or project manager as the evaluator.
- Use a specific checklist to avoid interrater errors.
3.8 / COMPARE TO OTHER COURSES
- Compare statistics of this course to other eLearning courses conducted by your company.
- Compare statistics of this course to any national statistics that may be available.
- Visit ispi.org
- Visit astd.org
- Read eLearning magazines and journals
- Attend conferences
- Network with other performance technologists
3.9 / MAKE THE FINAL EXECUTIVE REPORT
- Meet with the team to review results, addressing such issues as:
- Were any assessment questions too easy or too hard?
- Were there any topics that seemed too easy or too hard?
- How did the results compare across the different departments, regions, or countries?
- What was the course completion rate?
- What did learners like and dislike the most?
- Do you see any emerging trends or patterns?
- What was the time on task?
- Which modules did the learners spend the most or least time on?
- Devise a written report that succinctly describes the eValuation results.
- Share results with the key stakeholders.
- Share results with the learners – they have the right to know.
- Discuss any areas where the results were unexpected, both high and low.
- Revise course, lesson, performance objectives, and questions as necessary.
- Share best practices (knowledge management).
- Toot your own horn as loud and often as you can.
- VLS tracks the results by questions and by topics, departments, regions, etc. as you set it up in VLS Administrator.
- Always be alert to other variables that may have skewed results other than your interventions, e.g., economy, weather, etc.
- Date-stamp courses so the content stays fresh.
- Determine if the results can be used to forecast future performance.
- Be honest.
Exhibit 1: Sample Enabling Objectives
Course Title: How to Close a Sale