Janette Cullinan
Master’s Project - Soft Skills CBT
Summary of HRD Professional Survey
Site Overview
This site presents the findings of a user survey conducted by Janette Cullinan, a graduate in the Learning, Design, and Technology program at the Stanford University School of Education. This survey is one of the research methods she is using to inform the design of a class prototype for Human Resource Development professionals that will provide advice on the selection and implementation of soft skills CBT programs.
Master's Project Brief
I will prototype a WBT course targeted at HRD managers who are starting to consider delivering soft skills training via CBT or WBT. The course will cover two main topics: (1) criteria for evaluating different programs, and (2) success factors for implementing CBT or WBT programs within an organization. Research will include (1) a literature review, (2) interviews with HRD professionals who have successfully implemented a CBT or WBT soft skills program, and (3) reviews of three to five top products in this market space. The course should model my recommendations for good program design (although not all features may be functional in the prototype).
Survey Design
The web-based survey was designed to elicit user input on the issues most critical to HRD practitioners when choosing and implementing a soft skills CBT program. The survey was sent via email to personal contacts and posted on the American Society for Training and Development (ASTD) Learning Technology threaded discussion. It focused on two central questions:
- What questions would users ask when choosing between two CBT programs on the same topic?
- What concerns do users have about using CBT programs for soft skills training in their organization?
Respondent Demographics
Total Number of Respondents: 12
HRD Job Title
Consultant / 25%
Director / 25%
Manager / 33%
Vice President / 17%

Industry
High Tech / 67%
Financial Services / 8%
Public Accounting / 8%
Consulting / 8%
Consumer Goods / 8%
HRD Experience (years)
1-2 / 17%
3-5 / 0%
5-7 / 50%
8-10 / 33%
10+ / 0%

Experience with CBT
Taken at home / 25%
Taken at work / 92%
Implemented at work / 50%
Participated in design / 50%
Findings
Questions about Evaluating CBT Programs
Survey respondents most often cited return on investment, particularly cost, as an important differentiator between programs. Beyond ROI, however, potential users brought up a wide variety of factors to consider when evaluating programs.
50% / ROI assessment, particularly cost
42% / Technology access and reliability, particularly fit with current company infrastructure
34% / Amount and quality of interactivity
34% / Quality of instructional design for retention and transfer
25% / Customizability
17% / Credibility of provider
17% / Learning objectives match
8% / Quality and amount of practice time
8% / Usability
8% / Currency
Concerns about Implementing CBT Programs
Potential course users had a variety of concerns about implementing CBT programs for soft skills training. By far the most frequently expressed concern was that soft skills are about human interaction and are best taught in an interactive group environment, where people learn from a live instructor and from their peers. The table below shows the rest of the concerns cited.
67% / Soft skills training requires live human interaction
34% / Negative learner perception of CBT soft skills programs
34% / Learner drop-out rates for self-paced programs
34% / Technology access and reliability
25% / Difficulty customizing to the individual learner’s needs
17% / Ability to adapt programs to company training needs
17% / Effectiveness of CBT for all learners
8% / Amount of “practice”
8% / Uninterrupted time for desktop learning
8% / Difficult to test comprehension
8% / Cost
Implications
- Because respondents feel so strongly that human interaction is integral to soft skills learning, my recommendations should address this issue directly. Currently, my bias is to include CBT programs as part of a larger learning process that also includes peer coaching and group interaction.
- Users seemed to have very general ideas about what criteria to use to evaluate programs; specific advice addressing the concerns they raised most often (e.g., ROI, technology fit, quality of interactivity, and quality of instructional design) could be very useful.
- Implementation suggestions must include proven strategies for ensuring self-paced learners complete the program.
- Recommendations for evaluation criteria and implementation success factors must address technology considerations, particularly access and reliability. One of my top recommendations will be to involve IS professionals from the very beginning, preferably those inclined to be advocates rather than barriers.