TOPIC III. MARKETING RESEARCH AND CONSUMER BEHAVIOR


Gregory J. Baleja

Alma College

Marketing Research:

Questionnaire Construction Process

Each year during my lectures on Marketing Research, I spend one full class period discussing the Questionnaire Construction Process in great depth. I find the presentation of the concepts associated with the questionnaire construction process to be rather straightforward. In fact, the presentations go so well that many students come to the conclusion that creating a questionnaire, or evaluating a current one, in the real world is relatively easy. They seem to believe that all one has to do is understand the basic objectives of the research and then simply translate these objectives into a formal questionnaire. It is this erroneous assumption that bothers me.

After going through a detailed list of generalizations associated with the construction of a questionnaire, I then ask the students to critically evaluate the “Instructor Evaluation Form” that they have been exposed to many times during their tenure at Alma College. Some of the questions utilized on this form in the recent past are illustrated below. Samples of the students’ comments are shown in italics following each of these questions. For each of these questions, the students were instructed to assign a letter grade according to the following scale:

A = Excellent/almost always

B = Good/frequently

C = Fair/sometimes/average

D = Poor/infrequently

E = Fail/almost never

1. The instructor was well prepared for class.

How do we know if they are, or are not, well prepared for class?

What is meant by “well prepared”? Well dressed? Awake? An outline of the lecture on the board? Etc.

It’s hard to judge whether the instructor is well prepared for class because the instructor can talk about numerous topics related to the subject.

2. The instructor was aware when students did not understand the material.

How does the instructor know when the students do not understand?

How would we know if the instructor was aware or not?

Is it the instructor’s responsibility to find out when people don’t understand, or is it the student’s responsibility to inform the instructor?

Whether the instructor was aware that a student did not understand is difficult to measure. Whether the instructor did anything about it is somewhat more measurable.

3. The grading methods were made clear to students.

This question is very ambiguous. What is clear to one person may not be clear to another.

4. Exams reflected the important aspects of the course.

This may not be within the respondent’s "zone of experience."

Who is in a better position to know the important aspects of the course, the instructor or the students?

5. The instructor was readily available for help.

What is meant by “readily available”?

Usually professors aren’t available between 9 p.m. and midnight, when the majority of students do their homework. Does this mean they aren’t available?

If students don’t seek outside help, they won’t be able to answer this question.

6. The instructor showed respect for students.

What is considered respect?

The question is better set up for a dichotomous (yes/no) type of answer. It would be hard for a student to rate the respect she received from the instructor.

The question should be reworded to "was respectful toward students."

Too broad: what defines "showing respect?" A score of lower than "A" would indicate that the instructor did not "almost always" show respect.

After the students have found a variety of faults with each of the questions on our "Instructor Evaluation Form," I then assign them the task of creating their own version of this evaluation form. Once they have turned in their rendition of the form, I allow the class to critically evaluate the best forms. It is interesting to note that the class typically finds as much fault with the forms they created as with the current form they have been criticizing for many years.

The purpose of this exercise is to demonstrate to the students the difficulty encountered when trying to operationalize the questionnaire construction process.

Geoffrey P. Lantos

Stonehill College

The Joy of . . . Marketing Research?

The least-loved course in our marketing curriculum is probably Marketing Research. Many students enter this course with fear and loathing. The idea of research spooks them because it conjures up images of laboratories, dull people in white smocks, archaic formulas, dry-as-dust statistics, and esoteric procedures hatched in the ivory towers of academia. The professor has a handicap from day one, as most students would just as soon be undergoing a root canal or working on a chain gang.

My philosophy is to try to make this course as painless and pleasant as possible. On syllabus day I let them know that, while they will need to think, struggle, and work hard, they can have some fun and enjoyment in learning too. I encourage them to share their personal experiences with marketing researchers, the good, bad, and ugly (e.g., being accosted by bright-eyed, zealous young fieldworkers in shopping malls, hustled by silver-tongued researchers on those dinner-time phone calls, etc.). I also remind them that, while research is a technical subject, much of the course is really a study in human behavior (e.g., dealing with response biases, increasing response rates, etc.), like the Consumer Behavior course that so many of them prefer.

Once the sugar-coating message has been imparted, you can use the usual suspects: interesting videos; guest speakers; case studies; debates on ethical issues; pass-arounds of real research proposals, research reports, syndicated data, etc.; and a semester-long project to gain their involvement, which you can continually draw on for examples of concepts throughout the course. In addition, I try to follow up in gaining their interest (if not enthusiasm) in some of the following specific ways:

I clip interesting material from the popular press, which is always reporting polls on various political, social, and marketplace issues and occasionally contains a cute (if not uproariously funny) cartoon. Make transparencies of these or scan them into your PowerPoint™ presentation (e.g., a percentage distribution showing the proportion of people who favor each of the Three Stooges is more interesting than the proportion of companies in the widget industry that make high-end vs. low-end widgets).
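
For instructors who like to show the underlying arithmetic, a percentage distribution of this sort can be tabulated in a few lines of code. The following is a minimal sketch in Python; the pandas library and the made-up poll responses are assumptions for illustration, not part of the clipping exercise itself.

    # Tabulate a percentage distribution from raw poll responses.
    # The responses below are invented for illustration.
    import pandas as pd

    responses = ["Moe", "Curly", "Moe", "Larry", "Curly", "Moe", "Curly", "Moe"]
    distribution = pd.Series(responses).value_counts(normalize=True) * 100
    print(distribution.round(1))  # percentage of respondents favoring each Stooge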

When discussing organizational and human relations issues in marketing research (e.g., disagreements between marketing managers and researchers over issues such as time, money, and decision making), I have students "choose sides" and role play either the manager or the researcher engaged in a heated discussion on each issue, defending their point of view.

Exploratory/qualitative research is probably the topic that generates the most opportunities for enjoyable learning. I break students into small groups and have them brainstorm ways to find "professional consumer detectives" (a sort of expert opinion approach) for various target markets (e.g., to learn about tennis players, talk to professionals who have frequent or intense contact with them, such as coaches, trainers, and sporting goods salespeople). You (or one of your students) can lead a student through part of an in-depth interview or several students through a focus group on a subject of interest to students (e.g., music, sports, and fashion always hit their hot buttons), and have students critique it afterward. Find an example or two of specific uses of projective techniques not found in your textbook and try them out on students to see what you can learn about their "deep thoughts."

Secondary research is one of the dullest subjects to teach. Rather than drone on about all of the different information sources, have student teams go on a "scavenger hunt" in your library and/or on the Internet during a class period to see who can find the most items from a list of twenty or so specific pieces of information you request (e.g., Coca-Cola's market share last year, growth rate of the personal computer industry, etc.).

Discussing the different types of survey error can be deadly if you don't have interesting examples. Scan other textbooks and the popular press for these. A form of marketing research students can all relate to is student course evaluations. These can be used to discuss such issues as auspices bias (students are often kind to their professors), extremity bias (they often "strongly agree" or "strongly disagree" with all Likert items), and social desirability bias (some students claim they study more than twelve hours per week for this course!).
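
Extremity bias can even be made concrete with a quick computation: for each respondent, find the share of Likert items answered at the scale endpoints, since a high share across many items suggests an extreme response style. A minimal sketch in Python, assuming a five-point scale and invented ratings:

    # Flag possible extremity bias: share of Likert answers at the endpoints (1 or 5).
    # The ratings below are invented for illustration.
    ratings = {
        "student_a": [5, 5, 5, 1, 5],   # clustered at the extremes
        "student_b": [4, 3, 4, 2, 3],   # spread across the scale
    }

    for student, answers in ratings.items():
        extreme_share = sum(a in (1, 5) for a in answers) / len(answers)
        print(f"{student}: {extreme_share:.0%} of items at scale endpoints")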

It is easy to draw them into a discussion of questionnaire design since most mistakes result from lack of understanding of human psychology. For instance, in discussing memory error ask how many can recall things like the losing vice presidential candidate in the most recent election or the loser in the last World Series. Try to draw out areas where they might be subject to social acceptability bias (e.g., dating and drinking behavior). Involve them by having them analyze actual questionnaires that you've received in the mail (many of these are more full of holes than a piece of Swiss cheese) or that your colleagues in industry pass along to you.

Another killer topic is measurement. Have them come up with conceptual and operational definitions of intriguing and controversial constructs such as "alternative music" and "sexual harassment." In order to demonstrate the importance of precision in formulating an operational definition, I ask students to imagine I'm a Martian (not hard for many of them) and have no concept as to how to heat a cup of water using a hot pot. I do exactly as they tell me to do. For instance, if they tell me to plug in the hot pot, I put the plug in an unlikely place, such as the overhead machine outlet. When they tell me to plug it into the wall, I pretend to keep my finger on the prongs and get electrocuted, and when I'm instructed to pour the water out of the cup (presumably into the pot) most of it ends up on the floor (bring paper towels!). To illustrate validity I bring a dartboard and darts, have several students give it their best shot, and then discuss the accuracy of their efforts in terms of validity (hitting or being near the bulls eye) and reliability (being consistent in where the darts land).
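
The dartboard contrast can also be put into numbers: validity corresponds to how far the average landing point sits from the bull's eye (systematic error), and reliability to how tightly the throws cluster around their own center (random error). A minimal sketch in Python, with invented throw coordinates:

    # Dartboard analogy quantified: validity = distance of the mean landing point
    # from the bull's eye at (0, 0); reliability = average scatter around that mean.
    # The (x, y) landing points below are invented for illustration.
    from math import dist
    from statistics import mean

    throws = [(1.2, 0.8), (1.0, 1.1), (0.9, 0.7), (1.1, 1.0)]  # tight but off-center

    center = (mean(x for x, _ in throws), mean(y for _, y in throws))
    bias = dist(center, (0, 0))                      # small offset -> high validity
    scatter = mean(dist(t, center) for t in throws)  # small scatter -> high reliability

    print(f"offset from bull's eye (validity): {bias:.2f}")
    print(f"average scatter (reliability): {scatter:.2f}")

This particular set of throws is reliable but not valid: the darts land close together, but well away from the bull's eye.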

The subject of sampling can be spiced up by bringing a bucket of marbles to class. After telling students that, despite what they might think, I haven't lost them yet, I note how there are different kinds of marbles: large and small, solid color and multicolor, clear and opaque. A la "Sesame Street" I pretend that they are people to be sampled, and illustrate concepts such as sampling units and population elements, random error, and the various sampling methods (convenience, quota, etc.). Hamming it up a bit, talking to the marbles and such, makes it amusing for students.
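
The marble demonstration also translates directly into a short program: each marble is a population element, and different draws mimic different sampling methods. A minimal sketch in Python, contrasting a simple random sample with a quota sample; the bucket's makeup and quota sizes are invented:

    # Marbles as population elements: simple random sampling vs. quota sampling.
    # The bucket contents and quotas below are invented for illustration.
    import random

    bucket = ["small"] * 50 + ["large"] * 30 + ["multicolor"] * 20

    # Simple random sample: every marble has an equal chance of selection.
    srs = random.sample(bucket, k=10)

    # Quota sample: fixed counts per marble type, chosen within each type.
    quotas = {"small": 5, "large": 3, "multicolor": 2}
    quota_sample = [m for kind, n in quotas.items()
                    for m in random.sample([b for b in bucket if b == kind], n)]

    print("simple random:", srs)
    print("quota:", quota_sample)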

Interviewing and fieldwork is a pretty Mickey Mouse topic. Rather than regurgitating material in the textbook, have them get involved in the classroom. Pick a topic, have the class brainstorm a list of, say, ten questions on the topic, improve their wording and sequence according to the guidelines for good questionnaire design, and ask students to pair off for conducting interviews. They interview each other for ten or fifteen minutes each, then summarize each other’s answers for the class and critique each other’s interviewing style.

I tell students that editing, coding and keyboarding data is about as much fun as peeling potatoes and onions -- it might even make you want to cry. I share some of the editing problems I encounter with my student evaluations (circling two response categories, questions answered in the wrong place, etc.) and how I deal with them. I also give them a postcoding exercise in which I pick a topic they can all relate to (e.g., likes and dislikes regarding cafeteria food) and have them write their open-ended responses to a few questions. After class, I write out the verbatims, put them on a transparency, break them into groups, and have each group come up with a postcoding scheme. The different ways various groups postcode is a good illustration of the subjectivity of research.
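
A postcoding scheme of the sort each group produces can be written down as a simple mapping from keywords to category codes, and comparing two groups' mappings shows the subjectivity directly. A minimal sketch in Python; the verbatims and the coding scheme are invented:

    # Apply a postcoding scheme: assign category codes to open-ended verbatims
    # by keyword matching. Verbatims and categories are invented for illustration.
    scheme = {
        "variety": ["choice", "options", "same thing"],
        "quality": ["cold", "greasy", "stale"],
        "price":   ["expensive", "cost"],
    }

    verbatims = [
        "The fries are always cold and greasy.",
        "It's the same thing every single day.",
        "Way too expensive for what you get.",
    ]

    for text in verbatims:
        codes = [code for code, words in scheme.items()
                 if any(w in text.lower() for w in words)]
        print(f"{codes or ['uncoded']}: {text}")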

Data analysis is probably the most difficult topic to make palatable. My best advice here is to collect lots of interesting and amusing examples, stay interested and enthused yourself (fake it if you have to!), and don't take it oh-too-seriously.

While implementing these ideas probably won't elevate Marketing Research to the status of favored marketing course on your campus, it might make students a little more at ease, interested, and involved, and might even improve your own marketing research (student evaluations) a tad.

Mark A. Mitchell, University of South Carolina Spartanburg

Stephen J. Taylor, Palmetto Council of the Boy Scouts of America

Replication of a National Study

for Local Use: The Case of the

Boy Scouts of America

Introduction

A university is (and should be) an integral part of the surrounding community. Increasingly, experiential- or application-based learning exercises are being incorporated into university course offerings. Many not-for-profit service organizations in your local marketplace are in constant need of assistance but lack the ability to pay for such work. These organizations provide an excellent opportunity to combine experiential learning and professional service effectively.