COMS Assessment Report: 2013-14

2013-2014 Annual Program Assessment Report

Please submit the report to your department chair or program coordinator, the Associate Dean of your College, and the assessment office by Monday, September 30, 2014. You may submit a separate report for each program that conducted assessment activities.

College: Mike Curb College of Arts, Media, and Communication

Department: Communication Studies

Program:

Assessment liaison: Aimee Carrillo Rowe

1.  Overview of Annual Assessment Project(s). Provide a brief overview of this year’s assessment plan and process.

In accordance with our five-year assessment plan for 2013-2014, the Department of Communication Studies (COMS) examined undergraduate Program Student Learning Outcome (PSLO) 3: “describe and explain the relationship between communication and culture.” This was the seventh year in which the department has engaged in program assessment and the third year that the faculty has worked in “Teams.” The Assessment Team (Team 3) developed an authentic task: we created an assignment asking students to analyze an Adbusters culture jamming image. Specifically, we asked students to reflect on what the image communicated about United States American culture and how the ad reveals the ways communication shapes culture. Team 3 developed and normed a rubric, which we used to evaluate student responses.

The department benchmark for PSLO 3 (40% of students would meet or exceed the PSLO) was exceeded. Three-quarters (75%) of student responses met or exceeded PSLO 3, and no students failed to meet the PSLO. However, one-quarter of the responses were rated at only approaching PSLO 3. The analysis of these results, along with other data comparisons, is discussed below.

We also engaged in informal assessment studies based on the department’s curricular and hiring needs. We were interested in learning more about: 1. The role of COMS 150 in preparing students for our major and increasing COMS enrollments; and 2. The role of COMS 440 in developing critical thinking and writing skills. Carrillo Rowe assisted John Kephart in conducting interviews with students from several sections of COMS 150. Carrillo Rowe also conducted a longitudinal study of COMS 440 students, reviewing first and final drafts of papers developed through faculty and peer feedback over the course of the semester.

2.  Assessment Buy-In. Describe how your chair and faculty were involved in assessment-related activities. Did department meetings include discussion of student learning assessment in a manner that included the department faculty as a whole?

The Assessment Team consisted of four full-time faculty members: Aimee Carrillo Rowe (Assessment Team Chair), Kathryn Sorrells, Jeanine Minge, and Stacy Holman Jones. The department also assigned a graduate student/Teaching Associate, Darla Anderson, to assist Carrillo Rowe with assessment.

The faculty participated in four phases of the assessment process: designing the assessment tool, collecting data, evaluating data, and closing the loop. Faculty also participated in collecting more targeted, informal, and qualitative data on particular courses of interest for developing the major and growing the department.

Collecting Data. At the beginning of the 2013-2014 academic year, the faculty decided at a faculty meeting to make student participation mandatory (either as a required or extra credit assignment). We believed mandatory participation would increase student and faculty involvement in the assessment process. As a result, most of the faculty of the evaluated courses incorporated the assessment activity into their course curriculum.

In the Fall 2013 semester, Carrillo Rowe notified faculty who were teaching the courses under review that their courses would be assessed the following spring semester. Especially since student participation was mandatory, we wanted to give faculty adequate time to embed the assessment assignment into their course curriculum. Those instructors were: Tamar Artin, Julie Chekroun, Nicole Embree, Gigi Hessamian, John Kephart, Kevin McDonald, Amanda McRaven, Rebecca Litke, Karen Peck, Ronda Picarelli, Hengameh Rabizabeh, Bridget Sampson, and Kathryn Sorrells. Carrillo Rowe fielded faculty concerns about the process, particularly that assessment would reflect negatively on their courses. Carrillo Rowe assured them that assessment was designed to study program—not individual course—student learning outcomes. A few weeks before the assignment was sent to students (Spring 2014), Carrillo Rowe informed the faculty that their students would soon receive an email with their assessment assignment. Some faculty reported providing reminders about the assignment to their students. Instructors, supported by Carrillo Rowe and Teaching Associate Darla Anderson, assisted students who had technical difficulties with the assessment software or the process in general. Upon the completion of the response period, the team notified faculty which of their students had completed the assessment assignment.

Designing and Evaluating Data. The Assessment Team served several vital roles in this stage of the assessment process. Team 3 designed the assignment, developed an appropriate rubric, normed the rubric, and evaluated one hundred selected student responses. The team collaborated over the course of several meetings and email exchanges to develop the authentic task assignment to assess PSLO 3, select the images, and prepare the assignment instructions and writing prompt.

When we received nearly four hundred responses, Team 3 agreed on methods to select our data. The team determined to limit the scope of student responses to those offered by Communication Studies majors, arguing that this method would give us the best data about our Program Student Learning Outcomes. Out of the 392 valid responses we received, 212 (54%) were submitted by Communication Studies majors. We then decided to randomly select 100 of those responses.

Team members developed and utilized a rubric to rate student responses. First, the Team reviewed several rubric options and developed one by modifying the rubric Team 2 had used the year before. Once we received student responses, we met to norm our evaluations of their answers, working toward consensus on how to interpret and apply each criterion. Finally, Team members individually rated student responses. Those ratings were combined to determine final scores for each response.

During the data analysis phase, Carrillo Rowe worked with Anderson to analyze the data generated from the ratings. We compared ratings by academic status (junior/senior), by course, and by number of courses taken (one vs. two or more courses). Finally, Carrillo Rowe and Anderson prepared the draft report, which was circulated among the faculty. We included their suggestions in the final draft of the 2013-2014 Assessment Report.

Faculty Participation in Closing the Loop: The 2013-2014 Assessment Report was presented at the second faculty meeting of the 2014-15 academic year in early September. Faculty members commented on two drafts of the assessment report, and their feedback was included in the final draft. Carrillo Rowe and Team 3 (Sorrells, Holman Jones, and Minge) commented on the findings and the process and invited faculty input. Faculty agreed to engage in more small-scale assessment studies to augment our large study focused on PSLOs. Ben Attias agreed to work with Aimee Carrillo Rowe to conduct a follow-up study on COMS 150 to verify that it would be a good course to add to the major. Carrillo Rowe also agreed to use Moodle technology to pilot a before-and-after self-study in her COMS 440 course. If this pilot works well, other faculty agreed they would conduct this kind of assessment in their courses as well. We hope these smaller, targeted studies will help us provide evidence to guide our faculty hiring and curricular changes in years to come.

Qualitative Study: The faculty undertook two targeted assessment projects: 1. Carrillo Rowe aided John Kephart in interviewing COMS 150 students; 2. Carrillo Rowe conducted an informal longitudinal study by assessing COMS 440 students' first and final writing assignments. In order to close the loop on the COMS 150 study, we intend to further assess COMS 150 students in an effort to determine whether the course should be made a prerequisite for the major. Data from the COMS 440 study revealed that the course teaches strong analytical and writing skills. We closed the loop by using this data to support a request for a faculty line in cultural studies.

3.  Student Learning Outcome Assessment Project. Answer items a-f for each SLO assessed this year. If you assessed an additional SLO, copy and paste items a-f below, BEFORE you answer them here, to provide additional reporting space.

3a. Which Student Learning Outcome was measured this year?

Program Student Learning Outcome 3: “Describe and explain the relationship between communication and culture.”

3b. Does this learning outcome align with one or more of the University’s Big 5 Competencies? (Delete any which do not apply)

Yes, PSLO 3 aligned with the following of the Big 5 Competencies:

·  Critical Thinking

·  Written Communication

3c. What direct and/or indirect instrument(s) were used to measure this SLO?

Given that PSLO 3 concerned students’ ability to explain the relationship between communication and culture, the team developed a direct instrument in the form of an assignment that required students to analyze a culture jamming image and demonstrate an understanding of the relationship between communication and culture. The three-part prompt had short, directed questions to minimize confusion and variation among student responses.

The prompt stated:

Please review the images on the attached document. Use just one of those
images to respond to these questions.

1. Which image are you analyzing, Corporate American Flag or iSuck?

2. What message does this image communicate about U.S. culture? (one to
two paragraphs)

3. How is communication used to comment on culture? (one to two paragraphs)

We administered this assignment to the following 25 sections in March/April 2014 (March 24 through April 6) using assessment software:

COMS 301 (Performance, Language, and Cultural Studies)—5 sections

COMS 356 (Intercultural Communication)—12 sections

COMS 360 (Communication and the Sexes)—5 sections

COMS 401 (Performance and Social Change)—2 sections

COMS 445 (Communication and Popular Cultures)—1 section

Procedures: We used assessment software to administer the assignment to students in the identified courses. Students received an electronically generated email that included a link to the assignment instructions, directing them to view the image and then respond to the prompt either online or by attaching a document. Students had fifteen days to respond. Reminder emails were sent to students regularly, with increasing frequency as the deadline approached. After the deadline, the responses were reviewed to ensure they addressed the questions presented. Incomplete responses—those that had no answer—were deleted.

3d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.

Since we had such a large pool of data, the team decided to eliminate responses from non-majors and focus our energies on responses from our majors. We believed that, since we were studying a program student learning outcome, we should focus on students who are fully engaged in multiple courses in the major. Various cross-sectional comparisons were conducted, including comparing the results of juniors and seniors, comparing results across the five courses, and comparing the results of students enrolled in one of the examined courses to those enrolled in two or more.

Qualitative Study: Kephart and Carrillo Rowe conducted interviews with fifty students from two sections of COMS 150: those taught by Randi Picarelli (25) and Hengeameh Rebizadeh (25). We asked them a series of open-ended questions about what they were learning, inviting them to discuss the most important concepts they had learned and how those concepts influenced their decision to join the major. This informal study was designed to provide data about students' experience of COMS 150. Our method was neither systematic nor comprehensive. Further, we targeted classes taught by two of our most experienced and popular instructors, so the findings may not be representative.

Carrillo Rowe conducted an informal longitudinal study on COMS 440 students, evaluating and comparing first and final drafts of culture critique papers that utilized cultural studies and ethnographic methods.

3e. Assessment Results and Analysis of this SLO: Provide a summary of how the results were analyzed and highlight findings from the collected evidence.

The findings were compared against the benchmark; we found that student answers met or exceeded it. The analysis of the results included comparing ratings by academic status (junior/senior), by course, and by number of courses taken (one vs. two or more courses).

The benchmark (40%) was exceeded: the large majority (75%) of our students are achieving competency for PSLO 3, with none failing to meet it. Although the large majority of our students are achieving competency for PSLO 3, one-quarter have yet to achieve it. The average score (8.38 out of a possible 12 points) was 69.8%, which is fairly low. This would indicate that our students have a basic understanding of “the relationship between communication and culture” but lack a nuanced understanding of this relationship. Interestingly, the scores of students enrolled in multiple sections of the courses under assessment review were much higher than those of students enrolled in just one course. This finding suggests the importance of reinforcing concepts across courses during the same semester.

Raters reported some weaknesses in student responses: some students drew on theories from fields outside of the discipline; others demonstrated only a poor to adequate ability to analyze texts and/or use theories to make sense of them. For instance, students often glossed over or failed to answer the third prompt, which asked them to explain how the culture jam demonstrates the relationship between communication and culture.

Several students referenced classroom activities and assignments in their answers. This finding suggests a correlation between classroom assignments designed to help students connect concepts to texts and student success. Students appear to cultivate the skill of applying theory and concepts to artifacts when faculty focus on teaching students to do so. We suggest faculty instruction and assignments should emphasize communication-based theories and their application to various texts and artifacts.

The course with the lowest scores was COMS 301, which is a core course and a prerequisite for two other courses examined (COMS 401 and COMS 445). The latter courses had higher percentages of responses meeting or exceeding PSLO 3. According to the department Alignment Matrix, students taking COMS 301 are expected to "demonstrate" PSLO 3, even though it has not yet been introduced or emphasized to them. PSLO 3 is shown as "introduced, emphasized, and demonstrated" in COMS 401 and COMS 445, which is appropriate given that, by the time students take those courses, they have already been introduced to PSLO 3 in COMS 301. Students in COMS 401 and COMS 445 would be expected to have even higher scores because they had completed COMS 301 and were enrolled in courses that considered PSLO 3 from a variety of perspectives. This suggests that the department should adjust the Alignment Matrix to indicate that COMS 301 should "Introduce" PSLO 3.