Assessment Report

for the

General Education Quadrennial Review 2016-2019

Please submit the completed electronic copy to

Instructions:
· For questions 6 – 9, cut and paste the text from the approved QRII Assessment Plan in the appropriate sections.
· Contact Academic Affairs and Research if you need a copy of the approved plan.
· Please clearly indicate if changes have been made to methods outlined in approved plan.

1. Course Prefix and Number: ENG1013

2. Course Title: COMPOSITION II

3. Contact Persons (Department Chair Name, Department, Email Address, Phone Number)

  • Janelle Collins, Ph.D., Chair of English, Philosophy, and World Languages, (870) 972-2429
  • Kristi Costello, Ph.D., Director of the Writing Program and Writing Center, (870) 972-2429
  • Airek Beauchamp, Assistant Director of the Writing Program and Writing Center, (870) 972-2222

Framework

View the General Education Goals and associated SLOs and courses here:

4. Please mark with an X the Student Learning Outcome (SLO) you chose to evaluate.

Quadrennial Review Assessment Timetable
Review Year / General Education Goal / Student Learning Outcome / Due for submission
2016 / Communicating effectively / Students will be able to:
Construct and deliver a well-organized, logical, and informative oral or written presentation, accurately documented, that demonstrates proficiency in standard American English ☒ / Monday, October 3, 2016
Using mathematics / Students will be able to:
(1) Interpret and analyze quantitative/mathematical information (such as formulas, graphs, and tables) ☐
(2) Apply mathematical methods to solve problems ☐
2017 / Developing a life-long appreciation of the arts and humanities / Students will be able to:
(1) Recognize works of literature or fine arts and place them in their historical, cultural, and social contexts ☐
(2) Interpret works of fine arts or literature ☐ / Monday, October 2, 2017
2018 / Developing a strong foundation in the social sciences / Students will be able to:
(1) Explain the processes and effects of individual and group behavior ☐
(2) Analyze events in terms of the concepts and relational propositions generated by the social science tradition ☐ / Monday, October 1, 2018
2019 / Using science to accomplish common goals / Students will be able to:
Understand concepts of science as they apply to contemporary issues ☐ / Monday, October 7, 2019

5. Connection: Briefly explain how the learning outcome you selected above relates to the discipline of the course being assessed, i.e. how does this course speak to the proposed learning outcome?

Composition as a discipline inherently focuses on communicating arguments and ideas and utilizing rhetoric in an intentional and skillful manner. It is the charge of our Composition II offerings to ensure that students are able to construct and deliver a well-organized, logical, and informative oral or written presentation, accurately documented, that demonstrates proficiency in standard American English.

6. Assessment Instrument Description – (Briefly describe the instrument, including how the instrument is a valid measure of the outcome. Submit the actual instrument at the end of the document under the Appendix (originally #10 on previous form).)

As an instrument of assessment, we designed a rubric to assess Composition II students’ written work. Composition faculty worked together at our pre-semester assessment workshop to decide what constitutes “effective communication [in writing]” and, more specifically, what constitutes a “well-organized, logical, and informative written presentation.” The rubric includes collaboratively written narrative step-down language in an effort to codify performance indicators. As such, this rubric (used 2014-2015) asked raters to assess each piece of writing on a scale of one to four (with four being the highest score) in three different categories, which represent a hierarchy of elements: content and thesis, organization and coherence, and style and mechanics (see Appendix 1 for rubric). Then, with the help of ITTC, we created an online repository to collect students’ essays so we can distribute them among faculty to read at our assessment workshop. For the 2016-2017 academic year, based on faculty feedback at the 2015 assessment workshop, we added an additional section, collaboratively created by faculty members, to assess students’ understanding of MLA. As with the other rubric categories, we used written narrative step-down language in an effort to codify performance indicators (see Appendix 1).

6.1 – Identify and explain deviations, if any, from the approved assessment instrument.

There was no deviation from the approved assessment instrument.

7. Benchmark – (What is the expected level of student proficiency related to the learning outcome?)

In Composition II, we hope that a majority of students (60% or better) will score a three or above in all criteria and that at least 80% of students will score a 2 or better, while also recognizing that the work we assign students in Composition II is more complex and intellectually rigorous than that assigned in Composition I. However, given what we’ve seen of students’ lack of preparedness when they enter Composition I, the rigor of Composition II, and many students’ struggles with generating academic prose, we expect many students may still receive twos (i.e., “emerging”).

7.1 – Identify and explain deviations, if any, from the approved benchmark.

While there was no deviation from the approved benchmark, we do hope to use the data from this report to develop more quantifiable and defined benchmarks for the QRIII.

8. Data Collection Process (Describe the data collection process and any planned sampling strategies. Consider the following items: term/s, section/s, location/s, modalities, and the sampling process. The data collection process should ultimately include all students taking a general education course or give all students taking the general education course an equal probability (i.e. random sampling) of being included in the data sample. This includes the Paragould campus courses, concurrent credit courses, and online, web-assisted, and traditional course formats. )

For the last two years, every spring semester, all A-State Jonesboro Composition II instructors, including concurrent faculty members, have been required to state in their Composition II syllabi that a required essay must be submitted by students to the Composition Assessment website upon submission of the final draft to the course instructor. Thus, every spring, Composition II students submit an extended (1,250 words or more) researched academic argument essay in MLA style. These essays are then rated the following summer by Composition faculty using the common rubric that accounts for “fundamentals of written communication”: Content/Thesis, Organization/Coherence, and Style/Mechanics. Since the main goal in Comp II is to introduce students to the conventions, expectations, and practices of writing and researching at the college level, we collect only the research argument, the culminating assignment of Composition II, though eventually we would like to alternate between essay rating and inter-textual citation analysis, such as that conducted by the Citation Project.

Prior to formally rating essays, faculty members engage in norming to establish inter-rater reliability. Following norming and the calculation of a representative sample size corresponding to a confidence level above 95% with a confidence interval of +/- 10%, each essay, stripped of all identifying information, is randomly distributed to two readers, who each read independently. The two scores are calculated independently and are also averaged together, in an effort to keep two distinct data points.
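For illustration, below is a minimal Python sketch of how two common inter-rater reliability estimates, consensus (percent exact agreement) and consistency (a Pearson correlation between paired ratings), can be computed from two readers’ scores. The rater scores shown are hypothetical, and this is an assumed method of calculation; the actual statistics are those reported in the appendix.

    # Hypothetical 1-4 rubric scores from two independent readers of the same essays.
    rater_a = [3, 2, 4, 2, 3, 1, 3, 2, 4, 3]
    rater_b = [3, 2, 3, 2, 3, 2, 3, 2, 4, 3]

    def exact_agreement(x, y):
        """Consensus estimate: the proportion of essays given identical scores."""
        return sum(a == b for a, b in zip(x, y)) / len(x)

    def pearson(x, y):
        """Consistency estimate: Pearson correlation between the two sets of scores."""
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
        sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
        return cov / (sd_x * sd_y)

    print(f"consensus (exact agreement): {exact_agreement(rater_a, rater_b):.2f}")
    print(f"consistency (correlation):   {pearson(rater_a, rater_b):.2f}")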

While the aim in Composition I is to help students develop writing strategies and approach reading and writing tasks rhetorically, the primary aim of Composition II is to teach students to write, read, and research academically. Thus, for this first cycle (2014-2015, 2015-2016, 2016-2017), we gathered students’ extended researched academic argument essays in MLA style. We plan to engage in comparative analysis of students’ academic writing over time, particularly as we continue to develop and improve our curriculum and strategies. Unlike Composition I, which is a multi-genre course, Composition II allows us to assess just the extended researched academic argument essay in MLA; we hope this will help us not only meet our goal of assessing general "effective communication" (through writing), but also gather information about what we, as Composition II faculty, are doing well and not so well in preparing academic writers.

Currently, the maintenance of the online repository is too onerous, and the system is too basic to allow multiple submissions from different classes (i.e., Comp I and Comp II) at one time, particularly while differentiating between them (i.e., drawing separate data from each). However, our hope is that, in the next year or two, through either the purchase of commercial software or the hiring of a full-time software engineer, we will be able to create a system in which Comp I and Comp II students can go to the same website to upload their essays and in which the Director of the Writing Program can personally upload classes, instructors, and students, so we can ensure the system is ready to go each semester when we need it to be. Thus far, we have had to rely on other busy professionals with ITTC and IT to do this work for us. Once we have a fully functioning system, we are open to assessing both Comp I and II in the fall and the spring if the committee thinks it best.

In the meantime, however, we are fairly confident that assessing Comp II in the spring semester is the best plan for now, since, between our traditional and concurrent students, our representative sample is drawn from more than 80% of Composition II’s annual enrollment. Additionally, several students who take Composition II in the fall are students who did not successfully complete the course the previous semester, and we currently do not have a system in place to ensure that we are not evaluating these students twice.

8.1 – Identify and explain deviations, if any, from the approved data collection process.

There was no deviation from the approved data collection process.

9. Planned Number of Observations (To the best of your ability, estimate the number of observations expected from the data collection process for the reporting period. Example: 120 expected observations (30 students per year for 4 years).)

For a confidence level of just over 90%, we needed to rate a representative sample of approximately 56 essays from Spring 2015.

9.1 – Identify and explain deviations, if any, from the planned number of observations.

We actually achieved a better confidence level and interval than the ones approved by the Assessment Committee. We achieved a 95% confidence level with a confidence interval of +/- 10% by assessing 80 essays (with two scores per essay, for a total of 159 observations) from Spring 2016’s 492 total and 75 essays (with two scores per essay, for a total of 150 observations; one essay was corrupted) from Spring 2015’s 350 total.
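For readers who wish to verify these sample sizes, the figures are consistent with Cochran’s sample-size formula with a finite-population correction at a 95% confidence level and a +/- 10% confidence interval. The Python sketch below assumes that method; it is offered as an illustration rather than a record of the exact calculation used.

    def representative_sample(population, z=1.96, margin=0.10, p=0.5):
        """Cochran's formula with a finite-population correction (assumed method)."""
        n0 = (z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population estimate (~96)
        return n0 / (1 + (n0 - 1) / population)      # adjust for the finite enrollment

    # Spring 2016 (492 enrolled) and Spring 2015 (350 enrolled):
    print(round(representative_sample(492)))  # ~80, matching the 80 essays rated
    print(round(representative_sample(350)))  # ~76, close to the 75 usable essays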

10. Results and Analysis of Assessment

10.1 – Results (What did you find? As appropriate, report both item and aggregate results)

Table 1: Composition II—Argument Essay—Spring 2015

Frequencies/Percent

Rubric Category / 4 / 3 / 2 / 1
Content and Thesis / 14.0% / 32.0% / 44.0% / 10.0%
Organization and Coherence / 10.7% / 29.3% / 50.0% / 10.0%
Style and Mechanics / 7.3% / 39.9% / 41.3% / 12.0%

Table 2: Composition II—Argument Essay—Spring 2015

Frequencies/Percent—Total Rubric Score

12 / 11 / 10 / 9 / 8 / 7 / 6 / 5 / 4 / 3
2.0% / 8.0% / 4.7% / 14.0% / 14.0% / 20.0% / 21.3% / 8.7% / 2.7% / 4.7%

Table 3: Composition II—Argument Essay—Spring 2016

Frequencies/Percent

Rubric Category / 4 / 3 / 2 / 1
Content and Thesis / 16.4% / 34.6% / 39.6% / 9.4%
Organization and Coherence / 15.7% / 40.3% / 40.9% / 3.1%
Style and Mechanics / 10.1% / 47.8% / 37.1% / 5.0%
MLA / 15.7% / 31.4% / 37.1% / 15.7%

Table 4: Composition II—Argument Essay—Spring 2016

Frequencies/Percent—Total Rubric Score

16 / 15 / 14 / 13 / 12 / 11 / 10 / 9 / 8 / 7 / 6 / 5
3.8% / 4.4% / 6.9% / 6.3% / 10.7% / 12.6% / 13.8% / 12.6% / 18.9% / 4.4% / 3.1% / 2.5%

The frequency scores/percentages illustrate the data from the total observations, whereas the inter-rater reliability statistics (i.e., consistency and consensus) are located in the appendix.
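For readers cross-checking the analysis in 10.2 against Table 3, the “scored 3 or above” figures can be recomputed directly from the reported percentages by summing the 4 and 3 columns for each rubric category, as in this short Python sketch (Spring 2016 shown).

    # Percentages from Table 3 (Spring 2016), keyed by rubric category and score.
    spring_2016 = {
        "Content and Thesis":         {4: 16.4, 3: 34.6, 2: 39.6, 1: 9.4},
        "Organization and Coherence": {4: 15.7, 3: 40.3, 2: 40.9, 1: 3.1},
        "Style and Mechanics":        {4: 10.1, 3: 47.8, 2: 37.1, 1: 5.0},
        "MLA":                        {4: 15.7, 3: 31.4, 2: 37.1, 1: 15.7},
    }

    for category, scores in spring_2016.items():
        at_or_above_3 = scores[4] + scores[3]  # percent of students scoring 3 or above
        print(f"{category}: {at_or_above_3:.1f}% scored 3 or above")
    # Output: 51.0%, 56.0%, 57.9%, and 47.1%, matching the figures cited in 10.2.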

10.2 – Analysis (How did you interpret what you found, i.e. what are your conclusions?)

In Composition II, we hoped at least 50% of our students would earn a score of three or four in each of the three areas and at least 80% of our students would score 2 or better in each of the three areas: content and thesis, organization and coherence, and style and mechanics.

In Spring 2015, 46% scored 3 or above in content and thesis, 40% scored 3 or above in organization and coherence, and 47.2% scored a 3 or above in style and mechanics. Overall, in Spring 2015, 28.7% of students received an average score of 3 or above and 84% scored 2 or above; the former does not meet our benchmark, and the latter exceeds our benchmark. Though I am concerned with the occurrences of scores of 2 from the Spring 2015 data, I am also aware that this data was compiled just one year after the Writing Program was re-established, that the initial focus of the Director of the Writing Program was Comp I (i.e., we had not yet made many moves toward improving Comp II), and, at that time, there were only two Rhetoric and Composition specialists on staff.

We were happy to see that scores improved in Spring 2016: 51% scored 3 or above in content and thesis, 56% scored 3 or above in organization and coherence, 57.9% scored 3 or above in style and mechanics, and 47.1% scored 3 or above in our new category, MLA. Overall, in Spring 2016, 32.1% of students received an average score of 3 or above and 90% scored 2 or above.

There are incredibly few instances of scores of 1 in organization and coherence and in style and mechanics, but there are too many scores of 1 in MLA (15.7%) and content and thesis (9.4%). However, these scores do not surprise us. The sophistication of students’ topics and arguments factors into their content and thesis scores, and, though we try to steer students toward academic inquiries and deter them from choosing top-forty binary topics (legalization of marijuana, abortion, dry county/wet county, drinking age), many students do not heed this advice (see our Action Plan for our plan to address this). Along the same lines, we added the MLA section because we recognized as a faculty that, despite our concentration on it in the classroom, too many students are not producing accurately formatted MLA papers. We hope that separating MLA into its own category will allow us to better measure how we are doing at teaching and enforcing MLA style.

As with the increase in scores in Comp I, I would like to attribute the improved scores between 2014 and 2015 to the newly hired composition specialists (we now have four), the additional time spent norming essays together, the new professional development and resources provided for Composition faculty, the increased standardization of our courses, and, as a result of all of these factors, better teaching. I do posit that all of these things have combined to improve the quality of our students’ writing and the teaching taking place in our Composition courses. However, I am also aware that the increased participation of our graduate assistants in the rating of the essays (from 14% of the raters being graduate students in 2014 to 25% in 2015) might partly account for the higher scores. This possibility is also suggested by the higher incidence of raters reporting, on the confidential post-assessment workshop survey, that they felt their scores were higher than their colleagues’ (see Appendix 6).

In sum, I do believe that the data further illustrates what I know to be true: that students have more difficulty in Comp II than in Comp I. It also reinforces what I am seeing in our meetings and workshops and what I experience when observing Composition faculty: that writing instruction on this campus has improved and, as a result, the quality of the writing produced in our Composition courses is also improving.

11. Action Plan (Specify any concerns you have about the results, either positive or negative. Explain your proposed changes to address these concerns.)

We will continue working as writing faculty to improve the scores and our strategies for teaching academic writing, research, and reading. Because the scores suggest students are still having difficulty with academic writing and research, we will also continue urging disciplines to require an upper-level writing course and/or teach students the conventions of research and academic writing in their upper-level major courses.

As mentioned above, there are too many scores of 1 in MLA (15.7%) and content and thesis (9.4%) in the most recent data. To improve these scores, we will continue to collaborate with faculty on which pedagogies and teaching techniques are currently working and where we can assist with new strategies for teaching topic selection and MLA. Already we have:

  • added MLA resources on Composition Instructor Network;
  • instituted an instructional session at our pre-semester workshop on MLA and teaching the new elements (led by Leslie Reed and Kerri Bennett);
  • ordered and distributed copies of the new MLA Handbook, 8th edition, to faculty.

Finally, we hope to develop and provide instructors with additional resources and tips for more useful and engaging MLA instruction, ranging from how to explain to students why MLA citation is important to how to streamline the mechanics of this formatting. In terms of content and thesis, we are working to improve and expand the Composition II section of Pack Prints as well as shop for a new Composition II rhetoric.

Though our IRR scores improved between 2015 and 2016 and were quite consistent in 2016, we want them to continue to improve (see Appendix for IRR scores); thus, we are encouraging more faculty members to use the assessment rubric in classroom settings, through in-class norming and for grading purposes. This will not only help faculty members become more comfortable using the rubric and help students understand how assessment works and how they are scored (if their essay is selected), but it will also help ensure that Composition faculty are keeping in mind the shared goals, outcomes, and expectations of the course (i.e., to teach students “effective communication [in writing]” and, more specifically, what constitutes a “well-organized, logical, and informative written presentation”). Additionally, we hope that norming in class using the rubric will help students better understand our expectations as they pertain to MLA and content and thesis.

11.1 – Who has been involved in the action plan? (Identify your implementation plan.)

Ideas for the action plan were generated and agreed upon by all of the faculty members present at the pre-semester Composition workshop.