February 21, 2014

To: Jill Compton

From: Gary Brown

Subject: Recap of the professional learning community (PLC) work in Quantitative Reasoning in Frankfort, February 6th (Statistics) and 7th (Math)

This brief report is outlined as follows:

  1. A recap of the fall 2013 assessment norming
  2. Results of the practice round relative to grading and rising junior competency as determined by Kentucky faculty
  3. Reliability from the assessment norming
  4. Propositions underpinning and following the norming
  5. February pilot assessment results
  6. Statistics Team Results
  7. Math Team Results
  8. Propositions from the assessment
  9. Outline of assignment scaffolding for proficiency
  10. Sample Pinnacle Matrix from the scaffolding exercise
  11. Propositions from the scaffolding exercise
  12. Next Steps—Cultivating Assessment Expertise in Quantitative Reasoning

  1. Recap of the fall 2013 assessment norming

All three 2013 meetings (Statistics PLC, Mathematics PLC, Developmental Math PLC) used the VALUE Critical Thinking Rubric and sample student work to assess Quantitative Reasoning. The rubric used an eight-point scale, and faculty agreed to set anchor points for graduating competency (5) and rising junior competency (3).

  2. Results of the practice round relative to grading and rising junior competency

The results of each PLC’s evaluations are presented in the following histogram:

[Histogram: each PLC’s ratings of the sample student work]
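Read concretely, the anchor points function as cut scores on the eight-point scale. The following is a minimal sketch of that reading, assuming a simple banding rule the memo does not spell out; the function and its output labels are illustrative:

    # Illustrative only: place a 1-8 VALUE rubric rating relative to the
    # anchor points Kentucky faculty set (5 = graduating, 3 = rising junior).
    GRADUATING_ANCHOR = 5
    RISING_JUNIOR_ANCHOR = 3

    def competency_band(rating: int) -> str:
        """Band a rubric rating against the agreed anchor points."""
        if not 1 <= rating <= 8:
            raise ValueError("ratings on this rubric run from 1 to 8")
        if rating >= GRADUATING_ANCHOR:
            return "at or above graduating competency"
        if rating >= RISING_JUNIOR_ANCHOR:
            return "at or above rising junior competency"
        return "below rising junior competency"

    print(competency_band(4))  # -> at or above rising junior competency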

  3. Reliability from the assessment norming

Reliability of the ratings is presented in the chart below:

[Chart: inter-rater reliability for each PLC]

  4. Propositions underpinning and following the norming

The results prompted me to present the following propositions, which were confirmed at the AAC&U Quality Collaboratives meeting in Washington, D.C., in January:

  • Reliability is an indicator of expert consensus around the quality of learners’ performances.
  • Expert consensus is validity.
  • Expert consensus is useful for guiding improvement.
  • Reliability is a measure of community and unity.
  • Reliability among affinity disciplines across sectors confirms the viability and utility of assessment based upon expert judgment.

  5. February pilot assessment results

With the preceding as backdrop, we collected samples of assignments and student work from PLC participants. During the February meetings (Statistics PLC meeting and a meeting of the combined Math/Developmental Math PLCs), we assessed samples using the VALUE Quantitative Reasoning Rubric in accordance with feedback from fall meetings.

  6. Statistics Team Results

The Statistics session included six faculty (severe weather limited attendance), who assessed 12 sample papers, reading each paper at least twice. Initial reliability was 60%, though each group successfully resolved its ratings to achieve 100% agreement on the sample papers. Faculty suggested that time spent calibrating in advance of rating would have been useful.

  7. Math Team Results

The Math session included 33 faculty, who rated sample work. Ratings on fifteen papers were reconciled from 56% initial reliability to 100% reconciled reliability, as depicted in the histogram that follows.

[Histogram: initial versus reconciled reliability for the Math session]
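The reliability figures here and in the fall norming can be read as simple percent exact agreement between raters before reconciliation; the memo does not state the formula, so the sketch below is one plausible reading, and the ratings in it are invented for illustration rather than the actual data. Reconciliation then raises the figure to 100% as raters discuss and resolve their disagreements.

    # Assumed reading of "reliability": the share of papers on which two
    # raters gave the same rubric score before discussion and reconciliation.
    def percent_agreement(ratings_a, ratings_b):
        """Fraction of papers where the two raters' scores match exactly."""
        if len(ratings_a) != len(ratings_b):
            raise ValueError("each rater must score the same set of papers")
        matches = sum(a == b for a, b in zip(ratings_a, ratings_b))
        return matches / len(ratings_a)

    # Invented scores for 12 papers, landing near the reported initial 60%.
    rater_1 = [3, 5, 4, 2, 6, 5, 3, 4, 5, 2, 6, 4]
    rater_2 = [3, 5, 3, 2, 6, 4, 3, 5, 5, 3, 6, 3]
    print(f"{percent_agreement(rater_1, rater_2):.0%}")  # -> 58%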

  8. Propositions from the pilot assessment

  • Assessment is not the same as grading.
  • Calibration preceding assessment is valuable for reliability, specifically for shifting from a relative scale (grading) to an absolute scale (assessment).
  • A maturing assessment initiative reflects the assignments that generated the student work, which traditionally have been developed for grading within a particular course rather than for statewide assessment on an absolute scale.
  • It is unlikely that quantitative proficiency will be successfully achieved or exceeded by students based upon a single assignment within a single course.

  9. Outlining Scaffolding toward Quantitative Proficiency

To address the limitations of single-assignment assessment, we piloted a pinnacle matrix approach to scaffolding a sequence of assignments that intentionally target proficiency. The following figure illustrates the concept using assignments shared by participating faculty; a brief illustrative sketch in code follows the list below.

[Figure: pinnacle matrix built from faculty-shared assignments]

  • The model rests upon Kentucky’s adaptation of the Quantitative Essential Learning Outcomes (similar in construct to Lumina’s Degree Qualifications Profile), rendered in the black base.
  • Measurement of the constructs is accomplished by using the VALUE Quantitative Reasoning Rubric, rendered in gray. Note how snugly the dimensions of the rubric map to the Kentucky statewide learning outcomes.
  • The rising arrows pass through the developmental states or categories of the rubric, from emerging through developing, advancing, and, ultimately, mastering. (“Emerging,” it was pointed out, may be better labeled “novice.”)
  • The Service Learning project is an example of an assignment that is actually introduced to entry-level students, yet it also sits at the pinnacle: it may well serve novice learners while affording them the opportunity to demonstrate mastery, provided other activities, such as the sample “math acceleration” assignment or a mindful sequence of “inquiry projects,” help prepare them to demonstrate mastery at the pinnacle or capstone of the curriculum.
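One way to record such a scaffold as data is sketched here; the assignment names and level placements are illustrative assumptions drawn loosely from the examples above, not the actual matrix produced in the meeting.

    # Illustrative sketch of a pinnacle matrix as data: each assignment is
    # mapped to the span of rubric levels it serves on the rise to mastery.
    LEVELS = ["emerging", "developing", "advancing", "mastering"]

    # Assumed placements; the real matrices came from participating faculty.
    # Each value is (first level served, highest level reached).
    pinnacle_matrix = {
        "math acceleration assignment": ("emerging", "developing"),
        "inquiry project sequence": ("developing", "advancing"),
        "Service Learning project": ("emerging", "mastering"),  # entry to pinnacle
    }

    def scaffold_order(matrix):
        """Order assignments by the highest rubric level they reach."""
        return sorted(matrix, key=lambda a: LEVELS.index(matrix[a][1]))

    for assignment in scaffold_order(pinnacle_matrix):
        first, top = pinnacle_matrix[assignment]
        print(f"{assignment}: {first} -> {top}")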
  10. Samples from the meeting

[Photo: Rhonda Creech shares her team’s pinnacle matrix with math faculty in Frankfort.]

[Figure: The pinnacle matrix for the Service Learning Project. Note the integration of content and skill development rising up the pinnacle. Note in particular the engagement of professionals at the pinnacle: “turn into organization.”]
  11. Findings and subsequent propositions from the scaffolding exercise

  • Scaffolding is not the same as a sequence of topics; it requires integration of skills and learning.
  • Some sample scaffolds developed by faculty in the meeting targeted learning bottlenecks (“mini-scaffolds”).
  • Some sample scaffolds shared by faculty targeted projects: population estimates, for instance.
  • Most scaffolds were developed within single courses, which implies the corollary challenge: the achievement or full realization of quantitative proficiency will require unprecedented interdisciplinary collaboration.

  12. Next Steps

  • Identify local project leads from Kentucky Math and Statistics faculty.
  • Collect assignments and samples of student work.
  • Implement a train-the-trainer process to facilitate the work locally.
  • Invite continued feedback and ideas.
  • Integrate this work with the Multi-State Collaborative work funded by the Gates Foundation.
  • Consider additional funding opportunities.
  • Promote quantitative literacy and keep Kentucky’s leadership in this national effort on the front burner and on the front page.

Bonus

Comments captured from faculty discussing student work:

  • “Interpretation of results was not discussed at all.”
  • “Calculations were not explained. What were students doing with the quadratic formula exactly? What were they talking about?”
