CENTRAL MICHIGAN UNIVERSITY

PROFESSIONAL EDUCATION ASSESSMENT COMMITTEE (PEAC)

January 6, 2017 • 8:30-9:50 a.m. • EHS 413

Members: Ren Hullender (Chair, CCFA), Ray Allen (CHP), Natalia Collings (CEHS), Susan Griffith (CHSBS), Mary Senter (Designee), Rachel Daniel (Student), Dee Yarger (PK-12)

Ex-Officio Members: Larry Corbett (CSS), Mike Carson (Academic Effectiveness), Jennifer Klemm (CSS)

Absent: Patrick Graham, Kevin Cunningham (CEHS)

Guests: Yenpin Su-Ritzler (Academic Effectiveness), Betty Kirby (CEHS Acting Dean), Megan Goodwin (CEHS Interim Associate Dean)

  1. Happy New Year! Call to Order
  2. Ren Hullender called the meeting to order at 8:38 a.m. and thanked committee members for attending a called meeting.
  3. Approval of December 16 minutes
  4. MOTION: Mary Senter moved for the acceptance of the December 16th meeting minutes. Susan Griffith seconded the motion. Vote: Yes-7, No-0. MOTION PASSED.
  5. Old Business
  6. CAEP Accreditation [Jennifer Klemm, Director, Office of Planning & Research]

−Hullender asked Jennifer Klemm to bring a prioritized list of next steps for CAEP in relation to the PEAC charge, to provide oversight and leadership for program assessment, and to then begin setting target dates for the shared priorities.

−Klemm provided a Prioritized CAEP Stipulation Tasks outline:

●Stipulations 1 and 5 tasks

  1. Big picture: (1) Review the evidences we use and how, and whether, they meet CAEP standards, and (2) decide which evidences are to be used, assess their quality, and compare them to the CAEP evidence guide, evaluating each one for reliability, validity, and technical efficacy to make sure each standard is being met
  2. Details: Review each measure against each CAEP standard and sub-standard (1.1, 1.2, etc.) to see which evidences are being used, what is missing, and what is needed to meet CAEP sufficiency standards

●If the Student Teaching Evaluation and Common Lesson Plan Rubric are kept, they need to be reviewed for increased technical efficacy, validity, and reliability, and more measurable language needs to be added. CAEP noted the improvements were better than the old assessments but still do not meet the minimum requirements.

●After assessments are improved, new criterion levels need to be established for each measure. The question “what do we want students to meet to be considered successful?” needs to be answered.

●Data Collection will be handled using Taskstream to develop a system for (1) consistent data collection, (2) disaggregation by program, and (3) alignment to standards.

●Continuous improvement – CMU currently has no way to monitor how programs are using the data CAEP wants used to improve programs. A way needs to be established to answer “how are programs making data-informed improvements?” Klemm has met with Claudia Douglass and Mike Carson to discuss options and ways to create a streamlined process for programs to do that.

−Klemm asked for committee member input on how the committee wanted to handle the work – whether through workgroups and/or her completing the work and bringing it back to the committee. Klemm also wanted to review the 8 annual measures used in the CAEP annual report.

−Members’ questions and comments regarding CAEP tasks and next steps are summarized below:

●Which Common Lesson Plan Rubric and Student Teaching Evaluation did CAEP receive? CAEP received data from the old instrument with the improved assessments included to show revisions made.

●To do empirical validation there needs to be data – have the new instruments been used yet? The new instruments have been used and there is data to be reviewed. Klemm noted she could bring the data to the committee once it is decided how they wanted to move forward with the work.

●Members expressed surprise at this.

●Are there any national instruments CAEP has endorsed? Good work has been done here and it shouldn’t be thrown out, but this seems to be a moving target if there aren’t any examples or feedback being provided. – Betty Kirby noted there is an instrument CAEP has endorsed. Initially there wasn’t one, but now there are some examples of the materials being provided.

●Are other faculty members of the Professional Education Unit (PEU) to be included in the work of crunching the data and creating interrater reliability for the created instruments? There aren’t many voting members to source from for workgroup activity. Kirby noted PEAC is a body that can make recommendations, and for this effort a recommendation could be made to use an outside resource such as the Office of Institutional Research or a consultant. The college has a short time frame and will put forth the necessary resources to support this work and make sure things are moving forward. Hullender agreed that PEAC is a recommending body, and since there may be more work on the table than the committee can handle, recommending outside resources is definitely a consideration.

●Griffith noted the one-pager is succinct and helpful.

−Members discussed how to organize the work moving forward. Comments and questions summarized below:

●What can be done simultaneously and what needs to be done sequentially? Klemm noted the big picture needs to happen first. If the student teaching evaluation and the common lesson plan rubric are kept that work can be done simultaneously, but if they aren’t going to be kept that work would have to wait.

  1. Kirby recommended taking a look at the chart that outlines where periodic measures are throughout the program. What do we have and where? Do we need to make a shift towards assessing throughout the program, if there is too much left for the end? Klemm supported that path and added wanting to map everything back to each standard.

●Need to determine a way to show the impact and growth on P-12 learners in pre-service experiences, whether using a case study or an assignment (see candidate impact on learners during the unit).

  1. Klemm noted Dawn Decker has implemented a scenario in EDU 380 with Special Education students where an intervention is presented and growth is monitored. May be something to look into.

●Criterion levels can be set after the “big picture” and assessments are put in order.

●Need to evaluate old instruments (increase measurable language and collect data for instrument validation)

●Need to pilot the assessments

  1. Klemm noted the Student Teaching evaluation has been used for two semesters
  2. Susan Griffith noted she hadn’t seen the instruments used in a way to establish interrater reliability, though feedback and data had been collected. She also noted wanting to make sure cooperating teachers

−Taskstream

●Members asked questions about the functionality, capability, present use and future use of Taskstream

  1. Megan Goodwin confirmed that Taskstream has been used in the college over the past year with some of the graduate level programs. It is capable of a wide variety of data management, and can be directly used to look at data from multiple instruments down to the student level.
  2. Kirby noted on January 27th the college will be bringing in a Taskstream consultant, with sessions for anyone to attend. They will showcase what Taskstream covers. This system was recommended at a CAEP function as one that can be implemented quickly and allows everything to be viewed. Questions are good and welcomed at the meeting.
  3. Ray Allen noted that, from what he gleaned during the last presentation on Taskstream, the system is adaptable and accommodating – the next step, though, is to have the evaluation plan in place as soon as possible, before the January 27th meeting, so the committee knows what is needed from the system.
  4. Hullender suggested the work be done in a workgroup, versus another whole committee meeting before the 27th.
  5. Carson noted that CAEP has worked with a lot of different instruments and may have good insight to offer if questions come up during the workgroup meeting.
  6. Members concurred that getting the evaluation plan in order was top priority and could be used as marching orders to progress the work in an orderly way.

−Klemm noted she will be attending the CAEP Conference in St. Louis in March. There is a pre-conference workshop on instruments she will be attending where attendees are asked to bring their instruments for a review session with a CAEP team member and get feedback.

●Kirby noted this is one of the ways to document our efforts to improve.

●Klemm noted where the committee is currently is an opportunity to be thoughtful about the entire assessment program. What are we doing? Is it giving us the information we need to help our students? What do we want it to do?

−Hullender noted there doesn’t need to be a lot of assessments; identifying a few good ones that give us what is being looked for and show what is being done is key. This is an opportunity to streamline assessment, from the time students enter the program all the way to three years out, in a valuable and meaningful way. He also noted that is why he was adamant about voting yes for the Diversity Transformation Team instrument: it was good and fits in with the others while addressing an area we are lacking in.

−Recommendations?

●Natalia Collings recommended having one group work on the evaluation plan to maintain consistency and progress alignment. Griffith agreed, noting consistent language is important.

●Senter recommended bringing up the assessment instrument conversation at the next PEU meeting in January.

  1. Kirby noted the agenda was still being worked out but was already pretty full with Dr. Dale being on campus, the provost coming, CAEP updates, and the Secondary Faculty Learning Community but she would take note and take it into consideration.
  2. Senter noted that, beyond everyone being on the same page, involvement is needed from people across a wide range of departments to gain interrater reliability on the instruments; otherwise the instruments are all being used in different ways, or, as has happened before, feedback isn’t coming back here because there is no established way of reporting. PEAC is the centralized committee, but everyone needs to be on the same page.

●Allen noted creating the plan is critical at this point in order to make decisions. Part of that plan does need to be a good programmatic system to provide feedback with people from all levels. Need to expedite some things but keep others in mind.

●Griffith suggested presenting the plan at the meeting on the 27th to show others this is the progress PEAC is making and collect any responses or feedback. Small group work can’t be done there but it may be a way to start to build a system of work that shows and documents progress and intention.

  1. Kirby noted HLC required a plan of action because of the probationary decision. It was communicated late in the semester (staff stayed and worked until the 22nd and were back on the 3rd), but there is a basic overall plan outlined from January 2017 to Fall 2019. That doesn’t mean the plan doesn’t need faculty input, but a plan needed to be submitted.
  2. Griffith noted she just wanted to make sure something is shared because if the committee is quiet it may come across as all is well in hand and there doesn’t need to be any input or assistance from anyone else.
  3. Klemm noted she is comfortable sharing a plan on the 20th at the PEU meeting.
  4. Goodwin suggested bringing the plan as drafted to the committee for the small group to look at and make adjustments, then showing it to the larger group at the PEU meeting.
  5. Hullender took volunteers for the “big picture” workgroup who would work to say clearly here are the stipulations, this is what CAEP is requiring, here is the plan, and here is what we are working on.
  1. MOTION: Susan Griffith moved that PEAC begin our work to address CAEP feedback by developing a program assessment map of the BS in Education degree program. Mary Senter seconded. Vote: Yes-7, No-0. MOTION PASSED.

−Discussion of the motion:

●Natalia Collings began a conversation about the e-portfolio, a potentially missed opportunity for use as an instrument to fill a gap in the evaluation plan. Kirby noted Taskstream has a portfolio function. Goodwin recommended inviting the e-portfolio coordinator to the Taskstream training. Collings noted a perceived lack of commitment to instruments despite efforts toward consistency.

  1. Senter noted that would be an endorsement of the plan – it could be shared with students, cooperating teachers, and university coordinators; people are on different pages and use the same language for different things.
  2. Griffith noted she believes the first step is to get the framework done and have the conversation started – once the map is made gaps will be more clear.
  3. Collings asked whether, after gaps have been identified, the group would look for outside instruments. She often looks for what teachers are actually using in schools, for example Danielson and 5D+, to connect with the authentic tools students will use when they leave. Students would have more buy-in to those instruments.
  4. Dee Yarger noted 5D+ is an evaluation that is growing and she has had multiple trainings and professional development experiences about it. Having hands on experience with it before working in the schools would be helpful for students.
  5. Klemm noted she would be glad to look and see what systems are available and proprietary.

−Hullender noted the workgroup needs to get together between now and the 27th to get that done. The group can meet with Klemm to provide feedback; Klemm will put the plan together, and other people can provide feedback. Allen noted his preference for doing this type of work in person versus back and forth via email.

−Carson asked if Klemm had a starting point for the plan. Klemm noted she would first list each standard, then list what we have to measure it, and then look at the timing of when the assessments are administered. Griffith noted presenting it as clearly but “abstractly” as possible in a visual way would be beneficial.

−Senter noted that, whether from the CEHS dean or other college deans, departments need to be strongly encouraged to have representation at the meetings. Kirby noted the dean’s office has been working with associate deans in the other colleges for two years to gain more representation at the meetings. This was reiterated at the fall deans’ meeting, and as recently as this past Tuesday the provost was asked to ask the deans. The most powerful tool, though, is peers – letting them know they are needed even more so now with accreditation and its increased complexity. She encouraged everyone to “bring a friend” to the PEU meeting.

−Hullender wanted to confirm that there would be no work on the Common Lesson Plan Rubric or the Student Teaching Evaluation until the work on the plan had been done.

−Members requested official Outlook meeting invitations for calendars covering upcoming meeting dates, both set and added.

  7. Inquiry into the use of Taskstream

−Goodwin noted there is going to be a meeting today at 11 a.m.; members will be hearing more, and Taskstream will be more actively used. Hullender noted a large part of the plan is looking at using it to collect data, connect to Blackboard, and hopefully be able to see data in real time.

−Kirby shared that a reconsideration letter from the college was submitted to CAEP on December 20th or 21st. In addition, the State of Michigan submitted a strong letter of support on December 21st. There is no update at this time because the council doesn’t meet until April, though MDE did request a response by March 1st. The letter first goes to a senior associate, and if they deem it worthy of consideration they send it to the council. A copy of the letter was sent to both the senior associate and the president so they can deem it acceptable. The letter does not change our efforts, though, because updates need to be done regardless. It would be great if the probationary status were lifted.

●Senter asked whether there was any HLC involvement. Kirby noted when any program goes on probation they get involved. Goodwin added that they seem to want to be more informed than involved. Kirby noted it had to be a letter with an action plan.

  8. Future meeting dates [added meetings may become necessary]

●January 27, 2017

●February 10 (possible) – official

●February 24

●March 24

●April 14 (possible) – official

●April 28

  9. Adjournment
  10. Meeting adjourned at 9:58 a.m.