AN UPDATE ON THE STATE BOARD OF EDUCATION
PREPARED BY: MARYLAND STATE EDUCATION ASSOCIATION · NEA
February 23, 2016

MEETING DATES FOR THE STATE BOARD OF EDUCATION
March 22, 2016
April 26, 2016
May 24, 2016
June 28, 2016
July 26, 2016
August 23, 2016
September 27, 2016
October 25, 2016

Consent Agenda Items
The State Board of Education (SBOE) approved the following:
  • February 11 & 12, 2016 minutes
  • Personnel
  • Budget Adjustments for January 2016
Information and Discussion
Assessment and Accountability Update
Relationship Between Partnership for Assessment of Readiness for College and Careers (PARCC) Test Scores and College Admission
Dr. Henry Johnson, chief academic officer, and Dr. Douglas Strader, director of the Planning and Assessment Branch, briefed the SBOE on comparisons between PARCC results and scores on the ACT & SAT.
  • The Maryland Assessment Research Center (MARC) was asked to examine the first administration of PARCC and its relationship to the ACT & PSAT. The data were limited because only the high school administrations of Algebra, Algebra 2, and English 10 could be used; the population pool was further limited by the timing of when students took the tests.
  • Results indicate that there is a moderate relationship between PARCC test scores and PSAT.
  • A score of 750 on PARCC (a Level 4) correlates to a 500 on the SAT in English Language Arts (ELA) and Math, and to an ACT scale score of 21; this is where Maryland would want to be, although the ACT score should actually be 23.
Comments:
Dr. S. James Gates, SBOE vice president, stated that this is a creative way to look at the data, but the devil is in the details; he asked whether the Board should feel confident in these scores and the suggested correlation.
Response: Yes, but keeping in mind the limited sample size.
Dr. Gates further asked if this methodology has been used in other states.
Response: The researchers from MARC who conducted the study were present to address questions about methodology. Because they were using data from first-time test takers, the data pool for comparison is very limited; the plan is to continue the research with additional administrations.
Mrs. Madhu Sidhu, SBOE member, commented that she is glad Maryland is part of PARCC; it's a good quality test. She would expect scores to be more accurate in the future, since students didn't have to pass the test this time, just take it.
Response: MARC estimated that would happen, too.
Dr. Gates added that when parents express concern about dropping scores, he tells them the standards being tested represent "a higher hoop" for students to jump through.
Study of PARCC Results by Mode of Delivery (Mode Effect)
Dr. Strader and the researchers from MARC began a presentation to answer previous SBOE questions regarding the effect of testing mode on scores.
Highlights:
  • The first year of PARCC scores compared students who took the assessment by paper and pencil with those who took it online; online testing has many benefits, which were explained in 2007.
  • Maryland began using online assessments with science tests in 2007; at that time, the online tests needed to mirror the paper versions, since paper was the standard.
  • Now online is the standard; the difference is that students have to engage with multimedia. Scale scores are based on the online tests.
  • The 2014-15 administration of PARCC had 80% participation online. The study showed some discrepancies: in elementary math (grades 3-7), not much; in secondary math, the discrepancy is much greater; in ELA, the discrepancy is much higher throughout the grades and for both Level 3 and Level 4 scores.
Comments:
Mrs. Linda Eberhart, SBOE member, asked about students scoring at Level 2.
Response: Pretty much the same.
Possible Reasons for Discrepancies:
  • Technical issues with the test – several groups looked at this possibility, but nothing was found in the development, administration, or scoring of the test that contributed to a mode effect.
Comments:
Mr. Larry Giammo, SBOE member, inquired whether the researchers looked question by question; perhaps the user interface is the issue, or it may be a design issue. He requested to see the variation data.
Response: Not necessarily, because the items may not be exactly the same online and on paper.
Mr. Guffrie Smith, SBOE president, noted that there were more questions and promised that the Board would take more time at another session.
Ms. Eberhart agreed to table her question as long as there will be another session. This is critical: is the discrepancy due to readiness or to technical development? Reports in the news said Maryland had technology problems.
  • Differing populations – the populations analyzed were not random samples; the researchers looked at how students performed on prior tests. The group who took the assessment by paper and pencil tended to have a higher percentage of high-performing students than the group who took it online, making the paper-and-pencil group NOT a random sample; this explains 40% of the deviation.
  • Matched sample analysis – when looking at disaggregated scores, there is some discrepancy in the "All Students" category on Math 8 and Algebra 1; however, when subgroups are split out, there is much more alignment.
Comments:
Mr. Andy Smarick, SBOE member, asked for the panel to walk the Board through one cell of their chart.
Dr. Strader narrated an example.
  • Readiness Extended Constructed Response – this seemed to have the biggest mode effect, attributable neither to typing time nor to online platform tools; the researchers were surprised not to see the same effect in math. They looked at anecdotal comments to answer why there was such a large score discrepancy in ELA. Students said they did not know how much was enough: online, students didn't know how much to write to "fill in the box"; on paper, the area to write in was visually much larger than the online text box.
Comments:
Mr. Smarick expressed concern that he and the researchers had different definitions of "readiness." The researchers are saying the design they gave students did not match; to me, that is your problem; you should be matching our clients' needs rather than having them do what you want.
SBOE members inquired whether the assessments took into account multiple intelligences and Universal Design for Learning (UDL).
Response by Dr. Johnson: If the assessments took those things into account, a difference in past mode effect studies would have been exhibited, but that is not the case. Teachers at the high school level were not necessarily teaching the standards with fidelity – some were just waiting to see how it goes.
Dr. Jack Smith, interim state superintendent, added: we have been examining a study that looks at how instructional practices of teachers relate to student scores.
Mrs. Sidhu asked why the test makers couldn't put the word count expectations in the box. She also had concerns regarding specific special education students: there is a population who cannot take this test at all; what are we doing about this?
Response: MSDE is looking at a test and has a contract that will be presented to the Board on March 8; it is not PARCC but is based on specific standardized skills. Communication competency is primary – not portfolios, but indicators such as eye gaze and finger pointing.
Dr. Gates commented that it is astounding to him that the Equation Editor was not seen as a factor in the discrepancy; as a mathematician, it would be a huge barrier to him. If kids are not using an equation editor in class, putting it on the test is a terrible, terrible thing to do to them.
Mr. Giammo asked Dr. Smith to please give the Board data for item analysis variation.
Response: It is important even for instructional purposes.
Mr. Giammo inquired further: So, what decisions do we need to make as a Board?
Response: We are contracted to use PARCC through June 30 of this year. After that, it is recommended to use it again in the next year, continue to analyze, and start looking at what else is in the marketplace. That way, a decision can be made and be ready after 2018, when the contract will again be up. MSDE staff cautioned against going it alone, citing Tennessee, Florida, etc. as states where that strategy has imploded. This is at the top of the March Monday meeting agenda.
Graduation Assessment Requirements
Dr. Johnson continued addressing the Board to provide a recommendation on what PARCC scale score/performance level should be adopted to satisfy the assessment graduation requirement for Algebra I and English 10. The discussion was a reminder of what should and should not be expected, in light of the fact that for the first two years students only need to participate in the assessment – not pass it.
Dr. Strader presented a psychometric analysis of what MSDE recommends going forward, along with Dr. Chow, one of the researchers from MARC, who explained the data and the study. MARC worked in conjunction with MSDE's technical department, which worked on cut scores for the HSAs. The passing rates by PARCC Algebra I performance level and cut scores follow.
Highlights:
  • The High School Assessments (HSAs) have been part of the Maryland graduation requirements since 2003.
  • During the 2014-2015 school year, the PARCC Algebra I and English 10 tests replaced the HSAs.
  • The HSAs are on a scale from 240 to 650 score points; 412 is the passing score for Algebra; 396 is passing for English.
  • The PARCC assessments are on a scale from 650 to 850 score points.
  • The research question asked what PARCC scale scores correspond to the HSA passing scores.
  • The study findings are summarized below:
Content / HSA Passing Score / PARCC Equivalent Score / PARCC Confidence Interval
Algebra I / 412 / 720 / 697–743
ELA 10 / 396 / 707 / 686–728
Comments:
Mr. James DeGraffenreidt, SBOE member, asked, is this a math exercise or does it tell us what students can actually do?
Response: We start with the one and then go to the other.
  • Scores show that for English 10 and Algebra 1, about 60% of our students would pass with a PARCC Performance Level of 3 or 4.
  • The researchers recommend that the Board set the score at 725.
Mr. DeGraffenreidt continued by pointing out that about 80% of Maryland students graduate. This says to a 10th grader, "You are approaching the expectations for graduation if you do..." How do we finish that sentence? What follows this to make the Maryland diploma meaningful?
Mr. Chester Finn, SBOE member, stated, this does not seem right to me. The fundamental assumption is that what was good enough for the HSA is good enough for PARCC. I thought the standards were being raised; that here in Maryland we would be setting the bar higher. If we are saying Level 3 is the standard for graduation and Level 4 is the indicator for College and Career Ready, then we have to tell people that up front, and not perpetuate the lie, again, that they are college and career ready when they graduate.
Mr. DeGraffenreidt countered that we are saying the same thing differently.
Response from Dr. Smith: Our current students are not ready; students taking PARCC now have not had the benefit of 10 years of the Standards.
Mr. DeGraffenreidt stated that the Board needs a more in-depth conversation to talk about transitional scores.
Response:We (MSDE staff and the Board) are starting that conversation. The MSDE staff and researchers have analyzed the data and made recommendations; we need to go to publication by the April meeting. MSDE staff based these recommendations on the Board’s previous indications of wanting students to graduate; if the Board said they wanted Level 4 as a threshold, staff would have started there.
Mr. Finn countered that the recommendation here does not have a time limit; it does not suggest a transitional period. MSDE said they didn't want to have two diplomas, one for course completion and one for College and Career Ready. Fine. But what do we want that one diploma to mean?
Mr. Smith added, raise the bar and set some time limits.
Mr. Giammo pointed out that the Board keeps dancing around the idea of two different diploma standards; this needs to be decided – not here today, but it needs to be addressed. I would have done this differently, maybe the only one who would have, but I would have gone item by item – sat with teachers – and asked them, is this what a kid needs to know to graduate?
Ms. Eberhart commented that that is what was previously done in the summer with teachers – item by item. What about the Bridge program?
Mr. DeGraffenreidt strongly urged the Board not to even discuss two diplomas at this time: we do not have enough information yet as to what College and Career Ready means or how it impacts the achievement gap; we don't know enough to make that kind of decision.
Response by Dr. Johnson: MSDE would like to bring more information to the March Board meeting and hopefully go to publishing COMAR in April – it must be done by May, so that students entering the ninth grade know what their expectations are.
Board members continued asking: What do the Local Education Agencies (LEAs) say? Have you talked with them?
Response by Dr. Smith:No, we always start with the Board as the decision making body. The LEAs will be getting the information today.
Mr. Finn asked whether that means the LEAs will be getting this document with MSDE's recommendations as stated, because there seems to be a lot of pushback here.
Response by Dr. Smith: Yes, but MSDE is meeting with the superintendents next week and will have the conversation with them as we did with you today. It's a difficult conversation to have; we had to grapple with how you eat an elephant – one bite at a time. This is where we decided to start the conversation; it needs to be continued and expanded, debated and discussed.
Dr. Gates stated that the Superintendents need to see this as a transitional document.
Mr. DeGraffenreidt further added, I think first impressions are important. You can't give them this and then wait a week and say, "Oh, but that's not accurate." Are we clear that we want those specifications included in what is sent to the superintendents?
Mr. Finn added, we need to tell the superintendents that if the goal is to eat the elephant, then the goal is to eat an elephant. We need to make that clear.
BREAK was taken.
Action Item
Social Studies Assessment
Dr. Smith introduced the Social Studies assessment issue arising from legislation passed in 2012 that requires a decision to be made and possible action to be taken.
Dr. Johnson and Ms. Heather Lageman, director of Curriculum and Instruction, referred to Maryland Law 7-203 (2012), which requires the Board to determine whether the PARCC assessments administered to middle and high school students in 2014-2015 adequately measure the skills and knowledge set forth in the state's adopted curricula for the core content area, and to take action on the portion that affects the middle school test. Data and analysis were presented to explain why MSDE staff feel that PARCC does not meet these thresholds.
Dr. Smith further explained that Dr. Lowery had kept postponing this issue, and now based on the MSDE analysis, it is not reasonable to develop and implement an appropriate test within the mandated time frame. The Board needs to send a letter to the General Assembly explaining this and asking for a delay, while the commission is studying this and other testing issues.
Comments:
Mr. Finn commented that the Board needs to make it clear to the legislature that we intend to comply; that the delay is not to avoid compliance. He also asked about a history test in Maryland.
Response: No state history test exists; MSDE needs to revisit this.
Mr. DeGraffenreidt agreed with Mr. Finn, stating the need to include that the Board is aware this is a legislative mandate; the letter should also outline the pathway MSDE would use to achieve compliance.
Dr. Michele Jenkins Guyton, SBOE member, said this falls into the too-much-testing issue; can a social studies assessment be coupled with something else? Is there an existing test that can be used?
Ms. Eberhart questioned whether the timing of an 8th grade test right up against the Citizenship Test is wise; should the middle school test be moved to 5th grade?
Dr. Smith returned the focus to communicating with legislators; he said it seemed the Board was comfortable with the letter and asked for logistical guidance.
Mr. DeGraffenreidt asked if the Board can get a legislative solution or administrative waiver so people don't get angry that the Board is out of compliance.
Response: There is no administrative avenue for a waiver; we need a legislative solution – propose an amendment to a current education bill that would address this.
The Board agreed through consensus that the letter would come from President Guffrie Smith.