Corrected
SPECIAL EDUCATION ADVISORY COUNCIL
Minutes – March 10, 2017
9:00 a.m. – 12:00 p.m.
PRESENT: Debbie Cheeseman, Annette Cooper, Gabriele Finn, Sage Goto, Martha Guinan, Valerie Johnson, Amanda Kaahanui (staff), Dale Matsuura, Thomas Moon (for Stacey Oshio), Kaui Rezentes, Charlene Robles, Susan Rocco (staff), Rosie Rowe, Ivalee Sinclair, Todd Takahashi, Christina Tydeman, Steven Vannatta, Amy Wiech, Jasmine Williams
EXCUSED: Brendelyn Ancheta, Bernadette Lane, Kaili Murbach, Tricia Sheehey, James Street, Gavin Villar, Susan Wood
ABSENT: Bob Campbell, Toby Portner, Dan Ulrich
GUESTS: Daintry Bartoldus, Jeff Krepps, Lori Morimoto, Tom Saka, Lori Sumida, Flora Switzer
TOPIC / DISCUSSION / ACTION
Call to Order / Chair Martha Guinan called the meeting to order at 9:10 a.m.
Introductions / Members introduced themselves to guests and to the new Special Education Director and Director of the Monitoring, Accountability and Compliance Office, Christina Tydeman. Christina shared her previous experience as a middle school special education teacher and her work in the areas of accountability and data governance. In her new role she is moving her position and office from a transitional status to more solid footing, helping to clarify SEA functions, and she looks forward to learning from SEAC.
Announcements / Amanda Kaahanui announced that a few more volunteers are needed to man the SEAC table at the SPIN Conference. Todd Takahashi and Martha Guinan have signed up for shifts to date. / Ivalee Sinclair, Flora Switzer, Annette Cooper and Amy Wiech volunteered.
Review of Minutes for February 10, 2017 Meeting / Martha pointed out that under Input from the Public, Restriction on who can conduct a Functional Behavioral Assessment, the minutes should reflect that the team was advised not to do FBAs. / The minutes were approved as corrected.
Overview of the State Systemic Improvement Plan (SSIP) Submission / Debbie Farmer led a discussion about the draft SSIP – Phase 3 report, which is due to be submitted to the Office of Special Education Programs (OSEP) on April 3rd. She explained that the SSIP is Indicator 17 of the Annual Performance Report. Phases 1 and 2 focused on a State-identified Measurable Result (SiMR) of reading achievement of 3rd and 4th graders with specific learning disabilities, other health disabilities and speech language disabilities. Phase 3 describes implementation measures. The Special Education Section (SES) received the task of completing the SSIP submission in October 2016, and although a lot was done at the school level, there was not a mechanism to collect evidence. Debbie shared a document that included data, activities that have taken place to date, and questions developed by the SES related to implementation. She encouraged members to provide input on the draft document. The questions will be turned in to OSEP to help determine if Hawaii has met its outcomes.
Questions/comments from members and guests:
C. In Phase 1, SEAC was included but segregated from DOE stakeholders. In Phase 2, SEAC was largely excluded with the exception of providing input on targets, even though we expressed an interest and a perceived duty to participate.
Q. When you speak of Complex Area staff, are you speaking to the teacher level, too? A. No, strictly state and complex staff.
Q. Will SEAC members have an opportunity to respond to the draft after today? A. Yes, we would need feedback by the end of March.
Q. Are you working with anyone from WestEd, the national consultants for the APR/SSIP? I understand that they have been convening learning communities. A. When we received the SSIP responsibility in October, we had some contact with WestEd but not with the learning communities.
C. In your implementation goals regarding stakeholder participation, I don’t see parents included.
Q. Do you have a list of evidence-based practices (EBPs)? A. Yes, I can provide one.
Q. Is OSEP asking states to formulate questions regarding how we have been doing things in the past? A. It’s more about effecting change by 2020.
Q. Are we using the 3rd and 4th grade achievement as an indicator that we are helping all kids? What about the current kids in our system who are older? A. (Christina) If you look at the implementation language, it is not specific to K-3. It’s just that the measures and targets are for 3rd and 4th graders.
Q. Is your effort only around mild disabilities? A. No, it’s for all kids.
Q. Is the assumption that the training you are providing is to all sped and general education teachers? A. If we do training on EBPs, it is kind of like a Response to Intervention model—robust for Kindergarten to grade two.
C. The quantitative piece is missing from the outcomes. At what rate do we want improvement to happen? In evaluation, we need to see if the plan was successfully implemented. A. We tried to be respectful of previous outcomes and added a few more for clarification. We also put in the data we might look for—for example, were teachers registered for training. We tried to keep the same train of thought; otherwise we would have to change everything.
C. (Christina) I agree with the comment about outcomes. We need to put in evidence-based questions. There is no indication now as to whether the training was effective, etc. My intent is to develop those measures for internal monitoring. As members look through the draft, if questions can be modified to make them more evidence-based, that will help drive our monitoring.
C. It could help your implementation if you have benchmarks for monitoring.
C. For early intervention, we have to do the same thing. The way we devised our questions reflects how the data went in—for example, of all staff in a demonstration site, how many completed training, and what were the pre- and post-competency measurements?
Q. When you are talking about measurable objectives—“to what extent does observation indicate an increase in the use of EBPs” for example—how are you going to measure fidelity? Is it only by a walk-through? A. It is also through progress monitoring.
C. I don’t see that aligning. How will it measure implementation of the EBPs? Are they or aren’t they progress monitoring? A. We have no way to gather the evidence.
Q. When you look at 3rd and 4th graders, how are you tracking them for definitive information that the kids who got training have kept the skill and continued improving?
C. Teacher retention needs to be used as a data point when talking about fidelity. You won’t get fidelity if the teacher changes every year.
Q. Is special education money funding all students to get these resources? Is the K-4 training using special education funding for all teachers in a school? A. No, it’s all from general funds.
C. The SSIP is the tool to communicate how Hawaii is meeting the requirements of the federal government, but we can have higher aspirations for our students, and SEAC can address the needs of all age groups and disabilities.
Q. If SEAC wanted the targets and outcomes to be higher, how did you come up with these numbers? A. They were set previously.
C. Like the APR targets, they are very conservative. A. OSEP has cautioned states to have reachable outcomes.
C. There has to be a context for the target—math or an algorithm.
C. I would like to hear that you are committing to align the questions to measurable outcomes. A. (Christina) I will affirm that we are planning to do that. I am eager to get to an evidence-based plan.
Q. Would it help to have a small group from SEAC meet with you to massage the draft? A. (Christina) My hesitation is only because of the short timeline. Your input will be used to vet ideas for evidence. If you see any pieces that are obviously missing, that would then flag us to create evidence for monitoring. (Debbie) The SSIP will also have to go to the Deputy for approval prior to submittal.
C. One recommendation is to come up with long-term outcomes and work backwards to design implementation. There are two kinds of data—head count vs. fidelity. Early on, you want to get as many trained as possible, so head count is the low hanging fruit. As you improve, you can shift to fidelity.
C. On the stakeholder page, the questions don’t have outcomes involving parents. If you want parents to read to their child, they will need similar resources, training and strategies as teachers to work as a team.
C. In order to include parents, you need to define stakeholders.
C. On your data sheet, percentages may not be the best way to measure proficiency. You need to talk about acceleration towards proficiency.
Q. Would you define fidelity? A. It means we are doing it with rigor and it is done right.
C. (Martha) I request that you come back and show us what is turned in as a final product. If you need comment on a specific item, email Susan and she will get it out to members.
Assessments under the Every Student Succeeds Act (ESSA) / Tom Saka, Director of the Assessment and Accountability Branch, gave members an overview of the purpose of assessments, current testing requirements, and regulations regarding statewide assessments from the Every Student Succeeds Act (ESSA). In an effort to reduce testing, the Department made end-of-course exams optional and reduced mandatory assessments from 20 to 16. Tom’s office explored a flexible option within ESSA to replace the Smarter Balanced Assessment (SBA) in high school with the ACT or the SAT. The latter two tests do not have enough accommodations for special education students, and the ACT cannot differentiate for low performers. The Department has also looked into ESSA’s plan to award a few states a waiver to do ‘innovative assessments’ instead of a typical standardized assessment like the SBA or PARCC. The Hawaii State Teachers Association (HSTA) would like Hawaii to be a pilot state. However, since Hawaii is a unitary school district, all schools would have to scale up quickly under an Innovative Assessment waiver. Current efforts to reduce time spent in testing include 1) looking at the NGSS Science innovative assessment, which utilizes projects and portfolios, and 2) looking at whether we can take writing out of the SBA and use a portfolio instead. In the meantime, Hawaii is waiting to see if other states apply for the innovative assessment option. Tom doesn’t want to commit teachers to doing things without careful consideration of options with HSTA.
Questions/comments from members and guests:
Q. Are the ACT or the SAT developers looking at ways to increase accommodations? A. Yes. The SAT has a lot more accommodations, but the data won’t be available until next fall.
Q. Since one of the problems with the SBA is that test results are available too late to impact instruction, can you give the test earlier? A. Yes, but it is designed to measure what you know and have learned, so testing early may include information that the student hasn’t been exposed to yet.
Q. Why did Hawaii give up its old Hawaii Achievement Standards? A. In part from national pressure to adopt a test aligned with the Common Core.
Q. Since a good number of students with developmental disabilities do not appear to be benefiting from taking the statewide assessment, is there a way to offer an assessment that has been normed on students with a lower I.Q.? A. The assessments no longer talk about norms but about criterion—specific skills per grade level. For special education students, it’s more about testing at the appropriate grade level. / A copy of the PowerPoint of the Assessments Update was distributed to members.
C. The innovative assessment option would be better for most students with IEPs. A. It would be interesting and we are not unsupportive. It would depend also on what kind of writing task we give.
C. A big problem with SBA has been the pressure put on schools to raise achievement scores resulting in an inordinate amount of time preparing for the test. A. If you teach to standards, it shouldn’t take more time, and fortunately, with the movement in ESSA and Strive HI to no longer rank schools, fewer schools are taking practice tests.
Legislative Report / Ivalee Sinclair reported on the following items:
Funding education through tax on rental properties
SEAC wrote testimony in support of HSTA’s efforts to create an additional source of funding for public education through taxes on rental properties and the visitor industry. Ivalee’s concern, which was echoed by Representative Takumi and Senator Kidani, is that if more money is available from another source, the legislature will quit funding its share of education costs.
Restraints and Seclusion training
SEAC requested that the Finance Committee restore $300,000 cut by the Governor and earmarked for training on the proper use of behavioral interventions and restraints, as well as add $85,000 for a manager to oversee the training and data collection.
Teacher Qualifications under ESSA
At the February 21st Human Resources Committee meeting of the Board of Education, input was sought on how Hawaii should set State Certification Requirements for special education teachers. The options are 1) to have teachers who hold a valid special education license be considered “state certified” to teach their content areas with only the license, or 2) be required to demonstrate subject matter competence in / A list of legislative initiatives that SEAC is following was distributed.