Hackett Community Engagement Team Meeting Minutes
Wednesday, 1/20/16, 6-8 pm
Minutes prepared by: Anne Erling
Dr. V. was not in attendance for this meeting (and as it turns out, she announced her resignation the following day, effective Friday 1/22). On the student front, Amelia Colafati, an 8th grader at Hackett, attended the meeting in place of Mimi Morse, who couldn't make it.
Review and Approval of Last Month's Meeting Minutes: Xrystya Szyjka provided the correct spelling of her name to fix an error in December's minutes. With this change, and following a motion that was seconded, December's meeting minutes were unanimously approved.
Review of Upcoming 2nd Quarter Report: For much of the rest of the meeting, Mr. Paolino walked us through current student indicators related to Hackett's chosen metrics under Receivership, and the status of implementation of the initiatives included in this year's Continuation Plan. This information will be reported to SED in the 2nd Quarterly Report, which is due on 2/12 thanks to an extension SED granted so that Hackett can finish winter NWEA testing first and include those results in the report. Hackett expects to finish winter NWEA testing on 2/5/16.
Mr. Paolino handed out a first draft of the 2nd Quarter Report, along with two Leading Indicator Progress Reports, one on yellow paper reflecting data as of 12/31/15, and another on green paper reflecting data as of 1/18/16. He will also email electronic copies of these reports to us. The Quarterly Report draft includes draft language prepared by the Building Leadership Team (BLT) in its meeting yesterday, with some lines left blank for initiatives Mr. Paolino would like to discuss with the CET before writing. The Leading Indicator Progress Reports are tools prepared by a SUNY advisor to the BLT that provide a monthly snapshot of where Hackett stands on 10 of the 11 indicators selected to gauge Hackett's progress out of Receivership. The "Priority Schools" indicator (which measures student results on the state exams) is the only one without data in this report, because Hackett can't yet predict how it will fare on that indicator, for two reasons: first, SED has not yet given the school its target Adequate Yearly Progress (AYP) scores for each of its subgroups, so we don't yet know how success on this indicator will be determined; second, we don't have a tool that reliably predicts how well students will do on the state exams. Ken Robbins may be able to use student NWEA MAP (Measures of Academic Progress) scores to approximate State exam scores once we get our target AYP values from SED.
Walk-through of Hackett performance on its indicators: Based on the following results, Mr. Paolino predicts Hackett will score at least 73 points (and perhaps as high as 81 if winter NWEA math results show strong student learning gains), which exceeds the 67-point cut-off for success in this year's Receivership challenge.
Level 1 Indicators:
- Priority Schools – No data included because SED has not yet provided Hackett AYP targets. Even when we get them, though, we have no local measure that definitively correlates with state exam performance.
- School Safety/Weighted VADIRs – At the meeting we got a primer on VADIRs and a snapshot of where Hackett stands at this point in the year.
"VADIR" stands for Violent and Disruptive Incident Reporting. There are 20 categories of VADIRs, and those that result in injury or are serious in another way (e.g., having a weapon at school) are considered "weighted VADIRs" and earn a school points, ranging from homicide at the high end (worth 100 points), through assault with a weapon (30 points), reckless endangerment with a weapon (25 points), reckless endangerment without a weapon (1 point), and larceny without a weapon (1 point), down to minor altercations without weapons (meaning physical contact without resulting injury) at the bottom of the scale, of which there are 28 types (worth 0 points).
The school's VADIR database is stored online, accessible through District staff log-ins. When there is a discipline incident at school, a teacher or staff person files a report in this online system. The system includes detailed definitions of each VADIR, and staff read through them to assign the correct incident name to what transpired. To be recorded as a VADIR, an incident has to happen during school hours. If an incident happens outside of school hours, school staff can "report" the incident (which would be required prior to suspending the student) but wouldn't "record" it as a VADIR. Bullying is an unweighted VADIR unless it involves a weapon.
At this point in the year, Hackett has 70 VADIRs, of which 3 are weighted incidents worth one point apiece (larceny without a weapon, reckless endangerment without a weapon, and a weapon found under uncertain circumstances), yielding a total of 3 VADIR points. To be considered a Persistently Dangerous School, a school must have a School Violence Index (SVI) of at least 1.5 (calculated by dividing the school's total number of VADIRs, weighted and unweighted, by the number of students in the building), or an SVI of at least .5 together with 60 or more weighted incidents. Using this formula, Hackett now has an SVI of .12 (70 VADIRs ÷ 598 students). In the past Hackett has had an SVI greater than .5, but never 60 or more weighted incidents, so it has never been labeled a "Persistently Dangerous School."
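To make the arithmetic concrete, here is a minimal sketch of the SVI calculation as described above, using the figures reported at the meeting (the function names are ours, for illustration only):

```python
# Minimal sketch of the School Violence Index (SVI) calculation described
# above; the thresholds come from the meeting discussion.

def school_violence_index(total_vadirs: int, enrollment: int) -> float:
    """SVI = total VADIRs (weighted and unweighted) divided by enrollment."""
    return total_vadirs / enrollment

def is_persistently_dangerous(total_vadirs: int, weighted_incidents: int,
                              enrollment: int) -> bool:
    """Persistently Dangerous = SVI of at least 1.5, or SVI of at least .5
    together with 60 or more weighted incidents."""
    svi = school_violence_index(total_vadirs, enrollment)
    return svi >= 1.5 or (svi >= 0.5 and weighted_incidents >= 60)

# Hackett's current figures: 70 VADIRs, 3 weighted incidents, 598 students.
print(round(school_violence_index(70, 598), 2))  # 0.12
print(is_persistently_dangerous(70, 3, 598))     # False
```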
- 3-8 ELA All Students Growth (PLEASE VERIFY: In SED documents, I think this indicator is referred to as "Mean Student Growth Percentile in ELA"): This indicator is meant to shine a light on how well a particular teacher or school develops the students in their charge, independent of the skills students arrive with. To calculate this indicator, the state groups individual students into statewide cohorts of students with similar scores on past state assessments and similar demographics, and then compares their results on the next state assessment with others from their statewide cohort to come up with an individual student's "growth percentile." Students earn high growth percentiles when they perform better than expected based on results from the rest of their cohort. They earn low growth percentiles when the rest of their cohort outperforms them. Under the State's teacher evaluation system, which has come under fire and is now under a four-year hiatus, growth percentiles for all students on a particular teacher's roster have been averaged together to give the teacher a mean score used in the State's system for rating teachers from "Highly Effective" through "Developing" and "Ineffective." Growth percentiles for all students in a particular school have likewise been averaged together to give the school and its principal a mean percentile showing how effectively the school develops the students in its care, compared with others statewide.
For this Indicator Progress Report, Hackett is doing a calculation similar to the State’s, but using student scores on NWEA tests in place of the state exam results that the State uses. As in the State’s calculation, students are being placed into groups by demographic and past scores and compared to others outside the school. The goal with this monitoring method is to approximate as closely as possible the calculation SED will perform, in order to predict the school’s likelihood of meeting its target for this indicator.
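As a rough illustration of the growth-percentile idea (a toy sketch only, not SED's actual formula, whose cohort definitions and scoring rules we don't have), a student's percentile can be thought of as the share of similar-scoring peers the student outperformed on the next test. The cohort and scores below are hypothetical:

```python
# Toy sketch of the growth-percentile idea described above. SED's real
# calculation (cohort construction, demographics, scoring) is not available
# to us, so the cohort and scores here are hypothetical.

def growth_percentile(student_new_score: float,
                      cohort_new_scores: list[float]) -> float:
    """Percent of cohort peers (students with similar past scores and
    demographics) whose new score this student exceeded."""
    outperformed = sum(1 for s in cohort_new_scores if s < student_new_score)
    return 100 * outperformed / len(cohort_new_scores)

# Hypothetical cohort of peers with similar prior-year scores and demographics:
cohort = [188, 190, 191, 193, 195, 197, 198, 200, 202, 205]
print(growth_percentile(199, cohort))  # 70.0 -- outperformed 70% of peers
```

A school-wide mean of these individual percentiles is what gets compared against SED's target for the school.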
At the meeting, Mr. Paolino showed us the report he uses to analyze how students are doing, and how subgroups of students in the school are doing. In addition to showing raw NWEA scores, the report also provides a target value for the number of points a particular student would need to earn in order to have a 50% likelihood of passing the State ELA exam (earn at least a level 2).
Using its internally developed NWEA-based calculation, Hackett has calculated a school-wide Mean Student Growth Factor for ELA of 59.2%, meaning that students at Hackett are posting above-average improvements in their NWEA scores compared with others citywide and statewide. This mean growth factor (which, again, is an approximation based on NWEA data, not State assessment data) is well above the State's target for Hackett of 42.7%. Hopefully NWEA scores will prove predictive of State assessment results, and these results will be matched when SED runs these growth calculations using state exam results.
At the meeting we also discussed the NWEA test in general and the student experience of taking it. The test is taken at a computer in the school library, and each question is generated based on the student's accuracy on the previous question. In this way, each student's test is different and, if taken seriously, is intended to measure the student's maximum level of ability. At the lower end of the academic spectrum, test scores are expected to increase by large chunks each season; at the upper end, it can be difficult to increase a score by even a point or two.
Amelia Colafati provided a student perspective on the experience of taking the test. She explained that the questions get more difficult the more questions a student answers correctly; that the test starts with questions matching the level of a student's prior-season score, which can be difficult enough to psych out a student; and that, if taken seriously in its entirety, it can take hours. Given this, not all students put a sincere effort into the full test. Faced with an upcoming lunch period or favorite class, students might enter any odd answer to the last questions just to get the exam to stop.
Hackett has implemented a few changes meant to increase the likelihood of getting reliable results from this test. Students are allowed to pause the test and resume it later if they need a break; students might be told the grade equivalent of their score if doing so might encourage them to try harder; and students are encouraged to retake the test if their score seems lower than expected. Also, ELA and math classes are covered by substitutes during NWEA testing so that classroom teachers can supervise testing for their own students. Some students have responded to these changes with more serious efforts on the tests.
- 3-8 Math All-Students Growth (Again, I believe this indicator is actually SED's "Mean Student Growth Percentile in Math"): This indicator works the same as the ELA indicator above, except that here the State will pull State math assessment data to calculate Hackett's score, while this monitoring tool uses NWEA math results. The January 18 Progress Report still shows a percentile figure (54.7%) based on Fall NWEA test results, since Winter NWEA math testing has not yet ended. The figure may change when the new results are incorporated into the formula, but it will likely remain above SED's goal for Hackett, which is 38.3%. Again, our local formula uses NWEA test data, not the state assessment data that SED will use. Hackett doesn't know how well NWEA growth correlates with State assessment growth, but hopes that it will prove to be predictive.
- NWEA Math All Students Growth: This is a locally developed indicator based solely on growth in scores on the NWEA math tests, offered in the Fall, Winter and Spring, with the baseline taken from a student's score during Spring testing of the prior year and culminating with their score in Spring testing of the current year. The goal for each student is to increase three or more "levels" during this period (one level per Fall, Winter and Spring testing season). Above a test score of 212, each level is worth one point (so to move up three levels a student would be expected to move from a score of 212 to 215 over this period, for example), but below 212, a level consists of a several-point increase in test scores.
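To make the level mechanics concrete, here is a small sketch. The level width below 212 is a placeholder: the minutes say only that a level there spans "several points," so the 3-point width used here is hypothetical:

```python
# Sketch of the "levels" idea described above. Above a score of 212, one level
# equals one point. Below 212 a level spans several points; the 3-point width
# used here is a hypothetical placeholder, not an actual NWEA table value.

LOW_LEVEL_WIDTH = 3.0  # hypothetical points per level below a score of 212

def levels_gained(baseline: float, current: float) -> float:
    """Levels spanned from a baseline score up to a current score
    (assumes current >= baseline)."""
    below = max(0.0, min(current, 212) - min(baseline, 212)) / LOW_LEVEL_WIDTH
    above = max(0.0, current - max(baseline, 212))  # one point per level
    return below + above

# The example from the text: moving from 212 to 215 is a three-level gain,
# meeting the goal of three or more levels between last Spring and this Spring.
print(levels_gained(212, 215))       # 3.0
print(levels_gained(212, 215) >= 3)  # True
```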
There’s no figure in the Current Data column because math NWEA testing hasn’t ended yet. After Fall NWEA testing, 38.2% of students showed growth of the sort needed to increase a grade level by the end of the year. SED’s target for Hackett is 37.9%, so Hackett narrowly beat that figure in the Fall.
Unlike the prior two indicators, this one isn't only an approximation of what SED will calculate; it is the actual figure SED will use, since this indicator is based solely on NWEA scores, to which the school has full access.
- NWEA Reading All Students Growth: NWEA winter testing for reading is complete, so this progress report has current data for this indicator. Students on target to improve three levels this year need to have a two-level gain by now. 507 of our 598 students have Spring 2015 scores that allow them to be counted in this calculation, and of those 507, 314 meet the standard of increasing their scores by at least two levels, for an overall student body percentage of 61.9%. Our SED target of 50.6% is well below this.
Level 2 Indicators:
- Teacher Attendance: Mr. Paolino reviews attendance figures every day. Any person who has a class roll, sees students, and instructs them every day falls into the calculation for this indicator. At Hackett we have 61 teachers now, though the number can fluctuate with new hires or teachers who go out on extended leave. Our current teacher attendance rate is 96.67%; we'd need a monumental collapse not to make our current-year target of 94%. This rate reflects attendance at work, not necessarily work with students: if staff are engaged in professional development outside of the classroom (like Studio Classrooms), they are counted present in this figure, even though students will still experience a substitute teacher that day.
- School Safety – Number of Students in Out of School Suspension for at least 1 day: 53 students have been suspended for at least one day by this point in the year. By comparison, through last year Hackett was suspending an annual total of 215 students, so this is a vast improvement and is significantly below Hackett's SED target of 203 students.
Why the change? Mr. Paolino points to implementation of PBIS throughout the building, the added enrichment period that has given students extra time and support to complete classwork (with the result that 75% of students have failed no classes, and 45% made honor roll), and Hackett’s great full-service department.
A general conversation ensued about student behavior and balancing the goal of limited OSS with ensuring a well-functioning school. Dahlia Herring encouraged Hackett and the CET to work hard to lower this number further, saying that 53 kids in OSS are 53 kids too many. Mr. Paolino described the sorts of student behavior the school experiences and manages through its full-service department rather than OSS: for example, a student wearing a hoodie and using loud, foul language was escorted out of the cafeteria and brought to full-service, where a discussion with a counselor led her to apologize to Mr. Paolino and continue with her day at school. The school, though, needs the flexibility to figure out where to draw the line. Mr. Paolino challenged the CET with the question: how many times can you say to someone, "Can I have your phone? You violated the code of conduct"? At some point you may need to tell the student that such behavior can render you unemployed in professional America. Also, one student's disruptive behavior in a classroom disrupts the learning of all the other students in that classroom. The school's response to inappropriate behavior needs to protect the students who are trying to learn, while responding as sensitively as possible to the challenges of the students who are misbehaving.
When putting together this year's Continuation Plan for Hackett, the CET had briefly explored Guilderland's Focus program, which provides a separate educational environment for students with behavioral issues, meant to give them the supports to have a successful school experience and avoid OSS. The CET saw this program as a positive option for Hackett, but we didn't have time to study it enough to figure out how it might work here. At this meeting a member brought this option up again as something we should look into as a possible response to this issue. Mr. Paolino mentioned that the Focus program runs at the high school level, where students of different grade levels can be grouped in the same courses, an option Hackett doesn't have at the middle school level; this difference affects the staffing and space requirements of such a program.
At the meeting Mr. Paolino also showed the group several school safety charts he can view that graph school incidents by day of the week, grade level, etc. Weeks end better than they begin (Thursday and Friday see fewer incidents), and 6th graders have the best behavior, followed by 8th graders; 7th grade behavior is the worst.
- DTSDE Tenet 6: Parent and Community Engagement: Nothing is written in the “Analysis/Report Out” column for this tenet in the draft 2nd Quarterly Report because Mr. Paolino would like to get the CET’s input prior to writing. Mr. Paolino reminded the group that this tenet will be scored by SED using its DTSDE rubric, and that we will achieve at least the “Developing” grade that is our target for this year.
Still, members of the CET expressed frustration with the lack of movement on any of the Parent and Family Engagement objectives included in this year's Continuation Plan. Mention was again made of convening a subgroup of the CET to focus on this part of the plan, starting with ironing out the details of what the community outreach workers will be doing and what sort of person we need to do that job effectively, and again members raised hands to be counted among this group. Barry Walston offered to email everyone who has expressed interest in serving on this sub-committee, along with Dr. V. and Cathy Edmondson, to get the ball rolling on this issue. We were reminded that Mike Paonetta will begin work on 2/1 as Hackett's SIG officer and will serve as a point person to oversee these and other initiatives under Receivership. [The day after this meeting, though, with news of Dr. V.'s resignation came news that Mike Paonetta will be moving over to the principal's job at Myers Middle School and therefore will not be coming to Hackett, leaving an ongoing supervisory hole for these initiatives.]