Why should parents be interested? Because the Melrose Public Schools will be measured based on the aggregate student data of all district schools, and each school will be measured based on the aggregate student data of that school. When you see newspapers reporting on districts, or read more refined data on different websites, you should know how these numbers were calculated. The results should be one factor in how resources are allocated in the district, and taxpayers should feel confident that their money is being used wisely to improve education for our students. You'll also want to be aware that the schools are responsible for educating all students in every group and subgroup, and it's our privilege as a community to ensure that we meet students where they are and do everything we can to bring them to the next level equitably. As always, the Melrose School Committee will hear a report from administrators on the results in late fall.
Why did MA need a new accountability system?
· US Dept. of Education’s Every Student Succeeds Act (ESSA)
  o DESE was thinking about a new accountability system even before ESSA. DESE hadn’t designed the old system so that Level 2 meant a school was bad (it left room for interpretation); the language around accountability needed to change.
  o A lot of federal money is at stake (Title I allocations went out this past Monday).
  o Requirements: “annual meaningful differentiation between schools;” “ambitious state-designed long-term goals;” continued annual testing; 95% participation; identifying the lowest-performing 5% of schools and high schools with graduation rates below 67%; identifying schools with low-performing subgroups.
· State requirements/reasoning
  o Achievement Gap Act
  o Public info sharing
  o State resource and federal grant allocations
Timeline: DESE did a listening tour, modeling, more listening, and revising (4/16-4/17); the plan was submitted to the US Dept. of Ed. and approved by the BOE in June 2017. The final plan is not exactly representative of everyone's input. (The Sept. 19th BOE meeting will include the first results from the new accountability system.)
Metrics
Non-High Schools
· Achievement: MCAS average scaled score for English and Math; CPI for Science
· Student growth: mean SGP for ELA and Math
· English Language Proficiency: progress made (note: 4% of students in MA were ELL in 2002 vs. 9% now)
  o Half of ELL students are in 5 MA districts
  o ACCESS is the testing measure
  o It takes about 6 years, on average, to bring ELL students to English proficiency
· Chronic absenteeism, measured as missing 10% or more of school days (about 18 days in a standard 180-day year). There is a “fairly serious attendance problem” across the Commonwealth, and kids who are in school more do better. A school day is counted as a day on which a student is receiving services. There is a wide range across districts; in one district, 33% of students miss more than 10 days.
High Schools
· Similar to non-high schools, but adds graduation rate, dropout rate, and the percentage of 11th and 12th grade students completing at least one advanced course (AP, IB, post-secondary).
Problem with measuring the closing of the achievement gap:
· Must have two reference points, and there are schools that have all high-needs students and others that have no high-needs students.
· Problem: when the high-performing group declines, the gap is still closing, but not in the desired manner. (For example, if the higher group's scores drop while the lower group's stay flat, the gap narrows even though no one improved.) We want the lowest-performing students to get better (raising the floor). It's better to frame it as lower performers vs. higher performers.
· Urban schools do really well when students stay, but so much transience skews the data. DESE will now look at the lowest performers who stay (e.g., from 3rd to 4th grade), and that group will have an improvement trajectory.
Ask: how are our students doing on each of the metrics listed above?
The superintendent will have two numbers for each school: a percentile and a percentage toward improvement targets. (In general, if the percentage is 75% or higher, the school is meeting its improvement targets.) These two numbers are weighted equally.
· Categorizing schools: those needing intervention (15%) and those that don't (85%), determined at the discretion of the Commissioner. If a school's percentile is between 1 and 10 and it is not already identified as needing intervention, it is identified as in need of focused/targeted support. Identification still requires a minimum of 20 students. (A simplified sketch of how these thresholds fit together follows this list.)
· No more “if you have a Level 2 school then you are a Level 2 district.” DESE won’t give districts a rating like this anymore.
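To make those thresholds a bit more concrete, here is a minimal sketch in Python of how they might fit together. This is an illustration only, not DESE's actual methodology: the function name, the category labels, and the way the cutoffs are combined are my assumptions based on the description in these notes (lowest ~5% flagged for intervention at the Commissioner's discretion, percentiles 1-10 flagged for focused/targeted support, a 20-student minimum, and 75%+ toward targets generally meaning a school is meeting them).

```python
# Illustrative sketch only -- not DESE's actual formula. Names, labels, and
# cutoffs are assumptions drawn from the notes above.

def school_category(percentile: int, pct_targets_met: float, n_students: int) -> str:
    """Roughly categorize a school from its accountability percentile (1-99)
    and its percentage toward improvement targets (0-100)."""
    if n_students < 20:
        # Too few students for reliable identification.
        return "insufficient data (fewer than 20 students)"
    if percentile <= 5:
        # Lowest-performing ~5%: candidates for intervention
        # (the final call rests with the Commissioner).
        return "requiring assistance/intervention"
    if percentile <= 10:
        # Low percentile, but not in the intervention group.
        return "focused/targeted support"
    if pct_targets_met >= 75:
        # In general, 75%+ toward targets means the school is meeting them.
        return "meeting improvement targets"
    return "not requiring assistance; partially meeting targets"


# Example: a school at the 42nd percentile that met 80% of its targets.
print(school_category(percentile=42, pct_targets_met=80, n_students=350))
```

The point of the sketch is simply that the percentile drives identification for support, while the percentage toward targets drives the “is the school improving?” judgment for everyone else.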
What should SC members focus on? It depends:
· Mostly: “how are schools improving?” Subgroup concerns. Specific concerns: chronic absenteeism, ELL, etc.
· A small number of districts: are you among the lowest performers? Equity concerns, dropout/graduation rate concerns.
District and school report cards:
· DESE will redesign the report card in late fall 2018.
· Measures of performance/opportunity beyond assessment & accountability results:
  o Discipline, availability of art education (this was BIG), educator data, grade 9 course-passing, per-pupil expenditures
· MCAS scores won’t be the first metric reported.
· Why isn’t discipline in the accountability system? DESE couldn’t allow a situation where a school decided whether or not to discipline a student based on accountability, but discipline will be on the report card. The last cuts were arts education and grade 9 course-passing. (Students who fail one class in 9th grade are 25% more likely to drop out.)
· Comment: there is a huge differential in the quality of art programs. DESE: they don’t yet have criteria to measure that.
· Educator data is important to student outcomes too (inexperience, out-of-field teaching, staff attendance), but it’s not in the criteria at this time.
Key factors in student success: is the child progressing in all facets of education (math, science, having fun, etc.)? Unfortunately, there is no “fun” metric.