spring breakthrough

Spring break at school? New research says it helps middle schoolers catch up

PHOTO: Lane Turner/The Boston Globe via Getty Images
Students transition to gym class from Zoe Pierce's sixth grade science class at the Impact School in Springfield, Massachusetts in January 2017. Working with state officials, Springfield education leaders have crafted a first-of-its-kind plan for a Massachusetts school system, spinning off the middle schools into what effectively is their own miniature school system.

It has spread across Massachusetts as a school turnaround strategy: bring students into school over spring break for hours and hours of extra instruction.

That could be a recipe for unhappy kids. But recent research in Lawrence and now Springfield, two Massachusetts cities under pressure to improve their schools, suggests that students don’t mind spending those weeks in school — and they really benefit from it.

In Springfield, students who participated were much more likely to score proficient on state exams later that year. They were less likely to be suspended afterwards, too.

“This is a minimally disruptive strategy for helping struggling students catch up,” said Beth Schueler of the University of Virginia, who studied the program. “You don’t have to fire or find a large pool of new teachers.”

It’s the latest research pointing to the benefits of intensive small group tutoring. But the Springfield program wasn’t open to all: Students were invited based on whether they were seen as likely to benefit and to behave, raising questions about whether the model helps the kids who need it most.

What are these vacation academies?

Schueler’s study focuses on nine low-achieving middle schools in Springfield, all part of a partnership between the state, local teachers union, and district formed to head off a state takeover. The spring break program (known as “empowerment academies”) started in 2016, during the partnership’s first year, in a bid to boost test scores.

Springfield’s April vacation academy focused on math — particularly on the academic standards that frequently come up on state exams, the researchers noted. Classes were small, with about one teacher for every 10 students. It amounted to an additional 25 hours of math instruction, or approximately an extra month’s worth of exponents and equations.

The idea was to offer extra math help, yes, but also to make the program appealing to students. Schools featured award assemblies and special theme days, like crazy hair days or superhero-and-villain dress-up days.

“It’s not just boot camp,” said Chris Gabrieli, the head of a nonprofit that helped implement the city’s turnaround approach. “If this was miserable for kids, it would never work the second year.”

The model was pioneered by Jeff Riley, now the Massachusetts commissioner of education, as principal of a Boston middle school. He took the idea to Lawrence as the head of that district’s state takeover.

Who got to attend?

Students were chosen for the program based on who was seen as likely to benefit and behave. That meant leaders avoided students with attendance or behavior issues, an approach used at the school where Riley piloted the model, too.

In Springfield, once schools decided which students were eligible, leaders allowed Schueler to randomly assign which of them were offered a spot, so she could study the program. (Some students who didn’t win a spot attended anyway, and some students who won a spot didn’t attend, something the study accounted for.)

Ultimately, the students who went to vacation academy were different than the student population as a whole — only 10 percent had a disability, compared to 22 percent of all sixth- and seventh-graders in the district. Attendees were also somewhat more likely to be girls, but were equally likely to qualify for free or reduced-price lunch.

Teachers applied to work over the break, and got bonuses — $2,500 in Springfield — for doing so.

How does the program affect students?

In terms of test scores, students on the cusp of clearing the state’s proficiency bar saw the biggest gains.

Students at school over spring break were much more likely to score proficient on the state math test: about 35 percent did compared to 25 percent of similar students who didn’t attend, according to Schueler’s study, published last month in the peer-reviewed journal Education Finance and Policy.

Their average test scores were higher than those of students who didn’t attend the academy, though that boost wasn’t statistically significant.

Students were also less likely to be suspended after the vacation academy. While about 10 percent of control students were suspended once or more, less than 7 percent of students who went to the academy were.

The academy also may have helped students’ grades, with averages in both math and reading improving slightly, though that change was not statistically significant.

How a student benefited may be connected to which version of the program he or she attended. In some academy programs, students had one teacher all day; in others they rotated among teachers who worked on different standards. Students in the first group saw bigger declines in suspensions; students in the second group saw larger test-score gains.

“It could be that the additional time … provided by the stability of a single teacher allowed for the development of more deep and positive teacher-student relationships,” Schueler wrote. “It is also possible that the program changed educator perceptions of participating students in a way that decreased their likelihood to turn to exclusionary discipline for those children.”

Should other districts adopt this model?

This kind of intensive academic help has been shown to work over and over again. A previous study found test score improvements from the same vacation academy model in Lawrence, Massachusetts, and programs that provide individualized tutoring throughout the year have produced even bigger gains in Boston, Chicago and Houston.

But these programs have faced difficulty growing because they come with a hefty price tag, sometimes upwards of a few thousand dollars per student. The Springfield program cost about $600 per student, a lower figure largely because its student-teacher ratio is higher.

Tutoring, Schueler said, “has a high upfront cost that I think deters many districts from pursuing this as a key strategy. These vacation academies are potentially a more scalable approach.”

It still might be a worthwhile investment for districts. But they will have to contend with the fact that the model doesn’t include certain students — particularly those with behavior and attendance problems, who could be in the most academic trouble.

“This is not an intervention aimed at helping every single kid of every type,” said Gabrieli.

Of students who didn’t participate in the program, Schueler said, “We don’t know, if they actually ended up coming, if we would see the same effects.”

state test results

With accelerated growth in literacy and math, Denver students close in on state averages

Angel Trigueros-Martinez pokes his head from the back of the line as students wait to enter the building on the first day of school at McGlone Academy on Wednesday. (Photo by AAron Ontiveroz/The Denver Post)

Denver elementary and middle school students continued a recent streak of high academic growth this year on state literacy and math tests, results released Thursday show. That growth inched the district’s scores even closer to statewide averages, turning what was once a wide chasm into a narrow gap of 2 percentage points in math and 3 in literacy.

Still, fewer than half of Denver students in grades three through eight met state expectations in literacy, and only about a third met them in math.


Denver’s high schoolers lagged in academic growth, especially ninth-graders who took the PSAT for the first time. Their test scores were lower than statewide averages.

“We are absolutely concerned about that,” Superintendent Tom Boasberg said Thursday of the ninth-grade scores, “and that is data we need to dig in on and understand.”

Students across Colorado took standardized literacy and math tests this past spring. Third- through eighth-graders took the Colorado Measures of Academic Success, or CMAS, tests, which are also known as the PARCC tests. High school students took college entrance exams: Ninth- and 10th-graders took the PSAT, a preparatory test, and 11th-graders took the SAT.

On CMAS, 42 percent of Denver students in grades three through eight met or exceeded state expectations in literacy. Statewide, 45 percent of students did. In math, 32 percent of Denver students met expectations, compared with 34 percent statewide.

While Denver’s overall performance improved in both subjects, third-grade literacy scores were flat. That’s noteworthy because the district has invested heavily in early literacy training for teachers and has seen progress on tests taken by students in kindergarten through third grade. That wasn’t reflected on the third-grade CMAS test, though Boasberg said he’s hopeful it will be as more students meant to benefit from the training take that test.

On the PSAT tests, Denver ninth-graders earned a mean score of 860, which was below the statewide mean score of 902. The mean PSAT score for Denver 10th-graders was 912, compared with the statewide mean score of 944. And on the SAT, Denver 11th-graders had a mean score of 975. Statewide, the mean score for 11th-graders was 1014.

White students in Denver continued to score higher, and make more academic progress year to year, than black and Hispanic students. The same was true for students from high- and middle-income families compared with students from low-income families.

For example, 69 percent of Denver students from high- and middle-income families met expectations on the CMAS literacy tests, compared with just 27 percent of students from low-income families – which equates to a 42 percentage-point gap. That especially matters in Denver because two-thirds of the district’s 92,600 students are from low-income families.

Boasberg acknowledged those gaps, and said it is the district’s core mission to close them. But he also pointed out that Denver’s students of color and those from low-income families show more academic growth than their peers statewide. That means they’re making faster progress and are more likely to reach or surpass grade level in reading, writing, and math.

Denver Public Schools pays a lot of attention to annual academic growth, as measured by a state calculation known as a “median growth percentile.”

The calculation assigns students a score from 1 to 99 that reflects how much they improved compared with other students with similar score histories. A score of 99 means a student did better on the test than 99 percent of students who scored similarly the year before.

Students who score above 50 are considered to have made more than a year’s worth of academic progress in a year’s time, whereas students who score below 50 are considered to have made less than a year’s worth of progress.

The state also calculates overall growth scores for districts and schools. Denver Public Schools earned a growth score of 55 on the CMAS literacy tests and 54 on the CMAS math tests. Combined, those scores were the highest among Colorado’s 12 largest districts.
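To make that statistic concrete, here is a minimal sketch in Python of the underlying idea. This is not Colorado’s actual methodology (the state’s model uses multi-year score histories and a more sophisticated statistical procedure), and the scores, the grouping rule, and the function names are invented for illustration: each student is compared only against students with similar prior-year scores, assigned a 1-99 percentile, and the district-style number is the median of those percentiles.

```python
# Illustration only: a toy "growth percentile" calculation.
# Colorado's real model uses multi-year score histories and a more
# sophisticated statistical method; the scores, the 10-point grouping
# rule, and the function names below are invented for this sketch.

from statistics import median

# Hypothetical records: (student_id, last_year_score, this_year_score)
students = [
    ("a", 480, 510), ("b", 483, 495), ("c", 482, 530), ("d", 481, 500),
    ("e", 700, 720), ("f", 702, 705), ("g", 703, 740), ("h", 701, 715),
]

def similar_score_group(last_year_score, band=10):
    # Crude stand-in for grouping students with "similar score histories."
    return last_year_score // band

def growth_percentiles(records):
    # Compare each student's current score only against students who
    # scored similarly last year, and express the result on a 1-99 scale.
    by_group = {}
    for sid, last, this in records:
        by_group.setdefault(similar_score_group(last), []).append((sid, this))

    percentiles = {}
    for group in by_group.values():
        scores = sorted(score for _, score in group)
        n = len(scores)
        for sid, this in group:
            below = sum(s < this for s in scores)  # peers this student beat
            percentiles[sid] = max(1, min(99, round(below / n * 100)))
    return percentiles

pcts = growth_percentiles(students)
print(pcts)                   # per-student growth percentiles
print(median(pcts.values()))  # a district-style median growth percentile
# A median above 50 reads as more than a year's worth of progress in a year.
```

Read this way, a district-level figure like Denver’s 55 in literacy is simply the median of its students’ individual percentiles, which is why values above 50 are treated as more than a year’s worth of growth.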

Other bright spots in the district’s data: Denver’s students learning English as a second language – who make up more than a third of the population – continued to outpace statewide averages in achievement. For example, 29 percent of Denver’s English language learners met expectations in literacy, while only 22 percent statewide did, according to the district.

Denver eighth-graders also surpassed statewide averages in literacy for the first time this year: 45 percent met or exceeded expectations, as opposed to 44 percent statewide. That increase is reflected in the high growth scores for Denver eighth-graders: 52 in math and 57 in literacy.

Those contrast sharply with the ninth-grade growth scores: 47 in math and an especially low 37 in literacy. That same group of students had higher growth scores last year, Boasberg said; why their progress dropped so precipitously is part of what district officials hope to figure out.

Trending up

Most schools in Tennessee’s largest district show growth on state test

PHOTO: Laura Faith Kebede
Students at Freedom Preparatory Academy's high school prepare to take their TNReady geometry test.

Most schools in Shelby County Schools showed progress in all subjects except science, but students still outshined their peers across the state in science, earning the district the state’s highest rating for growth.

About half of schools in the Memphis district saw a bump in English scores, also earning the district the highest rating of growth under the Tennessee Value-Added Assessment System, known as TVAAS.

Superintendent Dorsey Hopson attributed the gains to a renewed focus on preschool education in recent years, the addition of a reading curriculum more closely aligned with state standards, and a doubling down on literacy training for teachers and students.

“When you think about the investments that we’ve been able to make in schools over the last two years, I think the data is showing that we’re seeing a good return on our investment,” he told reporters Thursday.

But the scores don’t come without tension. Hopson recently teamed up with Shawn Joseph, the director of Metro Nashville Public Schools, to declare “no confidence” in the state’s test delivery system, which has been plagued with online problems since it began in 2016. Still, Hopson said educators are utilizing the data available to adjust strategies.

“It’s an imperfect measure, but it’s the measure we have right now,” he said. Hopson worries the failures of the state’s online testing system used by high schoolers made “some teachers and students lose focus.”

“There’s impact on those kids that we may never know about,” he said.


District-wide results released in July show that more young students are reading on grade-level, and that math scores went up across the board. But the percentage of high school students who scored proficient in reading dropped by 4 percentage points. Shelby County Schools still lags significantly behind the state average.

Shelby County Schools also improved its overall growth score, which measures how students performed compared with peers across the state who scored similarly the year before. The score rose from 1 to 2 on a 5-point scale. More than half of the district’s schools scored a 3 or above, meaning their students performed on par with or better than their peers across the state.

The district’s nearly 200 schools include about 50 charter schools that are managed by nonprofit organizations but receive public funding. The rest are run by the district.

Below are charts showing the five schools that performed best and worst in the district in each subject, as well as those that grew or declined the most in each subject.

The state doesn’t release data for an exam if fewer than 5 percent of students were on grade level or if 95 percent of students were above grade level. The charts below only include schools that fall in between that range.

English Language Arts

(Graphics by Samuel Park)

Math

(Graphics by Samuel Park)

Science

(Graphics by Samuel Park)

Social Studies

(Graphics by Samuel Park)