spring breakthrough

Spring break at school? New research says it helps middle schoolers catch up

PHOTO: Photo by Lane Turner/The Boston Globe via Getty Images
Students transition to gym class from Zoe Pierce's sixth grade science class at the Impact School in Springfield, Massachusetts in January 2017. Working with state officials, Springfield education leaders have crafted a first-of-its-kind plan for a Massachusetts school system, spinning off the middle schools into what effectively is their own miniature school system.

It has spread across Massachusetts as a school turnaround strategy: bring students into school over spring break for hours and hours of extra instruction.

That could be a recipe for unhappy kids. But recent research in Lawrence and now Springfield, two Massachusetts cities under pressure to improve their schools, suggests that students don’t mind spending those weeks in school, and they really benefit from it.

In Springfield, students who participated were much more likely to score proficient on state exams later that year. They were less likely to be suspended afterwards, too.

“This is a minimally disruptive strategy for helping struggling students catch up,” said Beth Schueler of the University of Virginia, who studied the program. “You don’t have to fire or find a large pool of new teachers.”

It’s the latest research pointing to the benefits of intensive small group tutoring. But the Springfield program wasn’t open to all: Students were invited based on whether they were seen as likely to benefit and to behave, raising questions about whether the model helps the kids who need it most.

What are these vacation academies?

Schueler’s study focuses on nine low-achieving middle schools in Springfield, all part of a partnership between the state, local teachers union, and district formed to head off a state takeover. The spring break program (known as “empowerment academies”) started in 2016, during the partnership’s first year, in a bid to boost test scores.

Springfield’s April vacation academy focused on math, particularly on the academic standards that frequently come up on state exams, the researchers noted. Classes were small, with about one teacher for every 10 students. It amounted to an additional 25 hours of math instruction, or approximately an extra month’s worth of exponents and equations.

The idea was to provide extra math help, but the program was also designed to be appealing to students. Schools featured award assemblies and special theme days, like crazy hair days or superhero and villain dress-up days.

“It’s not just boot camp,” said Chris Gabrieli, the head of a nonprofit that helped implement the city’s turnaround approach. “If this was miserable for kids, it would never work the second year.”

The model was pioneered by Jeff Riley, now the Massachusetts commissioner of education, as principal of a Boston middle school. He took the idea to Lawrence as the head of that district’s state takeover.

Who got to attend?

Students were chosen for the program based on who was seen as likely to benefit and behave. That meant leaders avoided students with attendance or behavior issues, an approach used at the school where Riley piloted the model, too.

In Springfield, once each school decided which students were eligible, leaders allowed Schueler to randomly assign some of them the opportunity to attend in order to study the program. (Some students who didn’t win a spot attended anyway, and some who won a spot didn’t attend, something the study accounted for.)

Ultimately, the students who went to vacation academy differed from the student population as a whole: only 10 percent had a disability, compared to 22 percent of all sixth- and seventh-graders in the district. Attendees were also somewhat more likely to be girls, but were equally likely to qualify for free or reduced-price lunch.

Teachers applied to work over the break, and got bonuses — $2,500 in Springfield — for doing so.

How does the program affect students?

In terms of test scores, students on the cusp of clearing the state’s proficiency bar saw the biggest gains.

Students at school over spring break were much more likely to score proficient on the state math test: about 35 percent did, compared with 25 percent of similar students who didn’t attend, according to Schueler’s study, published last month in the peer-reviewed journal Education Finance and Policy.

Their average test scores were higher than those of students who didn’t attend the academy, though that boost wasn’t statistically significant.

Students were also less likely to be suspended after the vacation academy. While about 10 percent of control students were suspended once or more, less than 7 percent of students who went to the academy were.

The academy also may have helped students’ grades, with averages in both math and reading improving slightly, though that change was not statistically significant.

How a student benefited may be connected to which version of the program he or she attended. In some academy programs, students had one teacher all day; in others they rotated among teachers who worked on different standards. Students in the first group saw bigger declines in suspensions; students in the second group saw larger test-score gains.

“It could be that the additional time … provided by the stability of a single teacher allowed for the development of more deep and positive teacher-student relationships,” Schueler wrote. “It is also possible that the program changed educator perceptions of participating students in a way that decreased their likelihood to turn to exclusionary discipline for those children.”

Should other districts adopt this model?

This kind of intensive academic help has been shown to work over and over again. A previous study found test score improvements from the same vacation academy model in Lawrence, Massachusetts, and programs that provide individualized tutoring throughout the year have produced even bigger gains in Boston, Chicago and Houston.

But these programs have faced difficulty growing because they come with a hefty price tag, sometimes upwards of a few thousand dollars per student. The Springfield program costs about $600 per student, which is lower largely because its student-teacher ratio is higher than that of individualized tutoring.

Tutoring, Schueler said, “has a high upfront cost that I think deters many districts from pursuing this as a key strategy. These vacation academies are potentially a more scalable approach.”

It still might be a worthwhile investment for districts. But they will have to contend with the fact that the model doesn’t include certain students — particularly those with behavior and attendance problems, who could be in the most academic trouble.

“This is not an intervention aimed at helping every single kid of every type,” said Gabrieli.

Of students who didn’t participate in the program, Schueler said, “We don’t know, if they actually ended up coming, if we would see the same effects.”

testing 1-2-3

Tennessee students to test the test under reworked computer platform

PHOTO: Getty Images

About 45,000 students in a third of Tennessee districts will log on Tuesday for a 40-minute simulation to make sure the state’s testing company has worked the bugs out of its online platform.

That platform, called Nextera, was rife with glitches last spring, disrupting days of testing and mostly disqualifying the results from the state’s accountability systems for students, teachers, and schools.

This week’s simulation is designed to make sure those technical problems don’t happen again under Questar, which in June will finish out its contract to administer the state’s TNReady assessment.

Tuesday’s trial run will begin at 8:30 a.m. Central Time and 9 a.m. Eastern Time in participating schools statewide to simulate testing scheduled for Nov. 26-Dec. 14, when some high school students will take their TNReady exams. Another simulation is planned before spring testing begins in April on a much larger scale.

The simulation is expected to involve far more than the 30,000 students who will test in real life after Thanksgiving. It also will take into account that Tennessee is split into two time zones.

“We’re looking at a true simulation,” said Education Commissioner Candice McQueen, noting that students on Eastern Time will be submitting their trial test forms while students on Central Time are logging on to their computers and tablets.

The goal is to verify that Questar, which has struggled to deliver a clean TNReady administration the last two years, has fixed the online problems that caused headaches for students who tried unsuccessfully to log on or submit their end-of-course tests.



The two primary culprits were functions that Questar added after a successful administration of TNReady last fall but before spring testing began in April: 1) a text-to-speech tool that enabled students with special needs to receive audible instructions; and 2) coupling the test’s login system with a new system for teachers to build practice tests.

Because Questar made the changes without conferring with the state, the company breached its contract and was docked $2.5 million out of its $30 million agreement.

“At the end of the day, this is about vendor execution,” McQueen told members of the State Board of Education last week. “We feel like there was a readiness on the part of the department and the districts … but our vendor execution was poor.”

PHOTO: TN.gov
Education Commissioner Candice McQueen

She added: “That’s why we’re taking extra precautions to verify in real time, before the testing window, that things have actually been accomplished.”

By the year’s end, Tennessee plans to request proposals from other companies to take over its testing program beginning in the fall of 2019, with a contract likely to be awarded in April.

The administration of outgoing Gov. Bill Haslam has kept both of Tennessee’s top gubernatorial candidates — Democrat Karl Dean and Republican Bill Lee — in the loop about the process. Officials say they want to avoid the pitfalls that happened as the state raced to find a new vendor in 2014 after the legislature pulled the plug on participating in a multi-state testing consortium known as PARCC.



“We feel like, during the first RFP process, there was lots of content expertise, meaning people who understood math and English language arts,” McQueen said. “But the need to have folks that understand assessment deeply as well as the technical side of assessment was potentially missing.”

Academic Accountability

Coming soon: Not one, but two ratings for every Chicago school

Starting this month, Chicago schools will have to juggle two ratings — one from the school district, and another from the state.

The Illinois State Board of Education is scheduled to release its annual report cards for schools across the state on October 31. This year, for the first time, each school will receive one of four quality stamps from the state: an “exemplary” or “commendable” rating signals that the school is meeting standards, while an “underperforming” or “lowest performing” designation could trigger intervention, according to state board of education spokeswoman Jackie Matthews.

A federal accountability law, the Every Student Succeeds Act, requires these new ratings.

To complicate matters, the city and state ratings are each based on different underlying metrics and even a different set of standardized tests. The state ratings, for example, are based on a modified version of the PARCC assessment, while Chicago ratings are based largely on the NWEA. The new state ratings, like those the school district issues, can be given out without observers ever having visited a classroom, which is why critics argue that the approach lacks the qualitative metrics necessary to assess the learning, teaching, and leadership at individual schools.

Patricia Brekke, principal at Back of the Yards College Preparatory High School, said she’s still waiting to see how the ratings will be used, “and how that matters for us,” but that parents at her school aren’t necessarily focused on what the state says.

“What our parents usually want to know is what [Chicago Public Schools] says about us, and how we’re doing in comparison to other schools nearby that their children are interested in,” she said.

Educators at Chicago Public Schools understand the power of school quality ratings. The district already has its own five-tiered rating system: Level 1+ and Level 1 designate the highest performing schools, Level 2+ and Level 2 describe average and below average performing schools, respectively, and Level 3, the lowest performance rating, is for schools in need of “intensive intervention.” The ratings help parents decide where to enroll their children, and are supposed to signal to the district that a school needs more support. But the ratings are also a source of angst, used to justify replacing school leaders, closing schools, or opening new schools in neighborhoods where options are deemed inadequate.

In contrast, the state’s school quality designations actually target underperforming and lowest-performing schools with additional federal funding and support with the goal of improving student outcomes. Matthews said schools will work with “school support managers” from the state to do a self-inquiry and identify areas for improvement. She described Chicago’s school quality rating system as “a local dashboard that they have developed to communicate with their communities.”

Staff from the Illinois State Board of Education will be traveling around the state next week to meet with district leaders and principals to discuss the new accountability system, including the ratings. They’ll be in Bloomington, Marion, O’Fallon, Chicago, and Melrose Park. The Chicago meeting is Wednesday, Oct. 24, at 5 p.m. at Chicago Public Schools headquarters.

Rae Clementz, director of assessment and accountability at the state board, said that a second set of ratings reveals “that there are multiple valid ways to look at school quality and success; it’s a richer picture.”

Under the auspices of the Every Student Succeeds Act, the state school report cards released at the end of the month base elementary school ratings 75 percent on academics, including English language arts and math test scores, English learner progress as measured by the ACCESS test, and academic growth. The other 25 percent reflects school climate and success measures, such as attendance and chronic absenteeism.
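As a rough illustration of how a 75/25 weighting like this combines into one score, here is a minimal sketch. The function name, the component breakdown, and all the numbers are assumptions for the example, not the state’s actual formula, which assigns its own weights to individual indicators.

```python
# Illustrative sketch of an ESSA-style weighted composite score.
# All names and values here are hypothetical, not the Illinois formula.

def composite_score(academic_parts, climate_parts):
    """Average each category's components, then weight 75/25."""
    academic = sum(academic_parts) / len(academic_parts)
    climate = sum(climate_parts) / len(climate_parts)
    return 0.75 * academic + 0.25 * climate

# Example: normalized 0-100 component scores for one school
academic = [62, 58, 70]   # e.g., ELA, math, growth (hypothetical)
climate = [80]            # e.g., an absenteeism measure (hypothetical)
print(round(composite_score(academic, climate), 1))  # prints 67.5
```

The point of the sketch is simply that academic measures dominate: a school’s climate indicators can move the composite by at most a quarter of its range.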

Other measures are slated to be phased in over the next several years, including academic indicators like science proficiency and school quality indicators, such as school climate surveys of staff, students, and parents.

High school designations take a similar approach with English and math test scores, but they factor in graduation rates instead of academic growth and also include the percentage of 9th graders on track to graduate, that is, freshmen who earn 10 semester credits and receive no more than one semester F in a core course.
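The on-track rule reads like a simple two-part test, which a hypothetical check might express like this. The function and field names are illustrative, not an official schema, and the sketch assumes “earn 10 semester credits” means at least 10.

```python
# Hypothetical check for the freshman on-track rule described above:
# at least 10 semester credits earned, and no more than one semester F
# in a core course. Names are illustrative, not an official definition.

def is_on_track(semester_credits, core_course_fs):
    return semester_credits >= 10 and core_course_fs <= 1

print(is_on_track(11, 1))  # True: meets both thresholds
print(is_on_track(12, 2))  # False: too many core-course Fs
```

Note that both conditions must hold, so a student with plenty of credits can still fall off track through core-course failures alone.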

Critics of Chicago’s school rating system argue that the ratings correlate more with socioeconomic status and race than with school quality, and say little about what’s happening in classrooms and how kids are learning. Chicago does try to mitigate these issues by emphasizing growth in test scores over absolute attainment, using school climate surveys, and tracking academic growth for priority groups, like African-American, Latino, English learner, and special education students.

Cory Koedel, a professor of economics and public policy at the University of Missouri, said that many rating systems basically capture poverty status with a focus on how high or low students score on tests. Chicago’s approach is fairer than that of many other school systems.

“What I like about this is it does seem to have a high weight on growth and lower weight on attainment levels,” he said.

Morgan Polikoff, a professor at University of Southern California’s school of education, said that Chicago’s emphasis on student growth is a good thing “if the purpose of the system is to identify schools doing a good job educating kids.”

Chicago bases 50 percent of its rating on growth; at other districts, he said, he has seen weights ranging from 35 percent down to as low as 15 percent. But he said the school district’s reliance on the NWEA test, rather than the PARCC test used in the state school ratings, was atypical.

“It’s not a state test, and though they say it aligns with standards, I know from talking to educators that a lot of them feel the tests are not well aligned with what they are supposed to be teaching,” he said. “It’s just a little odd to me they would have state assessment data, which is what they are held accountable for with the state, but use the other data.”

He’s skeptical about school systems relying too heavily on standardized test scores, whether the SAT, PARCC or NWEA, because “You worry that now you’re just turning the curriculum to test prep, and that’s an incentive you don’t want to create for educators.”

He said the high school designations in particular draw on a wide array of measures, including some that follow students into college, “so I love that.”

“I really like the idea of broadening the set of indicators on which we evaluate schools and encouraging schools to really pay attention to how well they prepare students for what comes next,” he said.