voucher verdict

Students’ math scores drop for years after using a private school voucher in country’s largest program

Ft. Wayne, IN: Students walk past "The Road to Success" sign displayed in the hallway at Horizon Christian Academy in Ft. Wayne, Indiana December 20, 2016. Horizon Christian Academy is one of more than 300 schools that accepts vouchers in Indiana, which has the largest statewide voucher program in the nation.

Low-income students who use a voucher to attend private school in Indiana see their math scores drop for several years as a result, according to a new study.

The findings are a blow to the argument that poor students benefit from the choice to attend a private school, a policy championed by U.S. Secretary of Education Betsy DeVos.

“Our results do not provide robust support that the [voucher program] has been successful to date at improving student achievement for low-income students who use a voucher to switch from a public to a private school,” conclude the researchers, Mark Berends of Notre Dame and Joseph Waddington of the University of Kentucky.

The paper focuses on the initial rollout of what has become the largest school voucher program in the country. In the most recent school year, over 35,000 students were enrolled in the initiative.

The study examines a few thousand low-income students who switched from public to private school using a voucher starting in the 2011-12 school year.

Notably, the authors show that low-income students who used a voucher had slightly higher starting test scores than low-income kids who stayed in public schools. This gives credence to fears that a voucher program could concentrate the most disadvantaged students in the public school system.

The authors attempt to control for these and other factors to isolate the effect of attending a private school. (Unlike some voucher studies, this paper is not able to compare students who randomly won or lost a chance to attend private school — a stronger method.)

In math, the results, which focus on grades five through eight, are consistently negative. Even four years into the program, students who used a voucher had lower test scores than public school students.

In English, there were no clear effects. Here, there was some evidence that voucher students improved over time, though there were no statistically significant positive effects after four years.

The results were generally consistent for students of different races, genders, and locations. The findings, though, might not hold for more affluent students or the increasing numbers of participating students who never attended a public school to begin with. Neither group is accounted for in this study.

The results, published this week in the peer-reviewed Journal of Policy Analysis and Management, are largely in line with an earlier version of the same study — with a key exception. The first paper suggested that declines in math disappeared for students who used a voucher for multiple years. The latest version finds that the negative effects seem to persist for at least four years.

The results, then, undermine the argument that recent studies showing drops in voucher students’ test scores just represent students adjusting to private school. A recent study in Washington, D.C. showed that substantial drops in math achievement persisted into a second year. Negative results were also consistent in an Ohio voucher study. In Louisiana, math and reading test scores bounced back after three years according to one method but not another; drops in social studies were consistent.

Voucher advocates have responded by arguing that tests should not be the sole judge of these programs’ success, pointing to more favorable research looking at high school graduation and college enrollment. Despite test score declines, a recent study showed that Louisiana’s program had no effect or somewhat positive effects on college enrollment.

“Although academic achievement outcomes are important for researchers, policymakers, and practitioners to consider, parents make schooling decisions for their children based on a multitude of factors, including academics, location, safety, and religion,” Waddington and Berends write. “Therefore, researchers need to examine outcomes beyond test scores.”

Critics might also point to other concerns not captured in the study: A Chalkbeat investigation in 2017 found that about one in 10 Indiana private schools that accepted a voucher had policies that explicitly discriminated against LGBT students.

That’s perfectly legal under Indiana’s system, as well as under the vast majority of publicly funded private school choice programs.

testing testing

McQueen declares online practice test of TNReady a success

PHOTO: Manuel Breva Colmeiro/Getty Images

Tennessee’s computer testing platform held steady Tuesday as thousands of students logged on to test the test that lumbered through fits and starts last spring.

Hours after students in more than a third of the state’s school districts completed the 40-minute simulation, Education Commissioner Candice McQueen declared the practice run a success.

“We saw what we expected to see: a high volume of students are able to be on the testing platform simultaneously, and they are able to log on and submit practice tests in an overlapping way across Tennessee’s two time zones,” McQueen wrote district superintendents in a celebratory email.

McQueen ordered the “verification test” as a precaution to ensure that Questar, the state’s testing company, had fixed the bugs that contributed to widespread technical snafus and disruptions in April.

The spot check also allowed students to gain experience with the online platform and TNReady content.

“Within the next week, the districts that participated will receive a score report for all students that took a practice test to provide some information about students’ performance that can help inform their teachers’ instruction,” McQueen wrote.

The mock test simulated real testing conditions that schools will face this school year, with students on Eastern Time submitting their exams while students on Central Time were logging on.

In all, about 50,000 students across 51 districts participated, far more than the 30,000 high schoolers who will take their exams online after Thanksgiving in this school year’s first round of TNReady testing. Another simulation is planned before April, when the vast majority of testing begins both online and with paper materials.

McQueen said her department will gather feedback this week from districts that participated in the simulation.

testing 1-2-3

Tennessee students to test the test under reworked computer platform

PHOTO: Getty Images

About 45,000 students in a third of Tennessee districts will log on Tuesday for a 40-minute simulation to make sure the state’s testing company has worked the bugs out of its online platform.

That platform, called Nextera, was rife with glitches last spring, disrupting days of testing and mostly disqualifying the results from the state’s accountability systems for students, teachers, and schools.

This week’s simulation is designed to make sure those technical problems don’t happen again under Questar, which in June will finish out its contract to administer the state’s TNReady assessment.

Tuesday’s trial run will begin at 8:30 a.m. Central Time and 9 a.m. Eastern Time in participating schools statewide to simulate testing scheduled for Nov. 26-Dec. 14, when some high school students will take their TNReady exams. Another simulation is planned before spring testing begins in April on a much larger scale.

The simulation is expected to involve far more than the 30,000 students who will test in real life after Thanksgiving. It also will take into account that Tennessee is split into two time zones.

“We’re looking at a true simulation,” said Education Commissioner Candice McQueen, noting that students on Eastern Time will be submitting their trial test forms while students on Central Time are logging on to their computers and tablets.

The goal is to verify that Questar, which has struggled to deliver a clean TNReady administration the last two years, has fixed the online problems that caused headaches for students who tried unsuccessfully to log on or submit their end-of-course tests.


The two primary culprits were changes that Questar made after a successful administration of TNReady last fall but before spring testing began in April: 1) a text-to-speech tool that enabled students with special needs to receive audible instructions; and 2) the coupling of the test’s login system with a new system for teachers to build practice tests.

Because Questar made the changes without conferring with the state, the company breached its contract and was docked $2.5 million out of its $30 million agreement.

“At the end of the day, this is about vendor execution,” McQueen told members of the State Board of Education last week. “We feel like there was a readiness on the part of the department and the districts … but our vendor execution was poor.”

PHOTO: TN.gov
Education Commissioner Candice McQueen

She added: “That’s why we’re taking extra precautions to verify in real time, before the testing window, that things have actually been accomplished.”

By the year’s end, Tennessee plans to request proposals from other companies to take over its testing program beginning in the fall of 2019, with a contract likely to be awarded in April.

The administration of outgoing Gov. Bill Haslam has kept both of Tennessee’s top gubernatorial candidates — Democrat Karl Dean and Republican Bill Lee — in the loop about the process. Officials say they want to avoid the pitfalls of 2014, when the state raced to find a new vendor after the legislature pulled the plug on participating in a multi-state testing consortium known as PARCC.


“We feel like, during the first RFP process, there was lots of content expertise, meaning people who understood math and English language arts,” McQueen said. “But the need to have folks that understand assessment deeply as well as the technical side of assessment was potentially missing.”