NCLB Public School Choice and Supplemental Educational Services: Outcomes
Supplemental Educational Services:
Compared to public school choice under NCLB, far more research and evaluation has been done on the impact of SES. In addition, numerous state evaluations annually measure the impact of individual SES providers. Yet there is no consensus on the extent to which SES has been effective in raising student achievement. In part, this is due to differences in how SES is implemented from the state level down to the provider level, with some states and providers clearly doing a better job than others.
As with nearly any federal program, SES is not the same everywhere; different providers offer varying kinds of services to different-sized groups of children for varying numbers of hours. Models range from computer-based instruction to one-on-one tutoring. With so many variables, researchers have yet to undertake the more sophisticated research needed to answer not only whether SES raises student academic achievement, but also which kinds of SES are best to offer and under what conditions they produce the best results.
However, there are reasons to be optimistic when it comes to SES outcomes.
A recent evaluation by researchers from Vanderbilt University and the RAND Corporation found “significant and positive effects of SES on student test scores in mathematics.” For students receiving two or more years of SES tutoring, the report found “significant cumulative impact on test score gains in both mathematics and reading.” (Springer, M.G., Pepper, M.J., Ghosh-Dastidar, B. (2009) Supplemental Educational Services and Student Test Score Gains: Evidence from a Large, Urban School District.)
Similarly, a major RAND Corporation report released by the U.S. Department of Education in January 2009 indicated that in an evaluation of the effects of SES in seven school districts nationwide from 2002 to 2005, student participants experienced gains in achievement in both reading and mathematics which were greater than the gains for nonparticipating students. “On average, across seven districts, participation in supplemental educational services had a statistically significant, positive effect on students’ achievement in reading and math. Students participating for multiple years experienced larger gains.” (U.S. Department of Education, www.ed.gov/rschstat/eval/choice/nclb-choice-ses-final/index.html)
As more states have conducted evaluations, many have found small yet positive effects of SES. Chicago Public Schools (CPS), a district that for many years fully embraced SES, specifically found that SES participants showed a 5% greater reading gain and a 13.2% greater math gain than would have been expected had they not participated in SES. The study also concluded that student achievement in math was directly affected by the number of hours of SES instruction, and that SES-served students with disabilities achieved greater gains in reading and math than students without disabilities. (Chicago Public Schools, Office of Research, Evaluation and Accountability, “The 2007 Supplemental Educational Services Program: Year Four Summative Evaluation,” https://research.cps.k12.il.us/resweb/pe)
These results echo an earlier CPS study in 2005, which found that SES programs were most helpful for those students who were farthest behind in reading and math. According to the CPS report, the federally funded tutoring “seemed to have helped students catch up to their peers in terms of gains on the Illinois Test of Basic Skills in both math and reading.”
Similarly, positive results were found in Portland, Oregon, where a 2010 audit found that the average achievement gains for the district’s 435 students who participated in SES during the 2008-2009 school year were slightly larger than the gains of students who didn’t participate in the tutoring. The report went on to say, “when students spent 20 hours or more in SES tutoring, they scored significantly higher than their non-tutored peers on the following year’s achievement tests.” In 2009, that meant that 60% of students receiving tutoring met academic benchmarks, compared to just 24% of students who did not participate in tutoring. (Portland Public Schools, “Supplemental Educational Services: Overall Compliance with Requirements but Opportunities Exist to Improve Effectiveness.” www.pps.k12.or.us/files/board/audit_published_february_2010.pdf)
Yet results vary markedly by location, a fact that has yet to be examined nationally. For example, Los Angeles Unified School District found in 2007 that SES participation led to a “statistically higher, yet substantively negligible performance gain” on the California Standards Test. Gains were greatest among students with high SES attendance records, and were most pronounced among elementary school students. (LAUSD Report #352, “The Impact of Supplemental Educational Services Participation on Student Achievement 2005-2006,” Los Angeles Unified School District Program Evaluation and Research Branch. http://notebook.lausd.net/portal/page?_pageid=33,102413&_dad=ptl&_schema=PTL_EP)
While the main measure of SES’s success must be its impact on student achievement, parent satisfaction can also demonstrate the extent to which the program has been successful. Indeed, most states take parent satisfaction into account when evaluating individual providers.
Florida, which has written the federal requirements for SES into state law, will evaluate its SES providers for the first time this year using the following formula:
- Student learning gains: 60%
- Attendance and completion data: 15%
- Parent satisfaction: 5%
- District satisfaction: 10%
- Principal satisfaction: 10%
North Carolina, as noted above, evaluates its SES providers using the following formula: student achievement 50%, parent satisfaction 25%, and student attendance 25%. Hawaii collects data on parent satisfaction, but does not use it to rate providers. Regardless of the weight states give to parent satisfaction, ratings vary by provider. In North Carolina, for example, parent satisfaction with individual providers ranged from 67% to 100%.
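The state formulas above are weighted averages: each component is scored, multiplied by its weight, and summed. A minimal sketch of that arithmetic, using Florida’s published weights (the provider’s component scores below are hypothetical, for illustration only):

```python
# Weighted composite score for rating an SES provider.
# Weights are Florida's published evaluation formula (summing to 100%);
# the component scores are hypothetical, each on a 0-100 scale.

FLORIDA_WEIGHTS = {
    "student_learning_gains": 0.60,
    "attendance_and_completion": 0.15,
    "parent_satisfaction": 0.05,
    "district_satisfaction": 0.10,
    "principal_satisfaction": 0.10,
}

def composite_score(component_scores, weights):
    """Return the weighted average of the provider's component scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(weights[k] * component_scores[k] for k in weights)

# Hypothetical provider data for illustration.
provider = {
    "student_learning_gains": 72,
    "attendance_and_completion": 88,
    "parent_satisfaction": 95,
    "district_satisfaction": 80,
    "principal_satisfaction": 85,
}

print(round(composite_score(provider, FLORIDA_WEIGHTS), 2))  # prints 77.65
```

Note how the 60% weight on learning gains dominates: a provider with delighted parents but weak test-score gains still rates poorly, which reflects the states’ stated priority on achievement.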
While fewer states calculate broader statistics on parent satisfaction, the data that do exist are positive. A New Mexico Public Education Department evaluation report on SES programs for 2005-2006 found that almost 90% of surveyed parents said that SES resulted in “some,” “a lot,” or “extensive” academic progress on the part of their children. A Hawaii Department of Education study conducted in 2008 concluded, after looking at three years of data, that “parents and guardians are overwhelmingly positive about [SES] tutors, especially logistics.” Hawaii SES providers routinely received positive ratings from more than 80% of parents surveyed from 2005 to 2008. As part of a 2005 SES evaluation, Chicago Public Schools found that nearly 87% of parents were satisfied with their child’s instruction, with at least 8 in 10 parents indicating that their children’s participation in SES improved study skills, made homework easier, and led to higher grades. And a four-state study indicated that more than 82% of parents agreed or strongly agreed that they were pleased with the SES services received by their children, and believed these services helped their children’s achievement.
Public School Choice:
For many years, different researchers using competing methods have reached opposite conclusions about various forms of public school choice (e.g., magnet schools). However, few published studies have specifically examined NCLB school choice. According to a 2010 dissertation, these “published studies on NCLB choice were inconclusive, with one finding improved achievement in choice schools (Okpala et al., 2007) and the other finding no choice advantage (McCombs, 2007).” The dissertation itself did not find higher student achievement linked to public school choice. To date, research has not fully addressed questions such as whether students’ achievement increases over time as they continue to attend higher-performing schools.
Another evaluation of NCLB school choice was a 2007 report released by the U.S. Department of Education. The report found that across six districts, “no statistically significant effect on achievement, positive or negative, was found for students participating in Title I school choice.” However, the report noted that the sample size was very small “so there was limited statistical power to detect effects;” because of this, the report advised readers to use caution when making definitive conclusions regarding the results. (U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service, State and Local Implementation of the No Child Left Behind Act, Volume I—Title I School Choice, Supplemental Educational Services, and Student Achievement, Washington, D.C., 2007.)
After 10 years in effect, SES and public school choice have provided several million students with services that most likely would not otherwise have been available. However, the data are clear that millions more could have received these services.
While some states and districts embraced these options for their students, as evidenced by their relatively high participation rates, far too many states all but ignored the requirement to offer SES and school choice to parents and students.
Education reformers can play a greater role in ensuring these services are made available by understanding the extent to which their state and school district have carried out these options. As such, this chart includes links to every state website devoted to SES (which oftentimes also covers school choice). In addition, a state link is included that identifies the schools and districts currently required to offer these options. Information on these sites, as well as on district websites, provides a sense of the extent to which these options are being offered. When it is clear they are not being offered, it is time to ask: why not?