When Gavin Newsom signed California’s Ethnic Studies (ES) requirement into law, one reason he cited was that “a number of studies have shown these courses boost student achievement.” But the reality is that there is no significant evidence these courses boost student achievement.
The most prominent studies claiming a positive effect of ES come from Stanford researchers who examined an ES program piloted in five high schools in the San Francisco Unified School District before the course was required district-wide in 2016. But the positive conclusions drawn from these studies are erroneous, reflecting a flawed experimental design and inappropriate inferences. I was never a fan of the old saw, “There are lies, damn lies, and statistics,” because the peer-review process within academic publishing roots out deficiencies and mistakes in research. But in this case, it didn’t. Not even close.
Long story short, ES was supposed to be taken by students with a GPA below 2.0 in the five San Francisco schools. The researchers believed they could draw sharp statistical inferences about the impact of ES by comparing the future outcomes of students just under the 2.0 GPA threshold with those of students just over the threshold, who didn’t take the course. Their main finding was nothing less than shocking: taking an ES course raised students’ overall GPA by 1.4 points, in effect saying that the ES course turned a sub-C student into a B+ student. A treatment effect of this magnitude from a single intervention is unheard of in education circles.
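The design the researchers relied on is known as a regression discontinuity: if taking the course depends only on whether GPA falls below 2.0, then comparing outcomes just on either side of the cutoff can isolate the course’s effect. Here is a minimal sketch on simulated data; every number in it is invented for illustration, with a true course effect of 0.5 GPA points baked in.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Invented data: baseline GPA is the "running variable"; everyone
# under the 2.0 cutoff takes the course (a clean, sharp rule).
baseline = rng.uniform(1.0, 3.0, n)
treated = baseline < 2.0
TRUE_EFFECT = 0.5  # assumed for illustration only

next_gpa = 0.8 * baseline + TRUE_EFFECT * treated + rng.normal(0, 0.3, n)

def predict_at_cutoff(x, y, cutoff=2.0):
    """Fit a line on one side of the cutoff and extrapolate to it."""
    slope, intercept = np.polyfit(x, y, 1)
    return intercept + slope * cutoff

# The jump in predicted outcomes at the cutoff estimates the effect.
estimate = (predict_at_cutoff(baseline[treated], next_gpa[treated])
            - predict_at_cutoff(baseline[~treated], next_gpa[~treated]))
```

When the 2.0 rule is strictly enforced, `estimate` recovers the built-in 0.5-point effect. The whole method rests on that rule actually determining who takes the course.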
But as is often the case when something seems too good to be true, the Stanford analysis does not support such a conclusion. Research by Richard Sander of UCLA and Abraham Wyner of the University of Pennsylvania highlights deficiencies in the Stanford study so serious that virtually no conclusions can be drawn from it about the impact of ES on student learning. They note that so little can be learned from the study that ES courses may even have lowered achievement.
One key problem with the Stanford study is that both the treatment group (students with GPAs under 2.0, who were supposed to take the course) and the control group (students with GPAs over 2.0, who were supposed to be excluded from it) were polluted. The treatment group included students who should have been in the control group, and about 40 percent of students who should have taken the course did not.
Losing 40 percent of those students with a GPA under 2.0 created a very small sample, with only 67 students with a GPA under 2.0 in the treatment group. Nearly twice as many students with a GPA greater than 2.0 took the course as an elective. This means that the treatment group was primarily made up of those who should have been in the control group.
By omitting so many low-GPA students from the treatment group, particularly those near the threshold, and by including so many students who should have been in the control group, again particularly those near the threshold, the analysis loses its ability to make the clean threshold comparison the design depends on. Moreover, only four teachers taught the ES course, which makes it difficult to separate the impact of a teacher from the impact of the course curriculum. And if you are wondering whether the high schools implemented any other educational interventions to help students with low GPAs: they did, which means the researchers would need to control for those effects to measure the individual contribution of ES. They did not try to do this, even though they were aware of the other programs and student supports being provided. Just like that, a natural experiment that was supposed to shed light on the effect of ES on student outcomes became anything but.
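To see why polluted groups matter so much, consider a simulation in which the course has zero true effect but, as in San Francisco, enrollment is voluntary in practice; here I also assume, purely for illustration, that more motivated students are likelier to opt in. All rates and coefficients below are invented. A naive comparison of actual takers to non-takers near the cutoff then manufactures a positive “effect” out of selection alone.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

baseline = rng.uniform(1.0, 3.0, n)
motivation = rng.normal(0, 1, n)  # unobserved by the researcher

# Polluted assignment (invented rates): roughly 40% of under-2.0
# students skip the course, many over-2.0 students enroll electively,
# and motivated students are more likely to opt in either way.
p_take = np.where(baseline < 2.0, 0.6, 0.3) + 0.15 * motivation
took = rng.random(n) < p_take

# Zero true course effect: next-year GPA depends only on baseline GPA,
# motivation, and noise -- the course itself contributes nothing.
next_gpa = 0.8 * baseline + 0.3 * motivation + rng.normal(0, 0.3, n)

# Naive takers-vs-non-takers comparison near the 2.0 cutoff.
near = np.abs(baseline - 2.0) < 0.15
naive_effect = next_gpa[near & took].mean() - next_gpa[near & ~took].mean()
# naive_effect comes out clearly positive, despite a true effect of zero.
```

Once who takes the course is no longer determined by the threshold, the comparison measures who the takers are, not what the course does.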
For a moment, put aside the various weedy details here, and ask yourself the following: If an intervention is so remarkably successful, as the Stanford study concludes, something that could turn a C or below student into a B+ student, then shouldn’t it be readily apparent within the data? Shouldn’t the impact jump out at us?
It should, but it doesn’t. The researchers sorted students into groups by GPA, ranging from 1.2–1.3 up to 3.9–4.0, and reported those students’ GPAs for the following academic year, after the ES course. As you might guess, students with very high GPAs continued to have very high GPAs the following year, and students with very low GPAs continued to have very low GPAs. And the students with a GPA just above 2.0 had a subsequent GPA nearly identical to that of students with a GPA just below 2.0. Nothing about the impact of an ES course jumps out at us.
So where is the big effect that the study reports? Part of it comes from the fact that students who began with a GPA between 2.2 and 2.3 had a GPA between 1.6 and 1.7 the following year, while students who began with a GPA between 1.7 and 1.8 also had, on average, a GPA between 1.6 and 1.7 the following year.
All these students (a small number in any case) had a lower GPA the following year. It is just that the ones who began with a GPA between 1.7 and 1.8 declined less than those who began with a GPA between 2.2 and 2.3. Voilà. There you have it. In other words, “Hey mom, I know my grades declined this year, but hey, it could have been worse!”
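The arithmetic behind that “effect,” using the midpoints of the GPA bins reported above, is nothing more than a difference between two declines:

```python
# Bin midpoints from the study's reported groupings.
treated_start, treated_next = 1.75, 1.65  # began 1.7-1.8, ended 1.6-1.7
control_start, control_next = 2.25, 1.65  # began 2.2-2.3, ended 1.6-1.7

treated_change = treated_next - treated_start  # about -0.10: a decline
control_change = control_next - control_start  # about -0.60: a bigger decline

# The "gain" attributed to the course is just the smaller of two losses.
relative_gain = treated_change - control_change  # about +0.50
```

Both groups lost ground; the course-takers simply lost less, and that gap is what gets reported as a gain.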
Professors Sander and Wyner describe the Stanford research as “shoddy” and state that “California parents are not being told the truth about the education of their children.”
More broadly, ES courses have been required in all San Francisco high schools for the last five years. If taking an ES course were a game changer, it should be obvious in student learning outcomes. It is not. Test scores among SF high school students haven’t budged since 2015, the last year before all SF students were required to take ES.
Requiring ES will fatten the wallets of those whose business is to teach ES and to train new faculty to teach ES. But there is no reason to believe that it will improve student learning outcomes in a state where more than 80 percent of Hispanic and Black students lack proficiency in mathematics and science.