Data vs. Politics: Why Has Commissioner Chester Stepped Back from PARCC?
Massachusetts Education Commissioner and PARCC Chairman Mitchell Chester just took a big step back from the test he's championed in Massachusetts, and observers are left wondering why. As of last school year, the state allowed districts to choose between PARCC, the Common Core-aligned assessment taken by students in eleven states last school year, and MCAS, the Massachusetts state assessment in place since 1998. In the debate between PARCC and MCAS, Chester has always been a staunch supporter of adopting PARCC. But last week Chester said that rather than choosing between PARCC and MCAS, Massachusetts should create a third option: a PARCC-inspired, Massachusetts-specific exam he called "MCAS 2.0."
What's behind the Commissioner's decision? At this point, it's unclear what drove Chester's change of heart, but one thing is for sure: it wasn't the data.
Two major pieces of data on PARCC and MCAS were released this month, and the results do little to clarify the Commissioner's new stance. The first, a study commissioned by the state of Massachusetts to determine how well MCAS and PARCC predicted students' success in college, was released earlier this month. Researchers at Mathematica Policy Research concluded that both tests effectively predicted how ready students were for college, though PARCC predicted readiness in math slightly better than MCAS. With this in mind, observers might have expected that this year's PARCC and MCAS scores would be roughly equivalent, but not so: the second piece of data, the state's assessment results, showed that fewer students scored proficient or advanced on PARCC exams than on MCAS.
The gap between the tests' similar predictive power and their divergent scores has led some observers to speculate that MCAS sets a lower bar for achievement than PARCC. But given the limitations of both the Mathematica study and the state's assessment data, it's not so simple.
In the Mathematica study, researchers tested first-year, in-state students at public Massachusetts colleges and universities. Half of the students took a 10th grade MCAS exam, and half took a 10th grade PARCC exam, in either math or reading. Researchers then compared these results with each student's college grades and SAT scores to analyze how well PARCC and MCAS predicted a student's college performance. They found that both exams predicted grades effectively, making both effective measures of college readiness. The study, however, has several limitations:
- It is not longitudinal research; that is, it does not follow students over time. A longitudinal study would have given us a more accurate picture of the relationship between a student's 10th grade MCAS or PARCC scores and his or her success in college.
- It did not test an age-appropriate sample. Researchers gave a test of 10th grade skills to a group of college students, who likely performed differently on it than they might have in the 10th grade.
- It was limited to in-state students at public colleges and universities, who may have demographic features different from students who attended out-of-state schools, private institutions, or community colleges.
- It only tested the validity of 10th grade assessments. We can glean from this research that all four 10th grade exams may be effective predictors of college readiness, but that leaves us knowing nothing about the efficacy of any other PARCC or MCAS test.
The Mathematica study isn't necessarily bad practice; it is simply a preliminary finding produced under constraints. But that means its findings are far from definitive.
Similarly, we cannot draw broad conclusions from comparing Massachusetts' PARCC and MCAS scores, as there are limitations to these data as well. Because districts were able to choose which assessment to administer this year, there are visible differences between the districts behind each exam's results. Most notably, districts that chose MCAS had, on average, 10 percent fewer low-income students than districts that chose PARCC. Though the department used representative sampling to mitigate these demographic differences, it would have been impossible to entirely remove their effect on student performance. As with the Mathematica study, this is also only one testing cycle's worth of data. There are many potential reasons for discrepancies between exams, and only multiple years of testing could identify differences with any accuracy.
Naturally, the state does not want to continue administering two different assessments for the sake of comparing their results; it wants to move forward with a single strong option. But the emerging research gives no indication that one test is superior to the other, nor that a third option is necessary. Rather than reinventing the wheel with a costly, time-consuming new assessment (see Chad Aldeman's post on this), the state should build on the research it has commissioned and gathered to make an evidence-based decision.
Instead, it seems state leadership is bowing to political pressure. Commissioner Chester has always been a strong supporter of the Common Core, and it came as a surprise to many that he has withdrawn his support for continuing to use PARCC in Massachusetts. He has faced opposition from politicians, advocates, and the public, who have largely used PARCC as a proxy for opposition to the Common Core standards. And perhaps it's the latter that pushed the Commissioner to call for the continued use of a state-specific assessment. State Secretary of Education Jim Peyser, who delivered the recommendation alongside Chester, was summarized as saying:
"Retaining state control over the test would not only give Massachusetts leeway in crafting the assessment given to students, but would also allow for more flexibility in making changes to the underlying Common Core curriculum standards that the state test must be aligned with."
This observation from Peyser points to what may be the real reason Massachusetts wants to drop PARCC: Common Core politics. As the debate over Common Core continues in the states, and more constituencies enter the dialogue, state governments are under growing pressure to distance themselves from the standards. Peyser's position may signal not only the state's wish to break off from the PARCC consortium, but a longer-term push away from the Common Core. If this is truly the case, it is grim news for the future of Common Core. As American Enterprise Institute Fellow Gerard Robinson has observed, Massachusetts is a leading state on education, and its choices about the future of PARCC will send a particularly strong signal to other states. The same would be true if Massachusetts chose to alter or abandon the Common Core.
As the Commissioner backs away from the evidence-based approach he previously championed, the state is left without a clear vision for the future of student assessment. If Massachusetts spends further time and money developing a new assessment, implementing it in classrooms, and waiting for data on its effectiveness, schools will continue to operate in limbo. As Linda Noonan, Executive Director of the MA Business Alliance for Education, said in a recent interview: "[MCAS 2.0] sounds like a sort-of Band-Aid approach, and perhaps generated more from political motivation than from…an educational imperative."