At the Tuesday meeting of the state Board of Elementary and Secondary Education where the latest standardized test scores were released, those scores weren’t the main topic of the day. Instead, talk focused on a new twist in the ongoing discussion of whether to keep using the MCAS test or switch to PARCC: How about neither one?
Mitchell Chester, commissioner of elementary and secondary education, is due to make his recommendation on the tests to the board before its Nov. 17 vote. In a special meeting Monday, he told the board that he was now weighing a third possibility, or “Door No. 3,” as he put it: a so-called “MCAS 2.0,” which could use elements of the new PARCC tests to build a state-specific assessment.
“It’s not a binary decision,” Chester said Tuesday.
Public discussions of the decision, however, have largely focused on choosing between MCAS, the Massachusetts Comprehensive Assessment System test, and PARCC, the Partnership for Assessment of Readiness for College and Careers test. Massachusetts has been part of the consortium of states developing PARCC, and Chester chairs the PARCC governing board.
Now, state education secretary James Peyser said he’s willing to consider the new option.
“In thinking about MCAS 2.0, I think the question is, ‘What are the next steps in developing that assessment?’” Peyser said Tuesday.
Chester indicated that the work already done on PARCC could provide some of those steps.
“The question in my mind,” he said, “is to what extent can we take advantage of the development that’s been done in PARCC to take us down that road of a next-generation assessment?”
Laura Slover, CEO of PARCC Inc., the nonprofit organization that’s developing the new test, told the board Monday that it would be possible to use items from PARCC in a new, Massachusetts-designed test. But she noted that such a switch would require issuing a new request for proposals, which could delay any change, and that the costs of developing a new test are unknown.
The state has estimated that just getting all Massachusetts schools the necessary technology for administering PARCC (or any other test) on computers could cost as much as $12.3 million. Infrastructure upgrades could cost the state about $2.4 million.
Meanwhile, the statewide test results released Tuesday indicate that fewer students overall met state standards on the PARCC assessment than on MCAS.
These results are similar to the preliminary ones released last month, but Tuesday’s release includes the results for students who took the PARCC test this past spring on paper (about 41 percent), as well as those who used a computer (about 59 percent).
In most grades and subjects, the results show, students who took PARCC were less likely to score as having “met” or “exceeded” expectations than MCAS students were to score “Proficient” or “Advanced.” In fourth grade, however, the percentages were the same for math on both tests; in English language arts, a higher percentage of fourth graders passed this bar on MCAS than on PARCC.
“There’s no question that PARCC has set a higher standard for student performance,” commissioner Chester said at the Tuesday meeting.
In English language arts in grades 3-8, 60 percent of students met or exceeded expectations on PARCC, vs. 68 percent scoring proficient or advanced on MCAS; in math, it was 52 percent for PARCC, 60 percent for MCAS.
“These statewide PARCC scores will help establish a baseline for comparison with other PARCC states and with our own progress over time,” Chester said in a statement, “should the board choose to adopt PARCC within our statewide assessment.”
As part of a two-year “test drive” of PARCC, districts could choose this year whether to use PARCC or MCAS. And because the districts that chose PARCC differed demographically from the MCAS districts, the department said, it did not use all students’ scores in making its comparisons.
Instead, it used “representative samples” (about 75 percent of both groups) that were matched to statewide statistics on achievement variables and demographic measures. In particular, according to a presentation to the state board, “MCAS districts had 10% fewer low income students than PARCC districts.”
PARCC results for individual schools are expected to be released in November, possibly after the board makes its decision on what test to use. In the special meeting Monday, the board heard presentations on two recent studies comparing PARCC and MCAS.
One, from Mathematica Policy Research, showed that the two tests were about equal in their ability to predict college performance — and, like the SAT, were only moderately successful in doing so.
The other, commissioned by the state’s Executive Office of Education, concluded that both tests are “high-quality assessments with respect to issues of validity and reliability” that could be aligned with Common Core standards, but it also found complex benefits and risks for both. As a result, it said, “There is not a simple answer to the question of ‘MCAS or PARCC?’ ”
In response to a question at Tuesday’s board meeting, Chester and Peyser agreed that MCAS and PARCC are about equal in predicting college performance.
“But that’s one piece of the puzzle we’re trying to solve here,” Peyser said. “The other is to try to make sure that we’re sending the right signals to teachers and students while they’re in high school about what the expectations are,” both for what students should have learned at each level and for what they need to do in order to prepare for college and careers.
With the possibility of an MCAS 2.0, that puzzle just acquired a few new pieces.
WBUR’s Peter Balonon-Rosen contributed to this report.