This article was published 11/5/2012, so information in it may no longer be current.
In Nick Martin's piece "Some schools opted out of national math test" (May 10), Manitoba Teachers' Society president Paul Olson implies that the disappointing mathematics performance of Manitoba students is a mere conjurer's trick, with other provinces tinkering with the study's samples.
"You just never know if there's a quiet conversation going on (to suggest a school with below-average academic performance opt out)," he says.
Only Manitoba, Olson stresses, saw all its randomly selected schools participate.
Olson first raised objections of this sort when he debated the test results with me and Anna Stokke, a fellow math professor, on CJOB last fall.
Jarringly, a union representative inserts himself into the issue to defend the status quo by arbitrating the interpretation of a carefully articulated 174-page study, commissioned by the provincial ministers of education. In contrast, Gerald Farthing, deputy minister of education, expressed concern about the findings. In our ongoing discussions with him, Farthing has always taken our concerns seriously.
Let's examine some of Olson's claims. First, consider those missing schools.
Nationally, 1,687 schools were randomly selected for the study, distributed by population. Manitoba and P.E.I. had 100 per cent school participation; all but three had more than 95 per cent; none had less than 80 per cent. Altogether, a remarkably good response rate.
Small numbers of dropped clusters of data are benign; if a school does not participate, both strong and weak students are excluded. A more effective way to skew scores in one's favour would be to exclude only weak students.
Manitoba systematically exempted 5.2 per cent of its students from the study -- a much higher rate than the two top-performing provinces, Quebec (0.7 per cent) and Ontario (1.8 per cent), and greater than all except four other low-performing provinces.
Students were excluded from this study for three reasons: functional disabilities; intellectual disabilities or socio-emotional conditions; and language.
This places Olson's claim that Manitobans participated "regardless... of poverty, fetal alcohol spectrum disorder or other factors affecting academic performance" in a very different light.
Yet another table in the report records "non-participation" -- students who failed to participate though not explicitly excluded. Manitoba had a 92 per cent participation rate -- lower than every province except Quebec and the Yukon, and much lower than Ontario's.
My point is Olson's suggestion that Manitoba's poor performance is due to rigging could just as easily cut the other way. But there is no case for either contention -- the exclusions of both types are normal, clearly reported and well within "natural variation."
Olson's charges are very serious. In effect, he is arguing that someone -- administrators? politicians? participants? -- deliberately tampered with important data used to inform decisions affecting millions of Canadians' lives. To openly accuse public servants of this type of mischief is no small matter.
I challenge the MTS president to be specific about whom he regards as having scuttled this study and to supply concrete evidence of wrongdoing.
Martin cites Olson as saying anyone who understands math knows the differences among scores for most provinces and territories clustered below the top two were meaningless, as tiny as 0.25 per cent.
Well -- ahem -- I "understand math."
In fact, the differences in scores were "as tiny as" zero per cent because Saskatchewan and Nova Scotia had identical scores.
I fail, however, to see why this is worth noting. Excluding the top two provinces leaves scores ranging from 460 (P.E.I., at the bottom) to 495 (Alberta, in third place): a 35-point difference, 7.6 per cent, well outside the stated margins of error (four points for Alberta; 8.3 for P.E.I.). Manitoba's score is 27 points below Alberta's, with a 4.2-point margin of error. Evidently, Olson is wrong again.
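The arithmetic behind those figures is easy to verify; here is a quick sketch using only the scores and margins of error quoted above (the variable names are illustrative, not from the report):

```python
# Scores quoted above, excluding the two top-performing provinces
pei, alberta = 460, 495          # bottom (P.E.I.) and third place (Alberta)

gap = alberta - pei              # 35-point spread
pct = gap / pei * 100            # relative difference
print(round(pct, 1))             # 7.6 (per cent)

# Combined margins of error: 4 points (Alberta) + 8.3 points (P.E.I.)
print(gap > 4 + 8.3)             # True: the spread is well outside the margins

# Manitoba sits 27 points below Alberta
print(alberta - 27)              # 468
```

So even after setting Quebec and Ontario aside, the remaining differences are far from "meaningless."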
The results of the test, conducted for the Council of Ministers of Education Canada, mirror the findings of the 2009 PISA international assessment, which has a three-year cycle.
Even more alarmingly, Manitoba's PISA scores have fallen steadily over the last three assessments, from the Canadian average in 2003 to second-last six years later -- a striking trend.
We see other red flags for Manitoba in CMEC's report.
Manitoba has the highest percentage, 16 per cent, of students who performed at the lowest test level. The Canadian average was nine per cent. In this group, the Grade 8 students are expected to solve problems "at a very low cognitive level." The example given involves adding a column of seven numbers -- and calculators are permitted.
This suggests our weakest students are not getting the help they need (remember, this figure is derived after excluding students with impairments and disabilities). Our highest-performing group is also at the bottom of the heap (one per cent versus the national average of four per cent), which raises the question: Whom does our math program serve well?
We are pleased to see ministry officials are taking these matters seriously and discussing real solutions with us. Why does the MTS president feel compelled to say, in effect, "move along folks, nothing to see here"?
Robert Craigen is an associate professor of mathematics at the University of Manitoba and founded WISE Math with Anna Stokke. A link to the full test report can be found at wisemath.org.