This article was published 3/2/2010, so information in it may no longer be current.
It’s not news that Manitoba does not compile and make public a school-by-school breakdown of Grade 12 marks in each subject, graduation rates, dropout rates, attendance, and postsecondary participation and success.
We’ve known that for years.
Yes, I know: as my email correspondents tell me, even reporting that right-wing think tanks want such material compiled and made public so that schools can be compared shows once again that I am a neo-con zealot who unequivocally hates teachers.
This will come as a surprise to those other people who think I still check in with Fidel each day.
The Selinger government, the Manitoba Teachers’ Society, and the Manitoba School Boards Association all say they oppose compiling and disseminating such information, and that they see no usefulness in it. The teachers’ union even hints that comparisons and rankings are an ideological plot to attack teachers and the public school system.
Obviously, the NDP, teachers and trustees can all be accused of wanting to protect their (vulgar word).
People are choosing up sides over the usefulness of keeping score and rating schools.
Whether it can actually be done is another question entirely: whether there is supportable methodology, data and analysis that statisticians would support, empirical research its authors could defend and justify, some certain way of measuring and weighting parental background, employment, education and socioeconomic conditions against marks and academic achievement to produce a valid comparison of how well a teacher or a school is performing...
A few years back, when the Fraser Institute was ranking high schools in Ontario, B.C. and Alberta from best to worst, it found that the data its methodology required simply didn’t exist in Manitoba, so our high schools couldn’t be ranked from top to bottom.
The data might have been available, had the Tories won the 1999 provincial election — they were already making public the school-by-school scores in grades 3 and 12 math and language arts, and had plans for significant expansion of province-wide tests to other subjects and other grades.
I made the point at the time that I had met teachers who were doing a fabulous job — I’m not saying they were better teachers than others, or that others could not have done the same in their circumstances — whose students’ 65 average was a far greater achievement than the 88 achieved somewhere else.
I’m thinking specifically about people who taught back then at R.B. Russell High School. I’d met now-retired principal Wally Stewart and now-retired English teacher Brian MacKinnon, and a bunch of other terrific faculty.
They confronted wretched socioeconomic challenges, students living in neighbourhoods with appalling poverty and high crime rates, and they engaged those students, helped them stay out of gangs, educated them, helped them get into postsecondary programs and prepared them for jobs. Mural art was one of their methods.
How much harder is that than teaching Grade 11 pre-cal or biology to a kid at Fort Richmond Collegiate or Kelvin High School or River East Collegiate whose mom is a neurosurgeon and whose dad teaches quantum physics at the university?
How do you measure and compare and rank that?
I’ve said before that the education my kids received at Grant Park High School was as good an education as any I believe they could have received anywhere in Manitoba, including at the prestigious private schools.
But how do you measure how the faculty did with the kids who weren’t on the honour roll, kids who weren’t supported at home by two employed parents, by two educated parents who valued education and took a deep interest in their children’s learning?
I don’t know how you’d compare that to the job teachers did at Kelvin, Vincent Massey, Shaftesbury, with a similar mix of kids, let alone how such schools would compare to the overall job done by faculty with a higher proportion of kids not enjoying such demographic positives.
And I know that not every teacher in nursery to Grade 12 totally engaged my kids, completely connected with them, and drew the best possible out of them. The vast majority, sure, but not all of them. How do you weigh that? And compare it, and rank it?
Yes, I can see the point that there would be great reason for concern, should two schools with similar demographics produce an average of 86 in applied math at one school and 64 at the other. One would hope the superintendent might ask questions. But how often does that happen? Yes, I know, without the data, we don’t know. And what would it say about the two schools if one had an average of 76 and the other 74.8? Is one better than the other?
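The 76-versus-74.8 question is, at bottom, a statistical one: with class-sized samples, averages a point or two apart can come from identical schools by chance alone. A minimal sketch of that idea, using made-up marks and Python’s standard library only (the school names, student counts and mark distribution are all hypothetical, not data about any real school):

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is repeatable

def simulate_school(n_students=120, true_mean=75, spread=12):
    """Draw one school's marks from the SAME underlying distribution,
    clipped to the 0-100 range. Any gap between two such schools is
    pure sampling noise, not a difference in teaching."""
    return [min(100.0, max(0.0, random.gauss(true_mean, spread)))
            for _ in range(n_students)]

# Two hypothetical schools with identical underlying performance.
school_a = simulate_school()
school_b = simulate_school()

gap = abs(statistics.mean(school_a) - statistics.mean(school_b))
print(f"School A average: {statistics.mean(school_a):.1f}")
print(f"School B average: {statistics.mean(school_b):.1f}")
print(f"Gap from chance alone: {gap:.1f} points")
```

Run it with different seeds and the gap wanders around a point or more either way, which is roughly the 76-versus-74.8 territory the column is asking about.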
I know there’s a school of thought that uses value-added education as a measure, based on year-to-year improvement in academic performance. I can’t remember off the top of my head which think tanks are involved, but as I recall, they look at comparative improvement from one year to the next, rather than the actual mark, to see how schools are doing.
So raising an average from 68 to 74 would be really good, would get a school on the podium. As I also vaguely recall, opponents of the plan argued that if the average went down from 94 to 92, the school would get a bad mark year-to-year.
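That objection is just arithmetic: a value-added score rewards the change, not the level, so a school sliding from 94 to 92 ranks below one climbing from 68 to 74 even though its marks are twenty points higher. A toy illustration, using the column’s own numbers and a deliberately simplified scoring rule (a stand-in for whatever the think tanks actually use, not their method):

```python
def value_added(last_year_avg, this_year_avg):
    """Simplified value-added score: the year-over-year change in average mark."""
    return this_year_avg - last_year_avg

# Hypothetical schools, using the averages quoted in the column.
schools = {
    "Climbing school": (68, 74),  # 68 last year, 74 this year
    "Sliding school": (94, 92),   # 94 last year, 92 this year
}

for name, (last, now) in schools.items():
    print(f"{name}: average {now}, value-added {value_added(last, now):+d}")

# Ranked by value-added, the 74-average school finishes ahead of the 92-average one.
ranked = sorted(schools, key=lambda s: value_added(*schools[s]), reverse=True)
print("Value-added ranking:", ranked)
```

The design choice is the whole debate in miniature: score the slope and you punish schools that were already near the ceiling; score the level and you punish schools that started far behind.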
And then there’s the power of anecdotal comparisons, which I would argue carry a lot of weight and pack some validity, but which can’t be measured quantitatively.
We talk about schools, and staffs, and individual teachers, and the impact of a principal. We still do it, even with both kids in university, when parents of younger kids turn to us for advice.
When child the elder was in early years, and child the younger probably not yet in school, the word out in the neighbourhood was that if your kid was in junior high at a particular local school, and was having difficulty in one or more subjects, you’d be lucky if you heard about it by March. A new principal came, and suddenly you didn’t hear those stories anymore.
We all knew which teachers were reputed to be absolute gems, and did cartwheels when the timetables came out and our kids were in those teachers’ classes. We worried when someone who’d been superb teaching an older child transferred elsewhere just as the younger child was hoping to be in that teacher’s class come September.
I still highly recommend our elementary school to younger parents, even though our kids are long gone. When teachers stay for a long time and are content not to move elsewhere, when the principal stays, when vice-principals move into principals’ jobs after only a couple of years, when parents speak highly of their kids’ experience, when people move into the catchment area to get a spot, how do you measure and rank and compare that?
I also point out to parents that the teachers at other elementary schools in the ’hood may be just as good as those at our old school, but that the range of programs and facilities can’t compare, the gyms can’t compare, nor can the proximity of not-for-profit licensed day care. It’s a good school. Period.
Would my kids have done as well in high school at Kelvin, which is actually our catchment school for regular programming? Probably. I know they excelled at Grant Park, and I know there are people who used schools of choice to move their kids across several catchment areas to get them into Grant Park.
How would Grant Park rank against Kelvin, Sisler, Churchill, Daniel Mac? And should we even be using the word ‘rank’, or the word ‘against’? What think tank can tell us, with indisputable certainty and methods beyond challenge, that a school is not a good school, or that a faculty is not a good faculty, or can identify which teachers are bringing down the ranking and why? What’s the cutoff point?
Look at the annual Maclean’s university rankings, and the controversy among academics about the magazine’s methodology.
I always go back to what Peter Brass says — Brass is the universities advisor at St. John’s-Ravenscourt School. SJR is rumoured to be a good school, but that’s only anecdotal, since we don’t have province-wide rankings.
But I digress... Brass says that there is not a bad public university anywhere in Canada, regardless of whether a magazine’s methodology and weighted stats put one ahead of another. It’s a matter of looking for the university that best suits what a student wants out of a university education, says Brass.
Is there a think tank that can analyze whether a school, or a school division, or a public education system is meeting a community’s needs? In a catchment area that produces at-risk kids, kids with parents lacking higher education, or unemployed parents, or one parent or guardian, or kids who’ve never been in a real school because they’ve fled wars and lived in refugee camps, or kids who come to school not speaking English or French as a first or second or third language, are we giving those kids in those schools the additional resources they need?
And will publicly running a school’s Grade 12 average mark in geography get those schools the resources they need?
Meanwhile, enjoy the debate and buy lots of papers to keep reading about it.
Read more by Nick Martin.