This article was published 25/1/2017, so information in it may no longer be current.
Greg Mason has done a great service to the research community by finally making available the Mincome data collected at a cost of $17 million to Canadian taxpayers (Revisiting Manitoba’s basic income experiment, Jan. 23).
His research group received $1 million in 1980 to make these data widely available to the research community. Digitization was completed in 1984, and the tape circulated among a few labour economists before it became obsolete in the late 1980s. On Jan. 3, 2017, the data were deposited in the University of Manitoba library.
Dauphin was the minor site in the Mincome experiment; Winnipeg was the most important site. Four factors, however, made it impossible to assess Winnipeg health outcomes. First, only a small proportion of Winnipeg citizens participated in Mincome. Second, as Mason assures us, Mincome did not collect any significant health data. Third, many of the participants in the experiment dropped out before it was completed, so any data collected were incomplete. And fourth, the "state-of-the-art" design incorporated into Mincome chose too small a sample size to detect significant effects. That meant a second wave of participants was introduced into the study mid-term, further muddying data analysis.
How was it possible, then, to determine that health improved significantly in Dauphin? Dauphin was a saturation site, dismissed by Mincome researchers as unimportant. In fact, it was this design that salvaged the results.
Because Dauphin was a saturation site, anyone who lived there during the Mincome experiment was promised support if their income fell below the prescribed threshold. This meant even people who never received money from Mincome were relieved of the stress associated with the fear of income insecurity.
Moreover, the Mincome researchers were not able to design a reasonable control group for Dauphin, so they could not determine whether the results observed were due to Mincome. In the 40 years since Mincome, statistical design has improved a great deal.
I was able to find three control subjects for every Dauphin resident — matched not only on age and sex, but also on every variable in the long-form census. So we can be confident that observed results were due to Mincome.
Finally, I was able to draw on a database not available to Mincome researchers — medicare data, routinely collected every time anyone in Manitoba goes to the hospital or visits a family doctor. These are objective data, based on billings, rather than the self-reported data collected by Mincome researchers, which are subject to all the distortions of memory.
So what did the Dauphin data show? Hospitalizations fell by 8.5 per cent, and visits to family doctors fell significantly, largely due to improvements in mental health. High school completion rates increased, based on Department of Education data. All these results are relative to the control group.
These data are not based on the "dusty boxes" of records from the Mincome project, but the boxes do, nonetheless, contain lots of data not yet digitized. David Calnitsky, then a doctoral student at the University of Wisconsin, used some of these data to show that "stigma" associated with collecting welfare was not experienced by Mincome recipients, and that family relationships improved.
As Ontario, Finland, the Netherlands and Oakland begin new experiments, people want to know how quality of life improves and how expenditures on other social programs decline with a basic income. They are interested in results beyond the demonstration that people don’t quit their jobs, as Wayne Simpson and Derek Hum showed in 1983 based on Mason’s digitized data.
Evelyn Forget is a health economist at the University of Manitoba. Her re-examination of Mincome and ongoing work on guaranteed annual income is supported by CIHR and SSHRC.