This article was published July 16, 2010, so information in it may no longer be current.
Last week witnessed two seemingly unrelated events which are in fact connected to issues of medical error and patient safety in Winnipeg's hospitals. The first is that in July, the Winnipeg Free Press reported that 50-year-old Gerald Briggs is suing the Winnipeg Regional Health Authority, the province, two hospitals and some nurses after suffering brain damage his lawyer believes resulted from improper care for a stroke. The second is that the University of Manitoba announced that Dr. Brian Postl's five-year term as dean of the faculty of medicine began July 1, 2010. What does yet another allegation of medical error in Winnipeg's hospitals have to do with the appointment of Postl?
Postl is an advocate of what the patient-safety literature terms "systems thinking". In an email to Health Sciences Centre staff sent when he was Winnipeg Regional Health Authority CEO, Postl blamed Brian Sinclair's death on health-care "system frailties" (Winnipeg Free Press, February 7, 2009). As in the case of the tragic life and death of Brian Sinclair, it has become commonplace to blame errors that occur as part of biomedical health care on "the system". Biomedical proponents such as Postl argue that medical error can be reduced through a focus on developing a "safety culture" within the health-care system.
The hallmark of such an approach is the conviction that medical errors cannot be prevented by focusing on the negligent actions of individual health-care staff who commit them. Rather, it is suggested that medical errors emerge from the complexity of the health-care system. "Safety culture" proponents believe that such mistakes cannot be prevented by what they term the "naming, blaming, and shaming" approach, which focuses on individual responsibility for errors.
The notion that patient safety can be secured through the simplistic introduction of a "safety culture" is naive from a sociological standpoint. Such "safety culture" discourse is, however, used effectively in defence of medical dominance over the patient-safety system.
For example, it allows medical officials to explain away instances of medical error, such as that which resulted in Brian Sinclair's death, with reference to "system frailties". By scapegoating the system, medical officials attempt to explain medical error in a manner that doesn't challenge the medical profession's control over defining and responding to medical mishaps. The problem is that "systems thinking" treats the medical system as if it were a closed system in which only medical professionals have the ability or right to decide what counts as medically inappropriate, harmful, or negligent behaviour.
Further, the agency of individuals involved in the system is overlooked.
The "safety culture" argument holds that the behaviour of health-care professionals cannot be judged without considering the systemic context of their conduct. In this way, blaming the system actually safeguards medical dominance over the health-care system.
The "safety culture" hypothesis is consistently disproven by social science. There is no clearer illustration of this than research showing that approximately half of health-care workers in Canadian hospitals cannot be persuaded to wash their hands in the interest of their patients' safety (CIHI, 2004).
Health-care professionals are well aware of the risks that hospital-based infections pose to patients, and still as many as half of them do not take straightforward actions to prevent such harm.
So-called "systems thinking" approaches, such as awareness-raising and educational campaigns, have proven unsuccessful in effecting cultural change that would promote patient safety through basic infection-control measures such as hand washing.
Research shows that, rather than learning from mistakes blamed on "the system", physicians tend to cope with them with resigned acceptance. The reluctance of physicians to participate in "safety culture" initiatives through reporting their mistakes has been well documented by social scientists.
Patient-safety lessons can, however, be learned from regulatory measures that have succeeded in reducing rates of hospital-based infections by holding non-compliant health-care professionals, and the hospitals they work for, accountable for their actions.
This shows that behavioural change occurs by changing the structural context that makes the behaviour possible in the first place. One of the most effective and straightforward means of accomplishing structural change is through regulatory measures that govern individual behaviour. For example, people did not avoid the adverse events associated with smoking because they suddenly learned the habit was bad for their health; they quit smoking when it became stigmatized, expensive and, as in the case of smoking in public or in automobiles carrying children, illegal.
That is, a population-health version of "name, blame, and shame" lowered smoking rates! Similarly, physicians and nurses have become compliant with hand-washing rules in those hospitals that punish individuals who don't wash their hands. Likewise, industries such as nuclear power and aviation have been more successful than the health-care system in preventing errors because they employ a blend of systemic and regulatory measures.
The sociological lesson is that patient safety from medical error will only be achieved when regulatory models from other high-risk technical systems are adopted to offset the dominance of the medical profession.
Christopher J. Fries is a health sociologist on faculty in the Department of Sociology at the University of Manitoba and co-author of the forthcoming book from Oxford University Press, Canada, Pursuing Health and Wellness: Healthy Society, Healthy People.