Certain uncertainties: Modes of patient safety in healthcare
This article offers a critical examination of how the patient safety movement ‘thinks of itself’, and of what it mostly sees as its mandate: the reduction of errors. The author suggests that this is not the ‘only’ way to look at patient safety, and that errors can, at least in theory, be an important part of promoting safety. To say the article flips the conventional approach to ‘patient safety’ on its head would not be an overstatement, though it is worth noting that many of the ideas she puts forward have emerged in other camps as well, if phrased slightly differently. The most notable ally to such premises is the set of “Safety 2” concepts under construction in the healthcare resilience movement, led by thinkers such as Erik Hollnagel, which suggest an approach to safety that promotes an ‘accentuation of the positive’ over a rigid adherence to ‘eliminating the negative’.
These are not necessarily ‘easy’ concepts to articulate, much less ‘sell’ to lawmakers, policy makers, or perhaps especially the public. While those who work in healthcare generally recognize the need for a discretionary space in which the inherent uncertainty of a clinical presentation can be worked through towards diagnosis and treatment, governments and the public still most often reduce such uncomfortable ‘truth’ to a moral or competence question about an individual provider or two. “They should have” is still a common antecedent to ‘findings’ in quality reviews of care, with poorly performing individuals and/or devices as comforting conclusions. Such an approach “…assume(s) that general knowledge about the causes of patient safety is separate from a more intimate understanding of the actions and situations that enhance or undermine safety…” (Jerak-Zuiderent, p. 736). To paraphrase Dekker, another allied thinker in this realm: the same conditions that produce success most of the time are the self-same conditions that produce unintended harm some of the time. Such thinking faces an uphill struggle against the prevalent paradigm that “aim(s) to prevent errors and reduce uncertainty by finding the ‘root cause’ of the errors and/or by educating the recalcitrant and resistant professionals and reforming their working culture to avoid errors” (Jerak-Zuiderent, p. 736).
She then goes on to propose her own thesis, which includes a real-world, humanistic understanding of the clinical realm: “Creatively understanding what safety means in a specific event can even require disregarding existing protocols or guidelines.” (Jerak-Zuiderent, p. 736). A Human Resources department apologist, she is not. She bolsters this thought with arguments from both Michel Foucault and Georges Canguilhem, to help bring us out of the 18th century of ‘perfect rationality’ and into a contemporary understanding of ‘knowing and acting’. She then turns to her own ethnographic research, which revealed the spectrum of understandings of the very concept of ‘patient safety’, from staircases for patients to personal safety from mugging for healthcare professionals. She then presents a rich case example of how rigid adherence to policy can create ‘certain unsafety’ as a logical and practical consequence of trying to root out ‘uncertain safety’. Ultimately, she makes a compelling case (in my view) for recognizing that ‘not all errors are created equal’, and that some indeed are necessary by-products of dealing with the ubiquitous uncertainty that accompanies ‘good’ healthcare.
I really enjoyed this article. I hope you do too!
Ryan Sidorchuk, An Occasional Contributor to HSSA
Balancing “No Blame” with Accountability in Patient Safety
This article by two of the acknowledged leaders of the North American patient safety movement attempts to address an important question: the intersection of individual providers’ responsibility for their performance with the myriad systemic issues that challenge frontline providers on a daily if not hourly basis. Unfortunately, they have not accomplished their task, for several reasons.
They begin by setting up a straw man with the suggestion that any serious individual or organization involved in patient safety work would maintain a “no blame” approach to individual performance. No examples of such an organization or individual are provided, simply because they do not exist. It is not hard to knock down such a non-existent straw man.
Secondly, the authors frequently confuse the concepts of “accountability” and “responsibility”. They seem to equate accountability with punishment following non- or under-performance of tasks by individuals. They are of course entitled to maintain such a position, but then they should reasonably add some evidence that punishment is in fact an effective means of changing and guiding behaviour. Such evidence is not merely thin – it does not exist.
The conflation of “accountability” and “responsibility”, while understandable, is particularly unfortunate because it diverts the reader’s attention from the value of creating conditions in which both providers and recipients of healthcare can tell their stories, so that accountability becomes a powerful tool for understanding possible initiatives to improve the safety and quality of services.
Probably the most serious weakness is the frequent use of “system” or “systems” as an explanation for events that harm patients. At no point do the authors stop to explain what they mean by these terms; nor do they pause to ask what kind of a system healthcare might be and what the implications of such a determination might be. This is a very common failing on the part of healthcare researchers, managers and academics, who fling around the concepts of “systems” and “systems thinking” as if they were a kind of magic talisman that permits us all to stop thinking – and certainly without explaining how they understand or apply these concepts in their own work.
One area of the article that remains particularly under-developed is the invocation of James Reason’s “culpability of unsafe acts” decision tree as the basis for the range of punishments (more gently referred to as “consequences”) they propose for various transgressions by individual providers. At no point do they examine the decision tree itself, or the several a priori decisions it requires in order to sort the “culpable bad apples” from the rest of the staff who struggle to provide safe care under frequently inadequate conditions.
There clearly are acts that deserve blame and consequences. But the examples they provide (hand washing, for instance) would benefit from a truly systemic, nonlinear understanding of the conditions in which healthcare providers are expected to work before jumping to the conclusion that punishments should be found to “fit the crime”.
Giving Back the Pen:
Disclosure, Apology and Early Compensation Discussions after Harm in the Healthcare Setting
In her recently published book After Harm, Nancy Berlinger shares a story about Bishop Desmond Tutu as he comments on the importance of restitution or compensation after an event that has led to harm. Transparency and disclosure are very much on the healthcare agenda in Canada. The increased interest in training providers for difficult conversations and disclosure is a positive sign. Using honest disclosure and apology as important interventions, organizations are beginning to adopt a more open approach to the concept of rebuilding trust after a patient has been harmed. But there continues to be significant reluctance to take the next logical step to solidify the fiduciary relationship between provider and patient – the willingness to enter into early discussions about compensation, non-monetary and otherwise.
The Winnipeg Regional Health Authority has developed, with the participation of the facility insurers, a process to identify those cases in which it would be appropriate not only to offer an apology accepting responsibility but also to initiate discussions around questions of restitution and compensation. The article describes the steps that led to the development of a detailed process map for such cases and shares the algorithm that has been adopted. It also discusses the potential challenges that arise with such an approach when multiple liability and insurance providers are involved.
Here is an interesting article review written by Ryan Sidorchuk
A systematic review of evidence on the links between patient experience and clinical safety and effectiveness
This article proposes patient experience as an important third pillar of ‘quality’ in healthcare, alongside such fundamentals as clinical effectiveness and, of course, patient safety. It suggests that qualities such as ‘compassion’ are important considerations in healthcare, delineated through relational and functional aspects of care. It also suggests a correlation between patient experience and the identification of care misadventures. As such, it promotes the integration of patients and families as important members of the healthcare team in the mitigation of unintended harm.
Ryan Sidorchuk, 4 February 2013
Cognition, Technology and Work (2012) Special Issue (vol.14)
An entire special issue of this journal covers a fascinating range of topics, three of which are reviewed here, with other articles to be highlighted in the future.
Coping with complexity: past, present and future
This is a vintage contribution from a recognized leader in the field of cognitive systems engineering. Hollnagel writes in a clear and easily understandable way and provides an historical overview of CSE and human factors efforts over the past several decades. By focusing on the gradual evolution of how complexity has been understood and responded to in different eras, the writer provides a bold prescription for moving forward. This is well summed up in his closing comment: “It seems to be the unenviable dilemma of human factors and (cognitive) ergonomics that we inadvertently create the complexity of tomorrow by trying to solve the problems of today with the mindset (models, theories and methods) of yesterday.”
Adapting to change and uncertainty
Nemeth provides persuasive arguments for the necessity of looking at how operators in the real world adapt to their working conditions, bringing into focus the longstanding differentiation between “work as designed (or imagined)” and “work as done”. The author points out that the evolution of CSE over the past several decades has allowed this adaptive capacity of human operators in diverse situations to become more apparent and to serve as guideposts for efforts to strengthen organizations and complex adaptive systems (CAS). Managing complexity successfully will require a clear understanding of human adaptive abilities.
Complexity: learning to muddle through
Starting with Ashby’s Law of Requisite Variety (1956) and incorporating the organizational insights of Thompson in the 1960s, Flach offers a sensitive theoretical umbrella for the wide range of articles in this special issue. Of great interest is his re-framing of Perrow’s four-quadrant grid, making clear the essentially dynamic nature of complex adaptive systems by introducing the axes of dimensionality and interdependence (thus the link to Thompson’s work) to promote a fuller understanding of CAS. In a second instance he introduces two other dynamic axes – locus of control and flexibility. An important contribution of this article is Flach’s emphasis on the concept of controlling complexity; he suggests the metaphor of self-organization as a guiding principle. His statement “There is no single best way, nor is there a fixed target”, coupled with the observation that “goals [which] emerge and change over time…are more typically a retrospective product of sensemaking than an a priori guide to action”, will be an important point of reflection for leaders and managers of CAS in the 21st century.
(RR: 6 Nov 12)