by Philip Boxer
How are we to know what constitutes an ‘unintentional’ error? I want to be able to identify ‘unintentional’ errors as errors of intent in order to be able to ask why a diagnostician can be rendered ‘blind’ to the actual nature of the situated condition before him. Is it because no-one can know what is wrong, given the current state of knowledge? Or is it because the diagnostician is blind to the possibility that his mental model is itself inadequate? To clarify this distinction, I start from James Reason’s work on understanding error[1], as it is taken up in ‘To Err is Human’[2].
Understanding ‘intentional’ errors[2]
James Reason defines an error as the failure of a planned sequence of mental or physical activities to achieve its intended outcome when these failures cannot be attributed to chance. As a cognitive psychologist, he distinguishes three things:
- the mental model that frames the way the diagnostician makes sense of the situation (‘knowledge-based performance’);
- some way in which the diagnostician uses this model to establish correspondence between a plan and the situation observed (‘rule-based performance’); and
- some way in which the diagnostician uses this plan to define actions on the situation (‘execution-based performance’).
Both the “intended outcome” and the “planned sequence of activities” are constructs of this mental model, which corresponds to the diagnostician’s ‘way of making sense’. In these terms, therefore, there are three kinds of error, corresponding (in the reverse of the order above) to errors of (1) execution, (2) planning and (3) intent:
- the way the actions prescribed by the diagnostician’s way of making sense may be put into practice wrongly (‘execution-based performance’),
- the way the model is used to ‘make sense’ of the situation may be at fault (‘rule-based performance’), and
- the mental model may itself be unable to make the situation tractable (‘knowledge-based performance’).
According to Reason, however, error is not meaningful without the consideration of intent: the term has no meaning when applied to unintentional behaviors. Errors depend on one of two kinds of failure: either actions do not go as intended (an error of execution), or the intended action is not the correct one (an error of planning). In the first kind of failure, the desired outcome may or may not be achieved; in the second kind of failure, the desired outcome cannot be achieved. For Reason, then, errors are ‘intentional’ by definition, and they reduce to errors of planning or execution. This is why errors of intent are not addressed in ‘To Err is Human’.[2]
Understanding ‘unintentional’ errors
Reason nevertheless differentiates between slips or lapses and mistakes. A slip or lapse occurs when the action conducted is not what was intended.[3] It is an error of execution. In contrast, a mistake occurs when the action proceeds as planned but fails to achieve its intended outcome because the planned action was wrong.[4] It is an error of planning. In medicine, for example, a mistake might involve selecting the wrong drug because the diagnosis is wrong: in this case the situation itself was mis-assessed, resulting in the planned action being wrong (unless serendipitously right!).
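To pin this distinction down, the following minimal sketch renders it in Python. It is only an illustration under my own assumptions: the Episode fields and the classify function are hypothetical constructions, not Reason’s formalism, and the dosage example is the slip described in note [4].

```python
from dataclasses import dataclass
from enum import Enum, auto

class ErrorKind(Enum):
    NONE = auto()
    EXECUTION = auto()  # slip or lapse: the action departs from the plan
    PLANNING = auto()   # mistake: the plan is followed, but the plan is wrong

@dataclass
class Episode:
    planned_action: str     # what the diagnostician intended to do
    performed_action: str   # what was actually done
    outcome_achieved: bool  # whether the intended outcome was achieved

def classify(e: Episode) -> ErrorKind:
    # Error of execution: the performed action is not the intended one
    # (the desired outcome may or may not still be achieved).
    if e.performed_action != e.planned_action:
        return ErrorKind.EXECUTION
    # Error of planning: the action went as planned but failed to achieve
    # the intended outcome because the planned action was wrong.
    if not e.outcome_achieved:
        return ErrorKind.PLANNING
    return ErrorKind.NONE

# The dosage slip from note [4]: right drug chosen, wrong dose written.
slip = Episode(planned_action="prescribe 1 mg",
               performed_action="prescribe 10 mg",
               outcome_achieved=False)
assert classify(slip) is ErrorKind.EXECUTION
```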
From the patient’s perspective, not only should a medical intervention proceed properly and safely (i.e. free of errors of planning and execution), but it should also be the correct intervention for their condition in their particular circumstances. We are thus adding the requirement that the condition itself should have been correctly identified (i.e. the medical intervention should be free of errors of intent, such errors arising through the adoption of an unwanted aim).
The question here is why the condition itself should have been incorrectly identified, leading to the adoption of an unwanted plan. Reason makes a further distinction between mistakes at the level of applying rules and at the level of applying knowledge (see ‘Human Error’[1], p. 56), suggesting that it reflects two kinds of error: (i) the situation was assessed incorrectly; and (ii) there was a lack of knowledge of the situation. In the first case we can agree that this is an error of planning: the situation was assessed incorrectly in the way the diagnostician ‘read’ it using his mental model. But in the second case we must ask what is meant by “lack of knowledge of the situation”. If this were “knowledge” in the terms that the mental model uses, then it would be the same as (i). If not, then it must mean that the existing unconscious mental model leaves the diagnostician ignorant of some aspects of the situation because it has no place for them. In the physical sciences, this would be like using Newtonian mechanics to predict the motion of electrons subject to quantum effects. In order to understand ‘unintentional’ errors, therefore, we need to consider the reasons for the inadequacy of the unconscious mental model being used.
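The force of ‘has no place for them’ can be made concrete with a small illustration. The sketch below is my own, not drawn from Reason or from ‘To Err is Human’: it reduces a mental model to the vocabulary of features it can register, so that aspects of the situation outside that vocabulary cannot even appear as missing, echoing the Newtonian/quantum analogy above.

```python
# Illustrative only: a 'mental model' reduced to the set of features it can
# register. The feature names are hypothetical and echo the text's
# Newtonian/quantum analogy.
MODEL_VOCABULARY = {"mass", "velocity"}  # a 'Newtonian' way of making sense

# The situation includes an aspect ('spin') for which the model has no place.
situation = {"mass": 1.0, "velocity": 2.0, "spin": 0.5}

readable = {k: v for k, v in situation.items() if k in MODEL_VOCABULARY}
unreadable = situation.keys() - MODEL_VOCABULARY

print(readable)    # {'mass': 1.0, 'velocity': 2.0}: what the model can 'read'
print(unreadable)  # {'spin'}: what the model is structurally 'blind' to
```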
‘Unintentional’ errors as evidencing unconscious valencies[5]
A theory of the subject would have to give an account of the subject’s conscious mental model of his or her world and its trajectories, to which the subject experienced himself or herself as being subjected.[6] But it would also have to give an account of the way in which the subject experienced himself or herself as being subjected to his or her unconscious mental model.[7] These two forms of subjection together constitute a ‘double subjection’, in which the concept of ‘valency’ provides a way of referring to an unconscious investment in particular ways of organising the subject’s conscious mental model of his or her world, i.e. in particular ways in which the conscious and the unconscious are held in relation to each other.
Using the concepts of double subjection and valency, we can now account for all three kinds of error as follows (a sketch in code follows the list below). The subject’s conscious mental model and its trajectories are the diagnostician’s ‘way of making sense’, from which ‘plans’ associated with ‘intended outcomes’ and ‘planned sequences of activities’ may be derived. The third kind of error, the error of intent, arises from the nature of the subject’s unconscious valency for particular kinds of ways of making sense, because of the ways in which they support the subject’s identifications:
- Type I: correspondence error, in which the subject’s conscious mental model fails to anticipate the subject’s experience in a given situation (i.e. an error of execution creates a correspondence error);
- Type II: coherence/inconsistency error, in which an elaboration of the subject’s conscious model with respect to the situation renders the model itself incoherent through its inconsistency with the subject’s unconscious experience. The occurrence of such an error can be said to be a ‘signal’ to the subject of the need to change his or her conscious model, but in general the error neither determines a suitable elaboration nor guarantees that one may be found (i.e. the way the subject’s conscious model is put into relation with the situation creates an error of planning, which reveals the incoherence/inconsistency of the model itself with respect to the situation);
- Type III: decidability error, in which the subject faces more than one possible elaboration of his or her conscious model of the situation because the model is non-deterministic with respect to the situation, generating mutually inconsistent models from which to choose. This undecidability means that no diagnosis can in fact be made with the available model, making its use in choosing ‘what to do’ problematic. Under these conditions, for the subject to act ‘as if’ he or she knows ‘what to do’ will be because of his or her valency to that course of action (i.e. acting from unconscious valency to a particular course of action in the face of such undecidability creates an error of intent).
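Read computationally, Types II and III can be caricatured as the possible outcomes of asking a model for a diagnosis (Type I, as an error of execution, is covered by the earlier sketch). The sketch below is again my own illustrative construction: the names and the toy model are invented for the example, with the ‘knee’ candidates echoing the overuse/underuse examples in the next section.

```python
from enum import Enum, auto
from typing import Callable

# Hypothetical renderings of the text's terms: a 'conscious mental model'
# caricatured as a map from a situation to the set of diagnoses it supports.
Situation = str
Diagnosis = str
MentalModel = Callable[[Situation], set[Diagnosis]]

class Status(Enum):
    DETERMINATE = auto()  # exactly one diagnosis follows from the model
    INCOHERENT = auto()   # Type II: no elaboration coheres with the situation
    UNDECIDABLE = auto()  # Type III: mutually inconsistent elaborations

def diagnose(model: MentalModel, situation: Situation) -> tuple[Status, set[Diagnosis]]:
    candidates = model(situation)
    if not candidates:
        return Status.INCOHERENT, candidates
    if len(candidates) > 1:
        # Acting 'as if' one knows here, i.e. picking a candidate anyway,
        # is what the text calls an error of intent: the choice is driven
        # by unconscious valency, not by the model.
        return Status.UNDECIDABLE, candidates
    return Status.DETERMINATE, candidates

def toy_model(s: Situation) -> set[Diagnosis]:
    # Non-deterministic for one presentation: two mutually inconsistent plans.
    if s == "knee pain":
        return {"knee surgery", "orthotic treatment"}
    return {"rest"}

print(diagnose(toy_model, "knee pain"))  # (Status.UNDECIDABLE, {...})
```

The point of the sketch is only that, when the candidate set has more than one member, nothing in the model itself licenses the choice; whatever does the picking comes from elsewhere, which is what the text locates in unconscious valency.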
The implications?
The report following ‘To Err is Human’[2] came from the Quality of Health Care in America project[8]. It considered a range of quality issues relating to the overuse, underuse and misuse of treatments.[9] If we are not to question the motives of doctors (i.e. as intentionally following forms of diagnosis that serve their own interests while not serving the interests of their patients), then overuse (e.g. operating on knees unnecessarily) and underuse (e.g. not prescribing a prophylactic orthotic treatment where it could mitigate future conditions, such as the need for an operation) should both be considered as arising from errors of intent.
Understood in these terms, seeking to ensure the use of the best possible conscious models (e.g. by relying on the influence of ‘evidence-based’ forms of research) is necessary but not sufficient for addressing issues of quality. The subject must also be prepared to take up the ethical challenge of questioning his or her unconscious investment in his or her particular ways of knowing, i.e. questioning being subject to an unconscious model that is inadequate because of what it is structurally ‘blind’ to or lacks[10]. The process by which this ‘blindness’ can be worked with involves triple-loop learning, making use of reflexive processes.
Notes
[1] Reason, James, Human Error, Cambridge: Cambridge University Press, 1990.
[2] This section uses extracts from “To Err is Human: Building a Safer Health System”, Institute of Medicine, National Academy Press, 1999, pp. 46-47. This blog was provoked by the fact that errors of intent were absent from the analysis.
[3] The difference between a slip and a lapse is that a slip is observable and a lapse is not. For example, turning the wrong knob on a piece of equipment would be a slip; not being able to recall something from memory would be a lapse. These are errors of commission or omission in the way the plan (of intended actions) is translated into practice.
[4] In medicine, slips, lapses, and mistakes are all serious and can potentially harm patients. For example, a slip might be involved if the physician chooses an appropriate medication but writes 10 mg when the intention was to write 1 mg. The original intention is correct (the correct medication was chosen given the patient’s condition), but the action did not proceed as planned. If the terms “slip” and “mistake” are used, it is important not to equate slip with “minor”: patients can die from slips as well as mistakes.
[5] Extracts drawn from Philip Boxer and Bernie Cohen, ‘Doing Time: The Emergence of Irreversibility’, Annals of the New York Academy of Sciences, 901:13-25, 2000; and Philip Boxer and Bernie Cohen, ‘Analyzing the lack of Demand Organization’, 1st International Conference on Computing Anticipatory Systems, Liège, 1997.
[6] In psychoanalytic terms, this would be subjection to a ‘reality principle’.
[7] In psychoanalytic terms, this would be subjection to a ‘pleasure principle’.
[8] Chassin, Mark R., Galvin, Robert W., and the National Roundtable on Health Care Quality. The Urgent Need to Improve Health Care Quality, JAMA, 280(11):1000–1005, 1998.
[9] The impact of underuse, overuse and misuse on these issues was later published in ‘Crossing the Quality Chasm’: Committee on Quality of Health Care in America (2001). Crossing the Quality Chasm: A New Health System for the 21st Century, National Academy Press.
[10] The process by which the subject ‘conceals’ this structural blindness is through conserving forms of vagueness in the way he or she applies his or her models.