Thinking Fast, Thinking Slow

At the recent International Conference on Emergency Medicine 2012 in Dublin, one of the keynote speakers was a leading expert on the role of cognitive bias in medical decision making. Dr. Pat Croskerry is a Canadian Emergency Physician who speaks widely on how medical errors can result from the way we think.
Here are two important slides from his presentation:
[Slides: Type I, Type II and Dual Process models of decision making]

Clinical Decision-making – Pat Croskerry

He argued that in everyday life we operate in two cognitive modes, depending on the situation with which we are faced. For everyday issues that are familiar and frequent, we respond in an intuitive fashion, e.g. driving a car to our local shops. Our actions are automatic and rapid. However, when we are faced with unusual circumstances, our actions become more cautious and analytical, e.g. driving a car in an unfamiliar city – scrutinising road markings, signs and obstacles along our journey. Naturally, if we approached every familiar problem in such a measured way, our lives would be exceedingly tedious and laborious. Conversely, as we become more practised or well-versed in a situation, our actions should require less energy, attention or effort, so that our attention can be devoted to more novel situations.

In the clinical setting, the analogy would be treating a simple ankle sprain versus assessing a patient with a constellation of seemingly unrelated physical complaints. Yet he argues that medical error frequently occurs because we occasionally mistake for a familiar problem what is actually something more important or sinister – for instance, an alleged ‘ankle sprain’ which turns out to be a ruptured Achilles tendon. The error is often compounded by either not eliciting important information in the first place (‘the pain occurred as I was leaping, not when I landed’) or ignoring clues that contradict the hypothesis (the absence of tenderness over the ligament). This observation is particularly salient in medicine because many serious, unusual conditions can masquerade as more common, benign ones.

Early in our medical training, we are taught to be cautious and systematic, e.g. beginning the physical examination at the hand, or opening a CXR report with the patient’s details. There is very good reason for this: as novices, we do not yet have the breadth of experience to accurately identify or eliminate important information in our assessments.

However, through our early clinical years we begin to amass an increasing amount of clinical experience. We encounter frequently seen conditions that present in (more or less) similar fashion. Our confidence grows, and our ability to recognise patterns and data-clusters improves. Diagnosis becomes more rapid, and the mental effort of ‘thinking through cases’ is diminished. However, disaster ever lurks around the corner. Clinical assessment also becomes sloppier and briefer; important details are not checked, and hasty conclusions and decisions begin to be made. In many cases, the novice clinician ‘gets away’ with this because the decision happened to be correct (even though the premises were inadequate), or the error is later corrected (often to the chagrin of somebody else). But occasionally a major incident finally occurs (leading to death or disability), the result of a consistent and long-standing pattern of sub-optimal practice.

Croskerry argues that the critical error first occurs at the beginning of the medical assessment, when the clinician chooses ‘fast thinking’ over ‘slow thinking’. But he also concedes that at various points along this path the clinician is capable of switching from ‘fast’ to ‘slow’ or ‘slow’ to ‘fast’, depending on further information that may be obtained. This occurs if new data suddenly appears to contradict the original premise, e.g. atypical symptoms, incomplete diagnostic criteria, or lack of response to treatment. He describes this as the ‘toggle’ function. Again, mistakes may occur if the doctor dismisses or explains away these observations and relentlessly presses on with the original premise.

It is at this point that I disagree with him about how we can prevent this from happening. Croskerry argues strongly that we simply need to be aware of the dangers of ‘fast thinking’ and be conscious of when we are operating in this mode. His emphasis is more on ‘how we think’ than on ‘what we know’. This bias is probably related to the fact that his audience generally consists of more experienced clinicians, in whom cognitive issues are more likely to contribute to error; after all, this is his field of expertise. I think the problem is more complex for junior and middle-grade doctors. I believe it arises when the clinical problem is first encountered – ‘recognising’ or ‘not recognising’ whether the presentation is familiar – and then not being able to toggle to ‘slow thinking’ if required. Clearly experience has a large part to play in determining this, but I think there are things that can be specifically taught to assist with it.

An example is the case of ‘acute back pain’. The key diagnoses not to be missed include unstable fractures, spinal infections, malignancy and important retroperitoneal lesions such as a leaking aortic aneurysm. These cautions can be expressed as defined criteria in widely accepted imaging guidelines for back pain – these include old age, history of malignancy, fever, severe trauma, elevated inflammatory markers, pain lasting > 4 weeks, etc. Rather than having clinicians memorise by rote all the possible benign and sinister causes of back pain, they should first learn to memorise and recall the major ‘red flags’ that alert them to important possibilities. By actively searching for these, the clinician may remain in energy-conserving, efficient ‘fast’ thinking mode, or be forced to ‘toggle’ to ‘slow’ thinking and evaluate the patient more cautiously, gathering more detailed information to confirm or exclude alternative possibilities. A simple sketch of this screening logic follows.
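
To make the ‘toggle’ explicit, here is a minimal sketch of such a red-flag screen expressed as code. It is purely illustrative – the flag names and their attached concerns are loose assumptions based on the guideline criteria mentioned above, not a validated clinical rule.

# A hypothetical red-flag screen for acute back pain (illustrative only).
# Flag names and their associated concerns are assumptions, not a
# validated decision rule.

RED_FLAGS = {
    "age_over_65": "possible fragility fracture or malignancy",
    "history_of_malignancy": "possible spinal metastasis",
    "fever": "possible spinal infection",
    "severe_trauma": "possible unstable fracture",
    "raised_inflammatory_markers": "possible infection or inflammation",
    "pain_over_4_weeks": "persistent pain warrants further work-up",
}

def assess_back_pain(findings):
    """Return the thinking mode suggested by an active red-flag search.

    findings: a set of flag names elicited on history and examination.
    """
    hits = {flag: reason for flag, reason in RED_FLAGS.items()
            if flag in findings}
    if not hits:
        # No alarm features elicited: pattern recognition ('fast' mode)
        # may reasonably proceed.
        return "fast: manage as benign mechanical back pain"
    # Any red flag forces the toggle to analytical ('slow') mode.
    reasons = "; ".join(f"{flag}: {reason}" for flag, reason in hits.items())
    return "slow: toggle to detailed evaluation (" + reasons + ")"

# Example: fever plus raised inflammatory markers should force 'slow' mode.
print(assess_back_pain({"fever", "raised_inflammatory_markers"}))

The point is not the code itself but the discipline it encodes: the red flags are checked actively and exhaustively before the clinician is permitted to remain in ‘fast’ mode.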

Whilst cognitive biases are an important part of research into medical error, in isolation they are not the sole cause. Insufficient clinical knowledge and clinical experience are also contributors (particularly in the junior clinician). Since it is impossible to teach ‘all of medicine’ to produce safe medical graduates, perhaps it would be more effective to impart to them the critical lessons of clinical medicine and reinforce the generic, systematic data-collection and reasoning methods required of competent and prudent doctors, to counter the natural preference for ‘fast thinking’. There are only about 200 common presentations of disease. Memorising the most salient facts about each is well within the capacity of most clinicians.
