Core Skills — 24 December 2020 at 5:05 pm

Anatomy of a Preventable Death: “Non-Technical” Skills in Expedition and Wilderness Medicine

Dr Edi Albert

Taking the science of medical error analysis and systems-based thinking out of the hospital, our very own Adventure Medic patron, Dr Edi Albert, takes us through a case-based account of how we can learn from, and ultimately prevent, mistakes on expeditions or in the wilderness. We also learn that sometimes the most valuable skills any expedition medic can bring to the table are of the ‘non-technical’ variety, and that these skills can make the difference between a fatal outcome and a survivable one.

Non-Technical Skills

Expedition and wilderness medicine education rightly focuses on “hard” knowledge and skills: reliable and reproducible algorithms and approaches for patient assessment and, for example, when and how to use an EpiPen or improvise a splint for a fracture 1.

These “hard” skills are all necessary, but not sufficient. We need to know when and when not to use our skills. We need to learn not only how to make decisions, but importantly to understand the basis of our decision making, and how as humans we interact both with our environment and with the other humans around us 2.

Case based learning with small groups can be used to teach these “non-technical skills” by focusing less on the clinical medicine, and more on the context of the case and how it was managed 3. This article describes the case of a man with severe high-altitude pulmonary oedema and identifies the problems with reasoning and decision making that ultimately led to his death.

A previously fit and healthy 26 year old man was carried down by porters from the 5300m pass he was attempting to cross. He was brought into the village (4800m) at 6pm, just as the sun was setting. He was given some pills by the guide and was put to bed, awaiting a helicopter at first light. By 11pm he had been declared dead. The resident doctors in the village were unaware of the unfolding catastrophe until it was too late.

Systems Approach

Most healthcare professionals will be familiar with the Swiss Cheese Model of medical error. This is the idea that in a healthcare (or trekking company) system there are various checks and balances that occur during conventional patient care (or operation of a trek), and that all of these must fail for a bad outcome to occur: in other words, all the holes in the cheese line up 4.


A similar concept exists in the world of mountain rescue and in the root-cause analyses of mountain accidents. It is rare to find a single cause for such an accident. There is usually a chain of bad decisions each exacerbating the effect of the following ones, culminating in the final event: fatal or otherwise 5.

In the modern world it is customary to point the finger, avoid understanding the complexity, and find somebody or something to blame. In a complex system such as a hospital, we cannot, or should not, usually find a single individual or event (or hole in the cheese) to blame 2.

Similarly, in mountaineering and mountain accidents we should avoid simplistic explanations. They may be easy, and hence seductive; they may fit nicely into a newspaper headline; but they can unfairly malign (or indeed exonerate) those involved, and, most importantly, they offer us nothing to learn that will help us next time. Systems thinking describes the process of understanding how things influence one another and provides a useful approach to understanding, and learning from, complex processes 6. A systems thinking approach is used to reflect upon this case. The Iceberg Model is particularly apt for our purposes.

The young man in question (we’ll call him Simon) was on the “trip of a lifetime”, happy, healthy and brimming with enthusiasm and energy.  He was also, like many with that combination of youth and a Y chromosome, ignorant of his situation and with a misplaced sense of invincibility.

The first hole in the cheese didn’t even look like a hole: just an unfortunate, and perhaps annoying inconvenience. The trekking group had booked through an agency that advertised a guide from their own country as well as a local guide from the country they were to be visiting. On this occasion there had been a mix up and only the local guide was available.

Early on in the trip Simon became unwell with a simple upper respiratory tract infection (URTI). A common enough occurrence, and in Simon’s view nothing to worry about. The guide, however, had sufficient insight to recognise that the combination of a URTI with continued exertion and gain in altitude was not a good idea. He said that he tried to convince Simon to stay put and rest for a day or two. He said that Simon had been pushy and insisted on continuing.

Let us consider for a moment the perspective of the company supplying the local guide. Would it seem reasonable to provide a less experienced guide on the basis that another, experienced international guide would be present? Indeed, might it have provided a reasonable opportunity for a less experienced guide to gain more experience in a setting of safe supervision? Did the local guide on this particular trip have insufficient knowledge and skills to do the job? Was he out of his depth even before the trip started? We don’t know, but it is possible.

What information had the group been provided with, and what “homework” of their own had they (and especially Simon) done? How “tight” was their itinerary? We don’t know, but ignorance of the effects of altitude combined with tight itineraries are a recipe for disaster.

Graded Assertiveness

How did their individual personalities and conflict resolution skills influence this conversation? What exactly went on between them that resulted in the decision to continue? We don’t know. We can imagine that the guide backed down.

Graded assertiveness is a communication technique in which a concern is raised and then escalated in a stepwise fashion, from a gentle probe through to a direct challenge – for example, using the PACE model (Probe, Alert, Challenge, Emergency). It is now widely taught in critical care medicine.

Even armed with such techniques, the situation has to be seen through the lens of a wider cultural context (the lower levels of the systems thinking iceberg). I have heard time and time again from Nepalese guides who try to advise their clients and are frustrated that they are not listened to. This client-guide relationship is clearly different from some other guided settings around the world: one can’t imagine the advice of a Swiss IFMGA guide on the Hörnli Ridge of the Matterhorn being ignored by her client.

Simon and his group pushed onwards and upwards and whilst we don’t have full details of what went on during that fateful ascent (of 900m) and descent (of 600m), we do know that he became unwell and sufficiently incapacitated that he had to be carried down. He was brought into the village at 6pm, taken to a lodge and briefly sat down in the corner of the dining room. Two paediatricians from another group saw him and became concerned. They reported that he had a decreased level of consciousness and a respiratory rate of 60 breaths per minute.

When questioned by the paediatricians, the guide seemingly became defensive, declared that he had given Simon some treatment with Diamox (acetazolamide) and had arranged for a helicopter evacuation at first light. The two doctors were seemingly placated and went back to their own business. The lodge owners were not made aware of his condition. Two doctors from the local clinic, on duty with radios switched on, and with appropriate experience and equipment to treat severe altitude illness were sitting in a nearby lodge awaiting their dinner, oblivious to the existence of this critically, indeed terminally, ill patient. Had the lodge owners been informed they would have taken matters into their own hands and called the local doctors.

Let us stop for a minute, and gaze down on this scene. Hopefully the combination of altitude gain, decreased level of consciousness, and respiratory rate of 60 is ringing some mighty serious alarm bells in the head of you, the reader. In case you are unfamiliar with altitude illness and the indicators of severe illness, let us quickly dispense with the medicine, as that should actually be the easy part, and isn’t the real focus of this article.

In the context of a recent gain in altitude and significant exertion, especially on the background of a respiratory illness, acute high-altitude illness must be suspected. The extremely high respiratory rate at rest points to high altitude pulmonary oedema (HAPE). The decreased level of consciousness points to either severe hypoxia secondary to HAPE, or high-altitude cerebral oedema, or a combination of both. The initial action of carrying the patient down was the correct one. The differential diagnoses had been correctly identified.

It’s what happened next that is concerning. Even in the absence of the issue of altitude, a respiratory rate of 60 in an adult is indicative of a critical illness and impending respiratory failure.  Even in the absence of both altitude and respiratory rate, the decreased level of consciousness is indicative of significant cerebral dysfunction. How were these warning signs indicating that this young man was heading rapidly for a peri-arrest situation missed by people who should have known better?

Cognitive Bias

Before we attempt to unravel this, let’s make sure that we understand the concept of cognitive bias. Put simply, a cognitive bias is a systematic error in thinking that occurs when people are processing and interpreting information in the world around them, and it affects the decisions and judgments that they make. If that doesn’t sound simple, go back and read the above sentence again until it makes sense. Even if you think you know about cognitive bias, I bet you didn’t know there are four broad, usually inter-related causes of cognitive bias.

  1. Cognitive overload – too much going on for one brain to handle.
  2. Heuristics – these mental shortcuts can be very useful when used appropriately – but can get us in a world of trouble when not.
  3. Peer pressure – well, we all know about that surely?
  4. Individual motivation – making a particular decision may lead to an outcome that is not favourable for the decision maker, and therefore any reason not to make that decision will be seized upon and clung to.

Normalcy Bias

So, using what we know about cognitive biases and heuristics, can we hypothesise about what went on that evening? And importantly, can we learn from it? A good starting point to unravel this puzzle is probably normalcy bias – the tendency to underestimate the possibility of a disaster because it has never happened to you before. In this case perhaps the guide was used to seeing unwell trekkers, calling a helicopter, giving them Diamox and getting a happy ending (so to speak). Similarly, with the paediatricians, was it hard for them to fathom just how serious it was because they had never encountered this before?

Biases need not act in isolation: and almost certainly didn’t that night. Perhaps the Dunning-Kruger effect was at work, in which under-skilled people overestimate their own ability and highly skilled people overestimate the ability of others. Was the guide overconfident, and did the paediatricians discount their own concerns, thinking that the guide knew more than they did? It seems very likely.

Similarly, the affect heuristic was also influencing decision making. This is where a current emotion – fear for example – shortcuts a systematic and rational approach to decision making, and in this example that fear paradoxically led to a worse outcome.

Interoceptive bias no doubt had a role to play with the whole ensemble: guide, paediatricians and the rest of Simon’s group (not least his girlfriend). Interoceptive bias is perhaps most famously linked with the study of parole judges that demonstrated that they were consistently more lenient just after lunch, when well fed and rested. In this case, the sensory inputs of fatigue, hunger, cold and possibly other symptoms related to high altitude had a negative impact on decision making; think: ‘you’re not you when you’re hungry’.

Finally, groupthink clinched it. Harmony and conflict minimisation in small groups is usually a good thing on treks. However, conformity and consensus decisions may result in adverse outcomes.

But a respiratory rate of 60 in an adult you say?! I have seen numerous infants with a respiratory rate of 60 and have not been too concerned: no doubt these paediatricians have seen many more than me. Indeed, their whole medical practice involves a very different set of norms from adult medicine. Could it have been that they simply relied upon their usual heuristic decision making processes: fine for the little ones but disastrous for an adult? On its own perhaps this is unbelievable, but combined with all these other factors it would seem possible.

Simon was helped to his room by his girlfriend and a couple of friends. We know very little about the next few hours. We presume that he was tucked up in bed and slowly deteriorating. What was going through his girlfriend’s mind we never found out. At 10:30pm he made his final trip to the toilet, where he collapsed. A commotion ensued, involving his friends and the aforementioned paediatricians, and finally the lodge owners became aware of the unfolding disaster. They were of course unencumbered by the biases described above (biases may be fairly fixed but can also be very context specific).

They immediately called the village’s two resident doctors who were on scene within a few minutes to witness the young man receiving CPR. He had fixed dilated pupils and no discernible cardiac output. They completed several cycles of CPR, but he was pronounced dead at 11pm; cardio-respiratory arrest secondary to HAPE. A black helicopter flew the body out the next morning. A sobering story: a disconcerting and uncomfortable one.

Overcoming Bias

If you’re reading this and thinking “that couldn’t happen to me” well, then… you’re wrong! Remember the Bromiley case? If two consultant anaesthetists could have that happen to them on their own comfortable familiar turf during a simple routine operation, how much more likely is an equivalent scenario at high altitude after a long and tiring trek when you are hungry, thirsty and fatigued?

  1. We are all subject to a multitude of biases, so how do we get past that? How do we make sure that this doesn’t happen to us? The first step is undoubtedly to recognise that these biases exist, be aware of the most common ones, and understand the causes and pre-conditions for when cognitive bias is likely to affect decision making. Obviously overcoming these biases isn’t easy. A good place to start would be to put twenty minutes aside and watch Glenn Singleman’s presentation “Cognitive Bias and Risk Management” and take note of the general and specific debiasing techniques; learn how to work with them and incorporate them in times of difficulty.
  2. Emotions serve a number of purposes, and in this setting we can use the emotions of fear, disquiet, or unease as an alarm bell to trigger a slowing down of our thought processes. Let’s apply some of these concepts now to Simon’s case. One can imagine a chaotic, emotionally charged scene. So stop and take a breath. Stop and critically analyse the situation rather than riding the emotional rollercoaster as others may do. Realise that just ‘hoping’ doesn’t work as a strategy.
  3. Use your emotional intelligence to pick up non-verbal cues: those indicating his own anxiety and fear shouldn’t be beyond the capacity of even the averagely sympathetic climber. Simplify the concepts and decisions that need to be made. Discard things that are merely a distraction. Simple questions can help, escalated slowly if needed: “Have you treated somebody this sick before?” “Can we help you make a decision?” “Is there a local doctor who can help us?” “Should we be doing something else to help him?”
  4. Metacognition is a central skill. Think about your thinking; examine both your thoughts and emotions. Don’t just take them for granted and follow them. Be your own devil’s advocate. And while you’re doing it, sit down with your buddy, and order a hot drink and some food whilst you chew it over together.
  5. Take accountability yourself; don’t let it become diluted throughout the group until that accountability vanishes. Being accountable and accepting that accountability flicks a switch inside you and helps you step up to the plate. Summarise the facts as you know them. Get meaningful feedback from others and gain clarity in your own head.


If this all sounds overly complex and difficult, there is actually an optimistic note on which to finish: it seems likely that if just one person had done one thing differently, then Simon’s care might have taken a different trajectory and he might have survived. To extrapolate that into the future, and to keep you from making the same mistakes, I will leave you with this thought: you don’t have to get it all right – you just have to get some of it right. And that is eminently achievable for all of us.


  1. Schrading WA, et al. Core Content for Wilderness Medicine Training: Development of a Wilderness Medicine Track Within an Emergency Medicine Residency. Wilderness Environ Med. 2018;29(1):78-84.
  2. Mellor A, Dodds N, Joshi R, et al. Faculty of Prehospital Care, Royal College of Surgeons Edinburgh guidance for medical provision for wilderness medicine. Extrem Physiol Med. 2015;4(22).
  3. McLean SF. Case-Based Learning and its Application in Medical and Health-Care Fields: A Review of Worldwide Literature. J Med Educ Curric Dev. 2016;3.
  4. Reason J. Human error: models and management. BMJ. 2000;320(7237):768-770.
  5. Chamarro A, Fernández-Castro J. The perception of causes of accidents in mountain sports: A study based on the experiences of victims. Accident Analysis and Prevention. 2009;41(1):197-201.
  6. Senge PM. The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Currency; 2006.


Edi Albert is a generalist in rural and remote medicine, based in Tasmania, but regularly working in various remote locations around Australia, including aeromedical retrieval with the Royal Flying Doctor Service and at Perisher Ski Resort; he has previously deployed to Antarctica. He is director of the Healthcare in Remote and Extreme Environments Program at the University of Tasmania and a founding director of the pre-hospital care charity Sandpiper Australia. He travels widely and enjoys climbing, kayaking, hiking, skiing and sailing.