As in all walks of life, mistakes are fairly common in sports: a wide receiver drops a catch, a soccer player misses a penalty, or an athlete makes a small technical error. Generally, these errors aren’t catastrophic, and death and serious injury in sports are, fortunately, extremely rare.
However, mistakes can have a huge effect on performance outcomes. Many knockout matches in soccer are decided by a penalty shootout, which continues until a player misses. The 1994 World Cup was settled by such a method, with Roberto Baggio, that year’s runner-up for the Ballon d’Or (the award given to the world’s best player), missing the decisive kick. Mistakes can also happen outside of competition. A coach might make a programming error, for example, which can lead to underperformance or injury and have a major negative effect on an athlete’s career.
The Person Approach
How we view errors can have a significant impact on how we go about preventing them. Traditionally—and even today—we most often take a person approach.
Here, errors are viewed as being driven by the person making the mistake: through inattention, carelessness, recklessness, lack of ability, lack of knowledge, or any other perceived moral or personal weakness. When using the person approach, we blame the person who made the mistake, view it as their problem, and move on.
In the case of Roberto Baggio, he got the blame for missing the penalty. In a 4x100m relay, the athlete who causes a botched changeover gets blamed—something I know from experience.
The person approach is problematic. As a competitor at the Olympic Games with a very strong chance of winning a medal, I was highly motivated not to make an error. And yet, I still did. Roberto Baggio needed to score his penalty and absolutely did not want to miss. And yet, he did. People who make errors typically don’t want to make them, and yet we often view these errors as being due to a fault in that person.
Blaming the error on the person also prevents lessons from being learned. We criticize the person for being flawed in some way and move on, only to make the same mistake again on a team or program level. In my relay example, a person was blamed, the team moved on, and then was either disqualified or did not finish in the 2010 European Championships, the 2011 and 2013 World Championships, and the 2012 Olympic Games. Clearly, the required learning—what was needed to prevent errors from happening in the future—did not happen.
Similarly, Italy’s football coaches could—and did—blame Roberto Baggio for missing his World Cup Final penalty. But they were destined to have players miss penalties in the future—as happened at the 1998 World Cup (where Baggio, in a storyline of redemption, actually scored his penalty).
It’s easy to blame the person, but as discussed in Black Box Thinking by Matthew Syed, such an approach actually prevents adaptation; meaning, we are destined to make the same mistake time and time again.
The System Approach
In contrast to the person approach, we have the system approach. Here, humans are viewed as being fallible, with mistakes and errors almost being expected. In this approach, the context of why the mistake happened is analyzed and understood, taking into account broader, systemic factors. So, when a player misses a penalty in a shootout, a person utilizing the system approach would explore why this happened by asking questions such as:
- Had the player practiced penalties under conditions simulating competition?
- Were they able to regulate their emotions?
- Did they have a set routine?
In the case of my relay, I was running the last leg for only the second time in my life and had been regularly leaving early in training sessions—an error that went uncorrected prior to the competition. By taking a system approach to the relay, we can build out some important lessons: relay teams need plenty of competitive practice, athletes need to be confident and experienced in running their leg, and errors in training need to be noticed, highlighted, and fixed.
This approach, viewing the error of the person as a symptom of problems in the system, prevents those systems—relay teams, soccer teams, etc.—from being doomed to make the same mistakes time after time. Essentially, instead of asking who made the mistake, we need to ask why the mistake occurred.
This in turn has a knock-on effect. If we use mistakes as a learning experience—which we can in this scenario because we’re not scared of being blamed or motivated to blame someone else—we’re able to take steps to avoid the error in the future.
When a relay team is disqualified, athletes are very quick to distance themselves from being identified as the cause of the mistake, because they know they will be blamed for it—something that is even more common and unpleasant in the social media age. If, instead, the relay coach and team took a system approach—with no individual blame being apportioned—they could be open and honest about why the mistake occurred, reducing the chances of it happening in future.
Reducing the shame, embarrassment, or punishment associated with making a mistake is a crucial step in being able to avoid the same mistake later on, as it allows a more open and honest discussion around why it happened. If the same mistakes occur repeatedly—e.g. a relay team that is repeatedly disqualified—it’s likely not an issue with the people, but the system itself.
Avec Fromage
What does this have to do with cheese?
A leading error researcher (yes, that is a thing), James Reason, developed the “Swiss cheese model of system accidents” as a method of explaining and examining error. This model holds that any system has a set of barriers that prevent accidents and errors from occurring, which are represented as slices of cheese. In the case of a nuclear power plant, these defense systems would include sensors to detect when something is awry, alarms to notify human controllers, and automatic shutdowns to prevent catastrophic chain-reaction events from occurring.
From a sports coaching perspective, a common accident is a training injury, which is a complex event with many feed-in causative factors. If we view injuries (somewhat simplistically) as an interaction between load (both acute and chronic) and tissue tolerance, then there are a number of defense barriers that can prevent the accident from occurring. These include building tissue tolerance, identifying when tissue tolerance is insufficient or compromised, understanding acute load, and monitoring chronic load.
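One common, if debated, way coaches operationalize the interaction between acute and chronic load described above is the acute:chronic workload ratio, in which a spike in the most recent week's load relative to the preceding month flags elevated risk. A minimal sketch, where the window lengths and the example loads are illustrative assumptions rather than prescriptions:

```python
import statistics

def acwr(daily_loads, acute_days=7, chronic_days=28):
    """Acute:chronic workload ratio from a list of daily training loads
    (most recent day last). Windows of 7 and 28 days are conventional
    but adjustable."""
    if len(daily_loads) < chronic_days:
        raise ValueError("need at least chronic_days of load data")
    acute = statistics.mean(daily_loads[-acute_days:])      # last week
    chronic = statistics.mean(daily_loads[-chronic_days:])  # last month
    return acute / chronic

# A sudden spike in the final week pushes the ratio well above 1.0,
# a zone often (though contentiously) associated with elevated risk.
loads = [400] * 21 + [700] * 7
print(round(acwr(loads), 2))  # → 1.47
```

A flat month of training would return a ratio of 1.0; the point of the barrier is simply to make an invisible latent condition (a load spike) visible before it aligns with a compromised tissue-tolerance layer.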
Ideally, all of these defensive layers are intact and resistant to errors: strength imbalances are identified through effective, validated methods, and load is fully understood and quantified. In reality, as Reason writes, each defense layer is full of holes—much like a slice of Swiss cheese. Unlike Swiss cheese, however—in which the holes are permanent and stable—in the Swiss cheese model the holes are fluid, opening and closing in line with environmental, personal, and systemic variations.
An important part of this model is that any single hole in the cheese does not cause a bad outcome, because the subsequent defense layer (or slice of cheese) offers protection. In extreme circumstances, however, the holes in the cheese align in such a way that an error or accident occurs. As a result, to avoid errors, we need to:
- Develop sufficient layers of protection.
- Ensure these layers are resilient enough to not fail—or develop holes—in the same place or time.
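The multiplicative logic behind layering defenses can be sketched with a short simulation. This is a toy model, not part of Reason's formal work: each layer is assumed to have an independent probability of its hole being open at the moment the error arrives, and a breach occurs only when every hole aligns.

```python
import random

def p_error(layers, trials=100_000, seed=1):
    """Estimate the probability that holes align across every defense
    layer. `layers` is a list of independent per-layer hole
    probabilities; an error only gets through if every layer's hole
    happens to be open on the same trial."""
    random.seed(seed)
    breaches = 0
    for _ in range(trials):
        if all(random.random() < p for p in layers):
            breaches += 1
    return breaches / trials

# Each additional intact layer multiplies the breach probability down:
print(p_error([0.1]))            # ~0.10 with one layer
print(p_error([0.1, 0.1]))       # ~0.01 with two
print(p_error([0.1, 0.1, 0.1]))  # ~0.001 with three
```

With three layers that are each 10% unreliable, only about one error in a thousand finds all three holes aligned—which is why adding even imperfect layers pays off, provided their holes open independently.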
Holes in the cheese can arise for two main reasons: active failures and latent conditions. An active failure is an erroneous (or, in the original model, an unsafe) act committed by people within the system: a missed scoring opportunity or leaving too early in the relay. Importantly, active failures have a short-lived, highly acute impact on the resistance of the system to error.
Leaving early in a relay is an active failure, but it’s also “savable”—the incoming runner can call for the outgoing runner’s hand earlier, or the outgoing runner can slow down slightly. Active failures are typically the obvious mistakes that are identified as the reason the larger scale error occurred—but, as detailed above, they are often a symptom of underlying issues as opposed to the direct cause.
By focusing on active failures as the direct cause, we are doomed to continue to make the same mistakes in the future. Instead, we need to focus on the second of the two common causes of holes: latent conditions.
Reason refers to latent conditions as “resident pathogens”: problems in the system that increase the downstream chances of error. Any decision in “design”—which, within a sporting context, includes both the training program and individual training session design—can have downstream, unintended consequences, which are often both hard to predict in advance and challenging to identify after the fact.
Latent conditions can serve to increase the size of the hole, the time the hole is open, or both, as well as create conditions which increase the risk of a mistake being made. Returning to the sprint relay example, latent conditions might include a lack of funding from the governing body to conduct sufficient relay practices to mitigate the risk of a mistake, or a lack of speedy feedback to the athlete when a mistake is made, preventing rapid corrections from being made and practiced.
To reduce the risk of errors, and the magnitude of the consequence of these errors, we therefore need to be able to:
- Reduce and limit the occurrence of errors.
- Build tolerance within our system, such that errors are easily identified and better tolerated to reduce the risk of a catastrophic failure.
Again, the example of a sprint relay comes to mind: by exposing the relay team to high-level competitions before the major Championships—along with developing realistic training sessions—we can expose the athletes to a variety of scenarios from which they can learn, adapt, and make the correct response when under pressure.
Relay exchanges rarely go exactly to plan; there is often some combination of the outgoing athlete leaving early or late, their hand being not quite in the right position, or interference or pressure from competing teams. The athletes involved in the changeover—the incoming athlete in particular—need to actively manage it, making rapid decisions under pressure to allow the change to happen.
Practicing changeovers in non-competitive conditions allows for the basics to be learned, but does not expose the athletes to these pressures and decisions, reducing tolerance in the system—in this scenario, the relay exchange. Exposure to relay changeovers under competitive conditions increases the library of potential scenarios—and the means of successfully executing a changeover in these scenarios—that the athletes have access to, building tolerance to error within the system.
Towards Fondue
Expanding further on the Swiss cheese model, in 2014 Yunqiu Li and Harold Thimbleby developed the hot cheese model. This is a more active model than the original, with its purpose being to highlight the interaction between defense layers.
In the original model, the defense layers—represented by the slices of cheese—are discrete and separate. In the hot cheese model, the layers of cheese are visualized as slowly melting, dripping down onto—and therefore affecting—the next layer down. If an error slips through a previous defense layer, it exerts a force on the subsequent slice of cheese, increasing the possibility of a hole developing.
In the sprint relay example, realistic training sessions make athletes better able to complete a changeover under pressure. As a result, implementing realistic training is, in terms of the model, akin to inserting a slice of cheese.
During the realistic training sessions, however, if the athletes make errors that are either not identified or corrected—for example, being inconsistent around leaving on the checkmark—then this puts pressure on the newly inserted slice of cheese, increasing the risk of a hole or a drip developing.
Conversely, introducing this new layer of cheese in the form of competitive practice may introduce a new risk to the system—perhaps the athletes become overconfident in their ability to complete a changeover based on their limited success in training, and so don’t take an upcoming competition seriously enough.
A Process (Not Processed)
These cheese models can teach us how to support athlete performance. We must be able to detect errors by actively searching for them and replicating the scenarios under which they might occur.
Much like a fire drill can detect how efficiently a group of people can evacuate a building, regular tests of athletes can allow us to understand how they are progressing. However, anyone who has taken part in a fire drill will recall that, in almost all scenarios, they don’t actually believe there is a fire; consequently, their behavior might not mimic what they would do in an actual fire.
While we probably can’t start setting fire to buildings, in sports we can expose athletes to much more realistic training scenarios, either by utilizing representative design principles or by exposing athletes to lower-level “practice” competitions prior to a major championship, allowing them to better identify any errors in their preparation. If the error we’re seeking to avoid is injury, then early identification of risk and detection of issues is important.
Secondly, it’s important to keep in mind that errors are learning opportunities. By having a blame culture in place, learning will be limited since people will seek to cover their backs and avoid taking ownership of mistakes.
By framing mistakes as errors within the system—latent conditions—as opposed to individual mistakes, we can better reduce the chances of this happening in the future. If a mistake is made by an individual, asking, “Why did the person make that mistake?” is far more useful than thinking, “That person made a mistake, and is therefore inept.”
We must also develop defensive barriers—slices of cheese—to reduce the risk of future errors. Crucially though, these barriers must not unnecessarily constrain athlete performance. The easiest way to never miss a penalty kick in soccer is to not take one…but this isn’t an option. Instead, using training and other methods, we must increase the capacity of athletes to be resistant to making an error—perhaps by making them better able to perform under pressure, or having a wider library of potential solutions that they can call upon to solve any given problem.
Finally, and in line with the hot cheese model, we need to be wary that whenever we introduce an additional defense barrier there can be unexpected downstream effects.
For example, by implementing an increased number of relay practices, we potentially reduce the risk of errors made in competition; conversely, we also increase the athletes’ exposure to high-speed running which, if they’re not resilient enough to handle it, increases their risk of injury. Often, we can’t predict these downstream effects, so a period of heightened alert after making a change is often required to detect any potential issues.
There is a well-worn saying that “all models are wrong, but some are useful.” The Swiss and hot cheese models of error are imperfect yet important models which allow us to better understand why mistakes are made in sports, and how we can do a much better job of reducing their incidence in the future.