Sports science and sport coaching form an essentially saturated marketplace. By this, I mean the field is full of very intelligent, qualified people doing good work, and there’s considerable competition for jobs. Thanks to the internet, access to information is no longer a problem. Provided you can afford to buy them, everyone has access to the same books, websites, and social media streams. This suggests to me that there’s potentially no performance advantage for us to gain by accessing new information—because everyone who can access it will. Instead, we might gain improvements either by using the information differently or by making better decisions about which information to use in our daily practice.
To that end, I’m interested in smarter thinking—as opposed to accessing new information—and trying to analyze and use information differently than others.
One book that has been hugely influential in this regard is Blunder: Why Smart People Make Bad Decisions, by Zachary Shore. At the time he wrote the book, Shore was an associate professor of national security affairs at the Naval Postgraduate School, and he believes in learning the lessons of history to enhance future decisions. While the book uses historical lessons to inform future military policy (understandable, given Shore’s job), the stories it tells cover a broad range of subjects and past events. And it’s easy to see how the lessons might apply to sports coaching.
The book, as suggested by the subtitle, is primarily about judgment calls—specifically bad ones—and how otherwise intelligent people fall into a common set of cognition traps that set them up for a downfall. In the book, Shore defines cognition traps as:
- Rigid ways of approaching and solving problems
- Inflexible mindsets formed by faulty reasoning
- Ways of approaching a problem based on preconceived notions and pre-set thought patterns
- Often linked to an emotional response
To avoid cognition traps, an easy, immediate solution is to keep the four points above in mind: take a flexible approach to problems, check your thought processes and reasoning, minimize preconceived notions, and avoid emotional decision-making. Building on these, Shore proposes seven cognition traps for us to consider; by being aware of their existence, we can hopefully avoid them.
1. Exposure Anxiety
Exposure anxiety is the fear of being seen as weak, and the belief that failure to act firmly will weaken one’s position. I’m sure we’ve all experienced the challenge of making a suggestion and having someone respond negatively or behave disrespectfully. An example is arriving late to training. Often, our first instinct is to punish the person, but that doesn’t take into account these second-order effects:
- What is that person’s response to your punishment?
- How does that affect your wider team?
An example given in the book is the 2006 Israel-Lebanon War, in which Israel responded to a small attack by Hezbollah with a massive retaliation of air and artillery strikes, as well as an invasion of southern Lebanon. By the time the UN brokered a cease-fire agreement, 165 Israelis had died and up to 500,000 were displaced. The Arab nations unified in support of Lebanon, and Hezbollah’s standing in the Arab world significantly improved.
Conversely, in some circles, Israel was viewed as unnecessarily aggressive, which harmed their international standing. Israel’s initial aggressive response was borne out of the belief that Hezbollah, and the wider Arab world, viewed Israel as weak. By executing a massive retaliation, Israel hoped to signal its strength; instead, they harmed their standing. The key learning point for this cognition trap is not to overcompensate by using excessive force or punishment, but to instead remain as balanced as possible in your response to key problems.
2. Causefusion

With this cognition trap, we confuse the causes of complex events, leading us to think we have an explanation when we don’t. A great example is back pain, from which many of us suffer. It’s tempting to believe that back pain has a simple, underpinning cause: there’s something wrong structurally, and resolving the issue will reduce (and hopefully eliminate) the pain. This is the basis for MRI scans for back pain—to understand what is wrong with the patient structurally and find a fix.
Generally, when a patient has an MRI scan, some structural issue is spotted, and it’s assumed to be the cause of their pain. A level of complexity comes into play when we realize that many people without back pain have the same structural abnormalities as back pain patients. Often, what we think is the cause of back pain may well be a normal part of the aging process. And this causes further problems for back pain patients; when they get the results of their MRI, the way clinicians explain the results to them can cause additional pain and delay recovery. This isn’t to say that structural issues aren’t a cause of back pain, but they’re not the cause of all back pain—and assuming they are often leads to more problems.
Causefusion comes in several variations, built upon:
- overlooking important causal links
- overemphasizing an overly simplistic explanation
- believing the consequence of an issue is the cause of it (you’ll have to read the book to see this issue in action, explained through the history of schizophrenia)
What’s the solution? Be aware that we don’t fully understand complex events, and factor this into your thought processes. Consider what you might be missing, and whether A causes B, B causes A, or B happens when A happens because of unknown cause C. By keeping an open mind to the cause of key events—and understanding that we might not ever be able to explain complex phenomena fully—we can hopefully reduce the incidence of this cognition trap.
3. Flatview

Flatview occurs when we view things as binary, black or white choices. In essence, we perceive things as very discrete, while the real world tends to be continuous in nature, with a lot of “fluffiness” and context. As an example from sport, we might consider a particular exercise—in European circles, the power clean is a good example—as crucial within a training program. This is a flatview: the exercise is determined to be good, and by extension, training programs that don’t include it are bad. It ignores any context, such as individual differences between athletes or timing. We see this type of thinking all the time in online arguments. People often believe the person they’re arguing with is either completely with them or completely against them, with no in-between.
Clearly, flatview has a huge potential to harm our thinking. How do we guard against it? First, we have to avoid overly simplistic explanations, which, by their very nature, tend to be binary. Instead, we must embrace context and nuance and filter our thoughts and decisions with these in mind. Second, we need to cultivate empathy; why does the person who disagrees with me think like that? What could be different about their knowledge, experience, or perspective that leads them to have a different belief or viewpoint? By seeking out and embracing different views and rejecting simplistic explanations, we can prevent ourselves from falling victim to a flatview.
4. Cure-allism

Squats are a useful way to build leg strength. By getting stronger in the squat, we get stronger leg muscles, which make us faster. As a result, all sprinters should squat.
The above reasoning is clearly nonsense, and yet it’s what happens with cure-allism—where we hold a dogmatic belief that a successful theory can be applied indiscriminately. We see this all the time in the diet world; someone tries a diet, and their health improves (by their definition of improvement). As a result, they push this particular diet—be it keto, paleo, low carb, high fat, vegan, or carnivore—as the one true cure everyone should follow.
Similar to flatview, this approach removes nuance and context from our thinking. It’s similar to Philip Tetlock’s Hedgehog vs. Fox thinking. Hedgehogs know a lot about one thing; they are big idea people, and they try to use that big idea to explain everything. A strength and conditioning (S&C) coach may view an athlete’s issues as strength-related while a physiotherapist may see the same problem as load-based. This desire to use our predominant theory and mental model to explain everything harms us. Instead, we should endeavor to be like Tetlock’s foxes, who know a little about a lot and are more accepting of nuance, context, and different approaches.
5. Infomania

Infomania represents an obsessive relationship with information. We live in a world saturated with data, to the point that our relationship with it is potentially dysfunctional, turning us into infomaniacs. Shore believes there are two types of infomaniacs:
- Infomisers who hog information and don’t want to share (the coach with the secret plan)
- Infovoiders who actively seek to avoid information, thinking that doing so will keep them from being misled
Both types of infomaniacs are in for a hard time. By not sharing information, infomisers won’t have their thoughts and beliefs evaluated and developed. Conversely, infovoiders will never become fully informed if they don’t actively seek out information, especially the information that contradicts their viewpoint.
Let’s take a coach interested in understanding patterns of fatigue and injury in their athletes. An infomiser would likely collect a range of information on training loads, volumes, and intensities and keep it to themselves. However, if they were open to sharing the data with their support team, such as a physiotherapist or S&C coach, they might achieve greater insight.
In contrast, an infovoider wouldn’t have collected any data, possibly out of a fear that it would illustrate their approach was wrong. Both types avoid tapping into the knowledge and wisdom of others (see flatview), but in different ways. The outcome is incomplete knowledge and understanding. The key is to collect the right amount of data and share it with those who could enhance your understanding and decision-making processes.
6. Mirror Imaging
I’m sure we’ve all discussed a reasonably simple issue with someone, only for it to escalate into an argument because we wrongly assumed the person we were speaking to shared our viewpoint. This is the mirror imaging cognition trap, where we assume that the other side will think like us. In the vast majority of cases, though, they don’t. Perhaps they have different information, different life experiences, or a different grasp of key underlying concepts.
We see this in armed conflicts the world over: one nation’s army thinks the other nation has the same motivations and incentives as they do. As we see time and time again, however, this is rarely the case. In recent years, for example, it’s become clear that normal disincentives don’t work against suicide bombing. Generally, we assume that people don’t want to cause themselves harm, but in this scenario, the attacker is fully prepared to die. This has forced a shift in tactics for those dealing with suicide bombers. They’re required to see things from the bomber’s perspective and change their defensive procedures accordingly. Similarly, recent occupations of countries such as Afghanistan and Iraq have proved difficult for US forces because they often don’t have the contextual and historical understanding of the local way of life and belief systems, which is far removed from what they are used to back home.
So how do we guard against this?
Awareness of this trap is important. We need to recognize that other people likely see things differently than we do, so the first step is to understand how they see things—typically by simply asking them. Second, building the ability to be empathetic can be hugely useful when guarding against this trap, as can knowledge of the context and history of the people you’re dealing with, especially if they’re from a different culture or background than your own. Finally, the risk of mirror imaging suggests that teams made up of people from a diverse range of backgrounds can be very useful, especially if there is a culture of trust, as everyone will likely have a different perspective, however subtle, on the issue at hand.
7. Static Cling
We often hear that first impressions are everything. When it comes to the last cognition trap, this couldn’t be more true. Static cling refers to continuing to hold a static image or belief about something, one that doesn’t change with updated information—often because we argue ourselves out of considering information that would update our worldview.
We see this in businesses all the time. An organization has success making one type of product, but can’t respond to—or predict—a change in demand for something else. Car manufacturers will struggle if they cling to the idea that their customers want inefficient petrol cars at the expense of the cheaper, cleaner electric power that will be heavily supported by governments. The situation is similar to that of mobile phone manufacturers who didn’t adequately respond to the development of the smartphone—driven by Apple’s iPhone—in the late 2000s.
When it comes to sport, we see that successful teams and coaches are those who innovate and adapt to rule changes quicker than others—often turning the rules in their favor. In the early 1990s, for example, soccer introduced the backpass rule. Before this, players could pass the ball back to their goalkeeper, who could pick it up and punt it down the pitch. This often slowed the game down, making it boring. The introduction of the backpass rule, however, meant that the keeper could no longer pick up a ball deliberately passed back by a teammate.
The rule caused teams to adapt. They could either stop involving the goalkeeper in their play or change the position into a more dynamic one. Slowly, we’ve seen goalkeepers evolve from being relatively poor with the ball at their feet to today’s goalie, who is essentially an additional field player. Pep Guardiola, potentially the most innovative manager around, changed the goalkeeper’s role even further by selecting goalkeepers based on their passing as opposed to the more traditional skills. By adapting their definition of what a goalkeeper is, these teams and coaches rapidly altered their worldview and avoided falling victim to static cling. They responded to the rule changes much quicker and set themselves up for sustained success.
Tips for Better Thinking
Based on the cognition traps detailed above, here are some clear ideas we can take moving forward to improve our thought processes:
- Read widely. To be more like a fox, you have to know a little about a lot. Exposure to new ideas makes you more prepared to understand what you don’t know and allows you to consider multiple explanations for a given phenomenon.
- Cultivate empathy. If you can open yourself up to different viewpoints, you’ll likely understand why people made the decision they did or have the perspective they hold. Thinking through things in the other person’s shoes allows you to view the problem differently.
- Collect some data. Don’t have your head in the sand, and share your information appropriately with people who can enhance your understanding.
- Develop mental flexibility. Hold several competing models and explanations in your head, and hold your opinions loosely. Be open-minded, and guided by evidence—but understand that your own biases will potentially prevent you from correctly weighing evidence in your thought processes.
- Develop a sense of nuance and context. Avoid simple explanations for complex processes and black or white thinking.
By doing these things, we should hopefully guard ourselves against many cognition traps identified by Shore while enhancing both our thought and decision-making processes. The result will be better decisions around various factors that support performance, hopefully propelling your athletes to a personal best.
Since you’re here…
…we have a small favor to ask. More people are reading SimpliFaster than ever, and each week we bring you compelling content from coaches, sport scientists, and physiotherapists who are devoted to building better athletes. Please take a moment to share the articles on social media, engage the authors with questions and comments below, and link to articles when appropriate if you have a blog or participate on forums of related topics. — SF