In this three-part series, I explore 10 research questions that sports science could make a big difference by answering—and, in many cases, is close to answering. In Part 1, the first three questions I explored were:
- Is a low-carb, high-fat diet effective for athletes?
- Is caffeine really ergogenic for everyone?
- Are isometric loading exercises as effective as eccentric loading exercises for hamstring injury prevention?
Obviously, I have my own biases, and some of these areas come from fields in which I hold a strong interest, but I have tried to cast the net as wide as possible. For each question, I’ve provided:
- A brief review of what we know so far.
- Why it’s important to know more.
My expectation is that, over the next 10 years, we will get closer to more concrete answers in many of these.
What Effect Does the Gut Microbiome Have on Athletic Performance?
The human microbiome—the collection of microorganisms living in and on our bodies, primarily centered within the digestive tract—has been the subject of increasing interest in recent years. The gut microbiome plays a number of roles in the maintenance of optimal health, including the production of nutrients such as vitamin K2, the neutralization and breakdown of pathogens and carcinogens, and the regulation of the immune system. Even more recently, research has shown that the gut microbiome can influence the levels of brain neurotransmitters such as dopamine, via the gut-brain axis, as well as the control of inflammation and oxidative stress during endurance exercise.
Because our gut microbiome clearly has a host of important roles, both in terms of general health and exercise performance, and is highly modifiable by diet, interest has grown in trying to harness this knowledge as a means to enhance performance. At present, we know that greater diversity within the gut microbiome is positive; obese people tend to have reduced bacterial diversity compared to lean subjects (and, in mice at least, transplanting the microbiome of a lean individual into an obese one can drive weight loss). Elite athletes also have greater gut microbial diversity relative to non-athletic subjects, the main drivers of this being higher levels of exercise and higher protein intakes.
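As an aside for the quantitatively minded: when studies talk about “diversity,” they typically mean a metric such as the Shannon diversity index, computed from the relative abundances of the bacterial taxa in a sample. Here is a minimal Python sketch of that calculation; the sample counts are invented purely for illustration.

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i), where p_i is
    the relative abundance of taxon i in the sample."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical taxon counts from two stool samples (illustrative only)
athlete = [120, 95, 80, 60, 45, 30, 25, 20, 15, 10]  # many taxa, fairly even
sedentary = [400, 50, 30, 10, 10]                    # few taxa, one dominant

print(f"Athlete H':   {shannon_diversity(athlete):.2f}")    # higher = more diverse
print(f"Sedentary H': {shannon_diversity(sedentary):.2f}")
```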
And that, in essence, is where we are now: We know that we want diversity, and we know that consumption of a varied diet and exercise promote that diversity. If we were to test the microbiome of athletes, we likely would struggle to give more in-depth and personalized advice than that, at present. This is why we need additional research in this area; it would be worthwhile to be able to understand how we can utilize the information from a microbiome test to identify key areas for change. This, in turn, should drive performance enhancements.
These changes could be driven through the targeted use of particular bacterial strains through the consumption of probiotics, the recommendation of specific dietary changes, and even—potentially—the modification of training. We already have promising evidence that probiotic use can support immunity within athletes undergoing heavy training, and so further insights in this area should prove worthwhile.
To that end, the main questions I feel need answering within the sporting sphere in regard to the human gut microbiome are:
- Can we utilize gut microbiome testing to provide specific interventions aimed at improving performance?
- Do changes in the gut microbiome act as markers of overtraining or excessive training load?
- How does microbiome diversity change across the course of both the training and competitive periods, and can we use this information to target key changes within the microbiome?
As such, I feel that, right now, we’re very much at the starting line of being able to utilize gut microbiome information within sport. We require further developments to drive the field forward and enhance our understanding—which, in turn, will hopefully lead to performance enhancements.
Why Does This Matter?
The human gut microbiome has interested scientists and the general public for a number of years. We know that an increased diversity is important, and we know the basic building blocks of what drives this diversity, but outside of that we struggle to make specific recommendations. By increasing our knowledge in this area, we may be able to use the regular screening of the gut microbiome in athletes to develop personalized recommendations for nutrition and training practices, and use it to serve as a marker of training load.
Can We Develop Real-Time Markers of Exercise Adaptation?
When we set training programs for athletes, we hope to improve their sporting performance, in part by improving their physiological abilities. Therefore, coaches have to set training that provides sufficient stimulus for adaptation to take place, and select exercises that drive the correct adaptations. A second important issue is that of recovery: Coaches must program training so that the sessions promote fatigue, but not so much fatigue that the athlete underperforms at the next training session or competition, or becomes injured. As such, there is a fine balancing act between a workload sufficient to drive adaptations and one so high that the athlete becomes overly fatigued and/or injured. This is, of course, difficult.
The development of effective training programs has an additional challenge: It is often hard to determine which adaptations have taken place until weeks or months later. This is due, in part, to the improvements derived from training programs occurring in tiny increments on a session-by-session basis. As a result, coaches and support staff often have to rely on trial and error, selecting sessions and exercises that they think may drive the relevant adaptations and hoping for the best.
However, given that we now know there can be considerable individual variation in response to a training stimulus—both between athletes (i.e., what works for athlete A may not work for athlete B), and in the same athlete across time (i.e., what works for athlete C in year 1 may not work for athlete C in year 2)—selecting these exercises can be difficult. A potential solution may be the development of real-time markers of exercise adaptation; basically, can we develop a test or tests that tell us how well the athlete is adapting to the training stimulus, and indeed whether the specific required adaptations are occurring, in (or close to) real time?
There are a couple of leading candidates in this area. One is cell-free DNA (cfDNA), which refers to circulating fragments of DNA found within the blood. At rest, small amounts of cfDNA can be found in our blood, but following both acute and chronic physiological stress, the concentration of cfDNA increases rapidly. As exercise represents a source of physiological stress, increases in cfDNA occur following both prolonged endurance exercise and resistance training sessions, as well as following a 12-week training block. Perhaps even more importantly—from a biomarker perspective—cfDNA changes appear proportional to both exercise intensity and duration, and are transient, often returning to baseline within 24 hours, even after highly exhaustive exercise.
In addition, cfDNA may also be a potential marker of fatigue; in a 12-week resistance training program, increases in cfDNA correlated with increases in mean training load within each three-week sub-block, with the highest concentrations associated with a decrease in physical performance. Furthermore, some research has demonstrated that the correlation between cfDNA and rating of perceived exertion (RPE, a subjective—but reliable—marker of training load) is stronger than that between lactate and RPE. The same research showed that exercise-induced increases in cfDNA concentrations are greater than those of any other biomarker measured, potentially suggesting enhanced sensitivity compared to more traditional biomarkers.
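To make the monitoring idea concrete, here is a minimal Python sketch of the kind of analysis described above, correlating post-session cfDNA concentrations with session RPE across a training block. All values are fabricated for illustration; real studies rely on carefully timed blood draws and validated assays.

```python
import numpy as np

# Hypothetical monitoring data across nine sessions (illustrative only)
session_rpe = np.array([5, 6, 6, 7, 8, 8, 9, 9, 10])          # 0-10 RPE scale
cfdna_ng_ml = np.array([12, 15, 14, 22, 30, 28, 41, 45, 52])  # post-session cfDNA (ng/mL)

# Pearson correlation between perceived exertion and the cfDNA response
r = np.corrcoef(session_rpe, cfdna_ng_ml)[0, 1]
print(f"cfDNA vs. RPE: r = {r:.2f}")

# A practitioner might flag blocks where cfDNA keeps climbing while
# performance stagnates, as a possible sign of accumulating fatigue.
```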
In Part 3 of this series, I’ll discuss the prediction of training response, looking at how miRNAs (microRNAs) may play a role, with specific miRNAs associated with an increased chance of being a responder to a certain type of training. miRNAs may also act as a useful biomarker of exercise response. For example, a number of studies have demonstrated that specific miRNA concentrations change in response to a single aerobic training session, as well as to a longer-term aerobic training program.
Similar to cfDNA, miRNA concentrations appear sensitive to training intensity and duration. Potentially even more importantly, miRNA concentrations can plateau if there is insufficient training progression, demonstrating miRNA’s potential as a method to monitor training. Finally, the extent of miRNA change following aerobic training is proportional to the training load, with specific miRNAs associated with post-exercise inflammation—information that may potentially guide recovery techniques.
Alongside miRNA and cfDNA, which appear to offer promise as markers of how well the athlete is adapting to and tolerating a training load, there is the potential that we can measure specific adaptations. There are a number of ways that this may be possible, including measuring the proteins produced by specific genes (proteomics), along with measuring specific epigenetic changes at particular points within a gene. As an example, the body is able to add a type of tag to certain points within DNA, which makes that specific region of DNA harder to read. These tags are methyl (-CH3) chemical groups; hence, this process is termed “methylation.” DNA methylation can potentially be passed from generation to generation, although the majority of methylation markers are transient, and can be added and removed (termed “de-methylation”) according to different stimuli.
One potent stimulus of DNA methylation and de-methylation is exercise. For example, sedentary individuals are far more likely to have methylation markers on a gene called PPARGC1A, which is involved in the promotion of mitochondrial biogenesis—an important aspect of improvements in aerobic fitness. If these sedentary subjects start to exercise more frequently, however, this methyl group gets removed, allowing the subsequent exercise adaptations to occur. This raises the potential for monitoring of specific DNA methylation patterns, which may be indicative of the types of adaptation that are occurring, so that training can be adjusted to target the specific adaptations that are required.
A limitation, at present, is that these tests are likely to be highly invasive. Epigenetic changes, such as methylation, histone modifications, and miRNA concentrations, tend to be tissue-specific. As such, if you want to understand what is occurring in the muscles of your athletes, then you need a muscle tissue sample. You get this through a muscle biopsy—a somewhat invasive procedure that has the potential to cause damage, making its adoption by elite athletes unlikely. Furthermore, cfDNA testing appears to require the collection of blood almost immediately following exercise, which again has practical issues. Ideally, we will be able to develop saliva collection techniques for cfDNA, miRNA, and similar; at present, there are some methylation markers that can be collected via saliva.
Of course, the danger with such an approach is that coaches become over-reliant on the data, seeking to derive specific, very narrow adaptations, such as increases in mitochondrial biogenesis or type-II muscle fiber hypertrophy. While it is tempting to go hunting for these adaptations, training as a whole is often more than the sum of its parts. So, while a different exercise may not drive the specific adaptation required to the same extent, it may enhance competition performance to a greater extent. As such, if these real-time markers of exercise adaptation are developed, then coaches and support staff will need to take a holistic, pragmatic approach to such information, using it to guide their decisions, but not making it the sole basis of what they do.
Why Does This Matter?
The purpose of training is to enhance performance, and so coaches have to develop training plans that they believe will do so. This can be difficult—if the training load is too high or too low, optimal adaptation will not occur, and injury or fatigue is more likely. As such, if we can develop sensitive, real-time markers that allow us to better understand the impact of specific sessions and training programs on an athlete, then we can make small adjustments to training sessions on the fly, hopefully improving performance to a greater extent.
Can We Use Genetic Testing to Predict Talent?
Let me ask you a question: If, on the day you were born, you had moved to Jamaica to live with Usain Bolt, eating the same foods as him, living the same lifestyle, and doing the same training, do you think you would have broken the 100m World Record? Most people would answer no (I’m always surprised by the people who answer yes), which illustrates something that we all understand: Elite athletes are intrinsically different from “normal” people.
Research tends to back this up, too: A study from 2007 reported that the heritability estimate for elite athlete status is around 66%, which we can roughly interpret to mean that the difference between Usain Bolt (an elite athlete) and your dad (probably not an elite athlete) is approximately two-thirds due to inherited factors, and these factors are primarily (but not exclusively) genetic. More recently, researchers found that your chances of winning an Olympic medal are higher if you have a family member who has already done so.
Interest in this area has led to the identification of a number of genetic variants that appear more common in elite athletes. An example is ACTN3, with research showing that a particular variant of this gene is more common in elite sprinters than in non-elite sprinters. This genetic variant in ACTN3 appears to modify muscle fiber type, with the “sprint” version of this gene associated with an increased proportion of type-II fibers, something that is obviously advantageous to elite sprinters. These findings have been well replicated, and ACTN3 may even have an influence on training adaptations, post-exercise recovery, and injury risk, as I explored in a paper back in 2017. There are other genes that have been shown to impact the attainment of elite athlete status, such as ACE and PPARGC1A, but of them all, ACTN3 appears to lead the way.
So, if we know that genetics play a role in the development of elite athlete status, and we know some of the genes that cause this, can we use genetic testing to predict those individuals who will go on to become elite athletes? At present, no—and there are a number of reasons for this.
The first is that the effect of any individual gene is likely quite small. For example, ACTN3, the gene that likely has one of the largest impacts on elite performance, explains roughly 3% of the variance between individuals. This is not an insignificant amount, but it’s also not huge. The second is that, while individuals with a certain version of ACTN3 are more likely to be elite speed-power athletes, roughly 80% of the world’s population has this same genetic variant. As a result, the vast majority of people on this planet with the “sprint” version of this gene are not elite athletes. Furthermore, research has shown that even if you don’t have the “sprint” version of this gene, you can still be a successful athlete (in addition to this study, I know of two Olympic sprinters—one of whom is an Olympic medalist—who do not have the sprint version of this gene).
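A quick back-of-the-envelope calculation shows why a variant this common has almost no predictive value on its own. The Python sketch below applies Bayes’ rule with illustrative numbers: the roughly 80% variant frequency mentioned above, plus assumed values for how often elite sprinters carry the variant and for the base rate of becoming an elite sprinter.

```python
# Why a variant carried by ~80% of people cannot predict elite status.
# The 80% figure is from the text; the other numbers are assumptions.
p_elite = 1e-5                # assumed base rate of becoming an elite sprinter
p_variant_given_elite = 0.97  # assume nearly all elite sprinters carry it
p_variant = 0.80              # ~80% of the general population carries it

# Bayes' rule: P(elite | variant) = P(variant | elite) * P(elite) / P(variant)
p_elite_given_variant = p_variant_given_elite * p_elite / p_variant

print(f"P(elite | carries variant) = {p_elite_given_variant:.7f}")
# ~0.0000121, barely above the 0.0000100 base rate: carrying the
# "sprint" variant tells us almost nothing about a given individual.
```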
This means that we cannot use a single gene to identify future elite athletes, because no single gene exists with the required predictive ability. Instead, a better approach may be to combine a number of genes into an algorithm. There’s a problem here too—we actually don’t know all that many genes that influence the attainment of elite performance. This is a problem common in medical research: From the heritability estimates gained in previous research, researchers know that genetics explains much of the variance between individuals who get a disease and those who don’t. However, at present, they have been unable to identify the specific genetic variants responsible; this is termed the “missing heritability” problem.
A great example is that of height: Research suggests that genetic variation explains around 80% of the variation in height between individuals. However, while scientists have discovered around 1,185 genetic variants associated with height, these variants “only” explain around 25% of the difference between individuals. This means that the remaining 55% of the variance attributable to genetics remains unaccounted for. We see the same with elite athlete status: At present, around 155 genetic markers have been identified as contributing to the attainment of elite athlete status, which likely does not explain enough of the variance between athletes to be used in any predictive capacity. (At the time of writing, I have a paper under review testing this in a small sample of elite athletes.)
Right now, then, the main issue is that we don’t know enough about which genes impact the attainment of elite athlete status. In order to improve the predictive ability of genetic tests for talent, we need to discover a lot more. The problem here is that the discovery of new genetic traits associated with elite athlete status is difficult; because the effect size of any single variant is likely to be very small, researchers require very large sample sizes, well in excess of 1,000 subjects.
Now, there aren’t many elite athletes around, so recruiting 1,000 to a study can be very difficult. This issue is hindering research at present. Nevertheless, if additional genes are discovered, my personal belief is that we will (eventually) be able to develop a threshold score for an algorithm that contains all the required genetic variants: a score above this, and the athlete is more likely to become an elite athlete; a score below, and they are less likely.
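To sketch what such an algorithm might look like, the Python snippet below computes a simple “total genotype score” in the style proposed by Williams and Folland in 2008: each variant contributes 0, 1, or 2 points for the number of favorable alleles carried, and the scaled sum is compared against a threshold. The variant list, allele counts, and threshold here are placeholders, not a validated model.

```python
def total_genotype_score(favorable_allele_counts):
    """Scale the summed favorable-allele counts to 0-100, in the style
    of Williams & Folland (2008). Each variant contributes 0, 1, or 2."""
    n = len(favorable_allele_counts)
    return 100 * sum(favorable_allele_counts.values()) / (2 * n)

# Hypothetical athlete: favorable-allele counts at five placeholder variants
athlete = {"ACTN3": 2, "ACE": 1, "PPARGC1A": 2, "VARIANT_X": 0, "VARIANT_Y": 1}

score = total_genotype_score(athlete)
THRESHOLD = 70  # placeholder cutoff, not an evidence-based value

print(f"Total genotype score: {score:.1f}/100")
print("More likely to reach elite status" if score >= THRESHOLD
      else "Less likely to reach elite status")
```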
However, it still will be the case that some, and perhaps most, individuals with a score above this threshold will not become elite athletes, while some of those with scores below this threshold may. As a result, genetic testing for talent identification will likely never be completely predictive, but it may provide more information on which decisions can be based. It may also be used to guide training prescription in the future, as I wrote in a 2017 paper.
A secondary issue around this is whether it is ethical to utilize a genetic test for talent, should one ever be developed. A number of prominent researchers in the field have expressed doubts as to whether such an approach is ethically justified, and there are certainly a lot of unanswered questions regarding the use of such tests. Here are a few:
- Can a club compel players to undergo a genetic test?
- What happens if an individual is found to possess a genetic variant associated with disease?
- Would such information be used to further discriminate against the player?
As such, even if we could develop a genetic test for talent, it’s not clear if we should even use it.
Why Does This Matter?
Because identifying the next Cristiano Ronaldo or Usain Bolt at a young age can be hugely profitable for sports clubs, there is an interest in methods that might be utilized to support such an approach. The use of genetic testing to identify future elite athletes is a scenario envisioned by many, and the technology is now available for such a test to take place. However, at present, such a test would not be accurate, and, furthermore, it’s hard to envision how it ever would be.
Additionally, such tests raise serious ethical questions, and these would need to be resolved before the tests can even be considered for use. However, there is some evidence that genetic information could be used to inform training program design, supplement use, and dietary advice, as well as for managing injury risk. As such, this is an area to keep an eye on in the future, to see how it develops.
Do Sports Supplements Have an Additive Effect, or Is There a Ceiling?
These days, we have a pretty good idea of which supplements have the potential to exert a performance-enhancing effect, or at least don’t negatively affect performance when the dosing is correct. For example, as I’ve explored previously, we can be pretty sure that caffeine is performance-enhancing for most people, most of the time. We can also add to that list common ergogenic aids such as sodium bicarbonate, beetroot juice, beta-alanine, and a handful of others.
Of course, when these ergogenic aids are researched, they are commonly studied in isolation: Give a group of subjects some caffeine tablets to see if their performance improves, and if it does, you can easily isolate what drove that performance enhancement. However, athletes rarely take a performance-enhancing supplement in isolation.
For example, many utilize caffeine-containing energy drinks for their pre-training and competition caffeine kick, and these drinks often come with sugar and taurine, two substances that also have ergogenic effects. Interestingly, a recent meta-analysis on the effects of energy drinks on sporting performance concluded that, as the taurine dose of these drinks increased, so too did the ergogenic effects, while this wasn’t the case for the caffeine dose. Additionally, an endurance athlete might consume both beetroot juice and caffeine separately, but close together in terms of timing, during their pre-race preparation. What we need to better understand—and precious few studies actually examine—is the effect of these ergogenic aids when combined.
When two supplements have a similar mechanism, there is the possibility that taking them together exerts no additional effect. For example, beta-alanine and sodium bicarbonate are both cellular buffers; does taking the maximum ergogenic dose of one mean that any additional intake of the second supplement provides no further effect? Or, because the mechanisms are similar but not the same (e.g., beta-alanine is an intracellular buffer, while sodium bicarbonate is an extracellular buffer), does taking both together provide an additional benefit? Could two such ergogenic aids even cancel each other out?
Similarly, is there a performance ceiling associated with ergogenic aids? If, for example, the most an athlete can improve with nutritional interventions is 3%, and a single ergogenic aid increases performance by 3%, then do additional ergogenic aids, even those working through separate mechanisms, provide no additional benefits? I’ve seen this covered briefly in a number of papers, including this editorial from Shona Halson and David Martin, but the most comprehensive review I’ve come across on the subject was authored by Louise Burke and published in the journal Sports Medicine in 2017.
In her paper, Burke reported on some of the more common supplement co-ingestion strategies, of which the most commonly studied was that of sodium bicarbonate and beta-alanine. The results showed a wide spread of findings: Some studies reported combined benefits, others no effects, and others negative interactions. In part, this is due to both a low number of studies and a low number of subjects in each study—demonstrating why further research in this sphere would be useful.
The short answer to this question, then, is that we don’t know. And yet, it is clearly important to enhance our understanding in this area, because athletes regularly co-ingest ergogenic aids as a means to enhance performance. We can, and must, better understand these potential interactions in order to drive athletic performance forward.
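To illustrate the competing possibilities discussed above, here is a small Python sketch contrasting three toy models of co-ingestion: purely additive effects, a hard performance ceiling, and partial interference. The percentages are invented, and none of these models is validated; the point is simply that current evidence cannot yet distinguish between these scenarios.

```python
# Three toy models of co-ingesting two ergogenic aids (illustrative only)
effect_a = 3.0  # e.g., sodium bicarbonate alone (% improvement, assumed)
effect_b = 2.0  # e.g., beta-alanine alone (% improvement, assumed)
CEILING = 3.0   # hypothetical maximum improvement from nutrition alone

additive = effect_a + effect_b                # effects simply stack
capped = min(effect_a + effect_b, CEILING)    # a ceiling limits total benefit
interference = max(effect_a, effect_b) - 0.5  # the aids partially cancel out

print(f"Additive model:     +{additive:.1f}%")
print(f"Ceiling model:      +{capped:.1f}%")
print(f"Interference model: +{interference:.1f}%")
```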
Why Does This Matter?
While we understand that ergogenic aids—when taken in isolation—enhance performance, athletes very rarely consume these ergogenic aids on their own. Instead, they more commonly consume them in combination with other performance-enhancing nutrients. However, the effect of this co-ingestion of ergogenic aids is poorly understood: The potential is that taking two or more such aids together may further enhance performance through additive mechanisms; have no additional benefit; or lead them to compete with one another, reducing the performance enhancement. This area has been very poorly studied, demonstrating a need for further exploration in the future, and for research to accurately mirror how ergogenic aids are used by athletes in real life.