Dr Nigel A.S. Taylor, PhD1, Dr James D. Cotter, PhD2
1Department of Biomedical Science, University of Wollongong, Wollongong, NSW 2522, Australia.
Physiological adaptation occurs in response to repeated stress application. Acute environmental stress, of either external (exogenous) or internal (endogenous) origin, disturbs the internal environment. Such stresses (forcing functions) displace physiological variables from one steady state towards another, typically in an exponential fashion. For example, the heart rate and ventilatory responses to an increase in exercise intensity display an exponential rise, to ensure oxygen supply matches elevations in metabolic demand 1.
Since all organisms are in a state of dynamic equilibrium with the environment, thermal energy can readily be gained or lost. Humans possess regulatory mechanisms that ensure the stability of the internal environment, with such stability being conducive to life and optimal physiological function. Exposure to exogenous and endogenous (metabolic) heat sources displaces body temperatures upwards. Homeostatic mechanisms, comprised of sensors, signal integrators, effector organs and a communication network, respond by modulating effector organ function to regulate body temperature within a narrow range. Repeated exposure to heat stress (an adaptation stimulus), either through exogenous sources or high-intensity exercise, elicits adaptations that result in a more effective defence of body temperature. These adaptations take the form of behavioural strategies or physiological adaptations, the latter of which are the principal focus of this paper.
Physiological heat adaptation can occur in response to naturally-occurring climatic changes (acclimatisation: 2, 3, 4), to artificial heat exposure (acclimation: 5, 6, 7) and to training-induced elevations in body temperatures 8, 9, 10. This paper’s second emphasis is on the adaptations that accompany acclimation and high-intensity, endurance exercise.
Since humans contain many interdependent regulatory systems, with common and discrete effector organs, the nature of systemic and organ adaptations varies across stimuli, and even within an adaptation stimulus. Indeed, adaptation rarely affects one organ or system, but is an integrated physiological response affecting many structures and mechanisms simultaneously, and at different rates. The time course of adaptation may be described using six general characteristics 11, as illustrated in Figure 1.
Figure 1: Adaptation theory: A hypothetical illustration of physiological changes following the application and withdrawal of an adaptation stimulus (see text for discussion)
Adaptation requires the stimulus to exceed a critical threshold, which is a function of the regulatory system’s sensitivity. For instance, if a stimulus fails to disturb homeostasis, then neither an acute corrective adjustment nor physiological adaptation will occur.
Body tissues are thermal energy reservoirs, with heat loss facilitated through the transport of thermal energy in the blood, and exchanged with the environment via non-evaporative pathways (radiation, convection, conduction), and the conversion of sweat into vapour. During exercise in the heat, non-evaporative heat dissipation is impeded, and even reversed, so humans become heavily reliant upon evaporative cooling. The capacity to continue exercising without an excessive body temperature increase is now dictated by the presence of clothing and sweat evaporation.
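These pathways can be summarised using the standard partitional-calorimetry heat-balance equation (a general textbook form, offered here for orientation; the formulation in this paper's Appendix may differ in detail):

```latex
S = M - W \pm R \pm C \pm K - E
```

where S is the rate of heat storage, M the metabolic rate, W external work, R, C and K the radiative, convective and conductive exchanges, and E evaporative heat loss (all expressed in W, or W per unit surface area).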
The interaction of physiological and environmental characteristics determines thermal compensability 14, which identifies conditions where temperature regulation is challenged. Compensability is expressed as the ratio of the required evaporative heat loss (Ereq) to the maximal sustainable evaporative cooling (Emax; Appendix). Thermal discomfort and physiological strain slowly increase as this ratio approaches unity. However, when Ereq exceeds Emax, conditions are uncompensable. This occurs when air temperature or water vapour content (humidity) exceeds critical thresholds. At rest, these thresholds are much higher, but uncompensability may still eventuate (e.g. a Turkish bath). However, since exercise has a pronounced effect upon heat balance, even cool conditions may become uncompensable for athletes. This role of metabolic heat is often overlooked, with the environmental state invariably being considered the primary cause of heat stress 15. Yet for athletes, the risk of heat-induced performance decrements and heat illness can have as much to do with heat production as with external heat loading.
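As a minimal sketch, the compensability index can be computed as follows; the wattage values are hypothetical illustrations, not figures from this paper's Appendix.

```python
# Illustrative sketch of the compensability index (E_req / E_max).
# The wattages below are hypothetical examples, not values from the
# paper's Appendix.

def compensability_ratio(e_req_w: float, e_max_w: float) -> float:
    """Return E_req/E_max; values above 1.0 indicate uncompensable conditions."""
    return e_req_w / e_max_w

# A hypothetical runner must shed 700 W by evaporation, but clothing and
# humidity cap sustainable evaporative cooling at 500 W.
ratio = compensability_ratio(700.0, 500.0)
state = "uncompensable" if ratio > 1.0 else "compensable"
print(f"E_req/E_max = {ratio:.2f} ({state})")
```

As the ratio approaches unity, strain rises slowly; once it exceeds unity, heat storage becomes unavoidable at that work rate.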
For many athletes, this change in compensability is irrelevant. In swimming, heat balance is minimally challenged, since pool temperatures generally ensure sufficient heat loss. However, when an 80 kg, 100m sprinter, wearing athletic clothing, runs at maximal speed, the metabolic heat production lowers the air temperature at which thermal compensability is lost, from 31-32°C (rest) to ~10°C during a race, when external power averages ~900 W. However, the race is so short that athletes store an insignificant quantity of heat. For a fencer, the air temperature at which conditions become uncompensable during competition is ~12°C. The proximity of these two air temperatures reflects the interaction of exercise intensity and clothing on heat balance.
Keeping a physiological perspective
It is clear that, for a 100m sprinter, heat adaptation cannot provide a thermally meaningful physiological benefit. Observation alone shows that, for marathon running, heat illness can be problematic. Indeed, Pugh et al. 23 recorded core temperatures over 40°C for runners in comfortable conditions (22-23.5°C, relative humidity 52-58%). Clearly exercise intensity drove core temperature upwards. Others have reported core temperatures in the range 39.4-39.7°C during short training runs in late winter (4°C; 24) and even during a 5km run 25. At these core temperatures, the safety margin is minimal, although much lower temperatures have also been observed following a marathon (26: 38.7°C (N=62); 27: 38.9°C (N=38)).
Conductive exchanges are negligible in many sports, since surface contacts are brief and localised.
Advantages and disadvantages of heat adaptation

When is heat adaptation appropriate?
Most athletes and coaches are familiar with altitude training, and many will be aware of the debate surrounding its possible benefits to performance at sea level 28. However, there is no doubt that adaptation to altitude enhances performance at altitude. This generalisation also applies to heat adaptation. Using a first-principles thermodynamic model, several athletic events will be evaluated to illustrate the thermal impact of metabolic heat production, and then the possible benefits of heat adaptation will be assessed.
At rest, heat production is ~1.5 W.kg-1 15. When exercising, additional heat is produced in proportion to the exercise intensity. This is illustrated in Figure 2, using the predicted external power and total metabolic heat production for world record performances in events from 100m through to the marathon, and the morphometric characteristics of each record holder. In a worst-case scenario, without heat loss, the heat production curve in Figure 2 would also represent heat storage. Since storing 3.47 kJ of heat elevates the temperature of 1 kg of tissue by 1°C, mean body temperature would rise ~2°C and ~4°C when racing the 1500m and 3000m (respectively) at world record pace. Therefore, unless body temperature was elevated substantially during the warm-up, athletes would theoretically not be expected to experience a clinically significant elevation in core temperature when running distances of 1500m or less, even when running at world record pace in an environment that prevented heat loss.
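This worst-case heat-storage arithmetic can be sketched directly from the 3.47 kJ.kg-1.°C-1 figure above; the body mass, heat-production rate and race durations used here are illustrative assumptions, not the record holders' actual values.

```python
# Worst-case heat storage: all metabolic heat retained (no heat loss),
# as in the scenario behind Figure 2. The 70 kg mass, 2.4 kW heat
# production and durations are illustrative assumptions only.

SPECIFIC_HEAT_KJ_PER_KG_C = 3.47  # tissue specific heat cited in the text

def delta_t(heat_production_w: float, duration_s: float, body_mass_kg: float) -> float:
    """Mean body temperature rise (degrees C) if no heat is dissipated."""
    stored_kj = heat_production_w * duration_s / 1000.0
    return stored_kj / (SPECIFIC_HEAT_KJ_PER_KG_C * body_mass_kg)

# A hypothetical 70 kg runner producing 2.4 kW of metabolic heat:
print(f"1500m (~3.5 min): {delta_t(2400, 210, 70):.1f} C rise")
print(f"3000m (~7.3 min): {delta_t(2400, 440, 70):.1f} C rise")
```

With these assumed values, the predicted rises fall close to the ~2°C and ~4°C figures quoted above.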
Figure 2: External work rate and total (metabolic) heat production during world record track performances over distances from 100m through to the marathon, using the morphometric characteristics of each record holder. If all of this heat is stored, core temperatures will rapidly rise, and two critical body temperature elevation lines have been added (2°C and 4°C: see text for discussion).
Since humans can actively dissipate heat, the scenarios in Figure 2 will rarely eventuate. So the second stage of modelling included heat exchanges, typical athletic clothing (35% skin coverage), running speeds at record pace, and two climatic conditions: 22°C (50% relative humidity) and 33°C (50% relative humidity). Heat exchanges, and the resultant heat storage, permitted computation of body temperature changes for the six races, with calculations performed to simulate athletes before and after heat adaptation. In the latter instance, acclimation-induced changes in sweating, skin blood flow and skin temperatures (see text below), and their subsequent impact upon heat balance, were modelled. These data are presented in Figure 3, but do not consider the significant impact of variations in solar radiation. Events up to 1500m are not adversely affected by thermal strain. At 22°C, this may extend to 5000m, and possibly 10 000m, with predicted body temperature changes being consistent with the literature 23, 24, 25. However, under hot-humid conditions, people almost invariably modify pacing to facilitate task completion 29, 30.

Body temperature changes at 33°C, for distances >5000m, are much higher than expected, and are attributable to forcing the model to retain record pacing in uncompensable conditions, and to inherent model limitations. Nevertheless, the changes following adaptation provide a useful comparison.
Figure 3: Predicted changes in mean body temperature for heat-adapted and non-adapted athletes racing at world record pace over distances from 100m through to the marathon, under two environmental states.
A popular misconception with heat adaptation is that an elevated sweat secretion is accompanied by an equivalent rise in evaporative cooling. Whilst increased sweating is advantageous for sedentary people with a low maximal sweat rate, this is rarely the case for endurance-trained athletes. Evaporation is determined not by sweat rate, but by the maximal evaporative cooling permitted by the environment and clothing. To illustrate this, consider the predicted sweat rates necessary for heat balance at 33°C in the examples presented in Figure 3 (assuming complete evaporation): 1 960, 1 890 and 1 920 ml.h-1 (5000m to marathon). These are not excessive sweat rates, and equate with an average required evaporation (Ereq) of 4200 kJ (averaged over race durations and events). However, the average maximal evaporation (Emax) was only 1100 kJ. Thus, only ~25% of the required sweat could be evaporated. Indeed, sweat rates in excess of ~500 ml.h-1 represent wasteful fluid loss in these examples.
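The ~25% and ~500 ml.h-1 figures above follow directly from the quoted Ereq and Emax values; a minimal sketch of that arithmetic, using only the numbers given in the text:

```python
# How much of the required sweat can actually evaporate, using the averaged
# E_req (4200 kJ) and E_max (1100 kJ) figures quoted in the text, and the
# marathon-example sweat rate (1 920 ml/h).

def evaporable_fraction(e_req_kj: float, e_max_kj: float) -> float:
    """Fraction of the required evaporation the environment actually permits."""
    return min(e_max_kj / e_req_kj, 1.0)

frac = evaporable_fraction(4200.0, 1100.0)
required_sweat_ml_h = 1920  # required for heat balance in the marathon example

print(f"Evaporable fraction: {frac:.0%}")  # roughly a quarter, as quoted above
print(f"Usefully evaporated sweat: ~{required_sweat_ml_h * frac:.0f} ml/h")
```

Sweat secreted beyond the usefully evaporated rate drips or is absorbed by clothing, contributing to dehydration without cooling.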
Heat adaptation can affect Emax. Superior heat adaptation comes from long-term acclimatisation. This is best observed in heat-adapted indigenes who display lower skin blood flows 31, 32 and higher skin temperatures during heat stress 32, 33, 34, relative to non-adapted controls. However, such changes may not be evident in athletes following short-term adaptation. Lower skin blood flows reduce cardiovascular strain, improve cardiovascular efficiency and result in less cardiovascular drift. Higher skin temperatures reduce heat gain, and elevate cutaneous water vapour pressure, Emax and evaporation. These effects are less apparent at high work rates (Figure 3). However, significant reductions in body temperature may be evident, with respective increases in Emax at 22°C and 33°C averaging 8% and 12% across events. The threshold for this effect occurs at some race distance beyond 1500m.
What about other sports?
Firstly, consider team sports involving intermittent, high-intensity exercise. Physiological strain, including core temperature, heart rate and sweat rates, is typically higher in intermittent than continuous exercise at the same average intensity. Furthermore, team-sport athletes are generally larger and more heavily clothed, adversely affecting heat production and dissipation. Indeed, sweat losses often exceed rehydration rates 35, with exercise impairing gastric emptying 36, 37, and progressive dehydration elevating intensity-specific core temperatures 38.
Field hockey provides a useful example. Whilst the game is very fast, at any time only a few players are working at high intensity, and only for a short duration. International hockey is played on synthetic, watered surfaces. Independently of the wider environmental humidity, evaporation elevates the water vapour pressure immediately over the pitch. Clothing typically covers 50-60% of the skin surface. The exception is the goalkeeper, who is active for much of the game, albeit at a much lower intensity, but is heavily clad with protective and other clothing. These factors reduce evaporation. In Figure 3, this equates with an upward displacement of body temperature. Heat adaptation must therefore exert its influence by elevating skin temperature, since sweat rate increases have minimal impact.
Core temperatures in several field- and racquet sports appear to be similar to those for endurance exercise, presumably due to the factors described above. Temperatures measured rectally and from gastro-intestinal thermometer pills at the completion of competition, have mostly been in the range 39-40°C for soccer 39, rugby 40, 41, squash (40-min game; 25), tennis (N=1; 20-min practice; Cotter: unpublished), and during golf played in the heat without rehydration 42. However, as with endurance exercise, core temperatures may consistently exceed 40°C and even 41°C, bearing a stronger relationship with exercise intensity than fluid deficit 42. A case exists for heat adaptation in those sports involving some combination of restricted water availability, prolonged play without player substitution, hot-humid environments, minimal air flow or the use of clothing, since there is minimal opportunity to implement behavioural strategies to combat core temperature elevations. However, in sports such as golf, educational advice would suffice, since neither intensity nor speed should dictate behaviour.
Are there disadvantages of heat adaptation?
Throughout this paper, altitude training is referenced, since lessons should be learned from different adaptation practices. For competition at altitude, athletes need to train and compete at altitude. However, altitude training to improve sea level performance, whilst debated, is based on the premise that hypoxia stimulates red cell production and oxygen transport. For adaptation to be initiated, the hypoxic stimulus must exceed a critical threshold (Figure 1), and this stimulus, by definition, must reduce performance. One cannot occur without the other. Therefore, when athletes live at altitude, training quality is impaired. Athletes can still work at maximal intensity, but this represents a sub-maximal performance. The risk here is that one adaptation stimulus results in an intensity reduction of another. This is the logic of the “live high and train low” approach.
This analogy transfers easily to heat adaptation. Endurance training and heat adaptation serve quite different physiological and psychological outcomes, and require different training prescriptions. There is no question that endurance exercise improves heat tolerance, but exercise-heat adaptation offers minimal benefit to the endurance fitness of highly trained athletes, and may even be detrimental. If athletes are transported to hot-humid regions prior to competition, then training quality will suffer until the necessary adaptations become established. The current authors offer the following recommendations (elaborated below) to optimise performance in the heat, whilst minimising negative outcomes: live in the heat, experience heat under competition conditions, acclimate for a specific climate and do high-quality training in the cool.
First, hot-humid climates affect athletes during competition, but also have an immediate impact on arrival from abroad. This is expressed in the form of lethargy, sleep disturbances, loss of appetite, mild dehydration, greater strain during training, and even a reduced training capacity. The ideal way to become accustomed to these changes is to live for an extended time in the climate in which competition will occur (acclimatise). This may require moving into that climate at the end of winter so that physiological adaptations progress with seasonal changes: “live hot”.
Second, athletes need to experience the discomforts of competing in hot-humid conditions. This is necessary to more fully appreciate the physiological strain, to practice behavioural strategies, to counteract adverse physiological effects and to rehearse pacing strategies. Whilst record pacing under 33°C (Figure 3) has been modelled in this paper, attempts to achieve this in longer events are certain to fail. There is no better way to learn this than to attempt personal best pacing (with appropriate supervision) in hot-humid conditions: “experience heat”.
Third, if heat adaptation is essential, then it is supplementary training, and not a substitute for high-quality endurance training. Use the acclimation procedures discussed below, and closely replicate the conditions of the target climate: “acclimation specificity”.
Fourth, athletes must sustain training quality. Unlike the hypoxic stress of altitude, temperature and humidity fluctuate. Well-prepared athletes will ensure that training quality has been maintained, so that optimal performance under more favourable conditions will also be assured. To achieve this, the current authors recommend that athletes undertake high-intensity training under temperate conditions: “train cool”.
To continue with the altitude analogy, one may ask: does heat adaptation enhance performance in temperate environments? To the best of the authors’ knowledge, there is no convincing evidence to support (or refute) this proposition, despite endurance-based competition being thermally stressful. As discussed above, one may suggest that the reverse may hold, particularly if training quality is compromised.
Stages of heat adaptation
Adaptation responses are attributed to inherited (genotypic adaptation) and acquired variations (phenotypic adaptation). The focus of this review is upon phenotypic adaptation, expressed through changes in morphological configuration (e.g. sweat gland size) and modification of effector organ control (e.g. sudomotor threshold or sensitivity). Regulatory changes may be positive or negative 44, such that acute responses are either amplified or blunted (habituated). The classical example of a negative adaptation is the blunted metabolic reaction of Australian Aborigines to cold 45, where shivering thermogenesis is less pronounced during an acute cold exposure, relative to that seen in non-adapted Europeans. It is often thought that heat adaptation displays only positive phenotypic trends 46, 47. However, this is not correct 48.
The heat adaptation continuum
The least adapted state is typified by the sedentary and thermally non-adapted person, who lives and works in a temperate or air-conditioned environment. In this state, there exists neither exercise- nor heat-induced adaptation traits. However, such individuals typically respond favourably to both training and heat adaptation stimuli (high responders).
Regular exposure to passive heating (e.g. sauna or bath) induces cardiovascular and sweating changes 50: positive heat adaptation. While advantageous at rest, these responses have traditionally been found not to confer an advantage during exercise, as passive heating typically induces only slight-to-moderate strain, and is therefore a less effective adaptation stimulus 51, 52.
If a sedentary person becomes physically active, further adaptation occurs. Now effector systems that support both exercise and thermoregulation undergo adaptation 3, 8, 53. The literature is replete with reports of 20-25% elevations in peak oxygen consumption, or sweating, in high responders following adaptation. Nevertheless, such observations must not be over-interpreted or assumed to occur in more adapted, low-responding individuals.
The fourth transitional state eventuates when a physically active person experiences regular, exercise-heat adaptation stimuli. At this stage, less powerful adaptations occur, but the person will now acquire the physiological characteristics accompanying heat adaptation. Prior to adaptation, physically active people possess a thermal advantage over sedentary individuals during exercise in the heat. However, this advantage is lost when both groups are exposed to exercise-heat acclimation 54.
The final two states are typified by the elite, non-adapted and the elite, heat-adapted athlete. In this context, “elite” refers to endurance training state. By definition, elite athletes are low responders to exercise and thermal adaptation stimuli, responding as if already heat adapted 55. However, research with highly trained athletes shows that, despite a high basal adaptation, substantial shifts in the physiological-, psychophysical- and performance responses can be induced through acclimation 56, 57. While there is evidence that physically active indigenes display a more efficient negative adaptation 31, these authors are unaware of corroborating data in athletes.
Physiological changes accompanying heat acclimation
An early adaptive response is an expansion of the plasma volume. This improves maintenance of the osmotic potential of the blood. Either reduced plasma protein loss 60, 61, or greater extracellular electrolyte retention 62, mediates this expansion, which was thought to wane with repeated heat exposure 3, 63. However, this trend is only evident when constant exercise intensities (forcing functions) are used during heat adaptation. It has recently been established that, when physiological strain is maintained using progressive increments in exercise intensity (see Combined exercise-heat acclimation), this expansion can be sustained for at least three weeks 64. Indeed, the entire extracellular fluid volume can be held in a significantly elevated state 64.
During exercise in the heat, protracted sweating contracts the interstitial and plasma volumes. This reduces central blood volume and stroke volume, elevating cardiac frequency to sustain cardiac output and blood pressure. Heat adapted people, with an expanded vascular volume and cardiovascular adaptations, more effectively regulate blood pressure in the face of this fluid loss. Thus, for a given exercise intensity, the stroke volume is larger and the cardiac frequency lower 52, 54, 65.
The above changes permit an elevation in skin blood flow during heat exposure (positive adaptation), and often a lowering of the vasodilatory threshold 66, 67. Both mechanisms facilitate a more rapid translocation of central heat for dissipation. Accordingly, people report reduced effort sense and tolerate exercise better. These skin blood flow changes represent a second-phase response, but are not always evident. When indigenous people from hot regions are studied, skin blood flow is lower than in non-adapted controls 31, 32; the third (negative) phase of adaptation. A reduced skin blood flow can elevate skin temperature, minimising heat gain and increasing evaporation. Central blood volume and mean arterial pressure are better defended, and progressive cardiovascular drift, typically observed during endurance exercise in the heat 38, 68, is largely prevented. The net result is increased cardiovascular efficiency. The negative aspect of this change would be that heat flow from the core is reduced. At this time, the net effect of a greater heat loss at the periphery and superior cardiovascular stability, relative to lower core-to-peripheral heat flow, has not been investigated in non-indigenous people exercising at greater intensities.
An elevated skin temperature buffers thermal energy influx, but sweating provides the most effective means of heat dissipation in hot environments. While it is a wasteful process that takes several days (7-14) to become established 13, the most generally observed heat adaptation response is a marked elevation in sweat secretion (positive adaptation), though this is not always present 69. Such increases result from a rise in the steady-state sweat rate 7, 70, 71, an increase in sudomotor sensitivity and a reduced threshold for sweating onset 6, 71. Heat adaptation also produces glandular hypertrophy 70, 72 and increased glandular secretion, but does not change the number of active sweat glands 73, 74, 75. Sweat glands also reabsorb more sodium and chloride 76, and are less affected by water accumulation on the skin (hidromeiosis: 49). As a result of these changes, the non-adapted sweat rate (1-1.5 l.h-1) can be doubled, with quite prolific sweating observed in some elite athletes 77. However, Mitchell et al. 65 described much of this additional secretion as extravagant, finding that a 30% elevation in sweating only increased evaporation by 10%. Many of these adaptations also accompany endurance training, even in cool-temperate climates 6, 78, 79, but are less pronounced than those elicited through similar training in the heat.
Unfortunately, a more robust sudomotor system does not implicitly provide a physiological benefit for the clothed athlete. Clothing is a semi-permeable barrier to water molecules, so the microclimate rapidly attains a water vapour pressure that prevents evaporation, and heavy sweating elevates thermal discomfort 80, leading to rapid dehydration and performance decrements 16, 81, 82.
Collectively, these physiological transformations enable a greater heat flow to the skin, they provide a greater potential for heat dissipation, and can result in reduced core temperatures 10, 65, 83. Thus, the competing, heat-adapted and semi-clothed athlete can move from a state deemed to be uncompensable, into a state in which thermal compensation is attainable, but at a considerable fluid cost.
Before leaving this topic, the third phase of heat adaptation is revisited. Elsewhere, the evidence has been reviewed and the case presented, that lower sweat rates in tropical indigenes represent thermoregulatory habituation (negative adaptation), and that this state is consistent with superior heat adaptation 48. It is an energy- and water-efficient adaptation. This phenomenon was first suggested by Glaser and Whittow 84, observed but not appreciated by Wyndham et al. 85, and then supported experimentally by Fox et al. 31, Hori et al. 86 and Candas 49. The present authors are not aware of any evidence for its existence in highly trained athletes, though it is suspected to be present in endurance trained people who have lived for several years in hot climatic regions.
Though evidence is fragmentary, and often confounded by differences in diet and health, data do exist to show that indigenes from the hottest climates (South Asia, Africa, India and Australia) have larger surface area to body mass ratios 87. This morphological configuration is well suited to the energy-efficient dry heat exchanges, and to a reduced reliance upon evaporation (at least when air temperature is <36°C in the shade, or <31°C in the sun). This difference was recently discussed by Marino et al. 88, when comparing ethnic differences in elite distance runners.
Physiological versus psychological adaptation
Consider a marathoner preparing for the Olympics. Since 1984, only one Olympics was held in conditions for which the risk of heat illness was low, and for three Games, the risk was very high, as it will again be in Beijing, China. Accordingly, a record attempt must not form part of the racing strategy. But at which pace should the athlete race? The development of a complete answer to this question is beyond the scope of this review, but some general guidelines may be noted.
Finally, the athlete needs to experience racing against others following different strategies. Consider a situation where one athlete decides that the ideal pace is 3 min 15 sec per km, while others decide that 3 min per km is the appropriate strategy. How does the athlete deal with this? Inappropriate choices can undo years of physiological preparation, so psychological adaptation is absolutely essential.
Principles and practices of heat adaptation
Passive heat acclimation
Exercise-induced heat adaptation
Endurance fitness increases cardiovascular stability 9, 97, resulting in more favourable fluid dynamics during exposure 60. Furthermore, the repeated stimulation to remove heat improves heat tolerance. Indeed, endurance trained people have an earlier sweating onset 98 and show a more rapid increase in sweat secretion 97.
Not only does endurance training elicit heat adaptation, but it facilitates more rapid attainment of adaptation during subsequent heat acclimation 10, 99. Indeed, Pandolf et al. 10 found subjects with a superior aerobic power (>65 ml.kg-1.min-1) would acclimate rapidly (~4 days). Moreover, slower adaptation is seen in people with a low basal fitness 100; the high- but slowly-responding person. For such people, the initial stages of acclimation elevate fitness more than acclimation state.
It is important to recognise that aerobic power per se does not enhance heat adaptation, but the physiological adaptations occurring during fitness acquisition are beneficial. In particular, adaptations associated with an increasing thermal load are critical. This is evident from experiments on swimmers 79, and the sweatless training of Hessemer et al. 101. These projects established that exercise without a sustained increase in body temperature is an insufficient stimulus for heat adaptation.
Many researchers have tried to determine the most effective method through which exercise may improve heat tolerance. Unfortunately, such comparisons are limited by the absence of measures of the thermal load (core temperature change). Consider the field-based studies of Edholm et al. 90 and Turk and Worsley 102. These experiments evaluated the efficacy of exercise-induced heat adaptation accompanying military training regimens, but without quantifying or standardising thermal load. Similarly, Cohen and Gisolfi 40 studied the influence of endurance-training intensity on heat tolerance, also without recording thermal strain. Such reports often form a basis for developing heat adaptation strategies, but the current authors do not support this practice, because it is difficult to interpret observations from such experiments in the absence of thermal strain standardisation. Exceptions include studies conducted by Shvartz et al. 103 and Regan et al. 104, where exercise adaptation was shown to be less effective than a combined exercise-heat acclimation, even with equivalent core temperatures.
Nevertheless, exercise under cool-temperate conditions can foster heat adaptation, and several generalisations are noteworthy. First, adaptation depends upon the capacity of the exercise mode to elevate body temperature. Without this, adaptation is unlikely. Second, since heat storage is proportional to exercise intensity, heavier workloads are essential for adaptation. Third, the cumulative adaptation impulse appears more critical than workload intensity, once the critical adaptation threshold has been surpassed. Fourth, continuous exercise under temperate conditions more readily supports an elevated core temperature than intermittent exercise. Finally, one must consider the principles of adaptation specificity. Since both systemic and local mechanisms are affected during adaptation, exercise-induced adaptation should occur using the exercise mode in which the athlete will compete.
In spite of the benefits of temperate endurance training, it does not provide an adequate stimulus for complete heat adaptation. For instance, Wyndham 51 has suggested that the thermoregulatory benefits during heat exposures are only beneficial in the first two hours. Therefore, for activities that extend beyond this time, endurance training is not an adequate substitute for heat acclimation 105, 106. Furthermore, once an adequate basal fitness is achieved, there is little additional thermoregulatory advantage accompanying continued training 53, 107. Thus for the endurance athlete, heat exposure must form an integral component of preparation for competition in the heat. Indeed, the elevation of both core and skin temperatures appears necessary for complete adaptation 104, 108. A partial progression towards this state may be achieved by training under a solar load, or by adding clothing to retain heat and moisture.
Interest in the latter procedure originated in the 1960s 9, 109, and it was thought to have significant practical advantages 110-113. Unfortunately, there is little empirical evidence to support that possibility. In fact, this method appears to have no greater benefit than endurance training alone. Closer examination of the research shows a general failure to use appropriate experimental controls or to standardise dependent variables across conditions. At this time, it is concluded that minimal physiological benefit can arise from using sweat clothing.
One might predict superior adaptation from training with a solar load, but few projects have focussed on this topic, and these are hard to interpret, due to difficulties in controlling climatic conditions in the field. However, such exposures most closely approximate the conditions obtained during acclimatisation and competition, and solar radiation readily elevates skin temperature. In this regard, Jessen 114 established that the thermoregulatory responses for a fixed air temperature are significantly greater with solar radiation (goat model). It is therefore reasonable to postulate that training with solar loading elicits a more specific adaptation, and one that is superior to that elicited within a climate chamber at the same temperature. Evidence supporting this possibility was provided by Edholm et al. 90, who compared heat tolerance in soldiers living in different climates (desert versus cool). Acclimatised soldiers displayed superior tolerance with fewer heat-related casualties.
Armstrong et al. 115 studied well-trained distance runners during spring and summer, finding equivalent heat tolerance. Thus, the solar loading of summer did not increase tolerance above that evident after spring. This does not mean that these athletes were already optimally adapted. It is more likely that summer training did not provide an adequate additional adaptation stimulus. Two implications follow from this work. First, well-trained endurance athletes do not need supplementary thermal preparation during seasonal changes in climate. Second, when athletes travel from autumn-winter to compete in the opposite hemisphere, additional heat exposures will be required, due to the inability of the solar load during these months to elicit an adequate adaptation.
Combined exercise-heat acclimation
Humid heat acclimation results in a greater elevation in sweating than does dry heat acclimation 69, 103. Adaptation to dry conditions may not provide adequate protection for humid exposures 115. While the experimental evidence relating to the transference of acclimation benefits between hot-dry and hot-humid conditions is sparse, it appears that adaptation specificity occurs 116. For instance, Shvartz et al. 103 observed, when subjects were adapted to either hot-humid or hot-dry conditions, and subsequently exposed to hot-dry conditions, that less thermal strain was evident following hot-dry acclimation (see also: 2, 8, 9).
Conventional acclimation regimens involve moderate-to-heavy exercise in a temperature- and humidity-controlled chamber. However, the cumulative adaptation impulse can be modified independently of climatic conditions, through changes in external power. Three categories of exercise forcing function have been used: (a) constant work-rate regimens, (b) self-regulated exercise regimens, and (c) controlled-hyperthermia regimens 117.
(a) Constant work-rate regimens
This method has wide application, but there are two major limitations regarding its effectiveness and interpretation. First, since all subjects exercise at the same absolute intensity, their physiological strain may vary widely. Second, since work rate is constant, thermal strain during sequential exposures progressively declines, constraining adaptation. This point is further developed below.
(b) Self-regulated exercise regimens
(c) Controlled hyperthermia regimens
From the present authors’ projects, two observations are important. First, controlled hyperthermia in the heat invokes a superior adaptation, relative to that seen when the same method is used in temperate conditions, even though both stimuli elicit equivalent core temperature changes 104. Thus, exogenous heat seems necessary for adaptation. Second, the present authors have established that the plasma volume expansion, once considered a transitory phenomenon, can be preserved for at least 21 days when the controlled-hyperthermia method is used 64.
It is suggested that this regimen can induce a more complete adaptation than either of the other techniques. But which regimen should the athlete adopt? The controlled-hyperthermia method targets thermal adaptation, and not athletic performance. Its purpose is to clamp the thermal load. For many athletes, particularly endurance runners, intermittent exercise is a training practice that does not mimic competition. For such athletes, it is first recommended that intermittent exercise be used to stimulate adaptation, then continuous exercise to trial pacing strategies. However, the use of a constant workload during acclimation is not recommended.
Forcing function considerations
To illustrate the potential impact of differences in the thermal forcing function, the present authors compare two 12-day adaptation regimens, each using 90-min bouts of exercise in the heat, and each aimed at initially elevating core temperature to 38.5°C. The most common adaptation protocol uses a constant load function 10, 120. Such regimens produce a gradual rise to the target core temperature (Figure 4A). Since adaptation results in the physiological impact becoming progressively smaller, as reflected in core temperature, the adaptation impulse decreases over successive exposures. The inset of Figure 4A shows this as a fall in the integrated core temperature. Contrast this with the controlled-hyperthermia protocol, in which the body temperature elevation is more rapid, and is then held constant during successive exposures (Figure 4B). The work rate must increase as adaptation progresses, resulting in an adaptation stimulus that is both higher and more stable (Figure 4A versus 4B insets). Both regimens elicit acclimation. One method targets a specific work rate, while the other targets a thermal load. Both techniques have useful applications. However, since physiological systems adapt to the specific nature of the adaptation stimulus, the two regimens invoke qualitatively similar, yet quantitatively different adaptations.
Figure 4: Core temperature changes on days 1, 4, 8 and 12 of two 12-day heat acclimation regimens, using either the constant work-rate (A) or the controlled-hyperthermia (B) method. Insets show the integrated core temperature for each day.
There may be no substitute for living and training under hot conditions to improve performance in the heat. However, high intensity training is invariably hard to sustain in the heat. With these points in mind, it is recommended that athletes live in the heat, experience heat under the pressure of competition, acclimate for a specific climate and then undertake high-quality training in the cool. Athletes must not ignore the psychological aspects of preparing for competition in the heat, since inappropriate choices can undo years of physiological preparation.
Adaptation impulse = ((Tb,1 - Tb,0) * t1) + ((Tb,2 - Tb,0) * t2) + ... + ((Tb,i - Tb,0) * ti) [°C.min]
where Tb,0 is the pre-exposure (baseline) body temperature, Tb,i is the body temperature during the ith measurement interval, and ti is the duration of that interval (min).
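The adaptation impulse defined above can be estimated numerically from sampled body temperature data, as the time integral (rectangle rule) of the temperature elevation above the pre-exposure baseline. The Python sketch below is illustrative only; the two 90-min temperature profiles are hypothetical, chosen to contrast a gradual rise towards the target temperature (constant work rate) with a rapidly attained, clamped elevation (controlled hyperthermia).

```python
# Numerical estimate of the adaptation impulse: the time integral of the
# body temperature elevation above the pre-exposure baseline [degC.min].
def adaptation_impulse(temps, baseline, dt_min):
    """temps: body temperature samples (degC); baseline: Tb,0 (degC);
    dt_min: sampling interval (min). Returns the impulse in degC.min."""
    return sum((t - baseline) * dt_min for t in temps)

# Hypothetical 90-min exposures sampled every 10 min (baseline 37.0 degC).
constant_work = [37.0, 37.3, 37.6, 37.9, 38.1, 38.3, 38.4, 38.5, 38.5]
clamped       = [37.0, 37.8, 38.5, 38.5, 38.5, 38.5, 38.5, 38.5, 38.5]

print(adaptation_impulse(constant_work, 37.0, 10))  # smaller impulse
print(adaptation_impulse(clamped, 37.0, 10))        # larger impulse
```

As the sketch shows, clamping core temperature at the target yields a larger impulse for the same exposure duration, which is the rationale for the controlled-hyperthermia regimen.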
Thermal compensability: the ratio of required evaporative cooling to the maximal attainable evaporative cooling for a given environment and clothing configuration.
Thermal compensability = Ereq / Emax [dimensionless]
Ereq = H - Eresp - (R + C) [W]
(R + C) = 6.45 * AD * (Tsk - Ta) / ITOT [W]
Emax = 6.45 * AD * (im / ITOT) * 2.2 * (Psk - (RHa * Pa)) [W]
where H is the rate of metabolic heat production, Eresp is respiratory evaporative heat loss, (R + C) is the combined radiative and convective (dry) heat exchange, AD is the body surface area (m²), Tsk and Ta are mean skin and air temperatures (°C), ITOT is the total clothing insulation (clo), im is the clothing moisture permeability index, Psk and Pa are the saturated water vapour pressures at skin and air temperature (Torr), and RHa is the fractional ambient relative humidity.
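A worked example may help here. The Python sketch below evaluates the compensability ratio from the equations above; the coefficients (6.45 and 2.2, the Lewis relation) are taken from those equations, while all input values (body surface area, clothing properties, temperatures and vapour pressures) are hypothetical, chosen only to illustrate an uncompensable warm-humid exposure.

```python
# Thermal compensability sketch, using the coefficients given in the
# equations above. All input values below are hypothetical examples.

def dry_exchange(ad, t_sk, t_a, i_tot):
    """Combined radiative + convective heat exchange, R + C (W)."""
    return 6.45 * ad * (t_sk - t_a) / i_tot

def e_req(h_prod, e_resp, r_plus_c):
    """Required evaporative cooling, Ereq (W)."""
    return h_prod - e_resp - r_plus_c

def e_max(ad, im, i_tot, p_sk, rh_a, p_a):
    """Maximal attainable evaporative cooling, Emax (W)."""
    return 6.45 * ad * (im / i_tot) * 2.2 * (p_sk - rh_a * p_a)

# Hypothetical inputs: 1.8 m^2 body, 1 clo ensemble, warm-humid air.
ad, i_tot, im = 1.8, 1.0, 0.4
t_sk, t_a = 35.0, 38.0             # degC: air hotter than skin -> dry heat gain
p_sk, p_a, rh_a = 42.0, 49.7, 0.6  # Torr saturated pressures; 60% humidity
h_prod, e_resp = 500.0, 40.0       # W: metabolic heat, respiratory loss

r_c = dry_exchange(ad, t_sk, t_a, i_tot)  # negative value -> heat gain
ereq = e_req(h_prod, e_resp, r_c)
emax = e_max(ad, im, i_tot, p_sk, rh_a, p_a)
print(ereq / emax)  # a ratio above 1.0 indicates an uncompensable exposure
```

Note that because the air is hotter than the skin in this example, the dry-exchange term is negative (a heat gain), which inflates the required evaporative cooling beyond the maximum the clothed body can attain.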