Seasonal change in bone, muscle and fat in professional rugby league players and its relationship to injury: a cohort study
  Erin C Georgeson1, Benjamin K Weeks1, Chris McLellan2, Belinda R Beck1

  1Centre for Musculoskeletal Research, School of Rehabilitation Sciences, Griffith Health Institute, Griffith University, Gold Coast, Queensland, Australia
  2Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Queensland, Australia

  Correspondence to Dr Belinda R Beck; b.beck@griffith.edu.au

Abstract

Objectives To examine the anthropometric characteristics of an Australian National Rugby League team and identify the relationship to type and incidence of injuries sustained during a professional season. It was hypothesised that body composition would not change discernibly across a season and that injury would be negatively related to preseason bone and muscle mass.

Design A repeated measures, prospective, observational cohort study.

Setting Griffith University, Gold Coast, Australia.

Participants 37 professional male Australian National Rugby League players, 24.3 (3.8) years of age, were recruited for preseason 1 testing, of whom 25 were retested at preseason 2.

Primary and secondary outcome measures Primary outcome measures included biometrics; body composition (bone, muscle and fat mass; dual-energy x-ray absorptiometry; XR800, Norland Medical Systems, Inc); bone geometry and strength (peripheral quantitative CT; XCT 3000, Stratec); calcaneal broadband ultrasound attenuation (BUA; QUS-2, Quidel); diet and physical activity history. Secondary outcome measures included player injuries across a single playing season.

Results Lean mass decreased progressively throughout the season (pre=81.45(7.76) kg; post=79.89(6.72) kg; p≤0.05), while whole body (WB) bone mineral density (BMD) increased until mid-season (pre=1.235(0.087) g/cm2; mid=1.296(0.093) g/cm2; p≤0.001) then decreased thereafter (post=1.256(0.100); p≤0.001). Start-of-season WB BMD, fat and lean mass, weight and tibial mass measured at the 38% site predicted bone injury incidence, but no other relationship was observed between body composition and injury.

Conclusions Significant anthropometric changes were observed in players across a professional rugby league season, including an overall loss of muscle and an initial increase, followed by a decrease in bone mass. Strong relationships between anthropometry and incidence of injury were not observed. Long-term tracking of large rugby league cohorts is indicated to obtain more injury data in order to examine anthropometric relationships with greater statistical power.

  • Sports Medicine

This is an open-access article distributed under the terms of the Creative Commons Attribution Non-commercial License, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited, the use is non commercial and is otherwise in compliance with the license. See: http://creativecommons.org/licenses/by-nc/2.0/ and http://creativecommons.org/licenses/by-nc/2.0/legalcode.


Article summary

Article focus

  • Does rugby league player body composition change across a 12-month period in response to preseason training, seasonal game play and off-season rest?

  • Is preseason body composition (bone, muscle and fat) related to injury incidence throughout the season?

  • Are changes in body composition during the playing season related to incidence of injuries?

Key messages

  • Professional rugby league players lose lean mass across a playing season but regain it with preseason training.

  • Strong relationships were not detected between anthropometric characteristics and incidence of injury.

Strengths and limitations of this study

  • Comprehensive anthropometric data were collected from a professional rugby league team at four time points across a 12-month period (preseason, mid-season, postseason and the following preseason) to track changes in body composition related to preseason training, the playing season and the off-season.

  • The most valid and reliable instruments (DXA and pQCT) were employed to determine anthropometric measures. pQCT data are novel in this cohort.

  • Low absolute number of injuries limited the ability to detect strong relationships between injuries and anthropometric measures.

Introduction

Rugby league is a physically demanding, high-impact, full-body contact professional sport. It requires well-developed muscle strength and endurance, speed, agility and aerobic power.1 The high frequency and force of physical collisions encountered during a game2–4 lead to a higher incidence of musculoskeletal injuries5 than is typically observed in other team sports.2 ,3 ,6 Injury incidence has been reported to range from 44.9 to 462.7 injuries per 1000 player hours.2 ,7–9 Such large variations in reported injury incidence may be attributed to inter-study differences in definitions, data collection and reporting methods used.10 Furthermore, skill level, playing intensity and seasonal conditions have been suggested to influence injury incidence.2 ,8 ,9 As 15–30% of total seasonal injuries are classified ‘severe’ (ie, causing a player to miss five or more games),2 ,8 ,11 players can miss up to 20% of games in a season.7 A range of factors have been associated with an increased risk of injury, including low preseason running speed and maximal aerobic power, lighter body weight and greater number of playing years’ experience.5 Furthermore, injury risk is dependent on player position (eg, forwards vs backs), with forwards typically sustaining more injuries than backs.9 ,12

Some anthropometric and physiological characteristics of rugby league players have been described, based on traditional body composition measures of weight, height, body mass index (BMI) and skinfold thickness.13 ,14 Few more direct measures of bone and muscle have been reported, and the relationship of body composition to injury remains unknown.15 Analysis of seasonal anthropometric changes in rugby league players, and examination of the relationships of anthropometrics with the rate and type of injury, may reveal important risk factors. In particular, the identification of modifiable risk factors would give rise to opportunities to reduce the risk of injury to players.

Dual-energy x-ray absorptiometry (DXA) provides a reliable estimate of body composition (bone, muscle and fat).16 A recent study of English Super League players observed an increase in body fat from DXA across the playing season, and a decrease in lean mass.15 Bone mass was also observed to increase to mid-season, but decrease thereafter. It is well known that DXA estimates of bone mass are based on two-dimensional measures of areal bone mineral density that cannot fully account for bone size, discriminate between cortical and trabecular bone envelopes or measure elements of bone morphology that are critical to whole-bone strength. In light of the known limitations of DXA measures, seasonal changes in rugby league player bone strength remain uncertain. Peripheral quantitative CT (pQCT) can discriminate between cortical and trabecular envelopes, and measures true volumetric bone mineral density (BMD) and parameters of bone morphology, thereby providing a superior indication of bone strength to that provided by DXA.17–20 Such measures have not previously been reported for a rugby league cohort.

The aim of the current study was to examine preseason, mid-season and postseason body composition of a professional Australian team using DXA and pQCT, and to identify relationships between baseline and change in body composition with rate and type of injuries sustained across a season.

Methods

Ethical approval

Ethical approval for the study was granted by the Griffith University Human Research Ethics Committee (PES/28/08/HREC). Written informed consent was obtained from each participant.

Subjects

All members of an Australian National Rugby League team (n=44), age 24.6 (3.4) years, playing an average of 15.7 (7.2) games (58% of the season), consented to participate in the study.

Study design and conduct

A repeated measures, prospective, observational study was conducted. Data were collected on four occasions over a 12-month period: preseason (March), mid-season (July) and postseason (September/October) of the 2009 southern hemisphere rugby league season, and preseason (March) of the 2010 season. All data were collected in the Bone Densitometry Research Laboratory at Griffith University, Gold Coast.

Behavioural characteristics—diet and physical activity

Nutrition was assessed via the Cancer Council Victoria's Dietary Questionnaire for Epidemiological Studies (DQES), a diet instrument validated for the Australian population.21 The DQES contains a series of questions pertaining to the subjects’ normal dietary intake over the preceding 12 months. Responses were computer analysed and estimates of total daily energy intake and calcium intake were obtained.

The Bone-specific Physical Activity Questionnaire (BPAQ) is a validated tool for quantifying historical physical activity participation relevant to the musculoskeletal system.22 Participants were asked to record (1) all regular physical activities performed throughout their life and the approximate number of years of participation; and (2) all activities performed on a regular basis over the previous 12 months, including frequency of participation. The BPAQ was analysed using a custom-designed programme (available at http://www.fithdysign.com/BPAQ/) developed on LabVIEW software (National Instruments, Austin, Texas, USA) to produce current, past and total bone-specific physical activity history scores.22 All participants (n=44) completed diet and BPAQ questionnaires.

Biometrics

Subject height was measured to the nearest 0.01 m using a wall-mounted stadiometer (HART Sport & Leisure, Brisbane, Australia). Weight was measured to the nearest 0.01 kg using a robust digital scale (CH-150K, AND Mercury, Brisbane, Australia). Upper-extremity skeletal dominance was determined as the preferred writing hand. Lower-extremity skeletal dominance was determined to be the non-kicking leg according to procedures established and validated in our laboratory.23

Anthropometry

Dual-energy x-ray absorptiometry

DXA was used to determine whole body (WB), lumbar spine (LS), non-dominant femoral neck (FN) and forearm (FA) bone mineral content (BMC; g), bone area (cm2) and bone mineral density (BMD; g/cm2) (XR800 Norland Cooper Surgical, Fort Atkinson, WI, USA, Illuminatus software V.4.2.4). Additionally, WB scans were used to determine lean and fat mass. Short-term DXA measurement precision in our lab is 0.9%, 1.1%, 0.4%, 0.8%, 0.6%, 0.8% and 2.3% for WB, FN, LS, distal radius-ulna, proximal radius-ulna, lean and fat mass, respectively.
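The precision values quoted above are short-term coefficients of variation from repeat scans. As an illustration only, the Python sketch below shows one common way such a figure can be derived, the root-mean-square coefficient of variation (RMS-CV) across duplicate same-day scans; the array values and the exact procedure are assumptions for illustration, not study data or the laboratory's actual method.

import numpy as np

# Hypothetical duplicate whole-body BMD scans (g/cm2): one row per participant,
# two same-day measurements per row. Values are placeholders, not study data.
repeat_scans = np.array([
    [1.231, 1.242],
    [1.305, 1.296],
    [1.188, 1.179],
])

# Root-mean-square coefficient of variation (RMS-CV%), a common way of
# expressing short-term densitometry precision.
means = repeat_scans.mean(axis=1)
sds = repeat_scans.std(axis=1, ddof=1)
rms_cv_percent = np.sqrt(np.mean((sds / means) ** 2)) * 100
print(f"Short-term precision (RMS-CV): {rms_cv_percent:.2f}%")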

Peripheral quantitative CT

pQCT (XCT 3000; Stratec, Pforzheim, Germany) was used to examine total, trabecular and cortical volumetric densities (mg/cm3) of the non-dominant tibia, strength-strain index (SSI; mm3), principal moments of inertia (Imin and Imax; mm4) and muscle area (mm2) at the 4%, 14%, 38% and 66% sites. Short-term measurement precision in our lab is 1.5% and 0.6% for total tibial density at the 4% and 38% sites, respectively.

Quantitative ultrasound

Quantitative ultrasound (QUS-2; Quidel, Mountain View, California, USA) was used to measure broadband ultrasound attenuation (BUA; dB/MHz) of the non-dominant calcaneus. Short-term measurement precision with repositioning was 2.5%.

Performance measures

Single leg stance

Static balance ability was tested using the standard single leg stance (SLS) test.24 Participants stood with feet pelvis-width apart, forearms crossed over their chest and fingers at shoulders. With eyes closed, one foot was lifted off the ground to the level of the opposite ankle, but not touching. Timing commenced from foot lift-off and ceased when (1) arms moved from their starting position; (2) feet touched; (3) elevated foot touched the ground or moved towards/away from the planted foot; (4) grounded foot adjusted position to maintain balance or (5) eyes opened. Each participant was allowed to practise and have up to three attempts on each leg. The subject's best time was recorded in seconds and the test repeated for the opposite leg. A single investigator performed all SLS tests.

Vertical jump

Leg muscle power was assessed by the vertical jump test.25 Participants began by standing beside the Yardstick vertical jump device (Swift Performance Equipment, Wacol, Queensland, Australia), with both feet grounded and positioned shoulder-width apart. The participant was asked to reach as high as possible with their preferred arm and the height of reach was recorded. Participants were then instructed to jump as high as possible in a counter-movement manner without arm swing, and tap the device at the peak of their jump. The best of three attempts was recorded in cm. A single investigator measured and recorded all vertical jump trials.

Injury data collection

All injuries, new or recurrent, sustained during the study period were recorded by two team physiotherapists in attendance at all games and training sessions. An injury was recorded if a rugby league activity/game resulted in any pain, discomfort, illness or disability, and required the player to seek medical intervention from team medical staff.10 All injuries were recorded whether the player missed a subsequent training session/game or not. Injury details including anatomical location and tissue involvement, cause and severity (determined by the number of training days and games missed) were recorded.

Game exposure hours were calculated by multiplying the number of players by the number of games and the duration of the match (ie, 13 players × 29 games × 1.33 h/game).10 Injury incidence was recorded per 1000 game or training hours.2 ,10 ,26 Match injury incidence was calculated by dividing the number of recorded match injuries by the game exposure hours and multiplying by 1000.
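The exposure and incidence arithmetic can be shown directly. The Python sketch below (function names are illustrative only) reproduces the calculation using the figures given above and the 51 injuries reported in the Results:

def game_exposure_hours(players_on_field, games, hours_per_game):
    # Team game exposure = number of players x number of games x match duration (h).
    return players_on_field * games * hours_per_game

def incidence_per_1000_hours(injuries, exposure_hours):
    # Injuries per 1000 player hours of game exposure.
    return injuries / exposure_hours * 1000

exposure = game_exposure_hours(13, 29, 1.33)    # approximately 501 h
print(incidence_per_1000_hours(51, exposure))   # approximately 101.7 injuries per 1000 player hours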

Statistical analysis

Statistical analysis was performed using SPSS V.17.0 for Windows (IBM, Chicago, Illinois, USA). Descriptive statistics (mean (SD)) were generated for subject and injury characteristics, and independent t tests of the original 2009 cohort (n=37) were used to compare the anthropometric characteristics of injured versus non-injured players. Correlation analyses were performed to identify relationships between injury incidence, biometrics, anthropometrics, active test performance and lifestyle factors of the 2009 cohort (n=37). Subsequent multiple regression analyses were used to determine the ability of biometrics and behavioural characteristics to predict variance in anthropometrics and injury incidence. Q–Q plots were generated to determine whether data were normally distributed and Levene's test was used to examine homogeneity of variance. To examine change in anthropometric characteristics across the season, repeated measures analysis of variance (ANOVA) was used, with and without covariates of calcium, weight, age and past BPAQ score (n=37). Results were considered statistically significant at p≤0.05.
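All analyses were run in SPSS as described above. Purely to illustrate the structure of the two main analyses (repeated measures ANOVA for seasonal change, and multiple regression of injury counts on baseline predictors), a minimal Python sketch using pandas and statsmodels is shown below; the data frames, column names and values are invented placeholders, not study data.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.anova import AnovaRM

# Long-format data: one row per player per time point (placeholder values).
long_df = pd.DataFrame({
    "player":  [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "time":    ["pre", "mid", "post"] * 3,
    "lean_kg": [82.1, 81.0, 80.2, 79.5, 78.9, 78.0, 84.0, 83.2, 82.5],
})

# Repeated measures ANOVA for change in a single outcome across the season.
rm_result = AnovaRM(long_df, depvar="lean_kg", subject="player", within=["time"]).fit()
print(rm_result)

# Multiple regression of bone injury count on baseline predictors (wide format).
baseline = pd.DataFrame({
    "wb_bmd":        [1.23, 1.31, 1.19, 1.27, 1.22, 1.29],
    "lean_kg":       [82.1, 79.5, 84.0, 80.3, 78.8, 83.1],
    "fat_kg":        [11.8, 13.2, 10.9, 12.5, 14.0, 11.2],
    "bone_injuries": [0, 1, 0, 2, 1, 0],
})
X = sm.add_constant(baseline[["wb_bmd", "lean_kg", "fat_kg"]])
ols_result = sm.OLS(baseline["bone_injuries"], X).fit()
print(ols_result.rsquared)  # proportion of variance explained (cf. R2=0.527 in Results)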

Results

A total of 44 different subjects were tested: 37 players at the 2009 baseline and 32 in 2010. Player relocation resulted in 12 players leaving the cohort at the end of the 2009 season and seven players joining the study prior to 2010 testing. Table 1 describes the baseline characteristics of the cohort. All data were normally distributed. Notably, players exhibited bone mass that was, on average, over 1 SD higher than age-matched and sex-matched norms for the WB (Z-score range = +0.63 to +1.63), spine (Z-score range = +0.99 to +2.39) and femoral neck (Z-score range = +0.63 to +3.03).
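For reference, a Z-score expresses how far a player's BMD lies from the age-matched and sex-matched reference mean, in units of the reference population's SD (standard definition, shown here in LaTeX notation):

Z = \frac{\mathrm{BMD}_{\text{player}} - \mathrm{BMD}_{\text{age/sex-matched reference mean}}}{\mathrm{SD}_{\text{reference}}}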

Table 1

Baseline characteristics of players at 2009 preseason (n=37)

Anthropometric and performance measures

Bone, muscle and fat measures at all four testing sessions are presented in table 2. Controlling for age, weight, dietary energy intake, calcium consumption and past physical activity, lean mass decreased at each measurement time point throughout the 2009 season, but returned to pre-2009 season values by 2010 preseason (mean difference=0.230, 95% CI −0.987 to 1.168 kg; p=0.86). Neither weight (mean difference=0.300, 95% CI −0.624 to 1.225 kg, p=0.51), nor fat mass (mean difference=0.650, 95% CI −0.441 to 1.656 kg, p=0.24) changed over the 12-month period. The lean and fat per cent response followed a similar pattern (figure 1A,B).

Table 2

DXA and pQCT parameters at all measurement time points

Figure 1

Change in per cent lean (A) and per cent fat (B) of players measured at all testing time points (n=19). (* Significantly different from baseline; Δ significant change between time points.)

Only WB BMD changed significantly across the 2009 season (mean difference=0.021, 95% CI 0.008 to 0.035 g/cm2, p=0.01), increasing until mid-season (mean difference=0.061, 95% CI 0.054 to 0.069 g/cm2, p=0.001) and decreasing thereafter (mean difference=−0.050, 95% CI −0.060 to 0.025 g/cm2, p=0.001), but remaining higher than 2009 preseason values (mean difference=0.010, 95% CI 0.002 to 0.019 g/cm2, p=0.02; figure 2A). No significant changes were observed for LS or FN BMD over the 12-month period (figure 2B,C); however, changes were evident for forearm BMD. Distal forearm BMD decreased until mid-season (mean difference=−0.007, 95% CI −0.013 to 0.001 g/cm2, p=0.01; figure 2D), with a subsequent significant increase from mid-season to postseason (mean difference=0.007, 95% CI 0.002 to 0.010 g/cm2, p=0.001), at which time BMD of the proximal radius and ulna also significantly increased (mean difference=0.007, 95% CI 0.002 to 0.014 g/cm2, p=0.01; figure 2E). Proximal radius BMD decreased from 2009 postseason to 2010 preseason (mean difference=−0.010, 95% CI −0.016 to −0.004 g/cm2, p=0.01).

Figure 2

Seasonal change in whole body (A; n=19), lumbar spine (B; n=20), femoral neck (C; n=19), distal radius (D; n=18) and proximal radius (E; n=18) bone mineral density (BMD) and tibial bone mass (F) at 4% (n=20) and 38% (n=19) sites of professional rugby league players. (* Significantly different from baseline; Δ significant change between time points).

No significant changes were observed in tibial mass at the 4% tibial site (p=0.364). Tibial mass increased at the 14% site over the 12-month period (mean difference=0.017, 95% CI 0.002 to 0.031 g/cm2, p=0.03) and at the 38% site at each time point relative to preseason 2009 (p≤0.05; figure 2F).

No significant changes were observed in single leg stance or vertical jump measures across the season.

Exposure and injuries

Table 3 demonstrates the type and frequency of training sessions performed throughout the 2009 season. Preseason training consisted of a greater focus on strength, skills and conditioning, while a greater emphasis was placed on football drills and between-match recovery during the latter stages of the season. A total of 29 weekly games were played during the 2009 National Rugby League season, inclusive of three preseason and two finals matches, for a total team game exposure of 501 h (13 players × 29 games × 1.33 h).27 A total of 51 injuries were reported across the 2009 season, equating to an incidence of roughly 101 injuries per 1000 player hours (95% CI 81.2 to 120.8). There were 43 new injuries and 8 re-injuries. Twelve injuries occurred when players were running, 21 while being tackled, 4 when tackling another player, 1 when kicking, 8 from a collision, 1 from a fall and 4 were of unknown cause. The players’ dominant sides were affected 45% of the time. Injury types are presented in table 4. Injuries were classified into one of four categories: bone (fracture or bone bruise), muscle (tear or strain), joint disruption (connective tissue/ligament/tendon injury or dislocation) or impact injury (haematoma/contusion, concussion or rib/sternal injury); their frequencies are plotted in figure 3.

Table 3

Type and frequency of training sessions performed throughout the 2009 season

Table 4

Category and frequency of injuries sustained throughout the 2009 season (n=51)

Figure 3

Frequency distribution by injury category of professional rugby league players across a playing season (n=37).

There were no differences in baseline BMI, BUA, WB BMD and BMC, WB lean and fat mass, and tibial mass at the 38% site (p>0.05) between injured (n=20) and non-injured (n=16) players. No factor predicted injury incidence or muscle, connective tissue or impact injuries. In contrast, baseline WB BMD, fat mass, lean mass, weight and tibial mass at the 38% site accounted for around 53% of the variance in bone-related injuries (R2=0.527; p=0.003).

Discussion

Our objective was to examine the anthropometric characteristics of Australian National Rugby League players, preseason, mid-season and postseason, to determine whether relationships exist between those characteristics, and the incidence and type of injuries sustained during a professional season. We found that body composition changed throughout the season, with players tending to lose lean mass and gain fat mass as the season progressed. We also observed WB BMD increased until mid-season and decreased thereafter. Body composition elements were largely unrelated to injury incidence and type throughout the season. Only start-of-season WB BMD, fat and lean mass, weight and tibial mass (38% site) contributed to the prediction of bone injury incidence.

Soft tissue changes

Our observation that players lost lean mass across the season parallels a recent report of body composition changes in English Super League players. However, we did not observe the increase in body fat reported in that study,15 or in a skinfold study of adult amateur Australian Rugby League players.28

We suggest that the lean mass changes reflect the reduction in players’ strength and resistance training regime from 2–4 sessions/week in the preseason to only 0–2 sessions/week during the season, when greater emphasis was placed on match preparation and physical recovery. Markedly increased playing exposure and intensity, including representative games, reduced between-game recovery time such that fatigue, microtrauma and injuries5 ,12 ,15 limited the capacity for fitness and strength training. The implications of the observed lean tissue changes are unclear. It is possible that reductions in lean mass may reduce muscle strength and endurance, speed, agility and power across the course of the season;1 however, the degree of such an effect would be difficult to quantify.

Bone changes

It was not unexpected to find that the average bone mass of the elite rugby league players at each DXA-measured site was considerably higher than that of the ‘normal’ population. The majority of players participating in the current study began playing football prior to, or during, their adolescent years, a time when bone is highly responsive to mechanical loading.29 Although it is difficult to differentiate the influence of self-selection (more physically robust individuals tolerating the rigours of rugby league playing at the elite level better than more diminutive individuals) from the osteogenic effect of rugby league playing, the observed seasonal variation in bone mass is informative. The WB BMD gain to mid-season is likely attributable to the highly favourable preseason loading stimulus of high-intensity fitness and resistance training, coupled with a typically tardy bone-remodelling response. The subsequent loss in the latter stages of the season may reflect a relative ‘detraining’ effect on an ‘over-adapted’ skeleton as training time and intensity are dramatically reduced to optimise game and injury recovery. An increase in WB BMD from preseason 2009 to preseason 2010 may be indicative of continued maturational growth in a relatively young cohort that is yet to attain peak bone mass. This observation is especially meaningful in the light of the observed bone changes across the playing season, which suggest that even the robust skeletons of very highly trained young adult male athletes are sensitive to subtle changes in mechanical loading.

The current study is the first to examine baseline and seasonal changes in elite rugby league player bone morphology and volumetric density using pQCT. Although the significant increase in cortical tibial mass (38% site) across the season has not previously been reported, we found no relationship between any pQCT parameter and player injury. The lack of an observed relationship is potentially related to the small sample size and relatively small number of injuries. Further pQCT data collection is indicated.

Our results revealed that the average BMI of all players was 29 (2.3) kg/m2. Irrespective of position, this value was higher than the recommended BMI range (18.5–25 kg/m2),14 thereby classifying the cohort as overweight or obese. In this case, the metric clearly misrepresents the endomorphic mesomorph physiques of rugby league players.14 Indeed, our DXA findings indicate that rugby league player body composition is largely composed of lean mass (average 80.6 kg or 84.9%) with only a small proportion of fat (average 12.2 kg or 8.7%). Our findings confirm that BMI should not be used to track changes in body composition in mesomorphic athletes.15

Injuries

Joint disruption injuries were the most common injuries sustained (figure 3), which is in partial agreement with some previous reports,2 ,5 ,30 but not all.9 ,11 ,12 ,31 Start-of-season anthropometric factors had limited ability to predict injury incidence or type throughout the season. Furthermore, there were no correlations between any anthropometric factors and muscle, connective tissue or impact injuries; however, WB BMD, fat and lean mass, weight and tibial mass at the 38% site were predictive of bone-related injuries.

Limitations

Although the total participant numbers in our study are comparable to those of other reports,15 ,32 reduced player attendance at 2009 postseason testing somewhat weakened statistical power. The relatively low injury numbers also limited our ability to detect significant associations with risk factors.

Conclusion

We observed an overall loss in lean mass of players throughout a professional rugby league season. This was accompanied by an increase in WB BMD until mid-season, followed by a progressive decrease thereafter. We did not identify any strong relationships between body composition and injuries, with the exception of a relationship between baseline WB BMD, fat and lean mass, weight and tibial mass (38% site) and bone injury incidence. Longer term tracking of rugby league player body composition and injuries is warranted.

Acknowledgments

The authors would like to thank the staff and players of the Australian National Rugby League team who participated in the study.

References

Footnotes

  • Contributors EG, the primary researcher, contributed to study design, data collection and analysis; prepared the manuscript draft; assisted with editing and approved the final text. BW, researcher, contributed to data collection and analysis, reviewed the manuscript draft, edited and approved the final text. CM, team strength and conditioning trainer, liaised with the Australian National Rugby League club, assisted in recruiting and scheduling player testing times, and contributed to data collection. BB, researcher, was responsible for initial study concept, design and liaising with the football club, contributed to data collection and analysis, contributed to manuscript structure and drafting, and edited and approved the final text.

  • Funding None.

  • Competing interests None.

  • Ethics approval Griffith University Human Research Ethics Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement There are no additional data available.