THE PREDICTIVE VALIDITY OF FUNCTIONAL SCREENING TESTS FOR LOWER EXTREMITY OVERUSE INJURIES IN MILITARY RECRUITS OF THE ARMED FORCES OF MALTA by SANDRO VELLA ST20045697

A project submitted in partial fulfilment of the Requirements for the Degree of Master of Science (Sport and Exercise Medicine)

Cardiff School of Sport Ysgol Chwaraeon Caerdydd Cardiff Metropolitan University Prifysgol Fetropolitan Caerdydd

March 2016


DECLARATION This work is being submitted in partial fulfilment of the requirements for the degree of Master of Science (Sport and Exercise Medicine) and has not previously been accepted in substance for any degree and is not being concurrently submitted in candidature for any degree.

Signed.................................................(candidate) Date Word Count – 8108 STATEMENT 1 This dissertation is the result of my own work and investigations, except where otherwise stated. Where correction services have been used, the extent and nature of the correction is clearly marked in a footnote(s). Other sources are acknowledged by footnotes giving explicit references. A bibliography is appended.

Signed...................................................(candidate) Date STATEMENT 2 I hereby give consent for my dissertation, if accepted, to be available for photocopying and for inter-library loan, for deposit in Cardiff Met’s eRepository, and that the title and summary may be available to outside organisations.

Signed.....................................................(candidate) Date


Project Title: THE PREDICTIVE VALIDITY OF FUNCTIONAL SCREENING TESTS FOR LOWER EXTREMITY OVERUSE INJURIES IN MILITARY RECRUITS OF THE ARMED FORCES OF MALTA

Author names and affiliations: SANDRO VELLA¹
¹ Armed Forces of Malta, Medical Centre, Luqa Barracks, Malta

Corresponding author: SANDRO VELLA¹
'Nativity' Triq Xatbet L-Art Attard ATD 1411 Malta
Telephone: 0035679619930
Email: [email protected]


Acknowledgements

It is not the crossing of the finish line that gives you such satisfaction; it is every step you take in order to reach it. This dissertation would not have been possible without the support of many people. I would like to thank everyone who contributed to the studies and supported me during the last years. My cordial thanks go to Dr. Isabel Sarah Moore, my dissertation supervisor, for her helpful, influential and essential advice. She was always there with a sharp eye for detail, and a source of knowledge. I would also like to express my appreciation to the collaborators of the Armed Forces of Malta (Colonel Mark Mallia, Lieutenant Colonel Wallace Camilleri, Sergeant Major Matthew Psaila, Major Jason Ebejer, Lieutenant Maverick Scerri, Sergeant Charles Dimech and Bombardier Aaron Ancillieri) who helped in administrative organisation and facilitated data collection without delays. I would also like to thank all recruits of the Armed Forces of Malta who volunteered for this study. Last, but not least, I need to thank my wife, Claudia, for putting up with an absentee husband and for providing me with support throughout my studies for the past three years. It is a credit to her that I have been able to go to work whilst pursuing further studies. Without her constant support, motivation and understanding, it would not have been possible for me to reach this stage.


Abstract

Objectives: Basic military training is physically demanding, with recruits suffering mostly from overuse-related lower extremity injuries, resulting in lost training days, increased medical costs, discharges and dropouts. The study therefore investigated risk factors for, and the predictive validity of, three functional tests, namely the single leg hop test, weight-bearing lunge test and Y-excursion balance test, to identify military recruits at increased risk of lower extremity overuse injuries (LEOI) during basic military training.

Design: Prospective cohort design.

Setting: The Armed Forces of Malta training school, medical centre and Luqa barracks.

Participants: All 84 invited recruits who started their basic military training in August 2015 agreed to participate in the study. Eight participants terminated their training prematurely, resulting in a final cohort of 76, consisting of 73 males and 3 females with a mean age of 22 ± 2.41 years.

Main Outcome Measures: All LEOI were recorded using the Orchard Sports Injury Classification System. These injuries were analysed for associations with risk factors, including age, height, leg length, body mass index and fitness score, and with the functional tests (single leg hop test, weight-bearing lunge test and Y-excursion balance test) to assess their predictive validity.

Results: A total of 29 recruits sustained 42 LEOI. The only risk factor associated with LEOI was pre-basic military training fitness level (p=0.04), with a lower fitness level related to a higher risk of sustaining LEOI. Functional tests valid for predicting LEOI in recruits were the normalized single leg hop test for distance (p=0.04), with a cut-off score of 86%, recruits jumping a shorter distance being at increased risk of sustaining LEOI; and the composite right-left lower extremity asymmetry of the Y-excursion balance test (p=0.04), with a cut-off score of 12 cm, greater asymmetry indicating a higher risk of LEOI.

Conclusion: This study identified fitness level as a risk factor for LEOI in military recruits, suggesting adjustments to selection and the use of prevention strategies, including pre-conditioning to maintain a high fitness level before the initiation of basic military training. The normalized single leg hop test for distance and the Y-excursion balance test composite reach limb difference, both valid predictors of LEOI, might be implemented as part of the pre-recruitment screening procedure in order to identify military recruits with specific lower extremity deficits rendering them vulnerable to such injuries. Such implementation might lead to significant benefits in terms of reduced injury rates.

Keywords: Basic military training, lower extremity overuse injuries, risk factors, fitness level, screening, predictive validity, functional tests, single leg hop test, weight-bearing lunge test, Y-excursion balance test.


Abbreviations

ACL - Anterior Cruciate Ligament
AD-ROM - Ankle Dorsiflexion Range of Motion
AFM - Armed Forces of Malta
BMI - Body Mass Index
BMT - Basic Military Training
LEOI - Lower Extremity Overuse Injuries
PFPS - Patellofemoral Pain Syndrome
ROM - Range of Motion
SLHT - Single Leg Hop Test
SEBT - Star Excursion Balance Test
WBLT - Weight-Bearing Lunge Test
YEBT - Y-Excursion Balance Test


Introduction

1. Overview

Military recruits undergo basic military training (BMT), characterized by repetitive strenuous and vigorous weight bearing exercises including marching, calisthenics, climbing, hurdling, crawling, jumping, digging, lifting and carrying loads while hiking (Kaufman, Brodine & Shaffer, 2000; Knapik et al., 2006). With specific reference to the Armed Forces of Malta (AFM), the 18-week BMT consists of a standardised weekly exposure time of seven and a half hours of endurance running, upper body circuit training, combat physical training and route marches. Recruits undergo such training to achieve the full potential of their physical fitness (Mohammadi, Azma, Nazeh, Emadifard & Etemadi, 2013), which is vital for operational readiness (Kaufman et al., 2000). Yet, the high and sharp increase in physical demands experienced during this period results in frequent musculoskeletal injuries (Mattila, Parkkari, Korpela & Pihlajamaki, 2006), leading to a reduction in physical fitness, loss of training days, increased medical costs (although there are no official costs reported by the AFM, the United States military defence reported an estimate of $31,000 per injured discharged recruit during the 2005 fiscal year [Niebuhr, Scott, Powers & Krauss, 2008]), discharges and dropouts (Blacker, Wilkinson, Bilzon & Rayson, 2008). In fact, in the only study that has been carried out on the AFM, 8% of the total recruits in 2014 dropped out because of musculoskeletal injuries (Psaila, 2015). Military recruits do suffer from acute musculoskeletal injuries, yet most injuries tend to be overuse-related, mainly affecting the lower extremities (Kaufman et


al., 2000; Canham-Chervak, 2010). In fact, various studies have reported high incidence rates of overuse injuries in military recruits. Rauh and colleagues (2006) noted that 50% of Marine Corps recruits sustained lower extremity overuse injuries (LEOI) (Rauh, Macera, Trone, Shaffer & Brodine, 2006), while both Davidson, Wilson, Chalmers, Wilson and McBride (2009) and Baxter, Baycraft and Baxter (2011) reported that 10% of the recruits of the New Zealand Defence Force were affected by LEOI at any one time during the training period. Yet such rates tend to vary between studies due to the different training regimes and methodologies used (Kaufman et al., 2000; Blacker et al., 2008). Notwithstanding, these data are quite alarming considering that the number of injuries sustained by military recruits is comparable to or even greater than that reported by endurance athletes (Kaufman et al., 2000). Considering that most of the physical training performed includes strenuous repetitive exercises of the lower limb (Rauh et al., 2006), most of the injuries tend to occur at or below the knee (Kaufman et al., 2000). In fact, Hauret and colleagues (2010) reported that 82% of the 1.6 million injuries sustained by United States army recruits were overuse-related, with 42% of these involving the lower extremities, of which 23% were sustained by the knee, followed by the foot and ankle (15%) and the pelvis, hip and thigh (4%) (Hauret, Jones, Bullock, Canham-Chervak & Canada, 2010). Overuse injuries result from the inability of cells to repair micro-damage caused by cumulative micro-traumatic forces (Maganaris, Narici, Almekinders & Maffulli, 2004). These forces result from repetitive and/or excessive activity following the high volume and intensity of BMT, especially in those not adapted to sharp increases in physical demands (Jones, Thacker, Gilchrist, Kimsey &

Sosin, 2002; Knapik et al., 2010), who strain tissues beyond their ability to return to their healthy state, leading to an insidious onset of pain and inflammation (Hoffman, Chapnik, Shamis, Givon & Davidson, 1999). Such injury etiology results in overuse conditions, including stress fractures within the bones of the feet, enthesopathies involving tendons and fascia in the feet, fascial and muscular compartment syndromes in the lower leg, tibial stress reaction syndromes, tibial stress fractures, as well as contractile, ligamentous and intra-articular injuries to the knee, hip and lumbo-pelvic regions (Rome, Handoll & Ashfort, 2005). The explanations for the high frequency of LEOI in military recruits are multifactorial, with these being linked to intrinsic risk factors (those from within the body) and extrinsic risk factors (those outside of the body) (Murphy, Connolly & Beynnon, 2003; Blacker et al., 2008). Amongst military recruits, external risk factors include: equipment choice, weight of carried load (Birrell, Hooper & Haslam, 2007), smoking (Kaufman et al., 2000; Bulzacchelli, Sulsky, Rodriguez-Monguio, Karlsson & Hill, 2014) and variation in training surfaces (Leggat & Smith, 2007), whilst internal risk factors include: gender (Gemmell, 2002; Allsopp, Scarpello, Andrews & Pethybridge, 2003), physical fitness (Bell, Mangione, Hemenway, Amoroso & Jones, 2000; Blacker et al., 2008; Bulzacchelli et al., 2014), body mass index (BMI) (Blacker et al., 2008), age (Bulzacchelli et al., 2014), ethnicity (Blacker et al., 2008), previous injury (Kaufman et al., 2000), lower limb strength (Allsopp et al., 2003), and flexibility (Knapik, Sharp, Canham-Chervak, Hauret, Patton & Jones, 2001). Yet, solely recognising risk factors of overuse injuries as problematic does little for their prevention. Despite this, only a few large-scale military prospective studies

have validated the concept of LEOI preventive measures through predicting those at risk of such injuries (Pope, Herbert, Kirwan & Graham, 1999; Rauh, Macera, Trone, Reis & Shaffer, 2010). Factors for overuse injuries are complex and there is still relatively little known regarding their prediction (Serfontein, 2009). Thus, it would appear logical that such complexity be reflected in the use of screening tools which acknowledge the multi-factorial nature of injury (Hreljac, Marshall & Hume, 2000; Serfontein, 2009). Low-cost, valid and reliable predictive screening tools may be a crucial component in preventing overuse injuries in those at high risk of developing LEOI on a large scale, such as in the military. With this information, training can be adjusted for these recruits in order to prevent potential overuse injuries, thus saving the military a substantial amount of money, time and other resources (Baxter, 2014).

2. Functional Screening Tests

It is acknowledged that overuse injuries are multifactorial (Molloy, Feltwell, Scott & Niebuhr, 2012); however, within a military setting, certain risk factors, including weaponry carriage and restricted arm movements, cannot be modified (Baxter, 2014). The most feasible recommendation with regards to preventive measures could be the identification and correction of an individual's specific internal risk factors, such as lower extremity strength, flexibility and postural stability, which are vital physical components needed by recruits during their BMT (Jones & Knapik, 1999). On the basis that identifiable deficits in functional movement patterns have the potential to limit performance and make one susceptible to injury (Schneiders, Davidsson, Horman &


Sullivan, 2011), new approaches are focusing on the examination of movement patterns and their relationship to injury (Cook, Burton & Hoogenboom, 2006; Kiesel, Plisky & Voight, 2007; Mottram & Commerford, 2008) rather than examining a sole muscle group or joint. This is based on data indicating that dysfunction in one body part may contribute to malfunctioning in other body parts (Wainner, Whitman, Cleland & Flynn, 2007). Thus the section below will consider internal risk factors, including lower extremity strength, flexibility and postural stability, with regards to their screening methods.

2.1. Lower Extremity Strength

According to the closed kinetic chain, lower extremity proximal strength controls distal segments so as to prevent injuries (Trojian, 2006). Hence, a malfunctioning lower extremity joint may lead to injuries manifested in other joints or structures (Prentice & Voight, 2001). In fact, lack of lower extremity muscle strength has been associated with an increased risk of LEOI (King, 2013) and this has been exhibited in different studies (Duffey, Martin, Cannon, Craven & Messier, 2000; Soderman, Alfredson, Pietila & Werner, 2001; Mangine et al., 2014). Through a prospective study, Mangine et al. (2014) reported an association between bilateral lower extremity muscle structural differences, including muscle thickness and pennation angle evaluated through ultrasound, and increased lower extremity injuries in basketball players. Although isokinetic strength tests lack training specificity, Soderman et al. (2001) reported a reduction in quadriceps and hamstring strength to be related to a higher risk of overuse injuries in female soccer players (OR=1.13), and Duffey et al. (2000) identified decreased knee flexor and extensor muscle strength as predictors of anterior knee pain in runners. Yet, although such studies report lack of strength to be related to lower extremity injuries in the wider population, there is a lack of research on military recruits. The only prospective study reported that male recruits whose lower extremity strength was one standard deviation below the population mean were at a significantly higher risk of stress fractures than the stronger recruits (Hoffman et al., 1999). However, the use of the one-repetition leg press method may be questioned, as it lacks specificity of training. On the other hand, a functional screening test might be ideal since it closely simulates a given task (Reiman & Manske, 2009). Such a method could include inexpensive and rapid field tests, which are effective and reliable at predicting a person's lower extremity strength and power (Hamilton, Shultz, Schmidt & Perrin, 2008; Keeley, Plummer & Oliver, 2011). Ostenberg and colleagues (1998) proposed the use of the Single Leg Hop Test (SLHT) for distance to detect lower limb weaknesses and asymmetries in healthy populations, yet there is a dearth of research investigating the effectiveness of specific hop jumps in predicting strength deficits in military recruits (Ostenberg, Roos, Ekdahl & Roos, 1998). The SLHT mimics the specific action of jumping and landing (Jones & Knapik, 1999; Fitzgerald, Lephart, Hwang & Wainner, 2001). It possesses high intra-rater reliability in healthy individuals, with intra-class correlation coefficients ranging between 0.93 (Bandy, Rusche & Tekulve, 1994) and 0.96 (Bolgla & Keskula, 1997). It has been validated and typically used to identify those at risk of knee instability after anterior cruciate ligament (ACL) injuries (Grindem et al., 2011). Yet, the SLHT has also been validated as a test for lower limb strength after having obtained criterion validity against the vertical jump test (Gustavsson et al., 2006), the latter being a widely used test to measure

lower extremity muscle strength (De Salles, Vasconcellos, De Salles, Fonseca & Dantas, 2012). The currently available studies making use of the SLHT to predict the risk of sustaining lower extremity injuries have come up with contradictory results (Davies & Zillmer, 2000; Brumitt, Heiderscheit, Manske, Niemuth & Rauh, 2013). In a prospective study carried out on 110 collegiate athletes, Brumitt et al. (2013) found an association between greater SLHT distance and increased risk of lower extremity injuries in males. Contradicting this finding, Davies and Zillmer (2000) indicated that the greater the SLHT distance, the lower the risk of sustaining lower extremity injuries, and proposed that male athletes should be able to hop at least 80% of their height in order to return to sport after injury. Such conflicting results might stem from methodological limitations, since the former study failed to investigate playing exposure times: those who had a greater hopping distance might have had greater playing times, hence increasing their risk of injuries. Thus, having a cohort, such as military recruits, completing identical training for the same exposure time would eliminate or control for such potential confounders of injuries.

2.2. Ankle Dorsiflexion Range of Motion

Adequate ankle dorsiflexion range of movement (AD-ROM) is vital for the normal functional performance of activities including walking and running (Rabin, Kozol, Spitzer & Finestone, 2015). Furthermore, it is an important component in the absorption of lower limb load when marching and landing from a jump (Malliaras, Cook & Kent, 2006). Despite this, existing knowledge on the potential importance of AD-ROM for the development of LEOI in military


recruits is, in general, rather poor, with research having mostly focused on dorsiflexion range in various specific, individual conditions in athletes (Riddle, Pulisic, Pidcoe & Johnson, 2003; Piva, Goodnite & Childs, 2005; Backman & Danielson, 2011). Studies have pointed out the association between decreased AD-ROM and lower extremity injuries (Gabbe, Bennell, Wajswelner & Finch, 2004; Piva et al., 2005).

Malliaras et al. (2006) demonstrated that volleyball players exhibiting AD-ROM of less than 45 degrees were at 1.8 to 2.8 times the risk of developing patellar tendinopathy. Nonetheless, due to the cross-sectional design of the study, one cannot determine whether reduced AD-ROM is a cause of patellar injuries or a mere side effect of the condition. Addressing this, in a prospective study on 90 junior elite basketball players, Backman and Danielson (2011) found that those with limited AD-ROM were at a significantly higher risk of developing patellar tendinopathy. With regards to plantar fasciitis, Irving et al. (2007) reported more people with increased AD-ROM in the chronic heel pain group (45°) compared to the healthy group (40°) (Irving, Cook, Young & Menz, 2007). This study is in direct contrast with the data obtained by Patel and DiGiovanni (2011), who found that 83% of patients with plantar fasciitis had restricted AD-ROM. Such contradictory results might stem from the different methodologies used to measure AD-ROM, as the former study used the non-weight-bearing method whilst the latter used the weight-bearing method. The Weight-Bearing Lunge Test (WBLT) is a practical and valid test for assessing AD-ROM in the clinical setting (Krause, Cloud, Forster, Schrank & Hollman, 2011), possessing excellent inter-tester reliability ranging between

0.90 (Konor, Morton, Eckerson & Grindstaff, 2012) and 0.97 (Bennell, Talbot, Wajswelner, Techovanich, Kelly & Hall, 1998). Unfortunately, to date, there is a dearth of literature making use of such a test in predicting LEOI in military recruits, with the available literature focusing on selected conditions. Pope and colleagues (1998) conducted a prospective study on 1093 army recruits over a 12-week training program and reported the WBLT not to be predictive of lower extremity stress fractures, due to no correlation between the two (Pope, Herbert & Kirwan, 1998). Furthermore, Rabin et al. (2015) did not find a significant difference in AD-ROM between those suffering from Achilles tendinopathy and those not. Yet, both of these studies limited their investigations to a select few lower extremity injuries and thus one cannot deduce whether decreased ankle dorsiflexion was associated with other types of overuse injuries. There is thus a need for prospective studies to examine the validity of the WBLT in predicting the risk of all LEOI.

2.3. Postural Stability and Neuromuscular Control

A reduced ability of athletes to control the position of their centre of gravity is a potential risk factor for lower extremity injuries (Murphy et al., 2003; Zazulak, Hewett, Reeves, Goldberg & Cholewicki, 2007a). This is underpinned by the theoretical foundation that decreased stability or a proprioceptive deficit (Cuğ, Ak, Ozdemir, Korkusuz & Behm, 2012) at or around a joint leads to a delayed reflex response, potentially leading to injurious movements (Zazulak et al., 2007a). In fact, Switlick and colleagues (2015) have linked deficits in neuromuscular control, postural control and balance with a higher risk of lower extremity injuries in athletes (Switlick, Kernozek & Meardon, 2015). Furthermore, the importance of postural control, which using the theory of the

kinetic chain is defined as the ability of the core muscles to support and control distal movements (Trojian & McKeag, 2006), has been highlighted in studies linking the balance and proprioceptive ability of the core muscles to future injuries (Trojian & McKeag, 2006; Hrysomallis, 2007; Zazulak et al., 2007b). Such suggestions underpin the investigation of stability and proprioception as the basis of overuse injury screening tests (Lee, 2008). There have been a number of studies which have used various proprioceptive/balance tests in order to quantify risks based on the proprioceptive ability of individuals; however, the majority of studies have focused on the relationship between postural control and acute ankle injuries (McGuine, Greene, Best & Leverson, 2000), with few studies focusing on LEOI. Furthermore, de Jong et al. (2005) argue that performance on each proprioceptive test is different and results should be recognised as test-specific, especially when considering that static proprioceptive tests do not require a great degree of dynamic balance (de Jong, Kilbreath, Refshauge & Adams, 2005) and may thus not provide an insight into additional injury risks (Butler, Lehr, Fink, Kiesel & Plisky, 2013). Conversely, dynamic postural-control tests mimic the demands of physical activities, thus providing a better evaluation of injury risks (Gribble, Hertel & Plisky, 2012). One such test is the Star Excursion Balance Test (SEBT), a functional, inexpensive and easy-to-use clinical dynamic balance and postural-control test, which can translate itself into movement tasks done during training (Gribble et al., 2012). In healthy individuals, this test possesses moderate to excellent (intra-class correlation coefficient [ICC] 0.84 - 0.99) intra-rater reliability (Hertel, Miller & Denegar, 2000; Plisky, Gorman, Butler, Kiesel, Underwood & Elkins,

2009). This test has been validated to detect the level of neuromuscular and postural-control deficits in patients with lower extremity injuries such as chronic ankle instability (Gribble, Hertel & Denegar, 2007), ACL reconstruction (Herrington, Hatcher, Hatcher & McNicholas, 2009) and patellofemoral pain syndrome (PFPS) (Aminaka & Gribble, 2008). The SEBT has also been used in prospective studies in order to predict the risk of sustaining lower extremity injuries (Plisky, Rauh, Kaminski & Underwood, 2006; Butler et al., 2013). Using the SEBT as a predictive test for lower extremity injuries, Plisky and colleagues (2006) reported that basketball players with anterior right-to-left reach differences of more than 4 cm during the pre-season were two and a half times more likely to sustain lower extremity injuries, whilst girls with a composite reach score of less than 94% of their limb length were six and a half times more likely to sustain a lower extremity injury (Plisky et al., 2006). Furthermore, Butler et al. (2013) found that poor dynamic balance performance, exhibited as a composite SEBT score of less than 89%, was associated with elevated lower extremity injury risks in college football players. Despite promising results, the external validity of such a test is currently limited, as the differences observed in the results between studies indicate the need for population-specific cut-off points to be developed for screening injury risks (Gribble et al., 2012). As a result, there is a need for studies using the SEBT on military recruits so as to establish its predictive validity and suitable cut-off points for LEOI. Only by doing this can the SEBT be better used to predict injuries in such a population.


3. Purpose of the study

Several risk factors have been linked with LEOI in military recruits during BMT. However, specific screening programmes for LEOI have never been undertaken in the AFM. Tools that have been validated in non-military populations will be considered in this study to assess their utility in the military setting, as the relationship of the SLHT, WBLT and YEBT, each individually, with LEOI risk in military recruits has never previously been reported. Thus, the aim of the present study was to investigate risk factors and the predictive validity of these three functional tests to identify military recruits with an increased risk of LEOI during the four months of BMT. Ultimately, the purpose of this study is to optimise recruit selection methods in the AFM and, when possible, alter risk factors prior to the initiation of BMT to help reduce injury rates and premature discharge from BMT.


Method

Participants

A cohort of 87 military recruits (81 males and 6 females) scheduled to undergo BMT in July 2015 satisfied the criteria for enlistment into the AFM. They were invited to participate in the study through a letter distributed during their medical screening. All of the cohort agreed to participate (mean age of 22 ± 2.4 years; mean mass of 74.3 ± 10 kg and mean height of 173.3 ± 7.9 cm), and provided written informed consent after receiving comprehensive information, including that they could withdraw from the study at any point during BMT, which lasted 126 days. No specific exclusion criteria were applied since enrolment procedures within the AFM do not allow participants to undertake BMT if suffering from acute or chronic musculoskeletal injuries. Consent to carry out the study was obtained from the appropriate personnel at the AFM. Ethical approval was obtained from the Cardiff Metropolitan University Ethics Board and the University of Malta Ethics Board.

Test procedures

All testing was carried out on the same day by the same investigator, a week prior to the commencement of BMT, and participants were given a 15-minute presentation to familiarise them with the methodology. Questionnaires covering demographic and anthropometric data, including age, sex, height, mass, dominant limb and previous injuries during the preceding three months, were handed out to each participant. Further anthropometric data including: BMI, calculated by dividing the mass in kilograms by the height in metres squared; and the average leg length of both limbs, obtained by quantifying

the distance from the anterior superior iliac spine to the centre of the ipsilateral malleolus, was collected. Moreover, the pre-BMT physical fitness score, consisting of push-ups, sit-ups and a one-mile running test with a maximum possible score of 300, was recorded by obtaining it directly from the participant's entry files at the AFM training school. The functional screening tests included the single leg hop test, weight-bearing lunge test and Y-excursion balance test. Preceding these tests, subjects were asked to perform a dynamic warm-up consisting of a five-minute jog (Brumitt et al., 2013), followed by all three tests in random order. The tests were performed in a gym hall with laminated flooring. To reduce the risk of fatigue, a 30-second and a two-minute rest were provided between each repetition and between each test, respectively. SLHT: The SLHT was performed following the procedure of Noyes and colleagues (Noyes, Barber & Mangine, 1991). Recruits stood with one leg on a line (a strip of tape on the floor placed perpendicular to a measuring tape; Figure 1). They were asked to hop forward as far as possible along the line of the tape measure, with no restriction to arm movements. All subjects performed one practice trial for each leg before the actual recorded trials. The recruits then performed three single leg hops on each leg, alternating legs after each hop. For the test to be recorded, subjects had to hold the landing position for five seconds (Davies & Zillmer, 2000). If not, the single leg hop was repeated. The distance hopped was recorded from the starting line to the rearmost heel. The maximum distance in centimetres obtained from the three trials was used for analysis.
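For illustration only, the anthropometric and hop-distance calculations described above can be sketched in a few lines of Python. The function names and trial values are hypothetical, and the normalisation of hop distance to body height follows the height-based criterion of Davies and Zillmer (2000); the normalisation used in this study's analysis is reported separately.

```python
def bmi(mass_kg: float, height_m: float) -> float:
    """Body mass index: mass in kilograms divided by height in metres squared."""
    return mass_kg / height_m ** 2

def best_hop_cm(trial_distances_cm):
    """Maximum of the three recorded single leg hop distances for one limb."""
    return max(trial_distances_cm)

def normalised_hop_pct(best_cm: float, height_cm: float) -> float:
    """Hop distance expressed as a percentage of body height."""
    return 100.0 * best_cm / height_cm

# Cohort means from the Method section: 74.3 kg, 173.3 cm.
print(round(bmi(74.3, 1.733), 1))                                        # ≈ 24.7
print(round(normalised_hop_pct(best_hop_cm([140.0, 152.0, 148.0]), 173.3), 1))
```

Taking the maximum of the three trials mirrors the recording rule above; normalising to height allows hop distances to be compared across recruits of different statures.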


Figure 1. Single leg hop test with the right lower extremity.

WBLT: The WBLT was performed following the procedure of Bennell and colleagues (Bennell, Talbot, Wajswelner, Techovanich, Kelly & Hall, 1998). A strip of tape and a measuring tape parallel to it were fixed to the floor, continuing straight up the wall (Figure 2). The foot was placed on the tape such that the heel and great toe were aligned on top of the line. The participant lunged forward to contact the centre of the patella with the line on the wall, while the foot position was maintained with the heel in contact with the floor. Holding this position, the subject's foot was then moved progressively away from the wall until the maximal distance between the wall and the great toe was obtained without lifting the heel off the floor. Subjects performed three trials with each limb. The maximal distance obtained for each limb was used for statistical analysis.


Figure 2. Weight bearing lunge test.

Y-Excursion Balance Test (YEBT): Based on factor analysis results indicating shared variance across the eight reach directions of the SEBT (Hertel, Braham, Hale & Olmsted-Kramer, 2006), the test was performed in only three directions, namely the anterior, posteromedial and posterolateral; thus the YEBT was used as a modification of the SEBT. The YEBT was performed following the procedure of Plisky et al. (2009). Participants stood with one leg in the middle of a Y formed by three pieces of tape laid in the anterior, posteromedial and posterolateral directions, with direction defined by the reaching leg relative to the stance leg (Figure 3). Three measuring tapes were fixed parallel to each tape. Subjects were asked to reach as far as possible in each direction with the other leg and lightly touch the line whilst maintaining balance

with the standing leg. In order to minimize the learning effect, participants reached in each direction six times (Hertel et al., 2000) before performing three recorded trials with each lower limb. A trial was repeated if the subject lost balance on the standing leg or touched down with the reaching foot. The distance was read from the tape measure and the maximum reach score from the centre for each direction was extracted for data analysis.

Figure 3. Y-excursion balance test – reaching in the anterior direction with the left lower extremity.

Injury Surveillance: Throughout the 126 days of BMT, the recruit cohort was followed and injuries were registered when physical damage was reported to the medical centre. All sustained injuries were recorded using the Orchard Sports Injury Classification System version 10.1 (Rae & Orchard, 2007). For the purpose of this investigation, an overuse injury was defined as a 'musculoskeletal problem of insidious onset associated with repetitive physical activity' (Kaufman, Brodine, Shaffer, Johnson & Cullison, 1999). Injuries were recorded by a medical doctor specialised in sports and exercise medicine. The data collected included the diagnosis and the lower extremity

anatomical side and site. The latter was categorized by region: hip and thigh, knee, leg, ankle and foot. For the purpose of descriptive analysis, multiple injuries in the same recruit were recorded. The study investigator reviewed injury records throughout the BMT to ensure appropriate data collection. Because recruits might not have reported musculoskeletal injuries for fear of the consequences, the investigator met each recruit individually at the end of the BMT to identify any unreported injuries. All injuries were reviewed and classified as acute or overuse, but statistical analyses were conducted only for overuse injuries.

Data Analysis

For the purpose of this study, a 'control' was calculated by taking the average scores of the uninjured recruits for both lower extremities. To make reaching distances comparable between recruits, distances obtained from the SLHT and YEBT were normalized to leg length (Reiman and Manske, 2009); this was not required for the WBLT (Hoch and McKeon, 2011). The normalized value was obtained by dividing the distance reached by the leg length and multiplying by 100 to express the value as a percentage. For the YEBT, the normalized composite reach distance was obtained by summing the scores of the three reach directions for each limb, dividing by three times the leg length, and multiplying by 100. Lower extremity asymmetry was obtained by subtracting the lowest from the highest score of the two lower extremities. For the YEBT, the composite right-left lower extremity reach difference was obtained by subtracting one limb's composite score from the other for each recruit.
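The normalization and asymmetry arithmetic described above can be sketched as follows. This is a minimal Python illustration; the function names and sample values are hypothetical and not taken from the study data.

```python
# Normalization of functional-test distances to leg length, as described above.
# Function names and the sample recruit values are hypothetical illustrations.

def normalize_to_leg_length(distance_cm: float, leg_length_cm: float) -> float:
    """Express a hop/reach distance as a percentage of leg length."""
    return distance_cm / leg_length_cm * 100

def yebt_composite(anterior: float, posteromedial: float, posterolateral: float,
                   leg_length_cm: float) -> float:
    """Composite YEBT reach: sum of the three directions divided by
    three times leg length, expressed as a percentage."""
    return (anterior + posteromedial + posterolateral) / (3 * leg_length_cm) * 100

def asymmetry(left: float, right: float) -> float:
    """Lower extremity asymmetry: highest score minus lowest score."""
    return max(left, right) - min(left, right)

# Hypothetical recruit: 90 cm leg length, 150 cm single leg hop
print(normalize_to_leg_length(150, 90))   # ~166.7 %
print(yebt_composite(65, 95, 85, 90))     # ~90.7 %
print(asymmetry(150, 142))                # 8 cm
```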

Independent Variables: Independent variables in this study included the risk factors age, height, BMI and fitness score. (Although females were included in the analysis, comparisons between genders were not performed due to the low number of female participants [n = 3].) Functional test score variables are grouped in table 1.

Table 1. Independent variables – functional tests and their respective variables

Single-leg hop test:       right and left lower extremities; lower extremity asymmetry
Y-excursion balance test:  anterior reach of right and left lower extremities; anterior reach asymmetry; posteromedial reach of right and left lower extremities; posteromedial reach asymmetry; posterolateral reach of right and left lower extremities; posterolateral reach asymmetry; composite reach of right and left lower extremities; composite reach asymmetry
Weight-bearing lunge test: right and left lower extremities; lower extremity asymmetry

Dependent variable: The presence or absence of overuse injuries constitutes the dependent categorical variable. Injuries were sub-grouped by body area (hip, thigh, knee, leg, ankle and foot) for each side of the body (right and left).

Statistical Analysis

Descriptive statistics (means and standard deviations) were calculated for the subjects' baseline demographic characteristics, functional test scores and injury data. The injury risk was calculated as: injury risk = number of recruits with one or more injuries / total number of recruits. The injury incidence was calculated as: injury incidence (per 100 days) = [total number of injuries / (total number of recruits × number of training days)] × 100. All data were checked for normality of distribution with the Shapiro-Wilk test. To examine associations between potential individual risk factors and overuse injuries, fitness score data were analysed using an independent-samples t-test; age, height, BMI and leg length were non-parametrically distributed and were therefore analysed using the Mann-Whitney U test. Analysis of predictors of lower extremity overuse injuries was performed by comparing normally distributed test scores between all injured limbs and the uninjured recruits' 'control' limb via an independent-samples t-test; the Mann-Whitney U test was used for non-parametric test scores. Because cut-off points for these tests in military recruits have not been previously reported, a cut-off point for each statistically significant predictor that most accurately discriminated between injured and non-injured recruits was obtained from a receiver-operating characteristic curve (Zou, O'Malley & Mauri, 2007). The cut-off point was identified as the uppermost left point on the curve, maximizing both sensitivity and specificity. Subsequently, relative to this cut-off point, a 2 × 2 contingency table was created to compare the proportion of recruits in the high-risk group with the proportion in the referent group via risk ratios. To test for significant differences between the injured and uninjured lower extremity scores of the same recruit, a paired-samples t-test was used. Furthermore, a one-way analysis of variance was used to test for differences between the mean lower extremity scores of three groups: non-injured recruits, recruits with one injured extremity, and recruits with both lower extremities injured. Statistical analysis was performed using SPSS for Windows (version 21.0), with p < 0.05 indicating statistical significance.
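The injury risk, injury incidence and ROC-based cut-off selection described above can be sketched in Python. This is a minimal illustration, not the study's actual analysis: the incidence example and the score/label arrays are hypothetical, and the cut-off routine implements the sensitivity-plus-specificity (Youden) criterion directly rather than via SPSS.

```python
# Injury risk, injury incidence, and ROC-style cut-off selection.
# Example inputs below are hypothetical illustrations, except the 29/76
# injury-risk figures, which are taken from the study's reported counts.

def injury_risk(n_injured: int, n_total: int) -> float:
    """Proportion of recruits with one or more injuries."""
    return n_injured / n_total

def injury_incidence_per_100_days(n_injuries: int, n_recruits: int,
                                  n_days: int) -> float:
    """Injuries per 100 recruit-days of training."""
    return n_injuries / (n_recruits * n_days) * 100

def best_cutoff(scores, injured):
    """Pick the threshold maximizing sensitivity + specificity (Youden's J),
    treating scores at or below the threshold as 'high risk'."""
    pos = [s for s, i in zip(scores, injured) if i]
    neg = [s for s, i in zip(scores, injured) if not i]
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        sens = sum(s <= t for s in pos) / len(pos)  # injured correctly flagged
        spec = sum(s > t for s in neg) / len(neg)   # uninjured correctly passed
        if sens + spec - 1 > best_j:
            best_j, best_t = sens + spec - 1, t
    return best_t, best_j

print(injury_risk(29, 76))                        # ~0.38, the reported injury risk
print(injury_incidence_per_100_days(10, 50, 100)) # hypothetical counts -> 0.2

# Hypothetical normalized hop scores and injury labels (1 = injured)
scores  = [80, 85, 88, 95, 100, 105, 110, 120]
injured = [1,  1,  1,  0,  1,   0,   0,   0]
print(best_cutoff(scores, injured))               # (88, 0.75)
```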


Results

A total of 76 military recruits formed the final population, after eight recruits (five males and three females), representing 10.5% of the total population, dropped out of the BMT. The baseline characteristics, including anthropometrics, of the final population are shown in table 1.

Table 1. Baseline characteristics of military recruits. Values expressed as means ± standard deviations.

Recruits (n)       76
Age (y)            22 ± 2.41
Height (cm)        173.3 ± 7.87
Leg length (cm)    92.6 ± 4.37
Weight (kg)        74.3 ± 10.04
BMI                24.7 ± 3.24
Fitness score      180.87 ± 41.83

Personal reasons accounted for the majority of the dropouts (37.5%). Two recruits (25%) were discharged due to knee injuries, another two (25%) withdrew because of musculoskeletal ailments, and one (12.5%) was discharged for failing to attain the required standards. Their data were not included in the statistical analysis.

Injuries

A total of 39 recruits (51.3%) from the cohort of 76 incurred lower extremity injuries (acute and/or overuse), of whom 29 (74.4%) sustained at least one LEOI; this amounted to 38.2% of the whole population, giving an injury risk of 0.38. During the course of the BMT, 42 overuse injuries in 37 lower extremities were reported, with 8 recruits sustaining injuries to both lower extremities, for an incidence rate of 0.55 injuries per 100 days of BMT. Overall, there was only a minor difference between the lower extremities with respect to dominance, with slightly more injuries (a difference of 9.5%) occurring in the non-dominant lower extremity, as shown in table 2. The most frequent LEOI were patellofemoral pain syndrome (19%) and medial tibial stress syndrome (19%), followed by calcaneal stress fracture (14.3%). The full list of injury types is presented in appendix A.

Table 2. Injuries grouped according to the dominant and non-dominant limbs. (Dominance was defined as the extremity used to kick a football.)

Body area injured           Dominant     Non-dominant    Total injuries (%)
Hip and thigh               4            4               8 (19)
Knee                        4            6               10 (23.8)
Leg                         7            8               15 (35.7)
Ankle                       1            2               3 (7.1)
Foot                        3            3               6 (14.3)
Total number of LEOI (%)    19 (45.2)    23 (54.8)       42 (100)

Risk Factors

Age, height and BMI were not associated with risk of LEOI. The only significant risk factor for LEOI was fitness score (Table 3): recruits with a higher fitness score, indicative of a higher fitness level, had a significantly lower risk of sustaining LEOI than those with lower fitness scores.


Table 3. Baseline characteristics of military recruits according to presence of LEOI.

                 LEOI: Yes (n=29)    LEOI: No (n=47)    P-value
Age              22.2 ± 2.6          21.9 ± 2.3         0.62
Height (cm)      173.5 ± 7.1         173.3 ± 8.4        0.77
BMI              24.8 ± 3            24.6 ± 3.4         0.83
Fitness score    168.4 ± 43.6        188.6 ± 39.2       0.04

Functional Screening Tests as Predictors of LEOI

The only significant predictors of LEOI were the SLHT for distance (p = 0.04; 95% CI for uninjured [96, 112], for injured [97, 100]) and the YEBT composite right-left lower extremity difference (p = 0.04; 95% CI for uninjured [5, 9], for injured [8, 13]); all other tests were non-significant (Table 4). The former showed that those who hopped a greater distance had a decreased risk of sustaining LEOI compared with those who hopped a smaller distance. The latter indicated that those with a greater composite excursion asymmetry between the lower extremities had a higher risk of LEOI than those with a more symmetrical excursion. The cut-off scores classifying recruits at increased risk of LEOI, and the resultant relative risks, are presented in table 5.
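The 2 × 2 contingency comparison used for these cut-off scores reduces to a simple risk-ratio calculation. A minimal sketch follows; the cell counts are hypothetical illustrations, not the study's data.

```python
# Risk ratio from a 2 x 2 contingency table (hypothetical counts).
#                     injured    uninjured
#   high-risk group      a           b
#   referent group       c           d

def risk_ratio(a: int, b: int, c: int, d: int) -> float:
    """Injury risk in the high-risk group divided by the risk
    in the referent group."""
    risk_high = a / (a + b)
    risk_ref = c / (c + d)
    return risk_high / risk_ref

# Hypothetical example: 12 of 20 injured in the high-risk group
# versus 10 of 56 injured in the referent group
print(round(risk_ratio(12, 8, 10, 46), 2))  # 3.36
```

A risk ratio above 1 here would mean recruits below the cut-off (the high-risk group) were that many times more likely to sustain a LEOI than recruits in the referent group.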


Table 4. Functional test scores comparing the mean uninjured recruits' lower extremity 'control' (n = 47) and injured lower extremities (n = 37). Standard deviations for each group are given in brackets.

Test                                                  Injured        Uninjured      P-value
SLHT normalized distance hopped (%)                   93 (20)        104 (28)       0.04
SLHT lower extremity asymmetry (cm)                   8.3 (5.5)      7.7 (6.6)      0.31
YEBT normalized anterior reach (%)                    72 (10)        74 (9)         0.58
YEBT anterior lower extremity asymmetry (cm)          3.8 (4)        3.1 (3)        0.80
YEBT normalized posteromedial reach (%)               101 (13.6)     103 (13)       0.51
YEBT posteromedial lower extremity asymmetry (cm)     5.6 (5.2)      4.2 (3.6)      0.17
YEBT normalized posterolateral reach (%)              92.5 (14.4)    97.3 (13)      0.12
YEBT posterolateral lower extremity asymmetry (cm)    5.9 (5.2)      4.8 (3.2)      0.74
YEBT normalized composite reach (%)                   88.6 (11.4)    91.2 (10.4)    0.28
YEBT composite lower extremity asymmetry (cm)         10.6 (8.4)     6.9 (6.1)      0.04
WBLT (cm)                                             12.9 (3.1)     12.1 (2.9)     0.24
WBLT lower extremity asymmetry (cm)                   0.7 (0.9)      1.1 (1)        0.07

Table 5. Test cut-off points and relative risks for statistically significant predictors of LEOI in military recruits

SLHT normalized distance hopped    >86%