Aughey R.J., Victoria University of Melbourne
International Journal of Sports Physiology and Performance | Year: 2011
Global positioning system (GPS) technology was made possible by the invention of the atomic clock. The first suggestion that GPS could be used to assess human physical activity followed some 40 y later. Uptake of GPS technology was rapid, with the early literature concentrating on validation studies and the measurement of steady-state movement. The first attempts to validate GPS for field-sport applications were made in 2006. While GPS has been validated for team-sport applications, some doubt remains about its appropriateness for measuring short, high-velocity movements. Nevertheless, GPS has been applied extensively in Australian football, cricket, hockey, rugby union and league, and soccer. The GPS literature contains extensive information on the activity profiles of field-sport athletes, including the total distance covered by players and the distance covered in velocity bands. Global positioning systems have also been used to detect fatigue in matches, identify the most intense periods of play, and distinguish activity profiles by position, competition level, and sport. More recent research has integrated GPS data with the physical capacity or fitness test scores of athletes, game-specific tasks, or tactical and strategic information. The future of GPS analysis will involve further miniaturization of devices, longer battery life, and integration of other inertial sensor data to more effectively quantify the efforts of athletes. © 2011 Human Kinetics, Inc.
Aughey R.J., Victoria University of Melbourne
International Journal of Sports Physiology and Performance | Year: 2011
Background: Australian football (AF) is a highly intermittent sport, requiring athletes to accelerate hundreds of times per match with repeated bouts of high-intensity running (HIR). Players aim to be in peak physical condition for finals, with anecdotal evidence of the increased speed and pressure of these games. Purpose: No data exist on the running demands of finals games; the aim of this study was therefore to compare the running demands of finals with those of regular-season games with matched players and opponents. Methods: Player movement was recorded by GPS at 5 Hz and expressed per period of the match (rotation) for total distance, high-intensity running (HIR, 4.17-10.00 m·s-1), and maximal accelerations (2.78-10.00 m·s-2). All data were compared between regular-season and finals games, and the magnitude of effects was analyzed with the effect size (ES) statistic and expressed with confidence intervals. Results: Total distance (11%; ES: 0.78 ± 0.30), high-intensity running distance (9%; ES: 0.29 ± 0.25), and the number of maximal accelerations (97%; ES: 1.30 ± 0.20) all increased in finals games. The largest percentage increases in maximal accelerations occurred from commencement velocities of 3-4 m·s-1 (47%; ES: 0.56 ± 0.21) and 4-5 m·s-1 (51%; ES: 0.72 ± 0.26), and with <19 s between accelerations (53%; ES: 0.63 ± 0.27). Conclusion: Elite AF players nearly double the number of maximal accelerations in finals compared with regular-season games. This large increase is superimposed on the requirement to cover a greater total distance and spend more time at high velocity during finals games. Players can be effectively conditioned to cope with these increased demands, even during a long competitive season. © 2011 Human Kinetics, Inc.
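The "ES ± confidence interval" style of reporting used above can be sketched in code. This is a minimal illustration, not the authors' analysis pipeline: the per-rotation distances below are invented, and the pooled-SD Cohen's d plus a normal-approximation 90% confidence half-width are assumed formulas chosen for demonstration.

```python
import math

def cohens_d(a, b):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

def ci_half_width(a, b, z=1.645):
    """Approximate 90% CI half-width for d (normal approximation to its SE)."""
    na, nb = len(a), len(b)
    d = cohens_d(a, b)
    se = math.sqrt((na + nb) / (na * nb) + d ** 2 / (2 * (na + nb)))
    return z * se

# Invented per-rotation total distances (m): finals vs regular season
finals = [3350.0, 3420.0, 3290.0, 3510.0, 3380.0, 3450.0]
regular = [3010.0, 3120.0, 2980.0, 3200.0, 3050.0, 3110.0]
es = cohens_d(finals, regular)
margin = ci_half_width(finals, regular)   # report as "ES: es ± margin"
```

The half-width formula is the standard large-sample approximation to the standard error of d; the published paper may have used a different (e.g. magnitude-based) inferential approach.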
Aughey R.J., Victoria University of Melbourne
Journal of Science and Medicine in Sport | Year: 2013
Objectives: It is not known whether the activity profile of elite Australian football players changes across two levels of competition. The aims of this study were therefore to: (1) classify the activity profile of elite and sub-elite Australian football for players from one elite Australian football club; and (2) compare the activity profile of elite footballers across both elite and sub-elite competitions. Design: Quantitative case-study approach. Methods: Movement was recorded by a 5 Hz global positioning system and expressed relative to game time for total distance, high-velocity running distance (4.17-10.00 m·s-1), and maximal accelerations (2.78-10.00 m·s-2). Differences were expressed as percentages and effect size statistics with confidence intervals. Results: Elite Australian football players covered 8% greater total distance, performed 11% more high-intensity running, and completed 16% more maximal accelerations during matches in 2009 compared with 2008. Players at the sub-elite level had no change in total distance and 9% less high-intensity running, but 23% more maximal accelerations over the same period. In 2008 there was a 5% lower total distance covered by players in the sub-elite competition, no difference in high-intensity running, and 28% fewer maximal accelerations compared with the elite competition. In 2009 the gap in running distance was larger, as sub-elite players covered 15% less total distance and performed 20% less high-intensity running than elite players. Similar to 2008, sub-elite players completed 23% fewer maximal accelerations in 2009. Conclusions: The activity profile of players in the elite competition increased over these two seasons, but not in the sub-elite competition. This has implications for teams whose players must move between competitions during the season. © 2012 Sports Medicine Australia.
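The movement classification used in these GPS studies (distance accumulated within a velocity band from a 5 Hz trace, plus counting of discrete maximal-acceleration efforts) can be sketched as follows. The band limits and acceleration threshold come from the abstracts; deriving acceleration by first-differencing the velocity samples, and the effort-grouping logic, are illustrative assumptions rather than the studies' actual signal processing.

```python
HZ = 5                      # GPS sampling rate (samples per second)
DT = 1.0 / HZ
HIR_BAND = (4.17, 10.00)    # high-velocity running band, m/s
MAX_ACC = 2.78              # maximal-acceleration threshold, m/s^2

def band_distance(velocities, band=HIR_BAND):
    """Distance (m) covered while the instantaneous velocity is inside the band."""
    lo, hi = band
    return sum(v * DT for v in velocities if lo <= v <= hi)

def count_max_accelerations(velocities, threshold=MAX_ACC):
    """Count discrete efforts where the first-difference acceleration
    crosses the threshold; consecutive above-threshold samples are one effort."""
    n, in_effort = 0, False
    for v0, v1 in zip(velocities, velocities[1:]):
        acc = (v1 - v0) / DT
        if acc >= threshold and not in_effort:
            n += 1
            in_effort = True
        elif acc < threshold:
            in_effort = False
    return n
```

In practice, commercial GPS software applies smoothing before differencing; raw first differences of a 5 Hz trace are noisy, which is one reason doubts persist about short high-velocity movements.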
Goudey B., Victoria University of Melbourne
BMC genomics | Year: 2013
It has been hypothesized that multivariate analysis and systematic detection of epistatic interactions between explanatory genotyping variables may help resolve the problem of "missing heritability" currently observed in genome-wide association studies (GWAS). However, even the simplest bivariate analysis is still held back by significant statistical and computational challenges that are often addressed by reducing the set of analysed markers. Theoretically, it has been shown that combinations of loci may exist that show weak or no effects individually, but show significant (even complete) explanatory power over phenotype when combined. Reducing the set of analysed SNPs before bivariate analysis could easily omit such critical loci. We have developed an exhaustive bivariate GWAS analysis methodology that yields a manageable subset of candidate marker pairs for subsequent analysis using other, often more computationally expensive techniques. Our model-free filtering approach is based on classification using ROC curve analysis, an alternative to much slower regression-based modelling techniques. Exhaustive analysis of studies containing approximately 450,000 SNPs and 5,000 samples requires only 2 hours using a desktop CPU or 13 minutes using a GPU (Graphics Processing Unit). We validate our methodology with analysis of simulated datasets as well as the seven Wellcome Trust Case-Control Consortium datasets that represent a wide range of real-life GWAS challenges. We have identified SNP pairs that have considerably stronger association with disease than their individual component SNPs, which often show negligible effect univariately. When compared against previously reported results in the literature, our method re-detects most of the significant SNP pairs and additionally detects many pairs absent from the literature that show strong association with disease. The high overlap suggests that our fast analysis could substitute for some slower alternatives.
We demonstrate that the proposed methodology is robust, fast and capable of exhaustive search for epistatic interactions using a standard desktop computer. First, our implementation is significantly faster than timings for comparable algorithms reported in the literature, especially as our method allows simultaneous use of multiple statistical filters with low computing time overhead. Second, for some diseases, we have identified hundreds of SNP pairs that pass formal multiple test (Bonferroni) correction and could form a rich source of hypotheses for follow-up analysis. A web-based version of the software used for this analysis is available at http://bioinformatics.research.nicta.com.au/gwis.
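The classification-by-ROC filtering idea can be illustrated with a toy sketch. This is not the GWIS implementation, which exhaustively scans all pairs with optimized CPU/GPU kernels; the 9-level joint-genotype coding, the brute-force Mann-Whitney AUC estimator, and the 0.6 cutoff below are all assumptions chosen for demonstration.

```python
def auc(case_scores, control_scores):
    """Mann-Whitney AUC: probability that a random case outranks a random control."""
    wins = ties = 0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

def pair_score(g1, g2):
    """Collapse two SNP genotypes (0/1/2 minor-allele counts) to one 9-level code."""
    return 3 * g1 + g2

def filter_pairs(genotypes, labels, snp_pairs, threshold=0.6):
    """Keep SNP pairs whose joint-genotype AUC clears the threshold.

    genotypes: one tuple of per-SNP genotypes per sample; labels: 1=case, 0=control.
    max(a, 1 - a) treats risk and protective directions symmetrically.
    """
    kept = []
    for i, j in snp_pairs:
        scores = [pair_score(g[i], g[j]) for g in genotypes]
        cases = [s for s, y in zip(scores, labels) if y == 1]
        ctrls = [s for s, y in zip(scores, labels) if y == 0]
        a = auc(cases, ctrls)
        if max(a, 1 - a) >= threshold:
            kept.append((i, j))
    return kept
```

A pair can clear such a filter even when each component SNP alone gives an AUC near 0.5, which is the "weak individually, strong jointly" scenario the abstract describes.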
Roquilly A., Victoria University of Melbourne
Critical Care Medicine | Year: 2014
OBJECTIVE: Trauma induces a state of immunosuppression, which is responsible for the development of nosocomial infections. Hydrocortisone reduces the rate of pneumonia in patients with trauma. Because alterations of dendritic cells and natural killer cells play a central role in trauma-induced immunosuppression, we investigated whether hydrocortisone modulates the dendritic cell/natural killer cell cross talk in the context of posttraumatic pneumonia. DESIGN: Experimental study. SETTINGS: Research laboratory of a university hospital. SUBJECTS: Bagg Albino/cJ mice (weight, 20–24 g). INTERVENTIONS: First, in an a priori substudy of a multicenter, randomized, double-blind, placebo-controlled trial of hydrocortisone (200 mg/d for 7 d) in patients with severe trauma, we measured the blood levels of five cytokines (tumor necrosis factor-α, interleukin-6, interleukin-10, interleukin-12, and interleukin-17) at day 1 and day 8. In a second step, the effects of hydrocortisone on dendritic cell/natural killer cell cross talk were studied in a mouse model of posttraumatic pneumonia. Hydrocortisone (0.6 mg/mouse i.p.) was administered immediately after hemorrhage. Twenty-four hours later, the mice were challenged with Staphylococcus aureus (7 × 10 colony-forming units). MEASUREMENTS AND MAIN RESULTS: Using sera collected during a multicenter study in patients with trauma, we found that hydrocortisone decreased the blood level of interleukin-10, a cytokine centrally involved in the regulation of the dendritic cell/natural killer cell cluster. In a mouse model of trauma-hemorrhage–induced immunosuppression, splenic natural killer cells induced an interleukin-10–dependent elimination of splenic dendritic cells. Hydrocortisone treatment reduced this suppressive function of natural killer cells and increased survival of mice with posthemorrhage pneumonia.
The reduction of the interleukin-10 level in natural killer cells by hydrocortisone was partially dependent on the up-regulation of glucocorticoid-induced tumor necrosis factor receptor ligand (TNFSF18) on dendritic cells. CONCLUSIONS: These data demonstrate that trauma-induced immunosuppression is characterized by an interleukin-10–dependent elimination of dendritic cells by natural killer cells and that hydrocortisone improves outcome by limiting this immunosuppressive feedback loop. © 2014 by the Society of Critical Care Medicine and Lippincott Williams & Wilkins
Hrysomallis C., Victoria University of Melbourne
Sports Medicine | Year: 2011
The relationship between balance ability and sport injury risk has been established in many cases, but the relationship between balance ability and athletic performance is less clear. This review compares the balance ability of athletes from different sports, determines whether there is a difference in the balance ability of athletes at different levels of competition within the same sport, determines the relationship of balance ability with performance measures, and examines the influence of balance training on sport performance or motor skills. Based on the available data from cross-sectional studies, gymnasts tended to have the best balance ability, followed by soccer players, swimmers, active control subjects and then basketball players. Surprisingly, no studies were found that compared the balance ability of rifle shooters with that of other athletes. In some sports, such as rifle shooting, soccer and golf, elite athletes were found to have superior balance ability compared with their less proficient counterparts, but this was not the case for alpine skiing, surfing and judo. Balance ability was shown to be significantly related to rifle shooting accuracy, archery shooting accuracy, ice hockey maximum skating speed and simulated luge start speed, but not to baseball pitching accuracy or snowboarding ranking points. Prospective studies have shown that adding a balance training component to the activities of recreationally active subjects or physical education students results in improvements in vertical jump, agility, shuttle run and downhill slalom skiing. A proposed mechanism for the enhancement of motor skills from balance training is an increase in the rate of force development. There are limited data on the influence of balance training on the motor skills of elite athletes.
When the effectiveness of balance training was compared with that of resistance training, resistance training produced superior performance results for jump height and sprint time. Balance ability was related to competition level for some sports, with the more proficient athletes displaying greater balance ability. There were significant relationships between balance ability and a number of performance measures. Evidence from prospective studies supports the notion that balance training can be a worthwhile adjunct to the usual training of non-elite athletes to enhance certain motor skills, but not in place of other conditioning such as resistance training. More research is required to determine the influence of balance training on the motor skills of elite athletes. © 2011 Adis Data Information BV. All rights reserved.
Liddle B., Victoria University of Melbourne
Environmental Modelling and Software | Year: 2013
This paper analyzes the influence of urban population and affluence (GDP per capita) on environmental impact in developed and developing countries, taking the STIRPAT framework as its starting point. In addition to considering environmental impacts particularly influenced by population and affluence (carbon emissions from transport and residential electricity consumption), the paper determines whether and, if so, how those environmental impact relationships vary across development levels by analyzing panels consisting of poor, middle, and rich countries. The development-based panels approach is an improvement on the GDP per capita polynomial model used in the Environmental Kuznets curve and other literature for several reasons: (i) it allows one to determine whether the elasticity of every variable considered varies with development; (ii) it is arguably a more accurate description of the development process; (iii) it avoids potentially spurious regressions involving nonlinear transformations of nonstationary variables (e.g., GDP per capita squared); and (iv) unlike the polynomial model, it allows for the possibility that elasticities differ significantly across development levels but remain positive, which is precisely the relationship expected for the environmental impacts considered here. Whether the elasticity for affluence was greater than that for population depended on both the choice of dependent variable and the makeup of the panel (all countries, poor, middle, or rich). Furthermore, the estimated elasticities varied nonlinearly with the development process: U-shaped, inverted U-shaped, and monotonic patterns were revealed, again depending on the dependent variable. © 2012 Elsevier Ltd.
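The STIRPAT framework behind this analysis is the log-linear regression ln I = a + b·ln P + c·ln A + e, where b and c are the population and affluence elasticities discussed above. A minimal sketch of recovering such elasticities by ordinary least squares in logs follows; the synthetic data and the plain-Python normal-equations solver are assumptions for illustration, standing in for the paper's panel estimators.

```python
import math

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)beta = X'y,
    solved with Gaussian elimination and partial pivoting."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # forward elimination
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [arj - f * aij for arj, aij in zip(A[r], A[i])]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):            # back substitution
        s = sum(A[i][j] * beta[j] for j in range(i + 1, k))
        beta[i] = (b[i] - s) / A[i][i]
    return beta

# Synthetic "panel": ln I = 0.5 + 1.2*ln(population) + 0.8*ln(affluence)
data = [(p, a) for p in (1.0, 2.0, 5.0, 10.0) for a in (1.0, 3.0, 9.0)]
X = [[1.0, math.log(p), math.log(a)] for p, a in data]
y = [0.5 + 1.2 * math.log(p) + 0.8 * math.log(a) for p, a in data]
const, e_pop, e_aff = ols(X, y)   # recovered intercept and elasticities
```

Splitting the sample into poor, middle, and rich panels and re-running the regression on each is what lets the elasticities b and c differ across development levels without imposing a polynomial in ln A.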
Bishop D., Victoria University of Melbourne
Sports Medicine | Year: 2010
A well designed diet is the foundation upon which optimal training and performance can be developed. However, for as long as competitive sports have existed, athletes have attempted to improve their performance by ingesting a variety of substances. This practice has given rise to a multi-billion-dollar industry that aggressively markets its products as performance enhancing, often without objective, scientific evidence to support such claims. While a number of excellent reviews have evaluated the performance-enhancing effects of most dietary supplements, less attention has been paid to the performance-enhancing claims of dietary supplements in the context of team-sport performance. Dietary supplements that enhance some types of athletic performance may not necessarily enhance team-sport performance (and vice versa). Thus, the first aim of this review is to critically evaluate the ergogenic value of the most common dietary supplements used by team-sport athletes. The term dietary supplement will be used in this review and is defined as any product taken by mouth, in addition to common foods, that has been proposed to have a performance-enhancing effect; this review will only discuss substances that are not currently banned by the World Anti-Doping Agency. Evidence is emerging to support the performance-enhancing claims of some, but not all, dietary supplements that have been proposed to improve team-sport-related performance. For example, there is good evidence that caffeine can improve single-sprint performance, while caffeine, creatine and sodium bicarbonate ingestion have all been demonstrated to improve multiple-sprint performance. The evidence is not so strong for the performance-enhancing benefits of β-alanine or colostrum. Current evidence does not support the ingestion of ribose, branched-chain amino acids or β-hydroxy-β-methylbutyrate, especially in well-trained athletes.
More research on the performance-enhancing effects of the dietary supplements highlighted in this review needs to be conducted using team-sport athletes and using team-sport-relevant testing (e.g. single- and multiple-sprint performance). It should also be considered that there is no guarantee that dietary supplements that improve isolated performance (i.e. single-sprint or jump performance) will remain effective in the context of a team-sport match. Thus, more research is also required to investigate the effects of dietary supplements on simulated or actual team-sport performance. A second aim of this review was to investigate any health issues associated with the ingestion of the more commonly promoted dietary supplements. While most of the supplements described in the review appear safe when using the recommended dose, the effects of higher doses (as often taken by athletes) on indices of health remain unknown, and further research is warranted. Finally, anecdotal reports suggest that team-sport athletes often ingest more than one dietary supplement and very little is known about the potential adverse effects of ingesting multiple supplements. Supplements that have been demonstrated to be safe and efficacious when ingested on their own may have adverse effects when combined with other supplements. More research is required to investigate the effects of ingesting multiple supplements (both on performance and health). © 2010 Adis Data Information BV. All rights reserved.
Hrysomallis C., Victoria University of Melbourne
Sports Medicine | Year: 2013
Along with the enjoyment and other positive benefits of sport participation, there is also a risk of injury, which is elevated in contact sport. This review provides a summary of injury incidence in Australian Rules Football (ARF), identifies injury risk factors, assesses the efficacy of interventions to reduce injury risk and makes recommendations for future research. The most common injuries were found to be muscle strains, particularly of the hamstrings; joint ligament sprains, especially of the ankle; haematomas; and concussion. The most severe joint injury was anterior cruciate ligament rupture. Mouthguards are commonly worn and have been shown to reduce orofacial injury. There is evidence that thigh pads can reduce the incidence of thigh haematomas. There is a reluctance to wear padded headgear, and an attempt to assess its effectiveness was unsuccessful due to low compliance. The most readily identified risk factor was a history of the same injury. There were conflicting findings as to the influence that strength imbalances or deficits have on hamstring injury risk in ARF. Static hamstring flexibility was not related to risk, but low hip flexor/quadriceps flexibility increased hamstring injury risk. High lower-limb and high hamstring stiffness were associated with an elevated risk of hamstring injury. Since stiffness can be modulated through strength or flexibility training, this provides an area for future intervention studies. Low postural balance ability was related to a greater risk of ankle injury in ARF; players with poor balance should be targeted for balance training. There are preliminary data signifying a link between deficiencies in hip range of motion and hip adductor strength and groin pain or injury. This provides support for future investigation into the effectiveness of an intervention for high-risk players on groin injury rate.
Low cross-sectional area of core-region muscle has been associated with more severe injuries and a motor control exercise intervention that increased core muscle size and function resulted in fewer games missed due to injury. A randomized controlled trial of the effectiveness of eccentric hamstring exercise in decreasing hamstring injury rate in ARF players was unsuccessful due to poor compliance from muscle soreness; a progressive eccentric training intervention for ARF should be given future consideration. Jump and landing training reduced injury risk in junior ARF players and it would be advisable to include this component as part of a neuromuscular training intervention. A multifaceted programme of sport-specific drills for hamstring flexibility while fatigued, sport skills that load the hamstrings and high-intensity interval training to mimic match playing conditions showed some success in reducing the incidence of hamstring injuries in ARF. A countermeasure designed to reduce injury risk is more likely to be adopted by coaches and players if it also has the scope to enhance performance. © Springer International Publishing Switzerland 2013.
Haines K.J., Victoria University of Melbourne
Critical Care Medicine | Year: 2015
OBJECTIVE: The objective of this review was to evaluate and synthesize the prevalence, risk factors, and trajectory of psychosocial morbidity in informal caregivers of critical care survivors. DATA SOURCES: A systematic search of MEDLINE, PsycINFO, PubMed, CINAHL, the Cochrane Library, Scopus, PILOTS, EMBASE, and the Physiotherapy Evidence Database was undertaken between January and February 2014. STUDY SELECTION: Citations were screened independently by two reviewers for studies that investigated psychosocial outcomes (depression, anxiety, stress, posttraumatic stress disorder, burden, activity restriction, and health-related quality of life) in informal caregivers of critical care survivors (mechanically ventilated for 48 hr or more). DATA EXTRACTION: Data on study outcomes were extracted into a standardized form, and quality was assessed by two independent reviewers using the Newcastle-Ottawa Scale, the Physiotherapy Evidence Database scale, and the National Health and Medical Research Council Hierarchy of Evidence guide. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. DATA SYNTHESIS: Fourteen studies of 1,491 caregivers were included. Depressive symptoms were the most commonly reported outcome, with a prevalence of 75.5% during critical care and 22.8–29% at 1-year follow-up. Risk factors for depressive symptoms in caregivers included female gender and younger age. The period of greatest risk for all outcomes was during the patient's critical care admission, although psychological symptoms improved over time. The overall quality of the studies was low. CONCLUSIONS: Depressive symptoms were the most prevalent outcome in informal caregivers of intensive care survivors who were ventilated for more than 48 hours and persisted at 1 year with a prevalence of 22.8–29.0%, comparable with that in caregivers of patients with dementia. Screening for caregiver risks could be performed during the ICU admission, where interventions can be implemented and then evaluated.
Further high-quality studies are needed to quantify anxiety, stress, caregiver burden, and posttraumatic stress disorder outcomes in informal caregivers of long-stay patients surviving ICU. Copyright © 2015 by the Society of Critical Care Medicine and Wolters Kluwer Health, Inc. All Rights Reserved.