Siervo M., Vitality |
Faber P., University of Aberdeen |
Lara J., Vitality |
Gibney E.R., Agriculture and Food Science Center |
And 6 more authors.
Metabolism: Clinical and Experimental | Year: 2015
Objectives. Weight loss (WL) is associated with a decrease in total and resting energy expenditure (EE). We aimed to investigate whether (1) diets with different rates and extents of WL produced different changes in total and resting EE and (2) whether they influenced the level of adaptive thermogenesis, defined as the decline in total or resting EE not accounted for by changes in body composition. Methods. Three groups of six obese men participated in either a total fast for 6 days to achieve a 5% WL, a very low-calorie diet (VLCD, 2.5 MJ/day) for 3 weeks, or a low-calorie diet (LCD, 5.2 MJ/day) for 6 weeks to achieve a 10% WL. A four-component model was used to measure body composition. Indirect calorimetry was used to measure resting EE. Total EE was measured by doubly labelled water (VLCD, LCD) and 24-hour whole-body calorimetry (fasting). Results. VLCD and LCD showed a similar degree of metabolic adaptation for total EE (VLCD = -6.2%; LCD = -6.8%). Metabolic adaptation for resting EE was greater in the LCD (-0.4 MJ/day, -5.3%) than in the VLCD (-0.1 MJ/day, -1.4%) group. Resting EE did not decrease after short-term fasting, and no evidence of adaptive thermogenesis (+0.4 MJ/day) was found after 5% WL. The rate of WL was inversely associated with changes in resting EE (n = 30, r = -0.42, p = 0.01). Conclusions. The rate of WL did not appear to influence the decline in total EE in obese men after 10% WL. Approximately 6% of this decline in total EE was explained by mechanisms of adaptive thermogenesis. © 2015 Elsevier Inc. All rights reserved.
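The adaptive-thermogenesis definition used above (the decline in EE not accounted for by body-composition change) can be sketched as the gap between measured post-WL resting EE and the EE predicted from the new body composition. This is a minimal illustration only: the linear model and its coefficients below are hypothetical placeholders, not values fitted in the study.

```python
def predicted_ree(ffm_kg, fm_kg, intercept=1.1, b_ffm=0.093, b_fm=0.014):
    # Hypothetical linear resting-EE model (MJ/day) from fat-free mass (FFM)
    # and fat mass (FM); in practice the coefficients would be fitted to the
    # cohort's baseline measurements.
    return intercept + b_ffm * ffm_kg + b_fm * fm_kg

def metabolic_adaptation(measured_ree, ffm_kg, fm_kg):
    # Adaptive thermogenesis: measured minus predicted resting EE (MJ/day),
    # plus the same gap expressed as a percentage of the predicted value.
    # A negative result means EE fell more than body composition explains.
    pred = predicted_ree(ffm_kg, fm_kg)
    delta = measured_ree - pred
    return delta, 100.0 * delta / pred
```

With this convention, a reported adaptation such as "-0.4 MJ/day, -5.3%" corresponds directly to the `(delta, percent)` pair returned here.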
Osborne A., Health Science Center |
Osborne A., Teagasc |
Blake C., Health Science Center |
McNamara J., Teagasc |
And 3 more authors.
Occupational Medicine | Year: 2010
Background: Farming is an occupation that predisposes individuals to health problems including musculoskeletal disorders (MSDs). There is limited research regarding MSDs among farmers, especially in Ireland. Aims: To establish the prevalence of MSDs, identify the most commonly affected body regions and explore what factors may influence the development of the most common MSDs among farmers in Ireland. Methods: A questionnaire survey of Irish farmers was conducted. The study sample comprised 600 farmers (100 farmers from each of the six main farm enterprise systems in Ireland). Results: Of the 600 farmers, 56% had experienced an MSD in the previous year. The most commonly experienced MSDs were back pain (37%) and neck/shoulder pain (25%). Other MSDs experienced in the previous year included knee pain (9%), hand-wrist-elbow pain (9%), ankle/foot pain (9%) and hip pain (8%). Overall, MSDs were more common in farmers working longer hours (P < 0.05). Back pain was more prevalent in full-time farmers (P < 0.05), while the prevalence of hip pain was greater in farmers who were older (P < 0.01), full time (P < 0.05), farming for longer (P < 0.01) and working longer hours (P < 0.01). Farm enterprise was not a factor influencing the development of MSDs. Conclusions: These findings suggest that the number of hours worked by farmers, rather than enterprise-specific tasks, renders farmers more susceptible to MSDs. Further investigation is needed to explore risk factors in the development of MSDs. © The Author 2010. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved.
Munoz A., Agriculture and Food Science Center |
Munoz A., University of Granada |
Palgan I., Agriculture and Food Science Center |
Noci F., Agriculture and Food Science Center |
And 4 more authors.
Food Research International | Year: 2012
Inactivation of Escherichia coli and Listeria innocua by combinations of High Intensity Light Pulses (HILP), Ultrasound (US) and Pulsed Electric Fields (PEF) with sub-lethal concentrations of nisin (2.5 mg/L) or lactic acid (500 mg/L) was investigated in two different buffer systems (pH 4 for E. coli and pH 7 for L. innocua). Individually, HILP (3.3 J/cm²), US (126 s residence time, 500 W, 40 °C) and PEF (24 kV/cm, 18 Hz, 1 μs pulse width) did not induce a microbial reduction greater than 2.7 or 3.6 log units for L. innocua and E. coli, respectively. Combined treatment using HILP + PEF sufficiently inactivated E. coli without antimicrobial addition. The addition of either antimicrobial enhanced the effect of US + PEF for both E. coli and L. innocua. The addition of lactic acid enhanced the effect of HILP + US. For L. innocua, the addition of nisin enhanced the effect of HILP + PEF. This confirms the potential of selected non-thermal technologies for microbial inactivation when combined with antimicrobials. Industrial relevance: The application of sub-lethal non-thermal processing and GRAS antimicrobial hurdle combinations has the potential to allow for the production of safe, stable products while also maintaining the desired organoleptic characteristics of a minimally processed product. An initial step in assessing the suitability of non-thermal treatments is to evaluate their efficacy in model solutions prior to their study in food systems. © 2012.
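The "log units" of reduction quoted above are log₁₀ ratios of viable counts before and after treatment. A minimal sketch, with illustrative counts only (not data from the study), shows the metric and a naive additive estimate for a hurdle combination:

```python
import math

def log_reduction(n0_cfu, n_cfu):
    # Log10 reduction in viable counts after treatment:
    # e.g. 10^7 -> 10^4 CFU/mL is a 3-log reduction.
    return math.log10(n0_cfu / n_cfu)

def combined_additive(*reductions):
    # Naive hurdle estimate: sum of individual log reductions.
    # This assumes independent effects; real combinations such as
    # HILP + PEF may exceed it (synergy) or fall short of it.
    return sum(reductions)
```

Under this convention, the individual treatments above each achieved at most 2.7 (L. innocua) or 3.6 (E. coli) log units.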
Cummins E., Agriculture and Food Science Center |
Kennedy R., Agriculture and Food Science Center |
Cormican M., National University of Ireland
Science of the Total Environment | Year: 2010
Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used, additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 × 10⁻⁴ per year (as set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used. © 2009 Elsevier B.V. All rights reserved.
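The general shape of such a Monte Carlo risk assessment can be sketched as: sample a raw-water oocyst concentration, apply the treatment train's log removal, convert the resulting dose to a daily infection probability via a dose-response model, and compound to an annual risk. This is a hedged illustration only: the exponential dose-response form, the `r` value, and the exposure distributions below are generic placeholders, not the study's fitted model.

```python
import math
import random

def daily_infection_risk(dose_oocysts, r=0.0042):
    # Exponential dose-response model: P(infection) = 1 - exp(-r * dose).
    # r = 0.0042 is an illustrative value, not the parameter used in the study.
    return 1.0 - math.exp(-r * dose_oocysts)

def annual_risk_mc(n_iter=10_000, mean_raw_conc=0.1, log_removal=3.0,
                   litres_per_day=1.0, seed=42):
    # Monte Carlo over daily exposures: sample a raw-water oocyst
    # concentration (oocysts/L, here exponentially distributed as a
    # placeholder), apply the treatment's log removal, then compound the
    # daily risk to an annual risk: P_year = 1 - (1 - P_day)**365.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_iter):
        conc = rng.expovariate(1.0 / mean_raw_conc)
        dose = conc * 10.0 ** (-log_removal) * litres_per_day
        p_day = daily_infection_risk(dose)
        total += 1.0 - (1.0 - p_day) ** 365
    return total / n_iter  # mean simulated annual risk of infection
```

A result from such a model would then be compared against a benchmark like the 1 × 10⁻⁴ annual risk level cited above; scenario analysis amounts to re-running with different `log_removal` and contamination inputs.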
Buckley M., Agriculture and Food Science Center |
Hunter A., Agriculture and Food Science Center |
Acta Horticulturae | Year: 2012
Sportsturf superintendents are under increasing pressure to provide immaculate turf surfaces capable of withstanding year-round wear and use. This demands frequent and intense nutrient (NPK) applications to promote plant growth and maintain turf quality so that it withstands the imposed agronomic pressures. The use of such materials has the potential to cause significant nutrient leaching and pollution of both surface and ground water sources. An EU directive similar to that applied in the agriculture sector may well be introduced in the near future, aiming to severely restrict the levels of inorganic fertilizers and chemicals applied to land in the coming years. Such a restriction would make the management of these sporting areas more difficult, increasing the challenge of maintaining standards at their current levels. Consequently, experimentation to determine the effects of composted waste is necessary. This project seeks to investigate and control nutrient loss using composted waste materials derived from three different sources, and to assess their potential as full or partial replacements for inorganic nutrients.