Maddison R.,University of Auckland | Foley L.,University of Auckland | Ni Mhurchu C.,University of Auckland | Jiang Y.,University of Auckland | And 4 more authors.
American Journal of Clinical Nutrition | Year: 2011

Background: Sedentary activities such as video gaming are independently associated with obesity. Active video games, in which players physically interact with images on screen, may help increase physical activity and improve body composition. Objective: The aim of this study was to evaluate the effect of active video games over a 6-mo period on weight, body composition, physical activity, and physical fitness. Design: We conducted a 2-arm, parallel, randomized controlled trial in Auckland, New Zealand. A total of 322 overweight and obese children aged 10-14 y, who were current users of sedentary video games, were randomly assigned at a 1:1 ratio to receive either an active video game upgrade package (intervention, n = 160) or to have no change (control group, n = 162). The primary outcome was the change from baseline in body mass index (BMI; in kg/m2). Secondary outcomes were changes in percentage body fat, physical activity, cardiorespiratory fitness, video game play, and food snacking. Results: At 24 wk, the treatment effect on BMI (-0.24; 95% CI: -0.44, -0.05; P = 0.02) favored the intervention group. The change (±SE) in BMI from baseline increased in the control group (0.34 ± 0.08) but remained the same in the intervention group (0.09 ± 0.08). There was also evidence of a reduction in body fat in the intervention group (-0.83%; 95% CI: -1.54%, -0.12%; P = 0.02). The change in daily time spent playing active video games at 24 wk increased (10.03 min; 95% CI: 6.26, 13.81 min; P < 0.0001) with the intervention, accompanied by a reduction in the change in daily time spent playing nonactive video games (-9.39 min; 95% CI: -19.38, 0.59 min; P = 0.06). Conclusion: An active video game intervention has a small but definite effect on BMI and body composition in overweight and obese children. This trial was registered in the Australian New Zealand Clinical Trials Registry at http://www.anzctr.org.au/ as ACTRN12607000632493. © 2011 American Society for Nutrition.
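As a plausibility check, the unadjusted between-group difference in BMI change can be recovered from the per-group changes and SEs quoted above. The published effect (-0.24) is model-adjusted, so this crude normal-approximation sketch (hypothetical helper name) only approximates it:

```python
import math

def diff_in_change(change_a, se_a, change_b, se_b):
    """Unadjusted between-group difference in change-from-baseline,
    with a normal-approximation 95% CI (independent groups)."""
    diff = change_a - change_b
    se = math.sqrt(se_a**2 + se_b**2)
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# BMI changes reported above: intervention 0.09 +/- 0.08, control 0.34 +/- 0.08
diff, ci = diff_in_change(0.09, 0.08, 0.34, 0.08)
print(round(diff, 2), [round(x, 2) for x in ci])
```

The crude estimate (-0.25; 95% CI roughly -0.47 to -0.03) is close to, but not identical with, the adjusted treatment effect reported in the abstract.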


Ni Mhurchu C.,Medical Research Council Human Nutrition Research | Ni Mhurchu C.,University of Auckland | Capelin C.,Kantar Worldpanel | Dunford E.K.,George Institute for International Health | And 3 more authors.
American Journal of Clinical Nutrition | Year: 2011

Background: In the United Kingdom, sodium reduction targets have been set for a large number of processed food categories. Assessment and monitoring are essential to evaluate progress. Objectives: Our aim was to determine whether household consumer panel food-purchasing data could be used to assess the sodium content of processed foods. Our further objectives were to estimate the mean sodium content of UK foods by category and undertake analyses weighted by food-purchasing volumes. Design: Data were obtained for 21,108 British households between October 2008 and September 2009. Purchasing data (product description, product weight, annual purchases) and sodium values (mg/100 g) were collated for all food categories known to be major contributors to sodium intake. Unweighted and weighted mean sodium values were calculated. Results: Data were available for 44,372 food products. The largest contributors to sodium purchases were table salt (23%), processed meat (18%), bread and bakery products (13%), dairy products (12%), and sauces and spreads (11%). More than one-third of sodium purchased (37%) was accounted for by 5 food categories: bacon, bread, milk, cheese, and sauces. For some food groups (bread and bakery, cereals and cereal products, processed meat), purchase-weighted means were 18-35% higher than unweighted means, suggesting that market leaders have higher sodium contents than the category mean. Conclusion: The targeting of sodium reduction in a small number of food categories and focusing on products sold in the highest volumes could lead to large decreases in sodium available for consumption and therefore to gains in public health. © 2011 American Society for Nutrition.
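The purchase-weighted versus unweighted category means described above can be sketched in a few lines. The product figures below are hypothetical, chosen only to show how high-volume market leaders pull the weighted mean upward:

```python
# Hypothetical products in one food category: (sodium mg/100 g, units purchased)
products = [(450, 120_000), (380, 15_000), (610, 90_000), (300, 5_000)]

# Unweighted mean: every product counts equally
unweighted = sum(na for na, _ in products) / len(products)

# Purchase-weighted mean: each product weighted by its sales volume
weighted = (sum(na * vol for na, vol in products)
            / sum(vol for _, vol in products))

print(round(unweighted), round(weighted))
```

Because the two highest-volume products here are also saltier than average, the weighted mean exceeds the unweighted one, mirroring the 18-35% gap reported for bread, cereals, and processed meat.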


White S.L.,George Institute for International Health | White S.L.,University of Sydney | Polkinghorne K.R.,Monash Medical Center | Atkins R.C.,Monash Medical Center | And 3 more authors.
American Journal of Kidney Diseases | Year: 2010

Background: The Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation is more accurate than the Modification of Diet in Renal Disease (MDRD) Study equation. We applied both equations in a cohort representative of the Australian adult population. Study Design: Population-based cohort study. Setting & Participants: 11,247 randomly selected noninstitutionalized Australians aged ≥ 25 years who attended a physical examination during the baseline AusDiab (Australian Diabetes, Obesity and Lifestyle) Study survey. Predictors & Outcomes: Glomerular filtration rate (GFR) was estimated using the MDRD Study and CKD-EPI equations. Kidney damage was defined as urine albumin-creatinine ratio ≥ 2.5 mg/mmol in men and ≥ 3.5 mg/mmol in women or urine protein-creatinine ratio ≥ 0.20 mg/mg. Chronic kidney disease (CKD) was defined as estimated GFR (eGFR) < 60 mL/min/1.73 m2 or kidney damage. Participants were classified into 3 mutually exclusive subgroups: CKD according to both equations; CKD according to the MDRD Study equation, but no CKD according to the CKD-EPI equation; and no CKD according to both equations. All-cause mortality was examined in subgroups with and without CKD. Measurements: Serum creatinine and urinary albumin, protein, and creatinine measured on a random spot morning urine sample. Results: 266 participants identified as having CKD according to the MDRD Study equation were reclassified to no CKD according to the CKD-EPI equation (estimated prevalence, 1.9%; 95% CI, 1.4-2.6). All had an eGFR ≥ 45 mL/min/1.73 m2 using the MDRD Study equation. Reclassified individuals were predominantly women with a favorable cardiovascular risk profile. The proportion of reclassified individuals with a Framingham-predicted 10-year cardiovascular risk ≥ 30% was 7.2%, compared with 7.9% of the group with no CKD according to both equations and 45.3% of individuals retained in stage 3a using both equations.
There was no evidence of increased all-cause mortality in the reclassified group (age- and sex-adjusted hazard ratio vs no CKD, 1.01; 95% CI, 0.62-1.97). Using the MDRD Study equation, the prevalence of CKD in the Australian population aged ≥ 25 years was 13.4% (95% CI, 11.1-16.1). Using the CKD-EPI equation, the prevalence was 11.5% (95% CI, 9.42-14.1). Limitations: Single measurements of serum creatinine and urinary markers. Conclusions: The lower estimated prevalence of CKD using the CKD-EPI equation is caused by reclassification of low-risk individuals. © 2010 National Kidney Foundation, Inc.
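The two equations compared above have standard published forms (the IDMS-traceable 4-variable MDRD Study equation and the 2009 CKD-EPI creatinine equation, serum creatinine in mg/dL); the abstract does not specify the exact versions used, so the sketch below assumes these. It shows how a low-risk individual can be reclassified:

```python
def egfr_mdrd(scr_mg_dl, age, female, black=False):
    """IDMS-traceable 4-variable MDRD Study equation (mL/min/1.73 m2)."""
    egfr = 175 * scr_mg_dl**-1.154 * age**-0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def egfr_ckd_epi(scr_mg_dl, age, female, black=False):
    """2009 CKD-EPI creatinine equation (mL/min/1.73 m2)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141 * min(ratio, 1) ** alpha * max(ratio, 1) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# A 60-year-old woman with serum creatinine 1.0 mg/dL is reclassified:
mdrd = egfr_mdrd(1.0, 60, female=True)    # below 60 -> stage 3a by MDRD
epi = egfr_ckd_epi(1.0, 60, female=True)  # 60 or above -> no CKD (absent kidney damage)
```

This example (eGFR of roughly 57 by MDRD vs. roughly 61 by CKD-EPI) matches the pattern in the abstract: the reclassified group is dominated by women with MDRD eGFRs in the 45-59 range.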


Gallagher M.P.,George Institute for International Health | Kelly P.J.,University of Sydney | Jardine M.,George Institute for International Health | Perkovic V.,George Institute for International Health | And 4 more authors.
Journal of the American Society of Nephrology | Year: 2010

Cancer is a widely recognized complication of transplantation, and the effects of various immunosuppressive drugs on cancer risk remain controversial. This randomized trial allocated 489 recipients of first cadaveric renal transplants to one of three groups: azathioprine and prednisolone, cyclosporine monotherapy, or cyclosporine monotherapy followed by a switch to azathioprine and prednisolone after 3 months. Here, we report cancer outcomes by non-skin cancer (including melanoma) and skin cancer (excluding melanoma) for 481 patients during a median follow-up of 20.6 years. A total of 226 patients developed at least one cancer: 95 with non-skin cancer and 171 with skin cancer. In the intention-to-treat analysis, mean times to first non-skin cancer (16.0, 15.3, and 15.7 years for groups 1 through 3, respectively) and first skin cancer (13.6, 14.3, and 15.2 years, respectively) were not different among the three groups or within any subgroup. In multivariate analyses, non-skin cancer was associated with increasing age and previous smoking history, whereas skin cancer was associated with increasing age, nonbrown eye color, fairer skin, and a functioning transplant. Treatment allocation was not associated with development of either form of cancer in multivariate analyses. In conclusion, these immunosuppressive regimens, widely used in recent decades, carry similar risks for carcinogenicity after kidney transplantation. Copyright © 2010 by the American Society of Nephrology.


Palmer S.C.,Brigham and Women's Hospital | Navaneethan S.D.,Cleveland Clinic | Craig J.C.,Childrens Hospital at Westmead | Johnson D.W.,Princess Alexandra Hospital | And 11 more authors.
Annals of Internal Medicine | Year: 2010

Background: Previous meta-analyses suggest that treatment with erythropoiesis-stimulating agents (ESAs) in chronic kidney disease (CKD) increases the risk for death. Additional randomized trials have been recently completed. Purpose: To summarize the effects of ESA treatment on clinical outcomes in patients with anemia and CKD. Data Sources: MEDLINE (January 1966 to November 2009), EMBASE (January 1980 to November 2009), and the Cochrane database (to March 2010) were searched without language restriction. Study Selection: Two authors independently screened reports to identify randomized trials evaluating ESA treatment in people with CKD. Hemoglobin target trials or trials of ESA versus no treatment or placebo were included. Data Extraction: Two authors independently extracted data on patient characteristics, study risks for bias, and the effects of ESA therapy. Data Synthesis: 27 trials (10 452 patients) were identified. A higher hemoglobin target was associated with increased risks for stroke (relative risk [RR], 1.51 [95% CI, 1.03 to 2.21]), hypertension (RR, 1.67 [CI, 1.31 to 2.12]), and vascular access thrombosis (RR, 1.33 [CI, 1.16 to 1.53]) compared with a lower hemoglobin target. No statistically significant differences in the risks for mortality (RR, 1.09 [CI, 0.99 to 1.20]), serious cardiovascular events (RR, 1.15 [CI, 0.98 to 1.33]), or end-stage kidney disease (RR, 1.08 [CI, 0.97 to 1.20]) were observed, although point estimates favored a lower hemoglobin target. Treatment effects were consistent across subgroups, including all stages of CKD. Limitations: The evidence for effects on quality of life was limited by selective reporting. Trials also reported insufficient information to allow analysis of the independent effects of ESA dose on clinical outcomes. 
Conclusion: Targeting higher hemoglobin levels in CKD increases risks for stroke, hypertension, and vascular access thrombosis and probably increases risks for death, serious cardiovascular events, and end-stage renal disease. The mechanisms for harm remain unclear, and meta-analysis of individual-patient data and trials on fixed ESA doses are recommended to elucidate these mechanisms. Primary Funding Source: None. © 2010 American College of Physicians.
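Pooling relative risks across trials, as in the review above, is typically done on the log scale with a random-effects model. Below is a minimal DerSimonian-Laird sketch using hypothetical per-trial RRs, not the review's data:

```python
import math

def pool_random_effects(rrs, cis):
    """DerSimonian-Laird random-effects pooling of relative risks.
    `cis` are (lower, upper) 95% CIs; SEs are recovered on the log scale."""
    y = [math.log(rr) for rr in rrs]
    se = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in cis]
    w = [1 / s**2 for s in se]
    # Between-study variance (tau^2) via the DL moment estimator
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    # Re-weight each study by total (within + between) variance
    w_star = [1 / (s**2 + tau2) for s in se]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return (math.exp(pooled),
            (math.exp(pooled - 1.96 * se_pooled),
             math.exp(pooled + 1.96 * se_pooled)))

# Hypothetical per-trial stroke RRs (higher vs lower hemoglobin target)
rr, ci = pool_random_effects([1.3, 1.8, 1.5],
                             [(0.8, 2.1), (1.0, 3.2), (0.9, 2.5)])
```

When between-study heterogeneity is low (Q below its degrees of freedom), tau² is truncated at zero and the random-effects estimate coincides with the fixed-effect one.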


Domanski M.J.,Mount Sinai Cardiovascular Institute | Mahaffey K.,Duke University | Hasselblad V.,Duke University | Brener S.J.,New York Methodist Hospital | And 11 more authors.
JAMA - Journal of the American Medical Association | Year: 2011

Context: Several small studies have suggested that cardiac enzyme elevation in the 24 hours following coronary artery bypass graft (CABG) surgery is associated with worse prognosis, but a definitive study is not available. Also, the long-term prognostic impact of small increases in perioperative enzyme levels has not been reported. Objective: To quantify the relationship between peak post-CABG elevation of biomarkers of myocardial damage and early, intermediate-, and long-term mortality, including determining whether there is a threshold below which elevations lack prognostic significance. Data Sources: Studies (randomized clinical trials or registries) of patients undergoing CABG surgery in which postprocedural biomarker and mortality data had been collected were included. A search of the PubMed database was performed in July 2008 using the search terms coronary artery bypass, troponin, CK-MB, and mortality. Study Selection: Studies evaluating mortality and creatine kinase (CK-MB), troponin, or both were included. One study investigator declined to participate and 3 had insufficient data. Data Extraction: Two independent reviewers determined study eligibility. The principal investigator from each eligible study was contacted to request his/her participation. Once institutional review board approval for the use of these data for this purpose was obtained, we requested patient-level data from each source. Data were examined to ensure that cardiac markers had been measured within 24 hours after CABG surgery and that key baseline covariates and mortality data were available. Results: A total of 18 908 patients from 7 studies were included. Follow-up varied from 3 months to 5 years. Mortality was found to be a monotonically increasing function of the CK-MB ratio.
The 30-day mortality rates by categories of CK-MB ratio were 0.63% (95% confidence interval [CI], 0.36%-1.02%) for 0 to <1, 0.86% (95% CI, 0.49%-1.40%) for 1 to <2, 0.95% (95% CI, 0.72%-1.22%) for 2 to <5, 2.09% (95% CI, 1.69%-2.57%) for 5 to <10, 2.78% (95% CI, 2.12%-3.58%) for 10 to <20, and 7.06% (95% CI, 5.46%-8.96%) for ≥20. Of the variables considered, the CK-MB ratio was the strongest independent predictor of death to 30 days and remained significant even after adjusting for a wide range of baseline risk factors (χ2=143, P<.001; hazard ratio [HR] for each 5-point increment above the upper limit of normal [ULN]=1.12; 95% CI, 1.10-1.14). This result was strongest at 30 days, but the adjusted association persisted from 30 days to 1 year (χ2=24; P<.001; HR for each 5-point increment above ULN=1.17; 95% CI, 1.10-1.24) and a trend was present from 1 year to 5 years (χ2=2.8; P=.10; HR for each 5-point increment above ULN=1.05; 95% CI, 0.99-1.11). Similar analyses using troponin as the marker of necrosis led to the same conclusions (χ2=142 for 0-30 days and χ2=40 for 30 days to 6 months, both P<.001; HR for each 50 points above the ULN=1.28; 95% CI, 1.23-1.33 and 1.15; 95% CI, 1.10-1.21, respectively). Conclusions: Among patients who had undergone CABG surgery, elevation of CK-MB or troponin levels within the first 24 hours was independently associated with increased intermediate- and long-term risk of mortality. ©2011 American Medical Association. All rights reserved.
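Under the proportional-hazards model behind them, per-increment hazard ratios like the 1.12 per 5 points quoted above compound multiplicatively for larger elevations. A small sketch (hypothetical function name) makes the interpretation concrete:

```python
def hr_for_elevation(points_above_uln, hr_per_increment=1.12, increment=5.0):
    """Implied hazard ratio for a given CK-MB-ratio elevation, assuming
    the log-linear (proportional hazards) model behind a per-increment HR.
    Defaults use the 30-day HR of 1.12 per 5 points above the ULN."""
    return hr_per_increment ** (points_above_uln / increment)

# e.g. a 15-point elevation implies 1.12 ** 3, roughly a 40% higher hazard
hr_15 = hr_for_elevation(15)
```

The same compounding logic applies to the troponin result (1.28 per 50 points above the ULN).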


Ferreira M.L.,University of Sydney | Machado G.,Federal University of Minas Gerais | Latimer J.,George Institute for International Health | Maher C.,George Institute for International Health | And 2 more authors.
European Journal of Pain | Year: 2010

Little is known about factors determining health care-seeking behavior in low back pain. While a number of studies have described general characteristics of health care utilization, only a few have aimed at appropriately assessing determinants of care-seeking in back pain by comparing seekers and non-seekers. The objective of this systematic review was to identify determinants of health care-seeking in studies with well-defined groups of care-seekers and non-seekers with non-specific low back pain. A search was conducted in Medline, AMED, Cinahl, Web of Science, PsycINFO, National Research Register, Cochrane Library and LILACS looking for population-based surveys of non-specific low back pain patients older than 18 years, published since 1966. To be included in the review, studies needed to report on characteristics of well-defined groups of care-seekers and non-seekers. Methodological quality was assessed using a criteria list based on sampling, response rate, data reproducibility, power calculation and external validity. Risk estimates were expressed as odds ratios (with 95% confidence intervals). When possible, meta-analyses were performed using a random effects model. Eleven studies were included in the review. Pooled results show that women are slightly more likely to seek care for their back pain, as are patients with a previous history of back pain. Pain intensity was only slightly associated with care-seeking, whereas patients with high levels of disability were nearly eight times more likely to seek care than patients with lower levels of disability. © 2009 European Federation of International Association for the Study of Pain Chapters. Published by Elsevier Ltd. All rights reserved.


Wand B.M.,The University of Notre Dame Australia | Chiffelle L.A.,Progress Physiotherapy Services | O'Connell N.E.,Brunel University | McAuley J.H.,George Institute for International Health | Desouza L.H.,Brunel University
European Spine Journal | Year: 2010

For an individual, the functional consequences of an episode of low back pain are a key measure of their clinical status. Self-reported disability measures are commonly used to capture this component of the back pain experience. In non-acute low back pain there is some uncertainty about the validity of this approach: self-reported assessment of disability and direct measurements of functional status appear to be only moderately related. In this cross-sectional study, we investigated this relationship in a sample of 94 acute low back pain patients. Both self-reported disability and a performance-based assessment of disability were assessed, along with extensive profiling of patient characteristics. Scale consistency of the performance-based assessment was investigated using Cronbach's alpha, and the relationship between self-reported and performance-based assessment of disability was investigated using Pearson's correlation. The relationship between the clinical profile and each of the disability measures was examined using Pearson's correlations and multivariate linear regression. Our results demonstrate that the battery of tests used is internally consistent (Cronbach's alpha = 0.86). We found only moderate correlations between the two disability measures (r = 0.471, p < 0.001). Self-reported disability was significantly correlated with symptom distribution, medication use, physical well-being, pain intensity, depression, somatic distress and anxiety. The only significant correlations with the performance-based measure were symptom distribution, physical well-being and pain intensity. In the multivariate analyses no psychological measure made a significant unique contribution to the prediction of the performance-based measure, whereas depression made a unique contribution to the prediction of the self-reported measure. Our results suggest that self-reported and performance-based assessments of disability are influenced by different patient characteristics.
In particular, it appears self-reported measures of disability are more influenced by the patient's psychological status than performance-based measures of disability. © 2009 Springer-Verlag.


Martiniuk A.L.C.,George Institute for International Health
International Journal of Health Services | Year: 2012

The weak health system in Honduras contributes to poor health indicators. To improve population health, a number of volunteer medical brigades from developed countries provide health services in Honduras. To date, there is little information on the brigades' activities and impact. The primary objective of this article is to increase understanding of the type of health care provided by voluntary medical brigades by evaluating and presenting data on patients' presenting symptoms, diagnoses, and care outcomes. The article focuses on an ongoing medical brigade organized by Canadian health professionals in conjunction with Honduras' largest national non-governmental organization. This is a descriptive study of data that are routinely collected by volunteer Canadian health care professionals. Data on all patients presenting to temporary primary health care facilities across Honduras between 2006 and 2009 were analyzed. The data were used to analyze patient demographics, presenting symptoms, diagnoses, and treatments. We found that the brigades provide additional human resources to the relatively weak Honduran health care system. However, while brigades may increase solidarity between Hondurans and Canadians, concerns persist regarding cost-effectiveness and continuity of care for conditions treated by short-term brigade volunteers. Greater scrutiny is needed to increase brigades' effectiveness and ensure they are supportive of domestic health systems. © 2012, Baywood Publishing Co., Inc.


Webster J.L.,George Institute for International Health | Dunford E.K.,George Institute for International Health | Neal B.C.,George Institute for International Health
American Journal of Clinical Nutrition | Year: 2010

Background: Processed foods are major contributors to population dietary salt intake. Parts of the Australian food industry have started to decrease salt in a number of products. A definitive baseline assessment of current sodium concentrations in foods is key to targeting reformulation strategies and monitoring progress. Objectives: Our objectives were to systematically collate data on the sodium content of Australian processed food products and compare sodium values against maximum target levels established by the UK Food Standards Agency (UK FSA). Design: Categories of processed foods that contribute the majority of salt to Australian diets were identified. Food-composition data were sought for all products in these categories, and the sodium content in mg/100 g (or mg/100 mL for liquids) was recorded for each. Mean sodium values were calculated for each grouping and compared with the UK FSA benchmarks. Results: Sodium data were collected for 7221 products in 10 food groups, 33 food categories, and 90 food subcategories. The food groups that were highest in sodium were sauces and spreads (1283 mg/100 g) and processed meats (846 mg/100 g). Cereal and cereal products (206 mg/100 g) and fruit and vegetables (211 mg/100 g) were the lowest in sodium. Sixty-three percent of food categories had mean sodium concentrations above the UK FSA targets, and most showed wide ranges between the most and least salty products. Conclusions: Many products, particularly breads, processed meats, and sauces, have salt amounts above reasonable benchmarks. The variation in salt concentrations between comparable products suggests that reformulation is highly feasible for many foods. © 2010 American Society for Nutrition.
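Comparing category means against per-category maximum targets, as done above with the UK FSA benchmarks, reduces to a simple lookup. The benchmark values below are illustrative placeholders, not the actual UK FSA targets; the category means are those quoted in the abstract:

```python
# Mean sodium by food group (mg/100 g), as reported in the abstract
category_means = {"sauces and spreads": 1283, "processed meats": 846,
                  "cereal products": 206, "fruit and vegetables": 211}

# Hypothetical benchmark values standing in for the per-category UK FSA targets
benchmarks = {"sauces and spreads": 1000, "processed meats": 800,
              "cereal products": 250, "fruit and vegetables": 250}

# Flag any category whose mean exceeds its benchmark
over_target = {cat: mean for cat, mean in category_means.items()
               if mean > benchmarks[cat]}
print(sorted(over_target))
```

In practice the real targets are set per category (and sometimes per subcategory), so monitoring also tracks the spread between the most and least salty products within each category, not just the mean.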
