Cohen S.M., Duke University |
Kim J., Biostatistics Center |
Asche C., University of Utah |
Courey M., University of California at San Francisco
Laryngoscope | Year: 2012
Objectives/Hypothesis: To determine the prevalence and common causes of dysphonia as diagnosed by primary care physicians (PCPs) and otolaryngologists and to evaluate differences in etiologies offered by these providers. Study Design: Retrospective analysis of data from a large, nationally representative administrative U.S. claims database. Methods: Patients were identified as dysphonic based on International Classification of Diseases, Ninth Revision, Clinical Modification codes from January 1, 2004, to December 31, 2008. Data regarding age, sex, geographic location, and type of physician providing the dysphonia diagnosis were collected. Overall and age-related prevalence rates, as well as frequency of specific etiologies by provider type, were calculated. Results: Of the almost 55 million individuals in the database, 536,943 patients (ages 0 to >65 years) were given a dysphonia diagnosis (point prevalence rate of 0.98%). The prevalence rate was higher among females as compared to males (1.2% vs. 0.7%) and among those >70 years of age (2.5%). The most frequent diagnoses overall were acute laryngitis, nonspecific dysphonia, benign vocal fold lesions, and chronic laryngitis. PCPs more commonly diagnosed acute laryngitis, whereas otolaryngologists more commonly diagnosed nonspecific dysphonia and laryngeal pathology. Gastroesophageal reflux was more commonly diagnosed as a comorbid condition by otolaryngologists than by PCPs. Overall laryngeal cancer prevalence in this treatment-seeking population was 2.2% and was greatest among males >70 years of age. Conclusions: This analysis of insurance claims data from a nationally representative database represents the largest study of its kind. Important differences in dysphonia prevalence related to age, sex, diagnosis, and physician type were identified. © 2011 The American Laryngological, Rhinological and Otological Society, Inc.
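The point prevalence reported above is a simple proportion, and the abstract's figures can be checked directly. A minimal sketch (variable names are illustrative; the denominator is approximated from the abstract's "almost 55 million"):

```python
# Point prevalence = cases / population at a point in time.
cases = 536_943           # patients given a dysphonia diagnosis
population = 55_000_000   # approximate database size ("almost 55 million")

point_prevalence = cases / population
print(f"{point_prevalence:.2%}")  # prints 0.98%, matching the abstract
```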
Triant V.A., Massachusetts General Hospital |
Lee H., Biostatistics Center |
Sax P.E., Brigham and Women's Hospital |
Grinspoon S.K., Harvard University
Journal of Acquired Immune Deficiency Syndromes | Year: 2010
Background: The effects of immunologic and virologic factors on acute myocardial infarction (AMI) rates in patients with HIV are unclear. Methods: HIV-infected patients in a US healthcare system were assessed for AMI. Results: Of 6517 patients with HIV, 273 (4.2%) had an AMI. In a model adjusting for cardiovascular risk factors, antiretroviral medications, and HIV parameters, CD4 count less than 200 cells/mm³ (odds ratio, 1.74; 95% confidence interval, 1.07 to 2.81; P = 0.02) predicted AMI. Increased HIV viral load was associated with AMI accounting for cardiovascular disease risk factors and antiretroviral medications but was not significant when CD4 count was considered. Conclusions: Immunologic control appears to be the most important HIV-related factor associated with AMI. © 2010 by Lippincott Williams & Wilkins.
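The odds ratio, confidence interval, and p-value reported above are internally consistent, which can be verified with the standard normal (Wald) approximation on the log-odds scale. This is a sketch of that approximation, not the study's actual regression output:

```python
import math

def p_from_or_ci(odds_ratio, ci_lo, ci_hi):
    """Recover the two-sided Wald p-value from an odds ratio and its 95% CI,
    assuming a symmetric normal interval on the log-odds scale."""
    se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)  # SE of log(OR)
    z = math.log(odds_ratio) / se
    # Normal CDF via the error function: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

p = p_from_or_ci(1.74, 1.07, 2.81)
print(round(p, 2))  # ≈ 0.02, consistent with the reported P = 0.02
```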
DesRoches C.M., Massachusetts General Hospital |
Campbell E.G., Institute for Health Policy |
Vogeli C., Institute for Health Policy |
Zheng J., Harvard University |
And 6 more authors.
Health Affairs | Year: 2010
Understanding whether electronic health records, as currently adopted, improve quality and efficiency has important implications for how best to employ the estimated $20 billion in health information technology incentives authorized by the American Recovery and Reinvestment Act of 2009. We examined electronic health record adoption in U.S. hospitals and the relationship to quality and efficiency. Across a large number of metrics examined, the relationships were modest at best and generally lacked statistical or clinical significance. However, the presence of clinical decision support was associated with small quality gains. Our findings suggest that to drive substantial gains in quality and efficiency, simply adopting electronic health records is likely to be insufficient. Instead, policies are needed that encourage the use of electronic health records in ways that will lead to improvements in care. © 2010 Project HOPE - The People-to-People Health Foundation, Inc.
Britos M., University of Maryland Baltimore County |
Smoot E., Biostatistics Center |
Liu K.D., University of California at San Francisco |
Thompson B.T., Massachusetts General Hospital |
And 2 more authors.
Critical Care Medicine | Year: 2011
Objectives: The criteria that define acute lung injury and the acute respiratory distress syndrome include PaO2/FIO2 but not positive end-expiratory pressure or FIO2. PaO2/FIO2 ratios of some patients increase substantially after mechanical ventilation with positive end-expiratory pressure of 5-10 cm H2O, and the mortality of these patients may be lower than those whose PaO2/FIO2 ratios remain <200. Also, PaO2/FIO2 may increase when FIO2 is raised from moderate to high levels, suggesting that patients with similar PaO2/FIO2 ratios but different FIO2 levels have different risks of mortality. The primary purpose of this study was to assess the value of adding baseline positive end-expiratory pressure and FIO2 to PaO2/FIO2 for predicting mortality of acute lung injury/acute respiratory distress syndrome patients enrolled in Acute Respiratory Distress Syndrome Network clinical trials. We also assessed effects of two study interventions on clinical outcomes in subsets of patients with mild and severe hypoxemia as defined by PaO2/FIO2. Design: Analysis of baseline physiologic data and outcomes of patients previously enrolled in clinical trials conducted by the National Institutes of Health Acute Respiratory Distress Syndrome Network. Setting: Intensive care units of 40 hospitals in North America. Patients: Two thousand three hundred and twelve patients with acute lung injury/acute respiratory distress syndrome. Interventions: None. Measurements and Main Results: Only 1.3% of patients enrolled in Acute Respiratory Distress Syndrome Network trials had baseline positive end-expiratory pressure <5 cm H2O, and 50% had baseline positive end-expiratory pressure ≥10 cm H2O. Baseline PaO2/FIO2 predicted mortality, but after controlling for PaO2/FIO2, baseline positive end-expiratory pressure did not predict mortality. In contrast, after controlling for baseline PaO2/FIO2, baseline FIO2 did predict mortality. Effects of two study interventions (lower tidal volumes and fluid-conservative hemodynamic management) were similar in mild and severe hypoxemia subsets as defined by PaO2/FIO2 ratios. Conclusion: At Acute Respiratory Distress Syndrome Network hospitals, the addition of baseline positive end-expiratory pressure would not have increased the value of PaO2/FIO2 for predicting mortality of acute lung injury/acute respiratory distress syndrome patients. In contrast, the addition of baseline FIO2 to PaO2/FIO2 could be used to identify subsets of patients with low or high mortality. © 2011 by the Society of Critical Care Medicine and Lippincott Williams & Wilkins.
Wills A.-M., Neurological Clinical Research Institute |
Wills A.-M., Harvard University |
Hubbard J., Massachusetts General Hospital |
Macklin E.A., Biostatistics Center |
And 18 more authors.
The Lancet | Year: 2014
Background: Amyotrophic lateral sclerosis is a fatal neurodegenerative disease with few therapeutic options. Mild obesity is associated with greater survival in patients with the disease, and calorie-dense diets increased survival in a mouse model. We aimed to assess the safety and tolerability of two hypercaloric diets in patients with amyotrophic lateral sclerosis receiving enteral nutrition. Methods: In this double-blind, placebo-controlled, randomised phase 2 clinical trial, we enrolled adults with amyotrophic lateral sclerosis from participating centres in the USA. Eligible participants were aged 18 years or older with no history of diabetes or liver or cardiovascular disease, and who were already receiving percutaneous enteral nutrition. We randomly assigned participants (1:1:1) using a computer-generated list of random numbers to one of three dietary interventions: replacement calories using an isocaloric tube-fed diet (control), a high-carbohydrate hypercaloric tube-fed diet (HC/HC), or a high-fat hypercaloric tube-fed diet (HF/HC). Participants received the intervention diets for 4 months and were followed up for 5 months. The primary outcomes were safety and tolerability, analysed in all patients who began their study diet. This trial is registered with ClinicalTrials.gov, number NCT00983983. Findings: Between Dec 14, 2009, and Nov 2, 2012, we enrolled 24 participants, of whom 20 started their study diet (six in the control group, eight in the HC/HC group, and six in the HF/HC group). One patient in the control group, one in the HC/HC group, and two in the HF/HC group withdrew consent before receiving the intervention. 
Participants who received the HC/HC diet had a smaller total number of adverse events than did those in the other groups (23 in the HC/HC group vs 42 in the control group vs 48 in the HF/HC group; overall, p=0·06; HC/HC vs control, p=0·06) and significantly fewer serious adverse events than did those on the control diet (none vs nine; p=0·0005). Fewer patients in the HC/HC group discontinued their study diet due to adverse events (none [0%] of eight in the HC/HC group vs three [50%] of six in the control group). During the 5 month follow-up, no deaths occurred in the nine patients assigned to the HC/HC diet compared with three deaths (43%) in the seven patients assigned to the control diet (log-rank p=0·03). Adverse events, tolerability, deaths, and disease progression did not differ significantly between the HF/HC group and the control group. Interpretation: Our results provide preliminary evidence that hypercaloric enteral nutrition is safe and tolerable in patients with amyotrophic lateral sclerosis, and support the study of nutritional interventions in larger randomised controlled trials at earlier stages of the disease. Funding: Muscular Dystrophy Association, National Center for Research Resources, National Institutes of Health, and Harvard NeuroDiscovery Center.
Martin K.S., University of Saint Joseph at West Hartford |
Wu R., Biostatistics Center |
Wolff M., Ethel Donaghue Center for Translating Research into Practice and Policy |
Colantonio A.G., University of Connecticut Health Center |
Grady J., Biostatistics Center
American Journal of Preventive Medicine | Year: 2013
Background: The number of food pantries in the U.S. has grown dramatically over 3 decades, yet food insecurity remains a persistent public health problem. Purpose: The goal of the study was to examine the impact of a food pantry intervention called Freshplace, designed to promote food security. Design: Randomized parallel-group study with equal randomization. Setting/Participants: Data were collected from June 2010 to June 2012; a total of 228 adults were recruited over 1 year from traditional food pantries and randomized to the Freshplace intervention (n=113) or control group (n=115), with quarterly follow-ups for 12 months. Intervention: The Freshplace intervention included a client-choice pantry, monthly meetings with a project manager to receive motivational interviewing, and targeted referrals to community services. Control group participants went to traditional food pantries where they received bags of food. Main Outcome Measures: Data analyses were conducted from July 2012 to January 2013. Outcomes were food security, self-sufficiency, and fruit and vegetable consumption. Multivariate regression models were used to predict the three outcomes, controlling for gender, age, household size, income, and presence of children in the household. Results: At baseline, half of the sample experienced very low food security. Over 1 year, Freshplace members were less than half as likely to experience very low food security, increased self-sufficiency by 4.1 points, and increased fruits and vegetables by one serving per day compared to the control group, all outcomes p<0.01. Conclusions: Freshplace may serve as a model for other food pantries to promote food security rather than short-term assistance by addressing the underlying causes of poverty. © 2013 American Journal of Preventive Medicine.
Kolmakova A., Johns Hopkins University |
Wang J., Biostatistics Center |
Brogan R., Loyola University Maryland |
Chaffin C., University of Maryland Baltimore County |
Rodriguez A., Johns Hopkins University
Endocrinology | Year: 2010
Our goal was to examine the effect of deficiency of the lipoprotein receptor, scavenger receptor class B type I (SR-BI), on progesterone secretion in human granulosa cells (HGL5). Scrambled or SR-BI small interfering RNA [knockdown (KD)] cells were exposed to dimethylsulfoxide [DMSO, vehicle for forskolin (Fo)], Fo, serum, high-density lipoprotein, low-density lipoprotein (LDL), or Fo plus lipoproteins or serum for 24 h. Progesterone secretion was lower in all of the SR-BI KD cells regardless of treatment. We examined progesterone secretion in SR-BI KD, LDL receptor KD, and double KD cells incubated with DMSO, Fo, LDL, or Fo + LDL for 6-24 h. As compared with scrambled cells, progesterone secretion was lower in SR-BI and double KD cells regardless of treatment; whereas progesterone secretion was only lower in LDL receptor KD cells incubated with LDL and Fo + LDL. We measured phosphorylation of hormone-sensitive lipase (pHSL) expression, intracellular total cholesterol (TC) mass, and progesterone secretion in scrambled and SR-BI KD cells incubated with DMSO or Fo for 2-24 h. The expression of pHSL was similar between the cells and conditions. The mean change in TC mass and progesterone secretion was lower in SR-BI KD cells exposed to DMSO and Fo. Incubating SR-BI KD cells with 22-hydroxy cholesterol did not overcome the reduction in progesterone secretion. At different time points, RNA expression of steroidogenic acute regulatory protein, side-chain cleavage, and 3β-hydroxysteroid dehydrogenase was significantly lower in SR-BI KD cells incubated with Fo. In conclusion, SR-BI protein deficiency, in part, might explain progesterone deficiency in some infertile women. Copyright © 2010 by The Endocrine Society.
Knowler W.C., U.S. National Institute of Diabetes and Digestive and Kidney Diseases |
Edelstein S.L., Biostatistics Center |
Goldberg R.B., University of Miami |
Ackermann R.T., Northwestern University |
And 8 more authors.
Diabetes Care | Year: 2015
OBJECTIVE: Glycated hemoglobin (HbA1c), a standard measure of chronic glycemia for managing diabetes, has been proposed to diagnose diabetes and identify people at risk. The Diabetes Prevention Program (DPP) was a 3.2-year randomized clinical trial of preventing type 2 diabetes with a 10-year follow-up study, the DPP Outcomes Study (DPPOS). We evaluated baseline HbA1c as a predictor of diabetes and determined the effects of treatments on diabetes defined by an HbA1c ≥6.5% (48 mmol/mol). RESEARCH DESIGN AND METHODS: We randomized 3,234 nondiabetic adults at high risk of diabetes to placebo, metformin, or intensive lifestyle intervention and followed them for the development of diabetes as diagnosed by fasting plasma glucose (FPG) and 2-h postload glucose (2hPG) concentrations (1997 American Diabetes Association [ADA] criteria). HbA1c was measured but not used for study eligibility or outcomes. We now evaluate treatment effects in the 2,765 participants who did not have diabetes at baseline according to FPG, 2hPG, or HbA1c (2010 ADA criteria). RESULTS: Baseline HbA1c predicted incident diabetes in all treatment groups. Diabetes incidence defined by HbA1c ≥6.5% was reduced by 44% by metformin and 49% by lifestyle during the DPP and by 38% by metformin and 29% by lifestyle throughout follow-up. Unlike the primary DPP and DPPOS findings based on glucose criteria, metformin and lifestyle were similarly effective in preventing diabetes defined by HbA1c. CONCLUSIONS: HbA1c predicted incident diabetes. In contrast to the superiority of the lifestyle intervention on glucose-defined diabetes, metformin and lifestyle interventions had similar effects in preventing HbA1c-defined diabetes. The long-term implications for other health outcomes remain to be determined. © 2015 by the American Diabetes Association.
Channa R., Wilmer Eye Institute |
Sophie R., Wilmer Eye Institute |
Bagheri S., Wilmer Eye Institute |
Shah S.M., Wilmer Eye Institute |
And 6 more authors.
American Journal of Ophthalmology | Year: 2015
Purpose: To determine the incidence and progression of macular atrophy in patients with neovascular age-related macular degeneration (AMD) treated with vascular endothelial growth factor (VEGF) antagonists. Design: Retrospective interventional case series. Methods: All patients with neovascular AMD treated by the same physician during a 12-month period of ascertainment had all images from their entire follow-up period evaluated, and areas of retina that developed atrophy were compared to the same areas prior to the onset of anti-VEGF treatment. Longitudinal measurements of retinal atrophy were made. Results: In 39 patients, 52 eyes with neovascular AMD were identified. We excluded 5 eyes from analysis (4 had retinal pigment epithelium tears, and 1 had a laser scar). Fundus photographs of the remaining eyes showed that 18/47 eyes (38%) contained hypopigmented areas suggestive of atrophy within the macula at some time during follow-up. Spectral-domain optical coherence tomography confirmed that these areas had loss of retinal pigment epithelium and ellipsoid zones, with or without subretinal material suggestive of subretinal fibrosis. Comparison of fundus photographs with fluorescein angiograms showed that in 13/18 eyes (72%), atrophy developed in areas previously occupied by choroidal neovascularization, and the other 5 eyes had atrophy prior to the onset of anti-VEGF treatment. The mean (± standard deviation) rate of increase in pure atrophic areas (no subretinal material) was 0.7 ± 0.8 mm2 per year, with a range of 0.01-2.6 mm2/year. Conclusion: Treatment of neovascular AMD with a VEGF-neutralizing protein can result in regression of choroidal neovascularization, which is sometimes associated with atrophy of overlying retina. © 2015 Elsevier Inc. All rights reserved.
2013 American College of Cardiology/American Heart Association and 2004 Adult Treatment Panel III cholesterol guidelines applied to HIV-infected patients with/without subclinical high-risk coronary plaque
Zanni M.V., Massachusetts General Hospital |
Fitch K.V., Massachusetts General Hospital |
Feldpausch M., Massachusetts General Hospital |
Han A., Massachusetts General Hospital |
And 9 more authors.
AIDS | Year: 2014
Background: The 2013 American College of Cardiology/American Heart Association (ACC/AHA) cholesterol guidelines are being applied to HIV-infected patients but have not been validated in this at-risk population, which is known to have a high prevalence of subclinical high-risk morphology (HRM) coronary atherosclerotic plaque. Objective: To compare recommendations for statins among HIV-infected subjects with/without HRM coronary plaque according to 2013 ACC/AHA versus 2004 Adult Treatment Panel III guidelines. Methods/design: Data from 108 HIV-infected subjects without known cardiovascular disease (CVD) or lipid-lowering treatment who underwent contrast-enhanced computed tomography angiography were analyzed. Recommendations for statin therapy according to 2013 versus 2004 guidelines were assessed among those with/without HRM coronary plaque. Results: Among all subjects, 10-year atherosclerotic cardiovascular disease (ASCVD) risk score was 3.3% (1.6, 6.6), yet 36% of subjects had HRM coronary plaque. Among those with HRM coronary plaque, statins would be recommended for 26% by 2013 guidelines versus 10% by 2004 guidelines (P=0.04). Conversely, among those without HRM coronary plaque, statins would be recommended for 19% by 2013 guidelines versus 7% by 2004 guidelines (P=0.005). In multivariate modeling, while 10-year ASCVD risk score related to HRM coronary plaque burden (P=0.02), so too did other factors not incorporated into 2013 guidelines. Conclusion: The 2013 ACC/AHA cholesterol guidelines recommend statin therapy for a higher percentage of subjects with and without HRM coronary plaque relative to 2004 guidelines. However, even by 2013 guidelines, statin therapy would not be recommended for the majority (74%) of HIV-infected subjects with subclinical HRM coronary plaque. Outcome studies are needed to determine the utility of new statin recommendations and the contribution of HRM coronary plaque to CVD events among HIV-infected subjects.
© 2014 Wolters Kluwer Health.