Boston, MA, United States

Grebely J.,University of New South Wales | Page K.,University of California at San Francisco | Sacks-Davis R.,Burnet Institute | Sacks-Davis R.,Monash University | And 20 more authors.
Hepatology | Year: 2014

Although 20%-40% of persons with acute hepatitis C virus (HCV) infection demonstrate spontaneous clearance, the time course and factors associated with clearance remain poorly understood. We investigated the time to spontaneous clearance and its predictors among participants with acute HCV using Cox proportional hazards analyses. Data for this analysis were drawn from an international collaboration of nine prospective cohorts evaluating outcomes after acute HCV infection. Among 632 participants with acute HCV, 35% were female, 82% were Caucasian, 49% had the interleukin-28B (IL28B) CC genotype (rs12979860), 96% had ever injected drugs, 47% were infected with HCV genotype 1, and 7% had human immunodeficiency virus (HIV) coinfection. Twenty-eight percent were HCV antibody negative/RNA positive at the time of acute HCV detection (early acute HCV). During follow-up, spontaneous clearance occurred in 173 of 632 participants, and at 1 year after infection, 25% (95% confidence interval [CI]: 21, 29) had cleared virus. Among those with clearance, the median time to clearance was 16.5 weeks (IQR: 10.5, 33.4), with 34%, 67%, and 83% demonstrating clearance at 3, 6, and 12 months, respectively. Adjusting for age, factors independently associated with time to spontaneous clearance included female sex (adjusted hazard ratio [AHR]: 2.16; 95% CI: 1.48, 3.18), IL28B CC genotype (versus CT/TT; AHR: 2.26; 95% CI: 1.52, 3.34), and HCV genotype 1 (versus non-genotype 1; AHR: 1.56; 95% CI: 1.06, 2.30). The effects of IL28B genotype and HCV genotype on spontaneous clearance were greater among females than among males. Conclusions: Female sex, favorable IL28B genotype, and HCV genotype 1 are independent predictors of spontaneous clearance. Further research is required to elucidate the observed sex-based differences in HCV control. © 2013 by the American Association for the Study of Liver Diseases.
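
The abstract's central analysis is a Cox proportional hazards model of time to spontaneous clearance, reported as adjusted hazard ratios with 95% confidence intervals. The sketch below illustrates that kind of model on simulated data using the lifelines library; the column names, simulated effect sizes, and censoring rule are illustrative assumptions, not the study's data or code.

```python
# Illustrative Cox proportional hazards fit for time to spontaneous HCV
# clearance; all data below are simulated and the column names are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "il28b_cc": rng.integers(0, 2, n),          # IL28B CC vs. CT/TT
    "hcv_genotype_1": rng.integers(0, 2, n),    # genotype 1 vs. non-1
    "age": rng.normal(30, 8, n).round(),
})

# Simulate shorter times to clearance (higher hazard) for the factors the
# abstract reports as favorable, then censor follow-up at 2 years.
baseline = rng.exponential(60, n)               # weeks
hazard_mult = np.exp(0.7 * df["female"] + 0.8 * df["il28b_cc"]
                     + 0.4 * df["hcv_genotype_1"])
df["weeks"] = baseline / hazard_mult
df["cleared"] = (df["weeks"] <= 104).astype(int)
df.loc[df["cleared"] == 0, "weeks"] = 104

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="cleared")
cph.print_summary()   # exp(coef) column plays the role of the adjusted hazard ratios
```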


Korb D.R.,Korb Associates | Herman J.P.,Pittsfield Eye Associates | Blackie C.A.,Korb Associates | Scaffidi R.C.,Tufts Medical School | And 4 more authors.
Cornea | Year: 2010

Purpose: The purpose of this study was to investigate the prevalence of lid wiper epitheliopathy (LWE) in patients diagnosed with dry eye disease (DED). Methods: Patients were recruited for two groups. Inclusion criteria for the DED group (n = 50) were: a score greater than 10 on the Standard Patient Evaluation of Eye Dryness questionnaire, fluorescein break-up time 5 seconds or less, corneal and conjunctival staining with fluorescein and lissamine green Grade 1 or greater (scale 0-3), and Schirmer test with anesthesia 5 mm or less. For the asymptomatic group (n = 50), inclusion criteria were: no dry eye symptoms, fluorescein break-up time 10 seconds or greater, no corneal or conjunctival staining, and Schirmer test 10 mm or greater. Sequential instillations (n = 2, 5 minutes apart) of a mixture of 2% fluorescein and 1% lissamine green solution were used to stain the lid wipers of all patients. LWE was graded (scale 0-3) using the horizontal lid length and the average sagittal lid widths of the stained wiper. Results: In symptomatic patients, 88% had LWE (22% Grade 1, 46% Grade 2, and 20% Grade 3). In asymptomatic patients, 16% had LWE (14% Grade 1, 2% Grade 2, and 0% Grade 3). The difference in prevalence of lid wiper staining between groups was significant (P < 0.0001). Conclusions: The prevalence of LWE was six times greater in the DED group than in the control group, and the prevalence of LWE Grade 2 or greater was 16 times greater. These data further establish LWE as a diagnostic sign of dry eye disease. © 2010 Lippincott Williams & Wilkins.
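
The abstract reports an 88% versus 16% prevalence of LWE and a P < 0.0001 group difference without naming the test; a simple way to reproduce that kind of comparison is an exact test on the implied 2x2 table, as sketched below (the choice of Fisher's exact test is an assumption).

```python
# Between-group comparison of LWE prevalence on the 2x2 table implied by the
# abstract (44/50 DED patients vs. 8/50 asymptomatic patients with staining).
from scipy.stats import fisher_exact

lwe_ded, n_ded = 44, 50     # 88% of the dry eye group
lwe_ctrl, n_ctrl = 8, 50    # 16% of the asymptomatic group

table = [[lwe_ded, n_ded - lwe_ded],
         [lwe_ctrl, n_ctrl - lwe_ctrl]]
odds_ratio, p_value = fisher_exact(table)

prevalence_ratio = (lwe_ded / n_ded) / (lwe_ctrl / n_ctrl)
print(f"prevalence ratio = {prevalence_ratio:.1f}, Fisher exact p = {p_value:.2e}")
```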


Dwyer J.T.,Tufts Medical School | Dwyer J.T.,Frances Stern Nutrition Center | Woteki C.,Education and Economics | Bailey R.,U.S. National Institutes of Health | And 7 more authors.
Nutrition Reviews | Year: 2014

This article reviews the current landscape regarding food fortification in the United States; the content is based on a workshop sponsored by the North American Branch of the International Life Sciences Institute. Fortification of the food supply with vitamins and minerals is a public health strategy to enhance nutrient intakes of the population without increasing caloric intake. Many individuals in the United States would not achieve recommended micronutrient intakes without fortification of the food supply. The achievement and maintenance of a desirable level of nutritional quality in the nation's food supply is, thus, an important public health objective. While the addition of nutrients to foods can help maintain and improve the overall nutritional quality of diets, indiscriminate fortification of foods could result in overfortification or underfortification in the food supply and nutrient imbalances in the diets of individuals. Any changes in food fortification policy for micronutrients must be considered within the context of the impact they will have on all segments of the population and of food technology and safety applications and their limitations. This article discusses and evaluates the value of fortification, the success of current fortification efforts, and the future role of fortification in preventing or reversing nutrient inadequacies. © 2014 International Life Sciences Institute.


Lapane K.L.,Virginia Commonwealth University | Sands M.R.,Brown University | Sands M.R.,Memorial Hospital of Rhode Island | Yang S.,Virginia Commonwealth University | And 3 more authors.
Osteoarthritis and Cartilage | Year: 2012

Objective: To examine use of complementary and alternative medicine (CAM) among individuals with radiographically confirmed osteoarthritis (OA) of the knee. Methods: We included 2679 participants of the Osteoarthritis Initiative with radiographic tibiofemoral knee OA in at least one knee at baseline. Trained interviewers asked a series of specific questions relating to current OA treatments, including CAM therapies (seven categories - alternative medical systems, mind-body interventions, manipulation and body-based methods, energy therapies, and three types of biologically based therapies) and conventional medications. Participants were classified as: (1) conventional medication users only; (2) CAM users only; (3) users of both; and (4) users of neither. Polytomous logistic regression identified correlates of treatment approaches, including sociodemographic and clinical/functional correlates. Results: CAM use was prevalent (47%), with 24% reporting use of both CAM and conventional medication approaches. Multi-joint OA was correlated with all treatments (adjusted odds ratios [aOR]: conventional medications only, 1.62; CAM only, 1.37; both, 2.16). X-ray evidence of severe narrowing (OARSI grade 3) was associated with use of glucosamine/chondroitin (aOR: 2.20) and use of both (aOR: 1.98). The Western Ontario and McMaster Universities (WOMAC) Pain Score was correlated with conventional medication use, either alone (aOR: 1.28) or in combination with CAM (aOR: 1.41 per one standard deviation change). Knee injury and Osteoarthritis Outcome Score (KOOS) Quality of Life (QOL) and Short Form (SF)-12 Physical Scale scores were inversely related to all treatments. Conclusion: CAM is commonly used to treat joint and arthritis pain among persons with knee OA. The extent to which these treatments are effective in managing symptoms and slowing disease progression remains to be proven. © 2011 Osteoarthritis Research Society International.
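
The study's polytomous (multinomial) logistic regression can be sketched with statsmodels as below; the outcome coding, predictor names, and randomly generated data are illustrative assumptions, not the Osteoarthritis Initiative data.

```python
# Illustrative polytomous (multinomial) logistic regression for treatment
# approach; data are simulated and variable names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X = pd.DataFrame({
    "const": 1.0,
    "multi_joint_oa": rng.integers(0, 2, n).astype(float),
    "womac_pain_z": rng.normal(0, 1, n),          # per 1-SD change, as in the abstract
    "severe_narrowing": rng.integers(0, 2, n).astype(float),
})
# Outcome: 0 = neither, 1 = conventional only, 2 = CAM only, 3 = both
# (assigned at random here purely so the example runs end to end)
y = rng.integers(0, 4, n)

fit = sm.MNLogit(y, X).fit(disp=False)
print(np.exp(fit.params))   # adjusted odds ratios relative to the reference category
```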


Stevens R.G.,University of Connecticut Health Center | Brainard G.C.,Thomas Jefferson University | Blask D.E.,Tulane University | Lockley S.W.,Harvard University | Motta M.E.,Tufts Medical School
CA Cancer Journal for Clinicians | Year: 2014

Breast cancer is the leading cause of cancer death among women worldwide, and there is only a limited explanation of why. Risk is highest in the most industrialized countries but also is rising rapidly in the developing world. Known risk factors account for only a portion of the incidence in the high-risk populations, and there has been considerable speculation and many false leads on other possibly major determinants of risk, such as dietary fat. A hallmark of industrialization is the increasing use of electricity to light the night, both within the home and without. It has only recently become clear that this evolutionarily new and, thereby, unnatural exposure can disrupt human circadian rhythmicity, of which three salient features are melatonin production, sleep, and the circadian clock. A convergence of research in cells, rodents, and humans suggests that the health consequences of circadian disruption may be substantial. An innovative experimental model has shown that light at night markedly increases the growth of human breast cancer xenografts in rats. In humans, the theory that light exposure at night increases breast cancer risk leads to specific predictions that are being tested epidemiologically: evidence has accumulated on risk in shift workers, risk in blind women, and the impact of sleep duration on risk. If electric light at night does explain a portion of the breast cancer burden, then there are practical interventions that can be implemented, including more selective use of light and the adoption of recent advances in lighting technology and application. CA Cancer J Clin 2014;64:207-218. © 2013 American Cancer Society, Inc.


Letendre S.L.,University of California at San Diego | Zheng J.C.,University of Nebraska Medical Center | Kaul M.,Sanford Burnham Institute for Medical Research | Yiannoutsos C.T.,Indiana University | And 4 more authors.
Journal of NeuroVirology | Year: 2011

Chemokines influence HIV neuropathogenesis by affecting the HIV life cycle, trafficking of macrophages into the nervous system, glial activation, and neuronal signaling and repair processes; however, knowledge of their relationship to in vivo measures of cerebral injury is limited. The primary objective of this study was to determine the relationship between a panel of chemokines in cerebrospinal fluid (CSF) and cerebral metabolites measured by proton magnetic resonance spectroscopy (MRS) in a cohort of HIV-infected individuals. One hundred seventy-one stored CSF specimens were assayed from HIV-infected individuals who were enrolled in two ACTG studies that evaluated the relationship between neuropsychological performance and cerebral metabolites. Concentrations of six chemokines (fractalkine, IL-8, IP-10, MCP-1, MIP-1β, and SDF-1) were measured and compared with cerebral metabolites individually and as composite neuronal, basal ganglia, and inflammatory patterns. IP-10 and MCP-1 were the chemokines most strongly associated with individual cerebral metabolites. Specifically, (1) higher IP-10 levels correlated with lower N-acetyl aspartate (NAA)/creatine (Cr) ratios in the frontal white matter and higher MI/Cr ratios in all three brain regions considered and (2) higher MCP-1 levels correlated with lower NAA/Cr ratios in frontal white matter and the parietal cortex. IP-10, MCP-1, and IL-8 had the strongest associations with patterns of cerebral metabolites. In particular, higher levels of IP-10 correlated with lower neuronal pattern scores and higher basal ganglia and inflammatory pattern scores, the same pattern which has been associated with HIV-associated neurocognitive disorders (HAND). Subgroup analysis indicated that the effects of IP-10 and IL-8 were influenced by effective antiretroviral therapy and that memantine treatment may mitigate the neuronal effects of IP-10. This study supports the role of chemokines in HAND and the validity of MRS as an assessment tool. In particular, the findings identify relationships between the immune response - particularly an interferon-inducible chemokine, IP-10 - and cerebral metabolites and suggest that antiretroviral therapy and memantine modify the impact of the immune response on neurons. © The Author(s) 2011.
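
The associations reported above are between CSF chemokine concentrations and MRS metabolite ratios; a minimal way to probe one such association is a rank correlation between a (typically log-scaled) chemokine level and an NAA/Cr ratio, as in the sketch below. The data, the log scaling, and the use of Spearman correlation are assumptions for illustration, not the study's analysis.

```python
# Illustrative chemokine-metabolite association: simulated CSF IP-10 levels
# versus frontal white matter NAA/Cr ratios, tested with Spearman correlation.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n = 171                                   # number of CSF specimens in the study
log_ip10 = rng.normal(3.0, 0.5, n)        # hypothetical log-scaled IP-10
naa_cr_fwm = 1.5 - 0.1 * log_ip10 + rng.normal(0, 0.1, n)   # inverse relationship

rho, p = spearmanr(log_ip10, naa_cr_fwm)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```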


Nitti V.W.,New York University | Mourtzinos A.,Tufts Medical School | Brucker B.M.,Tufts Medical School
Journal of Urology | Year: 2014

Purpose Many investigators have used the number of pads to determine the severity of post-prostatectomy incontinence, and yet the accuracy of this tool remains unproven. We determined whether the patient perception of pad use and urine loss reflects actual urine loss. We also identified a quality of life measure that distinguishes patients by severity of incontinence. Materials and Methods We prospectively enrolled 235 men at a total of 18 sites who, 6 months or more after radical prostatectomy, had incontinence requiring protection. Patients completed a questionnaire on the perception of pad number, size and wetness, a quality of life question, several standardized incontinence questionnaires and a 24-hour pad test that assessed pad number, size and weight. SPSS® was used for statistical analysis. Results Perception of the number of pads used closely agreed with the number of pads collected during a 24-hour pad test. Perceived and actual pad size had excellent concordance (76%, p < 0.001). Patients with wet and soaked pads had statistically and clinically significantly different pad weights, distinct from each other and from those of patients whose pads were almost dry or slightly wet. Response to the quality of life question separated the men into 4 statistically significantly different groups based on mean 24-hour pad weight. Conclusions Patients accurately described the number, size and degree of wetness of pads collected during a 24-hour pad test. These values correlated well with actual urine loss. The single question, "To what extent does urine loss affect your quality of life?" separated men into distinct categories. © 2014 by American Urological Association Education and Research, Inc.
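
Two of the analyses described above lend themselves to a short sketch: percent agreement between perceived and measured pad size, and a comparison of 24-hour pad weight across the four quality-of-life response groups. The abstract does not state which tests SPSS was used for, so the data, group means, and the one-way ANOVA below are assumptions for illustration only.

```python
# Illustrative pad-test analyses on simulated data: size concordance and a
# one-way ANOVA of 24-hour pad weight across quality-of-life groups.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
n = 235

# Perceived vs. measured pad size (0 = small, 1 = medium, 2 = large);
# perceived sizes are simulated to agree with the measured size most of the time.
actual = rng.integers(0, 3, n)
perceived = np.where(rng.random(n) < 0.76, actual, rng.integers(0, 3, n))
print(f"size concordance = {np.mean(perceived == actual):.0%}")

# Hypothetical 24-hour pad weights (grams) for four quality-of-life groups.
groups = [rng.normal(mu, mu * 0.4, 50).clip(min=0) for mu in (20, 80, 200, 500)]
f_stat, p = f_oneway(*groups)
print(f"ANOVA across QoL groups: F = {f_stat:.1f}, p = {p:.2e}")
```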


Gordon F.D.,Tufts Medical School | Gordon F.D.,Lahey Clinic Medical Center
Clinics in Liver Disease | Year: 2012

Ascites is the pathologic accumulation of fluid in the peritoneum. It is the most common complication of cirrhosis, with a prevalence of approximately 10%. Over a 10-year period, 50% of patients with previously compensated cirrhosis are expected to develop ascites. As a marker of hepatic decompensation, ascites is associated with a poor prognosis, with only a 56% survival 3 years after onset. In addition, morbidity is increased because of the risk of additional complications, such as spontaneous bacterial peritonitis and hepatorenal syndrome. Understanding the pathophysiology of ascites is essential for its proper management. © 2012 Elsevier Inc.


Artur Z.,Tufts Medical School
Pathology Case Reviews | Year: 2011

Calciphylaxis is an ominous clinicopathological syndrome characterized by painful skin necrosis and ulceration caused by vascular occlusion, thrombosis, and microcalcifications of small arteries and capillaries in the subcutaneous tissue. Prognosis in calciphylaxis is poor, and prompt histological diagnosis is essential because 50% of patients with calciphylaxis are expected to die within the first year of diagnosis. This report presents a typical case of calciphylaxis and discusses recent advances in the understanding of the pathogenesis of calciphylaxis and the histological differential diagnosis. Recent basic research and clinicopathological correlation studies indicate that calciphylaxis is not merely a metastatic calcification phenomenon secondary to increased calcium and/or phosphate levels. Intravascular thrombosis and vascular injury emerge as critical independent pathogenetic factors in calciphylaxis. The new research insights have implications for diagnostic criteria of calciphylaxis. Copyright © 2011 by Lippincott Williams & Wilkins.


Gardin J.M.,Hackensack University Medical Center | Bartz T.M.,University of Washington | Polak J.F.,Tufts Medical Center | O'Leary D.H.,Tufts Medical School | Wong N.D.,University of California at Irvine
Journal of the American Society of Echocardiography | Year: 2014

Background The aim of this study was to evaluate whether the addition of ultrasound carotid intima-media thickness (CIMT) measurements and risk categories of plaque help predict incident stroke and cardiovascular disease (CVD) in older adults. Methods Carotid ultrasound studies were recorded in the multicenter Cardiovascular Health Study. CVD was defined as coronary heart disease plus heart failure plus stroke. Ten-year risk prediction Cox proportional-hazards models for stroke and CVD were calculated using Cardiovascular Health Study-specific coefficients for Framingham risk score factors. Categories of CIMT and CIMT plus plaque were added to Framingham risk score prediction models, and categorical net reclassification improvement (NRI) and Harrell's c-statistic were calculated. Results In 4,384 Cardiovascular Health Study participants (61% women, 14% black; mean baseline age, 72 ± 5 years) without CVD at baseline, higher CIMT category and the presence of plaque were both associated with higher incidence rates for stroke and CVD. The addition of CIMT improved the ability of Framingham risk score-type risk models to discriminate cases from noncases of incident stroke and CVD (NRI = 0.062, P = .015, and NRI = 0.027, P < .001, respectively), with no further improvement by adding plaque. For both outcomes, NRI was driven by down-classifying those without incident disease. Although the addition of plaque to CIMT did not result in a significant NRI for either outcome, it was significant among those without incident disease. Conclusions In older adults, the addition of CIMT modestly improves 10-year risk prediction for stroke and CVD beyond a traditional risk factor model, mainly by down-classifying risk in those without stroke or CVD; the addition of plaque to CIMT adds no statistical benefit in the overall cohort, although there is evidence of down-classification in those without events. © 2014 by the American Society of Echocardiography.
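
The categorical net reclassification improvement used above has a simple closed form: the net proportion of events moved to a higher risk category plus the net proportion of non-events moved to a lower one. The sketch below computes it on simulated risk-category assignments; the category coding, event rate, and reclassification pattern are assumptions, not the Cardiovascular Health Study data.

```python
# Illustrative categorical net reclassification improvement (NRI) for a base
# risk model versus a model with an added marker (e.g., CIMT); data simulated.
import numpy as np

def categorical_nri(cat_base, cat_new, event):
    """Event NRI plus non-event NRI, from moves between risk categories."""
    cat_base, cat_new, event = map(np.asarray, (cat_base, cat_new, event))
    up, down = cat_new > cat_base, cat_new < cat_base
    ev, ne = event == 1, event == 0
    nri_events = up[ev].mean() - down[ev].mean() if ev.any() else 0.0
    nri_nonevents = down[ne].mean() - up[ne].mean() if ne.any() else 0.0
    return nri_events + nri_nonevents

rng = np.random.default_rng(4)
n = 1000
event = rng.random(n) < 0.08                  # hypothetical 10-year event indicator
base = rng.integers(0, 3, n)                  # 0 = low, 1 = intermediate, 2 = high risk
# Let the extended model tend to down-classify non-events, as the abstract reports.
new = np.where(~event & (rng.random(n) < 0.15), np.maximum(base - 1, 0), base)

print(f"categorical NRI = {categorical_nri(base, new, event):.3f}")
```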
