Nitti V.W., New York University | Mourtzinos A., Tufts Medical School | Brucker B.M., Tufts Medical School
Journal of Urology | Year: 2014

Purpose: Many investigators have used the number of pads to determine the severity of post-prostatectomy incontinence, yet the accuracy of this tool remains unproven. We determined whether the patient perception of pad use and urine loss reflects actual urine loss. We also identified a quality of life measure that distinguishes patients by severity of incontinence.

Materials and Methods: We prospectively enrolled 235 men at a total of 18 sites who were 6 months or more after radical prostatectomy and had incontinence requiring protection. Patients completed a questionnaire on the perception of pad number, size and wetness, a quality of life question, several standardized incontinence questionnaires and a 24-hour pad test that assessed pad number, size and weight. SPSS® was used for statistical analysis.

Results: Perception of the number of pads used closely agreed with the number of pads collected during the 24-hour pad test. Perceived and actual pad size had excellent concordance (76%, p <0.001). Patients with wet and soaked pads had statistically and clinically significantly different pad weights that were uniquely different from each other and from those of patients who were almost dry and slightly wet. Response to the quality of life question separated the men into 4 statistically significantly different groups based on mean 24-hour pad weight.

Conclusions: Patients accurately described the number, size and degree of wetness of pads collected during a 24-hour pad test. These values correlated well with actual urine loss. The single question, "To what extent does urine loss affect your quality of life?" separated men into distinct categories. © 2014 by American Urological Association Education and Research, Inc.
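As a rough illustration of the agreement statistics reported above (the study itself used SPSS), the sketch below computes percent agreement between perceived and actual pad size and a one-way ANOVA of 24-hour pad weight across quality of life groups. All data, codings, and group values are hypothetical, not the study's.

```python
# Hypothetical sketch of the agreement and group-comparison analyses the
# abstract describes; none of these numbers come from the study.
import numpy as np
from scipy import stats

# Hypothetical paired observations: perceived vs. actual pad size per patient,
# coded as 0 = small, 1 = medium, 2 = large.
perceived = np.array([0, 1, 1, 2, 0, 2, 1, 1, 2, 0])
actual    = np.array([0, 1, 2, 2, 0, 2, 1, 0, 2, 0])

# Simple percent agreement (the abstract reports 76% concordance).
agreement = np.mean(perceived == actual)
print(f"percent agreement: {agreement:.0%}")

# A chi-square test asks whether perceived and actual size are associated
# beyond chance; build the 3x3 contingency table first.
table = np.zeros((3, 3))
for p, a in zip(perceived, actual):
    table[p, a] += 1
chi2, pval, _, _ = stats.chi2_contingency(table)
print(f"chi-square p-value: {pval:.3f}")

# Hypothetical 24-hour pad weights (grams) grouped by quality of life
# response; a one-way ANOVA asks whether the group means differ.
groups = [[12, 18, 25], [40, 55, 62], [120, 150, 170], [300, 420, 380]]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA p-value: {p_anova:.4f}")
```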


Dwyer J.T., Tufts Medical School | Dwyer J.T., Frances Stern Nutrition Center | Woteki C., Education and Economics | Bailey R., U.S. National Institutes of Health | And 7 more authors.
Nutrition Reviews | Year: 2014

This article reviews the current landscape regarding food fortification in the United States; the content is based on a workshop sponsored by the North American Branch of the International Life Sciences Institute. Fortification of the food supply with vitamins and minerals is a public health strategy to enhance nutrient intakes of the population without increasing caloric intake. Many individuals in the United States would not achieve recommended micronutrient intakes without fortification of the food supply. The achievement and maintenance of a desirable level of nutritional quality in the nation's food supply is, thus, an important public health objective. While the addition of nutrients to foods can help maintain and improve the overall nutritional quality of diets, indiscriminate fortification of foods could result in overfortification or underfortification in the food supply and nutrient imbalances in the diets of individuals. Any changes in food fortification policy for micronutrients must be considered within the context of the impact they will have on all segments of the population and of food technology and safety applications and their limitations. This article discusses and evaluates the value of fortification, the success of current fortification efforts, and the future role of fortification in preventing or reversing nutrient inadequacies. © 2014 International Life Sciences Institute.
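To make the population-level argument concrete, the sketch below simulates an intake distribution and compares the share of the population falling below a reference requirement, and above a tolerable upper level, with and without a fixed fortificant contribution. All distributions and cutpoints are invented for illustration and do not come from the article.

```python
# Hypothetical illustration of why fortification shifts population intake:
# compare the share of a simulated population below an EAR-style cutpoint,
# and above an upper-level cutpoint, with and without fortification.
# All values are made up for illustration, not actual dietary data.
import numpy as np

rng = np.random.default_rng(0)
ear = 400.0          # hypothetical requirement (e.g., mcg/day of a vitamin)
upper_limit = 1000.0 # hypothetical tolerable upper intake level
fortificant = 140.0  # hypothetical added amount from fortified staple foods

base_intake = rng.lognormal(mean=np.log(350), sigma=0.4, size=100_000)

for label, intake in [("unfortified", base_intake),
                      ("fortified", base_intake + fortificant)]:
    below = np.mean(intake < ear)        # inadequacy under this cutpoint
    above_ul = np.mean(intake > upper_limit)  # potential overfortification
    print(f"{label}: {below:.1%} below EAR, {above_ul:.1%} above UL")
```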


Letendre S.L., University of California at San Diego | Zheng J.C., University of Nebraska Medical Center | Kaul M., Sanford Burnham Institute for Medical Research | Yiannoutsos C.T., Indiana University | And 4 more authors.
Journal of NeuroVirology | Year: 2011

Chemokines influence HIV neuropathogenesis by affecting the HIV life cycle, trafficking of macrophages into the nervous system, glial activation, and neuronal signaling and repair processes; however, knowledge of their relationship to in vivo measures of cerebral injury is limited. The primary objective of this study was to determine the relationship between a panel of chemokines in cerebrospinal fluid (CSF) and cerebral metabolites measured by proton magnetic resonance spectroscopy (MRS) in a cohort of HIV-infected individuals. One hundred seventy-one stored CSF specimens were assayed from HIV-infected individuals who were enrolled in two ACTG studies that evaluated the relationship between neuropsychological performance and cerebral metabolites. Concentrations of six chemokines (fractalkine, IL-8, IP-10, MCP-1, MIP-1β, and SDF-1) were measured and compared with cerebral metabolites individually and as composite neuronal, basal ganglia, and inflammatory patterns. IP-10 and MCP-1 were the chemokines most strongly associated with individual cerebral metabolites. Specifically, (1) higher IP-10 levels correlated with lower N-acetyl aspartate (NAA)/creatine (Cr) ratios in the frontal white matter and higher myo-inositol (MI)/Cr ratios in all three brain regions considered and (2) higher MCP-1 levels correlated with lower NAA/Cr ratios in the frontal white matter and the parietal cortex. IP-10, MCP-1, and IL-8 had the strongest associations with patterns of cerebral metabolites. In particular, higher levels of IP-10 correlated with lower neuronal pattern scores and higher basal ganglia and inflammatory pattern scores, the same pattern that has been associated with HIV-associated neurocognitive disorders (HAND). Subgroup analysis indicated that the effects of IP-10 and IL-8 were influenced by effective antiretroviral therapy and that memantine treatment may mitigate the neuronal effects of IP-10. This study supports the role of chemokines in HAND and the validity of MRS as an assessment tool. In particular, the findings identify relationships between the immune response, particularly the interferon-inducible chemokine IP-10, and cerebral metabolites, and suggest that antiretroviral therapy and memantine modify the impact of the immune response on neurons. © The Author(s) 2011.
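The sketch below illustrates one common way to run this kind of analysis: Spearman correlations between a CSF chemokine and regional metabolite ratios. The variable names, simulated data, and effect sizes are hypothetical; the study's actual statistical models are not reproduced here.

```python
# Hypothetical sketch of a chemokine-vs-metabolite correlation analysis of
# the kind the abstract describes; the data below are simulated, not real.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 171  # number of CSF specimens in the study

# Hypothetical log-scale IP-10 concentrations and MRS metabolite ratios
# in frontal white matter (FWM), with invented effect sizes.
ip10 = rng.normal(size=n)
naa_cr_fwm = 1.5 - 0.3 * ip10 + rng.normal(scale=0.5, size=n)  # NAA/Cr
mi_cr_fwm = 0.6 + 0.2 * ip10 + rng.normal(scale=0.3, size=n)   # MI/Cr

# Spearman correlation is a common choice for skewed biomarker data.
for name, metabolite in [("NAA/Cr (FWM)", naa_cr_fwm),
                         ("MI/Cr (FWM)", mi_cr_fwm)]:
    rho, p = stats.spearmanr(ip10, metabolite)
    print(f"IP-10 vs {name}: rho = {rho:+.2f}, p = {p:.2e}")
```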


Stevens R.G., University of Connecticut Health Center | Brainard G.C., Thomas Jefferson University | Blask D.E., Tulane University | Lockley S.W., Harvard University | Motta M.E., Tufts Medical School
CA Cancer Journal for Clinicians | Year: 2014

Breast cancer is the leading cause of cancer death among women worldwide, and there is only a limited explanation of why. Risk is highest in the most industrialized countries but also is rising rapidly in the developing world. Known risk factors account for only a portion of the incidence in the high-risk populations, and there has been considerable speculation and many false leads on other possibly major determinants of risk, such as dietary fat. A hallmark of industrialization is the increasing use of electricity to light the night, both within the home and without. It has only recently become clear that this evolutionarily new and, thereby, unnatural exposure can disrupt human circadian rhythmicity, of which three salient features are melatonin production, sleep, and the circadian clock. A convergence of research in cells, rodents, and humans suggests that the health consequences of circadian disruption may be substantial. An innovative experimental model has shown that light at night markedly increases the growth of human breast cancer xenografts in rats. In humans, the theory that light exposure at night increases breast cancer risk leads to specific predictions that are being tested epidemiologically: evidence has accumulated on risk in shift workers, risk in blind women, and the impact of sleep duration on risk. If electric light at night does explain a portion of the breast cancer burden, then there are practical interventions that can be implemented, including more selective use of light and the adoption of recent advances in lighting technology and application. CA Cancer J Clin 2014;64:207-218. © 2013 American Cancer Society, Inc.
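One of the epidemiologic tests mentioned above, risk in shift workers, is typically summarized as an odds ratio from a 2x2 exposure-by-outcome table. The sketch below shows the standard calculation with a Wald 95% confidence interval; the counts are invented for illustration and are not data from any cited study.

```python
# Hypothetical 2x2 odds-ratio calculation for night-shift exposure vs.
# breast cancer status in a case-control design; counts are invented.
import math

a, b = 120, 880  # cases: exposed (night-shift work), unexposed
c, d = 90, 910   # controls: exposed, unexposed

odds_ratio = (a * d) / (b * c)
# Wald standard error of log(OR), then exponentiate the CI bounds.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```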


Gordon F.D., Tufts Medical School | Gordon F.D., Lahey Clinic Medical Center
Clinics in Liver Disease | Year: 2012

Ascites is the pathologic accumulation of fluid in the peritoneal cavity. It is the most common complication of cirrhosis, with a prevalence of approximately 10%. Over a 10-year period, 50% of patients with previously compensated cirrhosis are expected to develop ascites. As a marker of hepatic decompensation, ascites is associated with a poor prognosis, with only 56% survival 3 years after onset. In addition, morbidity is increased because of the risk of additional complications, such as spontaneous bacterial peritonitis and hepatorenal syndrome. Understanding the pathophysiology of ascites is essential for its proper management. © 2012 Elsevier Inc.
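As a back-of-envelope check on the figures above, assuming a constant hazard (a simplification, not the article's method), the 10-year 50% cumulative incidence and the 56% 3-year survival translate into annual rates as follows.

```python
# Constant-hazard arithmetic for the abstract's figures (a simplifying
# assumption for illustration): 50% develop ascites within 10 years, and
# survival is 56% at 3 years after onset.
import math

ascites_hazard = -math.log(1 - 0.50) / 10   # per-year hazard of ascites
mortality_hazard = -math.log(0.56) / 3      # per-year hazard after onset
print(f"implied annual ascites incidence: {1 - math.exp(-ascites_hazard):.1%}")
print(f"implied annual mortality after onset: {1 - math.exp(-mortality_hazard):.1%}")
```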
