Durham, NC, United States

Duke University is a private research university in Durham, North Carolina, United States. Founded by Methodists and Quakers in the present-day town of Trinity in 1838, the school moved to Durham in 1892. In 1924, tobacco and electric power industrialist James B. Duke established The Duke Endowment, at which time the institution changed its name to honor his deceased father, Washington Duke.

The university's campus spans over 8,600 acres across three contiguous campuses in Durham, as well as a marine lab in Beaufort. Duke's main campus, designed largely by African American architect Julian Abele, incorporates Gothic architecture, with the 210-foot Duke Chapel at the campus's center and highest point of elevation. The first-year East Campus features Georgian-style architecture, while the main Gothic-style West Campus, 1.5 miles away, is adjacent to the Medical Center.

Duke's research expenditures in the 2012 fiscal year were $1.01 billion, the seventh largest in the nation. Competing in the Atlantic Coast Conference, Duke's athletic teams, known as the Blue Devils, have captured 15 team national championships, including four by the high-profile men's basketball team. Duke was ranked among the world's best universities by both Times Higher Education (THE) and QS, and tied for 8th in the 2015 U.S. News & World Report "Best National Universities" rankings. In 2014, Thomson Reuters named 32 Duke professors to its list of Highly Cited Researchers; the only schools with more primary affiliations were Harvard, Stanford, and UC Berkeley. (Source: Wikipedia)



Patent
Georgetown University and Duke University | Date: 2016-04-05

Disclosed are heterocyclic compounds that are ligands for nicotinic acetylcholine receptors. The compounds are useful for treating a mammal suffering from any one of a range of therapeutic indications, including Alzheimer's disease, Parkinson's disease, dyskinesias, Tourette's syndrome, schizophrenia, attention deficit disorder, anxiety, pain, depression, obsessive compulsive disorder, chemical substance abuse, alcoholism, memory deficit, pseudodementia, Ganser's syndrome, migraine pain, bulimia, obesity, premenstrual syndrome or late luteal phase syndrome, tobacco abuse, post-traumatic syndrome, social phobia, chronic fatigue syndrome, premature ejaculation, erectile difficulty, anorexia nervosa, disorders of sleep, autism, mutism, trichotillomania, and hypothermia.


Patent
Duke University | Date: 2016-05-13

The present invention generally relates to methods of modulating Cav1.2 channels and to Cav1.2 channel activators.


Patent
Duke University | Date: 2016-06-17

Provided herein are compounds and compositions, including pharmaceutical compositions, having anti-cancer activity. Also provided are methods for diagnosing, detecting, and treating cancer in a subject, as well as a method for evaluating cancer stage in a subject, wherein the methods include determining the amount of a Ca2+/calmodulin-dependent kinase kinase (CaMKK) in a sample. Further provided are methods of screening and identifying a compound that inhibits CaMKK.


Patent
Duke University and Los Alamos National Security LLC | Date: 2015-03-19

In certain aspects the invention provides HIV-1 immunogens, including envelopes (CH505) and selections therefrom, and methods for swarm immunizations using combinations of HIV-1 envelopes.


Systems, methods and related devices used to produce and collect polarized noble gas to inhibit, suppress, detect or filter alkali metal nanoclusters to preserve or increase a polarization level thereof. The systems can include a pre-sat chamber that has an Area Ratio between 20 and 500.


Patent
Duke University | Date: 2016-07-13

The present invention provides monoclonal antibodies and antigen-binding fragments thereof that specifically bind to CD20, as well as pharmaceutical compositions comprising the same. The invention further provides methods of using the monoclonal antibodies, antigen-binding fragments, and pharmaceutical compositions, for example, in methods of depleting B cells or in treating B cell disorders. Also provided are cells, nucleic acids and methods for producing the monoclonal antibodies.


Patent
Immunolight Llc. and Duke University | Date: 2016-08-25

Products, compositions, systems, and methods for modifying a target structure which mediates or is associated with a biological activity, including treatment of conditions, disorders, or diseases mediated by or associated with a target structure, such as a virus, cell, subcellular structure, or extracellular structure. The methods may be performed in situ in a non-invasive manner by placing a nanoparticle having a metallic shell on at least a fraction of a surface in the vicinity of a target structure in a subject and applying an initiation energy to the subject, thus producing an effect on or change to the target structure directly or via a modulation agent. The nanoparticle is configured, upon exposure to a first wavelength λ1, to generate radiation of a second wavelength λ2 having a higher energy than the first wavelength λ1. The methods may further be performed by application of an initiation energy to a subject in situ to activate a pharmaceutical agent directly or via an energy modulation agent, optionally in the presence of one or more plasmonics-active agents, thus producing an effect on or change to the target structure. Also described are kits containing products or compositions formulated or configured, and systems, for use in practicing these methods.


Patent
Immunolight Llc., Duke University and North Carolina State University | Date: 2015-04-22

A system and method for imaging or treating a disease in a human or animal body. The system provides to the human or animal body a pharmaceutical carrier including one or more phosphors which are capable of emitting ultraviolet or visible light into the body and which provide x-ray contrast. The system includes one or more devices which infuse a diseased site with a photo-activatable drug and the pharmaceutical carrier, an initiation energy source comprising an x-ray or high energy source which irradiates the diseased site with at least one of x-rays, gamma rays, or electrons to thereby initiate emission of said ultraviolet or visible light into the body, and a processor programmed to at least one of 1) produce images of the diseased site or 2) control a dose of said x-rays, gamma rays, or electrons to the diseased site for production of said ultraviolet or visible light at the diseased site to activate the photoactivatable drug.


Patent
Advanced Liquid Logic, Inc. and Duke University | Date: 2016-12-01

Methods and devices for conducting chemical or biochemical reactions that require multiple reaction temperatures are described. The methods involve moving one or more reaction droplets or reaction volumes through various reaction zones having different temperatures on a microfluidics apparatus. The devices comprise a microfluidics apparatus comprising appropriate actuators capable of moving reaction droplets or reaction volumes through the various reaction zones.


Patent
Johns Hopkins University and Duke University | Date: 2016-11-16

We found mutations of the R132 residue of isocitrate dehydrogenase 1 (IDH1) in the majority of grade II and III astrocytomas and oligodendrogliomas as well as in glioblastomas that develop from these lower grade lesions. Those tumors without mutations in IDH1 often had mutations at the analogous R172 residue of the closely related IDH2 gene. These findings have important implications for the pathogenesis and diagnosis of malignant gliomas.


Patent
Duke University and The Government Of The United States As Represented By The Secretary Of Health And Human Services | Date: 2016-10-11

We tested the in vitro and in vivo efficacy of a recombinant bispecific immunotoxin that recognizes both EGFRwt and tumor-specific EGFRvIII receptors. A single chain antibody was cloned from a hybridoma and fused to a toxin carrying a C-terminal peptide that increases retention within cells. The binding affinity and specificity of the recombinant bispecific immunotoxin for the EGFRwt and the EGFRvIII proteins were measured. In vitro cytotoxicity was measured. In vivo activity of the recombinant bispecific immunotoxin was evaluated in subcutaneous models and compared to that of an established monospecific immunotoxin. In our preclinical studies, the bispecific recombinant immunotoxin exhibited significant potential for treating brain tumors.


Patent
Duke University and Emory University | Date: 2015-02-18

Provided herein are recombinant constructs, vectors and expression cassettes including a first promoter which is suitably a tRNA promoter operably connected to a first polynucleotide encoding a first single guide RNA and a second promoter operably connected to a second polynucleotide encoding a Cas9 polypeptide. The first single guide RNA includes a first portion complementary to a strand of a target sequence of a DNA virus and a second portion capable of interacting with the Cas9 polypeptide. Also provided are codon optimized Staphylococcus aureus derived Cas9 polynucleotides and polypeptides with nuclear localization signals and optionally an epitope tag. Also provided are constructs for production of sgRNAs including a tRNA. Methods of inhibiting viral replication, inhibiting expression of a target sequence from a virus or treating a viral infection or viral induced cancer using the compositions are also provided.


Patent
Duke University | Date: 2014-12-23

Described are methods and materials for diagnosing a subject's predisposition for cardiovascular disease by detecting a copper deficiency genetic marker, as well as methods of alleviating Cu transport impairment. Specifically, the Cu deficiency genetic marker may be within the gene encoding a transmembrane Cu transporter protein (Ctr1) or its regulatory sequences.


Methods of making polycationic nanofibers by grafting cationic polymers onto electrospun neutral nanofibers and polycationic nanofibers produced by the methods are provided herein. In addition, methods of using the polycationic nanofibers to reduce inflammation, to adsorb anionic compounds such as heparin or nucleic acids, to inhibit the growth of microbes or inhibit the formation of a biofilm are also provided. The polycationic nanofibers may be in a mesh and may be included in a medical device, wound dressing, bandage, or as part of a graft.


A sensor comprising a semiconductor layer having a two dimensional electron gas (2DEG) and an oxide layer in electronic contact with the semiconductor layer is provided. A method of detecting an analyte molecule using such sensor is also provided.


Patent
Duke University | Date: 2016-11-14

Optical systems based on an objective lens comprising one or more plastic lens elements are disclosed. The inclusion of a plastic lens element reduces one or more of system cost, size, weight, and/or complexity. The chromatic performance of some imaging systems in accordance with the present invention is improved by incorporation of a diffractive surface into the entry surface of the objective lens.


Methods and systems for large-scale face recognition. The system includes an electronic processor to receive at least one image of a subject of interest and apply at least one subspace model as a splitting binary decision function on the at least one image of the subject of interest. The electronic processor is further configured to generate at least one binary code from the at least one splitting binary decision function. The electronic processor is further configured to apply a code aggregation model to combine the at least one binary codes generated by the at least one subspace model. The electronic processor is further configured to generate an aggregated binary code from the code aggregation model and use the aggregated binary code to provide a hashing scheme.
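The abstract above describes a hashing pipeline: subspace models act as splitting binary decision functions, each emits a binary code, and a code aggregation model combines them into one hash. The patent does not specify the subspace models or the aggregation rule, so the sketch below assumes random linear projections thresholded at the median as the decision functions and simple concatenation as the aggregation; it is illustrative only, not the patented method.

```python
# Illustrative sketch only: random-projection "subspace models" and
# concatenation-based aggregation are assumptions, not the patented design.
import numpy as np

def fit_subspace_models(features, n_models=4, dim=8, seed=0):
    """Learn projection-based subspace models; each projected dimension
    acts as a splitting binary decision (sign test against the median)."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_models):
        proj = rng.standard_normal((features.shape[1], dim))
        thresholds = np.median(features @ proj, axis=0)
        models.append((proj, thresholds))
    return models

def binary_code(x, model):
    """Apply one subspace model's splitting decisions to get a binary code."""
    proj, thresholds = model
    return (x @ proj > thresholds).astype(np.uint8)

def aggregated_code(x, models):
    """Code aggregation: combine per-subspace codes into one binary hash."""
    return np.concatenate([binary_code(x, m) for m in models])

# Toy usage: hash a gallery of face feature vectors for bucketed lookup.
gallery = np.random.default_rng(1).standard_normal((100, 64))
models = fit_subspace_models(gallery)
codes = np.array([aggregated_code(v, models) for v in gallery])
print(codes.shape)  # (100, 32): 4 models x 8 bits each
```

In a retrieval setting, the aggregated codes would index hash buckets so that a probe image is compared only against gallery entries sharing (or nearly sharing) its code.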


Rapid radio frequency (RF) microwave devices and methods are disclosed. According to an aspect, a waveguide includes a body having first and second components that are attachable together to form an interior having a surface. Further, the waveguide includes a conductive material formed on the interior surface and shaped to convey electromagnetic waves.


Patent
Duke University | Date: 2016-08-26

A system and method for automatically removing blur and noise in a plurality of digital images. The system comprises an electronic processor configured to receive the plurality of digital images, perform motion estimation and motion compensation to align the plurality of digital images, determine an alignment of the plurality of digital images with respect to a reference frame, generate a consistency map based on the alignment of the plurality of digital images with respect to the reference frame, combine the plurality of digital images aligned with respect to the reference frame in the Fourier domain using quality-of-alignment information from the consistency map to generate an aggregated frame, and apply a post-processing filter to enhance the quality of the aggregated frame.
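The Fourier-domain combination step above can be sketched as follows. The abstract leaves the weighting rule open, so this toy version assumes the frames are already motion-compensated and approximates each frame's per-frequency weight by its Fourier magnitude raised to a power p (so sharper frames dominate at each frequency); the consistency map, motion estimation, and post-processing filter of the actual system are omitted.

```python
# Minimal sketch under assumptions: frames pre-aligned, per-frequency
# weights from spectral magnitude (power p) instead of a consistency map.
import numpy as np

def aggregate_burst(frames, p=11):
    """Combine aligned frames in the Fourier domain, weighting each
    frequency by relative spectral magnitude across the burst."""
    specs = np.array([np.fft.fft2(f) for f in frames])       # per-frame spectra
    mags = np.abs(specs) ** p
    weights = mags / np.maximum(mags.sum(axis=0, keepdims=True), 1e-12)
    fused = (weights * specs).sum(axis=0)                    # weighted average
    return np.real(np.fft.ifft2(fused))

# Toy usage: fuse three noisy copies of a synthetic image.
rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0
burst = [clean + 0.1 * rng.standard_normal(clean.shape) for _ in range(3)]
out = aggregate_burst(burst)
print(out.shape)  # (32, 32)
```

The high exponent p makes the per-frequency weighting behave almost like a max over frames, which is why frequency content preserved in even one sharp frame survives aggregation.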


Patent
Duke University | Date: 2017-04-19

The present invention relates to compositions, methods, and kits for eliciting an immune response to at least one CMV antigen expressed by a cancer cell, in particular for treating and preventing cancer. CMV determination methods, compositions, and kits also are provided.


Costs direct decisions that influence the effectiveness of radiology in the care of patients on a daily basis. Yet many radiologists struggle to harness the power of cost measurement and cost management as a critical path toward establishing their value in patient care. When radiologists cannot articulate their value, they risk losing control over how imaging is delivered and supported. In the United States, recent payment trends directing value-based payments for bundles of care advance the imperative for radiology providers to articulate their value. This begins with the development of an understanding of the providers' own costs, as well as the complex interrelationships and imaging-associated costs of other participants across the imaging value chain. Controlling the costs of imaging necessitates understanding them at a procedural level and quantifying the costs of delivering specific imaging services. Effective product-level costing is dependent on a bottom-up approach, which is supported through recent innovations in time-dependent activity-based costing. Once the costs are understood, they can be managed. Within the high fixed cost and high overhead cost environment of health care provider organizations, stakeholders must understand the implications of misaligned top-down cost management approaches that can both paradoxically shift effort from low-cost workers to much costlier professionals and allocate overhead costs counterproductively. Radiology's engagement across a broad spectrum of care provides an excellent opportunity for radiology providers to take a leading role within health care organizations to enhance value and margin through principled and effective cost management. 
Following a discussion of the rationale for measuring costs, this review contextualizes costs from the perspectives of a variety of stakeholders (relativity), discusses core concepts in how costs are classified (rudiments), presents common and improved methods for measuring costs in health care, and discusses how cost management strategies can either improve or hinder high-value health care (realities). © 2016 RSNA.


Mccarty B.,Duke University
Journal of Medicine and Philosophy (United Kingdom) | Year: 2016

In The Anticipatory Corpse, Jeffrey Bishop claims that modern medicine has lost formal and final causality as the dead body has become epistemologically normative, and that a singular focus on efficient and material causality has thoroughly distorted modern medical practice. Bishop implies that the renewal of medicine will require its housing in alternate social spaces. This essay critiques both Bishop's diagnosis and therapy by arguing, first, that alternate social imaginaries, though perhaps marginalized, are already present within the practice of medicine. And second, the essay argues that alternate social imaginaries in medicine can be reclaimed not through separatist communities but in the re-narration of conceptually underdetermined practices. Given Bishop's invitation for theology to engage medicine, this essay then draws from theologian Dietrich Bonhoeffer for the kind of diagnosis and therapy currently needed, concluding with a contemporary example of how an alternate social imaginary is being instantiated in modern medicine. © The Author 2016. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved.


Draelos Z.D.,Duke University
Clinics in Dermatology | Year: 2017

Patients with rosacea present a challenge to the dermatologist, as they typically possess sensitive skin, need facial Demodex and bacterial colonization control, exhibit vasomotor instability, require camouflaging of telangiectatic mats, and desire prescription treatment. Currently available pharmaceuticals are aimed at inflammation reduction, primarily with the use of topical and oral antibiotics. Recently, vasoconstrictor formulations have emerged, but these drugs have only a temporary effect and improve appearance without addressing the underlying cause, which remains largely unknown. Cosmeceuticals, including cleansers, moisturizers, cosmetics, sunscreens, and anti-inflammatory botanicals, can be used as adjuvant therapies in combination with traditional therapies. This review explores the effective use of cosmeceuticals in the treatment of rosacea to enhance pharmaceutical outcomes and meet patient expectations in a more satisfactory manner. © 2016 Elsevier Inc.


Chiba-Falek O.,Duke University
Current Opinion in Genetics and Development | Year: 2017

Synucleinopathies are a group of neurodegenerative diseases that share a common pathological lesion of intracellular protein inclusions largely composed of aggregates of alpha-synuclein protein. Accumulating evidence, including genome-wide association studies, has implicated the alpha-synuclein (SNCA) gene in the etiology of synucleinopathies and it has been suggested that SNCA expression levels are critical for the development of these diseases. This review focuses on genetic variants from the class of structural variants (SVs), including multiplication of large genomic segments and short (<50 bp) genomic variants such as simple sequence repeats (SSRs), within the SNCA locus. We provide evidence that SNCA-SVs play a key role in the pathogenesis of synucleinopathies via their effects on gene expression and on regulatory mechanisms including transcription and splicing. © 2017 Elsevier Ltd


McAdams D.,Duke University
Annals of the New York Academy of Sciences | Year: 2017

Widespread adoption of point-of-care resistance diagnostics (POCRD) reduces ineffective antibiotic use but could increase overall antibiotic use. Indeed, in the context of a standard susceptible-infected epidemiological model with a single antibiotic, POCRD accelerates the rise of resistance in the disease-causing bacterial population. When multiple antibiotics are available, however, POCRD may slow the rise of resistance even as more patients receive antibiotic treatment, belying the conventional wisdom that antibiotics are “exhaustible resources” whose increased use necessarily promotes the rise of resistance. © 2017 New York Academy of Sciences.


McAdams D.,Duke University
Annals of the New York Academy of Sciences | Year: 2017

Point-of-care diagnostics that can determine an infection's antibiotic sensitivity increase the profitability of new antibiotics that enjoy patent protection, even when such diagnostics reduce the quantity of antibiotics sold. Advances in the science and technology underpinning rapid resistance diagnostics can therefore be expected to spur efforts to discover and develop new antibiotics, especially those with a narrow spectrum of activity that would otherwise fail to find a market. © 2017 New York Academy of Sciences.


DNA double-strand breaks (DSBs) are traditionally associated with cancer through their abilities to cause chromosomal instabilities or gene mutations. Here we report a new class of self-inflicted DNA DSBs that can drive tumor growth irrespective of their effects on genomic stability. We discover a mechanism through which cancer cells cause DSBs in their own genome spontaneously, independent of reactive oxygen species or replication stress. In this mechanism, low-level cytochrome c leakage from the mitochondria leads to sublethal activation of apoptotic caspases and nucleases, which causes DNA DSBs. In response to these spontaneous DNA DSBs, ATM, a key factor involved in DNA damage response, is constitutively activated. Activated ATM leads to activation of transcription factors NF-κB and STAT3, known drivers of tumor growth. Moreover, self-inflicted DNA DSB formation and ATM activation are important in sustaining the stemness of patient-derived glioma cells. In human tumor tissues, elevated levels of activated ATM correlate with poor patient survival. Self-inflicted DNA DSBs therefore are functionally important for maintaining the malignancy of cancer cells. Cell Research advance online publication, 24 March 2017; doi:10.1038/cr.2017.41. © 2017 Shanghai Institutes for Biological Sciences, Chinese Academy of Sciences


Zentella R.,Duke University
Nature Chemical Biology | Year: 2017

Plant development requires coordination among complex signaling networks to enhance the plant's adaptation to changing environments. DELLAs, transcription regulators originally identified as repressors of phytohormone gibberellin signaling, play a central role in integrating multiple signaling activities via direct protein interactions with key transcription factors. Here, we found that DELLA is mono-O-fucosylated by the novel O-fucosyltransferase SPINDLY (SPY) in Arabidopsis thaliana. O-fucosylation activates DELLA by promoting its interaction with key regulators in brassinosteroid- and light-signaling pathways, including BRASSINAZOLE-RESISTANT1 (BZR1), PHYTOCHROME-INTERACTING-FACTOR3 (PIF3) and PIF4. Moreover, spy mutants displayed elevated responses to gibberellin and brassinosteroid, and increased expression of common target genes of DELLAs, BZR1 and PIFs. Our study revealed that SPY-dependent protein O-fucosylation plays a key role in regulating plant development. This finding may have broader importance because SPY orthologs are conserved in prokaryotes and eukaryotes, thus suggesting that intracellular O-fucosylation may regulate a wide range of biological processes in diverse organisms. © 2017 Nature Publishing Group, a division of Macmillan Publishers Limited. All Rights Reserved.


Rausher M.D.,Duke University
American Naturalist | Year: 2017

Reinforcement can contribute to speciation by increasing the strength of prezygotic isolating mechanisms. Theoretical analyses over the past two decades have demonstrated that conditions for reinforcement are not unduly restrictive, and empirical investigations have documented over a dozen likely cases, indicating that it may be a reasonably common phenomenon in nature. Largely uncharacterized, however, is the diversity of biological scenarios that can create the reduced hybrid fitness that drives reinforcement. Here I examine one such scenario—the evolution of the “selfing syndrome” (a suite of characters including reductions in flower size and in nectar, pollen, and scent production) in highly selfing plant species. Using a four-locus model, where the loci are (1) a discrimination locus, (2) a target-of-discrimination locus, (3) a pollen-production locus, and (4) a selfing-rate locus, I determine the conditions under which this syndrome can favor reinforcement, an increase in discrimination through change at locus 1, in an outcrossing species that experiences gene flow from a highly selfing species. In the absence of both linkage disequilibrium between loci and pollen discounting, reinforcement can occur, but only in a very small fraction of the parameter combinations examined. Moderate linkage (r = 0.1) between one pair of loci increases this fraction moderately, depending on which two loci are linked. Pollen discounting (a reduction in pollen exported to other plants due to increased selfing), by contrast, can increase the fraction of parameter combinations that result in reinforcement substantially. The evolution of reduced pollen production in highly selfing species thus facilitates reinforcement, especially if substantial pollen discounting is associated with selfing. © 2016 by The University of Chicago.


Lebedev M.A.,Duke University | Nicolelis M.A.L.,Duke University
Physiological Reviews | Year: 2017

Brain-machine interfaces (BMIs) combine methods, approaches, and concepts derived from neurophysiology, computer science, and engineering in an effort to establish real-time bidirectional links between living brains and artificial actuators. Although theoretical propositions and some proof of concept experiments on directly linking the brains with machines date back to the early 1960s, BMI research only took off in earnest at the end of the 1990s, when this approach became intimately linked to new neurophysiological methods for sampling large-scale brain activity. The classic goals of BMIs are 1) to unveil and utilize principles of operation and plastic properties of the distributed and dynamic circuits of the brain and 2) to create new therapies to restore mobility and sensations to severely disabled patients. Over the past decade, a wide range of BMI applications have emerged, which considerably expanded these original goals. BMI studies have shown neural control over the movements of robotic and virtual actuators that enact both upper and lower limb functions. Furthermore, BMIs have also incorporated ways to deliver sensory feedback, generated from external actuators, back to the brain. BMI research has been at the forefront of many neurophysiological discoveries, including the demonstration that, through continuous use, artificial tools can be assimilated by the primate brain’s body schema. Work on BMIs has also led to the introduction of novel neurorehabilitation strategies. As a result of these efforts, long-term continuous BMI use has been recently implicated with the induction of partial neurological recovery in spinal cord injury patients. © 2017 the American Physiological Society.


Kepler T.B.,Boston University | Wiehe K.,Duke University
Immunological Reviews | Year: 2017

Most broadly neutralizing antibodies (BNAbs) elicited in response to HIV-1 infection are extraordinarily mutated. One goal of HIV-1 vaccine development is to induce antibodies that are similar to the most potent and broad BNAbs isolated from infected subjects. The most effective BNAbs have very high mutation frequencies, indicative of the long periods of continual activation necessary to acquire the BNAb phenotype through affinity maturation. Understanding the mutational patterns that define the maturation pathways in BNAb development is critical to vaccine design efforts to recapitulate through vaccination the successful routes to neutralization breadth and potency that have occurred in natural infection. Studying the mutational changes that occur during affinity maturation, however, requires accurate partitioning of sequence data into B-cell clones and identification of the starting point of a B-cell clonal lineage, the initial V(D)J rearrangement. Here, we describe the statistical framework we have used to perform these tasks. Through the recent advancement of these and similar computational methods, many HIV-1 ancestral antibodies have been inferred, synthesized and their structures determined. This has allowed, for the first time, the investigation of the structural mechanisms underlying the affinity maturation process in HIV-1 antibody development. Here, we review what has been learned from this atomic-level structural characterization of affinity maturation in HIV-1 antibodies and the implications for vaccine design. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd


Kelsoe G.,Duke University | Haynes B.F.,Duke University
Immunological Reviews | Year: 2017

Induction of broadly neutralizing antibodies (bNAbs) is a major goal of HIV vaccine development. BNAbs are made during HIV infection by a subset of individuals but currently cannot be induced in the setting of vaccination. Considerable progress has been made recently in understanding host immunologic controls of bNAb induction and maturation in the setting of HIV infection; these findings point to key roles for both central and peripheral immunologic tolerance mechanisms in limiting bNAb development. Immune tolerance checkpoint inhibition has been transformative in promotion of anti-tumor CD8 T-cell responses in the treatment of certain malignancies. Here, we review the evidence for host controls of bNAb responses, and discuss strategies for the transient modulation of immune responses with vaccines toward the goal of enhancing germinal center B-cell responses to favor bNAb B-cell lineages and to foster their maturation to full neutralization potency. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd


Borrow P.,University of Oxford | Moody M.A.,Duke University
Immunological Reviews | Year: 2017

Induction of broadly neutralizing antibodies (bnAbs) capable of inhibiting infection with diverse variants of human immunodeficiency virus type 1 (HIV-1) is a key, as-yet-unachieved goal of prophylactic HIV-1 vaccine strategies. However, some HIV-infected individuals develop bnAbs after approximately 2-4 years of infection, enabling analysis of features of these antibodies and the immunological environment that enables their induction. Distinct subsets of CD4+ T cells play opposing roles in the regulation of humoral responses: T follicular helper (Tfh) cells support germinal center formation and provide help for affinity maturation and the development of memory B cells and plasma cells, while regulatory CD4+ (Treg) cells including T follicular regulatory (Tfr) cells inhibit the germinal center reaction to limit autoantibody production. BnAbs exhibit high somatic mutation frequencies, long third heavy-chain complementarity determining regions, and/or autoreactivity, suggesting that bnAb generation is likely to be highly dependent on the activity of CD4+ Tfh cells, and may be constrained by host tolerance controls. This review discusses what is known about the immunological environment during HIV-1 infection, in particular alterations in CD4+ Tfh, Treg, and Tfr populations and autoantibody generation, and how this is related to bnAb development, and considers the implications for HIV-1 vaccine design. © 2017 The Authors. Immunological Reviews published by John Wiley & Sons Ltd


Margolis D.M.,University of North Carolina at Chapel Hill | Koup R.A.,National Institute of Allergy and Infectious Diseases | Ferrari G.,Duke University
Immunological Reviews | Year: 2017

The bar is high to improve on current combination antiretroviral therapy (ART), now highly effective, safe, and simple. However, antibodies that bind the HIV envelope are able to uniquely target the virus as it seeks to enter new target cells, or as it is expressed from previously infected cells. Furthermore, the use of antibodies against HIV as a therapeutic may offer advantages. Antibodies can have long half-lives, and are being considered as partners for long-acting antiretrovirals for use in therapy or prevention of HIV infection. Early studies in animal models and in clinical trials suggest that such antibodies can have antiviral activity but, as with small-molecule antiretrovirals, the issues of viral escape and resistance will have to be addressed. Most promising, however, are the unique properties of anti-HIV antibodies: the potential ability to opsonize viral particles, to direct antibody-dependent cellular cytotoxicity (ADCC) against actively infected cells, and ultimately the ability to direct the clearance of HIV-infected cells by effector cells of the immune system. These distinctive activities suggest that HIV antibodies and their derivatives may play an important role in the next frontier of HIV therapeutics, the effort to develop treatments that could lead to an HIV cure. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.


Verkoczy L.,Duke University | Alt F.W.,Howard Hughes Medical Institute | Tian M.,Howard Hughes Medical Institute
Immunological Reviews | Year: 2017

A major challenge for HIV-1 vaccine research is developing a successful immunization approach for inducing broadly neutralizing antibodies (bnAbs). A key shortcoming in meeting this challenge has been the lack of animal models capable of identifying impediments limiting bnAb induction and ranking vaccine strategies for their ability to promote bnAb development. Since 2010, immunoglobulin knockin (KI) technology, involving inserting functional rearranged human variable exons into the mouse IgH and IgL loci has been used to express bnAbs in mice. This approach has allowed immune tolerance mechanisms limiting bnAb production to be elucidated and strategies to overcome such limitations to be evaluated. From these studies, along with the wealth of knowledge afforded by analyses of recombinant Ig-based bnAb structures, it became apparent that key functional features of bnAbs often are problematic for their elicitation in mice by classic vaccine paradigms, necessitating more iterative testing of new vaccine concepts. In this regard, bnAb KI models expressing deduced precursor V(D)J rearrangements of mature bnAbs or unrearranged germline V, D, J segments (that can be assembled into variable region exons that encode bnAb precursors), have been engineered to evaluate novel immunogens/regimens for effectiveness in driving bnAb responses. One promising approach emerging from such studies is the ability of sequentially administered, modified immunogens (designed to bind progressively more mature bnAb precursors) to initiate affinity maturation. Here, we review insights gained from bnAb KI studies regarding the regulation and induction of bnAbs, and discuss new Ig KI methodologies to manipulate the production and/or expression of bnAbs in vivo, to further facilitate vaccine-guided bnAb induction studies. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd


Marine microbes exhibit seasonal cycles in community composition, yet the key drivers of these patterns and microbial population fidelity to specific environmental conditions remain to be determined. To begin addressing these questions, we characterized microbial dynamics weekly for 3 years at a temperate, coastal site with dramatic environmental seasonality. This high-resolution time series reveals that changes in microbial community composition are not continuous; over the duration of the time series, the community instead resolves into distinct summer and winter profiles with rapid spring and fall transitions between these states. Here, we show that these community shifts involve switching between closely related strains that exhibit either summer or winter preferences. Moreover, taxa repeat this process annually in both this and another temperate coastal time series, suggesting that this phenomenon may be widespread in marine ecosystems. To address potential biogeochemical impacts of these community changes, PICRUSt-based metagenome predictions indicate seasonality in transporters, photosynthetic proteins, peptidases and carbohydrate metabolic pathways, even between closely related summer- and winter-associated taxa. Thus, even small temperature shifts, such as those predicted by climate change models, could affect both the structure and function of marine ecosystems. The ISME Journal advance online publication, 24 February 2017; doi:10.1038/ismej.2017.4. © 2017 International Society for Microbial Ecology
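The two-state seasonal pattern described in the abstract (distinct summer and winter community profiles with rapid transitions between them) can in principle be recovered from time-series data by simple clustering. The sketch below is a toy with invented data, not the study's actual analysis: weekly five-taxon composition profiles alternate between two hypothetical states, and a minimal 2-means clustering recovers the seasonal assignment.

```python
import numpy as np

rng = np.random.default_rng(1)

weeks = np.arange(156)                          # ~3 years of weekly samples
summer = np.sin(2 * np.pi * weeks / 52) > 0     # crude summer indicator

# Two invented 5-taxon composition profiles plus observational noise
profile = np.where(summer[:, None],
                   [0.4, 0.3, 0.1, 0.1, 0.1],   # "summer" community
                   [0.1, 0.1, 0.2, 0.3, 0.3])   # "winter" community
data = profile + rng.normal(0, 0.02, profile.shape)

# Minimal 2-means: seed with one winter-ish (week 0) and one
# summer-ish (week 13) sample, then iterate assignment / update
centroids = data[[0, 13]]
for _ in range(10):
    labels = np.argmin(((data[:, None] - centroids) ** 2).sum(-1), axis=1)
    centroids = np.array([data[labels == k].mean(axis=0) for k in (0, 1)])

# Fraction of weeks whose recovered cluster matches the true season
agreement = max((labels == summer).mean(), (labels != summer).mean())
```

With well-separated profiles, the clustering reproduces the summer/winter labeling almost perfectly; in real data the interesting part is how abrupt the spring and fall transitions between the two clusters are.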


Pollara J.,Duke University
Current Opinion in HIV and AIDS | Year: 2017

PURPOSE OF REVIEW: The ability to induce broadly neutralizing antibody (bNAb) responses is likely essential for development of a globally effective HIV vaccine. Unfortunately, human vaccine trials conducted to date have failed to elicit broad plasma neutralization of primary virus isolates. Despite this limitation, in-depth analysis of the vaccine-induced memory B-cell repertoire can provide valuable insights into the presence and function of subdominant B-cell responses, and identify initiation of antibody lineages that may be on a path towards development of neutralization breadth. RECENT FINDINGS: Characterization of the functional capabilities of monoclonal antibodies isolated from an HIV-1 vaccine trial with modest efficacy has revealed mechanisms by which non-neutralizing antibodies are presumed to have mediated protection. In addition, B-cell repertoire analysis has demonstrated that vaccine boosts shifted the HIV-specific B-cell repertoire, expanding pools of cells with long third heavy chain complementarity determining regions – a characteristic of some bNAb lineages. SUMMARY: Detailed analysis of memory B-cell repertoires and evaluating the effector functions of isolated monoclonal antibodies expands what we can learn from human vaccine trials, and may provide knowledge that can enable rational design of novel approaches to drive maturation of subdominant disfavored bNAb lineages. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.


A vaccine that can effectively prevent HIV-1 transmission remains paramount to ending the HIV pandemic, but to do so, will likely need to induce broadly neutralizing antibody (bnAb) responses. A major technical hurdle toward achieving this goal has been a shortage of animal models with the ability to systematically pinpoint roadblocks to bnAb induction and to rank vaccine strategies based on their ability to stimulate bnAb development. Over the past 6 years, immunoglobulin (Ig) knock-in (KI) technology has been leveraged to express bnAbs in mice, an approach that has enabled elucidation of various B-cell tolerance mechanisms limiting bnAb production and evaluation of strategies to circumvent such processes. From these studies, in conjunction with the wealth of information recently obtained regarding the evolutionary pathways and paratopes/epitopes of multiple bnAbs, it has become clear that the very features of bnAbs desired for their function will be problematic to elicit by traditional vaccine paradigms, necessitating more iterative testing of new vaccine concepts. To meet this need, novel bnAb KI models have now been engineered to express either inferred prerearranged V(D)J exons (or unrearranged germline V, D, or J segments that can be assembled into functional rearranged V(D)J exons) encoding predecessors of mature bnAbs. One encouraging approach that has materialized from studies using such newer models is sequential administration of immunogens designed to bind progressively more mature bnAb predecessors. In this review, insights into the regulation and induction of bnAbs based on the use of KI models will be discussed, as will new Ig KI approaches for higher-throughput production and/or altering expression of bnAbs in vivo, so as to further enable vaccine-guided bnAb induction studies. © 2017 Elsevier Inc.


The incidence of Staphylococcus aureus bacteremia (SAB) is significantly higher in African American (AA) than in European-descended populations. We used admixture mapping (AM) to test the hypothesis that genomic variations with different frequencies in European and African ancestral genomes influence susceptibility to SAB in AAs. A total of 565 adult AAs (390 cases with SAB; 175 age-matched controls) were genotyped for AM analysis. A case-only admixture score and a mixed χ2(1df) score (MIX) to jointly evaluate both single-nucleotide polymorphism (SNP) and admixture association (P<5.00e-08) were computed using MIXSCORE. In addition, a permutation scheme was implemented to derive multiplicity adjusted P-values (genome-wide 0.05 significance threshold: P<9.46e-05). After empirical multiplicity adjustment, one region on chromosome 6 (52 SNPs, P=4.56e-05) in the HLA class II region was found to exhibit a genome-wide statistically significant increase in European ancestry. This region encodes genes involved in HLA-mediated immune response and these results provide additional evidence for genetic variation influencing HLA-mediated immunity, modulating susceptibility to SAB. Genes and Immunity advance online publication, 23 March 2017; doi:10.1038/gene.2017.6. © 2017 The Author(s)
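The permutation scheme for deriving multiplicity-adjusted P-values can be sketched as a max-statistic permutation test. The code below is a hypothetical toy, not the study's MIXSCORE pipeline: the genotypes, sample sizes, and the Armitage-trend-style case-control statistic are all invented for illustration (the study itself used case-only and mixed admixture scores). Each permutation shuffles the phenotype labels and records the genome-wide maximum statistic; the family-wise 5% threshold is then the 95th percentile of that null distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions only; the study genotyped genome-wide
# markers in 565 subjects and used MIXSCORE, not this toy statistic.
n_ind, n_snps, n_perm = 200, 500, 500
geno = rng.integers(0, 3, size=(n_ind, n_snps)).astype(float)  # allele dosages
pheno = rng.integers(0, 2, size=n_ind).astype(float)           # case/control

def trend_chi2(geno, pheno):
    """Armitage-trend-style 1-df chi-square per SNP: n * r^2, where r
    is the genotype-phenotype correlation."""
    g = (geno - geno.mean(axis=0)) / geno.std(axis=0)
    p = (pheno - pheno.mean()) / pheno.std()
    r = g.T @ p / len(p)
    return len(p) * r**2

obs = trend_chi2(geno, pheno)

# Null distribution of the genome-wide maximum under permuted labels
max_null = np.array([trend_chi2(geno, rng.permutation(pheno)).max()
                     for _ in range(n_perm)])

threshold = np.quantile(max_null, 0.95)    # family-wise 5% cutoff
adj_p = (1 + (max_null >= obs.max()).sum()) / (1 + n_perm)  # top SNP, adjusted
```

Because the permutation preserves the correlation structure among SNPs, this threshold is less conservative than a Bonferroni correction over the same number of markers.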


News Article | April 13, 2017
Site: www.techtimes.com

A new report from the National Academies of Sciences, Engineering, and Medicine called for the scientific research community in the United States to improve research integrity practices and policies, including probing allegations of misconduct and promoting ethical action. Its key recommendation: have universities and scientific organizations create and operate an independent non-governmental Research Integrity Advisory Board (RIAB) to strengthen this mission. "The research enterprise is not broken, but it faces significant challenges in creating the conditions needed to foster and sustain the highest standards of integrity," said Robert Nerem, chair of the committee behind the report, titled “Fostering Integrity in Research” and published Tuesday, April 11. Nerem sees good reason to act now: otherwise, “Congress could step in” and act unilaterally. Such intervention would likely not have the research enterprise’s best interests at heart, and one should be leery of it, he explained. The National Academy of Sciences has not established standards for appropriate scientific conduct in about 25 years, during which science has transformed significantly. Making research integrity far more complex than before, according to Nerem, are the rise of interdisciplinary research, the advent of new technologies, and greater global collaboration. The committee began this work in 2012, with the objective of updating the 1992 National Academies report “Responsible Science.” That report proposed the same advisory body, but the call went unheeded, according to Science magazine. The old report was prompted by a series of research misconduct cases that damaged the field’s reputation and led to the formation of the Office of Research Integrity (ORI) to probe misconduct in federally funded biomedical research.
ORI and its counterpart office at the National Science Foundation (NSF) continue to receive allegations each year, resulting in dozens of misconduct findings. An alleged fabrication by cancer researcher Anil Potti at Duke University several years earlier is considered one notorious example of integrity issues. For Nerem, the North Carolina school’s response to the misconduct charges was just “as flawed as the behavior itself.” According to growing evidence, significant percentages of published results in certain areas are non-reproducible, which could be due to unidentified factors or errors. Experts increasingly find, however, that falsification of data and undesirable practices, such as misusing statistics, play a role in this irreproducibility. New adverse practices have also surfaced, including journals that publish papers with little to no editorial review or quality control while charging authors hefty fees. Journal article retractions have also climbed, with a substantial portion attributed to research misconduct. The new report noted that the RIAB would foster ethical behavior without having a direct role in investigations, regulation, accreditation, or the setting of enforceable standards for the field. It also urged government offices and private foundations funding scientific research to quantify conditions that may be associated with misconduct and detrimental practices. Investigations, added Nerem, could also be subjected to the kind of external peer review currently applied to journal articles and grant proposals. The report comes at a time when scientists are rallying against changes that include proposed cuts to science and health budgets as well as decreased interest in science-based factual inquiry. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | May 3, 2017
Site: www.sciencemag.org

The Amazon rainforest is a treasure trove of biodiversity, containing 10% of the planet’s species in its 6.7 million square kilometers. How it got to be that way has been fiercely disputed for decades. Now, a new study suggests that a large section of the forest was twice flooded by the Caribbean Sea more than 10 million years ago, creating a short-lived inland sea that jump-started the evolution of new species. But the new evidence still hasn’t convinced scientists on the other side of the debate. “It’s hard to imagine a process that would cover such a large forest with an ocean,” says lead author Carlos Jaramillo, a paleontologist at the Smithsonian Tropical Research Institute in Panama City who has been in both camps. Researchers generally agree that parts of the Amazon were once under water, but they don’t agree on where the water came from. Those in the “river camp” argue that freshwater streaming down from the rising Andes sliced up the land below, dividing plants and animals into isolated groups that later turned into new species. The fast-growing mountains also created microclimates at different elevations, sparking speciation and funneling new plants and animals into the Amazon basin. However, when marine microorganisms were discovered in Amazonian sediments in the 1990s, some scientists hypothesized that the forest was once inundated by an ocean, which created new species as forest dwellers quickly adapted to the flood. But proving either case—the river view or the ocean view—is tough. Rocks and fossils that could paint a definitive picture are exceedingly rare. So Jaramillo and his colleagues turned to a different kind of data: cores drilled into the jungle floor. Six centimeters wide and 600 meters deep, the cylindrical cores preserve a record of the region’s past environments in the form of pollen, fossils, and sediments, going back tens of millions of years. 
Jaramillo used two cores: one from eastern Colombia, drilled by an oil company, and one from northeastern Brazil, taken by the Brazilian Geological Survey in the 1980s. Jaramillo’s team went through the cores layer by layer. Most of the remains came from land-dwelling species. But in two thin layers, they found marine plankton and seashells. The Colombian core even contained a fossilized shark’s tooth and a mantis shrimp, both ocean dwellers. That was enough to convince Jaramillo, who was once firmly in the river camp, that the Caribbean Sea had reached down into the western Amazon of Brazil, Ecuador, and Peru twice: once 18 million years ago, and again 14 million years ago, he writes in the study published today. “It’s a lost ecosystem,” he says. These seas didn’t last long. In northwest Brazil, the first flood endured some 200,000 years, while the second lasted 400,000 years. Colombia, which is closer to the Caribbean, was inundated for longer periods, 900,000 and 3.7 million years, respectively. Those floods could have been caused by the growing Andes, Jaramillo says. The mountains would have pushed down the rest of the continent as they thrust upward, letting seawater flow in. But that water would have been quickly displaced as freshwater and sediments flowed down the peaks and rebuilt the basin. In geological time, these floods lasted a mere blink of the eye, Jaramillo says, “but it’s still a long time for a tree.” Even these relatively short events would have transformed the region. The new work “makes the case [for marine flooding] much stronger, and it makes the timing more definite,” says Carina Hoorn, a geologist and palynologist at the University of Amsterdam and Ikiam Regional University of Amazonia in Tena, Ecuador, who first proposed the marine flooding theory. But Paul Baker, a geologist at Duke University in Durham, North Carolina, and Yachay Tech in Urcuquí, Ecuador, is still a firm member of the river camp.
“In [Colombia], I don’t have any problem with there being a marine incursion,” Baker says. But the Brazilian core troubles him, because marine-looking plankton has turned up in other ancient freshwater lakes in Europe, he says. More convincing to Baker would be a measurement of oxygen isotopes in the shells, which could reveal whether they grew in salt- or freshwater. Jaramillo says he’s already working on it. He’d also like to find more Amazonian fossils to study species that may have gone extinct during this dynamic time. For now, there’s only one thing Jaramillo, Hoorn, and Baker can all agree on: They will need to drill and study many more cores from across the region to solve the mystery of the Amazon’s biodiversity.


News Article | April 17, 2017
Site: www.rdmag.com

A vaccine targeting cytomegalovirus (CMV) antigen pp65, combined with high-dose chemotherapy (temozolomide), improved both progression-free survival and overall survival for a small group of glioblastoma (GBM) patients. Journal in Which the Study was Published: Clinical Cancer Research, a journal of the American Association for Cancer Research. Author: Lead author of the study is Kristen Batich, MD, PhD, a researcher in the lab of senior author John Sampson, MD, PhD, chair of the Department of Neurosurgery at Duke University. Background: The typical median survival for GBM patients is less than 15 months. To overcome these poor numbers, the researchers took advantage of CMV's affinity for GBM, with the viral proteins being expressed in roughly 90 percent of these tumors. Building on previous research, they used CMV as a proxy for GBM, targeting the virus with pp65-specific dendritic cells to spotlight the tumor for the immune system. How the Study Was Conducted and Results: The cohort of 11 patients who received this combination therapy demonstrated a median progression-free survival of 25.3 months and a median overall survival of 41.1 months, and three patients remain progression-free more than seven years after diagnosis, Batich explained. "The clinical outcomes in GBM patients who received this combination were very striking," Batich said. Previous work had shown that temozolomide (TMZ) generates profound lymphopenia, or the loss of immune cells, which offers a unique opportunity to retrain the immune system, Batich explained. The researchers administered dose-intensified TMZ as a strategy to further enhance the immune response. "The dose-intensified temozolomide induces a strong state of lymphopenia," said Batich. "With that comes an opportune moment to introduce an antigen-specific vaccine, which redirects the immune system to put all hands on deck and fight that target."
One of the noteworthy results from the study was the excellent response rate despite the high proportion of regulatory T cells, which dampen the immune response and rebounded sharply following TMZ administration. This finding may actually be cause for optimism, Batich noted. "If we could preclude this regulatory T-cell rebound, it could have additionally enhancing effects on the pp65 vaccine response," said Batich. Limitations: Though the survival results are quite encouraging, the authors caution that this was a single-arm study without a control group. In addition, the cohort was quite small. Though the outcomes far outpaced historical controls, a more robust trial will be needed to confirm these results. In addition, the team wants to better understand the mechanisms that underlie the strong response rate and refine this combination therapy to produce even better results. "We want to understand why some patients do better than others," said Batich.


News Article | May 1, 2017
Site: www.chromatographytechniques.com

A whistleblower lawsuit filed against Duke University and two scientists, alleging that fraud enabled them to secure $112 million in federal funding, can proceed, a judge ruled last week. The suit, brought under the False Claims Act, was originally filed in 2013. It could lead to the school losing triple the amount of the grant money, and could entitle the whistleblower to tens of millions of dollars. The complaint alleges that Erin Potts-Kant, and her supervisor William Foster, used fraudulent work to successfully get funding. The two scientists and Duke all filed motions to dismiss last month. But a federal judge ruled on April 25 that the lawsuit will proceed, according to federal court documents on file. The whistleblower suit was unsealed by a U.S. district court last year. The lawsuit was first reported by the science watchdog website Retraction Watch. The lawsuit was filed by former Duke employee Joseph Thomas, who had worked in the pulmonary division as a laboratory research analyst along with Potts-Kant and Foster. Potts-Kant allegedly used a new tool called a flexiVent machine and tailored the results of the experiments. Sometimes she deliberately did not expose mice to the chemical, medication, or other exposure called for in the experiment – or other times just altered the results. “Potts-Kant’s fraud was sometimes more direct and brazen,” the lawsuit claims. “She sometimes did not run the experiments at all, instead manufacturing data that would correspond to the hoped-for outcome of the experiment.” The papers then provided the basis for the successful grants. The suit contends the fraudulent work was the deciding factor in 53 grants totaling approximately $112 million. Potts-Kant was arrested in March 2013 on allegations she had embezzled more than $14,000 from the school to go on spending sprees at stores – and then even falsified receipts. Potts-Kant later pleaded guilty and was sentenced to probation, community service, and a fine.
The lawsuit also alleged that Duke and Foster had not appropriately overseen Potts-Kant’s work to prevent the fraud. Duke has since defended its stance, saying that it reported improprieties to authorities when it was made aware of the extent of the misconduct. Several experts have told Retraction Watch that the lawsuit could lead to a wave of similar lawsuits at other schools.


News Article | April 19, 2017
Site: www.cnet.com

Alphabet, the parent company of Google, will launch a study to collect health data from 10,000 volunteers, the company said Wednesday. The study is part of an initiative called Project Baseline that aims to "test and develop new tools and technologies to access, organize and activate health information," according to a media release. The health information will help the company define a baseline of health and gain a better understanding of risk factors for disease. Verily Life Sciences, part of Alphabet, will work with Duke University School of Medicine and Stanford Medicine on the study. Project Baseline will begin enrolling participants in the next few months. They'll be followed for at least four years. Data collected will be handled on Google computing infrastructure and hosted on the Google Cloud Platform. Anonymized Project Baseline study data will be available to qualified researchers in the future, the media release said.


News Article | April 17, 2017
Site: www.techrepublic.com

Duke University, working with SAP, has launched a redesigned statistics site that provides team basketball data dating back to 1906. Find out what's next for Blue Devils fans.


News Article | May 2, 2017
Site: www.chromatographytechniques.com

Using new gene-editing technology, researchers have rewired mouse stem cells to fight inflammation caused by arthritis and other chronic conditions. Such stem cells, known as SMART cells (Stem cells Modified for Autonomous Regenerative Therapy), develop into cartilage cells that produce a biologic anti-inflammatory drug that, ideally, will replace arthritic cartilage and simultaneously protect joints and other tissues from damage that occurs with chronic inflammation. The cells were developed at Washington University School of Medicine in St. Louis and Shriners Hospitals for Children-St. Louis, in collaboration with investigators at Duke University and Cytex Therapeutics Inc., both in Durham, N.C. The researchers initially worked with skin cells taken from the tails of mice and converted those cells into stem cells. Then, using the gene-editing tool CRISPR in cells grown in culture, they removed a key gene in the inflammatory process and replaced it with a gene that releases a biologic drug that combats inflammation. The research is available online in the journal Stem Cell Reports. “Our goal is to package the rewired stem cells as a vaccine for arthritis, which would deliver an anti-inflammatory drug to an arthritic joint but only when it is needed,” said Farshid Guilak, the paper’s senior author and a professor of orthopedic surgery at Washington University School of Medicine. “To do this, we needed to create a ‘smart’ cell.” Many current drugs used to treat arthritis — including Enbrel, Humira and Remicade — attack an inflammation-promoting molecule called tumor necrosis factor-alpha (TNF-alpha). But the problem with these drugs is that they are given systemically rather than targeted to joints. As a result, they interfere with the immune system throughout the body and can make patients susceptible to side effects such as infections. 
“We want to use our gene-editing technology as a way to deliver targeted therapy in response to localized inflammation in a joint, as opposed to current drug therapies that can interfere with the inflammatory response through the entire body,” said Guilak, also a professor of developmental biology and of biomedical engineering and co-director of Washington University’s Center of Regenerative Medicine. “If this strategy proves to be successful, the engineered cells only would block inflammation when inflammatory signals are released, such as during an arthritic flare in that joint.” As part of the study, Guilak and his colleagues grew mouse stem cells in a test tube and then used CRISPR technology to replace a critical mediator of inflammation with a TNF-alpha inhibitor. “Exploiting tools from synthetic biology, we found we could re-code the program that stem cells use to orchestrate their response to inflammation,” said Jonathan Brunger, PhD, the paper’s first author and a postdoctoral fellow in cellular and molecular pharmacology at the University of California, San Francisco. Over the course of a few days, the team directed the modified stem cells to grow into cartilage cells and produce cartilage tissue. Further experiments by the team showed that the engineered cartilage was protected from inflammation. “We hijacked an inflammatory pathway to create cells that produced a protective drug,” Brunger said. The researchers also encoded the stem/cartilage cells with genes that made the cells light up when responding to inflammation, so the scientists easily could determine when the cells were responding. Recently, Guilak’s team has begun testing the engineered stem cells in mouse models of rheumatoid arthritis and other inflammatory diseases. 
If the work can be replicated in animals and then developed into a clinical therapy, the engineered cells or cartilage grown from stem cells would respond to inflammation by releasing a biologic drug — the TNF-alpha inhibitor — that would protect the synthetic cartilage cells that Guilak’s team created and the natural cartilage cells in specific joints. “When these cells see TNF-alpha, they rapidly activate a therapy that reduces inflammation,” Guilak explained. “We believe this strategy also may work for other systems that depend on a feedback loop. In diabetes, for example, it’s possible we could make stem cells that would sense glucose and turn on insulin in response. We are using pluripotent stem cells, so we can make them into any cell type, and with CRISPR, we can remove or insert genes that have the potential to treat many types of disorders.” With an eye toward further applications of this approach, Brunger added, “The ability to build living tissues from ‘smart’ stem cells that precisely respond to their environment opens up exciting possibilities for investigation in regenerative medicine.”
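The feedback-loop behavior Guilak describes, a therapy produced only while the inflammatory signal is present and decaying away once the flare resolves, can be illustrated with a toy discrete-time simulation. Everything below is invented for illustration: the rate constants, the linear dynamics, and the variable names have no grounding in the study's biology.

```python
# Toy sketch of the negative-feedback idea: an engineered cell senses an
# inflammatory signal (TNF-alpha) and produces an inhibitor only while
# the signal is present. All parameters are illustrative, not measured.

def simulate(steps=100, flare_at=30, flare_len=20,
             k_prod=0.5, k_decay=0.2, k_block=0.8):
    inhibitor = 0.0
    history = []
    for t in range(steps):
        # External inflammatory stimulus: an "arthritic flare" window
        stimulus = 1.0 if flare_at <= t < flare_at + flare_len else 0.0
        # The inhibitor blunts the effective TNF level (the "therapy")
        tnf = max(0.0, stimulus - k_block * inhibitor)
        # Production is driven by sensed TNF; the inhibitor otherwise decays
        inhibitor = max(0.0, inhibitor + k_prod * tnf - k_decay * inhibitor)
        history.append((tnf, inhibitor))
    return history

hist = simulate()
```

Running this, the inhibitor stays at zero before the flare, rises and partially suppresses the effective TNF level during it, and decays back toward zero afterward, which is the qualitative "drug on demand" behavior the quote describes.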


News Article | May 3, 2017
Site: www.prweb.com

LearnHowToBecome.org, a leading resource provider for higher education and career information, has analyzed more than a dozen metrics to determine the best two-year and four-year schools in North Carolina for 2017. Fifty four-year colleges and universities were ranked, and Duke University, University of North Carolina at Chapel Hill, North Carolina State University at Raleigh, Wake Forest University and Queens University of Charlotte were the top five. Fifty two-year schools also made the list, with McDowell Technical Community College, Rockingham Community College, Asheville-Buncombe Technical Community College, Pitt Community College and Durham Technical Community College taking the top five positions. A complete list of schools is included below. “Students in North Carolina have a lot of options when it comes to earning a certificate or degree, but the schools on our list have distinguished themselves as being a cut above the rest,” said Wes Ricketts, senior vice president of LearnHowToBecome.org. “Not only do they offer solid educational programs, they also have career services that lead to strong post-college earnings.” To be included on the “Best Colleges in North Carolina” list, all schools must be regionally accredited, not-for-profit institutions. Each college is ranked on additional statistics including the number of degree programs offered, the availability of career and academic resources, the opportunity for financial aid, graduation rates and annual alumni earnings 10 years after entering college.
For complete details on each college, including individual scores and the data and methodology used to determine the LearnHowToBecome.org “Best Colleges in North Carolina” list, visit: The Best Four-Year Colleges in North Carolina for 2017 include: Appalachian State University Barton College Belmont Abbey College Bennett College Brevard College Campbell University Catawba College Chowan University Davidson College Duke University East Carolina University Elizabeth City State University Elon University Fayetteville State University Gardner-Webb University Greensboro College Guilford College High Point University Johnson C Smith University Lees-McRae College Lenoir-Rhyne University Livingstone College Mars Hill University Meredith College Methodist University Montreat College North Carolina A & T State University North Carolina Central University North Carolina State University at Raleigh North Carolina Wesleyan College Pfeiffer University Piedmont International University Queens University of Charlotte Saint Augustine's University Salem College Shaw University St Andrews University University of Mount Olive University of North Carolina at Asheville University of North Carolina at Chapel Hill University of North Carolina at Charlotte University of North Carolina at Greensboro University of North Carolina at Pembroke University of North Carolina Wilmington Wake Forest University Warren Wilson College Western Carolina University William Peace University Wingate University Winston-Salem State University The Best Two-Year Colleges in North Carolina for 2017 include: Alamance Community College Asheville-Buncombe Technical Community College Beaufort County Community College Bladen Community College Blue Ridge Community College Caldwell Community College and Technical Institute Cape Fear Community College Carolinas College of Health Sciences Carteret Community College Catawba Valley Community College Central Carolina Community College Central Piedmont Community College Cleveland
Community College Coastal Carolina Community College College of the Albemarle Craven Community College Davidson County Community College Durham Technical Community College Fayetteville Technical Community College Forsyth Technical Community College Gaston College Guilford Technical Community College Halifax Community College Haywood Community College James Sprunt Community College Johnston Community College Lenoir Community College Martin Community College McDowell Technical Community College Mitchell Community College Montgomery Community College Nash Community College Pamlico Community College Piedmont Community College Pitt Community College Randolph Community College Rockingham Community College Rowan-Cabarrus Community College Sandhills Community College South Piedmont Community College Southeastern Community College Southwestern Community College Stanly Community College Surry Community College Vance-Granville Community College Wake Technical Community College Wayne Community College Western Piedmont Community College Wilkes Community College Wilson Community College ### About Us: LearnHowtoBecome.org was founded in 2013 to provide data and expert driven information about employment opportunities and the education needed to land the perfect career. Our materials cover a wide range of professions, industries and degree programs, and are designed for people who want to choose, change or advance their careers. We also provide helpful resources and guides that address social issues, financial aid and other special interest in higher education. Information from LearnHowtoBecome.org has proudly been featured by more than 700 educational institutions.


News Article | April 26, 2017
Site: www.scientificcomputing.com

Materials scientists and engineers have developed a sensor that is fast, sensitive and efficient enough to detect specific wavelengths of electromagnetic energy while on the move. The technology could actively scan areas for methane or natural gas leaks, monitor the health of vast fields of crops or quickly sort plastics for recycling. Working closely with the optoelectronic materials company SRICO, engineers from Duke University have built a prototype detector that beats the existing competition in size, weight, power, speed and, most importantly, cost. The new technology relies on metamaterials -- engineered structures made of carefully designed repeating cells that can interact with electromagnetic waves in unnatural ways. By combining seemingly simple patterns of metal with extremely thin slices of perfect crystals, the engineers created a streamlined device able to detect invisible infrared signatures emitted by various kinds of gases, plastics and other sources. The results appeared on February 20, 2017, in the journal Optica. "The benefit of using metamaterials is that different components required in a detector can be combined into one feature," said Willie Padilla, professor of electrical and computer engineering at Duke. "That simplification gains you a lot of efficiency." In a typical thermal detector, infrared light waves are absorbed and converted into heat by a black substance, essentially soot. That heat is conducted to a separate component that creates an electrical signal that is then read out. This setup limits speed, and specific wavelengths can be singled out only by overlaying filters or using a complex system of moving mirrors. The new metamaterial sensor skirts both of these issues. Each tiny section of the detector consists of a pattern of gold sitting on top of a lithium niobate crystal. This crystal is pyroelectric, meaning that when it gets hot, it creates an electrical charge.
Like shaving a piece of cheese off a block, engineers at SRICO use an ion beam to peel a slice of crystal just 600 nanometers thick. This technique eliminates potential defects in the crystalline structure, which reduces background noise. It also creates a thinner slice than other approaches, allowing the crystal to heat up more quickly. Ordinarily, this crystal is so thin that light would simply travel through without being absorbed. However, the researchers tailor the top layer of gold into a pattern that combines with the properties of the crystal to cause the pixel to absorb only a specific range of electromagnetic frequencies, removing the need for separate filters. When the crystal heats up and generates an electric charge, the gold then does double duty by carrying the signal to the detector's amplifier, eliminating the need for separate electrical leads. "These designs allow this technology to be 10 to 100 times faster than existing detectors because the heat is created directly by the crystal," said Jon Suen, a postdoctoral associate in Padilla's laboratory. "This lets us create devices with fewer pixels and also presents the ability to sweep the detector across an area or capture images in motion." "This is such a good marriage of technologies," said Vincent Stenger, an engineer at SRICO and coauthor of the paper. "Working with Duke has been one of the most ideal situations I've had with technology transfer. We can focus on making the material and they can focus on the device structure. Both sides have been contributing with a clear product in mind that we're now working on marketing." The researchers can fabricate the device to detect any specific range of electromagnetic frequencies simply by redesigning the details of the gold pattern. Stenger and his colleagues at SRICO have already created a single-pixel prototype as a proof of concept. They are currently working to find funding from industry investors or possibly a follow-on government grant.
The researchers are optimistic, as their device has many advantages over existing technologies. Its fast detection time would allow it to quickly scan over an area while looking for methane or natural gas leaks. The simplicity of its design makes it lightweight enough to carry into fields to assess the health of agricultural crops. "You could even make this into a low-cost lab instrument for spectroscopy for medical samples," said Padilla. "I'm not sure what the eventual price point would be, but it'd be a lot less than the $300,000 instrument we currently have in our laboratory."
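The speed claim has a simple physical intuition: a pyroelectric pixel's response time is roughly its heat capacity divided by its thermal conductance to the surroundings, and the heat capacity scales with crystal thickness. The sketch below is a back-of-the-envelope estimate, not taken from the Optica paper; the density and specific heat are textbook values for lithium niobate, while the heat-loss coefficient and the 60-micrometer comparison thickness are illustrative assumptions.

```python
# Back-of-the-envelope model: thermal response time of a pyroelectric slab.
# tau = C / G, where C is the heat capacity per unit area and G is the
# thermal conductance per unit area to the surroundings.

def thermal_time_constant(thickness_m,
                          density=4640.0,       # LiNbO3 density, kg/m^3 (textbook value)
                          specific_heat=633.0,  # LiNbO3 specific heat, J/(kg*K) (textbook value)
                          g_per_area=1000.0):   # assumed heat loss, W/(m^2*K)
    """Return the thermal time constant (seconds) of a thin crystal slab."""
    heat_capacity_per_area = density * specific_heat * thickness_m  # J/(m^2*K)
    return heat_capacity_per_area / g_per_area

tau_thin = thermal_time_constant(600e-9)   # the 600 nm slice described above
tau_thick = thermal_time_constant(60e-6)   # an assumed conventional ~60 um wafer
print(f"600 nm slice: tau ~ {tau_thin * 1e3:.2f} ms")
print(f"60 um wafer:  tau ~ {tau_thick * 1e3:.0f} ms")
print(f"speedup from thinning alone: ~{tau_thick / tau_thin:.0f}x")
```

Whatever value is assumed for the heat loss, the ratio of the two time constants equals the ratio of thicknesses, so thinning a wafer from tens of micrometers to 600 nanometers accounts for roughly the factor of 10 to 100 the researchers quote.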


News Article | April 26, 2017
Site: www.eurekalert.org

The Howard Hughes Medical Institute's (HHMI) Medical Research Fellows Program has selected 79 talented medical and veterinary students to conduct in-depth, mentored biomedical research. Fifty-three percent of the awardees are female, the greatest representation of women in the program to date. Starting this summer, each fellow will spend a year pursuing basic, translational, or applied biomedical research at one of 32 academic or nonprofit research institutions across the United States. "The Med Fellows Program allows exceptional MD, DVM, and DDS students to effectively shift course and conduct rigorous research at top institutions throughout the country," says David Asai, senior director in science education at HHMI. "It's an extraordinary opportunity for future physicians, veterinarians, and dentists to explore the intersection of medicine and scientific discovery, and we hope that each student comes away further empowered to pursue a career as a physician-scientist." Now, 28 years after the Med Fellows Program was first launched, it has helped more than 1,700 medical, veterinary, and dental students establish a foothold in the research world. In this year's group, 18% of the fellows are from minority groups typically underrepresented in the biomedical sciences, and seven fellows will continue their research for another year. Tolu Rosanwo, a second-year fellow and medical student at Case Western Reserve University School of Medicine, says the program is a gift, but one that left her wanting more. "I couldn't leave just as my research was starting to show promise," she says. "I'm still intrigued by my initial question, and I want to see it through." That initial question dates back to Rosanwo's childhood, growing up with two siblings with sickle cell anemia. Her curiosity about what caused them to be sick turned into a committed desire to understand and contribute to a treatment for the disorder.
Now, in the laboratory of George Daley, Dean of Harvard Medical School and an alumnus of the HHMI Investigator Program, she's trying to tackle that question. "An important and profound place to be is in between science and patients," she says. "I want to be a physician whose patient care is informed by research, and vice versa." Anna Cheng, a first-year fellow and current medical student at the University of South Florida Morsani College of Medicine, started dabbling in the scientific method as a high school student. Science had always interested her, but when her best friend and her godmother found themselves in a fight against cancer, Cheng decided to narrow her scientific focus. "My best friend was diagnosed with leukemia and my godmother with ovarian cancer. I wanted to understand why - to figure it out," she says. "Yes, I was interested in cancer research, but I had personal factors that really drove me." During her undergraduate studies at Duke University, Cheng continued to make time for lab research, fitting it in over summers and in between coursework. And though she valued the experiences, the fleeting glimpses of bench time only whetted her appetite for more. The Med Fellows Program, she says, provided her the opportunity for more sustained exposure to research. "I feel so fortunate, because now I get to pursue a project for an entire year," she says. After a thoughtful pause, she amends her statement. "But the program's experience isn't really just a year. It's something that will serve me well for the rest of my career." The Med Fellows Program takes a multilevel mentoring approach to help incoming fellows get off to a strong start, make new connections, and access a network of support throughout their fellowship year. Various meetings bring the fellows together to connect with newly minted Med Fellow alumni, early-career faculty, and senior investigators to participate in seminars and learn from physician-scientists at various career stages.
The most direct form of support comes from each fellow's mentor. Cathy Wu, an alumna from the early days of the Med Fellows Program and associate professor at the Dana-Farber Cancer Institute, will be mentoring her third med fellow this fall. "The fellows are such a terrific bunch - they're brimming with enthusiasm, super smart, and eager to learn," Wu says. As someone who took great inspiration from her own mentors as a student in the program, Wu emphasizes that the mentor-mentee relationship is a crucial part of learning how to approach investigation. "Part of the Med Fellows Program is getting a sense of the opportunities and resources available - having the latitude to explore and learn about the investigative process. When I was a fellow, the program helped me cement research as part of my medical career," she says. "I'm eager for these students to have their year, too." In collaboration with HHMI, five partners - the American Society of Human Genetics, Burroughs Wellcome Fund, Citizens United for Research in Epilepsy, Foundation Fighting Blindness, and Parkinson's Foundation - will fund eight of the 79 aspiring physician- and veterinarian-scientists, bringing the program's total investment to $3.4 million. The Howard Hughes Medical Institute plays an important role in advancing scientific research and education in the United States. Its scientists, located across the country and around the world, have made important discoveries that advance both human health and our fundamental understanding of biology. The Institute also aims to transform science education into a creative, interdisciplinary endeavor that reflects the excitement of real research. HHMI's headquarters are located in Chevy Chase, Maryland, just outside Washington, D.C.


News Article | April 24, 2017
Site: www.eurekalert.org

(Duke University) Three years of fracking has not contaminated groundwater in northwestern West Virginia, but accidental spills of wastewater from fracked wells may pose a threat to surface water, according to a study led by scientists at Duke University. The scientists used a broad suite of geochemical and isotopic tracers to sample for contaminants in 112 water wells near shale gas sites, including 20 wells that were sampled both before and after fracking began.


News Article | April 17, 2017
Site: www.forbes.com

What has the impact been of utilizing fracking for extracting natural gas from shale? originally appeared on Quora: the place to gain and share knowledge, empowering people to learn from others and better understand the world. Answer by Ava Mohsenin, studies Economics & Environmental Studies at McGill University, on Quora: The environmental effects of shale gas vary widely in both importance and risk level. First, many say that burning natural gas emits fewer greenhouse gases per unit of energy than burning alternative fuels like oil or coal; however, this may not hold true over the full life cycle of natural gas, especially once extraction is taken into account. Second, another key environmental impact is the amount of water needed to access shale gas through hydraulic fracking. Estimates vary, but one study from Duke University found that 250 billion gallons of water were used to extract unconventional shale gas and oil from hydraulically fractured wells in the United States between 2005 and 2014. During the same period, the fracked wells generated about 210 billion gallons of wastewater. Injecting such vast amounts of water into the earth can also cause minor earthquakes, and greater-magnitude ones could occur if there is a pre-stressed fault in the same location. Another environmental impact is the risk of “slickwater” (a blend of water and added chemicals to improve viscosity) containing harmful chemicals and contaminating water under the ground or migrating upwards through aquifers. This contamination at the development and production stages is extremely dangerous: deep groundwater has a much higher salinity than shallow groundwater, which is fresh, and the two do not mix naturally. In the process of drilling, one must be aware of the various aquifers present so the fresh groundwater does not become contaminated by the deeper saline water.
The construction of wells in the development stage is the most common route to groundwater and ecosystem contamination when the wells are poorly built. A poorly constructed well gives fluids ample opportunity to contaminate groundwater and the surrounding environment through fractures in the rock. These are just a few of many plausible negative environmental consequences of extracting shale gas, and some of the most significant impacts arise from the construction of the wells themselves, including accidental spills of oils, drilling muds, and potentially toxic “slickwater”. Besides strictly environmental impacts, there are social ones too. The need for large volumes of water over short time periods for hydraulic fracking can cause stress at the coldest, driest, and most critical times of the year for communities surrounding fracking sites. There can also be spills associated with storing, mixing, and pumping slickwater, meaning the chemicals involved could infiltrate groundwater and soil, causing potential health issues. Drilling rigs running all day and night create noise heard up to 4 km away, and volatile organic compounds, which contribute to smog, can leave odors up to 600 m from a fracking platform. Shale gas also indirectly affects human social systems. For example, the province of Quebec could receive anywhere from $71 million to $475 million over the next 25 years from shale gas revenues, but this is highly debated, as some argue it could cost taxpayers up to four times the revenue brought in. This is an important point regarding the distribution of economic benefits. In Quebec, natural gas accounts for only 13% of the energy consumed, and over one third of that is from commercial-sector consumption (MEI). Therefore, the main beneficiaries of exploiting shale gas in Quebec won’t be residential households, who rely predominantly on electricity, but provincial commercial enterprises and businesses.
Depending on the profitability, productivity, and efficiency of natural gas production, the ability to export shale gas in the medium-to-long term would extend the beneficiaries nationwide, all of which is unknown and subject to the Precautionary Principle. Another social consideration is the rise of shale boomtowns: local economies benefit, but the booms degrade local infrastructure, and inflationary pressures can make some regions unaffordable for residents.
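The Duke water figures quoted above lend themselves to a quick consistency check. The snippet below simply works out the wastewater-to-injected-water ratio and an average annual draw; the volumes are those cited in the answer, while the even ten-year averaging window is an assumption for illustration.

```python
# Quick arithmetic on the Duke University water-use figures (2005-2014).
GALLONS_INJECTED = 250e9    # water used for fracking, per the cited study
GALLONS_WASTEWATER = 210e9  # wastewater generated over the same period

ratio = GALLONS_WASTEWATER / GALLONS_INJECTED
years = 10  # assumed averaging window covering 2005-2014
avg_annual_use = GALLONS_INJECTED / years

print(f"wastewater is ~{ratio:.0%} of the injected volume")
print(f"average use: ~{avg_annual_use / 1e9:.0f} billion gallons per year")
```

In other words, roughly 84 cents of wastewater come back for every dollar's worth of water pumped down, which is why spill management matters as much as sourcing the water in the first place.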


News Article | April 19, 2017
Site: phys.org

Verily Life Sciences partnered with Duke University School of Medicine and Stanford Medicine for the Project Baseline study intended to collect broad health data from approximately 10,000 participants over the course of at least four years. People in the study will make routine clinic visits, complete surveys, and use wrist-worn devices packed with sensors to gather biological data. "The Project Baseline study has the opportunity to significantly influence our current body of knowledge by better understanding the indicators of wellness," American Heart Association chief executive Nancy Brown said in a statement. "The outcome of this study could inspire a new generation of tools that are geared towards disease prevention versus just diagnosis and treatment." Last week, Verily unveiled a wrist-worn "Study Watch" designed to gather complex health data in clinical studies. Study Watch is meant for research and will be put to work in several studies including a multi-year study to identify patterns in the progression of Parkinson's disease, according to a blog post by Verily team members. Environmental, genetic, and molecular information will be included in data collected in the baseline study as part of a "journey to comprehensively map human health," according to Verily. "Currently, most of what we see as treating physicians are short snapshots in time of an individual and primarily after they are already ill," said Sanjiv Sam Gambhir, chair of radiology at Stanford Medicine and director of the Canary Center for Cancer Early Detection. "By focusing on the health of a broad population, we can eventually have a meaningful impact on the well-being of patients around the world." Verily was part of the Google X laboratory known for big-vision projects such as self-driving cars and internet service delivered by high-altitude balloons, but was spun into an independent unit at Google-parent Alphabet in mid-2015. 
Verily works on diabetes with the French pharmaceutical group Sanofi and has another partnership in bioelectronic medicine with the British firm GlaxoSmithKline.


News Article | April 17, 2017
Site: news.yahoo.com

When Elliott, now 19, was a junior in high school, here’s what an average day looked like: He’d wake up at 5:30, shower, get dressed, eat a quick breakfast, and then ride his bike to the bus stop, which was marked by a roughly built wooden hut. Once there, he’d reach up to the roof of the hut, where he’d stashed a bowl and a baggie of marijuana. “I hate school, so I always smoked right before the bus picked me up at 6:20,” Elliott tells Yahoo Beauty. “It calmed me down.” In the afternoon, he’d finish up his homework and then head out onto the back porch to 420, assured that no one other than his single mom would see him, since he lived on a dead-end street. “My mom doesn’t really care,” Elliott says. “She’d rather I smoke than do heroin.” His love affair with weed kicked off on Halloween night in 2014, when Elliott, then 16, lit up for the first time with friends. Although he didn’t feel anything, he was still curious, so he tried it again. And the second time, he got high. “It was pretty great,” Elliott says. “Weed is the best drug because you are in control of yourself and what’s going on.” Elliott claims he hasn’t noticed any negative side effects from marijuana use — and that he could stop anytime he wanted. Meanwhile, there’s Liz, now 18, who started smoking weed regularly at the age of 12 as a coping mechanism, as she puts it, for the upset she felt around her parents’ divorce. “At first I kind of just felt, like, very… relaxed, spacey,” she says. “After a while, after I started using day after day, I kind of just felt more lethargic. No motivation for anything. Very apathetic. And I felt, like, a lot of paranoia along with that.” By her early teens, Liz had developed a pot habit — not to mention an eating disorder and a self-harming problem — severe enough to land her in a residential treatment program, the Newport Academy. 
“I realized that I had a problem with marijuana when I found that I couldn’t be comfortable when I was sober,” she tells Yahoo, adding that the softening marijuana laws across the country are sending what feels to her like “a mixed message” about the safety of weed. Many Americans feel similarly conflicted about marijuana and its effects on physical and mental health, caught somewhere between Elliott and Liz. According to a new exclusive Yahoo News/Marist Poll, a slight majority of Americans — 51 percent — think using marijuana poses a health risk, while 44 percent think it does not, and 5 percent remain unsure. When it comes to teens, that narrative has begun to shift, due to a series of studies pointing out that the vulnerable, still-developing brains of adolescents do not mix so well with marijuana. But definitive research about how cannabis specifically affects teens still remains frustratingly elusive, as for every study out there suggesting that pot has deleterious effects, another analysis affirms its harmlessness. In fact, the lack of conclusive answers is what triggered the National Institute on Drug Abuse (NIDA) to recently embark upon a large-scale longitudinal study that will track 10,000 adolescents into early adulthood to look at how use of illicit substances, including marijuana, affects their developing brains and shapes their lives. In the meantime, Yahoo Beauty spoke with top researchers to get as clear a picture as possible of what we do know about weed and the teenage brain. First, a quick synopsis of how marijuana operates: The body’s endocannabinoid system regulates intercellular communication via cannabinoid receptors in the nervous system and brain. “The endocannabinoid system is the master regulator of homeostasis,” Gregory Gerdeman, assistant professor of biology at Eckerd College, tells Yahoo Beauty. 
“If our electrical system gets too excited, it dampens it down; if cells are moving sluggishly, it speeds things up.” When an individual uses marijuana, its THC molecules attach to these cannabinoid receptors, altering their activity and triggering a blissed-out sensation, as well as potential paranoia and anxiety. (CBD molecules, also found in weed, give users a mellow feeling that counteracts the high and are the main source of marijuana’s medicinal benefits.) Cannabinoids are intimately involved in the growth and development of the brain, guiding the wiring of the neural network. And just as a house under construction is not as solid as a completed building, the teen brain is more sensitive than its adult counterpart. “In this period of critical neural vulnerability, exposure to things like THC can change the trajectory of how the brain develops over time,” Staci Gruber, director of the Cognitive and Clinical Neuroimaging Core at McLean Hospital in Belmont, Mass., tells Yahoo Beauty. Or, as NIDA director Nora Volkow, MD, puts it, the fully grown-up brain has a degree of resiliency that younger brains lack, so “marijuana may have unique, negative effects that may not be present in an adult.” The pothead slacker spacing out in class is a common stereotype. And evidence does suggest that herb might diminish intellectual capacity. “When individuals smoke marijuana, we see changes within the prefrontal cortex, which is a critical part of the brain right behind your eyebrows, responsible for things like decision making, consciousness, and abstract reasoning,” Gruber says. During adolescence, the brain eliminates unneeded neurons so that it can operate more efficiently, in a process called synaptic pruning. “When a child is born, he or she has many more neurons than an adult brain,” Volkow says. “It’s almost like a sculpture, where the artist chips away at the stone until it [forms the desired] shape. 
[The brain] gets rid of some neurons and creates connections that maximize the functions that a particular child is going to need in order to be successful as an adult.” Marijuana disrupts glutamate receptors, neurotransmitters involved in synaptic pruning; as a result, extraneous neurons may not be effectively phased out and can drag down our cognitive capacity, affecting everything from memory to executive control. Volkow likens it to the operation of an airport. “The more connections you have, the more communication there’s going to be from one place to another. But too many connections clog the system,” she says. “Of course, too few connections also interfere with your ability to transfer people place to place — and studies have shown that people who consume large quantities of marijuana during adolescence have far fewer connections into the hippocampus, which is one of the main brain regions involved with memory and learning.” In particular, says John Kelly, MD, professor of psychiatry in addiction medicine at Harvard Medical School and director of the Recovery Research Institute, “it can impact memory consolidation, which is the encoding of short-term information into long-term memories. We learn by contextualizing new information and relating it to other memories in our memory bank. If the information hasn’t been properly encoded, we won’t be able to draw upon it as a resource.” Marijuana can also decrease myelin, a protective coating around axons of neurons that increases the speed at which electrochemical impulses travel in the brain. “If you don’t have enough myelin, you may be scatterbrained and suffer from attention problems,” Kelly says. “Basically, you’re on the slow train.” A study from Northwestern Medicine found that young adults who smoked marijuana daily for about three years as teens had an abnormally shaped hippocampus and performed poorly on long-term-memory tasks — two years after they stopped using the drug. 
Compared with a control group, they scored 18 percent worse on a test of memory processes used for daily problem solving and to sustain friendships. And research out of Duke University linked long-term marijuana use before age 18 to a lasting drop in IQ. At age 38, subjects scored an average of eight points lower compared with their results when they were 13 years old. Yet Gerdeman cautions against jumping to conclusions. “The human brain is a plastic structure that undergoes small morphological changes with time, learning, experience, stress, trauma, meditation, exercise, medication, and yes, cannabis,” he says. “I’m not going to tell you there is no reason to be concerned, but these findings should be viewed with nuance.” He points out that some studies portray a cautionary tale based on brain imaging without showing a corresponding functional deficit, while others fail to control for influential variables like binge drinking. It’s not only intellect that bears the brunt of ganja use at a young age. Research suggests that pot can affect EQ, or emotional intelligence, as much as IQ, thanks to the fact that heavy users have trouble pulling up memories that can inform current decision making. When navigating a relationship or social interaction, “your prefrontal cortex will scan the rest of the brain to see if you have been exposed in the past to a similar situation that can guide you or predict what’s going to happen,” Volkow says. And if someone doesn’t have ready access to that feedback, he or she is at a disadvantage. What’s more, brain-imaging research has shown that THC targets the prefrontal cortex, the area associated with emotional regulation and social skills. “The prefrontal cortex is the brain’s brake system; it triggers us to look before we leap,” Kelly says.
“Inadequate synaptic pruning in this region can increase impulsivity and disinhibition.” When a person’s prefrontal cortex isn’t operating at its optimal level, he or she might react inappropriately, from losing his or her temper at a friend to engaging in unprotected sex. On the other hand, research from the University of Kentucky, Lexington supports Elliott’s experience: Lonely teens who hit herb had higher levels of self-worth, better mental health, and a lower risk of depression than those who abstained. It may reek of reefer madness, but some of the most alarming research points at a link between marijuana use and psychosis. According to a recent paper published in the journal Biological Psychiatry, daily pot use in teens can increase the risk of psychosis from 1 percent to 3 percent. And a study in the American Journal of Psychiatry found that for each year that adolescent males engaged in regular marijuana use, their chances of experiencing psychotic symptoms surged by 21 percent, even a year after they’d stopped using the drug. “Some people may have a genetic propensity for mental illness like schizophrenia that only manifests under certain conditions,” Kelly says. “In these individuals, chronic exposure to THC over time might trigger a switch that turns on the genes that promote psychosis.” Again, there’s debate about whether weed is truly at fault. A Harvard study failed to find a causal link between schizophrenia and cannabis use, suggesting instead that family history was the deciding factor; and a review in the journal Schizophrenia Research revealed that although cannabis use is increasing in the U.K., rates of schizophrenia and psychosis are falling. There’s also the chicken-and-egg question — people prone to psychiatric disorders might be more likely to turn to substances in the first place. 
Although the matter is still up for debate, Gerdeman has found that “teens with preexisting signs of psychotic tendencies or genetic predispositions who go on to use cannabis heavily are at a greater risk of developing schizophrenia.” Is It Addictive or Not? While it’s true that pot’s got nothing on harder drugs like heroin and cocaine, some people do get hooked — and the risk is greater for teens. “Approximately 9 percent of individuals who are exposed to marijuana will become addicted, but if you take marijuana as a teenager, it goes up to 19 percent,” Volkow says. “And 50 percent of teens that use marijuana on a daily basis will become addicted.” Marijuana activates a part of the brain called the nucleus accumbens, which is a key player in the brain’s reward circuitry, and this can lead to a dependency. “The earlier a person’s brain is exposed to chemical substances, the likelier it is to become sensitized to them,” Kelly says. “When you prime the pump during adolescence, the neurons become adapted to the drug and are altered in such a way that they start to expect its presence.” While the jury is out on how harmful marijuana actually is for adolescents, the majority of researchers agree that the two biggest risk factors are the age of onset of use and the frequency of use. Basically, the younger someone starts burning one down and the more often they get blazed, the greater the potential harm in terms of brain damage, mental illness, and addiction. As Gruber says, the message for teens should be, “Just say no for now. It’s worth the wait.” As for Elliott and Liz, they both report that they’re doing well, although their relationship with weed is very different. Elliott, now a host at a high-end restaurant, still wakes and bakes. “I could quit any day if I wanted to, but I don’t want to,” he says.
“Parents are so hard on their kids about it, but it’s not a terrible thing.” Liz, on the other hand, has steered clear of marijuana since rehab and is focused on graduating from high school. “That’s a really big thing that I never thought I would do,” she says. “I’m thrilled about my future … and I have more faith in myself … and can advocate for myself in ways I couldn’t before. … I don’t need to use marijuana in order to be the person that I want to be. I can just be that person authentically.” Read more from the Yahoo Weed & the American Family series: American families defending pot as never before, Yahoo News/Marist Poll finds; How Republicans and Democrats in Congress are joining forces to defeat Sessions’ war on weed; Cannabis advocate Melissa Etheridge: ‘I’d much rather have a smoke with my grown kids than a drink’; These mothers of suicides don’t think marijuana is harmless; ‘Cannabis has made me a better parent’: One mom’s confession


News Article | May 3, 2017
Site: www.futurity.org

Conservation projects that protect forests and encourage a diversity of plants and animals provide a variety of benefits to humans. But a new study suggests that improved human health is not among those benefits—at least when health is measured through the lens of infectious disease. The findings, published in Philosophical Transactions of the Royal Society B, are based on an analysis of the relationship between infectious diseases and their environmental, demographic, and economic drivers in dozens of countries over 20 years. “I’m a firm believer that insights from ecology can help us manage disease and protect species,” says coeditor Kevin Lafferty, a senior ecologist with the US Geological Survey and a principal investigator at the University of California, Santa Barbara’s Marine Science Institute. “But ecological systems are too complicated to expect one-size-fits-all solutions.” The findings show that increased biodiversity―measured as the number of species and amount of forested land―is not associated with reduced levels of infectious disease—and in some cases, disease burdens actually increased as areas became more forested over time. “There are a lot of great reasons for conservation, but control of infectious disease isn’t one of them,” says lead author and parasite ecologist Chelsea Wood, assistant professor in the School of Aquatic and Fishery Sciences at the University of Washington. “We’re not going to improve public health by pushing a single button. This study clearly shows that―at the country level―conservation is not a disease-control tool.” The researchers considered forestation, biodiversity, wealth, temperature, precipitation, and urbanization and found that any of those factors on their own could have a positive, negative, or neutral effect, depending on the disease.
By far the most consistent finding, though, was this: The wealthier the country, the less disease; and the more wealth increased, the lower the burden of infectious disease. Further, increasing urbanization actually reduces disease, most likely because cities bring people closer to medical care and give them greater access to vaccinations, clean water, and sanitation. So even though cities crowd people together, the net benefit of their services results in reductions of infectious disease. “This paper has some good news that is rarely part of the story in our field,” Lafferty says. “Our analysis shows across the board—with just a couple of exceptions—that the burden of infectious diseases has diminished considerably over the last two decades and that is mostly due to increased wealth and urbanization.” Researchers used the Institute for Health Metrics and Evaluation’s Global Burden of Disease database, a massive, worldwide effort to document premature death and disability from hundreds of diseases and injuries from 1990 to the present. The study compares data on 24 infectious diseases―including malaria, dengue, rabies, typhoid, tuberculosis, and leprosy―with separate, published data on population density, wealth, bird and mammal species richness, forest cover, precipitation, and other environmental measures to analyze the effects these factors had, if any, on disease burden per country. Most conservation decisions are made at the country level, so researchers focused at that scale when analyzing whether conservation could be used as a tool for improving public health. Over the 20-year period, they saw no relationship between biodiversity (number of species present) and the overall burden of infectious disease. But for each individual disease, there was a unique set of drivers that were important in deciding whether burden increased or decreased over time.
For example, as rates of precipitation went up, so did the burden of “geohelminths”―a group of gut parasites that includes hookworm, whipworm, and roundworm. Together, the geohelminths affect 1.5 billion people. Moist soil is an ideal environment for the development of these worms. Humans can become infected when they contact or accidentally ingest contaminated soil―for example, from unwashed vegetables. As rates of precipitation increase with climate change, this public health threat should be acknowledged and accounted for, the researchers say. The authors hope the disease-specific information included in the study reveals pathways toward effective control, and helps country officials avoid inadvertently exacerbating existing public health problems. “I hope this study encourages people to explicitly acknowledge the potential disease-related risks and benefits of conservation projects,” Wood says. “The absolute last thing we want to do is a conservation project that gets people sick.” This paper is the concluding piece in an entire special edition dedicated to exploring whether conservation promotes or hinders infectious disease control. The edition’s coauthors convened about two years ago to explore all sides of this controversial question, and the resulting papers examine specific diseases such as malaria, Lyme disease and schistosomiasis, as well as broader topics of policy and economics. “What’s really unique about this issue is that we have gone all the way from theory articles that look at how biodiversity changes might affect disease to multiple field studies of various conservation interventions at different scales to an examination of the global drivers of biodiversity change,” says lead editor Hillary Young, assistant professor in UCSB’s ecology, evolution and marine biology department. 
“We wanted to present cases for viable and useful public health interventions.” “There is no one-size-fits-all lever, where improving access to healthcare is going to affect all infectious diseases,” Young adds. “This body of work highlights the need to understand the nuances that make biodiversity and conservation effective levers.” Alex McInturff of the University of California, Berkeley, DoHyung Kim of the University of Maryland, and researchers from Duke University are coauthors of the study. The work was funded by the Michigan Society of Fellows and the Department of Ecology and Evolutionary Biology at the University of Michigan, along with the authors’ institutions and agencies.


News Article | April 17, 2017
Site: www.futurity.org

New research identifies ways to prompt low- and moderate-income households to save more of their tax refund. Motivational prompts to save tax refunds and suggested savings amounts for the tax refund can increase saving among low- and moderate-income households, finds a new experimental study from the Brown School at Washington University in St. Louis. The study was part of the Refund to Savings Initiative, which uses behavioral economics to encourage tax filers to save their tax refund. The collaboration is based on the idea that tax time is an opportune moment to encourage low- and moderate-income households to save, as the tax refund is often the largest single payment these households receive all year. The experiment involved households that filed income tax returns with an online preparer and chose to receive their refund electronically. These filers were randomized into eight treatment groups, which received different combinations of motivational savings prompts and suggested shares of the refund to save—either 25 percent or 75 percent—and a control group, which received neither prompts nor suggested savings amounts. The study shows that treatment group members were more likely to contribute at least some of their refund to a savings account and more likely to split their tax refund. The study appears in the Journal of Consumer Affairs. “Our study adds to the growing body of research on interventions that center on tax filing as a potential way to increase saving among low- and moderate-income households,” says lead author Michal Grinstein-Weiss, professor at the Brown School at Washington University in St. Louis, associate director of the Center for Social Development, and founding director of the Envolve Center for Health Behavior Change. “These interventions are promising since they have relatively low costs and are scalable to a broad population,” she says. 
Additionally, the study suggests that the presentation of choices—which behavioral scientists call “choice architecture”—can significantly influence outcomes. The study found that savings anchors, which are suggestions that filers save a certain percentage of their refund, made a difference in savings behavior, unlike messages prompting filers to think about emergencies, retirement, or special purchases. The anchors partly aimed to encourage tax filers to split their refund between a savings account and other accounts, and the size of the anchor was strongly associated with the amount placed into savings for those who split the refund. For example, the 50 percentage-point difference in the two suggested savings anchors—25 percent and 75 percent—is associated with a 30 percentage-point difference in the actual amount of the refund contributed to a savings account, says Grinstein-Weiss. The anchors also increased savings rates and savings amounts across the tax-filing population in general. In contrast, specific savings prompts for general, emergency, and retirement savings did not raise contributions to savings, and in some cases actually reduced contributions. Co-authors on the study are from the Brown School; the Brookings Institution; the Pew Charitable Trusts; and Duke University. The Refund to Savings Initiative comes from the Center for Social Development at the Brown School, Duke University, and Intuit Inc., the maker of TurboTax tax preparation software.
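The anchor arithmetic reported above can be made concrete with a short, hypothetical Python sketch. The group-level savings rates used here (10 percent and 40 percent) are illustrative assumptions chosen only to match the reported 30 percentage-point gap; they are not figures from the study.

```python
# Back-of-the-envelope check of the anchoring effect described above
# (hypothetical illustration; the study reports an association, not a fitted line).

def implied_slope(anchor_low, anchor_high, saved_low, saved_high):
    """Percentage-point change in refund saved per point of suggested anchor."""
    return (saved_high - saved_low) / (anchor_high - anchor_low)

# The two suggested anchors were 25% and 75% of the refund; the article reports
# a 30 percentage-point difference in the share actually saved. Assume
# (hypothetically) the 25% group saved 10% and the 75% group saved 40%.
slope = implied_slope(25, 75, 10, 40)
print(slope)  # 0.6: each extra point of suggested anchor, ~0.6 points more saved
```

On these assumed numbers, the suggestion moves actual behavior at a bit more than half its own magnitude, which is why the size of the anchor matters so much.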


News Article | April 27, 2017
Site: www.wired.com

“What if you could type directly from your brain?” Regina Dugan said, as the same words appeared on the towering screen behind her, one digital character at a time, a cursor leading the way. “It sounds impossible,” she continued, taking another measured step across the stage. “But it’s closer than you may realize.” Dugan once oversaw Darpa, the visionary research arm of the US Department of Defense. Now, after a stint at Google, she oversees a blue-sky lab at Facebook called Building 8. Her keynote speech last week at the company’s annual developer conference marked her public debut as a Facebooker. At Google, she worked on modular smartphones and ways of converting our immediate surroundings into 3D virtual worlds. At Facebook, she and her team are building, among other things, a computer interface for the human brain. It was a powerful speech, especially when she showed a short video of a woman with Lou Gehrig’s disease, or ALS, who can already control a computer tablet keyboard with her thoughts. Later, Dugan extolled the importance of Facebook’s “terrifying” effort to build something that has never been built before. “Why do we sign up to be terrified each day?” she said. “That is the price we pay for the privilege of making something great.” But before you buy into this very Silicon Valley message too completely, you should realize that Dugan’s project, like so many of her projects at Darpa and Google, is truly a leap of faith. And, in some ways, her speech misrepresented what’s possible. “On the one hand, it is very exciting that these ideas are being discussed,” says Miguel A.L. Nicolelis, the Duke University neuroscientist whose lab has been at the center of brain-machine interface research since the late 1990s. 
“But the announcement was more like science fiction than something grounded in physical reality.” Facebook wants to outrace the competition to the next big computing platform, whether it’s virtual reality, augmented reality, or now machine-brain interfaces. Apple and Google beat Mark Zuckerberg and company to the smartphone, and he doesn’t want to lose again. But as always in Silicon Valley, there are other motivations at work here. Facebook is also a company that wants to be seen as the kind of innovator that will do good for the world, especially at a time when so many people are questioning the company’s impact on public discourse. Dugan said that within “a few years,” her researchers aim to produce a system that lets people type with their thoughts three times faster than you can type with a smartphone keyboard, a much faster rate than what she demonstrated in the video of the woman with ALS. This kind of technology, she said, will not only help people with disabilities but allow us all to use computing devices while continuing to interact with people here in the real world. But Robert Riener, a professor of Sensory-Motor Systems in the Department of Health Sciences and Technology at ETH Zurich and another pioneer in this field, believes that such tech is more like a decade away—if it happens at all. Silicon Valley has a history of pushing technologies forward faster than academia, but it also has a history of overestimating how quickly it can move. Both are worth considering in the wake of Dugan’s speech. Facebook declined our efforts to discuss this research with Dugan, but clearly, more is at play here than just science. Dugan wants to do great things by risking failure, but she and her colleagues are also working to foster the belief that Facebook will do these great things faster than anyone else.
Across Silicon Valley, as strange as it may sound, a race to build a human-brain interface is already underway, and Facebook must compete for mindshare and talent. Just a week before Dugan’s speech, Tesla founder Elon Musk unveiled his new machine-brain interface company, Neuralink, and earlier this year, Silicon Valley entrepreneur Bryan Johnson unveiled a similar effort, called Kernel. Like Facebook, both of these new companies are promoting their efforts to achieve things that are impossible today. “All this is marketing,” says Nicolelis, who oversaw the academic work of two scientists who are part of Musk’s new startup, including the Neuralink CEO. Some scientists believe that at least some of the tech promoted by these companies will never be possible, even setting aside the ethical quandary of whether humans should be seeking to turn their brains into machines at all. “Elon believes that the brain works very much like a computer does,” Pascal Kaufmann, a neuroscientist who has explored similar research and is now the founder of an artificial intelligence company called Starmind, says of Musk’s recent announcement. “Whilst Tesla, Space-X and many other endeavors are possible with a lot of efforts, sweat and cash, the brain code cannot be unlocked even with unlimited cash resources as long as the underlying theoretical neuroscientific foundations are missing or just plain wrong.” Johnson and Musk aim to implant devices inside the skull that can shuttle information between the brain and outside machines. But as Nicolelis says, such untested tech poses dangerous risks for healthy human beings. “I can’t imagine that anyone, on ethical grounds, would allow a healthy human being to be implanted with devices like that,” he says, echoing what other neuroscientists have said. The brain-machine interfaces that scientists like Nicolelis have long explored are quite different. 
They aim to build implants that can help treat people with epilepsy, Parkinson’s, and other maladies. These implants can gather data about these conditions and perhaps even alleviate symptoms through what’s called deep brain stimulation. Johnson and Musk say they will begin with this kind of work, but they also see it as a path to devices implanted in healthy brains. To Facebook’s credit, it’s not hyping the possibility of implants. Instead, Dugan says, her team, which now spans more than 60 scientists and technologists, is exploring interfaces that could read brain activity from outside the skull. But according to Nicolelis and other neuroscientists, the kind of technology she describes may not be possible for ten or even twenty years, if at all. Facebook hopes to use sensors that can read brain activity through optical imaging technology, but reliably taking such readings from such a distance is not feasible today, let alone the extreme difficulty of actually interpreting those signals. Today, scientists understand very little about how the brain actually works. But even if listeners give Dugan the benefit of the doubt, her speech painted a picture that wasn’t quite what it seemed. She wants to build a rapid-fire non-invasive machine-brain interface for everyone. She said that this is “just the kind of fluid human computer interface needed for AR”—the digital “augmented reality” overlay on the physical world that Facebook and so many other companies see as the future. But in promoting this effort, she showed off a woman with ALS guiding a software keyboard through a brain implant—a very invasive device. What’s more, Nicolelis argues, this woman—the subject of a recent Stanford study—didn’t need an implant. She could have driven that keyboard, albeit at an even slower rate, through much simpler external devices. “It didn’t make any sense to me,” Nicolelis says of the video that Dugan slotted into her speech. 
In other words, Dugan and Facebook showed off technology that is not really representative of the work they’re doing and may even be dubious science. Confused? We don’t blame you. Just remember this: Typing with your brain is probably just as far away as you think.


Prostate cancer cells depend on signaling through the androgen receptor (AR) to grow and survive. Many anti-cancer therapies that target ARs are initially successful in patients, including a class of drugs known as CYP17A1 inhibitors, which interfere with AR signaling by blocking the synthesis of androgen. However, over time, adaptations to AR expression and function lead to treatment resistance and disease relapse. Recently, the observation that a CYP17A1 inhibitor, seviteronel, effectively treated a patient's prostate cancer without actually lowering androgen levels led researchers at Duke University to further investigate the drug's therapeutic activity. In a study published this week in the JCI, a team led by Donald McDonnell found that many CYP17A1 inhibitors also function as competitive AR antagonists, indicating a more complex and potentially more effective role for the drug in treating prostate cancer. The researchers then demonstrated that CYP17A1 inhibitors acting as AR antagonists can inhibit the growth of prostate tumor cells expressing a treatment-resistant AR mutation. These findings provide insights into a mechanism that may lead to the development of more effective therapies for treatment-resistant prostate cancer.


News Article | May 29, 2017
Site: news.yahoo.com

Dan Ariely is the author of Predictably Irrational, a favorite book around Fool HQ, and -- oh yeah -- he's also the James B. Duke Professor of Psychology and Behavioral Economics at Duke University. I was able to interview Dan about investing, information overload, and more. Want to listen instead? Download the audio version of this interview here. Rana Pritanjali: Do you think the process of investing is similar to other jobs? Dan Ariely: I think probably HR. When you're trying to interview people, you try to understand them. You're trying to understand [what their potential is] and trying to understand what they're hiding. So I think it's maybe similar to that. Rana Pritanjali: Why do losses hurt humans so much? Is this true for just a few of us, or is it broadly true? Dan Ariely: So it's true for people in general, and this is called "loss aversion." For example, imagine a coin flip where you have a 50% chance of making $1,100 and a 50% chance of losing $1,000. The expected value is positive, but we don't think of it as positive. We think, "Oh, my goodness. If I lost $1,000, I would be very miserable. If I won $1,100 I would be happy, but it wouldn't offset it, so let me not take that bet." Now, we think that the reason is evolutionary. If you think about nature, if you get something good (like you get to eat more food and so on) that's a good thing, but if you do something bad, you can die. So nature has kind of tuned us to look at the negative side because if you get a bit more food, a bit more money or whatever, there's a positive expected value but it's limited. Whereas on the negative side, you can lose a lot. So because of that we just attune more to losses. Rana Pritanjali: What do you think are some of the most important traits of successful investors? Dan Ariely: Maybe not follow your emotions. Human emotions get us to buy high and sell low, so we have to fight those. And being systematic. Looking forward rather than backward.
Thinking about what you want to do moving forward and not looking at what's happened to the stock in the past. Not being impulsive is important, and not following the herd. And then the other one, I would say, is that it's about setting up processes that help us to do things in a more systematic way. So if you basically create a system where you say, "Here are my rules. I'm going to sell when it hits this number. I'm going to buy when it hits this number. I'm going to do X, Y, and Z." If you stick to your rules, you're probably going to do OK. But what happens is that most of us create rules and then we don't stick to them, because we are tempted at the moment, so sticking to our rules is important. Rana Pritanjali: Do you believe in "information overload?" How is this affecting us as investors? Dan Ariely: Absolutely. It's very hard for us to deal with lots and lots of information. Of course, today we're getting lots and lots of information, so what do we do when we get too much information? We simplify. We use heuristics. We rely on only part of the information. On the most salient information. And that, of course, means that the most salient information is probably the information everybody else knows, as well; so we become less independent in our opinions from other investors. If everybody is in information overload, and we all do simplifications, then what happens is that we follow the simplest source of information, which is probably common to everybody. Rana Pritanjali: What human characteristics will be hardest for robots (or artificial intelligence) to replicate? Dan Ariely: I think serendipity. Randomness. Creativity. We don't know. Creativity is part of what is called an NP-complete problem. These are problems that require exponential computation time, and we don't know how to even search. We don't know what the right algorithm is. So I think creativity is going to be one challenge. Serendipity is going to be another one.
And following the rules is going to be much easier. Rana Pritanjali: In your opinion, what are some of the most important human characteristics that have led to the species' dominance? Dan Ariely: Opposable thumbs. Language. Language allows us to create tremendous advantages. The invention of the wheel. The invention of money. Memory. We have really incredible memory for ideas. Abstract thinking. And then maybe the other one is that we are inherently social animals. We care about other people. We have what is called "social utility." We can put ourselves in the position of other people. We have empathy. We care about others, which allows us to create societies that rely on each other and get tremendous benefit from each other. Rana Pritanjali: Why is it that long-term investing seems so right in theory but so very hard in practice? Dan Ariely: Actually, everything long-term is hard in practice. We have a hard time not overeating. Saving money. Not texting and driving. Washing our hands. You name it. We're in general not good about anything that requires sacrifice in the short term for the long term. Rana Pritanjali: Why is it so hard for humans to admit our mistakes? Dan Ariely: Well, a couple of reasons. First of all, we don't actually see our own mistakes all the time. We have such great cognitive storytelling ability that the moment we tell a story, we think to ourselves that this was actually the truth. We call our research center at Duke "the Center for Advanced Hindsight" because we're such good storytellers that we come up with stories that portray us in good ways, so we don't always even see our own mistakes. And then on top of that, it's about our social standing, and admitting failure also admits that we might be wrong in the future. Rana Pritanjali: Have you done any experiments on "fear of missing out?" Is this becoming more common, and what is the cause? Dan Ariely: I have not done any research on that, but I did a little video on that. (Link is here!)
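Ariely's coin-flip example earlier in the interview can be checked numerically. The sketch below computes the bet's expected value and then a simple loss-averse valuation; the 2x weighting of losses is an illustrative assumption for this example, not a figure Ariely gives.

```python
# Numerical check of Ariely's coin flip: a 50% chance of winning $1,100 and a
# 50% chance of losing $1,000 has a positive expected value, yet a loss-averse
# decision maker may still decline it.

def expected_value(outcomes):
    """Sum of probability-weighted payoffs over (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

def loss_averse_value(outcomes, lam=2.0):
    """Subjective value when losses loom lam times larger than gains.
    lam=2.0 is an illustrative loss-aversion coefficient (an assumption here)."""
    return sum(p * (x if x >= 0 else lam * x) for p, x in outcomes)

bet = [(0.5, 1100), (0.5, -1000)]
print(expected_value(bet))     # 50.0  -> objectively a favorable bet
print(loss_averse_value(bet))  # -450.0 -> feels like a bad bet, so it is declined
```

With the loss weighted twice as heavily, the same bet flips from a $50 expected gain to a subjective $450 loss, which is exactly the "let me not take that bet" reaction Ariely describes.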


News Article | May 1, 2017
Site: www.rdmag.com

Using new gene-editing technology, researchers have rewired mouse stem cells to fight inflammation caused by arthritis and other chronic conditions. Such stem cells, known as SMART cells (Stem cells Modified for Autonomous Regenerative Therapy), develop into cartilage cells that produce a biologic anti-inflammatory drug that, ideally, will replace arthritic cartilage and simultaneously protect joints and other tissues from damage that occurs with chronic inflammation. The cells were developed at Washington University School of Medicine in St. Louis and Shriners Hospitals for Children-St. Louis, in collaboration with investigators at Duke University and Cytex Therapeutics Inc., both in Durham, N.C. The researchers initially worked with skin cells taken from the tails of mice and converted those cells into stem cells. Then, using the gene-editing tool CRISPR in cells grown in culture, they removed a key gene in the inflammatory process and replaced it with a gene that releases a biologic drug that combats inflammation. The research is available online April 27 in the journal Stem Cell Reports. “Our goal is to package the rewired stem cells as a vaccine for arthritis, which would deliver an anti-inflammatory drug to an arthritic joint but only when it is needed,” said Farshid Guilak, PhD, the paper’s senior author and a professor of orthopedic surgery at Washington University School of Medicine. “To do this, we needed to create a ‘smart’ cell.” Many current drugs used to treat arthritis — including Enbrel, Humira and Remicade — attack an inflammation-promoting molecule called tumor necrosis factor-alpha (TNF-alpha). But the problem with these drugs is that they are given systemically rather than targeted to joints. As a result, they interfere with the immune system throughout the body and can make patients susceptible to side effects such as infections. 
“We want to use our gene-editing technology as a way to deliver targeted therapy in response to localized inflammation in a joint, as opposed to current drug therapies that can interfere with the inflammatory response through the entire body,” said Guilak, also a professor of developmental biology and of biomedical engineering and co-director of Washington University’s Center of Regenerative Medicine. “If this strategy proves to be successful, the engineered cells only would block inflammation when inflammatory signals are released, such as during an arthritic flare in that joint.” As part of the study, Guilak and his colleagues grew mouse stem cells in a test tube and then used CRISPR technology to replace a critical mediator of inflammation with a TNF-alpha inhibitor. “Exploiting tools from synthetic biology, we found we could re-code the program that stem cells use to orchestrate their response to inflammation,” said Jonathan Brunger, PhD, the paper’s first author and a postdoctoral fellow in cellular and molecular pharmacology at the University of California, San Francisco. Over the course of a few days, the team directed the modified stem cells to grow into cartilage cells and produce cartilage tissue. Further experiments by the team showed that the engineered cartilage was protected from inflammation. “We hijacked an inflammatory pathway to create cells that produced a protective drug,” Brunger said. The researchers also encoded the stem/cartilage cells with genes that made the cells light up when responding to inflammation, so the scientists easily could determine when the cells were responding. Recently, Guilak’s team has begun testing the engineered stem cells in mouse models of rheumatoid arthritis and other inflammatory diseases. 
If the work can be replicated in animals and then developed into a clinical therapy, the engineered cells or cartilage grown from stem cells would respond to inflammation by releasing a biologic drug — the TNF-alpha inhibitor — that would protect the synthetic cartilage cells that Guilak’s team created and the natural cartilage cells in specific joints. “When these cells see TNF-alpha, they rapidly activate a therapy that reduces inflammation,” Guilak explained. “We believe this strategy also may work for other systems that depend on a feedback loop. In diabetes, for example, it’s possible we could make stem cells that would sense glucose and turn on insulin in response. We are using pluripotent stem cells, so we can make them into any cell type, and with CRISPR, we can remove or insert genes that have the potential to treat many types of disorders.” With an eye toward further applications of this approach, Brunger added, “The ability to build living tissues from ‘smart’ stem cells that precisely respond to their environment opens up exciting possibilities for investigation in regenerative medicine.”


News Article | May 4, 2017
Site: www.eurekalert.org

IMAGE: An illustration of how 3-D-printed metamaterial unit cells could be combined like Lego blocks to create structures that bend or focus microwave radiation more powerfully than any material found in nature.
DURHAM, N.C. -- Researchers at Duke University have 3-D printed potent electromagnetic metamaterials, using an electrically conductive material compatible with a standard 3-D printer. The demonstration could revolutionize the rapid design and prototyping of radio frequency applications such as Bluetooth, Wi-Fi, wireless sensing and communications devices. Metamaterials are synthetic materials composed of many individual, engineered devices called cells that together produce properties not found in nature. As an electromagnetic wave moves through the metamaterial, each engineered cell manipulates the wave in a specific way to dictate how the wave behaves as a whole. Metamaterials can be tailored to have unnatural properties such as bending light backwards, focusing electromagnetic waves onto multiple areas and perfectly absorbing specific wavelengths of light. But previous efforts have been constrained to 2-D circuit boards, limiting their effectiveness and abilities and making their fabrication difficult. In a new paper appearing online in the journal Applied Physics Letters, Duke materials scientists and chemists have shown a way to bring electromagnetic metamaterials into the third dimension using common 3-D printers. "There are a lot of complicated 3-D metamaterial structures that people have imagined, designed and made in small numbers to prove they could work," said Steve Cummer, professor of electrical and computer engineering at Duke. "The challenge in transitioning to these more complicated designs has been the manufacturing process. With the ability to do this on a common 3-D printer, anyone can build and test a potential prototype in a matter of hours with relatively little cost."
The key to making 3-D printed electromagnetic metamaterials a reality was finding the right conductive material to run through a commercial 3-D printer. Such printers usually use plastics, which are typically terrible at conducting electricity. While there are a few commercially available solutions that mix metals in with the plastics, none are conductive enough to create viable electromagnetic metamaterials. While metal 3-D printers do exist, they cost as much as $1 million and take up an entire room. That's where Benjamin Wiley, Duke associate professor of chemistry, came in. "Our group is really good at making conductive materials," said Wiley, who has been exploring these materials for nearly a decade. "We saw this gap and realized there was a huge unexplored space to be filled and thought we had the experience and knowledge to give it a shot." Wiley and Shengrong Ye, a postdoctoral researcher in his group, created a 3-D printable material that is 100 times more conductive than anything currently on the market. The material is currently being sold under the brand name Electrifi by Multi3D LLC, a startup founded by Wiley and Ye. While still not nearly as conductive as regular copper, Cummer thought that it might just be conductive enough to create a 3-D printed electromagnetic metamaterial. In the paper, Cummer and doctoral student Abel Yangbo Xie show that not only is Electrifi conductive enough, it interacts with radio waves almost as strongly as traditional metamaterials made with pure copper. That small difference is easily made up for by the printed metamaterials' 3-D geometry -- the results show that the 3-D printed metamaterial cubes interact with electromagnetic waves 14 times better than their 2-D counterparts. By printing numerous cubes, each tailored to specifically interact with an electromagnetic wave in a certain way, and combining them like Lego building blocks, researchers can begin to build new devices. 
For the devices to work, however, the electromagnetic waves must be roughly the same size as the individual blocks. While this rules out the visible spectrum, infrared and X-rays, it leaves open a wide design space in radio waves and microwaves. "We're now starting to get more aggressive with our metamaterial designs to see how much complexity we can build and how much that might improve performance," said Cummer. "Many previous designs were complicated to make in large samples. You could do it for a scientific paper once just to show it worked, but you'd never want to do it again. This makes it a lot easier. Everything is on the table now." "We think this could change how the radio frequency industry prototypes new devices in the same way that 3-D printers changed plastic-based designs," said Wiley. "When you can hand off your designs to other people or exactly copy what somebody else has done in a matter of hours, that really speeds up the design process." This work was supported by a Multidisciplinary University Research Initiative grant from the Office of Naval Research (N00014-13-1-0631). Microwave Metamaterials Made by Fused Deposition 3D Printing of a Highly Conductive Copper-based Filament. Yangbo Xie, Shengrong Ye, Christopher Reyes, Pariya Sithikong, Bogdan Popa, Benjamin J. Wiley, and Steven A. Cummer. Applied Physics Letters, 2017. DOI: 10.1063/1.4982718
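As a back-of-the-envelope illustration of the size constraint described above (the frequencies below are illustrative choices, not from the paper), a few lines of Python show why millimeter-to-centimeter unit cells match radio and microwave bands but are hopelessly large for visible light:

```python
# Back-of-the-envelope check of the size constraint: a metamaterial unit
# cell must be roughly comparable to the operating wavelength. The bands
# listed here are illustrative examples, not from the paper.

C = 299_792_458.0  # speed of light, m/s

bands = {
    "Wi-Fi (2.4 GHz)": 2.4e9,
    "microwave (30 GHz)": 30e9,
    "visible green light (~545 THz)": 545e12,
}

for name, freq in bands.items():
    wavelength_mm = C / freq * 1000  # wavelength = c / f, in millimeters
    print(f"{name}: wavelength ~ {wavelength_mm:.4g} mm")
```

Wavelengths of roughly 10-125 mm at radio and microwave frequencies are a comfortable match for 3-D printed cubes, while visible light, at around half a micrometer, is thousands of times smaller than anything a desktop printer can resolve.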


News Article | April 17, 2017
Site: www.eurekalert.org

Bottom Line: A vaccine targeting the cytomegalovirus (CMV) antigen pp65, combined with high-dose chemotherapy (temozolomide), improved both progression-free survival and overall survival for a small group of glioblastoma (GBM) patients. Journal in Which the Study was Published: Clinical Cancer Research, a journal of the American Association for Cancer Research. Author: Lead author of the study is Kristen Batich, MD, PhD, a researcher in the lab of senior author John Sampson, MD, PhD, chair of the Department of Neurosurgery at Duke University. Background: The typical median survival for GBM patients is less than 15 months. To improve on these poor numbers, the researchers took advantage of CMV's affinity for GBM: the viral proteins are expressed in roughly 90 percent of these tumors. Building on previous research, they used CMV as a proxy for GBM, targeting the virus with pp65-specific dendritic cells to spotlight the tumor for the immune system. How the Study Was Conducted and Results: The cohort of 11 patients who received this combination therapy demonstrated a median progression-free survival of 25.3 months and a median overall survival of 41.1 months, and three patients remain progression-free more than seven years after diagnosis, Batich explained. "The clinical outcomes in GBM patients who received this combination were very striking," Batich said. Previous work had shown that temozolomide (TMZ) generates profound lymphopenia, a loss of immune cells, which offers a unique opportunity to retrain the immune system, Batich explained. The researchers administered dose-intensified TMZ as a strategy to further enhance the immune response. "The dose-intensified temozolomide induces a strong state of lymphopenia," said Batich. "With that comes an opportune moment to introduce an antigen-specific vaccine, which redirects the immune system to put all hands on deck and fight that target."
One of the noteworthy results from the study was the excellent response rate despite a high proportion of regulatory T cells, which dampen the immune response and rebounded sharply following TMZ administration. This finding may actually be cause for optimism, Batich noted. "If we could preclude this regulatory T-cell rebound, it could have additionally enhancing effects on the pp65 vaccine response," said Batich. Limitations: Though the survival results are encouraging, the authors caution that this was a single-arm study without a control group, and the cohort was quite small. Though the outcomes far outpaced historical controls, a more robust trial will be needed to confirm these results. The team also wants to better understand the mechanisms that underlie the strong response rate and refine this combination therapy to produce even better results. "We want to understand why some patients do better than others," said Batich. Funding & Disclosures: This study was funded by the National Institutes of Health. Sampson holds stock ownership in and serves on the board of directors of Annias Immunotherapeutics; serves as a consultant and advisory board member for Celldex Therapeutics; reports honoraria from Celldex Therapeutics, Bristol-Myers Squibb, and Brainlab; and is a co-inventor on a patent describing the immunologic targeting of CMV antigens in cancer. Batich is a co-inventor on a patent for improving the immunogenicity of dendritic cell vaccines.


News Article | May 5, 2017
Site: www.eurekalert.org

SOLOMONS, MD (MAY 5, 2017)--A new study by scientists at the University of Maryland Center for Environmental Science's Chesapeake Biological Laboratory, Cornell University and Duke University is the first in a series to understand how marine mammals like porpoises, whales, and dolphins may be impacted by the construction of wind farms off the coast of Maryland. The new research offers insight into previously unknown habits of harbor porpoises in the Maryland Wind Energy Area, a 125-square-mile area off the coast of Ocean City that may become the site of the nation's first commercial-scale offshore wind farm. Offshore wind farms provide renewable energy, but activities during construction can affect marine mammals that use sound for communication, finding food, and navigation. "It is critical to understand where marine mammals spend their time in areas of planning developments, like offshore wind farms, in order to inform regulators and developers on how to most effectively avoid and minimize negative impacts during the construction phase when loud sounds may be emitted," said Helen Bailey, the project leader at the UMCES' Chesapeake Biological Laboratory. Scientists from the University of Maryland Center for Environmental Science used underwater microphones called hydrophones to detect and map the habits of harbor porpoises, one of the smallest marine mammals. Bailey describes harbor porpoises as "very shy" animals, 4 to 5 feet long, with a small triangular fin that can be hard to spot. They live primarily in coastal ocean waters, spending summers north in the Bay of Fundy and migrating to the Mid-Atlantic, as far south as North Carolina, in the winter. There are about 80,000 of them in the northwestern Atlantic. "There was so little known about them in this area," said Bailey. "It was suspected they used the waters off Maryland, but we had no idea how frequently they occurred here in the winter until we analyzed these data."
Porpoises produce echolocation clicks, a type of sonar that hits an object and reflects back to tell them its distance, size and shape. They use it to navigate and feed. The researchers used hydrophones anchored 65-145 feet deep, and about 10 feet off the bottom of the ocean, to pick up these clicks over the course of a year. "We found that harbor porpoises occurred significantly more frequently during January to May, and foraged for food significantly more often in the evenings to early mornings," said study author Jessica Wingfield. Scheduling wind farm construction activities in the Maryland WEA to take place during summer months (June to September) could reduce the likelihood of disturbance to harbor porpoises. "We were certainly surprised by how frequently we detected harbor porpoises because there had not been a lot of reported sightings," said Wingfield. Maryland Department of Natural Resources secured the funding for this study from the Maryland Energy Administration's Offshore Wind Development Fund and the Bureau of Ocean Energy Management. "Year-round spatiotemporal distribution of harbour porpoises within and around the Maryland wind energy area" was recently published in PLOS ONE. The University of Maryland Center for Environmental Science leads the way toward better management of Maryland's natural resources and the protection and restoration of the Chesapeake Bay. From a network of laboratories located across the state, UMCES scientists provide sound advice to help state and national leaders manage the environment, and prepare future scientists to meet the global challenges of the 21st century.
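The echolocation ranging described above reduces to a time-of-flight calculation. A simplified sketch (the sound speed is a typical seawater value assumed for illustration; the delay is hypothetical, not a measurement from the study):

```python
# Simplified time-of-flight ranging, as in echolocation: a click travels to
# the target and back, so range = speed * delay / 2. The sound speed is a
# typical value for seawater (illustrative assumption).

SOUND_SPEED_SEAWATER = 1500.0  # m/s, approximate

def echo_range_m(round_trip_delay_s):
    return SOUND_SPEED_SEAWATER * round_trip_delay_s / 2

# A click echo returning after a hypothetical 40 ms delay implies a target
# about 30 m away.
print(echo_range_m(0.040))
```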


News Article | May 8, 2017
Site: www.spie.org

Frequency- and phase-diverse spatial light modulation can more than double terahertz image acquisition efficiency, effectively parallelizing the single-pixel imaging process. Most modern imaging systems function in a parallel acquisition scheme [1, 2]. For example, the ubiquitous digital optical cameras of today employ arrays of pixels that each detect local light intensity, and simultaneously generate proportional electrical signals to construct an image. However, assembling the large quantities of detectors that are required for parallel imaging is not always feasible for other frequencies of light. In particular, there is a gap in current technology ranging from about 0.1 to 10 terahertz (THz), often referred to as the ‘terahertz gap’ [3]. Here single-pixel imaging may be advantageous: only one detector is used, with a spatial light modulator (SLM) to serially acquire many measurements of a scene. Metamaterials (i.e., engineered materials) enable the construction of high-performance SLMs because their electromagnetic properties can be designed via unit cell geometry. Until recently, single-pixel imaging was inherently slow because it requires a number of serial measurements equal to the number of pixels in the final image. Compressive sensing is a prominent approach that seeks to increase acquisition speeds by reducing the number of measurements made by the single-pixel detector. However, image reconstruction from compressive measurements can be computationally expensive (NP-hard) [5]. Further, the measurement process remains serial, meaning that acquisition time is still directly proportional to the desired image size. We developed an efficient single-pixel imaging system enabled by a metamaterial SLM [6] whose pixels' absorption peak can be dynamically brought high or low via applied bias voltage with great speed and precision.
Light from a THz source passes through the object to be imaged and is focused onto the metamaterial SLM (see Figure 1) [7]. Each pixel oscillates between high and low absorption at frequency f with a specific phase, either 0 or π, a technique known in communications engineering as binary phase-shift keying (BPSK) [8]. The spatial pattern of 0 and π phases, or the ‘mask’, is in our case given by a row of a Hadamard matrix, shown to be optimal in single-pixel imaging [9]. The light from each SLM pixel is then focused into the single-pixel THz detector, where the summed phase and amplitude of the signal are read by a lock-in amplifier detection scheme.

Figure 1. Schematic of the experimental setup for quadrature phase-shift keying (QPSK) imaging. Light from a terahertz (THz) source transmits through an object and is focused onto a spatial light modulator (SLM). Two distinct masks from the Hadamard matrix (mask 1 and mask 2) are encoded simultaneously by the SLM, and light is then refocused into a single-pixel detector.

Figure 2. Experimental characterization of advanced modulation states [4]. (a) QPSK states realized simultaneously on three different frequencies (f1, f2, f3). (b) QPSK states realized for a single frequency, shown with mean and standard deviation indicators. (c) Time-domain data for different binary phase-shift keying (BPSK) state combinations on four different orthogonal frequency division multiplexing frequencies (f1, f2, f3, f4). The Q and I axes correspond respectively to the quadrature and in-phase components of the QPSK states.
Higher-voltage states correspond to π phase, and low-voltage states to -π phase [4, 11].

We parallelize the single-pixel imaging process by displaying more than one mask simultaneously, in two different ways [10]. First, we use four phase values (π/4, 3π/4, 5π/4, 7π/4) instead of the original two, a method known as quadrature phase-shift keying (QPSK); see Figure 2(b) [4]. With twice as many phase values, we can display two masks at once and simultaneously measure their results. This deterministically doubles the acquisition speed, since we complete the same number of measurements in half the time. Figure 3(b) and (c) shows the QPSK imaging results.

Figure 3. (a) Image of an original cross object aperture and the (b) BPSK and (c) QPSK images acquired with our single-pixel THz imaging system. (d) Image of an original ‘D’ object aperture and the (e) 1-frequency, (f) 2-frequency, and (g) 4-frequency BPSK images acquired with a similar THz imaging system.
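The Hadamard-mask measure-and-reconstruct cycle described above can be sketched in a few lines (an illustrative toy model, not the authors' code: here the "measurement" is a dot product against a stored scene vector, whereas in the real system each value comes from the lock-in amplifier):

```python
# Illustrative sketch of BPSK single-pixel imaging with Hadamard masks.
# Each measurement is the dot product of one +/-1 mask row with the
# (flattened) scene; the scene is recovered exactly because the Hadamard
# matrix H satisfies H * H^T = N * I.

def hadamard(n):
    """Sylvester-construction Hadamard matrix of order n (n a power of 2)."""
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def measure(scene, mask):
    # One lock-in reading: +1 mask pixels add the light, -1 pixels subtract
    # it (the 0-vs-pi phase flip of the BPSK scheme).
    return sum(m * s for m, s in zip(mask, scene))

def reconstruct(measurements, H):
    n = len(H)
    return [sum(H[i][j] * measurements[i] for i in range(n)) / n
            for j in range(n)]

scene = [0.0, 1.0, 0.5, 0.25, 0.0, 0.75, 1.0, 0.0]  # toy 8-pixel scene
H = hadamard(8)
y = [measure(scene, row) for row in H]  # 8 serial measurements
print(reconstruct(y, H))                # recovers the scene
```

The serial cost is plain here: an N-pixel image needs N mask measurements, which is exactly what the parallelization methods below attack.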
In the second parallelization method, we employ multiple modulation frequencies [11]. These frequencies, four in the case shown in Figure 2(c), are chosen to be orthogonal in order to minimize interference between them, a technique known as orthogonal frequency division multiplexing (OFDM) [12]. This allows four masks to be displayed simultaneously, and thus four measurements to be recorded at once via a lock-in detection scheme. This technique therefore yields a fourfold increase in acquisition speed. However, it necessarily spreads the full modulation power of the SLM across several frequencies, so a decrease in signal-to-noise ratio (SNR) is inevitable, as is evident in the imaging results we obtained: see Figure 3(e-g). On the other hand, this trade of SNR for acquisition speed is made at a constant detector integration time, which can be advantageous in some cases. The effects of these two parallelization methods combine multiplicatively: by employing the QPSK and OFDM methods together, we achieved a deterministic eightfold increase in acquisition speed. Further, these techniques are completely compatible with compressive sensing approaches [7]. Naturally, there is the question of extending these techniques with more frequencies and phase values for even greater acquisition speed. While this is perfectly feasible in the case of OFDM, QPSK is difficult to extend in the context of single-pixel imaging due to the inherent spatial multiplexing of such a system. A phase-sensitive detection scheme must be able to distinguish between measurements of the simultaneous masks, and in the present context this leaves room for only two masks: one encoded in-phase, and one encoded in-quadrature. The advanced modulation techniques highlighted here are enabled by metamaterial SLMs, and provide a pathway to solving the inherently slow, serial nature of current single-pixel imaging methods.
Extensions of QPSK and OFDM to more frequencies and phases have the potential to increase image acquisition speed to a nearly arbitrary degree, limited only by the SNR of the system [13]. Improvements to single-pixel methods can help fill the terahertz gap and facilitate related applications in security screening [14], all-weather navigation [15], and biosensing [16]. Overall, we expect the scalability of metamaterials and of these advanced modulation methods to have a significant impact in imaging fields, particularly those in the IR, far-IR, and millimeter wave regimes. In our future work, we will extend these techniques to small-format detector array systems, as well as hyperspectral and polarimetric imaging. This research was funded in part by National Science Foundation grant ECCS-1002340 and Office of Naval Research grant N00014-11-1-0864.

Duke University
Willie Padilla is a professor in electrical and computer engineering. Currently his research interests involve the THz, IR, and optical properties of metamaterials for spectroscopy, imaging, and energy investigations.
Christian Nadell is a PhD candidate working under Willie Padilla. His research interests involve the study of metamaterials and their THz and IR imaging applications.

References:
1. O. Katz, P. Heidmann, M. Fink, S. Gigan, Non-invasive real-time imaging through scattering layers and around corners via speckle correlations, Nat. Photonics 8(10), p. 784-790, 2014. doi:10.1038/nphoton.2014.189
4. C. C. Nadell, C. M. Watts, J. A. Montoya, S. Krishna, W. J. Padilla, Single pixel quadrature imaging with metamaterials, Adv. Opt. Mater. 4(1), p. 66-69, 2016. doi:10.1002/adom.201500435
5. T. Strohmer, Measure what should be measured: progress and challenges in compressive sensing, CoRR abs/1210.6730, 2012.
11. C. M. Watts, C. C. Nadell, J. Montoya, S. Krishna, W. J. Padilla, Frequency-division-multiplexed single-pixel imaging with metamaterials, Optica 3(2), p. 133-138, 2016. doi:10.1364/OPTICA.3.000133


News Article | May 4, 2017
Site: www.eurekalert.org

IMAGE: Human sources of black carbon and other short-lived climate pollutants include flares from oil and gas wells, such as these in the Bakken Field of North Dakota.
DURHAM, N.C. -- A commitment to reducing global emissions of short-lived climate pollutants (SLCPs) such as methane and black carbon could slow global warming while boosting public health and agricultural yields, aligning the Paris Climate Agreement with global sustainable development goals, a new analysis by an international research panel shows. Methane and black carbon - or soot - are the second and third most powerful climate-warming agents after carbon dioxide. They also contribute to air pollution that harms the health of billions of people worldwide and reduces agricultural yields. "Unlike long-lived greenhouse gases such as carbon dioxide, SLCPs respond very quickly to mitigation. It's highly likely that we could cut methane emissions by 25 percent and black carbon by 75 percent and eliminate high-warming hydrofluorocarbons altogether in the next 25 years using existing technologies, if we made a real commitment to doing this," said Drew T. Shindell, professor of climate science at Duke University's Nicholas School of the Environment. Acting now to reduce these emissions would contribute to long-term goals set under the 2015 Paris Climate Agreement while concurrently offering governments substantial benefits in the short term for investing in sustainable development - a set of goals through 2030 that countries also agreed to in 2015. "The urgency in dealing with SLCPs now rather than later is that if we wait to address them, we continue to incur all these damages - increased public health burdens and reduced agricultural yields - along the way," Shindell said. "If we want to avoid those costs, and keep millions of people from dying, we need to do this now.
"Adding a pathway goal would help reduce the risks faced by the current generation and our children, complementing the Paris Agreement's long-term target that reduces risks for future generations," he said. Shindell and colleagues from 10 other international research institutions published their peer-reviewed policy forum article May 5 in Science. The article builds upon previous work by the Climate and Clean Air Coalition (CCAC), an international consortium of more than 100 countries and non-state partners working to reduce SLCPs. Shindell chairs the CCAC's Science Advisory Panel; his co-authors of the new policy forum are all members or affiliates of that panel. In the new article, they point out that in addition to saving human lives and boosting global food security, curbing SLCPs will significantly slow the pace of climate change over the next 25 years. This could help reduce biodiversity losses and slow amplifying climate feedbacks such as snow-and-ice albedo that are highly sensitive to black carbon. Under the Paris Agreement, many countries have already committed to reducing SLCPs, Shindell noted, yet they are combining those pledges into a single, so-called "CO2-equivalent" reporting method that lumps SLCPs into the same basket as carbon dioxide and other long-lived greenhouse gases. Maintaining separate reporting methods for each pollutant would provide a clearer understanding of the benefits associated with SLCPs' reduction. "Targeting immediate reductions in SLCP emissions is the most beneficial path we can take toward achieving the Paris Climate Agreement's goal of reducing warming by 2oC," Shindell said. "You could, conceivably, delay reducing these pollutants for decades and still achieve that goal. But why would you want to if there are all these advantages to be gained by following this path, instead?" 
Researchers at the Institute for Governance & Sustainable Development in Washington, D.C.; the University of British Columbia; the London School of Hygiene and Tropical Medicine; the University of York; the United Nations Environment Programme; Scripps Institution of Oceanography; Colorado State University; the International Institute for Applied Systems Analysis in Austria; and TERI University in India co-authored the new article with Shindell.

CITATION: "A Climate Policy Pathway for Near- and Long-Term Benefits," D. Shindell, N. Borgford-Parnell, M. Brauer, A. Haines, J.C.I. Kuylenstierna, S.A. Leonard, V. Ramanathan, A. Ravishankara, M. Amann and L. Srivastava. Science, May 5, 2017. DOI: 10.1126/science.aak9521


News Article | May 8, 2017
Site: www.eurekalert.org

Mothers protect their babies and teach them habits to stay healthy and safe as they grow. A new UCLA-led study shows that beneficial bacteria from mothers do much the same thing. The study found that 30 percent of the beneficial bacteria in a baby's intestinal tract come directly from mother's milk, and an additional 10 percent comes from skin on the mother's breast. What's more, babies who breast-feed even after they begin eating solid food continue reaping the benefits of a breast milk diet -- a growing population of beneficial bacteria associated with better health.

After birth, beneficial bacteria from the mother and environment colonize the infant's intestine, helping digest food and training the baby's immune system to recognize bacterial allies and enemies. But scientists still don't completely understand the mechanisms that help babies establish a healthy gut microbiome -- the diverse community of bacteria that inhabits the intestines.

"Breast milk is this amazing liquid that, through millions of years of evolution, has evolved to make babies healthy, particularly their immune systems," said Dr. Grace Aldrovandi, the study's senior author and a professor of pediatrics and chief of infectious diseases at UCLA Mattel Children's Hospital. "Our research identifies a new mechanism that contributes to building stronger, healthier babies."

The findings appear in the May 8 issue of JAMA Pediatrics. The study, which looked at 107 mother-infant pairs, is the largest to date showing the transfer of bacteria from milk into the baby's gut, Aldrovandi said.

Earlier research has shown that a balanced bacterial community in the intestine is a key factor in people's susceptibility to immune diseases. For example, children who develop type 1 diabetes have abnormalities in their gut microbiomes; what's more, a healthy gut appears to protect against allergies, asthma and inflammatory bowel disease throughout life.
"We're appreciating more and more how these bacterial communities, particularly in the intestine, help guard against the bad guys," Aldrovandi said. "We know from animal model systems that if you get good bacteria in your gut early in life, you're more likely to be healthy."

Throughout the babies' first year of life, researchers collected samples of breast milk and infant stool, and swabs from the skin around the nipple. They analyzed the samples to assess which bacteria were shared between mothers and infants, and calculated the relative abundance of the bacteria. The origin of breast milk bacteria remains unclear; one hypothesis is that it travels to the breast from the mother's intestine. The project did not address how babies who are fed only formula acquire healthy microbiomes.

Aldrovandi and colleagues want to expand the research to evaluate more samples in late infancy to better understand the transition to an adult microbiome. They would like to test in the lab how bacteria that are provided through breast-feeding are critical in infants' immune responses, and determine which beneficial bacteria are missing in people who have certain diseases.

The study's other authors are Dr. Pia Pannaraj and Dr. Jeffrey Bender of the University of Southern California's Keck School of Medicine; Shangxin Yang, Adrienne Rollie, Helty Adisetiyo, Fan Li and Dr. Chiara Cerini of Children's Hospital Los Angeles; Sara Zabih, Pamela Lincez, Kyle Bittinger, Aubrey Bailey and Frederic Bushman of the University of Pennsylvania; and Dr. John Sleasman of Duke University School of Medicine.

The study was funded by grants from the National Institutes of Health (K23 HD072774-02, K12 HD052954-09, R01 AI052845, R01 AI1001471, UM1AI106716) and the University of Pennsylvania Center for AIDS Research (P30 AI045008).


CHAPEL HILL, N.C.--(BUSINESS WIRE)--Dr. Elizabeth A. Gulledge and Bill Sanford, M.S., leadership trainers and speakers at Bell Leadership Institute, will present at the ATD 2017 International Conference & Exposition in Atlanta. Registration is now open for the conference, which will be hosted at the Georgia World Congress Center on May 21-24, 2017.

Elizabeth will address the audience on "Mastering the Art of Change Leadership: How to Change Yourself, Others and Organizations" on May 24 at 10:00 AM in room B406. The session will guide participants through the process of creating change in their organizations in a way that yields positive outcomes. She will incorporate personal skill development exercises for participants to practice applying proven methods of leading change and to think deeply about the dilemmas that arise when creating change.

Bill will present on "Great Leaders, Great Results: Building Leadership Talent for Organizational Success" on May 24 at 10:00 AM in room B316. Participants will discover the single most important cause of the performance level of companies, teams and families – and, knowing that cause, they will be better able to boost the performance of their organizations by making smarter decisions about where to invest their time and money. This session is especially useful for those who want to make a persuasive case for selecting and building leadership talent, and who would like a framework for doing so.

Over the span of four days, ATD 2017 will host over 10,000 attendees, offering more than 300 educational sessions on various topics within the learning industry. Participants will have the opportunity to hear from over 400 speakers, including the following keynote speakers: Captains Mark Kelly and Scott Kelly, highly decorated NASA astronauts and retired US Navy captains; Dr. Kelly McGonigal, health psychologist and lecturer at Stanford University; and Dr. Ronan Tynan, Irish tenor, Paralympic champion and MD.
Elizabeth received her BA (Political Science) from Duke University and Master's degrees (International Relations and International Business) and PhD (Organizational Theory) from the University of St. Andrews in the United Kingdom. She attended the University of St. Andrews as a Ransome Scholar and served as the Dean's Research Fellow in Organizational Psychology. In addition, she was a visiting scholar at the Center for Public Policy and Management at Duke University and a Rothermere Scholar at the University of Oxford.

Bill earned master’s degrees from the London School of Economics and from George Mason University’s Institute for Conflict Analysis and Resolution. He earned his bachelor’s degree from the University of North Carolina at Chapel Hill, which he attended on a Morehead Scholarship.

The Association for Talent Development (ATD), formerly known as the American Society for Training & Development (ASTD), is a professional membership organization supporting those who develop the knowledge and skills of employees in organizations around the world. ATD's mission is to "Empower professionals to develop talent in the workplace."

The ATD 2017 online rate is $1,875 for members and $2,200 for nonmembers. The online rate ends May 12, 2017. For more information and to register, visit the registration page.

Bell Leadership Institute is a recognized leader in leadership training and executive education. Since 1972, Bell Leadership has helped organizations develop leadership mastery through its programs and services. Its training programs have been used by more than 500,000 leaders in more than 5,000 organizations in over 50 countries.


News Article | May 5, 2017
Site: www.chromatographytechniques.com

A new study by scientists at the University of Maryland Center for Environmental Science’s Chesapeake Biological Laboratory, Cornell University and Duke University is the first in a series to understand how marine mammals like porpoises, whales, and dolphins may be affected by the construction of wind farms off the coast of Maryland. The new research offers insight into previously unknown habits of harbor porpoises in the Maryland Wind Energy Area, a 125-square-mile area off the coast of Ocean City that may host the nation’s first commercial-scale offshore wind farm.

Offshore wind farms provide renewable energy, but activities during construction can affect marine mammals that use sound for communication, finding food, and navigation. “It is critical to understand where marine mammals spend their time in areas of planned developments, like offshore wind farms, in order to inform regulators and developers on how to most effectively avoid and minimize negative impacts during the construction phase, when loud sounds may be emitted,” said Helen Bailey, the project leader at the UMCES’ Chesapeake Biological Laboratory.

Scientists from the University of Maryland Center for Environmental Science used underwater microphones called hydrophones to detect and map the habits of harbor porpoises, one of the smallest marine mammals. Bailey describes harbor porpoises as “very shy” animals, ranging from 4 to 5 feet long, with a small triangular fin that can be hard to spot. They swim primarily in the open ocean, spending summers north in the Bay of Fundy and migrating to the Mid-Atlantic, as far south as North Carolina, in the winter. There are about 80,000 of them in the northwestern Atlantic. “There was so little known about them in this area,” said Bailey.
“It was suspected they used the waters off Maryland, but we had no idea how frequently they occurred here in the winter until we analyzed these data.” Porpoises produce echolocation clicks, a type of sonar that hits an object and reflects back to tell them its distance, size and shape. They use it to navigate and feed. The researchers used hydrophones anchored 65-145 feet deep, and about 10 feet off the bottom of the ocean, to pick up these clicks over the course of a year. “We found that harbor porpoises occurred significantly more frequently during January to May, and foraged for food significantly more often in the evenings to early mornings,” said study author Jessica Wingfield. Scheduling wind farm construction activities in the Maryland WEA to take place during summer months (June to September) could reduce the likelihood of disturbance to harbor porpoises. “We were certainly surprised by how frequently we detected harbor porpoises because there had not been a lot of reported sightings,” said Wingfield. Maryland Department of Natural Resources secured the funding for this study from the Maryland Energy Administration’s Offshore Wind Development Fund and the Bureau of Ocean Energy Management. “Year-round spatiotemporal distribution of harbour porpoises within and around the Maryland wind energy area” was recently published in PLOS ONE.
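The click analysis described above can be sketched, in highly simplified form, as an energy-threshold detector run over the hydrophone recording. Everything below (the sampling rate, window size, synthetic signal and threshold rule) is a hypothetical illustration of the general technique, not the study's actual processing pipeline:

```python
import numpy as np

# Hypothetical sketch of click detection: flag moments when short-window
# signal energy far exceeds the background noise. Parameters are illustrative.
rng = np.random.default_rng(0)
fs = 500_000                          # assumed 500 kHz sampling rate
recording = rng.normal(0, 0.01, fs)   # one second of synthetic background noise
recording[100_000:100_050] += 0.5     # inject a brief, loud "click" at ~0.2 s

window = 64
# Sliding-window mean of squared amplitude = short-time energy.
energy = np.convolve(recording**2, np.ones(window) / window, mode="same")
threshold = energy.mean() + 10 * energy.std()   # noise-relative threshold
click_samples = np.flatnonzero(energy > threshold)
print(round(click_samples.min() / fs, 3))       # detection begins near 0.2 s
```

Real detectors for porpoise clicks also exploit their narrowband high-frequency structure, but the threshold idea is the same: count detections per hour to build the occurrence and foraging time series the study reports.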


News Article | April 26, 2017
Site: www.prweb.com

All six of the most popular employers of elite MBA graduates will be holding sessions for incoming MBA students at the second annual Pre-MBA Networking Festival sponsored by Poets&Quants on May 11-12 in New York City. Google, McKinsey & Co., The Boston Consulting Group, Bain & Co., Deloitte, and Amazon—the top half dozen companies most MBAs want to work for—will all hold sessions in their New York offices. They will be among more than 20 sponsors, presenters and world-class MBA employers, including Anheuser-Busch, Accenture Strategy, American Express, A.T. Kearney, CommonBond, Deutsche Bank, General Electric, Goldman Sachs, J.P. Morgan, L’Oréal, Morgan Stanley, and PwC.

“This is an extraordinary event, a once-in-a-lifetime opportunity for newly admitted MBA candidates to gain a deep and early understanding of their future career options,” says John A. Byrne, editor-in-chief of PoetsandQuants.com. “The employer sessions give incoming students a head start in helping to decide what industry and job they would most want to have as an internship and ultimately a full-time position.”

The first festival, held last year, was a rousing success, with 100% of the attendees saying in a survey that they would highly recommend the event to others. “Students came away with great contacts that immediately led to summer internships at great companies, including Amazon, JP Morgan Chase and McKinsey,” adds Byrne.
“And they were able to get a handle on each of these companies without the pressures or distractions of balancing their school work and social obligations.”

Registered attendees as of April 15th include successful MBA applicants to Harvard Business School, the University of Pennsylvania’s Wharton School, the University of Chicago’s Booth School of Business, Northwestern University’s Kellogg School of Management, Columbia Business School, Yale’s School of Management, NYU’s Stern School of Business, INSEAD, London Business School, Dartmouth College’s Tuck School of Business, Cornell’s Johnson Graduate School of Management, and Duke University’s Fuqua School of Business, among many other of the world’s leading business schools. Some 10% of the registered attendees will be coming to the event from overseas.

The festival opens on the evening of May 11 at New York University’s Stern School of Business, with a panel discussion on MBA careers that will feature Colleen Baum, principal at McKinsey; Marco Caggiano, managing director in JP Morgan’s mergers & acquisitions group; Michael Grimstad, regional leader for Amazon Prime Now; Brian Perkins, global vice-president at AB InBev’s Budweiser; and Priyanka “Piya” Nair Newkirk, vice president of makeup marketing at Lancôme, L’Oréal. NYU Stern Dean Peter Henry will welcome attendees to the opening, where students also will hear from a career coach and a comedian.

On May 12th, attendees can attend five separate company sessions where executives and partners explain what it’s like to work at their companies and what kinds of jobs newly hired MBAs perform, and offer candid advice on how to pursue an opportunity with them. The festival ends with a lavish evening reception sponsored by JPMorgan Chase. Attendees can begin to create their custom agenda for the festival on April 27th. Registration for the event closes on April 30. A limited number of seats is still available.
To register for the event, go to: http://poetsandquants.com/event/2017-poetsquants-premba-networking-festival/ To see a video highlighting last year’s festival, go to: https://youtu.be/2l-qocugQ6c


News Article | April 17, 2017
Site: phys.org

Some of the most effective treatments against viral infections and cancer belong to a class of drugs called nucleoside analogs. These are essentially faulty versions of molecular building blocks that can slip into cells and get incorporated into DNA, effectively throwing a wrench into the machinery that viruses and cancer cells use to make copies of themselves. Such compounds, which include chemotherapeutic agents like 5-fluorouracil and gemcitabine, popular HIV drugs like AZT, and potent hepatitis B treatments like acyclovir, have dramatically changed the outcomes for millions of people afflicted with life-threatening illnesses.

Duke University scientists have now modeled the complex shape and movement of biomolecules to make an animation depicting how nucleoside analogs and natural nucleosides are transported into cells. The heart of the system is a specific molecule aptly named the concentrative nucleoside transporter, or CNT. The scientists' movie shows CNT slowly moving its cargo like an elevator, stopping at various points across the cell membrane before reaching the other side. Their findings, published early online in Nature, provide important structural information that could be used to design smarter, more specific anticancer and antiviral drugs.

"Our study is the first to provide a visualization of almost every possible conformation of this transporter in motion," said senior study author Seok-Yong Lee, Ph.D., associate professor of biochemistry at Duke University School of Medicine. "By understanding how this transporter recognizes and imports nucleosides, we may be able to redesign drugs that are better at getting inside specific cells like those harboring cancer or a virus."

The blueprint for every living organism lies in the twisted strands of DNA buried within cells. These strands are composed of four nucleotide "bases" (G, A, C, T) arranged along a backbone of sugars and phosphate molecules. Every time a cell grows and divides, it has to make more copies of those original strands of DNA. Hence, active cells are constantly importing more building blocks to replenish their genetic material, especially the essential nucleosides, which are like a nucleotide base without a phosphate attached.
Fifty years ago, scientists designed the first nucleoside analogs, molecular mimics that muck up this DNA construction supply chain in order to incapacitate rapidly growing and particularly needy cancer cells and viruses. Like their natural counterparts, nucleoside analogues are carried across the cell membrane by special proteins called nucleoside transporters. In this study, Lee's group sought to capture one of the most common transporters, known as the concentrative nucleoside transporter or CNT, as it traversed the membrane.

Marscha Hirschi, a graduate student in Lee's lab, used a technique called x-ray crystallography to create an atomic-level three-dimensional picture of the protein. She then took a series of pictures of CNT in different conformations to produce a kind of time-lapse video of the transporter in action: first, as it is ready to capture the nucleoside uridine on the surface of the cell; next, as it moves across the membrane in stages; and finally, as it releases the uridine inside the cell.

"We found that there is a region on the protein called the transport domain that acts like an elevator, shifting into different conformations as it transports cargo up and down across the membrane," said Lee. "Other studies had shown that many transporters move in this way, but ours is the first to record nearly all of the stages of the elevator model. This more detailed understanding could provide a platform for the future development of drugs that are more selective and efficient."

Lee says that transporters responsible for importing a variety of different molecules, such as neurotransmitters, metabolites, and ions, use mechanisms similar to CNT. Thus, the new findings could have implications that reach beyond viral infections and cancer to a number of different clinically relevant physiological processes.
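The elevator description lends itself to a simple state-machine picture. The sketch below is my own illustration of the cycle the crystal structures trace out; the state names paraphrase the article, and the real transporter passes through many intermediate conformations between these three:

```python
from enum import Enum, auto

# Toy state machine for the "elevator" transport cycle (illustrative only).
class CNTState(Enum):
    OUTWARD_OPEN = auto()  # binding site faces outside; uridine can bind
    OCCLUDED = auto()      # cargo enclosed as the transport domain slides
    INWARD_OPEN = auto()   # binding site faces the cytoplasm; cargo released

CYCLE = (CNTState.OUTWARD_OPEN, CNTState.OCCLUDED, CNTState.INWARD_OPEN)

def transport(cargo):
    """Step the cargo through each conformation, then release it inside."""
    for state in CYCLE:
        print(f"{state.name}: carrying {cargo}")
    return f"{cargo} released inside the cell"

print(transport("uridine"))
```

The drug-design point is that an analog must be accepted at the OUTWARD_OPEN step just as uridine is; knowing each conformation shows where that recognition can be tuned.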


News Article | April 13, 2017
Site: www.cemag.us

Bacteria are everywhere. And despite widespread belief, not all bacteria are “bad.” However, to combat those that can cause health issues for humans, there has been an over-reliance on the use of antibiotics — so much so that many of them are now proving ineffective due to bacteria developing increased resistance to them.

“More and more antibiotics are essentially becoming useless,” says Robert Smith, Ph.D., assistant professor in the Department of Biological Sciences at NSU’s Halmos College of Natural Sciences and Oceanography. “Even the most routine infections, such as ear infections that are often seen in children, are becoming more challenging and expensive to treat.”

This notion isn’t new — just prior to winning his Nobel Prize in 1945, Alexander Fleming, the scientist who discovered penicillin, the first widely used antibiotic, warned that overusing such drugs would lead to bacteria that were no longer killed by them. Since then, scientists and bacteria have been locked in a deadly arms race. While scientists rush to discover new antibiotics, bacteria fight back by developing new tools to resist them. In recent years, the bacteria have been winning.

This paradigm led researchers at NSU to take another look at how bacteria do what they do, to see if there was another way to approach the problem. Researchers are now focusing on developing new ways to treat infections that reduce the use of antibiotics. And what the NSU researchers found, working with colleagues from Duke University and the University of Minnesota, was interesting. Their findings are detailed in the March 27 edition of Scientific Reports.

One way that bacteria infect people is by working together. First, they build a home called a biofilm, and then they use chemicals to “talk with each other.” This allows the bacteria to coordinate an attack on the infected person.
Led by NSU graduate Cortney Wilson, Smith’s lab recently discovered that by shaking the house that the bacteria have built, the ability of the bacteria to talk to one another is affected. Wilson earned her Master’s degree from NSU and is now at the University of Colorado, Boulder. “We found that shaking the bacteria forced them to face a decision; do they want to grow, or do they want to cooperate,” Smith says. “And if we shook them at just the right frequency, we created enough confusion that the bacteria could do neither effectively.” Smith notes that this strategy to prevent bacteria from talking to one another has promise in reducing the need for antibiotics. The team of scientists hope to begin testing their theory in more species of bacteria, and eventually in mice. “It is a very exciting time for our research team. We are looking forward to building upon our very promising results and to moving our strategy into the clinic.”


Fewer people could be recommended for primary prevention statin therapy, including many younger adults with high long-term cardiovascular disease risk, if physicians adhere to the 2016 U.S. Preventive Services Task Force (USPSTF) recommendations for statin therapy rather than the 2013 American College of Cardiology/American Heart Association (ACC/AHA) guidelines, according to a study published by JAMA.

The 2013 ACC/AHA guidelines substantially expanded the population eligible for statin therapy by basing recommendations on an elevated 10-year risk of atherosclerotic cardiovascular disease (ASCVD). The 2016 USPSTF recommendations for primary prevention statin therapy increased the estimated ASCVD risk threshold for patients (including those with diabetes) and required the presence of at least one cardiovascular risk factor (i.e., hypertension, diabetes, dyslipidemia, or smoking) in addition to elevated risk.

Michael J. Pencina, Ph.D., of Duke University, Durham, N.C., and colleagues used National Health and Nutrition Examination Survey (NHANES) data (2009-2014) to assess statin eligibility under the 2016 USPSTF recommendations vs the 2013 ACC/AHA cholesterol guidelines among a nationally representative sample of 3,416 U.S. adults ages 40 to 75 years with fasting lipid data and triglyceride levels of 400 mg/dL or less, without prior cardiovascular disease (CVD).

The researchers found that if fully implemented, the USPSTF recommendations would be associated with statin initiation in 16 percent of adults without prior CVD, in addition to the 22 percent of adults already taking lipid-lowering therapy; in comparison, the ACC/AHA guidelines would be associated with statin initiation in an additional 24 percent of patients.
Among the 8.9 percent of individuals in the primary prevention population who would be recommended for statins by ACC/AHA guidelines but not by USPSTF recommendations, 55 percent would be adults ages 40 to 59 years with an average 30-year cardiovascular risk greater than 30 percent, and 28 percent would have diabetes. "If these estimates are accurate and assuming these proportions can be projected to the U.S. population, there could be an estimated 17.1 million vs 26.4 million U.S. adults with a new recommendation for statin therapy, based on the USPSTF recommendations vs the ACC/AHA guideline recommendations, respectively--an estimated difference of 9.3 million individuals," the authors write. "Alternative approaches to augmenting risk-based cholesterol guidelines, including those that explicitly incorporate potential benefit of therapy, should be considered." Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.
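The gap between the two rule sets can be sketched as two predicates. This is a deliberate simplification for illustration only: the actual guidelines include LDL cutoffs, diabetes-specific rules and "selectively offer" tiers, and the 7.5 percent and 10 percent thresholds below are the commonly cited headline values, not the full criteria:

```python
# Simplified eligibility predicates (an illustration, not clinical guidance).
def acc_aha_2013(age, ten_year_ascvd_risk):
    """2013 ACC/AHA headline rule: ages 40-75 with 10-year ASCVD risk >= 7.5%."""
    return 40 <= age <= 75 and ten_year_ascvd_risk >= 0.075

def uspstf_2016(age, ten_year_ascvd_risk, has_risk_factor):
    """2016 USPSTF headline rule: ages 40-75, at least one risk factor,
    and 10-year ASCVD risk >= 10%."""
    return 40 <= age <= 75 and has_risk_factor and ten_year_ascvd_risk >= 0.10

# A hypothetical 50-year-old smoker with an 8% 10-year risk falls in the gap
# the study describes: recommended under ACC/AHA but not under USPSTF.
print(acc_aha_2013(50, 0.08), uspstf_2016(50, 0.08, True))  # True False
```

Applying rules like these to each NHANES participant, with survey weights, is how population-level eligibility estimates of the kind reported above are produced.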


News Article | April 28, 2017
Site: www.biosciencetechnology.com

Using new gene-editing technology, researchers have rewired mouse stem cells to fight inflammation caused by arthritis and other chronic conditions. Such stem cells, known as SMART cells (Stem cells Modified for Autonomous Regenerative Therapy), develop into cartilage cells that produce a biologic anti-inflammatory drug that, ideally, will replace arthritic cartilage and simultaneously protect joints and other tissues from damage that occurs with chronic inflammation. The cells were developed at Washington University School of Medicine in St. Louis and Shriners Hospitals for Children-St. Louis, in collaboration with investigators at Duke University and Cytex Therapeutics Inc., both in Durham, N.C. The researchers initially worked with skin cells taken from the tails of mice and converted those cells into stem cells. Then, using the gene-editing tool CRISPR in cells grown in culture, they removed a key gene in the inflammatory process and replaced it with a gene that releases a biologic drug that combats inflammation. The research is available online April 27 in the journal Stem Cell Reports. “Our goal is to package the rewired stem cells as a vaccine for arthritis, which would deliver an anti-inflammatory drug to an arthritic joint but only when it is needed,” said Farshid Guilak, Ph.D., the paper’s senior author and a professor of orthopedic surgery at Washington University School of Medicine. “To do this, we needed to create a ‘smart’ cell.” Many current drugs used to treat arthritis — including Enbrel, Humira and Remicade — attack an inflammation-promoting molecule called tumor necrosis factor-alpha (TNF-alpha). But the problem with these drugs is that they are given systemically rather than targeted to joints. As a result, they interfere with the immune system throughout the body and can make patients susceptible to side effects such as infections. 
“We want to use our gene-editing technology as a way to deliver targeted therapy in response to localized inflammation in a joint, as opposed to current drug therapies that can interfere with the inflammatory response through the entire body,” said Guilak, also a professor of developmental biology and of biomedical engineering and co-director of Washington University’s Center of Regenerative Medicine. “If this strategy proves to be successful, the engineered cells only would block inflammation when inflammatory signals are released, such as during an arthritic flare in that joint.” As part of the study, Guilak and his colleagues grew mouse stem cells in a test tube and then used CRISPR technology to replace a critical mediator of inflammation with a TNF-alpha inhibitor. “Exploiting tools from synthetic biology, we found we could re-code the program that stem cells use to orchestrate their response to inflammation,” said Jonathan Brunger, Ph.D., the paper’s first author and a postdoctoral fellow in cellular and molecular pharmacology at the University of California, San Francisco. Over the course of a few days, the team directed the modified stem cells to grow into cartilage cells and produce cartilage tissue. Further experiments by the team showed that the engineered cartilage was protected from inflammation. “We hijacked an inflammatory pathway to create cells that produced a protective drug,” Brunger said. The researchers also encoded the stem/cartilage cells with genes that made the cells light up when responding to inflammation, so the scientists easily could determine when the cells were responding. Recently, Guilak’s team has begun testing the engineered stem cells in mouse models of rheumatoid arthritis and other inflammatory diseases. 
If the work can be replicated in animals and then developed into a clinical therapy, the engineered cells or cartilage grown from stem cells would respond to inflammation by releasing a biologic drug — the TNF-alpha inhibitor — that would protect the synthetic cartilage cells that Guilak’s team created and the natural cartilage cells in specific joints. “When these cells see TNF-alpha, they rapidly activate a therapy that reduces inflammation,” Guilak explained. “We believe this strategy also may work for other systems that depend on a feedback loop. In diabetes, for example, it’s possible we could make stem cells that would sense glucose and turn on insulin in response. We are using pluripotent stem cells, so we can make them into any cell type, and with CRISPR, we can remove or insert genes that have the potential to treat many types of disorders.” With an eye toward further applications of this approach, Brunger added, “The ability to build living tissues from ‘smart’ stem cells that precisely respond to their environment opens up exciting possibilities for investigation in regenerative medicine.”
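The feedback circuit the researchers describe (sense TNF-alpha, produce its inhibitor, which in turn quenches the signal) can be caricatured as a discrete negative-feedback loop. The sketch below is purely illustrative: the rate constants are arbitrary and nothing here models real cell biology, but it shows why such a circuit delivers the drug only while the inflammatory signal is present:

```python
# Purely illustrative negative-feedback loop: free TNF-alpha drives inhibitor
# production, and the inhibitor neutralizes TNF-alpha. Constants are arbitrary.
def simulate_flare(tnf=10.0, steps=20, production=0.5, turnover=0.1):
    inhibitor = 0.0
    free_tnf = tnf
    for _ in range(steps):
        free_tnf = max(tnf - inhibitor, 0.0)   # inhibitor quenches the signal
        inhibitor += production * free_tnf     # inflammation drives drug output
        inhibitor *= 1.0 - turnover            # inhibitor is cleared over time
    return free_tnf

# Free TNF-alpha settles well below its initial level once the loop engages,
# and inhibitor production falls away again when the flare (input) subsides.
print(round(simulate_flare(), 2))
```

With no TNF-alpha input there is nothing to drive production, which is the self-limiting behavior that distinguishes the engineered cells from systemic anti-TNF drugs.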




News Article | May 3, 2017
Site: www.eurekalert.org

More use of technology is linked to later increases in attention, behavior and self-regulation problems for adolescents already at risk for mental health issues, a new study finds

DURHAM, N.C. -- More use of technology is linked to later increases in attention, behavior and self-regulation problems for adolescents already at risk for mental health issues, a new study from Duke University finds. "Also, on days at-risk adolescents use technology more, they experience more conduct problems and higher ADHD symptoms compared to days they use technology less," said Madeleine J. George, a Duke Ph.D. candidate and the lead author of the study. However, the study also found that using technology was linked to some positive outcomes: On days when adolescents spent more time using digital technologies, they were less likely to report symptoms of depression and anxiety. The research, published May 3 in a special issue of Child Development, looks at associations between adolescents' mental health symptoms and how much time they spent each day texting, using social media and using the Internet. For the study, 151 young adolescents completed surveys on smartphones about their daily digital technology use. They were surveyed three times a day for a month and were assessed for mental health symptoms 18 months later. The youth participating were between 11 and 15 years old, were of a lower socioeconomic status and were at a heightened risk for mental health problems. The adolescents spent an average of 2.3 hours a day using digital technologies. More than an hour of that time was spent texting, with the adolescents sending an average of 41 texts a day. The researchers found that on days when adolescents used their devices more -- both when they exceeded their own normal use and when they exceeded average use by their peers -- they were more likely to experience conduct problems such as lying, fighting and other behavioral problems. 
In addition, on days when adolescents used digital devices more, they had difficulty paying attention and exhibited attention-deficit/hyperactivity disorder (ADHD) symptoms. The study also found that young adolescents who spent more time online experienced increases in conduct problems and problems with self-regulation -- the ability to control one's behavior and emotions -- 18 months later. It's unclear whether high levels of technology use were simply a marker of elevated same-day mental health symptoms or whether the use of technology exacerbated existing symptoms, said Candice Odgers, the senior author of the study and a professor in Duke's Sanford School of Public Policy. On the positive side, the researchers found evidence that digital technology use may be helpful to adolescents experiencing depression and anxiety. More time spent texting was associated with fewer same-day symptoms of depression and anxiety. "This finding makes sense when you think about how kids are commonly using devices to connect with their peers and social networks," said Odgers, a faculty fellow at the Duke Center for Child and Family Policy. The findings suggest contemporary youth may be using digital technology to connect in positive ways rather than isolating themselves, the authors said. In the past, some research found that teenagers using digital technology were socially isolated. But at that time, only a small minority of youth were frequently online. Odgers noted that the adolescents in the study were already at an increased risk for mental health problems regardless of digital device use. It's therefore unclear whether the findings would apply to all adolescents. Because this was a correlational study, it is possible factors other than technology use could have caused the increase in mental health problems. As rates of adolescent technology use continue to climb, more work is needed to investigate its effects, the researchers say. 
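The day-level comparisons described here (a day when an adolescent exceeds their own typical use, versus an adolescent whose average use exceeds their peers') correspond to a standard within-person/between-person decomposition of diary data. A minimal sketch of that decomposition, using invented numbers and hypothetical variable names rather than the study's actual dataset:

```python
from collections import defaultdict

# Hypothetical diary records: (adolescent_id, hours of technology use that day).
records = [(1, 1.0), (1, 3.0), (1, 2.0), (2, 4.0), (2, 2.0), (2, 3.0)]

# Each adolescent's own average daily use (the between-person level).
by_person = defaultdict(list)
for pid, hours in records:
    by_person[pid].append(hours)
person_mean = {pid: sum(h) / len(h) for pid, h in by_person.items()}

# Average use across all person-days, for comparison against peers.
grand_mean = sum(h for _, h in records) / len(records)

# Within-person deviation: did this adolescent exceed their own normal use today?
daily_dev = [(pid, hours - person_mean[pid]) for pid, hours in records]

# Between-person deviation: does this adolescent use more than peers on average?
peer_dev = {pid: m - grand_mean for pid, m in person_mean.items()}
```

Day-level symptom scores would then be modeled against `daily_dev` and `peer_dev`, separating same-day fluctuations from stable differences between adolescents.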
Odgers and George are now conducting a large study of more than 2,000 N.C. adolescents to determine how and why high digital device use predicts future problems among some adolescents. The study also looks at whether being constantly connected during adolescence could provide opportunities to improve mental health. This study was supported by the William T. Grant Foundation and the Verizon Foundation. Russell was supported by the National Institute on Drug Abuse (T32 DA017629, P50 DA010075 and P50 DA039838). Odgers is a Jacobs Foundation Advanced Research Fellow and a fellow of the Canadian Institute for Advanced Research. CITATION: "Concurrent and Subsequent Associations between Daily Digital Technology Use and High-Risk Adolescents' Mental Health Symptoms," Madeleine J. George, Michael A. Russell, Joy R. Piontak and Candice L. Odgers. Child Development, May 3, 2017. DOI: 10.1111/cdev.12819


Integrated Dermatology Group (IDG), a leading national dermatology practice, has expanded its presence in Virginia by acquiring the practice of Dr. William Shields and partnering with Dr. Jonathan Schreiber, who will serve as Medical Director of the practice known as Integrated Dermatology of Newport News, LLC. Dr. Schreiber first partnered with Integrated Dermatology Group in 2014 with the opening of Integrated Dermatology of Tidewater, LLC in Norfolk, Virginia. He now serves as Medical Director of both practices. After graduating with honors from Stanford University, Dr. Schreiber attended Duke University, where he earned both a medical degree and a Ph.D. in Pharmacology. He then completed his internship at Boston Medical Center and his Dermatology residency at Tufts New England Medical Center and Boston Medical Center’s combined program. Dr. Schreiber is a fellow of the American Academy of Dermatology and a member and past President of the Tidewater Dermatology Society. “By partnering with or selling to IDG, the dermatologist can focus on providing high-quality patient care while IDG manages the practice infrastructure and back-office operations, ensuring best clinical practices and patient outcomes,” said Jeff Queen, co-CEO of Integrated Dermatology Group. “For dermatologists who want to monetize all their practice's value, IDG has a program giving them the opportunity to do so. These dermatologists continue to practice dermatology and accrue additional income from the practice,” said Andrew Queen, co-CEO of Integrated Dermatology Group. IDG is continuing its systematic national expansion. The Newport News announcement comes on the heels of recent announcements of partnerships in both White Plains, New York (Integrated Dermatology of White Plains, LLC) and Chevy Chase, Maryland (Integrated Dermatology of Chevy Chase, LLC). 
For more information about one of the nation's largest and fastest-growing dermatology groups, with practices located across the country, please contact Integrated Dermatology Group at www.mydermgroup.com, or call Jeff Queen at 561-314-2000, extension 1038.

About Integrated Dermatology Group

Headquartered in Boca Raton, Florida, Integrated Dermatology Group is one of the country’s largest providers of dermatology care. The company has expanded its presence nationally by acquiring and partnering with dermatology practices across the United States. This exclusive model enables selling dermatologists to realize all or part of the value of their practices and gives them the choice of either retiring or remaining at the practice indefinitely, maintaining autonomy and control over the practice of medicine. Simultaneously, IDG presents dermatologists with the unique opportunity to immediately join an established private practice as a partner, not an employee, with the infrastructure, support, and resources the larger group provides. IDG's mission is to improve the quality of practice life for its dermatologists while adding to their financial success. As members of IDG, dermatologists focus on providing high-quality patient care as IDG removes the stress of day-to-day management by implementing best practices in the areas of compliance, financial services, human resources, payers, and more. For additional information, visit www.mydermgroup.com.


AUSTIN, Texas--(BUSINESS WIRE)--R. Mack Harrell, MD, FACP, FACE, ECNU, has been elected as President of the American College of Endocrinology (ACE). “I’m honored to be elected by my peers to lead an organization dedicated to excellence in endocrine teaching for practicing endocrinologists and patients,” said Dr. Harrell. Dr. Harrell attended the University of North Carolina at Chapel Hill as an undergraduate and was awarded a Morehead Fellowship in Medicine at UNC in 1975. He performed his postgraduate studies in Internal Medicine at the University of Minnesota and completed his endocrinology fellowship at Duke University, where he specialized in bone and mineral metabolism and served on the academic faculty. A member of AACE since 1991, Dr. Harrell was elected President of the American Association of Clinical Endocrinologists (AACE) in 2014. He has served on numerous AACE committees and task forces as a member, Chair or Co-Chair, with special expertise in socioeconomic, legislative and regulatory issues. Currently, Dr. Harrell practices in Hollywood, Fla., where he focuses his efforts in the nascent field of interventional endocrinology and performs pre-operative imaging for endocrine surgery. From the academic world to private practice, and from foundation staff-model work with the Cleveland Clinic to hospital-based endocrine surgery, Dr. Harrell has witnessed the practice of endocrinology from every conceivable angle. “I have evolved my practice focus through many different iterations, with guidance from mentors at AACE,” said Dr. Harrell. 
“Over the next year as President of the College, I look forward to leading others to take the same informed risks that I have taken to improve the practice of endocrinology domestically and internationally by fostering advances in patient care and patient education.”

About the American Association of Clinical Endocrinologists (AACE)

The American Association of Clinical Endocrinologists (AACE) represents more than 7,500 endocrinologists in the United States and abroad. AACE is the largest association of clinical endocrinologists in the world. A majority of AACE members are certified in endocrinology, diabetes and metabolism and concentrate on the treatment of patients with endocrine and metabolic disorders including diabetes, thyroid disorders, osteoporosis, growth hormone deficiency, cholesterol disorders, hypertension and obesity. Visit our site at www.aace.com.

About the American College of Endocrinology (ACE)

The American College of Endocrinology (ACE) is the educational and scientific arm of the American Association of Clinical Endocrinologists (AACE). ACE is the leader in advancing the care and prevention of endocrine and metabolic disorders by providing professional education and reliable public health information; recognizing excellence in education, research and service; promoting clinical research; and defining the future of clinical endocrinology. Please visit www.aace.com/college.


News Article | May 8, 2017
Site: www.businesswire.com

NEWTOWN SQUARE, Pa.--(BUSINESS WIRE)--XyloCor Therapeutics Inc., a privately held biotech company, today announced that the U.S. Food and Drug Administration (FDA) has granted Fast Track designation to its lead product candidate XC001 (AdVEGF-All6A+), a cardiovascular angiogenic gene therapy. XC001 is a one-time treatment being investigated for improving exercise tolerance in patients who have chronic angina that is refractory to standard medical therapy and not amenable to conventional revascularization procedures such as coronary artery bypass surgery and percutaneous coronary intervention and stents. “Achieving Fast Track status validates the need for XC001, which has the potential to be a unique treatment for this serious condition with high unmet need - chronic, refractory angina,” said Al Gianchetti, President and Chief Executive Officer of XyloCor. “This designation is supported by strong scientific evidence for XC001 and clinical validation of this mechanism of action in refractory angina. This important designation is intended to contribute to an expedited development and regulatory review process, which can get the drug sooner to patients who can benefit from it.” The FDA Fast Track designation is designed to facilitate the development and expedite the review of new drugs and vaccines intended to treat or prevent serious conditions and that demonstrate the potential to address an unmet medical need. XC001 is a novel gene therapy that promotes angiogenesis, the formation of new vessels that can provide arterial blood flow to myocardial regions with inadequate blood supply. Enhancing myocardial blood flow with therapeutic angiogenesis is intended to relieve myocardial ischemia, improve regional and global left ventricular performance, alleviate angina symptoms and disability and potentially improve prognosis. 
“There are many patients in the United States with refractory angina and there are no available treatment options,” said Magnus Ohman, Professor of Medicine, The Kent and Siri Rawson Director, Duke Program for Advanced Coronary Disease, Duke University School of Medicine. “These patients have significant limitations in terms of their daily activities because of the chest pain associated with their ischemic disease, and XC001 could be an important new option for them.” An IND for XC001 is open with the FDA, and XyloCor intends to commence clinical trials upon funding.

XyloCor Therapeutics is a private biopharmaceutical company developing novel gene therapies for people with unmet medical needs arising from advanced coronary artery disease. XyloCor is focused on developing its lead product, XC001, for patients with refractory angina who have no treatment options, and its secondary product, XC002, for patients with cardiac tissue damage from heart attacks. XyloCor was founded by Dr. Ronald Crystal and Dr. Todd Rosengart, who both sit on XyloCor’s advisory board. Dr. Crystal is the Bruce Webster Professor and Chairman, Department of Genetic Medicine, Weill Cornell Medicine and Director of the Belfer Gene Therapy Core Facility. Dr. Rosengart is Professor and Chairman, DeBakey Bard Chair of Surgery, Michael E. DeBakey Department of Surgery, Baylor College of Medicine. XyloCor has a licensing agreement with Cornell University granting the company worldwide rights to develop, manufacture and commercialize XC001. With a strong scientific foundation, compelling preclinical and clinical evidence and an experienced team, XyloCor is poised to succeed and to help patients lead better, healthier lives. For more information, visit www.xylocor.com.


News Article | April 20, 2017
Site: www.futurity.org

Researchers are working to digitally preserve the bodies of lemurs that have died so future students and scientists might learn more about lemur anatomy—and “virtually dissect” them. Almost all of the roughly 100 species of lemurs are facing extinction in the wild due to logging, mining, hunting, and slash-and-burn agriculture. That is why, when an animal at the Duke University Lemur Center dies from illness, injury, or old age, a licensed veterinarian performs a postmortem exam within 24 hours of death, organs are removed, and tissue samples are collected so that other researchers can make use of them. Cadavers of each species go into storage freezers or are preserved in formalin—not for ghoulish curiosity, but so that years from now their bodies could still be useful for research and education. And though these animals are gone, their bodies are now being preserved for present and future generations with help from an X-ray imaging technique called micro-computed tomography (microCT). Soon, anyone will be able to go online to MorphoSource.org and get 3D views of the internal anatomy of dozens of lemurs and other rare and endangered prosimian primates, in micrometer detail, without disturbing the original specimens. “Even when they’ve passed, these animals continue to contribute valuable scientific data,” says former Duke graduate student Gabe Yapuncich, who has been leading the effort to scan the specimens with assistant professor Doug Boyer. Yapuncich got the idea for the project while earning his doctorate in evolutionary anthropology. He was scanning the skeletal remains of present-day primates to see if certain foot bone measurements could help to reconstruct how much their extinct relatives might have weighed. Researchers can learn about many aspects of primate biology from fossils, but most fossil specimens consist of isolated teeth or fragments of bones. Complete or even partial skeletons are rare. 
“It seemed like a waste just to scan the foot and send it back,” Yapuncich says. “Once I had a specimen on loan, I tended to scan the whole thing.” Yapuncich demonstrated how the technology works at Duke’s Shared Materials Instrumentation Facility. A giant lead-lined box there looks like an airport security scanner. Inside, a Styrofoam cooler filled with dry ice contains the frozen remains of Beauty, a female bamboo lemur who died in 1985. The microCT scanner blasts a cone-shaped beam of X-rays at the cooler as it spins slowly on a rotating platform. The X-rays that pass through Beauty’s body hit a detector on the back wall, which records a snapshot. The scanner takes thousands of snapshots for each full rotation. The data go to a computer, which uses the images to reconstruct two-dimensional cross sections of Beauty’s insides, and these are stacked like slices of bread into a 3D model. Yapuncich peers at the result on a nearby computer screen. It’s a 3D close-up of Beauty’s head built from 1,900 cross-sectional images. With the click of a mouse, he digitally dissects away Beauty’s fur, skin, and soft tissue to reveal the skeleton underneath in stunning, three-dimensional detail. He can also look at any 2D slice to see internal structures in cross section. The Duke Lemur Center is committed to studying lemurs without harming them. Imaging cadavers makes it possible to perform “virtual” dissections that would never be allowed on living animals. With standard microCT researchers can visualize hard tissues such as bones and teeth, but by using special iodine-based stains, they can also see soft tissues such as muscles, nerves, and blood vessels in the deceased animals. Yapuncich has scanned the remains of more than 100 animals so far. A fat-tailed dwarf lemur named Jonas is one of them. When he died in 2015 at age 29, suffering from cataracts and other signs of age, he was the oldest of his kind. 
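The "stacked like slices of bread" step described above is, computationally, just assembling a 3D array from a series of 2D images, after which a "virtual dissection" is a matter of re-slicing the array along any plane. A toy sketch with NumPy (assumed available; the array sizes are small stand-ins, not the scanner's actual ~1,900-slice output):

```python
import numpy as np

# Stand-ins for reconstructed 2D cross sections (tiny grayscale arrays).
n_slices, height, width = 10, 4, 4
slices = [np.random.rand(height, width) for _ in range(n_slices)]

# Stack the cross sections into a single 3D volume.
volume = np.stack(slices, axis=0)      # shape: (n_slices, height, width)

# "Virtual dissection" is then array slicing: cut the volume along any plane.
axial = volume[n_slices // 2]          # one of the original cross sections
sagittal = volume[:, :, width // 2]    # a new cut along a different plane
```

Re-slicing along a different axis is how a viewer can show any 2D cross section of internal structure without ever touching the physical specimen again.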
The scan shows his tail curled around his body, the roughly two dozen tail bones neatly lined up one after the other. “If you go to a museum collection, the tail vertebrae are just a bunch of bones in a box,” Yapuncich says. The Duke Lemur Center fields dozens of cadaver requests from researchers each year. But these rare and fragile specimens can only be examined so many times using traditional methods. Repeated shipping and handling may expose them to damage and freeze-thaw cycles that would inevitably speed their decay. By creating high-resolution 3D scans and putting them online, researchers hope to reduce destructive sampling and ensure the availability of specimens for future study. “There aren’t that many available,” says Duke R&D engineer and microCT specialist Justin Gladman. “If one researcher dissects and destroys one, the next researcher can’t do anything with it.” “By scanning them in the microCT and creating these beautiful 3D models, we can digitize the specimens and share them online,” Gladman says. “Instead of being locked in a museum drawer, they’re freely available.” In the digital afterlife, Merlin’s bony appendages are no longer nimble but still intact. He was one of four aye-ayes, out of the fewer than 60 endangered aye-ayes living in captivity worldwide, that died suddenly over 36 hours at the Duke Lemur Center in October 2016. Staff and researchers were devastated. The culprit, tests later revealed, was a natural toxin found in avocados, not previously known to be harmful to lemurs, which damaged their heart muscles. In assistant professor Doug Boyer’s lab on Science Drive, recent Duke graduate Darbi Griffith uses software to stitch together nearly 3,000 2D images of Merlin into a 3D rendering. Merlin was very popular with lemur center staff, and often enjoyed using his incredibly slender and dexterous middle finger to gently tease mealworms from his keepers’ closed fists. The 3D volume rendering shows his body cloaked in skin and muscle. 
With a click his flesh fades away, and Griffith can zoom in on Merlin’s skull to examine the complex wear patterns on his teeth, or peer inside his cranial cavity to estimate the size and shape of his brain. Griffith has uploaded these 3D images to an online database Boyer created called MorphoSource. Because the digitization is ongoing, the Lemur Center scans haven’t been made publicly available yet, but when they are, visitors to MorphoSource will be able to compare Merlin to other individuals, or measure anatomical variation across species. Anyone will be able to browse the specimens, measure them, download the raw data, and even create their own 3D lemur models, of both bodies and skeletons, on a 3D printer. “It’s the largest collection of 3D lemur scans. That’s pretty cool,” Griffith says. Grants from the National Science Foundation supported this research.


News Article | April 17, 2017
Site: www.techrepublic.com

Just in time for the NCAA tournament and March Madness, Duke University is unveiling a revamped online statistics site so that Blue Devils fans can better access player stats, team records, and individual game box scores for its men's basketball teams dating back to 1906. The site's first version came out in 2014, and the most recent update is more streamlined. It is built on an SAP HANA database platform and it mirrors the offerings that SAP provides to its business customers. The key difference is that instead of surfacing business data, it's aimed at fans who want to know, for example, how many field goals Christian Laettner made during the 1991-92 basketball season and post-season. The answer? He made 54 out of 99 attempts, including the infamous shot against the University of Kentucky Wildcats in the East Regional finals that put Duke in the 1992 NCAA Final Four. The result was a 25-year rivalry, with Kentucky fans still cringing at the thought of the game. SEE: March Madness: 5 data sources that could predict the 2017 NCAA championship (TechRepublic) Coincidentally, the University of Kentucky is also one of SAP's customers, but for business purposes, not player stats. "We got a chance to talk to someone in the athletic department for Kentucky and they said, 'oh, it's great that you have this project with Duke, have you seen our basketball team lately?'" "Fans of Duke are always looking up stats. All of those player comparables is something we see a strong demand for," said Frank Wheeler, region vice president and general manager for SAP Sports and Entertainment North America. "You want to have this element of stickiness and building on the technology so when a fan goes there to look up one thing, he's now hooked. Ultimately, the time spent on the site is going to improve." SAP has similar sites for the National Hockey League (NHL) and the National Basketball Association (NBA). 
"When the NBA released their first site with us in 2013, it resulted in a 40% increase in traffic, and a 50% increase in the amount of time actually spent on the site," Wheeler said. The reason SAP creates these sites for sports teams is because it shows off what they can do for business customers. "First off, it provides value to our customers and gives Duke a great platform, and for us to tell that story resonates across industries. It doesn't mean we're going to do a lot of business with other universities, although we could. It resonates with people dealing with similar challenges that want to get data out to customers through a HANA cloud database," Wheeler said. Duke's basketball statistics were integrated using LSI Consulting's hosted SAP HANA platform and UI5 repeatable solution. The new site works on a laptop or mobile device and has multi-platform social media integration, shot charts, and printer-friendly box scores. Students at Duke were part of the core project team behind the solution. They don't get class credit, but they get recognition for their work, and it opens up job opportunities for them, said Ryan Craig, executive director of digital strategy for Duke. The next update for the site will come within the next 3-6 months, and will allow for computational journalism and the ability to look up a question such as: "When was the last time someone scored 30 points in four games in a row?" Wheeler explained. Wheeler expects the site to continue to draw a strong audience. "Duke men's basketball has a rich history that predates the internet age. For years, historical player data was laboriously hand-collected off of microfiche. Now, by harnessing the power of the SAP HANA platform, Duke fans are able to collect the fruit of that labor and view vast amounts of data in one easy-to-use tool."


News Article | May 5, 2017
Site: phys.org

UMCES graduate student Jessica Wingfield is first author on the paper. Credit: University of Maryland Center for Environmental Science

A new study by scientists at the University of Maryland Center for Environmental Science's Chesapeake Biological Laboratory, Cornell University and Duke University is the first in a series to understand how marine mammals like porpoises, whales, and dolphins may be impacted by the construction of wind farms off the coast of Maryland. The new research offers insight into previously unknown habits of harbor porpoises in the Maryland Wind Energy Area, a 125-square-mile area off the coast of Ocean City that may host the nation's first commercial-scale offshore wind farm. Offshore wind farms provide renewable energy, but activities during their construction can affect marine mammals that use sound for communication, finding food, and navigation. "It is critical to understand where marine mammals spend their time in areas of planning developments, like offshore wind farms, in order to inform regulators and developers on how to most effectively avoid and minimize negative impacts during the construction phase when loud sounds may be emitted," said Helen Bailey, the project leader at the UMCES' Chesapeake Biological Laboratory. Scientists from the University of Maryland Center for Environmental Science used underwater microphones called hydrophones to detect and map the habits of harbor porpoises, one of the smallest marine mammals. Bailey describes harbor porpoises as "very shy," ranging from 4 to 5 feet long with a small triangular fin that can be hard to spot. They swim primarily in the ocean, spending summers north in the Bay of Fundy and migrating to the Mid-Atlantic, as far south as North Carolina, in the winter. There are about 80,000 of them in the northwestern Atlantic. "There was so little known about them in this area," said Bailey. 
"It was suspected they used the waters off Maryland, but we had no idea how frequently they occurred here in the winter until we analyzed these data." Porpoises produce echolocation clicks, a type of sonar that hits an object and reflects back to tell them its distance, size and shape. They use it to navigate and feed. The researchers used hydrophones anchored 65-145 feet deep, and about 10 feet off the bottom of the ocean, to pick up these clicks over the course of a year. "We found that harbor porpoises occurred significantly more frequently during January to May, and foraged for food significantly more often in the evenings to early mornings," said study author Jessica Wingfield. Scheduling wind farm construction activities in the Maryland WEA to take place during the summer months (June to September) could therefore reduce the likelihood of disturbance to harbor porpoises. "We were certainly surprised by how frequently we detected harbor porpoises because there had not been a lot of reported sightings," said Wingfield. The Maryland Department of Natural Resources secured the funding for this study from the Maryland Energy Administration's Offshore Wind Development Fund and the Bureau of Ocean Energy Management. "Year-round spatiotemporal distribution of harbour porpoises within and around the Maryland wind energy area" was recently published in PLOS ONE. More information: Jessica E. Wingfield et al, Year-round spatiotemporal distribution of harbour porpoises within and around the Maryland wind energy area, PLOS ONE (2017). DOI: 10.1371/journal.pone.0176653
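The seasonal and diel patterns reported here reduce to binning click detections by month and by time of day. A minimal sketch with invented timestamps; the 18:00-06:00 "evening to early morning" window is an assumed cutoff for illustration, not the study's definition:

```python
from collections import Counter
from datetime import datetime

# Hypothetical timestamps of echolocation-click detections from one hydrophone.
detections = [
    datetime(2016, 2, 10, 22, 15),
    datetime(2016, 3, 1, 4, 30),
    datetime(2016, 3, 2, 23, 50),
    datetime(2016, 7, 15, 12, 0),
]

# Seasonal pattern: count detections per calendar month.
per_month = Counter(d.month for d in detections)

# Diel pattern: count detections in the evening-to-early-morning window.
night_clicks = sum(1 for d in detections if d.hour >= 18 or d.hour < 6)
```

Comparing `per_month` across the year, and `night_clicks` against daytime counts, is the kind of summary that underlies the January-May and evening-foraging findings.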


News Article | May 2, 2017
Site: www.materialstoday.com

Engineers from Duke University in the US, in collaboration with SRICO, have developed a fast and sensitive pyroelectric mobile sensor that can detect specific wavelengths of electromagnetic energy. Using gold-plated crystals, the technology could provide a low-cost alternative to current infrared sensors that scan for methane or natural gas leaks, as well as monitoring the health of crops or even sorting through plastics for recycling. The multi-functional prototype detector, which was reported in Optica [Suen et al. Optica (2017) DOI: 10.1364/OPTICA.4.000276], depends on metamaterials, engineered structures comprising designed repeating cells that interact with electromagnetic waves in various ways. With metamaterials, the components required for a detector can be combined into one feature, providing much-needed efficiency; here, by combining patterns of metal with very thin slices of perfect crystals, the team was able to create a device that detects invisible infrared signatures emitted by a range of gases, plastics and other sources. In most thermal detectors, infrared light waves are absorbed and converted into heat, which is conducted to a separate component, creating an electrical signal that can then be read out. Such a process brings speed limitations, so specific wavelengths can be singled out only by overlaying filters or using a complicated system of moving mirrors. However, every part of this new detector consists of a pattern of gold located on top of a lithium niobate crystal; because the crystal is pyroelectric, it creates an electrical charge when it gets hot. The device can be designed to detect any particular range of electromagnetic frequencies simply by redesigning the details of the gold pattern. Ordinarily, a crystal this thin would let light travel straight through without being absorbed. 
Here, they designed the top layer of gold into a pattern that works with the crystal’s properties so that each pixel absorbs only a specific range of electromagnetic frequencies, which eliminates the need for separate filters. When the crystal heats up and generates an electric charge, the gold transports the signal to the detector's amplifier; because the heat is created directly by the crystal, fewer pixels are necessary. As researcher Jonathan Suen said, “We found that pyroelectric detectors are ideal since their optimization requires the coordination of a number of factors including cost, optical absorption, heat generation and transport, and electrical readout.” The team has already developed a single-pixel prototype as proof of concept, research that could lead to new multi-functional metamaterial designs combining optical properties with other physical phenomena. They also plan to further improve the sensitivity and speed of the detector.


Positive Data Presented at the Association for Research in Vision and Ophthalmology Annual Meeting pSivida Anticipates Reporting Top Line Results from the Second Pivotal Phase 3 Clinical Trial in June 2017 WATERTOWN, Mass., May 08, 2017 (GLOBE NEWSWIRE) -- pSivida Corp. (NASDAQ:PSDV) (ASX:PVA), a leader in the development of sustained release drug products and technologies, today announced positive 12-month follow-up data for the Company’s Durasert three-year treatment for posterior segment uveitis, which was presented at the Association for Research in Vision and Ophthalmology (ARVO) 2017 Annual Meeting. The data, from the Company’s first Phase 3 trial, was presented by Dr. Glenn J. Jaffe, Robert Machemer Professor of Ophthalmology at Duke University School of Medicine in Durham, NC. Dr. Jaffe is a leading authority on posterior segment uveitis, a devastating disease and the third leading cause of blindness. A total of 129 patients were enrolled in the first Phase 3 clinical trial; the primary endpoint was prevention of recurrence of posterior uveitis at six months, with patients continuing to be followed for 36 months. To view Dr. Jaffe’s entire presentation, please visit the Company’s website at www.psivida.com, under ‘News and Events,’ and click on ‘Presentation and Publications.’ “The data presented today by Dr. Jaffe continues to reinforce pSivida’s proven technology and the depth of our innovation,” commented Nancy Lurker, President and Chief Executive Officer. “The results, both at six and 12 months, demonstrated significant prevention of recurrence of posterior segment uveitis, a devastating disease and the third leading cause of blindness. Our market research indicates strong interest in using the product, driven by the results of our first Phase 3 clinical trial demonstrating a significant, and durable, reduction in recurrence rates, improvements in BCVA and a favorable tolerability profile at 12 months. 
We continue to expect the first read-out of our second Phase 3 clinical trial of Durasert and submission of the European Marketing Authorization Application (MAA) by the end of June. We remain on track to also file a New Drug Application (NDA) with the FDA in the calendar fourth quarter of 2017.” Posterior segment uveitis is a chronic, non-infectious inflammatory disease affecting the posterior segment of the eye, often involving the retina, and is a leading cause of blindness in both developed and developing countries. It affects people of all ages, producing swelling and destroying eye tissues, which can lead to severe vision loss and blindness. In the U.S., posterior uveitis affects between 80,000 and 100,000 people. Patients with posterior uveitis are typically treated with systemic steroids, but over time frequently develop serious side effects that can limit effective dosing. Patients then often progress to steroid-sparing therapy with systemic immune suppressants or biologics, which themselves can have severe side effects, including an increased risk of cancer. pSivida Corp. (www.psivida.com), headquartered in Watertown, MA, is a leader in the development of sustained-release drug products for treating eye diseases. pSivida has developed three of only four FDA-approved sustained-release treatments for back-of-the-eye diseases. The most recent, ILUVIEN®, a micro-insert for diabetic macular edema licensed to Alimera Sciences, is currently sold directly in the U.S. and three EU countries. Retisert®, an implant for posterior uveitis, is licensed to and sold by Bausch & Lomb. pSivida's lead product candidate, the Durasert™ micro-insert for posterior segment uveitis, which the Company is developing independently, is currently in pivotal Phase 3 clinical trials. pSivida's pre-clinical development program is focused on using its core platform technology, Durasert™, to deliver drugs to treat wet age-related macular degeneration, glaucoma, osteoarthritis and other diseases. 
To learn more about pSivida please visit www.psivida.com and connect on Twitter, LinkedIn, Facebook and Google+. SAFE HARBOR STATEMENTS UNDER THE PRIVATE SECURITIES LITIGATION REFORM ACT OF 1995: Various statements made in this release are forward-looking, and are inherently subject to risks, uncertainties and potentially inaccurate assumptions. All statements that address activities, events or developments that we intend, expect or believe may occur in the future are forward-looking statements. Some of the factors that could cause actual results to differ materially from the anticipated results or other expectations expressed, anticipated or implied in our forward-looking statements include uncertainties with respect to: our ability to achieve profitable operations and access to needed capital; fluctuations in our operating results; further impairment of our intangible assets; successful commercialization of, and receipt of revenues from, ILUVIEN® for diabetic macular edema (“ILUVIEN”), which depends on Alimera’s ability to continue as a going concern and the effect of pricing and reimbursement decisions on sales of ILUVIEN; safety and efficacy results of the second Durasert three-year uveitis Phase 3 clinical trial and the number of clinical trials and data required for the Durasert three-year uveitis marketing approval applications in the U.S. and EU; our ability to file and the timing of filing and acceptance of the Durasert three-year uveitis marketing approval applications in the U.S. and EU; our ability to use data in a U.S. 
NDA from clinical trials outside the U.S.; maintenance of European orphan designation for Durasert three-year uveitis; our ability to successfully commercialize Durasert three-year uveitis, if approved; potential off-label sales of ILUVIEN for uveitis; consequences of fluocinolone acetonide side effects; potential declines in Retisert® royalties; our ability to develop Tethadur to successfully deliver large biologic molecules and develop products using it; efficacy and our future development of an implant to treat severe osteoarthritis; our ability to successfully develop product candidates, initiate and complete clinical trials and receive regulatory approvals; our ability to market and sell products; the success of current and future license agreements; termination or breach of current license agreements; our dependence on contract research organizations, vendors and investigators; effects of competition and other developments affecting sales of products; market acceptance of products; effects of guidelines, recommendations and studies; protection of intellectual property and avoiding intellectual property infringement; retention of key personnel; product liability; industry consolidation; compliance with environmental laws; manufacturing risks; risks and costs of international business operations; effects of the potential U.K. exit from the EU; legislative or regulatory changes; volatility of stock price; possible dilution; absence of dividends; and other factors described in our filings with the Securities and Exchange Commission. You should read and interpret any forward-looking statements in light of these risks. Should known or unknown risks materialize, or should underlying assumptions prove inaccurate, actual results could differ materially from past results and those anticipated, estimated or projected in the forward-looking statements. You should bear this in mind as you consider any forward-looking statements. 
Our forward-looking statements speak only as of the dates on which they are made. We do not undertake any obligation to publicly update or revise our forward-looking statements even if experience or future changes make it clear that any projected results expressed or implied in such statements will not be realized.


News Article | May 3, 2017
Site: www.eurekalert.org

COLUMBUS, Ohio - More than half of breast cancer patients (57 percent) undergoing mastectomy lack the necessary medical knowledge to make a high-quality decision about reconstructive surgery that aligns with their personal goals, suggesting a trend toward overtreatment, according to a new study conducted by researchers at The Ohio State University Comprehensive Cancer Center - Arthur G. James Cancer Hospital and Richard J. Solove Research Institute (OSUCCC - James). "High-quality" decisions were defined as those that demonstrated adequate medical knowledge of treatment choices - including associated risks - and that also matched the patient's specific goals and preferences for choosing whether or not to pursue reconstructive surgery. Researchers say shared decision-making tools are needed to help women make decisions based on a full understanding of treatment choices and associated risks alongside their personal goals for surgery. The researchers report the findings online first in the medical journal JAMA Surgery on May 3, 2017. In this observational, single-institution study, researchers sought to evaluate the quality of 126 adult breast cancer patients' decisions about breast reconstruction after mastectomy. All patients had stage I-III invasive ductal/lobular breast cancer or ductal carcinoma in situ (DCIS), or were having preventive mastectomies. The majority of patients (73 percent) had early-stage disease. Researchers measured study participants' medical knowledge about mastectomy and mastectomy with reconstruction -- for example, effects of surgery on appearance and associated risks. They also measured individual preferences of what mattered most to patients. Key preference factors included breast appearance/shape post-treatment, length of recovery time and risk of complications. "We found that less than half of the women had adequate medical knowledge about breast reconstruction and made a choice that aligned with their personal preferences. 
This is very concerning to us, because it means that some women did not get the treatment they truly preferred, and quite a few had more treatment than they preferred," says Clara Lee, MD, principal investigator of the study and a breast reconstructive surgeon at The OSUCCC - James. Lee holds a dual associate professor appointment in the colleges of medicine and public health at Ohio State. "Many women were quite concerned about complication risks, but they didn't actually know how high the risk was. This may explain some of the overtreatment that we saw," she adds. Researchers found that only 43 percent of the patients in the study demonstrated an understanding of at least half of the important facts about reconstruction and made a choice that was consistent with their preferences. Understanding of surgical complications was particularly low, with only 14 percent of patients demonstrating strong knowledge of associated risks. "As breast cancer providers, we need to talk about the pros and cons of surgery to help women make treatment choices. Shared decision-making between the surgeon and patient would be particularly useful for this decision. We need to connect patients with decision aids to help them really think through what is most important to them," Lee adds. Collaborators in this National Cancer Institute-funded study include Allison Deal, MD, and Ruth Huh, BA, of Lineberger Comprehensive Cancer Center at University of North Carolina Chapel Hill; Michael Pignone, MD, MPH, of University of Texas at Austin; and Peter Ubel, MD, of Duke University. "The interesting thing is that these findings are not unique to breast reconstruction," adds Pignone, study coauthor and chair of the Department of Internal Medicine at the Dell Medical School at The University of Texas at Austin. 
"In other places where we've looked at decision quality, we see gaps in patients' understanding of key information and poor alignment between the things they care most about and the treatments that they choose. It means that we need to do a much better job of providing decision support to patients, so that the care they get is, ultimately, the care they want." Lee and colleagues in Ohio State's colleges of engineering, communication and public health are working on a study to evaluate treatment decisions in early-stage breast cancer patients to assess how communication with their providers affects their decision-making. This ongoing study examines patients' knowledge, preferences, and expectations about future well-being. Information from this study is expected to help clinicians develop tools to aid patients in making an informed decision about their care. The Ohio State University Comprehensive Cancer Center - Arthur G. James Cancer Hospital and Richard J. Solove Research Institute strives to create a cancer-free world by integrating scientific research with excellence in education and patient-centered care, a strategy that leads to better methods of prevention, detection and treatment. Ohio State is one of only 47 National Cancer Institute (NCI)-designated Comprehensive Cancer Centers and one of only a few centers funded by the NCI to conduct both phase I and phase II clinical trials on novel anticancer drugs sponsored by the NCI. As the cancer program's 308-bed adult patient-care component, The James is one of the top cancer hospitals in the nation as ranked by U.S. News & World Report and has achieved Magnet designation, the highest honor an organization can receive for quality patient care and professional nursing practice. At 21 floors and with more than 1.1 million square feet, The James is a transformational facility that fosters collaboration and integration of cancer research and clinical cancer care. Learn more at cancer.osu.edu.


TEL-AVIV, Israel, April 20, 2017 (GLOBE NEWSWIRE) -- RedHill Biopharma Ltd. (NASDAQ:RDHL) (Tel-Aviv Stock Exchange:RDHL) (“RedHill” or the “Company”), a specialty biopharmaceutical company primarily focused on the development and commercialization of late clinical-stage, proprietary, orally-administered, small molecule drugs for gastrointestinal and inflammatory diseases and cancer, today announced the publication of an article describing the positive results from the Phase I clinical study with YELIVA® (ABC294640)1 in advanced solid tumors. The article2, entitled “A Phase I Study of ABC294640, a First-in-Class Sphingosine Kinase-2 Inhibitor, in Patients with Advanced Solid Tumors”, was authored by scientists from the Medical University of South Carolina (MUSC) Hollings Cancer Center and Apogee Biotechnology and was published in Clinical Cancer Research. The article is available online on the journal’s website3. YELIVA® is a Phase II-stage, proprietary, first-in-class, orally-administered sphingosine kinase-2 (SK2) selective inhibitor with anticancer and anti-inflammatory activities, targeting multiple oncology, inflammatory and gastrointestinal indications. By inhibiting the SK2 enzyme, YELIVA® blocks the synthesis of sphingosine 1-phosphate (S1P), a lipid signaling molecule that promotes cancer growth and pathological inflammation. The open-label, dose-escalation, pharmacokinetic (PK) and pharmacodynamic (PD) first-in-human Phase I study with YELIVA® treated 21 patients with advanced solid tumors, most of whom were gastrointestinal cancer patients, including patients with pancreatic and colorectal cancers and cholangiocarcinoma. The Phase I study was conducted at the MUSC Hollings Cancer Center and led by Principal Investigators Melanie Thomas, MD, and Carolyn Britten, MD. The primary objectives of the study were to identify the maximum tolerated dose (MTD) and the dose-limiting toxicities (DLTs) and to evaluate the safety of YELIVA®. 
The secondary objectives of the study were to determine the pharmacokinetic (PK) and pharmacodynamic (PD) properties of YELIVA® and to assess its antitumor activity. Final results from the Phase I study with YELIVA® in patients with advanced solid tumors confirmed that the study successfully met its primary and secondary endpoints, demonstrating that the drug is well-tolerated and can be safely administered to cancer patients. There was one partial response in a patient with cholangiocarcinoma and six patients had stable disease as their best response. The study included the first-ever longitudinal analyses of plasma S1P levels as a potential PD biomarker for activity of a sphingolipid-targeted drug. The administration of YELIVA® resulted in a rapid and pronounced decrease in S1P levels over the first 12 hours, with return to baseline at 24 hours, which is consistent with clearance of the drug. A Phase II study with YELIVA® for the treatment of advanced hepatocellular carcinoma (HCC) is ongoing at MUSC Hollings Cancer Center. The study is supported by a grant from the NCI, awarded to MUSC, which is intended to fund a broad range of studies on the feasibility of targeting sphingolipid metabolism for the treatment of a variety of solid tumor cancers, with additional support from RedHill. A Phase Ib/II study with YELIVA® for the treatment of refractory or relapsed multiple myeloma is ongoing at Duke University Medical Center. The study is supported by a $2 million grant from the NCI Small Business Innovation Research Program (SBIR) awarded to Apogee, in conjunction with Duke University, with additional support from RedHill. A Phase I/II clinical study evaluating YELIVA® in patients with refractory/relapsed diffuse large B-cell lymphoma and Kaposi sarcoma patients is ongoing at the Louisiana State University Health Sciences Center. The study is supported by a grant from the NCI awarded to Apogee, with additional support from RedHill. 
A Phase Ib study to evaluate YELIVA® as a radioprotectant for prevention of mucositis in head and neck cancer patients undergoing therapeutic radiotherapy is planned to be initiated in the third quarter of 2017. YELIVA® recently received FDA Orphan Drug designation for the treatment of cholangiocarcinoma. RedHill plans to initiate a Phase IIa clinical study with YELIVA® in patients with advanced, unresectable, intrahepatic and extrahepatic cholangiocarcinoma in the third quarter of 2017. A Phase II study to evaluate the efficacy of YELIVA® in patients with moderate to severe ulcerative colitis is planned to be initiated in the second half of 2017. About YELIVA® (ABC294640): YELIVA® (ABC294640) is a Phase II-stage, proprietary, first-in-class, orally-administered, sphingosine kinase-2 (SK2) selective inhibitor with anticancer and anti-inflammatory activities. RedHill is pursuing with YELIVA® multiple clinical programs in oncology, inflammatory and gastrointestinal indications. By inhibiting SK2, YELIVA® blocks the synthesis of sphingosine 1-phosphate (S1P), a lipid-signaling molecule that promotes cancer growth and pathological inflammation. SK2 is an innovative molecular target for anticancer therapy because of its critical role in catalyzing the formation of S1P, which is known to regulate cell proliferation and activation of inflammatory pathways. YELIVA® was originally developed by U.S.-based Apogee Biotechnology Corp. and completed multiple successful pre-clinical studies in oncology, inflammation, GI and radioprotection models, as well as the ABC-101 Phase I clinical study in cancer patients with advanced solid tumors. The Phase I study included the first-ever longitudinal analysis of plasma S1P levels as a potential pharmacodynamic (PD) biomarker for activity of a sphingolipid-targeted drug. The administration of YELIVA® resulted in a rapid and pronounced decrease in S1P levels, with several patients having prolonged stabilization of disease. 
YELIVA® received Orphan Drug designation from the U.S. FDA for the treatment of cholangiocarcinoma. The development of YELIVA® was funded to date primarily by grants and contracts from U.S. federal and state government agencies awarded to Apogee Biotechnology Corp., including the U.S. National Cancer Institute, the U.S. Department of Health and Human Services’ Biomedical Advanced Research and Development Authority (BARDA), the U.S. Department of Defense and the FDA Office of Orphan Products Development. About RedHill Biopharma Ltd.: RedHill Biopharma Ltd. (NASDAQ:RDHL) (Tel-Aviv Stock Exchange:RDHL) is a specialty biopharmaceutical company headquartered in Israel, primarily focused on the development and commercialization of late clinical-stage, proprietary, orally-administered, small molecule drugs for the treatment of gastrointestinal and inflammatory diseases and cancer. RedHill has a U.S. co-promotion agreement with Concordia for Donnatal®, a prescription oral adjunctive drug used in the treatment of IBS and acute enterocolitis, as well as an exclusive license agreement with Entera Health for EnteraGam®, a medical food intended for the dietary management, under medical supervision, of chronic diarrhea and loose stools. 
RedHill’s clinical-stage pipeline includes: (i) RHB-105 - an oral combination therapy for the treatment of Helicobacter pylori infection with successful results from a first Phase III study; (ii) RHB-104 - an oral combination therapy for the treatment of Crohn's disease with an ongoing first Phase III study, a completed proof-of-concept Phase IIa study for multiple sclerosis and QIDP status for nontuberculous mycobacteria (NTM) infections; (iii) BEKINDA® (RHB-102) - a once-daily oral pill formulation of ondansetron with an ongoing Phase III study for acute gastroenteritis and gastritis and an ongoing Phase II study for IBS-D; (iv) RHB-106 - an encapsulated bowel preparation licensed to Salix Pharmaceuticals, Ltd.; (v) YELIVA® (ABC294640) - a Phase II-stage, orally-administered, first-in-class SK2 selective inhibitor targeting multiple oncology, inflammatory and gastrointestinal indications; (vi) MESUPRON - a Phase II-stage, first-in-class, orally-administered protease inhibitor targeting pancreatic cancer and other solid tumors; and (vii) RIZAPORT® (RHB-103) - an oral thin film formulation of rizatriptan for acute migraines, with a U.S. NDA currently under discussion with the FDA and marketing authorization received in two EU member states under the European Decentralized Procedure (DCP). More information about the Company is available at: www.redhillbio.com. 1 YELIVA® is an investigational new drug, not available for commercial distribution. 2 The article was authored by Carolyn D. Britten, Melanie B. Thomas, Elizabeth Garrett-Mayer, Steven H. Chin, Keisuke Shirai, Besim Ogretmen, Tricia A. Bentz, Alan Brisendine, Kate Anderton, Susan L. Cusack, Lynn W. Maines, Yan Zhuang and Charles D. Smith. This press release contains “forward-looking statements” within the meaning of the Private Securities Litigation Reform Act of 1995. 
Such statements may be preceded by the words “intends,” “may,” “will,” “plans,” “expects,” “anticipates,” “projects,” “predicts,” “estimates,” “aims,” “believes,” “hopes,” “potential” or similar words. Forward-looking statements are based on certain assumptions and are subject to various known and unknown risks and uncertainties, many of which are beyond the Company’s control and cannot be predicted or quantified, and consequently actual results may differ materially from those expressed or implied by such forward-looking statements. Such risks and uncertainties include, without limitation, risks and uncertainties associated with (i) the initiation, timing, progress and results of the Company’s research, manufacturing, preclinical studies, clinical trials, and other therapeutic candidate development efforts; (ii) the Company’s ability to advance its therapeutic candidates into clinical trials or to successfully complete its preclinical studies or clinical trials; (iii) the extent and number of additional studies that the Company may be required to conduct and the Company’s receipt of regulatory approvals for its therapeutic candidates, and the timing of other regulatory filings, approvals and feedback; (iv) the manufacturing, clinical development, commercialization, and market acceptance of the Company’s therapeutic candidates; (v) the Company’s ability to successfully market Donnatal® and EnteraGam®; (vi) the Company’s ability to establish and maintain corporate collaborations; (vii) the Company's ability to acquire products approved for marketing in the U.S. 
that achieve commercial success and build its own marketing and commercialization capabilities; (viii) the interpretation of the properties and characteristics of the Company’s therapeutic candidates and of the results obtained with its therapeutic candidates in research, preclinical studies or clinical trials; (ix) the implementation of the Company’s business model and strategic plans for its business and therapeutic candidates; (x) the scope of protection the Company is able to establish and maintain for intellectual property rights covering its therapeutic candidates and its ability to operate its business without infringing the intellectual property rights of others; (xi) parties from whom the Company licenses its intellectual property defaulting in their obligations to the Company; (xii) estimates of the Company’s expenses, future revenues, capital requirements and the Company’s need for additional financing; and (xiii) competitive companies and technologies within the Company’s industry. More detailed information about the Company and the risk factors that may affect the realization of forward-looking statements is set forth in the Company's filings with the Securities and Exchange Commission (SEC), including the Company's Annual Report on Form 20-F filed with the SEC on February 23, 2017. All forward-looking statements included in this Press Release are made only as of the date of this Press Release. We assume no obligation to update any written or oral forward-looking statement unless required by law.


News Article | April 17, 2017
Site: www.techrepublic.com

Self-driving cars, drones, robots, gene editing—science fiction obsessions that have triggered many fears—have come to fruition faster than many predicted. While these emerging technologies have the potential to make our lives healthier, safer, and easier, the flip side is more grim: eugenics, joblessness, privacy loss, and worsening economic inequality. In the book The Driver in the Driverless Car: How Our Technology Choices Will Create the Future, out this week, Vivek Wadhwa, a distinguished fellow at Carnegie Mellon University's College of Engineering and a director of research at Duke University's Pratt School of Engineering, explores the risks and rewards of our new technology, and how our choices will determine whether our future errs on the side of Star Trek or Mad Max. The book began as a general look at the future and what could be possible with emerging technologies. But in the last two years, "I started getting more and more worried about the downsides of technology—the industry destruction it's causing, and the risks, dangers, and policy issues," Wadhwa told TechRepublic. "I was shocked at how fast it was happening." As evidenced by the election of US President Donald Trump, "the gap between the haves and the have-nots is widening," Wadhwa said. "If we continue along the path we are on, we're going to create the dystopia of Mad Max. It's that dire." Many people are unaware of how rapidly technology is advancing, Wadhwa said. Take AI, which in the book Wadhwa refers to as "both the most important breakthrough in modern computing and the most dangerous technology ever created by man." "We need AI to make intelligent decisions for us, to manage the massive amounts of data being gathered, and to give us better health—all the good," Wadhwa said. 
"The bad is when you look at the latest generations of machine learning, the creators have no clue how these things are making the decisions they are making." Privacy is another concern that many consumers are not paying enough attention to, Wadhwa said, and will soon become a thing of the past. He points to Internet of Things (IoT) devices that are constantly listening and learning about their human users, and even interacting with each other. "It isn't science fiction," Wadhwa said. "It's all happening as we speak." Technology offers the potential to solve the greatest challenges facing humanity and to give us a science-fiction utopian future, with "unlimited food, energy, and education, so life is not about making money, but about knowledge, enlightenment, sharing, and reaching for the stars," Wadhwa said. "That future is as close as 30 years from now. It's within our reach and lifetimes. But the Mad Max future is coming sooner than I expected." Wadhwa outlines three questions about any emerging technology to determine whether it will lead us to utopia or dystopia: 1. Does it have the potential to benefit everyone equally? When considering this question, Wadhwa points to AI physicians. Currently, the rich have better access to healthcare than the poor. With the rise of digital doctors, healthcare would be more readily available to everyone, just as smartphones are. This is opposed to something like gene editing, which only the rich would have access to. "If only the rich have it, it creates dystopia," Wadhwa said. "We need to make sure we share the society we're creating." 2. What are its risks and rewards? This question involves weighing all potential risks and rewards of a new technology, Wadhwa said. For example, consider IoT: Do the rewards of having a refrigerator that can tell what foods you need to buy outweigh the privacy risks? The same should be considered for gene editing, as mentioned above. 3. Does it promote autonomy or dependence?
Though some argue that many people are now dependent on smartphones, the fact remains that ten years ago, they did not exist, Wadhwa said, and we still have the ability to turn our phones off and go about our lives. He considers this question for self-driving cars: If these vehicles become the norm, humans likely would not be allowed to drive anymore, and would become dependent upon them for transportation. However, they would allow for autonomy as well, in terms of being able to travel anywhere for a low cost, no matter what age or disabilities a person may have. "Everyone gains autonomy from self driving cars, while we become dependent on them," Wadhwa said. How can we avoid the path to dystopia? "By learning. By deciding. By speaking up," Wadhwa said. "Each of us has a say. Your voice is as important as my voice." Our individual choices around technology matter, Wadhwa argues. He points to the recent controversies surrounding Uber, and how users chose to delete the app from their phones. People working in the tech industry must consider the impact of their innovations on the world at large, Wadhwa said. "In the tech industry, we have blinders on," he said. "We have to start taking responsibility for the dystopia we're creating."


News Article | April 26, 2017
Site: phys.org

In a study published April 26, 2017 in Biology Letters, researchers have identified genes that enable the fish to perform this extraordinary homing feat with help from Earth's magnetic field. Generated by the flow of molten metal in its core, the Earth's magnetic field ranges from a mere 25 microteslas near the equator to 65 microteslas toward the poles—making it more than a hundred times weaker than a refrigerator magnet. Diverse animal species can detect such weak magnetic fields and use them to navigate. First identified in birds in the 1960s, this sense, called magnetoreception, has since been documented in animals ranging from bees and salamanders to sea turtles. But despite more than half a century of research, the underlying molecular and cellular machinery remains a mystery. To work out the genetic basis, Duke University postdoctoral associate Bob Fitak and biology professor Sönke Johnsen and colleagues investigated changes in gene expression that take place across the rainbow trout genome when the animal's magnetic sense is disrupted. In a basement aquarium on the Duke campus, they randomly scooped up one fish at a time from a tank into a small holding container, and placed the container inside a coil of wire. The coil was connected to a capacitor, which discharged an electric current to create a split-second magnetic pulse inside the coil, about 10 times weaker than the magnetic field generated by an MRI machine in a hospital. Next the researchers sequenced all the gene readouts, or RNA transcripts, present in the brains of 10 treated fish and 10 controls to find out which genes were switched on and off in response to the magnetic pulse. Disrupting the fish's internal compass with the magnetic pulse triggered changes in 181 out of the roughly 40,000 genes they examined. Notably, the brains of treated fish showed increased expression of genes involved in making ferritin, a protein that stores and transports iron inside cells. 
Treated fish also showed changes in genes involved in the development of the optic nerve. "The results suggest that the detection system is based on iron that may be connected with or inside the eyes," Johnsen said. The findings are consistent with the idea, first proposed nearly 40 years ago, that animals have tiny magnetic particles of an iron-containing compound called magnetite in their bodies. The magnetite particles are thought to act like microscopic compass needles, relaying information to the nervous system by straining or twisting receptors in cells as they attempt to align with the Earth's magnetic field. "You can think of them as mini magnets that the body's cells can sense," Fitak said. Magnetite has been found in the beaks of birds, the brains of sea turtles, the tummies of honeybees, and the nasal passages of rainbow trout. Other studies have even found minuscule amounts of magnetite in the human brain, but recent research suggests most of it comes from air pollution rather than occurring naturally, and it's unclear whether they give humans a subconscious magnetic sense. The researchers suspect the iron-binding ferritin protein may be involved in repair when the fish's magnetite-based compass is disrupted or damaged. Next they plan to do similar experiments with other tissues, such as the retina, and additional species that live in the ocean but travel to their freshwater hatching grounds each spring to spawn, such as American shad. "Scientists don't know what proteins might be involved in magnetite-based magnetoreception, but now we have some candidate genes to work with," Fitak said. Explore further: Researchers find cells that move in response to Earth's magnetic field More information: Robert R. Fitak et al, Candidate genes mediating magnetoreception in rainbow trout, Biology Letters (2017). DOI: 10.1098/rsbl.2017.0142
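At its core, the analysis described above compares each gene's expression level in treated fish against controls and flags the genes that shift. A minimal sketch of that comparison is below; the gene names, counts, and the simple fold-change cutoff are all invented for illustration (the actual study sequenced RNA transcripts across roughly 40,000 genes from 10 fish per group and used proper differential-expression statistics):

```python
import statistics

# Toy expression table: gene -> (treated counts, control counts).
# Names and numbers are hypothetical; ferritin and optic-nerve genes
# stand in for the gene classes the study reported as changed.
expression = {
    "ferritin_heavy": ([220, 240, 210, 250], [100, 95, 110, 105]),
    "optic_dev_1":    ([30, 25, 35, 28],     [60, 64, 58, 62]),
    "housekeeping":   ([500, 510, 490, 505], [495, 505, 500, 498]),
}

def differentially_expressed(treated, control, min_fold=1.5):
    """Flag a gene whose mean expression shifts by >= min_fold up or down."""
    fold = statistics.mean(treated) / statistics.mean(control)
    return fold >= min_fold or fold <= 1 / min_fold

hits = [gene for gene, (treated, control) in expression.items()
        if differentially_expressed(treated, control)]
print(hits)  # → ['ferritin_heavy', 'optic_dev_1']
```

The housekeeping gene is unchanged and drops out, leaving only the up-regulated ferritin gene and the down-regulated optic-nerve gene, mirroring the 181-of-40,000 filtering step the researchers describe.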


News Article | April 26, 2017
Site: www.eurekalert.org

DURHAM, N.C. -- In the spring when water temperatures start to rise, rainbow trout that have spent several years at sea traveling hundreds of miles from home manage, without maps or GPS, to find their way back to the rivers and streams where they were born for spawning. In a study published April 26, 2017 in Biology Letters, researchers have identified genes that enable the fish to perform this extraordinary homing feat with help from Earth's magnetic field. Generated by the flow of molten metal in its core, the Earth's magnetic field ranges from a mere 25 microteslas near the equator to 65 microteslas toward the poles -- making it more than a hundred times weaker than a refrigerator magnet. Diverse animal species can detect such weak magnetic fields and use them to navigate. First identified in birds in the 1960s, this sense, called magnetoreception, has since been documented in animals ranging from bees and salamanders to sea turtles. But despite more than half a century of research, the underlying molecular and cellular machinery remains a mystery. To work out the genetic basis, Duke University postdoctoral associate Bob Fitak and biology professor Sönke Johnsen and colleagues investigated changes in gene expression that take place across the rainbow trout genome when the animal's magnetic sense is disrupted. In a basement aquarium on the Duke campus, they randomly scooped up one fish at a time from a tank into a small holding container, and placed the container inside a coil of wire. The coil was connected to a capacitor, which discharged an electric current to create a split-second magnetic pulse inside the coil, about 10 times weaker than the magnetic field generated by an MRI machine in a hospital. Next the researchers sequenced all the gene readouts, or RNA transcripts, present in the brains of 10 treated fish and 10 controls to find out which genes were switched on and off in response to the magnetic pulse. 
Disrupting the fish's internal compass with the magnetic pulse triggered changes in 181 out of the roughly 40,000 genes they examined. Notably, the brains of treated fish showed increased expression of genes involved in making ferritin, a protein that stores and transports iron inside cells. Treated fish also showed changes in genes involved in the development of the optic nerve. "The results suggest that the detection system is based on iron that may be connected with or inside the eyes," Johnsen said. The findings are consistent with the idea, first proposed nearly 40 years ago, that animals have tiny magnetic particles of an iron-containing compound called magnetite in their bodies. The magnetite particles are thought to act like microscopic compass needles, relaying information to the nervous system by straining or twisting receptors in cells as they attempt to align with the Earth's magnetic field. "You can think of them as mini magnets that the body's cells can sense," Fitak said. Magnetite has been found in the beaks of birds, the brains of sea turtles, the tummies of honeybees, and the nasal passages of rainbow trout. Other studies have even found minuscule amounts of magnetite in the human brain, but recent research suggests most of it comes from air pollution rather than occurring naturally, and it's unclear whether they give humans a subconscious magnetic sense. The researchers suspect the iron-binding ferritin protein may be involved in repair when the fish's magnetite-based compass is disrupted or damaged. Next they plan to do similar experiments with other tissues, such as the retina, and additional species that live in the ocean but travel to their freshwater hatching grounds each spring to spawn, such as American shad. "Scientists don't know what proteins might be involved in magnetite-based magnetoreception, but now we have some candidate genes to work with," Fitak said. 
Other authors include Benjamin Wheeler of Duke, and David Ernst and Kenneth Lohmann of the University of North Carolina, Chapel Hill. This research was supported by the Air Force Office of Scientific Research (FA9550-14-1-0208).


NEW YORK, NY--(Marketwired - Apr 10, 2017) - On April 10, 2017, Daxor Corporation (NYSE MKT: DXR) received a letter from the NYSE MKT LLC ("the Exchange") noting that it is not in compliance with the listing standards pertaining to the timing of SEC filings as set forth in Sections 134 and 1101 of the NYSE MKT Company Guide ("the Company Guide") because of a delay in filing its N-CSR for the fiscal year ended December 31, 2016. "The Company primarily attributes the delay in filing the N-CSR to new personnel in the Controller's office, the death of our Chairman Dr. Joseph Feldschuh in January of this year, and the need for the newly-retained auditing firm to review the company's books and records. We are working diligently with our independent auditors to provide all the necessary information, including finalization of all adjustments and supporting analyses, so they can complete the audit of financial statements for the fiscal year ended December 31, 2016. The Company has no disagreements with our auditors, and we expect and intend to complete the filing of the N-CSR on or before April 30th, which we expect to be in advance of the period allowed by the NYSE, bringing the firm into compliance," said Eric P. Coleman, Daxor's Chief Financial Officer. "Management is confident that issues which delayed this important report have been solved and that we will file in an efficient and timely fashion going forward," stated Daxor's CEO Michael Feldschuh. To maintain its listing, the Company must submit either its N-CSR or a plan of compliance by April 28, 2017. In other news, the company is pleased to note that the Uniformed Services University of Health Sciences has been awarded a grant to study hemodilution utilizing Daxor's BVA-100 device in collaboration with Duke University which is anticipated to start May 1, 2017.
The grant notes that Daxor's method is recognized as the gold standard of accuracy and calls for an exploration of fluid hemodilution metrics using the company's technology as a benchmark. "This study recognizes the acute need for accurate pre- and post-operative volume assessment, and we anticipate it will further reinforce the importance of Daxor's unique technology," noted Michael Feldschuh. Cautionary Note Regarding Forward-Looking Statements: This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. All statements other than statements of historical fact contained in this press release are forward-looking statements. In some cases forward-looking statements can be identified by terminology such as "anticipate," "believe," "can," "continue," "could," "estimate," "expect," "intend," "may," "plan," "potential," "predict," "project," "should," or "will" or the negative of these terms or other comparable terminology and include statements regarding the expected timing of the filing of the N-CSR. These forward-looking statements are based on management's expectations and assumptions as of the date of this press release and are subject to a number of risks and uncertainties, many of which are difficult to predict, that could cause actual results to differ materially from those set forth or implied by any forward-looking statements. The information in this release is provided only as of the date of this release, and we undertake no obligation to update any forward-looking statements contained in this release on account of new information, future events, or otherwise, except as required by law.


News Article | April 17, 2017
Site: www.eurekalert.org

They are shy and elusive. They are tinier than a dolphin. And they are disappearing fast. Despite heroic efforts, vaquita porpoises are dying at astounding rates in illegal fishing nets in their limited habitat in the northwestern corner of the Gulf of California. Last week, two more vaquitas were found dead. Fewer than 30 vaquitas are believed to be alive today, making them the most endangered marine mammal in the world. But there is reason to hope. An unusual, diverse, international coalition of partners called VaquitaCPR and led by the Mexican government has worked feverishly to develop a bold, first-ever emergency plan to rescue the vaquita and place them in a sanctuary until illegal fishing is ended and their habitat is cleared of deadly gillnets. This week, the Mexican government's Ministry of Environment and Natural Resources (SEMARNAT) announced a pledge of up to $3 million dollars to help launch the first critical phase of this emergency plan, including construction of a sea pen sanctuary. This is a significant financial commitment, but additional support from the public is vital to ensure the full implementation of this daring effort to recover a population that totaled 600 animals just 20 years ago. "The challenge is staggering," says The Marine Mammal Center's Executive Director Dr. Jeff Boehm, who is leading the coalition's fundraising efforts. "How we respond to this emergency reveals who we are as a society. It sets precedent. We are asking the public to step up and donate what they can today to match the Mexican government's generous funding. Additional donations are needed for veterinary care, staffing, and equipment and to ensure the program is not cut short because of lack of funds." The critical need for support from the public to help save the vaquita has been reinforced by a number of celebrities, who are asking their fans to help fund the project, according to Dr.
Cynthia Smith, Executive Director of the National Marine Mammal Foundation. The Foundation is one of the primary partners supporting VaquitaCPR. Dr. Smith thanks singer, songwriter, and actress Miley Cyrus, and actors Leonardo DiCaprio, Chris Hemsworth, and Carolyn Hennesy. "Public outreach and awareness is so essential to this project," said Dr. Smith. "When people understand the world is about to lose something dear, they will try to make a difference." The caring, compassion, and concern that prompted the development of the emergency plan to save the vaquita from extinction gained additional support today. The Association of Zoos and Aquariums (AZA) announced its members have committed their support through its Save Animals from Extinction (SAFE) program and pledged to raise additional funds for VaquitaCPR. In recent years, significant contributions have enabled efforts that have focused on assessing the population and educating the public about the devastating threat facing the endangered porpoise. The Mexican government has expended more than $100 million to date on these efforts and more. According to Debborah Luke, AZA's Senior Vice President for Conservation & Science, the AZA community has also contributed to vaquita conservation through its innovative SAFE program in the past five years. The illegal gillnets killing vaquita are used to catch another endangered species, the totoaba. The fish's dried swim bladders fetch huge sums of money in China and Hong Kong, where it is believed the bladders help maintain youthful-looking skin. "We are very grateful that both the Mexican government and AZA have pledged support and hope it will inspire others who share our determination to save the vaquita to donate," emphasized Dr. Lorenzo Rojas-Bracho, lead researcher and head of Mexico's International Committee for the Recovery of the Vaquita (CIRVA). "Does the public care enough to help save the most endangered marine mammal in the world? I think so. 
We can't stand by and watch this precious resource disappear. It will be challenging, but we must try." To support the rescue effort, learn more about the vaquita and for information about VaquitaCPR, visit VaquitaCPR.org VaquitaCPR is led by Mexico's Ministry of Environment and Natural Resources (SEMARNAT). 'VaquitaCPR' is dedicated to conserving, protecting, and helping this rare porpoise recover. The National Marine Mammal Foundation, The Marine Mammal Center, and the Chicago Zoological Society are primary partners in this extraordinary conservation effort. Key collaborators in Mexico include the International Committee for the Recovery of the Vaquita (CIRVA), the National Institute of Ecology and Climate Change (INECC), the Mexican Association of Habitats for the Interaction and Protection of Marine Mammals (AMHMAR), and Acuario Oceanico. Additional United States collaborators are Duke University and the Marine Mammal Commission, with NOAA Fisheries providing technical expertise. The Association of Zoos and Aquariums, Dolphin Quest, SeaWorld, Vancouver Aquarium, the International Marine Animal Trainer's Association and the Association of Marine Mammal Parks and Aquariums are offering support and expertise to the program and assisting with fundraising.


News Article | April 13, 2017
Site: www.rdmag.com

A new reconfigurable device that emits patterns of thermal infrared light in a fully controllable manner could one day make it possible to collect waste heat at infrared wavelengths and turn it into usable energy. The new technology could be used to improve thermophotovoltaics, a type of solar cell that uses infrared light, or heat, rather than the visible light absorbed by traditional solar cells. Scientists have been working to create thermophotovoltaics that are practical enough to harvest the heat energy found in hot areas, such as around furnaces and kilns used by the glass industry. They could also be used to turn heat coming from vehicle engines into energy to charge a car battery, for example. "Because the infrared energy emission, or intensity, is controllable, this new infrared emitter could provide a tailored way to collect and use energy from heat," said Willie J. Padilla of Duke University, North Carolina. "There is a great deal of interest in utilizing waste heat, and our technology could improve this process." The new device is based on metamaterials, synthetic materials that exhibit exotic properties not available from natural materials. Padilla and doctoral student Xinyu Liu used a metamaterial engineered to absorb and emit infrared wavelengths with very high efficiency. By combining it with the electronically controlled movement available from microelectromechanical systems (MEMS), the researchers created the first metamaterial device with infrared emission properties that can be quickly changed on a pixel-by-pixel basis. As reported in The Optical Society's journal for high impact research, Optica, the new infrared-emitting device consists of an 8 × 8 array of individually controllable pixels, each measuring 120 × 120 microns. They demonstrated the MEMS metamaterial device by creating a "D" that is visible with an infrared camera.
The researchers report that their infrared emitter can achieve a range of infrared intensities and can display patterns at speeds of up to 110 kHz, or more than 100,000 times per second. Scaling up the technology could allow it to be used to create dynamic infrared patterns for friend or foe identification during combat. In contrast to methods typically used to achieve variable infrared emission, the new technology emits tunable infrared energies without any change in temperature. Since the material is neither heated nor cooled, the device can be used at room temperature while other methods require high operating temperatures. Although experiments with natural materials have been successful at room temperature, they are limited to narrow infrared spectral ranges. "In addition to allowing room-temperature operation, using metamaterials makes it simple to scale throughout the infrared wavelength range and into the visible or lower frequencies," said Padilla. "This is because the device's properties are achieved by the geometry, not by the chemical nature of the constituent materials that we're using." The new reconfigurable infrared emitter consists of a movable top layer of patterned metallic metamaterial and a bottom metallic layer that remains stationary. The device absorbs infrared photons and emits them with high efficiency when the two layers are touching but emits less infrared energy when the two layers are apart. An applied voltage controls the movement of the top layer, and the amount of infrared energy emitted depends on the exact voltage applied. Using an infrared camera, the researchers demonstrated that they could dynamically modify the number of infrared photons coming off the surface of the MEMS metamaterial over a range of intensities equivalent to a temperature change of nearly 20 degrees Celsius.
The researchers say that they could modify the metamaterial patterns used in the top layer to create different colored infrared pixels that would each be tunable in intensity. This could allow the creation of infrared pixels that are similar to the RGB pixels used in a TV. They are now working to scale up the technology by making a device with more pixels -- as many as 128 × 128 -- and increasing the size of the pixels. "In principle, an approach similar to ours could be used to create many kinds of dynamic effects from reconfigurable metamaterials," said Padilla. "This could be used to achieve a dynamic infrared optical cloak or a negative refractive index in the infrared, for example."
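The device's control scheme, as described above, amounts to setting a drive voltage per pixel, which moves the top metamaterial layer and sets that pixel's infrared emission. The toy model below mimics that behavior; the 8 × 8 grid and the "D" demonstration come from the article, but the voltage range and the simple linear voltage-to-emission mapping are invented for illustration:

```python
# Toy model of an 8 x 8 individually addressable emitter array.
# In the real device, voltage pulls the movable metamaterial layer
# toward the fixed layer, and emission rises as the layers touch;
# the linear mapping and 0-10 V range here are assumptions.

def emission(voltage, v_max=10.0):
    """Map a drive voltage (0..v_max) to a normalized IR intensity, clamped to [0, 1]."""
    return max(0.0, min(1.0, voltage / v_max))

# Drive pattern spelling a "D", like the demonstration imaged with an IR camera.
d_pattern = [
    "XXXXXX..",
    "X.....X.",
    "X......X",
    "X......X",
    "X......X",
    "X......X",
    "X.....X.",
    "XXXXXX..",
]
voltages = [[10.0 if ch == "X" else 0.0 for ch in row] for row in d_pattern]
frame = [[emission(v) for v in row] for row in voltages]

# Render the emitted frame as the IR camera would see it.
for row in frame:
    print("".join("#" if e > 0.5 else "." for e in row))
```

Per-pixel intensity tuning (rather than the on/off drive shown here) would simply mean choosing intermediate voltages, which is what lets the device display a continuous range of infrared intensities.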



News Article | April 28, 2017
Site: www.businesswire.com

NEW YORK--(BUSINESS WIRE)--FOX News Channel (FNC) will debut a one-hour political talk show entitled The FOX News Specialists on May 1, 2017, announced Suzanne Scott, Executive Vice President of Programming. Co-hosted by FNC’s Eric Bolling, Katherine Timpf and Eboni K. Williams, the program will air weeknights live at 5PM/ET from New York City. Each night, The FOX News Specialists will feature Bolling, Timpf and Williams alongside two special guest experts to discuss the top stories driving the headlines of the day. The daily “specialists” will join the co-hosts to provide unique and unpredictable analysis on the stories trending in America. In making the announcement, Scott said, “Eric, Katherine and Eboni’s diverse opinions and backgrounds will provide our audience with an hour of informative and entertaining analysis on daily stories that are most important to Americans. The combination of the co-hosts’ expertise in business, millennial and legal topics, respectively, will make for lively and compelling discourse.” During his tenure at FNC, Bolling has served as the co-host of FNC’s top-rated ensemble program The Five (weekdays 5-6PM/ET) and continues to host Cashin' In (Saturdays 11:30AM-12PM/ET). Previously, he served as host of FOX Business Network's (FBN) Follow the Money, Happy Hour and FOXNews.com's Strategy Room. Prior to joining FBN, Bolling helped launch and was an original panelist on CNBC's Fast Money. Before embarking on a television career in business news, Bolling was an independent trader based out of the New York Mercantile Exchange (NYMEX) and also served on the NYMEX's Board of Directors for five years. A successful author, his first book “Wake Up America” became an instant New York Times best seller. Bolling is a graduate of Rollins College in Winter Park, FL and was awarded a fellowship to Duke University's School of Public Policy.
Timpf joined FNC in 2015 as a contributor and provides political commentary regularly to The Greg Gutfeld Show (Sundays at 10PM/ET) and across daytime and primetime programming. She is also a writer for National Review and the host of the podcast The Kat Timpf Show, which airs every Monday on Barstool Sports. Timpf has contributed to several publications including The Orange County Register and Investor’s Business Daily. Previously, she worked as a reporter for Campus Reform in Arlington, VA, and was a digital editor for The Washington Times. Timpf also served as a producer and reporter for Total Traffic Network in Santa Ana, CA. A 2012-13 Robert Novak Fellow, she graduated from Hillsdale College with a B.A. in English. Williams also joined FNC in 2015 and currently serves as a contributor, occasionally appearing on Outnumbered (weekdays, 12PM/ET) and The Five (weekdays, 5PM/ET). Prior to joining FNC, Williams served as a CBS News correspondent, HLN contributor and talk radio host for Los Angeles’ KFI AM640. She began her professional career in Louisiana in the wake of Hurricane Katrina where she clerked for the Louisiana Secretary of State and the Louisiana Attorney General’s Office as a law student. Williams also worked for various politicians, including New Orleans city council members. Throughout North Carolina and the Greater Los Angeles Area, she has served as both a public defender and in private practice, specializing in family law, criminal and civil litigation. Williams received a B.A. in communications and African-American Studies from the University of North Carolina at Chapel Hill and a J.D. from Loyola University New Orleans College of Law. FOX News Channel (FNC) is a 24-hour all-encompassing news service dedicated to delivering breaking news as well as political and business news. 
The number one network in cable, FNC has been the most watched television news channel for 15 years and according to a Suffolk University/USA Today poll, is the most trusted television news source in the country. Owned by 21st Century Fox, FNC is available in 90 million homes and dominates the cable news landscape, routinely notching the top ten programs in the genre.


Alphabet’s Google division is, fundamentally, in the business of selling data. That is a useful thing to keep in mind when Alphabet’s Verily comes calling for your medical data. But Google is also inarguably useful; this is why, despite knowing that my every move is being tracked by the company, I still make use of Google search, Gmail, and Google Docs, among its other myriad services. Verily’s Project Baseline is, in some sense, the health equivalent of those kinds of services — it has the potential to greatly expand our knowledge about what human health looks like. Not incidentally, the project will be of service to Verily as well. In 2014, Verily — then a division of Google X — announced the Baseline Project, a collaboration with Duke University and Stanford University to try to get a sense of what a “normal” human looks like. Today, the group announced it will begin enrolling 10,000 healthy people, following a pilot in about 200 people that began in 2014. Over the course of four years, researchers will collect genetic data, blood samples, medical images, and other information from the study participants. That “other information” might include environmental data, as well as responses to phone surveys, and data from sensors in the Study Watch, a sensor-packed smartwatch announced last week. The studies are starting in the San Francisco Bay Area and North Carolina, though the scientists behind the effort hope to expand the areas surveyed. And because the program is meant to be nationally-representative, recruitment may be a little slow. When it’s over, there will be a database of anonymized data that plenty of researchers — including those from the pharmaceutical industry — will have access to. This style of study isn’t unprecedented; in fact, it’s been a feature of medical discovery for quite some time. 
The most famous example is the Framingham Heart Study, which began in 1948 with about 5,000 patients. At the time, doctors didn’t know much about heart attack and stroke, except that they were common and often deadly. So the Framingham study was devised in order to follow people ages 30 to 62 from the town of Framingham, Massachusetts for years and see if there were clues to those ailments. In 1948, and every two years afterwards, the study participants checked in. A second generation was added in 1971, and a third in 2002. Over the course of decades, Framingham has provided clues to most major cardiovascular risk factors: high blood pressure, high cholesterol, smoking, obesity, diabetes, and a sedentary lifestyle, among others. Framingham alone wasn’t enough to identify all these contributors, of course — but it told other scientists where to look. Something like 1,200 articles have been published in academic health journals over the last 50 years on Framingham alone. Project Baseline is twice the size of the original Framingham study population and is attempting more comprehensive measurement. And unlike Framingham, which was funded primarily by the National Heart, Lung and Blood Institute, this study is funded by Verily. Government spending on science has stagnated over the last decade, and Framingham has been among its casualties; the study lost 40 percent of its funding in 2013 as a result of the budget sequester. In fact, it would be a lot harder to get a publicly-funded study like Framingham off the ground today — both because it’s expensive and because it’s hard to predict what studies like this will find. Verily’s Project Baseline, then, is a mightily ambitious piece of basic science, and one that could prove useful. 
Advisory board member Adrian Hernandez, a cardiologist and professor of medicine at Duke, says the Project Baseline group is “aiming to build an early discovery platform.” It’s possible subtle changes occur in some areas — biomarkers, behavior, anything really — before a disease takes hold, Hernandez notes. Discovering what those changes are may lead to earlier and better treatment. Beyond those broad brushstrokes, it’s a bit difficult to say what a study like this is for until well after the fact, points out Stanford’s Sam Gambhir, who also sits on the advisory board. With a cohort study like this one — or like Framingham — it’s impossible to know the medical impact until well afterward. It is, however, possible to take an educated guess at what Verily gains by running the study. The point of a publicly-traded company is to make money for its shareholders, and Verily is owned by Alphabet, which is a publicly-traded company. So Verily stands to make money — the question is how. Verily, according to Alphabet’s investor documents, sells R&D services and licenses. This is worth keeping in mind; the data generated by Project Baseline will be shared with “qualified researchers.” Duke and Stanford are the two obvious places where the data will be shared, and the researchers there are likely to use Project Baseline data just as they’d use other data. When I asked about the possibility for pharmaceutical companies to access this data, Jessica Mega, the chief research officer at Verily, got squirrely. 
Here’s her initial reply: “There's a scientific executive committee that will review every request and the composition includes individuals with leadership from Duke and Stanford, so it would need to be in line with the overall mission of the study.” I asked again whether someone who worked at a company like Pfizer would be able to gain access to the data, and Mega said, “Yes, as long as the intent is to try to improve medical discovery.” Verily hasn’t yet figured out what “access” to the data will look like, I was told, but “the philosophy of the study is to make this information broadly available to qualified researchers.” Verily declined to comment on the possibility of fees for accessing its data. Many companies sell their large databases of customers’ information for discovery research. 23andMe, for instance, sells de-identified data from its genetic tests to researchers (such as Stanford), drug companies (Genentech, a subsidiary of Roche), and other entities (including the Michael J. Fox Foundation, which does Parkinson’s research). In fact, the data is 23andMe’s moneymaker, not the tests it sells. Ancestry LLC, which also sells genetic test kits, does the same. It stands to reason Project Baseline could create similar revenue opportunities for Verily. There is also the matter of the Study Watch, which provides Verily with a number of opportunities. The most obvious is a consumer version of the Study Watch, though Mega says the company doesn’t currently plan to make the watch commercially available. The watch could also be used as a platform other researchers license in order to gather fairly continuous data. As it happens, the Study Watch is being used in the Personalized Parkinson’s Project — another Verily study, taking place in the Netherlands. That trial and Project Baseline may serve as trial runs for future research uses. 
Those two studies may also provide some sense of how reliable the data from the watch is. (The Study Watch wasn’t used in the Project Baseline initial pilot run of about 200 patients.) Most wearables don’t require FDA approval, as long as they are marketed as “wellness devices.” But any kind of specific medical claim would probably require the regulators’ sign-off. Verily declined to comment on any regulatory plans for the watch. At this stage, it’s too early to speculate about what public good might come from Project Baseline, though it does seem likely there will be advantages to the work. It’s even possible that Project Baseline will be as useful to research as Framingham was 50 years ago. It may provide — as Framingham did before it — clues to the roots of common ailments. There may even be new drugs developed out of it. One thing is nearly certain, though: Project Baseline is meant to benefit Verily, too.


News Article | April 27, 2017
Site: www.eurekalert.org

Using new gene-editing technology, researchers have rewired mouse stem cells to fight inflammation caused by arthritis and other chronic conditions. Such stem cells, known as SMART cells (Stem cells Modified for Autonomous Regenerative Therapy), develop into cartilage cells that produce a biologic anti-inflammatory drug that, ideally, will replace arthritic cartilage and simultaneously protect joints and other tissues from damage that occurs with chronic inflammation. The cells were developed at Washington University School of Medicine in St. Louis and Shriners Hospitals for Children-St. Louis, in collaboration with investigators at Duke University and Cytex Therapeutics Inc., both in Durham, N.C. The researchers initially worked with skin cells taken from the tails of mice and converted those cells into stem cells. Then, using the gene-editing tool CRISPR in cells grown in culture, they removed a key gene in the inflammatory process and replaced it with a gene that releases a biologic drug that combats inflammation. The research is available online April 27 in the journal Stem Cell Reports. "Our goal is to package the rewired stem cells as a vaccine for arthritis, which would deliver an anti-inflammatory drug to an arthritic joint but only when it is needed," said Farshid Guilak, PhD, the paper's senior author and a professor of orthopedic surgery at Washington University School of Medicine. "To do this, we needed to create a 'smart' cell." Many current drugs used to treat arthritis -- including Enbrel, Humira and Remicade -- attack an inflammation-promoting molecule called tumor necrosis factor-alpha (TNF-alpha). But the problem with these drugs is that they are given systemically rather than targeted to joints. As a result, they interfere with the immune system throughout the body and can make patients susceptible to side effects such as infections. 
"We want to use our gene-editing technology as a way to deliver targeted therapy in response to localized inflammation in a joint, as opposed to current drug therapies that can interfere with the inflammatory response through the entire body," said Guilak, also a professor of developmental biology and of biomedical engineering and co-director of Washington University's Center of Regenerative Medicine. "If this strategy proves to be successful, the engineered cells only would block inflammation when inflammatory signals are released, such as during an arthritic flare in that joint." As part of the study, Guilak and his colleagues grew mouse stem cells in a test tube and then used CRISPR technology to replace a critical mediator of inflammation with a TNF-alpha inhibitor. "Exploiting tools from synthetic biology, we found we could re-code the program that stem cells use to orchestrate their response to inflammation," said Jonathan Brunger, PhD, the paper's first author and a postdoctoral fellow in cellular and molecular pharmacology at the University of California, San Francisco. Over the course of a few days, the team directed the modified stem cells to grow into cartilage cells and produce cartilage tissue. Further experiments by the team showed that the engineered cartilage was protected from inflammation. "We hijacked an inflammatory pathway to create cells that produced a protective drug," Brunger said. The researchers also encoded the stem/cartilage cells with genes that made the cells light up when responding to inflammation, so the scientists easily could determine when the cells were responding. Recently, Guilak's team has begun testing the engineered stem cells in mouse models of rheumatoid arthritis and other inflammatory diseases. 
If the work can be replicated in animals and then developed into a clinical therapy, the engineered cells or cartilage grown from stem cells would respond to inflammation by releasing a biologic drug -- the TNF-alpha inhibitor -- that would protect the synthetic cartilage cells that Guilak's team created and the natural cartilage cells in specific joints. "When these cells see TNF-alpha, they rapidly activate a therapy that reduces inflammation," Guilak explained. "We believe this strategy also may work for other systems that depend on a feedback loop. In diabetes, for example, it's possible we could make stem cells that would sense glucose and turn on insulin in response. We are using pluripotent stem cells, so we can make them into any cell type, and with CRISPR, we can remove or insert genes that have the potential to treat many types of disorders." With an eye toward further applications of this approach, Brunger added, "The ability to build living tissues from 'smart' stem cells that precisely respond to their environment opens up exciting possibilities for investigation in regenerative medicine." Brunger JM, Zutshi A, Willard VP, Gersbach CA, Guilak F. Genome engineering of stem cells for autonomously regulated, closed-loop delivery of biologic drugs. Stem Cell Reports. April 27, 2017. This work was supported by the National Institute of Arthritis and Musculoskeletal and Skin Diseases and the National Institute on Aging of the National Institutes of Health (NIH), grant numbers AR061042, AR50245, AR46652, AR48182, AR067467, AR065956, AG15768, OD008586. Additional funding provided by the Nancy Taylor Foundation for Chronic Diseases; the Arthritis Foundation; the National Science Foundation (NSF), CAREER award number CBET-1151035; and the Collaborative Research Center of the AO Foundation, Davos, Switzerland. Authors Farshid Guilak and Vincent Willard have a financial interest in Cytex Therapeutics of Durham, N.C., which may choose to license this technology. 
Cytex is a startup founded by some of the investigators. They could realize financial gain if the technology eventually is approved for clinical use. Washington University School of Medicine's 2,100 employed and volunteer faculty physicians also are the medical staff of Barnes-Jewish and St. Louis Children's hospitals. The School of Medicine is one of the leading medical research, teaching and patient-care institutions in the nation, currently ranked seventh in the nation by U.S. News & World Report. Through its affiliations with Barnes-Jewish and St. Louis Children's hospitals, the School of Medicine is linked to BJC HealthCare.


News Article | May 8, 2017
Site: www.materialstoday.com

A cartilage-mimicking material created by researchers at Duke University may one day allow surgeons to 3D print replacement knee parts that are custom-shaped to each patient's anatomy. Human knees come with a pair of built-in shock absorbers called the menisci. These ear-shaped hunks of cartilage, nestled between the thigh and shin bones, cushion every step we take. But a lifetime of wear-and-tear – or a single wrong step during a game of soccer or tennis – can permanently damage these key supports, leading to pain and an increased risk of developing arthritis. The novel hydrogel-based material developed by the Duke researchers is the first to match human cartilage in strength and elasticity, while also remaining 3D-printable and stable inside the body. To demonstrate how it might work, the researchers used a $300 3D printer to create custom menisci for a plastic model of a knee. "We've made it very easy now for anyone to print something that is pretty close in its mechanical properties to cartilage, in a relatively simple and inexpensive process," said Benjamin Wiley, an associate professor of chemistry at Duke and author of a paper on this work in ACS Biomaterials Science and Engineering. After we reach adulthood, the meniscus has limited ability to heal on its own. Surgeons can attempt to repair a torn or damaged meniscus, but often it must be partially or completely removed. Available implants either do not match the strength and elasticity of the original cartilage, or are not biocompatible, meaning they do not support the growth of cells to encourage healing around the site. Recently, materials called hydrogels have been gaining traction as a replacement for lost cartilage. Hydrogels are biocompatible and share a very similar molecular structure to cartilage: if you zoom in on either, you'll find a web of long string-like molecules with water molecules wedged into the gaps. 
But researchers have struggled to create recipes for synthetic hydrogels that are equal in strength to human cartilage or that are 3D-printable. "The current gels that are available are really not as strong as human tissues, and generally, when they come out of a printer nozzle they don't stay put – they will run all over the place, because they are mostly water," Wiley said. Feichen Yang, a graduate student in Wiley's lab and a fellow author of the paper, experimented with mixing together two different types of hydrogels – one stiffer and stronger, and the other softer and stretchier – to create what is called a double-network hydrogel. "The two networks are woven into each other," Yang said. "And that makes the whole material extremely strong." By changing the relative amounts of the two hydrogels, Yang could adjust the strength and elasticity of the mixture to arrive at a formula that best matches that of human cartilage. He also mixed in a special ingredient, a nanoparticle clay, to make the mock-cartilage 3D-printable. With the addition of the clay, the hydrogel flows like water when placed under shear stress, such as when squeezed through a small printer nozzle, but as soon as the stress is gone, the hydrogel immediately hardens into its printed shape. 3D printing of other custom-shaped implants, including hip replacements, cranial plates and even spinal vertebrae, is already practiced in orthopedic surgeries. These custom implants are based on virtual 3D models of a patient's anatomy, which can be obtained from computer tomography (CT) or magnetic resonance imaging (MRI) scans. Meniscus implants could also benefit from 3D printing's ability to create customized and complex shapes, the researchers say. "Shape is a huge deal for the meniscus," Wiley explained. "This thing is under a lot of pressure, and if it doesn't fit you perfectly it could potentially slide out, or be debilitating or painful." "A meniscus is not a homogenous material," Yang added. 
"The middle is stiffer, and the outside is a bit softer. Multi-material 3D printers let you print different materials in different layers, but with a traditional mold you can only use one material." In a simple demonstration, Yang took a CT scan of a plastic model of a knee and used the information from the scan to 3D print new menisci using his double network hydrogel. The whole process, from scan to finished meniscus, took only about a day, he says. "This is really a young field, just starting out," Wiley said. "I hope that demonstrating the ease with which this can be done will help get a lot of other people interested in making more realistic printable hydrogels with mechanical properties that are even closer to human tissue." This story is adapted from material from Duke University, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier.


A study that surveyed a national sample of emergency department health care providers and adult patients suggests that patients are substantially more willing to disclose their sexual orientation than health care workers believe. In a report on the study, published in JAMA Internal Medicine on April 24, the researchers found that nearly 80 percent of health care professionals believed patients would refuse to provide sexual orientation information, while only 10.3 percent of patients reported they would refuse. Closing that disclosure gap, the investigators say, has the potential to improve the care of lesbian, gay and bisexual patients, a population with historically poorer overall health and less access to health care and insurance compared to the straight population. An estimated 8 million American adults identify as lesbian, gay or bisexual, according to government sources. The new research, part of the EQUALITY Study, is a collaborative effort among researchers at the Johns Hopkins University School of Medicine, Brigham and Women's Hospital and Harvard Medical School. "Unlike racial/ethnic and age data, information about sexual orientation and gender identity has not been collected routinely in health care settings, which limits the ability of researchers and clinicians to determine the unique needs of the lesbian, gay and bisexual communities," says Brandyn Lau, M.P.H., C.P.H., assistant professor of surgery at the Johns Hopkins University School of Medicine and the report's senior author. "Health care providers haven't collected these data, at least in part due to fear of offending patients, but this study shows that most patients actually would not be offended," he says. For the study, which the researchers believe is the first to compare patients' and clinicians' views about routine collection of sexual identity data, the scientists focused on emergency departments (EDs), which see more than 130 million visits annually in the U.S. 
EDs are the source of nearly half of inpatient hospital admissions in the U.S. and the primary point of entry for uninsured and underinsured patients, making them ideal locations for collecting sexual orientation information. In the study's first phase, the research team recruited 53 adult patients and 26 health care professionals from three community hospitals and two academic medical centers for qualitative interviews that lasted an average of one hour. The interviews took place between August 2014 and January 2015, and one of two researchers had a guided conversation with each participant about sexual orientation collection in the ED, barriers and facilitators to collection, and preferred methods of collection. The interviews revealed that although clinicians recognized the importance of disclosure of sexual orientation when medically relevant, most patients believed that sexual orientation was always relevant. Similarly, many patients stated that sexual orientation was something health care professionals needed to know, not only for the sake of patients' health care, but also for recognizing and normalizing the lesbian, gay and bisexual population. "Our patients are telling us that routinely asking all patients who come to the ED about this information creates a sense of normalcy toward people of all sexual orientations and signals that each patient is equally welcome here, including the three to 10 percent of Americans who identify as lesbian, gay or bisexual," says Adil Haider, M.D., M.P.H., Kessler Director of the Center for Surgery and Public Health at Brigham and Women's Hospital and the paper's first author. In the next phase of the study, the research team conducted an online survey using a nationally representative sample from GfK's KnowledgePanel and GfK's Physician Consulting Network. GfK is a marketing research company with survey tools tailored to a variety of interests. 
Surveys were sent to 1,516 potential adult patients (244 lesbian, 289 gay, 179 bisexual and 804 straight) and 429 ED health care professionals (209 physicians and 220 nurses) between March and April 2015. The 53-question survey for patients and 45-question survey for health care professionals consisted of multiple-choice responses, Likert scale choices (agree/strongly agree, neutral, and disagree/strongly disagree) and open-ended questions. Of all surveyed clinicians, 333 (77.8 percent) said patients would refuse to provide their sexual orientation if asked, but only 154 patients (10.3 percent) reported they would refuse to do so, indicating a significant discrepancy between what physicians and patients believe, the researchers said. In the population-weighted results, 143 straight patients (10.1 percent of the total), one lesbian (4.8 percent), three gay (12.0 percent) and five bisexual (16.4 percent) patients would refuse to share their sexual orientation in the ED. Bisexual patients were almost twice as likely to refuse to provide their sexual orientation as were straight patients. Both patients and clinicians indicated nonverbal self-report as the preferred method of sexual orientation information collection. "We need to make collecting sexual orientation information a regular part of our practice, similar to how other demographic information such as age and race is collected, and because I don't think providers will start consistently collecting these data on their own, clinics and hospitals need to mandate it," says Lau. As a next step, the research team will test different approaches to data collection, also as part of the EQUALITY Study. Other authors on this paper include Lisa M. Kodadek, Claire Snyder, Laura Vail, Danielle German and Susan Peterson of the Johns Hopkins University; Eric B. Schneider, Rachel R. Adler and Anju Ranjit of Harvard Medical School; Maya Torain of Duke University School of Medicine; Ryan Y. 
Shields of Yale School of Medicine; and Jeremiah D. Schuur of Brigham & Women's Hospital. This study was funded by PCORI (contract AD-1306-03980). Brandyn Lau is supported by the Institute for Excellence in Education Berkheimer Faculty Education Scholar Grant contract CE-12-11-4489 from the Patient Centered Outcomes Research Institute (PCORI) and grant 1R01HS024547 from the Agency for Healthcare Research and Quality.


News Article | April 19, 2017
Site: www.eurekalert.org

DURHAM, N.C. -- A cartilage-mimicking material created by researchers at Duke University may one day allow surgeons to 3-D print replacement knee parts that are custom-shaped to each patient's anatomy. Human knees come with a pair of built-in shock absorbers called the menisci. These ear-shaped hunks of cartilage, nestled between the thigh and shin bones, cushion every step we take. But a lifetime of wear-and-tear -- or a single wrong step during a game of soccer or tennis -- can permanently damage these key supports, leading to pain and an increased risk of developing arthritis. The hydrogel-based material the researchers developed is the first to match human cartilage in strength and elasticity while also remaining 3-D-printable and stable inside the body. To demonstrate how it might work, the researchers used a $300 3-D printer to create custom menisci for a plastic model of a knee. "We've made it very easy now for anyone to print something that is pretty close in its mechanical properties to cartilage, in a relatively simple and inexpensive process," said Benjamin Wiley, an associate professor of chemistry at Duke and author on the paper, which appears online in ACS Biomaterials Science and Engineering. After we reach adulthood, the meniscus has limited ability to heal on its own. Surgeons can attempt to repair a torn or damaged meniscus, but often it must be partially or completely removed. Available implants either do not match the strength and elasticity of the original cartilage, or are not biocompatible, meaning they do not support the growth of cells to encourage healing around the site. Recently, materials called hydrogels have been gaining traction as a replacement for lost cartilage. Hydrogels are biocompatible and share a very similar molecular structure to cartilage: if you zoom in on either, you'll find a web of long string-like molecules with water molecules wedged into the gaps. 
But researchers have struggled to create recipes for synthetic hydrogels that are equal in strength to human cartilage or that are 3-D-printable. "The current gels that are available are really not as strong as human tissues, and generally, when they come out of a printer nozzle they don't stay put -- they will run all over the place, because they are mostly water," Wiley said. Feichen Yang, a graduate student in Wiley's lab and author on the paper, experimented with mixing together two different types of hydrogels -- one stiffer and stronger, and the other softer and stretchier -- to create what is called a double-network hydrogel. "The two networks are woven into each other," Yang said. "And that makes the whole material extremely strong." By changing the relative amounts of the two hydrogels, Yang could adjust the strength and elasticity of the mixture to arrive at a formula that best matches that of human cartilage. He also mixed in a special ingredient, a nanoparticle clay, to make the mock-cartilage 3-D-printable. With the addition of the clay, the hydrogel flows like water when placed under shear stress, such as when being squeezed through a small needle. But as soon as the stress is gone, the hydrogel immediately hardens into its printed shape. 3-D printing of other custom-shaped implants, including hip replacements, cranial plates, and even spinal vertebrae, is already practiced in orthopedic surgery. These custom implants are based on virtual 3-D models of a patient's anatomy, which can be obtained from computer tomography (CT) or magnetic resonance imaging (MRI) scans. Meniscus implants could also benefit from 3-D printing's ability to create customized and complex shapes, the researchers say. "Shape is a huge deal for the meniscus," Wiley said. "This thing is under a lot of pressure, and if it doesn't fit you perfectly it could potentially slide out, or be debilitating or painful." "A meniscus is not a homogenous material," Yang added. 
"The middle is stiffer, and the outside is a bit softer. Multi-material 3-D printers let you print different materials in different layers, but with a traditional mold you can only use one material." In a simple demonstration, Yang took a CT scan of a plastic model of a knee and used the information from the scan to 3-D print new menisci using his double network hydrogel. The whole process, from scan to finished meniscus, took only about a day, he says. "This is really a young field, just starting out," Wiley said. "I hope that demonstrating the ease with which this can be done will help get a lot of other people interested in making more realistic printable hydrogels with mechanical properties that are even closer to human tissue." This research was supported by start-up funds from Duke University and grants from the National Science Foundation (ECCS-1344745, DMR-1253534). CITATION: "3D Printing of a Double Network Hydrogel with a Compression Strength and Elastic Modulus Greater than that of Cartilage," Feichen Yang, Vaibhav Tadepalli and Benjamin J. Wiley. ACS Biomaterials Science and Engineering, online April 3, 2017. DOI: 10.1021/acsbiomaterials.7b00094


News Article | April 22, 2017
Site: www.gizmag.com

To demonstrate how their 3-D-printable, cartilage-mimicking material might work, the researchers used a US$300 3D printer to create custom menisci for a model of a knee (Credit: Duke University/Feichen Yang)
Far more than a simple hinge, the human knee is a complex, intricate mechanism, and a knee injury is a painful and debilitating condition that's difficult and expensive to repair. Duke University is developing a cartilage-like material based on hydrogel that may make the task of repairing knees easier. The 3D-printable hydrogel allows bioengineers to create bespoke artificial replacement parts for injured knees that are tailored to match the old part both in shape and mechanical properties. 3D printing has been a boon to surgeons. By using virtual models of a patient's body parts from computer tomography or magnetic resonance imaging scans, surgeons can provide new hips, cranial sections, and spinal vertebrae that are close matches to the original. They're even used to produce detailed models of the human heart for cardiac surgeons to plan complicated operations or to fashion mechanical implants that fit exactly. Such a technology should be a godsend for repairing damaged knees, but knee anatomy is very tricky. Among the key components are the menisci, and it is these that the researchers used to demonstrate the potential of the new material. Menisci are ear-shaped pieces of cartilage that sit inside the knee between the femur and tibia. They act as shock absorbers and lubricating bearings that allow the joint to bend and move smoothly in a surprisingly subtle and dynamic manner, and they can withstand decades of use that would destroy a man-made device. However, hard use or even an awkward stumble can rip the menisci, resulting in a painful chronic injury and the increased risk of arthritis because, in adults, the menisci cannot heal quickly or completely. 
This means that injured knees often require surgery: removal of the damaged meniscus or its replacement with a plastic implant. But these implants are a poor match for the original in strength and elasticity. They are also often non-biocompatible, so the surrounding tissues cannot heal properly. 3D printing should be an improvement, but the meniscus must fit exactly or it could slip out of place or cause great pain. In addition, the material itself must be tailored for the job. "A meniscus is not a homogenous material," says graduate student Feichen Yang. "The middle is stiffer, and the outside is a bit softer. Multi-material 3D printers let you print different materials in different layers, but with a traditional mold you can only use one material." So far, that looks like an endorsement for 3D printing, but what to print the meniscus out of? A favored candidate is hydrogel, which is stable inside the body, biocompatible, and has a molecular structure similar to cartilage, with long-chain molecules holding in water. The catch is finding a hydrogel that has the same properties as cartilage and can be printed. Current hydrogels lack the necessary strength and, because of their high water content, ooze away when put through a 3D printer, so the Duke researchers developed a new hydrogel-based material that they claim is the first to match human cartilage in strength and elasticity, yet remain 3D-printable. It was created by Yang, who combined a stiff, strong hydrogel with one that is soft and stretchy. When mixed, the polymer chains wove together to create a new hydrogel that is both strong and elastic. More importantly, by altering the proportions, these properties can be tuned across different parts of the artificial meniscus. Yang then added a nanoparticle clay to the hydrogel to make it printable. 
When subjected to shear stress, such as when being squeezed through a printing needle, the clay-laden hydrogel flows like a smooth liquid, but once the stress is removed, it sets, holding up the printed structure. Using a US$300 printer, Yang was able to print a replacement meniscus with the new hydrogel in only a day – showing that what was once a daunting manufacturing process could soon be simple and inexpensive. "This is really a young field, just starting out," says Benjamin Wiley, an associate professor of chemistry. "I hope that demonstrating the ease with which this can be done will help get a lot of other people interested in making more realistic printable hydrogels with mechanical properties that are even closer to human tissue." The research was published in ACS Biomaterials Science and Engineering.
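The flow-then-set behavior described here is the hallmark of a shear-thinning, yield-stress fluid, and can be sketched with a Herschel-Bulkley model. This is an illustration only: the function and all parameter values below are hypothetical, not measurements from the paper.

```python
# A minimal sketch of the shear-thinning, yield-stress behavior described
# above, using a Herschel-Bulkley fluid model. All parameter values are
# hypothetical illustrations, not measurements from the paper.

def apparent_viscosity(shear_rate, tau_y=50.0, k=10.0, n=0.4):
    """Apparent viscosity (Pa*s) of a Herschel-Bulkley fluid.

    tau_y: yield stress (Pa) -- below it the gel holds its shape
    k:     consistency index (Pa*s^n)
    n:     flow index; n < 1 gives shear-thinning behavior
    """
    if shear_rate <= 0:
        return float("inf")  # at rest the gel behaves like a solid
    stress = tau_y + k * shear_rate ** n
    return stress / shear_rate

# High shear (squeezing through a needle) -> low viscosity, so it prints;
# near-zero shear (sitting on the print bed) -> high viscosity, so it holds.
for rate in (0.1, 1.0, 100.0):
    print(f"shear rate {rate:6.1f} 1/s -> {apparent_viscosity(rate):8.1f} Pa*s")
```

The key qualitative point is that apparent viscosity falls steeply as shear rate rises, which is what lets the same material both flow through a nozzle and hold a printed shape.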


News Article | May 4, 2017
Site: www.theengineer.co.uk

A cartilage-mimicking material created at Duke University could allow surgeons to 3D print replacement knee parts that are custom-shaped. The hydrogel-based material is claimed to be the first to match human cartilage in strength and elasticity while also remaining 3D-printable and stable inside the body. To demonstrate how it might work, the researchers used a $300 3D printer to create custom menisci for a plastic model of a knee. “We’ve made it very easy now for anyone to print something that is pretty close in its mechanical properties to cartilage, in a relatively simple and inexpensive process,” said Benjamin Wiley, an associate professor of chemistry at Duke and author on the paper – 3D Printing of a Double Network Hydrogel with a Compression Strength and Elastic Modulus Greater than that of Cartilage – which appears online in ACS Biomaterials Science and Engineering. Surgeons can attempt to repair a torn or damaged meniscus, but often it must be partially or completely removed. Available implants either do not match the strength and elasticity of the original cartilage, or are not biocompatible. Hydrogels have been gaining traction as a replacement for lost cartilage as they are biocompatible and share a very similar molecular structure to cartilage. Researchers have, however, struggled to create recipes for synthetic hydrogels that are equal in strength to human cartilage or that are 3D-printable. “The current gels that are available are really not as strong as human tissues, and generally, when they come out of a printer nozzle they don’t stay put – they will run all over the place, because they are mostly water,” Wiley said. Feichen Yang, a graduate student in Wiley’s lab and author on the paper, experimented with mixing together two different types of hydrogels – one stiffer and stronger, and the other softer and stretchier – to create a double-network hydrogel. “The two networks are woven into each other,” Yang said. 
“And that makes the whole material extremely strong.” By changing the relative amounts of the two hydrogels, Yang could adjust the strength and elasticity of the mixture to arrive at a formula that best matches that of human cartilage. He also mixed in a nanoparticle clay to make the mock-cartilage 3D-printable. With the addition of the clay, the hydrogel flows like water when placed under shear stress, such as when being squeezed through a small needle. But as soon as the stress is gone, the hydrogel hardens into its printed shape. In a simple demonstration, Yang took a CT scan of a plastic model of a knee and used the information from the scan to 3D print new menisci using his double network hydrogel. The whole process, from scan to finished meniscus, took only about a day, he said.
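The idea of tuning stiffness by adjusting the relative amounts of the two networks can be illustrated with a simple linear rule of mixtures. This is a deliberately crude sketch: real double-network mechanics are nonlinear, and the moduli used here are placeholder values, not figures from the paper.

```python
# A crude sketch of tuning a double-network hydrogel's stiffness by its
# mixing ratio, using a linear rule of mixtures. The moduli below are
# placeholders, not values reported in the paper.

def mixture_modulus(stiff_fraction, e_stiff=1.2e6, e_soft=1.0e5):
    """Estimated elastic modulus (Pa) for a given stiff-network fraction."""
    assert 0.0 <= stiff_fraction <= 1.0
    return stiff_fraction * e_stiff + (1.0 - stiff_fraction) * e_soft

# A meniscus is stiffer in the middle than at the rim, so a printer could
# deposit a different blend in each region:
for region, frac in [("outer rim", 0.3), ("middle", 0.8)]:
    print(f"{region}: {mixture_modulus(frac) / 1e6:.2f} MPa")
```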


"Mike brings a remarkable amount of industry knowledge and expertise," said Raj Rajan, Chairman and CEO of SoftWear Automation. "We're thrilled to have him evangelize our technology to help redesign the textile and apparel industry supply chain." "The sewn products industry is ripe for innovation, and SoftWear Automation is perfectly positioned as a leader in this space," said Dr. Fralix. "I'm excited to join the team at this rapid stage of its growth. This is a fantastic opportunity to help lead the adoption of SoftWear's unique robots in a market I know so well." SoftWear Automation's Sewbots™ are currently commercially deployed in the home goods and automotive sectors. With Sewbot™-produced goods currently on retail shelves globally, SoftWear is committed to disrupting, in a positive way, the $100 billion sewn products industry. Sewbots™, fully autonomous sewing worklines, allow companies to SEWLOCAL™, geographically shortening the distance between consumers and manufacturers. An internationally recognized speaker, Dr. Fralix offers expertise on a variety of topics, including 3D product development, sizing for fit, production scheduling, industrial engineering, ergonomics, full package production, simulation, lean manufacturing systems, sustainable technologies, and the digital supply chain. He holds a Bachelor's degree in Applied Mathematics and Philosophy from North Carolina State University, a Master's degree in Business Administration from Duke University, and a Doctorate in Textile Technology Management from North Carolina State University, where he was appointed an Adjunct Associate Professor in 2009. He is also active in several industry organizations including the International Apparel Federation, where he serves as a member of the Board and Chairman of the Technical Committee. About SoftWear Automation, Inc. SoftWear Automation, Inc. 
is an Atlanta-based machine-vision and robotics startup disrupting the $100 billion sewn products industry by creating autonomous sewn products work lines in home goods, footwear and apparel. SoftWear's fully automated Sewbots™ allow manufacturers to SEWLOCAL™, moving their supply chains closer to the customer while creating higher quality products at a lower cost. For more information, visit www.softwearautomation.com. To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/dr-mike-fralix-to-work-with-softwear-automation-as-a-technology-evangelist-to-accelerate-the-deployment-of-sewbotstm-300452256.html


News Article | April 17, 2017
Site: www.gizmag.com

Even though you've seen refrigerators covered in them, magnets are actually hard to come by. In fact, according to Duke University (DU), only about five percent of all known inorganic compounds exhibit even a little magnetism. So instead of searching the world for new magnets, researchers at the school and at Trinity College in Dublin have recently synthesized two of their own, from a list of 236,115 potential creations. To be clear, magnets have been made in the lab before. But oftentimes, the process that leads to their genesis can be more of a hit-and-miss endeavor than an exacting science. To help give them an edge, the DU researchers turned, naturally, to a computer. Specifically, they worked with a computational model that let them try out different molecules in different arrangements for a class of materials called Heusler alloys, which consist of three different elements arranged in particular ways. Considering the range of possible elements (55) and atomic structures, the potential compounds numbered 236,115. Using the model to see how different atoms and structures would react with each other, the researchers eventually whittled their list down to 22 possible candidates that could display magnetism, and then reduced that list to just 14 compounds by eliminating close relatives. Next it was time to make the materials in the lab, which was still a challenge, but a much easier one. "It can take years to realize a way to create a new material in a lab," said Corey Oses, a DU doctoral student. "There can be all types of constraints or special conditions that are required for a material to stabilize. But choosing from 14 is a lot better than 200,000." Oses worked with Stefano Curtarolo, professor of mechanical engineering and materials science and director of the Center for Materials Genomics at Duke, on a paper about the materials that was recently published in the journal Science Advances. 
The team relied upon Stefano Sanvito, professor of physics at Trinity College in Dublin, Ireland, to actually produce the new magnetic materials, and, after years of attempts, he had success with two. The first consists of cobalt, manganese and titanium (Co2MnTi) and holds its magnetism up to the impressive temperature of 938 K (1228° F), which could make it an ideal candidate for a range of industrial applications. The second is made of a mix of manganese, platinum and palladium (Mn2PtPd) and, although it doesn't actually produce a magnetic field of its own, it has electrons that react strongly to magnetic fields. This would make it a good candidate for use in hard drives although, beyond that, its use is somewhat limited because its behavior is difficult to predict. Still, the researchers say that the function of the magnets isn't really as important as the fact that they were developed in the first place. "It doesn't really matter if either of these new magnets proves useful in the future," said Curtarolo. "The ability to rapidly predict their existence is a major coup and will be invaluable to materials scientists moving forward." Oses adds that the new computer model could also help the world move away from its reliance on rare-earth elements such as yttrium and neodymium, which are common ingredients of current magnets but can be hard to secure. "Many high-performance permanent magnets contain rare earth elements," said Oses. "And rare earth materials can be expensive and difficult to acquire, particularly those that can only be found in Africa and China. The search for magnets free of rare-earth materials is critical, especially as the world seems to be shying away from globalization."
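The quoted Curie temperature of 938 K (1228° F) is easy to double-check with the standard Kelvin-to-Fahrenheit conversion:

```python
# Double-checking the unit conversion quoted in the article: a Curie
# temperature of 938 K corresponds to about 1228 degrees Fahrenheit.

def kelvin_to_fahrenheit(k):
    return (k - 273.15) * 9.0 / 5.0 + 32.0

print(round(kelvin_to_fahrenheit(938), 1))  # 1228.7
```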


Dr. Porter was most recently Chief Medical Officer of Dance BioPharm, focused on the development of inhaled insulin products to treat diabetes.  Previously, she was Vice President, Medical Development of Amylin Pharmaceuticals, where she led the R&D efforts for the Amylin-Lilly Alliance culminating in the approval of the GLP-1 agonist Bydureon (exenatide extended release), the first once-weekly treatment for Type 2 diabetes.  Earlier, Dr. Porter held positions of increasing leadership at GlaxoSmithKline, where she was responsible for clinical strategy for Avandia (rosiglitazone) for Type 2 diabetes.  Dr. Porter earned a B.S. in Biology from William & Mary, an M.D. from Duke University, and completed her fellowship in Endocrinology and Hypertension at Brigham and Women's Hospital. "Eiger and Stanford have made amazing progress across multiple clinical studies in which exendin 9-39 was shown to prevent and reduce symptoms of hypoglycemia in post-bariatric surgical patients during an oral glucose tolerance test (OGTT), and I'm very encouraged by the results," said Lisa Porter, M.D.  "Exendin 9-39 represents the first potential targeted therapy for patients suffering from PBH, a significant unmet medical need.  I'm excited to join the team and lead this program moving forward." Eiger is developing a proprietary, novel liquid formulation of exendin 9-39, which in dog studies has demonstrated a greater than two-fold increase in peak plasma concentrations compared to the original lyophilized powder of exendin 9-39.  Development of a liquid formulation of exendin 9-39 represents an opportunity for lower dosing and, once on the market, would eliminate the need for patients to dissolve powder in saline, which could be a more convenient product presentation for patients.  
Eiger is evaluating the new exendin 9-39 liquid formulation in patients in the ongoing MAD study and also in a Phase 1 PK study scheduled for Q2 2017, both of which will inform the next, larger Phase 2 study planned for second half 2017. About Insulin, GLP-1, and Exendin 9-39 Insulin is the principal physiologic hormone secreted to control high blood glucose levels.  Abnormal increases in insulin secretion can lead to profound hypoglycemia (low blood sugar), a state that can result in significant morbidities, including seizures, brain damage, and coma.  GLP-1 is a gastrointestinal hormone that is released postprandially from the intestinal L-cells.  GLP-1 binds to GLP-1 receptors on the beta cells of the pancreas and increases the release of insulin.  In patients with PBH, GLP-1-mediated insulin secretion is dysfunctionally exaggerated. Exendin 9-39 is a 31-amino acid peptide that selectively targets and blocks GLP-1 receptors, normalizing insulin secretion by the pancreas, and thereby reducing postprandial hypoglycemia.  Exendin 9-39 is being investigated as a novel treatment for PBH.  Exendin 9-39 has been granted orphan designation in the European Union by the EMA for the treatment of non-insulinoma pancreatogenous hypoglycemia syndrome (NIPHS) and orphan designation in the United States by the FDA for the treatment of hyperinsulinemic hypoglycemia.  Both of these broad designations include PBH.  A therapy that safely and effectively mitigates insulin-induced hypoglycemia has the potential to address a significant unmet therapeutic need for certain rare medical conditions associated with hyperinsulinism.  Exendin 9-39 has never been approved or commercialized for any indication.  The long-term efficacy and safety of subcutaneous (SC) injected exendin 9-39 have not yet been established.  More information on exendin 9-39 clinical trials may be found at www.clinicaltrials.gov. 
About Post-Bariatric Hypoglycemia (PBH) Approximately 150,000-200,000 bariatric surgical procedures are performed each year in the United States, and another 100,000 are performed each year in Europe.  The estimated prevalence of PBH is approximately 30,000 in the United States and approximately 25,000 in the European Union.  As the number of bariatric surgeries to treat obesity and related comorbidities has increased, so too has the number of individuals who experience PBH, with symptoms typically developing 12 to 18 months following surgery.  PBH can occur with a range of severity in post-bariatric surgery patients.  Mild to moderate hypoglycemia may be managed largely through dietary carbohydrate restriction, whereas severe hypoglycemia results in neuroglycopenic outcomes (altered mental status, loss of consciousness, seizures, coma) which are unresponsive to diet modification.  Severe PBH can be debilitating with a significant negative impact on quality of life.  There is no approved pharmacologic therapy. About Eiger Eiger is a clinical-stage biopharmaceutical company committed to bringing to market novel products for the treatment of rare diseases.  The company has built a diverse portfolio of well-characterized product candidates with the potential to address diseases for which the unmet medical need is high, the biology for treatment is identified, and for which an effective therapy is urgently needed.  For more information, please visit the Company's website at www.eigerbio.com. Note Regarding Forward-Looking Statements This press release contains forward-looking statements that involve substantial risks and uncertainties.  All statements, other than statements of historical facts, included in this press release regarding our strategy, future operations, future financial position, future revenue, projected expenses, prospects, plans and objectives, intentions, beliefs and expectations of management are forward-looking statements.  
These forward-looking statements may be accompanied by such words as "anticipate," "believe," "could," "estimate," "expect," "forecast," "intend," "may," "plan," "potential," "project," "target," "will" and other words and terms of similar meaning.  Examples of such statements include, but are not limited to, whether or not pegylated interferon lambda-1a, lonafarnib, ubenimex or exendin 9-39, including SC formulation, may be further developed and approved, whether Phase 1 and Phase 2 studies of exendin 9-39 will show safety and activity consistent with early clinical results, including the interim results of the MAD study, or that the new liquid formulation will be consistent with results seen with IV and SC formulations of exendin 9-39, statements relating to the availability of cash for Eiger's future operations, Eiger's ability to develop its drug candidates for potential commercialization, the timing of the commencement and number and completion of Phase 2 trials and whether the products can be successfully developed or commercialized.  Various important factors could cause actual results or events to differ materially from the forward-looking statements that Eiger makes, including the risks described in the "Risk Factors" sections in the Annual Report on Form 10-K for the period ended December 31, 2016 and our periodic reports filed with the Securities and Exchange Commission.  Eiger assumes no obligation to update any forward-looking statements, except as required by law. To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/eiger-announces-industry-veteran-lisa-porter-md-to-lead-development-of-exendin-9-39-for-the-treatment-of-post-bariatric-hypoglycemia-300440724.html


News Article | May 3, 2017
Site: www.scientificcomputing.com

Materials scientists have predicted and built two new magnetic materials, atom-by-atom, using high-throughput computational models. The success marks a new era for the large-scale design of new magnetic materials at unprecedented speed. Although magnets abound in everyday life, they are actually rarities -- only about five percent of known inorganic compounds show even a hint of magnetism. And of those, just a few dozen are useful in real-world applications because of variability in properties such as effective temperature range and magnetic permanence. The relative scarcity of these materials can make them expensive or difficult to obtain, leading many to search for new options given how important magnets are in applications ranging from motors to magnetic resonance imaging (MRI) machines. The traditional process involves little more than trial and error, as researchers produce different molecular structures in hopes of finding one with magnetic properties. Many high-performance magnets, however, are singular oddities among physical and chemical trends that defy intuition. In a new study, materials scientists from Duke University provide a shortcut in this process. They show the capability to predict magnetism in new materials through computer models that can screen hundreds of thousands of candidates in short order. And, to prove it works, they've created two magnetic materials that have never been seen before. "Predicting magnets is a heck of a job and their discovery is very rare," said Stefano Curtarolo, professor of mechanical engineering and materials science and director of the Center for Materials Genomics at Duke. "Even with our screening process, it took years of work to synthesize our predictions. We hope others will use this approach to create magnets for use in a wide range of applications." 
The group focused on a family of materials called Heusler alloys -- materials made with atoms from three different elements arranged in one of three distinct structures. Considering all the possible combinations and arrangements available using 55 elements, the researchers had 236,115 potential prototypes to choose from. To narrow the list down, the researchers built each prototype atom-by-atom in a computational model. By calculating how the atoms would likely interact and the energy each structure would require, the list dwindled to 35,602 potentially stable compounds. From there, the researchers conducted a more stringent test of stability. Generally speaking, materials stabilize into the arrangement requiring the least amount of energy to maintain. By checking each compound against other atomic arrangements and throwing out those that would be beaten out by their competition, the list shrank to 248. Of those 248, only 22 materials showed a calculated magnetic moment. The final cut dropped any materials with competing alternative structures too close for comfort, leaving a final 14 candidates to bring from theoretical model into the real world. But as with most things in a laboratory, synthesizing new materials is easier said than done. "It can take years to realize a way to create a new material in a lab," said Corey Oses, a doctoral student in Curtarolo's laboratory and second author on the paper. "There can be all types of constraints or special conditions that are required for a material to stabilize. But choosing from 14 is a lot better than 200,000." For the synthesis, Curtarolo and Oses turned to Stefano Sanvito, professor of physics at Trinity College in Dublin, Ireland. After years of attempting to create four of the materials, Sanvito succeeded with two. Both were, as predicted, magnetic. The first newly minted magnetic material was made of cobalt, manganese and titanium (Co2MnTi). 
By comparing the measured properties of similarly structured magnets, the researchers were able to predict the new magnet's properties with a high degree of accuracy. Of particular note, they predicted the temperature at which the new material would lose its magnetism to be 940 K (1232 degrees Fahrenheit). In testing, the actual "Curie temperature" turned out to be 938 K (1228 degrees Fahrenheit) -- an exceptionally high number. This, along with its lack of rare earth elements, makes it potentially useful in many commercial applications. "Many high-performance permanent magnets contain rare earth elements," said Oses. "And rare earth materials can be expensive and difficult to acquire, particularly those that can only be found in Africa and China. The search for magnets free of rare-earth materials is critical, especially as the world seems to be shying away from globalization." The second material was a mixture of manganese, platinum and palladium (Mn2PtPd), which turned out to be an antiferromagnet, meaning that its electrons are evenly divided in their alignments. This leads the material to have no internal magnetic moment of its own, but makes its electrons responsive to external magnetic fields. While this property doesn't have many applications outside of magnetic field sensing, hard drives and Random Access Memory (RAM), these types of magnets are extremely difficult to predict. Nevertheless, the group's calculations for its various properties remained spot on. "It doesn't really matter if either of these new magnets proves useful in the future," said Curtarolo. "The ability to rapidly predict their existence is a major coup and will be invaluable to materials scientists moving forward."
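The numbers in the screening funnel above are internally consistent and easy to verify in a few lines of Python. In particular, 236,115 is exactly C(55, 3) * 9: every unordered triple of the 55 candidate elements decorated nine ways. The interpretation of the factor of 9 (three Heusler structures times three inequivalent site assignments) is an inference from the arithmetic, not something the article states.

```python
# Back-of-the-envelope check of the screening funnel described above.
# 236,115 is exactly C(55, 3) * 9: every unordered triple of 55 elements,
# decorated nine ways. The meaning of the factor of 9 is inferred from the
# arithmetic, not stated in the article.
from math import comb

prototypes = comb(55, 3) * 9
print(prototypes)  # 236115

# The successive cuts reported in the article:
funnel = [
    ("enumerated prototypes",           236_115),
    ("thermodynamically plausible",      35_602),
    ("stable against competing phases",     248),
    ("nonzero calculated moment",            22),
    ("no close competing structure",         14),
    ("synthesized and found magnetic",        2),
]
for stage, count in funnel:
    print(f"{stage:34s} {count:>8,}")
```

Each stage strictly shrinks the candidate pool, which is the whole point of the high-throughput approach: cheap computation first, expensive synthesis last.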


News Article | April 17, 2017
Site: www.eurekalert.org

DURHAM, N.C. -- Materials scientists have predicted and built two new magnetic materials, atom-by-atom, using high-throughput computational models. The success marks a new era for the large-scale design of new magnetic materials at unprecedented speed. Although magnets abound in everyday life, they are actually rarities -- only about five percent of known inorganic compounds show even a hint of magnetism. And of those, just a few dozen are useful in real-world applications because of variability in properties such as effective temperature range and magnetic permanence. The relative scarcity of these materials can make them expensive or difficult to obtain, leading many to search for new options given how important magnets are in applications ranging from motors to magnetic resonance imaging (MRI) machines. The traditional process involves little more than trial and error, as researchers produce different molecular structures in hopes of finding one with magnetic properties. Many high-performance magnets, however, are singular oddities among physical and chemical trends that defy intuition. In a new study, materials scientists from Duke University provide a shortcut in this process. They show the capability to predict magnetism in new materials through computer models that can screen hundreds of thousands of candidates in short order. And, to prove it works, they've created two magnetic materials that have never been seen before. The results appear April 14, 2017, in Science Advances. "Predicting magnets is a heck of a job and their discovery is very rare," said Stefano Curtarolo, professor of mechanical engineering and materials science and director of the Center for Materials Genomics at Duke. "Even with our screening process, it took years of work to synthesize our predictions. We hope others will use this approach to create magnets for use in a wide range of applications." 
The group focused on a family of materials called Heusler alloys -- materials made with atoms from three different elements arranged in one of three distinct structures. Considering all the possible combinations and arrangements available using 55 elements, the researchers had 236,115 potential prototypes to choose from. To narrow the list down, the researchers built each prototype atom-by-atom in a computational model. By calculating how the atoms would likely interact and the energy each structure would require, the list dwindled to 35,602 potentially stable compounds. From there, the researchers conducted a more stringent test of stability. Generally speaking, materials stabilize into the arrangement requiring the least amount of energy to maintain. By checking each compound against other atomic arrangements and throwing out those that would be beaten out by their competition, the list shrank to 248. Of those 248, only 22 materials showed a calculated magnetic moment. The final cut dropped any materials with competing alternative structures too close for comfort, leaving a final 14 candidates to bring from theoretical model into the real world. But as with most things in a laboratory, synthesizing new materials is easier said than done. "It can take years to realize a way to create a new material in a lab," said Corey Oses, a doctoral student in Curtarolo's laboratory and second author on the paper. "There can be all types of constraints or special conditions that are required for a material to stabilize. But choosing from 14 is a lot better than 200,000." For the synthesis, Curtarolo and Oses turned to Stefano Sanvito, professor of physics at Trinity College in Dublin, Ireland. After years of attempting to create four of the materials, Sanvito succeeded with two. Both were, as predicted, magnetic. The first newly minted magnetic material was made of cobalt, manganese and titanium (Co2MnTi). 
By comparing the measured properties of similarly structured magnets, the researchers were able to predict the new magnet's properties with a high degree of accuracy. Of particular note, they predicted the temperature at which the new material would lose its magnetism to be 940 K (1232 degrees Fahrenheit). In testing, the actual "Curie temperature" turned out to be 938 K (1228 degrees Fahrenheit) -- an exceptionally high number. This, along with its lack of rare earth elements, makes it potentially useful in many commercial applications. "Many high-performance permanent magnets contain rare earth elements," said Oses. "And rare earth materials can be expensive and difficult to acquire, particularly those that can only be found in Africa and China. The search for magnets free of rare-earth materials is critical, especially as the world seems to be shying away from globalization." The second material was a mixture of manganese, platinum and palladium (Mn2PtPd), which turned out to be an antiferromagnet, meaning that its electrons are evenly divided in their alignments. This leads the material to have no internal magnetic moment of its own, but makes its electrons responsive to external magnetic fields. While this property doesn't have many applications outside of magnetic field sensing, hard drives and Random Access Memory (RAM), these types of magnets are extremely difficult to predict. Nevertheless, the group's calculations for its various properties remained spot on. "It doesn't really matter if either of these new magnets proves useful in the future," said Curtarolo. "The ability to rapidly predict their existence is a major coup and will be invaluable to materials scientists moving forward." This work was supported by the Science Foundation of Ireland, the EU Commission and the National Science Foundation (DGF1106401). "Accelerated discovery of new magnets in the Heusler alloy family." 
Stefano Sanvito, Corey Oses, Junkai Xue, Anurag Tiwary, Mario Zic, Thomas Archer, Pelin Tozman, Munuswamy Venkatesan, J. Michael D. Coey, and Stefano Curtarolo. Science Advances, April 14, 2017. DOI: 10.1126/sciadv.1602241


News Article | April 17, 2017
Site: www.rdmag.com

Material scientists have predicted and built two new magnetic materials, atom-by-atom, using high-throughput computational models. The success marks a new era for the large-scale design of new magnetic materials at unprecedented speed. Although magnets abound in everyday life, they are actually rarities -- only about five percent of known inorganic compounds show even a hint of magnetism. And of those, just a few dozen are useful in real-world applications because of variability in properties such as effective temperature range and magnetic permanence. The relative scarcity of these materials can make them expensive or difficult to obtain, leading many to search for new options given how important magnets are in applications ranging from motors to magnetic resonance imaging (MRI) machines. The traditional process involves little more than trial and error, as researchers produce different molecular structures in hopes of finding one with magnetic properties. Many high-performance magnets, however, are singular oddities among physical and chemical trends that defy intuition. In a new study, materials scientists from Duke University provide a shortcut in this process. They show the capability to predict magnetism in new materials through computer models that can screen hundreds of thousands of candidates in short order. And, to prove it works, they've created two magnetic materials that have never been seen before. "Predicting magnets is a heck of a job and their discovery is very rare," said Stefano Curtarolo, professor of mechanical engineering and materials science and director of the Center for Materials Genomics at Duke. "Even with our screening process, it took years of work to synthesize our predictions. We hope others will use this approach to create magnets for use in a wide range of applications." 
The group focused on a family of materials called Heusler alloys -- materials made with atoms from three different elements arranged in one of three distinct structures. Considering all the possible combinations and arrangements available using 55 elements, the researchers had 236,115 potential prototypes to choose from. To narrow the list down, the researchers built each prototype atom-by-atom in a computational model. By calculating how the atoms would likely interact and the energy each structure would require, the list dwindled to 35,602 potentially stable compounds. From there, the researchers conducted a more stringent test of stability. Generally speaking, materials stabilize into the arrangement requiring the least amount of energy to maintain. By checking each compound against other atomic arrangements and throwing out those that would be beaten out by their competition, the list shrank to 248. Of those 248, only 22 materials showed a calculated magnetic moment. The final cut dropped any materials with competing alternative structures too close for comfort, leaving a final 14 candidates to bring from theoretical model into the real world. But as with most things in a laboratory, synthesizing new materials is easier said than done. "It can take years to realize a way to create a new material in a lab," said Corey Oses, a doctoral student in Curtarolo's laboratory and second author on the paper. "There can be all types of constraints or special conditions that are required for a material to stabilize. But choosing from 14 is a lot better than 200,000." For the synthesis, Curtarolo and Oses turned to Stefano Sanvito, professor of physics at Trinity College in Dublin, Ireland. After years of attempting to create four of the materials, Sanvito succeeded with two. Both were, as predicted, magnetic. The first newly minted magnetic material was made of cobalt, manganese and titanium (Co2MnTi). 
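The multi-stage screen described above (236,115 prototypes winnowed to 14 candidates) amounts to a sequence of filters applied to candidate structures. The sketch below is only illustrative: the energies, thresholds, and helper names are invented for the example, whereas the actual study computed first-principles energies with high-throughput density functional theory.

```python
# Illustrative sketch of the screening funnel: each stage discards candidates
# that fail a stability or magnetism test. All numbers below are made up.

def screen(candidates, enthalpy_ok, on_convex_hull, has_magnetic_moment, well_separated):
    """Apply the four filters in order, mirroring 236,115 -> 35,602 -> 248 -> 22 -> 14."""
    stages = [enthalpy_ok, on_convex_hull, has_magnetic_moment, well_separated]
    counts = [len(candidates)]
    for stage in stages:
        candidates = [c for c in candidates if stage(c)]
        counts.append(len(candidates))
    return candidates, counts

# Toy candidate records: formation energy (eV/atom), distance from the convex
# hull (meV/atom), calculated magnetic moment (Bohr magnetons per sublattice),
# and energy gap to the nearest competing structure (meV/atom).
candidates = [
    {"formula": "Co2MnTi", "e_form": -0.31, "hull_dist": 0.0, "moment": 4.9, "gap": 120},
    {"formula": "Mn2PtPd", "e_form": -0.22, "hull_dist": 0.0, "moment": 2.0, "gap": 90},
    {"formula": "X2YZ",    "e_form": +0.05, "hull_dist": 45.0, "moment": 1.2, "gap": 5},
]

survivors, counts = screen(
    candidates,
    enthalpy_ok=lambda c: c["e_form"] < 0,          # negative formation energy
    on_convex_hull=lambda c: c["hull_dist"] == 0,   # thermodynamically stable
    has_magnetic_moment=lambda c: c["moment"] > 0,  # nonzero magnetization
    well_separated=lambda c: c["gap"] > 50,         # no close competing structure
)
print([c["formula"] for c in survivors])  # ['Co2MnTi', 'Mn2PtPd']
```

The ordering matters for cost: the cheap stability checks run over the full candidate pool, while the expensive magnetic and competing-structure calculations run only on the survivors.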
By comparing the measured properties of similarly structured magnets, the researchers were able to predict the new magnet's properties with a high degree of accuracy. Of particular note, they predicted the temperature at which the new material lost its magnetism to be 940 K (1232 degrees Fahrenheit). In testing, the actual "Curie temperature" turned out to be 938 K (1228 degrees Fahrenheit) -- an exceptionally high number. This, along with its lack of rare earth elements, makes it potentially useful in many commercial applications. "Many high-performance permanent magnets contain rare earth elements," said Oses. "And rare earth materials can be expensive and difficult to acquire, particularly those that can only be found in Africa and China. The search for magnets free of rare-earth materials is critical, especially as the world seems to be shying away from globalization." The second material was a mixture of manganese, platinum and palladium (Mn2PtPd), which turned out to be an antiferromagnet, meaning that its electrons are evenly divided in their alignments. This leads the material to have no internal magnetic moment of its own, but makes its electrons responsive to external magnetic fields. While this property doesn't have many applications outside of magnetic field sensing, hard drives and Random Access Memory (RAM), these types of magnets are extremely difficult to predict. Nevertheless, the group's calculations for its various properties remained spot on. "It doesn't really matter if either of these new magnets proves useful in the future," said Curtarolo. "The ability to rapidly predict their existence is a major coup and will be invaluable to materials scientists moving forward."


News Article | May 3, 2017
Site: www.materialstoday.com

Materials scientists have predicted and built two new magnetic materials, atom-by-atom, using high-throughput computational models. Their success marks a new era for the large-scale design of new magnetic materials at unprecedented speed. Although magnets abound in everyday life, they are actually rarities – only about 5% of known inorganic compounds show even a hint of magnetism. And of those, just a few dozen are useful in real-world applications because of variability in properties such as effective temperature range and magnetic permanence. The relative scarcity of magnetic materials can make them expensive or difficult to obtain, leading many researchers to search for new options given how important magnets are in applications ranging from motors to magnetic resonance imaging (MRI) machines. The traditional search process involves little more than trial and error, with researchers producing different molecular structures in hopes of finding one with magnetic properties. Many high-performance magnets, however, are singular oddities that defy intuition. In a new study, materials scientists from Duke University provide a shortcut to this process, developing computer models to predict magnetism in new materials by screening hundreds of thousands of candidates in short order. And to prove these models work, they've created two magnetic materials that have never been seen before. Their results appear in a paper in Science Advances. "Predicting magnets is a heck of a job and their discovery is very rare," said Stefano Curtarolo, professor of mechanical engineering and materials science and director of the Center for Materials Genomics at Duke University. "Even with our screening process, it took years of work to synthesize our predictions. We hope others will use this approach to create magnets for use in a wide range of applications." 
The group focused on a family of materials called Heusler alloys – materials made with atoms from three different elements arranged in one of three distinct structures. Considering all the possible combinations and arrangements available using 55 elements, the researchers had 236,115 potential candidates to choose from. To narrow the list down, the researchers built each candidate atom-by-atom in a computational model. By calculating how the atoms would likely interact and the energy each structure would require, the list dwindled to 35,602 potentially stable compounds. From there, the researchers conducted a more stringent test of stability. Generally speaking, materials stabilize into the arrangement requiring the least amount of energy to maintain. By checking each compound against other atomic arrangements and throwing out those that would be beaten by their competition, the list of candidates shrank to 248. Of those 248, only 22 materials showed a calculated magnetic moment. The final cut dropped any materials with competing alternative structures too close for comfort, leaving a final 14 candidates to bring from theoretical model into the real world. But as is often the case in the laboratory, synthesizing new materials is easier said than done. "It can take years to realize a way to create a new material in a lab," said Corey Oses, a doctoral student in Curtarolo's laboratory and second author on the paper. "There can be all types of constraints or special conditions that are required for a material to stabilize. But choosing from 14 is a lot better than 200,000." For the synthesis, Curtarolo and Oses turned to Stefano Sanvito, professor of physics at Trinity College in Dublin, Ireland. After years of attempting to create four of the materials, Sanvito succeeded with two, and both were, as predicted, magnetic. The first newly-minted magnetic material was made of cobalt, manganese and titanium (Co2MnTi). 
By comparing the measured properties of similarly structured magnets, the researchers were able to predict the new magnet's properties with a high degree of accuracy. Of particular note, they predicted that the temperature at which the new material lost its magnetism would be 940K. In testing, the actual ‘Curie temperature’ turned out to be 938K – an exceptionally high number. This, along with the fact that it doesn’t contain any rare earth elements, makes this new magnetic material potentially useful in many commercial applications. "Many high-performance permanent magnets contain rare earth elements," said Oses. "And rare earth materials can be expensive and difficult to acquire, particularly those that can only be found in Africa and China. The search for magnets free of rare-earth materials is critical, especially as the world seems to be shying away from globalization." The second material was a mixture of manganese, platinum and palladium (Mn2PtPd), which turned out to be an antiferromagnet, meaning that its electrons are evenly divided in their alignments. So although the material has no internal magnetic moment of its own, its electrons are responsive to external magnetic fields. While this property doesn't have many applications outside of magnetic field sensing, hard drives and Random Access Memory (RAM), these types of magnets are extremely difficult to predict. Nevertheless, the group's calculations for its various properties proved to be spot on. "It doesn't really matter if either of these new magnets proves useful in the future," said Curtarolo. "The ability to rapidly predict their existence is a major coup and will be invaluable to materials scientists moving forward." This story is adapted from material from Duke University, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier.
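For readers comparing this version with the Fahrenheit figures quoted in other accounts of the study, the conversion behind the 940K and 938K values is simple arithmetic:

```python
def kelvin_to_fahrenheit(t_k: float) -> float:
    """Convert an absolute temperature in kelvin to degrees Fahrenheit."""
    return (t_k - 273.15) * 9 / 5 + 32

# Predicted vs. measured Curie temperature for Co2MnTi:
print(f"{kelvin_to_fahrenheit(940):.1f}")  # 1232.3
print(f"{kelvin_to_fahrenheit(938):.1f}")  # 1228.7
```

The 2K gap between prediction and measurement corresponds to only about 3.6 degrees Fahrenheit.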


News Article | May 3, 2017
Site: co.newswire.com

bioMONTR Labs is proud to announce that its Laboratory Director, Dr. Susan Fiscus, has been named as the 2017 recipient of the Ed Nowakowski Senior Memorial Clinical Virology Award presented by the Pan American Society for Clinical Virology. This accolade is awarded to individuals who have made a major impact on the epidemiology, treatment or understanding of the pathogenesis of viral diseases. Prior to joining bioMONTR Labs, Dr. Fiscus served as the director of UNC’s Retrovirology Core Laboratory. Preceding her 25-year career at UNC, Dr. Fiscus earned her bachelor’s degree from Bates College, her master’s degree in botany from Duke University and her doctoral degree in microbiology from Colorado State University. During her time at UNC, Fiscus worked on studies for the AIDS Clinical Trials Group (ACTG), the Pediatric ACTG, the HIV Prevention Trials Network, and the Centers for Disease Control and Prevention (CDC), among other organizations. Each year, PASCV’s awards program recognizes the contributions made by individuals to the field of clinical virology. These awards will be presented next week during the 2017 Clinical Virology Symposium in Savannah, Georgia. bioMONTR Labs is a privately owned CLIA Certified lab located in Research Triangle Park, NC. The company offers a comprehensive menu of high-complexity molecular assays as well as other proprietary and esoteric molecular testing. Please visit http://www.biomontr.com for more information on the company, its management, test menu, and capabilities.


News Article | May 3, 2017
Site: phys.org

Over the past few years, 3-D printers that make plastic objects have become somewhat commonplace—Duke's Innovation Co-Lab has more than 60 of them for use by faculty, staff and students. Metal 3-D printers, however, are an entirely different beast, requiring much more infrastructure and a steeper learning curve. But as the Duke students working out the kinks in the system this past year will tell you, it's been well worth the investment. "The metal 3-D printer allows us to make designs that could never be fabricated by traditional manufacturing processes," said Sam Morton, a senior studying mechanical engineering. "It allows us to make actual biomedical devices out of titanium that we can then test and get feedback on from the surgeons we're working with." "The turnaround is so quick," echoed Dr. Robert Isaacs, associate professor of neurosurgery at the Duke University School of Medicine, who is working with Morton on his project. "You come up with a plan or notice a problem in a prototype, and then almost in no time, Sam comes back and delivers exactly what you just said. You think it, and then you're holding it. It's incredible." The two groups of seniors are pursuing separate biomedical devices—an intricate titanium spacer for spinal fusion surgeries and titanium scaffolds for large bone defects. Both share common requirements and challenges. Spinal fusions are common procedures meant to alleviate back pain caused by two vertebrae rubbing together as a disc of cartilage fails. Surgeons insert a device resembling a small, hollow Lego brick between the two vertebrae, promoting the growth of bone around the device to fuse the two bones and stop future movement. Similarly, titanium scaffolds provide strength and support to surgically removed portions of other bones while encouraging regrowth. Current devices for both procedures, however, have several drawbacks. 
Metal-based devices are often too hard and opaque to the imaging technologies doctors use to see how well the bone is growing. Plastic devices are not as strong and do not facilitate as much bone growth. The two senior design projects sought to create 3-D printed devices that would have the best of both worlds. Complex, sponge-like details encourage bone growth in and around the implant while minimizing the amount of metal used and allowing doctors to image the results afterward. "The titanium 3-D printer is a really great opportunity because it lets you create structures that you couldn't make using normal manufacturing techniques," said Samantha Sheppard, also a senior in mechanical engineering. "We're able to create porous structures that are better for bone ingrowth and that really could only be made with this printer." "The doctors know a lot about what type of devices they want but traditionally haven't been able to get them manufactured," said Morton. "With the 3-D metal printer, I'm able to come in and, without any limitations really, create a geometry that satisfies their needs." In many ways, the 3-D metal printer works much like a traditional plastic 3-D printer—by building a piece from the ground up, layer-by-layer. A robotic arm much like a windshield wiper first sweeps a thin layer of titanium dust across a metal plate. A high-power laser then melts the dust in the specific pattern of the bottom layer. The entire plate then drops 30 microns, and the process repeats until the part is finished. A single part just a few centimeters tall takes the machine between three and five hours to finish. For an entire plate filled with a dozen or more complex parts, a run can take upwards of 20 hours. Either way, the printer allows for rapid prototyping using a metal that is notoriously difficult to machine. 
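The 30-micron layer height quoted above also explains the multi-hour build times, as a back-of-the-envelope estimate shows. The per-layer time below is an assumed figure for illustration, not one reported in the article:

```python
LAYER_HEIGHT_UM = 30  # layer thickness quoted in the article, in microns

def layer_count(part_height_mm: float) -> int:
    """Number of sweep-and-melt passes needed for a part of the given height."""
    return round(part_height_mm * 1000 / LAYER_HEIGHT_UM)

# A part "a few centimeters tall" (say 3 cm) needs on the order of a
# thousand layers, consistent with a build time of several hours.
layers = layer_count(30)  # a 30 mm tall part
print(layers)             # 1000

seconds_per_layer = 14    # assumed average time per sweep + laser pass
print(round(layers * seconds_per_layer / 3600, 1))  # 3.9 (hours)
```

Because every part on the plate shares the same sweep of titanium dust, filling the plate with a dozen designs adds laser time per layer but not a dozen separate builds, which is why a full plate runs about 20 hours rather than 40 or more.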
"For titanium, which is a very hard metal to manufacture with other methods, the printer allows you to make intricate parts with internal complexities relatively fast and in a multiplicative way," said Ken Gall, professor and chair of mechanical engineering and materials science, who led the effort to acquire the printer. "Rather than sending out one design with a complex shape for somebody else to make and waiting for it, you can put in 10 designs and manufacture them all at the same time." While the biomedical devices the students are designing are not yet approved for human use, the 3-D metal printer allows the students to make pieces that are of the same quality as those that could be eventually used. "For the students to actually be able to make real parts that match what the doctors want designed and then to actually test those parts is something that you would rarely see at a university," said Gall, who is associate director of Duke MEDx, a new initiative to strengthen ties between clinicians and engineers. "I wanted to change what Duke was capable of doing and part of that involved bringing a 3-D metal printer to campus. It's going to open a lot of windows for different types of projects at all levels, from freshmen to faculty." "It's a unique situation here at Duke, because of the proximity and relationship between the medical center and the mechanical engineering and materials science department," said Morton. "Traditionally it would be extremely hard for someone like me to not only have access to a printer like this, but to create finalized products, take them over to a medical center, meet with surgeons and then perform CT and tomography scans on my device all within the same day. That's extremely rare and not necessarily happening at many other institutions."


News Article | April 25, 2017
Site: www.eurekalert.org

Doctors believe that communication with their patients is important, but most studies of physician/elderly patient communication do not mention that hearing loss may affect this interaction. The findings come from a review led by two NYU professors published in the Journal of the American Geriatrics Society. Many researchers have explored communication between doctors and their patients, but how many of them have considered the importance of hearing loss? To investigate this question, a team led by Dr. Joshua Chodosh of New York University School of Medicine and Dr. Jan Blustein of NYU's Robert F. Wagner School of Public Service and the School of Medicine reviewed the published medical literature on doctor-patient communication, selecting research studies that involved patients aged 60 years and older. Of the 67 papers included in their study, only 16 (23.9%) included any mention of hearing loss. In some cases (4 out of the 67), people with hearing loss were excluded from the study. Three of the studies reported on an association between hearing loss and quality of care. In only one study did the researchers offer patients some kind of hearing assistance to see whether it would improve communication. (It found that offering hearing assistance improved patients' understanding.) "Hearing loss has long been neglected in the medical community," said Chodosh. "As a geriatrician, I see many patients who struggle to hear what I'm saying to them. That makes me less certain that they are getting what they need." The findings suggest that research on physician-elderly patient communication has largely overlooked a highly prevalent, important, and remediable influence on the quality of communication. "Patients are often older people, for whom hearing loss is a daily issue. It's also an issue that's ripe for research: how can we attend to and improve hearing and understanding so that patients get the best quality care possible?" said Blustein. 
In an accompanying editorial, Frank Lin, MD, PhD of the Johns Hopkins School of Medicine and Heather Whitson, MD, MHS of the Duke University School of Medicine noted that the review offers a major opportunity for practice improvement. "Common sense, low (or no) cost strategies can be employed to mitigate the negative impact of both hearing and vision loss in patient communication," they wrote. "And some accommodations (e.g., minimizing ambient noise, speaking face to face, creating patient education materials with large-print font) are so simple and potentially beneficial that they could be implemented universally."


News Article | April 10, 2017
Site: www.medicalnewstoday.com

Biomedical engineers have developed a way to deliver drugs to specific types of neurons in the brain, providing an unprecedented ability to study neurological diseases while also promising a more targeted way to treat them. Drugs are the tool of choice for studying the connections between neurons, and continue to be the mainstream treatment for neurological disease. But a major drawback in both endeavors is that the drugs affect all types of neurons, complicating the study of how cell receptors in the synapse - the gap between neurons - work in an intact brain, and how their manipulation can lead to clinical benefits and side effects. A new method named DART (Drugs Acutely Restricted by Tethering) may overcome these limitations. Developed by researchers at Duke University and the Howard Hughes Medical Institute, DART offers researchers the first opportunity to test what happens when a drug is targeted exclusively to one cell type. In its inaugural study, DART reveals how movement difficulties in a mouse model of Parkinson's Disease are controlled by the AMPA receptor (AMPAR) - a synaptic protein that enables neurons to receive fast incoming signals from other neurons in the brain. The results reveal why a recent clinical trial of an AMPAR-blocking drug failed, and offer a new approach to using the pharmaceutical. The paper appeared online in the journal Science. "This study marks a major milestone in behavioral neuropharmacology," said Michael Tadross, assistant professor of biomedical engineering, who is in the process of moving his laboratory from the HHMI Janelia Research Campus to Duke. "The insights we gained in studying Parkinson's mice were unexpected and could not have been obtained with any previous method." DART works by genetically programming a specific cell type to express a sort of GPS beacon. The "beacon" is an enzyme borrowed from bacteria that is inert - it does nothing more than sit on the cell surface. 
Nothing, that is, until researchers deliver drugs loaded with a special homing device. Researchers administer these drugs at such low doses that they do not affect other cells. Because the homing system is so efficient, however, the drug is captured by the tagged cells' surface, accumulating within minutes to concentrations that are 100 to 1,000 times higher than anywhere else. In an experiment using a mouse model of Parkinson's disease, Tadross and colleagues attached the homing signal beacon to two types of neurons found in the basal ganglia - the region of the brain responsible for motor control. One type, referred to as D1 neurons, are believed to give a "go" command. The other, referred to as D2 neurons, are thought to do just the opposite, providing commands to stop movements. Using DART, Tadross delivered an AMPAR-blocking pharmaceutical to only D1-neurons, only D2-neurons, or both. When delivered to both cell types simultaneously, the drugs improved only one of several components of motor dysfunction - mirroring the lackluster results of recent human clinical trials. The team then found that delivering the drug to only the D1/"go" neurons did absolutely nothing. Surprisingly, however, by targeting the same drug to D2/"stop" neurons, the mice's movements became more frequent, faster, fluid and linear - in other words, much closer to normal. While the drug stops neurons from receiving certain incoming signals, it does not completely shut them down. This nuance is particularly important for a subset of the D2 neurons that have two prominent forms of firing. With DART, these components could be separately manipulated, providing the first evidence that Parkinson's motor deficits are attributable to the AMPAR-based component of firing in these cells. Tadross said this level of nuance could not have been obtained with prior cell type-specific methods that completely shut neurons down. 
"Already in our first use of DART, we've learned something new about the synaptic basis of circuit dysfunction in Parkinson's disease," said Tadross. "We've discovered that targeting a specific receptor on specific types of neurons can lead to surprisingly potent improvements." Tadross is already looking into how this discovery might translate into a new therapy by delivering drugs to these neurons through an emerging viral technique. He is also beginning work to develop a version of DART that does not need the genetically added homing beacon to work. Both efforts will require years of research before seeing fruition - but that's not stopping Tadross. "All too often in basic science, approaches are developed that may 'one day' make a difference to human health," he said. "At Duke, there's a palpable emphasis on providing new treatments to people as quickly as possible. I'm very excited that in this environment, my lab can work collaboratively with scientists, physicians, and biotech to solve the real-world challenges involved." This research was funded by the Howard Hughes Medical Institute.


News Article | April 17, 2017
Site: www.eurekalert.org

Boulder, Colo. -- April 12, 2017 -- Dr. Robin Canup, associate vice president of the Space Science and Engineering Division at Southwest Research Institute (SwRI) has been named a member of the American Academy of Arts and Sciences. The 2017 class of inductees includes leaders from academia, business, public affairs, the humanities and the arts. Academy members contribute to publications and studies of science and technology policy, energy and global security, the humanities and culture, and education. Canup, who joined SwRI in 1998, is particularly known for her studies concerning the formation of planets and their satellites, including her research that demonstrated a single impact from a Mars-sized object could have produced the Earth-Moon system. "This is fantastic recognition for Robin and her research," said Dr. Jim Burch, vice president of SwRI's Space Science and Engineering Division. "Her work has been vastly important to our understanding of the Earth-Moon system and our place in the universe." Canup holds a bachelor's degree in physics from Duke University, and a master's degree and doctorate in astrophysical, planetary and atmospheric sciences from the University of Colorado at Boulder. She has received several honors during her career including the American Astronomical Society Division for Planetary Sciences' Harold Urey Prize (2003) and the American Geophysical Union's Macelwane Medal (2004). She was also named one of Popular Science magazine's "Brilliant 10" young scientists to watch (2004) and was elected a member of the National Academy of Sciences (2012). Canup will be inducted Oct. 7, 2017, in Cambridge, Mass. Other inductees of the Academy's class of 2017 include singer-songwriter John Legend, mathematician Maryam Mirzakhani, writer Chimamanda Ngozi Adichie, and award-winning actress Carol Burnett.


News Article | April 18, 2017
Site: news.yahoo.com

(Reuters Health) - More than nine million people may miss out on cholesterol-lowering drugs that prevent heart attacks and strokes if doctors choose one set of medical guidelines over another, according to a new study. That's because the government-backed U.S. Preventive Services Task Force (USPSTF) set a higher threshold for use of the drugs, known as statins, than the American College of Cardiology and the American Heart Association (ACC/AHA). "I would say we’re still searching for the perfect guidelines," said lead author Michael Pencina, of Duke University in Durham, North Carolina. The 2013 ACC/AHA guidelines recommend statins for people ages 40 to 75 with at least a 7.5 percent risk of having a heart attack or stroke in the next 10 years. (The ACC/AHA cardiovascular risk estimator tool is available online here: http://bit.ly/2pPwoXh.) The ACC/AHA also recommends statins for people with cardiovascular disease, for diabetics between ages 40 and 75 and for adults with high levels of “bad” low-density lipoprotein cholesterol. The 2016 USPSTF recommendation endorses statins for people ages 40 to 75 with a 10 percent or greater risk of a heart attack or stroke over the next decade and at least one cardiovascular risk factor like diabetes or high blood pressure. Pencina told Reuters Health that fewer people would be using statins under the more conservative USPSTF guidelines. "What we wanted to do is quantify the impact and look at what it means in terms of numbers." The researchers applied the recommendations to nationally representative data collected from 3,416 people without a history of cardiovascular disease between 2009 and 2014. Overall, 21.5 percent were already on statins to prevent heart attacks and strokes. An additional 24.3 percent would be on statins if all doctors followed the ACC/AHA guidelines, compared to an additional 15.8 percent if all doctors followed the USPSTF recommendation. 
The difference between the two guidelines represents about 9.3 million people in the United States, the researchers write in JAMA. Under the USPSTF guidelines, some diabetics would be excluded from statin use. More than half of those excluded would be middle-aged adults with a more than 30 percent average risk of a cardiovascular event over the next 30 years. "About one in three people are going to experience a cardiovascular event over the next 30 years," said Pencina. In a statement to Reuters Health, the USPSTF said its recommendations are based on the best available evidence about a preventive service's benefits and harms. "Because the USPSTF makes recommendations that are closely tied to the available evidence, we focused on recommending statins for the people who the evidence showed were most likely to benefit, though ultimately this decision should be made through a conversation between each patient and their doctor," the statement continued. In its review of evidence, the USPSTF focused on 19 trials involving a total of 71,344 people who had no history of cardiovascular disease. Overall, people were 14 percent less likely to die during the study period if they were taking statins than if they were taking a dummy pill or nothing at all. The risk of serious side effects from statins was also low. The USPSTF is always more conservative in its recommendations than professional organizations - not just for cholesterol, said Dr. Steve Nissen, chairman of the Robert and Suzanne Tomsich Department of Cardiovascular Medicine at the Cleveland Clinic. "Whether you treat or not treat is frankly something that should be a discussion between patient and physician," he told Reuters Health. "That’s how I do it." Nissen, who was not involved in the new study, said some entity should step in to clear up the confusion between the USPSTF, ACC/AHA and several other statin guidelines. "I’m not terribly happy to have multiple guidelines floating around out there," he said. 
Pencina said it's important for patients to be informed about their risk of cardiovascular disease and understand the risks and benefits of statins. "Both sets of guidelines - to their credit - recommend an informed decision between the patient and the clinician," he said. "Those are crucial."
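The risk thresholds the article describes amount to simple decision rules. As an illustration only, a minimal sketch of the threshold portion of each guideline might look like the following (the function names are hypothetical, and the real guidelines include further criteria, such as existing cardiovascular disease and LDL cholesterol levels, that are omitted here):

```python
# Simplified sketch of the two statin-eligibility thresholds described above.
# Only the age range and 10-year risk cutoffs are modeled; this is not a
# clinical tool.

def acc_aha_eligible(age: int, ten_year_risk: float) -> bool:
    """2013 ACC/AHA: ages 40-75 with at least a 7.5% 10-year risk."""
    return 40 <= age <= 75 and ten_year_risk >= 0.075

def uspstf_eligible(age: int, ten_year_risk: float,
                    has_risk_factor: bool) -> bool:
    """2016 USPSTF: ages 40-75, a 10% or greater 10-year risk, and at
    least one cardiovascular risk factor (e.g., diabetes, hypertension)."""
    return 40 <= age <= 75 and ten_year_risk >= 0.10 and has_risk_factor

# A 55-year-old with an 8% ten-year risk and high blood pressure falls in
# the gap between the two guidelines the study quantifies.
print(acc_aha_eligible(55, 0.08))        # True
print(uspstf_eligible(55, 0.08, True))   # False
```

People in that gap, between 7.5 and 10 percent ten-year risk, are roughly the 9.3 million the study estimates would be treated under one guideline but not the other.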


News Article | April 18, 2017
Site: www.eurekalert.org

If you are raised by other species, then how do you know who you are? Although heterospecific foster parents rear brood parasitic brown-headed cowbird chicks, juvenile cowbirds readily recognize and affiliate with other cowbirds. That's because they have a secret handshake or password. Specifically, the "password" hypothesis helps explain this paradox of species recognition: Social recognition processes in brood parasites are initiated by exposure to a password: in the case of cowbirds, a specific chatter call. A new study appearing in the Journal of Experimental Biology describes the neural basis for password-based species recognition in cowbirds. Roughly 1% of bird species are obligate brood parasites. Female obligate brood parasites shirk parental care duties by laying their eggs in the nests of other females. This breeding strategy is extremely successful for the female parasite but raises questions, particularly with respect to species recognition. For instance, how does a juvenile bird that is not raised by familial members come to recognize its own species and avoid imprinting on the host species that cared for it from the day it hatched? One possibility is that young brood parasites use a password to identify conspecifics, and learning about species-specific signals occurs only after the password is used to find conspecifics. Researchers have now demonstrated the neural basis for password-based species recognition in an obligate brood parasite. They showed that the auditory forebrain regions in cowbirds, which respond selectively to learned vocalizations, such as songs, also respond selectively to non-learned chatter. However, if the password is not used to locate other cowbirds, the young brood parasite will mis-imprint on its host species -- a process manifested in the brain by elevated gene induction in response to the host's song. 
"Our study reveals a neural basis for this password as well as a neural signature of mis-imprinting in young brood parasites that have prolonged exposure to host species songs," said Dr. Kathleen Lynch, lead author of the study and Assistant Professor of Biology at Hofstra University. Dr. Mark Hauber, Professor of Psychology at Hunter College and the Graduate Center of the City University of New York (CUNY), who co-authored the article, first carried out behavioral experiments to find evidence for password-based species recognition. Dr. Hauber said, "After our discovery of the password as a behavioral mechanism in parasitic cowbirds over 15 years ago as a graduate student, it is rewarding for me to be working on an NSF [National Science Foundation] grant to identify the neural basis of this behavior as a professor." Unlike parental male songbirds, which usually learn to sing at a very young age by mimicking their fathers, parasitic cowbirds learn song in their second year and delay song production until their third year. "This study is interesting because the particular life history of brood parasitic songbirds such as the brown-headed cowbird requires song learning to proceed differently in this species than in most others," said Dr. Jill Soha, Associate Scholar at Duke University, who was not affiliated with the study. "Understanding the neural mechanisms that guide this type of song learning advances our knowledge not only of brood parasite ontogeny and evolution but also, through comparative study, our understanding of the neural mechanisms underlying song learning in general." Dr. Lynch and her colleagues have revealed novel insights into the neural basis of species recognition in cowbirds, which dovetail with known behavioral responses and advance our understanding of social recognition in brood parasites. 
Hofstra University is a dynamic private institution of higher education where more than 11,000 full- and part-time students choose from undergraduate and graduate offerings in liberal arts and sciences, business, engineering, applied science, communication, education, health sciences and human services, honors studies, the Maurice A. Deane School of Law and the Hofstra North Shore-LIJ School of Medicine. The City University of New York is the nation's leading urban public university. Founded in New York City in 1847, the University comprises 24 institutions: 11 senior colleges, seven community colleges, and additional professional schools. The University serves nearly 275,000 degree-credit students and 218,083 adult, continuing and professional education students. For more information, please contact Shante Booker or visit http://www.


News Article | April 3, 2017
Site: www.techtimes.com

Thyroid cancer has been reported to be the fastest increasing cancer diagnosis in the United States. A new study says the annual growth in thyroid cancer incidence tripled between 1975 and 2013, with a majority of the new diagnoses being papillary thyroid cancer. In recent years, epidemiologists have been attributing the surge in thyroid cancer to the broad detection of more cases. With advanced tools like fine-needle biopsies and ultrasound systems, doctors are now better equipped to diagnose thyroid cancer, including slow-growing tumors that pose little risk. "While overdiagnosis may be an important component to this observed epidemic, it clearly does not explain the whole story," said Dr. Julie Sosa, co-author and Duke University's head of endocrine surgery. From the database of the National Cancer Institute, the researchers analyzed more than 77,000 cases of thyroid cancer reported between 1974 and 2013. Analysis showed a tripling of thyroid cancer cases over that period. From 1994 to 2013, advanced thyroid cancer cases increased 3 percent annually, while the number of deaths increased by nearly 1 percent every year. Sosa said thyroid cancers are showing a marked increase despite the relatively low lethality of the disease. The new study rules out increased detection as the sole explanation for the rising rates of thyroid cancer, attributing the increase in incidence and mortality to many factors, including exposure to flame retardants. The findings have been published in the Journal of the American Medical Association and the paper was presented at the Endocrine Society's meeting in Orlando recently. Flame retardants in home products are one of the causative factors of papillary thyroid cancer, according to the study. 
Studies have shown that many flame retardants contain endocrine-disrupting chemicals that interfere with thyroid homeostasis, so the researchers turned their attention to flame retardants to trace their relationship with papillary thyroid cancer. "Our study results suggest higher exposure to several flame retardants in the home environment may be associated with the diagnosis and severity of papillary thyroid cancer," Sosa said. Papillary thyroid cancer is exacerbated by polybrominated diphenyl ethers, which are pollutants present in home products, plastics, foodstuff, and pesticides. According to the National Cancer Institute, more than 60,000 Americans are diagnosed with thyroid cancer a year. Of these, nearly 75 percent are women and 82 percent are white. In another study, Hispanics and African Americans were found to be among the ethnic groups showing higher vulnerability to thyroid cancer. "Thyroid cancer incidence is leveling off in the United States. Our analysis, however, shows that the trend of deceleration mainly occurred in non-Hispanic Whites and in older populations, whereas the rate of thyroid cancer continuously increased among the young and the Hispanic and black populations," noted lead author Anupam Kotwal. Other reasons behind the increase in thyroid cancer are rising obesity rates and a decreasing number of smokers. The proportion of obese adults in the United States tripled between 1960 and 2012, with the highest growth in numbers recorded between 1980 and 2010. Another factor aiding the growth of thyroid cancer comes as a surprise - a decline in smoking. Though smoking threatens the heart and lungs, it is associated with reducing the risk of thyroid cancer by 30 to 40 percent, added the study. However, this shouldn't be taken as a recommendation to smoke in order to prevent thyroid cancer, the researchers warned. 
"It's just an interesting association that we see in our data, and it provides some clues to what factors are involved in thyroid cancer development," said NCI epidemiologist Cari Kitahara. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


More information: A neural basis for password-based species recognition in an avian brood parasite, Journal of Experimental Biology (2017), doi: 10.1242/jeb.158600, jeb.biologists.org/content/early/2017/04/12/jeb.158600


News Article | April 28, 2017
Site: www.businesswire.com

BRENTWOOD, Tenn.--(BUSINESS WIRE)--LifePoint Health, Inc. (NASDAQ: LPNT) today announced results for the first quarter ended March 31, 2017. For the first quarter ended March 31, 2017, consolidated revenues were $1,630.2 million, up 3.1% from $1,580.7 million for the same period last year. Net income for the first quarter ended March 31, 2017, was $64.0 million, up $40.1 million, compared with net income of $23.9 million for the same period last year. Net income for the first quarter ended March 31, 2017, includes other non-operating gains of $25.9 million, or $16.2 million net of income taxes, of which $18.0 million, or $11.3 million net of income taxes, is related to the settlement of a contingent liability previously established in connection with a prior hospital acquisition, and $7.9 million, or $4.9 million net of income taxes, is related to the transfer of certain of the Company’s home health agencies and hospices to an unconsolidated joint venture partnership. Net income for the first quarter ended March 31, 2016, includes charges of $24.7 million, or $15.5 million net of income taxes, related to cardiology-related lawsuits, and an impairment loss of $1.2 million, or $0.8 million net of income taxes, related to the write-off of certain capital assets. Diluted earnings per share attributable to LifePoint Health, Inc. stockholders for the first quarter ended March 31, 2017, increased to $1.46 compared with $0.48 for the same period last year. Diluted earnings per share attributable to LifePoint Health, Inc. stockholders for the first quarter ended March 31, 2017, were positively impacted by $0.39 per share as a result of the aforementioned non-operating gains in the aggregate. Diluted earnings per share attributable to LifePoint Health, Inc. stockholders for the first quarter ended March 31, 2016, were negatively impacted by $0.37 per share as a result of the combination of the aforementioned cardiology-related lawsuits and impairment loss. 
When adjusted to exclude these various items, adjusted diluted earnings per share attributable to LifePoint Health, Inc. stockholders for the first quarter ended March 31, 2017, increased to $1.07 per share compared with $0.85 per share for the same period last year. Additional information regarding adjusted diluted earnings per share attributable to LifePoint Health, Inc. stockholders, including uses by management and others and a reconciliation to diluted earnings per share attributable to LifePoint Health, Inc. stockholders, is set forth under the section titled “Unaudited Supplemental Information.” Finally, Adjusted EBITDA for the first quarter ended March 31, 2017, increased 21.1% to $195.6 million compared with $161.6 million for the same period last year, and Adjusted Normalized EBITDA for the first quarter ended March 31, 2017, increased 5.0% to $195.6 million compared with $186.3 million for the same period last year. Adjusted Normalized EBITDA for the first quarter ended March 31, 2016, has been adjusted to exclude the impact of $24.7 million in charges related to cardiology-related lawsuits. Additional information regarding Adjusted EBITDA and Adjusted Normalized EBITDA, including definitions, uses by management and others and a reconciliation to net income, is set forth in this release under the section titled “Unaudited Supplemental Information.” Commenting on the results, William F. Carpenter III, Chairman and Chief Executive Officer of LifePoint Health, said, “We are pleased with our first quarter, in which we continued to see sequential volume improvement and margin expansion. Our efforts to integrate our recently acquired hospitals and the $2.3 billion of revenue that we have added over the last few years are on track.” 
“We remain committed to our strategic priorities of delivering high-quality care and service, growth, cost management and the development of high-performing talent to drive value for our shareholders.” A listen-only simulcast, as well as a 30-day replay, of LifePoint Health’s first quarter 2017 conference call will be available online at www.lifepointhealth.net/investor-relations today, Friday, April 28, 2017, beginning at 10:00 a.m. Eastern Time. LifePoint Health (NASDAQ: LPNT) is a leading healthcare company dedicated to Making Communities Healthier®. Through its subsidiaries, it provides quality inpatient, outpatient and post-acute services close to home. LifePoint owns and operates community hospitals, regional health systems, physician practices, outpatient centers, and post-acute facilities in 22 states. It is the sole community healthcare provider in the majority of the non-urban communities it serves. More information about the Company can be found at www.LifePointHealth.net. All references to “LifePoint,” “LifePoint Health” or the “Company” used in this release refer to affiliates or subsidiaries of LifePoint Health, Inc. Important Legal Information. Certain statements contained in this release are based on current management expectations and are “forward-looking statements” within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended, and are intended to qualify for the safe harbor protections from liability provided by the Private Securities Litigation Reform Act of 1995. Numerous factors exist which may cause results to differ from these expectations. Many of the factors that will determine our future results are beyond our ability to control or predict with accuracy. 
Such forward-looking statements reflect the current expectations and beliefs of the management of LifePoint, are not guarantees of performance and are subject to a number of risks, uncertainties, assumptions and other factors that could cause actual results to differ from those described in the forward-looking statements. These forward-looking statements may also be subject to other risk factors and uncertainties, including without limitation: (i) the effects related to the enactment and implementation of healthcare reform, the possible repeal and replacement of the Affordable Care Act, the possible enactment of additional federal or state healthcare reforms and possible changes in healthcare reform laws and other federal, state or local laws or regulations affecting the healthcare industry including the timing of the implementation of reform; (ii) the extent to which states support increases, decreases or changes in Medicaid programs, or alter the provision of healthcare to state residents through regulation or otherwise; (iii) delays in receiving payments for services provided, reductions in Medicare or Medicaid payments (including increased recoveries made by Recovery Audit Contractors (RACs) and similar governmental agents), compared to the timing of expanded coverage; (iv) reductions in reimbursements from commercial payors and risks associated with consolidation among commercial insurance companies and shifts to insurance plans with narrow networks, high deductibles or high co-payments; (v) the continued viability of our operations through joint venture entities, the largest of which is Duke LifePoint Healthcare, our partnership with a wholly controlled affiliate of Duke University Health Systems, Inc.; (vi) our ability to successfully integrate acquired facilities into our ongoing operations and to achieve the anticipated financial results and synergies from such acquisitions, individually or in the aggregate; (vii) the deterioration in the collectability of 
“bad debt” and “patient due” accounts, and the number of individuals without insurance coverage (or who are underinsured) who seek care at our facilities; (viii) industry emphasis on value-based purchasing and bundled payment arrangements; (ix) whether our efforts to reduce the cost of providing healthcare while increasing the quality of care are successful; (x) the ability to attract, recruit or employ and retain qualified physicians, nurses, medical technicians and other healthcare professionals and the increasing costs associated with doing so, including the direct and indirect costs associated with employing physicians and other healthcare professionals; (xi) the loss of certain physicians in markets where such a loss can have a disproportionate impact on our facilities in such market; (xii) the application and enforcement of increasingly stringent and complex laws and regulations governing our operations and healthcare generally (and changing interpretations of applicable laws and regulations), related enforcement activity and the potentially adverse impact of known and unknown government investigations, litigation and other claims that may be made against us; (xiii) risks due to cybersecurity attack or security breach and our access to personal information of patients and employees; (xiv) our ability to successfully implement enterprise-wide information technology systems; (xv) payor controls designed to reduce inpatient services; (xvi) our ability to generate sufficient cash flow to fund all of our capital expenditure programs and commitments; (xvii) adverse events in states where a large portion of our revenues are concentrated; (xviii) liabilities resulting from potential malpractice and related legal claims brought against our facilities or the healthcare providers associated with, or employed by, such facilities or affiliated entities; (xix) our increased dependence on third parties to provide purchasing, revenue cycle and payroll services and 
information technology and their ability to do so effectively; (xx) our ability to acquire healthcare facilities on favorable terms and the business risks, unknown or contingent liabilities and other costs associated therewith; and (xxi) those other risks and uncertainties described from time to time in our filings with the Securities and Exchange Commission. Therefore, our future results may differ materially from those described in this release. LifePoint undertakes no obligation to update any forward-looking statements, or to make any other forward-looking statements, whether as a result of new information, future events or otherwise. Adjusted EBITDA is defined by the Company as earnings before depreciation and amortization; interest expense, net; other non-operating (gains) loss; provision for income taxes; and net income attributable to noncontrolling interests and redeemable noncontrolling interests. Additionally, Adjusted Normalized EBITDA has been adjusted to exclude the impact of $24.7 million in charges related to cardiology-related lawsuits recognized during the first quarter of 2016. LifePoint’s management and Board of Directors use Adjusted EBITDA and Adjusted Normalized EBITDA to evaluate the Company’s operating performance and as a measure of performance for incentive compensation purposes. LifePoint’s credit facilities use Adjusted EBITDA, subject to further permitted adjustments, for certain financial covenants. The Company believes Adjusted EBITDA and Adjusted Normalized EBITDA are measures of performance used by some investors, equity analysts, rating agencies and lenders to make informed decisions as to, among other things, the Company’s ability to incur and service debt and make capital expenditures. In addition, multiples of current or projected Adjusted EBITDA and Adjusted Normalized EBITDA are used by some investors and equity analysts to estimate current or prospective enterprise value. 
Adjusted EBITDA and Adjusted Normalized EBITDA should not be considered as measures of financial performance under U.S. generally accepted accounting principles (“GAAP”), and the items excluded from Adjusted EBITDA and Adjusted Normalized EBITDA are significant components in understanding and assessing financial performance. Adjusted EBITDA and Adjusted Normalized EBITDA should not be considered in isolation or as an alternative to net income, cash flows generated by operating, investing or financing activities or other financial statement data presented in the condensed consolidated financial statements as an indicator of financial performance. Because Adjusted EBITDA and Adjusted Normalized EBITDA are not measurements determined in accordance with GAAP and are susceptible to varying calculations, Adjusted EBITDA and Adjusted Normalized EBITDA as presented may not be comparable to other similarly titled measures of other companies. The following table reconciles net income as reflected in the unaudited condensed consolidated statements of operations to Adjusted EBITDA and Adjusted Normalized EBITDA: From time to time, the Company incurs certain non-recurring gains or losses that are normally nonoperational in nature and that it does not consider relevant in assessing its ongoing operating performance. When significant, LifePoint’s management and Board of Directors typically exclude these gains or losses when evaluating the Company’s operating performance and in certain instances when evaluating performance for incentive compensation purposes. Additionally, the Company believes that some investors and equity analysts exclude these or similar items when evaluating the Company’s current or future operating performance and in making informed investment decisions regarding the Company. Accordingly, the Company provides adjusted diluted earnings per share attributable to LifePoint Health, Inc. 
stockholders as a supplement to its comparable GAAP measure of diluted earnings per share attributable to LifePoint Health, Inc. Adjusted diluted earnings per share attributable to LifePoint Health, Inc. stockholders should not be considered as a measure of financial performance under GAAP, and the items excluded from adjusted diluted earnings per share attributable to LifePoint Health, Inc. stockholders are significant components in understanding and assessing financial performance. Adjusted diluted earnings per share attributable to LifePoint Health, Inc. stockholders should not be considered in isolation or as an alternative to diluted earnings per share attributable to LifePoint Health, Inc. stockholders as presented in the condensed consolidated financial statements. The following table reconciles diluted earnings per share attributable to LifePoint Health, Inc. stockholders as reflected in the unaudited condensed consolidated statements of operations to adjusted diluted earnings per share attributable to LifePoint Health, Inc. stockholders:


News Article | May 4, 2017
Site: www.businesswire.com

LOS ANGELES--(BUSINESS WIRE)--TaskUs is in yet another year of exponential revenue growth, and amid recent announcements of significant domestic expansion, the company has named tech industry veteran Jarrod Johnson Senior Vice President of Sales. Johnson has significant growth experience with some of the most well-known tech companies and consumer brands in the world. “Jarrod is a tremendous asset. He brings twenty years of experience growing businesses in the technology and services sectors and a deep understanding of our space,” noted CEO Bryce Maddock. “Jarrod will be instrumental in our growth strategy both domestically and internationally.” Johnson is a veteran of IBM and Xerox (after its acquisition of Dallas-based ACS). Most recently, he worked with FacilitySource, a SaaS platform and Real Estate Outsourcing leader backed by Warburg Pincus, where he was the SVP of Sales, Marketing, and Client Development. At Xerox, Johnson was Group President of Retail and Consumer Brand IT services globally and held multiple sales leadership positions. Over his 10 years at IBM, Johnson was involved in digital consulting, sales, and sales operations responsible for a $4.5B sales territory. In this position, Johnson helped retail, consumer products, and travel/transportation clients in the US build and manage leading digital solutions to drive revenue growth. Johnson earned his Master of Business Administration from the Fuqua School of Business at Duke University. Johnson lives in Dallas, Texas, where TaskUs has opened a sales office. The company’s first domestic outsourcing site opened last year in San Antonio, and last month the company announced plans for expansion of the site and the creation of 500 new jobs. Johnson stated, “TaskUs provides an unmatched opportunity for me to continue to support the leading and disrupting companies in the digital economy. 
Our commitment to scaling our operating platform globally allows us to serve our clients with even more solutions, across even more markets.” TaskUs provides the people, process and technology that power the world’s most notable brands and disruptive companies. It is the leading provider of customer care and back office outsourcing to evolving businesses. Its unique focus on transformational growth scales support systems and increases its partners’ bottom lines. To learn about Ridiculously Good Outsourcing options, visit TaskUs.com.
