Sydney, Australia

The University of Sydney is a public university located in Sydney, New South Wales. The main campus spreads across the suburbs of Camperdown and Darlington on the south-western outskirts of the Sydney CBD. Founded in 1850, it is the oldest university in Australia and Oceania. It has 32,393 undergraduate and 16,627 graduate students. The University of Sydney is organised into sixteen faculties and schools, through which it offers bachelor's degrees, master's degrees, and doctoral degrees. (Source: Wikipedia)



Patent
Salk Institute for Biological Studies and University of Sydney | Date: 2016-09-12

Embodiments of novel compounds, of a method of making the same, and of a composition comprising them are disclosed herein. Also disclosed are embodiments of a method of treating or preventing a metabolic disorder in a subject, comprising administering to a subject (e.g., via the gastrointestinal tract) a therapeutically effective amount of one or more of the disclosed compounds, thereby activating FXR receptors in the intestines, and treating or preventing a metabolic disorder in the subject. Additionally disclosed are embodiments of a method of treating or preventing inflammation in an intestinal region of a subject, comprising administering to the subject (e.g., via the gastrointestinal tract) a therapeutically effective amount of one or more of the disclosed compounds, thereby activating FXR receptors in the intestines, and thereby treating or preventing inflammation in the intestinal region of the subject.


Patent
University of Sydney | Date: 2015-03-12

The invention relates to RNA production and processing in plastids of higher plants.


International development practitioners are highly optimistic that mobile phones can improve the lives of the inhabitants of remote rural areas in developing countries with an underdeveloped transportation infrastructure. However, the instrumental role of telecommunication is unclear in contexts where residents’ information-sharing networks are strongly geographically constrained by their limited mobility. Empirical research on the interactions between telecommunication and travel in rural areas of developing countries is lacking. This study analyses physical and virtual contact patterns within 1270 instrumental information-sharing relationships reported by the inhabitants of the Pulau Panggung and Sumber Rejo rural subdistricts of Indonesia. In 2013, we implemented an exogenous mobility intervention. In 2014, we administered a network survey in 16 randomly selected farming groups to map local residents’ egocentric and sociocentric physical and virtual travel networks. By comparing the observed networks with simulated random networks, analysing the relationship characteristics and their history, and performing a regression analysis with fixed effects, we examine the complementarity and substitution between telecommunication and travel in the creation and maintenance of social networks. By examining the effects of the exogenous intervention, we can explain the mechanisms underlying the uncovered associations. The results suggest path dependency between physical and virtual travel in remote rural areas. The implication for transportation policy is that physical mobility is a precondition for the creation of virtual information-sharing links. Instrumental communication relationships that do not socially require regular physical co-presence can be partially substituted by virtual travel only after virtual links have been created through physical mobility. 
Therefore, in contrast with general expectations, mobile telephony in remote rural regions is more practical if the transportation infrastructure is adequately developed. The paper concludes with a discussion of the potential contribution of the sociocentric network perspective to transportation research. © 2017 The Author


Hensher D.A.,University of Sydney
Transportation Research Part A: Policy and Practice | Year: 2017

The digital age has opened up new opportunities to improve the customer experience in using public transport. Specifically, we see the role of smart technology in the hands of customers as the new rubric to deliver services that are individualised to the needs and preferences of current and future public transport users. This frontline of service delivery has become known as mobility as a service (MaaS), whereby an individual can book a service delivered through a range of possible modes of transport. At one extreme we have point-to-point car-based services such as Uber, Lyft, BlaBlaCar and RydHero (for children), with futuristic suggestions of these gravitating to driverless vehicles (cars and buses). Variations around this future are bus-based options that include smart bookable 'point-via-point-to-point' services that offer up options on travel times and fares (with the extreme converting to the point-to-point car service, possibly also operated by a bus business); as well as the continuation of conventional bus services (with larger buses) where the market for smart MaaS is difficult or inappropriate to provide (e.g., contracted (often free) school bus services). This paper, as a think piece, presents a number of positions that could potentially represent future contexts in which bus services might be offered, recognising that a hybrid multi-modal state of affairs may be the most appealing new contract setting, enabling the design of contracts to be driven by the mode-neutral customer experience, and the growing opportunity to focus on MaaS. We suggest that the adrenaline rush for mobility services, however, may not deliver the full solution that supporters are suggesting. © 2017 Elsevier Ltd


The land sector is essential to achieve the Paris Agreement's goals. Agriculture and land use contribute between 20 and 25 per cent of global greenhouse gas emissions. The Paris Agreement's aim to keep the average global temperature rise between 1.5 and 2 degrees Celsius implies that drastic emission cuts from agriculture are needed. The sequestration potential of agriculture and land use offers an important mechanism to achieve a transition to net-zero carbon emissions worldwide. So far, however, states have been reluctant to address emissions from, and sequestration by, the agricultural sector. Some states that have or are setting up a domestic emission-trading scheme allow for the generation of offsets in agriculture, but only to a limited extent. Australia is the only country that has a rather broad set of methodologies in place to award credits to farmers for all kinds of carbon-farming projects. This article reviews the experience with the Australian model so far, with the objective of articulating transferable lessons for regulatory design aimed at reducing greenhouse gas emissions from agriculture. It finds that it is possible to regulate for the reduction of emissions from agriculture and for increased sequestration in agricultural soils and in vegetation on agricultural lands, provided that certain conditions are met. Regulation must focus on individual projects at farms, based on a long-term policy that has a wider focus than just emission reduction. Such projects must comply with climate-smart methodologies that ensure the delivery of real, additional, measurable, and verifiable emission reductions and also foster long-term innovation and create economic, social, and environmental co-benefits. Moreover, a robust and reliable MRV (measurement, reporting and verification) system must be put in place. © Koninklijke Brill NV, Leiden, 2017.


Smith G.J.D.,Australian National University | O'Malley P.,University of Sydney
British Journal of Criminology | Year: 2017

The experience of driving is mediated by a politics of data-driven governance and resistance. These politics hinge on the extensive use of networked digital devices/data by road authorities and users. The former operate such technologies to manipulate the behaviour of drivers, while the latter deploy them to subvert the depersonalizing systems of control to which they are subjected. Using evidence derived from two online forums, we explore both the meanings that certain road users ascribe to the simulated justice they experience, but also the mediated practices of resistance they perform. We suggest that this example of 'technoscientific citizenship', where in response to unpalatable crime control measures discrete drivers coalesce on virtual forums and share/crowdsource digital data, poses some interesting new epistemic questions as regards emerging forms of public criminology. © The Author 2017.


Fleming S.,University of Sydney
Optics InfoBase Conference Papers | Year: 2016

Metamaterials are a hot research topic which has excited significant media interest. A novel undergraduate experiment provides hands-on fabrication of a metamaterial and characterization of diffraction-free propagation, with potential to inspire and excite students. © OSA 2016.


Black C.M.,University of Sydney
Transnational Environmental Law | Year: 2017

Economic arguments in support of linking emissions trading schemes suggest that such linking could provide access to lower cost abatement options and increase market stability. The decisions of whether and how to link emissions trading schemes often focus on the design features of the relevant schemes, but an additional factor which has the potential to undermine the efficiency of linked schemes is taxation. This article systematically tests two alternative approaches to the direct (income) taxation of cross-border transfers of emission allowances for differential tax outcomes. Four hypothetical transactions are considered under three different linking mechanisms and on the assumption that a tax treaty based on the OECD Model Tax Convention on Income and on Capital is in force. This analysis evidences that, in some cases – and especially if the relevant jurisdictions adopt different approaches to the taxation of allowance transactions under domestic law – there is the potential for timing differences or double taxation that could impact on the efficiency of the linked trading schemes. It is therefore important for tax implications to be considered as part of any linking proposal. © Cambridge University Press 2017


Lyster R.,University of Sydney
Environmental Politics | Year: 2017

A Capability Approach is adopted to critically analyse, in the interests of Climate Justice, whether the Paris Agreement is likely to adequately protect human and non-human Capabilities from the worst impacts of climate disasters. The mitigation, adaptation, and loss and damage provisions of the Paris Agreement are not convincing. Adaptation offers only a modest response to climate change, compared with mitigation, and current financial commitments to fund adaptation in developing countries are far too low. Consequently, the Parties to the United Nations Framework Convention on Climate Change have a long way to go in their negotiations before they have any hope of meeting their agreed temperature goals, and protecting human and non-human Capabilities from climate disasters. © 2017 Informa UK Limited, trading as Taylor & Francis Group


Kwan T.H.,University of Sydney | Wu X.,University of Sydney
Applied Energy | Year: 2017

The thermoelectric generator (TEG) is a clean and noiseless renewable electrical power source that requires no moving parts. The hybrid photovoltaic cell and thermoelectric generator (PV/TEG) system is also widely studied because of its potentially improved power conversion efficiency over its monolithic counterparts. This paper continues the work of several previous publications by the authors; the focus here is to perform maximum power point tracking (MPPT) on the hybrid PV/TEG system. It reuses the Lock-On Mechanism (LOM) MPPT algorithm, which the authors previously applied to the solar panel and the TEG alone in two separate publications. In comparison to conventional fixed-step MPPT algorithms, the LOM algorithm improves MPP tracking performance by adaptively scaling the DC-DC converter duty cycle whenever the MPP is located. In doing so, the steady-state oscillations become negligibly small, and can thus be considered eliminated, and a smooth steady-state MPP response is achieved. The simulation and experiment in this paper are conducted using a double SEPIC converter, where each input source is treated independently with the proposed algorithm. Results show that the proposed algorithm is fast and stable in comparison to the conventional fixed-step hill-climbing algorithm. © 2017 Elsevier Ltd.
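The step-scaling idea the abstract credits to the LOM algorithm can be sketched in a few lines: a perturb-and-observe loop that shrinks its perturbation step each time the maximum power point is crossed, so the steady-state oscillation decays rather than persisting. The toy source curve, step constants and lock criterion below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of an adaptive-step perturb-and-observe MPPT, illustrating the
# step-scaling ("lock-on") idea from the abstract. The PV model and all
# constants are assumptions for illustration only.

def pv_power(duty):
    """Toy concave source curve: maximum power 100 W at duty cycle 0.6."""
    return max(0.0, 100.0 - 400.0 * (duty - 0.6) ** 2)

def mppt_track(steps=60, duty=0.2, step=0.05, shrink=0.5, min_step=1e-4):
    """Hill-climb the converter duty cycle. Whenever a perturbation
    reduces the extracted power, the MPP has been crossed: reverse the
    direction and shrink the step, so the oscillation around the MPP
    decays toward zero instead of persisting at a fixed amplitude."""
    power = pv_power(duty)
    direction = 1.0
    for _ in range(steps):
        duty = min(1.0, max(0.0, duty + direction * step))
        new_power = pv_power(duty)
        if new_power < power:          # MPP crossed: lock on by shrinking
            direction = -direction
            step = max(min_step, step * shrink)
        power = new_power
    return duty, power

duty, power = mppt_track()
print(duty, power)
```

A fixed-step version of the same loop (drop the `step * shrink` line) would keep oscillating around 0.6 with amplitude 0.05 forever, which is exactly the steady-state ripple the adaptive scaling removes.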


Warren C.R.,University of Sydney
Soil Biology and Biochemistry | Year: 2017

During the early stages of ecosystem development there are increases in plant and soil microbial biomass, nutrient availability and rates of nutrient cycling; but little is known about how pools of small organic N vary during the initial stages of soil development. The aim of this study was to examine how the pool of small organic N compounds varies during the initial stages of soil development, and if age differentially affects D- and L-enantiomers of protein amino acids. Measurements were made at a soil chronosequence on the east coast of Tasmania that comprised a series of sub-parallel beach dunes and ridges varying in age from <100 years to 5500 years. Capillary electrophoresis-mass spectrometry was used to identify and quantify the main small organic N compounds in free, adsorbed and microbial fractions of the soil; while chiral liquid chromatography-mass spectrometry was used to quantify amino acid enantiomers in hydrolysed soil and the free, adsorbed and microbial fractions of soil. CE-MS detected 66 small (<250 Da) organic N compounds of which 63 could be positively identified. Small organic N was dominated by protein amino acids, while there were also large amounts of quaternary ammonium compounds and alkylamines. There were differences among chronosequence sites in the profile of small organic N, but these differences were not monotonically related to age and there was no evidence for a build-up of recalcitrant compounds over time. Differences were instead site-specific and related to presence/absence of particular non-protein amino acids, which probably related to the presence/absence of specific plants and/or microbes that produce and/or can metabolise different non-protein amino acids. In free solution and microbial biomass, D-enantiomers of many amino acids were below detection limits (i.e. <0.125 nmol g−1), and D-enantiomers were at low concentrations relative to L-enantiomers, such that across all ages and replicates the summed concentration of D-amino acids was 0.3–0.6% of L-amino acids. There was no evidence that absolute or relative concentrations of D-enantiomers in free solution, microbial biomass or hydrolysates were larger at the older chronosequence sites. The consistent lack of an effect of soil age on D/L probably indicates that the turnover of soil proteins is comparatively rapid and thus soil proteins are similarly young even among sites in which soil age is vastly different. © 2017 Elsevier Ltd


Jeffery C.J.,University of Sydney
Journal of Experimental Marine Biology and Ecology | Year: 2017

In this study the small honeycomb barnacle Chamaesipho tasmanica Foster and Anderson occupied the mid-intertidal area at Cape Banks, New South Wales, but varied greatly in abundance from low to upper shores within its range of distribution. Higher densities of these barnacles lived on lower shores, while larger barnacles lived on upper shores. Settlement (involving larval supply and larval choice), rather than post-settlement mortality, has already been found to determine abundance and distribution of these gregarious barnacles, while longevity governs the size of larger barnacles. In this study, it was hypothesised that reproductive output would also influence the demography of Chamaesipho. Three models were proposed, and appropriate hypotheses tested experimentally, to explain the relationships between the numbers and sizes of adults and their reproductive output, and their influence on future populations of barnacles. Results demonstrated that larger barnacles had a greater reproductive output than smaller barnacles. Reproductive output, however, was not density-dependent and did not determine ultimate populations of Chamaesipho at Cape Banks. Consequently, this study emphasised the importance of early settlement processes in determining the abundance and distribution of this barnacle. © 2016 Elsevier B.V.


Gilbert G.L.,University of Sydney
Public Health Research and Practice | Year: 2016

In March 2016, the World Health Organization declared the 2014-15 Ebola virus disease (EVD) outbreak officially over. With around 29 000 cases and 11 000 deaths in 27 months, this EVD outbreak was more than 60 times larger than any before, and unique in its cross-border spread and involvement of urban centres. Local and international responses were slow and initially inadequate, but establishment of the United Nations Mission for Ebola Emergency Response, 9 months after the outbreak began, allowed a coordinated effort that slowed and eventually controlled the spread of disease. Internationally, there were fears that EVD would spread widely beyond Africa, despite reassurances from public health authorities. However, after nurses in the US became infected, public fear and concern for the safety of healthcare workers led to political intervention and varied, sometimes excessive, border controls, quarantine arrangements and hospital preparations. Altogether, fewer than 30 EVD cases were managed in countries outside Africa, all but three of which were acquired in West Africa. In Australia, the Australian Health Protection Principal Committee led the internal response, including enhanced screening of incoming passengers at international airports and development of public health and laboratory testing protocols by expert subcommittees. States and territories nominated designated hospitals to care for EVD patients. Development of EVD infection prevention and control (IPC) guidelines was initially poorly coordinated within and between jurisdictions, often with significant discrepancies, causing confusion and fear among healthcare workers. The Infection Prevention and Control Expert Advisory Group was established to develop national IPC guidelines. There were no confirmed cases in Australia, but investigation of several people with suspected EVD provided valuable experience in use of protocols and high-level containment facilities. 
The Australian Government was initially reluctant to send aid workers to West Africa, but later contracted a private company to staff and manage a treatment centre in Sierra Leone, which treated 91 patients with EVD during 4 months of operation. Among the lessons learnt for Australia was the need to increase awareness of routine IPC practices in hospitals, where significant deficiencies were exposed, and to maintain a high enough level of preparedness to protect healthcare workers and the public from the next, inevitable, infectious disease emergency. © 2016 Gilbert.


Calvo R.A.,University of Sydney | Peters D.,University of Sydney
Conference on Human Factors in Computing Systems - Proceedings | Year: 2016

As the focus in HCI has moved from functionality to usability to the user experience, we have moved toward greater human-centredness. In its newest iteration, we are beginning to acknowledge the psychological impact that our pervasive technologies have on us. Rather than assuming negative impact is inevitable, as designers we are in a position to actively recruit digital experience to help us thrive. By turning to well-established methods in fields such as psychology, neuroscience, and economics, we can begin to design and develop new technologies to foster psychological wellbeing and human potential - an area of research and practice we have referred to as "positive computing" [1]. In this course we will explore approaches to evaluating and designing for wellbeing determinants like autonomy [3,5], competence [5], connectedness [5], meaning [4], and compassion [2], as a first step towards a future in which all digital experience supports flourishing. © 2016 Authors.


Coles P.J.,University of Waterloo | Berta M.,California Institute of Technology | Tomamichel M.,University of Sydney | Wehner S.,Technical University of Delft
Reviews of Modern Physics | Year: 2017

Heisenberg's uncertainty principle forms a fundamental element of quantum mechanics. Uncertainty relations in terms of entropies were initially proposed to deal with conceptual shortcomings in the original formulation of the uncertainty principle and, hence, play an important role in quantum foundations. More recently, entropic uncertainty relations have emerged as the central ingredient in the security analysis of almost all quantum cryptographic protocols, such as quantum key distribution and two-party quantum cryptography. This review surveys entropic uncertainty relations that capture Heisenberg's idea that the results of incompatible measurements are impossible to predict, covering both finite- and infinite-dimensional measurements. These ideas are then extended to incorporate quantum correlations between the observed object and its environment, allowing for a variety of recent, more general formulations of the uncertainty principle. Finally, various applications are discussed, ranging from entanglement witnessing to wave-particle duality to quantum cryptography. © 2017 American Physical Society.
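For concreteness, one representative relation from the literature this review surveys is the Maassen–Uffink bound: for measurements in two orthonormal bases $\{|x\rangle\}$ and $\{|z\rangle\}$ on a finite-dimensional system, the Shannon entropies of the outcome distributions obey

```latex
H(X) + H(Z) \;\ge\; \log_2 \frac{1}{c},
\qquad
c = \max_{x,z} \,\lvert \langle x \mid z \rangle \rvert^{2},
```

where the overlap $c$ quantifies the incompatibility of the two bases ($c = 1/d$ for mutually unbiased bases in dimension $d$, giving the strongest bound). The memory-assisted generalisations mentioned in the abstract extend this by conditioning the entropies on an observer's quantum side information.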


Bonnet X.,CNRS Chizé Center for Biological Studies | Naulleau G.,CNRS Chizé Center for Biological Studies | Shine R.,University of Sydney
American Naturalist | Year: 2017

The parchment-shelled eggs of squamate reptiles take up substantial water from the nest environment, enabling the conversion of yolk into neonatal tissue and buffering the embryo against the possibility of subsequent dry weather. During development, increasing amounts of water are stored in the embryonic sacs (i.e., membranes around the embryo: amnion, allantois, and chorion). The evolution of viviparity (prolonged uterine retention of developing embryos) means that embryonic-sac fluid storage now imposes a cost (increased maternal burdening), confers less benefit (because the mother buffers fetal water balance), and introduces a potential conflict among uterine siblings (for access to finite water supplies). Our data on nine species of squamate reptiles and published information on three species show that the embryonic-sac fluids comprise around 33% of neonatal mass in viviparous species versus 94% in full-term eggs of oviparous squamates. Data on parturition in 149 vipers (Vipera aspis, a viviparous species) show that larger offspring store more fluids in their fetal sacs and that an increase in litter size is associated with a decrease in fluid-sac mass per offspring. Overall, the evolutionary transition from oviparity to viviparity may have substantially altered selective forces on offspring packaging and created competition among offspring for access to water reserves during embryonic development. © 2017 by The University of Chicago.


Background: Chemotherapy in platinum-resistant ovarian cancer (PROC) aims for palliation and prolonging of progression-free survival (PFS). This study compares Health-related Quality of Life (HRQoL) and efficacy between single-agent chemotherapy and tamoxifen in PROC.
Methods: Patients with PROC were randomised (2:1) to chemotherapy (weekly paclitaxel 80 mg m−2 or four-weekly pegylated liposomal doxorubicin 40 mg m−2) or tamoxifen 40 mg daily. The primary end point was HRQoL. Secondary end points were PFS by RECIST and overall survival (OS).
Results: Between March 2002 and December 2007, 156 and 82 patients were randomised to chemotherapy and tamoxifen, respectively. In the chemotherapy arm, a significantly larger proportion of patients experienced a worsening in their social functioning. There was no difference in the proportion of patients experiencing improvement of gastrointestinal symptoms. Median PFS on tamoxifen was 8.3 weeks (95% CI, 8.0–10.4) compared with 12.7 weeks (95% CI, 9.0–16.3) on chemotherapy (HR, 1.54; 95% CI, 1.16–2.05; log-rank P=0.003). There was no difference in OS between the treatment arms.
Conclusions: Patients on chemotherapy had longer PFS but experienced more toxicity and poorer HRQoL compared with tamoxifen. Control over gastrointestinal symptoms was not better on chemotherapy. These data are important for patient counselling and highlight the need to incorporate HRQoL end points in studies of PROC.
British Journal of Cancer advance online publication 24 January 2017; doi:10.1038/bjc.2016.435 www.bjcancer.com. © 2017 Cancer Research UK


Jones G.R.D.,St Vincents Hospital | Jones G.R.D.,University of Sydney
Biochemia Medica | Year: 2017

There are many activities currently being undertaken in the field of laboratory medicine under the broad heading of "harmonization". These include traceability of results to international reference standards, processes to align results from assays where traceability has not been achieved (analytical harmonization) and international or national clinical guidelines based on studies from many parts of the world. Many of these issues are global in nature, with clinical evidence derived from studies performed in all parts of the world and multinational diagnostic companies providing assays worldwide. As with all aspects of medicine, progress can only be assured where there is evidence of the effectiveness of these activities. External Quality Assurance (EQA) programs are designed to meet this need. Currently, EQA processes have significant limitations in meeting the global needs of the laboratory medicine community. This paper aims to identify the steps that can be taken to allow current and future EQA programs to provide information on global variation in results. It is only by being aware of result differences that steps can be taken to improve performance. © Croatian Society of Medical Biochemistry and Laboratory Medicine.


Pham C.H.,University of Sydney
Journal of Constructional Steel Research | Year: 2017

Thin-walled structural channel members are commonly manufactured with cut-outs to allow access for building services such as plumbing, electrical and heating systems in the walls and ceilings. The presence of holes in the members changes the stress distribution and, consequently, the buckling characteristics and ultimate strength capacity. Recent work by Pham and Hancock has provided solutions to determine the shear buckling load using the Spline Finite Strip Method (SFSM) for whole thin-walled channel sections without holes. In this paper, the same methodology is utilised to study and provide solutions to elastic shear buckling, firstly for perforated square plates and subsequently for whole thin-walled lipped channel sections with centrally located holes. Both circular and square holes of comparable sizes were chosen for investigation; the main variables are the diameters of the circular holes and the sizes of the square holes. While only uniform pure shear is applied throughout the square plates, three different cases of shear loading in the channel are considered to maintain longitudinal equilibrium. The method is also benchmarked against the Finite Element Method (FEM) using the software package ABAQUS/Standard in all cases. Comparisons between hole shapes, loading cases and buckling modes of both square plates and channels are included. For design purposes, approximate equations for the shear buckling coefficients of square plates and channel sections containing central circular and square holes are also proposed, and a design example is provided. © 2016 Elsevier Ltd


Dougherty K.,University of Sydney
Proceedings of the International Astronautical Congress, IAC | Year: 2016

The establishment of the Woomera Rocket Range in 1947 rapidly made clear to the Australian Department of Supply (the local partner in the Range development in conjunction with the British Ministry of Supply) its lack of capability in the scientific and technical fields that were required to support the missile and other weapons research which would be carried out at the Range. Consequently, in 1949, the Australian Government established the Australian Defence Scientific Service (ADSS), in order to consolidate and expand the nation's defence-related research and development efforts. This new agency incorporated the Long Range Weapons Establishment, which managed the Woomera Range, and the Defence Research Laboratories, which had been established prior to, and just after, the Second World War. These research facilities were later combined to form the Weapons Research Establishment (WRE), the major division of the ADSS within the Department of Supply. Although space activities were still considered science fiction when the ADSS was formed, over the following three decades the Service would carry out research and innovation that contributed not only to Australia's modest space activities between 1957 and 1979, but also to the missile and space projects of the United Kingdom and United States. This paper will present examples of the research and innovation carried out under the auspices of the ADSS that either contributed to Australia's early space activities, and those of its allies, or could have formed the basis of a more extensive national space effort, had the Australian Government decided to establish such a program. Copyright © 2016 by the International Astronautical Federation (IAF). All rights reserved.


Cannabis use increases rates of psychotic relapse and treatment failure in schizophrenia patients. Clinical studies suggest that cannabis use reduces the efficacy of antipsychotic drugs, but there has been no direct demonstration of this in a controlled study. The present study demonstrates that exposure to the principal phytocannabinoid, Δ9-tetrahydrocannabinol (THC), reverses the neurobehavioral effects of the antipsychotic drug risperidone in mice. THC exposure did not influence D2 and 5-HT2A receptor binding, the major targets of antipsychotic action, but it lowered the brain concentrations of risperidone and its active metabolite, 9-hydroxy risperidone. As risperidone and its active metabolite are excellent substrates of the ABC transporter P-glycoprotein (P-gp), we hypothesized that THC might increase P-gp expression at the blood–brain barrier (BBB) and thus enhance efflux of risperidone and its metabolite from brain tissue. We confirmed that the brain disposition of risperidone and 9-hydroxy risperidone is strongly influenced by P-gp, as P-gp knockout mice displayed greater brain concentrations of these drugs than wild-type mice. Furthermore, we demonstrated that THC exposure increased P-gp expression in various brain regions important to risperidone's antipsychotic action. We then showed that THC exposure did not influence the neurobehavioral effects of clozapine. Clozapine shares a very similar antipsychotic mode of action to risperidone, but unlike risperidone is not a P-gp substrate. Our results imply that clozapine or non-P-gp substrate antipsychotic drugs may be better first-line treatments for schizophrenia patients with a history of cannabis use.
Neuropsychopharmacology advance online publication, 29 March 2017; doi:10.1038/npp.2017.50. © 2017 American College of Neuropsychopharmacology


Eggleton B.J.,University of Sydney
Optics InfoBase Conference Papers | Year: 2016

On-chip stimulated Brillouin scattering (SBS) is the focus of current research because of its potential for integrating a variety of important photonic functionalities. Here, we demonstrate record-high chip-based SBS with over 50 dB net gain, which allows for advanced microwave photonic signal processing applications. © OSA 2016.


Eggleton B., University of Sydney
Optics InfoBase Conference Papers | Year: 2016

My talk will review our progress and achievements in developing circuits that harness interactions between optical waves and hypersonic phonons towards a new class of silicon-based optical phononic processor that is CMOS-compatible. © OSA 2016.


Webster R., University of Sydney | Castellano J.M., National Health Research Institute | Castellano J.M., Monteprincipe University Hospital | Onuma O.K., World Health Organization
The Lancet | Year: 2017

Regulatory approvals for cardiovascular polypills are increasing rapidly across more than 30 countries. The evidence clearly shows that polypills improve adherence and cardiovascular disease risk factor control for patients with indications for use of polypill components—ie, those with established cardiovascular disease or at high risk. However, the implementation of polypills into clinical practice has many challenges. The clinical trials literature provides insights into the clinical impact of a polypill strategy, including cost-effectiveness, safety of use, substantial improvement in adherence, and better risk factor control than usual care. Despite the clear need for such a strategy and the available clinical data backing the use of the polypill in different patient populations, challenges to widespread implementation, such as an absence of government reimbursement and poor physician uptake (identified from on-the-ground experience in countries following commercial rollout), have greatly obstructed real-world implementation. Obtaining the full public health benefit of polypills will require education, advocacy, endorsement, and implementation by key global agencies such as WHO and national clinical bodies, as well as endorsement from governments. © 2017 Elsevier Ltd


News Article | April 26, 2017
Site: scienceblogs.com

Thank you to Dr. Barb Goodman (Director of the SD Biomedical Research Infrastructure Network, Fellow of the American Physiological Society, Sanford School of Medicine of the University of South Dakota), who sent me information about thirsty koalas. Koalas typically hydrate themselves from the leaves of eucalyptus trees. But recently researchers at the University of Sydney have noticed the animals drinking water as eucalyptus trees have succumbed to wildfires and climate change. Koalas have found a friend in Robert Frend, a farmer in New South Wales and creator of the “Blinky Drinker”, a water station for koalas.


News Article | April 13, 2017
Site: www.chromatographytechniques.com

A wild-born, pure Australian desert dingo called Sandy Maliki has taken out first place in the World's Most Interesting Genome competition. The UNSW-led proposal to have Sandy's DNA decoded was one of five finalists for the Pacific Biosciences SMRT Grant, which provides cutting-edge sequencing of the complete genome of a particularly fascinating plant or animal. The public determined the winner, with 2-year-old Sandy securing 41 percent of the international community's votes, closely followed by a Temple Pitviper snake, then a solar-powered sea slug, an explosive bombardier beetle, and a pink pigeon. "We are thrilled that our bid to have Sandy's DNA sequenced captured the public's imagination," says project leader Professor Bill Ballard of the UNSW School of Biotechnology and Biomolecular Sciences. "Sandy is truly a gift to science. As a rare, wild-born pure dingo, she provides a unique case study. Pure dingoes are intermediate between wild wolves and domestic dogs, with a range of non-domesticated traits. So sequencing Sandy's genome will help pinpoint some of the genes for temperament and behaviour that underlie the transition from wild animals to perfect pets. "As well, learning more about dingo genetics will help efforts to conserve these wonderful Australian animals, through the development of improved tests for dingo purity," Professor Ballard says. Sandy and her sister and brother were discovered as 3-week-old pups in the Australian desert near the Strzelecki Track in 2014 by NSW animal lovers, Barry and Lyn Eggleton, who have hand-reared them ever since. The pups were close to death and their parents could not be found. The dingo sequencing project will be the first to test Charles Darwin's 1868 theory that the process of domestication can be divided into two steps: unconscious selection as a result of non-intentional human influences; and artificial selection as a result of breeding by humans for desired traits.
"This project will reveal the DNA changes between wolves and dingoes (unconscious selection) and dingoes and dogs (artificial selection)," says Ballard. A key aim of the annual international PacBio competition, which attracted more than 200 entries this year, is to raise public awareness of science and how genomic research can benefit society. Sandy's team, which set up a DancingwithDingoes Facebook page, enlisted the support of a wide variety of people around the world, including animal conservationists and fans of wolves, dingoes and dogs. "We also engaged with staff and students at UNSW by bringing two pure alpine dingoes from the Bargo Dingo Sanctuary onto campus for everyone to meet," says Ballard. The cutting-edge PacBio technology allows DNA to be sequenced in long sections containing tens of thousands of bases, rather than in shorter sections of a few hundred bases, as with existing techniques. This can reveal important rearrangements in the genome that affect gene expression. The sequencing will be carried out at the University of Arizona, with initial analysis by Computomics in Germany. The Australian team behind the Sandy project also includes Claire Wade of the University of Sydney, Richard Melvin of UNSW, Robert Zammit of the Vineyard Veterinary Hospital and Andre Minoche of the Garvan Institute of Medical Research. UNSW has a strong reputation in genomics research, with scientists at the university's Ramaciotti Centre for Genomics having worked on the genomes of a variety of other important native creatures, including the koala, the Tasmanian devil, the wombat, the platypus, the Queensland fruit fly and the Wollemi Pine. "We're very proud of UNSW's history of contribution to genomics and we are delighted that Sandy's genome will now be sequenced as the prize for winning this competition," says UNSW molecular biologist and Deputy Vice-Chancellor (Education) Professor Merlin Crossley.
"Australia has so many interesting animals to sequence, and the results enhance our understanding of evolution and biology and help improve agriculture and pest management." Dingoes were introduced to Australia about 5,000 years ago. It is widely accepted that they were not domesticated by Indigenous Australians. Pure dingoes are becoming increasingly rare as the native animals interbreed with wild dogs and domestic dogs, and are targeted as pests by landowners.
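The read-length advantage described in the article can be sketched numerically. This is an illustrative toy model only, not PacBio's actual variant-calling method; the read lengths and the 5 kb rearrangement size below are hypothetical:

```python
# Toy model: a single sequencing read can directly reveal a genomic
# rearrangement only if it spans the whole rearranged segment plus some
# anchoring sequence on both sides mapping to the reference.

def read_spans_variant(read_len: int, variant_len: int, anchor: int = 100) -> bool:
    """True if one read of read_len bases can bridge a variant of
    variant_len bases with `anchor` flanking bases on each side."""
    return read_len >= variant_len + 2 * anchor

# Short reads of a few hundred bases vs long reads of tens of thousands,
# against a hypothetical 5 kb rearrangement:
short_read, long_read = 300, 30_000
inversion = 5_000

print(read_spans_variant(short_read, inversion))  # -> False
print(read_spans_variant(long_read, inversion))   # -> True
```

Under these assumptions, only the long read contains the rearrangement end-to-end; short reads would have to infer it indirectly from paired-end mapping patterns.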


News Article | April 17, 2017
Site: www.newscientist.com

How can a predator catch and consume prey bigger than itself? That’s one of the marvels and mysteries of the appropriately named kingsnake. Kingsnakes can kill and consume rat snakes at least 20 per cent larger than themselves. Now we may finally know how they manage to ensnare their quarry in the first place. Imagine trying to fit a large garden hose inside a small one. The kingsnake faces a similar challenge in engulfing a bigger tube-shaped animal. We know that they achieve this with flexible jaws and by crushing prey inside S-bends, like squeezing spaghetti through a pasta machine. But a lingering mystery was how these smaller snakes have the power to subdue and handle those bigger than themselves – an extra-challenging feat because unlike mammals that suffocate and become unconscious quickly, snakes survive anoxia for much longer and can thus put up a fight. Perplexed by this puzzle, David Penning at Missouri Southern State University, Joplin, and Brad Moon at the University of Louisiana at Lafayette used three experiments to investigate. First, they examined 36 preserved specimens of three kingsnake and three rat snake species to determine how much muscle they had relative to their body size. They also measured how much force snakes could exert while trying to pull away, to assess escape performance for 98 snakes restrained with a harness. Surprisingly, they found that kingsnakes did not have a greater proportion of muscle for their size in comparison with rat snakes, nor did they exert a proportionally greater pulling force. All bigger individuals had more muscle and pulling force, regardless of species. In a third experiment, the researchers shook dead mice in front of 182 snakes to stimulate them to engage in a struggle with their prey. Sensors attached to the mice allowed measurement of how much constricting pressure kingsnakes exerted compared with rat snakes.
What the prey-squeezing observations revealed were very different levels of constriction pressure – with all kingsnakes producing higher pressures than the rat snakes. The most powerful kingsnake species studied, the California kingsnake (Lampropeltis californiae), exerted more than double the constricting pressure of the weakest of the three rat snake species, the western rat snake (Pantherophis obsoletus). What seemed to be key to creating their power was the way they positioned the coils in their bodies. “Almost all the rat snakes had this really variable, haphazard application of their body, whereas all the kingsnakes were in this elegant, spring-like pattern,” says Penning. The king, it seems, is well-sprung. Its posture appears to make the kingsnake’s constriction method more efficient and powerful, although that hypothesis still needs further testing. “It’s fascinating to see such a similarity in muscle size and strength translate into such a major difference in constriction force,” says Rick Shine at the University of Sydney in Australia. “Clearly, evolution has fine-tuned superficially similar systems to produce very different performance outcomes. It’s a fascinating glimpse into the world of snake-eat-snake.”


News Article | March 10, 2017
Site: www.techtimes.com

There is no doubt that there is still debate on the medical merits of cannabis, but the varying information does not stop many people from using cannabis to aid their conditions. A recent survey in Australia shows that people with epilepsy are included in this population. In the first nationwide survey of cannabis use among epilepsy patients in Australia, researchers found that 14 percent of people with epilepsy have used cannabis products to manage their condition when current epilepsy medicines do not work for them or cause intolerable side effects. Significant success was reported among 90 percent of adult users and 71 percent of child users. The study, a partnership between The Lambert Initiative and the University of Sydney, was published on March 9 in the journal Epilepsy & Behavior. Researchers surveyed 976 respondents and examined cannabis use, reasons for the usage, and self-reported perception of the benefits. Among the respondents, the main reasons for trying cannabis were to manage treatment-resistant epilepsy and to try a method of medication with a better side-effect profile than standard antiepileptic drugs. "Despite the limitations of a retrospective online survey, we cannot ignore that a significant proportion of adults and children with epilepsy are using cannabis-based products in Australia, and many are self-reporting considerable benefits to their condition," said Anastasia Suraev of The Lambert Initiative and lead author of the study. With these results, the researchers believe that proper education and safe access should be provided to people with epilepsy to ensure safe usage and reduce reliance on the illegal black market. This case is not the first to show people resorting to cannabis for treatment when more conventional drugs do not work for them.
In January alone, a mother shared her and her son's journey as they tried different strains of cannabis to help him cope with his Autism Spectrum Disorder and a painful gut disease that had him throwing violent rages. "It seemed like a miracle," said the mother of the treatment after seeing the improvement in her son over the seven years that they've been using it. Pet owners have also been trying cannabis to help their ailing pets suffering from cancer or arthritis. In this regard, veterinarians warn pet owners to be cautious in giving their pets cannabis, as dogs are more sensitive to the components of cannabis and react differently than humans. Recently, a comprehensive 400-page analysis of the medicinal value of cannabis was released, showcasing both the pros and cons of the substance. Amid this confirmation of cannabis' pain-relieving properties, caution should still be taken in ingesting cannabis and in obtaining it from proper sources. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


Nanodiamonds - synthetic industrial diamonds only a few nanometers in size - have recently attracted considerable attention because of the potential they offer for the targeted delivery of vaccines and cancer drugs and for other uses. Thus far, options for imaging nanodiamonds have been limited. Now a team of investigators based at the Athinoula A. Martinos Center for Biomedical Imaging at Massachusetts General Hospital has devised a means of tracking nanodiamonds noninvasively with magnetic resonance imaging (MRI), opening up a host of new applications. They report their findings today in the online journal Nature Communications. "With this study, we showed we could produce biomedically relevant MR images using nanodiamonds as the source of contrast in the images and that we could switch the contrast on and off at will," says David Waddington, lead author of the paper and a PhD student at the University of Sydney in Australia. Waddington is currently working with Matthew Rosen, PhD, in the Low-Field Imaging Laboratory at the Martinos Center. "With competing strategies, the nanodiamonds must be prepared externally and then injected into the body, where they can only be imaged for a few hours at most. However, as our technique is biocompatible, we can continue imaging for indefinite periods of time. This raises the possibility of tracking the delivery of nanodiamond-drug compounds for a variety of diseases and providing vital information on the efficacy of different treatment options." Waddington began this work three years ago as part of a Fulbright Scholarship awarded early in his graduate work at the University of Sydney, where he is a member of a team led by study co-author David Reilly, PhD, in the new Sydney Nanoscience Hub - the headquarters of the Australian Institute for Nanoscale Science and Technology, which launched last year. 
As part of the Reilly group, Waddington played a crucial role in early successes with nanodiamond imaging, including a 2015 paper in Nature Communications. He then sought to extend the potential of the approach by collaborating with Rosen at the Martinos Center and Ronald Walsworth, PhD, at Harvard University, also a co-author of the current study. Rosen's group is a world leader in the area of ultra-low-field magnetic resonance imaging, a technique that proved essential to the development of in vivo nanodiamond imaging. Previously, the use of nanodiamond imaging in living systems was limited to regions accessible using optical fluorescence techniques. However, most potential diagnostic and therapeutic applications of nanoparticles, including tracking of complex disease processes like cancer, call for the use of MRI - the gold standard for noninvasive, high-contrast, three-dimensional clinical imaging. In the present study, the researchers show that they could achieve nanodiamond-enhanced MRI by taking advantage of a phenomenon known as the Overhauser effect to boost the inherently weak magnetic resonance signal of diamond through a process called hyperpolarization, in which nuclei are aligned inside a diamond so they create a signal detectable by an MRI scanner. The conventional approach to hyperpolarization uses solid-state physics techniques at cryogenic temperatures, but the signal boost doesn't last very long and is nearly gone by the time the nanoparticle compound is injected into the body. By combining the Overhauser effect with advances in ultra-low-field MRI coming out of the Martinos Center, the researchers were able to overcome this limitation - thus paving the way for high-contrast in vivo nanodiamond imaging over indefinitely long periods of time. High-performance ultra-low-field MRI is itself a relatively new technology, first reported in Scientific Reports in 2015 by Rosen and Martinos Center colleagues. 
"Thanks to innovative engineering, acquisition strategies and signal processing, the technology offers heretofore unattainable speed and resolution in the ultra-low-field MRI regime," says Rosen, director of the Low-Field Imaging Laboratory, an assistant professor of Radiology at Harvard Medical School and the senior author of the current paper. "And importantly, by removing the need for massive, cryogen-cooled superconducting magnets, it opens up a number of new opportunities, including the nanodiamond imaging technique we've just described." The researchers have noted several possible applications for their new approach to nanodiamond-enhanced MRI. These include the accurate detection of lymph node tumors, which can aid in the treatment of metastatic prostate cancer, and exploring the permeability of the blood-brain barrier, which can play an important role in the management of ischemic stroke. Because it provides a measurable MR signal for periods of over a month, the technique could benefit applications such as monitoring the response to therapy. Included in treatment monitoring are applications in the burgeoning field of personalized medicine. "The delivery of highly specific drugs is strongly correlated with successful patient outcomes," says Waddington, who was honored with the Journal of Magnetic Resonance Young Scientist Award at the 2016 Experimental NMR Conference in recognition of this work. "However, the response to such drugs often varies significantly on an individual basis. The ability to image and track the delivery of these nanodiamond-drug compounds would, therefore, be greatly advantageous to the development of personalized treatments." The researchers continue to explore the potential of the technique and are now planning a detailed study of the approach in an animal model, while also investigating the behavior of different nanodiamond-drug complexes and imaging them with the new capability. 
Other authors of the Nature Communications paper include Mathieu Sarracanie and Najat Salameh of the Martinos Center; Huiliang Zhang and David R. Glenn of the Walsworth team at Harvard University; and Ewa Rej, Torsten Gaebel, and Thomas Boele of the Reilly team at the ARC Centre of Excellence for Engineered Quantum Systems, University of Sydney. Support for the study includes funding from the U.S. Department of Defense/USAMRMC, the Australian Nuclear Science and Technology Organisation and the Australian-American Fulbright Commission. Massachusetts General Hospital, founded in 1811, is the original and largest teaching hospital of Harvard Medical School. The MGH Research Institute conducts the largest hospital-based research program in the nation, with an annual research budget of more than $800 million and major research centers in HIV/AIDS, cardiovascular research, cancer, computational and integrative biology, cutaneous biology, genomic medicine, medical imaging, neurodegenerative disorders, regenerative medicine, reproductive biology, systems biology, photomedicine and transplantation biology. The MGH topped the 2015 Nature Index list of health care organizations publishing in leading scientific journals and earned the prestigious 2015 Foster G. McGaw Prize for Excellence in Community Service. In August 2016 the MGH was once again named to the Honor Roll in the U.S. News & World Report list of "America's Best Hospitals."


News Article | April 17, 2017
Site: www.newscientist.com

If you’re the kind of person who relishes adventure, you may literally see the world differently. People who are open to new experiences can take in more visual information than other people and combine it in unique ways. This may explain why they tend to be particularly creative. Openness to experience is one of the “big five” traits often used to describe personality. It is characterised by curiosity, creativity and an interest in exploring new things. Open people tend to do well at tasks that test our ability to come up with creative ideas, such as imagining new uses for everyday objects like bricks, mugs or table tennis balls. There’s some evidence that people with a greater degree of openness also have better visual awareness. For example, when focusing on letters moving on a screen, they are more likely to notice a grey square appearing elsewhere on the display. Now Anna Antinori at the University of Melbourne in Australia and her team are showing that people who score more highly when it comes to the openness trait “see” more possibilities. “They seem to have a more flexible gate for the visual information that breaks through into their consciousness,” Antinori says. Antinori and her colleagues asked 123 university students to complete a binocular rivalry test, in which they simultaneously saw a red image with one eye and a green image with the other eye for 2 minutes. Usually, the brain can only perceive one image at a time, and most participants reported seeing the image flip between red and green. But some subjects saw the two images fused into a patchwork of red and green – a phenomenon known as “mixed percept”. The higher the participants scored for openness on a personality questionnaire, the more they experienced this mixed perception. “When you present open people with the binocular rivalry dilemma, their brains are able to flexibly engage with less conventional solutions,” Antinori says. 
“We believe this is the first empirical evidence that they have different visual experiences to the average individual.” In contrast, the other four major personality traits – extroversion, neuroticism, agreeableness and conscientiousness – weren’t significantly linked to experiencing this mixed perception. The results could explain why people with a high degree of openness tend to be more creative and innovative, Antinori says. “When they come up with all these crazy new uses for bricks, it might be because they really perceive the world differently,” she says. The findings also hint at why extremely open people are more prone to paranoia and delusions, says Niko Tiliopoulos at the University of Sydney, Australia. “At those levels of openness, people may actually see reality differently,” he says. “For example, they may ‘see’ spirits, or misinterpret interpersonal or other signals.” According to Antinori, there are similarities between high levels of openness and the experience of taking magic mushrooms. Previous work by her team has found that psilocybin – a hallucinogenic compound in magic mushrooms – increases a person’s openness scores in a personality questionnaire, and their experience of mixed percept in binocular rivalry tests. The team has also found that some forms of meditation can increase mixed image perception in binocular rivalry tests. Antinori next wants to see if similar neural processes are involved in mixed perception, creative thinking and the shifts in visual perception caused by psilocybin and meditation. “It seems that openness alters the filter of consciousness, and we’d like to know how,” she says.


News Article | April 17, 2017
Site: www.eurekalert.org

The BioScience Talks podcast features discussions of topical issues related to the biological sciences. On landscapes around the world, environmental change is bringing people and large carnivores together--but the union is not without its problems. Human-wildlife conflict is on the rise as development continues unabated and apex predators begin to reoccupy their former ranges. Further complicating matters, many of these species are now reliant on human-provided foods, such as livestock and trash. For this episode of BioScience Talks, we're joined by Dr. Thomas Newsome of Deakin University and the University of Sydney. Writing in BioScience, Newsome and his colleagues use gray wolves and other large predators as case studies to explore the effects of human-provided foods. They find numerous instances of species' changing their social structures, movements, and behavior when these resources are available. Perhaps most concerning, they've found that human-fed populations often form distinct genetic subgroups, which could lead to future speciation events. To hear the whole discussion, visit this link for this latest episode of the BioScience Talks podcast.


News Article | April 17, 2017
Site: www.newscientist.com

No jab, no play. So says the Australian Prime Minister, Malcolm Turnbull, who has announced a proposal to bar unvaccinated children from attending preschools and daycare centres. Currently, 93 per cent of Australian children receive the standard childhood vaccinations, including those for measles, mumps and rubella, but the government wants to lift this to 95 per cent. This is the level required to stop the spread of infectious disease and to protect children who are too young to be immunised or cannot be vaccinated for medical reasons. Federal childcare subsidies have been unavailable to the families of unvaccinated children since January 2016, and a version of the new “no jab, no play” policy is already in place in Victoria, New South Wales and Queensland. Other states and territories only exclude unvaccinated children from preschools during infectious disease outbreaks. The proposed policy is based on Victoria’s model, which is the strictest. It requires all children attending childcare to be fully immunised, unless they have a medical exemption, such as a vaccine allergy. Nesha Hutchinson from the Australian Childcare Alliance – an advocacy group for childhood education – says that a nationwide “no jab, no play” policy would be likely to raise immunisation rates. However, she is concerned that children of parents who object to vaccination would miss out on quality early childhood education. The policy may also affect children from disadvantaged families, who are less likely to be immunised, and risk becoming further marginalised if they lose access to education. Punitive measures may also galvanise the anti-vaccination movement, warns Julie Leask at the University of Sydney. “People without any previous interest in vaccination may defend anti-vaccination activists and join their cause because they are concerned about the threat to civil liberties,” she says. 
Leask prefers the New South Wales model, which makes it procedurally complex but not impossible to send unvaccinated kids to childcare, and also ensures that children’s immunisation records are checked. This policy has increased child immunisation rates by the same amount as the harsher approach in Victoria, she says. Leask also believes that campaigns and reminders are good ways to improve vaccination rates without inciting opposition.


News Article | May 5, 2017
Site: www.newscientist.com

It is pest control without poison. A new type of bait that stops rats from having babies is helping to tackle infestations in several US cities. The bait – known as ContraPest – was approved by the US Environmental Protection Agency last August. It makes rats infertile by triggering early menopause in females and impairing sperm production in males. There are no side effects and the rats eventually die of natural causes. The technique is considered more benign than other control strategies being investigated, such as gene drive, which can be used to spread infertility genes through pest populations. A recent report by the US National Academies of Sciences warned that gene drive could have unforeseen consequences. The first field trial of ContraPest, conducted in the New York City Subway in 2013, halved the resident rat population in three months. Two more trials have now been completed in the US – one at a large-scale farm and one in an urban area – both in East Coast cities. Rat numbers at the farm fell by one-third over three months. In the urban area, population growth was suppressed during the peak breeding season so that the population expanded at only one-third the expected rate. “You’ll never wipe out rats completely – they’re too smart,” says Brandy Pyzyna from SenesTech, the biotechnology company in Arizona that developed the bait. “But if you think about it, one breeding pair of rats can produce 15,000 pups in a year,” she says. “Even if you can reduce that by a third in a few months, you’re already talking 5000 fewer rats, and the population will continue to go down.” ContraPest is more humane and effective than rat poison, says Pyzyna, who presented the latest results at the Australasian Vertebrate Pest Conference in Canberra, Australia, this week. The problem with killing rats is that others simply move in and take their place, she says. 
Fertility control, on the other hand, maintains a small population of existing rats that guard their territory from newcomers. The active ingredients – triptolide and 4-vinylcyclohexene diepoxide – can cause infertility in other animals, but not at the small doses used in the bait. The flavoured liquid is kept inside bait stations that are only accessible to rats. Once ingested, the chemicals are broken down by the rats’ metabolism, preventing them from contaminating predators or the wider environment. Peter Banks at the University of Sydney says the approach looks promising, but needs more long-term research. Rats that don’t take the bait may end up having bigger, healthier litters because there is less competition for food, he says. “It’s really, really hard to eradicate pests,” he says. Pyzyna and her colleagues are continuing to research the effects of ContraPest in rat populations, while also adapting it to other pest species. They are currently working on reformulating the bait to target mice and feral pigs, but they also have their sights set on feral deer, dogs and cats. Read more: Is it right to kill millions of animals if it protects others?
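The back-of-the-envelope arithmetic quoted above can be made explicit. The sketch below is illustrative only: the geometric-growth model, the 30%-per-month rate and the starting population are assumptions for demonstration, not SenesTech's data. It contrasts an unchecked population with one whose growth rate is cut to a third, as reported for the urban trial, and reproduces the "5000 fewer rats" figure.

```python
# Illustrative only: a simple geometric-growth comparison based on the
# figures quoted above. The growth model, rate and time step are assumptions.

def project(start, monthly_rate, months):
    """Project a population forward under a fixed monthly growth rate."""
    pop = start
    for _ in range(months):
        pop = pop * (1 + monthly_rate)
    return pop

start = 1000
months = 3

unchecked = project(start, 0.30, months)       # assumed 30%/month growth
suppressed = project(start, 0.30 / 3, months)  # growth cut to one-third

# The quoted claim: one pair -> ~15,000 pups/year; reducing that by a
# third means roughly 5,000 fewer rats per breeding pair per year.
pups_per_pair = 15_000
fewer = pups_per_pair // 3
```

The exact numbers are not the point; the comparison shows how even a partial cut in the growth rate compounds over successive breeding cycles.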


News Article | May 8, 2017
Site: www.eurekalert.org

Scientists express concerns about the effect of energy drinks on individuals, particularly teens, with familial long QT syndrome in a new study published in the International Journal of Cardiology. Amsterdam, The Netherlands, May 8, 2017 - Caffeinated energy drinks can trigger serious cardiac events, including cardiac arrest, in individuals not known to have a specific heart disease of genetic origin. Scientists in Australia have now assessed the risk of cardiac events following consumption of energy drinks in patients diagnosed with congenital long QT syndrome (LQTS), a condition that affects 1 in 2000 people and can cause a rapid, irregular heartbeat that may lead to sudden death. In their study, published in the International Journal of Cardiology, they report that even small amounts of energy drinks can cause changes in the heart that can lead to life-threatening arrhythmias, and they recommend cautioning young patients, some of whom may still be unaware of an existing heart condition, about the danger. Consumption of "energy drinks", now used by millions, has exploded in the past 15 years; the most popular are Red Bull® and Monster®. The hemodynamic effects of energy drinks in healthy young adults have been assessed in prior studies, with results including increased blood pressure but no change in heart rate. This is the first study specifically designed to test the effects of these energy drinks in individuals who carry the gene faults (mutations) that cause congenital LQTS. "The potential cardiovascular risk of energy drinks continues to emerge as an important public health issue," explained lead investigator Professor Christopher Semsarian, MBBS, PhD, MPH, of the University of Sydney and Centenary Institute, Australia. "The population most at risk is teenagers and young adults, representing the population these drinks are most heavily marketed towards. 
Since energy drinks are widely available to all ages and over the counter, it is important that the cardiovascular effects of these drinks are investigated." The study was designed to assess the acute cardiovascular responses to energy drink consumption in patients with familial LQTS and to discover whether any identified cardiovascular effects correlate with changes in blood levels of the active ingredients - caffeine and taurine. Investigators recruited 24 patients aged 16 to 50. More than half were symptomatic before diagnosis and receiving beta-blocker therapy. Most had undergone genetic testing, 13 of whom had a documented pathogenic or likely pathogenic mutation. Participants were assigned to energy drink or control drink groups for the first study visit. The energy drink consisted of two sugar-free Red Bull cans (500 ml in total), containing 160 mg of caffeine and 2000 mg of taurine. The control drink was a cordial-based 500 ml drink with no caffeine or taurine. Electrocardiograms and blood pressure were recorded every 10 minutes, while signal-averaged electrocardiogram (SAECG) testing and repeat blood samples were collected every 30 minutes, for a total observation time of 90 minutes. The results of the study show that three patients (12.5%) exhibited dangerous QT prolongation following energy drink consumption, and two of the three had sharp increases in blood pressure. These patients all had a documented family history of sudden cardiac death, and two of them had previously experienced severe clinical manifestations and received an implantable cardioverter-defibrillator for recurrent syncope. "Some individual patients may be at a higher risk," commented Professor Semsarian. "We therefore suggest caution in allowing the consumption of energy drinks in young patients with LQTS." In an accompanying commentary, Professor Peter J. 
Schwartz, MD, Head of the Center for Cardiac Arrhythmias of Genetic Origin, IRCCS Istituto Auxologico Italiano, Milan, Italy commented, "Data suggest that the majority of LQTS patients destined to become symptomatic have the first event well after having become a teenager, which implies that a significant number of youngsters with LQTS will help themselves to energy drinks without knowing their real condition and thus endangering themselves." "When something, in this case energy drinks, is ingested by millions of individuals all over the world, a percentage such as 12.5% is no longer small, and the findings deserve careful consideration," added commentary co-author Federica Dagradi, MD, of the Center for Cardiac Arrhythmias of Genetic Origin, IRCCS Istituto Auxologico Italiano. "We should avoid spreading unjustified alarms and fears, but at the same time, we should not ignore potential dangers."


News Article | May 2, 2017
Site: www.prweb.com

Leading authority in dental implant placement, Dr. Jin Y. Kim, is currently accepting new patients without a referral for All-on-4® full-arch tooth replacements. Known for his periodontal expertise around the globe, Dr. Kim focuses on helping people who are missing teeth in Anaheim and surrounding areas to enjoy the aesthetic, functional and oral health benefits of implant-supported dentures. Since the procedure can be performed in one day, patients leave the office with a beautiful, functional smile. Losing one or more teeth can be a traumatic experience, and the effects do not stop with a gap in the smile and new challenges when chewing. When the upper and lower teeth meet, the pressure travels through the roots to the jaw bone and stimulates healthy bone density growth. Without that process, the bone begins to degrade and crumble, changing the facial structure and creating serious oral health problems. Traditional dentures restore the smile and assist with eating, but they cannot replicate that essential oral health function of the tooth roots. Dental implants resolve the issue with a surgically placed post that secures the tooth to the jaw bone while also allowing chewing motions to perform the bone stimulation function. Not only that, the firm anchor permits people to eat crunchy, chewy or sticky foods without fear of displacement that is common to removable dentures. If people have been missing teeth in Anaheim, CA, for some time, the amount of bone available to fuse to an implant post could be diminished significantly, and may not be solid enough to provide the necessary support. Rather than performing bone grafts to create a foundation for replacement teeth and restore jaw structure, Dr. Kim may often recommend All-on-4 dental implant solutions. This full-arch option requires just four posts, which are strategically placed to maximize healthy bone growth. 
The custom-made dental prosthetic affixes to these posts firmly and looks and functions like natural teeth. Considering the affordability, minimal recovery time and high success rate of the All-on-4 technique, many patients with missing teeth in Anaheim, CA, find it an ideal solution to end self-consciousness and declining oral health. To learn more about the benefits of implant-supported dentures or to schedule an appointment, people can call Dr. Kim’s Diamond Bar practice location at 909-860-9222, or his West Garden Grove office at 714-898-8757, or visit his website at http://www.drjinkim.com. Dr. Jin Y. Kim is a periodontist dedicated to providing personalized dental care in Diamond Bar and Garden Grove, CA. Dr. Kim attended the University of Sydney Faculty of Dentistry before furthering his education with an advanced degree in pathology from the Medical School of the same University. Dr. Kim completed a periodontics and implant surgery residency at UCLA School of Dentistry. A uniquely dual board-certified specialist, Dr. Kim is certified by the American Board of Periodontology and the American Board of Oral Implantology/Implant Dentistry. The International Congress of Oral Implantologists and the American Academy of Implant Dentistry both gave him the title of Fellow. He was also inducted as a Fellow of the prestigious American College of Dentists. Dr. Kim enjoys lecturing at UCLA School of Dentistry as well as national and international academic and clinical associations and universities including the International Association of Dental Research, American Academy of Periodontology and Academy of Osseointegration. To learn more about Dr. Jin Kim and the services he offers, visit his website at http://www.drjinkim.com or call 909-860-9222 for the Diamond Bar location or 714-898-8757 for the West Garden Grove location to schedule an appointment.


News Article | April 28, 2017
Site: www.eurekalert.org

Australian and German researchers have collaborated to develop a genetic algorithm to confirm the rejection of classical notions of causality. Dr Alberto Peruzzo from RMIT University in Melbourne said: "Bell's theorem excludes classical concepts of causality and is now a cornerstone of modern physics. "But despite the fundamental importance of this theorem, only recently was the first 'loophole-free' experiment reported which convincingly verified that we must reject classical notions of causality. "Given the importance of this data, an international collaboration between Australian and German institutions has developed a new method of analysis to robustly quantify such conclusions." The team's approach was to use genetic programming, a powerful machine learning technique, to automatically find the closest classical explanations of the experimental data, allowing them to map out many dimensions of the departure from classical physics that quantum correlations exhibit. Dr Chris Ferrie, from the University of Technology Sydney, said: "We've light-heartedly called the region mapped out by the algorithm the 'edge of reality,' referring to the common terminology 'local realism' for a model of physics satisfying Einstein's relativity. "The algorithm works by building causal models through simulated evolution imitating natural selection - genetic programming. "The algorithm generates a population of 'fit' individual causal models which trade off closeness to quantum theory with the minimisation of causal influences between relativistically disconnected variables." The team used photons, single particles of light, to generate the quantum correlations that cannot be explained using classical mechanics. Quantum photonics has enabled a wide range of new technologies from quantum computation to quantum key distribution. 
The photons were prepared in various states possessing quantum entanglement, the phenomenon which fuels many of the advantages in quantum technology. The data collected were then used by the genetic algorithm to find the model that best matches the observed correlations. The resulting models quantify the region of causal models that is ruled out by nature itself. The team includes theoretical physicists and computer scientists from the ARC Centre for Engineered Quantum Systems (EQuS) at the University of Sydney, the Centre for Quantum Software and Information at the University of Technology Sydney and the Institute for Theoretical Physics at the University of Cologne, as well as the experimental group at RMIT University's Quantum Photonics Laboratory. The research, "Explaining quantum correlations through evolution of causal models", has been published in Physical Review A and can be accessed online. For interviews: Dr Chris Ferrie, csferrie@gmail.com, or Dr Alberto Peruzzo, alberto.peruzzo@rmit.edu.au or +61 410 790 860. For general media enquiries: David Glanz, +61 3 9925 2807 or +61 438 547 723 or david.glanz@rmit.edu.au.
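The selection-and-mutation loop Ferrie describes can be sketched in a few lines. The toy below is purely illustrative and not the researchers' actual code: the "causal model" encoding (a list of three parameters), the fitness trade-off weights and all numerical values are invented assumptions. It shows the same structure, trading off closeness to a target correlation against a penalty on a stand-in nonlocal-influence term.

```python
import random

# Toy sketch of the genetic-algorithm idea described above. Everything
# here is invented for illustration: a "causal model" is just a list of
# three parameters, and fitness trades off closeness to a target
# correlation against a penalty on a stand-in "nonlocal influence" term.

TARGET = 0.85    # stand-in for an observed quantum correlation (assumption)
PENALTY = 0.5    # weight on the nonlocal-influence term (assumption)

def fitness(model):
    closeness = -abs(model[0] + model[1] - TARGET)  # match the target
    influence = -PENALTY * abs(model[2])            # minimise nonlocal influence
    return closeness + influence

def mutate(model, rate=0.3):
    """Perturb each parameter with small Gaussian noise at a fixed rate."""
    return [g + random.gauss(0, 0.1) if random.random() < rate else g
            for g in model]

def evolve(pop_size=40, generations=200, seed=0):
    """Selection plus mutation with elitism: the best model is never lost."""
    random.seed(seed)
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                        # selection
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(pop, key=fitness)

best = evolve()
```

Because the best model of each generation is carried forward unchanged (elitism), the best fitness can only improve over the run; the real work reported in the paper lies in the causal-model representation, which this sketch deliberately oversimplifies.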


Camurus (NASDAQ STO: CAMX) and Braeburn Pharmaceuticals today announced positive top-line results from a long-term Phase 3 trial supporting the safety and efficacy of CAM2038 (weekly and monthly buprenorphine depots) in patients with moderate-to-severe opioid use disorder. "These new Phase 3 results add to the growing body of evidence supporting the use of our weekly and monthly buprenorphine depots (CAM2038) as a flexible, individualized therapy for patients with opioid use disorder," said Fredrik Tiberg, President & CEO, Camurus. "The present long-term study confirms the safety profile and efficacy of CAM2038 in both new-to-treatment patients and patients on maintenance treatment with daily buprenorphine. The results further strengthen our upcoming regulatory submissions to EMA and FDA in mid-2017." "People living with opioid use disorder need additional therapies that can provide meaningful improvement of treatment outcomes and quality of life. It is particularly important that we reduce the stigma and burdens associated with existing treatment approaches that require daily use of medications," said Prof. Nicholas Lintzeris, MBBS, PhD, FAChAM, Conjoint Professor of Addiction Medicine, University of Sydney, Australia. "We are pleased with the study treatments and results of this Phase 3 long-term safety study, showing that these buprenorphine depots were well-tolerated by patients and provided high levels of efficacy across the 48-week treatment period." A total of 228 patients were enrolled in the study conducted at 29 sites across the U.S., Europe and Australia. 162 (71%) patients completed the 48-week study treatment period. The safety profile of CAM2038 was similar to that observed in previous shorter-term trials. A total of 17 (7%) serious adverse events were reported in this 48-week study (52 weeks including follow-up), of which none was considered related to the study medication. 
Importantly, as in the previous Phase 3 efficacy study, no opioid overdoses were reported for patients treated with CAM2038 depot injections. Overall, headache, nausea, vomiting, nasopharyngitis, and urinary tract infection were the most common adverse events; in each case reported by less than 10% of patients. Injection site reactions occurred in 20% of the participants and were generally mild (16.3%) or moderate (3.5%). Severe injection site pain was reported for one patient (0.4%). Notably, more than 5000 injections of CAM2038 were administered in the study. Efficacy was assessed by weekly and monthly urine toxicology tests. On average, 75% of the urine samples were negative for illicit opioids across the 48-week treatment period. "The positive results from this study, coupled with the earlier reported positive results from the pivotal Phase 3 efficacy trial, enable our teams to finalize regulatory submissions seeking approval in the U.S., Europe and other key global markets," said Behshad Sheldon, President and CEO of Braeburn Pharmaceuticals. "Opioid addiction is an overwhelming public health epidemic. In the U.S. alone, there are 2.6 million patients diagnosed with opioid addiction, and approximately 30,000 people die every year from opioid overdoses. We look forward to bringing these innovative options of weekly and monthly buprenorphine medicines to patients as quickly as possible." "The successful completion of this study marks an important step forward in the development of provider-administered depot medications for the treatment of opioid use disorder," noted Michael Frost, MD, medical director, Eagleville Hospital and President of Frost Medical in Philadelphia, and the coordinating investigator for the study.  "Having both weekly and monthly formulations as well as multiple dosage strengths available, allows the treatment to be tailored to the individual needs of patients. 
Those who participated in the study tolerated the treatment well whether they were transitioned from other forms of buprenorphine or were new entrants to treatment." For more information about CAM2038 and the Phase 3 trial, please see the full press release at http://www.camurus.com. This information is information that Camurus AB is obliged to make public pursuant to the EU Market Abuse Regulation and the Swedish Securities Markets Act. The information was submitted for publication, through the agency of the chief executive officer, 08.00 AM CET on 2 May 2017.


Maron B.J., Minneapolis Heart Institute Foundation | Ommen S.R., Mayo Medical School | Semsarian C., University of Sydney | Spirito P., Ente Ospedaliero Ospedali Galliera | Olivotto I., University of Florence
Journal of the American College of Cardiology | Year: 2014

Hypertrophic cardiomyopathy (HCM) is a common inherited heart disease with diverse phenotypic and genetic expression, clinical presentation, and natural history. HCM has been recognized for 55 years, but substantial advances in diagnosis and treatment options have recently emerged, together with increased recognition of the disease in clinical practice. Nevertheless, most genetically and clinically affected individuals probably remain undiagnosed, largely free from disease-related complications, although HCM may progress along 1 or more of its major disease pathways (i.e., arrhythmic sudden death risk; progressive heart failure [HF] due to dynamic left ventricular [LV] outflow obstruction or due to systolic dysfunction in the absence of obstruction; or atrial fibrillation with risk of stroke). Effective treatments are available for each adverse HCM complication, including implantable cardioverter-defibrillators (ICDs) for sudden death prevention, heart transplantation for end-stage failure, surgical myectomy (or selectively, alcohol septal ablation) to alleviate HF symptoms by abolishing outflow obstruction, and catheter-based procedures to control atrial fibrillation. These and other strategies have now resulted in a low disease-related mortality rate of <1%/year. Therefore, HCM has emerged from an era of misunderstanding, stigma, and pessimism, experiencing vast changes in its clinical profile, and acquiring an effective and diverse management armamentarium. These advances have changed its natural history, with prevention of sudden death and reversal of HF, thereby restoring quality of life with extended (if not normal) longevity for most patients, and transforming HCM into a contemporary treatable cardiovascular disease. © 2014 by the American College of Cardiology Foundation.


MacKay J.P., University of Sydney | Font J., University of Sydney | Segal D.J., University of California at Davis
Nature Structural and Molecular Biology | Year: 2011

Spectacular progress has been made in the design of proteins that recognize double-stranded DNA with a chosen specificity, to the point that designer DNA-binding proteins can be ordered commercially. This success raises the question of whether it will be possible to engineer libraries of proteins that can recognize RNA with tailored specificity. Given the recent explosion in the number and diversity of RNA species demonstrated to play roles in biology, designer RNA-binding proteins are set to become valuable tools, both in the research laboratory and potentially in the clinic. Here we discuss the prospects for the realization of this idea. © 2011 Nature America, Inc. All rights reserved.


Graeber M.B., University of Sfax | Graeber M.B., University of Sydney | Streit W.J., University of Florida
Acta Neuropathologica | Year: 2010

The past 20 years have seen a gain in knowledge on microglia biology and microglia functions in disease that exceeds the expectations formulated when the microglia "immune network" was introduced. More than 10,000 articles have been published during this time. Important new research avenues of clinical importance have opened up such as the role of microglia in pain and in brain tumors. New controversies have also emerged such as the question of whether microglia are active or reactive players in neurodegenerative disease conditions, or whether they may be victims themselves. Premature commercial interests may be responsible for some of the confusion that currently surrounds microglia in both the Alzheimer and Parkinson's disease research fields. A critical review of the literature shows that the concept of "(micro)glial inflammation" is still open to interpretation, despite a prevailing slant towards a negative meaning. Perhaps the most exciting foreseeable development concerns research on the role of microglia in synaptic plasticity, which is expected to yield an answer to the question whether microglia are the brain's electricians. This review provides an analysis of the latest developments in the microglia field. © 2009 Springer-Verlag.


Lu X.,Huazhong University of Science and Technology | Naidis G.V.,RAS Joint Institute for High Temperatures | Laroussi M.,Old Dominion University | Ostrikov K.,CSIRO | Ostrikov K.,University of Sydney
Physics Reports | Year: 2014

This review focuses on one of the fundamental phenomena that occur upon application of sufficiently strong electric fields to gases, namely the formation and propagation of ionization waves known as streamers. The dynamics of streamers is controlled by strongly nonlinear coupling, in localized streamer tip regions, between the electric field (enhanced by charge separation) and the ionization and transport of charged species in that enhanced field. Streamers appear in nature (as the initial stages of sparks and lightning, and as the huge structures called sprites above thunderclouds), and are also found in numerous technological applications of electrical discharges. Here we discuss the fundamental physics of guided streamer-like structures, the plasma bullets produced in cold atmospheric-pressure plasma jets. Plasma bullets are guided ionization waves moving in a thin column of a jet of plasma-forming gases (e.g., He or Ar) expanding into ambient air. In contrast to streamers in free (unbounded) space, which propagate in a stochastic manner and often branch, guided ionization waves are repetitive and highly reproducible and propagate along the same path: the jet axis. This property of guided streamers, in comparison with streamers in free space, enables many advanced time-resolved experimental studies of ionization waves with nanosecond precision. In particular, experimental studies on the manipulation of streamers by external electric fields and on streamer interactions are critically examined. This review also introduces the basic theories and recent advances in experimental and computational studies of guided streamers, in particular those related to the propagation dynamics of ionization waves and the various parameters of relevance to plasma streamers. This knowledge is very useful for optimizing the efficacy of applications of plasma streamer discharges in fields ranging from health care and medicine to materials science and nanotechnology. © 2014 Elsevier B.V.


Pan S.L.,National University of Singapore | Tan B.,University of Sydney
Information and Organization | Year: 2011

Despite an abundance of prescriptions and examples for the conduct of case research in the literature, most prescriptions (1) articulate general principles or guidelines that are difficult to translate into specific, actionable steps, (2) hold only under idealized conditions and may be unworkable in the field, and (3) emphasize the need to be flexible without explaining how flexibility can be achieved; together, these gaps create a steep learning curve. To address them, a structured-pragmatic-situational (SPS) approach to conducting case research is proposed, with detailed instructions provided for each of its eight steps: (1) access negotiation, (2) conceptualizing the phenomenon, (3) collecting and organizing the initial data, (4) constructing and extending the theoretical lens, (5) confirming and validating data, (6) selective coding, (7) ensuring theory-data-model alignment, and (8) writing the case report. With its prescriptions, the SPS approach introduces a number of conceptual innovations, integrates the recommendations of some of the most frequently cited works on the case research method into a coherent whole, and suggests resolutions for a number of common issues that confront case researchers. © 2011 Elsevier Ltd.


Reynoso A.A.,University of Sydney | Frustaglia D.,University of Seville
Physical Review B - Condensed Matter and Materials Physics | Year: 2013

Quantum wires subject to the combined action of spin-orbit and Zeeman coupling in the presence of s-wave pairing potentials (the superconducting proximity effect in semiconductors, or superfluidity in cold atoms) are among the most promising systems for developing topological phases hosting Majorana fermions. The breaking of time-reversal symmetry is essential for the appearance of unpaired Majorana fermions. By implementing a time-dependent spin rotation, we show that the standard magnetostatic model maps into a nonmagnetic one in which the breaking of time-reversal symmetry is guaranteed by a periodic change of the spin-orbit coupling axis as a function of time. This suggests the possibility of developing the topological superconducting state of matter driven by external forces, in the absence of magnetic fields and magnetic elements. From a practical viewpoint, the scheme avoids the disadvantages of combining magnetism and superconductivity, even though the need for high-frequency driving of the spin-orbit coupling may represent a technological challenge. We describe the basic properties of this Floquet system by showing that finite samples host unpaired Majorana fermions at their edges, despite the fact that the bulk Floquet quasienergies are gapless and that the Hamiltonian at each instant of time preserves time-reversal symmetry. Remarkably, we identify the mean energy of the Floquet states as a topological indicator. We additionally show that the localized Floquet Majorana fermions are robust under local perturbations. Our results are supported by complementary numerical Floquet simulations. © 2013 American Physical Society.
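The time-dependent spin rotation described in the abstract can be sketched with a minimal rotating-frame calculation. This is an illustrative single-band toy model with generic symbols (no pairing term), not the paper's full Hamiltonian:

```latex
% Toy wire Hamiltonian: Rashba-type spin-orbit term plus a Zeeman field B
% along x (all symbols illustrative, not taken from the paper).
H \;=\; \frac{p^{2}}{2m} \;+\; \alpha\, p\,\sigma_{y} \;+\; B\,\sigma_{x}

% Pass to a frame rotating about x, with U(t) = e^{i\omega t\,\sigma_{x}/2}:
H'(t) \;=\; U H U^{\dagger} + i\,\dot{U}U^{\dagger}
      \;=\; \frac{p^{2}}{2m}
      \;+\; \alpha\, p\,\bigl(\sigma_{y}\cos\omega t - \sigma_{z}\sin\omega t\bigr)
      \;+\; \Bigl(B - \frac{\omega}{2}\Bigr)\sigma_{x}

% Choosing \omega = 2B cancels the magnetic term entirely, leaving a
% nonmagnetic Hamiltonian whose spin-orbit axis rotates periodically in time.
```

In this sense a static magnetic field can be traded for a periodically precessing spin-orbit axis, which is the kind of "nonmagnetic" driven model the abstract refers to.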


Lim L.S.,Singapore Eye Research Institute | Mitchell P.,University of Sydney | Seddon J.M.,Tufts University | Holz F.G.,University of Bonn | And 2 more authors.
The Lancet | Year: 2012

Age-related macular degeneration is a major cause of blindness worldwide. With ageing populations in many countries, more than 20% might have the disorder. Advanced age-related macular degeneration, including neovascular age-related macular degeneration (wet) and geographic atrophy (late dry), is associated with substantial, progressive visual impairment. Major risk factors include cigarette smoking, nutritional factors, cardiovascular diseases, and genetic markers, including genes regulating complement, lipid, angiogenic, and extracellular matrix pathways. Some studies have suggested a declining prevalence of age-related macular degeneration, perhaps due to reduced exposure to modifiable risk factors. Accurate diagnosis combines clinical examination and investigations, including retinal photography, angiography, and optical coherence tomography. Dietary antioxidant supplementation slows progression of the disease. Treatment for neovascular age-related macular degeneration incorporates intraocular injections of anti-VEGF agents, occasionally combined with other modalities. Evidence suggests that two commonly used anti-VEGF therapies, ranibizumab and bevacizumab, have similar efficacy, but possible differences in systemic safety are difficult to assess. Future treatments include inhibition of other angiogenic factors, and regenerative and topical therapies.


Bell M.L.,University of Sydney
Statistical Methods in Medical Research | Year: 2014

Patient-reported outcomes are increasingly used in health research, including randomized controlled trials and observational studies. However, the validity of results in longitudinal studies can crucially hinge on the handling of missing data. This paper considers the issues of missing data at each stage of research. Practical strategies for minimizing missingness through careful study design and conduct are given. Statistical approaches that are commonly used, but should be avoided, are discussed, including how these methods can yield biased and misleading results. Methods that are valid for data which are missing at random are outlined, including maximum likelihood methods, multiple imputation and extensions to generalized estimating equations: weighted generalized estimating equations, generalized estimating equations with multiple imputation, and doubly robust generalized estimating equations. Finally, we discuss the importance of sensitivity analyses, including the role of missing not at random models, such as pattern mixture, selection, and shared parameter models. We demonstrate many of these concepts with data from a randomized controlled clinical trial on renal cancer patients, and show that the results are dependent on missingness assumptions and the statistical approach. © The Author(s) 2013.
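The bias that complete-case analysis can introduce under a missing-at-random (MAR) mechanism, and how model-based imputation corrects it, can be illustrated with a toy simulation. This is a pure-Python sketch: the linear data model, the MAR rule, and the single regression imputation are illustrative simplifications, not the paper's renal cancer analysis or a full multiple-imputation procedure:

```python
import random

# Illustrative simulation: outcome y depends on a fully observed covariate x,
# and y is missing at random (MAR) -- the probability that y is missing
# depends only on the observed x, never on the unseen y itself.
random.seed(42)
n = 5000
data = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    y = 2.0 + x + random.gauss(0.0, 0.5)           # true mean of y is 2.0
    missing = (x > 0) and (random.random() < 0.9)   # MAR: depends on x only
    data.append((x, None if missing else y))

observed = [(x, y) for x, y in data if y is not None]

# Complete-case analysis: drop rows with missing y. Because high-x (hence
# high-y) rows are preferentially dropped, this estimate is biased downward.
cc_mean = sum(y for _, y in observed) / len(observed)

# Single regression imputation (a toy stand-in for multiple imputation):
# fit y = a + b*x on the observed pairs, then fill in predicted values.
mx = sum(x for x, _ in observed) / len(observed)
my = sum(y for _, y in observed) / len(observed)
b = (sum((x - mx) * (y - my) for x, y in observed)
     / sum((x - mx) ** 2 for x, _ in observed))
a = my - b * mx
completed = [y if y is not None else a + b * x for x, y in data]
imp_mean = sum(completed) / len(completed)

print(f"complete-case mean: {cc_mean:.2f}")   # biased well below 2.0
print(f"imputed mean:       {imp_mean:.2f}")  # close to the true mean
```

Proper multiple imputation would repeat the fill-in step with draws reflecting parameter and residual uncertainty across several completed datasets and then pool the estimates; this sketch shows only the direction of the complete-case bias when missingness depends on an observed covariate.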


Wei C.J.,Peking University | Clarke G.L.,University of Sydney
Journal of Metamorphic Geology | Year: 2011

Pseudosections calculated with thermocalc predict that lawsonite-bearing assemblages, including lawsonite eclogite, will be common for subducted oceanic crust that experiences cool, fluid-saturated conditions. For glaucophane-lawsonite eclogite facies conditions (500-600°C and 18-28kbar), MORB compositions are predicted in the NCKMnFMASHO system to contain glaucophane, garnet, omphacite, lawsonite, phengite and quartz, with chlorite at lower temperature and talc at higher temperature. In these assemblages, the pyrope content in garnet is mostly controlled by variations in temperature, and grossular content is strongly controlled by pressure. The silica content in phengite increases linearly with pressure. As the P-T conditions for these given isopleths are only subtly affected by common variations in bulk-rock compositions, the P-T pseudosections potentially present a robust geothermobarometric method for natural glaucophane-bearing eclogites. Thermobarometric results recovered both by isopleth and conventional approaches indicate that most natural glaucophane-lawsonite eclogites (Type-L) and glaucophane-epidote eclogites (Type-E) record similar peak P-T conditions within the lawsonite stability field. Decompression from conditions appropriate for lawsonite stability should result in epidote-bearing assemblages through dehydration reactions controlled by lawsonite+omphacite=glaucophane+epidote+H2O. Lawsonite and omphacite breakdown will be accompanied by the release of a large amount of bound fluid, such that eclogite assemblages are variably recrystallized to glaucophane-rich blueschist. Calculated pseudosections indicate that eclogite assemblages form most readily in Ca-rich rocks and blueschist assemblages most readily in Ca-poor rocks. This distinction in bulk-rock composition can account for the co-existence of low-T eclogite and blueschist in high-pressure terranes. © 2011 Blackwell Publishing Ltd.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-08-2014 | Award Amount: 25.06M | Year: 2015

The TBVAC2020 proposal builds on the highly successful and long-standing collaborations in the preceding EC-FP5-, FP6- and FP7-funded TB vaccine and biomarker projects, but also brings in a large number of new key partners from excellent laboratories in Europe, the USA, Asia, Africa and Australia, many of which are global leaders in the TB field. This was initiated by launching an open call for Expressions of Interest (EoI) prior to this application, to which interested parties could respond. In total, 115 EoIs were received and ranked by the TBVI Steering Committee using the proposed H2020 evaluation criteria. This led to the prioritisation of the 52 R&D approaches included in this proposal. TBVAC2020 aims to innovate and diversify the current TB vaccine and biomarker pipeline while at the same time applying portfolio management, using gating and priority-setting criteria, to select the most promising TB vaccine candidates as early as possible and accelerate their development. TBVAC2020 proposes to achieve this by combining creative bottom-up approaches for vaccine discovery (WP1), new preclinical models addressing clinical challenges (WP2) and identification and characterisation of correlates of protection (WP5) with a directive top-down portfolio management approach that selects the most promising TB vaccine candidates through comparative evaluation using objective gating and priority-setting criteria (WP6) and supports direct, head-to-head or comparative preclinical and early clinical evaluation (WP3, WP4). This approach will both innovate and diversify the existing TB vaccine and biomarker pipeline and accelerate the development of the most promising TB vaccine candidates through the early development stages. The proposed approach, and the involvement of many internationally leading groups in the TB vaccine and biomarker area, fully aligns TBVAC2020 with the Global TB Vaccine Partnerships (GTBVP).


Patent
Centenary Institute Of Cancer Medicine And Cell Biology, Wenkart Foundation, Medvet Science Pty Ltd. and University of Sydney | Date: 2010-06-04

The present invention relates to methods for modulating angiogenesis, comprising administering to a subject, or cells or tissue derived therefrom: (i) one or more miRNA, or precursors or variants thereof, wherein at least one of said miRNA comprises a seed region comprising the sequence UCACAGU (SEQ ID NO:37) to inhibit angiogenesis; or (ii) one or more antagonists of a miRNA, wherein said miRNA comprises a seed region comprising the sequence UCACAGU (SEQ ID NO:37) to promote or induce angiogenesis. Also provided are methods of diagnosis of conditions associated with abnormal angiogenesis, or determining predisposition thereto. Suitable pharmaceutical compositions are also provided.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2011.2.2.1-2 | Award Amount: 24.91M | Year: 2012

The goal of this proposal (INMiND) is to carry out collaborative research on the molecular mechanisms that link neuroinflammation with neurodegeneration, in order to identify novel biological targets of activated microglia that may serve both diagnostic and therapeutic purposes, and to translate this knowledge into the clinic. The general objectives of INMiND are: (i) to identify novel mechanisms of regulation and function of microglia under various conditions (inflammatory stimuli; neurodegenerative and -regenerative model systems); (ii) to identify and implement new targets for activated microglia, which may serve diagnostic (imaging) and therapeutic purposes; (iii) to design new molecular probes (tracers) for these novel targets and to implement and validate them in in vivo model systems and patients; (iv) to image and quantify modulated microglia activity in patients undergoing immune therapy for cognitive impairment and to relate the findings to clinical outcome. Within INMiND we bring together a group of excellent scientists with a proven record of efficiently accomplishing common scientific goals (FP6 project DiMI, www.dimi.eu), who belong to highly complementary fields of research (from genome-oriented scientists to imaging scientists and clinicians), and who are dedicated to formulating novel image-guided therapeutic strategies for neuroinflammation-related neurodegenerative diseases. The strength of this proposal is that, across Europe, it will coordinate research and training activities related to neuroinflammation, neurodegeneration/-regeneration and imaging, with special emphasis on translating basic mechanisms into clinical applications that will provide health benefits for our aging population. With its intellectual excellence and its critical mass, the INMiND consortium will play a major role in the European Research Area and will gain European leadership in the creation of new image-guided therapy paradigms for patients with neurodegenerative diseases.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: KBBE-2008-1-4-05 | Award Amount: 3.08M | Year: 2009

The overall objective of the project is to collect and analyze new data on non-tariff measures (NTMs), particularly on governmental standards and regulations that prescribe the conditions for importing agri-food products into the EU market and into the markets of the main competing players. Furthermore, the impacts of EU non-tariff barriers (NTBs) on least developed country (LDC) exports are examined. The project will deliver the following results: 1. An analytical framework for defining measures, methods, products and countries. 2. A database on NTMs in the EU, USA, Canada, Japan, China, India, Brazil, Argentina, Australia, Russia and New Zealand. 3. Comparative analyses of the impact of NTMs on agri-food trade of the EU. 4. Policy recommendations from case studies quantifying NTMs in the fruit and vegetable, meat and dairy trade clusters with the EU. 5. Policy recommendations from case studies on the impacts of EU private and public standards in LDCs. 6. Dissemination of project results to key stakeholders. This will be achieved: A. By optimizing complementarities of the project with ongoing NTM research on the TRAINS database at UNCTAD. B. By organizing the research work into research, database, management and dissemination work packages. C. By developing research methodologies that are innovative and robust, optimizing the direct usefulness of the end results for the end users. D. By proposing a partner consortium that together unites the relevant capabilities: scientific excellence and international project experience; appropriate geographic coverage to collect the required data in all countries; linkages and complementarities with ongoing international NTM analyses (UNCTAD, OECD, World Bank, IFPRI); policy contacts, dialogue and influence; and efficient and effective project management. E. With a budget of 314.5 person-months and an EC request of 2.372M, for 19 partners, over 30 months.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.2.4.3-1 | Award Amount: 8.25M | Year: 2012

Background: A significant proportion of pre-diabetics show macro- and microvascular complications associated with hyperglycaemia. Although many trials have demonstrated the efficacy of lifestyle and pharmaceutical interventions in diabetes prevention, no trial has evaluated the extent to which mid- and long-term complications can be prevented by early interventions on hyperglycaemia. Aims: To assess the long-term effects on multiple complications of hyperglycaemia of early intensive management of hyperglycaemia with sitagliptin, metformin or their combination added to lifestyle intervention (LSI) (diet and physical activity), compared with LSI alone, in adults with non-diabetic intermediate hyperglycaemia (IFG, IGT or both). Study Design: Long-term, multi-centre, randomised, partially double-blinded, placebo-controlled, phase-IIIb clinical trial with prospective blinded outcome evaluation. Participants will be randomised to four parallel arms: 1) LSI + 2 placebo tablets/day; 2) LSI + 2 metformin tablets of 850 mg/day; 3) LSI + 2 sitagliptin tablets of 50 mg/day; 4) LSI + 2 tablets of a fixed-dose combination of sitagliptin 50 mg and metformin 850 mg/day. Active intervention will last for at least 3 years, with additional follow-up of up to 5 years. Setting and Population: Males and females with pre-diabetes (IFG, IGT or both) aged 45 to 74 years, selected from primary care screening programs in 15 clinical centres in 12 countries: Australia, Austria, Bulgaria, Germany, Greece, Italy, Lithuania, Poland, Serbia, Spain, Switzerland and Turkey (N=3000). Main Outcomes: The primary endpoint is a combined continuous variable, the microvascular complication index (MCI), composed of a linear combination of the Early Treatment Diabetic Retinopathy Study (ETDRS) scale score (based on retinograms), the urinary albumin-to-creatinine ratio, and a measure of distal small-fibre neuropathy (sudomotor test by SUDOSCAN), measured at the baseline visit and at the 36th- and 60th-month visits after randomisation. In addition, this project will include the evaluation of early novel serological biomarkers of systemic inflammation, early microvascular damage, non-alcoholic fatty liver disease, insulin sensitivity and insulin secretion, and measures of quality of life, sleep quality (somnograms) and neuropsychological evaluation. Vascular function and structure will be evaluated in a subset of participants (n=1000), including cIMT and microvascular endothelial function measured by EndoPAT. Expected Results: By evaluating the effect of aggressive treatments in pre-diabetes for the early prevention of diabetes complications, this project has the potential to change the current paradigm of early management of hyperglycaemia. The ultimate goal is the development of a standardized core protocol for the early prevention of microvascular and other complications, reducing social costs not only in health care but also in disability at work.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: KBBE.2012.2.2-03 | Award Amount: 14.15M | Year: 2013

The primary goal of PREVIEW is to identify the most efficient lifestyle pattern for the prevention of type-2 diabetes in a population of pre-diabetic overweight or obese individuals. The project comprises two distinct lines of evidence, both embracing European and overseas countries: 1) a multicentre, clinical randomized intervention trial with a total of 2,500 pre-diabetic participants, including children and adolescents, adults and elderly (the duration will be 3 years for the adults and elderly, and 2 years for the children and adolescents); and 2) large population studies using data from all age groups. The focus in both lines of evidence will be on diet (specifically protein and glycemic index) and intensity of physical activity, as well as their interaction with the lifestyle factors habitual stress and sleeping pattern, and with behavioural, environmental, cultural and socioeconomic variables. PREVIEW will significantly increase our knowledge of how specific lifestyle factors can help prevent type-2 diabetes. Type-2 diabetes accounts for about 90% of all cases of diabetes and is primarily driven by the worldwide obesity epidemic. Diabetes is a costly disease: according to the WHO, the direct health care costs of diabetes range from 2.5% to 15% of annual national health care budgets. This worrying trend calls for action and for a variety of innovative approaches. PREVIEW aims to be such an innovative attempt, including all the necessary disciplines and stakeholders who can contribute to developing new ways to prevent this widespread lifestyle-related disease. The strategic impact of PREVIEW concerns the massive problems associated with the global diabesity epidemic (obesity and type-2 diabetes), and the project therefore includes partners from Europe (East, West, North and South), Australia, New Zealand and Canada. PREVIEW will thereby contribute to improving health over the life-span of the population in Europe and worldwide. Overall, the public health and socio-economic impact of PREVIEW is expected to be very significant.


News Article | October 26, 2016
Site: www.eurekalert.org

A UC Riverside-led team of astronomers used observations of a gravitationally lensed galaxy to measure the properties of the early universe. RIVERSIDE, Calif. -- Although the universe started out with a bang, it quickly evolved into a relatively cool, dark place. After a few hundred million years the lights came back on, and scientists are still trying to figure out why. Astronomers know that reionization made the universe transparent by allowing light from distant galaxies to travel almost freely through the cosmos to reach us. However, astronomers don't fully understand the escape rate of ionizing photons from early galaxies. That escape rate is a crucial but still poorly constrained value, meaning there is a wide range of upper and lower limits in the models developed by astronomers. That limitation is due in part to the fact that astronomers have been restricted to indirect methods of observing ionizing photons, meaning they may see only a few pixels of an object and must then make assumptions about its unseen aspects. Direct detection, that is, directly observing an object such as a galaxy with a telescope, would provide a much better estimate of the escape rate. In a just-published paper, a team of researchers led by a University of California, Riverside graduate student used a direct detection method and found that the previously used constraints have been overestimated by five times. "This finding opens questions on whether galaxies alone are responsible for the reionization of the universe or if faint dwarf galaxies beyond our current detection limits have higher escape fractions to explain radiation budget necessary for the reionization of the universe," said Kaveh Vasei, the graduate student who is the lead author of the study. It is difficult to understand the properties of the early universe, in large part because this was more than 12 billion years ago.
It is known that around 380,000 years after the Big Bang, electrons and protons bound together to form hydrogen atoms for the first time. They make up more than 90 percent of the atoms in the universe, and can very efficiently absorb high energy photons and become ionized. However, there were very few sources to ionize these atoms in the early universe. One billion years after the Big Bang, the material between the galaxies was reionized and became more transparent. The main energy source of the reionization is widely believed to be massive stars formed within early galaxies. These stars had a short lifespan and were usually born in the midst of dense gas clouds, which made it very hard for ionizing photons to escape their host galaxies. Previous studies suggested that about 20 percent of these ionizing photons need to escape the dense gas environment of their host galaxies to significantly contribute to the reionization of the material between galaxies. Unfortunately, a direct detection of these ionizing photons is very challenging and previous efforts have not been very successful. Therefore, the mechanisms leading to their escape are poorly understood. This has led many astrophysicists to use indirect methods to estimate the fraction of ionizing photons that escape the galaxies. In one popular method, the gas is assumed to have a "picket fence" distribution, where the space within galaxies is assumed to be composed of either regions of very little gas, which are transparent to ionizing light, or regions of dense gas, which are opaque. Researchers can determine the fraction of each of these regions by studying the light (spectra) emerging from the galaxies. In this new UC Riverside-led study, astronomers directly measured the fraction of ionizing photons escaping from the Cosmic Horseshoe, a distant galaxy that is gravitationally lensed. 
Gravitational lensing is the deformation and amplification of a background object by the curving of space and time due to the mass of a foreground galaxy. The details of the galaxy in the background are therefore magnified, allowing researchers to study its light and physical properties more clearly. Based on the picket fence model, an escape fraction of 40 percent for ionizing photons from the Horseshoe was expected. The Horseshoe therefore represented an ideal opportunity to obtain, for the first time, a clear, resolved image of leaking ionizing photons, to help understand the mechanisms by which they escape their host galaxies. The research team obtained a deep image of the Horseshoe with the Hubble Space Telescope in an ultraviolet filter, enabling them to directly detect escaping ionizing photons. Surprisingly, the image did not detect ionizing photons coming from the Horseshoe. The team constrained the fraction of escaping photons to be less than 8 percent, five times smaller than what had been inferred by the indirect methods widely used by astronomers. "The study concludes that the previously determined fraction of escaping ionizing radiation of galaxies, as estimated by the most popular indirect method, is likely overestimated in many galaxies," said Brian Siana, co-author of the research paper and an assistant professor at UC Riverside. "The team is now focusing on direct determination of the fraction of escaping ionizing photons that does not rely on indirect estimates." This paper, "The Lyman Continuum Escape Fraction of the Cosmic Horseshoe: A Test of Indirect Estimates," has been published in the Astrophysical Journal. In addition to Vasei and Siana, the authors are Alice E. Shapley (UCLA), Anna M. Quider (University of Cambridge, United Kingdom), Anahita Alavi (UC Riverside), Marc Rafelski (Goddard Space Flight Center / NASA), Charles C. Steidel (Caltech), Max Pettini (University of Cambridge), and Geraint F. Lewis (University of Sydney).
Mario De Leo Winkler, a postdoctoral researcher in the UCR Department of Physics and Astronomy, made significant contributions to this article.


News Article | October 27, 2016
Site: spaceref.com

Although the universe started out with a bang it quickly evolved to a relatively cool, dark place. After a few hundred thousand years the lights came back on and scientists are still trying to figure out why. Astronomers know that reionization made the universe transparent by allowing light from distant galaxies to travel almost freely through the cosmos to reach us. However, astronomers don't fully understand the escape rate of ionizing photons from early galaxies. That escape rate is a crucial, but still a poorly constrained value, meaning there are a wide range of upper and lower limits in the models developed by astronomers. That limitation is in part due to the fact that astronomers have been limited to indirect methods of observation of ionizing photons, meaning they may only see a few pixels of the object and then make assumptions about unseen aspects. Direct detection, or directly observing an object such as a galaxy with a telescope, would provide a much better estimate of their escape rate. In a just-published paper, a team of researchers, led by a University of California, Riverside graduate student, used a direct detection method and found the previously used constraints have been overestimated by five times. "This finding opens questions on whether galaxies alone are responsible for the reionization of the universe or if faint dwarf galaxies beyond our current detection limits have higher escape fractions to explain radiation budget necessary for the reionization of the universe," said Kaveh Vasei, the graduate student who is the lead author of the study. It is difficult to understand the properties of the early universe in large part because this was more than 12 billion year ago. It is known that around 380,000 years after the Big Bang, electrons and protons bound together to form hydrogen atoms for the first time. They make up more than 90 percent of the atoms in the universe, and can very efficiently absorb high energy photons and become ionized. 
However, there were very few sources to ionize these atoms in the early universe. One billion years after the Big Bang, the material between the galaxies was reionized and became more transparent. The main energy source of the reionization is widely believed to be massive stars formed within early galaxies. These stars had a short lifespan and were usually born in the midst of dense gas clouds, which made it very hard for ionizing photons to escape their host galaxies. Previous studies suggested that about 20 percent of these ionizing photons need to escape the dense gas environment of their host galaxies to significantly contribute to the reionization of the material between galaxies. Unfortunately, a direct detection of these ionizing photons is very challenging and previous efforts have not been very successful. Therefore, the mechanisms leading to their escape are poorly understood. This has led many astrophysicists to use indirect methods to estimate the fraction of ionizing photons that escape the galaxies. In one popular method, the gas is assumed to have a "picket fence" distribution, where the space within galaxies is assumed to be composed of either regions of very little gas, which are transparent to ionizing light, or regions of dense gas, which are opaque. Researchers can determine the fraction of each of these regions by studying the light (spectra) emerging from the galaxies. In this new UC Riverside-led study, astronomers directly measured the fraction of ionizing photons escaping from the Cosmic Horseshoe, a distant galaxy that is gravitationally lensed. Gravitational lensing is the deformation and amplification of a background object by the curving of space and time due to the mass of a foreground galaxy. The details of the galaxy in the background are therefore magnified, allowing researchers to study its light and physical properties more clearly. 
Based on the picket fence model, an escape fraction of 40 percent for ionizing photons from the Horseshoe was expected. Therefore, the Horseshoe represented an ideal opportunity to get, for the first time, a clear, resolved image of leaking ionizing photons to help understand the mechanisms by which they escape their host galaxies. The research team obtained a deep image of the Horseshoe with the Hubble Space Telescope in an ultraviolet filter, enabling them to directly detect escaping ionizing photons. Surprisingly, the image revealed no ionizing photons coming from the Horseshoe. The team constrained the fraction of escaping photons to be less than 8 percent, five times smaller than what had been inferred by the indirect methods widely used by astronomers. "The study concludes that the previously determined fraction of escaping ionizing radiation of galaxies, as estimated by the most popular indirect method, is likely overestimated in many galaxies," said Brian Siana, co-author of the research paper and an assistant professor at UC Riverside. "The team is now focusing on direct determinations of the fraction of escaping ionizing photons that do not rely on indirect estimates." This paper, "The Lyman Continuum Escape Fraction of the Cosmic Horseshoe: A Test of Indirect Estimates," has been published in the Astrophysical Journal. In addition to Vasei and Siana, the authors are Alice E. Shapley (UCLA), Anna M. Quider (University of Cambridge, United Kingdom), Anahita Alavi (UC Riverside), Marc Rafelski (NASA Goddard Space Flight Center), Charles C. Steidel (Caltech), Max Pettini (University of Cambridge), and Geraint F. Lewis (University of Sydney). Mario De Leo Winkler, a postdoctoral researcher in the UCR Department of Physics and Astronomy, made significant contributions to this article.


News Article | April 15, 2016
Site: www.spie.org

The first detection of a gravitational wave depended on large surfaces with excellent flatness, combined with low microroughness and the ability to mitigate environmental noise. Albert Einstein's general theory of relativity predicted that massive, accelerating bodies in deep space, such as supernovae or orbiting black holes, emit huge amounts of energy that radiate throughout the universe as gravitational waves. Although these "ripples in spacetime" may travel billions of light years, Einstein never thought the technology would exist that would allow for their detection on Earth. But a century later, the technology does exist at the Laser Interferometer Gravitational-Wave Observatory (LIGO). Measurements from two interferometers, 3000km apart in Louisiana and Washington State, have provided the first direct evidence of Einstein's theory by recording gravitational-wave signal GW150914, determined to be produced by two black holes coalescing 1.2 billion light years away. At the heart of the discovery lie fused silica optics with figure quality and surface smoothness refined to enable measurement of these incredibly small perturbations. Their design is an important part of LIGO's story. The black hole coalescence was detected as an upward-sweeping 'chirp' from 35 to 300Hz, which falls in the detectors' mid-frequency range that is plagued by noise from the optics. Left and right images show data from the Hanford and Livingston observatories. (Caltech/MIT/LIGO Laboratory) "Most impressive are [the optics'] size combined with surface figure, coating uniformity, monolithic suspensions, and low absorption," says Daniel Sigg, a LIGO lead scientist at Caltech. LIGO's optics system amplifies and splits a laser beam down two 4km-long orthogonal tubes. The two beams build power by resonating between reflective mirrors, or 'test masses,' suspended at either end of each arm. This creates an emitted wavelength of unprecedented precision.
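The scale of the measurement can be sanity-checked with one line of arithmetic: a passing gravitational wave of strain h changes an arm of length L by roughly ΔL = h × L. The peak strain of about 10^-21 used below is the widely reported figure for GW150914, not a number taken from this article:

```python
peak_strain = 1.0e-21   # dimensionless strain h, widely reported for GW150914
arm_length_m = 4_000.0  # each LIGO arm is 4 km long

# A passing wave of strain h changes the arm length by roughly h * L.
delta_L = peak_strain * arm_length_m
print(f"arm-length change: {delta_L:.1e} m")  # prints "arm-length change: 4.0e-18 m"
```

That back-of-the-envelope result, a displacement thousands of times smaller than a proton, matches the "few times 10^-18 meters" figure quoted in the next paragraph.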
When the split beam recombines, any change in one arm's path length results in a fringe pattern at the photodetector. For GW150914, this change was just a few times 10^-18 meters. Reducing noise sources at each frequency improves interferometer sensitivity. Green shows actual noise during the initial LIGO science run. Red and blue (Hanford, WA and Livingston, LA) show noise during advanced LIGO's first observation run, during which GW150914 was detected. Advanced LIGO's sensitivity goal (gray) is a tenfold noise reduction from initial LIGO. (Caltech/MIT/LIGO Laboratory) But the entire instrument is subject to environmental noise that reduces sensitivity. A noise plot shows the actual strain on the instruments at all frequencies, which must be distinguished from gravity wave signals. The optics themselves contribute to the noise, which most basically includes thermal noise and the quality factor, or 'Q,' of the substrate. "If you ping a wine glass, you want to hear 'ping' and not 'dink'. If it goes 'dink', the resonance line is broad and the entire noise increases. But if you contain all the energy in one frequency, you can filter it out," explains GariLynn Billingsley, LIGO optics manager at Caltech. That's the Q of the mirrors. Further, if the test mass surfaces did not allow identical wavelengths to resonate in both arms, it would result in imperfect cancellation when the beam recombines. And if non-resonating light is lost, so is the ability to reduce laser noise. Perhaps most problematic, the optics' coatings contribute to noise due to stochastic particle motion. Stringent design standards ameliorate these problems. In 1996, a program invited manufacturers to demonstrate their ability to meet the specifications required by initial LIGO's optics. Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO) won the contract.
"It was a combination of our ability to generate large surfaces with excellent flatness, combined with very low microroughness," says Chris Walsh, now at the University of Sydney, who supervised the overall CSIRO project. "It requires enormous expertise to develop the polishing process to get the necessary microroughness (0.2-0.4nm RMS) and surface shape simultaneously." Master optician Achim Leistner led the work, with Bob Oreb in charge of metrology. Leistner pioneered the use of a Teflon lap, which provides a very stable surface that matches the desired shape of the optic during polishing and allows for controlled changes. "We built the optics to a specification that was different to anything we'd ever seen before," adds Walsh. Even with high-precision optics and a thermal compensation system that balances the minuscule heating of the mirror's center, initial LIGO was not expected to detect gravity waves. Advanced LIGO, begun in 2010 and completing its first observations when GW150914 was detected, offers a tenfold increase in design sensitivity due to upgrades that address the entire frequency range. "Very simply, we have better seismic isolation at low frequencies; better test masses and suspension at intermediate frequencies; and higher powered lasers at high frequencies," says Michael Landry, a lead scientist at the LIGO-Hanford observatory. At low frequencies, mechanical resonances are well understood. At high frequencies, radiation pressure and laser 'shot' noise dominate. But at intermediate frequencies (60-100 Hz), scattered light and beam jitter are difficult to control. "Our bucket is lowest here. And there are other things we just don't know," adds Landry. "The primary thermal noise, which is the component at intermediate frequency that will ultimately limit us, is the Brownian noise of the coatings." To improve signal-to-noise at intermediate frequencies, advanced LIGO needed larger test masses (340mm diameter). 
California-based Zygo Extreme Precision Optics won the contract to polish them. "We were chosen based on our ability to achieve very tight surface figure, roughness, radius of curvature, and surface defect specifications simultaneously," says John Kincade, Zygo's Extreme Precision Optics managing director. The test masses required a 1.9km radius of curvature, with figure requirements as stringent as 0.3nm RMS. After super-polishing to extremely high spatial frequency, ion beam figuring fine-tunes the curvature by etching the surface several molecules at a time. This allows reliable shape without compromising the ability to produce micro-roughness over large surfaces. Advanced LIGO input test mass champion data: Zygo achieved figuring accuracy to 0.08nm RMS over the critical 160mm central clear aperture, and sub-nanometer accuracy on the full clear 300mm aperture of many other samples. (Zygo Extreme Precision Optics) Dielectric coatings deposited on the high-precision surfaces determine their optical performance. CSIRO and the University of Lyon Laboratoire des Materiaux Avances shared the contract to apply molecule-thin alternating layers of tantalum pentoxide and silica via ion-beam sputtering. Katie Green, project leader in CSIRO's optics group, says "the thickness of the individual layers is monitored as they're deposited. Each coating consists of multiple layers of particular thicknesses, with the specific composition of the layers varying depending on how the optic needs to perform in the detector." Additionally, gold coatings around the edges provide thermal shielding and act as an electrostatic drive. LIGO's next observation run is scheduled to begin in September 2016. And after Advanced LIGO reaches its design sensitivity by fine-tuning current systems, further upgrades await in the years 2018-2020 and beyond. "One question is how you reduce the thermal noise of the optics, in particular their coatings.
But coating technologies make it hard to get more than a factor of about three beyond Advanced LIGO's noise level," says Landry. One possibility is operating at cryogenic temperatures. But "fused silica becomes noisy at cold temperatures, and you need a different wavelength laser to do this," according to Billingsley. Another way of increasing the sensitivity at room temperature is to use 40km-arm-length interferometers. Other optics-related systems reduce noise. Advanced LIGO's test masses are suspended on fused silica fibers, creating monolithic suspension that reduces thermal noise and raises the system's resonant frequency compared with initial LIGO. "The Q of that system is higher so an entire band shrinks. That means opening up more space at lower frequencies, where binary black holes are," says Landry. In the 17th century, Galileo pointed a telescope to the sky and pioneered a novel way of observing the universe. Now, LIGO's detection of GW150914 marks another new era of astronomy. As advances in glass lenses enabled Galileo's discoveries, so have state-of-the-art optics made LIGO's discoveries possible. And with astronomy's track record of developing new generations of optical devices, both the astrophysical and precision optics communities are poised for an exciting future.


News Article | November 29, 2016
Site: www.eurekalert.org

In a small study of young or recently retired NFL players, researchers at Johns Hopkins report finding imaging evidence of brain injury and repair in the players compared to a control group of men without a history of concussion. In a report on the study that used positron emission tomography (PET) and MRI, published in JAMA Neurology on Nov. 28, the researchers highlighted the value of PET imaging to monitor a marker of injury and repair in the brains of NFL players and athletes in other contact sports. The new research builds on a rising tide of anecdotal evidence and a few scientific studies suggesting that people with repeated concussive head injuries incurred while playing football, hockey or boxing are at higher-than-normal risk of developing the neurodegenerative disease called chronic traumatic encephalopathy (CTE). CTE is associated with memory deficits, confusion, poor decision-making and later onset of dementia. However, because CTE is often only diagnosed at autopsy, and because similar symptoms may occur in people without repeated head injuries, researchers, including those at Johns Hopkins Medicine, have been developing methods to better visualize tissue damage in the living brain and so better establish cause and effect. "The exciting part of our new findings is that we now believe we have a useful tool to monitor the brains of NFL players and athletes in other contact sports," says Jennifer Coughlin, M.D., assistant professor of psychiatry and behavioral sciences at Johns Hopkins. "We can measure TSPO, a PET biomarker of brain injury, in these younger players, and we can now begin to follow it over time to see if the brain is repairing itself or not." In early 2015, the Johns Hopkins research team published PET imaging results showing higher levels of this same biomarker in the brains of nine elderly former NFL players compared to control participants.
However, since they initially studied elderly players who were many years from play, the researchers were unable to tell if the findings were also linked to aging and vascular disease, independent of past NFL play. For the new study, the researchers collected PET imaging data from 11 men without a history of concussion and compared the scans to those of 12 young NFL players, all of whom were still active or had retired within the past 12 years. All players had a self-reported history of at least one concussion. The players were 31 years old on average. About 80 percent were Caucasian and 20 percent were African-American. The control participants were matched to the players by body mass index, age and education level. The PET imaging was acquired using a radioactive chemical that binds to translocator protein 18 kDa (TSPO), which is normally found at low levels in healthy brain tissue. Since TSPO is increased during cellular response to brain injury, high levels of the TSPO signal on each PET scan can indicate where injury and reparative processes occur. The researchers found higher radiotracer binding to TSPO in players compared to control participants in eight of the 12 brain regions studied. These regions included the hippocampus, a region functionally involved in memory. Separately, the researchers examined data from MRI scans to look for structural changes in the brains of the study participants. They found no evidence of brain tissue loss in players compared to control participants in any of the brain regions examined, yet they did find some evidence of white matter changes in the players' brains. "We suspect that when the brain moves during a hard hit, it causes a shearing injury of the white matter fibers that travel across the brain," says Coughlin. Coughlin cautioned that there are some limitations to the imaging technique.
For example, the radiotracer used in the PET scans doesn't work well in people with a specific variation in the gene that codes for TSPO protein, which occurs in about one in 10 people of European descent. Also, the researchers observed that use of creatine supplements -- taken by athletes to improve performance -- may interfere with the imaging results, necessitating further study of this effect before including participants taking creatine. "With further research using this technology, we may better understand the relationship between concussion and brain damage," says Coughlin. "Further understanding may help inform players of associated risk, and will allow us to test preventive and therapeutic interventions that may improve the lives of players." According to Centers for Disease Control and Prevention estimates, anywhere from 1.6 to 3.8 million concussions happen each year in the U.S. because of sports or recreational activities. Other researchers contributing to the study include Yuchuan Wang, Il Minn, Nicholas Bienko, Emily Ambinder, Xin Xu, Matthew Peters, John Dougherty, Melin Vranesic, Soo Min Koo, Hye-Hyun Ahn, Merton Lee, Chris Cottrell, Haris Sair, Akira Sawa, Cynthia Munro, Robert Dannals, Constantine Lyketsos, Gwenn Smith, Brian Caffo, Susumu Mori and Martin Pomper of The Johns Hopkins University; Christopher Nowinski of Boston University; Michael Kassiou of the University of Sydney; and Tomas Guilarte of Florida International University. The study was funded by the Brain and Behavior Research Foundation, the Alexander Wilson Schweizer Fellowship, the National Institute of Environmental Health Sciences (NIEHS-ES007062) and the GE/NFL Head Health Challenge. The funders had no role in designing, conducting or reporting on the study results.


News Article | November 14, 2016
Site: www.eurekalert.org

Seaweed-eating fish are becoming increasingly voracious as the ocean warms due to climate change and are responsible for the recent destruction of kelp forests off the NSW north coast near Coffs Harbour, research shows. The study includes an analysis of underwater video covering a 10-year period between 2002 and 2012, during which the water warmed by 0.6 degrees. "Kelp forests provide vital habitat for hundreds of marine species, including fish, lobster and abalone," says study first author Dr Adriana Vergés of UNSW and the Sydney Institute of Marine Science. "As a result of climate change, warm-water fish species are shifting their range and invading temperate areas. Our results show that over-grazing by these fish can have a profound impact, leading to kelp deforestation and barren reefs. "This is the first study demonstrating that the effects of warming in kelp forests are two-fold: higher temperatures not only have a direct impact on seaweeds, they also have an indirect impact by increasing the appetite of fish consumers, which can devour these seaweeds to the point of completely denuding the ocean floor. "Increases in the number of plant-eating fish because of warming pose a significant threat to kelp-dependent ecosystems both in Australia and around the globe," she says. The study is published in the journal Proceedings of the National Academy of Sciences. The team recorded underwater video around August each year at 12 sites along a 25 kilometre stretch of coast adjacent to the Solitary Island Marine Park off northern NSW. During this period, kelp disappeared completely from all study sites where it was initially present. At the same time the proportion of tropical and sub-tropical seaweed-eating fish swimming in these areas more than tripled. Grazing also intensified, with the proportion of kelp with obvious feeding marks on it increasing by a factor of seven during the decade.
"We also carried out an experiment where we transplanted kelp onto the sea floor. We found that two warm-water species - rabbitfish and drummer fish - were the most voracious, eating fronds within hours at an average rate of 300 bites per hour" says Dr Vergés. "The number of fish that consumed the smaller algae growing on rock surfaces also increased, and they cleared the algae faster when there was no kelp present. This suggests the fish may help prevent kelp regrowing as well, by removing the tiny new plants." In Australia, kelp forests support a range of commercial fisheries, tourism ventures, and recreation activities worth more than $10 billion per year. "The decline of kelp in temperate areas could have major economic and management impacts," says Dr Vergés. The video footage used in the study from 2002 onwards was originally collected for a very different research project - to measure fish populations inside and outside sanctuary zones in a marine park. But the team realised it could also be used to determine whether kelp was present in the background or not. This unplanned use of an historic dataset is a good example of the value of collecting long-term data in the field, especially if it includes video or photos for permanent records. The team behind the study includes Professor Peter Steinberg, director of the Sydney Institute of Marine Science (SIMS), Dr Ezequiel Marzinelli and Dr Alexandra Campbell, also from UNSW and SIMS, Dr Christopher Doropoulos from CSIRO, and other researchers from the University of Queensland, the University of Sydney, the NSW Department of Primary Industries, James Cook University, Centre for Advanced Studies in Blanes Spain, and Nanyang Technical University in Singapore.


WEST PALM BEACH, FL--(Marketwired - November 29, 2016) - Organizers of the Equine World Stem Cell Summit (EWSCS) are pleased to announce a partnership with the North American Veterinary Regenerative Medicine Association (NAVRMA). The Equine World Stem Cell Summit will be held as a dedicated showcase track of the esteemed World Stem Cell Summit on December 7-9 at the Palm Beach County Convention Center in West Palm Beach, FL. Riders, owners, trainers, veterinarians, and more are welcome to attend and learn more about this exciting and wide-ranging topic. Bernie Siegel, Founder & Chair of the World Stem Cell Summit, stated, "We are excited to partner with NAVRMA. They are a committee of some of the most respected research scientists and veterinary practitioners in the industry, and they share our mission to accelerate regenerative medicine to improve health and deliver cures, whether for humans or animals." The EWSCS will welcome Dr. Alan Nixon, Chair of the Board of Directors at NAVRMA, as a speaker at the summit. Dr. Nixon is the Director of the Comparative Orthopaedics Laboratory at Cornell University. Dr. Nixon obtained his veterinary degree from the University of Sydney in 1978 and completed a surgical residency and research degree at Colorado State University in 1983. After five years in the Department of Surgical Sciences at the University of Florida, Dr. Nixon moved to New York in 1988, where he is currently a professor in the Department of Clinical Sciences at Cornell University. Dr. Nixon's research includes joint pathobiology and cartilage repair with growth factor gene-enhanced chondrocyte and stem cell transplantation techniques, genetic characterization of OCD in animals and man using microarray expression studies, and clinical application of growth factor recombinant proteins and gene therapy for improved joint, tendon, and bone repair. 
"We are excited to participate in the Equine World Stem Cell Summit and believe NAVRMA and EWSCS is a natural partnership," said Dr. Nixon. "We encourage professional improvement and the exchange of knowledge and ideas among people interested in veterinary regenerative medicine. The summit is the perfect place to share information and encourage learning not only for veterinarians and researchers, but for interested owners, riders, trainers, and breeders in the equine industry." Throughout the three-day conference, produced by the non-profit Regenerative Medicine Foundation, attendees will hear from industry-leading veterinarians and researchers. As the single conference uniting the global stem cell community, the WSCS provides a platform for the equine community to interact with leading researchers and institutions, as well as industry, investor, and philanthropic groups. The conference attracts approximately 1,000 attendees from 40 countries with 225 speakers and program participants. Registration for the Equine World Stem Cell Summit is $500 for the three-day track. Sign up online and use the code "EQUINERM" today! To learn more about the 12th Annual World Stem Cell Summit and the Equine World Stem Cell Summit and to find out how you can get involved as a sponsor or attendee, visit www.worldstemcellsummit.com or email Alan Fernandez at alan@regmedfoundation.org. The Equine World Stem Cell Summit is a dedicated track of the World Stem Cell Summit, to be held December 6-9 at the Palm Beach County Convention Center in West Palm Beach, FL. Throughout the conference, leading scientists and veterinarians will present fellow researchers, veterinarians, owners, trainers, riders, and interested public with the latest information on the regenerative medicine that is transforming the care and treatment of horses. 
The World Stem Cell Summit, produced by the Regenerative Medicine Foundation, strives to unite, educate, and empower the global stem cell and regenerative medicine communities and to create a supportive environment for the field. The principal organizing partners for the 2016 World Stem Cell Summit include the Regenerative Medicine Foundation, Mayo Clinic, Kyoto University Institute for Integrated Cell Material Science, Center for Advancement of Science in Space (CASIS), Wake Forest Institute for Regenerative Medicine, Interdisciplinary Stem Cell Institute at the University of Miami Miller School of Medicine, Nova Southeastern University, CCRM and the Cure Alliance. To learn more, visit www.worldstemcellsummit.com.


News Article | December 19, 2016
Site: www.marketwired.com

Robert Pizzari, Anandh Maistry and Jane Bounds join Trustwave to drive further growth in cybersecurity and managed security services across the Asia-Pacific Region SYDNEY, AUSTRALIA--(Marketwired - December 19, 2016) - Trustwave today announced the appointment of Robert Pizzari as Senior Vice President of Asia-Pacific Sales, Anandh Maistry as Vice President of Sales for Australia and New Zealand, and Jane Bounds as its first Director of Asia-Pacific Marketing. The new executive appointments represent Trustwave's increased focus on driving growth in the Asia-Pacific region. Pizzari is responsible for all Asia-Pacific sales activity at Trustwave. Maistry is responsible for sales in Australia and New Zealand and reports into Pizzari. Bounds is responsible for all Trustwave marketing activities in Asia-Pacific including marketing with and through Trustwave telecommunications and channel partners in the region. According to Gartner's Worldwide Information Security Forecast 2Q16 Update, the Asia-Pacific cybersecurity market is expected to grow from $17.2B in 2016 to $23.2B in 2020. Trustwave Executive Vice President of Global Sales Dave Feringa said, "Trustwave has a massive opportunity in Asia-Pacific given our connection with Singtel and its affiliates. Robert and Anandh bring a solid set of sales leadership skills to the company, and I look forward to their helping expand our business delivering cybersecurity and managed security services across Asia-Pacific." Trustwave Chief Marketing Officer Steve Kelley said, "Jane is a highly experienced marketer with experience encompassing all aspects of marketing, using multiple touch points to create brand momentum. Her passion for strong engagement with sales leaders and the market, to identify the 'sweet spot' for opportunity, will benefit Trustwave as we continue to execute on our strategy to become the premier cybersecurity and managed security services provider across the Asia-Pacific market." 
Pizzari joins Trustwave from F5 Networks, where he was most recently Managing Director of Australia and New Zealand. He also held Asia-Pacific sales leadership roles at F5 and Cisco. He received a bachelor's degree from Victoria University. He is based in Singapore. With more than 25 years of IT industry experience, Maistry joins Trustwave from Citrix Systems, where he was Senior Director for Sales responsible for defining, building and driving a growth strategy. Prior to Citrix, Maistry worked for Oracle, where he was Vice President for its Systems Business in Asia, and Cisco, where he was Managing Director for the Service Provider Organization. He studied computer science at UTS Sydney. He is based in Sydney. Bounds joins Trustwave from IBM Australia and New Zealand, where she was most recently Market Executive for the IBM Systems business unit. She held a number of marketing leadership roles at IBM spanning IBM hardware, as well as software including Tivoli and security, data management and Lotus. She received a bachelor's degree in computer science from the University of Sydney. She is based in Sydney. About Trustwave Trustwave helps businesses fight cybercrime, protect data and reduce security risk. With cloud and managed security services, integrated technologies and a team of security experts, ethical hackers and researchers, Trustwave enables businesses to transform the way they manage their information security and compliance programs. More than three million businesses are enrolled in the Trustwave TrustKeeper® cloud platform, through which Trustwave delivers automated, efficient and cost-effective threat, vulnerability and compliance management. Trustwave is headquartered in Chicago, with customers in 96 countries. For more information about Trustwave, visit https://www.trustwave.com. All trademarks used herein remain the property of their respective owners.
Their use does not indicate or imply a relationship between Trustwave and the owners of such trademarks.


News Article | November 15, 2016
Site: www.prweb.com

The search is on for the 2016 International Brand Master, an exemplary educational marketing and branding professional, who will be recognized in the eighth annual International Brand Master Award competition sponsored by Educational Marketing Group (EMG), a premier U.S. brand development and marketing agency. Educational marketing professionals may nominate a colleague or themselves now through Friday, January 13, 2017, on EMG's website at http://emgonline.com/ibm-award/past-awards/2016-master/. "This is an excellent opportunity to bring attention to hard-working education brand marketing professionals who do amazing work," said Bob Brock, president of EMG. International Brand Master Award nominees must be educational marketing professionals who: A top-notch international panel of judges composed of EMG senior consultants, prior awardees, and several experienced volunteers from the profession carefully reviews the credentials and achievements of each nominee. This panel selects up to three finalists based on their credentials and achievements. Marketing practitioners from around the world then have the opportunity to vote for their candidate of choice from among the finalists. The scores of the select panel of judges and the public votes are combined to determine the International Brand Master. This year's winner will join an elite club of previous International Brand Masters: Since the creation of the award in 2009, nominations have come in from across the U.S., Belgium, the Netherlands, Australia, the United Kingdom, and Thailand. Over the last six years, highly successful volunteer judges in the educational marketing field have hailed from the U.S., Finland, Canada, England, Ireland, and South Africa. And during public voting on the finalists over the last six years, over 8,000 votes were cast worldwide via EMG's website, and on Facebook, Google+, and Twitter.
International Brand Master Award Background The International Brand Master award was established in 2009. In 2015, EMG received nominations from the United Kingdom, the United States, Australia and Portugal. A blue-ribbon panel of volunteer international judges reviewed supporting materials provided by the nominees. The judges narrowed the pool of seven nominees to two from the United States, Shelly Brenckman, Texas A&M University, and Katie Kempf, Ursuline Academy, and one from Australia, Johanna Lowe, University of Sydney. The three finalists were each asked to provide a 500-word statement and three URLs, which were then voted on by professional marketers from around the world. Over the previous seven years, over 9,000 votes from around the world were cast. The winner was chosen by a combination of public votes and the EMG panel of judges. For more information see: http://emgonline.com/ibm-award/. Educational Marketing Group Background EMG is a full-service, integrated brand development and marketing agency that provides custom-tailored research, brand development, creative development, new media services, professional development and media services for universities throughout North America. Headquartered in Denver, the company was established in 1997 and has operated in the education arena exclusively for 18 years. Clients have included North Carolina State University, Old Dominion University, University of Mary Washington, Washington State University, Virginia Tech, Cal Poly, Dalhousie University, University of Victoria, University of Colorado, University of Illinois, University of Michigan, University of Wyoming, and many others. Information: http://www.emgonline.com.


News Article | March 22, 2016
Site: www.scientificcomputing.com

With enough computing effort, most contemporary security systems will be broken. But a research team at the University of Sydney has made a major breakthrough in generating single photons (light particles) as carriers of quantum information in security systems. The collaboration, involving physicists at the Centre for Ultrahigh bandwidth Devices for Optical Systems (CUDOS), an ARC Centre of Excellence headquartered in the School of Physics, and electrical engineers from the School of Electrical and Information Engineering, has been published in Nature Communications. The team's work resolved a key issue holding back the development of secure password exchange that can only be broken by violating the laws of physics. Photons are generated in pairs, and detecting one indicates the existence of the other. This allows scientists to manage the timing of photon events so that they always arrive at the time they are expected. Lead author Dr. Chunle Xiong, from the School of Physics, said: "Quantum communication and computing are the next generation technologies poised to change the world." "Among a number of quantum systems, optical systems offer particularly easy access to quantum effects. Over the past few decades, many building blocks for optical quantum information processing have developed quickly," Xiong said. "Implementing optical quantum technologies has now come down to one fundamental challenge: having indistinguishable single photons on-demand," he said. "This research has demonstrated that the odds of being able to generate a single photon can be doubled by using a relatively simple technique — and this technique can be scaled up to ultimately generate single photons with 100 percent probability." CUDOS director and co-author of the paper, Professor Ben Eggleton, said the interdisciplinary research was set to revolutionize our ability to exchange data securely — along with advancing quantum computing, which can search large databases exponentially faster.
"The ability to generate single photons, which form the backbone of technology used in laptops and the Internet, will drive the development of local secure communications systems — for safeguarding defense and intelligence networks, the financial security of corporations and governments, and bolstering personal electronic privacy, like shopping online," Professor Eggleton said. "Our demonstration leverages the CUDOS photonic chip that we have been developing over the last decade, which means this new technology is also compact and can be manufactured with existing infrastructure." Co-author and Professor of Computer Systems Philip Leong, who developed the high-speed electronics crucial for the advance, said he was particularly excited by the prospect of further exploring the marriage of photonics and electronics to develop new architectures for quantum problems. "This advance addresses the fundamental problem of single photon generation — and promises to revolutionize research in the area," Professor Leong said. The group — which is now exploring advanced designs and expects real-world applications within three to five years — has involved research with the University of Melbourne, CUDOS nodes at Macquarie University and the Australian National University, and an international collaboration with Guangdong University of Technology, China.
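The "relatively simple technique" for doubling the odds of a single photon can be understood as multiplexing heralded pair sources: fire several probabilistic sources in parallel, and let a switch route the output of whichever source's herald fired onto a single channel. The Monte Carlo sketch below is illustrative only (the probabilities and function names are hypothetical, not parameters of the CUDOS chip):

```python
import random

def heralded_source(p):
    """One pump pulse of a probabilistic pair source: returns True if a
    photon pair is generated (i.e. the herald photon is detected)."""
    return random.random() < p

def multiplexed(p, n_sources):
    """Fire n probabilistic sources in parallel; a switch routes the
    output of any source whose herald fired onto a single channel."""
    return any(heralded_source(p) for _ in range(n_sources))

def success_rate(p, n_sources, trials=200_000):
    """Monte Carlo estimate of the single-photon delivery probability."""
    return sum(multiplexed(p, n_sources) for _ in range(trials)) / trials

random.seed(0)
p = 0.05  # per-pulse pair-generation probability (illustrative)
# Two multiplexed sources succeed with probability 1 - (1 - p)**2,
# roughly 2*p for small p, i.e. about double the single-source odds.
print(success_rate(p, 1), success_rate(p, 2))
```

Increasing `n_sources` pushes the delivery probability toward 1, which is the sense in which the technique "can be scaled up to ultimately generate single photons with 100 percent probability."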


News Article | February 5, 2016
Site: news.mit.edu

Elastin is a crucial building block in our bodies. Its flexibility allows skin to stretch and twist, blood vessels to expand and relax with every heartbeat, and lungs to swell and contract with each breath. But exactly how this protein-based tissue assembles itself to achieve this flexibility remained an unsolved question — until now. This material has a remarkable combination of flexibility and durability: Elastin is one of the body’s most long-lasting component proteins, with an average survival time comparable to a human lifespan. During that time, the elastin in a blood vessel, for example, will have gone through an estimated 2 billion cycles of pulsation. A team of researchers at MIT, in Australia, and in the U.K. has carried out an analysis that reveals the details of a hierarchical structure of scissor-shaped molecules that gives elastin its remarkable properties. The findings were published this week in the journal Science Advances, in a paper by postdoc Giselle Yeo and professor Anthony Weiss of the University of Sydney, Australia; MIT graduate student Anna Tarakanova and McAfee Professor of Engineering Markus Buehler; and two others. Elastin tissues are made up of molecules of a protein called tropoelastin, which are strung together in a chain-like structure, and which Weiss and his team have been studying in the lab for many years. In this work, they collaborated with Buehler and Tarakanova at MIT, who have specialized in determining the molecular structure of biological materials through highly detailed atomic-scale modeling. Combining the computational and laboratory approaches provided insights that neither method could have yielded alone, team members say. 
While the study of elastin has been going on for a long time, Weiss says “this particular paper is exciting for us on three levels.” First, thanks to synchrotron imaging done by team member Clair Baldock at the University of Manchester in the U.K., the research revealed the shape and structure of the basic tropoelastin molecules. But these were snapshots — still images that could not illuminate the complex dynamics of the material as it forms large structures that can stretch and rebound. Those dynamic effects were revealed through the combination of computer modeling and laboratory work. “It’s really by combining forces with these three groups” that the details were pieced together, Weiss says. Tarakanova explains that in Buehler’s lab, “we use modeling to study materials at different length scales, and for elastin, that is very useful, because we can study details at the submolecular scale and build up to the scale of a single molecule.” By examining the relationship of structure across these different scales, “we could predict the dynamics of the molecule.” The dynamics turned out to be complex and surprising, Weiss says. “It’s almost like a dance the molecule does, with a scissor twist, like a ballerina,” with legs opening and closing repeatedly. Then, the scissor-like appendages of one molecule naturally lock onto the narrow end of another molecule, like one ballerina riding piggyback on top of the next. This process continues, building up long, chain-like structures. These long chains weave together to produce the flexible tissues that our lives depend on — including skin, lungs, and blood vessels. These structures “assemble very rapidly,” Weiss says, and this new research “helps us understand this assembly process.” A key part of the puzzle was the flexibility of the molecule itself, which the team found was controlled by the structure of key regions and the overall shape of the protein. 
The material has a combination of regions that are highly ordered and regions that are disordered. The disordered regions help to provide flexibility, while the ordered regions confer longevity. The team tested the way this flexibility comes about by genetically modifying the protein and comparing the characteristics of the modified and natural versions. They revived a short segment of the elastin gene that has become dormant in humans, which changes part of the protein’s configuration. They found that even though the changes were minor and only affected one part of the structure, the results were dramatic. The modified version had a stiff region that altered the molecule’s movements and weakened it. This helped to confirm that certain specific parts of the molecule, including one with a helical structure, were essential contributors to the material’s natural flexibility. That insight in itself could prove useful medically, the team says, as it could explain why blood vessels become weakened in people with certain disease conditions, perhaps as a result of a mutation in that gene. While the findings specifically relate to one particular protein and the tissues it forms, the team says the research may help in understanding a variety of other flexible biological tissues and how they work. “The integration of experiment and modeling in identifying how the molecular structure endows materials with exceptional durability and elasticity, and studying how these materials fail under extreme conditions yields important insights for the design of new materials that replace those in our body, or for materials that we can use in engineering applications in which durable materials are critical,” says Buehler, who is head of the MIT Department of Civil and Environmental Engineering. 
“We are excited about the new opportunities that arise from this collaboration and the potential for future work,” Buehler says, “because designing materials that last for many decades without breaking down is a major engineering challenge that nature has beautifully accomplished, and on which we hope to build.” “This is fascinating work,” says Chwee Tak Lim, professor of biomedical engineering at the National University of Singapore, who was not involved in this research. Lim says “I believe this work is significant in that it not only enables us to better understand the requisite conditions for the formation of ‘healthy’ elastins, whether in our body or in producing them for biomaterial applications, but also provides insights into certain tissue dysfunctions arising from elastin mutations.” The research team, which also included Steven Wise of the Heart Research Institute in Sydney, was supported, in part, by grants from the Australian Research Council; the National Institutes of Health; the Wellcome Trust; the Biotechnology and Biological Sciences Research Council, UK; and the U.S. Office of Naval Research.


News Article | October 27, 2016
Site: www.eurekalert.org

Washington, DC -- Men can take birth control shots to prevent pregnancy in their female partners, according to a new study published in the Endocrine Society's Journal of Clinical Endocrinology & Metabolism. Researchers are still working to perfect the combination of hormonal contraceptives to reduce the risk of mild to moderate side effects, including depression and other mood disorders. While women can choose from a number of birth control methods, men have few options to control their own fertility. Available methods for men include condoms, vasectomies and withdrawal. Better birth control options are needed for men. In 2012, 40 percent of all pregnancies worldwide were unintended, according to the Guttmacher Institute. "The study found it is possible to have a hormonal contraceptive for men that reduces the risk of unplanned pregnancies in the partners of men who use it," said one of the study's authors, Mario Philip Reyes Festin, MD, of the World Health Organization in Geneva, Switzerland. "Our findings confirmed the efficacy of this contraceptive method previously seen in small studies." The prospective Phase II single-arm, multi-center study tested the safety and effectiveness of injectable contraceptives in 320 healthy men ages 18 to 45. The participants had all been in monogamous relationships with female partners between the ages of 18 and 38 for at least a year. The men underwent testing to ensure they had a normal sperm count at the start of the study. The men received injections of 200 milligrams of a long-acting progestogen called norethisterone enanthate (NET-EN) and 1,000 milligrams of a long-acting androgen called testosterone undecanoate (TU) for up to 26 weeks to suppress their sperm counts. Healthcare professionals gave the men two injections every eight weeks. Participants initially provided semen samples after eight and 12 weeks in the suppression phase and then every two weeks until they met the criteria for the next phase.
During this time, the couples were instructed to use other non-hormonal birth control methods. Once a participant's sperm count had fallen below 1 million/ml in two consecutive tests, the couple was asked to rely on the injections for birth control. During this period, known as the efficacy phase of the study, the men continued to receive injections every eight weeks for up to 56 weeks. Participants provided semen samples every eight weeks to ensure their sperm counts stayed low. Once the participants stopped receiving the injections, they were monitored to see how quickly their sperm counts recovered. The hormones were effective in reducing the sperm count to 1 million/ml or less within 24 weeks in 274 of the participants. The contraceptive method was effective in nearly 96 percent of continuing users; only four pregnancies occurred among the men's partners during the efficacy phase of the study. Researchers stopped enrolling new participants in the study in 2011 due to the rate of adverse events, particularly depression and other mood disorders, reported by the participants. The men reported side effects including injection site pain, muscle pain, increased libido and acne. Twenty men dropped out of the study due to side effects. Despite the adverse effects, more than 75 percent of participants reported being willing to use this method of contraception at the conclusion of the trial. Of the 1,491 reported adverse events, nearly 39 percent were found to be unrelated to the contraceptive injections. These included one death by suicide, which was assessed as not related to the use of the drug. Serious adverse events that were assessed as probably or possibly related to the study included one case of depression, one intentional overdose of acetaminophen, and one case of a man who experienced an abnormally fast and irregular heartbeat after he stopped receiving the injections.
"More research is needed to advance this concept to the point that it can be made widely available to men as a method of contraception," Festin said. "Although the injections were effective in reducing the rate of pregnancy, the combination of hormones needs to be studied more to consider a good balance between efficacy and safety." The study, "Efficacy and Safety of an Injectable Combination Hormonal Contraceptive for Men," will be published online at http://press. , ahead of print. Other authors of the study include: Hermann M. Behre of Martin Luther University of Halle-Wittenberg in Halle, Germany; Michael Zitzmann of the University of Münster in Münster, Germany; Richard A. Anderson of The University of Edinburgh in Edinburgh, United Kingdom; David J. Handelsman of the University of Sydney and Concord Hospital in Sydney, Australia; Silvia W. Lestari of the University of Indonesia in Jakarta, Indonesia; Robert I. McLachlan of Monash Medical Centre in Melbourne, Australia; M. Cristina Meriggiola of the University of Bologna in Bologna, Italy; Man Mohan Misro of the National Institute of Health & Family Welfare in New Delhi, India; Gabriela Noe of the Instituto Chileno de Medicina Reproductiva in Santiago, Chile; Frederick C. W. Wu of Manchester Royal Infirmary in Manchester, U.K.; Ndema A. Habib and Kirsten M. Vogelsong of the World Health Organization in Geneva, Switzerland; and Marianne M. Callahan, Kim A. Linton and Doug S. Colvard of CONRAD, Eastern Virginia Medical School, a reproductive health organization based in Arlington, VA. The research was co-sponsored and funded by the UNDP/UNFPA/UNICEF/WHO/World Bank Special Program of Research, Development, and Research Training in Human Reproduction in Geneva, Switzerland, and CONRAD (using funding from the Bill & Melinda Gates Foundation and the U.S. Agency for International Development). The injectable hormones were provided by Schering AG, which has since merged with Bayer Pharma AG.
For more information on men's health, visit the Endocrine Society's centennial website. Endocrinologists are at the core of solving the most pressing health problems of our time, from diabetes and obesity to infertility, bone health, and hormone-related cancers. The Endocrine Society is the world's oldest and largest organization of scientists devoted to hormone research and physicians who care for people with hormone-related conditions. The Society, which is celebrating its centennial in 2016, has more than 18,000 members, including scientists, physicians, educators, nurses and students in 122 countries. To learn more about the Society and the field of endocrinology, visit our site at http://www. . Follow us on Twitter at @TheEndoSociety and @EndoMedia.


News Article | December 8, 2016
Site: www.eurekalert.org

A child mummy from the 17th century, found in a crypt underneath a Lithuanian church, was discovered to harbor the oldest known sample of the variola virus that causes smallpox. Researchers who sequenced the virus say it could help answer lingering questions about the history of smallpox, including how recently it appeared in humans (perhaps more recently than we thought) and when specific evolutionary events occurred. Their study appears December 8 in Current Biology. "There have been signs that Egyptian mummies that are 3,000 to 4,000 years old have pockmarked scarring that have been interpreted as cases of smallpox," says first author Ana Duggan, a postdoctoral fellow at the McMaster University Ancient DNA Center in Canada. "The new discoveries really throw those findings into question, and they suggest that the timeline of smallpox in human populations might be incorrect." The research team gathered the disintegrated variola virus DNA from the mummy after obtaining permission from the World Health Organization. Using RNA baits designed from existing variola sequences, the researchers targeted variola sequences found within the extracted DNA from the mummy's skin. Then they reconstructed the entire genome of the ancient strain of the virus and compared it to versions of the variola virus genome dating from the mid-1900s and before its eradication in the late 1970s. They concluded that these samples shared a common viral ancestor that originated sometime between 1588 and 1645--dates that coincide with a period of exploration, migration, and colonization that would have helped spread smallpox around the globe. "So now that we have a timeline, we have to ask whether the earlier documented historical evidence of smallpox, which goes back to Ramses V and includes everything up to the 1500s, is real," says co-author Henrik Poinar, the director of the Ancient DNA Centre at McMaster University in Canada. 
"Are these indeed real cases of smallpox, or are these misidentifications, which we know are very easy to make, because it is possible to mistake smallpox for chickenpox or measles?" In addition to providing a more accurate timeline for the evolution of smallpox, the researchers were also able to identify distinct periods of viral evolution. One of the clearest instances of this occurred around the time that Edward Jenner famously developed his vaccine against the virus in the 18th century. During this period, the variola virus appears to have split into two strains, variola major and variola minor, which suggests that vaccination, which led to the eradication of smallpox, may have changed the selection pressures acting on the virus and caused it to split into two strains. The researchers hope to use this work to identify how the sample they discovered in Lithuania compares to others that were sweeping through other countries in Europe at the same time. But in the bigger context of smallpox research, the scientists are optimistic that their work will provide a stepping stone allowing virologists to continue to trace smallpox and other DNA viruses back through time. "Now we know all the evolution of the sampled strains dates from 1650, but we still don't know when smallpox first appeared in humans, and we don't know what animal it came from, because we don't have any older historical samples to work with," says co-author Edward Holmes, a professor at the University of Sydney in Australia. "So this does put a new perspective on this very important disease, but it's also showing us that our historical knowledge of viruses is just the tip of the iceberg."
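The dating behind the 1588-1645 window rests on molecular-clock reasoning: given a substitution rate, the genetic distance between sampled genomes can be converted into years back to their common ancestor. The toy strict-clock calculation below only illustrates the arithmetic; all numbers are hypothetical (chosen to be roughly poxvirus-like), and the study's actual analysis used far more sophisticated methods on many genomes:

```python
def tmrca_year(sample_year, pairwise_diffs, genome_length, rate):
    """Toy strict-clock estimate: two genomes sampled in the same year
    each sit distance/2 of evolution from their common ancestor, so
    TMRCA = sample_year - distance / (2 * rate)."""
    distance = pairwise_diffs / genome_length  # substitutions per site
    return sample_year - distance / (2 * rate)

# Hypothetical numbers for illustration only: a ~186 kb genome, a clock
# rate of 1e-5 substitutions/site/year, and 1,200 differing sites
# between two genomes both sampled in 1975.
print(round(tmrca_year(1975, 1200, 186_000, 1e-5)))  # a mid-1600s date
```

The point of the sketch is only that modest genetic distances, divided by a per-year rate, place the sampled strains' ancestor centuries (not millennia) in the past, which is why the result challenges the much older Egyptian-mummy interpretations.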
This work was supported by the McMaster Ancient DNA Centre at McMaster University, the Department of Virology at the University of Helsinki, the Department of Anatomy, Histology and Anthropology at Vilnius University, the Marie Bashir Institute for Infectious Diseases and Biosecurity, the Department of Biochemistry and Molecular Science and Biotechnology at the University of Melbourne, the Department of History at Duke University, the Department of Biology at McMaster University, UC Irvine, the Mycroarray in Michigan, the Department of Chemical Engineering at the University of Michigan, the Center for Microbial Genetics and Genomics at Northern Arizona University, the Laboratoire d'Anthropologie Biologique Paul Broca at the PSL Research University, Helsinki University Hospital, the Department of Forensic Medicine at the University of Helsinki, the Department of Pathology at the University of Cambridge, the Michael G. DeGroote Institute for Infectious Disease Research at McMaster University and the Humans & the Microbiome Program at the Canadian Institute for Advanced Research. Current Biology, Duggan, Marciniak, Poinar, Emery, Poinar et al: "17th Century Variola Virus Reveals the Recent History of Smallpox" http://www.cell.com/current-biology/fulltext/S0960-9822(16)31324-0 Current Biology (@CurrentBiology), published by Cell Press, is a bimonthly journal that features papers across all areas of biology. Current Biology strives to foster communication across fields of biology, both by publishing important findings of general interest and through highly accessible front matter for non-specialists. Visit: http://www. . To receive Cell Press media alerts, contact press@cell.com.


News Article | January 4, 2016
Site: phys.org

An international group of astronomers led by the University of Sydney has discovered that strong magnetic fields are common in stars, not rare as previously thought, a finding that will dramatically impact our understanding of how stars evolve. Using data from NASA's Kepler mission, the team found that stars only slightly more massive than the Sun have internal magnetic fields up to 10 million times that of the Earth, with important implications for the evolution and ultimate fate of stars. "This is tremendously exciting, and totally unexpected," said lead researcher, astrophysicist Associate Professor Dennis Stello from the University of Sydney. "Because only 5-10 percent of stars were previously thought to host strong magnetic fields, current models of how stars evolve lack magnetic fields as a fundamental ingredient," Associate Professor Stello said. "Such fields have simply been regarded as insignificant for our general understanding of stellar evolution. Our result clearly shows this assumption needs to be revisited." The findings are published today in the journal Nature. The research is based on previous work led by the California Institute of Technology (Caltech), involving Associate Professor Stello, which found that measurements of stellar oscillations, or sound waves, inside stars could be used to infer the presence of strong magnetic fields. This latest research used that result to look at a large number of evolved versions of our Sun observed by Kepler. More than 700 of these so-called red giants were found to show the signature of strong magnetic fields, with some of the oscillations suppressed by the force of the fields. "Because our sample is so big, we have been able to dig deeper into the analysis and can conclude that strong magnetic fields are very common among stars that have masses of about 1.5-2.0 times that of the Sun," Associate Professor Stello explained.
"In the past we could only measure what happens on the surfaces of stars, with the results interpreted as showing magnetic fields were rare." Using a new technique called asteroseismology, which can 'pierce through the surface' of a star, astronomers can now see the presence of a very strong magnetic field near the stellar core, which hosts the central engine of the star's nuclear burning. This is significant because magnetic fields can alter the physical processes that take place in the core, including internal rotation rates, which affects how stars grow old. Most stars like the Sun oscillate continuously because of sound waves bouncing back and forth inside them. "Their interior is essentially ringing like a bell," Associate Professor Stello said. "And like a bell, or a musical instrument, the sound they produce can reveal their physical properties." The team measured tiny brightness variations of stars caused by the ringing sound and found that certain oscillation frequencies were missing in 60 percent of the stars because they were suppressed by strong magnetic fields in the stellar cores. The results will enable scientists to test more directly theories of how magnetic fields form and evolve — a process known as magnetic dynamos — inside stars. This could potentially lead to a better general understanding of magnetic dynamos, including the dynamo controlling the Sun's 22-year magnetic cycle, which is known to affect communication systems and cloud cover on Earth. "Now it is time for the theoreticians to investigate why these magnetic fields are so common," Associate Professor Stello concluded. More information: Dennis Stello et al. A prevalence of dynamo-generated magnetic fields in the cores of intermediate-mass stars, Nature (2016). DOI: 10.1038/nature16171
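The "missing frequencies" signature shows up in a star's power spectrum: the light curve is a sum of oscillation modes plus noise, and a strong core field drains power from certain modes, leaving a dip where a peak is expected. The sketch below is a minimal illustration with arbitrary frequencies, amplitudes and noise, not Kepler data:

```python
import numpy as np

# Toy light curve: two oscillation modes; in the "magnetic" star the
# second mode is strongly suppressed (illustrative amplitudes only).
t = np.linspace(0, 100, 4096)   # time axis, arbitrary units
f1, f2 = 0.5, 0.8               # mode frequencies, cycles per time unit

def light_curve(suppressed_amp):
    rng = np.random.default_rng(1)  # fixed seed: same noise each call
    return (np.sin(2 * np.pi * f1 * t)
            + suppressed_amp * np.sin(2 * np.pi * f2 * t)
            + 0.1 * rng.normal(size=t.size))

def mode_power(signal, f):
    """Spectral power in the FFT bin nearest frequency f."""
    freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
    power = np.abs(np.fft.rfft(signal)) ** 2
    return power[np.argmin(np.abs(freqs - f))]

normal = light_curve(suppressed_amp=1.0)
magnetic = light_curve(suppressed_amp=0.1)
# The f2 peak nearly vanishes in the "magnetic" spectrum while the
# f1 peak is untouched:
print(mode_power(magnetic, f2) / mode_power(normal, f2))
```

Real asteroseismic analyses compare observed mode amplitudes against the expectation for a non-magnetic star in just this way, flagging stars whose dipole-mode power is anomalously low.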


Five new JAMA and JAMA Internal Medicine studies published online compare a variety of health outcomes in men with low testosterone who used testosterone. Four of the five testosterone-related studies are from the Testosterone Trials, a group of placebo-controlled, coordinated trials designed to determine the efficacy of testosterone gel use by men 65 or older with low testosterone for no apparent reason other than age. The studies examined the health outcomes of memory and cognitive function, bone density, coronary artery plaque volume and anemia. A fifth study, which was not part of the Testosterone Trials, examined the association of testosterone replacement therapy with cardiovascular outcomes. In this study, researchers tested if treating older men with low testosterone with a testosterone gel for a year would slow the progression of coronary artery plaque volume compared with a placebo gel. The study included 138 men (73 who received testosterone gel and 65 who received placebo gel). Findings: Among the men, using testosterone gel for one year compared with placebo gel increased the amount of coronary artery noncalcified plaque, an early sign of increased risk of heart problems. Larger studies are needed to understand the clinical implications of this finding. Authors: Peter J. Snyder, M.D., of the University of Pennsylvania, Philadelphia, and colleagues as part of the Testosterone Trials. To place an electronic embedded link to this study in your story: This link will be live at the embargo time: http://jamanetwork. Researchers also wanted to know if older men with low testosterone who used testosterone gel for one year compared with placebo gel improved their memory and cognitive function. Among 493 men with age-associated memory impairment (AAMI), 247 received testosterone gel and 246 received placebo for one year. Findings: Using testosterone gel for one year compared with placebo gel was not associated with improved memory or cognitive function. 
Authors: Peter J. Snyder, M.D., of the University of Pennsylvania, Philadelphia, and colleagues as part of the Testosterone Trials. In this study, researchers wanted to determine if older men with low testosterone and mild anemia could improve their anemia by using testosterone gel for one year. Of the 788 men enrolled in the Testosterone Trials, 126 were anemic at the start and, of those, 62 had anemia of known causes. Findings: Testosterone gel increased hemoglobin levels and corrected the anemia (of both known and unknown causes) in older men with low testosterone more than placebo gel. Authors: Peter J. Snyder, M.D., of the University of Pennsylvania, Philadelphia, and coauthors as part of the Testosterone Trials. Another question researchers examined was whether using testosterone gel would help older men with low testosterone improve their bone density and strength. This study included 211 men, of whom 110 received testosterone gel and 101 got the placebo gel. Findings: Using testosterone gel for one year by older men with low testosterone increased bone density and strength compared with placebo, more so in the spine than the hip and more so in trabecular bone than cortical-rich peripheral bone. Authors: Peter J. Snyder, M.D., of the University of Pennsylvania, Philadelphia, and coauthors as part of the Testosterone Trials. This study, which was not part of the Testosterone Trials, examined the association between testosterone replacement therapy (TRT) and cardiovascular outcomes in men 40 or older with low testosterone at Kaiser Permanente California.
The study, which was observational, included 8,808 men who were ever prescribed TRT given by injection, orally or topically. Findings: Among men with low testosterone, dispensed testosterone prescriptions were associated with a lower risk of cardiovascular outcomes over a median follow-up of about three years. Authors: T. Craig Cheetham, Pharm.D., M.S., of the Southern California Permanente Medical Group, Pasadena, and coauthors. Editor's Note: Please see the articles for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc. Media Advisory: To contact Peter J. Snyder, M.D., email Abbey Anderson at Abbey.Anderson@uphs.upenn.edu or call 215-349-8369. To contact T. Craig Cheetham, Pharm.D., M.S., email Vincent Staupe at vincent.p.staupe@kp.org or call 510-267-7364. Related video: There will be a JAMA Report video story on testosterone treatment and health outcomes for you to use in your report. Video will be available under embargo at this link and include broadcast-quality downloadable video files, B-roll, scripts and other images. Please contact JAMA Network Media Relations at mediarelations@jamanetwork.org if there are questions. Related materials: The JAMA editorial, "Testosterone and Male Aging," by David J.
Handelsman, M.B.B.S., Ph.D., F.R.A.C.P., F.A.H.M.S., of the University of Sydney and Concord Hospital, Sydney, Australia; the JAMA Internal Medicine editorial, "Further Elucidation of the Potential Benefits of Testosterone Therapy in Older Men," by Eric Orwoll, M.D., of the Oregon Health & Science University, Portland; and the JAMA Internal Medicine editorial, "Addressing Ethical Lapses in Research," by Bernard Lo, M.D., of the Greenwall Foundation, New York, and JAMA Internal Medicine deputy editor Deborah Grady, M.D., M.P.H., of the University of California, San Francisco, also are available on the For The Media website.


News Article | December 12, 2016
Site: www.rdmag.com

Researchers in Australia have developed the one diamond you won't see on an engagement ring anytime soon. The Australian National University has led an international project aimed at making a diamond harder than a jeweler's diamond and useful for cutting through ultra-solid materials on mining sites. Jodie Bradby, Ph.D., an associate professor at ANU, led a team to make nano-sized Lonsdaleite, a hexagonal diamond found in nature only at the site of meteorite impacts such as Canyon Diablo in the U.S. "This new diamond is not going to be on any engagement rings," Bradby said in a statement. "You'll more likely find it on a mining site — but I still think that diamonds are a scientist's best friend. "Any time you need a super-hard material to cut something, this new diamond has the potential to do it more easily and more quickly." Bradby's team included ANU Ph.D. student Thomas Shiell and experts from RMIT University, the University of Sydney and the United States. The team made the Lonsdaleite in a diamond anvil at 400 degrees Celsius, halving the temperature at which it can be formed in a laboratory. "The hexagonal structure of this diamond's atoms makes it much harder than regular diamonds, which have a cubic structure," Bradby said. "We've been able to make it at the nanoscale, and this is exciting because often with these materials 'smaller is stronger'." RMIT professor Dougal McCulloch, a co-researcher on the study, said the collaboration of world-leading experts in the field was crucial to the project's success. "The discovery of the nano-crystalline hexagonal diamond was only made possible by close collaborative ties between leading physicists from Australia and overseas, and the team utilized state-of-the-art instrumentation such as electron microscopes," he said in a statement. Prof. David McKenzie, a corresponding author from the University of Sydney, explained that he was working the night shift in a U.S.
laboratory as part of the research when he noticed a little shoulder on the side of a peak. “And it didn't mean all that much until we examined it later on in Melbourne and in Canberra, and we realized that it was something very, very different,” he said in a statement. Lonsdaleite is named after the British crystallographer Dame Kathleen Lonsdale, who was the first woman elected as a Fellow of the Royal Society.


News Article | December 6, 2016
Site: www.eurekalert.org

New Rochelle, NY, December 6, 2016--Legalization of cannabis for medical or recreational use is increasing in the U.S., and many experts and cannabis users alike agree that package warnings stating the health risks are needed. However, the warnings suggested by cannabis users are not necessarily the same as those proposed by medical experts, as shown in a new study published in Cannabis and Cannabinoid Research, a new peer-reviewed journal from Mary Ann Liebert, Inc., publishers. The article is available free on the Cannabis and Cannabinoid Research website. In the article "Cannabis Users' Recommended Warnings for Packages of Legally Sold Cannabis: An Australia-Centered Study," authors John Malouff and Caitlin Johnson of the University of New England and Sally Rooke of the University of Sydney, Australia, asked young adults who had used cannabis at least once to suggest a warning that governments could mandate on cannabis packages. Some youths in Australia view cannabis as potentially harmful, and many of their recommended warnings agreed with those of experts, particularly warnings related to the effects of cannabis on driving ability, mental health and psychological functioning, addiction/abuse risk, and long-term physical effects. However, the study participants also suggested some types of warnings not typically recommended by experts. "One of the many challenges created by legalization is how to package cannabis products," says Editor-in-Chief Daniele Piomelli, PhD, University of California-Irvine, School of Medicine. "This is no small problem: think how different a box of gummy bears and a bottle of medications look, and how this difference can influence use. We hope that this contribution will be the first of several examining this issue from different perspectives." 
Cannabis and Cannabinoid Research is the only peer-reviewed open access journal dedicated to the scientific, medical, and psychosocial exploration of clinical cannabis, cannabinoids, and the biochemical mechanisms of endocannabinoids. Led by Editor-in-Chief Daniele Piomelli, PhD, the Journal publishes a broad range of human and animal studies including basic and translational research; clinical studies; behavioral, social, and epidemiological issues; and ethical, legal, and regulatory controversies. Complete information is available on the Cannabis and Cannabinoid Research website. Mary Ann Liebert, Inc., publishers is a privately held, fully integrated media company known for establishing authoritative peer-reviewed journals in many promising areas of science and biomedical research, including Journal of Medicinal Food, The Journal of Alternative and Complementary Medicine, and Journal of Child and Adolescent Psychopharmacology. Its biotechnology trade magazine, GEN (Genetic Engineering & Biotechnology News), was the first in its field and is today the industry's most widely read publication worldwide. A complete list of the firm's 80 journals, books, and newsmagazines is available on the Mary Ann Liebert, Inc., publishers website.


News Article | October 25, 2016
Site: www.biosciencetechnology.com

Increased muscle strength leads to improved brain function in adults with mild cognitive impairment (MCI), new results from a recent trial led by the University of Sydney have revealed. With 135 million people forecast to suffer from dementia in 2050, the study's findings--published today in the Journal of the American Geriatrics Society--have implications for the type and intensity of exercise that is recommended for our growing ageing population. Mild cognitive impairment describes people who have noticeably reduced cognitive abilities, such as reduced memory, but are still able to live independently; it is a precursor to Alzheimer's disease. Findings from the Study of Mental and Resistance Training (SMART) trial show, for the first time, a positive causal link between muscle adaptations to progressive resistance training and the functioning of the brain among those over 55 with MCI. The trial was conducted in collaboration with the Centre for Healthy Brain Ageing (CHeBA) at the University of New South Wales and the University of Adelaide. "What we found in this follow-up study is that the improvement in cognitive function was related to their muscle strength gains," said lead author Dr Yorgi Mavros, from the Faculty of Health Sciences at the University of Sydney. "The stronger people became, the greater the benefit for their brain." SMART was a randomised, double-blind trial involving 100 community-dwelling adults with MCI, aged between 55 and 86, who were divided into four activity groups. Participants doing resistance exercise were prescribed weight-lifting sessions twice a week for six months, working to at least 80 per cent of their peak strength. As they got stronger, the amount of weight they lifted on each machine was increased to maintain the intensity at 80 per cent of their peak strength. 
The primary outcomes, reported in a paper published in 2014, showed that these participants' global cognition improved significantly after the resistance training, as measured by tests including the Alzheimer's Disease Assessment Scale-Cognitive subscale. The cognitive training and placebo activities did not have this benefit. The benefits persisted even 12 months after the supervised exercise sessions ended. "The more we can get people doing resistance training like weight lifting, the more likely we are to have a healthier ageing population," said Dr Mavros. "The key, however, is to make sure you are doing it frequently, at least twice a week, and at a high intensity, so that you are maximising your strength gains. This will give you the maximum benefit for your brain." These new findings reinforce research from the SMART trial published earlier this year, in which MRI scans showed an increase in the size of specific areas of the brain among those who took part in the weight training program. These brain changes were linked to the cognitive improvements after weight lifting. "The next step now is to determine if the increases in muscle strength are also related to the increases in brain size that we saw," said senior author Professor Maria Fiatarone Singh, geriatrician at the University of Sydney. "In addition, we want to find the underlying messenger that links muscle strength, brain growth, and cognitive performance, and determine the optimal way to prescribe exercise to maximise these effects."
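The progressive-overload prescription described above amounts to a simple rule: hold the training load at 80 per cent of each participant's current peak strength, and raise the load as the peak itself is reassessed upwards. A minimal sketch of that arithmetic follows; the function name, the reassessment schedule, and all the numbers are hypothetical illustrations, not values from the trial.

```python
# Sketch of the 80%-of-peak-strength prescription used in the SMART trial.
# All names and figures below are illustrative, not taken from the paper.

INTENSITY = 0.80  # fraction of peak strength the trial prescribed

def prescribed_load(one_rm_kg: float) -> float:
    """Training weight for a machine, given the current measured peak (1RM)."""
    return round(INTENSITY * one_rm_kg, 1)

# A hypothetical participant's leg-press peak, reassessed over six months:
for month, one_rm in [(0, 100.0), (2, 115.0), (4, 130.0), (6, 140.0)]:
    print(f"month {month}: peak {one_rm} kg -> train at {prescribed_load(one_rm)} kg")
```

As the sketch shows, the load is not fixed at baseline: each reassessment of peak strength lifts the prescribed weight, which is what keeps the relative intensity constant at 80 per cent.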


News Article | November 22, 2016
Site: www.24-7pressrelease.com

PARKVILLE, AUSTRALIA, November 22, 2016-- Dr. James Angus has been included in Marquis Who's Who. As in all Marquis Who's Who biographical volumes, individuals profiled are selected on the basis of current reference value. Factors such as position, noteworthy accomplishments, visibility, and prominence in a field are all taken into account during the selection process. Dr. Angus has been an Honorary Professorial Fellow and Professor Emeritus of the Department of Pharmacology and Therapeutics, Faculty of Medicine, Dentistry and Health Sciences at the University of Melbourne since 2014. From 2003 to 2013, Dr. Angus was Dean of the Faculty of Medicine, Dentistry and Health Sciences. Dr. Angus earned a Bachelor of Science in pharmacology with honors and a Ph.D. from the University of Sydney. In 1974, he was an NHMRC Senior Research Officer at the Hallstrom Institute of Cardiology, Royal Prince Alfred Hospital & Department of Medicine at the University of Sydney, and the Baker Medical Research Institute in Prahran, Victoria. In 1977 he received the NHMRC CJ Martin Travelling Fellowship to work with Sir James Black, who would go on to receive the Nobel Prize for Medicine in 1988. Dr. Angus then continued to work at the Baker Medical Research Institute in a variety of roles over the next 15 years, including Senior Research Officer, Research Fellow, Senior Research Fellow, Principal Research Fellow, and Senior Principal Research Fellow of the National Health and Medical Research Council. He was ultimately named Deputy Director of the Baker Medical Research Institute in 1992. In 1993, Dr. 
Angus was appointed to the Chair of Pharmacology at the University of Melbourne. His appointments include Chair of the Melbourne Genomics Health Alliance Phase 1 (2014-2016), Governor and Director of the Florey Institute of Neuroscience and Mental Health (a position he has held since 2012), member of the Program Steering Committee of the Australian Council of Learned Academies (2014-2016), member of the Steering Committee to establish the Australian Academy of Health and Medical Sciences (2013-2014), current President of the National Stroke Foundation, current Chair of the University of Melbourne Sport Board, and Board Director of the Jack Brockhoff Foundation since 2015. Dr. Angus is a Fellow and former Council Member of the Australian Academy of Science as well as the International Academy of Cardiovascular Sciences, and an Honorary Fellow of the Australian Academy of Health and Medical Sciences. Over the past 25 years, he has received numerous research grants from such learned institutions and organizations as the NHMRC, Australian College of Anaesthetists, National Heart Foundation of Australia, Glaxo Smith Kline Pty Ltd, and Johnson & Johnson Pty Ltd. His scientific society memberships include the Australian Physiological and Pharmacology Society, Australian Society for Clinical and Experimental Pharmacology, British Pharmacological Society, Cardiac Society of Australia and New Zealand, High Blood Pressure Research Council of Australia, International Society for Heart Research, International Society of Autonomic Neuroscience, and the International Union of Pharmacology, of which he was first Vice President (2002-2006). Dr. Angus is a regular contributor to scientific journals, including Clinical and Experimental Pharmacology and Physiology, the Journal of Vascular Research, the British Journal of Pharmacology, and Pharmacology and Toxicology. 
He has attended and lectured at numerous national and international scientific meetings. In recognition of professional excellence, he received the Alfred Gottschalk Medal of the Australian Academy of Science in 1984 and the Thomson ISI Australian Citation Laureate in Pharmacology in 2004. In 2010, Dr. Angus was appointed an Officer of the Order of Australia for distinguished service to biomedical research, particularly in the fields of pharmacology and cardiovascular disease, as a leading academic and medical educator, and as a contributor to a range of advisory boards and professional organizations both nationally and internationally. Further, he received the Centenary Medal in 2003 for services to pharmacology and the community. His many important roles throughout scientific and academic circles have brought distinction to the University of Melbourne and to the Melbourne Medical School. For his professional efforts, Dr. Angus was selected for inclusion in Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in the World. About Marquis Who's Who: Since 1899, when A. N. Marquis printed the First Edition of Who's Who in America, Marquis Who's Who has chronicled the lives of the most accomplished individuals and innovators from every significant field of endeavor, including politics, business, medicine, law, education, art, religion and entertainment. Today, Who's Who in America remains an essential biographical source for thousands of researchers, journalists, librarians and executive search firms around the world. Marquis now publishes many Who's Who titles, including Who's Who in America, Who's Who in the World, Who's Who in American Law, Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in Asia. Marquis publications may be visited at the official Marquis Who's Who website at www.marquiswhoswho.com


News Article | March 1, 2017
Site: www.PR.com

Over 7,500 global students whittled down to just 96 for two-day event in Shenzhen. Shenzhen, China, March 01, 2017 --( PR.com )-- The best ICT students from across the globe flew to China this week to pit their brains against each other in a challenge that highlights the skills needed for a connected world. Each had undertaken a gruelling selection process to test their knowledge of IP and IT subjects, to be eligible for the trip. Huawei, known for its mobile phones, hosted 32 China teams and six overseas teams at the inaugural event (25-26 February) at its Shenzhen HQ, including teams from Pakistan, South Africa, Australia, Russia, and Western Europe. They undertook an eight-hour practical challenge that thoroughly tested their knowledge of routers, switching, security, WLAN, plus cloud, storage and Big Data. The students are members of the Huawei ICT Academy, a worldwide not-for-profit programme that encourages the next generation of ICT professionals for a better connected world. Of the 150m people across the globe who work in ICT industries, just under 3m are estimated to work in professional or certified roles (1). Lintuo Wu, Huawei ICT Academy director, said: “The pan-global growth in internet start-ups suggests ICT talent is thriving; however, the opposite is the case. Businesses, particularly those in smaller countries, struggle to find people with the right ICT skills to deal with security, the cloud and data storage. Training the next generation of IT professionals to understand how to handle the staggeringly large amounts of information we produce is a significant challenge. The Huawei ICT Academy hopes to influence the upskilling of the global ICT workforce through educational opportunities, to create a better ICT talent ecosystem.” Pallavi Malhotra, Huawei ICT Academy manager, said: “More and more university students, and those wanting to embark on an ICT career, realise they need the full suite of academic knowledge and practical know-how to enter the current ICT workplace. 
Huawei ICT Academy qualifications provide the foundations, from routers or switching, to cloud computing, or security, to help students find those roles. We can also provide a career path for ICT professionals right up to certification, to enable them to aim for middle to senior level positions.” References: (1) “GIC 2020 Skills Assessment,” published 2015 by IPPP, the International Professional Practice Partnership. About Huawei ICT Academy: The Huawei ICT Academy is part of a number of Huawei outreach programmes to enrich life and improve efficiency around the world. Huawei works with over 200 global colleges and universities, including the University of Reading in the UK, University of Sydney in Australia, University of Alicante in Spain, National University of Computer & Emerging Sciences (FAST-NU) in Pakistan, University of São Paulo in Brazil, and City University of Hong Kong, and has trained over 200,000 students up to 2016. To date, Huawei has provided cloud services to over 2,500 customers in the government and public utility, telecom, energy, and finance sectors across 108 countries and regions, deploying more than 1.4 million virtual machines. It has also built 660 data centers worldwide, including 255 cloud data centers. Its ICT solutions, products, and services are used in more than 170 countries and regions, serving over one-third of the world's population. With more than 170,000 employees, Huawei is committed to enabling the future information society, and building a better connected world. 


LAHORE, PAKISTAN, March 02, 2017 /24-7PressRelease/ -- Six ICT students from Pakistan who beat over 2,000 of their countrymen for a place at a global ICT challenge will fly to China this week to highlight the skills needed for a connected world. The students were selected as part of a government-sponsored initiative that saw around 2,300 entrants from more than 30 universities tested during two gruelling rounds of competition. Six were eventually chosen from a shortlist of 50 during the second test event in Lahore in November. Huawei, known for its mobile phones, will host 32 China teams and six overseas teams from across the globe at the inaugural event at its Shenzhen HQ, including representatives from Western Europe, South Africa, Australia, and Russia. Over 7,500 students entered the competition worldwide. The Pakistan delegation, comprising two teams, will undertake an eight-hour practical challenge that thoroughly tests their knowledge of routers, switching, security, WLAN, plus cloud, storage, and Big Data. The top three teams will be announced at the end, along with six of the best mentors. The students are members of the Huawei ICT Academy, a worldwide not-for-profit programme that encourages the next generation of ICT professionals for a better connected world. The development of ICT skills in Pakistan is hampered by poor electricity supply to rural schools: just 31% have access, compared with 53% in urban areas (1). Lintuo Wu, Huawei ICT Academy director, said: "The high number of entrants for this competition underlines the thirst in Pakistan to develop ICT knowledge and skills among university students. The Huawei ICT Academy hopes to influence the upskilling of Pakistan's ICT workforce through educational opportunities and the creation of a better ICT talent ecosystem." 
Pallavi Malhotra, Huawei ICT Academy manager, said: "More and more university students, and those wanting to embark on an ICT career, realise they need the full suite of academic knowledge and practical know-how to enter the current ICT workplace. Huawei ICT Academy qualifications provide the foundations, from routers or switching, to cloud computing, or security, to help students find those roles. We can also provide a career path for ICT professionals right up to certification, to enable them to aim for middle to senior level positions." References: 1. "Information and Communication Technology (ICT) in Education in Asia," April 2014, by UNESCO, the United Nations Educational, Scientific and Cultural Organization. About Huawei ICT Academy: The Huawei ICT Academy is part of a number of Huawei outreach programmes to enrich life and improve efficiency around the world. Huawei works with over 200 global colleges and universities, including the University of Reading in the UK, University of Sydney in Australia, University of Alicante in Spain, National University of Computer & Emerging Sciences (FAST-NU) in Pakistan, University of São Paulo in Brazil, and City University of Hong Kong, and has trained more than 200,000 students up to 2016. To date, Huawei has provided cloud services to over 2,500 customers in the government and public utility, telecom, energy, and finance sectors across 108 countries and regions, deploying more than 1.4 million virtual machines. It has also built 660 data centers worldwide, including 255 cloud data centers. Its ICT solutions, products, and services are used in more than 170 countries and regions, serving over one-third of the world's population. With more than 170,000 employees, Huawei is committed to enabling the future information society, and building a better connected world.


Hackett M.L.,University of Sydney | Hackett M.L.,University of Central Lancashire | Kohler S.,Maastricht University | O'Brien J.T.,University of Cambridge | And 2 more authors.
The Lancet Neurology | Year: 2014

The most common neuropsychiatric outcomes of stroke are depression, anxiety, fatigue, and apathy, which each occur in at least 30% of patients and have substantial overlap of prevalence and symptoms. Emotional lability, personality changes, psychosis, and mania are less common but equally distressing symptoms that are also challenging to manage. The cause of these syndromes is not known, and there is no clear relation to location of brain lesion. There are important gaps in knowledge about how to manage these disorders, even for depression, which is the most studied syndrome. Further research is needed to identify causes and interventions to prevent and treat these disorders. © 2014 Elsevier Ltd.


Bennett S.,University of Wollongong | Maton K.,University of Sydney
Journal of Computer Assisted Learning | Year: 2010

The idea of the 'digital natives', a generation of tech-savvy young people immersed in digital technologies for which current education systems cannot cater, has gained widespread popularity on the basis of claims rather than evidence. Recent research has shown flaws in the argument that there is an identifiable generation or even a single type of highly adept technology user. For educators, the diversity revealed by these studies provides valuable insights into students' experiences of technology inside and outside formal education. While this body of work provides a preliminary understanding, it also highlights subtleties and complexities that require further investigation. It suggests, for example, that we must go beyond simple dichotomies evident in the digital natives debate to develop a more sophisticated understanding of our students' experiences of technology. Using a review of recent research findings as a starting point, this paper identifies some key issues for educational researchers, offers new ways of conceptualizing key ideas using theoretical constructs from Castells, Bourdieu and Bernstein, and makes a case for how we need to develop the debate in order to advance our understanding. © 2010 Blackwell Publishing Ltd.


Orchard J.W.,University of Sydney | Seward H.,Australian Football League Medical Officers Association | Orchard J.J.,University of Sydney
American Journal of Sports Medicine | Year: 2013

Background: Injuries are common in all professional football codes (including soccer, rugby league and union, American football, Gaelic football, and Australian football). Purpose: To report the epidemiology of injuries in the Australian Football League (AFL) from 1992-2012 and to identify changes in injury patterns during that period. Study Design: Descriptive epidemiology study. Methods: The AFL commenced surveying injuries in 1992, with all teams and players included since 1996. An injury was defined as "any physical or medical condition that causes a player to miss a match in the regular season or finals (playoffs)." Administrative records of injury payments (which are compulsory as part of salary cap compliance) to players who do not play matches determined the occurrence of an injury. The seasonal incidence was measured in units of new injuries per club (of 40 players) per season (of 22 matches). Results: There were 4492 players listed over the 21-year period who suffered 13,606 new injuries/illnesses and 1965 recurrent injuries/illnesses, which caused 51,919 matches to be missed. The lowest seasonal incidence was 30.3 new injuries per club per season, recorded in 1993, and the highest was 40.3, recorded in 1998. The injury prevalence (missed matches through injury per club per season) varied from a low of 116.3 in 1994 to a high of 157.1 in 2011. The recurrence rate of injuries was highest at 25% in 1992 and lowest at 9% in 2012 and has steadily fallen across the 21 years (P<.01). The most frequent and prevalent injury was hamstring strain (average of 6 injuries per club per season, resulting in 20 missed matches per club per season; recurrence rate, 26%), although the rate of hamstring injuries has fallen in the past 2 seasons after a change to the structure of the interchange bench (P<.05). The rate of knee posterior cruciate ligament injuries fell in the years after a rule change to prevent knee-to-knee collisions in ruckmen (P<.01). 
Conclusion: Annual public reporting (by way of media release and reports available freely online) of injury rates, using units easily understood by laypeople, has been well received. It has also paved the way for rule changes with the primary goal of improving player safety. © 2013 The Author(s).
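The surveillance units defined in the abstract (seasonal incidence, injury prevalence, and recurrence rate) are simple ratios over raw counts, which is part of why they are easy for laypeople to read. The sketch below shows the arithmetic; the function names are mine, and the single-season counts are hypothetical, not figures from the AFL dataset.

```python
# Sketch of the AFL injury-surveillance units described in the abstract.
# Function names are illustrative; the example counts are hypothetical.

def seasonal_incidence(new_injuries: int, clubs: int, seasons: int) -> float:
    """New injuries per club (of 40 players) per season (of 22 matches)."""
    return new_injuries / (clubs * seasons)

def injury_prevalence(missed_matches: int, clubs: int, seasons: int) -> float:
    """Matches missed through injury per club per season."""
    return missed_matches / (clubs * seasons)

def recurrence_rate(recurrent: int, new: int) -> float:
    """Recurrent injuries as a fraction of all injuries recorded."""
    return recurrent / (recurrent + new)

# Hypothetical one-season totals for a 16-club league:
incidence = seasonal_incidence(new_injuries=560, clubs=16, seasons=1)      # 35.0
prevalence = injury_prevalence(missed_matches=2240, clubs=16, seasons=1)   # 140.0
recurrence = recurrence_rate(recurrent=70, new=560)                        # ~0.11
```

Normalising by club-seasons rather than by players or exposure hours is a deliberate simplification: it trades epidemiological precision for units a club official or journalist can interpret at a glance.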


Rutledge P.J.,University of Sydney | Challis G.L.,University of Warwick
Nature Reviews Microbiology | Year: 2015

Microorganisms produce a wealth of structurally diverse specialized metabolites with a remarkable range of biological activities and a wide variety of applications in medicine and agriculture, such as the treatment of infectious diseases and cancer, and the prevention of crop damage. Genomics has revealed that many microorganisms have far greater potential to produce specialized metabolites than was thought from classic bioactivity screens; however, realizing this potential has been hampered by the fact that many specialized metabolite biosynthetic gene clusters (BGCs) are not expressed in laboratory cultures. In this Review, we discuss the strategies that have been developed in bacteria and fungi to identify and induce the expression of such silent BGCs, and we briefly summarize methods for the isolation and structural characterization of their metabolic products. © 2015 Macmillan Publishers Limited. All rights reserved.


Aitchison J.C.,University of Sydney | Buckman S.,University of Wollongong
Gondwana Research | Year: 2012

The Early Paleozoic Lachlan Fold Belt of eastern Australia is widely regarded as an ancient convergent plate margin beneath which paleo-Pacific (Panthalassic) oceanic lithosphere was continuously subducted. It is cited as the type example of a retreating accretionary orogeny. However, sandstone compositions, the sedimentological nature and timing of chert accumulation, and the overall stratigraphic architecture are not necessarily consistent with this model. We suggest an alternative explanation for the growth of the Gondwana continental margin. Oceanic lithosphere outboard of the passive Gondwana continental margin was subducted beneath an extensive intra-oceanic island arc that now crops out as an allochthonous element (Macquarie arc) within the fold belt. Once the intervening oceanic lithosphere was eliminated, this arc collided with, and was emplaced upon, the Gondwana margin. Recognition of four such events along this margin through the Phanerozoic suggests that this is a significant mechanism for continental growth. © 2012.


Cobo I.,University of Sydney | Li M.,Tyco Fire Protection Products | Sumerlin B.S.,University of Florida | Perrier S.,University of Warwick | Perrier S.,Monash University
Nature Materials | Year: 2015

The properties and applications of biomacromolecules, for example proteins, can be enhanced by the covalent attachment of synthetic polymers. This Review discusses the modification of these biomacromolecules with stimuli-responsive polymers. © 2015 Macmillan Publishers Limited. All rights reserved.


Siler K.,University of Toronto | Lee K.,University of California at San Francisco | Bero L.,University of Sydney
Proceedings of the National Academy of Sciences of the United States of America | Year: 2015

Peer review is the main institution responsible for the evaluation and gestation of scientific research. Although peer review is widely seen as vital to scientific evaluation, anecdotal evidence abounds of gatekeeping mistakes in leading journals, such as rejecting seminal contributions or accepting mediocre submissions. Systematic evidence regarding the effectiveness (or lack thereof) of scientific gatekeeping is scant, largely because access to rejected manuscripts from journals is rarely available. Using a dataset of 1,008 manuscripts submitted to three elite medical journals, we show differences in citation outcomes for articles that received different appraisals from editors and peer reviewers. Among rejected articles, desk-rejected manuscripts, deemed as unworthy of peer review by editors, received fewer citations than those sent for peer review. Among both rejected and accepted articles, manuscripts with lower scores from peer reviewers received relatively fewer citations when they were eventually published. However, hindsight reveals numerous questionable gatekeeping decisions. Of the 808 eventually published articles in our dataset, our three focal journals rejected many highly cited manuscripts, including the 14 most popular (roughly the top 2 percent). Of those 14 articles, 12 were desk-rejected. This finding raises concerns regarding whether peer review is ill-suited to recognize and gestate the most impactful ideas and research. Despite this finding, results show that in our case studies, on the whole, there was value added in peer review. Editors and peer reviewers generally, but not always, made good decisions regarding the identification and promotion of quality in scientific manuscripts.


Diakos C.I.,University of Sydney | Charles K.A.,University of Sydney | McMillan D.C.,Royal Infirmary | Clarke S.J.,University of Sydney
The Lancet Oncology | Year: 2014

Inflammation is a recognised hallmark of cancer that substantially contributes to the development and progression of malignancies. In established cancers, there is increasing evidence for the roles that local immune response and systemic inflammation have in progression of tumours and survival of patients with cancer. This knowledge provides an opportunity to target these inflammatory responses to improve patient outcomes. In this Review, we examine the complex interplay between local immune responses and systemic inflammation, and their influence on clinical outcomes, and propose potential anti-inflammatory interventions for patients with cancer. © 2014 Elsevier Ltd.


Woodroffe C.D.,University of Wollongong | Webster J.M.,University of Sydney
Marine Geology | Year: 2014

Coral reefs provide significant evidence for former sea-level positions because of their geological preservation and suitability for dating. Interpretation of this evidence presumes an understanding of reef geomorphology, modern reef organism distributions, and environmental factors influencing them. Fossil reef terraces, formed during the last interglacial, marine oxygen isotope (MIS) substage 5e (~128-116 ka), are prevalent on many tropical shorelines and there has been ongoing debate as to the height reached by sea level during that highstand. Observations from numerous last interglacial sites suggest that sea level was at least 3 m above present sea level, implying less extensive ice sheets than at present. An elevation of 6 m has commonly been adopted when correcting tectonically active sites for uplift. Recent compilations suggest elevations up to 8-9 m, but incorporate few observations from reefs where the last interglacial is found below sea level. Oscillation of sea level during MIS 5e has been interpreted from several sites, with recent studies inferring rapid rise of several metres at the end of the interglacial. These interpretations are at the limits of the precision with which corals can currently be dated and their palaeo-water depths inferred. It is not surprising that constraining last interglacial sea-level changes within uncertainties of less than 1-2 m remains controversial, considering sea-level variations recognised between reef sites in the Holocene, and observed geographical variation in isostatic or flexural adjustments. Fossil coral reefs on uplifting margins also provide clear evidence for MIS substages 5c and 5a, and those on Huon Peninsula indicate fluctuations related to Heinrich events (MIS 3). Interpretations show considerable variability between sites, with still greater uncertainties about sea-level timing and elevation during previous interglacials.
Future study of extensive sequences of fossil reefs preserved on rapidly subsiding margins could address these uncertainties. Submerged reefs have already yielded important information about sea-level rise during the last deglaciation. Coring around Barbados and Tahiti, as well as on the Huon Peninsula, has produced a broadly consistent picture of ice melt, reflecting eustatic change since the last glacial maximum. These studies have shown the sensitivity of reefs to rapid sea-level rise associated with meltwater pulses, with some reefs drowning while others back-stepped. Integrated Ocean Drilling Program (IODP) expeditions to Tahiti, and recently the Great Barrier Reef, extended these records, but details of timing, nature and impact of deglacial meltwater pulses remain elusive. Studies of Holocene reefs have indicated different growth strategies; some kept up with sea level, while others caught up when sea level decelerated. Holocene sea level appears to have experienced a gradual rise up to present across the Caribbean, providing accommodation space for reefs to accrete vertically; whereas in the Indo-Pacific sea level has been near its present level since 7 ka, with many reef flats emergent following a slight fall of sea level caused by ocean siphoning. Microatolls on reef flats provide perhaps the clearest evidence of past sea-level position, but, in their absence, novel biological or other sea-level indicators are required to better constrain palaeo-water depths. There is an urgent need for further research from additional key reef locations, not only to decipher processes driving past sea-level change and its geographical variability, but also to better understand how coral reefs will respond in the context of future sea-level rise. © 2013 Elsevier B.V.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.1.2-5 | Award Amount: 3.97M | Year: 2009

NeuroXsys will generate regulatory maps and models of the human X chromosome based on evolutionary conservation, with special attention to genes and regions implicated in X-linked neurological diseases. Vertebrate chromosomes are subdivided into domains of genomic regulatory blocks (GRBs) and NeuroXsys aims to map all GRBs on the X chromosome through bioinformatic approaches, extract gene regulatory sequences, and model their activity through transgenic reporter assays in the zebrafish juvenile and adult brain. A major deliverable of this project will be an online database correlating disease genes with their regulatory regions and with the experimentally and bioinformatically assessed function of those regions. NeuroXsys will seek to identify human disease mutations in neural gene regulatory elements. Implicated elements will be studied as regulators at single cell resolution in the zebrafish and mouse brain, defining expression patterns driven by the normal and mutant human regulatory DNA sequences. NeuroXsys will generate a regulatory map of a human chromosome.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2009-2.1.1-1 | Award Amount: 15.31M | Year: 2010

In recent years, the zebrafish has emerged as a new vertebrate model organism for biomedical research which offers a unique combination of traits: a short generation time, small size and efficient breeding procedures make it the best choice among vertebrates for forward genetic screening and small-molecule screens, including toxicology, while the transparent embryo and larva offers unique opportunities for imaging of cell movement and gene expression in a developing organism. Building on recent advances in the zebrafish field, we will conduct high-throughput phenotyping of at least a thousand regulatory genes relevant for common human diseases, by behavioural assays (for viable mutants), 3D / 4D imaging and expression profiling (including high-throughput sequencing). We will include mutants generated by TILLING and by the new zinc finger nuclease method, as well as mutants from earlier forward-genetics screens. A phenotyping effort of this scale has never been undertaken before in any vertebrate organism. Complementing the study of mutants relevant for neurological disorders, we will produce an atlas of gene expression in the brain, the most comprehensive one in a vertebrate. We will further perform a genome-wide characterisation of regulatory elements of potential disease genes by a combination of bioinformatics and transgenics. Small-molecule screening for mutant rescue or disease-relevant processes will identify candidate drugs and provide insights into gene function. Our increasing knowledge on the regulators and their interactions with regulatory targets will be integrated with knowledge at cellular and organismic level. By capitalising on the virtues of the zebrafish system, this systems biology approach to the regulome will gain unique knowledge complementing ongoing work in mammalian systems, and provide important new stimuli for biomedical research.


Grant
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: NMP-2008-2.6-3 | Award Amount: 2.04M | Year: 2009

A coordination action is proposed to reinforce the international dimension of EU research on nanomaterials in formulations in the Asia-Pacific region. Three mechanisms will be implemented to reach the widest possible audience in formats convenient to the different stakeholders: (1) yearly major events that will introduce a new concept to scientific gatherings and a departure from conventional meetings, (2) a researcher exchange program to seed new collaborations, facilitate joint projects and the realisation of future coordinated calls, and (3) the creation of a website devoted to nanomaterials in formulations that will include up-to-date and reliable information on the newest research developments, funding opportunities, regulations, events and links to other nanotechnology initiatives.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2011.2.2.1-2 | Award Amount: 17.04M | Year: 2012

NEURINOX aims to elucidate the role of NADPH oxidases (NOX) in neuroinflammation and its progression to neurodegenerative diseases (ND), and to evaluate the potential of novel ND therapeutic approaches targeting NOX activity. NOX generate reactive oxygen species (ROS) and have emerged as regulators of neuroinflammation. Their role is complex: ROS generated by NOX lead to tissue damage in microglia-mediated neuroinflammation, as seen in amyotrophic lateral sclerosis (ALS), while absence of ROS generation enhances the severity of autoimmune-mediated neuroinflammation, as seen, for example, in multiple sclerosis (MS). The objective of the five-year NEURINOX project is to understand how NOX controls neuroinflammation, identify novel molecular pathways and oxidative biomarkers involved in NOX-dependent neuroinflammation, and develop specific therapies based on NOX modulation. The scientific approach will be to: (i) identify NOX-dependent molecular mechanisms using dedicated ND animal models; (ii) develop therapeutic small molecules either inhibiting or activating NOX and test their effects in animal models; (iii) test the validity of identified molecular pathways in clinical studies in ALS and MS patients. NEURINOX will contribute to a better understanding of brain dysfunction, particularly the link between neuroinflammation and ND, and to the identification of new therapeutic targets for ND. A successful demonstration of the benefits of NOX-modulating drugs in ALS and MS animal models, and in ALS early clinical trials, will validate a novel high-potential therapeutic target for ALS and for many other types of ND. NEURINOX therefore has strong potential to deliver more efficient ND healthcare for patients and to reduce ND healthcare costs.
This multi-disciplinary consortium includes leading scientists in NOX research, ROS biology, drug development SMEs, experts in the neuroinflammatory aspects of ND, genomics and proteomics, and clinicians able to translate the basic science to the patient.


News Article | November 10, 2016
Site: www.24-7pressrelease.com

GREENWICH, AUSTRALIA, November 10, 2016-- Dr. Roderick MacLeod has been included in Marquis Who's Who. As in all Marquis Who's Who biographical volumes, individuals profiled are selected on the basis of current reference value. Factors such as position, noteworthy accomplishments, visibility, and prominence in a field are all taken into account during the selection process. Dr. Roderick MacLeod is a medical educator and researcher with more than 30 years of experience in academia and palliative care. The former medical director of Dorothy House Hospice near Bath, England, Dr. MacLeod is an expert in palliative care and chronic illness with a host of editing and publishing credits to his name. Now serving as a senior staff specialist in palliative care and conjoint professor at the University of Sydney, he was appointed to his current role in 2012 by HammondCare as part of the organization's commitment to providing outstanding end-of-life care. Since then, he has focused on the themes of cancer, healthy aging and palliative care as a means of supporting patients and their families through their final transition. In addition to his role with Dorothy House, Dr. MacLeod has served as a professor at the University of Otago, director of palliative care for the Mary Potter Hospice Foundation, and a general practice physician with the Holt Group Practice. As an author and co-author, he has penned book chapters in several educational resources, including the Textbook of Palliative Medicine and "First Do No Self-Harm: Understanding and Promoting Physician Stress Resilience," which was published in 2013. With an impressive list of more than 50 journal articles and scholarly reviews written in the past 10 years alone, Dr. MacLeod is a well-respected and leading authority in the medical community. In 2013, he was credited as having contributed to a comprehensive research report published by the Canadian Institutes of Health Research.
Dr. MacLeod is a fellow of the Chapter of Palliative Care of the Royal Australasian College of Physicians, and served on its education committee from 2004 to 2008. He is also a fellow of the Royal College of General Practitioners and a past member of the New Zealand Ministry of Health's Palliative Care Advisory Group. In addition to being appointed as a member of the New Zealand Order of Merit, he was honored with the Inaugural Lloyd Morgan Charitable Trust fellowship and inclusion in Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in the World. Dr. MacLeod's educational background includes a Ph.D. from the University of Glamorgan, a master's degree in medical education and a bachelor of medicine from the University of Dundee. About Marquis Who's Who: Since 1899, when A. N. Marquis printed the First Edition of Who's Who in America, Marquis Who's Who has chronicled the lives of the most accomplished individuals and innovators from every significant field of endeavor, including politics, business, medicine, law, education, art, religion and entertainment. Today, Who's Who in America remains an essential biographical source for thousands of researchers, journalists, librarians and executive search firms around the world. Marquis now publishes many Who's Who titles, including Who's Who in America, Who's Who in the World, Who's Who in American Law, Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in Asia. Marquis publications may be visited at the official Marquis Who's Who website at www.marquiswhoswho.com


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.2.4.2-2 | Award Amount: 7.85M | Year: 2011

Biomarkers are considered as tools to enhance cardiovascular risk estimation. However, the value of biomarkers for risk estimation beyond European risk scores, their comparative impact among different European regions and their role in the drive towards personalised medicine remain uncertain. Based on harmonised and standardised European population cohorts we have built significant research collaboration, expertise and infrastructure in the EU. We will apply highly innovative SME-driven technologies and perform large-scale biomarker determination to assess the predictive value of existing and emerging biomarkers. Selection of emerging biomarkers will be based on integrated cutting-edge quantitative proteomic, transcriptomic, metabolomic, and miRNomic datasets established by private and public consortium members that will be disclosed to this consortium. Existing biomarkers will be selected based on non-redundancy and their association with cardiovascular risk and phenotypes. After SME-guided development of innovative assay systems, biomarkers will be tested and validated in a stepwise fashion among European populations in primary and secondary prevention. In addition to their impact on risk prediction, their association with lifestyle determinants and cardiovascular phenotypes assessed by ultrasound and MRI technique will be evaluated. We will establish a BiomarCaRE panel which leads to improved disease prediction among different European populations. International collaborations with world-class clinical trial investigators will add data on the interaction of the BiomarCaRE panel with risk-lowering medication and lifestyle changes. The outcome of SME-driven technology development and clinical validation will undergo a medical technology assessment. The determination of cost-effectiveness will guide further clinical evaluation. These studies will reveal new methods of improved cardiovascular risk estimation and will open the path towards personalised medicine.


News Article | March 1, 2017
Site: www.eurekalert.org

A new report has highlighted a gender divide in the screening of patients for cardiovascular disease - Australia's number one killer. Research from The George Institute for Global Health and The University of Sydney found men were significantly more likely to have their heart disease risk factors measured by their GP. The study, published in the journal Heart, also found the odds of being treated with the appropriate preventative medicines were 37 per cent lower for younger women at high risk of cardiovascular disease (CVD) than for their male counterparts. Associate Professor Julie Redfern, from The George Institute for Global Health, said the results were especially concerning because more women than men die each year from cardiovascular disease. Associate Professor Redfern said: "Unfortunately there is still the perception that heart disease is a man's disease. This is not the case here in Australia, the UK or the US, and we fear that one of the reasons more women are dying from heart disease is because they are not being treated correctly, including not even being asked basic questions about their health." Risk factors for CVD include raised cholesterol and blood pressure levels and smoking. Female smokers have a 25 per cent greater risk of CVD than male smokers. The study of more than 53,000 patients across 60 sites in Australia found the odds of women being appropriately screened were 12 per cent lower than those of men. It also found major discrepancies in the treatment of women at high risk of CVD. Younger women (aged 35-54) were 37 per cent less likely than younger men to have appropriate medications, such as blood pressure drugs, statins and antiplatelets, prescribed. By contrast, older women (aged 65 plus years) were 34 per cent more likely than older men to have appropriate medications prescribed.
Karice Hyun, who undertook the research for her PhD at the University of Sydney, said: "It is simply unacceptable that more than half of young women in this study did not receive appropriate heart health medications. "These medications can greatly reduce the likelihood of having a heart attack or stroke. If these findings are representative, many women could be missing out on life-saving treatment right now - just because of their age and gender. "This fundamentally needs to change. We need a system-wide solution to addressing these very worrying gaps in heart disease-related healthcare to ensure women are treated equally across the health system." Whilst the report highlighted gender disparity, it also revealed that just 43.3 per cent of all patients had all their necessary risk factors recorded, whilst only 47.5 per cent of patients at high risk of CVD were prescribed preventative medicines. Associate Professor Redfern added: "These findings really show that we need to do a better job of preventing and tackling CVD for all Australians if we have any hope of reducing the death toll." Every year more than 45,000 people die from CVD in Australia.
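Results such as "the odds of women being appropriately screened were 12 per cent lower" come from comparing the odds of an outcome between two groups. A minimal sketch of that calculation, with counts invented for illustration (they are not the study's data):

```python
# Hypothetical odds-ratio calculation of the kind underlying statements
# like "women's odds of being screened were ~12% lower than men's".
# The counts below are invented for illustration only.

def odds(events: int, non_events: int) -> float:
    """Odds = events / non-events (not the same as a probability)."""
    return events / non_events

women_odds = odds(events=4000, non_events=6000)  # screened vs not screened
men_odds = odds(events=4300, non_events=5700)

odds_ratio = women_odds / men_odds
print(f"odds ratio (women vs men): {odds_ratio:.2f}")
```

With these invented counts the ratio is about 0.88, i.e. roughly 12 per cent lower odds for women; in the actual study such ratios would typically be estimated with confidence intervals from a regression model.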


News Article | November 23, 2016
Site: www.eurekalert.org

Scientists are one step closer to understanding the link between different diet strategies and gut health, with new research presenting the first general principles for how diet impacts the microbiota. Researchers from the University of Sydney have found that the availability of intestinal nitrogen to microbes in the gut plays a key role in regulating interactions between gut microbes and their host animal. The study is published today in Cell Metabolism and led by researchers at the University of Sydney's Charles Perkins Centre. "There are many different diet strategies that claim to promote gut health, and until now it has been very difficult to establish clear causality between various types of diet and their effect on the host's microbiome. This is because there are many complex factors at play, including food composition, eating pattern and genetic background," said lead author Associate Professor Andrew Holmes, from the Charles Perkins Centre and School of Life and Environmental Sciences. "This research really lays the groundwork for future modelling by setting out the rules for a general model of how diet shapes the gut ecosystem. The simple explanation is that when we eat in a way that encourages cooperation between ourselves and bacteria we achieve a good microbiome, but when we eat in a way that doesn't require cooperation this lets bacteria do whatever they want - and mischief can ensue." The balance of gut bacteria in the microbiome plays a key role in such functions as immune regulation and digestive wellbeing, and has been linked to other health outcomes like obesity. Past studies have identified several patterns for how diet influences the microbiome, yet this has not led to a workable model that explains microbial response across many different types of diets. 
This new research is the latest in a series stemming from a seminal study in which 25 different diets comprising different amounts of protein, carbohydrates and fat were systematically varied in 858 mice. Despite the huge diversity of gut bacteria, two main response patterns emerged in the study - microbe species either increased or decreased in their abundance depending on the animal's protein and carbohydrate intake. "The largest nutrient requirements for our gut bacteria are carbon and nitrogen in the foods we eat. As carbohydrates contain no nitrogen but protein does, the bacterial community response to the host animal's diet is strongly affected by the diet's protein-carbohydrate ratio," said Associate Professor Holmes. "The fact that this same pattern was seen across almost all groups of gut bacteria indicates that the makeup of the microbial ecosystem is fundamentally shaped by a need to access nitrogen in the intestinal environment." The researchers' new model suggests that while high-carbohydrate diets were the most likely to support positive interactions in the microbiome, such benefits were relative to the protein intake of the host animal. Researchers hope the new findings will lay the foundations for more accurate computer simulations to test hundreds of different diet variants, helping to better predict which dietary combinations lead to optimal gut health. "There are many ways to achieve a good diet, and the same diet won't work in the same way in each person," said co-author Professor Stephen Simpson, Academic Director of the Charles Perkins Centre. "The next step will be to more rapidly characterise which dietary combinations promote the best outcomes for each of our gut microbiomes, and to this end we are developing a computer simulation for how this might work in practice."
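The two response patterns described above, with microbial abundance rising or falling as the diet's protein-carbohydrate ratio changes, can be caricatured in a few lines. The functional form and parameters below are invented for illustration; this is a toy sketch of the qualitative pattern, not the authors' model.

```python
# Toy sketch: relative abundance of two hypothetical classes of gut microbes
# as a function of the dietary protein:carbohydrate (P:C) ratio. Since protein
# is the main dietary nitrogen source, "nitrogen-favoured" taxa are assumed to
# increase with the ratio and "carbohydrate-favoured" taxa to decrease.
# The saturating form and parameters are invented for illustration.

def relative_abundance(pc_ratio: float) -> dict:
    nitrogen_favoured = pc_ratio / (pc_ratio + 1.0)  # saturating increase
    carb_favoured = 1.0 - nitrogen_favoured          # complementary decline
    return {"nitrogen_favoured": nitrogen_favoured,
            "carb_favoured": carb_favoured}

for ratio in (0.25, 1.0, 4.0):  # low-, balanced- and high-protein diets
    print(ratio, relative_abundance(ratio))
```

A simulation of the kind the researchers describe would sweep many such diet compositions and fit the response curves of real taxa, rather than assume a fixed functional form.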
The research was a collaboration between researchers at the University of Sydney, the University of Western Australia, Concord Hospital, ANZAC Research Institute and EWOS Innovation, Norway.


BETHLEHEM, PA--(Marketwired - Feb 15, 2017) - The appropriate use of parenteral nutrition, a topic of growing importance for treating critically ill patients, is the focus for the symposium sponsored by B. Braun Medical Inc. at this year's American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.) Clinical Nutrition Week. B. Braun will present Dr. Gordon S. Doig, a leading researcher in the area of early parenteral nutrition in critically ill patients, at the symposium for continuing education credits on Feb. 19 from 6:30 a.m. - 7:30 a.m. in Grand Ballroom 8A at the Marriott Orlando World Center in Orlando, Fla. The event will be preceded by a breakfast buffet at 6:00 a.m. Doig's research, which has been published in the Journal of the American Medical Association and ClinicoEconomics and Outcomes Research, concludes that the use of parenteral nutrition in critically ill patients with short-term, relative contraindications to enteral nutrition may result in improved patient outcomes and significantly reduce total cost of care. Doig is associate professor of intensive care at the University of Sydney and Royal North Shore Hospital, Sydney, Australia. B. Braun's response to help address malnutrition in the hospitalized and home care patient will be presented throughout Clinical Nutrition Week, Feb. 18-20. B. Braun representatives at booth #201 will be available to discuss the company's new parenteral nutrition program -- PN360. According to the December 2016 Healthcare Cost and Utilization Project (HCUP) statistical brief #218, malnutrition has been associated with longer and more costly hospital stays, as well as a greater likelihood of comorbidity and death among hospitalized patients. Malnutrition may also contribute to post hospital syndrome, described as "an acquired transient period of vulnerability" following hospitalization, which may dramatically increase the risk of readmission, the brief indicated. Also on Feb. 19, from 12:45 p.m. to 1:45 p.m., B. 
Braun will host a poster session in the exhibit hall on "Understanding the Incidence of Bloodstream Infections and Patient Outcomes by Type of Parenteral Nutrition Preparation Method." B. Braun's poster detailing the session will be on display starting at 6 p.m. on Feb. 18. B. Braun also will showcase its new macro and micro APEX® compounding system at booth #201 for health care facilities that need in-house compounding capabilities, and its wide selection of amino acid formulations, standard solutions, and related additives, in containers that are not made with natural rubber latex, PVC or DEHP. About B. Braun B. Braun Medical Inc., a leader in infusion therapy and pain management, develops, manufactures, and markets innovative medical products and services to the health care industry. The company is committed to eliminating preventable treatment errors and enhancing patient, clinician and environmental safety. B. Braun Medical is headquartered in Bethlehem, Pa., and is part of the B. Braun Group of Companies in the U.S., which includes B. Braun Interventional Systems, Aesculap® and CAPS®. Globally, the B. Braun Group of Companies employs more than 56,000 employees in more than 60 countries. Guided by its Sharing Expertise® philosophy, B. Braun continuously exchanges knowledge with customers, partners and clinicians to address the critical issues of improving care and lowering costs. To learn more about B. Braun Medical, visit www.BBraunUSA.com.


News Article | December 5, 2016
Site: www.eurekalert.org

The Seed Box, Sweden's largest research programme in the environmental humanities, is now allocating grants to researchers, writers and artists around the world. The projects investigate urgent environmental problems and present new, often artistic methods and pathways forward, aimed at exploring our relationship with the environment. "Until now, the environment has mainly been a subject for natural science and engineering. But environmental issues are also very relevant for the humanities and social sciences. They concern values and human conditions, and these are the domains of the humanities. With the Seed Money we want to nurture good ideas and green initiatives from the humanities, from all round the world," says Cecilia Åsberg, professor of gender, nature and culture at Linköping University and programme director of the Seed Box, which is based at Linköping University. The Seed Box's Seed Money venture aims to foster research in interdisciplinary and environmental humanities, by increasing researcher mobility and facilitating knowledge exchange between Swedish and foreign universities. To this end, 16 projects involving 40 individuals have received funding. The grants will go to exchanges for researchers, writers and artists, and to workshops, travel grants and a project on citizen science. Herbarium 3.0, a project that has secured USD 43,434 (EUR 40,854), investigates the plants around us that we no longer see. Our history is full of collected and pressed plants that have been put into herbaria with data on how, where and when they were found. And yet, despite this robust botanical history, many humans are now notably blind to the plants that share our world. "Plant blindness can make us insensitive to both the lives of plants and the deeply connected history of plant-human interactions. We want to move herbaria out of the archive and back into people's lives," says Tina Gianquitto, associate professor, Colorado School of Mines. 
The project will create a website where the public can share their experiences and relationships with plants. The narratives will be collected in public gardens around the world, including the New York Botanical Garden and the Gothenburg Botanical Garden in Sweden. The international projects that received funding will collaborate with a Swedish university, to bolster the exchange of knowledge. "The Seed Box: An Environmental Humanities Collaboratory" is a four-year pilot programme funded by Mistra, the Swedish Foundation for Strategic Environmental Research, and Formas, the Swedish Research Council. It is based at Linköping University and has received roughly USD 4.9 million (EUR 4.1 million) to advance the environmental humanities in Sweden and worldwide. The call for funding was made in consultation with the Seed Box's funders. The funded projects are:
Hanna Husberg (Academy of Fine Arts, Vienna). Project: Troubled atmosphere: On the governance of air. Will cooperate with Linköping University. Amount granted: SEK 111,000
Erika Sigvardsdotter (Red Cross University College) and Jonas Gren. Project: A poetic writer in residence. The return of bacteria - on the dangerous reduction of complex to complicated. Will cooperate with Linköping University. Amount granted: SEK 60,000
Franziska Bedorf (Uppsala University). Project: Travelling exhibition. The Melting Snows of Kilimanjaro and Other Stories: Of People, Land and Climate Change in East Africa. Amount granted: SEK 175,000
Jesse Peterson (KTH Royal Institute of Technology). Project: Two-day writer's workshop. Writing with Undisciplined Discipline: An Environmental Humanities Workshop. Amount granted: SEK 88,000
Katherine Gibson (Western Sydney University). Project: Urban Food Economies: Re-thinking Value for 'More-than-Capitalist' Futures. Will cooperate with KTH Royal Institute of Technology. Amount granted: SEK 180,000
Tina Gianquitto (Colorado School of Mines, USA). Project: Herbaria 3.0. Will cooperate with University of Gothenburg. Amount granted: SEK 400,000
Sebastian Ureta (Universidad Alberto Hurtado, Chile) with Linda Soneryd (University of Gothenburg). Project: Assembling transnational toxic bodies: Embodying and mobilizing responsibility on the 'Arica Victims VS Boliden Minerals AB' case. Will cooperate with University of Gothenburg. Amount granted: SEK 505,000
Marco Armiero (KTH Royal Institute of Technology). Project: The United Toxic Autobiographies of Europe. Amount granted: SEK 365,000
Veronica Pacini-Ketchabaw (Western University, Canada) with Maria Svedäng (Stockholm University) and Astrida Neimanis (University of Sydney). Project: The Wild Weathering Collaboratory. Amount granted: SEK 275,000
Jennifer Mae Hamilton (University of Sydney). Project: Research travel, Weathering the City. Will cooperate with Linköping University. Amount granted: SEK 75,000
Åsa Össbo (Umeå University). Project: Damage done: Exploring the ongoing consequences for Sami communities as a result of the Swedish hydropower development. Amount granted: SEK 372,000


News Article | November 23, 2016
Site: www.eurekalert.org

A groundbreaking study of the virosphere of the most populous animals - those without backbones, such as insects, spiders and worms, that live around our houses - has uncovered 1,445 viruses, revealing people have only scratched the surface of the world of viruses - but it is likely that only a few cause disease. The meta-genomics research, a collaboration between the University of Sydney and the Chinese Centre for Disease Control and Prevention in Beijing, was made possible by new technology that also provides a powerful new way to determine which pathogens cause human diseases. Professor Edward Holmes, from the Marie Bashir Institute for Infectious Diseases & Biosecurity and the School of Life and Environmental Sciences, who led the Sydney component of the project, said that although the research revealed humans are surrounded by viruses in our daily lives, these did not transfer easily to humans. "This groundbreaking study rewrites the virology textbook by showing that invertebrates carry an extraordinary number of viruses - far more than we ever thought," Professor Holmes said. "We have discovered that most groups of viruses that infect vertebrates - including humans, such as those that cause well-known diseases like influenza - are in fact derived from those present in invertebrates," said Professor Holmes, who is also based at the University's multidisciplinary Charles Perkins Centre. The study suggests these viruses have been associated with invertebrates for potentially billions of years, rather than millions of years as had been believed - and that invertebrates are the true hosts for many types of virus. The paper, "Redefining the invertebrate RNA virosphere," is published tonight in Nature. "Viruses are the most common source of DNA and RNA on earth. It is all literally right under our feet," Professor Holmes said. 
The findings suggest that RNA viruses - viruses whose genetic material is ribonucleic acid, a molecule whose principal cellular role is to carry instructions from DNA - are likely to exist in every species of cellular life. "It's remarkable that invertebrates like insects carry so very many viruses - no one had thought to look before because most of them had not been associated with human-borne illnesses." Although insects such as mosquitoes are well known for their potential to transmit viruses like Zika and dengue, Professor Holmes stressed that insects should not generally be feared because most viruses were not transferable to humans and invertebrates played an important role in the ecosystem. Importantly, the same techniques used to discover these invertebrate viruses could also be used to determine the cause of novel human diseases, such as the controversial 'Lyme-like disease' that is claimed to occur following tick bites. "Our study utilised new techniques in meta-genomics, which we are also using to provide insights into the causes of human-borne diseases," said Professor Holmes, who is also a National Health and Medical Research Council Australia Fellow. "The new, expensive technologies available to researchers, which have allowed us to do this landmark project, provide the ultimate diagnostic tool." Professor Holmes and his collaborators are conducting human studies using these new techniques to analyse Lyme-like disease and other clinical syndromes. The paper describes a landmark 'meta-transcriptomics' analysis of ~220 species of invertebrates from nine diverse animal phyla that had not previously been studied with respect to viral diversity and evolution. The result is a picture of phylogenetic and genomic diversity that fundamentally changes our understanding of RNA virus evolution and rewrites the virology textbook. 
The meta-sequencing study profiled more than 220 invertebrate species across nine animal phyla and discovered more viruses than have been documented in any single previous study. The research fills major gaps in the understanding of RNA viruses and reveals that viruses evolve in a far more complex way than was previously thought. For example, the study found that viruses can capture genes (including from the animals they infect), lose genes and transfer genes among themselves. Together, the data from this new research present a view of the RNA virosphere that is more phylogenetically and genomically diverse than that depicted in current classification schemes and provide a more solid foundation for studies in virus ecology and evolution. The work enables researchers to re-examine and redefine the invertebrate virosphere, providing a new perspective on the fundamental patterns and processes of viral evolution.


News Article | October 19, 2016
Site: www.techtimes.com

Tasmanian devil milk could be the answer to the global war against potentially deadly superbugs. Findings of a new study have revealed that milk from the Tasmanian devil, an endangered species of marsupial found in Australia, contains antimicrobial compounds that are capable of killing disease-causing pathogens, including antibiotic-resistant bacteria such as methicillin-resistant Staphylococcus aureus (MRSA). Researchers from the University of Sydney scanned the genome of the Tasmanian devil and found six different types of naturally occurring antimicrobial compounds. After synthesizing these in the lab, the researchers tested their effectiveness at fighting and killing a number of drug-resistant bacteria and fungi. They found that the compounds are capable of killing staph, known to cause pneumonia, food poisoning and toxic shock syndrome. The compounds were likewise found capable of fighting vancomycin-resistant enterococcus, which can cause meningitis and urinary tract infections (UTI). Vancomycin is a potent antibiotic, and there are few available options for treating bugs that are resistant to this drug. The compounds also killed a potentially fatal species of yeast called Candida krusei and the hyper-virulent fungus Cryptococcus gattii. Experts agree that there is an urgent need for new drugs that can fight treatment-resistant infections. It is estimated that superbugs that can resist antimicrobials are responsible for 700,000 deaths per year. Earlier this year, the Centers for Disease Control and Prevention (CDC) revealed that antibiotic-resistant bacteria were responsible for one in every seven infections in some U.S. hospitals. A report by the Review on Antimicrobial Resistance likewise projected that, by 2050, drug-resistant bugs will kill one person every three seconds, equivalent to about 10 million lives per year. 
This death toll would translate into a loss of $100 trillion worldwide if nothing is done about the risks posed by drug-resistant infections. Antimicrobial resistance is a concern because it can make medical procedures more dangerous to perform. The findings of the study on Tasmanian devil milk, which was published in Scientific Reports on Oct. 11, may help in the development of new drugs that could play a vital role in the global fight against superbugs, as the researchers see potential in the compounds found in the marsupial's milk for fighting drug-resistant pathogens. "Tasmanian devil cathelicidins Saha-CATH5 and 6 are potential candidates for drug development. Their broad-spectrum antibacterial activity and ability to kill MRSA and VREF could translate into numerous therapeutic applications," wrote study researcher Emma Peel and colleagues from the University of Sydney in Australia. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


BETHLEHEM, PA--(Marketwired - Feb 15, 2017) - The appropriate use of parenteral nutrition, a topic of growing importance in treating critically ill patients, is the focus of the symposium sponsored by B. Braun Medical Inc. at this year's American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.) Clinical Nutrition Week. B. Braun will present Dr. Gordon S. Doig, a leading researcher in the area of early parenteral nutrition in critically ill patients, at the symposium on Feb. 19 from 6:30 a.m. to 7:30 a.m. in Grand Ballroom 8A at the Marriott Orlando World Center in Orlando, Fla. The event will be preceded by a breakfast buffet at 6:00 a.m. Doig's research, which has been published in the Journal of the American Medical Association and ClinicoEconomics and Outcomes Research, concludes that the use of parenteral nutrition in critically ill patients with short-term, relative contraindications to enteral nutrition may result in improved patient outcomes and significantly reduce the total cost of care. Doig is associate professor of intensive care at the University of Sydney and Royal North Shore Hospital, Sydney, Australia. B. Braun's approach to addressing malnutrition in hospitalized and home-care patients will be presented throughout Clinical Nutrition Week, Feb. 18-20. B. Braun representatives at booth #201 will be available to discuss the company's new parenteral nutrition program -- PN360. According to the December 2016 Healthcare Cost and Utilization Project (HCUP) statistical brief #218, malnutrition has been associated with longer and more costly hospital stays, as well as a greater likelihood of comorbidity and death among hospitalized patients. Malnutrition may also contribute to post-hospital syndrome, described as "an acquired transient period of vulnerability" following hospitalization, which may dramatically increase the risk of readmission, the brief indicated. Also on Feb. 19, from 12:45 p.m. to 1:45 p.m., B. 
Braun will host a poster session in the exhibit hall on "Understanding the Incidence of Bloodstream Infections and Patient Outcomes by Type of Parenteral Nutrition Preparation Method." B. Braun's poster detailing the session will be on display starting at 6 p.m. on Feb. 18. B. Braun also will showcase its new macro and micro APEX® compounding system at booth #201 for health care facilities that need in-house compounding capabilities, and its wide selection of amino acid formulations, standard solutions, and related additives, in containers that are not made with natural rubber latex, PVC or DEHP. About B. Braun B. Braun Medical Inc., a leader in infusion therapy and pain management, develops, manufactures, and markets innovative medical products and services to the health care industry. The company is committed to eliminating preventable treatment errors and enhancing patient, clinician and environmental safety. B. Braun Medical is headquartered in Bethlehem, Pa., and is part of the B. Braun Group of Companies in the U.S., which includes B. Braun Interventional Systems, Aesculap® and CAPS®. Globally, the B. Braun Group of Companies employs more than 56,000 people in more than 60 countries. Guided by its Sharing Expertise® philosophy, B. Braun continuously exchanges knowledge with customers, partners and clinicians to address the critical issues of improving care and lowering costs. To learn more about B. Braun Medical, visit www.BBraunUSA.com.


News Article | February 15, 2017
Site: www.eurekalert.org

Public officials faced with the tough task of communicating risk on contentious issues like vaccination or fluoridation - where the actual risk is low but public concern remains high - need to show that they care, demonstrate that they are taking action and strategically engage with the media. That's the message of a paper published today in the Sax Institute's Public Health Research & Practice journal. "With the rise of 'alternative facts' and the tendency for people to seek information that confirms their existing beliefs, it is no longer enough to simply have the right policy," said lead author Dr Claire Hooker from the Centre for Values, Ethics and Law in Medicine at the University of Sydney. "In circumstances where public concern and outrage is high even though the absolute risk is low, good quality scientific studies are not enough to ensure we protect the public's health. It's equally important to have the best approach to communicating with the public. "In situations of public health and environmental concerns - such as vaccinations, water fluoridation and the risk of Ebola outbreaks in Australia - officials and experts are often anxious that community criticism of proven health interventions will prevent good policy. But our research suggests that trying to shut off this criticism can make things worse, particularly as it's now almost impossible to effectively control the flow of information on social media." Dr Hooker said there were best-practice strategies that Australian public health and environmental officials could look to adopt. "Research shows that when people are emotional about an issue they have more difficulty hearing and processing information, and are more likely to pay attention to negative information. That's why the golden rule of successful risk communication is that people need to hear that you care before they will care about what they hear. 
Officials need to communicate early and often, be upfront about areas of uncertainty or complexity, and prioritise building trust over trying to push a message. "Actions, of course, speak far louder than words. People don't want the 'official line' on a topic, they want to know what actions are being taken. Finally, communication is most effective when public health officials engage directly with affected communities and with the media, including local and community-based social media. This way local communities know that authorities have integrity, are competent and can be trusted - the key to reassuring people and reducing outrage," said Dr Hooker. Dr Hooker's paper, "Communicating about risk: strategies for situations where public concern is high but the risk is low", appears in the latest issue of the Sax Institute's Public Health Research & Practice journal, which this month focuses on the theme of knowledge translation. "The transfer of evidence into the policy making process is rarely a simple and smooth process. The types of evidence used and the way that evidence is practically applied in policy processes vary, and that's why there is a focus on the skill of knowledge translation itself, to improve this process where possible," according to Guest Editor Dr Andrew Milat from the NSW Ministry of Health.


News Article | December 14, 2016
Site: www.chromatographytechniques.com

Older adults are less inclined to take risks, but this behavior may be linked to changes in brain anatomy rather than to age itself, according to a new study resulting from a collaboration between Yale and NYU. The finding adds to scientific understanding of decision making and may lead to strategies for modifying changes in risk behavior as people age. The study was published on Dec. 13 in Nature Communications. Research has demonstrated that older adults are less inclined to take certain types of risks, such as participating in a lottery. In a prior study, associate professor of comparative medicine and neuroscience Ifat Levy and colleagues documented a link between tolerance for taking risks and gray matter volume in an area in the back of the brain known as the posterior parietal cortex; the more gray matter young adults had, the more likely they were to take risks. In the new study, Levy and her co-authors, including first author Michael Grubb, a former NYU postdoc now at Trinity College, examined the phenomenon in older adults, who experience a natural decline in gray matter volume with age. The research team studied whether changes in gray matter volume in the posterior parietal cortex, or aging itself, accounted for older adults’ tendency to avoid risk. For the study, the research team presented a series of choices to 52 study participants, aged 18 to 88 years. Participants could either receive $5 or take their chances with a lottery of varying amounts and probabilities. For example, a participant could choose the certain gain of $5 or opt for a 25 percent chance of getting $20. Participants were each assigned a number denoting their level of risk tolerance based on their choices. The researchers also measured the gray matter volume in the posterior parietal cortex of each subject, drawn from MRI scans. 
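The lottery task described above lends itself to a quick illustration of how a single risk-tolerance number can be assigned from such choices. The sketch below is hypothetical (the article does not describe the study's actual estimation model): it uses a standard power utility function u(x) = x**alpha, where alpha below 1 corresponds to risk aversion and above 1 to risk seeking; all function and parameter names are illustrative.

```python
# Hypothetical sketch of scoring risk tolerance from certain-vs-lottery
# choices with power utility u(x) = x**alpha.
# (Illustrative only; not the model used in the Nature Communications study.)

def lottery_value(amount, probability, alpha):
    """Subjective value of winning `amount` with `probability` under power utility."""
    return probability * amount ** alpha

def prefers_lottery(certain, amount, probability, alpha):
    """True if the lottery is subjectively worth more than the certain payoff."""
    return lottery_value(amount, probability, alpha) > certain ** alpha

# The article's example: a sure $5 versus a 25% chance of $20.
# At alpha = 1 (risk neutral) the expected values tie (0.25 * 20 = 5),
# so any degree of risk aversion tips the choice toward the sure $5.
print(prefers_lottery(5, 20, 0.25, alpha=0.8))  # risk averse: False
print(prefers_lottery(5, 20, 0.25, alpha=1.2))  # risk seeking: True
```

Fitting alpha to a participant's full set of accept/reject decisions would yield one number per person, analogous to the risk-tolerance scores the researchers assigned.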
After analyzing the risk choices and MRI measurements, the researchers confirmed that age-related decline in risk tolerance correlates more with changes in brain anatomy than with age. “We found that if we use both the gray matter volume and age together as predictors of risk attitudes, the gray matter volume is significant, while age is not,” said Levy. “This means that gray matter volume accounts for age-related changes in risk attitude more than age itself.” The finding provides new insight into neurological factors that affect risk preferences and decision making among older adults. It may also lead to strategies for modifying decision making. “We could use this understanding in order to try to, behaviorally or pharmacologically, change flawed decision making,” said Levy. “By understanding the basic processes at the core of complex behavioral changes, we facilitate ways to intervene and improve decision making.” Other study authors include Agnieszka Tymula (University of Sydney), Sharon Gilaie-Dotan (Bar Ilan University), and Paul W. Glimcher (NYU). The study was supported by grants from the National Institute on Aging (National Institutes of Health).


News Article | April 27, 2016
Site: www.nature.com

Not many graduate students who spend 50–60 hours in the laboratory each week are eager to take on an outside job — especially one that pays nothing. But Michael Lang, a PhD student in cell and developmental biology at the University of Michigan in Ann Arbor, has added two part-time, unpaid positions to his workload. He's the president of miLEAD Consulting, an independent, non-profit company based in Ann Arbor that connects the university's graduate students and postdoctoral researchers with local biotechnology and health-care companies that need help with product development, market analysis or branding. And he works directly for miLEAD to provide his own insights and analyses to companies. Lang thinks that the long hours are worth it. The consulting work helps him to build leadership and management skills that would come in handy if he were to reach his ideal goal of running an academic lab. And if that doesn't work out, he'll have a fall-back position: “I've always wanted to be a scientist, but a US$130,000 job at a top consulting firm sounds pretty good too.” Lang's group is one of several consulting organizations that have sprung up on US campuses in the past few years. They supply teams of postdocs and graduate students who can take a scientific approach to common questions faced by local biotechnology and pharmaceutical start-ups — what is the demand for a new product, what is the competition, what can be done to make a product better and what is the best way to profit from a good idea? Consultants do not always know how companies use their input or whether their advice makes a difference, but the value of the experience is undeniable. “We want to give people another bullet point on their CV,” Lang says. “It can get them over the hurdle to getting a job.” A few of these consulting groups, including miLEAD, are independent, non-profit companies with no official ties to their home institute. 
But most are affiliated with their host institutions, including Harvard University in Cambridge, Massachusetts, Stanford University in California and the University of Pennsylvania in Philadelphia. Such campus-based organizations haven't caught on outside the United States, but at least one global company, 180 Degrees Consulting, recruits postdocs and graduate students for consulting projects and gives scientific trainees in the United Kingdom and elsewhere a chance to add to their skill set. Whatever group they work for, trainees in consulting get valuable experience in analysis, decision making and team-based problem solving that can give them a boost in the job market. It is also a break from the normal routine. “Fast-paced teamwork can be a lot of fun,” says Huadi Zhang, a medical-science PhD student and co-president of Harvard Graduate Consulting Club. “I didn't have that kind of experience in the laboratory.” But on-the-side consulting is also a serious commitment and time drain — and there are several hoops to be jumped through if students want to start a group from scratch (see 'How to start a consultancy'). The field is not for everyone, but an increasing number of trainees have found that it is possible to consult their way into a career. For Lang, consulting has turned into a second life outside the lab. He estimates that he spends 10–15 hours a week fulfilling his duties as president of miLEAD: overseeing the search for clients, recruiting consultants and, importantly, training them in the basics of business. Working on a project — which might involve meeting with a company's board, talking to doctors or digging through research articles — generally takes him another 10–15 hours each week. These are huge time commitments for a graduate student with experiments to run and papers to write. But it's worth it, he says, for the boost it gives to his CV and research. “The additional work has helped me streamline my science,” he says. 
“There's not a lot of downtime in the lab.” Lang's recent projects include an eight-week gig for a Michigan pharmaceutical company that is developing a therapeutic drug for newborns. (Because of non-disclosure agreements, he cannot name the company.) He and his team studied the market for the drug, scoped out the competition and gauged its potential applications in neonatal medicine. Previously, he was on a team that spent four weeks assessing an app-based learning tool for college students that was developed at the University of Michigan. Lang says that miLEAD brought in $6,000 in revenue in 2015 and is aiming for $12,000 in 2016. The board uses all of the revenue for group-related activities, including flying in speakers for panel discussions and funding team-building gatherings. If the coffers get sufficiently full, Lang hopes to start a grant programme to help local businesses to get off the ground. miLEAD's fees for client companies are a tiny fraction of what a big-time consulting company would charge, but they underscore the professionalism of the process. “We treat this like a business,” he says. “If money is involved, better work gets done.” Conversely, Zhang says that the Harvard Graduate Consulting Club has no plans to start charging clients. “It's a way for us to give back to the community,” he notes. Although it is likely that local start-ups get some value from their consulting, improving a company's bottom line is not the main point of the exercise. “It's a learning experience for us,” says Zhang. Consulting organizations are starting to pop up on other campuses, giving more postdocs and graduate students a chance to try out the field. Simran Madan, a PhD student in translational biology at Baylor College of Medicine in Houston, Texas, is helping to kick-start consulting services as senior vice-president of the Consulting Club at the Texas Medical Center in Houston. 
This independent, non-profit group is drawing talent from several local institutions, including Baylor and the University of Texas Health Science Center and MD Anderson Cancer Center in Houston. The group aims to begin offering consulting services by the end of the year. For now, Madan and club president Redwan Huq, a Baylor PhD student in molecular physiology and biophysics, are learning how to recruit potential consultants, provide training, structure consulting teams and attract clients. The plan is to charge local companies about $500 for 6 weeks of work analysing a product and coming up with a marketing or development plan, a price that should be attractive to cash-strapped start-ups. “Professional consultants are expensive, and you almost never see a start-up hiring a firm,” Madan says. “But they can get the same sort of analysis from a trainee.” One source of inspiration for Madan and Huq is the BALSA (Biotechnology and Life Sciences Advising) group, a successful consulting organization at Washington University in St Louis, Missouri. BALSA, which started in 2011, has 100 active members who participate in around 40 projects a year. About 60% of the members are science PhD students, 30% are science postdocs and a few are business or law students. Each job lasts six weeks, and each team includes three consultants, a project manager and an adviser. Most of the work involves product development and market analysis for local start-ups and entrepreneurs in the biotechnology, agriculture and health-care industries. The group also has clients in South Dakota; San Francisco, California; and Philadelphia, Pennsylvania, says Shivam Shah, who is the BALSA president and a PhD student in biomedical engineering at Washington University. A frequent BALSA client is Washington University's Office of Technology Management, which has often hired the team to help evaluate patent applications from faculty members. 
Shah says that the group tries to avoid having students evaluate their direct supervisors, but that is not always possible. Students aim to judge patent applications strictly on their scientific merit and real-world potential, he says. Since joining the group in 2013, Shah has worked on more than 20 projects as either a consultant or a project manager. Working on multiple projects has given him a chance to fine-tune his management style and learn more about the scientific marketplace, he says. He hopes to land a consulting job soon after getting his degree, perhaps with a health-care venture-capital firm looking for advice about wise places to invest. But a consulting career is hardly the only destination for BALSA members. Many have ended up working in industry as research scientists, patent specialists or consultants for companies such as the multinational agrochemical company Monsanto, based in St Louis, Missouri, and the New York-based computing giant IBM. And of the roughly 200 alumni of the programme, he estimates that about one-third have continued in academic careers. The skills learned in the consulting game — management, leadership and teamwork — would prove valuable to anyone running their own lab, Shah says. There is a paucity of organizations such as miLEAD and BALSA outside the United States, but early-career scientists in the United Kingdom, Europe and elsewhere can still get real-life consulting training. One option is a position with 180 Degrees Consulting, a global organization with branches in Cambridge, UK; King's College London; Munich, Germany; the University of Tokyo; the University of Sydney; and the University of California, Los Angeles, among many other sites. The company enlists students and postdocs to provide pro bono consulting to non-profit and humanitarian organizations around the world. 
Although the work generally is not focused on scientific issues, science PhD students and postdocs can bring valuable skills to the organization, says Daniel Jiang, a PhD student in computer science who in 2015 founded the 180 Degrees Consulting branch at King's College London. “I know more about data sets than a political-science major does,” he says. Jiang's group is working with a children's charity and sports charity in London, and a school in the Philippines. The company attracts people who want to make a positive difference in the world, Jiang says, but there are benefits for the consultants themselves. “It's a great opportunity for students to find out about a different career before they graduate,” he says. Lang of miLEAD is still technically a student, but he's racking up professional-grade experience and isn't slowing down: he'll jump into two new projects as an adviser this summer. He can't discuss details, but the big picture is clear: he'll be working long hours, thinking about tough problems and moving closer to a postgraduate career. Are the long days worth it? That's a cost–benefit analysis that he has figured out on his own, no consultant required.


News Article | November 3, 2016
Site: scienceblogs.com

Eric Hoffman (Children’s National Medical Center) presented work on chronic inflammatory diseases in children. He mentioned that while diets high in fats and carbohydrates (i.e., Western diets), obesity and sedentary lifestyles are associated with inflammation and related diseases (e.g., asthma, type 2 diabetes), another contributor could be hormones. Kids who stay indoors more often have reduced exposure to sunlight and exercise less. This may alter the normal biological clock of these kids because their stress hormone levels stay high all day as opposed to peaking at certain times. This constant exposure to elevated stress hormones may in turn contribute to the development of inflammation and its associated diseases. Monika Fleshner (University of Colorado, Boulder) presented research on the relationship between exercise, microbes within the gut, and stress. Not surprisingly, her team found that exercise reduced inflammation and depression. What was interesting was that exercise was also associated with a shift toward populations of gut microbes that are associated with health. Who knew exercise could benefit our gut microbes? David James (University of Sydney, Australia) presented his work exploring how proteins in the body are affected by exercise.


News Article | February 22, 2017
Site: www.techtimes.com

When used by males aged 65 and above with low levels of the sex hormone, testosterone replacement therapy has led to health benefits and adverse effects alike, a lineup of five new studies has revealed. Testosterone levels in men naturally start to decline as they age. The loss of this sex hormone can have negative health effects, leading some male patients to undergo therapy to replenish its levels artificially. The group of five papers, published in JAMA and JAMA Internal Medicine, probed the role of testosterone treatment in different areas of health. Four of them delved into the effects on anemia, bone density, cognitive function and coronary plaque buildup. The last study, an observational one, examined the association between testosterone therapy and overall cardiovascular well-being. Researchers from University of Pennsylvania’s Perelman School of Medicine, with support from the National Institutes of Health (NIH) and testosterone product maker AbbVie, conducted seven clinical trials. Their Testosterone Trials (TTrials) covered 788 males aged 65 and above who had low testosterone levels. Susan S. Ellenberg, the study’s lead biostatistician, clarified that the TTrials amounted to a single trial with a single randomization. "People could be in only one trial, or they could be in multiple trials, but it was all sub-studies under one big umbrella trial," she said. Here are some key findings:

Cognition Trial — No improvements found in memory and other measures of cognition for men using testosterone gel.

Cardiovascular Trial — More plaque buildup in the coronary arteries of men receiving testosterone, but the number of heart attacks and other cardiovascular events was similar for males in the testosterone and placebo groups.

Anemia Trial — Treatment helped correct anemia without identifiable cause as well as anemia stemming from iron deficiency.

Last year, the researchers published the first results from the TTrials, including improved sexual function and mood. Senior author Dr. 
Peter Snyder said that the TTrials were designed to investigate testosterone therapy’s effectiveness rather than its risks, and that they did not address long-term consequences. For Ellenberg, on the other hand, there’s no “overwhelming single answer” to the question of testosterone therapy’s positive health impacts. “Treating 788 men for one year is far too few to draw conclusions,” Snyder added, referring to the significance of the increased coronary artery plaque and cardiovascular risk and stressing the need for a larger, longer-term trial in the future. For some experts, the results of the $50 million trial were a letdown. "The hopes for testosterone-led rejuvenation for older men are dimmed and disappointed if not yet finally dashed," said University of Sydney professor David J. Handelsman in an accompanying editorial. Testosterone prescription for age-related deficiency took off in 2000, with the advent of rub-on products led by the AbbVie-manufactured Androgel. In the United States (as in Canada), up to 73 percent of testosterone prescribers are primary care providers, and only 18 percent of patients undergo two blood tests prior to treatment. There was also a four-fold rise in the use of the therapy in the country from 2000 to 2011. The FDA cracked down on over-prescription two years ago, warning of increased heart attack and stroke risks. Concerns around testosterone therapy also led the NIH to fund the TTrials.


News Article | November 15, 2016
Site: www.sciencenews.org

Australia’s early settlers hit the ground running, or at least walking with swift determination. After arriving on the continent’s northwest coast by around 50,000 years ago, humans reached Australia’s southeastern interior within a thousand years or so, researchers find. This ancient trip covered more than 2,000 kilometers through terrain that, although stark and dry today, featured enough lakes and rivers at the time of Australia’s colonization to support long-distance treks, say archaeologist Giles Hamm of La Trobe University in Melbourne, Australia, and colleagues. Excavations at Warratyi rock-shelter indicate that it took only a few millennia for Australia’s early colonists to forge a distinctive Aboriginal culture that continued to develop over the next 40,000 years, Hamm’s team reports online November 2 in Nature. “Archaeological finds at Warratyi are surprisingly old and significant, especially coming from an excavation of only a meter of sediment,” Hamm says. These new discoveries are “remarkable and atypical” for Australia, says archaeologist Peter Hiscock of the University of Sydney. But the finds’ ages and significance for understanding Aboriginal culture will be debated, he predicts. Until now, the oldest human sites in Australia’s huge, arid interior dated to no more than 44,000 years ago in the continent’s northwest, not far from where the first settlers presumably arrived. Lake Mungo, now a dry lake bed in southeastern Australia, has yielded artifacts from about 50,000 years ago. Unlike artifacts at Warratyi that represent human activity over a long time span, it’s not known whether the Lake Mungo finds come from a group that made an isolated foray into the region before dying out within a few generations. Hamm’s group unearthed evidence of an intermittent human presence at Warratyi that lasted from around 49,000 to 10,000 years ago.
People were largely absent between around 35,000 and 17,000 years ago, when the climate became substantially colder and drier, Hamm says. Finds at Warratyi dating to between 49,000 and 46,000 years ago include stone tools and a piece of reddish pigment. Bones from 16 mammal species and one reptile species were unearthed from various layers of sediment. Of particular interest were a partial leg bone from an extinct, rhino-sized marsupial and eggshells from a large, flightless bird. These animals died out not long after humans reached Australia, but it hasn’t been clear whether humans contributed to the extinctions via hunting or other actions (SN: 1/20/07, p. 38). Warratyi probably won’t resolve that issue. No butchery marks from stone tools appear on the marsupial fossil, although people may still have hunted the creature. Possibly burned areas appear on some eggshell fragments. Recent evidence from other Australian sites indicates that people were cooking this extinct bird’s eggs between 54,000 and 43,000 years ago. Other discoveries at Warratyi indicate Aboriginal people there made a variety of tools up to 10,000 years before similar tool types were known to have occurred elsewhere in Australia or in Southeast Asia, the scientists say. For instance, a 4-centimeter-long bone point that dates to more than 38,000 years ago is Australia’s earliest known bone tool. Comparably ancient discoveries include fragments of resin, which was probably used to glue stone tools to handles of some type. Tool handles probably came into use even earlier than that Down Under, argues archaeologist Sandra Bowdler of the University of Western Australia in Crawley. Researchers generally agree that, in Australia, stone cutting implements with ground, beveled edges were once attached to handles, Bowdler explains. A team led by Hiscock recently dated a ground-edge tool found in northwest Australia to between 49,000 and 44,000 years ago. 
That means handle use started there before it appeared at Warratyi, Bowdler holds. Tools displaying sharpened edges along one side appear at Warratyi between 30,000 and 24,000 years ago. While Hamm’s team regards these as the oldest such implements in Australia, Bowdler awaits more thorough dating of Warratyi sediment layers before accepting that conclusion. To date artifacts, Hamm’s group calculated the time since buried sediment was last exposed to sunlight and conducted radiocarbon analyses of charcoal from ancient hearths and of eggshell fragments. Questions remain about the age of the oldest Warratyi discoveries, says geochronologist Richard Roberts of the University of Wollongong in Australia. Two samples of the deepest artifact-bearing sediment were dated to around 44,000 to 43,000 years ago, whereas three radiocarbon dates of eggshells from the same sediment ranged in age from possibly more than 50,000 years to perhaps more than 45,000 years, in Roberts’ view. If the younger age is the correct one, then Warratyi finds are no older than those previously discovered at Riwi rock-shelter, another site in Australia’s arid interior. If older than 50,000 years, Roberts says, “the Warratyi artifacts would be among the oldest on the continent.” Editor's note: This article was updated on November 9, 2016,  to correct the layers in which animal bones were found at Warratyi, the age of ground-edge tools found in northwest Australia, and the number and types of samples dated in the deepest artifact-bearing layer at Warratyi.


News Article | March 4, 2016
Site: www.nature.com

Public-health workers are still struggling to stamp out the Ebola epidemic in West Africa. But the lessons learned from that outbreak — which exposed major flaws in the global public-health system — are shaping the escalating international response to the spread of Zika virus in the Americas. “Ebola is the gorilla in the room,” says Lawrence Gostin, a health-law and policy specialist at Georgetown University in Washington DC. “It’s driving everything.” He and others say that governments and international public-health agencies seem determined not to repeat the main mistake that they made with Ebola: waiting for much too long to respond to a brewing outbreak. The delay allowed Ebola to grow so out of control in West Africa that the epidemic there persists after more than 2 years and 11,000 deaths. By contrast, the global health community has moved aggressively against Zika, beginning with a declaration from the World Health Organization (WHO) on 1 February that the clusters of microcephaly and other neurological disorders that have appeared in Brazil coincident with an outbreak of the virus, and previously in French Polynesia, constitute an international public-health emergency. The WHO had never before made such a declaration without knowing the cause of the condition of concern. The August 2014 declaration that Ebola was a public-health emergency came after the disease had been spreading in West Africa for 8 months and had killed 932 people. But although Zika has probably infected as many as 1 million people in the latest outbreak, the vast majority have recovered. And scientists have not proved a link between Zika and microcephaly, a condition in which infants are born with abnormally small heads and brains. “The WHO has perhaps gotten out ahead of its usual position of gathering and verifying all the evidence before taking a clear position,” says Adam Kamradt-Scott, a health-security specialist at the University of Sydney in Australia.
“The WHO couldn’t afford to be seen to be asleep at the wheel a second time.” Other authorities have taken similarly bold action. On 3 February, the US Centers for Disease Control and Prevention (CDC) moved its emergency-response operations centre to its highest activation level, jump-starting US government research into, and surveillance of, the Zika virus. On the same day, the United Kingdom announced the creation of a Zika research fund with an initial budget of up to £1 million (US$1.4 million). And on 8 February, US President Barack Obama requested $1.8 billion from lawmakers for Zika-response activities. (By comparison, Obama’s $6.18-billion request for Ebola-response funding came 3 months after that virus was declared a global emergency.) The ongoing mobilization against Zika is not an over-reaction, says Suerie Moon, a global-health researcher at the Harvard T. H. Chan School of Public Health in Boston, Massachusetts. Although Zika — unlike Ebola — is not usually fatal, it has the potential to cause suffering and social and economic havoc. “It’s encouraging to see leadership and mobilization from WHO, CDC and other public-health institutions,” Moon says. “It shows that some of the lessons from Ebola have been digested.” WHO director-general Margaret Chan has acknowledged the agency’s failings on Ebola, citing “inadequacies and shortcomings in this organization’s administrative, managerial and technical infrastructures” in a speech last year. The Zika response also highlights persistent flaws of the global public-health system. Zika was first discovered in Africa in 1947, and caused a major outbreak in 2013 in the Pacific islands, but there is still no vaccine, treatment or common diagnostic test for the virus. Kamradt-Scott wonders if the world would be tracking Zika’s spread so closely if the virus had not emerged in Brazil, where hundreds of thousands of tourists are scheduled to attend the Olympic Games in August.
“My own perception is that the international community hasn’t responded particularly swiftly to Zika,” he says. Moon notes that although the WHO is trying to ensure that researchers in government, academia and industry share data on the outbreak, drug companies developing Zika vaccines have not publicly agreed to participate. The WHO has long struggled to modulate its response to global-health crises, Gostin says. After it was criticized for reacting too strongly to the 2009 H1N1 influenza epidemic — declaring a full-scale pandemic, when the virus itself did not prove as deadly as was initially feared — it dialled back its response to the Ebola outbreak. Now the WHO is mounting an urgent response to Zika, in light of criticism of its reaction to Ebola. To Gostin, this inconsistency reinforces a perception that the WHO acts mainly on the basis of political, not medical, factors. “We need to stop fighting the last war,” he says.


News Article | November 17, 2016
Site: www.sciencemag.org

The world is in the grip of its seventh cholera pandemic, but that’s not exactly news. Today’s pandemic has been around since the 1960s, burning through developing countries like the Democratic Republic of Congo and Haiti. Now, scientists have used DNA from historical samples to figure out how the modern strain, responsible for 1,304 deaths last year alone, morphed from a harmless microbe centuries ago into a deadly pathogen today. Cholera, caused by the bacterium Vibrio cholerae, produces watery diarrhea that can lead to dehydration and death. Because Vibrio spreads through contact with raw sewage, it thrives in regions lacking clean water or modern sanitation. With proper treatment, mortality is low, but the deadly bacterium’s ability to spread rapidly has kept it at the forefront of global public health efforts. To find out how the current pandemic got its start, scientists looked at DNA from preserved cholera samples in laboratories around the world. In the same way scientists create evolutionary trees, the researchers made a branching map of relationships between strains through time using genetic comparisons. Combining their analysis with written accounts of old outbreaks, the team defined six stages in the evolution of the modern strain that led to its toxicity and its ability to spread, they report this week in the Proceedings of the National Academy of Sciences. Strain No. 7 diverged from its relatives, strains 1–6, around the turn of the 18th century, says paper author Peter Reeves, a microbiologist at the School of Life and Environmental Sciences at the University of Sydney in Australia. But that’s just an estimate: the first observation of the new lineage comes from a laboratory in El Tor, Egypt, in 1897. By that time, the “El Tor” strain differed from its relatives by about 30%, but it didn’t spread rapidly and it didn’t make people sick. The next decade was pivotal for the bacterium’s evolution.
It bounced around the Middle East, picking up a key gene called tcpA, which encodes a hairlike structure on its surface that clings to the wall of the small intestine. This change alone didn’t make the strain pathogenic, but it may have helped it live longer in the guts of religious pilgrims traveling to and from Mecca. Then, sometime between 1903 and 1908, the El Tor strain picked up a crucial piece of hitchhiking DNA that likely triggered its ability to cause disease in humans. At the time, the sixth cholera pandemic was in full swing in the Middle East, Europe, and North Africa. Reeves says that a phage—a virus that infects bacteria—picked up the “classic” form of the cholera toxin gene from the sixth strain and then infected the El Tor strain, transferring the toxin gene. The new trait would have caused watery diarrhea, accelerating the disease’s spread through the water supply. But it was still missing the key genes it would need before it could cause a full-blown pandemic. From there, the strain moved east to Makassar, Indonesia. There, it gained new genes that likely increased transmissibility, along with the two “islands” of DNA used to identify it as unique today, Vibrio seventh pandemic (VSP) 1 and 2. However, Reeves says there’s very little evidence that the VSP islands did much, if anything, to help the bacteria spread. In fact, the researchers are still unsure which changes drove the increase in transmissibility from 1925 until 1961, when the disease spread around the rest of the world. To find out which changes accelerated the organism to pandemic levels, we would have to do human testing, says Reeves, infecting a large group with different strains to see how fast it spreads. “If we can understand what was happening in this pandemic strain, it might help us make predictions about whether any of the other ones have the potential,” he says. 
But, “of course you can’t do that as an experiment.” Given the information available, “this is a perfectly fine study,” says Edward Ryan, a microbiologist at Harvard University who was not involved in the new work. “It tells a coherent story that is very biologically plausible.” But it misses a big part of the picture, says microbiologist Mark Achtman at the University of Warwick in Coventry, England. “I am skeptical about the completeness of the evolutionary story when it is based exclusively on the remnants of human disease isolates.” Achtman says that analyses limited to human samples are lacking because they don’t examine the cholera always circulating in the environment, outside human hosts. At any point in history, he says, new strains could have arisen from these pathogens rather than from human-to-human transmission. But that would be extremely tough to nail down, Ryan says. Few historical environmental samples exist, and evidence that such strains made the jump to humans is even harder to come by. “Were there other strains in other areas of the world that weren’t collected that might give a more complete story?” Ryan asks. “Yes,” he says. “But at the end of the day … [this] is a reasonable biological story based on the best available data of how a pathogen emerges historically and came to dominate.”
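The "branching map of relationships between strains" described above is a phylogenetic tree, and one classic way to build such a tree from pairwise genetic differences is agglomerative (UPGMA) clustering: repeatedly merge the two closest groups. The sketch below is purely illustrative — it is not the authors' actual pipeline, and the strain names and distances are invented.

```python
# Toy UPGMA clustering: repeatedly merge the two closest clusters,
# averaging distances weighted by cluster size. Distances are invented
# for illustration and are NOT from the cholera study.

def upgma(dist, names):
    """dist: dict mapping frozenset({a, b}) -> distance; names: leaf labels."""
    clusters = {name: 1 for name in names}  # cluster label -> number of leaves
    dist = dict(dist)                       # work on a copy
    while len(clusters) > 1:
        # find the closest pair of clusters
        a, b = min(
            ((x, y) for x in clusters for y in clusters if x < y),
            key=lambda p: dist[frozenset(p)],
        )
        merged = f"({a},{b})"
        size_a, size_b = clusters[a], clusters[b]
        for c in clusters:
            if c not in (a, b):
                # size-weighted average distance to the new merged cluster
                d = (dist[frozenset((a, c))] * size_a
                     + dist[frozenset((b, c))] * size_b) / (size_a + size_b)
                dist[frozenset((merged, c))] = d
        del clusters[a], clusters[b]
        clusters[merged] = size_a + size_b
    return next(iter(clusters))

# Invented pairwise distances between three hypothetical strains
d = {
    frozenset(("strainA", "strainB")): 0.1,
    frozenset(("strainA", "strainC")): 0.4,
    frozenset(("strainB", "strainC")): 0.5,
}
tree = upgma(d, ["strainA", "strainB", "strainC"])
print(tree)  # strainA and strainB, the closest pair, merge first
```

UPGMA assumes a roughly constant rate of change across lineages; real analyses like the cholera study use far more sophisticated likelihood- or Bayesian-based methods, but the merge-the-closest-pair intuition is the same.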


News Article | December 8, 2016
Site: www.eurekalert.org

New genetic research from an international team including McMaster University, the University of Helsinki, Vilnius University and the University of Sydney suggests that smallpox, a disease that caused millions of deaths worldwide, may not be an ancient scourge but a much more modern killer that went on to become the first human disease eradicated by vaccination. The findings, published in the journal Current Biology, raise new questions about the role smallpox may have played in human history and fuel a longstanding debate over when the virus that causes smallpox, variola, first emerged and later evolved in response to inoculation and vaccination. "Scientists don't yet fully comprehend where smallpox came from and when it jumped into humans," says evolutionary geneticist Hendrik Poinar, senior author of the study, director of the McMaster Ancient DNA Centre and a researcher with the Michael G. DeGroote Institute of Infectious Disease Research. "This research raises some interesting possibilities about our perception of the age of the disease." Smallpox, one of the most devastating viral diseases ever to strike humankind, had long been thought to have appeared in human populations thousands of years ago in ancient Egypt, India and China, with some historical accounts suggesting that the pharaoh Ramses V (who died in 1145 BC) suffered from smallpox. In an attempt to better understand its evolutionary history, and after obtaining clearance from the WHO in Geneva, scientists extracted heavily fragmented DNA from the partially mummified remains of a Lithuanian child believed to have died between 1643 and 1665, a period in which several smallpox outbreaks were documented throughout Europe with increasing levels of mortality. The smallpox DNA was captured and sequenced, and the ancient genome, one of the oldest viral genomes to date, was completely reconstructed. There was no indication of live virus in the sample, so the remains are not infectious.
Researchers compared and contrasted the 17th-century strain to those from a modern databank of samples dating from 1940 up to the disease's eradication in 1977. Strikingly, the work shows that the evolution of the smallpox virus occurred far more recently than previously thought, with all the available strains of the virus having an ancestor no older than 1580. "This study sets the clock of smallpox evolution to a much more recent time-scale," said evolutionary biologist Eddie Holmes, a professor at the University of Sydney, Australia, "although it is still unclear what animal is the true reservoir of smallpox virus and when the virus first jumped into humans." The poxviral strains that represent the true reservoir for human smallpox remain unsampled. Both of the closest known relatives, the gerbil poxvirus Taterapox and camelpox, are very distantly related and consequently are not likely ancestors of smallpox, suggesting that the real reservoir remains at large or has gone extinct. Researchers also discovered that the smallpox virus evolved into two circulating strains, variola major and variola minor, after English physician Edward Jenner famously developed a vaccine in 1796. One form of VARV (variola virus), known as V. major, was highly virulent and deadly; the other, V. minor, was much more benign. However, scientists say, the two forms experienced a 'major population bottleneck' with the rise of global immunization efforts. The date of the ancestor of the minor strain corresponds well with the Atlantic slave trade, which was likely responsible for its partial worldwide dissemination. "This raises important questions about how a pathogen diversifies in the face of vaccination. While smallpox was eradicated in human populations, we can't become lazy or apathetic about its evolution - and possible reemergence - until we fully understand its origins," says Ana Duggan, a postdoctoral fellow in the McMaster Ancient DNA Centre.
Whether the date of the ancestor, approximately 1580, precludes smallpox, introduced by the Spanish, as the cause of the massive destruction of Indigenous populations in Central America remains questionable. To that end, researchers must carefully examine the remains of individuals buried in epidemic burials in Central and South America, say the scientists. "This work blurs the line between ancient diseases and emerging infections. Much of smallpox evolution apparently happened in historic time," says Margaret Humphreys, a historian of medicine at Duke University. Additional quotes from collaborators: "I am excited to see that these remains from the Holy Spirit crypt, once scheduled to be buried, are now revealing so much about the health conditions of past Vilnius inhabitants. This research is yielding extraordinary information and we should especially be grateful to those unnamed people that still tell us stories after centuries." - Dario Piombino-Mascali, Vilnius University. "Indeed, behind our rear window is another world; the time machine through which we call Archaeovirology," say postdoctoral fellow Maria Perdomo and professors Klaus Hedman and Antti Sajantila of the University of Helsinki. McMaster provides a high definition broadcast studio that can connect with any television broadcaster around the world. To book an interview please contact:
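The "clock" Holmes refers to is a molecular clock: mutations accumulate roughly linearly over time, so regressing each sampled genome's divergence from a reference against its sampling date estimates the mutation rate, and extrapolating back to zero divergence dates the common ancestor. The numbers below are invented (they are not the study's data) and are chosen so the toy regression lands on an ancestor date of about 1580, mirroring the result reported above.

```python
# Toy root-to-tip regression: fit divergence = rate * (year - t_ancestor),
# i.e. divergence = slope * year + intercept, then recover the ancestor
# date as the x-intercept. Sample years and divergences are invented.

years = [1650.0, 1900.0, 1950.0, 1975.0]       # sampling dates
divergence = [7.0, 32.0, 37.0, 39.5]           # e.g. substitutions vs. a reference

n = len(years)
mean_x = sum(years) / n
mean_y = sum(divergence) / n

# ordinary least-squares slope and intercept
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, divergence))
sxx = sum((x - mean_x) ** 2 for x in years)
slope = sxy / sxx                               # substitutions per year
intercept = mean_y - slope * mean_x
t_ancestor = -intercept / slope                 # year where fitted divergence is zero

print(round(slope, 3), round(t_ancestor))       # → 0.1 1580
```

Real tip-dating analyses model rate variation and uncertainty explicitly (typically in a Bayesian framework); the x-intercept of a root-to-tip regression is only a quick diagnostic, but it captures the core idea behind "setting the clock".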


News Article | February 25, 2017
Site: www.techtimes.com

Back pain affects about 700 million people and is the leading cause of disability worldwide. Now, a new study shows that it is not just uncomfortable and debilitating, but also a potential indicator of early death.

Examining the health and death records of 4,390 older Danish twins, a team of researchers from the University of Sydney in Australia found that those who reported lower back pain had a substantially greater chance of dying sooner than their peers. The team accessed the data from the twin subjects and looked at the link between the presence of back pain and mortality rates. They discovered that people with symptoms of back and neck pain had a 13 percent greater chance of dying than those without any spinal symptoms.

Physiotherapy researcher and senior study author Dr. Paulo Ferreira deemed the finding significant, as back pain is typically not considered life-threatening. "As this study was done in twins, the influence of shared genetic factors is unlikely because it was controlled for in our analysis,” he explained in a statement. The 13 percent figure is a relative increase in risk that held across each year of follow-up. For Ferreira, the pain could be part of a broader pattern of poor health and poor functional ability, raising the risk of death in the older population.

Ferreira then emphasized the value of staying healthy and fit, amid growing evidence that surgery and medications are largely ineffective as treatments. People who exercise more, he cited as an example, have a lower chance of suffering from back pain. “Even if you develop back pain and you are physically active, your prognosis is going to be much better as well, compared to those who just decide to stay at home and lie in bed,” he said.

The new research follows earlier findings that depressed individuals are 60 percent more likely to have low back pain in their lifetime.
In Australia, approximately 4 million people suffer from some type of back pain, and the condition costs the Australian economy $1 billion annually in treatment alone. The findings were published in the European Journal of Pain.

This isn’t the first research to point to a seemingly non-deadly condition in the elderly as a potential sign of early death. Previous research showed that elderly women who break their hips are at a heightened risk of dying within a year of the injury. Most studies pointed to underlying poor health leading to the injury, not the injury leading to early mortality.

In the United States, lower back pain remains one of the most common reasons for doctor visits. Just recently, the American College of Physicians released new recommendations for back pain treatment, advising non-drug options such as tai chi, yoga, acupuncture, and mindfulness meditation before patients try prescription or over-the-counter pain medications. A national Consumer Reports survey also found that many back pain patients reported that alternative therapies worked for them. In the survey of 3,562 patients, almost 90 percent of those who tried yoga or tai chi said the techniques helped, while 84 percent and 83 percent said the same of massage and chiropractic care, respectively.
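A note on interpreting the headline number: the "13 percent greater chance of dying" is a relative (hazard-type) increase, not an absolute probability. Under a proportional-hazards assumption, a hazard ratio of 1.13 scales the cumulative hazard by 13 percent, which transforms a baseline survival probability S(t) into S(t)**1.13. The baseline survival values below are invented for illustration and are not from the Danish twin data.

```python
# Under proportional hazards, a hazard ratio of 1.13 raises the cumulative
# hazard by 13%, so S_exposed(t) = S_baseline(t) ** 1.13.
# Baseline survival probabilities below are illustrative only.

HAZARD_RATIO = 1.13

def survival_with_back_pain(baseline_survival):
    """Survival probability under a 13% higher hazard."""
    return baseline_survival ** HAZARD_RATIO

for years, s0 in [(1, 0.98), (5, 0.90), (10, 0.75)]:
    s1 = survival_with_back_pain(s0)
    print(f"{years:>2} yr: baseline {s0:.3f} -> with back pain {s1:.3f}")
```

The absolute difference is small while baseline survival is high and grows as baseline survival falls, which is one reason a modest hazard ratio can still matter in an elderly cohort.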


News Article | December 5, 2016
Site: www.eurekalert.org

HAMILTON, Dec. 5, 2016 - An analysis of 2,000-year-old human remains from several regions across the Italian peninsula has confirmed the presence of malaria during the Roman Empire, addressing a longstanding debate about its pervasiveness in this ancient civilization. The answer lies in mitochondrial genomic evidence of malaria, coaxed from the teeth of bodies buried in three Italian cemeteries dating back to the Imperial period of the 1st to 3rd centuries CE. The genomic data are important, say researchers, because they serve as a key reference point for when and where the parasite existed in humans, and provide more information about the evolution of human disease. "Malaria was likely a significant historical pathogen that caused widespread death in ancient Rome," says evolutionary geneticist Hendrik Poinar, director of McMaster's Ancient DNA Centre, where the work was conducted. A serious and sometimes fatal infectious disease spread by infected mosquitoes, malaria is responsible for nearly 450,000 deaths every year, the majority of them children under the age of five; its deadliest form is caused by the parasite Plasmodium falciparum. "There is extensive written evidence describing fevers that sound like malaria in ancient Greece and Rome, but the specific malaria species responsible is unknown," says Stephanie Marciniak, a former postdoctoral student in the Ancient DNA Centre and now a postdoctoral scholar at Pennsylvania State University. "Our data confirm that the species was likely Plasmodium falciparum, and that it affected people in different ecological and cultural environments. These results open up new questions to explore, particularly how widespread this parasite was, and what burden it placed upon communities in Imperial Roman Italy," she says. Marciniak sampled teeth taken from 58 adults and 10 children interred at three Imperial-period Italian cemeteries: Isola Sacra, Velia and Vagnari.
Located on the coast, Velia and Isola Sacra were known as important port cities and trading centres. Vagnari is located further inland and is believed to be the burial site of labourers who would have worked on a Roman rural estate. Using techniques developed at McMaster and abroad, researchers mined tiny DNA fragments from dental pulp taken from the teeth. They were able to extract, purify and enrich specifically for the Plasmodium species known to infect humans. It was a difficult and painstaking process, complicated by the very nature of the disease. Usable DNA is challenging to extract because the parasites primarily dwell within the bloodstream and organs, including the spleen and liver, which decompose and break down over time, in this instance over the course of two millennia. Marciniak, Poinar, and Tracy Prowse from McMaster, alongside Luca Bandioli from the Luigi Pigorini National Museum of Prehistory and Ethnography in Rome and Edward Holmes from the University of Sydney, recovered more than half of the P. falciparum mitochondrial genome from two individuals from Velia and Vagnari. P. falciparum remains the most prevalent malaria parasite in sub-Saharan Africa and the deadliest anywhere, responsible for the largest number of malaria-related deaths globally. The findings are published in the journal Current Biology.


News Article | December 12, 2016
Site: www.eurekalert.org

The Australian National University (ANU) has led an international project to make a diamond that is predicted to be harder than a jeweller's diamond and useful for cutting through ultra-solid materials on mining sites. ANU Associate Professor Jodie Bradby said her team - including ANU PhD student Thomas Shiell and experts from RMIT, the University of Sydney and the United States - made nano-sized Lonsdaleite, a hexagonal diamond found in nature only at meteorite impact sites such as Canyon Diablo in the US. "This new diamond is not going to be on any engagement rings. You'll more likely find it on a mining site - but I still think that diamonds are a scientist's best friend. Any time you need a super-hard material to cut something, this new diamond has the potential to do it more easily and more quickly," said Dr Bradby from the ANU Research School of Physics and Engineering. Her research team made the Lonsdaleite in a diamond anvil at 400 degrees Celsius, halving the temperature at which it can be formed in a laboratory. "The hexagonal structure of this diamond's atoms makes it much harder than regular diamonds, which have a cubic structure. We've been able to make it at the nanoscale and this is exciting because often with these materials 'smaller is stronger'." Lonsdaleite is named after the pioneering British crystallographer Dame Kathleen Lonsdale, one of the first two women elected as Fellows of the Royal Society. The research is published in Scientific Reports. Co-researcher Professor Dougal McCulloch from RMIT said the collaboration of world-leading experts in the field was essential to the project's success. "The discovery of the nano-crystalline hexagonal diamond was only made possible by close collaborative ties between leading physicists from Australia and overseas, and the team utilised state-of-the-art instrumentation such as electron microscopes," he said.
Corresponding author from the University of Sydney, Professor David McKenzie, said he was doing the night shift in the United States laboratory as part of the research when he noticed a little shoulder on the side of a peak. "And it didn't mean all that much until we examined it later on in Melbourne and in Canberra - and we realised that it was something very, very different."


News Article | October 31, 2016
Site: www.chromatographytechniques.com

A domestic cat in Australia tested positive for a drug-resistant strain of Salmonella, a first for the country. The cat, which was being housed at a shelter, was brought to Concord Veterinary Clinic in New South Wales with an upper respiratory infection, which developed into a gut infection during treatment. The bacterium was resistant to about nine classes of drugs, including carbapenems, the last line of defense against Salmonella in Australia. The cat's condition continued to deteriorate, and it was ultimately euthanized. "This is the first time that a Salmonella strain with resistance to most antimicrobial drugs has been reported in any Australian domestic animal and it is a significant concern to public health," said Sam Abraham, the Murdoch University researcher who led a study to identify the characteristics and risks of the Salmonella bug. Abraham was assisted by fellow researchers from Concord Hospital, Sydney University and Adelaide University. Analysis of the stool sample showed that the cat was infected with a Salmonella strain carrying the highly resistant IMP-4 gene. Eight other cats at the vet clinic were also tested for the superbug. Three tested positive, two of which had no direct contact with the sick cat, suggesting that the bacterium is highly transferable, according to the researchers. According to Richard Malik of the University of Sydney, who was brought in to oversee containment of the outbreak, one of the infected cats showed no symptoms; another was kept in the same room but did not have direct contact; and the third was being kept in a different room within the clinic. There is also a possibility that the bacterium built up its resistance through exposure to heavy metals, which the researchers believe is further increasing its resistance to common antimicrobial drugs. The researchers noted that the outbreak has been contained to just the few additional cats, and they have not received any other reports.
The only other time Australia has seen this level of antimicrobial resistance was in a seagull colony off New South Wales. How the birds were infected is still unknown. The study led by Abraham has been accepted for publication in Scientific Reports.


News Article | November 30, 2016
Site: www.eurekalert.org

Sydney Grammar students, under the supervision of the University of Sydney and global members of the Open Source Malaria consortium, have reproduced an essential medicine in their high school laboratories. The drug, Daraprim, had been the subject of controversy when the price was hiked from US$13.50 to US$750 a dose last year. Daraprim - originally used as an antimalarial after its synthesis by Nobel Prize winner Gertrude Elion - is now more widely used as an anti-parasitic treatment for toxoplasmosis, which can be a dangerous disease for pregnant women and people with compromised immune systems, such as those living with HIV or AIDS. Daraprim is listed by the World Health Organisation as an essential medicine. In September 2015, Turing Pharmaceuticals acquired the market rights to Daraprim and raised the price of a dose more than 5000 percent overnight. Its CEO at the time, Martin Shkreli, stuck by the price despite criticism, including from then US presidential candidate Hillary Clinton. To highlight the inequity of the monopoly, high school students in Sydney have been working with the Open Source Malaria consortium to make Daraprim in the laboratory using inexpensive starting materials, as part of the Breaking Good - Open Source Malaria Schools and Undergraduate Program. Scientists anywhere in the world were able to view all the data generated and mentor the students to accelerate the science, under the coordination of The University of Sydney's Dr Alice Williamson and Associate Professor Matthew Todd. Dr Williamson from the School of Chemistry said the scientific community could provide advice and guidance to the students online in real time. "The enthusiasm of the students and their teachers Malcolm Binns and Erin Sheridan was translated into a complete route in the public domain by the use of the Open Source Malaria platform," Dr Williamson said. "Anyone could take part and all data and ideas are shared in real time."
Associate Professor Matthew Todd said the innovative open-source approach lowered the barrier to participation by researchers outside traditional institutions, such as universities and pharmaceutical companies, allowing students to work on real research problems of importance to human health. "Daraprim may be quickly and simply made, bringing into question the need for such a high price for this important medicine," Associate Professor Todd said. The findings were presented at the 2016 Royal Australian Chemical Institute Organic One Day Symposium today. Open Source Malaria is supported by the Medicines for Malaria Venture and the Australian Government, as well as by an international network of contributors.


News Article | November 1, 2016
Site: www.prweb.com

Dr. Jin Y. Kim, a board-certified specialist in periodontology, is pleased to announce that he is now accepting new patients for treatment of gum disease in Diamond Bar, CA, without a referral. Dr. Kim uses an advanced laser dentistry treatment known as the LANAP® protocol, which precisely targets harmful bacteria while leaving healthy tissue unharmed. Dr. Kim recommends that all eligible patients with moderate to severe gum disease undergo this treatment in order to avoid the potential systemic health issues associated with the condition. Gum disease is a severe condition that affects over 64 million American adults, according to the Centers for Disease Control and Prevention. When patients fail to have the condition treated, it can have serious consequences. Researchers have now discovered that the bacteria that cause periodontal disease in the mouth may be responsible for, or contribute to, systemic health issues such as heart disease, diabetes and certain cancers. This is why it is imperative that patients with gum disease in Diamond Bar, CA seek treatment immediately. The LANAP® protocol is designed to clear away pockets of infection that form along the gum line while promoting healing of the area in a sterile environment. Because it is performed using laser dentistry, Dr. Kim can specifically target diseased tissue while leaving the healthy tissue intact and unharmed. Unlike traditional gum surgery, the LANAP® protocol is minimally invasive, scalpel-free and results in little to no pain or discomfort. Patients often report less sensitivity, less gum loss and very little downtime following the treatment when compared to traditional gum surgery. Patients who believe they may have some form of gum disease in Diamond Bar, CA are invited to contact Dr. Kim’s office to schedule an appointment to discuss how laser dentistry can help them reclaim their health.
New patients can schedule appointments at Dr. Kim’s Diamond Bar location by calling 909-860-9222. Dr. Jin Y. Kim is a periodontist dedicated to providing personalized dental care in Diamond Bar and Garden Grove, CA. Dr. Kim attended the University of Sydney Faculty of Dentistry before furthering his education with an advanced degree in pathology from the Medical School of the same University. Dr. Kim completed a periodontics and implant surgery residency at UCLA School of Dentistry. A uniquely dual board-certified specialist, Dr. Kim was board-certified by the American Board of Periodontology and the American Board of Oral Implantology/Implant Dentistry. The International Congress of Oral Implantologists and the American Academy of Implant Dentistry both gave him the title of Fellow. He was also inducted to be a Fellow of the prestigious American College of Dentists. Dr. Kim enjoys lecturing at UCLA School of Dentistry as well as national and international academic and clinical associations and universities including the International Association of Dental Research, American Academy of Periodontology and Academy of Osseointegration. To learn more about Dr. Jin Kim and the services he offers, visit his website at http://www.drjinkim.com or call (909) 860-9222 for the Diamond Bar location or (714) 898-8757 for the West Garden Grove location to schedule an appointment.


News Article | February 15, 2017
Site: www.prweb.com

Dr. Jin Kim, a world-renowned periodontist, honors American Heart Month by raising awareness of the link between gum disease and heart disease. To help as many people as possible overcome gum disease in Garden Grove, CA, and prevent the accompanying heart issues that may ensue if left untreated, Dr. Kim is now accepting new patients for laser gum disease therapy, with or without a referral. Gum disease is a serious condition that often goes unnoticed. Over 47 percent of the U.S. population has some form of gum disease, according to the Centers for Disease Control and Prevention. Many with gum disease are unaware they have the condition until it becomes serious. Early symptoms, including swollen and bleeding gums, do not immediately cause a great level of discomfort. As the disease progresses, patients may experience persistent bad breath, pain, receding gums, bone loss and tooth loss. If left untreated, the infection that causes the gum disease can travel throughout the body. Current research suggests that the risk of exacerbating heart conditions or creating them increases when gum disease is present. Recommended by an authority in periodontology, Dr. Kim, the LANAP® protocol is a leading, minimally invasive treatment for gum disease. This technique allows Dr. Kim to only target the bacteria that cause periodontal disease, so healthy gum tissue remains intact and untouched. This results in little bleeding and swelling, and patients are often able to return to their regular routines in a very short period of time. The treatment can also help the gums reattach to the teeth, limiting tooth loss. Patients who may be showing signs of gum disease in Garden Grove, CA are invited to schedule a consultation with Dr. Kim. Doing so can allow them to reduce their risk of heart disease while maintaining a healthy smile. Dr. Jin Y. Kim is a periodontist dedicated to providing personalized dental care in Diamond Bar and Garden Grove, CA. Dr. 
Kim attended the University of Sydney Faculty of Dentistry before furthering his education with an advanced degree in pathology from the Medical School of the same University. Dr. Kim completed a periodontics and implant surgery residency at UCLA School of Dentistry. A uniquely dual board-certified specialist, Dr. Kim was board-certified by the American Board of Periodontology and the American Board of Oral Implantology/Implant Dentistry. The International Congress of Oral Implantologists and the American Academy of Implant Dentistry both gave him the title of Fellow. He was also inducted to be a Fellow of the prestigious American College of Dentists. Dr. Kim enjoys lecturing at UCLA School of Dentistry as well as national and international academic and clinical associations and universities including the International Association of Dental Research, American Academy of Periodontology and Academy of Osseointegration. To learn more about Dr. Jin Kim and the services he offers, visit his website at http://www.drjinkim.com or call (909) 860-9222 for the Diamond Bar location or (714) 898-8757 for the West Garden Grove location to schedule an appointment.


VANCOUVER, BRITISH COLUMBIA--(Marketwired - March 2, 2017) - Tajiri Resources Corp. (the "Company") (TSX VENTURE:TAJ) is pleased to report that, subject to TSX Venture Exchange approval, it has entered into an option agreement with the Pereira Group to acquire three gold projects in Guyana. Over the past several months Tajiri has been assessing projects and negotiating with Guyanese land holders to expand the Company's portfolio of gold tenements in Guyana, following a strategy of acquiring tenements that cover persistent structures associated with gold mineralisation over regional-scale strike lengths and that include previously undrilled artisanal bedrock workings. Management is pleased to present the first of these acquisitions: the Frenchman's Creek, Winters Mine and Kanaimapu Projects. The three projects are all situated close to the main road connecting Guyana's capital Georgetown to Brazil and are located between 150km and 260km south of Georgetown (see Figure 1). The acquisition cost of the three Projects will be paid over five years as follows: USD$30,000 upon signing, USD$50,000 on or before the first anniversary, USD$50,000 on or before the second anniversary, USD$425,000 on or before the third anniversary, and USD$1,000,000 on or before the fourth anniversary. The Company will also issue 1.5 million shares over five years to the vendors, in the amounts of 500,000 upon signing, 500,000 on or before the first anniversary, 750,000 on or before the second anniversary, and 1,000,000 on or before the fourth year. The agreement provides Tajiri with the right to acquire a 100% interest in all three Projects, while the Vendors will retain a 2% Net Smelter Return. To view Figure 1, please visit the following link: http://media3.marketwire.com/docs/tajiri_resources_march2_figure1.pdf A brief outline of the Frenchman's Creek Project is given below; details of the Winters Mine and Kanaimapu Projects will be given in a subsequent release.
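For readers tallying the staged consideration, the cash schedule quoted in the release sums as follows (a minimal arithmetic sketch; it covers only the cash payments, not the share issuances):

```python
# Staged cash consideration for the three-project option agreement,
# as quoted in the release (all figures USD).
payments = {
    "on signing": 30_000,
    "first anniversary": 50_000,
    "second anniversary": 50_000,
    "third anniversary": 425_000,
    "fourth anniversary": 1_000_000,
}

total = sum(payments.values())
print(f"Total cash consideration: USD${total:,}")  # USD$1,555,000
```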
The Project is located 260km south of Georgetown and contains the Pott Falls Mine (operational between 1935 and 1941 with a 10 ton per hour mill; production unknown) and the Parrot Pit, an artisanal pit excavated intermittently between the mid 1980s and 2011. The Project also contains extensive alluvial gold workings, which indicate a potential source area of some 13 kilometres of strike in the northern portion of the property that has shed gold into the surrounding, predominantly south-draining creeks. These creeks have been worked consistently downstream for distances of between 6 and 9 kilometres, and intermittently since the early 20th Century. The most recent production records of the Vendors demonstrate that between 2007 and 2010 the property produced 42,203 ounces of gold from sluicing of alluvial material on the property. Alluvial gold production from the area prior to 2007 has not been recorded, but a banka drilling and pitting program, performed by the Geological Survey of Guyana in 1975 and covering the alluvial flats of Frenchman's Creek, produced a historic resource of 500,000 cubic yards at a grade of 0.5 grams Au per cubic yard. Undoubtedly this resource has now been mined by artisanal producers. To view Figure 2, please visit the following link: http://media3.marketwire.com/docs/tajiri_resources_march2_figure2.pdf The Project area geology consists of NW-WNW trending greenstones of the mid-Proterozoic Barama-Mazaruni Group intruded by granites and porphyries. In the south of the Project, the Muruwa Formation, a relatively undeformed sandstone, unconformably overlies the greenstones. Regolith in the area is variable and consists of relict lateritic soils variably covered by transported sand. In situ mineralisation in the area is evident at the Pott Falls Mine and the Parrot Pit (see Figures 2 and 3). The Pott Falls Mine was active as an underground operation with an installed mill capacity of 10 tonnes per hour between 1935 and 1941.
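For context, the contained gold implied by the 1975 historic alluvial figure can be estimated with a short sketch (this assumes the quoted grade of 0.5 g Au per cubic yard applies uniformly across the 500,000 cubic yards):

```python
# Contained gold implied by the 1975 historic alluvial resource:
# 500,000 cubic yards at 0.5 g Au per cubic yard.
TROY_OZ_GRAMS = 31.1034768  # grams per troy ounce

volume_yd3 = 500_000
grade_g_per_yd3 = 0.5

contained_g = volume_yd3 * grade_g_per_yd3  # 250,000 g
contained_oz = contained_g / TROY_OZ_GRAMS  # ~8,038 troy oz

print(f"{contained_g:,.0f} g = ~{contained_oz:,.0f} troy oz")
```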
The mine exploited WNW-striking, high grade, narrow (0.3-1.5m wide) quartz veins which occurred at the contact between several narrow porphyry dikes intruded into mafic volcanics. Production data for the Pott Falls Mine is not available. However, records of drift sampling reported by the Geological Survey of Guyana in 1938 gave a 70 foot strike length of a one foot wide vein at an average grade of 174 g/t Au. Five diamond holes were drilled at the Pott Falls Mine between 1946 and 1947, and the best result was approximately 4.5m @ 6.2g/t Au. Details of this drill program and hole locations are not available. From the early 1980s the Pott Falls Mine was intermittently exploited by artisanal miners, who excavated a shallow 200m x 75m pit at the prospect. Recent due diligence sampling of quartz vein ore removed from the pit at an artisanal crusher site returned 1,270 g/t Au, and a composite sample of the crusher tailings returned 4.27 g/t Au. To view Figure 3, please visit the following link: http://media3.marketwire.com/docs/tajiri_resources_march2_figure3.pdf The only modern exploration in the area was conducted by Adex Exploration, a Canadian junior, in 1994. Adex conducted grab sampling and a 50-100m x 200m spaced soil geochemical program covering approximately 25% of the Project area. Adex did not drill any of the prospects, and withdrew from the project in 1995 after the demise of the then property owner. Geochemical sampling by Adex outlined soil anomalism with several peak values >1 g/t Au around the Pott Falls Mine, the Parrot Pit and the Blackheart Zone (see Figure 2). At the Pott Falls Mine, geochemical anomalism >125ppb Au extended intermittently along strike from the mine for 2,000m over a width of 400-800m. At the Parrot Pit, which is approximately 180m x 50m, geochemical anomalism (>125 ppb) extended 400m to the east of the pit over widths of 100-200m. Grab sampling of quartz veins at the Parrot Pit returned values between 0.79 g/t and 859 g/t Au.
Parrot Pit reputedly produced 1,500 ounces of gold during the 1980s. Geochemical anomalism at the Blackheart Zone extended over an area of 850m x 400m at >150ppb Au. A trench was hand excavated across the anomaly, but only to a depth of 1.5m, which was too shallow to sample the underlying saprolite. The trench produced a continuous width of 174m averaging 225 ppb Au, with several samples >1 g/t Au. Tajiri views this trench as demonstrating only a continuous dispersed gold-in-soil anomaly, not mineralisation from a presumably more discretely constrained primary mineralised zone below the trench. The Frenchman's Creek Project represents an excellent opportunity for Tajiri. It contains several near drill-ready targets and potential for regional-scale gold-bearing structures, as evidenced by the abundant alluvial workings within the area. Over the next several months Tajiri will conduct mapping and surface sampling to allow the planning of both drilling and regional exploration programs. On Behalf of the Board, Neither the TSX Venture Exchange nor its Regulation Services Provider (as that term is defined in the policies of the TSX Venture Exchange) accepts responsibility for the adequacy or accuracy of this release. This news release may contain forward-looking statements based on assumptions and judgments of management regarding future events or results. Such statements are subject to a variety of risks and uncertainties which could cause actual events or results to differ materially from those reflected in the forward-looking statements. The Company disclaims any intention or obligation to revise or update such statements. All information relating to geological results, drill plans, or other facts or statements of any geotechnical nature has been prepared by Mr. Dominic O'Sullivan. Mr. O'Sullivan is a professional geologist, holding a BSc (Hons) in Geology from the University of Sydney, and is a member in good standing of the AusIMM (MAusIMM).
As such, he is a qualified person as that term is defined by the TSX Venture Exchange and under National Instrument 43-101, and is responsible for the related contents of this release.


News Article | November 2, 2016
Site: www.newscientist.com

HUMANS migrate. It is a characteristic of our species. Yet now a migration crisis is headline news. More than a million desperate people fled to Europe in 2015, and nearly 4000 died trying. The influx is increasing and about to swell more as the weather improves. The United Nations says Europe faces “an imminent humanitarian crisis, largely of its own making”. And it is not alone. The UN has also censured Australia for sending boatloads of refugees to squalid camps in other countries. And US politicians talk of building a wall while tens of thousands of lone children flee violence in Latin America across the US-Mexican border. In January, the World Economic Forum ranked large-scale refugee flows as its global risk of highest concern. When the US Council on Foreign Relations drew up its top 10 priorities for conflict prevention in 2016, it included political instability in the EU caused by the influx of migrants. Concerns about refugees and economic migrants are grist to the mill for those who want Britain to vote to leave the EU in June. And there’s no doubt that migration will increase as the world’s economy becomes more globalised, and as demographic and environmental pressures bite. Should we be alarmed? What is the truth about migration? It is an emotive issue. But the scientific study of what happens when humans move is starting to supply some non-emotive answers. It’s showing that many widespread beliefs don’t hold up to scrutiny. “Concern about immigrants falls sharply when people are given even the most basic facts,” says Peter Sutherland, the UN Special Representative for migration. One analyst even says that removing all barriers to migration would be like finding trillion dollar bills on the sidewalk. The millions fleeing Syria have shone a spotlight on refugees, but that tragedy is just a small part of a bigger picture. More than 240 million people worldwide are international migrants. 
Refugees account for fewer than 10 per cent of the total and, in theory, they are the least contentious group, because many countries have signed international commitments to admit them. The rest are moving to work, or to join family members who have jobs. When such people travel with refugees, they are often derided as “just” economic migrants. This is unfair, says Alex Betts, head of the Refugee Studies Centre at the University of Oxford. Whether or not they meet the official definition of a refugee, many are escaping dire conditions that pose a threat to their survival. Although globalisation of the world’s economy has lifted millions out of poverty, it has not been able to create enough jobs where there are people in need of work. Aid funds are starting to address this problem – but for the most part people must go where there are jobs. That’s why some see migration as a crisis. The 2008 financial crash spawned insecurity about jobs and concerns about economic migrants. Several populist parties took the opportunity to warn of a flood of freeloaders at the gates, increasing the issue’s political visibility and hardening the policies of some mainstream parties, including in the UK. The US government decided not to bail out firms that hired too many immigrants. Spain paid migrants to leave – even after they had stopped coming as jobs disappeared. And feelings of insecurity remain. “The logic driving this is the idea that migrant workers present additional competition for scarce jobs,” says Ian Goldin at the University of Oxford. Indeed, it is probably part of our evolved nature to think that more for you means less for me (see “The truth about migration: How evolution made us xenophobes“). But that’s not how modern economies work. If economies really were zero-sum games in this way, wages would go down as labour supply increased and natives might well lose jobs to immigrants. 
But no modern economic system is that simple, says Jacques Poot at the University of Waikato, New Zealand. The knock-on of economic migration is that increased labour also brings an increase in profit, which business owners can invest in more production. They can also diversify, creating opportunities for a broader range of workers. In addition, migration means workers can be more efficiently matched to demand, and make the economy more resilient by doing jobs natives won’t or can’t do. “More people expand the economy,” says Goldin, because people are moving from where they cannot work productively to where they can. In a survey of 15 European countries, the UN’s International Labour Organisation (ILO) found that for every 1 per cent increase in a country’s population caused by immigration, its GDP grew between 1.25 and 1.5 per cent. The World Bank estimates that if immigrants increased the workforces of wealthy countries by 3 per cent, that would boost world GDP by $356 billion by 2025. And removing all barriers to migration could have a massive effect. A meta-analysis of several independent mathematical models suggests it would increase world GDP by between 50 and 150 per cent. “There appear to be trillion-dollar bills on the sidewalk” if we lift restrictions on emigration, says Michael Clemens at the Center for Global Development, a think tank in Washington DC, who did the research. But who gets those billions? Most of the extra wealth goes to migrants and to their home countries. In 2015, migrants sent home $440 billion, two and a half times the amount those countries received in foreign aid – promoting development and jobs at home. But what do natives of countries that attract migrants get out of it? In the EU it has been difficult to tease out the effect of free movement of workers from other economic results of membership. However, a study of non-EU member Switzerland is illuminating. 
Different parts of Switzerland allowed free access to EU workers at different times, enabling Giovanni Peri of the University of California, Davis, to isolate the effects. He found that while the workforce grew by 4 per cent, there was no change in wages and employment for natives overall. Wages increased a little for more educated Swiss people, who got jobs supervising newcomers, while some less educated Swiss people were displaced into different jobs. Peri has also looked at the situation in the US. “Data show that immigrants expand the US economy’s productive capacity, stimulate investment and promote specialisation, which in the long run boosts productivity,” he says. “There is no evidence that immigrants crowd out US-born workers in either the short or the long run.” Natives instead capitalise on language and other skills by moving from manual jobs to better-paid positions. Peri calculates that immigration to the US between 1990 and 2007 boosted the average wage by $5100 – a quarter of the total wage rise during that period. Further evidence comes from a meta-analysis Poot did in 2010, which collated all the research done up until that point. It reveals that rises in a country’s workforce attributable to foreign-born workers have only a small effect on wages, which could be positive or negative. At worst, a 1 per cent rise caused wages to fall by 0.2 per cent, mostly for earlier generations of immigrants. The impact on the availability of jobs for natives is “basically zero”, he says. Any tendency for wages to fall with an increase in immigration can be counteracted by enforcing a minimum wage. The UK Migration Advisory Committee came to a similar conclusion in 2012. “EU and non-EU migrants who have been in the UK for over five years are not associated with the displacement of British-born workers,” it reported. Very recent migrants do have a small impact, but mainly on previous migrants. 
What’s more, the ILO notes that low-skilled migrants do “dirty, dangerous and difficult” jobs, which locals do not want – crop picking, care work, cleaning and the like. Meanwhile, highly skilled migrants plug chronic labour shortages in sectors such as healthcare, education and IT. Nearly a third of UK doctors and 13 per cent of nurses are foreign-born. Another presumption made about migrants is that they put a strain on benefit systems. This is also not borne out by the evidence. “It is widely assumed that economic migrants are mainly poor people out to live off the tax money of the relatively rich,” says human rights expert Ian Buruma. “Most of them are not spongers. They want to work.” A lot go not to countries offering generous benefits, but to where there are jobs. Some 82 million people, 36 per cent of the world’s current migrants, have moved from one developing country to another, especially from Haiti to the Dominican Republic, Egypt to Jordan, Indonesia to Malaysia and Burkina Faso to Ivory Coast. Those who do end up in wealthier countries are not the burden people sometimes assume. The Organisation for Economic Co-operation and Development, which represents 34 of the world’s wealthiest nations, calculates that its immigrants on average pay as much in taxes as they take in benefits. Recent research shows that EU workers in the UK take less from the benefits system than native Brits do, mostly because they are younger on average. Moreover, they bring in education paid for by their native countries, and many return to their homeland before they need social security. Based on recent numbers, Britain should conservatively expect 140,000 net immigrants a year for the next 50 years. The Office for Budget Responsibility, the UK’s fiscal watchdog, calculates that if that number doubled, it would cut UK government debt by almost a third – while stopping immigration would up the debt by almost 50 per cent. 
Illegal migrants make a surprising extra contribution, says Goldin. While many work “informally” without declaring income for taxes, those in formal work often have taxes automatically deducted from their pay cheques, but rarely claim benefits for fear of discovery. Social security paid by employers on behalf of such migrants, but never claimed by them, netted the US $20 billion between 1990 and 1998, says Goldin. That, plus social security contributions by young legal migrants who will not need benefits for decades, is now keeping US social security afloat, he says. “One of the dominant, but empirically unjustified images is of masses of people flowing in… taking away jobs, pushing up housing prices and overloading social services,” writes Stephen Castles at the University of Sydney, Australia, and two colleagues in their book, The Age of Migration. They argue that an increase in migration is often the result rather than the cause of economic changes that harm natives – such as neoliberal economic policies. “The overwhelming majority of research finds small to no effects of migration on employment and wages,” says Douglas Nelson of Tulane University in New Orleans. “On purely economic grounds, immigration is good for everyone.” That may come as a welcome surprise to many. But economics is not the whole story. If perceptions about jobs and wages were the only problem, you would expect anti-immigrant views to run high where jobs are scarce. Yet a 2013 study of 24 European countries found that people living in areas of high unemployment tended not to have negative views of migrants. So, what else are we worried about? One major issue is a perceived threat to social cohesion. In particular, immigrants are often associated with crime. But here again the evidence doesn’t stack up. 
In 2013, Brian Bell at the London School of Economics and his colleagues found no change in violent crime in Britain linked either to a wave of asylum seekers in the 1990s, or to eastern EU migrants after 2004. The asylum seekers were associated with a small increase in property crime such as theft, boosting existing local crime rates some 2 per cent, perhaps because they were not allowed to work, suggest the authors. But areas where eastern Europeans settled had significantly less crime of any kind. Another study found that immigrants had no impact on crime in Italy. And immigrants in the US are much less likely to commit crimes and are imprisoned less often than native-born Americans. Tim Wadsworth of the University of Colorado has even suggested that a rise in immigration in the 1990s may have driven an overall drop in US crime rates since then. Nevertheless, immigrants can put pressure on local communities. High rates of arrival can temporarily strain schools, housing and other services. “That is what people tend to see,” says Goldin. He says investment is required to mitigate these problems. “Governments need to manage the costs, which tend to be short-term and local,” he says. That’s a challenge, but it can be done. Bryan Caplan of George Mason University in Fairfax, Virginia, points out that since the 1990s, 155 million Chinese have moved from the countryside to cities for work. “This shows it’s entirely possible to build new homes for hundreds of millions of migrants given a couple of decades.” China may be managing the biggest mass migration in history, but there’s one problem it mostly doesn’t face. Perceived threats to national identity often top natives’ list of concerns about immigrants. It can even be an issue when such identities are relatively recent constructs. But countries with a clear ethnic identity and no recent history of significant immigration face the biggest problem, says Nelson.
“It’s tricky for Sweden, which went from essentially no immigrants to 16 per cent in half a generation,” he says. And Denmark is another nation where anxiety over the loss of cultural homogeneity has been blamed for a backlash against immigrants. Elsewhere, there has been a hardening of attitudes. Ellie Vasta of Macquarie University in Sydney, Australia, is trying to understand why Europe, which embraced multiculturalism in the 1970s, today calls for cohesion and nationalism, demanding that immigrants conform and testing them for “Britishness” or “Dutchness”. She blames an increasing loss of cohesion in society due to “individualising” forces from mass media to the structure of work. As people rely more on their own resources, they have a longing for community. The presence of foreigners appears to disrupt this, creating a “desire to control differences”, she says. Research by Robert Putnam at Harvard University suggests this move away from multiculturalism could be problematic. He finds that increased diversity lowers “social capital” such as trust, cooperation and altruism. However, this can be overcome in societies that accommodate, rather than erase, diversity by creating “a new, broader sense of ‘we’”. In other words, success lies not in assimilation, but in adaptation on both sides. Canada has tried to achieve this by basing its national identity on immigration. Canadian prime minister Justin Trudeau told the World Economic Forum in Davos, Switzerland, this year that “diversity is the engine of investment. It generates creativity that enriches the world.” This view is shared by complex systems analyst Scott Page at the University of Michigan, Ann Arbor. He argues that culturally diverse groups, from cities to research teams, consistently outperform less diverse groups due to “cognitive diversity” – exposure to disagreement and alternative ways of thinking.
“Immigration provides a steady inflow of new ways of seeing and thinking – hence the great success of immigrants in business start-ups, science and the arts,” he says. But more diversity means more complexity, and that requires more energy to maintain – investment in language skills, for example. The fact that immigrants have settled more successfully in some places than others suggests that specific efforts are required to get this right. Achieving broad agreement on core goals and principles is one, says Page. We had better learn how to manage diversity soon because it’s about to skyrocket in wealthy countries. As birth rates fall, there’s a growing realisation that workers from abroad will be required to take up the slack (see “The truth about migration: Rich countries need immigrants”). In addition, the fertility of incomers can stay higher than that of natives for several generations. In 2011, for the first time since mass European migration in the 19th century, more non-white than white babies were born in the US, mainly to recent Asian and Hispanic immigrants and their children. By 2050, white Americans will be a minority, says Bill Frey of the Brookings Institution in Washington DC. That’s good news for the US, he adds, because it gives the country a younger workforce and outlook than its competitors in Europe and Japan. Even if we finesse multiculturalism, there is a potential game changer looming on the horizon. Massive automation and use of robotics could make production less dependent on human labour. This “fourth industrial revolution” may see governments paying their citizens a guaranteed minimum wage independent of work. There has been little discussion of how this might affect a mobile global workforce. However, some warn that cheap, automated production in wealthy countries could destroy export markets for poor countries. This would worsen unemployment and political instability – and also massively boost migration pressure.
One way to prepare for this would be to take a more coordinated and strategic approach to the global workforce. As it is, it’s hard to track migration amidst a mess of non-standardised data and incompatible rules. Countries do not agree on who is a migrant. Even the EU has no common policy or information for matching people to jobs. Migrants are usually managed by foreign ministries, not labour ministries that understand the job market. “What could be of real value would be for governments, companies and trade unions to get together and look at where the labour shortages are, and how they could be filled, with natives or migrants,” says Michelle Leighton, head of migration at the ILO. Amazingly, says Goldin, there is no global body to oversee the movement of people. Governments belong to the International Organisation for Migration but it is not an official UN agency so cannot set common policy. Instead, each country jealously guards its borders while competing for workers. Goldin and others think there should be a UN agency managing migration in the global interest, rather than leaving it to nations with differing interests – and power. This, combined with real empirical understanding of the impacts of migration, might finally allow humanity to capitalise on the huge positive potential of its ancient penchant for moving. This article appeared in print under the headline “On the road again”.


News Article | August 30, 2016
Site: phys.org

When the Moon abruptly cuts off sunlight from Earth at a total solar eclipse, our weather reacts to the sudden darkness. A new issue of the Philosophical Transactions of the Royal Society of London, the oldest surviving scientific journal, deals with the effects of the March 20, 2015 eclipse. Williams College professor Jay Pasachoff, former Fulbright visitor to Williams College Marcos Peñaloza-Murillo, recent alumna Allison Carter '16, and University of Michigan postdoc Michael Roman have an article in this theme issue of "Phil Trans A" discussing the effects measured. Pasachoff and Carter had been on Svalbard, an Arctic archipelago controlled by Norway, for the eclipse. They had carried sensors for temperature and pressure borrowed from Jay Racela of Williams College's Center for Environmental Studies. The expedition to Svalbard was supported by a grant to Pasachoff from the Committee for Research and Exploration of the National Geographic Society. The bulk of the theme issue was about the effect of the partial eclipse that was also visible from the U.K. The dimming of sunlight over the hour or so of the partial eclipse made its effects measurable. On Svalbard, for the total eclipse, the automatic temperature and pressure sensors found only slight effects, though a thermometer hanging from one of the camera tripods recorded a dip from the 8°F that the morning temperature had reached down to –7°F a few minutes after the center of totality. Pasachoff and Peñaloza-Murillo, who is professor emeritus at the Universidad de los Andes in Mérida, Venezuela, have published a previous paper about the effect of a total eclipse on weather, and are planning further observations, again in collaboration with Roman, at the August 21, 2017, total solar eclipse, which they will attempt to observe from Salem, Oregon.
This time the expedition will again be supported in part by the Committee for Research and Exploration of the National Geographic Society, and Williams College, with Pasachoff as Principal Investigator, has also received a research grant from the Solar Terrestrial Program of the Atmospheric and Geospace Sciences Division of the U.S. National Science Foundation. Pasachoff has also borrowed temperature and pressure sensors, a datalogger system called HOBO made by Onset Computer Corporation, as part of his observations of the September 1 annular solar eclipse this week. Pasachoff, along with Naomi Pasachoff, Research Associate at Williams College, is observing the eclipse from Isle de la Réunion in the Indian Ocean east of Madagascar. They are joined there by Rob Lucas of the University of Sydney; Michael Kentrianakis, project manager for the Eclipse 2017 Task Force of the American Astronomical Society; Stephen Bedingfield of Canada; and Xavier Jubier of France, who has provided Google Maps of various eclipse paths, accessible through the website http://eclipses.info that Pasachoff maintains as Chair of the Working Group on Solar Eclipses of the International Astronomical Union. This event is Pasachoff's 64th solar eclipse and his 16th annular solar eclipse. The theme issue of Phil Trans A, titled "Atmospheric effects of solar eclipses stimulated by the 2015 UK eclipse," has been edited by Giles Harrison of the University of Reading and Edward Hanna of the University of Sheffield, both in the UK. The article by Pasachoff, Peñaloza-Murillo, Roman, and Carter is entitled "Terrestrial atmospheric responses on Svalbard to the 20 March 2015 Arctic total solar eclipse under extreme conditions." Pasachoff drafted the article as part of his spring-2016 sabbatical leave in the Planetary Sciences Department of the California Institute of Technology.
Another article, "Symbolism and Discovery: Eclipses in Art," by Ian Blatchford, head of the group that runs the Science Museum, London, draws heavily on and acknowledges work on the overlap of art and astronomy by Pasachoff in collaboration with art-historian Roberta J. M. Olson of the New-York Historical Society. The theme issue will be officially published on September 28, as volume 374, issue 2077, of Philosophical Transactions A, though the papers are already available online. The Phil Trans was established in 1665, making it the longest-running scientific journal in the world. "Philosophical" refers to natural philosophy, an old term for what we now call "science." More information: Atmospheric effects of solar eclipses stimulated by the 2015 UK eclipse. rsta.royalsocietypublishing.org/content/atmospheric-effects-solar-eclipses-stimulated-2015-uk-eclipse


Lawen A.,Monash University | Lane D.J.R.,University of Sydney
Antioxidants and Redox Signaling | Year: 2013

Iron is a crucial factor for life. However, it also has the potential to cause the formation of noxious free radicals. These double-edged sword characteristics demand a tight regulation of cellular iron metabolism. In this review, we discuss the various pathways of cellular iron uptake, cellular iron storage, and transport. Recent advances in understanding the reduction and uptake of non-transferrin-bound iron are discussed. We also discuss the recent progress in the understanding of transcriptional and translational regulation by iron. Furthermore, we discuss recent advances in the understanding of the regulation of cellular and systemic iron homeostasis and several key diseases resulting from iron deficiency and overload. We also discuss the knockout mice available for studying iron metabolism and the related human conditions. © Copyright 2013, Mary Ann Liebert, Inc.


Maron B.J.,Minneapolis Heart Institute Foundation | Maron M.S.,Hypertrophic Cardiomyopathy Center | Semsarian C.,University of Sydney | Semsarian C.,Royal Prince Alfred Hospital
Journal of the American College of Cardiology | Year: 2012

Hypertrophic cardiomyopathy (HCM) is the most common familial heart disease with vast genetic heterogeneity, demonstrated over the past 20 years. Mutations in 11 or more genes encoding proteins of the cardiac sarcomere (>1,400 variants) are responsible for (or associated with) HCM. Explosive progress achieved in understanding the rapidly evolving science underlying HCM genomics has resulted in fee-for-service testing, making genetic information widely available. The power of HCM mutational analysis, albeit a more limited role than initially envisioned, lies most prominently in screening family members at risk for developing disease and excluding unaffected relatives, which is information not achievable otherwise. Genetic testing also allows expansion of the broad HCM disease spectrum and diagnosis of HCM phenocopies with different natural history and treatment options, but is not a reliable strategy for predicting prognosis. Interfacing a heterogeneous disease such as HCM with the vast genetic variability of the human genome, and high frequency of novel mutations, has created unforeseen difficulties in translating complex science (and language) into the clinical arena. Indeed, proband diagnostic testing is often expressed on a probabilistic scale, which is frequently incompatible with clinical decision making. Major challenges rest with making reliable distinctions between pathogenic mutations and benign variants, and those judged to be of uncertain significance. Genotyping in HCM can be a powerful tool for family screening and diagnosis. However, wider adoption and future success of genetic testing in the practicing cardiovascular community depends on a standardized approach to mutation interpretation, and bridging the communication gap between basic scientists and clinicians. © 2012 American College of Cardiology Foundation.


Skilton M.R.,University of Sydney | Raitakari O.T.,University of Turku | Celermajer D.S.,University of Sydney
Hypertension | Year: 2013

Reduced fetal growth is associated with increased systolic blood pressure. Recently, we found an inverse association between serum ω-3 fatty acids and systolic blood pressure in young adults born with impaired fetal growth. We investigated the associations of dietary intake in childhood of the long-chain ω-3 fatty acids eicosapentaenoic acid and docosahexaenoic acid with blood pressure parameters in children born with reduced birth weight. We analyzed data from 3457 children aged 8 to 15 years participating in the continuous National Health and Nutrition Examination Survey 2003-2004, 2005-2006, and 2007-2008. Dietary intake was assessed by two 24-hour dietary recalls, birth weight by questionnaire, and blood pressure was measured. Systolic blood pressure was 1.1 mm Hg higher in those with reduced (<10th centile) compared with normal birth weight (≥10th centile), consistent with previous findings, although not statistically significant (P=0.40); however, pulse pressure was significantly higher in these children (3.4 mm Hg). In the 354 participants with reduced birth weight, when compared with children with the lowest tertile of intake, those who had the highest tertile of dietary eicosapentaenoic acid and docosahexaenoic acid intake had significantly lower systolic blood pressure (-4.9 mm Hg [95% confidence interval,-9.7 to-0.1]) and pulse pressure (-7.7 mm Hg [95% confidence interval,-15.0 to-0.4]). High-dietary intakes of eicosapentaenoic acid and docosahexaenoic acid are associated with lower systolic blood pressure and pulse pressure in children born with reduced birth weight. These data are consistent with the hypothesis that long-chain ω-3 fatty acids reduce blood pressure in those with impaired fetal growth. © 2013 American Heart Association, Inc.


Objective The study aims to provide information about variance components of psychosocial outcomes: within- and between-participant variance, within-participant correlation and, for cluster randomised trials, the intra-cluster correlation (ICC). It also aims to demonstrate how estimates of these variance components and ICCs can be used to design randomised trials and cluster randomised trials. Method Data from 15 longitudinal multi-centre psycho-oncology studies were analysed, and variance components including ICCs were estimated. Studies with psychosocial outcomes that had at least one measurement post-baseline were included, comprising individually randomised controlled trials, cluster randomised trials and observational studies. Results Variance components and ICCs from 87 outcome measures were estimated. The unadjusted, single-timepoint (first post-baseline) ICCs ranged from 0 to 0.16, with a median value of 0.022 and inter-quartile range 0 to 0.0605. The longitudinal ICCs ranged from 0 to 0.09, with a median value of 0.0007 and inter-quartile range 0 to 0.018. Conclusions Although the magnitude of variance components and ICCs used for sample-size calculation cannot be known in advance of the study, published estimates can help reduce the uncertainty in sample-size calculations. Psycho-oncology researchers should be conservative in their sample-size calculations and use approaches that improve efficiency in their design and analysis. Copyright © 2012 John Wiley & Sons, Ltd.
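To make the role of the ICC in trial design concrete, here is a minimal sketch (my own illustration, not from the paper) of the standard design-effect inflation for a cluster randomised trial. The ICC of 0.022 is the paper's median single-timepoint estimate; the cluster size of 20 and the base sample size of 128 are hypothetical choices, not values from the study:

```python
import math

def design_effect(icc: float, cluster_size: int) -> float:
    """Design effect for a cluster randomised trial with equal cluster
    sizes: DE = 1 + (m - 1) * ICC, where m is the cluster size."""
    return 1 + (cluster_size - 1) * icc

def inflated_sample_size(n_individual: int, icc: float, cluster_size: int) -> int:
    """Total sample size after inflating an individually randomised
    trial's requirement by the design effect (rounded up)."""
    return math.ceil(n_individual * design_effect(icc, cluster_size))

# Median single-timepoint ICC from the 15 studies, with a hypothetical
# cluster size of 20 and a hypothetical individually randomised n of 128:
de = design_effect(0.022, 20)                    # 1 + 19 * 0.022 = 1.418
n_total = inflated_sample_size(128, 0.022, 20)   # ceil(181.504) = 182
```

Even a seemingly small ICC of 0.022 inflates the required sample size by roughly 40 per cent at this cluster size, which is one reason the authors urge conservative sample-size calculations.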


Broadbent E.,University of Auckland | Donkin L.,University of Sydney | Stroh J.C.,University of Marburg
Diabetes Care | Year: 2011

OBJECTIVE - To investigate diabetic patients' perceptions of illness and treatments, and explore relationships to adherence and blood glucose control. RESEARCH DESIGN AND METHODS - Forty-nine type 1 and one hundred and eight type 2 diabetic patients completed questionnaires assessing illness perceptions, treatment beliefs, and adherence to medications, diet, and exercise. Blood glucose control was assessed from blood tests. RESULTS - Patients rated medication more important than diet and exercise, and reported higher adherence to medications. Insulin was perceived as more helpful for diabetes, while antihypertensives and cholesterol medication were perceived as more helpful for preventing heart problems. Perceptions were associated with adherence to insulin, cholesterol and antihypertensive medications, exercise, and diet. Blood glucose control in type 1 diabetic patients was associated with insulin adherence and perceived personal control, and in type 2 diabetic patients with being prescribed insulin or antihypertensives, and with perceived personal control. CONCLUSIONS - Patients hold specific mental models about diabetes treatments, which are associated with adherence. © 2011 by the American Diabetes Association.


Liu K.,University of Sydney | Kaffes A.J.,Royal Prince Alfred Hospital
Alimentary Pharmacology and Therapeutics | Year: 2011

Aliment Pharmacol Ther 2011; 34: 416-423 Summary Background Obscure gastrointestinal bleeding (OGIB) is a commonly encountered clinical problem in gastroenterology and is associated with significant morbidity and mortality. The investigation and management of OGIB has changed dramatically over the past decade with the advent of newer gastroenterological and radiological technologies. Aim To review the current evidence on the diagnosis and investigation of OGIB. Methods We searched the PubMed database (1985-2010) for full original articles in English-language journals relevant to the investigation of OGIB. The search terms we used were 'gastrointestinal bleeding' or 'gastrointestinal hemorrhage' or 'small bowel bleeding' each in combination with 'obscure', or 'capsule endoscopy', or 'enteroscopy' or 'enterography' or 'enteroclysis'. Results Capsule endoscopy (CE) or double balloon enteroscopy (DBE) should be the first-line investigations. They are complementary procedures with comparably high diagnostic yields. DBE is also able to provide therapeutic intervention. Newer technologies such as single balloon and spiral enteroscopy are currently being evaluated. Radiological and nuclear medicine investigations, such as CT enterography and CT enteroclysis, are alternative diagnostic tools when CE or DBE are contraindicated. Repeating the gastroscopy and/or colonoscopy may be considered in selected situations. An algorithm for investigation of obscure bleeding is proposed. Conclusions The development of capsule endoscopy and double balloon enteroscopy has transformed the approach to the evaluation and management of obscure gastrointestinal bleeding over the past decade. Older diagnostic modalities still play a complementary, but increasingly selective, role. © 2011 Blackwell Publishing Ltd.


Matthews J.M.,University of Sydney | Lester K.,University of Sydney | Joseph S.,University of Sydney | Curtis D.J.,Monash University
Nature Reviews Cancer | Year: 2013

LIM-domain proteins are a large family of proteins that are emerging as key molecules in a wide variety of human cancers. In particular, all members of the human LIM-domain-only (LMO) proteins, LMO1-4, which are required for many developmental processes, are implicated in the onset or the progression of several cancers, including T cell leukaemia, breast cancer and neuroblastoma. These small proteins contain two protein-interacting LIM domains but little additional sequence, and they seem to function by nucleating the formation of new transcriptional complexes and/or by disrupting existing transcriptional complexes to modulate gene expression programmes. Through these activities, the LMO proteins have important cellular roles in processes that are relevant to cancer such as self-renewal, cell cycle regulation and metastasis. These functions highlight the therapeutic potential of targeting these proteins in cancer. © 2013 Macmillan Publishers Limited. All rights reserved.


Shine R.,University of Sydney | Doody J.S.,Monash University
Frontiers in Ecology and the Environment | Year: 2011

Understanding the reasons for disagreements about conservation issues can facilitate effective engagement between the people involved. Invasive species are often central to such debates, with researchers and members of the public frequently disagreeing about the nature and magnitude of problems posed by the invaders, and the best ways to deal with them. The spread of non-native cane toads (Rhinella marina) throughout Australia has stimulated research on toad impact and control, and has mobilized local communities to reduce cane toad numbers through direct action. Biologists and community groups have disagreed about many toad-related topics, providing an instructive case history about impediments to consensus. Debates about the ecological impacts of cane toads mostly reflect poor communication of available research results (ie scientists have been largely unsuccessful in transmitting their findings to community groups), whereas disagreements about toad control reflect an information vacuum about the effectiveness of alternative methods, such as trapping, biocontrol, and predator training to induce toad aversion, among others. Many other disagreements have arisen from the differing motivations of scientists and community groups. Although the debates are superficially about evidence, the deeper divergence reflects differing social pressures, the ways that information is transmitted, and how people evaluate the validity of information. © The Ecological Society of America.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.3.8 | Award Amount: 3.52M | Year: 2010

In the SOFI project, new active optical waveguides and integrated optoelectronic circuits based on a novel silicon-organic hybrid technology are introduced. The technology is based on low-cost CMOS process technology for fabrication of the optical waveguides - allowing for the convergence of electronics with optics. It is complemented by an organic layer that brings in new functionalities so far not available in silicon. Recent experiments have shown that such a technology can boost signal processing in silicon far beyond 100 Gbit/s - which corresponds to a tripling of the state-of-the-art bitrate.

SOFI focuses on a proof-of-concept implementation of ultra-fast, ultra-low-energy optical phase modulator waveguides such as are needed in optical communications. These devices will ultimately be used to demonstrate an integrated circuit enabling the aggregation of low-bitrate electrical signals into a 100 Gbit/s OFDM data stream with an energy consumption of only 5 fJ/bit. However, the SOFI technology is even more fundamental. By varying the characteristics of the organic layer one may also envision new sensing applications for the environment and medicine.

The suggested approach is practical and disruptive. It combines silicon CMOS technology and its standardized processes with the manifold possibilities offered by novel organic materials. This way, for instance, the processing speed limitations inherent in silicon are overcome, and an order-of-magnitude improvement can be achieved. More importantly, the new technology provides the lowest power consumption so far demonstrated for devices in its class. This is supported by calculations and initial tests. The low power consumption is attributed to the tiny dimensions of the devices and to the fact that optical switching is performed in the highly nonlinear organic cladding material rather than in silicon.
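As a back-of-the-envelope check (my own sketch, not part of the project description), the quoted figure of 5 fJ/bit at a 100 Gbit/s aggregate rate corresponds to an average switching power of only half a milliwatt:

```python
def switching_power_watts(energy_per_bit_joules: float,
                          bitrate_bits_per_s: float) -> float:
    """Average power dissipated in the modulator: energy per bit times bit rate."""
    return energy_per_bit_joules * bitrate_bits_per_s

# 5 fJ/bit at 100 Gbit/s, the figures quoted for the SOFI demonstrator:
power = switching_power_watts(5e-15, 100e9)  # 5e-4 W, i.e. 0.5 mW
```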


Patent
NEWSOUTH INNOVATIONS PTY Ltd and University of Sydney | Date: 2012-02-04

Disclosed herein is a plant watering device (10) comprising a vessel, for example a tube (14), arranged to be fastened to at least one plant propagule. The tube (14) is further arranged such that, when the at least one propagule (18) is so fastened, water received by the tube (14) is drawn by the at least one propagule (18) through a portion of the tube (14).


Patent
University of Sydney and NewSouth Innovations Pty Ltd | Date: 2013-02-01

The present invention relates to a method of stabilising a tear film in an individual having an ocular surface inflammatory disorder by providing a compound to an ocular surface of the individual to reduce the synthesis of a cholesterol by a meibum-producing tissue.


News Article | January 4, 2016
Site: www.scientificcomputing.com

Tales of strange alien worlds, fantastic future technologies and bowls of sentient petunias have long captivated audiences worldwide. But science fiction is more than just fantasy in space; it can educate, inspire and expand our imaginations to conceive of the universe as it might be. We invited scientists to highlight their favorite science fiction novel or film and tell us what it was that captivated their imagination — and, for some, how it started their career. Long before the era of hard science fiction, Robert Heinlein took Einstein’s special theory of relativity and turned it into a masterpiece of young adult fiction. In Time for the Stars, Earth explores the Galaxy via a fleet of “torch ships”, spacecraft that travel at a significant fraction of the speed of light. Communication with the fleet is handled by pairs of telepathic twins, one of whom stays on Earth while the other journeys forth. The supposed simultaneity of telepathy overcomes the massive time delays that would otherwise occur over the immense distances of space. The catch is that at the tremendous speeds of these torch ships, time passes much more slowly than back on Earth. The story focuses on Tom, the space traveler, and his twin brother Pat, who remains behind. The years and decades sweep by for Pat, in a journey that takes mere months for Tom. Pat’s telepathic voice accelerates to a shrill squeal for Tom, as Einstein’s time dilation drives them apart, both metaphorically and physically. This is ultimately a breezy kids’ adventure novel, but it had a massive influence on me. Modern physics wasn’t abstruse. It was measurable, and it had consequences. I was hooked. And I’ve never let go. Stanley Kubrick’s 2001: A Space Odyssey encompasses human evolution, space, alien life and artificial intelligence. Despite the film being released the year before Apollo 11, its Academy Award-winning special effects still make its vision of space inspiring.
It can be spine-tingling when seen at an old-fashioned cinema with a wide screen and a 70mm print (such as Melbourne’s Astor). 2001 is also a product of its time. During the 1960s, NASA consumed roughly four percent of the US federal budget, and if that had continued, then perhaps the International Space Station would be the giant rotating behemoth seen in 2001. Indeed 2001’s Pan Am spaceplane seems like a natural progression from early (ambitious) proposals for the Space Shuttle. Technologies in the film are both ahead of and behind what we have today. The most memorable (and arguably most emotional) character of 2001 is HAL, an eerily intelligent computer that is far in advance of any computer in existence. And yet astronauts on the moon are using photographic film, rather than digital cameras. Kubrick deliberately made some space travel seem routine, so his space travelers are frozen in 1960s norms. The astronauts are mostly white men, with women mostly relegated to roles such as flight attendants (an exception is a Soviet scientist). Fortunately, in this regard, the 21st century is more advanced than 2001’s imagined future. The first book of the classic “Space Trilogy” was written 20 years before the launch of Sputnik 1 in 1957, the first “world-circling spaceship”. C. S. Lewis was no scientist — he was a professor of Medieval and Renaissance literature — but his deep knowledge of pre-modern cosmology gives his take on space travel a unique flavor. I find myself returning to Out of the Silent Planet and its sequel, Voyage to Venus, over and over again. In the story, Lewis' hero, Ransom, becomes a reluctant astronaut when kidnapped by the uber-colonial “hard” scientist Weston for a journey to Mars. Confined in the spherical spaceship, he becomes aware of a constant faint tinkling noise. In the world before space junk, it is a fine rain of micrometeoroids striking the aluminium shell.
Ransom’s “dismal fancy of the black, cold vacuity, the utter deadness, which was supposed to separate the worlds”, fostered by modern science, is transformed by the experience of actually being in space. His revelation is an intimately joyous recognition that space, far from being dead, is an “empyrean ocean of radiance,” whose “blazing and innumerable offspring” look down upon the Earth. He feels “life pouring into him from it every moment. How indeed should it be otherwise, since out of this ocean all the worlds and all their life had come?” How, indeed, could we not long for space after such a vision as this? Not just life on another planet… It was Larry Niven’s Ringworld that led, in part, to my career in astrophysics. Ringworld describes the exploration of an alien megastructure of unknown origin, discovered around a distant star. The artificial world is literally in the shape of a ring, with a radius corresponding to the distance of the Earth from the sun; mountainous walls on each side hold in the atmosphere, and the surface is decorated with a wide variety of alien plants and animals. The hero gets to the Ringworld via a mildly faster-than-light drive purchased at astronomical cost from an alien trading species, and makes use of teleportation disks and automated medical equipment. The appeal of high-technology stories like this is obvious: many contemporary problems, like personal transportation, overpopulation, disease and death, have all been solved by advanced technology; while, of course, new and interesting problems have arisen. Grand in scope, and featuring some truly bold ideas, Ringworld (and Niven’s other books set in “Known Space”) are as fresh now as when they were written, 40 years ago. Whether you have heard the radio play, read the book or seen the film, this story of a hapless Englishman negotiating his way through the galaxy is an essential piece of nerd culture.
I first heard the play as a teenager, and even now not many weeks go by without me delving into sections of this trilogy of five parts. As a scientist, my life can seem a little zany to an outsider. When your job does sometimes actually entail reversing the polarity of a neutron flow, you need to look to an even crazier fictional world for your escapism. And for me this book is it. A world where sperm whales and bowls of petunias can appear in space for no reason at all and staggering coincidences happen every time you power up your spaceship. The genius (and I do not use that word lightly) of Douglas Adams’ writing is that the loopy concepts of the book are presented with a thin veneer of “scienceness”, enough to make the fantastical concepts that little bit more believable. Then he “normalises” it all. A packet of peanuts will help you survive a matter transference beam, for instance. The heart of this book is its characters, a suite of people/aliens that are echoed in every workplace (certainly every laboratory) across the world. Walk into any science institute and there will be a two-headed power-hungry presidential leader railing at post-docs, with brains the size of planets, who really wish you hadn’t talked to them about life. I get the impression that Douglas Adams would not have wanted you to take anything away from this book. But, for me, it gives continued inspiration that there is always another way to sidle up to a problem. Most of all though: don’t panic. What might a post-scarcity society look like? I love a lot of science fiction, but Iain M. Banks’ classic space-opera Consider Phlebas is a special favorite. Banks describes the “Culture”, a diverse, anarchic, utopian and galaxy-spanning post-scarcity society. The Culture is a hybrid of enhanced and altered humanoids and artificial intelligences, which range from rather dull to almost godlike in their capabilities.
Most people in the Culture lead a relaxed, hedonistic lifestyle, going to parties, doing art, taking drugs (which they can synthesize from bio-engineered glands) and generally having fun. The tedious business of actually running the whole show is mostly left up to the most powerful AIs, called Minds, who manifest themselves in the great star-ships and orbitals in which most citizens live. Of course, it’s a big galaxy, and not everyone shares the Culture’s easy-going approach to galactic citizenship. Consider Phlebas is set against the backdrop of a growing conflict between the Culture and the Idirans, a speciesist, religious and hierarchical empire with expansion on its mind. Perhaps the best thing about Consider Phlebas (apart from the wonderfully irreverent ship names the Minds give themselves) is the fact that a story from this conflict is told from the perspective of an Idiran agent, who despises the Culture and everything it stands for. My own take on the book is as an ode to progressive technological humanism, and the astute reader will find many parallels to contemporary political and cultural issues. Truman Burbank, played with a delectable balance of animation and pathos by Jim Carrey, lives a confected life as the unwitting protagonist of a reality television show. Conceived on camera, adopted by a corporation and manipulated at every stage by the show’s sinister creative genius, Christof (Ed Harris), Truman nonetheless comes to realize that his world is a sham and that almost every interaction he ever had was a lie. Against the backdrop of Seahaven’s dystopic perfection, Weir exposes prescient glimpses of reality television, surveillance culture and the stalkerish targeted advertising we now find in our social media streams. It’s like a peppy 1984 but with corporate hegemony replacing the totalitarian state. But I was most gripped by the fresh take on ancient debates about rationalist nature and empiricist nurture.
As a student of behavior, I’ve always rued the amputation of biology from the social sciences, particularly the wasted opportunity that saw sociobiology turned into a pejorative in the late 1970s, at least outside the study of insect sociality. The rejection of evolutionary thinking as “biological determinism”, and its positioning as the opposite of progress and liberation, has always rankled me. I recall watching the film alone, between conferences, at an ancient cinema in Santa Cruz. What excited me most, and kept me up much of the night scribbling notes that would eventually shape my research direction and lead me to popular writing, was Weir’s clever inversion of the relationships between nature/nurture and determinism/free will. While Christof’s nurture tramples Truman’s nature throughout the film, in the end something inherent to Truman sets him free, as he whispers: “You never had a camera in my head!”. The climax of The Truman Show. “Good afternoon, good evening and goodnight!” In the decade before Albert Einstein told us that time and space were malleable, H. G. Wells gave us the adventures of the Time Traveller. We never learn the name of this Victorian scientist, a man who explains “there is no difference between time and any of the three dimensions of space” and builds a machine to explore this new world. It was not from Einstein that I discovered the non-absolute nature of space and time, but from the Time Traveller, and his present-day incarnation, Doctor Who. A view of the distant future from the recent past. The Traveller doesn’t head to the past, to be a voyeur at historical events, but into the unknown future. And the future of Wells is not glorious! The Traveller finds evolution has split humans in two, with the delicate Eloi being little more than food for the subterranean Morlocks. 
Escaping the mayhem and heading even further into the future, the Traveller finds life's last gasp under a swollen, red sun, eventually seeing the Earth succumb to a final freezing, before he returns to the relative safety of Victorian London. This scientific vision of the future struck me, and the nature of time has remained in my mind ever since. At the end of the story the Traveller heads back to continue his exploration of the future; playing with the equations of relativity is likely to be the closest I will ever come to realizing this dream. Michael J. I. Brown, Associate Professor, Monash University; Alice Gorman, Senior Lecturer in Archaeology and Space Studies, Flinders University; Bryan Gaensler, Director, Dunlap Institute for Astronomy and Astrophysics, University of Toronto; Duncan Galloway, Senior Lecturer in Astrophysics, Monash University; Geraint Lewis, Professor of Astrophysics, University of Sydney; Helen Maynard-Casely, Instrument Scientist, Australian Nuclear Science and Technology Organisation; Matthew Browne, Senior Lecturer in Statistics, CQUniversity Australia; and Rob Brooks, Scientia Professor of Evolutionary Ecology and Director, Evolution & Ecology Research Centre, UNSW Australia. This article was originally published on The Conversation. Read the original article.


News Article | September 14, 2016
Site: phys.org

The research, made possible by cutting-edge AAO instrumentation, means that astronomers can now classify galaxies according to their physical properties rather than human interpretation of a galaxy's appearance. For the past 200 years, telescopes have been capable of observing galaxies beyond our own galaxy, the Milky Way. Only a few were visible to begin with but as telescopes became more powerful, more galaxies were discovered, making it crucial for astronomers to come up with a way to consistently group different types of galaxies together. In 1926, the famous American astronomer Edwin Hubble refined a system that classified galaxies into categories of spiral, elliptical, lenticular or irregular shape. This system, known as the Hubble sequence, is the most common way of classifying galaxies to this day. Despite its success, the criteria on which the Hubble scheme is based are subjective, and only indirectly related to the physical properties of galaxies. This has significantly hampered attempts to identify the evolutionary pathways followed by different types of galaxies as they slowly change over billions of years. Dr Luca Cortese, from The University of Western Australia node of the International Centre for Radio Astronomy Research (ICRAR), said the world's premier astronomical facilities are now producing surveys consisting of hundreds of thousands of galaxies rather than the hundreds that Hubble and his contemporaries were working with. "We really need a way to classify galaxies consistently using instruments that measure physical properties rather than a time-consuming and subjective technique involving human interpretation," he said. In a study led by Dr Cortese, a team of astronomers has used a technique known as Integral Field Spectroscopy to quantify how gas and stars move within galaxies and reinterpret the Hubble sequence as a physically based two-dimensional classification system. 
"Thanks to the development of new technologies, we can map in great detail the distribution and velocity of different components of galaxies. Then, using this information we're able to determine the overall angular momentum of a galaxy, which is the key physical quantity affecting how the galaxy will evolve over billions of years. "Remarkably, the galaxy types described by the Hubble scheme appear to be determined by two primary properties of galaxies–mass and angular momentum. This provides us with a physical interpretation for the well known Hubble sequence whilst removing the subjectiveness and bias of a visual classification based on human perception rather than actual measurement." The new study involved 488 galaxies observed by the 3.9m Anglo Australian Telescope in New South Wales and an instrument attached to the telescope called the Sydney-AAO Multi-object Integral-field spectrograph or 'SAMI'. The SAMI project, led by the University of Sydney and the ARC Centre of Excellence for All-sky Astrophysics (CAASTRO), aims to create one of the first large-scale resolved survey of galaxies, measuring the velocity and distribution of gas and stars of different ages in thousands of systems. "Australia has a lot of expertise with this type of astronomy and is really at the forefront of what's being done," said Professor Warrick Couch, Director of the Australian Astronomical Observatory and CAASTRO Partner Investigator. "For the SAMI instrument we succeeded in putting 61 optical fibres within a distance that's less than half the width of a human hair. "That's no small feat, it's making this type of work possible and attracting interest from astronomers and observatories from around the world." Future upgrades of the instrument are planned that will allow astronomers to obtain even sharper maps of galaxies and further their understanding of the physical processes shaping the Hubble sequence. 
"As we get better at doing this and the instruments we're using are upgraded, we should be able to look for the physical triggers that cause one type of galaxy to evolve into another—that's really exciting stuff," Dr Cortese said. More information: The SAMI Galaxy Survey: the link between angular momentum and optical morphology. arxiv.org/abs/1608.00291


News Article | November 10, 2016
Site: spaceref.com

Astronomers may have solved the mystery of the peculiar volatile behavior of a supermassive black hole at the center of a galaxy. Combined data from NASA's Chandra X-ray Observatory and other observatories suggest that the black hole is no longer being fed enough fuel to make its surroundings shine brightly. Many galaxies have an extremely bright core, or nucleus, powered by material falling toward a supermassive black hole. These so-called "active galactic nuclei", or AGN, are some of the brightest objects in the Universe. Astronomers classify AGN into two main types based on the properties of the light they emit. One type of AGN tends to be brighter than the other. The brightness is generally thought to depend on either or both of two factors: the AGN could be obscured by surrounding gas and dust, or it could be intrinsically dim because the rate of feeding of the supermassive black hole is low. Some AGN have been observed to change once between these two types over the course of only 10 years, a blink of an eye in astronomical terms. However, the AGN associated with the galaxy Markarian 1018 stands out by changing type twice, from a faint to a bright AGN in the 1980s and then changing back to a faint AGN within the last five years. A handful of AGN have been observed to make this full-cycle change, but never before has one been studied in such detail. During the second change in type, the Markarian 1018 AGN became eight times fainter in X-rays between 2010 and 2016. After discovering the AGN's fickle nature during a survey project using ESO's Very Large Telescope (VLT), astronomers requested and received time to observe it with both NASA's Chandra X-ray Observatory and the Hubble Space Telescope. The accompanying graphic shows the AGN in optical light from the VLT (left) with a Chandra image of the galaxy's central region in X-rays showing the point source for the AGN (right). 
Data from ground-based telescopes including the VLT allowed the researchers to rule out a scenario in which the increase in the brightness of the AGN was caused by the black hole disrupting and consuming a single star. The VLT data also cast doubt on the possibility of obscuration by intervening gas causing the dimming. However, the true mechanism responsible for the AGN's surprising variation remained a mystery until Chandra and Hubble data were analyzed. Chandra observations in 2010 and 2016 conclusively showed that obscuration by intervening gas was not responsible for the decline in brightness. Instead, models of the optical and ultraviolet light detected by Hubble, NASA's Galaxy Evolution Explorer and the Sloan Digital Sky Survey in the bright and faint states showed that the AGN had faded because the black hole was being starved of infalling material. This starvation also explains the fading of the AGN in X-rays. One possible explanation for this starvation is that the inflow of fuel is being disrupted. This disruption could be caused by interactions with a second supermassive black hole in the system. A black hole binary is possible as the galaxy is the product of a collision and merger between two large galaxies, each of which likely contained a supermassive black hole in its center. The list of observatories used in this finding also includes NASA's Nuclear Spectroscopic Telescope Array (NuSTAR) mission and Swift spacecraft. Two papers describing these results, one with Bernd Husemann (previously at ESO and currently at the Max Planck Institute for Astronomy) as first author and the other led by Rebecca McElroy (University of Sydney), appeared in the September 2016 issue of the journal Astronomy & Astrophysics. NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra program for NASA's Science Mission Directorate in Washington. 
The Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, controls Chandra's science and flight operations.


News Article | March 9, 2016
Site: www.nature.com

Timothy Doran's 11-year-old daughter is allergic to eggs. And like about 2% of children worldwide who share the condition, she is unable to receive many routine vaccinations because they are produced using chicken eggs. Doran, a molecular biologist at the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Geelong, Australia, thinks that he could solve this problem using the powerful gene-editing tool CRISPR–Cas9. Most egg allergies are caused by one of just four proteins in the white, and when Doran's colleagues altered the gene that encodes one of these in bacteria, the resulting protein no longer triggered a reaction in blood serum from people who were known to be allergic to it1. Doran thinks that using CRISPR to edit the gene in chickens could result in hypoallergenic eggs. The group expects to hatch its first generation of chicks with gene modifications later this year as a proof of concept. Doran realizes that it could be some time before regulators would approve gene-edited eggs, and he hopes that his daughter will have grown out of her allergy by then. “If not, I've got someone ready and waiting to try the first egg,” he says. Chickens are just one of a menagerie of animals that could soon have their genomes reimagined. Until now, researchers had the tools to genetically manipulate only a small selection of animals, and the process was often inefficient and laborious. With the arrival of CRISPR, they can alter the genes of a wide range of organisms with relative precision and ease. In the past two years alone, the prospect of gene-edited monkeys, mammoths, mosquitoes and more has made headlines as scientists attempt to put CRISPR to use for applications as varied as agriculture, drug production and bringing back lost species. CRISPR-modified animals are even being marketed for sale as pets. 
“It's allowed us to consider a whole raft of projects we couldn't before,” says Bruce Whitelaw, an animal biotechnologist at the Roslin Institute in Edinburgh, UK. “The whole community has wholeheartedly moved towards genome editing.” But regulators are still working out how to deal with such creatures, particularly those intended for food or for release into the wild. Concerns abound about safety and ecological impacts. Even the US director of national intelligence has weighed in, saying that the easy access, low cost and speedy development of genome editing could increase the risk that someone will engineer harmful biological agents. Eleonore Pauwels, who studies biotechnology regulation at the Wilson Center in Washington DC, says that the burgeoning use of CRISPR in animals offers an opportunity for researchers and policymakers to engage the public in debate. She hopes that such discussions will help in determining which uses of CRISPR will be most helpful to humans, to other species and to science — and will highlight the limits of the technology. “I think there is a lot of value in humility about how much control we have,” she says. Disease resistance is one of the most popular applications for CRISPR in agriculture, and scientists are tinkering across a wide spectrum of animals. Biotechnology entrepreneur Brian Gillis in San Francisco is hoping that the tool can help to stem the dramatic loss of honeybees around the world, which is being caused by factors such as disease and parasites. Gillis has been studying the genomes of 'hygienic' bees, which obsessively clean their hives and remove sick and infested bee larvae. Their colonies are less likely to succumb to mites, fungi and other pathogens than are those of other strains, and Gillis thinks that if he can identify genes associated with the behaviour, he might be able to edit them in other breeds to bolster hive health. But the trait could be difficult to engineer. 
No hygiene-associated genes have been definitively identified, and the roots of the behaviour may prove complex, says BartJan Fernhout, chairman of Arista Bee Research in Boxmeer, the Netherlands, which studies mite resistance. Moreover, if genes are identified, he says, conventional breeding may be sufficient to confer resistance to new populations, and that might be preferable given the widespread opposition to genetic engineering. Such concerns don't seem to have slowed down others studying disease resistance. Whitelaw's group at the Roslin Institute is one of several using CRISPR and other gene-editing systems to create pigs that are resistant to viral diseases that cost the agricultural industry hundreds of millions of dollars each year. Whitelaw's team is using another gene-editing technique to alter immune genes in domestic pigs to match more closely those of warthogs that are naturally resistant to African swine fever, a major agricultural pest2. And Randall Prather at the University of Missouri in Columbia has created pigs with a mutated protein on the surface of their cells, which should make them impervious to a deadly respiratory virus3. Other researchers are making cattle that are resistant to the trypanosome parasites that are responsible for sleeping sickness. Whitelaw hopes that regulators — and sceptical consumers — will be more enthusiastic about animals that have had their genes edited to improve disease resistance than they have been for traits such as growth promotion because of the potential to reduce suffering. And some governments are considering whether CRISPR-modified animals should be regulated in the same way as other genetically modified organisms, because they do not contain DNA from other species. Doran's quest to modify allergens in chicken eggs requires delicate control. 
The trick is to finely adjust a genetic sequence in a way that will stop the protein from triggering an immune reaction in people, but still allow it to perform its normal role in embryonic development. CRISPR has made such precise edits possible for the first time. “CRISPR has been the saviour for trying to tackle allergens,” says Mark Tizard, a molecular biologist at CSIRO who works with Doran on chickens. Using the technique in birds still presents problems. Mammals can be induced to produce extra eggs, which can then be removed, edited, fertilized and replaced. But in birds, the fertilized egg binds closely to the yolk and removing it would destroy the embryo. And because eggs are difficult to access while still inside the hen, CRISPR components cannot be directly injected into the egg itself. By the time the egg is laid, development has proceeded too far for gene editing to affect the chick's future generations. To get around this, Tizard and Doran looked to primordial germ cells (PGCs) — immature cells that eventually turn into sperm or eggs. Unlike in many animals, chicken PGCs spend time in the bloodstream during development. Researchers can therefore remove PGCs, edit them in the lab and then return them to the developing bird. The CSIRO team has even developed a method to insert CRISPR components directly into the bloodstream so that they can edit PGCs there4. The researchers also plan to produce chickens with components required for CRISPR integrated directly into their genomes — what they call CRISPi chickens. This would make it even easier to edit chicken DNA, which could be a boon for 'farmaceuticals' — drugs created using domesticated animals. Regulators have shown a willingness to consider such drugs. In 2006, the European Union approved a goat that produces an anticlotting protein in its milk. It was subsequently approved by the US Food and Drug Administration, in 2009. 
And in 2015, both agencies approved a transgenic chicken whose eggs contain a drug for cholesterol diseases. About 4,000 years ago, hunting by humans helped to drive woolly mammoths (Mammuthus primigenius) to extinction. CRISPR pioneer George Church at Harvard Medical School in Boston, Massachusetts, has attracted attention for his ambitious plan to undo the damage by using CRISPR to transform endangered Indian elephants into woolly mammoths — or at least cold-resistant elephants. The goal, he says, would be to release them into a reserve in Siberia, where they would have space to roam. The plan sounds wild — but efforts to make mammals more mammoth-like have been going on for a while. Last year, geneticist Vincent Lynch at the University of Chicago in Illinois showed that cells with the mammoth version of a gene for heat-sensing and hair growth could grow in low temperatures5, and mice with similar versions prefer the colder parts of a temperature-regulated cage6. Church says that he has edited about 14 such genes in elephant embryos. But editing, birthing and then raising mammoth-like elephants is a huge undertaking. Church says that it would be unethical to implant gene-edited embryos into endangered elephants as part of an experiment. So his lab is looking into ways to build an artificial womb; so far, no such device has ever been shown to work. There are some de-extinction projects that could prove less challenging. Ben Novak at the University of California, Santa Cruz, for example, wants to resurrect the passenger pigeon (Ectopistes migratorius), a once-ubiquitous bird that was driven to extinction in the late nineteenth century by overhunting. His group is currently comparing DNA from museum specimens to that of modern pigeons. Using PGC methods similar to Doran's, he plans to edit the modern-pigeon genomes so that the birds more closely resemble their extinct counterparts. 
Novak says that the technology is not yet advanced enough to modify the hundreds of genes that differ between modern and historic pigeons. Still, he says that CRISPR has given him the best chance yet of realizing his lifelong dream of restoring an extinct species. “I think the project is 100% impossible without CRISPR,” he says. For decades, researchers have explored the idea of genetically modifying mosquitoes to prevent the spread of diseases such as dengue or malaria. CRISPR has given them a new way to try. In November, molecular biologist Anthony James of the University of California, Irvine, revealed a line of mosquitoes with a synthetic system called a gene drive that passes a malaria-resistance gene on to the mosquitoes' offspring7. Gene drives ensure that almost all the insects' offspring inherit two copies of the edited gene, allowing it to spread rapidly through a population. Another type of gene drive, published last December8, propagates a gene that sterilizes all female mosquitoes, which could wipe out a population. The outbreak of mosquito-borne Zika virus in Central and South America has increased interest in the technology, and several research labs have begun building gene drives that could eliminate the Zika-carrying species, Aedes aegypti. Many scientists are worried about unintended and unknown ecological consequences of releasing such a mosquito. For this reason, Church and his colleagues have developed 'reverse gene drives' — systems that would propagate through the population to cancel out the original mutations9, 10. But Jason Rasgon, who works on genetically modified insects at Pennsylvania State University in University Park, says that although ecology should always be a consideration, the extent and deadliness of some human diseases such as malaria may outweigh some costs. 
Mosquitoes are some of the easiest insects to work with, he says, but researchers are looking at numerous other ways to use gene drives, including making ticks that are unable to transmit the bacteria that cause Lyme disease. Last year, researchers identified a set of genes that could be modified to prevent aquatic snails (Biomphalaria glabrata) from transmitting the parasitic disease schistosomiasis11. Last November, after a lengthy review, the US Food and Drug Administration approved the first transgenic animals for human consumption: fast-growing salmon made by AquaBounty Technologies of Maynard, Massachusetts. Some still fear that if the salmon escape, they could breed with wild fish and upset the ecological balance. To address such concerns, fish geneticist Rex Dunham of Auburn University in Alabama has been using CRISPR to inactivate genes for three reproductive hormones — in this case, in catfish, the most intensively farmed fish in the United States. The changes should leave the fish sterile, so any fish that might escape from a farm, whether genetically modified or not, would stand little chance of polluting natural stocks. “If we're able to achieve 100% sterility, there is no way that they can make a genetic impact,” Dunham says. Administering hormones would allow the fish to reproduce for breeding purposes. And Dunham says that similar methods could be used in other fish species. CRISPR could also reduce the need for farmers to cull animals, an expensive and arguably inhumane practice. Biotechnologist Alison van Eenennaam at the University of California, Davis, is using the technique to ensure that beef cattle produce only male or male-like offspring, because females produce less meat and are often culled. She copies a Y-chromosome gene that is important for male sexual development onto the X chromosome in sperm. Offspring produced with the sperm would be either normal, XY males, or XX females with male traits such as more muscle. 
In the egg industry, male chicks from elite egg-laying chicken breeds have no use, and farmers generally cull them within a day of hatching. Tizard and his colleagues are adding a gene for green fluorescent protein to the chickens' sex chromosomes so that male embryos will glow under ultraviolet light. Egg producers could remove the male eggs before they hatch and potentially use them for vaccine production. There are other ways that CRISPR could make agriculture more humane. Packing cattle into trailers or other small spaces often causes injuries, especially when the animals have long horns. So cattle farmers generally burn, cut or remove them with chemicals — a process that can be painful for the animal and dangerous for the handler. There are cattle varieties that do not have horns — a condition called 'polled' — but crossing these breeds with 'elite' meat or dairy breeds reduces the quality of the offspring. Molecular geneticist Scott Fahrenkrug, founder of Recombinetics in Saint Paul, Minnesota, is using gene-editing techniques to transfer the gene that eliminates horns into elite breeds12. The company has produced only two polled calves so far — both male — which are being raised at the University of California, Davis, until they are old enough to breed. Last September, the genomics firm BGI wowed a conference in Shenzhen, China, with micropigs — animals that grow to only around 15 kilograms, about the size of a standard dachshund. BGI had originally intended to make the pigs for research, but has since decided to capitalize on creation of the animals by selling them as pets for US$1,600. The plan is to eventually allow buyers to request customized coat patterns. BGI is also using CRISPR to alter the size, colour and patterns of koi carp. 
Koi breeding is an ancient tradition in China, and Jian Wang, director of gene-editing platforms at BGI, says that even good breeders will usually produce only a few of the most beautifully coloured and proportioned, 'champion quality' fish out of millions of eggs. CRISPR, she says, will let them precisely control the fish's patterns, and could also be used to make the fish more suitable for home aquariums rather than the large pools where they are usually kept. Wang says that the company will begin selling koi in 2017 or 2018 and plans to eventually add other types of pet fish to its repertoire. Claire Wade, a geneticist at the University of Sydney in Australia, says that CRISPR could be used to enhance dogs. Her group has been cataloguing genetic differences between breeds and hopes to identify areas involved in behaviour and traits such as agility that could potentially be edited13. Sooam Biotech in Seoul, best-known for a service that will clone a deceased pet for $100,000, is also interested in using CRISPR. Sooam researcher David Kim says that the company wants to enhance the capabilities of working dogs — guide dogs or herding dogs, for example. Jeantine Lunshof, a bioethicist who works in Church's lab at Harvard, says that engineering animals just to change their appearance, “just to satisfy our idiosyncratic desires”, borders on frivolous and could harm animal well-being. But she concedes that the practice is not much different from the inbreeding that humans have been performing for centuries to enhance traits in domestic animals and pets. And CRISPR might even help to eliminate some undesirable characteristics: many dog breeds are prone to hip problems, for example. 
“If you could use genome editing to reverse the very bad effects we have achieved by this selective inbreeding over decades, then that would be good.” Ferrets have long been a useful model for influenza research because the virus replicates in their respiratory tracts and they sometimes sneeze when infected, allowing studies of virus transmission. But until the arrival of CRISPR, virologists lacked the tools to easily alter ferret genes. Xiaoqun Wang and his colleagues at the Chinese Academy of Sciences in Beijing have used CRISPR to tweak genes involved in ferret brain development14, and they are now using it to modify the animals' susceptibility to the flu virus. He says that he will make the model available to infectious-disease researchers. Behavioural researchers are particularly excited about the prospect of genetically manipulating marmosets and monkeys, which are more closely related to humans than are standard rodent models. The work is moving most quickly in China and Japan. In January, for instance, neuroscientist Zilong Qiu and his colleagues at the Chinese Academy of Sciences in Shanghai published a paper15 describing macaques with a CRISPR-induced mutation in MECP2, the gene associated with the neurodevelopmental disorder Rett syndrome. The animals showed symptoms of autism spectrum disorder, including repetitive behaviours and avoiding social contact. But Anthony Chan, a geneticist at Emory University in Atlanta, Georgia, cautions that researchers must think carefully about the ethics of creating such models and whether more-standard laboratory animals such as mice would suffice. “Not every disease needs a primate model,” he says. Basic neuroscience could also benefit from the availability of new animal models. Neurobiologist Ed Boyden at the Massachusetts Institute of Technology is raising a colony of the world's tiniest mammal — the Etruscan tree shrew (Suncus etruscus). 
The shrews' brains are so small that the entire organ can be viewed under a microscope at once. Gene edits that cause neurons to flash when they fire, for instance, could allow researchers to study the animal's entire brain in real time. The CRISPR zoo is expanding fast — the question now is how to navigate the way forward. Pauwels says that the field could face the same kind of public backlash that bedevilled the previous generation of genetically modified plants and animals, and to avoid it, scientists need to communicate the advantages of their work. “If it's here and can have some benefit,” she says, “let's think of it as something we can digest and we can own.”


News Article | December 6, 2016
Site: www.eurekalert.org

Scientists have developed a new optical chip for a telescope that enables astronomers to have a clear view of alien planets that may support life. Seeing an Earth-like planet outside the Solar System that orbits close to its host star is very difficult with today's standard astronomical instruments because of the star's brightness. Associate Professor Steve Madden from The Australian National University (ANU) said the new chip removes light from the host star, allowing astronomers for the first time to take a clear image of the planet. "The ultimate aim of our work with astronomers is to be able to find a planet like Earth that could support life," said Dr Madden from the ANU Research School of Physics and Engineering. "To do this we need to understand how and where planets form inside dust clouds, and then use this experience to search for planets with an atmosphere containing ozone, which is a strong indicator of life." Physicists and astronomers at ANU worked on the optical chip with researchers at the University of Sydney and the Australian Astronomical Observatory. Dr Madden said the optical chip worked in a similar way to noise-cancelling headphones. "This chip is an interferometer that adds equal but opposite light waves from a host sun which cancels out the light from the sun, allowing the much weaker planet light to be seen," he said. PhD student Harry-Dean Kenchington Goldsmith, who built the chip at the ANU Laser Physics Centre, said the technology works like the thermal imaging that firefighters rely on to see through smoke. "The chip uses the heat emitted from the planet to peer through dust clouds and see planets forming. Ultimately the same technology will allow us to detect ozone on alien planets that could support life," said Mr Kenchington Goldsmith from the ANU Research School of Physics and Engineering. 
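The noise-cancelling analogy can be made concrete with a toy model of a nulling interferometer: light collected at two points is combined with a deliberate π phase shift, so an on-axis star cancels exactly, while a slightly off-axis planet, whose light reaches the two apertures with an extra geometric delay, does not. All numbers below are invented for illustration and do not describe the actual ANU chip:

```python
import cmath
import math

# Toy two-aperture nulling interferometer (illustrative values only).
WAVELENGTH = 3.5e-6   # observing wavelength in metres (thermal infrared)
BASELINE = 10.0       # separation between the two apertures in metres

def nulled_intensity(theta):
    """Combined intensity for a point source at angle theta (radians).
    One arm carries a deliberate pi phase shift, so an on-axis source
    interferes destructively and is suppressed."""
    # Extra geometric path delay seen by the second aperture.
    phase = 2 * math.pi * BASELINE * math.sin(theta) / WAVELENGTH
    amplitude = cmath.exp(1j * phase) + cmath.exp(1j * math.pi)
    return abs(amplitude) ** 2

star = nulled_intensity(0.0)       # on-axis: starlight cancels to ~0
planet = nulled_intensity(5e-7)    # off-axis: planet light survives
```

In this single-wavelength sketch the star's intensity is suppressed to essentially zero while the off-axis source is not; a real chip must maintain that null across a broad infrared band, which is where the specialised optical materials come in.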
The innovation builds on over 10 years of research on specialised optical materials and devices that has been supported through CUDOS, a centre of excellence funded by the Australian Research Council. The research is being presented at the Australian Institute of Physics Congress in Brisbane this week. ANU media team can provide a copy of the research paper and related images to journalists, upon request.
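The cancellation Dr Madden describes is ordinary destructive interference, and a toy simulation makes the idea concrete: starlight entering two arms in phase is recombined with a pi phase shift so it nulls exactly, while an off-axis planet's light carries an extra path delay and survives. This is an illustrative sketch only; the frequency, amplitudes and delay are invented numbers, not parameters of the ANU chip.

```python
import numpy as np

# Toy model of a nulling interferometer (illustrative values throughout).
t = np.linspace(0.0, 1e-12, 1000)   # time samples (s)
f = 2.0e14                          # an optical frequency (Hz), assumed
star_amp, planet_amp = 1.0, 1e-4    # planet taken to be 10,000x fainter
delay = 0.25 / f                    # planet's extra path delay (assumed)

# Each arm sees star + planet; the chip flips the phase of arm 2 by pi.
arm1 = star_amp * np.sin(2*np.pi*f*t) + planet_amp * np.sin(2*np.pi*f*t)
arm2 = star_amp * np.sin(2*np.pi*f*t + np.pi) \
     + planet_amp * np.sin(2*np.pi*f*(t - delay) + np.pi)

combined = arm1 + arm2   # star terms cancel; the planet's delayed light does not
print(np.max(np.abs(combined)))   # residual on the order of planet_amp
```

The on-axis star cancels because sin(x) and sin(x + pi) sum to zero at every sample, while the planet's delay breaks that symmetry, which is exactly why the much weaker planet light survives the null.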


News Article | December 12, 2016
Site: www.cemag.us

The Australian National University (ANU) has led an international project to make a diamond that's predicted to be harder than a jeweler's diamond and useful for cutting through ultra-solid materials on mining sites. ANU Associate Professor Jodie Bradby says her team, including ANU PhD student Thomas Shiell and experts from RMIT, the University of Sydney, and the United States, made nano-sized Lonsdaleite, a hexagonal diamond found in nature only at the sites of meteorite impacts such as Canyon Diablo in the U.S.

"This new diamond is not going to be on any engagement rings. You'll more likely find it on a mining site — but I still think that diamonds are a scientist's best friend. Any time you need a super-hard material to cut something, this new diamond has the potential to do it more easily and more quickly," says Bradby, from the ANU Research School of Physics and Engineering.

Her research team made the Lonsdaleite in a diamond anvil at 400 degrees Celsius, halving the temperature at which it can be formed in a laboratory. "The hexagonal structure of this diamond's atoms makes it much harder than regular diamonds, which have a cubic structure. We've been able to make it at the nanoscale and this is exciting because often with these materials 'smaller is stronger'."


News Article | February 15, 2017
Site: www.eurekalert.org

James Cook University scientists have helped discover the remnants of a massive undersea landslide on the Great Barrier Reef, approximately 30 times the volume of Uluru. JCU's Dr Robin Beaman said the remains of the slip, known as the Gloria Knolls Slide, were discovered 75 kilometres off the north Queensland coast near the town of Innisfail while the scientists were working from the Marine National Facility's blue-water research ship Southern Surveyor. "This is all that remains after a massive collapse of sediment of about 32 cubic kilometres' volume more than 300,000 years ago," he said.

Dr Beaman said a debris field of large blocks, or knolls, along with numerous smaller blocks, lies scattered over 30 kilometres from the main landslide remains into the Queensland Trough, to a depth of 1,350 metres. "We were amazed to discover this cluster of knolls while 3D multibeam mapping the deep GBR seafloor. In an area of the Queensland Trough that was supposed to be relatively flat were eight knolls, appearing like hills, some over 100 m high and 3 km long."

Associate Professor Jody Webster from the University of Sydney likened the research to a detective story: first finding the knolls, then using later mapping to reveal the landslide as their source. A sediment sample from a knoll at a depth of 1,170 metres revealed a remarkable cold-water coral community of both living and fossil cold-water coral species, gorgonian sea whips, bamboo corals, molluscs and stalked barnacles. "The oldest fossil coral recovered off the top of the knoll was 302,000 years old," said Dr Angel Puga-Bernabéu of the University of Granada, lead author on the study, "which means the landslide event that created these knolls must be older."

Modelling of a sudden 'mass failure' on this scale yields a three-dimensional tsunami wave elevation of about 27 metres. However, the wave would likely have been dampened significantly by the presence of any coral reefs.
Considerably more seabed mapping and sampling is needed to fully assess the tsunami hazard to the Queensland coast posed by these types of underwater landslides. The scientists said one-third of the Great Barrier Reef lies beyond the seaward edge of the shallower reefs, and the discovery of this prominent undersea landslide and its vast debris field in the deep Great Barrier Reef reveals a far more complex landscape than previously known. This research is a collaborative effort between James Cook University, University of Sydney, University of Granada, University of Edinburgh and the Australian Nuclear Science and Technology Organisation.
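The "30 times the volume of Uluru" comparison can be made concrete with one line of arithmetic: dividing the slide's reported 32 cubic kilometres by 30 gives the Uluru volume the comparison implies. A trivial check:

```python
# Figures as reported above; the Uluru volume is implied by the comparison,
# not measured here.
slide_volume_km3 = 32.0   # Gloria Knolls Slide
uluru_multiples = 30.0    # "approximately 30 times the volume of Uluru"

implied_uluru_km3 = slide_volume_km3 / uluru_multiples
print(round(implied_uluru_km3, 2))   # 1.07
```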


News Article | October 31, 2016
Site: globenewswire.com

SAN FRANCISCO, Oct. 31, 2016 (GLOBE NEWSWIRE) -- For the first time, symptoms of cancer-related cognitive decline – often called “chemobrain” – were reversed in a large, home-based, randomized controlled trial using unique computerized brain exercises, according to a report today in the Journal of Clinical Oncology.

Breast cancer support groups first brought attention to a phenomenon they called “chemobrain” or “chemofog” in the 1980s, and its seriousness and very existence were questioned by many in the medical profession. Studies of the condition did not begin until the late 1990s, and there continues to be controversy over its causes. Studies indicate that up to 70 percent of patients treated with chemotherapy experience cognitive decline, that such symptoms can persist for 10 or more years, and that the effects can interfere with maintaining employment, relationships and day-to-day independence; despite this, there are no broadly accepted treatments for cancer-related cognitive impairment. Millions of patients are treated with chemotherapy each year, and there are more than 35 million people worldwide with five or more years of cancer survival.

Researchers from the Survivorship Research Group at the University of Sydney, Australia, conducted a randomized controlled trial among 242 cancer survivors who reported cognitive problems 6-60 months after completing chemotherapy. All study participants received a phone consultation. Half were assigned to the control group and received standard care from their healthcare providers; the other half received standard care and were asked to complete a home-based intervention of online brain exercises totalling 40 hours (40 minutes, four times per week, for 15 weeks). The exercises used in the study were a suite of five visual speed-of-processing exercises that are part of BrainHQ, a commercially available online brain-training subscription service.
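The training dose quoted above is internally consistent, as a quick check shows: 40-minute sessions, four times a week for 15 weeks, comes to exactly the stated 40 hours.

```python
# Verifying the stated training dose (numbers taken from the article).
minutes_per_session = 40
sessions_per_week = 4
weeks = 15

total_hours = minutes_per_session * sessions_per_week * weeks / 60
print(total_hours)   # 40.0
```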
Researchers administered this home-based study remotely, using a standard self-report cognitive assessment (FACT-COG), with the Perceived Cognitive Impairment (PCI) subscale designated as the primary outcome measure. Secondary endpoints included standard self-report assessments of stress (PSS), fatigue (FACT-F), anxiety/depression (GHQ) and quality of life (QOL FACT-G), plus an online self-administered neuropsychological test (Cogstate). Participants were assessed at baseline, after the 15-week intervention, and six months later.

Researchers reported that the intervention group, compared with the control group, reported significantly less perceived cognitive impairment (the primary outcome measure, FACT-COG PCI) immediately after the intervention (p<0.0001) and six months later (p<0.0002). The intervention group also performed significantly better on many secondary measures: on the stress measure after the intervention and at six months; on the fatigue and anxiety/depression measures after training, with a trend toward better performance at six months; on the quality-of-life measure at six months, but not immediately after training; and on all FACT-COG subscales after the intervention, though only on some at six months. The computerized neuropsychological assessment (Cogstate) showed no between-group difference after training or six months later.

“The use of this web-based intervention led to improvements in cognitive symptoms that were sustained six months later,” said Dr. Janette Vardy of the University of Sydney, the senior author on the paper. “While this builds on prior work, to our knowledge, it is the largest trial showing improvement in cognitive symptoms among cancer survivors after chemotherapy.” “This is an important step forward,” commented Dr. Diane Von Ah of Indiana University, who ran a prior study using the same intervention, with similar results, in a classroom setting.
“This new study suggests that this program can be used successfully in the home to address a serious problem that has too often been ignored, trivialized, or even denied to exist.” “We are excited by the addition of these independent research results to our body of knowledge,” said Dr. Henry Mahncke, CEO of Posit Science, the maker of the BrainHQ exercises used in this intervention. “We plan to approach appropriate regulatory agencies to explore the shortest path to getting a form of these exercises into the hands of patients who may be helped.”


News Article | February 15, 2017
Site: www.bbc.co.uk

Scientists have discovered evidence of a massive ancient undersea landslide next to Australia's Great Barrier Reef. The Gloria Knolls Slide is at least 300,000 years old and 32 cubic km in volume, or 30 times the size of Uluru, a rock landmark in central Australia. The landslide could also have triggered a tsunami, the international team says. The scientists said debris from the landslide, found as deep as 1,350m (4,430ft) below the sea, also provided clues about hidden marine life.

The team made the discovery while conducting three-dimensional mapping of ancient reefs in the Queensland Trough, a vast basin adjoining the Great Barrier Reef. Dr Robin Beaman, from Queensland's James Cook University, said the researchers located a cluster of hills, or knolls, more than 1,100m beneath the surface. "What we discovered was the smoking gun," he told the BBC. "It was quite clear that those knolls were the remains of a very large undersea landslide that had occurred some time ago." That time was at least 300,000 years ago, he said, because coral fossils collected from the knolls went back that far, and the landslide would have predated them. He described it as a "catastrophic collapse" because the knolls - as long as 3.6km (2.2 miles) - were found 30km from their original location. Other evidence of the landslide would have been buried over time, he said.

The research, published in the journal Marine Geology, said the landslide had the potential to cause a large tsunami. Although modelling had put its elevation as high as 27m, the impact of a tsunami would have been significantly offset by the presence of coral reefs. "The Great Barrier Reef acts like a giant porous breakwater to reduce the energy [of ocean swell]," said Dr Beaman. "If it was in existence at the time of this landslide, it would have done a similar job." He said future risk to the Queensland coast appeared unlikely because it was "a very old event", but it was a worthy topic for future research.
The researchers found deep marine life including cold-water corals, molluscs and barnacles were thriving on the knolls. The corals, unlike their shallow reef counterparts, could survive in 4C temperatures with no sunlight, Dr Beaman said. He said possibilities for future research were exciting. "That really is the next frontier," he said. "We probably have a bit of an idea of what's living up to 200m or 300m [deep], but beyond that, very few people have done much work in this area." The research was a collaboration between James Cook University, University of Sydney, University of Granada, University of Edinburgh and the Australian Nuclear Science and Technology Organisation.


News Article | February 15, 2017
Site: www.eurekalert.org

CHICAGO (February 15, 2017): Patients with a type of advanced malignant cancer of the arms or legs have typically faced amputation of the afflicted limb as the only treatment option. However, a technique that limits the application of chemotherapy to the cancerous region can preserve limbs in a high percentage of these patients, researchers from five cancer centers in the United States and Australia report in a study published online as an "article in press" on the Journal of the American College of Surgeons website in advance of print publication.

The researchers used the treatment technique, known as regional chemotherapy with isolated limb infusion (ILI), in 77 patients with treatment-resistant, locally advanced soft tissue sarcomas (STS), and were able to salvage limbs in 77.9 percent of the cases. "Isolated limb infusion is a safe and effective technique of treatment of patients with locally advanced soft tissue sarcoma who otherwise might require amputation," said lead study author John E. Mullinax, MD, from Moffitt Cancer Center, Tampa, Fla. The study, conducted over a 22-year period from 1994-2016, is the largest one to date of limb preservation using ILI for sarcoma. "Advocates for ILI in these patients would argue that, with similar long-term survival data and meaningful overall response rates, patients would much prefer a treatment that preserves the affected extremity to one that does not," Dr. Mullinax said.

ILI has historically been used primarily for melanoma of the extremities, and the use of this technique in sarcoma is a more novel approach. Sarcoma is a rare type of cancer in the extremities with several different subtypes; the study patients who underwent ILI had 17 different subtypes of sarcoma. The rationale for amputation of soft tissue sarcoma of the arm or leg has been to prevent the cancer from spreading, or metastasizing, to other parts of the body. Dr. Mullinax noted that one concern with the use of ILI in these cancers is that it does not address distant metastatic disease. "The reality is that those patients who develop metastatic disease after amputation or ILI likely may already have distant microscopic disease at the time of the procedure, but the radiographic staging studies are not sensitive enough to detect it," Dr. Mullinax said. "In this sense, the treatment of the extremity disease is not the determinant of long-term survival."

In the study population, 19 patients had 21 procedures for upper-extremity disease and 58 patients had 63 infusions for lower-extremity disease. The results varied significantly for the two groups. The overall three-month response rate to ILI was 58 percent, but it was only 37 percent for those with upper-extremity disease vs. 66 percent for lower-extremity disease. Likewise, those who had upper-extremity sarcomas had a lower median overall survival than their lower-extremity counterparts, 27.9 months vs. 56.6 months. For the entire study population, the median overall survival was 44.3 months. Entering the study, all the patients had sarcomas that could be removed only with an amputation, but afterward 30 percent had a complete response to ILI, in many cases because patients were able to have a surgical procedure to remove the tumors without amputation. For those who eventually needed an amputation, the median time to amputation was 4.5 months following ILI.

The ILI technique involves circulating the chemotherapy agents melphalan and actinomycin D in the blood vessels of the affected area of the arm or leg, and the use of a tourniquet to block the chemotherapy drugs from circulating through the rest of the body, thus creating a closed circuit. The drugs circulate in the target area for 30 minutes, and then are flushed out before the tourniquet is removed and full circulation is restored.
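The subgroup figures reported above can be cross-checked against the overall response rate. Weighting the 37 percent and 66 percent subgroup rates by the number of infusions (21 upper-extremity and 63 lower-extremity; using infusion counts as weights is a simplifying assumption, since the published rate may be computed per patient) lands close to the reported 58 percent once rounding of the subgroup rates is allowed for.

```python
# Cross-check of the reported response rates (counts and rates from the article).
upper_n, upper_rate = 21, 0.37   # upper-extremity infusions
lower_n, lower_rate = 63, 0.66   # lower-extremity infusions

overall = (upper_n * upper_rate + lower_n * lower_rate) / (upper_n + lower_n)
print(round(overall, 3))   # roughly 0.59, in line with the reported 58 percent
```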
ILI for soft tissue sarcoma of the extremities can be repeated, whereas another procedure for administering chemotherapy to the arms or legs, hyperthermic isolated limb perfusion, requires an incision to openly cannulate the vessels and generally cannot be repeated, Dr. Mullinax explained. The ILI technique requires a team to help perform the procedure, such as an interventional radiology team to place the catheter in the artery before the procedure, a perfusionist to oversee the circuit, and operating room staff familiar with chemotherapy precautions, Dr. Mullinax said.

"Most patients would prefer to have more time with their leg rather than face an amputation," Dr. Mullinax said. "It's known that for patients with soft-tissue sarcoma, the life-limiting disease is not in the extremity but it's actually in the metastatic disease. An inoperable sarcoma of the thigh does not affect survival to the degree that metastatic disease in the lung does." Dr. Mullinax said one limitation of the study was that it did not randomize patients between ILI and amputation, so a head-to-head comparison of response to treatment and survival cannot be performed with this dataset. The study also did not evaluate quality of life or patient-related factors for those who had limb salvage vs. those who had amputation.

Co-senior authors of the study are Ricardo J. Gonzalez, MD, FACS, and Jonathan S. Zager, MD, FACS, of the Moffitt Cancer Center; other coauthors are Hidde M. Kroon, MD, PhD, of the Melanoma Institute of Australia, University of Sydney; Neel Nath, BS, and Paul J. Mosca, MD, PhD, FACS, of Duke University, Durham, N.C.; Jeffrey M. Farma, MD, FACS, of the Fox Chase Cancer Center, Philadelphia; Rajendra Bhati, MD, FACS, of Marietta Memorial Hospital, Marietta, Ohio; Danielle Hardmann, BS, and Sean Sileno, BS, of Morsani College of Medicine, University of South Florida, Tampa; and Christina O'Donoghue, MD, Matthew Perez, MD, Syeda Mahrukh Hussnain Naqvi, MD, and Y. Ann Chen, PhD, of the Moffitt Cancer Center and Research Institute.

A video summary of study highlights can be viewed at: https:/ This study was presented at the 128th annual meeting of the Southern Surgical Association, in Palm Beach, Florida, in December 2016. This work was supported by a Cancer Center Support Grant to the H. Lee Moffitt Comprehensive Cancer Center and Research Institute. NOTE: "FACS" designates that a surgeon is a Fellow of the American College of Surgeons. Citation: Isolated Limb Infusion as a Limb Salvage Strategy for Locally Advanced Extremity Sarcoma. Journal of the American College of Surgeons.

About the American College of Surgeons: The American College of Surgeons is a scientific and educational organization of surgeons that was founded in 1913 to raise the standards of surgical practice and improve the quality of care for all surgical patients. The College is dedicated to the ethical and competent practice of surgery. Its achievements have significantly influenced the course of scientific surgery in America and have established it as an important advocate for all surgical patients. The College has more than 80,000 members and is the largest organization of surgeons in the world. For more information, visit http://www.


News Article | September 8, 2016
Site: news.yahoo.com

If alcohol is a part of your weekly routine, you should make sure to find time to hit the gym: A new study from the United Kingdom suggests that regular exercise can help balance out the harmful effects of alcohol. People in the study who drank alcohol — but also exercised on a regular basis — were less likely to die from any cause during the study period, compared with those who drank but didn't exercise. And exercise had a particularly strong effect on drinkers' risk of dying from cancer. Alcohol is known to increase people's risk of cancer, but the study's findings showed that regular physical activity nearly canceled out this increased risk, according to the study, published today (Sept. 7) in the British Journal of Sports Medicine. Although it's not exactly clear how exercise may counteract the effects of alcohol when it comes to cancer risk, there are several mechanisms that could possibly explain the link, the researchers, led by Emmanuel Stamatakis, an associate professor of exercise, health and physical activity at the University of Sydney in Australia, wrote in the study. Drinking alcohol, for example, has been shown to increase inflammation and decrease immune function, both of which have been linked to cancer, according to the study. Physical activity, on the other hand, has been shown to have the opposite effects — it decreases inflammation and increases immune function. In other words, the mechanisms by which alcohol may promote cancer are matched by mechanisms through which exercise may help prevent it, so the effects of exercise may cancel out those of alcohol, the researchers wrote. In the study, the researchers looked at the exercise and drinking habits of more than 36,000 men and women in England and Scotland.
The participants were divided into six groups based on their level of alcohol intake: those who never drank; former drinkers; occasional drinkers (meaning they hadn't had a drink in the past seven days); those who drank within the guidelines (no more than eight drinks/week for women or 12 drinks/week for men); "hazardous" drinkers (eight to 20 drinks/week for women, or 12 to 28 drinks/week for men); and "harmful" drinkers (more than 20 drinks/week for women and more than 28 drinks/week for men). The people in the study were also divided into groups based on the amount of physical activity they reported: an "inactive" group, which got less than the study's recommended 150 minutes of exercise each week; a group that got the recommended amount of physical activity; and a group that got double or more the recommended amount. The researchers found that for people in the inactive group, the more they drank, the more likely they were to die from any cause during the study period of about 10 years. However, when exercise was added to the mix, the researchers found that people's risk of dying decreased, though it was still linked to the amount a person drank. When the researchers looked at a person's risk of dying from cancer specifically, they found that getting the recommended amount of weekly exercise nearly canceled out this risk entirely. The exception in both cases was for those in the "harmful" drinking group. Among these heavier drinkers, exercise did not lower the risk of dying, the researchers found. In addition, exercise did not have an effect on an alcohol drinker's risk of dying from heart disease, according to the study.
Finally, the researchers found that there was a slightly beneficial effect to having an occasional drink: regardless of physical activity level, occasional drinkers were slightly less likely to die from any cause, or from heart disease in particular, compared with other groups of drinkers. An occasional drink didn't have any beneficial effect on reducing a person's risk of dying from cancer, however. The researchers noted that there were several limitations to the study. While they looked at the amount of alcohol the participants drank, they did not look at the pattern of drinking, so they may have missed binge drinking, they wrote. In addition, the researchers didn't consider other factors, such as diet, that may have an effect on a person's risk of dying. Copyright 2016 LiveScience, a Purch company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
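The intake bands described above translate directly into a lookup. The sketch below codes only the consumption-based bands (never, former and occasional drinkers were identified from survey answers rather than weekly counts, so they are omitted), with thresholds taken from the article.

```python
# Sex-specific weekly-intake bands as described in the study write-up above.
def drinking_category(drinks_per_week: float, sex: str) -> str:
    # women: <=8 within guidelines, 8-20 hazardous, >20 harmful
    # men:   <=12 within guidelines, 12-28 hazardous, >28 harmful
    guideline_cap, hazardous_cap = (8, 20) if sex == "female" else (12, 28)
    if drinks_per_week <= guideline_cap:
        return "within guidelines"
    if drinks_per_week <= hazardous_cap:
        return "hazardous"
    return "harmful"

print(drinking_category(10, "female"))   # hazardous
print(drinking_category(10, "male"))     # within guidelines
print(drinking_category(30, "male"))     # harmful
```

Note that the study's bands share their boundary values (eight drinks/week for women is both the guideline cap and the bottom of the hazardous band); the sketch resolves the tie in favour of the lower band, a choice the article does not specify.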


News Article | January 18, 2016
Site: phys.org

The findings address the longstanding debate among scientists about whether the bacterium Yersinia pestis, responsible for the Black Death, remained within Europe for hundreds of years and was the principal cause of some of the worst re-emergences and subsequent plague epidemics in human history. Until now, some researchers believed repeated outbreaks were the result of the bacterium being re-introduced through major trade with China, a widely known reservoir of the plague. Instead, it turns out the plague may never have left.

"The more plague genomes we have from these disparate time periods, the better we are able to reconstruct the evolutionary history of this pathogen," says evolutionary geneticist Hendrik Poinar, director of McMaster University's Ancient DNA Centre and a principal investigator at the Michael G. DeGroote Institute for Infectious Disease Research. Poinar collaborated with Edward Holmes at the University of Sydney, Olivier Dutour of the École Pratique des Hautes Études in France, Kirsti Bos and Johannes Krause at the University of Tübingen, and others, to map the complete genomes of Y. pestis harvested from five adult male victims of the 1722 Plague of Provence. To do so, they analyzed dental pulp taken from the five bodies, originally buried in Marseille, France.

Researchers were able to extract, purify and enrich specifically for the pathogen's DNA, and then compare the samples with over 150 plague genomes representing a worldwide distribution as well as other points in time, both modern and ancient. By comparing and contrasting the samples, researchers determined the Marseille strain is a direct descendant of the Black Death that devastated Europe nearly 400 years earlier, and not a divergent strain that came, like the previous pandemic strains of Justinian and the Black Death, from separate emergences originating in Asia.
More extensive sampling of modern rodent populations, in addition to ancient human and rodent remains from various regions in Asia, the Caucasus and Europe, may yield additional clues about past ecological niches for plague. "There are many unresolved questions that need to be answered: why did the plague erupt in these devastating waves and then lay dormant? Did it linger in the soil or did it re-emerge in rats? And ultimately why did it suddenly disappear and never come back? Sadly, we don't have the answer to this yet," says Poinar. "Understanding the evolution of the plague will be critically important as antibiotic resistance becomes a greater threat, particularly since we treat modern-day plague with standard antibiotics. Without methods of treatment, easily treatable infections can become devastating again," he says. The research was published online today on the preprint server bioRxiv and is under review at the journal eLife.


News Article | September 14, 2016
Site: www.rdmag.com

In research published today, Australian scientists have taken a critical step towards understanding why different types of galaxies exist throughout the Universe. The research, made possible by cutting-edge AAO instrumentation, means that astronomers can now classify galaxies according to their physical properties rather than human interpretation of a galaxy’s appearance. For the past 200 years, telescopes have been capable of observing galaxies beyond our own galaxy, the Milky Way. Only a few were visible to begin with but as telescopes became more powerful, more galaxies were discovered, making it crucial for astronomers to come up with a way to consistently group different types of galaxies together. In 1926, the famous American astronomer Edwin Hubble refined a system that classified galaxies into categories of spiral, elliptical, lenticular or irregular shape. This system, known as the Hubble sequence, is the most common way of classifying galaxies to this day. Despite its success, the criteria on which the Hubble scheme is based are subjective, and only indirectly related to the physical properties of galaxies. This has significantly hampered attempts to identify the evolutionary pathways followed by different types of galaxies as they slowly change over billions of years. Dr Luca Cortese, from The University of Western Australia node of the International Centre for Radio Astronomy Research (ICRAR), said the world’s premier astronomical facilities are now producing surveys consisting of hundreds of thousands of galaxies rather than the hundreds that Hubble and his contemporaries were working with. “We really need a way to classify galaxies consistently using instruments that measure physical properties rather than a time-consuming and subjective technique involving human interpretation,” he said.
In a study led by Dr Cortese, a team of astronomers has used a technique known as Integral Field Spectroscopy to quantify how gas and stars move within galaxies and reinterpret the Hubble sequence as a physically based two-dimensional classification system. “Thanks to the development of new technologies, we can map in great detail the distribution and velocity of different components of galaxies. Then, using this information we’re able to determine the overall angular momentum of a galaxy, which is the key physical quantity affecting how the galaxy will evolve over billions of years. “Remarkably, the galaxy types described by the Hubble scheme appear to be determined by two primary properties of galaxies: mass and angular momentum. This provides us with a physical interpretation for the well-known Hubble sequence whilst removing the subjectiveness and bias of a visual classification based on human perception rather than actual measurement.” The new study involved 488 galaxies observed by the 3.9m Anglo Australian Telescope in New South Wales and an instrument attached to the telescope called the Sydney-AAO Multi-object Integral-field spectrograph, or ‘SAMI’. The SAMI project, led by the University of Sydney and the ARC Centre of Excellence for All-sky Astrophysics (CAASTRO), aims to create one of the first large-scale resolved surveys of galaxies, measuring the velocity and distribution of gas and stars of different ages in thousands of systems. “Australia has a lot of expertise with this type of astronomy and is really at the forefront of what’s being done,” said Professor Warrick Couch, Director of the Australian Astronomical Observatory and CAASTRO Partner Investigator. “For the SAMI instrument we succeeded in putting 61 optical fibres within a distance that’s less than half the width of a human hair. 
“That’s no small feat, it’s making this type of work possible and attracting interest from astronomers and observatories from around the world.” Future upgrades of the instrument are planned that will allow astronomers to obtain even sharper maps of galaxies and further their understanding of the physical processes shaping the Hubble sequence. “As we get better at doing this and the instruments we’re using are upgraded, we should be able to look for the physical triggers that cause one type of galaxy to evolve into another—that’s really exciting stuff,” Dr Cortese said.
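The classification idea described above, sorting galaxies by measured quantities rather than by appearance, can be sketched in a few lines. The helper below is purely illustrative: the mass-weighted formula for specific angular momentum is standard, but the dividing line between disc-like and spheroid-like systems is an invented placeholder, not a value from the SAMI study.

```python
# Toy two-parameter galaxy classification in the mass / angular-momentum plane.
# The dividing line and all numbers are illustrative assumptions only.

def specific_angular_momentum(radii_kpc, velocities_kms, masses):
    """Mass-weighted specific angular momentum: j = sum(m*r*v) / sum(m)."""
    total = sum(m * r * v for m, r, v in zip(masses, radii_kpc, velocities_kms))
    return total / sum(masses)

def classify(log_mass, log_j):
    """Place a galaxy on one side of an (assumed) dividing line:
    high angular momentum at fixed mass -> disc-like; low -> spheroid-like."""
    return "disc-like" if log_j > 0.6 * log_mass - 3.5 else "spheroid-like"
```

In practice the velocity and mass maps would come from an integral field spectrograph such as SAMI; the point is only that, once those numbers exist, the grouping is reproducible and needs no human judgement.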


News Article | November 15, 2016
Site: www.eurekalert.org

Cytomegalovirus is a common herpesvirus that can cross the placenta, infect the fetus and cause damage to the developing brain. The retrospective observational study of 323 children with cerebral palsy reveals that 9.6 per cent had cytomegalovirus (CMV) DNA in blood taken from their newborn screening card. This proportion is much higher than the proportion of children with CMV detected in the newborn period in the general community, which is less than one per cent. Further, it is six times greater than the proportion of children with cerebral palsy who have had congenital CMV reported as an attributable cause of their condition to the Australian Cerebral Palsy Register (1.5 per cent), and higher than that found in a recent retrospective study of Caucasian children with cerebral palsy (1.5 per cent). Congenital CMV infection has been estimated to occur in approximately 0.7 per cent of newborn infants, of whom ten to 15 per cent exhibit signs of infection at birth. These infants carry a higher risk of permanent neurodevelopmental disabilities, including cerebral palsy. It's estimated that a further ten to 15 per cent of children with congenital CMV infection who are asymptomatic at birth will go on to develop neurologic signs and symptoms beyond the neonatal period, predominantly late-onset hearing loss. Cerebral palsy is the most common physical disability of childhood, and has been associated with a number of risk factors, including intrauterine infections such as congenital CMV. "Despite this known association, and estimates of neurologic disability from congenital CMV, few reports describe the prevalence and epidemiology of cerebral palsy associated with congenital CMV," said the study's senior author, Professor Cheryl Jones of the University of Sydney's Marie Bashir Institute for Infectious Diseases and Biosecurity. 
"Defining the role of congenital CMV as a risk factor for cerebral palsy is important because it is the most common intrauterine infection in developed countries, is potentially preventable, and antiviral therapy post-natally can reduce the severity of adverse neurologic outcomes." Study leader, Dr Hayley Smithers-Sheedy of the University of Sydney's Cerebral Palsy Alliance said: "This study serves as a timely reminder of the importance of CMV as a common intrauterine viral infection in developed countries and the potential for long-term consequences beyond the newborn period. "More research is needed to investigate the mechanisms and contribution of congenital CMV to the causal pathways to cerebral palsy."


News Article | November 24, 2016
Site: www.bbc.co.uk

An international research team led from Australia and China has discovered nearly 1,500 new viruses. The scientists looked for evidence of virus infection in a group of animals called invertebrates, which includes insects and spiders. Not only does the study expand the catalogue of known viruses, it also indicates they have existed for billions of years. The findings were published in the journal Nature. Few would argue that all living species on Earth are susceptible to viruses – these microscopic parasites are ubiquitous. But virologists have long suspected that our current view of the diversity of viruses is blinkered – all too often constrained to those causing disease in humans, animals and plants, or to those that we can grow in the laboratory. A trip to a tropical rainforest or the African savannah gives a snapshot into the incredible diversity of visible life on Earth, but understanding the potentially mind-boggling myriad of minuscule viruses has not been so easy. Capturing new viruses is not like netting a new species of butterfly – viruses are invisible. Undeterred by this practical problem, an international team was keen to survey invertebrates for new viral species. Invertebrates are spineless creatures and the group includes many familiar animals, such as insects, spiders, worms and snails. They represent the vast majority of animal species in the world today. Scientists wanting to work out the totality of viral "life" – although many virologists would argue that viruses are not truly alive – are starting to adopt techniques that reveal their genetic calling cards, revealed in the things they infect. Just as powerful new telescopes are peering deeper into space, revealing a wealth of hitherto unknown stars, next-generation sequencing techniques are providing new insight into the magnitude of the invisible world of viruses; a world we call the virosphere. We are familiar with DNA, the "stuff of life" that makes up the blueprint of our genomes. 
But many viruses use a different chemical to construct their genomes – a substance known as RNA. Just like DNA, this consists of strings of individual building blocks, or bases; each designated by a different letter: A, C, G and U. Next generation sequencing allows researchers to quickly determine the sequence of these letters. And if you work out the order of the letters on any chain of RNA, you can determine if it belongs to a virus and whether or not the virus is new. Its potential for virus discovery is huge. The research team collected around 220 species of land- and water-dwelling invertebrates living in China, extracted their RNA and, using next-generation sequencing, deciphered the sequence of a staggering 6 trillion letters present in the invertebrate RNA "libraries". When the researchers analysed this mass of data they realised that they had discovered almost 1,500 new virus species – a whopping number by any measure. Many of these were so distinct that they did not easily fit into our existing virus family tree. Prof Elodie Ghedin from New York University, who was not directly involved with the study, told the BBC: "This is an extraordinary study providing the largest virus discovery to date. "It will no doubt remodel our view of the virus world and redraw virus phylogeny. "This is what happens when you combine a bold and brute force approach with the right technology and the right set of eyes." Even though some invertebrates carry viruses that can infect humans - like zika and dengue - the study authors do not think that these newly discovered viruses pose a significant risk. However, this cannot be ruled out entirely, and Prof Ghedin thinks that this is an important issue to address. "If we have learned anything from these types of true discovery projects is that when we start looking into places we haven’t looked at before, we find an incredible richness that goes beyond what was suspected. 
"It also makes a strong case for expanding virus surveillance to invertebrates in our quest to better understand (and predict) emerging viruses," she said. The research also showed that throughout time viruses have been trading genetic material to create new species – an incredible feat according to Prof Eric Delwart from the University of California, San Francisco, who told the BBC: "It shows a lego-like ability of different viral functional units to be recombined to create new viruses even when they originate from highly divergent viruses. The plasticity of viral genomes continues to amaze." Not only have these studies expanded our view of the diversity of viruses, they have also provided a more complete picture of virus history, as Prof Edward Holmes from the University of Sydney, who was involved in the study explained: "We have discovered that most groups of viruses that infect vertebrates – including humans, such as those that cause well-known diseases like influenza – are in fact derived from those present in invertebrates." He also believes that his group's data shows that viruses have been infecting invertebrates for possibly billions of years, raising the prospect that invertebrates are the true hosts for many types of virus. The researchers hope that next-generation sequencing can pave the way for virus discovery in a variety of other species. And it does not stop there. Prof Delwart thinks that further analyses of existing next-generation datasets may yield additional virus species unlike any that we have seen before. If future studies reveal anywhere near this number of new viruses, then we’ve only just scratched the surface. It seems that the virosphere is set to explode. Jonathan Ball is a professor of virology at Nottingham University. This coming Saturday, he will be taking part in CrowdScience, the new BBC World Service science weekly, which starts with a question from listener Ian in Jordan which is "where did viruses come from?"
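The sequence-comparison idea described above (read the letters of an RNA chain, then ask whether they resemble anything known) can be illustrated with a toy k-mer match. Real virus-discovery pipelines assemble millions of reads and search at the protein level; the reference sequence and the scoring below are hypothetical simplifications.

```python
# Minimal sketch of RNA sequence matching: validate the RNA alphabet,
# then score how many of a read's k-mers appear in a reference sequence.
# Purely illustrative; not a real virus-discovery pipeline.

RNA_ALPHABET = set("ACGU")

def is_rna(read):
    """True if the read is non-empty and uses only RNA bases (A, C, G, U)."""
    return bool(read) and set(read) <= RNA_ALPHABET

def kmers(seq, k=4):
    """All length-k substrings of the sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def overlap_score(read, reference, k=4):
    """Fraction of the read's k-mers that also occur in the reference."""
    rk = kmers(read, k)
    return len(rk & kmers(reference, k)) / len(rk) if rk else 0.0
```

A high overlap score against a known viral genome would flag the read as viral; a sequence with no good match against anything known is how candidate new viruses first surface in this kind of survey.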


News Article | December 3, 2016
Site: news.yahoo.com

Many types of exercise are linked to a lower risk of premature death, but activities like racquet sports, swimming and aerobics seem best at improving people's chances of staving off an early demise, according to a new study. Researchers found that people in the study who regularly played racquet sports had a 47 percent lower risk of dying over the course of the nine-year study than people who did not regularly engage in such sports. And people who regularly went swimming had a 28 percent lower risk of an early death during the study than those who did not regularly swim, the researchers found. Moreover, people who regularly did aerobics had a 27 percent lower risk of dying during the course of the nine-year study than people who did not regularly do such activity, the researchers found. "These findings demonstrate that participation in specific sports may have significant benefits for public health," the researchers, at the UKK Institute in Finland and the University of Sydney in Australia, wrote in the study, published Tuesday (Nov. 29) in the British Journal of Sports Medicine. In the study, the researchers asked more than 80,000 people whether they had exercised in the past month and, if they had, what types of exercise they had done. The people were 52 years old, on average, at the start of the study; the researchers then followed the participants for nine years, on average. During the course of the study, 8,790 of the participants died. The researchers found that the people who reported swimming, doing aerobics or playing racquet sports in the past month at the start of the study were less likely to die during the study period than those who had not engaged in these activities in the past month at the start of the study. 
In addition, the researchers found that the people who reported cycling in the past month at the start of the study were 15 percent less likely to die during the study than those who did not report cycling in the past month at the start of the study. Those study participants who ran or jogged, and those who played football or rugby, did not have a lower risk of dying during the study period than those who did not engage in these sports, the researchers also found. However, the results don't prove that engaging in certain types of physical activities directly causes people to live longer, the researchers noted. Rather, the findings suggest that there is a link between these activities and a longer life, the study said. The research did not look at why certain sports may help people to live longer than others.


News Article | April 14, 2016
Site: www.techtimes.com

Bed bugs nowadays are harder to kill because they have developed thicker skin to repel common insecticides, a study reveals. Researchers from the University of Sydney said that the increasing incidence of bed bugs over the last two decades can be attributed to the thicker cuticle the bugs have developed over time. David Lilly, a PhD candidate at the University of Sydney, studied the skins of bed bugs and found that a thicker cuticle allows these bothersome bugs to survive constant exposure to over-the-counter insecticides. Bed bugs have an exoskeleton covering — the cuticle — like most insects. Lilly and his team used scanning electron microscopy to compare the cuticle thickness of insecticide-resistant bugs with that of bugs easily killed by the same insecticides. The examination of the cuticles showed that bugs with thicker cuticles are able to repel the insecticides. "If we understand the biological mechanisms bed bugs use to beat insecticides, we may be able to spot a chink in their armor that we can exploit with new strategies," said Lilly. The study was published in the journal PLOS ONE on April 13. Bed bugs (Cimex lectularius) are parasites that feed on blood and cause painful insect bites that often disrupt sleep. Once they infest, these blood suckers can be difficult to control, as they can creep into the corners of bed frames, the seams of mattresses, couches, wallpaper and even drawers. Although they are not known to cause any serious diseases, they can cause severe itching that may lead to insomnia. Their resurgence is hurting the tourism and hospitality industries so significantly that understanding how to control them effectively has become an economic issue. Resorting to common insecticides can be counter-productive, because bed bugs develop a thicker skin as a defense mechanism against them. Lilly's study could explain why these parasites are becoming resistant to over-the-counter insecticides. 
A previous study conducted by Virginia Tech and New Mexico State University researchers showed that these bed bugs can survive neonicotinoids, a commonly used class of insecticides. © 2016 Tech Times, All rights reserved. Do not reproduce without permission.


ANU Associate Professor Jodie Bradby said her team - including ANU PhD student Thomas Shiell and experts from RMIT, the University of Sydney and the United States - made nano-sized Lonsdaleite, which is a hexagonal diamond only found in nature at the site of meteorite impacts such as Canyon Diablo in the US. "This new diamond is not going to be on any engagement rings. You'll more likely find it on a mining site - but I still think that diamonds are a scientist's best friend. Any time you need a super-hard material to cut something, this new diamond has the potential to do it more easily and more quickly," said Dr Bradby from the ANU Research School of Physics and Engineering. Her research team made the Lonsdaleite in a diamond anvil at 400 degrees Celsius, halving the temperature at which it can be formed in a laboratory. "The hexagonal structure of this diamond's atoms makes it much harder than regular diamonds, which have a cubic structure. We've been able to make it at the nanoscale and this is exciting because often with these materials 'smaller is stronger'." Lonsdaleite is named after the famous British pioneering female crystallographer Dame Kathleen Lonsdale, who was the first woman elected as a Fellow to the Royal Society. The research is published in Scientific Reports. Co-researcher Professor Dougal McCulloch from RMIT said the collaboration of world-leading experts in the field was essential to the project's success. "The discovery of the nano-crystalline hexagonal diamond was only made possible by close collaborative ties between leading physicists from Australia and overseas, and the team utilised state-of-the-art instrumentation such as electron microscopes," he said. Corresponding author from the University of Sydney, Professor David McKenzie, said he was doing the night shift in the United States laboratory as part of the research when he noticed a little shoulder on the side of a peak. 
"And it didn't mean all that much until we examined it later on in Melbourne and in Canberra - and we realised that it was something very, very different." The diamond anvil the scientists used to make the nano-sized Lonsdaleite. Credit: Jamie Kidston, ANU More information: Thomas B. Shiell et al. Nanocrystalline hexagonal diamond formed from glassy carbon, Scientific Reports (2016). DOI: 10.1038/srep37232


Researchers at the Australian National University have developed a way to create a nano-crystalline hexagonal diamond that is harder than a jeweler's diamond. Lonsdaleite is a hexagonal diamond named after the famous crystallographer Dame Kathleen Lonsdale. In nature, this diamond is found only at meteorite impact sites such as Canyon Diablo in the United States. The discovery that lonsdaleite can be synthetically produced in a controlled laboratory setting represents a scientific breakthrough in the creation of hexagonal diamonds. The stronger diamond was made by a team led by associate professor Jodie Bradby and her colleagues from the ANU, the University of Sydney, RMIT University, and the United States. The team also included ANU doctoral student Thomas Shiell. In making lonsdaleite in the lab, the scientists used amorphous carbon as the base material. The research has been published in Scientific Reports. Bradby, who teaches at the Research School of Physics and Engineering at ANU, said the lonsdaleite was made at 400 degrees Celsius, half the temperature normally required to make diamonds in laboratory settings. Regular diamonds are cubic in structure, but the diamond that Bradby's team created in the lab was hexagonal. "The hexagonal structure of this diamond's atoms makes it much harder than regular diamonds which have a cubic structure. We've been able to make it at the nanoscale and this is exciting because often with these materials 'smaller is stronger'," Bradby said. The hexagonal structure drew the interest of the team thanks to a little bump on one side of the data graph. The deviation was surmised to be the result of the different structure of the material. Noting that the synthetically made diamond took only half the temperature of previous efforts, Bradby said the structure was examined in the United States to see what goes on in the carbon material during compression. 
The team examined the material with a machine that has a ring of X-rays spinning around at a speed "close to the speed of light," Bradby said. The team then fired a beam of X-rays through the glassy carbon material, which was under extreme pressure from two normal diamonds. The X-ray diffraction measurements allowed the researchers to study the structure of the material. However, Bradby remarked that hexagonal diamonds should not be expected on engagement rings any time soon. The extra-hard material will find greater use in the mining sector, where it can be used for cutting materials. "You'll more likely find it on a mining site - but I still think that diamonds are a scientist's best friend," she quipped. Meanwhile, co-researcher Dougal McCulloch from RMIT hailed the collaboration of global experts in making the project a success and recalled how the researchers were able to utilize advanced instrumentation like electron microscopes for their experiments. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.
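The X-ray diffraction step described above rests on Bragg's law, n·λ = 2·d·sin(θ), which converts the angles at which X-rays scatter into the spacings between atomic planes; a changed spacing is exactly the kind of signal that shows up as a "little bump" in the data. A minimal sketch, with illustrative numbers rather than values from the experiment:

```python
# Bragg's law: n * wavelength = 2 * d * sin(theta).
# Solving for d recovers the lattice plane spacing from a diffraction angle.
# Wavelengths and angles below are illustrative, not from the study.

import math

def lattice_spacing(wavelength_nm, theta_deg, order=1):
    """Plane spacing d (same units as wavelength) for a diffraction peak
    at angle theta (degrees), for the given diffraction order n."""
    return order * wavelength_nm / (2.0 * math.sin(math.radians(theta_deg)))
```

Because cubic and hexagonal diamond have different plane spacings, peaks shifting away from the cubic positions during compression is how a structural change like this can be identified.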


Lower stroke, mortality, renal failure, bleeding, atrial fibrillation, and length of intensive care unit stay with newer no-touch technique. A landmark study led by Prof. Michael P. Vallely, MBBS, PhD, FRACS, of Sydney Heart and Lung Surgeons and the University of Sydney will be published in the February 28, 2017 issue of the Journal of the American College of Cardiology. According to the study, which involved 37,720 patients, a newer "no-touch" beating heart bypass surgery technique (anOPCABG) reduced postoperative stroke by 78% compared to traditional coronary artery bypass grafting (CABG). In addition, compared to traditional CABG, the newer "no-touch" technique also reduced postoperative mortality by 50%, renal failure by 53%, bleeding complications by 48%, atrial fibrillation by 34%, and length of intensive care unit stay by 13.3 hours. The co-authors of this study included world-renowned cardiothoracic surgeons from Australia, the United States, Canada, and the United Kingdom. Coronary artery bypass grafting (CABG) is a surgical procedure for ischemic heart disease, which is the most common cause of death in Western countries. In this disease, the gradual build-up of fat and calcium within the arteries of the heart causes narrowing, which reduces blood flow to the heart’s muscle. When the narrowing becomes very severe, or the artery becomes completely blocked, it causes a heart attack. CABG involves bypassing these blockages using a graft. The graft goes around the blocked artery to create a new pathway for oxygen-rich blood to flow to the heart again. The aim of this is to relieve symptoms (including angina), help the patient resume a normal lifestyle, and to prevent the risk of heart attacks or other heart problems. However, traditional CABG involves stopping the heart during surgery and placing a clamp on the large vessel of the heart (aorta). 
Sewing the grafts to the heart is traditionally performed on a still, non-beating heart while the patient is on a heart-lung machine (“on-pump surgery”). In contrast, the newer no-touch, off-pump technique is performed on a beating heart without the heart-lung machine, using a small stabilizer. Since the heart-lung machine is not needed, the large vessels of the heart do not need to be manipulated (an “anaortic” or "no-touch" technique). This technique particularly benefits elderly and high-risk patients, and was shown in this new study to reduce the risk of stroke, death, and kidney injury following the operation. The no-touch beating heart technique performed "without aortic manipulation, whilst performed only by a minority of surgeons, has an important place in the higher risk patient undergoing CABG," commented Prof. Michael P. Vallely​, corresponding author of the study and cardiothoracic surgeon at Sydney Heart and Lung Surgeons, "this powerful analysis demonstrates the potential benefit, not only in the reduction of stroke, but also in mortality... [and] provides the most comprehensive and highest-quality evidence currently available [to] help inform decisions regarding the management of these patients." "Interestingly, the risk of stroke seemed to be directly related to the extent of aortic manipulation," said John G. Byrne, MD, of Hospital Corporation of America and Marzia Leacche, MD, of Spectrum Health in an editorial for the Journal of the American College of Cardiology, "a no-touch technique is probably a superior approach compared to conventional on-pump CABG with aortic clamping... in patients with increased cerebrovascular disease or atherosclerotic disease in the aorta." 
For the first time, an advanced Bayesian network model has been utilized to directly compare the clinical outcomes of all the major coronary artery bypass grafting techniques, including a totally anaortic or “no touch” off-pump technique, off-pump with a partial-clamp, off-pump with the clampless Heartstring device (St. Jude Medical, Saint Paul, Minnesota), and traditional on-pump with cross-clamp technique. This study will be published in the February 28, 2017 issue of the Journal of the American College of Cardiology. Sydney Heart and Lung Surgeons has decades of combined experience in all aspects of adult cardiothoracic surgery. We offer comprehensive patient care, including pre-surgery meetings with the surgeon and daily hospital visits by our surgical team post-operation. Our group treats both private and public patients across Sydney’s major hospitals, including Strathfield Private Hospital, Macquarie University Hospital, Concord Repatriation General Hospital, and the Southern Highlands Private Hospital (Bowral), and are based at Royal Prince Alfred Hospital.


News Article | November 22, 2016
Site: www.prweb.com

People who want to replace their missing teeth with All-on-4® dental implants in Garden Grove, CA, can now visit Dr. Jin Kim, who is accepting new patients to his practice with or without referrals for this procedure. The revolutionary All-on-4® technique, which attaches a customized dental prosthesis to as few as four dental implants, provides patients with a secure, functional set of teeth. In addition, this procedure lets Dr. Kim give patients a new smile in just one appointment. The unique design of All-on-4® implants offers several health benefits for patients with multiple missing teeth in need of full-arch tooth replacements. These implants hold the attached prosthesis in place so securely that patients can eat varied and healthy diets, speak naturally and enjoy greater self-esteem. The implants also help halt the jaw bone density loss that would otherwise occur after tooth loss. The flexible nature of this technique, which lets Dr. Kim place implants wherever a patient has the most bone density remaining, makes this procedure available to people who might not qualify for individual implants. This procedure also eliminates the waiting period associated with single implants, so Dr. Kim can give patients a functional denture the same day that they receive dental implants in Garden Grove, CA. Dr. Kim, who has trained extensively and is known as a world-renowned periodontist, takes several steps to ensure the best results for every patient who visits his office for All-on-4® implants. To plan the placement of the implants, he uses 3D digital x-rays, which produce more detailed images than conventional x-rays. This advanced imaging technique helps Dr. Kim choose an implant placement that avoids contact with other oral structures, such as nerves and sinus cavities. The 3D planning software at his office further enables Dr. Kim to create an accurate, efficient treatment plan that meets each patient’s long-term needs. 
To learn more about the All-on-4® tooth replacement technique and its benefits, patients should visit Dr. Kim’s website at http://www.drjinkim.com. New patients who are ready to schedule consultations about receiving dental implants in Garden Grove, CA, are invited to do so through the website or by directly calling his office at (714) 898-8757. Dr. Jin Y. Kim is a periodontist dedicated to providing personalized dental care in Diamond Bar and Garden Grove, CA. Dr. Kim attended the University of Sydney Faculty of Dentistry before furthering his education with an advanced degree in pathology from the Medical School of the same University. Dr. Kim completed a periodontics and implant surgery residency at UCLA School of Dentistry. A uniquely dual board-certified specialist, Dr. Kim was board-certified by the American Board of Periodontology and the American Board of Oral Implantology/Implant Dentistry. The International Congress of Oral Implantologists and the American Academy of Implant Dentistry both gave him the title of Fellow. He was also inducted to be a Fellow of the prestigious American College of Dentists. Dr. Kim enjoys lecturing at UCLA School of Dentistry as well as national and international academic and clinical associations and universities including the International Association of Dental Research, American Academy of Periodontology and Academy of Osseointegration. To learn more about Dr. Jin Kim and the services he offers, visit his website at http://www.drjinkim.com or call (909) 860-9222 for the Diamond Bar location or (714) 898-8757 for the West Garden Grove location to schedule an appointment.


News Article | November 16, 2016
Site: www.prweb.com

Dr. Jin Kim proudly announces that the FDA has provided clearance for True Regeneration™ utilizing the LANAP® protocol, and he now invites new patients who have gum disease to receive this treatment at his two convenient practice locations in Garden Grove and Diamond Bar, CA, even if they do not have a referral. FDA clearance for True Regeneration™ with the LANAP® procedure is a significant achievement in the periodontal field and emphasizes the benefits of the procedure. Dr. Kim is a world-renowned periodontist who lectures internationally on leading dentistry techniques. He is proud to provide gum disease patients with the option of receiving laser dentistry in Garden Grove, CA, and Diamond Bar, CA. He utilizes the revolutionary LANAP® protocol as a gentle alternative to other periodontal disease treatments that require diseased tissue to be cut out of the mouth with the use of scalpels and other periodontal tools. While the many benefits of laser treatments have long been known and explored, the LANAP® protocol has recently been shown to regenerate tissue. This revolutionary new FDA clearance for True Regeneration™ with the LANAP® technique illustrates the procedure’s ability to regenerate tissue that has been lost due to inflammatory periodontal disease. Although there is still much research left to be done before the exact mechanism behind the LANAP® protocol’s ability to regenerate tissue is fully understood, preliminary findings reveal that the protocol is able to activate stem cells. Once activated, the stem cells play an important role in regenerating tissue. Patients who have lost gum tissue due to periodontal disease are encouraged to learn more about how treatments with the LANAP® protocol may benefit them. Dr. Kim invites new patients with or without a referral to visit his practice for laser dentistry in Garden Grove, CA or Diamond Bar, CA. Those with gum disease who wish to receive a consultation are invited to schedule an appointment with Dr. 
Kim by calling (714) 898-8757 for the state-of-the-art Garden Grove location. To learn more about Dr. Jin Kim and the services he offers, visit his website at http://www.drjinkim.com or call (909) 860-9222 for the Diamond Bar location or (714) 898-8757 for the West Garden Grove location to schedule an appointment.


News Article | November 29, 2016
Site: www.eurekalert.org

An international research collaboration, led by the University of Sydney, has found that cycling, swimming, aerobics and racquet sports offer life-saving benefits not seen for running and football. Published today in the British Journal of Sports Medicine, the study also found that death from cardiovascular disease (CVD) was reduced in people who participated in swimming, racquet sports and aerobics. The study examined 80,000 adults over 30 years of age to investigate the link between death and participation in six different 'exercise disciplines': cycling, swimming, racquet sports, aerobics, football and running. The researchers drew on responses from 11 nationally representative annual health surveys for England and Scotland, carried out between 1994 and 2008. Compared with study participants who did not take part in the corresponding sport, the risk of death from any cause, and from cardiovascular disease in particular, was lower among participants in swimming, racquet sports and aerobics. "Our findings indicate that it's not only how much and how often, but also what type of exercise you do that seems to make the difference," said senior author Associate Professor Emmanuel Stamatakis from the Charles Perkins Centre, Faculty of Health Sciences and School of Public Health at the University of Sydney. "Participation in specific sports may have various benefits for health. These observations, together with the existing evidence, should support the sport community and other sectors in designing and implementing effective health-enhancing exercise programs and physical activity in general," he said. Future research should aim to further strengthen the sport-specific evidence base and understanding of how to enable greater sports participation for people from all age groups and walks of life.
This research was a large-scale collaboration between the University of Sydney, the University of Oxford, the UKK Institute (Finland), the University of Edinburgh, and four other international universities. The researchers drew on responses from eleven nationally representative baseline health examination surveys carried out in the United Kingdom between 1994 and 2008, which looked at the association between participation in six different sport/exercise disciplines and mortality. In all, the analysis included 80,306 adults with an average age of 52. In each of the surveys, participants were asked how much physical activity they had done in the preceding 4 weeks, and whether it had been enough to make them breathless and sweaty.


News Article | March 1, 2017
Site: www.eurekalert.org

Antibacterial compounds found in soil could spell the beginnings of a new treatment for tuberculosis, new research led by the University of Sydney has found. Believed by many to be a relic of past centuries, tuberculosis (TB) causes more deaths than any other infectious disease, including HIV/AIDS. In 2015 there were an estimated 10.4 million new cases of TB and 1.4 million deaths from the disease. The bacterium causing TB (Mycobacterium tuberculosis) is becoming increasingly resistant to current therapies, meaning there is an urgent need to develop new TB drugs. In 2015 an estimated 480,000 cases were unresponsive to the two major drugs used to treat TB, and it is estimated that more than 250,000 TB deaths were from drug-resistant infections. An international collaboration led by University of Sydney Professors Richard Payne, from the School of Chemistry, and Warwick Britton, from the Sydney Medical School and the Centenary Institute, has discovered a new compound which could translate into a new drug lead for TB. The findings were published in Nature Communications today. The group was drawn to soil bacteria compounds known to effectively prevent other bacteria growing around them. Using synthetic chemistry, the researchers were able to recreate these compounds with structural variations, turning them into more potent compounds called analogues. When tested in a containment laboratory, these analogues proved to be effective killers of Mycobacterium tuberculosis. "These analogues inhibit the action of a key protein needed to build a protective cell wall around the bacterium," said Professor Payne. "Without a cell wall, the bacterium dies. This wall-building protein is not targeted by currently available drugs. The analogues also effectively killed TB-causing bacteria inside macrophages, the cells in which the bacteria live in human lungs." Professor Payne said the findings are the starting point for a new TB drug. Planning for further testing and safety studies is underway.
The research was done in collaboration with Colorado State University in the USA, Simon Fraser University in Canada, the University of Warwick in the UK, Monash University and the University of Queensland. It was funded by Australia's National Health and Medical Research Council (NHMRC). Professors Payne and Britton also belong to the University's Marie Bashir Institute for Infectious Diseases and Biosecurity. Professor Payne won the Malcolm McIntosh Prize for Physical Scientist of the Year at the 2016 Prime Minister's Science Prizes.


News Article | March 30, 2016
Site: www.cemag.us

When Laboratory Worlds Collide By MaryBeth DiDonna After serving as the editor of Controlled Environments for several years, I was recently named editor of another ABM publication, Laboratory Design. Find out more about Laboratory Design's editorial opportunities as well as the upcoming 2016 Laboratory Design Conference, which will be held in Houston on April 25-27. Facility Profile: The Sydney Nanoscience Hub By Simon Ringer The new nanotechnology facility at the University of Sydney. Keeping the Future Cool By John Jackson How refrigeration technology has evolved to keep up with customer and regulatory demands. Tripwires: Key Issues to Consider When Designing Controlled Environments By Katherine M. Everett, PE, LEED AP A cure is always more expensive than preventive medicine. Managing Changing Cleanroom Operations By Bryan Sanderford Cleanrooms are designed to ensure the area meets the necessary requirements of the clean production process. But what happens when the product changes and a different manufacturing process must be put in place along with new cleanroom requirements? Monitoring Considerations for Pharmaceutical Cleanrooms By Howard Abramowitz Facilities grappling with USP 797 and USP 800 verify their compliance through cleanroom certification. 2020 Vision: Higher Expectations for Contract Manufacturers By Barbara Kanegsberg and Ed Kanegsberg Contamination Control in and Out of the Cleanroom looks to the future. How It Works: Critical Performance Seating Meets the Needs of Today's Cleanrooms By BioFit Engineered Products MVMT seating from BioFit Engineered Products is designed to address task-specific user range of motion in current critical performance applications. Cleanroom Tip: Ante Room Returns By Rick Meyer Tips for designing an ante room for a cleanroom facility.


Humphries R.M.,University of California at Los Angeles | Pollett S.,University of Sydney | Sakoulas G.,University of California at San Diego
Clinical Microbiology Reviews | Year: 2013

Daptomycin is a lipopeptide antimicrobial with in vitro bactericidal activity against Gram-positive bacteria that was first approved for clinical use in 2004 in the United States. Since then, significant data have emerged regarding the use of daptomycin for the treatment of serious infections, such as bacteremia and endocarditis, caused by Gram-positive pathogens. However, there are also increasing reports of daptomycin nonsusceptibility in Staphylococcus aureus and, in particular, Enterococcus faecium and Enterococcus faecalis. Such nonsusceptibility arises largely in the context of prolonged treatment courses and infections with high bacterial burdens, but it may occur in the absence of prior daptomycin exposure. Nonsusceptibility in both S. aureus and Enterococcus is mediated by adaptations to cell wall homeostasis and membrane phospholipid metabolism. This review summarizes the data on daptomycin, including its unique mode of action, its spectrum of activity, and the mechanisms of nonsusceptibility in key pathogens, including S. aureus, E. faecium, and E. faecalis. The challenges faced by the clinical laboratory in obtaining accurate susceptibility results and reporting daptomycin MICs are also discussed. © 2013, American Society for Microbiology. All Rights Reserved.


Moss D.J.,University of Sydney | Morandotti R.,INRS EMT | Gaeta A.L.,Cornell University | Lipson M.,Cornell University
Nature Photonics | Year: 2013

Nonlinear photonic chips can generate and process signals all-optically with far superior performance to that possible electronically, particularly with respect to speed. Although silicon-on-insulator has been the leading platform for nonlinear optics, its high two-photon absorption at telecommunication wavelengths poses a fundamental limitation. We review recent progress in non-silicon CMOS-compatible platforms for nonlinear optics, with a focus on Si3N4 and Hydex®. These material systems have opened up many new capabilities, such as on-chip optical frequency comb generation and ultrafast optical pulse generation and measurement. We highlight their potential future impact as well as the challenges to achieving practical solutions for many key applications. © 2013 Macmillan Publishers Limited.


Maron B.J.,Minneapolis Heart Institute Foundation | Maron M.S.,Hypertrophic Cardiomyopathy Center | Semsarian C.,University of Sydney
Heart Rhythm | Year: 2012

Risk stratification strategies employing sarcomere gene mutational analysis have proved imprecise in identifying high-risk patients with hypertrophic cardiomyopathy (HCM). Therefore, additional genetic risk markers that reliably determine which patients are predisposed to sudden death are needed. The objective of this study was to determine whether multiple disease-causing sarcomere mutations can be regarded as markers for sudden death in the absence of other conventional risk factors. Databases of 3 HCM centers were accessed, and 18 probands with 2 disease-causing mutations in genes encoding proteins of the cardiac sarcomere were identified. Severe disease progression or adverse cardiovascular events occurred in 7 of these 18 patients (39%), including 3 patients (ages 31, 37, and 57 years) who experienced sudden cardiac arrest but also were without evidence of conventional HCM risk factors; 2 survived with timely defibrillation and therapeutic hypothermia and 1 died. These 3 probands carried distinct and heterozygous disease-causing sarcomere mutations (including a man who inherited 1 mutation independently from each of his parents with HCM), that is, double MYBPC3 and TNNI3 mutations and compound MYBPC3 mutations, as the only predisposing clinical markers evident to potentially explain their unexpected cardiac event. These observations support the emerging hypothesis that double (or compound) mutations detected by genetic testing may confer a gene dosage effect in HCM, thereby predisposing patients to adverse disease progression. In 3 families, multiple sarcomere mutations were associated with a risk of sudden death, even in the absence of conventional risk factors.


Eggleton B.J.,University of Sydney | Poulton C.G.,University of Sydney | Poulton C.G.,University of Technology, Sydney | Pant R.,University of Sydney
Advances in Optics and Photonics | Year: 2013

We review recent progress in inducing and harnessing stimulated Brillouin scattering (SBS) in integrated photonic circuits. Exciting SBS in a chip-scale device is challenging due to the stringent requirements on materials and device geometry. We discuss these requirements, which include material parameters, such as optical refractive index and acoustic velocity, and device properties, such as acousto-optic confinement. Recent work on SBS in nano-photonic waveguides and micro-resonators is presented, with special attention paid to photonic integration of applications such as narrow-linewidth lasers, slow- and fast-light, microwave signal processing, Brillouin dynamic gratings, and nonreciprocal devices. © 2013 Optical Society of America.


Chen M.,University of Virginia | Menicucci N.C.,University of Sydney | Pfister O.,University of Virginia
Physical Review Letters | Year: 2014

We report the experimental realization and characterization of one 60-mode copy and of two 30-mode copies of a dual-rail quantum-wire cluster state in the quantum optical frequency comb of a bimodally pumped optical parametric oscillator. This is the largest entangled system ever created whose subsystems are all available simultaneously. The entanglement proceeds from the coherent concatenation of a multitude of Einstein, Podolsky, and Rosen pairs by a single beam splitter, a procedure which is also a building block for the realization of hypercubic-lattice cluster states for universal quantum computing. © 2014 American Physical Society.


Burton O.J.,University of Aberdeen | Phillips B.L.,University of Sydney | Travis J.M.J.,University of Aberdeen
Ecology Letters | Year: 2010

During range-advance, individuals on the expanding edge of the population face a unique selective environment. In this study, we use a three-trait trade-off model to explore the evolution of dispersal, reproduction and competitive ability during range expansion. We show that range expansion greatly affects the evolution of life-history traits due to differing selection pressures at the front of the range compared with those found in stationary and core populations. During range expansion, dispersal and reproduction are selected for on the expanding population front, whereas traits associated with fitness at equilibrium density (competitive ability) show dramatic declines. Additionally, we demonstrate that the presence of a competing species can considerably reduce the extent to which dispersal is selected upwards at an expanding front. These findings have important implications for understanding both the rate of spread of invasive species and the range-shifting dynamics of native species in response to climate change. © 2010 Blackwell Publishing Ltd/CNRS.


McIntyre P.B.,University of Sydney | O'Brien K.L.,International Vaccine Access Center | Greenwood B.,London School of Hygiene and Tropical Medicine | Van De Beek D.,University of Amsterdam
The Lancet | Year: 2012

Three bacteria (Haemophilus influenzae, Streptococcus pneumoniae, and Neisseria meningitidis) account for most acute bacterial meningitis. Measurement of the effect of protein-polysaccharide conjugate vaccines is most reliable for H influenzae meningitis because one serotype and one age group account for more than 90% of cases and the incidence has been best measured in high-income countries where these vaccines have been used longest. Pneumococcal and meningococcal meningitis are caused by diverse serotypes and have a wide age distribution; measurement of their incidence is complicated by epidemics and scarcity of surveillance, especially in low-income countries. Near elimination of H influenzae meningitis has been documented after vaccine introduction. Despite greater than 90% reductions in disease attributable to vaccine serotypes, all-age pneumococcal meningitis has decreased by around 25%, with little data from low-income settings. Near elimination of serogroup C meningococcal meningitis has been documented in several high-income countries, boding well for the effect of a new serogroup A meningococcal conjugate vaccine in the African meningitis belt.


Finniss D.G.,University of Sydney | Kaptchuk T.J.,Harvard University | Miller F.,U.S. National Institutes of Health | Benedetti F.,University of Turin
The Lancet | Year: 2010

For many years, placebos have been defined by their inert content and their use as controls in clinical trials and treatments in clinical practice. Recent research shows that placebo effects are genuine psychobiological events attributable to the overall therapeutic context, and that these effects can be robust in both laboratory and clinical settings. There is also evidence that placebo effects can exist in clinical practice, even if no placebo is given. Further promotion and integration of laboratory and clinical research will allow advances in the ethical use of placebo mechanisms that are inherent in routine clinical care, and encourage the use of treatments that stimulate placebo effects. © 2010 Elsevier Ltd. All rights reserved.


Patent
University of Sydney and Northern Sydney Local Health District | Date: 2013-11-01

The present invention relates to a method for detecting cardiovascular oxidative stress in an individual, comprising detecting in a blood sample from the individual modification of a cysteine at position 45 of the β1-subunit of the human erythrocyte ATP-dependent Na^(+)K^(+) pump protein or of an equivalent cysteine in a homologue or variant thereof. The invention further relates to a kit for detecting cardiovascular oxidative stress in an individual, the kit comprising at least one agent for detecting the presence of a modification in a cysteine at position 45 of the β1-subunit of the human erythrocyte ATP-dependent Na^(+)K^(+) pump protein or of an equivalent cysteine in a homologue or variant thereof, wherein said modification is a result of oxidation.


News Article | November 4, 2016
Site: www.newscientist.com

The numbers are in. We can now precisely count how many cancer-related DNA mutations accumulate in smokers’ organs over time. On average, there is one DNA mutation per lung cell for every 50 cigarettes smoked, according to a new analysis. People who smoke a pack of 20 a day for a year generate 150 mutations per lung cell, 97 per larynx cell, 39 per pharynx cell, 18 per bladder cell and six per liver cell. Epidemiological studies previously linked tobacco smoking with at least 17 classes of cancer, but this is the first time researchers have been able to quantify the molecular damage inflicted on DNA. Ludmil Alexandrov at Los Alamos National Laboratory in New Mexico and his colleagues achieved this by comparing tumour DNA from 2500 smokers and 1000 non-smokers. This allowed them to identify which mutations were associated with smoking. Theoretically, every DNA mutation has the potential to trigger a cascade of genetic damage that causes cells to become cancerous. However, we still don’t know what the probability is of a single smoking-related DNA mutation turning into cancer, or which mutation types are likely to be more malignant. “This is research we are currently pursuing,” Alexandrov says. Some smokers never develop cancer despite accruing thousands of mutations, but this is purely down to luck, Alexandrov says. “Smoking is like playing Russian roulette: the more you play, the higher the chance the mutations will hit the right genes and you will develop cancer,” he says. “However, there will always be people who smoke a lot but the mutations do not hit the right genes.” The team hopes their findings will deter people from taking up smoking and debunk the myth that social smoking is harmless. Every cigarette has the potential to cause genetic mutations, Alexandrov says. Quitting smoking will not reverse these mutations – they leave permanent scars on DNA – but it will prevent the added risk of more mutations, he says. 
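The per-organ figures above scale linearly with the number of cigarettes smoked, as a quick back-of-envelope calculation shows. The function and constants below are an illustrative sketch built only from the rates quoted in this article, not from the researchers' own code:

```python
# Illustrative arithmetic only: reproduces the per-organ mutation counts
# quoted above for one pack (20 cigarettes) a day smoked for one year.
# Per-pack-year rates are taken from the article; the function is hypothetical.

CIGARETTES_PER_PACK_YEAR = 20 * 365  # 7,300 cigarettes in one pack-year

# Mutations accumulated per cell per pack-year, as reported in the article.
MUTATIONS_PER_PACK_YEAR = {
    "lung": 150,
    "larynx": 97,
    "pharynx": 39,
    "bladder": 18,
    "liver": 6,
}

def mutations_per_cell(organ: str, cigarettes_smoked: int) -> float:
    """Scale the reported per-pack-year rate linearly to any cigarette count."""
    rate = MUTATIONS_PER_PACK_YEAR[organ] / CIGARETTES_PER_PACK_YEAR
    return rate * cigarettes_smoked

# The headline figure: roughly one lung-cell mutation per 50 cigarettes.
print(round(mutations_per_cell("lung", 50), 2))
```

Running the last line gives a value just above 1, matching the article's "one DNA mutation per lung cell for every 50 cigarettes smoked".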
There is good evidence that people who stop smoking have a significantly lower risk of premature death than those who continue, says Simon Chapman at the University of Sydney, Australia. For example, a UK study that followed 35,000 men for half a century found that smoking shaved 10 years off average life expectancy. But quitting at age 30 mostly erased the risk of premature death, and giving up at 50 halved it. “Many smokers believe there’s no point in quitting because the damage is already done,” says Chapman. “But if smokers quit by middle age, they can avoid nearly all the excess risk of tobacco-caused deaths.”


News Article | August 30, 2016
Site: www.sciencenews.org

A few Tasmanian devils have started a resistance movement against a contagious cancer that has depleted their numbers. Since devil facial tumor disease was first discovered in 1996, it has wiped out about 80 percent of the Tasmanian devil population. In some places, up to 95 percent of devils (Sarcophilus harrisii) have succumbed to facial tumors, spread when devils bite each other. Scientists had believed the tumor to be universally fatal. But a new study finds that a small number of devils carry genetic variants that help them survive the disease — at least long enough to reproduce, researchers report August 30 in Nature Communications. The finding could be important for the survival of the species. Previous studies have shown that the virulent tumor can hide itself from the devil’s immune system (SN: 4/20/13, p. 10). “What we reluctantly felt was that this was the end for the Tasmanian devil, because they really didn’t have a defense,” says Jim Kaufman, an evolutionary immunologist at the University of Cambridge not involved in the study. Indication that devils are evolving resistance to the deadly tumor “is really the most hopeful thing I’ve heard in a long time,” Kaufman says. As the cancer spread across Tasmania, Menna Jones and colleagues collected DNA from devils in three populations before and after the disease arrived. Jones, a conservation biologist at the University of Tasmania in Hobart, then teamed up with evolutionary geneticist Andrew Storfer from Washington State University in Pullman. His team examined the devil genetic instruction book, or genome, to see if differences between devils before and after the tumor’s arrival could explain why some survived while the rest succumbed. Scientists had thought that surviving devils just hadn’t caught the facial tumor yet because they were too young to breed and get bitten, Kaufman says. The new analysis indicates that devils surviving after the facial tumor has wreaked havoc have a genetic advantage. 
Storfer and colleagues found more than 90,000 DNA spots where a small number of devils have a different base (an information-carrying component of DNA) than most devils. The team looked for these genetic spelling differences — known as single nucleotide polymorphisms, or SNPs — that had been rare before the tumor swept through a population but then became common. Such a pattern could indicate that natural selection was working, picking out variants that helped devils beat the tumor. Two regions of the genome in all three of the devil populations contained SNPs that fit the profile. Because the two regions changed in all three populations, the change probably didn’t happen by chance, the researchers say. The resistance variants aren’t new mutations; there hasn’t been enough time for a helpful mutation to arise and spread across the island. Instead, the variants were probably already present in a small number of animals in the population and natural selection (via the tumor) weeded out the individuals that didn’t carry them. Those two genome regions contain a total of seven genes, some of which have been shown to be involved in fighting cancer or controlling the immune system in other mammals. The researchers aren’t sure which genes boost survival in the devils or how they work. The variants don’t necessarily make devils completely immune to the tumor; the results would look the same if the variants just allow infected individuals to live long enough to pass along their genes, says Storfer. More undiscovered genes may also contribute to the devils’ survival, he says. Researchers may be able to use genetics to better predict how the disease will spread in remaining uninfected devil populations. Breeding programs could incorporate animals that carry the survival variants to build the resistance movement. Complicating matters is a second devil facial tumor, discovered last year. 
Researchers don’t know whether the variants that allow devils to resist the first facial tumor disease will also work against the second. That’s why comparative genomicist Katherine Belov at the University of Sydney says conservationists shouldn’t breed the resistance genes into all Tasmanian devils. Devils need all the genetic diversity they can get to cope with other diseases and unknown challenges down the line. Restricting breeding to animals that have these variants could limit the devils’ ability to overcome future difficulties, she says.
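The before/after frequency screen described in this article can be caricatured in a few lines of code. The SNP names, frequencies, and thresholds below are invented for illustration and are not taken from the study:

```python
# Hypothetical sketch of the screen described above: flag SNPs whose
# allele frequency was rare before the tumour arrived in a population
# but common afterwards. All data and thresholds are illustrative.

def candidate_snps(freq_before, freq_after, rare=0.10, common=0.50):
    """Return ids of SNPs rare pre-disease but common post-disease."""
    return [
        snp for snp in freq_before
        if freq_before[snp] <= rare and freq_after.get(snp, 0.0) >= common
    ]

# Toy allele frequencies for four SNPs in one population.
before = {"snp1": 0.05, "snp2": 0.40, "snp3": 0.08, "snp4": 0.02}
after  = {"snp1": 0.60, "snp2": 0.45, "snp3": 0.12, "snp4": 0.55}

print(candidate_snps(before, after))  # only snp1 and snp4 pass both cutoffs
```

In the actual analysis the same shift had to appear in all three sampled populations, which corresponds to intersecting the candidate lists produced for each population.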


Principal research and neutron-scattering instrument scientist Vanessa Peterson has collaborated on a paper just published in Nature Chemistry that reports the discovery of a new torsion spring-like mechanism in a series of coordination frameworks, with implications for the strategic design of future materials with exceptional mechanical properties. The series of materials has the largest linear compressibilities of any crystalline compound, and the volume compressibilities are exceeded only by caesium, rubidium, and xenon. These compressibilities are both exceptionally large and sustained over a broad pressure range spanning at least 1 GPa. The work is the product of a long-standing collaboration with Cameron Kepert of the Molecular Materials Group at the University of Sydney; first author Samuel Duyker was co-supervised by Kepert and Peterson during his PhD and performed this work after taking up a postdoctoral position with Peterson. "Understanding these multifunctional materials with the aim of controlling their range of useful properties is of great interest to the energy industry, but the research brings fundamental knowledge of the atomic and molecular mechanisms, which has broader application," said Peterson. In contrast to the shortening of bonds that occurs in conventional materials, compression in the new series of materials is achieved through structural deformation. The unusual new mechanism was found in a series of 'coordination frameworks', named for the coordinative bonds linking organic molecules and metal atoms. The torsional mechanism is unlike anything seen in any other material, being fundamentally distinct from other compressible frameworks. The team first became interested in the lanthanoid frameworks because of their unusual coordination geometries. "The material is a porous framework, constructed from two types of metals that are connected by cyanide bridging ligands. 
The coordination environment of the lanthanoid is inherently very unstable, but exists because it is stabilised by the overall framework connectivity." "Near 1 GPa, a change in lattice geometry occurs in the lanthanoid framework through a complex mechanism involving dramatic structural distortion," said Peterson. The LnN6 units acting like torsion springs are synchronised by rigid Fe(CN)6 units acting like gears. The LnN6 unit twists away from its original trigonal prismatic geometry, becoming octahedral. These LnN6 units act as torsional centres that coil dramatically under pressure and, for the first time, enable extreme compressibility in combination with chemical and thermal stability. "There is competition between two relatively strong effects. A balance is achieved between the significant stored energy in the locally unstable lanthanoid coordination and the opposing force of the hexacyanidoferrate units, which strongly prefer not to change." "These lanthanoid frameworks compress by about 20% in volume at the relatively low pressure of 1 GPa, one of the largest known pressure responses for any crystalline material," said Peterson. This represents a new paradigm in the design of materials with anomalous mechanical behaviours. "There are a range of properties that can be tailored through choice of construction units. Selecting from a range of metal nodes and ligands gives control over the material's porosity; its topology, that is, the way that the pores are connected; its chemical functionality; and the flexibility of these units." "Because the contribution of each part or unit of the framework to these material properties is known, the properties can be controlled and tailored to particular functions. With this knowledge, you can gain insight into how to structurally engineer the properties you want at the molecular level," said Peterson. 
Of great importance in understanding the details of the compression mechanism in this material series was the validation of computational models using neutron scattering. The combination of the two in this approach is well-known and very powerful. The neutron powder diffraction portion of the research was done on ANSTO's high intensity powder diffractometer, WOMBAT, for which Peterson is an instrument scientist. "The unique properties of neutrons allowed the neutron scattering experiment to be performed on a bulk sample, which was very useful, and provided great contrast in the scattering power of the elements," said Peterson. In earlier work, the team had found that the coefficient of thermal expansion correlated linearly with the ionic radius of the lanthanoid in this family of materials. As part of the study, they substituted a range of elements, including yttrium, holmium, and lutetium, into the coordination site, changing the coordinative bond strength. "We measured the overall property of thermal contraction or expansion using neutron scattering and computational methods to understand the mechanism. We found that there are low energy vibrational modes that give rise to this negative thermal expansion property. One of those modes was extremely unusual," said Peterson. "In theory you can tune the thermal expansion of the material by constructing it with an atom that has an ionic radius that relates to the coefficient of thermal expansion you desire," explained Peterson. The results of this work were published in Angewandte Chemie in 2013. "We then looked at the possibility of locking in some of these interesting and unusual coordination geometries using chemical substitutions, and we found that by incorporating potassium into the structure, this could indeed be achieved." "The next logical thing to do was to see if we could induce this same coordination geometry change using pressure," said Peterson. 
When they squashed it, the material underwent the coordination geometry change they were looking for. They compared the compressibility of the material with a range of others, and found that it was extremely compressible, approaching the compressibility of solid polystyrene. They both measured and then calculated the volume change using density functional theory (DFT) for a range of chemically-substituted materials, deriving the compressibility from the change in volume as a function of pressure. "The compressibility changes as different metals are substituted into the unstable coordination geometry. This points towards tunable compressibility, which is very desirable." They calculated the framework structure at different pressures to show exactly how the distortion happens, and then calculated the contribution of each particular part of the framework in terms of energy: the energy cost of that deformation and its overall contribution to the compressibility. "Basically it's a low energy distortion that relieves the framework strain. We showed that it can be accessed by thermal energy initially in the temperature study." A picture emerges of a torsion spring-like mechanism. Six-coordinate lanthanoid units act like a spring: they flex and distort as the Fe(CN)6 units act as gears. "The energy cost of deforming the Fe(CN)6 unit in the framework is very high, so instead it is the lanthanoid unit that deforms," said Peterson. In addition, the lanthanoid framework has sustained axial compression, which is the largest for any crystalline solid in the particularly useful region of 0.4 to 1.5 GPa. "It occurs via an interesting mechanism," said Peterson. "There is a positive linear compressibility that goes through a small region of negative linear compressibility, where, as it is squashed, it gets smaller really rapidly but then bigger in that direction because of a cam-like action." 
"We examined quantitatively the energy cost of all of these components to learn exactly what they contribute to the overall volume response of the material to pressure. Once we understand the contribution that each part of the framework makes to the overall properties, it can then be tuned." Generally speaking, things will try to relieve strain. "Under pressure, the softest part, with the lowest energy cost to relieve that pressure, will respond." The team suspects that the luminescent properties of these materials are also likely to change upon compression, because the bonding changes so dramatically. "It's very satisfying to find a material with such interesting, useful properties, and then to establish exactly how it is they are achieved," said Peterson. More information: Samuel G. Duyker et al. Extreme compressibility in LnFe(CN)6 coordination framework materials via molecular gears and torsion springs, Nature Chemistry (2016). DOI: 10.1038/nchem.2431


The International Nurses Association is pleased to welcome Alayne J. Reid, BN, MHSc, MN, to their prestigious organization with her upcoming publication in the Worldwide Leaders in Healthcare. Alayne J. Reid is a Registered Nurse with over 40 years of experience in her field and extensive expertise in all facets of nursing, especially cancer nursing and nurse education. Alayne is currently serving as a Clinical Facilitator at Griffith University in Queensland, Australia, and is also affiliated with Logan Hospital. Alayne earned her initial Nursing Certification in 1978 from Princess Alexandra Hospital. She then attended the University of New England in New South Wales, where she graduated with her Bachelor's Degree in Nursing in 1993, followed by her Graduate Certificate in Nursing Education in 1995. An advocate for continuing education, Alayne gained her Master's Degree in Nursing in 2000 from Queensland University of Technology, before completing her Master's Degree in Health Science in 2002 from the University of Sydney. Furthermore, Alayne is Chemotherapy/Biotherapy Certified, and maintains professional memberships with the Royal College of Nursing and the Australian Nurse Teachers Society. Throughout her successful career, Alayne has worked in various areas of nursing, having been a cancer nurse for 28 years. In her current position at Griffith University, Alayne lectures, tutors, and facilitates students during their practicums. She attributes her success to her desire to help and care for people, as well as being honest and nurturing. When she is not working, Alayne enjoys hiking with her husband, exploring Australia's National Parks, traveling, reading, and cross stitching. Learn more about Alayne J. Reid here: http://inanurse.org/network/index.php?do=/4130243/info/ and be sure to read her upcoming publication in the Worldwide Leaders in Healthcare.


The meta-genomics research, a collaboration between the University of Sydney and the Chinese Centre for Disease Control and Prevention in Beijing, was made possible by new technology that also provides a powerful new way to determine what pathogens cause human diseases. Professor Edward Holmes, from the Marie Bashir Institute for Infectious Diseases & Biosecurity and the School of Life and Environmental Sciences, who led the Sydney component of the project, said although the research revealed humans are surrounded by viruses in our daily lives, these did not transfer easily to humans. "This groundbreaking study re-writes the virology text book by showing that invertebrates carry an extraordinary number of viruses - far more than we ever thought," Professor Holmes said. "We have discovered that most groups of viruses that infect vertebrates - including humans, such as those that cause well-known diseases like influenza - are in fact derived from those present in invertebrates," said Professor Holmes, who is also based at the University's multidisciplinary Charles Perkins Centre. The study suggests these viruses have been associated with invertebrates for potentially billions of years, rather than millions of years as had been believed - and that invertebrates are the true hosts for many types of virus. The paper, "Redefining the invertebrate RNA virosphere," is published tonight in Nature. "Viruses are the most common source of DNA and RNA on earth. It is all literally right under our feet," Professor Holmes said. The findings suggest viruses from ribonucleic acid, known as RNA - whose principal role is generally to carry instructions from DNA - are likely to exist in every species of cellular life. "It's remarkable that invertebrates like insects carry so very many viruses - no one had thought to look before because most of them had not been associated with human-borne illnesses." 
Although insects such as mosquitoes are well known for their potential to transmit viruses like Zika and dengue, Professor Holmes stressed that insects should not generally be feared because most viruses were not transferable to humans and invertebrates played an important role in the ecosystem. Importantly, the same techniques used to discover these invertebrate viruses could also be used to determine the cause of novel human diseases, such as the controversial 'Lyme-like disease' that is claimed to occur following tick bites. "Our study utilised new techniques in meta-genomics, which we are also using to provide insights into the causes of human-borne diseases," said Professor Holmes, who is also a National Health and Medical Research Council Australia Fellow. "The new, expensive technologies available to researchers which have allowed us to do this landmark project, provide the ultimate diagnostic tool." Professor Holmes and his collaborators are conducting human studies using these new techniques to analyse Lyme-like disease and other clinical syndromes. More information: Mang Shi et al, Redefining the invertebrate RNA virosphere, Nature (2016). DOI: 10.1038/nature20167


News Article | December 8, 2016
Site: www.bbc.co.uk

The idea that smallpox is a very ancient human disease has been called into question. Scientists say the deadly pathogen appears to have been around for hundreds rather than thousands of years. Viral DNA from the mummified remains of a child living during the 17th Century - at the time of an epidemic - casts doubt on historical records. Those past descriptions, however, were based on physical signs, such as a pustular rash, which can be confused with other diseases. "We managed to sequence the complete genome of the virus that causes smallpox, that's called variola virus," Dr Edward Holmes of the University of Sydney told BBC News. "It's the oldest human virus ever sequenced." The researchers obtained permission to study samples of the pathogen from a child interred in the crypt of a church in Vilnius, Lithuania. Radiocarbon dating shows the child lived around 1650 AD, at a time when smallpox was common in Europe. "This mummy allows us to calibrate very nicely the clock of evolution - it's a fossil, effectively," said Dr Holmes. "This fossil tells us that in fact the evolutionary history is much more recent than we thought before - it's actually only hundreds of years rather than thousands of years." However, it is not possible to determine where smallpox came from, what the ancestor of the virus was, and exactly when it first appeared in humans, he added. The child lived at a time when smallpox was spreading around the world, driven by global exploration and colonisation. This was before the development of vaccination, which began after the famous experiments of Edward Jenner in 1796. "What we can show is that in fact most of the evolution of smallpox that we can measure occurred after 1796," said Dr Holmes. "It looks like it is a more recent evolution than we ever thought before." The disease was officially eradicated in 1980, following a global immunisation campaign. Smallpox remains the only human disease eradicated by vaccination. 
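The "calibration" Dr Holmes describes can be illustrated with a toy root-to-tip regression. The numbers below are invented, not from the study: given genomes sampled at known dates, the slope of genetic distance against sampling year estimates the substitution rate, and the year at which the fitted line reaches zero distance estimates the age of the common ancestor.

```python
# Toy tip-dating sketch: hypothetical (sampling year, root-to-tip distance)
# pairs, loosely mimicking a 1650 mummy genome plus two 20th-century strains.
samples = [(1650, 1.0e-4), (1900, 3.5e-4), (1975, 4.2e-4)]

def root_to_tip_regression(samples):
    xs = [year for year, _ in samples]
    ys = [dist for _, dist in samples]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    # Least-squares slope: substitutions per site per year.
    slope = sum((x - mx) * (y - my) for x, y in samples) / sum(
        (x - mx) ** 2 for x in xs
    )
    root_year = mx - my / slope  # year where fitted distance hits zero
    return slope, root_year

rate, root_year = root_to_tip_regression(samples)
```

With these invented distances the common ancestor lands in the mid-1500s, qualitatively matching the study's conclusion of a recent origin; real analyses use Bayesian molecular-clock models over many dated genomes.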
Prof Jonathan Ball of Nottingham University, who was not connected with the study, said it shows "pretty conclusively" that smallpox viruses present in human outbreaks for which we have samples share a common ancestor that probably dates back to the late 16th to mid-17th centuries. However, he said, the question remains as to whether outbreaks occurred before that date, caused by strains that were never seen again. "Only access to, and analysis of, even older samples will answer that; but these are difficult to find and difficult to work with, so perhaps we will never know." The research is published in the journal Current Biology.


News Article | August 26, 2016
Site: news.yahoo.com

Tectonic plates around the Philippine Sea have jostled and twisted for millions of years. A previously unknown tectonic plate — one that has been swallowed up by the Earth — has been discovered in the Philippine Sea, according to a recent study. Using images constructed from earthquake data, geoscientists have developed a method for resurrecting a "slab graveyard" of tectonic plate segments buried deep within the Earth, unfolding the deformed rock into what it may have looked like up to 52 million years ago. This helped the researchers identify the previously unknown East Asian Sea Plate, where an ancient sea once existed in the region shortly after dinosaurs went extinct. The Philippine Sea lies at the juncture of several major tectonic plates. The Pacific, Indo-Australian and Eurasian plates frame several smaller plates, including the Philippine Sea Plate, which researchers say has been migrating northwest since its formation roughly 55 million years ago. [Photo Timeline: How the Earth Formed] In the process, the Philippine Sea Plate collided with the northern edge of the East Asian Sea Plate, driving it into the Earth's mantle. The southern area of the East Asian Sea Plate was eventually subducted by, or forced beneath, other neighboring plates, the researchers said. Geologists attempting to reconstruct the past were once limited to visible evidence of slow-moving changes, such as mountains, volcanoes or the echoes of ancient waterways. But with new imaging technologies, scientists can now glean information from hundreds of miles within the Earth's interior to map distant history. The slabs were previously identified with an imaging technique called seismic tomography, which uses earthquake waves and multiple monitoring stations to determine the speed at which different waves travel through the Earth. 
Those waves generally travel more quickly through old chunks of tectonic plates that "sink through the mantle, like a leaf through water," said study lead author Jonny Wu, a geologist formerly at National Taiwan University and now at the University of Houston. Wu and his colleagues at National Taiwan University focused on an area around the Philippine Sea, in part because of good data from the many seismic monitoring stations in this earthquake-heavy region. "East Asia has been a place where plates have been coming together, converging and disappearing from the Earth's surface in a process called subduction," Wu told Live Science. "Because the information you're looking for to piece together the history of the area is actually disappearing from the Earth's surface, it's made it very difficult." [In Photos: Ocean Hidden Beneath Earth's Surface] The East Asian Sea Plate was pieced together by a process of elimination when all but three of the 28 subducted slabs in the model had been traced back to connections with other modern plates. The region is also home to many relatively small tectonic plates, known as microplates, where movement is hard to reconstruct. "Those plates have long been tectonic mysteries, because it's really difficult to work out where they've been in the past," Wu said. "Just like if it's a puzzle, small fragments can fit in all these ways." The findings could provide researchers with a clearer picture of the history of the Philippine Sea and its surrounding regions. "The work [is] a groundbreaking advance in our understanding of the deep Earth structure in the most complex parts of the Eastern Hemisphere," Sabin Zahirovic, a geologist at the University of Sydney who was not involved in the study, told Live Science in an email. 
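The tomography idea described above rests on simple travel-time arithmetic: a wave that crosses a cold, seismically fast slab arrives slightly early, and it is these anomalies, accumulated over many earthquake-station pairs, that get inverted into images. A toy sketch with invented path lengths and velocities (not values from the study):

```python
# Toy travel-time anomaly: a 1000 km path through mantle at 8.0 km/s,
# versus the same path where 200 km of it crosses a faster slab (8.4 km/s).
# All numbers are illustrative, not from the study.
def travel_time(path_km, v_background, slab_km=0.0, v_slab=None):
    if v_slab is None:
        return path_km / v_background
    return (path_km - slab_km) / v_background + slab_km / v_slab

t_normal = travel_time(1000.0, 8.0)
t_slab = travel_time(1000.0, 8.0, slab_km=200.0, v_slab=8.4)
early = t_normal - t_slab  # seconds the wave arrives early through the slab
```

Even a modest 5 per cent velocity contrast over 200 km produces an arrival about a second early, which is well within what dense seismic networks can resolve.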
The new study is also a step toward a much-needed technical method of interpreting models built from earthquake data, said Hans-Peter Bunge, Chair of Geophysics at Ludwig Maximilians University in Munich, who was not involved with the new research. "Normally we would not have full access to the complexity of the interior structure," Bunge told Live Science. But this "important" new technique fills in the information missing from the seismic tomography images with carefully constrained guesses at what the material might be, and how the plates have moved, he added. And the researchers aren't stopping there. "As we keep working in other areas with a lot of unknowns — for example, South America or the Himalayas — we'll continue to test these methods and refine them, and hopefully contribute new ideas to Earth science," Wu said. The research was published online June 25 in the Journal of Geophysical Research: Solid Earth. Copyright 2016 LiveScience, a Purch company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.


News Article | December 8, 2016
Site: www.chromatographytechniques.com

Smallpox plagued humanity for centuries, claiming millions of lives, before vaccination eradicated it in 1980. Some theories place its scourge as far back as ancient Egypt and the peak of Rome. But DNA evidence in a 17th century mummy indicates smallpox may be a more modern disease than originally believed, according to a new study in the journal Current Biology. “Scientists don’t yet fully comprehend where smallpox came from and when it jumped into humans,” said Hendrik Poinar, senior author, from both McMaster University and the Michael G. DeGroote Institute of Infectious Disease Research. The smallpox DNA was collected from a child in Lithuania who had died in the middle of the 17th century – a time beset by smallpox outbreaks and rampant death. The DNA was heavily fragmented, but the scientists sequenced it. (Since none of the virus was live, it was not dangerous to handle.) Many theories held that smallpox outbreaks had been the cause of widespread death as far back as ancient Egypt and the peak of the Roman Empire, all the way through to the 20th century. But the Lithuanian sample shows such similarity to the samples collected right before the 1980 eradication that the evolutionary models showed it probably jumped to humans in just the last few centuries. “This study sets the clock of smallpox evolution to a much more recent time scale,” said Eddie Holmes, another author, from the University of Sydney. “Although it is still unclear what animal is the true reservoir of smallpox virus and when the virus first jumped into humans.” They also found that Variola virus evolved into two strains after the development of the first vaccines using dead forms of the virus at the end of the 18th century. The study raises further questions, especially whether the disease outbreaks among the people of Central and South America during Spanish colonization were indeed smallpox.


News Article | October 26, 2016
Site: www.newscientist.com

No time to muck around. Extreme evolutionary pressures seem to have caused Tasmanian devils to develop resistance to a deadly cancer in just a few generations. Devil facial tumour disease is a transmissible cancer that was first observed in Tasmanian devils in 1996. They usually contract the disease by biting a tumour on an infected animal. Initially, the fatality rate was reportedly almost 100 per cent. This high mortality rate has seen the total devil population decline by 80 per cent – and locally the figure can touch 95 per cent. This led to fears of rapid extinction, but some devil populations seem to be doing better than disease models would predict. To understand why, Menna Jones at the University of Tasmania, Australia, and her colleagues recently analysed the genomes of almost 300 devils from three separate regions in Tasmania. The researchers compared genetic samples taken from the three devil populations before and after the cancer arrived in each area. Populations affected by the disease differed from pre-disease ones in two regions of the genome – both with known links to cancer and immunity. This hints that genetic resistance to the cancer has spread through the devil populations, which might explain why numbers are defying expectations in disease-struck areas. “These gene variants would have been around before, but there was no evolutionary advantage to them being at high frequency,” says Katherine Belov at the University of Sydney. “Since the arrival of this new disease, the animals without these variants would have been dying, leading to an increase in the frequency of these protective variants.” Given the prevalence of the genetic changes in devil populations today – and what is known about their reproductive behaviour – the study authors estimate that resistance spread through the population over just four to six generations. “It’s as if extreme mortality has led to extreme evolutionary selection pressure,” says Jones. 
“It has happened a lot faster than we expected.” The findings feed into growing evidence that species can evolve in fewer than 10 generations if confronted with a major threat, she says. “Evolution isn’t a glacially slow process as once thought; it’s a dynamic, ‘happening-all-the-time’ thing.” Research shows that the infectious cancer is also evolving, but it is unlikely to become more lethal. Infectious diseases usually become less virulent over time, which helps avoid wiping out the hosts that support them, says Jones. For instance, the myxoma virus, introduced to control rabbit populations, lost its potency after a few years. One solution to further bolster Tasmanian devil numbers may be to introduce wild animals with the favourable gene variants into disease-free “insurance populations” maintained in captivity outside Tasmania, says Jones. This addition of genetic diversity should help the animals to adapt to future disease threats, she says. But even without any intervention, wild populations of devils should start to bounce back now that resistance is beginning to develop, says Belov. “It’s so exciting: 10 years ago everything was bleak, but now we’re past that,” she says.
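The pace of the sweep the researchers describe is consistent with textbook single-locus selection arithmetic. As a minimal sketch, assuming purely for illustration a single dominant protective allele and a 90 per cent fitness cost to non-carriers (the real devil genetics involves several genomic regions), the deterministic recurrence p' = p / (1 - s*q^2) drives a rare allele to high frequency within a handful of generations:

```python
def next_freq(p, s):
    # One-locus selection model: protective allele A is dominant, with
    # genotype fitnesses w_AA = w_Aa = 1 and w_aa = 1 - s; q = 1 - p.
    q = 1.0 - p
    return p / (1.0 - s * q * q)

p, s = 0.05, 0.9  # illustrative: rare allele, severe selection
trajectory = [p]
for _ in range(5):
    p = next_freq(p, s)
    trajectory.append(p)
```

Starting from a 5 per cent allele frequency, the frequency passes 50 per cent by the second generation and approaches 80 per cent by the fifth, broadly matching the four-to-six-generation sweep inferred for the devils.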


News Article | November 23, 2016
Site: www.chromatographytechniques.com

A groundbreaking study of the virosphere of the most populous animals - those without backbones, such as insects, spiders and worms, which live around our houses - has uncovered 1445 viruses, revealing people have only scratched the surface of the world of viruses - but it is likely that only a few cause disease. The meta-genomics research, a collaboration between the University of Sydney and the Chinese Centre for Disease Control and Prevention in Beijing, was made possible by new technology that also provides a powerful new way to determine what pathogens cause human diseases. Edward Holmes, from the Marie Bashir Institute for Infectious Diseases & Biosecurity and the School of Life and Environmental Sciences who led the Sydney component of the project, said although the research revealed humans are surrounded by viruses in our daily lives, these did not transfer easily to humans. "This groundbreaking study re-writes the virology text book by showing that invertebrates carry an extraordinary number of viruses - far more than we ever thought," said Holmes. "We have discovered that most groups of viruses that infect vertebrates - including humans, such as those that cause well-known diseases like influenza - are in fact derived from those present in invertebrates." The study suggests these viruses have been associated with invertebrates for potentially billions of years, rather than millions of years as had been believed - and that invertebrates are the true hosts for many types of virus. The paper, "Redefining the invertebrate RNA virosphere," is published in Nature. "Viruses are the most common source of DNA and RNA on earth. It is all literally right under our feet," Holmes said. The findings suggest viruses from ribonucleic acid, known as RNA - whose principal role is generally to carry instructions from DNA - are likely to exist in every species of cellular life. 
"It's remarkable that invertebrates like insects carry so very many viruses - no one had thought to look before because most of them had not been associated with human-borne illnesses." Although insects such as mosquitoes are well known for their potential to transmit viruses like Zika and dengue, Holmes stressed that insects should not generally be feared because most viruses were not transferable to humans and invertebrates played an important role in the ecosystem. Importantly, the same techniques used to discover these invertebrate viruses could also be used to determine the cause of novel human diseases, such as the controversial 'Lyme-like disease' that is claimed to occur following tick bites. "Our study utilized new techniques in meta-genomics, which we are also using to provide insights into the causes of human-borne diseases," said Holmes, who is also a National Health and Medical Research Council Australia Fellow. "The new, expensive technologies available to researchers which have allowed us to do this landmark project, provide the ultimate diagnostic tool." Holmes and his collaborators are conducting human studies using these new techniques to analyse Lyme-like disease and other clinical syndromes.


News Article | February 21, 2017
Site: www.eurekalert.org

The review, which adds 27 studies to update a previous Cochrane review, confirms earlier analyses by "providing definitive evidence that pharmaceutical industry funding of drug studies biases the results and conclusions to look favourable towards the drug of the sponsor," said senior author Professor Lisa Bero of the University of Sydney's Charles Perkins Centre. The authors note there are several potential ways that industry sponsors can influence the outcome of a study, including the framing of questions, the design of a study, the conduct of a study, how data are analysed, selective reporting of favourable results, and "spin" in reporting conclusions. Also, while some journals now require that the role of the sponsor in the design, conduct and publication of the study be described, this practice is not widespread. Professor Bero said the key concern associated with industry-sponsored research evaluating drugs and medical devices was that there were no standard tools or validated criteria that include industry sponsorship as a risk of bias for such studies. "We need bias assessment tools for drug studies that take funding source into account," Professor Bero said. "Currently, we have no validated way to detect or evaluate these subtle but systematic biases." Compared with studies funded by other sources, industry-sponsored studies more often had efficacy results that were more favourable to the sponsor's product (relative risk 1.27, confidence interval 1.19-1.37); more often had favourable overall conclusions (relative risk 1.34, confidence interval 1.19-1.51); and had less agreement between stated results and overall conclusions (relative risk 0.83, confidence interval 0.70-0.98). Co-author of the review, Dr Joel Lexchin, Professor Emeritus of York University, said the findings were especially concerning for patients and doctors. "Our views about the effectiveness and safety of many medicines may be distorted. 
Medicines may be both less safe and less effective than we think to the extent that the evidence about them comes from the companies making them," he said.
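The relative risks quoted in the review are ratios of proportions between the two funding groups. As a sketch with invented counts (not the review's data), a relative risk and its 95% confidence interval can be computed from a 2x2 table using the standard log-scale approximation:

```python
import math

def relative_risk(a, b, c, d):
    # 2x2 table: a/b = favourable/unfavourable results in sponsored studies,
    # c/d = favourable/unfavourable results in non-sponsored studies.
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR), then a 95% interval on the log scale.
    se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

rr, lo, hi = relative_risk(80, 20, 60, 40)  # invented counts
```

A relative risk above 1 with a confidence interval that excludes 1, as in the review's 1.27 (1.19-1.37), indicates results systematically more favourable in the sponsored group.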


News Article | March 21, 2016
Site: www.scientificcomputing.com

Artificial intelligence must be kept under human control or we may become defenseless against its capabilities, warn two University of Sydney machine-learning experts. Professor Dong Xu, Chair in Computer Engineering from the University of Sydney’s School of Electrical and Information Engineering, says the defeat of the world champion Go player has raised fresh concerns about the future role of artificial intelligence (AI) devices. The Professor, whose research interests include computer vision, machine-learning and multimedia content analysis, says the question now is how much we should control AI’s ability to self-learn. “Scientists and technology investors have been enthusiastic about AI for several years, but the triumph of the supercomputer has finally made the public aware of its capabilities. This marks a significant breakthrough in the technology world,” Professor Xu says. “Supercomputers are more powerful than the human mind. Competitive games such as Go or chess are actually all about rules — they are easy for a computer. Once a computer grasps them, it will become very good at playing the games.” Professor Xu says: “The problem is that computers like AlphaGo aren’t good at the overall strategy, but they are good at partial ones because they search better within a smaller area. This explains why AI will often lag behind in the beginning but catches up later. “A human player can be affected by emotions, such as pressure or happiness, but a computer will not. “It’s said that a person is able to memorize 1000 games in a year, but a computer can memorize tens of thousands or hundreds of thousands during the same period. And a supercomputer can always improve — if it loses one game, then it would analyze it and do better next time. “If a supercomputer could totally imitate the human brain, and have human emotions, such as being angry or sad, it will be even more dangerous." 
Currently, AI is well suited to labor-intensive industries, where machines can work as human substitutes to serve the public interest: they can clean, work as agricultural robots in the fields, or probe deep underground. "Another challenge is that AI needs a more intelligent environment. For instance, self-driven automobiles often can’t recognize a red light, so if the traffic lights could send a signal to the cars and they could sense them, it would solve the problem. Singapore is making an effort to build an area with roads that are friendly or responsive to self-driven vehicles." Professor Xu believes it is crucial for companies, such as Google and Facebook, to set up “moral and ethics committees” to take control to ensure scientific research won’t head in the wrong direction and create machines that act maliciously. Dr. Michael Harre, a senior lecturer in complex systems who spent several years studying the AI behind the ancient Chinese board game, says: “Go is probably the most complicated game that is commonly played today. Even when compared to chess, which has a very large number of possible patterns, Go has more possible patterns than there are atoms in the universe. “The technology has developed to a point that it can now outsmart a human in both simple and complex tasks. This is a concern, because artificial intelligence technology may reach a point in a few years where it is feasible that it could be adapted to areas of defense where a human may no longer be needed in the control loop: truly autonomous AI."
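Dr Harre's comparison is easy to verify with integer arithmetic: a 19x19 Go board has 361 points, each empty, black, or white, giving an upper bound of 3^361 configurations (not all of them legal positions), versus a commonly cited order-of-magnitude estimate of about 10^80 atoms in the observable universe.

```python
# Counting Go board configurations versus atoms in the observable universe.
points = 19 * 19                  # 361 intersections on the board
configurations = 3 ** points      # each point: empty, black, or white
atoms_estimate = 10 ** 80         # common order-of-magnitude estimate

assert configurations > atoms_estimate
```

3^361 is a 173-digit number, exceeding the atom estimate by more than 90 orders of magnitude, so the claim holds even before accounting for move sequences rather than static positions.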


News Article | April 21, 2016
Site: www.techtimes.com

Australia has launched its first Sydney Nanoscience Hub. The $150 million facility at the University of Sydney is among the most advanced research and training facilities in the world, allowing researchers to conduct further scientific studies at the nanoscale. Jointly funded by the university and the Commonwealth Education Infrastructure Fund of the Australian government, the building is furnished with core nanofabrication and characterization facilities, academic laboratories, and state-of-the-art teaching rooms that provide groundwork for its main thrust in education and research. Thanks to Canadian Prime Minister Justin Trudeau's explanation about quantum computing, more people are trying to understand quantum physics. Experts believe the partnership of Microsoft with quantum physicists would set the foundation for the second information age. For more than 10 years, the tech giant has been eyeing the development of a scalable universal quantum computer. Microsoft has been working with several labs, including the Quantum Nanoscience Laboratory headed by Professor David Reilly. Microsoft, in collaboration with the university, sent its quantum computing scientists and directors to participate in the launch of the Australian Institute for Nanoscale Science and Technology (AINST), where manipulation of electrons at temperatures colder than deep space can now be carried out. Among those sent by Microsoft is its research director Dr. Norm Whitaker, who believes that the company's partnership with the university would push the very edge of physics and engineering. The computers of today have reached their limits. By harnessing the nanoscale size of qubits, computing speeds could improve to 100 million times faster, as NASA and Google's D-Wave quantum computer has demonstrated. "To build a quantum computer you need more than just the [quantum] qubits; more than just the elementary constituents of matter - the electrons and so on," Prof. Reilly said. 
He added that quantum computing needs a series of electronics and classical control technology beyond what is available today. Professor Reilly acknowledged that building quantum computers presents great challenges and is grateful for Microsoft's support. He shared that his team is focused on scaling up and constructing specialized electronic systems that can function at both room and cryogenic temperatures. He added that in order to overcome that challenge, his team plans to look at how classical and quantum streams work together in building quantum machines. © 2016 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | December 17, 2016
Site: www.eurekalert.org

LUGANO-SINGAPORE, 18 December, 2016 - Cancer patients are ending up in debt because they have to cover the costs of treatment as well as other care related expenses, researchers report at the ESMO Asia 2016 Congress in Singapore. Previous studies have demonstrated that cancer patients face financial difficulties even in countries where the national public health system covers most of the expense. The economic hardship experienced by patients and survivors is often referred to as the "financial toxicity" of cancer (1). Research presented at ESMO Asia 2016 shows new aspects of the burden of cancer care on patients. A study (2) from Malaysia has found that more than half of cancer survivors spend at least a third of their yearly household income on treatment, as well as on costs such as transport to hospital and childcare. They have to pay for cancer drugs because many are not funded by the government despite the availability of free healthcare. Lead author Nirmala Bhoo Pathy, a clinical epidemiologist at the Faculty of Medicine, University Malaya Medical Center, Kuala Lumpur, Malaysia, said: "The scope of cancer care and drugs offered free through the public health services is limited in Malaysia. "The current system of funding for cancer healthcare needs to be reviewed. The government must increase financial risk protection especially for the poor. Early cancer detection must also be improved through policy changes." The findings were based on 1,662 men and women who participated in the 2012-2014 ASEAN Costs in Oncology Study. More than half (51%) of those still alive (n=1,215) a year after diagnosis were suffering financial difficulties. Low income, a lack of health insurance and not having surgery were among the factors associated with patients having money issues. The risk was lower for those who did not undergo chemotherapy. 
Commenting on the study, Professor Nathan Cherny, lead investigator for the ESMO International Consortium Study on the Availability of Anti-neoplastic Medicines, said: "Unfortunately this situation is not limited to Malaysia. Data from the soon-to-be published ESMO International Consortium Study on the Availability of Anti-neoplastic Medicines (3) indicate that the burden of out-of-pocket expenses for drugs that are not on the World Health Organisation (WHO) essential medicines list is substantial in most upper-middle income countries like Malaysia. "The problems are even more severe in low-middle and low income countries where patients often need to cover costs themselves, even for essential anticancer treatments." The financial burden of cancer care is also highlighted in a separate pilot study (4) to be presented at the ESMO Asia 2016 Congress. It includes data from patients (n=14) aged 37 to 77 currently being treated at a hospital outpatient department in Australia. All participants completed a questionnaire covering issues including income and employment history, as well as health insurance status. Preliminary results found nearly three quarters (n=10) reported a reduction in household income after their cancer diagnosis. Of the twelve patients who did have a job, ten reported changes in their employment: decreased hours (n=6), no longer working (n=3) and early retirement (n=1). "The loss of work, a carer's income and early retirement can all contribute to the financial burden on the household," said lead author Anupriya Agarwal, research fellow, Concord Cancer Center, Concord Repatriation General Hospital, Sydney, Australia. "Our study aims to provide insight into these costs and assist policymakers in finding ways of reducing this burden on patients." Another study (5) included in the programme investigated why cancer doctors recommend costly cancer drugs. Oncologists were asked to complete an online experiment. 
They were presented with several scenarios involving a fictional patient with advanced cancer, and then had to state the cancer drug (A or B) they would recommend. The results based on 101 responses found that healthcare professionals were more likely to advise the use of drugs that allow patients to live longer and have a higher chance of improving their symptoms. They were less likely to recommend drugs with an increased chance of side-effects and that cost patients more, because they are not subsidised by the government. However, doctors would favour expensive treatments if they increased a person's survival time by two months or more over the standard care available. They would take this approach even when the out-of-pocket cost could expose patients with advanced cancer to extreme financial difficulty or 'toxicity'. Lead author Deme Karikios, PhD candidate, National Health and Medical Research Council (NHMRC) Clinical Trials Centre, University of Sydney, Sydney, Australia, said: "Australian oncologists are willing to expose their patients to financial toxicity when recommending expensive unfunded anticancer drugs. They will only do this though in cases where the survival benefit is above that of standard care. "Cancer doctors need to help patients understand the potential benefits, harms and costs of drugs not subsidised by the government. Then patients can make an informed decision about these treatment options." Commenting on the findings, Cherny said: "Tools like the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS) (6) can help clinicians better inform patients on how extensive the potential benefit is from a treatment option being considered. "Using a very objective tool such as the ESMO-MCBS helps put the evidence into perspective, mitigates against optimism bias and can lead to shared decision-making that is better informed."


News Article | February 5, 2016
Site: phys.org

This material has a remarkable combination of flexibility and durability: elastin is one of the body's most long-lasting component proteins, with an average survival time comparable to a human lifespan. During a person's life, the elastin in a blood vessel, for example, will have gone through an estimated two billion cycles of pulsation. A team of researchers at the University of Sydney, MIT in the United States and at the University of Manchester in the United Kingdom has carried out an analysis that reveals the details of a hierarchical structure of scissor-shaped molecules that gives elastin its remarkable properties. The findings are published today in the journal Science Advances, in a paper by the University of Sydney postdoctoral research associate Dr Giselle Yeo and Professor Anthony Weiss in the Faculty of Science and Charles Perkins Centre, with co-authors including MIT graduate student Anna Tarakanova and Professor of Civil and Environmental Engineering Markus Buehler. Elastin tissues are made up of molecules of a protein called tropoelastin, which are strung together in a chain-like structure and which Professor Weiss and his team have been studying in the lab for many years. In this work, they collaborated with Professor Buehler and Ms Tarakanova at MIT, who have specialised in determining the molecular structure of biological materials through highly detailed atomic-scale modeling. Combining the computational and laboratory approaches provided insights that neither method could have yielded alone, team members say. While the study of elastin has been going on for a long time, Professor Weiss says this particular paper was exciting on a number of levels: because of synchrotron imaging done by team member Clair Baldock at the University of Manchester, the research revealed the shape and structure of the basic tropoelastin molecules. 
But these were snapshots - still images that could not illuminate the complex dynamics of the material as it forms large structures that can stretch and rebound. Those dynamic effects were revealed through the combination of computer modeling and laboratory work. "It's really by combining forces with these three groups" that the details were pieced together, Professor Weiss said. Ms Tarakanova explained that in Professor Buehler's lab, "we use modeling to study materials at different length scales, and for elastin, that is very useful, because we can study details at the sub-molecular scale and build up to the scale of a single molecule." By examining the relationship of structure across these different scales, she said, "we could predict the dynamics of the molecule". The dynamics turned out to be complex and surprising, Professor Weiss said. "It's almost like a dance the molecule does, with a scissors twist - like a ballerina doing a dance." Then, the scissors-like appendages of one molecule naturally lock onto the narrow end of another molecule, like one ballerina riding piggyback on top of the next. This process continues, building up long, chain-like structures. These long chains weave together to produce the flexible tissues that our lives depend on - including skin, lungs, and blood vessels. These structures "assemble very rapidly," Professor Weiss said, and this new research "helps us understand this assembly process". A key part of the puzzle was the movements of the molecule itself, which the team found were controlled by the structure of key local regions and the overall shape of the protein. The team tested the way this flexibility comes about by genetically modifying the protein and comparing the characteristics of the modified and natural versions. They revived a short segment of the elastin gene that has become dormant in humans, which changes part of the protein's configuration. 
They found that even though the changes were minor and just affected one part of the structure, the results were dramatic. The modified version had a stiff region that altered the molecule's movements. This helped to confirm that certain specific parts of the molecule, including one with a helical structure, were essential to the material's natural flexibility. That finding in itself could prove useful medically, the team says, as it could explain why blood vessels become weakened in people with certain disease conditions, perhaps as a result of a mutation in that gene. While the findings specifically relate to one particular protein and the tissues it forms, the team said the research may help in understanding a variety of other flexible biological tissues and how they work. "The integration of experiment and modelling in identifying how the molecular structure endows materials with exceptional durability, elasticity, and studying how these materials fail under extreme conditions, yields important insights for the design of new materials that replace those in our body, or for materials that we can use in engineering applications in which durable materials are critical," Professor Buehler said. "We are excited about the new opportunities that arise from this collaboration and the potential for future work, because designing materials that last for many decades without breaking down is a major engineering challenge that nature has beautifully accomplished, and on which we hope to build." More information: Subtle balance of tropoelastin molecular shape and flexibility regulates dynamics and hierarchical assembly, Science Advances, dx.doi.org/10.1126/sciadv.1501145


Mitchell P.,University of Sydney | Wong T.Y.,Singapore Eye Research Institute
American Journal of Ophthalmology | Year: 2014

Purpose: To provide evidence-based recommendations for diabetic macular edema (DME) management based on updated information from publications on DME treatment modalities. Design: Perspective. Methods: A literature search for "diabetic macular edema" or "diabetic maculopathy" was performed using the PubMed, Cochrane Library, and ClinicalTrials.gov databases to identify studies from January 1, 1985 to July 31, 2013. Meta-analyses, systematic reviews, and randomized controlled trials with at least 1 year of follow-up published in the past 5 years were preferred sources. Results: Although laser photocoagulation has been the standard treatment for DME for nearly 3 decades, there is increasing evidence that superior outcomes can be achieved with anti-vascular endothelial growth factor (anti-VEGF) therapy. Data providing the most robust evidence from large phase II and phase III clinical trials for ranibizumab demonstrated visual improvement and favorable safety profile for up to 3 years. Average best-corrected visual acuity change from baseline ranged from 6.1-10.6 Early Treatment Diabetic Retinopathy Study (ETDRS) letters for ranibizumab, compared to 1.4-5.9 ETDRS letters with laser. The proportion of patients gaining ≥10 or ≥15 letters with ranibizumab was at least 2 times higher than that of patients treated with laser. Patients were also more likely to experience visual loss with laser than with ranibizumab treatment. Ranibizumab was generally well tolerated in all studies. Studies for bevacizumab, aflibercept, and pegaptanib in DME were limited but also in favor of anti-VEGF therapy over laser. Conclusions: Anti-VEGF therapy is superior to laser photocoagulation for treatment of moderate to severe visual impairment caused by DME.


Byrne M.,University of Sydney | Przeslawski R.,Geoscience Australia
Integrative and Comparative Biology | Year: 2013

Benthic marine invertebrates live in a multistressor world where stressor levels are, and will continue to be, exacerbated by global warming and increased atmospheric carbon dioxide. These changes are causing the oceans to warm, decrease in pH, become hypercapnic, and to become less saturated in carbonate minerals. These stressors have strong impacts on biological processes, but little is known about their combined effects on the development of marine invertebrates. Increasing temperature has a stimulatory effect on development, whereas hypercapnia can depress developmental processes. The pH, pCO2, and CaCO3 of seawater change simultaneously with temperature, challenging our ability to predict future outcomes for marine biota. The need to consider both warming and acidification is reflected in the recent increase in cross-factorial studies of the effects of these stressors on development of marine invertebrates. The outcomes and trends in these studies are synthesized here. Based on this compilation, significant additive or antagonistic effects of warming and acidification of the ocean are common (16 of 20 species studied), and synergistic negative effects also are reported. Fertilization can be robust to near-future warming and acidification, depending on the male-female mating pair. Although larvae and juveniles of some species tolerate near-future levels of warming and acidification (+2°C/pH 7.8), projected far-future conditions (ca. +4°C/pH 7.6) are widely deleterious, with a reduction in the size and survival of larvae. It appears that larvae that calcify are sensitive both to warming and acidification, whereas those that do not calcify are more sensitive to warming. Different sensitivities of life-history stages and species have implications for persistence and community function in a changing ocean. Some species are more resilient than others and may be potential "winners" in the climate-change stakes. 
As the ocean will change more gradually over coming decades than in "future shock" perturbation investigations, it is likely that some species, particularly those with short generation times, may be able to tolerate near-future oceanic change through acclimatization and/or adaptation. © 2013 The Author. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved.
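The distinction the review draws between additive, antagonistic and synergistic effects of combined stressors can be made concrete with a simple additive-expectation check. This is an illustrative sketch only: the studies synthesised in the review use factorial designs and ANOVA interaction terms, and the function name, tolerance and response values below are hypothetical.

```python
def classify_interaction(ctrl, warm, acid, combined, tol=0.05):
    """Classify the joint effect of two stressors on a response variable
    (e.g. larval size relative to control) against an additive expectation.

    ctrl     : response under control conditions
    warm     : response under warming alone
    acid     : response under acidification alone
    combined : response under both stressors together
    tol      : fractional tolerance band around the additive prediction
    """
    # Additive prediction: control plus each stressor's individual effect.
    expected = ctrl + (warm - ctrl) + (acid - ctrl)
    if abs(combined - expected) <= tol * abs(ctrl):
        return "additive"
    # For deleterious stressors (responses below control), a combined
    # response worse than predicted indicates synergy.
    return "synergistic" if combined < expected else "antagonistic"

# Hypothetical larval-size responses (control normalised to 1.0):
# each stressor alone reduces size by 10%, so the additive prediction is 0.8.
print(classify_interaction(1.0, 0.9, 0.9, 0.7))   # synergistic: worse than additive
print(classify_interaction(1.0, 0.9, 0.9, 0.8))   # additive
```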


Wood N.,University of Sydney | Siegrist C.-A.,University of Geneva
Current Opinion in Infectious Diseases | Year: 2011

Purpose of review: Recognition of the high burden of disease in early life and advances in the understanding of neonatal immunology have resulted in renewed interest in maternal and neonatal vaccination. This article reviews existing information and recent advances in neonatal human immunization. Recent findings: Recent findings have demonstrated that the neonatal immune system is not immature but rather specifically adapted for early postnatal life. This includes the preferential induction of memory B cells rather than antibody-secreting plasma cells and polarization of neonatal T-cell responses away from potentially deleterious T-helper type 1 cytokines. Recent neonatal acellular pertussis and pneumococcal conjugate vaccine trials have shown that a birth dose of acellular pertussis and/or pneumococcal vaccine, in limited sample sizes, is well tolerated and immunogenic; however, they have identified vaccine interference as a critical issue to address. Summary: Neonatal immunization may be a well tolerated and effective preventive strategy against early life pathogens. Research to better understand how neonatal vaccine responses are elicited, and to identify optimal early life adjuvants and formulations, may extend the range of neonatally vaccine-preventable diseases to pertussis, rotavirus and possibly influenza, further reducing disease burden in this vulnerable group. Hurdles to neonatal vaccination include safety concerns, both immunological and clinical, demonstration of vaccine efficacy and public acceptance. © 2011 Wolters Kluwer Health | Lippincott Williams & Wilkins.


Yang J.,Aston University | Fletcher J.E.,University of Sydney | O'Reilly J.,University of Glasgow
IEEE Transactions on Industrial Electronics | Year: 2012

The application of high-power voltage-source converters (VSCs) to multiterminal dc networks is attracting research interest. The development of VSC-based dc networks is constrained by the lack of operational experience, the immaturity of appropriate protective devices, and the lack of appropriate fault analysis techniques. VSCs are vulnerable to dc-cable short-circuit and ground faults due to the high discharge current from the dc-link capacitance. However, faults occurring along the interconnecting dc cables are most likely to threaten system operation. In this paper, cable faults in VSC-based dc networks are analyzed in detail with the identification and definition of the most serious stages of the fault that need to be avoided. A fault location method is proposed because this is a prerequisite for an effective design of a fault protection scheme. It is demonstrated that it is relatively easy to evaluate the distance to a short-circuit fault using voltage reference comparison. For the more difficult challenge of locating ground faults, a method of estimating both the ground resistance and the distance to the fault is proposed by analyzing the initial stage of the fault transient. Analysis of the proposed method is provided and is based on simulation results, with a range of fault resistances, distances, and operational conditions considered. © 2012 IEEE.
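The abstract's claim that the distance to a short-circuit fault can be estimated from the initial fault transient rests on the cable inductance limiting the rate of rise of the capacitor-discharge current. A minimal sketch of that idea follows; it is not the paper's actual method, and the parameter values are hypothetical.

```python
# Simplified model: in the first instants of a solid dc-cable short circuit,
# the dc-link capacitor discharges through the cable inductance, so the
# initial current slope at the converter is roughly
#     di/dt ≈ V_dc / (L' * d),
# where L' is the cable inductance per km and d the distance to the fault.
# Inverting this relation gives a crude distance estimate.

def estimate_fault_distance_km(v_dc, l_per_km, di_dt):
    """Estimate distance to a solid short-circuit fault (km).

    v_dc     : pre-fault dc-link voltage (V)
    l_per_km : cable inductance per unit length (H/km)
    di_dt    : measured initial current slope at the converter (A/s)
    """
    return v_dc / (l_per_km * di_dt)

# Hypothetical numbers: 30 kV link, 0.3 mH/km cable, measured slope 10 MA/s.
d = estimate_fault_distance_km(30e3, 0.3e-3, 10e6)
print(f"estimated fault distance: {d:.1f} km")  # 10.0 km
```

Note that this neglects cable resistance and fault resistance; the paper's contribution for ground faults is precisely that it estimates ground resistance and distance jointly from the fault transient.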


Liang Y.-C.,University of Geneva | Liang Y.-C.,University of Sydney | Spekkens R.W.,Perimeter Institute for Theoretical Physics | Wiseman H.M.,Griffith University
Physics Reports | Year: 2011

In 1960, the mathematician Ernst Specker described a simple example of nonclassical correlations, the counter-intuitive features of which he dramatized using a parable about a seer, who sets an impossible prediction task to his daughter's suitors. We revisit this example here, using it as an entrée to three central concepts in quantum foundations: contextuality, Bell-nonlocality, and complementarity. Specifically, we show that Specker's parable offers a narrative thread that weaves together a large number of results, including the following: the impossibility of measurement-noncontextual and outcome-deterministic ontological models of quantum theory (the 1967 Kochen-Specker theorem), in particular, the recent state-specific pentagram proof of Klyachko; the impossibility of Bell-local models of quantum theory (Bell's theorem), especially the proofs by Mermin and Hardy and extensions thereof; the impossibility of a preparation-noncontextual ontological model of quantum theory; the existence of triples of positive operator valued measures (POVMs) that can be measured jointly pairwise but not triplewise. Along the way, several novel results are presented: a generalization of a theorem by Fine connecting the existence of a joint distribution over outcomes of counterfactual measurements to the existence of a measurement-noncontextual and outcome-deterministic ontological model; a generalization of Klyachko's proof of the Kochen-Specker theorem from pentagrams to a family of star polygons; a proof of the Kochen-Specker theorem in the style of Hardy's proof of Bell's theorem (i.e., one that makes use of the failure of the transitivity of implication for counterfactual statements); a categorization of contextual and Bell-nonlocal correlations in terms of frustrated networks; a derivation of a new inequality testing preparation noncontextuality; some novel results on the joint measurability of POVMs and the question of whether these can be modeled noncontextually. 
Finally, we emphasize that Specker's parable of the overprotective seer provides a novel type of foil to quantum theory, challenging us to explain why the particular sort of contextuality and complementarity embodied therein does not arise in a quantum world. © 2011 Elsevier B.V.


Lindoy L.F.,University of Sydney | Park K.-M.,Gyeongsang National University | Lee S.S.,Gyeongsang National University
Chemical Society Reviews | Year: 2013

In this tutorial review the use of macrocyclic complexes as building blocks in a selection of supramolecular systems is discussed with emphasis on the properties, such as enhanced stabilities, that cyclic ligands and their complexes may impart on the resulting assemblies. An aim of the review is to exemplify the versatility of macrocyclic ligand complexes for use as components in a range of both discrete and polymeric systems. The use of macrocyclic systems for controlling CuI aggregation, as scaffolds for metal-cluster formation, as the cyclic components in interlocked catenane and rotaxane structures, for constructing assemblies based on macrocycle exo-coordination, for forming columnar stacks, as well as their roles as both structural and redox centres in a range of coordination polymer types are all presented. © 2013 The Royal Society of Chemistry.


Patent
Sydney West Area Health Service Swahs and University of Sydney | Date: 2011-02-11

The present disclosure relates generally to a structure-modeling approach to identify therapeutic and diagnostic targets on proteins. Means are provided to generate agents which bind and optionally antagonize a particular domain within a protein referred to as a Cleaved_Adhesin Family Domain. In an embodiment, the disclosure is directed to the control of Porphyromonas gingivalis infection or infection by related microorganisms by targeting selected domains on protease-like molecules having a hemagglutinin region. In another embodiment, the present invention enables the modulation or detection of a protein having a Cleaved_Adhesin domain homologous to those in the protease-like molecules.


News Article | November 11, 2016
Site: www.eurekalert.org

This research article by Dr. Kimberley Larisa Way has been published in Current Diabetes Reviews, Volume 12, Issue 4, 2016. A new study from the University of Sydney has found that regular aerobic exercise can improve artery health in people with type 2 diabetes (T2D). The findings from this study have been published in Current Diabetes Reviews, and shed new light on exercise as a therapy in this population. Compromised arterial health is an underlying mechanism that promotes the progression of cardiovascular disease (CVD), which is the leading cause of death in individuals with T2D. Effectively managing CVD risk in this population is a major challenge for health professionals. Exercise is one of the first lines of treatment recommended by health professionals to manage the array of complications associated with T2D, such as controlling blood sugar. While it has been consistently shown that exercise is exceptionally beneficial for managing CVD, blood pressure medication is the main treatment used to manage arterial health problems. This new study combined the results of nine randomised controlled clinical trials investigating the effects of exercise in T2D. Kimberley Way, who leads the research, says: "We focussed on measures looking at arterial stiffness, vascular reactivity and smooth muscle function, because there is evidence that suggests they are closely associated with disease progression and CVD mortality." Ms Way adds: "What we found from our analysis is that aerobic exercise, such as brisk walking or cycling, appears to have a significantly beneficial effect on the stiffness and the function of the smooth muscles in the arteries. This makes our findings very valuable to health professionals, because aerobic exercise can be used as a primary treatment strategy for arterial health, while also assisting with other health complications associated with T2D." No major funding resources assisted with the completion of this study. 
Citation: Way, K.L., Keating, S. E., Baker, M.K., Chuter, V.H., & Johnson, N.A. (2016). The Effect of Exercise on Vascular Function and Stiffness In Type 2 Diabetes: A Systematic Review and Meta-Analysis. Current Diabetes Reviews, 12(4), 369-383. For more information about the article, please visit http://benthamscience.


The International Association of HealthCare Professionals is pleased to welcome Dr. Yiotoula Sotiropoulos, MBBS, to their prestigious organization with an upcoming publication in The Leading Physicians of the World. Dr. Sotiropoulos holds 30 years of experience as a family practitioner, with areas of expertise in pediatrics, as well as family and general practice. Dr. Sotiropoulos is currently serving patients at her private practice, Yiotoula Sotiropoulos Family Practice, in Bexley, Australia. Additionally, she works as a supervisor at the University of Sydney and the University of New South Wales. Dr. Sotiropoulos was educated at the University of New South Wales in Sydney, Australia, graduating in 1984. To keep up to date with the latest advances in her field, Dr. Sotiropoulos maintains a professional membership with the American Medical Association and The Royal Australian College of General Practitioners. Dr. Sotiropoulos attributes her great success to being an astute listener and keen observer. She believes that much of what a patient expresses may be communicated indirectly, and she reads between the lines to achieve a more accurate diagnosis and treatment program. View Dr. Yiotoula Sotiropoulos’ Profile Here: https://www.findatopdoc.com/doctor/Yiotoula-Sotiropoulos-Family-Practitioner-Bexley-New-South-Wales-2207 Learn more about Dr. Yiotoula Sotiropoulos by reading her upcoming publication in The Leading Physicians of the World. FindaTopDoc.com is a hub for all things medicine, featuring detailed descriptions of medical professionals across all areas of expertise, and information on thousands of healthcare topics. Each month, millions of patients use FindaTopDoc to find a doctor nearby and instantly book an appointment online or create a review. FindaTopDoc.com features each doctor’s full professional biography highlighting their achievements, experience, patient reviews and areas of expertise. 
A leading provider of valuable health information that helps empower patient and doctor alike, FindaTopDoc enables readers to live a happier and healthier life.  For more information about FindaTopDoc, visit http://www.findatopdoc.com


News Article | November 16, 2016
Site: www.nanotech-now.com

Abstract: The search for a cure for mesothelioma is in no way limited to the shores of the United States. The toxic impact of asbestos is felt internationally, with the highest incidence of malignant mesothelioma per capita being found in Australia, where over 700 new cases are diagnosed each year. According to the country’s Asbestos Diseases Research Institute (ADRI), more than 10,000 Australians have died of mesothelioma since the early 1980s, and the organization expects another 25,000 to be lost over the next forty years. But ADRI researchers have teamed up with a New Zealand-born associate professor at the University of Sydney Medical School, Dr. Glen Reid, along with a Sydney-based biotech company called EnGeneIC, to research the way that the disease responds to chemotherapeutic drugs and find a better solution. What they’ve come up with is a “futuristic new drug delivery system that relies on nanotechnology and guiding antibodies”. The bottom line – they’ve found that it works. According to Dr. Reid, the group has been treating mesothelioma tumors with antibody-guided minicells containing micro RNAs that work like a Trojan Horse. “We have found an amazing inhibition of tumor growth. The results were far in excess of what we have seen with other experimental therapies in this model, and we are very excited about it.” He goes on to explain, “A nano cell is a delivery vehicle. You can package basically anything in there that you like, so a chemotherapy drug, or in our case a mini-gene – and then it’s injected into the body.” The group is still in the midst of its research, but its excitement is based in large part on the results seen in a single patient, 51-year-old Bradley Selmon, who was diagnosed with mesothelioma in 2013. After failing to respond to chemotherapy, he elected to join Dr. Reid’s clinical trial, and over a two-month period his tumor was virtually eliminated. 
He is one of only ten patients in the phase one trial, and the only one to respond so well, so the researchers caution that enthusiasm about the results should remain measured. Still, according to Selmon’s oncologist, Dr. Steven Kao of the Chris O’Brien Lifehouse cancer center in Sydney, the treatment “has the potential for a paradigm shift in the management of other treatment resistant tumors.” About Mesothelioma.net Author: Terri Oppenheimer Terri Oppenheimer is an independent writer, editor and proofreader. She graduated from the College of William and Mary with a degree in English. Her dreams of a writing career were diverted by a need to pay her bills. She spent a few years providing copy for a major retailer, then landed a lucrative career in advertising sales. With college bills for all three of her kids paid, she left corporate America for a return to her original goal of writing. She specializes in providing content for websites and finds tremendous enjoyment in the things she learns while doing her research. Her specific areas of interest include health and fitness, medical research, and the law. Issuers of news releases, not 7th Wave, Inc. or Nanotechnology Now, are solely responsible for the accuracy of the content.


News Article | November 9, 2016
Site: www.sciencedaily.com

Planned births occur when a considered decision is made to deliver an infant, and in recent years there have been significant changes in clinical practice resulting in an increase in planned births before the ideal time of birth at 39-40 weeks' gestation. This is mostly attributable to the increased use of elective caesarean section and induction of labour. The study of 153,000 Australian children published today in Pediatrics reports that overall, 9.6 per cent of children were 'developmentally high risk'. In particular, infants born following planned birth before the optimal time of birth were more likely to have poor child development. Using the Australian Early Development Census instrument, children in the study were assessed in five domains: physical health and wellbeing, language and cognition, social competence, emotional maturity, and general knowledge and communication. Children scoring in the bottom 10 per cent of these domains were considered 'developmentally vulnerable', and children who were 'developmentally vulnerable' on two or more domains were classified as 'developmentally high risk'. Compared to children born vaginally following spontaneous labor, the combined adjusted relative risk of being 'developmentally high risk' was 26 per cent higher for a planned birth at 37 weeks and 13 per cent higher at 38 weeks. This is after taking into account other important factors associated with poor child development, such as socioeconomic disadvantage, lower maternal age, maternal smoking in pregnancy and fetal growth restriction. "The timing of planned birth is potentially modifiable, and the benefits of waiting should be communicated to clinicians, mothers and families," says study co-author, Dr Jonathan Morris of the Kolling Institute and the University of Sydney. The study also reports that the risk of being 'developmentally vulnerable' increased with decreasing gestational age. 
Compared to children with a gestational age of 40 weeks, the adjusted relative risk of being 'developmentally high risk' was 25 per cent higher at 32-33 weeks, 26 per cent higher at 34-36 weeks, 17 per cent higher at 37 weeks, and six per cent higher at 38 weeks. Compared to children born vaginally following spontaneous labor, the adjusted relative risk of being 'developmentally high risk' was seven per cent higher for labor induction or pre-labor cesarean section. The study's senior author, Associate Professor Natasha Nassar from the University of Sydney Menzies Centre for Health Policy said: "While the association between being born earlier -- lower gestational age -- and poorer developmental outcomes is well established, our results revealed that poor development is further exacerbated in the case of planned birth, where a considered decision made to deliver an infant determines gestational age. "Significant changes in clinical practice have seen an increase in planned births before 39-40 completed weeks' gestation from an increased use of primary and repeat cesarean section and a greater use of labor induction. At a population level this has resulted in a decrease in modal gestational age with planned birth accounting for almost half of births before 39-40 weeks. It is of paramount importance to ensure there are no unintended harms from such a significant shift in clinical practice." The study's lead author, Mr Jason Bentley from the Menzies Centre for Health Policy commented: "There is an urgent need for strategies to inform more judicious clinical decision making about the timing of planned birth." "In cases where labor occurs naturally before 39 weeks or planned birth is unavoidable, it is important that there are appropriate interventions and support in early childhood for these potentially vulnerable children."
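Figures such as a "26 per cent higher" risk are relative risks. As an illustration only, here is the raw ratio behind such a figure, using hypothetical counts; the study itself reports *adjusted* relative risks from regression models, which a raw two-group ratio does not reproduce.

```python
def relative_risk(exposed_cases, exposed_total, ref_cases, ref_total):
    """Unadjusted relative risk: risk in the exposed group divided by
    risk in the reference group."""
    risk_exposed = exposed_cases / exposed_total
    risk_ref = ref_cases / ref_total
    return risk_exposed / risk_ref

# Hypothetical counts: 121 of 1,000 children developmentally high risk in the
# planned-birth group vs 96 of 1,000 in the reference group.
rr = relative_risk(121, 1000, 96, 1000)
print(f"RR = {rr:.2f}")  # RR = 1.26, i.e. a 26 per cent higher risk
```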


News Article | November 7, 2016
Site: www.eurekalert.org

Planned births occur where a considered decision is made to deliver an infant, and in recent years significant changes in clinical practice have led to an increase in planned births before the ideal time of birth at 39-40 weeks' gestation. This is mostly attributable to the increased use of elective caesarean section and induction of labour.

The study of 153,000 Australian children, published today in Pediatrics, reports that overall, 9.6 per cent of children were developmentally high risk. In particular, infants born following planned birth before the optimal time were more likely to show poor child development.

Using the Australian Early Development Census instrument, children in the study were assessed in five domains: physical health and wellbeing, language and cognition, social competence, emotional maturity, and general knowledge and communication. Children scoring in the bottom 10 per cent of a domain were considered 'developmentally vulnerable', and children who were 'developmentally vulnerable' on two or more domains were classified as 'developmentally high risk'.

Compared to children born vaginally following spontaneous labour, the combined adjusted relative risk of being 'developmentally high risk' was 26 per cent higher for a planned birth at 37 weeks and 13 per cent higher at 38 weeks. This is after taking into account other important factors associated with poor child development, such as socioeconomic disadvantage, lower maternal age, maternal smoking in pregnancy and fetal growth restriction.

"The timing of planned birth is potentially modifiable, and the benefits of waiting should be communicated to clinicians, mothers and families," says study co-author Dr Jonathan Morris of the Kolling Institute and the University of Sydney.

The study also reports that the risk of being 'developmentally vulnerable' increased with decreasing gestational age.
Compared to children with a gestational age of 40 weeks, the adjusted relative risk of being 'developmentally high risk' was 25 per cent higher at 32-33 weeks, 26 per cent higher at 34-36 weeks, 17 per cent higher at 37 weeks, and six per cent higher at 38 weeks. Compared to children born vaginally following spontaneous labour, the adjusted relative risk of being 'developmentally high risk' was seven per cent higher for labour induction or pre-labour caesarean section.

The study's senior author, Associate Professor Natasha Nassar from the University of Sydney Menzies Centre for Health Policy, said: "While the association between being born earlier (lower gestational age) and poorer developmental outcomes is well established, our results revealed that poor development is further exacerbated in the case of planned birth, where a considered decision made to deliver an infant determines gestational age.

"Significant changes in clinical practice have seen an increase in planned births before 39-40 completed weeks' gestation from an increased use of primary and repeat cesarean section and a greater use of labor induction. At a population level this has resulted in a decrease in modal gestational age, with planned birth accounting for almost half of births before 39-40 weeks. It is of paramount importance to ensure there are no unintended harms from such a significant shift in clinical practice."

The study's lead author, Mr Jason Bentley from the Menzies Centre for Health Policy, commented: "There is an urgent need for strategies to inform more judicious clinical decision making about the timing of planned birth.

"In cases where labor occurs naturally before 39 weeks or planned birth is unavoidable, it is important that there are appropriate interventions and support in early childhood for these potentially vulnerable children."
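For readers unfamiliar with the statistic, the arithmetic behind a relative risk can be sketched in a few lines of Python. The counts below are entirely hypothetical, chosen only to illustrate what a "26 per cent higher" result means; the study itself reports adjusted relative risks from regression models, not crude ratios like this one.

```python
# Illustrative (hypothetical) 2x2 counts showing how a crude relative risk
# is computed. The study's figures are *adjusted* relative risks; these
# numbers are invented purely to demonstrate the arithmetic.

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Crude relative risk: risk in the exposed group / risk in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical example: planned birth at 37 weeks vs spontaneous vaginal birth.
rr = relative_risk(121, 1000, 96, 1000)
print(f"crude RR = {rr:.2f}")  # prints "crude RR = 1.26", i.e. 26 per cent higher risk
```

A relative risk of 1.26 means the outcome occurs 1.26 times as often in the exposed group; the adjusted versions reported in the study additionally control for confounders such as socioeconomic disadvantage and maternal smoking.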


News Article | February 15, 2017
Site: phys.org

Located about 8.7 light years away, UV Ceti, or Luyten 726-8B, belongs to the nearby binary star system Luyten 726-8. It is a variable red dwarf of spectral type M, just like its companion star BL Ceti (Luyten 726-8A). Due to its proximity, this star system is a treasure trove for astronomers studying flaring events in magnetically active stellar systems.

That is why a team of researchers led by Christene Lynch of the University of Sydney in Australia selected UV Ceti as a target of radio astronomy observations in December 2015. They used the Murchison Widefield Array (MWA) in Australia to confirm previous bright flare detections in the system at 100 to 200 MHz. The array also allowed the scientists to glimpse low-level flares fainter than expected. "We have detected four flares from UV Ceti at 154 MHz using the Murchison Widefield Array," the paper reads.

The observation sessions, which used a 30.72 MHz bandwidth centered at 154 MHz with 40 kHz channels and 0.5-second integrations, allowed the team to observe flare emission in the polarized images. In each epoch, they detected a single right-handed circularly polarized flare, and also found a left-handed flare immediately following the right-handed one. Moreover, they detected linear polarization during the brightest flare, which indicates that the flares are elliptically polarized.

The researchers noted that these results highlight the importance of polarization images in such studies. "These dim flares are only detected in polarized images, which have an order of magnitude better sensitivity than the total intensity images. This highlights the utility of using polarization images to detect low level emission in confusion limited images," the team wrote in the paper.

The study also revealed that the newly detected flares have flux densities between 10 and 65 mJy. This means that they are about 100 times fainter than most flares observed so far at similar frequencies.
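The polarization language above (circularly, linearly and elliptically polarized flares) maps directly onto the Stokes parameters (I, Q, U, V) that radio interferometers measure. A minimal sketch of the definitions, with invented flux values purely for illustration:

```python
import math

# Circular, linear, and total polarization fractions from Stokes parameters.
# The flux values used below are made up solely to illustrate the definitions;
# they are not measurements from the paper.

def polarization_fractions(I, Q, U, V):
    """Return (circular, linear, total) polarized fractions of Stokes I."""
    circ = abs(V) / I                              # |V|/I: circular fraction
    lin = math.hypot(Q, U) / I                     # sqrt(Q^2+U^2)/I: linear fraction
    total = math.sqrt(Q**2 + U**2 + V**2) / I      # total polarized fraction
    return circ, lin, total

# A flare showing both circular (V) and linear (Q, U) polarization at once
# is, by definition, elliptically polarized.
circ, lin, total = polarization_fractions(I=60.0, Q=10.0, U=8.0, V=25.0)
print(f"circular {circ:.0%}, linear {lin:.0%}, total {total:.0%}")
```

This also illustrates why polarized images can be more sensitive than total-intensity images: the steady, unpolarized confusion background contributes to I but largely cancels in Q, U and V, so a faint polarized flare stands out.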
Notably, three of the four flares described in the paper have flux densities below 15 mJy, while the one observed on Dec. 11, 2015 reached nearly 65 mJy. The researchers emphasize that their study provides the first flare-rate measurements for low-intensity (below 100 mJy) flares at 100 to 200 MHz.

However, they note that there is still much to accomplish in the field of flare emission research and recommend further observations. Future studies would improve our understanding of the physical parameters of the stellar magnetospheric plasma. "To better characterize M dwarf flares at meter wavelengths requires more observational time on individual sources to constrain flare rates. More sensitive observations are also needed to investigate the fine time-frequency structure of the flares. Simultaneous multi-wavelength observations would also add to this analysis," the team concluded.

More information: 154 MHz detection of faint, polarized flares from UV Ceti, arXiv:1702.03030 [astro-ph.SR] arxiv.org/abs/1702.03030

Abstract: We have detected four flares from UV Ceti at 154 MHz using the Murchison Widefield Array. The flares have flux densities between 10 and 65 mJy, a factor of 100 fainter than most flares in the literature at these frequencies, and are only detected in polarization. The circular polarized fractions are limited to >27% at 3σ confidence and two of the flares exhibit polarity reversal. We suggest that these flares occur periodically on a time scale consistent with the rotational period of UV Ceti. During the brightest observed flare, we also detect significant linear polarization with polarization fraction >18%. Averaging the data in 6-minute, 10 MHz frequency bins, we find that the flux density of these flares does not vary over the 30 MHz bandwidth of the Murchison Widefield Array; however, we cannot rule out finer time-frequency structure.
Using the measured flux densities for the flares, we estimate brightness temperatures between 10^13 and 10^14 K, indicative of a coherent emission mechanism. The brightness temperature and polarization characteristics point to the electron cyclotron maser mechanism. We also calculate the flare rates given our four observed flares and compare them to flare rates for the set of M dwarf stars with known 100-200 MHz flares. Our measurement is the first for flares with intensities …
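The brightness temperatures quoted in the abstract can be approximated with the Rayleigh-Jeans relation T_b = S c² / (2 k ν² Ω), where Ω is the solid angle of the emitting region. The sketch below assumes round-number values for the source size (one tenth of a solar radius, roughly a late M dwarf) and distance (about 2.7 pc); the paper's own assumptions may differ, so treat this as an order-of-magnitude check only.

```python
import math

# Back-of-the-envelope brightness temperature for a radio flare, using the
# Rayleigh-Jeans relation. Source size and distance are assumed round numbers,
# not values taken from the paper.

C = 2.998e8        # speed of light, m/s
K_B = 1.381e-23    # Boltzmann constant, J/K
PC = 3.086e16      # one parsec in metres
R_SUN = 6.957e8    # solar radius in metres

def brightness_temperature(flux_jy, freq_hz, radius_m, dist_m):
    """Brightness temperature (K) for a disc of given radius at given distance."""
    omega = math.pi * (radius_m / dist_m) ** 2   # source solid angle, steradians
    flux_si = flux_jy * 1e-26                    # Jy -> W m^-2 Hz^-1
    return flux_si * C**2 / (2 * K_B * freq_hz**2 * omega)

# A 10 mJy flare at 154 MHz from a ~0.1 R_sun source at ~2.7 pc:
tb = brightness_temperature(0.010, 154e6, 0.1 * R_SUN, 2.7 * PC)
print(f"T_b ~ {tb:.1e} K")  # of order 10^13 K under these assumptions
```

Temperatures this far above any plausible thermal plasma temperature are the standard argument for a coherent mechanism such as the electron cyclotron maser.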


News Article | December 1, 2016
Site: www.cnet.com

Pharmaceutical executive Martin Shkreli won near-universal scorn in 2015 when he hiked the price of the antimalarial and anti-HIV medication Daraprim from $13.50 to $750 overnight. But now, he may have been bested by a few Australian high school students. A group of Year 11 students at Sydney Grammar School has managed to recreate Daraprim's key ingredient in their school lab, producing 3.7 grams of the compound pyrimethamine. And they did it for AU$20 (about $15, £12).

Speaking to Australia's ABC News, Sydney Grammar student Milan Leonard said he and his classmates worked on the project to highlight the "ridiculous" price of the drug, which costs AU$13 for a bottle of 50 tablets in Australia. Daraprim came to the world's attention when Turing Pharmaceuticals, headed by then-CEO Shkreli, acquired the rights to sell the medication before hiking prices by 5,000 percent. Since then, Shkreli has courted controversy in the sci-tech scene, offering to buy out the embattled 4chan and refusing to release music from a Wu-Tang album, the lone copy of which he owned, unless Donald Trump was elected president.

The Sydney Grammar students did their work with the help of Open Source Malaria, a project supported by the University of Sydney and the Australian government and dedicated to finding a cure for malaria "guided by open source principles".

One of the students on the team, Brandon Lee, described the feeling of making the final discovery. "At first there was definitely disbelief," Lee told the ABC. "We spent so long and there were so many obstacles... it surprised us, like, 'Oh, we actually made this material' and 'This can actually help people out there'.

"So it was definitely disbelief but then it turned into happiness as we realised we finally got to our main goal."

So what's it like to beat Big Pharma at their own game? To use an Aussie phrase, Milan Leonard was pretty stoked. "It was ecstatic, it was bliss, it was euphoric," he said.
Martin Shkreli later responded to the news, releasing a short video statement about the students along with a flurry of tweets. "I'm delighted to hear about more and more students entering the STEM field," he said in the video, posted on YouTube. "These Australian students are the proof that the 21st-century economy will solve problems of human suffering through science and technology."

His tweets were more direct. "Learning synthesis isn't innovation," he wrote in one tweet. "And never, ever compare your cook game to mine. Highest yield, best purity, most scale. I have the synthesis game on lock," he later added.

CNET contacted Sydney Grammar for comment, but the students were at their end-of-year Speech Day.


News Article | November 30, 2016
Site: www.nature.com

Generally speaking, scientists aren't known as a gregarious bunch. Many identify as bookish, introverted, perhaps even a bit awkward. Yet those with more outgoing, extroverted traits might find it easier to thrive in today's scientific culture. That's because researchers in academia and industry often have to step into the spotlight, presenting their results at seminars and meetings and forging new relationships with colleagues, funders and, increasingly, the public.

Mastering these skills is especially important for young scientists who are trying to build their reputations and advance their careers. But for many shy or introverted researchers, these tasks can feel daunting, if not downright terrifying. They can even cause some to question their place in science, says Louise Harkness, a postdoc at the Woolcock Institute of Medical Research in Sydney, Australia, who has blogged about the challenges of being an introverted scientist. “A future in academia is hard for the best scientists,” says Harkness, who studies treatments for respiratory disorders. “Let alone for quiet scientists who are too shy to put their work forward.”

Still, quiet scientists can compete successfully with their more loquacious counterparts by cultivating their public-speaking and networking skills, as well as by engaging in creative methods of self-promotion that fit their personalities. Researchers will need to acknowledge the political dimensions of professional science and examine their own personality traits and motivations to find approaches that work best for them.

Along with the myth that all scientists are introverts, there is also a widespread perception that science operates as a pure meritocracy. Many young researchers think that they just need to do good research and the rest will follow, says Donna Dean, a retired administrator at the US National Institutes of Health and an executive consultant on leadership and talent development for the Association for Women in Science.
That's usually not the case, Dean says. “We can't just sit around and do nothing and assume that people will recognize our achievements.” Indeed, Jonathan Cheek, a personality psychologist at Wellesley College in Massachusetts, says that shy or introverted people can easily get overlooked in a culture of self-promotion. “Social communication skills, such as public speaking, are the largest predictor of career success outside of whatever the technical requirements for that career are,” he says. That may not seem fair, he admits, but it's reality.

Acknowledging the importance of 'soft skills' is a good first step, Cheek says, particularly for certain types of introvert (Cheek and his colleagues recognize four categories: social, thinking, anxious and restrained introversion). Not all introverts are shy, and some of them (all except the anxious introverts, according to Cheek) avoid speaking up and drawing attention to themselves simply because they don't wish to or don't find the behaviour rewarding. For those scientists, he says, it can be enough to recognize that there are tangible benefits to engaging in some form of self-promotion, even if it doesn't come naturally.

For others, the barriers are greater. People who experience general shyness feel discomfort when talking to strangers or in front of crowds (Cheek also helped to develop a shyness scale). And researchers who might sometimes feel unwelcome in science because of their identity, including women, minorities and members of the lesbian, gay, bisexual, transgender and queer (LGBTQ) community, can find themselves struggling to speak out in professional settings, Dean says. She adds that such discomfort might stem from a feeling that they bear the burden of representing their entire demographic group, or because they have been conditioned to be quiet as a result of their background.
Many in the scientific community agree on the need to help those researchers to amplify their voices, but in the meantime, researchers can help themselves by weighing the costs and benefits of staying quiet. “You have to think about, 'What's standing between me and my goals?',” says Cheek, who identifies as an ambivert, someone with both introverted and extroverted traits. If people have already invested years of their life in graduate studies, Cheek says, it's likely that they have a strong stake in continuing their scientific career and so will be willing to push past their shyness.

Sometimes, it just takes finding the right motivation, says Harkness, who overcame some of her quiet tendencies while doing her PhD research at the University of Sydney on gene regulation in asthmatic muscle cells. “I came to this realization that if I don't get up there and present my results, the world is missing out on these results and my thought process,” she says (see 'Embrace your quietness').

Almost all scientists, at one point or another, have to share their research in front of crowds, a task that strikes dread into the hearts of many, not just the introverted and shy. Some surveys, such as the Chapman University Survey on American Fears conducted in 2014, show that in the United States, fear of public speaking often tops people's lists of phobias, beating even fear of drowning in some cases. “Early in my PhD, I recognized it was something I was abysmal at,” says Paul Brack, a PhD student at Loughborough University, UK, who studies ways to produce hydrogen for fuel cells. “I wanted to become average — that was my aspiration.”

Fortunately, Cheek says, public speaking is not as hard to learn as many people fear, and it doesn't require quiet researchers to become extroverts. The main reason most people loathe public speaking, he adds, is that they haven't done it very much; getting better at it just takes practice.
Many universities offer resources to help scientists to become comfortable presenting at meetings and to hone their speaking skills, says biochemist Kate Sleeth, interim associate dean of administration and student development at City of Hope hospital in Duarte, California. If an institution has no such offerings, Sleeth, an introvert herself who now chairs the board of directors of the National Postdoctoral Association, recommends seeking out groups such as Toastmasters International, a non-profit organization dedicated to helping its members to become better communicators.

Another strategy is for researchers to develop a presentation style that feels comfortable to them. For Harkness, that involved using her talks to illustrate her thought process, rather than just to disseminate her findings. “I want to take people through the story,” she says. Stepping through the evolution of a research project actually made her feel better about presenting her work, she says. “I'm quite proud to show it.”

Networking can also be adapted to individual preferences, despite the intimidating connotation that it has for many scientists. “The word 'networking' makes a lot of people feel like they are going to have to come up with some sort of beautifully flowing conversational piece,” says Brack, who wrote about the subject in a Naturejobs blogpost last year (see go.nature.com/2fx60wc). But he has devised several ways to network that suit him, as an introvert who also used to be very shy. One strategy involves approaching individuals, rather than big groups, at networking events, with a question or two in mind ahead of time. If Brack strikes up a conversation with a fellow graduate student, he usually leads with questions about their research, adviser and university. You don't even have to stick to science, says Dean, who still finds networking hard.
Perhaps you notice something on someone's name badge, such as being from the same place, or share a hobby or other connection. “Get people talking about themselves,” she says. Dean advises young scientists to set a goal of talking to two or three new people each time they go to a conference, and she urges them to avoid describing their work in self-deprecating terms. Sleeth also suggests taking along an outgoing friend who will help you to feel comfortable, but who will not hog the limelight. Quiet scientists might also consider collaborating with more-extroverted colleagues on research. “It makes it so much easier,” Sleeth says.

Ultimately, even if these tasks never feel natural to many quiet scientists, they should not despair, says Steve Blank, who teaches entrepreneurship at Stanford University in California and is the architect of the US National Science Foundation's Innovation Corps Program, which helps scientists to commercialize their discoveries. “By definition, scientists are pretty smart,” Blank says. “While you might not have it in your gut, you have enough computing power to emulate it.”

When making big career choices, quiet scientists might want to consider how different paths in science would suit their personality. For instance, academia probably entails teaching and giving many public talks, whereas government agencies might require more lab work and meetings with agency managers. As entrepreneurs in the tech industry, Blank says, scientists have to sell their ideas to investors and customers. “If you want a leadership role, I'd say the biggest thing you need to learn is to communicate,” he says. And that often involves at least learning to emulate an extrovert.
Because of the diverse demands of different scientific trajectories, Cheek recommends that early-career scientists look at literature such as Holland's theory of vocational choice, developed by the late John Holland, a psychologist at Johns Hopkins University in Baltimore, Maryland. “It's sort of a theory about how work environments have personalities,” he says. Both people and occupations are ranked against Holland's framework, and three of the categories (realistic, investigative and artistic) are well suited to introverts; these might correspond to more applied, theoretical and creative career paths, respectively. Scientists should not let such classifications dissuade them from following their aspirations, Cheek says, but they should consider whether their personality is compatible with their intended career choice. “Your preference, when it