Melbourne, Australia

The University of Melbourne is an Australian public university located in Melbourne, Victoria. Founded in 1853, it is Australia's second oldest university and the oldest in Victoria. Melbourne was named Australia's best university by Times Higher Education, the Academic Ranking of World Universities and the National Taiwan University Rankings. Times Higher Education ranks Melbourne 34th in the world, while the QS World University Rankings places Melbourne 31st. According to the QS World University Subject Rankings 2014, the University of Melbourne is ranked 2nd in the world for Education, 8th in Accounting & Finance and in Law, 10th in Psychology, 12th in Medicine, and 15th in Computer Science & IT.

Melbourne's main campus is located in Parkville, an inner suburb north of the Melbourne central business district, with several other campuses located across Victoria. Melbourne is a sandstone university and a member of the Group of Eight, Universitas 21 and the Association of Pacific Rim Universities. Since 1872, various residential colleges have become affiliated with the university. There are 12 colleges located on the main campus and in nearby suburbs offering academic, sporting and cultural programs alongside accommodation for Melbourne students and faculty.

Melbourne comprises 11 separate academic units and is associated with numerous institutes and research centres, including the Walter and Eliza Hall Institute of Medical Research, the Florey Institute of Neuroscience and Mental Health, the Melbourne Institute of Applied Economic and Social Research and the Grattan Institute. Amongst Melbourne's 15 graduate schools, the Melbourne Business School, the Melbourne Law School and the Melbourne Medical School are particularly well regarded.

Four Australian prime ministers and five governors-general have graduated from Melbourne. Seven Nobel laureates have been students or faculty, the most of any Australian university. (Source: Wikipedia)


O'Brien K.,University of Oslo | Barnett J.,University of Melbourne
Annual Review of Environment and Resources | Year: 2013

This article reviews research on global environmental change and human security, providing retrospective and tentative prospective views of this field. It explains the roles that the concept of human security has played in research on environmental change, as well as the knowledge that it has contributed. It then discusses the Global Environmental Change and Human Security (GECHS) project as an example of how this research has encouraged a more politicized understanding of the problem of global environmental change, drawing attention to the roles of power, agency, and knowledge. Finally, the article considers new research frontiers that have emerged from this field, including research on social transformations as a means of promoting, sustaining, and enhancing human security in response to complex global environmental challenges. The potential contributions of human security approaches to the next generation of global change research are discussed. © 2013 by Annual Reviews. All rights reserved.

Elliott A.D.,University of Adelaide | La Gerche A.,University of Melbourne
British Journal of Sports Medicine | Year: 2015

Aims: Prolonged endurance exercise is associated with elevated biomarkers of myocardial damage and modest evidence of left ventricular (LV) dysfunction. Recent studies have reported more profound effects on right ventricular (RV) function following endurance exercise. We performed a meta-analysis of studies reporting RV function before and after endurance exercise. Methods: We performed a search of peer-reviewed studies with the criteria for inclusion in the analysis being (1) healthy adult participants; (2) studies examining RV function following an event of at least 90 min duration; (3) studies reporting RV fractional area change (RVFAC), RV strain (S), RV ejection fraction (RVEF) or tricuspid annular plane systolic excursion (TAPSE) and (4) studies evaluating RV function immediately (<1 h) following exercise. Results: Fourteen studies were included with 329 participants. A random-effects meta-analysis revealed significant impairments of RV function when assessed by RVFAC (weighted mean difference (WMD) -5.78%, 95% CI -7.09% to -4.46%), S (WMD 3.71%, 95% CI 2.79% to 4.63%), RVEF (WMD -7.05%, 95% CI -12.3% to -1.8%) and TAPSE (WMD -4.77 mm, 95% CI -8.3 to -1.24 mm). Modest RV dilation was evident in studies reporting RV systolic area postexercise (WMD 1.79 cm2, 95% CI 0.5 to 3.08 cm2). In contrast, no postexercise changes in LV systolic function (expressed as LVFAC or LVEF) were observed in the included studies (standardised mean difference 0.03%, 95% CI -0.13% to 0.18%). Conclusions: Intense prolonged exercise is associated with a measurable reduction in RV function while LV function is relatively unaffected. Future studies should examine the potential clinical consequences of repeated prolonged endurance exercise on the right ventricle. © 2015, BMJ Publishing Group. All rights reserved.
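The pooling step behind each weighted mean difference above can be sketched with a minimal DerSimonian-Laird random-effects computation. This is not the authors' code, and the three study effects and standard errors below are invented purely for illustration:

```python
import math

def random_effects_meta(effects, ses):
    """DerSimonian-Laird random-effects pooling.

    effects: per-study mean differences; ses: their standard errors.
    Returns (pooled effect, 95% CI lower bound, 95% CI upper bound).
    """
    w = [1.0 / se ** 2 for se in ses]                      # inverse-variance weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)          # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se_p = math.sqrt(1.0 / sum(w_re))
    return pooled, pooled - 1.96 * se_p, pooled + 1.96 * se_p

# Invented example: three hypothetical studies of post-exercise change in RVFAC (%)
pooled, lo, hi = random_effects_meta([-5.0, -6.0, -6.5], [0.8, 1.0, 1.2])
```

Fed with the per-study differences and standard errors for each RV index, the same machinery produces a pooled WMD and 95% CI of the kind reported in the Results.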

Newton-Howes G.,University of Otago | Newton-Howes G.,Imperial College London | Clark L.A.,University of Notre Dame | Chanen A.,University of Melbourne
The Lancet | Year: 2015

The pervasive effect of personality disorder is often overlooked in clinical practice, both as an important moderator of mental state and physical disorders, and as a disorder that should be recognised and managed in its own right. Contemporary research has shown that maladaptive personality (when personality traits are extreme and associated with clinical distress or psychosocial impairment) is common, can be recognised early in life, evolves continuously across the lifespan, and is more plastic than previously believed. These new insights offer opportunities to intervene to support more adaptive development than before, and research shows that such intervention can be effective. Further research is needed to improve classification, assessment, and diagnosis of personality disorder across the lifespan; to understand the complex interplay between changes in personality traits and clinical presentation over time; and to promote more effective intervention at the earliest possible stage of the disorder than is done at present. Recognition of how personality disorder relates to age and developmental stage can improve care of all patients. © 2015 Elsevier Ltd.

Grossmann M.,University of Melbourne | Grossmann M.,Austin Health
Journal of Endocrinology | Year: 2014

A wealth of observational studies show that low testosterone is associated with insulin resistance and with an increased risk of diabetes and the metabolic syndrome. Experimental studies have identified potential mechanisms by which low testosterone may lead to insulin resistance. Visceral adipose tissue is an important intermediate in this relationship. Actions of testosterone or its metabolite oestradiol on other tissues such as muscle, liver, bone or the brain, and body composition-independent effects may also play a role. However, definitive evidence from randomised controlled trials (RCTs) to clarify whether the association of low testosterone with disordered glucose metabolism is causative is currently lacking. It therefore remains possible that this association is due to reverse causation, or simply originates by association with common health and lifestyle factors. RCTs of testosterone therapy in men with or without diabetes consistently show modest metabolically favourable changes in body composition. Despite this, testosterone effects on glucose metabolism have been inconsistent. Recent evidence suggests that the hypothalamic-pituitary-testicular axis suppression in the majority of obese men with metabolic disorders is functional, and may be, at least in part, reversible with weight loss. Until further evidence is available, lifestyle measures with emphasis on weight reduction, treatment of comorbidities and optimisation of diabetic control should remain the first-line treatment in these men. Such measures, if successful, may be sufficient to normalise testosterone levels in men with metabolic disorders, who typically have only modest reductions in circulating testosterone levels. © 2014 Society for Endocrinology Printed in Great Britain.

Grossmann M.,University of Melbourne | Wittert G.,University of Adelaide
Endocrine-Related Cancer | Year: 2012

Metabolic disorders such as diabetes, obesity and the metabolic syndrome have been shown to modulate prostate cancer (PCa) risk and aggressiveness in population-based and experimental studies. While associations between these conditions are modest and complex, two consistent findings have emerged. First, there is observational evidence that obesity and associated insulin excess are linked to increased PCa aggressiveness and worse outcomes. Secondly and somewhat paradoxically, long-standing diabetes may be protective against PCa development. This apparent paradox may be due to the fact that long-standing diabetes is associated with insulin depletion and decreased IGF1 signalling. Men with obesity or diabetes have moderate reductions in their androgen levels. The interconnectedness of metabolic and androgen status complicates the dissection of the individual roles of these factors in PCa development and progression. Metabolic factors and androgens may promote prostate carcinogenesis via multiple mechanisms including inflammation, adipokine action, fatty acid metabolism and IGF signalling. Moreover, androgen deprivation, given to men with PCa, has adverse metabolic consequences that need to be taken into account when estimating the risk benefit ratio of this therapy. In this review, we will discuss the current epidemiological and mechanistic evidence regarding the interactions between metabolic conditions, sex steroids and PCa risk and management. © 2012 Society for Endocrinology.

Sarris J.,University of Melbourne | Sarris J.,Swinburne University of Technology
Psychiatric Clinics of North America | Year: 2013

St John's wort (Hypericum perforatum) has been extensively studied and reviewed for its use in depression; however, there is less salient discussion of its clinical application for a range of other psychiatric disorders. This article outlines the current evidence of the efficacy of St John's wort in common psychiatric disorders, including major depression, bipolar depression, attention-deficit hyperactivity disorder, obsessive-compulsive disorder, social phobia, and somatization disorder. Mechanisms of action, including emerging pharmacogenetic data, safety, and clinical considerations are also detailed. © 2013 Elsevier Inc.

Sgro C.M.,Monash University | Lowe A.J.,University of Adelaide | Hoffmann A.A.,University of Melbourne
Evolutionary Applications | Year: 2011

Evolution occurs rapidly and is an ongoing process in our environments. Evolutionary principles need to be built into conservation efforts, particularly given the stressful conditions organisms are increasingly likely to experience because of climate change and ongoing habitat fragmentation. The concept of evolutionary resilience is a way of emphasizing evolutionary processes in conservation and landscape planning. From an evolutionary perspective, landscapes need to allow in situ selection and capture high levels of genetic variation essential for responding to the direct and indirect effects of climate change. We summarize ideas that need to be considered in planning for evolutionary resilience and suggest how they might be incorporated into policy and management to ensure that resilience is maintained in the face of environmental degradation. © 2010 Blackwell Publishing Ltd.

Pettolino F.A.,CSIRO | Walsh C.,University of Melbourne | Fincher G.B.,University of Adelaide | Bacic A.,University of Melbourne
Nature Protocols | Year: 2012

The plant cell wall is a chemically complex structure composed mostly of polysaccharides. Detailed analyses of these cell wall polysaccharides are essential for our understanding of plant development and for our use of plant biomass (largely wall material) in the food, agriculture, fabric, timber, biofuel and biocomposite industries. We present analytical techniques not only to define the fine chemical structures of individual cell wall polysaccharides but also to estimate the overall polysaccharide composition of cell wall preparations. The procedure covers the preparation of cell walls, together with gas chromatography-mass spectrometry (GC-MS)-based methods, both for the analysis of monosaccharides as their volatile alditol acetate derivatives and for methylation analysis to determine linkage positions between monosaccharide residues as their volatile partially methylated alditol acetate derivatives. Analysis time will vary depending on both the method used and the tissue type, and ranges from 2 d for a simple neutral sugar composition to 2 weeks for a carboxyl reduction/methylation linkage analysis. © 2012 Nature America, Inc. All rights reserved.

Adger W.N.,University of Exeter | Barnett J.,University of Melbourne | Brown K.,University of Exeter | Marshall N.,James Cook University | O'Brien K.,University of Oslo
Nature Climate Change | Year: 2013

Society's response to every dimension of global climate change is mediated by culture. We analyse new research across the social sciences to show that climate change threatens cultural dimensions of lives and livelihoods that include the material and lived aspects of culture, identity, community cohesion and sense of place. We find, furthermore, that there are important cultural dimensions to how societies respond and adapt to climate-related risks. We demonstrate how culture mediates changes in the environment and changes in societies, and we elucidate shortcomings in contemporary adaptation policy. © 2013 Macmillan Publishers Limited. All rights reserved.

Villemagne V.L.,Austin Health VIC | Villemagne V.L.,University of Melbourne | Fodero-Tavoletti M.T.,Austin Health VIC | Fodero-Tavoletti M.T.,University of Melbourne | And 3 more authors.
The Lancet Neurology | Year: 2015

Use of selective in-vivo tau imaging will enable improved understanding of tau aggregation in the brain, facilitating research into causes, diagnosis, and treatment of major tauopathies such as Alzheimer's disease, progressive supranuclear palsy, corticobasal syndrome, chronic traumatic encephalopathy, and some variants of frontotemporal lobar degeneration. Neuropathological studies of Alzheimer's disease show a strong association between tau deposits, decreased cognitive function, and neurodegenerative changes. Selective tau imaging will allow the in-vivo exploration of such associations and measure the global and regional changes in tau deposits over time. Such imaging studies will comprise non-invasive assessment of the spatial and temporal pattern of tau deposition over time, providing insight into the role tau plays in ageing and helping to establish the relation between cognition, genotype, neurodegeneration, and other biomarkers. Once validated, selective tau imaging might be useful as a diagnostic, prognostic, and progression biomarker, and a surrogate marker for the monitoring of efficacy and patient recruitment for anti-tau therapeutic trials. © 2015 Elsevier Ltd.

Quevedo D.E.,University of Newcastle | Oostergaard J.,University of Aalborg | Nesic D.,University of Melbourne
IEEE Transactions on Automatic Control | Year: 2011

We study a control architecture for linear time-invariant plants subject to random disturbances, in which a network is placed between the controller output and the plant input. The network imposes a constraint on the expected bit-rate and is affected by random independent and identically distributed (i.i.d.) dropouts. Dropout rates and acknowledgments of receipt are not available at the controller side. To achieve robustness with respect to i.i.d. dropouts, the controller transmits data packets containing quantized plant input predictions. These are provided by an appropriate optimal entropy-coded dithered lattice vector quantizer. Within this context, we derive stochastic stability results and provide a noise-shaping model of the closed-loop system. This model is employed for performance analysis using rate-distortion theory. © 2006 IEEE.
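The buffering idea behind packetized predictions can be caricatured in a few lines: a toy scalar plant, a controller that sends a packet of predicted inputs each step, and a Bernoulli dropout channel with no acknowledgments. This is only a sketch; the paper's dithered lattice quantizer and entropy coding are omitted, and the deadbeat prediction law, gains, dropout rate and noise level below are all illustrative assumptions:

```python
import random

def simulate_pnc(a=1.2, b=1.0, horizon=5, drop_prob=0.3, steps=500, seed=1):
    """Toy packetized predictive control over an i.i.d. dropout link.

    Each step the controller sends a packet of `horizon` predicted inputs
    (from a simple deadbeat law, an illustrative stand-in for the paper's
    quantized optimal predictions). If the packet is dropped, the actuator
    plays the next entry buffered from the last packet that arrived.
    Returns the peak |state| over the run.
    """
    rng = random.Random(seed)
    x = 1.0
    buffer, idx = [0.0] * horizon, 0
    peak = abs(x)
    for _ in range(steps):
        # Build the prediction packet: u = -(a/b) * x drives the noise-free
        # predicted state to zero, so the remaining entries are zero.
        pred, xp = [], x
        for _ in range(horizon):
            u = -(a / b) * xp
            pred.append(u)
            xp = a * xp + b * u
        if rng.random() > drop_prob:            # packet arrives
            buffer, idx = pred, 0
        u_act = buffer[min(idx, horizon - 1)]   # replay buffer on dropouts
        idx += 1
        x = a * x + b * u_act + rng.uniform(-0.1, 0.1)  # bounded disturbance
        peak = max(peak, abs(x))
    return peak
```

Because the plant is open-loop unstable (a > 1), the state grows during a run of dropouts, but replaying buffered predictions caps the excursion, which is the intuition behind the stochastic stability results described above.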

Krenske E.H.,University of Melbourne | Houk K.N.,University of California at Los Angeles
Accounts of Chemical Research | Year: 2013

This Account describes how attractive interactions of aromatic rings with other groups can influence and control the stereoselectivity of many reactions. Recent developments in theory have improved the accuracy in the modeling of aromatic interactions. Quantum mechanical modeling can now provide insights into the roles of these interactions at a level of detail not previously accessible, both for ground-state species and for transition states of chemical reactions. In this Account, we show how transition-state modeling led to the discovery of the influence of aryl groups on the stereoselectivities of several types of organic reactions, including asymmetric dihydroxylations, transfer hydrogenations, hetero-Diels-Alder reactions, acyl transfers, and Claisen rearrangements.

Our recent studies have also led to a novel mechanistic picture for two classes of (4+3) cycloadditions, both of which involve reactions of furans with oxyallyl intermediates. The first class of cycloadditions, developed by Hsung, features neutral oxyallyl intermediates that contain a chiral oxazolidinone auxiliary. Originally, it was thought that these cycloadditions relied on differential steric crowding of the two faces of a planar intermediate. Computations reveal a different picture and show that cycloaddition with furan takes place preferentially through the more crowded transition state: the furan adds on the same side as the Ph substituent of the oxazolidinone. The crowded transition state is stabilized by a CH-π interaction between furan and Ph worth approximately 2 kcal/mol.

Attractive interactions with aromatic rings also control the stereoselectivity in a second class of (4+3) cycloadditions involving chiral alkoxy siloxyallyl cations. Alkoxy groups derived from chiral α-methylbenzyl alcohols favor crowded transition states, where a stabilizing CH-π interaction is present between the furan and the Ar group. The cationic cycloadditions are stepwise, while the Hsung cycloadditions are concerted. Our results suggest that this form of CH-π-directed stereocontrol is quite general and likely controls the stereoselectivities of other addition reactions in which one face of a planar intermediate bears a pendant aromatic substituent. © 2012 American Chemical Society.

Singh P.J.,University of Melbourne | Power D.,University of Melbourne | Chuong S.C.,National University of Singapore
Journal of Operations Management | Year: 2011

More than 900,000 organizations worldwide have registered to the ISO 9000 quality management standard. Despite its growing popularity, few studies have offered a coherent theoretical basis for the standard's appeal. A theory-based explanation enhances understanding and appreciation for the standard, and provides clarity on how the standard benefits organizations. In this paper, we invoke resource dependence theory (RDT) to posit that the standard is used by organizations as a tool to manage their organizational environment. It does this by specifying procedures that organizations use to manage their organization-environment boundary-spanning processes. Using the RDT perspective, a model with three key constructs embodying ISO 9000 was developed: internal processes, relationships with customers and relationships with suppliers. The latter two were treated as being part of the task environment. We predicted that the external aspects of the standard affect operating performance (a measure of effectiveness), both directly and through internal processes. Empirical data from 416 ISO 9000 registered Australian manufacturing plants validated the RDT perspective, and suggested that the three constructs in isolation are not as effective as when they are considered together. By invoking RDT, a new theoretical viewpoint on ISO 9000 has been developed that adds to other theoretical perspectives, and goes some way towards explaining the growing popularity of this standard with organizations. © 2010 Elsevier B.V.

Activity-composition (a-x) models have been generated for zirconium-bearing haplogranitic silicate melt and garnet from experimental data on zircon dissolution and natural rock data, respectively. Additionally including the recently proposed a-x model for Zr-bearing rutile [Tomkins et al., Journal of Metamorphic Geology 25 (2007) 401], calculated phase diagrams that explicitly include ZrO2 in the bulk composition predict the growth and dissolution of zircon at sub- and supra-solidus conditions. This occurs within the context of the evolution of major metamorphic minerals and mineral assemblages in pressure-temperature-composition space for a metapelitic rock composition. The stability of zircon is a function of the bulk ZrO2 content. Garnet contains insufficient Zr to affect the stability of zircon, whereas rutile does contain sufficient Zr that zircon stability can be curtailed in rocks with significant rutile. Silicate melt contains appreciable Zr, and zircon abundance varies inversely with melt abundance. Thermometers based on the Zr content of rutile (and potentially garnet) can be graphically portrayed as compositional contours in mineral assemblage fields on the phase diagrams, thereby potentially extending the utility of such thermometers. The ability to calculate phase diagrams explicitly including Zr is a major step towards more systematically linking zircon growth (and zircon geochronology) and accessory phase thermometry, in a readily adaptable way, to the metamorphic evolution of major silicate minerals in a wide range of rocks. © 2010 Blackwell Publishing Ltd.

Win A.K.,University of Melbourne | Jenkins M.A.,University of Melbourne
Breast Cancer Research | Year: 2013

Introduction: Lynch syndrome is an autosomal dominantly inherited disorder of cancer susceptibility caused by germline mutations in the DNA mismatch repair (MMR) genes. Mutation carriers have a substantial burden of increased risks of cancers of the colon, rectum, endometrium and several other organs, which generally occur at younger ages than for the general population. The issue of whether breast cancer risk is increased for MMR gene mutation carriers has been debated, with evidence for and against this association. Methods: Using PubMed, we identified all relevant studies of breast cancer associated with Lynch syndrome that were published by 15 December 2012. In the review, we included: (i) molecular studies that reported microsatellite instability and/or immunohistochemistry in breast cancer tumors of MMR gene mutation carriers; and (ii) risk studies that investigated risk of breast cancer for confirmed MMR gene mutation carriers or families or clinically and/or pathologically defined Lynch syndrome families. Results: We identified 15 molecular studies and, when combined, observed that 62 of 122 (51%; 95% CI 42 to 60%) breast cancers in MMR gene mutation carriers were MMR-deficient. Of the 21 risk studies identified, 13 did not observe statistical evidence for an association of breast cancer risk with Lynch syndrome, while 8 studies found an increased risk of breast cancer ranging from 2- to 18-fold compared with the general population (or non-carriers). There is only one prospective study demonstrating an elevated risk of breast cancer for MMR gene mutation carriers compared with the general population (standardized incidence ratio 3.95; 95% CI 1.59, 8.13). Conclusions: Since breast cancer is a relatively common disease in the general population, more precise estimates of risk and gene-specific risks will need to utilize large prospective cohort studies with a long follow-up.
While current data are inconclusive at a population level, individual tumor testing results suggest that MMR deficiency is involved with breast cancers in some individuals with Lynch syndrome. © 2013 Win et al.; licensee BioMed Central Ltd.
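The combined 51% (95% CI 42 to 60%) figure for MMR-deficient tumours is a simple binomial proportion, and a normal-approximation (Wald) interval reproduces it; a sketch of the arithmetic (not the authors' code):

```python
import math

def prop_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)  # half-width of the interval
    return p, p - half, p + half

p, lo, hi = prop_ci(62, 122)
# p ≈ 0.51 with 95% CI ≈ (0.42, 0.60), matching the figures quoted above
```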

Hoffmann A.A.,University of Melbourne | Turelli M.,University of California at Davis
Proceedings of the Royal Society B: Biological Sciences | Year: 2013

Wolbachia infections are being introduced into mosquito vectors of human diseases following the discovery that they can block transmission of disease agents. This requires mosquitoes infected with the disease-blocking Wolbachia to successfully invade populations lacking the infection. While this process is facilitated by features of Wolbachia, particularly their ability to cause cytoplasmic incompatibility, blocking Wolbachia may produce deleterious effects, such as reduced host viability or fecundity, that inhibit successful local introductions and subsequent spatial spread. Here, we outline an approach to facilitate the introduction and spread of Wolbachia infections by coupling Wolbachia introduction to resistance to specific classes of insecticides. The approach takes advantage of very high maternal transmission fidelity of Wolbachia infections in mosquitoes, complete incompatibility between infected males and uninfected females, the widespread occurrence of insecticide resistance, and the widespread use of chemical control in disease-endemic countries. This approach is easily integrated into many existing control strategies, provides population suppression during release and might be used to introduce Wolbachia infections even with high and seasonally dependent deleterious effects, such as the wMelPop infection introduced into Aedes aegypti for dengue control. © 2013 The Author.
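The invasion dynamics described above are often captured with a simple discrete-generation model of infection frequency under cytoplasmic incompatibility, in the style of Turelli's classic formulation. The sketch below assumes perfect maternal transmission, and the fitness-cost and incompatibility parameters are illustrative, not taken from the paper:

```python
def next_freq(p, sf=0.2, sh=0.9):
    """One generation of Wolbachia infection-frequency change.

    Simple discrete-generation model with perfect maternal transmission:
    sf is the relative fecundity cost of infection, and sh the fraction of
    offspring lost in incompatible crosses (infected male x uninfected
    female). Both parameter values here are illustrative assumptions.
    """
    return p * (1 - sf) / (1 - sf * p - sh * p * (1 - p))

def final_freq(p0, generations=200):
    """Iterate the model from a release frequency p0."""
    p = p0
    for _ in range(generations):
        p = next_freq(p)
    return p

# The unstable equilibrium sits at p = sf/sh (about 0.22 here): releases that
# push the infection above it spread towards fixation, releases below it fail.
```

With these numbers the model illustrates the paper's central constraint: deleterious effects of the infection create an invasion threshold that releases must exceed before Wolbachia spreads on its own, which is why coupling releases to insecticide resistance is attractive.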

Davey R.A.,University of Melbourne | Findlay D.M.,University of Adelaide
Journal of Bone and Mineral Research | Year: 2013

Calcitonin, a potent hypocalcemic hormone produced by the C-cells of the thyroid, was first discovered by Harold Copp in 1962. The physiological significance of calcitonin has been questioned, but recent studies using genetically modified mouse models have uncovered additional actions of calcitonin acting through its receptor (CTR) that are of particular significance to the regulation of bone and calcium homeostasis. Mice in which the CTR is deleted in osteoclasts are more susceptible to induced hypercalcemia and exogenous calcitonin is able to lower serum calcium in younger animals. These data are consistent with the hypothesis that calcitonin can regulate serum calcium by inhibiting the efflux of calcium from bone, and that this action is most important when bone turnover is high. Calcitonin has also been implicated in protecting the skeleton from excessive loss of bone mineral during times of high calcium demand, such as lactation. This action may be linked to an intriguing and as yet unexplained observation that calcitonin inhibits bone formation, because deletion of the CTR leads to increased bone formation. We propose several mechanisms by which calcitonin could protect the skeleton by regulating bone turnover, acting within the bone and/or centrally. A new more holistic notion of the physiological role of calcitonin in bone and calcium homeostasis is required and we have highlighted some important knowledge gaps so that future calcitonin research will help to achieve such an understanding. © 2013 American Society for Bone and Mineral Research.

Agency: GTR | Branch: AHRC | Program: | Phase: Research Grant | Award Amount: 4.16M | Year: 2012

Over the last decade, the creative industries have been revolutionised by the Internet and the digital economy. The UK, already punching above its weight in the global cultural market, stands at a pivotal moment where it is well placed to build a cultural, business and regulatory infrastructure in which first movers as significant as Google, Facebook, Amazon or iTunes may emerge and flourish, driving new jobs and industry. However, for some creators and rightsholders the transition from analogue to digital has been as problematic as it has been promising. Cultural heritage institutions are also struggling to capitalise upon new revenue streams that digitisation appears to offer, while maintaining their traditional roles. Policymakers are hampered by a lack of consensus across stakeholders and confused by partisan evidence lacking robust foundations. Research in conjunction with industry is needed to address these problems and provide support for legislators. CREATe will tackle this regulatory and business crisis, helping the UK creative industry and arts sectors survive, grow and become global innovation pioneers, with an ambitious programme of research delivered by an interdisciplinary team (law, business, economics, technology, psychology and cultural analysis) across 7 universities. CREATe aims to act as an honest broker, using open and transparent methods throughout to provide robust evidence for policymakers and legislators which can benefit all stakeholders. 
CREATe will do this by:
- focussing on studying and collaborating with SMEs and individual creators as the incubators of innovation;
- identifying good, bad and emergent business models: which business models can survive the transition to digital, which cannot, and which new models can succeed and scale to drive growth and jobs in the creative economy, as well as supporting the public sector in times of recession;
- examining empirically how far copyright in its current form really does incentivise or reward creative work, especially at the SME/micro level, as well as how far innovation may come from open business models and the informal economy;
- monitoring copyright reform initiatives in Europe, at WIPO and in other international fora to assess how they impact on the UK and on our work;
- using technology as a solution, not a problem: creating pioneering platforms and tools to aid creators and users, built on open standards and released under open licences;
- examining how to increase and derive revenues from the user contribution to the creative economy in an era of social media, mash-ups, data mining and prosumers;
- assessing the role of online intermediaries such as ISPs, social networks and mobile operators, to see whether they encourage or discourage the production and distribution of cultural goods, and what role they should play in enforcing copyright. Given the important governing role of these bodies, should they be subject to regulation like public bodies, and if so, how?;
- considering throughout this work how the public interest and human rights, such as freedom of expression, privacy, and access to knowledge for the socially or physically excluded, may be affected either positively or negatively by new business models and new ways to enforce copyright.
To investigate these issues our work will be arranged into seven themes: SMEs and good, bad and emergent business models; Open business models; Regulation and enforcement; Creators and creative practice; Online intermediaries and physical and virtual platforms; User creation, behaviour and norms; and, Human rights and the public interest. Our deliverables across these themes will be drawn together to inform a Research Blueprint for the UK Creative Economy to be launched in October 2016.

Agency: Cordis | Branch: FP7 | Program: CSA | Phase: ICT-2009.1.3 | Award Amount: 1.34M | Year: 2010

The Internet of Things (IoT) is one of the most important areas of a Future Internet, with significant potential to positively impact the European economy and society. The current IoT research community is, however, highly fragmented, featuring diverging approaches and technologies. This inevitably leads to a lack of coherency that hinders the realisation of an interoperable IoT, preventing the IoT from achieving its full potential.

The IoT initiative (IoT-i) brings together key actors from all relevant but currently fragmented IoT communities in Europe to work jointly towards a common vision of the Internet of Things. It represents the first serious attempt at building a unified IoT community in Europe, going across the boundaries of disparate technology sectors, in order to create a joint European strategic vision of the Internet of Things and to align this vision with current developments on the Future Internet.

The IoT-i pursues the following strategic objectives: 1) creating a joint strategic and technical vision for the IoT in Europe that holistically encompasses the currently fragmented sectors of the IoT domain; 2) contributing to the creation of an economically sustainable and socially acceptable environment in Europe for IoT technologies and the respective R&D activities; 3) creating an environment that favours international adoption of European IoT technology.

To achieve these objectives, the project will implement a range of instruments. Most notable is the creation of an international IoT forum that will be used as a global platform to bring together key international stakeholders from all technology and application areas relevant to the IoT. Other tangible outcomes are a converged reference model for the IoT aligned with other areas of the Future Internet, synthesised technology roadmaps identifying longer-term research priorities, strategic application agendas, and legal, ethical and socio-economic recommendations for the IoT.

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2011.1.1 | Award Amount: 11.44M | Year: 2012

Recent history has shown that in the aftermath of an emergency, disaster or related unexpected event, telecommunication infrastructures play a key role in recovery operations. In most cases, the legacy terrestrial infrastructure is seriously compromised and cannot guarantee reliable services for citizens and rescue teams. It is also well accepted that current public safety networks cannot provide sufficient capacity for broadband applications.

The main goal of ABSOLUTE is to design and validate a holistic and rapidly deployable mobile network providing broadband services, based on a flexible, scalable, resilient and secure network design. The most important elements that ABSOLUTE will pioneer are: i) an LTE-A base station embedded in a Low Altitude Platform, enabling large coverage for broadband services; ii) portable land mobile base stations interoperable with conventional public safety networks; iii) advanced multi-service professional terminals for first responders. The use of satellite communications, for both broadband backhauling and narrowband ubiquitous messaging services, is another essential enabler.

ABSOLUTE's objectives will be achieved by developing innovative concepts out of promising ideas, namely cognitive mechanisms for dynamic spectrum management enabling seamless network reconfiguration, as well as opportunistic and cooperative networking mechanisms ensuring maximum system availability and dependability. Proof-of-concept implementations and realistic demonstrations are also envisaged.

Thus ABSOLUTE will greatly impact next-generation public safety communication systems in Europe, enabling operators, manufacturers and other relevant stakeholders to exploit new market opportunities for LTE-A and satellite communications. The ABSOLUTE project also aims to significantly influence CEPT initiatives for frequency allocation in Europe and ETSI/3GPP standardisation for public safety applications.

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2009.1.6 | Award Amount: 8.47M | Year: 2010

SmartSantander proposes a world-unique, city-scale experimental research facility in support of typical applications and services for a smart city. Tangible results are expected to greatly influence the definition and specification of Future Internet architecture design from the viewpoints of the Internet of Things and the Internet of Services.

This unique experimental facility will be sufficiently large, open and flexible to enable horizontal and vertical federation with other experimental facilities, and will stimulate the development of new applications by users of various types, including experimental advanced research on IoT technologies and realistic assessment of user-acceptance tests.

The facility will comprise more than 20,000 sensors. It will utilise the results of two major EU-funded projects, SENSEI and WISEBED, in creating a secure and open platform of heterogeneous technologies.

The city of Santander was chosen in response to the offer of full support received from the regional Government of Cantabria, including a cash contribution of €500,000, thereby strengthening the city's chances in its bid to be named European Capital of Culture 2016.

To efficiently achieve the goals of SmartSantander, three phases of deployment have been envisaged. It is also planned to issue two Open Calls for the experimental facility to be used by researchers from outside the project, and to involve various types of users in developing new applications.

Such a unique facility, together with the practical experience gained and the feedback received, enables better understanding of and insight into the issues of: 1) required capacity, 2) scalability, 3) interoperability, 4) stimulation of faster development of new and innovative applications, and 5) influence on the specification of FI architecture design.

The SmartSantander project will enable the Future Internet of Things to become a reality.

University of Melbourne, Grains Research & Development Corporation, University of Adelaide and Csiro | Date: 2013-11-27

The present invention relates generally to polysaccharide synthases. More particularly, the present invention relates to (1,3;1,4)-β-D-glucan synthases. The present invention provides, among other things, methods for influencing the level of (1,3;1,4)-β-D-glucan produced by a cell, and nucleic acid and amino acid sequences which encode (1,3;1,4)-β-D-glucan synthases.

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2013.1.4 | Award Amount: 3.69M | Year: 2013

SOCIOTAL addresses a crucial next step in the transformation of an emerging business-driven Internet of Things (IoT) infrastructure into an all-inclusive one for society, by accelerating the creation of a socially aware, citizen-centric Internet of Things. It will close the emerging gap between business-centric IoT enterprise systems and citizen-provided infrastructure. SOCIOTAL will establish an IoT ecosystem that puts trust, user control and transparency at its heart in order to gain the confidence of everyday users and citizens. By providing adequate socially aware tools and mechanisms that simplify complexity and lower the barriers of entry, it will encourage citizen participation in the Internet of Things. This will add a novel and rich dimension to the emerging IoT ecosystem, providing a wealth of opportunities for the creation of new services and applications that address true societal needs and allow the improvement of the quality of life across European cities and communities.

SOCIOTAL builds on the foundations of emerging IoT architectures and introduces the following innovative key target outcomes, ensuring that privacy and trust are deeply embedded in the resulting architecture: 1) a governance, trust and reputation framework combining a set of innovative enablers that addresses the challenges of a massive crowd-sourced IoT infrastructure; 2) a privacy-preserving, context-sensitive communication framework for IoT devices with adequate security enablers; 3) a detailed understanding of the technological and socio-economic barriers to citizen participation in an IoT; 4) an intuitive environment inspired by social media tools that provides increased awareness and control, and empowers citizens to easily manage access to IoT devices and information, while allowing IoT-enabled citizen-centric services to be created through open community APIs; 5) services piloted in two cities, demonstrating the value of SOCIOTAL to real-world communities.

Agency: Cordis | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2010.2.4.1-5 | Award Amount: 8.12M | Year: 2011

Among patients with adrenal masses, adrenocortical carcinoma (ACC) and malignant pheochromocytoma (MPH) are found with low incidence but very unfavourable prognosis. Due to this poor clinical outcome, concomitant hormone dysregulation and limited treatment options, the two cancer entities severely impact affected patients. However, the rarity of these tumors also impedes clinical studies, which suffer from fragmentation and low cohort sizes. The European Network for the Study of Adrenal Tumors (ENS@T) has recently implemented a collection of adrenal-tumor-related databases and defined an associated network of Biological Resource Centers devoted to research on adrenal tumors. The concurrence of the recent achievements of this evolving network, progress in the understanding of molecular mechanisms, and the increasing availability of specific diagnostic and therapeutic tools for adrenal cancers provides a unique opportunity to achieve unmatched progress in the implementation of both translational and clinical research dedicated to ACC and MPH. Specifically, the newly formed ENS@T-CANCER consortium will address the following topics: 1. Structuring European clinical and translational research through implementation of a virtual research environment; 2. Improving the clinical outcome of patients with adrenal cancer by conducting interventional trials carried out by European centers of excellence; 3. Improving the differential diagnosis and risk stratification of adrenal cancer; 4. Identifying and validating tools for follow-up of patients with adrenal cancer; 5. Identifying novel biomarkers for treatment response. The ultimate aim of the ENS@T-CANCER consortium is to develop research in the field of adrenal cancers so as to improve diagnostic and treatment capabilities. The network will allow sufficient patients to be recruited across all relevant European centers, diagnostic criteria to be harmonised, and the various technological approaches of a number of laboratories to be exploited.

Agency: Cordis | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.1.3-2 | Award Amount: 8.50M | Year: 2013

Chronic kidney disease is a major cause of end-stage renal disease (ESRD) worldwide. 800,000 patients in Europe and in the US, respectively, require long-term treatment, initially with peritoneal dialysis, followed by hemodialysis and kidney transplantation. Each ESRD patient on hemodialysis costs €40,000 to €80,000 per year, has extremely poor quality of life and an average life expectancy of only 4 years. Kidney transplantation totally changes life for an ESRD patient, who can then return to normal life, but this treatment is hampered by the low number of available kidney grafts. All these treatments are, however, associated with severe adverse reactions that cause damaging thromboinflammation, triggered by the intravascular innate immune system, which leads to poor results and non-function. The overall aim of this project is to clarify the mechanisms and identify nature's own specific control points of regulation in these adverse reactions, in order to significantly improve the quality of hemodialysis devices and kidney grafts by applying these concepts of regulation in hemodialysis and kidney transplantation. We envisage that conveying a novel soluble complement inhibitor to the clinical stage via phase 1/2a clinical studies, the creation of nano-profiled surfaces with low activating properties, and the generation of easy-to-apply one-step coatings for the treatment of biomaterials (hemodialysis) and endothelial cell surfaces (kidney grafts) will revolutionize the treatment modalities of ESRD. The feasible hemodialysis treatment periods are anticipated to be extended, combined with improved quality of life; in kidney transplantation, attenuation of innate immune reactions will prolong the life expectancy of the graft and make kidneys more accessible for transplantation. All the novel techniques can be applied to other types of implantation, extracorporeal treatment and transplantation, and in the future be used in xenotransplantation and stem cell therapies.

Agency: Cordis | Branch: H2020 | Program: RIA | Phase: PHC-01-2014 | Award Amount: 7.27M | Year: 2015

This programme of work will advance understanding of the combined effects of factors that cause poor lung function, respiratory disability and the development of COPD. This will be achieved by examining the determinants of lung growth and lung function decline within existing cohorts that cover the whole life course, and which have followed in detail the respiratory health status of over 25,000 European children and adults from the early 1990s to the present day. Following a comprehensive programme of risk factor identification, we will generate a predictive risk score. The programme includes: 1) identification of behavioural, environmental, occupational, nutritional, other modifiable lifestyle, and genetic determinants of poor lung growth, excess lung function decline and the occurrence of low lung function, respiratory disability and COPD within existing child and adult cohorts; 2) generation of new data to fill gaps in knowledge on pre-conception and transgenerational determinants and risk factors; 3) validation of the role of risk factors by integration of data from relevant disciplines, integration of data from the cohort-related population-based biobanks, and exploitation of appropriate statistical techniques; 4) generation of information on changes in DNA methylation patterns to identify epigenetic changes associated with both disease development and exposure to specific risk factors; 5) generation of a predictive risk score for individual risk stratification that takes account of the combined effects of factors that cause poor lung growth, lung function decline, respiratory disability and COPD; and 6) implementation of an online interactive tool for personalised risk prediction, which will be disseminated freely and widely to the population, patients and health care providers. The work will provide an evidence base for risk identification at individual and population level that can underpin future preventive and therapeutic strategies and policies.

Agency: Cordis | Branch: H2020 | Program: RIA | Phase: PHC-05-2014 | Award Amount: 8.44M | Year: 2015

Arterial hypertension affects up to 45% of the general population and is responsible for 7.1 million deaths per year worldwide. Although a large therapeutic arsenal exists, blood pressure control is sub-optimal in up to two thirds of patients. Yet even small increments in blood pressure are associated with increased cardiovascular risk, with 62% of cerebrovascular disease and 49% of ischemic heart disease being attributable to hypertension. Detection of secondary forms of hypertension is key to the targeted management of the underlying disease and the prevention of cardiovascular complications. Endocrine forms of hypertension represent major targets for stratified approaches to health promotion. They include a group of adrenal disorders resulting in increased production of hormones affecting blood pressure regulation: primary aldosteronism (PA), pheochromocytoma/functional paraganglioma (PPGL) and Cushing's syndrome (CS). These diseases are associated with increased cardiovascular and metabolic risk and with diminished quality of life. This project will develop and evaluate an omics-based stratified health promotion programme for patients with endocrine forms of hypertension. We will define specific omics profiles for patients with PA, PPGL and CS by integrating high-throughput genetics, genomics and metabolomics data with phenome annotations through bioinformatic modelling. Established profiles will be validated as stratification biomarkers and applied to the screening of referred hypertensive patients, both to stratify primary forms of hypertension for effective and cost-efficient therapy and to improve the identification of endocrine causes for curative treatment and the prevention of cardiovascular and metabolic complications. Omics-based profiling should allow identification of patients with preclinical phenotypes, along with those hypertensives that cluster into specific endocrine groups and may benefit from personalised treatment.

Agency: Cordis | Branch: FP7 | Program: NoE | Phase: HEALTH-2009-2.3.2-1 | Award Amount: 16.97M | Year: 2009

This is a proposal from 55 partners from 36 institutes to form a NoE that will seek to integrate European malaria research directed towards a better understanding of the basic biology of the parasite, its vector, and the interactions between the parasite and both its mammalian host and its vectors. All the member institutes and researchers have demonstrated both their excellence and their ability to contribute to a successful network. The structure of the proposed network significantly evolves prior concepts of network structure, introducing new modes of research that have recently emerged. Comprising 4 research clusters, the core activities will include molecular cell biology of the parasite, host immunity, vector biology, population biology and systems biology. One arm of the network's activities will be concerned with the timely and effective translation of research, respecting the IP rights of partner institutes. The network will also contribute significantly to the production of the next generation of malaria researchers through the operation of an expanded European PhD School for malaria research based at EMBL, with students having two supervisors based in different member states. Bespoke training courses for PhD students and network personnel will be offered throughout the duration of the network to maximise individual potential. To create long-term benefit from network activities, a limited programme of post-doctoral fellowships within the network will be established. Furthermore, individual career mentoring facilities and an alumni association will continue to guide and engage network graduates. New members will be affiliated annually on a competitive basis, with an emphasis on young, emerging Principal Investigators. Through the establishment of an umbrella Foundation, active lobbying of government and non-government funding agencies, and the establishment of a charitable profile, the network will strive to become self-determining.

Agency: Cordis | Branch: H2020 | Program: RIA | Phase: PHC-05-2014 | Award Amount: 6.46M | Year: 2015

Breast cancer affects more than 360,000 women per year in the EU and causes more than 90,000 deaths. Identification of women at high risk of the disease can lead to disease prevention through intensive screening, chemoprevention or prophylactic surgery. Breast cancer risk is determined by a combination of genetic and lifestyle risk factors. The advent of next generation sequencing has opened up the opportunity for testing in many disease genes, and diagnostic gene panel testing is being introduced in many EU countries. However, the cancer risks associated with most variants in most genes are unknown. This leads to a major problem in appropriate counselling and management of women undergoing panel testing. In this project, we aim to build a knowledge base that will allow identification of women at high-risk of breast cancer, in particular through comprehensive evaluation of DNA variants in known and suspected breast cancer genes. We will exploit the huge resources established through the Breast Cancer Association Consortium (BCAC) and ENIGMA (Evidence-based Network for the Interpretation of Germline Mutant Alleles). We will expand the existing datasets by sequencing all known breast cancer susceptibility genes in 20,000 breast cancer cases and 20,000 controls from population-based studies, and 10,000 cases from multiple case families. Sequence data will be integrated with in-silico and functional data, with data on other known risk factors, to generate a comprehensive risk model that can provide personalised risk estimates. We will develop online tools to aid the interpretation of gene variants and provide risk estimates in a user-friendly format, to help genetic counsellors and patients worldwide to make informed clinical decisions. We will evaluate the acceptability and utility of comprehensive gene panel testing in the clinical genetics context.

News Article | February 25, 2017

Scientists have spent considerable time analyzing the cause of Chronic Fatigue Syndrome (CFS), also known as Myalgic Encephalomyelitis (ME). The puzzle, however, persisted without any clue as to whether it was a psychological or physical illness. One million Americans are affected by this disease. Complete exhaustion leaves people afflicted with CFS unable to work or study. The underlying cause has eluded discovery, with many dismissing it as not a "real disease." However, a new breakthrough is in sight. Thanks to Australian scientists, for the first time the cause of CFS has been traced to a faulty receptor in immune cells. The receptor dysfunction was found by researchers from Griffith University, who identified patients with CFS/ME as having single nucleotide polymorphisms in the genetic code of certain cell receptors. "This discovery is great news for all people living with Chronic Fatigue Syndrome (CFS) and the related Myalgic Encephalomyelitis (ME), as it confirms what people with these conditions have long known - that it is a 'real' illness - not a psychological issue," said Leeanne Enoch, the Science Minister of Queensland. The study has been published in Clinical and Experimental Immunology. The research, conducted by scientists at Griffith University's National Centre for Neuroimmunology and Emerging Diseases (NCNED), focused on abnormalities in immune cell receptors. "We have discovered and reported for the first time abnormalities of a certain receptor in immune cells of the body and hence it's likely to be in every cell in the body", NCNED Co-Director, Professor Don Staines, said. The team zeroed in on single nucleotide polymorphisms in the genetic code of affected patients. The cell receptor TRPM3 plays a vital role in moving calcium from outside the cell into it, balancing gene expression and the production of proteins.
A defect in the receptor stems from a change in gene transcription. In CFS, the defective receptor impairs the transit of calcium, causing pain in the brain, spinal cord, stomach and pancreas. Meanwhile, the flu-like CFS condition, with heavy fatigue that limits a person's ability to carry out daily life, has brought focus on the role of gut bacteria in exacerbating the disease. "The key defining feature is actually what's called post-exertional malaise. This involves a flu-like reaction following any form of exertion, trauma or activity that exacerbates stress," noted Chris Armstrong, a researcher at the University of Melbourne's Department of Biochemistry. In CFS research, attention is also turning to gut health as a clue to what causes the condition. Armstrong and other researchers at the University of Melbourne studied the products of metabolism and the gut microbiota in patients' feces, blood and urine. A microbial difference was spotted, with the gut bacteria showing microbiome abnormality. A study had suggested that changes in the composition of the gut bacteria could be adding pressure on the metabolic processes that convert food into energy. "Our food gets broken down by bacteria and these things called short chain fatty acids ... Our study has shown an increase of bacteria that are better at fermenting amino acids to these acids," Armstrong said. The assumption is that this increase in bacteria is causing the abnormal metabolic conditions. © 2017 Tech Times, All rights reserved.

News Article | November 8, 2016

Direct doping of nanoparticles in glass shows potential for smart applications

A new and versatile method for integrating light-emitting nanoparticles into glass, without loss of their unique properties, is demonstrated. Light-emitting upconversion nanocrystals (UPNCs)—tiny particles studded with active lanthanide ions (Ln3+)—can convert IR excitation radiation into higher-energy emissions. Improved understanding and manipulation of upconversion properties at the nanoscale has recently fueled the development of new-generation UPNCs.1 These new-generation nanocrystals emit brighter upconversion through the use of high-irradiance excitation (which can unlock previously inactive emitters).2 Alternatively, ytterbium ion (Yb3+) sensitizers clustered in arrays at the sublattice level are used to promote localized excited states.3 Furthermore, UPNCs are promising for various applications, including biological sensing, anti-counterfeiting, photon energy management, and volumetric displays. The realization of many of these applications would be particularly helped if the new-generation UPNCs could be incorporated into glass. It remains a challenge, however, to infuse glass with UPNCs that have tailored nanophotonic properties. The glass ceramic technique is the conventional integration approach for in situ growth of nanocrystals inside a glass.4, 5 In this technique, a glass that contains precursor ions for the nanocrystals—see Figure 1(a)—is heated above the glass transition temperature to yield crystal seeds. These seeds then undergo further growth, and thus form nanocrystals across the glass volume: see Figure 1(b). Although this in situ method is promising for certain nanocrystals, it runs into significant chemical and physical disadvantages when dealing with UPNCs.
These disadvantages—related to production conditions, solubility restrictions, and post-annealing events—mean that the desired optical properties in hybrid glass are hard to achieve (and they can cause increased light scattering).6 Figure 1. Schematic illustration of active lanthanide ion (Ln3+) distributions in various glasses, i.e., glass containing (a) Ln3+ ions (Ln3+-doped glass), (b) nanocrystals (NCs) grown in situ (glass ceramics), and (c) directly doped NCs (NC-doped glass). The small gray dots in (c) represent the Ln3+ ions in the NC-doped glass that were dissolved from NCs. To pursue high levels of compositional and structural control over UPNCs in glass, we have thus devised a versatile ‘direct doping’ approach6—see Figure 1(c)—as an alternative to the conventional glass ceramic technique. In our direct doping approach, as-synthesized nanoparticles are injected into the molten glass and are then integrated to create a highly controllable hybrid material.7, 8 With this technique, we advantageously combine the flexible selection of glass with the sophisticated synthesis of unique nanocrystals. In this way, we can achieve far more control over the composition, concentration, and nanostructure of UPNCs in glass. In particular, the success of our approach lies in choosing the correct doping temperature and dwell time of the nanocrystals in the glass melt. To ensure the survival and even dispersal of UPNCs across the glass, we first determined a suitable glass melting temperature for doping and dispersing ytterbium- and erbium-doped lithium yttrium fluoride (LiYF4:Yb,Er) nanocrystals. The maximum doping temperature of LiYF4 is given by its decomposition threshold. In addition, we determined the lower limit of the doping temperature from the glass melt viscosity of TZN tellurite (where TZN denotes tellurium dioxide–zinc oxide–sodium oxide, or 75TeO2–15ZnO–10Na2O).
We thus identified the doping temperature window of LiYF4:Yb,Er as 550–625°C, as marked by the blue region in Figure 2(a). At higher temperatures within this window, both the desired dispersion and the detrimental dissolution of nanocrystals in the glass melt are accelerated. By examining three different doping temperatures within the window, we confirmed that 577°C is the optimal temperature for achieving balanced survival and dispersion of UPNCs in glass. Similarly, a prolonged dwell time aids the dispersion of the nanocrystals, but also enhances their dissolution in the glass melt. From three different dwell times (3, 5, and 10min), we demonstrated that the 5min dwell time, at 577°C, was the most promising. Figure 2. Characterization of NC-doped tellurium dioxide–zinc oxide–sodium oxide (TZN) glasses. (a) Differential thermal analysis of lithium yttrium fluoride (LiYF4) NCs, used to determine their decomposition temperature. The blue band represents the doping temperature window that is suitable for directly doping the LiYF4 NCs in a TZN glass melt. Δ: Energy change. Tm: Melting temperature. #: Decomposition threshold. (b) Transmittance spectra of bulk TZN glasses doped with different amounts of LiYF4 NCs. Er: Erbium. (c) Normalized upconversion spectra of Er3+-doped TZN glass, TZN glass doped with 170ppm of LiYF4 NCs (at three different locations: P1, P2, and P3), and a suspension of LiYF4 NCs. a.u.: Arbitrary units. (d) A 3D reconstruction of TZN glass doped with 67ppm of LiYF4 NCs. This is produced by stacking 100 x–y planes (10×10μm) of upconversion images, with a depth increment (i.e., between frames) of 1.5μm. (e) Optical attenuation curves (between 500 and 1300nm) of blank, Er3+-doped, and NC-doped TZN glass fibers. Solid curves represent the data, and the range of the standard error is shown by the shaded regions. We thus used this optimum doping temperature and dwell time as the pre-set conditions to prepare a series of UPNC-doped TZN glasses.
We find—see Figure 2(b)—that all of these samples exhibit high optical transmittance (very close to the maximum transmission of blank TZN glass). According to Rayleigh–Gans–Mie theory,9 we ascribe the negligible amount of light scattering in our glasses to their low doping concentration (i.e., ≤170ppm w/w), the partial dissolution of the nanocrystals, and the absence of serious agglomerations. Furthermore, we obtained almost identical x-ray diffraction patterns and Raman spectra from the blank TZN glass and our UPNC-doped TZN glasses, which suggests that our hybrid glasses retain the original glass network. We have also used optical methods to thoroughly inspect our doped glasses. For example, we used the hypersensitivity of Er3+ emissions—Figure 2(c)—to validate the survival of the UPNCs in the glass and to quantify the dissolution fraction of the doped UPNCs as 30–60%. In addition, we used upconversion scanning confocal microscopy—see Figure 2(d)—to produce the first 3D in situ visualization of UPNC dispersion in glass. To obtain this volumetric 3D imagery, and thus visualize the spatial distribution of UPNCs in TZN glass, we reconstructed 100 scanned x–y planes. We also acquired the light attenuation spectrum (between 500 and 1300nm) of the samples. We observe a loss of 0.28±0.06dB/m for a UPNC-doped TZN glass fiber. This value is intermediate between the loss from a blank TZN fiber (0.35±0.02dB/m) and that from an Er3+-doped TZN glass fiber (0.08±0.06dB/m): see Figure 2(e). These loss results indicate partial dissolution of the nanocrystals and the absence of serious nanocrystal agglomerations in the TZN glass fibers. In summary, we have used our direct doping approach to successfully integrate UPNCs (which have unique properties) into TZN glass fibers. We have thus demonstrated that this new methodology can overcome key obstacles in the conventional glass ceramic technique.
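To put the quoted attenuation figures in perspective, the standard decibel definition (loss in dB = 10·log10(Pin/Pout)) can be used to convert dB/m values into a transmitted-power fraction over a given fiber length. The sketch below is purely illustrative and not part of the original analysis; the function name is our own, and only the three loss values reported above are taken from the article.

```python
import math

def transmitted_fraction(loss_db_per_m: float, length_m: float) -> float:
    """Fraction of input power remaining after a fiber of the given length,
    using the standard dB definition: loss_dB = 10 * log10(Pin / Pout)."""
    return 10 ** (-loss_db_per_m * length_m / 10)

# Loss values reported for the three TZN fiber types (dB/m)
losses = {"blank TZN": 0.35, "Er3+-doped TZN": 0.08, "UPNC-doped TZN": 0.28}
for name, loss in losses.items():
    # Over 1 m of fiber, e.g. 0.28 dB/m leaves ~94% of the input power.
    print(f"{name}: {transmitted_fraction(loss, 1.0):.1%} transmitted over 1 m")
```

This makes clear why a ~0.3dB/m loss is acceptable for short fiber devices: over metre-scale lengths, well over 90% of the launched power survives.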
We now plan to use core-shell nanoparticles (which are surrounded by an additional robust layer of material) to ensure that the nanoparticles remain intact and are better dispersed within the glass. We will also generalize our direct doping approach so that it can be used to embed other nanoparticles (with interesting photonic, electronic, and magnetic properties) in glass, and thus to advance smart glass technology for a wealth of applications, e.g., in biomedical engineering, remote radiation sensing, and 3D volumetric displays. We thank colleagues and collaborators at Macquarie University and the University of Melbourne (Australia) for their contribution to this work. We also gratefully acknowledge financial support from the Australian Research Council (grants DP130102494 and CE140100003), as well as from the Commonwealth and South Australia State Government (funding for the OptoFab node of the Australian National Fabrication Facility).

University of Adelaide

Jiangbo (Tim) Zhao is an associate investigator at the Australian Research Council (ARC) Centre of Excellence in Nanoscale Biophotonics and a research associate at the University of Adelaide. He holds a PhD from Macquarie University, Australia. His research interests lie in the interdisciplinary area of photonics, materials, and biomedical science, and are focused on light–matter interactions, luminescence in solid-state and nanoscale structures, and photonics-based devices for practical applications. Heike Ebendorff-Heidepriem received her PhD in chemistry from the University of Jena, Germany, in 1994. She subsequently held two prestigious fellowships and received the Weyl International Glass Science Award. Between 2001 and 2004 she worked at the Optoelectronics Research Centre at the University of Southampton, UK, and she has been at the University of Adelaide since 2005.
She is currently one of the leaders of the Optical Materials and Structures Theme and is the deputy director of the Institute for Photonics and Advanced Sensing. She is also a senior investigator of the ARC Centre of Excellence in Nanoscale Biophotonics. Her research is focused on the development of novel optical glasses, fibers, surface functionalization, and sensing approaches.

References:
1. A. Nadort, J. Zhao, E. M. Goldys, et al., Lanthanide upconversion luminescence at the nanoscale: fundamentals and optical properties, Nanoscale 8, p. 13099-13130, 2016.
3. J. Wang, R. Deng, M. A. MacDonald, B. Chen, J. Yuan, F. Wang, D. Chi, et al., Enhancing multiphoton upconversion through energy clustering at sublattice level, Nat. Mater. 13, p. 157-162, 2014.
5. A. Herrmann, M. Tylkowski, C. Bocker, C. Rüssel, Cubic and hexagonal NaGdF4 crystals precipitated from an aluminosilicate glass: preparation and luminescence properties, Chem. Mater. 25, p. 2878-2884, 2013.
7. H. Ebendorff-Heidepriem, Y. Ruan, H. Ji, A. D. Greentree, B. C. Gibson, T. M. Monro, Nanodiamond in tellurite glass part I: origin of loss in nanodiamond-doped glass, Opt. Mater. Express 4, p. 2608-2620, 2014.
8. M. R. Henderson, B. C. Gibson, H. Ebendorff-Heidepriem, K. Kuan, S. Afshar V., J. O. Orwa, I. Aharonovich, et al., Diamond in tellurite glass: a new medium for quantum information, Adv. Mater. 23, p. 2806-2810, 2011.

News Article | October 28, 2016

A Microsoft regional director and security developer, Troy Hunt, was contacted early on Tuesday morning by an anonymous person on Twitter who told him he had obtained personal information about him and his wife. “This guy reached out to me and said, ‘Here’s your personal data,’” Hunt said. “There was my name, my email, my phone number, my date of birth, and information about when I had last donated blood.” It didn’t take Hunt long to figure out that the data had come from the Red Cross blood donation form he had filled out online. On Friday the Red Cross Blood Service chief executive, Shelly Park, admitted at a media conference in Melbourne that the data of more than half a million blood donors across Australia had been compromised in a massive security breach, and accessed by an “unauthorised person”. “We learned that a file, containing donor information, which was located on a development website, was left unsecured by a contracted third party who develops and maintains our website,” Park said. Hunt, who founded the website, said the information about his wife provided to him by the man was even more concerning. She had donated blood many more times than he had and there was more information available about her. “Her blood type was in there,” Hunt said. “The details provided by people through the questionnaire were mostly benign, I suppose, things like, ‘Are you under 50kg?’ and, ‘Have you had dental procedures?’ The one which stands out though is, ‘Have you had any risky sexual activity in the last 12 months?’” The man ended up sending Hunt the entire 1.74 GB file he had obtained. Realising how serious the situation was, Hunt immediately contacted AusCERT, a leading computer emergency response team that provides security advice to the Australian public service and not-for-profit sector. AusCERT has now helped the Red Cross Blood Service to contain the data.
“I also asked the person who sent the file to me to delete it immediately,” Hunt said. “He immediately complied. He even screen-capped his delete process showing him deleting. “Of course, he could have made other copies. I also asked him point blank if he had passed it on and he said no, he had not. “All we can do is take him at his word. There is also no evidence he had any malicious intent. There are a lot of people who just scan the internet for information like this. He would have had some software to do this and he would have just been trawling around to see what he could find.” Cyber security experts have told the blood service that the risk of the data being misused was low, and those affected have been told. But Dr Vanessa Teague, a senior lecturer at the department of computing and information systems at the University of Melbourne’s school of engineering, said that reassurance was “cold comfort”. “If one person noticed this data could be accessed, you have no idea how many other people also noticed it but chose not to notify anyone,” she said. “The other thing is that the scientific literature always talks about deidentification of data. But what people try to do is link multiple different data sets together to break people’s privacy. “So one of the worst things about this is the possibility that other data sets that might have been privacy-preserving on their own might be more at risk because of the extra clues given in this blood data set, such as names and addresses, that wouldn’t normally be part of a deidentified data set.” Those affected by the data breach have been sent a text message that reads: “The Blood Service has identified a potential data issue that may affect you,” with a link to the service’s website for more information.
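Teague’s point about linking datasets can be made concrete with a toy sketch: two datasets that are each “deidentified” on their own can still be joined on shared quasi-identifiers such as postcode and birth year. All records, names, and field names below are fabricated for illustration.

```python
# Toy linkage attack: joining two "deidentified" datasets on quasi-identifiers.
# Every record here is invented; nothing is from the actual breach.

leaked_donors = [  # fields of the kind exposed in a breached file
    {"postcode": "3052", "birth_year": 1975, "name": "Alice Example"},
    {"postcode": "3000", "birth_year": 1982, "name": "Bob Sample"},
]

research_release = [  # a separate dataset published without names
    {"postcode": "3052", "birth_year": 1975, "condition": "anaemia"},
    {"postcode": "3121", "birth_year": 1990, "condition": "none"},
]

def link(a, b, keys=("postcode", "birth_year")):
    """Return pairs of records that agree on every quasi-identifier key."""
    index = {tuple(r[k] for k in keys): r for r in a}
    return [(index[tuple(r[k] for k in keys)], r)
            for r in b if tuple(r[k] for k in keys) in index]

for donor, record in link(leaked_donors, research_release):
    print(donor["name"], "->", record["condition"])
```

One matching postcode and birth year is enough to re-attach a name to a “nameless” health record, which is exactly why extra clues in one breached dataset can endanger others.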
Chris Culnane, a University of Melbourne programming languages and human-computer interaction expert, said it was worth pointing out that those who completed the Red Cross online questionnaire had done so before filling out their personal details. “If you answer any of the questions [such as, ‘Have you had a tattoo in the last four months?’ and, ‘Are you pregnant or have you just given birth?’] in the affirmative, you are declined with no further questions being asked and no personal details being taken. “So I assume that the data collected is only from people who have answered no to all of the questions. Due to the structure of the quiz, the linking would have to be on the negation of the answers.”

News Article | October 31, 2016

University of Melbourne researchers are on the cusp of making a real difference by developing a new strain of rice that contains much higher quantities of the essential micronutrients iron and zinc in the grain. This has the potential to reduce chronic malnutrition disorders that can be caused by an over-reliance on rice in the human diet. Some two billion low-income people around the world aren't getting enough vitamins and minerals from their food in what is called "hidden hunger". The World Health Organization estimates that 30 per cent of the world's population is anaemic, in many cases because people simply aren't getting enough iron in their diet. Anaemia leaves people weak and lethargic and it is a significant, even fatal, health risk to pregnant women and children. Similar numbers of people are at risk of not getting enough zinc, resulting in stunted growth and impaired immune function. University of Melbourne plant biologist Dr Alex Johnson and colleagues have created a genetically modified (GM) strain of "biofortified" rice that produces grains with significantly more iron and zinc. In recent field trials the researchers not only beat their targets for increased grain iron and zinc concentration, but the biofortified rice proved to be just as high-yielding as conventionally bred rice varieties. Rice grains usually contain just 2-5 parts per million (ppm) of iron and the researchers needed to increase that to at least 13 ppm to address iron deficiencies in rice-based diets. They managed to get to 15 ppm. Similarly, they had been aiming to increase the amount of zinc from 16 ppm to 28 ppm, but they managed to get to 45 ppm. The results were published earlier this year in Scientific Reports, an open access journal from the prestigious scientific publisher Nature. "The results show that this technology actually works in the field, not just in the glasshouse," says Dr Johnson, from the School of BioSciences.
"We exceeded our biofortification targets and the rice was just as high yielding as existing rice varieties." Crucially, the field testing also showed that while the genetic modification had enabled the biofortified rice to take up more iron and zinc from the soil, it didn't increase the take-up of harmful heavy metals such as cadmium. Nutritional testing of the grain also showed that if we were to eat this rice, our bodies would readily absorb the higher quantity of iron and zinc. The scientists were able to determine this by "feeding" the rice to so-called Caco-2 cells, which are a human cell line that can be grown in the lab to resemble the cells of the small intestine. The biofortified rice was "fed" to the Caco-2 cells by first artificially "digesting" it using enzymes that mimic our own digestive process. "There are no deal-breakers in these results. We have proven our concept in a major variety of rice, and we are now ready to move this into a developing country," says Dr Johnson. "Rice is the staple food for billions of people today and that isn't going to change anytime soon, so rice biofortification is a tool that we can use to address hidden hunger in a huge number of people. "Over time that should lead to healthier and more productive populations in the developing world, boosting local economies and eventually supporting more diverse and balanced diets. "We can and do use vitamin and mineral supplements and food processing to help people suffering from micronutrient deficiencies, but those interventions are recurrent costs and need industrial processing that may not be readily available in developing countries. Biofortification is a sustainable solution because once it's in the seeds you've increased the nutritional quality of the crop itself. The farmer simply needs to plant biofortified seeds."
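Since ppm (w/w) in grain is simply milligrams of micronutrient per kilogram of rice, the reported concentrations translate into rough dietary figures. A sketch of the arithmetic (the 300 g daily serving is an assumed illustration, not a figure from the article):

```python
# Rough dietary arithmetic from the grain concentrations reported above.
# ppm (w/w) = mg of micronutrient per kg of rice; the serving size is assumed.

def mg_per_serving(ppm: float, serving_g: float = 300.0) -> float:
    """Milligrams of micronutrient in one serving of rice."""
    return ppm * serving_g / 1000.0

conventional_iron = mg_per_serving(4.0)    # midpoint of the usual 2-5 ppm
biofortified_iron = mg_per_serving(15.0)   # concentration achieved in the trials
print(f"iron per serving: {conventional_iron:.1f} mg -> {biofortified_iron:.1f} mg")
print(f"zinc per serving: {mg_per_serving(16.0):.1f} mg -> {mg_per_serving(45.0):.1f} mg")
```

Under that assumed serving size, the jump from 4 ppm to 15 ppm of iron takes a serving from about 1.2 mg to 4.5 mg of iron, which is why the field-trial numbers matter for rice-dominated diets.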
Dr Johnson's research has been funded and supported by several partners including the Australian Research Council and the not-for-profit HarvestPlus initiative. HarvestPlus is backed by the Bill and Melinda Gates Foundation and is tackling hidden hunger in developing countries with biofortified crops. Dr Johnson's ambition is that farmers around the world would face no additional cost for adopting the iron and zinc biofortified rice. Dr Johnson, an American who later also became an Australian, did his PhD at Virginia Tech in the US, where he worked to genetically modify potatoes to create resistance to the Colorado potato beetle. At the University of Melbourne he has been working on genetic strategies to boost the iron content of rice since 2009. In 2011, his team identified a specific rice gene that when "switched on" increases the amount of iron taken up from the soil and transported to the grain. Usually this gene is only activated when the rice plant itself is short on iron, but by modifying what drives the gene they were able to keep the gene switched on all the time. "We have basically tricked the plant into thinking it is continuously short of iron." They also found that it increased the uptake of zinc. "It was a dream result," says Dr Johnson. His fascination with plants goes back to his childhood when he was enthralled by seeds growing into something that his family could eat. He remembers following his mother around the garden and impatiently digging up her plants to see what they looked like as they were growing. Now as a scientist, he has had to learn the patience of a good gardener. "Given the huge opportunity we have here to fight human malnutrition, there are times when the project doesn't seem to be going fast enough. But plants can only grow so fast and we need time for replicated field trials in multiple countries. It's important that we fully understand how our biofortified rice grows in as many different environments as possible."
Dr Johnson and his colleagues are now aiming to introduce the iron and zinc biofortified rice into Bangladesh where almost 80 per cent of cultivated land is dedicated to rice, but where more than half of all children and 70 per cent of women are iron-deficient. He says iron biofortified rice could have a huge impact in this country. Another reason that the team is targeting Bangladesh is that it has already released other GM crops such as an eggplant variety that has allowed farmers to drastically reduce their insecticide use. GM crops are controversial because of concerns from some, including Greenpeace, that they may have unforeseen consequences that could eventually harm the environment and pose a health threat. But Dr Johnson says that there is a wealth of information showing that GM crops are safe and notes that more than a hundred Nobel Prize winners, from a range of mostly science disciplines, recently penned a letter asking Greenpeace to end its opposition to genetically modified organisms. "Hidden hunger isn't a hypothetical problem, it is a real problem, and biofortification is a real solution. I've not met anyone who is against that."

News Article | February 15, 2017

Digital Science, a technology company serving the needs of scientific and research communities, today announced Carnegie Mellon University (CMU) as a key customer and development partner. By implementing a suite of products from the Digital Science portfolio, Carnegie Mellon will unveil a solution to capture, analyze and showcase its leading research. Using continuous, automated capture of data from multiple internal and external sources, including publication and associated citation and altmetrics data, grant data, and research data, Carnegie Mellon will be able to provide its faculty, funders and decision-makers with an accurate, timely and holistic picture of the institution’s research. With the goal of championing new forms of scholarly communication, Carnegie Mellon is creating a number of research platforms that will work together to enable innovation and provide opportunities for interactive research among the university's researchers. As part of this effort, the university is building out an ecosystem of support, processes and tools that underpin the full research lifecycle from ideation to dissemination. Carnegie Mellon plans to roll out a suite of tools from Digital Science to its academic community over the coming months. These tools offer a multitude of benefits. “The library is at the heart of the work of the institution and must provide a reimagined ‘intellectual commons’ for a campus community,” said Keith Webster, Dean of University Libraries, Carnegie Mellon. “With this partnership, we have the opportunity to position ourselves as a world leader in the development of the scholarly ecosystem.
Digital Science is central in allowing us to build the best research information system that exists today and we look forward to sharing our experience and expertise with the global academic community.” “Carnegie Mellon is at the forefront of creating a transformative and collaborative research environment that is open to the free exchange of ideas, where research, creativity, innovation, and entrepreneurship flourish,” said Daniel Hook, CEO of Digital Science. “We are very proud indeed to be working with the team at CMU to support their researchers to spend more time on discovery and collaboration. We also look forward to working with them as a development partner to continue to drive this innovation.” *About Digital Science* Digital Science is a technology company serving the needs of scientific and research communities at key points along the full cycle of research. It invests in and incubates research software companies that simplify the research cycle, making more time for discovery. Its portfolio companies include a host of leading brands including Altmetric, BioRAFT, Figshare, IFI CLAIMS Patent Services, Labguru, Overleaf, Peerwith, ReadCube, Symplectic, ÜberResearch, TetraScience and Transcriptic. It is operated by global media company, the Holtzbrinck Publishing Group. Visit and follow @digitalsci on Twitter. *About Altmetric* Altmetric was founded in 2011 with a mission to track and analyze online attention to scholarly literature beyond traditional citations. Altmetric tracks what people are saying about research outputs online and works with some of the biggest publishers, funders, and institutions around the world to deliver this data in an accessible and reliable format. Visit for more information and follow @altmetric on Twitter.
*About ÜberResearch* ÜberResearch, the company behind Dimensions, is a leading provider of software solutions focused on helping funding organizations, non-profits, and governmental institutions make more informed decisions about science funding. The company's cloud-based platform provides better views of an organization's grant data, peer organisation activities, and the data of the funding community at large. The software functions span search and duplication detection to robust tools for reviewer identification and portfolio analysis. For more information, visit: and follow @uberresearch on Twitter. *About Figshare* Figshare is a web-based platform to help academic institutions manage, disseminate and measure the public attention of all their research outputs. The light-touch and user-friendly approach focuses on four key areas: research data management, reporting and statistics, research data dissemination and administrative control. Figshare works with institutions in North America and internationally to help them meet key funder recommendations and to provide world-leading tools to support an open culture of data sharing and collaboration. For more information, visit and follow @figshare on Twitter. *About Symplectic* Symplectic is a leading developer of Research Information Management systems. Founded in 2003, Symplectic’s flagship product Elements is used by over 300,000 researchers, repository managers and librarians at over 80 of the world’s top institutions including the University of Oxford, University of Melbourne, and Duke University. For more information, visit and follow @symplectic on Twitter. *About Carnegie Mellon University* Carnegie Mellon is a private, internationally ranked research university with programs in areas ranging from science, technology and business, to public policy, the humanities and the arts. 
More than 13,000 students in the university's seven schools and colleges benefit from a small student-to-faculty ratio and an education characterized by its focus on creating and implementing solutions for real problems, interdisciplinary collaboration and innovation. Carnegie Mellon's main campus in the United States is in Pittsburgh, Pa. It has campuses in California's Silicon Valley and Qatar, and programs in Africa, Asia, Australia, Europe and Mexico.

News Article | February 15, 2017

It’s as if a switch has been flicked. Evidence is mounting that chronic fatigue syndrome (CFS) is caused by the body swapping to less efficient ways of generating energy. Also known as ME or myalgic encephalomyelitis, CFS affects some 250,000 people in the UK. The main symptom is persistent physical and mental exhaustion that doesn’t improve with sleep or rest. It often begins after a mild infection, but its causes are unknown. Some have argued that CFS is a psychological condition, and that it is best treated through strategies like cognitive behavioural therapy. But several lines of investigation are now suggesting that the profound and painful lack of energy seen in the condition could in many cases be due to people losing their ability to burn carbohydrate sugars in the normal way to generate cellular energy. Instead, the cells of people with CFS stop making as much energy from sugar as usual, and start relying more on lower-yielding fuels, such as amino acids and fats. This kind of metabolic switch produces lactate, which can cause pain when it accumulates in muscles. Together, this would explain both the shortness of energy, and why even mild exercise can be exhausting and painful. Øystein Fluge of Haukeland University Hospital in Bergen, Norway, and his colleagues studied amino acids in 200 people with CFS and 102 people without it. The levels of some amino acids in the blood of women with CFS were abnormally low – specifically the types of amino acid that can be used by the body as an alternative fuel source. These shortfalls were not seen in men with CFS, but that could be because men tend to extract amino acids for energy from their muscles, instead of their blood. And the team saw higher levels of an amino acid that’s a sign of such a process. “It seems that both male and female CFS patients may have the same obstruction in carbohydrate metabolism to energy, but they may try to compensate differently,” says Fluge.
Both sexes had high levels of several enzymes known to suppress pyruvate dehydrogenase (PDH), an enzyme vital for moving carbohydrates and sugars into a cell’s mitochondria – a key step for fully exploiting sugar for energy. Fluge thinks PDH is prevented from working in people with CFS, but that it can spontaneously recover. Several studies have now hinted that defects in sugar burning can cause CFS, but there is still uncertainty over how exactly this is disrupted. However, a picture is emerging. Something makes the body switch from burning sugar to a far less efficient way of making energy. “We don’t think it’s just PDH,” says Chris Armstrong at the University of Melbourne in Australia, whose research has also uncovered anomalies in amino acid levels in patients. “Broadly, we think it’s an issue with sugar metabolism in general.” The result is not unlike starvation, says Armstrong. “When people are facing starvation, the body uses amino acids and fatty acids to fuel energy for most cells in the body, to keep glucose levels vital for the brain and muscles as high as possible.” “We think that no single enzyme in metabolism will be the answer to CFS, just as no single enzyme is the ‘cause’ of something like hibernation,” says Robert Naviaux of the University of California, San Diego, who has found depletion of fatty acids in patients, suggesting they were diverted as fuel. So what could flick the switch to a different method of metabolism? Fluge’s team thinks that a person’s own immune system may stop PDH from working, possibly triggered by a mild infection. His team has previously shown that wiping out a type of white blood cell called B-cells in CFS patients seems to relieve the condition. These white blood cells make antibodies, and Fluge suspects that some antibodies made to combat infections may also recognise something in PDH and disable it.
The team is now conducting a large trial in Norway of the cancer drug rituximab, which destroys the cells that make antibodies, in people with CFS. Results are expected next year. Together, these metabolic approaches are suggesting that CFS has a chemical cause. “It’s definitely a physiological effect that we’re observing, and not psychosomatic, and I’ll put my head on the block on that,” says Armstrong. However, he adds that psychological and brain chemistry factors might be involved in some cases.

SINGAPORE, December 8, 2016 /PRNewswire/ -- Carmentix Private Limited ("Carmentix") and the University of Melbourne are proud to announce the "Preterm Birth Biomarker Discovery" initiative. The goal of this joint...

News Article | November 10, 2016

It’s a case of lofty living meeting Noah’s ark. Gardens atop city buildings can act as refuges for threatened species and help plants colonise the surrounding landscape. For the last six years, a team of Australian conservationists has been growing critically endangered native plants on the roofs of buildings in Melbourne. The plants form unique communities on the volcanic plains of Victoria, but are in severe decline because of agriculture. One of the major benefits of roof gardens is that threatened species don’t have to compete with plants found at ground level, says project leader Nicholas Williams at the University of Melbourne. Moreover, there are no snails or slugs to eat them. Elevation is another advantage, because the seeds of the endangered plants can drift off in the wind and take root in the wider landscape, Williams says. It was in 2010 that the team began growing two native grass and 27 wildflower species from the volcanic plains in four 18-square-metre plots on top of Melbourne’s Pixel building. Instead of soil, a lightweight substrate of volcanic rock was used to improve drainage. Initially the plants were watered to help them establish, but subsequently they survived purely on rainwater. Some of the species thrived and spread across the gardens, while others survived but did not spread. However, there were plants – particularly the smaller species – that did not establish themselves. This suggests that some plants are better suited than others to the urban roof environment, says Williams, who will present the findings at the Australasian Plant Conservation Conference next week. The next step will be to determine how well the roofs perform as seed distributors. The team will test this using a threatened native fireweed, Senecio macrocarpus, recently planted on a roof at the University of Melbourne. 
Conservation efforts should still primarily focus on maintaining plants in their original environment, but green roofs provide useful back-up, says Williams. They are also a cost-effective way of improving storm-water absorption, capturing small amounts of carbon, and providing insulation that reduces the air-conditioning needs of the rooms below, he says. Andrew Lowe at the University of Adelaide, Australia, says that using green roofs to conserve threatened plants is a great idea, although not all species will be able to survive the harsh wind and sunlight on tall buildings. “The other thing is that rooftops are increasingly being used for solar panels, so there might be a bit of a conflict between conservation and renewable energy,” he says. “It’s becoming pretty hot real estate up there.”

News Article | March 21, 2016

The collaboration involving physicists at the Centre for Ultrahigh bandwidth Devices for Optical Systems (CUDOS), an ARC Centre of Excellence headquartered in the School of Physics, and electrical engineers from the School of Electrical and Information Engineering, has been published today in Nature Communications. The team's work resolved a key issue holding back the development of password exchange that can only be broken by violating the laws of physics. Photons are generated in a pair, and detecting one indicates the existence of the other. This allows scientists to manage the timing of photon events so that they always arrive at the time they are expected. Lead author Dr Chunle Xiong, from the School of Physics, said: "Quantum communication and computing are the next generation technologies poised to change the world." "Among a number of quantum systems, optical systems offer particularly easy access to quantum effects. Over the past few decades, many building blocks for optical quantum information processing have developed quickly," Dr Xiong said. "Implementing optical quantum technologies has now come down to one fundamental challenge: having indistinguishable single photons on-demand," he said. "This research has demonstrated that the odds of being able to generate a single photon can be doubled by using a relatively simple technique—and this technique can be scaled up to ultimately generate single photons with 100% probability." CUDOS director and co-author of the paper, Professor Ben Eggleton, said the interdisciplinary research was set to revolutionise our ability to exchange data securely - along with advancing quantum computing, which can search large databases exponentially faster.
"The ability to generate single photons, which form the backbone of technology used in laptops and the internet, will drive the development of local secure communications systems - for safeguarding defence and intelligence networks, the financial security of corporations and governments and bolstering personal electronic privacy, like shopping online," Professor Eggleton said. "Our demonstration leverages the CUDOS Photonic chip that we have been developing over the last decade, which means this new technology is also compact and can be manufactured with existing infrastructure." Co-author and Professor of Computer Systems, Philip Leong, who developed the high-speed electronics crucial for the advance, said he was particularly excited by the prospect of further exploring the marriage of photonics and electronics to develop new architectures for quantum problems. "This advance addresses the fundamental problem of single photon generation and promises to revolutionise research in the area," Professor Leong said. The group—which is now exploring advanced designs and expects real-world applications within three to five years—has involved research with the University of Melbourne, CUDOS nodes at Macquarie University and Australian National University and an international collaboration with Guangdong University of Technology, China.
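The claimed doubling of the single-photon odds is consistent with simple multiplexing statistics: if each of n heralded sources fires with probability p per clock cycle, the chance that at least one fires is 1 - (1 - p)^n. A sketch of that scaling (the p = 0.1 per-source probability is an assumed illustration; the paper's actual scheme and numbers may differ):

```python
# Multiplexing statistics for heralded single-photon sources (illustrative).
# Detecting one photon of a pair "heralds" its partner, so a switch can route
# whichever of n sources succeeds to a common output.

def success_probability(p: float, n: int) -> float:
    """P(at least one of n independent heralded sources fires) = 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

p = 0.1  # assumed per-source heralding probability
for n in (1, 2, 4, 8):
    print(f"{n} multiplexed source(s): {success_probability(p, n):.3f}")
```

With two sources the probability rises from 0.10 to 0.19, nearly double, and it approaches 1 as more sources are multiplexed, matching the article's claim that the technique "can be scaled up".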

News Article | January 4, 2016

In this special feature, we have invited top astronomers to handpick the Hubble Space Telescope image that has the most scientific relevance to them. The images they’ve chosen aren’t always the colorful glory shots that populate the countless “best of” galleries around the internet, but rather their impact comes in the scientific insights they reveal. My all-time favorite astronomical object is the Orion Nebula — a beautiful and nearby cloud of gas that is actively forming stars. I was a high school student when I first saw the nebula through a small telescope and it gave me such a sense of achievement to manually point the telescope in the right direction and, after a fair bit of hunting, to finally track it down in the sky (there was no automatic ‘go-to’ button on that telescope). Of course, what I saw on that long-ago night was an amazingly delicate and wispy cloud of gas in black and white. One of the wonderful things that Hubble does is to reveal the colors of the universe. And this image of the Orion Nebula is our best chance to imagine what it would look like if we could possibly go there and see it up close. So many of Hubble’s images have become iconic, and for me the joy is seeing its beautiful images bring science and art together in a way that engages the public. The entrance to my office features an enormous copy of this image wallpapered on a wall 4m wide and 2.5m tall. I can tell you, it’s a lovely way to start each working day. The impact of the fragments of Comet Shoemaker-Levy 9 with Jupiter in July 1994 was the first time astronomers had advance warning of a planetary collision. Many of the world’s telescopes, including the recently repaired Hubble, turned their gaze onto the giant planet. The comet crash was also my first professional experience of observational astronomy. From a frigid dome on Mount Stromlo, we hoped to see Jupiter’s moons reflect light from comet fragments crashing into the far side of Jupiter.
Unfortunately we saw no flashes of light from Jupiter’s moons. However, Hubble got an amazing and unexpected view. The impacts on the far side of Jupiter produced plumes that rose so far above Jupiter’s clouds that they briefly came into view from Earth. As Jupiter rotated on its axis, enormous dark scars came into view. Each scar was the result of the impact of a comet fragment, and some of the scars were larger in diameter than our moon. For astronomers around the globe, it was a jaw-dropping sight. Image: NASA, ESA and Jonathan Nichols (University of Leicester), CC BY. This pair of images shows a spectacular ultraviolet aurora light show occurring near Saturn’s north pole in 2013. The two images were taken just 18 hours apart, but show changes in the brightness and shape of the auroras. We used these images to better understand how much of an impact the solar wind has on the auroras. We used Hubble photographs like these acquired by my astronomer colleagues to monitor the auroras while using the Cassini spacecraft, in orbit around Saturn, to observe radio emissions associated with the lights. We were able to determine that the brightness of the auroras is correlated with higher radio intensities. Therefore, I can use Cassini’s continuous radio observations to tell me whether or not the auroras are active, even if we don’t always have images to look at. This was a large effort including many Cassini investigators and Earth-based astronomers. This far-ultraviolet image of Jupiter’s northern aurora shows the steady improvement in capability of Hubble’s scientific instruments. The Space Telescope Imaging Spectrograph (STIS) images showed, for the first time, the full range of auroral emissions that we were just beginning to understand. The earlier Wide Field Planetary Camera 2 (WFPC2) camera had shown that Jupiter’s auroral emissions rotated with the planet, rather than being fixed with the direction to the sun, thus Jupiter did not behave like the Earth.
We knew that there were aurora from the mega-ampere currents flowing from Io along the magnetic field down to Jupiter, but we were not certain this would occur with the other satellites. While there were many ultraviolet images of Jupiter taken with STIS, I like this one because it clearly shows the auroral emissions from the magnetic footprints of Jupiter’s moons Io, Europa, and Ganymede, and Io’s emission reveals the height of the auroral curtain. To me it looks three-dimensional. Take a good look at these images of the dwarf planet Pluto, which show detail at the extreme limit of Hubble’s capabilities. A few days from now, they will be old hat, and no one will bother looking at them again. Why? Because in early May, the New Horizons spacecraft will be close enough to Pluto for its cameras to reveal better detail, as the craft nears its 14 July rendezvous. Yet this sequence of images — dating from the early 2000s — has given planetary scientists their best insights to date, the variegated colors revealing subtle variations in Pluto’s surface chemistry. That yellowish region prominent in the center image, for example, has an excess of frozen carbon monoxide. Why that should be is unknown. The Hubble images are all the more remarkable given that Pluto is only 2/3 the diameter of our own moon, but nearly 13,000 times farther away. I once dragged my wife into my office to proudly show her the results of some imaging observations made at the Anglo-Australian Telescope with a (then) new and (then) state-of-the-art 8,192 × 8,192 pixel imager. The images were so large, they had to be printed out on multiple A4 pages, and then stuck together to create a huge black-and-white map of a cluster of galaxies that covered a whole wall. I was crushed when she took one look and said: “Looks like mould”. Which just goes to show that the best science is not always the prettiest. 
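The Pluto comparison above translates directly into an apparent angular size, which shows why these images sit at Hubble's resolution limit. The only assumed input in this sketch is the Moon's roughly half-degree apparent diameter:

```python
# Pluto's apparent size relative to the Moon's:
# (2/3 the diameter) divided by (13,000 times the distance).
moon_angular_arcsec = 0.5 * 3600          # Moon: ~0.5 degrees, in arcseconds
ratio = (2 / 3) / 13_000
pluto_angular_arcsec = moon_angular_arcsec * ratio
print(round(pluto_angular_arcsec, 3))     # about 0.09 arcsec
```

At roughly 0.09 arcseconds, Pluto spans only a few resolution elements of Hubble (whose sharpness is of order 0.05 arcseconds), which is why even these best-possible images show just coarse patches of surface color.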
My choice of the greatest image from HST is another black-and-white image from 2012 that also “looks like mould”. But buried in the heart of the image is an apparently unremarkable faint dot. However, it represents the confirmed detection of the coldest brown dwarf discovered at that time: an object lurking less than 10 parsecs (32.6 light years) from the sun, with a temperature of about 350 Kelvin (77 degrees Celsius) – colder than a cup of tea! And to this day it remains one of the coldest compact objects we’ve detected outside our solar system. NASA/ESA/STScI, processing by Lucas Macri (Texas A&M University). Observations carried out as part of HST Guest Observer program 9810. In 2004, I was part of a team that used the recently installed Advanced Camera for Surveys (ACS) on Hubble to observe a small region of the disk of a nearby spiral galaxy (Messier 106) on 12 separate occasions within 45 days. These observations allowed us to discover over 200 Cepheid variables, which are very useful for measuring distances to galaxies and ultimately determining the expansion rate of the universe (appropriately named the Hubble constant). This method requires a proper calibration of Cepheid luminosities, which can be done in Messier 106 thanks to a very precise and accurate estimate of the distance to this galaxy (24.8 million light-years, give or take 3%) obtained via radio observations of water clouds orbiting the massive black hole at its center (not included in the image). A few years later, I was involved in another project that used these observations as the first step in a robust cosmic distance ladder and determined the value of the Hubble constant with a total uncertainty of three percent. NASA, ESA and H.E. Bond (STScI), CC BY One of the images that excited me most — even though it never became famous — was our first one of the light echo around the strange explosive star V838 Monocerotis. 
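The distance-ladder logic in the Cepheid piece above (pulsation period and brightness give distance; distance and recession velocity give the Hubble constant) can be sketched numerically. The period-luminosity coefficients and the example numbers below are illustrative assumptions, not values from the actual study:

```python
import math

def cepheid_distance_mpc(period_days, apparent_mag):
    """Distance to a Cepheid from its pulsation period and mean brightness.

    Uses an illustrative period-luminosity relation for absolute magnitude,
    M = -2.43 * (log10(P) - 1) - 4.05, then inverts the distance modulus
    m - M = 5 * log10(d_pc) - 5.
    """
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    mu = apparent_mag - abs_mag          # distance modulus
    d_pc = 10 ** (mu / 5.0 + 1.0)        # distance in parsecs
    return d_pc / 1.0e6                  # convert to megaparsecs

def hubble_constant(velocity_km_s, distance_mpc):
    """H0 = v / d for a galaxy carried along by the cosmic expansion."""
    return velocity_km_s / distance_mpc

# Hypothetical 30-day Cepheid observed at mean apparent magnitude 25:
d = cepheid_distance_mpc(30.0, 25.0)     # roughly 11 Mpc
# With an assumed measured recession velocity of 770 km/s:
H0 = hubble_constant(770.0, d)           # roughly 70 km/s/Mpc
```

In practice the calibration step described in the article (an independent geometric distance to Messier 106) is what pins down the zero point of the period-luminosity relation before the ladder can be extended outward.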
Its eruption was discovered in January 2002, and its light echo was discovered about a month later, both from small ground-based telescopes. Although light from the explosion travels straight to the Earth, it also goes out to the side, reflects off nearby dust, and arrives at Earth later, producing the “echo.” Astronauts had serviced Hubble in March 2002, installing the new Advanced Camera for Surveys (ACS). In April, we were one of the first to use ACS for science observations. I always liked to think that NASA somehow knew that the light from V838 was on its way to us from 20,000 light-years away, and got ACS installed just in time! The image, even in only one color, was amazing. We obtained many more Hubble observations of the echo over the ensuing decade, and they are some of the most spectacular of all, and VERY famous, but I still remember being awed when I saw this first one. X-ray: NASA/CXC/Univ of Iowa/P.Kaaret et al.; Optical: NASA/ESA/STScI/Univ of Iowa/P.Kaaret et al., CC BY-NC Galaxies form stars. Some of those stars end their “normal” lives by collapsing into black holes, but then begin new lives as powerful X-ray emitters powered by gas sucked off a companion star. I obtained this Hubble image (in red) of the Medusa galaxy to better understand the relation between black hole X-ray binaries and star formation. The striking appearance of the Medusa arises because it’s a collision between two galaxies – the “hair” is remnants of one galaxy torn apart by the gravity of the other. The blue in the image shows X-rays, imaged with the Chandra X-ray Observatory. The blue dots are black hole binaries. Earlier work had suggested that the number of X-ray binaries is simply proportional to the rate at which the host galaxy forms stars. These images of the Medusa allowed us to show that the same relation holds, even in the midst of galactic collisions. NASA, ESA, the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration, and A. 
Evans (University of Virginia, Charlottesville/NRAO/Stony Brook University), CC BY Some of the Hubble Space Telescope images that appeal to me a great deal show interacting and merging galaxies, such as the Antennae (NGC 4038 and NGC 4039), the Mice (NGC 4676), the Cartwheel galaxy (ESO 350-40), and many others without nicknames. These are spectacular examples of violent events that are common in the evolution of galaxies. The images provide us with exquisite detail about what goes on during these interactions: the distortion of the galaxies, the channeling of gas towards their centers, and the formation of stars. I find these images very useful when I explain to the general public the context of my own research, the accretion of gas by the supermassive black holes at the centers of such galaxies. Particularly neat and useful is a video put together by Frank Summers at the Space Telescope Science Institute (STScI), illustrating what we learn by comparing such images with models of galaxy collisions. Our best computer simulations tell us galaxies grow by colliding and merging with each other. Similarly our theories tell us that when two spiral galaxies collide, they should form a large elliptical galaxy. But actually seeing it happen is another story entirely! This beautiful Hubble image has captured a galaxy collision in action. This doesn’t just tell us that our predictions are good, but it lets us start working out the details because we can now see what actually happens. There are fireworks of new star formation triggered as the gas clouds collide and huge distortions going on as the spiral arms break up. We have a long way to go before we’ll completely understand how big galaxies form, but images like this are pointing the way. This is the highest-resolution view of a collimated jet powered by a supermassive black hole in the nucleus of the galaxy M87 (the biggest galaxy in the Virgo Cluster, 55 million light years from us). 
The jet shoots out of the hot plasma region surrounding the black hole (top left) and we can see it streaming down across the galaxy, over a distance of 6,000 light-years. The white/purple light of the jet in this stunning image is produced by the stream of electrons spiraling around magnetic field lines at a speed of approximately 98% of the speed of light. Understanding the energy budget of black holes is a challenging and fascinating problem in astrophysics. When gas falls into a black hole, a huge amount of energy is released in the form of visible light, X-rays and jets of electrons and positrons traveling almost at the speed of light. With Hubble, we can measure the size of the black hole (a thousand times bigger than the central black hole of our galaxy), the energy and speed of its jet, and the structure of the magnetic field that collimates it. NASA, Jayanne English (University of Manitoba), Sally Hunsberger (Pennsylvania State University), Zolt Levay (Space Telescope Science Institute), Sarah Gallagher (Pennsylvania State University), and Jane Charlton (Pennsylvania State University), CC BY When my Hubble Space Telescope proposal was accepted in 1998 it was one of the biggest thrills of my life. To imagine that, for me, the telescope would capture Stephan’s Quintet, a stunning compact group of galaxies! Over the next billion years Stephan’s Quintet galaxies will continue in their majestic dance, guided by each other’s gravitational attraction. Eventually they will merge, change their forms, and ultimately become one. We have since observed several other compact groups of galaxies with Hubble, but Stephan’s Quintet will always be special because its gas has been released from its galaxies and lights up in dramatic bursts of intergalactic star formation. What a fine thing to be alive at a time when we can build the Hubble and push our minds to glimpse the meaning of these signals from our universe. Thanks to all the heroes who made and maintained Hubble. 
When Hubble was launched in 1990, I was beginning my PhD studies into gravitational lensing, the action of mass bending the paths of light rays as they travel across the universe. Hubble’s image of the massive galaxy cluster, Abell 2218, brings this gravitational lensing into sharp focus, revealing how the massive quantity of dark matter present in the cluster – matter that binds the many hundreds of galaxies together — magnifies the light from sources many times more distant. As you stare deeply into the image, these highly magnified images are apparent as long thin streaks, the distorted views of baby galaxies that would normally be impossible to detect. It gives you pause to think that such gravitational lenses, acting as natural telescopes, use the gravitational pull from invisible matter to reveal amazing detail of the universe we cannot normally see! NASA, ESA, J. Rigby (NASA Goddard Space Flight Center), K. Sharon (Kavli Institute for Cosmological Physics, University of Chicago), and M. Gladders and E. Wuyts (University of Chicago) Gravitational lensing is an extraordinary manifestation of the effect of mass on the shape of space-time in our universe. Essentially, where there is mass the space is curved, and so objects viewed in the distance, beyond these mass structures, have their images distorted. It’s somewhat like a mirage; indeed this is the term the French use for this effect. In the early days of the Hubble Space Telescope, an image appeared of the lensing effects of a massive cluster of galaxies: the tiny background galaxies were stretched and distorted but embraced the cluster, almost like a pair of hands. I was stunned. This was a tribute to the extraordinary resolution of the telescope, operating far above the Earth’s atmosphere. Viewed from the ground, these extraordinary thin wisps of galactic light would have been smeared out and not distinguishable from the background noise. 
My third-year astrophysics class explored the 100 Top Shots of Hubble, and they were most impressed by the extraordinary but true colors of the clouds of gas. However, I cannot go past an image displaying the effect of mass on the very fabric of our universe. NASA, ESA, J. Richard (Center for Astronomical Research/Observatory of Lyon, France), and J.-P. Kneib (Astrophysical Laboratory of Marseille, France), CC BY With General Relativity, Einstein postulated that matter changes space-time and can bend light. A fascinating consequence is that very massive objects in the universe will magnify light from distant galaxies, in essence becoming cosmic telescopes. With the Hubble Space Telescope, we have now harnessed this powerful ability to peer back in time to search for the first galaxies. This Hubble image shows a hive of galaxies that have enough mass to bend light from very distant galaxies into bright arcs. My first project as a graduate student was to study these remarkable objects, and I still use the Hubble today to explore the nature of galaxies across cosmic time. To the human eye, the night sky in this image is completely empty: a tiny region no thicker than a grain of rice held at arm’s length. The Hubble Space Telescope was pointed at this region for 12 full days, letting light hit the detectors; slowly, one by one, the galaxies appeared, until the entire image was filled with 10,000 galaxies stretching all the way across the universe. The most distant are tiny red dots tens of billions of light years away, dating back to a time just a few hundred million years after the Big Bang. The scientific value of this single image is enormous. It revolutionized our theories both of how early galaxies could form and how rapidly they could grow. The history of our universe, as well as the rich variety of galaxy shapes and sizes, is contained in a single image. 
To me, what truly makes this picture extraordinary is that it gives a glimpse into the scale of our visible universe. So many galaxies in so small an area implies that there are 100 thousand million galaxies across the entire night sky. One entire galaxy for every star in our Milky Way! NASA, ESA, and J. Lotz, M. Mountain, A. Koekemoer, and the HFF Team (STScI), CC BY This is what Hubble is all about. A single, awe-inspiring view can unmask so much about our Universe: its distant past, its ongoing assembly, and even the fundamental physical laws that tie it all together. We’re peering through the heart of a swarming cluster of galaxies. Those glowing white balls are giant galaxies that dominate the cluster center. Look closely and you’ll see diffuse shreds of white light being ripped off of them! The cluster is acting like a gravitational blender, churning many individual galaxies into a single cloud of stars. But the cluster itself is just the first chapter in the cosmic story being revealed here. See those faint blue rings and arcs? Those are the distorted images of other galaxies that sit far in the distance. The immense gravity of the cluster causes the space-time around it to warp. As light from distant galaxies passes by, it’s forced to bend into weird shapes, like a warped magnifying glass would distort and brighten our view of a faint candle. Leveraging our understanding of Einstein’s General Relativity, Hubble is using the cluster as a gravitational telescope, allowing us to see farther and fainter than ever before possible. We are looking far back in time to see galaxies as they were more than 13 billion years ago! As a theorist, I want to understand the full life cycle of galaxies – how they are born (small, blue, bursting with new stars), how they grow, and eventually how they die (big, red, fading with the light of ancient stars). Hubble allows us to connect these stages. 
Some of the faintest, most distant galaxies in this image are destined to become monster galaxies like those glowing white in the foreground. We’re seeing the distant past and the present in a single glorious picture. Tanya Hill, Honorary Fellow of the University of Melbourne and Senior Curator (Astronomy), Museum Victoria. This article was originally published on The Conversation. Read the original article.
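As a coda to the deep-field piece above: the jump from 10,000 galaxies in a rice-grain patch of sky to 100 thousand million across the whole sky is simple solid-angle arithmetic. The only number assumed in this sketch is the patch size, taken as roughly 11 square arcminutes (about the footprint of Hubble's Ultra Deep Field):

```python
import math

galaxies_in_patch = 10_000                # count quoted in the article
patch_sq_arcmin = 11.0                    # assumed field size, sq. arcmin

# Whole sky: 4*pi steradians, converted to square arcminutes.
full_sky_sq_deg = 4 * math.pi * (180.0 / math.pi) ** 2   # ~41,253 deg^2
full_sky_sq_arcmin = full_sky_sq_deg * 3600.0

total_galaxies = galaxies_in_patch * full_sky_sq_arcmin / patch_sq_arcmin
print(f"{total_galaxies:.1e}")            # on the order of 1e11
```

The extrapolation assumes the deep field is a typical patch of sky, which is reasonable because galaxies are distributed roughly uniformly on very large scales.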

News Article | October 26, 2016

A University of Melbourne researcher has found that over one-third of Americans report health problems, from asthma attacks to migraine headaches, when exposed to common fragranced consumer products such as air fresheners, cleaning supplies, laundry products, scented candles, cologne, and personal care products. The study also found that fragranced products may affect profits, with more than 20% of respondents reporting that they enter a business but leave as quickly as possible if they smell air fresheners or another fragranced product. More than twice as many customers would choose hotels and airplanes without fragranced air rather than with it. In the workplace, over 15% of the population lost workdays or a job due to fragranced product exposure. Over 50% of Americans surveyed would prefer fragrance-free workplaces. And over 50% would prefer that health care facilities and professionals were fragrance-free. The research was conducted by Professor Anne Steinemann, from the University of Melbourne School of Engineering, who is a world expert on environmental pollutants, air quality, and health effects. Professor Steinemann conducted a nationally representative population survey in the United States, using a random sample of 1,136 adults from a large web-based panel held by Survey Sampling International (SSI). The results are published in the international journal Air Quality, Atmosphere & Health. When exposed to fragranced products, 34.7% of Americans suffer adverse health effects, such as breathing difficulties, headaches, dizziness, rashes, congestion, seizures, nausea, and a range of other physical problems. For half of these individuals, effects are potentially disabling, as defined by the Americans with Disabilities Act. "This is a huge problem; it's an epidemic," says Professor Steinemann. Fragranced products are pervasive in society, and over 99% of Americans are regularly exposed to fragranced products from their own use or others' use. 
Reports of adverse health effects were similarly frequent and wide-ranging across all types of fragranced products. "Basically, if it contained a fragrance, it posed problems for people," Professor Steinemann said. Professor Steinemann is especially concerned with involuntary exposure to fragranced products, or what she calls "secondhand scents." She found over 20% of the population suffer health problems around air fresheners or deodorizers, and over 17% can't use public restrooms that have air fresheners. In addition, over 14% of the population wouldn't wash their hands with soap if it was fragranced. Over 12% of the population experience health problems from the scent of laundry products vented outdoors, over 19% from being in a room cleaned with scented products, and over 23% from being near someone wearing a fragranced product. More generally, over 22% of Americans surveyed can't go somewhere because exposure to a fragranced product would make them sick. "These findings have enormous implications for businesses, workplaces, care facilities, schools, homes, and other private and public places," said Professor Steinemann. For instance, a growing number of lawsuits under the Americans with Disabilities Act concern involuntary and disabling exposure to fragranced products. Professor Steinemann's earlier research found that fragranced products – even those called green, natural, and organic – emitted hazardous air pollutants. However, fragranced consumer products sold in the US (and other countries) are not required to list all ingredients on their labels or material safety data sheets. Nearly two-thirds of the population surveyed were not aware of this lack of disclosure, and would not continue to use a fragranced product if they knew it emitted hazardous air pollutants. Professor Steinemann's research continues to investigate why fragranced product emissions are associated with such a range of adverse and serious health effects. 
In the meantime, for solutions, Professor Steinemann suggests using products that do not contain any fragrance (including masking fragrance, which unscented products may contain). She also recommends fragrance-free policies within buildings and other places. "It's a relatively simple and cost-effective way to reduce risks and improve air quality and health," she explains. Professor Steinemann has also completed a survey of the Australian population, with results expected to be published soon. "The numbers are similarly striking," she said. The full article is available free of charge on the journal website: http://dx.

SINGAPORE, December 8, 2016 /PRNewswire/ -- Carmentix Private Limited ("Carmentix") and the University of Melbourne are proud to announce the Preterm Birth Biomarker Discovery initiative. The aim of this joint clinical study is to validate, in a combined panel, novel biomarkers discovered by Carmentix together with biomarkers previously discovered and validated at the University of Melbourne, in order to assess the risk of preterm birth from week 20 of pregnancy. The retrospective study, led by Dr. Harry Georgiou, PhD, and Dr. Megan Di Quinzio, MD, at the University of Melbourne, will evaluate the statistical power of the novel biomarker panel. "Carmentix is excited to begin this collaboration as we seek to further develop the biomarkers discovered on our unique data-processing platform," said Dr. Nir Arbel, CEO of Carmentix. "If validated, the new biomarker panel could strengthen hopes of a substantial worldwide reduction in the number of preterm births." As a clinical obstetrician and researcher, Dr. Di Quinzio often encounters mothers who ask: "Why was my baby born too early?" Often there is no satisfying answer. "Preterm birth remains a global health problem, and sadly there is a lack of reliable diagnostic tools," said Dr. Georgiou, scientific lead at the University of Melbourne. "This collaborative initiative with a commercial partner will help pave the way for a novel approach to better diagnosis and, hopefully, contribute to the prevention of preterm labor." Carmentix is a Singapore-based start-up backed by Esco Ventures. 
Carmentix is currently developing a novel prognostic biomarker panel intended to significantly reduce the number of preterm births by giving clinicians biomolecular tools that flag the risk of preterm birth weeks before symptoms appear. Carmentix's technology rests on extensive pathway analyses employing a unique panel of biomarkers. This panel of proprietary markers is expected to allow preterm birth to be predicted between weeks 16 and 20 of pregnancy, with a highly accurate predictive algorithm anticipated because the panel covers molecular bottleneck processes involved in preterm birth. Carmentix's goal is a cost-effective solution that is robust, accurate, and adaptable to clinical settings worldwide. About the University of Melbourne and its commercialization initiatives: The University of Melbourne is Australia's best university and one of the world's leading institutions. As a center for research and development with world-leading specialists in science, engineering, and medicine, Melbourne conducts cutting-edge research to generate new ideas, new technologies, and new knowledge for building a better future. World-class research, real-world solutions: the University of Melbourne is committed to a culture of innovation, working with industry, government, non-governmental organizations, and the community to meet real-world challenges. 
Our commercial partnerships bring research to life through collaborations in bio-engineering, materials development, medical technology innovation, community capacity building, and cultural enterprises. Groundbreaking, commercialized technologies created at the University of Melbourne include the cochlear implant, the Stentrode (a device that allows a computer, robotic limb, or exoskeleton to be controlled by thought), and novel anti-fibrotic drug candidates for the treatment of fibrosis (common in chronic conditions such as chronic kidney disease, chronic heart failure, pulmonary fibrosis, and arthritis). The University of Melbourne maintains close partnerships with the Peter Doherty Institute for Infection and Immunity, the Walter and Eliza Hall Institute, CSIRO, CSL, and The Royal Melbourne, Royal Children's and Royal Women's Hospitals. Drawing on more than 160 years of leadership in education and research, the university responds to the immediate and future challenges facing our society as science advances. The University of Melbourne is ranked number 1 in Australia and 31st in the world (Times Higher Education World University Rankings 2015-2016).

News Article | June 11, 2016

Light-emitting particles can now be embedded into smart glass panes, scientists have revealed. Scientists in Australia have developed glass panes that can display information and emit light while keeping the natural properties of the glass. They said the answer lies in preserving the glass's overall transparency and its ability to be molded into shape, something they have now achieved through the use of nanoparticles. University of Adelaide researchers embedded nanoparticles into the glass panes by synthesizing the glass and the nanoparticles separately, then integrating them under specific pre-set conditions, a process they dubbed "direct-doping." Combining the particles carefully allows both materials to retain much of their natural properties, including transparency and the flexibility to be processed into fine optical fibers. The researchers said the process is simpler than earlier methods of smart glass development. They believe the breakthrough would be beneficial in neuroscience: fluorescent nanoparticles could help guide glass pipettes to precise brain regions, so neuroscientists would no longer need the tedious process of applying dye and lasers to hit specific areas of the brain. Glass doped with nanoparticles could also serve as remote radiation sensors in nuclear facilities. "Integrating these nanoparticles into glass, which is usually inert, opens up exciting possibilities for new hybrid materials and devices that can take advantage of the properties of nanoparticles in ways we haven't been able to do before," said lead researcher Tim Zhao, who is also a physicist at the School of Physical Sciences at the University of Adelaide and Institute for Photonics and Advanced Sensing (IPAS). The technology has a wealth of applications. For now, the researchers are working on biomedical imaging, biological sensing and three-dimensional volumetric displays, and they believe the uses can be expanded further. 
Nanoparticles with useful electronic, photonic and magnetic properties could be incorporated in the same way. "We are heading towards a whole new world of hybrid glass and devices for light-based technologies," said IPAS deputy director Heike Ebendorff-Heidepriem, who also acted as the project leader. Nanoparticles have been proven multiple times to have properties that are beneficial across a wide variety of applications. In a recent Tech Times report, researchers from the Massachusetts Institute of Technology established nanoparticle catalysts as an alternative to more expensive precious metals. To assess the efficacy and safety of nanoparticles for consumers, a group of researchers also developed an imaging technique to identify and visualize engineered nanoparticles in tissue. The research was conducted in collaboration with researchers from the University of Melbourne and Macquarie University and was published in the journal Advanced Optical Materials on May 30. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.

News Article | April 6, 2016

It is the elephant in the room for dark-matter research: a claimed detection that is hard to believe, impossible to confirm and surprisingly difficult to explain away. Now, four instruments that will use the same type of detector as the collaboration behind the claim are in the works or poised to go online. Within three years, the experiments will be able to either confirm the existence of dark matter — or rule the claim out once and for all, say the physicists who work on them. “This will get resolved,” says Frank Calaprice of Princeton University in New Jersey, who leads one of the efforts. The original claim comes from the DAMA collaboration, whose detector sits in a laboratory deep under the Gran Sasso Massif, east of Rome. For more than a decade, it has reported overwhelming evidence [1] for dark matter, an invisible substance thought to bind galaxies together through its gravitational attraction. The first of the new detectors to go online, in South Korea, is due to start taking data in a few weeks. The others will follow over the next few years in Spain, Australia and, again, Gran Sasso. All will use sodium iodide crystals to detect dark matter, which no full-scale experiment apart from DAMA’s has done previously. Scientists have substantial evidence that dark matter exists and is at least five times as abundant as ordinary matter. But its nature remains a mystery. The leading hypothesis is that at least some of its mass is composed of weakly interacting massive particles (WIMPs), which on Earth should occasionally bump into an atomic nucleus. DAMA’s sodium iodide crystals should produce a flash of light if this happens in the detector. And although natural radioactivity also produces such flashes, DAMA’s claim to have detected WIMPs, first made in 1998, rests on the fact that the number of flashes produced per day has varied with the seasons. 
This, they say, is exactly what is expected if the signal is produced by WIMPs that rain down on Earth as the Solar System moves through the Milky Way’s dark-matter halo [2]. In this scenario, the number of particles crossing Earth should peak when the planet’s orbital motion lines up with that of the Sun, in early June, and should hit a low when its motion works against the Sun’s, in early December. There is one big problem. “If it’s really dark matter, many other experiments should have seen it already,” says Thomas Schwetz-Mangold, a theoretical physicist at the Karlsruhe Institute of Technology in Germany — and none has. But at the same time, all attempts to find weaknesses in the DAMA experiment, such as environmental effects that the researchers had not taken into account, have failed. “The modulation signal is there,” says Kaixuan Ni at the University of California, San Diego, who works on a dark-matter experiment called XENON1T. “But how to interpret that signal — whether it’s from dark matter or something else — is not clear.” No other full-scale experiment has used sodium iodide in its detector, although the Korea Invisible Mass Search (KIMS), in South Korea, used caesium iodide. So the possibility remains that dark matter interacts with sodium in a different way to other elements. “Not until someone has turned on a detector made of the same material will you grow convinced that nothing is there,” says Juan Collar at the University of Chicago, Illinois, who has worked on several dark-matter experiments. Many have found it challenging to grow sodium iodide crystals with the required purity. Contamination by potassium, which has a naturally occurring radioactive isotope, is a particular problem. But now three dark-matter-hunting teams — KIMS; DM-Ice, run from Yale University in New Haven, Connecticut; and ANAIS, at the University of Zaragoza, Spain — have each obtained crystals with about twice the level of background radioactivity of DAMA’s. 
That is pure enough to test its results, they say. The KIMS and DM-Ice teams have built a sodium iodide detector together at Yangyang Underground Laboratory, 160 kilometres east of Seoul. This instrument uses an ‘active veto’ sensor that will enable it to separate the dark-matter signal from the noise better than DAMA does, says Yeongduk Kim, the director of South Korea’s Center for Underground Physics in Daejeon, which manages KIMS. ANAIS is building a similar detector in the Canfranc Underground Laboratory in the Spanish Pyrenees. Together, KIMS/DM-Ice and ANAIS will have about 200 kilograms of sodium iodide, and they will pool their data. That is comparable to DAMA’s 250 kilograms, enabling them to catch a similar number of WIMPs, they say. Even though the newer detectors will have higher levels of background noise, it should still be possible to either falsify or reproduce the very large DAMA signal, says Reina Maruyama of Yale, who leads DM-Ice. But Calaprice argues that high purity is more important than mass. He and his collaborators have developed a technique to lower contamination, and in January announced that they were the first to obtain crystals purer than DAMA’s. He expects to reduce the background levels further, to one-tenth of DAMA’s. The project, SABRE (Sodium-iodide with Active Background Rejection), will put one detector at Gran Sasso and the other at the Stawell Underground Physics Laboratory, which is being built in a gold mine in Victoria, Australia. SABRE will also use a sensor to pull out the dark-matter signal from noise, and will have a total mass of 50 kilograms. SABRE should complete its research and development stage in about a year, and will build its detectors soon after that, says Calaprice. It will then make its technology available to other labs — something that DAMA did not do. 
And having twin detectors in both the Northern and Southern hemispheres could clarify whether environmental effects could have mimicked dark matter’s seasonality in DAMA’s results — if the signal is from WIMPs, then both detectors should see peaks at the same time. DAMA will wait at least until 2017 to release its latest results, says spokesperson Rita Bernabei of the University of Rome Tor Vergata. She is not holding her breath about the upcoming sodium iodide detectors. “Our results have already been verified in countless cross-checks in 14 annual cycles, so we have no reason to get excited about what others may do,” she says. If other experiments do not see the annual modulation, she adds, her collaboration will conclude that they do not yet have sufficient sensitivity. Could the teams prove DAMA right? “I was unwilling to believe the DAMA results or even take them seriously at first,” says Katherine Freese, a theoretical astroparticle physicist at the University of Michigan in Ann Arbor, who with her collaborators first proposed the seasonal modulation technique used by DAMA [2]. But, as DAMA’s data have accumulated, and no other explanation for their signal has arisen, Freese is now excited by the possibility that dark matter may have been discovered after all. The fact that many have tried and failed to repeat DAMA’s experiment shows that it is not easy, says Elisabetta Barberio at the University of Melbourne, who leads the Australian arm of SABRE. “The more one looks into their experiment, the more one realizes that it is very well done.”
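The annual-modulation signature described above is conventionally modelled as a small cosine term on top of a constant event rate, peaking around 2 June. A minimal sketch in Python, using illustrative numbers rather than DAMA's fitted values:

```python
import math

def expected_rate(day_of_year, r0=1.0, amplitude=0.02, peak_day=153):
    """Expected event rate (arbitrary units) on a given day of the year.

    r0, amplitude and peak_day are placeholders, not DAMA's fitted values;
    day 153 (2 June) is when Earth's orbital velocity adds most strongly
    to the Sun's motion through the galactic dark-matter halo.
    """
    phase = 2 * math.pi * (day_of_year - peak_day) / 365.25
    return r0 + amplitude * math.cos(phase)

# The rate should peak in early June and bottom out in early December.
june = expected_rate(153)      # maximum of the modulation
december = expected_rate(336)  # minimum, half an orbit later
```

Fitting such a curve to the measured daily rate, and checking that the best-fit phase really lands in June, is essentially what the sodium iodide experiments will do with their data.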

News Article | March 22, 2016

With enough computing effort, most contemporary security systems will be broken. But a research team at the University of Sydney has made a major breakthrough in generating single photons (light particles) as carriers of quantum information in security systems. The collaboration, involving physicists at the Centre for Ultrahigh-bandwidth Devices for Optical Systems (CUDOS), an ARC Centre of Excellence headquartered in the School of Physics, and electrical engineers from the School of Electrical and Information Engineering, has been published in Nature Communications. The team's work resolved a key issue holding back the development of password exchange that can only be broken by violating the laws of physics. Photons are generated in a pair, and detecting one indicates the existence of the other. This allows scientists to manage the timing of photon events so that they always arrive at the time they are expected. Lead author Dr. Chunle Xiong, from the School of Physics, said: "Quantum communication and computing are the next generation technologies poised to change the world." "Among a number of quantum systems, optical systems offer particularly easy access to quantum effects. Over the past few decades, many building blocks for optical quantum information processing have developed quickly," Xiong said. "Implementing optical quantum technologies has now come down to one fundamental challenge: having indistinguishable single photons on-demand," he said. "This research has demonstrated that the odds of being able to generate a single photon can be doubled by using a relatively simple technique — and this technique can be scaled up to ultimately generate single photons with 100 percent probability." CUDOS director and co-author of the paper, Professor Ben Eggleton, said the interdisciplinary research was set to revolutionize our ability to exchange data securely — along with advancing quantum computing, which can search large databases exponentially faster. 
"The ability to generate single photons, which form the backbone of technology used in laptops and the Internet, will drive the development of local secure communications systems — for safeguarding defense and intelligence networks, the financial security of corporations and governments and bolstering personal electronic privacy, like shopping online," Professor Eggleton said. "Our demonstration leverages the CUDOS Photonic chip that we have been developing over the last decade, which means this new technology is also compact and can be manufactured with existing infrastructure." Co-author and Professor of Computer Systems, Philip Leong, who developed the high-speed electronics crucial for the advance, said he was particularly excited by the prospect of further exploring the marriage of photonics and electronics to develop new architectures for quantum problems. "This advance addresses the fundamental problem of single photon generation — and promises to revolutionize research in the area," Professor Leong said. The group — which is now exploring advanced designs and expects real-world applications within three to five years — has involved research with the University of Melbourne, CUDOS nodes at Macquarie University and the Australian National University, and an international collaboration with Guangdong University of Technology, China.
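The article does not spell out the "relatively simple technique", but the quoted scaling (odds "doubled", then "scaled up to ultimately generate single photons with 100 percent probability") matches the generic arithmetic of multiplexing several heralded pair sources. A sketch with a hypothetical per-source probability:

```python
def success_probability(p_single, n_sources):
    """Chance that at least one of n heralded pair sources fires.

    p_single is a hypothetical per-attempt heralding probability; a
    multiplexing switch routes whichever source fired onto the common
    output, so overall success is 1 - (1 - p)^n.
    """
    return 1 - (1 - p_single) ** n_sources

p = 0.25                            # assumed single-source probability
print(success_probability(p, 1))    # 0.25
print(success_probability(p, 2))    # 0.4375 -- nearly doubled
print(success_probability(p, 16))   # ~0.99, approaching on-demand operation
```

For small p, two sources come close to doubling the odds, and adding sources pushes the success rate arbitrarily close to one, which is the "scaled up to 100 percent" claim in the quote.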

News Article | December 11, 2016

The internet of things (IoT) – that ever-expanding ecosystem of digital sensors, home appliances and wearable smart devices – attracts its fair share of attention. Speculation is rife on how the 23bn-odd (and counting) “things” will improve quality of life, streamline business operations and ultimately fuel economic benefits to the tune of up to $11tn per year by 2025. Less often considered is the cost to the environment of such a vast network of devices. With the full extent of the IoT far from being realised, even experts are divided on whether it will spell doom or salvation for the environment. One thing that experts can agree on is that we shouldn’t wait around to find out. “The internet of things will be the biggest, most sophisticated piece of equipment that we’ve deployed across the planet – ever,” says telecommunications expert Kerry Hinton, former director of the Centre for Energy Efficient Telecommunications at the University of Melbourne. “That means that we’ve got to think about the potential limitations on it due to power consumption, the use of rare earth elements – all of that – from day one.” According to Hinton, how energy hungry the IoT will be largely depends on the types of devices deployed and what they will be doing. At one end of the spectrum, low-power, low-data transmitting devices – such as sensors that monitor when vending machines need a refill – are unlikely to send energy bills through the roof. Indeed, many of these simple devices won’t tap into a building’s mains power at all. Long-lasting batteries will do most of the work and devices that can power themselves, by tapping into sunlight, vibrations or heat, are also in development. However, Hinton and others foresee an ecosystem of increasingly complex and energy-hungry devices emerging. Devices using video surveillance are a good example. 
Not only will these devices require mains power to function, they will also contribute significantly to the growth in data coursing through the internet’s veins. According to Cisco’s visual networking index, an ongoing survey of data-consumption trends, internet video surveillance traffic almost doubled between 2014 and 2015, and is set to increase tenfold by 2020. The problem of energy consumption will be a pernicious one, says Hinton. “These technologies on a device-by-device basis, or even a house-by-house basis, are not a significant additional contribution to overall power consumption,” he says. Multiply that across Australia though and “that’s going to boil down to another power station or another two power stations”. Far from being energy gluttons though, IoT devices could contribute to substantial energy and water savings, according to Bettina Tratz-Ryan, green IT specialist and research vice-president at Gartner. “Concepts like energy harvesting are a huge component of innovation that the IoT, specifically, can drive,” she says. In addition, sensors will allow smart buildings to ramp up temperature controls when needed, dim lights when nobody’s around and alert maintenance crews to water leaks as soon as they happen. This is exactly the kind of application fledgling IoT company SkyGrid is developing. “There’s a lot of gimmicky stuff out there but we’re interested in something that changes and improves the world,” says the company’s chief executive, Rory Gleeson. SkyGrid, which is based in Melbourne and Sydney, is developing a smart hot-water system in partnership with hot-water company Quantum Energy. The aim is to intelligently control when a building’s hot-water systems are switched on, so that energy isn’t wasted heating water when no one is around to use it – something that currently wastes as much as 50% of a system’s power. Of course the “things” are only one component of the IoT. 
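Hinton's "another power station or two" point is a back-of-envelope aggregation: loads that are trivial per household become significant nationally. A sketch of that arithmetic, with every figure an assumption chosen purely for illustration:

```python
# Back-of-envelope version of Hinton's point: modest per-home loads add up
# nationally. Every figure here is an illustrative assumption, not data
# from the article or from any survey.
watts_per_household = 50        # assumed continuous draw of complex IoT gear
households = 9_000_000          # rough count of Australian households (assumed)
power_station_mw = 500          # capacity of a mid-sized plant (assumed)

total_mw = watts_per_household * households / 1_000_000
stations_needed = total_mw / power_station_mw
print(f"{total_mw:.0f} MW continuous, ~{stations_needed:.1f} power stations")
```

Under these assumptions a continuous 50 W per home works out to hundreds of megawatts of standing demand, roughly one extra mid-sized plant; doubling the per-home draw gives Hinton's "two power stations".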
The sheer volume of data being transmitted and stored is also set to explode. Data storage has become more energy efficient over recent years. Instead of being relegated to servers held in energy-inefficient company backrooms, data is increasingly stored and processed in the cloud. That’s to say, in large server farms operated by tech giants who have an interest in keeping energy consumption (and costs) to a minimum. Tech giants are turning to renewable energy to lessen their carbon footprint, according to Greenpeace’s 2015 Clicking Clean report. Apple’s data centres, for instance, boast 100% renewable energy, with Yahoo (73% renewables), Facebook (49%) and Google (46%) also improving their green credentials with renewables. (By contrast, Australia-based data centres for HP, IBM and Microsoft get 74% of their energy from coal-fired power). Companies are also strategically locating their data centres for improved energy efficiency. In 2013, for example, Facebook opened a data centre in northern Sweden that is cooled with outside air and runs off local hydroelectric power. Because of these efficiencies, the deluge of data from the IoT, if stored in the cloud, won’t have a huge impact on energy consumption. But much of the IoT won’t be run off the cloud, says Hinton. Applications that require rapid data access and response times – such as health monitors and autonomous vehicles – will need data to be stored locally, and efficiency gains from offshoring of data storage could diminish. Keeping data local isn’t all bad, according to Tratz-Ryan. “Devices are talking to each other without the data being pushed back into the network, which uses energy, which produces carbon,” she says. The Melbourne-based IoT company Freestyle takes advantage of this decentralisation of data to make more responsive energy grids. 
“It’s taking the intelligence away from centralised control and letting the devices make decisions on near-real-time events,” says its business development general manager, Brad Affleck. Freestyle has partnered with engineering firm PowerTec, South Australian utilities provider SA Power Networks and the University of Adelaide on an intelligent energy grid for Kangaroo Island in South Australia. Sensors and controllers in the grid intelligently manage energy sources to sway energy consumption towards renewables without sacrificing the reliability of the supply. With so many factors on either side of the environment ledger, crunching the numbers to determine whether our connected lives are good or bad for the environment is no mean feat. But the Global e-Sustainability Initiative (GeSI), an international consortium of tech companies and telcos, has attempted just that. In 2015, GeSI released its #SMARTer2030 report, which suggests that information and communications technologies, including the IoT, will be able to save almost 10 times the carbon dioxide emissions that they generate by 2030 through reduced travel, smart buildings and greater efficiencies in manufacturing and agriculture. While Tratz-Ryan is optimistic that this vision of the future is achievable, Hinton isn’t convinced. “The tricky bit,” he says, “is you’ve got to get people to do it” – a “non-trivial exercise” that will require significant public policy intervention. For Tratz-Ryan, policy is only one piece of the puzzle. More important will be peer pressure that encourages organisations and individuals to behave in a socially responsible way. “Policies are not enough to drive energy efficiency and climate change initiatives,” she says. “It has to come from the user community and it has to come from industry.”

News Article | November 22, 2016

PARKVILLE, AUSTRALIA, November 22, 2016-- Dr. James Angus has been included in Marquis Who's Who. As in all Marquis Who's Who biographical volumes, individuals profiled are selected on the basis of current reference value. Factors such as position, noteworthy accomplishments, visibility, and prominence in a field are all taken into account during the selection process. Dr. Angus has been an Honorary Professorial Fellow and Professor Emeritus of the Department of Pharmacology and Therapeutics, Faculty of Medicine, Dentistry and Health Sciences at the University of Melbourne since 2014. From 2003 to 2013, Dr. Angus was Dean of the Faculty of Medicine, Dentistry and Health Sciences. Dr. Angus earned a Bachelor of Science in pharmacology with honors and a Ph.D. from the University of Sydney. In 1974, he was an NHMRC Senior Research Officer at the Hallstrom Institute of Cardiology, Royal Prince Alfred Hospital & Department of Medicine at the University of Sydney, and the Baker Medical Research Institute in Prahran, Victoria. In 1977 he received the NHMRC CJ Martin Travelling Fellowship to work with Sir James Black, who would go on to receive the Nobel Prize for Medicine in 1988. Dr. Angus then continued to work at the Baker Medical Research Institute in a variety of roles over the next 15 years, including Senior Research Officer, Research Fellow, Senior Research Fellow, Principal Research Fellow, and Senior Principal Research Fellow of the National Health and Medical Research Council. He was ultimately named Deputy Director of the Baker Medical Research Institute in 1992. In 1993, Dr. 
Angus was appointed to the Chair of Pharmacology at the University of Melbourne. His appointments include Chair of the Melbourne Genomics Health Alliance Phase 1 (2014-2016), Governor and Director of the Florey Institute of Neuroscience and Mental Health, a position he has held since 2012, member of the Program Steering Committee of the Australian Council of Learned Academies (2014-2016), member of the Steering Committee to establish the Australian Academy of Health and Medical Sciences (2013-2014), current President of the National Stroke Foundation, current Chair of the University of Melbourne Sport Board, and current Board Director of the Jack Brockhoff Foundation since 2015. Dr. Angus is a Fellow and former Council Member of the Australian Academy of Science as well as the International Academy of Cardiovascular Sciences, and an Honorary Fellow of the Australian Academy of Health and Medical Sciences. Over the past 25 years, he has received numerous research grants from such learned institutions and organizations as the NHMRC, Australian College of Anaesthetists, National Heart Foundation of Australia, Glaxo Smith Kline Pty Ltd, and Johnson & Johnson Pty Ltd. His scientific society memberships include the Australian Physiological and Pharmacology Society, Australian Society for Clinical and Experimental Pharmacology, British Pharmacological Society, Cardiac Society of Australia and New Zealand, High Blood Pressure Research Council of Australia, International Society for Heart Research, International Society of Autonomic Neuroscience and International Union of Pharmacology, of which he was first Vice President (2002-2006). Dr. Angus is a regular contributor to scientific journals, including the Clinical and Experimental Pharmacology and Physiology Journal, the Journal of Vascular Research, the British Journal of Pharmacology, and Pharmacology and Toxicology. 
He has attended and lectured at numerous national and international scientific meetings. In recognition of professional excellence, he was the recipient of the Alfred Gottschalk Medal of the Australian Academy of Science in 1984, and the Thomson ISI: Australian Citation Laureate in Pharmacology in 2004. In 2010, Dr. Angus was appointed an Officer of the Order of Australia for distinguished service to biomedical research, particularly in the fields of pharmacology and cardiovascular disease, as a leading academic and medical educator, and as a contributor to a range of advisory boards and professional organizations both nationally and internationally. Further, he received the Centenary Medal in 2003 for services to pharmacology and the community. His many important roles throughout scientific and academic circles have brought distinction to the University of Melbourne and to the Melbourne Medical School. For his professional efforts, Dr. Angus was selected for inclusion in Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in the World. About Marquis Who's Who: Since 1899, when A. N. Marquis printed the First Edition of Who's Who in America, Marquis Who's Who has chronicled the lives of the most accomplished individuals and innovators from every significant field of endeavor, including politics, business, medicine, law, education, art, religion and entertainment. Today, Who's Who in America remains an essential biographical source for thousands of researchers, journalists, librarians and executive search firms around the world. Marquis now publishes many Who's Who titles, including Who's Who in America, Who's Who in the World, Who's Who in American Law, Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in Asia. Marquis publications may be visited at the official Marquis Who's Who website at

MELBOURNE, AUSTRALIA--(Marketwired - Nov 29, 2016) - Genetic Technologies Limited ( : GTG) ( : GENE) ("Company"), a molecular diagnostics company and provider of BREVAGenplus®, a first-in-class, clinically validated risk assessment test for sporadic (non-hereditary) breast cancer, today announced the signing of an exclusive worldwide license agreement with The University of Melbourne for the development and commercialisation of a novel colorectal cancer (CRC) risk assessment test. The core technology behind this CRC risk assessment test was developed by Professor Mark Jenkins and his research team at the University's Centre for Epidemiology and Biostatistics. Results from preliminary modelling studies were first published online in Future Oncology on 1 February 2016, in a paper entitled "Quantifying the utility of single nucleotide polymorphisms to guide colorectal cancer screening," 2016 Feb: 12(4), 503-13. This simulated case-control study of 1 million patients indicated that a panel of 45 known susceptibility SNPs can stratify the population into clinically useful CRC risk categories. In practice, the technology could be used to identify people at high risk for CRC who should undergo intensive screening, which can ultimately reduce the risk of occurrence of and death from the disease. Those identified as at low risk of CRC could be spared expensive and invasive screening that is not justified for them, thereby preventing adverse events and saving money. A scientific validation study supporting this work is nearing completion and is expected to be published within the next six (6) months. The fundamental technology is similar to the BREVAGenplus test and will fit synergistically into the Company's existing infrastructure and processes. The CRC test represents a significant milestone for the Company as it seeks to diversify its product pipeline and become a key player in the SNP-based cancer risk assessment landscape. 
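The release gives no weights for the 45-SNP panel, but risk tests of this kind typically combine per-allele log odds ratios into a polygenic score and cut it into screening categories. A sketch with entirely hypothetical SNPs, odds ratios and thresholds:

```python
import math

# Hypothetical polygenic risk score: the published panel uses 45 validated
# susceptibility SNPs, but the SNP ids, odds ratios and category
# thresholds below are invented purely for illustration.
SNP_LOG_OR = {
    "rsA": math.log(1.10),  # each risk-allele copy multiplies odds by 1.10
    "rsB": math.log(1.15),
    "rsC": math.log(0.92),  # a protective allele
}

def risk_score(genotype):
    """genotype maps SNP id -> risk-allele count (0, 1 or 2)."""
    return sum(SNP_LOG_OR[snp] * count for snp, count in genotype.items())

def risk_category(score, low=-0.05, high=0.15):
    """Cut the score into screening categories (thresholds are invented)."""
    if score < low:
        return "low"       # could be spared invasive screening
    if score > high:
        return "high"      # candidate for intensive screening
    return "average"

print(risk_category(risk_score({"rsA": 2, "rsB": 1, "rsC": 0})))  # high
```

The clinical payoff described in the release is exactly this triage: "high" scorers are steered towards intensive screening, "low" scorers away from it.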
The commercial development strategy for the CRC test will benefit from the BREVAGenplus experience in the marketplace. The terms and conditions of the Agreement are confidential; however, Genetic Technologies will be responsible for the commercial development of the test. In addition, as part of the Agreement, The University of Melbourne and Genetic Technologies will embark on a robust, ongoing research collaboration enabling the Company to leverage the University's renowned world-class expertise in SNP-based risk assessment and risk model development. "This is an exciting time for the Company as we commence this strategic alliance with The University of Melbourne. The relationship with the University is comprehensive and highlights our overall corporate mission to become a leader in the genomics-focused oncology diagnostics industry while enhancing our pipeline of risk assessment products," commented Eutillio Buccilli, Executive Director and Chief Executive Officer of Genetic Technologies Limited. Excluding skin cancers, CRC is the third most common cancer diagnosed in both men and women in the United States. Overall, the lifetime risk of developing CRC is about 1 in 20 (5%). CRC is the third leading cause of cancer-related deaths in the United States when men and women are considered separately, and the second leading cause when both are combined. As with breast cancer, early diagnosis is key. When diagnosed at an early stage (before the disease has spread outside the colon), the relative 5-year survival rate is 92% for colon cancer and 87% for rectal cancer, according to the American Cancer Society, while the survival rates for late-stage (metastatic) disease are much lower, at 11% and 12% respectively. In fact, the majority of CRC cases are preventable by early detection and removal of precancerous polyps. Regular CRC screening is therefore one of the most powerful weapons for preventing CRC. 
The main challenge with current CRC screening methodologies is compliance (the patient actually doing and completing the test), with the U.S. National Cancer Institute (NCI) stating that compliance in one of the large RCTs was ~47%, theoretically halving the impact of screening on CRC mortality. The most common CRC screening tool is a faecal occult blood test (FOBT) or visual inspection of the bowel by endoscopy (colonoscopy or sigmoidoscopy). FOBT-based screening has been shown in three very large randomised controlled trials to reduce CRC mortality, according to the NCI. FOBT screening has fairly high sensitivity but a low positive predictive value, meaning that a patient who returns a positive FOBT then goes on to receive a diagnostic colonoscopy, often for what proves to be a false alarm. Colonoscopy may be used as a primary screening tool in certain patients, but the cost and the infrastructure required to use it as a primary tool are considered too prohibitive. As with breast cancer, the more a physician can tailor a patient's screening program to their level of risk of developing CRC, the greater impact screening will have on the disease. The development of a much-improved CRC risk assessment tool has the potential to provide a significant health benefit by better targeting the existing screening modalities and improving compliance among those patients most at risk of developing CRC. Risk stratification would also likely influence the age at which a patient starts screening and the frequency of screening. "The licensing Agreement with Genetic Technologies provides us with a wonderful opportunity to work with an organisation that is a leader in the field of genomics for precision public health. Furthermore, Genetic Technologies provides the University with an established platform that will facilitate the transition of our scientific work into the clinical arena," commented Mark Jenkins, Professor of Epidemiology and Biostatistics at The University of Melbourne. 
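The tension between high sensitivity and low positive predictive value follows directly from Bayes' rule whenever disease prevalence is low. A sketch with illustrative numbers, not figures quoted by the NCI or the release:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule: P(disease | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# With low prevalence, even a decent test yields mostly false positives,
# so most positive FOBTs lead to a colonoscopy that finds no cancer.
print(round(ppv(sensitivity=0.80, specificity=0.95, prevalence=0.005), 3))  # 0.074
```

At an assumed 0.5% prevalence, only about 7% of positive FOBTs would reflect actual disease; raising the pre-test probability by screening a risk-stratified "high" group is exactly how a SNP panel could make the same test more efficient.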
About the University of Melbourne and its commercialisation initiatives The University of Melbourne is Australia's best and one of the world's leading universities. As an R&D hub with world-leading specialists in science, technology and medicine, Melbourne undertakes cutting-edge research to create new ways of thinking, new technology and new expertise to build a better future. World-class research, real-world solutions: The University of Melbourne embraces a culture of innovation -- working with industry, government, non-governmental organisations and the community to solve real-world challenges. Our commercial partnerships bring research to life through collaboration in areas of bio-engineering, materials development, medical technology invention, community capacity development and cultural entrepreneurship. With over 160 years of leadership in education and research, the University responds to immediate and future challenges facing our society through innovation in research. The University of Melbourne is No. 1 in Australia and 33rd in the world (Times Higher Education World University Rankings 2015-2016). About Genetic Technologies Limited Genetic Technologies is a molecular diagnostics company that offers predictive testing and assessment tools to help physicians proactively manage women's health. The Company's lead product, BREVAGenplus®, is a clinically validated risk assessment test for non-hereditary breast cancer and is first in its class. BREVAGenplus® improves upon the predictive power of the first-generation BREVAGen™ test and is designed to facilitate better-informed decisions about breast cancer screening and preventive treatment plans. BREVAGenplus® expands the application of BREVAGen™ from Caucasian women to include African-Americans and Hispanics, and is directed towards women aged 35 years or above who have not had breast cancer and have one or more risk factors for developing breast cancer. 
The Company has successfully launched the first-generation BREVAGen™ test across the U.S. via its U.S. subsidiary Phenogen Sciences Inc., and the addition of BREVAGenplus®, launched in October 2014, significantly expands the applicable market. The Company markets BREVAGenplus® to healthcare professionals in comprehensive breast health care and imaging centres, as well as to obstetricians/gynaecologists (OBGYNs) and breast cancer risk assessment specialists (such as breast surgeons). For more information, please visit and Safe Harbor Statement Any statements in this press release that relate to the Company's expectations are forward-looking statements, within the meaning of the Private Securities Litigation Reform Act. The Private Securities Litigation Reform Act of 1995 (PSLRA) implemented several significant substantive changes affecting certain cases brought under the federal securities laws, including changes related to pleading, discovery, liability, class representation and awards of fees. Since this information involves risks and uncertainties and is subject to change at any time, the Company's actual results may differ materially from expected results. Additional risks associated with Genetic Technologies' business can be found in its periodic filings with the SEC.

News Article | November 7, 2016

Sydney Australia, Nov. 07, 2016 (GLOBE NEWSWIRE) -- Heron Resources Limited (TSX: HER; ASX: HRR) ("Heron" or the "Company") is pleased to advise that its gold-nickel spin-off, Ardea Resources Limited (Ardea), has appointed a well-credentialed independent Board to oversee its imminent listing on ASX and ongoing operations. The Board combines a focussed blend of exploration, development and corporate expertise well suited to Ardea's strategic plan. Katina Law has over 25 years' experience in the mining industry covering corporate and site-based roles across several continents. Over the past ten years she has worked with a number of ASX-listed resources companies in strategic financial advisory and general management roles. She has worked on a number of development and evaluation projects which were later subject to corporate transactions, including the Deflector gold and copper project and the King Vol polymetallic zinc project. Ms Law was Executive Director and CEO of East Africa Resources Limited from 2012 to 2015. Ms Law has also held senior positions at Newmont Mining Corporation's head office in Denver, USA, producing the company's financial plans and providing financial information and analysis to the Board of Directors and the Executive Committee. She held the position of New Business Development Executive at LionOre International, based in Perth, where she was responsible for the financial assessment of development projects. Ms Law has a Bachelor of Commerce degree from UWA, is a Certified Practising Accountant and has an MBA from London Business School. She is currently a non-executive Director of headspace and Gumala Enterprises Pty Ltd. Ms Law has no other public company directorships. Matthew Painter is a geologist with over 20 years' professional experience including SRK Consulting, Sabre Resources, AngloGold Ashanti, Geological Survey of WA and MIM Exploration. 
His expertise is in ore deposit geology and structural geology, and his work has been instrumental in the successful discovery, exploration, and development of greenfields and brownfields deposits globally. Dr Painter has extensive on-ground experience throughout Australia and overseas, including east, west, and southern Africa, central and south-eastern Asia, and South America, across a broad range of commodities including gold, copper, zinc-lead-silver, uranium, tin and manganese. Dr Painter has extensive managerial and ASX-listed company corporate experience. He has a Bachelor of Science with Honours degree from the University of Melbourne and a Doctor of Philosophy (PhD) in Economic Geology from the University of Queensland. Dr Painter has no other public company directorships. Ian Buchhorn is a mineral economist and geologist with over 35 years' experience. Prior to listing Heron in 1996 as founding Managing Director, Mr Buchhorn worked with Anglo American Corporation in southern Africa, and Comalco, Shell/Billiton and Elders Resources in Australia, variously as a corporate and research geologist, as well as setting up and managing Australia's first specialist mining grade control consultancy. For the last 25 years Mr Buchhorn has developed mining projects throughout the Eastern Goldfields of Western Australia and operated as a Registered Mine Manager. Mr Buchhorn is an executive director of Heron Resources Limited and a non-executive director of RBR Group Limited. Mr Buchhorn's role is to provide continuity from Heron's stewardship of the assets to Ardea's. Sam Middlemas is a chartered accountant with more than 20 years' experience in various financial and company secretarial roles with a number of listed public companies operating in the resources sector. He is the principal of a corporate advisory company which provides financial and secretarial services specialising in capital raisings and initial public offerings. 
Previously Mr Middlemas worked for an international accountancy firm. His fields of expertise include corporate secretarial practice, financial and management reporting in the mining industry, treasury and cash flow management and corporate governance. Mr Middlemas is currently CEO and Company Secretary of Bauxite Resources Limited, and CFO/Company Secretary of RBR Group Limited, Alto Metals Limited and Enterprise Metals Limited. Due to the December Closing Date, the current indicative timetable is deliberately conservative, to accommodate the holiday season. The Ardea Board will endeavour to close the offer earlier. This news release contains forward-looking statements and forward-looking information within the meaning of applicable Australian and Canadian securities laws, which are based on expectations, estimates and projections as of the date of this news release. This forward-looking information includes, or may be based upon, without limitation, estimates, forecasts and statements as to management’s expectations with respect to, among other things, the timing and ability to complete the Ardea spin-off, the timing and amount of funding required to execute the Company’s exploration, development and business plans, capital and exploration expenditures, the effect on the Company of any changes to existing legislation or policy, government regulation of mining operations, the length of time required to obtain permits, certifications and approvals, the success of exploration, development and mining activities, the geology of the Company’s properties, environmental risks, the availability of labour, the focus of the Company in the future, demand and market outlook for precious metals and the prices thereof, progress in development of mineral properties, the Company’s ability to raise funding privately or on a public market in the future, the Company’s future growth, results of operations, performance, and business prospects and opportunities. 
Wherever possible, words such as “anticipate”, “believe”, “expect”, “intend”, “may” and similar expressions have been used to identify such forward-looking information. Forward-looking information is based on the opinions and estimates of management at the date the information is given, and on information available to management at such time. Forward-looking information involves significant risks, uncertainties, assumptions and other factors that could cause actual results, performance or achievements to differ materially from the results discussed or implied in the forward-looking information. These factors include, but are not limited to, the ability to complete the Ardea spin-off on the basis of the proposed terms and timing or at all, the ability to complete the Woodlawn Zinc-Copper Project Feasibility Study on time or at all, and whether the feasibility study is positive and otherwise consistent with the business plans of the Company, fluctuations in currency markets, fluctuations in commodity prices, the ability of the Company to access sufficient capital on favourable terms or at all, changes in national and local government legislation, taxation, controls, regulations, political or economic developments in Canada, Australia or other countries in which the Company does business or may carry on business in the future, operational or technical difficulties in connection with exploration or development activities, employee relations, the speculative nature of mineral exploration and development, obtaining necessary licenses and permits, diminishing quantities and grades of mineral reserves, contests over title to properties, especially title to undeveloped properties, the inherent risks involved in the exploration and development of mineral properties, the uncertainties involved in interpreting drill results and other geological data, environmental hazards, industrial accidents, unusual or unexpected formations, pressures, cave-ins and flooding, limitations of 
insurance coverage and the possibility of project cost overruns or unanticipated costs and expenses, and should be considered carefully. Many of these uncertainties and contingencies can affect the Company’s actual results and could cause actual results to differ materially from those expressed or implied in any forward-looking statements made by, or on behalf of, the Company. Prospective investors should not place undue reliance on any forward-looking information. Although the forward-looking information contained in this news release is based upon what management believes, or believed at the time, to be reasonable assumptions, the Company cannot assure prospective purchasers that actual results will be consistent with such forward-looking information, as there may be other factors that cause results not to be as anticipated, estimated or intended, and neither the Company nor any other person assumes responsibility for the accuracy and completeness of any such forward-looking information. The Company does not undertake, and assumes no obligation, to update or revise any such forward-looking statements or forward-looking information contained herein to reflect new events or circumstances, except as may be required by law. No stock exchange, regulation services provider, securities commission or other regulatory authority has approved or disapproved the information contained in this news release.

Agency: Cordis | Branch: FP7 | Program: CPCSA | Phase: INFRA-2010-1.2.1 | Award Amount: 70.14M | Year: 2010

Scientific research is no longer conducted within national boundaries and is becoming increasingly dependent on the large-scale analysis of data, generated from instruments or computer simulations housed in trans-national facilities, using e-Infrastructure (distributed computing and storage resources linked by high-performance networks).

The 48-month EGI-InSPIRE project will continue the transition to a sustainable pan-European e-Infrastructure started in EGEE-III. It will sustain support for Grids of high-performance and high-throughput computing resources, while seeking to integrate new Distributed Computing Infrastructures (DCIs), i.e. Clouds, SuperComputing, Desktop Grids, etc., as they are required by the European user community. It will establish a central coordinating organisation and support the staff throughout Europe necessary to integrate and interoperate individual national grid infrastructures. This organisation will provide a coordinating hub for European DCIs, working to bring existing technologies into a single integrated persistent production infrastructure for researchers within the European Research Area.

EGI-InSPIRE will collect requirements and provide user support for current and new (e.g. ESFRI) users. Support will also be given to the current heavy users as they move their critical services and tools from a central support model to ones driven by their own individual communities. The project will define, verify and integrate within the Unified Middleware Distribution the middleware from external providers needed to access the e-Infrastructure. The operational tools will be extended by the project to support a national operational deployment model, include new DCI technologies in the production infrastructure and the associated accounting information to help define EGI's future revenue model.

Agency: Cordis | Branch: FP7 | Program: CPCSA | Phase: INFRA-2007-1.2-03 | Award Amount: 49.02M | Year: 2008

A globally distributed computing Grid now plays an essential role for large-scale, data-intensive science in many fields of research. The concept has been proven viable through the Enabling Grids for E-sciencE project (EGEE and EGEE-II, 2004-2008) and its related projects. EGEE-II is consolidating the operations and middleware of this Grid for use by a wide range of scientific communities, such as astrophysics, computational chemistry, earth and life sciences, fusion and particle physics. Strong quality assurance, training and outreach programmes contribute to the success of this production Grid infrastructure.

Built on the pan-European network GÉANT2, EGEE has become a unique and powerful resource for European science, allowing researchers in all regions to collaborate on common challenges. Worldwide collaborations have extended its reach to the benefit of European science.

The proposed EGEE-III project has two clear objectives that are essential for European research infrastructures: to expand, optimize and simplify the use of Europe's largest production Grid by continuous operation of the infrastructure, support for more user communities, and addition of further computational and data resources; and to prepare the migration of the existing Grid from a project-based model to a sustainable federated infrastructure based on National Grid Initiatives.

By strengthening interoperable, open-source middleware, EGEE-III will actively contribute to Grid standards, and work closely with businesses to ensure commercial uptake of the Grid, which is a key to sustainability.

Federating its partners on a national or regional basis, EGEE-III will have a structuring effect on the European Research Area. In particular, EGEE-III will ensure that the European Grid does not fragment into incompatible infrastructures of varying maturity. EGEE-III will provide a world-class, coherent and reliable European Grid, ensuring Europe remains at the forefront of scientific excellence.

News Article | November 7, 2016

A new comprehensive study of Australian natural hazards paints a picture of increasing heatwaves and extreme bushfires as this century progresses, but with much more uncertainty about the future of storms and rainfall. Published today (Tuesday 8 November) in a special issue of the international journal Climatic Change, the study documents the historical record and projected change of seven natural hazards in Australia: flood; storms (including wind and hail); coastal extremes; drought; heatwave; bushfire; and frost. "Temperature-related hazards, particularly heatwaves and bushfires, are increasing, and projections show a high level of agreement that we will continue to see these hazards become more extreme into the 21st century," says special issue editor Associate Professor Seth Westra, Head of the Intelligent Water Decisions group at the University of Adelaide. "Other hazards, particularly those related to storms and rainfall, are more ambiguous. Cyclones are projected to occur less frequently but when they do occur they may well be more intense. In terms of rainfall-induced floods we have conflicting lines of evidence with some analyses pointing to an increase into the future and others pointing to a decrease. "One thing that became very clear is how much all these hazards are interconnected. For example drought leads to drying out of the land surface, which in turn can lead to increased risk of heat waves and bushfires, while also potentially leading to a decreased risk of flooding." The importance of interlinkages between climate extremes was also noted in the coastal extremes paper: "On the open coast, rising sea levels are increasing the flooding and erosion of storm-induced high waves and storm surges," says CSIRO's Dr Kathleen McInnes, the lead author of the coastal extremes paper. "However, in estuaries where considerable infrastructure resides, rainfall runoff adds to the complexity of extremes." 
This special issue represents a major collaboration of 47 scientists and eleven universities through the Australian Water and Energy Exchange Research Initiative, an Australian research community program. The report's many authors were from the Centre of Excellence for Climate System Science, the CSIRO, Bureau of Meteorology, Australian National University, Curtin University, Monash University, University of Melbourne, University of Adelaide, University of Newcastle, University of New South Wales, University of Tasmania, University of Western Australia and University of Wollongong. The analyses aim to disentangle the effects of climate variability and change on hazards from other factors such as deforestation, increased urbanisation, people living in more vulnerable areas, and higher values of infrastructure.

"The study documents our current understanding of the relationship between historical and possible future climatic change and the frequency and severity of Australian natural hazards," says Associate Professor Westra. "These hazards cause multiple impacts on humans and the environment and collectively account for 93% of Australian insured losses, and that does not even include drought losses.

"We need robust decision-making that considers the whole range of future scenarios and how our environment may evolve. The biggest risk from climate change is if we continue to plan as though there will be no change. One thing is certain: our environment will continue to change."

Contact: Associate Professor Seth Westra, Special Issue Editor, School of Civil, Environmental and Mining Engineering, University of Adelaide. Phone: +61 8 8313 1538. Mobile: +61 414 997 406

News Article | September 7, 2016

On the dock in Buenaventura, Colombia, the fisherman needed help identifying his catch. “I don’t have any clue what this is,” he said, holding a roughly 50-centimeter-long, grayish-brown fish. Gustavo Castellanos-Galindo, a fish ecologist, recalls the conversation from last October. “I said, ‘Well, this is a cobia, and it shouldn’t be here.’ ” The juvenile cobia had probably escaped from a farm off the coast of Ecuador that began operating earlier in 2015, Castellanos-Galindo and colleagues at the World Wildlife Fund in Cali, Colombia, reported in March in BioInvasions Records. Intruders had probably cut a net cage, perhaps intending to catch and sell the fish. Roughly 1,500 cobia fled, according to the aquaculture company Ocean Farm in Manta, Ecuador, which runs the farm. Cobia are fast-swimming predators that can migrate long distances and grow to about 2 meters long. The species is not native to the eastern Pacific, but since the escape, the fugitives have been spotted from Panama to Peru. The cobia getaway is not an isolated incident. Aquaculture, the farming of fish and other aquatic species, is rapidly expanding — both in marine and inland farms. It has begun to overtake wild-catch fishing as the main source of seafood for the dinner table. Fish farmed in the ocean, such as salmon, sea bass, sea bream and other species, are raised in giant offshore pens that can be breached by storms, predators, fish that nibble the nets, employee error and thieves. Global numbers for escapes are hard to come by, but one study of six European countries over three years found that nearly 9 million fish escaped from sea cages, according to a report published in Aquaculture in 2015. Researchers worry that these releases could harm wildlife, but they don’t have a lot of data to measure long-term effects. Many questions remain. 
A study out of Norway published in July suggests that some domesticated escapees have mated extensively with wild fish of the same species, which could weaken the wild population. Scientists also are investigating whether escaped fish could gobble up or displace native fish. Worst-case scenario: Escaped fish spread over large areas and wreak havoc on other species. From toxic toads overrunning Australia and Madagascar (SN Online: 2/22/16) to red imported fire ants in the United States, invasive species are one of the planet’s biggest threats to biodiversity, and they cost billions of dollars in damage and management expenses. Not every introduced species has such drastic effects, but invasives can be tough to eliminate. While researchers try to get a handle on the impact of farm escapes, farmers are working to better contain the fish and reduce the ecological impact of the runaways. Some countries have tightened their aquaculture regulations. Researchers are proposing strategies ranging from new farm designs to altering fish genetics. As aquaculture becomes a widespread means to feed the planet’s protein-hungry people, the ecological effects are getting more attention. If escapees weaken native wildlife, “we’re solving a food issue globally and creating another problem,” says population geneticist Kevin Glover of Norway’s Institute of Marine Research in Bergen. Norway, a top producer of marine fish, has done much of the research on farm escapes. Fish farming is big business. In 2014, the industry churned out 73.8 million metric tons of aquatic animals worth about $160 billion, according to a report in July from the Food and Agriculture Organization of the United Nations in Rome. Nearly two-thirds of this food comes from inland freshwater farms such as ponds, used in Asia for thousands of years. The rest is grown on marine and coastal farms, where farmed fish live in brackish ponds, lagoons or cages in the ocean. 
Freshwater fish can escape from pond farms during events such as floods. Some escapees, such as tilapia, have hurt native species by competing with and eating wild fish. But sea farming has its own set of problems. The physical environment is harsh and cages are exposed to damaging ocean waves and wind, plus boats and predator attacks. Salmon is one of the most heavily farmed marine fish. In some areas, the number of farmed salmon dwarfs wild populations. Norway’s marine farms hold about 380 million Atlantic salmon, while the country’s rivers are home to only about 500,000 wild spawning Atlantic salmon. In the four decades that farmers have been cultivating Atlantic salmon, farmed strains have diverged from their wild cousins. When both are raised in standard hatchery conditions, farm-raised salmon can grow about three to five times heavier than wild salmon in the first year of life. Salmon raised in farms also tend to be less careful; for instance, after being exposed to an artificial predator, they emerge more quickly from hiding places than wild fish. This risky behavior may have arisen partly because the fish haven’t faced the harsh challenges of nature. “The whole idea of a hatchery is that everything gets to survive,” says Philip McGinnity, a molecular ecologist at University College Cork in Ireland. Farmed fish don’t know better. These differences are bad news for hybrid offspring and wild fish. In early experiments, hybrid offspring of farmed and wild salmon tended to fare poorly in the wild. In the 1990s, McGinnity’s team measured these fish’s “lifetime success” in spawning rivers and the ocean. Compared with wild salmon, hybrid offspring had a lifetime success rate about a fourth to a half as high. 
Around the same time, a team in Norway found that when wild fish swam with farmed fish in their midst, the number of wild offspring that survived long enough to leave the river to head to the ocean was about one-third lower than expected, perhaps because the fast-growing farmed offspring gobbled a lot of food or claimed territory. “There was truly reason to be concerned,” says Ian Fleming, an evolutionary ecologist at Memorial University of Newfoundland in St. John’s, Canada, who was part of the Norway team. Recent work supports the idea that farmed fish could crowd out wild fish by hogging territory in a river. In a study published last year in the Journal of Fish Biology, researchers found that the survival rate of young wild salmon dropped from 74 to 53 percent when the fish were raised in the same confined stream channels as young farmed salmon rather than on their own. When the channels had an exit, more wild fish departed the stream when raised with farmed salmon than when raised alone. “These are fish that give up the territory and have to leave,” says study coauthor Kjetil Hindar, a salmon biologist at the Norwegian Institute for Nature Research in Trondheim. To find out how much escaped fish had genetically mingled with wild fish, Glover’s team obtained historical samples of salmon scales collected from 20 rivers in Norway before aquaculture became common. The researchers compared the DNA in the scales with that of wild salmon caught from 2001 to 2010 in those rivers. Wild salmon in five of the 20 rivers had become more genetically similar to farmed fish over about one to four decades, the team reported in 2013 in BMC Genetics. In the most affected population, 47 percent of the wild fish’s genome originated from farmed strains. “We’re talking about more or less a complete swamping of the natural gene pool,” Glover says. Imagine buckets of paint — red, blue, green — representing each river, he says, and pouring gray paint into each one. 
Interbreeding was less of an issue where wild fish were plentiful. The farmed fish aren’t good at spawning, so they won’t mate much if a lot of wild competitors are present. But in sparse populations, the farm-raised salmon may be able to “muscle in,” Glover says. A larger study by Hindar’s team, published in July in the ICES Journal of Marine Science, showed that genetic mixing between wild and farmed salmon is happening on a large scale in Norway. Among 109 wild salmon populations, about half had significant amounts of genetic material from farmed strains that had escaped. In 27 populations, more than 10 percent of the fish’s DNA came from farmed fish. What does that mean for the offspring? Each salmon population has adapted to survive in its habitat — a certain river, at a specific temperature range or acidity level. When farmed fish mate with wild fish, the resulting offspring may not be as well-suited to live in that environment. Over generations, as the wild population becomes more similar to farmed salmon, scientists worry that the fish’s survival could drop. Scientists at several institutions in Norway are exploring whether genetic mixing changes the wild salmon’s survival rates, growth and other traits. Making a definitive link will be difficult. Other threats such as climate change and pollution also are putting stress on the fish. If escapes can be stopped, wild salmon may rebound. Natural selection will weed out the weakest fish and leave the strongest, fish that got a lucky combination of hardy traits from their parents. But Glover worries that, just as a beach can’t recover if oil is spilled every year, the wild population can’t rally if farmed fish are continually pumped in: “Mother Nature cannot clean up if you constantly pollute.” In places where the species being farmed is not naturally abundant, researchers are taking a look at whether escapes could upset native ecosystems. 
For instance, European sea bass sometimes slip away from farms in the Canary Islands, where (except for a few small populations on the eastern end) the species doesn’t normally live. In February 2010, storms battered cages at the island of La Palma, “like a giant tore up all the nets,” says Kilian Toledo-Guedes, a marine ecologist at the University of Alicante in Spain. About 1.5 million fish — mostly sea bass — reportedly swam free. A couple of weeks later, the number of sea bass in nearby waters was “astounding,” he says. “I couldn’t see the bottom.” Sea bass density in waters near the farm was 162 times higher than it had been at the same time the previous year, his team reported in 2014 in Fisheries Management and Ecology. Fisheries data showing a spike in catches of sea bass by local fishermen that January also suggested that large unreported escapes had occurred before the storm. Despite being raised in captivity, where they are fed pellets, some of the farmed fish learn to hunt. The researchers found that escaped sea bass caught four months after the 2010 farm breakdown had eaten mostly crabs. Sea bass from earlier escapes that had been living in the wild for several years had eaten plenty of fish as well. The results, reported in 2014 in Marine Environmental Research, suggest that escapees start by catching easy targets such as crustaceans and then learn to nab faster-moving fish. So far, though, scientists have not seen clear signs that the escapees damaged the ecosystem. The density of sea bass around La Palma had fallen drastically by October 2010 and continued to decline the next year, probably because some fish couldn’t find enough to eat, while others were caught by fishermen or predators, according to a 2015 study by another team in the Journal of Aquaculture Research & Development. 
Catches of small fish that sea bass eat, such as parrot fish, did not drop significantly after the 2010 escape or after a similar large escape in 1999, says study coauthor Ricardo Haroun, a marine conservation researcher at the University of Las Palmas de Gran Canaria in Spain. While he agrees that the industry should try to prevent escapes, he sees no evidence that the runaways are suppressing wild species. If the escaped fish can breed and multiply, the risk of harming native species rises. In a study published in Marine Ecology in 2012, Toledo-Guedes and colleagues reported finding sexually mature sea bass around the central island of Tenerife. But Haroun says the water is too warm and salty for the fish to reproduce, and his team did not see any juveniles during their surveys of La Palma, nor have they heard any reports of juveniles in the area. Toledo-Guedes says that more extensive studies, such as efforts to catch larvae, are needed before reproduction can be ruled out. Similarly, researchers can’t predict the consequences of the cobia escape in Ecuador. The water is the right temperature for reproduction, and these predators eat everything from crabs to squid. Castellanos-Galindo believes that farming cobia in the area is a mistake because escapes will probably continue, and the fish may eventually form a stable population in the wild that could have unpredictable effects on native prey and other parts of the ecosystem. He points to invasive lionfish as a cautionary tale: These predators, probably released from personal aquariums in Florida, have exploded across the Caribbean, Gulf of Mexico and western Atlantic and are devouring small reef fish. The situation for cobia may be different. Local sharks and other predators will probably eat the escapees, whereas lionfish have few natural predators in their new territory, argues Diego Ardila, production manager at Ocean Farm. 
Milton Love, a marine fish ecologist at the University of California, Santa Barbara, also notes that lionfish settle in one small area, but cobia keep moving, so prey populations might recover after the cobia have moved on. Not all introduced species become established or invasive, and it can take decades for the effects to become apparent. “Time will tell what happens,” says Andrew Sellers, a marine ecologist at the Smithsonian Tropical Research Institute in Panama City. “Basically, it’s just up to the fish.” Once fish have fled, farmers sometimes enlist fishermen to help capture the escapees. Professional fishermen caught nearly one-quarter of the sea bass and sea bream that escaped after the Canary Islands breach. On average, though, only 8 percent of fish are recaptured after an escape, according to a study published in June in Reviews in Aquaculture. Given the recapture failures, farmers and policy makers should focus on preventing escapes and maintaining no-fishing zones around farms to create a “wall of mouths,” local predators that can eat runaway fish, says coauthor Tim Dempster, a sustainable aquaculture researcher at the University of Melbourne in Australia. Technical improvements could help. The Norwegian government rolled out a marine aquaculture standard in 2004 that required improvements, such as engineering nets, moorings and other equipment to withstand unusually strong storms. Compared with the period 2001–2006, the average number of Atlantic salmon escaping annually from 2007–2009 dropped by more than half. Ocean Farm in Ecuador has tightened security, increased cage inspections and switched to stronger net materials; no cobia have escaped since last year’s break-in, says Samir Kuri, the company’s operations manager. Some companies raise fish in contained tanks on land to avoid polluting marine waters, reduce exposure to diseases and control growth conditions. But the industry is largely reluctant to adopt this option until costs come down. 
The money saved from reducing escapes probably wouldn’t make up for the current start-up expense of moving to land. The 242 escape events analyzed in the 2015 Aquaculture study cost farmers about $160 million. By one estimate, establishing a land-based closed-containment farm producing about 4,000 metric tons of salmon annually — a small haul by industry standards — would cost $54 million; setting up a similar-sized sea-cage farm costs $30 million. Another solution is to raise fish that have three sets of chromosomes. These triploid fish, produced by subjecting fertilized eggs to a pressure shock, can’t reproduce and therefore wouldn’t proliferate or pollute the wild gene pool. “The only ultimate solution is sterility,” Norway’s Glover says. “Accidents happen.” Escaped triploid salmon are less likely to disrupt mating by distracting females from wild males, the researchers wrote in Biological Invasions in May. But triploid fish don’t grow as well when the water is warmer than about 15° Celsius, and consumers might be reluctant to accept these altered salmon. Although the ecological effects of fish farm escapes may take a long time to play out, most researchers agree that we shouldn’t take chances with the health of the oceans, which already face threats such as climate change, pollution and overfishing. With the aquaculture industry expanding at about 6 percent per year, farmers will have to keep improving their practices if they are to stay ahead of the runaway fish. This story appears in the September 17, 2016, issue of Science News with the headline, "Runaway fish: Escapes from marine farms raise concerns about native wildlife."

News Article | November 10, 2016

In the echoey theatre of modern political gesture, welfare has recently had one of its periodic stagings. A minister has honed his ideological credentials; columnists have extracted culture war content; tabloid journalists have parroted shamelessly grossed-up figures and patrolled Bondi beach for dole bludgers. Yet for once it was worth cocking an ear. For if the world of employment is facing upheaval, then so is its counterpart of unemployment. And if the future of work is, as many argue, increasingly flexible, casual, various and scarce, it’s arguable that those short of it will steadily face exacerbated economic risk.

What might a welfare system of the future look like? What is the potential of ideas such as a universal basic income or a negative income tax, long discussed by economists, mostly beyond the ken of politicians?

Despite its regular depiction as a peeler of lotuses for layabouts, the social safety net is something Australians have reasons to be proud of. Australia’s taxes are steeply progressive, moderating inequalities of income, and its welfare system comparatively cheap, certifiably efficient in delivering to those in greatest need. Most of the bill is absorbed not by the dole but by the aged pension, even if no tabloid has ever sent reporters to scour those opulent retirement homes and bingo halls in search of the spongeing workshy elderly. But being fit for purpose is no help if purpose should change. “What we have now is a welfare system designed around a whole bunch of policy settings that aren’t there any more – a parsimonious dole predicated on a protected economy in what’s now a dynamic, unprotected, trade-exposed economy,” argues Tim Lyons of the thinktank Per Capita. “The dole makes sense if you’re a cook and you lose your job as a cook and you want another job as a cook in the same town you live in. 
But it’s not going to get you through a long period of unemployment without a decline into poverty, it won’t help you retrain and it’s not sufficient to help you move.” Possibilities of gales of creative destruction are mitigated somewhat by slow growth in the supply of new workers. “One thing that’s happened over the last five years is that the growth rate in supply of workers has shrunk quite rapidly,” notes economist Saul Eslake. “Since 2011, we’ve seen the slowest growth in working age population since the Great Depression … and unemployment has fallen by more in the last three years than in any other three-year period.” Average job duration too has changed relatively little in the past 30 years and the much-discussed gig economy is as yet small, its impacts anecdotal rather than substantive – even if that may partly be a factor of official measurability. In some respects, Australia is better prepared than other economies for the challenges of job churn. “We’re well ahead of a lot of other countries that tie a lot of things into the employment package,” observes the shadow assistant treasurer, Andrew Leigh. “For example, when you tie healthcare into employment, it makes it much harder to move between jobs. When you have defined benefit pension plans tied to a particular job, that makes transition towards an economy in which people start working multiple jobs far more problematic. In that sense, we’re starting in the new economic world with less lead in our saddle bags than much of North America and Europe.” Developing a sense of future employment prospects, moreover, is something about which humility is advisable. Says Eslake: “While it is easy to point to jobs that are threatened by technological change, what economists, futurists and others have been very bad at is pointing to the jobs that will emerge, whether people will be appropriately equipped to do those jobs and whether they will produce adequate incomes.” Yet some facts can now be constituted as trends. 
Wage growth is anaemic, with all manner of entailments, including a property boom that is the empowered consumer’s response to income insecurity. Casual work, meanwhile, is as profuse as full-time employment is flat. In 1988, just over a fifth of workers were employed part-time and adult full-time wages were 21% higher than wages paid to all. Today almost a third of workers are employed part-time and the wages gap yawns at 36%. Employers are ever more reliant on the ranks of casuals, part-timers, workers on short-term contracts, migrants on short-term visas and even full-time school students, who now constitute 2.3% of the workforce. The underemployed – those who would like to find more work but cannot – have reached unprecedented proportions, their circumspection acting as a drag on demand. The Turnbull government’s peculiar mix of effervescence and inanition suggests a superficial engagement with future possibility – invocations of “nimbleness” have replaced the valorising of “working families”. But its rhetoric about welfare – variations on lifters and leaners, workers and shirkers – sounds more like a pandering to constituencies. The benefits of the laissez-faire approach have been whittled away, leaving concentrations of prosperity, a more general residue of exhaustion and discontent. “We’re about to fall into that low-inflation trap that we’ve previously avoided, even though we’re still growing – wage growth is the lowest it’s been since they started collecting that data,” reports George Megalogenis, the author of Australia’s Second Chance (2016). “You can’t say people are slack and don’t deserve a pay rise. They’re working harder than ever and productivity has boomed. But the profit share has swung too far to capital without it being reinvested. 
So we need a reset, for the same reason as you always need a reset – to maintain stability and smooth out the cycle.” Says veteran economist Ross Garnaut: “There will be a certain number of jobs in future that give very good incomes, with a degree of luck involved in who gets those – even people of talent could work their guts out and not get one. We want a system of transfers that helps the rest, the people who are unlucky, enjoy a reasonably secure life.” But how? There are big ideas abroad. At the cutting edge is one with paradoxically ancient antecedents. Universal basic income, a form of social security involving the state paying its citizens a regular unconditional lump sum regardless of whether they work, claims a 500-year intellectual pedigree, the notion of providing “everyone with some livelihood” having been countenanced by Thomas More in Utopia (1516) as an antidote to crime. It has since been approved by a remarkable range of reformers and redistributionists. In Agrarian Justice (1797), Thomas Paine envisioned a national fund, provisioned by the levying of a “ground-rent” on landowners, which would make two kinds of payments to every person: a fifteen-pound lump sum at the age of 21 “to enable him or her to begin the world” and a ten-pound annual stipend to everyone over 50 “to enable them to live in old age without wretchedness, and go decently out of the world”. In The Road to Serfdom (1944), FA Hayek argued that the universal assurance of “some minimum of food, shelter, and clothing, sufficient to preserve health and the capacity to work” was “no privilege but a legitimate object of desire” and could be provided “outside of and supplementary to the market system”. 
For a long time, then, universal basic income has looked like an elegant solution in search of a problem, including in Australia, where it was first mooted in the context of postwar reconstruction in a 1942 monograph by the radio journalist RG Lloyd Thomas and was a proposition entertained in Prof Ronald Henderson’s landmark Report of the Commission of Inquiry into Poverty (1975). Lately it has found another articulate Australian advocate, the academic Tim Dunlop, whose The Future is Workless (2016) deems it “suddenly very sexy” – not a description often applied to social security schemes. “To me the real efficacy of basic income is that even if we’re not headed into a workless future, we’re destined for a future where work will be based on short-term contracts,” Dunlop says. “That might be very lucrative for some people. And you might be able to string together a lifetime of that stuff. But, for a lot of people, even if there is plenty of that sort of work, there are going to be periods where you’ve got nothing. And if you have a society based on that insecurity, that’s a bad society.” The argument is being heard in a number of countries. Swiss voters pondered such a scheme this year, before rejecting it; Finland is running a non-universal pilot program next year; Y Combinator, a Silicon Valley incubator firm, will sponsor a similar test in Oakland, California. And one of the most interesting features of a universal basic income is its variety of advocates, from wings of the egalitarian left, who see it as decommodifying labour and transforming labour relations, rallying behind such works as Guy Standing’s The Precariat (2011) and Nick Srnicek and Alex Williams’ Inventing the Future (2016), to the libertarian right, eager to winnow the welfare bureaucracy away, inspired chiefly by Charles Murray’s In Our Hands (2006). Dunlop finds common cause with both. “The way the welfare system works now is insane,” he says. 
“It’s all outsourced to private providers, yet neither the government nor the provider can get you a job, only an interview. So the government is paying all this money to providers for getting people into interviews without access to the most obvious performance metric: did they get a job? All they can ask is are people ‘job ready’? So they come up with all these measures. Have you done a CV? Have you done training? Have you done this or that? All this government money is being spent basically on compliance costs, on policing people. I’m all for the libertarian argument in that sense. If we can get rid of that, I think everyone’s better off.” The trouble with having some supporters on left and right is that it annoys others. Many on the left hear the echo of “to each according to his needs”; a sizeable proportion of the right still yearns to distinguish between “the deserving poor” and “the undeserving poor”. Welfare in Australia long ago sheered away from a social insurance model; it has traditionally been targeted and generally niggardly. The idea of money for nothing would go against a deeply ingrained idea of a reciprocal bond between state and citizen – embedded in our national anthem, of course, is the notion of “wealth for toil”. Given work’s perceived ennobling characteristics, a basic income would strike some as morally corrosive. But most now uncontroversial forms of income support were first condemned. And a universal basic income might prove less of a disincentive to work than generally imagined. In the most famous basic income experiment, Mincome (1974-79), residents of a poor town in the Canadian province of Manitoba were sent monthly cheques. Working habits among males in Dauphin changed hardly at all, households remained thrifty, and other life measures were enhanced: teenagers remained in school longer; mental health outcomes were better. 
Mincome was doomed by a conservative shift in Canadian politics and its findings were buried – perhaps in part because they cast doubt on an abiding assumption that the poor immiserate themselves by carelessness with money. On the contrary, it seemed, for having had little, they were all the more careful. Occurring as it did only on a local scale, however, Mincome takes the argument only so far. A truly universal basic income would reach many who neither need nor want it. “If the proposition is for a basic income, then, as someone who earns more than $200,000, I say: ‘Don’t cut me a cheque’,” says Labor’s Andrew Leigh. “It will do nothing to change my behaviour and it will be produced either by raising taxes unnecessarily or, worse still, by taking money away from people who need it more than me. And we would be doing it in a country where the total burden of tax is one of the lowest in the advanced world.” A basic income would, indeed, be exceedingly expensive: even calibrated at the minimum wage, it would require the disgorging of almost the entire tax take. “Our system isn’t perfect,” says economist Eslake. “It can be improved on the expenditure and the revenue sides but that can be done. Whereas if you were to say to the Australian people we need to raise the level of tax by 10% of GDP, I can’t see that flying. It is less of an issue for Scandinavian countries, which recycle a lot more through the tax and transfer system.” Dunlop is not so starry-eyed that he sees the universal basic income as an idea whose time has come. “I guess I end up at the view, as I do with lots of things about the future of work, that we actually have to get to the crisis,” he says. “That you don’t argue your way into this, because the social and cultural norms are so entrenched that you need to come to a point where alternatives demand it. We only got a welfare state because of two world wars. 
You’ve also got to convince the one per cent that their income is being legitimately redistributed. They’re very powerful and it’s not like society is falling down around their ears – not yet anyway. In the meantime they can insulate themselves from it. Things will have to get very bad to get their attention.” As Tim Lyons notes: “At the moment we’re a country that can’t come to grips with increasing a dole of $35 a day, which everyone, even the Business Council of Australia, accepts is a ridiculously low level.” Sometimes drawn into discussion of universal basic income is a less ambitious but more robust welfare variation – that of the targeted cash transfer. Its advocates first stirred in the mid-1990s and again have numbered some interesting individuals. In his maiden speech to parliament in 1994, for example, the young backbencher Tony Abbott outlined an inchoate plan for a broad-based, non-means-tested “family wage” of $100 a week payable to the principal carer of dependent children. Again the allure was simplicity, reduced bureaucracy. While voters have shown an innate mistrust of radical change to the tax system, everyone understands and hardly anyone objects to a cash payment. A family wage is quite different from welfare. It is a recognition of responsibilities, not need. It is a payment for services, not a handout. It means that personal choice could replace economic necessity as a rationale for family decisions. One beauty of a family wage system is that it would take one public servant, just one, and a computer to administer. Payments would start the moment a birth is recorded on the registry of births, deaths and marriages database and finish 16 years later. The showpiece, however, has come from the left of politics, specifically Lulism in Brazil. 
Now 13 years old, Bolsa Familia is a means-tested basic income allocated to poor families through the female head of household, delivered via a citizen’s card that operates on a debit basis, conditional on contrapartidas (counterpart responsibilities) such as children attending school and receiving vaccinations and medical check-ups. It is overwhelmingly popular, a quarter of Brazilian households being eligible recipients. It is also disarmingly cheap, absorbing less than half a per cent of GDP. Bolsa Familia has its critics, from the right, including the Catholic church, and from the left, who see it as too paternalistic and consumerist – the contrapartidas are arguably as much about legitimising the scheme in the eyes of the middle class as improving public health outcomes. There have been the inevitable issues of abuse, enforcement and burden on overtaxed social services. Yet many initial objections have been disarmed: the evidence is against it breeding improvidence or dependency, with the bulk of the money spent on basics such as food and clothing, while three-quarters of adult recipients still undertake paid work. If the disbursement of cash might seem a crude device, a third option has already been run up a mainstream political flagpole. The negative income tax, in which those earning below a set amount receive supplemental pay from the government, has described a zig-zagging intellectual course, one of its most committed advocates being Milton Friedman, who in Capitalism and Freedom (1962) saw it as eliminating the need for a minimum wage, addressing disincentives to work posed by high marginal tax rates, and consolidating overlapping welfare circles. 
A negative income tax was first mooted here by the classical liberals Wolfgang Kasper, Dick Blandy, John Freebairn, Douglas Hocking and Robert O’Neill, who in Australia at the Crossroads (1980) envisioned it being financed by “a reduction in the provision of ‘free’ government services and subsidies to private producers” (they also promoted life-endowment grants: sums of “the order of $20,000” payable to young people aged 15-21 as a start in life or “stake” in society). It was subsequently promoted as an anti-unemployment measure by Freebairn, Peter Dawkins, Ross Garnaut, Michael Keating and Chris Richardson in their Five Economists’ Letter of October 1998. Garnaut, a former senior adviser to prime minister Bob Hawke and an ambassador to China, now a professorial fellow of economics at the University of Melbourne, remains an advocate: “A simple way of making it work would be to give all Australian citizens or residents [an] automatic payment into their bank account every fortnight, subject to a means test, principally an assets test, [and] you’d be taxed at a basic rate from the very first dollar of income … It’s just a much simpler system for employers and you don’t get high marginal tax rates discouraging people on social security as they do now.” Its attractions were noted by the McClure review of welfare in 2000, which recommended a single “common base payment” to people of working age replacing all other pensions and allowances, with a few add-ons for those with disability and housing needs. Tony Abbott, by then the employment minister, thought that it “should be possible to build a consensus for a single working-age payment with supplements based on special needs and participation in the community”. But the push lost momentum: its outgrowth instead was the low income tax offset and, to a degree, the raising of the tax-free threshold. 
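Garnaut’s description (a universal fortnightly payment combined with a flat tax from the first dollar of income) is arithmetically the same scheme Friedman proposed. A minimal sketch in Python makes the mechanics concrete; the payment and tax rate below are hypothetical figures chosen only to show the shape of the scheme, not numbers from any actual proposal:

```python
# Illustrative negative income tax: a flat universal payment plus a flat
# tax on all earned income. BASIC_PAYMENT and FLAT_RATE are assumed
# values for demonstration, not figures from the article.

BASIC_PAYMENT = 500.0   # universal payment per fortnight (hypothetical)
FLAT_RATE = 0.35        # basic tax rate from the first dollar (hypothetical)

def net_transfer(earned_income: float) -> float:
    """Government payment minus tax for one fortnight.

    Positive below the break-even income, negative above it, so support
    tapers smoothly instead of being withdrawn at a cliff.
    """
    return BASIC_PAYMENT - FLAT_RATE * earned_income

def disposable_income(earned_income: float) -> float:
    """Earnings plus the (possibly negative) transfer."""
    return earned_income + net_transfer(earned_income)

# Break-even point: the earnings level at which tax paid equals the payment.
break_even = BASIC_PAYMENT / FLAT_RATE

if __name__ == "__main__":
    for earned in (0.0, 500.0, break_even, 3000.0):
        print(f"earned {earned:8.2f} -> disposable {disposable_income(earned):8.2f}")
```

Below the break-even income the citizen receives a net payment; above it they are a net taxpayer; and the effective marginal rate is the same flat rate everywhere, which is the feature Garnaut points to when he says the scheme avoids the high marginal rates that currently discourage people on social security.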
The only political party in Australia still advocating a negative income tax is the Pirate party, whose platform in the last election included a tax designed to ensure a minimum income of $14,000 for everyone over the age of 18. Whatever transpires, a shift in the paradigm of work poses a challenge to the recently evolved bargain between citizen and state. For much of the last 20 years, Australian governments have explored variations on the theme of shifting costs back to the electorate, coaxing voters along the route of private provision for education, health, infrastructure, services and retirement. This slow but steady heaping of expenses on the household table has exacerbated domestic financial precariousness. “You go underwater very quickly without a wage,” says Megalogenis. “Which suggests that, technically, if your open market model can’t provide your working-age population with a steadily rising income any more, you’ve reached the limits of that cost shifting, especially in education and health. That’s a big problem for the economy overall. It was a shortcoming of laissez-faire capitalism in the 19th century: it could never get its mind around the idea that when it got rid of a worker, it was robbing itself of a consumer. We face something like that problem now.” To Megalogenis, the global financial crisis should be recognised as the day the future began: “The last few years in politics have diverted us from the great lesson of 2008-9. We now have a federal government looking at a structural budget deficit, a Reserve Bank that has lost its discretion on the price of money because global interest rates are now so low and a wages system not rewarding the worker for what the open market has demanded they do for the last 20 years, which is adapt – a model that’s not so much broken as exhausted. 
Yet knowing how successfully the state leaned against the GFC, basically all the levers working exactly as you would wish, you could be pretty confident of it getting things right.” The future requires, he believes, a state more active and more robust in alleviating the cost of day-to-day living. “Before the robots take over, I’d be thinking about where the state can get back in, with the commonwealth as the borrower and the states as the spender,” he says. “Most people want to feel that government is on their side. Not because they want a handout, which is the way it’s frequently been misinterpreted. They want government involved. So the state should be thinking hard about keeping on its books the costs it can bear on behalf of the community. Without an active state you can’t prepare the population for the next big hit, which is coming anyway, whether we’re ready or not.” The welfare nostrums of the Turnbull government? Largely pointless, argues Tim Lyons. “What this mob has returned to is a view that it’s necessary to punish people who are unemployed, to get them in a headlock,” he says. “It’s a recurrent rightwing fantasy of the last 25 years that there’s this vast imaginary army of malingerers and a ton of money to be saved in welfare – the idea that we’re going to save the federal budget by finding a whole bunch of Paxton kids.” In fact, believes Garnaut, the stakes could hardly be higher. “We’re testing how democracy works when wages are stagnant or falling,” he says. “Well, I think we already know how it works, which is badly. In fact, unless we get used to the idea of doing something systematic and non-stigmatising to support the incomes of ordinary people, it may not be viable as a political system.” And the answers will not be found on Bondi beach.

News Article | June 6, 2016

It was the smell that really got to diver Richard Vevers. The smell of death on the reef. “I can’t even tell you how bad I smelt after the dive – the smell of millions of rotting animals.” Vevers is a former advertising executive and is now the chief executive of the Ocean Agency, a not-for-profit company he founded to raise awareness of environmental problems. After diving for 30 years in his spare time, he was compelled to combine his work and hobby when he was struck by the calamities faced by oceans around the world. Chief among them was coral bleaching, caused by climate change. His job these days is rather morbid. He travels the world documenting dead and dying coral reefs, sometimes gathering photographs just ahead of their death, too. With the world now in the midst of the longest and probably worst global coral bleaching event in history, it’s boom time for Vevers. Even with all that experience, he’d never seen anything like the devastation he saw last month around Lizard Island in the northern third of Australia’s spectacular Great Barrier Reef. As part of a project documenting the global bleaching event, he had surveyed Lizard Island, which sits about 90km north of Cooktown in far north Queensland, when it was in full glorious health; then just as it started bleaching this year; then finally a few weeks after the bleaching began. “It was one of the most disgusting sights I’ve ever seen,” he says. “The hard corals were dead and covered in algae, looking like they’ve been dead for years. The soft corals were still dying and the flesh of the animals was decomposing and dripping off the reef structure.” It’s the sort of description that would be hard to believe, if it wasn’t captured in photographs. In images shared exclusively with the Guardian, the catastrophic nature of the current mass bleaching event on previously pristine parts of the Great Barrier Reef can now be revealed. Coral bleaches when the water it’s in is too warm for too long. 
The coral polyps get stressed and spit out the algae that live inside them. Without the colourful algae, the coral flesh becomes transparent, revealing the stark white skeleton beneath. And because the algae provide the coral with 90% of its energy, it begins to starve. Unless the temperatures quickly return to normal, the coral dies and gets taken over by a blanket of seaweed. Once that happens it can take a decade for the coral to recover – and even then that recovery depends on the reef not being hit by other stressors such as water pollution. Vevers’ images show how the once brilliant coral first turned white and then became covered in seaweed. While the hard corals are still holding their structure under the seaweed blanket, the soft corals are dying, dripping off the dead coral skeletons. The thick seaweed is a sign of extreme ecosystem meltdown. Fish can no longer use the coral structure as shelter – blocked by the plants – and before long the coral structures themselves are likely to collapse, leaving little chance of full recovery within the next 10 years. When the coral dies, the entire ecosystem around it transforms. Fish that feed on the coral, use it as shelter, or nibble on the algae that grows among it die or move away. The bigger fish that feed on those fish disappear too. But the cascading effects don’t stop there. Birds that eat fish lose their energy source, and island plants that thrive on bird droppings can be depleted. And, of course, people who rely on reefs for food, income or shelter from waves – some half a billion people worldwide – lose their vital resource. Justin Marshall, a biologist at the University of Queensland who spends a lot of his time studying the reef ecosystem around Lizard Island, says: “What happens is the colony dies, the polyps disintegrate. The algae use that as fertiliser and grow very quickly over the coral head. And at that point it’s doomed. It’s going to break up. 
“It’s like a forest where plants compete for light. On the reef you’ve got this continuous competition between the seaweed and the coral. And, in the conditions we’ve got at the moment, the seaweed tends to win because it’s warm and it’s got lots of rotting stuff around to fertilise it.” Marshall says the thing that struck him about the bleaching event this year was not just the severity but the rapidity of the death. “I was just blown away by that.” Once the seaweed has taken hold, and the structure of the reef is broken up and lost, studies have shown that recovery is slower. Reefs can be lost forever. What’s at stake here is the largest living structure in the world, and by far the largest coral reef system. The oft-repeated cliche is that it can be seen from space, which is not surprising given it stretches more than 2,300km in length and, between its almost 3,000 individual reefs, covers an area about the size of Germany. It is an underwater world of unimaginable scale. But it is up close that the Great Barrier Reef truly astounds. Among its waters live a dizzying array of colourful plants and animals. With 1,600 species of fish, 130 types of sharks and rays, and more than 30 species of whales and dolphins, it is one of the most complex ecosystems on the planet. It begins in the subtropical waters of Hervey Bay in Queensland, about 200km north of Brisbane. From there it stretches the rest of the way up the eastern coast of Australia, stopping just off the coast of Papua New Guinea. About 2 million people visit it each year and together they contribute almost $6bn to the Australian economy. Going back for millennia, Indigenous Australians have relied on the Great Barrier Reef. As the world emerged from the last ice age about 20,000 years ago and sea levels began to rise, Indigenous Australians moved off the area that was once a floodplain and would have watched as today’s Great Barrier Reef formed. 
Today there are more than 70 Indigenous groups with a connection to the reef, many of whom depend on it for their livelihoods. Perhaps most disturbingly, what Marshall and Vevers have witnessed on Lizard Island is in no way unique. In the upper third of the 2,300km reef it’s estimated that about half the coral is dead. Surveys have revealed that 93% of the almost 3,000 individual reefs have been touched by bleaching, and almost a quarter – 22% – of coral over the entire Great Barrier Reef has been killed by this bleaching event. On many reefs around Lizard Island and further north, there is utter devastation. Further south, the bleaching is less severe. Since tourists usually go diving and snorkelling in the middle and southern sections, there are plenty of spectacular corals for them to see there. But they shouldn’t be fooled by that – the reef is in the midst of a major environmental catastrophe. Many scientists are now saying it is almost too late to save it. Strong and immediate action is required to alleviate water pollution and stop the underlying cause: climate change. Australians are being wooed for an upcoming federal election by politicians, most of whom support policies that will guarantee the reef’s destruction. This is the story of the impending death of the world’s largest living structure – whose hand it is dying by, who is staging a cover-up, and how it could be saved. Let’s be completely clear. This is no natural death. And there’s no question about who is to blame. Although bleaching has probably always happened in small patches here and there during unusually warm and calm weather, it used to be extremely rare. The first recorded bleaching was in 1911 on Bird Key Reef in the Florida Keys. It happened during a period of calm, hot weather. Something similar was reported on the Great Barrier Reef in 1929. Then there was not much to speak of for decades. 
There was a smattering of reports – maybe two or three over the next half century – until the year 1979. That year, everything changed. A new phenomenon of “mass bleaching” was seen for the first time, where bleaching would smash large regions, rather than just isolated stretches of coral. In 1979 widespread bleaching was seen stretching throughout the Caribbean and the Florida Keys. And from then there was no turning back. Every year since then, bleaching has been reported somewhere in the world, often on a regional scale. Something that had rarely been seen before was being seen literally every year. Then it was time to go global. Coral reefs right around the world experienced bleaching during the first recorded extreme El Niño, in 1982 and 1983. El Niño is a splurge of warm water that spreads across the Pacific Ocean at irregular intervals, with an average frequency of once every five years. When it does that, it warms the world. An extreme El Niño wreaks havoc on weather patterns around the globe. That splurge of warm water bleached coral on the Great Barrier Reef, through Indonesia, Japan and over to the Caribbean. Then just five years later, during another El Niño, another bleaching event stretched its way around the globe. By then, it was already clear what was causing all this. A paper in 1990 warned these events were being caused by climate change and bleaching “will probably continue and increase until coral-dominated reefs no longer exist”. At that time the 1982 event was described as “the most widespread coral bleaching and mortality in recorded history” but today there is debate about whether the 1982 and 1987 events were severe enough to count as true “global bleaching events”. That hardly matters now. In an age of climate change, records don’t last long. In 1997-98, the world was hit by a second extreme El Niño – the strongest seen to date. 
Figures of how much coral died that year are hard to confirm but it is thought 16% of the world’s reefs were destroyed in a matter of months. About half of those might have been lost forever. Mass bleachings – some global, some not – have continued ever since but, until this year, 1998 held on to the record for the worst yet. That was probably a result of an extended La Niña-like phase that persisted until now: warm water was being buried in the Pacific Ocean, suppressing surface temperatures and keeping bleachings in check. The year 2016 looks set to blow 1998 out of the water. By some measures it’s the longest global bleaching event in history and, on the Great Barrier Reef, it’s definitely the worst. The reef has been hit by at least three significant mass bleachings in recorded history. The first coincided with the global bleaching in 1998, then it got hit in 2002, and then again this year. A Guardian analysis of the three events, based on data from aerial surveys, shows the increasing severity of each event, and how they smashed different parts of the reef. The mechanism behind this incredible new trend is obvious and well understood. As Bloomberg Businessweek famously said on its cover after Hurricane Sandy, “It’s global warming, stupid.” Since 1950 more than 90% of the excess heat our carbon emissions have trapped in the atmosphere has gone into the oceans. As a result their surface temperature has increased by 1C in just the past 35 years. That puts the water much closer to the limit of what coral can bear. Then, when a surge of even warmer water comes through – often as a result of the irregular El Niño cycle – corals over large stretches get stressed, bleach and die. So well understood is the mechanism that satellite data on water temperature is a good proxy for coral bleaching. 
Using that understanding, the US National Oceanic and Atmospheric Administration looks at satellite data and produces “bleaching alerts” that represent a predicted stress response from coral. In data produced exclusively for the Guardian by Mark Eakin, head of Coral Reef Watch at Noaa, we can now reveal exactly how stressful ocean temperatures on the Great Barrier Reef have been increasing over the 34 years that satellite data has been available. Since 1982, just after mass bleachings were seen for the first time, the data shows that the average proportion of the Great Barrier Reef exposed to temperatures where bleaching or death is likely has increased from about 11% a year to about 27% a year. Eakin says looking at that data revealed a clear trend that hadn’t been quantified before. “In seeing that, what it immediately showed was that there was a real background pattern of increasing levels of thermal stress.” Combined with other stressors hitting the reef, this is having a devastating impact. Over that period, half the coral cover on the Great Barrier Reef has been lost – and that’s before the mass bleaching this year is taken into account. That data has limitations – it does not record bleaching directly, but stress inferred from temperature readings. And it lumps extreme levels of stress – like what is being seen around Lizard Island now – in with anything that is expected to cause mortality. Despite that, it reveals the way global warming is leading to more regular bleaching and mortality. “While there was a considerable amount of variability – from El Niños and other things – there was an obvious upward trend in the data,” Eakin says. “So you’re looking at the background warming, which is having a major effect on the corals.” And just looking at the surface temperature of water around the Great Barrier Reef over the past 100 years leaves little doubt about the role of climate change. 
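The bleaching alerts described above rest on an accumulated-heat metric that Coral Reef Watch calls degree heating weeks: heat builds up whenever sea surface temperature runs at least 1C above a site’s hottest normal month, summed over a rolling 12-week window. The Python sketch below is a simplification (the function names are mine, it assumes one SST reading per week where the real products use more frequent satellite readings, and the 4 and 8 degC-week thresholds are the rough levels associated with likely bleaching and likely mortality):

```python
# Simplified sketch of NOAA Coral Reef Watch's "degree heating weeks"
# (DHW) thermal-stress metric. Assumes one SST reading per week; the
# operational products use more frequent satellite observations.

def degree_heating_weeks(weekly_sst, mmm, window=12):
    """Accumulated thermal stress, in degC-weeks, over the last `window` weeks.

    mmm is the site's climatological maximum monthly mean SST. Only
    "HotSpots" of at least 1 degC above the MMM accumulate stress.
    """
    recent = weekly_sst[-window:]
    hotspots = (sst - mmm for sst in recent)
    return sum(h for h in hotspots if h >= 1.0)

def alert_level(dhw):
    """Rough alert thresholds: significant bleaching becomes likely near
    4 degC-weeks, widespread mortality near 8."""
    if dhw >= 8:
        return "mortality likely"
    if dhw >= 4:
        return "bleaching likely"
    return "watch"
```

Six weeks at 1.5C above the maximum monthly mean already accumulates 9 degC-weeks, past the mortality threshold, which shows how quickly a sustained marine heat spike pushes a reef from “watch” into the most severe category.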
Adding to this correlational data, researchers have examined exactly how much more likely the warm conditions on the Great Barrier Reef were as a result of carbon emissions. They ran climate models thousands of times, and simulated a world with human CO2 emissions and a world without them. They found that in a world without humans and their carbon emissions, the conditions on the Great Barrier Reef that caused the current bleaching would have been virtually impossible. Today they’re still unusual, but have been made at least 175 times more likely as a result of our carbon emissions. “In a world without humans, it’s not quite impossible that you’d get March sea surface temperatures as warm as this year, but it’s extremely unlikely,” Andrew King, a lead author of the study from the University of Melbourne, told the Guardian in April. But what was even more concerning was how quickly things are predicted to get worse. “In the current climate it’s unusual but not exceptional. By the mid 2030s it will be average. And beyond that, a year as warm as this one will be cooler than normal.” That means the Great Barrier Reef is likely to be hit with conditions like this, on average, every second year in fewer than 20 years. Many reef biologists approached by the Guardian have said this could mean it’s too late for the Great Barrier Reef. We may have already made its death inevitable. But since there’s still a chance it’s not too late, they all said it was imperative to keep fighting. “Yes, maybe it’s too late,” Marshall told the Guardian. But he said that was no reason not to try to save it. “I’m not going to sit back and buy a Hummer and just let it all slide.” And there have been signs that coral is more resilient than biologists used to think – it might be able to adapt and evolve and, while the weaker corals are probably doomed, maybe the stronger corals will be able to spread and take over. In some places, maybe reefs will even migrate further from the equator. 
These tiny signs of hope are all biologists and conservationists can cling to. “With biology there are always things around the corner that we don’t know,” Marshall says. “These things are fantastically resilient and biologically programmed for survival.” But hope requires action. And there are some powerful forces who don’t want to see light shone on this particular murder. And murder it is: we’ve known for decades that we’re to blame. “It’s the great white lie,” Col McKenzie, the chief executive of the Association of Marine Park Tourism Operators, told a Queensland newspaper in April. “It’s not dead, white and dying. It’s under stress but it will bounce back.” He tells the Guardian he’s furious at the media and at the scientists who have been making a big deal out of the bleaching event: “What I’m seeing is that my industry is being held out for ransom and is the whipping boy for the Greenies who want to be anti-coalmining. And, frankly, I think that’s bloody disgusting.” He represents an industry that, as he puts it, is “tied by the hip pocket to the health of the reef”. In 2011-12 it was estimated tourism centred on the Great Barrier Reef generated $5.7bn for the economy and created 69,000 jobs. McKenzie says the media coverage of the bleaching is a bigger risk to the industry than the bleaching itself. He says people are less likely to visit the reef now that they think it’s in worse condition. Jumping on this concern, the Australian government looks to be doing everything it can to downplay the bleaching. In May the Guardian revealed the Australian department of environment had intervened to have every mention of the Great Barrier Reef – and indeed every mention of the country – scrubbed from the final version of a UN report on climate change and world heritage sites. As a result, Australia was the only continent on the planet not mentioned. 
When confronted with the revelation, the government told the Guardian it did it because: “Recent experience in Australia had shown that negative commentary about the status of world heritage properties impacted on tourism.” The revelation came shortly after Australia’s environment minister, Greg Hunt, told a Queensland newspaper after seeing a David Attenborough documentary about the Great Barrier Reef: “The key point that I had from seeing the first of the three parts is that clearly, the world’s Great Barrier Reef is still the world’s Great Barrier Reef.” The article ran with the headline: “Reports of reef’s death greatly exaggerated: Attenborough.” In fact, Attenborough said that “the Great Barrier Reef is in grave danger”. And later: “The twin perils brought by climate change – an increase in the temperature of the ocean and in its acidity – threaten its very existence.” Then in May and June, these concerns caused a split in the national coral bleaching taskforce, which was set up to monitor the bleaching event. It’s made up of 10 Australian institutions, some of them government agencies and others university research centres, and is led by Terry Hughes from James Cook University. The group was about to release the results of its coral mortality surveys when two leading government agencies pulled out of the announcement. Hughes and his university colleagues released the results anyway, on Monday 30 May, but with only part of the data. They announced that “35% of the corals are now dead or dying” in the “northern and central sections of the Great Barrier Reef”. On Thursday of that week, Col McKenzie went on the attack, saying the results were “utter rubbish”. “It seems that some marine scientists have decided to use the bleaching event to highlight their personal political beliefs and lobby for increased funding in an election year,” he said in a media release. 
The results of surveys from the government agency the Great Barrier Reef Marine Park Authority told a different story, he said. A day later the rest of the results were released by the government agencies. Attached to these was a long media release that aimed to dispel perceived exaggerations of the damage and highlight corals’ ability to recover. Russell Reichelt, the marine park authority’s chairman and chief executive, told the Australian newspaper the agency had split from the group release because it wasn’t telling the whole story. He was quoted as saying that the maps illustrating the coral mortality exaggerated the impact, and that the exaggeration “suits the purpose” of the people sending it out. The story ran on the front page of Australia’s only national newspaper declaring that “activist scientists” were distorting the data. “Marine park head denies coral bleaching crisis,” it screamed. But the authority’s actual data, which revealed a striking 22% of coral on the Great Barrier Reef had been killed, was entirely consistent with the figures released earlier that week from the university partners – something Reichelt later acknowledged on social media. It’s clear that a cabal of climate change deniers, worried tourism operators, and a conservative government have tried to whitewash the environmental disaster unfolding over the Great Barrier Reef. McKenzie is no climate change denier and is quick to agree that climate change has caused the bleaching. But he has taken signs of coral’s adaptability to heart and is sure that the coral will adapt to higher temperatures under climate change. He thinks the reef will be fine. He says the scientists who are making a lot of noise about the bleaching have overstepped a line. “The scientists decided to make some fairly strong statements about the health of the reef – and some fairly outrageous ones at that. I don’t think that’s what science is about. 
I believe scientists should be reporting the facts as they are, not sensationalising the issue.” The fear that the media spotlight on the bleaching will stop people wanting to visit the reef runs deep in the tourism industry. So much so that tour operators have reportedly been routinely refusing to take conservationists, media and politicians to bleached parts of the reef. But that alliance may be breaking down, with some tourism operators on the reef getting worried about its long-term health. “Many tourism operators, they don’t want people not to come to the reef, so they’ve been reluctant to speak out,” says John Rumney, who has run diving and fishing tours on the Great Barrier Reef for the past four decades. “They are worried it will have a negative impact on the short-term cash flow.” Rumney says that’s short-sighted since unless people speak up now there will be no reef in the future, and the industry won’t exist. He and other operators have broken away from the crowd and are speaking out. (McKenzie describes them as “the fringe dwellers of the industry”.) In May the Guardian revealed that a group of more than 170 individuals and businesses in the tourism industry had written an open letter, published in a north Queensland newspaper, urging people to recognise the severity of the bleaching, and begging the government to take stronger action to save the reef. “We are proud of our stewardship of this incredible resource,” they wrote. “We understand its value lies in looking after it. We hope the majority of the reef can recover but Australia must start doing everything it can to tackle the root cause of the coral bleaching, which is global warming.” And, speaking to other tourism operators, it doesn’t appear these people are industry outsiders as McKenzie suggests. Paul Crocombe is the manager of Adrenalin Dive, a business based in Townsville that takes tourists out to see the reef. 
He has been diving on the reef for more than 30 years and has been working in tourism for more than 20. He’s concerned that the media reporting about the bleaching will impact tourist numbers but he acknowledges that it’s important to get the information out. Crocombe says when tourists hear that 93% of the reef has been impacted by bleaching they expect to come and see that it’s all dead. Of course that’s not true. In most of the places tourists go, only about 5% of the coral is likely to die, meaning they’ll hardly see any difference. In 2016 there is no reason for tourists to avoid most areas of the reef. “We were really fortunate this time with the coral bleaching that the majority of the mortality is a long way north of here,” Crocombe says. He’s very aware that if the sort of bleaching that hit Lizard Island and other areas was seen near Townsville or Port Douglas, tourism would have had a major, long-term problem. “With the reporting on the threats to the reef, it has, again, a double-edged sword. I think it’s really important that people do understand that the reef is in danger and that if we don’t do something then, yes, we are going to have a significant impact on the reef. “I think it’s really important that people do understand there are threats to the reef. Currently it is in reasonably good condition but I don’t think it will take a lot to tip it over the edge.” So with more moderate tourism operators speaking out, efforts to hide the reef’s impending death might be failing. As that happens, and the world confronts reality, can the reef be saved? “You either do it properly or you give up on the reef, I think. It’s that bad,” says Jon Brodie from James Cook University. Since 1975 he has studied how to give coral reefs their best chance of surviving the various things thrown at them. The solution to climate change itself is well-rehearsed. It’s not a scientific or technological problem but a political one. And a global one. 
We need to transition away from fossil fuels. That’s a sentiment that chimes with the Guardian’s “Keep it in the ground” campaign. Climate change is the greatest threat facing the Great Barrier Reef and other coral reefs around the world. According to the UN report on climate change that Australia had itself deleted from, and a paper in Nature it cites, a 2C rise in global surface temperatures will result in the loss of more than 95% of coral around the world. If the world limits warming to 1.5C, we might save 10%. If we want to save 50% of what’s around right now, we need to limit warming to just 1.2C – and we’re already more than 80% of the way there. The Australian government has committed to reductions in carbon emissions that aren’t even consistent with limiting warming to 2C. Worse still, the policies in place at the moment are widely acknowledged to be unable to meet even those targets. But to give the Great Barrier Reef a fighting chance of survival in current or future temperatures, it needs to be protected from an array of other assaults it is being hit with. Scientists refer to this as building reef resilience. Nick Graham from James Cook University showed last year that almost 60% of reefs in the Seychelles recovered after they lost 90% of their coral following the 1998 global bleaching event. The reefs that recovered were those that were not being hit with pollution, weren’t being overfished, and that managed to maintain a complex structure. When it comes to the Great Barrier Reef, the biggest threat to resilience is water pollution. It is being increasingly smothered with suspended sediment that blocks light; smeared with fertilisers that cause outbreaks of seaweed and coral-eating crown of thorns starfish; and poisoned with herbicides that kill the coral’s symbiotic algae. 
Compared with what was happening before the 20th century, today there is almost three times as much sediment, about twice as much fertiliser and 17,000 extra kilograms of herbicide washing over the reef each year. Brodie says this needs to be fixed immediately. And the bleaching this year is proof of that. “Climate change is coming on much quicker and stronger than we thought,” he says. “We used to think 2035 was soon enough to fix up water quality but we’ve had to revise that.” Now, he says, if it’s not under control by 2025, it’s game over for the reef. With an election campaign under way in Australia, which will deliver a government for at least three years, many are saying that this election is the last chance to squeeze commitments from politicians that could deliver the resilience the reef needs to survive. So far the current Coalition government of Liberals and Nationals has committed $210m to improve water quality on the reef, and a further $6m to control crown of thorns starfish, if they win the election. The Labor opposition has promised slightly more, with $500m to improve water quality. The Greens, who could hold the balance of power in the next parliament, have so far focused their attention on climate change policies, with a seven-point plan aimed at transitioning Australia away from fossil fuels. According to the best science, none of this is enough. Brodie has written hundreds of papers and technical reports on the issue, and in May published a paper estimating what would be required to get the water to an adequate state by 2025. He said it would take $1bn a year between now and then. As it stands, the major parties have committed to what amounts to tinkering around the edges, he says. A few hundred million here, a few million there. “We know how to do it,” Brodie says. 
“In fact right now we’re spending a little bit of money doing some of it and we have made a little bit of progress with that little bit of money but we just need a lot more.” He adds: “This is the last chance to do it, I think. If we don’t do it soon then we probably shouldn’t bother, really. It’s as bad as that now.” Graphics by Nick Evershed and Ri Liu. Videos by Josh Wall. Opening footage courtesy of Exposure Labs, which is producing a feature film on the effects of climate change on oceans. Michael Slezak reported from Townsville, the Great Barrier Reef and Sydney

News Article | December 1, 2016

They're flexible, cheap to produce and simple to make - which is why perovskites are the hottest new material in solar cell design. And now, engineers at Australia's University of New South Wales in Sydney have smashed the trendy new compound's world efficiency record. Speaking at the Asia-Pacific Solar Research Conference in Canberra on Friday 2 December, Anita Ho-Baillie, a Senior Research Fellow at the Australian Centre for Advanced Photovoltaics (ACAP), announced that her team at UNSW has achieved the highest efficiency rating with the largest perovskite solar cells to date. The 12.1% efficiency rating was for a 16 cm2 perovskite solar cell, the largest single perovskite photovoltaic cell certified with the highest energy conversion efficiency, and was independently confirmed by the international testing centre Newport Corp, in Bozeman, Montana. The new cell is at least 10 times bigger than the current certified high-efficiency perovskite solar cells on record. Her team has also achieved an 18% efficiency rating on a 1.2 cm2 single perovskite cell, and 11.5% for a 16 cm2 four-cell perovskite mini-module, both independently certified by Newport. "This is a very hot area of research, with many teams competing to advance photovoltaic design," said Ho-Baillie. "Perovskites came out of nowhere in 2009, with an efficiency rating of 3.8%, and have since grown in leaps and bounds. These results place UNSW amongst the best groups in the world producing state-of-the-art high-performance perovskite solar cells. And I think we can get to 24% within a year or so." Perovskite is a structured compound, where a hybrid organic-inorganic lead or tin halide-based material acts as the light-harvesting active layer. Perovskite cells are the fastest-advancing solar technology to date, and are attractive because the compound is cheap to produce and simple to manufacture, and can even be sprayed onto surfaces. 
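For a sense of what the quoted efficiency figures mean in electrical terms, a back-of-envelope calculation at standard test conditions (1000 W/m² irradiance, the usual certification baseline) relates efficiency and area to output power. This is a generic efficiency calculation, not UNSW's measurement procedure.

```python
# Output power of a solar cell at standard test conditions (STC).
# STC irradiance of 1000 W/m^2 is the usual certification baseline.

STC_IRRADIANCE_W_PER_M2 = 1000.0

def output_power_w(efficiency, area_cm2):
    """Electrical output of a cell of given efficiency and area at STC."""
    area_m2 = area_cm2 / 1e4
    return efficiency * STC_IRRADIANCE_W_PER_M2 * area_m2

# the record 12.1%-efficient, 16 cm^2 cell delivers roughly 0.19 W;
# the 18%-efficient, 1.2 cm^2 cell roughly 0.022 W
p_large = output_power_w(0.121, 16.0)
p_small = output_power_w(0.180, 1.2)
```

The comparison makes the trade-off in the article concrete: the smaller cell is more efficient per unit area, but the larger cell's record matters because scaling up area without losing efficiency is the hard part.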
"The versatility of solution deposition of perovskite makes it possible to spray-coat, print or paint on solar cells," said Ho-Baillie. "The diversity of chemical compositions also allows cells to be transparent, or made of different colours. Imagine being able to cover every surface of buildings, devices and cars with solar cells." Most of the world's commercial solar cells are made from a refined, highly purified silicon crystal and, like the most efficient commercial silicon cells (known as PERC cells and invented at UNSW), need to be baked above 800°C in multiple high-temperature steps. Perovskites, on the other hand, are made at low temperatures and are 200 times thinner than silicon cells. But although perovskites hold much promise for cost-effective solar energy, they are currently prone to fluctuating temperatures and moisture, making them last only a few months without protection. Along with every other team in the world, Ho-Baillie's is trying to extend its durability. Thanks to what engineers learned from more than 40 years of work with layered silicon, they're confident they can extend this. Nevertheless, there are many existing applications where even disposable low-cost, high-efficiency solar cells could be attractive, such as use in disaster response, device charging and lighting in electricity-poor regions of the world. Perovskite solar cells also have the highest power to weight ratio amongst viable photovoltaic technologies. "We will capitalise on the advantages of perovskites and continue to tackle issues important for commercialisation, like scaling to larger areas and improving cell durability," said Martin Green, Director of the ACAP and Ho-Baillie's mentor. The project's goal is to lift perovskite solar cell efficiency to 26%. The research is part of a collaboration backed by $3.6 million in funding through the Australian Renewable Energy Agency's (ARENA) 'solar excellence' initiative. 
ARENA's CEO Ivor Frischknecht said the achievement demonstrated the importance of supporting early stage renewable energy technologies: "In the future, this world-leading R&D could deliver efficiency wins for households and businesses through rooftop solar as well as for big solar projects like those being advanced through ARENA's investment in large-scale solar." To make a perovskite solar cell, engineers grow crystals into a structure known as 'perovskite', named after the Russian mineralogist Lev Perovski. They first dissolve a selection of compounds in a liquid to make the 'ink', then deposit this on a specialised glass which can conduct electricity. When the ink dries, it leaves behind a thin film that crystallises on top of the glass when mild heat is applied, resulting in a thin layer of perovskite crystals. The tricky part is growing a thin film of perovskite crystals so the resulting solar cell absorbs a maximum amount of light. Worldwide, engineers are working to create smooth and regular layers of perovskite with large crystal grain sizes in order to increase photovoltaic yields. Ho-Baillie, who obtained her PhD at UNSW in 2004, is a former chief engineer for Solar Sailor, an Australian company which integrates solar cells into purpose-designed commercial marine ferries which currently ply waterways in Sydney, Shanghai and Hong Kong. The Australian Centre for Advanced Photovoltaics is a national research collaboration based at UNSW, whose partners are the University of Queensland, Monash University, the Australian National University, the University of Melbourne and the CSIRO Manufacturing Flagship. The collaboration is funded by an annual grant from ARENA, and partners include Arizona State University, Suntech Power and Trina Solar. UNSW's Faculty of Engineering is the powerhouse of engineering research in Australia, comprising nine schools and 21 research centres, and participating in or leading 10 Cooperative Research Centres. 
It is ranked in the world's top 50 engineering faculties, and home to Australia's largest cohort of engineering undergraduate, postgraduate, domestic and international students. UNSW itself is ranked #1 in Australian Research Council funding ($150 million in 2016); ranked #1 in Australia for producing millionaires (#33 globally); and ranked #1 in Australia for graduates who create technology start-ups.

News Article | November 7, 2016

A new comprehensive study of Australian natural hazards paints a picture of increasing heatwaves and extreme bushfires as this century progresses, but with much more uncertainty about the future of storms and rainfall. Published in a special issue of the international journal Climatic Change, the study documents the historical record and projected change of seven natural hazards in Australia: flood; storms (including wind and hail); coastal extremes; drought; heatwave; bushfire; and frost. "Temperature-related hazards, particularly heatwaves and bushfires, are increasing, and projections show a high level of agreement that we will continue to see these hazards become more extreme into the 21st century," says special issue editor Associate Professor Seth Westra, Head of the Intelligent Water Decisions group at the University of Adelaide. "Other hazards, particularly those related to storms and rainfall, are more ambiguous. Cyclones are projected to occur less frequently but when they do occur they may well be more intense. In terms of rainfall-induced floods we have conflicting lines of evidence with some analyses pointing to an increase into the future and others pointing to a decrease. "One thing that became very clear is how much all these hazards are interconnected. For example drought leads to drying out of the land surface, which in turn can lead to increased risk of heat waves and bushfires, while also potentially leading to a decreased risk of flooding." The importance of interlinkages between climate extremes was also noted in the coastal extremes paper: "On the open coast, rising sea levels are increasing the flooding and erosion of storm-induced high waves and storm surges," says CSIRO's Dr Kathleen McInnes, the lead author of the coastal extremes paper. "However, in estuaries where considerable infrastructure resides, rainfall runoff adds to the complexity of extremes." 
This special issue represents a major collaboration of 47 scientists and eleven universities through the Australian Water and Energy Exchange Research Initiative, an Australian research community program. The report's many authors were from the Centre of Excellence for Climate System Science, the CSIRO, Bureau of Meteorology, Australian National University, Curtin University, Monash University, University of Melbourne, University of Western Australia, University of Adelaide, University of Newcastle, University of New South Wales, University of Tasmania and University of Wollongong. The analyses aim to disentangle the effects of climate variability and change on hazards from other factors such as deforestation, increased urbanisation, people living in more vulnerable areas, and higher values of infrastructure. "The study documents our current understanding of the relationship between historical and possible future climatic change with the frequency and severity of Australian natural hazards," says Associate Professor Westra. "These hazards cause multiple impacts on humans and the environment and collectively account for 93% of Australian insured losses, and that does not even include drought losses. "We need robust decision-making that considers the whole range of future scenarios and how our environment may evolve. The biggest risk from climate change is if we continue to plan as though there will be no change. One thing is certain: our environment will continue to change." Some of the key findings from the studies include: • Historical information on the most extreme bushfires -- so-called "mega fires" -- suggests an increased occurrence in recent decades with strong potential for them to increase in frequency in the future. 
Over the past decade major bushfires at the margins of Sydney, Canberra, and Melbourne have burnt more than a million hectares of forests and woodlands and resulted in the loss of more than 200 lives and 4000 homes. • Heatwaves are Australia's most deadly natural hazard, causing 55% of all natural disaster related deaths and increasing trends in heatwave intensity, frequency and duration are projected to continue throughout the 21st century. • The costs of flooding have increased significantly in recent decades, but factors behind this increase include changes in reporting mechanisms, population, land-use, infrastructure as well as extreme rainfall events. The physical size of floods has either not changed at all, or even decreased in many parts of the country.

News Article | December 8, 2016

A child mummy from the 17th century, found in a crypt underneath a Lithuanian church, was discovered to harbor the oldest known sample of the variola virus that causes smallpox. Researchers who sequenced the virus say it could help answer lingering questions about the history of smallpox, including how recently it appeared in humans (perhaps more recently than we thought) and when specific evolutionary events occurred. Their study appears December 8 in Current Biology. "There have been signs that Egyptian mummies that are 3,000 to 4,000 years old have pockmarked scarring that have been interpreted as cases of smallpox," says first author Ana Duggan, a postdoctoral fellow at the McMaster University Ancient DNA Center in Canada. "The new discoveries really throw those findings into question, and they suggest that the timeline of smallpox in human populations might be incorrect." The research team gathered the disintegrated variola virus DNA from the mummy after obtaining permission from the World Health Organization. Using RNA baits designed from existing variola sequences, the researchers targeted variola sequences found within the extracted DNA from the mummy's skin. Then they reconstructed the entire genome of the ancient strain of the virus and compared it to versions of the variola virus genome dating from the mid-1900s and before its eradication in the late 1970s. They concluded that these samples shared a common viral ancestor that originated sometime between 1588 and 1645--dates that coincide with a period of exploration, migration, and colonization that would have helped spread smallpox around the globe. "So now that we have a timeline, we have to ask whether the earlier documented historical evidence of smallpox, which goes back to Ramses V and includes everything up to the 1500s, is real," says co-author Henrik Poinar, the director of the Ancient DNA Centre at McMaster University in Canada. 
"Are these indeed real cases of smallpox, or are these misidentifications, which we know is very easy to do, because it is likely possible to mistake smallpox for chicken pox and measles." In addition to providing a more accurate timeline for the evolution of smallpox, the researchers were also able to identify distinct periods of viral evolution. One of the clearest instances of this occurred around the time that Edward Jenner famously developed his vaccine against the virus in the 18th century. During this period, the variola virus appears to have split into two strains, variola major and variola minor, which suggests that vaccination, which led to eradication of smallpox, may have changed the selection pressures acting on the virus and caused it to split into two strains. The researchers hope to use this work to identify how the sample they discovered in Lithuania compares to others that were sweeping throughout other countries in Europe at the same time. But in the bigger context of smallpox research, the scientists are optimistic that their work will provide a stepping stone to allow virologists to continue to trace smallpox and other DNA viruses back through time. "Now we know all the evolution of the sampled strains dates from 1650, but we still don't know when smallpox first appeared in humans, and we don't know what animal it came from, and we don't know that because we don't have any older historical samples to work with," says co-author Edward Holmes, a professor at the University of Sydney in Australia. "So this does put a new perspective on this very important disease, but it's also showing us that our historical knowledge of viruses is just the tip of the iceberg." 
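The dating logic described above, placing a common viral ancestor between 1588 and 1645, rests on molecular-clock analysis. A simple version is "root-to-tip" regression: plot each sample's genetic divergence from the tree root against its sampling date; the slope estimates the substitution rate and the x-intercept the ancestor's date. The three data points below are invented for illustration and are not the study's actual sequences.

```python
# Sketch of root-to-tip molecular-clock dating: ordinary least squares
# of genetic divergence on sampling year. Hypothetical data only.

def clock_regression(years, divergences):
    """OLS fit; returns (substitution rate per year, estimated root year)."""
    n = len(years)
    mx = sum(years) / n
    my = sum(divergences) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, divergences))
             / sum((x - mx) ** 2 for x in years))
    intercept = my - slope * mx
    tmrca = -intercept / slope    # year at which divergence extrapolates to zero
    return slope, tmrca

# hypothetical samples: (sampling year, substitutions per site from root)
years = [1654, 1950, 1977]
divergences = [0.0040, 0.0336, 0.0363]
rate, root_year = clock_regression(years, divergences)
# rate ≈ 1e-4 substitutions/site/year, root_year ≈ 1614
```

Real analyses use Bayesian phylogenetic methods rather than a single regression, but the principle is the same: older, accurately dated samples like the mummy's anchor the clock and pull the estimated ancestor date into focus.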
This work was supported by the McMaster Ancient DNA Centre at McMaster University, the Department of Virology at the University of Helsinki, the Department of Anatomy, Histology and Anthropology at Vilnius University, the Marie Bashir Institute for Infectious Diseases and Biosecurity, the Department of Biochemistry and Molecular Science and Biotechnology at the University of Melbourne, the Department of History at Duke University, the Department of Biology at McMaster University, UC Irvine, Mycroarray in Michigan, the Department of Chemical Engineering at the University of Michigan, the Center for Microbial Genetics and Genomics at Northern Arizona University, the Laboratoire d'Anthropologie Biologique Paul Broca at the PSL Research University, Helsinki University Hospital, the Department of Forensic Medicine at the University of Helsinki, the Department of Pathology at the University of Cambridge, the Michael G. DeGroote Institute for Infectious Disease Research at McMaster University and the Humans & the Microbiome Program at the Canadian Institute for Advanced Research. Current Biology, Duggan, Marciniak, Poinar, Emery, Poinar et al: "17th Century Variola Virus Reveals the Recent History of Smallpox" Current Biology (@CurrentBiology), published by Cell Press, is a bimonthly journal that features papers across all areas of biology. Current Biology strives to foster communication across fields of biology, both by publishing important findings of general interest and through highly accessible front matter for non-specialists.

Larabell C.A.,University of California at San Francisco | Larabell C.A.,Lawrence Berkeley National Laboratory | Nugent K.A.,University of Melbourne
Current Opinion in Structural Biology | Year: 2010

X-ray imaging of biological samples is progressing rapidly. In this paper we review the progress to date in high-resolution imaging of cellular architecture. In particular we survey the progress in soft X-ray tomography and argue that the field is coming of age and that important biological insights are starting to emerge. We then review the new ideas based on coherent diffraction. These methods are at a much earlier stage of development but, as they eliminate the need for X-ray optics, have the capacity to provide substantially better spatial resolution than zone plate-based methods. © 2010 Elsevier Ltd.

Yang J.,University of Queensland | Zaitlen N.A.,University of California at San Francisco | Goddard M.E.,University of Melbourne | Visscher P.M.,University of Queensland | And 2 more authors.
Nature Genetics | Year: 2014

Mixed linear models are emerging as a method of choice for conducting genetic association studies in humans and other organisms. The advantages of the mixed-linear-model association (MLMA) method include the prevention of false positive associations due to population or relatedness structure and an increase in power obtained through the application of a correction that is specific to this structure. An underappreciated point is that MLMA can also increase power in studies without sample structure by implicitly conditioning on associated loci other than the candidate locus. Numerous variations on the standard MLMA approach have recently been published, with a focus on reducing computational cost. These advances provide researchers applying MLMA methods with many options to choose from, but we caution that MLMA methods are still subject to potential pitfalls. Here we describe and quantify the advantages and pitfalls of MLMA methods as a function of study design and provide recommendations for the application of these methods in practical settings. © 2014 Nature America, Inc.
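The MLMA idea summarised in the abstract can be sketched in a few lines: the candidate SNP enters as a fixed effect, while a genetic relationship matrix (GRM) K captures sample structure as a random effect, giving phenotypic covariance V = σg²K + σe²I. The variance components are taken as known here for brevity; real MLMA tools (e.g. those surveyed in the paper) estimate them by REML, and the identity GRM and simulated data below are purely illustrative.

```python
# Minimal sketch of one mixed-linear-model association test:
# generalised-least-squares effect estimate and Wald statistic
# for a single SNP under V = sigma_g2*K + sigma_e2*I.
import numpy as np

def mlma_score(y, snp, K, sigma_g2, sigma_e2):
    """GLS effect estimate and 1-d.f. chi-square statistic for one SNP."""
    n = len(y)
    V = sigma_g2 * K + sigma_e2 * np.eye(n)
    Vinv = np.linalg.inv(V)
    x = snp - snp.mean()            # centred 0/1/2 genotype coding
    xVx = x @ Vinv @ x
    beta = (x @ Vinv @ y) / xVx     # SNP effect size
    chi2 = beta ** 2 * xVx          # Wald statistic
    return beta, chi2

# toy data: unrelated samples (K = I) with a true SNP effect of 0.5
rng = np.random.default_rng(0)
n = 200
snp = rng.binomial(2, 0.3, n).astype(float)
K = np.eye(n)
y = 0.5 * (snp - snp.mean()) + rng.normal(size=n)
beta, chi2 = mlma_score(y, snp, K, sigma_g2=0.2, sigma_e2=0.8)
```

With a non-trivial K the same formula automatically downweights associations explained by relatedness, which is the "prevention of false positives" the abstract refers to.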

Raghavan V.,University of Melbourne | Veeravalli V.V.,University of Illinois at Urbana - Champaign
IEEE Transactions on Information Theory | Year: 2010

Recent attention in quickest change detection in the multisensor setting has been on the case where the densities of the observations change at the same instant at all the sensors due to the disruption. In this work, a more general scenario is considered where the change propagates across the sensors, and its propagation can be modeled as a Markov process. A centralized, Bayesian version of this problem is considered, with a fusion center that has perfect information about the observations and a priori knowledge of the statistics of the change process. The problem of minimizing the average detection delay subject to false alarm constraints is formulated in a dynamic programming framework. Insights into the structure of the optimal stopping rule are presented. In the limiting case of rare disruptions, it is shown that the structure of the optimal test reduces to thresholding the a posteriori probability of the hypothesis that no change has happened. Under a certain condition on the Kullback-Leibler (K-L) divergence between the post- and the pre-change densities, it is established that the threshold test is asymptotically optimal (in the vanishing false alarm probability regime). It is shown via numerical studies that this low-complexity threshold test results in a substantial improvement in performance over naive tests such as a single-sensor test or a test that incorrectly assumes that the change propagates instantaneously. © 2006 IEEE.
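The structure of the limiting test, thresholding the posterior probability that a change has occurred, can be illustrated in a stripped-down single-sensor setting with an instantaneous change (a hypothetical Shiryaev-style sketch; the paper's multisensor Markov-propagation model is considerably richer, and the densities and parameters here are assumptions):

```python
import math
import random

def posterior_threshold_stop(obs, rho, mu, threshold):
    """Declare a change when the posterior probability that it has
    already occurred crosses `threshold`.

    Toy setup: pre-change obs ~ N(0,1), post-change obs ~ N(mu,1),
    geometric(rho) prior on the change time.
    """
    p = 0.0                                     # P(change has occurred | data)
    for k, y in enumerate(obs, start=1):
        p_pred = p + (1.0 - p) * rho            # prior mass added this step
        lr = math.exp(mu * y - 0.5 * mu * mu)   # likelihood ratio f1(y)/f0(y)
        p = p_pred * lr / (p_pred * lr + (1.0 - p_pred))
        if p >= threshold:                      # threshold the posterior
            return k, p
    return None, p

random.seed(1)
change = 50
data = [random.gauss(0, 1) for _ in range(change)] + \
       [random.gauss(2, 1) for _ in range(100)]
stop, post = posterior_threshold_stop(data, rho=0.01, mu=2.0, threshold=0.99)
```

Raising the threshold toward 1 trades a longer detection delay for a smaller false alarm probability, which is the regime in which the paper establishes asymptotic optimality.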

Huang M.,Carleton University | Manton J.H.,University of Melbourne
IEEE Transactions on Automatic Control | Year: 2010

We consider consensus seeking of networked agents on directed graphs where each agent has only noisy measurements of its neighbors' states. Stochastic approximation type algorithms are employed so that the individual states converge both in mean square and almost surely to the same limit. We further generalize the algorithm to networks with random link failures and prove convergence results. © 2009 IEEE.
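A minimal simulation of stochastic-approximation consensus under noisy neighbor measurements (an illustrative sketch, not the paper's algorithm; the directed ring graph, the step-size schedule, and the noise level are all assumptions):

```python
import random

def noisy_consensus(adj, x0, steps=5000, sigma=0.5):
    """Each agent moves toward noisy measurements of its neighbors' states
    with a decreasing gain a_k = 1/(k+1), so that sum a_k = inf while
    sum a_k^2 < inf; the decaying gain averages out the measurement noise.
    """
    x = list(x0)
    n = len(x)
    for k in range(steps):
        a = 1.0 / (k + 1)
        new = []
        for i in range(n):
            # each neighbor's state is observed through additive noise
            drift = sum((x[j] + random.gauss(0, sigma)) - x[i]
                        for j in adj[i])
            new.append(x[i] + a * drift / max(len(adj[i]), 1))
        x = new
    return x

random.seed(0)
# directed ring: agent i observes only agent (i+1) mod n
n = 6
adj = {i: [(i + 1) % n] for i in range(n)}
x = noisy_consensus(adj, [float(i) for i in range(n)])
spread = max(x) - min(x)
```

With a constant step size the noise would keep the states fluctuating forever; the diminishing step size is what allows the mean-square and almost-sure convergence to a common limit described in the abstract.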

Fusar-Poli P.,King's College London | Nelson B.,University of Melbourne | Valmaggia L.,King's College London | Yung A.R.,University of Melbourne | McGuire P.K.,King's College London
Schizophrenia Bulletin | Year: 2014

Background: The current diagnostic system for subjects at enhanced clinical risk of psychosis allows concurrent comorbid diagnoses of anxiety and depressive disorders. Their impact on the presenting high-risk psychopathology, functioning, and transition outcomes has not been widely researched. Methods: In a large sample of subjects with an At-Risk Mental State (ARMS, n = 509), we estimated the prevalence of DSM/SCID anxiety or depressive disorders and their impact on psychopathology, functioning, and psychosis transition. A meta-analytical review of the literature complemented the analysis. Results: About 73% of ARMS subjects had a comorbid axis I diagnosis in addition to the "at-risk" signs and symptoms. About 40% of ARMS subjects had a comorbid diagnosis of depressive disorder while anxiety disorders were less frequent (8%). The meta-analysis conducted in 1683 high-risk subjects confirmed that baseline prevalence of comorbid depressive and anxiety disorders is respectively 41% and 15%. At a psychopathological level, comorbid diagnoses of anxiety or depression were associated with higher suicidality or self-harm behaviors, disorganized/odd/stigmatizing behavior, and avolition/apathy. Comorbid anxiety and depressive diagnoses were also associated with impaired global functioning but had no effect on risk of transition to frank psychosis. Meta-regression analyses confirmed no effect of baseline anxiety and/or depressive comorbid diagnoses on transition to psychosis. Conclusions: The ARMS patients are characterized by high prevalence of anxiety and depressive disorders in addition to their attenuated psychotic symptoms. These symptoms may reflect core emotional dysregulation processes and delusional mood in prodromal psychosis. Anxiety and depressive symptoms are likely to impact the ongoing psychopathology, the global functioning, and the overall longitudinal outcome of these patients. © 2012 The Author.

Kentish S.,University of Melbourne | Feng H.,University of Illinois at Urbana - Champaign
Annual Review of Food Science and Technology | Year: 2014

Acoustic energy as a form of physical energy has drawn the interest of both industrial and scientific communities for its potential use as a food processing and preservation tool. Currently, most such applications deal with ultrasonic waves with relatively high intensities and acoustic power densities and are performed mostly in liquids. In this review, we briefly discuss the fundamentals of power ultrasound. We then summarize the physical and chemical effects of power ultrasound treatments based on the actions of acoustic cavitation and by looking into several ultrasound-assisted unit operations. Finally, we examine the biological effects of ultrasonication by focusing on its interactions with the miniature biological systems present in foods, i.e., microorganisms and food enzymes, as well as with selected macrobiological components. Copyright © 2014 by Annual Reviews.

Hurlimann A.,University of Melbourne | Dolnicar S.,University of Wollongong
Water Research | Year: 2010

Located approximately 100 km west of Brisbane, Toowoomba is home to approximately 95,000 people. Surface water from dams is the main source of water for the city. In 2006 the residents of Toowoomba were invited to vote in a referendum (plebiscite) concerning whether or not an indirect potable wastewater reuse scheme should be constructed to supply additional water to the area. At that stage dam levels in Toowoomba were at approximately twenty percent of capacity. Toowoomba residents, after intense campaigning on both sides of the referendum debate, voted against the proposal. In July 2008 dam levels dropped to eleven percent. Stage 5 water restrictions, under which mains water must not be used for any outdoor purposes, have been in place since September 2006. This paper describes in detail how public opposition in the case of Toowoomba's referendum defeated the proposal for a water augmentation solution. Reasons for the failure are analysed. In so doing, the paper provides valuable insights with respect to public participation in indirect potable reuse proposals, and discusses factors including politics, vested interests and information manipulation. This paper is significant because of the lack of detailed information published about failed water infrastructure projects. © 2009 Elsevier Ltd. All rights reserved.

Moss R.,University Paris - Sud | Thomas S.R.,University of Melbourne
American Journal of Physiology - Renal Physiology | Year: 2014

We present a lumped-nephron model that explicitly represents the main features of the underlying physiology, incorporating the major hormonal regulatory effects on both tubular and vascular function, and that accurately simulates hormonal regulation of renal salt and water excretion. This is the first model to explicitly couple glomerulovascular and medullary dynamics, and it is much more detailed in structure than existing whole organ models and renal portions of multiorgan models. In contrast to previous medullary models, which have only considered the antidiuretic state, our model is able to regulate water and sodium excretion over a variety of experimental conditions in good agreement with data from experimental studies of the rat. Since the properties of the vasculature and epithelia are explicitly represented, they can be altered to simulate pathophysiological conditions and pharmacological interventions. The model serves as an appropriate starting point for simulations of physiological, pathophysiological, and pharmacological renal conditions and for exploring the relationship between the extrarenal environment and renal excretory function in physiological and pathophysiological contexts. © 2014 the American Physiological Society.

Agency: Cordis | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2009-2.2.1-2 | Award Amount: 15.03M | Year: 2010

The aim of EU-GEI is to identify the interactive genetic, clinical and environmental determinants involved in the development, severity and outcome of schizophrenia (EU-GEI, Schiz. Res. 2008; 102: 21-6). In order to identify these interactive determinants, EU-GEI will employ family-based, multidisciplinary research paradigms, which allow for the efficient assessment of gene-environment interactions. In order to go beyond old findings from historical convenience cohorts with crude measures of environmental factors and clinical outcomes, the focus in EU-GEI will be on recruitment of new, family-based clinical samples with state-of-the-art assessments of environmental, clinical and genetic determinants as well as their underlying neural and behavioural mechanisms. New statistical tools will be developed to combine the latest multilevel epidemiological approaches with the latest genome-wide genetic approaches to analysis. Translation of results to clinical practice will be facilitated by additional experimental research and risk assessment bioinformatics approaches. This will result in the identification of modifiable biological and cognitive mechanisms underlying gene-environment interactions and the construction of Risk Assessment Charts and Momentary Assessment Technology tools which can be used for (i) early prediction of transition to psychotic disorder in help-seeking individuals with an at-risk mental state and (ii) early prediction of course and outcome after illness onset. In order to reach these goals, EU-GEI has assembled a multidisciplinary team of top schizophrenia researchers who have the range of skills required to deliver a program of research that meets all the call's requirements and who have access to / will collect a number of unique European samples. The partners in EU-GEI represent the nationally funded schizophrenia / mental health networks of the UK, Netherlands, France, Spain, Turkey and Germany as well as other partners.

Agency: Cordis | Branch: FP7 | Program: CP-SICA | Phase: HEALTH-2007-2.3.2-12 | Award Amount: 3.25M | Year: 2009

The protein synthesis machinery represents one of the most useful targets for the development of new anti-infectives. Several families of broadly used antibiotics (tetracyclines, macrolides, and novel glycopeptides like vancomycin, among others) exert their function by blocking the protein synthesis machinery. Doxycycline, a tetracycline antibiotic, remains a useful tool for the prevention of malaria among travellers, despite its numerous secondary effects. And yet, very little is known about the specifics of the protein synthesis machinery in Plasmodium. A search of articles in the PubMed library with the words Plasmodium and ribosome/ribosomal in their titles will yield 6 publications since the year 2000. Only one article contains the words tRNA (or transfer RNA) and Plasmodium in its title, in the same period. And only one article in PubMed (Snewin et al., 1996) contains the words Plasmodium and tRNA synthetase (or ligase) in its title. This lack of information about this central metabolic pathway in Plasmodium clearly blocks the possibility of transferring the knowledge in protein synthesis to the development of new anti-malarial drugs directed against the translational machinery of the parasite. Thus, the study of components of the genetic code in Plasmodium has the potential for providing new and important information on the biology of the parasite and, more importantly, for opening new leads for the development of novel anti-malarials. This proposal coordinates an effort to study tRNA biology in Plasmodium falciparum. It contains specific schemes for the development of new pharmacological screens, several initiatives for the selection of new potential anti-malarial drugs, and projects designed to answer fundamental questions regarding protein synthesis in Plasmodium. The laboratories in MEPHITIS have accumulated a large body of experience in the biology of this parasite, and in different aspects of tRNA biology in model species.

Agency: Cordis | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.2.2.1-2 | Award Amount: 8.14M | Year: 2014

Neuroimaging (NI) has enormous potential to improve the clinical care of patients with psychiatric disorders, but has yet to deliver. The PSYSCAN project will address this issue directly by developing a NI-based tool that will help clinicians resolve key clinical issues in the management of patients with psychotic disorders. Clinicians will use the tool to assess patients with a standardised set of NI and complementary demographic, clinical, cognitive, and genetic measures. The clinician will enter data on to an iPad, and these data, along with NI data, will be electronically transferred to a central facility for analysis. Key features of the analysis include the assessment of NI data at a network level, the integration of NI and non-NI data, and the use of machine learning methods to make predictions specific to the patient being assessed. The results will be delivered to the clinician's iPad and will indicate the likelihood of a given clinical or functional outcome. The tool will have 3 clinical applications. PSYSCAN-Predict will facilitate prediction of the onset of psychosis in high risk subjects. PSYSCAN-Stratify will aid early diagnosis and the stratification of patients with first episode psychosis according to future course and outcome. PSYSCAN-Monitor will allow clinicians to measure progression of the disorder over time. Once developed, the tool will be validated in 2 large scale naturalistic studies using the consortium's extensive network of centres. The validated tool will then be disseminated to clinical centres across the EU. The PSYSCAN project involves a world-class consortium of experts on NI and psychiatry that unites academic centres, SMEs with image processing and computerised testing expertise, a large medical device provider, and the pharmaceutical industry. The consortium is thus ideally suited to translating expertise and knowledge in NI to build a tool that can be used to improve the care of patients with psychiatric disorders.

Agency: Cordis | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2009-2.2.1-4 | Award Amount: 4.19M | Year: 2010

Brain diseases are one of the most prevalent groups of diseases in Europe with estimated annual costs amounting to €386 billion (1). Data collected by the WHO suggest that brain diseases are responsible for 35% of Europe's total disease burden (1). In the treatment of neurological disease, the blood brain barrier (BBB) still represents an obstacle for the delivery of drugs to the brain and thus a major challenge for the development of therapeutic regimens. Understanding the molecular basis and functioning of the BBB in health and disease, including transport mechanisms across the BBB, therefore holds significant potential for future strategies to prevent and ameliorate neurological disease. Recent research indicates that some neurological disorders have a developmental etiologic component. The major goal of the NEUROBID project is thus to understand the molecular mechanisms and function of the BBB in health and disease both in the developing brain and the adult central nervous system. With an interdisciplinary consortium from the fields of developmental neurobiology and BBB research, NEUROBID aims to (i) understand the involvement of normal and disturbed BBB function in normal and abnormal brain development and (ii) to develop novel strategies for drug delivery to the brain. Unique transport mechanisms across the BBB will be used to target potential therapeutic macromolecular and cellular agents specifically to the brain barriers and transport them into the brain. The main target disorders of NEUROBID are non-inherited neurodevelopmental disorders arising from perinatal adverse exposure, such as cerebral palsy, and classic adult neurological disorders such as multiple sclerosis and stroke. In the long term, NEUROBID hopes to pave the way for new treatment strategies and thus reduce the economic and social burden of neurological disease. 1. Olesen J, et al. Consensus document on European brain research. J Neurol Neurosurg Psychiatry 2006;77 Suppl 1:i1-49.

Agency: Cordis | Branch: H2020 | Program: RIA | Phase: PHC-09-2015 | Award Amount: 24.09M | Year: 2015

HIV-1 is responsible for a global pandemic of 35 million people, and continues to spread at a rate of >2 million new infections/year. It is widely acknowledged that a protective vaccine would be the most effective means to reduce HIV-1 spread and ultimately eliminate the pandemic, while a therapeutic vaccine may help mitigate the clinical course of disease and lead to strategies of viral eradication. However, despite 30 years of research, we do not have a vaccine capable of protecting from HIV-1 infection or impacting on disease progression. This in part represents the challenge of identifying immunogens and vaccine modalities with reduced risk of failure in late stage development. To overcome this bottleneck some of the most competitive research groups in vaccine discovery from European public institutions and biotechs from 9 EU countries, together with top Australian and Canadian groups and US collaborators, have agreed to join forces in EAVI, providing a pool of international expertise at the highest level. EAVI2020 will provide a platform for the discovery and selection of several new, diverse and novel preventive and/or therapeutic vaccine candidates for HIV/AIDS. Emphasis will be placed on early rapid, iterative, small experimental medicine (EM) human vaccine studies to select and refine the best immunogens, adjuvants, vectors, homologous and heterologous prime-boost schedules, and determine the impact of host factors such as gender and genetics. Animal models will be used to complement human studies, and to select novel immunization technologies to be advanced to the clinic. To shift the risk curve in product development we will develop innovative risk prediction methods, specifically designed to reduce the risk associated with late stage preventive or therapeutic vaccine failure, increasing the chance of discovery of an effective vaccine.

Fusar-Poli P.,King's College London | Berger G.,University of Melbourne
Journal of Clinical Psychopharmacology | Year: 2012

BACKGROUND: Omega-3 fatty acids, in particular eicosapentaenoic acid (EPA), have been suggested as augmentation strategies in the treatment of schizophrenia and related psychosis. Published results are conflicting, and the antipsychotic efficacy of such augmentation strategies is not well established. METHODS: Double-blind, randomized, placebo-controlled studies using purified or EPA-enriched oils in established schizophrenia were included in a meta-analysis. The effect size of EPA on psychotic symptoms was measured using Hedges' g. Publication bias was assessed with funnel plots and Egger's intercept. Heterogeneity was assessed with the Q statistic and the I² index. Influence of moderators was assessed with meta-regression analyses in Comprehensive Meta-analysis Software version 2. RESULTS: The database included 167 schizophrenic subjects under the placebo arm (mean age, 37 [SD, 9.7] years; 37% females) matched with 168 schizophrenic subjects under the EPA arm (mean age, 37 [SD, 7.9] years; 36% females) (t tests P > 0.05). Meta-analysis showed no consistent significant effect for the EPA augmentation on psychotic symptoms (Hedges' g = 0.242; 95% confidence interval, -0.028 to 0.512; Z = 1.7531; P > 0.05). There were no significant effects for moderator variables such as age, sex, and EPA dose used in the trials. Heterogeneity across studies was small and statistically nonsignificant (Q = 9.06; P = 0.170; I² = 33.81). CONCLUSIONS: Meta-analysis of randomized controlled trials on symptomatic outcome revealed no beneficial effect of EPA augmentation in established schizophrenia. However, no conclusion can be made for medium- to long-term effects of EPA in schizophrenia, in particular on relapse prevention in the early course of psychotic disorders. Copyright © 2012 by Lippincott Williams & Wilkins.
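The summary statistics reported in this abstract (Hedges' g with its confidence interval, Cochran's Q, and the I² index) can be computed from study-level summaries as follows. This is a generic fixed-effect sketch with made-up study numbers, not the trial data analyzed in the meta-analysis:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)              # correction factor J
    g = j * d
    var = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var

def fixed_effect_meta(studies):
    """Inverse-variance pooling plus Cochran's Q and the I^2 index."""
    gs, vs = zip(*(hedges_g(*s) for s in studies))
    ws = [1 / v for v in vs]                     # inverse-variance weights
    pooled = sum(w * g for w, g in zip(ws, gs)) / sum(ws)
    q = sum(w * (g - pooled) ** 2 for w, g in zip(ws, gs))
    df = len(studies) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    se = math.sqrt(1 / sum(ws))
    return pooled, se, q, i2

# hypothetical EPA-vs-placebo studies: (mean1, sd1, n1, mean2, sd2, n2)
studies = [(52.0, 10.0, 30, 50.0, 11.0, 28),
           (48.5, 9.0, 40, 49.0, 9.5, 41),
           (55.0, 12.0, 25, 53.5, 12.5, 26)]
pooled, se, q, i2 = fixed_effect_meta(studies)
z = pooled / se                                  # Z statistic for the pooled g
```

The 95% confidence interval is then pooled ± 1.96·se; when it straddles zero, as in the abstract, the pooled effect is not significant at P < 0.05.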

News Article | February 22, 2017

Institute scientists have revealed that a potent inflammatory molecule released by dying cells triggers inflammation during necroptosis, a recently described form of cell death linked to inflammatory disease. The discovery could lead to new and existing medicines that target the molecule being investigated as a way of treating inflammatory diseases, such as psoriasis and inflammatory bowel disease. Dr Lisa Lindqvist, Dr Kate Lawlor, Dr James Vince and PhD student Ms Stephanie Conos led research that showed interleukin-1 beta (IL-1β) triggers inflammation during necroptotic cell death. Necroptosis is important for protecting us against infections, by sacrificing infected or diseased cells 'for the greater good'. However, necroptosis can become inappropriately or excessively activated, triggering damaging inflammation that leads to inflammatory disease. Dr Lindqvist said the discovery challenged a long-standing dogma that inflammation triggered by necroptosis was a byproduct of dead cell debris. "Our research has pinpointed that, during necroptosis, dying cells release IL-1β, a potent inflammatory signal," Dr Lindqvist said. "Now that we have discovered IL-1β is the 'root' of the inflammation associated with necroptosis, we speculate that targeting this molecule could be an effective way of treating inflammatory diseases." The findings suggest that targeting IL-1β could suppress inflammation associated with multiple inflammatory diseases, including multiple sclerosis, ischemia-reperfusion injury, atherosclerosis, liver disease, pancreatitis, psoriasis, inflammatory bowel disease, and infectious diseases. "Our research suggests that existing drugs that block IL-1β might be useful in treating these diseases," Dr Lindqvist said.
"We are also exploring how IL-1β is signalled to be secreted during necroptosis, so that we can create new drugs to stop its release and reduce inflammation to treat inflammatory diseases." The research was an international effort with collaborators Dr Kate Schroder and Dr Kaiwen Chen at the University of Queensland, and Professor Gabriel Núñez from the University of Michigan Medical School, US. The research was supported by the Australian National Health and Medical Research Council, the Australian Research Council, the US National Institutes of Health, the Australian Government and the Victorian State Government Operational Infrastructure Scheme. Ms Conos' PhD studies were conducted at the Walter and Eliza Hall Institute, which serves as the University of Melbourne's Department of Medical Biology. The research findings were published in the journal Proceedings of the National Academy of Sciences.

News Article | November 22, 2016

UNIONVILLE, PA, November 22, 2016 -- Charles Coyne has been included in Marquis Who's Who. As in all Marquis Who's Who biographical volumes, individuals profiled are selected on the basis of current reference value. Factors such as position, noteworthy accomplishments, visibility, and prominence in a field are all taken into account during the selection process. With nearly four and a half decades of practiced industry experience, Mr. Coyne is uniquely qualified to manage a variety of tasks on behalf of Obermayer Rebmann Maxwell & Hippel LLP, where he has served as Of Counsel since 2007. Before entering the field in a professional capacity, he was an AIESEC exchange student through the University of Melbourne, earned a Bachelor of Science in economics from The Wharton School at The University of Pennsylvania, became an intern through the General Services Administration, and obtained a JD from Temple University. In addition to serving Obermayer for nearly 10 years, Mr. Coyne has also sat on the board of directors for George S. Coyne Chemical Co., Inc. since 1973. Some of his other professional roles have included developing simultaneous filings of adversary proceedings, establishing limitations on the jurisdiction of bankruptcy court to collect pre-petition accounts receivable, forming the creditors committee at Alan Wood Steel Co., and serving as counsel at Penn Central Bankruptcy and Mid-Manhattan Properties. A member of the American and Pennsylvania Bar Associations, the Pennsylvania Republican Committee Common Wealth Club, Quaker City Farmers, and the Kappa Alpha Society, Mr. Coyne has achieved much throughout his career. He supplemented his other professional endeavors by parlaying his extensive knowledge into numerous creative works through positions as a columnist for Life in the County, part of the Ledger Newspaper Group, for six years and an associate editor of the Temple Law Review for one year. In recognition of his hard work and dedication, Mr.
Coyne received the Distinguished Young Republican Award in 1976, and had the honor of being named to Who's Who in Finance and Industry, Who's Who in American Law, Who's Who in America, and Who's Who in the World. During his spare time, Mr. Coyne enjoyed giving back to his community through the Pennsylvania Hunt Cup, where he was a member of the racing committee from 1992 until 1997, and as a Republican candidate for state representative in 1976. He was also a People to People ambassador to Brazil in 2004. In the coming years, Mr. Coyne intends to continue giving back and growing in his career. For more information about Charles Coyne and Obermayer Rebmann Maxwell & Hippel LLP, please visit . For more information about George S. Coyne Chemical Co., Inc., please visit . About Marquis Who's Who: Since 1899, when A. N. Marquis printed the First Edition of Who's Who in America, Marquis Who's Who has chronicled the lives of the most accomplished individuals and innovators from every significant field of endeavor, including politics, business, medicine, law, education, art, religion and entertainment. Today, Who's Who in America remains an essential biographical source for thousands of researchers, journalists, librarians and executive search firms around the world. Marquis now publishes many Who's Who titles, including Who's Who in America, Who's Who in the World, Who's Who in American Law, Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in Asia. Marquis publications may be visited at the official Marquis Who's Who website at

News Article | December 8, 2016

Australia’s top climate scientists have come out in support of their American counterparts, in response to news that the incoming Trump Administration will scrap climate research at the country’s top research facility, NASA. Trump’s senior advisor on NASA, Bob Walker, announced plans to strip NASA’s Earth science division of funding on Wednesday, in a crackdown on what his team refers to as “politicised science”. The policy – and the language used to frame it – would be all too familiar to Australian climate scientists, who faced a similar attack on funding and staff of the world-leading CSIRO climate department, and the dismantling of the Climate Commission. In defense of the CSIRO cuts, the Organisation’s ex-venture capitalist CEO Larry Marshall said the national climate change discussion was “more like religion than science.” Here’s what Australia’s scientists are saying about Trump and NASA… “Just as we have seen in Australia the attack on CSIRO climate science under the Coalition government, we now see the incoming Trump administration attacking NASA,” said Professor Ian Lowe, Emeritus Professor of Science, Technology and Society at Griffith University and a former President of the Australian Conservation Foundation. “They obviously hope that pressure for action will be eased if the science is muffled. “But with temperatures in the Arctic this week a startling 20 degrees above normal, no amount of waffle can disguise the need for urgent action to decarbonise our energy supply and immediately withdraw support for new coal mines,” Prof Lowe said. “Why a world leader in Earth observation should do this is beyond rational explanation,” said David Bowman, a “fire scientist” and Professor of Environmental Change Biology at The University of Tasmania.
“Earth observation is a non-negotiable requirement for effective, sustainable fire management and it will be provided by other sources if the US proceeds with this path, such as Europe, Japan and China,” Prof Bowman said. “So, effectively the US would be ceding intellectual ‘real estate’ to other nations that could quickly become dominant providers of essential information on fire activity.” Dr Megan Saunders, a Research Fellow in the School of Geography, Planning and Environmental Management & Centre for Biodiversity and Conservation Science at The University of Queensland, said scrapping funding to climate research in NASA would be devastating. “Climate change is already causing significant disruptions to the earth system on which humanity relies, and urgent action on climate change is required around the globe. Cutting funding to NASA compromises our ability to cope with climate change and sends a message that climate change is not being taken seriously,” Dr Saunders said. “In many instances symptoms of climate change are occurring faster than predicted by models. For instance, NASA’s temperature records have shown that September 2016 was the warmest in 136 years of modern record keeping. NASA’s research on sea-level rise demonstrated that sea-level rise in the 21st century was greater than previously understood. NASA research in West Antarctica identified the fastest rates of glacier retreat ever observed.” Dr Liz Hanna, fellow of the National Centre for Epidemiology & Population Health at the Australian National University, and National Convenor Climate Change Adaptation Research Network for Human Health said that shutting down the science would not stop climate change. “All it will do is render people, communities and societies unprepared at even greater risk. …If Trump does not care about people’s lives, perhaps he might consider the drop in productivity that inevitably tracks temperature increases,” she said.
“My advice to president-elect Trump is to look beyond his advisor Bob Walker’s comments and see exactly the important work done by the NASA Earth science division,” said Dr Helen McGregor, an ARC Future Fellow in the School of Earth Sciences and Environmental Sciences at the University of Wollongong. “This is not ‘politically correct environmental monitoring’ as Walker asserts but is essential data to ensure society’s health and wellbeing. “As for climate change science, the division’s reports on global temperatures are solely based on robust data. What’s being politicised here is not the science but the story that the science tells: that the planet is warming. Let’s not shoot the messenger,” Dr McGregor said. “Will Mr Trump be taking his electorate with him once he’s finished with Earth?” asked Dr Paul Read, a Research Fellow in Natural Disasters at the University of Melbourne’s Sustainable Society Institute. “Mr Trump is about 10 years behind the public understanding of climate science, much less the scientific consensus. As the climate hits home here on Earth, his own support base could turn on him like a snake with whiplash.”

SINGAPORE, Dec. 7, 2016 /PRNewswire/ -- Carmentix Private Limited ("Carmentix") and the University of Melbourne are proud to announce the "Preterm Birth Biomarker Discovery" initiative. The aim of this collaborative clinical study is to validate novel biomarkers discovered by Carmentix, together with biomarkers previously discovered and validated at the University of Melbourne, in a combined panel, and to assess the risk of preterm birth as early as 20 weeks of gestation. The retrospective study, led by Dr. Harry Georgiou, PhD, and Dr. Megan Di Quinzio, MD, at the University of Melbourne, will validate the statistical strength of the novel biomarker panel. "Carmentix is excited to begin this collaboration, as we are keen to further develop the biomarkers discovered on our unique data mining platform," said Dr. Nir Arbel, CEO of Carmentix. "If validated, this new panel of biomarkers may offer hope of significantly reducing the number of preterm birth cases on a global scale." Clinical obstetrician and researcher Dr. Di Quinzio frequently sees mothers asking "why was my baby born prematurely?" There is often no satisfactory answer. "Preterm birth continues to be a global health problem but sadly, reliable diagnostic tools are lacking," said Dr. Georgiou, scientific leader at the University of Melbourne. "This collaborative initiative with a strong commercial partner will help pave the way for a novel approach to better diagnosis and, hopefully, the prevention of preterm labour." Carmentix is an Esco Ventures-backed startup company based in Singapore. Carmentix is developing a novel prognostic biomarker panel to significantly reduce the number of preterm birth cases by establishing biomolecular tools that will alert clinicians to preterm birth risk weeks before symptoms occur. Carmentix's technology relies on multiple-pathway analysis utilizing a unique panel of biomarkers.
This panel of proprietary markers is intended to allow the prediction of preterm birth at 16-20 weeks of gestation, with a high-accuracy predictive algorithm anticipated owing to its coverage of bottleneck molecular processes involved in preterm birth. Carmentix's goal is a cost-effective solution that is robust, accurate and able to accommodate clinical settings worldwide. About the University of Melbourne and its commercialisation initiatives: The University of Melbourne is Australia's best and one of the world's leading universities. As an R&D hub with world-leading specialists in science, technology and medicine, Melbourne undertakes cutting-edge research to create new ways of thinking, new technology and new expertise to build a better future. World-class research, real-world solutions: the University of Melbourne embraces a culture of innovation -- working with industry, government, non-governmental organisations and the community to solve real-world challenges. Our commercial partnerships bring research to life through collaboration in areas of bio-engineering, materials development, medical technology innovation, community capacity development and cultural entrepreneurship. Ground-breaking commercialised technology created at the University of Melbourne includes the cochlear implant, the stentrode (a device that enables mind control over computers, robotic limbs or exoskeletons), and novel anti-fibrotic drug candidates for the treatment of fibrosis (prevalent in chronic conditions such as chronic kidney disease, chronic heart failure, pulmonary fibrosis and arthritis). The University of Melbourne is closely partnered with the Peter Doherty Institute for Infection and Immunity, the Walter and Eliza Hall Institute, CSIRO, CSL, and The Royal Melbourne, Royal Children's and Royal Women's Hospitals.
With over 160 years of leadership in education and research, the University responds to immediate and future challenges facing our society through innovation in research. The University of Melbourne is ranked first in Australia and 31st in the world (Times Higher Education World University Rankings 2015-2016).
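The release does not disclose how Carmentix's proprietary algorithm combines its markers into a risk estimate. As a purely generic illustration of the underlying idea, the sketch below fits a logistic regression that maps a small biomarker panel to a single risk probability; the three marker variables, their coefficients, and the synthetic cohort are all invented for this example.

```python
# Generic sketch: combining a biomarker panel into one risk probability
# via logistic regression. Marker names, effect sizes, and data are
# invented; this is NOT Carmentix's actual (proprietary) algorithm.
import math
import random

random.seed(0)

def synth_sample():
    """One synthetic patient: three hypothetical marker levels and an outcome."""
    m1, m2, m3 = (random.gauss(0, 1) for _ in range(3))
    # Invented "true" relationship, for illustration only.
    logit = 1.5 * m1 - 0.8 * m2 + 0.3 * m3 - 1.0
    y = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [m1, m2, m3], y

data = [synth_sample() for _ in range(2000)]

# Fit logistic regression by batch gradient descent (no external libraries).
w, b = [0.0, 0.0, 0.0], 0.0
lr = 0.5
for _ in range(300):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        for i in range(3):
            gw[i] += (p - y) * x[i]  # gradient of the log-loss w.r.t. w[i]
        gb += p - y
    w = [wi - lr * gi / len(data) for wi, gi in zip(w, gw)]
    b -= lr * gb / len(data)

def risk(x):
    """Predicted probability of the outcome for one marker profile."""
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
```

A validation study like the one described would then test whether such a fitted score discriminates preterm from term births in an independent cohort.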

News Article | September 13, 2016

An international team of more than 20 scientists has inadvertently discovered how to create a new type of crystal using light more than ten billion times brighter than the sun. The discovery, led by Associate Professor Brian Abbey at La Trobe University in collaboration with Associate Professor Harry Quiney at the University of Melbourne, has been published in the journal Science Advances. Their findings overturn more than 100 years of accepted thinking in crystallography. The team exposed a sample of crystals, known as Buckminsterfullerene or Buckyballs, to intense light emitted from the world’s first hard X-ray free electron laser (XFEL), based at Stanford University. The molecules have a spherical shape forming a pattern that resembles the panels on a soccer ball. Light from the XFEL is around one billion times brighter than light generated by any other X-ray equipment — even light from the Australian Synchrotron pales in comparison. Because other X-ray sources deliver their energy much more slowly than the XFEL, all previous observations had found that the X-rays randomly melt or destroy the crystal. Scientists had assumed that XFELs would do the same. The result from the XFEL experiments on Buckyballs, however, was not at all what scientists expected. When the XFEL intensity was cranked up past a critical point, the electrons in the Buckyballs spontaneously re-arranged their positions, changing the shape of the molecules completely. Every molecule in the crystal changed from being shaped like a soccer ball to being shaped like an AFL ball at the same time. This effect produced completely different images at the detector, and it also altered the sample’s optical and physical properties. “It was like smashing a walnut with a sledgehammer and instead of destroying it and shattering it into a million pieces, we instead created a different shape — an almond!” Abbey says.
“We were stunned. This is the first time in the world that X-ray light has effectively created a new type of crystal phase,” says Quiney, from the School of Physics, University of Melbourne. “Though it only remains stable for a tiny fraction of a second, we observed that the sample’s physical, optical and chemical characteristics changed dramatically from its original form,” he says. “This change means that when we use XFELs for crystallography experiments we will have to change the way we interpret the data. The results give the 100-year-old science of crystallography a new, exciting direction,” Abbey says. “Currently, crystallography is the tool used by biologists and immunologists to probe the inner workings of proteins and molecules — the machines of life. Being able to see these structures in new ways will help us to understand interactions in the human body and may open new avenues for drug development.” The study was conducted by researchers from the ARC Centre of Excellence in Advanced Molecular Imaging, La Trobe University, the University of Melbourne, Imperial College London, the CSIRO, the Australian Synchrotron, Swinburne University of Technology, the University of Oxford, Brookhaven National Laboratory, the Stanford Linear Accelerator (SLAC), the BioXFEL Science and Technology Centre, Uppsala University, and the Florey Institute of Neuroscience and Mental Health.

News Article | December 7, 2016

Every 18 seconds someone dies from tuberculosis (TB), the world's most deadly infectious disease. Mycobacterium tuberculosis, the causative agent of TB, has infected over one-third of the entire human population, with an annual death toll of approximately 1.5 million people. For the first time, an international team of scientists from Monash and Harvard Universities has seen how, at a molecular level, the human immune system recognises TB-infected cells and initiates an immune response. Their findings, published in Nature Communications, are the first step toward developing new diagnostic tools and novel immunotherapies. Lead author Professor Jamie Rossjohn says one of the main reasons for our current lack of knowledge comes down to the complexity of the bacterium itself. Working with Professor Branch Moody's team at Harvard, they have begun to gain key insight into how the immune system can recognise this bacterium. Crucial to the success of M. tuberculosis as a pathogen is its highly unusual cell wall, which not only serves as a barrier against therapeutic attack but also modulates the host immune system. Conversely, its cell wall may also be the "Achilles' heel" of mycobacteria, as it is essential for the growth and survival of these organisms. This unique cell wall comprises multiple layers that form a rich, waxy barrier, and many of its lipid (fatty acid) components represent potential targets for T-cell surveillance. Specifically, using the Australian Synchrotron, the team of scientists has shown how the immune system recognises components of the waxy barrier from the M. tuberculosis cell wall. "With so many people dying from TB every year, any improvements in diagnosis, therapeutic design and vaccination will have major impacts," Professor Moody says. "Our research is focussed on gaining a basic mechanistic understanding of an important biomedical question.
And it may ultimately provide a platform for designing novel therapeutics for TB and for treating this devastating disease," Professor Rossjohn concludes. Professor Jamie Rossjohn is a Chief Investigator of the Australian Research Council Centre of Excellence in Advanced Molecular Imaging. The $39 million ARC-funded Imaging CoE develops and uses innovative imaging technologies to visualise the molecular interactions that underpin the immune system. Featuring an internationally renowned team of lead scientists across five major Australian universities, along with academic and commercial partners globally, the Centre uses a truly multi-scale and programmatic approach to imaging to deliver maximum impact. The Imaging CoE is headquartered at Monash University with four collaborating organisations - La Trobe University, the University of Melbourne, the University of New South Wales and the University of Queensland. Professor Rossjohn is also a researcher at the Monash Biomedicine Discovery Institute. Committed to making the discoveries that will relieve the future burden of disease, the newly established Monash Biomedicine Discovery Institute at Monash University brings together more than 120 internationally renowned research teams. Our researchers are supported by world-class technology and infrastructure, and partner with industry, clinicians and researchers internationally to enhance lives through discovery.

News Article | March 2, 2017

Greenfield Advisors is excited to announce that its two principals have been issued their first patent by the United States Patent and Trademark Office (USPTO). The patent, U.S. Patent 9,582,819, is entitled “Automated-valuation-model training-data optimization systems and methods.” The inventors are former Greenfield Advisors employee Andy Krause [currently a Lecturer at the University of Melbourne (Australia)], and Clifford A. Lipscomb and John A. Kilpatrick, both Co-Managing Directors of the firm. The technology covered by the patent grows out of the firm’s long history of using sophisticated quantitative tools in its work. Predictive models have been used in real estate valuation for decades. Until now, however, these models have not been used to go back and optimize the training data sets fed into their initial passes. The patent describes a process for optimizing the training data sets used by a predictive model that automatically performs real estate valuations. As stated in the Background of the patent, “there is a need for an improved method of selecting training data to provide more accurate value predictions.” “Greenfield Advisors, in our 40-year history, has developed significant expertise in predictive modeling. We have applied that expertise to situations where we have been tasked with valuing properties affected by environmental contamination as well as properties underlying residential mortgage-backed securities (RMBS),” said Dr. Lipscomb. “What we learned from our predictive modeling efforts is that the underlying data shape our predictive models – some jurisdictions have better data than others. Predictive models must be flexible in using the best available data in each jurisdiction. That reality was one of the reasons we decided to pursue a patent.” “The rapid popularization of real estate valuation models has led naturally to a demand for increased accuracy, precision, and reliability,” said Dr. Kilpatrick.
“We have worked hard to make sure the results from our predictive models, in particular the Greenfield AVM, are state-of-the-art so our clients can be confident they are receiving the best data analyses available.” About Greenfield Advisors Founded in 1976, Greenfield Advisors is a boutique economic and financial analysis firm that provides government and private sector clients with customized consultations and advisory services. Best known for its analysis of complex economic, financial, and real estate situations in high-profile litigation matters, Greenfield Advisors also develops feasibility studies, business plans, and appraisals for its clients. Greenfield Advisors’ subsidiary, Bartow Street Capital LLC, serves as its investment banking and capital raising arm, and its subsidiary, Accre LLC, acts as an investment principal. Learn more about Greenfield Advisors by calling 206-623-2935 or visiting
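The article only summarizes the patented process at a high level, so the following is a toy sketch of the general idea of training-data optimization for an automated valuation model, not the patented method itself: fit a simple hedonic price model, drop the worst-fitting training sales (e.g. data-entry errors or non-arm's-length transactions), and refit on the cleaned set. All data here are synthetic.

```python
# Toy illustration of training-data optimization for an AVM: fit, trim the
# largest residuals, refit. NOT the patented Greenfield process.
import random

random.seed(1)

# Synthetic sales: price ~ $150/sqft + noise, plus some corrupted records.
sales = []
for _ in range(200):
    sqft = random.uniform(800, 3000)
    sales.append((sqft, 150 * sqft + random.gauss(0, 20_000)))
for _ in range(20):  # bad records: price unrelated to size
    sales.append((random.uniform(800, 3000), random.uniform(50_000, 900_000)))

def fit(data):
    """Ordinary least squares for price = a*sqft + b."""
    n = len(data)
    sx = sum(s for s, _ in data); sy = sum(p for _, p in data)
    sxx = sum(s * s for s, _ in data); sxy = sum(s * p for s, p in data)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def mae(data, a, b):
    """Mean absolute valuation error of the model on a data set."""
    return sum(abs(p - (a * s + b)) for s, p in data) / len(data)

# Pass 1: fit on everything, then keep the 90% best-fitting sales.
a1, b1 = fit(sales)
trimmed = sorted(sales, key=lambda sp: abs(sp[1] - (a1 * sp[0] + b1)))
trimmed = trimmed[: int(0.9 * len(trimmed))]

# Pass 2: refit on the optimized training set.
a2, b2 = fit(trimmed)
print(f"slope before: {a1:.1f}, after trimming: {a2:.1f}")
```

On this synthetic set the refit slope lands close to the true $150/sqft, because most of the corrupted records fall in the trimmed tail; the patented process presumably uses a far more sophisticated selection criterion than a single residual cut.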

News Article | November 17, 2016

The Victorian Aboriginal Community Controlled Health Organisation (VACCHO) and the Vision Initiative will launch a suite of culturally appropriate eye health promotion materials at Rumbalara Aboriginal Co-operative on Friday 18 November at 11.30am. Jill Gallagher, CEO of VACCHO, expressed her strong concern that vision impairment and blindness among Aboriginal and/or Torres Strait Islander people are still three times as common as among non-Indigenous Australians. “The launch of these new materials will be an important step towards closing the eye health gap for Aboriginal and/or Torres Strait Islander people in Victoria,” says Ms Gallagher. “The need for these materials was made clear following our consultation with the Aboriginal Community Controlled Health Services in Victoria. We identified a shortage of culturally appropriate eye health promotion literature and training resources for Aboriginal health workers,” says Ms Gallagher. “I feel these new communication tools will support our services in continuing to promote important eye health messages throughout our communities.” The new materials feature artwork by respected Wiradjuri / Yorta Yorta artist, Aunty Lyn Briggs, originally commissioned for VACCHO’s first eye health program eighteen years ago. Widely recognised by the Victorian eye health sector, the Aboriginal and/or Torres Strait Islander community and Aboriginal Community Controlled Organisation staff, the new designs received unanimous approval during the extensive consultation process. Vision Initiative Manager, Dee Tumino, says the new resources are the result of collaboration between many organisations and individuals. “VACCHO, the Vision Initiative and the many Aboriginal community members and health workers, including those at Rumbalara, have come together to make this project a reality.
The launch of these resources signifies a major step towards increasing awareness and understanding of eye health among Aboriginal and/or Torres Strait Islander people in Victoria,” says Ms Tumino. The completed suite of eye health promotion materials includes posters, appointment cards and information brochures that cover significant eye health conditions and issues, such as refractive error, diabetic retinopathy, cataracts, smoking and vision loss, as well as accessing the Victorian Aboriginal Spectacles Subsidy Scheme. Clinical expertise on the materials was provided by the Indigenous Eye Health Unit at the University of Melbourne and the Australian College of Optometry, which also provided VACCHO with additional funding to support the “$10 glasses for Community” resource that describes the Victorian Aboriginal Spectacles Subsidy Scheme. Rumbalara Aboriginal Cooperative CEO, Kemal ‘Kim’ Sedick, is delighted to be hosting the launch event at Mooroopna. “Staff at Rumbalara Health Service were integral to this work and have provided advice and assistance to set the framework for further consultation with the wider Aboriginal Community Controlled Organisation workforce. With the launch event being held at Rumbalara I am pleased to be able to acknowledge their work, along with the work of all the organisations and individuals involved in this important step towards closing the eye health gap.” The Victorian Aboriginal Community Controlled Health Organisation (VACCHO) was established in 1996. VACCHO is the peak body for the health and wellbeing of Aboriginal people living in Victoria. The Vision Initiative is an integrated health promotion program managed by Vision 2020 Australia and funded by the Victorian Government. It aims to reduce avoidable blindness and vision loss across Victoria by delivering eye health interventions to primary health providers, the community and the media.
Rumbalara Aboriginal Cooperative (RAC) is a community controlled organisation that offers a range of health and community services to the Greater Shepparton community. We are a large provider of services in the Greater Shepparton area, and one of the largest providers of services to Aboriginal and Torres Strait Islander people in Victoria. The organisation employs about 200 people and has a budget of nearly $20 million. RAC has 600 registered members, nearly 30% of the Greater Shepparton Aboriginal and Torres Strait Islander population. Established in October 2000, Vision 2020 Australia is part of VISION 2020: The Right to Sight, a global initiative of the World Health Organisation and the International Agency for the Prevention of Blindness. Vision 2020 Australia is the national peak body working in partnership to prevent avoidable blindness and improve vision care in Australia. It represents around 50 members involved in local and global eye care, health promotion, low vision support, vision rehabilitation, eye research, professional assistance and community support.

News Article | March 28, 2016

Forget all the smart washing machines you've seen at the annual Consumer Electronics Show, as there’s a fascinating new development that might let you clean your clothing using tech that's straight out of a science fiction movie. The new technique involves wearing the dirty clothes out in the sun so that solar energy can catalyze a chemical reaction that obliterates stains. In other words, the next time you drop ketchup on your shirt at the game, it might clean itself off by the time you're ready to leave the stadium. Published in Advanced Materials Interfaces, the study involves a new kind of fabric made of copper and silver nanostructures woven into cotton textiles and kept in place by a fixative solution. Researchers from the Royal Melbourne Institute of Technology (RMIT University) in Melbourne can clean stains within six minutes using the technology, although heavy stains might not be wiped out immediately. Tomato sauce and red wine are still tricky to remove. When sunlight hits the metallic nanostructure, it releases high-energy electrons that can break down the organic molecules of dirt. The image above shows a closeup of the fabric. The nanoparticles are invisible to the naked eye, and the photo above has been magnified 200 times. "The advantage of textiles is they already have a 3D structure, so they are great at absorbing light, which in turn speeds up the process of degrading organic matter," Dr. Rajesh Ramanathan said, according to PhysOrg. He continued, "There's more work to do before we can start throwing out our washing machines, but this advance lays a strong foundation for the future development of fully self-cleaning textiles."

News Article | November 3, 2016

Apparently there’s not much public insistence on medical privacy for the President. Respondents to a national online poll strongly favor total medical transparency for any future President who develops AIDS, an infection which can raise sensitive questions as to its source. The poll was conducted for the just-published mystery novel, “The President Has AIDS,” reports Dr. Leslie Norins, the book’s author. The poll presented three options as to whether the public should be informed about such an AIDS diagnosis: Yes, always inform; No, never inform; or Yes, inform--but only if the source were one of several choices offered. A significant majority, two-thirds (65.8%), of respondents said the public should always be informed, regardless of the circumstances under which a Commander in Chief acquired the HIV which led to AIDS. In contrast, only about one-fifth (21%) said the public should not be informed at all. The smallest portion of poll-takers, 13.2%, said the public should be informed--but only if the HIV source were one of seven possibilities presented. (These respondents could vote for more than one source as meriting disclosure.) Highest ranked of these seven was “contaminated blood transfusion” (62.2%). Second was “extramarital sex with a same-sex partner” (51%). Two sources were tied for third place (48% each): “enemy plot” and “personal drug abuse using a contaminated needle”. Then came “contaminated surgical instrument” (46.9%) and “extramarital sex with an opposite-sex partner” (43.9%). In last place was “sex with infected spouse” (38.8%). The non-scientific online poll, open to anyone interested, was taken October 14 through November 2. Dr. Norins declined to report the precise total of respondents, but called it “substantial.” He explained that the poll illuminates one of the two big challenges in his book: White House insiders are grappling with whether the President’s AIDS infection should be revealed to the public.
The second challenge is faced by the mystery’s medical detective, Dr. Martin Riker—ferreting out the source of POTUS’s HIV. Just before the book ends, its fictional President, Paul Ralston, does decide whether to disclose his AIDS infection to the American people. Book author Dr. Leslie Norins graduated from Johns Hopkins University and received his M.D. from Duke University School of Medicine. He then received his Ph.D. from the University of Melbourne (Australia), where he studied immunology with Nobel prize-winner Sir Macfarlane Burnet at the Walter and Eliza Hall Institute of Medical Research. Early in his career he directed the Venereal Disease Research Laboratory at the Centers for Disease Control and Prevention (CDC). He then became a medical publisher, creating and growing over the next 35 years more than 80 monthly newsletters serving the specialized information needs of healthcare professionals and facilities. Dr. Norins previously authored, with contributions from Thomas Hauck, the thriller “Deadly Pages,” in which Mideast terrorists plot to attack vulnerable Americans with smallpox, disseminated by contaminating the printing ink of the New York Times. “The President Has AIDS,” by Leslie Norins, MD, is published by Medvostat LLC, and is available at bookstores. $14.95 paperback, $9.95 Kindle. ISBN 978-0692758003.

The International Association of HealthCare Professionals is pleased to welcome Matthew John Skinner, MBBS, FRACP, General Practitioner, to their prestigious organization with his upcoming publication in The Leading Physicians of the World. Dr. Skinner is a highly trained and qualified general practitioner with vast expertise in all facets of his work, especially HIV medicine and infectious diseases. Dr. Skinner has been in practice for more than 18 years and is currently serving patients at Sir Charles Gairdner Hospital in Nedlands, Western Australia. Dr. Skinner’s career in medicine began in 1997 when he graduated with his Bachelor of Medicine, Bachelor of Surgery degree from the University of Melbourne Medical School. Following his graduation, Dr. Skinner completed his fellowship training at the Alfred Hospital in Melbourne, Victoria. He has earned the coveted title of Fellow of the Royal Australasian College of Physicians. To keep up to date with the latest advances and developments in his field, Dr. Skinner maintains professional memberships with the Internal Medicine Society of Australia and New Zealand, the Australian Medical Association, the Australasian Society for Infectious Diseases, the Australasian Society for HIV, Viral Hepatitis and Sexual Health Medicine, the European Society for Clinical Microbiology and Infectious Diseases, and the American Society for Microbiology. He attributes his great success to being actively involved within associations. Learn more about Dr. Skinner by reading his upcoming publication in The Leading Physicians of the World. FindaTopDoc is a hub for all things medicine, featuring detailed descriptions of medical professionals across all areas of expertise, and information on thousands of healthcare topics. Each month, millions of patients use FindaTopDoc to find a doctor nearby and instantly book an appointment online or create a review.
The site features each doctor’s full professional biography highlighting their achievements, experience, patient reviews and areas of expertise. A leading provider of valuable health information that helps empower patient and doctor alike, FindaTopDoc enables readers to live a happier and healthier life. For more information about FindaTopDoc, visit:


News Article | February 22, 2017

IBM Research has today announced new research developments in IBM Watson's ability to detect abnormalities of the eye's retina. The Melbourne-based IBM researchers have trained a research version of Watson to recognize abnormalities in retina images, which could in the future offer doctors greater insight and speed in their early identification of patients who may be at risk of eye diseases – such as glaucoma, a leading cause of blindness in the developed world. The research began in 2015 and the latest work has focused on streamlining some of the manual processes experienced by doctors today. This includes distinguishing between left and right eye images, evaluating the quality of retina scans, and ranking possible indicators of glaucoma. Glaucoma has been named "the silent thief of sight" as many patients remain undiagnosed until irreversible vision loss occurs. Glaucoma can be treated but early detection is critical, with doctors currently relying on regular eye examination screening programs. The researchers applied deep learning techniques and image analytics technology to 88,000 de-identified retina images accessed through EyePACS to analyze key anomalies of the eye. The research results demonstrate Watson's ability to accurately measure the ratio of the optic cup to disc – a key sign of glaucoma – with statistical performance as high as 95 percent. The technology has also been trained to distinguish between left and right eye images (with up to 94 percent confidence), which is important for downstream analysis and for the development of effective treatment programs. "It is estimated that at least 150,000 Australians have undiagnosed glaucoma, with numbers expected to rise due to our rapidly aging population. It is critical that every Australian has access to regular eye examinations throughout their life so that diseases like glaucoma and diabetic retinopathy can be detected and treated as early as possible," Dr.
Peter van Wijngaarden, Principal Investigator at the Centre for Eye Research Australia, Department of Ophthalmology, University of Melbourne, said. "There is a real need for resources that allow all Australians to access regular eye examinations, and the development of image analytics and deep learning technology will provide great promise in this area." The research technology is expected to continue to improve over time as it expands to detect features of other eye diseases such as diabetic retinopathy and age-related macular degeneration. "Medical image analysis with cognitive technology has the capacity to fundamentally change the delivery of healthcare services," said Dr. Joanna Batstone, Vice President and Lab Director at IBM Research Australia. "Medical images represent a rich source of data for clinicians to make early diagnosis and treatment of disease, from assessing the risk of melanomas to identifying eye diseases through the analysis of retinas. Cognitive technology holds immense promise for confirming the accuracy, reproducibility and efficiency of clinicians' analyses during the diagnostic workflow." IBM Research globally continues to advance research combining cognitive technology with medical images. Through its 12 collaborative labs worldwide, IBM Research is focused on research projects involving medical imaging analysis for diseases such as melanoma, breast cancer, lung cancer and eye disease. More information: "Automatic Eye Type Detection in Retinal Fundus Image Using Fusion of Transfer Learning and Anatomical Features," International Conference on Digital Image Computing: Techniques and Applications (DICTA), 2016, D. Mahapatra et al. "Retinal Image Quality Classification Using Saliency Maps and CNNs," Machine Learning in Medical Imaging, Lecture Notes in Computer Science Volume 10019, pp. 172-179, S. Sedai et al.
"Segmentation of Optic Disc and Optic Cup in Retinal Fundus Images Using Coupled Shape Regression," Proceedings of the Ophthalmic Medical Image Analysis Third International Workshop (OMIA 2016), held in conjunction with MICCAI 2016, pp. 1-8.
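To make the cup-to-disc measurement mentioned above concrete: once the optic disc and optic cup have been segmented (in the IBM work, by deep learning; here, by hand-made toy masks), the vertical cup-to-disc ratio is simply the cup's vertical extent divided by the disc's. The masks and the 0.6 warning threshold below are illustrative only, not IBM's implementation.

```python
# Minimal sketch of the vertical cup-to-disc ratio (CDR) computed from
# binary segmentation masks. Toy masks stand in for model output; a high
# CDR (commonly ~0.6 or above) is one warning sign of glaucoma.

def vertical_extent(mask):
    """Number of rows containing at least one foreground pixel."""
    return sum(1 for row in mask if any(row))

def cup_to_disc_ratio(disc_mask, cup_mask):
    """Vertical CDR: cup height divided by disc height."""
    return vertical_extent(cup_mask) / vertical_extent(disc_mask)

# Toy 8x8 masks: disc spans rows 1-6 (6 rows), cup spans rows 3-5 (3 rows).
disc = [[1 if 1 <= r <= 6 else 0 for _ in range(8)] for r in range(8)]
cup = [[1 if 3 <= r <= 5 else 0 for _ in range(8)] for r in range(8)]

print(cup_to_disc_ratio(disc, cup))  # 0.5
```

In the reported research, the hard part is the segmentation itself (the OMIA 2016 paper cited above); the ratio that follows is a one-line measurement.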

News Article | February 20, 2017

A weight loss program that incorporates a maintenance intervention could help participants be more successful at keeping pounds off long term. Researchers found that a primarily telephone-based intervention focused on providing strategies for maintaining weight loss modestly slowed the rate of participants' weight regain after weight loss. Results of a randomized trial are published in Annals of Internal Medicine. Despite the efficacy of behavioral weight loss programs, weight loss maintenance remains the holy grail of weight loss research. After initial weight loss, most people tend to regain weight at a rate of about 2 to 4 pounds a year. Teaching people weight maintenance skills has been shown to slow weight regain, but can be time- and resource-intensive; simple and effective weight maintenance interventions are needed. Researchers tested a weight maintenance intervention on obese outpatients who had lost an average of 16 pounds during a 16-week, group-based weight loss program to determine whether a low-intensity intervention could help participants keep off the weight they had lost. Participants were randomly assigned to the intervention or usual care. The intervention focused on providing participants with skills to help them make the transition from initiating weight loss to maintaining their weight. Over the first 42 weeks, the intervention shifted from group visits to individual telephone calls, with decreasing frequency of contact. There was no intervention contact during the final 14 weeks; the usual care group had no contact except for weight assessments. After 56 weeks, mean weight regain in the intervention group was about 1.5 pounds, compared with 5 pounds in the usual care group. The evidence suggests that incorporating a weight maintenance intervention into clinical or commercial weight loss programs could make them more effective over the long term. Note: For an embargoed PDF, please contact Cara Graeff.
To speak with the lead author, Corrine Voils at the Clinical Science Center, Madison, WI, please contact Isatu Hughes at 1-888-478-8321 ext. 11011. 2. The Internet may be an effective tool for treating chronic knee pain Abstract: http://annals. Editorial: http://annals. Free patient summary: http://annals. URLs go live when the embargo lifts. An online intervention combining home exercise and pain-coping skills training provided substantial clinical benefits for patients suffering from chronic knee pain. This model of care delivery could greatly improve patient access to effective treatments. Results of a randomized, controlled trial are published in Annals of Internal Medicine. Knee osteoarthritis, the leading cause of chronic knee pain, causes loss of function, reduced quality of life, and psychological disability. There is no cure for osteoarthritis, and given the aging population and increasing obesity, disease burden is rapidly increasing. Home-based exercise and pain-coping skills training (an approach based on cognitive behavioral principles to target psychological factors that are common in persons with chronic pain) have been shown to offer relief, but accessing specialist clinicians to prescribe and supervise these treatments may be a challenge for some patients. Researchers tested an Internet-delivered treatment program to determine whether it could improve pain and function in patients with chronic knee pain. Participants were randomly assigned to an intervention or control group. The intervention group had seven Skype sessions with a physical therapist to learn home exercises and pain-coping skills over 3 months. The control group received educational materials online. The researchers measured pain and physical functioning in both groups at baseline, 3 months, and 9 months. Participants in the intervention group reported significantly more improvement in pain and physical function than those in the control group. 
The author of an accompanying editorial suggests that these findings are encouraging and show that "telemedicine" can break down barriers to care, making treatment inexpensive and easily scalable. Note: For an embargoed PDF, please contact Cara Graeff. To speak with the lead author, Dr. Kim L. Bennell at the University of Melbourne, please contact Liz Lopez at +61 3 834 42704. To reach the editorialist, Lisa Mandl, MD, MPH, at the Hospital for Special Surgery, New York, NY, please contact Tracy Hickenbottom at 212-606-1197. Also new in this issue: Inpatient Notes: Legislating Quality to Prevent Infection--A Primer for Hospitalists Jennifer Meddings, MD, MSc, and Laurence F. McMahon Jr., MD, MPH Hospitalist Commentary Abstract: http://annals. Single-Payer Reform: The Only Way to Fulfill the President's Pledge of More Coverage, Better Benefits, and Lower Costs Steffie Woolhandler, MD, MPH, and David U. Himmelstein, MD Ideas and Opinions Abstract: http://annals.

SINGAPORE, Dec. 7, 2016 /PRNewswire/ -- Carmentix Private Limited ("Carmentix") and the University of Melbourne are proud to announce the "Preterm Birth Biomarker Discovery" initiative. The aim of this collaborative clinical study is to validate novel biomarkers discovered by Carmentix and biomarkers previously discovered and validated at the University of Melbourne in a combined panel, and to assess the risk of preterm birth as early as 20 weeks of gestation. The retrospective study, led by Dr. Harry Georgiou, PhD, and Dr. Megan Di Quinzio, MD, at the University of Melbourne, will validate the statistical strength of the novel biomarker panel. "Carmentix is excited to begin this collaboration, as we are keen to further develop the biomarkers discovered on our unique data mining platform," said Dr. Nir Arbel, CEO of Carmentix. "If validated, this new panel of biomarkers may offer hope of significantly reducing the number of preterm birth cases on a global scale." Clinical obstetrician and researcher Dr. Di Quinzio frequently sees mothers asking, "Why was my baby born prematurely?" There is often no satisfactory answer. "Preterm birth continues to be a global health problem but sadly, reliable diagnostic tools are lacking," said Dr. Georgiou, scientific leader at the University of Melbourne. "This collaborative initiative with a strong commercial partner will help pave the way for a novel approach for better diagnosis and hopefully the prevention of preterm labour." Carmentix is an Esco Ventures-backed startup company based in Singapore. Carmentix is developing a novel biomarker prognostic panel to significantly reduce the number of preterm birth cases by establishing biomolecular tools that will alert clinicians to the risk of preterm birth weeks before symptoms occur. Carmentix's technology relies on multiple-pathway analysis utilizing a unique panel of biomarkers. 
This panel of proprietary markers is intended to allow the prediction of preterm birth at 16-20 weeks of gestation; a high-accuracy predictive algorithm is anticipated because the panel covers bottleneck molecular processes involved in preterm birth. Carmentix's goal is a cost-effective solution that is robust, accurate, and suited to clinical settings worldwide. About the University of Melbourne and its commercialisation initiatives: The University of Melbourne is Australia's best and one of the world's leading universities. As an R&D hub with world-leading specialists in science, technology and medicine, Melbourne undertakes cutting-edge research to create new ways of thinking, new technology and new expertise to build a better future. World-class research, real-world solutions: The University of Melbourne embraces a culture of innovation -- working with industry, government, non-governmental organisations and the community to solve real-world challenges. Our commercial partnerships bring research to life through collaboration in areas of bio-engineering, materials development, medical technology innovation, community capacity development and cultural entrepreneurship. Some of the ground-breaking commercialised technology created at the University of Melbourne includes the cochlear implant, the stentrode (a device that enables control of computers, robotic limbs or exoskeletons by thought), and novel anti-fibrotic drug candidates for the treatment of fibrosis (prevalent in such chronic conditions as chronic kidney disease, chronic heart failure, pulmonary fibrosis and arthritis). The University of Melbourne is closely partnered with the Peter Doherty Institute for Infection and Immunity, Walter and Eliza Hall Institute, CSIRO, CSL, and The Royal Melbourne, Royal Children's and Royal Women's Hospitals. 
With over 160 years of leadership in education and research, the University responds to immediate and future challenges facing our society through innovation in research. The University of Melbourne is No. 1 in Australia and No. 31 in the world (Times Higher Education World University Rankings 2015-2016).

News Article | December 5, 2016

A world-first vaccine developed by Melbourne scientists, which could eliminate or at least reduce the need for surgery and antibiotics for severe gum disease, has been validated by research published this weekend in a leading international journal. A team of dental scientists at the Oral Health CRC at the University of Melbourne has been working on a vaccine for chronic periodontitis for the past 15 years with industry partner CSL. Clinical trials on periodontitis patients could potentially begin in 2018. Moderate to severe periodontitis affects one in three adults and more than 50 per cent of Australians over the age of 65. It is associated with diabetes, heart disease, rheumatoid arthritis, dementia and certain cancers. It is a chronic disease that destroys gum tissue and bone supporting teeth, leading to tooth loss. The findings published in the journal NPJ Vaccines (part of the Nature series) represent analysis of the vaccine's effectiveness by collaborating groups based in Melbourne and at Cambridge, USA. The vaccine targets enzymes produced by the bacterium Porphyromonas gingivalis, to trigger an immune response. This response produces antibodies that neutralise the pathogen's destructive toxins. P. gingivalis is known as a keystone pathogen, which means it has the potential to distort the balance of microorganisms in dental plaque, causing disease. CEO of the Oral Health CRC, Melbourne Laureate Professor Eric Reynolds AO, said it was hoped the vaccine would substantially reduce tissue destruction in patients harbouring P. gingivalis. "We currently treat periodontitis with professional cleaning sometimes involving surgery and antibiotic regimes," Professor Reynolds said. "These methods are helpful, but in many cases the bacterium re-establishes in the dental plaque causing a microbiological imbalance so the disease continues." "Periodontitis is widespread and destructive. We hold high hopes for this vaccine to improve quality of life for millions of people."

News Article | September 14, 2016

Abstract: The discovery, led by Associate Professor Brian Abbey at La Trobe in collaboration with Associate Professor Harry Quiney at the University of Melbourne, has been published in the journal Science Advances. Their findings reverse what has been accepted thinking in crystallography for more than 100 years. The team exposed a sample of crystals, known as Buckminsterfullerene or Buckyballs, to intense light emitted from the world's first hard X-ray free electron laser (XFEL), based at Stanford University in the United States. The molecules have a spherical shape forming a pattern that resembles panels on a soccer ball. Light from the XFEL is around one billion times brighter than light generated by any other X-ray equipment; even light from the Australian Synchrotron pales in comparison. Because other X-ray sources deliver their energy much more slowly than the XFEL, all previous observations had found that the X-rays randomly melt or destroy the crystal. Scientists had previously assumed that XFELs would do the same. The result from the XFEL experiments on Buckyballs, however, was not at all what scientists expected. When the XFEL intensity was cranked up past a critical point, the electrons in the Buckyballs spontaneously re-arranged their positions, changing the shape of the molecules completely. Every molecule in the crystal changed from being shaped like a soccer ball to being shaped like an AFL ball at the same time. This effect produced completely different images at the detector and altered the sample's optical and physical properties. "It was like smashing a walnut with a sledgehammer and instead of destroying it and shattering it into a million pieces, we instead created a different shape - an almond!" Assoc. Prof. Abbey said. "We were stunned; this is the first time in the world that X-ray light has effectively created a new type of crystal phase," said Associate Professor Quiney, from the School of Physics, University of Melbourne. 
"Though it only remains stable for a tiny fraction of a second, we observed that the sample's physical, optical and chemical characteristics changed dramatically from its original form," he said. "This change means that when we use XFELs for crystallography experiments we will have to change the way we interpret the data. The results give the 100-year-old science of crystallography a new, exciting direction," Assoc. Prof. Abbey said. "Currently, crystallography is the tool used by biologists and immunologists to probe the inner workings of proteins and molecules, the machines of life. Being able to see these structures in new ways will help us to understand interactions in the human body and may open new avenues for drug development." ### The study was conducted by researchers from the ARC Centre of Excellence in Advanced Molecular Imaging, La Trobe University, the University of Melbourne, Imperial College London, the CSIRO, the Australian Synchrotron, Swinburne University of Technology, the University of Oxford, Brookhaven National Laboratory, the Stanford Linear Accelerator (SLAC), the BioXFEL Science and Technology Centre, Uppsala University and the Florey Institute of Neuroscience and Mental Health.

News Article | December 22, 2016

The number of deaths that occur in the United States around the Christmas holiday is higher compared with other times of the year. The spike, which was identified by sociologist David Phillips after looking at death certificates in the United States, has been confirmed by other studies that used other large data sets such as those from the Centers for Disease Control and Prevention. People are more likely to die of natural causes from Dec. 25 through New Year's Day than at any other time of the year, a phenomenon called the Christmas holiday effect. Many doctors and researchers think the spike is weather-related. The explanation is that the colder temperatures make people's bodies more vulnerable to complications from flu, heart attack, and other ailments. Findings of a new study published in the Journal of the American Heart Association on Dec. 22, however, ruled out cold weather as a possible reason for the death spike during the holiday season. Josh Knight, from the University of Melbourne, and colleagues looked at 25 years' worth of mortality data in New Zealand, where Christmas and New Year's Day occur during the summer season. The researchers found that even though the Christmas season in that country occurs during warmer weather, the number of deaths around the holiday season is still up by 4 percent compared with the average for the rest of the year. The findings suggest that the holidays themselves are factors that contribute to increased mortality regardless of the effects of weather and health problems linked with colder temperatures. Several culprits are speculated to drive the mortality rate up during the festive season. The holidays, for instance, can be stressful for many people given the increased family, social, and financial obligations. The stress can contribute to higher blood pressure levels and may aggravate risk factors for heart disease. 
The kind of food served during the holidays, as well as the numerous opportunities to indulge in it, may also play a role. Alcohol consumption likewise tends to increase during the festive season, which can also affect the health of some people. The researchers said that the so-called displacement of death and delays in seeking medical care could also be factors. Displacement of death means that people who are already ill may attempt to postpone dying in a bid to experience one more holiday season with their loved ones. "There is the possibility of a displacement effect, in which mortality is being concentrated during the holiday period rather than directly causing additional mortality; however, the use of a different method of estimating the expected deaths will be required to fully explore this issue," the researchers wrote in their study. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.

News Article | December 20, 2016

Alarmed by “pseudoscience” that may bring “devastating” health consequences, two groups of researchers have asked the journal Scientific Reports to retract a paper that they claim undermines confidence in the human papillomavirus (HPV) vaccine, given to girls to prevent cervical cancer. The 11 November paper describes impaired mobility and brain damage in mice given an HPV vaccine. The mice received doses that were proportionally a thousand times greater than that given to people, along with a toxin that makes the blood-brain barrier leaky. That protocol, critics contend, does not mimic what happens in the human body. “Basically, this is an utterly useless paper, a waste of precious animals,” David Gorski, a surgical oncologist at the Barbara Ann Karmanos Cancer Institute in Detroit, Michigan, wrote on his Orac blog. In an email to Science, the paper’s corresponding author, Toshihiro Nakajima of Tokyo Medical University, defended the work, stating: “Our manuscript was formally published after an intensive scientific review done by reviewers and by the editorial board of Scientific Reports.” The tussle is the latest salvo in a widening global battle over the HPV vaccine. Originally licensed in 2006, the vaccine is now approved for use in more than 120 countries. Studies show it is already starting to reduce HPV infections, which are blamed for 528,000 cervical cancer cases and 266,000 deaths each year, with the greatest burden in developing countries. (Boys are also now getting vaccinated, as HPV can cause genital warts and various cancers.) But in several countries, girls have complained of debilitating symptoms, reminiscent of chronic fatigue syndrome, after vaccination. These claims have attracted media attention, spawned antivaccination campaigns, and cut vaccination rates. More than 90% of Danish girls born in 2000 received at least one vaccine dose, but that rate has dropped year by year. Ireland also saw a drop in vaccination rates in 2015 and 2016. 
The trend is “alarming,” says Heidi Larson, who heads the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine. Japan is the prime battleground. As in many countries, the HPV vaccine got off to a promising start there. The first vaccine was licensed in 2009; in April 2013, the ministry added the vaccine to its recommended list and offered it for free. Uptake was robust. Sharon Hanley, a cancer epidemiologist at Hokkaido University in Sapporo, Japan, and colleagues reported in The Lancet in 2015 that roughly 70% of girls born between 1994 and 1998 completed the three-dose vaccination course. In spring 2013, however, a number of media outlets in Japan reported on alleged side effects. These include difficulty walking, headache, fatigue, poor concentration, and pain. That June, the health ministry suspended its “proactive recommendation” for vaccination, pending an investigation. The following January a ministry panel concluded that there is no evidence for a causal association between the HPV vaccine and the reported adverse events. The European Medicines Agency and the U.S. Centers for Disease Control and Prevention have come to similar conclusions. Epidemiological studies indicate that the symptoms reported by the vaccinated girls are found at equal rates in nonvaccinated populations. Yet Japan’s health ministry has never restored its proactive recommendation, which means that although the government pays for the shots, it has stopped urging local authorities to promote vaccination. Vaccination rates have plummeted in Japan. Hanley says that in the city of Sapporo, the vaccination rate fell to just 0.6% of eligible girls, and she believes that nationwide, the rate is close to zero. (In a sign of growing trouble for vaccination, 63 women in July filed a class-action lawsuit against the government and vaccinemakers, seeking $125,000 each in compensation for the alleged side effects.) 
Larson notes that health ministries in other countries aggressively promoted vaccine safety after claims of side effects surfaced, keeping vaccination rates high. An official at Japan’s health ministry says a decision on restoring the proactive recommendation is under review. Nakajima’s study is sure to inflame the debate. His group gave mice large doses of the HPV vaccine along with a pertussis toxin to help the vaccine slip into the central nervous system. The treatment, they found, impaired tail movement and locomotion. A postmortem revealed structural damage, increased cell death, and other abnormalities in the mice’s brains. Critics assail the study in a pair of letters to Scientific Reports and its publisher, the Nature Publishing Group. One, signed by 20 members of the HPV Prevention and Control Board at the University of Antwerp in Belgium, asserts: “This experimental setup in no way mimics the immunization with HPV vaccines but is gross over dosage and manipulation of membrane permeability.” A second letter, from David Hawkes, a virologist at the University of Melbourne in Australia, and two colleagues argues that the paper “lacks a clear methodology, adequate controls to control for bias, descriptions of results consistent with the data presented, or enough information for this study to be reproduced.” Nakajima defends his group’s methodology, stating that they adopted a strategy similar to that commonly used in studying autoimmune encephalitis in mice. As for the dose, he wrote, “This is just the first paper and dose-dependency could be one of the interesting experiments in the future.” He added that they are now preparing a detailed response to criticisms of their paper. Vaccine proponents worry that the paper will embolden vaccine opponents. Nearly 200 tweets have mentioned it, with several mistakenly assuming it appeared in Nature. Both letters call on Scientific Reports to withdraw it. 
In an email to Science, a journal spokesperson confirmed having received the letters, and wrote, “We investigate every concern that is raised with us carefully and will take action where appropriate.” Even as opposition to the HPV vaccine gains momentum, evidence of its efficacy is accumulating. But with its paltry vaccination rate, Japan is unlikely to see any reduction in its current 9000-plus cases of cervical cancer and 3000 deaths each year. Worse, says Larson, Japan’s suspension of the proactive recommendation “has been particularly applauded” by vaccine-critical groups in other countries. For women in Asian nations with weaker health infrastructure, Hanley adds, “The vaccine may be their only hope of prevention.”

News Article | December 22, 2016

NEW YORK, Dec. 22, 2016 (GLOBE NEWSWIRE) -- Dr. Pamela Stanley, Professor of Cell Biology at Albert Einstein College of Medicine and Horace W. Goldsmith Foundation Chair, has been selected to join the Education Board at the American Health Council. She will be sharing her knowledge and expertise in Glycobiology and Developmental Biology. Dr. Stanley has been active in biomedical research for the past 39 years. She obtained her PhD in Virology and Biochemistry in 1972 from the University of Melbourne in Australia and did postdoctoral work at the University of Toronto supported by an MRC Fellowship. Dr. Stanley has been Professor of Cell Biology at Albert Einstein College of Medicine for 27 years and Horace W. Goldsmith Foundation Chair for 9 years. Her day-to-day responsibilities include mentoring post-doctoral fellows and graduate students in investigations of roles for mammalian glycans in development, cancer, and Notch signaling. She also offers a graduate course in Glycobiology. Albert Einstein College of Medicine is a premier, research-intensive medical school dedicated to innovative biomedical investigation and the development of ethical and compassionate physicians and scientists. Dr. Stanley takes great pride in being able to make a difference in the lives of post-graduate trainees at such a prestigious institution. Dr. Stanley’s area of expertise is in diseases that stem from defects in the biosynthesis of glycans, often termed congenital disorders of glycosylation (CDG). Each disease is usually inherited and rare, but together the CDGs comprise mutations in more than 100 human genes. She has been supported by grants from the National Institutes of Health since 1980 and has also received grants from the American Cancer Society, the National Science Foundation and the Mizutani Foundation. Dr. 
Stanley, whose interest in science began at the age of 14, attributes success throughout her career to having a keen interest in understanding and discussing experimental details, being flexible, working hard, persevering, and sharing information and reagents with others. She is proud of the extensive research produced by the members of her laboratory on isolating and identifying the biochemical and genetic bases of Chinese hamster ovary (CHO) cell glycosylation mutants, and on using these mutants, as well as mutant mice, to understand functions of glycans in development and Notch signaling. Her current goals include determining roles for complex N-glycans in spermatogenesis, and for O-glycans in Notch signaling. She is on the Editorial Board of Glycobiology, Glycoconjugate Journal, F1000 Research Reports and Scientific Reports. She is a member of AAUWA, NYAS, ASBMB, ASCB, and SFG, and is a past President of the Society for Glycobiology (SFG). Dr. Stanley’s awards and honors include: Dunlop Prize for First Place in Biochemistry (1966 and 1967); Australian Society for Microbiology Prize for Virology (1968); Commonwealth post-graduate award (1969-1972); Postdoctoral Fellowship from the Medical Research Council of Canada (1972-1975); American Cancer Society Faculty Awards (1978-1981 and 1981-1983); Irma T. Hirschl Faculty Award (1985-1990); election to Leo M. Davidoff Society for Excellence in Teaching (1987); MERIT Award from the National Cancer Institute, NIH (1991); Dorothy Baugh Harmon Lectureship, Oklahoma Med Res Foundation (1997); Mizutani Awards (2001, 2013); International Glycoconjugate Organization Award (2003); Karl Meyer Award, Society for Glycobiology (2003); Horace W. Goldsmith Foundation Chair (2007); LaDonne Shulman award for graduate teaching (2009); Goldstein Lecture, Dept. Biological Chemistry, U. 
Michigan, Ann Arbor (2010); Peter Gallagher memorial lecture, Griffith University, Australia (2012); Marshall Horwitz Faculty Prize for Research Excellence (2014); and WALS lecture, National Institutes of Health (2015). Dr. Stanley can converse in French. In her free time, she enjoys playing tennis and piano, swimming, and reading.

Kemper K.E.,University of Melbourne | Goddard M.E.,University of Melbourne | Goddard M.E.,Australian Department of Primary Industries and Fisheries
Human Molecular Genetics | Year: 2012

The genetic architecture of complex traits in cattle includes very large numbers of loci affecting any given trait. Most of these loci have small effects but occasionally there are loci with moderate-to-large effects segregating due to recent selection for the mutant allele. Genomic markers capture most but not all of the additive genetic variance for traits, probably because there are causal mutations with low allele frequency and therefore in incomplete linkage disequilibrium with the markers. The prediction of genetic value from genomic markers can achieve high accuracy by using statistical models that include all markers and assuming that marker effects are random variables drawn from a specified prior distribution. Recent effective population size is in the order of 100 within cattle breeds and ~2500 animals with genotypes and phenotypes are sufficient to predict the genetic value of animals with an accuracy of 0.65. Recent effective population size for humans is much larger, in the order of 10 000-15 000, and more than 145 000 records would be required to reach a similar accuracy for people. However, our calculations assume that genomic markers capture all the genetic variance. This may be possible in the future as causal polymorphisms are genotyped using genome sequence data. © The Author 2012. Published by Oxford University Press. All rights reserved.
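The scaling described in the abstract (accuracy rising with the number of records and falling with effective population size) can be sketched with the widely used approximation r ≈ sqrt(N h² / (N h² + Me)), where Me, the effective number of independently segregating chromosome segments, grows with effective population size. The parameter values below are illustrative assumptions, not the authors' exact inputs, so the printed accuracies will not reproduce the paper's figures:

```python
import math

def prediction_accuracy(n_records: float, h2: float, m_e: float) -> float:
    """Expected accuracy of genomic prediction (Daetwyler-style approximation):
    r = sqrt(N h^2 / (N h^2 + Me)), where Me is the effective number of
    independently segregating chromosome segments."""
    nh2 = n_records * h2
    return math.sqrt(nh2 / (nh2 + m_e))

def effective_segments(n_e: float, genome_length_morgans: float) -> float:
    """Rough Me ~ 2 * Ne * L. Me grows with effective population size,
    which is why humans need far more records than cattle."""
    return 2.0 * n_e * genome_length_morgans

# Illustrative values (assumed for this sketch):
me_cattle = effective_segments(n_e=100, genome_length_morgans=30)
me_human = effective_segments(n_e=10_000, genome_length_morgans=35)

acc_cattle = prediction_accuracy(n_records=2_500, h2=0.8, m_e=me_cattle)
acc_human = prediction_accuracy(n_records=145_000, h2=0.8, m_e=me_human)
print(f"cattle: {acc_cattle:.2f}, human: {acc_human:.2f}")
```

With the same number of records and heritability, the larger human Me drives accuracy down sharply, which is the qualitative point of the abstract's cattle-versus-human comparison.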

Vesely M.D.,University of Washington | Kershaw M.H.,Peter MacCallum Cancer Center | Kershaw M.H.,University of Melbourne | Kershaw M.H.,Monash University | And 3 more authors.
Annual Review of Immunology | Year: 2011

The immune system can identify and destroy nascent tumor cells in a process termed cancer immunosurveillance, which functions as an important defense against cancer. Recently, data obtained from numerous investigations in mouse models of cancer and in humans with cancer offer compelling evidence that particular innate and adaptive immune cell types, effector molecules, and pathways can sometimes collectively function as extrinsic tumor-suppressor mechanisms. However, the immune system can also promote tumor progression. Together, the dual host-protective and tumor-promoting actions of immunity are referred to as cancer immunoediting. In this review, we discuss the current experimental and human clinical data supporting a cancer immunoediting process that provide the fundamental basis for further study of immunity to cancer and for the rational design of immunotherapies against cancer. © 2011 by Annual Reviews. All rights reserved.

Powell J.E.,Queensland Institute of Medical Research | Visscher P.M.,Queensland Institute of Medical Research | Goddard M.E.,University of Melbourne | Goddard M.E.,Australian Department of Primary Industries and Fisheries
Nature Reviews Genetics | Year: 2010

Identity by descent (IBD) is a fundamental concept in genetics and refers to alleles that are descended from a common ancestor in a base population. Identity by state (IBS) simply refers to alleles that are the same, irrespective of whether they are inherited from a recent ancestor. In modern applications, IBD relationships are estimated from genetic markers in individuals without any known relationship. This can lead to erroneous inference because a consistent base population is not used. We argue that the purpose of most IBD calculations is to predict IBS at unobserved loci. Recognizing this aim leads to better methods of estimating IBD, with benefits in mapping genes, estimating genetic variance and predicting inbreeding depression. © 2010 Macmillan Publishers Limited. All rights reserved.
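The IBD/IBS distinction in the abstract can be made concrete with a toy relation: two alleles are identical by state either because they are identical by descent, or because two non-IBD alleles happen to match by chance. A minimal sketch under the simplifying assumption that the allele frequency p in the base population is known:

```python
def prob_ibs(f_ibd: float, p: float) -> float:
    """Probability that two alleles at a biallelic locus are identical by
    state (IBS), given probability f_ibd of being identical by descent (IBD)
    and allele frequency p in the base population.
    If not IBD, two alleles drawn at random from the population still match
    by chance with probability p^2 + (1-p)^2."""
    q = 1.0 - p
    return f_ibd + (1.0 - f_ibd) * (p * p + q * q)

# Even completely unrelated alleles (f_ibd = 0) are often IBS at a common
# variant, which is why IBS alone overstates relatedness:
print(prob_ibs(0.0, 0.9))   # chance matching alone
print(prob_ibs(0.25, 0.9))  # alleles with IBD probability 1/4
```

This also illustrates why IBD estimates depend on the base population: changing p changes the chance-matching term, so the same observed IBS implies a different IBD probability.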

Scott A.M.,University of Melbourne | Wolchok J.D.,Sloan Kettering Cancer Center | Wolchok J.D.,New York Medical College | Old L.J.,Sloan Kettering Cancer Center | Old L.J.,New York Medical College
Nature Reviews Cancer | Year: 2012

The use of monoclonal antibodies (mAbs) for cancer therapy has achieved considerable success in recent years. Antibody-drug conjugates are powerful new treatment options for lymphomas and solid tumours, and immunomodulatory antibodies have also recently achieved remarkable clinical success. The development of therapeutic antibodies requires a deep understanding of cancer serology, protein-engineering techniques, mechanisms of action and resistance, and the interplay between the immune system and cancer cells. This Review outlines the fundamental strategies that are required to develop antibody therapies for cancer patients through iterative approaches to target and antibody selection, extending from preclinical studies to human trials. © 2012 Macmillan Publishers Limited. All rights reserved.

Seddon P.B.,University of Melbourne | Calvert C.,Monash University | Yang S.,University of Melbourne
MIS Quarterly: Management Information Systems | Year: 2010

This paper develops a long-term, multi-project model of factors affecting organizational benefits from enterprise systems (ES), then reports a preliminary test of the model. In the shorter-term half of the model, it is hypothesized that once a system has gone live, two factors, namely functional fit and overcoming organizational inertia, drive organizational benefits flowing from each major ES improvement project. The importance of these factors may vary from project to project. In the long-term half of the model, it is hypothesized that four additional factors, namely integration, process optimization, improved access to information, and on-going major ES business improvement projects, drive organizational benefits from ES over the long term. Preliminary tests of the model were conducted using data from 126 customer presentations from SAP's 2003 and 2005 Sapphire U.S. conferences. All six factors were found to be important in explaining variance in organizational benefits from enterprise systems from the perspective of senior management.

Lee S.H.,Queensland Institute of Medical Research | Wray N.R.,Queensland Institute of Medical Research | Goddard M.E.,Australian Department of Primary Industries and Fisheries | Goddard M.E.,University of Melbourne | Visscher P.M.,Queensland Institute of Medical Research
American Journal of Human Genetics | Year: 2011

Genome-wide association studies are designed to discover SNPs that are associated with a complex trait. Employing strict significance thresholds when testing individual SNPs avoids false positives at the expense of increasing false negatives. Recently, we developed a method for quantitative traits that estimates the variation accounted for when fitting all SNPs simultaneously. Here we develop this method further for case-control studies. We use a linear mixed model for analysis of binary traits and transform the estimates to a liability scale by adjusting both for scale and for ascertainment of the case samples. We show by theory and simulation that the method is unbiased. We apply the method to data from the Wellcome Trust Case Control Consortium and show that a substantial proportion of variation in liability for Crohn disease, bipolar disorder, and type I diabetes is tagged by common SNPs. © 2011 The American Society of Human Genetics.
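The scale and ascertainment adjustment described above has a closed form: h²(liability) = h²(observed) · K²(1−K)² / (P(1−P)z²), where K is the population prevalence, P the proportion of cases in the sample, and z the standard normal density at the liability threshold. A minimal Python sketch of that transformation (function and variable names are mine, not the authors' software):

```python
from scipy.stats import norm

def h2_liability(h2_obs, K, P):
    """Transform an observed-scale heritability estimate from an
    ascertained case-control sample to the liability scale.
    K: population prevalence; P: proportion of cases in the sample."""
    t = norm.isf(K)   # liability threshold for prevalence K
    z = norm.pdf(t)   # height of the normal density at the threshold
    return h2_obs * K**2 * (1 - K)**2 / (P * (1 - P) * z**2)

# e.g. a disease with 1% prevalence studied with 50% cases in the sample:
h2 = h2_liability(0.20, K=0.01, P=0.5)
```

When P = K (no ascertainment) the factor reduces to K(1−K)/z², so for rare diseases the liability-scale estimate is substantially larger than the observed-scale one; the extra K(1−K)/(P(1−P)) term then corrects for case oversampling.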

Turner S.,University of Melbourne | Rossjohn J.,Monash University
Immunity | Year: 2011

It is unclear how an effective T cell repertoire is built from a limited array of T cell receptor (TCR) genes. In this issue of Immunity, Stadinski et al. (2011) demonstrate that TCR variable (V) α chains can indirectly affect Vβ-mediated recognition of the major histocompatibility complex (MHC) molecule. © 2011 Elsevier Inc.

Gapin L.,University of Colorado at Denver | Godfrey D.I.,University of Melbourne | Rossjohn J.,Monash University
Current Opinion in Immunology | Year: 2013

Natural Killer T (NKT) cells are distinct lymphocyte lineages that recognize lipid antigens presented by the non-classical Major Histocompatibility Complex molecule CD1d. Two categories of NKT cells, type I and type II, have been described based on T-cell receptor expression and antigenic specificity. In both cases, increasing evidence suggests that recognition of self-antigens by these cells plays an important role not only in their development but also in their regulation of a broad range of immune responses. Here we review recent advances in our understanding of how and when NKT cell autoreactivity manifests itself, how the NKT T cell receptor engages self-antigens and the nature of these self-antigens. © 2013 Elsevier Ltd.

Visscher P.M.,Queensland Institute of Medical Research | Goddard M.E.,Australian Department of Primary Industries and Fisheries | Goddard M.E.,University of Melbourne
Nature Genetics | Year: 2011

Identifying causal variants for complex traits and understanding their function remain arduous tasks. A new study combines the advantages of gene mapping in livestock with elegant genetic and functional analyses to address these challenges and identifies candidate regulatory variants affecting stature in cattle. © 2011 Nature America, Inc. All rights reserved.

Yang J.,Queensland Institute of Medical Research | Lee S.H.,Queensland Institute of Medical Research | Goddard M.E.,University of Melbourne | Goddard M.E.,Australian Department of Primary Industries and Fisheries | Visscher P.M.,Queensland Institute of Medical Research
American Journal of Human Genetics | Year: 2011

For most human complex diseases and traits, SNPs identified by genome-wide association studies (GWAS) explain only a small fraction of the heritability. Here we report a user-friendly software tool called genome-wide complex trait analysis (GCTA), which implements a method we recently developed to address the "missing heritability" problem. GCTA estimates the variance explained by all the SNPs on a chromosome or on the whole genome for a complex trait rather than testing the association of any particular SNP to the trait. We introduce GCTA's five main functions: data management, estimation of the genetic relationships from SNPs, mixed linear model analysis of variance explained by the SNPs, estimation of the linkage disequilibrium structure, and GWAS simulation. We focus on the function of estimating the variance explained by all the SNPs on the X chromosome and testing the hypotheses of dosage compensation. The GCTA software is a versatile tool to estimate and partition complex trait variation with large GWAS data sets. © 2011 The American Society of Human Genetics.
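The "estimation of the genetic relationships from SNPs" step is the core of this approach: variance explained by all SNPs is estimated via REML on a genetic relationship matrix (GRM) built from standardised genotypes. A minimal numpy sketch of the textbook GRM construction (illustrative only; not GCTA's optimised implementation, and the names are mine):

```python
import numpy as np

def grm(X):
    """Genetic relationship matrix from an n x m genotype matrix
    (entries 0/1/2 = minor-allele counts). Each SNP is centred at 2p
    and scaled by sqrt(2p(1-p)); then A = Z Z' / m."""
    p = X.mean(axis=0) / 2.0                     # sample allele frequencies
    Z = (X - 2 * p) / np.sqrt(2 * p * (1 - p))   # standardised genotypes
    return Z @ Z.T / X.shape[1]

rng = np.random.default_rng(0)
freqs = rng.uniform(0.1, 0.5, size=500)
X = rng.binomial(2, freqs, size=(200, 500))      # 200 unrelated individuals
A = grm(X)
```

For unrelated simulated individuals the diagonal of A averages close to 1 and off-diagonal entries scatter around 0; fitting a mixed model with this covariance structure yields the SNP-based variance estimate.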

Rogelj J.,ETH Zurich | Meinshausen M.,Potsdam Institute for Climate Impact Research | Meinshausen M.,University of Melbourne | Knutti R.,ETH Zurich
Nature Climate Change | Year: 2012

Climate projections for the fourth assessment report (AR4) of the Intergovernmental Panel on Climate Change (IPCC) were based on scenarios from the Special Report on Emissions Scenarios (SRES) and simulations of the third phase of the Coupled Model Intercomparison Project (CMIP3). Since then, a new set of four scenarios (the representative concentration pathways or RCPs) was designed. Climate projections in the IPCC fifth assessment report (AR5) will be based on the fifth phase of the Coupled Model Intercomparison Project (CMIP5), which incorporates the latest versions of climate models and focuses on RCPs. This implies that by AR5 both models and scenarios will have changed, making a comparison with earlier literature challenging. To facilitate this comparison, we provide probabilistic climate projections of both SRES scenarios and RCPs in a single consistent framework. These estimates are based on a model set-up that probabilistically takes into account the overall consensus understanding of climate sensitivity uncertainty, synthesizes the understanding of climate system and carbon-cycle behaviour, and is at the same time constrained by the observed historical warming. © 2012 Macmillan Publishers Limited. All rights reserved.

Meuwissen T.,Norwegian University of Life Sciences | Goddard M.,University of Melbourne | Goddard M.,Australian Department of Primary Industries and Fisheries
Genetics | Year: 2010

Whole-genome resequencing technology has improved rapidly in recent years and is expected to improve further, such that sequencing an entire human genome for $1000 is within reach. Our main aim here is to use whole-genome sequence data for the prediction of genetic values of individuals for complex traits and to explore the accuracy of such predictions. This is relevant for the fields of plant and animal breeding and, in human genetics, for the prediction of an individual's risk for complex diseases. Here, population history and genomic architectures were simulated under the Wright-Fisher population and infinite-sites mutation model, and prediction of genetic value was by the genomic selection approach, where a Bayesian nonlinear model was used to predict the effects of individual SNPs. The Bayesian model assumed a priori that only a few SNPs are causative, i.e., have an effect different from zero. When using whole-genome sequence data, accuracies of prediction of genetic value were increased by >40% relative to the use of dense ∼30K SNP chips. At equal high density, the inclusion of the causative mutations yielded an extra increase in accuracy of 2.5-3.7%. Predictions of genetic value remained accurate even when the training and evaluation data were 10 generations apart. Best linear unbiased prediction (BLUP) of SNP effects does not take full advantage of the genome sequence data, and nonlinear predictions, such as the Bayesian method used here, are needed to achieve maximum accuracy. On the basis of theoretical work, the results could be extended to more realistic genome and population sizes. Copyright © 2010 by the Genetics Society of America.
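The BLUP baseline contrasted with the Bayesian method above is equivalent to ridge regression of phenotypes on SNP genotypes with shrinkage λ = m(1−h²)/h². A toy simulation sketching that baseline (the sparse-effects setup, sample sizes, and seed are illustrative choices of mine, not the paper's Wright-Fisher simulations):

```python
import numpy as np

rng = np.random.default_rng(42)
n_train, n_test, m, h2 = 500, 200, 200, 0.5

# Simulate centred genotypes and a sparse set of causal SNP effects.
X = rng.binomial(2, 0.5, size=(n_train + n_test, m)).astype(float)
X -= X.mean(axis=0)
beta = np.zeros(m)
causal = rng.choice(m, size=20, replace=False)
beta[causal] = rng.normal(size=20)

g = X @ beta                       # true genetic values
g *= np.sqrt(h2 / g.var())         # scale genetic variance to h2
y = g + rng.normal(scale=np.sqrt(1 - h2), size=g.size)

# SNP-BLUP: ridge regression with lambda = m(1 - h2)/h2.
lam = m * (1 - h2) / h2
Xtr, ytr = X[:n_train], y[:n_train]
b_hat = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(m), Xtr.T @ ytr)

# Accuracy = correlation of predicted and true genetic values in test set.
accuracy = np.corrcoef(X[n_train:] @ b_hat, g[n_train:])[0, 1]
```

Because every SNP effect is shrunk equally, BLUP cannot exploit the sparsity that sequence data exposes, which is the paper's motivation for nonlinear, variable-selection predictions.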

Voskoboinik I.,Peter MacCallum Cancer Center | Voskoboinik I.,University of Melbourne | Whisstock J.C.,Monash University | Trapani J.A.,Peter MacCallum Cancer Center | Trapani J.A.,University of Melbourne
Nature Reviews Immunology | Year: 2015

A defining property of cytotoxic lymphocytes is their expression and regulated secretion of potent toxins, including the pore-forming protein perforin and serine protease granzymes. Until recently, mechanisms of pore formation and granzyme transfer into the target cell were poorly understood, but advances in structural and cellular biology have now begun to unravel how synergy between perforin and granzymes brings about target cell death. These and other advances are demonstrating the surprisingly broad pathophysiological roles of the perforin-granzyme pathway, and this has important implications for understanding immune homeostasis and for developing immunotherapies for cancer and other diseases. In particular, we are beginning to define and understand a range of human diseases that are associated with a failure to deliver active perforin to target cells. In this Review, we discuss the current understanding of the structural, cellular and clinical aspects of perforin and granzyme biology. © 2015 Macmillan Publishers Limited.

Paterson B.M.,University of Melbourne | Paterson B.M.,Bio21 Molecular Science and Biotechnology Institute | Donnelly P.S.,University of Melbourne | Donnelly P.S.,Bio21 Molecular Science and Biotechnology Institute
Chemical Society Reviews | Year: 2011

The molecules known as bis(thiosemicarbazones) derived from 1,2-diones can act as tetradentate ligands for Cu(ii), forming stable, neutral complexes. As a family, these complexes possess fascinating biological activity. This critical review presents an historical perspective of their progression from potential chemotherapeutics through to more recent applications in nuclear medicine. Methods of synthesis are presented followed by studies focusing on their potential application as anti-cancer agents and more recent investigations into their potential as therapeutics for Alzheimer's disease. The Cu(ii) complexes are of sufficient stability to be used to coordinate copper radioisotopes for application in diagnostic and therapeutic radiopharmaceuticals. Detailed understanding of the coordination chemistry has allowed careful manipulation of the metal based properties to engineer specific biological activities. Perhaps the most promising complex radiolabelled with copper radioisotopes to date is Cu(ii)(atsm), which has progressed to clinical trials in humans (162 references). © 2011 The Royal Society of Chemistry.

Fornito A.,Monash University | Zalesky A.,University of Melbourne | Breakspear M.,QIMR Berghofer Medical Research Institute | Breakspear M.,Royal Brisbane and Womens Hospital
Nature Reviews Neuroscience | Year: 2015

Pathological perturbations of the brain are rarely confined to a single locus; instead, they often spread via axonal pathways to influence other regions. Patterns of such disease propagation are constrained by the extraordinarily complex, yet highly organized, topology of the underlying neural architecture: the so-called connectome. Thus, network organization fundamentally influences brain disease, and a connectomic approach grounded in network science is integral to understanding neuropathology. Here, we consider how brain-network topology shapes neural responses to damage, highlighting key maladaptive processes (such as diaschisis, transneuronal degeneration and dedifferentiation), and the resources (including degeneracy and reserve) and processes (such as compensation) that enable adaptation. We then show how knowledge of network topology allows us not only to describe pathological processes but also to generate predictive models of the spread and functional consequences of brain disease. © 2015 Macmillan Publishers Limited. All rights reserved.

Mereghetti S.,Istituto di Astrofisica Spaziale e Fisica Cosmica | Pons J.A.,University of Alicante | Melatos A.,University of Melbourne
Space Science Reviews | Year: 2015

Magnetars are neutron stars in which a strong magnetic field is the main energy source. About two dozen magnetars, plus several candidates, are currently known in our Galaxy and in the Magellanic Clouds. They appear as highly variable X-ray sources and, in some cases, also as radio and/or optical pulsars. Their spin periods (2–12 s) and spin-down rates (∼10⁻¹³–10⁻¹⁰ s s⁻¹) indicate external dipole fields of ∼10¹³–10¹⁵ G, and there is evidence that even stronger magnetic fields are present inside the star and in non-dipolar magnetospheric components. Here we review the observed properties of the persistent emission from magnetars, discuss the main models proposed to explain the origin of their magnetic field and present recent developments in the study of their evolution and connection with other classes of neutron stars. © 2015, Springer Science+Business Media Dordrecht.
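The quoted field strengths follow from the standard vacuum-dipole spin-down estimate, B ≈ 3.2×10¹⁹ √(P Ṗ) G. A quick sanity check with mid-range values from the abstract (the specific P and Ṗ values are illustrative):

```python
def dipole_field_gauss(P, Pdot):
    """Characteristic surface dipole field (gauss) from the spin period
    P (s) and spin-down rate Pdot (s/s), standard vacuum-dipole estimate."""
    return 3.2e19 * (P * Pdot) ** 0.5

# A period of 8 s with Pdot = 1e-11 s/s, typical magnetar values:
B = dipole_field_gauss(8.0, 1e-11)
```

This gives B ≈ 2.9×10¹⁴ G, squarely inside the ∼10¹³–10¹⁵ G range quoted for magnetar dipole fields.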

Vinkhuyzen A.A.E.,University of Queensland | Wray N.R.,University of Queensland | Yang J.,University of Queensland | Goddard M.E.,University of Melbourne | And 2 more authors.
Annual Review of Genetics | Year: 2013

Understanding genetic variation of complex traits in human populations has moved from the quantification of the resemblance between close relatives to the dissection of genetic variation into the contributions of individual genomic loci. However, major questions remain unanswered: How much phenotypic variation is genetic; how much of the genetic variation is additive and can be explained by fitting all genetic variants simultaneously in one model, and what is the joint distribution of effect size and allele frequency at causal variants? We review and compare three whole-genome analysis methods that use mixed linear models (MLMs) to estimate genetic variation. In all methods, genetic variation is estimated from the relationship between close or distant relatives on the basis of pedigree information and/or single nucleotide polymorphisms (SNPs). We discuss theory, estimation procedures, bias, and precision of each method and review recent advances in the dissection of genetic variation of complex traits in human populations. By using genome-wide data, it is now established that SNPs in total account for far more of the genetic variation than the statistically highly significant SNPs that have been detected in genome-wide association studies. All SNPs together, however, do not account for all of the genetic variance estimated by pedigree-based methods. We explain possible reasons for this remaining "missing heritability." © 2013 by Annual Reviews. All rights reserved.

Kershaw M.H.,University of Melbourne | Kershaw M.H.,Monash University | Westwood J.A.,University of Melbourne | Darcy P.K.,University of Melbourne | Darcy P.K.,Monash University
Nature Reviews Cancer | Year: 2013

T cells have the capacity to eradicate diseased cells, but tumours present considerable challenges that render T cells ineffectual. Cancer cells often make themselves almost 'invisible' to the immune system, and they sculpt a microenvironment that suppresses T cell activity, survival and migration. Genetic engineering of T cells can be used therapeutically to overcome these challenges. T cells can be taken from the blood of cancer patients and then modified with genes encoding receptors that recognize cancer-specific antigens. Additional genes can be used to enable resistance to immunosuppression, to extend survival and to facilitate the penetration of engineered T cells into tumours. Using genetic modification, highly active, self-propagating 'slayers' of cancer cells can be generated. © 2013 Macmillan Publishers Limited.

Foley D.L.,University of Melbourne | Morley K.I.,University of Melbourne | Morley K.I.,Wellcome Trust Sanger Institute
Archives of General Psychiatry | Year: 2011

Context: The increased mortality associated with schizophrenia is largely due to cardiovascular disease. Treatment with antipsychotics is associated with weight gain and changes in other cardiovascular risk factors. Early identification of modifiable cardiovascular risk factors is a clinical imperative but prospective longitudinal studies of the early cardiometabolic adverse effects of antipsychotic drug treatment other than weight gain have not been previously reviewed. Objectives: To assess the methods and reporting of cardiometabolic outcome studies of the first treated episode of psychosis, review key findings, and suggest directions for future research. Data Sources: PsycINFO, MEDLINE, and Scopus from January 1990 to June 2010. Study Selection: Subjects were experiencing their first treated episode of psychosis. Subjects were antipsychotic naive or had been exposed to antipsychotics for a short known period at the beginning of the study. Cardiometabolic indices were assessed. Studies used a longitudinal design. Data Extraction: Sixty-four articles were identified describing 53 independent studies; 25 studies met inclusion criteria and were retained for detailed review. Data Synthesis: Consolidated Standards of Reporting Trials and Strengthening the Reporting of Observational Studies in Epidemiology checklists were used to assess the methods and reporting of studies. A qualitative review of findings was conducted. Conclusions: Two key hypotheses were identified based on this review: (1) in general, there is no difference in cardiovascular risk assessed by weight or metabolic indices between individuals with an untreated first episode of psychosis and healthy controls and (2) cardiovascular risk increases after first exposure to any antipsychotic drug. A rank order of drugs can be derived but there is no evidence of significant class differences. 
Recommended directions for future research include assessing the effect on cardiometabolic outcomes of medication adherence and dosage effects, determining the therapeutic window for antipsychotic use in adults and youth, and testing for moderation of outcomes by demographic factors, including sex and age, and clinical and genetic factors. ©2011 American Medical Association. All rights reserved.

Barrow S.J.,University of Melbourne | Funston A.M.,Monash University | Wei X.,University of Melbourne | Mulvaney P.,University of Melbourne
Nano Today | Year: 2013

We review recent progress on the assembly of metal nanocrystals using dithiol and DNA based bifunctional linkers to create discrete plasmonic superstructures. The structures formed include one-dimensional linear arrays, two-dimensional trimers and tetramers as well as stable three-dimensional assemblies built up on a substrate. We outline specific aspects and challenges within the DNA-assembly technique, including control of the desired interparticle spacing. The optical properties of a number of general classes of assemblies are described, as are the consequences of symmetry breaking, such as the formation of Fano-like resonances. The assembly and optical properties of unique three-dimensional structures are described along with a hybrid top-down and bottom-up technique for obtaining long, linear arrays of crystalline metal nanoparticles. © 2013 Elsevier Ltd. All rights reserved.

Shanks G.,University of Melbourne | Weber R.,Monash University
MIS Quarterly: Management Information Systems | Year: 2012

Allen and March provide a critique of one of our papers in which we argue composites should be represented as entities/objects in a conceptual model rather than relationships/associations (Shanks et al. 2008). They contend we have addressed a non-issue. Furthermore, they argue our theoretical rationale and empirical evidence have flaws. In this paper, we provide a response to their arguments. We show that the issue we address is substantive. We show, also, that our theoretical analysis and empirical results are robust. We find, instead, that Allen and March's theoretical arguments and empirical evidence have flaws. Copyright © 2012.

Rossjohn J.,Monash University | Rossjohn J.,University of Cardiff | Gras S.,Monash University | Miles J.J.,University of Cardiff | And 4 more authors.
Annual Review of Immunology | Year: 2015

The Major Histocompatibility Complex (MHC) locus encodes classical MHC class I and MHC class II molecules and nonclassical MHC-I molecules. The architecture of these molecules is ideally suited to capture and present an array of peptide antigens (Ags). In addition, the CD1 family members and MR1 are MHC class I-like molecules that bind lipid-based Ags and vitamin B precursors, respectively. These Ag-bound molecules are subsequently recognized by T cell antigen receptors (TCRs) expressed on the surface of T lymphocytes. Structural and associated functional studies have been highly informative in providing insight into these interactions, which are crucial to immunity, and how they can lead to aberrant T cell reactivity. Investigators have determined over thirty unique TCR-peptide-MHC-I complex structures and twenty unique TCR-peptide-MHC-II complex structures. These investigations have shown a broad consensus in docking geometry and provided insight into MHC restriction. Structural studies on TCR-mediated recognition of lipid and metabolite Ags have been mostly confined to TCRs from innate-like natural killer T cells and mucosal-associated invariant T cells, respectively. These studies revealed clear differences between TCR-lipid-CD1, TCR-metabolite-MR1, and TCR-peptide-MHC recognition. Accordingly, TCRs show remarkable structural and biological versatility in engaging different classes of Ag that are presented by polymorphic and monomorphic Ag-presenting molecules of the immune system. © 2015 by Annual Reviews. All rights reserved.

Wiede F.,Monash University | La Gruta N.L.,University of Melbourne | Tiganis T.,Monash University
Nature Communications | Year: 2014

When the peripheral T-cell pool is depleted, T cells undergo homoeostatic expansion. This expansion is reliant on the recognition of self-antigens and/or cytokines, in particular interleukin-7. The T cell-intrinsic mechanisms that prevent excessive homoeostatic T-cell responses and consequent overt autoreactivity remain poorly defined. Here we show that protein tyrosine phosphatase N2 (PTPN2) is elevated in naive T cells leaving the thymus to restrict homoeostatic T-cell proliferation and prevent excess responses to self-antigens in the periphery. PTPN2-deficient CD8+ T cells undergo rapid lymphopenia-induced proliferation (LIP) when transferred into lymphopenic hosts and acquire the characteristics of antigen-experienced effector T cells. The enhanced LIP is attributed to elevated T-cell receptor-dependent, but not interleukin-7-dependent, responses, and results in a skewed T-cell receptor repertoire and the development of autoimmunity. Our results identify a major mechanism by which homoeostatic T-cell responses are tuned to prevent the development of autoimmune and inflammatory disorders. © 2014 Macmillan Publishers Limited. All rights reserved.

Agency: Cordis | Branch: H2020 | Program: RIA | Phase: ICT-11-2014 | Award Amount: 7.27M | Year: 2015

OrganiCity offers a new paradigm to European digital city making. Built on and extending the FIRE legacy, this project seeks to build a strong foundation for future sustainable cities through co-creation by a wide range of stakeholders. Globally, Europe is a champion of sustainable, inclusive and open societies. The digital age enables us to push this position further and to rethink the way we create cities and facilitate living by integrating many complex systems. OrganiCity combines top-down planning and operations with flexible bottom-up initiatives where citizen involvement is key. So far, this has been difficult to achieve. Previous attempts to scale informal one-off projects or broaden single community projects have failed. By focusing on the city as a sociotechnical whole, OrganiCity brings software, hardware and associated human processes flexibly together into a new living city that is replicable, scalable, as well as socially, environmentally and economically sustainable. Three clusters, Aarhus (DK), London (UK) and Santander (ES), recognised for their digital urban initiatives, bring their various stakeholders together into a coherent effort to develop an integrated Experimentation-as-a-Service facility, respecting ethical and privacy sensitivities and potentially improving the lives of millions of people. The OrganiCity consortium will create a novel set of tools for civic co-creation, well beyond the state of the art in trans-disciplinary participatory urban interaction design. The tools will be validated in each cluster and integrated across the three cities. In addition to citizen-centric joining of testbeds, partner technologies and enhancements, two open calls with a budget of €1.8M will permit 25-35 experiments to use the new facility and co-creation tools. The aim is to grow sustainable digital solutions for future cities that are adjusted to the culture and capacities of each city, unlocking improved services and novel markets.

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2013.1.5 | Award Amount: 8.44M | Year: 2013

The aim of the AU2EU project is to implement and demonstrate in a real-life environment an integrated eAuthentication and eAuthorization framework to enable trusted collaborations and delivery of services across different organizational/governmental jurisdictions. Consequently, the project aims at fostering the adoption of security and privacy-by-design technologies in European and global markets. This objective will be achieved by:
1) designing a joint eAuthentication and eAuthorization framework for cross-domain and jurisdictional collaborations, supporting different identity/attribute providers and organizational policies, and guaranteeing privacy, security and trust;
2) advancing the state-of-the-art by extending the joint eAuthentication and eAuthorization framework with assurance of claims, trust indicators, policy enforcement mechanisms and processing under encryption techniques to address specific security and confidentiality requirements of large distributed infrastructures;
3) implementing the joint eAuthentication and eAuthorization framework as a part of the platform that supports collaborative secure distributed storage, secure data processing and management in the cloud and offline scenarios;
4) deploying the designed framework and platform in two pilots on bio-security incident management and collaborative services in Australia and on eHealth and Ambient Assisted Living in Europe; and
5) validating the practical aspects of the developed platform such as scalability, efficiency, maturity and usability.
The aforementioned activities will contribute to increased trust, security and privacy, which in turn shall lead to the increased adoption of (cloud-based) critical infrastructures and collaborative delivery of services dealing with sensitive data.
AU2EU strategically invests in two pilots deploying the existing research results as well as the novel techniques developed in the project to bridge the gap between research and market adoption.
The project builds on existing schemes and research results, particularly on the results of the ABC4Trust project as well as the Trust in Digital Life (TDL) initiative, which initiated this project and will support its objectives by executing aligned activities defined in the TDL strategic research agenda. The project brings together a strong collaboration of leading industry (such as Philips, IBM, NEC, Thales), SMEs (such as Bicore) and research organizations of Europe (such as Eindhoven University of Technology) and Australia (such as CSIRO, Edith Cowan University, RMIT University, University of New South Wales & Macquarie University), as well as a large voluntary welfare association (the German Red Cross). The consortium is determined to make a sustained long-term impact through commercialization, open source and standardization of an open composable infrastructure for e-services where privacy and interoperability with existing technologies are guaranteed.

Weerasinghe H.C.,Monash University | Weerasinghe H.C.,University of Melbourne | Huang F.,Monash University | Cheng Y.-B.,Monash University
Nano Energy | Year: 2013

This article introduces the latest progress in the research on fabrication of flexible dye-sensitized solar cells (DSSCs), in particular the choice of flexible plastic substrates replacing the conventionally used rigid glass substrates. A major challenge for making DSSCs on plastic substrates is the temperature limitation of the substrate in producing the TiO2 working electrode. Several low-temperature fabrication methods for nano-porous TiO2 films, such as ball-milling, acid/water treatments, chemical vapor deposition and electrophoretic deposition, as well as recently studied chemical and mechanical film processing methods such as chemical sintering, hydrothermal treatment, microwave irradiation, and different compression techniques are extensively discussed here. It also presents studies on new fabrication methods of flexible counter electrodes and recently developed new materials particularly useful for flexible DSSCs. Finally, the developments and prospects of fabricating large-scale flexible DSSC modules and their durability are discussed. © 2012 Elsevier Ltd.

Agency: Cordis | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.2.2.1-2 | Award Amount: 9.24M | Year: 2013

Affective and non-affective psychoses have a major negative impact on human society. They account for 6.3% of the global burden of disease and cost €207 billion per year in Europe alone, making them the most expensive brain-related disorders and even more expensive than cardiovascular diseases. This socioeconomic burden is largely caused by two core disease features: onset in adolescence and early adulthood and long-term disabling disease courses. Both factors lead to enduring social and vocational exclusion and contribute to 8-20 times higher suicide rates in affected patients. Reliable and accessible prognostic tools will alleviate this burden by enabling individualised risk prediction, thus facilitating the targeted prevention of psychoses. Thus, we will first use routine brain imaging and complementary data to optimise our candidate biomarkers for the prediction and staging of psychoses and generate a prognostic system that generalises well across mental health services. Secondly, we will implement new multi-modal risk quantification tools to predict mental health-related disability in young help-seekers. The fusion of these tools with clinical knowledge will produce cybernetic prognostic services that accurately identify help-seekers at the highest risk of psychosis, poor functioning and suicide-related mortality. During this project we will secure our intellectual property rights and transform into a European company to commercially exploit these prognostic services through internet-based telemedicine applications. This will provide psychosis risk profiling tools to diverse target groups in the healthcare markets, including care-givers, the pharmaceutical industry and research institutions. By disseminating objective risk quantification, these products will provide firm diagnostic grounds for preventive therapy, improving outcomes and reducing costs. Thus, they will offer a unique selling proposition to the mental health sectors in Europe and beyond.

Agency: GTR | Branch: BBSRC | Program: | Phase: Research Grant | Award Amount: 29.69K | Year: 2014

Abstracts are not currently available in GtR for all funded research. This is normally because the abstract was not required at the time of proposal submission, but may be because it included sensitive information such as personal details.

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ENERGY.2013.5.1.2 | Award Amount: 5.71M | Year: 2013

Carbon capture and storage (CCS) is one of the technological solutions to decarbonize the energy market while providing a secure energy supply. So far, the cost of CCS is dominated by CO2 capture, which is why new capture techniques should be developed. Adsorption techniques have already been evaluated for CO2 capture; the main drawbacks of this technique are the energy demand to regenerate the adsorbent and to obtain high-purity CO2. However, only commercially available materials were employed in the former evaluations. New materials with properties targeted at capturing CO2 from flue gases can improve the performance of adsorption processes significantly. The vision of MATESA is to develop an innovative post-combustion capture process termed Electric Swing Adsorption (ESA). The utilization of hybrid CO2 honeycomb monoliths with high-loading CO2 materials (zeolites and MOFs) will be targeted. Classical ESA regeneration is done by passing electricity through the adsorbent, releasing adsorbed CO2 that can be recovered at high purity. A game-changing innovation in MATESA is the development of a regeneration protocol where electricity is only used to increase the purity of CO2 in the column and further regeneration is done using available low-grade heat. The predicted energy savings of the developed process may transform this CO2 capture process into a key component for making CCS commercially feasible in fossil fuel power plants going into operation after 2020. In order to realize a proof of concept of the proposed process, a strong component of the project will deal with the development of a hybrid material that is able to selectively adsorb CO2, conduct electricity, result in a low pressure drop and have reduced environmental impact. The development of such a material is important for MATESA and will also have a significant impact in increasing the energy efficiency of pre-combustion CO2 capture and other energy-intensive gas separations.

NewSouth Innovations Pty Ltd and University of Melbourne | Date: 2015-11-03

The present disclosure provides a quantum processor realised in a semiconductor material and method to operate the quantum processor to implement adiabatic quantum computation. The quantum processor comprises a plurality of qubit elements disposed in a two-dimensional matrix arrangement. The qubits are implemented using the nuclear or electron spin of phosphorus donor atoms. Further, the processor comprises a control structure with a plurality of control members, each arranged to control a plurality of qubits disposed along a line or a column of the matrix. The control structure is controllable to perform adiabatic quantum error corrected computation.
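The shared row/column control scheme the abstract describes can be sketched in miniature. This is only an illustrative model of crossed control lines addressing a 2D grid, with invented class and method names, not the patented device:

```python
# Illustrative sketch of shared row/column control of a 2D qubit grid.
# Driving one row line and one column line together selects the single
# qubit at their intersection; all names and sizes here are hypothetical.

class QubitGrid:
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols

    def addressed_qubits(self, active_rows, active_cols):
        """Qubits controlled when the given row/column lines are driven."""
        return [(r, c) for r in sorted(active_rows) for c in sorted(active_cols)]

grid = QubitGrid(4, 4)
# Driving row 2 and column 3 together addresses exactly one qubit:
print(grid.addressed_qubits({2}, {3}))   # [(2, 3)]
```

The appeal of this layout is economy of wiring: a grid of N x N qubits needs only 2N control members, since each member is shared by an entire row or column.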

NewSouth Innovations Pty Ltd and University of Melbourne | Date: 2015-11-03

The present disclosure provides a quantum processor realised in a semiconductor material and method to operate the quantum processor to implement error corrected quantum computation. The quantum processor comprises a plurality of qubit elements disposed in a two-dimensional matrix arrangement. The qubits are implemented using the nuclear or electron spin of phosphorus donor atoms. Further, the processor comprises a control structure with a plurality of control members, each arranged to control a plurality of qubits disposed along a line or a column of the matrix. The control structure is controllable to perform topological quantum error corrected computation.

NewSouth Innovations Pty Ltd and University of Melbourne | Date: 2016-05-04

The present disclosure provides a quantum processor realised in a semiconductor material and method to operate the quantum processor to implement error corrected quantum computation. The quantum processor comprises a plurality of qubit elements disposed in a two-dimensional matrix arrangement. The qubits are implemented using the nuclear or electron spin of phosphorus donor atoms. Further, the processor comprises a control structure with a plurality of control members, each arranged to control a plurality of qubits disposed along a line or a column of the matrix. The control structure is controllable to perform topological quantum error corrected computation.

NewSouth Innovations Pty Ltd and University of Melbourne | Date: 2016-05-04

The present disclosure provides a quantum processor realised in a semiconductor material and method to operate the quantum processor to implement adiabatic quantum computation. The quantum processor comprises a plurality of qubit elements disposed in a two-dimensional matrix arrangement. The qubits are implemented using the nuclear or electron spin of phosphorus donor atoms. Further, the processor comprises a control structure with a plurality of control members, each arranged to control a plurality of qubits disposed along a line or a column of the matrix. The control structure is controllable to perform adiabatic quantum error corrected computation.

News Article | January 12, 2016

Researchers have conducted the first ever trials of smart pills that can measure intestinal gases inside the body, with surprising results revealing some unexpected ways that fiber affects the gut. Intestinal gases have been linked to colon cancer, irritable bowel syndrome (IBS) and inflammatory bowel disease (IBD), but their role in health is poorly understood and there is currently no easy and reliable tool for detecting them inside the gut. The first animal trials of smart gas sensing pills developed at Australia's RMIT University - which can send data from inside the gut directly to a mobile phone - have examined the impact of low and high-fiber diets on intestinal gases and offer new clues for the development of treatments for gut disorders. Lead investigator Professor Kourosh Kalantar-zadeh, from the Centre for Advanced Electronics and Sensors at RMIT, said the results reversed current assumptions about the effect of fiber on the gut. "We found a low-fiber diet produced four times more hydrogen in the small intestine than a high-fiber diet," Kalantar-zadeh said. "This was a complete surprise because hydrogen is produced through fermentation, so we naturally expected more fiber would equal more of this fermentation gas. "The smart pills allow us to identify precisely where the gases are produced and help us understand the microbial activity in these areas - it's the first step in demolishing the myths of food effects on our body and replacing those myths with hard facts. "We hope this technology will in future enable researchers to design personalized diets or drugs that can efficiently target problem areas in the gut, to help the millions of people worldwide who are affected by digestive disorders and diseases." The trials revealed different levels of fiber in a diet affected both how much gas was produced and where it was concentrated in the gut - in the stomach, small intestine or large intestine. 
The smart pills were trialed on two groups of pigs - whose digestive systems are similar to humans' - fed high- and low-fiber diets. The results indicate the technology could help doctors differentiate gut disorders such as IBS. The research, jointly conducted with the Department of Gastroenterology at The Alfred Hospital, Monash University, the University of Melbourne and CSIRO, is published in the January edition of the high-impact journal Gastroenterology.

News Article | February 2, 2016

Researchers from the University of Melbourne in Australia are studying how orangutans learn and make social choices by exposing the animals to digital technology such as Microsoft's Xbox Kinect system. In earlier studies, animal experts from Melbourne's Microsoft Research Centre for Social Natural User Interfaces and Zoos Victoria tried using touchscreen computers and tablets to examine the social interaction and cognitive challenges that orangutans typically experience. However, because of the animals' curiosity and immense strength, the researchers had to stay alongside the orangutans to guide them in using the computers or tablets, and the animals had to put their hands or fingers through a strong mesh to operate the devices. Despite these challenges, the researchers discovered that the orangutans had a penchant for using technology, especially when it gave them the chance to interact with humans. Sally Sherwen, an animal welfare expert from Zoos Victoria, wanted to allow the orangutans to use the technology as they see fit. She believes this would give the animals a richer and more engaging interaction, as they can use their full range of body movements. "They enjoyed using the tablet but we wanted to give them something more, something they can use when they choose to," Sherwen said. The researchers developed a new natural user interface (NUI) technology and incorporated it into the Xbox Kinect, a gaming console accessory that lets users perform virtual actions using their voice and body movements. Using the Xbox Kinect, the research team can now create a full body-sized projection that gives the orangutans the opportunity to engage with images through their body gestures. The Kinect projection serves as a touchscreen the animals can use without requiring any physical devices to be placed inside their enclosure. 
During testing this week, the researchers found that the orangutans were very receptive to the projected interface. Malu, a 12-year-old male orangutan at Melbourne Zoo, was shown a projection of a red dot. Once he saw it, he went over to the dot and kissed it, and the red dot exploded. When the projection reappeared, Malu kissed it again. Malu's behavior shows that orangutans use more than just their hands when they interact. The researchers aim to develop a new method of stimulation for the orangutans that would allow the animals to have fun while also motivating them to use their problem-solving skills. One of the team's primary goals is to find out how the orangutans, which are known to enjoy social engagement, behave toward humans when given control of the interaction. Various computer games, picture galleries and painting applications are now being developed for the orangutans to use.

NAPLES, Fla.--(BUSINESS WIRE)--A new online forum for analysis and commentary has opened, announced Leslie Norins, MD, PhD, publisher at Medvostat LLC, its parent company. The debut analysis on the site, authored by Dr. Norins himself, claims the decline in print circulation of the “Big Three” national newspapers—Wall Street Journal, New York Times, and USA Today—can be reversed. He also presents his prescription for the turnaround. Smartphones and digital media are convenient villains, he says, because blaming them has diverted publishers’ attention from necessary updating of print edition marketing. Dr. Norins believes reading the print edition of the Big Three marks a person as “substantial,” whereas a smartphone viewer could be consulting gossip, games, or even porn. “Print is positive, smartphones are iffy,” he says. He says one unsung advantage of the print edition page is its “glorious” display size, 264 square inches, versus the 13 square inches of his iPhone screen. Thus, print’s big page facilitates “serendipity,” which occurs when the reader’s peripheral vision unexpectedly notes nearby important, profitable articles on other subjects. He prescribes ads quoting millennials who benefit from the print edition, plus hiring “influencers” to tout the special advantages of this format. Dr. Norins also reports he found it difficult to find the Journal and Times on sale in Brooklyn, “in the backyard” of the Manhattan headquarters of the two papers. Dr. Norins has over 40 years’ publishing experience creating and growing over 80 subscriber-paid newsletters serving medical professionals. Before the publishing phase of his career, he was a physician-researcher. He received his AB from Johns Hopkins, his MD from Duke University Medical School, and his PhD from the University of Melbourne, where he studied with Sir Macfarlane Burnet, Nobel Laureate.

News Article | November 16, 2016

How much can we really know about someone based on their 140 characters? In a study published on Tuesday in the journal Social Psychological and Personality Science, researchers at the University of Pennsylvania, the Technical University of Darmstadt, and the University of Melbourne examined this question, digging into how stereotypes influence what we think about someone based on their tweets. In a series of four studies, 3,000 participants guessed the gender, age, education, and politics of 6,000 tweeters by looking at 20 publicly available tweets. The tweets were stripped of images or any other markers that might indicate demographics. The researchers asked each participant—who had enrolled in the study via Amazon Mechanical Turk—to look at a dozen or so tweets and make a judgment about the tweeter using one of the four variables. Participants were asked to judge only one marker, to prevent them from being influenced by their other answers. "We reversed the problem," said Daniel Preotiuc-Pietro, a researcher at the University of Pennsylvania. Instead of asking people about their stereotypes, which is the usual method, "we wanted to do this in the wild," he said. When you ask people to name their biases, "people might not be aware of them, or want to present themselves as unbiased," he said. By using natural language processing, the researchers could separate out the stereotypes; the aim was to figure out how stereotyping affected judgments. Participants were correct in their judgments, on average, 68% of the time. The main takeaway? Participants were mostly right, but their stereotypes were exaggerated, Preotiuc-Pietro said. So, for instance, when swearing was used, participants judged that the tweet came from a less-educated person. But they then applied that logic across the board, so they missed many cases in which that language came from people with advanced degrees. 
The characteristics ascribed to one variable also often influenced the other variables. The researchers trained machine learning algorithms on, for example, tweets by women, to "learn" the characteristics of feminine-sounding tweets. As it turned out, when participants made decisions in other areas, like judging political orientation, these cues influenced the outcome—feminine-sounding tweets were marked as liberal, and masculine-sounding ones were judged to be conservative. The subject matter itself also led participants to stereotype. Technology-related language, for instance, led participants to guess that a male had penned the tweet—which was mostly true, but judging it to be always true meant participants missed cases where women wrote about technology. Nor was it the case that some participants used stereotypes and others did not—across the board, everyone displayed some sort of bias. The findings have elicited further questions that the researchers are looking into, such as "Are women better at identifying other women?", the results of which are currently under review. The important point, said Preotiuc-Pietro, is to figure out ways to combat stereotypes. And how to do this? "Making people aware of their stereotypes towards certain groups so it can be intervened upon," Preotiuc-Pietro said. "If we can educate people about the ways these beliefs can steer them wrong, it will make people more socially accurate both online and off."
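The pattern the study describes — a cue that is genuinely informative on average but misleading when applied as an absolute rule — can be sketched with toy data. The numbers below are invented for illustration and are not the study's data:

```python
# Hypothetical illustration of an "exaggerated stereotype": a cue
# (swearing) really is correlated with the class (no degree), yet a
# hard rule based on it misses the counter-examples. Invented data.

tweets = [
    {"swears": True,  "degree": False},
    {"swears": True,  "degree": False},
    {"swears": True,  "degree": True},   # swearing by someone with a degree
    {"swears": False, "degree": True},
    {"swears": False, "degree": True},
    {"swears": False, "degree": False},  # non-swearing, no degree
]

# Stereotype rule: swearing => no degree, no swearing => degree.
correct = sum(t["swears"] != t["degree"] for t in tweets)
print(f"rule accuracy: {correct}/{len(tweets)}")  # rule accuracy: 4/6
```

The rule is right about two-thirds of the time, echoing the study's roughly 68% average accuracy, while being systematically wrong on exactly the cases where the cue and the category diverge.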

News Article | February 21, 2017

Monash University (Australia) and Cardiff University (UK) researchers have come a step further in understanding how the human immunodeficiency virus (HIV) evades the immune system. Declared a pandemic in 1987 by the World Health Organization, HIV infection has been responsible for 39 million deaths over the last 30 years. It remains one of the world's most significant public health challenges, and thus a greater understanding of how HIV functions is urgently needed so that researchers can design better therapies to target this devastating pathogen. Published today in Nature Structural and Molecular Biology, the Monash-Cardiff team has made an important finding in understanding how HIV-1 can evade the immune system. They demonstrated, in molecular detail, how mutations within HIV can lead to differing ways in which key immune molecules, termed the Major Histocompatibility Complex (MHC), display fragments of the virus and how this results in the HIV remaining "hidden" from the immune system. Principal author of the study, Dr Julian Vivian, said the team was yet to develop a complete understanding of how HIV outmanoeuvred our immune system. "This work uncovers a novel mechanism for HIV immune escape, which will be important to incorporate into future vaccine development and may have broader implications for immune recognition of MHC molecules," he said. The recent finding is part of a much larger international alliance between the two universities, with the Systems Immunity Research Institute (SIURI) at Cardiff University and the Monash Biomedicine Discovery Institute (BDI) having signed a Memorandum of Understanding. The five-year mutual agreement recognises a number of highly productive joint projects already being conducted around inflammation and immunity, and provides a mechanism for enabling additional innovative projects and student exchange in the areas of protective immunity, metabolism, autoimmunity and cancer. 
A Chief Investigator on the ARC CoE for Advanced Molecular Imaging, based at Monash BDI, Professor Jamie Rossjohn, said the finding was exciting and unexpected. "These results were only possible because of the close collaborative ties between Monash and Cardiff researchers." Cardiff University Vice-Chancellor, Professor Colin Riordan, said the signing of the MoU called for a celebration. "Formalising this collaboration is another step forward in what will continue to be a highly successful exchange program and transfer of knowledge between the two countries for the benefit of all." Monash BDI Director, Professor John Carroll, said the research demonstrated the power of international collaboration. "We are bringing together excellence in molecular and systems level immunity in this partnership, and I know it will lead to many more great discoveries." The $39 million ARC-funded Imaging CoE develops and uses innovative imaging technologies to visualise the molecular interactions that underpin the immune system. Featuring an internationally renowned team of lead scientists across five major Australian universities and academic and commercial partners globally, the Centre uses a truly multi-scale and programmatic approach to imaging to deliver maximum impact. The Imaging CoE is headquartered at Monash University with four collaborating organisations - La Trobe University, the University of Melbourne, the University of New South Wales and the University of Queensland. Committed to making the discoveries that will relieve the future burden of disease, the newly established Monash Biomedicine Discovery Institute at Monash University brings together more than 120 internationally renowned research teams. Our researchers are supported by world-class technology and infrastructure, and partner with industry, clinicians and researchers internationally to enhance lives through discovery.

News Article | September 15, 2016

Journey to the center of the cell: Nano-rods and worms wriggle best Abstract: When it comes to delivering drugs, nanoparticles shaped like rods and worms are the best bet for making the daunting journey to the centre of a cell, new Australian research suggests. A new study published in Nature Nanotechnology has answered a long-standing question that could lead to the design of better drug delivery vehicles: how nanoparticle shape affects the voyage through the cell. "We were able to show for the first time that nanoparticles shaped like rods and worms were more effective than spherical nanoparticles at traversing intracellular barriers and this enabled them to get all the way into the nucleus of the cell," says lead author UNSW's Dr Elizabeth Hinde. The study was led by chemists, engineers, and medical researchers from UNSW in a collaboration between the Australian Research Council Centre of Excellence in Advanced Molecular Imaging and the Australian Research Council Centre of Excellence in Bio-Nano Science. The Centres are both headquartered at Monash University, with research nodes at UNSW in Sydney. The team applied a new microscopy method to drug delivery for the first time, which allowed them to track the movement of differently shaped nanoparticles through a single cultured cancer cell, with very high temporal and spatial resolution. Using this method, the researchers were able to pinpoint where drugs were being released, and how they spread throughout the cell. They found that the cancer drug, doxorubicin, was most effective when it could breach the strong yet porous cellular barrier protecting the nucleus - the cell's control centre. Importantly, they discovered that a nanoparticle's shape influenced how well the drug breached the barrier. 
Dr Hinde, an Associate Investigator on the Imaging CoE, says researchers could previously see the overall distribution of their nanoparticles throughout a cell, but didn't have the microscopy tools to understand how this localisation was set up - a key limitation in drug delivery research. "You need to know how things arrive at their final destination in order to target them there. Now we have a tool to track this incredible journey to the centre of the cell. It means other research groups can use this to assess their nanoparticles and drug delivery systems. "They'll be able to work out how to tailor their particles to reach the nucleus or other structures in the cell, and gauge where the cargo is being dropped off. This wasn't possible before." The shape of things to come: rod, worm or sphere? Polymeric nanoparticles will play a vital role in the future of medicine: these ultra-tiny particles can carry drugs to help attack and kill cancer cells, selectively deliver drugs just to where they are needed, and yield breakthroughs in disease diagnostics and imaging. UNSW engineers fabricated four types of nanoparticles: one shaped like a rod, one like a worm, and two that were spherical in shape. These were labelled with fluorescent tags, and incubated in cancer cells. By combining a novel fluorescence microscopy approach with some statistical analysis, the team was able to create a clear picture of how each particle passed through the cell. While the spherical particles got blocked by the nuclear envelope, the rod and worm-shaped particles were able to pass through. This provides a pathway for the development of particles that can selectively target and kill cancer cells, without hurting healthy ones. Dr Hinde explains: "Cancer cells have different internal architecture than healthy cells. 
If we can fine-tune the dimensions of these rod-shaped nanoparticles, so they only pass through the cellular barriers in cancer cells and not healthy ones, we can reduce some of the side effects of chemotherapies." Opportunities for other research groups "The impact for the field is huge," says Scientia Professor Justin Gooding from UNSW and the ARC Centre of Excellence in Bio-Nano Science. "It gives us the ability to look inside the cell, see what the particles are doing, and design them to do exactly what we want them to do." "And this isn't just thanks to the microscope, but the information and data we can extract from the new analysis procedures we've developed. If other research groups can learn how to do this analysis, they can use the equipment already in their labs and get started tomorrow," says Professor Gooding. "People are going to see, suddenly, that they can get all sorts of new information about their particles." The researchers will soon be collaborating with Dr John McGhee from UNSW Art & Design, who combines scientific data, microscopy images, and computer-generated animation to create virtual reality renderings of the inside of human cells and blood vessels. The artworks allow researchers to visualise and go on VR walking tours through the body, and could help speed up the drug development process. About University of New South Wales About the ARC Centre of Excellence in Advanced Molecular Imaging The $39 million ARC-funded Imaging CoE develops and uses innovative imaging technologies to visualise the molecular interactions that underpin the immune system. Featuring an internationally renowned team of lead scientists across five major Australian universities and academic and commercial partners globally, the Centre uses a truly multi-scale and programmatic approach to imaging to deliver maximum impact. 
The Imaging CoE is headquartered at Monash University with four collaborating organisations - La Trobe University, the University of Melbourne, the University of New South Wales and the University of Queensland. About the ARC Centre of Excellence in Bio-Nano Science The ARC Centre of Excellence in Bio-Nano Science is a national innovator in bio-nano sciences and an incubator of the expertise and technological excellence required to develop next-generation bio-responsive nanomaterials. Five Australian universities collaborate in the Centre: the Universities of Melbourne, Queensland, New South Wales and South Australia, and Monash University, where the Centre is headquartered.

News Article | April 5, 2016

Perhaps you saw this graphic on the front page of The New York Times last week, leading into Amy Harmon's article about scientists from a variety of labs banding together in the fight against the Zika virus. The researchers' shared goal: sequence the genome of the virus' mosquito vector, Aedes aegypti, in the hope that a more complete knowledge of the insect's genetic makeup will lead to ideas on how to prevent it from transmitting the virus that causes disease in humans. (The last major—although incomplete—sequencing effort was published in 2007.) The New York Times caption (as it appears online) states that you're looking at "A visualization of the recently sequenced Aedes aegypti genome. Each of the 3,752 colored lines is a fragment of its three chromosomes..." But what does that mean? How do you read the graphic, and how was it built? To find out, I reached out to Mark Kunitomi, author of the chart and postdoctoral fellow in the Andino Lab at the University of California, San Francisco. The genome sequence data for this chart was produced by the Andino lab in collaboration with Pacific BioSciences. As noted in Harmon's article, other sequencing approaches are also currently being pursued to refine the map further. (To learn more about a variety of genome-reading technologies, see "Genomes for All" by George Church in the January 2006 issue of Scientific American. To learn more about challenges related to visualizing genomes, see "Similarities Between Human and Chimp Genomes Revealed by Hilbert Curve" by Martin Krzywinski.) Each of the colored lines in Kunitomi's graphic represents a string of chemical base pairs—the A, T, C and G of the mosquito's genetic code—whose accuracy researchers are highly confident about. These precisely known chemical base pair sequences are known as contigs. The detail below shows six of them. There are 3,752 contigs in the full map. The 2007 draft map included 36,206 contigs. 
The ultimate goal of continued sequencing efforts is to end up with just three lines: one continuous string of base pairs for each chromosome. The length of each colored line represents the number of base pairs in a contig, ranging from about 35,000 (the smallest visible line on the graphic) to 7,901,702. The full data set for this cell line of A. aegypti comprises about 1.7 billion base pairs, including both coding regions (genes) and non-coding regions of the genome. Each grouping of colored lines represents contigs that the researchers are fairly sure belong together, though some gaps, overlaps, conflicts, and/or other uncertainties may exist at the points of connection (circled in black, below). The position of each group within the full image grid is roughly based on size. Line shape (curves, squiggles, and loops) and orientation are arbitrary. Kunitomi created the graphic with the bioinformatics visualization tool Bandage, developed by Ryan Wick (currently a research assistant in Kathryn Holt's research group at the University of Melbourne). A description paper was published last year in the journal Bioinformatics; the software is available online, or you can clone the source code on GitHub. The bottom line? Researchers have made significant steps toward piecing together the genome of Aedes aegypti, but the map is still quite fragmented. Visualizations like this one allow researchers to zoom in and identify which regions still need more work, and allow non-specialists—like me—to track their progress.
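Summary statistics for a fragmented assembly like this can be computed from the contig lengths alone. The sketch below uses made-up lengths, not the actual A. aegypti contigs, and the standard N50 metric: the contig length at which half the total assembly lies in contigs that long or longer.

```python
# Assembly summary from contig lengths alone (hypothetical lengths).
# N50 is the length L such that contigs of length >= L cover at least
# half the total assembly; higher N50 means a less fragmented map.

def n50(lengths):
    half = sum(lengths) / 2
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running >= half:
            return length

contigs = [5_000_000, 4_000_000, 3_000_000, 2_000_000, 1_000_000]
print(len(contigs), "contigs,", sum(contigs), "bp, N50 =", n50(contigs))
# 5 contigs, 15000000 bp, N50 = 4000000
```

Going from 36,206 contigs in the 2007 draft to 3,752 here is exactly the kind of progress this metric captures: fewer, longer contigs, and a rising N50, with the end state being three contigs, one per chromosome.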

News Article | February 15, 2017

Australia could save AUD $3.4 billion (USD $2.3 billion) in healthcare costs over the remaining lifetimes of all Australians alive in 2010 by instituting a combination of taxes on unhealthy foods and subsidies on fruits and vegetables, according to a new study published in PLOS Medicine by Linda Cobiac, from the University of Melbourne, Australia, and colleagues. An increasing number of Western countries have implemented or proposed taxes on unhealthy foods and drinks in an attempt to curb rates of diet-related diseases; however, the cost-effectiveness of combining various taxes and subsidies is not well understood. In the new study, researchers modeled the effect of taxes on saturated fat, salt, sugar, and sugar-sweetened beverages and a subsidy on fruits and vegetables on the Australian population of 22 million alive in 2010. They simulated how different combinations of these taxes and subsidies--designed so there would be less than a one percent change in total food expenditure for the average household--impacted the death and morbidity rates of Australians as well as healthcare spending over the remainder of their lives. The greatest impact, the researchers concluded, came from a sugar tax, which could avert 270,000 disability-adjusted life years (DALYs, or years of healthy lifespan across the population lost due to disease). "That is a gain of 1.2 years of healthy life for every 100 Australians alive in 2010," the authors say. "Few other public health interventions could deliver such health gains on average across the whole population." A salt tax was estimated to save 130,000 DALYs over the remainder of the lives of Australians alive in 2010, a saturated fat tax 97,000 DALYs, and a sugar-sweetened beverage tax 12,000 DALYs. Combined with taxes, the fruit and vegetable subsidies made for additional averted DALYs and reduced health sector spending, but on their own were not estimated to lead to a clear health benefit. 
Overall, when combined to maximize benefits, the taxes and subsidies could save an estimated 470,000 DALYs and reduce spending by AUD $3.4 billion (USD $2.3 billion). "Simulation studies, such as ours, have uncertainty. For example, we are reliant on other research estimating the responsiveness of the public to changes in food prices. There are also implementation issues for the food industry." "Nevertheless, this study adds to the growing evidence of large health benefits and cost-effectiveness of using taxes and regulatory measures to influence the consumption of healthy foods," the authors say. "We believe that with such large potential health benefits for the Australian population and large benefits in reducing health sector spending...the formulation of a tax and subsidy package should be given more prominent and serious consideration in public health nutrition strategy." "Several countries have imposed taxes on sugary drinks, with the UK the latest to consider such a policy. Our research suggests that even bigger health gains and cost savings may be possible with food taxes and subsidies on a wider range of foods." Funding: LJC was supported by a National Health and Medical Research Council Fellowship (Grant number 1036771; http://www. ). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Competing Interests: LJC is a member of the Editorial Board of PLOS Medicine. Citation: Cobiac LJ, Tam K, Veerman L, Blakely T (2017) Taxes and Subsidies for Improving Diet and Population Health in Australia: A Cost-Effectiveness Modelling Study. PLoS Med 14(2): e1002232. 
doi:10.1371/journal.pmed.1002232. Author affiliations: Centre for Health Policy, School of Population and Global Health, University of Melbourne, Melbourne, Victoria, Australia; School of Public Health, The University of Queensland, Herston, Queensland, Australia; Burden of Disease Epidemiology, Equity and Cost-Effectiveness Programme, Department of Public Health, University of Otago, Wellington, New Zealand. The freely available paper is at: http://journals.

News Article | December 21, 2016

Scientists are forecasting ice-melting temperatures in the middle of winter for some parts of the Arctic for the second year in a row. And analysis shows such recent record temperatures there would have been virtually impossible without human greenhouse gas emissions. Over the coming days, some parts of the Arctic are expected to get gusts of warm air more than 20C warmer than usual for this time of year, some of which will tip over the 0C melting temperature of water. Maximum temperatures in parts of the Arctic will be warmer than the maximum over most of Canada for the next five days, according to the global forecasting system run by the US National Oceanic and Atmospheric Administration (Noaa). The predicted extreme temperatures coincide with record low sea-ice levels in the Arctic, which have already been wreaking havoc with weather in North America, Europe and Asia, according to leading climate scientists. A low pressure system near Greenland is pulling the warm air towards the Arctic, in a similar pattern to that seen in 2015. And a paper published this month showed events like that, called “midwinter warming”, were occurring more frequently, made more likely by the loss of winter sea ice – something itself caused by climate change. With less ice, warm air moved closer to the Arctic and could then more easily be swept over it, the scientists claimed. “These are very strange temperatures and are getting very close to hitting the freezing point, which is incredible for this time of year,” said Andrew King, a climate scientist from the University of Melbourne in Australia. But it’s not just predicted maximum temperatures that have been extreme. November and December have seen record average temperatures over the Arctic, averaging 2.5C above the usual for this time of year. 
Temperature anomalies like these have been linked to changes in the migration patterns of marine mammals, mass starvation and deaths of reindeer, and damage to the habitats of polar bears. Now King and colleagues have shown the recent extreme average temperatures are almost certainly caused by climate change. And while they are still rare events – expected once every 200 years – they will be average by the year 2040. King and colleagues compared model simulations with and without the influence of human-caused greenhouse gas concentrations. “The record November-December temperatures in the Arctic are not seen in the natural world simulations where human influences have been removed,” King wrote in a piece published in The Conversation. “In comparison, in the current climate with human effects included, this event has at least a one-in-200-year return time.” By the 2040s, the event is expected to occur every second year, on average. The work has not yet been published in a peer-reviewed journal, but uses methods the team have used several times before in work that has been peer-reviewed.
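The attribution logic described above can be sketched in miniature: count how often an event at least as extreme as the observed one occurs in simulated climates with and without human forcing, and compare the return periods. The distributions, shift magnitude and threshold below are invented for illustration; they are not the study's model output.

```python
# Hedged sketch of event attribution via return periods. All numbers
# here are assumptions for illustration, not the team's actual data.
import random

random.seed(42)
N = 100_000            # simulated "years" of climate per experiment
THRESHOLD = 2.5        # observed Nov-Dec Arctic anomaly, degrees C

# Toy climates: natural-only centred on zero anomaly; human-influenced
# shifted warm (the 1.5C shift is an invented illustration).
natural = [random.gauss(0.0, 0.8) for _ in range(N)]
current = [random.gauss(1.5, 0.8) for _ in range(N)]

def return_period(samples, threshold):
    """Average years between events at least as extreme as threshold."""
    exceedances = sum(1 for x in samples if x >= threshold)
    return float("inf") if exceedances == 0 else len(samples) / exceedances

print(f"natural-only climate: ~1-in-{return_period(natural, THRESHOLD):.0f} years")
print(f"with human influence: ~1-in-{return_period(current, THRESHOLD):.0f} years")
```

In the toy run the event is vastly rarer without human forcing, which is the qualitative shape of the team's finding; the real analysis used climate model ensembles, not Gaussian draws.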

News Article | February 28, 2017

eXplorance, the leading Learning Experience Management (LEM) solutions provider, announces an alliance partnership with Canvas, the Learning Management System (LMS) created by Instructure, to offer eXplorance’s all-in-one evaluation and feedback systems to all Instructure clients within the Canvas platform interface. Using the Blue® and Bluepulse® 2 LEM solutions within Canvas will give Instructure’s client base the tools to gather crucial learning experience analytics to improve student retention and engagement. Blue, an all-in-one evaluation platform, measures learners’ needs, skills and competencies through a range of evaluation mechanisms, while Bluepulse 2 is a live formative feedback platform built to gauge student sentiment and progress. Together, the tools provide actionable, data-driven insights to educators, leveraging a variety of applications such as course evaluations, online surveys and 360-degree feedback. This improves the effectiveness of teaching and learning and assists in meeting stakeholder needs, from the student up to the institution’s president or chancellor. eXplorance’s solutions focus on automating the evaluation process and providing comprehensive analytics reports, while leveraging deep data integrations. “The integration between the products was designed with the Canvas user in mind,” said Samer Saab, eXplorance’s CEO and Founder. “It is important to ensure that every stakeholder in the learning process can easily view the required analytics for his/her improvement in Canvas.” Through embedded integration with the Canvas LMS, students, faculty and administrators have all their feedback and assessment needs in one place, allowing them to use the same familiar Canvas dashboard, calendar, course listing and notification system for improving the learning and teaching process. 
“We’re excited to be able to bring eXplorance’s expertise in learning experience management to our clients around the world,” said Melissa Loble, VP of Partners & Programs at Instructure. “The Blue and Bluepulse products are a great asset for educational institutions who want to effectively reach data-driven student success.” For more details about the integration between eXplorance products and the Canvas LMS, or to learn how your organization can achieve enhanced student success, download the Canvas - Blue/Bluepulse integration brochures. eXplorance, a Learning Experience Management (LEM) solutions provider, empowers organizations to make the right decisions with fact-based learning experience analytics. eXplorance’s offerings, Blue® and Bluepulse®, help instill a culture of continuous improvement by assessing (e.g., course evaluations, institutional surveys, 360-degree feedback reviews, advisor assessments), analyzing, and addressing stakeholder needs, expectations, skills, knowledge, and competencies. Founded in 2003, eXplorance is a privately held corporation based in Montreal, Canada with offices in APAC, Europe, and Latin America. eXplorance has been named one of the Best Workplaces by the Great Place to Work Institute® for three consecutive years. eXplorance’s clients include academic institutions such as the University of Melbourne, University of Pennsylvania, University of Toronto, Zayed University, Del Mar College, Bowdoin College, IESE Business School, Xi'an Jiaotong-Liverpool University, University of Auckland, and Liverpool John Moores University, as well as organizations including Aramco, National Bank of Canada, and NASA.

News Article | February 8, 2017

Australian researchers are a step closer to understanding immune sensitivities to well-known, commonly prescribed medications. Many drugs are used successfully to treat diseases, but can also have harmful side effects. While it has been known that some drugs can unpredictably affect the functioning of the immune system, our understanding of this process has been unclear. The team investigated which drugs might activate a specialised type of immune cell, the MAIT cell (mucosal-associated invariant T cell). They found that some drugs prevented MAIT cells from detecting infections (their main role in our immune system), while other drugs activated the immune system, which may be undesirable. The results, published in Nature Immunology overnight, may lead to a much better understanding of, and an explanation for, immune reactions by some people to certain kinds of drugs. The findings may also offer a way to control the actions of MAIT cells in certain illnesses for better patient outcomes. The multidisciplinary team of researchers is part of the ARC Centre of Excellence in Advanced Molecular Imaging, and is drawn from Monash University, The University of Melbourne and The University of Queensland. Access to national research infrastructure, including the Australian Synchrotron, was instrumental to the team's success. Dr Andrew Keller from Monash University's Biomedicine Discovery Institute said that T cells are an integral part of the body's immune system. "They protect the body by 'checking' other cells for signs of infection and activating the immune system when they detect an invader," he said. "This arrangement is dependent on both the T cells knowing what they're looking for, and the other cells in the body giving them useful information." 
PhD student Weijun Xu from The University of Queensland's Institute for Molecular Bioscience used computer modelling to predict chemical structures, drugs and drug-like molecules that might impact on MAIT cell function. Such small compounds included salicylates, non-steroidal anti-inflammatory drugs like diclofenac, and drug metabolites. University of Melbourne Dr Sidonia Eckle from the Peter Doherty Institute for Infection and Immunity said the implications point to possible links between known drug hypersensitivities and MAIT cells. "A greater understanding of the interaction between MAIT cells and other host cells will hopefully allow us to better predict and avoid therapeutics that influence and cause harm," she said. "It also offers the tantalising prospect of future therapies that manipulate MAIT cell behaviour, for example, by enhancing or suppressing immune responses to achieve beneficial clinical outcome." Article: Drugs and drug-like molecules can modulate the function of mucosal-associated invariant T cells, Andrew N Keller, Sidonia B G Eckle, Weijun Xu, Ligong Liu, Victoria A Hughes, Jeffrey Y W Mak, Bronwyn S Meehan, Troi Pediongco, Richard W Birkinshaw, Zhenjun Chen, Huimeng Wang, Criselle D'Souza, Lars Kjer-Nielsen, Nicholas A Gherardin, Dale I Godfrey, Lyudmila Kostenko, Alexandra J Corbett, Anthony W Purcell, David P Fairlie, James McCluskey & Jamie Rossjohn, Nature Immunology, doi:10.1038/ni.3679, published online 6 February 2017.

News Article | November 30, 2016

Researchers from the Wellcome Trust Sanger Institute and Imperial College London have developed Microreact, a free, real-time epidemic visualisation and tracking platform that has been used to monitor outbreaks of Ebola, Zika and antibiotic-resistant microbes. The team have collaborated with the Microbiology Society to allow any researcher around the world to share their latest information about disease outbreaks. Details of the platform are published in the journal Microbial Genomics today (30 November 2016). Until now, disease data and geographic information about the movement of an infection or disease as it evolves and spreads has been locked up in databases that are often out of people's reach. Researchers have been left to rely on published information in research papers, which may be many months out of date, containing static visuals which show only a small part of the whole disease or infection threat. Microreact is a cloud-based system that combines the power of open data and the web, to provide real-time global data sharing and visualisation, allowing anyone to explore and examine outbreak information with unprecedented speed and detail. This is becoming increasingly important in the race to monitor and control fast-developing outbreaks like Ebola or Zika, or the growing threat of anti-microbial resistance. Microreact allows data and metadata sets to be uploaded via a web browser, which can then be visualised, shared and published in a research paper via a permanent web link. The partnership with Microbial Genomics allows the journal to make data from prospective publications available through Microreact. This promotes open availability and access while also starting to build a unique resource for global health professionals and scientists. 
Dr David Aanensen, Director of the Centre for Genomic Pathogen Surveillance (a joint initiative between Imperial College London and the Sanger Institute) and one of Microreact's creators, said: "Until now, the global research community has been hamstrung because results are generally only shared in static pictures or tables in publications. Microreact allows everyone to explore the information dynamically - across both time and space - letting them see the whole picture. Using Microreact takes disease tracking out of the hands of a privileged few and gives it to everyone who wants to understand disease evolution." One example of how Microreact can democratise genomic data and the resulting insights is the work of Dr Kathryn Holt and Professor Gordon Dougan. They have recently published two papers on the global distribution of typhoid bacteria, showing the epidemic spread of a multidrug-resistant strain. But they also published their data to Microreact to help others build on their work. Dr Kathryn Holt, from the University of Melbourne, said: "We gathered together data from almost 2,000 samples of Salmonella Typhi bacteria collected by 74 collaborators in 63 countries. By comparing the different strains and mapping them to when and where they were 'caught' we were able to show that a new drug-resistant strain emerged in Asia and has spread across that continent and into Africa. We have put all this information on Microreact and now anyone can see exactly what we saw - both scientists and those public health professionals tasked with controlling such outbreaks." By putting this information on Microreact, the researchers have ensured that the data continues to live on - allowing others to learn from their work and to use the information as a basis of comparison or foundation for future work. Microreact also allows individual researchers to share information globally and in real time - crowdsourcing new discoveries and insights that could have immediate impact. 
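As a rough illustration of the kind of tabular sample metadata shared through platforms like Microreact: the column names and rows below are invented for this sketch, not the actual typhoid dataset or Microreact's upload schema.

```python
# Illustrative only: tabulating outbreak metadata of the kind shared
# through epidemic-visualisation platforms. Columns and rows invented.
import csv
import io
from collections import Counter

data = """id,country,year,resistant
S1,India,2010,yes
S2,India,2012,yes
S3,Kenya,2013,yes
S4,Vietnam,2011,no
"""

rows = list(csv.DictReader(io.StringIO(data)))

# Count drug-resistant isolates per country - the kind of summary that,
# mapped over time and place, revealed spread from Asia into Africa.
resistant_by_country = Counter(
    r["country"] for r in rows if r["resistant"] == "yes"
)
print(resistant_by_country)  # Counter({'India': 2, 'Kenya': 1})
```

The point of a platform like Microreact is that such tables, paired with phylogenetic trees and coordinates, can be explored interactively by anyone rather than frozen into a static figure.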
Leighton Chipperfield, Director of Publishing at the Microbiology Society said: "We are delighted that our open-access, open-data journal Microbial Genomics is partnering with Microreact. All Microreact projects that appear in Microbial Genomics papers will be highlighted on the journal's website to increase the discoverability and accessibility of researchers' datasets." View the "Microreact: Open data for genomic epidemiology" video from Genomic Pathogen Surveillance at: https:/ Typhoid data: Wong et al. (2016) An extended genotyping framework for Salmonella enterica serovar Typhi, the cause of human typhoid. Nature Communications http://www. Global typhoid distribution in Microreact - https:/ The Centre for Genomic Pathogen Surveillance The Centre for Genomic Pathogen Surveillance is a joint initiative between Imperial College London and The Wellcome Trust Sanger Institute. The centre seeks to provide data and tools to allow researchers, doctors and governments worldwide to track and analyse the spread of pathogens and antimicrobial resistance. http://www. The Microbiology Society The Microbiology Society is a membership organisation for scientists who work in all areas of microbiology. It is the largest learned microbiological society in Europe with a worldwide membership based in universities, industry, hospitals, research institutes and schools. For more details, please visit http://www. Imperial College London Imperial College London is one of the world's leading universities. The College's 16,000 students and 8,000 staff are expanding the frontiers of knowledge in science, medicine, engineering and business, and translating their discoveries into benefits for society. Founded in 1907, Imperial builds on a distinguished past - having pioneered penicillin, holography and fibre optics - to shape the future. 
Imperial researchers work across disciplines to improve health and wellbeing, understand the natural world, engineer novel solutions and lead the data revolution. This blend of academic excellence and its real-world application feeds into Imperial's exceptional learning environment, where students participate in research to push the limits of their degrees. Imperial collaborates widely to achieve greater impact. It works with the NHS to improve healthcare in west London, is a leading partner in research and education within the European Union, and is the UK's number one research collaborator with China. Imperial has nine London campuses, including its White City Campus: a research and innovation centre that is in its initial stages of development in west London. At White City, researchers, businesses and higher education partners will co-locate to create value from ideas on a global scale. http://www. The Wellcome Trust Sanger Institute The Wellcome Trust Sanger Institute is one of the world's leading genome centres. Through its ability to conduct research at scale, it is able to engage in bold and long-term exploratory projects that are designed to influence and empower medical science globally. Institute research findings, generated through its own research programmes and through its leading role in international consortia, are being used to develop new diagnostics and treatments for human disease. http://www. Wellcome Wellcome exists to improve health for everyone by helping great ideas to thrive. We're a global charitable foundation, both politically and financially independent. We support scientists and researchers, take on big problems, fuel imaginations and spark debate. http://www.

News Article | April 12, 2016

Seven frog species in Australia are edging closer to extinction. Without immediate action to preserve them, the species could be wiped out entirely by a killer fungus, researchers say. Biologists from Taronga Zoo, the University of Melbourne, James Cook University and Southern Cross University in Lismore said that the seven frog species can still be saved from extinction by allotting a budget for research and disease management. According to lead author Lee Skerratt from the University of Melbourne, six Australian frog species have already gone extinct due to chytridiomycosis. With $15 million in funding for a research and management program, the remaining seven at-risk species could be saved from extinction within five years. "The deadly chytrid fungus has already wiped out six frog species since it reached Australia in 1978," said David Newell of Southern Cross University, Lismore. The seven frogs at risk are the northern and southern corroboree frogs of New South Wales, the Baw Baw frog of Victoria, the spotted tree frog (Litoria spenceri) of Victoria and New South Wales, the Kroombit tinker frog of Queensland, the armoured mist frog of Queensland, and the Tasmanian tree frog of Tasmania. Some of the at-risk frogs already have programs in place for dealing with chytrid fungus. The first threat abatement program, for captive breeding, was developed in 2006. "Unfortunately, there hasn't been any funds attached to the new threat abatement plan and there really needs to be, because we're a long way from being able to save these species," said Newell. Inquiries regarding the funding can be directed to Gregory Andrews, the Threatened Species Commissioner. "Putting a cost on a program before it has been designed is like asking how long is a piece of string," said Andrews. However, Andrews agreed that the chytridiomycosis threat to the frog species must be tackled quickly. His office already runs a program protecting the southern corroboree frog. 
"We built large chytrid-free outdoor enclosures in Kosciuszko National Park. We now have more than 200 in these enclosures, with a goal of 600 within a year," he added. According to Newell, though, this work still needs more support; the $15 million figure is an estimate from experts on chytrid fungus. He added that the funding is essential for developing captive husbandry techniques to maintain populations of the threatened frog species. In the study, researchers also identified 22 frog species at low to moderate risk from chytridiomycosis. Chytrid fungus attacks a frog's skin, making it difficult for the animal to breathe and manage its hydration. © 2016 Tech Times, All rights reserved. Do not reproduce without permission.

News Article | December 2, 2016

As our planet becomes increasingly urbanised, public health experts are spearheading innovative ways to adjust. We know that cities can make us ill: according to figures from the International Diabetes Federation, 387 million people globally were living with diabetes in 2014, rising to 415 million in 2015. Two-thirds of those people live in cities, experiencing poor diets and sedentary lifestyles. And our mental health suffers in cities, too: urban living has been found to raise the risk of anxiety and mood disorders by 21% and 39% respectively. While half the world's population currently lives in a city, this is predicted to rise to two-thirds by 2050. As they grow, cities will play a crucial role in finding solutions to many of our greatest public health challenges, from obesity and diabetes to communicable diseases like tuberculosis. With public health systems overstretched, and local governments pressed on all sides for resources and money, innovative solutions are needed. Public-private partnerships (PPPs) could be a source of new thinking, getting projects off the ground. So how can cities best build on PPPs to create health systems and fresh thinking so that our urban world will be a healthy one? How can public health bodies capitalise on the skills of the private sector without losing control? How can cities ensure equal access to healthcare for all residents? And what role should city mayors and other local government figures play in establishing innovative partnerships for health? Join an expert panel on Thursday 8 December, from 2pm to 3.30pm GMT, to discuss these questions and more. Niels Lund, vice president, Novo Nordisk and Cities Changing Diabetes spokesperson, Copenhagen, Denmark @lund_niels Cities Changing Diabetes is a programme to address the huge urban diabetes challenge. Niels has had an extensive career in international development with assignments for Unicef and the World Bank. 
Abdul El-Sayed, executive director and health officer, City of Detroit, United States @AbdulElSayed Abdul is turning around the fortunes of healthcare in one of America’s poorest cities, working with a variety of partners from all sectors. Laurence Carmichael, head, WHO Collaborating Centre for Healthy Urban Environments, UWE, Bristol, UK @laurencecarmich Laurence contributes to healthy cities research, consultancy and teaching in collaboration with local, national and international stakeholders including WHO-Europe. Julie Hughes, director, Institute for Market Transformation (IMT), co-director, City Energy Project, Washington DC, US The City Energy Project is a national initiative to create healthier and more prosperous American cities by improving the energy efficiency of buildings. IMT seeks market-based solutions to today’s climate and energy challenges. Susan Claris, associate director, transport consulting, Arup, London, UK, @Susan Claris Susan is a transport planner and anthropologist who has worked for Arup for more than twenty years. She has a particular interest in the many benefits that arise from making cities more walkable. Claudia Adirazola, director, Health and Road Safety, WRI Ross Center For Sustainable Cities, Washington DC, US Claudia works on a global strategy for addressing the public health impact of urban transportation and urban development. She has a background in the public sector in her home country of Peru. Tim Grandage, managing trustee, Future Hope, Kolkata, India Tim founded Future Hope, a charity that works with vulnerable children in Kolkata’s streets and slums, in 1987. Federico Cartin Arteaga, director, Rutas Naturbanas, San José, Costa Rica, @fedecartin Federico is an economist and urban planner. Rutas Naturbanas aims to revitalise urban rivers – to allow people to bike, walk and run – and eventually restore these water sheds. 
Alex Ross, director, World Health Organisation (WHO) Centre for Health Development (WHO Kobe Centre), Kobe, Japan, @directorwkc The WHO Kobe Centre has been working on urban health for over a decade, addressing health systems, health inequities and urban planning-health collaboration. Alex’s background is in international development, with roles at USAid and DfID. Billie Giles-Corti, lead of the NHMRC Centre for Research Excellence in Healthy Liveable Communities, University of Melbourne, Melbourne, Australia, @billiegc Billie heads a centre with the mission to provide research that informs healthy urban design and planning. She is the author of a 2016 Lancet series on urban design, transport and health. Jess Beagley, policy research officer, NCD Alliance, London, UK, @JessicaBeagley Jess leads NCD Alliance’s work on environment and health, with a particular focus on urbanisation and climate change and the opportunities for co-benefit solutions to promote human and planetary health. The live chat is not video or audio-enabled but will take place in the comments section (below). Want to recommend someone for the panel or ask a question in advance? Get in touch via or @GuardianGDP on Twitter. Follow the discussion using the hashtag #globaldevlive.

News Article | January 13, 2016

Scientists at the UCLA Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research have uncovered two specific markers that identify a stem cell able to generate heart muscle and the vessels that support heart function. This discovery may eventually aid in identifying ways to use stem cells to regenerate damaged heart tissue after a heart attack. Dr. Reza Ardehali, the study’s senior author, and his team published their findings in the journal Stem Cell Reports. “In a major heart attack, a person loses an estimated 1 billion heart cells, which results in permanent scar tissue in the heart muscle. Our findings seek to unlock some of the mysteries of heart regeneration in order to move the possibility of cardiovascular cell therapies forward,” said Ardehali, who is an associate professor of cardiology and a member of the UCLA Broad Stem Cell Research Center. “We have now found a way to identify the right type of stem cells that create heart cells that successfully engraft when transplanted and generate muscle tissue in the heart, which means we’re one step closer to developing cell-based therapies for people living with heart disease.” The method is still years away from being tested in humans, but the findings are a significant step forward in the use of human embryonic stem cells for heart regeneration. The research team used human embryonic stem cells, which are capable of turning into any cell in the body, to create cardiac mesoderm cells. Cardiac mesoderm cells have some stem cell characteristics, but only generate specific cell types found in the heart. The researchers pinpointed two distinct markers on cardiac mesoderm cells that specifically create heart muscle tissue and supporting vessels. They then transplanted these cells into an animal model and found that a significant number of the cells survived, integrated and produced cardiac cells, resulting in the regeneration of heart muscle and vessels. 
Ardehali, who is both a physician and a scientist, treats patients with advanced heart disease and also studies ways to cure or reverse heart disease. His goal is to one day be able to develop regenerative heart cells from stem cells and then transplant them into the heart through a minimally invasive procedure, replacing scar tissue and restoring heart function. Another study recently published by Ardehali and his team helps further this goal by outlining a novel approach to image, label and track transplanted cells in the heart using MRI, a common and non-invasive imaging technique. That study, which was published in the journal Stem Cells Translational Medicine, used specialized particles that are easily identified using an MRI. The labeling approach allowed Ardehali and his team to track cells in an animal model for up to 40 days after transplantation. The first author on both studies was Rhys Skelton, who was a visiting graduate student in Ardehali’s lab when he completed the research. Skelton has since completed his studies at the Murdoch Childrens Research Institute in Australia and received a Ph.D. from the University of Melbourne. He plans to return to UCLA as a postdoctoral scholar to continue his research on human embryonic stem cell-derived cardiac cells with the hope of one day developing a cell-based therapy for heart disease patients in need. “Our findings show, for the first time, that specific markers can be used to isolate the right kind of early heart cells for transplantation,” said David Elliott, a co-author of both studies, leader of the cardiac development research group at the Murdoch Institute and Skelton’s doctoral supervisor. “Furthermore, our cell labeling and tracking approach allows us to determine the viability and location of transplanted cells.” Both studies were supported by the California Institute of Regenerative Medicine and the UCLA Broad Stem Cell Research Center.

News Article | September 28, 2016

Adding cinnamon to your diet can cool your body by up to two degrees, according to research published today. And the spice may also contribute to a general improvement in overall health. The research has been published in the journal Scientific Reports. Project leader Kourosh Kalantar-zadeh, from RMIT's School of Engineering, said the results of the study, which used pigs, seemed to show that cinnamon maintained the integrity of the stomach wall. "When pigs feed at room temperature, carbon dioxide (CO2) gas increases in their stomach. "Cinnamon in their food reduces this gas by decreasing the secretion of gastric acid and pepsin from the stomach walls, which in turn cools the pigs' stomachs during digestion. "When the pigs are hot, they hyperventilate, which reduces CO2 production. With cinnamon treatment, CO2 decreases even further. "This not only cools the pigs but leads to a significant improvement in their overall health." "Altogether cinnamon cooled the stomach by up to 2C," said fellow researcher Jian Zhen Ou. "No wonder cinnamon is so popular in warm regions as taking it makes people feel better and gives them a feeling of cooling down." The research is part of a bigger study at RMIT into gut health using swallowable gas sensor capsules, or smart pills, developed at the university. Kalantar-zadeh said gut gases were the by-product of digestion and could provide valuable insights into the functioning and health of the gut. "Our experiments with pigs and cinnamon show how swallowable gas sensor capsules can help provide new physiological information that will improve our understanding of diet or medicine. "They are a highly reliable device for monitoring and diagnosing gastrointestinal disorders. They will revolutionize food science as we know it." 
Scientists at the University of Melbourne and Monash University also contributed to the paper, entitled "Potential of in vivo real-time gastric gas profiling: a pilot evaluation of heat-stress and modulating dietary cinnamon effect in an animal model."

Cai C.,Sun Yat Sen University | Yu Z.-H.,University of Melbourne | Zhang H.-H.,Sun Yat Sen University
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2016

We interpret the 750 GeV diphoton excess recently found in the 13 TeV LHC data as a singlet scalar φ in a model with one extra dimension. In the model, the scalar couples to multiple vectorlike fermions, which are just the Kaluza-Klein modes of the SM fermions. Mediated by loops of these vectorlike fermions, the φ effective couplings to gluons and photons can be significantly large. Therefore, it is quite easy to obtain the observed cross section for the diphoton excess. We also calculate the cross sections for other decay channels of φ, and find that this interpretation can evade the bounds from the 8 TeV LHC data. © 2016 American Physical Society.

Moodie R.,University of Melbourne | Stuckler D.,University of Cambridge | Monteiro C.,University of Sao Paulo | Sheron N.,University of Southampton | And 4 more authors.
The Lancet | Year: 2013

The 2011 UN high-level meeting on non-communicable diseases (NCDs) called for multisectoral action including with the private sector and industry. However, through the sale and promotion of tobacco, alcohol, and ultra-processed food and drink (unhealthy commodities), transnational corporations are major drivers of global epidemics of NCDs. What role then should these industries have in NCD prevention and control? We emphasise the rise in sales of these unhealthy commodities in low-income and middle-income countries, and consider the common strategies that the transnational corporations use to undermine NCD prevention and control. We assess the effectiveness of self-regulation, public-private partnerships, and public regulation models of interaction with these industries and conclude that unhealthy commodity industries should have no role in the formation of national or international NCD policy. Despite the common reliance on industry self-regulation and public-private partnerships, there is no evidence of their effectiveness or safety. Public regulation and market intervention are the only evidence-based mechanisms to prevent harm caused by the unhealthy commodity industries.

Sudol M.,Weis Center for Research | Sudol M.,Mount Sinai School of Medicine | Harvey K.F.,Peter MacCallum Cancer Center | Harvey K.F.,University of Melbourne
Trends in Biochemical Sciences | Year: 2010

Metazoans have evolved several pathways to regulate the size of organs and ultimately that of organisms. One such pathway is known as Salvador-Warts-Hippo, or simply Hippo. Research on the Hippo pathway has grown exponentially during the past 8 years, revealing a complex signaling network. Intriguingly, within this complexity, there are levels of modularity. One level of modularity is represented by the unusually wide occurrence of the WW module in the Hippo core kinase cassette, the upstream regulatory components and the downstream nuclear proteins. We suggest that the prevalence of WW domain-mediated complexes in the Hippo pathway should facilitate its molecular analysis and aid prediction of new pathway components. © 2010 Elsevier Ltd.

Van Geelen J.M.,Foundation of Pelvic Floor Patients SBP | Dwyer P.L.,University of Melbourne
International Urogynecology Journal and Pelvic Floor Dysfunction | Year: 2013

Introduction: With the publication of the updated US Food and Drug Administration (FDA) communication in 2011 on the use of transvaginal placement of mesh for pelvic organ prolapse (POP) it is appropriate to now review recent studies of good quality on POP to assess the safety and effectiveness of treatment options and determine their place in management. Methods: A systematic search for studies on the conservative and surgical management of POP published in the English literature between January 2002 and October 2012 was performed. Studies included were review articles, randomized controlled trials, prospective and relevant retrospective studies as well as conference abstracts. Selected articles were appraised by the authors regarding clinical relevance. Results: Prospective comparative studies show that vaginal pessaries constitute an effective and safe treatment for POP and should be offered as first treatment of choice in women with symptomatic POP. However, a pessary will have to be used for the patient's lifetime. Abdominal sacral colpopexy is effective in treating apical prolapse with an acceptable benefit-risk ratio. This procedure should be balanced against the low but non-negligible risk of serious complications. The results of native tissue vaginal POP repair are better than previously thought with high patient satisfaction and acceptable reoperation rates. The insertion of mesh at the time of anterior vaginal wall repair reduces the awareness of prolapse as well as the risk of recurrent anterior prolapse. There is no difference in anatomic and subjective outcome when native tissue vaginal repairs are compared with multicompartment vaginal mesh. Mesh exposure is still a significant problem requiring surgical excision in at least 10% of cases. The ideal mesh has not yet been found, necessitating more basic research into mesh properties and host response. 
Several studies indicate that greater surgical experience is correlated with fewer mesh complications. In women with uterovaginal prolapse uterine preservation is a feasible option which women should be offered. Randomized studies with long-term follow-up are advisable to establish the place of uterine preservation in POP surgery. Conclusion: Over the last decade treatment of POP has been dominated by the use of mesh. Conservative treatment is the first option in women with POP. Surgical repair with or without mesh generally results in good short-term objective and functional outcomes. However, basic research into mesh properties with host response and comparative studies with long-term follow-up are urgently needed. © 2012 The International Urogynecological Association.

Kaess M.,University of Heidelberg | Brunner R.,University of Heidelberg | Chanen A.,University of Melbourne
Pediatrics | Year: 2014

Borderline personality disorder (BPD) is a common and severe mental disorder that is associated with severe functional impairment and a high suicide rate. BPD is usually associated with other psychiatric and personality disorders, high burden on families and carers, continuing resource utilization, and high treatment costs. BPD has been a controversial diagnosis in adolescents, but this is no longer justified. Recent evidence demonstrates that BPD is as reliable and valid among adolescents as it is in adults and that adolescents with BPD can benefit from early intervention. Consequently, adolescent BPD is now recognized in psychiatric classification systems and in national treatment guidelines. This review aims to inform practitioners in the field of adolescent health about the nature of BPD in adolescence and the benefits of early detection and intervention. BPD diagnosis and treatment should be considered part of routine practice in adolescent mental health to improve these individuals' well-being and long-term prognosis. Copyright © 2014 by the American Academy of Pediatrics.

News Article | January 17, 2017

Of all Australia’s venomous animals, bees and wasps pose the biggest threat to public health, causing more than twice as many hospital admissions as snake bites and the same number of deaths. The first national analysis of 13 years’ data on bites and stings from venomous creatures has found that just over one-third (33%) of almost 42,000 admissions were caused by bees and wasps, compared with 30% by spiders and 15% by snakes. Though snake bites caused 27 deaths between 2000 and 2013, nearly twice as many deaths per hospital admission as other venomous creatures, researchers found fatalities from snakes were rare given that there are 140 species of land snakes in Australia. People were more likely to die from an encounter with a horse or dog, said Dr Ronelle Welton, a public health expert at the University of Melbourne’s Australian venom unit, who led the study. Stings from bees and wasps caused 12,351 admissions and 27 fatalities. There were three deaths caused by jellyfish, a further three by ticks and two by ants. A man reportedly died from a redback spider bite in April last year – the first such fatality in more than 30 years – but it fell outside the study period. Welton said Australia was known internationally as “the epicentre of all things venomous”, but had a surprising lack of data about related injuries and fatalities, which meant national guidelines for prevention and treatment were inadequate. She said she was surprised to find that more than half the fatalities occurred at home, and two-thirds (64%) in major cities and inner-regional areas where healthcare was readily accessible. Bites and stings were much more likely to occur between April and October, and dangers varied by state and territory. Anaphylaxis from bee stings was common in South Australia, while in Queensland there were more snake bites. No fatalities were recorded in Tasmania but there were instances of anaphylaxis caused by jumper ants. 
Men aged between 30 and 35 were the most likely to be bitten or stung. More than half (34) of the deaths were the result of an allergic reaction that caused anaphylactic shock, which researchers pointed to as evidence of a widespread and under-reported problem of allergies in Australia. Three-quarters of snake bite victims died in hospital, compared with only 44% of those who died from an allergic reaction to an insect sting. Welton suggested that could be because bees were perceived to be more “innocuous”. “Most people don’t really fear them in the same way they fear snakes,” she said. “Without having a previous history of allergy, you might get bitten and, although nothing happens the first time, you’ve still developed an allergic sensitivity. “We need to understand why people are dying from bee-sting anaphylaxis at home.” Prof Daniel Hoyer, of the University of Melbourne’s department of pharmacology and therapeutics, said the “number one surprise” of the research was that there were so many deaths caused by insect bites. The incidence of potentially life-threatening allergies in Australia was “enormous” and a much bigger problem than many people perceived, he said, pointing to the fatalities caused by “Melbourne’s thunderstorm asthma” in November. More than 8,500 people were taken to hospital after the mass asthma event triggered by severe thunderstorms and at least eight died, many of whom did not know they were susceptible.

News Article | November 14, 2016

Nothing will stand in their way. After devastating Australia’s low-lying regions, European rabbits are now muscling in on snowy mountainous areas by adapting to survive on toxic snow gum leaves. Rabbits were introduced to Australia in the 19th century and rapidly spread across the continent, creating huge problems for native wildlife and farmers. The only areas they have failed to colonise are those with snow cover in winter, because the grass they eat is buried. But in 2011, Ken Green at Australia’s National Parks and Wildlife Service began to notice rabbits living above the winter snowline in the Snowy Mountains of New South Wales. To understand how they are surviving, he collected their faecal pellets for three years and sent them to the University of Melbourne for dietary analysis. The results showed that the leaves of alpine eucalyptus trees, also known as snow gums, form the biggest part of the rabbits’ winter diet. It is astonishing that the rabbits can eat such high quantities of eucalyptus leaves, says Green. The tough leaves are difficult to digest, low in nutrients and contain toxins like tannins, terpenes and phenolics. Native animals like koalas can survive on gum leaves because they have evolved special digestive mechanisms – such as hindgut fermentation – that allow them to extract nutrients and detoxify the chemicals. But koalas are mostly sedentary, conserving the limited energy they can extract. “Rabbits of course are quite different – they are very energetic – so it’s amazing that they’re getting by and not having major digestive issues,” Green says. How they are managing this is not clear. In theory, the rabbits might have acquired gut microbes that help them digest eucalyptus leaves, or evolved physical adaptations. Or it could just be a behavioural change, which means the rabbits must already have had some ability to digest the leaves. 
The health of the rabbits was not directly monitored, but they continued breeding from year to year, suggesting they were surviving well, he says. This may be because they only need to eat the gum leaves for three to four months of the year while there is snow cover, says Green. The leaves that the rabbits are eating are those that have recently regenerated after a bushfire ripped through the area in the summer of 2003. These may be gentler on the rabbits’ stomachs than older leaves, says David Lee at the University of the Sunshine Coast in Queensland. The regenerating trees also have leaves low enough to the snow for the rabbits to reach them. How the rabbits will fare as the trees grow taller is not clear. This species of rabbit, Oryctolagus cuniculus, has not colonised snowy regions in Europe because European trees lose their leaves during winter, Green says. “It just goes to show that if you take animals out of their native range and put them in novel environments, strange things will happen.”

The heatwave that engulfed southeastern Australia at the end of last week has seen heat records continue to tumble. On Saturday 11 February, as New South Wales suffered through the heatwave’s peak, temperatures soared to 47℃ in Richmond, 50km northwest of Sydney, while 87 bushfires raged across the state, amid catastrophic fire conditions. On that day, most of NSW experienced temperatures at least 12℃ above normal for this time of year. In White Cliffs, the overnight minimum was 34.2℃, a new record for the state’s highest observed minimum temperature. On Friday, the average maximum temperature right across NSW hit 42.4℃, beating the previous February record of 42.0℃. The new record stood for all of 24 hours before it was smashed again on Saturday, as the whole state averaged 44.0℃ at its peak. At this time, NSW was the hottest place on earth. A degree or two here or there might not sound like much, but to put it in cricketing parlance, those temperature records are the equivalent of a modern-day test batsman retiring with an average of over 100 – the feat of outdoing Don Bradman’s fabled 99.94 – and would undoubtedly be front-page news. And still the records continue to fall. Mungindi, on the border of NSW and Queensland, broke the Australian record of 50 days in a row above 35℃, set just four years ago at Bourke airport, with the new record now at 52 days. Meanwhile, two days after that sweltering Saturday we woke to find the fires ignited during the heatwave still cutting a swathe of destruction, with the small town of Uarbry, east of Dunedoo, all but burned to the ground. This heatwave is all the more noteworthy when we consider that the El Niño of 2015-16 is long gone and the conditions that ordinarily influence our weather are firmly in neutral. This means we should expect average, not sweltering, temperatures. Since Christmas, much of eastern Australia has been in a flux of extreme temperatures. 
This increased frequency of heatwaves shows a strong trend in observations, which is set to continue as the human influence on the climate deepens. It is all part of a rapid warming trend that over the past decade has seen new heat records in Australia outnumber new cold records by 12 to one. Let’s be clear, this is not natural. Climate scientists have long been saying that we would feel the impacts of human-caused climate change in heat records first, before noticing the upward swing in average temperatures (although that is happening too). This heatwave is simply the latest example. What’s more, in just a few decades’ time, summer conditions like these will be felt across the whole country regularly. The useful thing scientifically about heatwaves is that we can estimate the role that climate change plays in these individual events. This is a relatively new field known as “event attribution”, which has grown and improved significantly over the past decade. Using the Weather@Home climate model, we looked at the role of human-induced climate change in this latest heatwave, as we have for other events previously. We compared the likelihood of such a heatwave in model simulations that factor in human greenhouse gas emissions, compared with simulations in which there is no such human influence. Since 2017 has only just begun, we used model runs representing 2014, which was similarly an El Niño-neutral year, while also experiencing similar levels of human influence on the climate. Based on this analysis, we found that heatwaves at least as hot as this one are now twice as likely to occur. In the current climate, a heatwave of this severity and extent occurs, on average, once every 120 years, so is still quite rare. However, without human-induced climate change, this heatwave would only occur once every 240 years. In other words, the waiting time for the recent east Australian heatwave has halved. 
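The "twice as likely" result falls directly out of the return periods quoted above. A minimal sketch of that arithmetic (an illustration only, not the Weather@Home ensemble analysis itself; the function and variable names are my own):

```python
# Back-of-envelope version of the attribution arithmetic described above.
# The actual study compared large Weather@Home climate-model ensembles;
# here we just convert the quoted return periods into annual probabilities.

def annual_probability(return_period_years: float) -> float:
    """Chance of at least one such heatwave occurring in any given year."""
    return 1.0 / return_period_years

# Return periods quoted in the article:
p_current = annual_probability(120)  # climate with human greenhouse emissions
p_natural = annual_probability(240)  # counterfactual climate without them

risk_ratio = p_current / p_natural
print(f"risk ratio: {risk_ratio:.1f}")  # 2.0 -> "twice as likely"
```

Halving the waiting time (240 years down to 120) and doubling the annual probability are the same statement, which is why attribution studies report either form interchangeably.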
As climate change worsens in the coming decades, the waiting time will reduce even further. Our results show very clearly the influence of climate change on this heatwave event. They tell us that what we saw last weekend is a taste of what our future will bring, unless humans can rapidly and deeply cut our greenhouse emissions. Our increasingly fragile electricity networks will struggle to cope, as the threat of rolling blackouts across NSW showed. It is worth noting that the large number of rooftop solar panels in NSW may have helped to avert such a crisis this time around. Our hospital emergency departments also feel the added stress of heatwaves. When an estimated 374 people died from the heatwave that preceded the Black Saturday bushfires, the Victorian Institute of Forensic Medicine resorted to storing bodies in hospitals, universities and funeral parlours. The Victorian heatwave of January 2014 saw 167 more deaths than expected, along with significant increases in emergency department presentations and ambulance callouts. Infrastructure breaks down during heatwaves, as we saw in 2009 when railway lines buckled under the extreme conditions. It can also strain Australia’s beloved sporting events, as the 2014 Australian Open showed. These impacts have led state governments and other bodies to investigate heatwave management strategies, while our colleagues at the Bureau of Meteorology have developed a heatwave forecast service for Australia. These are likely to be just the beginning of strategies needed to combat heatwaves, with conditions currently regarded as extreme set to be the “new normal” by the 2030s. With the ramifications of extreme weather clear to everyone who experienced this heatwave, there is no better time to talk about how we can ready ourselves. We urgently need to discuss the health and economic impacts of heatwaves, and how we are going to cope with more of them in the future. 
• Sarah Perkins-Kirkpatrick is a research fellow at UNSW, Andrew King is a climate extremes research fellow at the University of Melbourne and Matthew Hale is a research assistant at UNSW. The authors also acknowledge Robert Smalley, Andrew Watkins and Karl Braganza of the Australian Bureau of Meteorology for providing observations included in this article. This article was republished from The Conversation.

News Article | January 24, 2016

In the wake of the broken Basslink cable and after a summer of only limited rainfall, stakeholders say that Tasmania is one of the best places in the world to harvest wind, hydro and solar energy. Research done by the Energy Institute at the University of Melbourne has shown that Tasmania provided one of the best investment opportunities for wind generation in Australia.

News Article | November 4, 2016

For the first time, researchers and health experts have undertaken a comprehensive analysis of the concerning situation in the World Health Organisation European Region regarding digital marketing to children of foods high in fats, salt and sugars. The World Health Organisation (WHO) has published the report, which calls for immediate action by policy makers to recognise and address the growing issue of targeted marketing to kids through digital media. Dr Emma Boyland, from the University's Institute of Psychology, Health and Society, in collaboration with The Open University, WHO, University of Melbourne and Flinders University, produced the report which examines trends in media use among children, marketing methods in the new digital media landscape and children's engagement with such marketing. In the absence of effective regulations for digital media in many countries, children are increasingly exposed to persuasive and individually-tailored marketing techniques through, for example, social media sites and advergames. This trend persists, despite the stubbornly high rates of childhood obesity found almost universally in the WHO European region. Food marketing has been identified by the scientific community as an important contributor to the so-called 'obesogenic' environment, where foods high in fats, salt and sugars are promoted extensively, are more visible, as well as cheaper and easier to obtain than healthy options. Food marketing has been consistently demonstrated to influence children's food preferences and choices, shaping their dietary habits and increasing the risk of becoming obese. Digital marketing offers a loophole for marketers, as there is currently little or no effective regulation and minimal efforts to control it. Furthermore, due to the ability to tailor adverts online to a specific audience, marketing online is potentially much more powerful and targeted to the individual child and their social network. 
Often, parents do not see the same advertisements, nor can they observe the online activities of children and many therefore underestimate the scale of the problem. Dr Emma Boyland, said: "The food, marketing and digital industries have access to an enormous amount of information regarding young people's exposure to HFSS food marketing online and its influence on children's behaviour, yet external researchers are excluded from these privately held insights, which increases the power imbalances between industry and public health." To address the challenges the report suggests a number of recommendations. These include States acknowledging their duty to protect children from HFSS digital marketing with statutory regulation, the extension of existing offline protection online and existing regulation of internet content being drawn on to compel private Internet platforms to remove marketing of HFSS foods. Dr Boyland adds: "Children have the right to participate in digital media; and, when they are participating, they have the right to protection of their health and privacy and to not be economically exploited." The full report, entitled 'Tackling food marketing to children in a digital world: trans-disciplinary perspectives', can be found at: http://www.

News Article | February 17, 2017

BUFFALO, N.Y. -- While diversity training programs are a good way to build awareness of cultural differences, they usually are not as effective at changing attitudes and behaviors toward diverse groups in the workplace, according to new research from the University at Buffalo School of Management. Published in Psychological Bulletin, the study found diversity training can be successful -- but that results vary widely based on the content and length of training and whether it was accompanied by other related initiatives. "In today's political climate, diversity training has the potential to make a huge positive impact in addressing biases and prejudice within organizations," says Kate Bezrukova, PhD, associate professor of organization and human resources in the UB School of Management. "But training must be conducted thoughtfully. At best, it can engage and retain women and people of color in the workplace, but at worst, it can backfire and reinforce stereotypes." Diversity training aims to enhance participants' cultural awareness, skills and motivation to interact with individuals of different ethnicities, genders, orientations, ages and more. Bezrukova and her team examined more than 40 years of research, combining data from 260 studies and more than 29,000 participants across a variety of fields. They found diversity training had immediate positive effects on participants' knowledge, attitudes and behaviors related to diverse groups. Over time, however, their attitude and behavioral changes decayed or reverted, while their cultural knowledge remained consistent or even increased. "The attitudes this training attempts to change are generally strong, emotion-driven and tied to our personal identities, and we found little evidence that long-term effects to them are sustainable," Bezrukova says. 
"However, when people are reminded of scenarios covered in training by their colleagues or even the media, they are able to retain or expand on the information they learned." The study found training is most effective when it is mandatory, delivered over an extended period of time, integrated with other initiatives and designed to increase both awareness and skills. In addition, participants responded more favorably to programs that used several methods of instruction, including lectures, discussions and exercises. "It's critical to offer diversity programs as part of a series of related efforts, such as mentoring or networking groups for minority professionals," Bezrukova says. "When organizations demonstrate a commitment to diversity, employees are more motivated to learn about and understand these societal issues and apply that in their daily interactions." Bezrukova's co-authors on the study are Karen Jehn, PhD, professor of management, University of Melbourne Business School; Jamie Perry, PhD, assistant professor, Cornell University School of Hotel Administration; and Chester Spell, PhD, professor of management, Rutgers University School of Business-Camden. The UB School of Management is recognized for its emphasis on real-world learning, community and economic impact, and the global perspective of its faculty, students and alumni. The school also has been ranked by Bloomberg Businessweek, Forbes and U.S. News & World Report for the quality of its programs and the return on investment it provides its graduates. For more information about the UB School of Management, visit

News Article | March 1, 2017

A new online forum for analysis and commentary has opened, announced Leslie Norins, MD, PhD, publisher at Medvostat LLC, its parent company. The debut analysis on the site, authored by Dr. Norins himself, claims the decline in print circulation of the “Big Three” national newspapers—Wall Street Journal, New York Times, and USA Today—can be reversed. He also presents his prescription for the turnaround. Smartphones and digital media are convenient villains, he says, because blaming them has diverted publishers’ attention from necessary updating of print edition marketing. Dr. Norins believes reading the print edition of the Big Three marks a person as “substantial”, whereas a smartphone viewer could be consulting gossip, games, or even porn. “Print is positive, smartphones are iffy,” he says. He says one unsung advantage of the print edition page is its “glorious” display size, 264 square inches, versus the 13 square inches of his iPhone screen. Thus, print’s big page facilitates “serendipity”, which occurs when the reader’s peripheral vision unexpectedly notes nearby important, profitable articles on other subjects. He prescribes better marketing, including ads quoting millennials who benefit from the print edition, plus hiring “influencers” to tout the special advantages of this format. Dr. Norins also reports he found it difficult to find the Journal and Times on sale in Brooklyn, “in the backyard” of the Manhattan headquarters of the two papers. Dr. Norins has over 40 years’ publishing experience creating and growing over 80 subscriber-paid newsletters serving medical professionals. Before the publishing phase of his career, he was a physician-researcher. He received his AB from Johns Hopkins, his MD from Duke University Medical School, and his PhD from the University of Melbourne, where he studied with Sir Macfarlane Burnet, Nobel Laureate.

News Article | November 30, 2016

Six years ago Neil Waters moved to Tasmania. There, he says, he had a “brief encounter” with a thylacine, the carnivorous marsupial known as the Tasmanian tiger, declared extinct in 1986. Two years later, in January 2014, he was doing work on his house when a smaller animal walked up a dirt track leading out of a tin mine and past his bedroom window. He was a “little bit uninformed” back then, he says, and did not take any photographs. Now living in Adelaide, where he works as a gardener, he has been making up for it as the founder of the Thylacine Awareness Group. The group is “dedicated to the research, recognition and conservation of our most elusive apex predator”. The group has just over 3,000 members on Facebook, some based as far away as Canada and the UK. Some share their sightings of an animal believed to have died out with the last individual in a Hobart zoo in 1936. Some stories are now decades old and have taken on the quality of well-worn anecdotes. (“We were two young 23-year-olds at the time and we went on holidays towing our little caravan … ”) But other people, Waters says, had never spoken of their sightings before he gave them an opportunity to do so with the group, a “little comfort zone for sightings”. “When you’re looking at something that’s not meant to exist, it tends to make you not believe your eyes, I suppose,” he says. “A lot of people, when they describe a sighting to me, they say, ‘I couldn’t believe what I was looking at … They’re not meant to exist, but I’ve seen them.’” Not only are Waters and his supporters convinced the last thylacine did not die in 1936, they say it is even more prominent in mainland Australia, where it is believed to have become extinct at least 2,000 years ago. He says there have been more than 5,000 reported sightings of thylacines in the past 80 years. It’s a tough sell, he admits. He has been criticised by scientists. He cheerily describes his views – adopting the words of others – as “outlandish”. 
But what would people have to gain by lying? “The sad part is, we haven’t found a dead one lying on the side of the road,” he says. “Then we’d have some proof. The sightings people convey to me, they’re sincere.” Almost 100 people attended the group’s first meet-and-greet event, held in Adelaide on Sunday – among them a woman who told Waters of seeing a thylacine 50 years ago. “She’d never told anyone in her life … people don’t have any reason to make these things up, they’re not looking for attention, they’re just looking for someone not to laugh at them.” Tickets to the event were $8 each, with proceeds to be put towards an upcoming documentary that Waters is producing. He is interested to hear from collaborators, preferably Australians (“the Americans tend to sensationalise things a bit too much for my liking”). A drawcard at the event was the premiere of new footage, shot in Western Australia by a woman who says she has seen thylacines “several times” on her property. It was uploaded to YouTube late on Monday. Since then, debate has raged as to whether the video shows a thylacine, or a fox with mange. A meme reading “one does not simply dismiss a sighting as a mangy fox” drew a supportive response from members. Waters admits that the video, like much of the footage shared by the Facebook group, is “ambiguous”, proving nothing other than that there are animals in Australia that resist ready classification. But that, he says, should be enough to invigorate interest in the possibility that thylacines – or, alternatively, animals “that have not yet been described by science” – exist on the mainland. The scientific community remains resistant (and “usually not very polite”). “But no one wants to actually get off their bum and come for a walk through the bush and have a look with me,” he says. 
“I don’t really mind being the butt of all their jokes but I guarantee that if we find one they’ll all want to be my best friend then.” Waters has trail cameras installed, potential den sites he keeps an eye on, and a huge network of people all over Australia “contributing information on a daily basis”. His motivation is not financial, “just for the vindication of people who’ve been told they’re bonkers”. “I represent 3,000 people who have been told they’re nuts, basically.” So convinced are scientists of the thylacine’s extinction that discussion in recent years has centred on whether it could be resurrected by cloning. In 2008 a gene was successfully inserted into a mouse embryo from fragments taken from 100-year-old specimens preserved in ethanol. Andrew Pask, the project’s team leader and a developmental biologist at the University of Melbourne, says technology is not yet at the point where it is possible to clone a thylacine anew, but that the animal’s entire DNA has been sequenced. For science to accept that the Tasmanian tiger lives on today, says Pask, it needs irrefutable evidence. “I would love, love, love to believe they’re still out there, but unfortunately I think all of the evidence points to the contrary on that front.” He has been sent many samples of scat found in Tasmania to test in his laboratory. “I’m quite tired of people sending me big bags of poo in the mail,” he says. “None of them are ever thylacines’.” Waters himself has gathered some 20 specimens that he is hoping to get tested. “It might be that ‘it’s from a marsupial and it’s unknown’ is as conclusive as they can get,” he says. But that would still be progress, given that it would reaffirm his conviction that there are large unknown fauna in Australia. That seems possible when he points out that species thought to have declined or become extinct have been rediscovered. 
It seems less likely when he also points out there have been between 5,000 and 7,000 recorded sightings of big cats in Australia. “I’m a firm believer in thylacines, and I’m a firm believer in big cats, for the record,” he says. What else does he believe in? “Umm, three meals a day and a happy, healthy, stress-free life as much as possible.”

News Article | October 31, 2016

A female mosquito lays hundreds of eggs at a time, and within ten days newly minted adults are leaving their stagnant water homes to buzz around our ears and ankles. Victoria is braced for swarms of the things after late winter floods and warming spring temperatures created perfect breeding conditions, prompting public health warnings and forcing councils to start spraying breeding sites. But as Victorians stalk their bedrooms and hallways armed with insecticide cans, they should count their blessings that for now at least the Aedes albopictus mosquito remains in the Torres Strait. Nicknamed "Tiger" because of the bright white stripe on its back and the white bands on its legs, albopictus is a biting mosquito that can carry a variety of tropical diseases. And it doesn't mind a bit of chilly weather. Which means that, if it crossed to the mainland, it could cover much of the country, turning once-exotic tropical diseases into more common temperate ones. "Albopictus is one of our main quarantine pests," says Professor Ary Hoffmann of the Bio21 Institute at the University of Melbourne. "Its ability to withstand colder weather has allowed it to invade Europe and North America." And as far as Professor Hoffmann is concerned, despite our quarantine efforts it is likely a matter of when, not if, the Tiger makes it to the mainland. It was first reported in the Torres Strait islands in 2005. At the moment the rise in mosquito numbers in Victoria has prompted health warnings about comparatively rare mosquito-borne diseases such as the non-fatal Ross River Fever and Barmah Forest viruses, and the potentially fatal Murray Valley Encephalitis. But if albopictus arrives here, Professor Hoffmann says, we will have to add dengue fever, which is currently limited to northern Queensland, as well as the zika and chikungunya viruses. Many mosquito breeds tend to be more active at dawn and dusk when the air is more humid and the insects are at less risk of drying out. 
But Professor Hoffmann says albopictus doesn't mind a bit of daylight. It means that albopictus is active and biting in the middle of the day. "It is regarded as a massive irritant as well as being a vector for diseases, making it a real nuisance for outdoor activities." Professor Hoffmann and colleagues have modelled the possible reach of albopictus if it does arrive, and they predict it could become widespread as far south as northern Tasmania. "It will go a long way," he says. While their modelling suggests it would be concentrated on the coastal fringes of the continent, its spread so far in North America and elsewhere suggests forms of this mosquito could also travel further inland in Australia. Dengue fever is spreading rapidly elsewhere around the world as a direct result of albopictus migrating into more temperate climes and the ongoing spread of its sister species Aedes aegypti. Since 1970 the number of countries where dengue is endemic, that is, where it is present, has risen from just nine to 100. And there are a rising number of outbreaks. The World Health Organization says 2015 was a particularly bad year, with outbreaks of over 100,000 cases in the Philippines and Malaysia, representing a 60 per cent and 16 per cent increase respectively on the previous year. The number of cases in Brazil trebled to over 1.5 million, and Delhi in India reported its worst outbreak since 2006 with 15,000 cases. In 2014, China's Guangdong province near Hong Kong reported its worst outbreak with 45,000 reported cases and six confirmed deaths. WHO now warns that Europe is also at risk of possible outbreaks. Cases of local transmission were reported in France and Croatia in 2010. The main vector of dengue has been the Aedes aegypti mosquito, which is limited to tropical climates including northern Australia. But the ability of albopictus to withstand cold temperatures means the disease is now spread more widely. 
It is believed to have travelled into northern climes by breeding in water puddles caught inside imported tyres and bamboo. Its eggs can survive temperatures below freezing. Dengue fever has similar symptoms to mosquito-borne malaria, but is less fatal. Dengue can cause severe flu-like symptoms, headaches and joint pain, as well as vomiting and rashes. Severe dengue fever, known as dengue haemorrhagic fever, is present in most Asian and Latin American countries, and with proper medical attention fatality rates from severe dengue can be kept below 1 per cent. It is estimated that every year about 500,000 people are infected with severe dengue fever requiring hospitalisation, of which about 2.5 per cent die. In contrast, in 2015 there were 438,000 deaths from malaria. "Dengue isn't as deadly as malaria, but while the incidence of malaria is going down the incidence of dengue fever is going up, and the resulting economic impact can be massive because dengue can still really knock people around." "We hope albopictus won't hit the mainland but I think it is inevitable that it will at some stage, it is just a matter of time. And when it does you will certainly notice it." The odd temporary swarm of mosquitoes may be the least of our problems. More information: Matthew P. Hill et al. Predicting the spread of Aedes albopictus in Australia under current and future climates: Multiple approaches and datasets to incorporate potential evolutionary divergence, Austral Ecology (2014). DOI: 10.1111/aec.12105

News Article | December 22, 2016

DALLAS, Dec. 22, 2016 -- Heart-related deaths spike during Christmas, but the effect may have nothing to do with the cold winter season, according to new research in the Journal of the American Heart Association, the Open Access Journal of the American Heart Association/American Stroke Association. "Spikes in deaths from natural causes during Christmas and New Year's Day have been previously established in the United States. However, the Christmas holiday period (December 25th to January 7th) in the U.S. falls within the coldest period of the year when death rates are already seasonally high due to low temperatures and influenza," said Josh Knight, B.Sc., study author and research fellow at the University of Melbourne in Australia. In this study, researchers analyzed trends in deaths in New Zealand, where Christmas occurs during the summer season when death rates are usually at a seasonal low - allowing researchers to separate any winter effect from a holiday effect. During a 25-year period (1988-2013), there were a total of 738,409 deaths (197,109 were noted as cardiac deaths). The researchers found a 4.2 percent increase in heart-related deaths occurring away from a hospital between December 25 and January 7. The average age of cardiac death was 76.2 years during the Christmas period, compared with 77.1 years during other times of the year. There are a range of theories that may explain the spike in deaths during the holiday season, including the emotional stress associated with the holidays, changes in diet and alcohol consumption, less staff at medical facilities, and changes in the physical environment (for example, visiting relatives). However, there have been few attempts to replicate prior studies. Although more research is needed to explain the spike in deaths, researchers suggest one possibility may be that patients hold back in seeking medical care during the holiday season. 
"The Christmas holiday period is a common time for travel within New Zealand, with people frequently holidaying away from their main medical facilities. This could contribute to delays in both seeking treatment, due to a lack of familiarity with nearby medical facilities, and due to geographic isolation from appropriate medical care in emergency situations," Knight said Another explanation may have to do with a terminally ill patients' will to live and hold off death for a day that is important to them. "The ability of individuals to modify their date of death based on dates of significance has been both confirmed and refuted in other studies, however it remains a possible explanation for this holiday effect," Knight said. However, researchers note that the study did not track daily temperatures and New Zealand has an island climate, which almost eliminates the extremes of temperature that have been associated with heart-related death rates in previous studies. Co-authors are Chris Schilling, M.Sc.; Adrian Barnett, Ph.D.; Rod Jackson Ph.D.; and Phillip Clarke, Ph.D. Author disclosures are on the manuscript. The Australian National Health and Medical Research Council and the New Zealand Health Research Council funded the study. Winter images, 5 Tips for Avoiding a Holiday Heart Attack infographic, heart graphics, and heart attack video are located in the right column of this release link http://newsroom. Avoiding the deadly holiday heart attack Cold Weather and Cardiovascular Disease After Dec 22, view the manuscript online. Follow AHA/ASA news on Twitter @HeartNews For updates and new science from JAHA, follow @JAHA_AHA Statements and conclusions of study authors published in American Heart Association scientific journals are solely those of the study authors and do not necessarily reflect the association's policy or position. The association makes no representation or guarantee as to their accuracy or reliability. 

News Article | October 12, 2016

Abstract: Led by the University of Melbourne and published today in Nature Nanotechnology, the work holds promise for micro and nano scale applications including drug delivery, chemical sensing and energy storage. Frank Caruso, Professor and ARC Australian Laureate Fellow, Department of Chemical and Biomolecular Engineering said that the team nanoengineered building blocks to tailor the development of advanced materials. "Nano-objects are difficult to manipulate, as they're too tiny to see directly by eye, far too small to hold, and often have incompatible surfaces for assembling into ordered structures," he said. "Assembling LEGO bricks into complex shapes is relatively easy, as LEGO studs ensure the blocks stick together wherever you want. "So we used a similar strategy as a basis for assembling nano-objects into complex architectures by first coating them with a universally adhesive material (a polyphenol) so that they resemble the studs on LEGO bricks. "This allows for a range of nano-objects to stick together around a template, where the template determines the final shape of the assembled structure," Professor Caruso said. Different materials can be assembled using this approach. This simple and modular approach has been demonstrated for 15 representative materials to form different sizes, shapes, compositions and functionalities. Compositions include polymeric particles, metal oxide particles and wires, noble metal nanoparticles, coordination polymer nanowires, nanosheets and nanocubes, and biologicals. The building blocks can be used to construct complex 3D superstructures, including core-satellite, hollow, hierarchically organised supraparticles, and macroscopic hybrid materials. "Many previous methods have been limited by particle-specific assembly," Professor Caruso said. "However, this new polyphenol-based particle approach can be adapted to different functions and allows different building blocks to be assembled into super-structures," he said. 
The "studs" in the LEGO brick-like structures, known as C/G studs from the polyphenols, provide a superstructuring process for assembling and inter-locking the building blocks using multiple anchor points. The "C/G studs" on the building block nanoparticles can further interact with a secondary substrate and/or coordinate with metal ions, interlocking the structures. This provides a platform for the rapid generation of superstructured assemblies with enhanced chemical diversity and structural flexibility across a wide range of length scales, from nanometres to centimetres.

McGorry P.,University of Melbourne | Bates T.,Headstrong The National Center for Youth Mental Health | Birchwood M.,University of Birmingham
British Journal of Psychiatry | Year: 2013

Despite the evidence showing that young people aged 12-25 years have the highest incidence and prevalence of mental illness across the lifespan, and bear a disproportionate share of the burden of disease associated with mental disorder, their access to mental health services is the poorest of all age groups. A major factor contributing to this poor access is the current design of our mental healthcare system, which is manifestly inadequate for the unique developmental and cultural needs of our young people. If we are to reduce the impact of mental disorder on this most vulnerable population group, transformational change and service redesign is necessary. Here, we present three recent and rapidly evolving service structures from Australia, Ireland and the UK that have each worked within their respective healthcare contexts to reorient existing services to provide youth-specific, evidence-based mental healthcare that is both accessible and acceptable to young people.

Phillips S.J.,AT&T Labs Research | Elith J.,University of Melbourne
Ecology | Year: 2013

A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of 'presence' data (locations where the species [or evidence of it] has been observed), together with 'background' data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. 
We emphasize, however, that we are not arguing against standard statistical methods such as logistic regression, generalized linear models, and so forth, none of which requires the strong assumption. If probability of presence is required for a given application, there is no panacea for lack of data. Presence-background data must be augmented with an additional datum, e.g., species' prevalence, to reliably estimate absolute (rather than relative) probability of presence. © 2013 by the Ecological Society of America.
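The identifiability gap the abstract describes is easy to reproduce. Below is a minimal sketch, not the authors' simulation code: the single-covariate species, the sample sizes and the plain logistic fit are all invented for illustration. It shows that a naive presence-background logistic regression recovers relative suitability, while its absolute scale is pinned by the 1000:10,000 sampling ratio rather than by the species' true prevalence:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# True probability of presence: a simple S-shaped response to one covariate x.
def true_prob(x):
    return 1.0 / (1.0 + np.exp(-(x - 0.5)))

# Landscape of available sites; the species' true prevalence here is ~0.42.
sites = rng.uniform(-3.0, 3.0, size=100_000)
present = rng.random(sites.size) < true_prob(sites)

# Presence-background data: 1000 presence records, 10,000 background sites.
presences = rng.choice(sites[present], size=1_000)
background = rng.choice(sites, size=10_000)
X = np.concatenate([presences, background]).reshape(-1, 1)
y = np.concatenate([np.ones(1_000), np.zeros(10_000)])

model = LogisticRegression().fit(X, y)
fitted = model.predict_proba(X)[:, 1]

# Relative suitability is recovered (fitted values increase with x), but the
# mean fitted value sits near 1000/11000, about 0.09 -- far below the true
# prevalence of ~0.42. Without an external prevalence estimate, absolute
# probability of presence is not identifiable from these data alone.
```

Rescaling the fitted values to absolute probabilities requires exactly the extra datum the authors call for: an independent estimate of population prevalence.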

Zwanenburg F.A.,MESA Institute for Nanotechnology | Zwanenburg F.A.,University of New South Wales | Dzurak A.S.,University of New South Wales | Morello A.,University of New South Wales | And 6 more authors.
Reviews of Modern Physics | Year: 2013

This review describes recent groundbreaking results in Si, Si/SiGe, and dopant-based quantum dots, and it highlights the remarkable advances in Si-based quantum physics that have occurred in the past few years. This progress has been possible thanks to materials development of Si quantum devices, and the physical understanding of quantum effects in silicon. Recent critical steps include the isolation of single electrons, the observation of spin blockade, and single-shot readout of individual electron spins in both dopants and gated quantum dots in Si. Each of these results has come with physics that was not anticipated from previous work in other material systems. These advances underline the significant progress toward the realization of spin quantum bits in a material with a long spin coherence time, crucial for quantum computation and spintronics. Published by the American Physical Society.

Austin Research Institute and University of Melbourne | Date: 2010-06-11

The present invention provides a method of immunising a subject comprising the step of administering a composition comprising an antigen and a carbohydrate polymer comprising mannose to a mucosal site of the subject, methods of use of the composition for vaccination and sterilization and use of the composition in manufacturing a medicament.

Agency: Cordis | Branch: FP7 | Program: CP-FP | Phase: SiS-2007- | Award Amount: 930.13K | Year: 2008

We seek to develop a plan for amending the current Intellectual Property Rights (IPR) regime for rewarding pharmaceutical innovations. The existing IPR regime is highly problematic. This has become obvious in the wake of a series of public health emergencies, most notably the AIDS crisis, which pits the vital needs of poor patients against the need of pharmaceutical companies to recoup their investments. Amending the current system represents one of the major 21st century challenges, namely delivering reasonably priced health care to patients around the world. This is a challenge that lies at the heart of biomedical ethics striving for sustainable world development. Our effort to take up the challenge focuses on a potential two-tiered patent system. This scheme would create a new patent (Patent-2) that is complementary to existing monopoly patents, leaving innovators free to choose a patent of either kind. Patent-2 holders would not have veto powers over the reproduction of their inventions, thus allowing medicines to become available at competitive market prices without delay. Patent-2 holders would be rewarded, out of public funds, in proportion to the impact of their invention on the global burden of disease. A first sketch of the Patent-2 scheme has already been developed through a grant from the Australian Research Council. However, the system is now in urgent need of development with input from a range of experts and policy-makers. In order to forge a policy consensus, some of the most influential social philosophers and economists world-wide (Nobel Laureate Joseph Stiglitz, Peter Singer and Thomas Pogge) will be joined by key policy institutes to use their cumulative weight to enhance and promote a proposal that has the clear potential to provide access to essential medicines to poor patients whilst increasing the possibilities for innovation in the pharmaceutical sector.

News Article | March 7, 2016

In an unusual medical case, a man in Australia lost his sense of smell for more than a year after he was bitten by a venomous snake, according to a new report of his case. The man has since regained some of his sense of smell, but he is still unable to fully detect smells the way he did before his encounter with the reptile, called the mulga snake, said the doctors and other experts who examined the man's neurological condition about a year after he was bitten and who wrote the report of his case. "As far as I know, he is still affected but somewhat improved," said Kenneth D. Winkel, a toxinologist at the University of Melbourne in Australia, who co-authored the report. The otherwise healthy 30-year-old man went to a neurology clinic at St Vincent's Hospital in Melbourne, Australia, telling doctors that he'd lost his sense of smell about a year before and had not regained it. The man first noticed this bizarre symptom a week after he was bitten by a snake while he was traveling in the Australian outback. The snake bit the man on two of his fingers while he was washing his hands at a roadside restroom, the man told the doctors. A local resident helped out, trapping the snake in the sink and killing it. The man preserved the snake in a jar of alcohol. Shortly after the incident, the man went to the emergency department of a regional hospital. The doctors who treated him there found that he had temporary problems with blood clotting, too much protein in his urine and blisters that oozed with a clear liquid. The man stayed at that hospital for three days, during which his doctors gave him medication to prevent the bite wound from becoming infected. However, those doctors did not give the man anti-venom because they considered his symptoms to be "mild enough to not warrant anti-venom administration," according to the report, published Feb. 17 in the Journal of Clinical Neuroscience. 
The administration of anti-venom is normally recommended when a person is experiencing severe symptoms from a venomous bite, the authors of the report said. A few days after the man was released, he noticed his sense of smell began to deteriorate, and within weeks, he completely lost the ability to smell. A year later, when the man went to the neurology clinic at a different hospital, neurological tests confirmed that he was unable to detect smells — a condition that doctors call anosmia. However, the examination of his nose and nervous system did not reveal any other abnormalities, which meant his anosmia did not have a structural cause and therefore was most likely caused by the snakebite, the researchers said. Because more than a year had passed since the man was bitten and his loss of smell was severe, there was not much his doctors could do to treat his condition at that point. Meanwhile, the snake specimen that the man had kept in the jar was sent to the Queensland Museum's herpetology department, where experts identified it as the mulga snake (Pseudechis australis). The mulga snake is the largest terrestrial venomous snake in Australia, according to a previous study on mulga snake bites. In that study, which looked at 27 cases of people bitten by mulga snakes, the researchers noted that although the bites can be fatal, the most recent case of a fatal mulga snakebite was reported more than 40 years ago. In most cases, bites from a mulga snake can cause symptoms such as inflammation at the bite site, muscle pain and destruction of blood cells. But effects on the nervous system rarely have been reported for bites inflicted by this snake species, the researchers said. However, cases of long-term and permanent anosmia attributed to bites by other types of snakes have been reported, the researchers said. It's unclear how often people may develop anosmia after a snakebite, Winkel told Live Science. 
Overall, it appears to be "uncommon, but not rare," he said. In a previous study done in Australia, researchers examined the effects of the bites from the red-bellied black snake, and found that 1 in 57 affected patients developed anosmia, he said. The red-bellied black snake (Pseudechis porphyriacus) belongs to the same family of snakes as the mulga snake, called elapids. It is not clear whether administering anti-venom soon after a person is bitten may help prevent anosmia, the researchers said. Copyright 2016 LiveScience, a Purch company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

A staggering 93 percent of the Great Barrier Reef is already suffering from coral bleaching, as a comprehensive reef survey confirmed this week. But not all may be lost, as there are still ways for corals to survive these dire events. For the first time, researchers showed that some corals that survived bleaching can obtain and host new kinds of algae from the environment, making them more tolerant to heat and better geared for recovery. According to lead researcher and Southern Cross University postgraduate student Nadine Boulotte, most corals were previously thought to acquire microalgae while they’re still young – and to harbor the same kinds of algae their whole lives. “Our study shows for the first time that some adult corals can be promiscuous, and swap their algal partners later in life,” said Boulotte, who worked with a team of scientists from the University of Melbourne as well as other Australian and U.S. organizations. The swapping activity could help corals acquire more heat-tolerant microalgae and eventually adapt to global warming and bleaching events. Bleaching takes place when the microalgae found in coral polyps start to die off, leaving the coral’s tissues a dead white. The microalgae provide corals the energy they need to build reefs, survive, and maintain their mutually beneficial relationship. Using novel DNA sequencing, the team analyzed algal specimens from corals at Lord Howe Island in Australia during and after bleaching events in 2010 and 2011. Tracking the patterns followed by microalgae in polyp tissues in two species of corals, they detected “an extraordinary range” of various microalgae types in the corals. The survivors seemed to have obtained new algae types from their environment, with one microalga proliferating well and occupying around a third of the community in the sampled corals. 
Contrary to what was previously thought, taking up new microalgae types happened not just in coral larvae and in the juvenile phase – offering corals at different life stages a chance at coping with rising sea temperatures. Co-author and Director of SCU's Marine Ecology Research Center Peter Harrison deemed the research timely in light of the severe coral bleaching currently afflicting the northern Great Barrier Reef. He called for expanding studies from subtropical areas to tropical reef regions, where a majority of coral reefs exist and bleaching severely affects the coral populations. Lord Howe Island corals have not been seen to bleach this year, but previous events demonstrated that even the world’s southernmost reef is not fully shielded from massive bleaching. The findings were published in The ISME Journal. The Great Barrier Reef’s massive bleaching is feared to bring about grave ecological and economic consequences. Even its world heritage status is in danger of being revoked – UNESCO decided in 2015 to exclude the reef from its “in danger” list, seeking an update from the Australian government on progress made in water and reef quality improvements. © 2016 Tech Times, All rights reserved. Do not reproduce without permission.

News Article | February 9, 2017

African penguins have used biological cues in the ocean for centuries to find their favorite fish. Now these cues are trapping juvenile penguins in areas with hardly any food, scientists report February 9 in Current Biology. It’s the first known ocean “ecological trap,” which occurs when a once-reliable environmental cue instead, often because of human interference, prompts an animal to do something harmful. When juvenile Spheniscus demersus penguins off the Western Cape of South Africa leave the nest for their maiden voyage at sea, they head for prime penguin hunting grounds. But the fish are no longer there, says Richard Sherley, a marine ecologist with the University of Exeter Environment and Sustainability Institute. Increased ocean temperatures, changes in salinity and overfishing have driven the fish eastward. Penguins are doing what they’ve evolved to do, following signs in the water to historically prosperous habitats. “But humans have broken the system,” Sherley says, and there’s no longer enough fish to support the seabirds. Sherley estimates that only about 20 percent of these African penguins survive their first year, partly because they can fall into this ecological trap. Ecological traps have been documented on land for decades. There has been a lot of speculation about traps in the ocean, but this study is the best evidence so far, says Rob Hale, an ecologist with the University of Melbourne. “Hopefully the study will generate more interest in examining ecological traps in the ocean so we can better understand when and why traps arise, how they are likely to affect animals, and how we can go about managing their effects,” Hale says. This trap may have occurred because of how penguins find their food. Researchers think penguins can sense a stress chemical that phytoplankton release when being eaten. Penguins eat sardines, which eat phytoplankton. Usually the chemical, dimethyl sulfide, signals to penguins where the fish are feasting on phytoplankton. 
But phytoplankton can release the compound in other situations, like in rough water. The signal is still sent, but there are no fish. “You have a cue that used to signal high quality in an environment, but that environment has been modified by human action to some extent,” Sherley says. “The animals are tricked or trapped into selecting a lower quality habitat because the cue still exists, even though there’s high quality habitat available.” Adult penguins have adapted to the trap and shifted their hunting patterns to follow the fish east. Sherley says researchers aren’t sure how the adults learned to avoid the problem, but juveniles who survive to adulthood must somehow adapt in the same way. Researchers also tracked juvenile penguins from breeding regions in Namibia and the Eastern Cape of South Africa. The eastern penguins have been unaffected by the trap because the fish have moved closer to them. The Namibia population is being barely sustained by the goby, a junk food fish that appears to be taking over the areas previously inhabited by sardines and anchovies. The Western Cape penguins have been most affected. The population has declined 80 percent in the last 15 years — from 40,000 breeding pairs to 5,000 or 6,000, Sherley says. He estimates that if juvenile penguins hadn’t been falling victim to this trap, the Western Cape population would be double its current level. If the loss of fish off the Western Cape of South Africa can’t be reversed, Sherley speculates the two most likely outcomes are an African penguin extinction or an ecosystem shift. Current penguin conservation efforts protect penguin breeding areas, but the study suggests that the protections may be insufficient because the ecological trap is far from the breeding grounds.

Saeed I.,University of Melbourne | Tang S.-L.,Academia Sinica, Taiwan | Halgamuge S.K.,University of Melbourne
Nucleic Acids Research | Year: 2012

An approach to infer the unknown microbial population structure within a metagenome is to cluster nucleotide sequences based on common patterns in base composition, otherwise referred to as binning. When functional roles are assigned to the identified populations, a deeper understanding of microbial communities can be attained, more so than gene-centric approaches that explore overall functionality. In this study, we propose an unsupervised, model-based binning method with two clustering tiers, which uses a novel transformation of the oligonucleotide frequency-derived error gradient and GC content to generate coarse groups at the first tier of clustering; and tetranucleotide frequency to refine these groups at the secondary clustering tier. The proposed method has a demonstrated improvement over PhyloPythia, S-GSOM, TACOA and TaxSOM on all three benchmarks that were used for evaluation in this study. The proposed method is then applied to a pyrosequenced metagenomic library of mud volcano sediment sampled in southwestern Taiwan, with the inferred population structure validated against complementary sequencing of 16S ribosomal RNA marker genes. Finally, the proposed method was further validated against four publicly available metagenomes, including a highly complex Antarctic whale-fall bone sample, which was previously assumed to be too complex for binning prior to functional analysis. © 2011 The Author(s).
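The two-tier idea in the abstract can be illustrated with a deliberately simplified sketch. This is not the authors' method: their approach is model-based and uses a transformed oligonucleotide-frequency-derived error gradient at the first tier, whereas the toy version below stands in plain GC content at tier one and plain k-means at both tiers. All function names and the choice of k-means are illustrative assumptions; only the overall structure (coarse composition-based groups, refined by tetranucleotide frequency) follows the abstract.

```python
import numpy as np
from itertools import product

# All 256 tetranucleotides over the DNA alphabet, in a fixed order.
TETRAMERS = ["".join(p) for p in product("ACGT", repeat=4)]
TETRA_INDEX = {t: i for i, t in enumerate(TETRAMERS)}

def gc_content(seq):
    """Fraction of G/C bases in a sequence (tier-1 stand-in feature)."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def tetra_freq(seq):
    """Normalised 256-dimensional tetranucleotide frequency vector."""
    v = np.zeros(len(TETRAMERS))
    for i in range(len(seq) - 3):
        idx = TETRA_INDEX.get(seq[i:i + 4])
        if idx is not None:  # skip windows containing ambiguous bases
            v[idx] += 1
    total = v.sum()
    return v / total if total else v

def kmeans(X, k, iters=50):
    """Plain k-means with deterministic farthest-point initialisation."""
    centres = [X[0]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centres], axis=0)
        centres.append(X[d.argmax()])
    centres = np.array(centres, dtype=float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centres[j] = X[labels == j].mean(axis=0)
    return labels

def two_tier_bin(seqs, k1=2, k2=2):
    """Tier 1: coarse bins on GC content; tier 2: refine with 4-mer profiles.

    Returns {sequence index: (coarse bin, fine bin)}.
    """
    gc = np.array([[gc_content(s)] for s in seqs])
    coarse = kmeans(gc, k1)
    bins = {}
    for g in range(k1):
        members = [i for i, c in enumerate(coarse) if c == g]
        if len(members) <= k2:  # too few sequences to refine further
            bins.update({i: (g, 0) for i in members})
            continue
        tf = np.array([tetra_freq(seqs[i]) for i in members])
        fine = kmeans(tf, k2)
        bins.update({i: (g, int(f)) for i, f in zip(members, fine)})
    return bins
```

Run on synthetic contigs, sequences with similar GC content land in the same coarse bin, and only then are they compared on the much finer 4-mer signal — the same coarse-to-fine ordering the abstract describes, though the published method's clustering model and first-tier features differ.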

Berns E.M.J.J.,Erasmus Medical Center | Bowtell D.D.,Peter MacCallum Cancer Center | Bowtell D.D.,University of Melbourne
Cancer Research | Year: 2012

The classification of epithelial ovarian cancer has been substantially revised, with an increased appreciation of the cellular origins and molecular aberrations of the different histotypes. Distinct patterns of signaling-pathway disruption are seen between and within histotypes. Large-scale genomic studies of high-grade serous cancer, the most common histotype, have identified novel molecular subtypes that are associated with distinct biology and clinical outcome. High-grade serous cancers are characterized by few driver point mutations but abundant DNA copy number aberrations. Inactivation of genes associated with DNA damage repair underlies responses to platinum and PARP inhibitors. Here we review these recent developments. ©2012 AACR.

Shackleton M.,Peter MacCallum Cancer Center | Shackleton M.,University of Melbourne
Seminars in Cancer Biology | Year: 2010

The functional capabilities of normal stem cells and tumorigenic cancer cells are conceptually similar in that both cell types are able to proliferate extensively. Indeed, mechanisms that regulate the defining property of normal stem cells - self-renewal - also frequently mediate oncogenesis. These conceptual links are strengthened by observations in some cancers that tumorigenic cells can not only renew their malignant potential but also generate bulk populations of non-tumorigenic cells in a manner that parallels the development of differentiated progeny from normal stem cells. But cancer cells are not normal. Although tumorigenic cells and normal stem cells are similar in some ways, they are also fundamentally different in other ways. Understanding both shared and distinguishing mechanisms that regulate normal stem cell proliferation and tumor propagation is likely to reveal opportunities for improving the treatment of patients with cancer. © 2010 Elsevier Ltd.

Shortt J.,Peter MacCallum Cancer Center | Johnstone R.W.,Peter MacCallum Cancer Center | Johnstone R.W.,University of Melbourne
Cold Spring Harbor Perspectives in Biology | Year: 2012

The transforming effects of proto-oncogenes such as MYC that mediate unrestrained cell proliferation are countered by "intrinsic tumor suppressor mechanisms" that most often trigger apoptosis. Therefore, cooperating genetic or epigenetic effects to suppress apoptosis (e.g., overexpression of BCL2) are required to enable the dual transforming processes of unbridled cell proliferation and robust suppression of apoptosis. Certain oncogenes such as BCR-ABL are capable of concomitantly mediating the inhibition of apoptosis and driving cell proliferation and therefore are less reliant on cooperating lesions for transformation. Accordingly, direct targeting of BCR-ABL through agents such as imatinib have profound antitumor effects. Other oncoproteins such as MYC rely on the anti-apoptotic effects of cooperating oncoproteins such as BCL2 to facilitate tumorigenesis. In these circumstances, where the primary oncogenic driver (e.g., MYC) cannot yet be therapeutically targeted, inhibition of the activity of the cooperating antiapoptotic protein (e.g., BCL2) can be exploited for therapeutic benefit. © 2012 Cold Spring Harbor Laboratory Press; all rights reserved.

Room R.,University of Melbourne | Reuter P.,University of Maryland University College
The Lancet | Year: 2012

The Single Convention on Narcotic Drugs in 1961 aimed to eliminate the illicit production and non-medical use of cannabis, cocaine, and opioids, an aim later extended to many pharmaceutical drugs. Over the past 50 years international drug treaties have neither prevented the globalisation of the illicit production and non-medical use of these drugs, nor, outside of developed countries, made these drugs adequately available for medical use. The system has also arguably worsened the human health and wellbeing of drug users by increasing the number of drug users imprisoned, discouraging effective countermeasures to the spread of HIV by injecting drug users, and creating an environment conducive to the violation of drug users' human rights. The international system has belatedly accepted measures to reduce the harm from injecting drug use, but national attempts to reduce penalties for drug use while complying with the treaties have often increased the number of drug users involved with the criminal justice system. The international treaties have also constrained national policy experimentation because they require nation states to criminalise drug use. The adoption of national policies that are more aligned with the risks of different drugs and the effectiveness of controls will require the amendment of existing treaties, the formulation of new treaties, or withdrawal of states from existing treaties and re-accession with reservations. © 2012 Elsevier Ltd.

Alvarez Rojas C.A.,University of Melbourne | Romig T.,University of Hohenheim | Lightowlers M.W.,University of Melbourne
International Journal for Parasitology | Year: 2014

Genetic variability in the species group Echinococcus granulosus sensu lato is well recognised as affecting intermediate host susceptibility and other biological features of the parasites. Molecular methods have allowed discrimination of different genotypes (G1-10 and the 'lion strain'), some of which are now considered separate species. An accumulation of genotypic analyses undertaken on parasite isolates from human cases of cystic echinococcosis provides the basis upon which an assessment is made here of the relative contribution of the different genotypes to human disease. The allocation of samples to G-numbers becomes increasingly difficult, because much more variability than previously recognised exists in the genotypic clusters G1-3 (=E. granulosus sensu stricto) and G6-10 (Echinococcus canadensis). To accommodate the heterogeneous criteria used for genotyping in the literature, we restrict ourselves to differentiate between E. granulosus sensu stricto (G1-3), Echinococcus equinus (G4), Echinococcus ortleppi (G5) and E. canadensis (G6-7, G8, G10). The genotype G1 is responsible for the great majority of human cystic echinococcosis worldwide (88.44%), has the most cosmopolitan distribution and is often associated with transmission via sheep as intermediate hosts. The closely related genotypes G6 and G7 cause a significant number of human infections (11.07%). The genotype G6 was found to be responsible for 7.34% of infections worldwide. This strain is known from Africa and Asia, where it is transmitted mainly by camels (and goats), and South America, where it appears to be mainly transmitted by goats. The G7 genotype has been responsible for 3.73% of human cases of cystic echinococcosis in eastern European countries, where the parasite is transmitted by pigs. Some of the samples (11) could not be identified with a single specific genotype belonging to E. canadensis (G6/10). 
Rare cases of human cystic echinococcosis have been identified as having been caused by the G5, G8 and G10 genotypes. No cases of human infection with G4 have been described. Biological differences between the species and genotypes have potential to affect the transmission dynamics of the parasite, requiring modification of methods used in disease control initiatives. Recent investigations have revealed that the protective vaccine antigen (EG95), developed for the G1 genotype, is immunologically different in the G6 genotype. Further research will be required to determine whether the current EG95 vaccine would be effective against the G6 or G7 genotypes, or whether it will be necessary, and possible, to develop genotype-specific vaccines. © 2013 Australian Society for Parasitology Inc.

Chilton N.F.,University of Manchester | Collison D.,University of Manchester | McInnes E.J.L.,University of Manchester | Winpenny R.E.P.,University of Manchester | Soncini A.,University of Melbourne
Nature Communications | Year: 2013

Understanding the anisotropic electronic structure of lanthanide complexes is important in areas as diverse as magnetic resonance imaging, luminescent cell labelling and quantum computing. Here we present an intuitive strategy based on a simple electrostatic method, capable of predicting the magnetic anisotropy of dysprosium(III) complexes, even in low symmetry. The strategy relies only on knowing the X-ray structure of the complex and the well-established observation that, in the absence of high symmetry, the ground state of dysprosium(III) is a doublet quantized along the anisotropy axis with an angular momentum quantum number mJ = ±15/2. The magnetic anisotropy axes of 14 low-symmetry monometallic dysprosium(III) complexes computed via high-level ab initio calculations are very well reproduced by our electrostatic model. Furthermore, we show that the magnetic anisotropy is equally well predicted in a selection of low-symmetry polymetallic complexes. © 2013 Macmillan Publishers Limited. All rights reserved.

Goud B.,University Pierre and Marie Curie | Gleeson P.A.,University of Melbourne
Trends in Cell Biology | Year: 2010

The architecture of the Golgi apparatus is intimately linked to its role in regulating membrane trafficking. The recruitment of peripheral membrane proteins, in particular golgins and small G proteins has emerged as a key to the understanding of the organization and the dynamics of this organelle. There have been considerable recent advances in defining the structures and binding partners of golgins, and their contribution to membrane-mediated biological processes. In this paper, we review the proposed roles for golgins with a focus on the golgins of the trans-Golgi network (TGN). We explore the potential for TGN golgins, acting as scaffold molecules, to co-ordinate the regulation of TGN structure and function. © 2010 Elsevier Ltd.

Harvey K.F.,Peter MacCallum Cancer Center | Harvey K.F.,University of Melbourne | Zhang X.,Peter MacCallum Cancer Center | Zhang X.,University of Melbourne | And 2 more authors.
Nature Reviews Cancer | Year: 2013

The Hippo pathway controls organ size in diverse species, whereas pathway deregulation can induce tumours in model organisms and occurs in a broad range of human carcinomas, including lung, colorectal, ovarian and liver cancer. Despite this, somatic or germline mutations in Hippo pathway genes are uncommon, with only the upstream pathway gene neurofibromin 2 (NF2) recognized as a bona fide tumour suppressor gene. In this Review, we appraise the evidence for the Hippo pathway as a cancer signalling network, and discuss cancer-relevant biological functions, potential mechanisms by which Hippo pathway activity is altered in cancer and emerging therapeutic strategies. © 2013 Macmillan Publishers Limited. All rights reserved.

Loi S.,Peter MacCallum Cancer Center | Loi S.,University of Melbourne
OncoImmunology | Year: 2013

By analyzing over 2000 samples from a randomized clinical trial, we have recently associated high levels of tumor-infiltrating lymphocytes with an excellent prognosis among triple-negative breast cancer patients, as well as with improved clinical responses to immunogenic chemotherapy among patients bearing HER2 over-expression. These findings suggest that immunomodulation could represent a new approach to treat these aggressive breast cancer subtypes. © 2013 Landes Bioscience.

Research & Innovation, Grains Research & Development Corporation and University of Melbourne | Date: 2014-08-21

The present invention relates generally to polysaccharide synthases. More particularly, the present invention relates to (1,3;1,4)-β-D-glucan synthases. The present invention provides, among other things, methods for influencing the level of (1,3;1,4)-β-D-glucan produced by a cell and nucleic acid and amino acid sequences which encode (1,3;1,4)-β-D-glucan synthases.

University of Melbourne and Council Of Scientific & Industrial Research | Date: 2011-09-14

The present invention relates to a synthetic immunogen represented by the general formula (I), useful for generating long-lasting protective immunity against various intracellular pathogens which are the causative agents of tuberculosis, leishmaniasis, AIDS, trypanosomiasis, malaria and also allergy and cancer, and a process for the preparation thereof. The developed immunogen is able to circumvent HLA restriction in humans and livestock. The invention further relates to a vaccine comprising the said immunogen for generating enduring protective immunity against various diseases. The said vaccine is targeted against intracellular pathogens, more particularly the pathogen M. tuberculosis in this case. In the present invention, promiscuous peptides of M. tuberculosis are conjugated to TLR ligands, especially Pam2Cys, to target them mainly to dendritic cells and therefore elicit long-lasting protective immunity. General formula (I): wherein X_(1) = a promiscuous CD4 T helper epitope selected from SEQ ID No. 1 to 98, or nil; X_(2) = a promiscuous CD8 T cytotoxic epitope selected from SEQ ID No. 99 to 103, or nil; when X_(1) = nil, X_(2) = SEQ ID No. 99 to 103, and when X_(2) = nil, X_(1) = SEQ ID No. 1 to 98; Y = lysine; and S = serine.

Agency: Cordis | Branch: H2020 | Program: RIA | Phase: YOUNG-3-2015 | Award Amount: 2.50M | Year: 2016

The ENLIVEN research models how policy interventions in adult education markets can become more effective. Integrating state-of-the-art methodologies and theorisations (e.g. Case-Based Reasoning methodology in artificial intelligence, bounded agency in adult learning), it implements and evaluates an innovative Intelligent Decision Support System to provide a new and more scientific underpinning for policy debate and decision-making on adult learning, especially for young adults. It utilizes findings from research conducted by European and international agencies and research projects, as well as from the ENLIVEN project. It will enable policy-makers at EU, national and organizational levels to enhance the provision and take-up of learning opportunities for adults, leading to a more productive and innovative workforce, and reduced social exclusion. The project comprises 11 workpackages in 4 clusters. WPs 1-3 examine programmes, governance and policies in EU adult learning, looking at the multi-dimensional nature of social exclusion and disadvantage. WP4 examines system characteristics to explain country/region-level variation in lifelong learning participation rates, with particular reference to disadvantaged and at-risk groups, and to young people. WPs 5-7 examine the operation and effectiveness of young adults' learning at and for work, undertaking cross-country comparative institutional analysis. WPs 8-9 develop the knowledge base for, and develop and trial, an Intelligent Decision Support System (IDSS) for evidence-based policy-making and debate. The ENLIVEN team comprises leading scholars with a full range of methodological skills in lifelong learning research and related areas, as well as advanced computer science skills. It will maintain a continuing interaction with policy makers and key research networks, make targeted interventions in policy and scientific debate, and deliver a state-of-the-art IDSS to improve lifelong learning for young adults across Europe.

News Article | December 23, 2016

University of Melbourne researchers have found an increase in heart attacks around the festive period may be due to more difficult access to hospitals, combined with stress, an excess of alcohol and a fatty diet. Previous research from the USA has established that the Christmas holidays are related to more heart attacks; however, it was thought it could be due to the season - winter - when mortality rates are at their highest. To find out, University of Melbourne researchers analysed 25 years of death records of heart attacks between Christmas and the first week of January, during summer in the southern hemisphere. The research, published today in the Journal of the American Heart Association, revealed a 4.2 per cent increase in heart-related deaths occurring out of hospital during the Christmas period in New Zealand. And victims were typically younger. The average age of cardiac death was 76.2 years during the Christmas period compared with 77.1 years at other times of the year. Lead author and researcher at the Centre for Health Policy at the University of Melbourne, Josh Knight, said by using data from a country where Christmas occurs in the height of summer, he was able to separate any "holiday effect" from the "winter effect". Knight said that there is a need to understand whether restricted access to healthcare facilities might be combining with other risk factors such as emotional stress, changes in diet and alcohol consumption, resulting in the spike in cardiac deaths. He suggested patients might also hold back in seeking medical care during the holiday season. "The Christmas holiday period is a common time for travel within New Zealand, with people frequently holidaying away from their main medical facilities," he said. "This could contribute to delays in both seeking treatment, due to a lack of familiarity with nearby medical facilities, and due to geographic isolation from appropriate medical care in emergency situations." 
Another explanation may have to do with a terminally ill patient's will to live and hold off death for a day that is important to them. "The ability of individuals to modify their date of death based on dates of significance has been both confirmed and refuted in other studies, however it remains a possible explanation for this holiday effect," Mr Knight said.

News Article | November 9, 2016

A new type of sensor has the potential to replace sniffer dogs when it comes to detecting explosives such as TNT. This week, researchers from a number of institutions including TU Delft are publishing an article about this subject in the American Chemical Society's journal Nano Letters. “For the first time ever, we have used molecules with a lantern-type cage structure to fabricate sensitive nanosensors that can detect explosive substances such as TNT,” says researcher Louis de Smet (affiliated with TU Delft and Wageningen University). “These cage structures have a capacity of about 1 cubic nanometer, which precisely accommodates a single TNT molecule.” Researchers from TU Delft, the University of Twente, Philips Research, the City University of Hong Kong, and the University of Melbourne have chemically bound an ultrathin layer of these specially developed cages to the surface of a sensor chip containing a few dozen sensitive nanosensors. A single cage is not sufficient for detection purposes. “Porous molecules are used quite often to capture ambient molecules,” explains De Smet. “In the case of relatively small molecules, as with explosives, the challenge is to ensure that the cage structure is not only the right size but that it also has the right anchor points so that the molecule can click into place – thus rendering it detectable. For this work, we therefore use layers consisting of so-called MOP molecules (Metal-Organic Polyhedra). Through variation with a large number of geometric and electronic properties of these complex cage molecules, we are able to capture the ‘explosive’ molecules we are looking for. And the presence of such a molecule also causes the electrical conductance of the underlying silicon nanowires to change in a very characteristic way. 
We can measure this and thus confirm that we have actually found TNT molecules from an explosive.” “Eventually, we may be able to use this type of sensor to detect explosives – in a war situation, for example, or when facing a terrorist threat,” says De Smet. “Currently, very different, qualitative methods are mainly used for this, involving chemical reactions causing color changes, for instance, or the deployment of sniffer dogs. The great thing about our method is that you can not only detect whether there are traces of TNT but you can also determine the amount.” Cao, a PhD candidate at TU Delft, and Zhu, a postdoctoral researcher at the University of Twente, initiated this work and performed the experiments, while Shang did the computational work. Klootwijk, Sudhölter, Huskens and De Smet supervised the project.

News Article | August 18, 2016

The United Nations climate change conference held last year in Paris had the aim of tackling future climate change. After the deadlocks and weak measures that arose at previous meetings, such as Copenhagen in 2009, the Paris summit was different. The resulting Paris Agreement committed signatories to holding global warming well below 2℃ and to pursuing efforts to limit it to 1.5℃. The agreement was widely met with cautious optimism. Certainly, some of the media were pleased with the outcome while acknowledging the deal’s limitations. Many climate scientists were pleased to see a more ambitious target being pursued, but what many people fail to realise is that actually staying within a 1.5℃ global warming limit is nigh on impossible. There seems to be a strong disconnect between what the public and climate scientists think is achievable. The problem is not helped by the media’s apparent reluctance to treat it as a true crisis. In 2015, we saw global average temperatures a little over 1℃ above pre-industrial levels, and 2016 will very likely be even hotter. In February and March of this year, temperatures were 1.38℃ above pre-industrial averages. Admittedly, these are individual months and years with a strong El Niño influence (which makes global temperatures more likely to be warmer), but the point is we’re already well on track to reach 1.5℃ pretty soon. So when will we actually reach 1.5℃ of global warming? Timeline showing best current estimates of when global average temperatures will rise beyond 1.5℃ and 2℃ above pre-industrial levels. Boxes represent 90% confidence intervals; whiskers show the full range. Image via Andrew King. On our current emissions trajectory we will likely reach 1.5℃ within the next couple of decades (2024 is our best estimate). The less ambitious 2℃ target would be surpassed not much later. This means we probably have only about a decade before we break through the ambitious 1.5℃ global warming target agreed to by the world’s nations in Paris. 
A University of Melbourne research group recently published these spiral graphs showing just how close we are getting to 1.5℃ warming. Realistically, we have very little time left to limit warming to 2℃, let alone 1.5℃. This is especially true when you bear in mind that even if we stopped all greenhouse gas emissions right now, we would likely experience about another half-degree of warming as the oceans “catch up” with the atmosphere. The public seriously underestimates the level of consensus among climate scientists that human activities have caused the majority of global warming in recent history. Similarly, there appears to be a lack of public awareness about just how urgent the problem is. Many people think we have plenty of time to act on climate change and that we can avoid the worst impacts by slowly and steadily reducing greenhouse gas emissions over the next few decades. This is simply not the case. Rapid and drastic cuts to emissions are needed as soon as possible. In conjunction, we must also urgently find ways to remove greenhouse gases already in the atmosphere. At present, this is not yet viable on a large scale. The 1.5℃ and 2℃ targets are designed to avoid the worst impacts of climate change. It’s certainly true that the more we warm the planet, the worse the impacts are likely to be. However, we are already experiencing dangerous consequences of climate change, with clear impacts on society and the environment. For example, a recent study found that many of the excess deaths reported during the summer 2003 heatwave in Europe could be attributed to human-induced climate change. Also, research has shown that the warm seas associated with the bleaching of the Great Barrier Reef in March 2016 would have been almost impossible without climate change. Climate change is already increasing the frequency of extreme weather events, from heatwaves in Australia to heavy rainfall in Britain. These events are just a taste of the effects of climate change. 
Worse is almost certainly set to come as we continue to warm the planet. It’s highly unlikely we will achieve the targets set out in the Paris Agreement, but that doesn’t mean governments should give up. It is vital that we do as much as we can to limit global warming. The more we do now, the less severe the impacts will be, regardless of targets. The simple take-home message is that immediate, drastic climate action will mean far fewer deaths and less environmental damage in the future. By Andrew King, Climate Extremes Research Fellow, University of Melbourne and Benjamin J. Henley, Research Fellow in Climate and Water Resources, University of Melbourne. This article has been cross-posted from The Conversation.

News Article | March 11, 2016

While popular thought has dictated that the onset of global warming began around 1979, it looks like climate change germinated decades before that hallmark year. A group of scientists has determined that the detrimental effects of greenhouse gases could have been in play as far back as 1937 - a year in which recorded temperatures reached record-breaking heights. According to a paper recently published in the scientific journal Geophysical Research Letters, that year was a harbinger of sorts - it paved the way for other astronomically high temperatures in 1940, 1941 and 1944. "What we found was that we could actually detect human influence on extreme events a lot earlier than we'd thought," said Oxford University's Daniel Mitchell, a physicist and climate change researcher, as well as one of the co-authors of the article. Although industrialization in 1937 was far from today's levels, the scientists concluded from their results - drawn from models run on data going back as far as 1901 - that even the comparatively small emissions of the time were already driving temperatures upward. "The record set last year saw average temperatures of 1°C (1.8°F) above those of the late 19th century. A United Nations climate agreement was struck in Paris in December, aiming to keep warming 'well below' 2°C (3.6°F). The warming effects of fossil fuel use and deforestation remained relatively slight 80 years ago, when compared with the heavy hand they have played in rapid-fire records set more recently. Even so, the researchers concluded that greenhouse gas pollution in 1937 doubled the likelihood of reaching that year's high average temperature." Another researcher on the project, Andrew King of the University of Melbourne, stated that it would have been "virtually impossible" for the Earth's temperature to reach these extremes if the effect of greenhouse gases weren't in play. 
"It's just kind of scary that we've been influencing the climate for a very long time, and we haven't really done anything substantial to limit our emissions," King added. "We've just made the problem worse and worse."

News Article | November 8, 2016

India is poised to become the first non-signatory to the Nuclear Non-proliferation Treaty to be allowed to import nuclear technology from Japan. The prime ministers of the two countries, Narendra Modi (above right) and Shinzo Abe (above left) will meet on Friday to sign the deal under which Japan can supply nuclear power plants to India. This will allow India to add to the 21 nuclear plants that it is already operating. The reactors will create electricity from radioactive uranium, generating plutonium as a by-product. Plutonium can be used to build nuclear weapons, but Japan says it will end the deal if India carries out any nuclear tests. The trouble is that India is not bound by the non-proliferation treaty, so there are no international safeguards to prevent military repurposing of plutonium, say anti-nuclear groups. Analysts are also concerned that the deal comes at a time when India and Japan are increasingly at loggerheads with China. India and China are scaling up military operations near their shared border, while China and Japan are quarrelling over the East China Sea Islands. And India has had long-standing border disagreements with Pakistan. But Satoru Nagao at Gakushuin University in Japan says that India will be able to reduce its reliance on fossil fuels by expanding its nuclear energy industry. The country is the third highest emitter of carbon dioxide, and is under pressure to make the move to clean energy. Moreover, India has a good non-proliferation record, Nagao says. The country has a small number of nuclear weapons but has not performed any tests since 1998. “India has an impeccable record of nuclear non-proliferation,” says Ashok Sharma at the University of Melbourne, Australia. “So, I don’t think India will use the nuclear deal with Japan for making bombs.” In any case, Sharma says, India already has enough atomic bombs to deter China and Pakistan. 
Sharma points to a previous US-India nuclear deal, passed overwhelmingly by both houses of the US Congress, as evidence of India's non-proliferation credentials. “India wants to enhance the energy mix of nuclear energy from 3 per cent to around 25 per cent in the coming decades. Nuclear energy is environment-friendly and it will help India to reduce its carbon emission to achieve its Paris Agreement commitments. “It has committed itself to use the enriched uranium and the technological assistance for civilian purposes.”

News Article | December 23, 2016

University of Melbourne researchers have found an increase in heart attacks around the festive period may be due to more difficult access to hospitals, combined with stress, an excess of alcohol and a fatty diet. Previous research has established that the Christmas holidays are associated with more heart attacks; however, it was thought this could be due to the season - winter - when mortality rates are at their highest. To find out, University of Melbourne researchers analyzed 25 years of death records of heart attacks between Christmas and the first week of January, during summer in the southern hemisphere. The research, published Dec. 22 in the Journal of the American Heart Association, revealed a 4.2 percent increase in heart-related deaths occurring out of hospital during the Christmas period in New Zealand. And victims were typically younger. The average age of cardiac death was 76.2 years during the Christmas period compared with 77.1 years at other times of the year. Lead author and researcher at the Centre for Health Policy at the University of Melbourne, Josh Knight, said by using data from a country where Christmas occurs in the height of summer, he was able to separate any "holiday effect" from the "winter effect." Knight said that there is a need to understand whether restricted access to healthcare facilities might be combining with other risk factors such as emotional stress, changes in diet and alcohol consumption, resulting in the spike in cardiac deaths. He suggested patients might also hold back in seeking medical care during the holiday season. "The Christmas holiday period is a common time for travel within New Zealand, with people frequently holidaying away from their main medical facilities," he said. "This could contribute to delays in both seeking treatment, due to a lack of familiarity with nearby medical facilities, and due to geographic isolation from appropriate medical care in emergency situations."
Another explanation may have to do with a terminally ill patient's will to live and hold off death for a day that is important to them. "The ability of individuals to modify their date of death based on dates of significance has been both confirmed and refuted in other studies, however it remains a possible explanation for this holiday effect," Knight said.
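The headline figure is a simple rate comparison: deaths per day over the Christmas window against deaths per day over the rest of the year. A minimal sketch of that calculation follows; the counts below are invented for illustration only (the study reported the 4.2 percent excess, not these numbers).

```python
# Hypothetical daily cardiac-death counts, illustrative only -- the actual
# study reported a 4.2% excess over the Christmas period in New Zealand.
christmas_days, christmas_deaths = 14, 729      # Christmas to first week of January
baseline_days, baseline_deaths = 351, 17_550    # remainder of the year

christmas_rate = christmas_deaths / christmas_days   # deaths per day
baseline_rate = baseline_deaths / baseline_days

rate_ratio = christmas_rate / baseline_rate
excess_pct = (rate_ratio - 1) * 100
print(f"Excess cardiac mortality: {excess_pct:.1f}%")  # -> 4.1% with these made-up counts
```

A real analysis would also model seasonal trends and report confidence intervals, but the excess-mortality figure itself reduces to this kind of rate ratio.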

Millions of Australians just endured a sizzlingly hot summer, with three blistering heat waves enveloping much of southeastern Australia during January and February, sending temperatures soaring as high as 48.2 degrees Celsius, or 118.7 degrees Fahrenheit. New South Wales, located in southeastern Australia, had its warmest summer on record, with numerous temperature milestones shattered in Sydney, Brisbane and Canberra, among other locations. Now a new quick-turnaround analysis from an international group of climate researchers has found direct ties between global warming and this summer's heat. In completing the study, the researchers utilized the computing power of hundreds of volunteers' laptops and desktops worldwide, through a project known as weather@home. January 2017 saw the highest monthly mean temperatures on record for Sydney and Brisbane, and the highest daytime temperatures on record in Canberra, the study, produced by the World Weather Attribution program at Climate Central and other institutions, found. In many areas of New South Wales, temperatures topped out at higher than 45 degrees Celsius, or 113 degrees Fahrenheit, during the heat's peak on Feb. 11 and 12. The duration of the heat wave was particularly noteworthy, with Moree, New South Wales, enduring 52 straight days with temperatures exceeding 35 degrees Celsius, or 95 degrees Fahrenheit, which set a record. The study found that the regional record hot summer "can be linked directly to climate change." To reach this conclusion, researchers used methods that have been established in the peer-reviewed scientific literature, such as by simulating the hot summer in the presence of planet-warming greenhouse gases and without them and comparing the likelihood of its occurrence. Australia's annually averaged temperature has warmed by around 1 degree Celsius, or 1.8 degrees Fahrenheit, since 1910, according to the Bureau of Meteorology.
The team, including scientists at the University of Melbourne and University of New South Wales, also analyzed observational data, and both methods led to the conclusion that average summer temperatures like those seen in 2016-2017 are now 50 times more likely to occur than they were before global warming began. Even worse, the study found that in the future, a summer as hot as this one is likely to happen once every five years, compared to once every 500 years prior to global warming. "Our results are certainly in line with what we have observed around the world in heat waves and their trends," said Andrew King, a climate extremes research fellow at the University of Melbourne, in an email. He said the results are "also in line with what we would largely expect to see as the overall climate warms." Although the analysis itself is not yet peer reviewed, King said its reliance on peer-reviewed methods adds credibility to the conclusions. "By looking at multiple methods and regional to local scales we have confidence in our findings," he said. "The methods we have used here are all peer-reviewed so we can be confident in the results." In addition to the regional analysis, the researchers also investigated how the odds of such a hot summer have shifted on a local level for Sydney and Canberra specifically. Here, their results were more mixed due to the greater uncertainties involved with looking at smaller geographical scales. In Canberra, the researchers found there has been a 50 percent increase in the likelihood of a three-day heat wave such as what occurred from Feb. 9 to 11 of this year. However, the analysis found no clear human-induced trend in Sydney for such a short-term heat event, since the decades-long signal of global warming couldn't be separated from natural climate variability there.
This conclusion should not come as a surprise, considering numerous climate reports from Australian research organizations and a growing body of research in scientific journals showing that as global average surface temperatures climb, the odds of extreme heat are rapidly increasing.  "If anything, these attribution statements are conservative, which is how I prefer them," said Michael Wehner, a senior staff scientist at Lawrence Berkeley National Laboratory, who was not involved in this analysis. "But the researchers have clearly found that the human increase in these measures of heat waves is profound."  A study Wehner published in December along with Daithi Stone and other colleagues tied climate change to deadly heat events in India and Pakistan in 2015. Other studies published in the same month also found links between global warming and heat waves in Australia and China, among other locations.  "The methods that they used to put numbers on the change in likelihood ... is well established and very similar to what we published recently on the deadly 2015 Indian and Pakistani heat waves," Wehner said.  Such post-event scientific sleuthing is known as climate attribution research, and climate experts are aiming to decrease the lag time between a headline-grabbing extreme event occurring, and the time when scientists can say what, if anything, global warming had to do with it.  In this way, it is hoped, people will learn more about how global warming is reshaping the weather extremes they are already experiencing, and provide valuable intelligence to everyone from insurance companies to local governments.
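Risk statements like "50 times more likely" come from comparing the probability of exceeding an observed temperature threshold in a simulated climate with human greenhouse forcing against a counterfactual climate without it. The sketch below uses made-up Gaussian climates; the means, spread and threshold are assumptions for illustration, not values from the study.

```python
import random

random.seed(0)

def exceedance_prob(mean, sd, threshold, n=200_000):
    """Monte Carlo estimate of P(summer-mean temperature anomaly > threshold)."""
    hits = sum(random.gauss(mean, sd) > threshold for _ in range(n))
    return hits / n

THRESHOLD = 1.5  # observed summer anomaly in degrees C (hypothetical)

# Counterfactual climate without human forcing vs. a climate warmed by ~1 C.
p_natural = exceedance_prob(mean=0.0, sd=0.5, threshold=THRESHOLD)
p_actual = exceedance_prob(mean=1.0, sd=0.5, threshold=THRESHOLD)

print(f"Risk ratio: {p_actual / p_natural:.0f}x more likely with warming")
```

In real attribution work the two distributions come from large ensembles of climate-model simulations (such as weather@home) rather than assumed Gaussians, but the risk ratio is computed the same way.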

CANBERRA, Australia, Nov. 08, 2016 (GLOBE NEWSWIRE) -- The Northrop Grumman Foundation is pleased to announce the recipient of the 2017 Northrop Grumman Foundation-American Australian Association (AAA) Fellowship for advanced research and study in science and technology. As part of its commitment to the AAA’s Fellowship program, the Foundation will provide sponsorship to mathematics scholar Scott Mullane to undertake advanced studies in moduli spaces in the United States. Scott received a Master of Science degree at the University of Melbourne and a doctorate from Boston College, and will use the funding provided by the AAA to complete a 12-month postdoctoral scholarship at Harvard University. The Northrop Grumman Foundation-American Australian Association Fellowship provides an annual scholarship to a highly talented Australian student undertaking research in areas of study focusing on science, technology, engineering and mathematics (STEM). This fellowship is offered as part of the AAA’s range of scholarships which provide up to $40,000 for Australia’s brightest researchers to undertake advanced graduate or postgraduate research in the United States and aim to foster intellectual exchange between the United States and Australia. As part of his placement at Harvard University, Scott will work on cutting-edge advances in the study of moduli spaces, which relates to many branches of science, from modelling the universe in physics to analysing the structure of DNA. “It is essential that Australian academics remain globally connected and part of top quality international research collaborations, and the foresight and support of the AAA is critical to allowing these incredible opportunities to occur,” said Mullane. “We congratulate Scott on his success in winning this highly competitive award,” said Ian Irving, chief executive, Northrop Grumman Australia.
“Northrop Grumman is proud to support STEM students at all levels in Australia, and we identify with Scott’s passion in regards to supporting Australians to expand their academic horizons at institutions across the world.” “At the American Australian Association we are committed to developing Australian talent and to ensuring that Australia remains at the forefront of research,” said Diane Sinclair, vice president, AAA. “We are thrilled to have the ongoing support of the Northrop Grumman Foundation to allow Australian scholars to forge strong links with American research partners across a range of STEM disciplines which are critical for Australia’s future.” This is the third year Northrop Grumman and the AAA have partnered to offer the fellowship to Australian researchers. Benjamin Morrell, who received the first fellowship in 2015, studied at Texas A&M University and Massachusetts Institute of Technology. He said, “The experience enabled by the fellowship led to the forging of connections with collaborators, researchers, industry professionals and industry leaders throughout America that have been, and will be, invaluable in developing closer connections between Australia and the United States.” In Australia, in addition to awarding a range of academic scholarships, Northrop Grumman sponsors scholarships for high school students and teachers from Australia to attend Space Camp® at the U.S. Space and Rocket Center® in Huntsville, Alabama. The company has also entered into strategic partnership agreements with the University of Sydney and the Australian Defence Force Academy to work jointly on education, training programs and research initiatives. Established in 1948, the American Australian Association is the largest public not-for-profit organisation devoted to building strategic alliances between the United States and Australia.
The Association’s Education Fund provides postgraduate educational opportunities in Australia and the U.S. Since 2002, the Association has awarded fellowships to Australians and Americans for advanced research and study in the U.S. and Australia. To date, $5 million in fellowships has been awarded to more than 200 scholars. Northrop Grumman and the Northrop Grumman Foundation are committed to expanding and enhancing the pipeline of diverse, talented STEM students globally. They provide funding to sustainable STEM programs that span from preschool to high school and through collegiate levels, with a major emphasis on middle school students and teachers. In 2015, Northrop Grumman and the Northrop Grumman Foundation continued outreach efforts by contributing more than $17.4 million to diverse STEM-related groups such as the Air Force Association (CyberPatriot), the REC Foundation (VEX Robotics), National Science Teachers Association, and the National Action Council for Minorities in Engineering.

Getting access to data in order to carry out research and analysis is often a long and tortuous process. In the health sector, for example, a request for de-identified data about CT scans and cancer notifications took 5 years to reach researchers at the University of Melbourne. Medical guidelines about CT use in young people were changed after the researchers showed a relationship between CT scans and an increased risk of cancer. This link could have been identified much earlier, possibly preventing many young patients from being put at risk. The Commission rightly makes the point that the ongoing use of data for the assessment of medical procedures and drug treatments in particular should be the norm, not the subject of occasional research. There is a wide range of reasons why data is not being used more effectively. In the case of health data, there are concerns by hospitals and health authorities relating to the privacy and confidentiality of patient data. Perhaps a more practical obstacle, however, is that sharing data, or making use of it for analysis, is not seen as a priority by many organisations, and consequently they are unwilling to devote much money to those activities. Certainly, they are even less willing to spend money on preparing data to share with other organisations, especially those that they regard as competitors. Another issue is that CEOs and managers are often not highly numerate, and consequently don't see the value in the information and knowledge to be gained from data analytics. What the Commission is proposing, however, is legislative changes to force the increased availability of data from the public and private sector. It also wants to give individuals ultimate control over what data is held about them and control over what organisations and government can do with that data. It pushes for a more radical culture of data use than is currently the case and will face an uphill struggle to change that culture.
One of the recommendations, for example, suggests that linked data, including statistical linkage keys, that is used for research should not have to be destroyed at the completion of the project. This was one of the issues raised through public concern about the Australian Bureau of Statistics retaining names and addresses from the 2016 census for a longer period. In fact, the Commission is arguing for more widespread use of identified data. What this means is that if the recommendations are accepted and become law, private and public organisations will have to do more to catalogue and publish what data is held, share it more widely and in particular allow individuals access and more direct control. For international companies, this may not be anything new. Legislation in other countries, especially in the European Union, has meant that companies like Google, for example, already have mechanisms by which individuals can access the data that is held about them. For a strategy of justifying the increased use of data to work, the public will need to be convinced that the data is going to be used in their interest and not as just another way to justify increased surveillance and recording of private information, especially by the Government. The benefits of data availability and analysis need to be clearly communicated to the public. It will be important to give clear examples of cases where there has been a direct association between using data and better health outcomes. Of course, this is harder to do if the benefits of analysis of data are in the future and depend on the changes to data use being implemented successfully. Perhaps a bigger challenge, however, is the lack of people with the right skills to handle and analyse data.
Universities have only recently started to focus on data science and related degrees, and even then the focus is on the analysis part and not on the other skills required for making use of the information produced from these activities in an organisation or in government. Those companies that understand the benefits of data are already struggling to get trained staff to work on these problems. Many organisations, however, just don't understand how the collection and analysis of data will help them. Consequently, they are unwilling to make it a priority in their budgets or activities. Ironically enough, this includes universities themselves.

Heart-related deaths increase in the United States every Christmas season, but what is behind the spike in cardiac deaths over the holiday period? Because the holidays fall during the winter season, the increased number of deaths during this time of the year, also known as the Christmas holiday effect, has been attributed to colder temperatures. Blaming the weather makes sense. For one, cold temperatures can constrict the blood vessels and make the heart work harder. The American Heart Association said that people who already have existing heart conditions are particularly at risk during the winter season, since they tend to suffer from angina pectoris, characterized by chest pain or discomfort, when they are in cold weather. Findings of a new study, however, debunk the idea that the cold weather is widely responsible for the spike in deaths during the holiday. In research published in the Journal of the American Heart Association on Dec. 22, Josh Knight, from the University of Melbourne, and colleagues found that the number of deaths also increases in New Zealand, where Christmas and New Year's Day happen during the summer season, ruling out cold weather as the main reason for the death spike during the holidays. So what's causing the spike in heart-related deaths in New Zealand? Theories include the effect of the holiday on diet, consumption of alcohol, and stress, although further study is still needed to look more closely into the potential effects of these factors. Reduced staffing at medical facilities and changes in the physical environment due to people traveling during the holidays to visit relatives are also potential factors that can explain the spike in cardiac-related deaths. "The Christmas holiday period is a common time for travel within New Zealand, with people frequently traveling away from the main centers. 
This could contribute to delays in seeking treatment because of both a lack of familiarity with nearby medical facilities and geographic isolation from appropriate medical care in emergency situations," Knight and colleagues wrote in their study. Knight and colleagues added that the reduced age of those who died from cardiac-related deaths during the Christmas season adds weight to the argument that delay in seeking medical care can be a factor in the spike in deaths. Displacement of death is another possible reason that could be behind the Christmas holiday effect.

News Article | August 31, 2016

Abdulsalam Nasidi's phone rang shortly after midnight: Nigeria's health minister was on the line. Nasidi, who worked at the country's Federal Ministry of Health, learnt that he was needed urgently in the Benue valley to investigate a cluster of dying patients. People were bleeding out of their noses, their mouths, their eyes. Names of spine-chilling viruses such as Ebola, Lassa and Marburg raced through Nasidi's mind. When he arrived in Benue, he found people splayed on the ground and tents serving as makeshift hospital wards and morgues. But Nasidi quickly realized that the cause of the mystery illness was millions of times larger than any virus. The onset of the rainy season had brought the start of spring planting for farmers in the valley, and flooding had disturbed the resident carpet vipers (Echis ocellatus). Many farmers were simply too poor to buy boots — and their exposed feet became targets for the highly venomous snakes. Nasidi wanted to help, but he found himself with limited tools. He had only a small amount of antivenom with which to neutralize the toxin — and it quickly ran out. Once the hospital exhausted its supply, people stopped coming. No one knows how many people were killed. In an average year, hundreds of Nigerians die from snakebite, and that rainy season, which started in 2012, was far from average. Snakebites are a growing public-health crisis. According to the World Health Organization, around 5 million people worldwide are bitten by snakes each year; more than 100,000 of them die and as many as 400,000 endure amputations and permanent disfigurement. Some estimates point to a higher toll: one systematic survey concluded that in India alone, more than 45,000 people died in 2005 from snakebite1 — around one-quarter the number that died from HIV/AIDS (see 'The toll of snakebite'). 
“It's the most neglected of the world's neglected tropical diseases,” says David Williams, a toxinologist and herpetologist at the University of Melbourne, Australia, and chief executive of the non-profit organization Global Snakebite Initiative in Herston. Many of those bites are treatable with existing antivenoms, but there are not enough to go around. This long-standing problem became international news in September 2015, when Médecins Sans Frontières (MSF, also known as Doctors Without Borders) announced that the last remaining vials of the antivenom Fav-Afrique, used to treat bites from several of Africa's deadliest snakes, were about to expire. The French pharma giant Sanofi Pasteur in Lyons had decided to cease production in 2014. MSF estimates that this could cause an extra 10,000 deaths in Africa each year — an “Ebola-scale disaster”, according to Julien Potet, a policy adviser for MSF in Paris. Yet, because most of those affected by snakebites are in the poorest regions of the world, the issue has been largely ignored. In May, however, the crisis was discussed for the first time at the annual World Health Assembly meeting in Geneva, Switzerland. The world's handful of snakebite specialists gathered in a small conference room in the Palais des Nations — although they shared concern over the problem, they were split about how to solve it. Many want to use synthetic biology and other high-tech tools to develop a new generation of broad-spectrum antivenoms. Others argue that existing antivenoms are safe, effective and low cost, and that the focus should be on improving their production, price and use. “From the physician perspective, patient care and public health comes before anything new,” says Leslie Boyer, who directs an institute dedicated to antivenom study at the University of Arizona, Tucson. The debate mirrors those around many other developing-world challenges, from improving agriculture to providing clean drinking water.
Do people need high-tech solutions, or can cheaper, lower-tech remedies do the job? The answer is simple to Jean-Philippe Chippaux, a physician working on snakebite for the French Institute of Research for Development in Cotonou, Benin. “We have the ability to fix this problem now. We just lack the will to do it,” he says. Every December, Williams sees snakebite victims flood into the Port Moresby General Hospital in Papua New Guinea. Nearly all of them were bitten by the taipan (Oxyuranus scutellatus), one of the world's deadliest snakes, which emerges at the start of the rainy season. The venom stops a victim's blood from clotting, paralyses muscles and leads to a slow, agonizing death. It seems a far cry from Australia, where Williams is based. “There's this incredible suffering just 90 minutes away from the modern world,” he says. Yet Williams knows that these people are the lucky ones. The hospital ward, which might be treating as many as eight taipan victims at any time, is often the only place in the country with antivenom drugs. Without them, some 10–15% of all snakebite victims die; with them, just 0.5% do. The situation is reflected around the world. “Many countries don't want to admit that they have such a primeval-sounding problem,” Chippaux says. The method used to make antivenom has changed little since French physician Albert Calmette developed it in the 1890s. Researchers inject minuscule amounts of venom, milked from snakes, into animals such as horses or sheep to stimulate the production of antibodies that bind to the toxins and neutralize them. They gradually increase doses of venom until the animal is pumping out huge amounts of neutralizing antibodies, which are purified from the blood and administered to snakebite victims. Across much of Latin America, government-funded labs typically produce antivenoms and distribute them free of charge. 
But in other areas, especially sub-Saharan Africa, these life-saving medications are too often out of reach. Many governments lack the infrastructure or political will to purchase and distribute antivenom. Bribery and corruption often jack up the price of an otherwise inexpensive drug from a typical wholesale cost of US$18 to $200 per vial to a retail cost between $40 and $24,000 for a complete treatment, according to a 2012 analysis2. Not all hospitals and clinics can afford the antivenom, and some won't risk buying it because their patients either can't pay for it or won't, because they doubt that it really works. With no reliable market for the medicines, some pharmaceutical companies have halted production. Sanofi Pasteur stopped making Fav-Afrique because, at an average retail price of around $120 per vial, it just couldn't sell enough to make production worthwhile. A total of 35 government or commercial manufacturers produce antivenom for distribution around the world, but only 5 now make the drugs for sub-Saharan Africa. In the absence of medicines, snakebite victims have been known to drink petrol, electrocute themselves or apply a poultice of cow dung and water to the bite, says Tim Reed, executive director of Health Action International in Amsterdam. But there are also problems with the drugs themselves, says Robert Harrison, head of the Alistair Reid Venom Research Unit at the Liverpool School of Tropical Medicine, UK. They often have a limited shelf life and require continuous refrigeration, which is a problem in remote areas without electricity. And many are effective against just one species of snake, so clinics need an array of medicines constantly on hand. (A few, such as Fav-Afrique, combine antibodies to create a broad-spectrum product.) Venoms from spiders and scorpions typically have only one or two toxic proteins; snake venoms can have more than ten times that amount. 
They are a “pandemonium of molecules”, says Alejandro Alagón, a toxinologist at the National Autonomous University of Mexico in Mexico City. Researchers do not always know which proteins in this toxic soup are the damaging ones — which is why some think that smarter biology could help. Ten years ago, teams led by Harrison and José María Gutiérrez, a toxinologist at the University of Costa Rica in San José, began parallel efforts to create a universal antivenom for sub-Saharan Africa using 'venomics' and 'antivenomics'. The aim is to identify destructive proteins in venoms using an array of techniques, ranging from genome sequencing to mass spectrometry, and then find the specific parts, known as epitopes, that provoke an immunological response and are neutralized by the antibodies in antivenom drugs. The ultimate goal is to use the epitopes to produce antibodies synthetically, using cells rather than animals, and develop antivenoms that are effective against a wide range of snake species in one part of the world. The scientists have made slow but steady progress. Last year, Gutiérrez and his colleagues separated and identified the most toxic proteins from a family of venomous snakes known as elapids (Elapidae). By combining information about the abundance of each protein and how lethal it is to mice, the team created a toxicity score to indicate how important it was to neutralize a protein with antivenom, a first step towards making the treatment3. In March this year, a Brazilian team reported that they had gone further, designing short pieces of DNA that encode key toxic epitopes in the venom of the coral snake (Micrurus corallinus), a member of the elapid family4. Mice were injected with the DNA using a technique that enabled some to generate antibodies against coral-snake venom, and the group enhanced the mice's immune responses by injecting them with synthetic antibodies manufactured in bacterial cells. 
These and other advances led Harrison to estimate that the first trials of new antivenoms in humans could be just three or four years away. But with so few researchers working on the problem, a paucity of funding and the biological complexity of snake venoms, he and others admit that this is an optimistic prediction. Despite the growing literature on antivenomics, Alagón and Chippaux aren't convinced that the approach will help. Alagón estimates that newly developed antivenoms would need to be priced at tens of thousands of dollars per dose to be financially viable to produce, and that no biotech or pharma company would manufacture one without substantial government subsidies. Compare that, he says, to the rock-bottom price of many existing antivenoms. “You can't get cheaper than that,” he says. “We can make an entire lot of antivenoms in one day using technology that's been available for 80 years.” Finding someone to produce new medications might be a greater challenge than actually developing them, Williams acknowledges: governments or non-governmental organizations (NGOs) will almost certainly have to step in to help to defray the development costs. But he argues that now is the time to research alternative approaches. These could “revolutionize the treatment of snakebite envenoming in the next 10–15 years”, Williams says. All these tensions, brewing for nearly a decade, came to a head at the Geneva meeting in May. Around 75 scientists, public-health experts and health-assembly delegates crowded around three long tables in a third-floor conference room at the United Nations Headquarters. Spring rain pelted the tall windows. Lights were dimmed, and then the screams of a toddler filled the room. A short documentary co-produced by the Global Snakebite Initiative told the story of a girl bitten by a cobra whose parents carried her for days over rocky roads in Africa to find antivenom. They arrived in time — the girl survived — but she lost the use of her arm. 
Her sister had already died after a bite from the same snake. Convincing attendees of the scale of the problem was the meeting's primary goal; how to solve it came next. For 90 minutes, scientists and NGOs made short, impassioned speeches laying out the scope of the issue and the variety of problems that they faced. At the centre of each presentation was the same message: we need more antivenom. But the meeting was strained. Chippaux and representatives of the African Society of Venomology were disappointed and angry that so few Africans had been invited to speak, even though the continent is where antivenom shortages are most acute. “Our voice, our issues, were completely overlooked,” Chippaux says. Seated at the front of the room, group members whispered and gestured frantically to each other, and Chippaux barely managed to keep them from storming out. They argue that the current antivenom shortage stems from Africa's reliance on foreign companies and governments for its drugs, and that the only solution lies in building up infrastructure in Africa to produce its own high-quality antivenom. Alagón views antivenomics as a dangerous diversion. “It's distracting many brilliant minds and resources from improving antivenoms using existing technology,” he says. “Perhaps by 2050 this will be the standard technique, but the problem is now.” Williams and Gutiérrez take a middle ground. They feel that the problem requires attacks on all fronts. As well as innovation, Gutiérrez calls for existing manufacturers to step up the production of current drugs. There are signs of this happening already. Latin America has a long history of producing antivenoms both for its own needs and for those of countries around the world, and even before Sanofi Pasteur announced that it would cease production of Fav-Afrique, Costa Rica, Brazil and Mexico were testing antivenoms for different parts of Africa. 
One product, EchiTAb-Plus-ICB, is produced by Costa Rica and effective against a range of African viper species; it completed clinical trials in 2014 and is now available for use. Several other antivenoms are expected to be ready in the next two years. The drugs should be affordable: government labs in Costa Rica have already indicated that they will not seek to make money from the antivenoms, just recoup their expenditures. But beyond that, the way forward remains murky. Williams knows that the World Health Assembly meeting was just a start. Inevitably, more meetings will be needed to produce a concrete action plan. But the discussion still gave him and some others a renewed sense of hope that the international community is beginning to take snakebite seriously — momentum they hope to build on by banging away at the topic at conferences and in the media. Boyer says that whatever solution the snakebite field decides on, the most important thing is to “break the cycle of antivenom failure in Africa”. Doing that requires building trust from governments, health-care workers and the public that the drugs are safe and effective, that clinics will have antivenom on hand, and that people will be able to afford treatment. “Without that, you've got nothing,” Boyer says. Educating local clinics on how to care for snakebite victims and administer treatments in a timely manner would also go a long way towards preventing deaths. Speaking of the devastation he saw in Benue, Nasidi says that something as simple as providing boots for poor farmers would have helped to prevent much of the suffering and death that he witnessed. It's perhaps the ultimate in low-tech methods of snakebite protection: shielding vulnerable human skin.

Diener J.F.A.,University of Cape Town | Powell R.,University of Melbourne
Journal of Metamorphic Geology | Year: 2010

Ferric iron is present in all metamorphic rocks and has the ability to significantly affect their phase relations. However, the influence of ferric iron has commonly been ignored, or at least not been considered quantitatively, mainly because its abundance in rocks and minerals is not determined by routine analytical techniques. Mineral equilibria calculations that explicitly account for ferric iron can be used to examine its effect on the phase relations in rocks and, in principle, allow the estimation of the oxidation state of rocks. This is illustrated with calculated pseudosections in NCKFMASHTO for mafic and pelitic rock compositions. In addition, it is shown that ferric iron has the capacity to significantly increase the stability of the corundum + quartz assemblage, making it possible for this assemblage to exist at crustal P-T conditions in oxidized rocks of appropriate composition. © 2010 Blackwell Publishing Ltd.

McCarthy D.J.,Walter and Eliza Hall Institute of Medical Research | Chen Y.,Walter and Eliza Hall Institute of Medical Research | Chen Y.,University of Melbourne | Smyth G.K.,Walter and Eliza Hall Institute of Medical Research | Smyth G.K.,University of Melbourne
Nucleic Acids Research | Year: 2012

A flexible statistical framework is developed for the analysis of read counts from RNA-Seq gene expression studies. It provides the ability to analyse complex experiments involving multiple treatment conditions and blocking variables while still taking full account of biological variation. Biological variation between RNA samples is estimated separately from the technical variation associated with sequencing technologies. Novel empirical Bayes methods allow each gene to have its own specific variability, even when there are relatively few biological replicates from which to estimate such variability. The pipeline is implemented in the edgeR package of the Bioconductor project. A case study analysis of carcinoma data demonstrates the ability of generalized linear model methods (GLMs) to detect differential expression in a paired design, and even to detect tumour-specific expression changes. The case study demonstrates the need to allow for gene-specific variability, rather than assuming a common dispersion across genes or a fixed relationship between abundance and variability. Genewise dispersions de-prioritize genes with inconsistent results and allow the main analysis to focus on changes that are consistent between biological replicates. Parallel computational approaches are developed to make non-linear model fitting faster and more reliable, making the application of GLMs to genomic data more convenient and practical. Simulations demonstrate the ability of adjusted profile likelihood estimators to return accurate estimators of biological variability in complex situations. When variation is gene-specific, empirical Bayes estimators provide an advantageous compromise between the extremes of assuming common dispersion or separate genewise dispersion. The methods developed here can also be applied to count data arising from DNA-Seq applications, including ChIP-Seq for epigenetic marks and DNA methylation analyses. © 2011 The Author(s).
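The empirical Bayes idea in the abstract above, borrowing strength across genes when there are few replicates, can be illustrated as a shrinkage of genewise dispersion estimates toward a common value. The sketch below is a simplified moment-based illustration in Python, not edgeR's actual weighted-likelihood procedure; the `prior_weight` parameter is a hypothetical stand-in for the prior degrees of freedom that control the strength of shrinkage.

```python
import numpy as np

def moment_dispersion(counts):
    # Method-of-moments estimate of the negative binomial dispersion phi
    # for one gene, from the NB mean-variance relation var = mu + phi * mu^2.
    mu = counts.mean()
    var = counts.var(ddof=1)
    if mu == 0:
        return 0.0
    return max((var - mu) / mu**2, 0.0)

def shrink_dispersions(count_matrix, prior_weight=10.0):
    # Shrink each genewise dispersion toward the common (mean) dispersion.
    # prior_weight plays the role of prior degrees of freedom: larger values
    # pull the genewise estimates harder toward the common value.
    genewise = np.array([moment_dispersion(gene) for gene in count_matrix])
    common = genewise.mean()
    n_reps = count_matrix.shape[1]
    return (n_reps * genewise + prior_weight * common) / (n_reps + prior_weight)
```

In the limits, a very large `prior_weight` recovers the common-dispersion assumption and `prior_weight = 0` recovers separate genewise dispersions; the intermediate setting is the "advantageous compromise" the abstract describes.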

Kaufmann T.,University of Bern | Strasser A.,Walter and Eliza Hall Institute of Medical Research | Strasser A.,University of Melbourne | Jost P.J.,TU Munich
Cell Death and Differentiation | Year: 2012

Fas (also called CD95 or APO-1), a member of a subgroup of the tumour necrosis factor receptor superfamily that contains an intracellular death domain, can initiate apoptosis signalling and has a critical role in the regulation of the immune system. Fas-induced apoptosis requires recruitment and activation of the initiator caspase, caspase-8 (in humans also caspase-10), within the death-inducing signalling complex. In so-called type 1 cells, proteolytic activation of effector caspases (-3 and -7) by caspase-8 suffices for efficient apoptosis induction. In so-called type 2 cells, however, killing requires amplification of the caspase cascade. This can be achieved through caspase-8-mediated proteolytic activation of the pro-apoptotic Bcl-2 homology domain (BH)3-only protein BH3-interacting domain death agonist (Bid), which then causes mitochondrial outer membrane permeabilisation. This in turn leads to mitochondrial release of apoptogenic proteins, such as cytochrome c and, pertinent for Fas death receptor (DR)-induced apoptosis, Smac/DIABLO (second mitochondria-derived activator of caspase/direct IAP-binding protein with low pI), an antagonist of X-linked inhibitor of apoptosis (XIAP), which imposes a brake on effector caspases. In this review, written in honour of Juerg Tschopp who contributed so much to research on cell death and immunology, we discuss the functions of Bid and XIAP in the control of Fas DR-induced apoptosis signalling, and we speculate on how this knowledge could be exploited to develop novel regimes for treatment of cancer. © 2012 Macmillan Publishers Limited. All rights reserved.

Ward C.W.,Walter and Eliza Hall Institute of Medical Research | Lawrence M.C.,Walter and Eliza Hall Institute of Medical Research | Lawrence M.C.,University of Melbourne
Current Opinion in Structural Biology | Year: 2012

The insulin and epidermal growth factor receptor families are among the most intensively studied proteins in biology. They are closely related members of the receptor tyrosine kinase superfamily and deregulated signaling by members of either receptor family has been implicated in the progression of a variety of cancers. These receptors have thus emerged as validated therapeutic targets for the development of anti-tumour agents. Recent studies have revealed detail of the ligand-binding sites in the insulin receptor family, as well as detail of conformational change upon ligand binding in the epidermal growth factor receptor family. Taken together, these findings and further data relating to kinase activation highlight the fact that while the receptor families share common structural elements, the structural detail of their functioning is remarkably different. © 2012.

Duffy K.R.,National University of Ireland, Maynooth | Hodgkin P.D.,Walter and Eliza Hall Institute of Medical Research | Hodgkin P.D.,University of Melbourne
Trends in Cell Biology | Year: 2012

During an adaptive immune response, lymphocytes proliferate for five to 20 generations, differentiating to take on effector functions, before cessation and cell death become dominant. Recent experimental methodologies enable direct observation of individual lymphocytes and the times at which they adopt fates. Data from these experiments reveal diversity in fate selection, heterogeneity and involved correlation structures in times to fate, as well as considerable familial correlations. Despite the significant complexity, these data are consistent with the simple hypothesis that each cell possesses autonomous processes, subject to temporal competition, leading to each fate. This article addresses the evidence for this hypothesis, its hallmarks, and, should it be an appropriate description of a cell system, its ramifications for manipulation. © 2012 Elsevier Ltd.
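The competition hypothesis in the abstract above lends itself to a minimal simulation: give each candidate fate its own autonomous internal timer and let the earliest timer win, censoring the others. This is an illustrative sketch only; the exponential clocks and the mean times below are assumptions chosen for the example, not distributions or parameters from the article.

```python
import random

FATES = ("divide", "die", "differentiate")

def cell_fate(rng, mean_times=(8.0, 12.0, 10.0)):
    # Draw an autonomous time-to-fate for each candidate fate; the fate
    # whose clock fires first is the one the cell adopts (temporal
    # competition), and the losing clocks are censored.
    times = {fate: rng.expovariate(1.0 / mean)
             for fate, mean in zip(FATES, mean_times)}
    winner = min(times, key=times.get)
    return winner, times[winner]

# Simulate a cohort of independent cells.
rng = random.Random(0)
outcomes = [cell_fate(rng)[0] for _ in range(10_000)]
```

Even with identical parameters in every cell, the censored competition reproduces two hallmarks noted above: diversity in fate selection across a cohort and fate-dependent distributions of times to fate.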

Smits A.J.,Princeton University | McKeon B.J.,California Institute of Technology | Marusic I.,University of Melbourne
Annual Review of Fluid Mechanics | Year: 2011

We review wall-bounded turbulent flows, particularly high-Reynolds-number, zero-pressure-gradient boundary layers, and fully developed pipe and channel flows. It is apparent that the approach to an asymptotically high-Reynolds-number state is slow, but at a sufficiently high Reynolds number the log law remains a fundamental part of the mean flow description. With regard to the coherent motions, very-large-scale motions or superstructures exist at all Reynolds numbers, but they become increasingly important with Reynolds number in terms of their energy content and their interaction with the smaller scales near the wall. There is accumulating evidence that certain features are flow specific, such as the constants in the log law and the behavior of the very large scales and their interaction with the large scales (consisting of vortex packets). Moreover, the refined attached-eddy hypothesis continues to provide an important theoretical framework for the structure of wall-bounded turbulent flows. © 2011 by Annual Reviews. All rights reserved.
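The log law referred to in the abstract above is conventionally written in inner (viscous) units as

```latex
U^{+} = \frac{1}{\kappa}\,\ln y^{+} + B,
\qquad
U^{+} = \frac{U}{u_{\tau}},
\qquad
y^{+} = \frac{y\,u_{\tau}}{\nu},
```

where \(u_{\tau}\) is the friction velocity, \(\nu\) the kinematic viscosity, \(\kappa\) the von Kármán constant and \(B\) the additive constant. Commonly quoted values are roughly \(\kappa \approx 0.38\text{--}0.41\) and \(B \approx 4\text{--}5\), but, as the abstract notes, there is evidence that these "constants" are in fact flow specific.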

Visvader J.E.,Walter and Eliza Hall Institute of Medical Research | Visvader J.E.,University of Melbourne | Stingl J.,University of Cambridge
Genes and Development | Year: 2014

The mammary epithelium is highly responsive to local and systemic signals, which orchestrate morphogenesis of the ductal tree during puberty and pregnancy. Based on transplantation and lineage tracing studies, a hierarchy of stem and progenitor cells has been shown to exist among the mammary epithelium. Lineage tracing has highlighted the existence of bipotent mammary stem cells (MaSCs) in situ as well as long-lived unipotent cells that drive morphogenesis and homeostasis of the ductal tree. Moreover, there is accumulating evidence for a heterogeneous MaSC compartment comprising fetal MaSCs, slow-cycling cells, and both long-term and short-term repopulating cells. In parallel, diverse luminal progenitor subtypes have been identified in mouse and human mammary tissue. Elucidation of the normal cellular hierarchy is an important step toward understanding the "cells of origin" and molecular perturbations that drive breast cancer. © 2014 Visvader and Stingl.

Liao Y.,Walter and Eliza Hall Institute of Medical Research | Smyth G.K.,Walter and Eliza Hall Institute of Medical Research | Smyth G.K.,University of Melbourne | Shi W.,Walter and Eliza Hall Institute of Medical Research
Bioinformatics | Year: 2014

Motivation: Next-generation sequencing technologies generate millions of short sequence reads, which are usually aligned to a reference genome. In many applications, the key information required for downstream analysis is the number of reads mapping to each genomic feature, for example to each exon or each gene. The process of counting reads is called read summarization. Read summarization is required for a great variety of genomic analyses but has so far received relatively little attention in the literature. Results: We present featureCounts, a read summarization program suitable for counting reads generated from either RNA or genomic DNA sequencing experiments. featureCounts implements highly efficient chromosome hashing and feature blocking techniques. It is considerably faster than existing methods (by an order of magnitude for gene-level summarization) and requires far less computer memory. It works with either single or paired-end reads and provides a wide range of options appropriate for different sequencing applications. © The Author 2013. Published by Oxford University Press. All rights reserved.
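The core read-summarization task described above, assigning aligned reads to the genomic features they overlap, can be sketched in a few lines. This toy Python version only illustrates the idea of bucketing features per chromosome (a crude stand-in for chromosome hashing) before testing interval overlaps; the data layout is invented for the example, and featureCounts itself is a far more efficient compiled implementation with configurable overlap rules.

```python
from collections import defaultdict

def summarize(reads, features):
    # reads: list of (chrom, start, end) tuples for aligned reads.
    # features: dict mapping feature_id -> (chrom, start, end).
    # Returns a dict of read counts per feature.

    # Index features by chromosome so each read is compared only
    # against features on its own chromosome.
    by_chrom = defaultdict(list)
    for fid, (chrom, start, end) in features.items():
        by_chrom[chrom].append((fid, start, end))

    counts = defaultdict(int)
    for chrom, r_start, r_end in reads:
        for fid, f_start, f_end in by_chrom.get(chrom, []):
            if r_start <= f_end and f_start <= r_end:  # intervals overlap
                counts[fid] += 1
    return dict(counts)
```

A production tool would additionally sort or block features within each chromosome, handle paired-end fragments and multi-overlapping reads, and stream alignments rather than holding them in memory.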

Shortman K.,Walter and Eliza Hall Institute of Medical Research | Heath W.R.,University of Melbourne
Immunological Reviews | Year: 2010

Mouse lymphoid tissues contain a subset of dendritic cells (DCs) expressing CD8α together with a pattern of other surface molecules that distinguishes them from other DCs. These molecules include particular Toll-like receptor and C-type lectin pattern recognition receptors. A similar DC subset, although lacking CD8 expression, exists in humans. The mouse CD8+ DCs are non-migrating resident DCs derived from a precursor, distinct from monocytes, that continuously seeds the lymphoid organs from bone marrow. They differ in several key functions from their CD8- DC neighbors. They efficiently cross-present exogenous cell-bound and soluble antigens on major histocompatibility complex class I. On activation, they are major producers of interleukin-12 and stimulate inflammatory responses. In steady state, they have immune regulatory properties and help maintain tolerance to self-tissues. During infection with intracellular pathogens, they become major presenters of pathogen antigens, promoting CD8+ T-cell responses to the invading pathogens. Targeting vaccine antigens to the CD8+ DCs has proved an effective way to induce cytotoxic T lymphocytes and antibody responses. © 2010 John Wiley & Sons A/S.

Masters S.L.,Walter and Eliza Hall Institute of Medical Research | Masters S.L.,University of Melbourne
Clinical Immunology | Year: 2013

Blocking the cytokines Interleukin-1beta (IL-1β) and Interleukin-18 (IL-18) benefits a diverse range of inflammatory pathologies. In each of these diseases, different cytoplasmic innate immune receptors nucleate individual protein complexes known as inflammasomes, to regulate the production of active IL-1β or IL-18. This review will outline the complex diseases where these cytokines are pathogenic, and explain which inflammasome(s) may be responsible. For example, inflammasomes nucleated by NLRP3 and NLRP6 integrate signals from metabolic and commensal systems contributing to metabolic dysfunction and type 2 diabetes. On the other hand, NLRP1 and AIM2 are more broadly implicated in autoimmunity and allergy. Furthermore, each inflammasome has unique roles in pathogen recognition, which may determine the outcome of polymicrobial infection and link different infectious co-morbidities to chronic inflammatory disease. We can now imagine a time when targeted inflammasome inhibitors will be employed in the clinic, tailoring treatments to particular diseases, and perhaps individual patients. © 2012 Elsevier Inc.

Kile B.T.,Walter and Eliza Hall Institute of Medical Research | Kile B.T.,University of Melbourne
British Journal of Haematology | Year: 2014

The role of apoptotic pathways in the development and function of the megakaryocyte lineage has generated renewed interest in recent years. This has been driven by the advent of BH3 mimetic drugs that target BCL2 family proteins to induce apoptosis in tumour cells: agents such as ABT-263 (navitoclax, which targets BCL2, BCL-XL [BCL2L1] and BCL2L2) and ABT-199 (a BCL2-specific agent) are showing great promise in early stage clinical trials. However, the major dose-limiting toxicity of navitoclax has proven to be thrombocytopenia, an on-target effect of inhibiting BCL-XL. It transpires that the anucleate platelet contains a classical intrinsic apoptosis pathway, which at steady state regulates its life span in the circulation. BCL-XL is the critical pro-survival protein that restrains apoptosis and maintains platelet viability. These findings have paved the way to a deeper understanding of apoptotic pathways and processes in platelets, and their precursor cell, the megakaryocyte. © 2014 John Wiley & Sons Ltd.

Peel M.C.,University of Melbourne | Bloschl G.,Vienna University of Technology
Progress in Physical Geography | Year: 2011

Changing hydrological conditions due to climate, land use and infrastructure pose significant ongoing challenges to the hydrological research and water management communities. While, traditionally, hydrological models have assumed stationary conditions, there has been much progress since 2005 on model parameter estimation under unknown or changed conditions and on techniques for modelling in those conditions. There is an analogy between extrapolation in space (termed Prediction in Ungauged Basins, PUB), and extrapolation in time (termed Prediction in Ungauged Climates, PUC) that can be exploited for estimating model parameters. Methods for modelling changing hydrological conditions need to progress beyond the current scenario approach, which is reliant upon precalibrated models. Top-down methods and analysis of spatial gradients of a variable of interest, instead of temporal gradients (a method termed 'Trading space for time') show much promise for validating more complex model projections. Understanding hydrological processes and how they respond to change, along with quantification of parameter estimation and modelling process uncertainty will continue to be active areas of research within hydrology. Contributions from these areas will not only help inform future climate change impact studies about what will change and by how much, but also provide insight into why any changes may occur, what changes we are able to predict in a realistic manner, and what changes are beyond the current predictability of hydrological systems. © The Author(s) 2011.
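The "trading space for time" idea in the abstract above can be sketched as follows: calibrate how a model parameter varies with climate across basins today, then use that spatial gradient to predict the parameter for a basin whose climate changes. The basin data and the linear transfer below are invented purely for illustration and stand in for the far more careful regionalization a real PUB/PUC study would use.

```python
import numpy as np

# Hypothetical calibrated parameter values across five basins spanning
# a rainfall gradient (values invented for the example).
annual_rain_mm = np.array([400., 600., 800., 1000., 1200.])
runoff_param = np.array([0.15, 0.22, 0.31, 0.38, 0.47])

# Fit the spatial relationship between climate and parameter.
slope, intercept = np.polyfit(annual_rain_mm, runoff_param, 1)

def predict_param(rain_mm):
    # Transfer the spatial gradient in time: estimate the parameter a
    # basin would take on under a changed (here, drier) climate.
    return slope * rain_mm + intercept

# e.g. a basin currently at 800 mm/yr drying to 650 mm/yr
changed = predict_param(650.0)
```

The design choice is the analogy itself: spatial differences between basins are used as a surrogate for temporal change within one basin, which is defensible only where the spatial gradient really does reflect the processes that will respond to change.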

Aumann T.D.,University of Melbourne | Prut Y.,Hebrew University of Jerusalem
Trends in Neurosciences | Year: 2015

Coherent β-oscillations are a dominant feature of the sensorimotor system yet their function remains enigmatic. We propose that, in addition to cell intrinsic and/or local network interactions, they are supported by activity propagating recurrently around closed neural 'loops' between primary motor cortex (M1), muscles, and back to M1 via somatosensory pathways. Individual loops reciprocally connect individual muscle synergies ('motor primitives') with their representations in M1, and the conduction time around each loop resonates with the periodic spiking of its constituent neurons/muscles. During β-oscillations, this resonance strengthens within-loop connectivity (via long-term potentiation, LTP), whereas non-resonance between different loops weakens connectivity (via long-term depression, LTD) between M1 representations of different muscle synergies. In this way, β-oscillations help maintain accurate and discrete representations of muscle synergies in M1. © 2014 Elsevier Ltd.

Diener J.F.A.,University of Cape Town | Powell R.,University of Melbourne
Journal of Metamorphic Geology | Year: 2012

Recent activity-composition models for clinopyroxene and amphibole are revised to provide better consistency with observed phase relations in natural rocks. For clinopyroxene, the calibration in NCFMAS is retained, but the incorporation of acmite is revised to improve the partitioning of ferric iron between coexisting clinopyroxenes. For amphibole, the NCFMASH calibration is retained, but the addition of ferric iron is changed to provide consistency with the clinopyroxenes. The thermodynamics of orthoamphibole (gedrite) is also adjusted to resolve an unrelated inconsistency. The effects of these improvements are illustrated through comparison of calculated pseudosections produced with the existing and new models with natural data from lawsonite eclogites. © 2011 Blackwell Publishing Ltd.

Tran N.,University of Rhode Island | Tran P.A.,University of Melbourne
ChemPhysChem | Year: 2012

Bacterial infections remain one of the biggest concerns for our society. Conventional antibiotic treatments have shown little effect on the increasing number of antibiotic-resistant bacteria. Advances in synthetic chemistry and nanotechnology have resulted in a new class of nanometer-scale materials with distinctive properties and great potential to be an alternative to antibiotics. In this Minireview, we address the current situation of medical-device-associated infections and the emerging opportunities for antibacterial nanomaterials in preventing these complications. Several important antimicrobial nanomaterials emerging from advances in synthetic chemistry are introduced and their bactericidal mechanisms are analyzed. In addition, concerns regarding the biocompatibility of such materials are also addressed. Nanomaterials to the rescue: Medical-device-associated bacterial infections remain challenges to modern medicine. Several nanomaterials have proven to be effective antibacterial agents (see graphic). The increasing number of antibiotic-resistant bacteria makes such materials valuable tools for fighting infection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

Denes F.,University of Nantes | Schiesser C.H.,University of Melbourne | Renaud P.,University of Bern
Chemical Society Reviews | Year: 2013

Due to their stability, availability and reactivity, sulfides are particularly attractive sources of carbon-centered radicals. However, their reactivity in homolytic substitution processes is strongly reduced when compared with the corresponding selenides or halides. Despite this, sulfur-containing compounds can be engineered so that they become effective agents in radical chain reactions. A detailed description of the reactivity of organo-sulfur compounds is reported here with the aim of providing clear guidance on the scope and limitation of their use as radical precursors in chain reactions. This journal is © The Royal Society of Chemistry.