Leiden University, located in the city of Leiden, is the oldest university in the Netherlands. It was founded in 1575 by William, Prince of Orange, leader of the Dutch Revolt in the Eighty Years' War. The royal Dutch House of Orange-Nassau and Leiden University still have a close relationship: Queens Juliana and Beatrix and Crown Prince Willem-Alexander studied at Leiden University, and in 2005 Queen Beatrix received a rare honorary degree from the university. (Source: Wikipedia)
Leiden University | Date: 2016-04-14
The present invention is based on the finding that microRNAs from the microRNA gene cluster located at human chromosomal locus 14q32 play an important role in vascular development and remodelling. Modulators of any of the 14q32 microRNAs may be exploited as a means to modulate vascular remodelling processes and/or in the treatment and/or prevention of vascular disorders or disease.
Birdja Y.Y.,Leiden University |
Koper M.T.M.,Leiden University
Journal of the American Chemical Society | Year: 2017
A seemingly catalytically inactive electrode, boron-doped diamond (BDD), is found to be active for CO2 and CO reduction to formaldehyde and even methane. At very cathodic potentials, formic acid and methanol are formed as well. However, these products are the result of base-catalyzed Cannizzaro-type disproportionation reactions. A local alkaline environment near the electrode surface, caused by the hydrogen evolution reaction, initiates aldehyde disproportionation promoted by hydroxide ions, which leads to the formation of the corresponding carboxylic acid and alcohol. This phenomenon is strongly influenced by the electrolyte pH and buffer capacity and is not limited to BDD or formaldehyde; it can be generalized to different electrode materials and to C2 and C3 aldehydes as well. These reactions matter because the formation of acids and alcohols is often ascribed to direct CO2 reduction. The results obtained here may explain the concomitant formation of acids and alcohols often observed during CO2 reduction. © 2017 American Chemical Society.
Bresters D.,Leiden University
Bone Marrow Transplantation | Year: 2017
Permanent alopecia after haematopoietic stem cell transplantation (HSCT) is distressing, and few studies have investigated this late effect. The aim of the study was to assess the percentage of patients with alopecia and to investigate risk factors for alopecia. Patients who underwent allogeneic HSCT before age 19 years, from January 1990 to January 2013, who were at least 2 years after transplant and in follow-up in our clinic were included. Alopecia was defined as clinically apparent decreased hair density. Possible risk factors considered for alopecia after HSCT included: gender, age, diagnosis, donor type, conditioning regimen (cranial irradiation (TBI/cranial radiotherapy) and/or chemotherapy, including which chemotherapeutic agents were used) and acute/chronic GvHD. The percentage of permanent alopecia in our cohort was 15.6% (41/263 patients). All patients had diffuse alopecia except for one with alopecia totalis. In multivariate analysis, a conditioning regimen with busulphan or busulphan plus fludarabine (odds ratio (OR) 5.7 (confidence interval (CI): 2.5–12.7) and OR 7.4 (CI: 3.3–16.2), respectively) was the main risk factor and was associated with alopecia independent of acute/chronic GvHD. Neither TBI nor other alkylating chemotherapy, including treosulfan, was associated with alopecia. In conclusion, permanent alopecia after HSCT is associated with busulphan and GvHD and occurs in 16% of patients. Bone Marrow Transplantation advance online publication, 20 March 2017; doi:10.1038/bmt.2017.15. © 2017 Macmillan Publishers Limited, part of Springer Nature.
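The busulphan risk estimates above are reported as odds ratios with confidence intervals. As a generic illustration of how a Wald odds ratio and 95% CI are computed from a 2×2 table (the counts below are made up and are not the study's data, and the study's ORs come from a multivariate model that this single-table sketch does not reproduce):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald odds ratio and confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, purely for illustration
print(odds_ratio_ci(20, 30, 10, 40))
```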
van Gaalen R.D.,Leiden University
Epidemiology | Year: 2017
Rotavirus is a common viral infection among young children. As in many countries, the infection dynamics of rotavirus in the Netherlands are characterized by an annual winter peak, which was notably low in 2014. Previous work suggested an association between weather factors and both rotavirus transmission and incidence. From epidemic theory, we know that the proportion of susceptible individuals can affect disease transmission. We investigated how these factors are associated with rotavirus transmission in the Netherlands, and their impact on rotavirus transmission in 2014. We used available data on birth rates and rotavirus laboratory reports to estimate rotavirus transmission and the proportion of individuals susceptible to primary infection. Weather data were directly available from a central meteorological station. We developed an approach for detecting determinants of seasonal rotavirus transmission by assessing non-linear, delayed associations between each factor and rotavirus transmission. We explored relationships by applying a distributed lag non-linear regression model with seasonal terms. We corrected for residual serial correlation using auto-regressive moving average errors. We inferred the relationship between different factors and the effective reproduction number from the most parsimonious model with low residual auto-correlation. Higher proportions of susceptible individuals and lower temperatures were associated with increases in rotavirus transmission. For 2014, our findings suggest that relatively mild temperatures combined with the low proportion of susceptible individuals contributed to lower rotavirus transmission in the Netherlands. However, our model, which overestimated the magnitude of the peak, suggested that other factors were likely instrumental in reducing the incidence that year. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
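A drastically simplified version of the delayed-association idea can be sketched with ordinary least squares on synthetic data. This is a plain linear distributed-lag fit, not the distributed lag non-linear model with ARMA errors used in the study; every number below is simulated:

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(300)
# Synthetic weekly temperature with a seasonal cycle plus noise
temp = 10 - 8 * np.cos(2 * np.pi * weeks / 52) + rng.normal(0, 1, weeks.size)

# Transmission responds negatively to temperature at lags 0..3 (made-up effects)
true_lag = np.array([-0.05, -0.04, -0.02, -0.01])
max_lag = 3
X_lag = np.column_stack([np.roll(temp, l) for l in range(max_lag + 1)])
y = 1.0 + X_lag @ true_lag + rng.normal(0, 0.05, weeks.size)

# Drop the first max_lag weeks, where np.roll wraps around
X = np.column_stack([np.ones(weeks.size), X_lag])[max_lag:]
y = y[max_lag:]

# Ordinary least squares: beta[0] is the intercept, beta[1:] the lag effects
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The recovered `beta[1:]` should sit close to `true_lag`; the real model additionally uses non-linear basis functions for each lag and ARMA-corrected residuals.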
van Balen P.,Leiden University
Transplantation | Year: 2017
BACKGROUND: Donors for allogeneic stem cell transplantation (alloSCT) are preferentially matched with patients for HLA-A, B, C and DRB1. Mismatches between donor and patient in these alleles are associated with an increased risk of graft-versus-host disease (GVHD). In contrast, HLA-DRB3, 4 and 5, HLA-DQ and HLA-DP are usually assumed to be low-expression loci with limited relevance, although mismatches in HLA-DQ and HLA-DP can result in allo-immune responses. Mismatches in HLA-DRB3, 4 and 5 are usually not taken into account in donor selection. METHODS: Conversion of chimerism in the presence of GVHD after CD4 donor lymphocyte infusion (DLI) was observed in a patient who was HLA 10/10 matched but mismatched for HLA-DRB3 and HLA-DPB1 compared to the donor. Alloreactive CD4 T cells were isolated from peripheral blood after CD4 DLI, and recognition of donor-derived target cells transduced with the mismatched patient-variant HLA-DRB3 and HLA-DPB1 molecules was tested. RESULTS: A dominant polyclonal CD4 T cell response against the patient’s mismatched HLA-DRB3 molecule was found in addition to an immune response against the patient’s mismatched HLA-DPB1 molecule. CD4 T cells specific for these HLA class II molecules recognized both hematopoietic target cells and GVHD target cells. CONCLUSION: In contrast to the assumption that mismatches in HLA-DRB3, 4 and 5 are not of immunogenic significance after HLA 10/10 matched alloSCT, we show that in this matched setting not only mismatches in HLA-DPB1, but also mismatches in HLA-DRB3, may induce a polyclonal allo-immune response associated with conversion of chimerism and severe GVHD. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
Ouweneel A.B.,Leiden University
Arteriosclerosis, Thrombosis, and Vascular Biology | Year: 2017
OBJECTIVE: Murine atherosclerosis models do not spontaneously develop atherothrombotic complications. We investigated whether disruption of natural anticoagulation allows preexisting atherosclerotic plaques to progress toward an atherothrombotic phenotype. APPROACH AND RESULTS: On lowering of plasma protein C levels with small interfering RNA (siProc) in 8-week Western-type diet–fed atherosclerotic apolipoprotein E–deficient mice, 1 out of 4 mice displayed a large, organized, fibrin- and leukocyte-rich thrombus on top of an advanced atherosclerotic plaque located in the aortic root. Although again at low incidence (3 in 25), comparable thrombi at the same location were observed during a second independent experiment in 9-week Western-type diet–fed apolipoprotein E–deficient mice. Mice with thrombi on their atherosclerotic plaques did not show other abnormalities and had plasma protein C levels lowered to the same extent as siProc-treated apolipoprotein E–deficient mice without thrombi. Fibrinogen and thrombin–antithrombin concentrations and blood platelet numbers were also comparable, and plaques in siProc mice with thrombi had a similar composition and size to plaques in siProc mice without thrombi. Seven out of 25 siProc mice featured clots in the left atrium of the heart. CONCLUSIONS: Our findings indicate that small interfering RNA–mediated silencing of protein C in apolipoprotein E–deficient mice creates a condition that allows the occurrence of spontaneous atherothrombosis, albeit at a low incidence. Lowering natural anticoagulation in atherosclerosis models may help to discover factors that increase atherothrombotic complications. © 2017 American Heart Association, Inc.
Huang Y.-F.,Leiden University |
Koper M.T.M.,Leiden University
Journal of Physical Chemistry Letters | Year: 2017
To understand the interaction between Pt and surface oxygenated species in electrocatalysis, this paper correlates the electrochemistry of atomic oxygen on Pt formed in the gas phase with electrochemically generated oxygen species on a variety of single-crystal platinum surfaces. The atomic oxygen adsorbed on single-crystalline Pt electrodes, made by thermal dissociation of molecular oxygen, is used for voltammetry measurements in acidic electrolytes (HClO4 and H2SO4). The essential knowledge of coverage, binding energy, and surface construction of atomic oxygen is correlated with the charge, potential, and shape of the voltammograms, respectively. The differences between the voltammograms of oxide made by thermal dissociation of molecular oxygen and oxide made by electrochemical oxidation imply that atomic oxygen is not an intermediate of the electrochemical oxidation of Pt(111). The reconstruction of (100) terraces and steps and the low-potential stripping of atomic oxygen at (111) step sites provide insight into the first stages of degradation of Pt-based electrocatalysts. © 2017 American Chemical Society.
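The stated link between oxygen coverage and voltammetric charge is simple coulometry. A hedged sketch, assuming two electrons transferred per O adatom and the textbook Pt(111) surface atom density; both values are assumptions for illustration, not numbers taken from this paper:

```python
E_CHARGE = 1.602e-19   # elementary charge, C
N_SITES = 1.503e15     # assumed Pt(111) surface atom density, atoms/cm^2

def oxygen_coverage(q_uc_per_cm2, electrons_per_o=2):
    """Coverage (in monolayers) of atomic oxygen implied by a
    stripping charge density given in microcoulombs per cm^2."""
    q = q_uc_per_cm2 * 1e-6  # convert to C/cm^2
    return q / (electrons_per_o * E_CHARGE * N_SITES)

# Under these assumptions, a full monolayer at 2 e-/O is roughly 482 uC/cm^2
print(oxygen_coverage(482))
```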
van Raan A.F.J.,Leiden University
Scientometrics | Year: 2017
A ‘Sleeping Beauty in Science’ is a publication that goes unnoticed (‘sleeps’) for a long time and then, almost suddenly, attracts a lot of attention (‘is awakened by a prince’). In our foregoing study we found that roughly half of the Sleeping Beauties are application-oriented and thus are potential Sleeping Innovations. In this paper we investigate a new topic: Sleeping Beauties that are cited in patents. In this way we explore the existence of a dormitory of inventions. To our knowledge this is the first study of this kind. We investigate the time lag between publication of the Sleeping Beauty and the first citation by a patent. We find that patent citation may occur before or after the awakening and that the depth of the sleep, i.e., the citation rate during the sleeping period, is no predictor for later scientific or technological impact of the Sleeping Beauty. A surprising finding is that Sleeping Beauties are significantly more cited in patents than ‘normal’ papers. Inventor–author self-citation relations occur only in a small minority of the Sleeping Beauties that are cited in patents, but other types of inventor–author links occur more frequently. We develop an approach in different steps to explore the cognitive environment of Sleeping Beauties cited in patents. First, we analyze whether they deal with new topics by measuring the time-dependent evolution in the entire scientific literature of the number of papers related to both the precisely defined topics as well as the broader research theme of the Sleeping Beauty during and after the sleeping time. Second, we focus on the awakening by analyzing the first group of papers that cites the Sleeping Beauty. Third, we create concept maps of the topic-related and the citing papers for a time period immediately following the awakening and for the most recent period. Finally, we make an extensive assessment of the cited and citing relations of the Sleeping Beauty.
We find that tunable co-citation analysis is a powerful tool to discover the prince(s) and other important application-oriented work directly related to the Sleeping Beauty, for instance papers written by authors who cite Sleeping Beauties in both the patents of which they are the inventors, as well as in their scientific papers. © 2017 The Author(s)
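The notions of sleeping period, awakening, and depth of sleep can be made operational on a per-paper citation time series. The thresholds below are illustrative placeholders, not the operational definitions used in this or the foregoing study:

```python
def awakening_year(citations, sleep_max=2, awake_min=10):
    """Given {year: citation count}, return the first year whose count
    reaches awake_min, provided all earlier years stayed at or below
    sleep_max (the 'sleep'); return None if the paper never awakens."""
    years = sorted(citations)
    for i, year in enumerate(years):
        if citations[year] >= awake_min:
            if all(citations[p] <= sleep_max for p in years[:i]):
                return year
            return None  # too visible before the jump: not a Sleeping Beauty
    return None

def sleep_depth(citations, wake_year):
    """Mean citations per year during the sleeping period."""
    sleep = [c for y, c in citations.items() if y < wake_year]
    return sum(sleep) / len(sleep) if sleep else 0.0
```

With these two helpers, the time lag studied above is simply the difference between the first patent-citation year and `awakening_year(...)`, whichever sign it takes.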
Sofos E.,Leiden University
Proceedings of the London Mathematical Society | Year: 2016
Let φ: X → P^1_Q be a non-singular conic bundle over Q having n non-split fibres, and denote by N(φ, B) the cardinality of the fibres of Weil height at most B that possess a rational point. Serre showed in 1990 that a direct application of the large sieve yields N(φ, B) ≪ B^2 (log B)^{-n/2} and raised the problem of proving that this is the true order of magnitude of N(φ, B), under the necessary assumption that there exists at least one smooth fibre with a rational point. We solve this problem for all non-singular conic bundles of rank at most 3. Our method comprises the use of Hooley neutralisers, estimating divisor sums over values of binary forms, and an application of the Rosser–Iwaniec sieve. © 2016 London Mathematical Society.
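Restated in display form (schematic; H denotes the Weil height, and the asymptotic order of magnitude is the one conjectured by Serre and established here for rank at most 3):

```latex
N(\varphi, B) \;=\; \#\bigl\{\, x \in \mathbf{P}^{1}(\mathbf{Q}) \;:\; H(x) \le B,\ \varphi^{-1}(x)(\mathbf{Q}) \neq \emptyset \,\bigr\}
\;\asymp\; \frac{B^{2}}{(\log B)^{n/2}} .
```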
Neefjes J.,Leiden University |
Jongsma M.M.L.,University of Amsterdam |
Berlin I.,Leiden University
Trends in Cell Biology | Year: 2017
The endosomal system constitutes a key negotiator between the environment of a cell and its internal affairs. Comprising a complex membranous network, wherein each vesicle can in principle move autonomously throughout the cell, the endosomal system operates as a coherent unit to optimally face external challenges and maintain homeostasis. Our appreciation of how individual endosomes are controlled in time and space to best serve their collective purpose has evolved dramatically in recent years. In light of these efforts, the endoplasmic reticulum (ER), with its expanse of membranes permeating the cytoplasmic space, has emerged as a potent spatiotemporal organizer of endosome biology. We review the latest advances in our understanding of the mechanisms underpinning endosomal transport and positioning, with emphasis on the contributions from the ER, and offer a perspective on how the interplay between these aspects shapes the architecture and dynamics of the endosomal system and drives its myriad cellular functions. Endosomal transport and positioning cooperate in the establishment of compartment architecture, dynamics and function. Functional attributes of peripheral endosomes differ from those found in the perinuclear region of the cell. Interactions between the ER and endosomes influence endosome distribution, motility, and fission. © 2017 Elsevier Ltd.
Coulais C.,AMOLF |
Coulais C.,Leiden University |
Sounas D.,University of Texas at Austin |
Alu A.,University of Texas at Austin
Nature | Year: 2017
Reciprocity is a general, fundamental principle governing various physical systems, which ensures that the transfer function - the transmission of a physical quantity, say light intensity - between any two points in space is identical, regardless of geometrical or material asymmetries. Breaking this transmission symmetry offers enhanced control over signal transport, isolation and source protection. So far, devices that break reciprocity (and therefore show non-reciprocity) have been mostly considered in dynamic systems involving electromagnetic, acoustic and mechanical wave propagation associated with fields varying in space and time. Here we show that it is possible to break reciprocity in static systems, realizing mechanical metamaterials that exhibit vastly different output displacements under excitation from different sides, as well as one-way displacement amplification. This is achieved by combining large nonlinearities with suitable geometrical asymmetries and/or topological features. In addition to extending non-reciprocity and isolation to statics, our work sheds light on energy propagation in nonlinear materials with asymmetric crystalline structures and topological properties. We anticipate that breaking reciprocity will open avenues for energy absorption, conversion and harvesting, soft robotics, prosthetics and optomechanics.
Kopnina H.,Leiden University
Journal of Cleaner Production | Year: 2017
Sustainable production is often limited by structural factors such as industrial development, neoliberal democracy, growing population and the globalization of consumer culture. Drawing on the work of theorists linking unsustainability to universal psychological propensities, this article discusses sustainable production in relation to human nature. Human nature is understood here as complex, cross-culturally and historically consistent psychological traits or universal physiological predispositions that result in a largely shared repertoire of human behavior. It is posited here that these traits, when combined with specific conditions of industrial development, result in unsustainable behaviors. This article explores the relationships between human population and sustainability, between human nature and culture, between human nature and environment, and between human nature and sustainability. Recommendations focus on how sustainability efforts can take advantage of some of our natural tendencies, and mitigate others, in order to provide strategic solutions to unsustainable practices. © 2017 Elsevier Ltd
Saxon D.,Leiden University
Journal of Conflict and Security Law | Year: 2016
This article argues that it is possible, given the right resources and expertise, to hold individual non-state actors responsible for violations of international humanitarian law (also known as 'the laws and customs of war') perpetrated with cyberweapons. It describes jurisdictional elements of violations of the laws and customs of war, as well as points that prosecutors and investigators must consider when planning investigations of serious violations of international humanitarian law perpetrated in cyberspace. It addresses how certain theories of individual criminal responsibility for war crimes apply to offences committed by non-state actors during cyberwarfare, and identifies particular evidentiary challenges arising from the particular qualities of cyberspace and cyberweapons. Individual accountability for war crimes perpetrated during cyber operations requires new thinking about the application of legal principles and theories during cyber conflict. © Oxford University Press 2016.
McCarthy I.G.,Liverpool John Moores University |
Schaye J.,Leiden University |
Bird S.,Johns Hopkins University |
Le Brun A.M.C.,University Paris Diderot
Monthly Notices of the Royal Astronomical Society | Year: 2017
The evolution of the large-scale distribution of matter is sensitive to a variety of fundamental parameters that characterize the dark matter, dark energy, and other aspects of our cosmological framework. Since the majority of the mass density is in the form of dark matter that cannot be directly observed, to do cosmology with large-scale structure, one must use observable (baryonic) quantities that trace the underlying matter distribution in a (hopefully) predictable way. However, recent numerical studies have demonstrated that the mapping between observable and total mass, as well as the total mass itself, are sensitive to unresolved feedback processes associated with galaxy formation, motivating explicit calibration of the feedback efficiencies. Here, we construct a new suite of large-volume cosmological hydrodynamical simulations (called BAHAMAS, for BAryons and HAloes of MAssive Systems), where subgrid models of stellar and active galactic nucleus feedback have been calibrated to reproduce the present-day galaxy stellar mass function and the hot gas mass fractions of groups and clusters in order to ensure the effects of feedback on the overall matter distribution are broadly correct. We show that the calibrated simulations reproduce an unprecedentedly wide range of properties of massive systems, including the various observed mappings between galaxies, hot gas, total mass, and black holes, and represent a significant advance in our ability to mitigate the primary systematic uncertainty in most present large-scale structure tests. © 2016 The Authors.
Fiacconi D.,University of Zürich |
Rossi E.M.,Leiden University
Monthly Notices of the Royal Astronomical Society | Year: 2017
Supermassive black holes are a key ingredient of galaxy evolution. However, their origin is still highly debated. In one of the leading formation scenarios, a black hole of ~100 M⊙ results from the collapse of the inner core of a supermassive star (≳10^4–10^5 M⊙), created by the rapid accumulation (≳0.1 M⊙ yr^-1) of pristine gas at the centre of newly formed galaxies at z ~ 15. The subsequent evolution is still speculative: the remaining gas in the supermassive star can either directly plunge into the nascent black hole, or part of it can form a central accretion disc whose luminosity sustains a surrounding, massive, and nearly hydrostatic envelope (a system called a 'quasi-star'). To address this point, we consider the effect of rotation on a quasi-star, as angular momentum is inevitably transported towards the galactic nucleus by the accumulating gas. Using a model for the internal redistribution of angular momentum that qualitatively matches results from simulations of rotating convective stellar envelopes, we show that quasi-stars with an envelope mass greater than a few 10^5 M⊙ × (black hole mass/100 M⊙)^0.82 have highly sub-Keplerian gas motion in their core, preventing gas circularization outside the black hole's horizon. Less massive quasi-stars could form but last for only ≲10^4 yr before the accretion luminosity unbinds the envelope, suppressing the black hole growth. We speculate that this might eventually lead to a dual black hole seed population: (i) massive (>10^4 M⊙) seeds formed in the most massive (>10^8 M⊙) and rare haloes; (ii) lighter (~10^2 M⊙) seeds to be found in less massive and therefore more common haloes. © 2016 The Authors.
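The envelope-mass threshold quoted above is a single power law and easy to evaluate numerically. A sketch in which the abstract's "a few" prefactor is fixed at 3 purely for illustration:

```python
def critical_envelope_mass(m_bh, prefactor=3.0):
    """Minimum envelope mass (solar masses) above which a quasi-star's
    core gas is strongly sub-Keplerian, following the abstract's scaling
    M_env > (a few) x 10^5 Msun x (M_BH / 100 Msun)^0.82.
    The prefactor stands in for 'a few' and is an assumption."""
    return prefactor * 1e5 * (m_bh / 100.0) ** 0.82

# Threshold for a 100 Msun black hole, the fiducial seed mass
print(critical_envelope_mass(100))
```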
Millar T.J.,Queen's University of Belfast |
Walsh C.,Leiden University |
Walsh C.,University of Leeds |
Field T.A.,Queen's University of Belfast
Chemical Reviews | Year: 2017
Until a decade ago, the only anion observed to play a prominent role in astrophysics was H⁻. The bound-free transitions in H⁻ dominate the visible opacity in stars with photospheric temperatures less than 7000 K, including the Sun. The H⁻ anion is also believed to have been critical to the formation of molecular hydrogen in the very early evolution of the Universe. Once H2 formed, about 500,000 years after the Big Bang, the expanding gas was able to lose internal gravitational energy and collapse to form stellar objects and "protogalaxies", allowing the creation of heavier elements such as C, N, and O through nucleosynthesis. Although astronomers had considered some processes through which anions might form in interstellar clouds and circumstellar envelopes, including the important role that polycyclic aromatic hydrocarbons might play in this, it was the detection in 2006 of rotational line emission from C6H⁻ that galvanized a systematic study of the abundance, distribution, and chemistry of anions in the interstellar medium. In 2007, the Cassini mission reported the unexpected detection of anions with mass-to-charge ratios of up to ~10,000 in the upper atmosphere of Titan; this observation likewise instigated the study of fundamental chemical processes involving negative ions among planetary scientists. In this article, we review the observations of anions in interstellar clouds, circumstellar envelopes, Titan, and cometary comae. We then discuss a number of processes by which anions can be created and destroyed in these environments. The derivation of accurate rate coefficients for these processes is an essential input for the chemical kinetic modeling that is necessary to fully extract physics from the observational data. We discuss such models, along with their successes and failings, and finish with an outlook on the future. © 2017 American Chemical Society.
Odijk T.,Leiden University
Physica A: Statistical Mechanics and its Applications | Year: 2017
An analogy is pointed out between a polymer chain fluctuating in a two-dimensional nematic background and a freely floating material line buffeted by a two-dimensional turbulent fluid in the inertial (Kraichnan) regime. Under certain conditions, the back-reaction of the line on the turbulent flow may be neglected. The fractal exponent related to the size–contour relation of the material line is connected to a “nematic” correlation function in the bulk. © 2016 Elsevier B.V.
Bakker E.M.,Leiden University
International Journal of Multimedia Information Retrieval | Year: 2016
One of the pillars of the scientific community is open and free datasets. Not only do they allow benchmarking, evaluation, and reproducibility, but they also make an important contribution in themselves by allowing researchers to gain deeper insight into the strengths and weaknesses of their algorithms and paradigms. Furthermore, in the case of the larger datasets, they facilitate major advances (e.g., concept learning using big data). Here, we present some of the recent free and open datasets for the scientific community. © 2016, The Author(s).
Jovanovski O.,Leiden University
Journal of Statistical Physics | Year: 2017
We consider Glauber dynamics for the low-temperature, ferromagnetic Ising model on the n-dimensional hypercube. We derive precise asymptotic results for the crossover time (the time it takes for the dynamics to go from the configuration with a "−1" at every vertex to the configuration with a "+1" at every vertex) in the limit as the inverse temperature β → ∞. © 2017 The Author(s)
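As a toy illustration of the dynamics being analyzed (not the paper's asymptotic regime), the heat-bath form of Glauber dynamics on the n-dimensional hypercube can be simulated directly. A positive external field h_ext is added here so that a short run actually crosses over; the field strength, temperature, and cube size are all illustrative choices:

```python
import math
import random

def glauber_step(spins, n, beta, h_ext, rng):
    """One heat-bath update: pick a uniform vertex of the n-cube and
    resample its spin from the conditional Gibbs distribution."""
    v = rng.randrange(len(spins))
    # Neighbours of vertex v are the indices differing in exactly one bit
    field = sum(spins[v ^ (1 << k)] for k in range(n)) + h_ext
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
    spins[v] = 1 if rng.random() < p_plus else -1

def crossover_time(n=3, beta=1.5, h_ext=1.0, max_steps=1_000_000, seed=1):
    """Number of steps for the all-minus configuration to first reach
    all-plus; returns None if it does not happen within max_steps."""
    rng = random.Random(seed)
    spins = [-1] * (1 << n)
    for t in range(1, max_steps + 1):
        glauber_step(spins, n, beta, h_ext, rng)
        if all(s == 1 for s in spins):
            return t
    return None

print(crossover_time())
```

At larger β the all-minus state becomes long-lived and the crossover time grows sharply, which is the regime the paper analyzes asymptotically.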
Nienhuis G.,Leiden University
Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences | Year: 2017
The insight that a beam of light can carry orbital angular momentum (AM) in its propagation direction came as a surprise when it emerged in 1992. Nevertheless, the existence of momentum and AM of an electromagnetic field has been well known since the days of Maxwell. We compare the expressions for densities of AM in general three-dimensional modes and in paraxial modes. Despite their classical nature, these expressions have a suggestive quantum mechanical appearance, in terms of linear operators acting on mode functions. In addition, paraxial wave optics has several analogies with real quantum mechanics, both with the wave function of a free quantum particle and with a quantum harmonic oscillator. We discuss how these analogies can be applied. © 2017 The Author(s). Published by the Royal Society. All rights reserved.
Trouw L.A.,Leiden University
Nature Reviews Rheumatology | Year: 2017
The presence of autoantibodies is one of the hallmarks of rheumatoid arthritis (RA). In the past few decades, rheumatoid factors (autoantibodies that recognize the Fc-tail of immunoglobulins) as well as anti-citrullinated protein antibodies (ACPAs) have been studied intensively. ACPAs recognize post-translationally modified proteins in which the amino acid arginine has been converted into a citrulline. More recently, other autoantibody systems recognizing post-translationally modified proteins have also gained attention, including autoantibodies recognizing fragmented immunoglobulin (anti-hinge antibodies), autoantibodies recognizing acetylated proteins and autoantibodies recognizing proteins that are modified by adducts formed under oxidative stress. In particular, detailed insights have been obtained on the presence and properties of autoantibodies recognizing carbamylated proteins, commonly called anti-carbamylated protein (anti-CarP) antibodies. In this Review, we summarize the current knowledge relating to these emerging autoantibodies that recognize post-translationally modified proteins identified in RA, with an emphasis on anti-CarP antibodies. © 2017 Nature Publishing Group, a division of Macmillan Publishers Limited. All Rights Reserved.
Stefoudi D.,Leiden University
Proceedings of the International Astronautical Congress, IAC | Year: 2016
The term "big data" refers to large amounts of data, generated at great velocity and in great variety and processed to match the needs of different types of users. In the space field, big data translates into large sources of information acquired using Earth and space observation technologies. Data collected from remote sensing activities are used for several purposes, varying from military and civil services to commercial uses. The EU Copernicus Earth monitoring system, as well as other similar private projects, aims at connecting the world through accurate, near real-time data. Scientists, policy makers, governmental entities, industry and the general public are increasingly gaining access to the multiple applications of space data. New technologies have accordingly created commercial potential for users who view big data as a competitive advantage and a value-generating asset. This growing demand was not foreseen thirty years ago, when the UN Remote Sensing Principles of 1986 were drafted. The legal regime related to Earth observation should be reconsidered in the light of the needs of this emerging domain. The purpose of this paper is to discuss the legal challenges related to access and dissemination of big data from space, taking into account international space law as well as data protection and privacy regulations.
Vellinga D.,Leiden University
Ear and Hearing | Year: 2017
OBJECTIVE: Current spread is a substantial limitation of speech coding strategies in cochlear implants. Multipoles have the potential to reduce current spread and thus generate more discriminable pitch percepts. The difficulty with multipoles is reaching sufficient loudness. The primary goal was to compare the loudness characteristics and spread of excitation (SOE) of three types of phased array stimulation, a novel multipole, with three more conventional configurations. DESIGN: Fifteen postlingually deafened cochlear implant users performed psychophysical experiments addressing SOE, loudness scaling, loudness threshold, loudness balancing, and loudness discrimination. Partial tripolar stimulation (pTP, σ = 0.75), TP, phased array with 16 electrodes (PA16), and restricted phased array with five (PA5) and three (PA3) electrodes were compared with a reference monopolar stimulus. RESULTS: Despite a similar loudness growth function, there were considerable differences in current expenditure. The most energy-efficient multipole was the pTP, followed by PA16 and PA5/PA3. TP clearly stood out as the least efficient one. Although the electric dynamic range was larger with multipolar configurations, the number of discriminable steps in loudness was not significantly increased. The SOE experiment could not demonstrate any difference between the stimulation strategies. CONCLUSIONS: The loudness characteristics of all five multipolar configurations tested are similar. Because of their higher energy efficiency, pTP and PA16 are the most favorable candidates for future testing in clinical speech coding strategies. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
van Lummel M.,Leiden University
Nature Medicine | Year: 2017
Identification of epitopes that are recognized by diabetogenic T cells and cause selective beta cell destruction in type 1 diabetes (T1D) has focused on peptides originating from native beta cell proteins. Translational errors represent a major potential source of antigenic peptides to which central immune tolerance is lacking. Here, we describe an alternative open reading frame within human insulin mRNA encoding a highly immunogenic polypeptide that is targeted by T cells in T1D patients. We show that cytotoxic T cells directed against the N-terminal peptide of this nonconventional product are present in the circulation of individuals diagnosed with T1D, and we provide direct evidence that such CD8+ T cells are capable of killing human beta cells and thereby may be diabetogenic. This study reveals a new source of nonconventional polypeptides that act as self-epitopes in clinical autoimmune disease. © 2017 Nature Publishing Group, a division of Macmillan Publishers Limited. All Rights Reserved.
Frouws M.A.,Leiden University
Medicine | Year: 2017
Several studies have suggested an association between metformin use and increased overall survival in patients diagnosed with pancreatic cancer, albeit with several important methodological limitations. The aim of this study was to assess the association between overall survival, pancreatic cancer, and metformin use. A retrospective cohort study of 1111 patients with pancreatic cancer was conducted using data from the Netherlands Comprehensive Cancer Organization (1998-2011). Data were linked to the PHARMO Database Network, which contains drug-dispensing records from community pharmacies. Patients were classified as metformin users or sulfonylurea derivative users from the moment of first dispensing until the end of follow-up. The difference in overall survival between metformin users and nonusers was assessed, and additionally between metformin users and sulfonylurea derivative users. Univariable and multivariable parametric survival models were used, with use of metformin and sulfonylurea derivatives included as time-varying covariates. Of the 1111 patients, 91 were excluded because of differences in morphology, 48 because they had used metformin only before diagnosis, and 57 metformin users because they had also used sulfonylurea derivatives concomitantly. Lastly, 8 patients with a survival of zero months were excluded. This left 907 patients for the analysis. Overall, 77 users of metformin, 43 users of sulfonylurea derivatives, and 787 nonusers were identified. The adjusted rate ratio for overall survival for metformin users versus nonusers was 0.86 (95% CI: 0.66-1.11; P = 0.25). The comparison between metformin users and sulfonylurea derivative users yielded an adjusted rate ratio of 0.90 (95% CI: 0.59-1.40; P = 0.67). No association was found between overall survival, pancreatic cancer, and metformin use, in concordance with 2 recently published randomized controlled trials.
Future research should focus on the use of adjuvant metformin in other cancer types and the development or repurposing of other drugs for pancreatic cancer.
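Modelling exposure as a time-varying covariate, as the study above describes, is commonly implemented by splitting each patient's follow-up at the date of first dispensing, so that person-time before that date counts as unexposed. A minimal sketch of that data transformation, with hypothetical field names and time measured in months (not the study's actual code):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    start: float    # months since diagnosis at interval start
    stop: float     # months since diagnosis at interval end
    exposed: int    # 1 = metformin dispensed during this interval
    event: int      # 1 = death occurred at `stop`

def split_followup(followup_months, first_dispensing, died):
    """Split one patient's follow-up at first dispensing so that exposure
    can enter a survival model as a time-varying covariate."""
    if first_dispensing is None or first_dispensing >= followup_months:
        # Never exposed during follow-up: a single unexposed interval.
        return [Interval(0.0, followup_months, 0, int(died))]
    return [
        Interval(0.0, first_dispensing, 0, 0),                      # unexposed person-time
        Interval(first_dispensing, followup_months, 1, int(died)),  # exposed person-time
    ]
```

For a patient followed for 14 months who started metformin at month 5 and then died, this yields one unexposed interval (0-5) and one exposed interval (5-14) carrying the death event, so survival before the first dispensing is not credited to metformin (avoiding immortal-time bias).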
News Article | May 8, 2017
Vienna, Austria - 7 May 2017: Hodgkin lymphoma survivors have more severe coronary artery disease 20 years after chest irradiation, according to research presented today at ICNC 2017. "Patients with Hodgkin lymphoma receive high dose mediastinal irradiation at a young age as part of their treatment," said Dr Alexander van Rosendael, a medical doctor at Leiden University Medical Centre, the Netherlands. "There is an ongoing debate about whether to screen patients who get chest irradiation for coronary artery disease." The current study assessed the extent, severity and location of coronary artery disease (CAD) in Hodgkin lymphoma survivors who had received chest irradiation. The study included 79 patients who had been free of Hodgkin lymphoma for at least 10 years and had received mediastinal irradiation 20 years ago, plus 273 controls without Hodgkin lymphoma or irradiation. CAD was assessed using coronary computed tomography angiography (CTA). To assess differences in CAD between patients and controls, they were matched in a one-to-three fashion by age, gender, diabetes, hypertension, hypercholesterolaemia, family history of coronary artery disease, and smoking status. Patients were 45 years old on average and the presence of cardiovascular risk factors was low overall. Just 42% of patients had no atherosclerosis on coronary CTA compared to 64% of controls. Regarding the extent and severity of CAD, Hodgkin patients had significantly more multi-vessel CAD: 10% had two-vessel disease and 24% had three-vessel disease compared to 6% and 9% of controls, respectively. The segment involvement score (which measures overall coronary plaque distribution) and the segment stenosis score (which measures overall coronary plaque extent and severity) were significantly higher for patients compared with controls. 
Regarding the location of CAD, patients had significantly more coronary plaques in the left main (17% versus 6%), proximal left anterior descending (30% versus 16%), proximal right coronary artery (25% versus 10%) and proximal left circumflex (14% versus 6%), but not in non-proximal coronary segments. Patients had a four-fold risk of proximal plaque and a three-fold risk of proximal obstructive stenosis compared to controls. "Hodgkin patients who have chest irradiation have much more CAD than people of the same age who did not have irradiation," said Dr van Rosendael. "The CAD occurred at a young age - patients were 45 years old on average - and was probably caused by the irradiation. The CTA was done about 20 years after chest irradiation so there was time for CAD to develop." He continued: "What was remarkable was that irradiated patients had all the features of high risk CAD, including high stenosis severity, proximal location, and extensive disease. We know that the proximal location of the disease is much riskier and this may explain why Hodgkin patients have such poor cardiovascular outcomes when they get older." Dr van Rosendael explained that irradiation of the chest can cause inflammation of the coronary arteries, making patients more vulnerable to developing coronary artery disease. But it is not known why the CAD in irradiated patients tends to be proximally located. He said the finding of more, and more severe, CAD in irradiated patients supported the argument for screening. "When you see CAD in patients who received chest irradiation it is high risk CAD," he said. "Such patients should be screened at regular intervals after irradiation so that CAD can be spotted early and early treatment can be initiated." 
"These patients are around 45 years old and they are almost all asymptomatic," he said: "If you see a severe left main stenosis by screening with CTA (which occurred in 4%) then you can start statin therapy and perform revascularisation which may improve outcome. We know such treatment reduces the risk of events in non-irradiated patients so it seems likely that it would benefit Hodgkin patients."
News Article | April 17, 2017
Put on the brakes. A spinning neutron star that shifts between two states slows at a faster rate in one of them – and gravitational waves may be responsible. The neutron star J1023+0038 spins almost 600 times per second. But as its powerful magnetic field dissipates energy, it is slowing by about 76 rotations per second every billion years. This magnetic “spin-down” is normal, but sometimes J1023 slows at a faster rate. The different rates are associated with two states the neutron star switches back and forth between: one where it emits mostly radio waves and one where it mainly gives off X-rays. No one knows why some neutron stars behave in this way. But when the star is emitting mostly X-rays, it slows down about 30 per cent faster. In this X-ray phase, the star is stealing material from a smaller companion star that orbits it. Brynmor Haskell at the Polish Academy of Sciences in Warsaw and Alessandro Patruno at Leiden University, the Netherlands, argue that this stolen gas may be the key to J1023’s strange spin. As material snatched from its companion sticks to J1023’s surface, it builds a so-called mountain. Despite being no more than a few millimetres in height, the bump crushes the atoms beneath it, pushing them deeper into the neutron star. There the higher pressure fuses them into heavier elements, giving the mountain roots in the star’s interior. The extra surface bump and the heavier atoms below it together result in the mountain creating an asymmetry in J1023’s gravity. “Neutron stars are very compact, roughly the mass of the sun compressed in a 10-kilometre radius,” says Haskell. “This means that even very small deformations can lead to large changes in the gravitational field.” The imbalance in the neutron star’s gravitational field may cause it to radiate gravitational waves, ripples in space-time caused by the movement of massive objects. These waves would carry away some of the energy that keeps J1023 spinning. 
When the star switches from its X-ray phase to its radio phase, it stops munching on its stellar partner. As a result, the mountain gradually flattens out and the star emits no more spin-stunting gravitational waves. Last year, the LIGO collaboration announced that it had observed gravitational waves shaken off by black holes colliding. But nobody has yet seen gravitational waves from continuous, rather than catastrophic, events. Objects like J1023 are promising candidates for future gravitational wave searches, especially if they can grow larger mountains. “If this happens, then there might be many other neutron stars that do the same,” says Patruno. “Continuous gravitational waves might really be a widespread phenomenon.” Such a scenario could also explain the apparent cap on neutron stars’ spin. “The fastest ones we see don’t rotate as fast as we think they should be able to go,” says Nils Andersson at the University of Southampton, UK. “There’s something missing in our understanding.” If faster-spinning stars have defects such as mountains, they would emit more gravitational waves and slow down faster, setting a cosmic speed limit for neutron stars.
News Article | April 3, 2017
What's happening in the startup scene in the Netherlands right now? Find out in another Dutch startup news update! Our startup prince Constantijn van Oranje announced last Wednesday on the popular Dutch TV show 'De Wereld Draait Door' that a new edition of StartupFest will be held this September. It will coincide with the popular September startup event Amsterdam Capital Week. Last year, titans of the tech industry – Tim Cook, Travis Kalanick and Eric Schmidt – were among the speakers. Constantijn said that, just like last year, high-profile names will be invited as keynote speakers, hinting that one of them might be Alibaba founder Jack Ma. The theme of the event will be a new focus on promoting the involvement of startups in tackling major global problems such as climate change, health and energy. Silicon Valley-based insider security company Dtex Systems has set up an office in The Hague Security Delta. To prevent data abuse and data breaches, the company has developed software to detect insider threats and infiltration from the outside. "We are also eager to work with partners and The Hague Security Delta is the perfect platform to work with governments, knowledge institutions and enterprises to build a solid foundation for the Dtex operations in the Netherlands and beyond", said Olav van Haren, Sales Director at Dtex Systems. Founded in 1949, Vanderlande has long since ceased to be a startup, but we still think the company's recent acquisition deserves some attention. Vanderlande is a global market leader in baggage handling systems for airports and, with a revenue of $1.051 billion, the world's fifth-largest materials handling systems supplier. Last month, Toyota Industries bought Vanderlande for 1.2 billion euros. The online supermarket raised 100 million euros in growth capital from four wealthy-family funds: NPM Capital, De Hoge Dennen, Hoyberg and Finci. It was founded in 2015 after two years of stealth-mode development. 
It has seeded the development of the company in just a few cities and, with the capital raised, plans to expand throughout the country within three years. Its two key value propositions are free delivery and delivery windows that require you to stay at home for only one hour. On top of that, you can follow the driver in the app (as with Uber) and so know up to the minute when you're expected to open the door. It's contending for biggest deal of the year in the Netherlands, and the investment already ranks in the top 3 of all-time funding rounds for Dutch startups. Dutch/Belgian biotech company Pluriomics has raised an additional 2.5 million euros in funding from Belgian VCs SFPI-FPIM and SambrInvest. The startup is a spin-off from the Leiden University Medical Center (LUMC). It uses stem cell technology, disease modelling and cell-based assay development for cardiovascular drug discovery. SciSports, a spin-off of the University of Twente, has raised an additional 1.8 million euros in funding for further expansion. The startup has created a tool called BallJames that gathers 3D data from football games, drawing on techniques from radiology to improve football analytics in partnership with the university's chair of Biometric Pattern Recognition. The startup previously raised a seed round of 1.35 million euros. SciSports will invest in automating these tools for the media, gambling and gaming industries, founder Giels Brouwer told tech.eu. Eindhoven-based SMART-Photonics has raised seven million euros in a second round of investment. The money will be used for further growth, including building a new plant and hiring additional staff. The company specialises in the development and (mass) production of integrated photonic chips. These chips could eventually replace electronic chips because they are faster and more accurate. Ask a Female Engineer: How Can Managers Help Retain Technical Women on Their Team? 
Here is a great Y Combinator article on why female engineers leave a company. As it should be, the folks at Y Combinator put this question to female engineers themselves rather than to armchair scholars. Building viral growth engines into the product experience can be a great strategy to build momentum and grow your startup. Here is a great article by Josh Elman on virality.
News Article | May 3, 2017
(Reuters) - Europe's top tech hubs tend to radiate from massive capital cities like London, Berlin and Paris. But the heart of European innovation isn't a major metropolis – it's a small city in the Dutch-speaking region of Flanders. That's the conclusion of Reuters' second annual ranking of Europe's Most Innovative Universities, a list that identifies and ranks the educational institutions doing the most to advance science, invent new technologies, and help drive the global economy. The most innovative university in Europe, for the second year running, is Belgium's KU Leuven. This nearly 600-year-old institution was founded by Pope Martin V, but today it's better known for technology than theology: KU Leuven maintains one of the largest independent research and development organizations on the planet. In fiscal 2015, the university's research spending exceeded €454 million, and its patent portfolio currently includes 586 active families, each one representing an invention protected in multiple countries. How does a relatively small Catholic university out-innovate bigger, better-known institutions across Europe? KU Leuven earned its first-place rank, in part, by producing a high volume of influential inventions. Its researchers submit more patents than most other universities on the continent, and outside researchers frequently cite KU Leuven inventions in their own patent applications. Those are key criteria in the Reuters ranking of Europe's Most Innovative Universities, which was compiled in partnership with Clarivate Analytics and is based on proprietary data and analysis of indicators including patent filings and research paper citations. The second most innovative university in Europe is Imperial College London, an institution whose researchers have been responsible for the discovery of penicillin, the development of holography and the invention of fiber optics. 
The third-place University of Cambridge has been associated with 91 Nobel Laureates during its 800-year history. And the fourth-place Technical University of Munich has spun off more than 800 companies since 1990, including a variety of high-tech startups in industries including renewable energy, semiconductors and nanotechnology. Overall, the same countries that dominate European business and politics dominate the ranking of Europe's Most Innovative Universities. German universities account for 23 of the 100 institutions on the list, more than any other country, and the United Kingdom comes in second, tied with France, each with 17 institutions. But those three countries are also among the most populous and richest countries on the continent. Control for those factors, and it turns out that countries with much smaller populations and modest economies often outperform big ones. The Republic of Ireland has only three schools on the entire list, but with a population of less than 5 million people, it can boast more top 100 innovative universities per capita than any other country in Europe. On the same per capita basis, the second most innovative country on the list is Denmark, followed by Belgium, Switzerland and the Netherlands. Germany, the United Kingdom and France rank in the middle of the pack, an indication that they may be underperforming compared with their smaller neighbors: on a per capita basis, none of those countries has half as many top 100 universities as Ireland. And the same trends hold true if you look at national economies. According to the International Monetary Fund, in 2016 Germany's gross domestic product exceeded $3.49 trillion – 11 times larger than Ireland's $307 billion – yet Germany has only 7 times as many top 100 innovative universities. Some countries underperform even more drastically. Russia is Europe's most populous country and has the region's fifth largest economy, yet none of its universities made the top 100. 
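The per capita comparison above is simple division. As a sketch, using the top-100 counts reported in the article and rough population figures (the populations are approximate assumptions added for illustration, not from the article):

```python
# Top-100 counts as reported in the article.
top100 = {"Ireland": 3, "Germany": 23, "United Kingdom": 17, "France": 17}
# Approximate 2016 populations in millions (illustrative assumptions).
population_m = {"Ireland": 4.8, "Germany": 82.0, "United Kingdom": 65.5, "France": 66.9}

# Universities per million inhabitants, ranked highest first.
per_million = {c: top100[c] / population_m[c] for c in top100}
ranked = sorted(per_million, key=per_million.get, reverse=True)
```

Under these assumptions Ireland comes out on top at roughly 0.6 top-100 universities per million people, more than double the rate of Germany, the UK or France, matching the article's observation that none of the big three reaches half of Ireland's per capita figure.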
Other notable absences include any universities from Ukraine or Romania – a fact that reveals another divide between Western and Eastern Europe. To compile the ranking of Europe's most innovative universities, Clarivate Analytics (formerly the Intellectual Property & Science business of Thomson Reuters) began by identifying more than 600 global organizations that published the most articles in academic journals, including educational institutions, nonprofit charities, and government-funded institutions. That list was reduced to institutions that filed at least 50 patents with the World Intellectual Property Organization between 2010 and 2015. Then they evaluated each candidate on 10 different metrics, focusing on academic papers (which indicate basic research) and patent filings (which point to an institution's ability to apply research and commercialize its discoveries). Finally, they trimmed the list so that it included only European universities, and ranked them based on their performance. This is the second consecutive year that Clarivate and Reuters have collaborated to rank Europe's Most Innovative Universities, and three universities that ranked in the top 100 in 2016 fell off the list entirely: the Netherlands' Eindhoven University of Technology, Germany's University of Kiel, and the UK's Queen's University Belfast. All three universities filed fewer than 50 patents during the period examined for the ranking, and thus were eliminated from consideration. They've been replaced by three new entrants to the top 100: the University of Glasgow (#54), the University of Nice Sophia Antipolis (#94), and the Autonomous University of Madrid (#100). The returning universities that made the biggest moves on the list were the Netherlands' Leiden University (up 21 spots to #17) and Germany's Technical University of Berlin (up 21 spots to #41). 
Belgium's Université Libre de Bruxelles (down 17 to #38) and the UK's University of Leeds (down 17 to #73) made the biggest moves in the opposite direction. Generally, though, the list remained largely stable: nine of 2016's top ten schools remained in the top 10 for 2017, as did 17 of the top 20. This stability is understandable, because measures as broad as a university's paper output and patent performance are unlikely to change quickly. Of course, the relative ranking of any university does not provide a complete picture of whether its researchers are doing important, innovative work. Since the ranking measures innovation at an institutional level, it may overlook particularly innovative departments or programs: a university might rank low for overall innovation but still operate one of the world's most innovative computer science laboratories, for instance. And it's important to remember that whether a university ranks at the top or the bottom of the list, it's still within the top 100 on the continent: all of these universities produce original research, create useful technology and stimulate the global economy.
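The methodology described above (start from high-publishing institutions, keep only those with at least 50 WIPO patent filings, score the survivors, then rank) amounts to a filter-and-sort pipeline. A sketch with invented data and an unweighted composite score, since the real ten-metric weighting is proprietary; all field names here are hypothetical:

```python
def rank_institutions(candidates, min_patents=50):
    """Keep institutions with at least `min_patents` WIPO filings, score
    each as the unweighted mean of its normalized metrics, and rank
    highest first. (Hypothetical fields; the real methodology uses ten
    weighted indicators.)"""
    eligible = [c for c in candidates if c["patents"] >= min_patents]
    for c in eligible:
        c["score"] = sum(c["metrics"]) / len(c["metrics"])
    return sorted(eligible, key=lambda c: c["score"], reverse=True)

# Invented example: one institution falls below the patent threshold
# and is dropped before scoring, however strong its other metrics.
demo = rank_institutions([
    {"name": "Univ A", "patents": 120, "metrics": [0.9, 0.8, 0.7]},
    {"name": "Univ B", "patents": 30,  "metrics": [1.0, 1.0, 1.0]},
    {"name": "Univ C", "patents": 60,  "metrics": [0.6, 0.5, 0.4]},
])
```

This mirrors why Eindhoven, Kiel and Queen's University Belfast dropped off the 2017 list: falling below the 50-patent threshold eliminates an institution before any scoring takes place.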
Rabelink T.J.,Leiden University |
De Zeeuw D.,University of Groningen
Nature Reviews Nephrology | Year: 2015
Albuminuria is commonly used as a marker of kidney disease progression, but some evidence suggests that albuminuria also contributes to disease progression by inducing renal injury in specific disease conditions. Studies have confirmed that in patients with cardiovascular risk factors, such as diabetes and hypertension, endothelial damage drives progression of kidney disease and cardiovascular disease. A key mechanism that contributes to this process is the loss of the glycocalyx - a polysaccharide gel that lines the luminal endothelial surface and that normally acts as a barrier against albumin filtration. Degradation of the glycocalyx in response to endothelial activation can lead to albuminuria and subsequent renal and vascular inflammation, thus providing a pathophysiological framework for the clinical association of albuminuria with renal and cardiovascular disease progression. In this Review, we examine the likely mechanisms by which glycocalyx dysfunction contributes to kidney injury and explains the link between cardiovascular disease and albuminuria. Evidence suggests that glycocalyx dysfunction is reversible, suggesting that these mechanisms could be considered as therapeutic targets to prevent the progression of renal and cardiovascular disease. This possibility enables the use of existing drugs in new ways, provides an opportunity to develop novel therapies, and indicates that albuminuria should be reconsidered as an end point in clinical trials.
Vlieland T.P.M.V.,Leiden University
Current Opinion in Rheumatology | Year: 2011
Purpose of Review: To summarize recent literature on nonpharmacological and nonsurgical interventions in patients with rheumatoid arthritis (RA). Recent Findings: Recent systematic reviews and individual studies substantiate the effectiveness of aerobic and strength exercise programmes in RA. The evidence for the promotion of physical activity according to public health recommendations is scarce, and implementation research found that the reach and maintenance of exercise or physical activity programmes in RA patients are suboptimal. For self-management interventions, characteristics that increase their effectiveness were identified, including the use of cognitive behavioural approaches and approaches derived from the self-regulation theory. A limited number of recent individual trials substantiate the effectiveness of comprehensive occupational therapy, foot orthoses, finger splints and wrist working splints, but not of wrist resting splints. Overall, the evidence for the effectiveness of assistive devices and dietary interventions is scanty. Summary: For exercise and physical activity programmes and self-management interventions in RA, research is increasingly directed towards the optimization of their content, intensity, frequency, duration and mode of delivery and effective implementation strategies. A number of studies substantiate the effectiveness of comprehensive occupational therapy, wrist working splints and finger splints. More research into the effectiveness of assistive devices, foot orthoses and dietary interventions is needed. © 2011 Wolters Kluwer Health | Lippincott Williams and Wilkins.
Casini A.,University of Groningen |
Reedijk J.,Leiden University |
Reedijk J.,King Saud University
Chemical Science | Year: 2012
A critical discussion is presented about the possible role of Pt-protein interactions in the mechanisms of action of platinum anticancer compounds. Although, in the 40 years since its discovery, cisplatin and its analogues have been believed to exert their therapeutic effects via direct interactions with nucleic acids, several proteins/enzymes have recently appeared to be involved in the compounds' overall pharmacological and toxicological profiles, apart from classical serum transport proteins and metal detoxification systems. As an example, the emerging role of zinc finger proteins in the activity of platinum drugs is noteworthy. Moreover, the pursuit of novel platinum candidates that selectively target enzymes is now the subject of intense investigation in medicinal bioinorganic chemistry and chemical biology. An overview is presented of the most representative studies in the field, with particular focus on the characterization of Pt-protein interactions at the molecular level, using different biophysical and analytical methods. © 2012 The Royal Society of Chemistry.
Unanue E.R.,University of Washington |
Turk V.,Jozef Stefan Institute |
Neefjes J.,Netherlands Cancer Institute |
Neefjes J.,Leiden University
Annual Review of Immunology | Year: 2016
MHC class II (MHC-II) molecules are critical in the control of many immune responses. They are also involved in most autoimmune diseases and other pathologies. Here, we describe the biology of MHC-II and MHC-II variations that affect immune responses. We discuss the classic cell biology of MHC-II and various perturbations. Proteolysis is a major process in the biology of MHC-II, and we describe the various components forming and controlling this endosomal proteolytic machinery. This process ultimately determines the MHC-II-presented peptidome, including cryptic peptides, modified peptides, and other peptides that are relevant in autoimmune responses. MHC-II is also variable in expression, glycosylation, and turnover. We illustrate that MHC-II is variable not only in amino acids (polymorphic) but also in its biology, with consequences for both health and disease. Copyright © 2016 by Annual Reviews. All rights reserved.
Tan B.G.,University of Essex |
Vijgenboom E.,Leiden University |
Worrall J.A.R.,University of Essex
Nucleic Acids Research | Year: 2014
Metal ion homeostasis in bacteria relies on metalloregulatory proteins to upregulate metal resistance genes and enable the organism to preclude metal toxicity. The copper-sensitive operon repressor (CsoR) family is widely distributed in bacteria and controls the expression of copper efflux systems. CsoR operator sites consist of G-tract-containing pseudopalindromes, for which the mechanism of operator binding is poorly understood. Here, we use a structurally characterized CsoR from Streptomyces lividans (CsoRSl) together with three specific operator targets to reveal the salient features of the mechanism of DNA binding. We reveal that CsoRSl binds to its operator site through a 2-fold axis of symmetry centred on a conserved 5′-TAC/GTA-3′ inverted repeat. Operator recognition is stringently dependent not only on electropositive residues but also on a conserved polar glutamine residue. Thermodynamic and circular dichroic signatures of the CsoRSl-DNA interaction suggest selectivity towards the A-DNA-like topology of the G-tracts at the operator site. Such properties are enhanced on protein binding, thus enabling the symmetrical binding of two CsoRSl tetramers. Finally, differential binding modes may exist in operator sites having more than one 5′-TAC/GTA-3′ inverted repeat, with implications in vivo for a mechanism of modular control. © 2013 The Author(s).
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 6.05M | Year: 2010
Astronomical observations are revealing in ever increasing detail how our Universe works. Existing and planned European investment in sophisticated observational platforms approaches many billions of euros. However, the observations made with these telescopes would be little more than pretty pictures were it not for the efforts of the experimental and theoretical laboratory astrophysics communities, in collaboration with their astronomical colleagues, in developing models of our Universe firmly grounded here on Earth. These models recognise the importance of chemical processes in the astronomical environment, and the young science of astrochemistry seeks to understand the rich variety of this chemistry in such a way as to make a significant contribution to our true understanding of the evolution of the modern-day Universe. The LASSIE (Laboratory Astrochemical Surface Science in Europe) Initial Training Network seeks to address the key issue of the interaction of the astronomical gas phase with the dust that pervades the Universe. The gas-grain interaction, as it is known, has been recognised by astronomers as crucial in promoting chemistry. The LASSIE ITN brings together the leading European players in experimental and computational surface and solid-state astrochemistry, astronomers seeking to understand the detailed role of chemical species in our modern Universe, industrial partners engaged in the development of relevant laboratory instrumentation, and experts in public engagement. Through this combination, LASSIE will develop capacity in astrochemistry in Europe, produce researchers equipped with a range of specialist and generic skills necessary to engage in a wide range of knowledge-based careers, and reach out to all aspects of European society to deliver a positive message in relation to the scientific and technical advancement of Europe.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.4.1-4;HEALTH-2007-2.4.1-8 | Award Amount: 4.20M | Year: 2008
Recently the zebrafish has emerged as an important new system for cancer research, because the zebrafish genome contains all orthologs of human oncogenes and forms tumors with histopathological and gene profiling features similar to those of human tumors. The zebrafish provides an in vivo vertebrate model for identifying novel mechanisms of cancer progression and for development of new anticancer compounds in a time- and cost-effective manner. The ZF-CANCER project aims to develop high-throughput bioassays for target discovery and rapid drug screenings applicable in preclinical validation pipelines. Fluorescently labelled human and zebrafish cancer cells will be implanted (xenogenic and allogenic transplantation) into zebrafish embryos transgenic for a GFP-vascular marker, and quantitative, multi-colour fluorescent intravital bio-imaging of tumour progression will be set up as the readout. Because of its amenability to genetic manipulation and optical transparency, the zebrafish is currently the only vertebrate model that allows the simultaneous in vivo imaging of all hallmarks of cancer progression, including cell survival, proliferation, migration and induction of angiogenesis. The combination of visual, non-invasive monitoring in translucent host embryos with powerful RNA interference technology, successfully developed for human cancer cells, will enable identification of novel targets in a wide variety of human cancers. Automation of these fluorescent readouts will accelerate the screening process with chemical libraries to discover new compounds involved in different aspects of cancer progression and inhibition. In the case study, a selected panel of genes and lead compounds will be screened on a high-throughput platform, possibly resulting in the identification of important anti-tumour drugs relevant for human cancer therapy. Fundamental knowledge, tools and technical expertise gained from ZF-CANCER will be commercially exploited by one company and two high-tech SMEs.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-33-2015 | Award Amount: 30.12M | Year: 2016
The vision of EU-ToxRisk is to drive a paradigm shift in toxicology towards an animal-free, mechanism-based integrated approach to chemical safety assessment. The project will unite all relevant disciplines and stakeholders to establish: i) pragmatic, solid read-across procedures incorporating mechanistic and toxicokinetic knowledge; and ii) ab initio hazard and risk assessment strategies for chemicals with little background information. The project will focus on repeated-dose systemic toxicity (liver, kidney, lung and nervous system) as well as developmental/reproductive toxicity. Different tiered human test systems will be integrated to balance speed, cost and biological complexity. EU-ToxRisk extensively integrates the adverse outcome pathway (AOP)-based toxicity testing concept. Advanced technologies, including high-throughput transcriptomics, RNA interference and high-throughput microscopy, will provide quantitative and mechanistic underpinning of AOPs and their key events (KEs). The project combines in silico tools and in vitro assays through computational modelling approaches to provide quantitative data on the activation of KEs along AOPs. This information, together with detailed toxicokinetic data and in vitro-in vivo extrapolation algorithms, forms the basis for improved hazard and risk assessment. The EU-ToxRisk work plan is structured along a broad spectrum of case studies, driven by the cosmetics, (agro-)chemical and pharma industries together with regulators. The approach involves iterative training, testing, optimization and validation phases to establish fit-for-purpose integrated approaches to testing and assessment with key EU-ToxRisk methodologies. The test systems will be combined into a flexible service package for exploitation and continued impact across industry sectors and regulatory applications. The proof-of-concept for the new mechanism-based testing strategy will make EU-ToxRisk the European flagship for animal-free chemical safety assessment.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.9.7 | Award Amount: 7.86M | Year: 2012
Future advancements in the ICT domain are closely linked to understanding how multi-level complex systems function. Indeed, multi-level dependencies may amplify cascading failures or make the collapse of the entire system more sudden. Recent large-scale blackouts resulting from cascades in power grids coupled to their control communication systems illustrate this point very clearly. A better understanding of multi-level systems is essential for future ICTs and for improving quality of life and security in an increasingly interconnected and interdependent world. In this respect, complex network science is particularly suitable for the many challenges that we face today, from critical infrastructures and communication systems to techno-social and socio-economic networks. MULTIPLEX proposes a substantial paradigm shift for the development of a mathematical, computational and algorithmic framework for multi-level complex networks. Firstly, this will lead to significant progress in the understanding and prediction of complex multi-level systems. Secondly, it will enable better control and optimization of their dynamics. By combining mathematical analyses, modelling approaches and the use of massive heterogeneous data sets, we shall address several prominent aspects of multi-level complex networks, i.e. their topology, dynamical organization and evolution.
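The cascade dynamics described above can be illustrated with a minimal threshold model: a node fails once a given fraction of its neighbours have failed, so a single seed failure can propagate through dependency links between layers. The toy graph, the 0.5 threshold and the Watts-style update rule below are illustrative assumptions only, not the MULTIPLEX formalism:

```python
def cascade(adjacency, seeds, threshold=0.5):
    """Threshold cascade: a node fails once the fraction of its failed
    neighbours reaches `threshold`. Returns the set of failed nodes.
    (Hypothetical sketch; real interdependent-network models also track
    connectivity to a giant component.)"""
    failed = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adjacency.items():
            if node in failed or not nbrs:
                continue
            frac = sum(1 for n in nbrs if n in failed) / len(nbrs)
            if frac >= threshold:
                failed.add(node)
                changed = True
    return failed

# Two small coupled layers: "a" nodes (e.g. power grid) and "b" nodes
# (e.g. control network), with a1-b1 as the inter-layer dependency link.
adj = {
    "a1": ["a2", "a3", "b1"],
    "a2": ["a1", "a3"],
    "a3": ["a1", "a2"],
    "b1": ["b2", "a1"],
    "b2": ["b1"],
}
print(sorted(cascade(adj, seeds={"a1"})))  # one seed failure takes down both layers
```

With these invented parameters, the failure of a single node in one layer crosses the dependency link and brings down the entire coupled system, which is exactly the amplification effect the project targets.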
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2014-ETN | Award Amount: 3.81M | Year: 2015
IMMUNOSHAPE aims at training a new generation of scientists capable of combining state-of-the-art synthesis and screening technology to develop new lead structures for highly selective, glycan-based multivalent immunotherapeutics for the treatment of cancer, autoimmune diseases and allergy. To this end, we have set up a training program in a unique academic-industrial environment that will educate young researchers in scientific and practical biomedical glycoscience, with the final aim of producing new talent and innovation in the field and improving their career prospects in both academic and non-academic sectors. The unique combination of 10 academic groups with expertise in automated solid-phase carbohydrate synthesis, microarray-based high-throughput screening technology, tumour immunology, structural glycobiology, multivalent systems and medicinal chemistry, along with 4 industrial partners active in nanomedicine, immunotherapy, medical device development and the fabrication of scientific instrumentation, will provide multidisciplinary and multisectorial training to 15 ESRs in biomedical glycoscience and its industrial applications.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.3.2-4 | Award Amount: 2.60M | Year: 2008
Artemisinin-based antimalarial drug combinations are recommended for the treatment of P. falciparum malaria infections throughout all malaria-endemic areas of the world and in all populations, including women of child-bearing age. The studies planned in this collaborative project are central to assessing the potential hazard posed by these drugs to the developing human foetus and thereby to making evidence-based recommendations on their risk:benefit balance. Although clinical experience to date indicates the artemisinins to be safe, the area of reproductive toxicology demands special consideration. Data from the Chinese literature and our own studies confirm that the artemisinins are embryotoxic and potentially teratogenic in animal species at drug doses within the human therapeutic range. Based on over ten years of investigating the pharmacology of these drugs, we have developed a hypothesis that can explain these teratogenic effects. Our hypothesis is based on the generation of reactive oxygen species (ROS) from cleavage of the artemisinin peroxide bridge and consequent embryofoetal damage to key biological macromolecules. It draws on parallels with the metabolic activation and teratogenic effects of other established teratogens, such as phenytoin.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.2.2.1-3 | Award Amount: 8.74M | Year: 2014
Aggression inflicts a huge personal, psychological and financial burden on affected individuals, their relatives, and society at large. Despite large scientific, preventive, and treatment investments, no decrease in aggressive behavior is seen. This calls for a shift to new approaches. By capitalising on comprehensive longitudinal cohorts and recent advances in the genetic, biological, epidemiological, and clinical fields, and by combining such interdisciplinary expertise, the ACTION consortium will dissect the etiology and pathogenesis of aggression. Based on new insights, ACTION will inform the development of novel diagnostic tools and causative targets and guide the development of treatment and prevention strategies. ACTION is built on interrelated work packages with a focus on a) clinical epidemiology and current classification and treatment problems; b) genetic epidemiology, including genome-wide association studies and epigenetics; c) gene-environment correlation and interaction; and d) biomarkers and metabolomics. ACTION will deliver an overarching framework that combines a thorough understanding of pathways leading to aggression with a map of current gaps and best practices on clinical, ethical, legal, and social issues. Based on this framework, ACTION will develop novel biomarkers suitable for large-scale application in children and combine biomarker data with new insights into the effects of gender, age, and comorbidity. ACTION will provide guidance in optimising current intervention programs and deliver new biological targets to pave the way for novel therapeutic interventions. ACTION will provide a decision tree to guide personalised intervention programmes and will have a direct and sustained impact on reducing paediatric aggression. Its overarching aim is to reduce aggression by developing approaches that take individual differences in genetic and environmental susceptibility into account, thereby leading to a better understanding of personalised intervention programs.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2012.6.3-3 | Award Amount: 3.65M | Year: 2012
DESIRE will develop and apply an optimal set of indicators to monitor European progress towards resource efficiency. We propose a combination of time series of environmentally extended input-output data (EE IO) and the DPSIR framework to construct the indicator set. Only this approach uses a single data set that allows for consistent construction of resource-efficiency indicators capturing the EU, country, sector and product-group levels, and the production and consumption perspectives including impacts outside the EU. The project will a) improve data availability, particularly by creating EE IO time series and nowcast data (WP5); b) improve calculation methods for indicators that currently lack scientific robustness, most notably in the field of biodiversity/ecosystem services (WP7) and critical materials (WP6), and further develop novel reference indicators for economic success (beyond GDP and value added, WP8); and c) explicitly address the problem of indicator proliferation and the limits of available data that carry a statistical stamp. Via scientific analysis we will select the smallest set of indicators giving mutually independent information, and show which shortcuts in (statistical) data inventory can be made without significant loss of quality (WP8). The project further comprises interactive policy analysis and indicator concept development via brokerage activities (WP2-4), management (WP1), and conclusions and implementation (WP10), including a handover of data and indicators to the EU's Group of Four: EEA, Eurostat, DG ENV and DG JRC. Our team includes 4 UN Resource Panel members (WI, AAU-SEC, NTNU and LU-CML) and founders of the material flow analysis field (e.g. SERI). We further include TNO (a global leader in EE IO via projects like EXIOPOL and CREEA), FFCUL (global top in biodiversity and ecosystem services) and RU (a top player in sustainability impact assessment).
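The EE IO core of such indicators rests on standard Leontief input-output algebra: total sectoral output x satisfies x = (I - A)^(-1) y for a technical coefficient matrix A and final demand y, and environmental pressures follow by applying per-sector emission intensities to x. A minimal two-sector sketch, with all coefficients invented for illustration (not DESIRE or EXIOPOL data):

```python
def leontief_footprint(A, F, y):
    """Environmentally extended input-output, 2-sector sketch.
    A: 2x2 technical coefficient matrix, F: per-sector emission
    intensities, y: final demand. Returns (total output, emissions).
    Solves x = (I - A)^(-1) y via the 2x2 cofactor formula; real
    applications invert full multi-regional tables instead."""
    (a11, a12), (a21, a22) = A
    m11, m12 = 1.0 - a11, -a12          # rows of (I - A)
    m21, m22 = -a21, 1.0 - a22
    det = m11 * m22 - m12 * m21
    x1 = ( m22 * y[0] - m12 * y[1]) / det
    x2 = (-m21 * y[0] + m11 * y[1]) / det
    emissions = F[0] * x1 + F[1] * x2   # extension: pressure = F x
    return (x1, x2), emissions

A = [[0.1, 0.2],
     [0.3, 0.1]]      # invented inter-industry coefficients
F = [0.5, 2.0]        # invented kg CO2 per unit of sector output
y = [100.0, 50.0]     # invented final demand
(x1, x2), e = leontief_footprint(A, F, y)
```

Dividing such a pressure total by GDP or value added then yields exactly the kind of resource-efficiency ratio the indicator set is built from.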
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 3.39M | Year: 2009
The implementation of the new EU legislation concerning the registration, evaluation, authorization and restriction of chemicals (REACH) requires demonstration of the safe manufacture of chemicals and their safe use throughout the supply chain. REACH encourages the development of new in vitro test methods and the replacement of animal tests wherever possible by alternative methods. These goals are not achievable without well-trained personnel with broad expertise and knowledge in both experimental and computational areas of the environmental sciences. The requirements for such scientists, however, are not limited to REACH implementation itself. Large companies and SMEs could be interested in employing such specialists to perform risk assessment and prioritization of molecules in the development stage. Therefore, the primary objective of this ITN (http://www.eco-itn.eu) is to contribute to the education of a new generation of scientists, environmental chemoinformaticians, who will receive advanced training in both environmental and computational methods. To achieve this goal the ITN will train the fellows using the expertise and knowledge of its partners in various complementary computational and experimental areas of the environmental sciences. Additional training will also be offered by means of Winter and Summer Schools and will include both theoretical and practical courses. Internships in the laboratories of associated partners will allow fellows to learn new methods and to broaden their knowledge in the field. A flexible system of Short Term Fellowships will offer additional targeted training to researchers originally not associated with the network. Given the potentially great business impact of evaluating more than 120,000 industrial chemicals on the European market within the next decade, the work of this network's fellows may have a significant economic dimension with regard to the hazard evaluation of chemicals in Europe.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.9.1 | Award Amount: 2.54M | Year: 2012
The MUSE project will introduce a new way of exploring and understanding information by bringing text to life through 3D interactive storytelling. Taking as input natural language text such as children's stories or medical patient education materials, MUSE will process the natural language, translate it into formal knowledge that represents the actions, actors, plots and surrounding world, and then render these as virtual 3D worlds in which the user can explore the text through interaction, re-enactment and guided game play.

To enable such a system, MUSE will make targeted advances in natural language processing that enable the translation of natural language text into the necessary knowledge representations, as well as targeted advances in the action representation and story planning necessary for interactive storytelling. In natural language processing, MUSE will develop new techniques for finding explicit action structures in text and combining them with implicit action structures inferred from the context, based on probabilistic models of translation and automatic methods for acquiring world knowledge from large corpora. In interactive storytelling, MUSE will develop action and object representations that bridge the gap between natural language and virtual worlds, and will create advanced techniques for planning virtual-world stories given inconsistent and incomplete information.

The proposed methodology will be evaluated and showcased in two scenarios: one creating immersive children's stories from text, and one allowing medical patients to interact with patient education materials. Comparable to the invention of symbolic writing systems several millennia ago, MUSE contributes a novel symbolic system for communicating natural language utterances.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SPA.2013.2.1-01 | Award Amount: 3.22M | Year: 2013
GENIUS is designed to boost the impact of the next European breakthrough in astrophysics, the Gaia astrometric mission. Gaia is an ESA Cornerstone mission scheduled for launch in October 2013 that aims at producing the most accurate and complete 3D map of the Milky Way to date. A pan-European consortium named DPAC is working on the implementation of the Gaia data processing, of which the final result will be a catalogue and data archive containing more than one billion objects. The archive system containing the data products will be located at the European Space Astronomy Centre (ESAC) and will serve as the basis for the scientific exploitation of the Gaia data. The design, implementation and operation of this archive is a task that ESA has opened up to participation from the European scientific community. GENIUS aims to contribute significantly to this development based on the following principles: an archive design driven by the needs of the user community; provision of exploitation tools to maximize the scientific return; assurance of the quality of the archive contents and of interoperability with existing and future astronomical archives (ESAC, ESO, ...); cooperation with the only two other astrometric missions in the world, nanoJASMINE and JASMINE (Japan); and, last but not least, outreach and academic activities facilitated by the archive to foster public interest in science in general and astronomy in particular. GENIUS fits seamlessly into existing Gaia activities, exploiting synergies with ongoing developments. Its members actively participate in these ongoing tasks and provide in-depth knowledge of the mission as well as expertise in key development areas. Furthermore, GENIUS has the support of DPAC and of several national Gaia communities in the EU member states, and will establish cooperation with the Japanese astrometric missions already mentioned.
Agency: European Commission | Branch: FP7 | Program: CP-TP | Phase: KBBE.2013.3.5-01 | Award Amount: 3.85M | Year: 2013
The Decathlon project will bring together a broad range of experts and expertise to jointly work on the development of new or improved methods needed in the fields of 1) food pathogens, 2) traceability of GMOs and 3) customs issues. The project will develop advanced methods for all three application areas, with method characteristics that meet the requirements of the individual areas as laid down in minimal performance parameters (MPPs) for the types of methods to be developed within the Decathlon project. Decathlon brings together all relevant molecular-biological and bioinformatics expertise through the participation of expert European researchers in the respective fields of application. Besides technical experts, field-related, application-oriented scientists will also participate for the three areas of interest, who are fully aware of the user requirements for the methods to be developed, also in the light of current and future European regulations. By combining this awareness with technical expertise, user requirements will be translated into technical and bioinformatics method requirements that will form the starting point for the molecular-biological methods (including any related bioinformatics module, where applicable) to be developed. In this way the Decathlon project will develop focused DNA-based (on-site) methods for the identified areas of food pathogens, GMOs and customs issues, and at the same time stimulate the development of DNA methods for similar applications in numerous other fields that require high-quality, focused DNA-based detection and identification methods. Decathlon will provide the roadmap and blueprint for this broader application of all methods and modules developed in Decathlon. Furthermore, Decathlon will have in place a cooperation platform and network that will be extended effectively throughout the duration of the project into a consolidated European network of analytical experts.
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2010-IRSES | Award Amount: 779.10K | Year: 2011
The focus in cosmology is shifting from the determination of the basic cosmological parameters to developing an understanding of how galaxies formed. Progress in this field has been driven by a combination of computer simulation and observational breakthroughs. Over the next few years, groundbreaking new facilities will come online and will provide data of unprecedented quality with which to test theoretical models. The key objective of our proposal is to allow European scientists to play a leading role in advancing our understanding of galaxy formation, by forging new links and research collaborations with scientists in Latin America and China, which host some of these new experiments. Our research programme covers all aspects of numerical galaxy formation. In addition to building new research capacity, we will organise a series of events to avoid fragmentation of research expertise and to help train a new generation of galaxy formation modellers.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: PEOPLE-2007-1-1-ITN | Award Amount: 3.14M | Year: 2008
The aim of ELCAT is to train a new generation of young researchers in experimental and theoretical research methods in electrocatalysis. The scientific objectives are to address the problem of achieving specific reactivity in electrochemical transformations and to establish predictive tools based on quantum chemical calculations and computational modelling. The electrocatalytic properties of nanostructured metal and metal-oxide interfaces will be investigated using state-of-the-art instrumental and experimental techniques. Four important electrochemical reactions have been chosen for their general importance in energy production, environmental control and clean chemicals production. Industrial participation in the Network will provide an understanding of the technological applications of electrocatalysis. The training objectives are to provide the young researchers with scientific skills as well as leadership qualifications to enable them to become established as independent researchers in new areas, thus providing an attractive vision for their research careers. To achieve this, ELCAT will provide training through interdisciplinary research involving several laboratories, exchanges between laboratories with complementary expertise and instrumentation, participation in workshops and conferences, and courses arranged by the Network on both scientific issues and complementary skills. This project is timely due to the present confluence of theoretical and instrumental research techniques of unprecedented power, such as new computational methods, a new understanding of electron transfer reactions at the nanoscale, and advanced in-situ spectroscopies and scanning probe microscopies. There are new industrial requirements derived from important societal issues, resulting in an urgent need for a new generation of trained young researchers with a modern background in experimental and theoretical techniques for applications in electrocatalysis.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 3.90M | Year: 2014
In the Early Modern Age (16th-17th centuries) the construction of ocean-going ships was paramount to the development of cultural encounters in what became the Age of Discovery and European expansion. In the case of the Iberian Empires, the establishment of new trade routes brought with it the need for armed merchantmen, galleons and smaller vessels, placing unprecedented demands on Iberian forests for the supply of construction timber. Forestry and sea power became inextricably linked, creating new geopolitical tensions, alliances and forest regulations. Key questions in this context are: could Iberian forest resources sustain the increasing demand for sound timber, or was wood imported from elsewhere? If so, how were the trade networks organized? And did the lack of raw material force the technological changes that occurred in shipbuilding in the 16th century, or were they a result of exchange between Mediterranean and Atlantic shipbuilding traditions? This project will address these questions through a multidisciplinary and innovative research training program to improve the understanding of our historical past, our cultural heritage, and our knowledge of the use of resources for shipbuilding. The prerequisite for such an approach is combining knowledge derived from the Humanities and the Life Sciences. The aims of the project are: i) to consolidate a research line combining historical research, underwater archaeology, GIS and wood-provenancing methods (dendrochronology, wood anatomy and geo-/dendrochemistry); ii) to increase the background and experience of trainees in the different research areas, by engaging the fellows in training courses and workshops aimed at developing their scientific, communication, and management skills; and iii) to develop their transferable skills for future careers in academia or the private sector, whilst advancing the research fields through the integration of research tools, the development of reference datasets, and new discoveries.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENV.2011.2.1.4-3 | Award Amount: 4.05M | Year: 2011
People in Europe acknowledge that nature is important to them and to society at large. Economists have shown that, indeed, biodiversity has total economic values running into the trillions of euros worldwide, and into hundreds of millions even for minor ecosystem services on local scales. In spite of these immense values, politicians and the general public in Europe do not appear to really act for nature. In the ballot box, people think about their job security, their mortgage or foreign immigrants, not about the loss of nature. Politicians feel tempted to focus on these same narrow issues. As a result, European biodiversity continues to decline. Can economic methods to assess the value of biodiversity be improved so that they reach out to what really motivates action? Can alternative approaches be developed that lie closer to what connects people to nature and can appeal to their actions instead of only to their feelings? The BIOMOT project, funded by the FP7 programme of the European Union, will address these challenges. Involving eight research institutes in seven European countries and uniting a unique group of economists, governance experts, psychologists and philosophers, BIOMOT will undertake empirical research in the seven European countries, focusing on (a) the motivational capacity of economic valuation methods, (b) the types of motivation for nature that underlie successful policy actions for biodiversity at various scales, and (c) the motivations that drive citizens, business and public leaders to take action for nature. On that basis, BIOMOT will develop a general theory of motivation for biodiversity and think through its implications for biodiversity policies, for business and civil-society actors, and for public communication.
Agency: European Commission | Branch: FP7 | Program: CP-SICA | Phase: KBBE-2007-2-5-04 | Award Amount: 7.59M | Year: 2009
Trade in aquatic products is the largest global food sector by value, and Asia represents the main external source of aquatic products into the EU. Current EU policy supporting international trade between Asia and Europe concentrates on issues of food safety as measures of quality, whilst market forces drive the development of standards and labels that identify social and environmental parameters. This project proposes to establish an evidence-based framework to support current stakeholder dialogues organised by a third-party certifier. This will contribute to harmonising standards, helping consumers make fully informed choices with regard to the sustainability and safety of their seafood. The Ethical Aquatic Food Index, a qualitative, holistic measure of overall sustainability to support consumers' purchasing decisions, will be based on detailed research centred on a Life Cycle Assessment of the current processes involved in ensuring aquatic products reach consumers, aligned with analyses from the sustainable livelihoods approach and systems thinking. SMEs based in the EU will participate in this project, particularly in the action research phase, enhancing their relative competitiveness. By strengthening the knowledge base surrounding EU-Asia seafood trade, the project will provide the evidence required to support further expansion whilst ensuring a fair deal for producers who are meeting appropriate social and environmental goals and offering a safe and sustainable product for consumers. The sectors covered represent the main aquaculture products reaching EU markets: tilapia, catfish, shrimps and prawns. Known case-study stakeholders include SMEs in Bangladesh, China, Thailand and Vietnam, where sustainability is essential in the face of rapid growth. The research will secondarily improve understanding of opportunities for European exports to supply the expanding middle class in Asia. Outputs will be promoted through workshops, websites, journal and press articles.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: ISSI-1-2015 | Award Amount: 3.44M | Year: 2016
Ensuring the availability of and access to sufficient safe and nutritious food is a key priority that impacts all EU citizens, and Horizon 2020 has therefore identified food security as one of the major challenges to be addressed. BGCI, an international network organisation, will work with botanic gardens, experienced informal science centres with research expertise in food and food plants, alongside other key organisations to implement the BigPicnic project. Through a co-creation approach and public debate, this project builds public understanding of food security issues and enables adults and young people across Europe and in Africa to debate and articulate their views on Responsible Research and Innovation (RRI) in this field to their peers, scientists and policy makers. The project involves the delivery of low-cost, co-created outreach exhibitions on food security, using the metaphor of a picnic basket; the exhibitions will include information, activities and participatory events that engage a broad range of target audiences (adults, schoolchildren and families). Building on audience engagement and data captured from these initial, locally held exhibitions, the project will run science cafés in publicly accessible and informal engagement areas as well as in botanic gardens, again capturing public views on RRI and food security. The final phase of the project will consolidate the findings of the public engagement to produce two key publications: a report articulating public opinion and recommendations for RRI on food security, and a co-creation toolkit that will build capacity for engagement in further science institutions across the EU. A number of case studies on RRI will be provided to support the EU RRI toolkit currently under construction. It is expected that the project evaluation will show organisational learning and change amongst partner institutions. Partners will go on to disseminate training and promotion of RRI for future public engagement.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2016 | Award Amount: 4.01M | Year: 2016
Large polycyclic aromatic hydrocarbon (PAH) molecules are deeply interwoven in the fabric of the Universe and lock up ~15% of the elemental carbon in the interstellar medium (ISM) of galaxies. They dominate the mid-infrared emission characteristics of galaxies, which can be used to trace star formation locally as well as in the early universe; they influence the phase structure of the ISM and the star formation rate of galaxies; and they are the epitome of molecular complexity in space, heralding the importance of top-down chemistry. In spite of the influential role of PAHs in the ISM, their lifecycle, catalytic activity, interaction with interstellar radiation, gas and grains, and their role in the organic inventory of solar system bodies are still poorly understood. The EUROPAH ETN aims to change this by creating a highly multidisciplinary network that combines astronomy, molecular physics, molecular spectroscopy, environmental science, quantum chemistry, surface science and plasma physics in a comprehensive research and training program. EUROPAH will train 16 ESRs through cutting-edge individual research and innovation projects investigating key physical and chemical processes of PAHs in space and related terrestrial settings, linking directly to the R&D needs of our industrial beneficiaries. EUROPAH will engage all ESRs in industry-driven innovation activities aimed at R&D of the industrial participants' products and services, including outreach activities led by our industrial science-communication beneficiary. Research and innovation training is complemented by an extensive program of network-wide training events to expose the ESRs to all disciplines in the network and to instill in them a comprehensive set of transferable skills. This will provide the ESRs with a unique learning environment in a multidisciplinary setting, aimed at developing a research-oriented, creative and innovative mindset, and will place them well for a future career in academia or industry.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENV.2009.3.3.2.1 | Award Amount: 4.49M | Year: 2009
LC-IMPACT is a 3.5-year project whose main objective is the development and application of life cycle impact assessment methods and characterisation and normalisation factors. Impacts from land use, water use, marine, mineral and fossil resource use, ecotoxicity and human toxicity, and a number of non-toxic emission-related impact categories will be considered in LC-IMPACT. First, new impact assessment methods will be developed for categories that are not (commonly) included in life cycle impact assessments and for categories in which model uncertainties are very high, i.e. land use, water exploitation, resource use, and noise. Second, LC-IMPACT will provide spatially explicit characterisation factors based on global-scale models for land use, water exploitation, toxicants, priority air pollutants, and nutrients. Third, parameter uncertainty and value choices will be assessed for impact categories with high uncertainties, such as ecotoxicity and human toxicity. Fourth, ready-to-use characterisation factors will be calculated and reported. Fifth, normalisation factors for Europe and the world will be calculated for the impact categories included. Sixth, the improved decision support provided by the new characterisation and normalisation factors will be demonstrated in the context of three case studies: i) food production (fish, tomatoes, margarine), ii) paper production and printing, and iii) automobile manufacturing and operation. Finally, the new life cycle impact assessment methods and factors will be verified and disseminated through a portfolio of actions, such as stakeholder consultation, a project website, workshops, course development, and training of user groups. In short, LC-IMPACT will provide improved, globally applicable life cycle impact assessment methods, characterisation and normalisation factors that can be readily used in the daily practice of life cycle assessment studies.
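The characterisation and normalisation factors the project delivers feed into the standard life cycle impact assessment calculation: each emission in an inventory is multiplied by its characterisation factor and the results are summed, and the total can then be divided by a normalisation factor to express it relative to a reference system. A minimal sketch of that calculation, with all numeric values as illustrative placeholders rather than LC-IMPACT outputs:

```python
# Sketch of how characterisation and normalisation factors are used in
# life cycle impact assessment (LCIA). All numbers below are illustrative
# placeholders, not values produced by LC-IMPACT.

def impact_score(inventory, characterisation_factors):
    """Characterised impact: sum of (emitted mass x characterisation factor)."""
    return sum(mass * characterisation_factors[substance]
               for substance, mass in inventory.items())

def normalised_score(score, normalisation_factor):
    """Express the impact relative to a reference total (e.g. Europe per year)."""
    return score / normalisation_factor

# Hypothetical inventory for one functional unit (kg emitted)
inventory = {"CO2": 12.0, "CH4": 0.05}
# Hypothetical global-warming characterisation factors (kg CO2-eq per kg)
cfs = {"CO2": 1.0, "CH4": 28.0}

score = impact_score(inventory, cfs)   # 12.0 + 0.05 * 28.0 = 13.4 kg CO2-eq
print(score)
```

Spatially explicit factors, as developed in LC-IMPACT, would simply make `cfs` depend on where each emission occurs.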
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SSH-2009-3.2.1. | Award Amount: 3.11M | Year: 2010
One of the key changes in societal trends and lifestyles witnessed over the past few years has been the move online of many consumers and the way they have become increasingly sophisticated in their media consumption habits. Have these recent changes to consumer and commercial practices developed in such a way that consumers are (in)voluntarily signing away their fundamental right to privacy? This project (CONSENT) seeks to examine how consumer behaviour and commercial practices are changing the role of consent in the processing of personal data. While consumer consent is a fundamental value on which the European market economy is based, the way consumer consent is obtained is questionable in popular user-generated content (UGC) online services (including sites like MySpace, YouTube and Facebook), whose commercial success depends to a large extent on the disclosure by their users of substantial amounts of personal data. There is an urgent need to study and analyse the changes in consumption behaviour and consumer culture arising from the emergence of UGC online services, and how contractual, commercial and technical practices and other factors affect consumer choice and attitudes toward personal privacy in the digital economy. CONSENT's multidisciplinary team intends to carry out a status quo analysis of commercial practices, legal positions and consumer attitudes, identify criteria for fairness and best practices, and then create a toolkit for policy makers and corporate counsel that will enable them to address the problems identified in the analysis. CONSENT will advance the knowledge base that underpins the formulation and implementation of policies and corporate procedures in the area of privacy and consumer protection, with a view to informing policy-making in the European Union and contributing to the development of European research communities in these areas.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: AAT.2008.6.2.1.;AAT.2008.6.2.2. | Award Amount: 7.23M | Year: 2009
The current project aims at the investigation and development of the technologies and steps necessary to conquer the grey zone between aeronautics and space in Europe, and thus to set the foundation of a new paradigm for transportation in the long term. The underlying concepts considered are a) a European space plane based on an airplane-launch approach to advance European know-how in this area, based essentially on ballistic flight experience using hybrid propulsion, and b) the same space plane envisioned to evolve into suborbital point-to-point long-distance transport in very short times by using high-energy propulsion. An alternative, vertically launched two-stage rocket space vehicle system concept is used to identify the technologies required for suborbital ultra-fast transportation. The concepts will be addressed separately and in relation to each other, as well as to those considered in other EC projects, exploiting similarities and synergies wherever possible. The concepts can be classified as having near-term and very long-term realisation prospects, and will be evaluated according to the maturity of the underlying technology, inherent risk, sustained operations, and cost. All concepts and technologies will be considered with respect to environmental issues. Some activities concern legal issues and the selection of suitable spaceports. In view of the recent agreement with Virgin Galactic, it is natural to consider the ESRANGE facility in Sweden an excellent candidate as a starting place for experimental high-altitude, high-speed flights.
Agency: European Commission | Branch: FP7 | Program: CPCSA | Phase: INFRA-2007-1.2-02 | Award Amount: 4.80M | Year: 2008
The increasing cost of experimental facilities in many research fields is driving a concentration of such facilities in a few selected places, sometimes also dictated by environmental conditions. The clear, steady skies without light pollution that astronomical observatories require are generally not easily found. In the Southern hemisphere, the best observing facilities for optical and infrared astronomy are widely acknowledged to be those of ESO. At the same time, the ever-increasing data volumes, as detectors get bigger and more complex, raise a number of problems for builders, operators and users alike. The remoteness of the facilities makes travelling from European home institutions difficult and expensive. Information technologies can offer a solution to these problems, provided the necessary infrastructure and tools are put in place. The strategic objective of this proposal is to tightly integrate the world-class facilities created in Chile by the European astronomical community into the ever-growing instrumental grid emerging worldwide. These facilities represent an investment of many hundreds of millions of euros that will be exploited in the coming decades. The present project proposes to create a physical infrastructure (and the tools to exploit it) to efficiently connect these facilities to Europe. The infrastructure will be complementary to the international infrastructures created in recent years with EC support (RedCLARA, ALICE, GEANT) and will be another step in the creation in Latin America of an advanced instrumentation grid. This will give European research a competitive edge through faster access to the collected data and ever more efficient use of the facilities.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: GARRI-5-2014 | Award Amount: 1.99M | Year: 2015
Research integrity is a growing concern among scientists, research organisations, policy makers, and the public, expressed in a proliferation of ethical codes, guidelines, and procedures. While this proliferation calls for harmonisation and coordination, there is little factual knowledge about the actual processes leading to misconduct or the effectiveness of current integrity policies. PRINTEGER analyses the incidence and individual, social, and organisational causes and dynamics of misconduct. It also analyses how institutions respond to allegations, specifically in interaction with neighbouring law, the media, complex research organisations, and systemic changes in research. From the perspective of the research work floor, including the daily work of journal editors or research managers, PRINTEGER will analyse how current instruments of integrity policy operate in practice. How do guidelines most contribute to integrity? What other instruments and procedures will promote integrity? PRINTEGER will provide concrete tools and advice to promote research integrity in Europe through four specific target groups: advice on an optimal policy mix and opportunities for harmonisation to research policy makers; best practice approaches to foster integrity for research leaders and managers; advice on the use of IT tools and organisational measures for research support organisations; practice-informed educational tools for ethical training and reflection of early career scientists. PRINTEGER uses a unique approach that looks at procedures and guidelines, but also analyses how they operate in the context of daily research practice. For this purpose, PRINTEGER gathers not only ethicists, but also very pertinent expertise that has barely informed integrity policy so far: legal studies, scientometrics, and social sciences, such as criminology and media studies; all flanked by intensive stakeholder consultation and dissemination activities to maximise impact.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: SC5-16-2016-2017 | Award Amount: 1.54M | Year: 2016
The project Towards a World Forum on Raw Materials (FORAM) will develop and set up an EU-based platform of international experts and stakeholders that will advance the idea of a World Forum on Raw Materials (WFRM) and enhance international cooperation on raw material policies and investments. The global use of mineral resources has drastically increased and supply chains have become ever more complex. A number of global initiatives and organisations have been contributing to knowledge and information transfer, including the EC, the UNEP International Resource Panel, the World Resources Forum, the World Material Forum, the OECD and others. It is widely felt that improved international resource transparency and governance would be beneficial for all, since it would lead to stability, predictability, resource efficiency and hence a better foundation for competitiveness on a sustainable basis. The FORAM project will help consolidate efforts towards a more joint and coherent approach to raw materials policies and investments worldwide by working closely with the relevant stakeholders in industry, European and international organisations, governments, academia and civil society. Synergies with relevant EU Member State initiatives will be explored and fostered. The project will in particular seek to engage the participation of G20 member countries and other countries active in the mining and other raw materials sectors, so that experiences are shared and understanding of all aspects of trade in raw materials is increased. By implementing this project, an EU-based platform of international key experts and stakeholders is created, covering the entire raw materials value chain. This platform will work together on making the current complex maze of existing raw-material-related initiatives more effective. As such, the FORAM project will be the largest collaborative effort for raw materials strategy cooperation on a global level so far.
ISIS INNOVATION Ltd and Leiden University | Date: 2014-10-13
The invention relates to an antigenic composition or vaccine comprising a viral vector, the viral vector comprising nucleic acid encoding Plasmodium protein PfLSA1, or a part or variant of Plasmodium protein PfLSA1; PfLSAP2, or a part or variant of Plasmodium protein PfLSAP2; PfUIS3, or a part or variant of Plasmodium protein PfUIS3; PfI0580c, or a part or variant of Plasmodium protein PfI0580c; and PfSPECT-1, or a part or variant of Plasmodium protein PfSPECT-1.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: KBBE-2009-1-2-13 | Award Amount: 5.09M | Year: 2010
The recent decline of the European eel (Anguilla anguilla), with no signs of recovery, has brought attention to the biologically unsustainable exploitation of the stock. In September 2007, the EU adopted Council Regulation 1100/2007 establishing measures for the recovery of the European eel stock. However, eel are still fished intensively for human consumption, while aquaculture and restocking rely exclusively on the supply of glass eels caught each year. Controlled production of eel larvae is ever more urgent. The objective of PRO-EEL is to develop standardised protocols for the production of high-quality gametes, viable eggs and feeding larvae. The approach is to expand knowledge about the intricate hormonal control and physiology of eels, which complicate artificial reproduction. This knowledge will be applied in the development of suitable methods to induce maturation under different rearing conditions. Knowledge about gametogenesis and maturation patterns will be developed in small-scale tests and applied to establish standardised fertilisation procedures. New knowledge about the functional anatomy of embryos and yolk-sac larvae will be applied to develop suitable feed. Protocols for larval production will be tested in full-scale experimental facilities managed in collaboration with a qualified SME. The integrated protocols and technology development will be evaluated relative to the output of healthy embryos and yolk-sac larvae. Larval feeds will be developed towards pioneering first feeding in European eel larvae, which would be a major breakthrough and a promising step towards self-sustained aquaculture. The strength of the project is its interdisciplinary approach and the unique expertise of the consortium. PRO-EEL brings together leading institutes in eel reproduction, complemented by excellence in disciplines filling gaps in knowledge and technology. Tight collaboration with the aquaculture industry promotes the applicability of the developed technology.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2013.2.1.1-1 | Award Amount: 14.69M | Year: 2013
Cancers are genetic diseases arising from the accumulation of multiple molecular alterations in affected cells. Large-scale genomic, transcriptomic and proteomic analyses have established comprehensive catalogues of molecules that are altered in their structure and/or abundance in malignant tumors compared to healthy tissues. Far less developed are concepts and methods to integrate data from different sources and to directly interrogate gene functions on a large scale in order to differentiate driver alterations, which directly contribute to tumor progression, from indolent passenger alterations. As a consequence, examples of successful translation of knowledge generated from omics approaches into novel clinical concepts and applications are scarce. Pancreatic cancer is a prime example of this dilemma. Representing the 4th to 5th most common cause of cancer-related deaths, it is a disease with a major socioeconomic impact. Despite enormous advances in the identification of molecular changes associated with the disease, new treatment options have not emerged. Thus, 5-year survival rates remain unchanged at a dismal 6%, the lowest among all solid tumors. Using pancreatic cancer as a model disease, the goal of this integrative project is to develop novel cellular and animal models, as well as novel strategies to generate, analyze and integrate large-scale metabolic and transcriptomic data from these models, in order to systematically characterize and validate novel targets for therapeutic intervention. In addition to the general tumor cell population, special consideration will be given to sub-populations of tumor-initiating cells, a.k.a. tumor stem cells. To this end, the consortium comprises i) SMEs with a strong focus on technology development, ii) clinical and academic partners with extensive experience in pancreatic cancer molecular biology and the management of pancreatic cancer patients, and iii) technology and data analysis experts from academic groups.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: HEALTH-2007-2.1.2-7 | Award Amount: 1.12M | Year: 2009
In contrast to the reductionist approach of Western medicine, which is based on modern anatomy, cell and molecular biology, Traditional Chinese Medicine (TCM) uses a unique theoretical system and an individualised, holistic approach to describe health and disease, based on the philosophy of Yin-Yang balance and an emphasis on harmony of functions. The two medical systems disagree with each other in many situations, as each may observe health from its own limited perspective. GP-TCM aims to inform best practice and harmonise research on the safety and efficacy of TCM, especially Chinese herbal medicines (CHM) and acupuncture, in EU Member States using a functional genomics approach, through the exchange of opinions, experience and expertise among scientists in EU Member States and China. In 10 proposed work packages, we will take action to review the current status and identify problems and solutions in the quality control, extraction and analysis of CHM. While these fundamental issues are addressed, discussion forums emphasising the use of functional genomics methodology in research on the safety, efficacy and mechanisms of CHM and acupuncture will be the core of this coordination project. This will include the application of the technique in cell-based models, animal models and clinical studies. Guidelines on good practice and agreed protocols in related research areas will be published to promote future TCM research in all EU Member States; online tools and research resources will be made available to EU Member States; EU Member States and additional China partners will be invited to join this network. The GP-TCM Research Association will be established during this project and kept running autonomously to continue the guidance and coordination of EU-China collaboration in TCM research.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 3.74M | Year: 2012
Infectious diseases caused by pathogenic micro-organisms are major causes of death, disability, and social and economic disruption for millions of people. During evolution these pathogens have developed intricate strategies to manipulate host defence mechanisms and outwit the immune system. To reduce the burden of infectious diseases, it is important to increase understanding of these host-pathogen interaction mechanisms and to develop more effective strategies for drug discovery. The zebrafish holds much promise as a high-throughput drug screening model. In the last few years, zebrafish models for studying human pathogens or closely related animal pathogens have emerged at a rapid pace. The fact that zebrafish produce large numbers of embryos, which develop externally and are optically transparent, gives unprecedented possibilities for live imaging of disease processes and is the basis of novel high-throughput drug screening approaches. In recent years, good models have been developed for toxicity, safety and efficacy screening of drugs in zebrafish embryos. However, the major bottleneck for the development of high-throughput antimicrobial drug screens has been that infection models rely on manual injection and handling of zebrafish embryos. This limiting factor has been overcome by a unique automated injection system that will be applied in this project. The FishForPharma training network brings together leading European research groups that have pioneered the use of zebrafish infection models and partners from the biotech and pharma sectors that aim to commercialise zebrafish tools for biomedical applications. FishForPharma aims to deliver proof-of-principle for drug discovery using zebrafish infectious disease models and to increase understanding of host-pathogen interaction mechanisms in order to identify new drug targets for infectious disease treatment. Most importantly, we aim to equip a cohort of young researchers with the knowledge and skills to achieve these goals.
Agency: European Commission | Branch: FP7 | Program: NoE | Phase: HEALTH.2010.4.2-2 | Award Amount: 12.67M | Year: 2011
Paediatric drugs lack appropriate testing. Most have inadequate information about dosing regimens, dose adjustment and how they should be administered. These are longstanding problems that unquestionably require concerted efforts at the international level. Both the US and the EU have introduced paediatric legislation that facilitates the participation of children in research and pharmaceutical innovation, but initiatives are not always coordinated and different approaches are often used to deal with the same problems. The main aim of GRiP will be to implement an infrastructure matrix to stimulate and facilitate the development and safe use of medicines in children. This implementation entails active coordination of knowledge management efforts and integrated use of existing research capacity, while reducing the fragmentation and duplication of activities. The consortium will primarily focus on: 1) development of a Paediatric Clinical Pharmacology Training Program; 2) validation and harmonisation of research tools specific to paediatrics; 3) sharing of strategies and plans; 4) use of ongoing/planned research studies to evaluate the feasibility of proposed research tools and strategies. GRiP brings together an exceptional range of high-quality leaders and stakeholders who are very active in EU and US paediatric medicines research. GRiP will mobilize 21 institutions as partners and at least another 16 major networks that represent several hundred clinical sites and a total of more than 1000 researchers across Europe, the US and Asia. The integration of the WHO, EMA and NIH-NICHD associated networks, including the FDA, will be a major asset not just for effective implementation of the network activities without duplication, but also for the rapid translation of GRiP deliverables into practice. This partnership will work closely with families to provide children with safe and effective medicines.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2010.4.2-1 | Award Amount: 7.63M | Year: 2011
β-thalassaemia major is one of the most severe forms of chronic congenital anaemia. The recommended treatment consists of regular blood transfusions combined with chelating therapy to remove harmful iron accumulation in the body. The use of deferoxamine, the first chelating agent, which is available only for subcutaneous administration, is limited by toxicity and lack of compliance, despite its satisfactory therapeutic effects. An oral iron-chelating agent, deferiprone, was authorised in Europe in August 1999 and recommended for the treatment of iron overload in patients with thalassaemia major when deferoxamine is contraindicated or inadequate. Despite wide experience with the administration of deferiprone to thalassaemic patients, limited data are available on its use in children below 10 years, and the need for additional data in this age subset was clearly indicated in the 2009 priority list approved by the Paediatric Committee at the European Medicines Agency (PDCO). In addition, in light of recent scientific advancements and the anticipated benefit of this chelator in controlling cardiac iron overload, studies evaluating the effects of deferiprone across all paediatric ages and in all transfusion-dependent chronic congenital anaemias (including sickle cell disease) were also considered a critical therapeutic need. The DEEP project, in line with these premises, has been funded with the specific aim of producing a new oral liquid formulation of deferiprone suitable for paediatric use and of providing evidence for the use of this chelator as first-line therapy in the whole paediatric population (from 1 month to 18 years) affected by transfusion-dependent chronic anaemia. The condition under study in the DEEP project is rare. This poses special difficulties in the conduct of the studies, due to the small patient population and the need to involve a large number of recruiting centres.
However, being dedicated to developing an orphan drug, DEEP has also been recognised in the context of IRDiRC, the International Rare Diseases Research Consortium, which is devoted to repurposing or developing 200 new drugs for rare diseases by the end of 2020. The main features of the DEEP project are: - The innovative design of the clinical studies, including pharmacokinetic modelling for the definition of the most appropriate dosage of deferiprone in younger children, cardiac MRI T2* evaluation as the primary endpoint, a three-year safety study aimed at evaluating deferiprone, in monotherapy or in combination, in the real-world setting and, for the first time, a comparative efficacy-safety trial comparing the two existing oral chelators, deferiprone and deferasirox. - The DEEP Consortium, including European and non-European countries from the Mediterranean region where transfusion-dependent congenital anaemias, in particular β-thalassaemia major, are particularly widespread: the collaboration within a multinational and multicultural network makes the project extremely challenging, due to the many different ethical, methodological and social approaches to be explored and positively addressed.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRAIA-1-2014-2015 | Award Amount: 10.23M | Year: 2015
The Europlanet 2020 Research Infrastructure (EPN2020-RI) will address key scientific and technological challenges facing modern planetary science by providing open access to state-of-the-art research data, models and facilities across the European Research Area. Its Transnational Access activities will provide access to world-leading laboratory facilities that simulate conditions found on planetary bodies, as well as to specific analogue field sites for Mars, Europa and Titan. Its Virtual Access activities will make available the diverse datasets and visualisation tools needed for comparing and understanding planetary environments in the Solar System and beyond. By providing the underpinning facilities that European planetary scientists need to conduct their research, EPN2020-RI will create cooperation and effective synergies between its different components: space exploration, ground-based observations, laboratory and field experiments, numerical modelling, and technology. EPN2020-RI builds on the foundations of the successful FP6 and FP7 Europlanet programmes, which established the Europlanet brand and built structures that will be used in the Networking Activities of EPN2020-RI to coordinate the European planetary science community's research. It will disseminate its results to a wide range of stakeholders including industry, policy makers and, crucially, both the wider public and the next generation of researchers and opinion formers, now in education. As an Advanced Infrastructure, we place particular emphasis on widening the participation of previously under-represented research communities and stakeholders. We will include new countries and Inclusiveness Member States, via workshops, team meetings, and personnel exchanges, to broaden and improve the scientific and innovation impact of the infrastructure. EPN2020-RI will therefore build a truly pan-European community that shares common goals, facilities, personnel, data and IP across national boundaries.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: HEALTH-2007-2.3.2-5 | Award Amount: 1.13M | Year: 2008
PENTA-LABNET (PL) is a coordination action aimed at improving the range of products and the clinical use of antiretrovirals (ARVs) in HIV-infected children in resource-rich and resource-limited countries. This will be achieved by building the capacity of laboratories to undertake coordinated studies on the pharmacokinetics, pharmacodynamics and pharmacogenetics of new formulations and dosing, and studies of viral and immune responses to novel regimens and strategies for using ARVs in children. PL forms a logical, necessary and cost-effective addition to the clinical-trial-focused research activities of the longstanding PENTA network, building on its existing operational infrastructure and expertise. To respond to emerging needs identified by the EU as priority areas, the aim of PL is the development of a drug-centred research platform, which will provide a complementary range of activities focused on supporting the rational selection of optimal dosages and delivery forms of ARVs, and providing the laboratory basis for evaluating new ARV strategies in children. The definition, organisation and management of integrated pharmacological and viro-immunological studies to better characterise the concentration-exposure-effect relationship will be a central activity of PL. In support of these studies, standardised data collection systems will be established, enabling linkage of clinical and laboratory data. In addition, a central biobank will be set up to provide rapid identification of samples to be used for research. The laboratory and paediatric expertise generated in PL will support rapid assessment of new and existing individual and combined ARVs. The WHO will be a key partner of PL in defining research priorities in ARV drug development and (also through PENTA's extensive international links) in rapidly disseminating results to a range of stakeholders (e.g. EMEA and industry) and supporting the rapid translation of research findings into guidelines and practice for children in all settings.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2012-1.1.25. | Award Amount: 10.98M | Year: 2013
Optical-infrared astronomy in Europe is in a state of transition and opportunity, with the goal of a viable, structured European-scale community in sight. A strong astronomical community requires access to state-of-the-art infrastructures (telescopes), equipped with the best possible instrumentation, with that access open to all on the basis of competitive excellence. Further, the community needs training in the optimal use of those facilities to be available to all. Critically, it needs a viable operational model, with long-term support from the national agencies, to operate those infrastructures. The most important need for most astronomers is open access to a viable set of medium-aperture telescopes with excellent facilities, complemented by superb instrumentation on the extant large telescopes, while working towards next-generation instrumentation on the future flagship, the European Extremely Large Telescope. OPTICON has made a substantial contribution to preparing the realisation of that ambition. OPTICON-supported R&D has developed, and is developing, critical next-generation technology to enhance future instrumentation on all telescopes. The big immediate challenge is to retain a viable set of well-equipped medium-aperture telescopes. The present project is to provide the proof of principle that such a situation is possible - a situation developed by OPTICON under its previous contracts, in collaboration with the EC-supported strategy network ASTRONET - and to set the stage for the step to full implementation.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EINFRA-1-2014 | Award Amount: 8.02M | Year: 2015
In the coming decade a significant number of the 500,000,000 European (EU/EEA) citizens will have their genome determined routinely. This will be complemented by much cheaper (currently ~20 Euro per measurement) acquisition of the metabolome of biofluids (e.g. urine, saliva, blood plasma), which will link the genotype with metabolome data that captures the highly dynamic phenome and exposome of patients. Having such low-cost solutions will enable, for the first time, the development of a truly personalised and evidence-based medicine founded on hard scientific measurements. The exposome includes the metabolic information resulting from all external influences on the human organism, such as age, behavioural factors like exercise and nutrition, and other environmental factors. Considering that the amount of data generated by molecular phenotyping exceeds the data volume of personal genomes by at least an order of magnitude, the collection of such information will place dramatic demands on biomedical data management and compute capabilities in Europe. For example, a single typical National Phenome Centre, managing only around 100,000 human samples per year, can generate more than 2 Petabytes of data during this period alone. A scale-up to sizable portions of the European population over time will require data analysis services capable of working on exabyte-scale amounts of biomedical phenotyping data, for which no viable solution exists at the moment. The PhenoMeNal project will develop and deploy an integrated, secure, permanent, on-demand, service-driven, privacy-compliant and sustainable e-infrastructure for the processing, analysis and information-mining of the massive amounts of medical molecular phenotyping and genotyping data that will be generated by metabolomics applications now entering research and the clinic.
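The data volumes quoted above are internally consistent, as a quick back-of-the-envelope check shows. The per-sample figure is derived here, not stated in the abstract, and decimal units are assumed:

```python
# Back-of-the-envelope check of the data volumes quoted in the abstract.
# Assumes decimal units (1 PB = 10**15 bytes, 1 EB = 10**18 bytes).

PB = 10**15
EB = 10**18

samples_per_year = 100_000      # one National Phenome Centre (per the abstract)
bytes_per_year = 2 * PB         # ">2 Petabytes" per year (per the abstract)

bytes_per_sample = bytes_per_year / samples_per_year   # derived: 20 GB/sample

# Hypothetical scale-up: one sample for each of the ~500 million EU/EEA citizens
population = 500_000_000
total_bytes = population * bytes_per_sample

print(bytes_per_sample / 10**9)   # 20.0 (GB per sample)
print(total_bytes / EB)           # 10.0 (exabytes)
```

At roughly 10 exabytes for a single sample per citizen, the "exabyte-scale" framing of the abstract follows directly from its own per-centre numbers.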
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRAIA-1-2014-2015 | Award Amount: 10.00M | Year: 2015
Structural biology provides insight into the molecular architecture of cells up to atomic resolution, revealing the biological mechanisms that are fundamental to life. It is thus key to many innovations in chemistry, biotechnology and medicine, such as engineered enzymes, new potent drugs, innovative vaccines and novel biomaterials. iNEXT (infrastructure for NMR, EM and X-rays for Translational research) will provide high-end structural biology instrumentation and expertise, helping expert and non-expert European users to translate their fundamental research into biomedical and biotechnological applications. iNEXT brings together leading European structural biology facilities under one interdisciplinary organizational umbrella and includes synchrotron sites for X-rays, NMR centers with ultra-high-field instruments, and, for the first time, advanced electron microscopy and light imaging facilities. Together with key partners in biological and biomedical institutions, partners focusing on training and dissemination activities, and ESFRI projects (Instruct, Euro-BioImaging, EU-OPENSCREEN and the future neutron provider ESS), iNEXT forms an inclusive, world-class European network. iNEXT joint research projects (fragment screening for drug development, membrane protein structure, and multimodal cellular imaging) and its networking, training and transnational access activities will be important for SMEs, established industries and academics alike. In particular, iNEXT will provide novel access modes to attract new and non-expert users, who are often hindered from engaging in structural biology projects by a lack of instrumentation and expertise: a Structural Audit procedure, whereby a sample is assessed for its suitability for structural studies; Enhanced Project Support, allowing users to get expert help in an iNEXT facility; and High-End Data Collection, enabling experienced users to take full benefit of the iNEXT state-of-the-art equipment.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2012.6.3-1 | Award Amount: 5.77M | Year: 2012
The proposal IDREEM will create smarter, greener growth for one of Europe's most important industrial sectors: the aquaculture industry. It will achieve this by taking waste streams that are at present lost to the environment (as pollution) and converting them into secondary raw materials for the production of high-value organisms such as seaweed and shellfish. To do this, IDREEM will develop, demonstrate and benchmark (against existing production techniques) innovative production technology for the European aquaculture industry. Aquaculture is now a major component of global food security and is the fastest-growing food production sector globally. However, the European industry is stagnating and faces real questions of economic and environmental sustainability. IDREEM will address these questions by working with a range of SME aquaculture producers across Europe to develop, deploy and quantitatively assess the new production technology. Using an integrated approach defining the environmental, economic and social impact of the new production technology, life cycle assessment and life cycle costing will be used to quantify and demonstrate the economic and environmental benefits. Alongside this process, a combined environmental and economic modelling platform will be used to provide an evidence-based decision-making framework for aquaculture producers, regulators and policy makers. Throughout the project a dedicated impact coordinator will ensure that the project is fully engaged with its wide range of stakeholders, inviting their participation from the beginning and throughout the project (specifically in the form of a project advisory committee) and ensuring that results are fed back into that community. This will ensure a rapid uptake of the new production technology across the European sector, creating opportunity and support for a range of new SME producers, processors and others up the value chain.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRAIA-01-2016-2017 | Award Amount: 10.01M | Year: 2017
Europe has become a global leader in optical-near-infrared astronomy through excellence in space- and ground-based experimental and theoretical research. While the major infrastructures are delivered through major national and multi-national agencies (ESO, ESA), their continuing scientific competitiveness requires a strong community of scientists and technologists distributed across Europe's nations. OPTICON has a proven record of supporting European astrophysical excellence through development of new technologies, through training of new people, through delivering open access to the best infrastructures, and through strategic planning for future requirements in technology, innovative research methodologies, and trans-national coordination. Europe's scientific excellence depends on continuing effort to develop and support the distributed expertise across Europe - this is essential to develop and implement new technologies and to ensure instrumentation and infrastructures remain cutting-edge. Excellence depends on continuing effort to strengthen and broaden the community, through networking initiatives to include and then consolidate European communities with more limited science expertise. Excellence builds on training actions enabling scientists from European communities which lack national access to state-of-the-art research infrastructures to compete successfully for use of the best available facilities. Excellence depends on access programmes which enable all European scientists to access the best infrastructures needs-blind, purely on competitive merit. Global competitiveness and the future of the community require early planning of long-term sustainability, awareness of potentially disruptive technologies, and new approaches to the use of national-scale infrastructures under remote or robotic control. OPTICON will continue to promote this excellence, global competitiveness and long-term strategic planning.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: ENV.2008.3.3.1.1. | Award Amount: 1.20M | Year: 2009
Many potentially hazardous compounds are traded as chemicals or incorporated as additives in products. Their release to the environment has been a concern of the EC, UN, WHO and OECD. The discussion of the assessment and management of chemicals and products led to the OECD programme Globally Harmonised System of Classification and Labelling of Chemicals (GHS). The World Summit encouraged countries to implement GHS with a view to having the system operating by 2008. The need to implement GHS on a global scale is part of EU policy. GHS aims to apply the same criteria worldwide to classify chemicals for responsible trade and handling while at the same time protecting human health. The EU will ensure the transition from the current EU Classification & Labelling (C&L) to the GHS, which harmonises with REACH. Countries like Japan and the USA have announced that they will implement GHS in the near future, and UNITAR supports other countries. However, a complete picture of the global state of implementation is not available. With the growing level of worldwide trade, we nevertheless face unsafe products on the market. Only last year, reports about toys releasing hazardous components made headlines, and Vietnam reported that all kinds of plastic get recycled and sold back to the market. This shows that global trade in a circular economy is not acceptable without globally agreed assessment methods and harmonised C&L. An ECB study revealed that the EU regulation REACH will require 3.9 million additional test animals if no alternative methods are accepted. The number of additional tests once GHS is implemented on a global scale is unknown. The CA RISKCYCLE will include experts from the OECD, UNEP and SusChem, and country experts from Asia, America and Europe. The overall objective of the project is to define, with international experts, the future R&D contributions needed for innovations in the field of risk-based management of chemicals and products from a global perspective, using alternative testing strategies to minimise animal tests.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-3.1-5 | Award Amount: 1.29M | Year: 2009
The inappropriate supply and consumption of non-prescribed medicines constitutes a public health problem of utmost importance for developed as well as developing countries. The project aims to develop new research methods and generate a scientific basis to reduce the incidence of drug-related mishaps and maximise the beneficial effect of medicines in the provision of healthcare. The project utilises a theory-specific approach to identify and understand primary care physicians' and primary care patients' behaviour towards the prescription and consumption of medicines. Grounded in the theory of planned behaviour (TPB; Ajzen, 1991), it seeks to identify predisposing behavioural factors that will enable the alteration of the problematic behaviour. This model also provides the basis for theory-guided interventions, tailored to address the behavioural components playing an influential role in the irrational prescription and consumption of medicines. In particular, the project's objectives include the assessment of the extent of OTC misuse in countries of southern Europe, the identification of factors influencing primary care physicians' and patients' intentions towards irrational prescription and misuse of medicines, as well as the design and implementation of pilot interventions with the potential to be translated into policy. Qualitative and quantitative research methods will be employed to assess predisposing factors of inappropriate prescription practices and medicine misuse in samples of primary care physicians and primary care patients. Pilot interventions will also be devised and applied. Southern European countries will benefit from the progress and know-how of the northern European countries invited to participate in the current proposal. Another benefit will be the formation of a network of various disciplines that ensures evaluation, discussion and widespread dissemination of emerging knowledge throughout European primary health care settings.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2010-ITN | Award Amount: 4.25M | Year: 2011
Gaia is the ESA cornerstone mission set to revolutionise our understanding of the Milky Way. This proposal will shape a critical mass of new expertise with the fundamental skills required to power the scientific exploitation of Gaia over the coming decade and beyond. The GREAT-ITN research theme is Unravelling the Milky Way, focused on four fundamental problems: unravelling the origin and history of our home galaxy; tracing the birthplace and understanding the astrophysical properties of the stellar constituents of our galaxy; deepening the understanding of planetary systems by linking the study of exoplanets to the origins of the solar system; and taking up the grand challenges offered by Gaia in the domains of the distance scale and the transient sky. The GREAT-ITN will deliver a training programme structured around these research themes to a core of new researchers, equipping them with the skills and expertise to become future leaders in astronomy or to enter industry. These skills are relevant across many of the key challenges facing us now, from climate change to energy security, which require well-trained people - people whom this GREAT-ITN will deliver. The 12 GREAT-ITN partners in Europe, and 1 in China, each have world-leading expertise. 19 additional associate partners provide access to complementary expertise and facilities. The network includes three associates from the Information Technology industry: Microsoft, InterSystems and The Server Labs, each driving the new global on-line agenda. The European Space Agency provides the vital interface to the Gaia project, and exposure to the space industry. This powerful combination of expertise from industry and academia will lead to a new cluster of expertise in the area of Galactic astronomy, deliver powerful and effective training to a large pool of Early Stage Researchers, and cement a sustainable research community, adding impact to Europe's leadership in this fundamental area of astrophysics.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2007-1.2-3 | Award Amount: 9.09M | Year: 2008
We aim to develop an in-vivo imaging biomarker of multidrug transporter function as a generic tool for the prediction, diagnosis, monitoring and prognosis of major CNS diseases, as well as to provide support and guidance for therapeutic interventions. Multidrug transporters actively transport substrates (including multiple CNS drugs) against concentration gradients across the blood-brain barrier (BBB). Overactivity of these efflux transporters results in inadequate access of CNS drugs to their targets and hampers the build-up of adequate tissue levels of these drugs in the brain, greatly limiting their therapeutic efficacy. As such, this transporter hypothesis of drug resistance is applicable to a broad range of CNS drugs and to patients with a variety of CNS diseases who critically depend on these drugs. Efflux transporters may also influence brain elimination of amyloid-beta (Aβ), the hallmark of Alzheimer's disease (AD). Impaired multidrug transporter function with reduced clearance of Aβ could lead to accumulation within the extracellular space, contributing to the pathogenesis of AD. We will determine the contribution of multidrug transporters to impaired brain uptake of drugs for the prediction of therapeutic responses, and the contribution of impaired transporter function to reduced clearance of toxic substances for the early in-vivo diagnosis of AD. Circumvention of pharmacoresistance, or increasing clearance, may involve inhibitors of multidrug transporters or sophisticated alternative therapies, but demonstration of overexpression or underactivity of transporter function is an essential and necessary first step. An in-vivo imaging biomarker of multidrug transporter function is essential for identifying altered transporter activity in individual patients. If a relation between overexpression and therapy resistance, or between underactivity and AD, can be demonstrated, such a biomarker will provide the means for predicting treatment response, or for early diagnosis, in individual patients.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SIS-2010-184.108.40.206 | Award Amount: 2.03M | Year: 2011
Assessment of the performance of individual researchers is the cornerstone of the scientific and scholarly workforce. It shapes the quality and relevance of knowledge production in science, technology and innovation. Currently, there is a discrepancy between the criteria used in performance assessment and the broader social and economic function of scientific and scholarly research. Additional problems in the current evaluation system are: lack of resources for qualitative evaluation due to increased scale of research; available quantitative measures are often not applicable at the individual level; and there is a lack of recognition for new types of work that researchers need to perform. As a result, the broader social functions of the scientific system are often not included in its quality control mechanisms. Academic Careers Understood through Measurement and Norms (ACUMEN) addresses these problems by developing criteria and guidelines for Good Evaluation Practices (GEP). These will be based on an ACUMEN Portfolio for individual researchers throughout the sciences, social sciences and humanities combining multiple qualitative and quantitative evidence sources. The ACUMEN Portfolio will be based on: a comparative analysis of current peer review systems in Europe; an in-depth exploration of novel peer review practices; an assessment of the use of scientometric indicators in performance evaluation; the development of new web-based indicators and web-based evaluation tools; and a comparative analysis of the implications of current and emerging evaluation practices for the career development of women researchers. ACUMEN is an integrated, comparative study in which a set of proven methods will be used on the basis of selections from one shared data set: a sample of European Research Area personnel from bibliographic and web databases as well as data harvested from websites, and data gathered through interviews and from citation indexes.
Agency: European Commission | Branch: FP7 | Program: CSA-SA | Phase: SiS.2012.2.2.1-1 | Award Amount: 3.56M | Year: 2013
The aim of this teacher training project is to help transform science and mathematics teaching practice across Europe by giving teachers new skills to engage with their students, exciting new resources and the extended support needed to effectively introduce enquiry-based learning into their classrooms. We will do this by working with teacher training institutions and teacher networks across Europe, where we will implement innovative training programmes called enquiry labs. These will be based around core scientific concepts and the emotionally engaging activity of solving mysteries, i.e. exploring the unknown. The enquiry labs will use scientists and communication professionals (e.g. actors, communication experts, etc.) to mentor teachers through the transition to using enquiry to teach science. A hub-and-spoke model for coordination and delivery allows the project both to respond to local country needs and to maintain EU-wide sharing of best practices and reporting.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: SSH-2010-2.1-1 | Award Amount: 10.11M | Year: 2011
The objective of NEUJOBS is to imagine the future, or rather various possible futures, under the conditions of the socio-ecological transition (incorporating other key influences), to map the implications for employment overall, but also in key sectors and relevant groups, and to integrate all of this under a single intellectual framework. It will do so by combining EU-wide studies based on existing datasets with small-N comparative research dealing with one or more countries. Furthermore, the output will be a mix of quantitative and qualitative analysis, foresight activities and policy analysis. The proposal is organised in 23 work packages that will run over a period of 48 months. The Consortium is composed of a team of 29 partners chosen from among the top research centres in Europe.
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2011-IRSES | Award Amount: 1.20M | Year: 2012
Asia's mounting global importance includes remarkable growth in urbanisation. Over 60 percent of the estimated 3.5 billion Asian population now live in cities. A city's most important asset is indeed its inhabitants (ADB 2008, Managing Asian Cities). If we are to address such unparalleled growth of Asian megacities, effective urban management must be informed by qualitative analytical knowledge and framed within the global, pluri-disciplinary experience that a transcontinental mobility programme such as the International Research Staff Exchange Scheme (IRSES) can provide. The challenge for urban scholars and practitioners, whether policy makers or community leaders, is to create a balance between the benefits and costs of urbanisation with a view to improving the quality of life of millions. The objective is to nurture more contextualised and policy-relevant knowledge on Asian cities, through exchanges and targeted case-study-based research among participants from the 11 partner institutions, with the European institutions playing a key role. Inspired by the new qualitative emphasis commanding European urban policy, the Urban Knowledge Network Asia intends to address critical urban development issues in Asia, taking into account the challenges of the diversity of urban societies, with their heterogeneous populations. The Urban Knowledge Network Asia aims, therefore, to study how Asian cities, taken as organic socio-spatial entities, manage their space and improve human liveability. To this end, the network put together by the International Institute for Asian Studies (IIAS) aims to host a variety of research projects covering three key areas of society in relation to the planning, management and governance of the urban environment: 1) shelter/housing (the house and the neighbourhood where people live), 2) the urban environment and its impact on living conditions, and 3) the city as a cultural nexus.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2012.2.1.1-3 | Award Amount: 7.86M | Year: 2012
Users of NGS technologies, producing large and numerous distinct types of omics data, demand statistical methods to combat data and knowledge fragmentation and inappropriate procedures for data analysis. Yet the current gap between the available tools for analysis of a single omics data-type and the requirement of biomedical scientists to understand the integrated results derived from several omics data-types threatens to increase further due to the accelerating capacity of data production. STATegra will therefore improve and develop new statistical methods enabling accurate collection and integration of omics data, while providing user-friendly packaging of STATegra tools targeting biomedical scientists. To close the gap between the present sub-optimal utilization of omics data and the power of statistics, STATegra develops statistical methods targeting efficient experimental design, data gathering, missing data, noisy data, current knowledge, meta-analysis and integrative data analysis. Importantly, STATegra facilitates understanding and use of omics data by forcing abstract concepts, including knowledge, design, dirty data, visualization, causality and integration, to be embedded in a real yet prototypical biomedical context. STATegra is positioned to ensure that the collective output of the statistical STATegra methods is relevant and subject to statistical and experimental validation and iteration. STATegra mimics a user-driven IT development strategy by involving real biomedical users as beta-testers. To deliver beyond current exploratory tools, our consortium accumulates the necessary strong statistical, technological, and molecular expertise. The strong lead by research-intensive SMEs with a proven track record in software deployment translates STATegra to a wide existing community base. STATegra accelerates the production of relevant statistical tools impacting a broad community of biomedical scientists in research, industry, and healthcare.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: COMPET-10-2014 | Award Amount: 2.00M | Year: 2015
EUSPACE-AWE uses the excitement of space to attract young people to science and technology and to stimulate European and global citizenship. Our main goal is to increase the number of young people who choose space-related careers. We shall target the diverse groups that influence career decisions, showing teenagers the opportunities offered by space science and engineering and inspiring primary-school children when their curiosity is high, their value systems are being formed and the seeds of future aspirations are sown. Activities will: 1. Acquaint young people with topical cutting-edge research and role-model engineers; 2. Demonstrate to teachers the power of space as a motivational tool and the opportunities of space careers; 3. Provide a repository of innovative peer-reviewed educational resources, including toolkits highlighting seductive aspects of Galileo and Copernicus; and 4. Set up a space career hub and contest designed to appeal to teenagers. Attention will be paid to stimulating interest amongst girls and ethnic minorities and to reaching children in underprivileged communities, where most talent is wasted. Targeting policy makers via high-impact events will help ensure sustainability and demonstrate the social value of the space programme. We maximise cost-effectiveness by 1. Piggybacking on existing ESERO and other teacher training courses and 2. Exploiting and expanding the infrastructures of proven FP7-Space projects, EU Universe Awareness for young children and Odysseus for teenagers. EUSPACE-AWE will complement existing space-education programmes and coordinate closely with ESA. We shall reach European teachers, schools and national curricula through the host organisations of ESEROs and the extensive networks of European Schoolnet, Scientix and UNAWE. Designated nodes will provide curriculum and resource localisation and test beds for professional evaluation. A partnership with the IAU Office of Astronomy for Development in Cape Town ensures global reach.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: SSH.2012.3.2-1 | Award Amount: 8.36M | Year: 2013
The main objectives of this project are to investigate the diversity of family forms, relationships, and life courses in Europe; to assess the compatibility of existing policies with these changes; and to contribute to evidence-based policy-making. The project will extend our knowledge on how policies promote well-being, inclusion and sustainable societal development among families. Our approach relies on three key premises. First, family life courses are becoming more diverse and complex. Second, individual lives are interdependent, linked within and across generations. Third, social contexts and policies shape individual and family life courses. Building upon these premises we a) explore the growing complexity of family configurations and transitions, b) examine their implications for men, women and children with respect to inequalities in life chances, intergenerational relations and care arrangements, c) investigate how policies address family diversity, d) develop short- and longer-term projections, and e) identify future policy needs. Transversal dimensions that are integrated into the project are gender, culture, socioeconomic resources and life stages. Our approach is multidisciplinary combining a wide range of expertise in social sciences, law and the humanities represented in the consortium of 25 research partners from 15 European countries, old and new member states, and three transnational civil society actors. We will conduct comparative analyses applying advanced quantitative methods to high quality register and survey data, and qualitative studies. The project will also develop a database of the legal content of family forms available in European countries, suitable for comparative analyses. Together with various stakeholders, government agencies, national and local policy-makers, non-governmental organizations and additional members of the scientific community across Europe, we will identify and disseminate innovation and best policy practices.
Agency: European Commission | Branch: FP7 | Program: | Phase: | Award Amount: 3.42M | Year: 2007
DRIVER is a multi-phase effort whose vision and primary objective is to establish a cohesive, pan-European infrastructure of Digital Repositories, offering sophisticated functionality services to both researchers and the general public. The present proposal (DRIVER Phase-II) aims to introduce key innovations compared to the original DRIVER project, while building on its results. The main novelties envisioned are: establishment of a European Confederation of Digital Repositories as a strategic arm of DRIVER; inclusion of Digital Repositories with non-textual or non-publication content, e.g., images, presentations, and possibly primary data; construction of enhanced publications, which combine interrelated information objects into a logical whole, e.g., publications coupled with relevant presentations and associated datasets; and provision of advanced functionality to address the requirements raised by the above innovations or to serve varied modes of scientists' research explorations. Additionally, DRIVER Phase-II moves from a test-bed to a production-quality infrastructure, expands the geographical coverage of Digital Repositories included in it, intensifies state-of-the-art and future-direction studies, and escalates dissemination, training, and community building activities. DRIVER Phase-II significantly broadens the horizons of the whole DRIVER endeavour regarding infrastructure operation, functionality innovation, and community relevance, and constitutes a major step on the way to the envisioned Knowledge Society.
News Article | November 13, 2016
This study was conducted following a previous discovery of multiple jet-driven winds in IC5063, which are linked to the supermassive black hole in its center (see The jet of a black hole drives multiple winds in a nearby galaxy). About 160 million years ago, charged particles (electrons/protons) that were inflowing toward the black hole were caught in magnetic field lines and ejected outward in the shape of a beam with high velocities. The beam of particles, also known as jet, propagated through the galaxy for more than 3000 light years. It went through a gas disk, driving strong winds at the points where it collided with interstellar clouds. The winds lasted for more than a half-million years, as indicated by ESO Very Large Telescope data. The scientists analyzed the ALMA data aiming to determine whether the gas in the winds has different properties than the gas in the rest of the clouds. For this purpose, they targeted emission lines of CO, originating from molecules in dense interstellar clouds, where the formation of new stars is often taking place, and where the temperature of the gas is typically ~10K. They showed that the molecular gas impacted by the black hole jet is heated, with temperatures often in the range 30K to 100K. The importance of this result lies in the impediments it poses for star formation—the increased thermal and turbulent motions of the gas delay its gravitational collapse. The gravitational collapse is further delayed by the dispersion of the clouds as the impact of the jet removes gas from dense clouds and disperses it into tenuous winds. The mass of the molecular gas in the winds is at least 2 million solar masses. Because of the energy deposited by the jet, the molecular gas is more highly excited in the winds than in the rest of the clouds. 
This result is encouraging for future studies in the field, as it indicates that the detection of molecular winds will be easier than previously thought for distant galaxies, which can only be observed in high-excitation CO lines. Consequently, scientists can evaluate the role of the winds driven by black hole jets in setting the sizes of the observed galaxies over cosmological scales. This study was published in the peer-reviewed journal Astronomy & Astrophysics on November 1, 2016. More information: K. M. Dasyra et al., "ALMA reveals optically thin, highly excited CO gas in the jet-driven winds of the galaxy IC 5063", Astronomy & Astrophysics (2016). DOI: 10.1051/0004-6361/201629689, on arXiv: https://arxiv.org/abs/1609.03421. The team of astrophysicists who worked on the study: Drs. K. Dasyra (National and Kapodistrian University of Athens, Greece), F. Combes (College de France, Observatory of Paris, France), T. Oosterloo, R. Morganti (ASTRON and the University of Groningen, The Netherlands), R. Oonk (ASTRON and Leiden University, The Netherlands), P. Salome (Observatory of Paris, France), and N. Vlahakis (National and Kapodistrian University of Athens, Greece).
News Article | December 7, 2016
Hendrik Hildebrandt from the Argelander-Institut für Astronomie in Bonn, Germany and Massimo Viola from the Leiden Observatory in the Netherlands led a team of astronomers from institutions around the world who processed images from the Kilo Degree Survey (KiDS), which was made with ESO's VLT Survey Telescope (VST) in Chile. For their analysis, they used images from the survey covering five patches of the sky with a total area of around 2200 times the size of the full Moon, and containing around 15 million galaxies. By exploiting the exquisite image quality available to the VST at the Paranal site, and using innovative computer software, the team were able to carry out one of the most precise measurements ever made of an effect known as cosmic shear. This is a subtle variant of weak gravitational lensing, in which the light emitted from distant galaxies is slightly warped by the gravitational effect of large amounts of matter, such as galaxy clusters. In cosmic shear, it is not galaxy clusters but large-scale structures in the Universe that warp the light, producing an even smaller effect. Very wide and deep surveys, such as KiDS, are needed to ensure that the very weak cosmic shear signal is strong enough to be measured and can be used by astronomers to map the distribution of gravitating matter. This study takes in the largest total area of the sky to be mapped with this technique to date. Intriguingly, the results of their analysis appear to be inconsistent with deductions from the results of the European Space Agency's Planck satellite, the leading space mission probing the fundamental properties of the Universe. In particular, the KiDS team's measurement of how clumpy matter is throughout the Universe - a key cosmological parameter - is significantly lower than the value derived from the Planck data.
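The cosmic-shear measurement described above rests on a standard weak-lensing relation that the article does not spell out: each galaxy's observed ellipticity is approximately its intrinsic ellipticity plus the lensing shear, and since intrinsic orientations are random, averaging over many galaxies isolates the shear. A schematic form (standard weak-lensing notation, not taken from the article) is:

```latex
\epsilon^{\mathrm{obs}}_i \;\simeq\; \epsilon^{\mathrm{int}}_i + \gamma ,
\qquad
\hat{\gamma} \;=\; \frac{1}{N}\sum_{i=1}^{N} \epsilon^{\mathrm{obs}}_i
\;\approx\; \gamma
\quad \text{since } \langle \epsilon^{\mathrm{int}} \rangle = 0 .
```

This is why the survey's roughly 15 million galaxies matter: the cosmic-shear signal is far smaller than typical intrinsic galaxy ellipticities, so only averages over very large samples beat the intrinsic-shape noise down to a measurable level.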
Massimo Viola explains: "This latest result indicates that dark matter in the cosmic web, which accounts for about one-quarter of the content of the Universe, is less clumpy than we previously believed." Dark matter remains elusive to detection, its presence only inferred from its gravitational effects. Studies like these are the best current way to determine the shape, scale and distribution of this invisible material. The surprise result of this study also has implications for our wider understanding of the Universe, and how it has evolved during its almost 14-billion-year history. Such an apparent disagreement with previously established results from Planck means that astronomers may now have to reformulate their understanding of some fundamental aspects of the development of the Universe. Hendrik Hildebrandt comments: "Our findings will help to refine our theoretical models of how the Universe has grown from its inception up to the present day." The KiDS analysis of data from the VST is an important step but future telescopes are expected to take even wider and deeper surveys of the sky. The co-leader of the study, Catherine Heymans of the University of Edinburgh in the UK adds: "Unravelling what has happened since the Big Bang is a complex challenge, but by continuing to study the distant skies, we can build a picture of how our modern Universe has evolved." "We see an intriguing discrepancy with Planck cosmology at the moment. Future missions such as the Euclid satellite and the Large Synoptic Survey Telescope will allow us to repeat these measurements and better understand what the Universe is really telling us," concludes Konrad Kuijken (Leiden Observatory, the Netherlands), who is principal investigator of the KiDS survey.  The international KiDS team (http://kids. ) of researchers includes scientists from Germany, the Netherlands, the UK, Australia, Italy, Malta and Canada.  
This corresponds to about 450 square degrees, or a little more than 1% of the entire sky. The parameter measured is called S8. Its value is a combination of the size of density fluctuations in, and the average density of, a section of the Universe. Large fluctuations in lower density parts of the Universe have an effect similar to that of smaller amplitude fluctuations in denser regions, and the two cannot be distinguished by observations of weak lensing. The 8 refers to a cell size of 8 megaparsecs, which is used by convention in such studies. This research was presented in the paper entitled "KiDS-450: Cosmological parameter constraints from tomographic weak gravitational lensing", by H. Hildebrandt et al., to appear in Monthly Notices of the Royal Astronomical Society. The team is composed of H. Hildebrandt (Argelander-Institut für Astronomie, Bonn, Germany), M. Viola (Leiden Observatory, Leiden University, Leiden, the Netherlands), C. Heymans (Institute for Astronomy, University of Edinburgh, Edinburgh, UK), S. Joudaki (Centre for Astrophysics & Supercomputing, Swinburne University of Technology, Hawthorn, Australia), K. Kuijken (Leiden Observatory, Leiden University, Leiden, the Netherlands), C. Blake (Centre for Astrophysics & Supercomputing, Swinburne University of Technology, Hawthorn, Australia), T. Erben (Argelander-Institut für Astronomie, Bonn, Germany), B. Joachimi (University College London, London, UK), D. Klaes (Argelander-Institut für Astronomie, Bonn, Germany), L. Miller (Department of Physics, University of Oxford, Oxford, UK), C.B. Morrison (Argelander-Institut für Astronomie, Bonn, Germany), R. Nakajima (Argelander-Institut für Astronomie, Bonn, Germany), G. Verdoes Kleijn (Kapteyn Astronomical Institute, University of Groningen, Groningen, the Netherlands), A. Amon (Institute for Astronomy, University of Edinburgh, Edinburgh, UK), A. Choi (Institute for Astronomy, University of Edinburgh, Edinburgh, UK), G. Covone (Department of Physics, University of Napoli Federico II, Napoli, Italy), J.T.A. de Jong (Leiden Observatory, Leiden University, Leiden, the Netherlands), A. Dvornik (Leiden Observatory, Leiden University, Leiden, the Netherlands), I. Fenech Conti (Institute of Space Sciences and Astronomy (ISSA), University of Malta, Msida, Malta; Department of Physics, University of Malta, Msida, Malta), A. Grado (INAF - Osservatorio Astronomico di Capodimonte, Napoli, Italy), J. Harnois-Déraps (Institute for Astronomy, University of Edinburgh, Edinburgh, UK; Department of Physics and Astronomy, University of British Columbia, Vancouver, Canada), R. Herbonnet (Leiden Observatory, Leiden University, Leiden, the Netherlands), H. Hoekstra (Leiden Observatory, Leiden University, Leiden, the Netherlands), F. Köhlinger (Leiden Observatory, Leiden University, Leiden, the Netherlands), J. McFarland (Kapteyn Astronomical Institute, University of Groningen, Groningen, the Netherlands), A. Mead (Department of Physics and Astronomy, University of British Columbia, Vancouver, Canada), J. Merten (Department of Physics, University of Oxford, Oxford, UK), N. Napolitano (INAF - Osservatorio Astronomico di Capodimonte, Napoli, Italy), J.A. Peacock (Institute for Astronomy, University of Edinburgh, Edinburgh, UK), M. Radovich (INAF - Osservatorio Astronomico di Padova, Padova, Italy), P. Schneider (Argelander-Institut für Astronomie, Bonn, Germany), P. Simon (Argelander-Institut für Astronomie, Bonn, Germany), E.A. Valentijn (Kapteyn Astronomical Institute, University of Groningen, Groningen, the Netherlands), J.L. van den Busch (Argelander-Institut für Astronomie, Bonn, Germany), E. van Uitert (University College London, London, UK) and L. van Waerbeke (Department of Physics and Astronomy, University of British Columbia, Vancouver, Canada). ESO is the foremost intergovernmental astronomy organisation in Europe and the world's most productive ground-based astronomical observatory by far.
It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world's most advanced visible-light astronomical observatory, and two survey telescopes. VISTA works in the infrared and is the world's largest survey telescope, and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become "the world's biggest eye on the sky".
News Article | November 16, 2016
Life can get in the way of science, forcing PhD students to take time out from the pursuit of knowledge and the lab. It can be a tough call for students to make. Immersed in their work with no assurances of a great job, driven to scour the literature to stay current and primed to worry about competition and impressing their advisers, many PhD students think that academic success is everything. For them, nothing comes before their studies and research programme. Breaks are risky — there is no way to ensure a smooth return to studies, funding and the bench. University policies governing gap time vary across nations, regions and institutions, and maintaining funding and research continuity can pose hurdles. Attitudes towards time off also differ widely. Many faculty members and potential future employers look askance at a doctoral student's decision to step aside, even for a brief period. Your capacity to put your PhD programme on hold will depend largely on your field, your institution and your advisers. In general, you can take a break when you need it, as long as you are prepared for the consequences — particularly if you aim to pursue an academic career. The decision could affect your reputation, publishing record and ability to stay current with your research programme. But with careful planning, there are ways to soften the blow (see 'How to take a successful break'). Few statistics exist on how often, for how long or why PhD students take time off from their studies. In the United States, neither the National Science Foundation nor the Council of Graduate Schools tracks leaves of absence or can point to a central source for such data. Some individual institutions provide estimates of how many PhD students have taken breaks each year. Heather Amos, a spokesperson for the University of British Columbia in Vancouver, Canada, says that about 50 PhD students out of nearly 4,000 across all disciplines, or about 1.25%, took a leave of absence in 2015. 
Martin Grund, spokesperson for the Max Planck Institutes' graduate-student organization, PhDnet, says that his group doesn't track leaves of absence. But, he says, internal surveys show that 7% of doctoral students at the institutes in 2012 were parents, and so had probably taken parental leave at some point. Some funding agencies allow for certain interruptions of study, including care for children and elderly people, professional development and other life needs. Some universities permit students to retain access to campus services while on leave for a variety of reasons; others have no defined policy. Anecdotally, it seems that few PhD students so much as think about a pause in their programme. “I think most don't even consider it,” says Heather Buschman, who earned a PhD in molecular pathology from the University of California, San Diego, School of Medicine after taking six months off for a US National Cancer Institute communications internship in 2006. “They think, 'I could never do that'. People are on such a focused trajectory and see any wavering as a negative.” There is a great deal of external and internal pressure to race to the finish, agrees Gareth O'Neill, a PhD candidate in linguistics at Leiden University in the Netherlands. “Doing a PhD is a relatively focused and driven occupation — once started, you just want to finish it,” he says. As a board member of the PhD Candidates' Network of the Netherlands, O'Neill is involved with an initiative called the Professional PhD Program, which helps to place PhD students who seek work experience outside academia. O'Neill says that the programme rarely receives applications from students who feel they need to stay at the bench throughout their doctoral studies, but that those who do apply sometimes experience pressure from supervisors to finish their PhD sooner. 
That pressure, he adds, is misguided or inapplicable — particularly from mid- or late-career scholars, who don't know or who don't want to admit how hard it is for new PhD students to remain in academia now. “We hope to bring about a shift of mindset,” he says. Still, when the need for a hiatus arises, some don't hesitate to take it — and then sail through their leave and back. Earlier this year, Anna Miller earned a PhD in parks, recreation and tourism management from North Carolina State University (NCSU) in Raleigh. She says that she never questioned her decision to leave her research behind for half a year, when her Brazilian fiancé was offered a postdoctoral appointment in Portugal. The travails of a long-distance relationship had become burdensome, and she wanted to join him abroad. “Academically, I was getting a bit burnt out, but it was really the strain on my personal life that was the problem,” she says. “If I was going to stay in the programme, I needed to deal with the personal part of my life.” Her adviser was concerned that she might not come back, and Miller herself says that she left for Portugal knowing that might be true. But in Lisbon, she found herself drawn to nearby parks, and started studying how they were managed, just for fun. “It was a refreshing way to look at the same questions from a different perspective and reaffirm my desire to study this subject,” she says. “I came back with new energy for being a full-time student.” Re-entry turned out to be easy. To get approval for a leave of absence, Miller and her advisers had already agreed on a formal plan for her return, charting out how she would later complete course work, research and exams. They had also predetermined how Miller's funding, which was suspended while she was away, would be reinstated. Everything unfolded as planned — and Miller became treasurer and then co-president of her department's graduate-student association. 
She also began to mentor other students and to organize career panels and other programmes. Three years on, Miller and her fiancé have since married, and she is now a resident lecturer at the School for Field Studies Center for Marine Resource Studies in the Turks and Caicos Islands. She teaches undergraduates who are studying abroad. “I can't think of any negatives of taking the time off,” she says. Others also report a positive experience. “Ideally, I'd say don't take time off, but if you do, don't judge yourself harshly,” says Jen O'Keefe, a geologist and science-education researcher at Morehead State University in Kentucky. She took a pause from her PhD studies in 2002, after a working relationship with an adviser fell apart. Ultimately, she devised a new plan that combined part-time work on her doctorate with a full-time teaching schedule. Looking back, she thinks that her research career benefited from the five-month break, which enabled her to refocus her work towards palaeoecology and curriculum and instruction, as well as a variety of other pursuits that she loves. “Everything from fly-ash geochemistry to honey studies to sinkholes,” she says. “No two PhD situations are the same. You have to do what's right for you.” Some think that their field of study smoothed the way. Benedikt Herwerth, who studies theoretical quantum physics at the Max Planck Institute of Quantum Optics in Garching, Germany, says that he had little trouble setting up two stints of paternity leave, for a total of seven months, after the birth of his daughter in February. He says that Germany's generous approach towards parental leave helped, but that his field of study might also have facilitated the interruption. “I'm not doing experiments,” he says. “It might be an advantage.” Even when there are no obstacles to taking time off, trouble might arise that complicates a student's return. 
Eleanor Harding, a doctoral candidate at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, has taken two breaks from her PhD research on how the brain processes music and language. The first was in 2011, when her mother died. Harding took several months off. “I lost my edge for quite a while,” she says. “But my adviser encouraged me to keep going.” Harding returned to work later that year and expected to earn her degree in 2013 — until an experiment fell through, which caused delays, and pregnancy complications rendered her unable to work. In May 2014, after her daughter was born, Harding returned to her research, but she found that she could not afford enough childcare to resume her studies full-time. In addition, while she had been out, other researchers had published work in her area, so she had to redirect her research to examine a narrower question that would respond to the other scientists' work. “Now, unfortunately, I'm in the middle of the pack instead of at the front,” she says. “I can't say I was the first.” A gap of just one year can put a PhD candidate behind when it comes to mastery of important technological advances, warns Kim First, president and chief executive of the recruiting firm Agency Worldwide in Encino, California. As a headhunter who searches for PhD graduates for jobs in biotechnology and pharmaceutical companies, she says that she encounters few candidates who have interrupted their doctoral programme. “The way technology is changing, taking a break can become difficult,” she says. “How do you stay cutting edge?” Other recruiters say that taking time away to have children or for other life events can hurt a researcher's scientific reputation, and that students should find ways to incorporate those obligations into their PhD programme without putting their research on pause. Some think that the stigma might be worse for women. 
Justin Schwartz, head of the materials science and engineering department at NCSU, has helped students to organize leaves of absence. When it comes to parental leave, he says, women are more likely than men to take the time off — but those who do are often terrified (sadly, with some reason, he notes) that faculty members will think that they lack the drive to be the best and will extrapolate that women aren't suited to doing science. But whether female or male, most students experience one clear consequence after taking the break: they lose momentum. Harding says that although there was a benefit to delaying her dissertation — a competing paper helped her to solve a problem in her data — she now has few job leads near her husband's medical residency in the Netherlands, and attributes that to having lost potential publications and chances to attend more conferences. “Your worth is based on quantitative measures like an impact factor,” she says. “They want people with publications. Life doesn't always cooperate.” Harding is now networking locally — getting involved, for instance, with an organization in the region that funds research into Parkinson's disease. O'Keefe wishes that the harsh judgement weren't there, but says that it seems specific to academia. “People feel badly and a lot of scientists out there judge them harshly,” she says. “There's a lot of, 'If you had to take time off, you're not really good enough to finish'.” She says that many early-career scientists she knows who interrupted their PhD programmes eschewed academic research in the end and instead accepted positions in industry or teaching. Now in her early 40s and a mother, she says that she wouldn't have done anything differently, and looks forward to expanding her research. “I was on the fast track and I was moving too fast,” she says. “A lot of good comes from taking a break and reassessing your priorities. A year off is sometimes the best thing you can do. 
The big message is, it's OK and you're not alone and you can go on to be what you want to be.”
News Article | December 7, 2016
Scientists have gained fresh insight into the nature of dark matter, the elusive material that accounts for much of the mass of the Universe. Calculations based on a study of distant galaxies, using a powerful telescope, suggest that dark matter is less dense and more smoothly distributed throughout space than previously thought. The results, from an international team of scientists, will inform efforts to understand how the Universe has evolved in the 14 billion years since the Big Bang, by helping to refine theoretical models of how it developed. Scientists studied wide-area images of the distant Universe, taken from the European Southern Observatory in Chile. They applied a technique based on the bending of light by gravity - known as weak gravitational lensing - to map out the distribution of dark matter in the Universe today. Their study represents the largest area of the sky to be mapped using this technique to date. To eliminate bias in their results, the scientists carried out three sets of calculations, including two deliberately false sets, revealing to themselves only at the end which set was real. The new results contradict previous predictions from a survey of the far-off Universe, representing a point in time soon after the Big Bang, imaged by the European Space Agency's Planck satellite. This previous study used a theoretical model to project how the Universe should appear today. The disagreement between this prediction and the latest direct measurements suggests that scientists' understanding of the evolving modern-day Universe is incomplete and needs more research. The latest study, published in Monthly Notices of the Royal Astronomical Society, was carried out by a team jointly led by the University of Edinburgh, the Argelander Institute for Astronomy in Germany, Leiden University in the Netherlands and Swinburne University of Technology in Australia, as part of an ongoing project called the Kilo Degree Survey, or KiDS. 
It was supported by the European Research Council. Dr Hendrik Hildebrandt of the Argelander Institute for Astronomy in Germany, who co-led the study, said: "Our findings will help to refine our theoretical model for how the Universe has grown since its inception, improving our understanding of the modern day Universe." Dr Massimo Viola of Leiden University in the Netherlands, who co-led the study, said: "This latest result indicates that dark matter in the cosmic web, which accounts for about one-quarter of the content of the Universe, is less clumpy than we previously believed." Professor Catherine Heymans of the University of Edinburgh's School of Physics and Astronomy, who co-led the study, said: "Unravelling what has happened since the Big Bang is a complex challenge, but by continuing to study the distant skies, we can build a picture of how our modern Universe has evolved."
News Article | March 1, 2017
VIENNA, AUSTRIA and NUREMBERG, GERMANY--(Marketwired - Mar 1, 2017) - At this year's ECR, Ziehm Imaging presents the latest generation of its long-selling product, the Ziehm Vision FD. The innovation leader celebrates "10 years of flat-panel technology" in its mobile C-arms this year, and announces the further expansion of its CMOS portfolio. The next generation of the bestselling Ziehm Vision FD: 10 years ago, Ziehm Imaging introduced the first fully digital mobile C-arm with flat-panel technology. The introduction of the Ziehm Vision FD marked a new era of image quality in mobile C-arms. "Our first mobile C-arm with flat-panel technology turned into a global success story," says Martin Törnvik, Vice President Global Sales and Marketing at Ziehm Imaging. He was in charge of the market launch of the Ziehm Vision FD in his role as product manager at the time. The first Ziehm Vision FD was delivered to Leiden University Medical Center in the Netherlands. "Back in 2006, it was a brave decision to go for the new flat-panel technology, which nobody in the field of mobile C-arms had. We trusted the engineers of Ziehm Imaging and our local partner to deliver innovation, and they did. We are proud to be the first partner hospital in the world to have bought the Ziehm Vision FD," states Paul Booijen, Facility Manager Radiology, Leiden University Medical Center. The cost-efficiency, flexibility and broad range of clinical applications convinced not only Leiden University Medical Center: more than 750 systems are installed in clinics worldwide today. At the ECR congress, the Ziehm Vision FD will be showcased with a new CMOS flat-panel detector, which displays smaller pixels at the same quality. This enables higher image resolution at the same dose, closing the gap between the image quality of conventional flat-panel detectors and the cost-efficiency of image intensifiers. 
At RSNA in Chicago, Ziehm Imaging had already presented two further mobile C-arms with CMOS technology. Due to its versatile design, the Ziehm Solo FD ensures maximum flexibility even in small operating theaters. The C-arm is ideally suited for orthopedic, trauma and pain management procedures. The Ziehm Vision RFD, featuring a powerful generator and a reliable cooling system, is the solution of choice for vascular and hybrid interventions. Ziehm Imaging will also present two of its flagship products at ECR: the Ziehm Vision RFD 3D and the Ziehm Vision RFD Hybrid Edition. The Ziehm Vision RFD 3D, the first 3D C-arm that provides a 16 cm edge length per scan volume, allows for optimized intraoperative control in orthopedic, trauma and spinal surgery interventions. The Ziehm Vision RFD Hybrid Edition is the first fully motorized mobile C-arm for the hybrid OR. The system is a space- and cost-saving alternative to permanently installed systems for highly demanding cardiovascular procedures. In addition to the mobile C-arms of Ziehm Imaging, the mini C-arms of sister company OrthoScan will be on display at the joint booth, completing the broad portfolio of intraoperative imaging solutions. The mini C-arms are mainly used in orthopedic surgical procedures such as hand and foot surgery. Visit the joint booth of Ziehm Imaging and OrthoScan: X2/09, Austria Center Vienna. About Ziehm Imaging Founded in 1972, Ziehm Imaging has stood for the development, manufacturing and worldwide marketing of mobile X-ray-based imaging solutions for more than 40 years. Employing approximately 500 people worldwide, the company is the recognized innovation leader in the mobile C-arm industry and a market leader in Germany and other European countries. 
The Nuremberg-based manufacturer has received several awards for its ground-breaking technologies and achievements, including the Frost & Sullivan Award (various years), the iF Design Award 2011 and 2016, the Top100 award for innovative mid-size companies 2012, the Stevie Awards 2013, 2014 and 2015, the German Stevie Award and the IAIR Global Awards 2014 as "Best Company for Innovation & Leadership". For more information, please visit: www.ziehm.com.
News Article | March 16, 2016
The Milky Way in the 2MASS infrared survey, similar to Hubble observations of the sky colour (near-infrared). Here, the visible stars are mostly bright giant stars. Credit: The Infrared Processing and Analysis Center (IPAC) Two astronomy students from Leiden University have mapped the entire Milky Way Galaxy in dwarf stars for the first time. They show that there are a total of 58 billion dwarf stars, of which seven per cent reside in the outer regions of our Galaxy. This result is the most comprehensive model ever for the distribution of these stars. The findings appear in a new paper in Monthly Notices of the Royal Astronomical Society. The Milky Way, the galaxy we live in, consists of a prominent, relatively flat disc with closely spaced bright stars, and a halo, a sphere of stars with a much lower density around it. Astronomers assume that the halo is the remnant of the first galaxies that fused together to form our Galaxy. To find out exactly what the Milky Way looks like, astronomers have previously made maps using counts of the stars in the night sky. Leiden Astronomy students Isabel van Vledder and Dieuwertje van der Vlugt used the same technique in their research. Rather than studying bright stars, the two students used Hubble Space Telescope data from 274 dwarf stars, which were serendipitously observed by the orbiting observatory while it was looking for the most distant galaxies in the early Universe. The particular type of star they looked at was the red dwarf of spectral class M. Dwarf stars are undersized and often have too low a mass to burn hydrogen. As warm, rather than hot, objects, they are best viewed with near-infrared cameras. Van Vledder comments: "Astronomers believe that there are very many of these stars. That makes them really quite suitable for mapping the Galaxy, even though they are so hard to find." 
To find the distribution of the M dwarfs, Van Vledder and Van der Vlugt used three density models that astronomers use to describe the flat disc and halo, both separately and combined. To calculate which model best describes the structure of the Milky Way, the students then applied the Markov Chain Monte Carlo method. Van der Vlugt describes how this works: "You let a computer program test all possible values of each parameter of your model. It then fixes the value which corresponds best with the data." The model that includes both disc and halo matched the data best. From the positions of the 274 M dwarfs in their sample, Van Vledder and Van der Vlugt inferred the existence of 58 billion dwarf stars. They were also able to accurately estimate the number of dwarfs in the halo, calculating a fraction of 7 per cent, higher than astronomers have previously found for the whole Milky Way. The students' results are important for future research with the European Space Agency's Euclid space telescope, due for launch in 2020. Like Hubble, Euclid will image the whole sky in the near-infrared. Van Vledder adds: "With our research, astronomers can now better assess whether they are dealing with a distant galaxy or a star in our own Galaxy." The students expect Euclid observations to yield an even more accurate picture of the Milky Way. Van der Vlugt and Van Vledder did the research for their bachelor's degree in Astronomy at Leiden University. They worked together with Leiden astronomers Benne Holwerda, Matthew Kenworthy and Rychard Bouwens. More information: Isabel van Vledder et al. The Size and Shape of the Milky Way Disk and Halo from M-type Brown Dwarfs in the BoRG Survey, Monthly Notices of the Royal Astronomical Society (2016). DOI: 10.1093/mnras/stw258
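The parameter-fitting approach described above can be illustrated with a minimal Metropolis-Hastings sampler, the simplest form of Markov Chain Monte Carlo. Everything in this sketch is invented for illustration: the binned star counts, the toy disc and halo profiles, and the single free parameter (the halo fraction) are not the students' actual data or model. The mechanism, however, is the same: propose parameter values at random and preferentially keep those that fit the data better.

```python
import math
import random

random.seed(42)

# Toy data: counts of stars in five distance bins (invented numbers,
# chosen only so that they sum to a sample of 274 "stars").
observed = [120, 80, 40, 20, 14]

def expected_counts(halo_fraction):
    """Toy two-component model: disc counts fall off quickly with distance,
    halo counts are spread evenly. halo_fraction is the only free parameter."""
    total = 274
    disc = [0.55, 0.28, 0.12, 0.04, 0.01]   # toy disc profile (sums to 1)
    halo = [0.20, 0.20, 0.20, 0.20, 0.20]   # toy flat halo profile
    return [total * ((1 - halo_fraction) * d + halo_fraction * h)
            for d, h in zip(disc, halo)]

def log_likelihood(halo_fraction):
    """Poisson log-likelihood of the observed counts under the toy model."""
    if not 0.0 <= halo_fraction <= 1.0:
        return float("-inf")                 # reject unphysical fractions
    ll = 0.0
    for obs, exp in zip(observed, expected_counts(halo_fraction)):
        ll += obs * math.log(exp) - exp - math.lgamma(obs + 1)
    return ll

def metropolis(n_steps=20000, step=0.05):
    """Metropolis sampler: propose a random step in parameter space and
    accept it with probability min(1, likelihood ratio). The resulting
    chain samples the posterior of the parameter."""
    x = 0.5                                  # starting guess
    ll = log_likelihood(x)
    chain = []
    for _ in range(n_steps):
        prop = x + random.gauss(0.0, step)
        ll_prop = log_likelihood(prop)
        if math.log(random.random()) < ll_prop - ll:
            x, ll = prop, ll_prop            # accept the proposal
        chain.append(x)
    return chain

chain = metropolis()
burned = chain[5000:]                        # discard burn-in samples
estimate = sum(burned) / len(burned)
print(f"posterior mean halo fraction: {estimate:.3f}")
```

In a real analysis the chain would be run over all parameters of the disc and halo density profiles at once, and competing models would be compared by how well their best-fitting parameters reproduce the data.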
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: SC5-15-2016-2017 | Award Amount: 3.00M | Year: 2016
Since the publication of the first list of Critical Raw Materials (CRM) in 2010 by the Ad-hoc Working Group on CRM, numerous European projects have addressed (part of) the CRM value chain, and several initiatives have contributed to gathering (part of) the related community into clusters and associations. This has produced important knowledge, which is unfortunately scattered across many sources. Numerous databases have also been developed, sometimes as duplicates. For the first time, SCRREEN aims at gathering European initiatives, associations, clusters and projects working on CRMs into a long-lasting Expert Network on Critical Raw Materials, including stakeholders, public authorities and civil society representatives. SCRREEN will contribute to improving the CRM strategy in Europe by (i) mapping primary and secondary resources as well as substitutes of CRMs, (ii) estimating the expected demand for various CRMs in the future and identifying major trends, (iii) providing policy and technology recommendations for actions improving the production and potential substitution of CRMs, (iv) specifically addressing the mapping and treatment standardization of WEEE and other end-of-life products, and (v) identifying the knowledge gained over recent years and easing access to these data beyond the project. The project consortium also acknowledges the challenges posed by the disruptions required to develop new CRM strategies, which is why stakeholder dialogue is at the core of SCRREEN: policy, society, R&D and industrial decision-makers are involved to facilitate the strategic, knowledge-based decision-making to be carried out by these groups. Specific attention will also be paid to informing the general public about our strong dependence on imported raw materials, the need to replace rare materials with substitutes, and the need to set up innovative and clean processes for exploration, extraction, processing and recycling.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: WASTE-3-2014 | Award Amount: 7.67M | Year: 2015
EU28 currently generates 461 million tons per year of ever more complex construction and demolition waste (C&DW), with average recycling rates of around 46%. There is still a significant loss of potentially valuable minerals, metals and organic materials all over Europe. The main goal of the HISER project is to develop and demonstrate novel cost-effective technological and non-technological holistic solutions for a higher recovery of raw materials from ever more complex C&DW, by considering circular economy approaches throughout the building value chain (from end-of-life buildings to new buildings). The following solutions are proposed: - Harmonized procedures, complemented with an intelligent tool and a supply chain tracking system, for highly efficient sorting at source in demolition and refurbishment works. - Advanced sorting and recycling technologies for the production and automated quality assessment of high-purity raw materials from complex C&DW. - Development of optimized building products (low embodied energy cements, green concretes, bricks, plasterboards and gypsum plasters, extruded composites) through the partial replacement of virgin raw materials by higher amounts of secondary high-purity raw materials recovered from complex C&DW. These solutions will be demonstrated in demolition projects and five case studies across Europe. Moreover, the economic and environmental impact of the HISER solutions will be quantified from a life cycle perspective (LCA/LCC), and policy and standards recommendations encouraging the implementation of the best solutions will be drafted. HISER will contribute to raising the amount of materials recovered from C&DW from 212 Mt in 2014 to 359 Mt in 2020 and 491 Mt by ca. 2030, on the basis of an increase in the recovery of aggregates from 40% (169 Mt) to more than 80% (394 Mt), and of wood from 31% (2.4 Mt) to 55% (5 Mt). Similarly, the unlocking of valuable raw materials currently not exploited is foreseen, namely some metals and emerging flows.
News Article | March 30, 2016
A decade ago, Dutch astronomer Frans Snik invented a simple optical device to measure the density of dust, soot and other particles, or aerosols, in the atmosphere that affect human health and the climate. He hoped to launch it into orbit around Earth aboard a satellite. But one afternoon in 2011, Snik held up a demonstration version of his device to an iPhone camera. The smartphone's screen displayed a rainbow of colours: Snik's optical device was converting incoming light into a spectrum that contained polarization information and channelling it into the camera. Snik realized that he could pair smartphones with the optical device and make the same kind of measurements that he and his colleagues planned to record from space. An idea was born. “We thought, why not make use of a technology that millions of people carry around in their pockets anyway?” By 2013, Snik and his colleagues at Leiden University in the Netherlands had given or sold a version of the optical device — called iSPEX — to more than 8,000 willing iPhone users across the country. The users followed instructions provided by an associated app to attach the optical devices to their iPhone cameras and photographed the sky in their local areas. Within a day, reams of crowdsourced spectra had stacked up in an online database, ready for analysis. It resulted in a Netherlands-wide map of atmospheric particles with unprecedented resolution (F. Snik et al. Geophys. Res. Lett. 41, 7351–7358; 2014) — several years before the proposed satellite launch and for a fraction of the original estimated cost. The team has since received funding from the European Union to repeat the project in 11 European cities. Many researchers are finding ways to exploit smartphones. Snik's project, and those of some geophysicists, astronomers and other scientists who need huge data sets, go one step further. 
They recruit citizen scientists who use their own smartphones to collect data that would be difficult — if not impossible — to obtain in conventional ways. The various internal sensors that smartphones carry, such as cameras, microphones, accelerometers and pressure gauges, coupled with user-friendly apps offer a way for the public to contribute high-quality data. “There are tons of possibilities for science,” says Travis Desell, a computer scientist at the University of North Dakota who designs research projects that run on smartphones. Scientists who want to exploit the potential of smartphones first need to assess whether the devices can obtain the measurements they need. They must then decide which software platform will optimize the proposed use, before ironing out any errors or 'bugs' in the apps that will be used to collect data. Scientists should also determine how to screen out invalid data. And they need to find ways to recruit participants. Although recruiting the public isn't complicated, thanks to social media, it can still be time-consuming. Snik and his colleagues had a head start — the iSPEX project was covered by the Dutch media, which prompted a few thousand citizens to send in requests to participate. The team drummed up a similar number of contributors by collaborating with the charity Lung Foundation Netherlands in Amersfoort, which invited participation from supporters who were concerned about the effects of aerosols on health. Even so, the iSPEX researchers had to spend a year and a half on their crowdsourcing campaign, which involved uploading instruction manuals and video tutorials to a website, posting calls for support on social media and in online publications, and answering questions. But their efforts paid off when they received more than 6,000 submissions of data. The more technical aspects of crowdsourcing data can be trickier to master, and it helps to have some technological savvy. 
Scientists will find it useful to know how to write an app or how to manufacture an inexpensive hardware 'add-on' (see 'How to create a hand-held research toolkit'). But if a researcher is not an adept programmer, help is available. Snik and his team turned to DDQ, a Netherlands-based company that creates apps that are tailored to citizen science. Researchers who lack funds for third-party support can learn to write an app themselves, thanks to a wealth of free online tutorials and discussion forums. Researchers also need to decide which software platform to select. Snik and his colleagues chose the popular Apple iOS: the physical similarity between iPhone models made it easier to design a compatible add-on. But the leading platform, Google's Android, has advantages, too. It is less strict about the nature of apps and presents fewer barriers to its instrumentation. Remote-sensing scientist Liam Gumley at the University of Wisconsin–Madison has developed an app that aims to improve weather forecasting by comparing photos of the sky taken from smartphones with satellite imagery. He has advice for anyone who is interested in smartphone-aided science: “Just do it!” Gumley recommends drawing up a set of storyboards that describe exactly what the app will do, what each screen will look like and what will happen when the user touches an onscreen control or a button. It is also a good idea, he says, to determine whether any data processing will be performed by the app or by a server online. Depending on the type of processing that is required, one might be faster than the other. Researchers must also be ready with a database that can accommodate a deluge of data. “If you release the app globally, you may get more data than you expect within days,” warns Qingkai Kong, a PhD student in seismology at the University of California, Berkeley, who is working on MyShake, a seismology app. 
After extrapolating from a small group of users to estimate how much data he and his colleagues were likely to receive, they turned to Amazon Web Services to host their database. Other available cloud-computing services include the Google Cloud Platform and Microsoft Azure. Once the data have been collected, it can be difficult to know whether they are reliable. Kong and his colleagues are refining MyShake so that it can distinguish between an actual seismic event and a user simply shaking the phone. A similar app, known as CSN-Droid and designed by scientists at the California Institute of Technology (Caltech) in Pasadena, was discontinued because it could not reliably make such distinctions. But Kong thinks that rigorous testing will reveal ways to improve MyShake's accuracy. Particle physicist Daniel Whiteson of the University of California, Irvine, is also tackling data reliability. He and his colleagues have developed an app called CRAYFIS (Cosmic Rays Found in Smartphones) that enables smartphone users to observe and record the particle debris that is generated when high-energy cosmic rays strike Earth's atmosphere (D. Whiteson et al. Preprint at http://arxiv.org/abs/1410.2895; 2014). If several hundred smartphones in a kilometre radius simultaneously detect a signal, or 'blip', the app registers the event as a cosmic-ray shower. The more blips that occur in a given radius, the greater the energy of the primary cosmic ray. But there is still the possibility that synchronous blips could originate from sources other than cosmic rays — including detector noise or ambient light. Whiteson and his team hope to rule this out by recording the metadata that accompany blips, such as their time and location. If a smartphone is left in one place to record data, the researchers will be able to characterize sources of ambient light and noise so that genuine cosmic-ray signals become readily apparent. 
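The shower-detection logic described above, several phones reporting a synchronized 'blip' within a kilometre radius, amounts to a spatiotemporal coincidence filter. The sketch below illustrates the idea; the blip format, function names and thresholds are illustrative assumptions (the real CRAYFIS trigger requires several hundred phones), not the app's actual code:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def is_shower_candidate(blips, t0, lat0, lon0,
                        radius_km=1.0, window_s=0.5, min_phones=3):
    """Flag a candidate shower if enough distinct phones report a blip
    within radius_km of (lat0, lon0) and within window_s of time t0."""
    phones = {
        b["phone_id"]
        for b in blips
        if abs(b["t"] - t0) <= window_s
        and haversine_km(b["lat"], b["lon"], lat0, lon0) <= radius_km
    }
    return len(phones) >= min_phones

# Three phones clustered near Leiden plus one distant outlier (~95 km away):
blips = [
    {"phone_id": "a", "t": 0.0, "lat": 52.160, "lon": 4.490},
    {"phone_id": "b", "t": 0.1, "lat": 52.161, "lon": 4.491},
    {"phone_id": "c", "t": 0.2, "lat": 52.159, "lon": 4.489},
    {"phone_id": "d", "t": 0.1, "lat": 53.000, "lon": 5.000},
]
```

With `min_phones=3`, the three clustered phones register as a candidate event while the distant phone is ignored; the same window-and-radius test, run against recorded metadata, is what lets stationary phones characterize ambient noise.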
More than 150,000 people worldwide have already signed up to participate in the CRAYFIS study, but before they release the app officially, the researchers want to make sure it is free of performance issues that could drive contributors away. The team is currently running a test version of the app on 1,000 phones worldwide. Despite the glitches, apps that crowdsource data are especially attractive for researchers because they can overcome issues that might prevent the collection of data. “The prospect that seismic data in large earthquakes can be obtained from consumer electronics is potentially transformative,” says Tom Heaton, a seismologist at Caltech. “One major obstacle to acquiring seismic data in a building is that the building owners are frightened by the prospect that researchers will uncover a critical safety issue.” Just as smartphones have become indispensable for many scientists' day-to-day lives, they might also prove to be transformative vehicles for some experiments. “Gone are the days when governments would invest US$10 billion to $15 billion on new types of infrastructure, so it's important to think about the infrastructure that's already been built,” Whiteson says. “Smartphones are very powerful and very flexible. It's an enormous platform that we're only now beginning to think about for science.”
The Zoryan Institute Issues Statement on the Occasion of the International Day of Commemoration and Dignity of the Victims of the Crime of Genocide and of the Prevention of this Crime: The Contribution of the Genocide Convention to the Battle Against Impunity
News Article | December 8, 2016
TORONTO, ON--(Marketwired - December 08, 2016) - On the occasion of the International Day of Commemoration and Dignity of the Victims of the Crime of Genocide and of the Prevention of this Crime, the Zoryan Institute releases an original statement from its Board Member, William A. Schabas, a renowned Professor of International Law. In this statement, Schabas explains the historical significance of this important day of commemoration. The UN resolution establishing this day was presented on behalf of 84 co-sponsors by the Republic of Armenia and adopted on December 9th, 2015. On 9 December 1948, the General Assembly of the United Nations, sitting in Paris at the Palais de Chaillot, unanimously adopted the Convention on the Prevention and Punishment of the Crime of Genocide. Three years later, after obtaining the requisite twenty ratifications, the Convention entered into force. Describing the crime of genocide as an 'odious scourge', the Preamble of the Convention states 'that at all periods of history genocide has inflicted great losses on humanity'. The Convention defines genocide as acts committed 'with intent to destroy, in whole or in part, a national, ethnical, racial or religious group, as such'. For the first fifty years or so of its existence, there was very little judicial application of the Convention. More recently, however, the definition in the Convention has been interpreted by the most important international tribunals, including the International Court of Justice, the International Criminal Court, the European Court of Human Rights and the ad hoc International Criminal Tribunals for the former Yugoslavia and Rwanda. For many years the Convention definition of genocide was criticised as being too narrow. Many atrocities, both ongoing and historic, did not seem to fit within its parameters. In the 1990s, the gaps left by the codification of genocide were filled, but not by amendment of the definition of genocide, which has remained unchanged since 1948. 
Instead, atrocities that appeared to escape the scope of the 1948 text were covered by an enlarged understanding of the cognate concept of crimes against humanity. With rare exceptions, judges have adopted a relatively restrictive interpretation of the definition, confining it to acts aimed at the intentional destruction of substantial parts of a national, ethnic, racial or religious group. Because of the broadened approach to crimes against humanity, there was no longer any need to encourage interpretative expansion of the definition of genocide. The drafting of the Genocide Convention in 1947 and 1948 was mandated by a resolution of the United Nations General Assembly adopted at its first session, in December 1946. Resolution 96(I), entitled The Crime of Genocide, states that '[m]any instances of such crimes of genocide have occurred when racial, religious, political and other groups have been destroyed, entirely or in part'. Clearly, one of the purposes of the General Assembly resolution was to confirm that genocide was already a crime under international law and not, as some have erroneously suggested, to call for its recognition in a convention. The purpose of the Convention is to impose various obligations upon States by means of a multilateral treaty, including a duty to prevent genocide, and to cooperate in prosecuting those suspected of perpetrating the crime. In February 2015, the authoritative International Court of Justice stated that the 1948 Convention cannot have any retroactive application. This is in keeping with a general principle of interpretation of international treaties. However, that does not mean, as some have argued, that acts taking place prior to 1948 cannot be described as the crime of genocide. 
For example, in the first proceedings before the International Court of Justice dealing with the Genocide Convention, in 1951, the United States's submission referred to the Roman persecution of Christians, the Turkish massacres of Armenians and the extermination of millions of Jews and Poles by the Nazis as 'outstanding examples of the crime of genocide', adding that this was 'the background when the General Assembly of the United Nations considered the problem of genocide'. In his compilation of multilateral treaties, the Secretary-General of the United Nations lists the Genocide Convention under the rubric of 'Human Rights'. Indeed, it was the first human rights treaty to be adopted by the United Nations General Assembly. Since 1948, the list of human rights treaties has grown enormously. However, for many years, the Genocide Convention stood relatively alone as the only text to make an explicit link between the protection of human rights and the prosecution and punishment of those who violate human rights. That situation has changed dramatically over the past couple of decades as the human rights activities of the United Nations have focused their attention on accountability for violations and on addressing impunity. The most important product of this process is the creation of the International Criminal Court. Today, this still young institution is actively examining situations of massive violation of human rights, including some alleged cases of genocide, in more than twenty countries. The suspected perpetrators include senior officials in four of the largest armies in the world. A century ago, the Allied Powers denounced 'these new crimes of Turkey against humanity and civilization', but post World War I efforts to prosecute the perpetrators soon ran out of momentum. Twenty-five years later, the trials at Nuremberg and Tokyo provided a measure of justice to victims of unimaginable atrocities. But the tribunals were ephemeral. 
Many years of impunity were to follow. The systems we have today, crowned by the International Criminal Court, are still far from adequate and their reach is subject to many constraints. Yet they are a huge improvement on what existed, or rather did not exist, in the past. As we commemorate the victims of all genocides on this day of December 9th, we may recall the words of Martin Luther King, who said that 'the arc of the moral universe is long, but it bends toward justice'. Professor William A. Schabas is a Professor of International Law at Middlesex University in London as well as a professor of International Human Law and Human Rights at Leiden University, Emeritus Professor of Human Rights Law at the National University of Ireland Galway and Honorary Chairman of the Irish Centre for Human Rights. He is a member of the Zoryan Institute's Academic Board of Directors and a faculty member of the Institute's annual Genocide and Human Rights Program (GHRUP). Professor Schabas holds BA and MA degrees in History from the University of Toronto and LLB, LLM and LLD degrees from the University of Montreal, as well as honorary doctorates in Law from several universities. He is the author of more than twenty books dealing in whole or in part with International Human Rights Law. He has also published more than 300 articles in academic journals, principally in the field of International Human Rights Law and International Criminal Law. His writings have been translated into Russian, German, Spanish, Portuguese, Chinese, Japanese, Arabic, Persian, Turkish, Nepali and Albanian. The Zoryan Institute and its subsidiary, the International Institute for Genocide and Human Rights Studies, is the first Armenian non-profit, international centre devoted to the research and documentation of contemporary issues with a focus on Genocide, Diaspora and Homeland. For more information, please visit www.zoryaninstitute.org. 
News Article | December 7, 2016
Flash Physics is our daily pick of the latest need-to-know developments from the global physics community selected by Physics World's team of editors and reporters. The distribution of dark matter in the universe appears to be smoother and more diffuse than previously thought – according to a study of wide-area images of the distant universe. Astronomers at the University of Edinburgh in the UK, Leiden University in the Netherlands, the Argelander Institute for Astronomy in Germany and the Swinburne University of Technology in Australia used the weak gravitational lensing of light from far-off galaxies to map the distribution of dark matter in intervening parts of the universe. The map is at odds with a prediction of the dark-matter distribution derived from measurements of the cosmic microwave background made by the Planck satellite, which trace the structure of the early universe. "Our findings will help to refine our theoretical model for how the universe has grown since its inception, improving our understanding of the modern-day universe," says Hendrik Hildebrandt of the Argelander Institute. Edinburgh's Catherine Heymans adds: "Unravelling what has happened since the Big Bang is a complex challenge, but by continuing to study the distant skies, we can build a picture of how our modern universe has evolved." The study is described in Monthly Notices of the Royal Astronomical Society. The UK's Engineering and Physical Sciences Research Council (EPSRC) has announced £60m for six new research hubs that aim to transform manufacturing in fields such as composite materials, 3D printing and medicine. The hubs, each receiving £10m, will draw together 17 universities and 200 industrial and academic partners to help turn research into products. The University of Huddersfield will lead a consortium to create a £30m Future Metrology Hub that will be based at the university's Centre for Precision Technologies and will open next year. 
"Our vision is to develop new technologies and universal methods that will integrate measurement science with design and production processes to improve control, quality and productivity," says physicist Jane Jiang, who will lead the Huddersfield hub. "These will become part of the critical infrastructure for a new generation of digital, high-value manufacturing, the so-called 4th industrial revolution, or Industry 4.0." The other hubs are led by Cardiff University (semiconductors), the universities of Nottingham (composites), Sheffield (advanced powder processes), Strathclyde (advanced crystallisation) and University College London (targeted healthcare). Pakistan will rename a physics research centre in Islamabad after the Nobel laureate Abdus Salam, who died 20 years ago. Born in what is now Pakistan, Salam shared the 1979 Nobel Prize for Physics for his work on unifying the weak and electromagnetic interactions. However, he was never fully celebrated in his native country because he was a member of the Ahmadiyya community. Now, Prime Minister Nawaz Sharif has announced that the National Centre for Physics at Quaid-i-Azam University in Islamabad will be called the Professor Abdus Salam Center for Physics. There will also be five annual fellowships named after Salam, which will be awarded to Pakistani students pursuing PhDs in physics. In addition to his Nobel prize, Salam is remembered for founding the International Centre for Theoretical Physics in Trieste, Italy, in 1964. Now called the Abdus Salam International Centre for Theoretical Physics, the centre fosters the growth of mathematical physics in developing countries.
News Article | February 7, 2017
Students will enjoy a more personalised, engaging learning experience as D2L strengthens its commitment to the Dutch market LONDON – 07 February, 2017 – D2L, a global learning technology leader, is proud to announce that the largest and oldest Dutch public technological university, Delft University of Technology (TU Delft), has selected its Brightspace Learning Management System (LMS) to deliver a more engaging and personalised learning experience to its 20,000 students. Located in Delft, Netherlands, TU Delft hosts students and scientists across eight faculties and numerous research institutes. In 2012, TU Delft, Leiden University and Erasmus University Rotterdam formed a strategic alliance to collaborate in many fields of education, research and valorisation. The University's commitment to excellence and innovation was made evident in the 2016-2017 World University Rankings, in which TU Delft jumped an impressive six places from last year to number 59, and in which the Netherlands came second only to Singapore in the overall scores of countries' higher-education sectors. Both results demonstrate the University's and the country's alignment with D2L's focus on enhancing teaching and learning through technological innovation. After 17 years using a competing LMS from Blackboard, TU Delft put the project out to tender to evaluate other providers and ensure that it could continue to invest in the most innovative learning technology on the market. TU Delft opted for a Best Value Procurement approach to the tender, inviting suppliers to propose the best solution to meet its strategic goals. This approach placed far greater emphasis on evaluating the quality and proven performance of suppliers and their solutions than traditional, requirements-only procurement. 
The Brightspace platform from D2L includes a number of features that were attractive to the University, including a mobile-friendly user experience that enables students to engage in online, blended, and competency-based education programs on a single platform. Since the platform incorporates personalised learning, teachers can deliver their lessons with much greater flexibility and give each student the personal experience they need to succeed. Additionally, Brightspace includes powerful real-time learning analytics to provide data that can help improve student outcomes. “Selecting a strong learning platform was an extremely important decision for us,” said Timo Kos, Director of Education and Student Affairs at TU Delft. “We were looking for a partner that is able to support our current education and future developments in teaching and learning. D2L took time to understand our specific challenges and ambitions and offered a solution that not only met these challenges, but exceeded them too. We are confident that we have chosen a strong, long-term partner that will work with us in our continued effort to provide the best possible collaborative learning experience for our students.” Once the rollout is complete, TU Delft will be able to harness all the benefits of Brightspace, making it easy for instructors to design courses, create content, and grade assignments. D2L’s track record of innovation, which was a central element of TU Delft’s selection of Brightspace, has been widely recognised. In March, Fast Company ranked D2L #6 on its Most Innovative Companies of 2016 list in the Data Science category, amongst Google, IBM, Spotify, Costco, and Blue Cross Blue Shield. Brightspace was also named the #1 LMS in Higher Education by Ovum Research. “We are delighted to be working with a university that shares our vision for innovation in the future of education,” said Elliot Gowans, VP EMEA at D2L. 
“TU Delft has a strong reputation and forging an alliance with such a well-respected institution is an exciting milestone for us in the Dutch market.” ABOUT D2L D2L is the software leader that makes learning experiences better. The company’s cloud-based platform, Brightspace, is easy to use, flexible, and smart. With Brightspace, organisations can personalise the experience for every learner to deliver real results. The company is a world leader in learning analytics: its platform predicts learner performance so that organisations can take action in real-time to keep learners on track. Brightspace is used by learners in higher education, secondary schools, and the corporate sector, including the Fortune 1000. D2L has operations in the United States, Canada, Europe, Australia, Brazil, and Singapore. www.D2L.com Twitter: @D2LNews @D2L_EMEA D2L PRESS CONTACT Virginia Jamieson, Vice President of PR and AR, D2L Ltd., 650-279-8619, virginia.jamieson@D2L.com, Twitter: @D2LNews The D2L family of companies includes D2L Corporation, D2L Ltd, D2L Australia Pty Ltd, D2L Europe Ltd, D2L Asia Pte Ltd, and D2L Brasil Soluções de Tecnologia para Educação Ltda. All D2L marks are trademarks of D2L Corporation. Please visit D2L.com/trademarks for a list of D2L marks.
Vedanta Biosciences Announces Clinical Translational Medicine Collaborations with Stanford University School of Medicine and Leiden University Medical Center Focused on Food Allergies, C. difficile Infection and Graft-versus-Host Disease
News Article | February 16, 2017
CAMBRIDGE, Mass.--(BUSINESS WIRE)--Vedanta Biosciences, pioneering the development of a novel class of therapies for immune and infectious diseases based on rationally designed consortia of bacteria derived from the human microbiome, today announced that it has entered into translational medicine collaborations with Stanford University School of Medicine and Leiden University Medical Center. The relationships will focus on food allergies in children and on patients with C. difficile infection or graft-versus-host disease, respectively. Both collaborations seek to better understand patterns in the microbiome that may potentially inform clinical responses to therapy. Under the terms of the agreement with Stanford, Vedanta will work in collaboration with Kari Nadeau, MD, PhD, Director of the Sean N. Parker Center for Allergy and Asthma Research at Stanford University, to analyze the potential connection between the gut microbiome and responses to oral immunotherapies in children with food allergies. With Leiden University, Vedanta will generate clinical data from interventional studies of fecal transplantation in C. difficile patients treated with donors from the Netherlands Donor Feces Bank, as well as clinical data from patients with graft-versus-host disease, in collaboration with Ed Kuijper, MD, PhD, Professor of Medical Microbiology at the Leiden University Medical Center and co-chair of the Netherlands Donor Feces Bank. The clinical data will feed into Vedanta’s leading platform for discovery, development, and GMP manufacturing of rationally designed bacterial consortia drugs. “We’re excited to announce our relationships with Stanford and Leiden University,” said Bruce Roberts, PhD, Chief Scientific Officer of Vedanta Biosciences. 
“Collaborations with leading academic centers are an important part of our strategy to support our drug development efforts with human data and with careful science.” About Vedanta Biosciences Vedanta Biosciences is pioneering the development of a novel class of therapies for immune and infectious diseases based on rationally designed consortia of bacteria derived from the human microbiome, with clinical trials expected to begin in the first half of 2017. Founded by PureTech Health (PureTech Health plc, PRTC.L) and a group of world-renowned experts in immunology and microbiology, Vedanta Biosciences is a leader in the microbiome field with capabilities to discover, develop and manufacture drugs based on live bacterial consortia. Leveraging its proprietary technology platform and the expertise of its team of scientific co-founders, Vedanta Biosciences has isolated a vast collection of human-associated bacterial strains and characterized how the immune system recognizes and responds to these microbes. This work has led to the identification of human commensal bacteria that induce a range of immune responses – including induction of regulatory T cells and Th17 cells, among others – as well as the characterization of novel molecular mechanisms of microbial-host communication. These advances have been published in leading peer-reviewed journals including Science, Nature (multiple), Cell and Nature Immunology. Vedanta Biosciences has harnessed these biological insights as well as data from clinical translational collaborations to generate a pipeline of programs in development for infectious disease, autoimmune disease, inflammation and immuno-oncology. The clinical potential of therapeutic manipulation of the microbiome has been validated by multiple randomized, controlled trials in infectious disease and inflammatory bowel disease. Vedanta Biosciences’ scientific co-founders have pioneered the fields of innate immunity, Th17 and regulatory T cell biology, and include Dr. 
Ruslan Medzhitov (Professor of Immunobiology at Yale), Dr. Alexander Rudensky (tri-institutional Professor at the Memorial Sloan-Kettering Institute, the Rockefeller University and Cornell University), Dr. Dan Littman (Professor of Molecular Immunology at NYU), Dr. Brett Finlay (Professor at the University of British Columbia) and Dr. Kenya Honda (Professor, School of Medicine, Keio University). Vedanta is backed by PureTech Health, Seventure, Invesco Asset Management, and Rock Springs Capital. Forward Looking Statement This press release contains statements that are or may be forward-looking statements, including statements that relate to the company's future prospects, developments and strategies. The forward-looking statements are based on current expectations and are subject to known and unknown risks and uncertainties that could cause actual results, performance and achievements to differ materially from current expectations, including, but not limited to, those risks and uncertainties described in the risk factors included in the regulatory filings for PureTech Health plc. These forward-looking statements are based on assumptions regarding the present and future business strategies of the company and the environment in which it will operate in the future. Each forward-looking statement speaks only as at the date of this press release. Except as required by law and regulatory requirements, neither the company nor any other party intends to update or revise these forward-looking statements, whether as a result of new information, future events or otherwise.
News Article | October 28, 2016
NDA Partners announced today that Carl Peck, MD, founder and Chairman of the company, will receive the Sheiner–Beal Pharmacometrics Award at the 118th Annual Meeting of the American Society for Clinical Pharmacology and Therapeutics (ASCPT), March 15–18, 2017, in Washington, D.C. The Sheiner–Beal award honors the memory of Drs. Lewis B. Sheiner and Stuart Beal and their pioneering contributions to the scientific discipline of pharmacometrics by recognizing outstanding achievements at the forefront of research or leadership in pharmacometrics and/or application of pharmacometric concepts and techniques to enhance research, development, regulatory evaluation, and/or utilization of therapeutic products. Dr. Sheiner was a founder and highly regarded Partner in NDA Partners before his sudden passing in 2004. “I am especially honored to receive this award as it is named after my mentors and very good friends, Professors Lewis Sheiner and Stuart Beal,” said Dr. Peck. “Lewis, who we lost in 2004, is recognized as the founder of the field of pharmacometrics, and was a pioneer in the application of modeling and simulation of clinical trials and the introduction of the ‘Learn and Confirm’ paradigm into contemporary drug development.” As a Colonel in the US Army Medical Department, Dr. Peck founded and directed the Division of Clinical Pharmacology as Professor in the Departments of Medicine and Pharmacology, Uniformed Services University, Bethesda, Maryland, from 1980 to 1987. In 1987, FDA Commissioner Frank Young appointed Dr. Peck Director of the Center for Drug Evaluation & Research (CDER), a national leadership position he held until his retirement from government service in 1993. He was promoted to Assistant Surgeon General of the United States in the Public Health Service in October 1990. During 1994, Dr. Peck lectured as “Boerhaave” Professor of Clinical Drug Research at Leiden University (The Netherlands). Dr. 
Peck was the founding Director of the Center for Drug Development Science at Georgetown University Medical Center, where he served as Professor of Pharmacology from 1994 to 2003. In 1999, Commissioner Henney presented Dr. Peck with the FDA Distinguished Alumnus Award. Sweden’s University of Uppsala conferred an Honorary Doctorate degree (Doctor Honoris Causa) in January 2002 in recognition of "outstanding contributions to the science of drug development." In 2012, he received the Gary Neal Prize for Innovation in Drug Development. Appointed Adjunct Professor in the Department of Bioengineering and Therapeutic Sciences at the University of California at San Francisco (UCSF) in 2004, Dr. Peck and colleagues founded the ongoing UCSF American Course in Drug Development and Regulatory Science in 2007. Dr. Peck is an author of more than 150 original research papers, chapters and books. He serves on numerous scientific advisory boards to academic, industry and government institutions, and is a member of several boards of directors. About ASCPT The American Society for Clinical Pharmacology and Therapeutics (ASCPT) was founded in 1900, and consists of over 2,200 professionals whose primary interest is to advance the science and practice of clinical pharmacology and translational medicine for the therapeutic benefit of patients and society. ASCPT is the largest scientific and professional organization serving the disciplines of Clinical Pharmacology and Translational Medicine. About NDA Partners NDA Partners is a strategy consulting firm specializing in expert product development and regulatory advice to the medical products industry and associated service industries such as law firms, investment funds and government research agencies. 
The highly experienced Principals and Premier Experts of NDA Partners include three former FDA Center Directors; the former Chairman of the Medicines and Healthcare Products Regulatory Agency (MHRA) in the UK; an international team of more than 100 former pharmaceutical industry and regulatory agency senior executives; and an extensive roster of highly proficient experts in specialized areas including nonclinical development, toxicology, pharmacokinetics, CMC, medical device design control and quality systems, clinical development, regulatory submissions, and development program management. Services include product development and regulatory strategy, expert consulting, high-impact project teams, and virtual product development teams.
News Article | March 2, 2017
GOTHENBURG, Sweden, March 2, 2017 - Immunicum AB (publ; First North Premier: IMMU.ST) a biopharmaceutical company advancing a novel immuno-oncology treatment against a range of solid tumors, today announced the appointment of Sijme Zeilemaker as Senior Director Business Development. Mr. Zeilemaker joins Immunicum with a breadth of experience in science-based business transactions and a refined knowledge and understanding of oncology-based biotech companies. He will report to Carlos de Sousa, CEO of Immunicum. "Sijme's previous experiences in each layer of the business development process and the oncology landscape will greatly contribute to our ability as a leadership team to more actively engage with potential partners, a key aspect of our corporate strategy," said Carlos de Sousa, CEO of Immunicum. "His insight and network of cancer-focused biotechnology companies as well as connections to big pharma will provide the fundamental support needed for Immunicum as we advance our programs and further evolve as a company." "I am very much looking forward to being an active part of the Immunicum team," said Sijme Zeilemaker. "I truly believe in the great potential of Immunicum's immuno-oncology therapeutic approach and highly appreciate the opportunity to join and support the company in its next stage of development." Sijme Zeilemaker joins Immunicum having most recently served as Director Business Development at InteRNA Technologies where he supported the preclinical oncology company in connecting with pharmaceutical and biotechnology companies, licensing technologies and exploring grant opportunities. Sijme also served as Head of Business for 2-BBB Medicines and Business Development Manager for to-BBB technologies where he provided partnering support and attracted over €7.5 million in non-dilutive funding. Sijme obtained a Master's degree in Biomedical Sciences from Leiden University, The Netherlands. 
About Immunicum AB (publ) Immunicum AB (First North Premier: IMMU.ST) is a clinical stage company developing novel immuno-oncology therapies against a range of solid tumors. The Company's lead compound, INTUVAX® is currently being evaluated in clinical trials for the treatment of kidney cancer, liver cancer and gastrointestinal stromal tumors. INTUVAX® was designed to combine the best of two worlds: a cost-effective cell-based (allogeneic) and off-the-shelf therapy that is capable of triggering a highly personalized and potentially long-lasting immune response against tumor cells throughout the body. www.immunicum.com For more information, please contact: Carlos de Sousa, CEO, Immunicum Ph: +46 (0) 31 41 50 52 E-mail: firstname.lastname@example.org The information in this press release is disclosed pursuant to the EU Market Abuse Regulation. The information was released for public disclosure through the agency of the company's contact person on March 2, 2017 at 8.00 am CET
Bhaseen M.J.,King's College London |
Doyon B.,King's College London |
Lucas A.,Harvard University |
Schalm K.,Harvard University |
Schalm K.,Leiden University
Nature Physics | Year: 2015
Characterizing the behaviour of strongly coupled quantum systems out of equilibrium is a cardinal challenge for both theory and experiment. With diverse applications ranging from the dynamics of the quark-gluon plasma to transport in novel states of quantum matter, establishing universal results and organizing principles out of equilibrium is crucial. We present a universal description of energy transport between quantum critical heat baths in arbitrary dimension. The current-carrying non-equilibrium steady state (NESS) is a Lorentz-boosted thermal state. In the context of gauge/gravity duality this reveals an intimate correspondence between far-from-equilibrium transport and black hole uniqueness theorems. We provide analytical expressions for the energy current and the generating function of energy current fluctuations, together with predictions for experiment. © 2015 Macmillan Publishers Limited. All rights reserved.
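The lowest-dimensional case of this universal description can be stated compactly. In 1+1 dimensions the steady-state energy current between two conformal baths takes a standard closed form (quoted here for orientation, following the Bernard–Doyon CFT result; \(c\) is the central charge and \(T_L, T_R\) the left and right bath temperatures):

```latex
J \;=\; \frac{\pi c\, k_B^2}{12\,\hbar}\left(T_L^2 - T_R^2\right),
\qquad
T_s \;=\; \sqrt{T_L T_R},
```

where the NESS itself is a thermal state at the effective temperature \(T_s\), Lorentz-boosted so that left- and right-moving excitations carry the temperatures of the baths they emanate from; the higher-dimensional expressions in the abstract reduce to this form for \(d = 1\).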
Van Der Molen A.J.,Leiden University |
Hovius M.C.,Onze Lieve Vrouwe Gasthuis
American Journal of Roentgenology | Year: 2012
OBJECTIVE. To present a problem-based algorithm for the work-up of patients with hematuria. Since the 2010 Dutch Guideline on Hematuria was problem-based, it serves as an illustration of such an approach. CONCLUSION. The work-up of hematuria should be individualized and risk-based. Given the a priori low likelihood of cancer in hematuria, risk categories should be established and imaging algorithms should be tailored to populations at low, medium, and high risk of developing urothelial cancer. © American Roentgen Ray Society.
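The risk-based logic described above can be sketched in code. This is a hypothetical illustration only: the risk factors, thresholds, and modality assignments below are illustrative assumptions, not the actual 2010 Dutch Guideline algorithm.

```python
# Hypothetical sketch of a risk-stratified imaging algorithm for hematuria.
# Risk factors and modality choices are illustrative assumptions,
# not the published guideline.

def choose_imaging(age, smoker, gross_hematuria):
    """Assign a risk category and a matching imaging strategy."""
    # Count simple illustrative risk factors for urothelial cancer.
    risk_factors = sum([age >= 60, smoker, gross_hematuria])
    if risk_factors == 0:
        return ("low", "renal/bladder ultrasound")
    elif risk_factors == 1:
        return ("medium", "ultrasound plus cystoscopy")
    else:
        return ("high", "CT urography plus cystoscopy")

# A low-risk patient gets the least invasive work-up.
print(choose_imaging(age=45, smoker=False, gross_hematuria=False))
```

The point of the sketch is structural: rather than one fixed imaging pathway, the work-up branches on an up-front risk assessment, reserving the most sensitive (and most burdensome) imaging for the high-risk group.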
Roep B.O.,Leiden University |
Peakman M.,King's College London
Cold Spring Harbor Perspectives in Medicine | Year: 2012
Type 1 diabetes is characterized by recognition of one or more β-cell proteins by the immune system. The list of target antigens in this disease is ever increasing and it is conceivable that additional islet autoantigens, possibly including pivotal β-cell targets, remain to be discovered. Many knowledge gaps remain with respect to the disorder's pathogenesis, including the cause of loss of tolerance to islet autoantigens and an explanation as to why targeting of proteins with a distribution of expression beyond β cells may result in selective β-cell destruction and type 1 diabetes. Yet, our knowledge of β-cell autoantigens has already led to translation into tissue-specific immune intervention strategies that are currently being assessed in clinical trials for their efficacy to halt or delay disease progression to type 1 diabetes, as well as to reverse type 1 diabetes. Here we will discuss recently gained insights into the identity, biology, structure, and presentation of islet antigens in relation to disease heterogeneity and β-cell destruction. © 2012 Cold Spring Harbor Laboratory Press; all rights reserved.
Roep B.O.,Leiden University |
Tree T.I.M.,King's College London
Nature Reviews Endocrinology | Year: 2014
Type 1 diabetes mellitus (T1DM) is the result of autoimmune destruction of pancreatic β cells in genetically predisposed individuals with impaired immune regulation. The insufficiency in the modulation of immune attacks on the β cells might be partly due to genetic causes; indeed, several of the genetic variants that predispose individuals to T1DM have functional features of impaired immune regulation. Whilst defects in immune regulation in patients with T1DM have been identified, many patients seem to have immune regulatory capacities that are indistinguishable from those of healthy individuals. Insight into the regulation of islet autoimmunity might enable us to restore immune imbalances with therapeutic interventions. In this Review, we discuss the current knowledge on immune regulation and dysfunction in humans that is the basis of tissue-specific immune regulation as an alternative to generalized immune suppression.
Ament L.J.P.,Leiden University |
Van Veenendaal M.,Argonne National Laboratory |
Van Veenendaal M.,Northern Illinois University |
Devereaux T.P.,SLAC |
And 2 more authors.
Reviews of Modern Physics | Year: 2011
In the past decade, resonant inelastic x-ray scattering (RIXS) has made remarkable progress as a spectroscopic technique. This is a direct result of the availability of high-brilliance synchrotron x-ray radiation sources and of advanced photon detection instrumentation. The technique's unique capability to probe elementary excitations in complex materials by measuring their energy, momentum, and polarization dependence has brought RIXS to the forefront of experimental photon science. Both the experimental and theoretical RIXS investigations of the past decade are reviewed, focusing on those determining the low-energy charge, spin, orbital, and lattice excitations of solids. The fundamentals of RIXS as an experimental method are presented and then the theoretical state of affairs, its recent developments, and the different (approximate) methods to compute the dynamical RIXS response are reviewed. The last decade's body of experimental RIXS data and its interpretation is surveyed, with an emphasis on RIXS studies of correlated electron systems, especially transition-metal compounds. Finally, the promise that RIXS holds for the near future is discussed, particularly in view of the advent of x-ray laser photon sources. © 2011 American Physical Society.
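The quantity underlying essentially all theoretical treatments of RIXS is the Kramers–Heisenberg cross-section, a standard textbook expression reproduced here for orientation (\(\mathcal{D}\) and \(\mathcal{D}'\) are the transition operators for absorption and emission; \(|g\rangle\), \(|n\rangle\), \(|f\rangle\) the ground, intermediate, and final states; \(\Gamma_n\) the intermediate-state lifetime broadening):

```latex
\frac{d^2\sigma}{d\omega\, d\Omega} \;\propto\;
\sum_f \left| \sum_n
\frac{\langle f | \mathcal{D}'^{\dagger} | n \rangle\,
      \langle n | \mathcal{D} | g \rangle}
     {E_g + \hbar\omega_{\mathrm{in}} - E_n + i\Gamma_n}
\right|^2
\delta\!\left(E_g - E_f + \hbar\omega_{\mathrm{in}} - \hbar\omega_{\mathrm{out}}\right)
```

The resonant denominator is what gives the technique its element and orbital selectivity: tuning \(\hbar\omega_{\mathrm{in}}\) to an absorption edge selectively enhances scattering through specific intermediate states.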
Scott D.L.,King's College London |
Wolfe F.,University of Kansas |
Huizinga T.W.J.,Leiden University
The Lancet | Year: 2010
Rheumatoid arthritis is characterised by persistent synovitis, systemic inflammation, and autoantibodies (particularly to rheumatoid factor and citrullinated peptide). 50% of the risk for development of rheumatoid arthritis is attributable to genetic factors. Smoking is the main environmental risk. In industrialised countries, rheumatoid arthritis affects 0·5-1·0% of adults, with 5-50 per 100 000 new cases annually. The disorder is most typical in women and elderly people. Uncontrolled active rheumatoid arthritis causes joint damage, disability, decreased quality of life, and cardiovascular and other comorbidities. Disease-modifying antirheumatic drugs (DMARDs), the key therapeutic agents, reduce synovitis and systemic inflammation and improve function. The leading DMARD is methotrexate, which can be combined with other drugs of this type. Biological agents are used when arthritis is uncontrolled or toxic effects arise with DMARDs. Tumour necrosis factor inhibitors were the first biological agents, followed by abatacept, rituximab, and tocilizumab. Infections and high costs restrict prescription of biological agents. Long-term remission induced by intensive, short-term treatment selected by biomarker profiles is the ultimate goal. © 2010 Elsevier Ltd.
Windecker S.,University of Bern |
Bax J.J.,Leiden University |
Myat A.,King's College London |
Stone G.W.,Columbia University Medical Center |
Marber M.S.,King's College London
The Lancet | Year: 2013
Over the past five decades, management of acute ST-segment elevation myocardial infarction (STEMI) has evolved substantially. Current treatment encompasses a systematic chain of network activation, antithrombotic drugs, and rapid instigation of mechanical reperfusion, although pharmacoinvasive strategies remain relevant. Secondary prevention with drugs and lifestyle modifications completes the contemporary management package. Despite a tangible improvement in outcomes, STEMI remains a frequent cause of morbidity and mortality, justifying the quest to find new therapeutic avenues. Ways to reduce delays in doing coronary angioplasty after STEMI onset include early recognition of symptoms by patients and prehospital diagnosis by paramedics so that the emergency room can be bypassed in favour of direct admission to the catheterisation laboratory. Mechanical reperfusion can be optimised by improvements to stent design, whereas visualisation of infarct size has been improved by developments in cardiac MRI. Novel treatments to modulate the inflammatory component of atherosclerosis and the vulnerable plaque include use of bioresorbable vascular scaffolds and anti-proliferative drugs. Translational efforts to improve patients' outcomes after STEMI in relation to cardioprotection, cardiac remodelling, and regeneration are also being realised.
Dame R.T.,Leiden University |
Kalmykowa O.J.,Leiden University |
Grainger D.C.,University of Warwick |
Grainger D.C.,University of Birmingham
PLoS Genetics | Year: 2011
The Escherichia coli chromosome is organized into four macrodomains, the function and organisation of which are poorly understood. In this review we focus on the MatP, SeqA, and SlmA proteins that have recently been identified as the first examples of factors with macrodomain-specific DNA-binding properties. In particular, we review the evidence that these factors contribute towards the control of chromosome replication and segregation by specifically targeting subregions of the genome and contributing towards their unique properties. Genome sequence analysis of multiple related bacteria, including pathogenic species, reveals that macrodomain-specific distribution of SeqA, SlmA, and MatP is conserved, suggesting common principles of chromosome organisation in these organisms. This discovery of proteins with macrodomain-specific binding properties hints that there are other proteins with similar specificity yet to be unveiled. We discuss the roles of the proteins identified to date as well as strategies that may be employed to discover new factors. © 2011 Dame et al.
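One simple strategy for flagging candidate macrodomain-specific factors, sketched here as a hypothetical illustration (not the authors' pipeline, and with made-up coordinates rather than real E. coli annotations), is to compare the fraction of a protein's binding sites that fall inside a macrodomain with the fraction expected from that macrodomain's share of the genome:

```python
# Hypothetical enrichment check for macrodomain-specific binding.
# Site positions and domain coordinates are illustrative, not real annotations.

def enrichment(sites, domain_start, domain_end, genome_len):
    """Ratio of observed to expected binding-site density inside the domain."""
    inside = sum(domain_start <= s < domain_end for s in sites)
    observed = inside / len(sites)                     # fraction of sites in domain
    expected = (domain_end - domain_start) / genome_len  # domain's share of genome
    return observed / expected

# 8 of 10 sites inside a domain covering 20% of the genome -> 4-fold enrichment.
sites = [100, 150, 200, 250, 300, 350, 400, 450, 5000, 6000]
print(enrichment(sites, 0, 2000, 10000))
```

In practice such a ratio would be paired with a significance test against randomized site positions, but even this toy version captures the core idea of macrodomain-specific distribution.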
Kleijn S.E.F.,Leiden University |
Lai S.C.S.,University of Warwick |
Lai S.C.S.,MESA Institute for Nanotechnology |
Koper M.T.M.,Leiden University |
Unwin P.R.,University of Warwick
Angewandte Chemie - International Edition | Year: 2014
Metal nanoparticles (NPs) find widespread application as a result of their unique physical and chemical properties. NPs have generated considerable interest in catalysis and electrocatalysis, where they provide a high surface area to mass ratio and can be tailored to promote particular reaction pathways. The activity of NPs can be analyzed especially well using electrochemistry, which probes interfacial chemistry directly. In this Review, we discuss key issues related to the electrochemistry of NPs. We highlight model studies that demonstrate exceptional control over the NP shape and size, or mass-transport conditions, which can provide key insights into the behavior of ensembles of NPs. Particular focus is on the challenge of ultimately measuring reactions at individual NPs, and relating the response to their structure, which is leading to imaginative experiments that have an impact on electrochemistry in general as well as broader surface and colloid science. Revealing electrochemistry: Key issues related to the electrochemistry of nanoparticles are being uncovered through innovative techniques capable of relating activity and structure, ultimately at the level of a single nanoparticle. Recent advances in experimental approaches are discussed and assessed, with particular emphasis on those that enhance the fundamental understanding of electrocatalysis and nanoscale electrochemistry. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
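The surface-area-to-mass advantage mentioned above is simple geometry: for a spherical particle the specific surface area is A/m = 3/(ρr), so it grows as 1/r. A quick back-of-the-envelope calculation (assuming platinum, density ≈ 21.45 g/cm³, for concreteness):

```python
# Specific surface area of a spherical nanoparticle: A/m = 3 / (rho * r).
# Density of platinum (~21.45 g/cm^3) is assumed for illustration.

def specific_surface_area_m2_per_g(radius_nm, density_g_cm3=21.45):
    r_cm = radius_nm * 1e-7                    # 1 nm = 1e-7 cm
    area_cm2_per_g = 3.0 / (density_g_cm3 * r_cm)
    return area_cm2_per_g * 1e-4               # cm^2/g -> m^2/g

# Halving the radius doubles the area per gram: ~70 m^2/g at r = 2 nm
# versus ~14 m^2/g at r = 10 nm.
for r in (1, 2, 10):
    print(f"r = {r:>2} nm -> {specific_surface_area_m2_per_g(r):.0f} m^2/g")
```

These tens of square metres per gram are why nanoparticle catalysts make such efficient use of scarce noble metals, and why activity measurements must be normalized carefully to surface area rather than mass alone.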
Roep B.O.,Leiden University |
Peakman M.,King's College London
Current Opinion in Immunology | Year: 2011
The field of Type 1 diabetes research has been quick to embrace the era of translational medicine. Building upon some 30 years of intense immunological research, the past decade has been marked by a series of clinical trials designed to evaluate the potential beneficial effects of a range of immune intervention and prevention strategies [1••,2-5]. At the heart of Type 1 diabetes is an autoimmune process, the consequence of which is immune-mediated destruction of islet β-cells. Although understanding the pathogenesis of islet autoimmunity is critical, there are also good reasons to focus research onto the β-cell destructive process itself. Measuring preservation of function of insulin-producing cells is currently the best means available to evaluate potential beneficial effects of immunotherapy, but there is an urgent need to discover and monitor immunological correlates of this β-cell destructive process. Whilst the best approach to intervention and prevention has yet to emerge, it is logical that future attempts to intelligently design therapeutics for Type 1 diabetes will need to be predicated on a clear understanding of the process of β-cell destruction and the immune components involved. For these reasons, this review will focus on the role of diabetogenic T lymphocytes in this disease-defining event. © 2011 Elsevier Ltd.
Mill J.,University of Exeter |
Mill J.,King's College London |
Heijmans B.T.,Leiden University
Nature Reviews Genetics | Year: 2013
The epigenome has been heralded as a key 'missing piece' of the aetiological puzzle for complex phenotypes across the biomedical sciences. The standard research approaches developed for genetic epidemiology, however, are not necessarily appropriate for epigenetic studies of common disease. Here, we discuss the optimal execution of population-based studies of epigenetic variation, which will contribute to the emerging field of 'epigenetic epidemiology' and emphasize the importance of establishing a causal role in pathology for disease-associated epigenetic changes. We propose that improved understanding of the molecular mechanisms underlying human health and disease are best achieved through carrying out studies of epigenetics in populations as a part of an integrated functional genomics strategy. © 2013 Macmillan Publishers Limited. All rights reserved.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2009-2.1.1-1 | Award Amount: 15.31M | Year: 2010
In recent years, the zebrafish has emerged as a new vertebrate model organism for biomedical research which offers a unique combination of traits: a short generation time, small size and efficient breeding procedures make it the best choice among vertebrates for forward genetic screening and small-molecule screens, including toxicology, while the transparent embryo and larva offer unique opportunities for imaging of cell movement and gene expression in a developing organism. Building on recent advances in the zebrafish field, we will conduct high-throughput phenotyping of at least a thousand regulatory genes relevant for common human diseases, by behavioural assays (for viable mutants), 3D / 4D imaging and expression profiling (including high-throughput sequencing). We will include mutants generated by TILLING and by the new zinc finger nuclease method, as well as mutants from earlier forward-genetics screens. A phenotyping effort of this scale has never been undertaken before in any vertebrate organism. Complementing the study of mutants relevant for neurological disorders, we will produce an atlas of gene expression in the brain, the most comprehensive one in a vertebrate. We will further perform a genome-wide characterisation of regulatory elements of potential disease genes by a combination of bioinformatics and transgenics. Small-molecule screening for mutant rescue or disease-relevant processes will identify candidate drugs and provide insights into gene function. Our increasing knowledge on the regulators and their interactions with regulatory targets will be integrated with knowledge at cellular and organismic level. By capitalising on the virtues of the zebrafish system, this systems biology approach to the regulome will gain unique knowledge complementing ongoing work in mammalian systems, and provide important new stimuli for biomedical research.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: CO-CREATION-09-2016 | Award Amount: 2.00M | Year: 2017
The KNOWMAK project aims at developing a web-based tool which provides interactive visualisations and state-of-the-art indicators on knowledge co-creation in the European Research Area (ERA). It is structured around three integrative elements: research topics, by developing ontologies around Societal Grand Challenges and Key Enabling Technologies; actors, with a focus on the quadruple helix and the involvement of societal actors in knowledge co-creation; and geographical spaces, with a focus on multiple levels (metropolitan, regional, national and European) and their interconnectedness. The tool combines three main data sources: established indicators of scientific and technological knowledge production based on scientific publications and patents; information on knowledge in the making derived from research project descriptions; and information on social innovation projects and user attention to knowledge production derived from the Internet and from social media. The integrative elements (topics, actors, space) allow for the interlinking of data items to produce a characterisation of different dimensions of knowledge in the making. KNOWMAK will be tailored to the needs of four specific user groups: policy-makers; regional actors and representatives of civil society; the business sector; and managers of public research organisations and universities. User groups will be involved in the design of the system and the specification of the indicators and visualisations to be provided. This user-centred approach will ensure responsiveness of the tool to the (changing) needs of relevant stakeholders in the ERA. Moving beyond existing approaches to S&T indicators, the project will design and implement, thanks to an experienced consortium, a consistent infrastructure in which different types of data sources are interlinked and mobilised to produce a rich set of indicators and visualisations responding to the needs of specific user groups.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 4.65M | Year: 2009
European scientists lead the world in the modelling of the formation of cosmic structures using computer simulations. The objective of the CosmoComp proposal is to reinforce Europe's world standing in this field by training the next generation of computational cosmologists. CosmoComp builds on and extends existing research collaborations between major European centres, and has a global element with links to Latin America and the Far East. New training capacity will be developed through the network activities, which will benefit early-stage researchers from across Europe, beyond the network members. We propose a series of ground-breaking 'grand-challenge' simulations which use state-of-the-art numerical techniques on some of the largest supercomputers available in Europe. Sun Microsystems and Microsoft will actively participate in our training programme, ensuring that CosmoComp will prepare young people for a research career in academia or industry.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENV.2010.3.1.3-1 | Award Amount: 4.92M | Year: 2011
The recycling of end-of-life concrete into new concrete is one of the most interesting options for reducing worldwide natural resource use and the emissions associated with the building materials sector. The production of the cement used in concrete, for example, is responsible for at least 5% of worldwide CO2 emissions. On-site reuse of clean silica aggregate from old concrete saves natural resources and reduces transport and dust, while the re-use of the calcium-rich cement paste has the potential to cut carbon dioxide emissions in the production of new cement by a factor of two. In order to achieve this goal, a new system approach is studied in which automatic quality control assesses and maintains high standards of concrete demolition waste from the earliest stage of recycling, and novel breaker/sorting technology concentrates silica and calcium effectively into separate fractions at low cost (Figure 1.1). Finally, the smaller calcium-rich fraction, which is typically also rich in fine organic residues, is converted into new binding agents by thermal processing, and mixed with the aggregate into new mortar. Alongside the technological advances, certification and design guidelines are developed to use the recycled concrete in a responsible and optimal way. The project aims to develop three innovative technologies for recycling end-of-life concrete, integrate them with state-of-the-art demolition and building processes and procedures, and test the new system approach on two Dutch concrete towers involving 70,000 tons of concrete. A special feature of this large case study is a new type of government contract which links the recycling of the towers to the re-use of the recycled materials in new buildings. The results of the project will be used to determine which kinds of strategies and policies are most effective to facilitate an efficient transition towards optimal value recovery from Construction and Demolition Waste and sustainable building.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 2.47M | Year: 2010
The NanoCTM network will tackle major challenges in the theory of nanoelectronics. Ten internationally leading European theory-of-condensed-matter groups from nine different countries [including one of Europe's leading industrial electronics-research groups (QinetiQ)] have joined forces as full participants, combining theoretical expertise in nanowires, quantum dots, carbon-based electronics, and spintronics, along with interaction and proximity effects in small dimensions. Our highly integrated approach to nanoscale transport will represent a major step towards the realisation of future scalable nanotechnologies and processes. In the longer term, the insights gained will contribute to the fabrication of novel functional nanoscale architectures and their integration into a higher hierarchical level. System parameters such as electric field, light, temperature or chemical reactivity are envisaged as possible drivers of future nanoelectronic devices.
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2010-IRSES | Award Amount: 434.70K | Year: 2011
The ChemBioFight project aims to explore natural resources for the discovery of bioactive therapeutic molecules against leishmaniasis and Chagas disease. This will be accomplished through the establishment of an extended scientific network between European and South American research entities. Already assembled, highly diverse chemical libraries will be employed for the determination of active natural scaffolds, leading to the focused collection of biomaterial (plants, marine organisms, fungi, endophytes) from local diversity hot-spots. Automated, high-throughput and advanced techniques will be incorporated for the extraction process as well as the isolation and identification of natural products. Sophisticated approaches such as metabolomics and chemical profiling will contribute to the discovery of novel active compounds and will be used to conduct dereplication procedures. Semi-synthetic derivatives of lead compounds will also be produced, aiming at the optimization of favorable biological properties via medicinal chemistry. In every step of the proposed workflow, the obtained samples (extracts, isolated compounds, synthetic derivatives) will be evaluated in vitro and/or in vivo for their antileishmanial and antitrypanosomal activity. Within the aforementioned context, an extensive network of secondments, with both educational and experimental attributes, will be established. Core scientific knowledge is expected to be produced and exchanged, with the prospect of creating partnerships with future scientific potential. All partners will participate in the dissemination procedure through teaching activities, workshops and international conferences, leading overall to mutual transfer of know-how. Finally, all procedures will be effectively monitored by a management team ensuring effectiveness and prompt achievement of objectives.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FETPROACT-01-2016 | Award Amount: 7.98M | Year: 2017
A novel concept for a photo-electro-catalytic (PEC) cell able to directly convert water and CO2 into fuels and chemicals (CO2 reduction) and oxygen (water oxidation) using exclusively solar energy will be designed, built, validated, and optimized. The cell will be constructed from cheap multifunction photo-electrodes able to transform sun irradiation into an electrochemical potential difference (expected efficiency > 12%); ultra-thin layers and nanoparticles of metal or metal oxide catalysts for both half-cell reactions (expected efficiency > 90%); and state-of-the-art membrane technology for gas/liquid/products separation, to match a theoretical target solar-to-fuels efficiency above 10%. All parts will be assembled to maximize performance in pH > 7 solution and at moderate temperatures (50-80 °C), so as to take advantage of the high stability and favorable kinetics of the constituent materials in these conditions. To achieve this goal we will improve the state of the art of all components for the sake of cell integration: 1) Surface sciences: metal and metal oxide catalysts (crystals or nanostructures grown on metals or silicon) will be characterized for water oxidation and CO2 reduction through atomically resolved experiments (scanning probe microscopy) and spatially averaged surface techniques, including surface analysis before, after and in operando during electrochemical reactions. Activity and performance will be correlated to composition, thickness, structure and support so as to determine the optimum parameters for device integration. 2) Photoelectrodes: This unique surface knowledge will be transferred to the processing of catalytic nanostructures deposited on semiconductors through different methods to match the surface chemistry results through viable up-scaling processes. 
Multiple thermodynamic and kinetic techniques will be used to characterize and optimize the performance of the interfaces, with spectroscopy and photo-electrochemistry tools to identify the best matching between light absorbers and chemical catalysts under optimum working conditions (pH, temperature, pressure). 3) Modeling: Materials, catalysts and processes will be modeled with computational methods as a pivotal tool to understand photo-catalytic electrodes and to bring them to their theoretical limits in terms of performance. The selected optimum materials and environmental conditions as defined from these parallel studies will be integrated into a PEC cell prototype. This design will include ion exchange membranes and gas diffusion electrodes for product separation. Performance will be validated in real working conditions under sun irradiation to assess the technological and industrial relevance of our A-LEAF cell.
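As a rough illustration of how the component targets above relate to the overall goal, the efficiencies can be multiplied into a simple solar-to-fuels budget. The combination rule (plain multiplication of component efficiencies, ignoring membrane and separation losses) is an assumption for illustration, not the project's own device model:

```python
# Toy efficiency budget for a PEC cell: overall solar-to-fuels efficiency
# approximated as the product of the photo-electrode and catalyst
# efficiencies quoted in the abstract. This neglects other loss channels,
# so it is an upper-bound sketch, not a device simulation.

def solar_to_fuels(photoelectrode_eff, catalyst_eff):
    """Overall efficiency if component losses simply multiply."""
    return photoelectrode_eff * catalyst_eff

overall = solar_to_fuels(0.12, 0.90)  # >12% light-to-potential, >90% catalytic
print(f"{overall:.1%}")  # about 10.8%, consistent with the >10% target
```

Even this crude budget shows why both component targets must be met at once: dropping the photo-electrode to 10% or the catalysts to 80% pushes the product below the 10% goal.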
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INT-08-2015 | Award Amount: 2.67M | Year: 2016
Ten years after its inception, the European Neighbourhood Policy (ENP) has fallen short of accomplishing its mission. The war in Ukraine and the rising tensions with Russia have made a re-assessment of the ENP both more urgent and more challenging. EU-STRAT will address two questions: First, why has the EU fallen short of creating peace, prosperity and stability in its Eastern neighbourhood? Second, what can be done to strengthen the EU's transformative power in supporting political and economic change in the six Eastern Partnership (EaP) countries? Adopting an inside-out perspective on the challenges of transformation that the EaP countries and the EU face, EU-STRAT will: develop a conceptual framework for the varieties of social orders in EaP countries to explain the propensity of domestic actors to engage in change; investigate how bilateral, regional and global interdependencies shape the scope of action and the preferences of domestic actors in the EaP countries; de-centre the EU by studying the role of selected member states and other external actors active in the region; evaluate the effectiveness of the Association Agreements and alternative EU instruments, including scientific cooperation, in supporting change in the EaP countries; analyse normative discourses used by the EU and Russia to enhance their influence over the shared neighbourhood; and formulate policy recommendations to strengthen the EU's capacity to support change in the EaP countries by advancing different scenarios for developmental pathways. EU-STRAT features an eleven-partner consortium including six universities, three think-tanks, one civil society organization and one consultancy. This consortium will achieve the research and policy-relevant objectives of the project by bringing together various disciplinary perspectives and methodologies and strengthening links with academics and policy makers across six EU member states, Switzerland and three of the EaP countries.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-03-2015 | Award Amount: 5.10M | Year: 2015
Common mechanisms and pathways in stroke and Alzheimer's disease. It has long been recognized that stroke and Alzheimer's disease (AD) often co-occur and have an overlapping pathogenesis. As such, these two diseases are considered not merely 'fellow travelers' but rather 'partners in crime'. This multidisciplinary consortium includes epidemiologists, geneticists, radiologists and neurologists with a longstanding track record on the etiology of stroke and AD. This project aims to improve our understanding of the co-occurrence of stroke and AD. An essential concept of our proposal is that stroke and AD are sequential diseases that have overlapping pathophysiological mechanisms in addition to shared risk factors. We will particularly focus on these common mechanisms and disentangle when and how they diverge into causing either stroke, or AD, or both. Another important concept is that the mechanisms under study will not only include the known pathways of ischemic vasculopathy and CAA; we will also explore and unravel novel mechanisms linking stroke and AD. We will do so by exploiting our vast international network to link various big datasets and by incorporating novel analytical strategies with emerging technologies in the fields of genomics, metabolomics, and brain MR imaging.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SPA.2013.2.1-02 | Award Amount: 3.25M | Year: 2013
Assessing the habitability of Mars and detecting life, if it was ever there, depends on knowledge of whether the combined environmental stresses experienced on Mars are compatible with life and whether a record of that life could ever be detected. However, our current ability to make these assessments is hampered by a lack of knowledge of how the combined effects of different environmental stresses influence the survival and growth of organisms. In particular, many combinations of stress, such as high radiation combined with high salt and low temperature, relevant for early Mars, have not been investigated. Furthermore, a lack of experimental studies on how anaerobic microorganisms respond to such stresses undermines our knowledge of Mars as a location for life, since the planet is essentially anoxic. Even if life can be shown to be potentially supported on Mars, there exist no systematic studies of how organisms would be preserved. MASE proposes to address these limitations in our knowledge and advance our ability to assess the habitability of Mars and detect life. In particular, MASE intends to: - Isolate and characterise anaerobic microorganisms from selected sites that closely match environmental conditions that might have been habitable on early Mars. - Study their responses to realistic combined environmental stresses that might have been experienced in habitable environments on Mars. - Investigate their potential for fossilisation on Mars and their detectability, by carrying out a systematic study of the detectability of artificially fossilised organisms exposed to known stresses. Cross-cutting aspects of i) optimised methodologies for sample management and experimental process and ii) optimised methodologies for life detection will also be thoroughly considered. MASE will allow us to gain knowledge on Mars habitability and on the adaptation of life to extremes; it will also present opportunities to optimise mission operations and life detection.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EO-2-2015 | Award Amount: 2.99M | Year: 2016
With the start of the Sentinel era, an unprecedented amount of Earth Observation (EO) data will become available. Currently there is no consistent, extendible and adaptable framework to integrate observations from different sensors in order to obtain the best possible estimate of the land surface state. MULTIPLY proposes a solution to this challenge. The project will develop an efficient and fully traceable platform that uses state-of-the-art physical radiative transfer models, within advanced data assimilation (DA) concepts, to consistently acquire, interpret and produce a continuous stream of fully characterised, high spatial and temporal resolution estimates of land surface parameters. These inferences on the state of the land surface will result from the coherent joint interpretation of the observations from the different Sentinels, as well as other third-party missions (e.g. Proba-V, Landsat, MODIS). The framework allows users to exchange components as plug-ins according to their needs. The proposal is based on the EO-LDAS concepts developed within several ESA-funded projects, which have shown the feasibility of producing estimates of land surface parameters by combining different sets of observations through the use of radiative transfer models. We will provide a fully generic, flexible data retrieval platform for Copernicus services that provides integrated and consistent data products in an easily accessible virtual machine with advanced visualisation tools. Users will be engaged throughout the process and trained. Moreover, user demonstrator projects include applications to crop monitoring & modelling, forestry, biodiversity and nature management. Another user demonstrator project involves providing satellite operators with an opportunity to cross-calibrate their data to the science-grade Sentinel standards.
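The core data-assimilation idea of combining observations from different sensors, weighted by their uncertainties, can be sketched with a generic inverse-variance estimator. This is a textbook illustration of the principle only, not the EO-LDAS or MULTIPLY implementation, and the sensor values below are invented:

```python
# Inverse-variance weighting: a minimal stand-in for how data assimilation
# merges observations of the same land-surface quantity from different
# sensors into one best estimate with reduced uncertainty.

def combine(observations):
    """observations: list of (value, variance) pairs -> (estimate, variance)."""
    weights = [1.0 / var for _, var in observations]
    total = sum(weights)
    estimate = sum(w * v for w, (v, _) in zip(weights, observations)) / total
    return estimate, 1.0 / total

# e.g. a hypothetical leaf-area-index retrieval from two sensors:
# one precise (variance 0.4) and one noisier (variance 0.9)
est, var = combine([(3.0, 0.4), (3.6, 0.9)])
print(round(est, 3), round(var, 3))
```

The combined estimate lands between the two measurements, closer to the more precise one, and its variance is smaller than either input, which is exactly the benefit the abstract claims for joint interpretation of multiple Sentinels.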
Agency: European Commission | Branch: FP7 | Program: NoE | Phase: HEALTH.2010.2.1.2-2 | Award Amount: 15.96M | Year: 2011
Biological processes occur in space and time, but current experimental methods for systems biology are limited in their ability to resolve this spatiotemporal complexity of life. In addition, traditional omics methods often suffer from limited sensitivity and need to average over populations of cells at the expense of cell-to-cell variation. Next-generation systems biology therefore requires methods that can capture data and build models in four dimensions, three-dimensional space and time, and needs to address dynamic events in single living cells. In fact, recent advances in automated fluorescence microscopy, cell microarray platforms, highly specific probes, quantitative image analysis and data mining provide a powerful emerging technology platform to enable systems biology of the living cell. These imaging technologies, here referred to as 'systems microscopy', will be a cornerstone for next-generation systems biology to elucidate and understand complex and dynamic molecular, sub-cellular and cellular networks. As a paradigm to enable systems biology at the cellular scale of biological organization, this NoE will have as its core biological theme two basic but complex cellular processes that are highly relevant to human cancer: cell division and cell migration. Methods, strategies and tools established here will be applicable to many disease-associated processes and will be instrumental for obtaining a systems-level understanding of the molecular mechanisms underlying human diseases as manifested at the living-cell level. Through close multidisciplinary collaborations in our programme of joint activities, this NoE will develop a powerful enabling platform for next-generation systems biology and will apply these tools to understand cellular systems underlying human cancer. This provides a unique opportunity for Europe to acquire a global lead in systems microscopy.
Verzijden M.N.,Lund University |
ten Cate C.,Leiden University |
Servedio M.R.,University of North Carolina at Chapel Hill |
Kozak G.M.,University of Illinois at Urbana - Champaign |
And 2 more authors.
Trends in Ecology and Evolution | Year: 2012
Learning is widespread in nature, occurring in most animal taxa and in several different ecological contexts and, thus, might play a key role in evolutionary processes. Here, we review the accumulating empirical evidence for the involvement of learning in mate choice and the consequences for sexual selection and reproductive isolation. We distinguish two broad categories: learned mate preferences and learned traits under mate selection (such as bird song). We point out that the context of learning, namely how and when learning takes place, often makes a crucial difference to the predicted evolutionary outcome. Factors causing biases in learning and when one should expect the evolution of learning itself are also explored. © 2012 Elsevier Ltd.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: SSH.2013.5.2-1 | Award Amount: 6.40M | Year: 2014
The project Advancing The European Multilingual Experience (AThEME) takes an integrated approach towards the study of multilingualism in Europe by incorporating and combining linguistic, cognitive and sociological perspectives; by studying multilingualism in Europe at three different levels of societal magnitude, viz. the individual multilingual citizen, the multilingual group, and the multilingual society; and by using a palette of research methodologies, ranging from fieldwork methods to various experimental techniques and advanced EEG/ERP technologies. This integrated approach towards the study of multilingualism is grounded in the idea that multilingualism in Europe has many facets. AThEME will cover the different forms of multilingualism in Europe by developing new lines of inquiry on regional/minority languages, heritage languages, languages spoken by bi-/multi-lingual speakers with communicative disorders, and languages spoken by bi-/multi-linguals at different stages of development and life. These lines of inquiry will provide (partial) answers to fundamental questions, including: What does it mean to be bilingual? How and why do people succeed or fail in learning another language? How can we help speakers maintain their regional/heritage language and reach proficient multilingualism? What are the reciprocal effects of multilingualism and cognition? Are there cognitive benefits of multilingualism for senior citizens? How does multilingualism interact with communicative disorders? Which societal factors have a major impact on successful maintenance of regional/heritage languages? Answers to these questions provided within the context of AThEME will provide a firm basis for assessing existing public policies and practices within major areas such as education and health, and contribute to evidence-based policy-making. 
AThEME aims to raise societal awareness of multilingualism through building on the successful model of academic public engagement provided by the program Bilingualism Matters.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: SPA.2010.2.1-03 | Award Amount: 2.53M | Year: 2011
EU-UNAWE responds to the outreach part of the Coordinating Action for FP7-SPACE-2010-1 and meets the specific requirements of the call. EU-UNAWE exploits inspirational aspects of astronomy and space to interest very young disadvantaged children in science and technology, broaden their minds and stimulate European and global citizenship. The proposal builds on Universe Awareness (UNAWE), a unique, innovative and proven programme for children aged 4 to 10 years. It will exploit the achievements of European (EU) and South African (SA) space sciences to inspire, excite and stimulate young children, when their curiosity is high and their value systems are being formed. Specifically, EU-UNAWE will: - Train and empower primary school teachers in 6 countries to include astronomy and space topics in the classroom. - Develop and translate hands-on material, where appropriate emphasising EU and SA science and technology. - Provide a network for exchange of expertise and material between educators. - Lay the groundwork for expansion of the programme throughout the EU, Associated Countries and ICP Countries. - Act as a showcase for EU and SA astronomy/space and related technologies, by disseminating the products among very young children, their teachers and their families. - Use astronomy/space products to stimulate awareness and strengthen public support for EU and SA space science research and technology. - Stimulate the next generation of EU and SA engineers and scientists, particularly girls. - Contribute to the integration of disadvantaged communities in participating countries. - Strengthen collaboration between EU and SA over mutually beneficial scientific, technological, educational and social topics. - Provide significant added value for Europe's expenditure on astronomy and space sciences for a modest incremental cost. Pooling the complementary expertise and resources of 6 partners gives a project whose whole is greater than the sum of its parts.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.8.4 | Award Amount: 8.63M | Year: 2010
The Collective Experience of Empathic Data Systems (CEEDS) project will develop novel, integrated technologies to support human experience, analysis and understanding of very large datasets.

Making use of humans' implicit processing abilities

CEEDs will develop innovative tools to exploit theories showing that discovery is the identification of patterns in complex data sets by the implicit information-processing capabilities of the human brain. Implicit human responses will be identified by the CEEDs system's analysis of its sensing systems, tuned to users' bio-signals and non-verbal behaviours. By associating these implicit responses with different features of massive datasets, the CEEDs system will guide users' discovery of patterns and meaning within the datasets.

Immersion in synthetic reality spaces

To achieve this goal, users will be immersed in synthetic reality spaces (SRS), allowing them to explore complex data whilst following narrative structures of varying spatio-temporal complexity. Unobtrusive multi-modal wearable technologies will be developed in the project for users to wear whilst experiencing the SRS. These will provide an assessment of the behavioural, physiological and mental states of the user.

Two brains are better than one: collective experience

Individuals' pattern-detection abilities will be augmented by linking multiple users together, creating a collective discovery system. Components of the CEEDs system will be integrated using generalized architectures from network robotics, creating a genuinely novel approach to massive distributed synthetic reality applications.

Making a practical difference

CEEDs' effectiveness will be validated through studies involving stakeholders from science, history and design. The consortium envisages genuine benefits from the CEEDs system. 
Think, for example, of a young pupil using CEEDs being able to see complex patterns in an astronomy data set, patterns which without CEEDs would only be perceptible to an experienced professor. By unleashing the power of the subconscious, CEEDs will make fundamental contributions to human experience. When we look back to life before CEEDs, we may liken our experience to living with our eyes closed.

Enriching theory across disciplines

On the theoretical level, CEEDs targets a novel integrated computational and empirical framework, merging the delivery of presence with the study of consciousness, its underlying sub-conscious factors and creativity. To do this, CEEDs will follow a multi-disciplinary approach that will significantly further the state of the art across science, engineering and the humanities. By bringing together a team of leading experts in psychology, computer science, engineering, mathematics, and other key disciplines, CEEDs will build the foundations for key developments in future confluent technologies.
Agency: European Commission | Branch: FP7 | Program: CSA | Phase: INFRA-2012-3.3. | Award Amount: 974.10K | Year: 2012
There are over a billion PCs in the world. Most of these PCs can be found in citizens' homes and, to a lesser extent, in universities, and most of them remain idle most of the time. About 1 million of them are active in supporting science in a volunteer computing grid and use their idle time to run scientific applications.

The potential growth of this computing capacity is enormous. Many Desktop Grids have therefore decided to found the International Desktop Grid Federation (IDGF) to help each other improve their e-Infrastructures.

The IDGF-Support Project will give the IDGF a boost in two important areas. Firstly, it will help considerably with increasing the number of citizens that donate computing time to e-Science. It will do so through targeted communication activities and by setting up a network of ambassadors. Secondly, it will help universities' e-infrastructures to include otherwise idle PCs from their classrooms and offices. In addition, IDGF-SP will collect and analyse data that will help deploy idle PCs in an effective and energy-efficient way. It has been shown that Desktop Grids can contribute to Green IT if used in the correct way. IDGF-SP will collect data to underpin and advocate best practices.

As a result of IDGF-SP, the number of citizen volunteers donating computing to e-Science will increase significantly. By employing unused PCs in private Desktop Grids, universities and other research organisations will save on the costs of providing computing capacity for their scientists. IDGF-SP will help strengthen the co-operation amongst Desktop Grid e-Infrastructure operators. IDGF-SP will encourage and help IDGF Desktop Grid providers to integrate their infrastructures into the main e-Science environment. The existence of a lively, active IDGF community assures the swift take-up of the IDGF-SP project results.
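The volunteer-computing pattern described above, in which a server hands out independent work units and idle PCs return results when they get around to them, can be sketched in a few lines. The class and scheduling policy below are illustrative assumptions, not BOINC or IDGF code:

```python
# Toy sketch of a volunteer-computing work queue: a coordinator holds
# pending work units; each idle PC fetches one, computes, and submits
# the result. Task names and the computation are placeholders.
from collections import deque

class WorkQueue:
    def __init__(self, tasks):
        self.pending = deque(tasks)
        self.results = {}

    def fetch(self):
        """A volunteer PC asks for a work unit when it goes idle."""
        return self.pending.popleft() if self.pending else None

    def submit(self, name, result):
        """The PC reports its result back to the coordinator."""
        self.results[name] = result

queue = WorkQueue([("unit-1", 6), ("unit-2", 7)])
while (task := queue.fetch()) is not None:
    name, n = task
    queue.submit(name, n * n)  # stand-in for a scientific computation
print(queue.results)
```

Real systems add redundancy (sending each unit to several volunteers and cross-checking results) and deadlines for units that never come back, which is where most of the engineering effort in Desktop Grid middleware goes.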
Agency: European Commission | Branch: FP7 | Program: CSA | Phase: INFRA-2012-3.3. | Award Amount: 2.65M | Year: 2012
Metabolomics is an important phenotyping technique for molecular biology and medicine. It assesses the molecular state of an organism or collections of organisms through the comprehensive quantitative and qualitative analysis of all small molecules in cells, tissues, and body fluids. Metabolic processes are at the core of physiology. Consequently, metabolomics is ideally suited as a medical tool to characterise disease states in organisms, and as a tool to assess organisms for their suitability in, for example, renewable energy production or for biotechnological applications in general.

We now see the emergence of metabolomics databases and repositories in various subareas of metabolomics and the emergence of large general e-infrastructures in the life sciences. In particular, the BioMedBridges project is set to link a variety of European Strategy Forum on Research Infrastructures (ESFRI) projects, such as ELIXIR and BBMRI.

Metabolomics generates large and diverse sets of analytical data and therefore imposes significant challenges for the above-mentioned e-infrastructures.

We will therefore develop policies to ensure that metabolomics data is:

1. Encoded in open standards to allow barrier-free and wide-spread analysis.
2. Tagged with a community-agreed, complete set of metadata (minimum information standard).
3. Supported by a communally developed set of open-source data management and capturing tools.
4. Disseminated in open-access databases adhering to the above standards.
5. Supported by vendors and publishers, who require deposition upon publication.
6. Properly interfaced with data in other biomedical and life-science e-infrastructures (such as ELIXIR, BioMedBridges, EU-Openscreen).

In order to achieve this, we have assembled the COSMOS (COordination of Standards in MetabOlomicS) consortium of leading European groups in metabolomics, and we will interface with all interested players in the metabolomics community world-wide and beyond.
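To make point 2 of the list above concrete, a minimum-information metadata record is essentially a dictionary with a community-agreed set of required fields that tools can validate before deposition. The field names below are hypothetical, not the actual COSMOS standard; mzML, however, is a real open encoding for mass-spectrometry data:

```python
# Illustrative minimum-information record for a metabolomics study,
# plus a trivial validator of the kind a repository could run on upload.
# Field names are invented for illustration, not a published standard.
record = {
    "study_id": "MTBL-EXAMPLE-001",
    "organism": "Homo sapiens",
    "sample_type": "plasma",
    "instrument": "LC-MS",
    "data_format": "mzML",   # an open, vendor-neutral encoding (point 1)
    "license": "open-access",
}

REQUIRED = {"study_id", "organism", "sample_type", "instrument", "data_format"}

def validate(rec):
    """Return the set of required fields missing from a record."""
    return REQUIRED - rec.keys()

missing = validate(record)
print("valid" if not missing else f"missing: {sorted(missing)}")
```

The value of agreeing such a schema community-wide is that validation, search and cross-repository interlinking (point 6) can then be automated rather than negotiated per database.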
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2012-1.1.2. | Award Amount: 7.12M | Year: 2014
The RISIS project aims at creating a distributed research infrastructure to support and advance science and innovation studies. This will give the field a strong scientific push forward and, at the same time, provide a radically improved evidence base for research and innovation policies, for research evaluation, and for the quality of policy-relevant indicators. The field of science and innovation studies is interdisciplinary, related to political science, sociology, management and economics. It has a strong quantitative core - with specialties such as scientometrics, technometrics and, more broadly, indicator design - but for many important questions data were lacking or small-scale only. This has made the field overly dependent on a few pre-existing datasets. However, during the last decade important efforts have been undertaken to develop new datasets on burning issues such as industrial R&D globalisation, patenting activities of firms, university performance, Europeanisation through joint programming, or the dynamics of nano S&T. Another new characteristic of the field is the development, together with computer scientists, of software platforms for collecting, integrating and analysing ever more data. Data and platforms are currently owned and/or located at many different organizations, such as individual research groups, companies, and public organizations, with very restricted access for others. Through deploying various networking and access strategies, and through joint research, RISIS will decisively open, harmonize, integrate, improve, and extend their availability, quality and use.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2008-1.1.1 | Award Amount: 9.65M | Year: 2009
SYNTHESYS IA will aid in the evolution of a European resource through the creation of an accessible, integrated infrastructure for researchers in the natural sciences in Europe and globally. By focusing the JRA on DNA extraction, SYNTHESYS IA will increase the opportunities for Users to exploit a largely untapped facet of the 337-million-strong collections. Users will be able to play an active role in generating new knowledge based on molecular and morphological studies. A range of new services and improved access, both physical and digital, will be provided to a broad range of scientific Users (from biological and geological related disciplines) in a consistent and more easily accessible way. The new tools to be developed and disseminated will give Users the chance to pursue new avenues for independent studies at the leading edge of biodiversity and environmental research.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SEC-2013.6.1-2 | Award Amount: 3.59M | Year: 2014
The PRIME project will support the design of technologies (counter-measures and communication measures) for the prevention, interdiction and mitigation of lone actor extremist events (LOEEs), which are hard to anticipate, yet can be highly damaging to local and national communities and therefore must be addressed. Given the difficulty in detecting LOEEs, prevention-based strategies must be complemented by interdiction- and mitigation-based measures, to minimize harm in the event of detection failure. These measures must be accompanied by communication strategies aimed at a range of audiences, including extremists and the general public. The PRIME project will deliver a knowledge base to inform the design of measures to defend against LOEEs, by achieving the following objectives: 1) Characterising a) the risk posed by lone actor extremists, and b) the context in which measures to defend against LOEEs may be implemented; 2) Producing a cross-level risk analysis framework within which to articulate the key factors and processes implicated in LOEEs, across all stages of the event (radicalisation, attack preparation, attack); 3) Translating the risk analysis framework into a meta-script of lone actor extremist events, and developing methodologies and techniques to produce empirically supported scripts of each stage; 4) Producing an integrated, cross-level script of LOEEs, and identifying categories of intervention points or pinch points; 5) Delivering a portfolio of requirements for the design of measures for the prevention, interdiction and mitigation of lone actor extremist events across levels of intervention, informed by the analysis of the event script and an understanding of the context in which these measures may be implemented; 6) Delivering a portfolio of requirements for communication measures directed at a diverse audience at each stage of the script, in coordination with the portfolio of counter-measures.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.2.1 | Award Amount: 9.45M | Year: 2012
Enabling robots to competently perform everyday manipulation activities such as household chores exceeds, in terms of task, activity, behavior and context complexity, anything that we have so far investigated in motion planning, cognitive robotics, autonomous robot control and artificial intelligence at large. For achieving robust, adaptive, effective and natural performance of everyday manipulation tasks, it is not feasible to expect that programmers can equip robots with plan libraries that cover such an open-ended task spectrum competently. RoboHow.Cog aims to enable autonomous robots to perform expanding sets of human-scale everyday manipulation tasks, both in human working and living environments. To this end, RoboHow.Cog will investigate a knowledge-enabled and plan-based approach to robot programming and control in which knowledge for accomplishing everyday manipulation tasks is semi-automatically acquired from instructions on the World Wide Web, from human instruction and demonstration (videos), and from haptic demonstration. The knowledge-enabled control will be made possible through extensions of constraint- and optimization-based movement specification and execution methods that allow for the force-adaptive control of movements to achieve the desired effects and avoid the unwanted ones. In addition, novel perception mechanisms satisfying the knowledge preconditions of plans and monitoring the effects of actions will make the RoboHow.Cog approach feasible. The software components that come out of RoboHow.Cog will be integrated into complete generic robot control systems such as ROS and, in particular, into Aldebaran's humanoid platform Romeo. RoboHow.Cog will strive to make the code of many of its components, and even of large parts of the milestone demonstrations, publicly available under free/open-source software licenses.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: LCE-01-2014 | Award Amount: 4.33M | Year: 2015
Silicon-based photovoltaic cells are the dominant technology for terrestrial solar energy conversion. After many decades of research and development, efficiencies have plateaued, with the best devices measuring 25 % in the laboratory. Significantly higher conversion efficiencies, up to 38.8 %, have so far been reached only with multi-junction cells based on III-V semiconductors. However, these materials are too expensive for use in flat-plate modules. Nanowires make it possible to significantly reduce material needs without compromising absorption or performance. The consortium has already demonstrated InP single-junction nanowire solar cells on InP substrate, reaching world-record efficiencies of 13.8 % while using only 12 % of the material volume of a conventional bulk solar cell. Combining III-V nanowires with today's silicon photovoltaic technology offers the potential to achieve, at the same time, very high-performance devices, efficient use of materials and low cost. In this project we aim to demonstrate an experimental proof of a tandem solar cell composed of a top pn-junction formed in III-V nanowires, series-connected to a bottom pn-junction formed in silicon. Such solar cell devices are fabricated either by direct growth of the nanowires on Si or by transferring a film of nanowires embedded in a polymer onto a Si bottom cell. Besides developing the best process to demonstrate such tandem solar cells with > 25 % efficiency, we are also addressing the important aspect of scaling up the technology to large areas. To reach this objective, we are developing technologies for large-area III-V nanowire arrays (> 10 cm) based on nano-imprint technology and epitaxial growth, or on a new vapour-phase growth method of nanowire aerotaxy. The wide-spread application of nano-materials and III-V compounds in photovoltaics further requires an in-depth analysis of ecological and health-related risks. In this project we are addressing this important issue already at an early stage of the development.
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2009-IRSES | Award Amount: 799.20K | Year: 2011
The proposed exchange programme will allow us to strengthen the infrastructure of intellectual exchange between the EU (UK, Netherlands and Spain) and China. The short-term transfer of academic staff will facilitate the mutual exchange of domain-specific knowledge and know-how, and by means of seminars and workshops, we intend to widen the scope of the programme, including as many (self-supported) external participants as possible, both from academia and industry. We place special emphasis on the development of early career researchers to allow PhD students to gain international experience by working with some of the world-leading research groups in the field. We intend to achieve the objectives of our exchange programme by jointly working on a series of research topics that have been chosen specifically to maximise the synergy between the partners. The overall research theme is nature-inspired computation and its applications: the field of nature-inspired computation is a relatively new inter-disciplinary area of research that is concerned with the computational capabilities of natural systems and their interpretation in a computational framework. Each of the seven participants (3 in the EU and 4 in China) has dedicated research groups that are amongst the world's best in this highly relevant field of research. Each institute provides at least one leading researcher as a representative, with a total of twelve distinguished academics across all participants. The individual research efforts across all members are highly complementary to one another and may be combined efficiently to widen our knowledge base. The collaboration is expected to generate high-impact research outcomes in the form of publications, seminars and workshops. The strengthening of the relationships between the EU and China will set the stage for long-term collaborations in the future and will provide the ERA with better access to the rapidly growing Chinese academic and industrial sectors.
Agency: European Commission | Branch: FP7 | Program: ERC-SG | Phase: ERC-SG-SH6 | Award Amount: 1.43M | Year: 2012
This project revisits a major question in world history: how can we explain the continuity of the Chinese Empire? Moving beyond the comparison of early world empires (China and Rome) used to explain the different courses Chinese and European history have taken, this project aims to assess the importance of political communication in the maintenance of empire in the last millennium. The core questions are twofold: 1) How can the continuity of empire in the Chinese case best be explained? 2) Do the nature and extent of political communication networks, measured through the frequency and multiplexity of information exchange ties, play a critical role in the reconstitution and maintenance of empire? Its methodology is based on the conviction that an investigation of the nature and extent of political communication in imperial Chinese society should include a systematic quantitative and qualitative analysis of the rich commentary on current affairs in correspondence and notebooks. By combining multi-faceted digital analyses of relatively large corpora of texts with an intellectually ambitious research agenda, this project will both radically transform our understanding of the history of Chinese political culture and inspire wide-ranging methodological innovation across the humanities.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.8.0 | Award Amount: 2.29M | Year: 2009
New magneto-transport phenomena have been discovered in magnetic multilayers and are now being optimized for industrial applications, extending conventional electronics with new functionality. However, most of the current research on magnetic multilayer materials and their device applications relies on conventional equilibrium electron transport. The full potential of nano-structuring, which leads to a broad spectrum of novel non-equilibrium transport phenomena, is therefore not realized. In this research project we will focus on practically unexplored functional principles that can be implemented in nanostructures produced by state-of-the-art lithography and surface manipulation techniques. Our main idea is to use electrically controlled spin currents in highly non-equilibrium regimes with respect to energy and temperature; hence spin-thermo-electronics. The large amount of heat generated in nanoscale devices is today one of the most fundamental obstacles to reducing the size of electronics. In this proposal we turn the problem around by instead using electrically controlled local heating of magnetic nano-circuits to achieve fundamentally new functionality, relevant to several key objectives of information and communication technology. Particular emphasis will be put on investigating and technologically evaluating the interplay of spin, charge, and heat in magnetic structures of sub-10 nm dimensions. Such structures, although inaccessible by today's lithographic means, are in our view crucial for further miniaturization of electronic devices.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SSH.2012.5.2-1 | Award Amount: 2.99M | Year: 2013
The big bang enlargement of the European Union (EU) has nurtured vivid debates among both academics and practitioners about the consequences of an ever larger Union for the EU's integration capacity. The research project MAXCAP will start with a critical analysis of the effects of the 2004-2007 enlargements on the stability, democracy and prosperity of candidate countries, on the one hand, and on the EU's institutions, on the other. We will then investigate how the EU can maximize its integration capacity for current and future enlargements. Adopting an inter-disciplinary and mixed-methods approach that combines desk research, in-depth interviews and Q-methodology, MAXCAP will a) explain the effects of the EU's integration modes and strategies on democracy and socio-economic development in the new members, candidates and neighbourhood countries; b) inquire into the relationship between the widening and deepening of the EU by establishing conditions for effective decision-making and implementation in an enlarged EU; c) identify the social limits to the EU's integration capacity related to citizens' perceptions of the last and future enlargements; d) study the EU's current and past negotiation strategies in the context of enlargement and investigate to what extent they need to be adjusted to changing conditions in the EU and the candidate countries; e) examine how the EU employs different modes of integrating countries with highly diverse economic powers, democratic qualities of governance, and institutional capacities; and f) assess whether alternative models, such as the European Neighbourhood Policy, can be successful in bringing countries closer to the EU. MAXCAP, which features a nine-partner consortium of academic, policy, dissemination and management excellence, will create new and strengthen existing links within and between the academic and the policy world on matters relating to the current and future enlargement of the EU.
News Article | August 30, 2016
The FourStar Galaxy Evolution Survey (ZFOURGE) has built a multicolored photo album of galaxies as they grow from their faint beginnings into mature and majestic giants. They did so by measuring distances and brightnesses for more than 70,000 galaxies spanning more than 12 billion years of cosmic time, revealing the breadth of galactic diversity. The team assembled the colorful photo album by using a new set of filters that are sensitive to infrared light and taking images with them with the FourStar camera at Carnegie's 6.5-meter Baade Telescope at our Las Campanas Observatory in Chile. They took the images over a period of 45 nights. The team made a 3-D map by collecting light from over 70,000 galaxies, peering all the way into the distant universe, and by using this light to measure how far these galaxies are from our own Milky Way. The deep 3-D map also revealed young galaxies that existed as early as 12.5 billion years ago (at less than 10 percent of the current universe age), only a handful of which had previously been found. This should help astronomers better understand the universe's earliest days. "Perhaps the most surprising result is that galaxies in the young universe appear as diverse as they are today," when the universe is older and much more evolved, said lead author Caroline Straatman, a recent graduate of Leiden University. "The fact that we see young galaxies in the distant universe that have already shut down star formation is remarkable." But it's not just about distant galaxies; the information gathered by ZFOURGE is also giving the scientists the best-yet view of what our own galaxy was like in its youth. "Ten billion years ago, galaxies like our Milky Way were much smaller, but they were forming stars 30 times faster than they are today," said Casey Papovich of Texas A&M University. 
"ZFOURGE is providing us with a highly complete and reliable census of the evolving galaxy population, and is already helping us to address questions like: How did galaxies grow with time? When did they form their stars and develop into the spectacular structures that we see in the present-day universe?" added Ryan Quadri, also of Texas A&M. In the study's first images, the team found one of the earliest examples of a galaxy cluster, a so-called "galaxy city" made up of a dense concentration of galaxies, which formed when the universe was only three billion years old, as compared to the nearly 14 billion years it is today. "The combination of FourStar, the special filters, Magellan, and the conditions at Las Campanas led to the detection of the cluster," said Persson, who built the FourStar instrument at the Carnegie Observatories in Pasadena. "It was in a very well-studied region of the sky—'hiding in plain sight.'" The paper marks the completion of the ZFOURGE survey and the public release of the dataset, which can be found here: http://zfourge.tamu.edu/DR2016/data.html. More information: The FourStar Galaxy Evolution Survey (ZFOURGE): ultraviolet to far-infrared catalogs, medium-bandwidth photometric redshifts with improved accuracy, stellar masses, and confirmation of quiescent galaxies to z~3.5. arxiv.org/abs/1608.07579
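Statements like "12.5 billion years ago" and "less than 10 percent of the current universe age" tie a galaxy's measured redshift to a lookback time. A minimal sketch of that conversion under a flat Lambda-CDM cosmology is below; the values of H0 and the density parameters are assumed round numbers for illustration, not the cosmology adopted by the ZFOURGE survey.

```python
import math

# Assumed round cosmological parameters (illustrative, not ZFOURGE's).
H0 = 70.0            # Hubble constant, km/s/Mpc
OM, OL = 0.3, 0.7    # matter and dark-energy density parameters
H0_PER_GYR = H0 * 1.0227e-3  # 1 km/s/Mpc is about 1.0227e-3 per Gyr

def lookback_time_gyr(z, steps=10000):
    """Lookback time t = (1/H0) * integral_0^z dz' / ((1+z') E(z')),
    with E(z) = sqrt(OM (1+z)^3 + OL), by midpoint-rule integration."""
    total = 0.0
    dz = z / steps
    for i in range(steps):
        zp = (i + 0.5) * dz
        e = math.sqrt(OM * (1.0 + zp) ** 3 + OL)
        total += dz / ((1.0 + zp) * e)
    return total / H0_PER_GYR

# A quiescent galaxy at z ~ 3.5 is seen roughly 12 billion years ago.
print(round(lookback_time_gyr(3.5), 1))
```

With these round parameters, z ~ 3.5 corresponds to a lookback time close to 12 billion years, matching the epochs quoted in the article.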
News Article | September 14, 2016
On its way to assembling the most detailed 3-D map ever made of our Milky Way galaxy, Gaia has pinned down the precise position on the sky and the brightness of 1142 million stars. As a taster of the richer catalogue to come in the near future, today's release also features the distances and the motions across the sky for more than two million stars. "Gaia is at the forefront of astrometry, charting the sky at precisions that have never been achieved before," says Alvaro Giménez, ESA's Director of Science. "Today's release gives us a first impression of the extraordinary data that await us and that will revolutionise our understanding of how stars are distributed and move across our Galaxy." Launched 1000 days ago, Gaia started its scientific work in July 2014. This first release is based on data collected during its first 14 months of scanning the sky, up to September 2015. "The beautiful map we are publishing today shows the density of stars measured by Gaia across the entire sky, and confirms that it collected superb data during its first year of operations," says Timo Prusti, Gaia project scientist at ESA. The stripes and other artefacts in the image reflect how Gaia scans the sky, and will gradually fade as more scans are made during the five-year mission. "The satellite is working well and we have demonstrated that it is possible to handle the analysis of a billion stars. Although the current data are preliminary, we wanted to make them available for the astronomical community to use as soon as possible," adds Dr Prusti. Transforming the raw information into useful and reliable stellar positions to a level of accuracy never possible before is an extremely complex procedure, entrusted to a pan-European collaboration of about 450 scientists and software engineers: the Gaia Data Processing and Analysis Consortium, or DPAC. 
"Today's release is the result of a painstaking collaborative work over the past decade," says Anthony Brown from Leiden University in the Netherlands, and consortium chair. "Together with experts from a variety of disciplines, we had to prepare ourselves even before the start of observations, then treated the data, packaged them into meaningful astronomical products, and validated their scientific content." In addition to processing the full billion-star catalogue, the scientists looked in detail at the roughly two million stars in common between Gaia's first year and the earlier Hipparcos and Tycho-2 Catalogues, both derived from ESA's Hipparcos mission, which charted the sky more than two decades ago. By combining Gaia data with information from these less precise catalogues, it was possible to start disentangling the effects of 'parallax' and 'proper motion' even from the first year of observations only. Parallax is a small motion in the apparent position of a star caused by Earth's yearly revolution around the Sun and depends on a star's distance from us, while proper motion is due to the physical movement of stars through the Galaxy. In this way, the scientists were able to estimate distances and motions for the two million stars spread across the sky in the combined Tycho–Gaia Astrometric Solution, or TGAS. This new catalogue is twice as precise and contains almost 20 times as many stars as the previous definitive reference for astrometry, the Hipparcos Catalogue. As part of their work in validating the catalogue, DPAC scientists have conducted a study of open stellar clusters – groups of relatively young stars that were born together – that clearly demonstrates the improvement enabled by the new data. 
"With Hipparcos, we could only analyse the 3-D structure and dynamics of stars in the Hyades, the nearest open cluster to the Sun, and measure distances for about 80 clusters up to 1600 light-years from us," says Antonella Vallenari from the Istituto Nazionale di Astrofisica (INAF) and the Astronomical Observatory of Padua, Italy. "But with Gaia's first data, it is now possible to measure the distances and motions of stars in about 400 clusters up to 4800 light-years away. For the closest 14 open clusters, the new data reveal many stars surprisingly far from the centre of the parent cluster, likely escaping to populate other regions of the Galaxy." Many more stellar clusters will be discovered and analysed in even greater detail with the extraordinary data that Gaia continues to collect and that will be released in the coming years. The new stellar census also contains 3194 variable stars, stars that rhythmically swell and shrink in size, leading to periodic brightness changes. Many of the variables seen by Gaia are in the Large Magellanic Cloud, one of our galactic neighbours, a region that was scanned repeatedly during the first month of observations, allowing accurate measurement of their changing brightness. Details about the brightness variations of these stars, 386 of which are new discoveries, are published as part of today's release, along with a first study to test the potential of the data. "Variable stars like Cepheids and RR Lyraes are valuable indicators of cosmic distances," explains Gisella Clementini from INAF and the Astronomical Observatory of Bologna, Italy. "While parallax is used to measure distances to large samples of stars in the Milky Way directly, variable stars provide an indirect, but crucial step on our 'cosmic distance ladder', allowing us to extend it to faraway galaxies." This is possible because some kinds of variable stars are special. 
For example, in the case of Cepheid stars, the brighter they are intrinsically, the slower their brightness variations. The same is true for RR Lyraes when observed in infrared light. The variability pattern is easy to measure and can be combined with the apparent brightness of a star to infer its true brightness. This is where Gaia steps in: in the future, scientists will be able to determine very accurate distances to a large sample of variable stars via Gaia's measurements of parallaxes. With those, they will calibrate and improve the relation between the period and brightness of these stars, and apply it to measure distances beyond our Galaxy. A preliminary application of data from the TGAS looks very promising. "This is only the beginning: we measured the distance to the Large Magellanic Cloud to test the quality of the data, and we got a sneak preview of the dramatic improvements that Gaia will soon bring to our understanding of cosmic distances," adds Dr Clementini. Knowing the positions and motions of stars in the sky to astonishing precision is a fundamental part of studying the properties and past history of the Milky Way and to measure distances to stars and galaxies, but also has a variety of applications closer to home – for example, in the Solar System. In July, Pluto passed in front of a distant, faint star, offering a rare chance to study the atmosphere of the dwarf planet as the star gradually disappeared and then reappeared behind Pluto. This stellar occultation was visible only from a narrow strip stretching across Europe, similar to the totality path that a solar eclipse lays down on our planet's surface. Precise knowledge of the star's position was crucial to point telescopes on Earth, so the exceptional early release of the Gaia position for this star, which was 10 times more precise than previously available, was instrumental to the successful monitoring of this rare event. 
Early results hint at a pause in the puzzling pressure rise of Pluto's tenuous atmosphere, something that has been recorded since 1988 in spite of the dwarf planet moving away from the Sun, which would suggest a drop in pressure due to cooling of the atmosphere. "These three examples demonstrate how Gaia's present and future data will revolutionise all areas of astronomy, allowing us to investigate our place in the Universe, from our local neighbourhood, the Solar System, to Galactic and even grander, cosmological scales," explains Dr Brown. This first data release shows that the mission is on track to achieve its ultimate goal: charting the positions, distances, and motions of one billion stars – about 1% of the Milky Way's stellar content – in three dimensions to unprecedented accuracy. "The road to today has not been without obstacles: Gaia encountered a number of technical challenges and it has taken an extensive collaborative effort to learn how to deal with them," says Fred Jansen, Gaia mission manager at ESA. "But now, 1000 days after launch and thanks to the great work of everyone involved, we are thrilled to present this first dataset and are looking forward to the next release, which will unleash Gaia's potential to explore our Galaxy as we've never seen it before." More information: The data from Gaia's first release can be accessed at archives.esac.esa.int/gaia Fifteen scientific papers describing the data contained in the release and their validation process will appear in a special issue of Astronomy & Astrophysics.
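The distance-ladder logic described in the article (a parallax fixes the distance to a nearby variable star; the period-brightness relation then carries that calibration to stars beyond parallax reach) can be sketched as follows. The Leavitt-law coefficients and all numerical inputs are illustrative placeholders, not the Gaia calibration.

```python
import math

def parallax_distance_pc(parallax_arcsec):
    """Distance in parsecs from a parallax angle in arcseconds: d = 1/p."""
    return 1.0 / parallax_arcsec

def absolute_magnitude(apparent_mag, distance_pc):
    """Distance modulus: M = m - 5 log10(d / 10 pc)."""
    return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

def luminosity_distance_pc(apparent_mag, absolute_mag):
    """Invert the distance modulus: d = 10^((m - M + 5) / 5) pc."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Step 1: a nearby Cepheid with a parallax of 2 milliarcseconds is 500 pc away.
d_near = parallax_distance_pc(0.002)
# Step 2: its apparent magnitude plus that distance give its true brightness.
M_near = absolute_magnitude(10.5, d_near)
# Step 3: a distant Cepheid with the same period shares that true brightness,
# so its apparent magnitude alone yields a distance beyond parallax reach.
d_far = luminosity_distance_pc(18.0, M_near)
```

Improving the period-brightness calibration with Gaia parallaxes tightens step 2, and hence every distance inferred in step 3.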
News Article | February 3, 2016
The cultural superiority of our human ancestors may have given them a deadly edge over our Neanderthal cousins, consequently driving the latter into extinction. This supports the idea that instead of epidemics or climate change wiping out the Neanderthals, as previously believed, it was humans who did it. Thousands of years ago, our Neanderthal cousins lived in what is now known as Europe, even before our human ancestors arrived in the area. According to scientists, humans came to Europe about 43,000 years ago. Some 5,000 years later, Neanderthals went extinct. Experts are uncertain as to what really happened, but there are different theories that attempt to explain why and how Neanderthals were wiped out. While some believe that Neanderthals died because of epidemics or climate change, others think that modern humans took down Neanderthals with better clothing, tools or social organization. A small group of researchers, led by Professor Marcus Feldman of Stanford University, argues that the cultural and technological advances of modern humans could have been the tipping point. In a study featured in the journal Proceedings of the National Academy of Sciences, the team adjusted a mathematical model often used to predict competition between populations. They added, for the first time, dimensions for the ability to learn and cultural advantage. Researchers modified the model so that the more culturally advanced a group was, the larger it could grow. As modern humans were moving into Neanderthal territory, scientists said it was likely that those who were dominant arrived in small numbers, in comparison to already established Neanderthal groups. Despite being outnumbered, the cultural skills that modern humans brought with them could have allowed them to hunt, settle land, and use resources more efficiently than the original residents. Eventually, the population of modern humans swelled, making them even more powerful, researchers said. 
An even smaller number of modern humans could have overwhelmed a much larger Neanderthal population that lacked comparable culture. Additionally, if the culture levels were different enough, modern humans could have begun with half as many people and still have won out. Feldman said it was not a case of modern humans being smarter than Neanderthals. Research suggests that modern humans and Neanderthals had similar brainpower, a trait that evolves slowly through time. What allowed modern humans to outcompete Neanderthals were resources: modern humans had more tools, more clothes and a more complex form of society. These cultural and technological advances spread from person to person, especially when coupled with superior learning abilities. However, not everyone agrees with this. A 2014 study reviewed several arguments for human cultural superiority and found them lacking. Researchers from the University of Colorado, Boulder and Leiden University said they did not find any data to support the theory that Neanderthals were inferior in social, cognitive and technological aspects, indicating that the extinction of Neanderthals resulted from a combination of factors. The work by Feldman and his colleagues does not settle the debate over humankind's 15,000-year conquest of Europe and the Middle East; rather, it merely tests the possibility of the cultural argument. It is an attempt to tie together what archaeologists have uncovered and to point to new things they might have to look for in the field. Lastly, Feldman said one thing that could improve their mathematical model is a measure of the speed at which anatomically modern humans could have spread across Europe. "We'd like to see the geographic trajectory - how much migration there would have to be and at what pace it would have to happen to reconstruct what geologists tell us," added Feldman.
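The kind of adjusted competition model the article describes, in which a higher culture level lets a group sustain a larger population, can be illustrated with a minimal Lotka-Volterra-style sketch. The equations and every parameter value below are illustrative assumptions, not the model published by Feldman's team.

```python
def step(n1, n2, c1, c2, r=0.05, k_base=1000.0, dt=1.0):
    """One Euler step of a Lotka-Volterra-style competition model in
    which a group's carrying capacity scales with its culture level c,
    so the more culturally advanced group can sustain larger numbers.
    All parameter values are illustrative, not those of the PNAS study."""
    k1, k2 = k_base * c1, k_base * c2
    dn1 = r * n1 * (1.0 - (n1 + n2) / k1)  # both groups draw on the
    dn2 = r * n2 * (1.0 - (n1 + n2) / k2)  # same pool of resources
    return n1 + dt * dn1, n2 + dt * dn2

# A small incoming group with twice the culture level of a larger
# resident group: iterating the model, the balance eventually flips.
humans, neanderthals = 100.0, 800.0
for _ in range(5000):
    humans, neanderthals = step(humans, neanderthals, c1=2.0, c2=1.0)
```

Once the combined population exceeds the lower-culture group's carrying capacity, that group shrinks toward zero even though it started far larger, which is the qualitative behaviour the article attributes to the model.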
News Article | August 31, 2016
An international team of astronomers, including Carnegie's Eric Persson, has charted the rise and fall of galaxies over 90 percent of cosmic history. Their work, which includes some of the most sensitive astronomical measurements made to date, is published by The Astrophysical Journal. The article reports the results of the FourStar Galaxy Evolution Survey (ZFOURGE) described above, marking the completion of the survey and the public release of the dataset.
News Article | February 27, 2017
DUARTE, Calif.--(BUSINESS WIRE)--An international team of researchers led by City of Hope’s Bart Roep, Ph.D., the Chan Soon-Shiong Shapiro Distinguished Chair in Diabetes and professor/founding chair of the Department of Diabetes Immunology, has found experimental support for an alternative theory about the cause of type 1 diabetes (T1D). The study results were published online today in the journal Nature Medicine. T1D, previously known as juvenile diabetes, affects an estimated 1.5 million Americans and is the result of the loss of insulin-producing cells in the pancreas. The prevailing belief was that the root cause of T1D was the immune system mistakenly identifying those insulin-secreting beta cells as a potential danger and, in turn, destroying them. Now Roep, along with researchers from the Leiden University Medical Center in the Netherlands, has found a mechanism by which stressed beta cells actually cause the immune response that leads to T1D. “Our findings show that type 1 diabetes results from a mistake of the beta cell, not a mistake of the immune system,” said Roep, who is director of the Wanek Family Project for Type 1 Diabetes, which was recently created with gifts from the Wanek family and anonymous donors to support the institution’s goal of curing T1D in six years. “The immune system does what it is supposed to do, which is respond to distressed or ‘unhappy’ tissue, as it would in infection or cancer.” In order to gain a better understanding of why the immune system attacks the body’s own source of insulin — the pancreatic beta cells in the islets of Langerhans — the team took some clues from cancer molecules that are targeted by the immune system after successful treatment of the cancer with immunotherapy. One of these cancer targets is a so-called nonsense protein, resulting from a misreading of a DNA sequence that makes a nonfunctional protein.
It turns out that the same type of protein error is also produced by the beta cells in T1D. Therefore, Roep and the other researchers believe it is a ‘wrong read’ of the insulin gene itself that proves to be a major target of the immune system. This error product of the insulin gene is made when beta cells are stressed, Roep said. “Our study links anti-tumor immunity to islet autoimmunity, and may explain why some cancer patients develop type 1 diabetes after successful immunotherapy,” he added. “This is an incredible step forward in our commitment to cure this disease.” According to the paper, titled “Autoimmunity against a defective ribosomal insulin gene product in type 1 diabetes,” the findings “further support the emerging concept that beta cells are destroyed in T1D by a mechanism comparable to classical antitumor responses where the immune system has been trained to survey dysfunctional cells in which errors have accumulated.” The results of the study give Roep new insight, he said, for his work in developing new vaccines to desensitize the immune system so that it will tolerate islets again, as well as for research into combining immunotherapy with more traditional diabetes treatments to reinvigorate islets. “Our goal is to keep beta cells happy,” Roep said. “So we will work on new forms of therapy to correct the autoimmune response against islets and hopefully also prevent development of type 1 diabetes during anti-cancer therapy.” The work described in the Nature Medicine paper was supported by the Dutch Diabetes Research Foundation, the DON Foundation and the JDRF. City of Hope is an independent research and treatment center for cancer, diabetes and other life-threatening diseases.
Designated as one of only 47 comprehensive cancer centers, the highest recognition bestowed by the National Cancer Institute, City of Hope is also a founding member of the National Comprehensive Cancer Network, with research and treatment protocols that advance care throughout the world. City of Hope is located in Duarte, California, just northeast of Los Angeles, with community clinics throughout Southern California. It is ranked as one of “America’s Best Hospitals” in cancer by U.S. News & World Report. Founded in 1913, City of Hope is a pioneer in the fields of bone marrow transplantation, diabetes and numerous breakthrough cancer drugs based on technology developed at the institution. For more information about City of Hope, follow us on Facebook, Twitter, YouTube or Instagram.
News Article | December 7, 2016
Inflammation is a good thing when it's fighting off infection, but too much can lead to autoimmune diseases or cancer. In efforts to dampen inflammation, scientists have long been interested in CC chemokine receptor 2 (CCR2) -- a protein that sits on the surface of immune cells like an antenna, sensing and transmitting inflammatory signals that spur cell movement toward sites of inflammation. Researchers at the Skaggs School of Pharmacy and Pharmaceutical Sciences at University of California San Diego have now determined the 3D structure of CCR2 simultaneously bound to two inhibitors. Understanding how these molecules fit together may better enable pharmaceutical companies to develop anti-inflammatory drugs that bind and inhibit CCR2 in a similar manner. The study was published December 7 in Nature. CCR2 and associated signaling molecules are known to play roles in a number of inflammatory and neurodegenerative diseases, including multiple sclerosis, asthma, diabetic nephropathy and cancer. Many drug companies have attempted to develop drugs that target CCR2, but none have yet made it to market. "So far drugs that target CCR2 have consistently failed in clinical trials," said Tracy Handel, PhD, professor in the Skaggs School of Pharmacy. "One of the biggest challenges is that, to work therapeutically, CCR2 needs to be turned 'off' and stay off completely, all of the time. We can't afford ups and downs in its activity. To be effective, any small molecule drug that inhibits CCR2 would have to bind the receptor tightly and stay there. And that's difficult to do." Handel led the study with Irina Kufareva, PhD, project scientist at the Skaggs School of Pharmacy, and Laura Heitman, PhD, of Leiden University. The study's first author is Yi Zheng, PhD, also a postdoctoral researcher at the Skaggs School of Pharmacy. CCR2 spans the membrane of immune cells. Part of the receptor sticks outside the cell and part sticks inside.
Inflammatory molecules called chemokines bind the external part of CCR2 and the receptor carries that signal to the inside of the cell. Inside the cell, CCR2 changes shape and binds other communication molecules, such as G proteins, triggering a cascade of activity. As a result, the immune cells move, following chemokine trails that lead them to places in the body where help is needed. In this study, the researchers used a technique known as X-ray crystallography to determine the 3D structure of CCR2 with two molecules bound to it simultaneously -- one at each end. That's a huge accomplishment because, Kufareva said, "Receptors that cross the cell membrane are notoriously hard to crystallize. To promote crystallization, we needed to alter the amino acid sequence of CCR2 to make the receptor molecules assemble in an orderly fashion. Otherwise, when taken out of the cell membrane, they tend to randomly clump together." Handel, Kufareva and team also discovered that the two small molecules binding CCR2 turn the receptor "off" by different but mutually reinforcing mechanisms. One of the small molecules binds the outside face of the receptor and blocks binding of the natural chemokines that normally turn the receptor "on." The other small molecule binds the face of the receptor inside the cell, where the G protein normally binds, preventing inflammatory signal transmission. According to Handel, the latter binding site has never been seen before. "It's our hope that this new structure of CCR2 with two bound inhibitors will help optimize current and future drug discovery efforts," Kufareva said. Co-authors of this study also include: Ling Qin, Martin Gustavsson, Chunxia Zhao, Ruben Abagyan, UC San Diego; Natalia V. Ortiz Zacarías, Henk de Vries, Adriaan P. IJzerman, Leiden University; Gye Won Han, Vadim Cherezov, Raymond C.
Stevens, University of Southern California; Marta Dabros, Robert Cherney, Percy Carter, Andrew Tebben, Bristol-Myers Squibb Company; Dean Stamos, Vertex Pharmaceuticals.
News Article | December 19, 2016
Researchers explore for the first time the impact of pregnancy on the structure of the human brain
Pregnancy involves radical hormone surges and biological adaptations, but the effects on the brain are still unknown. In this study a team of researchers compared the brain structure of women before and after their first pregnancy. This is the first research to show that pregnancy involves long-lasting changes - at least for two years post-partum - in the morphology of a woman's brain. Using magnetic resonance imaging, the scientists were able to show that the brains of women who have undergone a first pregnancy present significant reductions in grey matter in regions associated with social cognition. The researchers believe that such changes correspond to an adaptive process of functional specialization towards motherhood. "These changes may reflect, at least in part, a mechanism of synaptic pruning, which also takes place in adolescence, where weak synapses are eliminated, giving way to more efficient and specialized neural networks," says Elseline Hoekzema, co-lead author of the article. According to Erika Barba, the other co-lead author, "these changes concern brain areas associated with functions necessary to manage the challenges of motherhood." In fact, the researchers found that the areas with grey matter reductions overlapped with brain regions activated during a functional neuroimaging session in which the mothers in the study watched images of their own babies. In order to conduct the study, the researchers compared magnetic resonance images of 25 first-time mothers before and after their pregnancy, of 19 male partners, and of a control group formed by 20 women who were not and had never been pregnant and 17 male partners. They gathered information about the participants over a period of five years and four months.
The results of the research directed by Òscar Vilarroya and Susanna Carmona showed a symmetrical reduction in grey matter volume along the medial frontal and posterior midline of the cortex, as well as in specific sections of, mainly, the prefrontal and temporal cortex in the women who had been pregnant. "These areas correspond to a great extent with a network associated with processes involved in social cognition and self-focused processing," indicates Susanna Carmona. The analyses could determine with high reliability whether a woman in the study had been pregnant based on the changes in her brain structure. The researchers were even able to predict the mother's attachment to her baby in the postpartum period based on these brain changes. The study included both women who had undergone fertility treatments and women who had become pregnant naturally, and the reductions in grey matter were practically identical in both groups. The researchers did not observe any changes in memory or other cognitive functions during the pregnancies and therefore believe that the loss of grey matter does not imply any cognitive deficits, but rather: "The findings point to an adaptive process related to the benefits of better detecting the needs of the child, such as identifying the newborn's emotional state. Moreover, they provide primary clues regarding the neural basis of motherhood, perinatal mental health and brain plasticity in general," says Òscar Vilarroya. Elseline Hoekzema (researcher at the UAB at the time of the study, but currently working at Leiden University) and Erika Barba-Müller (UAB) are the lead authors of the article published in Nature Neuroscience.
The study was directed by Òscar Vilarroya, from the Cognitive Neuroscience Research Unit of the Department of Psychiatry and Legal Medicine at the UAB, and coordinator of the research group Neuroimaging of Mental Disorders at the IMIM Foundation, and co-directed by Susana Carmona [researcher at the UAB at the time of the study and now at the University Carlos III, Madrid, and affiliated to the CIBER of Mental Health (CIBERSAM)]. Also collaborating in the research were Cristina Pozzobon, Florencio Lucco and Agustín Ballesteros (Valencian Infertility Institute, IVI); Marisol Picado (Hospital Clínic); Eveline A. Crone (Leiden University); David García-García and Manuel Desco (University Carlos III and Instituto de Investigación Sanitaria Gregorio Marañón, Madrid); and Juan Carlos Soliva and Adolf Tobeña (UAB).
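The article reports that the analyses could tell with high reliability whether a woman had been pregnant from her grey-matter changes. As a purely illustrative sketch (simulated data and a simple nearest-centroid classifier, not the study's actual MRI pipeline; all values are assumptions), that kind of discrimination looks like this:

```python
# Illustrative sketch: classifying "pregnant" vs. "control" subjects from
# simulated grey-matter volume changes. Group sizes mirror the article
# (25 mothers, 20 controls); effect sizes and classifier are assumptions.
import random

random.seed(0)

def make_subject(pregnant, n_regions=10):
    # Assumed pattern: pregnant subjects show a consistent grey-matter
    # reduction (~ -0.5) across "social cognition" regions; controls ~0.
    shift = -0.5 if pregnant else 0.0
    return [shift + random.gauss(0, 0.2) for _ in range(n_regions)]

train = [(make_subject(True), 1) for _ in range(25)] + \
        [(make_subject(False), 0) for _ in range(20)]

def centroid(label):
    rows = [x for x, y in train if y == label]
    return [sum(col) / len(rows) for col in zip(*rows)]

c1, c0 = centroid(1), centroid(0)

def classify(x):
    # Assign the label of the nearer group centroid (squared distance).
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    return 1 if d1 < d0 else 0

test_set = [(make_subject(True), 1) for _ in range(10)] + \
           [(make_subject(False), 0) for _ in range(10)]
accuracy = sum(classify(x) == y for x, y in test_set) / len(test_set)
```

With a separation this clean the toy classifier is near-perfect; the study's reported reliability rests on real MRI features and proper cross-validation.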
News Article | November 22, 2015
A version of this story first appeared at What’s On Weibo. Tens of millions of empty apartments in brand new cities all over China, deserted cinemas and quiet parks. It is an image that has captured the public imagination: China’s “ghost cities” have become a popular topic in international media. American author Wade Shepard spent the past few years touring these new territories for his book Ghost Cities of China. Earlier this year, he came to a sold-out event at Beijing’s Bookworm Literary Festival to speak about his project with New York Times reporter Dan Levin. “The term ‘ghost cities’ is actually not appropriate,” Shepard said. “Ghost cities are places that once lived and then died. What I write about is new places that are underpopulated, and where houses are dark at night.” Shepard points out that most of China’s ghost cities actually do have people living in them. The ones that don’t are still under construction: “These new underpopulated cities are built by world luxury developers who are working on constructing new urban utopias all over China. The people living in these cities come from various places. Some are trendy people who are looking to live in a new city. Others have been relocated from their original villages. There are many from the countryside.” By 2020, China hopes to move 100 million people from the country’s farming regions into cities, in the largest urban migration in history. China’s government-driven push for urbanization is part of an effort to shift the economy from exports to domestic demand. New towns, with hospitals, roads and sports centers, are mushrooming all over China. Shepard’s fascination with China’s new towns started about ten years ago. “I saw a ‘ghost city’ for the first time in 2006, when I was a student near Hangzhou,” he said. “It happened in the small town of Tiantai. I took a wrong turn after getting off the bus, and I ended up in this new part of town with nobody there.
Never in my life had I seen anything like it: a brand new neighborhood with nobody there. I was so excited about it. My professors later told me those places were everywhere; they were not impressed. But it stuck with me. Just take any bus, and there is going to be a new city or neighborhood under construction. I enjoyed walking around these areas. I went to Mongolia and forgot about them for a while, until I returned in 2012. I travelled around and tried to figure out what these places really were. They are the new landscape of China." Shepard went out into China’s new areas by bicycle. “My objective was to go there and try to make friends. A foreigner showing up there is not a common thing, so many people want to know what you are doing there. It isn’t too difficult to talk to people.” There is an upside and a downside to the emergence of China’s new cities. “There are people who are very happy to move there. Because they get an urban hukou, they feel like they’re moving up.” A hukou is an urban residency permit. In China’s new towns, residents get a different permit than their countryside one. It enables them to legally work within the cities and enjoy certain benefits. Health care, for example, is better than in the countryside. For some elderly people from rural areas, moving to the city could literally save their life.
The Ordos Museum, which opened in 2011. Photo: Ma Yansong, Yosuke Hayano, Dang Qun/MAD Architects
But there is also a big downside to China’s urban migration, Shepard says. “Many people are moving from a traditional village structure, where people make daily social connections, and ask each other what they are doing today and what’s for dinner tonight. With these high apartment buildings, this structure changes; they don’t do that anymore. It’s an elevator culture. People also come from so many different places that they don’t really connect.” Some people who move to the city feel like they have lost their livelihood.
“There are those who have been out in the hills for thousands of years. Once they’re in the towns, they suddenly have to pay for water and electricity. They have to go to the store to buy things.” The issues that come with China’s new towns are also visible in The Land of Many Palaces, a new documentary by Adam James Smith and Song Ting. The Land of Many Palaces focuses on Ordos (鄂尔多斯), a 21st century city in the deserts of Inner Mongolia. The region holds an estimated one-sixth of the country’s coal reserves. After the coal was discovered, it went from being one of the poorest regions in China to one of the richest. Coal exploitation has created many millionaires investing in infrastructure and real estate. The new city district of Kangbashi sprang from the desert sands as the result of such an investment. Ordos Kangbashi was built between 2005 and 2010. It has skyscrapers, stadiums, a grand theater, a museum and thousands of apartments. It is ready to house one million future residents. The documentary starts with a scene that shows how Mrs. Yuan, the community manager, guides people through their new homes. Some don’t know how to use modern toilets, stoves or heaters. Mrs. Yuan teaches them, and also shows them how the television works (“There are over 100 channels!”). These new surroundings are in stark contrast with those of a nearby village, where a lone farmer works the land amid abandoned houses. “They all moved to the new city,” he says. “In the countryside, you can live for months without spending money.” The Land of Many Palaces shows that moving to one of China’s “ghost cities” is not just about moving houses, but about changing lifestyles. Farmers have to get used to living in the city and cope with all the things that come with it. Community staff members go to public places to teach them “how to become a civilized person,” telling people that good behavior is the answer to Ordos becoming a civilized city.
Employment is a major problem in China’s new cities. Many farmers have ample experience in raising pigs and working on the land, but their experience is of no use in an urban environment where there is more need for hair stylists and shop attendants. The lack of jobs is one important reason why farmers are hesitant to migrate from the countryside to the city. A trailer for The Land of Many Palaces, 2015. One film scene shows a village where only two farmers are left. The rest of the villagers have already moved to the city. Ordos’s community manager visits the farmers to convince them to trade their clay houses for an apartment flat. When they decline, she says, “If the developers of the new city need your land, they will take it anyway.” After a recent screening of the film in Amsterdam, Adam Smith described how the project began. “When we first visited Ordos in 2011, we expected to find a ghost city. Instead, we found a place that is becoming a city.” Many of China’s so-called ghost cities look like ambitious dreams that turned into nightmares. “When we first started the project, we were somewhat indoctrinated by the general media reports in Europe and North America on how this top-down style of urbanization and city-building is wrong. But the more time we spent there, the more we started to think like the people there,” Smith says: “Nobody felt like what was going on was wrong. They were uplifted by the plan.” Smith explains how many people, ironically, were pushed out of the cities to the countryside during the Cultural Revolution. In many of these areas it was hard to farm, and people struggled to survive. In some way, being taken to the city is like being saved for many: “We met very few people, if any, who were opposed.”
The public square in Kangbashi, Ordos, 2008. Photo by Alex Pasternack
Although ghost cities are a hot topic, ex-ghost cities are not.
“When a ‘ghost city’ comes to life we barely hear about it anymore,” Shepard writes. Although many of China’s new cities are still virtually empty, there are also those that have now become busier. Shepard names a few examples in a recent article, such as Dantu (Jiangsu), Wujin (Changzhou), and perhaps the most famous one, Shanghai’s Pudong district. “For the past few years I’ve been chasing reports of ghost cities around China, but I rarely ever find one that qualifies for this title. Though the international media claims that China is building cities for nobody, I often find something very different upon arrival,” Shepard writes. Earlier this year, Global Times and China Daily published a statement from the mayor of Ordos on Sina Weibo: “We are not a ghost city.” Over the past year 10,000 houses were sold, he says. But that still leaves 34,000 houses empty. “They buy it to sell it, but none of those rich people actually live there,” one netizen responded. “The mayor just doesn’t wanna lose face.” Other netizens said they like the city of Ordos. “The town is quite pretty, and the air is good,” user ‘392’ wrote. “I’ve just been to Ordos, and it’s really not as bad as the media says,” another netizen wrote: “The city is well-built, the air is good and it is safe.” One other Weibo user praised the still-mostly empty city: “Finally a place in China that is not crowded yet.” Maybe China’s ‘ghost cities’ are not that bad, or that ghostly, after all. They just might need another decade to really come to life. A version of this story originally appeared at What’s On Weibo. Manya Koetse is the editor-in-chief of What’s On Weibo, a website providing cultural, historical and political insights into trending topics on China's biggest social media sites. She's a Sinologist and PhD researcher in Sino-Japanese Relations at Leiden University.
The researchers’ data set contained billions of locational data points for 770 million Baidu users—China has a population of 1.36 billion—captured between September 2014 and April 2015. The results were used to make a handy interactive map of China’s ghost cities.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2012-1.1.3. | Award Amount: 8.43M | Year: 2013
ARIADNE is a proposal to bring together and integrate the existing archaeological research data infrastructures, so that researchers can use the various distributed datasets, and new and powerful technologies, as an integral component of archaeological research methodology. A large number of archaeological digital datasets are now available, altogether spanning different periods, domains and regions, and more are continuously created as a result of the increasing use of IT. They are the accumulated outcome of the research of individuals, teams and institutions, but they form a vast and fragmented corpus whose potential is constrained by difficult access and non-homogeneous perspectives. This integrating activity will enable trans-national access of researchers to data centres, tools and guidance, and the creation of new Web-based services based on common interfaces to data repositories, the availability of reference datasets and the use of innovative technologies. It will stimulate new research avenues in the field of archaeology, relying on the comparison, re-use and integration into current research of the outcomes of past and on-going field and laboratory activity. Such data are scattered amongst diverse collections and datasets, in inaccessible and unpublished fieldwork reports ('grey literature'), and in publications, the latter still being the main vehicle for knowledge sharing. The project will contribute to the creation of a new community of researchers ready to exploit the contribution of information technology and to incorporate it into the body of established archaeological research methodology. To achieve this result the project will use a number of integrating technologies that build on common features of the currently available datasets, and integrating actions that will build a vibrant community of use. The overall objective outlined above will be achieved through subordinate goals, which altogether will enable the provision of an advanced integrated infrastructure.
News Article | November 2, 2016
New Rochelle, NY, November 2, 2016--The decreased expression of some structural covariance networks (SCNs) in the brain is associated with advancing age, whereas other networks are less affected by age. A new study now points to the independent effects of cerebral small vessel disease on SCNs, which may be an important indicator of diminished cognitive functioning in older persons, according to an article published in Brain Connectivity, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. The article is available free on the Brain Connectivity website until December 2, 2016. In "Structural Covariance Networks and Their Association with Age, Features of Cerebral Small Vessel Disease and Cognitive Functioning in Older Persons," Jessica Foster-Dingley, Jeroen van der Grond, PhD, et al. from Leiden University Medical Center and Leiden University, the Netherlands, and CAPRI-University of Antwerp, Belgium, analyzed the magnetic resonance imaging (MRI) scans of study participants aged 75-96 years who had mild loss of cognitive function. The researchers assessed the volume of white matter hyperintensities, microbleeds, and other vascular changes associated with small vessel disease, and related these measures to the expression of SCNs, as well as to age, memory, and psychomotor speed. "Scientific consensus is building that age-related cognitive decline is connected to maladaptive changes in the brain's small blood vessels," says Christopher Pawela, PhD, Co-Editor-in-Chief of Brain Connectivity and Assistant Professor, Medical College of Wisconsin. "Leiden University researchers have performed an elegant study using magnetic resonance imaging (MRI) to demonstrate that these micro-scale blood vessel alterations are related to decreased detection of certain imaging brain networks and, furthermore, that decreased detection of these brain networks is correlated to impaired cognitive functioning using standard behavioral testing methods."
Brain Connectivity is the essential peer-reviewed journal covering groundbreaking findings in the rapidly advancing field of connectivity research at the systems and network levels. Published 10 times per year in print and online, the Journal is under the leadership of Founding and Co-Editors-in-Chief Christopher Pawela, PhD, Assistant Professor, Medical College of Wisconsin, and Bharat Biswal, PhD, Chair of Biomedical Engineering, New Jersey Institute of Technology. It includes original peer-reviewed papers, review articles, point-counterpoint discussions on controversies in the field, and a product/technology review section. To ensure that scientific findings are rapidly disseminated, articles are published Instant Online within 72 hours of acceptance, with fully typeset, fast-track publication within 4 weeks. Tables of content and a sample issue may be viewed on the Brain Connectivity website. Mary Ann Liebert, Inc., publishers is a privately held, fully integrated media company known for establishing authoritative medical and biomedical peer-reviewed journals, including Journal of Neurotrauma and Therapeutic Hypothermia and Temperature Management. Its biotechnology trade magazine, GEN (Genetic Engineering & Biotechnology News), was the first in its field and is today the industry's most widely read publication worldwide. A complete list of the firm's 80 journals, newsmagazines, and books is available on the Mary Ann Liebert, Inc., publishers website.
News Article | February 15, 2017
Chemical reactions that release oxygen in the presence of a catalyst, known as oxygen-evolution reactions, are a crucial part of chemical energy storage processes, including water splitting, electrochemical carbon dioxide reduction, and ammonia production. The kinetics of this type of reaction are generally slow, but compounds called metal oxides can have catalytic activities that vary over several orders of magnitude, with some exhibiting the highest such rates reported to date. The physical origins of these observed catalytic activities are not well understood. Now, a team at MIT has shown that in some of these catalysts oxygen doesn’t come only from the water molecules surrounding the catalyst material; some of it comes from within the crystal lattice of the catalyst material itself. The new findings are being reported this week in the journal Nature Chemistry, in a paper by recent MIT graduate Binghong Han PhD ’16, postdoc Alexis Grimaud, Yang Shao-Horn, the W.M. Keck Professor of Energy, and six others. The research was aimed at studying how water molecules are split to generate oxygen molecules and what factors limit the reaction rate, Grimaud says. Increasing those reaction rates could lead to more efficient energy storage and retrieval, for example, so determining just where the bottlenecks may be in the reaction is an important step toward such improvements. The catalysts used to foster the reactions are typically metal oxides, and the team wanted “to be able to explain the activity of the sites [on the surface of the catalyst] that split the water,” Grimaud says. The question of whether some oxygen gets stored within the crystal structure of the catalyst and then contributes to the overall oxygen output has been debated before, but previous work had never been able to resolve the issue. Most researchers had assumed that only the active sites on the surface of the material were taking any part in the reaction.
But this team found a way of directly quantifying the contribution that might be coming from within the bulk of the catalyst material, and showed clearly that this was an important part of the reaction. They used a special “labeled” form of oxygen, the isotope oxygen-18, which makes up only a tiny fraction of the oxygen in ordinary water. By collaborating with Oscar Diaz-Morales and Marc T. Koper at Leiden University in the Netherlands, they first exposed the catalyst to water made almost entirely of oxygen-18, and then placed the catalyst in normal water (which contains the more common oxygen-16). Upon testing the oxygen output from the reaction, using a mass spectrometer that can directly measure the different isotopes based on their atomic weight, they showed that a substantial amount of oxygen-18, which cannot be accounted for by a surface-only mechanism, was indeed being released. The measurements were tricky to carry out, so the work has taken some time to complete. Diaz-Morales “did many experiments using the mass spectrometer to detect the kind of oxygen that was evolved from the water,” says Shao-Horn, who has joint appointments in the departments of Mechanical Engineering and Materials Science and Engineering, and is a co-director of the MIT Energy Initiative’s Center for Energy Storage. With that knowledge and with detailed theoretical calculations showing how the reaction takes place, the researchers say they can now explore ways of tuning the electronic structure of these metal-oxide materials to increase the reaction rate. The amount of oxygen contributed by the catalyst material varies considerably depending on the exact chemistry or electronic structure of the catalyst, the team found. Oxides of different metal ions on the perovskite structure showed greater or lesser effects, or even none at all. 
In terms of the amount of oxygen output that is coming from within the bulk of the catalyst, “you observe a well-defined signal of the labeled oxygen,” Shao-Horn says. One unexpected finding was that varying the acidity or alkalinity of the water made a big difference to the reaction kinetics. Increasing the water’s pH enhances the rate of oxygen evolution in the catalytic process, Han says. These two previously unidentified effects, the participation of the bulk material in the reaction and the influence of pH on the reaction rate, were found only for oxides with record-high catalytic activity and “cannot be explained by the traditional mechanism” used to describe oxygen evolution reaction kinetics, Diaz-Morales says. “We have proposed different mechanisms to account for these effects, which requires further experimental and computational studies.” “I find it very interesting that the lattice oxygen can take part in the oxygen evolution reactions,” says Ib Chorkendorff, a professor of physics at the Technical University of Denmark, who was not involved in this work. “We used to think that all these basic electrochemical reactions, related to proton membrane fuel cells and electrolyzers, are all taking place at the surface,” but this work shows that “the oxygen sitting inside the catalyst is also taking part in the reaction.” These findings, he says, “challenge the common way of thinking and may lead us down new alleys, finding new and more efficient catalysts.” The team also included Wesley Hong PhD ’16, former postdoc Yueh-Lin Lee, research scientist Livia Giordano in the Department of Mechanical Engineering, Kelsey Stoerzinger PhD ’16, and Marc Koper of the Leiden Institute of Chemistry, in the Netherlands. The work was supported by the Skoltech Center for Electrochemical Energy, the Singapore-MIT Alliance for Research and Technology, the Department of Energy, and the National Energy Technology Laboratory.
News Article | February 15, 2017
Pain is a signal of actual or potential damage to the body, so it is natural to think of it as a localized sensation: knee pain in the knee, back pain in the back and so on. However, research has demonstrated that pain is an experience constructed in the brain. A knee doesn't "feel" anything. Instead, it sends signals to the brain. Input from the body is important, but a person's pain experience also depends on the brain's interpretation of what the input signal means. Scientists are just beginning to study these complex cerebral processes, and in a promising step forward, University of Colorado Boulder researchers have developed a functional MRI-based model that identifies brain activity patterns involved in varied pain experience, even when the input from the body is held constant. "Pain is more than just a passive response to stimuli. The brain actively contributes to pain, constructing it through various neural systems," said Choong-Wan Woo, lead author and a post-doctoral researcher in CU Boulder's Institute of Cognitive Science when the research was completed. "Thus, we wanted to build a brain-based model to predict pain using variables beyond the painful stimuli." For the study, researchers began by aggregating data from six independent brain imaging studies, deliberately choosing those with differing methodologies. In all of the studies, participants had been exposed to several seconds' worth of a painful stimulus and asked to rate their pain while inside an MRI scanner that recorded brain activity. From the data, the researchers were able to identify common markers in the brain that were predictive of a participant's different pain experiences when external stimuli were matched on intensity, resulting in fine-grained mapping of both positively correlated ("pro-pain") and negatively correlated ("anti-pain") brain sub-regions. Forming part of the new model, those markers include several brain regions that are not classically considered important for pain.
However, the regions -- which include the ventromedial prefrontal cortex, nucleus accumbens, and hippocampus -- are involved in the brain's assessment of the meaning of painful and non-painful events alike. The researchers named their telltale brain pattern the Stimulus Intensity Independent Pain Signature-1 (SIIPS1), a preliminary roadmap that can now be tested and refined in future studies. "We now have a model that can be applied to other basic and clinical pain research in the field," said Woo, who is now beginning an Assistant Professorship at Sungkyunkwan University in South Korea. "We deliberately added the number one to the name because we don't think this is the only brain signature related to pain and expect that more will be developed." The SIIPS1 may provide researchers with a new understanding of chronic pain and hypersensitivity to pain, potentially paving the way for the development of clinical applications and more effective treatments. "There is increasing evidence that chronic pain often involves changes in brain areas identified in our model," said Tor Wager, a professor in CU Boulder's Department of Psychology and Neuroscience and the study's senior author. "The SIIPS1 provides a template for systematic evaluation of how these areas are altered in chronic pain. We hope that it will improve our understanding of chronic pain and lead to the development of new options for preventing and treating this complex disease." The study was published today in the journal Nature Communications. In addition to Woo and Wager, co-authors of the new research include Liane Schmidt of Ecole Normale Supérieure (France); Anjali Krishnan of Brooklyn College of the City University of New York; Marieke Jepma of Leiden University (Netherlands); Mathieu Roy of McGill University (Canada); Martin Lindquist of Johns Hopkins University; and Lauren Atlas of the National Center for Complementary and Integrative Health and the National Institute on Drug Abuse.
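The general modeling approach described above, fitting a regularized linear map from distributed brain activity to pain ratings and then reading off positively weighted ("pro-pain") and negatively weighted ("anti-pain") features, can be sketched on synthetic data. This is a minimal illustration of the technique only, not the published SIIPS1 model or its data; every name and number below is invented for the example.

```python
import numpy as np

# Synthetic sketch: fit a ridge-regularized linear model mapping voxel
# activity to pain ratings, then identify positively weighted ("pro-pain")
# and negatively weighted ("anti-pain") voxels.

rng = np.random.default_rng(0)

n_trials, n_voxels = 200, 50
true_w = np.zeros(n_voxels)
true_w[:5] = 1.0     # a few "pro-pain" voxels
true_w[5:10] = -1.0  # a few "anti-pain" voxels

X = rng.normal(size=(n_trials, n_voxels))              # voxel activity per trial
y = X @ true_w + rng.normal(scale=0.5, size=n_trials)  # noisy pain ratings

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

w = ridge_fit(X, y)
pro_pain = np.where(w > 0.5)[0]    # voxels pushing predicted pain up
anti_pain = np.where(w < -0.5)[0]  # voxels pushing predicted pain down
```

With enough trials, the recovered weights separate the pro-pain and anti-pain features cleanly; the real study works the same way in spirit but across six independent fMRI datasets and whole-brain voxel maps.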
News Article | April 8, 2016
Laura van Leeuwe Kirsch was in the middle of her course on kidney transplantation when she answered another email from me. She typed out her response, hit send, and returned to watching a surgeon’s hands wrest open an incision on a patient's abdomen. Or at least, that’s how I imagined it. In reality, my only window into the university student’s life was through our screens, even though only a few Dutch city blocks separate us. It’s not her fault, though: As a first-year biomedical science major at the University of Leiden, she’s overwhelmed with coursework and makes time for little else other than absorbing information. Lately, van Leeuwe Kirsch has been spending more of that time at her screen: in January, she enrolled in “Clinical Kidney Transplantation,” the world’s first MOOC, or massive open online course, to offer instruction in the surgical procedure. She had already spent two hours per week over five weeks of school learning from her laptop, digesting short, pre-filmed interviews with doctors and a kidney donor, playing games that quizzed her on human anatomy, and watching realistic videos detailing the surgical process on a computer-generated patient. After an introductory video, van Leeuwe Kirsch watches a series of six to seven additional clips, each of which typically runs under 10 minutes. At the end, there are one or two interactive activity modules, called “E-tivities,” and a quiz to complete, along with optional videos. Some weeks, she also gets assignments. Because she took the class out of curiosity and to supplement her regular biology course load at Leiden, van Leeuwe Kirsch opted not to pay the optional 43 Euro fee for a certificate of completion, and she skipped the interactive activity modules to save time. “The E-tivities help you start a discussion with your fellow ‘classmates’ in a way that you can learn from each other,” van Leeuwe Kirsch said.
“However, I didn’t participate in any of these discussions since I only had time to follow the course itself.” The course is among the world’s few clinical MOOCs, but it isn't intended to replace real-life education with a professor and a hospital patient. Nor is it likely to prepare van Leeuwe Kirsch to conduct a kidney transplant anytime soon. (“Not that I’m allowed to anyway,” she said.) But it's part of a growing digital push among medical schools seeking to educate a generation of students raised on smartphones and to expand their audiences to virtually anyone with a computer and an internet connection. The university encourages non-medical students to register for its medical MOOCs, but an introductory background in biology and physiology is recommended. Since it began in January, more than 3,500 students from 90 countries have enrolled, and 200 students have completed it on time. Five percent of enrolled students chose to pay the 43 Euro fee for the digital certificate of completion. “I have not taken a MOOC before,” said van Leeuwe Kirsch, “but I am definitely considering taking another one.” Immersive online training began at a global group of med schools in the early 2000s with very limited success. But in recent years, services like YouTube and digital learning platforms designed by companies like Udacity, edX, and Coursera have helped schools across the globe experiment with online learning again as a way to expand the reach of their educational brands and overturn the traditional lecture-exam-lecture format. Supporters of the idea say that MOOCs’ interactive learning elements are designed to increase knowledge retention, while their nearly limitless size allows more students to learn from scenarios that wouldn’t normally accommodate large classes. “We usually cannot get all of the people involved in a kidney transplantation together into one lecture room,” Dr. Marlies Reinders, a nephrologist and one of the instructors of the kidney transplant MOOC, explained.
“It would almost be impossible.” Leiden has made bold strides into the MOOC era. In 2013, it became the first Dutch university to partner with Coursera, the for-profit Mountain View-based company that, with thirteen million registered users and 1,500 courses, leads the pack among massive online education platforms. Last year, the university's Coursera offerings—on topics like global terrorism, tax law, linguistics, and urban mining—reached about 200,000 enrollments from 196 countries. Out of those, 11,000 exams were submitted and over 12,000 Statements of Accomplishment were awarded, according to the university, which has a real-life student body of 25,800 students. The kidney transplantation MOOC is already available independently of its curriculum, but the school intends to integrate it into two courses in its medical curriculum: the first in April and the second in September. And this week, the school launched a new medical MOOC: “Anatomy of the Abdomen and Pelvis, a journey from basis to clinic.” While MOOCs that cover theoretical coursework in physiology, biology, and pathology are more common, other medical schools are experimenting with more clinical MOOCs too. Meanwhile, schools in the U.S. and abroad are still determining how best to mix in-person learning with virtual lessons. Stanford, Yale, and Harvard have all to varying degrees implemented a so-called “flipped” classroom approach, in which students watch lectures at home and work on problems in groups and with professors during class time. In September, Harvard Medical School’s incoming class became its first cohort to learn through its flipped classroom curriculum. Alongside video lectures and interactive activities, students now spend time with patients earlier, with a weekly session starting in year one, and do clinical rotations in year two, moved up by a year. “My job, in the time that we’re together, student and teacher, is to teach you what you can’t Google,” Richard M. 
Schwartzstein, a Harvard professor who helped develop the new curriculum, told the Boston Globe. The office of the course’s lead instructor, Dr. Reinders, feels practically inaccessible behind an unmarked security door. She did not hold office hours during the MOOC, and she didn’t meet any of the enrolled students during the course, but a few arranged to meet her after the course finished, she said. (The class was assisted by four graduate student instructors and five moderators, who were available to answer questions on the course’s online forum.) Like van Leeuwe Kirsch, Reinders is busy—very busy—which is one reason she likes the idea of virtual learning. “As academic doctors we have clinical, research and education duties,” she explained. “We are in charge of the hospital,” and “with regular courses we give classes personally.” But “teaching online saves a lot of time for the teachers. Time saved now can be spent on the other duties. It is a little difficult to express [the benefits] in money.” Dr. Reinders, who is helping the university develop a new medical curriculum and novel teaching methods, stresses that MOOCs like hers will not only save time for teachers and students: they can raise the sophistication of medical instruction too. Preparing for, performing, and following up after a kidney transplant requires at least 13 different departments, she said. Instead of limiting a lecture to one professor, MOOCs let each of these 13 practicing physicians teach students directly through short videos. (A team of three animators helped to design the 3D animations that accompany these videos; the university did not disclose the cost of producing the course materials.) At Leiden, the choice of topic for its first major medical MOOC was no accident. The university specializes in transplantation research and technology, owed in large part to the work of a former faculty member, the Dutch immunologist Jon van Rood. 
Van Rood made significant contributions in the 1960s and 1970s to understanding HLA typing, the technology that determines how well patients will accept foreign cells into their bodies. Van Rood was also one of the founders of Eurotransplant, which coordinates the international organ trade among eight European countries and is run from the University of Leiden’s campus. Medical experts expect the number of kidney transplants to rise sharply in the next decade as the proportion of patients with end-stage renal disease increases. To Dr. Reinders, that makes expanding access to information about kidney health and transplantation imperative. The class is “designed for (bio) medical students, health care professionals and anyone interested in research and knowledge on clinical transplantation," said a post on the school’s Facebook page. Providing instructions for performing a kidney transplant to anyone could, in theory, impact the dangerous black market for organs: kidneys are one of the most widely sought-after organs. In Europe, there were more than 68,000 people on the waiting list for a kidney transplant in 2012. Reinders, however, sees the course as a positive force in the international arena. “We see this MOOC as a way to help other countries that don’t have the same resources,” she said. MOOCs can also help schools like Leiden improve both virtual and in-person learning via the reams of data that their participants generate. “Each one of these courses generates a wealth of behavioral data by which we can evaluate teaching methods, improve on-campus education, and offer a platform for academic research to investigate learning behavior of students with different educational and cultural backgrounds,” wrote Jasper Ginn, a data analyst at the university’s Online Learning Lab, in a blog post last year. Some of that raw data can be found in the ratings and review section on the course’s Coursera page. 
There, among the class’s reviews, one student wrote that the mentors, who are graduate students from the University of Leiden, communicated well with students. Other students lauded the 3-D graphics and up-to-date information. A student who indicated he or she had advanced knowledge of kidney transplantation was slightly less positive. “Very useful for all the users who are not still working in the transplantation field,” they wrote. “For those people this course is perhaps too superficial.” This student still rated the course with the maximum five stars; the course received 4.6 stars overall. While none of the University of Leiden’s MOOCs requires payment, the school receives a percentage of the fees students pay for certificates of completion, part of Coursera's "Signature Track." The certificates aren’t accepted as credit at academic institutions, but they’re endorsed by both Coursera and the institution offering each course, said Daphne Koller, co-founder and president of Coursera. “Learners are given digital copies of their certificates that they can print, and they also have the option to easily publish certificates on their LinkedIn profiles.” Koller said that Coursera’s certificates are one of the most popular types of certificates to share on LinkedIn. If a student wants to receive a certificate but can’t afford to pay the fee, Coursera promises need-based financial aid. Coursera is also expanding past its certificate programs to full-blown degree programs. At the end of March, the company announced a partnership with the University of Illinois to allow Coursera users to earn a master’s degree in data science from the school. This is the first degree-earning opportunity that Coursera has offered and the first that any traditional higher education institution has recognized on a MOOC platform. “We expect to launch a number of additional degree programs in the coming year and onward,” Koller wrote. 
“We would be excited to explore options in medicine or another health-related field.”
Some of the types of teaching modules used in Leiden's MOOCs
The University of Leiden did not say what percentage of fees it receives from its MOOCs, but Coursera says it splits revenue 50-50 with universities. In addition to paid certificates, Coursera and its university partners have also sought new sources of revenue. In a report published in March, the university endorsed new paths toward monetization, and said it would experiment with crowdfunding, pay-it-forward, and pay-what-you-wish models this year. "We are not opposed to experiments with new monetization models as a financially healthy platform provider and the continuity of the platform are important for Leiden University,” says the report. “Also, the share the university receives helps financing MOOC updates and to keep them running on-demand.” For all of her enthusiasm about MOOCs, Reinders is more interested in a hybrid model. “I really believe that, from the start, the approach has to be blended,” she said, alluding to the “flipped” classrooms that other medical schools are beginning to implement. In 2002, 50 medical schools from around the world banded together to form the first-of-its-kind International Virtual Medical School. Led by the University of Dundee, and including Brown and Wake Forest, IVIMEDS encouraged participating schools to offer the same online coursework during the first two years of their programs. In the final two years, IVIMEDS contacts would oversee the students’ clinical experiences at local hospitals. Ultimately, all but a few universities had the technical capabilities to create and share content, and most weren’t committed to streamlining a uniform curriculum on the internet.
“Many universities had only joined because they felt like they were missing out on something,” said Natalie Lafferty, head of the Center for Technology and Innovation in Learning at the University of Dundee, over Skype. By 2010 the program had dissolved. That year, however, a book by a group of medical educators, Educating Physicians: A Call for Reform of Medical School and Residency, helped inspire new thinking around medical curricula, which had changed little since the beginning of the 20th century. Meanwhile, Lafferty credits the rise of the open access movement in the education industry with making virtual learning tools more acceptable in medical schools this time around. Online platforms like Coursera facilitate sharing, and more people own digital recording devices and know how to operate video and graphics editing software. Mobile phones have put educational and informational tools in students’ hands even when they’re roaming the hospital floor. In addition to browsing video archives and doing web searches, they can access MOOCs on mobile devices. Advances in graphics simulations and, now, virtual reality could enhance these MOOCs even more. Software advances have already made it possible to develop an entire virtual patient. Virtual reality is beginning to help practicing physicians better diagnose conditions in patients. Virtual reality software can piece together existing MRI and CT images into 3D visuals that doctors can rotate around and inspect from any angle without having to open up a patient. Yet another digital learning experiment from Harvard and MIT shows that medical MOOCs might not follow a completely linear trajectory into the virtual realm but rather settle into some hybrid of open-closed and digital-in-person instruction. In 2013, Harvard and MIT started a program of small private online courses, called SPOCS, on their proprietary platform, edX.
SPOCS are designed for a single classroom of tuition-paying students who attend the physical lectures. Neither massive nor open, a SPOC is essentially a recorded lecture that a student watches before going to class, the heart of the flipped classroom approach.
From a 2014 worksheet on MOOCs by the American Academy of Medical Colleges
But the flipped classroom approach can come with downsides, say some educators, especially when schools can’t match technology to their pedagogical needs. “Except for introducing myself in the course, I didn’t have any contact with the teachers whatsoever,” van Leeuwe Kirsch told me. “I can imagine that other students that participated in discussions had more contact [with the teachers].” Despite the visual and practical appeal of medical MOOCs, students still crave personal interaction. “A lot of people think MOOCs are the future,” Lafferty said. “But students like interacting with the teacher and interactive small group learning.” Over the last two decades, medical education thought leaders have promoted team and problem-based learning curricula in the pre-clinical years. According to a study published this year in Medical Teacher, pediatrics students retained knowledge better over the short-term using these interactive, in-person methods when compared to students following traditional lectures. Still, the study found, this curriculum change had no impact on long-term knowledge retention. Medical schools that offer a wide range of MOOCs also recognize digital learning’s limits. Harry Goldberg, assistant dean of the Johns Hopkins School of Medicine, began making taped lectures available to students in his cardiovascular physiology course 15 years ago. Today the university’s Bloomberg School of Public Health offers a MOOC catalog that more than two million students have followed on Coursera. But as Goldberg told the medical journal BMJ in 2013, medical teaching often requires more interaction than a MOOC can provide.
“MOOCs have a role in medical education,” he said. “I think that role is a lot smaller than people hope it will be.” Some medical students express a general distaste for flipped classrooms. In February a reddit user commented on Stanford's flipped classroom approach in a forum thread titled “Dear Stanford Medical School.” On another online medical student forum last September, a first-year medical student who attends a school with flipped classrooms described being “broken down” by his performance so far. “I get out of class at 3 usually and have 8 hours of videos to watch so it doesn't leave me with much time to do anything,” he wrote. “I feel that my biggest mistake was studying alone [and] not doing any problems.” One respondent suggested that he watch the videos at 1.5 times their normal speeds. To some, MOOCs may be a sign of how patient-student-doctor interaction in the medical system has changed. Hospitals are discharging patients more quickly after surgery than they have in the past, in line with research that shows that fewer post-operative interventions lead to better patient outcomes. And many patients who would have been treated in hospitals twenty years ago are now treated elsewhere, at outpatient clinics, or at home. Interns, too, spend less time in the hospital. In 2013, a study in the Journal of General Internal Medicine found that US medical interns only spend around eight minutes with each of their patients per day—around 12 percent of their working shift, down from 18 percent in the early nineties. Meanwhile, medical students, like most everyone else, are already spending increasing amounts of time with screens. In addition to learning and updating electronic medical records systems, medical students now use Google as a de facto learning tool, and special social networks have evolved to let doctors share medical information in real-time. A group of doctors called FOAMed shares medical visualizations and case studies on Twitter and Periscope.
In 2014, the venture capital firm Union Square Ventures backed a company called Figure 1, which also lets clinicians share case studies through its app. Figure 1 says 30 percent of medical schools in the US use its app, while it is available in 19 countries and has 150,000 users. Once they leave medical school, doctors may be inclined to spend still more time at computers than face-to-face with their patients. More doctors now accept Skype consultations, and major US health insurance providers support telemedicine apps, like MDLive and Teladoc. Already, online medical courses are proving useful for present-day doctors, as a way of maintaining certification or continuing education for those who don’t have the time to attend medical conferences or read new journal papers. For undergraduates like van Leeuwe Kirsch, MOOCs are offering a taste of what’s to come. She hasn’t yet decided if she’ll enroll in a medical master’s program after graduation, but if and when she does, van Leeuwe Kirsch may not simply be better equipped for kidney transplants: she’ll be more prepared for a curriculum that’s only going to get more virtual. Correction: An earlier version of this article said that Coursera's revenue sharing agreements with universities give it up to 15% of course revenues. In fact, the split is 50-50.
News Article | December 9, 2016
MELBOURNE, Australia, Dec. 09, 2016 (GLOBE NEWSWIRE) -- Adherium Limited (ASX:ADR), a global leader in digital health technologies that address sub-optimal medication use in chronic disease, today announces that some of Europe’s most prestigious clinical opinion leaders have chosen Adherium’s Smartinhaler™ for the myAirCoach programme, part of the European Union’s Horizon 2020 Framework for Research and Innovation. The myAirCoach programme is being funded through the Horizon 2020 framework, which is the largest EU Research and Innovation programme ever established, with nearly €80 billion of funding available over 7 years (2014 to 2020). The purpose of the myAirCoach programme is to establish whether home monitoring and mobile health (mHealth) systems can be used to predict asthma control and the occurrence of asthma exacerbations. One of the main goals of the project is to help patients manage their health through user-friendly tools that will increase awareness of their clinical state as well as the adherence and effectiveness of the medical treatment they follow. The impact of the myAirCoach programme is expected to set the basis for the widespread adoption of sensor-based self-management systems across the spectrum of respiratory diseases. (Source: http://www.myaircoach.eu/myaircoach/content/what-myaircoach-project) Smartinhaler™ is already proven to reduce hospitalisation as demonstrated in the results of the year-long STAAR study, recently published online in the prestigious peer-reviewed medical journal Thorax. The study revealed a significant reduction in hospital admissions over the course of 12 months as well as substantial other health and quality of life benefits for patients using Smartinhaler™. The data from the STAAR study builds on a study published in January 2015 in The Lancet Respiratory Medicine, which showed the use of the Smartinhaler™ platform increased adherence to preventative medication by 180% and reduced use of reliever medication by 45%.
“Successful disease management and prevention is only truly achieved by understanding the real-world complexity of our lives. The best way to do that is through rich data collected in the real world. By providing our innovative solutions to people with asthma, we can help people take the right medicine, at the right dose, at the right time and in the right way,” said Garth Sutherland, Founder & Group CEO of Adherium. “The myAirCoach programme is a leading initiative assessing the impact that digital health solutions, such as Smartinhaler™, can have on patients in a real-world setting. It is a major international project, funded by the prestigious European Union Horizon 2020 framework.” Participating organisations in the myAirCoach project include: Imperial College London, University of Manchester and Leiden University Medical Center. The project is anticipated to run for the next 12 months. Dr. Omar Usmani, Reader in Respiratory Medicine & Consultant Physician, National Heart and Lung Institute, Imperial College London and Royal Brompton Hospital, said: “The myAirCoach programme will help asthma sufferers to manage their health through user-friendly tools that will increase awareness of their clinical state as well as their adherence and the effectiveness of medical treatment they follow.” Professor Fan Chung, Professor of Respiratory Medicine and Head of Experimental Studies Medicine, National Heart and Lung Institute, Imperial College London and Royal Brompton Hospital, commented: “People with asthma are more likely to miss school or work and experience reduced quality of life.
We are excited about the collaboration between Adherium and myAirCoach, which is expected to set the basis for the widespread adoption of sensor-based self-management systems across the spectrum of respiratory diseases.” Adherium will be supplying a range of products to the programme via three key centres, including devices for inhaled medications manufactured by AstraZeneca and GSK. The project is coordinated by the Centre for Research and Technology Hellas (CERTH), Thessaloniki, Greece, and is supported by a group of asthma patients acting as advisors. Other participating organisations in the myAirCoach project include: University of Patras, Allertec Hellas SA, IHP Microelectronics, ZorgGemak (MedVision 360BV), European Federation of Asthma & Allergy Associations, Cnet Svenska AB, Circassia and Asthma UK. More information on myAirCoach can be found at the project’s website http://www.myaircoach.eu/myaircoach/
Published online: Thorax. doi:10.1136/thoraxjnl-2015-208171
Chan AHY, Stewart AWS, Harrison J, Camargo C, Black PN, Mitchell EA. The effect of an inhaler with ringtones on asthma control and school attendance in children. Lancet Respir Med. 2015;3:210-219.
Adherium (ASX:ADR) is an Australian Securities Exchange-listed company which develops, manufactures and supplies digital health technologies that address sub-optimal medication use and improve health outcomes in chronic disease. Adherium operates globally from bases in the USA, Europe and Australasia. Adherium is a provider of digital health solutions to patients, pharmaceutical companies, healthcare providers and contract research organizations. The Company’s proprietary Smartinhaler™ platform has been independently proven to improve medication adherence and health outcomes for patients with chronic respiratory disease. Adherium has the broadest range of "smart" medication sensors for respiratory medications globally.
The Smartinhaler™ platform has so far been used in more than 65 projects (clinical, device validation or other) and has been referenced in 56 peer reviewed journal articles. Clinical outcomes data have shown that the Smartinhaler™ platform can improve adherence by up to 59% in adults and 180% in children and reduce severe episodes by 60% in adults, leading to improved quality of life and demonstrating a substantial gain over current best practice treatment. The Company has received FDA 510(k) notifications for clearance to market and CE Marks for its devices and software, which allows it to sell these devices into international markets.
Huisman E.M.,Leiden University |
Lubensky T.C.,University of Pennsylvania
Physical Review Letters | Year: 2011
We numerically investigate deformations and modes of networks of semiflexible biopolymers as a function of crosslink coordination number z and the strength of bending and stretching energies. In equilibrium, filaments are under internal stress, and the networks exhibit shear rigidity below the Maxwell isostatic point. In contrast to two-dimensional networks, ours exhibit nonaffine bending-dominated response in all rigid states, including those near the maximum of z=4 when bending energies are less than stretching ones. © 2011 American Physical Society.
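As context for the Maxwell isostatic point invoked in the abstract, the standard constraint-counting argument (a textbook result, not taken from this paper) runs as follows:

```latex
% Maxwell constraint counting for a central-force network:
% N crosslinks in d dimensions with average coordination z give
% dN degrees of freedom and zN/2 bond-length constraints.
% Naive rigidity therefore requires
\[
  \frac{zN}{2} \;\ge\; dN
  \quad\Longrightarrow\quad
  z \;\ge\; z_c = 2d ,
\]
% i.e. z_c = 6 in three dimensions. A filamentous network whose
% maximum coordination is z = 4 (two filaments per crosslink) sits
% below this threshold, so any shear rigidity it shows must come
% from interactions beyond central forces, such as the filaments'
% bending stiffness.
```

This is why rigidity below the isostatic point, as reported here, signals a bending-dominated rather than stretching-dominated response.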
Titulaer M.J.,Leiden University |
Titulaer M.J.,University of Pennsylvania |
Lang B.,John Radcliffe Hospital |
Verschuuren J.J.G.M.,Leiden University
The Lancet Neurology | Year: 2011
Lambert-Eaton myasthenic syndrome (LEMS) is a neuromuscular autoimmune disease that has served as a model for autoimmunity and tumour immunology. In LEMS, the characteristic muscle weakness is thought to be caused by pathogenic autoantibodies directed against voltage-gated calcium channels (VGCC) present on the presynaptic nerve terminal. Half of patients with LEMS have an associated tumour, small-cell lung carcinoma (SCLC), which also expresses functional VGCC. Knowledge of this association led to the discovery of a wide range of paraneoplastic and non-tumour-related neurological disorders of the peripheral and central nervous systems. Detailed clinical studies have improved our diagnostic skills and knowledge of the pathophysiological mechanisms and association of LEMS with SCLC, and have helped with the development of a protocol for early tumour detection. © 2011 Elsevier Ltd.
Zijlstra P.,Leiden University |
Paulo P.M.R.,Leiden University |
Orrit M.,Leiden University |
Orrit M.,University of Lisbon
Nature Nanotechnology | Year: 2012
Existing methods for the optical detection of single molecules require the molecules to absorb light to produce fluorescence or direct absorption signals. This limits the range of species that can be detected, because most molecules are purely refractive. Metal nanoparticles or dielectric resonators can be used to detect non-absorbing molecules because local changes in the refractive index produce a resonance shift. However, current approaches only detect single molecules when the resonance shift is amplified by a highly polarizable label or by a localized precipitation reaction on the surface of a nanoparticle. Without such amplification, single-molecule events can only be identified in a statistical way. Here, we report the plasmonic detection of single molecules in real time without the need for labelling or amplification. Our sensor consists of a single gold nanorod coated with biotin receptors, and the binding of single proteins is detected by monitoring the plasmon resonance of the nanorod with a sensitive photothermal assay. The sensitivity of our device is ~700 times higher than state-of-the-art plasmon sensors and is intrinsically limited by spectral diffusion of the surface plasmon resonance. © 2012 Macmillan Publishers Limited.
Cohen A.F.,Leiden University |
Cohen A.F.,Center for Human Drug Research
Nature Reviews Drug Discovery | Year: 2010
New medicines are designed to bind to receptors or enzymes and are tested in animal cells, tissues and whole organisms in a highly scientific process. Subsequently they are often administered to human subjects with tolerability as the primary objective. The process of development is considered to be linear and consecutive and passes through the famous four phases of development (Phase I-Phase IV). This is efficient for those projects for which the uncertainty about the development is low. There is, however, an increasing number of new prototypical compounds, resulting from increased biological knowledge, for which the level of uncertainty is high. For these prototypical drugs, development has to proceed in a much more adaptive manner, using tailor-made objectives, the development of special methodology, and a cyclical rather than a linear type of project management. © 2010 Macmillan Publishers Limited. All rights reserved.
Inouye S.K.,Beth Israel Deaconess Medical Center |
Inouye S.K.,Institute for Aging Research |
Westendorp R.G.J.,Leiden University |
Westendorp R.G.J.,Vitality |
Saczynski J.S.,University of Massachusetts Medical School
The Lancet | Year: 2014
Delirium is an acute disorder of attention and cognition in elderly people (ie, those aged 65 years or older) that is common, serious, costly, under-recognised, and often fatal. A formal cognitive assessment and history of acute onset of symptoms are necessary for diagnosis. In view of the complex multifactorial causes of delirium, multicomponent non-pharmacological risk factor approaches are the most effective strategy for prevention. No convincing evidence shows that pharmacological prevention or treatment is effective. Drug reduction for sedation and analgesia and non-pharmacological approaches are recommended. Delirium offers opportunities to elucidate brain pathophysiology - it serves both as a marker of brain vulnerability with decreased reserve and as a potential mechanism for permanent cognitive damage. As a potent indicator of patients' safety, delirium provides a target for system-wide process improvements. Public health priorities include improvements in coding, reimbursement from insurers, and research funding, and widespread education for clinicians and the public about the importance of delirium.
Bot I.,Leiden University |
Shi G.-P.,Harvard University |
Kovanen P.T.,Wihuri Research Institute
Arteriosclerosis, Thrombosis, and Vascular Biology | Year: 2015
The mast cell is a potent immune cell known for its functions in host defense responses and diseases, such as asthma and allergies. In the past years, accumulating evidence established the contribution of the mast cell to cardiovascular diseases as well, in particular, by its effects on atherosclerotic plaque progression and destabilization. Through their release not only of mediators, such as the mast cell-specific proteases chymase and tryptase, but also of growth factors, histamine, and chemokines, activated mast cells can have detrimental effects on their immediate surroundings in the vessel wall. This results in matrix degradation, apoptosis, and enhanced recruitment of inflammatory cells, thereby actively contributing to cardiovascular diseases. In this review, we will discuss the current knowledge on mast cell function in cardiovascular diseases and speculate on potential novel therapeutic strategies to prevent acute cardiovascular syndromes via targeting of mast cells. © 2014 American Heart Association, Inc.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FETOPEN-01-2016-2017 | Award Amount: 3.73M | Year: 2017
Two-dimensional materials (2DMs) such as graphene, hexagonal boron nitride, silicene and others, are currently amongst the most intensively studied classes of materials that hold great promise for future applications in many technological areas. However, the main hurdle against practical utilization of 2DMs is the lack of effective mass production techniques to satisfy the growing qualitative and quantitative demands for scientific and technological applications. The current state-of-the-art synthesis method of 2DMs involves the dissociative adsorption of gas-phase precursors on a solid catalyst. This process is slow by nature, inefficient, and environmentally unfriendly. Our analysis and recent experimental evidence suggest that using liquid metal catalysts (LMCats) instead of solid ones bears the prospect of a continuous production of 2DMs with unprecedented quality and production speed. However, the current knowledge about the catalytic properties of LMCats is extremely poor, as they had no technological significance in the past. In fact, there exist no well-established experimental facilities, nor theoretical frameworks to study the ongoing chemical reactions on a molten surface at elevated temperatures and under a reactive gas atmosphere. Our aim is to establish a central lab under supervision/collaboration of several scientific/engineering teams across Europe to develop an instrumentation/methodology capable of studying the ongoing chemical reactions on the molten catalyst, with the goal to open two new lines of research, namely in situ investigations on the catalytic activity of LMCats in general, and unravelling the growth mechanisms of 2DMs on LMCat surfaces in particular. The gained knowledge will be used to establish the first efficient mass production method for 2DMs using the new LMCat technology. This will open up the possibility of exploiting the unique properties of 2DMs on an industrial scale and in everyday devices.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: SC5-13e-2015 | Award Amount: 2.01M | Year: 2015
Primary and secondary raw materials are fundamental to Europe's economy and growth. They represent the most important link in the value chain of industrial goods production, which plays a prominent role as a source of prosperity in Europe. However, as stated in the call, there exists to date no raw materials knowledge infrastructure at EU level. The Mineral Intelligence Capacity Analysis (MICA) project contributes to on-going efforts towards the establishment of such an infrastructure by projects such as ProMine, EURare, Minventory, EuroGeoSource, Minerals4EU, ProSum, I2Mine, INTRAW, MINATURA2020 and others. The main objectives of MICA are: - Identification and definition of stakeholder groups and their raw material intelligence (RMI) requirements, - Consolidation of relevant data on primary and secondary raw materials, - Determination of appropriate methods and tools to satisfy stakeholder RMI requirements, - Investigation of (RMI-) options for European mineral policy development, - Development of the EU-Raw Materials Intelligence Capacity Platform (EU-RMICP) integrating information on data and methods/tools with a user interface capable of answering stakeholder questions, - Linking the derived intelligence to the European Union Raw Materials Knowledge Base developed by the Minerals4EU project. The MICA project brings together a multidisciplinary team of experts from natural and technical sciences, social sciences including political sciences, and information science and technology to ensure that raw material intelligence is collected, collated, stored and made accessible in the most useful way corresponding to stakeholder needs. Furthermore, the MICA project integrates a group of 15 European geological surveys that contribute to the work program as third parties. They have specific roles in the fulfilment of tasks and will provide feedback to the project from the diverse range of backgrounds that characterizes the European geoscience community.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: KBBE-2008-1-1-01 | Award Amount: 4.17M | Year: 2009
Successful and efficient plant breeding depends on rapid recombination of advantageous traits to form new crop varieties. In recent years new breeding techniques have been introduced which rely on transgenic alteration of somatic cells and regeneration into plants with novel properties. The precision and effectiveness of both strategies rely upon homologous recombination (HR). The objective of this proposal is to provide plant breeders with new tools allowing better control over HR in both somatic and meiotic cells. The expected outcomes of the proposed research are efficient gene targeting (GT) technologies for precise engineering of plant genomes and control of rates of meiotic recombination between homologous or homeologous chromosomes in classical breeding. The major components of the HR machinery are common to somatic and meiotic cells, enabling us to address both processes in a synergistic way. HR can be divided into different steps: initiation by formation of a DNA double-strand break (DSB); recognition and invasion of a homologous DNA sequence; resolution of recombination structures. Each stage contains a bottleneck for both GT and meiotic HR that we will address. Work package 1 (WP1) aims at enhancing HR through targeted DSB induction. DSBs will be induced by Zinc-finger nucleases that can be custom-designed for target sequences anywhere in the genome. In WP2, we will test the influence of HR factors affecting homologue invasion and heteroduplex formation, such as RAD51 and its paralogues, the RAD52 homologue, genes that affect cytosine methylation in DNA, and mismatch repair. In WP3 we will concentrate on proteins involved in resolution and crossing-over. WP4 will test combinations of those approaches found in the first three WPs to build optimal strategies for application. Most experiments will be performed in the model plant Arabidopsis and implemented into crops such as tomato and maize to guarantee quick applicability for breeding.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: KBBE-2007-3-1-09 | Award Amount: 8.06M | Year: 2009
Plants sustainably produce low levels of secondary metabolites of high industrial value. However, they are often too complex to be economically manufactured by chemical synthesis. Advanced metabolic engineering and exploitation of plants as Green Factories has been prevented due to poorly understood metabolic pathways in plants and the regulation thereof. SmartCell brings together 14 leading European academic laboratories and four industrial partners in order to create a novel concept for rationally engineering plants towards improved economical production of high-value compounds for non-food industrial use. Although SmartCell focuses on terpenoids, the largest class of secondary metabolites, which exhibit extremely diverse biological and pharmaceutical activities, all knowledge, tools and resources developed in the project are generic and broadly applicable to engineer any plant biosynthetic pathway. A systems biology approach using metabolomics and transcriptomics is taken to move beyond the state of the art. New multigene transfer technologies are developed. By screening and functionally categorizing genes at structural, regulatory and transport levels a comprehensive knowledge base of how secondary metabolite biosynthetic pathways operate in plants is developed. The case study component i.e. manufacturing a valuable terpenoid in an optimized large-scale system gives SmartCell a unique opportunity to directly make the transition from fundamental science to application. For long-term exploitation an integrated database, compound library, cell culture collection and a genebank available for academic and industrial communities will be established. SmartCell provides new opportunities for SMEs and established European biotech companies, and the technology can also be transferred to other e.g. fine chemical and pharmaceutical industries. SmartCell will prove that plant-based resources can furnish the European society and industry far more than they presently do.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2008-1.1.1 | Award Amount: 5.40M | Year: 2009
It is an unfortunate truth that current electronics is facing a brick wall in a decade or so, when Moore's law has finally run its course and no further miniaturization is possible. We need something new. Coherent electron circuitry may provide that entirely new alternative. In nanocircuits the electrons can behave coherently over the circuit dimension and thus follow the rules of wave motion rather than Ohm's law. To achieve coherence, however, electron scattering lengths must be larger than the sample size. That demands high purity to limit impurity scattering, but even limiting thermal scattering, by working at millikelvin temperature, we are still confined to circuits on the nanoscale. This provides the motivation for this application: there is an implicit imperative in nanoscience that there are enormous advantages to be gained at much lower temperatures. Despite the clear demand, nanoscience in general is inhibited from advancing beyond the millikelvin regime by a lack of appropriate expertise and facilities. However, in Europe we already have the greatest concentration of microkelvin infrastructure and expertise in the world, developed by our quantum-fluids community. By integration and rationalization MICROKELVIN aims to put this existing infrastructure at the disposal of the wider community and together develop new techniques and materials to bring coherent structures into the completely new regime. Our ultimate aim is the creation of a virtual European microkelvin laboratory without walls operating as a single entity. Integration will also allow us to pool our existing expertise and project it outward by creating new stand-alone machines able to access this temperature range anywhere. Such activity will also encourage European commercial interest in this opportunity. This advance is inevitable in the long term, but the European lead in the microkelvin field gives us the opportunity now to be the first with this new development. 
The infrastructure is there. The need is manifest. We simply have to bring the two together.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.8.0 | Award Amount: 4.58M | Year: 2008
HIDEAS aims at a breakthrough in the information capacity of Quantum Communication (QC), well beyond the standard single-mode approach, by exploiting the intrinsic multi-modal and multivariate character of the radiation field. Our long term vision is that of a broadband Quantum Communication, where all the physical properties of the photons are utilized to store information. Working at the quantum level requires: i) to produce quantum entanglement of light in high dimensional and multivariate spaces and ii) to create multimode quantum interfaces between light and matter in order to store high-D quantum states of light in long-lived matter degrees of freedom. Beneficial impacts will be also on Quantum Metrology. As crucial steps towards these objectives we propose five research lines: WP1 aims at bringing to the quantum realm the spectacular progress achieved by the introduction of frequency combs in the classical domain. WP2 aims at realizing sources of multimode spatially entangled light appropriate for paving the way to parallel quantum communication and information processing. WP3 aims at realizing a very high-D entanglement between twin photons produced by parametric down-conversion (PDC). It focuses on the conjugate variables angle and optical angular momentum (OAM), providing a larger alphabet for QC and a more robust non-locality. WP4 explores the non-factorable spatio-temporal X-structure of light entanglement in PDC, opening an avenue that offers unique access to the full broad band of PDC. WP5 exploits various forms of multimode light-matter entanglement, to realize multi-modal light-atom quantum interfaces and a parallel quantum memory for light (quantum hologram), with resolution and memory capacity exceeding those of classical holograms. The consortium gathers an unparalleled expertise in measuring and manipulating OAM, in generating/controlling quantum effects in multi-mode PDC and in realizing light-matter quantum interfaces.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: ICT-26-2014 | Award Amount: 979.81K | Year: 2015
LIGHT2015 is an ambitious, high impact EU-wide outreach and education project to promote the importance of photonics to young people, entrepreneurs and the general public during the United Nations International Year of Light and Light-based Technologies (IYL2015). LIGHT2015 will leverage the tremendous visibility of IYL2015 to ensure that the public in all member states of the EU understand and appreciate: (i) what photonics is; (ii) what photonics is used for; (iii) why photonics is important. LIGHT2015 will raise awareness of the essential role of photonics in driving economic growth and improving quality of life, and recognition of the need for continued promotion of training and education in photonics. Specific target audiences are young people, entrepreneurs and the public, and there will be special focus on encouraging careers for women and girls. The LIGHT2015 project management team includes the leadership of the global IYL2015 initiative, three members of the Board of Stakeholders of Photonics21, and Europe's major scientific societies in physics and optics. We thus have access to an unprecedented network of contacts which will allow effective penetration of activities to the EU public. LIGHT2015 is structured in terms of three broad objectives: Explain Photonics, Inspire People and Network Europe. To achieve these, we plan a broad range of complementary actions: (i) high-profile public lectures directed to students, entrepreneurs and industry EU-wide; (ii) events involving broad stakeholder groups coinciding with major events on the 2015 photonics calendar; (iii) hands-on training for teachers and students, and the first European-scale citizen-science photonics experiment using smartphones to raise awareness of the power of photonics in daily life; (iv) leveraging the IYL2015 visibility to strengthen networks and collaborations in Europe in the fields of photonics outreach and education.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2012-ITN | Award Amount: 3.66M | Year: 2012
By the middle of the year, experts had already declared 2011 the most expensive year on record for global disaster damages. The EU has articulated its ambition to become an effective player in crisis management as part of the European Security Strategy (Stockholm Programme 2010-2014, Lisbon Treaty). The Hyogo Framework stipulates that a substantial reduction of disaster losses is only possible through advanced governance and management. NITIMesr has been designed as a practice-inspired research initiative that addresses the research and conceptualization of new modes and competencies for coordination and collaboration in heterogeneous actor networks. This includes the involvement of individuals, advanced practices for vertically integrating the governance of crisis management, and strategic and operative management, with a special focus on the involvement and engagement of self-motivated individuals - actors, agents, volunteers, citizens or, as we call them - entrepreneurs. NITIMesr brings together an interdisciplinary group of researchers and industry partners, including the two leading European security clusters: the cluster around the International Court of Justice in Den Haag, and the Bavarian Security Cluster around the International Campus Aerospace and Security, BICAS, at the German EADS headquarters in Munich. The aim is to explore new frontiers of safety and security with the institutional, governance, organizational and managerial challenges of crisis networks, to develop and test innovative approaches for coordination in real-world settings, and to build Europe-wide connected clusters for crisis management and implement integrated solutions. Both clusters expect universities to provide the key success factor, talent, through leadership for joint research, education, development and career development. The role of this ITN is to provide initial resources to establish European-level leadership to build research and training capacities in the security clusters on advanced crisis management.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2016 | Award Amount: 4.00M | Year: 2016
Circuit, the Circular European Economy Innovative Training Network, creates a cohort of future leaders in research, policy & business through its innovative training programme focused on the Circular Economy. Circular business models, based on leasing or providing functionality rather than products, often called Product Service Systems, are widely seen as a way for business to create sustainable jobs and growth. The Ellen MacArthur Foundation (EMF) and McKinsey calculated that circular business will create billions of value. This opportunity has become an important development area for researchers engaged in the sustainability, engineering and design and business fields. Seven top universities well embedded in the EIT KIC Raw Materials, supported by the EMF, their CE100 network and various companies, propose here a multi-disciplinary approach to ensure a range of research perspectives are included across the circular field. Five main areas of research are relevant to understanding how to create such business models. 1. Businesses and business models: how to stimulate circular provisioning? 2. Supply chains: how to organize supply and delivery chains for circularity? 3. Users: how to motivate and stimulate circular consumption? 4. Design: how to design circular value propositions? 5. Systems: how to ensure economic and environmental benefits support the change to circularity? We choose these areas as our main Work Packages, and appoint PhD students in each of these areas with as main goals: 1. Create new business model innovation across Europe that helps to support the economy while at the same time reducing ecological burden. 2. Create a new, sustainable and cross-disciplinary network of trained experts who will have the skills, qualifications, and professional connections to drive future innovation. 3. Create new links between industry and academia in training ESRs to develop new approaches to PSS which will help organisations to compete, create growth and innovation.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENV.2010.4.2.2-1 | Award Amount: 4.49M | Year: 2011
The main idea behind this project is to refine and elaborate economic and environmental accounting principles as discussed in the London Group and consolidated in the future SEEA 2012, to test them in practical data gathering, to troubleshoot and refine approaches, and to show the added value of having such harmonized data available via case studies. This will be done in the priority areas mentioned in the call, i.e. waste and resources, water, forest and climate change / Kyoto accounting. In this, the project will include work and experiences from major previous projects focused on developing harmonized data sets for integrated economic and environmental accounting (most notably EXIOPOL, FORWAST, a series of Eurostat projects in environmental accounting, and to a lesser extent EU KLEMS and WIOD). Where possible, data gathered in the project will be consolidated in and enrich such existing databases (most notably the EXIOPOL and FORWAST databases). The project will be executed by a mix of National Statistical Institutes and top research institutes in this field in Europe, most of which were already involved in EXIOPOL, FORWAST and various Eurostat projects setting up environmental and economic accounts, or have dedicated expertise on key domain areas. The project has made special provisions for further engagement of (European) participants in the London Group.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 2.37M | Year: 2009
Mobility of Ideas and Transmission of Texts is a joint PhD programme (2009-2013) that studies the medieval transmission of learning from the ecclesiastical and academic elites of the professional intellectuals to the wider readership that could be reached through the vernacular. The programme focuses on the medieval dynamics of intellectual life in the Rhineland and the Low Countries, nowadays divided over five countries (Switzerland, France, Germany, Belgium and the Netherlands) but one cultural region in the later Middle Ages. Here, the great fourteenth-century mystics Meister Eckhart, Johannes Tauler, Jan van Ruusbroec and their contemporaries produced a sophisticated vernacular literature on contemplative theology and religious practice, introducing new lay audiences to a personal relation with the Supreme Being. The project seeks to develop a new perspective on this literary culture by looking at the readership, appropriation and circulation of texts in the contemporary religious and intellectual contexts. The programme unites expertise in the fields of medieval philosophy, religious studies, manuscript studies and Dutch and German literature, to provide structural training for interdisciplinary and international research in one of the medieval aspects of European culture of lasting merit. The training programme is built on a number of current research projects in which all full partners (Antwerp, Freiburg, Lecce, Leiden and Oxford) participate simultaneously, thus offering an adequate international infrastructure for a series of coherent PhD projects on medieval literature and learning that require a broader academic framework than the national literatures and other concepts of the modern tradition of academic disciplines. The programme prepares a new generation of medievalists for international careers in academic research, education and the presentation of the medieval cultural heritage.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.4.1-6 | Award Amount: 3.95M | Year: 2008
The co-operative Metafight project will perform functional studies to understand the dissemination and outgrowth of metastases through systematic analysis of the Core Invasive Machinery contained within integrin-mediated ECM attachment structures. It includes a large and discretely localised intracellular signalling network which drives migration and invasion. Our strategy is based on the hypothesis that metastatic cells arise from molecular changes that alter the primary tumour architecture as well as the tumour microenvironment, by reversible modulation of cell adhesion, onset of signalling pathways and acquisition of novel migratory and invasive capacities. The Metafight Consortium is strengthened by established combined expertise on adhesion receptor signalling and the availability of unique tumour cell and cancer animal model systems. We will characterise known and novel components of the Core Invasive Machinery, responsible for the development and homing of metastases, by innovative in vivo non-invasive imaging techniques. The Metafight project comprises seven major interrelated Work Packages (WP). WP1 is designed to investigate the molecular and functional architecture of the Core Invasive Machinery, on which tumour cell migration and, eventually, invasion depend. WP2 and WP3 screen for key modulators of metastasis formation in in vivo murine models, by non-invasive imaging techniques. WP4 is related to a comprehensive proteomic analysis of the expression and phosphorylation of our candidate modulators in metastasis formation. WP1-4 feed into WP5, which builds upon and exploits knowledge generated in the first four WPs and translates it into existing pharmacological interventions to generate new candidate drugs. This will lead to patent filing and licensing strategies. WP6 and WP7 handle dissemination and management activities. The ultimate accomplishment will include improvement of health, reduction of health expenses, as well as the creation of new jobs and economic benefit.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: REGIONS | Award Amount: 2.41M | Year: 2010
Challenge: Health-TIES addresses Europe's largest health challenges: an ageing population and the sustainability of the healthcare system. This cross-border challenge needs a collaborative approach! Third revolution: True collaboration between medical scientists, engineers, healthcare providers, industry and government is the key to innovation in healthcare (the third revolution). Health-TIES is at the forefront: a transnational consortium in medical technology comprising four top regions in biosciences, technology and entrepreneurship: Biocat (Catalonia), Medical Delta (West Netherlands), Oxford/Thames Valley and Canton de Zurich, plus a mentoring region, Észak-Alföld (Hungary). Common goals: To maximise the impact of innovation and RTD on healthcare, Health-TIES will stimulate joint science, education and state-of-the-art infrastructure, and boost the Healthcare Technology Innovation Cycle. S&T meet patients: Breakthroughs in molecular technology, imaging and drug design have created new avenues for prevention (risk factor identification, early diagnosis) and more effective treatment. Health-TIES will apply these to cardiovascular diseases, cancer, neurodegenerative diseases, and immunology & infectious diseases. Accelerate innovation: Health-TIES will accelerate the Healthcare Technology Innovation Cycle, which connects engineers and medical professionals, scientists and entrepreneurs, developers and end-users. Policy makers will be engaged in developing the RTD agenda. Analyse, integrate, disseminate: Health-TIES activities include analysing innovation performance; attracting SMEs; exchanging staff and students; entrepreneurial training; connecting stakeholders; involving end-users in RTD; and sharing best practices through a Virtual Reference Region and ICT tools. Better health(care): Health-TIES will contribute to improved health of EU citizens, better-equipped medical doctors, more efficient and effective healthcare systems, and an economically viable sector.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-14-2015 | Award Amount: 7.97M | Year: 2016
Uveal melanoma (UM) is a rare intraocular tumour with an incidence of 5 cases per million individuals per year. Up to 50% of UM patients develop metastases, most often in the liver, and these are invariably fatal. Despite new discoveries in the genetic and molecular background of the primary tumour, little is known about the metastatic disease; furthermore, there is no therapy to either prevent or treat UM metastases. In UM Cure 2020, we aim to identify and validate at the preclinical level novel therapeutic approaches for the treatment of UM metastases. For this purpose, the consortium brings together major experts in UM in both patient care and basic/translational/clinical research, as well as patient representatives. An ambitious multidisciplinary approach is proposed to move from patient tissue characterisation to preclinical evaluation of single or combinations of drugs. This approach includes the characterisation of the genetic landscape of metastatic UM and its microenvironment, proteomic studies to address signal pathway deregulation, and the establishment of novel, relevant in vitro and in vivo UM models. We also aim to validate accurate surrogate endpoint biomarkers to evaluate therapies and detect metastases as early as possible. Underpinning this will be the UM Cure 2020 virtual biobank registry, linking existing biobanks into a harmonised network, which will prospectively collect primary and metastatic UM samples. Together, our approach will lead to the identification of new therapies, allowing the initiation of UM-dedicated clinical trials sponsored by academia or pharma. Dissemination of results will include the building of a patient network across the consortium countries, as well as a dedicated UM patient and caregiver data portal on the UM Cure 2020 website, in order to increase patient information and disease awareness.
News Article | December 7, 2015
Want to make a virus? It's easy: combine one molecule of genomic nucleic acid, either DNA or RNA, and a handful of proteins, shake, and in a fraction of a second you'll have a fully-formed virus. While that may sound like the worst infomercial ever, in many cases making a virus really is that simple. Viruses such as influenza spread so effectively, and as a result can be so deadly to their hosts, because of their ability to spontaneously self-assemble in large numbers. If researchers can understand how viruses assemble, they may be able to design drugs that prevent viruses from forming in the first place. Unfortunately, how exactly viruses self-assemble has long remained a mystery because it happens very quickly and at very small length-scales. Now, there is a system to track nanometer-sized viruses at sub-millisecond time scales. The method, developed by researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), is the first step towards tracking individual proteins and genomic molecules at high speeds as they assemble to create a virus. The research was led by Vinothan Manoharan, the Wagner Family Professor of Chemical Engineering and Professor of Physics, and was published recently in ACS Nano. Manoharan's group worked in collaboration with researchers at Leiden University, MIT, the Leibniz Institute of Photonic Technology, the University of Jena, and Heraeus Quarzglas, a manufacturer of fiber optics. "Our goal is to understand how viruses manage to assemble spontaneously, so quickly and so robustly," said Yoav Lahini, research associate, former Pappalardo Fellow at MIT, and co-first author of the study. Identifying critical intermediate stages in the assembly process could help researchers understand how to interfere with this process, Lahini said. Shedding light on the physics of self-assembly could also help engineers design better synthetic nanomaterials that can spontaneously piece themselves together. There are two main challenges to tracking virus assembly: speed and size.
While fluorescent microscopy can detect single proteins, the fluorescent chemical compound that emits photons does so at a rate too slow to capture the assembly process. It's like trying to observe the mechanics of a hummingbird's flapping wing with a stop-motion camera; it captures pieces of the process but the crucial frames are missing. Very small particles, like capsid proteins, can be observed by how they scatter light. This technique, known as elastic scattering, yields an effectively unlimited number of photons at a time, solving the problem of speed. However, the photons also interact with dust particles, reflected light, and imperfections in the optical path, all of which obscure the small particles being tracked. To solve these problems, the team decided to leverage the outstanding quality of optical fibers, perfected over years of research in the telecommunications industry. They designed a new optical fiber with a nano-scale channel, smaller than the wavelength of light, running along the inside of its silica core. This channel is filled with liquid containing nanoparticles, so that when light is guided through the fiber's core, it scatters off the nanoparticles in the channel and is collected by a microscope above the fiber. The researchers observed the motion of viruses measuring 26 nanometers in diameter at a rate of thousands of measurements per second. "These are the smallest viruses to be tracked on sub-millisecond time scales, which are comparable to the time scales for self-assembly," said Rees Garmann, post-doctoral fellow in the Manoharan lab and co-author of the research. The next step is to track not just single viruses but single viral proteins, which scatter 100 to 1,000 times less light than a single virus. "This research is a step forward in observing and measuring the self-assembly of viruses," said Manoharan. "Viral infection involves many complex molecular and cellular pathways, but self-assembly is a process that is found in many different viruses. 
This simple technology, which is cheap, easy and scalable, could provide a new, cost-effective way to study and diagnose viruses. From the point of view of fundamental physics, understanding the self-assembly of a naturally evolved system would be a major milestone in the study of complex systems." More information: Sanli Faez et al., "Fast, Label-Free Tracking of Single Viruses and Weakly Scattering Nanoparticles in a Nanofluidic Optical Fiber," ACS Nano (2015). DOI: 10.1021/acsnano.5b05646
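The 100-to-1,000-fold gap between virus and single-protein signals follows from simple size scaling. As a rough illustration (my own sketch, not from the paper; it assumes the detected signal scales with particle volume, i.e. with diameter cubed, and assumes a capsid-protein size of a few nanometers):

```python
# Back-of-the-envelope sketch (assumption: detected signal ~ particle
# volume ~ diameter^3; protein sizes below are illustrative guesses).

def relative_signal(d_large_nm: float, d_small_nm: float) -> float:
    """Ratio of scattering signals for two particle diameters,
    assuming signal scales as diameter cubed."""
    return (d_large_nm / d_small_nm) ** 3

# 26 nm virus vs. a hypothetical 3-5 nm capsid protein:
print(relative_signal(26, 5))  # ~141x weaker signal from the protein
print(relative_signal(26, 3))  # ~651x weaker signal
```

Under this volume-scaling assumption, a few-nanometer protein indeed gives roughly 100 to 1,000 times less signal than a 26 nm virus, consistent with the figure quoted above.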
News Article | May 26, 2016
Stalagmite rings found deep inside a French cave suggest not only that Neanderthals were skilled builders, but that they were constructing complex structures as much as 100,000 years before the arrival of Homo sapiens. Scientists initially explored Bruniquel Cave in southwest France to look for cave bears and rare megafauna. Instead, they stumbled upon mysterious stony structures more than a thousand feet from the entrance - structures that, without a doubt, were purposely crafted. About 400 stalagmites had been broken into pieces of almost the same length. Some were arranged to form rings with a diameter of 22 feet (6.7 meters), while the rest were elaborately piled up. Notable black and red discolorations suggest the stalagmites had been exposed to fire, and burnt bones of a large herbivore lay adjacent to the cave rings. Neanderthals existed for some 300,000 years, and for a time they even coexisted and interbred with modern humans. They had big brains and a knowledge of fire, but they were not believed to be capable of symbolism and ritualistic behavior; some have even contended that an inability to adapt to subterranean living contributed to their demise. The evidence found inside Bruniquel Cave, however, shows that these distant cousins were in fact skilled and able to conceive complex designs. "The find is solid, and it is an important documentation of the advanced behaviors of the Neanderthals," said Erik Trinkaus, a paleoanthropologist from Washington University. Study author Jacques Jaubert of the University of Bordeaux and his team said that to create such complex patterns, the Neanderthals must have had the planning and construction capabilities to build the structures and to venture deep into the cave, where artificial light would have been necessary. "This requires the mobilization of people who choose, who lead, who advise, manufacture - and with continuous light," said Jaubert. "All this indicates a structured society." 
Were the Neanderthals really capable of such complex behavior? Could the elaborate structures instead be the handiwork of the hibernating cave bears the scientists were initially looking for? John Shea, a paleoanthropologist from Stony Brook University, questioned why Neanderthals would build a structure deep inside a cave at all; early humans, he pointed out, retreated to caves precisely so they would not have to build artificial shelters. Shea claims the structures were more likely created by hibernating cave bears. "When bears settle in for the winter hibernation, they push all kinds of litter to the side. This looks like a place where cave bears settled in for a nice nap over and through time," Shea said. But the study's proponents counter that Shea's argument is weak. Leiden University anthropologist Marie Soressi said that bear dens are generally smaller than the rings found inside the cave, and that the animals could not have piled up heaps of stalagmites simply by pushing rocks to the side. Jaubert added that the evidence of fire ultimately rules out cave bears. The Purpose Of The Structures? Why did the Neanderthals go so deep into the cave? What were the structures for? Jaubert and his team will need to conduct further studies to answer these questions. For now, what they know is that the structures date to a glacial stage, during which the cave could have served as an intermediate temperate refuge.
News Article | April 27, 2016
When Andrew Ng joined Google from Stanford University in 2011, he was among a trickle of artificial-intelligence (AI) experts in academia taking up roles in industry. Five years later, demand for expertise in AI is booming — and a torrent of researchers is following Ng’s lead. The laboratories of tech titans Google, Microsoft, Facebook, IBM and Baidu (China’s web-services giant) are stuffed with ex-university scientists, drawn to private firms’ superior computing resources and salaries. “Some people in academia blame me for starting part of this,” says Ng, who in 2014 moved again to become chief scientist at Baidu, working at the company’s research lab in California’s Silicon Valley. Many scientists say that the intense corporate interest is a boon to AI — bringing vast engineering resources to the field, demonstrating its real-world relevance and attracting eager students. But some are concerned about the more subtle impacts of the industrial migration, which leaves universities temporarily devoid of top talent, and could ultimately sway the field towards commercial endeavours at the expense of fundamental research. Private firms are investing heavily in AI — and in particular in an AI technique called deep learning — because of its promise to glean understanding from huge amounts of data. Sophisticated AI systems might be able to create effective digital personal assistants, control self-driving cars, or take on a host of other tasks that are too complex for conventional programming. And corporate labs’ resources allow progress that might not be possible in academic departments, says Geoffrey Hinton, a deep-learning pioneer at the University of Toronto in Canada who took up a post at Google in 2013. The fields of speech and image recognition, for instance, had been held up for years by a lack of data to use in training algorithms and a shortage of hardware, he says — bottlenecks that he was able to get past at Google. “AI is so hot right now. 
There are so many opportunities and so few people to work on them,” says Ng, who says he was attracted by Google’s reams of data and computing power, and its ability to tackle real-world problems. Another temptation is that private firms can offer “astronomical” wages, says Tara Sinclair, chief economist at Indeed, a firm headquartered in Austin, Texas, that aggregates online job listings and has chronicled a rising demand for jobs in AI in Britain and the United States. The excitement shows that AI is at a point at which it can achieve real-world impact — and companies are the natural way to make this happen, says Pieter Abbeel, a specialist in AI and deep learning at the University of California, Berkeley. In the 1950s, a similar job migration occurred in semiconductor research, when many of the field’s leading figures were poached to become heads of industrial research-and-development labs, says Robert Tijssen, a social scientist at Leiden University in the Netherlands. Moving academics bring expertise while extending their new-found corporate networks back to their former colleagues and students — making the scenario a “classic win–win situation”, Tijssen says. Herman Herman, director of the US National Robotics Engineering Center based at Carnegie Mellon University in Pittsburgh, Pennsylvania, subscribes to that view. In 2015, car-hailing app Uber, which was collaborating with the centre, hired almost 40 of his 150 researchers, mainly those working on self-driving cars. Reports at the time suggested that the centre was left in crisis, but Herman says this was overblown; the project was one of dozens across Carnegie Mellon’s Robotics Institute, which has about 500 faculty members. The move was a chance to bring in new blood, and shortly afterwards, Uber donated US$5.5 million to support student and faculty fellowships at the institute. Meanwhile, the publicity raised the profile of the centre’s work, says Herman — and student applications are up. 
The loss of expertise in academia concerns Yoshua Bengio, a computer scientist at the University of Montreal in Canada, which has also seen a surge in graduate-student applications. If industry-hired faculty members do retain university roles — as Hinton has at the University of Toronto and Ng has at Stanford University in California — they are usually only minor, says Bengio. Losing faculty members reduces the number of students that can be trained, especially at PhD level, adds Abbeel. Hinton predicts that the shortage in deep-learning experts will be temporary. “The magic of graduate research in universities is something to be protected, and Google recognizes that,” he says. Google is currently funding more than 250 academic research projects and dozens of PhD fellowships. In supplying industry with talent, universities are fulfilling their natural role, says Michael Wooldridge, a computer scientist at the University of Oxford, UK. And with interest in AI generally booming, he struggles to see a situation in which academia is left deserted. The London-based firm Google DeepMind hired ten researchers from Oxford in 2014 — but Google gave the university a seven-figure financial contribution, and formed a research collaboration (see ‘Google DeepMind’s talent grab’). Many of the poached staff still hold active teaching positions at the university — giving students opportunities they might otherwise never have had. Bengio is also concerned about the long-term impacts of corporate domination. Industry researchers are more secretive, he says. Although scientists in some corporate labs (such as those at Google and Baidu) are still publishing papers and code openly — allowing others to build on their work — Bengio argues that corporate researchers still often avoid discussing their work ahead of publication because they are more likely than academics to have filed for patents. “That makes it more difficult to collaborate,” he says. 
Some industry insiders are concerned about transparency, too. In December 2015, SpaceX founder Elon Musk was among a group of Silicon Valley investors who launched a non-profit firm called OpenAI in San Francisco, California. With $1 billion promised by its backers, it aims to develop AI for the public good, sharing its patents and collaborating freely with other institutions. Although Google, Facebook and the like seem committed for the moment to tackling fundamental questions in AI, Bengio fears that this might not last. “Business tends to be pulled to short-term concerns. It’s in the nature of the beast,” he says. He cites telecommunications firms Bell Labs and AT&T as examples of companies that had strong research labs, but eventually lost their talent by putting too much emphasis on the short-term objective of making money for the company. Hinton insists that basic research can thrive in industry. And because of the urgent need for AI research, some of today’s expansion in basic research is inevitably taking place at corporate firms, he adds. But academia will still play a crucial part in AI research, he says. “It’s the most likely place to get radical new ideas.”
News Article | December 11, 2015
The search for a suspected calling card of the universe’s most elusive matter has come up empty. Multiple days of telescope time spent looking for a specific X-ray glow coming out of the nearby dwarf galaxy Draco failed to turn up any signal, two University of California, Santa Cruz astrophysicists report online December 7 at arXiv.org. Finding such a glow would have offered a compelling clue for the identity of dark matter, the invisible, inert stuff that makes up more than 80 percent of the universe’s matter. The study’s authors say that the absence of the X-rays in Draco, one of the most dark matter–dominated objects known, means that scientists had previously detected the X-ray emissions of interstellar atoms rather than dark matter. Not everyone agrees with the study’s conclusion, including a different team of scientists who commissioned the lengthy Draco observations and are reviewing the same data. Those scientists, who haven’t yet published their analysis, say they can’t rule out the possibility that dark matter produces the X-rays that have been spotted emanating from other cosmic objects. Scientists know dark matter permeates the cosmos because, among other evidence, the outer regions of galaxies spin faster than they should based on the distribution of the galaxies’ stars and gas. In an attempt to identify the particles that make up dark matter, some scientists analyze images of dark matter–rich regions like galaxy clusters and dwarf galaxies in search of gamma rays, X-rays or other unexpected signals. Their hope is that dark matter particles emit observable radiation when they decay or collide with each other (SN Online: 11/4/14). Scientists flagged one promising signal in February 2014: bursts of X-rays with an energy of about 3,500 electron volts that consistently appeared in a set of 73 galaxy clusters. Other groups soon found X-rays streaming from the Perseus galaxy cluster, Andromeda and the center of the Milky Way, too. 
Theorists quickly pointed out that dark matter in the form of a proposed particle called a sterile neutrino could decay and emit radiation at that energy. “It was very exciting,” says Stefano Profumo, an author of the new paper. “We had a signal that matched with a predicted dark matter candidate.” But dark matter isn’t the only way to explain the X-rays. Profumo and others argued that initial studies underestimated the role of a kind of decidedly undark matter — potassium atoms — that can also emit 3,500-eV X-rays in galactic gas clouds. To settle the issue, a team led by Alexey Boyarsky, a particle physicist at Leiden University in the Netherlands, pointed the XMM-Newton space telescope at Draco. The dwarf galaxy, located about 270,000 light-years away, contains lots of dark matter but barely any potassium-carrying gas. For the new study, Profumo and colleague Tesla Jeltema, who are not part of Boyarsky’s team, analyzed the publicly available XMM-Newton data along with a previous Draco observation. They found no evidence that the galaxy radiates 3,500-eV X-rays. Profumo says their results prove that the X-rays in the other galaxy clusters could not have come from the decay of dark matter. Boyarsky agrees that there is no strong X-ray signal coming from Draco. But he says he’s not convinced that the data rule out that dark matter decays into X-rays. He expects to share a more careful analysis encompassing more telescope data within the next few weeks. Esra Bulbul, an astrophysicist at the Harvard-Smithsonian Center for Astrophysics who is working with Boyarsky, says the new data hurt the case for sterile neutrinos composing dark matter. But she says that other kinds of dark matter particles could produce a feebler emission of X-rays that might explain the Draco observations. “Draco is a good clue, but I’m afraid it’s not going to be conclusive enough to evaluate the dark matter origin,” she says. 
“We have seen the signal in so many clusters.” Bulbul expects to narrow down the list of potential X-ray emitters next year after the launch of ASTRO-H. That space telescope will be able to separate the contribution of potassium atoms from the rest of the X-ray signal, she says.
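The connection between the 3,500-eV X-ray line and the sterile-neutrino hypothesis is simple two-body kinematics (a standard textbook relation, not stated in the article): a sterile neutrino at rest decaying into a photon and a nearly massless active neutrino gives each daughter half the rest energy, so the observed line energy points to a particular particle mass.

```latex
% Two-body decay of a sterile neutrino at rest into a photon and a
% nearly massless active neutrino: the photon carries half the rest energy.
E_\gamma = \tfrac{1}{2}\, m_s c^2
\quad\Longrightarrow\quad
m_s c^2 \approx 2 \times 3.5\ \mathrm{keV} = 7\ \mathrm{keV}.
```

This is why the 3,500-eV signal was read as a candidate for a roughly 7-keV sterile neutrino.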
News Article | December 16, 2016
Humans are really good at picking out faces. Our brains are so good at this that we even see faces in places they don’t exist — like Jesus on toast. Flip a face upside down, though, and the brain needs an extra moment to determine that, yes, that’s a face. This is known as the inversion effect. And a new study finds that we’re not the only species to demonstrate it: Chimps do, too. Only they do it with butts. And this says something about human evolution — but we’ll get to that in a bit. In 2008, Frans de Waal and Jennifer Pokorny of Emory University in Atlanta showed in an Ig Nobel–winning experiment that chimpanzees could recognize each other from their behinds — or at least photographs of them. The chimp rear, it turns out, conveys important information about sex and, in females, ovulation status. Both males and females pay attention to those signals, which are important in mating and competition. Given the evidence that chimps could use both face and behind for recognition, Mariska Kret of Leiden University in the Netherlands and Masaki Tomonaga of the Primate Research Institute at Kyoto University in Japan wanted to know if, like us, chimpanzees display the inversion effect when it comes to these body parts — and how humans might compare. They started by gathering a group of more than 100 university students and giving them a matching game. The student would be shown a picture of a human or chimp face, butt or foot. Then they would be shown two pictures of the same body part (from the same species) and have to choose the one that matched the first image. Some of those images were right side up and others were turned upside down. The researchers then repeated the experiment with images in black and white and then repeated both experiments again with a group of four female chimps and one male chimp living at the Primate Research Institute. 
Humans displayed the inversion effect for human faces only, while chimps did so for chimp behinds and, perhaps, faces (though the evidence isn’t as strong for that), Kret and Tomonaga report November 30 in PLOS ONE. Perhaps this shouldn’t be all that surprising, though. The chimpanzee rear conveys a lot of information to other chimps. Human rears, by contrast, not only don’t show the same changes during the menstrual cycle, but we also now cover them up nearly all of the time. “Over human evolution, the face became more and more important for communication,” notes Kret. And over that same time, our faces have become much more like, well, butts: Kret notes that, like the chimp behind, the human face is symmetrical, stands out from the rest of the body, is highlighted with red color, can be attractive, conveys information about fitness and is used for recognizing identity. In other words, “buttface” may not be an insult but the truth. Why did humans’ focus evolve from rears to faces? It’s not that humans started wearing clothes. It’s probably that we started walking upright and looking up instead. Most of the time, anyway.
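In matching tasks like the one described above, the inversion effect is typically quantified as the drop in matching accuracy from upright to inverted trials. A minimal sketch of that scoring, with invented trial records (the category names and counts below are illustrative, not the study's data):

```python
# Quantify the inversion effect per stimulus category: accuracy on
# upright trials minus accuracy on inverted trials.
# All trial records below are invented for illustration.
from collections import defaultdict

trials = [
    # (category, orientation, answered correctly?)
    ("human_face", "upright", True), ("human_face", "upright", True),
    ("human_face", "upright", True), ("human_face", "upright", False),
    ("human_face", "inverted", True), ("human_face", "inverted", False),
    ("human_face", "inverted", False), ("human_face", "inverted", False),
    ("chimp_butt", "upright", True), ("chimp_butt", "upright", True),
    ("chimp_butt", "inverted", True), ("chimp_butt", "inverted", False),
]

def inversion_effect(trials):
    """Return {category: upright accuracy - inverted accuracy}."""
    acc = defaultdict(lambda: [0, 0])  # (category, orientation) -> [correct, total]
    for cat, ori, ok in trials:
        acc[(cat, ori)][0] += ok
        acc[(cat, ori)][1] += 1
    effects = {}
    for cat in {c for c, _, _ in trials}:
        up_correct, up_total = acc[(cat, "upright")]
        inv_correct, inv_total = acc[(cat, "inverted")]
        effects[cat] = up_correct / up_total - inv_correct / inv_total
    return effects

# With these made-up trials, each category shows an effect of 0.5.
print(inversion_effect(trials))
```

A large positive value means the species is much better at matching that body part right side up, which is the signature the study looked for.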
News Article | March 4, 2016
Superconductivity—a quantum phenomenon in which metals below a certain temperature develop flow of current with no loss or resistance—is one of the most exciting problems in physics, which has resulted in investments worldwide of enormous brain power and resources since its discovery a little over a century ago. Many prominent theorists, Nobel laureates among them, have proposed theories for new classes of superconducting materials discovered several decades later, followed by teams of experimentalists working furiously to provide solid evidence for these theories. More than 100,000 research papers have been published on the new materials. One such theory began with a proposal in 1989 by Chandra Varma while at Bell Laboratories, NJ, and now a distinguished professor of physics and astronomy at the University of California, Riverside. At UC Riverside, he further developed the theory and proposed experiments to confirm or refute it. That theory has now been experimentally proven to be a consistent theory by physicists in China and Korea. The experimental results, published in Science Advances on March 4, 2016, now allow for a clear discrimination of theories of high-temperature superconductivity, favoring one and ruling others out. The research paper is titled “Quantitative determination of pairing interactions for high-temperature superconductivity in cuprates.” “At the core of most models for the high-temperature superconductivity in cuprates lies the idea of the electron-electron pairing,” said Lev P. Gor’kov, a theoretical physicist at Florida State University who is renowned for making the most important formal advance in the superconductivity field in 1958, while at the Soviet Academy of Sciences. “The paper by Prof. Chandra Varma and his colleagues from China and Korea is the daring and successful attempt to extract the relevant electron-electron interactions directly from experiment. 
Their elegant approach also opens new prospects for studying the superconductivity mechanisms in other systems with strongly correlated electrons.” Superconductors are used in magnetic-imaging devices in hospitals. They are also used for special electrical switches. The electromagnets in the Large Hadron Collider at CERN use superconducting wire. Large-scale use of superconductivity, however, is not yet feasible because of cost. If superconductors could be made cheaply and at ordinary temperatures, they would find wide use in power transmission, energy storage and magnetic levitation. First discovered in the element mercury in 1911, superconductivity occurs when electrical resistance in a solid vanishes as the solid is cooled below a characteristic temperature, called the transition temperature, which varies from material to material. Transition temperatures tend to be close to absolute zero (0 K, or −273 °C). At even slightly higher temperatures, the materials lose their superconducting properties; indeed, at room temperature most superconductors are very poor conductors. In 1986, physicists Georg Bednorz and Alexander Müller discovered a class of high-temperature superconductors called cuprates, so named because they all contain copper and oxygen. These materials have properties that raise profound new questions, and why these high-temperature superconductors perform as they do has remained unknown. The superconductivity problem was considered solved by a theory proposed in 1957: the BCS theory of superconductivity. This comprehensive theory, developed by physicists John Bardeen, Leon Cooper and John Schrieffer (the first letters of their last names gave the theory its name), explained the behavior of superconducting materials as resulting from electrons forming pairs, with each pair strongly correlated with the other pairs, allowing them all to function coherently as a single entity.
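The operational definition of the transition temperature given above (resistance vanishing below a characteristic temperature) can be sketched as a simple estimator on resistance-versus-temperature data. The data points below are an invented, mercury-like toy curve, not measurements from any paper:

```python
# Estimate a superconducting transition temperature as the highest
# temperature at which resistance has effectively vanished.
# The (temperature K, resistance ohm) pairs are invented for illustration.
def transition_temperature(data, threshold=1e-6):
    """data: iterable of (T, R) pairs; returns the highest T with R
    below `threshold`, or None if the sample never superconducts."""
    superconducting = [t for t, r in sorted(data) if r < threshold]
    return max(superconducting) if superconducting else None

# Toy curve resembling mercury, whose resistance vanishes near 4.2 K.
curve = [(2.0, 0.0), (3.0, 0.0), (4.0, 0.0), (4.2, 0.0),
         (4.4, 0.08), (5.0, 0.10), (10.0, 0.12)]
print(transition_temperature(curve))  # 4.2
```

Real determinations use criteria such as the midpoint of the resistive drop, but the threshold version captures the definition in the text.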
Concepts in the BCS theory and its elaborations have influenced all branches of physics, ranging from elementary particle physics to cosmology. “But in the cuprates, some of the founding concepts of the physics of interacting particles, such as the quasi-particle concept, were found to be invalid,” Varma said. “The physical properties of superconductors above the superconducting transition temperature were more remarkable than the superconductivity itself. Subsequently, almost all the leading theoretical physicists in the world proposed different directions of ideas and calculations to explain these properties as well as superconductivity. But very few predictions stemming from these ideas were verified, and specific experiments were not in accord with them.” A quasi-particle is a packet of energy and momentum that can, in some respects, be regarded as a particle. It is a physical concept, which allows detailed calculation of properties of matter. In 1989, while at Bell Laboratories, Varma and some collaborators proposed that the breakdown of the quasi-particle concept occurs due to a simple form of quantum-critical fluctuations—fluctuations which are quantum in nature and occur when symmetry of matter breaks down, such as at the phase transition critical point near absolute zero of temperature. In physics, symmetry is said to occur when some change in orientation or movement by any amount leaves the physical situation unchanged (empty space, for example, has symmetry because it is everywhere the same). Relativity, quantum theory, crystallography and spectroscopy involve notions of symmetry. “It was at this time that we introduced the concept of marginal Fermi-liquids or marginal quasi-particles through which various properties of superconductivity were explained,” Varma said. 
“We also provided some definitive predictions, which could only be tested in 2000 by a technique called angle-resolved photoemission spectroscopy, or ARPES.” Varma explained that in 1989 there was also no evidence that the same quantum-critical fluctuations promoted the superconductivity transition. “There was no theory for the cause of such quantum-critical fluctuations or for the symmetry which must change near absolute zero to realize them,” he said. In 1997, Varma proposed transitions to a new class of symmetries, in which the direction of time is picked out by the direction of currents. These currents, he suggested, begin to flow spontaneously in each microscopic cell of the cuprates. Since 2004, a group of French scientists at Saclay has been reporting evidence of such symmetries in every high-temperature superconducting compound it could investigate with neutron scattering. Several other kinds of experiments by other research groups are also in accord. Varma cautioned that some unresolved issues persist; his group is proposing experiments to address them. In 2003, the year Varma moved to UC Riverside, he formulated a theory for how quantum fluctuations coupled to electrons give rise to the observed symmetry in superconductivity. “This was a completely new kind of coupling,” he said. “It had very remarkable and unusual predictions for experiments designed to decipher such a coupling.” In 2010, Varma became aware of high-quality laser-based ARPES in a laboratory at the Institute of Physics of the Chinese Academy of Sciences, Beijing, China. A collaboration with physicist Xingjiang Zhou at the institute ensued, with numerical analysis of the data done by Han-Yong Choi, a physicist at SungKyunKwan University, Korea, who had previously worked with Varma at UCR. Zhou’s team made several improvements to the ARPES technique, which ensured that the quality of the data was high enough, and reproducible enough, to inspire full confidence.
“The data obtained and the analysis we describe in our paper are conclusive on the most important issues relevant to superconductivity,” Varma said. “Our conclusions—namely, that the quantum fluctuations promoting superconductivity are the same as those that lead to the marginal Fermi-liquid and they are consistently of the form predicted, being stretched exponentially in time in a scale-invariant way relative to stretching in space—also have no theoretical approximations. They are as precise as the quality of the data allows. They also unambiguously address the question of symmetry of superconductivity. Further, they rule out many of the alternative ideas that have been proposed on this problem in the last thirty years since the original discovery. Our observations of the breakdown of time-reversal symmetry and of the fluctuations that follow complete major aspects of our understanding of these problems.” Varma, Zhou and Choi were joined in the research by Jin Mo Bok (first author of the paper) and Jong Ju Bae at SungKyunKwan University, Korea; and Wentai Zhang, Junfeng He, Yuxiao Zhang and Li Yu at the Institute of Physics, Chinese Academy of Sciences, Beijing, China. Varma was partially supported by a grant from the National Science Foundation. After he received his doctoral degree in physics from the University of Minnesota, Varma joined Bell Labs in 1969, one of the most coveted positions at the time for young physicists anywhere in the world. The following year, he became a permanent member of the laboratory. He was the head of the theoretical physics department at Bell Labs from 1983 to 1987, and was awarded the Distinguished Member of Research in 1988. He has served as a visiting professor at the University of Chicago, Stanford University, MIT, the College de France in Paris, France, and at CNRS, France; and a senior visiting fellow at Cavendish Lab at Cambridge University. 
In 2000, he was appointed to the Lorentz Visiting Chair at Leiden University, the Netherlands. In 2009, he held a Miller Professorship at UC Berkeley. He is a fellow of the American Physical Society and of the American Association for the Advancement of Science. A member of the World Academy of Sciences, he is the recipient of the Alexander von Humboldt Prize and the Bardeen Prize for theoretical advances in superconductivity. Varma has published nearly 200 scientific papers, which together have about 18,000 citations. He has made seminal contributions to the theory of glasses; to Kondo, mixed-valence and heavy-fermion phenomena; and to novel forms of superconductivity, charge density waves, co-existing magnetic and superconducting states, the Higgs boson in superconductors, quantum criticality, and singular Fermi-liquids and their associated superconductivity.
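The marginal Fermi-liquid phenomenology the article refers to is commonly written as a single-particle self-energy of the schematic form below. The notation is the standard one from the 1989 proposal, not taken from this article, and details (cutoffs, momentum dependence) are suppressed:

```latex
% Schematic marginal-Fermi-liquid self-energy ansatz:
\Sigma(\omega, T) \;\approx\;
\lambda \left( \omega \,\ln\!\frac{x}{\omega_c} \;-\; i\,\frac{\pi}{2}\,x \right),
\qquad x \equiv \max\!\left(|\omega|,\, k_B T\right),
```

so the scattering rate grows linearly in max(|ω|, T), rather than quadratically as in a conventional Fermi liquid, and the quasi-particle weight vanishes logarithmically. This is the sense in which "marginal quasi-particles" break down as well-defined excitations.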
News Article | December 19, 2016
Growing a human being is no small feat—just ask any newly pregnant woman. Her hormones surge as her body undergoes a massive physical transformation, and the changes don’t end there. A study published Monday in Nature Neuroscience reveals that during pregnancy women undergo significant brain remodeling that persists for at least two years after birth. The study also offers preliminary evidence that this remodeling may play a role in helping women transition into motherhood. A research team at Autonomous University of Barcelona, led by neuroscientist Elseline Hoekzema of Leiden University, performed brain scans on first-time mothers before and after pregnancy and found significant gray matter changes in brain regions associated with social cognition and theory of mind—the same regions that were activated when women looked at photos of their infants. These changes, which were still present two years after birth, predicted women’s scores on a test of maternal attachment, and were so clear that a computer algorithm could use them to identify which women had been pregnant. One of the hallmarks of pregnancy is an enormous increase in sex steroid hormones such as progesterone and estrogen, which help a woman’s body prepare for carrying a child. There is only one other time when our bodies produce similarly large quantities of these hormones: puberty. Previous research has shown that during puberty these hormones cause dramatic structural and organizational changes in the brain. Throughout adolescence both boys and girls lose gray matter as the brain connections they don’t need are pruned, and their brains are sculpted into their adult form. Very little research has focused on anatomical brain changes during pregnancy, however. 
“Most women undergo pregnancy at some point in their lives,” Hoekzema says, “But we have no idea what happens in the brain.” Hoekzema and her colleagues performed detailed anatomical brain scans on a group of women who were trying to get pregnant for the first time. The 25 women who got pregnant were rescanned soon after they gave birth; 11 of them were scanned two years after that. (For comparison, the researchers also scanned men and women who were not trying to have a child as well as first-time fathers). During the postpartum period, the researchers also performed brain scans on the new mothers while they looked at photos of their infants. The scientists used a standard scale to rate the attachment between mother and infant. The researchers found that the new mothers experienced gray matter reductions that lasted for at least two years after birth. This loss, however, is not necessarily a bad thing (according to Hoekzema, “the localization was quite remarkable”); it occurred in brain regions involved in social cognition, particularly in the network dedicated to theory of mind, which helps us think about what is going on in someone else’s mind—regions that had the strongest response when mothers looked at photos of their infants. These brain changes could also be used to predict how mothers scored on the attachment scale. In fact, researchers were able to use a computer algorithm to identify which women were new mothers based solely on their patterns of gray matter loss. Gray matter loss was not seen in new fathers or nonparents. It is not entirely clear why women lose gray matter during pregnancy, but Hoekzema thinks it may be because their brains are becoming more specialized in ways that will help them adapt to motherhood and respond to the needs of their babies. The study offers some preliminary evidence to support this idea. 
Whereas the present study focuses primarily on documenting brain changes during pregnancy, she expects follow-up work to tackle more applied questions such as how brain changes relate to postpartum depression or attachment difficulties between mother and child. Ronald Dahl, a neuroscientist at the University of California, Berkeley, who was not involved in the work, says he had “a delightful ‘wow’ moment” on seeing the study. “This is a pioneering contribution that not only documents structural brain changes linked to pregnancy but also compellingly offers evidence that suggests these represent adaptive changes,” he wrote in an e-mail. Mel Rutherford, an evolutionary psychologist at McMaster University in Ontario, is also enthusiastic about the study—which, to his knowledge, is the first that uses neuroimaging to track brain changes during pregnancy. “Probably the most exciting thing is that they were able to follow up two years after the birth of the baby,” he says, “So they have the longest-term evidence that we've seen of changes in the brain after pregnancy.” The results mesh with Rutherford’s own research on cognitive changes during pregnancy, which he approaches from an evolutionary perspective. “As a parent, you're now going to be solving slightly different adaptive problems, slightly different cognitive problems than you did before you had children,” he explains. “You have different priorities, you have different tasks you're going to be doing, and so your brain changes.”
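The classification result in the pregnancy study (an algorithm identifying new mothers from their gray-matter changes) can be illustrated with a minimal nearest-centroid classifier. Everything below is synthetic: the three "regions," the change values, and the clean separation are invented for illustration, not the study's data or method.

```python
# Nearest-centroid classification of "new mother" vs "control" from
# per-region gray-matter change vectors. All numbers are synthetic.
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(v, centroids):
    """Return the label whose centroid is nearest to v (Euclidean)."""
    return min(centroids, key=lambda label: math.dist(v, centroids[label]))

# Each vector: gray-matter change (%) in three theory-of-mind regions.
mothers  = [[-3.1, -2.8, -2.5], [-2.9, -3.2, -2.7], [-3.4, -2.6, -3.0]]
controls = [[0.2, -0.1, 0.1], [-0.3, 0.2, 0.0], [0.1, 0.0, -0.2]]

centroids = {"mother": centroid(mothers), "control": centroid(controls)}
print(classify([-3.0, -2.9, -2.6], centroids))  # mother
print(classify([0.0, 0.1, -0.1], centroids))    # control
```

The real analysis works on whole-brain voxel maps with proper cross-validation, but the idea is the same: the pattern of gray-matter loss is distinctive enough to separate the groups.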
News Article | October 27, 2016
A rare triple-star system surrounded by a disk with a spiral structure has been discovered by a University of Oklahoma-led research team. Recent observations from the Atacama Large Millimeter/submillimeter Array -- a revolutionary observatory in northern Chile, commonly known as ALMA -- resulted in the discovery, lending support to disk fragmentation -- a process that leads to the formation of young binary and multiple star systems. Until ALMA, no one had observed a tri-star system forming in a disk like the one discovered by the OU team. John J. Tobin, professor of astrophysics in the Homer L. Dodge Department of Physics and Astronomy, OU College of Arts and Sciences, led a global team of researchers who demonstrated that the disk surrounding the tri-star system appeared susceptible to disk fragmentation. Team members represented Leiden University, The Netherlands; University of Arizona; Chalmers University of Technology, Onsala, Sweden; University of Illinois; SUNY Fredonia; University of Virginia; National Radio Astronomy Observatory, New Mexico; Max-Planck, Germany; and University of California, San Diego. "What is important is that we discovered that companion stars can form in disk material surrounding a dominant star," said Tobin. "We had observed this system in the past with ALMA's predecessors, but this is the first time we have been able to clearly analyze the disk and the newborn stars within it. ALMA revealed the spiral arms and disk that led to the formation of the tri-star system. Triple systems like this one are rare, and this is the only one with a configuration like this, but we are actively searching for more." How binary stars form has long been a mystery, and there are competing theories -- one is fragmentation of the disk around a forming star.
Tobin explains that the disk in which the tri-star system is forming behaves like a figure skater who pulls his or her arms in during a spin to gather speed. A star initially forms from a cloud of interstellar gas that is collapsing under its own gravity. The spin of the cloud causes a disk to form as the material spins faster and falls toward the star. If the disk contains enough material, spiral arms form and the disk can fragment to form another star. "A Triple Protostar System Formed Via Fragmentation of a Gravitationally Unstable Disk" will be published in Nature on October 27, 2016. Support for this research was provided by the Homer L. Dodge Department of Physics and Astronomy Endowed Chair; the Netherlands Organization for Scientific Research, Grant No. 639.041.439; and the National Science Foundation, Grant No. AST-1410174. For more information about this research, contact OU Professor Tobin at email@example.com.
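The fragmentation scenario described above is commonly quantified with the Toomre stability parameter, Q = c_s·κ/(πGΣ): a disk region with Q below roughly 1 is prone to gravitational fragmentation. A minimal sketch follows; the numerical values are illustrative assumptions for a cold, massive protostellar disk, not figures from the paper.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def toomre_q(sound_speed, epicyclic_freq, surface_density):
    """Toomre stability parameter Q = c_s * kappa / (pi * G * Sigma).

    Q below roughly 1 marks a disk region prone to gravitational
    fragmentation; Q well above 1 marks a stable region.
    """
    return sound_speed * epicyclic_freq / (math.pi * G * surface_density)

# Illustrative values (assumptions for this sketch, not from the paper):
c_s = 200.0    # sound speed, m/s
kappa = 2e-10  # epicyclic frequency, 1/s
sigma = 500.0  # gas surface density, kg/m^2

q = toomre_q(c_s, kappa, sigma)
print(f"Toomre Q = {q:.2f} -> {'fragmentation-prone' if q < 1 else 'stable'}")
```

With these assumed values Q comes out below 1, i.e. in the fragmentation-prone regime; a hotter (higher c_s) or lighter (lower Σ) disk would be stable.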
News Article | December 5, 2016
Further clinical data for lirilumab and IPH4102 presented at ASH annual meeting, reinforcing confidence in our programs. Innate Pharma SA (the "Company" - Euronext Paris: FR0010331421 - IPH) today announces that clinical data for lirilumab and IPH4102 were presented in two posters at the American Society of Hematology's (ASH) 2016 Annual Meeting (December 3-6, 2016), San Diego, CA, U.S.: This poster presented preliminary results from the first seven dose levels of the dose-escalation part of the Phase I trial of IPH4102. They showed that the drug candidate is well tolerated in patients with relapsed/refractory CTCL, with a preliminary global objective response rate (ORR) of 38% in the evaluable population across all dose levels. Exploratory assessments show that clinical improvement in the skin is accompanied by decreases in malignant cells and normalization of immune parameters in the tumor microenvironment. All responses were ongoing at the time of poster presentation. Dose level 8 (3 mg/kg) has been completed without dose-limiting toxicity. Two further dose levels (6 and 10 mg/kg) remain to be explored in the dose-escalation part of this study. Lirilumab is a first-in-class fully human monoclonal antibody that blocks inhibitory killer-cell immunoglobulin-like receptors (KIRs) expressed predominantly on natural killer (NK) cells, to potentiate an NK-cell-mediated anti-tumor immune response. Lirilumab is licensed to Bristol-Myers Squibb Company by Innate Pharma. In this Phase Ib/II study testing lirilumab in combination with azacytidine in a heavily pretreated patient population with relapsed AML, full doses of azacytidine and lirilumab were well tolerated. No dose-limiting toxicities were observed. Preliminary efficacy data for 25 evaluable patients showed a response rate of 20%, including two patients who achieved a CR or a CR with incomplete count recovery and three who achieved hematologic improvement. 
"Preliminary results presented at ASH 2016 are encouraging, as they further reinforce the favorable safety profile of both lirilumab and IPH4102. The study of IPH4102 conducted in patients with CTCL is progressing very well and we look forward to the completion of the dose-escalation part of the trial to confirm the encouraging efficacy signal seen across dose levels to date," said Pierre Dodion, Chief Medical Officer of Innate Pharma. "The good safety profile of lirilumab in combination with azacytidine in patients with relapsed AML further supports the view that lirilumab is well tolerated in numerous combinations." The poster #1826 is available on Innate Pharma's website. The poster #1641 will soon be available on Innate Pharma's website. About trial NCT02399917 (lirilumab and azacytidine in patients with relapsed AML): This is an open-label Phase II study to assess the combination of lirilumab and azacytidine and to find the maximum tolerated dose of the combination that may be given to patients with refractory/relapsed acute myeloid leukemia (AML) or high-risk myelodysplastic syndromes (MDS). Primary endpoints of this study include the evaluation of the safety and efficacy of the combination in this patient population. Patients are eligible if they have AML and failed prior therapy (including prior therapy with a hypomethylating agent) and have adequate performance status (ECOG ≤ 2) and organ function. Azacytidine was given at a dosage of 75 mg/m² on days 1-7; lirilumab was given on Day 8 at dosages of 1 and 3 mg/kg in 2 consecutive cohorts of 6 patients each. Courses were repeated approximately every 4-5 weeks. No dose-limiting toxicities were observed and lirilumab 3 mg/kg was established as the recommended Phase 2 dose (RP2D) in combination with azacytidine. Nine additional patients have been treated at the RP2D. Responses were evaluated at the end of 3 courses of therapy. 
Conducted by the MD Anderson Cancer Center at the University of Texas in the United States, the trial began in April 2015 and is expected to enroll 64 patients. Lirilumab is a fully human monoclonal antibody that is designed to act as a checkpoint inhibitor by blocking the interaction between KIR2DL-1,-2,-3 inhibitory receptors and their ligands. Blocking these receptors facilitates activation of NK cells and potentially some subsets of T cells, ultimately leading to destruction of tumor cells. Lirilumab is licensed to Bristol-Myers Squibb Company. As part of the agreement with Innate Pharma, Bristol-Myers Squibb holds exclusive worldwide rights to develop, manufacture and commercialize lirilumab and related compounds blocking KIR receptors for all indications. Under the agreement, Innate Pharma conducts the development of lirilumab through Phase II in AML. Innate Pharma is currently testing lirilumab in a randomized, double-blind, placebo-controlled Phase II trial as a maintenance treatment in elderly patients with AML in first complete remission (the "EffiKIR" trial). In addition, lirilumab is also being evaluated by Bristol-Myers Squibb in clinical trials in combination with other agents in a variety of tumor types. The Phase I trial is an open label, multicenter study of IPH4102 in patients with relapsed/refractory CTCL which is performed in Europe (France, Netherlands, United Kingdom) and in the US (NCT02593045). Participating institutions include several hospitals with internationally recognized expertise: the Saint-Louis Hospital (Paris, France), the Stanford University Medical Center (Stanford, CA), the Ohio State University (Columbus, OH), the MD Anderson Cancer Center (Houston, Texas), the Leiden University Medical Center (Netherlands), and the Guy's and St Thomas' Hospital (United Kingdom). 
45 to 60 patients with KIR3DL2-positive CTCL having received at least two prior lines of systemic therapy are expected to be enrolled in two sequential study parts:
- A dose-escalation part including 25 to 40 CTCL patients in 10 dose levels. The objective is to identify the Maximum Tolerated Dose and/or the RP2D; the dose-escalation follows an accelerated 3+3 design.
- A cohort expansion part with 2 cohorts of 10 patients each in 2 CTCL subtypes (transformed mycosis fungoides and Sézary syndrome) receiving IPH4102 at the RP2D until progression. The cohort design (CTCL subtype, number of patients) could be revisited based on the findings in the dose-escalation part of the study.
The primary objective of this trial is to evaluate the safety and tolerability of repeated administrations of single-agent IPH4102 in this patient population. The secondary objectives include assessment of the drug's antitumor activity. A large set of exploratory analyses is aimed at identifying biomarkers of clinical activity. Clinical endpoints include overall objective response rate, response duration and progression-free survival. IPH4102 is a first-in-class anti-KIR3DL2 humanized cytotoxicity-inducing antibody designed for the treatment of CTCL, an orphan disease. This group of rare cutaneous lymphomas of T lymphocytes has a poor prognosis with few therapeutic options at advanced stages. KIR3DL2 is an inhibitory receptor of the KIR family, expressed by approximately 65% of patients across all CTCL subtypes and by up to 95% of patients with certain aggressive CTCL subtypes, in particular Sézary Syndrome and transformed mycosis fungoides. It has a restricted expression on normal tissues. Potent antitumor properties of IPH4102 were shown against human CTCL cells in vitro and in vivo in a mouse model of KIR3DL2+ tumors, in which IPH4102 reduced tumor growth and improved survival. 
The efficacy of IPH4102 was further evaluated in laboratory assays using the patients' own natural killer (NK) cells against their primary tumor samples in the presence of IPH4102. These studies were performed in patients with Sézary Syndrome, the leukemic form of CTCL, which is known to have a very poor prognosis. In these experiments, IPH4102 selectively and efficiently induced killing of the patients' CTCL cells. These results were published in Cancer Research in 2014 (http://www.ncbi.nlm.nih.gov/pubmed/25361998). IPH4102 was granted orphan drug status in the European Union for the treatment of CTCL. CTCL is a heterogeneous group of non-Hodgkin's lymphomas which arise primarily in the skin and are characterized by the presence of malignant clonal mature T-cells. CTCL accounts for approximately 4% of all non-Hodgkin's lymphoma cases and has a median age at diagnosis of 55-65 years. Mycosis fungoides and its leukemic variant, Sézary Syndrome, are the most common CTCL subtypes. The overall 5-year survival rate, which depends in part on disease subtype, is approximately 10% for Sézary Syndrome and less than 15% for transformed mycosis fungoides. CTCL is an orphan disease and patients with advanced CTCL have a poor prognosis with few therapeutic options and no standard of care. There are approximately 6,000 CTCL patients in Europe and the United States. Innate Pharma S.A. is a clinical-stage biotechnology company with a focus on discovering and developing first-in-class therapeutic antibodies that harness the innate immune system to improve cancer treatment and clinical outcomes for patients. Innate Pharma specializes in immuno-oncology, a new therapeutic field that is changing cancer treatment by mobilizing the power of the body's immune system to recognize and kill cancer cells. The Company's aim is to become a commercial-stage biopharmaceutical company in the area of immunotherapy, focused on serious unmet medical needs in cancer. 
Innate Pharma has pioneered the discovery and development of checkpoint inhibitors to activate the innate immune system. Innate Pharma's innovative approach has resulted in three first-in-class, clinical-stage antibodies targeting natural killer cell receptors that may address a broad range of solid and hematological cancer indications as well as additional preclinical product candidates and technologies. Targeting receptors involved in innate immunity also creates opportunities for the Company to develop therapies for inflammatory diseases. The Company's expertise and understanding of natural killer cell biology have enabled it to enter into major alliances with leaders in the biopharmaceutical industry including AstraZeneca, Bristol-Myers Squibb and Sanofi. Based in Marseille, France, Innate Pharma has more than 140 employees and is listed on Euronext Paris. Learn more about Innate Pharma at www.innate-pharma.com. This press release contains certain forward-looking statements. Although the company believes its expectations are based on reasonable assumptions, these forward-looking statements are subject to numerous risks and uncertainties, which could cause actual results to differ materially from those anticipated. For a discussion of risks and uncertainties which could cause the company's actual results, financial condition, performance or achievements to differ from those contained in the forward-looking statements, please refer to the Risk Factors ("Facteurs de Risque") section of the Document de Reference prospectus filed with the AMF, which is available on the AMF website (http://www.amf-france.org) or on Innate Pharma's website. This press release and the information contained herein do not constitute an offer to sell or a solicitation of an offer to buy or subscribe to shares in Innate Pharma in any country.  Data were previously reported at the Third World Congress of Cutaneous Lymphomas on October 28, 2016.  
CTCL is an orphan disease with poor prognosis and few therapeutic options at advanced stages.  While azacytidine has been approved in high-risk myelodysplastic syndromes (US and EU), in palliative treatment of acute myeloid leukemia (EU) and in chronic myelomonocytic leukemia (EU), it has not yet been approved in relapsed acute myeloid leukemia.
Kenworthy M.A.,Leiden University |
Mamajek E.E.,University of Rochester
Astrophysical Journal | Year: 2015
The light curve of 1SWASP J140747.93-394542.6, a 16 Myr old star in the Sco-Cen OB association, underwent a complex series of deep eclipses that lasted 56 days, centered on 2007 April. This light curve is interpreted as the transit of a giant ring system that is filling up a fraction of the Hill sphere of an unseen secondary companion, J1407b. We fit the light curve with a model of an azimuthally symmetric ring system, including spatial scales down to the temporal limit set by the star's diameter and relative velocity. The best ring model has 37 rings and extends out to a radius of 0.6 AU (9 × 10⁷ km), and the rings have an estimated total mass on the order of 100 M_Moon. The ring system has one clearly defined gap at 0.4 AU (6.1 × 10⁷ km), which, we hypothesize, is being cleared out by a <0.8 M⊕ exosatellite orbiting around J1407b. This eclipse and model imply that we are seeing a circumplanetary disk undergoing a dynamic transition to an exosatellite-sculpted ring structure, which is one of the first seen outside our solar system. © 2015. The American Astronomical Society. All rights reserved.
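The abstract's statement that the rings "fill up a fraction of the Hill sphere" can be made concrete with the standard Hill-radius formula, r_H ≈ a·(m/3M)^(1/3). The sketch below uses assumed masses and an assumed orbital distance for J1407 and J1407b, purely to illustrate the comparison with the 0.6 AU ring extent from the fit; none of these inputs is pinned down by the paper.

```python
def hill_radius(a, m_secondary, m_primary):
    """Approximate Hill-sphere radius r_H = a * (m / (3 M))**(1/3)."""
    return a * (m_secondary / (3.0 * m_primary)) ** (1.0 / 3.0)

AU = 1.496e11     # astronomical unit, m
M_SUN = 1.989e30  # solar mass, kg
M_JUP = 1.898e27  # Jupiter mass, kg

# Assumed values for this sketch (not constrained values from the paper):
a = 5.0 * AU          # assumed orbital distance of J1407b
m_b = 20.0 * M_JUP    # assumed companion mass
m_star = 0.9 * M_SUN  # assumed stellar mass

r_hill = hill_radius(a, m_b, m_star)
ring_extent = 0.6 * AU  # outer ring radius from the light-curve fit
print(f"Hill radius ~ {r_hill / AU:.2f} AU; "
      f"rings fill ~{ring_extent / r_hill:.0%} of it")
```

Under these assumptions the rings occupy a large fraction of, but remain inside, the companion's Hill sphere, consistent with the picture in the abstract.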
Dooley S.,University of Heidelberg |
Ten Dijke P.,Leiden University
Cell and Tissue Research | Year: 2012
Transforming growth factor-β (TGF-β) is a central regulator in chronic liver disease contributing to all stages of disease progression from initial liver injury through inflammation and fibrosis to cirrhosis and hepatocellular carcinoma. Liver-damage-induced levels of active TGF-β enhance hepatocyte destruction and mediate hepatic stellate cell and fibroblast activation resulting in a wound-healing response, including myofibroblast generation and extracellular matrix deposition. Being recognised as a major profibrogenic cytokine, the targeting of the TGF-β signalling pathway has been explored with respect to the inhibition of liver disease progression. Whereas interference with TGF-β signalling in various short-term animal models has provided promising results, liver disease progression in humans is a process of decades with different phases in which TGF-β or its targeting might have both beneficial and adverse outcomes. Based on recent literature, we summarise the cell-type-directed double-edged role of TGF-β in various liver disease stages. We emphasise that, in order to achieve therapeutic effects, we need to target TGF-β signalling in the right cell type at the right time. © The Author(s) 2011.
Valkenburg W.,Leiden University |
Valkenburg W.,University of Heidelberg
General Relativity and Gravitation | Year: 2012
We present elliptic solutions to the background equations describing the Lemaître-Tolman-Bondi metric, as well as to the homogeneous Friedmann equation, in the presence of dust, curvature and a cosmological constant Λ. None of the presented solutions requires numerical integration. All presented solutions are given for expanding and collapsing phases, preserving continuity in time and radius; both radial and angular scale functions are given. Hence, these solutions describe the complete spacetime of a collapsing spherical object in an expanding universe, as well as that of ever-expanding objects. In the appendix we present for completeness a solution of the Friedmann equation in the additional presence of radiation, valid only for the Robertson-Walker metric. © 2012 The Author(s).
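Why the solutions are elliptic can be seen already in the homogeneous limit. Restating the standard Friedmann equation for dust, curvature and Λ (in units with c = 1; this is an orientation sketch, not text from the paper):

```latex
\left(\frac{\dot a}{a}\right)^{2}
  = \frac{8\pi G}{3}\,\frac{\rho_{0}}{a^{3}}
  - \frac{k}{a^{2}}
  + \frac{\Lambda}{3}
\quad\Longrightarrow\quad
t - t_{0}
  = \int \frac{a\,\mathrm{d}a}
             {\sqrt{\tfrac{8\pi G}{3}\,\rho_{0}\,a - k\,a^{2} + \tfrac{\Lambda}{3}\,a^{4}}}\,.
```

The polynomial under the square root is quartic in a, so the time integral is an elliptic integral; this is why closed-form solutions in terms of elliptic functions exist and no numerical integration is needed.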
Bushway S.D.,University at Albany |
Nieuwbeerta P.,Leiden University |
Blokland A.,Netherlands Institute for the Study of Crime and Law Enforcement NSCR
Criminology | Year: 2011
Criminal record checks are being used increasingly by decision makers to predict future unwanted behaviors. A central question these decision makers face is how much time it takes before offenders can be considered "redeemed" and resemble nonoffenders in terms of the probability of offending. Building on a small literature addressing this topic for youthful, first-time offenders, the current article asks whether this period differs across the age of last conviction and the total number of prior convictions. Using long-term longitudinal data on a Dutch conviction cohort, we find that young novice offenders are redeemed after approximately 10 years of remaining crime free. For older offenders, the redemption period is considerably shorter. Offenders with extensive criminal histories, however, either never resemble their nonconvicted counterparts or only do so after a crime-free period of more than 20 years. Practical and theoretical implications of these findings are discussed. © 2011 American Society of Criminology.
Valkenburg W.,Leiden University |
Marra V.,University of Heidelberg |
Clarkson C.,University of Cape Town
Monthly Notices of the Royal Astronomical Society: Letters | Year: 2014
We present a new programme for placing constraints on radial inhomogeneity in a dark-energy-dominated universe. We introduce a new measure to quantify violations of the Copernican principle. Any violation of this principle would interfere with our interpretation of any dark-energy evolution. In particular, we find that current observations place reasonably tight constraints on possible late-time violations of the Copernican principle: the allowed area in the parameter space of amplitude and scale of a spherical inhomogeneity around the observer has to be reduced by a factor of 3 so as to confirm the Copernican principle. Then, by marginalizing over possible radial inhomogeneity we provide the first constraints on the cosmological constant which are free of the homogeneity prior prevalent in cosmology. © The Author 2013. Published by Oxford University Press on behalf of The Royal Astronomical Society.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2015-ETN | Award Amount: 3.56M | Year: 2015
GLYCOVAX is a network for the education of promising young scientists who will learn how to rationally design a next generation of well-defined and innovative glycoconjugate vaccines to improve current preventive therapies and to tackle unmet medical needs. Glycoconjugate vaccines represent the key to the success of vaccination in children. The covalent linkage to proteins renders carbohydrates able to evoke a T-cell memory response. Current vaccines are prepared from heterogeneous mixtures of sugars linked by unspecific methods to the carrier protein, giving complex mixtures of products. Due to this intricate structure, it has not been possible to apply a medicinal chemistry approach to the development of glycoconjugate vaccines or to fully understand their mechanism of action. The combination of novel approaches for glycan synthesis and site-selective conjugation methods now gives access to conjugates defined in sugar component and attachment site, thus leading to robust structure-immunogenicity relationships. Advancements in structural biophysics can be applied to select the optimal glycan antigen. By combining the beneficiaries' expertise in carbohydrate synthesis, bioconjugation, high-throughput screening, structural glycobiology, vaccinology and immunology, together with their experience in project management, GLYCOVAX will create a multidisciplinary environment where 14 young researchers will contribute to developing a novel route towards improved, safer and better-characterized glycoconjugate vaccines, and will simultaneously acquire transferable skills which will lead them to become the new leaders of academic or industrial research. The network will involve 9 academic groups and 2 industrial partners as Beneficiaries, and one SME as Partner Organization. The profound interaction between academia and industry will help the students to acquire new concepts and visions for translating their ideas from the bench to the manufacturing of the next generation of glycoconjugate vaccines.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SPA.2013.2.1-01 | Award Amount: 3.29M | Year: 2013
Understanding the evolution of galaxies across cosmic time is one of the great challenges of astrophysics. At the present day, galaxies found in different environments are very different from each other. To understand how this came to be, we need to map a wide range of environments in the early Universe using telescopes that probe the different physical processes. Many astronomical facilities have thus been undertaking ambitious programmes to observe large areas of the distant Universe to study galaxy evolution, and most of these complete in the next four years. Our project brings together key members of the various teams to combine these data homogeneously. We will add new meta and physical data that is only possible once the data have been properly combined, but is essential to interpret them scientifically. ESA's Herschel mission has a unique place in probing the obscured star-formation history (roughly 50 per cent of all star formation activity). The Herschel extragalactic surveys were a major goal of the Herschel mission and accounted for around 10 per cent of it. Full exploitation of these data is complicated by the large beam size. The ancillary data and tools assembled by our project are necessary to fully capitalise on this fantastic resource and to enable astronomers in Europe (without Herschel experience) to exploit the data easily. As well as a census of galaxies with value-added data and tools to exploit the original telescope maps, we will provide new characterisations of the environment: catalogues of galaxy clusters and 3D maps of the Universe. We will also provide a new framework, an extended halo model, to characterise the Universe and provide a benchmark for theorists. We thus intend to provide a vast resource for studying the distant Universe, similar to the SDSS for the nearby Universe, as a lasting legacy of these major ground-based and space-based surveys.
Fred Hutchinson Cancer Research Center, Leiden University, University of Rochester and University of Washington | Date: 2013-06-18
The present invention relates generally to the field of molecular biology and genetics. More particularly, it concerns methods and compositions for detecting, diagnosing, and/or treating facioscapulohumeral dystrophy (FSHD2).
Avantium International B.V. and Leiden University | Date: 2010-06-09
A method for detecting polymorphism of at least one analyte, comprising the provision of a set of compositions on a solid support medium, the solid support medium comprising a multitude of separate cells, each said composition comprising at least one analyte, and inducing or allowing each composition to adopt at least a first condition possibly influencing crystallisation of the compositions in the cells on the solid support medium, and detecting any crystallisation in at least one composition, determining the X-ray diffraction patterns, comparing the X-ray diffraction patterns and thereby identifying any polymorphs of the analyte.
Vib Vzw, Ghent University and Leiden University | Date: 2015-02-18
This disclosure relates to the field of secondary metabolite production in plants. More specifically, the disclosure relates to chimeric genes and their use in the regulation of biosynthesis and/or production of secondary metabolites in plants and plant-derived cell cultures.
Prosensa and Leiden University | Date: 2010-05-24
Methods for treating neuroblastoma in a subject are provided. The methods include administering to the subject nucleic acid that is capable of causing a significant reduction of the amount of doublecortin like protein (DCL) in the subject. The nucleic acids include antisense oligonucleotides and small interfering RNA.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2013.6.1-5 | Award Amount: 3.57M | Year: 2013
Climate change mitigation policies now focus on production, whereas the upward drivers of GHG emissions come from consumption. Demand-side-oriented policies can hence complement domestic GHG reduction efforts. The core aims of this project are to 1. Stimulate innovative demand-side-oriented climate policies through improved shared insight into consumption emissions. 2. Realize a more effective policy mix for achieving the objectives of the EU policy packages (e.g. the Low Carbon Economy Roadmap). There are significant questions about consumption-based carbon accounting (CBCA) systems (Gap 1: CBCA reliability) and demand-side policies (effectiveness (Gap 2) and societal impacts (Gap 3)). Stakeholders can hence easily question their added value (Gap 4). Our project will overcome this problem via the following responses: 1. (WP4) Comparing the major CBCA databases (EXIOBASE, WIOD, GTAP, EORA), identifying key factors causing uncertainty, and assessing upward drivers, resulting in CBCA that can be implemented by formal players in the climate community (UNFCCC, IEA, others). 2. (WP5 and WP6) Providing an in-depth analysis of the feasibility of consumption-based and trade-related policies, assessing their effectiveness and their compatibility with e.g. WTO rules (WP5); specific case studies will zoom in on practical improvement options and implications for specific sectors (WP6). 3. (WP7) Improving some of the most ambitious global economic models, E3ME/E3MG, EXIOMOD and IPTS's FIDELIO, in relation to point 1, so that they capture side-effects and rebound effects, impacts on trade, investment etc. of consumption-based policies. 4. (WP8 and WP2) Creating an implementation roadmap for consumption-based accounts and policies (WP8) endorsed by a critical mass of stakeholders via policy-science brokerage activities (WP2). The project is complemented by Management (WP1) and an inception phase (WP3), and is executed by a group of the most renowned institutes in CBCA, economic modelling and climate policy.
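Consumption-based carbon accounting of the kind the CBCA databases (EXIOBASE, WIOD, GTAP, EORA) support rests on environmentally extended input-output analysis: a region's footprint is f·(I − A)⁻¹·y, where A holds technical coefficients, y final demand and f direct emission intensities. A toy two-sector sketch, with made-up coefficients that are illustrative only:

```python
import numpy as np

# Toy two-sector economy; all numbers are illustrative, not taken from
# EXIOBASE, WIOD, GTAP or EORA.
A = np.array([[0.1, 0.2],    # technical coefficients: input from sector i
              [0.3, 0.1]])   # needed per unit of output of sector j
y = np.array([100.0, 50.0])  # final demand of the consuming region
f = np.array([0.5, 1.2])     # direct emissions per unit output (kg CO2)

# Leontief inverse gives the total (direct + indirect) output required
# to satisfy final demand y.
L = np.linalg.inv(np.eye(2) - A)
x = L @ y

# Consumption-based emissions footprint: f . L . y
footprint = f @ L @ y
print(f"total output: {x.round(1)}, footprint: {footprint:.1f} kg CO2")
```

The footprint attributes to the consuming region all emissions embodied in its final demand, including those occurring upstream in other sectors; comparing it with the region's production-based (territorial) emissions is exactly the exercise the CBCA databases disagree on.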
Fred Hutchinson Cancer Research Center, University of Rochester and Leiden University | Date: 2011-08-18
In one aspect, the invention provides a method of screening a human subject to determine if said subject has a genetic predisposition to develop, or is suffering from Facioscapulohumeral Dystrophy (FSHD), said method comprising: (a) providing a biological sample comprising genomic DNA from the subject; and (b) analyzing the portion of the genomic DNA in the sample corresponding to the distal D4Z4-pLAM region on chromosome 4 and determining the presence or absence of a polymorphism resulting in a functional polyadenylation sequence operationally linked to exon 3 of the DUX4 gene, wherein a determination of the absence of a functional polyadenylation sequence operationally linked to exon 3 of the DUX4 gene indicates that the subject does not have a genetic predisposition to develop, and is not suffering from FSHD, and/or wherein a determination of the presence of a functional polyadenylation sequence operationally linked to exon 3 of the DUX4 gene indicates that the subject has a genetic predisposition to develop, or is suffering from Facioscapulohumeral Dystrophy (FSHD).
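The screening step in (b) hinges on whether a functional polyadenylation signal is present downstream of DUX4 exon 3. Purely as an illustration, a naive scan for canonical poly(A)-signal hexamers in a sequence might look like the sketch below; the hexamer list and the example sequence are assumptions for this sketch, not the patented assay.

```python
# Naive scan for canonical polyadenylation-signal hexamers in a DNA
# sequence. Hexamer list and example sequence are illustrative only.
POLYA_SIGNALS = ("AATAAA", "ATTAAA")

def find_polya_signals(seq):
    """Return (position, hexamer) for each candidate poly(A) signal."""
    seq = seq.upper()
    return [(i, seq[i:i + 6])
            for i in range(len(seq) - 5)
            if seq[i:i + 6] in POLYA_SIGNALS]

example = "GGCATTAAACTGGAATAAATCC"  # synthetic example sequence
print(find_polya_signals(example))
```

A real assay would of course genotype the specific D4Z4-pLAM polymorphism rather than scan for hexamers, but the sketch shows the kind of sequence feature the claim turns on.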
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FETHPC-1-2014 | Award Amount: 4.12M | Year: 2015
Multiscale phenomena are ubiquitous and they are the key to understanding the complexity of our world. Despite the significant progress achieved through computer simulations over the last decades, we are still limited in our capability to accurately and reliably simulate hierarchies of interacting multiscale physical processes that span a wide range of time and length scales, thus quickly reaching the limits of contemporary high performance computing at the tera- and petascale. Exascale supercomputers promise to lift this limitation, and in this project we will develop multiscale computing algorithms capable of producing high-fidelity scientific results and scalable to exascale computing systems. Our main objective is to develop generic and reusable High Performance Multiscale Computing algorithms that will address the exascale challenges posed by heterogeneous architectures and will enable us to run multiscale applications with extreme data requirements while achieving scalability, robustness, resiliency, and energy efficiency. Our approach is based on generic multiscale computing patterns that allow us to implement customized algorithms to optimise load balancing, data handling, fault tolerance and energy consumption under generic exascale application scenarios. We will realise an experimental execution environment on our pan-European facility, which will be used to measure performance characteristics and develop models that can provide reliable performance predictions for emerging and future exascale architectures. The viability of our approach will be demonstrated by implementing nine grand challenge applications which are exascale-ready and pave the road to unprecedented scientific discoveries. Our ambition is to establish new standards for multiscale computing at exascale, and provision a robust and reliable software technology stack that empowers multiscale modellers to transform computer simulations into predictive science.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2011.3.1.9-3 | Award Amount: 3.14M | Year: 2011
EMInInn aims at assessing the environmental impacts associated with innovation. In a first step EMInInn will assemble and set out coherently, on the one hand, macro-indicators and data of environmental impacts and, on the other hand, indicators and data to measure innovations. The definitions and delineations will be the basis for selecting appropriate analytical frameworks to operationalize assessments of environmental impacts associated with innovation on a macro scale. EMInInn will incorporate and integrate a number of advanced analytical approaches for the ex post assessment of the macro-environmental impacts of innovation. This methodology will be applied in different areas of technological innovation: Energy sources and conversion technologies Information and Communication Technologies Transport Built environment and buildings Waste management EMInInn aims at developing an analytical framework for assessing environmental impacts of established as well as emerging technologies. In selected cases options for scenarios to model burden-shifting and rebound effects will be explored. EMInInn will strengthen the science-policy link. An advisory board with experts from different governance levels will be to advise the EMInInn researchers how to link the research and dissemination with ongoing and upcoming policy initiatives and research. A number of workshops and publications will allow interaction with experts, stakeholders and policy-makers. In that context EMInInn will address EU policies, which affect three major fields of environmental impact: resources and waste, energy and climate, as well as land-use and biodiversity. By improving environmental assessments of innovation as well as policy-oriented interactions and outputs, EMInInn will generate contributions for improving EU-policies for a transition towards a more sustainable Europe and thus contribute to the flagship initiatives for a Resource Efficient Europe and the Innovation Union.