The University of Geneva is a public research university located in Geneva, Switzerland. It was founded in 1559 by John Calvin as a theological seminary and law school. It remained focused on theology until the 17th century, when it became a center for Enlightenment scholarship. In 1873 it dropped its religious affiliations and became officially secular. Today the university is the second-largest in Switzerland by number of students; in 2009 it celebrated the 450th anniversary of its founding. UNIGE offers programs across many fields and is particularly known for its academic and research programs in international relations, law, astrophysics, astronomy, and genetics. The university holds and actively pursues teaching, research, and community service as its primary objectives. In 2011 it was ranked 73rd worldwide by the Academic Ranking of World Universities and 69th in the QS World University Rankings. UNIGE is a member of the League of European Research Universities, the Coimbra Group, and the European University Association.
University of Geneva | Date: 2016-08-15
This invention concerns the field of sample identification, in particular a method and apparatuses for identifying or discriminating biological species from non-biological species, both as individual particles and as components of a composition, by pump-probe fluorescence spectroscopy for time-resolved detection or imaging. The method exploits the finding that the UV-induced fluorescence of biological molecules is altered, in particular depleted, by the addition of visible radiation, whereas this does not occur with non-biological organic molecules. The invention discriminates the fluorescence signals of biological and non-biological particles or species using a differential approach, i.e. the comparison of the total fluorescence recorded with and without additional visible radiation. This allows biological particles containing aromatic amino acids (AA), such as peptides, proteins, bacteria, viruses, pollens, and spores, to be discriminated from non-biological particles such as aromatic hydrocarbons (AH), polyaromatic hydrocarbons (PAH), carbonaceous aerosols, and soot.
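The differential comparison described above reduces, in essence, to thresholding the relative depletion of fluorescence. A minimal sketch of that decision rule; the signal values and the depletion cutoff below are illustrative assumptions, not parameters taken from the patent:

```python
def classify_particle(fluo_uv_only, fluo_uv_plus_vis, depletion_threshold=0.1):
    """Classify a particle as biological or non-biological from the relative
    depletion of its UV-induced fluorescence when additional visible
    radiation is applied (the differential pump-probe approach).

    depletion_threshold is an illustrative cutoff, not a value from the patent.
    """
    if fluo_uv_only <= 0:
        raise ValueError("UV-only fluorescence must be positive")
    depletion = (fluo_uv_only - fluo_uv_plus_vis) / fluo_uv_only
    # Biological molecules (aromatic amino acids) show depleted fluorescence
    # under added visible light; non-biological organics (PAHs, soot) do not.
    return "biological" if depletion > depletion_threshold else "non-biological"

# Strong depletion under visible light suggests a biological particle:
print(classify_particle(100.0, 60.0))   # biological
print(classify_particle(100.0, 98.0))   # non-biological
```

The two recorded totals could come from alternating probe-on/probe-off acquisition windows of the same particle stream.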
Beta C.,University of Potsdam |
Kruse K.,University of Geneva
Annual Review of Condensed Matter Physics | Year: 2017
Dynamic processes in living cells are highly organized in space and time. Unraveling the underlying molecular mechanisms of spatiotemporal pattern formation remains one of the outstanding challenges at the interface between physics and biology. A fundamental recurrent pattern found in many different cell types is that of self-sustained oscillations. They are involved in a wide range of cellular functions, including second messenger signaling, gene expression, and cytoskeletal dynamics. Here, we review recent developments in the field of cellular oscillations and focus on cases where concepts from physics have been instrumental for understanding the underlying mechanisms. We consider biochemical and genetic oscillators as well as oscillations that arise from chemo-mechanical coupling. Finally, we highlight recent studies of intracellular waves that have increasingly moved into the focus of this research field. © 2017 by Annual Reviews. All rights reserved.
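As a concrete example of the self-sustained oscillations the review discusses, here is a minimal simulation of the Brusselator, a textbook chemical oscillator chosen purely for illustration (it is not one of the specific cellular systems reviewed):

```python
def brusselator(a=1.0, b=3.0, x0=1.0, y0=1.0, dt=0.002, steps=50000):
    """Integrate the Brusselator with forward Euler:
        dx/dt = a - (b + 1) x + x^2 y
        dy/dt = b x - x^2 y
    For b > 1 + a^2 the fixed point (a, b/a) is unstable and the system
    settles onto a self-sustained limit cycle, the hallmark pattern
    described in the review."""
    x, y = x0, y0
    xs = []
    for _ in range(steps):
        dx = a - (b + 1.0) * x + x * x * y
        dy = b * x - x * x * y
        x += dt * dx
        y += dt * dy
        xs.append(x)
    return xs

xs = brusselator()
late = xs[len(xs) // 2:]  # discard the initial transient
print(round(min(late), 2), round(max(late), 2))  # oscillation persists
```

The same qualitative mechanism, negative feedback with a destabilized steady state, underlies many of the biochemical oscillators the review covers.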
Frowis F.,University of Geneva
Journal of Physics A: Mathematical and Theoretical | Year: 2017
Experimental progress with meso- and macroscopic quantum states (i.e. general Schrödinger-cat states) was recently accompanied by theoretical proposals on how to measure the merit of these efforts. So far, experiment and theory have been disconnected as theoretical analysis of actual experimental data was missing. Here, we consider a proposal for macroscopic quantum states that measures the extent of quantum coherence present in the system. For this, the quantum Fisher information is used. We calculate lower bounds from real experimental data. The results are expressed as an 'effective size', that is, relative to 'classical' reference states. We find remarkable numbers of up to 70 in photonic and atomic systems. © 2017 IOP Publishing Ltd.
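For a pure state the quantum Fisher information reduces to four times the variance of the generator, which makes the 'effective size' idea easy to illustrate. The sketch below uses an N-qubit GHZ state and the collective spin J_z, under the common convention that product ('classical') states give F_Q at most N; the paper's lower bounds for mixed experimental states are more involved than this:

```python
import numpy as np

def ghz_state(n):
    """N-qubit GHZ state (|0...0> + |1...1>)/sqrt(2)."""
    psi = np.zeros(2**n)
    psi[0] = psi[-1] = 1 / np.sqrt(2)
    return psi

def jz_diagonal(n):
    """Diagonal of the collective spin J_z = (1/2) * sum_i sigma_z^(i):
    basis state with k ones has eigenvalue (n - 2k)/2."""
    bits = np.array([[(k >> i) & 1 for i in range(n)] for k in range(2**n)])
    return (n - 2 * bits.sum(axis=1)) / 2

def pure_state_qfi(psi, a_diag):
    """Quantum Fisher information of a pure state: F_Q = 4 * Var(A)."""
    probs = np.abs(psi)**2
    mean = np.sum(probs * a_diag)
    mean_sq = np.sum(probs * a_diag**2)
    return 4 * (mean_sq - mean**2)

n = 4
fq = pure_state_qfi(ghz_state(n), jz_diagonal(n))
print(fq)       # N^2 = 16 for the GHZ state (Heisenberg scaling)
print(fq / n)   # effective size N_eff = N: coherence spans all qubits
```

A product state of the same qubits would give F_Q = N and effective size 1, which is what makes the ratio a useful measure of macroscopic coherence.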
Lamanna G.,University of Geneva
Clinical Nuclear Medicine | Year: 2017
PURPOSE: The aims of this study were to assess the intraindividual performance of 18F-fluorocholine (FCH) and 11C-acetate (ACE) PET studies for restaging of recurrent prostate cancer (PCa), to correlate PET findings with long-term clinical and imaging follow-up, and to evaluate the impact of PET results on patient management. METHODS: Thirty-three PCa patients relapsing after radical prostatectomy (n = 10, prostate-specific antigen [PSA] ≤3 ng/mL), primary radiotherapy (n = 8, PSA ≤5 ng/mL), or radical prostatectomy + salvage radiotherapy (n = 15) underwent ACE and FCH PET-CT (n = 29) or PET-MRI (n = 4) studies in a randomized sequence 0 to 21 days apart. RESULTS: The detection rate was 66% for ACE and 60% for FCH. Results were concordant in 79% of the cases (26/33) and discordant in 21% (retroperitoneal, n = 5; pararectal, n = 1; and external iliac nodes, n = 1). After a median follow-up (FU) of 41 months (n = 32; 1 patient lost to FU), the site of relapse was correctly identified by ACE and FCH in 53% (17/32) and 47% (15/32) of the patients, respectively (2 M1a patients ACE+/FCH−), whereas in 6 of 32 patients the relapse was not localized. The treatment approach was changed in 11 (34.4%) and 9 (28%) of 32 patients restaged with ACE and FCH PET, respectively. CONCLUSIONS: In early recurrent PCa, ACE and FCH showed minor discrepancies, limited to nodal staging and mainly in the retroperitoneal area, with true positivity of PET findings confirmed in half of the cases during FU. The treatment approach turned out to be influenced by ACE or FCH PET studies in one third of the patients. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
Bergeron F.C.,University of Geneva
Environmental Impact Assessment Review | Year: 2017
Waste is no longer a rejected item to be disposed of but increasingly a secondary resource to exploit, which influences how waste is allocated among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process” (WAP), defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system and, secondly, an explanatory model that serves to explain and predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated into the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from its interdisciplinary analysis of the WAP and from the development of the conceptual framework. AMWAP is applied in an illustrative case study of the household WM system of Geneva (Switzerland), which demonstrates that the method provides in-depth and contextual knowledge of WM. © 2016
Antonarakis G.S.,University of Geneva
Journal of Burn Care and Research | Year: 2017
Microstomia, an abnormally small oral orifice, is a complication of perioral facial burns, in which contraction of the circumoral tissues and hypotonia of the musculature are responsible. It can produce aesthetic impairment and functional impairment of eating, swallowing, communication (speech and facial expressions), dental care and maintenance (due to limited oral access), social interactions, and psychological well-being. Conservative management involves providing physical resistance to scar contracture, with opposing horizontal and vertical circumoral forces delivered by appliances that aim to stretch the commissures and fibrotic muscles. Numerous appliances, either intraoral or extraoral, have been described to prevent or treat microstomia by delivering a static or dynamic stretch horizontally or vertically, with most designed to stretch the mouth horizontally; finding a comfortable, effective way to stretch the mouth vertically has proved a challenge. This article describes the fabrication of a dynamic commissural appliance, constructed from acrylic resin and expansion screws, that provides simultaneous horizontal and vertical circumoral forces. The appliance is constructed easily and inexpensively without the need for taking impressions, can be adjusted so that insertion is almost painless, and is progressively activated. It is convenient to use because the patient controls the pressure applied by the appliance. Its use is described in a case in which the appliance improved mouth opening and, consequently, functional outcomes. © 2017 The American Burn Association
Sheldrake T.,University of Geneva |
Caricchi L.,University of Geneva
Geology | Year: 2017
Quantifying the frequency at which volcanic eruptions of different sizes occur is important for hazard assessment. Volcanic records can be used to estimate the recurrence rate of large-magnitude eruptions (magnitude ≥4), but recording biases that affect data completeness complicate the analysis. To overcome these biases, we conceptualize the volcanic record as a series of individual and unique time series linked by a common behavior. Thus, we approach issues of completeness on a volcano-by-volcano basis and use a hierarchical Bayesian approach to characterize the common frequency-magnitude (f-M) behavior of different groups of volcanoes. We identify variations in the f-M relationship between different volcano types and between different volcanic arcs. By accounting for systematic under-recording in the volcanic record, we also calculate global recurrence rates for large-magnitude eruptions during the Holocene, which are similar to previous estimates; however, we observe higher recurrence rates for smaller-magnitude events as a result of our adjustments for data completeness. Quantifying how the f-M relationship varies between different groups of volcanoes provides an opportunity to understand how tectonic setting influences f-M behavior, which is important for quantifying long-term regional volcanic hazard. © 2016 Geological Society of America.
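The partial pooling at the heart of a hierarchical Bayesian treatment can be sketched with a conjugate Gamma-Poisson toy model: volcanoes with short or incomplete records are shrunk toward the group-level rate. The eruption counts, record lengths, and the crude empirical-Bayes choice of the group prior below are illustrative stand-ins, not the paper's model or data:

```python
import numpy as np

def shrunk_rates(counts, years, alpha, beta):
    """Posterior mean eruption rates under a shared Gamma(alpha, beta)
    prior on each volcano's Poisson rate (conjugate update). Sparse
    records are pulled toward the group-level behaviour, mimicking the
    partial pooling of a full hierarchical Bayesian model."""
    counts = np.asarray(counts, dtype=float)
    years = np.asarray(years, dtype=float)
    return (alpha + counts) / (beta + years)

# Illustrative data: magnitude >= 4 eruption counts per volcano and the
# length (years) of each usable record (values made up for this sketch).
counts = [3, 0, 1, 7]
years = [400, 50, 120, 900]

# Group-level prior matched to the pooled record: a crude empirical-Bayes
# stand-in for inferring the hyperparameters jointly with the rates.
pooled_rate = sum(counts) / sum(years)
alpha, beta = 1.0, 1.0 / pooled_rate
print(shrunk_rates(counts, years, alpha, beta))
```

Note how the volcano with zero recorded eruptions still gets a positive rate below the group mean, rather than the implausible maximum-likelihood estimate of exactly zero.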
Lavenex S.,University of Geneva
British Journal of Politics and International Relations | Year: 2017
The destabilization of Eastern Europe and of the Southern Mediterranean has exposed the limits of the European peace project. Obviously, the European Neighbourhood Policy has not succeeded in boosting peace and prosperity. This article attributes this failure to a combination of functionalist hubris and political myopia that emanates from the European Union’s peculiar constitution as a regulatory power with weak political union. While the projection of the European Union’s single market acquis has set over-ambitious targets, the needs of the partner countries and the wider geopolitical implications of the Neighbourhood Policy have received little political attention. In sum, the experience at the fringes of the European peace project not only unveils the limits of the European Union as a foreign policy actor, but it also raises more theoretical questions on the notion of ‘liberal peace’. © The Author(s) 2016.
Goyette S.,University of Geneva
Climate Dynamics | Year: 2017
A coupled single-column atmosphere-lake model, along with the Stein–Alpert factor-separation methodology, is used to explore some of the non-linear interactions in the vertical dimension between the lower atmosphere and deep Lake Geneva, Switzerland, during three selected periods in 1990: the first from the end of April to the end of May, when Lake Geneva was building its stratification; the second from mid-August to mid-September, during stable stratification; and the third from the end of November to the end of December, during destratification. It is recognized that the large thermal inertia of Lake Geneva reduces the annual and diurnal surface temperature variations of neighbouring regions. However, the question of how the open water and the overlying atmosphere interact, and which of these “factors” has the most influence, still needs much attention. The sole presence of the lake is shown to be a major feature with regard to the surface energy budget components, whose contributions counteract those of the lower atmosphere, supporting the view that Lake Geneva acts as a damping factor in the regional climate system. It is also shown that not only did the presence of the lake and the overlying atmosphere independently modulate the surface energy budget, but the synergistic non-linear interaction between them, whether positive or negative, was often non-negligible. Moreover, some processes may turn out to be important on short time scales while being negligible in the long term. © 2016, Springer-Verlag Berlin Heidelberg.
Nowell C.S.,University of Geneva
Nature Reviews Cancer | Year: 2017
The Notch signalling cascade is an evolutionarily conserved pathway that has a crucial role in regulating development and homeostasis in various tissues. The cellular processes and events that it controls are diverse, and continued investigation over recent decades has revealed how the role of Notch signalling is multifaceted and highly context dependent. Consistent with the far-reaching impact that Notch has on development and homeostasis, aberrant activity of the pathway is also linked to the initiation and progression of several malignancies, and Notch can in fact be either oncogenic or tumour suppressive depending on the tissue and cellular context. The Notch pathway therefore represents an important target for therapeutic agents designed to treat many types of cancer. In this Review, we focus on the latest developments relating specifically to the tumour-suppressor activity of Notch signalling and discuss the potential mechanisms by which Notch can inhibit carcinogenesis in various tissues. Potential therapeutic strategies aimed at restoring or augmenting Notch-mediated tumour suppression will also be highlighted. © 2017 Nature Publishing Group, a division of Macmillan Publishers Limited. All Rights Reserved.
Gander M.J.,University of Geneva
Lecture Notes in Computational Science and Engineering | Year: 2017
The idea of preconditioning iterative methods for the solution of linear systems goes back to Jacobi (Astron Nachr 22(20):297-306, 1845), who used rotations to obtain a system with more diagonal dominance, before he applied what is now called Jacobi's method. The preconditioning of linear systems for their solution by Krylov methods has become a major field of research over the past decades, and there are two main approaches for constructing preconditioners: either one has very good intuition and can propose directly a preconditioner which leads to a favorable spectrum of the preconditioned system, or one uses the splitting matrix of an effective stationary iterative method like multigrid or domain decomposition as the preconditioner. Much less is known about the preconditioning of non-linear systems of equations. The standard iterative solver in that case is Newton’s method (1671) or a variant thereof, but what would it mean to precondition the non-linear problem? An important contribution in this field is ASPIN (Additive Schwarz Preconditioned Inexact Newton) by Cai and Keyes (SIAM J Sci Comput 24(1):183-200, 2002), where the authors use their intuition about domain decomposition methods to propose a transformation of the non-linear equations before solving them by an inexact Newton method. Using the relation between stationary iterative methods and preconditioning for linear systems, we show in this presentation how one can systematically obtain a non-linear preconditioner from classical fixed point iterations, and present as an example a new two level non-linear preconditioner called RASPEN (Restricted Additive Schwarz Preconditioned Exact Newton) with substantially improved convergence properties compared to ASPIN. © Springer International Publishing AG 2017.
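The core construction, turning a stationary fixed-point iteration into a nonlinear preconditioner for Newton's method, can be sketched on a toy problem. The 2x2 system, the nonlinear Gauss-Seidel sweep standing in for the Schwarz iteration, and the finite-difference Jacobian below are all illustrative simplifications, not the paper's RASPEN setup:

```python
import numpy as np

# Toy 2x2 nonlinear system (illustrative; not a problem from the paper):
def F(x):
    return np.array([x[0] - np.cos(x[1]), x[1] - 0.5 * np.sin(x[0])])

def gauss_seidel_sweep(x):
    """One nonlinear Gauss-Seidel sweep: solve each equation for its own
    unknown using the freshest values. This is the stationary fixed-point
    method we recast as a nonlinear preconditioner."""
    x0 = np.cos(x[1])
    x1 = 0.5 * np.sin(x0)
    return np.array([x0, x1])

def P(x):
    """Preconditioned residual: x solves P(x) = 0 iff it is a fixed point
    of the sweep, i.e. iff it solves F(x) = 0."""
    return x - gauss_seidel_sweep(x)

def fd_jacobian(f, x, h=1e-7):
    """Finite-difference Jacobian (kept simple for the sketch)."""
    n = len(x)
    J = np.zeros((n, n))
    fx = f(x)
    for j in range(n):
        e = np.zeros(n); e[j] = h
        J[:, j] = (f(x + e) - fx) / h
    return J

def newton(f, x0, tol=1e-10, maxit=50):
    x = np.array(x0, dtype=float)
    for _ in range(maxit):
        step = np.linalg.solve(fd_jacobian(f, x), -f(x))
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

root_plain = newton(F, [1.0, 1.0])  # Newton on the original residual
root_prec = newton(P, [1.0, 1.0])   # Newton on the preconditioned residual
print(np.allclose(root_plain, root_prec))  # both find the same solution
```

Both formulations share the same roots; the point of the construction is that Newton applied to P inherits the robustness of the underlying fixed-point map, which in RASPEN is a restricted additive Schwarz iteration rather than this scalar sweep.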
Garnero P.,University of Geneva
Molecular Diagnosis and Therapy | Year: 2017
The measurement of bone turnover markers is useful for the clinical investigation of patients with osteoporosis. Among the available biochemical markers, the measurements of serum procollagen type I N-terminal propeptide (PINP) and the crosslinked C-terminal telopeptide (serum CTX) have been recommended as reference markers of bone formation and bone resorption, respectively. The important sources of preanalytical and analytical variability have been identified for both markers, and precise measurement can now be obtained. Reference interval data for PINP and CTX have been generated across different geographical locations, which allows optimum clinical interpretation. However, conventional protein-based markers have some limitations, including a lack of specificity for bone tissue, and their inability to reflect osteocyte activity or periosteal metabolism. Thus, novel markers such as periostin, sclerostin, and sphingosine 1-phosphate have been developed to address some of these shortcomings. Recent studies suggest that the measurements of circulating microRNAs, a new class of marker, may represent early biological markers in osteoporosis. Bone markers have been shown to be a useful adjunct to bone mineral density for identifying postmenopausal women at high risk for fracture. Because levels of bone markers respond rapidly to both anabolic and anticatabolic drugs, they are very useful for investigating the mechanism of action of new therapies and, potentially, for predicting their efficacy to reduce fracture risk. © 2017 Springer International Publishing Switzerland
Barisch C.,University of Geneva |
Soldati T.,University of Geneva
PLoS Pathogens | Year: 2017
During a tuberculosis infection and inside lipid-laden foamy macrophages, fatty acids (FAs) and sterols are the major energy and carbon source for Mycobacterium tuberculosis. Mycobacteria can be found both inside a vacuole and in the cytosol, but how this impacts their access to lipids is not well understood. Lipid droplets (LDs) store FAs in the form of triacylglycerols (TAGs) and are energy reservoirs of prokaryotes and eukaryotes. Using the Dictyostelium discoideum/Mycobacterium marinum infection model, we showed that M. marinum accesses host LDs to build up its own intracytosolic lipid inclusions (ILIs). Here, we show that host LDs aggregate at regions of the bacteria that become exposed to the cytosol and appear to coalesce on their hydrophobic surface, leading to a transfer of diacylglycerol O-acyltransferase 2 (Dgat2)-GFP onto the bacteria. Dictyostelium knockout mutants for both Dgat enzymes are unable to generate LDs; instead, the excess of exogenous FAs is esterified predominantly into phospholipids, inducing uncontrolled proliferation of the endoplasmic reticulum (ER). Strikingly, in the absence of host LDs, M. marinum alternatively exploits these phospholipids, resulting in a rapid reversal of ER proliferation. In addition, the bacteria are unable to restrict their acquisition of lipids from the dgat1&2 double knockout, leading to a vast accumulation of ILIs. Recent data indicate that the presence of ILIs is one of the characteristics of dormant mycobacteria. During Dictyostelium infection, ILI formation in M. marinum is not accompanied by a significant change in intracellular growth or a reduction in metabolic activity, providing evidence that storage of neutral lipids does not necessarily induce dormancy. © 2017 Barisch, Soldati.
News Article | April 28, 2017
Recent articles have declared that deposits of mineral raw materials (copper, zinc, etc.) will be exhausted within a few decades. An international team including the University of Geneva (UNIGE), Switzerland, has shown that this is incorrect and that the resources of most mineral commodities are sufficient to meet the growing demand from industrialization and future demographic changes. Future shortages will arise not from the physical exhaustion of different metals but from causes related to industrial exploitation, the economy, and environmental or societal pressures on the use of mineral resources. The report can be read in the journal Geochemical Perspectives. Some scientists have declared that mineral deposits containing important non-renewable resources such as copper and zinc will be exhausted within a few decades if consumption does not decrease. Reaching the opposite conclusion, the international team of researchers shows that even though mineral resources are finite, geological arguments indicate that they are sufficient for at least many centuries, even taking into account the increasing consumption required to meet the growing needs of society. How can this difference be explained? "Do not confuse the mineral resources that exist within the Earth with reserves, which are mineral resources that have been identified and quantified and are able to be exploited economically. Some studies that predict upcoming shortages are based on statistics that only take reserves into account, i.e. a tiny fraction of the deposits that exist", explains Lluis Fontboté, Professor in the Department of Earth Sciences, University of Geneva. Defining reserves is a costly exercise that requires investment in exploration, drilling, analyses, and numerical and economic evaluations. Mining companies therefore explore and delineate reserves sufficient for only a few decades of profitable operation.
Delineating larger reserves would be a costly and unproductive investment that does not fit the economic logic of the modern market. The result is that the estimated life of most mineral commodities is between twenty and forty years and has remained relatively constant over decades. Using these values to predict the amount available leads to the frequently announced risks of impending shortages. But this type of calculation is wrong, because it ignores both the metal in lower-quality deposits that are not included in reserves and the huge amount of metal in deposits that have not yet been discovered. Some studies have produced figures that include known and undiscovered resources, but because our knowledge of ore deposits in large parts of the Earth's crust is very fragmentary, these estimates are generally very conservative. The vast majority of mined deposits were discovered at the surface or in the uppermost 300 meters of the crust, but we know that deposits are also present at greater depths, and current techniques allow mining to depths of at least 2000 to 3000 meters. Thus, many existing mineral deposits have not yet been discovered and are not included in the statistics. There have been some mineral shortages in the past, especially during the boom related to China's growth, but these were due not to a lack of supplies but to operational and economic issues. For instance, between the discovery of a deposit and its effective operation, 10 to 20 years or more can elapse, and if demand rises sharply, industrial exploitation cannot respond instantly, creating a temporary shortage. "The real problem is not the depletion of resources, but the environmental and societal impact of mining operations", says Professor Fontboté. Mining has been undeniably linked to environmental degradation, and while impacts can be mitigated by modern technologies, many challenges remain.
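The "estimated life" figure criticized above is simply the reserves-to-production ratio. A sketch of that calculation with illustrative numbers (not actual survey figures), showing why the result hovers at a few decades regardless of the much larger resource base:

```python
def static_lifetime(reserves, annual_production):
    """Reserves-to-production ratio: the 'years remaining' figure behind
    most shortage predictions. It counts only delineated reserves, not
    undiscovered deposits or resources that are currently uneconomic."""
    return reserves / annual_production

# Illustrative numbers only: if delineated reserves of a metal were
# 870 Mt against production of 25 Mt/yr, the static lifetime is ~35 years,
# even though resources outside the 'reserves' category are far larger.
print(round(static_lifetime(870, 25)))  # 35
```

Because companies only delineate a few decades of reserves, the ratio stays near constant as mined reserves are continually replaced by newly delineated ones, which is exactly why it is a poor exhaustion forecast.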
The financial, environmental and societal costs of mining must be equitably apportioned between industrialized and developing countries, as well as between local communities near mines and the rest of society. "Recycling is important and essential, but is not enough to meet the strong growth in demand from developing countries. We must continue to seek and carefully exploit new deposits, both in developing and in industrialized countries", says the researcher at the University of Geneva. But how can we protect the environment while continuing to mine? Continuing research provides solutions. If we are to continue mining while minimizing the associated environmental effects, we need to better understand the formation of ore deposits and to open new areas of exploration with advanced methods of remote sensing. The continual improvement of exploration and mining techniques is reducing the impact on the Earth's surface. "Rapid evolution of technologies and society will eventually reduce our need for mineral raw materials, but at the same time, these new technologies are creating new needs for metals, such as many of the 60 elements that make up every smartphone", adds Professor Fontboté. The geological perspective that guided the present study leads to the conclusion that shortages will not become a threat for many centuries, as long as there is a major effort in mineral exploration coupled with conservation and recycling. To meet this challenge, society must find ways to discover and mine the needed mineral resources while respecting the environment and the interests of local communities.
News Article | April 17, 2017
The Oracle Virus from Paul Michael Privateer Points an Allegorical Finger at the New Order of American Politics. Digital Book Guild announces its first novel, with proceeds going to St. Jude Children's Research Hospital. Digital Book Guild is a non-profit publisher attracting writers willing to donate some or all of their proceeds to various charities; Paul Michael Privateer's Oracle Virus is the first release in this effort. Reviewers and readers have said the Oracle Virus has a fast-paced plot that moves through bizarre serial murders, kidnappings, betrayal, and deceit, with a jolting tension running through the story to its end in Washington DC. The violent climax between nature and man, the fake fog of Washington D.C. politics, Kafkaesque intrigue, and epic battles collide allegorically, leading readers down America’s truth-resistant political rabbit hole with an orange-haired real estate tycoon as guide. The book has nothing and everything to do with the Oracle, not Larry Ellison's but America's. Others have enjoyed it as the first "Google-assist read": they say that while the surface action pleases pop readers hooked on hybrid international thriller-sci-fi mysteries, there is a depth below resembling a Lynchian ecosystem reminiscent of Borges, Kafka, Hesse, Updike, Rushdie, Kidman, Antonioni, Greenville, Ihimaera, Adichie, and Abani. The Oracle Virus invents a new genre, hyperallegorical realism: the deeper a reader dives, the more The Oracle Virus reveals itself as postcolonial fiction, a cautionary tale of the effects of Trumpian politics dressed up as a blockbuster. Others insist that Privateer has created a novel that reads like a movie and is in a league with the likes of John Le Carre (and British intelligence), John Grisham (and legal acumen), Dan Brown (and religious intrigue), and Patricia Cornwell (and forensic science).
Digital Book Guild believes it has a winner, given responses suggesting the novel is flat out the most intense sci-fi mystery and political thriller offered up in a decade, a novel in which readers travel through two centuries, fly in and out of exotic world capitals, and meet renegade Nazis, a computer genius, Dark Web cave dwellers, Shaolin priests, Interpol’s toughest agents, strange prophets, and then Jack Kavanaugh, a modern hybrid of Sherlock Holmes and Jason Bourne, only smarter and stronger. The first reviewer's comments are about the novel's realism: "This sci-fi mystery thriller is set in the present, save for opening Hitler Nazi flashbacks. I say this in case a prospective reader expects star travel. The description and praise stated in the synopsis are accurate. Parts of the “science fiction” aspect of the book’s storyline are increasingly believable, given ever-expanding bio-technology. Scary! I tend to Google items I find in fiction, to learn how fictitious some story features are. I found that the seriously sized apocalyptic drone delivery vehicles used in the scary climax are available. Privateer spins an engaging mystery thriller. His social awareness is a plus. All proceeds are to go to St Jude Children’s Hospital, cancer research division.” The editorial board agrees that the Oracle Virus is a very unusual, advanced high-tech detective story that combines science fiction, detective, thriller, and romance elements. The story starts out meticulously and then becomes a real page turner. The cover provides a good overview: The Oracle Virus nerve-blasts its way into being a classic sci-fi thriller with substantial philosophic insight. It doesn’t brake a nanosecond for Hitchcock twists or Philip K. Dick’s paranoia. Page one is a rabbit hole: why is there a fake Gestapo assassination of Hitler, or is there? How does a secret genetics lab survive WWII bombings? Can a machine named Mediatron create reality? Or can a nanovirus control our minds?
How can a serial geek murder, a hurricane headed toward D.C., a whale stranding, the kidnapping of world presidents, a bloody fight atop the Washington Monument, and a secret Louisiana rogue organization all be connected? Or are they? Reviewers found Privateer's social awareness a plus, with proceeds going to St Jude Children’s Hospital, cancer research division. The Oracle Virus may be the first novel whose charity goal is announced on the title page. Paul Michael Privateer was born in New York, served in the United States Air Force, and is interested in intersections between literature, media, and science/information technology. His books include Romantic Voices and Inventing Intelligence, and many of his journal articles deal with the cultural and political effects of cyberspace, digital technology, and corporate media. Privateer has taught at San Jose State, the University of Southern Mississippi, Georgia Institute of Technology, and Arizona State University. The University of Geneva, Stanford, and MIT offered him a Fulbright and visiting professorships. He has appeared in the New York Times and on CNN, PBS, ABC, NPR, and BBC4 for his work on education reform, citizen service, and the digital future. His fiction focuses on the most basic aspects of being human: love, passion, fidelity, identity, taboos, social alienation, insecurity, and death. His next novels, A Woman in Love and The Nightmare Collector, explore the limits of digital media and hyperreal minimalism. His fiction is about fiction. His recent novel, The Oracle Virus, pays sometimes subtle homage to McFarlane, Shakespeare, Hugo, Dickens, Woolf, Kafka, Hardy, Melville, Camus, Steinbeck, Beckett, Borges, Dick, Auster, Angelou, Ellison, Roth, Gibson, and many others whose influences ultimately make serious fiction writing a ritual gathering of ghosts. This respect and fascination began with his favorite childhood game: Authors.
Privateer lives in the Pacific Northwest and is engaged in socially conscious initiatives. He is founder of NoSchoolViolence.org and Seattle Data for Good. He kayaks, likes trekking Puget Sound islands and the Olympic Peninsula with Nell, a curious but cautiously social black lab. For some unknown reason, she doesn’t sniff everyone’s hand.
News Article | April 18, 2017
When it comes to finding exoplanets in photos, hundreds or even thousands of eyes are better than an algorithm. The University of Geneva is creating a game to help researchers identify exoplanets with the help of gamers; it will be one of the largest citizen science projects ever, according to the MIT Technology Review. The game will operate within EVE Online servers under the title “Project Discovery.” EVE Online is a space-based online game developed by the video game company CCP. There is already one Project Discovery game, which involves identifying protein patterns to help advance the science and expansion of the Human Protein Atlas database. Once the exoplanet version of the game is released in June, players, members of the Massively Multiplayer Online Science community, will be able to work their way through data from the University of Geneva to help locate exoplanets. Once users are “recruited” in the game, they’ll be trained to identify different patterns or identifiers for exoplanets that they may encounter, and as they play through the game they can report anything they think may be an exoplanet. Once enough players identify the same possible planet, that data will go back to the University of Geneva for review. The crowdsourcing approach has yielded quick results for the first game involving human proteins, and researchers are hopeful that this second one will too. Algorithms can only detect so much: when other factors, like light pollution or spikes, are introduced into an image, algorithms have a hard time discerning an exoplanet from any other star, according to the MIT Tech Review. So the trained human eye is necessary for some discoveries.
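The "once enough players identify the same possible planet" step is a consensus vote. A minimal sketch of that aggregation; the target names and the vote threshold are illustrative, since the article does not specify Project Discovery's actual rule:

```python
from collections import Counter

def consensus_candidates(reports, min_votes=5):
    """Aggregate player reports of candidate transit signals. A target is
    forwarded to the researchers only once enough independent players
    have flagged it (min_votes is an illustrative threshold, not the
    game's real rule)."""
    votes = Counter(reports)
    return sorted(t for t, n in votes.items() if n >= min_votes)

# Hypothetical target IDs flagged by individual players:
reports = ["KIC-001"] * 7 + ["KIC-002"] * 2 + ["KIC-003"] * 5
print(consensus_candidates(reports))  # ['KIC-001', 'KIC-003']
```

Requiring agreement among many independent players is what filters out the individual mistakes that a single human (or a confused algorithm) would make.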
News Article | April 19, 2017
For scientists searching the skies for other Earth-like planets—other living worlds—the brightest hope may be a quiet star too dim to be seen with the naked eye, a sedate and solitary red dwarf called LHS 1140 nestled just 40 light-years away in the southern constellation Cetus. There an international team of astronomers has found a world that, although not a twin of Earth, certainly counts as a close cousin. LHS 1140 b is a “super-Earth,” a planet bigger than ours but smaller than Neptune, and the most common variety of world thought to exist in our galaxy. Many erstwhile super-Earths, however, have proved to be uninhabitable “mini-Neptunes” smothered beneath thick layers of gas. This world is different. At just under 50 percent larger than Earth but more than six times as heavy, its dimensions suggest it must be a ball of rock and metal, potentially with a thin and comparatively Earth-like atmosphere. Its 25-day orbit brings it 10 times closer to its star than Earth ever gets to our sun, but LHS 1140 shines so weakly that its planet soaks up just half the starlight our own world receives—just enough, it seems, to sustain the possibility of life-giving liquid water oceans on its surface. This alien world might well be tidally locked due to its nearness to its star, eternally turning the same face to its sun just as the moon does to Earth, leaving its far side in constant darkness. The planet and star are estimated to be at least five billion years old—that is, about half a billion years older than our solar system. Most importantly, each orbit sends this temperate, rocky world “transiting” across the face of its star as seen from Earth—a fortuitous alignment allowing astronomers to observe the planet more closely than any other potentially habitable world yet found beyond our solar system. 
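The claim that the planet's dimensions imply a ball of rock and metal follows from a quick density estimate. This back-of-the-envelope check uses the article's rounded figures (roughly 1.4 Earth radii and 6.6 Earth masses); the exact values in the Nature paper differ slightly.

```python
# Mean density scales as M / R^3, so working in Earth units:
radius_rel = 1.4                          # R / R_Earth ("just under 50 percent larger")
mass_rel = 6.6                            # M / M_Earth ("more than six times as heavy")
density_rel = mass_rel / radius_rel**3    # rho / rho_Earth
density_gcc = density_rel * 5.51          # Earth's mean density is about 5.51 g/cm^3

print(round(density_rel, 2), "times Earth's density,", round(density_gcc, 1), "g/cm^3")
```

A mean density well over twice Earth's is far too high for a gas-shrouded mini-Neptune, which is why the discoverers conclude the world must be predominantly rock and metal.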
Molecules in a transiting planet's upper atmosphere absorb a fraction of the starlight passing through, forming a tenuous ring of light around the globe that astronomers can study to learn what is in its alien air. In coming years astronomers will use this and other techniques to seek out any biosphere that might exist on LHS 1140 b, potentially revealing signs of oxygen and other atmospheric gases that, on Earth, constitute the literal breath of life. The planet’s discovery is detailed in a study published in Nature. “LHS 1140 b is the best candidate to look at for signs of life in the near future,” says study co-author David Charbonneau, an astronomer at Harvard University who leads the MEarth Project, a global network of small telescopes that first observed the transiting planet. (The “M” in “MEarth” stands for “M dwarf,” a technical term for those red dwarf stars that are about 30 percent or less the mass of the sun. Such stars are by far the most common variety in our galaxy, and the most amenable to studies of planets.) “This is the first time we’ve found a rocky planet that gives us the opportunity to look for oxygen,” Charbonneau adds. “This really is the one we’ve been hunting for.” Long sought, the planet was also one that almost got away. MEarth’s array of telescopes in the Southern Hemisphere, located at the Cerro Tololo Inter-American Observatory in Chile, first picked up tentative signs of LHS 1140 b’s transit in September 2014. MEarth team member and lead study author Jason Dittmann, then a graduate student at Harvard University, spearheaded the effort to confirm and study the potential planet. The case for the planet slowly grew over the next two years, as the MEarth team enlisted help from a second group of astronomers operating the European Southern Observatory’s HARPS instrument in Chile—the world’s premier planet-hunting spectrograph.
Rather than look for transits, HARPS finds planets by the periodic gravitational wobbles they impose on their stars. This slow, painstaking technique allows a planet’s mass to be estimated. “MEarth detected a transit event, but only one, and it was low signal to noise so they were not completely sure it was real,” says study co-author Xavier Bonfils, an astronomer at the University of Geneva who helms the HARPS survey of red dwarf stars. “But they have never passed us a false positive, so we considered this a quite reliable candidate and began an intensive observing campaign.” By combining HARPS and MEarth observations, the teams eventually predicted a transit for the putative planet would be viewable from facilities in Hawaii and Australia on September 1, 2016. But on the appointed night, poor weather prevented five of the six telescopes from observing the star. Only one observer, amateur astronomer and study co-author Thiam-Guan Tan, successfully watched the transit using a small telescope in the suburbs of Perth, Australia. That night, Tan sent the MEarth team a terse e-mail reporting his success: “Transit egress seen at ~HJD +7633.12. Depth about 5 mmag.” That is, Tan had recorded LHS 1140 dimming by just half of 1 percent from a transiting planet—equivalent, he says, to “observing the dimming of light caused by a grain of sand moving in front of a candle placed 400 kilometers away.” With the planet’s orbital period in hand, subsequent observations with MEarth and HARPS quickly firmed up estimates for its size and mass, revealing it to be a giant, rocky and very noteworthy world. One could be forgiven for thinking planet hunters are somehow confused. With every passing month a new prime candidate for “Earth 2.0” seems to emerge. But not all potentially habitable worlds are equally promising for follow-up study. 
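Tan's terse "depth about 5 mmag" report can be converted into the "half of 1 percent" dimming quoted above with the standard magnitude-to-flux relation; the radius ratio that follows is the textbook transit approximation (ignoring limb darkening), not the team's published fit.

```python
import math

delta_m = 0.005                        # 5 millimagnitudes, in magnitudes
depth = 1 - 10 ** (-delta_m / 2.5)     # fractional loss of starlight during transit
radius_ratio = math.sqrt(depth)        # R_planet / R_star, since depth ~ (Rp/Rs)^2

print(f"dimming: {depth:.4%}")         # about half of one percent, as quoted
print(f"planet/star radius ratio: {radius_ratio:.3f}")
```

That a backyard-scale telescope can register so small a dip is exactly why Tan's "grain of sand in front of a candle" comparison is apt.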
For example, since its launch in 2009 NASA’s Kepler space telescope has discovered about a dozen potentially habitable worlds transiting other stars in our galaxy. Yet Kepler’s finds are thousands of light-years away—too far to be scrutinized for more nuanced signs of habitability and life. Conversely, last year astronomers discovered a potentially habitable Earth-size planet, Proxima b, around the sun’s nearest neighboring star—the red dwarf Proxima Centauri, scarcely more than four light-years away. But like most other known nearby worlds, Proxima b does not appear to transit, meaning deeper studies may be delayed for years as astronomers develop the technology to actually snap its picture. Earlier this year, planet hunters hit pay dirt with a system of at least seven Earth-size planets transiting another red dwarf, TRAPPIST-1, which like LHS 1140 is about 40 light-years away. Researchers carefully studied each transiting planet’s shadow to determine its size, and even managed to estimate some of their weights by watching how the orbiting planets tugged on one another to subtly alter the timing of their transits. These studies, however, yielded mixed results—the worlds of TRAPPIST-1 could be rocky, Charbonneau says, or they could be drowned or smothered beneath thick layers of water, ice or gas. Even so, because they do transit, astronomers using NASA’s upcoming infrared James Webb Space Telescope or under-construction ground-based telescopes with 30-meter mirrors will be able to learn much more about the planets of TRAPPIST-1 by studying the makeup of their atmospheres. But although TRAPPIST-1 is the same distance from Earth as LHS 1140, it is a much smaller and dimmer “ultracool” red dwarf—as small and dim as a star can be, in fact, while still qualifying as a star. The meager trickle of light it shines toward Earth is insufficient to support a robust search for atmospheric oxygen. 
Even if TRAPPIST-1 were bright enough to allow its planets to be studied for signs of oxygen, the star presents other problems for life-seeking astronomers. Like all red dwarfs, it experienced a tempestuous youth during which it shined far brighter as it slowly contracted to its current size. This formative period lasted for perhaps a billion years, and may well have left its retinue of worlds scorched and airless—or wreathed in a crushing, arid atmosphere of almost pure carbon dioxide, due to a Venus-style runaway greenhouse effect. Even today the star is highly active, bathing its planets in atmosphere-eroding x-ray and ultraviolet radiation. LHS 1140, by contrast, is thought to have had a much briefer formative phase of just 40 million years, and is now a relatively quiescent star. “That’s the big question now: ‘Which planet is going to retain its atmosphere against stellar heating and erosion?’” Bonfils says. “And the chance seems higher around a quiet star like LHS 1140.” The great bulk of LHS 1140 b, its discoverers say, could offer additional advantages. The planet’s hefty gravitational field may have allowed it to retain more of its air against stellar insults. And even if it did lose its primordial atmosphere or suffer a runaway greenhouse effect during its star’s initial 40 million years of planet-scorching brightness, back then its crust and mantle were likely still molten, forming a planetary magma ocean that could act as a reservoir for volatile gases. As the magma cooled, it could release those gases to replenish the planet’s atmosphere and inventory of water. Studying both planetary systems together, Dittmann says, could yield crucial insights about how potentially habitable worlds can keep—or lose—their atmospheres around red dwarf stars. “Between TRAPPIST-1 and LHS 1140 b we have the opportunity to compare a planet bathed in intense radiation by an active ultracool dwarf star with one around a much calmer, steadier star,” he explains. 
“That will let us ask—and answer—some fun questions.” In the meantime, he says, the MEarth team’s plans for LHS 1140 “are incredibly simple: We’re going to hit this system with everything we’ve got.” Already, the team is hammering away at the system with additional observations, bombarding the star with HARPS measurements practically every night for several months in hopes of pinning down the planet’s true mass and learning whether other worlds lurk hidden in the system. Observations with NASA’s Hubble Space Telescope are measuring how much ultraviolet light from the star falls on the planet to better understand its prospects for life. Additional, yet-to-be-approved observations with Hubble and another space-based NASA telescope, the Chandra X-Ray Observatory, could reveal just how much high-energy radiation the world receives, further clarifying its capacity to support life. This fall the team hopes to take over most of the world-class telescopes in Chile for one night, monitoring a transit of the planet on October 26 with the twin 6.5-meter Magellan telescopes as well as three or four of the eight-meter observatories that make up the European Southern Observatory’s Very Large Telescope complex. These observations will seek to detect the planet’s atmosphere—or at least to confirm that it lacks a thick, biosphere-stifling envelope of gas. But the best information will come later this decade and early in the next with the launch of NASA’s Webb telescope in 2018 and the debut of ground-based 30-meter extremely large telescopes in the 2020s. Operating in the infrared part of the spectrum, Webb could search for signs of carbon dioxide, water vapor, methane and other gases in LHS 1140 b’s atmosphere. A ground-based facility such as the under-construction Giant Magellan Telescope (GMT) could look for atmospheric oxygen in visible light reflected from the planet. 
Combining data from Webb and the GMT, Charbonneau says, could allow astronomers to distinguish between potentially biological sources of oxygen—such as photosynthetic organisms—and abiotic production routes for the gas, which can be generated in enormous amounts by runaway-greenhouse conditions. “The message is that we really need both Webb and something like the GMT,” Charbonneau says. “The GMT could detect oxygen, which would tell us that there really could be life there. But to understand the source of that oxygen you must go and measure other atmospheric molecules, and those will be in the domain of Webb.” Astronomers preparing Webb for launch are already planning observations of the new planet. “Only time will tell, but I would not be surprised if LHS 1140 b becomes one of the most-studied planets by Webb in its entire lifetime,” says René Doyon, an astronomer at the University of Montreal and principal investigator for NIRISS, a Canadian-built instrument for Webb that is optimized for studying planetary atmospheres. Doyon has already allocated some of NIRISS’s precious observing time to study the system, which he calls a “dream target” for Webb. Pondering the prospect of devoting years—decades even—of his scientific career to studying this newfound planet, Dittmann (who has since moved to Massachusetts Institute of Technology, where he is a postdoctoral fellow) occasionally wonders whether the investment will pan out. Red dwarfs and super-Earths are respectively the most abundant stars and planets in the galaxy, and when they come together to form a transiting system they offer astronomers a bonanza of observational possibilities with current or near-future technology. But they are also profoundly alien, presenting myriad unique challenges to observers hoping to understand them and their prospects for life.
Studies of more familiar territory—smaller planets around scarcer, larger stars like our sun—are at present far more difficult, with breakthrough results perhaps still decades away. “We’re being pushed to [red dwarfs] because of their abundance and our available technology. But you know, we go outside everyday and there’s a nice yellow star up there, shining for us,” Dittmann muses. “It is kind of strange, to wonder why we don’t instead orbit one of the most common star types in the universe—and maybe it’s because they’re not so great for life. It’s on the back of everyone’s mind—certainly mine. Then again, maybe life would have no problem around these stars. What’s important is that we’re now at the point where we’re finding and studying these planets, like LHS 1140 b and those of TRAPPIST-1—and more that will come—so that we can confront all these hypotheticals with actual data. So this is where we’re going. In 10 years I may eat my words, but in 10 years I’ll also be eating lots of telescope time.”
News Article | April 26, 2017
Six years ago, a chimpanzee had the bright idea to use moss to soak up water, then drink from it, and seven others soon learned the trick. Three years later, researchers returned to the site to see if the practice had persisted to become part of the local chimp culture. They now report that the technique has continued to spread, and it’s mostly been learned by relatives of the original moss-spongers. This adds to earlier evidence that family ties are the most important routes for culture to spread in animals. After the first report of chimps using moss as a sponge in Budongo Forest, Uganda, researchers rarely saw the behaviour again, and wondered whether chimps still knew how to do it. So they set up an experiment, providing moss and leaves at the clay pit where the chimps had demonstrated the technique before. Then they watched to see whether chimpanzees would use leaves – a more common behaviour – or moss to soak up the mineral-rich water from the pit. Most of the original moss-spongers used moss again during the experiment, and so did another 17 chimps, showing the practice had become more widespread. The researchers wondered what factors influenced which individuals adopted it: were they connected socially, or through families, for instance? This group of chimps has been observed for a long time, so the researchers were able to look through field data to calculate an index of how much time each chimpanzee spent with other individuals. It turned out that this metric wasn’t a good predictor of which chimps would use the moss sponge. Instead, moss-sponging was strongly correlated with having moss-sponging relatives. The chimpanzees didn’t only learn from their parents: it was spread between any family members in either direction. “It’s like the family is the [crucible] where the behaviour is transmitted,” says Thibaud Gruber of the University of Geneva, Switzerland, one of the study authors. 
But there were also individuals who learned the technique from non-family members. “Once a behaviour has been developed and spread to a few individuals, the majority of transmission will appear in the family, but if you hang out with some tool users, you’re still likely to develop a behaviour by social learning,” says Gruber. “This is a wonderful contribution to the study of animal cultures,” says Andrew Whiten at the University of St Andrews, UK. “The accumulated evidence suggests that chimpanzees pass on scores of different traditions across Africa, but being able to see any of them originate and then spread is very much rarer.” One of few previous studies to record new behaviours emerging and spreading in animal populations involved Japanese macaques on Koshima Island in the 1950s. A young female began washing sand off sweet potatoes in a river before eating them, and her peers soon did the same. Since then, the behaviour has spread from mother to offspring. Moss-sponging seems to be following a similar pattern, says Frans de Waal of Emory University in Atlanta, Georgia. “Social closeness is most of the time a bias in social learning, so that individuals learn the best from those they hang out with and whose behaviour interests them,” he says. We learn more readily from those we can identify with, and so do animals, he adds. The origins of human culture may lie in the sharing of useful behaviours this way, says Whiten. “What has been revealed in recent studies of cultural practices in all the great apes – chimpanzees, gorillas and orangutans – means it would be surprising if humans’ ape ancestors did not show similar behaviour, the foundations of the rich human cultures that have evolved in more recent times.” However, some researchers think moss-sponging chimpanzees and potato-washing macaques aren’t learning by imitation at all, and each one invents the behaviour by itself. 
“Chimpanzees fail to imitate in controlled experiments, and moss sponging does indeed occur in naive individuals,” says Claudio Tennie at the University of Tübingen, Germany. “Neither this nor the potato washing study – or indeed any other study – shows similar cultures in chimpanzees to our own.” Gruber takes a different point of view. “Chimps are able, to a certain extent, to imitate, although it may not be as fine-grained as in humans,” he says.
News Article | April 17, 2017
From the clown fish to leopards, skin colour patterns in animals arise from microscopic interactions among coloured cells that obey equations discovered by the mathematician Alan Turing. Today, researchers at the University of Geneva (UNIGE), Switzerland, and SIB Swiss Institute of Bioinformatics report in the journal Nature that a southwestern European lizard slowly acquires its intricate adult skin colour by changing the colour of individual skin scales using an esoteric computational system invented in 1948 by another mathematician: John von Neumann. The Swiss team shows that the 3D geometry of the lizard's skin scales causes the Turing mechanism to transform into the von Neumann computing system, allowing biology-driven research to link, for the first time, the work of these two mathematical giants. A multidisciplinary team of biologists, physicists and computer scientists led by Michel Milinkovitch, professor at the Department of Genetics and Evolution of the UNIGE Faculty of Science, Switzerland and Group Leader at the SIB Swiss Institute of Bioinformatics, realised that the brown juvenile ocellated lizard (Timon lepidus) gradually transforms its skin colour as it ages to reach an intricate adult labyrinthine pattern where each scale is either green or black. This observation is at odds with the mechanism, discovered in 1952 by the mathematician Alan Turing, that involves microscopic interactions among coloured cells. To understand why the pattern forms at the level of scales, rather than at the level of biological cells, two PhD students, Liana Manukyan and Sophie Montandon, followed individual lizards for four years of their development, from hatchlings crawling out of the egg to fully mature animals. At multiple time points, they reconstructed the geometry and colour of the network of scales by using a very high-resolution robotic system developed previously in the Milinkovitch laboratory.
The researchers were then surprised to see the brown juvenile scales change to green or black, then continue flipping colour (between green and black) during the life of the animal. This very strange observation prompted Milinkovitch to suggest that the skin scale network forms a so-called 'cellular automaton'. This esoteric computing system was invented in 1948 by the mathematician John von Neumann. Cellular automata are lattices of elements in which each element changes its state (here, its colour, green or black) depending on the states of neighbouring elements. The elements are called cells but are not meant to represent biological cells; in the case of the lizards, they correspond to individual skin scales. These abstract automata have been extensively used to model natural phenomena, but the UNIGE team discovered what seems to be the first case of a genuine 2D automaton appearing in a living organism. Analyses of the four years of colour change allowed the Swiss researchers to confirm Milinkovitch's hypothesis: the scales were indeed flipping colour depending on the colours of their neighbouring scales. Computer simulations implementing the discovered mathematical rule generated colour patterns that could not be distinguished from the patterns of real lizards. How could the interactions among pigment cells, described by Turing equations, generate a von Neumann automaton exactly superimposed on the skin scales? The skin of a lizard is not flat: it is very thin between scales and much thicker at the centre of each scale. Given that Turing's mechanism involves movements of cells, or the diffusion of signals produced by cells, Milinkovitch understood that this variation in skin thickness could affect Turing's mechanism.
The researchers then performed computer simulations including skin thickness and saw a cellular automaton behaviour emerge, demonstrating that a cellular automaton as a computational system is not just an abstract concept developed by John von Neumann, but also corresponds to a natural process generated by biological evolution. However, the automaton behaviour was imperfect, as the mathematics behind Turing's mechanism and the von Neumann automaton are very different. Milinkovitch called in the mathematician Stanislav Smirnov, Professor at the UNIGE, who was awarded the Fields Medal in 2010. Before long, Smirnov derived a so-called discretisation of Turing's equations that would constitute a formal link with von Neumann's automaton. Anamarija Fofonjka, a third PhD student in Milinkovitch's team, implemented Smirnov's new equations in computer simulations, obtaining a system that had become indistinguishable from a von Neumann automaton. The highly multidisciplinary team of researchers had closed the loop in this amazing journey, from biology to physics to mathematics ... and back to biology.
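A cellular automaton of the kind described, a lattice of scales that each flip colour based on their neighbours' colours, can be sketched in a few lines. The square lattice and deterministic majority-flip rule below are illustrative stand-ins only; the probabilistic rule actually fitted to the lizard data in the Nature paper is different.

```python
import random

def step(grid):
    """One synchronous update of a toy two-colour cellular automaton.

    Each cell (0 = green, 1 = black) looks at its four lattice
    neighbours (with wrap-around edges) and flips colour when a
    majority of them differ from it.
    """
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            nbrs = [grid[(i - 1) % n][j], grid[(i + 1) % n][j],
                    grid[i][(j - 1) % n], grid[i][(j + 1) % n]]
            disagree = sum(1 for c in nbrs if c != grid[i][j])
            if disagree > 2:          # majority of neighbours disagree
                new[i][j] = 1 - grid[i][j]
    return new

random.seed(0)
grid = [[random.randint(0, 1) for _ in range(12)] for _ in range(12)]
for _ in range(20):                   # iterate, like the scales flipping over years
    grid = step(grid)
```

Even this toy rule quickly organises a random grid into contiguous same-colour patches, which is the qualitative behaviour the team observed in the scale network.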
News Article | April 17, 2017
Aerosols are collections of fine particles, either biological or of other types, in suspension in a gaseous medium. They play a major role in cloud formation and therefore have a strong impact on climate models. They are however extremely hard to study due to the small size and immense variety of their constituent particles. But researchers from the University of Geneva (UNIGE), Switzerland, members of the PlanetSolar Deepwater expedition, have now succeeded in linking the composition of marine biological aerosols -- and therefore their influence on the climate -- to that of bodies of water under them within the Atlantic Ocean, thereby paving the way to an indirect study of these aerosols through water analysis. This study, which has been published in Scientific Reports, will contribute to making climate models more accurate. Aerosols are fine particles in suspension in the air. Over the oceans, some contain organic or biological ingredients (bacteria, degradation products of microscopic algae) which come from sea spray, others are transported in the air (mineral dust, smoke). They serve as seeds for forming clouds and also reflect light. Their role is extremely important for modelling clouds, and therefore for the climate in general. But due to the small size of the particles and their large quantity, it's difficult to accurately study them. So researchers at the University of Geneva (UNIGE) asked themselves if it would be possible to characterize biological aerosols through the composition of the water whence they come. "To answer this question, we needed two tools," explains Jérôme Kasparian, Professor in the Department of Applied Physics at the UNIGE Science Faculty. "The first is a detector of fluorescence which we designed, called Biobox, and which enables us to analyse aerosol particles one by one. The spectrum gives us information on their composition and distinguishes the organic particles, which are fluorescent, from the other particles. 
Then we needed PlanetSolar." Indeed, this research could only be undertaken over a long period of time without any disturbance of the water and air. Only PlanetSolar, a solar-powered boat that can remain at sea for three months and produces no emissions, could make it possible. During the expedition, scientists carried out analyses of the salinity, temperature, dissolved oxygen and the microalgae contained in the various bodies of water in the Atlantic, and then compared this data with that obtained by the Biobox. "And we found that they matched!" exclaims Jérôme Kasparian. The physicists discovered that biological aerosols are related to the temperature and salinity of the sea. Based on these criteria, the ocean's water forms large bodies that don't intermix, which allows them to be differentiated. Thus, when the characteristics of a water mass were favourable for the reproduction of microalgae, the researchers noticed that after a certain amount of time, the aerosols detected above this same water mass contained more biological particles. The biological fraction of aerosols is therefore linked to the history of biological activity in bodies of water close to the surface. "Provided that this is also valid in oceans and seas other than the Atlantic, our research location, our results would allow us to estimate biological aerosols by directly studying the bodies of water, which would simplify aerosol characterization and make climate models more accurate," adds Kasparian. Difficult to study directly, aerosols are now being studied via the sea, which, unlike aerosols, can easily be analysed by satellites.
News Article | April 17, 2017
Our closest evolutionary relatives are quite the mind readers. And they can use that knowledge to help people figure things out when they are labouring under a misapprehension, according to the latest research. The ability to attribute mental states to others, aka theory of mind, is sometimes considered unique to humans, but evidence is mounting that other animals have some capacity for it. In a study last year, chimps, bonobos and orangutans watched videos of people behaving in different scenarios as cameras tracked their eye movements. The experiment found that the apes looked where an actor in the video would expect to see an object, rather than towards its true location, suggesting the animals were aware others could hold false beliefs. But that experiment left open the possibility apes were simply predicting that the actor would go to the last place he’d seen the object, without understanding that he held a false belief. Now, David Buttelmann at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and colleagues tested 34 zoo chimpanzees, bonobos and orangutans, in search of more conclusive evidence. In their test, person A places an object into one of two boxes, then either remains in the room or leaves. Person B removes it, places it in the other box and locks both boxes. Then A tries to open the box where they left the object. The apes know how to unlock the boxes and can decide to open either one. When A remained in the room, the apes were equally likely to unlock either box. But when A wasn’t there for the switch, the apes unlocked the box containing the object in 77 per cent of trials. This shows apes can recognise when A is acting under a false belief, the researchers argue. The apes guess that the person is trying to find the object, and help them by opening the right box. 
When A knows which box the object is in and tries to open the other box, the team's reasoning goes, it's not clear why they are doing this, so the apes don't respond in a consistent way. Their performance in this test closely matches the behaviour of a 16-month-old baby. In a second test, A gives the object to B, then leaves the room while B puts the object in one of the boxes. In this case, rather than having a false belief, A doesn't actually know where the object is. The apes chose to unlock each box equally often, perhaps, the researchers say, because it was less clear what the person's intention was. Because the apes behave differently in each of the two scenarios, it shows they have some mental representation of what the other person believes, says Buttelmann, rather than just thinking that person doesn't know where the object is. The results show apes apply their understanding of others' beliefs when deciding how to behave in social interactions, he says. Many other studies have found that great apes understand other mental states such as goals, intentions and desires. "The fact that we now have two studies that show evidence of belief understanding in great apes shows that we are not that different," says Buttelmann. "Whether this belief understanding is as fully fledged as it is in humans is a different question." Thibaud Gruber, from the University of Geneva, Switzerland, says the new study is a great improvement on the previous work because it tests active responses rather than tracking the apes' gaze. "The results suggest that, similar to 16-month-old infants, all great apes are able to use their mind reading skills to help others," he says.
“It’s particularly interesting that they actually use these skills to help the experimenter, while usually apes are seen as the competitive ones, compared to humans the cooperative ones!” Richard Byrne of the University of St Andrews, UK, says observational studies have pointed towards these abilities in non-human great apes for 40 years. “I’m glad to see that another non-verbal theory of mind test has been passed by three species of great ape, but I’m not at all surprised,” he says. But Alia Martin from Victoria University of Wellington, New Zealand, isn’t convinced by the conclusions, given that in two of the test conditions apes still chose boxes randomly, indicating they don’t have a good understanding of the situation. “I’m excited to see researchers look for this amazing ability in apes, but we’re going to need more research to settle the ape theory of mind debate.”
News Article | April 17, 2017
To detect the goings-on inside cells without the need for an external light source, scientists can genetically engineer cells to produce chemiluminescent reporter molecules. A new class of small molecules, however, can penetrate cells and monitor their biological processes by chemiluminescence, avoiding genetic modification. Chemiluminescence, the process that lights up glow sticks, occurs when a chemical reaction generates light. In the lab, researchers use it to monitor reactive oxygen species, diagnose pathogenic infections, and detect the results of chromatography, electrophoresis, immunoassays, nucleic acid assays, and blotting experiments. The new reagents are modified versions of a set of widely used chemiluminescent Schaap’s adamantylidene-dioxetanes, each of which has a characteristic protecting group that reacts when a specific enzyme or reactive compound is present. For example, the protecting groups may be substrates for a particular enzyme. When that enzyme is present, it cleaves the protecting group from the Schaap’s reagent, yielding an unstable phenolate-dioxetane that chemiluminesces. Schaap’s reagents work well in organic solvents but emit light only weakly in water. Three-component systems—each a mixture of a Schaap’s reagent, a surfactant, and an excitable fluorescent dye—shine about 100 times as brightly in water as Schaap’s reagents by themselves, but the mixtures aren’t used in cells because they are toxic. Scientists can also use the firefly substrate-enzyme pair luciferin and luciferase to monitor gene expression and other processes inside cells, but they must first engineer the cells to produce luciferase. Doron Shabat of Tel Aviv University and coworkers at the University of Geneva have now brightened up Schaap’s reagents in a way that permits their use in cells (ACS Cent. Sci. 2017, DOI: 10.1021/acscentsci.7b00058). 
The team adds electron-withdrawing substituents to conjugated positions on the reagents’ phenolate group, creating long π-electron systems that emit more light in water-based media. One modified reagent, with added acrylonitrile and chlorine groups, emits in aqueous solution 1,000 times as much light as a conventional Schaap’s reagent and 10 times as much as a three-component system. It is nearly as bright as luciferin-luciferase, it can simply diffuse into cells, and it doesn’t require genetic engineering. By adding different protecting groups as triggering substrates for various enzymes or reactive compounds, Shabat and coworkers used the new reagents to image β-galactosidase activity in single cells and to detect alkaline phosphatase, glutathione, and hydrogen peroxide in aqueous solution. “This simple and elegant molecular design provides a dramatic enhancement,” comments Alexander R. Lippert of Southern Methodist University. “No excitation light source is needed,” eliminating several problems associated with fluorescence-based cell analysis, including signal fading, toxicity, and background interference, he says. Shabat and coworkers have applied for a patent on the new reagents. The researchers hope to extend the molecules’ light emission range from the visible to the near infrared to improve their ability to penetrate tissue deeply for possible in vivo use.
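The brightness factors quoted here are mutually consistent, which is worth making explicit: if a three-component mixture is about 100 times as bright in water as a plain Schaap's reagent, then a modified reagent 1,000 times as bright as the plain compound is about 10 times as bright as the mixture. A minimal sketch (the factors are the article's rounded figures, not measured data):

```python
# Relative chemiluminescence intensity in water, normalized to an
# unmodified Schaap's dioxetane (rounded factors from the article).
plain = 1.0
three_component = 100 * plain   # Schaap's reagent + surfactant + dye mixture
modified = 1000 * plain         # acrylonitrile/chlorine-substituted reagent

# 1000x over the plain reagent implies ~10x over the three-component mix
print(modified / three_component)  # 10.0
```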
News Article | May 4, 2017
The liver has a remarkable ability to grow to one and a half times its size before shrinking back on a daily basis, scientists have discovered. After the skin, the liver is the largest organ in the human body. It has a huge range of vital functions, from filtering out toxins such as alcohol to making bile. Now scientists have discovered that the size of the liver is closely tied to the workings of the body clock, also known as the circadian rhythm. The results are published in the journal Cell. A study in mice has found that the mass of the whole organ, the size of the individual liver cells and the amount of protein in the liver vary hugely throughout the day. The liver is at its largest and most active when the mice are most active. As mice are nocturnal animals, this means that their livers are bulging by the early hours. "We saw the biggest difference in the night, an up to 45% increase. I would expect that this is true in most mammals, but exactly what the effect will be in humans we don't know yet," study author Ueli Schibler of the University of Geneva told IBTimes UK. The change in mass is thought to help the mice deal with the extra activity – the more food the mouse eats, the more work the liver has to do to keep up. "The liver gets all the bad stuff from the food and has to deal with that," Schibler said. "It's possible that it's not sufficient to have a liver the size of the resting phase at this point." But when mice had their body clock disrupted, the liver lost its ability to grow and shrink. Mice that were fed during the day had a constant liver size, even though they were eating the same amount of food as mice in the ordinary nocturnal rhythm. "This may contribute to the health problems associated with shift work. 
When we eat at the wrong time, the liver doesn't oscillate," Schibler said. Previous studies from the 1980s showed that the human liver also changes size, although the measurements were not at such a fine scale; finer measurements would require invasive methods, Schibler said. "But it would be very interesting to do that, and measure the oscillations of the human liver around the clock."
News Article | May 4, 2017
Biologists from UNIGE have discovered how the liver adapts to the cycles of feeding and fasting, and the alternation of day and night.
In mammals, the liver plays a pivotal role in metabolism and the elimination of toxins, and reaches its maximum efficiency when the animal is active and feeding. Biologists from the University of Geneva (UNIGE), Switzerland, have discovered how this organ adapts to the cycles of feeding and fasting, and the alternation of day and night within 24 hours. The researchers showed in mice that the size of the liver increases by almost half before returning to its initial dimensions, according to the phases of activity and rest. Published in the journal Cell, their study describes the cellular mechanisms of this fluctuation, which disappears when the normal biological rhythm is reversed. The disruption of our circadian clock due to professional constraints or private habits therefore probably has important repercussions on our liver functions. Mammals have adapted to diurnal and nocturnal rhythms using a central clock located in the brain. The latter, which is reset every day by light, synchronizes the subordinate clocks present in most of our cells. In the liver, more than 350 genes involved in metabolism and detoxification are expressed in a circadian fashion, with a biological rhythm of 24 hours. "Many of them are also influenced by the rhythm of food intake and physical activity, and we wanted to understand how the liver adapts to these fluctuations", says Ueli Schibler, professor emeritus at the Department of Molecular Biology of the UNIGE Faculty of Science.
The liver oscillates, but not the other organs
Mice forage and feed at night, while the day is spent resting. 
"In rodents following a usual circadian rhythm, we observed that the liver gradually increases during the active phase to reach a peak of more than 40% at the end of the night, and that it returns to its initial size during the day", notes Flore Sinturel, researcher of the Geneva group and first author of the study. The cellular mechanisms of this adaptation were discovered in collaboration with scientists from the Nestlé Institute of Health Sciences (NIHS) and the University of Lausanne (UNIL) in Switzerland. Researchers have shown that the size of liver cells and their protein content oscillate in a daily manner. The number of ribosomes, the organelles responsible for producing the proteins required for the various functions of the liver, fluctuates together with the size of the cell. "The latter adapts the production and assembly of new ribosomes to ensure a peak of protein production during the night. The components of ribosomes produced in excess are then identified, labeled, and degraded during the resting phase", explains Flore Sinturel. The amplitude of the variations observed by the biologists depends on the cycles of feeding and fasting, as well as diurnal and nocturnal phases. Indeed, the fluctuations disappear when the phases of feeding no longer correspond to the biological clock, which evolved in the course of hundreds of millions of years: "the size of the liver and the hepatocytes, as well as their contents in ribosomes and proteins, remain nearly stable when mice are fed during the day. Yet, these animals ingest similar amounts of food, irrespective of whether they are fed during the night or during the day", points out Frédéric Gachon of the NIHS, who co-directed the study. Many human subjects no longer live according to the rhythm of their circadian clock, due to night work hours, alternating schedules or frequent international travels. 
A previous study (Leung et al., Journal of Hepatology, 1986), which measured the volume of the human liver over six hours using ultrasound, suggests that this organ oscillates in humans as well. If mechanisms similar to those found in mice exist in humans, which is likely to be the case, the deregulation of our biological rhythms would have a considerable influence on hepatic functions.
News Article | May 2, 2017
A project supported by the Swiss National Science Foundation (SNSF) aims to find new materials which can be used in rechargeable batteries and eventually provide alternatives to current lithium batteries. Lithium-based batteries have several drawbacks, such as the limited availability of the raw material itself as well as numerous safety issues, primarily associated with the use of a flammable liquid compound; the problem has been exemplified by recurring reports of exploding mobile phones. The recent research led by Arndt Remhof of the Swiss Federal Laboratories for Materials Science and Technology, Empa, demonstrates the potential of sodium and magnesium in the development of alternative technologies based exclusively on solid elements. His team has produced experimental battery components based on these metals. The Swiss researchers have developed solid-state battery cells using a solid compound, as opposed to cells based on a liquid electrolyte; this design poses a significant technical problem: ions, whether lithium, sodium or magnesium, must be allowed to move through a solid medium. By moving from one pole to the other inside the battery, ions (positive charge) facilitate the displacement of electrons (negative charge) and thus the discharge of an electrical current through an external circuit. To facilitate the displacement of ions, the researchers developed solid electrolytes with a crystalline structure. In substituting lithium with sodium or magnesium, Arndt Remhof's team had to completely overhaul the crystalline architecture and use new components and manufacturing processes. "I always like to compare our job to that of a football coach", says Arndt Remhof. "You can bring the best elements together, but if you don't optimise the settings you won't achieve good results!" Arndt Remhof's team has developed a solid electrolyte that facilitates good mobility of sodium ions at 20 degrees Celsius. 
This last point is crucial: ions require a source of heat in order to move, and inducing a reaction at room temperature poses a technical challenge. The electrolyte is also non-flammable and is chemically stable up to 300 degrees Celsius, which addresses the various safety concerns associated with lithium-ion batteries. Hans Hagemann's team at the University of Geneva has been working in parallel to develop cheaper technology for the production of this new solid electrolyte. Unlike lithium, there are huge reserves of sodium: it is one of the two components of table salt. "Availability is our key argument", says Léo Duchêne of Empa and first author of the research paper. "However, it stores less energy than the equivalent mass of lithium and thus could prove to be a good solution if the size of the battery isn't a factor for its application." The same team has also developed a solid magnesium-based electrolyte. Until now, very little research had been done in this field. The fact that it is much more difficult to set this element in motion doesn't mean that it is any less attractive: it's available in abundance, it's light, and there's no risk of it exploding. More importantly, a magnesium ion carries two positive charges, whereas a lithium ion carries only one. Essentially, this means that it stores almost twice as much energy in the same volume. Some experimental electrolytes have already been used to stimulate magnesium ions to move, but at temperatures in excess of 400 degrees Celsius. The electrolytes used by the Swiss scientists have already recorded similar conductivities at 70 degrees Celsius. "This is pioneering research and a proof of concept," says Elsa Roedern of Empa, who led the experiments. "We are still a long way from having a complete and functional prototype, but we have taken the first important step towards achieving our goal." 
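The "almost twice as much energy in the same volume" claim follows from the ionic charge and the metals' molar volumes. A back-of-the-envelope sketch, using textbook values for molar mass and density (these numbers are not from the project itself):

```python
# Volumetric charge capacity of a metal anode, Q = z * F / (3.6 * M / rho),
# where z = ionic charge, F = Faraday constant (C/mol),
# M = molar mass (g/mol), rho = density (g/cm^3), and 1 mAh = 3.6 C.
F = 96485.0  # Faraday constant, C/mol

def volumetric_capacity_mah_per_cm3(z, molar_mass, density):
    molar_volume = molar_mass / density   # cm^3/mol
    return z * F / (3.6 * molar_volume)   # mAh/cm^3

li = volumetric_capacity_mah_per_cm3(1, 6.94, 0.534)    # lithium, ~2060 mAh/cm^3
mg = volumetric_capacity_mah_per_cm3(2, 24.305, 1.738)  # magnesium, ~3830 mAh/cm^3
print(round(mg / li, 2))  # ~1.86: "almost twice" the charge per unit volume
```

Per unit mass the comparison goes the other way (lithium is far lighter per charge carried), which is why the article frames magnesium's advantage in terms of volume.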
The Novel Ionic Conductors project brings together researchers from Empa, the University of Geneva, the Paul Scherrer Institute and the Henryk Niewodniczanski Institute for Nuclear Physics in Poland. It has been funded by the Swiss National Science Foundation since 2015 as part of the Sinergia programme, which supports collaborative and interdisciplinary research. "What we have managed to achieve in less than two years is quite extraordinary!" says Arndt Remhof.
More information:
L. Duchêne et al. A highly stable sodium solid-state electrolyte based on a dodeca/deca-borate equimolar mixture, Chem. Commun. (2017). DOI: 10.1039/C7CC00794A
R. Moury et al. An alternative approach to the synthesis of NaB3H8 and Na2B12H12 for solid electrolyte applications, International Journal of Hydrogen Energy (2017). DOI: 10.1016/j.ijhydene.2017.02.044
Elsa Roedern et al. Magnesium Ethylenediamine Borohydride as Solid-State Electrolyte for Magnesium Batteries, Scientific Reports (2017). DOI: 10.1038/srep46189
News Article | April 19, 2017
Diatoms between 0.01 and 0.02 mm, consisting of a single cell surrounded by an artificially colored silica skeleton. The alga in green is present in clean environments, while the orange one lives in more polluted water. Credit: Laure Apothéloz-Perret-Gentil, UNIGE
Diatoms are unicellular algae particularly sensitive to changes that affect their aquatic environment. This is why they are used as bioindicators for the biological monitoring of water quality. However, their microscopic identification in river samples requires a lot of time and skill. Biologists from the University of Geneva (UNIGE), Switzerland, have succeeded in establishing a water quality index based solely on the DNA sequences of the diatoms present in the samples, without needing to identify each species visually. This study, published in the journal Molecular Ecology Resources, presents a revolutionary tool to process a very large number of samples in parallel, allowing wide coverage of the monitored sites in a reduced time and at a lower cost. The degree of pollution of rivers resulting from human activities is assessed using different biotic indices, which reflect the ecological status of a river based on the quantity and diversity of organisms selected as bioindicators, due to their ecological preferences and tolerance to pollution. This is the case for diatoms, algae consisting of a single cell surrounded by a silica skeleton, recommended by the European Union and Switzerland as one of the ideal bioindicators for rivers and lakes. The quality of rivers is determined using the Swiss diatom index (DI-CH), whose value defines the ecological status. "The morphological identification of the different species present in each sample, however, no longer meets the needs of rapid and reliable bioassessment measures introduced to protect aquatic environments. 
This is why we have tried to develop a new method," says Jan Pawlowski, professor at the Department of Genetics and Evolution of the UNIGE Faculty of Science. In collaboration with the Geneva Water Ecology Service (SECOE) and the PhycoEco environmental office in La Chaux-de-Fonds, Switzerland, the researchers analyzed around 90 samples taken from different rivers in Switzerland and determined their ecological status using the DI-CH. They thus established a reference system with which to validate the molecular index under development. The latter is based on the DNA sequences characteristic of all the diatom species which may be present in these samples. "The whole range of DNA sequences revealed in each sample corresponds to a specific DI-CH quality index. Furthermore, each sequence identified has a different distribution and is detected in variable amounts from one sample to another. By integrating all these data, we were able to calculate an ecological value for each sequence, without having to identify the species to which it belongs," explains Laure Apothéloz-Perret-Gentil, a member of the Geneva group and the first author of the study. This approach makes it possible to determine the quality of water using all of these ecological values. "Our assessment was correct for almost 80 percent of the samples, which is very encouraging. Increasing the number and diversity of samples will allow us to calibrate our method for future routine, large-scale analyses," indicates Jan Pawlowski. The synchronous processing of a large number of samples in record time and at a reduced cost is not the only advantage of this new tool. The molecular index developed by the biologists from UNIGE could easily be adapted to other groups of unicellular bioindicators: a major asset for monitoring various types of aquatic ecosystems.
More information: Laure Apothéloz-Perret-Gentil et al. Taxonomy-free molecular diatom index for high-throughput eDNA biomonitoring, Molecular Ecology Resources (2017). DOI: 10.1111/1755-0998.12668
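The core idea, assigning each DNA sequence an ecological value and scoring a sample from the abundances of the sequences it contains, can be sketched as an abundance-weighted average, mirroring how classical diatom indices weight species indicator values. This is a hypothetical simplification; the sequence IDs and values below are made up, and the actual calibration in the paper is more involved:

```python
# Hypothetical sketch of a taxonomy-free molecular index: each sequence
# variant carries a calibrated ecological value, and a sample's score is
# the read-abundance-weighted mean of those values.

def molecular_index(read_counts, ecological_values):
    """read_counts: {sequence_id: reads in the sample};
    ecological_values: {sequence_id: calibrated ecological value}."""
    shared = [s for s in read_counts if s in ecological_values]
    total = sum(read_counts[s] for s in shared)
    if total == 0:
        return None  # no informative sequences found in this sample
    return sum(read_counts[s] * ecological_values[s] for s in shared) / total

# Illustrative data only (not from the study):
sample = {"seq1": 120, "seq2": 30, "seq3": 50}
values = {"seq1": 2.0, "seq2": 6.0, "seq3": 4.0}
print(molecular_index(sample, values))  # 3.1
```

Sequences never seen during calibration simply drop out of the score, which is one reason such an index needs a large, diverse reference set of samples.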
News Article | May 8, 2017
Article: Diurnal Oscillations in Liver Mass and Cell Size Accompany Ribosome Assembly Cycles, Ueli Schibler et al., Cell, doi: 10.1016/j.cell.2017.04.015, published 4 May 2017.
Cazalilla M.A.,Donostia International Physics Center |
Citro R.,University of Salerno |
Giamarchi T.,University of Geneva |
Orignac E.,French National Center for Scientific Research |
Rigol M.,Georgetown University
Reviews of Modern Physics | Year: 2011
The physics of one-dimensional interacting bosonic systems is reviewed. Beginning with results from exactly solvable models and computational approaches, the concept of bosonic Tomonaga-Luttinger liquids relevant for one-dimensional Bose fluids is introduced, and compared with Bose-Einstein condensates existing in dimensions higher than one. The effects of various perturbations on the Tomonaga-Luttinger liquid state are discussed as well as extensions to multicomponent and out of equilibrium situations. Finally, the experimental systems that can be described in terms of models of interacting bosons in one dimension are discussed. © 2011 American Physical Society.
Czekalski N.,Eawag - Swiss Federal Institute of Aquatic Science and Technology |
Gascon Diez E.,University of Geneva |
Burgmann H.,Eawag - Swiss Federal Institute of Aquatic Science and Technology
ISME Journal | Year: 2014
Antibiotic-resistance genes (ARGs) are currently discussed as emerging environmental contaminants. Hospital and municipal sewage are important sources of ARGs for the receiving freshwater bodies. We investigated the spatial distribution of different ARGs (sul1, sul2, tet(B), tet(M), tet(W) and qnrA) in freshwater lake sediments in the vicinity of a point source of treated wastewater. ARG contamination of Vidy Bay, Lake Geneva, Switzerland was quantified using real-time PCR and compared with total mercury (THg), a frequently particle-bound inorganic contaminant with known natural background levels. Two-dimensional mapping of the investigated contaminants in lake sediments with geostatistical tools revealed that the total and relative abundances of ARGs in close proximity to the sewage discharge point were up to 200-fold above levels measured at a remote reference site (center of the lake) and decreased exponentially with distance. Similar trends were observed in the spatial distribution of different ARGs, whereas distributions of ARGs and THg were only moderately correlated, indicating differences in the transport and fate of these pollutants or additional sources of ARG contamination. The spatial pattern of ARG contamination and supporting data suggest that deposition of particle-associated wastewater bacteria rather than co-selection by, for example, heavy metals was the main cause of sediment ARG contamination. © 2014 International Society for Microbial Ecology. All rights reserved.
Romano A.,Allergy Unit |
Romano A.,Instituto Of Ricovero E Cura A Carattere Scientifico Oasi Maria Ss |
Caubet J.-C.,University of Geneva
Journal of Allergy and Clinical Immunology: In Practice | Year: 2014
Hypersensitivity reactions to β-lactam and non-β-lactam antibiotics are commonly reported. They can be classified as immediate or nonimmediate according to the time interval between the last drug administration and their onset. Immediate reactions occur within 1 hour after the last drug administration and are manifested clinically by urticaria and/or angioedema, rhinitis, bronchospasm, and anaphylactic shock; they may be mediated by specific IgE-antibodies. Nonimmediate reactions occur more than 1 hour after the last drug administration. The most common manifestations are maculopapular exanthems; specific T lymphocytes may be involved in this type of manifestation. The diagnostic evaluation of hypersensitivity reactions to antibiotics is usually complex. The patient's history is fundamental; the allergic examination is based mainly on in vivo tests selected on the basis of the clinical features and the type of reaction, immediate or nonimmediate. Immediate reactions can be assessed by immediate-reading skin tests and, in selected cases, drug provocation tests. Nonimmediate reactions can be assessed by delayed-reading skin tests, patch tests, and drug provocation tests. However, skin tests have been well validated mainly for β-lactams but less for other classes of antibiotics. © 2014 American Academy of Allergy, Asthma & Immunology.
Luthi A.,Friedrich Miescher Institute for Biomedical Research |
Luscher C.,University of Geneva
Nature Neuroscience | Year: 2014
Current models of addiction and anxiety stem from the idea that aberrant function and remodeling of neural circuits cause the pathological behaviors. According to this hypothesis, a disease-defining experience (for example, drug reward or stress) would trigger specific forms of synaptic plasticity, which in susceptible subjects would become persistent and lead to the disease. While the notion of synaptic diseases has received much attention, no candidate disorder has been sufficiently investigated to yield new, rational therapies that could be tested in the clinic. Here we review the arguments in favor of abnormal neuronal plasticity underlying addiction and anxiety disorders, with a focus on the functional diversity of neurons that make up the circuits involved. We argue that future research must strive to obtain a comprehensive description of the relevant functional anatomy. This will allow identification of molecular mechanisms that govern the induction and expression of disease-relevant plasticity in identified neurons. To establish causality, one will have to test whether normalization of function can reverse pathological behavior. With these elements in hand, it will be possible to propose blueprints for manipulations to be tested in translational studies. The challenge is daunting, but new techniques, above all optogenetics, may enable decisive advances. © 2014 Nature America, Inc. All rights reserved.
Enriquez-Garcia A.,University of Geneva |
Kundig E.P.,University of Geneva
Chemical Society Reviews | Year: 2012
Complementary to enzymatic methods, catalytic enantioselective desymmetrisation of meso-diols (EDMD) by small molecule catalysts has emerged as a powerful tool that provides highly enantioenriched materials of considerable value in organic synthesis. This review traces the evolution of easily accessible catalysts used in the EDMD and compares their performance with the existing enzymatic methods. This journal is © The Royal Society of Chemistry 2012.
Clavien P.-A.,University of Zürich |
Lesurtel M.,University of Zürich |
Gores G.J.,Mayo Medical School |
Langer B.,University of Toronto |
Perrier A.,University of Geneva
The Lancet Oncology | Year: 2012
Although liver transplantation is a widely accepted treatment for hepatocellular carcinoma (HCC), much controversy remains and there is no generally accepted set of guidelines. An international consensus conference was held on Dec 2-4, 2010, in Zurich, Switzerland, with the aim of reviewing current practice regarding liver transplantation in patients with HCC and to develop internationally accepted statements and guidelines. The format of the conference was based on the Danish model. 19 working groups of experts prepared evidence-based reviews according to the Oxford classification, and drafted recommendations answering 19 specific questions. An independent jury of nine members was appointed to review these submissions and make final recommendations, after debates with the experts and audience at the conference. This report presents the final 37 statements and recommendations, covering assessment of candidates for liver transplantation, criteria for listing in cirrhotic and non-cirrhotic patients, role of tumour downstaging, management of patients on the waiting list, role of living donation, and post-transplant management. © 2012 Elsevier Ltd.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 3.98M | Year: 2012
Currently, 180 million people suffer from diabetes worldwide and this number is expected to double by 2030. Diabetes-related healthcare costs may rise to 40% of the total healthcare budget in high-incidence countries. Despite these daunting numbers, our knowledge about the pathophysiology of T1D and T2D remains limited, and many questions about the relation between beta cell mass, beta cell function and the metabolism of different tissues remain unanswered. In order to address these urgent questions, great hope has been put on the development of novel tracers, and functional and molecular imaging methods, which only recently have become available for in vivo diabetes imaging. However, it remains difficult to build up top-level expertise, as few, if any, European institutions are able to offer a profound combined molecular imaging/diabetes training, a shortcoming that continues to hamper the progress of the field. As a consequence, most available molecular imaging techniques are insufficiently characterised for clinical use in diabetes. To address this challenge, we propose a training network (BetaTrain) to connect academic/private sector partners from 5 leading European FP7 consortia with top-level expertise in beta cell/diabetes imaging. In this way, BetaTrain will not only provide a unique multidisciplinary intersectoral training opportunity to young scientists in the field, but will also address the urgent challenges in our combat against diabetes. In order to non-invasively characterize beta cells and other relevant tissues in animal models and humans suffering from diabetes, it will be necessary to combine different molecular imaging techniques to provide information complementary to that obtained by other imaging, laboratory, and functional tests. The scientific training program of BetaTrain will therefore characterise, cross-calibrate and map these technologies/tracers in order to create the basis for personalised diagnosis and therapy in diabetes.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2010.2.1.2-1 | Award Amount: 16.62M | Year: 2011
Colorectal cancer (CRC) is one of the most common cancers in both males and females, and it is perhaps the best understood of all epithelial tumors in terms of its molecular origin. Yet, despite the large amount of work that has concentrated on understanding colon tumorigenesis, we still do not know the full complement of molecular lesions that are individually necessary and together sufficient to cause colorectal cancer. Neither do we understand why some specific mutations that are relatively rare in other tumors (e.g. loss of the APC tumor suppressor) are extremely common in colorectal cancer. We propose here to use the tools of systems biology to develop a quantitative and comprehensive model of colorectal tumorigenesis. The model will include a wiring diagram that identifies cell-type specific and oncogenic pathways that contribute to colon tumorigenesis, and explains in molecular detail how a genotype of an individual CRC leads to activation of downstream genes that drive uncontrolled cell growth. This model will subsequently be used to find novel therapeutic targets, to guide genetic screening to identify individuals with elevated risk for developing CRC, and to classify patients into molecular subgroups to select the treatment combination which is optimal for each patient (personalized medicine). The specific objectives of the SYSCOL project are: 1. Identify genetic markers for individual risk using genotyping and sequencing of germline DNA from sporadic and familial CRC cases and controls 2. Identify genes and regulatory elements that contribute to colorectal cancer cell growth 3. Use data from Aims 1-2 to develop a quantitative model for colorectal tumorigenesis 4. Apply the model for identification of high-risk individuals, for molecular classification of the disease, and for identification of novel molecular treatment targets
Agency: European Commission | Branch: FP7 | Program: CPCSA | Phase: ICT-2013.9.9 | Award Amount: 74.61M | Year: 2013
This Flagship aims to take graphene and related layered materials from a state of raw potential to a point where they can revolutionize multiple industries, from flexible, wearable and transparent electronics to new energy applications and novel functional composites.

Our main scientific and technological objectives in the different tiers of the value chain are to develop material technologies for ICT and beyond, identify new device concepts enabled by graphene and other layered materials, and integrate them into systems that provide new functionalities and open new application areas.

These objectives are supported by operative targets: to bring together a large core consortium of European academic and industrial partners and to create a highly effective technology transfer highway, allowing industry to rapidly absorb and exploit new discoveries.

The Flagship will be aligned with European and national priorities to guarantee its successful long-term operation and maximal impact on the national industrial and research communities.

Together, the scientific and technological objectives and operative targets will allow us to reach our societal goals: the Flagship will contribute to sustainable development by introducing new energy-efficient and environmentally friendly products based on carbon and other abundant, safe and recyclable natural resources, and boost economic growth in Europe by creating new jobs and investment opportunities.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2009-2.3.2-3 | Award Amount: 17.07M | Year: 2010
This proposal is for a large-scale collaborative project in which we propose both to develop novel microbicides directed against new intracellular targets and to investigate novel combinations of highly active anti-retroviral drugs which may be particularly effective as microbicides. Combinations may enhance efficacy but, equally importantly, will increase the genetic barrier to the development of resistance. The proposal includes development of both slow-release and gel formulations, pharmacokinetic and challenge experiments in macaques, and human studies, including a collaborative study with an EDCTP-funded project to use multiplex and proteomic technologies as well as culture-independent DNA-based analysis of the mucosal microbiota to investigate biomarkers and establish a baseline signature from which perturbations can be recognised. This is a large consortium comprising 30 partners from 8 EU countries and from Switzerland, Ukraine, South Africa and the United States. Partners include microbicide developers, IPM and Particle Sciences, and producers, Gilead, Tibotec and Virco. Two SMEs will also participate in RTD aspects. The consortium is multidisciplinary, with scientists engaged in basic discovery working with new targets and developing novel chemistry to produce compounds with improved safety and efficacy profiles as well as altered patterns of resistance.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.8.4 | Award Amount: 7.17M | Year: 2009
We will develop Complexity Science based modelling, prediction and simulation methods for large-scale socio-technical systems. We focus on the specific example of Ambient Intelligence (AmI) based smart environments. A key capability of such environments is to monitor user actions and to adjust their configuration and functionality accordingly. Thus, the system reacts to human behaviour while at the same time influencing it. This creates a feedback loop and leads to a tight entanglement between the human and the technical system. At the same time there is dynamic, heterogeneous human-human, human-technology and technology-technology communication, leading to ad-hoc coupling between components and different feedback loops. The project will study the global properties and emergent phenomena that arise in AmI-based socio-technical systems from such local feedback loops and their coupling, in two concrete scenarios: transportation and emergency/disaster response.

SOCIONICAL takes a parallel, multi-faceted research approach: we will investigate analytical methods, complex-network-based representations, and agent-based models. The advances in modelling and prediction will be verified by large-scale, distributed simulation driven by real-life data. We will develop a methodology by which a small number of instrumented users can be realistically integrated into a large-scale simulation as additional agents, experiencing the system and driving it. A separate WP is devoted to the integration of the different approaches into a coherent framework; another ensures generalization.

To take into account all technological, psychological and social dimensions and a realistic diversity of behaviours, we have assembled a multidisciplinary consortium with separate WPs for technology analysis and for the modelling of human-technology interactions.

SOCIONICAL also has a WP devoted to the development and dissemination of guidelines and recommendations for businesses and policy makers.
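The feedback loop described above, in which an ambient system adapts to observed user behaviour while users in turn react to the system, can be illustrated with a toy agent-based model. This is purely a sketch: the update rules, gains and noise levels below are invented for illustration and are not SOCIONICAL's actual models.

```python
import random

def simulate(n_agents=100, steps=50, gain=0.5, seed=0):
    """Toy human/AmI feedback loop: the system observes aggregate user
    behaviour and adapts its configuration; users then react to the new
    configuration, closing the loop. All rules here are invented."""
    rng = random.Random(seed)
    behaviours = [rng.random() for _ in range(n_agents)]  # activity levels in [0, 1]
    config = 0.5                                          # system configuration
    history = []
    for _ in range(steps):
        observed = sum(behaviours) / n_agents     # the system monitors users
        config += gain * (observed - config)      # ... and adjusts itself
        behaviours = [                            # users react to the system
            min(1.0, max(0.0, b + 0.1 * (config - b) + rng.gauss(0.0, 0.01)))
            for b in behaviours
        ]
        history.append(config)
    return history

trace = simulate()
```

Even this minimal coupling shows the entanglement the project targets: the configuration trajectory is driven by the crowd, while each agent's behaviour drifts toward the configuration, so neither side can be modelled in isolation.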
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2008-1.1.1 | Award Amount: 8.38M | Year: 2009
During the past 70 years or more, thousands of viruses have been isolated and partly characterised by experts working in different countries worldwide. These viruses potentially provide a unique and extremely valuable medical and educational resource for research and development, both to understand the basis of virus diseases and to develop modern, state-of-the-art strategies for disease control. However, nowhere in the world has there been an attempt to coordinate these collections of viruses so that they can be authenticated, amplified under quality-controlled conditions, stored long-term, and disseminated worldwide to laboratories engaged in fundamental and/or applied research. Without carefully coordinated intervention, these valuable resources will be lost to science and medicine. The objective of this project is therefore to develop a readily accessible virus reference library at the European level through the creation of the European Virus Archive (EVA). Since it would create insurmountable problems to develop such a collection in a single laboratory, EVA will utilise the expertise and facilities of recognised centres of excellence in virology within Europe. EVA will also exploit the high international reputations of these centres to obtain viruses currently held outside Europe. The management structure of EVA will ensure the highest standards of quality assurance, security, traceability and dissemination for the benefit of science, medicine, education and global information. The EVA network will develop appropriate protocols for virus amplification, supported by sustainable long-term storage facilities. The resource will be available to all users who can demonstrate the appropriate biosecurity credentials. An associated technology transfer centre will develop products for diagnosis, research, therapeutic application, education and training.
EVA is therefore an exciting, ambitious and realisable concept that can work for the benefit of mankind.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.8.2 | Award Amount: 2.58M | Year: 2013
Sport is the most universal and accessible of cultural pursuits. European Traditional Sports and Games (TSG) are as diverse as our cultures. TSG organisations work tirelessly to promote participation in their sports, but also act as custodians of custom, language and history. The TSG club or playing field is often the focal point of community life. Trends in globalisation have led to a convergence of spectator interest towards just a few mainstream sports with culturally homogeneous identities. Re-Play will focus on two families of TSG (Gaelic and Basque) that are integral to the fabric of their communities of origin and have successfully staved off this trend of convergence. Gaelic and Basque sports are also practised internationally and enjoy high levels of participation among all genders and age groups. They also share common techniques and forms of play, opening the opportunity for the Re-Play project results to be applied to other TSGs. To this end, Re-Play will include an advisory group of other TSG associations, and its results will be made available under Open Source or hybrid licence models. Re-Play proposes a study of the biomechanics and play dynamics of a number of sports and the creation of capture methodologies which balance cost and effectiveness. The platform will use off-the-shelf sensors and leading-edge studio rigs. Such an approach will allow the styles of play of elite sportspersons (national heroes) to be captured with precision for posterity, the more routine elements of play of amateur sportspersons (local heroes) to be captured with inexpensive setups, and the two approaches to be compared. Re-Play includes novel 3D rendering and interaction for coaching, teaching and entertainment, allowing a user to practise new basic skills and to emulate their hero. A video-only approach is included to recover the techniques of past players from legacy video content, thus allowing analysis of the evolution of each sport over time.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: NMP-28-2014 | Award Amount: 11.30M | Year: 2015
Concept: NanoFASE will deliver an integrated Exposure Assessment Framework, including methods, parameter values, models and guidance, that will allow industry to assess the full diversity of industrial nano-enabled products to a standard acceptable in regulatory registrations. Methods to assess how use phases, waste streams and environmental compartments (air, soil, water, biota) act as reactors in modifying and transporting ENMs will be developed and used to derive parameter values. Our nanospecific models will be integrated with the existing multimedia fate model SimpleBox4Nano for use in EUSES, and also developed into a flexible multimedia model for risk assessment at different scales and complexities. Information on release form, transformation and transport processes for product-relevant ENMs will allow grouping into Functional Fate Groups according to their most probable fate pathways, as a contribution to fate-based safe-by-design. Methodology: Inventories of material release forms along the product value chain are established. We then study how released ENMs transform from initial reactive states to modified forms with lower energy states, in which nanospecific properties may be lost. Transport studies assess material fluxes within and between compartments. The experimental work underpins models describing ENM transformation and transport. Open access is provided to the models, both those suitable for incorporation into existing exposure assessment tools (e.g. SimpleBox4Nano) and those for more detailed assessment. Framework completeness is validated by case studies. Impact: Identified links between ENM material properties and fate outcomes (e.g. safe-by-design). Improved representation of nanospecific processes in existing key fate and exposure assessment tools (e.g. SimpleBox4Nano in EUSES). Contribution to standardisation. A GIS framework to support predictive assessment, catchment and point-source management of ENM releases.
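The multimedia fate modelling mentioned above rests on a simple idea: each environmental compartment obeys a mass balance of emissions, inter-compartment transfers and losses, and at steady state these balances form a linear system. The two-compartment sketch below illustrates that idea only; the rate constants and emission values are invented for the example and are not NanoFASE or SimpleBox4Nano parameter values.

```python
def steady_state_two_box(E_air, E_water, k_aw, k_wa, k_loss_a, k_loss_w):
    """Solve a two-compartment (air/water) steady-state mass balance:
        0 = E_air   - (k_loss_a + k_aw) * M_a + k_wa * M_w
        0 = E_water - (k_loss_w + k_wa) * M_w + k_aw * M_a
    where E is emission rate, k_aw/k_wa are air->water/water->air transfer
    rate constants, and k_loss are degradation/removal rate constants.
    Returns the steady-state masses (M_a, M_w). Illustrative only."""
    a11 = -(k_loss_a + k_aw); a12 = k_wa
    a21 = k_aw;               a22 = -(k_loss_w + k_wa)
    det = a11 * a22 - a12 * a21
    # Cramer's rule on A @ [M_a, M_w] = [-E_air, -E_water]
    M_a = (-E_air * a22 + E_water * a12) / det
    M_w = (-E_water * a11 + E_air * a21) / det
    return M_a, M_w
```

Real multimedia models extend this to more compartments and to nanospecific processes (dissolution, heteroaggregation, transformation), but the linear-balance core is the same.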
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2010-IRSES | Award Amount: 188.80K | Year: 2012
54 partners from 34 institutes have formed an EC-funded Network of Excellence (NoE) in basic malaria research, the European Virtual Institute for Malaria Research (EVIMalaR). Over the previous five-plus years, as the NoE BioMalPar, these partners successfully broke down many barriers to cooperation while pursuing a programme of integrated research. This was greatly assisted by the BioMalPar PhD School, whose students were supervised by two partners from different member states. EVIMalaR represents the latest incarnation of this network and has recruited, in tranches, 21 students into the EVIMalaR PhD School. Australian malaria researchers have also realised the benefits of collaborative research and, within the Australian Parasitology Network, have exchanged personnel and expertise. Both the European and Australian networks recognised that their domestic spirit of cooperation could be mutualised, and signed a Memorandum of Understanding (MoU) in 2007 (updated in 2010) to formalise this ambition. The MoU generated greater exchange between the regions but was limited by a lack of finance. EVIMalaR created a legal link between the regions by incorporating an Australian malaria researcher, who then applied for funding from the Australian NHMRC to finance OzEMalaR, a mechanism for exchange visits by Australians to EVIMalaR partners. OzMalNet seeks reciprocal funding to allow EVIMalaR researchers to conduct exchange visits to OzEMalaR laboratories. Both regions are world leaders in malaria research, with particular local strengths that can be exploited to the mutual benefit of both regions and their early-stage researchers, including the EVIMalaR PhD students who will primarily be undertaking the exchanges. The outcome will be a more globalised integration of malaria research and greater exchange of information and personnel in the future, leading to collaborative grants and ultimately to concerted efforts to defeat malaria, one of the greatest scourges of mankind.
Agency: European Commission | Branch: FP7 | Program: ERC-SG | Phase: ERC-SG-SH2 | Award Amount: 1.20M | Year: 2012
In Europe and all over the world, genocide and mass violence were a structural feature of the 20th century. This project aims to question the social legacy of mass violence by studying how different societies have coped with the first consequence of mass destruction: the mass production of cadavers. What status and what value have been given to corpses? What political, social or religious uses have been made of dead bodies in occupied Europe, the Soviet Union, Serbia and Spain, but also in Rwanda, Argentina or Cambodia, both during and after the massacres? Bringing together perspectives from social anthropology, history and law, and raising the three main issues of destruction, identification and reconciliation, our project will illuminate how various social and cultural treatments of dead bodies simultaneously challenge common representations, legal practices and morality. Project outputs will therefore open up and strengthen the field of genocide studies by providing the intellectual and theoretical tools for a better understanding of the aftermath of mass violence in today's societies.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SC5-16-2014 | Award Amount: 15.99M | Year: 2015
Terrestrial and marine ecosystems provide essential services to human societies. Anthropogenic pressures, however, pose serious threats to ecosystems, leading to habitat degradation, increased risk of collapse and loss of ecosystem services. Knowledge-based conservation, management and restoration policies are needed to improve ecosystem benefits in the face of increasing pressures. ECOPOTENTIAL makes significant progress beyond the state of the art and creates a unified framework for ecosystem studies and the management of protected areas (PAs). ECOPOTENTIAL focuses on internationally recognised PAs in Europe and beyond, across a wide range of biogeographic regions, and includes UNESCO, Natura 2000 and LTER sites and Large Marine Ecosystems. Best use of Earth Observation (EO) and monitoring data is enabled by new EO open-access ecosystem data services (ECOPERNICUS). Modelling approaches incorporating information from EO data are devised, ecosystem services under current and future conditions are assessed, and the requirements of future protected areas are defined. Conceptual approaches based on Essential Variables, Macrosystem Ecology and cross-scale interactions allow for a deeper understanding of the Earth's Critical Zone. Open and interoperable access to data and knowledge is assured by a GEO Ecosystem Virtual Laboratory Platform, fully integrated into GEOSS. Support for transparent and knowledge-based conservation and management policies, able to include information from EO data, is developed. Knowledge gained in the PAs is upscaled to pan-European conditions and used for the planning and management of future PAs. A permanent stakeholder consultancy group (GEO Ecosystem Community of Practice) will be created. Capacity building is pursued at all levels. SMEs are involved to create expertise leading to new job opportunities, ensuring the long-term continuation of services. In summary, ECOPOTENTIAL uses the most advanced technologies to improve future ecosystem benefits for humankind.
Agency: European Commission | Branch: FP7 | Program: CP-SICA | Phase: ENV.2007.1.1.5.3. | Award Amount: 4.28M | Year: 2008
The CLARIS LPB Project aims at predicting the regional climate change impacts on the La Plata Basin (LPB) in South America, and at designing adaptation strategies for land use, agriculture, rural development, hydropower production, river transportation, water resources and ecological systems in wetlands. In order to reach this goal, the project has been built on the following four major thrusts. First, improving the description and understanding of decadal climate variability is of prime importance for short-term regional climate change projections (2010-2040). Second, a sound approach requires an ensemble of coordinated regional climate scenarios in order to quantify the amplitude and sources of uncertainty in the future climate of the LPB at two time horizons: 2010-2040 for adaptation strategies and 2070-2100 for assessment of long-range impacts. Such coordination will critically improve the capacity to predict climate change and its impacts in the region. Third, adaptation strategies based on regional scenarios of climate change impacts require a multidisciplinary approach in which all the regional components (climate, hydrology, land use, land cover, agriculture and deforestation) are addressed collaboratively. Feedback between the regional climate groups and the land-use and hydrology groups will make it possible to estimate the first-order feedback of future land-use and hydrology scenarios onto the future regional climate. Fourth, stakeholders must be integrated into the design of adaptation strategies, ensuring their dissemination to public, private and governmental policy-makers. Finally, in continuity with the FP6 CLARIS Project, our project will put a special emphasis on training young scientists in European institutes and on strengthening the collaborations between European and South American partners. The project is coordinated with the objectives of LPB, an international project on the La Plata Basin that has been endorsed by the CLIVAR and GEWEX Panels.
Agency: European Commission | Branch: H2020 | Program: ERA-NET-Cofund | Phase: SC5-15-2015 | Award Amount: 52.36M | Year: 2016
In the last decade, a significant number of projects and programmes in different domains of environmental monitoring and Earth observation have generated a substantial amount of data and knowledge on different aspects of environmental quality and sustainability. Big data generated by in-situ or satellite platforms are being collected and archived with a plethora of systems and instruments, making it difficult to share data and knowledge with stakeholders and policy makers in support of key economic and societal sectors. The overarching goal of ERA-PLANET is to strengthen the European Research Area in the domain of Earth Observation, in coherence with European participation in the Group on Earth Observation (GEO) and Copernicus. The expected impact is to strengthen European leadership within the forthcoming GEO 2015-2025 Work Plan. ERA-PLANET will reinforce the interface with the user communities whose needs the Global Earth Observation System of Systems (GEOSS) intends to address. It will provide more accurate, comprehensive and authoritative information to policy- and decision-makers in key societal benefit areas, such as Smart cities and Resilient societies; Resource efficiency and Environmental management; Global changes and Environmental treaties; and Polar areas and Natural resources. ERA-PLANET will provide advanced decision-support tools and technologies aimed at better monitoring our global environment and at sharing information and knowledge across the different domains of Earth Observation.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SPA.2013.2.1-01 | Award Amount: 3.22M | Year: 2013
GENIUS is designed to boost the impact of the next European breakthrough in astrophysics, the Gaia astrometric mission. Gaia is an ESA Cornerstone mission scheduled for launch in October 2013, and it aims to produce the most accurate and complete 3D map of the Milky Way to date. A pan-European consortium named DPAC is working on the implementation of the Gaia data processing, the final result of which will be a catalogue and data archive containing more than one billion objects. The archive system containing the data products will be located at the European Space Astronomy Centre (ESAC) and will serve as the basis for the scientific exploitation of the Gaia data. The design, implementation and operation of this archive constitute a task that ESA has opened up to participation from the European scientific community. GENIUS aims to contribute significantly to this development, based on the following principles: an archive design driven by the needs of the user community; the provision of exploitation tools to maximize the scientific return; ensuring the quality of the archive contents and interoperability with existing and future astronomical archives (ESAC, ESO, ...); cooperation with the only two other astrometric missions in the world, nanoJASMINE and JASMINE (Japan); and, last but not least, outreach and academic activities facilitated by the archive to foster public interest in science in general and astronomy in particular. GENIUS fits seamlessly into existing Gaia activities, exploiting synergies with ongoing developments. Its members actively participate in these ongoing tasks and provide in-depth knowledge of the mission as well as expertise in key development areas. Furthermore, GENIUS has the support of DPAC and of several national Gaia communities in the EU member states, and will establish cooperation with the Japanese astrometric missions already mentioned.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 5.06M | Year: 2009
SyMBaD aims to advance our knowledge of synapse structure and function in the normal and pathological brain. Brain diseases represent a considerable social and economic burden in Europe. Emerging evidence indicates that synaptic dysfunction is associated with the majority of neurological and psychiatric disorders. Novel therapeutic approaches rely on a better knowledge of the synapse and its pathologies. The network comprises 23 teams from 6 academic centres (Bordeaux, Alicante, Milan, Geneva, Göttingen, Bristol), representing an important fraction of the leading European researchers in the field. Synergies and complementarities between the research teams exist and should develop further through the activities of the SyMBaD network. The participants are already well integrated in European scientific collaborative networks and have an outstanding track record of training young researchers. The 6 industrial partners will take part as full partners in training, through obligatory placements of 6 to 12 months for 16 of the 26 recruited ESRs. The other ESRs will be fully integrated into collaborative projects between academic teams. The private sector comprises companies involved in the development of new therapeutic strategies to combat brain diseases (GSK, Neurosearch, Xygen and Noscira) and companies involved in technical developments to be used in synaptic research and beyond (Bioxtal, Amplitude Systems, Explora Nova). The SyMBaD network aims to: teach the increasingly sophisticated techniques required in neuroscience and advance towards novel therapies; focus on technological innovation and on the interweaving of multilevel approaches; and facilitate future constructive dialogue between academia and industry by involving SMEs in the training of PhD students through collaborative research projects.
SyMBaD will make European neuroscience more attractive to young scientists; it will catalyse multi-level collaborations and foster intersectoral exchanges to advance the study of some of the foremost health issues facing the European Community.
IoT Lab - Researching crowdsourcing to extend IoT testbed infrastructure for multidisciplinary experiments, with more end-user interactions, flexibility, scalability, cost efficiency and societal added value
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.1.7 | Award Amount: 3.35M | Year: 2013
IoT Lab is a research project exploring the potential of crowdsourcing to extend IoT testbed infrastructure for multidisciplinary experiments with more end-user interactions. It will research and develop:
1. Crowdsourcing mechanisms and tools enabling testbeds to use third-party resources (such as mobile phones) and to interact with distributed users (the crowd). The crowdsourcing enablers will address issues such as privacy by design, identity management, security, reputation mechanisms and data ownership.
2. Virtualization of crowdsourcing and testbed components through a meta-layer with an open interface, facilitating integration of, and interaction with, heterogeneous components. This should ease data integration and reduce the cost of deployment in real environments.
3. Ubiquitous interconnection and cloudification of the testbed resources. The project will research the potential of IPv6 and network virtualization to interconnect heterogeneous and distributed resources through a Virtual IoT Network, and will integrate them into the Cloud to provide an online platform of crowdsourcing Testbed as a Service (TBaaS) available to the research community.
4. End-user and societal value creation, by analyzing the potential end-users and crowdsourcing participants in order to propose an optimized model for end-user adoption and societal value creation.
5. Crowdsourcing-driven research as a new model in which research can be initiated, guided and assessed by the crowd, together with a comparison of this model with others.
6. The economic dimension of a crowdsourcing testbed, by analyzing the potential markets and business models able to monetize the provided resources with adequate incentives, in order to optimize the exploitation, costs, profitability and economic sustainability of such testbeds. The project will also develop tools for future experiments.
7. Multidisciplinary experiments, including end-user-driven experiments through crowdsourcing, to assess the added value of this approach.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.6.3 | Award Amount: 5.04M | Year: 2008
Management of the environment for predictable and sustainable use of natural resources is one of the great challenges of the 21st century. Although water covers most of the planet, it is becoming increasingly difficult to ensure adequate supplies of fresh, clean water for drinking as well as for sports and wellness activities. The demand for water resources is increasing as the population grows. At the same time, water resources are increasingly exposed to pollutants and spills as parts of the world become ever more crowded and industrialised. Potential climate changes due to global warming may also impact water resources. Management of water quality requires regular measurements and monitoring. Today, measurements of water quality are performed manually. The process can be slow and painstaking, and multiple point measurements are needed to cover an area. The process needs to be automated and extended to provide rapid and effective monitoring. Autonomous, mobile and self-healing solutions are needed to identify trends and to help localise and track potential problems. MOBESENS provides a modular and scalable ICT-based solution for water quality monitoring. It enables data to be gathered quickly and reported across wide areas. The low-power wireless sensor network gathers data samples, which are time- and location-stamped and automatically entered into the grid-based information system to facilitate analysis and issue alarms if needed. Mobility is a unique feature of the MOBESENS nodes, which are capable of navigation and of both surface and subsurface measurements. This extends range, enables 3D area measurements and facilitates operation even in bad weather. MOBESENS nodes may form ad-hoc networks, enabling rapid and reliable reporting as well as relative localisation and tracking (e.g. of contaminants). Opportunistic communication between MOBESENS and both fixed and mobile buoys is envisioned. Renewable energy sources are studied for self-sustained MOBESENS operation.
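The pipeline described above, sensor samples that are time- and location-stamped and then checked against alarm rules, can be sketched as follows. The field names, units and thresholds are illustrative assumptions for this sketch, not the project's actual data schema or alarm criteria.

```python
from dataclasses import dataclass
import time

@dataclass
class WaterSample:
    """A hypothetical time- and location-stamped water quality sample."""
    timestamp: float      # seconds since epoch
    lat: float            # latitude, degrees
    lon: float            # longitude, degrees
    depth_m: float        # depth of measurement (subsurface supported)
    turbidity_ntu: float  # turbidity, NTU
    ph: float             # acidity

# Illustrative threshold rules the information system might apply on ingest.
ALARM_RULES = {
    "turbidity": lambda s: s.turbidity_ntu > 5.0,
    "ph": lambda s: not (6.5 <= s.ph <= 8.5),
}

def check_alarms(sample):
    """Return the names of all rules the sample violates."""
    return [name for name, rule in ALARM_RULES.items() if rule(sample)]

sample = WaterSample(time.time(), 46.2, 6.15, 1.5, turbidity_ntu=7.2, ph=7.0)
alarms = check_alarms(sample)  # high turbidity trips the first rule
```

Because every record carries its own timestamp and coordinates, downstream analysis can aggregate samples into the 3D area measurements and contaminant-tracking views mentioned above without consulting the node that produced them.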
Agency: European Commission | Branch: FP7 | Program: MC-IAPP | Phase: FP7-PEOPLE-2009-IAPP | Award Amount: 1.49M | Year: 2010
The STEMCAM project is a 4-year training and transfer-of-knowledge programme between two distinguished academic groups in stem cell research, two highly innovative SMEs and a leading company in the development of media for stem cell research, intended to foster long-term industry-academia collaboration and partnership in the field of stem cell research and applications. The scientific and industrial aim of STEMCAM is to study the role of the Neuronal Cell Adhesion Molecule NCAM and related growth factors in the maintenance, survival and differentiation of induced pluripotent stem (iPS) cells towards the neural and myocardial lineages, in comparison with embryonic stem cells. The project will take advantage of unique and innovative pharmacological tools, the NCAM and growth factor mimetic peptides discovered by ENKAM. To achieve its aim, STEMCAM will apply an interdisciplinary approach ranging from cell biology (including innovative in vitro culture using two- and three-dimensional systems), immunocytochemistry, imaging, molecular biology and electrophysiology to peptide chemistry and chemoinformatics. The project will run via a training and transfer-of-knowledge programme structured to efficiently exploit the expertise and complementarities of the industrial and academic partners, in order to reach the scientific goals of the project and provide high-quality intersectoral training for the participating researchers. This is expected to be of great benefit to their individual career development. The STEMCAM project links intersectoral research activities in two very relevant areas of stem cell research, neurogenesis and cardiomyogenesis, and will collaborate closely with the IAPPs INDUSTEM and PARTNERS and the large FP7 IP ESNATS, complementing and expanding their research scope and transfer of knowledge with its unique approach. Thus STEMCAM will contribute significantly to the progress of stem cell research in Europe, with high potential impact on European competitiveness and regenerative medicine.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-08-2014 | Award Amount: 25.06M | Year: 2015
The TBVAC2020 proposal builds on the highly successful and long-standing collaborations established in successive EC FP5-, FP6- and FP7-funded TB vaccine and biomarker projects, but also brings in a large number of new key partners from excellent laboratories in Europe, the USA, Asia, Africa and Australia, many of which are global leaders in the TB field. This was initiated by launching, prior to this application, an open call for Expressions of Interest (EoI) to which interested parties could respond. In total, 115 EoIs were received and ranked by the TBVI Steering Committee using the proposed H2020 evaluation criteria. This led to the prioritisation of the 52 R&D approaches included in this proposal. TBVAC2020 aims to innovate and diversify the current TB vaccine and biomarker pipeline while at the same time applying portfolio management, using gating and priority-setting criteria, to select the most promising TB vaccine candidates as early as possible and accelerate their development. TBVAC2020 proposes to achieve this by combining creative bottom-up approaches for vaccine discovery (WP1), new preclinical models addressing clinical challenges (WP2) and the identification and characterisation of correlates of protection (WP5) with a directive top-down portfolio management approach that aims to select the most promising TB vaccine candidates through comparative evaluation using objective gating and priority-setting criteria (WP6), and by supporting direct, head-to-head or comparative preclinical and early clinical evaluation (WP3, WP4). This approach will both innovate and diversify the existing TB vaccine and biomarker pipeline and accelerate the development of the most promising TB vaccine candidates through the early development stages. The proposed approach, and the involvement of many internationally leading groups in the TB vaccine and biomarker area, fully aligns TBVAC2020 with the Global TB Vaccine Partnerships (GTBVP).
Agency: European Commission | Branch: H2020 | Program: IA | Phase: DRS-01-2015 | Award Amount: 14.54M | Year: 2016
The ultimate purpose of ANYWHERE is to empower exposed responder institutions and citizens to enhance their anticipation of, and pro-active capacity to respond to, extreme and high-impact weather and climate events. This will be achieved through the operational implementation of cutting-edge innovative technology as the best way to enhance citizens' protection and save lives. ANYWHERE proposes to implement a pan-European multi-hazard platform providing better identification of expected weather-induced impacts and their location in time and space before they occur. This platform will support faster analysis and anticipation of risks prior to the event occurrence, improve the coordination of emergency reactions in the field, and help to raise the self-preparedness of the population at risk. This significant step ahead in the pro-active capacity to provide adequate emergency responses is achievable by capitalizing on the advanced forecasting methodologies and impact models made available by previous RTD projects, maximizing the uptake of their innovative potential, which has not been fully exploited until now. The consortium is built upon a strong group of coordinators of previous key EC projects in the related fields, together with 12 operational authorities and first-responder institutions and 6 leading enterprises in the sector. The platform will be adapted to provide early-warning products and locally customizable decision support services proactively targeted to the needs and requirements of regional and local authorities, as well as public and private operators of critical infrastructures and networks. It will be implemented and demonstrated in 4 selected pilot sites to validate the prototype that will be transferred to real operation. Market uptake will be ensured by cooperation with an SME and Industry Collaborative Network covering a wide range of sectors and stakeholders in Europe, and ultimately worldwide.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: HEALTH.2010.2.1.1-2 | Award Amount: 2.24M | Year: 2010
High-throughput next-generation DNA sequencing technologies allow investigators to sequence entire human genomes at an affordable price and within a short time frame. The correct interpretation, storage, and dissemination of the large amount of genomics data produced generate major challenges. Tackling these challenges requires extensive exchange of data, information and knowledge between medical scientists, sequencing centres, bioinformatics networks and industry at the European level. The GEUVADIS (Genetic European Variation in Disease) Consortium aims at developing standards in quality control and assessment of sequence data; models for data storage, exchange and access; standards for the handling, analysis and interpretation of sequencing data and other functional genomics datasets; standards for the biological and medical interpretation of sequence data, in particular rare variants in monogenic and common disorders; and finally standards for the ethics of phenotype prediction from sequence variation. The partners are all involved in international sequencing initiatives (1000 GP, ICGC), EU and other international projects (ENGAGE, GEN2PHEN, ENCODE, TECHGENE), biobanking activities (BBMRI), data sharing initiatives (ELIXIR), and the European Sequencing and Genotyping Infrastructure (ESGI), ensuring tight collaborations. The Consortium will undertake pilot sequencing projects on selected samples from three medical fields (cardiovascular, neurological and metabolic), using RNA (RNASeq) and DNA (exonSeq) sequencing. The analysis of such samples will allow the consortium to set up standards for operating procedures and the biological/medical interpretation of sequence data in relation to clinical phenotypes. The consortium will bring together the knowledge and resources on medical genome sequencing at a European level and allow researchers to develop and test new hypotheses on the genetic basis of disease.
Agency: European Commission | Branch: H2020 | Program: SGA-RIA | Phase: FETFLAGSHIP | Award Amount: 89.00M | Year: 2016
Understanding the human brain is one of the greatest scientific challenges of our time. Such an understanding can provide profound insights into our humanity, lead to fundamentally new computing technologies, and transform the diagnosis and treatment of brain disorders. Modern ICT brings this prospect within reach. The HBP Flagship Initiative (HBP) thus proposes a unique strategy that uses ICT to integrate neuroscience data from around the world, to develop a unified multi-level understanding of the brain and its diseases, and ultimately to emulate its computational capabilities. The goal is to catalyze a global collaborative effort. During the HBP's first Specific Grant Agreement (SGA1), the HBP Core Project will lay the basis for building and operating a tightly integrated Research Infrastructure, providing HBP researchers and the scientific community with unique resources and capabilities. Partnering Projects will enable independent research groups to expand the capabilities of the HBP Platforms, in order to use them to address otherwise intractable problems in neuroscience, computing and medicine in the future. In addition, collaborations with other national, European and international initiatives will create synergies, maximizing returns on research investment. SGA1 covers the detailed steps that will be taken to move the HBP closer to achieving its ambitious Flagship Objectives.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2007-2.1.1-5 | Award Amount: 14.64M | Year: 2008
Cys-loop receptors (CLRs) form a superfamily of structurally related neurotransmitter-gated ion channels, comprising nicotinic acetylcholine, glycine, GABA-A/C and serotonin (5HT3) receptors, crucial to the function of the peripheral and central nervous system. CLRs cover a wide spectrum of functions, ranging from muscle contraction to cognitive functions. CLR (mal)function is linked to various disorders, including muscular dystrophies, neurodegenerative diseases, e.g. Alzheimer's and Parkinson's, and neuropsychiatric diseases, e.g. schizophrenia, epilepsy and addiction. CLRs are therefore potentially important drug targets for the treatment of disease. However, novel drug discovery strategies call for an in-depth understanding of ligand binding sites, the structure-function relationships of these receptors, and insight into their actions in the nervous system. NeuroCypres assembles the expertise of leading European laboratories to provide a technology workflow that enables this next step in CLR structure and function research. A major target of this project is to obtain high-resolution X-ray and NMR structures of CLRs and their complexes with diverse ligands, agonists/antagonists, channel blockers and modulators, which will reveal basic mechanisms of receptor functioning from ligand binding to gating and open new avenues for rational drug design. In addition, the project aims at understanding receptor function in the context of the brain, focusing on receptor biosensors, receptor-protein interactions and transgenic models. This major challenge requires the application and development of a multidisciplinary workflow of high-throughput (HT) crystallization and HT-electrophysiology technologies, X-ray analysis, NMR and computational modeling, fragment-based drug design, innovative quantitative methods of interaction proteomics, sensitive methods for visualization of the activity and localization of receptors, and studies of in vitro and in vivo function in animal models of disease.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-EJD | Phase: MSCA-ITN-2016 | Award Amount: 3.06M | Year: 2017
Environmental perturbations to lakes and reservoirs occur largely as episodic climatic events, ranging from relatively short mixing events to storms and heat waves. While the driving events occur along a continuum of frequency and magnitude, their effect is generally longer lasting than the events themselves. In addition, the more extreme weather events are becoming increasingly frequent, a trend that has been linked to directional climate change and is projected to continue in the coming decades. Understanding the impact of these short-lived pressures requires monitoring that captures both the event (hours to days) and the ensuing impact, which can last for months or even years. Only recently has automated high-frequency monitoring (HFM) of lakes been adopted throughout Europe. This Training Network will investigate the effects of the most extreme events, and of cumulative lower-magnitude events, using HFM, while at the same time training a cohort of doctoral students in state-of-the-art technology, data analysis and modelling. The aim of the EJD is to change the way in which water quality monitoring is carried out so that the effects of episodic climatic events can be understood, ensuring that future water management strategies can explicitly account for them.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.4.1-3 | Award Amount: 4.84M | Year: 2008
Genomic instability is a characteristic of practically all human cancers. Recent results generated by members of this Consortium suggest that signs of genomic instability are evident from the very beginning of human cancer development, even in precancerous lesions. In these early lesions, the genomic instability affects primarily specific genomic loci, called common fragile sites. Because common fragile sites are very sensitive to perturbations in DNA replication, we proposed that cancer development from its very beginning is associated with DNA replication stress. A separate set of observations focused on telomeres and showed that short telomeres mimic DNA ends, activate the DNA damage checkpoint, and promote genomic instability and cancer development. We propose here to study the role of DNA replication stress and short telomeres in driving genomic instability, particularly in human precancerous lesions. Our studies will investigate the most common forms of cancer in the EU and will benefit from access to some of the largest databases of cancerous and precancerous lesions in Europe. Genomic instability will be explored using high-resolution genomic arrays, and the data will be correlated with clinical information on tumor progression. Further, proteins and genes involved in the cellular response to DNA replication stress and short telomeres will be explored using high-throughput and targeted approaches and will be used to identify novel targets for cancer therapy.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2016 | Award Amount: 3.92M | Year: 2016
Mitochondria are essential organelles found in every eukaryotic cell, required to convert food into usable energy. The mitochondrial oxidative phosphorylation (OXPHOS) system, which produces the majority of cellular energy in the form of ATP, is controlled by two distinct genomes: the nuclear genome and the mitochondrial genome (mtDNA). Mutations in mitochondrial genes encoded by either genome can cause diseases affecting the OXPHOS system, called mitochondrial diseases, whose prevalence has been estimated at 1 in 8,500. Moreover, dysfunction of the mitochondrial OXPHOS system has emerged as a key factor in a myriad of common diseases, including neurodegenerative and metabolic disorders such as Parkinson's and Alzheimer's disease and Type 2 Diabetes, and has been linked to the aging process. Despite all this, our understanding of the mechanisms governing mitochondrial gene expression and its associated pathologies remains superficial, and therapeutic interventions remain unexplored. The basic machineries for mtDNA replication, mtDNA transcription and mitochondrial translation are known, but the regulation of these processes in response to metabolic demands is poorly understood. The complex nature of mitochondrial gene expression, which relies on two different genomes, calls for a multidisciplinary approach in which different teams of researchers join forces. Studies in this area are not only of basic scientific interest but may also provide new avenues towards the treatment of mitochondrial dysfunction in a variety of human diseases. The key aim of the REMIX Network is to combine the skills of European research groups to provide strategic training for the next generation of scientists through a programme that will advance the elucidation of the molecular mechanisms and pathways that regulate mitochondrial gene expression.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2008-1.1.1 | Award Amount: 32.30M | Year: 2009
Particle physics stands at the threshold of a new era of discovery and insight. Results from the much-awaited LHC are expected to shed light on the origin of mass, supersymmetry, new space dimensions and forces. In July 2006 the European Strategy Group for Particle Physics defined accelerator priorities for the next 15 years in order to consolidate the potential for discovery and conduct the required precision physics. These include an LHC upgrade, R&D on TeV linear colliders, and studies on neutrino facilities. These ambitious goals require the mobilisation of all European resources to face scientific and technological challenges well beyond the current state of the art and the capabilities of any single laboratory or country. EuCARD will contribute to the formation of a European Research Area in accelerator science, effectively creating a distributed accelerator laboratory across Europe. It will address the new priorities by upgrading European accelerator infrastructures while continuing to strengthen the collaboration between its participants and developing synergies with industrial partners. R&D will be conducted on high-field superconducting magnets, superconducting RF cavities (particularly relevant for FLASH, XFEL and SC proton linacs), two-beam acceleration, high-efficiency collimation, and new accelerator concepts. EuCARD will include networks to monitor the performance and risks of innovative solutions and to disseminate results. Trans-national access will be granted to users of beams and advanced test facilities. Strong joint research activities will support priority R&D themes. As an essential complement to national and CERN programmes, the EuCARD proposal will strengthen the European Research Area by ensuring that European accelerator infrastructures further improve their performance and remain at the forefront of global research, serving a community of well over 10,000 physicists from all over the world.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2010.2.2.2-4 | Award Amount: 3.92M | Year: 2011
Ageing is a fundamental biological process that defines all aspects of life, yet the basic biological interactions leading to ageing, and how genetic and environmental variation contributes to this process, are still largely unknown. We propose to use the most up-to-date genomic and epidemiological methodologies to shed light on some of the key questions in ageing research. We will use whole-transcriptome sequencing and telomere length assays in three cell types from 855 twins for whom a large set of phenotypic measurements relating to ageing is available. These datasets will be integrated with genome-wide association studies in a systems genetics framework to infer causal interactions (genetic or environmental). The combination of these unique datasets together with the expertise of the participants is likely to provide novel insights into ageing research and uncover currently unknown factors that contribute to variation in the ageing process in human populations.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 4.99M | Year: 2012
The TRAINBIODIVERSE ITN will provide professional skills and training for young scientists covering multi-disciplinary aspects of soil biodiversity, ecosystem services, their economic significance, and their practical implications and implementation. The researchers will gain access to European educational and network facilities and training aimed at ensuring the wellbeing of human populations and the continued availability of sustainable resources underpinned by soil microbiology. Practical and theoretical training related to monitoring, evaluating and improving the quality of biodiversity in European soils, combined with training professionals to ensure enhanced intersectoral skills and communication, will help to secure the future of European ecosystem services and agricultural production. TRAINBIODIVERSE will fill the gap between specialists in the different institutions and administrative bodies providing information and policy on biodiversity and ecosystem services in Europe. The consortium encompasses academic, non-academic, industrial, economic and political professions across different sectors. An understanding of the interrelationships and communication between the different sectors involved will be made available to European researchers for the first time. This will coincide with increases in related governmental policy and actions. The training will cover the process of applying scientific rationale to political implementation. Initial training will commence with field and laboratory work, then proceed through interpretation of results to economic evaluation for managerial, administrative and decision-making processes, and finally to application of the information.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: REV-INEQUAL-05-2016 | Award Amount: 3.25M | Year: 2017
The proposed project aims to study the relations between inequalities and young people's ways of doing politics, and to advance scenarios for future democratic models and political systems in Europe that are more inclusive for young people. It has three main objectives: (1) To provide systematic evidence on the ways in which inequalities are lived by young people and (re)acted upon, exploring the coping mechanisms which are embedded in young people's ways of doing politics; these coping mechanisms are manifested in multiple forms, i.e. as either political (dis)engagement and contestation online and offline or as (trans-)national democratic innovation and experimentation; (2) To advance knowledge on the conditions and causes underpinning young people's ways of doing politics; this involves an examination of their norms, values, attitudes, and behaviors regarding democracy, power, politics, policy-making, social and political participation (online and offline) and the organization of economic, social and private life, in order to identify ways to strengthen youth political participation and engagement with democratic life in Europe; (3) To suggest a number of different future scenarios for the development of democracy and political participation in Europe, putting particular emphasis on implementing new democratic models that are more inclusive for young people, especially those with fewer opportunities. The research design consists of a multidimensional theoretical framework that combines macro-level (institutional), meso-level (organizational), and micro-level (individual) explanatory factors; a cross-national comparative design that includes nine European countries with different institutional arrangements and policies towards youth; and an integrated methodological approach based on multiple sources and methods (policy analysis, claims-making analysis, organizational survey, panel survey, survey experiments, biographical interviews, and social media analysis).
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: NMP.2012.1.3-1 | Award Amount: 12.95M | Year: 2013
The NanoMILE project is conceived and led by an international elite of scientists from the EU and US with the aim of establishing a fundamental understanding of the mechanisms of nanomaterial interactions with living systems and the environment, and, uniquely, of doing so across the entire life cycle of nanomaterials and in a wide range of target species. Identification of the critical properties (physico-chemical descriptors) that confer the ability to induce harm in biological systems is key to allowing these features to be avoided in nanomaterial production (safety by design). Major shortfalls in the risk analysis process for nanomaterials are the fundamental lack of data on exposure levels and on the environmental fate and transformation of nanomaterials, key issues that this proposal will address, including through the development of novel modelling approaches. A major deliverable of the project will be a framework for the classification of nanomaterials according to their impacts, whether biological or environmental, by linking nanomaterial-biomolecule interactions across scales (sub-cellular to ecosystem) and establishing the specific biochemical mechanisms of interference (toxicity pathways).
Universitätsklinikum Hamburg-Eppendorf, Saarland University and University of Geneva | Date: 2012-12-21
The invention relates to a compound which is effective in inhibiting the function of the TRPM4 ion channel and the use of such a compound in treating or preventing a neurodegenerative disease, such as Multiple Sclerosis, Parkinson's disease, Alzheimer's disease, or amyotrophic lateral sclerosis, in a subject. The invention also provides a pharmaceutical composition comprising a TRPM4 inhibitory compound. The invention further relates to in vitro methods for identifying pharmaceutically active compounds that are useful for treating or preventing a neurodegenerative disease.
Agency: European Commission | Branch: H2020 | Program: MSCA-RISE | Phase: MSCA-RISE-2015 | Award Amount: 1.39M | Year: 2016
In the EU and Australia, thousands of square miles of forests and other lands burn every year due to wildfires. These fires cause major economic and ecological losses and, often, human casualties. Both EU and Australian governments are aware of how crucial it is to improve wildfire management and containment. Scientists from different specialties, in both the EU and Australia, have already developed methods and models to improve the management and decision processes pertaining to the preparedness and response phases of a bushfire. The present project, named Geospatial based Environment for Optimisation Systems Addressing Fire Emergencies (GEO SAFE), aims at creating a network enabling the two regions to exchange knowledge, ideas and experience, thus boosting progress in wildfire knowledge and the related development of innovative methods for dealing efficiently with such fires. More precisely, the GEO SAFE project will focus on developing the tools needed to set up an integrated decision support system that optimizes resources during the response phase, through: (i) developing a dynamic risk cartography of a region with regard to the possibility of a wildfire, a task involving data collection (satellite and remote sensors), risk analysis, and the development of a tool for forecasting fire extension, and in particular for predicting fire and risk evolution during the response phase; (ii) designing and testing a resource allocation tool for the response phase using the dynamic risk cartography, where one of the problems to consider will be resource allocation for securing key places (schools, hospitals, etc.) under time-dependent constraints, with problems identified through connections with end users and the proposed solutions tested on simulated data; and (iii) developing analyses of relevant management processes as well as training tools in order to facilitate the implementation of such solutions.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-02-2015 | Award Amount: 5.50M | Year: 2016
EuroPOND will develop a data-driven statistical and computational modeling framework for neurological disease progression. This will enable major advances in differential and personalized diagnosis, prognosis, monitoring, and treatment and care decisions, positioning Europe as a world leader in one of the biggest societal challenges of 21st-century healthcare. The inherent complexity of neurological disease, the overlap of symptoms and pathologies, and the high comorbidity rate suggest a systems medicine approach, which matches the specific challenge of this call. We take a uniquely holistic approach that, in the spirit of systems medicine, integrates a variety of clinical and biomedical research data including risk factors, biomarkers, and interactions. Our consortium has a multidisciplinary balance of essential expertise in mathematical/statistical/computational modelling; clinical, biomedical and epidemiological expertise; and access to a diverse range of datasets for sporadic and well-phenotyped disease types. The project will devise and implement, as open-source software tools, advanced statistical and computational techniques for reconstructing the long-term temporal evolution of disease markers from cross-sectional or short-term longitudinal data. We will apply these techniques to generate new and uniquely detailed pictures of a range of important diseases. This will support the development of new evidence-based treatments in Europe through deeper disease understanding, better patient stratification for clinical trials, and improved accuracy of diagnosis and prognosis. For example, Alzheimer's disease alone costs European citizens around €200B every year in care and lost productivity. No disease-modifying treatments are yet available, and clinical trials repeatedly fail because disease heterogeneity prevents a bulk response. Our models enable fine stratification into phenotypes, enabling more focused analysis to identify subgroups that respond to putative treatments.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SC1-PM-04-2016 | Award Amount: 10.77M | Year: 2017
Our main objective is to identify determinants of brain, cognitive and mental health at different stages of life. By integrating, harmonising and enriching major European neuroimaging studies of age differences and changes, we will obtain an unparalleled database of fine-grained brain, cognitive and mental health measures from more than 6,000 individuals. Longitudinal brain imaging, genetic and health data are available for a major part of the sample, as well as cognitive/mental health measures for considerably broader cohorts, exceeding 40,000 examinations in total. By linking these data, also to additional databases and biobanks, including birth registries and national and regional archives, and by enriching them with new online data collection and novel measures, we will address risk and protective factors for brain, cognitive and mental health throughout the lifespan. We will identify the pathways through which risk and protective factors work, and their moderators. Through exploitation of, and synergies with, existing European infrastructures and initiatives, this approach of integrating, harmonising and enriching brain imaging datasets will make major conceptual, methodological and analytical contributions towards large integrative cohorts and their efficient exploitation. We will thus provide novel information on the maintenance of brain, cognitive and mental health and on the onset and course of brain, cognitive and mental disorders, and lay a foundation for earlier diagnosis of brain disorders, aberrant development, and decline of brain, cognitive and mental health, as well as for future preventive and therapeutic strategies. Working with stakeholders and health authorities, the project will provide the evidence base for policy strategies for prevention and intervention, improving clinical practice and public health policy for brain, cognitive and mental health. This project is realized through a close collaboration of small and medium-sized enterprises (SMEs) and major European brain research centres.
Agency: European Commission | Branch: FP7 | Program: MC-IAPP | Phase: FP7-PEOPLE-2009-IAPP | Award Amount: 771.68K | Year: 2010
The project Q-CERT intends to gather industrial and academic partners with strong scientific and technical backgrounds in quantum key distribution (QKD) technology, in order to establish research partnerships focused on one common high-level objective: to strengthen the security of practical QKD systems by developing techniques and standards (at both the hardware and software level) that will allow cryptographic security evaluation and certification. At the hardware level, we will conduct systematic studies of the potential vulnerabilities of QKD systems by testing experimentally the feasibility of attacks on the optical and electronic layers of the systems. In response, we will implement countermeasures experimentally, test their efficiency, and develop the theoretical framework allowing us to model the entire QKD implementation and prove its security. At the software level, we will push further a formal approach to security proofs for an essential part of a practical quantum key distribution protocol: key distillation. We will specify and then develop a software library for key distillation that presents a very high level of security assurance, validated by the use of formal methods for cryptographic protocol verification. This library will in particular include a state-of-the-art error correction module based on unidirectional LDPC codes. In order to increase the impact of our work, and to benefit from the fruitful interaction and feedback of the research community, we will publicize parts of our results by integrating them into QKD security standards. The development of such security assurance procedures is expected to greatly strengthen the practical security of QKD systems. We will in particular write security targets for a high-performance QKD system and for a secure infrastructure relying on a network of QKD links.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.4.4 | Award Amount: 4.47M | Year: 2009
The goal of the e-LICO project is to build a virtual laboratory for interdisciplinary collaborative research in data mining and data-intensive sciences. The proposed e-lab will comprise three layers: the e-science and data mining layers will form a generic research environment that can be adapted to different scientific domains by customizing the application layer. The e-science layer, built on an open-source e-science infrastructure developed by one of the partners, will support content creation through collaboration at multiple scales and degrees of commitment, ranging from small, contract-bound teams to voluntary, constraint-free participation in dynamic virtual communities. The data mining layer will be the distinctive core of e-LICO; it will provide comprehensive multimedia (structured records, text, images, signals) data mining tools. Standard tools will be augmented with preprocessing or learning algorithms developed specifically to meet the challenges of data-intensive, knowledge-rich sciences, such as ultra-high dimensionality or undersampled data. Methodologically sound use of these tools will be ensured by a knowledge-driven data mining assistant, which will rely on a data mining ontology and knowledge base to plan the mining process and propose ranked workflows for a given application problem. Extensive e-lab monitoring facilities will automate the accumulation of experimental meta-data to support replication and comparison of data mining experiments. These meta-data will be used by a meta-miner, which will combine probabilistic reasoning with kernel-based learning from complex structures to incrementally improve the assistant's workflow recommendations. e-LICO will be showcased in a systems biology task: biomarker discovery and molecular pathway modelling for diseases affecting the kidney and urinary pathways.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: ENV.2007.1.1.5.2. | Award Amount: 8.54M | Year: 2008
As the evidence for human-induced climate change becomes clearer, so too does the realization that its effects will have impacts on the natural environment and socio-economic systems. Some regions are more vulnerable than others, both to physical changes and to the consequences for ways of life. The proposal will assess the impacts of a changing climate on the quantity and quality of water in mountain regions. Modeling techniques will be used to project the influence of climatic change on the major determinants of river discharge at various time and space scales. Regional climate models will provide the essential information on shifting precipitation and temperature patterns, and snow, ice, and biosphere models will feed into hydrological models in order to assess the changes in seasonality, amount, and incidence of extreme events in various catchment areas. Environmental and socio-economic responses to changes in hydrological regimes will be analyzed in terms of hazards, aquatic ecosystems, hydropower, tourism, agriculture, and the health implications of changing water quality. Attention will also be devoted to the interactions between land use/land cover changes, and changing or conflicting water resource demands. Adaptation and policy options will be elaborated on the basis of the model results. Specific environmental conditions of mountain regions will be particularly affected by rapidly rising temperatures, prolonged droughts and extreme precipitation. The methodological developments gained from a European mountain focus will be used to address water issues in regions whose economic conditions and political structures may compromise capacities to respond and adapt, such as the Andes and Central Asia, where complex problems resulting from asymmetric power relations and less robust institutions arise. Methodologies developed to study European mountains and their institutional frameworks will identify vulnerabilities and be used to evaluate a range of policy options.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: NMP-2008-4.0-1 | Award Amount: 12.25M | Year: 2010
Based on unmet clinical needs and recent biomarker research in Rheumatoid Arthritis (RA) and Osteoarthritis (OA), the main objective of the project is to develop a novel nanotechnology-based diagnostic tool for easy and early detection of biomarkers in inflammatory diseases, especially RA and OA, by using modified superparamagnetic nanoparticles (SPION) for (A) bioassays (ex-vivo application) and (B) MRI (in-vivo detection). A new technology based on multiply functionalized single nanoparticles that specifically enter or attach to cells, to enzymes in serous fluids, or to organelles in living cells will be used to detect, separate and identify low-abundance biomarkers. Newly identified biomarkers will be used to decorate SPION with binding moieties which are specific to the biomarker(s) and can be used diagnostically, for instance in contrast agents (MRI). A sensitive micro-immunoassay will be developed for the specific use of these particles in biochemical tests for arthritis. This project is driven by the high clinical need to identify early arthritis and then segment RA and OA patients into progressors/responders or non-progressors/non-responders to various treatment options. Inflammatory disorders like RA induce the destruction of cartilage in 1% of the population; accompanied by significant pain, morbidity and mortality, this leads to a reduced capacity to work. OA, a degenerative arthritis, is the leading cause of disability among the elderly population. As there is no cure for RA, and OA ultimately requires joint replacement (e.g. of the knee), early diagnostic tools for detecting disease progression and evaluating the efficacy of therapeutic interventions are necessary, among other things for drug development. Existing diagnostic methods often do not permit an early definite diagnosis, so new nanoparticle-based diagnostic techniques targeting the detection of molecular events (based on MRI) with higher sensitivity/specificity will be developed to satisfy this urgent need.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENV.2011.1.2.3-2 | Award Amount: 4.52M | Year: 2012
Indications exist that close contact with nature brings benefits to human health and well-being. The proposed work will investigate the interconnections between exposure to natural outdoor environments, in both rural and urban settings, and better human health and well-being in the North-West, South and East of Europe. The project will explore the underlying mechanisms at work (stress reduction/restorative function, physical activity, social interaction, exposure to environmental hazards) and examine the health effects (general health and well-being, mental health/neural development, stress, cardiovascular, cancer and respiratory mortality and morbidity, birth outcomes and obesity) for different population groups (pregnant women and/or foetus, different age groups, socio-economic status, ethnic minorities and patients). We will use conventional as well as innovative high-tech methods to characterize the natural environment in terms of quality and quantity. Preventive as well as therapeutic effects of contact with the natural environment will be covered. We will address implications for land-use planning and green space management. The work will produce a more robust evidence base on links between exposure to natural outdoor environments and human health and well-being, and a better integration of human health needs into land-use planning and green space management in rural as well as urban areas.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2007-1.3-1 | Award Amount: 15.59M | Year: 2008
ESNATS aims at developing a novel toxicity test platform based on embryonic stem cells (ESC), especially human ESC (hESC), to accelerate drug development, reduce R&D costs and propose a powerful alternative to animal tests (3 Rs). ESNATS will address current drug-testing shortcomings:
- testing takes place late in the development cycle
- animal test systems bear the risk of non-prediction due to inter-species variation
- non-ESC assays rely on primary cells or cells of malignant origin that are hard to standardise and limited in regard to quantity, homogeneity and genetic diversity
- existing assay systems based on primary animal cell lines do not reliably represent the physiological situation
ESNATS will develop a battery of toxicity tests using hESC lines subjected to different standardised culture protocols. Tests will cover embryoid bodies in different developmental stages and differentiated derivatives including gamete and neuronal lineages, complemented with test systems for hepatic metabolism. Predictive toxicogenomics and proteomics markers will be identified. The individual tests will be integrated into an all-in-one test system. To enable future industrial use, ESNATS will prepare the automation and scale-up of hESC culture. The predictivity, quality and reproducibility of ESNATS will be evaluated in a proof-of-concept study. ESNATS benefits are to increase safety due to the better predictivity of human test systems, to reduce, refine and replace animal tests, to lower testing cost, and to support medium/high-throughput testing. ESNATS objectives will be achieved in a 5-year multi-disciplinary collaboration of leading European researchers in alternative testing, toxicology, ESC research, genomics, modelling, and automation. The consortium will also include representatives from regulatory bodies, the pharmaceutical industry and ethical advisors to provide guidance to ensure rapid applicability of the developed test systems.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SSH.2013.5.1-1 | Award Amount: 3.14M | Year: 2013
The proposed research deals with citizens' reactions to economic crises and their social and political consequences. It examines in particular the ways in which European citizens have reacted to the crisis that, at different degrees of intensity in different countries, has struck Europe since 2008, but also how they deal with economic crises and their consequences more generally. We examine both individual and collective responses by citizens, both the private and the public dimensions of such responses, and both political and non-political responses. In addition, while the focus of the research is on citizens' responses, we also examine policy responses so as to have a baseline for assessing citizens' reactions to crises. The project has three main objectives: (1) to provide systematic evidence of the ways in which European citizens react to economic crises and their social and political consequences, both individually and collectively; (2) to advance knowledge on the connections between individual factors, contextual factors, and the ways in which European citizens react to economic crises and their social and political consequences; and (3) to suggest a number of good practices as to how to deal with economic crises, both at the social and political level, through which their negative consequences on European citizens can be avoided or limited. The project's objectives are addressed by means of six main types of data and methods: (1) the creation of a cross-national comparative dataset on economic, social, and political indicators; (2) an analysis of policy responses to crises; (3) an analysis of collective responses to crises in the public domain; (4) an analysis of individual responses to crises by private citizens; (5) experiments designed to assess causal effects of different dimensions of crises on citizens' attitudes and behaviors; and (6) an analysis of alternative forms of resilience in times of crisis.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.5.2 | Award Amount: 8.35M | Year: 2008
In about half a century of antibiotic use, unexpected new challenges have come to light: fast emergence of resistance among pathogens, misuse and overuse of antibiotics, and direct and indirect related costs. Antimicrobial resistance results in escalating healthcare costs, increased morbidity and mortality and the emergence or re-emergence of potentially untreatable pathogens. In this context of infectious diseases we will (1) detect patient safety issues, (2) learn how to prevent them and (3) actually prevent them in clinical cases. We will detect harmful patterns and trends using clinical and operational information from Clinical Information Systems (CIS). This will be done through the view of a virtualized Clinical Data Repository (CDR), featuring transparent access to the original CIS and/or collection and aggregation of data in a local store. Text, image and structured data mining on individual patients as well as on populations will teach us informational and temporal patterns of patient harm. This knowledge will be fed into a Medical Knowledge Repository and mixed with knowledge coming from external sources (for example guidelines and evidence). After editing and validation, this knowledge will be used by a decision support and monitoring tool in the clinical environment to prevent patient safety issues and report on them. Outcomes and benefits, both clinical and economic, will be measured and reported on. The innovation within this project lies in the virtualization of the Clinical Data Repository through ontology mediation, the advanced mining techniques, the reasoning engine and the consolidation of all these techniques in a comprehensive but open framework. This framework will be implemented with a focus on infectious diseases, but will be applicable to all sorts of clinical cases in the future.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2011.2.1.1-1 | Award Amount: 39.64M | Year: 2011
In response to the call for a high impact initiative on the human epigenome, the BLUEPRINT Consortium has been formed with the aim of generating at least 100 reference epigenomes and studying them to advance and exploit knowledge of the underlying biological processes and mechanisms in health and disease. BLUEPRINT will focus on distinct types of haematopoietic cells from healthy individuals and on their malignant leukaemic counterparts. Reference epigenomes will be generated by state-of-the-art technologies from highly purified cells for a comprehensive set of epigenetic marks in accordance with quality standards set by IHEC. This resource-generating activity will be conducted at dedicated centres to be complemented by confederated hypothesis-driven research into blood-based diseases, including common leukaemias and autoimmune disease (T1D), by epigenetic targets and compound identification, and by discovery and validation of epigenetic markers for diagnostic use. By focussing on 100 samples of known genetic variation BLUEPRINT will complete an epigenome-wide association study, maximizing the biomedical relevance of the reference epigenomes. Key to the success of BLUEPRINT will be the integration with other data sources (i.e. ICGC, 1000 genomes and ENCODE), comprehensive bioinformatic analysis, and user-friendly dissemination to the wider scientific community. The involvement of innovative companies will energize epigenomic research in the private sector by creating new targets for compounds and the development of smart technologies for better diagnostic tests. BLUEPRINT will outreach through a network of associated members and form critical alliances with leading networks in genomics and epigenomics within Europe and worldwide. Through its interdisciplinarity and scientific excellence combined with its strong commitment to networking, training and communication BLUEPRINT strives to become the cornerstone of the EU contribution to IHEC.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2007-2.2-01 | Award Amount: 15.61M | Year: 2008
The Large Hadron Collider upgrade (SLHC) is the project with highest priority in The European strategy for particle physics document, unanimously approved by the CERN Council in July 2006. The SLHC, with expected 1 B budget, includes the upgrade of specific elements of the LHC accelerator, major upgrades in the accelerator injector complex, as well as upgrades to the two high-luminosity experiments ATLAS and CMS. It will result in a tenfold increase of the LHC luminosity. Thus the SLHC will remain the most powerful particle accelerator in the world in the next two decades. The Preparatory Phase project of the LHC-upgrade (SLHC-PP), co-funded by the EC, comprises Coordinating, Support and Technical activities. The Coordinating activities within SLHC-PP play a central role for the organisation of the new accelerator- and detector-upgrade collaborations, putting in place project structures and collaboration management tools, ultimately aiming for agreements on work-sharing and funding for the implementation phase. Support activities address upfront priority safety issues in the radiation protection domain. The Technical developments address the construction of prototypes of Nb-Ti high-field magnets with large aperture, the study of a new H- ion source, field stabilization in superconducting accelerating structures, and novel tracking detector power distribution. The SLHC-PP project runs in parallel with an extensive SLHC-oriented R&D program, funded by CERN together with important contributions from many CERN member and non-member states. In order to prepare for the SLHC project implementation as a whole, the coordination tasks within SLHC-PP include the coordination of these developments carried out outside SLHC-PP. The main aim of SLHC-PP is to prepare the SLHC project for a decision on the approval of its implementation by 2011. 
Besides the justification of the SLHC by the physics results and operational experience from the first years of LHC running, the necessary ingredients for approval will include: the maturity of new technologies required for the SLHC, solutions for critical safety issues, and the formation of collaborations for the implementation, including the definition of work sharing and financial commitments. The SLHC-PP project is fully set up to address these issues and to prepare for approval by the CERN Council and by all other funding agencies involved.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRAIA-1-2014-2015 | Award Amount: 13.00M | Year: 2015
Particle physics is at the forefront of the ERA, attracting a global community of more than 10,000 scientists. With the upgrade of the LHC and the preparation of new experiments, the community will have to overcome unprecedented challenges in order to answer fundamental questions concerning the Higgs boson, neutrinos, and physics beyond the Standard Model. Major developments in detector technology are required to ensure the success of these endeavours. The AIDA-2020 project brings together the leading European infrastructures in detector development and a number of academic institutes, thus assembling the necessary expertise for the ambitious programme of work. In total, 19 countries and CERN are involved in this programme, which follows closely the priorities of the European Strategy for Particle Physics. AIDA-2020 aims to advance detector technologies beyond current limits by offering well-equipped test beam and irradiation facilities for testing detector systems under its Transnational Access programme. Common software tools, micro-electronics and data acquisition systems are also provided. This shared high-quality infrastructure will ensure optimal use and coherent development, thus increasing knowledge exchange between European groups and maximising scientific progress. The project also exploits the innovation potential of detector research by engaging with European industry for large-scale production of detector systems and by developing applications outside of particle physics, e.g. for medical imaging. AIDA-2020 will lead to enhanced coordination within the European detector community, leveraging EU and national resources. The project will explore novel detector technologies and will provide the ERA with world-class infrastructure for detector development, benefiting thousands of researchers participating in future particle physics projects, and contributing to maintaining Europe's leadership of the field.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2011.1.4-4 | Award Amount: 40.88M | Year: 2011
Vaccines so far have been developed mostly by following an empiric approach. To prevent and possibly cure unresolved and emerging infectious diseases we need to fully exploit the potential of the human immune system. Progress in science and technology makes it possible to achieve what was previously deemed impossible. The scope of this project is to produce the knowledge necessary to develop novel and powerful immunization technologies for the next generation of human vaccines. This goal requires a multidisciplinary approach in which diverse but complementary scientific disciplines and technologies converge. Therefore some of the most competitive European research groups from public institutions and biotechs have agreed to join forces in ADITEC, together with top US groups on systems biology and adjuvants, to support this enterprise. A systems biology approach will be used to study licensed and experimental vaccines in patient characterization studies and in clinical trials, to investigate the effect of adjuvants, vectors, formulations, delivery devices, routes of immunization, homologous and heterologous prime-boost schedules, as well as the impact of host factors such as age, gender, genetics and pathologies. Animal models will be used to complement human studies, and to select novel immunization technologies to be advanced to the clinic. To address these issues in a coordinated manner, ADITEC is organised on a matrix structure in which research themes and experimental approaches feed into each other. Training curricula will be created to impact the formation of the next generation of EU researchers in the field. ADITEC scientists and institutions are part of the Sclavo Vaccines Association (SVA), which is dedicated to vaccines and vaccine research. SVA, acting as the coordinating institution, guarantees the long-term commitment and sustainability of this initiative, beyond the duration of ADITEC itself.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EURO-3-2014 | Award Amount: 2.85M | Year: 2015
TransSOL is committed to the systematic, interdisciplinary and praxis-oriented analysis of European solidarity in times of crisis. It has three overarching objectives: (a) it will map and analyse solidarity in Europe by means of a cross-national database that comprises three surveys addressing the general population, organized civil society, and claims-making in the media; (b) it will gather systematic data on the contextual factors and engage in political and legal analyses to ascertain the influence of the socio-economic, political, and legal context on solidarity, in particular the impact of the crisis, the EU's political responses and target-group-specific public policies; and (c) it will identify and develop best practices of transnational solidarity, draft evidence-based policy recommendations, and engage in proactive dissemination and communication activities. The project comprises teams from Denmark, France, Germany, Greece, Italy, Poland, Switzerland and the UK, including scientists from various disciplines and civil society practitioners, thus promising to deliver interdisciplinary and comparative analyses, knowledge-transfer and evidence-based, practicable recommendations. The project will enable us to address the three topics of the call. First, TransSOL will provide the first rigorous and comprehensive analysis of transnational solidarity in Europe, its main forms, conditioning factors (e.g., individual features such as gender and social class, spatial inequalities, and contextual factors), and underlying conflicts about contending norms, identities, and interests. Secondly, the project will address the impact of Europe's cultural diversity and multiple identities on European solidarity by analysing public claims-making and debates within the media. And finally, we engage in a critical reflection about adequate policy responses, in particular about the potential of social investments to balance civic virtues of solidarity with public responsibilities.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FETOPEN-01-2016-2017 | Award Amount: 3.96M | Year: 2017
Controlling lightning is a long-time dream of mankind. The goal of the present project is to investigate and develop a new type of lightning protection based on the use of upward lightning discharges initiated through a high-repetition-rate multi-terawatt laser. The feasibility of the novel technique and the project's prospects of success are based on recent research providing new insights into the mechanism responsible for the guiding of electrical discharges by laser filaments, on cutting-edge high power laser technology, and on the availability of the uniquely suitable Säntis lightning measurement station in Northeastern Switzerland. The LLR consortium is ideally positioned to succeed and to raise the competitiveness of Europe in lightning control, as it relies on the integration of trans-disciplinary fields in laser development, nonlinear optics, plasma physics, remote sensing, and lightning: the project team is made up of leaders in the domains of high power nonlinear propagation of laser pulses in the atmosphere, laser control of electric discharges, lightning physics, high power laser development, and high-repetition-rate lasers. In addition, the largest European company in aeronautics brings its expertise in lightning direct effects and protection means on aircraft and infrastructures.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2015-ETN | Award Amount: 3.86M | Year: 2016
European societies face rapid social changes, challenges and benefits, which can be studied with traditional tools of analysis, but only with serious limitations. This rapid transformation covers changes in family forms, fertility, the decline of mortality and increase of longevity, and periods of economic and social instability. Owing to population ageing across Europe, countries are now experiencing the impact of these rapid changes on the sustainability of their welfare systems. At the same time, the use of space and residential mobility has become a key topic, with migration within EU countries and from outside Europe being at the center of the political agenda. Over the past decade research teams across Europe have been involved in the development and construction of longitudinal population registers and large research databases, while opening up avenues for new linkages between different data sources (i.e. administrative and health data), making it possible to gain an understanding of these fast societal transformations. However, working with these types of datasets requires advanced skills in both data management and statistical techniques. LONGPOP aims to create a network linking these research teams to share experiences, construct joint research, create a training track for specialists in the field and increase the number of users of these large, possibly underused, databases, making more scientists and stakeholders aware of the richness of the databases.
Agency: European Commission | Branch: H2020 | Program: MSCA-RISE | Phase: MSCA-RISE-2015 | Award Amount: 1.22M | Year: 2016
The main goal of the MediHealth project is to introduce a novel approach for the discovery of active agents of food plants from the Mediterranean diet and other global sources to promote healthy ageing. This will be achieved through an extended and well-balanced scheme of researchers' secondments between 5 universities and 4 enterprises from EU & Associated countries as well as 4 universities from Third countries. A mutual scientific project developed on the needs and interests of both sectors, exploiting the existing complementary expertise, will be the basis of this proposal. Plants from the Mediterranean diet and food plants from Third countries will be rationally selected and subjected to an integrated, interactive and comprehensive platform including in silico, in vitro (advanced cell-based assays) and in vivo (fly and mouse models) evaluation as well as metabolism assessment. Advanced analytical techniques will embrace the pharmacological evaluation process for the efficient isolation and identification of bioactive plant constituents. Pharmacological profiling of bioactive natural products as well as identification and synthesis of their metabolites will be carried out. Finally, to carry innovative products in the area of nutraceuticals/dietary supplements to the stage of development, process-optimization studies will be performed. Within this project, core scientific multidisciplinary knowledge from different research areas will be integrated, creating valuable synergies. Expertise will be transferred by means of the seconded researchers' training in environments with different research orientations where complementary skills are required. Special attention will be given to dissemination activities aiming at public awareness of the benefits of healthy diet(s). MediHealth aspires to comprise a successful model, considerably promoting researchers' competences and long-lasting collaboration between Industry and Academia, generating innovation potential at the European and global levels.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EO-3-2016 | Award Amount: 1.85M | Year: 2016
E2mC aims at demonstrating the technical and operational feasibility of integrating social media analysis and crowdsourced information within both the Mapping and Early Warning Components of the Copernicus Emergency Management Service (EMS). The Project will develop a prototype of a new EMS Service Component (Copernicus Witness), designed to exploit social media analysis and crowdsourcing capabilities to generate a new Product of the EMS Portfolio. The purpose of the new Copernicus Witness Service Component is to improve the timeliness and accuracy of geo-spatial information provided to Civil Protection authorities, on a 24/7 basis, during the overall crisis management cycle and, particularly, in the first hours immediately after the event. This will result in an early confirmation of alerts from running Early Warning Systems as well as a first rapid impact assessment from the field. The technological enabler of Copernicus Witness is the innovative and scalable Social&Crowd (S&C) Platform, developed by E2mC. Heterogeneous social media data streams (Twitter, Facebook, Instagram, and different data types: text, image, video, etc.) will be analysed and sparse crowdsourcing communities will be federated (crisis-specific ones such as Tomnod, HOT and SBTF, and generic ones such as Crowdcrafting and EpiCollect). Two demonstration loops will validate the usefulness of Copernicus Witness and the suitability of the S&C Platform, allowing the EC to evaluate possible Copernicus EMS evolution options. E2mC will perform demonstrations within realistic and operational scenarios designed by the Users involved within the Project (Civil Protection Authorities and Humanitarian Aid operators, including their volunteer teams) and by the current Copernicus EMS Operational Service Providers that are part of the E2mC Consortium. The involvement of social media and crowdsourcing communities will foster the engagement of a large number of people in supporting crisis management; many more citizens will become aware of Copernicus.
Agency: European Commission | Branch: FP7 | Program: CP-FP-SICA | Phase: HEALTH.2010.2.3.4-2 | Award Amount: 6.45M | Year: 2010
Neglected Infectious Diseases (NID) such as trypanosomiasis, leishmaniasis, schistosomiasis and soil-transmitted helminthiasis receive less than 5% of the global investment for tropical diseases research. Clinical praxis in disease-endemic countries (DEC) is rarely evidence based and does not make use of the latest innovations in diagnostic technology. NID-related research on diagnostics is particularly underfunded, and diagnostic tools are lacking for a number of NID. The aim of this proposal is to bridge the gap between existing technological innovation in diagnostics and clinical care practice for NID in resource-poor settings. The specific objectives are to develop simple, cost-effective diagnosis-treatment algorithms for three NID-related clinical syndromes: the persistent fever, the neurological and the digestive syndromes. Evidence-based algorithms for the primary care level will be designed with a patient-centred approach, following guidance from DEC stakeholders and making the best possible use of existing assays and treatments. Relevant diagnostic technology and diagnostic platforms will be introduced according to the specific epidemiological contexts in Africa and South Asia. The research consortium brings together a network of clinical epidemiologists, a diagnostics development group, and several partners from academia and SMEs. The consortium further includes work packages on reference laboratory services, economic evaluation, quality assurance and translation to policy. By developing accurate and affordable diagnostic platforms and by optimizing diagnosis-treatment algorithms, this project will rationalise treatment use, circumvent progression to severe presentations and thereby reduce NID morbidity/mortality and hinder the emergence of resistance. The project will result in two main deliverables: policy recommendations for health authorities in DEC, and a series of innovative diagnostic platforms.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.1.6 | Award Amount: 5.21M | Year: 2010
The flourishing of user-driven demand, the heterogeneity of networks and the multiplicity of new devices all mean that the Internet as we know it is reaching a saturation point. One of the main challenges of Future Internet research is to address the surge in complexity that service and network developers are facing.

Building on top of the ongoing actions to support large-scale experimentation for Future Internet protocols, TEFIS brings evaluation processes one step further. TEFIS provides an open platform to support large-scale experimentation with resource-demanding Internet services in conjunction with upcoming Future Internet networking technologies and user-oriented living labs.

It will act as a single access point to a variety of existing and next-generation experimental facilities.

The TEFIS outcomes will be:
- An open platform to integrate and use heterogeneous testbeds based on a connector model, exposed as a classical service.
- Integration of 8 complementary experimental facilities, including network and software testing facilities, and user-oriented living labs.
- A platform to share expertise and best practices.
- Core services for flexible management of experimental data and underlying testbed resources during the experiment workflow.
- A single access point to testbeds instrumented with a large number of tools to support the users throughout the whole experiment lifecycle (compilation, integration, deployment, dimensioning, user evaluation, monitoring, etc.) and allow them to work together by sharing expertise.

A specific action is foreseen via an Open Call to engage new experimentations and to gradually expand TEFIS. Combining the efforts of the software and service industry, the FIRE community and the user-centric Living Labs, TEFIS will foster research and business communities in collaboratively elaborating knowledge about the provisioning of Future Internet services.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2009-2.3.2-2 | Award Amount: 19.00M | Year: 2010
With 14.4 million prevalent cases and 1.7 million deaths, tuberculosis (TB) remains one of the most serious infectious diseases to date. An estimated 2 billion people are believed to be infected with Mycobacterium tuberculosis and at risk of developing disease. Multi- and extensively drug-resistant strains are increasingly appearing in many parts of the world, including Europe. While the Millennium Development Goals (MDGs) set for 2015 may be achieved with current control measures, reaching them would still leave a million people per year dying from TB. Much more effective measures, particularly more effective vaccines, will be essential to reach the target of eliminating TB by 2050. Two successive FP5- and FP6-funded projects, the Tuberculosis (TB) Vaccine Cluster (2000-2003) and TBVAC (2004-2008), have over the past decade made significant contributions to the global TB vaccine pipeline, with four vaccines (out of nine globally) being advanced to clinical stages. Both projects strongly contributed to the strengthening and integration of expertise and led to a European focus of excellence that is unique in the area of TB vaccine development. In order to sustain and accelerate the TB vaccine developments and the unique integrated excellence of TBVAC, a specific legal entity was created, named the TuBerculosis Vaccine Initiative (TBVI). The NEWTBVAC proposal is the FP7 successor of TBVAC and will be coordinated by TBVI. The proposal has the following objectives: 1) to sustain and innovate the current European pipeline with new vaccine discoveries and advance promising candidates to clinical stages; 2) to design new, second-generation vaccines based on new prime-boost strategies and/or new (combinations of) promising subunit vaccines, that will impact on the reduction of disease in exposed individuals; 3) to sustain and innovate the discovery, evaluation and testing of new biomarkers, which will be critically important for future monitoring of clinical trials.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 4.05M | Year: 2013
Finding novel solutions for energy storage is of high societal relevance: it is a prerequisite for the turnaround from fossil fuels and nuclear power to energy from renewable sources, since these sources are mostly intermittent. High-capacity energy storage solutions are also urgently needed to provide ecologically friendly mobility. Well-trained experts in energy storage are a prerequisite for the necessary technological development. ECOSTORE contributes to these targets by training 12 ESRs and 3 ERs in materials science and the use of novel metal hydrides for energy storage, both chemical (as hydrogen) and electrochemical (in batteries). The fellows will be trained in scientific skills by pursuing their own research projects (leading to a PhD in the case of ESRs) as well as in complementary skills important for their future career in academia or industry, such as management of scientific and technical projects, science-public communication and development of their own career and personality. ECOSTORE is an international network of partners, each with a high reputation in the field of hydrogen and electrochemical storage: 9 European research institutions, 3 European industrial companies and 2 Associated Partners from Japanese universities form a network of complementary scientific and techno-economic expertise. Novel borohydride- and nitride-based materials may allow for high energy storage densities in both hydrogen-based and electrochemical processes. A prerequisite for commercial introduction is cost-efficient large-scale production from abundant and relatively cheap raw materials, moving from extremely pure chemicals at laboratory scale to less pure raw materials at industrial scale. ECOSTORE aims at the scientific understanding of materials behaviour in hydrogen as well as in electrochemical processes and, based on this, at the scale-up of cost-effective materials production and at prototype testing to perform a techno-economic evaluation of the developments.
Agency: European Commission | Branch: FP7 | Program: CP-SICA | Phase: ENV.2008.1.1.6.1. | Award Amount: 4.29M | Year: 2009
The hydrological system of Northern India is based on two main phenomena: the monsoon precipitation in summer, and the growth and melt of the snow and ice cover in the Himalaya, also called the Water Tower of Asia. However, climate change is expected to alter these phenomena and will have a profound impact on snow cover, glaciers and the related hydrology, water resources and the agricultural economy on the Indian peninsula (Singh and Kumar, 1996; Divya and Mehrotra, 1995). It is a great challenge to integrate the spatial and temporal glacier retreat and snowmelt and changed monsoon patterns into weather prediction models under different climate scenarios. Furthermore, the output of these models will affect the input of the hydrological models. The retreat of glaciers and a possible change in monsoon precipitation and pattern will have a great impact on the temporal and spatial availability of water resources in Northern India. Besides climate change, socio-economic development will also influence the use of water resources, the agricultural economy and the adaptive capacity, since socio-economic development determines the level of adaptive capacity. It is a challenge to find appropriate adaptation strategies with stakeholders for each of the sectors agriculture, energy, health and water supply by assessing the impact outputs of the hydrological and socio-economic models. The principal aim of the project is to assess the impact of Himalayan glacier retreat and possible changes of the Indian summer monsoon on the spatial and temporal distribution of water resources in Northern India, and to provide recommendations for appropriate and efficient response strategies that strengthen adaptation to hydrological extreme events.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2009-2.3.1-2 | Award Amount: 7.82M | Year: 2010
Many results drawn from previous studies of the effect of antibiotic use on the emergence, selection and spread of antimicrobial resistance (AMR) have lacked a holistic view combining all aspects into one study. The SATURN project aims to study the impact of antibiotic exposure on AMR with a multidisciplinary approach that bridges molecular, epidemiological, clinical and pharmacological research. Two types of clinical studies will be conducted: first, a randomized trial will be performed to resolve an issue of high controversy (antibiotic cycling vs. mixing); second, 3 observational studies will be conducted to rigorously study issues surrounding the effect of antibiotic use on AMR that are not easily assessable through randomized trials. These clinical studies will serve as a platform for 2 complementary work packages (microbiology and pharmacology) that will perform important investigations relevant to this call. The work package focusing on molecular studies will generate new evidence about the changes effected by antibiotic therapy on commensal organisms or opportunistic pathogens in the oropharyngeal, nasal and gastro-intestinal flora, and will study AMR mechanisms and the dissemination of successful clones of fluoroquinolone-resistant, carbapenem-resistant or extended-spectrum beta-lactamase-harbouring Gram-negative bacteria, MRSA and fluoroquinolone-resistant viridans streptococci. The purpose of the pharmacodynamic study is to model the relationships between antibiotic exposure and AMR emergence over time for various classes of agents. In summary, the overarching rationale of SATURN is to improve methodological standards and conduct research that will help to better understand the impact of antibiotic use on acquisition, selection and transmission of AMR in different environments, by combining analyses of molecular, individual patient-level and ecological data. The anticipated results may guide clinical and policy decisions to ultimately reduce the burden of AMR in Europe.
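The abstract does not specify the pharmacodynamic model SATURN will use. As a minimal sketch of the general idea, the toy model below (all parameters hypothetical) lets a susceptible and a resistant strain compete logistically while the antibiotic kills only the susceptible strain through a Hill-type concentration-effect term, so sustained exposure enriches the resistant population:

```python
# Toy exposure-resistance pharmacodynamic sketch (hypothetical parameters):
# two-strain logistic competition where the antibiotic kills the susceptible
# strain via an Emax (Hill) effect, so exposure selects for resistance.

def emax_kill(conc, kmax=2.0, ec50=1.0, hill=2.0):
    """Antibiotic kill rate (per hour) as a Hill function of concentration."""
    return kmax * conc**hill / (ec50**hill + conc**hill)

def resistant_fraction(conc, hours=48.0, dt=0.01,
                       r_s=0.7, r_r=0.6, cap=1e9, s0=1e6, r0=1e3):
    """Euler-integrate susceptible (s) and resistant (r) densities; the
    resistant strain pays a fitness cost (r_r < r_s) but escapes killing.
    Returns the resistant fraction of the population after `hours`."""
    s, r = s0, r0
    for _ in range(int(hours / dt)):
        crowd = 1.0 - (s + r) / cap            # shared logistic competition
        ds = r_s * s * crowd - emax_kill(conc) * s
        dr = r_r * r * crowd
        s = max(s + ds * dt, 0.0)
        r = max(r + dr * dt, 0.0)
    return r / (s + r)
```

Under these assumed parameters, 48 hours at a concentration of twice the EC50 drives the resistant fraction close to 1, whereas without exposure it stays negligible; fitting such dose-response relationships per drug class is the kind of modelling the pharmacodynamic work package describes.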
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.3.2-1 | Award Amount: 3.90M | Year: 2008
Standard therapy of infection with the human immunodeficiency virus type 1 (HIV-1) is based on potent cocktails of drugs targeting viral proteins. This treatment is associated with severe side effects and is almost unaffordable for patients living in sub-Saharan Africa. Incomplete suppression of HIV replication results in drug resistance. Therefore, a continued research effort is required to develop more potent, cheaper and less toxic antivirals. It has become increasingly clear that HIV requires cellular proteins to serve as co-factors for viral replication. Our overall objective is to develop novel drugs by targeting co-factors required for HIV replication. The virus will find it difficult to develop antiviral resistance against drugs targeting the interaction between invariable cellular proteins and conserved viral protein domains. We will focus on the cellular proteins that mediate HIV trafficking, nuclear import and integration, such as Lens Epithelium Derived Growth Factor (LEDGF/p75), a novel co-factor of HIV-1 integration. THINC is composed of 3 virologists, 2 medicinal chemists, 1 virologist from South Africa, 1 structural biologist and 1 pharmaceutical company. Our first objective is to identify and validate novel co-factors of HIV trafficking, nuclear import and integration as novel targets for anti-HIV therapy. The second objective is to develop new drugs against the validated cellular target LEDGF/p75. The third objective is to perform this work in the perspective of those who will benefit most: HIV-infected people all over the world. The initial steps of target validation and hit identification will mainly be taken by academic groups, while the optimization and (pre)clinical development of drugs requires the participation of Tibotec, a European company devoted to the development of antiviral drugs. The project will also increase our generic understanding of protein-protein interactions (PPI).
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2011.2.3.1-3 | Award Amount: 15.65M | Year: 2011
Antibiotics are a mainstay of public health, but their use has increased exponentially leading to the emergence of antibiotic resistance. The R-GNOSIS (Resistance in Gram-Negative Organisms: Studying Intervention Strategies) project combines 5 international clinical studies, all supported by highly innovative microbiology, mathematical modelling and data-management, to determine - in the most relevant patient populations - the efficacy and effectiveness of cutting-edge interventions to reduce carriage, infection and spread of Multi-Drug Resistant Gram-negative Bacteria (MDR-GNB). All work-packages will progress science beyond the state-of-the-art in generating new and translational clinically relevant knowledge, through hypothesis-driven studies focussed on patient-centred outcomes. The 5 clinical studies will investigate the following interventions: A Point-Of-Care-Testing guided management strategy to improve appropriate antibiotic prescription for uncomplicated UTI in primary care. Gut decolonization in outpatients with intestinal carriage of MDR-GNB. A test and prescribe strategy, based on rapid diagnostic testing of faeces for MDR-GNB to optimize antibiotic prophylaxis in colo-rectal surgery. Contact Isolation of patients with ESBL-producing Enterobacteriaceae in general hospital wards. Three Decolonization strategies in ICUs. Seven laboratories across Europe will perform microbiological analyses, as well as unique quantitative experiments. All information will be integrated by 3 groups of mathematical modellers into highly innovative models to better understand and predict future trends and effects of interventions. The studies and analyses proposed in R-GNOSIS will generate a step-change in identifying evidence-based preventive measures and clinical guidance for primary care and hospital-based physicians and health-care authorities, to combat the spread and impact of infections caused by MDR-GNB in Europe.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 4.52M | Year: 2012
TALENT is a 4-year multi-site training network aiming at the career development of young researchers in the design, construction, manufacturing, testing and commissioning of innovative radiation-hard detector modules and novel scientific instruments. TALENT provides, to 15 ESRs and 2 ERs, training for a deep understanding of the complexity of scientific instrument building, from theoretical design through to industrial manufacturing cost-efficiency considerations. The network consists of 9 academic institutions and 8 industrial companies of excellence, providing the researchers with a multicultural, truly stimulating and interdisciplinary learning environment. The 2006 report of ESFRI and the European Strategy for Particle Physics set the CERN Large Hadron Collider (LHC) Upgrade and the enhancement of intersectoral R&D as priorities for keeping Europe's leading high-energy physics facilities and expertise at world-class level. TALENT will make substantial advances towards these objectives. The research programme contributes significantly to the CERN ATLAS R&D project, the Insertable B-Layer (IBL). Furthermore, the IBL's innovative detector modules and instrumentation are already showing major potential for industrial applications in satellite instruments, X-ray systems, sensor technologies, medical imaging and cancer therapy. To enhance intersectoral R&D and training collaboration as well as mutual knowledge transfer, and thus to speed up the development of the IBL technologies, major R&D efforts within TALENT are put into these industrial applications. The mutual R&D interests shared by the intersectoral consortium partners are likely to lead to a particularly creative multidisciplinary learning environment within TALENT. The chosen training approach will deepen the existing R&D collaborations between the partners and, more importantly, give the participating young researchers the expertise and understanding to build a successful international career in R&D in science, in industry or at their interface.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2010-ITN | Award Amount: 4.25M | Year: 2011
Gaia is the ESA cornerstone mission set to revolutionise our understanding of the Milky Way. This proposal will shape a critical mass of new expertise with the fundamental skills required to power the scientific exploitation of Gaia over the coming decade and beyond. The GREAT-ITN research theme is Unravelling the Milky Way, focused on four fundamental problems: unravelling the origin and history of our home galaxy; tracing the birthplace and understanding the astrophysical properties of the stellar constituents of our galaxy; deepening the understanding of planetary systems by linking the study of exoplanets to the origins of the solar system; and taking up the grand challenges offered by Gaia in the domains of the distance scale and the transient sky. The GREAT-ITN will deliver a training programme structured around these research themes to a core of new researchers, equipping them with the skills and expertise to become future leaders in astronomy or to enter industry. These skills are relevant across many of the key challenges facing us now, from climate change to energy security, which require well-trained people: the people this GREAT-ITN will deliver. The 12 GREAT-ITN partners in Europe, and 1 in China, each have world-leading expertise. 19 additional associate partners provide access to complementary expertise and facilities. The network includes three associates from the information technology industry: Microsoft, InterSystems and The Server Labs, each driving the new global on-line agenda. The European Space Agency provides the vital interface to the Gaia project, and exposure to the space industry. This powerful combination of expertise from industry and academia will lead to a new cluster of expertise in the area of Galactic astronomy, deliver powerful and effective training to a large pool of Early Stage Researchers, and cement a sustainable research community adding impact to Europe's leadership in this fundamental area of astrophysics.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: PEOPLE-2007-1-1-ITN | Award Amount: 2.98M | Year: 2008
Malaria exacts a devastating social and economic cost across the globe. Europe is at the forefront of the battle against this disease. It contains many of the leading malaria research groups, most of which are members of at least one of two consortia; BioMalPar, a Network of Excellence focused on basic research into the biology and pathology of malaria; and AntiMal, an integrated project aiming to develop a portfolio of new antimalarial drugs, urgently needed to meet the problems of drug-resistant malaria. To sustain the competitiveness of European malaria research into the future, there is a need to integrate these initiatives by the establishment of a broad-based training programme that emphasises the path from fundamental research to translation into disease control strategies. To address this need, it is proposed to establish an international training programme called InterMalTraining which will train a cohort of early stage researchers (ESR) to PhD level by means of collaborative malaria research projects. Each project will be jointly supervised by two principal investigators from separate partner institutions and usually different countries, affording a multicultural and multidisciplinary element to the training. Through this and additional broad-based, intensive training provided by experts from both the malaria research community and the industrial sector, it is intended to create a new generation of mobile, highly skilled young scientists who will be well acquainted with each other and with the leading malaria groups in Europe and beyond, enhancing their prospects for a career in their chosen area and suiting them to be future leaders in research institutions and industry. The cross-disciplinary nature of the training will have the breadth to ensure that it is applicable across and beyond the field of infectious diseases, allowing mobility of the young scientists into these areas and forging future links across the life sciences and into industry.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 3.72M | Year: 2013
Cultural Heritage (CH) is an integral element of Europe, vital for the creation of a common European identity, and one of the greatest assets for steering Europe's social and economic development and job creation. However, current research training activities in CH are fragmented and mostly designed to be single-discipline, failing to cover the whole lifecycle of Digital Cultural Heritage (DCH) research, which is by nature a multi-disciplinary and inter-sectoral research agenda. In ITN-DCH, for the first time worldwide, top universities, research centres, industries and CH stakeholders, end-users and standardization bodies will collaborate to train the next generation of researchers in DCH. The project aims to analyze, design, research, develop and validate an innovative multi-disciplinary and inter-sectoral research training framework that covers the whole lifecycle of digital CH research for cost-effective preservation, documentation, protection and presentation of CH. ITN-DCH targets innovations that cover all aspects of CH, ranging from tangible content (books, newspapers, images, drawings, manuscripts, uniforms, maps, artefacts, archaeological sites, monuments) to intangible content (e.g., music, performing arts, folklore, theatrical performances) and their inter-relationships. The project aims to boost the added value of CH assets by re-using them in real application environments (protection of CH, education, the tourism industry, advertising, fashion, films, music, publishing, video games and TV) through research on (i) new personalized, interactive, mixed and augmented reality enabled e-services, (ii) new recommendations in data acquisition, (iii) new forms of representations (3D/4D) of both tangible and intangible assets and (iv) interoperable metadata forms that allow easy data exchange and archiving.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.2.4.3-1 | Award Amount: 8.25M | Year: 2012
Background: A significant proportion of pre-diabetics show macro- and microvascular complications associated with hyperglycaemia. Although many trials have demonstrated the efficacy of lifestyle and pharmaceutical interventions in diabetes prevention, no trial has evaluated the extent to which mid- and long-term complications can be prevented by early interventions on hyperglycaemia. Aims: To assess the long-term effects on multiple complications of hyperglycaemia of early intensive management of hyperglycaemia with sitagliptin, metformin or their combination added to lifestyle intervention (LSI) (diet and physical activity), compared with LSI alone, in adults with non-diabetic intermediate hyperglycaemia (IFG, IGT or both). Study Design: Long-term, multi-centre, randomised, partially double-blinded, placebo-controlled, phase-IIIb clinical trial with prospective blinded outcome evaluation. Participants will be randomised to four parallel arms: 1) LSI plus 2 placebo tablets/day; 2) LSI plus 2 metformin tablets of 850 mg/day; 3) LSI plus 2 sitagliptin tablets of 50 mg/day; 4) LSI plus 2 tablets of a fixed-dose combination of sitagliptin 50 mg and metformin 850 mg/day. Active intervention will last for at least 3 years, with additional follow-up up to 5 years. Setting and population: Males and females with pre-diabetes (IFG, IGT or both) aged 45 to 74 years, selected from primary care screening programs in 15 clinical centres from 12 countries: Australia, Austria, Bulgaria, Germany, Greece, Italy, Lithuania, Poland, Serbia, Spain, Switzerland and Turkey.
(N=3000) Main Outcomes: The primary endpoint is a combined continuous variable, the microvascular complication index (MCI), composed of a linear combination of the Early Treatment Diabetic Retinopathy Study (ETDRS) scale score (based on retinograms), the urinary albumin-to-creatinine ratio, and a measure of distal small-fibre neuropathy (sudomotor test by SUDOSCAN), measured at the baseline visit and at the 36th- and 60th-month visits after randomisation. In addition, this project will include the evaluation of early novel serological biomarkers of systemic inflammation, early microvascular damage, non-alcoholic fatty liver disease, insulin sensitivity and insulin secretion, and measures of quality of life, sleep quality (somnograms) and neuropsychological evaluation. Vascular function and structure will be evaluated in a subset of participants (n=1000), including cIMT and microvascular endothelial function measured by EndoPAT. Expected results: By evaluating the effect of aggressive treatments in pre-diabetes for the early prevention of diabetes complications, this project has the potential to change the current paradigm of early management of hyperglycaemia. The ultimate goal is the development of a standardized core protocol for the early prevention of microvascular and other complications, reducing social costs not only in health care but also in disabilities at work.
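The abstract defines the MCI as a linear combination of three components but gives no coefficients. As a minimal sketch of how such a composite index is commonly built (the weights and the z-score standardization here are assumptions, not the trial's actual definition), each component can be standardized against the baseline cohort and summed:

```python
# Hypothetical sketch of a composite microvascular complication index:
# each component (ETDRS score, log urinary albumin/creatinine ratio,
# sudomotor conductance) is z-scored against the baseline cohort and
# combined linearly. Weights are illustrative, not the trial's values.
from statistics import mean, stdev

def mci(patient, baseline, weights=(1.0, 1.0, -1.0)):
    """patient: (etdrs, log_uacr, sudomotor); baseline: list of such tuples
    from the baseline visit. Returns a weighted sum of per-component
    z-scores; higher = more microvascular damage. Sudomotor conductance
    gets a negative weight because lower values mean worse nerve function."""
    index = 0.0
    for i, w in enumerate(weights):
        col = [p[i] for p in baseline]       # component across the cohort
        mu, sd = mean(col), stdev(col)
        index += w * (patient[i] - mu) / sd  # standardized contribution
    return index
```

Because every component is standardized, the index is unit-free, so a change between the baseline, 36-month and 60-month visits can be compared as a single continuous outcome across arms.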
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENERGY-2007-3.5-01 | Award Amount: 5.53M | Year: 2008
SOLAR-H2 brings together 12 world-leading European laboratories to carry out integrated basic research aimed at achieving renewable hydrogen (H2) production from environmentally safe resources. The vision is to develop novel routes for the production of a solar fuel, in our case H2, from the very abundant, effectively inexhaustible resources of solar energy and water. Our multidisciplinary expertise spans from molecular biology and biotechnology, via biochemistry and biophysics, to organometallic and physical chemistry. The project integrates two frontline research topics: artificial photosynthesis in man-made biomimetic systems, and photobiological H2 production in living organisms. H2 production by these methods on a relevant scale is still distant but has a vast potential and is of utmost importance for the future European economy. The scientific risk is high and the research is very demanding. Thus, our overall objective now is to explore, integrate and provide the basic science necessary to develop these novel routes and advance them toward new horizons. Along the first track, the knowledge gained from biochemical/biophysical studies of efficient enzymes will be exploited by organometallic chemists to design and synthesize biomimetic compounds for artificial photosynthesis. The design of these molecules is based on molecular knowledge about how natural photosynthesis works and how hydrogenase enzymes form H2. Along the second track, we perform research and development at the genetic level to increase our understanding of critical H2-forming reactions in photosynthetic algae and cyanobacteria. These studies are directly aimed at improving the H2-producing capability of the organisms using novel genetic and metabolic engineering. The project also involves research aimed at demonstrating the concept of photobiological H2 production in photobioreactors.
Agency: European Commission | Branch: FP7 | Program: CP-SICA | Phase: ENV.2008.4.1.4.1. | Award Amount: 8.01M | Year: 2009
The Black Sea Catchment is internationally known as a region of ecologically unsustainable development and inadequate resource management, which has led to severe environmental, social and economic problems. EnviroGRIDS @ Black Sea Catchment aims at building the capacities of regional stakeholders to use new international standards to gather, store, distribute, analyze, visualize and disseminate crucial information on past, present and future states of the environment, in order to assess its sustainability and vulnerability. The EnviroGRIDS @ Black Sea Catchment project addresses these issues by bringing in several emerging information technologies that are revolutionizing the way we are able to observe our planet. The Global Earth Observation System of Systems (GEOSS) is building a data-driven view of our planet that feeds into models and scenarios. EnviroGRIDS aims at building the capacity of scientists to assemble such a system in the Black Sea Catchment, the capacity of decision-makers to use it, and the capacity of the general public to understand the important environmental, social and economic issues at stake. To achieve its objectives, EnviroGRIDS will build an ultra-modern Grid-enabled Spatial Data Infrastructure (GSDI) that will become one component of GEOSS, compatible with the new EU directive on Infrastructure for Spatial Information in the European Union (INSPIRE). EnviroGRIDS will particularly target the needs of the Black Sea Commission (BSC) and the International Commission for the Protection of the Danube River (ICPDR), in order to help bridge the gap between science and policy.
Agency: European Commission | Branch: FP7 | Program: NoE | Phase: HEALTH-2009-2.3.2-1 | Award Amount: 16.97M | Year: 2009
This is a proposal from 55 partners from 36 institutes to form a NoE that will seek to integrate European malaria research directed towards a better understanding of the basic biology of the parasite, of its vector, and of the interactions between the parasite and both its mammalian host and its vectors. All the member institutes and researchers have demonstrated both their excellence and their ability to contribute to a successful network. The structure of the proposed network significantly evolves prior concepts of network structure, introducing new modes of research that have recently emerged. Comprising 4 research clusters, the core activities will include molecular cell biology of the parasite, host immunity, vector biology, population biology and systems biology. One arm of the network activities will be concerned with the timely and effective translation of research, respecting the IP rights of partner institutes. The network will also contribute significantly to the production of the next generation of malaria researchers through the operation of an expanded European PhD School for malaria research based at EMBL, with students having two supervisors based in different member states. Bespoke training courses for PhD students and network personnel will be offered throughout the duration of the network to maximise individual potential. To create a long-term benefit from network activities, a limited programme of post-doctoral fellowships within the network will be established. Furthermore, individual career mentoring facilities and an alumni association will continue to guide and engage network graduates. New members will be affiliated annually on a competitive basis, with an emphasis on young, emerging Principal Investigators. Through the establishment of an umbrella Foundation, active lobbying of government and non-government funding agencies, and the establishment of a charitable profile, the network will strive to become self-determining.
Agency: European Commission | Branch: FP7 | Program: MC-IAPP | Phase: FP7-PEOPLE-2011-IAPP | Award Amount: 2.24M | Year: 2012
NATPROTEC aims to discover, and carry to the stage of development, innovative products in the area of cosmeceuticals originating from European natural resources, using emerging and environmentally friendly technologies. These objectives will be implemented through an extended and balanced scheme of researcher exchanges and recruitments in both directions, and via a mutual scientific project developed around the needs and interests of both the industry and academia sectors. More specifically, the NATPROTEC scientific concept involves the discovery of novel natural products (NPs) originating from Mediterranean and Alpine biodiversity. Already existing chemical libraries will be exploited, incorporating modern high-throughput platforms (in silico and in vitro) for the rational and targeted selection of the optimum natural sources. Advanced analytical approaches and techniques will be applied for efficient, accelerated and advantageous isolation and identification of natural constituents, as well as for the quality assessment of the lead products. A broad spectrum of bioassays and novel analytical approaches will be incorporated for the evaluation of the skin-protecting, anti-ageing and anti-hyperpigmenting activity of all derived products. Attention will be given to the selection of the optimum source of the biomaterial to ensure sustainability, and to the development, optimisation and application of novel, green technologies for the production of the final lead products. Within this frame, core scientific knowledge and lead compounds for further development are expected to be produced, creating valuable synergies. Expertise will be transferred by means of the seconded researchers training in environments with different dynamics and orientations, where other skills are required. NATPROTEC aspires to become a successful model of efficient, long-lasting collaboration between industry and academia for the sustainable exploitation of existing know-how and produced knowledge.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: SSH.2013.5.2-1 | Award Amount: 6.39M | Year: 2014
Using an innovative interdisciplinary approach, MIME will generate an organised body of policy-relevant propositions addressing the full range of questions raised in the call. Our aim is to identify the language policies and strategies that best combine mobility and inclusion. MIME emphasises complementarity between disciplines, and brings together researchers from sociolinguistics, political science, sociology, history, geography, economics, education, translation studies, psychology, and law, who all have longstanding experience in the application of their discipline to language issues. The diverse concepts and methods are combined in an analytical framework designed to ensure their practice-oriented integration. MIME identifies, assesses and recommends measures for the management of trade-offs between the potentially conflicting goals of mobility and inclusion in a multilingual Europe. Rather than taking existing trade-offs as a given, we think that they can be modified, both in symbolic and in material/financial terms, and we argue that this objective can best be achieved through carefully designed public policies and the intelligent use of dynamics in civil society. Several partners have been involved in successful FP6 research, and key advances achieved there will guide the MIME project: languages are viewed as fluid realities in a context of high mobility of people, goods, services, and knowledge, influencing the way in which skills and identities are used and constantly re-shaped. The project integrates these micro-level insights into a macro-level approach to multilingual Europe. MIME results will be made widely available through a creative approach to dissemination, including training modules and the MIME Stakeholder Forum, allowing for sustained dialogue between academics, professional associations and local/regional authorities. The project culminates in a consensus conference where recommendations based on the project findings are adopted.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: INFRA-2011-2.1.1. | Award Amount: 10.17M | Year: 2011
Key questions in physics can be answered only by constructing a giant underground observatory to search for rare events and study terrestrial and astrophysical neutrinos. The Astroparticle Roadmap of ApPEC/ASPERA strongly supports this, recommending that a new large European infrastructure of 100,000-500,000 tonnes for proton decay and low-energy neutrinos be evaluated as a common design study, together with the underground infrastructure and eventual detection of accelerator neutrino beams. The latest CERN roadmap also states that a range of very important non-accelerator experiments takes place at the overlap of particle and astroparticle physics, exploring otherwise inaccessible phenomena, and that the Council will seek with ApPEC a coordinated strategy in these areas of mutual interest. Reacting to this by uniting scientists across Europe with industrial support to produce a very strong collaboration, the LAGUNA FP7 design study has had a very positive effect. Via the study of seven pre-selected locations (Finland, France, Italy, Poland, Romania, Spain and the UK), it enabled a detailed geo-technical assessment of the giant underground cavern needed, concluding that no geo-technical show-stoppers to cavern construction exist. Building on this, the present design study will address two challenges vital to making a final detector and site choice: (i) to determine the full cost of construction underground, commissioning and long-term operation of the infrastructure, and (ii) to determine the full impact of including long-baseline neutrino physics with beams from CERN.
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2013-IRSES | Award Amount: 279.30K | Year: 2014
The aim of the research exchange programme Ionophore Based Sensor Network (IBS-NETWORK) is to establish the scientific basis from which the utility and potential of IBSs can be extended into a range of new sectors. Achieving this requires combined expertise from a number of disciplines and backgrounds, including analytical science focused on IBSs, organic synthesis, materials science, microfluidics and statistics. All these disciplines exist within the IBS-NETWORK partnership. We bring together in this project a multidisciplinary team of 10 research groups from 8 institutions. These teams originate from 7 countries within and outside Europe, and all are recognised leaders in their respective fields. Our partners possess complementary skills and the knowledge necessary to develop new methodologies and sensing platforms. IBS-NETWORK includes researchers involved in basic research working alongside researchers with applied expertise. The collaborations of the partners involved in this project represent a significant audience of stakeholders from academia and industry, at national and international levels, creating high potential for the take-up and use of results. We expect the research exchange between these groups to achieve the following main goals: to greatly expand the utility and potential application of IBSs in a range of new sectors with recognised potential for benefit (climate action, health and wellbeing, marine and maritime research, and food security); to strengthen existing and create new research collaborations between experts across disciplines, techniques and equipment; and to provide the basis for long-term, sustainable research collaborations with the knowledge to take forward developments in the IBS field. These goals will be achieved through a balanced, two-way exchange of researchers and expertise between countries inside and outside Europe.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.4.3-9 | Award Amount: 7.09M | Year: 2008
The development of sensitive, non-invasive methods for the characterisation and quantification of beta-cell mass would greatly enhance our means for gaining understanding of the pathophysiology of diabetes and allow the development of novel therapies to prevent, halt and reverse the disease. The aim of this project is to develop and apply innovative approaches for beta-cell imaging, the emphasis being on beta-cell mass regulation (loss and neogenesis) with the perspective of entering initial clinical trials. For this purpose, our approach is to: (1) Focus on imaging technologies offering the potential to enter clinical trials during the runtime of the project. Since beta cells contribute only marginally (1-2%) to the total mass of the pancreas, a highly sensitive method for clinical imaging is required. BETA IMAGE will focus on positron emission tomography (PET), relying on chemical resolution, i.e. the specificity of a radiolabelled tracer molecule. The lead compound will be radiolabelled Exendin-4, developed in the consortium for GLP-1 receptor imaging. (2) Devise novel imaging strategies by generating labelled design molecules/peptides/nanobody molecules targeting newly identified beta-cell surface proteins. These targets will be identified using a Systems Biology approach. For high-throughput tracer development, a streamlined methodology will be established based on in vitro model systems and micro-/macroscopic in vivo real-time dynamic imaging of tracer distribution by optical coherence tomography and complementary small-animal PET and MRI. (3) Build on European excellence in tracer development using peptides, peptide-like and organic molecules for different imaging modalities. To achieve these ambitious goals, we have established a highly interdisciplinary and interactive project combining leading European research groups. In this way, unique expertise is achieved regarding tracer development and imaging, beta-cells/diabetes and target definition.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2012-1.1.24. | Award Amount: 23.40M | Year: 2013
Research accelerators are facing important challenges that must be addressed in the years to come: existing infrastructures are stretched to all performance frontiers, new world-class facilities on the ESFRI roadmap are starting or nearing completion, and strategic decisions are needed for future accelerators and major upgrades in Europe. While current projects concentrate on their specific objectives, EuCARD-2 brings a global view to accelerator research, coordinating a consortium of 40 accelerator laboratories, technology institutes, universities and industry to jointly address common challenges. By promoting complementary expertise, cross-disciplinary fertilisation and a wider sharing of knowledge and technologies throughout academia and with industry, EuCARD-2 significantly enhances multidisciplinary R&D for European accelerators. This new project will actively contribute to the development of a European Research Area in accelerator science by effectively implementing a distributed accelerator laboratory in Europe. Transnational access will be granted to state-of-the-art test facilities, and joint R&D effort will build upon and exceed that of the ongoing EuCARD project. Researchers will concentrate on a few well-focused themes with very ambitious deliverables: 20 T accelerator magnets, innovative materials for collimation of extreme beams, new high-gradient high-efficiency accelerating systems, and emerging acceleration technologies based on lasers and plasmas. EuCARD-2 will include six networks on strategic topics to reinforce synergies between communities active at all frontiers, extending the scope towards innovation and societal applications. The networks concentrate on extreme beam performance, novel accelerator concepts with outstanding potential, energy efficiency and accelerator applications in the fields of medicine, industry, environment and energy. 
One network will oversee the whole project to proactively catalyze links to industry and the innovation potential.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.9.7 | Award Amount: 3.53M | Year: 2012
We will contribute to a theory of the dynamics of multi-level complex systems by developing mathematical and computational formalisms for information processing in such systems. We will develop the formalism in the context of criticality, emergence and tipping points in multi-level systems, and apply it to real data. This should lead to better understanding and, more importantly, to improved predictive power for early warning: can we observe telltale signs of things to happen in the (near) future? We will relate the emergence of structures and collective effects to the existence of an information-driven phase transition. Emergent structures may mean the selection of preferred scales, the creation of new levels or annihilation of existing ones, or the occurrence of tipping points leading to extreme phenomena. We believe that these transitions are often self-organised, because they appear spontaneously, driven only by the dynamics of the system and the co-evolving topology of the interactions. We will create an experimental facility, a Computational Exploratory, which will allow us to implement our theoretical framework of information processing in multi-level complex systems and to apply it to real-life data. The theory will be validated on real-world applications involving large, heterogeneous multi-level datasets from the socio-economic domain (high-frequency FX data, datasets on interest rates, and social media data) and applied to study the emergence of scales and the detection and prediction of tipping points in real-life datasets. We address the questions of whether and why Nature has preferred scales and, if so, whether such emerging scales can be detected in real datasets. The impact of our theory on the understanding of the emergence of multi-level systems due to critical information processing is expected to be substantial. Our theory will offer new tools for predicting critical transitions and extreme events in real-life datasets.
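One widely used early-warning indicator of an approaching tipping point is rising lag-1 autocorrelation ("critical slowing down"). The sketch below is a generic illustration on synthetic data, not the project's own formalism: an AR(1) process whose memory coefficient slowly ramps up, mimicking a system drifting toward a transition.

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a 1-D series (early-warning indicator)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

rng = np.random.default_rng(0)
n = 4000
# AR(1) memory coefficient ramps up as the synthetic "tipping point" nears
phi = np.linspace(0.2, 0.95, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.standard_normal()

early = lag1_autocorr(x[:1000])   # far from the transition
late = lag1_autocorr(x[-1000:])   # close to the transition
print(early, late)                # the indicator rises markedly
```

In practice the indicator is tracked in a rolling window over observed data, and a sustained rise flags an elevated risk of a critical transition.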
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.9.9 | Award Amount: 11.78M | Year: 2013
The overarching goal of our project is to develop systems based on direct and deterministic interactions between individual quantum entities, which by involving large-scale entanglement can outperform classical systems in a series of relevant applications.

We plan to achieve that by improving technologies from atomic, molecular and optical physics as well as from solid-state physics, and by developing new ones, including combinations across those different domains. We will explore a wide range of experimental platforms as enabling technologies: from cold collisions or Rydberg blockade in neutral atoms to electrostatic or spin interactions in charged systems like trapped ions and quantum dots; from photon-phonon interactions in nano-mechanics to photon-photon interactions in cavity quantum electrodynamics and to spin-photon interactions in diamond color centers.

We will work on two deeply interconnected lines to build experimentally working implementations of quantum simulators and of quantum interfaces. This will enable us to conceive and realize applications exploiting those devices for simulating important problems in other fields of physics, as well as for carrying out protocols outperforming classical communication and measurement systems.
AIDA - Preserving old antibiotics for the future : assessment of clinical efficacy by a pharmacokinetic/pharmacodynamic approach to optimize effectiveness and reduce resistance for off-patent antibiotics
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.2.3.1-1 | Award Amount: 7.77M | Year: 2011
The AIDA project aims to answer the question of the clinical effectiveness and optimal dosing of 5 off-patent antibiotics for infections caused by multiple drug-resistant (MDR) bacteria in three randomized controlled clinical trials. In an era of increasing emergence of drug resistance (EDR) and a lack of new antibiotics, old off-patent antibiotics are increasingly being prescribed to patients. However, many of these were developed in an age before the advent of a structured process for drug assessment and approval, and in particular before the establishment of clinical efficacy and effectiveness in randomized controlled trials. In a multidisciplinary approach, the exposure-response relationships for each antibiotic will be elucidated through pharmacokinetic (PK), pharmacodynamic (PD) and microbiological studies, including studies of EDR. The project addresses the optimization of treatment of infections caused by MDR pathogens, which impose a major burden of disease in Europe and the rest of the world, by selecting 5 off-patent antibiotics that are increasingly being used without clear evidence with respect to their effectiveness, duration of therapy and issues of EDR. In the first trial, the efficacy of colistin alone is compared to colistin plus imipenem for severe infections caused by carbapenem-resistant bacteria. The second trial compares fosfomycin vs. nitrofurantoin for the treatment of lower urinary tract infection in women at high risk of antibiotic-resistant pathogens. In the third trial, oral antimicrobial treatment with minocycline plus rifampicin is compared with oral treatment with linezolid for complicated skin and soft tissue infections (cSSTI) due to MRSA. Exposure-response relationships, PK/PD and EDR issues will be addressed in a separate project component, an essential element of the research project that will interrelate synergistically with the clinical studies.
The results thereof will be used to refine exposure-response relationships, but also to study effects of exposure that are not readily observed in the trials. This will help delineate optimal exposures and drug dosing. The project addresses an urgent medical need that is critical both for individual patients and for society. An effective dissemination strategy is essential to communicate project results to the target groups, thereby supporting the project goal of preserving and strengthening the public health benefits of the studied off-patent antibiotics. The dissemination of project results to professional groups and the general public, communication to policymakers, and the implementation of results in national formularies are all important aspects.
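Exposure-response relationships of the kind studied here are commonly summarised with a sigmoid Emax (Hill) model. The sketch below is a generic illustration; the Emax, EC50 and Hill-coefficient values are arbitrary placeholders, not parameters from the AIDA trials.

```python
def sigmoid_emax(c, emax=100.0, ec50=4.0, hill=2.0):
    """Sigmoid Emax (Hill) model: drug effect as a function of exposure c.

    emax: maximal effect; ec50: exposure giving half-maximal effect;
    hill: steepness of the exposure-response curve. Values are illustrative.
    """
    return emax * c**hill / (ec50**hill + c**hill)

# The effect is half-maximal at c = EC50 and rises monotonically with exposure.
print(sigmoid_emax(4.0))                     # 50.0
print(sigmoid_emax(1.0), sigmoid_emax(16.0))
```

Fitting such a curve to trial PK/PD data is one standard way to turn measured exposures into dosing recommendations.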
News Article | April 14, 2016
During their life, plants constantly renew themselves. They sprout new leaves in the spring and shed them in the fall. No longer needed, damaged or dead organs such as blossoms and leaves are also cast off by a process known as abscission. By doing so, plants conserve energy and prepare for the next step in their life cycle. But how does a plant know when it is the right time to get rid of unnecessary organs? Researchers from the University of Geneva (UNIGE) and the University of Oslo (UiO) now shed light on this process. It is regulated by receptor proteins located at the surface of specific cells that form a layer around the future break point. When it is time to shed an organ, a small hormone binds to this membrane receptor and, together with a helper protein, the abscission process is initiated. Their findings are now published in the journal eLife.
News Article | January 14, 2016
On 12 January 2016, the Japan Aerospace Exploration Agency (JAXA) presented their ASTRO-H satellite to the media at the Tanegashima Space Center, situated on a small island in the south of Japan. The satellite, developed with institutions in Japan, the US, Canada and Europe, is now ready to be mounted on an H-IIA rocket for launch on 12 February. ASTRO-H is a new-generation satellite for high-energy astrophysics, designed to study some of the most powerful phenomena in the Universe by probing the sky in the X-ray and gamma-ray portions of the electromagnetic spectrum. Scientists will investigate extreme cosmic environments ranging from supernova explosions to supermassive black holes at the centres of distant galaxies, and the hot plasma permeating huge clusters of galaxies. ESA contributed to ASTRO-H by partly funding various elements of the four science instruments, by providing three European scientists to serve as science advisors and by contributing one scientist to the team in Japan. In return for ESA's contribution, European scientists will have access to the mission's data. Traditionally, Japan's astronomy satellites receive a provisional name consisting of the word 'ASTRO' followed by a letter of the Latin alphabet – in this case H, because it is the eighth project in JAXA's astronomical series. JAXA will announce the new name after launch. The satellite's four instruments span the energy range 0.3-600 keV, including soft X-rays, hard X-rays and soft gamma rays.
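To place the quoted 0.3-600 keV band on the electromagnetic spectrum, photon energy converts to wavelength via lambda = hc/E, with hc approximately 1.23984 keV·nm. This back-of-the-envelope check is an addition, not part of the article:

```python
# Photon energy (keV) to wavelength (nm): lambda = hc / E,
# using hc ~= 1.23984 keV*nm.
HC_KEV_NM = 1.23984

def kev_to_nm(e_kev):
    """Wavelength in nanometres of a photon with energy e_kev (keV)."""
    return HC_KEV_NM / e_kev

print(kev_to_nm(0.3))    # ~4.13 nm: soft X-rays
print(kev_to_nm(600.0))  # ~0.002 nm: soft gamma rays
```

The three-orders-of-magnitude span in wavelength is why the mission needs four different instruments to cover its full band.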
ESA's contribution consists of funding the procurement of a number of items on the various instruments, three European scientists who will serve as advisors to the mission's core science programme, and one full-time scientist based at the Institute of Space and Astronautical Science (ISAS), Japan, to support in-flight calibration, science software testing and data analysis. Support to European users will be provided by scientists at ESA's European Space Astronomy Centre in Madrid, Spain, and at the European Science Support Centre at the ISDC Data Centre for Astrophysics, University of Geneva, Switzerland.
News Article | November 11, 2016
And it seems the trend is likely to continue, with the latest discovery coming from a team of European scientists. Using data from the ESO's High Accuracy Radial velocity Planet Searcher (HARPS) and HARPS-N instruments, they detected an exoplanet candidate orbiting GJ 536 – an M-class red dwarf star located about 32.7 light-years (10.03 parsecs) from Earth. According to their study, "A super-Earth Orbiting the Nearby M-dwarf GJ 536", this planet is a super-Earth – a class of exoplanet with more than one but less than 15 times the mass of Earth. In this case, the planet has a minimum mass of 5.36 ± 0.69 Earth masses, an orbital period of 8.7076 ± 0.0025 days, and orbits its sun at a distance of 0.06661 AU. The team was led by Dr. Alejandro Suárez Mascareño of the Instituto de Astrofísica de Canarias (IAC). The discovery of the planet was part of his thesis work, conducted under Dr. Rafael Rebolo – who is also a member of the IAC, the Spanish National Research Council and a professor at the University of La Laguna. And while the planet is not a potentially habitable world, it does present some interesting opportunities for exoplanet research. As Dr. Mascareño shared with Universe Today via email: "GJ 536 b is a small super Earth discovered in a very nearby star. It is part of the group of the smallest planets with measured mass. It is not in the habitable zone of its star, but its relatively close orbit and the brightness of its star makes it a promising target for transmission spectroscopy IF we can detect the transit. With a star so bright (V 9.7) it would be possible to obtain good quality spectra during the hypothetical transit to try to detect elements in the atmosphere of the planet. We are already designing a campaign for next year, but I guess we won't be the only ones." The survey that found this planet was part of a joint effort between the IAC (Spain) and the Geneva Observatory (Switzerland). 
The data came from the HARPS and HARPS-N instruments, which are mounted on the ESO's 3.6 meter telescope at the La Silla Observatory in Chile and the 3.6 meter telescope at the La Palma Observatory in Spain. This was combined with photometric data from the All Sky Automated Survey (ASAS), which has observatories in Chile and Maui. The research team relied on radial velocity measurements of the star to discern the presence of the planet, as well as spectroscopic observations of the star taken over an 8.6-year period. From all this, they not only detected an exoplanet candidate with about 5 times the mass of Earth, but also derived information on the star itself – which showed that it has a rotational period of about 44 days, and a magnetic cycle that lasts less than three years. By comparison, our sun has a rotational period of 25 days and a magnetic cycle of 11 years, which is characterized by changes in the levels of solar radiation it emits, the ejection of solar material and the appearance of sunspots. In addition, a recent study from the Harvard-Smithsonian Center for Astrophysics (CfA) showed that Proxima Centauri has a stellar magnetic cycle that lasts for 7 years. This detection is just the latest in a long line of exoplanets being discovered around low-mass, low-luminosity, M-class (red dwarf) stars. Looking ahead, the team hopes to continue surveying GJ 536 to see if there is a planetary system, which could include some Earth-like planets, and maybe even a few gas giants. "For now we have detected only one planet, but we plan to continue monitoring the star to search for other companions at larger orbital separations," said Dr. Mascareño. "We estimate there is still room for other low-mass or even Neptune-mass planets at orbits from a hundred days to a few years." 
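The quoted period and orbital distance can be cross-checked with Kepler's third law, which also yields the stellar mass they imply. This is a back-of-the-envelope consistency check added here, not part of the study's own analysis:

```python
import math

# Kepler's third law: M_star = 4*pi^2 * a^3 / (G * P^2).
# Using the values quoted above for GJ 536 b:
# P = 8.7076 days, a = 0.06661 AU.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
AU = 1.495978707e11    # astronomical unit, m
DAY = 86400.0          # seconds per day
M_SUN = 1.989e30       # solar mass, kg

P = 8.7076 * DAY
a = 0.06661 * AU
m_star = 4 * math.pi**2 * a**3 / (G * P**2)
print(m_star / M_SUN)  # ~0.52 solar masses
```

The implied mass of roughly half a solar mass is exactly what one expects for an M-class red dwarf, so the reported orbit is internally consistent.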
The research also included scientists from the Astronomical Observatory at the University of Geneva, the University of Grenoble, the Astrophysical and Planetological Institute of Grenoble, the Institute of Astrophysics and Space Sciences in Portugal, and the University of Porto, Portugal.
News Article | February 15, 2017
At a ceremony held on 19 December at IPN Orsay, the French Physical Society awarded the 2015 Prix Joliot Curie for experimental particle physics to Maarten Boonekamp of the Institut de recherche sur les lois fondamentales de l'Univers (IRFU) at Saclay. The prize, awarded every two years, recognised Boonekamp’s contributions to the measurement of the W mass at the LHC’s ATLAS experiment, of which he has been a member since 2001. The event also saw the French Physical Society present the Paul Langevin Prize, which recognises distinguished theorists and has not been awarded for the past few years. The winners of the 2015 Langevin Prize are François Gelis of the Institut de Physique Théorique Saclay, for his work on quantum field theory in the strong-field regime and its applications to the non-equilibrium evolution of quark–gluon plasma, and Ubirajara van Kolck of the Institut de Physique Nucléaire Orsay, for his formulation of effective field theories in nuclear physics. The 2017 Wolf Prize in Physics has been awarded to Michel Mayor and Didier Queloz of the University of Geneva, for the discovery of an exoplanet orbiting a solar-type star. The pair made the discovery of “51 Pegasi b” in 1995 following continuous improvement of cross-correlation spectrographs over a period of 20 years. The prize citation says that the team led by Mayor and Queloz, who is also at the University of Cambridge in the UK, contributed to the discovery of more than 250 additional exoplanets and sparked a revolution in the theory of planetary systems. Giovanni Passaleva of the Istituto Nazionale di Fisica Nucleare (INFN) Firenze, Italy, has been appointed as the next spokesperson of the LHCb experiment, taking over from Guy Wilkinson. Passaleva, who will become the new spokesperson in July, completed his PhD on the L3 experiment at LEP in 1995 and has been a member of the LHCb collaboration since 2000. 
His research interests include electroweak and flavour physics, as well as solid-state and gaseous tracking detectors, while his detector responsibilities include project leader of the LHCb muon system. On 20 January, CERN Director-General Fabiola Gianotti took part in a panel discussion at the 2017 World Economic Forum in Davos, at which delegates addressed the top issues on the global science agenda. Gianotti reinforced the importance of fundamental research in driving technology and as a force for peaceful collaboration, and emphasised the need for open science. “Scientists have made good progress over the last years to engage the public, but we have to do more to reach out to people at all levels using the tools we have,” she said. “Knowledge belongs to mankind, it does not belong to the scientists.” On 19 January, the Institut Laue-Langevin (ILL) in Grenoble marked 50 years of providing beams of neutrons for scientific users across a range of disciplines. The ILL was founded by the governments of France and Germany in 1967 with the aim of creating an intense, continuous source of neutrons devoted exclusively to civil fundamental research. Its first neutron beams were produced in 1971, and two years later the UK joined as the ILL’s third associate member. Today, the institute has 10 scientific members: Spain, Switzerland, Austria, Italy, the Czech Republic, Sweden, Belgium, Slovakia, Denmark and Poland. Research at the ILL covers fundamental physics to materials science and biology. The facility, which has an annual budget of around €100 million and almost 2000 user visits per year, has played a role in 21,000 scientific publications so far during its lifetime and is expected to operate well into the 2020s. Boris Johnson, secretary of state for foreign and commonwealth affairs, United Kingdom of Great Britain and Northern Ireland, visited CERN on 13 January, during which he took in the ATLAS control room and the LHC tunnel. 
Following the formal accession of India as an associate Member State of CERN, Indian ambassador Amandeep Singh Gill visited CERN on 16 January. Here he is pictured with CERN Director-General Fabiola Gianotti holding the signed documents that will enable greater collaboration between India and CERN. Bernard Bigot, director-general of the ITER Organisation, which is responsible for the international fusion experiment under construction in France, visited CERN on 16 January. Bigot, who has a PhD in chemistry and has held several senior scientific roles in the French government, toured both CMS and ATLAS in addition to the LHC tunnel. Here he is pictured signing the guestbook with Frédérick Bordry, CERN’s director for accelerators and technology. Chief scientist of Quebec in Canada, Rémi Quirion, visited CERN on 22 January, during which he toured the LHC tunnel and experiments. Quirion received a PhD in pharmacology from Université de Sherbrooke in 1980 and was previously a professor at McGill University and scientific director of the Douglas Hospital Research Centre. On 23–26 January, more than 230 members of the international Deep Underground Neutrino Experiment (DUNE) collaboration met at CERN to discuss the project’s status and plans. A main focus of the meeting was to coordinate the assembly of prototype modules for the vast DUNE detector, which are being constructed in a new facility on the CERN site (see "ProtoDUNE revealed"). DUNE will comprise four detector modules with a total of 68,000 tonnes of liquid argon to detect neutrinos and look for rare subatomic phenomena such as proton decay. It will be situated 1.5 km underground at Sanford Underground Research Facility (SURF) in South Dakota, US. The experiment will be the target for intense beams of neutrinos and antineutrinos produced by a new facility to be built at Fermilab 1300 km away, and will address specific puzzles such as the neutrino mass hierarchy and CP violation in the neutrino sector. 
CERN is playing a significant role in the DUNE programme via its recently established neutrino platform (CERN Courier July/August 2016 p21). A collaboration agreement was signed between CERN and the US in December 2015, in which CERN committed to the construction of prototype DUNE detectors and the delivery of one cryostat for the experiment in the US. Two large “protoDUNE” detectors are now taking shape in a new building in the north area of the CERN site. DUNE aims to be for the neutrino what the LHC is for the Higgs boson, and enormous progress has been made in the past two years. Formed in early 2015, the collaboration now comprises 945 scientists and engineers from 161 institutions in 30 nations and is still growing, with about 60% of the collaborating institutions located outside the US. In September 2016, the US Department of Energy approved the excavation of the first caverns for DUNE, with preparatory work expected to begin at SURF this summer. A small, 3 × 1 × 1 m3 dual-phase demonstrator module constructed at CERN is also ready for filling and operation. One of the highlights of the CERN meeting was a tour of the construction site for the large protoDUNE detectors. The vessel for the cryostat of the 6 × 6 × 6 m3 single-phase liquid-argon prototype module is almost complete, and the construction of an identical cryostat for a dual-phase detector will start soon. Preparing for the installation of liquid-argon time-projection-chamber (TPC) detector components, which will start this summer, was one of the main focuses of the meeting. Both single- and dual-phase protoDUNE detectors are scheduled to be operational and take data with the tertiary charged-particle beam from the Super Proton Synchrotron in 2018. 
The DUNE collaboration is also starting to prepare a Technical Design Report (TDR) for the large underground detectors at SURF, and is working on the conceptual design for the DUNE near detector that will be placed about 55 m underground at the Fermilab site to measure neutrino interactions close to the source before the neutrinos start to oscillate. Discussions about the responsibilities for building the vast number of detector components for the DUNE far detectors have begun, and additional scientists and institutions are welcome to join the collaboration. The goal is to finish the TDR for review in 2019 and to begin the construction of the far-detector components in 2021, with the first detector modules at SURF operational in 2024. From 24 to 27 October 2016, accelerator experts from around the world gathered in Daresbury, UK, to discuss the status, challenges and future of circular high-luminosity electron-positron factories. Organised under ICFA and co-sponsored by the EuCARD-2 accelerator network, the “eeFACT2016” workshop attracted 75 participants from China, France, Germany, Italy, Japan, Russia, Switzerland, the UK and the US. Circular colliders have been a frontier technology of particle physics for half a century, providing more than a factor 10 increase in luminosity every 10 years. Several lower-energy factories are in operation: BEPC-II at IHEP Beijing, DAFNE at INFN Frascati and VEPP-2000 at BINP Novosibirsk. The SuperKEKB facility currently being commissioned in Japan (CERN Courier September 2016 p32) will mark the next step up in luminosity. Among other future projects, a super-charm-tau factory is being developed in Russia, while two ambitious high-energy circular Higgs-Z-W (and top) factories are being designed: the Circular Electron Positron Collider (CEPC) in China and the electron-positron version of the Future Circular Collider (FCC) at CERN. 
Even after 50 years of experience and development in the e+e– landscape, several game-changing schemes have been introduced in the past couple of years, such as colliding beams with a crab waist, large Piwinski angle and extremely low emittance. The crab-waist concept has already demonstrated its great merits at DAFNE. Other novel concepts include: the use of a double ring or partial double ring; magnet tapering; top-up injection; cost-effective two-in-one magnets; ultra-low beta function; “virtual crab waist”; and asymmetric interaction-region optics. Upcoming colliders like SuperKEKB and the upgraded VEPP-2000 will test the limits of these new schemes. In parallel, much progress is being made in the design and operation of storage-ring light sources, which share numerous topics of interest with the collider world. There is also a powerful synergy between a future large circular high-energy lepton collider such as CEPC or FCC-ee and a subsequent hadron collider installed in the same tunnel, called SPPC and FCC-hh, respectively. The projected performance of the future factories is further lifted by dramatic progress in accelerator technology such as superconducting radiofrequency (RF) systems, the efficiency of which has been revolutionised by novel production schemes such as nitrogen doping and thin-film Nb3Sn coating. Several novel klystron concepts are on track to boost the power-conversion efficiency of RF power generators, which will make the next generation of colliders truly green facilities. With the performance of future factories being pushed so hard, subtleties that were unimportant in the past now arise – in particular concerning beam–beam effects. Large future collider concepts such as FCC-ee and CEPC build on recent innovations and would greatly advance progress in fundamental physics at the precision frontier. 
At the same time new ideas for compact low-energy crab-waist colliders are emerging, which might offer attractive alternative paths for research. The first international workshop on Hadronic Contributions to New Physics Searches (HC2NP 2016) was held on 25–30 September 2016 in Tenerife, Spain, inaugurating a new series aimed at hadronic effects that interfere with beyond-the-Standard-Model (SM) searches. A multidisciplinary group of 50 physicists attended the event to review four timely topics: muon g-2, flavour anomalies, sigma-terms in dark-matter searches, and the proton radius puzzle. The anomalous magnetic moment of the muon (g-2) provides one of the most precise tests of the SM, and theory currently stands at 3.3 standard deviations from the experimental measurements. Updates on the new measurements starting in 2017 at Fermilab and J-PARC were presented, with prospects to reduce the current experimental uncertainties by a factor of four within the next few years. Several ways to improve the theoretical uncertainty, especially on the hadronic side, were discussed – including new lattice-QCD calculations of the vacuum polarization contribution – and prospects for new experimental measurements at BESIII were also reviewed. Anomalies in weak flavour transitions in hadrons are a hot topic, especially the B-meson decay anomalies measured at LHCb and the tantalising hints of lepton-universality violation in the so-called RK and RD* ratios. These signals should be validated by other B-decay modes, which requires new lattice calculations of form factors. Since new physics might not confine itself to one flavour sector, decays of other hadrons such as pions, kaons and baryons are also being scrutinized. Regarding dark matter, the sigma terms (nucleon form factors of fundamental interest) are one of the main uncertainties when interpreting direct searches. 
Old tensions in the values of these quantities persist, as seen in the mild discrepancy between the results of lattice QCD and those obtained using effective field theory or dispersive methods from experimental data. Recent developments in effective field theories now enable the resulting bounds from direct searches to be interpreted in the context of dark-matter searches at ATLAS and CMS. Finally, HC2NP addressed the proton charge radius puzzle – the five-standard-deviation discrepancy between the value measured for muonic versus normal hydrogen (CERN Courier October 2016 p7). Results from electron–proton scattering have become controversial because different values of the radius are extracted from different fits to the same data, while lattice calculations of the proton charge radius so far do not provide the required accuracy. Recent chiral perturbation theory calculations of proton polarisability effects in muonic hydrogen show that this effect is relatively small, and new experiments on muonic deuterium and helium show that the same discrepancy exists for deuterium but not for helium. With PSI due to perform a new experiment on the ground-state hyperfine splitting of muonic hydrogen, a factor-of-10 improvement in our understanding of proton-structure effects will be required. Given the success of the meeting, a new edition of HC2NP covering a selection of timely subtopics will be organised in Tenerife during 2018. Some 400 theorists and experimentalists convened in Thessaloniki, Greece, from 29 August to 3 September 2016 for the 12th Quark Confinement and the Hadron Spectrum conference. Initiated in 1994, the series has become one of the most important and well-attended forums in strong-interaction physics. 
The event (which this year included 40 plenary talks, 267 parallel talks and 33 posters) was organised in eight parallel sessions: vacuum structure and confinement; emergent gauge fields and chiral fermions; light quarks; heavy quarks; deconfinement; QCD and new physics; nuclear and astroparticle physics; and strongly coupled theories. Two additional sessions devoted to statistical methods and instrumentation were included this year. The event brought together physicists working on approaches ranging from lattice field theory to higher-order perturbative and resummation methods; from phenomenology to experiments; from the mechanisms of confinement to deconfinement in heavy-ion physics; and from effective field theories of QCD to physics beyond the Standard Model. Only a brief summary of the wealth of results presented can be mentioned here. Of particular interest was a talk exploring the connections between gravitational-wave results from LIGO and hadron physics: the gravitational-wave signature for neutron-star mergers depends strongly on the QCD equation of state (EOS), and different assumptions about the EOS lead to uncertainties on the merger time, wave amplitude, peak frequency and radiated energy. Fortunately, there are other ways of exploring the QCD EOS at high density, such as upcoming experiments at the new FAIR facility in Germany, RHIC in the US and NICA in Russia, which also complement studies of the low-density regime of the EOS with heavy-ion collisions at the LHC. Several talks placed an emphasis on anomalies with respect to the Standard Model. The chiral anomaly in the background magnetic field of heavy-ion collisions, for example, has also been observed in condensed-matter physics in “Dirac semimetals”. Other talks addressed flavour anomalies and whether they could be a signal of new physics or be described by standard QCD effects. 
The status of heavy-flavour production from protons to ions was presented and the quarkonium production mechanism was emphasised, including the production of charmonium-like exotics. A number of talks were dedicated to physics on the scale of the nucleon rather than the nucleus, including new approaches to the parton distributions in the proton from lattice QCD, field theories and global analyses, incorporating results from JLab and the LHC. The status of the proton radius puzzle also generated lively discussions. The conference was followed by a satellite workshop on new accelerator-based facilities that will provide precision measurements of confinement and deconfinement physics, demonstrating the health of the field.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SSH-2010-2.1-2 | Award Amount: 3.06M | Year: 2010
The effort to strengthen social cohesion and lower social inequalities is among Europe's main policy challenges. This means that local welfare systems are at the forefront of the struggle to address this challenge, and they are far from winning it. While the statistics show some positive signs, the overall picture still shows sharp and sometimes rising inequalities, a loss of social cohesion and failing policies of integration. But, contrary to what is sometimes thought, a lack of bottom-up innovation is not the issue in itself. European cities are teeming with new ideas, initiated by citizens, professionals and policymakers. The problem is, rather, that innovations taking place in the city are not effectively disseminated because they are not sufficiently understood. Many innovations are not picked up because their relevance is not recognised, or they fail after being introduced because they were not suited to the different conditions of another city in another country. In this project, we will look into this missing link between innovations at the local level and their successful transfer to and implementation in other settings. We will examine innovation in cities, not as a disconnected phenomenon, but as an element in a tradition of welfare that is part of particular socio-economic models and the result of specific national and local cultures. By contextualising innovations in local welfare, we will be better able to understand how they could work in other cities, for the benefit of other citizens. In short, the aim of the project is to examine, through cross-national comparative research, how local welfare systems affect social inequalities and how they favour social cohesion and sustainability. The results will be used, through strong interaction with stakeholders and urban policy recommendations, to link directly to the needs of practitioners.
News Article | March 30, 2016
It took less than a minute of playing League of Legends for a homophobic slur to pop up on my screen. Actually, I hadn't even started playing. It was my first attempt to join what many agree to be the world's leading online game, and I was slow to pick a character. The messages started to pour in. It was an unusual spelling, and the spaces may have been added to ease the word past the game's default vulgarity filter, but the message was clear. Online gamers have a reputation for hostility. In a largely consequence-free environment inhabited mostly by anonymous and competitive young men, the antics can be downright nasty. Players harass one another for not performing well and can cheat, sabotage games and do any number of things to intentionally ruin the experience for others — a practice that gamers refer to as griefing. Racist, sexist and homophobic language is rampant; aggressors often threaten violence or urge a player to commit suicide; and from time to time, the vitriol spills beyond the confines of the game. In the notorious 'gamergate' controversy that erupted in late 2014, several women involved in the gaming industry were subjected to a campaign of harassment, including invasions of privacy and threats of death and rape. League of Legends has 67 million players and grossed an estimated US$1.25 billion in revenue last year. But it also has a reputation for toxic in-game behaviour, which its parent company, Riot Games in Los Angeles, California, sees as an obstacle to attracting and retaining players. So the company has hired a team of researchers to study the social — and antisocial — interactions between its users. With so many players, the scientists have been able to gather vast amounts of behavioural data and to conduct experiments on a scale that is rarely achieved in academic settings. 
Whereas other game companies have similar research teams, Riot's has been remarkably open about its work — with players, with other companies and with a growing collection of academic collaborators who see multiplayer games as a Petri dish for studying human behaviour. “What's most interesting with Riot is not that they're doing it but that they're publicizing it and have an established way of sharing it with academics,” says Nick Yee, a social scientist and co-founder of Quantic Foundry, a video-game-industry consulting firm in Sunnyvale, California. Riot's findings have helped to reveal where the toxic behaviour comes from and how to steer players to be kinder to each other. And some say that the work may translate to digital venues outside the game. “The work they do is extensible to thinking about big questions,” says Justin Reich, an education researcher at the Massachusetts Institute of Technology in Cambridge, “not just how do we make online games more civil places, but how do we make the Internet a more civil place?” Jeffrey Lin, the lead designer of social systems at Riot, is the public face of its research programme. He has been playing video games online since he was about 11 years old and had long wondered why so many of his fellow gamers put up with toxic behaviour. “Everybody you talk to thinks of the Internet as this hate-filled place,” he says. “Why do we think that's a normal part of gaming experiences?” In 2012, Lin was finishing a PhD in cognitive neuroscience at the University of Washington in Seattle and was working for the game company Valve in nearby Bellevue when a friend and fellow gamer introduced him to the co-founders of Riot, Marc Merrill and Brandon Beck. They had recognized that toxic behaviour was a major drag on players' experience, and they wanted to solve the problem with science. So they hired Lin as a game designer, essentially giving him the keys to a juggernaut in the online gaming world. 
League of Legends, Riot's only game, was released in 2009 and currently attracts 27 million players each day. It is by far the most popular of a growing segment of games referred to as eSports, a world in which elite players form professional teams, win university scholarships and take part in million-dollar tournaments in sporting arenas. The final of League of Legends's 2015 world championship in Berlin drew 36 million viewers online and on television, rivalling the audience of the finals of some traditional sports. The game can be intimidating to newcomers. Players control one of more than 120 characters called champions, each of which has specific abilities, weaknesses and roles. Teams are usually made up of five players, who must cooperate to kill monsters and opponents, collect gold to purchase magical items, capture territory and eventually destroy the other team's base. Matches last about half an hour on average, so having a poorly performing player on a team can be aggravating. And the game requires coordination between players, for which it provides an in-game chat function. If someone makes a mistake, he or she will generally hear about it fast. Players can report their teammates for being toxic, and this can result in a temporary or permanent ban from the game. But working out how to distinguish a few frustrated grumbles or good-natured trash talk from the kind of vitriol that is worthy of punishment is a difficult task. To tackle it, Lin needed to make sure that he had a good picture of where such toxicity was coming from. So he got a team to review chat logs from thousands of games each day and to code statements from players as positive, neutral or negative. The resulting map of toxic behaviour was surprising. Common wisdom holds that the bulk of the cruelty on the Internet comes from a sliver of its inhabitants — the trolls. Indeed, Lin's team found that only about 1% of players were consistently toxic. 
But it turned out that these trolls produced only about 5% of the toxicity in League of Legends. “The vast majority was from the average person just having a bad day,” says Lin. They behaved well for the most part, but lashed out on rare occasions. That meant that even if Riot banned all the most toxic players, it might not have a big impact. To reduce the bad behaviour that most players experienced, the company would have to change how players act. Lin borrowed a concept from classic psychology. In late 2012, he initiated a massive test of priming, the idea that imagery or messages presented just before an activity can nudge behaviours in one direction or another. The Riot team devised 24 in-game messages or tips, including some that encourage good behaviour — such as “Players perform better if you give them constructive feedback after a mistake” — and some that discourage bad behaviour: “Teammates perform worse if you harass them after a mistake”. They presented the tips in three colours and at different times during the game. All told, there were 216 conditions to test against a control, in which no tips were given. That is a ridiculous number of permutations to test on people in a laboratory, but trivial for a company with the power to perform millions of experiments each day. Some of the tips had a clear impact (see ‘Civil engineering’). The warning about harassment leading to poor performance reduced negative attitudes by 8.3%, verbal abuse by 6.2% and offensive language by 11% compared with controls. But the tip had a strong influence only when presented in red, a colour commonly associated with error avoidance in Western cultures. A positive message about players' cooperation reduced offensive language by 6.2%, and had smaller benefits in other categories. Riot has released just a few of these analyses, so it is hard to make broad generalizations. 
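The tip experiment described above is a full factorial design. With 24 tips and three colours, the stated total of 216 conditions implies three presentation times (216 = 24 × 3 × 3); the timing labels below are invented for illustration, not taken from the article. A minimal sketch of how such a condition grid might be enumerated:

```python
from itertools import product

# Factor levels: 24 tips and 3 colours come from the article; the number of
# presentation times (3) is inferred from the stated total of 216 conditions.
tips = [f"tip_{i}" for i in range(24)]           # hypothetical tip identifiers
colours = ["red", "blue", "white"]               # colours other than red are assumed
timings = ["loading", "early_game", "mid_game"]  # assumed timing slots

# Full factorial: every (tip, colour, timing) combination is one condition,
# tested against a no-tip control group.
conditions = list(product(tips, colours, timings))
print(len(conditions))  # 216
```

At millions of games per day, even a 217-arm experiment (216 conditions plus control) accumulates large per-arm samples quickly, which is why the article calls the design trivial for Riot but impractical in a laboratory.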
From a scientific standpoint, says Lin, the results from the priming experiments were “epic”, and they opened the doors to many more research questions, such as how various tips and colours might influence players from different cultures. But the behavioural improvements were too modest and too fleeting to change the culture of the game. Lin reasoned that if he wanted to make the community more civil, then players would have to have a say in devising the norms. So Riot introduced the Tribunal, which gives players a chance to serve as judge and jury to their peers. In it, volunteers review chat logs from a player who has been reported for bad behaviour, and then vote on whether the offender deserves punishment. The Tribunal, which started in 2011, gave players a greater sense of control over establishing community norms, says Lin. And it revealed some of the things that triggered the most rebukes: homophobic and racial slurs. But players who were banned from the game were often unsure why they had been punished, and continued to act negatively when the bans were lifted. So Lin's team developed 'reform cards' to give feedback to banned players, and the company then monitored their play. When players were informed only of what kind of behaviour had landed them in trouble, 50% did not misbehave in a way that would warrant another punishment over the next three months. When they were sent reform cards that included the judgements from the Tribunal and that detailed the chats and actions that had resulted in the ban, the reform rate went up to 70%. But the process was slow; reform cards might not show up until two weeks to a month after an offence. “If you look at any classic literature on reinforcement learning, the timing of feedback is super critical,” says Lin. So he and his team used the copious data they were collecting to train a computer to do the work much more quickly. “We let loose machine learning,” Lin says. 
The automated system could provide nearly instantaneous feedback; and when abuse reports arrived within 5–10 minutes of an offence, the reform rate climbed to 92%. Since that system was switched on, Lin says, verbal toxicity among so-called ranked games, which are the most competitive — and most vitriolic — dropped by 40%. Globally, he says, the occurrence of hate speech, sexism, racism, death threats and other types of extreme abuse is down to 2% of all games. “If the numbers they put out there are correct and true, it seems to be working well,” says Jamie Madigan, an author in St Louis, Missouri, who writes about the psychology of gamers. And that's because the reprimands are specific, timely and easy to understand and act upon, he says. “That's classic psychology 101.” Riot's research team is constantly experimenting with other ways to improve interactions in the game. Sportsmanlike behaviour can earn players honour points and other rewards. Tinkering with chat features helped, too. And the team is planning to use the Tribunal to train the game's algorithms to detect sarcastic and passive-aggressive language in chats — a major challenge for machine learning. From the start, Riot has also made much of its data available for others to investigate. Jeremy Blackburn, an avid gamer and computer scientist who works at Telefonica Research and Development in Barcelona, Spain, mined data on 1.46 million Tribunal cases to develop his own machine-learning approach for predicting when player behaviours would be deemed toxic. Together with Haewoon Kwak at the Qatar Computing Research Institute in Doha, he found that the most important factor — beyond the specific words used in the toxic messages — was how well the opposing team performed1. Blackburn, who is interested in studying cyberbullying, hopes to look more at how different cultures judge behaviour. 
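The article does not give the details of Blackburn and Kwak's model, only that the strongest predictor beyond the words themselves was how well the opposing team performed. As a purely illustrative sketch of that general idea, here is a toy scorer that combines a chat-text feature with a match-context feature; the word list, weights and threshold are all invented:

```python
# Toy illustration only (not the published model): score a chat log by the
# rate of flagged terms, then nudge the score by a match-context feature,
# since a lopsided loss correlated with toxic judgements in the Tribunal data.

ABUSIVE_TERMS = {"noob", "uninstall", "trash"}  # hypothetical word list

def toxicity_score(chat_lines, opponent_win_margin):
    """Higher score = more likely to be judged toxic by reviewers."""
    words = [w.lower() for line in chat_lines for w in line.split()]
    if not words:
        return 0.0
    abuse_rate = sum(w in ABUSIVE_TERMS for w in words) / len(words)
    context = max(0.0, opponent_win_margin) * 0.1  # invented weight
    return abuse_rate + context

def predict_toxic(chat_lines, opponent_win_margin, threshold=0.05):
    return toxicity_score(chat_lines, opponent_win_margin) > threshold
```

For example, `predict_toxic(["you are trash uninstall now"], 0.8)` scores well above the threshold, while a benign line such as `predict_toxic(["good game everyone"], 0.0)` does not. A real system would learn the word weights and threshold from labelled Tribunal cases rather than hard-coding them.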
Some evidence, he says, suggests that it is common for Korean gamers to gang up on and berate the poorest-performing players, for example. League data may bear this out. “We saw there was a lot more pardon for this verbal-abuse category.” Rachel Kowert in Austin, Texas, is a research psychologist on the board of the Digital Games Research Association. She is impressed by the work and especially by Blackburn and Kwak's unfettered access. “It's awesome for the researchers. You can't put a price on real data,” she says. Other companies also have data that scientists would like. Blizzard Entertainment in Irvine, California, makes the popular online fantasy game World of Warcraft, which many regard as a treasure trove for data on complex social interactions. But few people outside the company have been able to work with the data, and most of those who do are subject to stiff non-disclosure agreements. (Blizzard did not respond to Nature's requests for comment.) By contrast, Riot talks about its data at gaming conferences, and when it collaborates with researchers there are few restrictions on publishing. It also has an outreach programme, visiting universities to establish collaborations. And last May, Lin presented data at the annual meeting of the Association for Psychological Science in New York City to drum up more interest. Even with those efforts, the company's research has yet to achieve broad recognition among behavioural scientists. “Hopefully they will come to more conferences where people are studying behaviour,” says Betsy Levy Paluck, a social psychologist at Princeton University in New Jersey. Although she was not familiar with Riot, she says that the company seems to be working out how to do high-powered, big-data research in psychology, which has been a major challenge. Daphné Bavelier, a cognitive neuroscientist at the University of Geneva in Switzerland, met Lin at the conference in New York City. 
Her research has suggested — to the joy of many gamers and the agony of their parents — that some games, particularly fast-paced first-person shooters, can improve a handful of cognitive abilities, such as visual attention, both within and outside the games2. She plans to collaborate with Riot to study how players tackle the steep learning curve in League of Legends. The team-based nature of the game could also be useful to scientists. Young Ji Kim, a social scientist at the Massachusetts Institute of Technology's Center for Collective Intelligence, was able to recruit 279 experienced teams from League of Legends to fill out surveys and work together on a battery of online tests that were designed to explore team dynamics and the factors that make teams successful. (By providing an in-game incentive worth about $15, Riot helped her team to get thousands of sign-ups in a couple of hours, she says.) The preliminary results suggested that the teams' rank in the game correlates with their collective intelligence — a measure that generally tracks with things such as social perceptiveness and taking equal turns in conversation. The enthusiasm that players show for participating in experiments such as these may be attributable to Lin, who writes frequently about Riot research and can often be found answering players' questions on Twitter and other social media. Being upfront and public about the efforts is important, says Bavelier. Although most digital companies run experiments on users, they are often less transparent. Facebook, for example, published a study about how behind-the-scenes tinkering with news feeds can manipulate user emotions3, and received significant backlash from users. “We need to learn from some of the mistakes of others to make sure that the users are aware of what we're doing,” says Bavelier. Riot has an internal institutional review board that evaluates the ethics of all its experiments. 
Although not a conflict-free arrangement, it at least suggests that the research is being reviewed with an eye towards participant protection. Academic collaborators also need to get approval from their local boards. Lin has lofty goals for his team's research and interventions. “Can we improve online society as a whole? Can we learn about how to teach etiquette?” he asks. “We're not an edutainment company. We're a games company first, but we're aware of how it could be used to educate.” Parents, lawmakers and some scientists have fretted for decades that video games, particularly violent ones, are warping the minds of children. But James Ivory, a communication scientist at Virginia Polytechnic Institute and State University, in Blacksburg, says that much of the attention on violence has missed the biggest impact that games have. “Researchers are slowly starting to wise up to the idea that it may not be as important to think of what it means for someone to pretend to be a soldier as whether they're spending their time spewing racial or homophobic slurs.” By the age of 21, the average young gamer will have logged thousands of hours of playing time. That fact alone makes dichotomies such as 'real world' and 'digital world' ring false — for many, game-playing is the real world. And, says Ivory, “the strongest influence these games have on people is how they interact with other people”. Some researchers are cautious about trying to apply lessons from the game to other settings. Dmitri Williams, a social scientist and founder of Ninja Metrics, an analytics company in Manhattan Beach, California, warns that games have very specific incentive structures, which could limit how well these experiences map to the wider world. “People behave well in real life because if they offend someone or screw up, they have to deal with the consequences.” So, the manipulations that work to curb bad behaviour in League may be meaningless elsewhere. 
And there are still considerable challenges for Riot. Players continue to complain about toxic behaviour or what they deem to be unwarranted punishments. And a blog called 'League of Sexism' argues that the suggestive portrayal of female characters in the game contributes to a strong current of sexism in the player community. “It's difficult for players to identify sexist behaviour when sexism is built into the game's very imagery,” says a representative for the blog, who wished to remain anonymous. Although Lin's efforts are “admirable and likely industry-leading”, the representative says, many games are still “awash with verbal harassment, griefing and overall negative behaviour from teammates and opponents”. Lin says that Riot artists are aware of these concerns and that they have made efforts to portray female characters in a stronger and more-powerful way. Although Riot boasts that serious toxic behaviour infects only 2% of games, somehow I managed to experience it within a minute of playing for the first time. But immediately after “FA GO TT” popped up on my screen, something interesting happened. Another player chimed in with, “Calm down”. Perhaps it was a sign that Lin's efforts to engineer a more civil, self-policing digital space are starting to work. Or maybe it was just a friendly teammate reminding us all that it's just a game.
News Article | February 20, 2017
The use of nanoparticles is becoming increasingly widespread in the world of biomedicine. This rapidly-evolving technology offers hope for many medical applications, whether for diagnosis or therapies. In oncology, for example, a growing body of research suggests that, thanks to nanoparticles, treatment will soon become more precise, more effective and less painful for patients. One potential stumbling block, however, is that the way nanoparticles interact with the immune system has remained unclear and unpredictable, restricting their potential medical use. Now, researchers from the universities of Geneva (UNIGE) and Fribourg (UNIFR), both in Switzerland, are close to solving the problem. They have devised a rapid screening method for selecting the most promising nanoparticles, thereby fast-tracking the development of future treatments. In less than a week, they can determine whether or not nanoparticles are compatible with the human body – an analysis that previously required several months of work. This discovery, which is described in a paper in Nanoscale, may well lead to the swift, safe and less expensive development of nanoparticles for medical applications. Nanoparticles measure between 1nm and 100nm in size, approximately the size of a virus. Their very minuteness means that they have the potential to be used in a wide range of medical applications, such as serving as diagnostic markers or delivering therapeutic molecules to the exact spot in the body where the drug is intended to act. Before being applied to the medical field, however, nanoparticles must prove that they are safe for the human body and are capable of bypassing the immune system. "Researchers can spend years developing a nanoparticle, without knowing what impact it will have on a living organism," says Carole Bourquin, professor in the medicine and science faculties at UNIGE and project leader. 
"So there was a real need to design an effective screening method that could be implemented at the beginning of the development process. Indeed, if the nanoparticles aren't compatible, several years of research are simply thrown away." When any foreign element enters the body, including nanoparticles, the immune system is activated. Macrophages are always found on the front line; these are large cells that "ingest" invaders and trigger the immune response. The way macrophages react to the nanoparticle under investigation then determines the biocompatibility of the product. "When you begin to develop a new particle, it's very difficult to ensure that the recipe is exactly the same every time," points out Inès Mottas, first author of the paper. "If we test different batches, the results may differ. Hence our idea of finding a way to test three parameters simultaneously – and on the same sample – to establish the product's biocompatibility: its toxicity, its ability to activate the immune system and the capacity of the macrophages to ingest them." The ideal medical nanoparticle should therefore not be toxic (it should not kill the cells); should not be completely ingested by the macrophages (so that it retains its ability to act); and should limit the activation of the immune system (to avoid adverse side-effects). Until now, evaluating the biocompatibility of nanomaterials was a laborious task that took several months and posed reproducibility problems, since not all the tests were performed on the same batch of particles. Bourquin and her team have now used flow cytometry with macrophages to determine the three essential elements in a safe and standardized manner, and in record time. "The macrophages are brought into contact with the nanoparticles for 24 hours, and are then passed in front of the laser beams. The fluorescence emitted by the macrophages makes it possible to count them and characterize their activation levels," explains Mottas. 
"Since the particles themselves are fluorescent, we can also measure the amount ingested by the macrophages. Our process means we can test the three elements simultaneously, and we only need a very small amount of particles. We can obtain a comprehensive diagnosis of the nanoparticle submitted to us in two or three days." This method is now part of the work carried out within the National Centres of Competence in Research (NCCR) ‘Bio-Inspired Materials’ and is already a great success with scientists striving to develop new particles, allowing them to select the most promising candidates quickly. As well as reducing the cost of research, this new approach also limits the use of animal testing. Furthermore, it is opening the door to increasingly personalized treatment of certain pathologies. For example, by testing nanoparticles on tumor cells isolated from a particular patient, it should theoretically be possible to identify the most effective treatment for that patient. This story is adapted from material from the University of Geneva, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier.
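The screening described above reduces to a three-way decision rule: pass a particle batch only if viability is high, immune activation is limited, and uptake is incomplete. The sketch below is a hypothetical illustration of such a rule, not the authors' pipeline; the threshold values and batch names are invented for the example and do not come from the Nanoscale paper.

```python
# Illustrative sketch (not the published analysis): combine the three
# flow-cytometry readouts into a single biocompatibility call.
# All thresholds below are hypothetical values chosen for illustration.

def biocompatibility_call(viability, activation, uptake,
                          min_viability=0.8,   # hypothetical: >= 80% of macrophages survive
                          max_activation=0.3,  # hypothetical: <= 30% of macrophages activated
                          max_uptake=0.5):     # hypothetical: <= 50% of particles ingested
    """True if a batch passes all three criteria: low toxicity,
    limited immune activation, and incomplete macrophage uptake."""
    return (viability >= min_viability and
            activation <= max_activation and
            uptake <= max_uptake)

# One entry per batch: (viability, activation, uptake) as fractions,
# as a cytometry pipeline might report them.
batches = {"NP-A": (0.95, 0.10, 0.40), "NP-B": (0.60, 0.55, 0.90)}
calls = {name: biocompatibility_call(*vals) for name, vals in batches.items()}
```

In practice the three fractions would come from gated cytometry data per batch; here they are hard-coded to show the decision logic only.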
Agency: European Commission | Branch: H2020 | Program: SGA-RIA | Phase: FETFLAGSHIP | Award Amount: 89.00M | Year: 2016
This project is the second in the series of EC-financed parts of the Graphene Flagship. The Graphene Flagship is a 10-year research and innovation endeavour with a total project cost of 1,000,000,000 euros, funded jointly by the European Commission and member states and associated countries. The first part of the Flagship was a 30-month Collaborative Project, Coordination and Support Action (CP-CSA) under the 7th framework programme (2013-2016), while this and the following parts are implemented as Core Projects under the Horizon 2020 framework. The mission of the Graphene Flagship is to take graphene and related layered materials from a state of raw potential to a point where they can revolutionise multiple industries. This will bring a new dimension to future technology: a faster, thinner, stronger, flexible and broadband revolution. Our programme will put Europe firmly at the heart of the process, with a manifold return on the EU investment, both in terms of technological innovation and economic growth. To realise this vision, we have brought together a large European consortium with about 150 partners in 23 countries. The partners represent academia, research institutes and industry, and work closely together in 15 technical work packages and five supporting work packages covering the entire value chain from materials to components and systems. As time progresses, the centre of gravity of the Flagship moves towards applications, which is reflected in the increasing importance of the higher (system) levels of the value chain. In this first core project the main focus is on components and initial system-level tasks. The first core project is divided into 4 divisions, which in turn comprise 3 to 5 work packages on related topics. A fifth, external division acts as a link to the parts of the Flagship that are funded by the member states and associated countries, or by other funding sources. This creates a collaborative framework for the entire Flagship.
Negrini S.,University of Geneva |
Gorgoulis V.G.,University of Geneva |
Halazonetis T.D.,University of Geneva |
Halazonetis T.D.,National and Kapodistrian University of Athens
Nature Reviews Molecular Cell Biology | Year: 2010
Genomic instability is a characteristic of most cancers. In hereditary cancers, genomic instability results from mutations in DNA repair genes and drives cancer development, as predicted by the mutator hypothesis. In sporadic (non-hereditary) cancers the molecular basis of genomic instability remains unclear, but recent high-throughput sequencing studies suggest that mutations in DNA repair genes are infrequent before therapy, arguing against the mutator hypothesis for these cancers. Instead, the mutation patterns of the tumour suppressor TP53 (which encodes p53), ataxia telangiectasia mutated (ATM) and cyclin-dependent kinase inhibitor 2A (CDKN2A; which encodes p16INK4A and p14ARF) support the oncogene-induced DNA replication stress model, which attributes genomic instability and TP53 and ATM mutations to oncogene-induced DNA damage. © 2010 Macmillan Publishers Limited. All rights reserved.
Ginovart N.,University of Geneva |
Kapur S.,King's College London
Handbook of Experimental Pharmacology | Year: 2012
This review summarizes the current state of knowledge regarding the proposed mechanisms by which antipsychotic agents reduce the symptoms of schizophrenia while giving rise to adverse side effects. The first part summarizes the contribution of neuroimaging studies to our understanding of the neurochemical substrates of schizophrenia, putting emphasis on direct evidence suggestive of a presynaptic rather than a postsynaptic dysregulation of dopaminergic neurotransmission in this disorder. The second part addresses the role of D2 and non-D2 receptor blockade in the treatment of schizophrenia and highlights a preponderant role of D2 receptors in the mechanism of antipsychotic action. Neuroimaging studies have defined a narrow, but optimal, therapeutic window of 65-78% D2 receptor blockade within which most antipsychotics achieve optimal clinical efficacy with minimal side effects. Some antipsychotics, though, do not conform to that therapeutic window, notably clozapine. The reasons for its unexcelled clinical efficacy despite subthreshold levels of D2 blockade are unclear, and current theories on clozapine's mechanisms of action are discussed, including the transiency of its D2 receptor blocking effects and its preferential blockade of limbic D2 receptors. Evidence is also highlighted for considering the use of extended antipsychotic dosing to achieve transiency of D2 blockade as a way to optimize functional outcomes in patients. We also present some critical clinical considerations regarding the mechanisms linking dopamine disturbance to the expression of psychosis and its blockade to the progressive resolution of psychosis, keeping in perspective the speed and onset of antipsychotic action. Finally, we discuss potential novel therapeutic strategies for schizophrenia. © 2012 Springer-Verlag Berlin Heidelberg.
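The 65-78% occupancy window can be made concrete with the standard one-site receptor binding relation commonly used to interpret PET occupancy data. This is a generic textbook model, not the review's own analysis, and the EC50 value below is an arbitrary placeholder in unspecified concentration units.

```python
# Sketch of the one-site occupancy model: occ = C / (C + EC50).
# EC50 here is a hypothetical placeholder, not a value from the review.

def d2_occupancy(conc, ec50):
    """Fractional D2 receptor occupancy at drug concentration `conc`."""
    return conc / (conc + ec50)

def conc_for_occupancy(occ, ec50):
    """Invert the model: concentration needed for a target occupancy."""
    return occ * ec50 / (1.0 - occ)

EC50 = 1.0  # hypothetical, arbitrary concentration units
low = conc_for_occupancy(0.65, EC50)   # lower edge of the 65-78% window
high = conc_for_occupancy(0.78, EC50)  # upper edge of the window
```

Under this model the concentration ratio between the two edges of the window (`high / low`) is fixed regardless of EC50, which illustrates why the therapeutic window is narrow in dose terms.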
Gorgoulis V.G.,National and Kapodistrian University of Athens |
Halazonetis T.D.,University of Geneva
Current Opinion in Cell Biology | Year: 2010
In the late 1990s, it was shown that activated oncogenes are able to induce senescence. Since then, large leaps in understanding this phenomenon have been achieved. There is substantial evidence supporting oncogene-induced senescence (OIS) as a potent antitumor barrier in vivo. Multiple pathways participating in cell cycle regulation, DNA damage signaling, immune response and bioenergetics regulate the process. Despite these beneficial effects, the senescent cell is thought to promote carcinogenesis and age-related disease in a non-autonomous manner. Here, we highlight work dealing with all these aspects and discuss the studies proposing therapeutic exploitation of OIS. © 2010 Elsevier Ltd.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: KBBE-2008-1-1-01 | Award Amount: 4.12M | Year: 2009
This proposal, entitled Acquired Environmental Epigenetics Advances: from Arabidopsis to maize (acronym: AENEAS), aims to assess the impact of environmental conditions on epigenetic states in the model plant Arabidopsis thaliana and then transfer that knowledge to maize (Zea mays), an important European crop. Advances in understanding the detailed mechanisms of epiallele formation in response to environmental cues, and their heritable maintenance, in a model plant such as Arabidopsis will be the starting objective of the AENEAS proposal. To this end, we will focus on three epigenetic regulatory pathways that have been well characterized for their interaction with environmental signals in mediating changes to the epigenome: the autonomous, the small RNA and the CpG methylation pathways. The outcome of this research activity will be a road map for plant environmental epigenetics, necessary for further progress of basic research in this area and for the transfer of the knowledge to crop plants. Concomitantly, the constitution of an Environmental Epigenetics platform for maize will start with the development of tools indispensable for the shift of epigenetic research from Arabidopsis to maize. This will be achieved by the functional characterization of maize mutants for epi-regulators belonging to the three pathways studied in Arabidopsis. The tools will comprise: maize epi-regulator mutants, their targets, and information about their interaction with environmental cues for epiallele formation and inheritance across generations. It is our opinion that the deliverables from AENEAS will be the progenitors of the next generation of breeding programs, based on the exploitation of environmentally induced epigenetic variability. In addition, we will conduct a comparative genomics analysis of data arising from the project to generate comparative models for environmental epigenomics in two evolutionarily distinct species, Arabidopsis and maize.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 3.41M | Year: 2013
CALIPSO is a genuinely interdisciplinary and intersectoral research network composed of nine academic research institutions and three industrial partners, thus presenting an example of true translational research and of training young researchers in communicating and transferring achievements from different model organisms directly to industrial partners. CALIPSO aims at identifying environmentally triggered regulatory calcium signals and protein phosphorylation events that control photosynthesis and metabolism. CALIPSO partners work with a wide range of different organisms covering the full phylogenetic spectrum from algae to higher plants, including economically important crops. They combine a wide spectrum of the newest technologies in molecular biology, biochemistry, proteomics, metabolomics, genetics, bioinformatics and systems biology to uncover how photosynthetic organisms acclimate to changing environmental conditions or stress. This combination of scientific expertise and industrial applications is one of the major strengths of CALIPSO, exposing the participating researchers to different schools of thought. The active participation of Bayer CropScience Gent and Ecoduna as full network partners, and Photon Systems Instruments as an associated partner, will enable intersectoral industry-academia cooperation with the long-term objectives of (i) improving the yield and stress robustness of crops and (ii) developing microalgal biotechnology. The integrated, systematic training programme of CALIPSO will boost the future employability of the young researchers through the acquisition of technical skills for their work in academia or the private sector, as well as essential complementary skills for their future career. The training programme comprises three workshops on state-of-the-art techniques and one on industrially relevant skills. This is completed by secondments to partner laboratories and industry, and by network-wide training events in further complementary skills.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.1.6 | Award Amount: 5.29M | Year: 2008
PERIMETER's main objective is to establish a new paradigm of user-centricity for advanced networking. In contrast to network-centric approaches, user-centric strategies could achieve true seamless mobility. Putting the user at the centre rather than the operator enables the user to control his or her identity, preferences and credentials, so that seamless mobility is streamlined, enabling mobile users to be Always Best Connected in multiple-access, multiple-operator networks of the Future Internet. For that, PERIMETER will develop and implement protocols designed to cope with the increased scale, complexity, mobility and requirements for privacy, security, resilience and transparency of the Future Internet. These include appropriate mechanisms for network selection based on Quality of Experience, and innovative implementations of Distributed A3M protocols for Fast Authentication, Authorisation and Accounting based on privacy-preserving digital identity models. All these mechanisms will be designed to be independent of the underlying networking technology and service provider, so that fast, inter-technology handovers will be possible. PERIMETER will also develop and implement middleware that supports generic Quality of Experience models, signalling and content adaptation, as well as exemplary extension applications and services for user-centric seamless mobility. The paradigms of user-centric seamless mobility, the middleware components and the integrated applications and services will be tested in two large-scale interconnected testbeds with real users, in three cycles of increasingly complex scenarios. The results will be used to assess user-centricity. The realization of the user-centric paradigm will revolutionise mobile communications. It will impact seamless mobility, issues of security and privacy, standards and future research, and it will maintain Europe's leading position in the race to define and develop the network and service infrastructures of the Future Internet.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRADEV-1-2014 | Award Amount: 3.99M | Year: 2015
The award of the 2013 Nobel Prize in Physics acknowledged the leading role of Europe in particle physics, which has a global community of over 10,000 scientists. To reinforce its pole position throughout the 21st century, Europe must be ready to propose an ambitious post-LHC accelerator project by 2018/19. This is one of the main recommendations of the updated European Strategy for Particle Physics, adopted by the CERN Council in May 2013. The EuroCirCol conceptual design study is a direct response to this recommendation, initiating a study for a 100 TeV energy-frontier circular collider through a collaboration of institutes and universities worldwide. A new research infrastructure of such scale depends on the feasibility of key technologies pushed beyond the current state of the art. Innovative designs for accelerator magnets to achieve high-quality fields up to 16 T, and for a cryogenic beam vacuum system to cope with unprecedented synchrotron light power, are required. The effects of colliding two 50 TeV beams must be mastered to meet the physics research requirements. Advanced energy efficiency, reliability and cost effectiveness are key factors in building and operating such an accelerator within a realistic time scale and cost. This proposal is part of the Future Circular Collider study under European leadership, federating resources worldwide to assess the merits of different post-LHC accelerator scenarios. It forms the core of a globally coordinated strategy of converging activities, involving participants from the ERA and beyond. Organisations joining this study from Japan and the USA are expected to take part in a global implementation project, and a suitable governance model will be drawn up accordingly. The main outcome of EuroCirCol will be to lay the foundation of subsequent infrastructure development actions that will strengthen the ERA as a focal point of global research cooperation and as a leader in frontier knowledge and technologies over the next decades.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRASUPP-03-2016 | Award Amount: 3.00M | Year: 2017
The objective of the AENEAS project is to develop a concept and design for a distributed, federated European Science Data Centre (ESDC) to support the astronomical community in achieving the scientific goals of the Square Kilometre Array (SKA). The scientific potential of the SKA radio telescope is unprecedented and represents one of the highest priorities for the international scientific community. By the same token, the large scale, rate, and complexity of data the SKA will generate, present challenges in data management, computing, and networking that are similarly world-leading. SKA Regional Centres (SRC) like the ESDC will be a vital resource to enable the community to take advantage of the scientific potential of the SKA. Within the tiered SKA operational model, the SRCs will provide essential functionality which is not currently provisioned within the directly operated SKA facilities. AENEAS brings together all the European member states currently part of the SKA project as well as potential future EU SKA national partners, the SKA Organisation itself, and a larger group of international partners including the two host countries Australia and South Africa.
Agency: European Commission | Branch: FP7 | Program: CP-IP-SICA | Phase: ENV.2009.2.2.1.4 | Award Amount: 8.83M | Year: 2010
Many efforts have been deployed to develop Integrated Coastal Zone Management (ICZM) in the Mediterranean and the Black Sea. Both basins have suffered, and continue to suffer, severe environmental degradation. In many areas this has led to unsustainable trends, which have impacted economic activities and human well-being. Important progress was made with the launch of the ICZM Protocol for the Mediterranean Sea in January 2008. The ICZM Protocol offers, for the first time in the Mediterranean, an opportunity to work in a new way, and a model that can be used as a basis for solving similar problems elsewhere, such as in the Black Sea. The aim of PEGASO is to build on existing capacities and develop common novel approaches to support integrated policies for the coastal, marine and maritime realms of the Mediterranean and Black Sea basins in ways that are consistent with and relevant to the implementation of the ICZM Protocol for the Mediterranean. PEGASO will use the model of the existing ICZM Protocol for the Mediterranean and adjust it to the needs of the Black Sea through three innovative actions: - Constructing an ICZM governance platform as a bridge between scientist and end-user communities, going far beyond conventional bridging. The building of a shared scientific and end-user platform is at the heart of our proposal, linked with new models of governance. - Refining and further developing efficient and easy-to-use tools for making sustainability assessments in the coastal zone (indicators, accounting methods and models, scenarios, socio-economic valuations, etc.). They will be tested and validated in 10 sites (CASES) and by the ICZM Platform, using a multi-scale approach for integrated regional assessment. 
- Implementing a Spatial Data Infrastructure (SDI), following the INSPIRE Directive, to organize local geonodes and standardize spatial data to support information sharing through an interactive viewer, to make it available to the ICZM Platform, and to disseminate all results of the project to all interested parties and beyond. - Enhancing regional networks of scientists and stakeholders in ICPC countries, supported by capacity building, to implement the PEGASO tools and lessons learned, and to assess the state and trends of coast and sea in both basins, identifying present and future main threats and agreeing on the responses to be taken at different scales in an integrated approach, including transdisciplinary and transboundary long-term collaborations.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SPA.2012.2.1-01 | Award Amount: 2.96M | Year: 2013
We propose to carry out an FP7 collaborative project to provide the first ever quantitative answer to one fundamental age-old question of mankind: how common are Earth analogs in our Galaxy? We will achieve our goal by combining the unprecedented photometric precision of NASA's Kepler mission, the unrivalled precision of ground-based radial velocities from the HARPS-N spectrograph, and the exquisitely accurate parallaxes of ESA's Gaia mission. Ours is a transnational collaboration between European countries and the US, set up to optimize the synergy between space- and ground-based data, whose scientific potential can only be fully exploited when analyzed together. We ask for manpower and resources to carry out a GTO program with HARPS-N@TNG (80 nights/yr for 5 years) to measure dynamical masses of terrestrial planet candidates identified by the Kepler mission. With the unique combination of Kepler and HARPS-N data we will learn for the first time about the physics of their interiors. Some of these planets will have characteristics (masses, radii) similar to Earth, and they might be orbiting within the habitable zone of stars much like our Sun. We will search for planets similar to Earth orbiting a carefully selected sample of nearby bright solar-type stars and red M dwarfs, providing suitable candidates for spectroscopic characterization of their atmospheres with next-generation space observatories. We will combine Kepler, HARPS-N and Gaia data products of stars in the Kepler field to underpin the occurrence rates of terrestrial planets as a function of stellar properties with unprecedented accuracy. Our unique team expertise in observations and modelling of exoplanetary systems will allow us to fully exploit the potential for breakthrough science intrinsic to this cutting-edge, multi-technique, interdisciplinary project, making the best use of data of the highest quality gathered from NASA and ESA space missions and ground-based instrumentation.
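To illustrate the radial-velocity precision that measuring dynamical masses of Earth analogs demands, the standard two-body semi-amplitude formula can be evaluated for an Earth-mass planet on a one-year orbit around a Sun-like star. This is a back-of-envelope sketch, not project code; a circular orbit and sin i = 1 are assumed.

```python
import math

# Reflex radial-velocity semi-amplitude of the host star:
#   K = (2*pi*G / P)^(1/3) * m_p / (m_star + m_p)^(2/3)
# (circular orbit, edge-on; standard two-body result).

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg
YEAR = 3.156e7       # one-year orbital period, s

def rv_semi_amplitude(m_planet, m_star, period):
    """Stellar RV semi-amplitude in m/s for a circular, edge-on orbit."""
    return ((2 * math.pi * G / period) ** (1 / 3)
            * m_planet / (m_star + m_planet) ** (2 / 3))

k_earth = rv_semi_amplitude(M_EARTH, M_SUN, YEAR)  # roughly 0.09 m/s
```

The result, on the order of 10 cm/s, is why an Earth analog around a solar-type star sits at the very edge of what ultra-stable spectrographs like HARPS-N can reach.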
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 2.47M | Year: 2010
The NanoCTM network will tackle major challenges in the theory of nanoelectronics. Ten internationally leading European theory-of-condensed-matter groups from nine different countries [including one of Europe's leading industrial electronics-research groups (QinetiQ)] have joined forces as full participants, combining theoretical expertise in nanowires, quantum dots, carbon-based electronics and spintronics, along with interaction and proximity effects in small dimensions. Our highly integrated approach to nanoscale transport will represent a major step towards the realisation of future scalable nanotechnologies and processes. In the longer term, the insights gained will contribute to the fabrication of novel functional nanoscale architectures and their integration into a higher hierarchical level. System parameters such as electric field, light, temperature or chemical reactivity are envisaged as possible drivers of future nanoelectronic devices.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.1.4 | Award Amount: 4.67M | Year: 2012
The overall purpose of MUSES is to foster corporate security by reducing the risks introduced by user behaviour. Data security and privacy are of fundamental importance to organizations, where they are defined and managed via Security Policies. Most security incidents are caused by organization insiders, through either a lack of knowledge or inadequate or malicious behaviour. Nowadays, information is highly distributed amongst corporate servers, the cloud and multiple personal devices like PDAs, tablets and smartphones. These are not only information holders but also user interfaces to access corporate information. Besides, the Bring Your Own Device practice is becoming more common in large organisations, posing new security threats and blurring the limits between corporate and personal use. In this situation, enforcement of Security Policies is increasingly difficult, as any strategy with a chance to succeed must take into account several changing factors: information delocalisation, access from heterogeneous devices and the mixing of personal and professional activities. Besides, any mechanism or control must be user friendly and provide non-intrusive, clear feedback on the risk being incurred at any time. MUSES will provide a device-independent, user-centric and self-adaptive corporate security system, able to cope with the concept of a seamless working experience across devices, in which a user may start a session on one device and location and follow up the process on different devices and locations, without loss of corporate digital assets. During project development, metrics of usability, context risk evaluation, the user's current trust situation and device exposure level will be defined, and several guidelines for the design of secure applications, company policies and context-based security requirements will be produced. A real-time trust and risk analysis engine will also be developed, with security mechanisms hard to compromise once installed on the target platforms.
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2010-IRSES | Award Amount: 434.70K | Year: 2011
The ChemBioFight project aims to explore natural resources for the discovery of bioactive therapeutic molecules against leishmaniasis and Chagas disease. This will be accomplished through the establishment of an extended scientific network between European and South American research entities. Already-assembled, highly diverse chemical libraries will be employed for the determination of active natural scaffolds, leading to the focused collection of biomaterial (plants, marine organisms, fungi, endophytes) from local diversity hot-spots. Automated, high-throughput and advanced techniques will be incorporated for the extraction process as well as the isolation and identification of natural products. Sophisticated approaches such as metabolomics and chemical profiling will contribute to the discovery of novel active compounds and will be used to conduct dereplication procedures. Semi-synthetic derivatives of lead compounds will also be produced, aiming at the optimization of favorable biological properties via medicinal chemistry. At every step of the proposed workflow, the obtained samples (extracts, isolated compounds, synthetic derivatives) will be evaluated in vitro and/or in vivo for their antileishmanial and antitrypanosomal activity. Within the aforementioned context, an extensive net of secondments, with both educational and experimental attributes, will be established. Core scientific knowledge is expected to be produced and exchanged, with the prospect of creating partnerships with future scientific potential. All partners will participate in the dissemination procedure through teaching activities, workshops and international conferences, leading overall to a mutual transfer of know-how. Finally, all procedures will be effectively monitored by a management team ensuring effectiveness and prompt achievement of objectives.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 2.87M | Year: 2012
Rare-earth ions (lanthanides) play an increasingly important role in modern optical technologies. Lanthanides are extensively used in solid-state laser physics, e.g. as key components in telecommunication networks. Rare earths are also employed as luminescent materials in lamps and as radiation detectors in X-ray imaging. Rare earths are already commercially omnipresent. However, the full potential of rare-earth ions is not yet explored, in particular with regard to the rapidly evolving field of future information technology. Future data storage and processing will require novel types of memories (e.g. based on interactions between light and quantized matter), algorithms (e.g. based on quantum computations) and materials (e.g. appropriate quantum systems). Rare-earth-ion-doped solids are very promising candidates for implementing future quantum technology. These media combine the advantages of solids (i.e. large density and scalability) and atomic gases (i.e. long coherence times). CIPRIS will build on the advantages of rare-earth-doped media and drive applications towards future information technology. CIPRIS follows two scientific approaches: classical processing and quantum processing. Both are meant as pronounced interdisciplinary research efforts, combining physics, material science and information technology. To exploit the results, the public- and private-sector partners will cooperate closely to develop commercial demonstration devices. In terms of training, CIPRIS aims at developing the next generation of young researchers with appropriate skills in rare-earth-based information technology, pushing it towards commercial applications. CIPRIS offers a large variety of training actions, e.g. mini-schools, laboratory courses, secondments to the private sector, and training sessions to strengthen complementary skills and contacts with the private sector. This will contribute to a European knowledge base for future information technology.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2009-3.1-1 | Award Amount: 3.89M | Year: 2010
The aim of PROHIBIT is to understand existing guidelines and practices to prevent healthcare-associated infections (HAI) in European hospitals, identify factors that enable and prevent compliance with best practices, and test the effectiveness of interventions of known efficacy. The project will employ a mixed-methods approach combining the strengths of qualitative research, survey methods, and observational and experimental designs. First, we will systematically review current guidelines on the prevention of the most common HAIs within the EU, as well as schemes for surveillance and public reporting. Next, we will conduct a large-scale survey of what is actually being done in European hospitals, its determining factors, and how these relate to bloodstream infection rates. The project will then focus on catheter-related bloodstream infection (CRBSI), a highly transmissible and reliably measured HAI, in a selected sample of European hospitals. In-depth interviews of healthcare staff and direct observation will be used to measure compliance with key prevention practices. A randomized effectiveness trial using a stepped wedge design will be conducted in intensive care units to determine the uptake and impact of two interventions (the WHO hand hygiene protocol and the so-called catheter bundle) on CRBSI as well as on clinical and utilization outcomes. The information will be synthesized to develop recommendations for the EU, policy makers, managers and medical professionals to prevent HAI. Dissemination will include instructional workshops and on-line training materials.
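In a stepped wedge design, every cluster (here, an intensive care unit) starts in the control condition and crosses over to the intervention at a staggered, randomized time, until all clusters are exposed by the final step. The sketch below generates such a rollout schedule as a generic illustration of the design; the cluster and step counts are hypothetical and do not describe the actual trial.

```python
import random

# Illustrative stepped-wedge rollout (hypothetical sizes, not the
# PROHIBIT trial schedule): schedule[c][t] is 0 under control and
# 1 once cluster c has crossed over to the intervention at step t.

def stepped_wedge(n_clusters, n_steps, seed=0):
    """Return an n_clusters x n_steps 0/1 matrix where each cluster is
    randomized to a crossover step between 1 and n_steps - 1."""
    rng = random.Random(seed)
    # Spread crossover times evenly over the available steps ...
    crossover = [1 + i % (n_steps - 1) for i in range(n_clusters)]
    rng.shuffle(crossover)  # ... then randomize which cluster gets which
    return [[1 if t >= crossover[c] else 0 for t in range(n_steps)]
            for c in range(n_clusters)]

schedule = stepped_wedge(n_clusters=6, n_steps=4)
```

By construction every cluster contributes both control and intervention periods, which is what lets the design separate the intervention effect from secular time trends.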
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-03-2015 | Award Amount: 5.10M | Year: 2015
Common mechanisms and pathways in stroke and Alzheimer's disease. It has long been recognized that stroke and Alzheimer's disease (AD) often co-occur and have an overlapping pathogenesis. As such, these two diseases are not considered fellow travelers, but rather partners in crime. This multidisciplinary consortium includes epidemiologists, geneticists, radiologists and neurologists with a longstanding track record on the etiology of stroke and AD. This project aims to improve our understanding of the co-occurrence of stroke and AD. An essential concept of our proposal is that stroke and AD are sequential diseases that have overlapping pathophysiological mechanisms in addition to shared risk factors. We will particularly focus on these common mechanisms and disentangle when and how they diverge into causing either stroke, or AD, or both. Another important concept is that the mechanisms under study will not only include the known pathways of ischemic vasculopathy and cerebral amyloid angiopathy (CAA); we will also explore and unravel novel mechanisms linking stroke and AD. We will do so by exploiting our vast international network to link various big datasets and by incorporating novel analytical strategies with emerging technologies in the fields of genomics, metabolomics and brain MR imaging.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SC1-PM-01-2016 | Award Amount: 16.02M | Year: 2017
The SYSCID consortium aims to develop a systems medicine approach to disease prediction in chronic inflammatory disease (CID). We will focus on three major CID indications with distinct characteristics yet a large overlap in their molecular risk maps: inflammatory bowel disease, systemic lupus erythematosus and rheumatoid arthritis. We have joined 15 partners from major cohorts and initiatives in Europe (e.g. IHEC, ICGC, TwinsUK and MetaHIT) to investigate human data sets at three levels of resolution: whole-blood signatures, signatures from purified immune cell types (with a focus on CD14 and CD4/CD8 cells) and selected single-cell analyses. Principal data layers will comprise the SNP variome, methylome, transcriptome and gut microbiome. SYSCID employs a dedicated data management infrastructure and strong algorithmic development groups (including an SME for the exploitation of innovative software tools for data deconvolution), and will validate results in independent retrospective and prospective clinical cohorts. Using this setup we will pursue three fundamental aims: (i) the identification of shared and unique core disease signatures that are associated with the disease state and independent of temporal variation; (ii) the generation of predictive models of disease outcome, building on previous work showing that pathways/biomarkers for disease outcome are distinct from those for initial disease risk and may be shared across diseases, to guide therapy decisions on an individual patient basis; and (iii) reprogramming disease, which will identify and target temporally stable epigenetic alterations in macrophages and lymphocytes through epigenome-editing approaches, as both biological validation and a potential novel therapeutic tool. Thus, SYSCID will foster the development of solid biomarkers and models for stratification in future long-term systems medicine clinical trials, but will also investigate new causative therapies that edit the epigenome code in specific immune cells, e.g. to alleviate macrophage polarization defects.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.4.2 | Award Amount: 3.64M | Year: 2014
This project proposes a unified, open-source execution framework for scalable data analytics. Data analytics tools have become essential for harnessing the power of our era's data deluge, yet current technologies are restrictive: their efficacy is usually bound to a single data and compute model, often depending on proprietary systems. The main idea behind ASAP is that no single execution model is suitable for all types of tasks, and no single data model (and store) is suitable for all types of data. The project makes the following innovative contributions:

(a) A general-purpose task-parallel programming model. The runtime will incorporate and advance state-of-the-art features of task-parallel programming models, namely: (i) irregular general-purpose computations, (ii) resource elasticity, (iii) abstraction of synchronization, data transfer, locality and scheduling, (iv) the ability to handle large sets of irregular distributed data, and (v) fault tolerance.

(b) A modeling framework that constantly evaluates the cost, quality and performance of data and computational resources in order to decide on the most advantageous store, indexing and execution pattern available.

(c) A unique adaptation methodology that will enable the analytics expert to amend a submitted task at an initial or later stage.

(d) A state-of-the-art visualization engine that will enable the analytics expert to obtain accurate, intuitive results of the analytics tasks she has initiated in real time.

Two exemplary applications showcasing the ASAP technology in the areas of Web content and large-scale business analytics will be developed. The consortium, led by the Foundation for Research & Technology, is well positioned to achieve its objectives by bringing together a team of leading researchers in data-management technologies, combined with active industrial partners and leading user organizations that offer expertise in the production-level domain of data analytics.
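The task-parallel idea behind contribution (a), submitting many irregular, independently sized tasks to a runtime that abstracts away scheduling and data transfer, can be illustrated with Python's standard library. This is only a toy analogue under that assumption, not ASAP's actual programming model or API:

```python
# Toy task-parallel sketch (not ASAP's API): irregular tasks of very
# different sizes are handed to a generic pool, which schedules them
# across workers; the caller only collects results.
from concurrent.futures import ThreadPoolExecutor

def analyze(chunk):
    # Stand-in for an irregular analytics task; cost varies per chunk.
    return sum(x * x for x in chunk)

# Chunks of deliberately irregular size, mimicking skewed real data.
data = [list(range(n)) for n in (10, 1000, 3, 200)]

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(analyze, chunk) for chunk in data]
    results = [f.result() for f in futures]

print(sum(results))
```

The point of the sketch is the separation of concerns: the task code knows nothing about workers, locality or scheduling, which is exactly the abstraction ASAP's runtime is meant to generalize.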
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 3.95M | Year: 2012
Brain disorders impose an increasing economic and social burden on the member states of the European Union (EU). For most neurodegenerative diseases and many neuropsychiatric disorders no efficient treatment is available and no cure exists. In the coming years the number of people, particularly the elderly, suffering from brain disorders will increase tremendously. Predictions from the turn of the century about the exponential increase in dementia patients have turned out to be correct, and Alzheimer's disease alone is on its way to becoming the most expensive and most pressing health problem in the EU. The complexity of these diseases requires a more integrative view of the multiple interactions between genes and environment, synaptic processes and neuronal circuitry. This will not be achieved simply by training more young scientists in the relevant disciplines. The plastic properties of the brain can only be exploited by scientists who are trained to deal with this complexity and who are familiar with state-of-the-art technology as well as with the principles at different levels of analysis. It is therefore advisable for a training network to study more than one disease and to train scientists with a wide range of skills and background knowledge. The NPlast consortium consists of four partners from the private and seven partners from the public sector and will provide a research training program for fifteen young scientists. The program covers a broad spectrum of disorders and interventions, ranging from synaptopathies and trafficking deficiencies to Alzheimer's disease, and from altering gene expression programs to manipulating the extracellular matrix of the brain to preserve or restore synaptic function. The key objective of the NPlast training network is to investigate neuroplastic principles that can preserve or restore function and that can be used to rejuvenate the brain in the elderly as well as to treat neuropsychiatric conditions in adults.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.2.4.3-2 | Award Amount: 9.96M | Year: 2011
The DIABAT project will employ knowledge of the function, dysfunction and physiological regulation of brown adipocytes to develop innovative therapeutic and preventive strategies for type 2 diabetes. Brown adipose tissue (BAT) is now a worldwide-recognized target to combat obesity and diabetes, following the recent re-discovery of functional BAT in adult humans by several members of the DIABAT network (van Marken Lichtenbelt et al., N. Engl. J. Med. 360, 1500, 2009; Virtanen, Enerbäck & Nuutila, N. Engl. J. Med. 360, 1518, 2009), along with a sharp rise in insight into the cellular, genetic and regulatory mechanisms gained from animal studies. The DIABAT project therefore aims at recruiting and re-activating endogenous energy-dissipating BAT as a preventive and/or remedial measure for weight and blood sugar control in obesity-related type 2 diabetes ("diabesity"), thereby halting or preventing the destruction, and facilitating the recovery, of pancreatic beta-cells under diabetic conditions.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.8.2 | Award Amount: 3.09M | Year: 2008
In the near future, it is reasonable to expect that new types of systems will appear, designed or emerged, of massive scale, expansive and permeating their environment, of very heterogeneous nature, and operating in a constantly changing networked environment. We expect that most such systems will take the form of a large society of networked artifacts that are small; have limited sensing, signal-processing and communication capabilities; and are usually of limited energy. Yet through cooperation they will be organized into large societies to accomplish tasks that are difficult for, or beyond the capabilities of, today's conventional centralized systems. The scale and nature of these systems naturally requires that they be pervasive, and they are expected to operate beyond the complete understanding and control of their designers, developers, and users. These systems or societies should achieve an appropriate level of organization and integration seamlessly and with appropriate flexibility. The aim of this project is to establish the foundations of adaptive networked societies of small or tiny heterogeneous artifacts. We intend to develop an understanding of such societies that will enable us to establish their fundamental properties and laws, as well as their inherent trade-offs. We will approach our goal by working towards a usable quantitative theory of networked adaptation based on rigorous and measurable gains. We also intend to apply our models, methods, and results to the scrutiny of large-scale simulations and experiments, from which we expect to obtain valuable feedback. The foundational results and the feedback from simulations and experiments will form a unifying framework for adaptive networks of artifacts that will, we hope, enable us to come up with a coherent working set of design rules for such systems. In a nutshell, we will work towards a science of adaptive organization of pervasive networks of small or tiny artifacts.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.2.1-7 | Award Amount: 6.77M | Year: 2008
Neurodegenerative diseases all cause damage to the circuitry of the nervous system, with loss of connections, axons and neurons. The loss can be gradual, as in Alzheimer's disease; rapid, as in stroke; or intermediate, as in the delayed neuronal loss after stroke. Following damage, the nervous system is able partially to compensate through the formation of alternative connections and pathways, a process known as plasticity. Adults are therefore able to regain considerable function after stroke, and to compensate for the synapse and cell loss of Alzheimer's disease until it reaches a critical level. Children undergo a period of enhanced plasticity in most parts of the CNS at the end of development, known as critical periods. During these periods their ability to compensate for damage to the CNS is in many cases much greater than in adults. The overall concept behind this application is that restoration of function in neurodegeneration can be achieved through plasticity (the formation of new functional connections, the withdrawal of inappropriate connections, and the modulation of synaptic strength). Promoting increased plasticity in selected parts of the adult nervous system, back to the level seen in children, is a powerful method of enhancing recovery of function in animal models. Plasticity-promoting treatments could therefore be beneficial in a wide range of conditions that damage the CNS. The PLASTICISE project integrates scientists from four scientific areas: 1) development of methods to promote plasticity; 2) development of models of neurodegenerative disease; 3) imaging of plasticity at the macro and micro level; and 4) study of recovery of function through plasticity in human patients with brain disorders. The concept that unites the partners is the belief that treatments enhancing plasticity will become one of the key medications for improving neurological function in the damaged human nervous system. The purpose of the project is to bring this moment closer.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.3.2-1 | Award Amount: 4.90M | Year: 2008
AIDS remains one of the major life-threatening infectious diseases in the world today. A constant supply of novel antiretrovirals (ARVs) is needed to respond to the limitations of current drugs. By the end of 2007, the therapeutic arsenal against HIV is expected to comprise ARVs blocking all major steps of HIV replication except particle assembly and budding. Members of HIV-ACE were instrumental in recent breakthroughs concerning virus assembly and Envelope (Env) incorporation into virions. On the basis of novel and fully validated targets, the objectives of HIV-ACE are: i) assay development, drug screening and pre-clinical development of small-molecule inhibitors of capsid assembly and Env incorporation, up to the stage of early drug candidates with an acceptable toxicity profile determined by ADME/Tox studies, activity against multidrug-resistant viral strains and primary non-B isolates, and antiviral activity in primary T cells and macrophages (WP1); ii) elucidation of the 3D structures of these validated targets and rational drug design guided by molecular modelling and docking of inhibitors (WP2); iii) validation and exploitation of the HIV-susceptible transgenic rat model to allow preclinical in vivo evaluation of novel drug candidates from HIV-ACE and to provide the European pharmaceutical industry with an efficient in vivo platform for predictive testing of any kind of anti-HIV drug (WP3); and iv) elucidation of the mechanisms responsible for the activity of the validated inhibitors, and discovery/validation of novel targets in the budding pathway of HIV-1 (WP4). HIV-ACE is composed of 8 teams from 5 European countries, including a biopharma SME, led by prominent world-class scientists in virology, cell biology, immunology, and organic and medicinal chemistry, working in highly regarded research organisations. To efficiently achieve the ambitious goals of HIV-ACE, strong management led by a very experienced organisation has been included in a separate coordination-management work package (WP5).
POWER2YOUTH - Freedom, dignity and justice: A comprehensive approach to the understanding of youth exclusion and the prospects for youth inclusion and overall change in the South and East Mediterranean
Agency: European Commission | Branch: FP7 | Program: CP-FP-SICA | Phase: SSH.2013.4.1-2 | Award Amount: 3.40M | Year: 2014
The 2010-2011 youth-led wave of protests in the South and East Mediterranean (SEM) could be described as the coming onto the scene of a new generation united by a shared experience of marginalisation and by new ways to protest and act. Important as this phenomenon may be for the future of the SEM, it still escapes the main frames of analysis utilised by academic research. Youth studies in the SEM, while producing important findings and insights, have so far failed to give a multi-dimensional and comprehensive understanding of the economic, political and social disadvantages faced by youth in the region, and of the possible evolution of young people's role in national and regional developments. This project aims to fill this important gap in our knowledge of the SEM by offering a comprehensive multi-level, interdisciplinary and gender-sensitive approach to the understanding of youth in the region. By combining the economic, political and socio-cultural spheres with macro- (policy/institutional), meso- (organisational) and micro- (individual) level analysis, POWER2YOUTH will explore the root causes and complex dynamics of youth exclusion, while investigating the factors fostering youth inclusion. Building on a conceptualisation of youth that gives prominence to young people as potential agents of change, the project starts from the assumption that youth exclusion is the result of unequal power relations in society, in as much as effective youth inclusion can only be fostered by a bottom-up transformation of the systemic inequalities that lead to exclusion in the first place. From this premise comes the project's emphasis on studying the potentially transformative impact of individual and collective youth agency, searching for instances of empowerment that lead to active youth participation in society and overall change. Finally, POWER2YOUTH will produce innovative and concrete policy recommendations.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2012.6.4-2 | Award Amount: 7.88M | Year: 2012
The main objectives of FUTUREVOLC are to establish an integrated volcanological monitoring procedure through European collaboration, develop new methods to evaluate volcanic crises, increase scientific understanding of magmatic processes, and improve the delivery of relevant information to civil protection agencies and authorities. To reach these objectives the project combines broad European expertise in seismology, volcano deformation, volcanic gases and geochemistry, infrasound, eruption monitoring, physical volcanology, satellite studies of plumes, meteorology, ash dispersal forecasting, and civil defence. This European consortium leads the way for multi-national volcanological collaboration with the aim of mitigating the effects of major eruptions that pose cross-border hazards. Iceland is selected as a laboratory supersite area for demonstration because of (i) its relatively high rate of large eruptions with potential for long-ranging effects, and (ii) its capability to produce nearly the full spectrum of volcanic processes at its many different volcano types. Based on present monitoring networks and ongoing research, the project will bridge gaps and combine efforts for a coherent close-to-real-time evaluation of the state of Icelandic volcanoes and their unrest. The project will provide timely information on magma movements from the combined interpretation of earthquake sources relocated in three-dimensional velocity models, magma sources inferred from ground and space geodetic data, and measurements of volcanic volatiles. For better response during eruptions, the project will develop operational models of magma discharge rate, contributing directly to improved forecasts of ash dispersion and helping to minimise economic disruption on a European scale during eruptions. By integrating a Volcanic Ash Advisory Centre and a civil protection unit into the project, European citizens will benefit directly from the scientific work of FUTUREVOLC.
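As background to the "operational models of magma discharge rate" mentioned above: one widely used empirical scaling (Mastin et al., 2009) relates plume height H (km above the vent) to the dense-rock-equivalent volumetric discharge rate V (m³/s) as H ≈ 2.00 V^0.241. The sketch below simply inverts that relation; using it here is an assumption for illustration, since FUTUREVOLC's own operational models, which draw on much richer physics and real-time data, are not specified in this summary.

```python
# Hedged sketch: invert the empirical Mastin et al. (2009) scaling
# H = 2.00 * V**0.241 (H: plume height in km above the vent,
# V: dense-rock-equivalent discharge rate in m^3/s) to estimate
# discharge rate from an observed plume height.

def discharge_rate_from_plume_height(h_km):
    """Estimate DRE volumetric discharge rate (m^3/s) from plume height (km)."""
    return (h_km / 2.00) ** (1.0 / 0.241)

for h in (5.0, 10.0, 15.0):
    v = discharge_rate_from_plume_height(h)
    print(f"H = {h:4.1f} km  ->  V ~ {v:9.0f} m^3/s")
```

Because the exponent 1/0.241 is large, small uncertainties in observed plume height translate into large uncertainties in discharge rate, which is one reason operational models combine several independent data streams.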
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 3.56M | Year: 2011
Musculoskeletal diseases (MSD) and related disorders are often considered an inevitable consequence of aging, and they account for the largest fraction of temporary and permanent disability. They are the leading cause of disability in the EU and account for more than half of all chronic conditions in people over 50 years of age in developed countries. The burden of MSD and related disorders thus presents a compelling argument for greater understanding through expanded research and the training of experts. As recognized by the EU strategic-approach white paper "Together for Health" (2007) and the subsequent green paper "On the European Workforce for Health" (2008), advances in science and the rapid development of new technologies are revolutionizing how we promote health and predict, prevent and treat illness. The goal of Multi-scale Biological Modalities for Physiological Human Articulation (MultiScaleHuman) is to train, through research, a body of early-stage researchers (ESRs) and experienced researchers (ERs) in the creation of a multi-scale biological data visualization and knowledge management system for improved understanding, diagnosis and treatment of physiological human articulation. MultiScaleHuman will focus its ambitious research on a very important and challenging healthcare problem: MSD and related disorders. This will be achieved by initiating a network of ESRs/ERs with training provided by a three-sector research consortium involving the academic (education), hospital (social actors) and private (industry) sectors. MultiScaleHuman will provide a unique training program, from technical to complementary skills, by fully exploiting the training opportunities that the Marie Curie ITN scheme provides and by building a consortium of partners with multi-disciplinary skills in the understanding and treatment of physiological articulations in MSD and related disorders.
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2010-1.1.33 | Award Amount: 26.09M | Year: 2011
AIDA (http://cern.ch/aida) addresses the upgrade, improvement and integration of key research infrastructures in Europe, developing advanced detector technologies for future particle accelerators, as well as transnational access to facilities that provide these research infrastructures. In line with the European Strategy for Particle Physics, AIDA targets the infrastructures needed for R&D, prototyping and qualification of detector systems for the major particle physics experiments currently being planned at future accelerators. By focusing on common development and use of such infrastructure, the project integrates the entire detector development community, encouraging cross-fertilization of ideas and results, and providing a coherent framework for the main technical developments of detector R&D. This project includes a large consortium of 37 beneficiaries, covering much of the detector R&D for particle physics in Europe. This collaboration allows Europe to remain at the forefront of particle physics research and take advantage of the world-class infrastructures existing in Europe for the advancement of research into detectors for future accelerator facilities. The infrastructures covered by the AIDA project are key facilities required for an efficient development of future particle physics experiments, such as: test beam infrastructures (at CERN, DESY and LNF), specialised equipment, irradiation facilities (in several European countries), common software tools, common microelectronics and system integration tools and establishment of technology development roadmaps with a wide range of industrial partners.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: NMP-2009-4.0-3 | Award Amount: 16.52M | Year: 2010
NAMDIATREAM will develop a cutting-edge nanotechnology-based toolkit for multi-modal detection of biomarkers of the most common cancer types and cancer metastases, permitting identification of cells indicative of early disease onset in a high-specificity, high-throughput format in clinical, laboratory and point-of-care devices. The project is built on the innovative concepts of super-sensitive and highly specific lab-on-a-bead, lab-on-a-chip and lab-on-a-wire nano-devices utilizing the photoluminescent, plasmonic, magnetic and non-linear optical properties of nanomaterials. This offers groundbreaking advantages over present technologies in terms of stability, sensitivity, time of analysis, probe multiplexing, assay miniaturisation and reproducibility. The ETP in Nanomedicine documents point out that nanotechnology has yet to deliver practical solutions for patients and clinicians in their struggle against common, socially and economically important diseases such as cancer. Over 3.2M new cases and 1.7M cancer-related deaths are registered in Europe every year, largely because diagnostic methods have an insufficient level of sensitivity, limiting their potential for early disease identification. We will deliver: photoluminescent nanoparticle-based reagents and diagnostic chips for high-throughput early diagnosis of cancer and assessment of treatment efficiency; nanocrystals enabling plasmon-optical and non-linear optical monitoring of molecular receptors within body fluids or on the surface of cancer cells; multi-parameter screening of cancer biomarkers in diagnostic material implementing segmented magnetic nanowires; validation of nano-tools for early diagnosis and highly improved specificity in cancer research; and OECD-compliant nanomaterials with improved stability, signal strength and biocompatibility. Direct lead users of the results will be the diagnostic and medical imaging device companies involved in the consortium, together with the clinical and academic partners.
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2011.2.2.1-2 | Award Amount: 17.04M | Year: 2012
NEURINOX aims at elucidating the role of NADPH oxidases (NOX) in neuroinflammation and its progression to neurodegenerative diseases (ND), as well as at evaluating the potential of novel ND therapeutic approaches targeting NOX activity. NOX enzymes generate reactive oxygen species (ROS) and have emerged as regulators of neuroinflammation. Their role is complex: ROS generated by NOX lead to tissue damage in microglia-mediated neuroinflammation, as seen in amyotrophic lateral sclerosis (ALS), while absence of ROS generation enhances the severity of autoimmune-mediated neuroinflammation, as seen, for example, in multiple sclerosis (MS). The objective of the five-year NEURINOX project is to understand how NOX controls neuroinflammation, to identify novel molecular pathways and oxidative biomarkers involved in NOX-dependent neuroinflammation, and to develop specific therapies based on NOX modulation. The scientific approach will be to: (i) identify NOX-dependent molecular mechanisms using dedicated ND animal models; (ii) develop therapeutic small molecules that either inhibit or activate NOX and test their effects in animal models; and (iii) test the validity of the identified molecular pathways in clinical studies in ALS and MS patients. NEURINOX will contribute to a better understanding of brain dysfunction, in particular the link between neuroinflammation and ND, and to the identification of new therapeutic targets for ND. A successful demonstration of the benefits of NOX-modulating drugs in ALS and MS animal models, and in early ALS clinical trials, will validate a novel, high-potential therapeutic target for ALS and for many other types of ND. NEURINOX hence has a strong potential to deliver more efficient ND healthcare for patients and thereby to reduce ND healthcare costs.
This multi-disciplinary consortium includes leading scientists in NOX research and ROS biology, drug-development SMEs, experts in the neuroinflammatory aspects of ND, genomics and proteomics, and clinicians able to translate the basic science to the patient.
Agency: European Commission | Branch: FP7 | Program: CP-TP | Phase: NMP-2007-3.1-2 | Award Amount: 4.74M | Year: 2008
SERVIVE net proposes the enlargement of the assortment of customizable clothing items currently on offer, the enhancement of all co-design aspects (functionality and fun), and the development and testing of a new production model based on decentralized, networked SME cells. The Servive net will not only seamlessly link critical mass-customisation (MC) enabling services; more importantly, it will adapt these services to the specific needs and preferences of well-defined target customer groups. It will also enable all necessary interactions of customers with value-chain actors in transparent ways, thus enabling and encouraging the active participation of end consumers in the configuration of the customised items. The selected product configuration will in turn influence the production scenario (see the extended Micro-Factory concept below). Central to this scenario is the concept of the Virtual Customer Advisor (VCA), which, depending on the profile of the customer, will recommend the optimum product configuration based either on style preferences (Style Advisor), functional requirements (e.g. for protective clothing/sportswear), or body morphology and issues related to physical disability or problem figures. On the upstream part of the chain, the Servive net will introduce the innovative organisational concept of the networked Micro-Factory (MF), directly linked to the concept of user-centred production configuration. The MF concept promotes the idea of decentralized production close to retailers and consumers (proximity advantage). MFs can range from networked, small-size but high-tech MC production sites to sites equipped with automatic knitting machines, or even semi-automatic 3D assembly centres (single-ply cutters and sewing robots). Knowledge-based web services will cover style expertise, human-body expertise and data, and material and manufacturing knowledge.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.5.1 | Award Amount: 4.20M | Year: 2013
The main aim of the Miraculous-Life project is to design, develop and evaluate an innovative user-centric technological solution, the Virtual Support Partner (VSP), attending to the daily activity and safety needs of the elder (65+) while the elder goes about his or her normal daily life. The VSP will provide implicit support for daily activities, based on behavioural and emotional understanding, with appropriate responses exhibiting distinctive emotions, delivered in a human-like way, in essence simulating interaction with a real-life partner. Specifically, the VSP fuses together the user's facial expressions, voice intonation, gestures and other contextual information from the user's environment, and provides intelligent responses and delivery of services through an avatar-based interface exhibiting empathic responses through facial emotions and voice intonation. Through intelligent dialogue, and the use of different ICT services for supporting the elder's daily activities and safety at home, the VSP stimulates and motivates the elder to act. The validation of the system will be realized in two well-selected use cases in two different countries. Up to 100 elderly people and their caregivers will use the system over a six-month period. The system will be delivered to the user in the form of a stand-alone consumer product, operating on a scalable distributed network of interconnected tablet and Kinect devices, focusing on the minimum essential personalized support for the elder's daily activities and care at home. The system will provide benefits on a practical, psychological and social level, enabling and motivating the elderly to remain active longer in carrying out their daily life at home, thus prolonging their independence and improving their wellbeing.
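One common way to realize the kind of multimodal fusion described above is weighted late fusion: each modality emits a score distribution over emotional states, and a weighted average yields the fused estimate that drives the system's response. The sketch below uses hypothetical weights, modalities and state labels; it is an illustration of the general technique, not the Miraculous-Life VSP implementation:

```python
# Illustrative late-fusion sketch (hypothetical weights/labels, not
# the VSP's actual pipeline): per-modality emotion scores are combined
# by a fixed weighted average, and the dominant state is selected.

MODALITY_WEIGHTS = {"face": 0.5, "voice": 0.3, "gesture": 0.2}

def fuse(scores):
    """scores: {modality: {state: probability}} -> fused {state: score}."""
    fused = {}
    for modality, dist in scores.items():
        w = MODALITY_WEIGHTS[modality]
        for state, p in dist.items():
            fused[state] = fused.get(state, 0.0) + w * p
    return fused

obs = {
    "face":    {"happy": 0.7, "neutral": 0.3},
    "voice":   {"happy": 0.4, "neutral": 0.6},
    "gesture": {"happy": 0.5, "neutral": 0.5},
}
fused = fuse(obs)
print(max(fused, key=fused.get))   # dominant fused state
```

Real systems typically learn the weights from data and add temporal smoothing, but the late-fusion structure (independent per-modality scores, combined at decision level) is the same.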
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.1.6 | Award Amount: 3.63M | Year: 2008
The aim of this project is to provide a multi-level infrastructure of interconnected testbeds of large-scale wireless sensor networks for research purposes, pursuing an interdisciplinary approach that integrates the aspects of hardware, software, algorithms, and data. This will demonstrate how heterogeneous small-scale devices and testbeds can be brought together to form well-organized, large-scale structures, rather than just some large network; it will allow research not only at a much larger scale, but also of a different quality, owing to the heterogeneous structure and the ability to deal with dynamic scenarios, in both membership and location. For the interdisciplinary area of wireless sensor networks, establishing the foundations of distributed, interconnected testbeds for an integrated approach to hardware, software, algorithms, and data will allow a new quality of practical and theoretical collaboration, possibly marking a turning point from individual, hand-tailored solutions to large-scale, integrated ones. To this end, we will implement recent theoretical results on algorithms, mechanisms and protocols and transform them into software. We will apply the resulting code to the scrutiny of large-scale simulations and experiments, from which we expect to obtain valuable feedback and to derive further requirements, orientations and inputs for long-term research. We intend to make these distributed laboratories available to the European scientific community, so that other research groups can take advantage of the federated infrastructure. Overall, this means pushing the new paradigm of distributed, self-organizing structures to a different level.
Agency: European Commission | Branch: FP7 | Program: CP-FP-SICA | Phase: HEALTH.2010.1.2-4 | Award Amount: 3.85M | Year: 2010
Chronic hepatitis C is one of the most common chronic viral infections of humans and a major cause of chronic liver disease, cirrhosis and liver cancer. About 4 million new infections still occur worldwide each year, with 50-85% of patients progressing to chronic hepatitis C. Currently there is no marker to predict spontaneous viral clearance and to guide treatment decisions. The major objectives of the HepaCute proposal are to develop biomarkers predicting the outcome of acute hepatitis C, improving the management of the affected patients and thus decreasing the health burden of hepatitis C in Europe and the Mediterranean partner countries (MPC). The HepaCute consortium has evolved from a series of EC-funded projects on hepatitis C (HCVacc/HepCvax/Virgil/HEPACIVAC) and consists of world-leading experts in HCV epidemiology, immunology, and virology, including partners from Egypt and Morocco, who have strongly influenced the current management of patients with acute hepatitis C in their respective regions and contributed considerably to our understanding of the mechanisms of spontaneous viral clearance. The HepaCute proposal is closely connected to ongoing national, European, and Egyptian networks on HCV research (HepNet, EASL, STDF), which will support HepaCute and help make it a success. Together with another pertinent EU-funded research project, SPHINX, it actively contributes to coordinating EU-funded hepatitis C research with pertinent research projects funded in the MPC, in particular with hepatitis research projects funded under the Egyptian Science and Technology Development Fund (STDF). Within HepaCute the most innovative technologies will be employed, such as genome-wide association studies, transcriptomics, proteomics, and ultra-deep sequencing, to better understand the early events in acute hepatitis C and to translate these results into readily practicable diagnostic tools to predict spontaneous viral clearance.
HepaCute has firmly integrated partners from Egypt and Morocco, who have preexisting research collaborations with European partners, into the scientific research programme, and we expect this continuing partnership between European and Mediterranean countries to have a strong impact on the care of patients with acute hepatitis C in both Europe and the MPC.
Agency: European Commission | Branch: FP7 | Program: NOE | Phase: ICT-2007.2.2 | Award Amount: 8.21M | Year: 2009
The ability to understand and manage the social signals of a person we are communicating with is the core of social intelligence, a facet of human intelligence that has been argued to be indispensable, and perhaps the most important, for success in life. Although each one of us understands the importance of social signals in everyday life, and in spite of recent advances in machine analysis and synthesis of relevant behavioural cues like blinks, smiles, crossed arms, and laughter, research efforts in machine analysis and synthesis of human social signals like empathy, politeness, and (dis)agreement are few and tentative. The main reasons for this are the absence of a research agenda and the lack of suitable resources for experimentation. The mission of the SSPNet is to create sufficient momentum by integrating a large body of existing knowledge and available resources in Social Signal Processing (SSP) research domains, including cognitive modelling, machine understanding, and the synthesis of social behaviour, and so: (i) enable the creation of a European and world research agenda in SSP, (ii) provide efficient and effective access to SSP-relevant tools and data repositories to the research community within and beyond the SSPNet, and (iii) further develop the complementary and multidisciplinary expertise necessary for pushing forward the cutting edge of SSP research. The collective SSPNet research effort will be directed towards the integration of existing SSP theories and technologies, and towards the identification and exploration of potentials and limitations in SSP. A particular scientific challenge that binds the partners is the synergistic combination of human-human interaction models and tools for human behaviour sensing and synthesis within socially adept multimodal interfaces.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2016 | Award Amount: 3.99M | Year: 2016
Breast and ovarian cancer constitute serious health challenges in the EU. To identify new and improved cancer therapeutic approaches, we will pursue a multifaceted synthetic lethal approach, which takes advantage of the inherent genetic instability of cancer cells. Most mutations acquired by cancer cells do not cause lethality, but the very same mutations may cause cell death when a second gene in a redundant pathway is inactivated. Thus, targeting a gene that is synthetic lethal with a cancer-specific mutation will kill only cancer cells while sparing normal cells. Synthetic lethal approaches have been clinically pioneered by members of our consortium, by combining cancer-promoting mutations (e.g. in BRCA2) with the inactivation of DNA repair genes. We will use this approach as the scientific frame for our ETN (SYNTRAIN), consisting of 9 academic beneficiaries, 1 SME beneficiary and 3 partners. We aim to identify synthetic lethal interactions and exploit them to innovate future breast and ovarian cancer treatments through compound screening and development. SYNTRAIN consists of world-leading researchers with complementary knowledge of genome maintenance and stress response pathways, and a critical mass of expertise for providing excellent training in screening methodologies, mechanistic investigations, and drug discovery. We will offer a structured training programme that exceeds the capacities of each individual member. We will educate a future generation of cancer researchers with high innovative capacity and skills that enhance their career prospects in both academia and industry. Our aims are to train young researchers: i) in techniques preparing them for a career in cancer research, ii) in complementary skills relevant for work in academia and the pharma industry, and iii) to become creative and entrepreneurial, capable of bridging the gap between basic and applied research.
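The synthetic-lethal logic described above can be captured in a toy model. This sketch is illustrative only: the pairing of BRCA2 with PARP1 follows the classic clinical example of PARP inhibition in BRCA-mutant tumours, and is an assumption for demonstration, not SYNTRAIN's actual screening targets.

```python
# Toy model of synthetic lethality (illustrative only).
# A cell survives if at least one gene in each redundant repair pathway works.

def cell_survives(inactivated_genes):
    """Return True if the cell keeps at least one working gene
    in every redundant DNA-repair pathway."""
    # Hypothetical redundant pathway: either BRCA2-dependent repair
    # or a PARP1-dependent backup route keeps the cell alive.
    redundant_pathways = [{"BRCA2", "PARP1"}]
    for pathway in redundant_pathways:
        if pathway <= set(inactivated_genes):  # every member lost
            return False
    return True

# A normal cell tolerates the loss of either gene alone.
assert cell_survives([])                        # healthy cell
assert cell_survives(["PARP1"])                 # the drug alone spares normal cells
# A cancer cell already carries a BRCA2 mutation; adding a PARP1
# inhibitor inactivates both redundant routes, killing only cancer cells.
assert not cell_survives(["BRCA2", "PARP1"])
```

The point of the model is the asymmetry: the same inhibitor is harmless to cells with an intact BRCA2 but lethal to cells carrying the cancer-specific mutation.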
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EINFRA-5-2015 | Award Amount: 4.94M | Year: 2016
This Centre of Excellence will advance the role of computationally based modelling and simulation within biomedicine. Three related user communities lie at the heart of the CoE: academic, industrial and clinical researchers, all of whom wish to build, develop and extend such capabilities in line with the increasing power of high performance computers. Three distinct exemplar research areas will be pursued: cardiovascular, molecularly-based and neuro-musculoskeletal medicine. Predictive computational biomedicine involves applications comprising multiple components, arranged as far as possible into automated workflows in which data taken from an individual patient is processed and combined into a model that produces predicted health outcomes. Many of the models are multiscale, requiring the coupling of two or more high performance codes. Computational biomedicine holds out the prospect of predicting the effect of personalised medical treatments and interventions ahead of carrying them out, with all the ensuing benefits. Indeed, in some cases, it is already doing so today. The CoE presents a powerful consortium of partners, has an outward-facing nature, and will actively train, disseminate and engage with these user communities across Europe and beyond. Because this field is new and growing rapidly, it offers numerous innovative opportunities. There are three SMEs and three enterprises within the project, as well as eight associate partners drawn from across the biomedical sector, all of whom are fully aware of the vast potential of HPC in this domain. We shall work with them to advance the exploitation of HPC and will engage closely with medical professionals through our partner hospitals in order to establish modelling and simulation as an integral part of clinical decision making. Our CoE is thus user-driven, integrated, multidisciplinary, and distributed, presenting a vision that is in line with the Work Programme.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: PEOPLE-2007-1-1-ITN | Award Amount: 2.60M | Year: 2008
Geological time is inextricably linked with the Earth Sciences, and the Geological Time Scale (GTS) is the yardstick to measure it. As such, the GTS is the key to reconstructing Earth history. Recent developments in numerical dating now make it possible to build a much-improved, next-generation GTS for the last 100 million years by integrating independent state-of-the-art techniques; this time scale will have unprecedented accuracy, precision, resolution and stability. Within GTSnext, a consortium of world-leading European experts will be brought together for the first time to integrate their expertise and provide young scientists with top training in all these methods. This training is the prime objective of GTSnext, and crucial to its success. Together, this team of newly trained scientists is well equipped to construct the new GTS. GTSnext is part of a broader international initiative - EARTHTIME - a community-based scientific effort aimed at sequencing Earth history through an integrated geochronologic and stratigraphic approach. It is our ambition to broaden the EARTHTIME platform in Europe with GTSnext, which, combined with an ESF-funded Research Network run in parallel, will also serve as the basis for wider outreach towards the Earth Science community. The expected scientific contributions and breakthroughs are 1) a full integration and intercalibration of different numerical dating techniques, leading to 2) a significant improvement in the consistency of these same techniques; 3) progress towards a fully integrated and astronomically tuned GTS over the last 100 million years; 4) an essentially stable time scale that is highly beneficial for both academia and industry; and 5) new insights into key geological processes including climate change, catastrophic impacts, and volcanic hazards. Finally, a more fundamental comprehension of geological time and the time scales at which key processes occur in Earth history is highly relevant in view of the impact we have on System Earth.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EURO-3-2014 | Award Amount: 2.74M | Year: 2015
In 2013, as a response to rising inequalities, poverty and distrust in the EU, the Commission launched a major endeavour to rebalance economic and social policies with the Social Investment Package (SIP). RE-InVEST aims to strengthen the philosophical, institutional and empirical underpinnings of the SIP, based on social investment in human rights and capabilities. Our consortium is embedded in the Alliances to Fight Poverty. We will actively involve European citizens severely affected by the crisis in the co-construction of a more powerful and effective social investment agenda with policy recommendations. This translates into the following specific objectives:
1. Development of innovative methodological tools for participative research, involving mixed teams of researchers, NGO workers and people from vulnerable groups in the co-construction of knowledge on social policy issues;
2. Diagnosis of the social damage of the crisis in terms of (erosion of) human rights, social (dis)investment and loss of (collective) capabilities;
3. Analysis of the relationships between the rise of poverty and social exclusion, the decline of social cohesion and trust, and the threats to democracy and solidarity in the EU;
4. Development of a theoretical model of social investment, with a focus on the promotion of human rights and capabilities;
5. Application of this model to active labour market policies and social protection: evaluation of policy innovations through qualitative and quantitative analyses;
6. Application of the same model to public intervention in five selected basic service markets: water provision, housing, early childhood education, health care and financial services, through qualitative and quantitative analyses;
7. Analysis of the macro-level boundary conditions for successful implementation of the SIP;
8. Capacity building in civil society organisations for the promotion of the European social investment agenda, through networking and policy recommendations.
News Article | November 24, 2016
EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. As we generate more and more data, we need storage systems, e.g. hard drives, with higher density and efficiency. But this also requires materials whose magnetic properties can be quickly and easily manipulated in order to write and access data on them. EPFL scientists have now developed a perovskite material whose magnetic order can be rapidly changed without the disruption caused by heating. The work, which describes the first ever magnetic photoconductor, is published in Nature Communications. The lab of László Forró, in a project led by postdoc Bálint Náfrádi, synthesized a ferromagnetic photovoltaic material. Perovskite photovoltaics are gradually becoming a cheaper alternative to current silicon systems, drawing much interest from energy scientists. But this particular material, a modified version of perovskite, exhibits some unique properties that make it particularly interesting for building next-generation digital storage systems. Magnetism in a material arises from the interactions of its localized and moving electrons; in a way, it is the result of competition between different movements of electrons. This means that the resulting magnetic state is wired into the material and cannot be reversed without changing the material's chemistry or crystal structure. But an easy way to modify magnetic properties would be an enormous advantage in many applications, such as magnetic data storage. The new material that the EPFL scientists developed offers exactly that. "We have essentially discovered the first magnetic photoconductor," says Bálint Náfrádi. This new crystal structure combines the advantages of both ferromagnets, whose magnetic moments are aligned in a well-defined order, and photoconductors, in which light illumination generates a high density of free conduction electrons.
The combination of the two properties produced an entirely new phenomenon: the "melting" of magnetization by photo-electrons, the electrons set free inside the material when light hits it. In the new perovskite material, a simple red LED -- much weaker than a laser pointer -- is enough to disrupt, or "melt", the material's magnetic order and to generate a high density of travelling electrons, which can be freely and continuously tuned by changing the light's intensity. The timescale for switching the magnetic order in this material is also very fast, needing only quadrillionths of a second. Though still experimental, all these properties mean that the new material can be used to build the next generation of memory-storage systems, featuring higher capacities with low energy demands. "This study provides the basis for the development of a new generation of magneto-optical data storage devices," says Náfrádi. "These would combine the advantages of magnetic storage -- long-term stability, high data density, non-volatile operation and re-writability -- with the speed of optical writing and reading." This work included contributions from the European Synchrotron Radiation Facility and the University of Geneva. It was funded by the Swiss National Science Foundation, the European Research Council (PICOPROP and TopoMat) and the NCCR-MARVEL.
News Article | August 16, 2016
A rocket carrying China's first ever quantum satellite shot upward from Inner Mongolia on Tuesday (GMT+8), Aug. 16, propelling the country's goal of pioneering the first quantum communications network in outer space. The 1,400-plus-pound Quantum Experiments at Space Scale (QUESS) satellite roared from the Jiuquan Satellite Launch Center into the sky aboard a Long March-2D rocket at 1:40 local time. QUESS chief scientist Pan Jianwei says that China, once a follower in classical information technology (IT) development, is now at the forefront, ready to guide future achievements. The satellite will circle Earth once every 90 minutes after it reaches a sun-synchronous orbit at an altitude of 500 kilometers (310 miles), officials said. In late May, China announced plans to launch its first quantum satellite into space and obtain a highly coveted asset against cyberespionage: hack-proof communications. Tuesday's early launch is just the beginning of the country's strategy to surpass the West in this challenging scientific field. Quantum physicist Nicolas Gisin, a professor at the University of Geneva, says it is very likely that China will win the race to produce a quantum satellite. "It shows again China's ability to commit to large and ambitious projects," Gisin tells The Wall Street Journal. Indeed, the quantum communication race has been going on for the last two decades, ever since the initial demonstration of a quantum key distribution link under Lake Geneva in the 1990s, says Professor Alexander Sergienko of Boston University. What's more, although scientists from Europe, Japan and the United States are scrambling to take advantage of the powerful properties of subatomic particles, few of them have as much state support as Chinese researchers. In fact, quantum technology is a chief strategic focus of China's five-year economic development plan.
It hasn't been disclosed how much Beijing allocated to quantum research or to building the QUESS satellite, but China's basic research funding was estimated at $101 billion in 2015, reports say. During its two-year mission, QUESS will establish hack-proof communications by sending uncrackable keys from outer space to Earth. The satellite, which earned the nickname "Micius" after a fifth-century-BC Chinese philosopher and scientist, will also provide insights into quantum entanglement — one of the strangest phenomena in quantum physics. Quantum communication is a much-coveted technology that promises ultra-high security, because a quantum photon can neither be duplicated nor split; hence it is impossible to intercept, wiretap or crack the data transmitted through it. Meanwhile, Chinese scientists will test quantum key distribution between the satellite and ground stations and perform quantum communications between Beijing and Urumqi in Xinjiang. The QUESS satellite will also transmit entangled photons to two Earth stations 1,200 kilometers (745 miles) apart, in the hope of testing quantum entanglement over a greater distance. © 2016 Tech Times, All rights reserved. Do not reproduce without permission.
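The key-sifting step at the heart of quantum key distribution can be sketched in a few lines. This is a classical toy simulation of the basis-reconciliation stage of the BB84 protocol, not the actual QUESS protocol stack; the function and names are illustrative.

```python
import random

# Toy BB84 sifting: Alice encodes random bits in random bases ("+" or "x");
# Bob measures in random bases. When their bases match, Bob reads the bit
# correctly; mismatched-basis results are random and are discarded.

def bb84_sift(n, rng):
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]
    bob_bases   = [rng.choice("+x") for _ in range(n)]
    # No eavesdropper in this sketch: matching bases give a faithful readout.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

rng = random.Random(42)
ka, kb = bb84_sift(1000, rng)
assert ka == kb  # sifted keys agree when nobody intercepts
print(len(ka), "shared key bits from 1000 photons")  # roughly half survive sifting
```

The security argument in the article maps onto this sketch: because a photon cannot be cloned, an eavesdropper forced to measure in a randomly wrong basis disturbs the states and shows up as errors when Alice and Bob compare a sample of their sifted keys.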
News Article | March 9, 2016
Few scientists know that, instead of buying their lab equipment, they can often build it much more cheaply — and customize their creations — by following ‘open-hardware’ instructions that are freely available online. Fifty enthusiasts who gathered last week at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland, are hoping to remedy researchers’ lack of awareness about open science hardware. At the first conference dedicated to the field, they met to compare creations — and to thrash out a road map to promote the widespread manufacturing and sharing of labware. “We want open hardware to become a normal part of the scientific process,” says Shannon Dosemagen, a co-organizer of the conference who is executive director of the non-profit citizen-science community Public Lab. Proponents of open hardware — named by analogy to ‘open software’ in computer science — have already created free online designs for dozens of pieces of labware, taking advantage of manufacturing technologies such as 3D printers and laser-cutting machines. They argue that sharing designs for others to adapt can vastly accelerate the progress of science. But this share-all do-it-yourself (DIY) philosophy is yet to become mainstream. “The majority of scientists are still waiting to get involved,” says Joshua Pearce, an engineer at Michigan Technological University in Houghton, who two years ago published a book for scientists on how to create a low-cost lab. The open-hardware movement can already point to much success in science, says conference co-organizer Jenny Molloy, who coordinates OpenPlant, a synthetic-biology centre at the University of Cambridge, UK. Citizen-science projects, schools and researchers who lack money to buy expensive equipment have been particularly quick to adopt it. 
In 2009, for example, Irfan Prijambada, a microbiologist at Gadjah Mada University in Yogyakarta, Indonesia, was able to equip his lab with tissue-culture hoods and microscopes for less than 10% of their commercial price, using designs posted by a life-sciences-community platform called Hackteria. Online designs have been created for a wide range of labware, from DNA-amplifying PCR machines to fluorescence imaging microscopes (see ‘How to make a … digitally controlled syringe pump’). (Molloy says that the basic principles behind a lot of labware are not patented, so intellectual-property conflicts are rare.) For some kit — such as scanning tunnelling microscopes — the fabrication process is too complex to take place in the lab, but Pearce thinks that these, too, will eventually become open source. And because these blueprints are openly shared — allowing anyone to critique and improve them — the quality of equipment is often at least as good as or even better than what is available commercially, he says. For researchers, this ability to tinker with equipment is the main advantage of open-source sharing. “If it’s open source, I can adapt it and fix it. That’s most important to me,” says Tobias Wenzel, a biophysics PhD student at the University of Cambridge. But other scientists’ reluctance to dive into DIY may stem from doubts about whether open hardware can faithfully produce the validated, standardized performance of commercial equipment. Too often, the documentation that accompanies designs — intended to calibrate the equipment’s performance against known standards and describe its use — is unclear or incomplete, conference attendees heard. A community-standard or best-practice guide could use a checklist to ensure that designers cover all the necessary bases, says Wenzel. “It needs to be something that says: ‘if you follow this procedure, this will work and you’ll be able to get high precision, high accuracy and low error’,” says Pearce. 
The problem is that sharing work in enough detail for anyone else to follow takes time and effort, but provides little formal scientific credit. “It’s one thing to build something for one’s own research, but to make it so it’s easy for others to replicate is much more difficult,” says Ryan Fobel, an engineer at the University of Toronto, Canada, who helped to develop an open-source platform for doing biology and chemistry on a chip, known as DropBot. To this end, at the Geneva conference researchers debated ways to assign credit to the designers of open hardware. Some would like to see a citation system for designs, or want journals to publish more research papers that outline designs. A central repository for open science hardware might help: CERN hosts a repository for electronics open hardware, and the US National Institutes of Health has a 3D-printing repository with a labware section. But no single repository collates everything. Because many scientists won’t want to build devices themselves, taking open hardware mainstream will need to involve non-profit organizations and companies that can supply the kit, notes Francois Grey, a physicist at the University of Geneva and conference co-organizer. Firms such as OpenTrons in Brooklyn, New York, which makes automated pipetting systems, already both design open-source lab equipment and sell ready-made kit built from open-source designs. But because such companies give away their designs, figuring out a solid business model is a challenge, adds Javier Serrano, an engineer at CERN who helped to pioneer the lab’s Open Hardware Licence, which allows developers to ensure that all future modifications are documented and shared. Companies might make money by providing support for open hardware, or by conducting quality-assurance checks and validation tests that allow them to offer warranty-like guarantees on products, Pearce suggests. 
And a collection of success stories might also help scientists to convince their institutions — which may be accustomed to patenting in-house inventions — of the value of forming open-hardware spin-offs, adds Molloy. Pearce says that he dreams of a day when every published scientific article will instruct its readers not just on experimental methods, but also on how to build the equipment that the study requires. It’s something that will need the cooperation of funders to become a reality. Existing large-scale equipment grants tend to pay for single instruments, but Pearce would like to see the money spent on open-source hardware, which he says could bring down prices and — over time — improve designs.
News Article | March 8, 2016
Basal cell carcinoma (also known as basalioma or basal cell cancer, BCC) is the most common cancer, and the fifth most expensive to treat, accounting for about 90% of malignant skin tumors. Risk factors include genetic malfunctions, freckling, X-rays and other radiation, burns, and the use of immunosuppressive medication (for organ transplantation, for instance). 'The work led by Sergey Nikolaev, University of Geneva, found that more than 80% of BCC cases contain gene mutations that lead to tumor formation in other cancer types and were not initially associated with BCC. This discovery points to mechanisms that help BCC resist anticancer therapy, which opens a new range of opportunities for further clinical trials,' explains Vladimir Seplyarskiy, co-author of the paper and junior researcher at the Bioengineering and Bioinformatics Department, Moscow State University. Among the genes, and corresponding proteins, whose mutations provoke cancer in basal cell carcinoma are components of the Sonic hedgehog (Hh) signaling pathway. Hh governs the distribution of tasks among cells during embryogenesis and their further 'self-determination', as well as the establishment of the left-right orientation of organs. Under normal conditions the Sonic hedgehog signaling pathway is started by the eponymous signaling protein. First found in Drosophila, whose embryos look like thorny balls when the gene is absent, the Sonic hedgehog gene was actually named after Sonic the Hedgehog, a comic-book and video-game character. In 85% of basaliomas, a gene of the Sonic hedgehog signaling pathway is mutated, which activates the pathway without the help of the 'regulating hedgehog'. These genes encode the PTCH1 receptor and the SMO protein, both embedded in the cell membrane; SMO passes the hedgehog protein's instruction to start protein synthesis on to the Gli transcription factor.
In addition, in more than half of the tumor cells, a 'broken' version of the TP53 gene can be found among 'the accused'; under normal conditions TP53 suppresses tumor formation and orders cells containing defective DNA to 'commit suicide' by apoptosis. Sabotaging treatment: who is behind it? Suppressing the SMO protein with a medicine called vismodegib neutralizes the consequences of the breach in the signaling chain controlled by the Sonic hedgehog gene. Thanks to that medicine, the commander's orders do not reach its subjects and the work of the whole chain is stopped: by pulling out one of the first dominoes in the row, you can prevent all the others from falling. In 50% of cases, though, the tumor tissue either does not respond to the treatment or develops resistance. SMO protein mutations are responsible for half of those complications: they prevent the interaction of vismodegib and SMO; the reasons for the other half, however, remained unknown. That is why the scientists paid attention to the fact that 85% of basal cell carcinomas contain genetic mutations related to the development of other cancer types, including far more dangerous ones such as melanoma. After detecting those oncogenic mutations, the scientists managed to confirm the activation of the corresponding cancer pathways experimentally. One important observation made during the research was that the detected mutations are often seen in a subtype of aggressive basaliomas. 'We studied the peculiarities of the mutation process in basal cell carcinoma cells. More than 100 cancer exomes (the part of the genome consisting of protein-coding DNA sequences) allowed us both to find out which genes provoke this type of cancer and to investigate which processes help mutations accumulate (most of which do not influence cancer development, even in cancer samples). We compared the mutation profile of basal cell carcinoma with that of another cancer type, melanoma, which is also provoked by ultraviolet radiation.
As a result, we were able to conclude that, alongside UV, oxidative stress plays a more important role in basal cell carcinoma development,' says Vladimir Seplyarskiy. To catch the remaining accomplices red-handed in this near-detective story, the biologists analyzed 293 tumor samples from 236 patients. Thirty samples came from patients with Gorlin syndrome (a genetic disease that dramatically increases the likelihood of developing multiple basaliomas). Among the samples, 23 were vismodegib-resistant and 259 had not been exposed to the medicine. The scientists compared mutations in basalioma, melanoma, Wilms tumor (a kidney cancer that usually forms in the first three years of life) and other cancer types that are probably connected with the Sonic hedgehog gene and its 'accomplices' in the same pathway. The inquiry showed that of the 387 genes on the 'list of suspects', several may contribute to the appearance of basal cell carcinoma and to the development of resistance to treatment. Among them are genes forming the Hippo-YAP pathway, not only the pathways controlled by Sonic hedgehog, which used to bear the whole responsibility. The N-Myc protein content was also elevated in basal cell carcinoma, caused by a point mutation (the replacement of one nucleotide, a DNA 'letter') in the ??1 site. In other tumor types where a high concentration of N-Myc protein had been 'caught red-handed' earlier, the reason was a mutation provoking a multifold increase in the number of copies of the MYCN gene. According to Vladimir Seplyarskiy, 'this study is a turning point in understanding the molecular mechanisms of the appearance and development of basal cell carcinoma. The fact that basalioma cells are subject to oxidative stress may give rise to an alternative anticancer therapy for basal cell carcinoma.' The results also show which mutagens cause DNA damage in skin cells, the researcher says.
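The idea of comparing mutation profiles can be illustrated with a toy sketch. The function below is a hypothetical simplification, not the study's actual pipeline: it merely counts C>T changes in dipyrimidine contexts, the classic signature of UV-induced DNA damage, as a fraction of all observed mutations.

```python
# Toy mutation-spectrum counter (illustrative; not the paper's pipeline).
# UV damage characteristically produces C>T changes at sites where the
# mutated C is preceded by another pyrimidine (C or T) on the same strand.

def uv_signature_fraction(mutations):
    """mutations: list of (ref_base, alt_base, preceding_base) triples.
    Returns the fraction of mutations matching the UV-like C>T pattern."""
    total = len(mutations)
    uv_like = sum(1 for ref, alt, prev in mutations
                  if ref == "C" and alt == "T" and prev in "CT")
    return uv_like / total if total else 0.0

# Hypothetical sample: 2 of 4 mutations are C>T in a dipyrimidine context.
sample = [("C", "T", "T"), ("C", "T", "C"), ("G", "A", "A"), ("C", "T", "A")]
assert uv_signature_fraction(sample) == 0.5
```

Comparing such fractions between tumor types, with real exome data, is one simple way a UV-driven cancer like melanoma can be distinguished from one where other processes, such as oxidative stress, also contribute.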
News Article | September 14, 2016
In the mid-1960s, when few people had even seen a computer, Seymour Papert was making it possible for children to use and program them. He spent his career inventing the tools, toys, software and projects that popularized the view of computers as incubators of knowledge. Papert wrote three seminal books on using the computer to supercharge learning, aimed at academics, teachers and parents — Mindstorms: Computers, Children, and Powerful Ideas (1980), The Children's Machine: Rethinking School in the Age of the Computer (1993) and The Connected Family: Bridging the Digital Generation Gap (1996). Few academics of Papert's stature have spent as much time as he did working in real schools. He delighted in the theories, ingenuity and playfulness of children. Tinkering or programming with them was the cause of many missed meetings. Papert, who died on 31 July, was born on leap day 1928 in Pretoria, South Africa. His father was an entomologist. Before he was two, he became enamoured of the automotive gears that were lying around his home, which became the basis for early maths and science experiences. As an educator, he sought to help each learner to find his or her 'gears': objects or experiences they could mess about with, intuiting powerful ideas along the way. Papert believed that what gears could not do for all, the computer, the Proteus of machines, might. Papert was repelled by apartheid. While in school, he ran afoul of the authorities by organizing classes for local black servants. His anti-apartheid activities as a young adult branded him a dissident, and he was prohibited from travelling outside South Africa. He earned a bachelor's degree in philosophy (1949) and a PhD (1952) in mathematics at the University of the Witwatersrand in Johannesburg. Without a passport, in 1954 he made his way to the University of Cambridge, UK, where he earned a second doctorate, in 1959, for work on the lattices of logic and topology.
From 1959 to 1963, Papert worked at the University of Geneva with the Swiss philosopher and psychologist Jean Piaget. Their collaboration led to great insights into how children learn to think mathematically. Papert built on Piaget's theory of constructivism with a learning theory of his own: constructionism. It proposed that the best way to ensure that knowledge is built in the learner is through the active construction of something shareable — a poem, program, model or idea. In 1963, artificial-intelligence (AI) pioneer Marvin Minsky invited Papert to join him at the Massachusetts Institute of Technology (MIT) in Cambridge. Papert was soon promoted to co-direct Minsky's Artificial Intelligence Laboratory. The pair co-authored the 1969 book Perceptrons; their mathematical analyses of how neuron-like networks composed of individual agents could model the brain had a great impact on AI research. In 1985, Papert became a founding faculty member of the MIT Media Laboratory, where he led research groups on epistemology and learning, and on the future of learning. Thinking about thinking and the freedom to achieve one's potential were the leitmotifs of his life. He wanted to create “a mathematics children can love rather than inventing tricks to teach them a mathematics they hate”. In the late 1960s, Papert was among the creators of Logo, the first programming language for children. One element that made Logo accessible was the turtle, which acted as the programmer's avatar. As mathematical instructions were given to the turtle to move about in space, the creature dragged a pen to draw a trail. Such drawings created turtle geometry, a context in which linear measurement, arithmetic, integers, angle measure, motion and foundational concepts from algebra, geometry and even calculus were made concrete and understandable. Mathematics became playful, personal, expressive, relevant and purposeful.
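The turtle idea can be sketched as a minimal interpreter. This is an illustrative reconstruction in Python, not the historical Logo implementation: the turtle tracks only a position and a heading, and FORWARD/RIGHT commands trace out the pen trail of turtle geometry.

```python
import math

# Minimal Logo-style turtle (a sketch of the idea, not historical Logo).

class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0            # degrees; 0 points along +x
        self.path = [(0.0, 0.0)]      # the pen trail

    def forward(self, dist):
        rad = math.radians(self.heading)
        self.x += dist * math.cos(rad)
        self.y += dist * math.sin(rad)
        self.path.append((round(self.x, 6), round(self.y, 6)))

    def right(self, angle):
        # Logo's RIGHT turns clockwise, i.e. decreases the math-style angle.
        self.heading = (self.heading - angle) % 360

# The classic square: REPEAT 4 [FORWARD 100 RIGHT 90]
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.right(90)
assert (round(t.x, 6), round(t.y, 6)) == (0.0, 0.0)  # back where it started
```

Varying the repeat count and turn angle (three repeats of RIGHT 120 for a triangle, many small turns for a circle) is exactly how angle measure and geometry became concrete for children programming the turtle.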
In 1968, Alan Kay, now known as the designer of what became the Macintosh graphical user interface, was so impressed by the mathematics he saw children spontaneously engaged in at Papert's Logo lab at MIT that, on his flight home, he sketched the Dynabook, the prototype for what became the personal computer. In 1989, Australian schools seeking to realize Papert's ideas began providing a laptop to every student. In 2000, Maine governor Angus King proposed providing a laptop for every 7th and 8th grader (typically 12–14-year-olds). Papert spent two years making the case across the state, and popular opinion eventually overrode legislative resistance. The programme remains in place today. Papert was also an inspiration behind the One Laptop per Child initiative, which has reached millions of children in the developing world. A 1971 paper co-authored by Papert, 'Twenty Things to Do with a Computer' (see go.nature.com/2buuwe), marks the birth of the modern 'maker movement'. It describes a world in which children would use programming to drive inventions and experiments beyond the computer itself. In the mid-1980s, Papert and his colleagues made that world a reality with the first programmable robotics system for children, LEGO TC Logo. The name of LEGO's current line of robotics sets, Mindstorms, is a hat-tip to Papert. In 1989, LEGO endowed a permanent chair at the MIT Media Lab in his name. Although he was critical of institutional schooling, Papert did his research in schools, often with under-served populations of students. In 1986, he was invited to help Costa Rica reinvent its educational system, and from 1999 to 2002 he led an alternative, high-tech, project-based learning environment inside a prison for teenagers. Papert dared educators to grow, invent and lead in a system prone to compliance and standardization. He argued that education is a natural process that blossoms in the absence of coercion. In Papert's eyes, the computer was an object to think with.
He built a bridge between progressive educational traditions and the Internet age to maintain the viability of schooling, and to ensure the democratization of powerful ideas.
News Article | February 16, 2017
Alpha cells in the pancreas can be induced in living mice to quickly and efficiently become insulin-producing beta cells when the expression of just two genes is blocked, according to a study led by researchers at the Stanford University School of Medicine. Studies of human pancreases from diabetic cadaver donors suggest that the alpha cells' "career change" also occurs naturally in diabetic humans, but on a much smaller and slower scale. The research suggests that scientists may one day be able to take advantage of this natural flexibility in cell fate to coax alpha cells to convert to beta cells in humans to alleviate the symptoms of diabetes. "It is important to carefully evaluate any and all potential sources of new beta cells for people with diabetes," said Seung Kim, MD, PhD, professor of developmental biology and of medicine. "Now we've discovered what keeps an alpha cell as an alpha cell, and found a way to efficiently convert them in living animals into cells that are nearly indistinguishable from beta cells. It's very exciting." Kim is the senior author of the study, which will be published online Feb. 16 in Cell Metabolism. Postdoctoral scholar Harini Chakravarthy, PhD, is the lead author. "Transdifferentiation of alpha cells into insulin-producing beta cells is a very attractive therapeutic approach for restoring beta cell function in established Type 1 diabetes," said Andrew Rakeman, PhD, the director of discovery research at JDRF, an organization that funds research into Type 1 diabetes. "By identifying the pathways regulating alpha to beta cell conversion and showing that these same mechanisms are active in human islets from patients with Type 1 diabetes, Chakravarthy and her colleagues have made an important step toward realizing the therapeutic potential of alpha cell transdifferentiation." Rakeman was not involved in the study. 
Cells in the pancreas called beta cells and alpha cells are responsible for modulating the body's response to the rise and fall of blood glucose levels after a meal. When glucose levels rise, beta cells release insulin to cue cells throughout the body to squirrel away the sugar for later use. When levels fall, alpha cells release glucagon to stimulate the release of stored glucose. Although both Type 1 and Type 2 diabetes are primarily linked to reductions in the number of insulin-producing beta cells, there are signs that alpha cells may also be dysfunctional in these disorders. "In some cases, alpha cells may actually be secreting too much glucagon," said Kim. "When there is already not enough insulin, excess glucagon is like adding gas to a fire." Because humans have a large reservoir of alpha cells, and because the alpha cells sometimes secrete too much glucagon, converting some alpha cells to beta cells should be well-tolerated, the researchers believe. The researchers built on a mouse study conducted several years ago in a Swiss laboratory that also collaborated on the current work. That study showed that when beta cells are destroyed, about 1 percent of alpha cells in the pancreas begin to look and act like beta cells, but the conversion happened very slowly. "What was lacking in that initial index study was any sort of understanding of the mechanism of this conversion," said Kim. "But we had some ideas based on our own work as to what the master regulators might be." Chakravarthy and her colleagues targeted two main candidates: a protein called Arx, known to be important during the development of alpha cells, and another called DNMT1, which may help alpha cells "remember" how to be alpha cells by maintaining chemical tags on their DNA. The researchers painstakingly generated a strain of laboratory mice unable to make either Arx or DNMT1 in pancreatic alpha cells when the animals were administered a certain chemical compound in their drinking water.
They observed a rapid conversion of alpha cells into what appeared to be beta cells in the mice within seven weeks of blocking the production of both these proteins. To confirm the change, the researchers collaborated with colleagues in the laboratory of Stephen Quake, PhD, a co-author and professor of bioengineering and of applied physics at Stanford, to study the gene expression patterns of the former alpha cells. They also shipped the cells to collaborators in Alberta, Canada, and at the University of Illinois to test the electrophysiological characteristics of the cells and whether and how they responded to glucose. "Through these rigorous studies by our colleagues and collaborators, we found that these former alpha cells were -- in every way -- remarkably similar to native beta cells," said Kim. The researchers then turned their attention to human pancreatic tissue from diabetic and nondiabetic cadaver donors. They found that samples of tissue from children with Type 1 diabetes diagnosed within a year or two of their death include a proportion of bi-hormonal cells -- individual cells that produce both glucagon and insulin. Kim and his colleagues believe they may have caught the cells in the act of converting from alpha cells to beta cells in response to the development of diabetes. They also saw that the human alpha cell samples from the diabetic donors had lost the expression of the very genes -- ARX and DNMT1 -- they had blocked in the mice to convert alpha cells into beta cells. "So the same basic changes may be happening in humans with Type 1 diabetes," said Kim. "This indicates that it might be possible to use targeted methods to block these genes or the signals controlling them in the pancreatic islets of people with diabetes to enhance the proportion of alpha cells that convert into beta cells." Kim is a member of Stanford Bio-X, the Stanford Cardiovascular Institute, the Stanford Cancer Institute and the Stanford Child Health Research Institute. 
Researchers from the University of Alberta, the University of Illinois, the University of Geneva and the University of Bergen are also co-authors of the study. The research was supported by the National Institutes of Health (grants U01HL099999, U01HL099995, UO1DK089532, UO1DK089572 and UC4DK104211), the California Institute for Regenerative Medicine, the Juvenile Diabetes Research Foundation, the Center of Excellence for Stem Cell Genomics, the Wallenberg Foundation, the Swiss National Science Foundation, the NIH Beta-Cell Biology Consortium, the European Union, the Howard Hughes Medical Institute, the H.L. Snyder Foundation, the Elser Trust and the NIH Human Islet Resource Network. Stanford's Department of Developmental Biology also supported the work. The Stanford University School of Medicine consistently ranks among the nation's top medical schools, integrating research, medical education, patient care and community service. For more news about the school, please visit http://med. . The medical school is part of Stanford Medicine, which includes Stanford Health Care and Stanford Children's Health. For information about all three, please visit http://med. .
News Article | March 10, 2016
Sequencing and comparative analysis of the genome of the western orchard predatory mite, through an international research effort co-led by scientists from the University of Geneva and the SIB Swiss Institute of Bioinformatics, has revealed intriguingly extreme genomic evolutionary dynamics. In a study published in the journal Genome Biology and Evolution, the researchers detail initial insights into several remarkable features of the genome of this agriculturally important mite, which is widely employed to control plant pests, with thousands shipped to fruit growers every day. As a major natural enemy of several damaging agricultural pests, the predatory mite Metaseiulus occidentalis is used in many agricultural settings as an effective biological control agent. Some of its favourite prey include spider mites that feed on and destroy various fruits including strawberries, apples, peaches and grapes. "I have been studying the behaviour, ecology, and molecular biology of these mites for more than 40 years," said lead author Prof. Marjorie Hoy of the University of Florida, USA, "so I was very keen to sequence the entire genome to reveal the full catalogue of genes." To explore the unique biology of this agriculturally important predator, the researchers focused their studies on genes putatively involved in the paralysis and pre-oral digestion of prey species and in the mite's rather rare parahaploid sex-determination system, as well as in how it senses chemical cues from its surroundings and defends itself from infections. Compared with other arthropod species, the evolutionary history of this mite's genome has been particularly dynamic. For example, the team's analyses revealed remarkably more intron gains and losses than in other arthropods.
"The dynamic gains and losses of introns in the genes of this mite are in stark contrast to its closest relative with a draft genome assembly, the Ixodes tick," described Dr Robert Waterhouse, lead author from the University of Geneva and the SIB Swiss Institute of Bioinformatics. The researchers identified five copies of Dicer-2, a gene almost always found as a single copy in other arthropods, suggesting a possible rewiring of RNA-processing pathways. The Hox genes, which are important for determining animal body plans and are located in a cluster of neighbouring genes in almost all species examined to date, were found to be completely dispersed across the mite's genome. "This raises questions about how regulatory programmes that turn Hox genes on and off during the coordinated development of complex body plans can be achieved even when the genes are no longer physically close to each other in the genome," explained Dr Waterhouse. "These resources greatly improve the genomic sampling of chelicerates, a group of arthropods that has so far been poorly represented, mainly due to challenges associated with their often very large genomes," said Prof. Stephen Richards from the Baylor College of Medicine, USA, where the genome sequencing was performed. Indeed, results from the study's phylogenomic analyses question the relationships amongst some of the major chelicerate groups of mites, ticks, and spiders, further emphasising the need for improved genomic sampling in this clade. This reference genome assembly therefore provides valuable new high-quality resources for future functional genomic and taxonomic analyses of this family of predatory mites and other arachnids. More information: Marjorie A. Hoy et al. Genome sequencing of the phytoseiid predatory mite reveals completely atomised Hox genes and super-dynamic intron evolution, Genome Biology and Evolution (2016). DOI: 10.1093/gbe/evw048
News Article | November 10, 2016
There are not a lot of things that could bring together people as far apart on the US political spectrum as Republican Newt Gingrich and Democrat Bob Kerrey. But in 2007, after leading a three-year commission that looked into the costs of care for elderly people, the political rivals came to full agreement on a common enemy: dementia. At the time, there were fewer than 30 million people worldwide diagnosed with the condition, but it was clear that the numbers were set to explode. By 2050, current predictions suggest, it could reach more than 130 million, at which point the cost to US health care alone from diseases such as Alzheimer’s will probably hit US$1 trillion per year in today’s dollars. “We looked at each other and said, ‘You know, if we don’t get a grip on Alzheimer’s, we can’t get anything done because it’s going to drown the system,’” recalls Gingrich, the former speaker of the US House of Representatives. He still feels that sense of urgency, and for good reason. Funding has not kept pace with the scale of the problem; targets for treatments are thin on the ground and poorly understood; and more than 200 clinical trials for Alzheimer’s therapies have been terminated because the treatments were ineffective. Of the few treatments available, none addresses the underlying disease process. “We’re faced with a tsunami and we’re trying to deal with it with a bucket,” says Gingrich. But this message has begun to reverberate around the world, which gives hope to the clinicians and scientists. Experts say that the coming wave can be calmed with the help of just three things: more money for research, better diagnostics and drugs, and a victory — however small — that would boost morale. “What we really need is a success,” says Ronald Petersen, a neurologist at Mayo Clinic in Rochester, Minnesota. After so many failures, one clinical win “would galvanize people’s interest that this isn’t a hopeless disorder”. 
Dementia is the fifth-biggest cause of death in high-income countries, but it is the most expensive disease to manage because patients require constant, costly care for years. And yet, research funding for dementia pales in comparison with that for many other diseases. At the US National Institutes of Health (NIH), for example, annual funding for dementia in 2015 was only around $700 million, compared with some $2 billion for cardiovascular disease and more than $5 billion for cancer. One problem is visibility. Other disease communities — most notably, people affected by breast cancer and HIV/AIDS — have successfully advocated for large pots of dedicated research funding. But “there simply wasn’t any comparable upswell of attention to Alzheimer’s”, says George Vradenburg, chair and co-founder of UsAgainstAlzheimer’s, a non-profit organization in Chevy Chase, Maryland. The biggest reason, he says, is that “the victims of the disease hide out”. Dementia mostly affects elderly people and is often misconstrued as a normal part of ageing; there is a stigma attached to the condition, and family care-givers are often overworked and exhausted. Few are motivated enough to speak up. However, social and political awareness has increased in the past five years. “We all started to work together a lot more, and that helps,” says Susan Peschin, chief executive at the Alliance for Aging Research in Washington DC, one of more than 50 non-profit groups in the Accelerate Cure/Treatments for Alzheimer’s Disease coalition. The impact can be seen in government investments. France took action first, creating a national plan for Alzheimer’s in 2008 that included €200 million (US$220 million) over five years for research. In 2009, the German Centre for Neurodegenerative Diseases in Bonn was created with a €66-million annual budget. And UK spending on dementia research more than doubled between 2010 and 2015, to £66 million (US$82 million). 
The European Union has been dishing out tens of millions of euros each year for dementia studies through the Innovative Medicines Initiative and the Joint Programming process, and Australia is now about halfway through doling out its Aus$200-million (US$150-million), five-year dementia-research fund. “This is a global challenge, and no one country will be able to solve the problem,” says Philippe Amouyel, a neurologist and geneticist at the University Hospital of Lille in France. Yet it’s the United States that has been the biggest backer by far, thanks in part to efforts by Gingrich and Kerrey. The NIH’s annual budget for Alzheimer’s and other dementias jumped in the past year to around $1 billion, and there is support for a target to double that figure in the next few years — even in the fractious US political landscape. “Alzheimer’s doesn’t care what political party you’re in,” says Kerrey. Two billion dollars is “a reasonable number”, says Petersen, who chairs the federal advisory board that came up with the target in 2012. Now, he adds, the research community just needs to work out “what are we going to do with it if in fact we get it?”. The answer could depend in large part on the fate of a drug called solanezumab, developed by Eli Lilly of Indianapolis, Indiana. This antibody-based treatment removes the protein amyloid-β, which clumps together to form sticky plaques in the brains of people with Alzheimer’s. By the end of this year, Lilly is expected to announce the results of a 2,100-person clinical trial testing whether the drug can slow cognitive decline in people with mild Alzheimer’s. It showed preliminary signs of cognitive benefit in this patient population in earlier trials (R. S. Doody et al. N. Engl. J. Med. 370, 311–321; 2014), but the benefits could disappear in this final stage of testing, as has happened for practically every other promising compound. No one is expecting a cure. 
If solanezumab does delay brain degradation, at best it might help people to perform 30–40% better on cognitive tests than those on a placebo. But even such a marginal gain would be a triumph. It would show scientists and the drug industry that a disease-modifying therapy is at least possible. By contrast, another setback could bring recent momentum in therapeutic development to a halt. “This is a fork in the road,” says John Hardy, a neurogeneticist at University College London. “This is going to be a very important outcome, way beyond the importance for Lilly and this particular drug.” On a scientific level, success for solanezumab could lend credence to the much-debated amyloid hypothesis, which posits that the build-up of amyloid-β in the brain is one of the triggers of Alzheimer’s disease. The previous failure of amyloid-clearing agents led many to conclude that plaques were a consequence of a process in the disease, rather than the cause of it. But those in favour of the amyloid hypothesis say that the failed drugs were given too late, or to people with no amyloid build-up — possibly those with a different form of dementia. For its latest solanezumab trial, Lilly sought out participants with mild cognitive impairment, and used brain scans and spinal-fluid analyses to confirm the presence of amyloid-β in their brains. Another company, Biogen in Cambridge, Massachusetts, took the same approach to screening participants in a trial of its amyloid-targeting drug aducanumab. Earlier this year, a 165-person study reported early signs that successfully clearing amyloid-β with the Biogen therapy correlated with slower cognitive decline (J. Sevigny et al. Nature 537, 50–56; 2016). 
If those results hold up to further scrutiny, “that will at least tell us that amyloid is sufficiently upstream in the cascade that it deserves being targeted and tackled pharmacologically”, says Giovanni Frisoni, a clinical neuroscientist at the University of Geneva in Switzerland who is involved in the drug’s testing. Although debate over the amyloid hypothesis continues, interest is growing in earlier intervention with drugs that clear the protein. Reisa Sperling, a neurologist at Brigham and Women’s Hospital in Boston, Massachusetts, worries that even mild dementia is a sign of irreparable brain-cell death. “You can suck all the amyloid out of the brain or stop it from further accumulating, but you’re not going to grow those neurons back.” That is why she is leading Anti-Amyloid Treatment in Asymptomatic Alzheimer’s, or A4, a $140-million, placebo-controlled solanezumab study that aims to treat people with elevated amyloid levels before they show any signs of cognitive impairment. And A4 is not her only trial. In March, she and neurologist Paul Aisen of the University of Southern California’s Alzheimer’s Therapeutic Research Institute in San Diego launched a trial in 1,650 asymptomatic people with early signs of amyloid-β build-up. It will test a pill from Johnson & Johnson that blocks β-secretase, an enzyme responsible for producing the toxic protein. These interventions are known as secondary prevention because they target people who are already developing amyloid plaques. Sperling and Aisen also plan to test what’s called primary prevention. In August, they received NIH funding to start treating people who have normal brain levels of amyloid-β and no signs of cognitive decline, but who have a high risk of developing Alzheimer’s — because of a combination of factors such as age and genetics. “The biggest impact we can have is in delaying the onset of the diseases,” says David Holtzman, a neurologist at Washington University School of Medicine in St. 
Louis, Missouri, and an investigator in the Dominantly Inherited Alzheimer Network, which is testing the benefits of giving either solanezumab or another anti-amyloid therapy to people who inherit gene mutations that predispose them to develop Alzheimer’s at an early age. Secondary prevention could eventually mean screening everyone past middle age for signs of amyloid-β, although the current testing methods are either expensive ($3,000 brain scans) or invasive (spinal taps). Researchers have flagged a dozen possible blood-based biomarkers, but none has yet panned out, says Dennis Selkoe, a Brigham and Women’s Hospital neurologist. Yet a cheap and easy diagnostic test for amyloid-β could ultimately prove unnecessary. In the same way that some have suggested giving cholesterol-lowering drugs to anyone at risk of heart disease, clinicians might eventually give anti-amyloid drugs to a broad set of people prone to Alzheimer’s — even if they are not already amyloid positive, says Sperling. Just as cholesterol is not the sole cause of heart disease, amyloid-β is not the only driver of Alzheimer’s. There’s also tau, a protein that causes tangles in the brains of most people with Alzheimer’s. Several pharmaceutical companies are targeting tau, but few large drug-makers have clinical candidates directed at other types of target. “They know how to modulate a specific target and keep looking under that lamp post, rather than venturing away from their comfort zones,” says Bernard Munos, an industry consultant and former Eli Lilly executive. That’s a problem, says Howard Fillit, chief science officer of the Alzheimer’s Drug Discovery Foundation in New York City. “We really need to increase the diversity of targets we’re tackling.” After amyloid and tau, the only target receiving much attention from researchers is neuroinflammation — the “third leg of the stool” in treating Alzheimer’s, according to neurogeneticist Rudy Tanzi at Massachusetts General Hospital in Boston. 
He likens Alzheimer’s disease to a wildfire in the brain. Plaques and tangles provide the initial brush fires, but it’s the accompanying neuroinflammation that fans the flames. Once the blaze is raging, Tanzi says, “putting out those brush fires that got you there isn’t good enough”. This could explain why anti-amyloid drugs failed when given to people with full-blown dementia. For these individuals, perhaps reducing the inflammatory activity of brain immune cells called microglia could help. Drug researchers are now focusing on two genes, CD33 and TREM2, that are involved in microglial function. But, says Tanzi, “there are two dozen other genes that deserve attention. Who knows if one of these new genes that no one is working on might lead to drug clues?” Many Alzheimer’s experts emphasize the need to develop better low-cost interventions that don’t require drug research. At the University of New South Wales in Sydney, Australia, for example, geriatric psychiatrist Henry Brodaty is testing whether an Internet coaching tool that focuses on diet, exercise, cognitive training and mood can postpone disease development. “We know that two-thirds of the world’s dementia is going to be in developing countries,” he says (see ‘The approaching wave’). Lifestyle interventions, he argues, could be more broadly scalable than expensive drugs. Researchers also need to look beyond Alzheimer’s, to the many other types of dementia. Injuries to the vessels that supply blood to the brain cause a form called vascular dementia. Clumps of a protein called α-synuclein underlie cognitive problems in people with Parkinson’s disease and also what’s called Lewy body dementia. Tau deposits are often behind the nerve-cell loss responsible for frontotemporal dementia. And there are many other, equally devastating, drivers of serious mental decline. 
“We should not be ignoring these other diseases,” says Nick Fox, a neurologist at University College London, especially given that many types of dementia share biological mechanisms. Tackling one disease could help inform treatment strategies for another. But perhaps the biggest hindrance to drug development today is more logistical than scientific, with clinical trials for dementia taking years to complete as investigators struggle to recruit sufficient numbers of study participants. “We need to get answers more quickly,” says Marilyn Albert, director of the Johns Hopkins Alzheimer’s Disease Research Center in Baltimore, Maryland. One solution is trial-ready registries. By enrolling people who are interested in taking part in a study before it actually exists, investigators can start a trial as soon as a drug comes along for testing. “We have to register humanity in the task of defeating this disease,” says Aisen. The 1,600-person COMPASS-ND registry is being funded through the Canadian Consortium on Neurodegeneration in Aging. Member Serge Gauthier, a neurologist at McGill University in Montreal, says that finding participants can be challenging. But he adds that around one-third of the people who come to memory clinics such as his have what’s known as subjective cognitive impairment — they might forget names or suffer from other ‘senior moments’, but they do not meet the clinical definition of dementia. They are perfect for trial-ready registries, says Gauthier: they are at an elevated risk of the disease, and they’ve demonstrated concern. Gauthier wants to find more people like them. He fits the profile himself, so he joined the Brain Health Registry, which has more than 40,000 participants so far and is led by researchers at the University of California, San Francisco. He takes regular cognitive tests, and could be asked to do more once potential diagnostic tools or therapies are ready for testing. “It’s a fun thing to do,” he says. 
Voluntarily or not, people will need to face up to dementia, because in just a few short decades, pretty much everyone is going to have a friend or loved one affected by the disease. It’s an alarming idea, and it should spur action, says Robert Egge, chief public policy officer of the Alzheimer’s Association in Chicago, Illinois. “We know where we’re heading,” he says. “The question is: are we going to get in front of it or not?”
News Article | December 5, 2016
Life on earth largely depends on the conversion of light energy into chemical energy through photosynthesis by plants. However, absorption of excess sunlight can damage the complex machinery responsible for this process. Researchers from the University of Geneva (UNIGE), Switzerland, have discovered how Chlamydomonas reinhardtii, a mobile single-cell alga, activates the protection of its photosynthetic machinery. Their study, published in the journal PNAS, indicates that the receptors (UVR8) that detect ultraviolet rays induce the activation of a safety valve that allows dissipation of excess energy as heat. A second protective role is thus attributed to these receptors, whose ability to induce the production of an anti-UV 'sunscreen' had already been shown by the Geneva team. The energy of the sun is converted by plants into chemical energy through photosynthesis in order to produce sugars to feed themselves. The first step of this process, which takes place in cell compartments called chloroplasts, is the capture of photons of light by chlorophyll. Although light is essential for plants, sun in excess can damage their photosynthetic machinery, thereby affecting their growth and productivity. To protect themselves, plants activate a protection mechanism when light is too intense, which involves a series of proteins capable of converting the surplus of energy into heat to be harmlessly dissipated. "UV-B ultraviolet light is likely to cause the most damage to the photosynthetic machinery, and we wanted to know whether it is involved in activating protection mechanisms and, if so, how", say Michel Goldschmidt-Clermont and Roman Ulm, professors at the Department of Botany and Plant Biology of the UNIGE Faculty of Science. This work, conducted in collaboration with researchers from the Universities of Grenoble and of California, was carried out in Chlamydomonas reinhardtii, a single-cell mobile alga used as a model organism. 
The team of Roman Ulm had discovered in 2011 the existence of a UV-B receptor, called UVR8, whose activation allows plants to protect themselves against these rays and to develop their own molecular 'sunscreen'. The researchers now demonstrate that this receptor activates a second protection mechanism. "When UVR8 perceives UV-B rays, it triggers a signal that induces, at the level of the cell nucleus, the production of proteins that will then be imported into the chloroplasts. Once integrated into the photosynthetic apparatus, they will help to divert excess energy, which will be dissipated as heat through molecular vibrations", explains Guillaume Allorent, first author of the article. In terrestrial plants, the perception of UV-B by the UVR8 receptor is also important for the protection of the photosynthetic machinery, but the underlying mechanism has not yet been elucidated. "It is crucial for agricultural productivity and the biotechnological exploitation of photosynthetic processes to better understand the mechanisms leading to photoprotection under sunlight and its UV-B rays", says Michel Goldschmidt-Clermont: a project the Geneva team intends to pursue. More information: UV-B photoreceptor-mediated protection of the photosynthetic machinery in Chlamydomonas reinhardtii, PNAS, www.pnas.org/cgi/doi/10.1073/pnas.1607695114
News Article | November 11, 2016
The Asian longhorned beetle, Anoplophora glabripennis, also known as the starry sky beetle, is native to eastern Asia but has successfully invaded North America and Europe, where it infests maple, birch, willow, elm, and poplar trees. Publishing in the journal Genome Biology, an international team of scientists from more than 30 research institutions worldwide, including the SIB Swiss Institute of Bioinformatics and the University of Geneva (UNIGE), reports on the sequencing, annotation, and comparative exploration of this beetle's genome, in an effort to develop novel tools to combat its spread and to better understand the biology of invasive wood-boring pests. The results begin to unravel the complex genetic and genomic basis of the beetle's invasiveness and of the evolutionary success with which beetles exploit plants.

The Swiss Federal Office for the Environment (FOEN) lists more than 100 invasive species already posing threats in Switzerland, including the Asian longhorned beetle and other insects such as the box tree moth, the harlequin ladybird, the Asian tiger mosquito, and the Ambrosia and Colorado potato beetles. The Swiss Federal Institute for Forest, Snow and Landscape Research (WSL) considers Anoplophora glabripennis to be one of the most dangerous pests affecting broadleaf trees. The adult female beetles chew through the tree's bark to lay their eggs in a small hole, so as soon as the larvae hatch they have a ready source of vascular plant tissue on which to feed. As they mature, the larvae then tunnel deep into the tree's heartwood, with each larva capable of consuming up to 1'000 cubic cm of wood in its lifetime. The emerging adults will usually produce the next generation on the same host tree, but high-density infestations eventually kill the tree, so the beetles must disperse to find new host trees for their young.

It is likely that the first invaders were unknowingly imported as larvae hidden inside wooden packaging materials. New laws requiring such packaging from China to be heat-dried or chemically treated to kill any larvae have helped to limit new invasions, and effective pest inspections and increasing public awareness are helping to prevent any further spread of Asian longhorned beetles in Switzerland. The collaborative research project to sequence, annotate, and explore the Anoplophora glabripennis genome was led by Prof. Duane McKenna from the University of Memphis, with DNA sequencing and genome assembly performed at the Baylor College of Medicine, directed by Prof. Stephen Richards, as part of the i5K arthropod genome initiative.
The team of researchers who analysed this new wealth of genetic and genomic data included experts from the SIB Swiss Institute of Bioinformatics and the University of Geneva Faculty of Medicine, Dr Panagiotis Ioannidis and Dr Robert Waterhouse from the group of Prof. Evgeny Zdobnov. The international team's efforts were made possible through funding from United States agencies including the National Human Genome Research Institute, the National Science Foundation, the National Institute of Food and Agriculture, and National Institutes of Health, as well as the German Research Foundation and the Swiss National Science Foundation. Their findings from exploring the 710 megabasepair genome and its 22'035 encoded genes are published in a comprehensive manuscript in the journal Genome Biology. Sequencing and annotating the beetle's genome enabled the researchers to perform detailed comparative analyses with other insects and examine the thousands of encoded genes for clues about how they have evolved to successfully feed on tree tissues. Prof. McKenna said "Research in my laboratory has focused on the evolution of beetle-plant interactions and phytophagy, i.e. plant-feeding, so sequencing the whole genome now allows us to identify the full set of genes that facilitate the specialised feeding of this beetle on woody plants." He further explained that "In particular, gene duplication, i.e. the generation of new gene copies in this beetle genome, and their subsequent functional divergence, have been important factors that have led to the expansion and enhancement of its metabolic gene repertoire, in some cases involving genes acquired from fungi and bacteria." As well as encoding many of these enzymes required to degrade plant tissues, the genome also revealed several expanded sets of genes that are known to be important for the detoxification of the chemicals normally produced by plants to defend themselves against attacks by such pests. 
"This means that the beetle is able to quickly get rid of these toxic plant defence chemicals that would normally deter most insects, and continue to feed on the woody tissues of the host trees", explained Dr Ioannidis. Dr Waterhouse added that "Using our comparative genomics tools including OrthoDB and BUSCO, we were able to classify genes into those shared across many insect species and those that are specific to beetles, especially plant-feeding beetles, to highlight genes that may be particularly important for the biological innovations that have allowed beetles to become such a successful - and in this case dangerous - group of insects."

The SIB Swiss Institute of Bioinformatics is an academic not-for-profit organization. Its mission is to lead and coordinate the field of bioinformatics in Switzerland. Its data science experts join forces to advance biological and medical research and enhance health by (i) providing the national and international life science community with a state-of-the-art bioinformatics infrastructure, including resources, expertise and services; and (ii) federating world-class researchers and delivering training in bioinformatics. It includes some 65 world-class research and service groups and some 800 scientists in the fields of genomics, transcriptomics, proteomics, evolution, population genetics, systems biology, structural biology, biophysics and clinical bioinformatics.

Reference: McKenna et al. Genome of the Asian longhorned beetle (Anoplophora glabripennis), a globally significant invasive species, reveals key functional and evolutionary innovations at the beetle-plant interface. Genome Biology, 2016. DOI: 10.1186/s13059-016-1088-8
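The orthology-based classification Dr Waterhouse describes can be illustrated with a toy sketch in Python. The species sets, group identifiers, and the `classify` function below are hypothetical, for illustration only; the actual analyses used the OrthoDB database and the BUSCO tool:

```python
# Toy illustration: an orthologous group is labelled by which species
# contain a member of it (all group data below is made up).
ortho_groups = {
    "OG0001": {"A.glabripennis", "T.castaneum", "D.melanogaster", "A.mellifera"},
    "OG0002": {"A.glabripennis", "T.castaneum"},
    "OG0003": {"A.glabripennis"},
}

beetles = {"A.glabripennis", "T.castaneum"}
all_insects = {"A.glabripennis", "T.castaneum", "D.melanogaster", "A.mellifera"}

def classify(species_with_gene):
    """Label a group as widely shared, beetle-specific, or species-specific."""
    if species_with_gene == all_insects:
        return "shared across all sampled insects"
    if species_with_gene <= beetles:          # subset test: found only in beetles
        if len(species_with_gene) == 1:
            return "specific to one species"
        return "beetle-specific"
    return "patchily distributed"

for og, species in sorted(ortho_groups.items()):
    print(og, "->", classify(species))
```

Genes in the "beetle-specific" and "specific to one species" bins are the candidates for lineage-specific innovations such as the wood-feeding enzymes highlighted in the study.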
News Article | December 14, 2016
Synthetic organic chemistry consists of transforming existing molecules into new molecular structures or assemblies. These new molecular systems are then used in a myriad of ways in everyday life - in a wide range of sectors, such as public health, energy and environment, for use in drugs, solar cells, fragrances, and so on. The active element in the molecule that initiates these transformations, known as the catalyst, is often hydrogen. However, a research team at the University of Geneva (UNIGE), Switzerland, has found that a sulfur atom, if carefully inserted into a molecule, can not only become an extremely effective catalyst but can also operate with greater precision. This discovery, published in Angewandte Chemie, has the potential to revolutionize the world of synthetic organic chemistry. It paves the way for the creation of new molecules that can be used in our daily life. Creativity in fundamental research in chemistry consists of finding new ways to transform molecules and to build new molecular structures. To achieve this, the starting molecule needs to undergo a series of transformations until the molecular architecture of interest is achieved. However, a molecule does not just change by itself - it has to be pushed by another molecule, the so-called catalyst. In nature, enzymes play this catalytic role. In chemistry and biology, the active element in catalysts is often the smallest possible atom - hydrogen. "When we want to carry out a molecular transformation, we frequently use the hydrogen bond," explains Stefan Matile, Professor in the Department of Organic Chemistry in the Faculty of Science at UNIGE, and director of the research project. "More precisely, we place the molecule that we want to transform, known as the substrate, in contact with hydrogen. 
The catalyst then attracts negative charge from the substrate, to the point where the molecule is so poor in negative charge that it is forced to seek contact with another substrate and, in order to maintain itself, to transform." Hydrogen can be thought of as a vacuum cleaner that aspirates negative charges until the molecules are forced to come together and transform to compensate for the loss. Professor Matile's team is interested in using bonds other than hydrogen bonds for catalysis and other activities, bonds that most chemists consider rather esoteric and of little importance for molecular transformation. When looking more closely at the sulfur atom in certain molecules, however, the UNIGE research team realized that the atom has a very localized area that is extremely deficient in electrons, a sort of 'black hole'. The team wanted to know whether this hole could act as a 'vacuum cleaner', like hydrogen, if it were placed in contact with a substrate. If so, sulfur could be used as a catalyst, causing molecules to transform. This somewhat unorthodox bond, known as a chalcogen bond, would thus replace the conventional hydrogen bond. As Professor Matile further explains: "To test our hypothesis, we created and tested a series of molecular structures using chalcogen bonds of gradually increasing strength. We noticed that they not only work, but that they increase the speed of the transformation by more than a thousand times compared with the uncatalysed reaction. Additionally, we achieved a degree of precision that is impossible with hydrogen bonds." In fact, hydrogen's entire surface is 'electron poor'. Thus, when it plays the role of catalyst, the entire atom can come into contact with the substrate and suck up negative charges all over. With sulfur, however, only a small, well-defined area acts as the catalyst.
This will enable chemists to be more precise in bringing the catalyst and substrate into contact, and thereby to exercise increased control over the transformation. The discovery puts a new tool in the hands of chemists: it proves that different approaches can be used to carry out molecular transformations, and it opens up entirely new perspectives for synthetic chemistry. Professor Matile's group will now attempt to build molecules that are not accessible with conventional hydrogen bonds, opening the door to the creation of new materials.
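As a rough illustration (this is a back-of-envelope estimate, not a figure from the study), transition-state theory lets one translate the reported thousand-fold rate acceleration into the amount of activation-barrier lowering the chalcogen bond must provide:

```python
import math

# Transition-state theory relates a rate enhancement to a lowering of the
# activation barrier: k_cat / k_uncat = exp(ddG / (R * T)),
# so ddG = R * T * ln(k_cat / k_uncat).

R = 8.314                 # gas constant, J/(mol*K)
T = 298.15                # room temperature, K
rate_enhancement = 1000   # lower bound on the reported speed-up

ddG = R * T * math.log(rate_enhancement)  # barrier lowering, J/mol
print(f"{ddG / 1000:.1f} kJ/mol (~{ddG / 4184:.1f} kcal/mol)")
# -> roughly 17 kJ/mol, i.e. about 4 kcal/mol of stabilization
```

A seemingly modest barrier lowering of a few kcal/mol is thus enough to account for a three-orders-of-magnitude acceleration.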
News Article | November 17, 2016
DBV Technologies, BioNet-Asia and Geneva University Hospitals Complete Dosing in First Cohort of Phase I Study of Viaskin rPT for Booster Vaccination Against Pertussis

Following positive DSMB review, dosing with Viaskin rPT 50 mcg has been initiated

PARIS, BANGKOK and GENEVA, November 17, 2016 - DBV Technologies (Euronext: DBV - ISIN: FR0010417345 - Nasdaq Stock Market: DBVT), the Geneva University Hospitals (HUG) and BioNet-Asia Co. Ltd today announced that, in a planned interim assessment of the Phase I trial of Viaskin rPT for booster immunization against Bordetella pertussis, the independent Data and Safety Monitoring Board (DSMB) concluded that there were no safety concerns with the administration of Viaskin rPT 25 mcg in the first subject cohort. Based on this review, enrollment in the trial has continued as planned, with dosing of Viaskin rPT 50 mcg commencing in the second subject cohort. The Viaskin rPT pertussis booster vaccination program intends to test the ability of DBV's needleless and adjuvant-free patch technology, Viaskin, to epicutaneously deliver two different doses of BioNet's genetically detoxified, recombinant pertussis toxin for boosting immunity against whooping cough. In the first dosing cohort, subjects received two applications of either Viaskin rPT 25 mcg or placebo. Following the DSMB's positive recommendation, a second cohort of subjects will receive two applications of Viaskin rPT 50 mcg or placebo at a two-week interval. This Phase I proof-of-concept study is being conducted under the supervision of Professor Claire-Anne Siegrist from the Clinical Research Center of HUG and is sponsored by DBV Technologies.

About the Phase I Viaskin rPT Trial

This Phase I dose-escalation, randomized, double-blind, placebo-controlled safety and immunogenicity study is assessing the safety of BioNet's genetically detoxified recombinant pertussis toxin administered by DBV's Viaskin patches in 60 young healthy adults.
Secondary endpoints will assess the subjects' humoral responses elicited by Viaskin rPT 25 mcg and 50 mcg compared to placebo. Immune cellular responses will also be monitored as exploratory endpoints. The trial is being conducted in the Clinical Research Center of the Geneva University Hospitals. Men and women aged 18 to 40 years who were vaccinated against pertussis during childhood will be randomized into two cohorts of 30 subjects each. The Viaskin patches will be applied for 48 hours, with a two-week interval between applications. Four weeks after the second Viaskin application, participants will receive one dose of Boostrix® dTpa vaccine to ensure the recall of immunity against diphtheria, tetanus and the three pertussis antigens (only a single antigen will be delivered through Viaskin rPT). All subjects will be observed after each application, and local and systemic adverse events will be monitored. The first cohort has received two applications of Viaskin rPT 25 mcg or placebo. Following a positive DSMB review, dosing has commenced in the second cohort, which is expected to receive two applications of Viaskin rPT 50 mcg or placebo.

About Bordetella Pertussis

Pertussis, commonly known as whooping cough, is a highly contagious respiratory illness caused by a type of bacteria known as Bordetella pertussis. Pertussis vaccination is recommended as part of routine childhood immunization. Although the incidence of pertussis has declined as a result of immunization of infants and young children, vaccine-induced immunity does not persist for long. This phenomenon, known as waning immunity, has increased since the introduction in 1996 of acellular pertussis vaccines, which tend to provide short-lived protection against the Bordetella pertussis bacteria. According to the U.S.
Centers for Disease Control and Prevention (CDC), there are 16 million pertussis cases worldwide each year, mainly in adolescents and adults, who can often infect infants who have not yet completed their pertussis immunization. In these young patients, pertussis can be severe and fatal. Booster immunizations are now recommended for adolescents and adults, but compliance is not always high. A new vaccine technology that is patient-friendly, painless and non-invasive could help increase compliance for booster immunization against whooping cough.

About DBV Technologies

DBV Technologies is developing Viaskin®, a proprietary technology platform with broad potential applications in immunotherapy. Viaskin is based on epicutaneous immunotherapy, or EPIT®, DBV's method of delivering biologically active compounds to the immune system through intact skin. With this new class of self-administered and non-invasive product candidates, the company is dedicated to safely transforming the care of food-allergic patients, for whom there are no approved treatments. DBV's food allergy programs include ongoing clinical trials of Viaskin Peanut and Viaskin Milk, and preclinical development of Viaskin Egg. DBV is also pursuing a human proof-of-concept clinical study of Viaskin Milk for the treatment of eosinophilic esophagitis, and exploring potential applications of its platform in vaccines and other immune diseases. DBV Technologies has global headquarters in Montrouge, France and New York, NY. Company shares are traded on segment B of Euronext Paris (Ticker: DBV, ISIN code: FR0010417345), part of the SBF120 index, and traded on the Nasdaq Global Select Market in the form of American Depositary Shares (each representing one-half of one ordinary share) (Ticker: DBVT).
For more information on DBV Technologies, please visit our website: www.dbv-technologies.com

About Geneva University Hospitals

The Geneva University Hospitals (HUG), a reference academic institution at both the national and international levels, bring together eight public hospitals of Geneva. Their centres of excellence cover hepato-biliary and pancreatic diseases, cardiovascular diseases, oncology, musculoskeletal and sports medicine, old-age medicine, genetic medicine and vaccinology. Their Center of Vaccinology, led by Professor Claire-Anne Siegrist, gained international recognition for conducting a large first-in-human Phase I randomized clinical trial that enrolled 115 subjects to characterize the safety and immunogenicity of the VSV-ZEBOV Ebola vaccine candidate. With their 10,500 employees, the HUG welcome 60,000 hospitalized patients each year and handle 91,000 emergencies, 990,000 consultations or ambulatory-care visits and 26,000 surgical procedures. More than 800 physicians, 3,000 interns and 150 apprentices receive their training there. The HUG work closely with the Faculty of Medicine of the University of Geneva and the WHO on various training and research projects, and develop partnerships with the CHUV, EPFL, CERN and other actors of the Lemanic Health Valley. More information at: www.hug-ge.ch

About BioNet-Asia

BioNet-Asia offers access to vaccines and technology through biotech innovation and partnering networks. BioNet has built several international partnerships fostering vaccine self-reliance and leading to the supply of billions of doses of vaccines worldwide. BioNet also has a broad pipeline of vaccines at the R&D and clinical stages. BioNet's most advanced program is the development of a new generation of pertussis vaccines aimed at overcoming the waning immunity observed with conventional acellular pertussis vaccines.
BioNet's pertussis vaccines are produced from a new proprietary Bordetella pertussis strain expressing genetically detoxified pertussis toxin (PTgen(TM)). The unique properties of BioNet's PTgen enable the vaccines to induce a superior anti-PT immune response. For additional information, please visit www.bionet-asia.com

Forward-Looking Statements

This press release contains forward-looking statements, including statements about the potential safety and efficacy of Viaskin as a means of delivering recombinant pertussis toxin to boost immunity against Bordetella pertussis. These forward-looking statements are not promises or guarantees and involve substantial risks and uncertainties. The Company's product candidates have not been approved for sale in any jurisdiction. The factors that could cause actual results to differ materially from those described or projected herein include uncertainties associated generally with research and development, clinical trials and related regulatory reviews and approvals, the risk that historical preclinical results may not be predictive of future clinical trial results, and the risk that historical clinical trial results may not be predictive of future trial results. A further list and description of these risks and uncertainties can be found in the Company's regulatory filings with the French Autorité des Marchés Financiers and in the Company's Securities and Exchange Commission filings and reports, including the Company's Annual Report on Form 20-F for the year ended December 31, 2015 and future filings and reports by the Company. Existing and prospective investors are cautioned not to place undue reliance on these forward-looking statements, which speak only as of the date hereof. DBV Technologies undertakes no obligation to update or revise the information contained in this press release, whether as a result of new information, future events or circumstances or otherwise.
Geneva University Hospitals Contact Professor Claire-Anne Siegrist, Director of the Center of Vaccinology +41(22) 379 57 77 email@example.com
News Article | October 31, 2016
WUXI, China, Oct. 31, 2016 /PRNewswire/ -- The World IoT Expo is being held from October 30th to November 1st, 2016 in Wuxi. The expo is jointly hosted by the Chinese Ministry of Industry and Information Technology, the Chinese Ministry of Science and Technology and the Jiangsu Provincial People's Government. Jointly supported by the Chinese Academy of Sciences (CAS), the International Telecommunication Union (ITU), the Institute of Electrical and Electronics Engineers (IEEE), Global Standard 1 (GS1) and the Auto-ID Labs, it is the largest and most prominent national-level exposition of the Internet of Things sector in China. The China International IoT Expo has been held annually since 2010 and has become widely known over the past six years. This October, with the approval of the CPC Central Committee and the State Council, it was renamed the World Internet of Things Exposition. Compared with previous years, this year's event boasts a larger scale, higher-level participants and more advanced technologies. With numerous exhibits and upgraded technologies, the event marks a new beginning for the Internet of Things Exposition. Under the theme "Create IoT Era, Share Global Intelligence", the expo comprises various activities, including the IoT Wuxi Summit, the main exhibition of IoT applications and products, the China National College Innovation Competition of IoT Applications and the 4th Meeting of the Governing Group of the Wuxi National Sensor Network Innovation Demonstration Zone. The IoT Wuxi Summit invited guests from home and abroad, such as the president of the International Organization for Standardization (ISO), Zhang Xiaogang, and the deputy secretary-general of the ITU, Malcolm Johnson, to deliver keynote speeches.
The speakers also include Khalil Najafi, professor in the Electrical and Computer Engineering Division at the University of Michigan; Nadia Magnenat Thalmann, founder and head of the MIRALab Research Laboratory at the University of Geneva; Alain Crozier, head and chief executive of Microsoft Greater China Region; Wu Hequan, academician of the Chinese Academy of Engineering; Wang Jian, chief technology officer at Alibaba; Zhang Shunmao, president of Huawei Marketing and Solutions; and Liu Haitao, chairman of World Sensing Net Group (WSN Group). In addition to an IoT competition for college students and a job fair, the event also includes a press conference to release the construction plan for a world-class "Internet of Things town" to promote the development of Wuxi's IoT industry. The IoT Wuxi Summit has the following features: More than 3,000 guests from 23 countries and regions attended the exhibition, including 10 ministerial leaders, 24 academicians of the Chinese Academy of Sciences and the Chinese Academy of Engineering, directors of international associations, the inventor of the CMOS integrated circuit and the founder of the MEMS Research Center in Singapore. Participants also include senior managers from State-owned companies such as China Railway, State Grid, Sinopec, PetroChina and the Aviation Industry Corporation of China. In addition, professors from MIT, the University of Colorado, the University of Michigan, the University of Cincinnati, the University of Washington, the University of Geneva in Switzerland and Tsinghua University are in attendance. Managers in charge of technology at overseas companies including IBM, Siemens, Microsoft, Bosch, GE, Nokia, NTT, SK Telecom, ARM, Kaspersky, Honeywell and Tesla Motors, as well as at domestic ones, including China Mobile, China Telecom, China Unicom, Huawei, Lenovo, Inspur, Haier, Midea, Foxconn, Alibaba, Baidu, Tencent, JD.com, Qihoo 360 and Neusoft, will also take part in the event.
With its exhibition area expanded to 50,000 m2 from 32,000 m2, the expo accommodates 489 exhibitors who will demonstrate technology applications and practical cases to visitors with interactive displays. Companies participating in the Expo include IBM, Siemens, OMRON, ARM, Infineon Technologies, China Telecom, China Mobile, China Unicom, XCMG, China North Industries Group Corporation, Aisino Corporation, Huawei, ZTE, Alibaba, Tencent, JD.com, AsiaInfo, Hikvision and Lenovo.
News Article | October 31, 2016
WUXI, China, Oct. 31, 2016 /PRNewswire/ -- The World IoT Expo is held from October 30th to November 1st, 2016 in Wuxi. The expo is jointly hosted by the Chinese Ministry of Industry and Information Technology, the Chinese Ministry of Science and Technology and Jiangsu Provincial People's Government. Jointly supported by the Chinese Academy of Sciences (CAS), International Telecommunication Union (ITU), Institute of Electrical and Electronics Engineers (IEEE), Global Standard 1 (GS1) and Auto-ID Labs, it is the largest and highest scale national-level exposition of the Internet of Things sector in China. The China International IoT Expo has been held annually since 2010, and has become widely known over the past six years. This October, approved by the CPC Central Committee and State Council, it was renamed as the World Internet of Things Exposition. Compared with previous years, this year's event will boast a larger scale, higher-level participants and more advanced technologies. With numerous exhibits and upgraded technologies, the event fosters a new beginning for the Internet of Things Exposition. With the theme of "Create IoT Era, Share Global Intelligence", this expo consists of various activities including the IoT Wuxi Summit, the main exhibition of IoT applications and products, the China National College Innovation Competition of IoT Applications and the 4th Meeting of the Governing Group of Wuxi National Sensor Network Innovation Demonstration Zone. The IoT Wuxi Summit invited guests from home and abroad, such as the president of the International Organization for Standardization (ISO) Zhang Xiaogang and deputy secretary-general of ITU Malcolm Johnson, to deliver keynote speeches. 
The speakers also includes Khalil Najafi, professor of the Electrical and Computer Engineering Division at the University of Michigan; Nadia Magnenat Thalmann, founder and head of the MIRALab Research Laboratory at the University of Geneva; Alain Crozier, head and chief executive of Microsoft Greater China Region; Wu Hequan, academician of the Chinese Academy of Engineering; Wang Jian, chief technology officer at Alibaba; Zhang Shunmao, the president of Huawei Marketing and Solutions and Liu Haitao, chairman of World Sensing Net Group (WSN Group). In addition to an IoT competition for college students and a job fair, the event also includes a press conference for releasing the construction plan for a world-class "Internet of Things town" to promote the development of Wuxi's IoT industry. The IoT Wuxi Summit has the following features: More than 3,000 guests from 23 countries and regions attended the exhibition, including 10 ministerial leaders, and 24 academicians of the Chinese Academy of Sciences and Chinese Academy of Engineering, directors of international associations, the inventor of the COMS integrated circuit and the founder of MEMS Research Center in Singapore. Participants also include senior managers from State-owned companies such as China Railway, State Grid, Sinopec, PetroChina, Aviation Industry Corporation of China. In addition, professors from MIT, University of Colorado, University of Michigan, University of Cincinnati, University of Washington, University of Geneva of Switzerland, Tsinghua University are also in attendance. Managers in charge of technology at overseas companies including IBM, Siemens, Microsoft, Bosch, GE, Nokia, NTT, SK Telecom, ARM, Kaspersky, Honeywell and Tesla Motors, as well as domestic ones, including China Mobile, China Telecom, China Unicom, Huawei, Lenovo, Inspur, Haier, Midea, Foxconn, Alibaba, Baidu, Tencent, JD.com, Qihoo 360 and Neusoft will also take part in the event. 
With its exhibition area expanded to 50,000 m² from 32,000 m², the expo accommodates 489 exhibitors, who will demonstrate technology applications and practical cases to visitors with interactive displays. Companies participating in the expo include IBM, Siemens, OMRON, ARM, Infineon Technologies, China Telecom, China Mobile, China Unicom, XCMG, China North Industries Group Corporation, Aisino Corporation, Huawei, ZTE, Alibaba, Tencent, JD.com, AsiaInfo, Hikvision and Lenovo.
News Article | December 26, 2016
Emotional experiences can induce physiological and internal brain states that persist for long periods of time after the emotional events have ended, a team of New York University scientists has found. The study, which appears in the journal Nature Neuroscience, also shows that this emotional "hangover" influences how we attend to and remember future experiences. "How we remember events is not just a consequence of the external world we experience, but is also strongly influenced by our internal states--and these internal states can persist and color future experiences," explains Lila Davachi, an associate professor in NYU's Department of Psychology and Center for Neural Science and senior author of the study. "'Emotion' is a state of mind," Davachi continues. "These findings make clear that our cognition is highly influenced by preceding experiences and, specifically, that emotional brain states can persist for long periods of time." We have known for quite some time that emotional experiences are better remembered than non-emotional ones. In the Nature Neuroscience study, however, the researchers demonstrated that non-emotional experiences that followed emotional ones were also better remembered on a later memory test. To demonstrate this, subjects viewed a series of scene images that contained emotional content and elicited arousal. Approximately 10 to 30 minutes later, one group then also viewed a series of non-emotional, ordinary scene images; another group of subjects viewed the non-emotional scenes first, followed by the emotional ones. Both physiological arousal, measured by skin conductance, and brain activity, measured with fMRI, were monitored in both groups of subjects. Six hours later, the subjects were given a memory test on the images previously viewed.
The results showed that the subjects exposed to the emotion-evoking stimuli first had better long-term recall of the subsequently presented neutral images than the group exposed to the same neutral images first, before the emotional images. The fMRI results pointed to an explanation for this outcome. Specifically, these data showed that the brain states associated with emotional experiences carried over for 20 to 30 minutes and influenced the way the subjects processed and remembered future experiences that were not emotional. "We see that memory for non-emotional experiences is better if they are encountered after an emotional event," observes Davachi. The study's other authors were Arielle Tambini, an NYU doctoral student at the time of the study and now a postdoctoral fellow at the University of California, Berkeley; Ulrike Rimmele, an NYU postdoctoral fellow at the time of the study and now a postdoctoral researcher at the University of Geneva; and Elizabeth Phelps, a professor in NYU's Center for Neural Science and Department of Psychology. The work was supported by Dart Neuroscience, along with grants from the National Institute of Mental Health, part of the National Institutes of Health (MH074692, MH062104, MH092055), the Swiss National Science Foundation, the German Research Foundation (DFG RI 1894/2-1), and the European Community Seventh Framework Programme (FP7/2007-2013).
Tikkanen M.,University of Turku |
Tikkanen M.,University of Geneva |
Aro E.-M.,University of Turku
Biochimica et Biophysica Acta - Bioenergetics | Year: 2012
In higher plants, the photosystem (PS) II core and its several light-harvesting antenna (LHCII) proteins undergo reversible phosphorylation cycles according to the light intensity. High light intensity induces strong phosphorylation of the PSII core proteins and suppresses the phosphorylation level of the LHCII proteins. A decrease in light intensity, in turn, suppresses the phosphorylation of the PSII core, but strongly induces the phosphorylation of LHCII. Reversible and differential phosphorylation of the PSII-LHCII proteins is dependent on the interplay between the STN7 and STN8 kinases and the respective phosphatases. The STN7 kinase phosphorylates the LHCII proteins and, to a lesser extent, also the PSII core proteins D1, D2 and CP43. The STN8 kinase, on the contrary, is rather specific for the PSII core proteins. Mechanistically, PSII-LHCII protein phosphorylation is required for optimal mobility of the PSII-LHCII protein complexes along the thylakoid membrane. Physiologically, the phosphorylation of LHCII is a prerequisite for sufficient excitation of PSI, enabling the excitation and redox balance between PSII and PSI under low irradiance, when excitation energy transfer from the LHCII antenna to the two photosystems is efficient and thermal dissipation of excitation energy (NPQ) is minimised. The importance of PSII core protein phosphorylation is manifested under high light, when the photodamage of PSII is rapid and phosphorylation is required to facilitate the migration of damaged PSII from grana stacks to stroma lamellae for repair. The importance of thylakoid protein phosphorylation is highlighted under fluctuating light intensity, where STN7 kinase-dependent balancing of electron transfer is a prerequisite for optimal growth and development of the plant. This article is part of a Special Issue entitled: Photosystem II. © 2011 Elsevier B.V. All rights reserved.
Mardling R.A.,Monash University |
Mardling R.A.,University of Geneva
Monthly Notices of the Royal Astronomical Society | Year: 2013
Modern applications of celestial mechanics include the study of closely packed systems of exoplanets, circumbinary planetary systems, binary-binary interactions in star clusters and the dynamics of stars near the Galactic centre. While developments have historically been guided by the architecture of the Solar System, the need for more general formulations with as few restrictions on the parameters as possible is obvious. Here, we present clear and concise generalizations of two classic expansions of the three-body disturbing function, simplifying considerably their original form and making them accessible to the non-specialist. Governing the interaction between the inner and outer orbits of a hierarchical triple, the disturbing function in its general form is the conduit for energy and angular momentum exchange and as such, governs the secular and resonant evolution of the system and its stability characteristics. Focusing here on coplanar systems, the first expansion is one in the ratio of inner to outer semimajor axes and is valid for all eccentricities, while the second is an expansion in eccentricity and is valid for all semimajor axis ratios, except for systems in which the orbits cross (this restriction also applies to the first expansion). Our generalizations make both formulations valid for arbitrary mass ratios. The classic versions of these appropriate to the restricted three-body problem are known as Kaula's expansion and the literal expansion, respectively. We demonstrate the equivalence of the new expansions, identifying the role of the spherical harmonic order m in both and its physical significance in the three-body problem, and introducing the concept of principal resonances. Several examples of the accessibility of both expansions are given including resonance widths, and the secular rates of change of the elements. 
Results in their final form are gathered together at the end of the paper for the reader mainly interested in their application, including a guide for the choice of expansion. © 2013 The Authors Published by Oxford University Press on behalf of the Royal Astronomical Society.
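As a concrete illustration of the kind of series the abstract refers to (a standard textbook sketch of the lowest-order terms, not the paper's generalized expansions), the disturbing function of a hierarchical triple with inner binary masses $m_1, m_2$ and outer body $m_3$ can be written as a Legendre series in the ratio of the inner separation $r_{\mathrm{in}}$ to the outer separation $r_{\mathrm{out}}$:

```latex
% Classic Legendre expansion of the three-body disturbing function
% (textbook form; the paper generalizes beyond this sketch).
R \;=\; \frac{G m_3}{r_{\mathrm{out}}}
        \sum_{l \ge 2} M_l
        \left(\frac{r_{\mathrm{in}}}{r_{\mathrm{out}}}\right)^{l}
        P_l(\cos\Phi),
\qquad
M_l \;=\; \frac{m_1 m_2\!\left[m_1^{\,l-1} - (-m_2)^{\,l-1}\right]}{(m_1+m_2)^{l}},
% where \Phi is the angle between the inner and outer separation vectors.
% The leading (quadrupole, l = 2) term, with M_2 = m_1 m_2/(m_1+m_2), is
R_{\mathrm{quad}} \;=\; \frac{G\, m_1 m_2 m_3}{m_1+m_2}\,
        \frac{r_{\mathrm{in}}^{2}}{r_{\mathrm{out}}^{3}}\,
        P_2(\cos\Phi).
```

Expanding $r_{\mathrm{in}}$, $r_{\mathrm{out}}$ and $\Phi$ in the orbital elements of the two orbits then yields series of the Kaula or literal type discussed in the abstract, in powers of the semimajor-axis ratio or of the eccentricities respectively.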
Daumke O.,Max Delbruck Centrum fur Molekulare Medizin |
Daumke O.,Free University of Berlin |
Roux A.,University of Geneva |
Haucke V.,Leibniz Institute for Molecular Pharmacology |
Haucke V.,Charité - Medical University of Berlin
Cell | Year: 2014
Biological membranes undergo constant remodeling by membrane fission and fusion to change their shape and to exchange material between subcellular compartments. During clathrin-mediated endocytosis, the dynamic assembly and disassembly of protein scaffolds comprising members of the bin-amphiphysin-rvs (BAR) domain protein superfamily constrain the membrane into distinct shapes as the pathway progresses toward fission by the GTPase dynamin. In this Review, we discuss how BAR domain protein assembly and disassembly are controlled in space and time and which structural and biochemical features allow the tight regulation of their shape and function to enable dynamin-mediated membrane fission. © 2014 Elsevier Inc.
Kilpinen H.,University of Geneva |
Kilpinen H.,Swiss Institute of Bioinformatics |
Barrett J.C.,Wellcome Trust Sanger Institute
Trends in Genetics | Year: 2013
Progress in understanding the genetics of human disease is closely tied to technological developments in DNA sequencing. Recently, next-generation technology has transformed the scale of sequencing; compared to the methods used in the Human Genome Project, modern sequencers are 50,000-fold faster. Complex disease genetics presents an immediate opportunity to use this technology to move from approaches using only partial information (linkage and genome-wide association studies, GWAS) to complete analysis of the relationship between genomic variation and phenotype. We first describe sequence-based improvements to existing study designs, followed by prioritization of both samples and genomic regions to be sequenced, and then address the ultimate goal of analyzing thousands of whole-genome sequences. Finally, we discuss how the same technology will also fundamentally change the way we understand the biological mechanisms underlying disease associations discovered through sequencing. © 2012 Elsevier Ltd.
Stern D.B.,Boyce Thompson Institute for Plant Research |
Goldschmidt-Clermont M.,University of Geneva |
Hanson M.R.,Cornell University
Annual Review of Plant Biology | Year: 2010
The chloroplast genome encodes proteins required for photosynthesis, gene expression, and other essential organellar functions. Derived from a cyanobacterial ancestor, the chloroplast combines prokaryotic and eukaryotic features of gene expression and is regulated by many nucleus-encoded proteins. This review covers four major chloroplast posttranscriptional processes: RNA processing, editing, splicing, and turnover. RNA processing includes the generation of transcript 5′ and 3′ termini, as well as the cleavage of polycistronic transcripts. Editing converts specific C residues to U and often changes the amino acid that is specified by the edited codon. Chloroplasts feature introns of groups I and II, which undergo protein-facilitated cis- or trans-splicing in vivo. Each of these RNA-based processes involves proteins of the pentatricopeptide motif-containing family, which does not occur in prokaryotes. Plant-specific RNA-binding proteins may underpin the adaptation of the chloroplast to the eukaryotic context. Copyright © 2010 by Annual Reviews. All rights reserved.
Andersen J.E.,University of Aarhus |
Kashaev R.,University of Geneva
Communications in Mathematical Physics | Year: 2014
By using quantum Teichmüller theory, we construct a one parameter family of TQFTs on the categroid of admissible leveled shaped 3-manifolds. © 2014 Springer-Verlag Berlin Heidelberg.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: KBBE-2009-3-2-02 | Award Amount: 4.37M | Year: 2010
SUNBIOPATH - towards a better sunlight to biomass conversion efficiency in microalgae - is an integrated program of research aimed at improving biomass yields and the valorisation of biomass for two Chlorophycean photosynthetic microalgae, Chlamydomonas reinhardtii and Dunaliella salina. Biomass yields will be improved at the level of primary processes that occur in the chloroplast (photochemistry and sunlight capture by the light harvesting complexes) and in the cell (biochemical pathways and signalling mechanisms that influence ATP synthesis). Optimal growth of the engineered microalgae will be determined in photobioreactors, and biomass yields will be tested using a scale-up approach in photobioreactors of different sizes (up to 250 L), some of which will be designed and built during SUNBIOPATH. Biomethane production will be evaluated. Compared to other biofuels, biomethane is attractive because the yield of biomass-to-fuel conversion is higher. Valorisation of biomass will also be achieved through the production of biologicals. Significant progress has been made in the development of chloroplast genetic engineering in microalgae such as Chlamydomonas; however, the commercial exploitation of this technology still requires additional research. SUNBIOPATH will address the problem of maximising transgene expression in the chloroplast and will develop a robust system for chloroplast metabolic engineering by developing methodologies such as inducible expression and trans-operon expression. A techno-economic analysis will be made to evaluate the feasibility of using these algae for the purposes proposed (biologicals production in the chloroplast and/or biomethane production), taking into account their role in CO2 mitigation.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.1.3-2 | Award Amount: 6.19M | Year: 2013
Diabetes is caused by insufficient or absent insulin secretion by the specialized beta cells of the pancreas and, if not treated adequately, evolves into complications which impair patients' integrity and wellness. Treatment is based on lifetime drug administration for blood glucose control, or on parenteral infusion of insulin to better control glucose levels and glycosylation of hemoglobin. Artificial pancreases are in development but are still dependent on external energy sources and need permanent transcutaneous access to release the hormone. Pancreatic whole-organ transplantation is a major intervention requiring a selected recipient and a matched cadaveric donor, which keeps numbers down. Islet of Langerhans transplantation is a non-invasive method for the treatment of type 1 diabetes, but several questions remain and several issues have to be addressed in order to improve the method: islet engraftment is clearly suboptimal, as a result of pro-apoptotic and pro-inflammatory stimuli sustained during islet isolation and at the site of implantation; long-term islet graft function drops to 15% with time; and the current systemic immunosuppressive regimen has several drawbacks in terms of side effects. Solutions must be found to increase transplantation efficiency with a higher number of islets, possibly from animals; to induce tolerance toward the graft, avoiding systemic lifetime immunosuppression; and to lower the specific inflammatory reaction while enhancing graft micro-vasculogenesis to improve islet nesting. NEXT provides a 360° solution to the pitfalls of the current methodology for pancreatic islet transplantation: i) nanotechnologies, to engineer donor cell surfaces in order to derange recognition and suppress rejection; ii) advanced tissue engineering methods, to assemble biosynthetic islets enriched by a chimeric microvasculature; iii) an innovative double immunosuppressive strategy, with the graft bound to immunosuppressive nanopeptides and shielded by self-vasculature.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FETPROACT-01-2016 | Award Amount: 8.65M | Year: 2016
The goal of BrainCom is to develop a new generation of neuroprosthetic devices for large-scale, high-density recording and stimulation of the human cortex, suitable for exploring and repairing high-level cognitive functions. Since one of the most disabling neuropsychological conditions is arguably the impossibility of communicating with others, BrainCom primarily focuses on the restoration of speech and communication in aphasic patients suffering from upper spinal cord, brainstem or brain damage. To target broadly distributed neural systems such as the language network, BrainCom proposes to use novel electronic technologies based on nanomaterials to design ultra-flexible cortical and intracortical implants adapted to large-scale, high-density recording and stimulation. The main challenge of the project is to achieve flexible contact with broad cortical areas for stimulation and neural-activity decoding with unprecedented spatial and temporal resolution. Critically, the development of such novel neuroprosthetic devices will permit significant advances in the basic understanding of the dynamics and neural information processing of cortical speech networks, and in the development of speech rehabilitation solutions using innovative brain-computer interfaces. Beyond this application, BrainCom innovations will enable the study and repair of other high-level cognitive functions such as learning and memory, as well as other clinical applications such as epilepsy monitoring using closed-loop paradigms. BrainCom will be carried out by a consortium assembled to foster the emergence of a new community in Europe acting towards the development of neural speech prostheses. Thanks to its high interdisciplinarity, involving technology, engineering, biology, clinical sciences and ethics, BrainCom will contribute advances at all levels of the value chain: from technology and engineering to basic and language neuroscience, and from preclinical research in animals to clinical studies in humans.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.2.2.2-1 | Award Amount: 13.07M | Year: 2012
As the European population is ageing rapidly, the growing number of seniors with age-related chronic diseases poses a challenge to European societies and health care systems. Therapeutic interventions that are effective, affordable and well tolerated in the prevention of chronic disease are urgently needed and will have an outstanding impact on public health as a whole. Among the most promising interventions that meet these requirements are vitamin D, marine omega-3 fatty acids and physical exercise. However, their individual and combined effects have yet to be confirmed in a clinical trial. DO-HEALTH will close this knowledge gap in a large 3-year multi-centre clinical trial that will establish long-term efficacy and safety data for the three interventions in the prevention of age-related diseases in seniors. The DO-HEALTH trial will enrol 2152 community-dwelling men and women aged 70 and older, the age at which chronic diseases increase substantially. The randomized controlled trial will test the individual and combined benefits of 2000 IU vitamin D/day, 1 g of omega-3 fatty acids/day and a simple home exercise program in an efficient factorial trial design. DO-HEALTH will establish evidence on 5 primary endpoints: the risk of incident non-vertebral fractures; the risk of functional decline; the risk of blood pressure increase; the risk of cognitive decline; and the rate of any infection. Key secondary endpoints include the risk of hip fracture, rate of falls, pain in symptomatic knee osteoarthritis, glucose tolerance, gastro-intestinal symptoms, mental and oral health, quality of life, and mortality. Follow-up will be in person, at 3-monthly intervals (4 clinical visits and 9 phone calls). DO-HEALTH will further assess the comparative effectiveness of the interventions by evaluating the reasons why seniors do or do not adhere to them, and will assess their cost-benefit in a health economic model based on documented health care utilization and the observed incidence of chronic disease.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: KBBE-2008-1-1-01 | Award Amount: 4.17M | Year: 2009
Successful and efficient plant breeding depends on rapid recombination of advantageous traits to form new crop varieties. In recent years new breeding techniques have been introduced which rely on transgenic alteration of somatic cells and regeneration into plants with novel properties. The precision and effectiveness of both strategies rely upon homologous recombination (HR). The objective of this proposal is to provide plant breeders with new tools allowing better control over HR in both somatic and meiotic cells. The expected outcomes of the proposed research are efficient gene targeting (GT) technologies for precise engineering of plant genomes and control of the rates of meiotic recombination between homologous or homeologous chromosomes in classical breeding. The major components of the HR machinery are common to somatic and meiotic cells, enabling us to address both processes in a synergistic way. HR can be divided into different steps: initiation by formation of a DNA double-strand break (DSB); recognition and invasion of a homologous DNA sequence; and resolution of recombination structures. Each stage contains a bottleneck for both GT and meiotic HR that we will address. Work package 1 (WP1) aims at enhancing HR through targeted DSB induction. DSBs will be induced by zinc-finger nucleases that can be custom-designed for target sequences anywhere in the genome. In WP2, we will test the influence of HR factors affecting homologue invasion and heteroduplex formation, such as RAD51 and its paralogues, the RAD52 homologue, genes that affect cytosine methylation in DNA, and mismatch repair. In WP3 we will concentrate on proteins involved in resolution and crossing-over. WP4 will test combinations of the approaches found in the first three WPs to build optimal strategies for application. Most experiments will be performed in the model plant Arabidopsis and implemented in crops such as tomato and maize to guarantee quick applicability for breeding.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.8.2 | Award Amount: 6.51M | Year: 2010
Quantum entanglement has the capacity to enable disruptive technologies that solve outstanding issues in: - Trust, privacy protection, and security in two- and multi-party transactions; - Novel or enhanced modes of operation of ICT devices; - Reference standards, sensing, and metrology. The development of entanglement-based strategies addresses these challenges and provides the foundations for quantum technologies of the 21st century. The practical exploitation of entanglement requires groundbreaking levels of robustness and flexibility for deployment in real-world environments. This ambitious goal can be reached only through radically new designs of protocols, architectures, interfaces, and components. Q-ESSENCE will achieve this by a concerted application-driven effort covering relevant experimental, phenomenological, and fundamental aspects. Our consortium will target three main outcomes: 1) Development of entanglement-enabled and entanglement-enhanced ICT devices: atomic clocks, quantum sensors, and quantum random-number generators; 2) Novel physical-layer architectures for long-distance quantum communication that surpass current distance limitations through the deployment of next-generation components; 3) Distributed quantum information protocols that provide disruptive solutions to multiuser trust, privacy-protection, and security scenarios based on multipartite entanglement. These outcomes will be reached through the underpinning science and enabling technologies of: light-matter interfaces providing faithful interconversion between different physical realizations of qubits; entanglement engineering at new scales and distances; robust architectures protecting quantum information from decoherence; quantum information concepts that solve problems of limited trust and privacy intrusion. The project builds on the outstanding expertise of the consortium demonstrated by pioneering works over the past decades, enhanced by a strong industrial perspective.
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2015-ETN | Award Amount: 3.87M | Year: 2016
STREAM is a 4-year multi-site training network aimed at the career development of Early Stage Researchers (ESRs) in the scientific design, construction and manufacturing of advanced radiation instrumentation. STREAM targets the development of innovative radiation-hard, smart CMOS sensor technologies for scientific and industrial applications. The platform technology developed within the project will be tested in the demanding conditions posed by the CERN LHC detector environment, as well as by European industry leaders in the fields of CMOS imaging, electron microscopy and radiation sensors. This leveraging factor will allow the technology to be fine-tuned to meet the requirements of industrial application cases on demand, such as electron microscopy and medical X-ray imaging, and will open a pathway towards novel application fields such as satellite environments, industrial X-ray systems and near-infrared imaging. The project will train a new generation of creative, entrepreneurial and innovative early-stage researchers and widen their academic career and employment opportunities. The STREAM consortium is composed of 10 research organisations and 5 industrial partners; the network will provide training to 17 ESRs. STREAM structures its research and training in four scientific work packages which span the whole value chain from research to application: CMOS Technologies Assessment, Smart Sensor Design and Layout, Validation and Qualification, and Technology Integration and Valorization.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: ISSI-1-2015 | Award Amount: 3.94M | Year: 2016
Our project, Doing-It-Together Science (DITOs), represents a step change in European public engagement with science and innovation. We propose moving from a model in which scientific research, innovation and problem-solving are mainly driven by scientific and professional institutions to one based on active public participation and capacity building, with various levels and strategies of engagement in the scientific process. At the core of our ethos is a recognition of people's existing expertise and of the different ways people want to, and do, engage in science and technology. The project aims to elevate public engagement with science across Europe from passive engagement with the process of developing science to an active one. Citizen Science and Do-It-Yourself (DIY) scientific efforts demonstrate that this is possible, and our aim is to ensure that the European Research Area becomes a leader in the deep public engagement afforded by these advances. As a Coordination and Support Action, this project will support and build upon DIY, grassroots, and frugal innovation initiatives so that in the short and medium term we sustain localised capacity building, and in the long term the effects of these grassroots efforts reach policy makers at different levels, from external advice to societal inputs, regarding appropriate research and innovation policies. The proposal includes the participation of policy bodies (European Citizen Science Association, DE), SMEs (Tekiu, UK; Eutema, AT), universities (University College London, UK; Université Paris Descartes, FR; University of Geneva, CH), science galleries and public spaces (Royal Belgian Institute of Natural Sciences, BE; Medialab-Prado, E; Kersnikova Institution, SL) and NGOs (Meritum Association, PL; Waag Society, NL).
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: FETOPEN-3-2015 | Award Amount: 886.50K | Year: 2016
Over the last decade substantial improvements have been made in the technologies for low light-level detection, and the coming years are expected to bring a technology revolution in this emerging FET-relevant topic. Various European research groups have played, and will continue to play, an important role in advancing this technology, since low light-level detection is very important for many research projects and upcoming research infrastructures. Indeed, R&D in this technology is growing and new groups all over Europe and worldwide are getting involved. However, coordination of these groups, especially in Europe, is currently missing; while competition often speeds up development, especially in prototyping together with industry, duplication in testing and evaluating available devices for research projects is unfortunately standard. The project's objectives are to (1) conduct the development of a European R&D roadmap towards the ultimate low light-level (LLL) sensors, and monitor and evaluate the progress of developments with respect to the roadmap; (2) coordinate the R&D efforts of research groups and industry in advancing LLL sensors, and liaise with strategically important European initiatives and with research groups and companies worldwide; (3) transfer knowledge by initiating information and training events and material; and (4) disseminate information through suitable outreach activities.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.5.3 | Award Amount: 12.29M | Year: 2008
Nearly four million osteoporotic bone fractures cost the European health system more than 30 billion Euro per year, a figure that could double by 2050. After the first fracture, the chances of having another one increase by 86%. We need to prevent osteoporotic fractures. The first step is an accurate prediction of the patient-specific risk of fracture that considers not only the skeletal determinants but also the neuromuscular condition. The aim of VPHOP is to develop a multiscale modelling technology, based on conventional diagnostic imaging methods, that makes it possible, in a clinical setting, to predict for each patient the strength of his/her bones, how this strength is likely to change over time, and the probability that he/she will overload his/her bones during daily life. With these three predictions, the evaluation of the absolute risk of bone fracture will be much more accurate than any prediction based on external and indirect determinants, as is current clinical practice. These predictions will be used to: i) improve the diagnostic accuracy of the current clinical standards; ii) provide the basis for an evidence-based prognosis with respect to the natural evolution of the disease, to pharmacological treatments, and/or to preventive interventional treatments aimed at selectively strengthening particularly weak regions of the skeleton. For patients at high risk of fracture for whom pharmacological treatment appears insufficient, the VPHOP system will also assist the interventional radiologist in planning the augmentation procedure. The various modelling technologies developed during the project will be validated not only in vitro, on animal models, and against retrospective clinical outcomes, but will also be assessed in terms of clinical impact and safety on small cohorts of patients enrolled at four different clinical institutions, providing the factual basis for effective clinical and industrial exploitation.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.2.1 | Award Amount: 4.77M | Year: 2008
Significant advances in human-computer interaction will require systems which can exhibit truly cognitive behaviour. This is particularly so in spoken dialogue systems (SDS) where, despite wide deployment and significant investment, current systems are still limited in capability and fragile to changes in environment or application.

Recent advances in statistical modelling and machine learning offer the potential for making a significant step forward in SDS. By both exploiting and extending these advances, the CLASSiC project will improve generalization to unexpected situations. By modelling the whole end-to-end system as an integrated statistical process, the CLASSiC project will demonstrate a qualitative leap in the adaptivity, flexibility, robustness, and naturalness of SDS.

The CLASSiC partners will develop a modular processing framework with an explicit representation of uncertainty which connects the various sources of uncertainty (understanding errors, ambiguity, etc.) to the constraints to be exploited (task, dialogue, and user contexts). This architecture will support a layered hierarchy of supervised learning and reinforcement learning in order to facilitate mathematically principled optimisation and adaptation techniques. The architecture will be developed in close cooperation with our industrial partner in order to ensure that it provides a practical deployment platform as well as a flexible research test-bed.

The resulting CLASSiC SDS will be able to adapt autonomously both to the needs of different users and to changing operating environments, and to learn through experience. The data-driven methodology will also enable faster and lower-cost system implementation through automatic optimisation. Overall, the project will demonstrate not only a step-change in the capability of practical spoken dialogue systems, it will also mark a significant step forward in the longer term goal of endowing autonomous systems with truly human-like capabilities.
Agency: European Commission | Branch: FP7 | Program: CSA-SA | Phase: INCO.2011-6.1 | Award Amount: 560.70K | Year: 2011
The central goal of EcoArm2ERA is to reinforce international research cooperation between CENS, Armenia's leading institute in environmental and ecological studies, and the ERA in critically important areas, specifically the FP7 Themes Environment, KBBE and Space (e.g. GIS technologies for the environment). This goal will be attained through pursuit of the following specific objectives: Objective 1: To define and promote a sustainable development strategy for the Armenian CENS focusing on the overall improvement of the institution's capacities, visibility, and competitiveness. Objective 2: To develop a strategic partnership between CENS and (i) the School of Geography, Planning and Environmental Policy, Earth Sciences Institute, University College Dublin (NUID UCD), and (ii) the Institute of Environmental Sciences, University of Geneva (UNIGE), and to build CENS's capacity to acquire and carry out international collaborative research partnerships. Objective 3: To build the competencies needed by Armenian researchers and staff members in order to participate in the FP7/FP8 programme. EcoArm2ERA brings together top-level partners from both the ERA and Armenia. CENS was created with the mission to integrate Armenian research capabilities in the field of environment and sustainable development, and will be the main beneficiary of the project. CENS will be directly supported by its European twinning partners, the NUID UCD Earth Sciences Institute of University College Dublin and the Institute of Environmental Sciences, University of Geneva, who will open up their scientific and networking capabilities and serve as a strategic link to the European RTD community. Last but not least, the consortium is completed by GIRAF, a specialist training provider and project management expert with expertise in institutional development and organisation.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: GERI-4-2014 | Award Amount: 3.39M | Year: 2015
With the main goal of addressing gender equality in Research & Innovation, the GENERA (Gender Equality Network in the European Research Area) consortium has been formed to apply a bottom-up approach to enhancing gender equality in the field of physics research, as a benchmark for other sciences. Physics is a research field with a low representation of female researchers and a masculine image, so this field, as represented by its different actors, will be the basis for GENERA's analysis and interventions. GENERA comprises a starting set of organisations active in the field of physics which are committed to the implementation of the project and to the achievement of its milestones. The consortium will be extended to involve other interested major physics research organisations in European countries as associate partners. The GENERA consortium requests funding to support research organisations in implementing gender equality plans and proposes the following coordination and support actions, with a focus on physics research and a keen eye on cultural differences throughout Europe: 1. Assess the status of gender issues in the partner organisations. 2. Identify gaps in existing Gender Equality Plans (GEPs) and determine specific needs or actions to enhance gender equality and women's careers in physics. 3. Monitor and evaluate the existing activities of the involved organisations (partners and associates). 4. Formulate customized GEPs for all implementing organisations and create a roadmap for their implementation in physics, with the potential of application in other research fields. 5. Support involved organisations in implementing customized GEPs. 6. Create a network of RPOs, HEIs and RFOs to promote gender equality in physics. 7. Set up a long-term monitoring system allowing RPOs and RFOs to monitor the impact of their GEPs in physics, with the potential of application in other research fields.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.4.2 | Award Amount: 4.93M | Year: 2011
The project goal is to design and build mobile applications that approach human performance in conversational interaction, specifically in terms of the interactional skills needed to do so. These skills include recognising and generating conversational speech incrementally in real time, adapting to new concepts without manual intervention, and personalising interaction. All of these skills will be learned or adapted using real data, and will be used to build systems for interactive hyper-local search in three languages (English, Spanish and Mandarin) and in two domains: property search and tourist information. Current search engines work well only if the user has a single search goal and does not have multiple trade-offs to explore. For example, standard search works well if you want to know the phone number of a specific business, but poorly if you are looking for a house with several different search criteria of varying importance, e.g. number of bedrooms versus bathrooms versus price. The latter requires the user to collaborate conversationally over several turns.
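The multi-criteria trade-off described above can be made concrete with a small sketch. Everything here is a hypothetical illustration (the listings, weights, and scoring function are invented, not part of the project): a weighted sum over normalised criteria is one simple way to rank results when a user values bedrooms, bathrooms and price differently.

```python
# Hypothetical property listings; names and values are invented.
listings = [
    {"name": "A", "bedrooms": 3, "bathrooms": 1, "price": 250_000},
    {"name": "B", "bedrooms": 2, "bathrooms": 2, "price": 200_000},
    {"name": "C", "bedrooms": 4, "bathrooms": 2, "price": 400_000},
]

def score(listing, weights, max_price):
    """Weighted sum: more rooms is better, lower price is better."""
    return (weights["bedrooms"] * listing["bedrooms"]
            + weights["bathrooms"] * listing["bathrooms"]
            + weights["price"] * (1 - listing["price"] / max_price))

# A user who cares most about bedrooms, then price, then bathrooms.
weights = {"bedrooms": 0.5, "bathrooms": 0.2, "price": 0.3}
max_price = max(l["price"] for l in listings)
ranked = sorted(listings, key=lambda l: score(l, weights, max_price), reverse=True)
```

A conversational system would elicit and revise such weights over several turns ("cheaper, even with fewer bathrooms?") rather than requiring them up front, which is the collaboration the abstract points to.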
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 2.54M | Year: 2012
One of the most interesting questions in prehistory is that of the origins of European peoples following the introduction of settled farming life to the continent some 8,000 years ago. The interdisciplinary BEAN network will address this by combining teaching and research in three different fields: i) anthropology and genetics, ii) computer simulation and modelling, and iii) prehistoric archaeology. In particular, it will provide state-of-the-art training in palaeo-genomics, mathematical modelling of prehistoric culture change, and statistical demographic inference methods. The network includes one industrial and seven academic participants and five associates, and will incorporate an integrated educational system that combines intensive specialized training in each subject with rotation of early stage and advanced researchers. This research has wider impact, for example within the multibillion-euro cultural heritage industry, and BEAN will include internships at three private sector partners: a biotech company, a tourist company and a publisher, as well as the German national statistics office. Thus, participants will engage with cutting-edge scientific methods, will combine diverse disciplines and schools of thought, and will be guaranteed contact with private companies and state organisations for further career development. Additionally, there will be a special focus on translational skills, particularly media relations and scientific writing, both of which will feed into the final BEAN book, which will be co-authored by ESRs, ERs and PIs, and published by Springer.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.2.4.2-1 | Award Amount: 6.73M | Year: 2013
The management of cardiovascular stability in shock patients is of paramount importance in critical care. New, cutting-edge knowledge is necessary to overcome the shortcomings of available therapies, in order to cope with the challenges that anesthesiologists, intensive care specialists and emergency physicians face when dealing with shock patients. Current therapies are targeted at reducing the symptoms of shock and multiple organ failure, but they are unable to target the root cause or to act at the beginning of the cascade, because of the lack of a model explaining the molecular basis of shock-induced tissue injury and the ensuing multiple organ failure. Hence, the effectiveness of anti-hypotensive interventions such as fluid resuscitation is limited. Fluid may restore blood pressure within minutes, but complications such as pulmonary edema may arise on a longer time scale in patients who are unresponsive to fluid infusion, and cause more severe stress to the patient's hemodynamic stability, respiratory efficiency, and immune system. This proposal seeks to shed light on the molecular trigger of acute heart failure in association with shock, in the presence of uncontrolled proteolytic activity, in order to identify, through a systematic analysis of the expression levels of genes and their protein products, inflammatory mediators and markers which are activated in shock, as well as novel targets for the delivery of new therapies needed to overcome the limitations of current ones.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2011-ITN | Award Amount: 3.34M | Year: 2012
SPHINGONET was inspired by a common vision of its partners on the training of researchers in the field of sphingolipid homeostasis and body-wide sphingolipid signaling networks using innovative technological approaches. Dysregulation of sphingolipid balances contributes to a broad range of pathological processes, spanning neurodegeneration, asthma, autoimmune disease, insulin resistance, obesity and cancer progression. Key to defining proper sites for therapeutic intervention is a comprehensive understanding of the mechanisms of sphingolipid homeostasis and how sphingolipid-mediated signaling pathways are interconnected. In spite of its many clinical implications, progress in this field is curbed by a lack of appropriate tools to monitor, quantify and manipulate sphingolipid pools in live cells. SPHINGONET's training program is designed to close these gaps in knowledge and technology by transferring the complementary expertise of its partners to a future generation of scientists who will take a leading role in decoding the full regulatory potential of the sphingolipid signaling network and maximizing its therapeutic use. By merging seven academic partners working at the forefront of sphingolipid, chemical and systems biology with three (pro)drug discovery-oriented SMEs, SPHINGONET will create a challenging, interdisciplinary and clinically relevant research environment with ample opportunities for structuring industrial projects, commercial exploitation of results, entrepreneurship, and complementary education adapted to personal needs. Thus, SPHINGONET will provide its trainees with a rounded education that, besides enhancing their career perspectives, will enable them to choose a career path in Europe's academia or industry, and to be successful at it while retaining ties between both these bodies.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 1.47M | Year: 2010
PETAF aims to serve as a European research and training platform for joint philosophical research on perspectivalism in thought and language and its consequences for our conception of objective, mind-independent reality. PETAF comprises seven full network partners and five associate partners, four of which are from industry. It will provide seven Early Stage Researchers (ESRs) with the skills necessary for meeting the demands of top research in this area of philosophy, the opportunity to apply and enhance these skills, and the chance to acquire further, complementary skills in professional contexts outside academia. PETAF's research programme addresses both foundational issues in metaphysics and in logic and semantics, and local issues in more specialised areas in which perspective-bound cognition plays a pivotal role, namely the philosophy of space and time, the philosophy of alethic and epistemic modality, the philosophy of subjectivity and consciousness, and the philosophy of norms and value. PETAF's training programme, which is conducted in close interaction with the projected research, consists of seven well-articulated training modules, of which six are devoted to research training in the aforementioned fields and one is devoted to complementary skills training. Secondments at PETAF's industrial partners are core elements of the training programme, as is reflected in the time that ESRs are expected to spend with industry. PETAF thus seeks to significantly increase the career opportunities and job prospects of its recruited ESRs, both inside and outside academia.
Agency: European Commission | Branch: FP7 | Program: CSA-SA | Phase: SiS-2009-184.108.40.206 | Award Amount: 3.31M | Year: 2010
This project aims to effect a change across Europe in the teaching and learning of mathematics and science, with teachers supported to develop inquiry-based learning (IBL) pedagogies so that students gain experience of IBL approaches. Ultimately, our objective is a greater number of students with more positive dispositions towards further study of these subjects and the desire to be employed in related fields. The proposal brings together 13 teams of experts in IBL in mathematics and science education from 12 nations and will be led and managed by a researcher who has recent successful experience of European work of this type. The nine work packages will be led by appropriate experts from the wider team, who will ensure the successful completion of each stage of the project. Overall, our design of the project has been focused throughout on providing a multi-level dissemination plan addressed to teachers and important stakeholders to ensure maximum impact. This plan includes the provision of high-quality support for, and training of, teachers and teacher trainers; selection of high-quality materials and methods with which to work with teachers; supporting actions addressed to teachers to promote IBL; methods of working with out-of-school parties such as local authorities and parents; and summaries of analyses that will inform a wide range of policy makers about how they can support the required changes. Throughout the project's timeline, national consultancy panels and two international panels will provide ongoing advice and orientation at key stages. To maximise the project's reach to teachers, either established networks for professional development of teachers will be expanded, or new networks will be built using models of proven efficacy. Rigorous evaluation, both by an internal team and by an outside agency, will provide formative and summative feedback about the validity of the project and its effectiveness.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: HEALTH.2011.3.4-2 | Award Amount: 2.35M | Year: 2011
SDH-Net's aim is to build, strengthen and link research capacities for health and its social determinants (SDH) in African and Latin American low- and middle-income countries (LMIC), in close collaboration with European partners. The focus on SDH will allow for an in-depth and broad capacity-building approach, including managerial and technical excellence, ethical issues, and research strategies. Lessons learnt will be checked against best practices and success factors in other Latin American, African and global settings, yielding insights into how to build SDH-related research capacity with strong relevance to the respective context. A sound mapping exercise of (i) social determinants of health (SDH) and research activity in the field; (ii) national and global stakeholders in the research environment; and (iii) existing research capacities in the participating countries will be carried out, building the basis for developing and piloting innovative research capacity-building tools with a particular focus on research management, ethics and methodology relevant to comprehensively addressing social determinants of health. Finally, links between research and policy will be forged and lessons will be drawn to support the development of sustainable and attractive research structures and expertise. SDH-Net will be carried out by a strong consortium, based on clusters of existing networks of best-in-kind public health institutions from Mexico, Colombia, Brazil, South Africa, Tanzania, and Kenya. The team is complemented by three distinguished European institutions: the London School of Hygiene and Tropical Medicine, COHRED, and the University of Geneva. SDH-Net is coordinated by GIZ, which has long-term experience in health research and capacity building in LMIC, and IESE Business School, excellent in management capacity building.
SDH-Net will have an important impact by developing a concept for research capacity building at the individual, institutional and system levels, contributing to research system strengthening and to the creation of research landscapes that enable and stimulate locally relevant, interdisciplinary research. It will lead to enhanced capacities for conducting and managing research on SDH, and links between research, policy and practice will be forged by developing tools and mechanisms facilitating sustained collaboration. Furthermore, SDH-Net will lay the foundations and provide tools for further research capacity building and research system strengthening in the future.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.5.1 | Award Amount: 3.90M | Year: 2010
The PSYCHE project will develop a personal, cost-effective, multi-parametric monitoring system based on a textile and portable sensing platform for the long- and short-term acquisition of data. The patient diagnosed with bipolar disorder will be placed at the epicentre of its management, for treatment and prevention of depressive and manic episodes. The system will use wearable and portable devices for acquiring, monitoring and communicating physiological parameters and behavioural and mood-correlated indexes (i.e. vital body signs, biochemical markers and voice analysis). The acquired data will be processed and analyzed in the established platform, which takes into consideration the Electronic Health Records (EHR) of the patient, the parameters established in the first stage between bipolar and non-bipolar individuals, as well as medical analysis, in order to verify the diagnosis and help in prognosis of the illness. Finally, communication and feedback to the patient will be performed through direct contact between the patient and the device, or by communication between physician and patient. Constant feedback and monitoring will be used to manage the illness, to give patients support, to facilitate interaction between patient and physician, and to alert professionals in case of a patient's relapse or the onset of depressive or manic episodes. The PSYCHE project will focus on the following objectives:
i) Implementation of a sensing platform for physiological and behavioural monitoring of patients with bipolar disorders
ii) Development of novel portable devices for the monitoring of biochemical markers, voice analysis and a behavioural index correlated to mental illness
iii) Brain functionality, in order to correlate central measures with clinical assessment and the parameters measured by the PSYCHE platform
iv) Data mining and managing, where the ultimate goal is to identify signal trends indicating detection and prediction of critical events
v) A closed loop between patient and professional within the system
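Objective iv), identifying signal trends that flag critical events, can be sketched with a simple detector. The example below is hypothetical and not the PSYCHE algorithm: an exponentially weighted moving average (EWMA) baseline with a fixed deviation threshold, applied to invented heart-rate-like values.

```python
def detect_events(samples, alpha=0.3, threshold=2.0):
    """Return indices where a sample deviates from the EWMA baseline by > threshold.

    alpha and threshold are illustrative assumptions, not clinically tuned values.
    """
    events = []
    ewma = samples[0]                      # initialise baseline at first sample
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - ewma) > threshold:
            events.append(i)               # candidate critical event
        ewma = alpha * x + (1 - alpha) * ewma   # update smoothed baseline
    return events

# Stable baseline around 70 with one abrupt jump at index 5.
signal = [70.0, 70.4, 69.8, 70.1, 70.3, 76.0, 70.2]
events = detect_events(signal)             # flags the jump at index 5
```

A deployed system would combine many such per-channel trend indicators with patient-specific baselines and clinical review rather than a single fixed threshold.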
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.3.8 | Award Amount: 3.63M | Year: 2010
The detection of chemical or biological substances increasingly appears as an essential concern for preventing problems related to human or animal health and security. Present analytical techniques are expensive and often require highly specialized staff and infrastructures. The principal need is to perform screening tests which can be carried out in non-specialized settings, e.g. point of care, schools and the field, before unambiguous identification in a specialized laboratory. There is thus a need to develop a new detection system that is low-cost and portable but at the same time offers high sensitivity, selectivity and multi-analyte detection from a sample containing various components (e.g. blood, serum, saliva, etc.).

The objective of P3SENS is to design, fabricate and validate a multichannel (50 or more) polymer photonic-crystal-based label-free disposable biosensor allowing for a positive/negative detection scheme for ultra-small concentrations of analytes in solution (< 1 ng/mL). The biosensor will be encapsulated in a specifically designed microfluidic system in order to deliver the sample to the multiple sensing zones. The design of the biochip will allow it to be easily inserted into a compact measurement platform, usable by non-specialized practitioners outside of specialized laboratories for carrying out simultaneous multi-analyte detection, delivering real-time monitoring, with an assay duration that will not exceed a few tens of minutes.

The photonic chip proposed in this project will be based on polymer Photonic Crystal (PhC) micro-cavities coupled into a planar waveguide optical distribution circuit. The photonic chip will be fabricated with available fabrication technologies, with an emphasis on low-cost substrates (polymer) and fabrication processes (nano-imprint lithography). More generally, P3SENS will push forward the development of low-cost disposable biochips based on photonics.
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: DRS-09-2014 | Award Amount: 3.03M | Year: 2015
Significant challenges exist in strengthening the Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR) communities for coherent, mutually reinforcing and pragmatic planning and action. PLACARD seeks to support the coordination of these two communities. PLACARD will tackle current challenges by 1) providing a common space where CCA and DRR communities can come together, share experiences and create opportunities for collaboration; 2) facilitating communication and knowledge exchange between both communities; and 3) supporting the coordination and coherence of CCA and DRR research, policy and practice. PLACARD's approach to achieving these goals is to establish a strong and operational network of networks by connecting to existing networks and boundary organisations, to foster dialogue among stakeholders (e.g. researchers, research funders, policymakers, practitioners) engaged in CCA and DRR at the international, European, national and sub-national scales. This overarching network will enable these communities to share knowledge, to discuss challenges and to jointly co-produce options to bridge the gaps they experience. It will support the development and implementation of a research and innovation agenda to make better use of research funding, as well as the development of guidelines to strengthen relevant institutions in their efforts to mainstream CCA and DRR. PLACARD will evolve iteratively, learning from the different processes and experiences with the stakeholders, and remaining flexible and responsive to changing needs. PLACARD will be supported by an online platform that builds upon and links existing CCA and DRR platforms to streamline the dissemination and communication of CCA and DRR activities. The PLACARD consortium is built around the leadership of a number of key European institutions experienced in CCA and DRR policy and practice, and UN organizations leading and engaged in post-2015 agendas.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENV.2007.1.3.2.1. | Award Amount: 1.80M | Year: 2008
Vulnerability has long been a key concept in the disaster literature. Nevertheless, the majority of studies and grants have been allocated to hazard-related research, neglecting the influence of the vulnerability of exposed systems on the death toll and losses in natural or man-made disasters. There is also a need to better identify and measure the ability of threatened and affected communities and territorial systems to respond. This is the starting point of the ENSURE project. The overall objective of ENSURE is to structure vulnerability assessment model(s) in a way that different aspects of physical, systemic, social and economic vulnerability are integrated as much as possible in a coherent framework. The ENSURE approach starts from the recognition that, for all considered hazards, most damage and most vulnerability arise from the territory, including artefacts, infrastructures and facilities. These may well represent its material skeleton: physical vulnerability is therefore entirely contained at the territorial level. Other vulnerabilities, such as systemic, economic and social vulnerability, interact with the territory but cannot be entirely determined at the territorial level. The project will start by assessing the state of the art in different fields related to various vulnerability aspects as they have been tackled to date in Europe and internationally. The core of the project consists of integrated models that combine already existing models for assessing vulnerability with new ones developed for those aspects that have been neglected until now. The research objective is therefore to achieve progress with respect to each individual sector of vulnerability and to enhance the capability of assessing interconnections among them in a dynamic way, identifying the driving forces of vulnerability that make communities change for the better or for the worse as far as their ability to cope with extreme events is concerned.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.1.4-3 | Award Amount: 6.51M | Year: 2012
The SME-based SADEL consortium intends to develop the first generation of oral bio-therapeutics tackling disease targets in the digestive tract, by making optimal use of the Nanofitin (Nf) protein scaffold. Nf-based drugs will progress through routes not travelled by antibodies while interacting with targets not modulated by chemical compounds. Existing Nf hits against validated targets will progress to the preparation of Phase I clinical trials in Ulcerative Colitis (UC). To achieve this, SADEL assembles a virtual biopharmaceutical company with SMEs (70%), academics, clinicians and pharma industry covering all the cutting-edge skills required: production (including GMP), analytics, formulation, preclinical and clinical development, up to licensing. Nfs are small (optimal tissue penetration), exhibit strong resistance to pH and human intestinal fluids (long half-life in the digestive tract), and their high affinity implies a low effective concentration. They also demonstrate strong potential for optimizing pharmacological properties, including reducing immunogenicity. The Nf-based drugs will be administered orally, reducing systemic exposure and avoiding the safety issues reported with systemic administration of antibodies. This requires large quantities of proteins for frequent administration. Nfs are produced in bacterial systems for which GMP-compliant processes are broadly adopted in the industry, with a low cost of goods. The resulting proteins will be formulated for optimal release at the sites of action. The project is designed to address unmet technical challenges while avoiding external risks beyond those related to the scaffold behaviour itself. All additional elements are chosen for documented validation, from targets to evaluation protocols. Achieving SADEL's aims will solve unmet patient needs by providing affordable, safe, efficient products in a format that raises comfort and compliance to treatment. It will also assess the therapeutic potential of the Nanofitin platform.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.5.3 | Award Amount: 3.72M | Year: 2011
The rupture risk of intracranial aneurysms (IA) has been studied at length. However, very little is known about the healing mechanism, namely the formation of a clot inside the cavity after insertion of a stent. The multiscale interaction between biological and hemodynamic processes is the central ingredient of this proposal. The core of the project is to develop and validate a biological model of spontaneous or stent-induced thrombosis in IA. From this model we will compute a quantitative stent efficiency score based on the stent's capability to induce clotting in aneurysms. In medical practice the choice of which stent to deploy is left to the medical doctor and remains intuitive to date. It is common to use one or several full-course stents inserted into each other in order to induce thrombosis formation. Recent Pipeline stents, allowing single or multiple device constructs with variable flow disruption, will be investigated. Our project will study through numerical simulation the effect of stent configuration in patient-specific geometry, and will help explain why some stents produce good thrombus while others don't. The project will develop a multiscale computational modelling and simulation framework based on the In Vitro - In Vivo - In Silico triptych (the "rule of three") for thrombosis. The associated technological aim of the project is to deliver software with an interactive end-user interface, providing a virtual simulation of the thrombosis leading to the optimal stent for a specific patient's aneurysm. This goal will be achieved by integrating some of the leading open-source software and VPH toolkit software in the area of computational bioengineering. A collaborative online system will also be adapted, allowing the partners of THROMBUS to correlate any type of data in case simultaneous multidisciplinary analysis by distant partners is required. This platform will remain operational after the end of the project.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: OCEAN 2013.2 | Award Amount: 6.74M | Year: 2013
SCHeMA is a multi-disciplinary collaborative project aiming to provide an open and modular sensing solution for in situ high-resolution mapping of a range of anthropogenic and natural chemical compounds. Key targets are chemicals that may adversely affect marine ecosystems, living resources and ultimately human health. The SCHeMA tools will enhance ocean observing system capabilities to evaluate the impact of these compounds on marine water quality trends, thereby allowing one to rapidly localise problems and alert targeted groups. To achieve this, SCHeMA will develop: 1) chemical solid-state miniaturized sensors functionalized using innovative analytical procedures to ensure reliable and selective electrochemical and optical measurements of inorganic (micro-)nutrients/pollutants, VOCs, biotoxins, HABs, and species relevant to the carbon cycle, as well as effective minimisation of chemical and physical interferences; 2) micro- and mini-analytical and mechanical fluidic systems; 3) miniaturized multichannel probes, incorporating the new sensors and fluidic systems, based on advanced hardware, firmware and wired/wireless interfaces allowing their plug-and-play integration into moored or free-floating devices; 4) ad hoc ICT solutions allowing remote control of data transfer and mapping system reconfiguration according to the OGC standard; 5) a Web-based data information system for data storage, standardization, modelling and user-friendly accessibility by public authorities, scientists and existing observation/monitoring systems. The SCHeMA sensing tools will be optimised throughout their development via short field tests and inter-comparison with data obtained using established laboratory techniques. Long-term field applications in estuary and coastal systems will also be performed to (i) evaluate their ruggedness and reliability for high-resolution spatial and temporal monitoring, and (ii) define their suitability for different applications and commercial production.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: PEOPLE-2007-1-1-ITN | Award Amount: 2.84M | Year: 2008
COSI integrates young researchers into a network of 10 leading European research centres, including Bayer BioScience as industrial partner. We aim at identifying the regulatory principles governing chloroplast metabolism, a crucial factor for agricultural productivity. Specifically, we want to identify chloroplast-related protein kinases and their targets and associated calcium signals. A long-term objective of COSI is increased plant productivity under stress conditions. COSI has expertise in various aspects of photosynthesis in algae and higher plants and in plant signal transduction. This unique combination will be used to identify major regulatory principles of plant organellar metabolism that in principle also apply far beyond the plant field. Thus training and knowledge can be transferred to many other fields in the life sciences. An integrated working programme consisting of work packages, each jointly coordinated by two groups of the network, guarantees maximal use of complementary expertise and strengthens ongoing interactions between partners. In addition to intensive exchange and collaboration among the involved young researchers, special training courses will introduce the young researchers to the basic methods required for their work and furthermore help them to develop complementary skills. Early stage researchers will be supported by a mentoring programme to enhance their personal development. Special emphasis will be placed on the promotion of women. A training course at Bayer BioScience will expose young researchers to an industrial environment and provide them with industrially relevant skills. COSI will offer hands-on training in cutting-edge technologies such as bioinformatics, live-cell imaging, mass spectrometry and metabolomics, and will establish an outstanding European research community in organellar signal transduction, an emerging and competitive research field of central importance in the life sciences.
Agency: European Commission | Branch: FP7 | Program: CP-FP-SICA | Phase: ENV.2010.2.1.1-1 | Award Amount: 4.16M | Year: 2011
Threats to the environment and natural resources, coupled with poor management, have serious implications for both poverty reduction and sustainable economic development. Degrading natural resources in Africa therefore result in increased vulnerability of the poor, driven by ecosystem stress, competition for space, soaring food and energy prices, climate change and demographic growth. It is now widely accepted that reversing these trends calls for integrated management frameworks. Despite the availability of many tools, expertise, strategies, local practices and indigenous knowledge, the concept of INRM has hardly been brought into practice, and the building blocks of INRM (see description of the acronym) in many cases still need to be integrated. AFROMAISON will make use of what is already available for INRM and will contribute to a better integration of its components. In view of the decentralisation policy in Africa, we aim to focus on the operational requirements of INRM for sub-national (or meso-scale) authorities and communities. The main outputs of AFROMAISON are a toolbox, short-term to long-term strategies, quick wins (large gains with little effort) and operational strategies for adaptation to global change. To enhance the potential impact, we will put strong effort into integrated capacity building and a solid dissemination strategy. To do so, we will integrate tools, frameworks, strategies and processes for landscape functioning, livelihood and socio-economic development (including vulnerability to global change), local knowledge, institutional strengthening and improved interaction between sectors, scales and communities. For the development of concrete operational strategies for adaptation to global change, AFROMAISON will focus on three groups of tools: strategies for restoration and adaptation (including sustainable landscape intensification), economic tools and incentives for INRM, and tools for spatial planning.
University of Geneva and ELECTROPHORETICS Ltd | Date: 2011-05-23
The invention relates to a method of aiding the diagnosis of acute brain damage in a subject, said method comprising (i) assaying the concentration of at least one oxidative stress polypeptide selected from the group consisting of: PRDX1, PRDX6 and GSTP1 in a sample from said subject; and (ii) assaying the concentration of at least one further polypeptide selected from Panel A; (iii) comparing the concentrations of (i) and (ii) to the concentrations of the polypeptides in a reference standard and determining quantitative ratios for said polypeptides; (iv) wherein a finding of a quantitative ratio of each of the assayed polypeptides in the sample to the polypeptides in the reference standard of greater than 1.3 indicates an increased likelihood of acute brain damage having occurred in said subject.
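The ratio test in steps (iii) and (iv) amounts to dividing each assayed concentration by its reference-standard value and flagging ratios above 1.3. The sketch below is purely illustrative: the polypeptide names come from the abstract, but the concentration values and helper-function names are invented, not part of the patented method.

```python
# Illustrative sketch of the quantitative-ratio test in steps (iii)-(iv).
# Concentration values below are invented for demonstration only.

THRESHOLD = 1.3  # ratio above which an increased likelihood is indicated

def quantitative_ratios(sample, reference):
    """Ratio of each sample concentration to the reference standard."""
    return {p: sample[p] / reference[p] for p in sample}

def indicates_brain_damage(ratios, threshold=THRESHOLD):
    """True when every assayed polypeptide exceeds the threshold ratio."""
    return all(r > threshold for r in ratios.values())

# Invented example concentrations (arbitrary units).
sample = {"PRDX1": 2.8, "PRDX6": 1.9, "GSTP1": 3.1}
reference = {"PRDX1": 1.0, "PRDX6": 1.0, "GSTP1": 1.0}

ratios = quantitative_ratios(sample, reference)
flag = indicates_brain_damage(ratios)
```

For these invented values every ratio exceeds 1.3, so the flag is raised; a single polypeptide at or below the threshold would leave it unset.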
University of Geneva and University Claude Bernard Lyon 1 | Date: 2010-03-10
The present invention relates to compounds of formula (I), to their pharmaceutical compositions, and to their use for the treatment of cancers expressing the oncogenic ALK protein, particularly anaplastic large cell lymphoma (ALCL), diffuse large B cell lymphoma (DLBCL), inflammatory myofibroblastic tumours (IMT) and non-small cell lung cancer (NSCLC).
Les Hopitaux Universitaires Of Geneva, University of Geneva and Hoffmann-La Roche | Date: 2013-11-06
The invention relates to mimetic peptides of epitope(s) of Apolipoprotein A-I, diagnostic immunoassays comprising such mimetic peptides, as well as methods for diagnosing and methods for preventing and/or treating a cardiovascular disease.
News Article | February 26, 2017
Gamers and scientists are coming together to explore astronomical data, because science is awesome. In a collaboration between the multiplayer game EVE Online, Massively Multiplayer Online Science, Reykjavik University, and the University of Geneva, players can actually help scientists find and classify exoplanets. Though it sounds like science fiction, this new method of data-gathering looks to be awesome for both gamers and science! Naturally, with excitement up after the recent scientific discovery of seven exoplanets, players and scientists are working quickly to refine the search of the data. EVE's players will interact with data provided by the University of Geneva. After the players reach a consensus on the data, the information goes to scientists at the University of Geneva. The Executive Producer of EVE, Andie Nordgren, said, "In searching for the next dataset for our massive player community to tackle, the stars aligned for players to have the opportunity to directly contribute to the search for new planets with a world-renowned scientific team. Real people around the world collaborating in a virtual universe to explore the real universe is the stuff science fiction, and soon science fact, is made of." Because of how exoplanets are discovered, EVE is an ideal way for the scientific community to get outside help. Planets are found by repeatedly examining changes in the light from a star, which occur when a planet passes in front of it. This "transit method" of discovery benefits from several cycles of observation, to rule out the possibility that the light changes are a fluke. Michel Mayor will be presenting EVE in Iceland from April 6-8th. If you can't make it to Iceland, livestreaming of the presentation is available! You can learn more about EVE Online and play for free on their website.
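The transit method described above can be sketched as a toy program: a star's brightness dips slightly each time a planet crosses in front of it, and evenly spaced, repeated dips are what distinguish a planet from a fluke. Everything below (the synthetic light curve, the period, the detection threshold) is an illustrative assumption, not the project's actual pipeline.

```python
# Toy illustration of the transit method: repeated, evenly spaced
# dips in a star's brightness suggest an orbiting planet.
# Synthetic data; period, dip depth and threshold are assumptions.

def make_light_curve(n=200, period=40, dip_width=4, dip_depth=0.01):
    """Constant brightness 1.0 with a small dip every `period` steps."""
    curve = []
    for t in range(n):
        brightness = 1.0
        if t % period < dip_width:  # planet in front of the star
            brightness -= dip_depth
        curve.append(brightness)
    return curve

def find_transits(curve, threshold=0.995):
    """Return the start index of each run where brightness drops below threshold."""
    starts = []
    in_dip = False
    for t, b in enumerate(curve):
        if b < threshold and not in_dip:
            starts.append(t)
            in_dip = True
        elif b >= threshold:
            in_dip = False
    return starts

curve = make_light_curve()
transits = find_transits(curve)
# Equal intervals between dips point to a periodic transit, i.e. a planet.
intervals = [b - a for a, b in zip(transits, transits[1:])]
```

Running this on the synthetic curve finds dips at regular 40-step intervals; real surveys do the same thing statistically, over many noisy observation cycles.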
News Article | March 23, 2016
The Inter-agency and Expert Group on Sustainable Development Goal Indicators (IAEG-SDGs) meets at the end of this month to establish, among other things, the development of global reporting mechanisms (http://unstats.un.org/sdgs). We suggest that data that have been crowdsourced by civil-society ventures should be incorporated into the international process of monitoring the SDGs. To address potential issues of data quality, initiatives such as the Open Seventeen Challenge are necessary to train organizers of crowdsourcing projects (see http://openseventeen.org). This initiative draws on advice from leaders in advocacy, governance and crowdsourcing tools. It is run by Citizen Cyberlab, a partnership between the University of Geneva, the United Nations Institute for Training and Research (UNITAR) and Europe's particle-physics lab CERN. For crowdsourcing to achieve its full potential, governments will need to support projects that promote public participation in measuring progress towards the SDGs. National statistics offices must develop best practices for integrating crowdsourced data. As a first step, we encourage the IAEG-SDGs to emphasize crowdsourcing as a legitimate and valuable contribution to tracking the goals.
Bugianesi E.,University of Turin |
Salamone F.,University of Turin |
Negro F.,University of Geneva
Journal of Hepatology | Year: 2012
Given the pandemic spread of the hepatitis C virus (HCV) infection and the metabolic syndrome (MS), the burden of their interaction is a major public health issue, bound to increase in the near term. A better appreciation of the clinical consequences of the relationship between HCV and MS is needed, not only due to their potential synergism on liver disease severity, but also because of the multifaceted interactions between HCV and glucose and lipid metabolism. HCV infection per se does not carry an increased risk of MS, but is able to perturb glucose homeostasis through several direct and indirect mechanisms, leading to both hepatic and extrahepatic insulin resistance. This translates into accelerated liver disease progression (including the development of hepatocellular carcinoma), reduced response to antivirals and, in susceptible individuals, increased risk of developing full-blown type 2 diabetes. HCV may also cause hepatic steatosis, especially in patients infected with genotype 3, although the clinical impact of viral steatosis is debated. Possibly as a result of HCV-induced insulin resistance, and despite a paradoxically favourable lipid profile, the cardiovascular risk is moderately increased in chronic hepatitis C. In addition, the interaction with the MS further increases the risks of cirrhosis, hepatocellular carcinoma, diabetes, and cardiovascular events. Thus, targeted lifestyle and pharmacological measures are urgently warranted in chronic hepatitis C with metabolic alterations. © 2012 European Association for the Study of the Liver.
Sangouard N.,University of Geneva |
Sangouard N.,CNRS Materials and Quantum Phenomena Laboratory |
Simon C.,University of Geneva |
Simon C.,University of Calgary |
And 4 more authors.
Reviews of Modern Physics | Year: 2011
The distribution of quantum states over long distances is limited by photon loss. Straightforward amplification as in classical telecommunications is not an option in quantum communication because of the no-cloning theorem. This problem could be overcome by implementing quantum repeater protocols, which create long-distance entanglement from shorter-distance entanglement via entanglement swapping. Such protocols require the capacity to create entanglement in a heralded fashion, to store it in quantum memories, and to swap it. One attractive general strategy for realizing quantum repeaters is based on the use of atomic ensembles as quantum memories, in combination with linear optical techniques and photon counting to perform all required operations. Here the theoretical and experimental status quo of this very active field are reviewed. The potentials of different approaches are compared quantitatively, with a focus on the most immediate goal of outperforming the direct transmission of photons. © 2011 American Physical Society.
Thille A.W.,University of Poitiers |
Richard J.-C.M.,University of Geneva |
Brochard L.,University of Geneva |
Brochard L.,University Paris Est Creteil
American Journal of Respiratory and Critical Care Medicine | Year: 2013
The day of extubation is a critical time during an intensive care unit (ICU) stay. Extubation is usually decided after a weaning readiness test involving spontaneous breathing on a T-piece or low levels of ventilatory assist. Extubation failure occurs in 10 to 20% of patients and is associated with extremely poor outcomes, including high mortality rates of 25 to 50%. There is some evidence that extubation failure can directly worsen patient outcomes independently of underlying illness severity. Understanding the pathophysiology of weaning tests is essential given their central role in extubation decisions, yet few studies have investigated this point. Because extubation failure is relatively uncommon, randomized controlled trials on weaning are underpowered to address this issue. Moreover, most studies evaluated patients at low risk for extubation failure, whose reintubation rates were about 10 to 15%, whereas several studies identified high-risk patients with extubation failure rates exceeding 25 or 30%. Strategies for identifying patients at high risk for extubation failure are essential to improve the management of weaning and extubation. Two preventive measures may prove beneficial, although their exact role needs confirmation: one is noninvasive ventilation after extubation in high-risk or hypercapnic patients, and the other is steroid administration several hours before extubation. These measures might help to prevent postextubation respiratory distress in selected patient subgroups. Copyright © 2013 by the American Thoracic Society.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: ENV.2013.6.5-4 | Award Amount: 1.16M | Year: 2013
The IASON project has the ultimate goal of establishing a permanent and sustainable network of scientific and non-scientific institutions, stakeholders and private-sector enterprises in the EU and in third countries located in two significant areas: the Mediterranean and the Black Sea regions. The main focal points of the project will be the usage and application of Earth Observation (EO) in the following topics: climate change, resource efficiency, and raw materials management. IASON aims to build on the experience gained by five FP7-funded projects (OBSERVE, enviroGRIDS, GEONETCab, EGIDA, and BalkanGEONet), all of which focused on enhancing EO capacities, knowledge and technology in the EU and in neighbouring countries. During their execution they established links with a critical mass of research institutions, organizations, public bodies, stakeholders and policy makers in the Balkan region, the Mediterranean, and the Black Sea Basin. IASON intends to create the proper conditions for enhancing knowledge transfer, capacity building and market opportunities in using EO applications and mechanisms in research fields addressing climate action, resource efficiency and raw materials management. To achieve its goal, IASON will engage in: visible and effective capacity-building and knowledge-transfer activities with third-country research institutes, organizations, stakeholders and policy makers, through the organization of two training workshops (one in each region); demonstration of market opportunities through uptake of results from three projects (PEGASO, enviroGRIDS, and IMPACTMIN), best-case scenarios and success stories; and identification of projects and networks, using the regional partners' contacts in the third countries, along with input from the training workshops and Advisory Board members, in the thematic fields that have potential for future cooperation.
The project will also liaise and coordinate dissemination activities with other projects dealing with research and innovation cooperation for Societal Challenge 5 of Horizon 2020, and will create an innovative web-based common information platform covering clustering projects with synergy potential, along with networking tools to enhance communication between interested parties.
Agency: European Commission | Branch: FP7 | Program: CSA-CA | Phase: HEALTH-2007-2.3.2-11 | Award Amount: 1.17M | Year: 2009
A number of new vaccines are being developed against poverty-related infectious diseases of major public health importance at the global level. The development of all these vaccines faces the same kinds of challenges and gaps, which still prevent: 1) the establishment of readily accessible formulation and scale-up process development capacity for neglected-disease vaccines; 2) the establishment of a systematic approach for prioritizing the formulation of vaccine candidates using accepted pre-clinical criteria; and 3) the development of information-sharing tools to strengthen connections between scientists, developers and clinical investigators. To address these challenges, the European vaccine community needs to establish a shared vision and goals and to identify activities that could address some of the above-mentioned challenges. Cooperation between the different groups of PRD vaccine developers can bring innovative approaches for accelerating the development of effective vaccines. Several of these challenges can be addressed through coordination across poverty-related diseases, such as: 1) difficulties in accessing certain technology platforms, such as synthetic peptides and recombinant proteins, for GMP development, and therefore in making rational decisions on the best industrial approach for pharmaceutical development; 2) difficulties in accessing certain know-how, such as lyophilisation, and the lack of a formulation platform accessible to academic groups; 3) difficulties in accessing certain delivery platforms, such as adjuvants and virus-like particles, for GMP development and/or in assessing the quality and regulatory compliance of those platforms; 4) difficulties in harmonising safety data collection; and 5) the insufficient number of trained scientists able to take leadership roles in vaccine development.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.1.1 | Award Amount: 6.29M | Year: 2010
The flexibility inherent to wireless technologies is giving rise to new types of access networks and allowing the Internet to expand in a user-centric way. This is particularly relevant given that wireless technologies such as Wireless Fidelity (Wi-Fi) currently complement broadband Internet access technologies, forming the last hop to the end-user. This becomes even more significant due to the dense deployment of Wi-Fi access points that is common today in urban environments. Given such density, a relevant line of work is to leverage this wireless local loop by developing networking mechanisms that allow adequate resource management and a future Internet architecture that scales in an autonomic way. Such a wireless local loop could then reach rates closer to those provided by current access technologies. One way to overcome the limitations of today's broadband access technologies is to extend the reach of the backbone infrastructure by means of low-cost wireless technologies embodying a multi-operator model, i.e. a local loop based on what a specific community of individuals (end-users) is willing to share, backed by specific cooperation incentives and good-behaviour rules. The purpose of this project is therefore to explore the potential of a wireless local loop based on a user-centric (community) model that extends the reach of a high-throughput, multi-access broadband backbone, from different perspectives (technical and business models, as well as the expected impact on the telecommunications market and legislation). We expect to show that such a model can be beneficial both from an end-user and from an access-provider perspective, given that it allows high-throughput reach to be expanded in a seamless, cooperative and low-cost manner, enabling operators to focus on services rather than on pipes.
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2012-ITN | Award Amount: 4.03M | Year: 2012
AccliPhot is an interdisciplinary, intersectoral research and training network devoted to the study of photosynthetic acclimation processes in plants and algae. We will train 13 early-stage researchers (ESRs) and 1 experienced researcher (ER) in cutting-edge experimental technologies, modern modelling approaches, industrial applications and a wide spectrum of complementary, industry-relevant skills. Within the framework of our research goal, to obtain a systems-wide understanding of photosynthetic acclimation processes and their consequences at the organism and population levels, individual research projects and targeted secondments will expose the ESRs to intense interdisciplinary collaborations combining complementary experimental and theoretical approaches to study photosynthetic acclimation across various scales of biological organisation. Projects will investigate signalling pathways that respond to environmental changes, electron transport chain activity, photosynthetic metabolism, and the growth of plants and algal populations, using four organisms. Fundamental research uncovering novel mechanisms will be performed in Arabidopsis and Chlamydomonas. New knowledge will be applied to the diatom Phaeodactylum tricornutum, which will be the subject of industrial research into the biotechnological production of biofuels. With complementary competences acquired in specialised workshops, the ESRs trained in our network will emerge as highly qualified scientists experienced in interdisciplinary collaboration. Exposure to research in the private sector and training in an industrial environment will stimulate their creativity and enhance their entrepreneurial spirit, thus improving their prospects on the European job market in academia, industry and beyond. Our research activities will have a long-term impact, supporting the optimisation of crop productivity and the development of cost-effective strategies for algal biofuel production.
University of Rochester, University of Geneva and CSIC - Institute of Materials Science | Date: 2014-01-29
In the present disclosure, energy harvesters based on quantum confinement structures, such as resonant quantum wells and/or quantum dots, are described. Also disclosed are methods of harvesting energy utilizing the described energy harvester and methods of manufacturing energy harvesters. Energy harvesting is the process by which energy is taken from the environment and transformed to provide power for electronics.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SSH-2007-3.2-01 | Award Amount: 1.91M | Year: 2008
This research aims to advance knowledge on the causes, processes, and prospects for change related to the social and political exclusion of unemployed youth. It has three main objectives: (1) to generate a new body of data on unemployed youth (in particular, the young long-term unemployed) as well as precarious youth; (2) to advance theory and extend knowledge on the social and political exclusion of unemployed youth; and (3) to provide practical insights into potential paths for the social and political integration of unemployed youth. The overall design of the research has three main components: (1) a multidimensional theoretical framework that combines macro-level, meso-level, and micro-level explanatory factors while taking into account various dimensions of exclusion (social and political exclusion, individual well-being); (2) a cross-national comparative design that includes European countries with different institutional approaches to unemployment (France, Germany, Italy, Poland, Sweden, and Switzerland); (3) an integrated methodological approach based on multiple sources and methods (analysis of state and EU policies and practices towards unemployment, a survey of organizations active in the field, a survey of young long-term unemployed and precarious youth, in-depth interviews with the young long-term unemployed, and focus groups with stakeholders). Three important features of the proposed research underscore its innovative impact: (1) its comparative approach, allowing for benchmarking and best-practice analysis; (2) its multidimensional approach, allowing us to consider the mediating impact of (European, national, or local) public policy on the way people cope with unemployment; (3) its interactive research process, spurring policy learning by bringing together different expertise and knowledge, and allowing at the same time for the transfer of scientific findings into policy recommendations.
Agency: European Commission | Branch: FP7 | Program: BSG-SME | Phase: SME-1 | Award Amount: 1.36M | Year: 2010
Colorectal cancer and lung cancer cause millions of deaths each year. Currently there is no suitable non-invasive method for the early detection of these types of cancer. The tumour suppressor gene BARD1 (BRCA1-associated RING domain protein) is aberrantly expressed in several types of cancer and could be a diagnostic target for early cancer diagnosis in blood samples. The overall objective of the project is to develop blood tests for the early detection of colorectal and lung cancer based on cancer-specific BARD1 isoforms. The outlined tests will analyse BARD1 isoforms at two levels: the expression of isoform-specific RNA in circulating tumour cells (CTCs) and the presence of isoform-specific autoantibodies in serum. To reach these objectives, several technological challenges have to be overcome. The BARDiag consortium includes 3 SMEs and 4 research centres, which have the excellent expertise, specific knowledge, required lab infrastructure and necessary clinical materials to conduct this project. Within the frame of the project, innovative methods for the isolation of CTCs from colorectal and lung cancer patients will be developed, the specific signatures of the BARD1 isoforms at both the mRNA and autoimmune levels will be defined, and assays for the detection of these isoforms will be established, validated with clinical samples and tested for their marketability. The results of the proposed project will have extensive impact. Not only will more scientific knowledge on the expression of BARD1 isoforms in colorectal and lung cancer be obtained, improving our understanding of cancer, but non-invasive methods for the early detection of colorectal and lung cancer will also be made broadly available in the form of commercial test kits. The SMEs will extend their expertise and knowledge and thereby strengthen their economic power, contributing to increased European competitiveness.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2012.1.4-4 | Award Amount: 7.73M | Year: 2012
Age-related Macular Degeneration (AMD), a neurodegenerative disease of the retina, is a major cause of blindness in elderly people. Due to the aging population, AMD has been referred to as a time bomb in society. In the exudative form of AMD, high levels of vascular endothelial growth factor (VEGF) and low levels of pigment epithelium-derived factor (PEDF), an inhibitor of vascularization and a neuroprotective factor produced by retinal pigment epithelial (RPE) cells, result in subretinal neovascularization and retinal pigment cell degeneration. The current treatment, monthly injections of anti-VEGF antibodies, is only effective for ~30% of patients. To avoid the severe side effects, high costs and the continuing burden on health care associated with monthly antibody injections, inducing a higher level of PEDF expression to inhibit neovascularization would be a viable therapeutic alternative. TargetAMD will subretinally transplant genetically modified, patient-derived iris or RPE cells that overexpress PEDF to provide a long-lasting cure for AMD. Stable PEDF gene delivery will be based on the non-viral Sleeping Beauty transposon system, which combines the efficacy of viral delivery with the safety of naked DNA plasmids. Academic scientists and SME partners will produce innovative gene delivery technologies, reagents and devices to be translated into a simple and safe gene-therapeutic treatment for exudative AMD. Experienced clinicians will perform two clinical trials, comprising the isolation and PEDF-transfection of a patient's pigment epithelial cells and the implantation of the transfected cells into the patient during a single, 60-minute surgical session. This project will bring a significant enhancement in quality of life to AMD patients, highlight the synergistic power of academic, clinical and industrial cooperation to the scientific arena, and open new markets for novel products for clinical applications of transposon-based gene therapy.
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP-2008-2.2-2;NMP-2008-2.1-2 | Award Amount: 4.61M | Year: 2009
The NANOGOLD project aims at the fabrication and application of bulk electromagnetic metamaterials. A promising new concept for the exploration of metamaterials is the use of periodic structures with periods considerably shorter than the wavelength of the operating electromagnetic radiation. This concept makes it possible to control the refractive properties. Using a bottom-up approach to materials design, we will apply self-organization of organic-inorganic composite materials containing resonant entities. To tune electromagnetic properties, resonance and interference at different length scales will be implemented. In this way we will obtain bulk optical metamaterials operating in spectral domains appropriate for photonics that can be used in applications. Our groundbreaking solution for forming such artificial matter is interdisciplinary and combines inorganic chemistry, organic macromolecular synthesis, the physics of electromagnetic resonances and liquid crystal technology. We start with resonant entities (metallic nanoparticles) and organize them via self-organization on the molecular scale. Systematic modular variation of the chemical entities gives access to libraries of materials, which will be used to arrive at systems with the desired properties. Simulation of optical properties and molecular ordering will guide the design of compounds and materials. Organization at the molecular level leads to homogeneous materials with optical, electronic or magnetic properties at elevated frequencies, in the visible and near-infrared spectral range. The controlled utilization of the polymer physics of micro-segregation will allow for additional structuring at the nanoscale, giving design freedom to tune material properties optimally. Furthermore, NANOGOLD will make use of innovative fabrication techniques and processing known from liquid crystal displays while exploring new physical effects, which will result in novel devices.