London, United Kingdom

Imperial College London is a public research university located in London, United Kingdom. Formerly a constituent college of the federal University of London, it became fully independent on 9 July 2007, during the commemoration of its centenary. Imperial has grown through mergers, including with St Mary's Hospital Medical School, the National Heart and Lung Institute and the Charing Cross and Westminster Medical School. Imperial College Business School was established in 2003 and its building was opened by Queen Elizabeth II in 2004. Imperial is organised into four main faculties: science, engineering, medicine and business; within these there are over 40 departments, institutes and research centres. Imperial has around 13,500 students and 3,330 academic and research staff. Imperial's main campus is located in the South Kensington area of London, with additional campuses in Chelsea, Hammersmith, Paddington, Silwood Park, Wye College and Singapore, making it one of the largest estates of any UK tertiary institution. Imperial is a major centre for biomedical research, with a total income of £822 million in 2012/13. Imperial is a founding member of the Francis Crick Institute and Imperial College Healthcare. Imperial is a member of the Association of Commonwealth Universities, the European University Association, the Association of MBAs, the G5, the League of European Research Universities, Oak Ridge Associated Universities and the Russell Group. Along with Cambridge and Oxford, Imperial forms a corner of the "golden triangle" of British universities. Imperial is one of the most selective British universities and is consistently ranked among the top universities in the world, placing 2nd in the 2014/15 QS World University Rankings and 9th in the 2014/15 Times Higher Education World University Rankings. In a corporate study carried out by The New York Times, its graduates were rated among the most valued in the world. Imperial's alumni and faculty include 15 Nobel laureates, 2 Fields Medalists, 70 Fellows of the Royal Society, 82 Fellows of the Royal Academy of Engineering, and 78 Fellows of the Academy of Medical Sciences. (Wikipedia)



Eaton J.W., Imperial College London | Bao L., Pennsylvania State University
AIDS | Year: 2017

Objectives: The aim of the study was to propose and demonstrate an approach for incorporating additional nonsampling uncertainty about HIV prevalence measured at antenatal clinic sentinel surveillance (ANC-SS) into model-based inferences about trends in HIV incidence and prevalence. Design: Mathematical model fitted to surveillance data with Bayesian inference. Methods: We introduce a variance inflation parameter that accounts for the uncertainty of nonsampling errors in ANC-SS prevalence. It is additive to the sampling error variance. Three approaches are tested for estimating this parameter, using ANC-SS and household survey data from 40 subnational regions in nine countries in sub-Saharan Africa, as defined in the UNAIDS 2016 estimates. Methods were compared using in-sample fit and out-of-sample prediction of ANC-SS data, fit to household survey prevalence data, and their computational implications. Results: Introducing the additional variance parameter increased the error variance around ANC-SS prevalence observations by a median of 2.7 times (interquartile range 1.9-3.8). Using only sampling error in ANC-SS prevalence, coverage of 95% prediction intervals was 69% in out-of-sample prediction tests. This increased to 90% after introducing the additional variance parameter. The revised probabilistic model improved model fit to household survey prevalence and increased epidemic uncertainty intervals most during the early epidemic period before 2005. Estimating this parameter did not increase the computational cost of model fitting. Conclusions: We recommend estimating nonsampling error in ANC-SS as an additional parameter in Bayesian inference using the Estimation and Projection Package model. This approach may prove useful for incorporating other data sources, such as routine prevalence from prevention of mother-to-child transmission testing, into future epidemic estimates. © 2017 The Author(s). Published by Wolters Kluwer Health, Inc.
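The central device in the abstract, adding an estimated nonsampling variance to the sampling variance of each ANC-SS observation before evaluating its likelihood, can be illustrated with a short sketch. This is a hypothetical Python illustration, not the Estimation and Projection Package code: the variable names are invented and a normal likelihood on the prevalence scale stands in for the probit-scale likelihood actually used.

```python
import numpy as np
from scipy import stats

def anc_loglik(obs_prev, model_prev, sampling_var, nonsampling_var):
    """Log-likelihood of observed ANC-SS prevalence given model prevalence.

    The nonsampling ("variance inflation") term is simply added to the
    sampling error variance, widening the uncertainty attached to each
    clinic observation.
    """
    total_var = sampling_var + nonsampling_var  # additive inflation
    return stats.norm.logpdf(obs_prev, loc=model_prev,
                             scale=np.sqrt(total_var)).sum()

# Toy data: observed clinic prevalence, model-predicted prevalence, and
# approximate binomial sampling variances p(1 - p)/n for three site-years.
obs = np.array([0.18, 0.22, 0.15])
model = np.array([0.20, 0.20, 0.17])
n = np.array([300, 250, 400])
samp_var = obs * (1 - obs) / n

# With nonsampling_var = 0 the prediction intervals are too narrow; a
# positive value (estimated as an extra parameter in the Bayesian fit)
# inflates them, as in the 69% -> 90% coverage result reported above.
print(anc_loglik(obs, model, samp_var, nonsampling_var=0.0))
print(anc_loglik(obs, model, samp_var, nonsampling_var=0.002))
```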


Substantial evidence suggests that breast cancer initiation, recurrence and drug resistance is supported by breast cancer stem cells (BCSCs). Recently, we reported a novel role of Aurora kinase A (AURKA) in BCSCs, as a transactivating co-factor in the induction of the c-Myc oncoprotein. However, the mode of action and transcriptional network of nuclear AURKA in BCSCs remain unknown. Here, we report that nuclear AURKA can be recruited by Forkhead box subclass M1 (FOXM1) as a co-factor to transactivate FOXM1 target genes in a kinase-independent manner. In addition, we show that AURKA and FOXM1 participate in a tightly coupled positive feedback loop to enhance BCSC phenotype. Indeed, kinase-dead AURKA can effectively transactivate the FOXM1 promoter through a Forkhead response element, whereas FOXM1 can activate AURKA expression at the transcriptional level in a similar manner. Consistently, breast cancer patient samples portrayed a strong and significant correlation between the expression levels of FOXM1 and AURKA. Moreover, both FOXM1 and AURKA were essential for maintaining the BCSC population. Finally, we demonstrated that the AURKA inhibitor AKI603 and FOXM1 inhibitor thiostrepton acted synergistically to inhibit cytoplasmic AURKA activity and disrupt the nuclear AURKA/FOXM1-positive feedback loop, respectively, resulting in a more effective inhibition of the tumorigenicity and self-renewal ability of BCSCs. Collectively, our study uncovers a previously unknown tightly coupled positive feedback signalling loop between AURKA and FOXM1, crucial for BCSC self-renewal. Remarkably, our data reveal a novel potential therapeutic strategy for targeting both the cytoplasmic and nuclear AURKA function to effectively eliminate BCSCs, so as to overcome both breast cancer and drug resistance.Oncogene advance online publication, 23 January 2017; doi:10.1038/onc.2016.490. © 2017 The Author(s)


Tumor evolution is shaped by many variables, potentially involving external selective pressures induced by therapies. After surgery, patients with estrogen receptor (ERα)-positive breast cancer are treated with adjuvant endocrine therapy, including selective estrogen receptor modulators (SERMs) and/or aromatase inhibitors (AIs). However, more than 20% of patients relapse within 10 years and eventually progress to incurable metastatic disease. Here we demonstrate that the choice of therapy has a fundamental influence on the genetic landscape of relapsed diseases. We found that 21.5% of AI-treated, relapsed patients had acquired CYP19A1 (encoding aromatase) amplification (CYP19A1amp). Relapsed patients also developed numerous mutations targeting key breast cancer–associated genes, including ESR1 and CYP19A1. Notably, CYP19A1amp cells also emerged in vitro, but only in AI-resistant models. CYP19A1 amplification caused increased aromatase activity and estrogen-independent ERα binding to target genes, resulting in CYP19A1amp cells showing decreased sensitivity to AI treatment. These data suggest that AI treatment itself selects for acquired CYP19A1amp and promotes local autocrine estrogen signaling in AI-resistant metastatic patients. © 2017 Nature Publishing Group, a division of Macmillan Publishers Limited. All Rights Reserved.


Rutter G.A., Imperial College London
Cell Research | Year: 2017

An ability to convert between pancreatic islet cell types may provide a new approach to replace insulin-secreting β cells destroyed by autoimmune attack in Type 1 diabetes. Two papers, which have recently appeared in Cell, describe how this might be achieved. © 2017 Shanghai Institutes for Biological Sciences, Chinese Academy of Sciences


Lynch C.J., Imperial College London | Lane D.A., Imperial College London
Blood | Year: 2016

Shear forces in the blood trigger a conformational transition in the von Willebrand factor (VWF) A2 domain, from its native folded to an unfolded state, in which the cryptic scissile bond (Y1605-M1606) is exposed and can then be proteolysed by ADAMTS13. The conformational transition depends upon a Ca(2+)binding site and a vicinal cysteine disulfide bond. Glycosylation at N1574 has previously been suggested to modulate VWF A2 domain interaction with ADAMTS13 through steric hindrance by the bulky carbohydrate structure. We investigated how the N-linked glycans of the VWF A2 domain affect thermostability and regulate both the exposure of the ADAMTS13 binding sites and the scissile bond. We show by differential scanning fluorimetry that the N-linked glycans thermodynamically stabilize the VWF A2 domain. The essential component of the glycan structure is the first sugar residue (GlcNAc) at the N1574 attachment site. From its crystal structures, N1574-GlcNAc is predicted to form stabilizing intradomain interactions with Y1544 and nearby residues. Substitution of the surface-exposed Y1544 to aspartic acid is able to stabilize the domain in the absence of glycosylation and protect against ADAMTS13 proteolysis in both the VWF A2 domain and FLVWF. Glycan stabilization of the VWF A2 domain acts together with the Ca(2+)binding site and vicinal cysteine disulfide bond to control unfolding and ADAMTS13 proteolysis. © 2016 by The American Society of Hematology.


OBJECTIVE: To compare clinical outcomes after laparoscopic lavage (LL) or colonic resection (CR) for purulent diverticulitis. BACKGROUND: Laparoscopic lavage has been suggested as an alternative to traditional CR. Comparative studies to date have shown conflicting results. METHODS: Electronic searches of Embase, Medline, Web of Science, and Cochrane databases were performed. Weighted mean differences (WMD) were calculated for effect size of continuous variables and pooled odds ratios (POR) calculated for discrete variables. RESULTS: A total of 589 patients recruited from 3 randomized controlled trials (RCTs) and 4 comparative studies were included; 85% as Hinchey III. The LL group had younger patients with higher body mass index and lower ASA grades, but comparable Hinchey classification and previous diverticulitis rates. No significant differences were noted for mortality, 30-day reoperations and unplanned readmissions. LL had higher rates of intraabdominal abscesses (POR = 2.85; 95% confidence interval (CI) 1.52–5.34; P = 0.001), peritonitis (POR = 7.80; 95% CI 2.12–28.69; P = 0.002), and increased long-term emergency reoperations (POR = 3.32; 95% CI 1.73–6.38; P < 0.001). Benefits of LL included shorter operative time, fewer cardiac complications, fewer wound infections, and shorter hospital stay. Overall, 90% had stomas after CR, of whom 74% underwent stoma reversal within 12 months. Approximately 14% of LL patients required a stoma; 48% obtained gut continuity within 12 months, whereas 36% underwent elective sigmoidectomy. CONCLUSIONS: The preservation of diseased bowel by LL is associated with approximately 3 times greater risk of persistent peritonitis, intraabdominal abscesses and the need for emergency surgery compared with CR. Future studies should focus on developing composite predictive scores encompassing the wide variation in presentations of diverticulitis, with treatment tailored on a case-by-case basis. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
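For readers unfamiliar with the pooled odds ratios (POR) quoted above: they are typically obtained by inverse-variance weighting of study-level log odds ratios. The sketch below is a generic fixed-effect version with invented counts; it is not the review's data, and the review may well have used a random-effects model instead.

```python
import numpy as np

def pooled_or_fixed_effect(events_a, total_a, events_b, total_b):
    """Fixed-effect (inverse-variance) pooled odds ratio with a 95% CI.

    Each study contributes a log odds ratio weighted by the inverse of its
    variance (the sum of reciprocal cell counts).
    """
    a = np.asarray(events_a, float)
    b = np.asarray(total_a, float) - a
    c = np.asarray(events_b, float)
    d = np.asarray(total_b, float) - c
    log_or = np.log((a * d) / (b * c))
    weights = 1.0 / (1/a + 1/b + 1/c + 1/d)
    pooled = np.sum(weights * log_or) / np.sum(weights)
    se = np.sqrt(1.0 / np.sum(weights))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return np.exp(pooled), ci

# Invented counts for three hypothetical trials: abscess events out of
# patients in the lavage arm vs the resection arm.
print(pooled_or_fixed_effect([12, 9, 15], [60, 50, 74], [5, 3, 6], [58, 52, 70]))
```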


Markar S.R., Imperial College London
Annals of Surgery | Year: 2017

OBJECTIVE: To identify patient factors that are associated with emergency presentation of esophageal and gastric cancer, and further to evaluate long-term prognosis in this cohort. BACKGROUND: The incidence of emergency presentation is variable, and the prognosis of patients stabilized and discharged to return for elective surgery is unknown. METHODS: The primary admission of patients with esophageal or gastric cancer within the Hospital Episode Statistics database (1997–2012) was used to classify the presentation as an emergency or elective diagnosis. Multivariate regression analyses were used to identify patient factors associated with emergency diagnosis and prognosis. RESULTS: A total of 35,807 (29.4%) and 45,866 (39.6%) patients with esophageal and gastric cancer, respectively, presented as an emergency over the study period. Age ≥70, female sex, non-white ethnicity, Charlson comorbidity index score ≥3 and a more deprived Townsend index were independent predictors of emergency cancer diagnosis. Emergency diagnosis was an independent predictor of increased 5-year mortality for all patients with esophageal cancer [hazard ratio (HR) = 1.63, 95% confidence interval (CI) 1.61–1.65] and gastric cancer (HR = 1.20, 95% CI 1.16–1.23). Specifically, patients receiving surgery on an elective follow-up admission after an initial emergency diagnosis had a poorer prognosis (esophageal cancer: HR = 1.35, 95% CI 1.27–1.44; gastric cancer: HR = 1.13, 95% CI 1.04–1.22), with a significant increase in liver recurrence (esophageal cancer: 7.1% vs 4.9%, P < 0.001; gastric cancer: 7.0% vs 4.8%, P < 0.001) compared to patients referred electively. CONCLUSIONS: Emergency presentation of esophageal and gastric cancer is associated with a poor prognosis, due to the increased incidence of metastatic disease at diagnosis and a higher recurrence rate after surgery. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
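The hazard ratios above come from proportional-hazards regression on time-to-event data. The following sketch fits a Cox model with the lifelines library on a small synthetic cohort; the column names, effect sizes and censoring rule are all invented for illustration, and this is not the Hospital Episode Statistics analysis itself.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000

# Synthetic cohort: an emergency-diagnosis flag and an age indicator,
# with survival times drawn so that both raise the hazard of death.
df = pd.DataFrame({
    "emergency_dx": rng.integers(0, 2, n),
    "age_over_70": rng.integers(0, 2, n),
})
hazard = 0.02 * np.exp(0.5 * df["emergency_dx"] + 0.3 * df["age_over_70"])
df["months"] = rng.exponential(1.0 / hazard)
df["died"] = (df["months"] < 60).astype(int)   # administrative censoring at 5 years
df["months"] = df["months"].clip(upper=60)

# exp(coef) in the summary is the hazard ratio for each covariate.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()
```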


Rampling M.W., Imperial College London
Clinical Hemorheology and Microcirculation | Year: 2016

An obvious candidate for the seminal event in the history of haemorheology is Harvey's presentation of the concept of the circulation of the blood. Prior to this, the ideas concerning the movement of blood were based, in Europe and the Middle East, largely on the principles laid down by Galen, and these had been, in effect, dogma for nearly a millennium and a half. These principles were basically that blood is formed in the liver, thence it travels to the bodily organs and is consumed - hence there is one-way flow and no circulation of the blood at all. Harvey's revolutionary idea that blood circulates repeatedly around the cardiovascular system laid the foundation for haemorheology because, once that idea was accepted, the fluidity of the blood immediately became potentially of crucial importance - and haemorheology was conceived. In this paper the ideas that preceded Harvey will be presented, i.e. those of Galen, Ibn al-Nafis, Vesalius, Fabricius and Colombo, among others. Harvey's awareness of this background, due mainly to time spent in Padua, triggered his many experimental investigations and discoveries. Ultimately, these led to his astonishing insights published in De Motu Cordis in 1628, which changed the understanding of the cardiovascular system forever. © 2016 - IOS Press and the authors. All rights reserved.


News Article | April 25, 2017
Site: www.eurekalert.org

Some elderly people's brains appear older than the age on their birth certificates. These people have a higher risk of developing deteriorating age-associated conditions and impairments, and even of experiencing an earlier death than expected. So say researchers who used neuroimages of the brain to identify biomarkers that show how the structures of a person's brain age. Being able to predict someone's brain age could be a valuable tool to help clinicians make timely medical interventions, believes James Cole of Imperial College London in the UK. He is the lead author of a study in Springer Nature's journal Molecular Psychiatry that identified so-called brain-predicted age as a useful biomarker. Cole and his co-workers used neuroimaging techniques in combination with multivariate statistical models. They chose neuroimaging because this technique gives detailed information about brain structure, and they used multivariate statistics because it has already been employed successfully to compare healthy brains with those of people who suffer from age-related pathology and cognitive impairment, such as Alzheimer's disease, schizophrenia, or traumatic brain injuries. The researchers first used machine-learning analysis techniques on neuroimaging data from a cohort of 2,001 healthy people aged 18-90. This was then compared to the brain scans of 669 older adults who were part of the Lothian Birth Cohort 1936, a longitudinal study conducted in the Edinburgh and Lothians area of Scotland. This comparison was done to determine whether there is any relationship between brain-predicted age and age-associated body functions as well as mortality. In the process, the research team found a marked difference in people whose brains measured as older than their chronological age. They tend to have weaker grips, poorer lung function, slower walking speed and their bodies have suffered more wear and tear. They were worse at solving new problems, and showed poorer performance when applying logic and identifying patterns. The study showed that it is possible to use neuroimaging assessments to establish a 73-year-old's predicted age, and also to forecast whether that person will be dead by the age of 80. The research group further found that combining brain-predicted age with certain DNA-related epigenetic biomarkers of ageing further improves predictions about a person's probable mortality. "Our study introduces a clinically relevant neuroimaging ageing biomarker and demonstrates that combining distinct measures of biological ageing further helps to determine risk of age-related deterioration and death," Cole says, in summarizing the study's value. "Such biomarkers could potentially identify those at risk of age-associated health problems years before symptoms appear, and be used as outcome measures in trials of therapeutics aimed at delaying the onset of age-related disease." According to Cole the study's findings support the use of magnetic resonance imaging (MRI) scans of the brain as a screening tool to help identify people at greater risk of general functional decline and mortality during ageing.


As it turns out, Brexit was not the first time Britain has separated from the European mainland. Scientists say that England and France were once connected by a ridge of land, until powerful waterfalls from an overfull lake demolished their connection. The findings, published in the journal Nature Communications, help shed light on the emergence of Britain as an island, and on the changes this separation wrought in the greater region’s climate, ecology and human history. “The opening of the Strait has significance for the biogeography and archaeology of NW Europe, with particular attention on the pattern of early human colonization of Britain,” the study authors wrote. Some 450,000 years ago, Europe was a very different place. Glaciers covering the North Sea locked up much of the world’s water, leaving sea levels much lower than they are today. The English Channel was not a wide strip of water separating present-day England and France, but instead a frozen, river-ribboned tundra connecting the two lands. The debate over how the dry tundra turned into a wide waterway has dogged scientists for decades. Was it a sudden change or a gradual process? “The mechanism and history of the breaching of the Dover Strait is a question of importance to not only understanding the geographic isolation of Britain from continental Europe, but also the large-scale rerouting of northwest European drainage and meltwater to the North Atlantic via the Channel,” the study authors wrote. In their paper, a team of European scientists led by Sanjeev Gupta of Imperial College London says they’ve found new evidence backing up an old but until now unproven idea: that Britain was cut off from France thanks to some devastating waterfalls. The evidence for that lay in a series of strange “plunge pools” at the sea floor of the Dover Strait. Discovered in the 1960s while engineers were surveying the sea floor, these depressions could stretch roughly seven kilometers wide and hundreds of meters deep. These pits had been filled with looser sediment, forcing officials to reroute construction of the Channel Tunnel. In the 1980s, Bedford College marine geologist Alec Smith suggested that powerful, prehistoric waterfalls dug those enormous holes, but at the time, scientists lacked the data to determine whether this idea was true. But now, using bathymetric maps to study the sea floor, the scientists found that Smith’s hypothesis was largely correct. Their analysis shows that Britain was once connected to the mainland thanks to a chalk ridge that extended from Dover (home of the famous white-chalk cliffs) in England to Calais in France, right across the Dover Strait. This ridge kept a proglacial lake — a lake formed in front of a glacier — at bay, until some unknown event caused it to spill over the natural dam, plunging into the valley below. This must have occurred at several spots along the ridge, leaving the telltale string of seven or so oversized plunge pools stretching from Dover to Calais. The scientists can’t say for sure exactly what caused the lake to overflow and break the chalk dam. Perhaps a chunk of ice broke off the glacier and plunged into the lake, causing it to slosh over the ridge like sugar cubes dropped into a generous cup of tea. It’s also possible earthquakes helped to weaken the dam. In any case, the results certainly left telltale scars on the sea floor. It was a second major event, however, that finished the job, separating Britain from the mainland for good. 
The Lobourg Channel, a valley at the bottom of the channel that stretches 80 kilometers long and 10 kilometers wide, was likely carved after a series of smaller lakes brimmed over. Though the scientists are unclear on precisely how far apart in time these two massive events took place, they think the second episode may have occurred around 160,000 years ago. The researchers say more study is needed (though it may be easier said than done in the well-trafficked strait). “The Fosses Dangeard sediment infills are an outstanding target for future drilling in order to precisely constrain the chronology of events shaping the breaching history of the Strait, and its palaeogeographic consequences,” the study authors wrote. The findings, however, could refine our understanding of when various species — humans included — arrived in Britain. How different would ancient and modern history have been, for example, if Britain had not become an island but had instead remained a peninsula, rather like Denmark today? “Such a chronological framework is necessary to better understand the timing of when Britain first became isolated from mainland Europe during interglacial high sea-level phases,” the study authors wrote. “This has profound significance to understanding the ability and timing of biota, including humans, to colonize the British Isles.” Understanding how Britain became an island nation will also help scientists understand how such a dramatic rerouting of a massive body of water would have affected the climate, they added. “The rerouting of meltwater from the British-Scandinavian Ice Sheet and its injection into the North Atlantic has implications for inter-hemispheric climate variability,” the scientists wrote.


News Article | April 20, 2017
Site: www.newscientist.com

Struggling to remember something? An electrical jolt deep in the brain might help – if it is given at the right time. To discover the effect of electrical stimulation on memory, Michael Kahana and colleagues at the University of Pennsylvania turned to 150 volunteers who had previously had electrodes implanted in their brains to help control severe epilepsy. These electrodes can record the brain’s electrical signals, giving the team a window into each person’s neural processes. They can also deliver electricity to the brain. First, the team recorded the brain signals of the volunteers while they learned items from a list, and later as they tried to recall those items. They then applied machine learning methods to this brain signal data, enabling them to predict if a person’s efforts to commit something to memory would later prove successful, based on the state of their brain at the time. The team next ran further recall tests, during which they delivered random jolts of electricity to the participants while they were trying to memorise test items. They compared the effects of jolting someone during two different brain states – the pattern of signals linked to being likely to later remember something, and the pattern linked to being more likely to have a memory lapse. They found that giving electrical stimulation when a person’s brain signals suggested they would later forget the current item made that person 13 per cent more likely to recall it. “You get significant enhancement,” says Kahana. Timing was key, however. A jolt of electricity during a pattern of brain activity linked to later recall went on to reduce a person’s likelihood of remembering an item by 18 per cent. The study is the latest of many probing the question of whether zaps of electricity can improve memory. So far, many studies have conflicted with each other on the effects of deep brain stimulation and recall. “Electrical brain stimulation is controversial,” says Inês Violante at Imperial College London. “The majority of studies have a very low number of participants. A study of this size is much more reliable.” Kahana is now working on a device that could tell when the brain would benefit from an induced memory boost. “You could build a technology that could trigger stimulation at moments when you’re predicted to have poor memory, thus enhancing memory of an individual wearing such a device,” says Kahana. Such a device may be useful for people who have memory loss, but first we need to understand which parts of the brain benefit the most from this kind of stimulation. While deep brain stimulation already helps people with untreatable epilepsy or Parkinson’s disease, it’s an extreme treatment that carries the risk of infection. Experimental approaches that stimulate the brain externally may be a more desirable option.
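The closed-loop logic described above (stimulate only when the decoded brain state predicts a later memory lapse) can be sketched in a few lines. This is a hypothetical illustration using a logistic-regression classifier on made-up features; it is not the study's actual decoding pipeline, and the feature names and threshold are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical training data: spectral-power features extracted from
# intracranial recordings while items were being studied, labelled by
# whether each item was later recalled (1) or forgotten (0).
X_train = rng.normal(size=(500, 40))
y_train = rng.integers(0, 2, 500)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def should_stimulate(features, threshold=0.5):
    """Trigger stimulation only in a 'likely to forget' brain state.

    Stimulating during a 'likely to remember' state reduced recall in the
    study, so the device stays silent when predicted recall is high.
    """
    p_recall = clf.predict_proba(features.reshape(1, -1))[0, 1]
    return p_recall < threshold

print(should_stimulate(rng.normal(size=40)))
```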


News Article | April 24, 2017
Site: www.cemag.us

The concept of a perfect lens that can produce immaculate and flawless images has been the Holy Grail of lens makers for centuries. In 1873, a German physicist and optical scientist by the name of Ernst Abbe discovered the diffraction limit of the microscope. In other words, he discovered that conventional lenses are fundamentally incapable of capturing all the details of any given image. Since then, there have been numerous advances in the field to produce images that appear to have higher resolution than allowed by diffraction-limited optics. In 2000, Professor Sir John B. Pendry of Imperial College London -- the John Pendry who enticed millions of Harry Potter fans around the world with the possibility of a real Invisibility Cloak -- suggested a method of creating a lens with a theoretically perfect focus. The resolution of any optical imaging system has a maximum limit due to diffraction, but Pendry's theoretical perfect lens would be crafted from metamaterials (materials engineered to have properties not found in nature) to go beyond the diffraction limit of conventional lenses. Overcoming this resolution limit of conventional optics could propel optical imaging science and technology into realms once only dreamt of by common Muggles. Scientists all over the world have since endeavored to achieve super-resolution imaging that captures the finest of details contained in evanescent waves that would otherwise be lost with conventional lenses. Hyperlenses are super-resolution devices that transform scattered evanescent waves into propagating waves to project the image into the far field. Recent experiments that focus on a single hyperlens made from an anisotropic metamaterial with a hyperbolic dispersion have demonstrated far-field sub-diffraction imaging in real time. However, such devices are limited by an extremely small observation area, which consequently requires precise positioning of the subject. A hyperlens array has been considered to be a solution, but fabrication of such an array would be extremely difficult and prohibitively expensive with existing nanofabrication technologies. Research conducted by Professor Junsuk Rho's team from the Department of Mechanical Engineering and the Department of Chemical Engineering at Pohang University of Science and Technology, in collaboration with a research team from Korea University, has made great contributions to overcoming this obstacle by demonstrating a scalable and reliable fabrication process for a large-scale hyperlens device based on direct pattern transfer techniques. This achievement has been published in Scientific Reports. The team solved the main limitations of previous fabrication methods of hyperlens devices through nanoimprint lithography. Based on a simple pattern transfer process, the team was able to readily fabricate a large-scale hyperlens device on a replicated hexagonal array of hemisphere substrate, directly printed and pattern-transferred from the master mold, followed by metal-dielectric multilayer deposition by electron beam evaporation. This 5 cm x 5 cm hyperlens array has been demonstrated to resolve sub-diffraction features down to 160 nm under 410 nm visible light. Professor Rho anticipates that the research team's new cost-effective fabrication method can be used to proliferate practical far-field and real-time super-resolution imaging devices that can be widely used in optics, biology, medical science, nanotechnology, and other related interdisciplinary fields.
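For context, the resolution bound Abbe identified can be written compactly. The figures below plug the 410 nm illumination quoted in the article into the formula with an idealized numerical aperture of 1, so they are back-of-the-envelope values rather than numbers taken from the paper.

```latex
% Abbe diffraction limit of a conventional (far-field) lens
\[
  d_{\min} \;=\; \frac{\lambda}{2\,\mathrm{NA}} \;=\; \frac{\lambda}{2\, n \sin\theta},
  \qquad
  d_{\min} \;\gtrsim\; \frac{410\ \text{nm}}{2 \times 1} \;\approx\; 205\ \text{nm},
\]
% so the 160 nm features resolved by the hyperlens array lie below what a
% diffraction-limited lens could distinguish at this wavelength.
```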


News Article | May 1, 2017
Site: www.chromatographytechniques.com

Scientists have modeled what happens to the brain of a football player when he collides forcefully with another player. The study, conducted by researchers at Imperial College London, was carried out to understand in more detail the link between traumatic brain injury (TBI) and chronic traumatic encephalopathy (CTE). The latter is a form of dementia and causes a long-term build-up of proteins called tau, associated with the degeneration of brain tissue and declining health. TBI occurs when an external force impacts on the brain. People who have been involved in one-off TBI incidents such as motorcycle accidents, and sportspeople like footballers who have repetitive TBIs from collisions on the field, are both vulnerable to CTE. Scientists believe there is a link between the initial impact in a TBI and where tau deposits build up in the brain. Now, Imperial researchers have modeled how brain tissue deforms during an impact between two American football players on the field. They have also modeled what happens to a person's brain when they have a ground-level fall and the initial impact to the brain in a motorcycle accident. They compared their 3-D high-fidelity models to MRI data on a cohort of 97 patients with TBI, and studies on post-mortem data of the brains of footballers from America's National Football League (NFL) with CTE, previously donated to science institutes in America for analysis. They observed tau deposition in the brains, which was then diagnosed as CTE. "In TBI, the force of the blow shakes the brain, which is similar in texture to jelly. This shaking process deforms the brain tissue and can cause ruptured blood vessels and damaged nerve cells, and more severe complications later on. We've been able to replicate those initial moments when the 'jelly' brain is first deformed on impact, by combining engineering principles and medical knowledge. This is providing us with new insights," said Mazdak Ghajari, an engineer who co-led the study from the Dyson School of Design Engineering at Imperial College London. The Imperial team showed in all their 3-D models that the damage created from a TBI is greatest in the depths of the folds on the surface of the brain called sulci. Previous studies on CTE have shown that tau also accumulates in sulci. In addition, the team discovered that the location and severity of the blow to the head on impact can have a significant influence on the magnitude and pattern of the injury later on when CTE develops. The researchers say further clarification of these links in future studies will be the key to analyzing the long-term effects of head impacts. This could lead to new improvements in protective strategies, including new types of helmet designs. "Current technologies for assessing helmet safety are pretty crude. Our work is still in the early stages, but we believe that it shows promise for more accurately modeling how the brain deforms in different types of impacts. Using this knowledge we could refine the design of particular areas of helmets so that they could withstand collisions associated with particular types of sports. We could also design headgear that is more able to shield motorcyclists, who are so vulnerable on the road. Ultimately, we think better protective wear may prevent long-term diseases such as CTE," added Ghajari. The team carried out their modeling by gathering data from real incidents.
In the case of the American football players, the Imperial team used data that was originally collected by Biokinetics and Associates Ltd (Canada). The data was collected from NFL games that occurred from 1996 to 2001. A total of 182 collisions were recorded in the study. The Imperial team chose a collision that they thought was reconstructed well in the lab and input this data into their model. The team reconstructed the second injury using medical records that detailed a patient's fall to a marble floor, from ground level. They reconstructed the fall using a dummy and recorded the head accelerations during impact. The accident involving a collision between a motorcyclist and a passenger car was reconstructed at the Transport Research Laboratory. Instruments were fitted inside a dummy head wearing a helmet, identical to the one worn in the accident, and the impacts were recorded. The location and velocity of the impact were adjusted to closely replicate the damage seen on the shell and lining of the helmet. The information recorded from the sensors during each reconstruction was fed into a 3-D model on a computer, created from MRI scans of a healthy 34-year-old male. The team's software enabled them to pixelate the head into one million hexahedral elements and a quarter of a million quadrilateral elements, which represented 11 types of tissues including the scalp, skull, brain and anatomical features such as the sulci. This gave them the high fidelity capacity to focus in detail on parts most damaged from the initial impact of a TBI. They then compared their models with the MRI data and post-mortem studies of American footballers with CTE, which showed mechanical forces at the time of collision are concentrated in locations of tau deposition seen in the footballers' brains with dementia. "We are very excited by the ability to link the early and late effects of head injuries. A large challenge is identifying patients at risk of dementia after head injury and our study provides a way to connect the critical events in this process. We will be working to understand how the way the brain deforms leads to brain degeneration, as this will be key to protecting against dementia," said David Sharp, co-author from the Department of Medicine at Imperial College London. The researchers plan to use the computer model to optimize the design of sporting headgears, with focus on two mainstream sports, American football and horse riding. The team will also be working with researchers from the Royal British Legion Centre for Blast Studies at Imperial, exploring the effects of blasts on brain tissue.


Figure (credit: POSTECH): a) A multilayered spherical hyperlens structure; metal and dielectric thin films are deposited on a spherically shaped substrate. b) Transmission electron microscopy (TEM) image of the cross-section of a replicated hyperlens. c, d) Tilted views of the quartz master mold and the replicated substrate. e) Scanning electron microscopy (SEM) image of the sub-diffraction-scale objects. f) Far-field optical image after the hyperlens; the small object below the diffraction limit is clearly resolved.


News Article | April 25, 2017
Site: www.chromatographytechniques.com

A method for predicting someone's 'brain age' based on MRI scans could help to spot who might be at increased risk of poor health and even dying at a younger age. By combining MRI scans with machine learning algorithms, a team of neuroscientists led by researchers at Imperial College London has trained computers to provide a predicted "brain age" for people based on their volume of brain tissue. When the technique was tested on a study population of older adults in Scotland, they found that the greater the difference between a person's brain age and their actual age, the higher their risk of poor mental and physical health, and even early death. The researchers stress that while the technique is a long way from being used in clinical practice, they are hopeful it might one day be used as a screening tool, helping to identify those at risk of cognitive decline and dying before the age of 80, providing an opportunity for early intervention. Scientists around the world are working to find reliable biomarkers that can be used to measure age, such as from blood and hair samples. In the latest study, published today in the journal Molecular Psychiatry, researchers from Imperial and the University of Edinburgh have added a neuroimaging approach to the growing gerontology toolkit. "We've come up with a way of predicting someone's brain age based on an MRI scan of their brain," explained James Cole, a research associate in the Department of Medicine, who led the study. "Our approach uses the discrepancy between their chronological age and what we call their brain-predicted age as a marker of age-related atrophy in the brain. If your brain is predicted to be older than your real age then that reflects something negative may be happening." At the heart of the approach is a technique first developed in 2010 that measures brain volume and uses machine learning to estimate the overall loss of grey and white matter—a hallmark of the ageing process in the brain. Cole took this basic technique and refined it by testing it on publicly available datasets of MRI scans of more than 2,000 healthy people's brains, resulting in normalized maps that accurately predicted the person's age. Following this fine-tuning, it was then applied to scans of 669 people from the Lothian Birth Cohort 1936, a well-studied group of adults all born in 1936 who had undergone MRI scans at age 73, giving them a score for predicted brain age. Analysis revealed that those with a brain age older than their chronological age performed worse on standard physical measures for healthy aging, including grip strength, lung capacity and walking speed. Crucially, those with "older brains" were statistically more likely to die before the age of 80, with the average discrepancy between brain age and chronological age being eight years for deceased males and two years for deceased females. If the initial findings could be applied to a screening program, the technique could be used to inform health practitioners, showing whether or not a patient had a healthy brain age or was above or below the line, similar to how body mass index (BMI) is used today. "In the long run it would be great if we could do this accurately enough so that we could do it at an individual level," said Cole. "Someone could go to their doctor, have a brain scan and the doctor could say 'your brain is 10 years older than it should be', and potentially advise them to change their diet or lifestyle or to start a course of treatment. 
However, at the moment, it's not sufficiently accurate to be used at that sort of individual level." The team is now looking to refine the technique further, incorporating different types of imaging, such as diffusion MRI scans, to improve accuracy. Currently, the high cost associated with MRI scans inhibits the technique's use as a screening tool in the near term, but large-scale projects such as the UK Biobank demonstrate the economies of scale that could help reduce the costs in future. The researchers also stress that while the technique has great potential, there is still a relatively large margin of error, with the absolute error in determining brain age across all of the MRIs found to be five years. "People use the 'age' of an organ all the time to talk about health," explained Cole. "Smokers are said to have lungs that are 20 years older than they should be, you can even answer online questionnaires about exercise and diet and get a 'heart age'. This technique could eventually be like that." "It could be that if your brain looks older than it should do, it could be an indication that something bad has happened or is happening and should put you more at risk of age-related brain disease or cognitive impairment, and the data we have so far seems to back that up, at least at the group level," added Cole.
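The pipeline described above can be reduced to a few lines: learn a mapping from volumetric brain features to chronological age in healthy training data, then read off the 'brain-age gap' (predicted minus chronological age) for new scans. Everything in the sketch below is invented for illustration: the features are random stand-ins for grey/white-matter measurements, and ridge regression stands in for the machine-learning model actually used in the study.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Stand-in volumetric features (e.g. regional grey/white-matter volumes)
# for healthy training subjects with known chronological ages.
n_subjects, n_features = 2000, 100
age = rng.uniform(18, 90, n_subjects)
X = rng.normal(size=(n_subjects, n_features))
X[:, 0] -= 0.02 * age  # pretend feature 0 shrinks with age

X_tr, X_te, age_tr, age_te = train_test_split(X, age, random_state=0)

model = Ridge(alpha=1.0).fit(X_tr, age_tr)
predicted_age = model.predict(X_te)

# Brain-age gap: positive values mean the brain "looks" older than it is.
brain_age_gap = predicted_age - age_te
print("mean absolute error (years):", np.mean(np.abs(brain_age_gap)))
```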


The team from Imperial College London has received seed funding from the Institute of Molecular Science and Engineering (IMSE) to forge ahead with a new project. The aim is to use synthetic chemistry to prepare several aza-polycyclic aromatic hydrocarbons (aza-PAHs) that are proposed to be part of the interstellar medium. The target compounds are very rare on Earth and could hold the key to understanding more about the birth of stars, and the formation of solar systems and galaxies. Professor Mark Sephton, Head of the Department of Earth Science and Engineering, along with Dr Wren Montgomery, also from the department, are teaming up with Dr Matthew Fuchter, from Imperial's Department of Chemistry. This group is one of the first, along with six other new research projects to receive funding from IMSE's proof-of-concept seed funding initiative. Colin Smith caught up with Drs Montgomery and Fuchter to learn more about aza-PAHs and what synthesising them in the lab could mean for our understanding of the universe. These consist of rings of carbon atoms together with a few nitrogen atoms. Scientists classify them as either "smaller" or "larger" depending on the number of carbon rings they contain. Where are they found? On Earth, smaller aza-PAHs (two to three rings) are pollutants associated with asphalt and tar. Out in the wider universe, larger aza-PAHs (seven rings or more) are thought to be a key part of the interstellar medium (ISM). This is the matter that exists in the space between star systems in galaxies. This matter includes gas in ionic, atomic, and molecular forms, as well as dust and cosmic rays. It fills interstellar space and blends smoothly into the surrounding intergalactic space. Scientists believe that larger aza-PAHs are important ingredients in the ISM, but it has not been previously possible to get enough pure samples of these on Earth to take measurements in a laboratory to determine if this hypothesis is correct. How will this seed funding help us to learn more about them? We are planning to create synthetic aza-PAHs in the laboratory and study them using a device called Fourier Transform Infrared Spectroscopy (FTIR), which uses light in the infrared spectrum to study molecules in detail. Currently, astronomers use infrared instruments to study the ISM. We plan to make a direct comparison between our synthesised samples and the actual ISM. This will help us to reveal the nature and distribution of the organic building blocks of the cosmos and its planetary systems. We also plan to study aza-PAHs in high-pressure environments. This will help us to understand how they are altered or possibly destroyed by the processes of stars and planets forming. If you successfully create aza-PAHs in the lab what could it tell us about the universe? Firstly, having a sample can verify the existing models developed by scientists and tell us whether or not aza-PAHs are present in the ISM. If they are present, then their behaviour under high pressure will tell us something about what happens to them when the molecular cloud condenses and forms planets. They're very scarce on Earth today, so perhaps our work can shed some light on where have they gone. What are some of the challenges of this project? This class of chemical compound will be very difficult to "manufacture" in the lab. We will be covering new ground in terms of how we work with Mark Fuchter in Imperial's Department of Chemistry. 
One of the big challenges for us will be to find a way for our two different sciences to "talk" to one another so that we can achieve our goals. It will be a very exciting and creative process. What unique qualities will you bring to this project? My group has expertise in synthetic chemistry: the ability to build more complex molecules from simple precursors. In particular, we have developed methods to construct polycyclic aromatic compounds – a key target molecular class for our research – and so have the correct background to try and construct the target molecules needed for this project. Why is this seed funding important? These target molecules have never been made in sufficient quantities to be fully characterised by scientists, so their synthesis and study would be a world first. Are there other applications for this research? Outside of the specific aims of this project, the chemical compounds should have other interesting applications. For example, they could be used in the construction of organic electronic devices. A key example of current organic electronic technology is the light-emitting diode, which is currently used in smartphone displays. My group, together with collaborators in Imperial's Department of Physics, has an ongoing research programme, which concerns the use of new condensed aromatic molecules in novel devices and materials. Therefore, this project could additionally seed other new directions of research for my collaborators and me. What are the benefits of being aligned with IMSE? One of IMSE's key aims is to foster new collaborations across all four faculties at Imperial around ambitious grand challenge projects. Through their seed funding scheme, Mark, Wren and I have established a new collaboration between departments with complementary strengths to work on an exciting, integrated new project area.


News Article | April 19, 2017
Site: news.yahoo.com

A new study has revealed that hallucinogenic drugs are capable of producing a ‘higher level of consciousness’ in humans. The study, conducted by scientists from The University of Sussex, The University of Auckland, and Imperial College London, measured the consciousness levels of humans after taking three hallucinogenic substances. The substances in question, namely LSD, ketamine and psilocybin (the active substance in magic mushrooms), were given to several participants, whose brain activity was measured using a specialised technique known as magnetoencephalography (MEG). After taking the drug, each participant was then asked to give a brief summary of their experience while under the influence – with some claiming to have found a ‘profound inner peace’. Others claimed that ‘the experience had a spiritual or mystical quality’. At the same time as the initial experiment, some participants had been handed a placebo drug – which acted as the baseline against which the researchers measured levels of consciousness. After comparing the brain activity of the participants who took the drugs against those who did not, scientists noted instances of ‘increased signal diversity’ across all the participants who had taken the drugs. Specifically, 100 percent of all participants who took ketamine had a stronger state of consciousness than those who did not – along with similarly high scores of 86 and 93 percent respectively for psilocybin and LSD. The study claimed: ‘We have demonstrated, for the first time, that measures of neural signal diversity that are known to be sensitive to conscious level, are also sensitive to the changes in brain dynamics associated with the psychedelic state. ‘We found that the psychedelic state induces increased brain-wide signal diversity as compared to placebo, across a range of measures and three different psychedelic compounds.’


News Article | April 20, 2017
Site: news.yahoo.com

Scientists have, for the first time, uncovered evidence of a "higher" state of consciousness — one induced by psychedelics like LSD and psilocybin. In the 1960s, psychedelics like LSD and psilocybin (magic mushrooms) were often touted as "enlightenment" drugs that could awaken its users to a higher level of consciousness. Now, on the 74th anniversary of the world's first acid trip, scientists have confirmed, to an extent, what hippies have long claimed — psychoactive drugs do induce an "elevated" level of consciousness, as measured by a metric known as neural signal diversity. Neural signal diversity, which is a measure of the complexity of brain activity, provides a mathematical index of the level of someone’s consciousness. Although scientists have previously shown that people who are awake have more diverse neural activity than those who are asleep or in a vegetative state, this is the first time a measurement of signal diversity higher than the baseline has been observed. "This finding shows that the brain-on-psychedelics behaves very differently from normal. During the psychedelic state, the electrical activity of the brain is less predictable and less ‘integrated’ than during normal conscious wakefulness – as measured by 'global signal diversity,'" Anil Seth from the University of Sussex, and the co-author of a study detailing the findings, said in a statement. "Since this measure has already shown its value as a measure of ‘conscious level’, we can say that the psychedelic state appears as a higher ‘level’ of consciousness than normal – but only with respect to this specific mathematical measure." For the purpose of their study, the researchers focused on three drugs — lysergic acid diethylamide (LSD), ketamine and psilocybin. LSD was administered to 15 healthy subjects, ketamine to 19 and psilocybin to 14. Even after accounting for placebo effects, the researchers found evidence of more random brain activity among the participants while under the influence — activity that was associated with thoughts and sensations like "my perception of time was distorted," "a sense of merging with my surroundings," and "the experience had a spiritual or mystical quality." However, it should be noted that the neural activity did not, in any way, indicate that the psychedelic state is a better or more desirable state of consciousness, only that it is something distinct and worth studying further using more sophisticated and varied techniques. "People tend to associate phrases like ‘a higher state of consciousness’ with hippy speak and mystical nonsense. This is potentially the beginning of the demystification, showing its physiological and biological underpinnings," co-author Robin Carhart-Harris from Imperial College London told the Guardian. "Maybe this is a neural signature of the mind opening." In addition, the research could also open the doors to more extensive research on how drugs like psilocybin can be used to cure depression and anxiety — something that previous studies have hinted at — and what the neural basis of these elevated states of consciousness is. "Rigorous research into psychedelics is gaining increasing attention, not least because of the therapeutic potential that these drugs may have when used sensibly and under medical supervision," Carhart-Harris said in the statement. "The present study’s findings help us understand what happens in people’s brains when they experience an expansion of their consciousness under psychedelics. 
People often say they experience insight under these drugs – and when this occurs in a therapeutic context, it can predict positive outcomes."
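Signal diversity measures of this kind are commonly computed from the Lempel-Ziv complexity of binarized MEG recordings, normalized against surrogate data; the sketch below assumes that formulation. It is a minimal single-channel illustration, and the LZ78-style parsing and mean-split binarization are simplifications of the published pipeline, so treat it purely as an illustration of the idea.

```python
import numpy as np

def binarize(signal):
    """Binarize a signal around its mean, returning a string of '0'/'1'."""
    return "".join("1" if x > np.mean(signal) else "0" for x in signal)

def lz_phrase_count(sequence):
    """Count distinct phrases in a simple left-to-right Lempel-Ziv parsing
    (an LZ78-style stand-in for the LZ76 measure used in the literature)."""
    phrases, current = set(), ""
    for symbol in sequence:
        current += symbol
        if current not in phrases:
            phrases.add(current)
            current = ""
    return len(phrases) + (1 if current else 0)

def signal_diversity(signal, rng):
    """Lempel-Ziv phrase count normalized by that of a shuffled surrogate,
    so values near 1 indicate maximally diverse (noise-like) activity."""
    s = binarize(signal)
    shuffled = "".join(rng.permutation(list(s)))
    return lz_phrase_count(s) / lz_phrase_count(shuffled)

rng = np.random.default_rng(0)
regular_signal = np.sin(np.linspace(0, 40 * np.pi, 5000))      # highly predictable
noisy_signal = regular_signal + rng.normal(0, 1.0, 5000)       # more diverse
print(signal_diversity(regular_signal, rng), signal_diversity(noisy_signal, rng))
```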


News Article | May 2, 2017
Site: www.bbc.co.uk

The fossil of a dinosaur that has been languishing in a museum for decades has been re-examined - and it turns out to be that of a new species. Brachiosaurus, depicted in Jurassic Park, now has an early relative, providing clues to the evolution of some of the biggest creatures on Earth. Scientists say the plant-eating dinosaur was longer than a double-decker bus and weighed 15,000 kg. Its remains were found in the 1930s in the Jura region of France. Since then it has been somewhat over-looked, spending most of that time in storage crates in the National Museum of Natural History in Paris. Lead researcher Dr Philip Mannion of Imperial College London said the dinosaur would have eaten all kinds of vegetation, such as ferns and conifers, and lived at a time when Europe was a series of islands. ''We don't know what this creature died from, but millions of years later it is providing important evidence to help us understand in more detail the evolution of brachiosaurid sauropods and a much bigger group of dinosaurs that they belonged to, called titanosauriforms,'' he said. Titanosauriforms were some of the largest creatures ever to have lived on land and were very diverse, surviving right up until the asteroid strike that wiped out most life on Earth. The new species, given the scientific name Vouivria damparisensis, lived in the Late Jurassic, some 160 million years ago. "It's the earliest member of a group that includes Brachiosaurus - one of the most famous dinosaurs we know - one of the prominent animals in Jurassic Park," Dr Mannion told BBC News. "And it gives us a much clearer idea of what's going on in the early evolution of this really important radiation of dinosaurs." The dinosaur is a sauropod - the broader group to which titanosauriforms belong - which includes well-known dinosaurs such as Brachiosaurus, Diplodocus and Brontosaurus. Sauropods had very long necks, long tails, and small heads with thick, pillar-like legs. The fossil predates the previously oldest-known member of this group by about five million years. "It starts to give us an idea that these animals were evolving much earlier than the fossil record previously has indicated," Dr Mannion added. "This pushes back a lot of origin times for a range of sauropod dinosaurs based on our understanding of how these different species related to one another." The re-classification of Vouivria as an early member of the titanosauriforms will help in mapping their spread across the Earth, from Jurassic times to the extinction of all dinosaurs. It is thought that they were present across Europe, the US and Africa, but became extinct in Europe towards the end of their reign. The fossil was discovered in the village of Damparis in the Jura region of eastern France in 1934. It was documented scientifically in the 1940s, but has not been studied in detail since then. Its scientific name, Vouivria damparisensis, relates to 'La vouivre', a local folklore legend about a winged serpent. Dr Mannion examined the bones of the creature along with scientists at the National Museum of Natural History in Paris and the CNRS/Université Paris 1 Panthéon-Sorbonne. The research is published in the journal PeerJ.


News Article | April 25, 2017
Site: www.gizmag.com

While your chronological age will tell you how long you've been riding around on this planet, your biological age might actually be a more important number, as it tells you how well your body is faring on its journey. Now, researchers at Imperial College London have added to the growing toolbox of biological age determinants by examining images of our brains. About two years ago, researchers developed a test that could determine your biological age by searching for 150 active genes in the blood. That's also about the time that researchers came up with a different way of determining biological age by examining urine samples. Now, the IC London researchers believe they have found yet another way to determine how well we're aging by combining MRI scans of the brain with machine-learning algorithms. If their method is perfected and verified, finding out our biological age might be as simple as getting a picture snapped of our noggins. The technique at the heart of the new method was initially developed in 2010 and involves using machine learning to analyze measurements of brain volume to estimate the total loss of grey and white matter, a process that naturally occurs as we age. The twist on the process, which has been described today in the journal Molecular Psychiatry, involves comparing the reduction in brain volume seen in MRIs to a standard developed by examining the brains of 2,000 healthy people. In other words, the 2,000 healthy MRIs were used to create a map that shows what a healthy brain should look like. By comparing new scans to this data, researchers are able to see how much brain matter has been degraded and come up with a biological age for the patient. In verifying the technique, the researchers used data from a group known as the Lothian Birth Cohort 1936, a well-observed collection of individuals who all had MRI scans of their brains at age 73. They found that if the scans revealed such a degradation of grey and white matter that the brain appeared older than the person's chronological age, those people had worse results on standardized tests for aging such as lung capacity, grip strength and gait speed. Also, if the brain appeared older than the participant's chronological age, he or she was statistically more likely to die before reaching 80 years of age. The technique currently has an error value of five years, so it will need to be refined further before it could become a valid measurement of biological age, but the researchers are hopeful that they'll get there, perhaps even using brain age as a health indicator in much the way body mass index (BMI) is used today. "In the long run it would be great if we could do this accurately enough so that we could do it at an individual level," said James Cole, an IC London research associate in the Department of Medicine, who led the study. "Someone could go to their doctor, have a brain scan and the doctor could say 'your brain is 10 years older than it should be,' and potentially advise them to change their diet or lifestyle or to start a course of treatment. However, at the moment, it's not sufficiently accurate to be used at that sort of individual level."
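The recipe behind the scans is simple to sketch: fit a regression model that maps brain-volume measurements to age in a healthy reference sample, then report the gap between a new scan's predicted age and the person's chronological age. The sketch below uses ridge regression on two invented volume features purely as a stand-in for the study's machine-learning model; the feature names, cohort and numbers are illustrative, not the paper's data.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(42)

# Hypothetical healthy reference set: grey- and white-matter volumes (arbitrary
# units) that shrink with age, plus individual variation.
n = 2000
age = rng.uniform(20, 90, n)
grey = 800 - 2.0 * age + rng.normal(0, 30, n)
white = 700 - 1.5 * age + rng.normal(0, 30, n)
X = np.column_stack([grey, white])

model = Ridge(alpha=1.0)

# Cross-validated predictions give an honest error estimate on the reference set.
predicted = cross_val_predict(model, X, age, cv=5)
print("mean absolute error (years):", round(float(np.mean(np.abs(predicted - age))), 1))

# For a new scan, the "brain-age gap" is predicted age minus chronological age;
# a large positive gap flags a brain that looks older than it should.
model.fit(X, age)
new_scan = np.array([[610.0, 570.0]])   # hypothetical volumes for a 73-year-old
gap = float(model.predict(new_scan)[0]) - 73
print("brain-age gap (years):", round(gap, 1))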


News Article | April 24, 2017
Site: www.biosciencetechnology.com

Three African countries have been chosen to test the world's first malaria vaccine, the World Health Organization announced Monday. Ghana, Kenya and Malawi will begin piloting the injectable vaccine next year with hundreds of thousands of young children, who have been at highest risk of death. The vaccine, which has partial effectiveness, has the potential to save tens of thousands of lives if used with existing measures, the WHO regional director for Africa, Dr. Matshidiso Moeti, said in a statement. The challenge is whether impoverished countries can deliver the required four doses of the vaccine for each child. Malaria remains one of the world's most stubborn health challenges, infecting more than 200 million people every year and killing about half a million, most of them children in Africa. Bed netting and insecticides are the chief protection. Sub-Saharan Africa is hardest hit by the disease, with about 90 percent of the world's cases in 2015. Malaria spreads when a mosquito bites someone already infected, sucks up blood and parasites, and then bites another person. A global effort to counter malaria has led to a 62 percent cut in deaths between 2000 and 2015, WHO said. But the U.N. agency has said in the past that such estimates are based mostly on modeling and that data is so bad for 31 countries in Africa - including those believed to have the worst outbreaks - that it couldn't tell if cases have been rising or falling in the last 15 years. The vaccine will be tested on children five to 17 months old to see whether its protective effects shown so far in clinical trials can hold up under real-life conditions. At least 120,000 children in each of the three countries will receive the vaccine, which has taken decades of work and hundreds of millions of dollars to develop. Kenya, Ghana and Malawi were chosen for the vaccine pilot because all have strong prevention and vaccination programs but continue to have high numbers of malaria cases, WHO said. The countries will deliver the vaccine through their existing vaccination programs. WHO is hoping to wipe out malaria by 2040 despite increasing resistance problems to both drugs and insecticides used to kill mosquitoes. "The slow progress in this field is astonishing, given that malaria has been around for millennia and has been a major force for human evolutionary selection, shaping the genetic profiles of African populations," Kathryn Maitland, professor of tropical pediatric infectious diseases at Imperial College London, wrote in The New England Journal of Medicine in December. "Contrast this pace of change with our progress in the treatment of HIV, a disease a little more than three decades old." The malaria vaccine has been developed by pharmaceutical company GlaxoSmithKline, and the $49 million for the first phase of the pilot is being funded by the global vaccine alliance GAVI, UNITAID and Global Fund to Fight AIDS, Tuberculosis and Malaria. Southeast Asia, Latin America and the Middle East also have malaria cases.


News Article | April 20, 2017
Site: www.bbc.co.uk

Urgent action is needed to protect wild salamanders in Europe from a deadly infection, say scientists. The disease may end up wiping out all vulnerable species, with zoos and gene banks the only conservation option, they warn. A fungal infection introduced to northern Europe several years ago behaves as a "perfect storm", say experts. It persists in the environment and may be spread by newts and birds. The fungus, known as B. salamandrivorans, or Bsal, killed almost all fire salamanders in an outbreak in The Netherlands in 2014. Since then, there have been outbreaks in wild salamanders and newts in Belgium and Germany. Researchers led by An Martel of Ghent University in Belgium are calling for urgent monitoring across Europe. However, they say that there are few options to prevent the disease spreading in the wild, meaning conservation efforts should focus on zoos, captive breeding and gene banks. Commenting on the study, published in the journal, Nature, Matthew Fisher of Imperial College London said the fungus was not unlike the "perfect pathogen" portrayed in the science-fiction film, Alien. "More must be done to try to conserve fire salamanders and other susceptible amphibian species that have restricted ranges and are under direct threat of extinction from Bsal," he said. "It is currently unclear how Bsal can be combated in the wild beyond establishing 'amphibian arks' to safeguard susceptible species as the infection marches relentlessly onwards." The fire salamander (Salamandra salamandra) is one of the best-known salamander species in Europe. Scientists expect local extinctions to occur, but say it will take a long time for the infection to reach populations in southern Europe, such as those in Spain and Portugal. Prof Fisher said the real danger is for species of salamander that have very restricted ranges. Some, such as Lanza's alpine salamander and the golden-striped salamander, are on the European Red List of amphibians. "If Bsal reaches these species, they could go rapidly extinct," Prof Fisher told BBC News. Great crested newts are very susceptible to Bsal, he said. So far, the infection has not emerged in the natural environment in the UK, although it is present in captive populations. "It is imperative that Bsal is not introduced to the UK natural environment as that could lead to declines or even extinctions in conserved UK species - primarily great crested newts," said Prof Fisher.


News Article | May 2, 2017
Site: www.chromatographytechniques.com

Scientists have re-examined an overlooked museum fossil and discovered that it is the earliest member of the titanosauriform family of dinosaurs. The fossil, which the researchers from Imperial College London and their colleagues in Europe have named Vouivria damparisensis, has been identified as a brachiosaurid sauropod dinosaur. The researchers suggest Vouivria is around 160 million years old, making it the earliest known fossil from the titanosauriform family of dinosaurs, which includes better-known dinosaurs such as the Brachiosaurus. When the fossil was first discovered in France in the 1930s, its species was not identified, and until now it has largely been ignored in scientific literature. The new analysis of the fossil indicates that Vouivria died at an early age, weighed around 15,000 kilograms and was over 15 meters long, which is roughly 1.5 times the size of a double-decker bus in the UK. It had a long neck held at around a 45 degree angle, a long tail, and four legs of equal length. It would have been a plant eater. "Vouivria would have been a herbivore, eating all kinds of vegetation, such as ferns and conifers. This creature lived in the Late Jurassic, around 160 million years ago, at a time when Europe was a series of islands. We don't know what this creature died from, but millions of years later it is providing important evidence to help us understand in more detail the evolution of brachiosaurid sauropods and a much bigger group of dinosaurs that they belonged to, called titanosauriforms," said Philip Mannion, the lead author of the study from the Department of Earth Science and Engineering at Imperial College London. The find is helping scientists to learn more about titanosauriforms. Titanosauriforms were a diverse group of sauropod dinosaurs and some of the largest creatures to have ever lived on land. They lived from at least the Late Jurassic, right to the end-Cretaceous mass extinction, when an asteroid wiped out most life on Earth. A lack of fossil records means that it has been difficult for scientists to understand the early evolution of titanosauriforms and how they spread out across the planet. The re-classification of Vouivria as an early titanosauriform will help scientists to understand the spread of these creatures during the Early Cretaceous period, a later period of time, after the Jurassic, around 145-100 million years ago. The team's incorporation of Vouivria into a revised analysis of sauropod evolutionary relationships shows that by the Early Cretaceous period, brachiosaurids were restricted to what is now Africa and the USA, and were probably extinct in Europe. Previously, scientists had suggested the presence of another brachiosaurid sauropod dinosaur called Padillasaurus much further afield in what is now South America, in the Early Cretaceous. However, the team's incorporation of Vouivria into the fossil timeline suggests that Padillasaurus was not a brachiosaurid, and that this group did not spread as far as South America. The Vouivria fossil was originally discovered by palaeontologists in the village of Damparis, in the Jura Department of eastern France, in 1934. Ever since, it has been stored in the Museum National d'Histoire Naturelle, Paris. It was only briefly mentioned by scientists in studies in the 1930s and 1940s, but it was never recognized as a distinct species. It has largely been ignored in the literature, where it has often been referred to simply as the Damparis dinosaur. 
Now, a deeper analysis of the fossil is also helping the scientists in today's study to understand the environment Vouivria would have been in when it died, which was debated when it was initially found. The researchers believe Vouivria died in a coastal lagoon environment, during a brief sea level decline in Europe, before being buried when sea levels increased once more. When the fossil was first discovered, in rocks that would have originally come from a coastal environment, researchers suggested that its carcass had been washed out to sea, because sauropods were animals that lived on land. The team's examination of Vouivria, coupled with an analysis of the rocks it was encased in, provides strong evidence that this was not the case. The genus name of Vouivria is derived from the old French word 'vouivre,' itself from the Latin 'vipera,' meaning 'viper.' In Franche-Comté, the region in which the specimen was originally discovered, 'la vouivre' is a legendary winged reptile. The species name damparisensis refers to the village of Damparis, where the fossil was originally found. The research was carried out in conjunction with the Museum National d'Histoire Naturelle and the CNRS/Université Paris 1 Panthéon-Sorbonne, with funding from the European Union's Synthesys programme. Currently, titanosauriforms from the Late Cretaceous are poorly understood compared to their relatives in the Late Jurassic. The next step for the researchers will see them expanding on their analysis of the evolutionary relationships of all species in the titanosauriform group. The team are also aiming to find more sauropod remains from older rocks to determine in more detail how they spread across the continents.


News Article | April 4, 2017
Site: www.techtimes.com

The first Brexit happened hundreds of thousands of years before the United Kingdom voted to politically leave the European Union and this happened when Britain geologically broke off from the rest of Europe. Ancient Britain's separation from the rest of Europe occurred in a two-stage flooding event that destroyed the thin strip of land that used to connect it to France. Researchers said that a large lake likely overflowed 450,000 years ago, about the same time when the Neanderthals are believed to have first appeared in Europe. The second event, which happened 160,000 years ago, involved a catastrophic flood that opened the Dover Strait in the English Channel, which now separates Britain from the rest of Europe. Reporting in a study published in the journal Nature Communications on Tuesday, researchers said that they have now found evidence that could explain how the opening of the Dover Strait that severed the land between Britain and France happened. Study researcher Jenny Collier, from Imperial College London, said that evidence suggests that 450,000 years ago, Dover Strait may have been a huge rock ridge of chalk that looked more like Siberia's frozen tundra than the green environment it is today. "It would have been a cold world dotted with waterfalls plunging over the iconic white chalk escarpment that we see today in the White Cliffs of Dover," Collier said. About 10 years ago, researchers found geophysical proof of giant valleys on the seafloor in the English Channel, which they thought were evidence of a large flood gouging out the land that may have been caused by a breach in the chalk rock ridge that once joined Britain to France. Now geophysical data from France and Belgium and seafloor data from Britain showed huge holes and a valley system on the seafloor, helping researchers establish how this breach of the chalk ridge occurred. The ridge served as a huge dam with a proglacial lake behind it. Once the lake overflowed, its waters started to cascade like waterfalls over the Dover-Calais land bridge. Laden with abrasive flints freed from the chalk, the waterfall carved holes in the bedrock beneath, eroding and weakening the land bridge. A section then gave way. In a cataclysmic flood, huge amounts of water were eventually released as the glacial lake poured itself into the English Channel. "Sub-bottom records reveal a remarkable set of sediment-infilled depressions that are deeply incised into bedrock that we interpret as giant plunge pools. These support a model of initial erosion of the Dover Strait by lake overspill, plunge pool erosion by waterfalls and subsequent dam breaching," researchers wrote in their study. Collier and colleagues said it is not yet clear how the proglacial lake spilled over, but they have theories. "Perhaps part of the ice sheet broke off, collapsing into the lake, causing a surge that carved a path for the water to cascade off the chalk ridge," Collier said, adding that an earthquake may have contributed to more weakening of the ridge, causing it to collapse.


News Article | April 20, 2017
Site: www.sciencemag.org

A new World Health Organization (WHO) report chops the estimated number of people around the world living with the liver-damaging hepatitis C virus (HCV) in half—but the drop has nothing to do with the recent advent of powerful drugs that cure the disease for most everyone. WHO’s Global Hepatitis Report estimates that 71 million people in 2015 were living with HCV, down from an earlier estimate of 130 million to 150 million. As the report explains, the dramatic drop occurred primarily because of tests that measured HCV’s genetic material, RNA, in people. Previous epidemiological surveys tested whether people had antibodies against the virus, which is less precise. The report estimates that 257 million people are infected with hepatitis B virus (HBV), a number very close to previous estimates. Although HBV and HCV are unrelated, they both persist for decades, often without a person’s knowledge, and both can ultimately cause cirrhosis or liver cancer. Together, the viruses killed 1.34 million people in 2015, which the report notes is comparable to deaths from tuberculosis and higher than those from HIV/AIDS. The previous focus on HCV antibodies, or “seroprevalence,” to determine the number of infected people is confusing in two ways, says Graham Cooke, a hepatitis specialist at Imperial College London. Cooke recently conducted a meta-analysis of studies in sub-Saharan Africa and found that only 51% of people deemed positive on antibody tests had evidence of viral RNA. As much as one-third of this discrepancy is because some people spontaneously clear the virus, he says, although the antibodies linger. Another factor is the antibody test itself, which Cooke says often was imprecise in the past. The new focus on virus rather than antibody meshes with the global push to cure people with the new generation of drugs. Declaring someone cured requires demonstrating that the virus is undetectable on standard blood tests for 12 weeks after treatment stops. “We’re at a moment where we’re pivoting from describing to doing,” Cooke says. “When you’re describing you do seroprevalence. But you don’t need to cure everyone who has antibody.” WHO hepatologist Yvan Hutin in Geneva, Switzerland, the lead writer of the report, notes that a widely discussed paper on the global epidemiology of HCV published in 2014 had vastly reduced estimates. But he says WHO did not immediately adopt the lower numbers because it has a very small staff working on the global hepatitis program and wanted to confirm the figures with further studies. Until 2011, the standard drugs for HCV infection did not directly attack the virus, were highly toxic, required 48 weeks of treatment, and failed in up to 40% of people. But a slew of drugs that cripple HCV have since come to market that require just 12 weeks of treatment, have few side effects, and cure most everyone. The drugs have notoriously steep price tags in developed countries—about $45,000 for a cure—but companies allow poorer countries to use generics that cost as little as $200 for a full course of treatment. In 2015, only 1.1 million HCV-infected people started treatment. The diagnostics used to detect antibodies are far simpler to use than the ones that measure virus, which remain sparsely available in many hard-hit countries. The WHO report estimates that only 14 million people know they have active HCV infections. “The current situation is there may be more of an issue in terms of diagnostic tests than in terms of treatment,” Hutin says. 
HCV primarily is transmitted when people share needles for drug injection and through improperly sterilized medical equipment; HBV mainly is transmitted from infected mothers to their babies, and, for unclear reasons, between children. WHO estimates that in 2015, 1.75 million people became infected with HCV. The report does not give an estimate of the number of new HBV infections. But it notes that the global new infection rate of HBV in children under 5 years old dropped from 4.7% before there was a widely used vaccine (the first one came to market in 1981) to 1.3% in 2015. Whereas HCV is easier to treat, HBV is easier to prevent. The HBV vaccine is highly effective, and 84% of infants received it in 2015. An HCV vaccine doesn’t exist yet. HBV can be treated, but it can’t be cured. “We are happy but we are not complacent” about the HBV vaccine coverage, says Ana Maria Henao-Restrepo, a WHO immunization specialist in Geneva. In particular, she says that although 84% of infants worldwide are now getting the recommended three vaccine doses, only 39% are getting their first shot within 24 hours of delivery, as is recommended. “The birth dose is really critical because most mother-to-child transmission occurs within days of birth,” Henao-Restrepo says. “We need to ensure that all countries, not only half of them, offer this lifesaving vaccine at birth.” Babies whose mothers have actively replicating HBV at the time of delivery are at risk of infection even if they get the birth dose. Giving pregnant women who are infected with HBV antivirals cuts the chance of infection, according to the report. Overall, Hutin says such interventions likely mean that “soon, hepatitis B in children will be history.”
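The size of the downward revision follows from straightforward arithmetic: scale an antibody-based ("seroprevalence") figure by the fraction of antibody-positive people who actually carry viral RNA. The back-of-envelope sketch below applies Cooke's sub-Saharan figure of 51% to the earlier global antibody range purely for illustration; the numbers are ours, not WHO's, and the report's own estimate combines many region-specific surveys.

# Illustrative only: antibody-based range from earlier estimates, scaled by an
# assumed RNA-positive fraction among antibody-positive people.
antibody_low, antibody_high = 130e6, 150e6
rna_positive_fraction = 0.51

viraemic_low = antibody_low * rna_positive_fraction
viraemic_high = antibody_high * rna_positive_fraction
print(f"people with active HCV infection: {viraemic_low / 1e6:.0f}-{viraemic_high / 1e6:.0f} million")
# Roughly 66-77 million, in the same range as the report's revised figure of 71 million.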


News Article | April 17, 2017
Site: www.newscientist.com

Computer vision is ready for its next big test: seeing in 3D. The ImageNet Challenge, which has boosted the development of image-recognition algorithms, will be replaced by a new competition next year that aims to help robots see the world in all its depth. Since 2010, researchers have trained image recognition algorithms on the ImageNet database, a go-to set of more than 14 million images hand-labelled with information about the objects they depict. The algorithms learn to classify the objects in the photos into different categories, such as house, steak or Alsatian. Almost all computer vision systems are trained like this before being fine-tuned on a more specific set of images for different tasks. Every year, participants in the ImageNet Large Scale Visual Recognition Challenge try to code algorithms that can categorise these images with as few errors as possible. Seven years ago, this was a difficult task, but now computer vision is great at categorising images. In 2015, a team from Microsoft built a system that was over 95 per cent accurate, surpassing human performance for the first time in the challenge’s history. And photo apps from Google and Apple allow people to search their photo collections using terms like food or baby. Google Photos even classifies images by abstract concepts like “happiness”. “When we were starting the project, these were not things that industry had done yet,” says Alex Berg at the University of North Carolina at Chapel Hill, who is one of the competition’s organisers. “Now they are products that millions of people are using.” So the ImageNet team say it’s time for a fresh challenge in 2018. Although the details of this competition have yet to be decided, it will tackle a problem computer vision has yet to master: making systems that can classify objects in the real world, not just in 2D images, and describe them using natural language. “There is very little work on putting a 3D scene through a machine-learning algorithm,” says Victor Prisacariu at the University of Oxford. Building a large database of images complete with 3D information would allow robots to be trained to recognise objects around them and map out the best route to get somewhere. This database would largely comprise images of scenes inside homes and other buildings. The existing ImageNet database consists of images collected from across the internet and then labelled by hand, but these lack the depth information needed to understand a 3D scene. The database for the new competition could consist of digital models that simulate real-world environments or 360-degree photos that include depth information, says Berg. But first someone must make these images. As this is difficult and costly, the data set is likely to be a lot smaller than the one for the original challenge. Robot vision is ready for its ImageNet moment, says Andrew Davison at Imperial College London. He is already working on the next generation of in-home robots that will take over from devices such as the floor-cleaning Roomba. These will need to know how to deal with objects and manipulate the world around them, he says. “I really think you need this detailed 3D understanding both of the shape of the world, but also a semantic understanding of what’s in it,” he says. The new challenge will also assist augmented and virtual reality, says Davison. Knowing where objects are in the real world will help augmented reality systems like the Microsoft HoloLens depict virtual objects within it. 
“It’s very much the same capability,” he says. Berg isn’t expecting major progress in the first couple of years of the new challenge, but he has an idea of what success might look like. Eventually, he would like to see robots that can consistently understand the environment around them and explain what they see just as well as a human can. However, achieving either of these things is more than five years away, he says.
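For readers who want to see what training on ImageNet buys in practice, the short sketch below runs a stock pretrained network over a single photo and prints its five most likely categories out of the 1,000 ImageNet classes. It uses standard torchvision components; the image file name is a placeholder and the snippet is an illustration, not anything from the challenge organisers.

import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet preprocessing: resize, crop and normalise the image.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(pretrained=True)   # weights learned on ImageNet; newer torchvision uses weights=...
model.eval()

image = Image.open("photo.jpg").convert("RGB")   # placeholder path
batch = preprocess(image).unsqueeze(0)           # shape: (1, 3, 224, 224)

with torch.no_grad():
    probabilities = torch.softmax(model(batch)[0], dim=0)

top5 = torch.topk(probabilities, 5)
for prob, class_index in zip(top5.values, top5.indices):
    # class_index maps to one of the 1,000 ImageNet categories (house, steak, Alsatian, ...)
    print(int(class_index), float(prob))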


News Article | May 3, 2017
Site: www.eurekalert.org

A new generation of prosthetic limbs which will allow the wearer to reach for objects automatically, without thinking -- just like a real hand -- is to be trialled for the first time. Led by biomedical engineers at Newcastle University, UK, and funded by the Engineering and Physical Sciences Research Council (EPSRC), the bionic hand is fitted with a camera which instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand. Bypassing the usual processes which require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb, the hand 'sees' and reacts in one fluid movement. A small number of amputees have already trialled the new technology and now the Newcastle University team are working with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the 'hands with eyes' to patients at Newcastle's Freeman Hospital. Publishing their findings today in the Journal of Neural Engineering, co-author on the study Dr Kianoush Nazarpour, a Senior Lecturer in Biomedical Engineering at Newcastle University, explains: "Prosthetic limbs have changed very little in the past 100 years -- the design is much better and the materials are lighter and more durable but they still work in the same way. "Using computer vision, we have developed a bionic hand which can respond automatically -- in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction. "Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison. "Now, for the first time in a century, we have developed an 'intuitive' hand that can react without thinking." Recent statistics show that in the UK there are around 600 new upper-limb amputees every year, of which 50% are in the age range of 15-54 years old. In the US there are 500,000 upper limb amputees a year. Current prosthetic hands are controlled via myoelectric signals - that is, electrical activity of the muscles recorded from the skin surface of the stump. Controlling them, says Dr Nazarpour, takes practice, concentration and, crucially, time. Using neural networks -- the basis for Artificial Intelligence -- lead author on the study Ghazal Ghazaei showed the computer numerous object images and taught it to recognise the 'grip' needed for different objects. "We would show the computer a picture of, for example, a stick," explains Miss Ghazaei, who carried out the work as part of her PhD in the School of Electrical and Electronic Engineering at Newcastle University. "But not just one picture, many images of the same stick from different angles and orientations, even in different light and against different backgrounds and eventually the computer learns what grasp it needs to pick that stick up. "So the computer isn't just matching an image, it's learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up. "It is this which enables it to accurately assess and pick up an object which it has never seen before -- a huge step forward in the development of bionic limbs." 
Grouping objects by size, shape and orientation, according to the type of grasp that would be needed to pick them up, the team programmed the hand to perform four different 'grasps': palm wrist neutral (such as when you pick up a cup); palm wrist pronated (such as picking up the TV remote); tripod (thumb and two fingers) and pinch (thumb and first finger). Using a 99p camera fitted to the prosthesis, the hand 'sees' an object, picks the most appropriate grasp and sends a signal to the hand -- all within a matter of milliseconds and ten times faster than any other limb currently on the market. "One way would have been to create a photo database of every single object but clearly that would be a massive task and you would literally need every make of pen, toothbrush, shape of cup -- the list is endless," says Dr Nazarpour. "The beauty of this system is that it's much more flexible and the hand is able to pick up novel objects -- which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before." The work is part of a larger research project to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain. Led by Newcastle University and involving experts from the universities of Leeds, Essex, Keele, Southampton and Imperial College London, the aim is to develop novel electronic devices that connect to the forearm neural networks to allow two-way communications with the brain. Reminiscent of Luke Skywalker's artificial hand, the electrodes in the bionic limb would wrap around the nerve endings in the arm. This would mean for the first time the brain could communicate directly with the prosthesis. The 'hand that sees', explains Dr Nazarpour, is an interim solution that will bridge the gap between current designs and the future. "It's a stepping stone towards our ultimate goal," he says. "But importantly, it's cheap and it can be implemented soon because it doesn't require new prosthetics -- we can just adapt the ones we have." Anne Ewing, Advanced Occupational Therapist at Newcastle upon Tyne Hospitals NHS Foundation Trust, has been working with Dr Nazarpour and his team. "I work with upper limb amputee patients which is extremely rewarding, varied and at times challenging," she said. "We always strive to put the patient at the heart of everything we do and so make sure that any interventions are client centred to ensure patients' individual goals are met either with a prosthesis or alternative method of carrying out a task. "This project in collaboration with Newcastle University has provided an exciting opportunity to help shape the future of upper limb prosthetics, working towards achieving patients' prosthetic expectations and it is wonderful to have been involved." "For me it was literally a case of life or limb," says Doug McIntosh, who lost his right arm in 1997 through cancer. "I had developed a rare form of cancer called epithelial sarcoma, which develops in the deep tissue under the skin, and the doctors had no choice but to amputate the limb to save my life. "Losing an arm and battling cancer with three young children was life changing. I left my job as a life support supervisor in the diving industry and spent a year fund-raising for cancer charities. "It was this and my family that motivated me and got me through the hardest times." Since then, Doug has gone on to be an inspiration to amputees around the world. 
He became the first amputee to cycle from John O'Groats to Land's End in 100 hours and to cycle around the coastline of Britain; he has run three London Marathons, cycled the Dallaglio Flintoff Cycle Slam in 2012 and 2014, and in 2014 cycled with the British Lions Rugby Team to Murrayfield Rugby Stadium for the "Walking with Wounded" charity. He is currently preparing to do Mont Ventoux this September, three cycle climbs in one day for Cancer Research UK and Maggie's Cancer Centres. Involved in the early trials of the first myoelectric prosthetic limbs, Doug has been working with the Newcastle team to trial the new hand that sees. "The problem is there's nothing yet that really comes close to feeling like the real thing," explains the father-of-three who lives in Westhill, Aberdeen with his wife of 32 years, Diane. "Some of the prosthetics look very realistic but they feel slow and clumsy when you have a working hand to compare them to. "In the end I found it easier just to do without and learn to adapt. When I do use a prosthesis I use a split hook which doesn't look pretty but does the job." But he says the new, responsive hand being developed in Newcastle is a 'huge leap forward'. "This offers for the first time a real alternative for upper limb amputees," he says. "For me, one of the ways of dealing with the loss of my hand was to be very open about it and answer people's questions. But not everyone wants that and so to have the option of a hand that not only looks realistic but also works like a real hand would be an amazing breakthrough and transform the recovery time -- both physically and mentally -- for many amputees."


News Article | April 19, 2017
Site: www.sciencemag.org

Europe's largest and best known salamander species, the fire salamander, is falling victim to a deadly fungus, and new research is making scientists more pessimistic about its future. A 2-year study of a population in Belgium, now entirely wiped out, has revealed that these amphibians can't develop immunity to the fungus, as was hoped. To make matters worse, it turns out the fungus creates a hardy spore that can survive in water for months and also stick to birds' feet, offering a way for it to spread rapidly across the continent. Two other kinds of amphibians, both resistant to the disease, also act as carriers for the highly infectious spores. "This is terrible news," says geneticist Matthew Fisher of Imperial College London, who studies the fungus but was not involved in the new research. "This isn't a problem that's going to go away. It's a problem that's going to get worse." The pathogen, Batrachochytrium salamandrivorans (Bsal), is a chytrid fungus, a type that lives in damp or wet environments and typically consumes dead organic matter. Bsal infects and eats the skin of salamanders, causing lesions, apathy, loss of appetite, and eventually death. Over the past few decades, a related fungus, B. dendrobatidis (Bd), has struck hard at amphibian populations around the world, particularly in the Americas, Australia, Spain, and Portugal. More than 200 species of frogs and toads are thought to have gone extinct, including many kinds of Costa Rica's striking stream-breeding toads. Bsal was identified in a nature reserve in the Netherlands in 2013 after fire salamanders started dying with ulcers and sores similar to those caused by Bd. Fire salamanders (Salamandra salamandra) grow up to 35 centimeters long, can live more than 40 years, and hunt insects and other small prey in forest streams. Their bright yellow spots warn predators of poison around their head and back. In the Dutch nature reserve, the population plummeted 99.9%. The fungus is thought to have arrived in Europe via salamanders or newts imported from Asia for the pet trade. Bsal has since been found in Belgium and Germany in both fire salamanders and alpine newts. As soon as Bsal was spotted in Belgium in April 2014, veterinarian An Martel of Ghent University in Merelbeke, Belgium, and her colleagues began visiting every month to track the population. About 90% of the fire salamanders died within 6 months, and after 2 years all were gone. The fieldwork revealed that adult animals were more likely to get infected, which makes sense because they are in closer contact with each other—through fighting for mating and breeding, for example—than are juveniles. But the death of these adults means that the population likely won't recover. There was no immune response detected in any of the sick animals in the lab, suggesting that it will be impossible to develop a vaccine, the team reports today in Nature. "We really wanted to find solutions to mitigate disease, to save the salamanders, but everything turned out bad," Martel says. The team had also hoped that the fungus would become less virulent—as often occurs when a pathogen reaches a new host that lacks any immunity—but that hasn't happened: Fungal spores taken from the last fire salamanders in the Belgian forest, when dripped onto the backs of healthy salamanders in the lab, were just as lethal as those collected early in the outbreak. "When they come in contact with a single spore, they will die." The paper has more bad news. 
Researchers knew that Bsal makes spores with a tiny tail called a flagellum, which propels them toward amphibians. If spores dry out, they die. Otherwise, they typically survive for a few days before being eaten by protozoa. But Martel's group discovered that Bsal makes a second type of spore that looks much hardier and is rarely eaten by protozoa. "This will make it almost impossible to eradicate the fungus from the environment," says Martel, who adds that the spores can survive in pond water for more than 2 months. Another experiment showed that soil remained infectious for 48 hours after it was walked on by a sick salamander. In a separate lab test, the spores adhered to goose feet, suggesting they could hitchhike long distances on birds. The group also showed that two species that share the same habitat as the fire salamander are likely carriers of the disease. Midwife toads (Alytes obstetricans) could be infected with the fungus and shed spores for a few weeks, but they didn't get sick. A high dose of the fungus killed alpine newts (Ichthyosaura alpestris), but low doses made them infectious for months without killing them. As has happened with Bd in the Americas, Bsal will lurk in these reservoirs of disease even after local populations of fire salamanders vanish. Any fire salamanders that arrive from elsewhere will likely get infected by newts or toads. According to results from previous infection trials, most salamander species in Europe are likely just as vulnerable to Bsal. The fire salamander has a range that extends across Europe, and the fear is that the fungus will reach endangered salamanders. With small populations, these species could more easily be driven extinct, Fisher says. "The assumption is that they are all at risk," he says, and the findings in the new paper "have really upped their risk status." Martel and European colleagues recently started monitoring for Bsal in seven countries. It is possible to cure amphibians in the lab. For animals that can take the heat, like fire salamanders, 10 days at 25°C will kill the fungus. Other species can be cured with a combination of two drugs. But there is no practical solution for animals in the wild, especially when their habitat is contaminated with fungal spores. Herpetologist Jaime Bosch of the National Museum of Natural History in Madrid had a rare success in eliminating a chytrid fungus from the wild. A few years ago, he and colleagues got rid of Bd on the Spanish island of Mallorca by temporarily removing some 2000 tadpoles of the Mallorcan midwife toad (Alytes muletensis) and disinfecting their ponds with powerful chemicals. But this success would be hard to replicate in less isolated locations, he says. "Right now, we are very far away from having any solution." The only hope in the meantime, Bosch and others say, is to slow the spread of the disease by ending the importation of amphibians. The United States, a hot spot of amphibian diversity, has already taken steps in that direction. Last year, the U.S. Fish and Wildlife Service banned the import of 201 species of salamanders on the grounds that they might introduce the fungus. Joe Mendelson, a herpetologist at Zoo Atlanta, says the new research suggests the list should be expanded to include other carriers such as the toad and newt studied in the new paper. "This is a very important piece of work, and it's terrifying," he says. "If Bsal gets loose in the United States," he says, "it's going to be bad."


News Article | April 21, 2017
Site: motherboard.vice.com

Update: At 11pm BST on April 21, the National Grid announced that Britain had gone 24 hours without coal. Gas—with less than half the carbon emissions of coal—was the single biggest source of energy throughout the day, according to the National Grid. The UK is poised to go an entire day without using any energy from coal sources, according to British electricity network National Grid—the first time Britain will have done so since the late 19th century. Britain's reliance on coal has been falling steadily since the 1950s and peak coal production, some 287 million tonnes, occurred way back in 1913. But at around 11pm tonight, the UK is expected to witness an entire 24-hour period passing without electricity generated by coal, as alternative power sources such as wind, solar, and nuclear become more popular and the warming months reduce electricity demand in the country. The 24-hour milestone came very close to being reached on April 20, when there were at least 19 hours without power from coal, the National Grid said on Twitter, but tonight should see the first working day the UK hasn't needed coal power since the Industrial Revolution. This isn't the first time coal has dropped off the map in terms of the UK's energy mix. The amount of electricity generated from coal fell to zero at various points last year. These periods amounted to a total of 200 hours when no coal was used for energy production in 2016, Dr Iain Staffell, lecturer on sustainable energy at Imperial College London, told Motherboard. "We have retired half of the country's coal capacity in the last three or four years," explained Staffell, pointing out that the government plans to phase out coal power altogether by 2025. "Even without that policy intention, probably nearly all of [the power stations] would have been retired anyway because we haven't built any new ones in a long time," he said. The latest government figures also reveal a startling drop in the amount of coal being produced in Britain. In fact, production halved between 2015 and 2016. Imports are falling, too, and coal stocks at the end of 2016 had reached "a record low". The importance of coal in the UK power economy may have dwindled quickly of late, but it may still have a part to play in the future, Staffell suggested. It's likely that for several years to come, coal could be useful as an on-demand energy source to help the National Grid deal with peak periods. "You can turn [coal power stations] up when we need more, down when we need less – you can't do that with nuclear, you can't do that with renewables," said Staffell. Still, as renewable sources like wind power continue to rise in terms of electricity production in the UK, a largely coal-free future seems to be approaching.


News Article | May 3, 2017
Site: www.newscientist.com

An artificial hand is using artificial intelligence to see with an artificial eye. The new prosthetic can choose how best to grab objects placed in front of it automatically, making it easier to use. When it sees an object, the artificial hand detects the intention to grasp by interpreting electrical signals from muscles in the wearer’s arm. It then takes a picture of the object using a cheap webcam and picks one of four possible grasping positions. The different grips include one similar to picking up a cup, one similar to picking up a TV remote from a table, one that uses two fingers and a thumb, and another that uses just the thumb and index finger. “The hand learns the best way to grasp objects – that’s the beauty of it,” says Ghazal Ghazaei at Newcastle University, UK. To train the hand, Ghazaei and her colleagues showed it images of more than 500 objects. Each object came with 72 different images, showing different angles and different backgrounds, as well as the best grip for picking it up. Through trial and error, the system learned to choose the best grips for itself. Existing controllable prosthetics work by converting electrical signals in a person’s arm or leg into movement. But it can take a long time to learn to control an artificial limb and the movements can still be clumsy. The new system is just a prototype, but by giving a hand the ability to see what it is doing and position itself accordingly, the team believe they can make a better prosthetic. The design has been tested by two people who have had a hand amputated. They were able to grab a range of objects with just under 90 per cent accuracy. That’s not bad for a prototype but dropping one out of 10 things users try to pick up is not yet good enough. “We’re aiming for 100 per cent accuracy,” says Ghazaei. The researchers hope to achieve this by trying out different algorithms. They also plan to make a lighter version with the camera embedded in the palm of the hand. The key with prostheses like these is getting the balance right between user and computer control, says Dario Farina at Imperial College London. “People don’t want to feel like a robot, they want to feel like they are fully in control,” he says. It’s important that the technology helps assist grasping rather than fully taking over. “It should be similar to brake assistance on a car, the driver decides when to brake but the car helps them brake better,” says Farina.
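Conceptually, the vision half of such a hand is an ordinary image classifier with four output classes, one per grasp. The sketch below is a generic stand-in rather than the Newcastle group's actual network: the architecture, input size and class names are illustrative assumptions, and in the real device the winning class would be translated into motor commands for the prosthesis.

import torch
import torch.nn as nn

GRASPS = ["palm_neutral", "palm_pronated", "tripod", "pinch"]   # assumed labels

class GraspNet(nn.Module):
    """Small convolutional network mapping a camera frame to a grasp type."""
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = GraspNet()

# In use, a single webcam frame is preprocessed to a (1, 3, H, W) tensor and the
# hand is driven by whichever grasp class scores highest.
frame = torch.rand(1, 3, 128, 128)   # stand-in for a captured image
with torch.no_grad():
    grasp = GRASPS[int(model(frame).argmax(dim=1))]
print("selected grasp:", grasp)
# Training would pair many images of each object (different angles, lighting and
# backgrounds) with a grasp label and minimise the usual cross-entropy loss.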


News Article | April 29, 2017
Site: www.latimes.com

The future of construction just got a little bit more real. Researchers at MIT have created a mobile robot that can 3-D-print an entire building in a matter of hours — a technology that could be used in disaster zones, on inhospitable planets or even in our proverbial backyards. Though the platform described in the journal Science Robotics is still in early stages, it could offer a revolutionary tool for the construction industry and inspire more architects to rethink the relationship of buildings to people and the environment. Current construction practices typically involve bricklaying, wood framing and concrete casting – technologies that have been around for decades in some cases, and centuries in others. Homes and office buildings are often built in the same boxy, cookie-cutter-like templates, even though the environment from one area to another may change dramatically. "The architecture, engineering, and construction (AEC) sector tends to be risk-averse: Most project fabrication data nowadays have been digitally produced, but the manufacturing and construction processes are mostly done with manual methods and conventional materials adopted a century ago," Imperial College London researcher Guang-Zhong Yang, the journal's editor, wrote in an editorial on the paper. In recent years, scientists and engineers have begun to explore the idea that buildings could instead be built through additive manufacturing – that is, 3-D printing. A home could be customized to its local environment, it could use building resources more efficiently, and it could deploy materials in more sophisticated ways. "Right now, the way we manufacture things is we go to the mine, we dig out minerals and materials, we ship them to a factory, the factory makes a bunch of mass-made parts, usually out of a single material, and then they're assembled — screwed together, glued together and shipped back to consumers," said lead author Steven Keating, a mechanical engineer who did the research as a graduate student in Neri Oxman's group at the Massachusetts Institute of Technology. But the group's many projects, he added, revolved around this question: How do we actually fabricate in a way that is more consistent with how biology works? Keating pointed to the tree as one example of a natural builder. Trees can self-repair, operate with self-sufficiency, build onsite with locally sourced materials, and adapt to their environment. "These are the kinds of principles that we've looked at for a lot of the projects in the group," he said. While several groups around the world have been working on large-scale 3-D printing techniques, there have been challenges in this process, Keating said. "A lot of other research projects that are looking at digital construction often don't create something of an architectural scale — and if they do, they're not using a process that could be easily integrated into a construction site," Keating said. "They're not using materials or a process that can be easily code-certified. And what we wanted to make sure could happen is we could actually break into the construction industry, because it's a very slow and conservative industry." Keating and his colleagues' robot, called the Digital Construction Platform, looks to address those issues. It features hydraulic and electric robotic arms and can be loaded with all kinds of sensors to measure its environment, including lasers and a radiation-detecting Geiger counter. 
In less than 13.5 hours, the robot was able to zip round and round, printing a 14.6-meter-wide, 3.7-meter-tall open dome structure out of a foam used as insulated formwork. Strange as it looks, this formwork could be filled with concrete. Since this is essentially what already happens in traditional construction, this 3-D printing process could be integrated into current construction techniques. (In both the traditional and 3-D-printed scenarios, the formwork ends up as the building's insulation.) This process has a number of advantages, many of which allow the robot to design and build more in the way that living systems in nature do, Keating said. Three-dimensional printing uses fewer materials more efficiently. It can also create useful gradients, such as reducing wall thickness from the bottom of a wall toward the top. (Nature does this too: Think of a tree's trunk at the base versus near the top, or the way a squid beak goes from hard at the tip to soft at the base.) This process can create and work with curves, which are usually more costly for traditional building methods. The formwork also cures so quickly (within about 30 seconds) that the robot can build horizontally without needing structural support the way traditional construction methods do. Rather than trying to design the perfect structure beforehand, a 3-D-printing robot could produce a building that's completely in tune with its environmental factors – soil moisture, temperature, wind direction and radiation levels, among others. This is how scientists think animals such as termites build their homes — by modifying the structure in response to the environment. Since it's solar-powered, this robot can be self-sufficient. And like living things, it could potentially create building materials out of stuff in the local ecosystem: The authors showed that the robot was able to take scoops of dirt and turn the compressed earth into building material. The researchers were even able to print with ice. "I know it sounds silly — why would you want to print with ice? — but if you actually look, NASA's very seriously thinking about using ice as a fabrication material for places in space such as Mars, because ice actually absorbs a lot of cosmic radiation," Keating said. Printing with ice from the environment would be much more sensible than lugging all your building materials all the way to the Red Planet, he noted.
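One of the gradients mentioned above is easy to make concrete: rather than extruding a constant bead, the print head can taper wall thickness with height, the way a tree trunk thins toward its crown. The sketch below computes such a taper for a dome of the reported dimensions; the thickness values and layer height are illustrative assumptions, not parameters of the MIT platform.

# Hypothetical linear taper of wall thickness with height, evaluated per layer.
DOME_HEIGHT_M = 3.7       # height reported for the printed dome
LAYER_HEIGHT_M = 0.025    # assumed extrusion layer height
BASE_THICKNESS_M = 0.30   # assumed wall thickness at ground level
TOP_THICKNESS_M = 0.15    # assumed wall thickness at the crown

def wall_thickness(height_m: float) -> float:
    """Linearly interpolate wall thickness between the base and the top."""
    fraction = min(max(height_m / DOME_HEIGHT_M, 0.0), 1.0)
    return BASE_THICKNESS_M + (TOP_THICKNESS_M - BASE_THICKNESS_M) * fraction

n_layers = int(DOME_HEIGHT_M / LAYER_HEIGHT_M)
for layer in range(0, n_layers, 30):   # sample every 30th layer
    z = layer * LAYER_HEIGHT_M
    print(f"layer {layer:3d}  z = {z:4.2f} m  thickness = {wall_thickness(z) * 100:.1f} cm")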


News Article | April 19, 2017
Site: www.eurekalert.org

Scientific evidence of a 'higher' state of consciousness has been found in a study led by the University of Sussex. Neuroscientists observed a sustained increase in neural signal diversity - a measure of the complexity of brain activity - of people under the influence of psychedelic drugs, compared with when they were in a normal waking state. The diversity of brain signals provides a mathematical index of the level of consciousness. For example, people who are awake have been shown to have more diverse neural activity using this scale than those who are asleep. This, however, is the first study to show brain-signal diversity that is higher than baseline, that is higher than in someone who is simply 'awake and aware'. Previous studies have tended to focus on lowered states of consciousness, such as sleep, anaesthesia, or the so-called 'vegetative' state. The team say that more research is needed using more sophisticated and varied models to confirm the results but they are cautiously excited. Professor Anil Seth, Co-Director of the Sackler Centre for Consciousness Science at the University of Sussex, said: "This finding shows that the brain-on-psychedelics behaves very differently from normal. "During the psychedelic state, the electrical activity of the brain is less predictable and less 'integrated' than during normal conscious wakefulness - as measured by 'global signal diversity'. "Since this measure has already shown its value as a measure of 'conscious level', we can say that the psychedelic state appears as a higher 'level' of consciousness than normal - but only with respect to this specific mathematical measure." For the study, Michael Schartner, Adam Barrett and Professor Seth of the Sackler Centre reanalysed data that had previously been collected by Imperial College London and the University of Cardiff in which healthy volunteers were given one of three drugs known to induce a psychedelic state: psilocybin, ketamine and LSD. Using brain imaging technology, they measured the tiny magnetic fields produced in the brain and found that, across all three drugs, this measure of conscious level - the neural signal diversity - was reliably higher. This does not mean that the psychedelic state is a 'better' or more desirable state of consciousness, the researchers stress; instead, it shows that the psychedelic brain state is distinctive and can be related to other global changes in conscious level (e.g. sleep, anaesthesia) by application of a simple mathematical measure of signal diversity. Dr Muthukumaraswamy, who was involved in all three initial studies, commented: "That similar changes in signal diversity were found for all three drugs, despite their quite different pharmacology, is both very striking and also reassuring that the results are robust and repeatable." The findings could help inform discussions gathering momentum about the carefully-controlled medical use of such drugs, for example in treating severe depression. Dr Robin Carhart-Harris of Imperial College London said: "Rigorous research into psychedelics is gaining increasing attention, not least because of the therapeutic potential that these drugs may have when used sensibly and under medical supervision. "The present study's findings help us understand what happens in people's brains when they experience an expansion of their consciousness under psychedelics. People often say they experience insight under these drugs - and when this occurs in a therapeutic context, it can predict positive outcomes. 
The present findings may help us understand how this can happen." As well as helping to inform possible medical applications, the study adds to a growing scientific understanding of how conscious level (how conscious one is) and conscious content (what one is conscious of) are related to each other. Professor Seth said: "We found correlations between the intensity of the psychedelic experience, as reported by volunteers, and changes in signal diversity. This suggests that our measure has close links not only to global brain changes induced by the drugs, but to those aspects of brain dynamics that underlie specific aspects of conscious experience." The research team are now working hard to identify how specific changes in information flow in the brain underlie specific aspects of psychedelic experience, like hallucinations.
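The 'signal diversity' score at the heart of these findings can be made concrete with a little code. The published analyses quantified diversity with a Lempel-Ziv complexity measure computed on binarised MEG signals; the sketch below is only a minimal illustration of that idea, using an LZ78-style phrase count, a mean-threshold binarisation, a shuffled-surrogate normalisation and synthetic data standing in for real recordings - all simplifying assumptions rather than the authors' exact pipeline.

```python
import numpy as np

def lz_phrase_count(bits):
    """Count phrases in a simple (LZ78-style) Lempel-Ziv parsing of a binary sequence.
    The published work used a closely related LZ76 measure; this is an illustrative variant."""
    s = ''.join('1' if b else '0' for b in bits)
    phrases, current = set(), ''
    for ch in s:
        current += ch
        if current not in phrases:   # new phrase: store it and start a fresh one
            phrases.add(current)
            current = ''
    return len(phrases) + (1 if current else 0)

def signal_diversity(signal, rng):
    """Binarise one channel around its mean and return a normalised Lempel-Ziv score.
    Normalisation uses a shuffled surrogate of the same sequence, so values near 1.0
    mean 'as diverse as noise' (an assumed convention for this sketch)."""
    bits = signal > signal.mean()
    return lz_phrase_count(bits) / lz_phrase_count(rng.permutation(bits))

# Toy demonstration with synthetic data standing in for MEG channels.
rng = np.random.default_rng(0)
t = np.linspace(0, 2, 1200)                              # a 2-second 'chunk' at an assumed 600 Hz
regular = np.sin(2 * np.pi * 10 * t)                     # highly predictable oscillation
irregular = regular + 1.5 * rng.standard_normal(t.size)  # noisier, less predictable activity
print('regular  :', round(signal_diversity(regular, rng), 2))
print('irregular:', round(signal_diversity(irregular, rng), 2))
```

On this toy data the less predictable signal scores closer to 1.0 than the pure oscillation, which is the qualitative behaviour the diversity index is meant to capture.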


News Article | May 3, 2017
Site: www.futurity.org

A new genetic fingerprinting technique is the first to show the huge diversity of the malaria parasite, one of nature’s most persistent and successful human pathogens. The technique validates a previously untestable “strain hypothesis” that was proposed more than 20 years ago and opens up new ways of thinking about how to tackle this cunning killer. Key to that understanding is changing the way we think about malaria—that it is not so much like measles and more like the flu. As reported in the Proceedings of the National Academy of Sciences, researchers collected blood samples from 641 children, aged 1 to 12 years, from Bakoumba, a village in Gabon, West Africa, and obtained the genetic fingerprints of parasites from 200 infected children. Remarkably, every child was infected with malaria parasites that had a different fingerprint from the parasites in every other child. In 1994, Professors Sunetra Gupta and Karen Day, both then working at Imperial College London and later the University of Oxford, proposed that the malaria transmission system may be organized into a set of strains based on diversity of the genes that code for the surface coat of the parasite. If true, this strain diversity could explain why people can be re-infected with malaria many times over. It has taken until now for Day and colleagues to develop and optimize the mathematical and laboratory techniques to finally address the hypothesis. The malaria parasite is a single-celled microorganism (known as a Plasmodium) that infects red blood cells and is transferred from human to human via mosquitoes. It has been infecting people for tens of thousands of years, and, according to the World Health Organization, in 2015, nearly half of the world’s population remained at risk of malaria. Over the past 20 years, Day’s team has developed a way to genetically fingerprint malaria parasites from small amounts of blood based on what are called var genes. Every parasite has approximately 60 of these var genes but only uses one at a time and can switch which one it uses. The genes encode proteins that coat the surface of the red blood cells that the parasite infects. The var genes are significant because they determine the ability of the parasite to disguise itself from the human immune system, and contribute to the virulence of the disease. If the genes that encode the surface coat overlap between two parasites, as you would expect in siblings, which would share a maximum of 50 percent of their genes, then when someone is re-infected, the immune system will recognize these malaria parasites and quickly purge them if it has seen the parent infections. But if there is little or no overlap in these genes, then the immune system won’t recognize the malaria parasite as readily, leading to chronic infection. The study shows that “the parasite has evolved this enormous diversity with limited overlap between the sets of var genes likely so it can keep re-infecting the same humans,” says Day, now a professor of population science and dean of science at the University of Melbourne. Coauthor Mercedes Pascual, an ecologist and professor at the University of Chicago, describes this as “the parasites forming niches by diversifying. They compete with each other for hosts, and distance themselves from each other to invade the same population of humans, a limited resource.” Current malaria control programs don’t target the diversity of the parasite, Day says.
“With malaria, we attack something that is conserved between all strains, but the problem is if you don’t get rid of all of the malaria parasites with current strategies, you have this enormous diversity that can allow the system to bounce back quickly to pre-control levels. The resilience of the system is coming from the diversity, so you’ve got to monitor how approaches to control attack diversity and not just the parasite per se.” Interestingly, the theory of malaria control is based on malaria having no diversity and being like measles. You contract measles once and have lifelong immunity, whereas you can get malaria or the flu many times because there are multiple strains circulating. “Malaria is like flu, but our fingerprinting results show that it is way more complicated,” Day says. By analyzing the var genes, researchers came up with a unique identifier, or fingerprint, for each malaria strain that they call a var code. “Looking down the microscope you would have said all of the infections look the same, but when we did the fingerprinting genetically with this variant antigen gene system, we could see that every child had a different parasite fingerprint, and importantly, each fingerprint was highly unrelated to all other fingerprints,” Day says. This unrelatedness was a surprise, Day says. “Malaria has sex as part of its lifecycle, every time it goes through a mosquito. And so, because the malaria parasite mates you would expect to find related parasites that we might call parents, siblings, cousins, and aunts and uncles in the population.” “Even with very high levels of sex between parasites, their competition for available hosts can be so intense, that really only very unrelated parasites would be fit enough to survive and here we have a structure where highly related parasites were not detected,” says Yael Artzy-Randrup, a theoretical ecologist from the University of Amsterdam, and a coauthor of the study. “Malaria is similar to flu in that humans can be infected multiple times by different malaria parasite variants. However, in contrast to the flu, the situation with malaria is much more complex. With malaria, at any given point of time there is a high diversity of variants coexisting even in very small human populations, while in flu, variants usually replace each other, and people will only be infected by one variant at a time.” After waiting 20 years to get their results, the researchers suffered a setback in 2012 when Hurricane Sandy cut the power to Day’s laboratory at New York University, destroying samples that represented months of work. The team was eventually able to recover and continue its work. Once the team had assembled all the data, they had to assure their scientific peers—many of whom were skeptical of the strain hypothesis—that the pattern of diversity and unrelatedness they were seeing was not just through random chance. Researchers tested the results using statistical and computational techniques inspired by the analysis of complex systems in ecology, such as communities of species in ecosystems. They found that the system was non-random, and the relatives were absent from the population. The project is connected to a central question in ecology: what is the structure of diversity? “We are asking this question for the ensemble of parasites within a population of Plasmodium falciparum, but it can also be asked for the ensemble of tree species in a rainforest,” Pascual says. 
“It is an exciting time for bringing together quantitative analyses and deep sampling of biological systems in the field. “Our findings indicate that the enormous diversity of the parasite is structured and that we need to consider the implications of this structure for intervention, and possibly develop a different way to model transmission in malaria altogether.” Additional researchers from the University of Melbourne, the University of Chicago, New York University, the University of Michigan, the University of Amsterdam, the University of Montpellier, and the University of Paris Descartes are coauthors of the study.
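The core computation behind the 'var code' comparison lends itself to a simple illustration: treat each infection's var repertoire as a set of gene types, measure pairwise overlap, and ask whether the observed overlap is lower than random assembly from a shared gene pool would produce. The sketch below is a toy version of that logic, with made-up data and an arbitrary overlap statistic; the study itself used more sophisticated methods borrowed from community ecology.

```python
import random
from itertools import combinations

def pairwise_overlap(repertoires):
    """Mean pairwise overlap between var repertoires, where overlap is the number of
    shared gene types divided by the size of the smaller repertoire (an assumed statistic)."""
    scores = [len(a & b) / min(len(a), len(b)) for a, b in combinations(repertoires, 2)]
    return sum(scores) / len(scores)

def null_overlaps(repertoires, gene_pool, trials, rng):
    """Null expectation: rebuild equally sized repertoires by sampling gene types at random
    from the pooled list, then recompute the mean overlap each time."""
    sizes = [len(r) for r in repertoires]
    return [pairwise_overlap([set(rng.sample(gene_pool, k)) for k in sizes])
            for _ in range(trials)]

# Toy data: each infection carries ~60 var gene types drawn from a large hypothetical pool.
# Real var codes from sequenced infections would replace this block.
rng = random.Random(1)
gene_pool = [f"var_{i}" for i in range(5000)]
observed = [set(rng.sample(gene_pool, 60)) for _ in range(30)]

obs = pairwise_overlap(observed)
null = null_overlaps(observed, gene_pool, trials=200, rng=rng)
# Fraction of random assemblies whose overlap is as low as the observed value;
# with real data, a very small fraction would suggest overlap lower than chance.
frac = sum(s <= obs for s in null) / len(null)
print(f"observed mean overlap: {obs:.4f}")
print(f"null mean overlap:     {sum(null) / len(null):.4f}  (fraction of null <= observed: {frac:.2f})")
```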


News Article | May 3, 2017
Site: phys.org

Led by biomedical engineers at Newcastle University, UK, and funded by the Engineering and Physical Sciences Research Council (EPSRC), the bionic hand is fitted with a camera which instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand. Bypassing the usual processes which require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb, the hand 'sees' and reacts in one fluid movement. A small number of amputees have already trialled the new technology and now the Newcastle University team are working with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the 'hands with eyes' to patients at Newcastle's Freeman Hospital. Publishing their findings today in the Journal of Neural Engineering, co-author on the study Dr Kianoush Nazarpour, a Senior Lecturer in Biomedical Engineering at Newcastle University, explains: "Prosthetic limbs have changed very little in the past 100 years—the design is much better and the materials are lighter weight and more durable, but they still work in the same way. "Using computer vision, we have developed a bionic hand which can respond automatically—in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction. "Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison. "Now, for the first time in a century, we have developed an 'intuitive' hand that can react without thinking." Recent statistics show that in the UK there are around 600 new upper-limb amputees every year, of whom 50% are aged 15-54. In the US there are 500,000 upper limb amputees a year. Current prosthetic hands are controlled via myoelectric signals - that is, the electrical activity of the muscles recorded from the skin surface of the stump. Controlling them, says Dr Nazarpour, takes practice, concentration and, crucially, time. Using neural networks—the basis for Artificial Intelligence—lead author on the study Ghazal Ghazaei showed the computer numerous object images and taught it to recognise the 'grip' needed for different objects. "We would show the computer a picture of, for example, a stick," explains Miss Ghazaei, who carried out the work as part of her PhD in the School of Electrical and Electronic Engineering at Newcastle University. "But not just one picture, many images of the same stick from different angles and orientations, even in different light and against different backgrounds and eventually the computer learns what grasp it needs to pick that stick up. "So the computer isn't just matching an image, it's learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up. "It is this which enables it to accurately assess and pick up an object which it has never seen before—a huge step forward in the development of bionic limbs." Grouping objects by size, shape and orientation, according to the type of grasp that would be needed to pick them up, the team programmed the hand to perform four different 'grasps': palm wrist neutral (such as when you pick up a cup); palm wrist pronated (such as picking up the TV remote); tripod (thumb and two fingers) and pinch (thumb and first finger).
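The article does not give the network's architecture, so the snippet below is only a hedged sketch of the general approach it describes: a convolutional neural network that looks at an image of an object and scores it against the four grasp classes listed above. The layer sizes, the 64x64 input and the random stand-in images are illustrative assumptions, written in PyTorch rather than whatever framework the Newcastle team actually used.

```python
import torch
import torch.nn as nn

# Four grasp classes as described in the article; everything else here is an assumption.
GRASPS = ["palm_wrist_neutral", "palm_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    def __init__(self, n_classes=len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)  # assumes 64x64 RGB input

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# One toy training step on random tensors standing in for labelled object photographs.
model = GraspNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)            # batch of 8 hypothetical 64x64 images
labels = torch.randint(0, len(GRASPS), (8,))  # hypothetical grasp labels

loss = loss_fn(model(images), labels)
optimiser.zero_grad()
loss.backward()
optimiser.step()

# At run time, the hand would select the grasp with the highest score for the camera frame.
print(GRASPS[model(images[:1]).argmax(dim=1).item()])
```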
Using a 99p camera fitted to the prosthesis, the hand 'sees' an object, picks the most appropriate grasp and sends a signal to the hand—all within a matter of milliseconds and ten times faster than any other limb currently on the market. "One way would have been to create a photo database of every single object but clearly that would be a massive task and you would literally need every make of pen, toothbrush, shape of cup—the list is endless," says Dr Nazarpour. "The beauty of this system is that it's much more flexible and the hand is able to pick up novel objects—which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before." The work is part of a larger research project to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain. Led by Newcastle University and involving experts from the universities of Leeds, Essex, Keele, Southampton and Imperial College London, the aim is to develop novel electronic devices that connect to the forearm neural networks to allow two-way communications with the brain. Reminiscent of Luke Skywalker's artificial hand, the electrodes in the bionic limb would wrap around the nerve endings in the arm. This would mean for the first time the brain could communicate directly with the prosthesis. The 'hand that sees', explains Dr Nazarpour, is an interim solution that will bridge the gap between current designs and the future. "It's a stepping stone towards our ultimate goal," he says. "But importantly, it's cheap and it can be implemented soon because it doesn't require new prosthetics—we can just adapt the ones we have." Anne Ewing, Advanced Occupational Therapist at Newcastle upon Tyne Hospitals NHS Foundation Trust, has been working with Dr Nazarpour and his team. "I work with upper limb amputee patients which is extremely rewarding, varied and at times challenging," she said. "We always strive to put the patient at the heart of everything we do and so make sure that any interventions are client centred to ensure patients' individual goals are met either with a prosthesis or alternative method of carrying out a task. "This project in collaboration with Newcastle University has provided an exciting opportunity to help shape the future of upper limb prosthetics, working towards achieving patients' prosthetic expectations and it is wonderful to have been involved." "For me it was literally a case of life or limb," says Doug McIntosh, who lost his right arm in 1997 through cancer. "I had developed a rare form of cancer called epithelial sarcoma, which develops in the deep tissue under the skin, and the doctors had no choice but to amputate the limb to save my life. "Losing an arm and battling cancer with three young children was life changing. I left my job as a life support supervisor in the diving industry and spent a year fund-raising for cancer charities. "It was this and my family that motivated me and got me through the hardest times." Since then, Doug has gone on to be an inspiration to amputees around the world. He became the first amputee to cycle from John O'Groats to Land's End in 100 hours, has cycled around the coastline of Britain, run three London Marathons, completed The Dallaglio Flintoff Cycle Slam in 2012 and 2014 and, in 2014, cycled with the British Lions Rugby Team to Murrayfield Rugby Stadium for the "Walking with Wounded" charity.
He is currently preparing to take on Mont Ventoux this September, completing three cycle climbs in one day for Cancer Research UK and Maggie's Cancer Centres. Involved in the early trials of the first myoelectric prosthetic limbs, Doug has been working with the Newcastle team to trial the new hand that sees. "The problem is there's nothing yet that really comes close to feeling like the real thing," explains the father-of-three who lives in Westhill, Aberdeen with his wife of 32 years, Diane. "Some of the prosthetics look very realistic but they feel slow and clumsy when you have a working hand to compare them to. "In the end I found it easier just to do without and learn to adapt. When I do use a prosthesis I use a split hook which doesn't look pretty but does the job." But he says the new, responsive hand being developed in Newcastle is a 'huge leap forward'. "This offers for the first time a real alternative for upper limb amputees," he says. "For me, one of the ways of dealing with the loss of my hand was to be very open about it and answer people's questions. But not everyone wants that and so to have the option of a hand that not only looks realistic but also works like a real hand would be an amazing breakthrough and transform the recovery time—both physically and mentally—for many amputees." More information: G. Ghazaei, A. Alameer, P. Degenaar, G. Morgan, and K. Nazarpour, "Deep learning-based artificial vision for grasp classification in myoelectric hands," Journal of Neural Engineering, 14(3): 036025, 2017.


News Article | April 20, 2017
Site: www.gizmag.com

Reaching a higher state of consciousness is a concept you're more likely to hear a spiritualist spout than a scientist, but now neuroscientists at the University of Sussex claim to have found the first evidence of just such a state. From wakefulness down to a deep coma, consciousness is on a sliding scale measured by the diversity of brain signals, and the researchers found that when under the influence of psychedelic drugs, that diversity jumps to new heights above the everyday baseline. The research builds on data gathered about a year ago by a team at Imperial College London, which dosed up volunteers with psychedelics, including LSD, psilocybin and ketamine, then scanned their brains with magnetoencephalographic (MEG) techniques to examine the effects. This new study set out to determine how a psychedelic state would compare to other levels of wakefulness and unconsciousness, according to a scale of brain signal diversity measured by monitoring the magnetic fields produced by the brain. When a person is asleep, their brain signals are far less diverse than when they're awake and aware, and past research has noted that it varies by what stage of the sleep cycle they're in. Being put under different types of anaesthesia induces even lower scores, and it bottoms out for those in a vegetative state. But this is the first time signal diversity has been seen to be higher than the normal readings of an alert, conscious mind. "This finding shows that the brain-on-psychedelics behaves very differently from normal," says Anil Seth, corresponding author of the study. "During the psychedelic state, the electrical activity of the brain is less predictable and less 'integrated' than during normal conscious wakefulness – as measured by 'global signal diversity.' Since this measure has already shown its value as a measure of 'conscious level', we can say that the psychedelic state appears as a higher 'level' of consciousness than normal – but only with respect to this specific mathematical measure." Interestingly, the more intense a trip the participant reported, the more diverse their brain signals appeared to be. That finding could help scientists better understand the connection between the level of consciousness and what specifically someone is conscious of. "We found correlations between the intensity of the psychedelic experience, as reported by volunteers, and changes in signal diversity," says Seth. "This suggests that our measure has close links not only to global brain changes induced by the drugs, but to those aspects of brain dynamics that underlie specific aspects of conscious experience." But as appealing as a higher state of consciousness might sound, the researchers (and we here at New Atlas) aren't trying to encourage drug use. The team is careful to point out that "higher" doesn't necessarily mean "better," and the key take-away from the study is that a psychedelic experience is a distinct conscious state. "Rigorous research into psychedelics is gaining increasing attention, not least because of the therapeutic potential that these drugs may have when used sensibly and under medical supervision," says Robin Carhart-Harris, another author of the study. "The present study's findings help us understand what happens in people's brains when they experience an expansion of their consciousness under psychedelics. People often say they experience insight under these drugs – and when this occurs in a therapeutic context, it can predict positive outcomes.
The present findings may help us understand how this can happen." In the future, the researchers will turn their attention to trying to figure out the biological mechanics behind specific parts of the experience, such as hallucinations. The research was published in the journal Scientific Reports.


News Article | April 19, 2017
Site: www.chromatographytechniques.com

Scientific evidence of a 'higher' state of consciousness has been found in a study led by the University of Sussex. Neuroscientists observed a sustained increase in neural signal diversity - a measure of the complexity of brain activity - of people under the influence of psychedelic drugs, compared with when they were in a normal waking state. The diversity of brain signals provides a mathematical index of the level of consciousness. For example, people who are awake have been shown to have more diverse neural activity using this scale than those who are asleep. This, however, is the first study to show brain-signal diversity that is higher than baseline, that is, higher than in someone who is simply 'awake and aware'. Previous studies have tended to focus on lowered states of consciousness, such as sleep, anesthesia, or the so-called 'vegetative' state. The team say that more research is needed using more sophisticated and varied models to confirm the results, but they are cautiously excited. Professor Anil Seth, Co-Director of the Sackler Centre for Consciousness Science at the University of Sussex, said: "This finding shows that the brain-on-psychedelics behaves very differently from normal. "During the psychedelic state, the electrical activity of the brain is less predictable and less 'integrated' than during normal conscious wakefulness - as measured by 'global signal diversity'. "Since this measure has already shown its value as a measure of 'conscious level', we can say that the psychedelic state appears as a higher 'level' of consciousness than normal - but only with respect to this specific mathematical measure." For the study, Michael Schartner, Adam Barrett and Professor Seth of the Sackler Centre reanalyzed data that had previously been collected by Imperial College London and the University of Cardiff in which healthy volunteers were given one of three drugs known to induce a psychedelic state: psilocybin, ketamine and LSD. Using brain imaging technology, they measured the tiny magnetic fields produced in the brain and found that, across all three drugs, this measure of conscious level - the neural signal diversity - was reliably higher. This does not mean that the psychedelic state is a 'better' or more desirable state of consciousness, the researchers stress; instead, it shows that the psychedelic brain state is distinctive and can be related to other global changes in conscious level (e.g. sleep, anesthesia) by application of a simple mathematical measure of signal diversity. Dr Muthukumaraswamy, who was involved in all three initial studies, commented: "That similar changes in signal diversity were found for all three drugs, despite their quite different pharmacology, is both very striking and also reassuring that the results are robust and repeatable." The findings could help inform discussions gathering momentum about the carefully-controlled medical use of such drugs, for example in treating severe depression. Dr. Robin Carhart-Harris of Imperial College London said: "Rigorous research into psychedelics is gaining increasing attention, not least because of the therapeutic potential that these drugs may have when used sensibly and under medical supervision. "The present study's findings help us understand what happens in people's brains when they experience an expansion of their consciousness under psychedelics. People often say they experience insight under these drugs - and when this occurs in a therapeutic context, it can predict positive outcomes.
The present findings may help us understand how this can happen." As well as helping to inform possible medical applications, the study adds to a growing scientific understanding of how conscious level (how conscious one is) and conscious content (what one is conscious of) are related to each other. Professor Seth said: "We found correlations between the intensity of the psychedelic experience, as reported by volunteers, and changes in signal diversity. This suggests that our measure has close links not only to global brain changes induced by the drugs, but to those aspects of brain dynamics that underlie specific aspects of conscious experience." The research team are now working hard to identify how specific changes in information flow in the brain underlie specific aspects of psychedelic experience, like hallucinations. The findings were published in Scientific Reports.


News Article | May 2, 2017
Site: www.eurekalert.org

Scientists have re-examined an overlooked museum fossil and discovered that it is the earliest known member of the titanosauriform family of dinosaurs. The fossil, which the researchers from Imperial College London and their colleagues in Europe have named Vouivria damparisensis, has been identified as a brachiosaurid sauropod dinosaur. The researchers suggest the age of Vouivria is around 160 million years old, making it the earliest known fossil from the titanosauriform family of dinosaurs, which includes better-known dinosaurs such as the Brachiosaurus. When the fossil was first discovered in France in the 1930s, its species was not identified, and until now it has largely been ignored in scientific literature. The new analysis of the fossil indicates that Vouivria died at an early age, weighed around 15,000 kilograms and was over 15 metres long, which is roughly 1.5 times the size of a double-decker bus in the UK. It had a long neck held at around a 45 degree angle, a long tail, and four legs of equal length. It would have been a plant eater. Dr Philip Mannion, the lead author of the study from the Department of Earth Science and Engineering at Imperial College London, said: "Vouivria would have been a herbivore, eating all kinds of vegetation, such as ferns and conifers. This creature lived in the Late Jurassic, around 160 million years ago, at a time when Europe was a series of islands. We don't know what this creature died from, but millions of years later it is providing important evidence to help us understand in more detail the evolution of brachiosaurid sauropods and a much bigger group of dinosaurs that they belonged to, called titanosauriforms." Titanosauriforms were a diverse group of sauropod dinosaurs and some of the largest creatures to have ever lived on land. They lived from at least the Late Jurassic, right to the end-Cretaceous mass extinction, when an asteroid wiped out most life on Earth. A lack of fossil records means that it has been difficult for scientists to understand the early evolution of titanosauriforms and how they spread out across the planet. The re-classification of Vouivria as an early titanosauriform will help scientists to understand the spread of these creatures during the Early Cretaceous period, a later period of time, after the Jurassic, around 145 - 100 million years ago. The team's incorporation of Vouivria into a revised analysis of sauropod evolutionary relationships shows that by the Early Cretaceous period, brachiosaurids were restricted to what is now Africa and the USA, and were probably extinct in Europe. Previously, scientists had suggested the presence of another brachiosaurid sauropod dinosaur called Padillasaurus much further afield in what is now South America, in the Early Cretaceous. However, the team's incorporation of Vouivria into the fossil timeline suggests that Padillasaurus was not a brachiosaurid, and that this group did not spread as far as South America. The Vouivria fossil was originally discovered by palaeontologists in the village of Damparis, in the Jura Department of eastern France, in 1934. Ever since, it has been stored in the Museum National d'Histoire Naturelle, Paris.
It was only briefly mentioned by scientists in studies in the 1930s and 1940s, but it was never recognised as a distinct species. It has largely been ignored in the literature, where it has often been referred to simply as the Damparis dinosaur. Now, a deeper analysis of the fossil is also helping the scientists in today's study to understand the environment Vouivria would have been in when it died, which was debated when it was initially found. The researchers believe Vouivria died in a coastal lagoon environment, during a brief sea level decline in Europe, before being buried when sea levels increased once more. When the fossil was first discovered, in rocks that would have originally come from a coastal environment, researchers suggested that its carcass had been washed out to sea, because sauropods were animals that lived on land. The team's examination of Vouivria in today's study, coupled with an analysis of the rocks it was encased in, provides strong evidence that this was not the case. The genus name of Vouivria is derived from the old French word 'vouivre', itself from the Latin 'vipera', meaning 'viper'. In Franche-Comté, the region in which the specimen was originally discovered, 'la vouivre' is a legendary winged reptile. The species name damparisensis refers to the village of Damparis, where the fossil was originally found. The research was carried out in conjunction with the Museum National d'Histoire Naturelle and the CNRS/Université Paris 1 Panthéon-Sorbonne, with funding from the European Union's Synthesys programme. Currently, titanosauriforms from the Late Cretaceous are poorly understood compared to their relatives in the Late Jurassic. So, the next step for the researchers will see them expanding on their analysis of the evolutionary relationships of all species in the titanosauriform group. The team are also aiming to find more sauropod remains from older rocks to determine in more detail how they spread across the continents.


News Article | May 2, 2017
Site: www.gizmag.com

After sitting idly in a Paris history museum for more than 80 years, a previously overlooked fossil is shedding light on a decidedly obscure chapter in dinosaur evolution. Not only is the new species providing scientists with new clues, it has turned out to be the earliest relative of a certain long-necked plant-eater called the Brachiosaurus. In 1934 paleontologists came across a dinosaur fossil in the village of Damparis in eastern France. A species was not immediately identified and the fossil was mostly ignored by scientific literature in the 30s and 40s, referred to only as the "Damparis dinosaur." But now scientists from Imperial College London, together with France's Museum National d'Histoire Naturelle, where Damparis has been stored, and Université Paris 1 Panthéon-Sorbonne, have pulled it out for another look. New analysis of the fossil has revealed it to be a brachiosaurid sauropod, a group belonging to a larger group of dinosaurs called the titanosauriforms. These were some of the biggest creatures to ever live on land and roamed the Earth from at least the Late Jurassic (around 160 million years ago) to the mass-extinction event at the end of the Cretaceous period (65 million years ago). The researchers say that the age of the fossil, which has now been named Vouivria damparisensis, is around 160 million years old. This is significant for a couple of reasons. It makes it the earliest known fossil from the titanosauriform family and therefore the earliest relative of the brachiosaurus, and helps to fill in what was a sizable hole in the existing fossil records. "Vouivria would have been a herbivore, eating all kinds of vegetation, such as ferns and conifers," says Imperial College London's Dr Philip Mannion, lead author of the study. "This creature lived in the Late Jurassic, around 160 million years ago, at a time when Europe was a series of islands. We don't know what this creature died from, but millions of years later it is providing important evidence to help us understand in more detail the evolution of brachiosaurid sauropods and a much bigger group of dinosaurs that they belonged to, called titanosauriforms." The scientists say Vouivria died at a young age, weighing around 15,000 kg (33,000 lb) and measured more than 15 m long (50 ft), around 1.5 times the size of a double-decker bus in the UK. It had a long neck, a long tail and four legs of equal length. Without many fossils to work with, it has been hard for scientists to plot the evolution of the titanosauriforms and their spread across the planet. But already Vouivria is starting to fill in some of the blanks. The team believes that the dinosaur died in a coastal lagoon in the midst of a short sea level decline in Europe, and was then buried when the sea rose again. Working the new evidence into analysis of brachiosaurid evolution, the scientists now believe that the creatures were most likely extinct in Europe soon after this creature lived – by the Early Cretaceous period – and restricted to what is now Africa and the USA. They are now expanding that analysis to consider the evolutionary relationships between all members of the titanosauriform family to understand their evolution even further. The research was published in the journal PeerJ.


News Article | April 19, 2017
Site: www.newscientist.com

Measuring neuron activity has revealed that psychedelic drugs really do alter the state of the brain, creating a different kind of consciousness. “We see an increase in the diversity of signals from the brain,” says Anil Seth, at the University of Sussex, UK. “The brain is more complex in its activity.” Seth and his team discovered this by re-analysing data previously collected by researchers at Imperial College London. Robin Carhart-Harris and his colleagues had monitored brain activity in 19 volunteers who had taken ketamine, 15 who had had LSD, and 14 who were under the influence of psilocybin, a hallucinogenic compound in magic mushrooms. Carhart-Harris’s team used sets of sensors attached to the skull to measure the magnetic fields produced by these volunteers’ neurons, and compared these to when each person took a placebo. “We took the activity data, cleaned it up then chopped it into 2-second chunks,” says Seth, whose team worked with Carhart-Harris on the re-analysis. “For each chunk, we could calculate a measure of diversity.” Previous work had shown that people in a state of wakefulness have more diverse patterns of brain activity than people who are asleep. Seth’s team has found that people who have taken psychedelic drugs show even more diversity – the highest level ever measured. These patterns of very high diversity coincided with the volunteers reporting “ego-dissolution” – a feeling that the boundaries between oneself and the world have been blurred. The degree of diversity was also linked to more vivid experiences. There’s mounting evidence that psychedelic drugs may help people with depression in ways that other treatments can’t. Some benefits have already been seen with LSD, ketamine, psilocybin, and ayahuasca, a potion used in South America during religious rites. “I think there’s an awful lot of potential here,” says Seth. “If you suddenly see things in a different way, it could give your outlook a jolt that existing antidepressants can’t because they work on the routine, wakeful state.”
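The reported link between how intense a trip felt and how much signal diversity changed is, at heart, a correlation computed across volunteers. The sketch below shows one way such a check could look; the numbers are invented and the choice of Spearman's rank correlation is an illustrative assumption, not necessarily the statistic the researchers used.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-volunteer values standing in for the real measurements:
# a subjective intensity rating and the change in signal diversity (drug minus placebo).
intensity = np.array([2.1, 4.5, 3.2, 5.0, 1.8, 3.9, 4.2, 2.7, 4.8, 3.1])
diversity_change = np.array([0.01, 0.05, 0.03, 0.06, 0.00, 0.04, 0.04, 0.02, 0.05, 0.02])

rho, p = spearmanr(intensity, diversity_change)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```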


News Article | April 26, 2017
Site: www.eurekalert.org

Sophia Antipolis, April 26, 2017. The impact of overtraining on the heart is set to be discussed at Europe's leading cardiovascular magnetic resonance meeting, to be held 25 to 27 May in Prague, Czech Republic, at the Clarion Congress Hotel Prague (CCHP). EuroCMR is the largest and most important cardiovascular magnetic resonance (CMR) event in Europe. It is the annual CMR conference of the European Association of Cardiovascular Imaging (EACVI), a registered branch of the European Society of Cardiology (ESC). The scientific programme is available online. "Journalists can get the latest research findings on how athletes adapt to exercise and whether it's good or bad for them," said Professor James Moon, programme chair of EuroCMR. "There is a controversial suggestion that overtraining, especially if you're over 40, is not necessarily good for your heart." He continued: "We will also hear from Pierre Croisille from Jean Monnet University, Saint-Étienne, France, who scanned ultramarathon runners during the Tor des Géants, their 300 km race. The marathon is a model of training and adaptation - and the ultramarathon is more extreme - it's almost a model of dying in intensive care as your body becomes really inflamed, so we have an opportunity to understand the heart at the limits." EuroCMR is Europe's leading clinical CMR meeting. More than 1,000 participants from over 60 countries are expected to attend the 2.5-day congress, which is packed with state-of-the-art sessions led by international experts and new scientific research in the abstract programme. A future of universal genetic testing followed by magnetic resonance imaging (MRI) will be outlined in a keynote lecture by Dudley Pennell from the Royal Brompton Hospital in London, UK. Professor Moon said: "The genetics will tell you what diseases you might get - for example heart muscle disease - and the MRI will show where you are in the progression, whether or not you have to change things now or even apply invasive procedures, for example electrophysiology." Members of the press will learn about novel MRI scanners capable of reconstructing images using Microsoft Cloud. Juliano Fernandez from Brazil will reveal how a five-minute scan he is testing in 23 countries could bring MRI to less developed countries. MRI has been contraindicated in patients with pacemakers and implantable cardioverter defibrillators (ICDs) but new research suggests that it may be safe and experts will present the current evidence. "We got this quite wrong and journalists can get the full story in a dedicated session," said Professor Moon. "We're moving towards a world where pacemakers are not a contraindication to MRI which is really important because everyone can get brain, spine or cancer scans they need." Adam Timmis from Barts Heart Centre and the National Institute for Health and Care Excellence (NICE), UK, will discuss the role of computed tomography (CT) scans for patients with chest pain, rather than MRI, echocardiography or electrocardiogram (ECG). Professor Moon said: "The UK is heading towards CT scanning first line for chest pain and the other modalities for refined diagnosis once coronary disease is present. This is a major change in direction. NICE is very credible, so the approach should be considered across Europe." Media representatives can get both sides of the story in a debate on when to operate in patients with valve disease.
"Cardiologists replace a valve when it's very narrow but may not think about whether or not the heart muscle is coping," said Professor Moon. "It's a bit like performing a liver transplant in a heavy drinker without checking liver function. In the future we will measure how the heart is responding to the diseased valve using MRI and blood tests and then decide whether or not surgery is needed." 3D printing is a hot topic and journalists can have a close look at how imaging and printing the heart and vessels is helping plan operations for our sickest children and adults at Great Ormond Street Hospital, London and elsewhere. Graham Cole from Imperial College London will show where doctors got it wrong in imaging and how to avoid mistakes as researchers develop new treatments for the future. Professor Moon said: "MRI has transformed neurology and our understanding of the body. It's taken us longer in cardiology because the heart moves constantly. We have sorted those problems out and we are able to diagnose patients with cardiovascular disease earlier, tailor the treatment to the individual, and measure the effect of that treatment. Members of the press should register now for this exciting event."


News Article | May 2, 2017
Site: www.sciencenews.org

It won’t be a tsunami. Nor an earthquake. Not even the crushing impact of the space rock. No, if an asteroid kills you, gusting winds and shock waves from falling and exploding space rocks will most likely be to blame. That’s one of the conclusions of a recent computer simulation effort that investigated the fatality risks of more than a million possible asteroid impacts. In one extreme scenario, a simulated 200-meter-wide space rock whizzing 20 kilometers per second whacked London, killing more than 8.7 million people. Nearly three-quarters of that doomsday scenario’s lethality came from winds and shock waves, planetary scientist Clemens Rumpf and colleagues report online March 27 in Meteoritics & Planetary Science. In a separate report, the researchers looked at 1.2 million potential impactors up to 400 meters across striking around the globe. Winds and shock waves caused about 60 percent of the total deaths from all the asteroids, the team’s simulations showed. Impact-generated tsunamis, which many previous studies suggested would be the top killer, accounted for only around one-fifth of the deaths, Rumpf and colleagues report online April 19 in Geophysical Research Letters. “These asteroids aren’t an everyday concern, but the consequences can be severe,” says Rumpf, of the University of Southampton in England. Even asteroids that explode before reaching Earth’s surface can generate high-speed wind gusts, shock waves of pressure in the atmosphere and intense heat. Those rocks big enough to survive the descent pose even more hazards, spawning earthquakes, tsunamis, flying debris and, of course, gaping craters. While previous studies typically considered each of these mechanisms individually, Rumpf and colleagues assembled the first assessment of the relative deadliness of the various effects of such impacts. The estimated hazard posed by each effect could one day help leaders make one of the hardest calls imaginable: whether to deflect an asteroid or let it hit, says Steve Chesley, a planetary scientist at NASA’s Jet Propulsion Laboratory in Pasadena, Calif., who was not involved with either study. The 1.2 million simulated impactors each fell into one of 50,000 scenarios, which varied in location, speed and angle of strike. Each scenario was run with 24 different asteroid sizes, ranging from 15 to 400 meters across. Asteroids in nearly 36,000 of the scenarios, or around 72 percent, descended over water. The deadliness assessment began with a map of human populations and numerical simulations of the energies unleashed by falling asteroids. Those energies were then used alongside existing casualty data from studies of extreme weather and nuclear blasts to calculate the deadliness of the asteroids’ effects at different distances. Rumpf and his team focused on short-term impact effects, rather than long-term consequences such as climate change triggered by dust blown into the atmosphere. While the most deadly impact killed around 117 million people, many asteroids posed no threat at all, the simulations revealed. More than half of asteroids smaller than 60 meters across — and all asteroids smaller than 18 meters across — caused zero deaths.
Rocks smaller than 56 meters wide didn’t even make it to Earth’s surface before exploding in an airburst. Those explosions could still be deadly, though, generating intense heat that burns skin, high-speed winds that hurl debris and pressure waves that rupture internal organs, the team found. Tsunamis became the dominant killer for water impacts, accounting for around 70 to 80 percent of the total deaths from each impact. Even with the tsunamis, though, water impacts were only a fraction as deadly on average as land-hitting counterparts. That’s because impact-generated tsunamis are relatively small and quickly lose steam as they traverse the ocean, the researchers found. Land impacts, on the other hand, cause considerable fatalities through heat, wind and shock waves and are more likely to hit near large population centers. For all asteroids big enough to hit the land or water surface, heat, wind and shock waves continued to cause the most casualties overall. Land-based effects, such as earthquakes and blast debris, resulted in less than 2 percent of total deaths. Deadly asteroid impacts are rare, though, Rumpf says. Most space rocks bombarding Earth are tiny and harmlessly burn up in the atmosphere. Bigger meteors such as the 20-meter-wide rock that lit up the sky and shattered windows around the Russian city of Chelyabinsk in 2013 only frequent Earth about once a century (SN Online: 2/15/13). Impacts capable of inducing extinctions, like the at least 10-kilometer-wide impactor blamed for the end of the dinosaurs 66 million years ago (SN: 2/4/17, p. 16), are even rarer, striking Earth roughly every 100 million years. But asteroid impacts are scary enough that today’s astronomers scan the sky with automated telescopes scouting for potential impactors. So far, they’ve cataloged 27 percent of space rocks 140 meters or larger estimated to be whizzing through the solar system. Other scientists are crunching the numbers on ways to divert an earthbound asteroid. Proposals include whacking the asteroid like a billiard ball with a high-speed spacecraft or frying part of the asteroid’s surface with a nearby nuclear blast so that the vaporized material propels the asteroid away like a jet engine. The recent research could offer guidance on how people should react to an oncoming impactor: whether to evacuate or shelter in place, or to scramble to divert the asteroid. “If the asteroid’s in a size range where the damage will be from shock waves or wind, you can easily shelter in place a large population,” Chesley says. But if the heat generated as the asteroid falls, impacts or explodes “becomes a bigger threat, and you run the risk of fires, then that changes the response of emergency planners,” he says. Making those tough decisions will require more information about compositions and structures of the asteroids themselves, says Lindley Johnson, who serves as the planetary defense officer for NASA in Washington, D.C. Those properties in part determine an asteroid’s potential devastation, and the team didn’t consider how those characteristics might vary, Johnson says. Several asteroid-bound missions are planned to answer such questions, though the recent White House budget proposal would defund a NASA project to reroute an asteroid into the moon’s orbit and send astronauts to study it (SN Online: 3/16/17). In the case of a potential impact, making decisions based on the average deaths presented in the new study could be misleading, warns Gareth Collins, a planetary scientist at Imperial College London. 
A 60-meter-wide impactor, for instance, caused on average about 6,300 deaths in the simulations. Just a handful of high-fatality events inflated that average, though, including one scenario that resulted in more than 12 million casualties. In fact, most impactors of that size struck away from population centers and killed no one. “You have to put it in perspective,” Collins says.
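The overall shape of such a simulation - sample an impactor, convert its size and speed to energy, turn that energy into a damage footprint and multiply by how many people live there - can be sketched in a few lines. Everything numerical below (the assumed rock density, the cube-root blast scaling, the uniform size and speed distributions, the flat population density) is a crude illustrative assumption, not the casualty model used by Rumpf and colleagues.

```python
import numpy as np

rng = np.random.default_rng(42)

def impact_energy_joules(diameter_m, speed_m_s, density_kg_m3=3000.0):
    """Kinetic energy of a spherical impactor (density is an assumed typical value)."""
    mass = density_kg_m3 * (np.pi / 6.0) * diameter_m ** 3
    return 0.5 * mass * speed_m_s ** 2

def blast_radius_km(energy_joules):
    """Very rough 'severe wind/shock damage' radius: cube-root scaling with yield,
    anchored so ~1 megaton TNT gives a few kilometres. Purely illustrative."""
    megatons = energy_joules / 4.184e15
    return 3.0 * megatons ** (1.0 / 3.0)

n = 100_000
diameters = rng.uniform(15.0, 400.0, n)       # metres, the size range quoted in the article
speeds = rng.uniform(11_000.0, 30_000.0, n)   # m/s, assumed plausible entry speeds
density_people_km2 = 60.0                      # assumed flat exposure, for illustration only

energy = impact_energy_joules(diameters, speeds)
exposed = np.pi * blast_radius_km(energy) ** 2 * density_people_km2

print(f"median people within the rough wind/shock footprint: {np.median(exposed):,.0f}")
print(f"worst case in this toy run:                          {exposed.max():,.0f}")
```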


News Article | April 17, 2017
Site: www.bbc.co.uk

The UK has now started the formal process of leaving the EU, but scientists say they have evidence of a much earlier "Brexit". They have worked out how a thin strip of land that once connected ancient Britain to Europe was destroyed. The researchers believe a large lake overflowed 450,000 years ago, damaging the land link, then a later flood fully opened the Dover Strait. The scars of these events can be found on the seabed of the English Channel. The study is published in the journal Nature Communications. Professor Sanjeev Gupta, who led the study, from Imperial College London, said: "This was really one of the defining events for north west Europe - and certainly the defining event in Britain's history. "This chance geological event, if it hadn't happened, would have meant Britain was always connected to the continent." More than half a million years ago, in the midst of an Ice Age, a land bridge connected Dover in the South of England to Calais in northern France. Immediately to the north of it, was a huge glacial lake, which had formed at the edge of an ice sheet that covered much of Europe. The researchers believe that this lake started to overflow, sending vast amounts of water crashing over the land bridge. The evidence for this was found at the bottom of the English Channel. Decades ago, engineers who were surveying the seabed for the Channel Tunnel, discovered a series of mysterious large underwater holes. Now further scrutiny has revealed that they were most likely caused by the lake overspill. Prof Gupta said: "These holes are now in-filled with sediment, but what's interesting is that they are not linear features like canyons or valleys - they are isolated depressions. "And they occur in a line - a whole series of them stretching between Dover and Calais. And they are huge, 100m-deep carved into the bedrock and hundreds of metres to several kilometres in diameter. "So we interpret these as giant plunge pools. We think there was basically lake water plunging over this rock ridge in the Dover Strait through a whole series of waterfalls, which then eroded and carved out these depressions. "It's difficult to explain them by any other mechanism." The researchers believe the lake started to overflow about 450,000 years ago, which would have seriously weakened the land bridge. But they think a second catastrophic flood that took place about 150,000 years ago would have destroyed it altogether. "We see this huge valley carved through the strait, about eight to 10km wide... and it has a lot of features that are suggestive of flood erosion," said Prof Gupta. Co-author Jenny Collier, also from Imperial College London, said it was not clear what caused either of these events. She said: "Perhaps part of the ice sheet broke off, collapsing into the lake, causing a surge that carved a path for the water to cascade off the chalk ridge. "In terms of the catastrophic failure of the ridge, maybe an earth tremor, which is still characteristic of this region today, further weakened the ridge. "This may have caused the chalk ridge to collapse, releasing the megaflood that we have found evidence for in our studies." The researchers would now like to work out more precise timings of the "geological Brexit". This would mean drilling into the bottom of the Dover Strait and analysing the age of the sediment. "But that would be a huge undertaking," admitted Prof Gupta. "The English Channel is the world's busiest shipping lane and it has huge tidal currents. It will be hugely challenging."


News Article | May 4, 2017
Site: www.chromatographytechniques.com

Synthetic biologists from Imperial College London have re-engineered yeast cells to manufacture the nonribosomal peptide antibiotic penicillin. In laboratory experiments, they were able to demonstrate that this yeast had antibacterial properties against streptococcus bacteria. The authors of the study, which is published in the journal Nature Communications, say their new method demonstrates the effectiveness of using this kind of synthetic biology as a route for discovering new antibiotics. This could open up possibilities for using re-engineered yeast cells to develop new forms of antibiotics and anti-inflammatory drugs from the nonribosomal peptide family. Nonribosomal peptides are normally produced by bacteria and fungi, forming the basis of most antibiotics today. Pharmaceutical companies have long experimented with nonribosomal peptides to make conventional antibiotics. The rise of antimicrobial resistance means there is a need to use genetic engineering techniques to find a new range of antibiotics from bacteria and fungi. However, genetically engineering the more exotic fungi and bacteria - the ones likely to have antibacterial properties - is challenging because scientists don't have the right tools, and these organisms are difficult to grow in a lab environment, requiring special conditions. Baker's yeast, on the other hand, is easy to genetically engineer. Scientists can simply insert DNA from bacteria and fungi into yeast to carry out experiments, offering a viable new host for antibiotic production research. The rise of synthetic biology methods for yeast will allow researchers to make and test many new gene combinations that could produce a whole range of new antibiotics. However, the authors are keen to point out that the research is still in its early stages. While this approach does show promise, they have so far produced the nonribosomal peptide antibiotic penicillin only in small quantities. More research needs to be done to see if it can be adapted to finding other compounds and to get production up to commercially viable quantities. "Humans have been experimenting with yeast for thousands of years. From brewing beer to getting our bread to rise, and more recently for making compounds like anti-malarial drugs, yeast is the microscopic workhorse behind many processes," explained Tom Ellis, from the Centre for Synthetic Biology at Imperial College London. "The rise of drug-resistant superbugs has brought a real urgency to our search for new antibiotics. Our experiments show that yeast can be engineered to produce a well-known antibiotic. This opens up the possibility of using yeast to explore the largely untapped treasure trove of compounds in the nonribosomal peptide family to develop a new generation of antibiotics and anti-inflammatories." Previously, scientists have demonstrated that they could re-engineer a different yeast to make penicillin. However, that species of yeast is not as well understood or as amenable to genetic manipulation as baker's yeast, used by the authors in today's study, making it less suitable for the development of novel antibiotics using synthetic biology. In their experiments, the team used genes from the filamentous fungus from which the nonribosomal peptide penicillin is naturally derived. These genes caused the yeast cells to produce the nonribosomal peptide penicillin via a two-step biochemical reaction process.
First the cells made the nonribosomal peptide base -- the 'backbone' molecule -- by a complex reaction, and then this was modified by a set of further fungal enzymes that turn it into the active antibiotic. During the experimentation process, the team discovered that they didn't need to extract the penicillin molecules from inside the yeast cell. Instead, the cell was expelling the molecules directly into the solution it was in. This meant that the team simply had to add the solution to a petri-dish containing streptococcus bacteria to observe its effectiveness. In the future, this approach could greatly simplify the molecule testing and manufacturing process. "Fungi have had millions of years to evolve the capability to produce bacteria-killing penicillin. We scientists have only been working with yeast in this context for a handful of years, but now that we've developed the blueprint for coaxing yeast to make penicillin, we are confident we can further refine this method to create novel drugs in the future," said Ali Awan, co-author from the Department of Bioengineering at Imperial College London. "We believe yeast could be the new mini-factories of the future, helping us to experiment with new compounds in the nonribosomal peptide family to develop drugs that counter antimicrobial resistance." The team is currently looking for fresh sources of funding and new industrial collaborators to take their research to the next level. "Penicillin was first discovered by Sir Alexander Fleming at St Mary's Hospital Medical School, which is now part of Imperial. He also predicted the rise of antibiotic resistance soon after making his discovery. We hope, in some small way, to build on his legacy, collaborating with industry and academia to develop the next generation of antibiotics using synthetic biology techniques," added Ellis.


News Article | May 4, 2017
Site: www.eurekalert.org

Scientists are closer to understanding the genetic causes of type 2 diabetes by identifying 111 new chromosome locations ('loci') on the human genome that indicate susceptibility to the disease, according to a UCL-led study in collaboration with Imperial College London. Type 2 diabetes is the world's most widespread and devastating metabolic disorder, and previously only 76 loci were known and studied. Very few of these loci are found in the African American population, where the prevalence of type 2 diabetes is almost twice that in the European American population (19% vs. 10%). Of the additional 111 loci identified by the team, 93 (84%) are found in both African American and European populations and only 18 are European-specific. The study, published today in the American Journal of Human Genetics, used a method developed at UCL based on highly informative genetic maps to investigate complex disorders such as type 2 diabetes. European and African American sample populations comprising 5,800 type 2 diabetes case subjects and 9,691 control subjects were analysed, revealing multiple type 2 diabetes loci at regulatory hotspots across the genome. "No disease with a genetic predisposition has been more intensely investigated than type 2 diabetes. We've proven the benefits of gene mapping to identify hundreds of locations where causal mutations might be across many populations, including African Americans. This provides a larger number of characterised loci for scientists to study and will allow us to build a more detailed picture of the genetic architecture of type 2 diabetes," explained lead author, Dr Nikolas Maniatis (UCL Genetics, Evolution & Environment). "Before we can conduct the functional studies required in order to better understand the molecular basis of this disease, we first need to identify as many plausible candidate loci as possible. Genetic maps are key to this task, by integrating the cross-platform genomic data in a biologically meaningful way," added co-lead author, Dr Toby Andrew (ICL, Department of Genomics of Common Disease). The team discovered that the additional 111 loci and previously known 76 loci regulate the expression of at least 266 genes that neighbour the identified disease loci. The vast majority of these loci were found outside of gene coding regions but coincided with regulatory 'hotspots' that alter the expression of these genes in body fat. They are currently investigating whether these loci alter the expression of the same genes in other tissues such as the pancreas, liver and skeletal muscle that are also relevant to type 2 diabetes. Three loci present in African American and European populations were analysed further using deep sequencing in an independent sample of 94 European patients with type 2 diabetes and 94 control subjects in order to identify genetic mutations that cause the disease. The team found that all three loci overlapped with areas of the chromosome containing multiple regulatory elements and epigenetic markers along with candidate causal mutations for type 2 diabetes that can be further investigated. "Our results mean that we can now target the remaining loci on the genetic maps with deep sequencing to try and find the causal mutations within them.
We are also very excited that most of the identified disease loci appear to confer risk of disease in diverse populations such as African Americans, implying our findings are likely to be universally applicable and not just confined to Europeans," added Dr Winston Lau (UCL Genetics, Evolution & Environment). "We are now in a strong position to build upon these genomic results, and we can apply the same methods to other complex diseases such as Alzheimer's disease," concluded Dr Maniatis.


News Article | May 5, 2017
Site: www.theguardian.com

An “emotional chatting machine” has been developed by scientists, signalling the approach of an era in which human-robot interactions are seamless and go beyond the purely functional. The chatbot, developed by a Chinese team, is seen as a significant step towards the goal of developing emotionally sophisticated robots. The ECM, as it is known for short, was able to produce factually coherent answers whilst also imbuing its conversation with emotions such as happiness, sadness or disgust. Prof Björn Schuller, a computer scientist at Imperial College London who was not involved in the latest advance, described the work as “an important step” towards personal assistants that could read the emotional undercurrent of a conversation and respond with something akin to empathy. “This will be the next generation of intelligence to be met in daily experience, sooner rather than later,” he said. The paper found that 61% of humans who tested the machine preferred the emotional versions to the neutral chatbot. Similar results have been found in so-called “Wizard of Oz” studies in which a human typing responses masquerades as advanced AI. “It is not a question whether they are desirable – they clearly are – but in which applications they make sense and where they don’t,” said Schuller. Minlie Huang, a computer scientist at Tsinghua University in Beijing and a co-author of the study, said: “We’re still far away from a machine that can fully understand the user’s emotion. This is just the first attempt at this problem.” Huang and colleagues started by creating an “emotion classifying” algorithm that learned to detect emotion from 23,000 posts taken from the Chinese social media site Weibo. The posts had been manually classified by humans as sad, happy and so on. The emotion classifier was then used to tag millions of social media interactions according to emotional content. This huge dataset served as a training ground for the chatbot to learn both how to answer questions and how to express emotion. The resulting program could be switched into five possible modes – happy, sad, angry, disgusted, liking – depending on the user’s preference. In one example conversation, a user typed in: “Worst day ever. I arrived late because of the traffic.” In neutral mode, the chatbot droned: “You were late”. Alternative responses were: “Sometimes life just sucks!” (disgust mode), “I am always here to support you” (liking) or “Keep smiling! Things will get better” (happy – or, some might say, annoyingly chipper). In the future, the team predict the software could also learn the appropriate emotion to express at a given time. “It could be mostly empathic,” said Huang, adding that a challenge would be to avoid the chatbot reinforcing negative feelings such as rage. Until recently chatbots were widely regarded as a sideshow to more serious attempts at tackling machine intelligence. A chatbot known as Eugene Goostman managed to convince some judges they were talking to a human – but only by posing as a 13-year-old Ukrainian boy with a limited grasp of English. Microsoft’s disastrous chatbot Tay was supposed to learn to chat from Twitter interactions, but was terminated after becoming a genocide-supporting Nazi less than 24 hours after being let loose on the internet. The latest study shows that chatbots, driven by a machine learning approach, are starting to make significant headway. Sandra Wachter, a computer scientist at the Oxford Internet Institute, said that in future such algorithms are likely to be personalised.
“Some of us prefer a tough-love pep talk, others prefer someone to rant with,” she said. “Humans often struggle with appropriate responses because of the complexity of emotions, so building technologies that could decipher accurately our ‘emotional code’ would be very impressive.” As the stilted computer interactions of today are replaced by something approaching friendly chit-chat, new risks could be encountered. One concern is the potential for technology designed to seduce the user into sharing sensitive personal data. “It could be that children share insights with their ‘artificial friends’ and this data might be stored,” said Wachter. “What if we were to find out that people are more likely to buy more products when they are angry, sad, or bored? The ability to detect these emotions and successfully manipulate them could be a very interesting tool for companies.” There is also the potential for users to become emotionally dependent, or even romantically involved, with their computers. “However, there is also a huge potential for good, such as existing software to teach children on the autism spectrum [about] emotional and social interaction,” said Schuller. “One has to carefully balance benefits and risks and ensure the best exploitation.”
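To make the two-stage pipeline described above concrete, the sketch below mimics it in miniature: a classifier is trained on a handful of hand-labelled posts (standing in for the 23,000 annotated Weibo posts), used to tag unlabelled text, and a reply is then produced in whichever emotion mode the user selects. It is a toy illustration only, not the ECM's actual sequence-to-sequence model; all posts, labels and canned responses below are invented, and scikit-learn is used purely for convenience.

```python
# Toy sketch of the two-stage pipeline described in the article (not the ECM's
# actual architecture): 1) train an emotion classifier on a small labelled
# corpus, 2) use it to tag unlabelled posts, 3) reply in a user-chosen emotion
# mode. All data, labels and responses here are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# 1) Tiny hand-labelled corpus standing in for the 23,000 annotated posts.
labelled_posts = [
    ("I missed my train and got soaked in the rain", "sad"),
    ("Best concert of my life, still buzzing!", "happy"),
    ("The service in that restaurant was revolting", "disgust"),
    ("They cancelled my flight with no warning", "angry"),
    ("I really like how friendly everyone was today", "liking"),
]
texts, labels = zip(*labelled_posts)

vectoriser = TfidfVectorizer()
classifier = LogisticRegression(max_iter=1000)
classifier.fit(vectoriser.fit_transform(texts), labels)

# 2) Tag a larger unlabelled corpus so it could later serve as training data.
unlabelled = ["Worst day ever. I arrived late because of the traffic."]
tags = classifier.predict(vectoriser.transform(unlabelled))

# 3) Respond in the emotion mode selected by the user (canned templates here;
# the real system generates its replies with a neural network).
MODE_RESPONSES = {
    "neutral": "You were late.",
    "disgust": "Sometimes life just sucks!",
    "liking":  "I am always here to support you.",
    "happy":   "Keep smiling! Things will get better.",
    "sad":     "That sounds really rough.",          # invented
    "angry":   "Traffic like that would drive anyone mad.",  # invented
}

def reply(user_input: str, mode: str = "neutral") -> str:
    detected = classifier.predict(vectoriser.transform([user_input]))[0]
    print(f"detected user emotion: {detected}")
    return MODE_RESPONSES.get(mode, MODE_RESPONSES["neutral"])

print(reply("Worst day ever. I arrived late because of the traffic.", mode="happy"))
```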


News Article | May 4, 2017
Site: www.bbc.co.uk

Pioneering work to extract detailed information from audio recordings of gunshots could give forensic case officers new avenues for solving murder cases. The hustle and bustle of a city going about its business is broken by the crack of gunshots, sending bystanders running and screaming. In the aftermath one man lies dead and another badly injured. Further down the street, four security cameras outside a local resident's home picked up the sound of the exchange of fire between the two men, but no images of what happened. Eyewitnesses reported seeing the pair standing just a few metres apart firing handguns at each other, but it is unclear which of the two perpetrators shot first. In an attempt to unravel what happened, local police called Robert Maher, a professor in electrical and computer engineering at Montana State University. Using audio captured by the microphones on the security cameras, he was able to reconstruct the incident shot-by-shot to reveal where each of the men was standing and who fired first. Prof Maher is one of a small group of acoustics experts working to establish a new field of forensics that examines the sound of gunshots recorded on camera footage or by phones. "Nowadays it is not uncommon for someone with a cell phone to be making a video at the time of a gunfire incident," he explains. "The most common types of recordings are from dashboard cameras or vest-mounted cameras carried by law enforcement officers. "Also common are recordings from emergency telephone call centres, where the calls are being recorded and the caller's phone picks up a gunshot sound. In some cases there are private surveillance systems at homes and businesses that include audio recordings." Gunshots make a distinctive sound that makes them easy to distinguish from noises they are commonly mistaken for, such as a car backfiring or fireworks. A firearm produces an abrupt blast of intense noise from the muzzle that lasts just one or two thousandths of a second before disappearing again. High-powered rifles also produce an additional sonic boom as the bullet breaks the sound barrier, and this shock wave can be detected before the sound of the muzzle blast. Most of us spend our lives surrounded by devices capable of capturing these sounds inadvertently if a crime occurs nearby. Professor Maher's aim is to extract details from these recordings that might help police piece together a crime. Together with his colleagues, he has been compiling a database of firearm sounds in a project funded by the US National Institute of Justice. They are firing a range of rifles, shotguns, semiautomatic pistols and revolvers beside an array of 12 microphones arranged in a semicircle. Each of the guns sounds broadly similar to the human ear when fired on an open range, but using software to analyse the sound waves picked up by the microphones, they have found it is possible to distinguish different types of weapon. "We observe differences between pistols with differing calibre and barrel length for example," says Professor Maher. "Revolvers differ from pistols because sound can emanate from the gap between the revolver cylinder and the gun barrel, causing two sound sources that can be detected at certain angles." His analysis has also revealed other details can be gleaned from recordings of gunfire. The shape of the sound wave produced by a gunshot, for example, is different depending on which way the weapon is pointing.
If the microphone is off to one side of the shooter, the split-second burst of noise can sound different compared to when it is in front of or behind the gun. They have also found it is possible to pick up distinct echoes as the initial sound produced by a gunshot reverberates off nearby buildings, parked cars, trees and walls. After the initial blast, other smaller blips in the sound wave can be seen within a fraction of a second of the shot. By calculating the time it takes for sound to travel to and from an obstacle, it is possible to calculate how far a shooter was away from it. It can even reveal whether a shooter was firing from an elevated position, from the way the muzzle blast reflects off the ground. "This means the orientation and location of shooters in some circumstances can be determined," according to Prof Maher, who revealed some of his findings to a symposium organised by the National Institute of Justice in New Orleans last month. "In situations where more than one recording of the shooting scene is available, such as where two or more patrol cars equipped with dashboard audio or video recorders are present at an incident, the position of the vehicles can sometimes help triangulate the sounds." It is a similar concept to the one used by companies like Raytheon, which produces sniper locators for the military that use the sound of a gunshot to locate the shooter. An array of microphones can be mounted on buildings, vehicles or helicopters to help spot shots. Another firm, Shotspotter, uses a network of microphones across 90 cities in the US to help law enforcement detect gunshots. The difference with these systems is that they detect gunfire in real time, while Professor Maher is trying to piece together what happened days, weeks and even months after a shooting. In the case described at the start of this article - a real shooting that occurred recently in Cincinnati, Ohio - the injured man claimed he had shot the other man dead in self defence after he was fired at first. With the two gunshots occurring less than a second apart, it was impossible for witnesses to say definitively which of the shooters had fired first. Using the security camera recordings of a local home owner living further down the street, however, Professor Maher was able to reveal two distinct gunshots in the audio. Just a few milliseconds after the first gunshot, a distinct second blip appeared in the sound wave, moments before the second shot was fired. This blip was the echo of the first gunshot bouncing off a large building at a T-junction around 90 metres to the north. The echo from the second gunshot was far harder to spot in the sound waves produced. According to Professor Maher, this suggests the first shooter to fire their gun was the one pointing it to the north - the same man who claimed he had been firing in self-defence. Professor Maher hopes the growing amount of technology capable of recording audio will make such analysis even easier in the future. The microphones on many older consumer devices are not designed to handle the abrupt, loud sounds of gunshots, which can overload them. But as more homes become equipped with home security cameras and "always-on" smart assistants like Amazon's Echo and Google Home, it may be possible to capture better audio of events. It is something that other forensics experts believe could have a growing role in the future.
Mike Brookes, a reader in communications and signal processing at Imperial College London, said: "The sort of questions that such recordings can help with are in sorting out the timing and sequence of events that took place and in establishing the position from which a gun was fired."
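The echo-timing reasoning in the Cincinnati example comes down to simple distance-over-speed arithmetic. The sketch below works through it with straight-line paths, a nominal speed of sound of 343 m/s and made-up distances; it illustrates the geometry only and is not Prof Maher's analysis code.

```python
# Back-of-the-envelope geometry behind the echo analysis described above.
# Assumes sound travels at ~343 m/s (dry air, ~20 C); all distances are
# hypothetical and paths are simplified to straight lines.
SPEED_OF_SOUND = 343.0  # metres per second

def echo_delay(shooter_to_mic_m, shooter_to_wall_m, wall_to_mic_m,
               c=SPEED_OF_SOUND):
    """Time gap between the direct muzzle blast and its reflection
    arriving at the microphone."""
    direct = shooter_to_mic_m / c
    reflected = (shooter_to_wall_m + wall_to_mic_m) / c
    return reflected - direct

def distance_to_reflector(delay_s, c=SPEED_OF_SOUND):
    """Rough shooter-to-surface distance when the microphone sits close to
    the shooter, so the extra path is about twice that distance."""
    return c * delay_s / 2.0

# Example with made-up numbers: microphone 40 m from the shooter, who is
# firing towards a building 90 m away on the far side from the microphone.
gap = echo_delay(shooter_to_mic_m=40.0, shooter_to_wall_m=90.0, wall_to_mic_m=130.0)
print(f"echo arrives {gap:.3f} s after the direct sound")
print(f"inferred reflector distance: {distance_to_reflector(gap):.1f} m")
```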


News Article | May 8, 2017
Site: news.yahoo.com

About 160 million years ago, a gigantic, long-necked dinosaur — the earliest known titanosaur on record — swooped its lengthy neck to and fro as it foraged for a leafy meal in Jurassic-era France, a new study finds. The newly identified dinosaur was immense: It weighed about 33,000 lbs. (15,000 kilograms), about equivalent to the weight of a garbage truck, and measured more than 50 feet (15 meters) long, or longer than a standard yellow school bus, the researchers said. They named the newfound beast Vouivria damparisensis after the Old French word "vouivre," which is based on the Latin word for viper. The name is also tied to folk history: "La vouivre" is a legendary winged reptile in the region of Franche-Comté, where the fossils were found. The species name honors the village Damparis, where researchers found the specimen in the 1930s. "Vouivria would have been a herbivore, eating all kinds of vegetation, such as ferns and conifers," the study's lead researcher, Philip Mannion, a faculty member in the Department of Earth Science and Engineering at Imperial College London, said in a statement. "This creature lived in the late Jurassic, around 160 million years ago, at a time when Europe was a series of islands." An anatomical analysis revealed that V. damparisensis is the oldest known brachiosaurid, a type of titanosauriform dinosaur. Titanosauriforms were a diverse group of sauropods (enormous four-legged, long-necked and long-tailed dinosaurs) that lived from the late Jurassic to the end of the Cretaceous period. For instance, Brachiosaurus, a dinosaur with a giraffe-like neck, was a titanosauriform that lived during the Jurassic period. V. damparisensis likely died in a coastal lagoon, when sea levels were briefly lower than usual, the researchers said. The dinosaur's remains were probably buried when sea levels rose again, which would explain why the animal was found buried in rocks that were from a coastal environment, the researchers said. "We don't know what this creature died from, but millions of years later, it is providing important evidence to help us understand in more detail the evolution of brachiosaurid sauropods and a much bigger group of dinosaurs that they belonged to, called titanosauriforms," Mannion said. When researchers discovered V. damparisensis in 1934, it barely received any scientific attention, the researchers said. Instead, paleontologists stored the specimen at the National Museum of Natural History in Paris, and it was only briefly mentioned throughout the years as "the Damparis dinosaur." Now that the specimen has been examined, V. damparisensis will help scientists understand the spread of early brachiosaurids and other titanosauriform dinosaurs across the world, the researchers said. Paleontologists have found other brachiosaurid remains in the United States, Western Europe and Africa, the researchers said. The study was published online May 2 in the journal PeerJ.


News Article | May 5, 2017
Site: www.chromatographytechniques.com

Led by biomedical engineers at Newcastle University and funded by the Engineering and Physical Sciences Research Council (EPSRC), the bionic hand is fitted with a camera that instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand. Bypassing the usual processes that require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb, the hand ‘sees’ and reacts in one fluid movement. A small number of amputees have already trialed the new technology and now the Newcastle University team is working with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the ‘hands with eyes’ to patients at Newcastle’s Freeman Hospital. The team published their findings in the Journal of Neural Engineering. “Prosthetic limbs have changed very little in the past 100 years – the design is much better and the materials are lighter weight and more durable, but they still work in the same way," explained Kianoush Nazarpour, co-author and senior lecturer in Biomedical Engineering at Newcastle University. “Using computer vision, we have developed a bionic hand which can respond automatically – in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction. “Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison. “Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking.” Recent statistics show that in the UK there are around 600 new upper-limb amputees every year, of whom 50 percent are aged between 15 and 54. In the U.S. there are 500,000 upper limb amputees a year. Current prosthetic hands are controlled via myoelectric signals – that is, the electrical activity of the muscles recorded from the skin surface of the stump. Controlling them, says Nazarpour, takes practice, concentration and, crucially, time. Using neural networks – the basis for artificial intelligence – lead author on the study Ghazal Ghazaei showed the computer numerous object images and taught it to recognize the ‘grip’ needed for different objects. “We would show the computer a picture of, for example, a stick,” explains Ghazaei, who carried out the work as part of her PhD in the School of Electrical and Electronic Engineering at Newcastle University. “But not just one picture, many images of the same stick from different angles and orientations, even in different light and against different backgrounds and eventually the computer learns what grasp it needs to pick that stick up. “So the computer isn’t just matching an image, it’s learning to recognize objects and group them according to the grasp type the hand has to perform to successfully pick it up. “It is this which enables it to accurately assess and pick up an object which it has never seen before – a huge step forward in the development of bionic limbs.” Grouping objects by size, shape and orientation, according to the type of grasp that would be needed to pick them up, the team programmed the hand to perform four different ‘grasps:’ palm wrist neutral (such as when you pick up a cup); palm wrist pronated (such as picking up the TV remote); tripod (thumb and two fingers) and pinch (thumb and first finger).
Using a 99p camera fitted to the prosthesis, the system ‘sees’ an object, picks the most appropriate grasp and sends a signal to the hand – all within a matter of milliseconds and 10 times faster than any other limb currently on the market. “One way would have been to create a photo database of every single object but clearly that would be a massive task and you would literally need every make of pen, toothbrush, shape of cup – the list is endless,” says Nazarpour. “The beauty of this system is that it’s much more flexible and the hand is able to pick up novel objects – which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before.” The work is part of a larger research project to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain. Led by Newcastle University and involving experts from the universities of Leeds, Essex, Keele, Southampton and Imperial College London, the aim is to develop novel electronic devices that connect to the forearm neural networks to allow two-way communications with the brain. Reminiscent of Luke Skywalker’s artificial hand, the electrodes in the bionic limb would wrap around the nerve endings in the arm. This would mean for the first time the brain could communicate directly with the prosthesis. The ‘hand that sees,’ explains Nazarpour, is an interim solution that will bridge the gap between current designs and the future. “It’s a stepping stone towards our ultimate goal,” he says. “But importantly, it’s cheap and it can be implemented soon because it doesn’t require new prosthetics – we can just adapt the ones we have.” Anne Ewing, advanced occupational therapist at Newcastle upon Tyne Hospitals NHS Foundation Trust, has been working with Nazarpour and his team. “I work with upper limb amputee patients which is extremely rewarding, varied and at times challenging,” she said. “We always strive to put the patient at the heart of everything we do and so make sure that any interventions are client centred to ensure patients’ individual goals are met either with a prosthesis or alternative method of carrying out a task. “This project in collaboration with Newcastle University has provided an exciting opportunity to help shape the future of upper limb prosthetics, working towards achieving patients’ prosthetic expectations and it is wonderful to have been involved." “For me it was literally a case of life or limb,” says Doug McIntosh, who lost his right arm in 1997 through cancer. “I had developed a rare form of cancer called epithelial sarcoma, which develops in the deep tissue under the skin, and the doctors had no choice but to amputate the limb to save my life. “Losing an arm and battling cancer with three young children was life changing. I left my job as a life support supervisor in the diving industry and spent a year fund-raising for cancer charities. “It was this and my family that motivated me and got me through the hardest times.” Since then, Doug has gone on to be an inspiration to amputees around the world. He became the first amputee to cycle from John O’Groats to Land’s End in 100 hours and to cycle around the coastline of Britain; he has also run three London Marathons, cycled the Dallaglio Flintoff Cycle Slam in 2012 and 2014, and in 2014 cycled with the British Lions Rugby Team to Murrayfield Rugby Stadium for the “Walking with Wounded” charity.
He is currently preparing to tackle Mont Ventoux this September – three cycle climbs in one day – for Cancer Research UK and Maggie’s Cancer Centres. Involved in the early trials of the first myoelectric prosthetic limbs, Doug has been working with the Newcastle team to trial the new hand that sees. “The problem is there’s nothing yet that really comes close to feeling like the real thing,” explains the father-of-three who lives in Westhill, Aberdeen with his wife of 32 years, Diane. “Some of the prosthetics look very realistic but they feel slow and clumsy when you have a working hand to compare them to. “In the end I found it easier just to do without and learn to adapt. When I do use a prosthesis I use a split hook which doesn’t look pretty but does the job.” But he says the new, responsive hand being developed in Newcastle is a ‘huge leap forward.’ “This offers for the first time a real alternative for upper limb amputees,” he says. “For me, one of the ways of dealing with the loss of my hand was to be very open about it and answer people’s questions. But not everyone wants that and so to have the option of a hand that not only looks realistic but also works like a real hand would be an amazing breakthrough and transform the recovery time – both physically and mentally – for many amputees.”


News Article | May 4, 2017
Site: www.theengineer.co.uk

Biomedical engineers from Newcastle University have developed a computer vision system for prosthetic hands, allowing users to grasp and interact with common objects. Current upper limb prosthetics that can grip are controlled by myoelectric signals from the muscles in the stump, but it’s a skill that takes patience and time to master. Funded by the EPSRC, the Newcastle team created a computer vision system that enables prosthetics to ‘see’ with the assistance of an off-the-shelf camera. The work appears in the Journal of Neural Engineering. “Responsiveness has been one of the main barriers to artificial limbs,” said Dr Kianoush Nazarpour, senior lecturer in Biomedical Engineering at Newcastle University. “For many amputees the reference point is their healthy arm or leg, so prosthetics seem slow and cumbersome in comparison.” “Using computer vision, we have developed a bionic hand which can respond automatically – in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.” The researchers trained the system using neural networks, showing it numerous pictures of various objects from multiple angles and in different light conditions. Over time, the AI learned which grasp pattern to use for different objects according to their shape, but without measuring specific dimensions or explicitly identifying them. Objects were categorised into four grasp classes: pinch, tripod, palmar wrist neutral and palmar wrist pronated. “The computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up,” said lead author Ghazal Ghazaei, who carried out the work as part of her PhD at Newcastle’s School of Electrical and Electronic Engineering. “It is this which enables it to accurately assess and pick up an object which it has never seen before – a huge step forward in the development of bionic limbs.” The research is part of a wider prosthetics project led by Newcastle, which also involves the universities of Leeds, Essex, Keele, Southampton and Imperial College London. Longer term, the aim is to develop a bionic arm which connects directly to the nerve networks in the amputated limb, and which could be controlled directly by the user’s brain. According to Dr Nazarpour, the camera-assisted prosthetic is an interim solution that can help pave the way. “It’s a stepping stone towards our ultimate goal,” he said. “But importantly, it’s cheap and it can be implemented soon because it doesn’t require new prosthetics – we can just adapt the ones we have.”
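As a rough illustration of the approach described above — a network trained on many object images that outputs one of four grasp classes — the sketch below trains a tiny convolutional classifier in PyTorch. The architecture, the 64x64 greyscale input and the random stand-in training data are assumptions for illustration only; the published system and its real training set are not reproduced here.

```python
# Minimal sketch (not the published model) of an image-to-grasp classifier:
# a small convolutional network maps a camera image of an object to one of
# the four grasp classes named in the article. Sizes and data are illustrative.
import torch
import torch.nn as nn

GRASPS = ["pinch", "tripod", "palmar wrist neutral", "palmar wrist pronated"]

class GraspNet(nn.Module):
    def __init__(self, n_classes=len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):                 # x: (batch, 1, 64, 64) greyscale images
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))

model = GraspNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in training batch: random tensors and labels instead of the real
# photographs of objects taken from many angles and lighting conditions.
images = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, len(GRASPS), (8,))

for _ in range(5):                        # a few illustrative training steps
    optimiser.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimiser.step()

# At run time the prosthesis would grab a frame from its camera, run it
# through the network and trigger the predicted grasp.
prediction = model(torch.randn(1, 1, 64, 64)).argmax(dim=1).item()
print("selected grasp:", GRASPS[prediction])
```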


The incidence of bile duct cancer (cholangiocarcinoma) is increasing year on year throughout the world. More than 2,500 people will be diagnosed with this cancer in the UK in the next year, and for most this will be a lethal diagnosis. Fewer than 5% will survive for 12 months – an appalling statistic which hasn't changed in decades. In light of this, the UK's leading charity dedicated to bile duct cancer, AMMF, will bring together scientists, researchers, medics and patients from across the globe at its third Conference and Information Day dedicated exclusively to bile duct cancer, on 11 May 2017 at the Radisson Blu Hotel, Stansted Airport, Essex. Amongst the AMMF-funded researchers who will be presenting updates on their work at this year's AMMF Conference will be Professor Stuart Forbes from the MRC Centre for Regenerative Medicine, explaining his research into the Wnt and Notch signals, which are thought to drive the growth of cholangiocarcinoma. In addition, Dr Luke Boulter from Edinburgh's Institute of Genetics & Molecular Medicine will be discussing his very promising work, "Discovering driver mutations in cholangiocarcinoma using forward genetics". The work of both these teams, if successful, could bring closer some 'game-changing' treatment targets for cholangiocarcinoma. This year's Conference also sees Professor Narong Khuntikeo from Khon Kaen University, Thailand, presenting the work of the CASCAP (Cholangiocarcinoma Screening and Care Program) team in north east Thailand, which has the world's highest incidence of cholangiocarcinoma. Professor Khuntikeo is vice-president of the Cholangiocarcinoma Foundation of Thailand and recipient of The Royal College of Surgeons of Thailand Outstanding Surgeon Honours Award 2016. Other topics to be addressed at this year's Conference will include the latest surgical developments in the treatment of bile duct cancer, updates on clinical trials, and the status of targeted therapies for cholangiocarcinoma. Helen Morement, founder and CEO of AMMF, explains: "Although bile duct cancer is the second most common primary liver cancer in the world, with an increasing incidence globally, and despite its appalling survival rates due to late diagnosis and few treatment options, it remains poorly understood and under researched. The Conference is a key platform for an international panel of experts to share news and information about clinical studies and latest research. The findings bring the prospect of early diagnosis and more effective treatments one step closer." Helen continues, "We are especially delighted that Professor Richard Syms from Imperial College London, who is also working collaboratively with the team at Khon Kaen University on an AMMF-funded internal imaging project, will be presenting the positive early results of this work at the Conference." Bile duct cancer is a rare cancer that occurs in the bile duct in or outside the liver. With few noticeable and often misunderstood symptoms, this disease is frequently diagnosed too late for surgery, the only potentially curative treatment. Without treatment fewer than 5% of patients will survive beyond 12 months. Cases of bile duct cancer have risen steeply and steadily across the world over the past decades. According to the recent NCIN/Cancer52 report, 2,161 people died in 2013 from this disease in England alone.
About AMMF
AMMF (The Alan Morement Memorial Fund) was founded and registered as a charity with the Charity Commission in 2002 (registered charity no. 1091915). AMMF is the UK's only cholangiocarcinoma charity, dedicated to tackling this devastating cancer on all fronts: providing information and support, campaigning to raise awareness, and encouraging and supporting research. In recent years an enormous and extremely worrying worldwide increase in cholangiocarcinoma's incidence has been noted. Latest figures show there were 2,161 deaths caused by cholangiocarcinoma in 2013 in England alone (NCIN/Cancer52 report). The incidence appears to be increasing across all age groups, including younger people, and the cause of this ongoing increase is unknown. Much more research is desperately needed. AMMF is dedicated to bringing about improvement for the cholangiocarcinoma patient, working closely throughout the UK with patients, families, carers, clinicians, healthcare professionals, researchers, politicians and policy makers. For more information visit: www.ammf.org.uk (registered charity no. 1091915).
About the Conference & Information Day
AMMF is not making a charge for attendance at the conference; it is open to all who have an interest in cholangiocarcinoma. However, if delegates would like to help to offset costs, a suggested donation of £25 per head can be made to the AMMF 2017 Conference JustGiving page: https://www.justgiving.com/fundraising/AMMF-Charity2
About the MRC Centre for Regenerative Medicine at the University of Edinburgh
The MRC Centre for Regenerative Medicine (CRM) is a research institute based at the University of Edinburgh. Scientists and clinicians study stem cells, disease and tissue repair to advance human health. For more information please visit: http://www.crm.ed.ac.uk/
About the MRC Institute of Genetics and Molecular Medicine at the University of Edinburgh (IGMM)
The MRC Institute of Genetics and Molecular Medicine at the University of Edinburgh (IGMM), formed in 2007, is a strategic partnership of the:
• MRC Human Genetics Unit (MRC HGU)
• Cancer Research UK Edinburgh Centre (CRUK EC)
• Centre for Genomic and Experimental Medicine (CGEM).
The IGMM constitutes one of the largest aggregates of human molecular genetics and biology research capacity in the UK, with over 70 Principal Investigators and 500 staff and PhD students. By pooling the resources and complementary skills of the constituent centres, IGMM brings together the scientific expertise, technology and support services needed to maximise scientific discovery. The Institute enables rapid translation of basic scientific discoveries into new treatments, clinical guidelines and innovative products that have a significant impact on society in the UK and worldwide. For more information please visit: http://www.ed.ac.uk/igmm/about
About CASCAP (Cholangiocarcinoma Screening and Care Program), Thailand
CASCAP stands for the Cholangiocarcinoma Screening and Care Program. The aim of CASCAP is to accelerate the transition of CCA from being a neglected disease to being on the public health national agenda. Its specific focus is to develop and make available a high quality database of compiled information about CCA in the region, to determine the optimal screening program for early diagnosis to maximize the success of surgical treatment, and to increase both the quality of life and long-term survival of patients.
For more information please visit: http://www.cascap.info/main/index.php/about-us/about-cascap.html
National Cancer Intelligence Network (NCIN) and Cancer52
For more information please visit: http://www.ncin.org.uk/publications/rare_and_less_common_cancers


Researchers investigating a form of adult-onset diabetes that shares features with the two better-known types of diabetes have discovered genetic influences that may offer clues to more accurate diagnosis and treatment. Latent autoimmune diabetes in adults (LADA) is informally called "type 1.5 diabetes" because like type 1 diabetes (T1D), LADA is marked by circulating autoantibodies, an indicator that an overactive immune system is damaging the body's insulin-producing beta cells. But LADA also shares clinical features with type 2 diabetes (T2D), which tends to appear in adulthood. Also, as in T2D, LADA patients do not require insulin treatments when first diagnosed. A study published April 25 in BMC Medicine uses genetic analysis to show that LADA is closer to T1D than to T2D. "Correctly diagnosing subtypes of diabetes is important, because it affects how physicians manage a patient's disease," said co-study leader Struan F.A. Grant, PhD, a genomics researcher at Children's Hospital of Philadelphia (CHOP). "If patients are misdiagnosed with the wrong type of diabetes, they may not receive the most effective medication." Grant collaborated with European scientists, led by Richard David Leslie of the University of London, U.K.; and Bernhard O. Boehm, of Ulm University Medical Center, Germany and the Lee Kong Chian School of Medicine, a joint medical school of Imperial College London and Nanyang Technological University, Singapore. Occurring when patients cannot produce their own insulin or are unable to properly process the insulin they do produce, diabetes is usually classified into two major types. T1D, formerly called juvenile diabetes, generally presents in childhood, but may also appear first in adults. T2D, formerly called non-insulin-dependent diabetes, typically appears in adults, but has been increasing over the past several decades in children and teens. Some 90 percent or more of all patients with diabetes are diagnosed with T2D. Grant and many other researchers have discovered dozens of genetic regions that increase diabetes risk, usually with different sets of variants associated with T1D compared to T2D. The current study, the largest-ever genetic study of LADA, sought to determine how established T1D- or T2D-associated variants operate in the context of LADA. The study team compared DNA from 978 LADA patients, all adults from the U.K. and Germany, to a control group of 1,057 children without diabetes. Another set of control samples came from 2,820 healthy adults in the U.K. All samples were from individuals of European ancestry. The researchers calculated genetic risk scores to measure whether LADA patients had genetic profiles more similar to those of T1D or T2D patients. They found several T1D genetic regions associated with LADA, while relatively few T2D gene regions added to the risk of LADA. The genetic risk in LADA from T1D risk alleles was lower than in childhood-onset T1D, possibly accounting for the fact that LADA appears later in life. One variant, located in TCF7L2, which Grant and colleagues showed in 2006 to be among the strongest genetic risk factors for T2D reported to date, had no role in LADA. "Our finding that LADA is genetically closer to T1D than to T2D suggests that some proportion of patients diagnosed as adults with type 2 diabetes may actually have late-onset type 1 diabetes," said Grant. 
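In broad terms, a genetic risk score of the kind described above is a weighted sum of risk-allele counts, with each variant's weight typically taken from its published effect size. The short Python sketch below illustrates the idea only; the variant IDs, weights and genotype are hypothetical placeholders, not values from the study.

```python
# Illustrative genetic risk score (GRS) calculation.
# All variant IDs and weights below are hypothetical placeholders.

T1D_WEIGHTS = {"rs_t1d_a": 0.45, "rs_t1d_b": 0.30, "rs_t1d_c": 0.12}  # assumed T1D effect weights
T2D_WEIGHTS = {"rs_t2d_a": 0.14, "rs_t2d_b": 0.10}                    # assumed T2D effect weights

def genetic_risk_score(genotype, weights):
    """Weighted sum of risk-allele counts (0, 1 or 2 copies per variant)."""
    return sum(weights[variant] * genotype.get(variant, 0) for variant in weights)

# One made-up patient genotype, coded as risk-allele counts per variant.
patient = {"rs_t1d_a": 2, "rs_t1d_b": 1, "rs_t2d_a": 1}

print(f"T1D-derived GRS: {genetic_risk_score(patient, T1D_WEIGHTS):.2f}")
print(f"T2D-derived GRS: {genetic_risk_score(patient, T2D_WEIGHTS):.2f}")
```

Comparing the distribution of such scores between patient groups is, in essence, how a cohort can be judged genetically "closer" to T1D or to T2D.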
Grant said that larger studies are needed to further uncover genetic influences in the complex biology of diabetes, adding, "As we continue to integrate genetic findings with clinical characteristics, we may be able to more accurately classify diabetes subtypes to match patients with more effective treatments." Grant received support for this research from the National Institutes of Health (grant R01 DK085212) and the Daniel B. Burke Endowed Chair for Diabetes Research. "Relative Contribution of type 1 and type 2 diabetes loci to the genetic etiology of adult-onset, non-insulin-requiring autoimmune diabetes" BMC Medicine, published online April 25, 2017. http://doi. About Children's Hospital of Philadelphia: Children's Hospital of Philadelphia was founded in 1855 as the nation's first pediatric hospital. Through its long-standing commitment to providing exceptional patient care, training new generations of pediatric healthcare professionals, and pioneering major research initiatives, Children's Hospital has fostered many discoveries that have benefited children worldwide. Its pediatric research program is among the largest in the country. In addition, its unique family-centered care and public service programs have brought the 546-bed hospital recognition as a leading advocate for children and adolescents. For more information, visit http://www.


News Article | May 8, 2017
Site: www.cemag.us

The concept of a perfect lens that can produce immaculate and flawless images has been the Holy Grail of lens makers for centuries. In 1873, a German physicist and optical scientist by the name of Ernst Abbe discovered the diffraction limit of the microscope. In other words, he discovered that conventional lenses are fundamentally incapable of capturing all the details of any given image. Since then, there have been numerous advances in the field to produce images that appear to have higher resolution than allowed by diffraction-limited optics. In 2000, Professor Sir John B. Pendry of Imperial College London — the John Pendry who enticed millions of Harry Potter fans around the world with the possibility of a real Invisibility Cloak — suggested a method of creating a lens with a theoretically perfect focus. The resolution of any optical imaging system has a maximum limit due to diffraction, but Pendry’s theoretical perfect lens would be crafted from metamaterials (materials engineered to have properties not found in nature) to go beyond the diffraction limit of conventional lenses. Overcoming this resolution limit of conventional optics could propel optical imaging science and technology into realms once only dreamt of by common Muggles. Scientists all over the world have since endeavored to achieve super-resolution imaging that captures the finest details contained in evanescent waves that would otherwise be lost with conventional lenses. Hyperlenses are super-resolution devices that transform scattered evanescent waves into propagating waves to project the image into the far-field. Recent experiments that focus on a single hyperlens made from an anisotropic metamaterial with a hyperbolic dispersion have demonstrated far-field sub-diffraction imaging in real time. However, such devices are limited by an extremely small observation area, which consequently requires precise positioning of the subject. A hyperlens array has been considered to be a solution, but fabrication of such an array would be extremely difficult and prohibitively expensive with existing nanofabrication technologies. Research conducted by Professor Junsuk Rho’s team from the Department of Mechanical Engineering and the Department of Chemical Engineering at Pohang University of Science and Technology, in collaboration with a research team from Korea University, has made great contributions to overcoming this obstacle by demonstrating a scalable and reliable fabrication process for a large-scale hyperlens device based on direct pattern transfer techniques. This achievement has been published in the journal Scientific Reports. The team solved the main limitations of previous fabrication methods of hyperlens devices through nanoimprint lithography. Based on a simple pattern transfer process, the team was able to readily fabricate a perfect large-scale hyperlens device on a replicated substrate bearing a hexagonal array of hemispheres, directly printed and pattern-transferred from the master mold, followed by metal-dielectric multilayer deposition by electron beam evaporation. This 5 cm x 5 cm hyperlens array has been demonstrated to resolve sub-diffraction features down to 160 nm under 410 nm wavelength visible light. Rho anticipates that the research team’s new cost-effective fabrication method will make practical far-field, real-time super-resolution imaging devices widely available in optics, biology, medical science, nanotechnology, and other related interdisciplinary fields. 
This research was supported by the National Research Foundation of Korea (NRF) grants of Young Investigator program, Engineering Research Center program, Global Frontier program, Pioneer Research program, and the Commercialization Promotion Agency for R&D Outcomes (COMPA) grant, all funded by the Ministry of Science, ICT and Future Planning (MSIP) of the Korean government.
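As a rough sanity check on the sub-diffraction claim, the conventional far-field (Abbe) resolution limit is roughly the wavelength divided by twice the numerical aperture, so about half the wavelength at best. The sketch below uses only the figures quoted above; the numerical aperture of 1 is an illustrative assumption.

```python
# Back-of-the-envelope comparison of the quoted 160 nm feature size with the
# Abbe diffraction limit d = wavelength / (2 * NA). NA = 1 is an assumed,
# idealised upper bound for conventional far-field imaging in air.

wavelength_nm = 410        # illumination wavelength quoted in the article
numerical_aperture = 1.0   # assumption for illustration
feature_nm = 160           # smallest feature the hyperlens array resolved

abbe_limit_nm = wavelength_nm / (2 * numerical_aperture)
print(f"Abbe limit at {wavelength_nm} nm: ~{abbe_limit_nm:.0f} nm")
print(f"{feature_nm} nm features are {'below' if feature_nm < abbe_limit_nm else 'above'} that limit")
```

At 410 nm the limit works out to roughly 205 nm, so resolving 160 nm features is indeed beyond what a conventional lens could manage.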




News Article | April 17, 2017
Site: www.materialstoday.com

Using sunlight to drive chemical reactions such as artificial photosynthesis could soon become much more efficient thanks to nanomaterials, say researchers from Imperial College London in the UK. Their work on such nanomaterials could ultimately help improve solar energy technologies and be used for new applications, such as using sunlight to break down harmful chemicals. Sunlight is used to drive many chemical processes that would not otherwise occur. For example, carbon dioxide and water do not ordinarily react, but in the process of photosynthesis plants take these two chemicals and, using sunlight, convert them into oxygen and sugar. The efficiency of this reaction is very high, meaning much of the energy from sunlight is transferred to the chemical reaction, but so far scientists have been unable to mimic this process in man-made devices. One reason for this is that many molecules that can undergo chemical reactions with light do not efficiently absorb the light themselves. They rely on photocatalysts – materials that absorb light efficiently and then pass the energy on to the molecules to drive reactions. In this new study, which is reported in a paper in Nature Communications, the Imperial researchers, together with colleagues in Germany and the US, investigated an artificial photocatalyst material made from metal nanoparticles and found out how to make it more efficient. This discovery could lead to better solar panels, allowing energy from the sun to be harvested more efficiently. The novel photocatalyst could also be used to destroy liquid or gas pollutants, such as pesticides in water, by harnessing sunlight to drive reactions that break down the chemicals into less harmful forms. “This finding opens new opportunities for increasing the efficiency of using and storing sunlight in various technologies,” said lead author Emiliano Cortés from the Department of Physics at Imperial. “By using these materials we can revolutionize our current capabilities for storing and using sunlight with important implications in energy conversion, as well as new uses such as destroying pollutant molecules or gases and water cleaning, among others.” The researchers showed that light-induced chemical reactions occur in certain regions on the surface of these nanomaterials. They also identified which areas of the nanomaterial would be most suitable for transferring energy to chemical reactions, by tracking the locations of very small gold nanoparticles (used as markers) on the surface of the silver nanocatalytic material. Now that they know which regions are responsible for the process of harvesting light and transferring it to chemical reactions, the team hope to be able to engineer the nanomaterial to increase these areas and make it more efficient. “This is a powerful demonstration of how metallic nanostructures, which we have investigated in my group at Imperial for the last 10 years, continue to surprise us in their abilities to control light on the nanoscale,” said lead researcher Stefan Maier. “The new finding uncovered by Dr Cortés and his collaborators in Germany and the US opens up new possibilities for this field in the areas of photocatalysis and nanochemistry.” This story is adapted from material from Imperial College London, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier. Link to original source.


News Article | April 25, 2017
Site: www.materialstoday.com

Powder characterization company Freeman Technology and sorption specialists Surface Measurement Systems will be hosting a powder characterization workshop on 18 May 2017 at Imperial College London, UK. The free one-day event will introduce delegates to powder rheology and surface energy measurements. The workshop will include presentations by Jamie Clayton, operations director at Freeman Technology, who will provide an introduction to understanding powder flow and powder behaviour, and Dr Daryl Williams, founder of Surface Measurement Systems, who will discuss the surface energy of powders and powder performance. Delegates will also have an opportunity to hear from Jordan Cheyne, manager of the materials characterization team at Pfizer Sandwich, and Iain Davidson, manager of physical properties at Vectura. This story is reprinted from material from Freeman Technology, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier.


News Article | May 2, 2017
Site: news.yahoo.com

Researchers from Italy and Portugal announced Tuesday the discovery of a new dinosaur species that lived 150 million years ago in what is Wyoming today. The fossil of the Jurassic-era dinosaur, named Galeamopus pabsti, was excavated in 1995 by a Swiss team. Paleontologists from the University of Turin in Italy, and Universidade Nova de Lisboa along with Museu da Lourinhã in Portugal, described the new sauropod species in a paper titled “Osteology of Galeamopus pabsti sp. nov. (Sauropoda: Diplodocidae), with implications for neurocentral closure timing, and the cervico-dorsal transition in diplodocids,” which appeared Tuesday in the journal PeerJ. G. pabsti was similar to the more famous dinosaur genus Diplodocus, whose members grew to over 80 feet in length. In comparison, G. pabsti had “more massive legs, and a particularly high and triangular neck close to the head,” according to a statement Tuesday by the authors of the paper. The species was named in honor of Ben Pabst, who, along with Hans-Jakob “Kirby” Siber, led the Swiss team of excavators who first found the fossil. Sauropods had greatly elongated necks and tails, and large bodies. They were found in South America, Europe and Africa, but most of their known diversity comes from North America, specifically the U.S. In another study, also announced Tuesday, researchers from Imperial College London, along with colleagues in France and elsewhere in Europe, said they had identified an overlooked museum fossil as the earliest known member of a giant dinosaur family called titanosauriforms, which were a subgroup of sauropods. Discovered in France in the 1930s, the fossil had been largely ignored until now. Researchers have named the species Vouivria damparisensis, and suggested in a statement it lived about 160 million years ago. Species that made up the titanosauriform family were among the largest land creatures to have lived, and the most famous member of the family was the brachiosaurus. The individual specimen of V. damparisensis whose fossil the researchers examined was over 15 meters long and weighed about 15,000 kilograms. With a long neck held diagonally upward, a long tail and four legs of equal length, it was likely a herbivore. Philip Mannion, lead author of the study from Imperial College London, said in a statement Tuesday: “Vouivria would have been a herbivore, eating all kinds of vegetation, such as ferns and conifers. This creature lived in the Late Jurassic, around 160 million years ago, at a time when Europe was a series of islands. We don’t know what this creature died from, but millions of years later it is providing important evidence to help us understand in more detail the evolution of brachiosaurid sauropods and a much bigger group of dinosaurs that they belonged to, called titanosauriforms.” The name of the species pays homage to Damparis, the village where the fossil was found. The first word of its name derives from an old French word whose Latin root means “viper.”


The fungus, Batrachochytrium dendrobatidis (Bd), is a type of chytrid that has severely affected over 700 amphibian species worldwide, and has made more species extinct than any other infectious disease known to science – at least 200 so far. It causes chytridiomycosis, a disease that damages amphibian skin and rapidly kills its host. Until now, chytrid was thought to only affect amphibians, a group that includes toads, newts, salamanders and frogs. However, researchers from Imperial College London have now demonstrated in the laboratory that Bd can also infect zebrafish at the larval stage – the developmental phase just after they hatch from eggs. Research into Bd currently relies on studying infected amphibians. However they are difficult to study and also need to be captured in nature, which is not sustainable in the longer term. Amphibians that are sourced from different natural populations also may respond very differently to the fungus. Zebrafish are some of the most widely-used biological model species owing to their transparency at the larval stage, which allows scientists to use microscopy to easily track infections. Their immune systems also have many parallels with that of humans and other vertebrates such as frogs. The team behind today's discovery, which is published in Nature Communications, say their work will lead to zebrafish as a new model for studying the disease. This could give scientists the opportunity to understand in more detail how the fungus harms its amphibian hosts. Professor Mat Fisher, a co-author from Imperial's School of Public Health, said: "The fact that chytrid is able to infect zebrafish larvae could mean that we now have a more effective animal model with which to study the fungus and continue our research in how to save these amphibians." The researchers found that Bd infection took hold in zebrafish larvae in a similar way to how it does in amphibians. Professor Fisher added: "The natural bacterial coating found in young zebrafish appeared to protect them from harm during infection, and meant they could fight off the chytrid. This is a far more humane way to study the fungus than our previous models, and means we now have a new laboratory model." Furthermore, because zebrafish breed quickly, the researchers can use many more than they can with frogs. This would help to make research go further and faster. Co-author Dr Serge Mostowy from Imperial's Department of Medicine said: "A zebrafish model represents a brand new opportunity to study the disease process of chytrids. Young zebrafish have fully developed innate immune systems, which means we can now easily study host-fungus interactions in real time using non-invasive techniques. We can also control their environment with antibiotics, allowing us to study the role of already-present bacteria in influencing chytrid infection." The findings may also offer clues into how the fungus spreads between hosts. The researchers suggest that zebrafish larvae and other fish species could act as environmental reservoirs in the wild, and may pass the infection onto amphibians. Ms Nicole Liew, lead author of the paper from Imperial's MRC Centre of Molecular Bacteriology and Infection said: "The more we know about how Bd can infect hosts and where it resides in the environment, the better we can prepare for it and prevent more deaths. Our findings today give us an exciting wealth of information to work with, opening a whole new avenue of research. 
From our experiments, we now know some of chytrid's hiding places, and present a new lab model highly suited for fluorescent microscopy, enabling us to learn more about the disease process." The scientists also managed to infect another species of fish, the guppy, but these fish ended up clearing their infection eventually. The authors say that although their research shows that young zebrafish can be infected, further studies are needed to determine the extent that fish might act as reservoirs of infection in the environment. Professor Fisher added: "Our knowledge of this devastating fungus is growing in leaps and bounds, and we are excited to see where this new information will take us in terms of saving our amphibian friends." More information: Nicole Liew et al. Chytrid fungus infection in zebrafish demonstrates that the pathogen can parasitize non-amphibian vertebrate hosts, Nature Communications (2017). DOI: 10.1038/ncomms15048


Patients at risk of developing bowel cancer can significantly benefit from a follow-up colonoscopy, finds research published today in Lancet Oncology. Currently, everyone in the UK over the age of 60 is invited to be screened for bowel cancer, also known as colorectal cancer. It is a major cause of illness and death in developed countries. Small growths in the bowel, called polyps or adenomas, can develop into cancer over a long period of time. However, removing these precancerous growths can drastically reduce the risk of developing bowel cancer. The new research, funded by the National Institute for Health Research (NIHR), shows that most patients who have had treatment to remove growths in their bowel and are classed as being at 'intermediate risk' can benefit substantially from a follow-up or 'surveillance' colonoscopy. However, a proportion of this group of patients are at low risk compared with the general population and are unlikely to benefit significantly from colonoscopy surveillance. The researchers suggest the findings could lead to changes in the way patients are screened and followed up, and even reduce costs for healthcare services. Professor Wendy Atkin, from the Department of Surgery and Cancer at Imperial College London and chief investigator on the study, said: "The findings could influence national and international guidelines for the screening and surveillance of bowel cancer and could lead to cost savings for the NHS by reducing unnecessary procedures." Those patients who have one-to-two large adenomas (1 cm or larger) or three-to-four small adenomas are classed as being at 'intermediate risk' and are recommended to have a follow-up colonoscopy three years after their adenomas are removed. Most patients offered this surveillance are at intermediate risk. In the latest study, researchers from Imperial College London looked at the incidence of bowel cancer and the effectiveness of follow-up colonoscopies in reducing incidence in people found to have intermediate-risk adenomas. The study was commissioned on behalf of the UK National Screening Committee to help inform its current bowel cancer screening programme for the NHS. Professor Atkin said: "Colonoscopies carry a small risk of complications for patients, and are demanding on NHS resources, with around 20 per cent of colonoscopies in the UK performed for surveillance. It is therefore important to assess whether all people classed as being at intermediate risk need to undergo follow-up colonoscopy." Researchers looked at data for more than 250,000 patients and identified approximately 12,000 people who were diagnosed with intermediate-risk adenomas across 17 UK hospitals. These patients were monitored over an eight-year period, and the incidence of bowel cancer was compared in those who had a follow-up colonoscopy with those who had not. They identified a subgroup of patients within the intermediate-risk group -- those with large adenomas (2 cm or larger), advanced pathology in the adenomas, or polyps in the upper half of the large bowel -- who were at a higher risk of developing bowel cancer. These 'higher-risk' patients appeared to benefit substantially from at least one follow-up colonoscopy. In addition, intermediate-risk patients who fell into the 'lower-risk' subgroup were found to have a smaller chance of developing bowel cancer than that of the general population. For this group of patients, the researchers suggest that follow-up colonoscopies may not be warranted at all if the initial colonoscopy is of high quality. 
According to the researchers, the findings will help to shape current and future guidelines on bowel cancer screening both in the UK and internationally. If the changes are adopted, they could lead to cost savings for the NHS and a reallocation of resources to focus on those most at risk. "The quality of colonoscopy has improved in recent years and it is important we identify those people who would benefit from a follow-up colonoscopy," said Professor Atkin. "This research showed that there is a subgroup that definitely benefits but there is also a subgroup that possibly doesn't require a follow-up colonoscopy. "The results of this study provide robust evidence which will be important for informing future surveillance guidelines for how we monitor people in the intermediate-risk group, and will help minimise the costs and risks associated with the unnecessary colonoscopies that are currently performed." 'Adenoma surveillance and colorectal cancer incidence: a retrospective, multicentre, cohort study' by Atkin, W. et al, is published in the journal Lancet Oncology.
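For readers who find the risk groupings easier to follow as explicit rules, the sketch below encodes the criteria as they are described in the article. It is a simplified illustration only, not the study's or the screening programme's clinical algorithm.

```python
# Simplified encoding of the adenoma risk groupings quoted in the article.
# Not a clinical tool: real surveillance guidelines contain further categories
# and caveats that this sketch deliberately omits.

def adenoma_risk_group(n_adenomas, largest_cm, advanced_pathology, proximal_polyp):
    """Classify a patient per the article's description of 'intermediate risk'."""
    intermediate = (1 <= n_adenomas <= 2 and largest_cm >= 1.0) or (3 <= n_adenomas <= 4)
    if not intermediate:
        return "outside the intermediate-risk group (not covered by this sketch)"
    if largest_cm >= 2.0 or advanced_pathology or proximal_polyp:
        return "intermediate risk, higher-risk subgroup: surveillance likely beneficial"
    return "intermediate risk, lower-risk subgroup: surveillance may not be warranted"

print(adenoma_risk_group(2, 2.5, False, False))   # large adenoma -> higher-risk subgroup
print(adenoma_risk_group(3, 0.6, False, False))   # three small adenomas -> lower-risk subgroup
```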


News Article | April 25, 2017
Site: www.eurekalert.org

The findings, published today in The Lancet, show women fare worse than men at every stage of treatment, leading the study's authors to call for urgent improvement in how the condition is managed in women. The researchers, from Imperial College London and the University of Cambridge, found that women are less likely than men to be deemed suitable for keyhole surgery for the condition, even though this technique is associated with better outcomes. They are also more likely to be offered no surgical treatment at all. The findings are based on a review of international research into the condition, carried out since 2000. An abdominal aortic aneurysm is caused by a weakening in the wall of the aorta, the body's largest blood vessel, which carries blood from the heart through the abdomen to the rest of the body. Degenerative changes in the aortic wall cause weakening and ballooning of the blood vessel, sometimes to more than three times its normal diameter, with a risk of a potentially life-threatening rupture. Surgical repair for these aneurysms is offered only when the swelling is large enough to make the risk of rupture greater than the risks of the operation, with two types of surgery available. Open surgery involves cutting into the abdomen and replacing all of the ballooning section of the aorta with a tube-like graft. The second procedure, endovascular repair, is a minimally invasive 'keyhole' technique which involves inserting a tube-like graft through the leg artery into the swollen section of the aorta to reinforce the blood vessel's weakened wall. It is associated with better early outcomes than open surgery, but can only be offered when the aneurysm meets certain criteria, due to the shape and size of the grafts. For some patients with large aneurysms, the risks of both of these options are deemed to outweigh the risk of rupture, and no treatment is offered unless the patient's fitness can be improved. The study, funded by the National Institute for Health Research, found that only a third of women were deemed suitable for keyhole surgery, compared with just over half of men. Less than a fifth of men were not offered surgery, compared with a third of women. Mortality rates for women in the 30 days after keyhole surgery were 2.3 per cent, compared with 1.4 per cent for men. For open surgery, this rose to 5.4 per cent for women and 2.8 per cent for men. Women tend to develop aneurysms at an older age than men, and their aortas are smaller. Given the current technologies available, both of these factors can affect which type of surgery is deemed suitable, or whether surgery is an option at all. The researchers say that while these factors will form the basis of future research, age and physical fitness are not enough to account for the differences seen in mortality between men and women. Professor Janet Powell, from Imperial's Department of Surgery & Cancer, who led the research, said: "Our findings show that despite overall improvement in mortality rates for this condition, there is a huge disparity between outcomes for men and women, which is not acceptable. "The way abdominal aortic aneurysm is managed in women needs urgent improvement. We need to see if the devices used for keyhole surgery can be made more flexible to enable more women to be offered this option. We also need more grafts designed to fit women, who have smaller aortas, as all the grafts currently available have been designed for men." 
In the UK, abdominal aortic aneurysm is more prevalent in men, with men over 65 regularly screened for the condition. The condition often has no symptoms and many women are only diagnosed when the aneurysm ruptures, at which point the likelihood of survival can be less than 20 per cent. Professor Powell added: "Abdominal aortic aneurysm is still seen as mainly a male condition, and as a result, the way we manage the condition - from screening to diagnosis and treatment - has been developed with men in mind. Our study shows that this needs to change." 'Morphological suitability for endovascular repair, non-intervention rates, and operative mortality in women and men assessed for intact abdominal aortic aneurysm repair: systematic reviews with meta-analysis' by Ulug, P et al, is published in The Lancet.


News Article | April 5, 2017
Site: www.techtimes.com

New research reveals that the first Brexit actually occurred hundreds of thousands of years before the United Kingdom voted to exit from the European Union: Britain separated geologically from the rest of Europe. But what makes the so-called “Brexit 1.0” an important piece of ancient history? The historic separation followed a two-phase flooding event that destroyed the thin connecting land between ancient Britain and France. The first event involved a massive overflowing lake about 450,000 years ago. The second event occurred around 160,000 years ago, when a huge flood opened the Dover Strait in the English Channel, which now separates Britain from the rest of the continent. According to new evidence, some 450,000 years ago the site of the Dover Strait may have been a huge rock ridge of chalk acting as a natural dam, with a proglacial lake behind it in a landscape that resembled Siberia’s frozen tundra more closely than its modern green surroundings. The ridge initially eroded via lake overspill, followed by “plunge pool erosion by waterfalls and subsequent dam breaching,” researchers said in the journal Nature Communications. The researchers are convinced that Britain left Europe through this catastrophic route rather than through simple erosion alone. The glacial lake over-spilled staggering amounts of water, probably after an earthquake weakened the ridge further. "The waterfalls were so huge they left behind the plunge pools, some several kilometers in diameter and 100 meters [328 feet] deep in solid rock, running in a line from Calais to Dover," explained author Jenny Collier of Imperial College London. Had the first flood not taken place, Britain could still be connected to Europe today. In that scenario, it could be jutting out the way Denmark does at present, the researchers added. The next step for the team is to take samples from the plunge pools to obtain more data on the events’ timings. One major issue, however, is that the waters above the holes are now the site of one of the world’s busiest shipping lanes. A shallow, narrow channel that spans less than 30 kilometers (18.6 miles), Dover Strait links the North Sea with the English Channel and the Atlantic. The area has enormous strategic importance from the earliest waves of invasion and defense, with features such as the Martello Towers built during the Napoleonic Wars and the World War II frontline fortifications and anti-tank devices along the whole coast. But beyond being of great historic interest, the strait also serves as a busy international shipping lane with more than 500 ship movements every day. In addition, it is a crucial spawning site for marine wildlife and a migration route that sees more than 250 bird species in any given year. Dover Strait’s utmost significance simply lies in the two events that unfolded hundreds of thousands of years ago. “The opening of the Strait has significance for the biogeography and archaeology of NW Europe, with particular attention on the pattern of early human colonization of Britain,” the team wrote. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


Megan has always been passionate about preventative healthcare and customized beauty. "From a very young age, I had two goals that I wanted to achieve in my life," Megan explained. "One was to help people live healthier lives and the other was to help people look younger and more beautiful. To achieve these goals, I focused my studies in the fields of health, wellness and beauty. While I was a Ph.D. student at Imperial College London, I read about Angelina Jolie's experience with genetic testing, which became one of the most debated topics in healthcare worldwide. I've always admired Angelina Jolie for her beauty and humanitarian work and I was incredibly inspired by her story. It was then that I truly began to comprehend the importance of genetic testing. The fact that getting one genetic test can contribute to saving a person's life led me on a mission to take genetic testing to every corner of the globe." As a spokesmodel and correspondent covering all of the major red carpet events in Hollywood, Megan has access to the biggest celebrities and social media influencers in the world. Megan has made it her mission to take the conversation around customized skincare, diet, fitness and beauty based on genetic testing to the red carpet, and the positive response has been overwhelming. Megan is always looking to partner with innovative brands from around the world and amplify their message in Hollywood, and she has already received offers and interest from major global companies. "I knew that Hollywood was the best place for me to take and amplify my message. If I want to really raise awareness about the new innovations in health, beauty and wellness, I should utilize the Hollywood microphone, which is the most powerful microphone in the world. I am on a mission to bridge the gap between the medical world and Hollywood." For information about opportunities to work with Megan Pormer, click here. To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/womens-health-cover-girl-megan-pormer-takes-customized-health-and-beauty-based-on-dna-to-hollywood-300449224.html


News Article | May 3, 2017
Site: news.yahoo.com

(Reuters) - Europe’s top tech hubs tend to radiate from massive capital cities like London, Berlin and Paris. But the heart of European innovation isn’t a major metropolis – it’s a small city in the Dutch-speaking region of Flanders. That’s the conclusion of Reuters’ second annual ranking of Europe’s Most Innovative Universities, a list that identifies and ranks the educational institutions doing the most to advance science, invent new technologies, and help drive the global economy. The most innovative university in Europe, for the second year running, is Belgium’s KU Leuven. This nearly 600-year-old institution was founded by Pope Martin V, but today it’s better known for technology than theology: KU Leuven maintains one of the largest independent research and development organizations on the planet. In fiscal 2015, the university’s research spending exceeded €454 million, and its patent portfolio currently includes 586 active families, each one representing an invention protected in multiple countries. How does a relatively small Catholic university out-innovate bigger, better-known institutions across Europe? KU Leuven earned its first-place rank, in part, by producing a high volume of influential inventions. Its researchers submit more patents than most other universities on the continent, and outside researchers frequently cite KU Leuven inventions in their own patent applications. Those are key criteria in Reuters’ ranking of Europe’s Most Innovative Universities, which was compiled in partnership with Clarivate Analytics, and is based on proprietary data and analysis of indicators including patent filings and research paper citations. The second most innovative university in Europe is Imperial College London, an institution whose researchers have been responsible for the discovery of penicillin, the development of holography and the invention of fiber optics. The third-place University of Cambridge has been associated with 91 Nobel Laureates during its 800-year history. And the fourth-place Technical University of Munich has spun off more than 800 companies since 1990, including a variety of high-tech startups in industries including renewable energy, semiconductors and nanotechnology. Overall, the same countries that dominate European business and politics dominate the ranking of Europe's Most Innovative Universities. German universities account for 23 of the 100 institutions on the list, more than any other country, and the United Kingdom comes in second, tied with France, each with 17 institutions. But those three countries are also among the most populous and richest countries on the continent. Control for those factors, and it turns out that countries with much smaller populations and modest economies often outperform big ones. The Republic of Ireland has only three schools on the entire list, but with a population of less than 5 million people, it can boast more top 100 innovative universities per capita than any other country in Europe. On the same per capita basis, the second most innovative country on the list is Denmark, followed by Belgium, Switzerland and the Netherlands. Germany, the United Kingdom and France rank in the middle of the pack, an indication that they may be underperforming compared with their smaller neighbors: On a per capita basis, none of those countries has half as many top 100 universities as Ireland. And the same trends hold true if you look at national economies. 
According to the International Monetary Fund, in 2016 Germany’s gross domestic product exceeded $3.49 trillion – 11 times larger than Ireland’s $307 billion – yet Germany has only 7 times as many top 100 innovative universities. Some countries underperform even more drastically. Russia is Europe’s most populous country and has the region’s fifth largest economy, yet none of its universities count among the top 100. Other notable absences include any universities from Ukraine or Romania – a fact that reveals another divide between Western and Eastern Europe. To compile the ranking of Europe’s most innovative universities, Clarivate Analytics (formerly the Intellectual Property & Science business of Thomson Reuters) began by identifying more than 600 global organizations that published the most articles in academic journals, including educational institutions, nonprofit charities, and government-funded institutions. That list was reduced to institutions that filed at least 50 patents with the World Intellectual Property Organization in the period between 2010 and 2015. Then they evaluated each candidate on 10 different metrics, focusing on academic papers (which indicate basic research) and patent filings (which point to an institution's ability to apply research and commercialize its discoveries). Finally, they trimmed the list so that it only included European universities, and then ranked them based on their performance. This is the second consecutive year that Clarivate and Reuters have collaborated to rank Europe’s Most Innovative Universities, and three universities that ranked in the top 100 in 2016 fell off the list entirely: the Netherlands’ Eindhoven University of Technology, Germany’s University of Kiel, and the UK’s Queen’s University Belfast. All three universities filed fewer than 50 patents during the period examined for the ranking, and thus were eliminated from consideration. They’ve been replaced by three new entrants to the top 100: the University of Glasgow (#54), the University of Nice Sophia Antipolis (#94), and the Autonomous University of Madrid (#100). The returning universities that made the biggest moves on the list were the Netherlands’ Leiden University (up 21 spots to #17) and Germany’s Technical University of Berlin (up 21 spots to #41). Belgium’s Université Libre de Bruxelles (down 17 to #38) and the UK’s University of Leeds (down 17 to #73) made the biggest moves in the opposite direction. Generally, though, the list remained largely stable: Nine of the top ten schools of 2016 remained in the top 10 for 2017, and 17 of the top 20. This stability is understandable because something as large as university paper output and patent performance is unlikely to change quickly. Of course, the relative ranking of any university does not provide a complete picture of whether its researchers are doing important, innovative work. Since the ranking measures innovation on an institutional level, it may overlook particularly innovative departments or programs: a university might rank low for overall innovation but still operate one of the world's most innovative computer science laboratories, for instance. And it's important to remember that whether a university ranks at the top or the bottom of the list, it's still within the top 100 on the continent: All of these universities produce original research, create useful technology and stimulate the global economy.
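A quick arithmetic check of the Germany-versus-Ireland comparison, using only the figures quoted in the article, is sketched below.

```python
# Figures quoted in the article: GDP (IMF, 2016) and counts of top-100 universities.
germany_gdp, ireland_gdp = 3.49e12, 307e9
germany_unis, ireland_unis = 23, 3

print(f"GDP ratio (Germany/Ireland): ~{germany_gdp / ireland_gdp:.0f}x")     # ~11x, as stated
print(f"Top-100 university ratio:    ~{germany_unis / ireland_unis:.1f}x")   # ~7.7x, i.e. 'only 7 times'
print(f"Universities per $1tn of GDP: Germany {germany_unis / (germany_gdp / 1e12):.1f}, "
      f"Ireland {ireland_unis / (ireland_gdp / 1e12):.1f}")
```

On that per-GDP basis Ireland comes out ahead (roughly 9.8 versus 6.6 top-100 institutions per trillion dollars), which is the underperformance the article is pointing to.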


News Article | April 20, 2017
Site: hosted2.ap.org

(AP) — Europe's salamanders could be decimated by a flesh-eating alien species that has already wreaked havoc in some parts of the continent, scientists said in a study published Wednesday. Researchers who examined the impact of the alien invader — a fungus native to Asia — on fire salamanders in Belgium and the Netherlands found it to be lethal to the amphibians and almost impossible to eradicate. The study, published in the journal Nature, provides a stark warning to North America, where the fungus hasn't yet taken hold. "Prevention of introduction is the most important control measure available against the disease," said study co-author An Martel, a veterinarian at the University of Ghent, Belgium, who specializes in wildlife diseases. The B. salamandrivorans fungus, which was likely imported to Europe by the pet trade, causes skin ulcers, effectively eating the salamander's skin and making it susceptible to secondary bacterial infections. Martel and her colleagues began studying the effect of the fungus in early 2014, four years after it was first recorded in Europe. Within six months, the population of fire salamanders at the site in Robertville, Belgium, had shrunk to a tenth of its original size. Two years later, less than one percent of the distinctive yellow-and-black patterned amphibians had survived, according to the study. Sexually mature salamanders appeared to be particularly prone to becoming infected with the fungus due to their contact with other individuals, preventing them from producing new generations. Furthermore, researchers found the fungus was able to form spores with thick walls that allowed it to survive for longer and spread further, including on the feet of water birds. Other amphibian species, including newts and toads, were also susceptible, either making them carriers of the fungus or ill themselves. Finally, infected animals failed to develop an immune response that might allow some of the salamander population to survive and ultimately prevail against its new foe, which has already been detected in 12 populations in the Netherlands, Belgium and Germany. Conservationists in the United States are already monitoring wetlands for signs of the fungus. "For highly susceptible species like fire salamanders, there are no available mitigation measures," Martel told The Associated Press. "Classical measures to control animal diseases such as vaccination and repopulation will not be successful since there is no immunity buildup in these species and eradication of the fungus from the ecosystem is unlikely." In a separate comment published by Nature, Matthew C. Fisher, an expert in fungal epidemiology at Imperial College London who wasn't involved in the study, backed the researchers' suggestion that the only way to save Europe's salamanders may be to keep a healthy population in captivity — at least until a cure is found. "It is currently unclear how (the fungus) can be combated in the wild beyond establishing 'amphibian arks' to safeguard susceptible species as the infection marches relentlessly onwards," said Fisher.


Postma D.S.,University of Groningen | Bush A.,Imperial College London | Van Den Berge M.,University of Groningen
The Lancet | Year: 2015

Summary Chronic obstructive pulmonary disease is mainly a smoking-related disorder and affects millions of people worldwide, with a large effect on individual patients and society as a whole. Although the disease becomes clinically apparent around the age of 40-50 years, its origins can begin very early in life. Different risk factors in very early life - ie, in utero and during early childhood - drive the development of clinically apparent chronic obstructive pulmonary disease in later life. In discussions of which risk factors drive chronic obstructive pulmonary disease, it is important to realise that the disease is very heterogeneous and at present is largely diagnosed by lung function only. In this Review, we will discuss the evidence for risk factors for the various phenotypes of chronic obstructive pulmonary disease during different stages of life. © 2015 Elsevier Ltd.


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 4.08M | Year: 2014

High Performance Embedded and Distributed Systems (HiPEDS), ranging from implantable smart sensors to secure cloud service providers, offer exciting benefits to society and great opportunities for wealth creation. Although the UK is currently the world leader for many technologies underpinning such systems, there is a major threat which comes from the need not only to develop good solutions for sharply focused problems, but also to embed such solutions into complex systems with many diverse aspects, such as power minimisation, performance optimisation, digital and analogue circuitry, security, dependability, analysis and verification. The narrow focus of conventional UK PhD programmes cannot bridge the skills gap that would address this threat to the UK's leadership of HiPEDS. The proposed Centre for Doctoral Training (CDT) aims to train a new generation of leaders with a systems perspective who can transform research and industry involving HiPEDS. The CDT provides a structured and vibrant training programme to train PhD students to gain expertise in a broad range of system issues, to integrate and innovate across multiple layers of the system development stack, to maximise the impact of their work, and to acquire creativity, communication, and entrepreneurial skills. The taught programme comprises a series of modules that combine technical training with group projects addressing team skills and system integration issues. Additional courses and events are designed to cover students' personal development and career needs. Such a comprehensive programme is based on aligning the research-oriented elements of the training programme, an industrial internship, and rigorous doctoral research. Our focus in this CDT is on applying two cross-layer research themes: design and optimisation, and analysis and verification, to three key application areas: healthcare systems, smart cities, and the information society. Healthcare systems cover implantable and wearable sensors and their operation as an on-body system, interactions with hospital and primary care systems and medical personnel, and medical imaging and robotic surgery systems. Smart cities cover infrastructure monitoring and actuation components, including smart utilities and smart grid at unprecedented scales. Information society covers technologies for extracting, processing and distributing information for societal benefits; these include many-core and reconfigurable systems targeting a wide range of applications, from vision-based domestic appliances to public and private cloud systems for finance, social networking, and various web services. Graduates from this CDT will be aware of the challenges faced by industry and their impact. Through their broad and deep training, they will be able to address the disconnect between research prototypes and production environments, evaluate research results in realistic situations, assess design tradeoffs based on both practical constraints and theoretical models, and provide rapid translation of promising ideas into production environments. They will have the appropriate systems perspective as well as the vision and skills to become leaders in their field, capable of world-class research and its exploitation to become a global commercial success.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2015-ETN | Award Amount: 3.90M | Year: 2015

Development of fuel injection equipment (FIE) able to reduce pollutant emissions from liquid-fueled transportation and power generation systems is a top industrial priority in order to meet the forthcoming EU 2020 emission legislation. However, design of new FIE is currently constrained by the incomplete physical understanding of complex micro-scale processes, such as in-nozzle cavitation, primary and secondary atomization. Unfortunately, today's computing power does not allow for an all-scale analysis of these processes. The proposed programme aims to develop a large eddy simulation (LES) CFD model that will account for the influence of unresolved sub-grid-scale (SGS) processes at engineering scales and affordable computing times. The bridging parameter between SGS and macro-scale flow processes is the surface area generation/destruction occurring during fuel atomisation; relevant SGS closure models will be developed through tailored experiments and DNS and will be implemented into the LES model predicting the macroscopic spray development as a function of the in-nozzle flow and surrounding air conditions. Validation of the new simulation tool, currently missing from today's state-of-the-art models, will be performed against new benchmark experimental data to be obtained as part of the programme, in addition to those provided by the industrial partners. This will demonstrate the applicability of the model as an engineering design tool suitable for IC engines, gas turbines, fuel burners and even rocket engine fuel injectors. The proposed research and training programme will be undertaken by 15 ESRs funded by the EU and one ESR funded independently by an Australian partner; ESRs will be recruited/seconded by universities, research institutes and multinational fuel injection and combustion systems manufacturers that will represent in the best possible way the international, interdisciplinary and intersectoral requirements of the Marie Curie Action guidelines.
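One common way to formalise the surface-area bridging idea in the atomization literature is a transport equation for the liquid-gas interface surface density, often written Sigma; the schematic form below is offered only as background and is not necessarily the exact formulation this project will adopt.

```latex
% Schematic surface-density (\Sigma) transport equation; turbulent-diffusion
% terms are omitted here for brevity. S_prod and S_dest stand for sub-grid-scale
% production and destruction of interface area (stretching/breakup versus
% coalescence); closing these source terms is precisely what the tailored
% experiments and DNS described above are meant to inform.
\frac{\partial \Sigma}{\partial t}
  + \nabla \cdot \left( \bar{\mathbf{u}} \, \Sigma \right)
  = S_{\mathrm{prod}} - S_{\mathrm{dest}}
```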


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-26-2014 | Award Amount: 4.52M | Year: 2015

Type 1 Diabetes Mellitus (T1DM) presents a high need and challenge for self-management by young patients: a complex illness with a high and increasing prevalence, a regimen that needs adaptation to the patient's condition and activities, and serious risks of complications and reduced life expectancy. When patients do not acquire the knowledge, skills and habits to adhere to their diabetes regimen in childhood, these risks increase suddenly in adolescence. Current mHealth applications have their own specific value for self-management, but are unable to deliver the required comprehensive, prolonged, personalised and context-sensitive support and to reduce these risks persistently. We aim at a Personal Assistant for healthy Lifestyle (PAL) that provides such support, assisting the child, health professional and parent to advance the self-management of children with type 1 diabetes aged 7 - 14, so that an adequate shared patient-caregiver responsibility for the child's diabetes regimen is established before adolescence. The PAL system is composed of a social robot, its (mobile) avatar, and an extendable set of (mobile) health applications (diabetes diary, educational quizzes, sorting games, etc.), which all connect to a common knowledge base and reasoning mechanism. The robot and avatar act as the child's pal or companion, whereas health professionals and parents are supported by, respectively, an Authoring & Control and a Monitor & Inform tool. The PAL project will assess the benefits of the behavioural change for patients' health, and the benefits for caregivers, in longitudinal field experiments. The consortium provides the required network, expertise and tools for this research: (a) a knowledge-driven co-design methodology and tool, (b) medical, human factors and technical expertise, (c) end-user participation and (d) initial PAL building-blocks.


Grant
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 1.82M | Year: 2015

The impacts of climate change, and warming in particular, on natural ecosystems remain poorly understood, and research to date has focused on individual species (e.g. range shifts of polar bears). Multispecies systems (food webs, ecosystems), however, can possess emergent properties that can only be understood using a system-level perspective. Within a given food web, the microbial world is the engine that drives key ecosystem processes, biogeochemical cycles (e.g. the carbon cycle) and network properties, but has been hidden from view due to difficulties with identifying which microbes are present and what they are doing. The recent revolution in Next Generation Sequencing has removed this bottleneck and we can now open the microbial black box to characterise the metagenome (who is there?) and metatranscriptome (what are they doing?) of the community for the first time. These advances will allow us to address a key overarching question: should we expect a global response to global warming? There are bodies of theory that suggest this might be the case, including the Metabolic Theory of Ecology and the 'Everything is Everywhere' hypothesis of global microbial biogeography, yet these ideas have yet to be tested rigorously at appropriate scales and in appropriate experimental contexts that allow us to identify patterns and causal relationships in real multispecies systems. We will assess the impacts of warming across multiple levels of biological organisation, from genes to food webs and whole ecosystems, using geothermally warmed freshwaters in five high-latitude regions (Svalbard, Iceland, Greenland, Alaska, Kamchatka), where warming is predicted to be especially rapid. Our study will be the first to characterise the impacts of climate change on multispecies systems at such a scale. Surveys of these sentinel systems will be complemented with modelling and experiments conducted in these field sites, as well as in hundreds of large-scale mesocosms (artificial streams and ponds) in the field and thousands of microcosms of robotically assembled microbial communities in the laboratory. Our novel genes-to-ecosystems approach will allow us to integrate measures of biodiversity and ecosystem functioning. For instance, we will quantify key functional genes as well as quantifying which genes are switched on (the metatranscriptome) in addition to measuring ecosystem functioning (e.g. processes related to the carbon cycle). We will also measure the impacts of climate change on the complex networks of interacting species we find in nature - what Darwin called the 'entangled bank' - because food webs and other types of networks can produce counterintuitive responses that cannot be predicted from studying species in isolation. One general objective is to assess the scope for biodiversity insurance and resilience of natural systems in the face of climate change. We will combine our intercontinental surveys with natural experiments, bioassays, manipulations and mathematical models to do this. For instance, we will characterise how temperature-mediated losses of biodiversity can compromise key functional attributes of the gene pool and of the ecosystem as a whole. There is an assumption in the academic literature and in policy that freshwater ecosystems are relatively resilient because the apparently huge scope for functional redundancy could allow for compensation for species loss in the face of climate change. However, this has not been quantified empirically in natural systems, and errors in estimating the magnitude of functional redundancy could have substantial environmental and economic repercussions. The research will address a set of key specific questions and hypotheses within our five themed work packages, of broad significance to both pure and applied ecology, and which also combine to provide a more holistic perspective than has ever been attempted previously.
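For context (an illustrative aside, not part of the proposal text), the Metabolic Theory of Ecology referred to above predicts that individual metabolic rate B scales with body mass M and absolute temperature T roughly as

\[
B \approx b_0 \, M^{3/4} \, e^{-E/(kT)},
\]

where \(b_0\) is a normalisation constant, \(E\) an activation energy and \(k\) Boltzmann's constant; it is this kind of general, temperature-driven scaling that motivates asking whether warming should elicit a 'global response' across ecosystems.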


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2013.2.3.3-1 | Award Amount: 31.38M | Year: 2014

Far from receding, the threats posed by infections with epidemic potential grow ever greater. Although Europe has amongst the best healthcare systems in the world, and also some of the world's foremost researchers in this field, we lack the co-ordination and linkage between networks that are required to respond fast to new threats. This consortium of consortia will streamline our response, using primary and secondary healthcare to detect cases with pandemic potential and to activate dynamic rapid investigation teams that will deploy shared resources across Europe to mitigate the impact of future pandemics on European health, infrastructure and economic integrity. If funded, PREPARE will transform Europe's response to future severe epidemics or pandemics by providing infrastructure, co-ordination and integration of existing clinical research networks, both in community and hospital settings. It represents a new model of collaboration and will provide a one-stop shop for policy makers, public health agencies, regulators and funders of research into pathogens with epidemic potential. It will do this by mounting inter-epidemic ('peacetime') patient-oriented clinical trials in children and in adults and investigations of the pathogenesis of relevant infectious diseases, and by facilitating the development of sophisticated, state-of-the-art near-patient diagnostics. We will develop pre-emptive solutions to the ethical, administrative, regulatory and logistical bottlenecks that prevent a rapid response in the face of new threats. We will provide education and training not only to the members of the network, but also to external opinion leaders, funders and policy makers, thereby streamlining our future response. By strengthening and integrating inter-epidemic research networks, PREPARE will enable the rapid, coordinated deployment of Europe's elite clinical investigators, resulting in a highly effective response to future outbreaks based on solid scientific advances.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2015-ETN | Award Amount: 3.81M | Year: 2015

Mathematical and computational models are central to biomedical and biological systems engineering; models enable (i) mechanistic justification of experimental results via current knowledge and (ii) generation of new testable hypotheses or novel intervention methods. SyMBioSys is a joint academic/industrial training initiative supporting the convergence of engineering, biological and computational sciences. The consortium's shared goal is to train a new generation of innovative and entrepreneurial early-stage researchers (ESRs) who will develop and exploit cutting-edge dynamic (kinetic) mathematical models for biomedical and biotechnological applications. SyMBioSys integrates: (i) six academic beneficiaries with a strong record in biomedical and biological systems engineering research, comprising four universities and two research centres; (ii) four industrial beneficiaries, including key players in developing simulation software for process systems engineering, metabolic engineering and industrial biotechnology; (iii) three partner organisations from the pharmaceutical, biotechnological and entrepreneurial sectors. SyMBioSys is committed to supporting the establishment of a Biological Systems Engineering research community by stimulating programme coordination via joint activities. The main objectives of this initiative are: * developing new algorithms and methods for reverse engineering and identifying dynamic models of biosystems and bioprocesses; * developing new model-based optimization algorithms for exploiting dynamic models of biological systems (e.g. predicting behaviour in biological networks, identifying design principles and selecting optimal treatment interventions); * developing software tools implementing the preceding novel algorithms, using state-of-the-art software engineering practices to ensure usability in biological systems engineering research and practice; * applying the new algorithms and software tools to biomedical and biological test cases.
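As a toy illustration of one task named above (reverse engineering a dynamic kinetic model from time-course data), the sketch below fits the rate constants of a simple two-species kinetic model to synthetic measurements; the model, data and parameter names are assumptions for illustration and are not taken from the project.

```python
# Hypothetical sketch: fit rate constants of a small kinetic model to time-course data.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def kinetics(t, y, k1, k2):
    # Simple chain S -> P -> Q with first-order rate constants k1, k2.
    s, p = y
    return [-k1 * s, k1 * s - k2 * p]

t_obs = np.linspace(0.0, 10.0, 20)
# Synthetic "measurements" of S and P (in a real study these would be experimental data).
true = solve_ivp(kinetics, (0, 10), [1.0, 0.0], t_eval=t_obs, args=(0.8, 0.3)).y
y_obs = true + 0.01 * np.random.default_rng(0).normal(size=true.shape)

def residuals(params):
    k1, k2 = params
    sim = solve_ivp(kinetics, (0, 10), [1.0, 0.0], t_eval=t_obs, args=(k1, k2)).y
    return (sim - y_obs).ravel()

fit = least_squares(residuals, x0=[0.5, 0.5], bounds=(0, np.inf))
print("estimated rate constants:", fit.x)
```

Model-based optimization of such fitted models (e.g. choosing an intervention that steers the system to a desired state) would then build on the same simulation machinery.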


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-RISE | Phase: MSCA-RISE-2015 | Award Amount: 517.50K | Year: 2016

The vision for the 5th generation of mobile networks (5G) includes at its heart the Internet of Things (IoT) paradigm, leading to a new era of connectivity where billions of devices exchange data and instil intelligence into our everyday lives. The EU has set out to play a leading role in developing 5G technologies by consolidating and building upon the most important research and innovation results attained in previous research programmes. Nevertheless, 5G is still in its early research stages. Various issues must be resolved before it can become a reality: we need to join forces across countries, continents, and sectors. The objective of the TACTILENet project is to bring together the complementary expertise of European and third-country partners in order to lay the foundations for addressing basic issues in several facets of 5G networking. The cross-fertilization among partners will contribute to the ongoing research efforts by jointly identifying ambitious yet feasible goals for 5G systems, addressing some of the fundamental research problems in achieving these goals, and finally, by designing and analyzing a suite of protocols. Given the size of our consortium and the timeframe of the project, we will focus on some of the most promising directions, such as network densification, energy efficiency/harvesting and short blocklength communications. The consortium brings together expertise from all of the above research directions. Each partner will bring along its expertise in different thrusts, and the project will develop a unifying framework for a systematic study of the Internet of Things within the 5G framework capturing these clearly interrelated research areas. With its suggested mobility plan, the project aspires to strengthen collaboration among partners, exploit complementarities in expertise, educate young researchers and ultimately create a solid basis for fruitful cooperation, going beyond the timeframe of this project.


Grant
Agency: GTR | Branch: STFC | Program: | Phase: Research Grant | Award Amount: 879.19K | Year: 2016

How, from a cloud of dust and gas, did we arrive at a planet capable of supporting life? This is one of the most fundamental of questions, and engages everyone from school children to scientists. We now know much of the answer: we know that stars, such as our Sun, form by the collapse of interstellar clouds of dust and gas. We know that planets, such as Earth, are constructed in a disk around their host star, known as the protoplanetary disk, formed by the rotation of the collapsing cloud of dust and gas. We know that 4.5 billion years ago in the solar nebula, surrounding the young Sun, all the objects in our Solar System were created through a process called accretion. And among all those bodies the only habitable world yet discovered on which life evolved is Earth. There is, however, much that we still do not know about how our Solar System formed. Why, for example, are all the planets so different? Why is Venus an inferno with a thick carbon dioxide atmosphere, Mars a frozen rock with a thin atmosphere, and Earth a haven for life? The answer lies in events that predated the assembly of these planets; it lies in the early history of the nebula and the events that occurred as fine dust stuck together to form larger objects known as planetesimals; and in how those planetesimals changed through collisions, heating and the effects of water to become the building blocks of planets. Our research will follow the evolution of planetary materials from the origins of the first dust grains in the protoplanetary disk, through the assembly of planetesimals within the solar nebula, to the modification of these objects as and after they became planets. Evidence preserved in meteorites provides a record of our Solar System's evolution. Meteorites, together with cosmic dust particles, retain the fine-dust particles from the solar nebula. These dust grains are smaller than a millionth of a metre, but modern microanalysis can expose their minerals and compositions. We will study the fine-grained components of meteorites and cosmic dust to investigate how fine dust began accumulating in the solar nebula; how heating by an early hot nebula and repeated short heating events from collisions affected aggregates of dust grains; and whether magnetic fields helped control the distribution of dust in the solar nebula. We will also use numerical models to simulate how the first, fluffy aggregates of dust were compacted to become rock. As well as the rocky and metallic materials that make up the planets, our research will examine the source of Earth's water and the fate of organic materials that were crucial to the origins of life. By analysing the isotopes of the volatile elements Zn, Cd and Te in meteorites and samples of the Earth, Moon and Mars, we will establish the source and timing of water and other volatiles delivered to the planets in the inner Solar System. In addition, through newly developed methods we can trace the history of organic matter in meteorites from their formation in interstellar space, through the solar nebula and into planetesimals. Reading the highly sensitive record in organic matter will reveal how cosmic chemistry furnished the Solar System with the raw materials for life. Once the planets finally formed, their materials continued to change by surface processes such as impacts and the flow of water. Our research will examine how impacts of asteroids and comets shaped planetary crusts and whether this bombardment endangered or aided the emergence of life.
We will also study the planet Mars, which provides a second example of a planetary body on which life could have appeared. Imagery of ancient lakes on Mars will reveal a crucial period in the planet's history, when global climate change transformed it into an arid wasteland, allowing us to evaluate the opportunity for organisms to adapt and survive and to identify targets for future rover and sample-return missions.


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Research Grant | Award Amount: 2.06M | Year: 2016

Medicine is undergoing a simultaneous shift at the levels of the individual and the population: we have unprecedented tools for precision monitoring and intervention in individual health and we also have high-resolution behavioural and social data. Precision medicine seeks to deploy therapies that are sensitive to the particular genetic, lifestyle and environmental circumstances of each patient: understanding how best to use these numerous features about each patient is a profound mathematical challenge. We propose to build upon the mathematical, computational and biomedical strengths at Imperial to create a Centre for the Mathematics of Precision Healthcare revolving around the theme of multiscale networks for data-rich precision healthcare and public health. Our Centre proposes to use mathematics to unify individual-level precision medicine with public health by placing high-dimensional individual data and refined interventions in their social network context. Individual health cannot be separated from its behavioural and social context; for instance, highly targeted interventions against a cancer can be undermined by metabolic diseases caused by a dietary behaviour which co-varies with social network structure. Whether we want to tackle chronic disease or the diseases of later life, we must simultaneously consider the joint substrates of the individual together with their social network for transmission of behaviour and disease. We propose to tackle the associated mathematical challenges under the proposed Centre, bringing to bear the particular strengths of Imperial's mathematical research in networks and dynamics, stochastic processes and analysis, control and optimisation, and inference and data representation, on the formulation and analysis of mathematical questions at the interface of individual-level personalised medicine and public health, and specifically on the data-rich characterisation of disease progression and transmission, controlled intervention and healthcare provision, placing precision interventions in their wider context. The programme will be initiated and sustained through core research projects and will expand its portfolio of themes and researchers through open calls for co-funded projects and pump-priming initiatives. Our initial set of projects will engage healthcare and clinical resources at Imperial including: (i) patient journeys for disease states in cancer and their successive hospital admissions; multi-omics data and imaging characterisations of (ii) cardiomyopathies and (iii) dementia and co-morbidities; (iv) national population dynamics for epidemiological and epidemic simulation data from Public Health; social networks and (v) health beliefs and (vi) health policy debate. The initial core projects will build upon embedded computational capabilities and data expertise, and will thus concentrate on the development of mathematical methodologies including: sparse state-space methods for the characterisation of disease progression in high-dimensional data using transition graphs in discrete spaces; time-varying networks and control for epidemics data; geometrical similarity graphs to link imaging and omics data for disease progression; stochastic processes and community detection from NHS patient data, marrying behavioural and social network data with personal health indicators; and statistical learning for the analysis of stratified medicine.
The mathematical techniques used to address these requirements will need to combine, among others, ingredients from dynamical and stochastic systems with graph-theoretical notions, sparse statistical learning, inference and optimisation. The Centre will be led by Mathematics but researchers in the Centre span mathematical, biomedical, clinical and computational expertise.
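As a purely illustrative sketch of one methodology named above (characterising disease progression via transition graphs over discrete states), the snippet below estimates an empirical state-transition matrix from hypothetical patient state sequences; the states and trajectories are invented and are not drawn from NHS records.

```python
# Hypothetical sketch: empirical transition matrix from patient state sequences.
import numpy as np

states = ["stable", "progressing", "hospitalised", "remission"]
idx = {s: i for i, s in enumerate(states)}

# Invented example trajectories (one list of observed states per patient).
trajectories = [
    ["stable", "stable", "progressing", "hospitalised", "remission"],
    ["stable", "progressing", "progressing", "hospitalised"],
    ["remission", "stable", "stable"],
]

counts = np.zeros((len(states), len(states)))
for traj in trajectories:
    for a, b in zip(traj[:-1], traj[1:]):
        counts[idx[a], idx[b]] += 1

# Row-normalise to obtain estimated transition probabilities (rows with no data stay zero).
row_sums = counts.sum(axis=1, keepdims=True)
transition = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(transition)
```

In practice such transition graphs would be estimated over far richer, higher-dimensional state definitions, with sparsity and uncertainty handled explicitly.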


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Research Grant | Award Amount: 2.52M | Year: 2013

We currently make more than just fuel from petroleum refining. Many of the plastics, solvents and other products that are used in everyday life are derived from these non-renewable resources. Our research programme aims to replace many of the common materials used as plastics with alternatives created from plants. This will enable us to tie together the UK's desire to move to non-petroleum fuel sources (e.g. biofuels) with our ability to produce renewable polymers and related products. Plant cell walls are made up of two main components: carbohydrate polymers (long chains of sugars) and lignin, which is the glue holding plants together. We will first develop methods of separating these two components using sustainable solvents called ionic liquids. Ionic liquids are salts which are liquids at room temperature, enabling a variety of chemical transformations to be carried out under conditions not normally available to traditional organic solvents. These ionic liquids also reduce pollution as they have no vapours and can be made from non-toxic, non-petroleum-based resources. We will take the isolated carbohydrate polymers and break them down into simple sugars using enzymes, and then further convert those sugars into building blocks for plastics using a variety of novel catalytic materials specifically designed for this process. The lignin stream will also be broken down and rebuilt into new plastics that can replace common materials. All of these renewable polymers will be used in a wide range of consumer products, including packaging materials, plastic containers and construction materials. The chemical feedstocks that we are creating will be flexible (used for chemical, material and fuel synthesis), safe (these feedstocks are predominantly non-toxic) and sustainable (most of the developed products are biodegradable). This will help reduce the overall environmental impact of the material economy in the UK. The chemistry that we will use focuses on creating highly energy-efficient and low-cost ways of making these materials without producing large amounts of waste. We are committed to only developing future manufacturing routes that are benign to the environment in which we all live. In addition, natural material sources often have properties that are superior to those created using artificial means. We plan to exploit these advantages of natural resources in order to produce both replacements for current products and new products with improved performance. This will make our synthetic routes both environmentally responsible and economically advantageous. The UK has an opportunity to take an international lead in this area due to the accumulation of expertise within this country. The overall goal of this project is to develop sustainable manufacturing routes that will stimulate new UK businesses and environmentally responsible means of making common, high-value materials. We will bring together scientific experts in designing processes, manufacturing plastics, growing raw biomass resources and developing new chemistries. The flexibility of resources is vital to the success of this endeavour, as no single plant biomass can be used for manufacturing on a year-round basis. Together with experienced leaders of responsible manufacturing industries, we will develop new ways of making everyday materials in a sustainable and economically beneficial way. The result of this research will be a fundamental philosophical shift in our material, chemical, and energy economy.
The technologies proposed in this work will help break our dependence on rapidly depleting fossil resources and enable us to become both sustainable and self-sufficient. This will result in greater security, less pollution, and a much more reliable and responsible UK economy.


Grant
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 13.53K | Year: 2016

It is widely recognised that ecosystems provide numerous services that are of benefit to humans but, in decisions regarding land and resource use, these tend to be overlooked. Within towns and cities this is particularly the case, as nature is often considered to be absent in urban areas. However, as nearly 80% of the UK population live in urban areas, there is considerable potential for improvements in ecosystem services to have a large impact on quality of life. As a result, the Defra-funded Ecosystem Services in the Urban Water Environment (ESUWE) project has begun to apply an ecosystem services approach to demonstrate the benefits that improvements in the urban water environment can have. It has also been recognised that a collaborative approach to decision making assists with the integrated planning that is required for sustainable catchment management. Therefore, the work of ESUWE also aims to provide tools to communicate with and engage stakeholders in order to facilitate a participatory approach to catchment management. The ESUWE project has identified numerous ecosystem services provided in urban environments and developed metrics to quantify the costs and benefits associated with these. It is now working in four demonstration areas of varying sizes to map and evaluate ecosystem services and to pilot their use in local catchment planning. It is hoped that by communicating information about the benefits of environmental improvements, decisions can be better informed, and that by mapping ecosystem services, areas where interventions will result in multiple benefits can be identified and prioritised. Throughout the ESUWE project, Green Infrastructure (GI) has been highlighted as being important for delivering benefits to urban societies along with providing environmental and hydrological improvements. Therefore, the potential to expand the scope of the work beyond those directly involved with catchment planning has been identified. The Innovation Project will enable the application of the research conducted under the ESUWE project to be extended to meet the needs of a wider range of end users, such as local nature partnerships, local planning authorities and construction companies, so that the impact of the work can be increased. The Innovation Project will facilitate co-development of an ecosystem services mapping approach to the planning of GI with those responsible for land use decisions at local and national levels. This will ensure that the needs of end users are incorporated into the development of decision support tools that facilitate GI planning and help create standardised metrics that can express the benefits of GI for use in differing sectors. Work in four demonstration areas will explore the practical application of the ecosystem services approach, demonstrating the benefits provided by GI and identifying opportunities for these to be increased. This will improve strategic understanding so that the effects of potential land use decisions on levels of services provided in urban areas can be explored. This will help to provide an evidence base that can inform decisions regarding trade-offs and promote interventions that provide increased and multiple benefits. The Innovation Project will also result in case studies quantifying the value of GI, which can be used to promote the need for increased consideration of its provision in land use decisions at both local and national levels.
A partnership approach will also identify how mapping can aid integrated local decision making to support other place based initiatives. Finally, by considering how GI can be implemented in a way that delivers multiple benefits, best practice will be identified and promoted.


Grant
Agency: European Commission | Branch: H2020 | Program: IA | Phase: ICT-22-2014 | Award Amount: 3.64M | Year: 2015

Automatic Sentiment Estimation in the Wild


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 4.16M | Year: 2014

Recently, an influential American business magazine, Forbes, chose Quantum Engineering as one of its top 10 majors (degree programmes) for 2022. According to Forbes magazine (September 2012), "a need is going to arise for specialists capable of taking advantage of quantum mechanical effects in electronics and other products". We propose to renew the CDT in Controlled Quantum Dynamics (CQD) to continue its success in training students to develop quantum technologies in a collaborative manner between experiment and theory and across disciplines. With the ever-growing demand for compactness, controllability and accuracy, the size of opto-electronic devices in particular, and electronic devices in general, is approaching the realm where only fully quantum mechanical theory can explain the fluctuations in (and limitations of) these devices. Pushing the frontiers of the very small and very fast looks set to bring about a revolution in our understanding of many fundamental processes in e.g. physics, chemistry and even biology, with widespread applications. Although the fundamental basis of quantum theory remains intact, more recent theoretical and experimental developments have led researchers to use the laws of quantum mechanics in new and exciting ways - allowing the manipulation of matter on the atomic scale for hitherto undreamt-of applications. This field not only holds the promise of addressing the issue of quantum fluctuations but of turning the quantum behaviour of nanostructures to our advantage. Indeed, the continued development of high technology is crucial and we are convinced that our proposed CDT can play an important role. When a new field emerges, a key challenge in meeting the current and future demands of industry is appropriate training, which is what we propose to achieve in this CDT. The UK plays a leading role in the theory and experimental development of CQD and Imperial College is a centre of excellence within this context. The team involved in the proposed CDT covers a wide range of key activities from theory to experiment. Collectively we have an outstanding track record in research, training of postgraduate students and teaching. The aim of the proposed CDT is to provide a coherent training environment bringing together PhD students from a wide variety of backgrounds and giving them an appreciation of experiment and theory in related fields under the umbrella of CQD. Students graduating from our programme will subsequently find themselves in high demand both by industry and academia. The proposed CDT addresses the EPSRC strategic area Quantum Information Processing and Quantum Optics and one of the priority areas of the CDT call, Towards Quantum Technologies. The excellence of our doctoral training has been recognised by the award of a highly competitive EU Innovative Doctoral Programme (IDP) in Frontiers of Quantum Technology, which will start in October 2013, running for four years with a budget of around 3.8 million euros. The new CDT will work closely with the IDP to maximise synergy. It is clear that other high-profile activities within the general area of CQD are being undertaken in a range of other UK universities and within Imperial College. A key aim of our CDT is inclusivity. We operate a model whereby academics from outside of Imperial College can act as co-supervisors for PhD students on collaborative projects, whereby the student spends part of the PhD at the partner institution whilst remaining closely tied to Imperial College and the student cohort.
Many of the CDT activities, including lectures and summer schools, will be open to other PhD students within the UK. Outreach and transferable skills courses will be emphasised: we will provide a set of outreach classes, organise various outreach activities, including the CDT in CQD Quantum Show for the general public and CDT Festivals, and participate in Imperial's Science Festivals.


Grant
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 583.88K | Year: 2015

Terrestrial biodiversity is declining globally because of human impacts, of which land-use change has so far been the most important. When people change how land is used, many of the species originally present decline or disappear from the area, while others previously absent become established. Although some species are affected immediately, others might only respond later as the consequences of the land-use change ripple through the ecosystem. Such delayed or protracted responses, which we term 'biotic lag', have largely been ignored in large-scale models so far. Another shortcoming of much previous work is that it has focused on numbers of species, rather than what they do. Because winners from the change are likely to be ecologically different from losers, the land-use change impacts how the assemblage functions, as well as how many species it contains. Understanding how - and how quickly - land-use change affects local assemblages is crucial for supporting better land-use decisions in the decades to come, as people try to strike the balance between short-term needs for products from ecosystems and the longer-term need for sustainability. The most obvious way to assess the global effects of land-use change on local ecological communities would be to have monitored how land use and the community have changed over a large, representative set of sites over many decades. The sites have to be representative to avoid a biased result, and the long time scale is needed because the responses can unfold over many years. Because there is no such set of sites, less direct approaches are needed. We are planning to scour the ecological literature for comparisons of communities before and after land-use change. We can correct for bias because we have estimates of how common different changes in land use have been; and we will model how responses change over time after a land-use change so that we can use longer-term and shorter-term studies alike. There are many hundreds of suitable studies, and we will ask the researchers who produced them to share their data with us; we will then make the data available to everyone at the end of the project. We will combine data on species abundances before and after the land-use change with information about their ecological roles, to reveal how - and how quickly - changing land use affects the relative abundances of the various species and the ecological structure and function of the community. Does conversion of natural habitats to agriculture tend to favour smaller species over large ones, for instance, and if so how quickly? Is metabolism faster in more human-dominated land uses? These analyses will require new compilations of trait data for several ecologically important and highly diverse arthropod groups; to produce these, we will make use of the expertise, collections and library of the Natural History Museum. In an earlier NERC-funded project (PREDICTS: www.predicts.org.uk), we have already compiled over 500 data sets - provided by over 300 different researchers - that compared otherwise-matched sites where land use differed. The PREDICTS database has amassed over 2,000,000 records, from over 18,000 sites in 88 countries, and contains records for more than 1% of all formally described species. Our analyses of this unprecedentedly large and representative data set indicate that land-use change has had a marked global impact on average local diversity.
However, because PREDICTS data sets are spatial rather than temporal comparisons, they are not well suited to analysing the dynamics of how assemblages respond to land-use change. More fundamentally, the PREDICTS assumption that spatial comparisons are an adequate substitute for temporal data now needs testing. This proposal will deliver the necessary tests, as well as producing the most comprehensive picture of how land-use change reshapes ecological assemblages through time.
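A minimal, purely illustrative sketch of the kind of lag model described above (responses that continue to unfold after a land-use change) is an exponential approach to a new equilibrium,

\[
D(t) = D_\infty + (D_0 - D_\infty)\, e^{-t/\tau},
\]

where \(D(t)\) is local diversity at time t after the change, \(D_0\) the pre-change value, \(D_\infty\) the new equilibrium, and \(\tau\) a characteristic lag time; fitting \(\tau\) across studies of different durations is one way to combine shorter- and longer-term comparisons. The functional form and symbols here are assumptions for illustration, not the project's chosen model.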


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Research Grant | Award Amount: 1.49M | Year: 2013

This project will develop new nanometre-sized catalysts and (electro-) chemical processes for producing fuels, including methanol, methane, gasoline and diesel, and chemical products from waste carbon dioxide. It builds upon a successful first phase in which a new, highly controlled nanoparticle catalyst was developed and used to produce methanol from carbon dioxide; the reaction is a pertinent example of the production of a liquid fuel and chemical feedstock. In addition, we developed high temperature electrochemical reactions and reactors for the production of synthesis gas (carbon monoxide and hydrogen) and oxygen from carbon dioxide and water. In this second phase of the project, we shall extend the production of fuels to include methanol, methane, gasoline and diesel, by integrating suitably complementary processes, using energy from renewable sources or off-peak electricity. The latter option is particularly attractive as a means to manage electricity loads as more renewables are integrated with the national power grid. In parallel, we will apply our new nanocatalysts to enable the copolymerization of carbon dioxide with epoxides to produce polycarbonate polyols, components of home insulation foams (polyurethanes). The approach is both commercially and environmentally attractive due to the replacement of 30-50% of the usual petrochemical carbon source (the epoxide) with carbon dioxide, and may be commercialised in the relatively near term. These copolymers are valuable products in their own right and provide a commercial-scale proving ground for the technology, before addressing integration into the larger scale challenges of fuel production and energy management. The programme will continue to improve our catalyst performance and our understanding, to enable carbon dioxide transformations to a range of valuable products. The work will be coupled with a comprehensive process systems analysis in order to develop the most practical and valuable routes to implementation. Our goal is to continue to build on our existing promising results to advance the technology towards commercialisation; the research programme will focus on: 1) Catalyst optimization and scale-up so as to maximise the activities and selectivities for target products. 2) Development and optimization of the process conditions and engineering for the nanocatalysts, including testing and modelling new reactor designs. 3) Process integration and engineering to enable tandem catalyses and efficient generation of renewable fuels, including integration with renewable energy generation taking advantage of off-peak electrical power availability. 4) Detailed economic, energetic, environmental and life cycle analysis of the processes. We will work closely with industrial partners to ensure that the technologies are practical and that key potential impediments to application are addressed. We have a team of seven companies which form our industrial advisory board, representing stakeholders from across the value chain, including: E.On, National Grid, Linde, Johnson Matthey, Simon Carves, Econic Technologies, and Shell.
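For context only (a textbook relation, not a statement of the project's specific chemistry), catalytic hydrogenation of carbon dioxide to methanol follows the overall stoichiometry

\[
\mathrm{CO_2 + 3\,H_2 \rightarrow CH_3OH + H_2O},
\]

which makes clear why a cheap, low-carbon source of hydrogen (for example from renewable or off-peak electricity) is central to the economics of such fuel-from-CO2 routes.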


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.9.5 | Award Amount: 2.30M | Year: 2014

In a radical paradigm shift, manufacturers are now moving from multicore chips to so-called manycore chips with up to a million independent processors on the same silicon real estate. However, software cannot benefit from the revolutionary potential power increase unless the design and code are polluted by an unprecedented amount of low-level, fine-grained concurrency detail. Concurrency in mainstream object-oriented languages is based on multithreading. Due to the complexity of balancing work evenly across cores, the thread model is of little benefit for efficient processor use or horizontal scalability. Problems are exacerbated in languages with shared mutable state and a stable notion of identity -- the very foundations of object-orientation. The advent of manycore chips threatens to make not only the object-oriented model obsolete, but also the accumulated know-how of a generation of programmers. Our vision is to provide the means for industry to efficiently develop applications that seamlessly scale to the available parallelism of manycore chips without abandoning the object-oriented paradigm and the associated software engineering methodologies. We will realise this vision by a breakthrough in how parallelism and concurrency are integrated into programming languages, substantiated by a complete inversion of the current canonical language design: constructs facilitating concurrent computation will be the default, while constructs facilitating synchronised and sequential computation will be explicitly expressed. UpScale will exploit this inversion for a novel agile development methodology based on incremental type-based program annotations specifying ever-richer deployment-related information, and for innovative type-based deployment optimisations both at compile time and at runtime in the runtime system devised in UpScale for massively parallel execution. The targeted breakthrough will profoundly impact software development for the manycore chips of the future.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: NMP-29-2015 | Award Amount: 8.00M | Year: 2016

A definitive conclusion about the dangers associated with human or animal exposure to a particular nanomaterial can currently be reached only after complex and costly procedures, including complete NM characterisation followed by careful and well-controlled in vivo experiments. Significant progress in robust nanotoxicity prediction can be achieved using modern approaches based, on the one hand, on systems biology and, on the other, on statistical and other computational methods of analysis. In this project, using a comprehensive, self-consistent study that includes in-vivo, in-vitro and in-silico research, we address the main respiratory toxicity pathways for a representative set of nanomaterials, identify the mechanistic key events of the pathways, and relate them to interactions at the bio-nano interface via careful post-uptake nanoparticle characterisation and molecular modelling. This approach will allow us to formulate a novel set of toxicological mechanism-aware end-points that can be assessed by means of economical and straightforward tests. Using the exhaustive list of end-points and pathways for the selected nanomaterials and exposure routes, we will enable clear discrimination between different pathways and relate each toxicity pathway to the properties of the material via intelligent QSARs. If successful, this approach will allow grouping of materials based on their ability to produce the pathway-relevant key events and identification of properties of concern for new materials, and will help to reduce the need for blanket toxicity testing and animal testing in the future.
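As a hedged, self-contained illustration of the kind of QSAR modelling mentioned above (relating material properties to a toxicity-related outcome), the sketch below fits a simple regression on invented particle descriptors; the descriptor names, data and model choice are assumptions for illustration only, not the project's actual QSAR methodology.

```python
# Hypothetical QSAR sketch: predict a toxicity-related end-point from particle descriptors.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 60
# Invented descriptors: size (nm), zeta potential (mV), specific surface area (m^2/g).
X = np.column_stack([
    rng.uniform(10, 200, n),
    rng.uniform(-40, 40, n),
    rng.uniform(5, 300, n),
])
# Invented end-point loosely dependent on the descriptors, plus noise.
y = 0.02 * X[:, 2] - 0.01 * X[:, 0] + 0.005 * np.abs(X[:, 1]) + rng.normal(0, 0.2, n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```

Grouping materials by predicted pathway-relevant behaviour would then amount to clustering or thresholding such model outputs rather than testing every new material exhaustively.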


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: GC.NMP.2013-1 | Award Amount: 9.01M | Year: 2013

MARS-EV aims to overcome the ageing phenomenon in Li-ion cells by focusing on the development of high-energy electrode materials (250 Wh/kg at cell level) via sustainable scaled-up synthesis, and safe electrolyte systems with improved cycle life (> 3,000 cycles at 100% DOD). Through industrial prototype cell assembly and testing coupled with modelling, MARS-EV will improve the understanding of ageing behaviour at the electrode and system levels. Finally, it will address a full life cycle assessment of the developed technology. The MARS-EV proposal has six objectives: (i) synthesis of novel nano-structured, high-voltage cathodes (Mn, Co and Ni phosphates and low-cobalt, Li-rich NMC) and high-capacity anodes (silicon alloys and interconversion oxides); (ii) development of green and safe electrolyte chemistries, including ionic liquids, with high performance even at ambient and sub-ambient temperature, as well as electrolyte additives for safe high-voltage cathode operation; (iii) investigation of the peculiar electrolyte properties and their interactions with anode and cathode materials; (iv) understanding the ageing and degradation processes with the support of modelling, in order to improve the electrode and electrolyte properties and, thus, their reciprocal interactions and their effects on battery lifetime; (v) realization of up to B5-format pre-industrial pouch cells with optimized electrode and electrolyte components and eco-designed durable packaging; and (vi) boosting EU cell and battery manufacturers via the development of economically viable and technologically feasible advanced materials and processes, and the realization of high-energy, ageing-resistant, easily recyclable cells. MARS-EV brings together partners with complementary skills and expertise, including industry covering the complete chain from active materials suppliers to cell and battery manufacturers, thus ensuring that developments in MARS-EV will directly improve European battery production capacities.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: HCO-05-2014 | Award Amount: 3.61M | Year: 2015

South Asians, who represent one-quarter of the world's population, are at high risk of type-2 diabetes (T2D). Intensive lifestyle modification (healthy diet and physical activity) is effective at preventing T2D amongst South Asians with impaired glucose tolerance, but this approach is limited by high cost, poor scalability and low impact on T2D burden. We will complete a cluster-randomised clinical trial at 120 locations across India, Pakistan, Sri Lanka and the UK. We will compare family-based intensive lifestyle modification (22 health promotion sessions from a community health worker, active group, N=60 sites) vs usual care (1 session, control group, N=60 sites) for prevention of T2D, amongst 3,600 non-diabetic South Asian men and women with central obesity (waist ≥ 100 cm) and/or prediabetes (HbA1c ≥ 6.0%). Participants will be followed annually for 3 years. The primary endpoint will be new-onset T2D (physician diagnosis on treatment or HbA1c ≥ 6.0%, predicted N~734 over 3 years). Secondary endpoints will include waist and weight in the index case and family members. Our study has 80% power to identify a reduction in T2D risk with family-based intervention vs usual care of: 30% in South Asians with central obesity; 24% in South Asians with prediabetes; and 24% overall. Health economic evaluation will determine the cost-effectiveness of family-based lifestyle modification for prevention of T2D amongst South Asians with central obesity and/or prediabetes. The impact of gender and socio-economic factors on clinical utility and cost-effectiveness will be investigated. Our results will determine whether screening by waist circumference and/or HbA1c, coupled with intervention by family-based lifestyle modification, is an efficient, effective and equitable strategy for prevention of T2D in South Asians. Our findings will thereby provide a robust evidence base for scalable community-wide approaches to reverse the epidemic of T2D amongst the >1.5 billion South Asians worldwide.
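As a rough, illustrative companion to the power statement above, the snippet below shows how a simple two-proportion power calculation can be set up with statsmodels; the event rates are invented, individual randomisation is assumed (a real cluster-randomised design would inflate the sample size by a design effect), and the numbers are not the trial's actual calculation.

```python
# Hypothetical power-calculation sketch for comparing two proportions.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_control = 0.20          # assumed 3-year T2D incidence under usual care (invented)
p_active = 0.20 * 0.76    # assumed 24% relative risk reduction in the active group

effect = proportion_effectsize(p_control, p_active)
n_per_arm = NormalIndPower().solve_power(effect_size=effect, alpha=0.05, power=0.80)
print(f"participants per arm (individual randomisation): {n_per_arm:.0f}")
# A cluster-randomised trial would multiply this by a design effect, 1 + (m - 1) * ICC,
# where m is the average cluster size and ICC the intracluster correlation coefficient.
```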


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SC5-11a-2014 | Award Amount: 6.57M | Year: 2015

The overall aim of Real-Time Mining is to develop a real-time framework to decrease environmental impact and increase resource efficiency in the European raw material extraction industry. The key concept of the proposed research is a paradigm shift from discontinuous, intermittent process monitoring to a continuous process and quality management system in highly selective mining operations. Real-Time Mining will develop a real-time process-feedback control loop that rapidly links online data acquired during extraction at the mining face with a sequentially updatable resource model, coupled with real-time optimization of long-term planning, short-term sequencing and production control decisions. The project will include research and demonstration activities integrating automated sensor-based material characterization, online machine performance measurements, underground navigation and positioning, underground mining system simulation and optimization of planning decisions, and state-of-the-art updating techniques for resource/reserve models. The project is expected to benefit the environment through a reduction in CO2 emissions, increased energy efficiency and the production of zero waste by maximizing process efficiency and resource utilization. Deposits that are currently economically marginal or difficult to access will become industrially viable. This will result in a sustainable increase in the competitiveness of the European raw material extraction industry through a reduced dependency on raw materials from non-EU sources.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-08-2014 | Award Amount: 25.06M | Year: 2015

The TBVAC2020 proposal builds on the highly successful and long-standing collaborations in subsequent EC-FP5-, FP6- and FP7-funded TB vaccine and biomarker projects, but also brings in a large number of new key partners from excellent laboratories in Europe, the USA, Asia, Africa and Australia, many of which are global leaders in the TB field. This was initiated by launching, prior to this application, an open call for Expressions of Interest (EoI) to which interested parties could respond. In total, 115 EoIs were received and ranked by the TBVI Steering Committee using proposed H2020 evaluation criteria. This led to the prioritisation of 52 R&D approaches included in this proposal. TBVAC2020 aims to innovate and diversify the current TB vaccine and biomarker pipeline while at the same time applying portfolio management, using gating and priority-setting criteria, to select as early as possible the most promising TB vaccine candidates and accelerate their development. TBVAC2020 proposes to achieve this by combining creative bottom-up approaches for vaccine discovery (WP1), new preclinical models addressing clinical challenges (WP2) and identification and characterisation of correlates of protection (WP5) with a directive top-down portfolio management approach aiming to select the most promising TB vaccine candidates by their comparative evaluation using objective gating and priority-setting criteria (WP6), and by supporting direct, head-to-head or comparative preclinical and early clinical evaluation (WP3, WP4). This approach will both innovate and diversify the existing TB vaccine and biomarker pipeline and accelerate the development of the most promising TB vaccine candidates through the early development stages. The proposed approach and the involvement of many internationally leading groups in the TB vaccine and biomarker area in TBVAC2020 fully align with the Global TB Vaccine Partnerships (GTBVP).


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FETPROACT-2-2014 | Award Amount: 2.89M | Year: 2015

Contemporary research endeavours aim at equipping artificial systems with human-like cognitive skills, in an attempt to promote their intelligence beyond repetitive task accomplishment. However, despite the crucial role that the sense of time has in human cognition, both in perception and action, the capacity of artificial agents to experience the flow of time remains largely unexplored. The inability of existing systems to perceive time constrains their potential understanding of the inherent temporal characteristics of the dynamic world, which in turn acts as an obstacle to their symbiosis with humans. Time perception is, without doubt, not an optional extra, but a necessity for the development of truly autonomous, cognitive machines. TIMESTORM aims at bridging this fundamental gap by shifting the focus of human-machine confluence to the temporal, short- and long-term aspects of symbiotic interaction. The integrative pursuit of research and technological developments in time perception will contribute significantly to ongoing efforts in deciphering the relevant brain circuitry and will also give rise to innovative implementations of artifacts with profoundly enhanced cognitive capacities. Equipping artificial agents with temporal cognition establishes a new framework for the investigation and integration of knowing, doing, and being in artificial systems. The proposed research will study the principles of time processing in the human brain and their replication in-silico, adopting a multidisciplinary research approach that involves developmental studies, brain imaging, computational modelling and embodied experiments. By investigating artificial temporal cognition, TIMESTORM inaugurates a novel research field in cognitive systems with the potential to contribute to the advent of next-generation intelligent systems, significantly promoting the seamless integration of artificial agents in human societies.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2012.2.1.2-2 | Award Amount: 23.12M | Year: 2012

METACARDIS applies a systems medicine multilevel approach to identify biological relationships between gut microbiota, assessed by metagenomics, and host genome expression regulation, which will improve understanding and innovative care of cardiometabolic diseases (CMD) and their comorbidities. CMD comprise metabolic (obesity, diabetes) and heart diseases characterized by a chronic evolution in ageing populations and costly treatments. Therapies require novel integrated approaches taking into account the natural evolution of CMD. METACARDIS associates European leaders in metagenomics, who have been successful in establishing the structure of the human microbiome as part of the EU FP7 MetaHIT consortium, with clinical and fundamental researchers, SMEs, patient associations and food companies to improve the understanding of pathophysiological mechanisms, prognosis and diagnosis of CMD. We will use next-generation sequencing technologies and high-throughput metabolomic platforms to identify gut microbiota- and metabolomic-derived biomarkers and targets associated with CMD risks. The pathophysiological role of these markers will be tested in both preclinical models and replication cohorts, allowing the study of CMD progression in patients recruited at three European clinical centres of excellence. Their impact on host gene transcription will be characterised in patients selected for typical features of CMD evolution. Application of computational models and visualisation tools to complex datasets combining clinical information, environmental patterns and gut microbiome, metabolome and transcriptome data is a central integrating component of the research, which will be driven by world leaders in metagenomic and functional genomic data analysis. These studies will identify novel molecular targets, biomarkers and predictors of CMD progression, paving the way for personalized medicine in CMD.


Grant
Agency: European Commission | Branch: FP7 | Program: CSA | Phase: ICT-2013.3.2 | Award Amount: 972.22K | Year: 2014

The mission of this project is to make a significant contribution to raising awareness about the importance of Photonics, aiming to have an impact on young minds, entrepreneurs and society as a whole. GoPhoton! aims to transmit a critical message across Europe: Photonics is ubiquitous and pervasive; it is a key enabler of the European economy and job creation, and it offers outstanding career and business opportunities. The project intends to address these challenges through a series of actions that will be developed through a collaborative network, the European Centres for Outreach in Photonics (ECOP), an alliance committed to creating durable long-term partnerships for enhanced engagement in Photonics outreach. The project aims at strongly involving the relevant European stakeholders, seeking synergies with Photonics 21, the industrial clusters and educational networks, and a possible International Year of Light in 2015. The existing tight links of the project partners with the local educational networks (teachers and science museums) will be extensively used and amplified to engage youth. Communication specialists and media will play a critical role as multipliers of the message, to make Photonics a household word and to reach out to the general public.


Grant
Agency: European Commission | Branch: H2020 | Program: SGA-RIA | Phase: FETFLAGSHIP | Award Amount: 89.00M | Year: 2016

Understanding the human brain is one of the greatest scientific challenges of our time. Such an understanding can provide profound insights into our humanity, leading to fundamentally new computing technologies, and transforming the diagnosis and treatment of brain disorders. Modern ICT brings this prospect within reach. The HBP Flagship Initiative (HBP) thus proposes a unique strategy that uses ICT to integrate neuroscience data from around the world, to develop a unified multi-level understanding of the brain and its diseases, and ultimately to emulate its computational capabilities. The goal is to catalyze a global collaborative effort. During the HBP's first Specific Grant Agreement (SGA1), the HBP Core Project will outline the basis for building and operating a tightly integrated Research Infrastructure, providing HBP researchers and the scientific community with unique resources and capabilities. Partnering Projects will enable independent research groups to expand the capabilities of the HBP Platforms, in order to use them to address otherwise intractable problems in neuroscience, computing and medicine in the future. In addition, collaborations with other national, European and international initiatives will create synergies, maximizing returns on research investment. SGA1 covers the detailed steps that will be taken to move the HBP closer to achieving its ambitious Flagship Objectives.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EE-05-2016 | Award Amount: 2.90M | Year: 2016

THERMOS (Thermal Energy Resource Modelling and Optimisation System) will develop the methods, data, and tools to enable public authorities and other stakeholders to undertake more sophisticated thermal energy system planning far more rapidly and cheaply than they can today. This will amplify and accelerate the development of new low carbon heating and cooling systems across Europe, and enable faster upgrade, refurbishment and expansion of existing systems. The project will realise these benefits at the strategic planning level (quantification of technical potential, identification of new opportunities) and at the project level (optimisation of management and extension of existing and new systems). These outcomes will be achieved through: a) Development of address-level heating and cooling energy supply and demand maps, initially for the four Pilot Cities, and subsequently for the four Replication partners - establishing a standard method and schema for high resolution European energy mapping, incorporating a wide range of additional spatial data needed for modelling and planning of thermal energy systems, and their interactions with electrical and transport energy systems; b) Design and implementation of fast algorithms for modelling and optimising thermal systems, incorporating real-world cost, benefit and performance data, and operating both in wide area search, and local system optimisation contexts; c) Development of a free, open-source software application integrating the spatial datasets with the search and system optimisation algorithms (trialled and tested through the public authorities representing four Pilot Cities); d) Supporting implementation of the energy system mapping methodology, and subsequently the use of the THERMOS software, with a further four Replication Cities/Regions, from three more EU Member States; e) Comprehensive dissemination of mapping outputs and free software tools, targeting public authorities and wider stakeholders across Europe.
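As a purely illustrative sketch of the kind of network optimisation such planning tools perform (connecting heat demands to a supply point over candidate pipe routes at minimum cost), the snippet below uses a Steiner-tree approximation over a small invented street graph; the nodes, costs and library choice are assumptions and do not represent THERMOS's actual algorithms or data.

```python
# Hypothetical sketch: minimum-cost network connecting a heat source to demand points.
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

# Invented street graph; edge weights are notional pipe-routing costs.
G = nx.Graph()
G.add_weighted_edges_from([
    ("plant", "a", 4), ("a", "b", 2), ("b", "school", 3),
    ("a", "c", 5), ("c", "flats", 1), ("b", "flats", 4),
    ("c", "offices", 2), ("plant", "c", 9),
])

terminals = ["plant", "school", "flats", "offices"]  # supply point plus demand sites
network = steiner_tree(G, terminals, weight="weight")

total_cost = sum(d["weight"] for _, _, d in network.edges(data=True))
print(sorted(network.edges()), "total cost:", total_cost)
```

A real planning tool would layer demand estimates, pipe sizing, heat losses and economic parameters on top of this kind of graph search, but the routing core is conceptually similar.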


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2013.2.2.1-4 | Award Amount: 15.74M | Year: 2013

Epilepsy is a devastating condition affecting over 50 million people worldwide. This multidisciplinary project is focused on the process leading to epilepsy, epileptogenesis, in adults. Our main hypothesis is that there are combinations of various causes, acting in parallel and/or in succession, that lead to epileptogenesis and the development of seizures. Our central premise and vision is that a combinatorial approach is necessary to identify appropriate biomarkers and develop effective antiepileptogenic therapeutics. The project will focus on identifying novel biomarkers and their combinations for epileptogenesis after potentially epileptogenic brain insults in clinically relevant animal models, such as traumatic brain injury (TBI) and status epilepticus (SE); exploring multiple basic mechanisms of epileptogenesis and their mutual interactions related to cell degeneration, circuit reorganization, inflammatory processes, free radical formation, altered neurogenesis, BBB dysfunction, and genetic and epigenetic alterations; and translating these findings towards the clinic by validating biomarkers identified from animal models in human post-TBI brain tissue and blood samples, post-mortem brain tissue from individuals who died soon after SE, and human brain and blood samples from chronic epilepsy cases. The project will identify novel combinatorial biomarkers and novel disease-modifying combinatorial treatment strategies for epileptogenesis, create an Epilepsy Preclinical Biobank, and validate the translational potential of results from animal models in human tissue. To adequately address the proposed goals, the project will develop technological breakthroughs, such as completely novel chemogenetic approaches, novel MRI techniques, novel multimodal organic recording devices for simultaneous recordings of EEG/cellular unit activity and biochemical measurements, novel bioluminescence for in vivo promoter activity analysis, and novel systems biology approaches.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-24-2015 | Award Amount: 18.47M | Year: 2016

The management of febrile patients is one of the most common and important problems facing healthcare providers. Distinguishing bacterial infection from trivial viral infection on clinical grounds is unreliable, and as a result innumerable patients worldwide undergo hospitalization and invasive investigation, and are treated with antibiotics for presumed bacterial infection when, in fact, they are suffering from self-resolving viral infection. We aim to improve the diagnosis and management of febrile patients by applying sophisticated phenotypic, transcriptomic (genomic, proteomic) and bioinformatic approaches to well-characterised, large-scale, multi-national patient cohorts already recruited with EU funding. We will identify and validate promising new discriminators of bacterial and viral infection, including transcriptomic and clinical phenotypic markers. The most accurate markers distinguishing bacterial and viral infection will be evaluated in prospective cohorts of patients reflecting the different healthcare settings across European countries. By linking sophisticated new genomic and proteomic approaches to careful clinical phenotyping, and building on pilot data from our previous studies, we will develop a comprehensive management plan for febrile patients which can be rolled out in healthcare systems across Europe.
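To make the notion of a transcriptomic "discriminator" concrete, the sketch below fits a simple logistic-regression classifier to synthetic expression values for two hypothetical host transcripts (gene_a and gene_b are invented names, and all patients and data are simulated). It illustrates the general approach of evaluating a marker signature on held-out patients, not the project's actual analysis pipeline.

```python
# A minimal sketch, assuming synthetic data: a two-transcript bacterial-vs-viral
# classifier evaluated by area under the ROC curve on held-out patients.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200  # synthetic febrile patients, half bacterial (label 1), half viral (label 0)
labels = np.repeat([1, 0], n // 2)
gene_a = rng.normal(loc=np.where(labels == 1, 2.0, 0.0), scale=1.0)   # higher in bacterial
gene_b = rng.normal(loc=np.where(labels == 1, -1.0, 0.5), scale=1.0)  # higher in viral
X = np.column_stack([gene_a, gene_b])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
probs = clf.predict_proba(X_te)[:, 1]  # predicted probability of bacterial infection
print(f"AUC of the 2-transcript signature on held-out patients: {roc_auc_score(y_te, probs):.2f}")
```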


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Research Grant | Award Amount: 6.65M | Year: 2015

Society faces major challenges that require disruptive new materials solutions. For example, there is a worldwide demand for materials for sustainable energy applications, such as safer new battery technologies or the efficient capture and utilization of solar energy. This project will develop an integrated approach to designing, synthesizing and evaluating new functional materials, which will be developed across organic and inorganic solids, and also hybrids that contain both organic and inorganic modules in a single solid. The UK is well placed to boost its knowledge economy by discovering breakthrough functional materials, but there is intense global competition. Success, and long-term competitiveness, is critically dependent on developing improved capability to create such materials. All technologically advanced nations have programmes that address this challenge, exemplified by the $100 million of initial funding for the US Materials Genome Initiative. The traditional approach to building functional materials, where the properties arise from the placement of the atoms, can be contrasted with large-scale engineering. In engineering, the underpinning Newtonian physics is understood to the point that complex structures, such as bridges, can be constructed with millimetre precision. By contrast, the engineering of functional materials relies on a much less perfect understanding of the relationship between structure and function at the atomic level, and a still limited capability to achieve atomic-level precision in synthesis. Hence, the failure rate in new materials synthesis is enormous compared with large-scale engineering, and this requires large numbers of researchers to drive success, placing the UK at a competitive disadvantage compared to larger countries. The current difficulty of materials design at the atomic level also leads to cultural barriers: in building a bridge, the design team would work closely with the engineering construction team throughout the process. By contrast, the direct, day-to-day integration of theory and synthesis to identify new materials is not common practice, despite impressive advances in the ability of computation to tackle more complex systems. This is a fundamental challenge in materials research. This Programme Grant will tackle the challenge by delivering the daily working-level integration of computation and experiment to discover new materials, driven by a closely interacting team of specialists in structure and property prediction, measurement and materials synthesis. Key to this will be unique methods developed by our team that led to recent landmark publications in Science and Nature. We are therefore internationally well placed to deliver this timely vision. Our approach will enable discovery of functional materials on a much faster timescale. It will have broad scope, because we will develop it across materials types with a range of targeted properties. It will have disruptive impact because it uses chemical understanding and experiment in tandem with calculations that directly exploit chemical knowledge. In the longer term, the approach will enable a wide range of academic and industrial communities in chemistry and also in physics and engineering, where there is often a keener understanding of the properties required for applications, to design better materials.
This approach will lead to new materials, such as battery electrolytes, materials for information storage, and photocatalysts for solar energy conversion, that are important societal and commercial targets in their own right. We will exploit discoveries and share the approach with our commercial partners via the Knowledge Centre for Materials Chemistry and the new Materials Innovation Factory, a £68 million UK capital investment in state-of-the-art materials research facilities for both academic and industrial users. Industry and the Universities commit 55% of the project cost.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRAIA-01-2016-2017 | Award Amount: 10.00M | Year: 2017

Experimentation in mesocosms is arguably the single most powerful approach to obtain a mechanistic quantitative understanding of ecosystem-level impacts of stressors in complex systems, especially when embedded in long-term observations, theoretical models and experiments conducted at other scales. AQUACOSM builds on an established European network of mesocosm research infrastructures (RI), the FP7 Infra project MESOAQUA (2009-2012), where 167 users successfully conducted 74 projects. AQUACOSM greatly enhances that network on pelagic marine systems in at least 3 ways: first by expanding it to 10 freshwater (rivers and lakes), 2 brackish and 2 benthic marine facilities, and by involving 2 SMEs and reaching out to more, thereby granting effective transnational access to world-leading mesocosm facilities to >340 users on >11500 days; second, by integrating scattered know-how between freshwater and marine RI; and third, by uniting aquatic mesocosm science in an open network beyond the core consortium, with industry involved in an ambitious innovation process, to promote ground-breaking developments in mesocosm technology, instrumentation and data processing. A new dimension of experimental ecosystem science will be reached by coordinated mesocosm experiments along transects from the Mediterranean to the Arctic and beyond salinity boundaries. These efforts will culminate in a joint research activity (JRA) to assess aquatic ecosystem responses across multiple environmental gradients to a selected climate-related key stressor with repercussions for ecosystem services. Overall, AQUACOSM will fill a global void by forging an integrated freshwater and marine research infrastructure network. Long-term sustainability is sought through assessing governance models based on science priorities and economic innovation opportunities. Linkages to and synergies with ESFRI RI and other large initiatives are ensured by AQUACOSM partners and Advisory Board members in those programs.


Grant
Agency: European Commission | Branch: H2020 | Program: FCH2-RIA | Phase: FCH-03-1-2016 | Award Amount: 2.09M | Year: 2017

Project MEMPHYS, MEMbrane based Purification of HYdrogen System, targets the development of a stand-alone hydrogen purification system based on a scalable membrane hydrogen purification module. Applications include, for instance, hydrogen recovery from biomass fermentation, industrial pipelines, storage in underground caverns, and industrial waste gas streams. The consortium consists of six partners including two universities, two research institutes, and two companies from five different countries. The overall budget totals €2 million, with individual partner budgets between €220,000 and €500,000. This project will utilize an electrochemical hydrogen purification (EHP) system. EHP has been proven to produce high-purity hydrogen (5N) while maintaining low energy consumption, because the purification and optional compression are electrochemical and isothermal processes. A low CAPEX for the EHP system is feasible due to the significant reductions of system costs that result from recent design improvements and market introductions of various electrochemical conversion systems such as hydrogen fuel cells. In detail, the purification process will be a two-step process. A catalyst-coated proton exchange membrane will be assisted by one selectively permeable polymer membrane. Standard catalysts are sensitive to impurities in the gas. Therefore, alternative anode catalysts for the EHP cell, an anti-poisoning strategy and an on-board diagnostic system will be developed. The resulting MEMPHYS system will be multi-deployable for purification of a large variety of hydrogen sources. A valuable feature of the MEMPHYS system is the simultaneous compression of the purified hydrogen up to 200 bar, facilitating the transport and storage of the purified hydrogen. The MEMPHYS project offers the European Union an excellent chance to advance its expertise in electrochemical conversion systems on a continental level, while at the same time promoting the use and establishment of hydrogen-based renewable energy systems.
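As a rough, purely illustrative check (not MEMPHYS data) on why isothermal electrochemical compression is energetically attractive, the ideal (Nernst) cell voltage and work needed to compress hydrogen from 1 bar to the 200 bar mentioned above can be estimated as follows; the temperature and pressures are assumed values for illustration.

```latex
% Ideal electrochemical compression of H2 (illustrative back-of-the-envelope only):
\[
  E_{\mathrm{comp}} \;=\; \frac{RT}{2F}\,\ln\!\frac{p_2}{p_1},
  \qquad
  W \;=\; 2F\,E_{\mathrm{comp}} \;=\; RT\,\ln\!\frac{p_2}{p_1}
\]
% With T = 298 K, p1 = 1 bar, p2 = 200 bar:
\[
  E_{\mathrm{comp}} \approx \frac{8.314 \times 298}{2 \times 96485}\,\ln 200
  \approx 0.068\ \mathrm{V},
  \qquad
  W \approx 13\ \mathrm{kJ\,mol^{-1}} \approx 1.8\ \mathrm{kWh\,kg^{-1}\ H_2}
\]
```

Real cells add ohmic and activation losses on top of this thermodynamic minimum, but the estimate indicates why combining purification with electrochemical compression is attractive compared with separate mechanical compression stages.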


Grant
Agency: European Commission | Branch: H2020 | Program: CS2-RIA | Phase: JTI-CS2-2016-CFP03-LPA-02-11 | Award Amount: 1.65M | Year: 2017

In SORCERER, revolutionary lightweight electrical energy-storing composite materials for future electric and hybrid-electric aircraft are to be developed. Building on previous research, novel lightweight supercapacitor composites, structural batteries and structural energy-generating composite materials are to be realised for aeronautical application and demonstrated at the systems level. Such demonstration ranges from table-top demonstrators for structural batteries and energy harvesting materials to aircraft component demonstrators for structural supercapacitors. The SORCERER consortium consists of world-leading research groups on structural power composites. The team has an outstanding scientific track record in research covering all aspects of structural power composite development and manufacture, namely: multifunctional matrices (SPE) and carbon fibres (i.e. constituents); separator materials and designs; structural electrodes; connectivity and power management; and materials modelling and design. In SORCERER we will build on this experience to adapt current structural power composite solutions for aeronautical applications as well as to develop new materials and devices for the aircraft application. By the end of the project each technology, i.e. structural supercapacitor, battery and power generation device, will have matured and as a result been brought up at least one step on the TRL scale. In particular, the developed devices will be demonstrated at the systems level. For the structural battery and power generation composite materials, this will be the world's first demonstration at that level of complexity.


Patent
Imperial Innovations Ltd and Imperial College London | Date: 2016-08-10

A method for estimating a channel in a wireless communication system in which a plurality of terminals and a base station communicate with each other, according to one embodiment of the present invention, comprises the steps of: receiving reference signals transmitted through a plurality of slots; and estimating a channel by using the reference signals. Here, for the channel estimation, the number of reference signals received through at least one slot among the plurality of slots is different from the number of reference signals received through the other slots.
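For background on how reference-signal-based channel estimation is typically done, the sketch below combines per-pilot least-squares estimates across slots that carry different numbers of reference signals. It is a generic illustration with a synthetic flat-fading channel and arbitrarily chosen pilot counts and noise level, not the patented method.

```python
# A minimal sketch, assuming a synthetic single-tap channel: pool least-squares
# pilot estimates from slots with unequal reference-signal counts.
import numpy as np

rng = np.random.default_rng(1)
h_true = 0.8 * np.exp(1j * 0.6)   # unknown flat-fading channel coefficient (invented)
pilots_per_slot = [4, 2, 2]       # unequal reference-signal counts per slot (invented)
noise_std = 0.1

estimates = []
for n_pilots in pilots_per_slot:
    x = np.ones(n_pilots, dtype=complex)                       # known reference symbols
    noise = noise_std * (rng.standard_normal(n_pilots)
                         + 1j * rng.standard_normal(n_pilots)) / np.sqrt(2)
    y = h_true * x + noise                                     # received reference signals
    estimates.append(y / x)                                    # per-pilot LS estimates

h_hat = np.mean(np.concatenate(estimates))                     # pool pilots from all slots
print(f"true |h| = {abs(h_true):.3f}, phase = {np.angle(h_true):.3f} rad")
print(f"est  |h| = {abs(h_hat):.3f}, phase = {np.angle(h_hat):.3f} rad")
```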


Eda G.,National University of Singapore | Maier S.A.,Imperial College London
ACS Nano | Year: 2013

Semiconducting two-dimensional (2D) crystals such as MoS2 and WSe2 exhibit unusual optical properties that can be exploited for novel optoelectronics ranging from flexible photovoltaic cells to harmonic generation and electro-optical modulation devices. Rapid progress of the field, particularly in the growth area, is beginning to enable ways to implement 2D crystals into devices with tailored functionalities. For practical device performance, a key challenge is to maximize light-matter interactions in the material, which are inherently weak due to its atomically thin nature. Light management around the 2D layers with the use of plasmonic nanostructures can provide a compelling solution. © 2013 American Chemical Society.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2011.2.2.1-2 | Award Amount: 24.91M | Year: 2012

The goal of this proposal (INMiND) is to carry out collaborative research on molecular mechanisms that link neuroinflammation with neurodegeneration, in order to identify novel biological targets for activated microglia, which may serve for both diagnostic and therapeutic purposes, and to translate this knowledge into the clinic. The general objectives of INMiND are: (i) to identify novel mechanisms of regulation and function of microglia under various conditions (inflammatory stimuli; neurodegenerative and -regenerative model systems); (ii) to identify and implement new targets for activated microglia, which may serve for diagnostic (imaging) and therapeutic purposes; (iii) to design new molecular probes (tracers) for these novel targets and to implement and validate them in in vivo model systems and patients; (iv) to image and quantify modulated microglia activity in patients undergoing immune therapy for cognitive impairment and relate findings to clinical outcome. Within INMiND we bring together a group of excellent scientists with a proven background in efficiently accomplishing common scientific goals (FP6 project DiMI, www.dimi.eu), who belong to highly complementary fields of research (from genome-oriented to imaging scientists and clinicians), and who are dedicated to formulating novel image-guided therapeutic strategies for neuroinflammation-related neurodegenerative diseases. The strength of this proposal is that, across Europe, it will coordinate research and training activities related to neuroinflammation, neurodegeneration/-regeneration and imaging, with special emphasis on translating basic mechanisms into clinical applications that will provide health benefits for our aging population. With its intellectual excellence and its critical mass, the INMiND consortium will play a major role in the European Research Area and will gain European leadership in the creation of new image-guided therapy paradigms in patients with neurodegenerative diseases.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: SSH.2013.1.1-2 | Award Amount: 3.26M | Year: 2013

The SPINTAN project aims at discovering the theoretical and empirical underpinnings of public intangible policies. It widens previous work carried out by Corrado, Hulten and Sichel (2005, 2009) by including the public sector in their analytical framework, in different complementary directions that can be summarized in the following three objectives: (1) to build a public intangible database for a wide set of EU countries, complemented with some large non-EU countries; (2) to analyze the impact of public sector intangibles on innovation, well-being and smart growth (including education, R&D and innovation, and the construction of a digital society); and (3) to pay special attention to the medium/long-term consequences of austerity policies in view of the expected recovery. In order to achieve these goals, the overall strategy of the project will rely upon the following pillars, organized around six work packages. WP 1 concentrates on the methodological discussion of the concept of intangibles in the public sector and the definition of its boundaries. WP 2 will be devoted to the construction of a database for a large set of EU countries and the US, plus three developing countries (China, India, and possibly Brazil), complementary to the one already developed by the INTAN-Invest project. WP 3 will make a detailed analysis of the implications for smart growth and social inclusion of three key aspects of public sector policies: health, education and R&D, with special reference to higher education institutions. WP 4 will investigate the effect of spillovers of public sector intangibles on the business sector, within a country or across countries. WP 5 will address the study of the present and future consequences of the austerity measures taken since 2008. And, finally, WP 6 will bring together the different pieces, offering a synthesis of the main results and emphasizing the main policy implications.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP.2012.1.4-2 | Award Amount: 4.86M | Year: 2013

The Self-Assembled Virus-like Vectors for Stem Cell Phenotyping (SAVVY) project relies on hierarchical, multi-scale assembly of intrinsically dissimilar nanoparticles to develop novel types of multifunctional Raman probes for analysis and phenotyping of heterogeneous stem cell populations. Our project will address a large unmet need, as stem cells have great potential for a broad range of therapeutic and biotechnological applications. Characterization and sorting of heterogeneous stem cell populations has been intrinsically hampered by (1) the lack of specific antibodies, (2) the need for fluorescence markers, (3) the low concentration of stem cells, and (4) low efficiencies/high costs. Our technology will use a fundamentally different approach that (1) does not require antibodies, aptamers, or biomarkers, (2) is fluorescence-label free, and (3) is scalable at acceptable cost. The approach uses intrinsic differences in the composition of cell membranes to distinguish cell populations. These differences will be detected by SERS and analysed through multicomponent analysis. We have combined the necessary expertise to tackle this challenge: Stellacci has developed rippled nanoparticles that specifically interact with and adhere to cell membranes (analogous to cell-penetrating peptides). Lahann has developed bicompartmental Janus polymer particles that have already been surface-modified with rippled particles and integrate specifically into the cell membrane (analogous to viruses). Liz-Marzan has developed highly Raman-active nanoparticles and has demonstrated their selectivity and specificity in SERS experiments. These Raman probes will be loaded into the synthetic viruses to enable membrane fingerprinting. Stevens has developed a bioinformatics platform for fingerprinting of stem cell populations using cluster analysis algorithms. The effort will be joined by two SMEs, ChipShop and OMT, which will develop the necessary microfluidic and Raman detection hardware.
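To illustrate the kind of multicomponent cluster analysis mentioned above, the sketch below applies an unsupervised PCA-plus-k-means workflow to synthetic "SERS spectra" of two hypothetical cell sub-populations that differ in one membrane-related peak position. Peak positions, intensities and noise are invented; this is not the SAVVY pipeline.

```python
# A minimal sketch, assuming synthetic spectra: cluster cells into two groups
# from their spectral fingerprints without any labels or antibodies.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
wavenumbers = np.linspace(600, 1800, 300)   # cm^-1 axis of the synthetic spectra

def spectrum(peak_centre):
    """One noisy synthetic spectrum with a single Gaussian peak."""
    base = np.exp(-((wavenumbers - peak_centre) ** 2) / (2 * 25.0 ** 2))
    return base + 0.05 * rng.standard_normal(wavenumbers.size)

spectra = np.array([spectrum(1002) for _ in range(40)] +   # sub-population A
                   [spectrum(1450) for _ in range(40)])    # sub-population B

scores = PCA(n_components=2).fit_transform(spectra)        # unsupervised projection
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))               # expect two ~40-cell groups
```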


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP.2013.4.0-2 | Award Amount: 5.13M | Year: 2013

The ArtESun project combines the multidisciplinary and complementary competences of top-level European research groups and industries in order to make significant steps towards high-efficiency (>15%), stable and cost-efficient OPV technology. For this purpose, the project objectives are set to make breakthrough advances in the state of the art in terms of (i) the development of innovative, highly efficient OPV materials which can be used to demonstrate the cost-effective non-vacuum production of large-area, arbitrary size and shape OPV modules, (ii) understanding of long-term stable operation and the degradation mechanisms at the material and OPV device level, (iii) the development of roll-to-roll (R2R) additive non-vacuum coating and printing techniques emphasizing efficient materials usage and cost-efficient R2R processing, and (iv) demonstration of high-performance arbitrary size and shape OPV systems in environments relevant to their expected future applications.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENERGY.2011.7.2-1 | Award Amount: 19.44M | Year: 2012

6 Transmission System Operators (Belgium, France, Greece, Norway, Portugal and United Kingdom) and CORESO, a TSO coordination centre, together with 13 RTD performers propose a 4 year R&D project to develop and to validate an open interoperable toolbox which will bring support, by 2015, to future operations of the pan-European electricity transmission network, thus favouring increased coordination/harmonisation of operating procedures among network operators. Under the coordination of RTE, new concepts, methods and tools are developed to define security limits of the pan European system and to quantify the distance between an operating point and its nearest security boundary: this requires building its most likely description and developing a risk based security assessment accounting for its dynamic behaviour. The chain of resulting tools meets 3 overarching functional goals: i) to provide a risk based security assessment accounting for uncertainties around the most likely state, for probabilities of contingencies and for corresponding preventive and corrective actions. ii) to construct more realistic states of any system (taking into account its dynamics) over different time frames (real-time, intraday, day ahead, etc.). iii) to assess system security using time domain simulations (with less approximation than when implementing current standard methods/tools). The prototype tool box is validated according to use cases of increasing complexity: static risk-based security approach at control zone level, dynamic security margins accounting for new power technologies (HVDC, PST, FACTS), use of data coming from off-line security screening rules into on-line security assessment, and finally security maps at pan European level. Dissemination is based on periodic workshops for a permanent user group of network operators invited to use modules to meet their own control zone needs and the ones of present or future coordination centres.
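The core idea of a risk-based security assessment, as opposed to treating every N-1 contingency as equally likely, can be illustrated with a tiny numerical sketch: weight the severity of each post-contingency state by its probability, with and without corrective actions. The contingency names, probabilities, severities and the redispatch figure below are all hypothetical, and this is not the project's toolbox.

```python
# A minimal sketch of a risk-based security index under assumed contingency data.
contingencies = [
    # (name, probability per hour, severity of post-contingency state [MW of overload])
    ("line_A_trip",  1e-4,   0.0),   # harmless: no overload after the trip
    ("line_B_trip",  5e-5, 120.0),   # mild overload, removable by redispatch
    ("gen_C_outage", 1e-5, 600.0),   # heavy overload unless corrective action is applied
]

def risk(events, corrective_relief=0.0):
    """Expected overload [MW/h]; corrective_relief is the MW removed by remedial actions."""
    return sum(p * max(sev - corrective_relief, 0.0) for _, p, sev in events)

print(f"risk without corrective actions: {risk(contingencies):.4f} MW/h")
print(f"risk with 150 MW of redispatch : {risk(contingencies, 150.0):.4f} MW/h")
```

A full implementation would of course derive the severities from time-domain simulations of the system dynamics and propagate forecast uncertainty, which is exactly what the toolbox described above sets out to do.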


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2013.1.3-1 | Award Amount: 15.99M | Year: 2013

HeCaToS aims at developing integrative in silico tools for predicting human liver and heart toxicity. The objective is to develop an integrated modelling framework, by combining advances in computational chemistry and systems toxicology, for modelling toxic perturbations in liver and heart across multiple scales. This framework will include vertical integrations of representations from drug(metabolite)-target interactions, through macromolecules/proteins, to (sub-)cellular functionalities and organ physiologies, and even the human whole-body level. In view of the importance of mitochondrial deregulation and of immunological dysfunction associated with hepatic and cardiac drug-induced injuries, the focus will be on these particular Adverse Outcome Pathways (AOPs). Models will be populated with data from innovative in vitro 3D liver and heart assays challenged with prototypical hepato- or cardiotoxicants; data will be generated by advanced molecular and functional analytical techniques retrieving information on key (sub-)cellular toxic events. For validating the in vitro perturbed AOPs in appropriate human investigations, case studies on patients with liver injuries or cardiomyopathies due to adverse drug effects will be developed, and biopsies will be subjected to similar analyses. Existing ChEMBL and diXa data infrastructures will be advanced for data gathering, storage and integrated statistical analysis. Model performance in toxicity prediction will be assessed by comparing in silico predictions with experimental results across a multitude of read-out parameters, which in turn will suggest additional experiments for further validating predictions. HeCaToS, organized as a private-public partnership, will generate major socioeconomic impact because it will develop better chemical safety tests leading to safer drugs, as well as safer industrial chemicals and cosmetics, thereby improving patient and consumer health and sustaining the EU's industrial competitiveness.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2012-ITN | Award Amount: 4.11M | Year: 2013

The DREAMS Initial Training Network will investigate the problem of modeling, controlling, removing, and synthesizing acoustic reverberation, with the aim of enhancing the quality and intelligibility of audio, music, and speech signals. The proposed research and training program builds upon four disciplines that are equally important in understanding and tackling the (de-)reverberation problem: room acoustics, signal processing, psychoacoustics, and speech and audio processing. The strong commitment of the private sector in the proposed ITN consortium, consisting of 4 academic and 8 industrial partners, illustrates the timeliness and importance of the (de-)reverberation problem in a wide variety of applications. However, delivering application-driven solutions is not the only objective of the DREAMS research program. Indeed, the aim is also to take a significant step forward and make fundamental scientific contributions in each of the four disciplines mentioned earlier. To this end, the envisaged ITN will host 12 early stage researchers and 4 experienced researchers, each performing an individual research project around one of four themes that reflect the most challenging open problems in the area of (de-)reverberation. The DREAMS ITN will be implemented so as to maximize the international and intersectoral experience of the research fellows, by defining relevant secondments in academia and industry, both in the host country and abroad. Moreover, experienced researchers will be expected to take on a supervisory role in coordinating one of the four research themes, with the aim of developing solid skills in leadership and research management. Finally, a training program of extremely high quality is proposed, with local as well as network-wide training, which relies on the scientific excellence of the involved partners and of invited external researchers, and which heavily depends on the input of the private sector.
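The "synthesizing reverberation" part of the problem can be illustrated with the simplest statistical model of a room impulse response: white noise with an exponential decay, convolved with a dry signal. The sketch below is an illustration only (sample rate, decay time and the test signal are arbitrary assumptions), not a DREAMS deliverable.

```python
# A minimal sketch, assuming an exponentially decaying noise model of the room
# impulse response: make a dry click reverberant by convolution.
import numpy as np

fs = 16000                               # sample rate [Hz] (assumed)
t60 = 0.5                                # assumed reverberation time T60 [s]
rng = np.random.default_rng(3)

# Room impulse response model: white noise whose amplitude falls by 60 dB over T60.
t = np.arange(int(fs * t60)) / fs
rir = rng.standard_normal(t.size) * 10 ** (-3 * t / t60)   # 10^(-3 t/T60) = -60 dB at T60

dry = np.zeros(fs)                       # 1 s of silence containing a single click
dry[1000] = 1.0
wet = np.convolve(dry, rir)              # reverberant ("wet") signal

drr = 10 * np.log10(wet[1000] ** 2 / np.sum(wet[1001:] ** 2))
print(f"direct-to-reverberant energy ratio: {drr:.1f} dB")
```

Dereverberation is essentially the inverse task: estimating or suppressing the late part of such an impulse response from the wet signal alone, which is what makes the problem hard.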


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENERGY.2012.8.8.2 | Award Amount: 26.01M | Year: 2013

The CELSIUS City Consortium is going to deploy 12 new technically and economically innovative demonstrators. Another up to 20 state-of-the-art demonstrators (already in operation) will prove the CELSIUS City Concept, covering the full FP7 8.8.2 requirements. CELSIUS has a clear strategy and a pro-active approach to market outreach, which will strive to commit 50 new cities to the CELSIUS Roadmap by the end of 2016. When fully implemented, this will lead to a 20-45 TWh reduction in the use of primary energy p.a. CELSIUS City is well positioned to deliver those targets due to a strong partnership of major front-running European cities and their respective utilities, and further outstanding innovative organizations, with track records both in creating technically and economically innovative demonstrators and in understanding and overcoming the barriers to large-scale deployment (e.g. Imperial College (UK), SP (S), TU Delft (NL), Cologne University of Applied Sciences (D), D'Appalonia (IT), LSE (UK)). CELSIUS has eight work packages targeting the successful deployment of the 13 new demonstrators (WP3), supported by a collaborative approach to harvest beyond-state-of-the-art insights from Tech & Innovation (WP5) and Stakeholder Acceptance (WP6). The local demonstrator perspective is enriched by Integration & Roadmap (WP2). The final goal of Communication & Market Outreach (WP8) builds on developing CELSIUS Market Uptake (WP7). A powerful project management office (WP1), seconded by rigorous monitoring (WP4), coordinates all work packages, assuring, over the lifetime of the CELSIUS Consortium, both impactful deployment and sustainable market outreach. The total cost of the 13 new CELSIUS demonstrators is 69m EUR, of which the cities themselves will provide 55m EUR. The requested EU funding enables these activities, laying the foundation for the successful large-scale deployment of the CELSIUS City Concept across Europe and beyond 2020.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2013.6.3-1 | Award Amount: 4.50M | Year: 2014

An estimated one billion tyres are discarded each year. Post-consumer tyre arisings for EU countries (2010) are 3.4M tonnes per year. At the moment nearly 50% of all recycled tyres/components still end up as fuel, in low-grade applications or in landfill. All tyre constituents (rubber, high-strength steel cord and wire, high-strength textile reinforcement) are high-quality materials and deserve to be reused for their relevant properties. Construction is the highest user of materials, with concrete being the most popular structural material. Concrete is inherently brittle in compression (unless suitably confined) and weak in tension and, hence, it is normally reinforced with steel bars or fibres. The authors believe that highly confined rubberised concrete can lead to highly deformable concrete elements and structures, and that tyre steel and textile fibres can be used as concrete reinforcement to control shrinkage cracking. Hence, the aim of this proposal is to develop innovative solutions to reuse all tyre components in high-value innovative concrete applications with reduced environmental impact. To achieve this aim, the proposed project will have to overcome scientific and technological challenges in: (i) development of novel confined rubberised concrete materials and reinforcement; (ii) development of high-deformability RC elements suitable for integral bridge elements and base isolation systems for vibration and seismic applications; (iii) development of concrete mixes using recycled steel fibres for use in various applications such as slabs on grade, suspended slabs, precast concrete elements and pumpable self-compacting concrete or screed; (iv) development of concrete mixes using recycled tyre polymer fibres for crack control; (v) development of novel concrete applications using combinations of the different tyre by-products; (vi) undertaking demonstration projects using the developed materials/applications; and (vii) development and implementation of standardised LCA/LCCA protocols.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EINFRA-1-2014 | Award Amount: 8.02M | Year: 2015

In the coming decade a significant number of the 500 million European (EU/EEA) citizens will have their genome determined routinely. This will be complemented with much cheaper (currently ~20 Euro per measurement) acquisition of the metabolome of biofluids (e.g. urine, saliva, blood plasma), which will link the genotype with metabolome data that captures the highly dynamic phenome and exposome of patients. Having such low-cost solutions will enable, for the first time, the development of a truly personalised and evidence-based medicine founded on hard scientific measurements. The exposome includes the metabolic information resulting from all the external influences on the human organism, such as age, behavioural factors like exercise and nutrition, or other environmental factors. Considering that the amount of data generated by molecular phenotyping exceeds the data volume of personal genomes by at least an order of magnitude, the collection of such information will pose dramatic demands on biomedical data management and compute capabilities in Europe. For example, a single typical National Phenome Centre, managing only around 100,000 human samples per year, can generate more than 2 Petabytes of data during this period alone. A scale-up to sizeable portions of the European population over time will require data analysis services capable of working on exabyte-scale amounts of biomedical phenotyping data, for which no viable solution exists at the moment. The PhenoMeNal project will develop and deploy an integrated, secure, permanent, on-demand, service-driven, privacy-compliant and sustainable e-infrastructure for the processing, analysis and information-mining of the massive amount of medical molecular phenotyping and genotyping data that will be generated by metabolomics applications now entering research and the clinic.
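A back-of-the-envelope check of the data volumes quoted above, assuming the 2 PB figure is spread evenly over the 100,000 yearly samples and using decimal units, is shown below; the extrapolation to a 50-million-person cohort is an illustrative assumption, not a project figure.

```latex
% Illustrative arithmetic only:
\[
  \frac{2\ \mathrm{PB}}{100{,}000\ \mathrm{samples}}
  \;=\; \frac{2\times 10^{6}\ \mathrm{GB}}{10^{5}}
  \;\approx\; 20\ \mathrm{GB\ per\ sample},
  \qquad
  5\times 10^{7}\ \mathrm{people} \times 20\ \mathrm{GB}
  \;\approx\; 10^{9}\ \mathrm{GB} \;=\; 1\ \mathrm{EB}
\]
```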


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-IF-EF-CAR | Phase: MSCA-IF-2014-EF | Award Amount: 195.45K | Year: 2015

In line with the EU 2020 Flagship Initiative on a Digital Agenda for Europe and the upcoming EU Cybersecurity Strategy, the goal of the LV-Pri20 project is to aid our ICT-driven lives by safeguarding the human right of privacy in the digital society. Concretely, the main focus of LV-Pri20 is the formal and automatic analysis of privacy preservation in today's ICT. LV-Pri20 will focus on the prevalent wireless media, e.g., RF-identification protocols, remote car-unlocking, wearables, and machine-to-machine communication in the Internet of Things (IoT)/ubiquitous computing, but it will not neglect wired environments (given their common cloud connection). LV-Pri20 will assess and automatically analyse privacy-sensitive applications, in their standalone execution as well as in the more involved setting of multiple, concurrent executions thereof. This will be done systematically and taxonomically: distinct classes of applications (e.g., identification protocols using Electronic Product Codes vs. the Open Smart Grid Protocol) and different privacy properties (e.g., data non-leakage vs. data-user unlinkability) will be respectively analysed via tailored, well-defined techniques. To specify privacy, LV-Pri20 will design/refine different non-classical logic languages which have inherent semantics for privacy-like expression (e.g., strategy logics). For these, we will then develop new model checking algorithms. All will be incorporated in automatic verification software, which has already proved efficient in analysing highly distributed systems, in line with, e.g., the IoT applications envisaged herein. LV-Pri20 will have a multi-disciplinary, collaborative nature, with an academic core and an industrial side. After an initial privacy scrutiny, new/patched RFID-based, privacy-preserving communication protocols will be (re-)designed and implemented. For these, we will devise mathematical proofs for one-session security, and run automatic analysis of their multi-session executions.


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Research Grant | Award Amount: 169.32K | Year: 2015

Advances in fit-for-use manufacturing of biopharmaceutical drug delivery and pharmaceutical systems are now required to fit Quality by Design (QbD) models. These current regulations require excellence to be built into the preparation of emerging products (both material and process), thereby leading to product robustness and quality. In addition, industrial needs (economical and reproducible quality enhancement) are driving manufacturing towards continuous processes over batch-type processes, which also rely on QbD (for integrity and quality). EHDA technology is a robust process that has been utilised in various formats (e.g. electrospinning, electrospraying, bubbling and even 3D printing) and is favourable due to its applicability to the development of stable nanomedicines and biopharmaceuticals; the emergence of this technology is clearly evident in the UK and on the global scale. Attempts at scaling up (to a suitable pharmaceutical scale) and operating in tandem with continuous processes (including controlled manufacturing) have been very limited. There also remains a huge void in the adaptation of sensible QbD (multi-variate) for the current methods developed and also those required by industry. While lab-scale research continues with the ongoing development of such processes (e.g. nanomedicines, smart and controlled delivery), the transition to industry or the clinic will have to meet these regulations (and scales) for there to be a real impact, which is now also an important aspect of grass-roots research in the UK. The EHDA network brings together specialists from academia and industry to advance this technology through several means: firstly, initiating developments towards a real, viable scale for pharmaceutical production; secondly, incorporating developments in lean manufacturing and legislation (e.g. continuous manufacturing, online diagnostics, QbD and adaptable scale); and thirdly, marrying optimised lean technologies with novel and emerging macromolecular therapies and actives. The network has a wide range of activities and initiatives which will lead to significant developments (and collaborations) in an area of increasing global interest (EHDA processes), currently viable only at lab scale. This network will be the first of its kind and will serve as the central and pioneering hub in this remit.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2015-ETN | Award Amount: 3.89M | Year: 2016

Nanomedicine (NM) is regarded as one of the most promising applications of nanotechnology, as it would allow the development of tailored therapies, with a high level of selectivity and efficacy. Although much research has been performed over the past decades, translation from academia to commercial application remains disappointingly low. Reasons that explain the current moderate success of NM are: (1) promising preclinical results are often poorly predictive for clinical safety and effectiveness, (2) the efficient, scalable and reproducible GMP production of nanocarriers has proven to be challenging and (3) regulatory frameworks are not yet fully equipped to efficiently facilitate the introduction of novel nanomedicines. These obstacles are often encountered since the developmental process from carrier design to clinical assessment is performed by a range of scientists from different backgrounds who have difficulty interacting and communicating with each other to clearly understand the necessary design criteria and the scope and limitations of NM. NANOMED brings together all the necessary expertise to oversee the entire development trajectory required for NM. This is achieved by the combined effort of 7 beneficiaries from academia and industry and 5 non-academic partner organisations, which are all thoroughly rooted in nanosciences and pharmaceutical sciences. Our objective is to develop scalable and highly controllable design and synthesis methods for the most promising nanomedicine types in a preclinical setting. NANOMED will train the next generation of NM scientists by offering an extensive joint training programme to 15 incoming ESRs. It focuses on promoting scientific excellence and exploits the specific research and commercial expertise and infrastructure of the NANOMED network as a whole. The exposure to all elements of NM design enables NANOMED to translate expertise from all disciplines to the ESRs, to educate the future leading scientists in the NM field.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-RISE | Phase: MSCA-RISE-2016 | Award Amount: 877.50K | Year: 2017

Additive manufacturing (AM) technologies and overall numerical fabrication methods have been recognized by stakeholders as the next industrial revolution, bringing customers' needs and suppliers' offers closer together. It cannot be dissociated from the present trends in increased virtualization, cloud approaches and collaborative developments (i.e. sharing of resources). AM is likely to be one good option paving the way to Europe's re-industrialization and increased competitiveness. AMITIE will reinforce European capacities in the AM field applied to ceramic-based products. Through its extensive programme of transnational and intersectoral secondments, AMITIE will promote fast technology transfer and enable training of AM experts from upstream research down to more technical issues. This will provide Europe with specialists with generic skills and great potential for knowledge-based careers, considering the present growing needs of AM industry development. To do that, AMITIE brings together leading academic and industrial European players in the fields of materials science/processes, materials characterization, AM technologies and associated numerical simulations, applied to the fabrication of functional and/or structural ceramic-based materials for energy/transport and ICT applications, as well as biomaterials. Those players will develop a new concept of smart factory for the future based on 3D AM technologies (i.e. powder bed methods, robocasting, inkjet printing, stereolithography, etc.) and their possible hybridization together or with subtractive technologies (e.g. laser machining). It will allow for the production of parts whose dimensions, shapes, functionality and assembly strategies may be tailored to address today's key technological issues of the fabrication of high added-value objects following a fully combinatorial route. This is expected to lead to a new paradigm for the production of multiscale, multimaterial and multifunctional components and systems.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 4.20M | Year: 2013

RENESENG aims to prepare a new generation of highly qualified researchers in biorefinery and biobased chemicals systems engineering sciences in Europe, expected to have a high impact on the design of newly establishing industrial complexes in biorefining and more generally in eco-industries. The effort requires bringing together academic and industrial teams with particularly interdisciplinary and high-quality expertise, embracing disciplines in agricultural sciences, chemistry and chemical engineering, biology and biotechnology, computer science, process engineering, logistics and business economics, as well as social sciences, with an emphasis on life cycle analysis skills. The principal scientific challenge of the network is to develop a program of inter-disciplinary research from expert groups with dedicated interests in biorenewables, using a model-assisted systems approach as an integrating aspect and further capitalizing on its potential and role to address complex and large problems. The proposal brings state-of-the-art systems technologies, mobilizing a critical mass in Europe that is already particularly active in this area but needs to coordinate efforts and reduce fragmentation of knowledge. The aim is to develop and validate modelling, synthesis, integration and optimization technology addressing: 1) lignin-based and cellulosic processes; 2) water-based paths to biomass production; 3) waste treatment paths; and 4) the combination of biorenewables processing with the utilization of other renewables. In parallel, RENESENG has developed a program of training activities including the development of communication, business and social skills, visits and social events, preparing a new profile of researchers able to transmit their knowledge in their next networking teams. RENESENG guarantees high-quality career perspectives for all, through the active participation of 6 industrial partners, the creation of spin-offs and the sustainable implementation of a multicentre PhD training program.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 3.85M | Year: 2013

ECCO-MATE aims to create a research and training platform for the development and implementation of novel fuel mixture preparation, injection profiling, air management and staged/low-temperature combustion technologies in both marine and light-duty automotive diesel engines. The marine (slow-speed, large-sized, two-stroke engines) and land-transport (high-speed, small-to-medium-sized, four-stroke engines) sectors share essentially the same strategic challenges, namely the implementation of energy-efficient and fuel-flexible combustion technologies, in order to improve efficiency and meet stringent emission standards. However, there is little established training and academic communication between the two sectors, despite the common problems relating to fuel injection, ignition and combustion methodologies and the potential of new, more environmentally friendly fuels. ECCO-MATE bridges this gap by creating a platform for research output exchange between the two sectors on diesel engine combustion, coupling state-of-the-art flow physics and combustion chemistry with CFD tools and advanced optical diagnostics. The consortium comprises 16 key partners - 6 universities, 5 major key stakeholders from the marine and automotive engine industries and 5 associate partners - from 7 EU countries and Japan. The consortium possesses multi-disciplinary expertise, strong interests and tradition in both sectors and the necessary critical mass to achieve the research and ensuing training activities that highlight synergies and complementarities and provide solutions to the addressed common problems of the two sectors. The comprehensive training program (academic and professional training, focussed dissemination activities, trans-national and trans-sectoral mobility) exploits the multi-disciplinarity of the consortium, creating high-level skills for the participating researchers and ensuring continuation of the research activities after project completion.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SC5-06-2016-2017 | Award Amount: 5.88M | Year: 2016

EUCalc replies to topic a) Managing technology transition. The EUCalc project will deliver a much-needed comprehensive framework for research, business and decision making which enables an appraisal of synergies and trade-offs of feasible decarbonisation pathways on the national scale of Europe and its member countries plus Switzerland. The novel and pragmatic modelling approach is rooted between pure complex energy system and emissions models and integrated impact assessment tools, introduces an intermediate level of complexity and a multi-sector approach, and is developed in a co-design process with scientific and societal actors. EUCalc explores decisions made in different sectors, like power generation, transport, industry, agriculture, energy usage and lifestyles, in terms of climatological, societal and economic consequences. For politicians at European and member state level, stakeholders and innovators, EUCalc will therefore provide a Transition Pathways Explorer, which can be used as a much more concrete planning tool for the needed technological and societal challenges, associated inertia and lock-in effects. EUCalc will enable EU sustainability challenges to be addressed in a pragmatic way without compromising on scientific rigour. It is meant to become a widely used democratic tool for policy and decision making. It will close - based on sound model components - a gap between actual climate-energy-system models and the increasing demands of decision-makers for information at short notice. This will be supported by involving an extended number of decision-makers from policy and business, as well as other stakeholders, through expert consultations and the co-design of a Transition Pathways Explorer, a My Europe 2050 education tool and a Massive Open Online Course.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-CSA | Phase: Fission-2013-2.1.1 | Award Amount: 10.28M | Year: 2013

Preparing NUGENIA for HORIZON 2020. The objective of the NUGENIA+ project is to support the NUGENIA Association in its role to coordinate and integrate European research on the safety of Gen II and III nuclear installations, in order to better ensure their safe long-term operation, integrating private and public efforts, and initiating international collaboration that will create added value in its activity fields. The project consists of two parts, the first part being a Coordination and Support Action and the second part a Collaborative Project. The aim of the first part, the Coordination and Support Action, is to establish an efficient, transparent and high-quality management structure to carry out the planning and management of R&D, including project calls, proposal evaluation, project follow-up, and dissemination and valorisation of R&D results in the area of safety of existing Gen II and future Gen III nuclear installations. The preparatory work will encompass governance, organizational, legal and financial work, as well as the establishment of annual work plans, with the aim to structure public-public and/or private-public joint programming, enabling NUGENIA to develop into the integrator of research in this field in Europe. The management structure will build on the existing organisation of the NUGENIA Association, currently grouping over 70 nuclear organisations from research and industry (utilities, vendors and small and medium enterprises) active in R&D. In the second part, the Collaborative Project, one thematic call for research proposals will be organized among the technical areas of plant safety and risk assessment, severe accident prevention and management, core and reactor performance, integrity assessment of systems, structures and components, innovative Generation III design, and harmonisation of procedures and methods. The call will take place one year after the start of the project. The call will implement the priorities recognised in the NUGENIA Roadmap, in line with the Sustainable Nuclear Energy Technology Platform (SNETP) and International Atomic Energy Agency (IAEA) strategies. The research call which is going to be organised within the project is open to all eligible organisations. The NUGENIA+ project will benefit from the experience of the NUGENIA Association member organisations in managing national research programmes and from the track record of the NUGENIA project portfolio.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: NMBP-10-2016 | Award Amount: 6.09M | Year: 2017

Cardiovascular disease (CD) claims 17.1 million lives worldwide each year, accounting for an estimated 31% of all deaths globally and an EU cost of 139 billion euros. Up to 40% of these deaths occur among the elderly. In spite of all medical efforts, 5-year mortality has been reduced significantly less than that of malignant diseases. This highlights the urgent need to overcome the difficulties associated with present pharmacological therapies (i.e. drug instability and unspecific targeting) by developing new ground-breaking therapeutic strategies that go far beyond any current regimens. New approaches for safe, efficient and heart-specific delivery of therapeutics are strongly required. CUPIDO is envisioned to meet these critical needs by providing an unconventional and effective strategy based on nanoparticle-assisted delivery of clinically available and novel therapeutics to the diseased heart. In particular, CUPIDO will develop innovative bioinspired hybrid nanoparticles formulated for the delivery of biologicals, which are i) biocompatible and biodegradable, ii) designed for crossing biological barriers, and iii) guidable to the heart. A combination of multidisciplinary manufacturing and validation approaches will be employed, bringing the envisioned product beyond the currently available clinical and day-to-day management of CD individuals. Scale-up production and respect of medical regulatory requirements will allow CUPIDO to deliver a final product for future late pre-clinical and clinical studies. Altogether, CUPIDO will foster the translation of nanomedical applications toward the cardiac field, which, although still in its infancy, offers great potential to overcome the limitations associated with current pharmacological treatments.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: LCE-24-2016 | Award Amount: 3.21M | Year: 2016

ROLINCAP will search, identify and test novel phase-change solvents, including aqueous and non-aqueous options, as well as phase-change packed-bed and Rotating Packed Bed (RPB) processes for post-combustion CO2 capture. These are high-potential technologies, still in their infancy, with initial evidence pointing to regeneration energy requirements below 2.0 GJ/ton CO2 and a considerable reduction of equipment size, several times smaller than conventional processes. These goals will be approached through a holistic decision-making framework consisting of methods for modelling and design that have the potential for real breakthroughs in CO2 capture research. The tools proposed in ROLINCAP will cover a vast space of solvent and process options, going far beyond the capabilities of existing simulators. ROLINCAP follows a radically new path by proposing one predictive modelling framework, in the form of the SAFT-γ equation of state, for both physical and chemical equilibrium, for a wide range of phase behaviours and of molecular structures. The envisaged thermodynamic model will be used in optimization-based Computer-Aided Molecular Design of phase-change solvents in order to identify options beyond the very few previously identified phase-change solvents. Advanced process design approaches will be used for the development of highly intensified Rotating Packed Bed processes. Phase-change solvents will be considered with respect to the economic and operability characteristics of their RPB processes. The sustainability of both the new solvents and the packed-bed and RPB processes will be investigated through holistic Life Cycle Assessment analysis and Safety, Health and Environmental hazard assessment. Selected phase-change solvents, new RPB column concepts and packing materials will be tested at TRL 4 and 5 pilot plants. Software in the form of a new SAFT-γ equation of state will be tested at TRL 5 in the gPROMS process simulator.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2012.2.3.2-2 | Award Amount: 7.80M | Year: 2013

Persons with HIV on combination antiretroviral therapy (cART) are at increased risk of the premature development of age-associated non-communicable comorbidities (AANCC), including cardiovascular, chronic kidney, liver and pulmonary disease, diabetes mellitus, osteoporosis, non-AIDS-associated malignancies, and neurocognitive impairment. It has therefore been hypothesised that such individuals, despite effective cART, may be prone to accelerated ageing. The underlying pathogenesis is likely to be multifactorial and to include sustained immune activation, both systemically and within the central nervous system. By building on an established infrastructure for conducting longitudinal HIV cohort studies in Amsterdam and London, we will provide a detailed, prospective evaluation of AANCC among HIV-infected patients suppressed on cART and appropriately chosen, comparable non-infected controls. In this way, we will provide a robust estimate of the effect of treated HIV infection on the prevalence, incidence and age of onset of AANCC, thus clearly establishing a link between HIV and AANCC. Through the Human Immune System (HIS) mouse model, experimental studies will permit us to differentiate the effects of HIV and cART on metabolic outcomes when applied under controlled conditions, thereby further elucidating the causative nature of the link between HIV and AANCC. To further clarify potential pathogenic mechanisms underlying this causative link, including the possible induction of an inflammation-associated accelerated-ageing phenotype, biomarkers which reflect each of these mechanisms will be investigated in biomaterial obtained from HIS mice and humans, and subsequently validated in patients with HIV on cART. The successful execution of the experimental and clinical research outlined in this proposal will be ensured through a strong interdisciplinary collaboration between clinical, basic and translational scientists bridging the fields of HIV, AANCC and ageing.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.1.5 | Award Amount: 4.67M | Year: 2013

Confidential and Compliant Clouds (Coco Cloud) aims to allow cloud users to securely and privately share their data in the cloud. This will increase users' trust in cloud services and thus their widespread adoption, with consequent benefits for users and for the digital economy in general. The outsourced nature of the cloud, and the inherent loss of control that goes along with it, means that sensitive data must be carefully controlled to ensure it is always protected (in the most appropriate way for a given situation). Protecting data (including personal information) is essential to citizens, governments and organizations across all sectors, including healthcare and banking. Furthermore, it is only by providing assurances on data protection and data usage control that we can facilitate data sharing between individuals and organisations, or between organisations, to create new ventures and novel means of leveraging the value of data. We envision control of disseminated data based on mutually agreed data sharing agreements that are enforced uniformly and end-to-end. These agreements may reflect legal, contractual or user-defined preferences, which may conflict, so an appropriate balance and model for their enforcement must be found. The project aims to create an efficient and flexible framework for secure data management from the client to the cloud, and vice versa. We consider in particular three dimensions of this goal: (i) to facilitate the writing, understanding, analysis, management, enforcement and dissolution of data sharing agreements, going from high-level descriptions (close to natural language) to system-enforceable data usage policies; (ii) to consider the most appropriate enforcement mechanisms for data usage policies, depending on the underlying infrastructure and context; and (iii) to address key challenges for legally compliant data sharing in the cloud. By taking a compliance-by-design approach, the project places an early emphasis on understanding and incorporating legal and regulatory requirements into the data sharing agreements. Coco Cloud will contribute to fulfilling the pervasive need for data usage protection in cloud services that arises from different stakeholders, including business organizations and citizens, and to overcoming the limitations of currently available technology offerings. The project consortium combines strong industry players and academic institutions which will deliver high-quality research and development; it also includes two end-users of the technological solutions as well as a law firm able to bring significant expertise in legal practice, including for the cloud. The project outcome will be evaluated via three pilot products and one industrial test bed.
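
As a rough illustration of what a machine-enforceable data sharing agreement can look like once translated from its high-level description, the sketch below encodes a small usage policy (allowed purposes, allowed regions, expiry) and a check applied before each access. The field names and rules are hypothetical and only show the general shape of policy-based usage control; they are not the Coco Cloud agreement language or enforcement engine.

```python
# Illustrative only: a toy data-usage policy and enforcement check,
# not the Coco Cloud data sharing agreement framework.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class UsagePolicy:
    allowed_purposes: frozenset   # e.g. {"treatment", "billing"}
    allowed_regions: frozenset    # e.g. {"EU"}
    expires: datetime             # access forbidden after this instant

@dataclass(frozen=True)
class AccessRequest:
    purpose: str
    region: str
    when: datetime

def is_permitted(policy: UsagePolicy, request: AccessRequest) -> bool:
    """Evaluate a single access request against the agreed policy."""
    return (
        request.purpose in policy.allowed_purposes
        and request.region in policy.allowed_regions
        and request.when <= policy.expires
    )

policy = UsagePolicy(
    allowed_purposes=frozenset({"treatment"}),
    allowed_regions=frozenset({"EU"}),
    expires=datetime(2026, 1, 1, tzinfo=timezone.utc),
)
request = AccessRequest("treatment", "EU", datetime.now(timezone.utc))
print("access permitted:", is_permitted(policy, request))
```

A production system would additionally log decisions for audit and resolve conflicts between legal, contractual and user-defined rules, which is where the balancing discussed above comes in.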


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: Fission-2012-2.1.1 | Award Amount: 9.33M | Year: 2013

After the 2011 disaster in Japan, improvement of nuclear safety appears even more clearly as a paramount condition for further development of the nuclear industry. The NURESAFE project addresses engineering aspects of nuclear safety, especially those relating to design basis accidents (DBA). Although the Japanese event was a severe accident, in a process of defense-in-depth, prevention and control of DBA is obviously one of the priorities in the process of safety improvement. In this respect, the best simulation software is needed to justify the design of reactor protection systems and the measures taken to prevent and control accidents. The NURESAFE project addresses the safety of light water reactors, which will represent the major part of fleets in the world throughout the 21st century. The first objective of NURESAFE is to deliver to European stakeholders a reliable software capacity usable for safety analysis needs and to develop a high level of expertise in the proper use of the most recent simulation tools. Nuclear reactor simulation tools are of course already widely used for this purpose, but more accurate and predictive software including uncertainty assessment must make it possible to quantify the margins to feared phenomena occurring during an accident, and must be able to model innovative and more complex design features. This software capacity will be based on the NURESIM simulation platform created during the FP6 NURESIM project and developed during the FP7 NURISP project, which achieved its goal by making available a state-of-the-art integrated set of software. The objectives under the work programme are to develop practical applications usable for safety analysis, operation and design, and to expand the use of the NURESIM platform. The NURESAFE project therefore concentrates its activities on selected safety-relevant situations. The main outcome of NURESAFE will be the delivery of multiphysics and fully integrated applications.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2013.2.2.1-1 | Award Amount: 39.56M | Year: 2013

Traumatic Brain Injury (TBI) is a major cause of death and disability, leading to great personal suffering for victims and their relatives, as well as huge direct and indirect costs to society. Strong ethical, medical, social and health economic reasons therefore exist for improving treatment. The CENTER-TBI project will collect a prospective, contemporary, highly granular, observational dataset of 5400 patients, which will be used for better characterization of TBI and for Comparative Effectiveness Research (CER). The generalisability of our results will be reinforced by a contemporaneous registry-level data collection in 15,000-25,000 patients. Our conceptual approach is to exploit the heterogeneity in biology, care, and outcome of TBI, to discover novel pathophysiology, refine disease characterization, and identify effective clinical interventions. Key elements are the use of emerging technologies (biomarkers, genomics and advanced MR imaging) in large numbers of patients, across the entire course of TBI (from injury to late outcome) and across all severities of injury (mild to severe). Improved characterization with these tools will aid Precision Medicine, a concept recently advocated by the US National Academy of Sciences, facilitating targeted management for individual patients. Our consortium includes leading experts and will bring outstanding biostatistical and neuroinformatics expertise to the project. Collaborations with external partners, other FP7 consortia, and international links within InTBIR, will greatly augment scientific resources and broaden the global scope of our research. We anticipate that the project could revolutionize our view of TBI, leading to more effective and efficient therapy, thus improving outcome and reducing costs. These outcomes reflect the goals of CER to assist consumers, clinicians, health care purchasers, and policy makers to make informed decisions, and will improve healthcare at both individual and population levels.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-23-2014 | Award Amount: 6.96M | Year: 2015

Children's health affects the future of Europe: children are citizens, future workers, parents and carers. Children are dependent on society to provide effective health services (UN Convention on the Rights of the Child). Models of child primary health care vary widely across Europe, based on two broad alternatives (primary care paediatricians or generic family doctors) and a variety of models of school health and adolescent direct-access services. There is little research to show which model(s) are best, implying that some are inefficient or ineffective, with sub-optimal outcomes. MOCHA will draw on networks, earlier child health projects and local agents to model and evaluate child primary care in all 30 EU/EEA countries. Scientific partners from 11 European countries, plus partners from Australia and the USA, encompassing medicine, nursing, economics, informatics, sociology and policy management, will: (1) categorise the models, and the school health and adolescent services; (2) develop innovative measures of quality, outcome, cost and workforce for each, and apply them using policy documents, routine statistics and available electronic data sets; (3) assess effects on equality, and on continuity of care with secondary care; (4) systematically obtain stakeholder views; and (5) indicate optimal future patterns of electronic records and big data to optimise operation of the model(s). The results will demonstrate the optimal model(s) of children's primary care with a prevention and wellness focus, with an analysis of factors (including cultural ones) which might facilitate adoption, and indications for policy makers of both the health and economic gains possible. The project will have a strong dissemination programme throughout to ensure dialogue with the public, professionals, policy makers and politicians. The project will take 42 months (36 of scientific work plus start-up and close), and deliver major awareness of and potential benefit for European children's health and a healthy society.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SFS-03a-2014 | Award Amount: 6.64M | Year: 2015

EMPHASIS is a participatory research project addressing native and alien pest threats (insect pests, pathogens, weeds) for a range of both natural ecosystems and farming systems (field crops, protected crops, forestry, orchards and amenity plants). The overall goal is to ensure a European food security system and the protection of biodiversity and ecosystem services, while developing integrated response measures (practical solutions) to predict, prevent and protect agriculture and forestry systems from native and alien pest threats. The specific objectives are the following: 1. Predict, prioritise and plan: pest management challenges and opportunities will be evaluated according to stakeholder-focused criteria and through pathway analysis; 2. Prevent: practical solutions for surveillance in different pathways to enhance preparedness will be provided to end-users, and monitoring tools following outbreaks and eradication will be developed; 3. Protect: practical solutions for managing native and alien pests in agriculture, horticulture and forestry will be developed, their technical and economic feasibility will be demonstrated and their market uptake will be enhanced; 4. Promote: a mutual learning process with end-users will be developed, and the solutions identified by the project will be promoted through training and dissemination. The project is in line with the EU policy framework (Plant Health Dir. 2000/29/EC, EU Biodiversity Strategy to 2020, Dir. 2009/128/EC on sustainable use of pesticides, Roadmap to a Resource Efficient Europe) and its future developments (Reg. on protective measures against pests of plants, Reg. on Invasive Alien Species). The project is not focused on a single management system; rather, the plant/pest ecosystems dealt with are treated with a multi-method approach to design a true IPM methodology, developed for key systems with portability to other similar systems, thereby having a large impact.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: MG-1.1-2014 | Award Amount: 5.71M | Year: 2015

This project will focus on the development of technologies and methodologies which have the potential to save costs and time across the whole life cycle of the aircraft (design, production, maintenance, overhaul, repair and retrofit), including certification aspects. It will also target the integration of additional functions or materials in structural components of the aircraft and the increased use of automation. The first proposed step is the introduction of the γ-TiAl alloy, a well-known and promising advanced material for aerospace applications, together with a revolutionary manufacturing technology. Its high specific stiffness and strength relative to its low weight potentially lead to large weight savings (50%), and therefore lower mechanical loads on thermomechanically stressed parts, compared with the common Ni-based superalloys. The integration of the new material and the new manufacturing technology will positively impact several aspects of the manufacturing and maintenance chain, from design through production to repair. The aim of this project is twofold. On one side, the work will focus on the development and integration at industrial scale of an IPR-protected gas atomization process for producing TiAl powders, whose properties must be highly stable from batch to batch. Thanks to the stability of the chemical and granulometric properties of the powders, the application of the Rapid Manufacturing technique to the production of TiAl components will become economically affordable; while this technique is by now well known, its main drawback resides in the poor quality of the starting powders. On the other side, the project will address the other main obstacle to the wide industrial application of TiAl components: the integrated optimisation of all the machining steps, meaning the setting up of machine tool characteristics and parameters, cutting tool geometry, substrate and coating materials, and advanced lubrication technologies.


Grant
Agency: European Commission | Branch: FP7 | Program: ERC-SyG | Phase: ERC-2012-SyG | Award Amount: 14.97M | Year: 2013

Few advances in neuroscience could have as much impact as a precise global description of human brain connectivity and its variability. Understanding this connectome in detail will provide insights into fundamental neural processes and intractable neuropsychiatric diseases. The connectome can be studied at millimetre scale in humans by neuroimaging, particularly diffusion and functional connectivity Magnetic Resonance Imaging. By linking imaging data to genetic, cognitive and environmental information it will be possible to answer previously unsolvable questions concerning normal mental functioning and intractable neuropsychiatric diseases. Current human connectome research relates almost exclusively to the mature brain. However, mental capacity and neurodevelopmental diseases are created during early development. Advances in fetal and neonatal Magnetic Resonance Imaging now allow us to undertake The Developing Human Connectome Project (dHCP), which will make major scientific progress by creating the first 4-dimensional connectome of early life and undertaking pioneering studies of normal and abnormal development. The dHCP will deliver: the first dynamic map of human brain connectivity from 20 to 44 weeks post-conceptional age, linked to imaging, clinical, behavioural and genetic information; comparative maps of the cerebral connectivity associated with neurodevelopmental abnormality, studying well-characterized patients with either the adverse environmental influence of preterm delivery or genetically-characterised Autistic Spectrum Disorder; and novel imaging and analysis methods in an open-source, outward-facing, expandable informatics environment that will provide a scalable resource for the research community and advances in clinical medicine.


Healthspan (the life period when one is generally healthy and free from serious disease) depends on nature (genetic make-up) and nurture (environmental influences, from the earliest stages of development throughout life). Genetic studies increasingly reveal mutations and polymorphisms that may affect healthspan. Similarly, claims abound about lifestyle modifications or treatments improving healthspan. In both cases, rigorous testing is hampered by the long lifespan of model organisms like mice (let alone humans) and the difficulty of introducing genetic changes to examine the phenotype of the altered genome. We will develop C. elegans as a healthspan model. Already validated extensively as an ageing model, this organism can be readily modified genetically, and effects of environmental manipulations on healthspan can be measured in days or weeks. Once validated as a healthspan model, it can be used for an initial assessment of preventive and therapeutic measures for humans, as well as for risk identification and the initial evaluation of potential biomarkers. It will also prove useful to study interactions between genetic and various environmental factors.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENERGY.2013.5.1.2 | Award Amount: 9.22M | Year: 2014

ASCENT will provide a robust proof-of-concept of three related high temperature processes; each will lead to a step-change in efficiency of carbon removal in three types of pre-combustion capture, producing the hydrogen needed for highly efficient low-carbon power production. The project brings together five small and medium enterprises preparing to launch these concepts with the support of leading research institutes, universities and industrial partners. The essential feature linking the three technologies is the use of a high temperature solid sorbent for the simultaneous separation of CO2 during conversion of other carbon-containing gases (CO and CH4) into H2. Each technology provides a step-change in efficiency because they all separate the CO2 at elevated temperatures (>300 °C), providing more efficient heat integration options not available in technologies where the separation occurs at lower temperatures. Each process matches the endothermic and exothermic heat requirements of the associated reactions and of sorbent regeneration in an integrated, in situ approach. The synergies between the three technologies are strong, allowing both multiple interactions between the different work packages and a consistent framework for cross-cutting activities across all the technologies. Each technology will be proven under industrially relevant conditions of pressure and temperature, at a scale that allows the use of industrially relevant materials that can be manufactured at the scale needed for real implementation. This represents a necessary step to be taken for each of the technologies before setting out on the route to future demonstration-level activities. ASCENT, Advanced Solid Cycles with Efficient Novel Technologies, addresses the need for original ideas to reduce the energy penalty associated with capturing carbon dioxide during power generation, and to create a sustainable market for low-carbon-emission power with low associated energy penalties.
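
As a worked illustration of "simultaneous separation of CO2 during conversion of carbon-containing gases into H2", consider sorption-enhanced water-gas shift with a CaO-based sorbent. The choice of CaO here is purely illustrative (the abstract does not name the project's actual sorbents), and the approximate reaction enthalpies are standard textbook values:

```latex
% Illustrative sorption-enhanced water-gas shift; CaO is an assumed example sorbent.
\begin{align*}
\mathrm{CO + H_2O} &\rightarrow \mathrm{CO_2 + H_2}
  && \Delta H^\circ \approx -41\ \mathrm{kJ/mol} \quad \text{(shift, exothermic)}\\
\mathrm{CaO + CO_2} &\rightarrow \mathrm{CaCO_3}
  && \Delta H^\circ \approx -178\ \mathrm{kJ/mol} \quad \text{(carbonation/capture, exothermic)}\\
\mathrm{CaCO_3} &\rightarrow \mathrm{CaO + CO_2}
  && \Delta H^\circ \approx +178\ \mathrm{kJ/mol} \quad \text{(sorbent regeneration, endothermic)}
\end{align*}
```

In-situ CO2 removal pulls the shift equilibrium towards H2, and the exothermic capture step can supply part of the heat demanded by regeneration, which is the heat-matching idea described above.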


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2012.6.4-3 | Award Amount: 11.29M | Year: 2013

The aim of HELIX is to exploit novel tools and methods (remote sensing/GIS-based spatial methods, omics-based approaches, biomarkers of exposure, exposure devices and models, statistical tools for combined exposures, novel study designs, and burden of disease methodologies), to characterise early-life exposure to a wide range of environmental hazards, and integrate and link these with data on major child health outcomes (growth and obesity, neurodevelopment, immune system), thus developing an Early-Life Exposome approach. HELIX uses six existing, prospective birth cohort studies as the only realistic and feasible way to obtain the comprehensive, longitudinal, human data needed to build this early-life exposome. These cohorts have already collected large amounts of data as part of national and EU-funded projects. Results will be integrated with data from European cohorts (>300,000 subjects) and registers, to estimate health impacts at the large European scale. HELIX will make a major contribution to the integrated exposure concept by developing an exposome toolkit and database that will: 1) measure a wide range of major chemical and physical environmental hazards in food, consumer products, water, air, noise, and the built environment, in pre- and postnatal periods; 2) integrate data on individual, temporal, and toxicokinetic variability, and on multiple exposures, which will greatly reduce uncertainty in exposure estimates; 3) determine molecular profiles and biological pathways associated with multiple exposures using omics tools; 4) provide exposure-response estimates and thresholds for multiple exposures and child health; and 5) estimate the burden of childhood disease in Europe due to multiple environmental exposures. This integration of the chemical, physical and molecular environment during critical early-life periods will lead to major improvements in health risk and impact assessments and thus to improved prevention strategies for vulnerable populations.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SFS-12-2014 | Award Amount: 8.82M | Year: 2015

EuroMix aims to develop an experimentally verified, tiered strategy for the risk assessment of mixtures of multiple chemicals derived from multiple sources across different life stages. The project takes account of the gender dimension and balances the risk of chemicals present in foods against the benefits of those foods. Important concepts for this new strategy are prioritisation criteria for chemicals based on their exposure and hazard characteristics, and evaluation of the role of mode of action in grouping chemicals into cumulative assessment groups. In-silico and in-vitro tools will be developed and verified against in-vivo experiments, with a focus on four selected endpoints (liver, hormones, development and immunology) to provide a full proof-of-principle. The EuroMix project will result in an innovative platform of bioassays for mixture testing and refined categorisation of chemicals into cumulative assessment groups. New hazard and exposure models will be embedded in a model toolbox, made available to stakeholders through an openly accessible web-based platform. Access to the web-based tools will be facilitated by training. Criteria will be set and guidance will be written on how to use and implement the tiered test strategy. Dissemination and harmonisation of the approach within the EU, Codex Alimentarius and WHO will be achieved by involving, among others, WHO and US-EPA in the project and by the participation of experts playing a key role in helping establish international food safety policies. It is expected that the new mechanism-based strategy, the bioassay platform, the openly accessible web-based model toolbox, and clear guidance on a tiered hazard and exposure test and risk assessment strategy will boost innovation in the public and private sector, provide a sound scientific basis for managing risks to public health from chemical mixtures, ultimately reduce the use of laboratory animals, and support the global discussion of risk assessment policies for mixtures.
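
One widely used lower-tier approach for chemicals grouped into a cumulative assessment group is the hazard index: exposure to each chemical is divided by its health-based reference value and the quotients are summed. The sketch below is a generic illustration of that calculation with invented numbers; it is not the EuroMix model toolbox or its actual models.

```python
# Generic tier-1 hazard index for a cumulative assessment group (illustrative values only).
# HI = sum_i exposure_i / reference_value_i ; HI > 1 flags a potential concern.

exposures_ug_per_kg_bw_day = {        # hypothetical dietary exposures
    "chemical_A": 0.8,
    "chemical_B": 0.2,
    "chemical_C": 1.5,
}
reference_values_ug_per_kg_bw_day = { # hypothetical reference values for the same endpoint
    "chemical_A": 10.0,
    "chemical_B": 2.0,
    "chemical_C": 25.0,
}

hazard_quotients = {
    name: exposures_ug_per_kg_bw_day[name] / reference_values_ug_per_kg_bw_day[name]
    for name in exposures_ug_per_kg_bw_day
}
hazard_index = sum(hazard_quotients.values())

for name, hq in hazard_quotients.items():
    print(f"{name}: HQ = {hq:.3f}")
verdict = "potential concern" if hazard_index > 1 else "below concern threshold"
print(f"Hazard index = {hazard_index:.3f} -> {verdict}")
```

Higher tiers of a strategy like the one described above would refine this by using relative potency factors, probabilistic exposure models and mode-of-action information rather than a simple sum of quotients.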


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-01-2014 | Award Amount: 7.27M | Year: 2015

This programme of work will advance the understanding of the combined effects of factors that cause poor lung function, respiratory disability and the development of COPD. This will be achieved by examination of determinants of lung growth and lung function decline within existing cohorts that cover the whole life course, and which have followed, in detail, the respiratory health status of over 25,000 European children and adults from the early 1990s to the present day. Following a comprehensive programme of risk factor identification we will generate a predictive risk score. The programme includes: 1) identification of behavioural, environmental, occupational, nutritional, other modifiable lifestyle and genetic determinants of poor lung growth, excess lung function decline and occurrence of low lung function, respiratory disability and COPD within existing child and adult cohorts; 2) generation of new data to fill gaps in knowledge on pre-conception and transgenerational determinants and risk factors; 3) validation of the role of risk factors by integration of data from relevant disciplines, integration of data from the cohort-related population-based biobanks and exploitation of appropriate statistical techniques; 4) generation of information on changes in DNA methylation patterns to identify epigenetic changes associated with both disease development and exposure to specific risk factors; 5) generation of a predictive risk score for individual risk stratification that takes account of the combined effects of factors that cause poor lung growth, lung function decline, respiratory disability, and COPD; and 6) implementation of an online interactive tool for personalised risk prediction based on this score, which will be disseminated freely and widely to the population, patients and health care providers. The work will provide an evidence base for risk identification at individual and population level that can underpin future preventive and therapeutic strategies and policies.
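
A predictive risk score of the kind mentioned in point 5 is often built by fitting a regression model to cohort data and converting the coefficients into per-factor points. The sketch below is a generic logistic-regression illustration on synthetic data with made-up variable names and effect sizes; it is not the project's actual score or its statistical methodology.

```python
# Generic illustration of deriving a risk score from cohort-style data (synthetic).
# Variable names and effect sizes are made up; this is not the project's risk score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic "risk factors": smoking (0/1), childhood infections (0/1), standardised BMI
smoking = rng.integers(0, 2, n)
child_inf = rng.integers(0, 2, n)
bmi_z = rng.normal(0, 1, n)

# Synthetic outcome (low lung function / COPD), generated from an assumed true model
logit = -2.0 + 1.0 * smoking + 0.5 * child_inf + 0.3 * bmi_z
outcome = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([smoking, child_inf, bmi_z])
model = LogisticRegression().fit(X, outcome)

# Convert coefficients to a simple integer point score (1 point per 0.5 on the log-odds scale)
points = np.round(model.coef_[0] / 0.5).astype(int)
for name, p in zip(["smoking", "childhood infections", "BMI (per SD)"], points):
    print(f"{name}: {p} point(s)")

# Predicted risk for one hypothetical individual (smoker, no childhood infections, BMI +1 SD)
print("predicted risk:", model.predict_proba([[1, 0, 1.0]])[0, 1].round(3))
```

In practice such a score would be derived from the harmonised cohort data described above, validated in independent samples, and calibrated before being exposed through the interactive tool.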


Grant
Agency: European Commission | Branch: FP7 | Program: JTI-CP-FCH | Phase: SP1-JTI-FCH.2011.3.7 | Award Amount: 52.35M | Year: 2012

ene.field will deploy up to 1,000 residential fuel cell Combined Heat and Power (micro-CHP) installations across 11 key Member States. It represents a step change in the volume of fuel cell micro-CHP (micro FC-CHP) deployment in Europe and a meaningful step towards commercialisation of the technology. The programme brings together 9 mature European micro FC-CHP manufacturers into a common analysis framework to deliver trials across all of the available fuel cell CHP technologies. Fuel cell micro-CHP units will be installed and actively monitored in dwellings across the range of European domestic heating markets, dwelling types and climatic zones, which will lead to an invaluable dataset on domestic energy consumption and micro-CHP applicability across Europe. By learning the practicalities of installing and supporting a fleet of fuel cells with real customers, ene.field partners will take the final step before they can begin commercial roll-out. An increase in volume deployment for the manufacturers involved will stimulate cost reduction of the technology by enabling a move from hand-built products towards serial production and tooling. The ene.field project also brings together over 30 utilities, housing providers and municipalities to bring the products to market and explore different business models for micro-CHP deployment. The data produced by ene.field will be used to provide a fact base for micro FC-CHP, including a definitive environmental lifecycle assessment and a cost assessment on a total-cost-of-ownership basis. To inform clear national strategies on micro-CHP within Member States, ene.field will establish the macro-economics and CO2 savings of the technologies in their target markets and make recommendations on the most appropriate policy mechanisms to support the commercialisation of domestic micro-CHP across Europe. Finally, ene.field will assess the socio-economic barriers to widespread deployment of micro-CHP and disseminate clear position papers and advice for policy makers to encourage further roll-out.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-TP | Phase: KBBE.2013.1.2-04 | Award Amount: 8.53M | Year: 2014

The DROPSA consortium will create new knowledge and understanding of the damage and losses of fruit crops resulting from pests and pathogens, with a specific focus on the new and emerging threats posed by Drosophila suzukii and the quarantine pathogens Pseudomonas syringae, Xanthomonas fragariae and X. arboricola. The project will deliver a cost-effective approach that can be widely implemented by the EU fruit industry. The aims and objectives are to: (1) determine the pathways of introduction and spread of D. suzukii and the pathogens into the EU, and develop preventative strategies and recommendations against the introduction of other dangerous fruit pests and pathogens; (2) determine the biology, ecology and interactions of these pests and diseases in different regions of Europe, through a comprehensive evaluation of life cycles, host ranges, capacities to disperse, natural enemies, plant-pathogen interactions and the semiochemicals involved in the behaviour of D. suzukii, providing the platform for practical solutions for sustainable pest control; (3) develop innovative and effective control options using approved chemicals, semiochemicals, novel antimicrobial compounds and biological control agents, as well as cultural practices, sterile insect techniques and new mode-of-action compounds, combining the most reliable and effective options into an optimised integrated pest management (IPM) strategy; (4) develop forecasting and decision support systems and risk mapping as components of IPM, and evaluate the economic viability of the proposed strategies for fruit crop protection to support decision making in the implementation of IPM strategies to protect the EU fruit sector; and (5) protect intellectual property (IP) and undertake dissemination and exploitation actions to maximise the impact and uptake of the recommended IPM by commercial fruit growers.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: BIOTEC-1-2014 | Award Amount: 8.06M | Year: 2015

Mycoplasmas are the smallest cell-wall-less, free-living microorganisms. The lack of a cell wall makes them resistant to many of the common antibiotics. Every year, infections caused by Mycoplasmas in poultry, cows and pigs result in multimillion-euro losses in the USA and Europe. Currently, there are vaccines against M. hyopneumoniae in pigs and M. gallisepticum and M. synoviae in poultry. However, there is no vaccination against many Mycoplasma species infecting pets, humans and farm animals (i.e. M. bovis infection in cattle). Mycoplasma species are in many cases difficult to grow in axenic culture, and those that do grow need a complex medium with animal serum. In large-scale production of Mycoplasma species for vaccination, aside from the high cost of animal serum, more important are the poor reproducibility of the production process and the possible contamination with animal viruses. All this together highlights what European industry needs: (i) a defined, cheap, reproducible medium that is free of animal serum, and (ii) a universal Mycoplasma chassis that could be used in a pipeline to vaccinate against Mycoplasma species, as well as any pathogen. M. pneumoniae is an ideal starting point for designing such a vaccine chassis. It has a small genome (860 kb) and it is probably the organism with the most comprehensive systems biology data acquired so far. By genome comparison, metabolic modelling and rational engineering of its genome, we will create a vaccine chassis that will be introduced into an industrial pipeline. The process will be guided by the world's second largest animal vaccination company (MSD), as well as an SME specialised in peptide display and screening. This will ensure the exploitation and commercialisation of our work, contributing to maintaining Europe's privileged position in this field. Our ultimate goal is to meet the needs of the livestock industry, taking care of ethical issues and foreseeable risks, and to prepare effective dissemination and training material for the public.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-TP | Phase: FoF.NMP.2013-11 | Award Amount: 7.51M | Year: 2013

The aim of the project is to develop a completely new manufacturing system for the volume production of miniaturised components, overcoming the challenges of manufacturing with a wide range of materials (metallic alloys, composites, ceramics and polymers), through: (i) developing a high-throughput, flexible and cost-efficient process based on simultaneous electrical forming and electric fast sintering (Micro-FAST); (ii) scaling up the process to an industrial scale; (iii) further developing it towards an industrial production system for micro-/nano-manufacturing. These will be enabled and supported by developing: (i) a new machine concept, the Micro-FAST CNC machine; (ii) an innovative inline monitoring and quality inspection system; (iii) innovative multiscale modelling techniques for the analysis of the microstructural behaviour of materials and its interactions with the production processes; (iv) new tooling techniques for high-performance tools; and (v) high-performance nano-material systems. The whole development will take into account energy savings, cost and waste reduction, and recycling issues, which will be studied thoroughly through expert Life-Cycle Assessment. The development should lead to substantial improvements in the manufacture of components at micro- and nanoscale with a good balance of cost and performance. The consortium seeks: reduction of the overall manufacturing cost by 50-100%; reduction of energy consumption by more than 30-50%; achievement of full-density (100% density) components; and direct economic gains for the SME participants of up to 5-25%. The whole development will support EU-wide product innovations involving the use of miniature and micro-components in many manufacturing sectors, especially with difficult-to-cut and difficult-to-form materials. Adopting the production system in industry should help the EU manufacturing sectors to significantly strengthen their technological and business competitiveness.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.3.3-1 | Award Amount: 6.59M | Year: 2013

This project focuses on the systematic promotion and facilitation of active mobility (AM) (i.e. walking and cycling, including in combination with public transport use) as an innovative approach to integrating physical activity (PA) into individuals' everyday lives. In contrast to sports or exercise, AM requires less time and motivation, since AM provides both convenience as a mode of transport and a healthy lifestyle. As such, it has the potential to reach parts of the population which have not been receptive to the appeals and benefits of sports and exercise. The objectives of the project are the following. The project will review the literature on AM and identify innovative measures and systematic initiatives to promote AM, as well as traffic safety interventions. A longitudinal study will be conducted to evaluate the ongoing AM initiatives combined with traffic safety interventions, to better understand correlates of AM and their effects on overall PA, injury risk and exposure to air pollution. An improved, user-friendly tool for more comprehensive health impact assessment (HIA) of AM will be developed; the tool will be applied to AM behaviour observed in the case study cities and will allow the assessment of health and economic impacts of measures. The project will also produce a compendium of good practices in AM promotion aimed at decision makers, implementing authorities, businesses, civil society organizations and end users. Findings and progress reports will be communicated to diverse target audiences, including policy makers, practitioners, researchers and end-users, through a number of media: reports, journals, brochures, web content, workshops and presentations.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.3.1 | Award Amount: 17.80M | Year: 2013

To extend beyond existing limits in nanodevice fabrication, new and unconventional lithographic technologies are necessary to reach Single Nanometer Manufacturing (SNM) for novel Beyond CMOS devices. Two approaches are considered: scanning probe lithography (SPL) and focused electron beam induced processing (FEBIP). Our project tackles this challenge by employing SPL and FEBIP with novel small-molecule resist materials. The goal is to work from slow direct-write methods towards high-speed step-and-repeat manufacturing by Nano Imprint Lithography (NIL), developing methods for the precise generation, placement, metrology and integration of functional features at 3-5 nm by direct write and at sub-10 nm in a NIL template. The project will first produce an SPL tool prototype and will then develop and demonstrate an integrated process flow to establish proof-of-concept Beyond CMOS devices, employing developments in industrial manufacturing processes (NIL, plasma etching) and new materials (graphene, MoS2). By the end of the project: (a) SNM technology will be used to demonstrate novel room-temperature single-electron and quantum-effect devices; (b) an SNM technology platform will be demonstrated, showing an integrated process flow based on SPL prototype tools, electron beam induced processing, and finally pattern transfer at industrial partner sites. An interdisciplinary team of experienced scientists (7 industry and 8 research/university partners) will be established to cover specific fields of expertise: chemical synthesis, scanning probe lithography, FEBIP lithography, sub-3 nm design and device fabrication, single-nanometer etching, and step-and-repeat NIL and novel alignment system design. The project coordinator is a university with extensive experience in nanostructuring and European project management, and the executive board includes European industry leaders such as IBM, IMEC, EVG and Oxford Instruments.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.5.2 | Award Amount: 18.10M | Year: 2013

The DementiA Research Enabled by IT project responds to the European Parliament's 2011 resolution for a European Initiative on Alzheimer's disease and other dementias, and the EU Year of the Brain 2014 Initiative. It delivers the first patient-specific predictive models for early differential diagnosis of dementias and their evolution. Its mechanistic/phenomenological models of the ageing brain account simultaneously for the patient-specific multiscale biochemical, metabolic and biomechanical brain substrate, as well as for genetic, clinical, demographic and lifestyle determinants. It investigates the effect of metabolic syndrome, diabetes, diets, exercise and pulmonary conditions on the ageing brain, as environmental factors influencing the onset and evolution of dementias.

An integrated clinical decision support platform will be validated and tested through access to a dozen databases of international cross-sectional and longitudinal studies, including exclusive access to a population study that has tracked brain ageing in more than 10,000 individuals for over 20 years (the Rotterdam Study).

Enabling more objective, earlier, predictive and individualised diagnosis and prognosis of dementias will support health systems worldwide in coping with the burden of 36M patients that, due to ageing societies, will increase to 115M by 2050. Worldwide costs are estimated at 450B annually. In 2012, the WHO declared dementia a global health priority.

Our consortium assembles highly recognised engineering, physical, biomedical and clinical scientists, and industrial partners experienced in exploiting VPH technologies in healthcare. Co-operation with infrastructure projects like VPH-Share, related international Physiome efforts, and other dementia research consortia is assured, allowing European researchers from different disciplines to contribute to shared resources and methods and to generate new knowledge.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENERGY.2013.3.7.1 | Award Amount: 5.16M | Year: 2013

The main aim of this project is to support the sustainable delivery of non-food biomass feedstock at local, regional and pan-European level through developing strategies and roadmaps, informed by a computerized, easy-to-use toolset (and respective databases) with updated, harmonized datasets at local, regional, national and pan-European level for the EU27, the western Balkans, Turkey and Ukraine. It will do so by comparing and making use of the most recent relevant information from recent and ongoing EU projects, through a set of carefully selected validation case studies, and in close collaboration with key stakeholders from policy, industry and markets. The project fits under the overall umbrella of the Europe 2020 strategy for the building of a bioeconomy, as well as the targets for deployment of renewable energies and reduction of greenhouse gas emissions. The project will build up a concise knowledge base both for the sustainable supply and logistics of non-food biomass (quantities, costs, technological pathway options for 2020 and beyond) and for the development of technology and market strategies to support the development of a resource-efficient bioeconomy for Europe. This includes industrial processes (i.e. bio-based industries) for manufacturing biomass-derived goods/products as well as energy conversion, for both large-scale and small-scale units. The research work will be organized in three individual but strongly interrelated Themes: Theme 1 will focus on methodological approaches, data collection and estimation of sustainable biomass potentials, resource-efficient pathways and optimal logistical supply routes, and will develop the computerized toolset. Theme 2 will make use of the findings of Theme 1 to develop a Vision, Strategies and an R&D roadmap for the sustainable delivery of non-food biomass feedstock at local, regional and pan-European level. Theme 3 will validate the findings from Themes 1 and 2 and ensure the project's outreach.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 3.93M | Year: 2014

TEMPO addresses the needs of European companies and society for embedded control technology, through training on cutting edge research in the rapidly emerging inter-disciplinary field of embedded predictive control and optimization. The key objectives are: - to expand the scientific and technical knowledge platform for Embedded Predictive Control and Optimization in Europe; - to exploit this platform to train a new generation of world class researchers and professionals that are highly attractive for employment by the European industry; - to establish structures for long-term cooperation and strengthen the relations among the leading universities and industry in Europe in this field, to continuously develop the research training platform that European industry relies on. To achieve the objectives listed above, the main tasks of TEMPO are: - to attract and train 14 Early Stage Researchers in embedded MPC and optimization via a joint academic/industrial program of cutting edge training-by-research, high quality supervision, complementary and transferable skills training, inter-network secondments, and workshops; - to create a closely connected group of leading European scientists that are highly sought after by European industry, and ready to push forward embedded MPC and optimization into new innovative products, industries and services; - to build a solid foundation for long-term European excellence in this field by disseminating the research and training outcomes and best practice of TEMPO into the doctoral schools of the partners, and by fostering long-term partnerships and collaboration mechanisms that will outlast the ITN; - to disseminate the know-how of the participants to each other and to external groups via networking activities, inter-sectoral exposure, secondments, workshops, demonstrations, sharing of learning material, public engagement and outreach activities, and open source public domain software outcomes.
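
For readers unfamiliar with the field named here, embedded model predictive control (MPC) repeatedly solves a finite-horizon optimal control problem online and applies only the first computed input before re-planning. The sketch below shows an unconstrained linear-quadratic receding-horizon loop in plain NumPy for an assumed toy double-integrator; it is a textbook illustration only, not TEMPO's embedded solvers or software.

```python
# Textbook receding-horizon (MPC-style) control for a toy double integrator,
# unconstrained LQ case solved by backward Riccati recursion. Illustrative only.
import numpy as np

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # assumed toy dynamics: position/velocity
B = np.array([[0.5 * dt**2], [dt]])
Q = np.diag([10.0, 1.0])                # state weight
R = np.array([[0.1]])                   # input weight
N = 20                                  # prediction horizon

def first_input(x):
    """Solve the N-step LQ problem from state x and return the first optimal input."""
    P = Q.copy()                        # terminal cost
    K_first = None
    for _ in range(N):                  # backward Riccati sweep
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ A - A.T @ P @ B @ K
        K_first = K                     # last K computed corresponds to the first step
    return -K_first @ x

x = np.array([1.0, 0.0])                # start 1 m from the target, at rest
for step in range(50):                  # receding-horizon loop: apply first input, re-plan
    u = first_input(x)
    x = A @ x + B @ u
print("final state:", np.round(x, 4))
```

Embedded MPC research of the kind described above is largely about making the constrained version of this loop (with bounds on inputs and states) solvable reliably within tight time and memory budgets on embedded hardware.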


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2014-ETN | Award Amount: 4.10M | Year: 2015

Cement production for the construction industry contributes up to 5% of anthropogenic CO2 emissions. Developing more environmentally friendly concrete requires the assessment of strength for a diverse range of new cement materials. Similar issues arise during the development of biocompatible cements for medical applications. Properties of naturally cemented materials of organic origin are of key importance in the oil industry, with carbonate reservoirs prone to creep, particularly during the injection of CO2 for enhanced oil recovery or permanent storage. However, despite the importance of cement materials to our infrastructure, health and environment, we still lack the fundamental basis for understanding the strength of cemented aggregates. Granular pastes and sediments transform into strong solids through reactions at nano-confined mineral interfaces, where nucleation and growth at the adjacent solid surfaces are affected in a manner not yet understood. There is a need for improved concepts, theories and models. NanoHeal targets this issue by bringing six industrial and six academic groups together in a European Training Network (ETN), in an emerging interdisciplinary field spanning from the basic sciences to the corresponding engineering disciplines. NanoHeal will deliver an outstanding environment for training and career development of young researchers. The aims of NanoHeal are to: (1) develop innovative probes and models for nanoscale processes that open novel perspectives in the design and control of organo-mineral materials; (2) measure and improve the strength and durability of new man-made cemented materials (green concrete, speciality cements in construction and oil and gas recovery, and biocompatible implants) and of natural sedimentary rocks, both inside reservoirs and as construction materials; and (3) educate young interdisciplinary researchers at the interface between fundamental science and European industry.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRADEV-4-2014-2015 | Award Amount: 14.84M | Year: 2015

The social and economic challenges of ageing populations and chronic disease can only be met by translation of biomedical discoveries into new, innovative and cost-effective treatments. The ESFRI Biological and Medical Research Infrastructures (BMS RI) underpin every step in this process; effectively joining their scientific capabilities and shared services will transform the understanding of biological mechanisms and accelerate its translation into medical care. Biological and medical research addressing the grand challenges of health and ageing spans a broad range of scientific disciplines and user communities. The BMS RIs play a central, facilitating role in this groundbreaking research: interdisciplinary biomedical and translational research requires resources from multiple research infrastructures, such as biobank samples, imaging facilities, molecular screening centres or animal models. Through a user-led approach CORBEL will develop the tools, services and data management required by cutting-edge European research projects: collectively the BMS RIs will establish a sustained foundation of collaborative scientific services for biomedical research in Europe and embed the combined infrastructure capabilities into the scientific workflow of advanced users. Furthermore, CORBEL will enable the BMS RIs to support users throughout the execution of a scientific project: from planning and grant applications through to the long-term sustainable management and exploitation of research data. By harmonising user access, unifying data management, creating common ethical and legal services, and offering joint innovation support, CORBEL will establish and support a new model for biological and medical research in Europe. The BMS RI joint platform will visibly reduce redundancy, simplify project management and transform the ability of users to deliver advanced, cross-disciplinary research.


Grant
Agency: European Commission | Branch: H2020 | Program: IA | Phase: ICT-20-2015 | Award Amount: 7.28M | Year: 2016

Although online education is a paramount pillar of formal, non-formal and informal learning, institutions may still be reluctant to commit to a fully online educational model. As such, there is still a reliance on face-to-face assessment, since online alternatives do not yet have the social recognition and reliability they deserve. Thus, the creation of an e-assessment system able to provide effective proof of student identity and authorship, by integrating selected technologies into current learning activities in a scalable and cost-efficient manner, would be very advantageous. The TeSLA project provides educational institutions with an adaptive trust e-assessment system for assuring e-assessment processes in online and blended environments. It will support both continuous and final assessment to improve the level of trust across students, teachers and institutions. The system will be developed taking into account quality assurance agencies in education, privacy and ethical issues, and educational and technological requirements throughout Europe. It will follow interoperability standards for integration into different learning environment systems, providing a scalable and adaptive solution. The TeSLA system will be developed to reduce the current restrictions of time and physical space in teaching and learning, which opens up new opportunities for learners with physical or mental disabilities, while respecting social and cultural differences. Given the innovative character of the project, the current gap in e-assessment and the growing number of institutions interested in offering online education, the project will conduct large-scale pilots to evaluate and assure the reliability of the TeSLA system. Given the nature of the product, dissemination will be carried out across schools, higher education institutions and vocational training centres. A free version will be distributed, although a commercial premium version will be launched on the market.


Grant
Agency: European Commission | Branch: H2020 | Program: SGA-RIA | Phase: FETFLAGSHIP | Award Amount: 89.00M | Year: 2016

This project is the second in the series of EC-financed parts of the Graphene Flagship. The Graphene Flagship is a 10-year research and innovation endeavour with a total project cost of 1,000,000,000 euros, funded jointly by the European Commission and member states and associated countries. The first part of the Flagship was a 30-month Collaborative Project, Coordination and Support Action (CP-CSA) under the 7th framework program (2013-2016), while this and the following parts are implemented as Core Projects under the Horizon 2020 framework. The mission of the Graphene Flagship is to take graphene and related layered materials from a state of raw potential to a point where they can revolutionise multiple industries. This will bring a new dimension to future technology: a faster, thinner, stronger, flexible, and broadband revolution. Our program will put Europe firmly at the heart of the process, with a manifold return on the EU investment, both in terms of technological innovation and economic growth. To realise this vision, we have brought together a larger European consortium with about 150 partners in 23 countries. The partners represent academia, research institutes and industries, which work closely together in 15 technical work packages and five supporting work packages covering the entire value chain from materials to components and systems. As time progresses, the centre of gravity of the Flagship moves towards applications, which is reflected in the increasing importance of the higher (system) levels of the value chain. In this first core project, the main focus is on components and initial system-level tasks. The first core project is divided into 4 divisions, which in turn comprise 3 to 5 work packages on related topics. A fifth, external division acts as a link to the parts of the Flagship that are funded by the member states and associated countries, or by other funding sources. This creates a collaborative framework for the entire Flagship.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2013.6.2-1 | Award Amount: 9.99M | Year: 2014

Water and water-related services are major components of human wellbeing, and as such are major factors in socio-economic development in Europe; yet freshwater systems are under threat from a variety of stressors (organic and inorganic pollution, geomorphological alterations, land cover change, water abstraction, invasive species and pathogens). Some stressors, such as water scarcity, act as stressors in their own right because of their structural character, and drive the effects of other stressors. The relevance of water scarcity as a stressor is greatest in semi-arid regions, such as the Mediterranean basin, which are characterized by highly variable river flows and the occurrence of low flows; this has resulted in increases in the frequency and magnitude of extreme flow events. Furthermore, in other European regions such as eastern Germany, western Poland and England, water demand exceeds water availability and water scarcity has become an important management issue. Water scarcity is most commonly associated with inappropriate water management, with resulting river flow reductions, and it has become one of the most important drivers of change in freshwater ecosystems. The conjoint occurrence of a myriad of stressors (chemical, geomorphological, biological) under water scarcity will produce novel and unfamiliar synergies and most likely very pronounced effects. Within this context, GLOBAQUA has assembled a multidisciplinary team of leading scientists in the fields of hydrology, chemistry, ecology, ecotoxicology, economics, sociology, engineering and modelling in order to study the interaction of multiple stressors within the frame of strong pressure on water resources. The aim is to achieve a better understanding of how current management practices and policies could be improved, by identifying the main drawbacks and alternatives.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-CSA | Phase: ENERGY.2013.10.1.5 | Award Amount: 13.28M | Year: 2014

Europe has invoked the SET-Plan to design and implement an energy technology policy for Europe to accelerate the development and deployment of cost-effective renewable energy systems, including photovoltaics. With a lower cost of solar electricity, PV could contribute significantly to the achievement of the 20-20-20 objectives. The Joint Program on PV of the European Energy Research Alliance (EERA-PV) aims to increase the effectiveness and efficiency of PV R&D through alignment and joint programming of the R&D of its member institutes, and to contribute to the R&D needs of the Solar Europe Industry Initiative. In CHEETAH, all EERA-PV members will, through collaborative R&D activities, (1) focus on solving specific bottlenecks in the R&D Joint Program of EERA-PV, (2) strengthen the collaboration between PV R&D performers in Europe through sharing of knowledge, personnel and facilities, and (3) accelerate the implementation of developed technologies in the European PV industry. Specifically, CHEETAH R&D will support Pillar A (performance enhancement & energy cost reduction) of the SEII Implementation Plan, through materials optimization and performance enhancement. CHEETAH's objectives are threefold: 1) developing new concepts and technologies for wafer-based crystalline silicon PV (modules with ultra-thin cells), thin-film PV (advanced light management) and organic PV (very low-cost barriers), resulting in (strongly) reduced cost of materials and increased module performance; 2) fostering long-term European cooperation in the PV R&D sector, by organizing workshops, training researchers and making efficient use of infrastructures; 3) accelerating the implementation of innovative technologies in the PV industry, through strong involvement of EPIA and EIT-KIC InnoEnergy in the program. It is the ambition of CHEETAH to develop technology and foster manufacturing capabilities so that Europe can regain and build up its own manufacturing capacity in all parts of the value chain in due time.


Grant
Agency: European Commission | Branch: H2020 | Program: IA | Phase: LCE-07-2014 | Award Amount: 15.65M | Year: 2015

Unlike the control and observability already in service in HV/MV networks, LV networks are still largely managed as usual: no visibility of power, voltage or grid component status; poor knowledge of connectivity; manual operation of switches; and few tools for worker support. The LV grid characteristics (radial topology, exposure to local disturbances, local accumulation of distributed generation, technical and non-technical losses, ageing and heterogeneous assets, etc.) limit the construction and refurbishment of LV electric infrastructure and the integration into it of remote grid monitoring, operation and automation resources, leading to difficulties in the implementation of the LV smart grid and the integration of Distributed Generation Resources and Active Demand Management (ADM). Smart metering deployment mandates offer an opportunity to maximize the gains derived from the mandated smart metering functions, by developing and integrating additional innovative grid and ICT infrastructure, functions, services and tools that improve grid operation performance and quality and pave the way for benefits and business opportunities for the involved actors (DSOs, customers, retailers and ESCOs). The project aims to develop, deploy and demonstrate innovative solutions (grid systems, functions, services and tools) for advanced operation and exploitation of LV/MV networks in a fully smart grid environment, improving the capacity of those networks as enablers for distributed generation, ADM, customer empowerment and business opportunities. The project proposes 4 real pilots in Portugal, Poland, Spain and Sweden covering smart grid monitoring and operation, advanced grid maintenance, DER and ADM integration, and active consumer awareness and participation, with cost efficiency. It also proposes specific WPs to maximize the socioeconomic impact of results, especially their market uptake, the triggering of business opportunities and society's awareness of smart grid benefits.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: ICT-29-2016 | Award Amount: 4.00M | Year: 2016

Colorectal cancer represents around one tenth of all cancers worldwide. Early and accurate diagnosis and precise intervention can increase the cure rate to up to 90%. Improved diagnostic techniques with sufficient sensitivity and specificity are required to allow in situ assessment, safe characterization and resection of lesions during clinical practice interventions. The multidisciplinary PICCOLO team proposes a new compact, hybrid and multimodal photonics endoscope based on Optical Coherence Tomography (OCT) and Multi-Photon Tomography (MPT) combined with novel red-flag fluorescence technology for in vivo diagnosis and clinical decision support. By combining the outstanding structural information from OCT with the precise functional information from MPT, this innovative endoscope will provide gastroenterologists with immediate and detailed in situ identification of colorectal neoplastic lesions and facilitate accurate and reliable in vivo diagnostics, with additional grading capabilities for colon cancer as well as in situ lesion infiltration and margin assessment. With the development of compact instrumentation, the cost of the components and thus of the system will be significantly reduced. Human-representative animal models will be used to generate imaging biomarkers that allow automated detection, assessment and grading of disease. The developed system will be tested in operating room conditions. The consortium comprises the whole value chain, including pre-clinical and clinical partners, technology providers, photonics SMEs and a market-leading endoscopy company. The project will permit these companies to enhance their competitiveness and leadership in the diagnostics sector as well as to exploit new market opportunities. The new endoscope will significantly impact clinical practice by allowing in vivo optical biopsy assessment via the automatic analysis of images, enabling accurate and efficient characterisation of colorectal lesions.
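For orientation only (a standard textbook relation, not taken from the project description): the axial resolution of OCT is set by the coherence length of the source, which for a Gaussian spectrum is commonly written in LaTeX form as

\[ \Delta z \;=\; \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda}, \]

where \lambda_0 is the centre wavelength and \Delta\lambda the spectral bandwidth (FWHM); broader-bandwidth sources therefore give finer depth sectioning, which is one reason compact broadband photonics sources matter for an endoscope of the kind proposed here.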


Howes O.D.,Imperial College London | Howes O.D.,King's College London | Murray R.M.,King's College London
The Lancet | Year: 2014

Schizophrenia remains a major burden on patients and society. The dopamine hypothesis attempts to explain the pathogenic mechanisms of the disorder, and the neurodevelopmental hypothesis the origins. In the past 10 years an alternative, the cognitive model, has gained popularity. However, the first two theories have not been satisfactorily integrated, and the most influential iteration of the cognitive model makes no mention of dopamine, neurodevelopment, or indeed the brain. In this Review we show that developmental alterations secondary to variant genes, early hazards to the brain, and childhood adversity sensitise the dopamine system, and result in excessive presynaptic dopamine synthesis and release. Social adversity biases the cognitive schema that the individual uses to interpret experiences towards paranoid interpretations. Subsequent stress results in dysregulated dopamine release, causing the misattribution of salience to stimuli, which are then misinterpreted by the biased cognitive processes. The resulting paranoia and hallucinations in turn cause further stress, and eventually repeated dopamine dysregulation hardwires the psychotic beliefs. Finally, we consider the implications of this model for understanding and treatment of schizophrenia.


Leader E.,Imperial College London | Lorce C.,University Paris - Sud | Lorce C.,University of Liège
Physics Reports | Year: 2014

The general question, crucial to an understanding of the internal structure of the nucleon, of how to split the total angular momentum of a photon or gluon into spin and orbital contributions is one of the most important and interesting challenges faced by gauge theories like Quantum Electrodynamics and Quantum Chromodynamics. This is particularly challenging since all QED textbooks state that such a splitting cannot be done for a photon (and a fortiori for a gluon) in a gauge-invariant way, yet experimentalists around the world are engaged in measuring what they believe is the gluon spin! This question has been a subject of intense debate and controversy, ever since, in 2008, it was claimed that such a gauge-invariant split was, in fact, possible. We explain in what sense this claim is true and how it turns out that one of the main problems is that such a decomposition is not unique and therefore raises the question of what is the most natural or physical choice. The essential requirement of measurability does not solve the ambiguities and leads us to the conclusion that the choice of a particular decomposition is essentially a matter of taste and convenience. In this review, we provide a pedagogical introduction to the question of angular momentum decomposition in a gauge theory, present the main relevant decompositions and discuss in detail several aspects of the controversies regarding the question of gauge invariance, frame dependence, uniqueness and measurability. We stress the physical implications of the recent developments and collect into a separate section all the sum rules and relations which we think experimentally relevant. We hope that such a review will make the matter amenable to a broader community and will help to clarify the present situation. © 2014 Elsevier B.V.
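For readers meeting this debate for the first time, the two decompositions most often contrasted in the literature the review surveys can be written schematically (in LaTeX, with signs and conventions depending on the author; these expressions are quoted here only as background and are not part of the abstract above). The Jaffe-Manohar (canonical) form is

\[ \vec J \;=\; \int d^3x\,\psi^\dagger\tfrac{\vec\Sigma}{2}\psi \;+\; \int d^3x\,\psi^\dagger\,\vec x\times(-i\vec\nabla)\,\psi \;+\; \int d^3x\,\vec E\times\vec A \;+\; \int d^3x\,E^i\,\vec x\times\vec\nabla A^i, \]

i.e. quark spin + quark orbital + gluon spin + gluon orbital, and the Ji (kinetic) form is

\[ \vec J \;=\; \int d^3x\,\psi^\dagger\tfrac{\vec\Sigma}{2}\psi \;+\; \int d^3x\,\psi^\dagger\,\vec x\times(-i\vec D)\,\psi \;+\; \int d^3x\,\vec x\times(\vec E\times\vec B), \]

in which the gluon contribution is not split further. The difference between the orbital terms (\vec\nabla versus the covariant derivative \vec D) is precisely where the gauge-invariance, uniqueness and measurability questions discussed in the review enter.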


Johnson R.W.,Royal Infirmary | Rice A.S.C.,Imperial College London
New England Journal of Medicine | Year: 2014

A 73-year-old woman presents with persistent pain and itching in the right T10 dermatome from just above the thoracolumbar junction to the umbilicus since a documented episode of herpes zoster in the same region 1 year earlier. She describes a severe, continuous "burning" pain, unpredictable paroxysms of lancinating pain lasting a few seconds, and intense hypersensitivity to light tactile stimulation, such as clothing brushing against the skin. On physical examination, there are signs of cutaneous scarring throughout the right T10 dermatome, with areas of excoriation caused by scratching. She has patchy loss of tactile perception in this distribution as well as areas of pain provoked by a light brush. Acetaminophen did not help her pain. How would you manage this patient's condition? © 2014 Massachusetts Medical Society.


Nielsen C.B.,Imperial College London | Turbiez M.,BASF | McCulloch I.,Imperial College London
Advanced Materials | Year: 2013

This progress report summarizes the numerous DPP-containing polymers recently developed for field-effect transistor applications, including diphenyl-DPP and dithienyl-DPP-based polymers as the most commonly reported materials, but also difuranyl-DPP, diselenophenyl-DPP and dithienothienyl-DPP-containing polymers. We discuss the hole and electron mobilities that were reported in relation to structural properties such as alkyl substitution patterns, polymer molecular weights and solid-state packing, as well as electronic properties including HOMO and LUMO energy levels. We moreover consider important aspects of ambipolar charge transport and highlight fundamental structure-property relations such as the relationships between the thin-film morphologies and the charge carrier mobilities observed for DPP-containing polymers. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2013-ITN | Award Amount: 3.78M | Year: 2013

Organic bioelectronics is a new discipline which holds promise to shape, direct, and change future medical treatments in a revolutionary manner over the next decades. At the moment Europe has a unique leading position in this area: almost all the world-leading groups in the field are located in Europe, and they constitute the core of this international training network. However, realizing the promise of organic bioelectronics requires research and training that not only cross disciplines, such as electrical engineering, biology, chemistry, physics, and materials science, but also cross national borders within Europe. The EU will add value on the global scene only if it acts jointly. OrgBIO is at the core of European technological innovation and will become an indispensable part of the educational canon. It will establish a world-class training platform spanning the highly interdisciplinary / intersectorial European-led area of organic bioelectronics. Education, along with scientific and entrepreneurial mindsets and attitudes, is the core of the OrgBIO training programme, which aims at excellence and innovation at all levels. Excellence in science is guaranteed by the world-leading groups which founded this research area. Innovation in education is guaranteed by the involvement of researchers in education and business experts. Using different sensors, actuators, and electronic and interconnect technologies, the network will develop multifunctional systems based on organic devices and materials with high sensitivity that are also flexible, conformable and deployable over large areas for various biomedical / biological applications in the life sciences. Multi-analyte and disposable analytical systems manufactured by large-area printing methods will provide services to individuals and the healthcare community. Targeted interactions with a wide network of venture capital firms and business actors will immediately transfer the research outcomes to European industry.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2016 | Award Amount: 3.16M | Year: 2017

In the Roadmap for Mental Health and Wellbeing Research in Europe (ROAMER), research into child and adolescent mental health symptoms is a top priority. CAPICE (Childhood and Adolescence Psychopathology: unravelling the complex etiology by a large Interdisciplinary Collaboration in Europe) will address this priority. The network will build on the EArly Genetics and Lifecourse Epidemiology (EAGLE) consortium, a well-established collaboration of the many European birth and adolescent population-based (twin and family) cohorts with unique longitudinal information on lifestyle, family environment, health, and emotional and behavioral problems. Phenotypic and genome-wide genotypic data are available for over 60,000 children, in addition to genome-wide genotypes for over 20,000 mothers and epigenome-wide data for over 6,000 children. Combined with the enormous progress in methodology, the results of the research performed in this network will greatly expand our knowledge of the etiology of mental health symptoms in children and adolescents and shed light on possible targets for prevention and intervention, e.g. by drug target validation. Moreover, it will provide Early Stage Researchers (ESRs) with excellent training in the psychiatric genomics field, given by a multidisciplinary team of eminent scientists from the academic and non-academic sectors highly experienced in, for example, gene-environment interaction and covariation analyses, (epi)genome-wide association studies, Mendelian Randomization (MR) and polygenic analyses. With a focus on common and debilitating problems in childhood and adolescence, including depression, anxiety and Attention Deficit Hyperactivity Disorder, CAPICE will contribute to improving the later outcomes of young people with child and adolescent psychopathology across European countries.
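As a purely illustrative aside (not part of the CAPICE description; variable names and numbers below are hypothetical toy data), the polygenic analyses mentioned above reduce, at their simplest, to a weighted sum of per-variant allele dosages with weights taken from an independent GWAS:

import numpy as np

def polygenic_score(dosages, betas):
    # dosages: (n_individuals, n_variants) allele counts in [0, 2]
    # betas:   (n_variants,) effect sizes from an external GWAS
    # returns one additive score per individual
    return dosages @ betas

# toy, made-up example: 3 individuals, 4 variants
dosages = np.array([[0., 1., 2., 1.],
                    [2., 0., 1., 0.],
                    [1., 1., 1., 2.]])
betas = np.array([0.12, -0.05, 0.30, 0.08])  # illustrative effect sizes only
print(polygenic_score(dosages, betas))

Real analyses add variant clumping and thresholding, score standardisation and careful handling of population structure, none of which is sketched here.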


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP.2012.4.1-3 | Award Amount: 5.16M | Year: 2013

We will examine the life cycle of rare earth metals used in magnetic phase-change technologies. Our primary focus is on room-temperature magnetic cooling, a near-market solid-state alternative to gas compression in which a phase-change magnetocaloric material is magnetised by a permanent magnet. We will address the fabrication, manufacture and use of the magnetocaloric material, aiming: (1) to reduce consumption and eliminate wastage of rare earths during the scalable manufacture of magnetocaloric parts; and (2) to drastically reduce the volume of rare earth permanent magnet required, through a step-change improvement in the performance of low-rare-earth or rare-earth-free magnetocaloric materials. Such developments will reduce both raw material use and future technology cost, providing the necessary bridge between state-of-the-art prototyping activity and industrially scalable production of magnetic cooling engines. The project consortium includes materials physicists, researchers active in the industrial scale-up of parts manufacture, a magnet and magnetocaloric material supplier and an SME. This combination will provide feedback between fundamental magnetocaloric material properties, material performance under test, and potential impact on product design. A large-scale end-user partner will provide analyses of the life cycle, environmental and cost benefits of our research to the domestic refrigeration sector. The knowledge gained from our activities will be used in parallel for the development of magnetocaloric materials for a longer-term application: thermomagnetic power generation.
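For context (a textbook relation, not a claim from the project summary): the isothermal entropy change that drives magnetic cooling follows from the Maxwell relation, written in LaTeX as

\[ \Delta S_M(T, H_{\max}) \;=\; \mu_0 \int_0^{H_{\max}} \left(\frac{\partial M}{\partial T}\right)_{H} dH, \]

with the adiabatic temperature change approximately \Delta T_{ad} \approx -\,(T/C_{p,H})\,\Delta S_M. A large |\partial M/\partial T| near a phase transition close to room temperature is what makes a magnetocaloric material useful, and it is this response that the project aims to obtain with little or no rare earth content.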


Grant
Agency: European Commission | Branch: H2020 | Program: BBI-RIA | Phase: BBI.R10-2015 | Award Amount: 3.77M | Year: 2016

The BIOrescue project aims to develop and demonstrate a new innovative biorefinery concept based on the cascading use of spent mushroom substrate (SMS) supplemented by wheat straw and other seasonal underutilised lignocellulosic feedstocks (e.g. pruning residues, residual citrus peels and other wastes). This new concept will avoid disposal and allow the production of biodegradable bio-based products and bioactive compounds that will help to replace existing ones based on fossil resources. The research will help to expand the business opportunities of mushroom cultivation farms, and the know-how and business opportunities of all the partners involved. The main innovations are: improved methods for rapid lab-based (NIR) analysis of biomass; an innovative two-step fractionation of SMS; synergistic effects for complete SMS glucan hydrolysis; an innovative enzyme immobilisation strategy; development of highly efficient glucan enzymes; novel lignin-based nano- and micro-carriers; and biopesticide production from SMS-derived monomeric sugars and their packaging into nanocarriers. The consortium includes several BIC members, among them a large company (Monaghan Mushrooms), which is leading the proposal, some SMEs (MetGen Oy and CLEA Technologies) and BIC associate members (University of Naples and CENER). Additionally, other relevant partners with well-known expertise in their respective areas contribute to the objectives, among them research organisations (Imperial College London and the Max Planck Institute of Polymers) and innovative SMEs (Celignis Limited, Zabala Innovation Consulting, Greenovate Europe and C-TECH Innovation Ltd). The synergies between large industry and SMEs go beyond the scope of this project: there is considerable potential for collaboration between the agricultural industry (Monaghan) and biotechnology (MetGen and CLEA) to provide novel solutions for a continuous circular economy in large agriculture-based value chains.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: CULT-COOP-08-2016 | Award Amount: 2.37M | Year: 2016

Pluggable Social Platform for Heritage Awareness and Participation (PLUGGY) will support citizens in shaping cultural heritage and being shaped by it. PLUGGY will enable them to share their local knowledge and everyday experience with others. Participation will include the contribution of cultural institutions and digital libraries, building extensive networks around a common interest in connecting past, present and future. PLUGGY frames its objectives around the Faro Convention, in line with new social paradigms which declare heritage to be an asset and a responsibility for all, aiming to encompass greater democratic participative actions with concern for the local and the everyday. The PLUGGY Social Platform will facilitate a continuing process of creating, modifying and safeguarding heritage in which citizens will be prosumers and maintainers of cultural activities. It will be web-based, easily accessed and will allow the development of shared identity and differentiation. Users of the PLUGGY Social Platform will curate stories using the PLUGGY Curatorial Tool. Content will be both crowdsourced and retrieved from digital collections, allowing users to create links between seemingly unrelated facts, events, people and digitized collections, leading to new approaches to presenting cultural resources and new ways of experiencing them. PLUGGY will provide the necessary architecture for the creation of pluggable applications, allowing for not-yet-imagined, beyond-the-project uses of the content on the social platform, while focusing on the design of the social interaction and helping to build new virtual heritage communities. The PLUGGY consortium spans 5 countries and includes 4 academic partners (ICCS, TUK, UMA, ICL), a total of 10 museums (PIOP, ESM) and 3 SMEs (CLIO, VIA, XTS) in the fields of cultural heritage and creative applications. Together they cover the areas of cultural heritage, social platforms, authoring tools, VR/AR, knowledge management, semantics and 3D audio.


Grant
Agency: European Commission | Branch: FP7 | Program: JTI-CP-FCH | Phase: SP1-JTI-FCH.2012.3.2 | Award Amount: 7.36M | Year: 2013

This project aims to improve the robustness, manufacturability, efficiency and cost of state-of-the-art solid oxide fuel cell (SOFC) stacks so as to meet market entry requirements. We propose a focused project addressing the key issues that have manifested themselves in the course of the ongoing product development efforts at Topsoe Fuel Cell A/S (TOFC). The key issues are the mechanical robustness of SOFCs and the delicate interplay between cell properties, stack design and operating conditions of the SOFC stack. The novelty of the project lies in combining state-of-the-art methodologies for cost-optimal reliability-based design (COPRD) with actual production optimization. To achieve the COPRD, beyond-state-of-the-art multi-physical modelling concepts must be developed and validated for a significantly improved understanding of the production and operation of SOFC stacks. The key to this understanding is validating experiments and models on multiple levels of the SOFC system and the introduction of extensive test programs specified by the COPRD methodology.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENERGY.2012.10.2.1 | Award Amount: 3.55M | Year: 2013

The dye-sensitized solar cell (DSSC) is the leading technology among third-generation solution-processed solar cells, with reported efficiencies in excess of 10%. However, despite the huge efforts of the last two decades, saturation effects are observed in their performance. Efforts so far have concentrated on engineering and fine-tuning of the dyes, the electrolytes and the interface between the dye and the electron acceptor, employing titania as the electron acceptor. DSSCs thus rely on dyes for efficient light harvesting, which in turn entails high fabrication costs associated with the Ru-based dyes as well as the use of 10 µm thick devices. In addition, optimized titania requires high-temperature processing, raising concerns about its potential for low-cost, flexible-platform fabrication. In this project we propose a disruptive approach: to replace titania with a novel electron-accepting nanoporous semiconductor with a bandgap suitable for optimized solar harnessing and a very high absorption coefficient, allowing total light absorption within 2 µm across its absorption spectrum. In addition, the deposition of the nanostructured platform will employ processing below 200 °C, compatible with plastic, flexible substrates and cost-effective roll-to-roll manufacturing. We will focus on non-toxic, high-abundance nanomaterials in order to enable successful deployment of DSSCs with targeted efficiencies in excess of 15%, and 10% for SS-DSSCs, thanks to the efficient solar harnessing offered by the novel nanocrystal electron acceptor. To tackle this multidisciplinary challenge we have assembled a group of experts in the respective fields: development of nanocrystal solar cells, DSSC technology and physics, atomic-layer and surface characterisation, and a technology leader (industrial partner) in the manufacturing and development of third-generation, thin-film photovoltaic cells and modules (DSSCs).
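As a rough, back-of-the-envelope consistency check on the 2 µm target (our own estimate, not part of the project text): with the Beer-Lambert law I(d) = I_0 e^{-\alpha d}, absorbing 95% of the incident light within d = 2 µm requires

\[ \alpha \;\ge\; \frac{-\ln(0.05)}{2\ \mu\mathrm{m}} \;\approx\; \frac{3.0}{2\times10^{-4}\ \mathrm{cm}} \;\approx\; 1.5\times10^{4}\ \mathrm{cm}^{-1}, \]

i.e. an absorption coefficient of order 10^4 cm^{-1} or more across the relevant part of the solar spectrum, which is what the "very high absorption coefficient" requirement on the new electron-accepting semiconductor amounts to.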


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-09-2015 | Award Amount: 28.14M | Year: 2016

Many HIV vaccine concepts have been explored and several efficacy trials conducted in the prophylactic and therapeutic fields, with limited success. There is an urgent need to develop better vaccines and tools predictive of immunogenicity and of correlates of protection at an early stage of vaccine development, to mitigate the risks of failure. To address these complex and challenging scientific issues, the European HIV Vaccine Alliance (EHVA) program will develop a Multidisciplinary Vaccine Platform (MVP) in the fields of prophylactic and therapeutic HIV vaccines. The specific objectives of the MVP are to build up: 1. a Discovery Platform, with the goal of generating novel vaccine candidates inducing potent neutralizing and non-neutralizing antibody responses and T-cell responses; 2. an Immune Profiling Platform, with the goal of ranking novel and existing (benchmark) vaccine candidates on the basis of their immune profile; 3. a Data Management/Integration/Down-Selection Platform, with the goal of providing statistical tools for the analysis and interpretation of complex data and algorithms for the efficient selection of vaccines; and 4. a Clinical Trials Platform, with the goal of accelerating the clinical development of novel vaccines and the early prediction of vaccine failure. The EHVA project has developed a global and innovative strategy which includes: a) multidisciplinary expertise involving immunologists, virologists, structural biology experts, statisticians, computational scientists and clinicians; b) the most innovative technologies for profiling immune responses and the virus reservoir; c) access to large cohort studies bringing together top European clinical scientists/centres in the fields of prophylactic and therapeutic vaccines; d) access to a panel of experimental HIV vaccines under clinical development that will be used as benchmarks; and e) liaison with a number of leading African scientists/programs, which will foster the testing of future EHVA vaccines through EDCTP.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2014-ETN | Award Amount: 2.34M | Year: 2015

Optical laser-based technologies are a key technology of the 21st century. Extension of the range of scientific and commercial laser applications requires a constant expansion of the accessible regimes of laser operation. Concepts from nonlinear optics, driven with ultra-fast lasers, provide all the means to achieve this goal. However, nonlinear optics typically suffer from low efficiencies, e.g. if high-order processes are involved or if the driving laser pulse intensities must be limited below damage thresholds (e.g. in nonlinear microscopy of living cells, or nonlinear spectroscopy of combustion processes). Hence, we require methods to enhance nonlinear optical processes. The field of coherent control provides techniques to manipulate laser-matter interactions. The idea is to use appropriately designed light-matter interactions to steer quantum systems towards a desired outcome, e.g. to support nonlinear optical processes. The goal of HICONO is to combine the concepts of coherent control with high-intensity nonlinear-optical interactions. The particular aim is to enhance the efficiency of nonlinear optical processes and extend the range of high-intensity laser applications. HICONO will develop new coherent control strategies matched to high-intensity nonlinear optics. This will push high-order frequency conversion towards larger output yield, enable novel applications in high-resolution spectroscopy and microscopy, and drive novel technologies for ultra-short pulse generation and characterization. The close cooperation of HICONO with industry partners will lead to commercially relevant devices. In terms of training, HICONO aims at the development of young researchers with the skills to exploit the concepts of high-intensity laser technologies, laser-based control, and applied nonlinear optics. HICONO provides a unique, very broad and technology-oriented early-stage training program with strong exposure of the fellows to an industrial environment.
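To make the efficiency argument concrete (standard perturbative nonlinear optics, not a statement specific to HICONO): the induced polarization is usually expanded, in LaTeX form, as

\[ P(t) \;=\; \epsilon_0\left[\chi^{(1)}E(t) + \chi^{(2)}E^{2}(t) + \chi^{(3)}E^{3}(t) + \cdots\right], \]

so the yield of an n-th order process scales roughly as the n-th power of the driving intensity while the susceptibilities \chi^{(n)} fall off rapidly with n. This is why high-order conversion is intrinsically inefficient unless the light-matter interaction itself is shaped, which is the role of the coherent-control strategies the network sets out to develop.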
