Power M.J., University of Utah | Whitney B.S., Northumbria University | Mayle F.E., University of Reading | Neves D.M., Royal Botanic Gardens | And 2 more authors.
Philosophical Transactions of the Royal Society B: Biological Sciences | Year: 2016

South American seasonally dry tropical forests (SDTFs) are critically endangered, with only a small proportion of their original distribution remaining. This paper presents a 12 000 year reconstruction of climate change, fire and vegetation dynamics in the Bolivian Chiquitano SDTF, based upon pollen and charcoal analysis, to examine the resilience of this ecosystem to drought and fire. Our analysis demonstrates a complex relationship between climate, fire and floristic composition over multi-millennial time scales, and reveals that moisture variability is the dominant control upon community turnover in this ecosystem. Maximum drought during the Early Holocene, consistent with regional drought reconstructions, correlates with a period of significant fire activity between 8000 and 7000 cal yr BP which resulted in a decrease in SDTF diversity. As fire activity declined but severe regional droughts persisted through the Middle Holocene, SDTF taxa, including Anadenanthera and Astronium, became firmly established in the Bolivian lowlands. The decline in fire activity over the last two millennia supports the idea, widespread among forest ecologists, that SDTFs are threatened by fire. Our analysis shows that the Chiquitano seasonally dry biome has been more resilient to Holocene changes in climate and fire regime than previously assumed, but raises questions over whether this resilience will continue in the future under increased temperatures and drought coupled with a higher-frequency anthropogenic fire regime. © 2016 The Author(s) Published by the Royal Society. All rights reserved.


Dejonckere P.H., Utrecht University | Dejonckere P.H., Federal Institute of Occupational Diseases | Dejonckere P.H., Catholic University of Leuven
Models and Analysis of Vocal Emissions for Biomedical Applications - 7th International Workshop, MAVEBA 2011 | Year: 2011

Voice problems have become a major occupational health issue within the teaching community, as they frequently result in work absenteeism and need for professional re-orientation. Four main risk factors have been identified: voice loading, general health condition, environmental factors and psycho-emotional factors (occupational stress and frustration). In order to specifically consider the 'stress' aspect, we investigated voice complaints and voice-related quality of life in the teachers of a special educational setting: the national military academy for future noncommissioned officers, whose students are adolescents aged 12 to 18. The outcomes were compared with those from recent reports about similar studies in common secondary schools in different European countries and in the USA. Our results demonstrate that the specific military teachers' population considered in this study clearly shows a significantly lower prevalence of voice problems than comparable teachers' populations in 'common' secondary schools. On the other hand, we investigated two specific groups of teachers presumed to have a heavier physical voice load than classical teachers: teachers of physical education and swimming teachers (in secondary schools). For these two categories of teachers, the clear overall similarity with classical teachers provides a strong argument that vocal load and environment are not the sole - or by far the most important - cause of voice complaints. © 2011 Firenze University Press.


Earth has a bad habit of erasing its own history. At intersections of tectonic plates worldwide, slabs of ocean crust dive into the mantle, part of the continuous cycle that not only drives the continents’ drift, but also fuels the volcanism that builds up island chains like Japan and mountains like the Andes. The disappearance of these slabs, called subduction, makes it difficult to reconstruct oceans as they existed hundreds of millions of years ago, as well as the mountains flanking them. “Every day, we’re losing geologic information from the face of the Earth,” says Jonny Wu, a geologist at the University of Houston in Texas. “It’s like losing pieces of broken glass as you’re trying to put it together again.” But geoscientists have begun to pick up these pieces by peering into the mantle itself, using earthquake waves that pass through Earth’s interior to generate images resembling computerized tomography (CT) scans. In the past few years, improvements in these tomographic techniques have revealed many of these cold, thick slabs as they free fall in slow motion to their ultimate graveyard—heaps of rock sitting just above Earth’s molten core, 2900 kilometers below. Now, the complete x-ray of Earth’s interior is coming into focus. Next month, at a meeting of the American Geophysical Union in San Francisco, California, a team of Dutch scientists will announce a catalog of 100 subducted plates, with information about their age, size, and related surface rock records, based on their own tomographic model and cross-checks with other published studies. “Step by step we went deeper and deeper, older and older,” says Douwe van Hinsbergen, a geologist at Utrecht University in the Netherlands, who led the project along with Utrecht geologists Douwe van der Meer and Wim Spakman. This “atlas of the underworld,” as they call it, holds the ghosts of past geography. By rewinding the clock and bringing these cataloged slabs back to the surface, scientists can figure out the sizes and locations of ancient oceans. Moreover, they can locate where the sinking slabs would have triggered melting, releasing blobs of magma that rose into the crust and drove volcanism. That has helped earth scientists pinpoint where ancient mountains rose and later eroded away, their traces visible only in unexplained rock records. “It’s a pretty exciting time to be able to pull all of these pieces together,” says Mathew Domeier, a tectonic modeler at the University of Oslo. That has only recently become possible, as the underlying technique, mantle tomography, is plagued with uncertainties. It relies on millions of seismic waves received by sensors scattered unevenly around the world. Waves with faster arrival times are assumed to have passed through the colder rock of subducted slabs. But seismometer coverage is patchy; earthquakes—the sources of the seismic waves—don’t occur everywhere; and the waves get fuzzier as they pass near the core or travel long distances. “Very often for regions that have the most interesting structures, you have the most uncertainty,” says Ved Lekic, a tomographer at the University of Maryland in College Park. Academic groups around the world use more than 20 models to interpret tomographic data, and their pictures of the mantle and its structures often conflict, says Grace Shephard, a postdoc at the University of Oslo. In the coming months, she will publish a comparison of 14 different models that will assess which slabs seem most likely to be real. 
Her results could cast doubt on some of the slabs in the Utrecht atlas. But the image of Earth’s interior is becoming more believable, thanks to improved computing power and such intercomparison projects. By now the picture of lost plates is precise enough for scientists to try rewinding the clock, reconstructing vanished worlds. In earlier tomography, the plunging slabs looked like blobs in a lava lamp. But as the models have improved, the slabs in the upper mantle have been revealed to be stiff, straight curtains, says John Suppe, who heads the Center for Tectonics and Tomography at the University of Houston. The images make it clear that as they plunge, the 500-kilometer-thick slabs flex but don’t crumple—and that has made it easier for Suppe and others to unwind them. “We’re finding these plates unfold fairly easily, and they’re not that deformed,” Suppe says. These slab-driven reconstructions are calling into question plate movements inferred from ancient oceanic crust that was scraped off and preserved on the continents, Suppe says. “Almost everywhere we’ve looked at this,” Suppe says, “what we find in the mantle isn’t exactly what would be predicted.” The reconstructions are also resurrecting mountains that had been lost to time. For example, in a study published several months ago, Wu and Suppe reconstructed the travels of 28 slabs to recreate the Philippine Sea as it was more than 50 million years ago. Beyond identifying what appears to be a previously unknown piece of ocean crust, they predicted that as one of their paleoplates plunged into the mantle, it threw up a large chain of volcanoes that eventually collided with Asia. That convulsive process could explain mysterious folded rocks in Japan and beneath the East China Sea. Similarly, slabs beneath North America have helped bring that continent’s history of mountain building into clearer focus. By rewinding the clock for some of them, Karin Sigloch, a geophysicist at the University of Oxford in the United Kingdom, showed that North America’s western mountain chains, including the Rockies, likely formed between 200 million and 50 million years ago when several small plates were subducted beneath the continent, plastering multiple volcanic archipelagos against the landmass. Van Hinsbergen and his Utrecht peers hope their comprehensive atlas of slabs will make it possible to reconstruct a fuller picture of ancient geography. In 2012, they used slab tomography to constrain the longitude of volcanic island arcs that 200 million years ago dotted the ocean surrounding the Pangea supercontinent. Two years later they used their global model to estimate the number of subduction zones that would have been active over the past 250 million years, along with the amount of carbon dioxide (CO2) that subduction-related volcanoes would have emitted. The estimate closely matched geologic proxy records for atmospheric CO2 over the same period. And earlier this year, Van Hinsbergen published a study with graduate student Lydian Boschman that identified several slabs that may have played a role in the birth of the Pacific Ocean. “We have done it,” Van Hinsbergen says. “If this was all nonsense, it is really quite a coincidence.” Even with these new techniques, which Suppe collectively calls “slab tectonics,” the mantle’s memory of ocean slabs only stretches back 250 million years—the time it takes for one to fall to the bottom of the mantle and be fully recycled. Beyond that, Earth continues to cover its tracks.
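The arithmetic behind that 250-million-year memory is simple enough to sketch. The toy Python model below assumes a constant sinking rate, which the article's figures imply only on average; real slabs stall and accelerate, so the function name, the constant-rate assumption, and the example depth are illustrative, not a published method.

```python
# Toy estimate of when a subducted slab left the surface, using the two
# figures from the article: slabs reach the core-mantle boundary
# (~2900 km down) in roughly 250 million years. Constant-rate descent is
# an assumption made purely for illustration.

CORE_MANTLE_BOUNDARY_KM = 2900.0
FULL_DESCENT_MYR = 250.0  # time to fall the full distance, per the article

def slab_age_myr(depth_km: float) -> float:
    """Rough age (millions of years since subduction) of a slab at depth_km."""
    if not 0.0 <= depth_km <= CORE_MANTLE_BOUNDARY_KM:
        raise ValueError("depth must be between 0 and 2900 km")
    return depth_km / CORE_MANTLE_BOUNDARY_KM * FULL_DESCENT_MYR

# Average sinking rate implied by these numbers: about 1.2 cm per year.
rate_cm_per_yr = CORE_MANTLE_BOUNDARY_KM * 1e5 / (FULL_DESCENT_MYR * 1e6)
print(f"average sinking rate: {rate_cm_per_yr:.1f} cm/yr")
print(f"slab imaged at 1450 km depth: ~{slab_age_myr(1450):.0f} Myr since subduction")
```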


News Article | November 8, 2016
Site: www.prweb.com

uBiome, the leading microbial genomics company, has appointed Dr. Elisabeth Bik – who joins the uBiome team in a full-time role from Stanford University School of Medicine – as its new Science Editor. Dr. Bik is regarded by her research peers as one of the world’s authorities on the science of the microbiome. At uBiome, Dr. Bik’s primary focus will be on leading the ongoing publication of scientific findings by the company. Since 2002, Dr. Bik has been a Research Associate at Stanford University School of Medicine, where she has specialized in the composition of the intestinal microbiota of healthy subjects and those with liver diseases, the microbiota of marine mammals, and isolating and detecting microbial DNA from clinical samples. Additionally, since 2014, she has been the editor of the highly respected online Microbiome Digest, a daily summary of scientific papers about microbiome and microbiology research, with a considerable readership in the scientific community. Dr. Bik will continue to edit Microbiome Digest as part of her new position with uBiome. Dr. Bik has authored and co-authored over 30 papers, including the influential "Diversity of the Human Intestinal Microbial Flora," which has been cited well over 4,000 times. This ground-breaking research, carried out in collaboration with Stanford’s infectious disease specialist, Dr. Paul B. Eckburg, was published in 2005 – three years before the Human Microbiome Project commenced its work. Described by the journal Nature as a “sharp-eyed microbiologist,” in April 2016, Dr. Bik worked closely with two editors-in-chief at microbiology journals to conduct an analysis of over 20,000 published microbiology, immunology, cancer research, and general science papers, specifically looking for images that were inappropriately duplicated or altered. Such images were found in 3.8% of the papers analyzed. The reputable website Retraction Watch subsequently described her as a “behind-the-scenes force in scientific integrity.” After receiving her PhD at Utrecht University in the Netherlands, Dr. Bik worked at the Dutch National Institute for Health and the St. Antonius Hospital in Nieuwegein. In 2002, she joined the Department of Microbiology and Immunology at Stanford University School of Medicine, where, in May 2016, she was awarded Stanford’s prestigious “Microbiome Pioneer” award for her ongoing contributions to science in editing and publishing Microbiome Digest. Dr. Bik brings her considerable experience and expertise to uBiome, a pioneer of applying next generation high-throughput DNA sequencing technology to deliver highly detailed analyses of the human microbiome, the ecosystem of trillions of bacteria that populate the human body. Bacteria in the gut play critical roles in good health, such as supporting digestion and the synthesis of vitamins. However, pathogenic bacteria are associated with a range of conditions – some of them serious – such as celiac disease and inflammatory bowel diseases (including both Crohn’s disease and ulcerative colitis), irritable bowel syndrome, esophageal reflux and esophageal cancer, Clostridium difficile infection, colorectal cancer, and many others. Dr. Elisabeth Bik, new uBiome Science Editor, says: “As someone who has worked in and around the area for over twenty years, it’s rewarding that there has been such a huge increase in interest in the microbiome recently, but, of course, this has been driven by remarkable research from my peers and by uBiome.
Publishing is, of course, a cornerstone of science, which is why I’m happy that uBiome is placing such great emphasis on ensuring that its work goes through the rigorous time-tested peer-review process, and I’m delighted that I’ll be enabling them to do so as the new Science Editor.” Dr. Jessica Richman, co-founder and CEO of uBiome, says: “We’ve long been inspired by Dr. Bik’s work to disseminate information about our field through the Microbiome Digest. We’re delighted to welcome her to our team and thrilled to work with such a respected expert.” uBiome was founded in 2012 by researchers educated at Stanford, Oxford, and UCSF. The company is funded by Andreessen Horowitz, Y Combinator, and other leading investors. uBiome’s mission is to explore important research questions about the microbiome and to develop accurate and reliable clinical tests based on the microbiome.


TODAY (25 November 2016), DURING THE CELEBRATIONS TO MARK THE ANNIVERSARY OF THE UNIVERSITY OF TWENTE’S FOUNDATION DAY (KNOWN AS DIES NATALIS), THE PROFESSOR DE WINTER AWARD WAS PRESENTED TO DR JEANNETTE HOFMEIJER. Enschede, Netherlands, 28-Nov-2016 — /EuropaWire/ — This publication award for especially talented women academics is an acknowledgement of outstanding academic research and is intended to boost the recipient’s academic career. The award is being presented for the tenth time this year. Hofmeijer is receiving the award for her article Early EEG contributes to multimodal outcome prediction of postanoxic coma, published in the leading scientific journal Neurology. In her research, she shows that EEG monitoring can radically improve predictions of the outcome of a coma caused by a lack of oxygen in the brain. Using current methods, it is only possible to make a correct estimate quickly and reliably in 10% of patients. If a new method is used – involving continuous EEG monitoring and observing the speed of recovery in brain activity – it is possible in around 50% of cases. Hofmeijer’s study therefore demonstrated that recovery over time is a better indicator of the seriousness of brain damage than one brief measurement at a single time, which is currently standard practice. The Professor De Winter Award judging panel described it as an outstandingly written article, published in a leading journal. The panel also pointed out that Hofmeijer, who combines her work as a neurologist at Rijnstate Hospital in Arnhem with research at the University of Twente, is an ideal connecting link between medical practice and the academic world. “Not only does her research translate fundamental research into medical practice, it also takes a practical problem from clinical practice as its point of departure. This matches perfectly with the mission of the UT’s MIRA research institute, which sees outstanding research and technology as important catalysts for improving healthcare.” After studying Medicine and Philosophy, Hofmeijer specialized as a neurologist and intensive care doctor. She was awarded her doctorate at Utrecht University in 2007. Since 2008, she has worked at Rijnstate Hospital in Arnhem and in the Clinical Neurophysiology department at the University of Twente. Hofmeijer has published more than 75 academic articles. The Professor De Winter Award, named in honour of the professor who died in 2005, is an international publication award for leading women academics. The award, which consists of €2,500 in cash and a certificate, is funded by the Professor De Winter Award fund, a named fund set up in the Twente University Fund. It was partly made possible by a donation from the professor’s widow, who herself died in 2013. After her death, UT alumnus Henk Hoving and his partner Thijs van Reijn decided to continue the annual donation to the University Fund. In addition to the Professor De Winter Award, the Professor De Winter Fund also finances the Professor De Winter Scholarship every year. It is intended for outstanding women students from abroad who study on a Master’s degree programme at the University of Twente. The scholarship, worth €7,500 per year over a two-year period, was this year awarded to Karen Abeniacar from Italy. She completed her Bachelor’s degree programme in Industrial Engineering at Sabancı University in Turkey. This academic year, she began the Master’s programme in Industrial Engineering and Management at Twente.
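To make the logic of the continuous-monitoring approach described above concrete, here is a minimal Python sketch in which outcome is judged from how EEG activity evolves between two recordings rather than from a single snapshot. The continuity metric, thresholds, and signal amplitudes are invented for illustration; the actual study used validated quantitative-EEG measures, not this rule.

```python
import numpy as np

def eeg_continuity(signal, amplitude_floor_uv=10.0):
    """Fraction of samples with activity above a (hypothetical) amplitude floor."""
    return float(np.mean(np.abs(signal) > amplitude_floor_uv))

def predict_outcome(eeg_early, eeg_late):
    """Toy rule: classify by the trajectory of EEG continuity over time."""
    early, late = eeg_continuity(eeg_early), eeg_continuity(eeg_late)
    if late > 0.6 and late >= early:
        return "likely good outcome"   # activity returned and kept improving
    if late < 0.2:
        return "likely poor outcome"   # trace still largely suppressed
    return "indeterminate"

rng = np.random.default_rng(0)
suppressed = rng.normal(0.0, 2.0, 5000)   # low-amplitude, mostly flat trace (microvolts)
recovering = rng.normal(0.0, 25.0, 5000)  # continuous activity has returned
print(predict_outcome(suppressed, recovering))  # "likely good outcome"
```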


News Article | November 30, 2016
Site: www.prnewswire.co.uk

Risk assessment tool from UK company proven to be one of world's most accurate predictors of dementia and other diseases NOTTINGHAM, United Kingdom, Nov. 30, 2016 /PRNewswire/ -- ROADTOHEALTH, the Digital Health Global 100 company behind award-winning health application Quealth, announced the successful clinical validation of its dementia[1] risk assessment, making it a world leader in the accurate prediction of the disease. "Quealth's risk algorithm has been optimised and validated to an excellent level of predictive accuracy for dementia," said Paul Nash, Quealth's Head of Clinical Governance. "It has a 72% chance of distinguishing individuals at risk of developing dementia from those who are not at risk. This is comparable with leading international risk scores including CAIDE[2], DSDRS[3], BDSI[4] and ANU-ADRI[5]." Quealth is a free health app that allows anyone to assess their risk of developing the five most common lifestyle-driven diseases: dementia, cardiovascular disease (CVD), type 2 diabetes, six forms of cancer, and chronic obstructive pulmonary (lung) disease (COPD). Quealth is governed by an ongoing programme of formal clinical validation led by Nash in collaboration with Quealth's clinical advisor, Dr Stephen Weng, an Applied Epidemiologist at the University of Nottingham's School of Medicine. The programme ensures the predictive accuracy of its disease risk algorithms is both optimised and academically validated. Predictive accuracy is normally assessed using the 'AUC (c-statistic)', a measure of how accurately an algorithm predicts the disease. Quealth's disease risk algorithms achieve AUC values of between 0.72 and 0.80. This high level of predictive accuracy is directly comparable to – and in many cases higher than – other internationally recognised and respected disease risk prediction algorithms. "Quealth is a consumer-friendly application developed using rigorous scientific methods and has proven extremely accurate when assessing an individual's future risk of disease across multiple health conditions, including diabetes, cardiovascular disease, dementia and COPD," said Dr Weng. Quealth is available for download via Apple's App Store, the Google Play Store, Samsung's Galaxy Apps Store and Lenovo's App Explorer.
[1] Dementia has now overtaken heart disease as the leading cause of death in England and Wales, accounting for 11.6% of all deaths registered in 2015. https://www.ons.gov.uk/peoplepopulationandcommunity/birthsdeathsandmarriages/deaths/bulletins/deathsregisteredinenglandandwalesseriesdr/2015
[2] CAIDE (Cardiovascular Risk Factors, Aging and Dementia) is a Scandinavian initiative and a joint collaboration between the Department of Neurology, University of Eastern Finland (Kuopio Campus), the National Institute of Health and Welfare (Helsinki) and the Aging Research Center, Karolinska Institutet (Stockholm). http://www.uef.fi/en/web/caide/
[3] DSDRS (Diabetes-Specific Dementia Risk Score) was funded by the United States National Institutes of Health, Kaiser Permanente Community Benefits, Utrecht University, ZonMw, the Netherlands Organisation for Health Research and Development and a Fulbright fellowship. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4429783/
[4] BDSI (Behavioural Dysexecutive Syndrome Inventory) is an ongoing French study by the Centre Hospitalier Universitaire (Amiens). https://clinicaltrials.gov/ct2/show/NCT02819700
[5] ANU-ADRI (Australian National University's Alzheimer's Disease Risk Index) is a tool developed by the Australian National University (Canberra). http://anuadri.anu.edu.au

Quealth (www.quealth.co) is the most accurate health risk assessment app available. It assesses your health, coaches you to make improvements and rewards your successes. Quealth combines the latest behavioural change science with the global reach of technology to encourage you to make lifestyle changes and live a longer, healthier life. Based on the award-winning and highly validated Quealth™ Score – a universally recognised and trusted predictor of health risk – Quealth focusses on the prevention of five leading non-communicable diseases: diabetes, six forms of cancer, cardiovascular disease, dementia and chronic obstructive pulmonary disease (COPD). Quealth is owned by the roadtohealth group, an internationally-recognised health risk assessment and lifestyle management company. Formed in 2002, roadtohealth has successfully developed an international footprint by partnering with global brands including Aviva, Samsung, Patient.info and HSBC. All products or brand names mentioned are trademarks or registered trademarks of their respective holders.
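For readers unfamiliar with the AUC (c-statistic) cited above: it can be read as the probability that a randomly chosen person who goes on to develop the disease receives a higher risk score than a randomly chosen person who does not. The Python sketch below demonstrates this pairwise-ranking interpretation on simulated scores; the score distributions are invented and have no connection to Quealth's actual algorithm.

```python
import numpy as np

# AUC as a ranking probability: compare every case's risk score against
# every control's score and count the fraction of correctly ordered pairs
# (ties count half). Scores here are simulated for illustration only.

rng = np.random.default_rng(42)
risk_cases = rng.normal(0.62, 0.15, 1000)     # people who develop the disease
risk_controls = rng.normal(0.50, 0.15, 1000)  # people who do not

diff = risk_cases[:, None] - risk_controls[None, :]
auc = np.mean(diff > 0) + 0.5 * np.mean(diff == 0)
print(f"AUC ~= {auc:.2f}")  # ~0.71: ~71% of case/control pairs ranked correctly
```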


News Article | November 17, 2016
Site: www.scientificamerican.com

SAN DIEGO—A wireless device that decodes brain waves has enabled a woman paralyzed by locked-in syndrome to communicate from the comfort of her home, researchers announced this week at the annual meeting of the Society for Neuroscience. The 59-year-old patient, who prefers to remain anonymous but goes by the initials HB, is “trapped” inside her own body, with full mental acuity but completely paralyzed by a disease that struck in 2008 and attacked the neurons that make her muscles move. Unable to breathe on her own, she depends on a tube in her neck that pumps air into her lungs, and she requires round-the-clock assistance from caretakers. Thanks to the latest advance in brain–computer interfaces, however, HB has at least regained some ability to communicate. The new wireless device enables her to select letters on a computer screen using her mind alone, spelling out words at a rate of one letter every 56 seconds, to share her thoughts. “This is a significant achievement. Other attempts on such an advanced case have failed,” says neuroscientist Andrew Schwartz of the University of Pittsburgh, who was not involved in the study, published in The New England Journal of Medicine. HB’s mind is intact and the part of her brain that controls her bodily movements operates perfectly, but the signals from her brain no longer reach her muscles because the motor neurons that relay them have been damaged by amyotrophic lateral sclerosis (ALS), says neuroscientist Erick Aarnoutse, who designed the new device and was responsible for the technical aspects of the research. He is part of a team of physicians and scientists led by neuroscientist Nick Ramsey at Utrecht University in the Netherlands. Previously, the only way HB could communicate was via a system that uses an infrared camera to track her eye movements. But the device is awkward to set up and use for someone who cannot move, and it does not function well in many situations, such as in bright sunlight. Devices that couple neural activity to computers have been used experimentally before to help patients with a range of neurological disorders, including locked-in syndrome. In pioneering work in 1998 neurologist Phillip Kennedy, of Neural Signals, Inc., implanted an array of electrodes into the brain of a patient who was paralyzed by a stroke to control signals in an on-off manner, and in 2015 a team of researchers led by neuroscientist Leigh Hochberg of Brown University implanted a 96-channel electrode array into the cerebral cortex of a 58-year-old woman with locked-in syndrome. Those brain implants helped patients communicate by enabling them to select words displayed on a computer screen, and similar tech has helped patients accomplish other tasks as well. Schwartz’s team at Pitt recently demonstrated that a paralyzed man could use a robotic arm controlled by electrodes implanted in the man’s cerebral cortex to shake hands with Pres. Barack Obama. But surgically implanting electrodes into the brain carries inherent risks. “Anytime there is a wire penetrating the skin there is risk of infection,” Schwartz says, and previous attempts at brain–computer interfaces could only be performed in the laboratory because of the bulky instrumentation required. “This new study is an advance because the implant uses wireless communication with the computer.
It is really important to bring this capability home, and they have done that.” For HB, Aarnoutse and his collaborators created a simple, minimally invasive implant that she can use at home or outside to communicate using a familiar computer notebook. To achieve this, doctors lifted a small flap of her scalp in surgery and drilled two finger-size holes through her skull. Then they slipped a thin plastic strip, which looks something like cellophane tape with four tiny dots on it, through the holes to rest on the surface of the brain. The four spots are miniature electrodes that do not penetrate brain tissue, but because they are beneath the skull, they make good electrical contact with the brain to record brain waves with high fidelity. The surgeons then threaded tiny wires from the electrode under the skin to a small electronic control device that was implanted in HB’s chest. The device, made by the biotech company Medtronic, communicates wirelessly by a radio transmitter to an ordinary tablet computer. (Medtronic provided partial support for the research, and one of the study’s authors is employed at the company, although the study states he was not involved in interpreting the results.) Surgeons placed the electrodes over the part of the brain’s motor cortex that becomes activated when HB imagines closing her fingers. Analyzing the brain wave patterns, the researchers observed a simple but reliable pattern. Every time she imagined pinching her fingers, the power of certain frequencies of brain waves abruptly changed, as low-frequency “beta” brainwaves abruptly ceased and higher-frequency “gamma” brain waves whipped up. By measuring the ratio of gamma- to beta-wave power in ongoing brain waves sweeping through HB’s motor cortex, the computer could detect when she was imagining closing her fingers. In this way HB quickly learned to operate a cursor in a video game, mastering that task only two days after the surgery. Next the scientists presented her with an alphabet arrayed in rows and columns on a computer tablet. As the display swept over individual letters in sequence, the woman imagined selecting the appropriate letter as if she were clicking a mouse. The technology is not without controversy, however. Some experts believe that only noninvasive methods should be used to help people with locked-in syndrome communicate, for example by recording brain waves from scalp electrodes. “Implantations like the one reported here may carry an unknown risk for advanced ALS patients,” says Niels Birbaumer, an expert in brain–computer interfaces at Tübingen University in Germany who was not involved in the study. Recording brain waves through the skull, however, currently lacks the sensitivity needed to tap into the neural circuits that control fine voluntary movements. Moreover, that approach is not a practical solution, Aarnoutse says, because it takes a team of highly specialized technicians to attach the electrodes to the electroencephalography (EEG) cap and operate an EEG recording station. This is beyond the capabilities of most caretakers that assist people living with locked-in syndrome. Furthermore, the EEG cap, which looks like a swimming cap with dozens of wires sprouting from it, is something patients would likely never use in their everyday lives. “It would inhibit their interaction with others, and they would never use it outside their home,” Aarnoutse says. Still, some experts say the wireless new device may not justify the risks. 
“One to two letters per minute is not justifiable [for doing a craniotomy] unless they can improve it,” Kennedy says. Indeed, when HB was first learning to use the device, she told Aarnoutse, “Trying to communicate like this is like tacking a sailboat.” But many patients with locked-in syndrome choose not to use ventilators to breathe when their disease reaches an advanced stage because they cannot communicate and they feel they are a burden on their loved ones, according to Schwartz. Studies suggest that locked-in people can lead meaningful and productive lives if they can communicate in some way. “We need to do anything we can to help these people,” he says. “We are talking about life and death.” Now, more than a year after the device was implanted, HB lives at home with her husband and one of her children, and she has gotten much faster at typing out her thoughts. Also, the device works outdoors in the sunshine where her eye tracker fails. “She’s happy,” Aarnoutse says. “The ability to communicate has given her more freedom and made her more independent.”
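The detection principle described above, gamma-band power rising as beta-band power falls when HB imagines closing her fingers, reduces to a band-power ratio. The Python sketch below illustrates that idea on synthetic signals; the sampling rate, band edges, and threshold are assumptions for illustration, not the parameters of the Utrecht team's device.

```python
import numpy as np

FS = 512  # sampling rate in Hz (assumed)

def band_power(x, lo, hi):
    """Power in the [lo, hi) Hz band, computed from the signal's FFT."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo) & (freqs < hi)
    return float(spectrum[band].sum())

def brain_click(x, threshold=1.0):
    """Flag a 'brain click' when gamma (65-95 Hz) power exceeds beta (15-30 Hz) power."""
    return band_power(x, 65, 95) / band_power(x, 15, 30) > threshold

t = np.arange(FS) / FS  # one second of synthetic cortical signal
rest = np.sin(2 * np.pi * 20 * t) + 0.1 * np.sin(2 * np.pi * 80 * t)
attempt = 0.1 * np.sin(2 * np.pi * 20 * t) + np.sin(2 * np.pi * 80 * t)
print(brain_click(rest), brain_click(attempt))  # False True
```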


News Article | October 23, 2015
Site: news.mit.edu

Twelve new faculty members have been invited to join the ranks of the School of Engineering at MIT. Drawn from institutions and industry around the world, and ranging from distinguished senior researchers to promising young investigators, they will contribute to the research and educational activities of six academic departments in the school and a range of other labs and centers across the Institute. “This year we are welcoming another exceptionally strong group of new faculty to engineering,” says Ian A. Waitz, Dean of the School of Engineering. “They are remarkably accomplished, and their research spans some of the most important and pressing challenges in the world. I can’t wait to see what they do.” The new School of Engineering faculty members are: Mohammad Alizadeh will join the faculty as an assistant professor in the Department of Electrical Engineering and Computer Science in September 2015. He was a principal engineer at Cisco, which he joined through the acquisition of Insieme Networks in 2013. Alizadeh completed his undergraduate degree in electrical engineering at Sharif University of Technology and received his PhD in electrical engineering in 2013 from Stanford University, where he was advised by Balaji Prabhakar. His research interests are broadly in the areas of networked systems, data-center networking, and cloud computing. His dissertation focused on designing high-performance packet-transport mechanisms for data centers. His research has garnered significant industry interest: The Data Center TCP congestion control algorithm has been integrated into the Windows Server 2012 operating system; the QCN algorithm has been standardized as the IEEE 802.1Qau standard; and most recently, the CONGA adaptive load-balancing mechanism has been implemented in Cisco’s new flagship Application Centric Infrastructure products. Alizadeh is a recipient of a SIGCOMM best-paper award, a Stanford Electrical Engineering Departmental Fellowship, the Caroline and Fabian Pease Stanford Graduate Fellowship, and the Numerical Technologies Inc. Prize and Fellowship. Tamara Broderick will start as an assistant professor in electrical engineering and computer science in January 2015. She received a BA in mathematics from Princeton in 2007, a master of advanced study for completion of Part III of the Mathematical Tripos from the University of Cambridge in 2008, an MPhil in physics from the University of Cambridge in 2009, and an MS in computer science and a PhD in statistics from the University of California at Berkeley in 2013 and 2014, respectively. Her recent research has focused on developing and analyzing models for scalable, unsupervised learning using Bayesian nonparametrics. She has been awarded the Evelyn Fix Memorial Medal and Citation (for the PhD student on the Berkeley campus showing the greatest promise in statistical research), the Berkeley Fellowship, a National Science Foundation Graduate Research Fellowship, and a Marshall Scholarship. Michael Carbin will join the Department of Electrical Engineering and Computer Science as an assistant professor in January 2016. His research interests include the theory, design, and implementation of programming systems, including languages, program logics, static and dynamic program analyses, run-time systems, and mechanized verifiers. His recent research has focused on the design and implementation of programming systems that deliver improved performance and resilience by incorporating approximate computing and self-healing. 
Carbin’s research on verifying the reliability of programs that execute on unreliable hardware received a best-paper award at a leading programming languages conference (OOPSLA 2013). His undergraduate research at Stanford received the Wegbreit Prize for Best Computer Science Undergraduate Honors Thesis. As a graduate student at MIT, he received the MIT-Lemelson Presidential and Microsoft Research Graduate Fellowships. James Collins joined the faculty in the Department of Biological Engineering and as a core member of the Institute for Medical Engineering and Science. Collins received a PhD in mechanical engineering from the University of Oxford and was formerly the William F. Warren Distinguished Professor, university professor, professor of biomedical engineering, and director of the Center of Synthetic Biology at Boston University. He is a world leader in bringing together engineering principles and fundamental biology to make new discoveries and invent systems that can improve the human condition. Collins is among the founders of the field of synthetic biology. Otto X. Cordero will join the Department of Civil and Environmental Engineering as an assistant professor. He received a BS in computer and electrical engineering from the Polytechnic University of Ecuador, and an MS in artificial intelligence and PhD in theoretical biology from Utrecht University. For his dissertation, Cordero worked with Paulien Hogeweg on the scaling laws that govern the evolution of genome size in microbes. While a Netherlands Organization for Scientific Research Postdoctoral Fellow working with Martin Polz, he pursued a study of ecological and social interactions in wild populations of bacteria, and demonstrated the importance of these interactions in generating patterns of diversity and sustaining ecological function. In 2013 Cordero was awarded the European Research Council Starting Grant, the most prestigious career award in Europe, to reconstruct and model networks of ecological interactions that form between heterotrophic microbes in the ocean. Since November 2013, he has been an assistant professor at the Swiss Federal Institute of Technology in Zurich. The main goal of Cordero’s lab is to develop the study of natural microbial communities as dynamical systems, using a combination of experimental and computational approaches. Areg Danagoulian joined the faculty in the Department of Nuclear Science and Engineering (NSE) as an assistant professor in July 2014. He received a BS in physics from MIT and a PhD in experimental nuclear physics from the University of Illinois at Urbana-Champaign. He was a postdoctoral associate at the Los Alamos National Laboratory and subsequently worked as a senior scientist at Passport Systems Inc. Danagoulian’s research interests are focused in nuclear security. He works on problems in the areas of nuclear nonproliferation, technologies for arms-control treaty verification, nuclear safeguards, and nuclear-cargo security. Specific projects include the development of zero-knowledge detection concepts for weapon authentication, and research on monochromatic, tunable sources that can be applied to active interrogation of cargoes. Other areas of research include nuclear forensics and the development of new detection concepts. Danagoulian’s research and teaching will contribute to NSE’s growing program in nuclear security. Ruonan Han joined the electrical engineering and computer science faculty in September as an assistant professor. 
He is also a core member of the Microsystems Technology Laboratories. He earned his BS from Fudan University in 2007, an MS in electrical engineering from the University of Florida in 2009, and his PhD in electrical and computer engineering from Cornell University in 2014. Han’s research group aims to explore microelectronic-circuit and system technologies to bridge the terahertz gap between microwave and infrared domains. They focus on high-power generation, sensitive detection and energy-efficient systems. Han is the recipient of the Electrical and Computer Engineering Director’s Best Thesis Research Award and Innovation Award from Cornell, the Solid-State Circuits Society Pre-Doctoral Achievement Award and Microwave Theory and Techniques Society Graduate Fellowship Award from IEEE, as well as the Best Student Paper Award from the IEEE Radio-Frequency Integrated Circuits Symposium. Juejun (JJ) Hu joined the faculty in the Department of Materials Science and Engineering in January 2015 as an assistant professor and as the Merton C. Flemings Career Development Professor of Materials Science and Engineering. He comes to MIT from the University of Delaware, where he was a tenure-track assistant professor. Previously, he was a postdoc in MIT’s Microphotonics Center. As the Francis Alison Young Professor, Hu initiated and led research projects involving environmental monitoring, renewable energy, biological sensing, and optical communications. He received the 2013 Gerard J. Mangone Young Scholars Award, which recognizes promising and accomplished young faculty and is the University of Delaware’s highest faculty honor. His research is in three main areas: substrate-blind multifunctional photonic integration, mid-infrared integrated photonics, and 3-D photonic integrated circuits. Hu’s group has applied photonic technologies to address emerging application needs in environmental monitoring, renewable energy harvesting, communications, and biotechnology. He earned a BS in materials science and engineering from Tsinghua University, and a PhD from MIT. Rafael Jaramillo will join the materials science and engineering faculty as an assistant professor and the Toyota Career Development Professor in Materials Science and Engineering in the summer of 2015. He has a BS summa cum laude and an MEng, both in applied and engineering physics, from Cornell University. He also holds a PhD in physics from the University of Chicago. Jaramillo is currently a senior postdoctoral fellow at MIT in the Laboratory of Manufacturing and Productivity (LMP). His interests in renewable energy and accomplishments in developing materials systems and techniques for energy applications led to him receiving the Energy Efficiency and Renewable Energy Postdoctoral Research Fellowship from the U.S. Department of Energy. Prior to his appointment in LMP, Jaramillo was a postdoctoral fellow at the Harvard University Center for the Environment. His research interests lie at the intersection of solid-state physics, materials science, and renewable energy technologies. Stefanie Jegelka joined the electrical engineering and computer science faculty in January 2015.
Formerly a postdoctoral researcher in the Department of Electrical Engineering and Computer Science at the University of California at Berkeley, she received a PhD in computer science from the Swiss Federal Institute of Technology in Zurich (in collaboration with the Max Planck Institute for Intelligent Systems in Tuebingen, Germany), and a diploma in bioinformatics with distinction from the University of Tuebingen in Germany. During her studies, she was also a research assistant at the Max Planck Institute for Biological Cybernetics and spent a year at the University of Texas at Austin. She conducted research visits to Georgetown University, the University of Washington, the University of Tokyo, the French Institute for Research in Computer Science and Automation, and Microsoft Research. She has been a fellow of the German National Academic Foundation and its College for Life Sciences, and has received a Google Anita Borg Fellowship, a Fellowship of the Klee Foundation, and a Best Paper Award at the International Conference on Machine Learning. Jegelka organized several workshops on discrete optimization in machine learning, and has held three tutorials on submodularity in machine learning at international conferences. Her research interests lie in algorithmic machine learning. In particular, she is interested in modeling and efficiently solving machine-learning problems that involve discrete structure. She has also worked on distributed machine learning, kernel methods, clustering, and applications in computer vision. Aleksander Madry is a former assistant professor in the Swiss Federal Institute of Technology in Lausanne (EPFL) School of Computer and Communication Sciences and started as an assistant professor in electrical engineering and computer science in February 2015. His research centers on tackling fundamental algorithmic problems that are motivated by real-world optimization. Most of his work is concerned with developing new ideas and tools for algorithmic graph theory, with a particular focus on approaching central questions in that area with a mix of combinatorial and linear-algebraic techniques. He is also interested in understanding uncertainty in the context of optimization — how to model it and cope with its presence. Madry received his PhD in computer science from MIT in 2011 and, prior to joining EPFL, spent a year as a postdoctoral researcher at Microsoft Research New England. His work was recognized with a variety of awards, including the Association for Computing Machinery Doctoral Dissertation Award Honorable Mention, the George M. Sprowls Doctoral Dissertation Award, and a number of best paper awards at Foundations of Computer Science, Symposium on Discrete Algorithms, and Symposium on Theory of Computing meetings. Xuanhe Zhao joined the Department of Mechanical Engineering faculty in September 2014 as an assistant professor. Before joining MIT, he was an assistant professor in the Department of Mechanical Engineering and Materials Science at Duke University. He earned his PhD at Harvard University in 2009. Zhao conducts research on the interfaces between solid mechanics, soft materials, and bio-inspired design. His current research goal is to understand and design new soft materials with unprecedented properties for impactful applications. 
His current research projects are centered on three bio-inspired themes: artificial muscle (dielectric polymers and electromechanics), tough cartilage (tough and bioactive hydrogels and biomechanics), and transformative skin (functional surface instabilities and thin-film mechanics). Zhao’s discovery of new failure mechanisms of dielectric polymers in 2011 and 2012 can potentially enhance electric energy densities of dielectric elastomers and gels by a factor of 10. In 2012, he designed a new synthetic biocompatible hydrogel with hybrid crosslinking, which achieved fracture toughness multiple times higher than articular cartilage — unprecedented by previous synthetic gels. With fiber reinforcements, Zhao further controlled the modulus of the tough hydrogel over a wide range from a few kilopascals to over 10 megapascals in 2013 and 2014. By harnessing surface instabilities such as wrinkles and creases in 2014, he dynamically varied both surface textures and colors of an electro-mechano-chemically responsive elastomer to achieve the dynamic-camouflage function of cephalopods. This work was highlighted by Nature News, reported by The Washington Post, and featured on the MIT homepage: “How to hide like an octopus.” Zhao is a recipient of the National Science Foundation CAREER Award, Office of Naval Research Young Investigator Program Award, and the Early Career Researchers Award from AVS Biomaterial Interfaces Division.


What makes this finding even more significant is that it has important implications for understanding and treating psychiatric disorders in which detecting and responding to the emotions of others can be disrupted, including autism spectrum disorder (ASD) and schizophrenia. In the study, which is published in this week's issue of Science, co-authors Larry Young, PhD, and James Burkett, PhD, demonstrated that oxytocin—a brain chemical well-known for maternal nurturing and social bonding—acts in a specific brain region of prairie voles, the same as in humans, to promote consoling behavior. Prairie voles are small rodents known for forming lifelong, monogamous bonds and providing bi-parental care of their young. Consolation is defined as calming contact directed at a distressed individual; for example, primates calm others with a kiss and embrace, whereas voles groom others. The prairie voles' consoling behavior was strongest toward familiar voles, and was not observed in the closely related, but asocial, meadow vole. Study co-author Frans de Waal, PhD, was the first to discover animal consolation behavior in 1979 by observing how chimpanzees provide contact comfort to victims of aggression. According to de Waal, the present vole study has significant implications by confirming the empathic nature of the consolation response. "Scientists have been reluctant to attribute empathy to animals, often assuming selfish motives. These explanations have never worked well for consolation behavior, however, which is why this study is so important," says de Waal. Young is division chief of Behavioral Neuroscience and Psychiatric Disorders at Yerkes, director of the Silvio O. Conte Center for Oxytocin and Social Cognition at Emory, and professor in the Emory University School of Medicine Department of Psychiatry and Behavioral Sciences. His previous research on the neural mechanisms controlling pair bonding in prairie voles has provided insights that may be relevant to the treatment of ASD. De Waal is director of the Living Links Center at Yerkes National Primate Research Center, Professor of Primate Behavior in the Emory University Psychology Department, and University Professor at Utrecht University, The Netherlands. De Waal has published several books on primate social behavior and animal empathy, including The Age of Empathy. Burkett recently completed his doctoral studies in Emory University's Neuroscience PhD program. In addition to showing consolation for the first time outside of large-brained animals, the researchers explicitly tied consolation to maternal nurturing mechanisms in the brain, which suggests empathy, not complex cognition, is key. Observing another animal in distress caused activation in the anterior cingulate cortex, a brain region that is also activated when humans see another person in pain. Prairie voles responded by increasing their pro-social contact, which clearly reduced the other's anxiety. When the study authors blocked oxytocin signaling specifically in the anterior cingulate cortex of prairie voles, the animals no longer consoled others in distress. According to Young and Burkett, research suggests oxytocin may improve social engagement in ASD. Their research findings create an opportunity to explore the neural mechanisms of this previously unrecognized consolation behavior in laboratory animals, placing greater emphasis on research into the brain systems underlying empathy.
This research underscores the increasing potential oxytocin has for understanding and treating ASD, schizophrenia, and other psychiatric disorders in which detecting and responding to the emotions of others can be disrupted. "Many complex human traits have their roots in fundamental brain processes that are shared among many other species," says Young. "We now have the opportunity to explore in detail the neural mechanisms underlying empathetic responses in a laboratory rodent with clear implications for humans," Young continues. The authors suggest that consoling behavior evolved in the context of prairie voles' monogamous social structure by tweaking brain systems involved in maternal nurturing, which are present in all mammals. Research reported in this release was supported by the U.S.-based National Institute of Mental Health under award numbers P50MH100023 and R01MH096983 and the National Institutes of Health's Office of the Director, Office of Research Infrastructure Programs, P51OD011132. For eight decades, the Yerkes National Primate Research Center, Emory University, has been dedicated to conducting essential basic science and translational research to advance scientific understanding and to improve the health and well-being of humans and nonhuman primates. Today, the center, as one of only eight National Institutes of Health-funded national primate research centers, provides leadership, training and resources to foster scientific creativity, collaboration and discoveries. Yerkes-based research is grounded in scientific integrity, expert knowledge, respect for colleagues, an open exchange of ideas and compassionate quality animal care. Within the fields of microbiology and immunology, neurologic diseases, neuropharmacology, behavioral, cognitive and developmental neuroscience, and psychiatric disorders, the center's research programs are seeking ways to: develop vaccines for infectious and noninfectious diseases; understand the basic neurobiology and genetics of social behavior and develop new treatment strategies for improving social functioning in ASD and schizophrenia; interpret brain activity through imaging; increase understanding of progressive illnesses such as Alzheimer's and Parkinson's diseases; unlock the secrets of memory; treat drug addiction; determine how the interaction between genetics and society shape who we are; and advance knowledge about the evolutionary links between biology and behavior. The CTSN (Center for Translational Social Neuroscience) mission is to bring together basic and clinical scientists in order to facilitate the translation of our understanding of the social brain into novel treatments for social deficits in psychiatric disorders, including ASD.


News Article | November 2, 2016
Site: www.eurekalert.org

(Boston) - Claudia Satizabal, PhD, instructor of neurology at Boston University School of Medicine (BUSM), was recently awarded a 2016 research grant to promote diversity from the Alzheimer's Association. The two-year, $118,673 award will be used to further study the impact of obesity on brain aging and Alzheimer's disease (AD).

Satizabal is also affiliated with the Framingham Heart Study (FHS) under the mentorship of Dr. Sudha Seshadri. She is currently investigating how midlife obesity, along with dietary, inflammatory and neurotrophic biomarkers, is associated with stroke, cognitive function, MRI markers of abnormal brain aging, and dementia. "Populations worldwide are facing an obesity epidemic, and these same populations are aging and will contribute to the growing prevalence of dementia and AD. Therefore, it becomes imperative to understand the mechanisms by which obesity increases the risk of dementia and AD, which may help develop health policies and treatment strategies to diminish the consequences of obesity in late life," explained Satizabal.

In addition to her work at BUSM and the FHS, Satizabal is actively involved in the neurology and cognitive working groups of the Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE) consortium, where she leads projects investigating genetic variation in fine motor speed, visual memory, and subcortical brain structures. She participates in several other international collaborations, including the AD Cohorts Consortium and the replication phase of the AD sequencing project. Prior to joining the FHS in 2013, Satizabal earned her doctorate studying the relationship between inflammatory proteins and cerebrovascular and neurodegenerative MRI markers of abnormal brain aging in the French Three-City Study, at the INSERM neuroepidemiology laboratory of Pierre and Marie Curie University in Paris. In 2007, she was awarded a two-year Utrecht Excellence Scholarship to pursue a master's degree in epidemiology at Utrecht University. She received her undergraduate degree in microbiology from Los Andes University in her home country of Colombia in 2005.

The Alzheimer's Association is the largest nonprofit funder of Alzheimer's research in the world, having awarded more than $375 million to fund over 2,400 scientific investigations. Fostering a robust workforce of Alzheimer's researchers is a major goal of the Alzheimer's Association Research Grant and Fellowship Awards. The program funds early-career scientists working on new ideas in Alzheimer's research that will lead to future grant applications to government and other funding sources, including the Alzheimer's Association. It includes supporting researchers from underrepresented racial and ethnic groups in the field.


The team combined their data to produce a video that shows the chemistry of this aging process and takes the viewer on a virtual flight through the pores of a catalyst particle. The results were published today in Nature Communications.

The particles, known as fluid catalytic cracking or FCC particles, are used in oil refineries to "crack" large molecules that are left after distillation of crude oil into smaller molecules, such as gasoline. Those oil molecules flow through the catalyst particles in tiny pores and passageways, which ensure accessibility to the active domains where chemical reactions can take place. But while the catalyst material is not consumed in the reaction and in theory could be recycled indefinitely, the pores clog up and the particles slowly lose effectiveness. Worldwide, about 400 reactor systems refine oil into gasoline, accounting for about 40 to 50 percent of today's gasoline production, and each system requires 10 to 40 tons of fresh FCC catalysts daily. Finding new clues about how FCC catalysts age could be key to improving gasoline production.

But the new technique also has potential for understanding the workings of materials for powering cars of the future, according to Yijin Liu, a lead author on the paper and staff scientist at SLAC's Stanford Synchrotron Radiation Lightsource (SSRL), a DOE Office of Science User Facility. "The model we created by combining these two imaging methods can readily be applied to studies of rapid changes in the pore networks of similarly structured materials, such as batteries, fuel cells and underground geological formations," he said.

To design materials for tomorrow's energy solutions, scientists must understand how they work at multiple scales invisible to the human eye. In a previous study at SSRL, the team took a series of two-dimensional images of catalyst particles at various angles and used software they developed to combine them into three-dimensional images of whole particles, showing the distribution of elements in catalysts of various ages. For the new study, the researchers examined an FCC particle recovered from a refinery using two different 3-D X-ray imaging techniques at two experimental stations, or beamlines, at SSRL. One technique, X-ray fluorescence, provided a detailed profile of the particle's chemical elements. The other, X-ray transmission microscopy, captured the nanoscale structure of the particle, including fine details about the porous network where metal poisoning can best be observed.

"The high-resolution microscopy data provided a map of the pores, and the high sensitivity of X-ray fluorescence showed us where metals in the refining fluids were poisoning the catalyst, which appeared as a colored fog in our visualization," Liu said.

The results of the study highlight the importance of having multiple techniques to study a single sample at a facility like SSRL. "There was a lot of development on the beamlines to make it possible to register the data in 3-D at this very fine scale," Liu said. He heads up one of the two beamlines used in the research, which gives him insight into the strengths and limitations of both imaging methods. "Understanding catalyst performance requires interrogating catalyst function from multiple perspectives," SSRL Director Kelly Gaffney said. "The results of this exciting research effort highlight the value of integrating disparate X-ray imaging methods to build a deeper understanding of materials function."
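As a concrete picture of what such data fusion yields, here is a minimal sketch of the overlay step, with random arrays standing in for the two registered 3-D volumes and an invented threshold; the real analysis used measured beamline data, not this toy input.

```python
import numpy as np

# Hypothetical illustration of correlative fusion: combine a pore map
# (transmission microscopy) with a metal map (X-ray fluorescence), assuming
# both are already registered on the same 3-D voxel grid.
rng = np.random.default_rng(1)
pores = rng.random((64, 64, 64)) > 0.7   # True where a voxel is pore space
metal = rng.random((64, 64, 64))         # relative metal fluorescence signal

poisoned = pores & (metal > 0.9)         # pore voxels carrying a heavy metal load
print(f"{poisoned.sum() / pores.sum():.1%} of pore space is metal-poisoned")
```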
Going beyond the observation of the experimental data visualized in the video, the scientists developed a model explaining how the accumulation of metals poisons the efficiency of the catalyst. "We used an analogy between electrical resistance and the degree of pore blockage between two points in the particle, using the new combined data. We then applied formulas well known in electrical engineering to explain accessibility through the pore network, but also how it changes when metals are blocking pores," said the study's co-lead researcher Florian Meirer, assistant professor of inorganic chemistry and catalysis at Utrecht University.

The resulting model simulates the aging of the catalyst, allows scientists to quantify this virtual aging, and helps them predict the collapse of its transportation network. "The model explains for the first time how this happens in a connective manner, which is a big step toward improving the design of such catalysts. Furthermore, this novel approach can be applied to a broad range of other materials that involve the transport of fluids or gases, such as battery electrodes," said Bert Weckhuysen, professor of inorganic chemistry and catalysis at Utrecht University.

More information: Yijin Liu et al., "Relating structure and composition with accessibility of a single catalyst particle using correlative 3-dimensional micro-spectroscopy," Nature Communications (2016). DOI: 10.1038/ncomms12634
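The electrical analogy lends itself to a small worked example. The sketch below is not the authors' published model: it treats each open pore throat as a resistor of known conductance, builds the network's graph Laplacian, and computes the effective conductance between two points before and after metal deposits choke a throat. The four-node layout and conductance values are invented for the toy case.

```python
import numpy as np

def effective_conductance(n_nodes, edges, inlet, outlet):
    """Effective conductance between inlet and outlet; edges = [(i, j, g)]."""
    # Build the weighted graph Laplacian of the pore network.
    L = np.zeros((n_nodes, n_nodes))
    for i, j, g in edges:
        L[i, i] += g
        L[j, j] += g
        L[i, j] -= g
        L[j, i] -= g
    # Ground the outlet node and inject a unit current at the inlet.
    keep = [k for k in range(n_nodes) if k != outlet]
    current = np.zeros(n_nodes)
    current[inlet] = 1.0
    voltage = np.zeros(n_nodes)
    voltage[keep] = np.linalg.solve(L[np.ix_(keep, keep)], current[keep])
    return 1.0 / voltage[inlet]  # conductance = current / voltage drop

# Toy network: two parallel two-throat paths from node 0 (exterior) to node 3.
fresh = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 1.0)]
print(effective_conductance(4, fresh, 0, 3))   # 1.0: both paths open

# "Aged" particle: metal deposits choke the throat between nodes 2 and 3.
aged = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 1e-6)]
print(effective_conductance(4, aged, 0, 3))    # ~0.5: one path effectively lost
```

Blocking one throat halves the conductance in this toy network; applied over a full imaged pore network, the same arithmetic quantifies how accessibility degrades as deposits accumulate.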


News Article | December 13, 2016
Site: www.eurekalert.org

When thinking about a welfare recipient, people tend to imagine someone who is African American and who is lazier and less competent than someone who doesn't receive welfare benefits, according to new findings in Psychological Science, a journal of the Association for Psychological Science. This mental image, and its association with specific racial stereotypes, influences people's judgments about who deserves government assistance.

"We show that when Americans think about recipients of government benefits, they overwhelmingly imagine a Black person - although, in reality, beneficiaries include White, Black, and Hispanic people in roughly equal proportions," explain researchers Jazmin L. Brown-Iannuzzi (University of Kentucky) and B. Keith Payne (University of North Carolina at Chapel Hill). "Moreover, when people consider granting or withholding benefits to the imagined recipients, they are less willing to grant benefits when the imagined person fits racial stereotypes about Black recipients."

Brown-Iannuzzi, Payne, and colleagues Ron Dotsch (Utrecht University) and Erin Cooley (Colgate University) were interested in studying how people think about welfare recipients because Americans often oppose redistributive policies despite rising economic inequality. "One reason, we hypothesized, could be that racial stereotypes shape perceptions of who benefits from such policies," Brown-Iannuzzi and Payne explain.

To explore these mental representations, the researchers created a "base" image that was a morph of four images: an African American man, an African American woman, a White man, and a White woman. They then added filters that distorted the base image to generate 800 unique face images. Participants looked at 400 pairs of faces and selected the face in each pair that looked more like a welfare recipient. The instructions did not refer to race or any other specific trait, and yet the two groups of participants made strikingly similar selections, indicating strong consensus about what welfare recipients "look like." Based on participants' selections, the researchers superimposed images to create one photo showing the average mental image of a welfare recipient and another photo showing the average mental image of a non-recipient.

The researchers then asked two entirely different groups of participants to evaluate the aggregate faces on various features. Some of the features related to appearance - for example, participants rated the individual's race, gender, likeability, attractiveness, and happiness. Others related to aspects of the individual's personality, including laziness, competence, humanness, and agency. Importantly, participants made these evaluations based on the image alone - they didn't know how the images were generated or what they represented.

The data, from a total of over 400 adults, showed a strong convergence. Although they did not know what the faces represented, participants rated the average welfare-recipient face as more African American (less White), less likeable, less attractive, and less happy than the aggregate non-recipient face. Moreover, they assessed the individual in the welfare-recipient image to be lazier, more incompetent, more hostile, and less human. And these associations seem to have real-world consequences, shaping people's attitudes about who deserves welfare benefits.
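The selection-and-averaging logic of this reverse-correlation approach can be sketched in a few lines. Everything below is a stand-in: plain pixel noise replaces the study's specialized face filters, and a toy brightness rule replaces real participants' judgments; only the averaging step mirrors the actual technique.

```python
import numpy as np

# Toy reverse-correlation sketch (illustrative assumptions throughout).
rng = np.random.default_rng(0)
base = rng.random((64, 64))              # stand-in for the morphed base face

selected = []
for _ in range(400):                     # 400 pairs, as in the study
    noise = rng.normal(0.0, 0.15, size=(64, 64))
    a, b = base + noise, base - noise    # each pair: a noise filter and its inverse
    # Hypothetical "participant": picks the darker image as looking more
    # like a welfare recipient (real participants gave subjective judgments).
    selected.append(a if a.mean() < b.mean() else b)

# Averaging all selected stimuli yields one classification image; superimposing
# many participants' averages gives the group-level mental image.
classification_image = np.asarray(selected).mean(axis=0)
print(classification_image.shape)        # (64, 64)
```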
A total of 229 participants evaluated the individuals shown in the two average images, believing that they were composite photos of actual people who had applied for government welfare programs. The researchers explained that some of these applicants had turned out to be "responsible" recipients, while others had turned out to be "irresponsible." Again, participants rated the average welfare-recipient image as more African American, less competent, and less hardworking than the average non-recipient image. But participants also rated the person shown in the average welfare-recipient image as generally less responsible, and less responsible with food stamps and cash assistance, than the person in the other image. These impressions seemed to drive participants' opinions about whether the two individuals deserved welfare support: they expressed less support for giving food stamps and cash assistance to the person in the welfare-recipient image than to the person in the other image.

The researchers conclude that these strong mental representations may contribute to growing inequality by triggering deeply rooted biases about how resources should be distributed and to whom. "Our research sheds light on the psychological roots of political policy preferences," say Brown-Iannuzzi and Payne. "These findings can help us understand the different assumptions and different mental representations underlying deep political divisions."

All data and materials have been made publicly available via figshare, and this article has received badges for Open Data and Open Materials. For more information about this study, please contact Jazmin Brown-Iannuzzi at jazmin.bi@uky.edu or B. Keith Payne at payne@unc.edu. The APS journal Psychological Science is the highest-ranked empirical journal in psychology. For a copy of the article "The Relationship Between Mental Representations of Welfare Recipients and Attitudes Toward Welfare" and access to other Psychological Science research findings, please contact Anna Mikulak at 202-293-9300 or amikulak@psychologicalscience.org.


News Article | October 26, 2016
Site: www.nature.com

Redefine excellence: Fix incentives to fix science

An obsession with metrics pervades science. Our institution, the University Medical Center Utrecht in the Netherlands, is not exempt. On our website, we proudly declare that we publish about 2,500 peer-reviewed scientific publications per year, with higher-than-average citation rates. A few years ago, an evaluation committee spent hours discussing which of several faculty members to promote, only to settle on the two who had already been awarded particularly prestigious grants. Meanwhile, faculty members who spent time crafting policy advice had a hard time explaining how this added to their scientific output, even when it affected clinical decisions across the country. Publications that directly influenced patient care were weighted no higher in evaluations than any other paper, and lower if the work appeared in the grey literature — that is, in official reports rather than in scientific journals. Some researchers were actively discouraged from pursuing publications that might improve medicine but would garner few citations.

All of this led many faculty members, especially younger ones, to complain that publication pressure kept them from doing what really mattered, such as strengthening contacts with patient organizations or trying to make promising treatments work in the real world. The institution decided to break free of this mindset. Our university medical centre has just completed its first round of professorial appointments using a different approach, which will continue to be used for the roughly 20 professors appointed each year. The institution is also evaluating research programmes in a new way.

In 2013, senior faculty members and administrators (including F.M.) at the University Medical Center (UMC) Utrecht, Utrecht University and the University of Amsterdam hosted workshops and published a position paper concluding that bibliometric parameters were overemphasized and societal relevance was undervalued [1]. This led to extensive media attention, with newspapers and television shows devoting sections to the 'crisis' in science. Other efforts have come to similar conclusions [2, 3, 4].

In the wake of this public discussion, we launched our own internal debates. We had two goals. We wanted to create policies that ensured individual researchers would be judged on their actual contributions and not on counts of their publications. And we wanted our research programmes to be geared towards creating societal impact, not just scientific excellence. Every meeting was attended by 20-60 UMC Utrecht researchers, many explicitly invited for their candour. They ranged from PhD students and young principal investigators to professors and department heads. The executive board, especially F.M., prepared the ground for frank criticism by publicly acknowledging publication pressure, perverse incentives and systemic flaws in science [5, 6]. Attendees debated the right balance between research driven by curiosity and research inspired by clinical needs. They considered the role of patients' advice in setting research priorities, the definition of a good PhD trajectory, and how to weigh up scientific novelty and societal relevance. We published interviews and reports from these meetings on our internal website and in our magazine. We spent the next year redefining the portfolio that applicants seeking academic promotions are asked to submit.
There were few examples to guide us, but we took inspiration from the approach used at the Karolinska Institute in Stockholm, which asks candidates for a package of scientific, teaching and other achievements. Along with other elements, Utrecht candidates now provide a short essay about who they are and what their plans are as faculty members. They must discuss achievements in terms of five domains, only one of which is scientific publications and grants. First, candidates describe their managerial responsibilities and academic duties, such as reviewing for journals and contributing to internal and external committees. Second, they explain how much time they devote to students, what courses they have developed and what other responsibilities they have taken on. Then, if applicable, they describe their clinical work, as well as their participation in organizing clinical trials and research into new treatments and diagnostics. Finally, the portfolio covers entrepreneurship and community outreach.

We also revamped the applicant-evaluation procedure. The chair of the committee is formally tasked with ensuring that all domains are discussed for each candidate. This keeps us from overlooking someone who has hard-to-quantify qualities, such as the motivation to turn 'promising' results into something that really matters for patients, or to seek out non-obvious collaborations.

Another aspect of breaking free of the 'bibliometric mindset' came in how we assess our multidisciplinary research programmes, each of which has on average 80 principal investigators. The evaluation method was developed by a committee of faculty members, most in the early stages of their careers. Following processes outlined by the UK Research Excellence Framework, which audits the output of UK institutions, committee members drew on case studies and published literature to define properties that could be used in broad assessments. This led to a suite of semi-qualitative indicators that include conventional outcome measurements, evaluations of leadership and citizenship across UMC Utrecht and other communities, and assessments of structure and process, such as how research questions are formed and results disseminated. We think that these shifts will reduce waste [7, 8], increase impact, and attract researchers geared for collaboration with each other and with society at large.

Researchers at UMC Utrecht are already accustomed to national reviews, so our proposal to revamp evaluations fell on fertile ground. However, crafting these new policies took commitment and patience. Two aspects of our approach were crucial. First, we did not let ourselves become paralysed by the belief that only joint action with funders and journals would bring real change. We were willing to move forward on our own as an institution. Second, we ensured that although change was stimulated from the top, the criteria were set by the faculty members who expect to be judged by those standards. Indeed, after ample debate fuelled by continuing international criticism of bibliometric indicators, the first wave of group leaders has embraced the new system, which will permeate the institute in the years to come.

During the past few years of lectures and workshops, we were initially struck by how little early- and mid-career researchers knew about the 'business model' of modern science and about how science really works. But they were engaged, quick to learn and quick to identify forward-looking ideas to improve science.
Students organized a brainstorming session with high-level faculty members about how to change the medical and life-sciences curriculum to incorporate reward-and-incentive structures. The PhD council chose a 'supervisor of the year' on the basis of the quality of supervision, not just the number of PhD students supervised, as was the custom before. Extended community discussions pay off. We believe that selection and evaluation committees are well aware that bibliometrics can be a reductive force, but that assessors may lack the vocabulary to discuss less-quantifiable dimensions. By formally requiring qualitative indicators and a descriptive portfolio, we broaden what can be talked about [9]. We shape the structures that shape science — we can make sure that they do not warp it.

Do judge: Treat metrics only as surrogates

Some 20 years ago, when I was dean of biological sciences at the University of Manchester, UK, I tried an experiment. At the time, we assessed candidates applying for appointments and promotions using conventional measures: number of publications, quality of journal, h-index and so on. Instead, we decided to ask applicants to tell us what they considered to be their three most important publications and why, and to submit a copy of each. We asked simple, direct questions: what have you discovered? Why is it important? What have you done about your discovery? To make applicants feel more comfortable with this peculiar assessment, we also indicated that they could submit, if they wished, a list of all of their other scientific publications — everyone did. That experience has influenced the work I do now as director-general of the main science-funding agency in Ireland.

The three publications chosen by an applicant told me a lot about their achievements and judgement. Often, they highlighted unconventional impacts of the work. For example, a would-be professor of medicine whose research concerned safely shortening hospital stays selected an article that he had written in the free, unrefereed magazine Hospital Doctor. Asked why, he replied that hospital managers and most doctors actually read that magazine, so the piece had facilitated rapid adoption of his findings; he later detailed the impactful results in an eminent medical journal (a paper he chose not to submit).

I believe most committee members actually read the papers submitted, unlike in other evaluations, where panellists have time only to scan exhaustive lists of publications. This approach may not have changed committee decisions, but it did change the incentives of both the candidates and the panellists. The focus was on work that was important and meaningful. When counts of papers or citations become the dominant assessment criteria, people often overlook the basics: what did this scientist do, and why does it matter?

But committee members often felt uncomfortable; they thought their selection was subjective, and they felt more secure with numbers. After all, the biological-sciences faculty had just been through a major reform to prioritize research activity. The committee members had a point — bibliometric methods do bring some objectivity and may help to avoid biases and prejudices. Still, such approaches do not necessarily help minorities, young people or those working on particularly difficult problems; nor do they encourage reproducibility (see go.nature.com/2dyn0sq). Exercising judgement is what people making important decisions are supposed to do.
When I moved on from my position as dean, the system reverted to its conventional form. Changes that diverge from a cultural norm are difficult to sustain, particularly when they rely on the passion of a small number of people. In the years since, bibliometric assessments have become ever more embedded in evaluations across the world. Lately, rumblings against their influence have grown louder [3]. To move the scientific enterprise towards better measures of quality, perhaps we need a collective effort by a group of leading international universities and research funders. What you measure is what you get: if funders focus on assessing solid research advances (with potential economic and social impact), this may encourage reliable, important work and discourage bibliometric gaming.

What can funders do? By tweaking rewards, these bodies can shape researchers' choices profoundly. The UK government has commissioned two reports [2, 10] on how bibliometrics can be gamed, and is mulling ways to improve nationwide evaluations. Already we have seen a higher value placed on reproducibility by the US National Institutes of Health, with an increased focus on methodology and a policy not to release funds until concerns raised by grant reviewers are explicitly addressed. The Netherlands Organisation for Scientific Research, the country's main funding body, has allocated funding for repeat experiments. Research funders should also explicitly encourage important research, even at the expense of publication rate. To this end, at Science Foundation Ireland, we will experiment with changes to the grant application form that are similar to my Manchester pilot. We will also introduce prizes, for example for mentorship.

We believe that such concrete steps will incentivize high-quality research over the long term, counterbalance some of the distortions in the current system, and help institutions to follow suit. If enough international research organizations and funders return to basic principles in promotions, appointments and evaluations, then perhaps the surrogates can be used properly — as supporting information. They are not endpoints in themselves.


News Article | December 14, 2016
Site: www.sciencenews.org

Self-driving cars promise to transform roadways. There’d be fewer traffic accidents and jams, say proponents, and greater mobility for people who can’t operate a vehicle. The cars could fundamentally change the way we think about getting around. The technology is already rolling onto American streets: Uber has introduced self-driving cabs in Pittsburgh and is experimenting with self-driving trucks for long-haul commercial deliveries. Google’s prototype vehicles are also roaming the roads. (In all these cases, though, human supervisors are along for the ride.) Automakers like Subaru, Toyota and Tesla are also including features such as automatic braking and guided steering on new cars.

“I don’t think the ‘self-driving car train’ can be stopped,” says Sebastian Thrun, who established and previously led Google’s self-driving car project. But don’t sell your minivan just yet. Thrun estimates 15 years at least before self-driving cars outnumber conventional cars; others say longer. Technical and scientific experts have weighed in on what big roadblocks remain, and how research can overcome them.

To a computer, a highway on a clear day looks completely different than it does in fog or at dusk. Self-driving cars have to detect road features in all conditions, regardless of weather or lighting. “I’ve seen promising results for rain, but snow is a hard one,” says John Leonard, a roboticist at MIT. Sensors need to be reliable, compact and reasonably priced — and paired with detailed maps so a vehicle can make sense of what it sees. Leonard is working with Toyota to help cars respond safely in variable environments, while others are using data from cars’ onboard cameras to create up-to-date maps. “Modern algorithms run on data,” he says. “It’s their fuel.”

Self-driving cars struggle to interpret unusual situations, like a traffic officer waving vehicles through a red light. Simple rule-based programming won’t always work because it’s impossible to code for every scenario in advance, says Missy Cummings, who directs a Duke University robotics lab. Body language and other contextual clues help people navigate these situations, but it’s challenging for a computer to tell if, for example, a kid is about to dart into the road. The car “has to be able to abstract; that’s what artificial intelligence is all about,” Cummings says. In a new approach, her team is investigating whether displays on the car can instead alert pedestrians to what the car is going to do. But results suggest walkers ignore the newfangled displays in favor of more old-fashioned cues — say, eyeballing the speed of the car.

Even with fully autonomous vehicles on the horizon, most self-driving cars will be semiautonomous for the foreseeable future. But figuring out who has what responsibilities at what time can be tricky. How does the car notify a passenger who has been reading or taking a nap that it’s time to take over a task, and how does the car confirm that the passenger is ready to act? “In a sense, you are still concentrating on some of the driving, but you are not really driving,” says Chris Janssen, a cognitive scientist at Utrecht University in the Netherlands. His lab is studying how people direct their attention in these scenarios. One effort uses EEG machines to look at how people’s brains respond to an alert sound when the people are driving versus riding as a passive passenger (as they would in a self-driving car).
Janssen is also interested in the best time to deliver instructions and how explicit the instructions should be.

In exploring the ethical questions of self-driving cars, Iyad Rahwan, an MIT cognitive scientist, has confirmed that people are selfish: “People buying these cars, they want cars that prioritize the passenger,” says Rahwan — but they want other people’s cars to protect pedestrians instead (SN Online: 6/23/16). In an online exercise called the Moral Machine, players choose whom to save in different scenarios. Does it matter if the pedestrian is an elderly woman? What if she is jaywalking? Society will need to decide what rules and regulations should govern self-driving cars. For the technology to catch on, decisions will have to incorporate moral judgments while still enticing consumers to embrace automation.

In 2015, hackers brought a Jeep to a halt on a St. Louis highway by wirelessly accessing its braking and steering via the onboard entertainment system. The demonstration proved that even conventional vehicles have vulnerabilities that, if exploited, could lead to accidents. Self-driving cars, which would get updates and maps through the cloud, would be at even greater risk. “The more computing permeates into everyday objects, the harder it is going to be to keep track of the vulnerabilities,” says Sean Smith, a computer scientist at Dartmouth College. And while terrorists might want to crash cars, Smith can imagine other nefarious acts: for instance, hackers could disable someone’s car and hold it for ransom until receiving a digital payment.


Merging two powerful 3-D X-ray techniques, a team of researchers from the Department of Energy’s SLAC National Accelerator Laboratory and Utrecht University in the Netherlands revealed new details of the metal poisoning process that clogs the pores of fluid catalytic cracking (FCC) catalyst particles used in gasoline production, causing them to lose effectiveness. The team combined their data to produce a video that shows the chemistry of this aging process and takes the viewer on a virtual flight through the pores of a catalyst particle. More broadly, the approach is generally applicable and provides an unprecedented view of dynamic changes in a material’s pore space—an essential factor in the rational design of functional porous materials, including those used for batteries and fuel cells. The results were published in an open-access paper in Nature Communications.

The FCC catalyst particles are used in oil refineries to “crack” large molecules that are left after distillation of crude oil into smaller molecules, such as gasoline. The FCC catalyst is designed as a multi-component, hierarchically porous particle of 50–100 μm diameter and consists of catalytically highly active phases (zeolites) embedded in a matrix consisting of an active component (alumina) and a non-active part made from silica and clay. A highly interconnected hierarchical pore network, with pore sizes ranging from micro-pores (less than 2 nm) up to macro-pores, ensures accessibility to the active domains where the chemical reactions take place.

Although the catalyst material is not consumed in the reaction and in theory could be recycled indefinitely, the pores clog up and the particles slowly lose effectiveness. Worldwide, about 400 reactor systems refine oil into gasoline, accounting for about 40–50% of today’s gasoline production, and each system requires 10–40 tons of fresh FCC catalysts daily. Finding new clues about how FCC catalysts age could be key to improving gasoline production. But the new technique also has potential for understanding the workings of materials for powering cars of the future, according to Yijin Liu, a lead author on the paper and staff scientist at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL), a DOE Office of Science User Facility.

In a previous study at SSRL, the team took a series of two-dimensional images of catalyst particles at various angles and used software they developed to combine them into three-dimensional images of whole particles, showing the distribution of elements in catalysts of various ages. For the new study, the researchers examined an FCC particle recovered from a refinery using two different 3-D X-ray imaging techniques at two experimental stations, or beamlines, at SSRL. One technique, X-ray fluorescence, provided a detailed profile of the particle’s chemical elements. The other, X-ray transmission microscopy, captured the nanoscale structure of the particle, including fine details about the porous network where metal poisoning can best be observed.

The results of the study highlight the importance of having multiple techniques to study a single sample at a facility like SSRL. Liu, who heads up one of the two beamlines used in the research, noted that a lot of development work on the beamlines was needed to make it possible to register the data in 3-D at this very fine scale.
Going beyond the observation of the experimental data visualized in the video, the scientists developed a model explaining how the accumulation of metals poisons the efficiency of the catalyst. The resulting model simulates the aging of the catalyst, allows scientists to quantify this virtual aging, and helps them predict the collapse of its transportation network. Other researchers who contributed to this work were SSRL’s Courtney Krest and Samuel Webb. This work was supported by the NWO Gravitation program, Netherlands Center for Multiscale Catalytic Energy Conversion, and a European Research Council Advanced Grant.
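The predicted collapse of the transport network can be illustrated with a toy percolation simulation. The grid geometry, random blocking order and connectivity test below are invented for illustration and are not the published pore-network model.

```python
import random
import networkx as nx

# "Virtual aging" sketch: block random pore throats one by one and report
# when the particle exterior loses its last open path to the interior.
random.seed(0)
G = nx.grid_2d_graph(20, 20)             # 2-D stand-in for the pore network
throats = list(G.edges())                # real pore throats, all blockable

G.add_node("IN"); G.add_node("OUT")      # virtual exterior/interior hubs
for y in range(20):
    G.add_edge("IN", (0, y))             # left face touches the exterior
    G.add_edge("OUT", (19, y))           # right face reaches the deep interior

random.shuffle(throats)
for step, throat in enumerate(throats, start=1):
    G.remove_edge(*throat)               # a metal deposit blocks this throat
    if not nx.has_path(G, "IN", "OUT"):
        print(f"Network collapsed after blocking {step / len(throats):.0%} of throats")
        break
```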


News Article | December 12, 2016
Site: www.eurekalert.org

The East Antarctic ice sheet appears to be more vulnerable than expected, due to a strong wind that brings warm air and blows away the snow. That is the conclusion reached by a team of climate researchers led by Jan Lenaerts (Utrecht University/KU Leuven) and Stef Lhermitte (TU Delft/KU Leuven), based on a combination of climate models, satellite observations and on-site measurements. Their conclusions will be published in Nature Climate Change on 12 December.

"Tens of meters of rising sea levels are locked away in Antarctica," says Lenaerts. "And our research has shown that East Antarctica, too, is vulnerable to climate change." Current IPCC projections show large uncertainties in Antarctica's contribution to sea level rise, because the role of ice shelf processes remains uncertain. Lenaerts explains: "Little climate change is observable in East Antarctica, because the area is so isolated from the rest of the world." However, to the researchers' astonishment, the ice shelves in some regions of East Antarctica are melting faster than scientists had previously assumed. These ice shelves appear to be extremely sensitive to climate change.

Through a unique combination of field work, satellite data and a climate model, the researchers were able to explain why some parts of the East Antarctic ice shelves are melting so rapidly. The strong and persistent wind transports warm, dry air to the region and blows away the snow. This darkens the surface, which then absorbs more of the sun's heat. The result is a locally warmer microclimate with a few literal 'hotspots'.

Because the ice shelf is floating in the ocean, its melting does not immediately contribute to sea level rise. However, the ice shelves around Antarctica are extremely important for ice sheet stability, because they hold back the land ice. If the ice shelves collapse, this land ice ends up in the ocean, and consequently sea level will rise.

Part of the research conducted by Lenaerts and Lhermitte focused on a mysterious crater that was spotted on the King Baudouin ice shelf. "At the time, the media reported that it was probably a meteorite impact crater," Lenaerts says. "My response was: in that area? Then it's definitely not a meteorite; it's proof of strong melting." In January 2016, the researchers visited the crater and discovered that it was a collapsed lake, with a moulin, a hole in the ice, which allowed the water to flow into the ocean. Lhermitte: "That was a huge surprise. Moulins are typically observed on Greenland. And we definitely never see them on an ice shelf." Moreover, the researchers discovered that there were many meltwater lakes hidden under the surface of the ice, some of which were kilometres across. Underwater video images give a clear picture of the amount of meltwater present in the area.

Is this a sign of climate change? "The crater isn't new; we found it on satellite images from 1989. The amount of meltwater differs immensely from year to year, but it clearly increases during warm years," according to Lhermitte. Last year, an influential publication showed that Antarctica's contribution to rising sea levels depends largely on the stability of these melting ice shelves. Lenaerts: "That study indicated that West Antarctica is extremely sensitive to climate change. But our research now suggests that the much larger East Antarctic ice sheet is also very vulnerable."
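A back-of-envelope calculation shows why a darker surface matters; the albedo and insolation figures below are typical textbook values, not measurements from this study.

```python
# Assumed values: fresh snow reflects ~85% of sunlight, exposed blue ice ~60%.
incoming = 300.0                         # W/m^2, rough summer insolation (assumed)
for albedo in (0.85, 0.60):
    print(f"albedo {albedo:.2f}: {(1 - albedo) * incoming:.0f} W/m^2 absorbed")
# Darkening from snow to ice nearly triples the absorbed solar energy.
```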
The study is a collaborative effort by Utrecht University, TU Delft, KU Leuven, Université Libre de Bruxelles and the Alfred-Wegener-Institut.
