News Article | February 21, 2017
Water utilities from across the US and Canada are gearing up for the Water Utility Energy Challenge. Don't miss your opportunity to compete!

The Water Utility Energy Challenge (WUEC) is an innovative program that engages water utilities in a competition to reduce the emissions associated with generating the energy they use. Funded by the Great Lakes Protection Fund (GLPF), the competition aims to connect utilities with innovative new software while fostering awareness of those emissions, particularly mercury.

“Our team is excited to have such a wide breadth of locations and utilities competing,” said Dr. Carol Miller, the technical lead for the competition and the Chair of the Department of Civil and Environmental Engineering at Wayne State University. “The competition has registrations from Illinois, Michigan, Minnesota, Ohio, New York and Wisconsin, as well as the province of Ontario.”

The technology is open source for utilities across the United States, but the inaugural challenge and prize dollars will be focused on the Great Lakes Basin. There is no cost to enter, and competing utilities will receive software and tools to assist in monitoring and reporting emissions, as well as hands-on technical assistance. The competition will run through 2018, with $30,000 in prizes presented next spring.

“AWWA is eager to collaborate with the Great Lakes Protection Fund on this forward-looking work. With its cutting-edge technology and training component, the partnership will help water utilities optimize their energy consumption, use cleaner energy sources and, importantly, help keep the Great Lakes clean,” said David LaFrance, AWWA Chief Executive Officer.

More information on the challenge can be found on AWWA’s WUEC webpage. The Water Utility Energy Challenge (WUEC) is a technology competition focused on water utilities in the Great Lakes Basin.
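The emissions accounting at the center of the competition reduces to simple arithmetic: a utility's energy use scaled by per-unit emission factors for its regional grid. The sketch below is a hypothetical illustration of that idea, not the WUEC software; the emission factors and energy figure are invented for demonstration.

```python
# Illustrative estimate of the generation emissions attributable to a water
# utility's energy use. All numeric values are hypothetical assumptions.

def estimate_emissions(energy_kwh, factors_per_mwh):
    """Scale per-MWh grid emission factors by the utility's energy use."""
    mwh = energy_kwh / 1000.0
    return {pollutant: rate * mwh for pollutant, rate in factors_per_mwh.items()}

# Assumed regional grid factors: kg of CO2 and mg of mercury per MWh generated.
grid_factors = {"co2_kg": 650.0, "mercury_mg": 8.0}

# A utility drawing 10,000 kWh in a day would, under these assumptions,
# be associated with:
daily = estimate_emissions(10_000, grid_factors)
print(daily)  # {'co2_kg': 6500.0, 'mercury_mg': 80.0}
```

Reducing either the energy used or the emission intensity of the supply — the two levers the competition highlights — lowers the estimate.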
Offering two top cash prizes of $20,000 and $10,000, the Water Utility Energy Challenge is supported by the Great Lakes Protection Fund. It is a collaborative effort of the American Water Works Association, CDM Smith, E2i, Great Lakes and St. Lawrence Cities Initiative, Growth Capital Network, and Wayne State University. For more information, visit www.AWWA.org/competition.

The Great Lakes Protection Fund (GLPF) is a private, nonprofit corporation formed in 1989 by the governors of the Great Lakes states. It is a permanent environmental endowment that supports collaborative actions to improve the health of the Great Lakes ecosystem. To date, the Fund has made 265 grants and program-related investments representing over $75 million to support the creative work of collaborative teams that test new ideas, take risks, and share what they learn. www.glpf.org

Established in 1881, the American Water Works Association is the largest nonprofit, scientific and educational association dedicated to managing and treating water, the world’s most important resource. With approximately 50,000 members, AWWA provides solutions to improve public health, protect the environment, strengthen the economy and enhance our quality of life. www.awwa.org
News Article | February 15, 2017
When people find out there are invisible particles in their food or water, they become alarmed. Arizona State University professor Paul Westerhoff has dedicated his career to producing research that answers people’s questions and moves them past fear. “The things I do are not from a scare-mongering point of view, but trying to answer objective engineering questions,” says Westerhoff, a professor in the School of Sustainable Engineering and the Built Environment at ASU.

Westerhoff, an environmental engineer, has been named one of three Regents' Professors for the 2016-2017 academic year. Regents’ Professor is the highest faculty honor and is conferred on full professors who have made remarkable achievements that have brought them national attention and international distinction.

An expert in nanoparticles, Westerhoff started working on the tiny specks even before they had a name. As a graduate student, he worked on water filtration. “At that time we talked about these things called ‘sub-micron particles,’ which we couldn’t measure very well, but we did a bunch of experiments with them anyway,” he says. A few years later, when the term “nano” was becoming popular, he realized he had already been doing the work. “So I put in my first proposal, and it got funded because I was one of the first people who had data!”

Now, he focuses on using nanoparticles to treat and purify water, an interest that was piqued by a hydrology class he took as an undergraduate. “I understand water,” he says. “I like fishing and swimming and kayaking, and I can go to a river and not only understand the hydrology, but know why the water is a certain color. And I know where it came from. And I know all the fish that live in it.”

From his first studies, he saw the trajectory of public perception about invisible and unknown substances in the environment, and how that could influence his research. “In the environmental world, initially it’s like the world’s going to end. 
But what I’ve learned is that these things move through predictable trends,” he says, using as an example “Silent Spring,” a 1962 book by conservationist Rachel Carson that documented the effects of the use of pesticides, including DDT. “It’s in this early stage that people are scared, while the agriculture industry and pesticide industry responded by saying that they save millions of lives. In the first few years there’s a lot of uncertainty,” he says. “Then researchers come along and help reduce that uncertainty.

“Then there’s another phase where politics come in, and there are cost decisions and people think about regulations and finding alternatives,” he says. “We still find DDT in the environment, but it’s regulated and people really aren’t scared of it. It’s like a 20-year cycle.”

Westerhoff says the key is to know which phase is coming next. “As a researcher you want to be focusing on what will be the important question to answer in three to five years, before people even know it’s a question,” he says. “In nano, we were ahead of the game in thinking, ‘Maybe this isn’t so bad, maybe we can use it.’”

Now he’s deputy director of the Nanotechnology Enabled Water Treatment Systems Center, which is focused on developing compact, mobile, off-grid systems that can provide clean water to millions of people who lack it.

Many of Westerhoff’s research projects have been funded by agencies such as the National Science Foundation and the Environmental Protection Agency, but he also works with water utilities, non-governmental organizations, and industry partners. “Industry wants to know the answers to things. It’s moved out of the scientific ‘what if’ toward reality,” he says. 
“They all have agendas and as long as you understand their agendas, they ask interesting questions.”

Westerhoff was commissioned by the environmental activist group Friends of the Earth to see whether there were nanoparticles in powdered infant formula after the manufacturer declined to reveal whether there were. His lab found needle-shaped nanoparticles in the formula. “In Europe, there’s a warning on their use in cosmetics, yet they’re in infant formula,” he says.

They discovered the nanoparticles did not dissolve in either water or saliva, but when they put them in stomach fluid, they dissolved instantly. “They did it to deliver calcium to the gut very efficiently, so they didn’t have to use as much,” he says of the manufacturer. Friends of the Earth was concerned that the formula labels didn’t disclose the presence of nanoparticles. “That’s an example of where one group sees something as a risk to society but a company sees it as a benefit.”

He’s also seen the evolution of how scientific research is portrayed in the media. In 2008, he supervised a doctoral student on a research project that studied the use of nanosilver in socks to eliminate stinky feet. They wanted to know: Did the particles wash out of the socks and into the water supply? The answer was yes.

Journalists jumped all over the story. One headline read, “Toxic socks?” “We kept telling them the amount of silver is very small and won’t affect anything. None of them got it, and everything they wrote was over the top,” Westerhoff says. “They don’t want to hear that ‘everything is safe, there’s no problem.’ They want to hear ‘there’s nanoparticles in donuts.’”

In 2015, Westerhoff was named an Outstanding Doctoral Mentor by ASU’s Graduate College. His former students said he is able to deftly balance the guidance that students crave with the independence they need to cultivate. 
Troy Benn, who worked with Westerhoff on the nanosilver paper and is now an engineer in Montana, says: “For a young kid it was a little bit shocking because you do all your research in a lab and you don’t talk to anyone outside, and all of a sudden people are asking you what you did.

“Paul’s good at knowing how much guidance each student needs because they’re all unique.”

Kyle Doudrick, who was a graduate student at ASU from 2008 to 2013, says that even with the enormous workload of a full professor, including travel, plus the administrative duties of a vice provost, Westerhoff found time to meet weekly with the students he advised.

“It was a good balance of managing but also letting you find yourself in your independence, but not so hands off that you had no idea what’s going on,” says Doudrick, who is now an assistant professor in the Department of Civil and Environmental Engineering and Earth Sciences at the University of Notre Dame. “The research I did was on nitrate as a contaminant in water,” he says. “He wasn’t the expert, but what he was good at was making the student the expert, and that’s the whole purpose of the PhD, is to become an expert at something.”

Even now, Westerhoff teaches ASU 101, the required, one-credit course that all first-time freshmen take. “I ask them why they want to be engineers, and about half have a life story of something they want to solve. They have a deep passion.

“And if you don’t hear that until you see them in grad school, you’ve lost touch with what motivates people.”
News Article | February 15, 2017
Eight MIT faculty are among the 84 new members and 22 foreign associates elected to the National Academy of Engineering. Newly elected members for this year also include an impressive 18 MIT-affiliated alumni.

Election to the National Academy of Engineering (NAE) is among the highest professional distinctions accorded to an engineer. Academy membership honors those who have made outstanding contributions to "engineering research, practice, or education, including, where appropriate, significant contributions to the engineering literature," and to "the pioneering of new and developing fields of technology, making major advancements in traditional fields of engineering, or developing/implementing innovative approaches to engineering education."

The eight elected this year include:

Paula Hammond, the David H. Koch Professor, head of the Department of Chemical Engineering, and member of the Koch Institute for Cancer Research, for contributions to self-assembly of polyelectrolytes, colloids, and block copolymers at surfaces and interfaces for energy and health care applications.

Daniel Hastings, the Cecil and Ida Green Education Professor in the Department of Aeronautics and Astronautics and chief executive officer and director of the Singapore-MIT Alliance for Research and Technology, for contributions in spacecraft and space system-environment interactions, space system architecture, and leadership in aerospace research and education.

Dara Entekhabi, the Bacardi and Stockholm Water Foundations Professor in the departments of Civil and Environmental Engineering and Earth, Atmospheric and Planetary Sciences, for leadership in the hydrologic sciences including the scientific underpinnings for satellite observation of the Earth's water cycle. 
Dina Katabi, the Andrew (1956) and Erna Viterbi Professor in the Department of Electrical Engineering and Computer Science and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL), for contributions to network congestion control and to wireless communications.

Alexander H. Slocum, the Pappalardo Professor of Mechanical Engineering in the Department of Mechanical Engineering, for contributions to precision machine design and manufacturing across multiple industries and leadership in engineering education.

Michael S. Strano, the Carbon P. Dubbs Professor of Chemical Engineering in the Department of Chemical Engineering, for contributions to nanotechnology, including fluorescent sensors for human health and solar and thermal energy devices.

Mehmet Toner, professor of health sciences at the Harvard-MIT Division of Health Sciences and Technology and the Helen Andrus Benedict Professor of Surgery at Massachusetts General Hospital, for engineering novel microelectromechanical and microfluidic point-of-care devices that improve detection of cancer, prenatal genetic defects, and infectious disease.

Ioannis Yannas, professor of polymer science and engineering in the Department of Mechanical Engineering, for co-developing the first commercially reproducible artificial skin that facilitates new growth, saving the lives of thousands of burn victims.

“This is a great class of new NAE members who are affiliated with MIT,” says Ian A. Waitz, dean of the School of Engineering and the Jerome C. Hunsaker Professor in the Department of Aeronautics and Astronautics. “It is wonderful to see our faculty and alumni being honored by their peers for contributions of the highest level.”

Including this year’s inductees, 142 current MIT faculty and staff are members of the National Academy of Engineering. This week’s announcement brings NAE’s total U.S. membership to 2,281 and the number of foreign members to 249. 
Eighteen MIT alumni, including some of the newly elected members listed above, were also named to the NAE this year. They include: Ellen Arruda PhD '92; Aziz Asphahani PhD '75; David Boger ’83; Mark Daskin '74; Bailey Diffie '65; Eric Ducharme SM ’85, ScD ’87; Dara Entekhabi PhD '90; Paula Hammond '84; Daniel Hastings PhD ’80; Dina Katabi SM ’99, PhD ’03; Robert McCabe SM ’81; Alexander Slocum, Jr. ’82; Megan Smith ’86; Darlene Solomon PhD '85; Mehmet Toner SM ’85, PhD '89; George Varghese PhD ’93; Ioannis Yannas SM '59; and Katherine Yelick '82.
News Article | February 23, 2017
Wastewater from oil and gas operations -- including fracking for shale gas -- at a West Virginia site altered microbes downstream, according to a Rutgers-led study.

The study, published recently in Science of the Total Environment, showed that wastewater releases, including briny water that contained petroleum and other pollutants, altered the diversity, numbers and functions of microbes. The shifts in the microbial community indicated changes in their respiration and nutrient cycling, along with signs of stress. The study also documented changes in antibiotic resistance in downstream sediments, but did not uncover hot spots, or areas with high levels of resistance.

The findings point to the need to understand the impacts on microbial ecosystems from accidental releases or improper treatment of fracking-related wastewater. Moreover, microbial changes in sediments may have implications for the treatment and beneficial reuse of wastewater, the researchers say.

"My hope is that the study could be used to start making hypotheses about the impacts of wastewater," said Nicole Fahrenfeld, lead author of the study and assistant professor in Rutgers' Department of Civil and Environmental Engineering. Much remains unknown about the impacts of wastewater from fracking, she added. "I do think we're at the beginning of seeing what the impacts could be," said Fahrenfeld, who works in the School of Engineering. "I want to learn about the real risks and focus our efforts on what matters in the environment."

Underground reservoirs of oil and natural gas contain water that is naturally occurring or injected to boost production, according to the U.S. Geological Survey (USGS), whose scientists contributed to the study. During fracking, a fracturing fluid and a solid material are injected into an underground reservoir under very high pressure, creating fractures to increase the porosity and permeability of rocks. 
Liquid pumped to the surface is usually a mixture of the injected fluids with briny water from the reservoir. It can contain dissolved salt, petroleum and other organic compounds, suspended solids, trace elements, bacteria, naturally occurring radioactive materials and anything injected into wells, the USGS says. Such water is recycled, treated and discharged; spread on roads, evaporated or infiltrated; or injected into deep wells.

Fracking for natural gas and oil and its wastewater has increased dramatically in recent years. And that could overwhelm local infrastructure and strain many parts of the post-fracking water cycle, including the storage, treatment, reuse, transportation or disposal of the wastewater, according to the USGS.

For the Rutgers-USGS study, water and sediment samples were collected from tributaries of Wolf Creek in West Virginia in June 2014, including an unnamed tributary that runs through an underground injection control facility. The facility includes a disposal well, which injects wastewater to 2,600 feet below the surface; brine storage tanks; an access road; and two lined ponds (now closed) that were used to temporarily store wastewater to allow particles to settle before injection.

Water samples were shipped to Rutgers, where they were analyzed. Sediment samples were analyzed at the Waksman Genomics Core Facility at Rutgers. The study generated a rich dataset from metagenomic sequencing, which pinpoints the genes in entire microbial communities, Fahrenfeld noted. "The results showed shifts in the microbial community and antibiotic resistance, but this site doesn't appear to be a new hot spot for antibiotic resistance," she said.

The use of biocides in some fracturing fluids raised the question of whether this type of wastewater could serve as an environment that is favorable for increasing antimicrobial resistance. 
Antimicrobial resistance detected in these sediments did not rise to the levels found in municipal wastewater, an important environmental source of antimicrobial resistance along with agricultural sites.

Antibiotics and similar drugs have been used so widely and for so long that the microbes the antibiotics are designed to kill have adapted to them, making the drugs less effective, according to the U.S. Centers for Disease Control and Prevention. At least 2 million people become infected with antibiotic-resistant bacteria each year in the U.S., and at least 23,000 of them die from the infections.

"We have this really nice dataset with all the genes and all the microbes that were at the site," Fahrenfeld said. "We hope to apply some of these techniques to other environmental systems."

Study authors include Rutgers undergraduate Hannah Delos Reyes and Rutgers doctoral candidate Alessia Eramo. Other authors include Denise M. Akob, Adam C. Mumford and Isabelle M. Cozzarelli of the U.S. Geological Survey's National Research Program. Mumford earned a doctorate in microbiology at Rutgers.
News Article | February 15, 2017
A team of researchers at MIT has designed one of the strongest lightweight materials known, by compressing and fusing flakes of graphene, a two-dimensional form of carbon. The new material, a sponge-like configuration with a density of just 5 percent that of steel, can have 10 times steel's strength.

In its two-dimensional form, graphene is thought to be the strongest of all known materials. But researchers until now have had a hard time translating that two-dimensional strength into useful three-dimensional materials. The new findings show that the crucial aspect of the new 3-D forms has more to do with their unusual geometrical configuration than with the material itself, which suggests that similar strong, lightweight materials could be made from a variety of materials by creating similar geometric features.

The findings are being reported today in the journal Science Advances, in a paper by Markus Buehler, the head of MIT’s Department of Civil and Environmental Engineering (CEE) and the McAfee Professor of Engineering; Zhao Qin, a CEE research scientist; Gang Seob Jung, a graduate student; and Min Jeong Kang MEng ’16, a recent graduate.

Other groups had suggested the possibility of such lightweight structures, but lab experiments so far had failed to match predictions, with some results exhibiting several orders of magnitude less strength than expected. The MIT team decided to solve the mystery by analyzing the material’s behavior down to the level of individual atoms within the structure. They were able to produce a mathematical framework that very closely matches experimental observations.

Two-dimensional materials — basically flat sheets that are just one atom in thickness but can be indefinitely large in the other dimensions — have exceptional strength as well as unique electrical properties. But because of their extraordinary thinness, “they are not very useful for making 3-D materials that could be used in vehicles, buildings, or devices,” Buehler says. 
“What we’ve done is to realize the wish of translating these 2-D materials into three-dimensional structures.”

The team was able to compress small flakes of graphene using a combination of heat and pressure. This process produced a strong, stable structure whose form resembles that of some corals and microscopic creatures called diatoms. These shapes, which have an enormous surface area in proportion to their volume, proved to be remarkably strong.

“Once we created these 3-D structures, we wanted to see what’s the limit — what’s the strongest possible material we can produce,” says Qin. To do that, they created a variety of 3-D models and then subjected them to various tests. In computational simulations, which mimic the loading conditions in the tensile and compression tests performed in a tensile loading machine, “one of our samples has 5 percent the density of steel, but 10 times the strength,” Qin says.

Buehler says that what happens to their 3-D graphene material, which is composed of curved surfaces under deformation, resembles what would happen with sheets of paper. Paper has little strength along its length and width, and can be easily crumpled up. But when made into certain shapes, for example rolled into a tube, suddenly the strength along the length of the tube is much greater and can support substantial weight. Similarly, the geometric arrangement of the graphene flakes after treatment naturally forms a very strong configuration.

The new configurations have been made in the lab using a high-resolution, multimaterial 3-D printer. They were mechanically tested for their tensile and compressive properties, and their mechanical response under loading was simulated using the team’s theoretical models. The results from the experiments and simulations matched accurately. 
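What the "5 percent the density, 10 times the strength" comparison implies for strength-to-weight can be checked with a few lines of arithmetic. The absolute values assumed for steel below are rough illustrative figures, not numbers from the paper:

```python
# Specific strength (strength per unit density) implied by the reported
# comparison: 5% of steel's density at 10x steel's strength.
# Steel's absolute density and strength are assumed round values.

steel_density = 7850.0    # kg/m^3, assumed typical structural steel
steel_strength = 400.0    # MPa, assumed yield strength

sample_density = 0.05 * steel_density     # 5 percent the density of steel
sample_strength = 10.0 * steel_strength   # 10 times the strength

ratio = (sample_strength / sample_density) / (steel_strength / steel_density)
print(round(ratio, 6))  # 200.0 (about 200x steel's strength-to-weight ratio)
```

Whatever absolute figures are chosen for steel, they cancel out: the quoted comparison implies a strength-to-weight advantage of a factor of 10 / 0.05 = 200.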
The new, more accurate results, based on atomistic computational modeling by the MIT team, ruled out a possibility proposed previously by other teams: that it might be possible to make 3-D graphene structures so lightweight that they would actually be lighter than air, and could be used as a durable replacement for helium in balloons. The current work shows, however, that at such low densities, the material would not have sufficient strength and would collapse from the surrounding air pressure.

But many other possible applications of the material could eventually be feasible, the researchers say, for uses that require a combination of extreme strength and light weight. “You could either use the real graphene material or use the geometry we discovered with other materials, like polymers or metals,” Buehler says, to gain similar advantages of strength combined with advantages in cost, processing methods, or other material properties (such as transparency or electrical conductivity). “You can replace the material itself with anything,” Buehler says. “The geometry is the dominant factor. It’s something that has the potential to transfer to many things.”

The unusual geometric shapes that graphene naturally forms under heat and pressure look something like a Nerf ball — round, but full of holes. These shapes, known as gyroids, are so complex that “actually making them using conventional manufacturing methods is probably impossible,” Buehler says. The team used 3-D-printed models of the structure, enlarged to thousands of times their natural size, for testing purposes.

For actual synthesis, the researchers say, one possibility is to use polymer or metal particles as templates, coat them with graphene by chemical vapor deposition before heat and pressure treatments, and then chemically or physically remove the polymer or metal phases to leave 3-D graphene in the gyroid form. 
For this, the computational model given in the current study provides a guideline to evaluate the mechanical quality of the synthesis output.

The same geometry could even be applied to large-scale structural materials, they suggest. For example, concrete for a structure such as a bridge might be made with this porous geometry, providing comparable strength with a fraction of the weight. This approach would have the additional benefit of providing good insulation because of the large amount of enclosed airspace within it. Because the shape is riddled with very tiny pore spaces, the material might also find application in some filtration systems, for either water or chemical processing. The mathematical descriptions derived by this group could facilitate the development of a variety of applications, the researchers say.

“This is an inspiring study on the mechanics of 3-D graphene assembly,” says Huajian Gao, a professor of engineering at Brown University, who was not involved in this work. “The combination of computational modeling with 3-D-printing-based experiments used in this paper is a powerful new approach in engineering research. It is impressive to see the scaling laws initially derived from nanoscale simulations resurface in macroscale experiments under the help of 3-D printing,” he says. This work, Gao says, “shows a promising direction of bringing the strength of 2-D materials and the power of material architecture design together.”

The research was supported by the Office of Naval Research, the Department of Defense Multidisciplinary University Research Initiative, and the BASF-North American Center for Research on Advanced Materials.
News Article | February 27, 2017
RESEARCH TRIANGLE PARK, N.C., Feb. 28, 2017 /PRNewswire/ -- "Airpocalypse" -- that's how many referred to the damaging levels of smog that affected many residents of China this winter. The high levels of pollution caused schools to close, transportation to be halted, and gas masks to become a wardrobe staple.

A new report, "Air Quality Management Planning Framework," by researchers from RTI International in collaboration with the Jiangsu Environmental Protection Department, the U.S. Environmental Protection Agency, Sonoma Technology, Inc., and the Regulatory Assistance Project, aims to help the Ministry of Environmental Protection of the People's Republic of China lift communities out of the smog by outlining best practices and technologies used in the United States for improving air quality. The work was funded by the U.S. Trade and Development Agency.

"China has undergone an economic surge over the past few decades, and as a result air quality has suffered," said Rebecca Nicholson, contributing author and vice president of RTI's Environmental Engineering & Economics Division. "The United States has had its own challenges managing air quality and economic growth, and the report shares lessons learned through that experience."

Authors of the report analyzed conditions in three Jiangsu Province cities -- Nanjing, Changzhou, and Suzhou -- that face challenging air quality issues. Jiangsu has experienced rapid urbanization and increased energy demand, leading to air pollution. In 2014, the province had the highest industrial smoke and dust emissions in China, and was ranked second in coal consumption. The geographical makeup of the region also presents a unique challenge -- with mountains on three sides, it is difficult for pollution to disperse.

Looking at these challenges, the report lays out a range of recommendations for improving air quality. "Technologies, data, and expertise surrounding air quality have never been better," Nicholson said. 
"Leveraging this knowledge, China can maintain its economic growth while reducing air pollution and improving overall public health." In addition to providing recommendations for air quality improvement, the report provides updates on China's national-level air quality planning efforts, as well as efforts in the Jiangsu Province. Read the full report here. A Chinese version is also available.
News Article | March 2, 2017
A new study, published today in the Canadian Journal of Civil Engineering, presents a risk-based approach for classifying the road surface conditions of a highway network under winter weather events. This approach includes an explicit account of the driving risk that a motorist may experience on a highway. In countries like Canada that have severe winter seasons, transportation agencies often face challenges in meeting the safety and mobility needs of people on the road. To address these challenges, most agencies have a comprehensive winter maintenance program in place that includes policies, best practices, and guidelines for monitoring and reporting of road surface conditions. Typically, road surface condition information is broadcast through a traveler information portal known as the 511 system or through the website of the road agency. However, there is a lack of consistency in defining and determining the winter driving conditions of a highway across different transportation agencies and jurisdictions. Additionally, different terms may represent different levels of travel risk depending on the agency and location. "The main goal of our study is to develop and propose a new approach to road surface condition classification that provides consistency in the communication of the driving risk that a motorist may experience," says Dr. Lalita Thakali, Research Associate at the University of Waterloo. In this study, researchers from the Department of Civil & Environmental Engineering at the University of Waterloo propose a risk-based approach for classifying road surface conditions that could be used for monitoring winter driving conditions and directing winter road maintenance operations. The researchers propose a relative risk index on the basis of the risk estimated using a collision model calibrated using detailed hourly data of weather, road surface conditions, traffic and accidents on a large number of highway sections in Ontario over six winter seasons.
The study proposes two alternative approaches to address the challenge of determining the overall condition of a highway section or route with non-uniform driving conditions. The first approach applies a risk model to estimate the relative increase in risk under specific winter weather and road surface conditions compared to normal conditions. The second approach involves converting the different classes of road conditions observed on any given route into a single dominant class based on the relative risk between individual classes of road conditions. This could help drivers assess the road conditions of their entire trip or route. "An ideal classification system for the public should be one that is simple, intuitive, and consistent," continues Dr. Thakali. The risk-based approach to road condition classification introduced in this research moves one step closer to such an ideal classification system. Further research could look into the feasibility of developing a universal risk index that is applicable across different regions in Canada. The paper, "A risk-based approach to winter road surface condition classification" by Liping Fu, Lalita Thakali, Tae J. Kwon and Taimur Usman was published today in the Canadian Journal of Civil Engineering.
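The second, dominant-class approach lends itself to a simple sketch. The condition classes and relative-risk weights below are invented for illustration; in the study, relative risk comes from a collision model calibrated on six winters of hourly Ontario weather, traffic, and accident data, not from assumed constants.

```python
# Hedged sketch (not the paper's calibrated model): collapse the mixed
# conditions along a route into one dominant class using hypothetical
# relative-risk weights per road surface condition class.

# Hypothetical relative-risk indices (bare pavement = 1.0 baseline).
RELATIVE_RISK = {"bare": 1.0, "partly_covered": 1.8, "snow_covered": 2.6, "icy": 4.0}

def dominant_class(segments):
    """segments: list of (length_km, condition_class) tuples.
    Returns the class with the largest risk-weighted exposure."""
    exposure = {}
    for length_km, cond in segments:
        exposure[cond] = exposure.get(cond, 0.0) + length_km * RELATIVE_RISK[cond]
    return max(exposure, key=exposure.get)

# A route that is mostly bare but has icy and snow-covered stretches:
route = [(12.0, "bare"), (3.0, "icy"), (6.0, "snow_covered")]
print(dominant_class(route))  # -> snow_covered (15.6 risk-km beats 12.0 for each other class)
```

Weighting by length times relative risk, rather than by length alone, is what lets a short icy stretch outrank a long bare one in the reported class.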
News Article | February 15, 2017
Food and water are two necessities for survival, but what happens when a changing climate in key agricultural regions threatens crop production? Or when the quality of milk cannot be ensured as it is exchanged between producer and seller? Seven MIT graduate students studying food and water security issues presented their research and preliminary findings on issues such as these during the MIT Water and Food Security Student Symposium held on Nov. 21. Hosted by the MIT Department of Civil and Environmental Engineering (CEE) and the MIT Abdul Latif Jameel World Water and Food Security Lab (J-WAFS), the event brought together professors and students to discuss food and water challenges and opportunities to address them through research. Chandra Madramootoo, CEE visiting professor and J-WAFS visiting scholar, curated the event and spoke briefly of the importance of water and food security. “The withdrawal of water varies in different parts of the world. Much larger amounts of water are withdrawn for agriculture compared to industry and domestic uses in South Asia, the Middle East, North Africa, Sub-Saharan Africa, Latin America, and East Asia Pacific. This puts stresses on water resources available for food production and our ability to achieve food security. It also puts stress on a very finite resource amongst the other economic and environmental sectors that are competing for that water supply,” Madramootoo highlighted in his opening remarks. During the event, each student presenter was tasked with conveying a broad overview of his or her research in a five-minute presentation. There was a wide range of topics, but each student sought to solve a problem relating to food or water. Presentations ranged from understanding the impact of environmental threats to agriculture to developing improved irrigation technology to help smallholder farmers around the world.
Understanding food and water through an environmental lens
Paige Midstokke, a master’s candidate in CEE and a Tata Fellow in Technology and Policy, kicked off the student component of the symposium. Through her research, Midstokke is seeking to improve the drought planning process by researching water security for the state of Maharashtra in India. Since most water planning and management is done at the district level, Midstokke is conducting a case study focusing on the Aurangabad district. For her thesis, Midstokke is developing a “vulnerability-scarcity index” that integrates socioeconomic data, pumping rates from observation wells, and geographic information system data. “If you work with the government of India, they have a ton of data and they are often willing to give it to you; you get to decide how to put it all together,” she said. Her index is intended to improve early indicators of drought and thus improve water scarcity planning at the district and local levels. Anjuli Jain Figueroa, a PhD candidate in CEE, introduced her research by asking whether “sustainable agriculture” is an oxymoron. She noted that historically, increased food production has had negative environmental impacts. Her research, entitled “Sustainable agriculture – quantifying the trade-offs between food security and environmental impacts,” uses case studies from India to describe this trend. During her presentation, Jain Figueroa provided an example of a trade-off she is studying, wherein profit increased for farmers, but there was an unforeseen negative impact. “We see this in a shift that happened with rice. India is growing a lot of rice but it has come at a cost; the cost was nutritional value. Even though families are making more money, the nutritional value for that household decreased. That’s one of the unintended consequences we’re only realizing now,” Jain Figueroa said.
She is now working to solve problems like these in her thesis by using a systems approach to study how farmers could increase crop production to meet 2050 population needs, while limiting the negative environmental impacts. Luke Schiferl, a PhD candidate in CEE, added a global perspective to food and water security. Schiferl looks at how air quality affects crop productivity on a global scale and how these effects can be quantified. In his research project “Contrasting particulate matter and ozone effects on crop production,” he uses crop production simulations and chemistry models to quantify the offsetting effects of ozone and particulate matter on crop productivity. This research can suggest how crop production losses can be properly mitigated by air quality improvements once the effects are understood. “We can relate ozone and particulate matter effects with known relationships to crop production and basically plot the different effects,” he said. In the future, Schiferl hopes to extend this research to simulate how water and nutrient restrictions alter these air quality effects and to predict crop productivity.
Creating tools and technology to solve food and water security issues
Four student contributors to the symposium presented on technological solutions to various environmental issues. Many of these innovative tools are already in use and tested, ready to make positive changes for food and water security around the world. Kevin Patrick Simon, a PhD candidate in mechanical engineering, lent the energy perspective to the water and agriculture discussion. Energy is needed to access water sources with pumps, but energy supply from electricity grids and water sources are distributed unequally across India. In rural India, small farmers still use diesel to run their water pumps. Unfortunately, the cost of diesel discourages year-round cultivation in favor of lower-paying jobs during the winter and dry season.
Simon is seeking a solution with his research on “High efficiency, low-cost positive displacement pumps for solar irrigation.” The central question of Simon’s research — “How can we enable people to have better access to water in order to irrigate their land?” — is addressed in part by solar irrigation. He explained the numerous benefits of solar irrigation, including its potential for cost savings, independence from the electricity grid and environmental sustainability. “It gives small farmers an unprecedented amount of independence and an ability to draw income from their land,” he said. Pulkit Shamshery, also a PhD candidate in mechanical engineering, is taking a different approach to irrigation by making drip irrigation systems more energy-efficient and more accessible to small farmers. Shamshery noted that 15 percent of India’s food production is dependent on over-exploited water resources; “there is an extreme need for more food with less water, and that’s the motivation for drip irrigation.” Advantages of drip irrigation are increased water savings and higher yields. Additionally, farmers can use less fertilizer by only applying it where it is needed. However, drip irrigation is typically a costly endeavor, which led Shamshery to his project “Low cost, energy-efficient drip irrigation system.” Along with a research team, Shamshery looked at where the most pressure was being lost in irrigation systems when using them to retrieve water from surface sources, and then figured out how to reduce that pressure loss. The team created an off-grid product to make irrigation more efficient and available at a lower cost. Shamshery’s component is patent-pending and will be licensed, and the reduced pressure loss cuts the cost of an off-grid drip system for an acre of land by 50 percent.
“In the United States you can buy milk off the shelf and not worry about any contamination, but that’s not true for developing countries,” said Pranay Jain, Legatum Fellow and PhD candidate in mechanical engineering. Jain highlighted the public health and economic consequences of milk contamination in India, noting that when people don’t trust the quality of milk that is sold to them, the milk industry suffers. Jain’s research project, “Milk quality analysis for villages in India,” addresses the question: “If milk changes hands so many times before it reaches the consumer, how can these buyers and sellers trust each other?” Jain’s solution was to create an affordable, portable instrument to quickly and accurately determine the quality of milk as it moves through the supply chain from farm to grocery store, and to help improve payment mechanisms. By testing the quality of milk on the spot, “the supply chain simplifies; more farmers opt to sell their milk to processing plants, these plants can get more milk, they will pay the farmer more accurately and both sides benefit.” The device is powered by mobile phones and collects data that is saved in the cloud, allowing researchers to observe trends and farmers to monitor the health of their livestock. In India, approximately one out of every five crates of produce is lost due to spoilage, but Kendall Nowocin, Legatum Fellow and PhD candidate in electrical engineering and computer science, is tackling that problem with CoolCrop, a storage apparatus for small farmers. The storage unit is about the size of a walk-in closet and extends the freshness of produce. CoolCrop is already being piloted through cooperatives involving non-governmental organizations, and its use is augmented with market analytics. Following supply and demand, farmers get the most value for their crops if they are in markets with fewer competitors.
CoolCrop fills this void by providing market analytics to small farmers, so they know which markets will increase their profit. Explaining their business model, he said that they “can extract an initial profit that pays for the cold storage and increases the value for the farmer.” CEE Professor Dennis McLaughlin noted in the closing remarks that although most of the research shared at the symposium was centered on India, other developing areas around the world are dealing with similar issues. The MIT Water and Food Security Student Symposium was the final component of a seminar series hosted by Chandra Madramootoo. Commenting on the importance of water security, he said “I think it’s important when we think about water in the broader context, to think about the competition that’s placed between agricultural water and other sources of water. It’s this competing pressure for resources that we need to think about.”
News Article | February 15, 2017
On Dec. 11, 2014, a freight train of a storm steamed through much of California, deluging the San Francisco Bay Area with three inches of rain in just one hour. The storm was fueled by what meteorologists refer to as the “Pineapple Express” — an atmospheric river of moisture that is whipped up over the Pacific’s tropical waters and swept north with the jet stream. By evening, record rainfall had set off mudslides, floods, and power outages across the state. The storm, which has been called California’s “storm of the decade,” is among the state’s most extreme precipitation events in recent history. Now MIT scientists have found that such extreme precipitation events in California should become more frequent as the Earth’s climate warms over this century. The researchers developed a new technique that predicts the frequency of local, extreme rainfall events by identifying telltale large-scale patterns in atmospheric data. For California, they calculated that, if the world’s average temperatures rise by 4 degrees Celsius by the year 2100, the state will experience three more extreme precipitation events than the current average, per year. The researchers, who have published their results in the Journal of Climate, say their technique significantly reduces the uncertainty of extreme storm predictions made by standard climate models. “One of the struggles is, coarse climate models produce a wide range of outcomes. [Rainfall] can increase or decrease,” says Adam Schlosser, senior research scientist in MIT’s Joint Program on the Science and Policy of Global Change. “What our method tells you is, for California, we’re very confident that [heavy precipitation] will increase by the end of the century.” The research was led by Xiang Gao, a research scientist in the Joint Program on the Science and Policy of Global Change. 
The paper’s co-authors include Paul O’Gorman, associate professor of earth, atmospheric, and planetary sciences; Erwan Monier, principal research scientist in the Joint Program; and Dara Entekhabi, the Bacardi and Stockholm Water Foundations Professor of Civil and Environmental Engineering. Currently, researchers estimate the frequency of local heavy precipitation events mainly by using precipitation information simulated from global climate models. But such models typically carry out complex computations to simulate climate processes across hundreds and even thousands of kilometers. At such coarse resolution, it’s extremely difficult for such models to adequately represent small-scale features such as moisture convection and topography, which are essential to making accurate predictions of precipitation. To get a better picture of how future precipitation events might change region by region, Gao decided to focus not on simulated precipitation but on large-scale atmospheric patterns, which climate models are able to simulate much more reliably. “We’ve actually found there’s a connection between what climate models do really well, which is to simulate large-scale motions of the atmosphere, and local, heavy precipitation events,” Schlosser says. “We can use this association to tell how frequently these events are occurring now, and how they will change locally, like in New England, or the West Coast.” While definitions vary for what is considered an extreme precipitation event, in this case the researchers defined such an event as being within the top 5 percent of a region’s precipitation amounts in a particular season, over periods of almost three decades. They focused their analysis on two areas: California and the Midwest, regions that generally experience relatively high amounts of precipitation in the winter and summer, respectively.
For both regions, the team analyzed large-scale atmospheric features such as wind currents and moisture content, from 1979 to 2005, and noted their patterns each day that extreme precipitation occurred. Using statistical analysis, the researchers identified telltale patterns in the atmospheric data that were associated with heavy storms. “We essentially take snapshots of all the relevant weather information, and we find a common picture, which is used as our red flag,” Schlosser explains. “When we examine historical simulations from a suite of state-of-the-art climate models, we peg every time we see that pattern.” Using the new scheme, the team was able to collectively reproduce the frequency of extreme events observed over the 27-year period. More importantly, the results are much more accurate than those based on simulated precipitation from the same climate models. “None of the models are even close to the observations,” Gao says. “And regardless of the combination of atmospheric variables we used, the new schemes were much closer to observations.” Bolstered by their results, the team applied their technique to large-scale atmospheric patterns from climate models to predict how the frequency of heavy storms may change in a warming climate in California and the Midwest over the next century. They analyzed each region under two climate scenarios: a “business as usual” case, in which the world is projected to warm by 4 degrees Celsius by 2100, and a policy-driven case, in which global environmental policies that regulate greenhouse gases would keep the temperature increase to 2 degrees Celsius. For each scenario, the team flagged those modeled large-scale atmospheric patterns that they had determined to be associated with heavy storms. In the Midwest, yearly instances of summer extreme precipitation decreased slightly under both warming scenarios, although the researchers say the results are not without uncertainty.
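The top-5-percent event definition used above can be illustrated with a self-contained sketch. This is only the thresholding step, not the team's large-scale pattern scheme, and the rainfall record below is synthetic; the study works from roughly 27 winters of observations per region.

```python
# Rough sketch of the event definition: flag days whose precipitation
# falls in the top 5 percent of a multi-winter record. The rainfall
# values are randomly generated stand-ins, not observations.
import random

random.seed(0)
# 27 winters x 90 days of synthetic daily precipitation (mm)
daily_precip_mm = [random.expovariate(1 / 4.0) for _ in range(27 * 90)]

# Top-5% cutoff: the value below which 95 percent of days fall
ranked = sorted(daily_precip_mm)
threshold = ranked[int(0.95 * len(ranked))]

extreme = [p for p in daily_precip_mm if p > threshold]
print(f"threshold = {threshold:.1f} mm; "
      f"about {len(extreme) / 27:.1f} extreme days per winter")
```

By construction, about 5 percent of days clear the cutoff, so a change in how often the associated large-scale patterns appear translates directly into a change in the count of such flagged days per season.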
For California, the picture is much clearer: Under the more intense scenario of global warming, the state will experience three more extreme precipitation events per year, on the order of the December 2014 storm. Under the policy-driven scenario, Schlosser says “that trend is cut in half.” The team is now applying its technique to predict changes in heat waves from a globally warming climate. The researchers are looking for patterns in atmospheric data that correlate with past heat waves. If they can more reliably predict the frequency of heat waves in the future, Schlosser says that can be extremely helpful for the long-term maintenance of power grids and transformers. This research was supported, in part, by the National Science Foundation, NASA, and the Department of Energy.