News Article | May 22, 2017
Site: www.eurekalert.org

Contrary to posters you may have seen hanging on the walls in science buildings and classrooms, Lijun Liu, professor of geology at Illinois, knows that Earth's interior is not like an onion. While most textbooks depict the Earth's outer surface as the crust, the next layer down as the mantle, and the innermost layer as the core, Liu said the reality isn't as clear-cut. "It's not just in layers, because the Earth's interior is not stationary," Liu said. In fact, underneath our feet there's tectonic activity that many scientists have been aware of, but Liu and his team have created a computer model to help better explain it -- a model so effective that researchers believe it has the potential to predict where earthquakes and volcanoes will occur. Using this model, Liu, along with doctoral student Jiashun Hu and Manuele Faccenda from the University of Padua in Italy, recently published a research paper in the journal Earth and Planetary Science Letters that focuses on the deep mantle and its relationship to plate tectonics. "It's well-known that there are plate tectonics driving the Earth's evolution, but exactly how this process works is not entirely clear," he said. Liu and Hu looked specifically at the continent of South America to determine which tectonic factors contribute to the deformation, or the evolution, of the mantle. To answer this question, the team created a data-centric model using the Blue Waters supercomputer at the National Center for Supercomputing Applications at Illinois. The sophisticated four-dimensional data-oriented geodynamic models are among the first of their kind. "We are actually the first ones to use data assimilation models in studying mantle deformation, in an approach similar to weather forecasting," Liu said. "We are trying to produce a system model that simultaneously satisfies all the observations we have. We can then obtain a better understanding about dynamic processes of the Earth evolution." While there is much debate about how the Earth's internal evolution is driven, the model created by the team seems to provide an answer that better fits available observations and underlying physics. The team found that the subducting slab -- a portion of the oceanic plate that slides beneath a continental plate -- is the dominant driving force behind the deformation of the mantle. Essentially, the active subduction of the slab determines most other processes that happen as part of a chain reaction. "The result is game-changing. The driving force of mantle flow is actually simpler than people thought," Liu said. "It is the most direct consequence of plate tectonics. When the slab subducts, it naturally controls everything surrounding it. In a way this is elegant, because it's simple." By understanding this mechanism of Earth evolution, the team can make better predictions regarding the movement of the mantle and the lithosphere, or crust. The team then evaluated the model's predictions using other data. Hu, the lead author on the paper, said that by comparing the predictions to tectonic activities such as the formation of mountains and volcanoes, a clear consistency emerged. "We think our story is correct," Hu said. Consequently, the model also provides interesting insight into the evolution of continents as far back as the Jurassic, when dinosaurs roamed the Earth on Pangaea, the only continent at the time. This is still part of the team's ongoing research.
Liu said that in a separate paper that uses the same simulation, published by Liu and Hu in Earth and Planetary Science Letters in 2016, the model provided an accurate prediction for why earthquakes happen in particular locations below South America. He explained that earthquakes aren't evenly spread within the subducting slab, meaning there are potentially areas where an earthquake is more or less likely to take place. "We found that whenever you see a lack of earthquakes in a region, it corresponds to a hole in the slab," Liu said. "Because of the missing slab in the hole, there's no way to generate earthquakes, so we might be able to know where more earthquakes will take place." The model also explained why certain volcanoes might exist further inland and have different compositions, despite the common thought that volcanoes should exist solely along the coast, as a result of water coming off the down-going slab. As the model helps explain, a volcano can form inland if the slab subducts at a shallower angle, and a hole in the shallow slab allows for a special type of magma to form by melting of the crust. "Ultimately this model will provide a promising way of solving the question of how and why continents move the way they do," Liu said. "The answer should depend on what the mantle is doing. This is a way to much better understand Earth evolution." The team is currently expanding the model to analyze the entire globe. "We are looking forward to more exciting results," Liu said.
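
Liu's comparison to weather forecasting refers to data assimilation: several candidate forward models are run, each is scored against the same set of observations, and the candidate that best satisfies all of the data at once is retained. The sketch below illustrates only that selection step and is not the team's Blue Waters workflow; the observation values, candidate names and root-mean-square misfit measure are all invented for demonstration.

```python
import numpy as np

# Hypothetical observations (for example, seismic-anisotropy fast-axis
# azimuths at a few stations, in degrees). The numbers are invented.
observed = np.array([45.0, 52.0, 60.0, 71.0, 80.0])

# Each candidate forward model predicts the same quantities. In a real
# data-assimilation study these would come from full 4-D geodynamic
# simulations; here they are simply made-up vectors.
candidate_models = {
    "slab_driven":       np.array([44.0, 50.0, 63.0, 70.0, 78.0]),
    "slab_plus_plume":   np.array([30.0, 41.0, 55.0, 90.0, 95.0]),
    "stationary_layers": np.array([0.0, 0.0, 0.0, 0.0, 0.0]),
}

def misfit(predicted, observed):
    """Root-mean-square difference between predictions and observations."""
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

# Score every candidate and keep the one that best satisfies the observations,
# loosely mirroring the idea of a model that "simultaneously satisfies all
# the observations we have."
scores = {name: misfit(pred, observed) for name, pred in candidate_models.items()}
best = min(scores, key=scores.get)

for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name:18s} RMS misfit = {score:6.2f} deg")
print(f"Best-fitting candidate: {best}")
```

In an actual study the candidates are full simulations constrained by plate reconstructions and seismic images, and the comparison spans many independent data sets rather than one short vector, but the selection logic is the same in spirit.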


News Article | May 16, 2017
Site: www.businesswire.com

RESEARCH TRIANGLE PARK, N.C.--(BUSINESS WIRE)--Syngenta welcomed visitors to its new Digital Innovation Lab at the University of Illinois Research Park today during a grand opening event and forum. Faculty members, students, industry leaders, government officials, Syngenta customers and company leaders toured the lab and exchanged insights regarding the future of digital innovation in agriculture. Bringing together the resources of a global agribusiness, the intellectual capital of a major research institution and the business mentality of a tech start-up, the Syngenta Digital Innovation Lab at the University of Illinois Research Park represents the first of three planned sites in Syngenta’s global digital innovation lab network. “We are excited to embark on this chapter in our digital ag journey,” said Dan Burdett, global head of digital agriculture at Syngenta. “The Syngenta Digital Innovation Lab supports an excellent opportunity to collaborate with dynamic students and researchers from a diverse cross-section of disciplines.” The Syngenta Digital Innovation Lab provides a forum for students in computer science, biology, physical sciences and other related subjects to apply their specialized expertise—and cutting-edge technology—to address some of agriculture’s most pressing challenges. Data analytics, cloud technology, mobile applications and smart farm technology are among the platforms students will use to explore and test disruptive ideas in pursuit of solutions to complex issues. Establishing the Syngenta Digital Innovation Lab builds on an existing industry relationship with the National Center for Supercomputing Applications (NCSA) at the University of Illinois, a collaboration that has yielded notable success for the Syngenta plant breeding program. “The NCSA’s specialized expertise in high-performance modeling, simulation and Big Data has helped deliver on some of our key business challenges,” said Bill Danker, domain head of seeds research and breeding at Syngenta. “For example, collaborating with the NCSA has resulted in the development of an improved predictive algorithm to compute genetic allele data sets with greater scope and speed.” Syngenta recently appointed Brandon Dohman as full-time site director of the Syngenta Digital Innovation Lab, with plans to recruit additional professional staff members and student talent. Syngenta has been a leader in digital agriculture since the company’s introduction in 2000 of AgriEdge Excelsior®, a whole-farm management program that helps farmers use their crop inputs, land, water and resources more efficiently. Most recently, Syngenta has aggressively leveraged data analytics in its plant breeding program, which has resulted in improved seed varieties and recognition from experts within the advanced mathematics community. With digital innovation and data integration playing an increasingly pronounced role in bringing successful ag technologies and solutions to market, the Syngenta Digital Innovation Lab is poised to help accelerate the development of these tools, ultimately helping farmers grow more from less. Making crops more efficient is among the six commitments set forth in The Good Growth Plan, the company’s global strategy to sustainably feed a growing population. About the Research Park at the University of Illinois The Research Park at the University of Illinois at Urbana-Champaign is a technology hub for startup companies and corporate research and development operations. 
Within the Research Park there are more than 100 companies employing students and full-time technology professionals. More information at www.researchpark.illinois.edu. About Syngenta Syngenta is a leading agriculture company helping to improve global food security by enabling millions of farmers to make better use of available resources. Through world class science and innovative crop solutions, our 28,000 people in over 90 countries are working to transform how crops are grown. We are committed to rescuing land from degradation, enhancing biodiversity and revitalizing rural communities. To learn more, visit www.syngenta.com and www.goodgrowthplan.com. Follow us on Twitter at www.twitter.com/Syngenta and www.twitter.com/SyngentaUS. Cautionary Statement Regarding Forward-Looking Statements This document contains forward-looking statements, which can be identified by terminology such as ‘expect,’ ‘would,’ ‘will,’ ‘potential,’ ‘plans,’ ‘prospects,’ ‘estimated,’ ‘aiming,’ ‘on track’ and similar expressions. Such statements may be subject to risks and uncertainties that could cause the actual results to differ materially from these statements. We refer you to Syngenta's publicly available filings with the U.S. Securities and Exchange Commission for information about these and other risks and uncertainties. Syngenta assumes no obligation to update forward-looking statements to reflect actual results, changed assumptions or other factors. This document does not constitute, or form part of, any offer or invitation to sell or issue, or any solicitation of any offer, to purchase or subscribe for any ordinary shares in Syngenta AG, or Syngenta ADSs, nor shall it form the basis of, or be relied on in connection with, any contract thereof. ©2017 Syngenta. 9 Davis Drive, Research Triangle Park, NC 27709. AgriEdge Excelsior® and the Syngenta logo are trademarks of a Syngenta Group Company. All other trademarks are the property of their respective owners.


News Article | November 23, 2016
Site: www.businesswire.com

SALT LAKE CITY--(BUSINESS WIRE)--SC16, the 28th annual international conference of high performance computing, networking, storage and analysis, celebrated the contributions of researchers and scientists - from those just starting their careers to those whose contributions have made lasting impacts. The conference drew more than 11,100 registered attendees and featured a technical program spanning six days. The exhibit hall featured 349 exhibitors from industry, academia and research organizations from around the world. “There has never been a more important time for high performance computing, networking and data analysis,” said SC16 General Chair John West from the Texas Advanced Computing Center. “But it is also an acute time for growing our workforce and expanding diversity in the industry. SC16 was the perfect blend of research, technological advancement, career recognition and improving the ways in which we attract and retain that next generation of scientists.” According to Trey Breckenridge, SC16 Exhibits Chair from Mississippi State University, the SC16 Exhibition was the largest in the history of the conference. The overall size of the exhibition was 150,000 net square feet (breaking the 2015 record of 141,430). The 349 industry and research-focused exhibits included 44 first-timers and 120 organizations from 25 countries outside the United States. During the conference, Salt Lake City also became the hub for the world’s fastest computer network: SCinet, SC16’s custom-built network which delivered 3.15 terabits per second in bandwidth. The network featured 56 miles of fiber deployed throughout the convention center and $32 million in loaned equipment. It was all made possible by 200 volunteers representing global organizations spanning academia, government and industry. For the third year, SC featured an opening “HPC Matters” plenary that this year focused on Precision Medicine, which examined what the future holds in this regard and how advances are only possible through the power of high performance computing and big data. Leading voices from the frontlines of clinical care, medical research, HPC system evolution, pharmaceutical R&D and public policy shared diverse perspectives on the future of precision medicine and how it will impact society. The Technical Program again offered the highest quality original HPC research. The SC workshops set a record with more than 2,500 attendees. There were 14 Best Paper Finalists and six Gordon Bell Finalists. These submissions represent the best of the best in a wide variety of research topics in HPC. “These awards are very important for the SC Conference Series. They celebrate the best and the brightest in high performance computing,” said Satoshi Matsuoka, SC16 Awards Chair from Tokyo Institute of Technology. “These awards are not just plaques or certificates. They define excellence. They set the bar for the years to come and are powerful inspiration for both early career and senior researchers.” Following is the list of Technical Program awards presented at SC16: SC16 received 442 paper submissions, of which 81 were accepted (18.3 percent acceptance rate). Of those, 13 were selected as finalists for the Best Paper (six) and Best Student Paper (seven) awards. The Best Paper Award went to “Daino: A High-Level Framework for Parallel and Efficient AMR on GPUs” by Mohamed Wahib Attia and Naoya Maruyama, RIKEN; and Takayuki Aoki, Tokyo Institute of Technology. 
The Best Student Paper Award went to “Flexfly: Enabling a Reconfigurable Dragonfly Through Silicon Photonics” by Ke Wen, Payman Samadi, Sebastien Rumley, Christine P. Chen, Yiwen Shen, Meisam Bahadori, and Karen Bergman, Columbia University; and Jeremiah Wilke, Sandia National Laboratories. The ACM Gordon Bell Prize is awarded for outstanding team achievement in high performance computing and tracks the progress of parallel computing. This year, the prize was awarded to a 12-member Chinese team for their research project, “10M-Core Scalable Fully-Implicit Solver for Nonhydrostatic Atmospheric Dynamics.” The winning team presented a solver (a calculation method) for atmospheric dynamics. In the abstract of their presentation, the winning team writes, “On the road to the seamless weather-climate prediction, a major obstacle is the difficulty of dealing with various spatial and temporal scales. The atmosphere contains time-dependent multi-scale dynamics that support a variety of wave motions.” To simulate the vast number of variables inherent in a weather system developing in the atmosphere, the winning group presented a highly scalable, fully implicit solver for three-dimensional nonhydrostatic atmospheric simulations governed by fully compressible Euler equations. The Euler equations are a set of equations frequently used to describe fluid dynamics (liquids and gases in motion). Winning team members are Chao Yang, Chinese Academy of Sciences; Wei Xue, Weimin Zheng, Guangwen Yang, Ping Xu, and Haohuan Fu, Tsinghua University; Hongtao You, National Research Center of Parallel Computer Engineering and Technology; Xinliang Wang, Beijing Normal University; Yulong Ao and Fangfang Liu, Chinese Academy of Sciences; Lin Gan, Tsinghua University; Lanning Wang, Beijing Normal University. This year, SC received 172 detailed poster submissions that went through a rigorous review process. In the end, 112 posters were accepted and five finalists were selected for the Best Poster Award. As part of its research poster activities, SC16 also hosted the ACM Student Research Competition for both undergraduate and graduate students. In all, 63 submissions were received and 26 Student Research Competition posters were accepted: 14 in the graduate category and 12 in the undergraduate category. The Best Poster Award went to “A Fast Implicit Solver with Low Memory Footprint and High Scalability for Comprehensive Earthquake Simulation System” with Kohei Fujita from RIKEN as the lead author. In the undergraduate category, First Place: “Touring Dataland? Automated Recommendations for the Big Data Traveler” by Willian Agnew and Michael Fischer, Advisors: Kyle Chard and Ian Foster. Second Place: “Analysis of Variable Selection Methods on Scientific Cluster Measurement Data” by Jonathan Wang, Advisors: Wucherl Yoo and Alex Sim. Third Place: “Discovering Energy Usage Patterns on Scientific Clusters” by Matthew Bae, Advisors: Wucherl Yoo, Alex Sim and Kesheng Wu. In the graduate category, First Place: “Job Startup at Exascale: Challenges and Solutions” by Sourav Chakroborty, Advisor: Dhabaleswar K. Panda. Second Place: “Performance Modeling and Engineering with Kerncraft” by Julian Hammer, Advisors: Georg Hager and Gerhard Wellein. Third Place: “Design and Evaluation of Topology-Aware Scatter and AllGather Algorithms for Dragonfly Networks” by Nathanael Cheriere, Advisor: Matthieu Dorier. The Scientific Visualization and Data Analytics Award featured six finalists. The award went to “Visualization and Analysis of Threats from Asteroid Ocean Impacts” with John Patchett as the lead author.
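
For readers unfamiliar with the terminology in the Gordon Bell abstract above, one standard textbook form of the fully compressible Euler equations (conservation of mass, momentum and energy for an inviscid atmosphere, with gravity as the only body force) is shown below; the winning paper's exact formulation, variables and discretization may differ.

```latex
\begin{aligned}
\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}) &= 0,\\
\frac{\partial (\rho\,\mathbf{u})}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}\otimes\mathbf{u}) + \nabla p &= -\rho g\,\hat{\mathbf{z}},\\
\frac{\partial E}{\partial t} + \nabla\cdot\bigl[(E + p)\,\mathbf{u}\bigr] &= -\rho g\,w.
\end{aligned}
```

Here \(\rho\) is density, \(\mathbf{u}\) is velocity (with vertical component \(w\)), \(p\) is pressure, \(E\) is total energy density and \(g\) is gravitational acceleration, with an equation of state closing the system. Because a nonhydrostatic model retains fast acoustic and gravity waves, explicit time stepping would be limited to very small steps; treating the system implicitly removes that restriction, which is the motivation for a scalable fully implicit solver.
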
The Student Cluster Competition returned for its 10th year. The competition, which debuted at SC07 in Reno and has since been replicated in Europe, Asia and Africa, is a real-time, non-stop, 48-hour challenge in which teams of six undergraduates assemble a small cluster at SC16 and race to complete a real-world workload across a series of scientific applications, demonstrate knowledge of system architecture and application performance, and impress HPC industry judges. The students partner with vendors to design and build a cutting-edge cluster from commercially available components, not to exceed a 3120-watt power limit, and work with application experts to tune and run the competition codes. For the first time ever, the team that won top honors also won the award for achieving highest performance for the Linpack benchmark application. The team “SwanGeese” is from the University of Science and Technology of China. In traditional Chinese culture, the rare Swan Goose stands for teamwork, perseverance and bravery. This is the university’s third appearance in the competition. Also, an ACM SIGHPC Certificate of Appreciation was presented to the authors of a recent SC paper selected for the SC16 Student Cluster Competition Reproducibility Initiative. The selected paper was “A Parallel Connectivity Algorithm for de Bruijn Graphs in Metagenomic Applications” by Patrick Flick, Chirag Jain, Tony Pan and Srinivas Aluru from Georgia Institute of Technology. The George Michael Memorial HPC Fellowship honors exceptional Ph.D. students. The first recipient is Johann Rudi from the Institute for Computational Engineering and Sciences at the University of Texas at Austin for his project, “Extreme-Scale Implicit Solver for Nonlinear, Multiscale, and Heterogeneous Stokes Flow in the Earth’s Mantle.” The second recipient is Axel Huebl from Helmholtz-Zentrum Dresden-Rossendorf at the Technical University of Dresden for his project, “Scalable, Many-core Particle-in-cell Algorithms to Simulate Next Generation Particle Accelerators and Corresponding Large-scale Data Analytics.” The SC Conference Series also serves as the venue for recognizing leaders in the HPC community for their contributions during their careers. Here are the career awards presented at SC16: The IEEE-CS Seymour Cray Computer Engineering Award recognizes innovative contributions to high performance computing systems that best exemplify the creative spirit demonstrated by Seymour Cray. The 2016 IEEE-CS Seymour Cray Computer Engineering Award was presented to William J. Camp of Los Alamos National Laboratory “for visionary leadership of the Red Storm project, and for decades of leadership of the HPC community.” Camp previously served as Intel’s Chief Supercomputing Architect and directed Intel’s Exascale R&D efforts. Established in memory of Ken Kennedy, the founder of Rice University's nationally ranked computer science program and one of the world's foremost experts on high-performance computing, the ACM/IEEE-CS Ken Kennedy Award recognizes outstanding contributions to programmability or productivity in high-performance computing together with significant community service or mentoring contributions. The 2016 Ken Kennedy Award was presented to William D. Gropp “for highly influential contributions to the programmability of high-performance parallel and distributed computers, and extraordinary service to the profession.” Gropp is the acting director of the National Center for Supercomputing Applications, director of the Parallel Computing Institute, and Thomas M. Siebel Chair in Computer Science at the University of Illinois Urbana-Champaign. The IEEE-CS Sidney Fernbach Memorial Award is awarded for outstanding contributions in the application of high performance computers using innovative approaches. The 2016 IEEE-CS Sidney Fernbach Memorial Award was presented to Vipin Kumar “for foundational work on understanding scalability, and highly scalable algorithms for graph partitioning, sparse linear systems and data mining.” Kumar is a Regents Professor at the University of Minnesota. The Supercomputing Conference Test of Time Award recognizes an outstanding paper that has appeared at the SC conference and has deeply influenced the HPC discipline. It is a mark of historical impact and recognition that the paper has changed HPC trends. The winning paper is “Automatically Tuned Linear Algebra Software” by Clint Whaley from the University of Tennessee and Jack Dongarra from the University of Tennessee and Oak Ridge National Laboratory. The IEEE TCHPC Award for Excellence in Scalable Computing for Early Career Researchers recognizes individuals who have made outstanding and potentially long-lasting contributions to the field within five years of receiving their Ph.D. The 2016 awards were presented to Kyle Chard, Computation Institute, University of Chicago and Argonne National Laboratory; Sunita Chandrasekaran, University of Delaware; and Seyong Lee, Oak Ridge National Laboratory. SC17 will be held November 12-17, 2017, in Denver, Colorado. For more details, go to http://sc17.supercomputing.org/. SC16, sponsored by the IEEE Computer Society and ACM (Association for Computing Machinery), offers a complete technical education program and exhibition to showcase the many ways high performance computing, networking, storage and analysis lead to advances in scientific discovery, research, education and commerce. This premier international conference includes a globally attended technical program, workshops, tutorials, a world-class exhibit area, demonstrations and opportunities for hands-on learning. For more information on SC16, visit: http://sc16.supercomputing.org.


News Article | February 23, 2017
Site: www.businesswire.com

CHAMPAIGN, Ill.--(BUSINESS WIRE)--Syngenta today announced it has established a Digital Innovation Lab at the University of Illinois Research Park where it will employ four full-time employees as well as University of Illinois at Urbana-Champaign student talent to help solve agricultural challenges. Projects at the Digital Innovation Lab will employ “outside the box” thinking with access to tools, technologies, partnerships and resources that enable the research, investigation and delivery of new and novel solutions for seeds product development using data analytics. “Innovation in agriculture is the lifeblood of the work we do at Syngenta. Our goal is to bring new talent to solve difficult challenges, with a focus on seed innovation at the Research Park,” said Bill Danker, Syngenta Domain Head, Seeds Research and Breeding. The center will focus on digital data innovation and strategy, providing Syngenta with agile capabilities to enable the company to accelerate the pace of its digital journey. It will foster new ways to gain insights and make decisions from the company’s data assets. The center will develop capabilities in breeding engineering, digital agriculture, information technology, application development and big data. In conjunction with the opening of the new office, Syngenta has an industry partnership with the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. This partnership enables Syngenta to leverage the NCSA’s unique expertise in high-performance modeling, simulation and big data to gain a competitive edge. “The opening of the Syngenta Innovation Center at Research Park is a welcome new addition that continues to strengthen the relationship between Syngenta and the University of Illinois at Urbana-Champaign,” said Chancellor Robert J. Jones. “The opportunity for our students to be active participants in finding the solutions to increasingly complex societal challenges brought about by an expanding global population adds enormous value to their educational experience. The new center really capitalizes on the internationally recognized expertise in High Performance Computing (HPC), data sciences, and agriculture science at Illinois in ways to enhance Syngenta’s drive for innovation.” The new Research Park operation is located on campus to better connect industry with researchers and students. The center will start with a full-time site director, recruit professional staff, and employ students, who will work on developing projects using technologies such as smart farming, mobile applications, cloud services, and big data. “Digital innovation and integrating our data for greater insight is becoming a critical part of how we invent and bring new products to market, and support them in the market. This new capability will add creativity to the way we approach digital innovation,” said John Ormrod, Syngenta Head IS Global R&D. Student hiring is underway with the goal of being fully operational by the summer. Syngenta is a leading agriculture company helping to improve global food security by enabling millions of farmers to make better use of available resources. Through world class science and innovative crop solutions, our 28,000 people in over 90 countries are working to transform how crops are grown. We are committed to rescuing land from degradation, enhancing biodiversity and revitalizing rural communities. To learn more visit www.syngenta.com and www.goodgrowthplan.com. Follow us on Twitter® at www.twitter.com/Syngenta.
About the University of Illinois The University of Illinois is a world leader in research, teaching and discovery. Distinguished by the breadth of its programs, broad academic excellence, and internationally renowned faculty, the University of Illinois has a commitment to excellence in teaching, research, public service and economic development. The University of Illinois at Urbana-Champaign serves the state, the nation, and the world by creating knowledge, preparing students for lives of impact, and addressing critical societal needs through the transfer and application of knowledge. About the Research Park at the University of Illinois The Research Park at the University of Illinois at Urbana-Champaign is a technology hub for startup companies and corporate research and development operations. Within the Research Park there are more than 100 companies employing students and full-time technology professionals. More information at researchpark.illinois.edu. About the National Center for Supercomputing Applications The National Center for Supercomputing Applications (NCSA) provides computing, data, networking, and visualization resources and expertise that help scientists and engineers across the country better understand and improve our world. NCSA is an interdisciplinary hub and is engaged in research and education collaborations with colleagues and students across the campus of the University of Illinois at Urbana-Champaign. For more information, see www.ncsa.illinois.edu.


News Article | November 17, 2016
Site: phys.org

Many years of computational analysis and laboratory and field experiments led to the selection of the proteins targeted in the study. The researchers used tobacco because it is easily modified. Now they are focusing on food crops. "We don't know for certain this approach will work in other crops, but because we're targeting a universal process that is the same in all crops, we're pretty sure it will," said University of Illinois plant biology and crop sciences professor Stephen Long, who led the study with postdoctoral researchers Katarzyna Glowacka and Johannes Kromdijk. The team targeted a process plants use to shield themselves from excessive solar energy. "Crop leaves exposed to full sunlight absorb more light than they can use," Long said. "If they can't get rid of this extra energy, it will actually bleach the leaf." Plants protect themselves by making changes within the leaf that dissipate the excess energy as heat, he said. This process is called nonphotochemical quenching, or NPQ. "But when a cloud crosses the sun, or a leaf goes into the shade of another, it can take up to half an hour for that NPQ process to relax," Long said. "In the shade, the lack of light limits photosynthesis, and NPQ is also wasting light as heat." Long and former graduate student Xinguang Zhu used a supercomputer at the National Center for Supercomputing Applications at the U. of I. to predict how much the slow recovery from NPQ reduces crop productivity over the course of a day. These calculations revealed "surprisingly high losses" of 7.5 percent to 30 percent, depending on the plant type and prevailing temperature, Long said. Long's discussions with University of California, Berkeley researcher and study co-author Krishna Niyogi, an expert on the molecular processes underlying NPQ, suggested that boosting levels of three proteins might speed up the recovery process. To test this concept, the team inserted a "cassette" of the three genes (taken from the model plant Arabidopsis) into tobacco. "The objective was simply to boost the level of three proteins already present in tobacco," Long said. The researchers grew seedlings from multiple experiments, then tested how quickly the engineered plants responded to changes in available light. A fluorescence imaging technique allowed the team to determine which of the transformed plants recovered more quickly upon transfer to shade. The researchers selected the three best performers and tested them in several field plots alongside plots of the unchanged tobacco. Two of the modified plant lines consistently showed 20 percent higher productivity, and the third was 14 percent higher than the unaltered tobacco plants. "Tobacco is grown for its leaves, which were substantially increased," Kromdijk said. "But in food crops, it will be whatever we eat from the plant - the fruit, the seeds or the roots - that we will need to increase." Other experiments have demonstrated that increasing photosynthesis by exposing plants to high carbon dioxide results in more seeds in wheat, soy and rice, he said. "Now we can do this genetically, and we are actively working on repeating our work in various food crops," he said. "This finding offers some rare good news at a time of dire forecasts of future food shortages," Glowacka said. "The United Nations predicts that by 2050 we're going to need to produce about 70 percent more food on the land we're currently using," Long said. 
"My attitude is that it is very important to have these new technologies on the shelf now because it can take 20 years before such inventions can reach farmers' fields. If we don't do it now, we won't have this solution when we need it." The Bill and Melinda Gates Foundation funded this research, with the stipulation that any new agricultural products that result from the work be licensed in such a way that the technology is freely available to farmers in poor countries of Africa and South Asia. More information: "Improving photosynthesis and crop productivity by accelerating recovery from photoprotection," Science, science.sciencemag.org/cgi/doi/10.1126/science.aai8878

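The "surprisingly high losses" quoted above come from adding up, over a day of fluctuating light, the photosynthesis forgone while NPQ relaxes slowly in shade. The toy calculation below is only a cartoon of that bookkeeping for a single sun-to-shade transition; the exponential-relaxation model, the time constants and the "fraction wasted" parameter are invented assumptions, not values from Long and Zhu's canopy calculations.

```python
import numpy as np

def npq_loss_fraction(tau_relax_min, shade_minutes=30.0):
    """
    Cartoon estimate of photosynthesis lost during one sun-to-shade transition.

    Assumes NPQ decays exponentially with time constant tau_relax_min (minutes)
    and that, while NPQ is still engaged in shade, a fixed fraction of absorbed
    light is dissipated as heat instead of driving carbon fixation.
    """
    t = np.linspace(0.0, shade_minutes, 1000)           # minutes spent in shade
    npq = np.exp(-t / tau_relax_min)                    # relative NPQ level
    max_wasted = 0.3                                    # assumed peak wasted fraction
    wasted = max_wasted * npq                           # instantaneous wasted fraction
    return float(np.trapz(wasted, t) / shade_minutes)   # time-averaged loss

# Hypothetical comparison: a slow-relaxing line versus a faster-relaxing
# engineered line (both time constants are invented for illustration).
for label, tau in [("slow relaxation (tau = 15 min)", 15.0),
                   ("faster relaxation (tau = 4 min)", 4.0)]:
    loss = npq_loss_fraction(tau)
    print(f"{label:32s} average assimilation lost in shade = {loss:.1%}")
```

A daily or seasonal estimate would repeat this accounting over every sun-shade transition a leaf experiences, weighted by canopy position, which is the kind of calculation that motivated running the analysis on a supercomputer.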

The security of the more than $7 billion in research funded by the National Science Foundation (NSF) will be significantly bolstered, thanks to a $5-million grant awarded to Indiana University, the National Center for Supercomputing Applications (NCSA), the Pittsburgh Supercomputing Center (PSC) and the University of Wisconsin-Madison for a collaborative effort to create the NSF Cybersecurity Center of Excellence. This funding will establish the Center for Trustworthy Scientific Cyberinfrastructure (CTSC), a three-year-old collaboration between the aforementioned institutions, as the NSF Cybersecurity Center of Excellence, an entity focused on addressing cybersecurity challenges of NSF scientific research. Ensuring scientific computing remains trustworthy and uncorrupted is essential in protecting the nation’s science. In its role as a Cybersecurity Center of Excellence, the CTSC will provide readily available cybersecurity services tailored to the NSF science community. These resources will include leadership and coordination across organizations, and education and training to expand the pool of available cybersecurity expertise. "NSF-funded cyberinfrastructure presents unique challenges for operational security personnel and impacts other important areas of research affecting society, including ocean sciences, natural hazards, engineering, biology and physics,” said Anita Nikolich, cybersecurity program director within NSF’s advanced cyberinfrastructure division. “Organizations that host cyberinfrastructure must find the right balance of security, privacy and usability while maintaining an environment in which data are openly shared. Many research organizations lack expertise in technical and policy security, and could benefit from an independent, shared security resource pool." The CTSC will collaborate directly with NSF-funded research organizations to address their cybersecurity challenges and provide forums for cybersecurity collaboration across organizations. For example, Jim Basney of the National Center for Supercomputing Applications will lead CTSC support activities on the topic of identity and access management for research organizations. “Cybersecurity is no longer solely a technical matter — it’s a critical part of any organization’s risk management,” said Von Welch, director of Indiana University’s Center for Applied Cybersecurity Research (CACR) and CTSC principal investigator. “Addressing the cybersecurity risks to science requires a comprehensive understanding of research and the threats it faces. Many of these threats are those faced by other organizations on the Internet, but others are unique to the science community with its collaborative nature and use of high-end information technology and cyberinfrastructure.” The CTSC will also convene an annual NSF Cybersecurity Summit, led by PSC Chief Information Security Officer James A. Marsteller, to share experiences, provide training and discuss cybersecurity challenges. “Organized with significant input from the NSF community, the annual Summit provides a key opportunity to share experiences, lessons learned and advances with other NSF projects,” Marsteller said. 
“The forum provides an opportunity to discuss serious issues around implementing cybersecurity not only of a technical nature, but also cultural, managerial and budgetary and the like.” One example of a safeguard the CTSC will promote is software assurance: experienced, respected names in the field, such as Barton Miller, professor at the University of Wisconsin-Madison, will offer their expertise to reduce the risks of vulnerabilities and breaches for researchers. “Every day, the news continues to document why truly excellent research in highly applied cybersecurity is a national priority,” said Brad Wheeler, IU vice president for information technology and interim dean of the IU School of Informatics and Computing. “This award adds to the many national distinctions that CACR has achieved in its 13 years as part of IU’s formidable cybersecurity capabilities in education, research and operations.” Additionally, the CTSC will collaborate with the U.S. Department of Energy’s Energy Sciences Network (ESnet) to develop a threat profile for open science. “The Department of Energy and NSF enable scientific discovery in a range of domains critical to our nation’s future,” said Greg Bell, director for ESnet and division director at the Lawrence Berkeley National Laboratory. “Working together to understand cybersecurity threat models shared by these collaborations is an important step forward for the two agencies, and ESnet is delighted to be collaborating on this effort.”


News Article | February 16, 2017
Site: www.businesswire.com

SPRING, Texas--(BUSINESS WIRE)--ExxonMobil, working with the National Center for Supercomputing Applications (NCSA), has achieved a major breakthrough with proprietary software using more than four times the previous number of processors used on complex oil and gas reservoir simulation models to improve exploration and production results. The breakthrough in parallel simulation used 716,800 processors, the equivalent of harnessing the power of 22,400 computers with 32 processors per computer. ExxonMobil geoscientists and engineers can now make better investment decisions by more efficiently predicting reservoir performance under geological uncertainty to assess a higher volume of alternative development plans in less time. The record run resulted in data output thousands of times faster than typical oil and gas industry reservoir simulation. It was the largest processor count reported by the oil and gas industry, and one of the largest simulations reported by industry in engineering disciplines such as aerospace and manufacturing. “This breakthrough has unlocked new potential for ExxonMobil’s geoscientists and engineers to make more informed and timely decisions on the development and management of oil and gas reservoirs,” said Tom Schuessler, president of ExxonMobil Upstream Research Company. “As our industry looks for cost-effective and environmentally responsible ways to find and develop oil and gas fields, we rely on this type of technology to model the complex processes that govern the flow of oil, water and gas in various reservoirs.” The breakthrough in parallel simulation dramatically reduces the time required to study oil and gas reservoirs. Reservoir simulation studies are used to guide decisions such as well placement, the design of facilities and development of operational strategies to minimize financial and environmental risk. To model complex processes accurately for the flow of oil, water, and natural gas in the reservoir, simulation software must solve a number of complex equations. Current reservoir management practices in the oil and gas industry are often hampered by the slow speed of reservoir simulation. ExxonMobil’s scientists worked closely with the NCSA to benchmark a series of multimillion- to billion-cell models on NCSA’s Blue Waters supercomputer. This new reservoir simulation capability efficiently uses hundreds of thousands of processors simultaneously and will have a dramatic impact on reservoir management workflows. “NCSA’s Blue Waters sustained petascale system, which has benefited the open science community so tremendously, is also helping industry break through barriers in massively parallel computing,” said Bill Gropp, NCSA’s acting director. “NCSA is thrilled to have worked closely with ExxonMobil to achieve the kind of sustained performance that is so critical in advancing science and engineering.” ExxonMobil’s collaboration with the NCSA required careful planning and optimization of all aspects of the reservoir simulator, from input/output to communications across hundreds of thousands of processors. These efforts have delivered strong scalability on several processor counts ranging from more than 1,000 to nearly 717,000, the latter being the full capacity of NCSA’s Cray XE6 system. ExxonMobil, the largest publicly traded international oil and gas company, uses technology and innovation to help meet the world’s growing energy needs.
We hold an industry-leading inventory of resources and are one of the largest integrated refiners, marketers of petroleum products and chemical manufacturers. For more information, visit www.exxonmobil.com or follow us on Twitter www.twitter.com/exxonmobil. Cautionary Statement: Statements of future events or conditions in this release are forward-looking statements. Actual future results, including the results and impact of new technologies, could vary depending on the outcome of further research and testing; the development and competitiveness of alternative technologies; technical and operating factors; and other factors discussed in this release and under the heading “Factors Affecting Future Results” on the Investors page of ExxonMobil’s website at exxonmobil.com. About the National Center for Supercomputing Applications The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50 for more than 30 years by bringing industry, researchers and students together to solve grand challenges at rapid speed and scale. The Blue Waters Project is supported by the National Science Foundation through awards ACI-0725070 and ACI-1238993.
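
"Strong scalability" in the passage above means that the same fixed-size reservoir model runs faster as processors are added. A minimal sketch of how such scaling results are usually summarized is given below; the processor counts echo the range quoted in the release, but the run times, the assumed serial fraction and the Amdahl-style timing model are illustrative only and do not describe ExxonMobil's proprietary simulator.

```python
# Illustrative strong-scaling summary: speedup and parallel efficiency for a
# fixed-size problem, using an Amdahl's-law toy model to generate "run times".
BASE_PROCS = 1_024
SERIAL_FRACTION = 1.0e-6   # assumed non-parallelizable fraction (invented)
BASE_TIME_HOURS = 100.0    # assumed run time on BASE_PROCS processors (invented)

def model_time(procs: int) -> float:
    """Amdahl-style run time relative to the BASE_PROCS baseline."""
    parallel = 1.0 - SERIAL_FRACTION
    return BASE_TIME_HOURS * (SERIAL_FRACTION + parallel * BASE_PROCS / procs)

processor_counts = [1_024, 8_192, 65_536, 262_144, 716_800]

print(f"{'processors':>12} {'time (h)':>10} {'speedup':>9} {'efficiency':>11}")
for p in processor_counts:
    t = model_time(p)
    speedup = model_time(BASE_PROCS) / t
    efficiency = speedup / (p / BASE_PROCS)
    print(f"{p:>12,} {t:>10.3f} {speedup:>9.1f} {efficiency:>10.1%}")
```

Published scaling results are built the same way, except that the times come from actual benchmark runs on the machine rather than from a formula.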


News Article | March 23, 2016
Site: phys.org

Led by Aleksei Aksimentiev, a professor of physics at the University of Illinois, and Taekjip Ha, a professor of biophysics and biophysical chemistry at Johns Hopkins University and an adjunct at the University of Illinois Center for the Physics of Living Cells, of which Aksimentiev is also a member, the researchers published their work in the journal Nature Communications. "We are still only starting to explore the physical properties of DNA. It's not just a string of letters," Aksimentiev said. "It's a complex molecule with unique characteristics. The prevailing hypothesis is that everything that happens inside the nucleus, the way the DNA is organized, is all the work of proteins. What we show is that direct DNA-DNA interactions may play a role in large-scale chromosome organization as well." Using the Blue Waters supercomputer at the National Center for Supercomputing Applications on the Illinois campus, Aksimentiev and postdoctoral researcher Jejoong Yoo performed detailed simulations of two DNA molecules interacting in a charged solution such as is found in the cell. The supercomputer allowed them to map each individual atom and its behavior, and to measure the forces between the molecules. They found that, though DNA molecules tend to repel each other in water, in a cell-like environment two DNA molecules can interact according to their respective sequences. "In the DNA alphabet, there is A, T, G and C. We found that when a sequence is rich in A and T, there is a stronger attraction," Aksimentiev said. "Then we looked at what actually causes it at the molecular level. We like to think of DNA as a nice symmetrical helix, but actually there's a line of bumps which are methyl groups, which we find are the key to regulating this sequence-dependent attraction." One of the processes for regulating gene expression is methylation, which adds methyl groups to the DNA helix. In further simulations, the researchers found that the methyl groups strengthen the attraction, so sequences heavy in G and C with methyl groups attached will interact just as strongly as sequences rich in A and T. "The key is the presence of charged particles in the solution," Aksimentiev said. "Let's say you have two people who don't like each other, but I like them both, so I can shake hands with both of them and bring them close. The counter-ions work exactly like that. The strength of how they pull the DNA molecules together depends on how many of them are between the molecules. When we have these bumps, we have a lot of counter-ions." Ha and graduate researcher Hajin Kim experimentally verified the findings of the simulations. Using advanced single-molecule imaging techniques, they isolated two DNA molecules inside a tiny bubble, then watched to see how the molecules interacted. The experiments matched well with the data from the simulations, both for the sequence-dependent interactions and for interactions between methylated DNA. "It was wonderful to see the computational predictions borne out exactly in our experiments," Ha said. "It tells us how accurate the atomic-level simulations are and shows that they can guide new research avenues." The researchers posit that the observed interactions between DNA molecules could play a role in how chromosomes are organized in the cell and which ones are expanded or folded up compactly, determining functions of different cell types or regulating the cell cycle. "For example, once you methylate DNA, the chromosome becomes more compact. 
It prevents the cellular machinery from accessing the DNA," Aksimentiev said. "It's a way to tell which genes are turned on and which are turned off. This could be part of the bigger question of how chromosomes are arranged and how organizational mechanisms can affect gene expression." More information: Jejoong Yoo et al. Direct evidence for sequence-dependent attraction between double-stranded DNA controlled by methylation, Nature Communications (2016). DOI: 10.1038/ncomms11045
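
The sequence and methylation dependence reported above (A/T-rich tracts attract more strongly, and methylating C/G-rich tracts has a comparable effect) tracks the density of methyl groups the duplex presents to the surrounding counter-ions. The snippet below merely counts those methyl groups for a few made-up 20-base sequences; the sequences, the single-strand counting rule and the interpretation as a proxy for attraction strength are illustrative assumptions, not a substitute for the all-atom simulations and single-molecule experiments described in the article.

```python
def methyl_group_density(sequence: str, methylated_positions=frozenset()) -> float:
    """
    Fraction of bases on this strand carrying a major-groove methyl group.

    Thymine carries a methyl group intrinsically; cytosine contributes one only
    when it is methylated (5-methylcytosine). A full duplex treatment would also
    count the complementary strand; this single-strand count is a simplification.
    """
    seq = sequence.upper()
    methyls = 0
    for i, base in enumerate(seq):
        if base == "T":
            methyls += 1
        elif base == "C" and i in methylated_positions:
            methyls += 1
    return methyls / len(seq)

# Hypothetical 20-mers: an A/T-rich tract, a plain G/C-rich tract, and the same
# G/C-rich tract with every cytosine methylated.
at_rich = "ATATTAATTAATATTAATAT"
gc_rich = "GCGGCCGCGGCCGCGGCCGC"
gc_methylated = {i for i, b in enumerate(gc_rich) if b == "C"}

for label, seq, met in [("A/T-rich", at_rich, frozenset()),
                        ("G/C-rich, unmethylated", gc_rich, frozenset()),
                        ("G/C-rich, fully methylated", gc_rich, gc_methylated)]:
    print(f"{label:28s} methyl-group density = {methyl_group_density(seq, met):.2f}")
```

Run on these toy sequences, the methylated G/C-rich tract ends up with the same methyl density as the A/T-rich one, mirroring the article's observation that methylated G/C-rich DNA interacts about as strongly as A/T-rich DNA.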


News Article | February 11, 2016
Site: www.techtimes.com

Scientists developed a new computer model that undermines the popular theory about the origins of the Yellowstone Supervolcano. The widespread belief is that the supervolcano was formed from a vertical formation of hot rocks. This column is believed to extend from the top of the planet's core. The enormous magmatic system is sort of the plumbing system beneath the Earth's surface. Researchers from the University of Illinois used the new data about the magmatic system and the supervolcano's past to create an improved computer model that debunks popular theories about its origins. The university's geology professor Lijun Liu said their new computer model presents the complete history of the supervolcano's activities. The research was published in the Geophysical Research Letters journal on Jan. 20. "The majority of previous studies have relied on conceptual, idealized models, which are not physically and geologically accurate," said Liu. The team considered more dynamic processes that enabled their computer model to be more realistic and complex than past ones. The team used the National Center for Supercomputing Applications' Blue Waters supercomputer at the University. They replicated not only the surface's plate tectonic history but also the interior's geophysical image. The machine they used is one of the world's fastest supercomputers and the study is the first one to use such machinery to replicate Yellowstone's complex geophysical data. While the computer model wasn't intended to predict the actual formation of the supervolcano, it evaluated how well past formation theories hold up against the new data. When the team tested the popular mantle plume theory, the computer model suggested that the vertical column of hot rocks was not plausible. Data on the ancient tectonic plates suggested that a plume would have been blocked from rising. The supervolcano's uncertain origin is a major reason it remains a source of risk and public concern. Liu added that the continuous improvement of the supervolcano formation model can help predict Yellowstone's future behaviors. "This research indicates that we need a multidisciplinary approach to understand complicated natural processes like Yellowstone," added Liu.
