News Article | November 23, 2016
Site: www.businesswire.com

SALT LAKE CITY--(BUSINESS WIRE)--SC16, the 28th annual international conference of high performance computing, networking, storage and analysis, celebrated the contributions of researchers and scientists - from those just starting their careers to those whose contributions have made lasting impacts. The conference drew more than 11,100 registered attendees and featured a technical program spanning six days. The exhibit hall featured 349 exhibitors from industry, academia and research organizations from around the world. “There has never been a more important time for high performance computing, networking and data analysis,” said SC16 General Chair John West from the Texas Advanced Computing Center. “But it is also an acute time for growing our workforce and expanding diversity in the industry. SC16 was the perfect blend of research, technological advancement, career recognition and improving the ways in which we attract and retain that next generation of scientists.” According to Trey Breckenridge, SC16 Exhibits Chair from Mississippi State University, the SC16 Exhibition was the largest in the history of the conference. The overall size of the exhibition was 150,000 net square feet (breaking the 2015 record of 141,430). The 349 industry and research-focused exhibits included 44 first-timers and 120 organizations from 25 countries outside the United States. During the conference, Salt Lake City also became the hub for the world’s fastest computer network: SCinet, SC16’s custom-built network which delivered 3.15 terabits per second in bandwidth. The network featured 56 miles of fiber deployed throughout the convention center and $32 million in loaned equipment. It was all made possible by 200 volunteers representing global organizations spanning academia, government and industry. For the third year, SC featured an opening “HPC Matters” plenary that this year focused on Precision Medicine, which examined what the future holds in this regard and how advances are only possible through the power of high performance computing and big data. Leading voices from the frontlines of clinical care, medical research, HPC system evolution, pharmaceutical R&D and public policy shared diverse perspectives on the future of precision medicine and how it will impact society. The Technical Program again offered the highest quality original HPC research. The SC workshops set a record with more than 2,500 attendees. There were 14 Best Paper Finalists and six Gordon Bell Finalists. These submissions represent the best of the best in a wide variety of research topics in HPC. “These awards are very important for the SC Conference Series. They celebrate the best and the brightest in high performance computing,” said Satoshi Matsuoka, SC16 Awards Chair from Tokyo Institute of Technology. “These awards are not just plaques or certificates. They define excellence. They set the bar for the years to come and are powerful inspiration for both early career and senior researchers.” Following is the list of Technical Program awards presented at SC16: SC16 received 442 paper submissions, of which 81 were accepted (18.3 percent acceptance rate). Of those, 13 were selected as finalists for the Best Paper (six) and Best Student Paper (seven) awards. The Best Paper Award went to “Daino: A High-Level Framework for Parallel and Efficient AMR on GPUs” by Mohamed Wahib Attia and Naoya Maruyama, RIKEN; and Takayuki Aoki, Tokyo Institute of Technology. 
The Best Student Paper Award went to “Flexfly: Enabling a Reconfigurable Dragonfly Through Silicon Photonics” by Ke Wen, Payman Samadi, Sebastien Rumley, Christine P. Chen, Yiwen Shen, Meisam Bahadori, and Karen Bergman, Columbia University and Jeremiah Wilke, Sandia National Laboratories. The ACM Gordon Bell Prize is awarded for outstanding team achievement in high performance computing and tracks the progress of parallel computing. This year, the prize was awarded to a 12-member Chinese team for their research project, “10M-Core Scalable Fully-Implicit Solver for Nonhydrostatic Atmospheric Dynamics.” The winning team presented a solver (a calculation method) for atmospheric dynamics. In the abstract of their presentation, the winning team writes, “On the road to the seamless weather-climate prediction, a major obstacle is the difficulty of dealing with various spatial and temporal scales. The atmosphere contains time-dependent multi-scale dynamics that support a variety of wave motions.” To simulate the vast number of variables inherent in a weather system developing in the atmosphere, the winning group presents a highly scalable fully implicit solver for three-dimensional nonhydrostatic atmospheric simulations governed by fully compressible Euler equations. Euler equations are a set of equations frequently used to understand fluid dynamics (liquids and gases in motion). Winning team members are Chao Yang, Chinese Academy of Sciences; Wei Xue, Weimin Zheng, Guangwen Yang, Ping Xu, and Haohuan Fu, Tsinghua University; Hongtao You, National Research Center of Parallel Computer Engineering and Technology; Xinliang Wang, Beijing Normal University; Yulong Ao and Fangfang Liu, Chinese Academy of Sciences; Lin Gan, Tsinghua University; and Lanning Wang, Beijing Normal University. This year, SC received 172 detailed poster submissions that went through a rigorous review process. In the end, 112 posters were accepted and five finalists were selected for the Best Poster Award. As part of its research poster activities, SC16 also hosted the ACM Student Research Competition for both undergraduate and graduate students. In all, 63 submissions were received and 26 Student Research Competition posters were accepted – 14 in the graduate category and 12 in the undergraduate category. The Best Poster Award went to “A Fast Implicit Solver with Low Memory Footprint and High Scalability for Comprehensive Earthquake Simulation System” with Kohei Fujita from RIKEN as the lead author. In the undergraduate category, the Student Research Competition winners were as follows. First Place: “Touring Dataland? Automated Recommendations for the Big Data Traveler” by William Agnew and Michael Fischer, Advisors: Kyle Chard and Ian Foster. Second Place: “Analysis of Variable Selection Methods on Scientific Cluster Measurement Data” by Jonathan Wang, Advisors: Wucherl Yoo and Alex Sim. Third Place: “Discovering Energy Usage Patterns on Scientific Clusters” by Matthew Bae, Advisors: Wucherl Yoo, Alex Sim and Kesheng Wu. In the graduate category, the winners were as follows. First Place: “Job Startup at Exascale: Challenges and Solutions” by Sourav Chakroborty, Advisor: Dhabaleswar K. Panda. Second Place: “Performance Modeling and Engineering with Kerncraft” by Julian Hammer, Advisors: Georg Hager and Gerhard Wellein. Third Place: “Design and Evaluation of Topology-Aware Scatter and AllGather Algorithms for Dragonfly Networks” by Nathanael Cheriere, Advisor: Matthieu Dorier. The Scientific Visualization and Data Analytics Award featured six finalists. The award went to “Visualization and Analysis of Threats from Asteroid Ocean Impacts” with John Patchett as the lead author. 
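For reference, the fully compressible Euler equations mentioned above take the following standard textbook form (a general statement of the equations, not the winning team's specific nonhydrostatic formulation):

\[
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{u}) = 0, \qquad
\frac{\partial (\rho \mathbf{u})}{\partial t} + \nabla \cdot (\rho \mathbf{u} \otimes \mathbf{u}) + \nabla p = \rho \mathbf{g}, \qquad
\frac{\partial E}{\partial t} + \nabla \cdot \big[ (E + p)\, \mathbf{u} \big] = \rho\, \mathbf{g} \cdot \mathbf{u}
\]

Here \(\rho\) is density, \(\mathbf{u}\) velocity, \(p\) pressure, \(E\) total energy density and \(\mathbf{g}\) gravitational acceleration; the three relations express conservation of mass, momentum and energy, and solving them implicitly at the 10-million-core scale named in the title is what the winning team demonstrated.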
The Student Cluster Competition returned for its 10th year. The competition, which debuted at SC07 in Reno and has since been replicated in Europe, Asia and Africa, is a real-time, non-stop, 48-hour challenge in which teams of six undergraduates assemble a small cluster at SC16 and race to complete a real-world workload across a series of scientific applications, demonstrate knowledge of system architecture and application performance, and impress HPC industry judges. The students partner with vendors to design and build a cutting-edge cluster from commercially available components that does not exceed a 3120-watt power limit, and work with application experts to tune and run the competition codes. For the first time ever, the team that won top honors also won the award for achieving highest performance for the Linpack benchmark application. The team “SwanGeese” is from the University of Science and Technology of China. In traditional Chinese culture, the rare Swan Goose stands for teamwork, perseverance and bravery. This is the university’s third appearance in the competition. Also, an ACM SIGHPC Certificate of Appreciation was presented to the authors of a recent SC paper chosen for the SC16 Student Cluster Competition Reproducibility Initiative. The selected paper was “A Parallel Connectivity Algorithm for de Bruijn Graphs in Metagenomic Applications” by Patrick Flick, Chirag Jain, Tony Pan and Srinivas Aluru from Georgia Institute of Technology. The George Michael Memorial HPC Fellowship honors exceptional Ph.D. students. The first recipient is Johann Rudi from the Institute for Computational Engineering and Sciences at the University of Texas at Austin for his project, “Extreme-Scale Implicit Solver for Nonlinear, Multiscale, and Heterogeneous Stokes Flow in the Earth’s Mantle.” The second recipient is Axel Huebl from Helmholtz-Zentrum Dresden-Rossendorf at the Technical University of Dresden for his project, “Scalable, Many-core Particle-in-cell Algorithms to Simulate Next Generation Particle Accelerators and Corresponding Large-scale Data Analytics.” The SC Conference Series also serves as the venue for recognizing leaders in the HPC community for their contributions during their careers. Here are the career awards presented at SC16: The IEEE-CS Seymour Cray Computer Engineering Award recognizes innovative contributions to high performance computing systems that best exemplify the creative spirit demonstrated by Seymour Cray. The 2016 IEEE-CS Seymour Cray Computer Engineering Award was presented to William J. Camp of Sandia National Laboratories “for visionary leadership of the Red Storm project, and for decades of leadership of the HPC community.” Camp previously served as Intel’s Chief Supercomputing Architect and directed Intel’s Exascale R&D efforts. Established in memory of Ken Kennedy, the founder of Rice University's nationally ranked computer science program and one of the world's foremost experts on high-performance computing, the ACM/IEEE-CS Ken Kennedy Award recognizes outstanding contributions to programmability or productivity in high-performance computing together with significant community service or mentoring contributions. The 2016 Ken Kennedy Award was presented to William D. 
Gropp “for highly influential contributions to the programmability of high-performance parallel and distributed computers, and extraordinary service to the profession.” Gropp is the acting director of the National Center for Supercomputing Applications, director of the Parallel Computing Institute, and the Thomas M. Siebel Chair in Computer Science at the University of Illinois Urbana-Champaign. The IEEE-CS Sidney Fernbach Memorial Award is awarded for outstanding contributions in the application of high performance computers using innovative approaches. The 2016 IEEE-CS Sidney Fernbach Memorial Award was presented to Vipin Kumar “for foundational work on understanding scalability, and highly scalable algorithms for graph partitioning, sparse linear systems and data mining.” Kumar is a Regents Professor at the University of Minnesota. The Supercomputing Conference Test of Time Award recognizes an outstanding paper that has appeared at the SC conference and has deeply influenced the HPC discipline. It is a mark of historical impact and recognition that the paper has changed HPC trends. The winning paper is “Automatically Tuned Linear Algebra Software” by Clint Whaley from the University of Tennessee and Jack Dongarra from the University of Tennessee and Oak Ridge National Laboratory. The IEEE TCHPC Award for Excellence in Scalable Computing for Early Career Researchers recognizes individuals who have made outstanding and potentially long-lasting contributions to the field within five years of receiving their Ph.D. The 2016 awards were presented to Kyle Chard, Computation Institute, University of Chicago and Argonne National Laboratory; Sunita Chandrasekaran, University of Delaware; and Seyong Lee, Oak Ridge National Laboratory. SC17 will be held November 12-17, 2017, in Denver, Colorado. For more details, go to http://sc17.supercomputing.org/. SC16, sponsored by the IEEE Computer Society and ACM (Association for Computing Machinery), offers a complete technical education program and exhibition to showcase the many ways high performance computing, networking, storage and analysis lead to advances in scientific discovery, research, education and commerce. This premier international conference includes a globally attended technical program, workshops, tutorials, a world-class exhibit area, demonstrations and opportunities for hands-on learning. For more information on SC16, visit: http://sc16.supercomputing.org.


News Article | December 1, 2015
Site: www.rdmag.com

When certain massive stars use up all of their fuel and collapse onto their cores, explosions 10 to 100 times brighter than the average supernova occur. Exactly how this happens is not well understood. Astrophysicists from Caltech, UC Berkeley, the Albert Einstein Institute and the Perimeter Institute for Theoretical Physics have used the National Science Foundation's Blue Waters supercomputer to perform 3-D computer simulations to fill in an important missing piece of our understanding of what drives these blasts. The researchers report their findings online in Nature. The lead author on the paper is Philipp Mösta, who started the work while a postdoctoral scholar at Caltech and is now a NASA Einstein Fellow at UC Berkeley. The extremely bright explosions come in two varieties—some are a type of energetic supernovae called hypernovae, while others are gamma-ray bursts (GRBs). Both are driven by focused jets formed in some collapsed stellar cores. In the case of GRBs, the jets themselves escape the star at close to the speed of light and emit strong beams of extremely energetic light called gamma rays. The necessary ingredients to create such jets are rapid rotation and a magnetic field that is a million billion times stronger than Earth's own magnetic field. In the past, scientists have simulated the evolution of massive stars from their collapse to the production of these jet-driven explosions by factoring unrealistically large magnetic fields into their models—without explaining how they could be generated in the first place. But how could magnetic fields strong enough to power the explosions exist in nature? "That's what we were trying to understand with this study," says Luke Roberts, a NASA Einstein Fellow at Caltech and a coauthor on the paper. "How can you start with the magnetic field you might expect in a massive star that is about to collapse—or at least an initial magnetic field that is much weaker than the field required to power these explosions—and build it up to the strength that you need to collimate a jet and drive a jet-driven supernova?" For more than 20 years, theory has suggested that the magnetic field of the inner-most regions of a massive star that has collapsed, also known as a proto-neutron star, could be amplified by an instability in the flow of its plasma if the core is rapidly rotating, causing its outer edge to rotate faster than its center. However, no previous models could prove this process could strengthen a magnetic field to the extent needed to collimate a jet, largely because these simulations lacked the resolution to resolve where the flow becomes unstable. Mösta and his colleagues developed a simulation of a rapidly rotating collapsed stellar core and scaled it so that it could run on the Blue Waters supercomputer, a powerful supercomputer funded by the NSF located at the National Center for Supercomputing Applications at the University of Illinois. Blue Waters is known for its ability to provide sustained high-performance computing for problems that produce large amounts of information. The team's highest-resolution simulation took 18 days of around-the-clock computing by about 130,000 computer processors to simulate just 10 msec of the core's evolution. In the end, the researchers were able to simulate the so-called magnetorotational instability responsible for the amplification of the magnetic field. 
They saw—as theory predicted—that the instability creates small patches of an intense magnetic field distributed in a chaotic way throughout the core of the collapsed star. "Surprisingly, we found that a dynamo process connects these patches to create a larger, ordered structure," explains David Radice, a Walter Burke Fellow at Caltech and a coauthor on the paper. An early type of electrical generator known as a dynamo produced a current by rotating electromagnetic coils within a magnetic field. Similarly, astrophysical dynamos generate currents when hydromagnetic fluids in stellar cores rotate under the influence of their magnetic fields. Those currents can then amplify the magnetic fields. "We find that this process is able to create large-scale fields—the kind you would need to power jets," says Radice. The researchers also note that the magnetic fields they created in their simulations are similar in strength to those seen in magnetars—neutron stars (a type of stellar remnant) with extremely strong magnetic fields. "It takes thousands or millions of years for a proto-neutron star to become a neutron star, and we have not yet simulated that. But if you could transport this thing thousands or millions of years forward in time, you would have a strong enough magnetic field to explain magnetar field strengths," says Roberts. "This might explain some fraction of magnetars or a particular class of very bright supernovae that are thought to be powered by a spinning magnetar at their center."
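For a sense of the computational effort those figures imply, a back-of-the-envelope estimate (derived from the numbers quoted above, not a figure reported by the team) is:

\[
130{,}000 \ \text{processors} \times 18 \ \text{days} \times 24 \ \text{hours/day} \approx 5.6 \times 10^{7} \ \text{processor-hours}
\]

roughly 56 million processor-hours to cover just 10 milliseconds of the collapsing core's evolution.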


News Article | November 3, 2016
Site: www.fastcompany.com

If you don't recall Larry Agran's run for the Democratic nomination for president in 1992, I can't blame you—I'd forgotten about it myself. Even at the time, the former mayor of Irvine, California's campaign was famous, if anything, for its obscurity. He ended up collecting a grand total of 58,611 votes in the Democratic primaries and eventually went back to being the mayor of Irvine and a member of the city council, holding the latter office until 2014. But Agran's bid for the White House deserves to be remembered for at least one reason: Along the way, he pioneered online campaigning. At the time, that meant going onto the proprietary dial-up services that thrived in the pre-web era, including CompuServe, Prodigy, GEnie, and an upstart named America Online. Which, to varying degrees, most of the major campaigns did—including those of president George H.W. Bush and the eventual Democratic nominee, Bill Clinton. There was an idealistic streak to this development. "Disgusted by politics' high-gloss superficiality and the 'sound-bite' orientation of the mass media, many American voters are losing interest in one of their fundamental rights," declared an article in CompuServe magazine, a dead-tree publication which the service produced for its members. "For those who subscribe to information services such as CompuServe, however, 'modemocracy' or 'cyberspace campaigning' is making important strides in empowering the electorate." That was CompuServe's own view of what it was doing. Much of the contemporaneous coverage of early online electioneering, however, treated it as...well, kind of cute. It wasn't even a given that it was a good idea, given that the candidates who took it most seriously, such as Agran and Jerry Brown, failed to score their party's nomination. "Larry may be about 50 years ahead of his time," one Republican consultant told the Orange County Register. "Most people can't even program their VCRs. You've got to be a real junkie for this kind of thing." Twenty-four years later, we know that Agran was indeed ahead of his time—but not by half a century. Every four years, online campaigning has become a little more essential to the way presidents get elected. And everyone from Barack Obama to Donald Trump has pursued modern-day equivalents of the techniques created during the 1992 race. Long before 1992, presidential campaigns were using computers to help give themselves a competitive edge. In 1976, for instance, Computerworld reported that the Carter/Mondale campaign was spending $2,000 a month to perform tasks such as managing volunteers, accessing New York Times articles, and tracking expenses using dial-up terminals—including ones aboard the candidates' planes. (They only worked when the aircraft were on the ground.) What campaigns weren't doing was leveraging computers as a communications medium—because hardly any voters were online to be communicated at. Even in 1989, the year after vice president George H.W. Bush defeated Massachusetts governor Michael Dukakis, only 1.7% of consumers tapped into networked information services at home, according to the U.S. Census Bureau. That was in part because there weren't many major services to tap into: The venerable CompuServe and GE's GEnie were already around, but America Online wasn't yet known as America Online, and Prodigy (a joint venture of Sears and IBM) was still in regional trials. By 1991, as the 1992 campaign got underway, CompuServe, GEnie, Prodigy, and AOL were competing vigorously and expanding their customer bases. 
(A few days before the general election on November 3, 1992, AOL, still an upstart compared to CompuServe, announced that it had 200,000 subscribers, up 40% from a year earlier.) Online voters remained a dinky minority—in 1993, only 5.8% of consumers used online information services from home—but they were a rapidly expanding group worth courting. As CompuServe touted, online campaigning allowed a candidate to sidestep the major media companies who served as gatekeepers between politicians and the electorate. It held another attraction, too: It was dirt cheap. That made it perfect for an underdog like Larry Agran. Agran announced his candidacy on August 22, 1991. A day later, the Los Angeles Times was already reporting that the bid had been "dismissed as hopeless by many analysts." He began with $7,000 in funds and by December had a grand total of four paid staffers. From the start, the deeply entrenched election-coverage machine refused to take Agran seriously, leading him to take his message online. A Democratic candidate debate broadcast on NBC on December 15, 1991 included six candidates: Jerry Brown, Bill Clinton, Tom Harkin, Bob Kerrey, Paul Tsongas, and Douglas Wilder. Agran was not invited. Instead, as Orange County magazine reported, "Agran spoke directly to the people a day earlier via CompuServe, an electronic mail service accessible through the Prodigy computer package, which has 850,000 subscribers." That garbled explanation is a clue to just how new a concept online services were to most people: CompuServe was not part of Prodigy, but its archrival. According to CompuServe magazine's article on modemocracy, Agran held an online press conference in January 1992, becoming the first presidential candidate to do so. By contrast, his real-world campaigning continued to go poorly: In April 1992, he was even arrested for disorderly conduct when he tried to crash another debate that did not include him. "In retrospect, the online networks such as CompuServe turned out to be the only way that the Agran campaign could circumvent those powerful insiders at the Democratic National Committee and in the national media who worked to exclude him from the electoral process," said Agran's issues director, Steve Smith, as quoted in CompuServe magazine. Like Agran, Jerry Brown—at the time, the former governor of California, and in 2016, the current one—was running for president on a bare-bones budget and found online campaigning to be a cost-effective way of pressing the virtual flesh. In March 1992, he held an "electronic town meeting" on GEnie that was attended by around 200 members. He then conducted a similar event on CompuServe in May for an audience of about 100. The Washington Post reported that Brown's GEnie session was billed as the first such event headlined by a major candidate for president—which may have been true, if you didn't count poor Larry Agran. The newspaper quoted an excerpt (strictly sic) of the candidate's comments about Jesse Jackson, which he typed on a borrowed Mac. Brown may have been a crummy typist, but the fact he was typing at all underlined that he was personally involved in this particular piece of online campaigning. That was unusual. 
His digital team, who described themselves in one CompuServe message quoted by the Washington Post's Howard Kurtz as "Deborah, who is one of the computer gurus at campaign headquarters, and Russ, an overworked, underpaid volunteer," uploaded materials such as position papers and press releases to the service. Employees of other campaigns, including eventual nominees Bush and Clinton, served similar roles as emissaries to cyberspace. The Clinton campaign also formed a "Clinton/Gore '92 E-Mail Team" to send out documents such as talking points, some of which went viral and ended up in Usenet newsgroups and are still available on Google Groups. Doing so showed "that both the filters and interpretations of the media and the organizational hierarchies of a campaign can be removed from the delivery of information to the citizens and voters," the team's leader said in a memo quoted in a 1993 Villanova Law Review article. Oddly enough, independent candidate Ross Perot—the billionaire founder of a computing services company, who said he had envisioned "electronic town hall" meetings since the 1960s, with computerized voting—didn't embrace online campaigning in its rudimentary-yet-useful 1992 form. When Prodigy invited the general election candidates to post position papers on the service and field questions from members, the Bush and Clinton campaigns participated, but Perot declined. (Prodigy, according to Bloomberg, also tried selling ads to candidates, starting at $10,000.) Modemocracy had its share of controversies. For one thing, it was hardly egalitarian. "How many poor people have Prodigy or CompuServe?" asked Clem Bezold, the executive director of a think tank called the Institute for Alternative Futures, as quoted by Bloomberg. At the time, less than a quarter of all U.S. households had a PC, and online services were billed by the minute and quite pricey. The fact that the online services were proprietary networks where freedom of the press didn't apply was also an issue: One volunteer for Pat Buchanan's campaign grumbled that Prodigy suppressed material such as his musings about overthrowing the federal government. CompuServe even got in hot water for offering free accounts to the campaigns, which the Federal Election Commission deemed to be a form of political contribution. Still, those who were optimistic enough to anticipate online access becoming widespread saw the 1992 digital campaign as the start of something that mattered. The League of Women Voters helped out Prodigy with its "Political Profile" section by providing information on how to register to vote. "Someday, congressmen will be on the network," the league's national director, Mary Ellen Barry, told Bloomberg. A whole lot happened in cyberspace between the 1992 and 1996 presidential campaigns. Less than three months after Bill Clinton was elected to his first term, Marc Andreessen and Eric Bina of the National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign released the first version of Mosaic, the first popular graphical browser for the World Wide Web, which Tim Berners-Lee of the CERN research lab in Switzerland had devised in 1990. The easy-to-use Mosaic helped bootstrap the web from a tool for geeks into a nascent consumer medium, a trend that only continued when Andreessen cofounded Netscape and launched the Navigator browser in 1994. 
By the time the 1996 campaign got underway in earnest, the web was far more of a phenomenon than proprietary online services had ever been. Rather than having to go through CompuServe or Prodigy to reach voters, candidates could simply launch their own sites. They did, and by 2016 standards, the results were a master class in terrible web design. (In a prank that foreshadowed many to come, a couple of wisenheimers registered a bunch of campaign-related domain names and launched their own fake sites, before most candidates had gotten around to building their own.) The web became so important so quickly that it swamped the networks that preceded it. AOL, CompuServe, GEnie, and Prodigy all evolved into internet service providers—and eventually ceased to exist, at least in forms remotely similar to their original incarnations. But they deserve to be remembered, and their role in 1992's digital campaign is a big, epoch-shifting reason why.


News Article | February 23, 2017
Site: www.businesswire.com

CHAMPAIGN, Ill.--(BUSINESS WIRE)--Syngenta today announced it has established a Digital Innovation Lab at the University of Illinois Research Park where it will employ four full-time staff members as well as University of Illinois at Urbana-Champaign student talent to help solve agricultural challenges. Projects at the Digital Innovation Lab will employ “outside the box” thinking with access to tools, technologies, partnerships and resources that enable the research, investigation and delivery of new and novel solutions for seeds product development using data analytics. “Innovation in agriculture is the lifeblood of the work we do at Syngenta. Our goal is to bring new talent to solve difficult challenges, with a focus on seed innovation at the Research Park,” said Bill Danker, Syngenta Domain Head, Seeds Research and Breeding. The center will focus on digital data innovation and strategy, providing Syngenta with agile capabilities to enable the company to accelerate the pace of its digital journey. It will foster new ways to gain insights and make decisions from the company’s data assets. The center will develop capabilities in breeding engineering, digital agriculture, information technology, application development and big data. In conjunction with the opening of the new office, Syngenta has an industry partnership with the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. This partnership enables Syngenta to leverage the NCSA’s unique expertise in high-performance modeling, simulation and big data to gain a competitive edge. “The opening of the Syngenta Innovation Center at Research Park is a welcome new addition that continues to strengthen the relationship between Syngenta and the University of Illinois at Urbana-Champaign,” said Chancellor Robert J. Jones. “The opportunity for our students to be active participants in finding the solutions to increasingly complex societal challenges brought about by an expanding global population adds enormous value to their educational experience. The new center really capitalizes on the internationally recognized expertise in High Performance Computing (HPC), data sciences, and agriculture science at Illinois in ways to enhance Syngenta’s drive for innovation.” The new Research Park operation is located on campus to better connect industry with researchers and students. The center will start with a full-time site director, recruit professional staff, and employ students, who will work on developing projects using technologies such as smart farming, mobile applications, cloud services, and big data. “Digital innovation and integrating our data for greater insight is becoming a critical part of how we invent and bring new products to market, and support them in the market. This new capability will add creativity to the way we approach digital innovation,” said John Ormrod, Syngenta Head IS Global R&D. Student hiring is underway with the goal to be fully operational by the summer. Syngenta is a leading agriculture company helping to improve global food security by enabling millions of farmers to make better use of available resources. Through world class science and innovative crop solutions, our 28,000 people in over 90 countries are working to transform how crops are grown. We are committed to rescuing land from degradation, enhancing biodiversity and revitalizing rural communities. To learn more visit www.syngenta.com and www.goodgrowthplan.com. Follow us on Twitter® at www.twitter.com/Syngenta. 
About the University of Illinois
The University of Illinois is a world leader in research, teaching and discovery. Distinguished by the breadth of its programs, broad academic excellence, and internationally renowned faculty, the University of Illinois has a commitment to excellence in teaching, research, public service and economic development. The University of Illinois at Urbana-Champaign serves the state, the nation, and the world by creating knowledge, preparing students for lives of impact, and addressing critical societal needs through the transfer and application of knowledge.
About the Research Park at the University of Illinois
The Research Park at the University of Illinois at Urbana-Champaign is a technology hub for startup companies and corporate research and development operations. Within the Research Park there are more than 100 companies employing students and full-time technology professionals. More information at researchpark.illinois.edu.
About the National Center for Supercomputing Applications
The National Center for Supercomputing Applications (NCSA) provides computing, data, networking, and visualization resources and expertise that help scientists and engineers across the country better understand and improve our world. NCSA is an interdisciplinary hub and is engaged in research and education collaborations with colleagues and students across the campus of the University of Illinois at Urbana-Champaign. For more information, see www.ncsa.illinois.edu.


News Article | November 17, 2016
Site: www.eurekalert.org

CHAMPAIGN, Ill. -- Researchers report in the journal Science that they can increase plant productivity by boosting levels of three proteins involved in photosynthesis. In field trials, the scientists saw increases of 14 percent to 20 percent in the productivity of their modified tobacco plants. The work confirms that photosynthesis can be made more efficient to increase plant yield, a hypothesis some in the scientific community once doubted was possible. Many years of computational analysis and laboratory and field experiments led to the selection of the proteins targeted in the study. The researchers used tobacco because it is easily modified. Now they are focusing on food crops. "We don't know for certain this approach will work in other crops, but because we're targeting a universal process that is the same in all crops, we're pretty sure it will," said University of Illinois plant biology and crop sciences professor Stephen Long, who led the study with postdoctoral researchers Katarzyna Glowacka and Johannes Kromdijk. The team targeted a process plants use to shield themselves from excessive solar energy. "Crop leaves exposed to full sunlight absorb more light than they can use," Long said. "If they can't get rid of this extra energy, it will actually bleach the leaf." Plants protect themselves by making changes within the leaf that dissipate the excess energy as heat, he said. This process is called nonphotochemical quenching. "But when a cloud crosses the sun, or a leaf goes into the shade of another, it can take up to half an hour for that NPQ process to relax," Long said. "In the shade, the lack of light limits photosynthesis, and NPQ is also wasting light as heat." Long and former graduate student Xinguang Zhu used a supercomputer at the National Center for Supercomputing Applications at the U. of I. to predict how much the slow recovery from NPQ reduces crop productivity over the course of a day. These calculations revealed "surprisingly high losses" of 7.5 percent to 30 percent, depending on the plant type and prevailing temperature, Long said. Long's discussions with University of California, Berkeley researcher and study co-author Krishna Niyogi, an expert on the molecular processes underlying NPQ, suggested that boosting levels of three proteins might speed up the recovery process. To test this concept, the team inserted a "cassette" of the three genes (taken from the model plant Arabidopsis) into tobacco. "The objective was simply to boost the level of three proteins already present in tobacco," Long said. The researchers grew seedlings from multiple experiments, then tested how quickly the engineered plants responded to changes in available light. A fluorescence imaging technique allowed the team to determine which of the transformed plants recovered more quickly upon transfer to shade. The researchers selected the three best performers and tested them in several field plots alongside plots of the unchanged tobacco. Two of the modified plant lines consistently showed 20 percent higher productivity, and the third was 14 percent higher than the unaltered tobacco plants. "Tobacco is grown for its leaves, which were substantially increased," Kromdijk said. "But in food crops, it will be whatever we eat from the plant - the fruit, the seeds or the roots - that we will need to increase." 
Other experiments have demonstrated that increasing photosynthesis by exposing plants to high carbon dioxide results in more seeds in wheat, soy and rice, he said. "Now we can do this genetically, and we are actively working on repeating our work in various food crops," he said. "This finding offers some rare good news at a time of dire forecasts of future food shortages," Glowacka said. "The United Nations predicts that by 2050 we're going to need to produce about 70 percent more food on the land we're currently using," Long said. "My attitude is that it is very important to have these new technologies on the shelf now because it can take 20 years before such inventions can reach farmers' fields. If we don't do it now, we won't have this solution when we need it." The Bill and Melinda Gates Foundation funded this research, with the stipulation that any new agricultural products that result from the work be licensed in such a way that the technology is freely available to farmers in poor countries of Africa and South Asia. This work was conducted as part of the Realizing Increased Photosynthetic Efficiency program at the Carl R. Woese Institute for Genomic Biology at Illinois. The paper "Improving photosynthesis and crop productivity by accelerating recovery from photoprotection" is available from scipak@aaas.org (DOI: 10.1126/science.aai8878).
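To make the idea of "lost productivity" concrete, the sketch below estimates the cost of slow NPQ relaxation over a day of fluctuating light. It is a minimal toy calculation with hypothetical round-number parameters (10-minute sun/shade cycles, a 20-minute relaxation time constant, a 30 percent quenching penalty), not the canopy model Long and Zhu actually ran on the supercomputer:

```python
# Toy estimate of photosynthesis lost to slow NPQ relaxation (illustrative only;
# all parameters are hypothetical round numbers, not measured leaf physiology).
import numpy as np

dt = 1.0                              # time step, seconds
day = np.arange(0, 12 * 3600, dt)     # a 12-hour photoperiod
in_sun = (day // 600) % 2 == 0        # assumed 10-minute alternating sun/shade cycles

tau_induce = 60.0       # assumed NPQ build-up time constant in full sun, seconds
tau_relax = 1200.0      # assumed NPQ relaxation time constant in shade (~20 minutes)

npq = np.zeros_like(day)              # quenching level, 0 (off) to 1 (fully engaged)
for i in range(1, len(day)):
    target = 1.0 if in_sun[i] else 0.0
    tau = tau_induce if in_sun[i] else tau_relax
    npq[i] = npq[i - 1] + (target - npq[i - 1]) * dt / tau

# In shade, light limits photosynthesis, so residual NPQ wastes absorbed light as
# heat; assume lingering NPQ can cost up to 30% of shade-period assimilation.
shade_potential = (~in_sun).astype(float)
shade_actual = shade_potential * (1.0 - 0.3 * npq)

loss = 1.0 - shade_actual.sum() / shade_potential.sum()
print(f"Shade-period assimilation lost to slow NPQ relaxation: {loss:.1%}")
```

With these made-up numbers the sketch loses roughly a fifth to a quarter of the shade-period assimilation, the same ballpark as the 7.5 percent to 30 percent daily losses the researchers computed with measured leaf physiology.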


The security of the more than $7 billion in research funded by the National Science Foundation (NSF) will be significantly bolstered, thanks to a $5-million grant awarded to Indiana University, the National Center for Supercomputing Applications (NCSA), the Pittsburgh Supercomputing Center (PSC) and the University of Wisconsin-Madison for a collaborative effort to create the NSF Cybersecurity Center of Excellence. This funding will establish the Center for Trustworthy Scientific Cyberinfrastructure (CTSC), a three-year-old collaboration between the aforementioned institutions, as the NSF Cybersecurity Center of Excellence, an entity focused on addressing cybersecurity challenges of NSF scientific research. Ensuring scientific computing remains trustworthy and uncorrupted is essential in protecting the nation’s science. In its role as a Cybersecurity Center of Excellence, the CTSC will provide readily available cybersecurity services tailored to the NSF science community. These resources will include leadership and coordination across organizations, and education and training to expand the pool of available cybersecurity expertise. "NSF-funded cyberinfrastructure presents unique challenges for operational security personnel and impacts other important areas of research affecting society, including ocean sciences, natural hazards, engineering, biology and physics,” said Anita Nikolich, cybersecurity program director within NSF’s advanced cyberinfrastructure division. “Organizations that host cyberinfrastructure must find the right balance of security, privacy and usability while maintaining an environment in which data are openly shared. Many research organizations lack expertise in technical and policy security, and could benefit from an independent, shared security resource pool." The CTSC will collaborate directly with NSF-funded research organizations to address their cybersecurity challenges and provide forums for cybersecurity collaboration across organizations. For example, Jim Basney of the National Center for Supercomputing Applications will lead CTSC support activities on the topic of identity and access management for research organizations. “Cybersecurity is no longer solely a technical matter — it’s a critical part of any organization’s risk management,” said Von Welch, director of Indiana University’s Center for Applied Cybersecurity Research (CACR) and CTSC principal investigator. “Addressing the cybersecurity risks to science requires a comprehensive understanding of research and the threats it faces. Many of these threats are those faced by other organizations on the Internet, but others are unique to the science community with its collaborative nature and use of high-end information technology and cyberinfrastructure.” The CTSC will also convene an annual NSF Cybersecurity Summit, led by PSC Chief Information Security Officer James A. Marsteller, to share experiences, provide training and discuss cybersecurity challenges. “Organized with significant input from the NSF community, the annual Summit provides a key opportunity to share experiences, lessons learned and advances with other NSF projects,” Marsteller said. 
“The forum provides an opportunity to discuss serious issues around implementing cybersecurity not only of a technical nature, but also cultural, managerial and budgetary and the like.” An example of a safeguard the CTSC will promote is software assurance, as experienced, respected names in that field, such as Barton Miller, professor at University of Wisconsin-Madison, will offer their expertise to reduce the risks of vulnerabilities and breaches for researchers. “Every day, the news continues to document why truly excellent research in highly applied cybersecurity is a national priority,” said Brad Wheeler, IU vice president for information technology and interim dean of the IU School of Informatics and Computing. “This award adds to the many national distinctions that CACR has achieved in its 13 years as part of IU’s formidable cybersecurity capabilities in education, research and operations.” Additionally, the CTSC will collaborate with the U.S. Department of Energy’s Energy Science Network (ESnet) to develop a threat profile for open science. “The Department of Energy and NSF enable scientific discovery in a range of domains critical to our nation’s future,” said Greg Bell, director for ESnet and division director at the Lawrence Berkeley National Laboratory. “Working together to understand cybersecurity threat models shared by these collaborations is an important step forward for the two agencies, and ESnet is delighted to be collaborating on this effort.”


News Article | February 16, 2017
Site: www.businesswire.com

SPRING, Texas--(BUSINESS WIRE)--ExxonMobil, working with the National Center for Supercomputing Applications (NCSA), has achieved a major breakthrough in reservoir simulation, running its proprietary software on more than four times the number of processors previously used for complex oil and gas reservoir simulation models to improve exploration and production results. The breakthrough in parallel simulation used 716,800 processors, the equivalent of harnessing the power of 22,400 computers with 32 processors per computer. ExxonMobil geoscientists and engineers can now make better investment decisions by more efficiently predicting reservoir performance under geological uncertainty to assess a higher volume of alternative development plans in less time. The record run resulted in data output thousands of times faster than typical oil and gas industry reservoir simulation. It was the largest processor count reported by the oil and gas industry, and one of the largest simulations reported by industry in engineering disciplines such as aerospace and manufacturing. “This breakthrough has unlocked new potential for ExxonMobil’s geoscientists and engineers to make more informed and timely decisions on the development and management of oil and gas reservoirs,” said Tom Schuessler, president of ExxonMobil Upstream Research Company. “As our industry looks for cost-effective and environmentally responsible ways to find and develop oil and gas fields, we rely on this type of technology to model the complex processes that govern the flow of oil, water and gas in various reservoirs.” The major breakthrough in parallel simulation results in dramatic reductions in the amount of time previously taken to study oil and gas reservoirs. Reservoir simulation studies are used to guide decisions such as well placement, the design of facilities and development of operational strategies to minimize financial and environmental risk. To model complex processes accurately for the flow of oil, water, and natural gas in the reservoir, simulation software must solve a number of complex equations. Current reservoir management practices in the oil and gas industry are often hampered by the slow speed of reservoir simulation. ExxonMobil’s scientists worked closely with the NCSA to benchmark a series of multi-million- to billion-cell models on NCSA’s Blue Waters supercomputer. This new reservoir simulation capability efficiently uses hundreds of thousands of processors simultaneously and will have a dramatic impact on reservoir management workflows. “NCSA’s Blue Waters sustained petascale system, which has benefited the open science community so tremendously, is also helping industry break through barriers in massively parallel computing,” said Bill Gropp, NCSA’s acting director. “NCSA is thrilled to have worked closely with ExxonMobil to achieve the kind of sustained performance that is so critical in advancing science and engineering.” ExxonMobil’s collaboration with the NCSA required careful planning and optimization of all aspects of the reservoir simulator from input/output to improving communications across hundreds of thousands of processors. These efforts have delivered strong scalability on several processor counts ranging from more than 1,000 to nearly 717,000, the latter being the full capacity of NCSA’s Cray XE6 system. ExxonMobil, the largest publicly traded international oil and gas company, uses technology and innovation to help meet the world’s growing energy needs. 
We hold an industry-leading inventory of resources and are one of the largest integrated refiners, marketers of petroleum products and chemical manufacturers. For more information, visit www.exxonmobil.com or follow us on Twitter www.twitter.com/exxonmobil. Cautionary Statement: Statements of future events or conditions in this release are forward-looking statements. Actual future results, including the results and impact of new technologies, could vary depending on the outcome of further research and testing; the development and competitiveness of alternative technologies; technical and operating factors; and other factors discussed in this release and under the heading “Factors Affecting Future Results” on the Investors page of ExxonMobil’s website at exxonmobil.com.
About the National Center for Supercomputing Applications
The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50 for more than 30 years by bringing industry, researchers and students together to solve grand challenges at rapid speed and scale. The Blue Waters Project is supported by the National Science Foundation through awards ACI-0725070 and ACI-1238993.
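For readers unfamiliar with the term, "strong scalability" describes how much faster a fixed-size problem runs as processors are added. The sketch below computes parallel efficiency from synthetic timings; the runtimes are invented for illustration and are not ExxonMobil or NCSA measurements, although the largest processor count matches the 716,800 cited above:

```python
# Strong-scaling efficiency sketch (synthetic runtimes, for illustration only).
def strong_scaling_efficiency(t_base, p_base, t_p, p):
    """Parallel efficiency of a run on p processors relative to a baseline on p_base."""
    speedup = t_base / t_p          # how much faster than the baseline run
    ideal_speedup = p / p_base      # the speedup perfect scaling would deliver
    return speedup / ideal_speedup

# (processors, runtime in arbitrary units) -- invented numbers, not measurements
runs = [(1024, 1000.0), (8192, 140.0), (65536, 20.0), (716800, 2.4)]
p_base, t_base = runs[0]
for p, t in runs:
    eff = strong_scaling_efficiency(t_base, p_base, t, p)
    print(f"{p:>7} processors: runtime {t:7.1f}, efficiency {eff:6.1%}")
```

Efficiency near 100 percent means the added processors are being used almost perfectly; keeping that number high at hundreds of thousands of processors is exactly why the input/output and communication tuning described above matters.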


News Article | March 23, 2016
Site: phys.org

Led by Aleksei Aksimentiev, a professor of physics at the University of Illinois, and Taekjip Ha, a professor of biophysics and biophysical chemistry at Johns Hopkins University and an adjunct at the University of Illinois Center for the Physics of Living Cells, of which Aksimentiev is also a member, the researchers published their work in the journal Nature Communications. "We are still only starting to explore the physical properties of DNA. It's not just a string of letters," Aksimentiev said. "It's a complex molecule with unique characteristics. The prevailing hypothesis is that everything that happens inside the nucleus, the way the DNA is organized, is all the work of proteins. What we show is that direct DNA-DNA interactions may play a role in large-scale chromosome organization as well." Using the Blue Waters supercomputer at the National Center for Supercomputing Applications on the Illinois campus, Aksimentiev and postdoctoral researcher Jejoong Yoo performed detailed simulations of two DNA molecules interacting in a charged solution such as is found in the cell. The supercomputer allowed them to map each individual atom and its behavior, and to measure the forces between the molecules. They found that, though DNA molecules tend to repel each other in water, in a cell-like environment two DNA molecules can interact according to their respective sequences. "In the DNA alphabet, there is A, T, G and C. We found that when a sequence is rich in A and T, there is a stronger attraction," Aksimentiev said. "Then we looked at what actually causes it at the molecular level. We like to think of DNA as a nice symmetrical helix, but actually there's a line of bumps which are methyl groups, which we find are the key to regulating this sequence-dependent attraction." One of the processes for regulating gene expression is methylation, which adds methyl groups to the DNA helix. In further simulations, the researchers found that the methyl groups strengthen the attraction, so sequences heavy in G and C with methyl groups attached will interact just as strongly as sequences rich in A and T. "The key is the presence of charged particles in the solution," Aksimentiev said. "Let's say you have two people who don't like each other, but I like them both, so I can shake hands with both of them and bring them close. The counter-ions work exactly like that. The strength of how they pull the DNA molecules together depends on how many of them are between the molecules. When we have these bumps, we have a lot of counter-ions." Ha and graduate researcher Hajin Kim experimentally verified the findings of the simulations. Using advanced single-molecule imaging techniques, they isolated two DNA molecules inside a tiny bubble, then watched to see how the molecules interacted. The experiments matched well with the data from the simulations, both for the sequence-dependent interactions and for interactions between methylated DNA. "It was wonderful to see the computational predictions borne out exactly in our experiments," Ha said. "It tells us how accurate the atomic-level simulations are and shows that they can guide new research avenues." The researchers posit that the observed interactions between DNA molecules could play a role in how chromosomes are organized in the cell and which ones are expanded or folded up compactly, determining functions of different cell types or regulating the cell cycle. "For example, once you methylate DNA, the chromosome becomes more compact. 
It prevents the cellular machinery from accessing the DNA," Aksimentiev said. "It's a way to tell which genes are turned on and which are turned off. This could be part of the bigger question of how chromosomes are arranged and how organizational mechanisms can affect gene expression." More information: Jejoong Yoo et al. Direct evidence for sequence-dependent attraction between double-stranded DNA controlled by methylation, Nature Communications (2016). DOI: 10.1038/ncomms11045
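A rough way to see why the charged solution matters is the classic Debye screening picture: dissolved counter-ions screen the negative charge on each DNA backbone, shrinking the range of the repulsion so that short-range, sequence-dependent attraction can win out. The snippet below is only a textbook-style illustration of that screening effect, with generic numbers; it is not the all-atom simulation approach the researchers used:

```python
# Debye screening illustration (generic textbook estimate, not the study's method).
import math

def debye_length_nm(ionic_strength_molar):
    """Debye screening length in water at room temperature, in nanometers.
    Uses the standard shortcut lambda_D ~ 0.304 nm / sqrt(I in mol/L)."""
    return 0.304 / math.sqrt(ionic_strength_molar)

def screened_repulsion(r_nm, ionic_strength_molar):
    """Relative magnitude of a screened (Yukawa-like) repulsion between two like
    charges a distance r_nm apart; only the exponential screening trend matters here."""
    lam = debye_length_nm(ionic_strength_molar)
    return math.exp(-r_nm / lam) / r_nm

for conc in (0.01, 0.15, 1.0):   # dilute buffer, roughly physiological, high salt
    lam = debye_length_nm(conc)
    rep = screened_repulsion(3.0, conc)   # ~3 nm, a typical DNA-DNA spacing scale
    print(f"I = {conc:4.2f} M: Debye length = {lam:4.2f} nm, "
          f"relative repulsion at 3 nm = {rep:.2e}")
```

The trend is the point: raising the ionic strength cuts the repulsion at nanometer separations by orders of magnitude, which is the regime where the attraction seen in the simulations can dominate. Capturing how individual counter-ions arrange themselves around the methyl-group "bumps" is what required the atom-by-atom Blue Waters simulations.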


News Article | February 11, 2016
Site: www.techtimes.com

Scientists developed a new computer model that undermines the popular theory about the origins of the Yellowstone Supervolcano. The widespread belief is that the supervolcano was formed by a vertical column of hot rock, a mantle plume, believed to rise from the top of the planet's core. The enormous magmatic system acts as a kind of plumbing system beneath the Earth's surface. Researchers from the University of Illinois used the new data about the magmatic system and the supervolcano's past to create an improved computer model that debunks popular theories about its origins. The university's geology professor Lijun Liu said their new computer model presents the complete history of the supervolcano's activity. The research was published in the Geophysical Research Letters journal on Jan. 20. "The majority of previous studies have relied on conceptual, idealized models, which are not physically and geologically accurate," said Liu. The team considered more dynamic processes, which made their computer model more realistic and complex than past ones. The team used the National Center for Supercomputing Applications' Blue Waters supercomputer at the University. They replicated not only the surface's plate tectonic history but also the interior's geophysical image. The machine they used is one of the world's fastest supercomputers, and the study is the first to use such machinery to replicate Yellowstone's complex geophysical data. While the computer model wasn't intended to predict how the supervolcano actually formed, it tested how well past formation theories hold up against the new data on hand. When the popular mantle plume theory was tested, the computer model suggested that the vertical column of hot rock wasn't physically possible. Data on the ancient tectonic plates suggested that a plume would have been blocked from rising. The supervolcano's unknown origin is a big part of why it remains risky and continues to drive public concern. Liu added that continuous improvement of the supervolcano formation model can help predict Yellowstone's future behavior. "This research indicates that we need a multidisciplinary approach to understand complicated natural processes like Yellowstone," added Liu.
