Reproducibility Initiative

Palo Alto, CA, United States

News Article | November 23, 2016
Site: www.businesswire.com

SALT LAKE CITY--(BUSINESS WIRE)--SC16, the 28th annual international conference for high performance computing, networking, storage and analysis, celebrated the contributions of researchers and scientists, from those just starting their careers to those whose work has made lasting impacts. The conference drew more than 11,100 registered attendees and featured a technical program spanning six days. The exhibit hall featured 349 exhibitors from industry, academia and research organizations from around the world.

“There has never been a more important time for high performance computing, networking and data analysis,” said SC16 General Chair John West from the Texas Advanced Computing Center. “But it is also an acute time for growing our workforce and expanding diversity in the industry. SC16 was the perfect blend of research, technological advancement, career recognition and improving the ways in which we attract and retain that next generation of scientists.”

According to Trey Breckenridge, SC16 Exhibits Chair from Mississippi State University, the SC16 exhibition was the largest in the history of the conference. The overall size of the exhibition was 150,000 net square feet (breaking the 2015 record of 141,430). The 349 industry and research-focused exhibits included 44 first-timers and 120 organizations from 25 countries outside the United States.

During the conference, Salt Lake City also became the hub for the world’s fastest computer network: SCinet, SC16’s custom-built network, which delivered 3.15 terabits per second of bandwidth. The network featured 56 miles of fiber deployed throughout the convention center and $32 million in loaned equipment, and was made possible by 200 volunteers representing global organizations spanning academia, government and industry.

For the third year, SC featured an opening “HPC Matters” plenary, which this year focused on precision medicine, examining what the future holds in this area and how advances are only possible through the power of high performance computing and big data. Leading voices from the frontlines of clinical care, medical research, HPC system evolution, pharmaceutical R&D and public policy shared diverse perspectives on the future of precision medicine and how it will impact society.

The Technical Program again offered the highest quality original HPC research. The SC workshops set a record with more than 2,500 attendees. There were 14 Best Paper Finalists and six Gordon Bell Finalists. These submissions represent the best of the best in a wide variety of research topics in HPC.

“These awards are very important for the SC Conference Series. They celebrate the best and the brightest in high performance computing,” said Satoshi Matsuoka, SC16 Awards Chair from the Tokyo Institute of Technology. “These awards are not just plaques or certificates. They define excellence. They set the bar for the years to come and are powerful inspiration for both early career and senior researchers.”

Following is the list of Technical Program awards presented at SC16: SC16 received 442 paper submissions, of which 81 were accepted (an 18.3 percent acceptance rate). Of those, 13 were selected as finalists for the Best Paper (six) and Best Student Paper (seven) awards. The Best Paper Award went to “Daino: A High-Level Framework for Parallel and Efficient AMR on GPUs” by Mohamed Wahib Attia and Naoya Maruyama, RIKEN, and Takayuki Aoki, Tokyo Institute of Technology.
The Best Student Paper Award went to “Flexfly: Enabling a Reconfigurable Dragonfly Through Silicon Photonics” by Ke Wen, Payman Samadi, Sebastien Rumley, Christine P. Chen, Yiwen Shen, Meisam Bahadori, and Karen Bergman, Columbia University, and Jeremiah Wilke, Sandia National Laboratories.

The ACM Gordon Bell Prize is awarded for outstanding team achievement in high performance computing and tracks the progress of parallel computing. This year, the prize was awarded to a 12-member Chinese team for their research project, “10M-Core Scalable Fully-Implicit Solver for Nonhydrostatic Atmospheric Dynamics.” The winning team presented a solver (a method for calculating solutions) for atmospheric dynamics. In the abstract of their presentation, the team writes, “On the road to the seamless weather-climate prediction, a major obstacle is the difficulty of dealing with various spatial and temporal scales. The atmosphere contains time-dependent multi-scale dynamics that support a variety of wave motions.” To simulate the vast number of variables inherent in a weather system developing in the atmosphere, the winning group presented a highly scalable, fully implicit solver for three-dimensional nonhydrostatic atmospheric simulations governed by the fully compressible Euler equations, a set of equations frequently used to describe fluid dynamics (liquids and gases in motion); a standard textbook form of these equations is sketched after this awards summary. Winning team members are Chao Yang, Chinese Academy of Sciences; Wei Xue, Weimin Zheng, Guangwen Yang, Ping Xu, and Haohuan Fu, Tsinghua University; Hongtao You, National Research Center of Parallel Computer Engineering and Technology; Xinliang Wang, Beijing Normal University; Yulong Ao and Fangfang Liu, Chinese Academy of Sciences; Lin Gan, Tsinghua University; and Lanning Wang, Beijing Normal University.

This year, SC received 172 detailed poster submissions that went through a rigorous review process. In the end, 112 posters were accepted and five finalists were selected for the Best Poster Award. As part of its research poster activities, SC16 also hosted the ACM Student Research Competition for both undergraduate and graduate students. In all, 63 submissions were received and 26 Student Research Competition posters were accepted – 14 in the graduate category and 12 in the undergraduate category.

The Best Poster Award went to “A Fast Implicit Solver with Low Memory Footprint and High Scalability for Comprehensive Earthquake Simulation System,” with Kohei Fujita from RIKEN as the lead author.

Undergraduate Student Research Competition winners: First Place: “Touring Dataland? Automated Recommendations for the Big Data Traveler” by William Agnew and Michael Fischer, Advisors: Kyle Chard and Ian Foster. Second Place: “Analysis of Variable Selection Methods on Scientific Cluster Measurement Data” by Jonathan Wang, Advisors: Wucherl Yoo and Alex Sim. Third Place: “Discovering Energy Usage Patterns on Scientific Clusters” by Matthew Bae, Advisors: Wucherl Yoo, Alex Sim and Kesheng Wu.

Graduate Student Research Competition winners: First Place: “Job Startup at Exascale: Challenges and Solutions” by Sourav Chakraborty, Advisor: Dhabaleswar K. Panda. Second Place: “Performance Modeling and Engineering with Kerncraft” by Julian Hammer, Advisors: Georg Hager and Gerhard Wellein. Third Place: “Design and Evaluation of Topology-Aware Scatter and AllGather Algorithms for Dragonfly Networks” by Nathanael Cheriere, Advisor: Matthieu Dorier.

The Scientific Visualization and Data Analytics Award featured six finalists. The award went to “Visualization and Analysis of Threats from Asteroid Ocean Impacts,” with John Patchett as the lead author.
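For context on the Gordon Bell citation above: the compressible Euler equations it refers to can be written, in one standard conservative textbook form with a gravity source term, as shown below. This is a generic sketch for orientation only, not the winning team's actual formulation, which involves additional coordinate, discretization and physics choices not reproduced here.

\[
\begin{aligned}
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{u}) &= 0, \\
\frac{\partial (\rho \mathbf{u})}{\partial t} + \nabla \cdot \left( \rho \mathbf{u} \otimes \mathbf{u} + p\,\mathbf{I} \right) &= -\rho g\, \hat{\mathbf{z}}, \\
\frac{\partial E}{\partial t} + \nabla \cdot \big( (E + p)\, \mathbf{u} \big) &= -\rho g\, w,
\end{aligned}
\qquad
p = (\gamma - 1)\left( E - \tfrac{1}{2} \rho\, \lvert \mathbf{u} \rvert^{2} \right),
\]

where \(\rho\) is density, \(\mathbf{u} = (u, v, w)\) is velocity, \(p\) is pressure, \(E\) is total energy per unit volume, \(g\) is gravitational acceleration along the vertical direction \(\hat{\mathbf{z}}\), and \(\gamma\) is the ratio of specific heats. In this setting, “nonhydrostatic” means the vertical momentum equation is solved in full rather than being replaced by the hydrostatic balance \(\partial p / \partial z = -\rho g\), and “fully implicit” means every term is advanced implicitly in time, so each time step requires solving a very large nonlinear system; scaling that solve to millions of cores is the challenge the winning work addresses.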
The Student Cluster Competition returned for its 10th year. The competition, which debuted at SC07 in Reno and has since been replicated in Europe, Asia and Africa, is a real-time, non-stop, 48-hour challenge in which teams of six undergraduates assemble a small cluster at SC16 and race to complete a real-world workload across a series of scientific applications, demonstrate knowledge of system architecture and application performance, and impress HPC industry judges. The students partner with vendors to design and build a cutting-edge cluster from commercially available components that does not exceed a 3,120-watt power limit, and they work with application experts to tune and run the competition codes.

For the first time ever, the team that won top honors also won the award for achieving the highest performance on the Linpack benchmark application. The team “SwanGeese” is from the University of Science and Technology of China. In traditional Chinese culture, the rare swan goose stands for teamwork, perseverance and bravery. This is the university’s third appearance in the competition.

Also, an ACM SIGHPC Certificate of Appreciation is presented to the authors of a recent SC paper selected for the SC16 Student Cluster Competition Reproducibility Initiative. The selected paper was “A Parallel Connectivity Algorithm for de Bruijn Graphs in Metagenomic Applications” by Patrick Flick, Chirag Jain, Tony Pan and Srinivas Aluru from the Georgia Institute of Technology.

The George Michael Memorial HPC Fellowship honors exceptional Ph.D. students. The first recipient is Johann Rudi from the Institute for Computational Engineering and Sciences at the University of Texas at Austin for his project, “Extreme-Scale Implicit Solver for Nonlinear, Multiscale, and Heterogeneous Stokes Flow in the Earth’s Mantle.” The second recipient is Axel Huebl from Helmholtz-Zentrum Dresden-Rossendorf at the Technical University of Dresden for his project, “Scalable, Many-core Particle-in-cell Algorithms to Simulate Next Generation Particle Accelerators and Corresponding Large-scale Data Analytics.”

The SC Conference Series also serves as the venue for recognizing leaders in the HPC community for their contributions during their careers. Here are the career awards presented at SC16:

The IEEE-CS Seymour Cray Computer Engineering Award recognizes innovative contributions to high performance computing systems that best exemplify the creative spirit demonstrated by Seymour Cray. The 2016 award was presented to William J. Camp of Los Alamos National Laboratory “for visionary leadership of the Red Storm project, and for decades of leadership of the HPC community.” Camp previously served as Intel’s Chief Supercomputing Architect and directed Intel’s Exascale R&D efforts.

Established in memory of Ken Kennedy, the founder of Rice University’s nationally ranked computer science program and one of the world’s foremost experts on high-performance computing, the ACM/IEEE-CS Ken Kennedy Award recognizes outstanding contributions to programmability or productivity in high-performance computing together with significant community service or mentoring contributions. The 2016 Ken Kennedy Award was presented to William D. Gropp “for highly influential contributions to the programmability of high-performance parallel and distributed computers, and extraordinary service to the profession.” Gropp is the Acting Director of the National Center for Supercomputing Applications, Director of the Parallel Computing Institute, and the Thomas M. Siebel Chair in Computer Science at the University of Illinois at Urbana-Champaign.

The IEEE-CS Sidney Fernbach Memorial Award is given for outstanding contributions in the application of high performance computers using innovative approaches. The 2016 award was presented to Vipin Kumar “for foundational work on understanding scalability, and highly scalable algorithms for graph partitioning, sparse linear systems and data mining.” Kumar is a Regents Professor at the University of Minnesota.

The Supercomputing Conference Test of Time Award recognizes an outstanding paper that has appeared at the SC conference and has deeply influenced the HPC discipline. It is a mark of historical impact, recognizing that the paper has changed HPC trends. The winning paper is “Automatically Tuned Linear Algebra Software” by Clint Whaley of the University of Tennessee and Jack Dongarra of the University of Tennessee and Oak Ridge National Laboratory.

The IEEE TCHPC Award for Excellence in Scalable Computing for Early Career Researchers recognizes individuals who have made outstanding and potentially long-lasting contributions to the field within five years of receiving their Ph.D. The 2016 awards were presented to Kyle Chard, Computation Institute, University of Chicago and Argonne National Laboratory; Sunita Chandrasekaran, University of Delaware; and Seyong Lee, Oak Ridge National Laboratory.

SC17 will be held November 12-17, 2017, in Denver, Colorado. For more details, go to http://sc17.supercomputing.org/.

SC16, sponsored by the IEEE Computer Society and ACM (Association for Computing Machinery), offers a complete technical education program and exhibition to showcase the many ways high performance computing, networking, storage and analysis lead to advances in scientific discovery, research, education and commerce. This premier international conference includes a globally attended technical program, workshops, tutorials, a world-class exhibit area, demonstrations and opportunities for hands-on learning. For more information on SC16, visit: http://sc16.supercomputing.org.


Iorns E., Reproducibility Initiative | Gunn W., Reproducibility Initiative | Erath J., New York University | Rodriguez A., New York University | and 37 more authors.
PLoS ONE | Year: 2014

This study describes an attempt to replicate experiments from the paper "Effect of BMAP-28 Antimicrobial Peptides on Leishmania major Promastigote and Amastigote Growth: Role of Leishmanolysin in Parasite Survival," which was submitted to the Reproducibility Initiative for independent validation. The cathelicidin bovine myeloid antimicrobial peptide 28 (BMAP-28) and its isomers were previously shown to have potent antiparasitic activity against Leishmania major. We tested the effectiveness of L-BMAP-28 and two of its isomers, the D-amino acid form (D-BMAP-28) and the retro-inverso form (RI-BMAP-28), in both unamidated and amidated forms, as anti-leishmanial agents against Leishmania major promastigotes in vitro. We observed that L-BMAP-28, as well as its D and RI isomers, demonstrates anti-leishmanial activity against L. major promastigotes in vitro. The inhibitory effect was lower than what was seen in the original study: at 2 μM of the amidated peptides, viability was 94%, 36%, and 66% with the L-, D-, and RI-peptides, respectively, versus 57%, 6%, and 18% in the original study. © 2014 Iorns et al.


News Article | August 24, 2016
Site: www.nature.com

No scientist wants to be the first to try to replicate another’s promising study: much better to know what happened when others tried it. Long before replication or reproducibility became major talking points, scientists had strategies to get the word out. Gossip was one. Researchers would compare notes at conferences, and a patchy network would be warned about whether a study was worth building on. Or a vague comment might be buried in a related publication. Tell-tale sentences would start “In our hands”, “It is unclear why our results differed …” or “Interestingly, our results did not …”.

What might seem obvious — a paper on attempts and outcomes — was almost never an option. Many journals refused to consider replication studies, and a lot of researchers had no desire to start a feud if their results did not match. So scientists not in the know might waste time exploring a blind alley or be wary about truly promising research.

Things are improving. Nowadays, researchers who want to tell the scientific community about their replication studies have multiple ways to do so. They can chronicle their attempts on a blog, post on a preprint server or publish peer-reviewed work in those journals that do not require novelty. Just this year, the online platform F1000 launched the dedicated Preclinical Reproducibility and Robustness channel for refutations, confirmations or more nuanced replication studies. Other titles, including Scientific Data and the American Journal of Gastroenterology, have openly solicited replication attempts and negative results. In 2013, after controversial work on whether bioactive RNA molecules could cross from the digestive tract to the bloodstream, Nature Biotechnology declared itself “receptive to replication”, provided that such studies illuminate crucial research questions (Nature Biotechnol. 31, 943; 2013).

The psychology community is a leader in this: Perspectives on Psychological Science has begun publishing a new type of article, and pioneering a new form of collaboration. It asks psychologists to nominate an influential study for replication and to draw up a plan. The original author is invited to offer suggestions on the protocol, multiple labs volunteer to collect data, and results — whatever they may be — are published as a registered replication report (RRR). So far, three have been published, each with a perspective by the original authors.

Yet it would be inefficient to pursue such projects for more than a sliver of publications. Most replication attempts are not organized collaborations, but individual laboratories testing the next stage of their research. If those results were shared, science would benefit. Why doesn’t this happen more often? Because the replication ecosystem, such as it is, lacks visibility, value and conventions.

When a researcher happens on an exciting paper, there is no easy way to learn about replication attempts. Replication studies are not automatically or consistently linked to original papers on journal websites, PubPeer or PubMed. When a replication attempt is mentioned in passing in a broader study, there is no way to capture it. Journals cannot be expected to curate all replication attempts of papers they publish, although they should support technology that aggregates and disseminates that information. And they should be open to publishing in-depth replication attempts for original papers. For example, Scientific Reports encourages critique by offering to waive its article-processing charge for a peer-reviewed refutation of an article published in the journal.

Increased visibility would raise the value of a replication attempt, but also increase the risk of retaliation against replicators. There is little reward for taking that risk. A published replication currently does little to raise the esteem of the replicator with hiring committees or grant reviewers. This creates a chicken–egg problem — researchers don’t want to conduct and publish rigorous replication studies because they are not valued, and replication studies are not valued because few are published. Commendably, funders such as the Laura and John Arnold Foundation in the United States and the Netherlands Organisation for Scientific Research are explicitly supporting replication studies, and setting high expectations for publication. Scientists can help to ensure that such studies are valued by citing them and by discussing them on social media.

Conventions around replication studies are in their infancy — even the vocabulary is inadequate. Editors who coordinate RRRs strive to avoid loaded labels such as ‘successful’ and ‘failed’ replications. The Reproducibility Initiative, a project to help labs coordinate independent replications of their own work, also shied away from similar pronouncements after its first study. A paper is a jumble of context, experiments, results, analysis and informed speculation. Outcomes can depend on apparently trivial differences in methods, such as how vigorously reagents are mixed, as one collaboration painstakingly discovered (W. C. Hines et al. Cell Rep. 6, 779–781; 2014).

Neither are there conventions for interactions between replicators and the original authors. Some original authors have refused to share data or methodological details. In other cases, some replicators broadcast their attempts without first trying to resolve inconsistencies, a practice that leaves them more open to charges of incompetence. (Thankfully, both replicators and original authors are now backing away from name-calling.) As replication becomes more mainstream, we trust that the community will establish reasonable standards of conduct.

To foster better behaviour, replication attempts must become more common. We urge researchers to open their file drawers. We urge authors to cooperate with reasonable requests for primary data, to assume good intent and to write papers — and keep records — assuming that others will want to replicate their work. We urge funders and publishers to support tools that help researchers to thread the literature together. We welcome, and will be glad to help disseminate, results that explore the validity of key publications, including our own.
