News Article | May 19, 2017
Site: cerncourier.com

Fermilab’s founding director, Robert Wilson, wanted his new facility to look different from stereotypical government labs. Fermilab’s Wilson Hall, which is purposely reminiscent of Beauvais Cathedral in France, is a striking landmark in the Chicago area, but it is not the only visual milestone the laboratory’s first director left behind. While on sabbatical in 1961, Wilson studied sculpture at the Accademia di Belle Arti di Firenze in Italy, and Fermilab has celebrated his role as an artist by featuring several of his sculptures.

Straddling the Pine Street entrance is Broken Symmetry, a three-span arch, painted black on one side and orange on the other, that appears perfectly symmetrical when viewed from below but reveals carefully calculated asymmetry from other vantage points. Atop Ramsey Auditorium stands Wilson’s Möbius Strip, made of three-by-five-inch pieces of stainless steel welded onto a tubular form eight feet tall. Gracing the expansive grassy area in front of the laboratory’s Industrial Building Complex is Tractricious. This array of six-and-a-half-inch-diameter stainless-steel cryostat pipes, left over from construction of the Tevatron’s magnets, is bunched together in the form of a paraboloid. Wilson derived the name Tractricious from tractrix: a curve such that any tangent segment from the tangent point on the curve to the curve’s asymptote has constant length. Close to the Users’ Center is The Tree, a sculpture Wilson created with Fermilab welders around 1970. But perhaps the best known of Wilson’s works of art is the Hyperbolic Obelisk, which stands at the foot of the reflecting pond in front of Wilson Hall. It is 32 feet high, fabricated from three stainless-steel plates, each one-quarter-inch thick.

In the early 1990s, Wilson drew upon Frank Lloyd Wright’s Prairie school of architecture for the design of the building for the Leon M. Lederman Science Education Center. Other architectural landmarks at Fermilab include the Feynman Computing Center, originally built as the lab’s central computing facility; a concrete Archimedes Spiral covering the pumping stations at Casey’s Pond; and Wilson’s distinctive series of power-transmission-line poles, which resemble the Greek letter pi.
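Since the article describes the tractrix only in words, a standard parametrization of the curve is given here for reference (supplied for illustration, not taken from the article): with the asymptote along the x-axis and constant tangent length a,

```latex
% Tractrix with asymptote along the x-axis: the tangent segment from any
% point of the curve to the x-axis has constant length a.
x(t) = a\,\bigl(t - \tanh t\bigr), \qquad y(t) = a\,\operatorname{sech} t, \qquad t \ge 0.
```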


News Article | May 22, 2017
Site: www.rdmag.com

Scientists behind XENON1T, the largest dark matter experiment of its kind ever built, are encouraged by early results, describing them as the best so far in the search for dark matter. Dark matter is one of the basic constituents of the universe, five times more abundant than ordinary matter. Several astronomical measurements have corroborated the existence of dark matter, leading to an international effort to observe it directly. Scientists are trying to detect dark matter particles interacting with ordinary matter through the use of extremely sensitive detectors. Such interactions are so feeble that they have escaped direct detection to date, forcing scientists to build detectors that are more and more sensitive and have extremely low levels of radioactivity. On May 18, the XENON Collaboration released results from a first 30-day run of XENON1T, showing the detector has a record low radioactivity level, many orders of magnitude below surrounding material on Earth.

“The care that we put into every single detail of the new detector is finally paying back,” said Luca Grandi, assistant professor in physics at the University of Chicago and member of the XENON Collaboration. “We have excellent discovery potential in the years to come because of the huge dimension of XENON1T and its incredibly low background. These early results already are allowing us to explore regions never explored before.” The XENON Collaboration consists of 135 researchers from the United States, Germany, Italy, Switzerland, Portugal, France, the Netherlands, Israel, Sweden and the United Arab Emirates, who hope to one day confirm dark matter’s existence and shed light on its mysterious properties.

Located deep below a mountain in central Italy, XENON1T features a 3.2-ton xenon dual-phase time projection chamber. This central detector sits fully submersed in the middle of a water tank, in order to shield it from natural radioactivity in the cavern. A cryostat helps keep the xenon at a temperature of minus 95 degrees Celsius without freezing the surrounding water. The mountain above the laboratory further shields the detector, preventing it from being perturbed by cosmic rays. But shielding from the outer world is not enough, since all materials on Earth contain tiny traces of natural radioactivity. Thus extreme care was taken to find, select and process the materials making up the detector to achieve the lowest possible radioactive content. This allowed XENON1T to achieve the record “silence” necessary to detect the very weak output of dark matter. A particle interaction in the one-ton central core of the time projection chamber leads to tiny flashes of light. Scientists record and study these flashes to infer the position and the energy of the interacting particle—and whether it might be dark matter.

Despite the brief 30-day science run, the sensitivity of XENON1T has already surpassed that of any other experiment in the field, probing unexplored dark matter territory. “For the moment we do not see anything unexpected, so we set new constraints on dark matter properties,” Grandi said. “But XENON1T just started its exciting journey and since the end of the 30-day science run, we have been steadily accumulating new data.” Grandi’s group is very active within XENON1T, contributing to several aspects of the program.
After its initial involvement in the preparation, assembly and early operations of the liquid xenon chamber, the group has shifted its focus in the last several months to the development of the computing infrastructure and to data analysis. “Despite its low background, XENON1T is producing a large amount of data that needs to be continuously processed,” said Evan Shockley, a graduate student working with Grandi. “The raw data from the detector are directly transferred from Gran Sasso Laboratory to the University of Chicago, serving as the unique distribution point for the entire collaboration.” The framework, developed in collaboration with a group led by Robert Gardner, senior fellow at the Computation Institute, allows data to be processed both on local resources and on remote resources belonging to the Open Science Grid. The involvement of UChicago’s Research Computing Center, including Director Birali Runesha, allows members of the collaboration around the world to access processed data for high-level analyses.

Grandi’s group also has been heavily involved in the analysis that led to this first result. Christopher Tunnell, a fellow at the Kavli Institute for Cosmological Physics, is one of the two XENON1T analysis coordinators and corresponding author of the result. Recently, UChicago hosted about 25 researchers for a month to perform the analyses that led to the first results. “It has been a large, concentrated effort and seeing XENON1T back on the front line makes me forget the never-ending days spent next to my colleagues to look at plots and distributions,” Tunnell said. “There is no better thrill than leading the way in our knowledge of dark matter for the coming years.”
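To make the earlier description of the detector concrete (flashes of light recorded by photomultiplier tubes, from which position and energy are inferred), here is a minimal toy sketch of how a dual-phase time projection chamber can reconstruct an event: (x, y) from the light pattern on a top array of photomultiplier tubes, depth from the electron drift time, and a rough energy proxy from the total light collected. The function name, constants and PMT layout are invented for illustration; this is not the collaboration's analysis code.

```python
# Toy illustration (not XENON1T's actual reconstruction) of inferring an
# interaction's position and energy from recorded light in a dual-phase TPC.
import numpy as np

DRIFT_VELOCITY_MM_PER_US = 1.5   # assumed constant electron drift speed
PHOTONS_PER_KEV = 10.0           # assumed light yield (illustrative only)

def reconstruct_event(pmt_xy, pmt_signals, drift_time_us):
    """pmt_xy: (N, 2) PMT positions in mm; pmt_signals: (N,) detected photons
    per PMT; drift_time_us: delay between the prompt and delayed light signals."""
    pmt_xy = np.asarray(pmt_xy, dtype=float)
    pmt_signals = np.asarray(pmt_signals, dtype=float)
    total_light = pmt_signals.sum()
    # (x, y): signal-weighted centroid of the top-array hit pattern
    xy = (pmt_xy * pmt_signals[:, None]).sum(axis=0) / total_light
    # z: depth below the liquid surface, from the drift time
    z_mm = -DRIFT_VELOCITY_MM_PER_US * drift_time_us
    # crude energy proxy from the total detected light
    energy_kev = total_light / PHOTONS_PER_KEV
    return xy[0], xy[1], z_mm, energy_kev

# Example: four PMTs at the corners of a 100 mm square, brighter on one side
x, y, z, e = reconstruct_event(
    [(-50, -50), (50, -50), (-50, 50), (50, 50)],
    [120, 40, 110, 30],
    drift_time_us=200.0,
)
print(f"x={x:.1f} mm, y={y:.1f} mm, z={z:.1f} mm, E~{e:.0f} keV")
```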


News Article | May 24, 2017
Site: www.eurekalert.org

KIT's new computing center with its high-performance computer wins an energy-efficiency prize -- researchers from all over Germany can use the petaflop system

The new supercomputer of Karlsruhe Institute of Technology (KIT) is not only very fast, but also very economical. The high-performance computer ForHLR II, which started operation last year, has now taken first place in the German Computing Center Prize category of "Newly built energy- and resource-efficient computing centers". It has more than 24,000 processor cores and is equipped with a highly energy-efficient cooling system. The cost of the computer amounted to EUR 26 million. The project was funded in equal shares by the state of Baden-Württemberg and the Federal Republic of Germany: "ForHLR II plays an important role in our state's strategy for supercomputing. And, of course, I am very pleased that it is a green product. It is another example of performance being compatible with resource efficiency," says Baden-Württemberg Minister of Science Theresia Bauer. "Whoever wants to be at the cutting edge of international research needs the highest computing and storage capacity," says the President of KIT, Professor Holger Hanselka. "Here, the highest computing capacity and the latest visualization technology for modern simulation methods are combined with very low energy consumption, in line with the strategy of KIT. This reflects our strength of research in areas of high relevance to society." "Without the support of the Ministry for Science, Research, and the Arts and the Minister herself, it would have been impossible to make energy efficiency a focus of the project," says Professor Bernhard Neumair, Director of the Steinbuch Centre for Computing (SCC). Operation of the high-performance computer is integrated perfectly into KIT's energy supply concept based on co-generation.

Science today produces rapidly increasing data volumes that not only have to be processed and stored, but also visualized. Researchers from all over Germany can use ForHLR II, a petaflop system with more than 1,170 nodes, more than 24,000 processor cores, and 75 terabytes of main memory. One petaflop corresponds to one quadrillion computing operations per second. ForHLR II's capacity exceeds that of ForHLR I, which started operation in 2014, by a factor of 2.5.

The new computing center building is equipped with the latest technology for highly energy-efficient warm-water cooling at up to 45°C. During the cold season, the waste heat of the system is used to heat the office building. Reliable cooling of all hot system components is ensured all year round, and the system does not need any energy-intensive additional cooling machines. For components that still require classical cold-air cooling, a district cooling network based on trigeneration is being established at Campus North of KIT. As a result, economic and environmental efficiency will increase even further. According to Professor Rudolf Lohner of SCC, designing an environmentally efficient cooling system in one of the warmest areas of Germany was a particular challenge. Lohner accompanied the project from the start through to implementation and coordinated it in the final phase. So-called wet coolers, in which water evaporates on the surface of cooling elements and cools the contents, are not suitable because of the high maintenance they require during pollen season and their susceptibility to bacterial colonization. "Hence, we had to use dry coolers." Their smaller cooling capacity was compensated for by building them larger. Lohner points out that many aspects of planning and building such a complex computing center could only be mastered in close cooperation with partners and other units of KIT. "This highest recognition of our successful project by a Germany-wide expert panel reflects how KIT has grown together and put the resulting synergies to use," he says.

The German Computing Center Prize is awarded every year at future thinking, Germany's biggest computing center congress, which this year took place in Darmstadt on April 25 and 26. For further information, please contact: Dr. Felix Mescoli, Press Officer, Tel.: +49 721 608 48120, Fax: +49 721 608 43658, felix.mescoli@kit.edu

Karlsruhe Institute of Technology (KIT) pools its three core tasks of research, higher education, and innovation in a single mission. With about 9,300 employees and 25,000 students, KIT is one of the largest institutions of research and higher education in natural sciences and engineering in Europe. KIT - The Research University in the Helmholtz Association. Since 2010, KIT has been certified as a family-friendly university.
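As a rough sanity check on the figures quoted above (the node and core counts are given only as "more than" lower bounds, so these are back-of-envelope estimates, not official specifications):

```python
# Back-of-envelope figures implied by the numbers quoted in the article.
PFLOPS = 1e15                  # one petaflop = 10**15 operations per second
nodes, cores = 1170, 24000     # quoted as "more than", so lower bounds
print(f"per node: {PFLOPS / nodes / 1e12:.2f} Tflop/s")   # ~0.85 Tflop/s
print(f"per core: {PFLOPS / cores / 1e9:.1f} Gflop/s")    # ~41.7 Gflop/s
print(f"ForHLR I: {1 / 2.5:.2f} Pflop/s (2.5x less than ForHLR II)")
```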


News Article | April 17, 2017
Site: www.eurekalert.org

Cocaine, nicotine and capsaicin are just three familiar examples of the hundreds of thousands of small molecules (also called specialized or secondary metabolites) that plants use as chemical ammunition to protect themselves from predation. Unfortunately, identifying the networks of genes that plants use to make these biologically active compounds, which are the source of many of the drugs that people use and abuse daily, has vexed scientists for years, hindering efforts to tap this vast pharmacopeia to produce new and improved therapeutics. Now, Vanderbilt University geneticists think they have come up with an effective and powerful new way of identifying these elusive gene networks, which typically consist of a handful to dozens of different genes, that may overcome this roadblock.

"Plants synthesize massive numbers of bioproducts that are of benefit to society. This team has revolutionized the potential to uncover these natural bioproducts and understand how they are synthesized," said Anne Sylvester, program director in the National Science Foundation's Biological Sciences Directorate, which funded the research. The revolutionary new approach is based on the well-established observation that plants produce these compounds in response to specific environmental conditions. "We hypothesized that the genes within a network that work together to make a specific compound would all respond similarly to the same environmental conditions," explained Jennifer Wisecaver, the postdoctoral fellow who conducted the study.

To test this hypothesis, Wisecaver - working with Cornelius Vanderbilt Professor of Biological Sciences Antonis Rokas and undergraduate researcher Alexander Borowsky - turned to Vanderbilt's in-house supercomputer at the Advanced Computing Center for Research & Education in order to crunch data from more than 22,000 gene expression studies performed on eight different model plant species. "These studies use advanced genomic technologies that can detect all the genes that plants turn on or off under specific conditions, such as high salinity, drought or the presence of a specific predator or pathogen," said Wisecaver. But identifying the networks of genes responsible for producing these small molecules from thousands of experiments measuring the activity of thousands of genes is no trivial matter. That's where the Vanderbilt scientists stepped in: they devised a powerful algorithm capable of identifying the networks of genes that show the same behavior (for example, all turning on) across these expression studies.

The result of all this number crunching - described in the paper titled "A global co-expression network approach for connecting genes to specialized metabolic pathways in plants", published online April 13 in The Plant Cell - was the identification of dozens, possibly even hundreds, of gene pathways that produce small metabolites, including several that previous experiments had identified. Vered Tzin from Ben-Gurion University's Jacob Blaustein Institutes for Desert Research in Israel and Georg Jander from Cornell University's Boyce Thompson Institute for Plant Research in Ithaca, NY, helped verify the predictions the analysis made in corn, and Daniel Kliebenstein from the Department of Plant Sciences at the University of California, Davis helped verify the predictions in the model plant system Arabidopsis. The results of their analysis go against the prevailing theory that the genes that make up these pathways are clustered together on the plant genome.
"This idea comes from the observation in fungi and bacteria that the genes that make up these specialized metabolite pathways are clustered together," said Rokas. "In plants, however, these genes appear to be mostly scattered across the genome. Consequently, the strategies for discovering plant gene pathways will need to be different from those developed in the other organisms." The researchers argue that the results of their study show that this approach "is a novel, rich and largely untapped means for high-throughput discovery of the genetic basis and architecture of plant natural products." If that proves to be true, then it could help open the tap on new plant-based therapeutics for treating a broad range of conditions and diseases. The research was funded by a National Science Foundation National Plant Genome Initiative (NPGI) Postdoctoral Research Fellowship to Wisecaver (IOS-1401682), as well as by National Science Foundation grants to Rokas (DEB-1442113) and Jander (IOS-1339237).


News Article | April 17, 2017
Site: www.futurity.org

A new method for identifying the gene networks plants use to create anti-predator chemicals could lead to more effective drugs, a new study suggests. Plants create hundreds of thousands of small molecules (also called specialized or secondary metabolites)—including chemicals like cocaine, nicotine, and capsaicin—to use as “chemical ammunition” to protect themselves from predation. Unfortunately, the difficulty of identifying the networks of genes that plants use to make these biologically active compounds, which are the source of many of the drugs that people use and abuse daily, has hindered efforts to tap this vast pharmacopeia to produce new and improved therapeutics. Now, geneticists think they have come up with an effective and powerful new way for identifying these elusive gene networks, which typically consist of a handful to dozens of different genes.

“Plants synthesize massive numbers of bioproducts that are of benefit to society. This team has revolutionized the potential to uncover these natural bioproducts and understand how they are synthesized,” says Anne Sylvester, program director in the National Science Foundation’s Biological Sciences Directorate, which funded the research. The revolutionary new approach is based on the well-established observation that plants produce these compounds in response to specific environmental conditions. “We hypothesized that the genes within a network that work together to make a specific compound would all respond similarly to the same environmental conditions,” explains Jennifer Wisecaver, a postdoctoral fellow at Vanderbilt University who conducted the study.

To test this hypothesis, Wisecaver—working with professor of biological sciences Antonis Rokas and undergraduate researcher Alexander Borowsky—turned to Vanderbilt’s in-house supercomputer at the Advanced Computing Center for Research & Education in order to crunch data from more than 22,000 gene expression studies performed on eight different model plant species. “These studies use advanced genomic technologies that can detect all the genes that plants turn on or off under specific conditions, such as high salinity, drought, or the presence of a specific predator or pathogen,” says Wisecaver. But identifying the networks of genes responsible for producing these small molecules from thousands of experiments measuring the activity of thousands of genes is no trivial matter. That’s where the scientists stepped in; they devised a powerful algorithm capable of identifying the networks of genes that show the same behavior (for example, all turning on) across these expression studies.

The result of all this number crunching—described in a paper in The Plant Cell—was the identification of dozens, possibly even hundreds, of gene pathways that produce small metabolites, including several that previous experiments had identified. Vered Tzin from Ben-Gurion University’s Jacob Blaustein Institutes for Desert Research in Israel and Georg Jander from Cornell University’s Boyce Thompson Institute for Plant Research in Ithaca, New York, helped verify the predictions the analysis made in corn, and Daniel Kliebenstein from the plant sciences department at the University of California, Davis helped verify the predictions in the model plant system Arabidopsis. The results of their analysis go against the prevailing theory that the genes that make up these pathways are clustered together on the plant genome.
“This idea comes from the observation in fungi and bacteria that the genes that make up these specialized metabolite pathways are clustered together,” says Rokas. “In plants, however, these genes appear to be mostly scattered across the genome. Consequently, the strategies for discovering plant gene pathways will need to be different from those developed in the other organisms.” The researchers argue that the results of their study show that this approach “is a novel, rich, and largely untapped means for high-throughput discovery of the genetic basis and architecture of plant natural products.” If that proves to be true, then it could help open the tap on new plant-based therapeutics for treating a broad range of conditions and diseases. Funding came from a National Science Foundation National Plant Genome Initiative Postdoctoral Research Fellowship to Wisecaver, as well as from National Science Foundation grants to Rokas and Jander.


News Article | April 19, 2017
Site: phys.org

Teaching a machine to spot patterns sounds like an easy task – after all, any animal with basic vision can see a moving object, decide whether it is food or a threat, and react accordingly – but what comes easily to a scallop is a challenge for the world's biggest supercomputers. UC Davis physicist James Crutchfield, along with physics graduate student Adam Rupe and postdoc Ryan James, is designing new machine learning systems to allow supercomputers to spot large-scale atmospheric structures, such as hurricanes and atmospheric rivers, in climate data. The UC Davis Complexity Sciences Center, which Crutchfield leads, was recently named an Intel Parallel Computing Center and is collaborating with Intel Research, the Department of Energy's National Energy Research Scientific Computing Center (NERSC) at the Lawrence Berkeley Lab, Stanford University, and the University of Montreal. The entire Big Data Center project is led by Prabhat, leader of the Data And Analytics Services Group at the Berkeley lab. The team works on NERSC's Cori supercomputer, one of the five fastest machines in the world, with over 600,000 CPU cores.

Modern science is full of "big data." For climate science, that includes both satellite- and ground-based measurements that span the planet, as well as "big" simulations. "We need a new kind of machine learning to interpret very large data and planet-wide simulations," Crutchfield said. Climate and weather systems evolve over time, so the machines need to be able to find patterns not only in space but over time. "Dynamics are key to this," Crutchfield said. Humans (and other visual animals) recognize dynamic changes very quickly, but it's much harder for machines.

Pattern discovery is more than pattern recognition

With existing technology, computers recognize patterns based on an existing template. That's how voice recognition systems work, by comparing your voice to an existing catalog of sounds. These pattern recognition systems can be very useful, but they can't identify anything truly new – anything that isn't represented in their template. Crutchfield and his team are taking a different approach, based on pattern discovery. They are working on algorithms that allow computers to identify structures in data without knowing what they are in advance. "Learning novel patterns is what humans are uniquely good at, but machines can't do it," he said. Using pattern discovery, a supercomputer would learn how to identify hurricanes or other features in climate and weather data. It might also identify new kinds of structures that are too complex for humans to perceive at all. While this application is in global climate modeling, Crutchfield hopes to make it a new paradigm for analyzing very large datasets. "Usually, you apply known models to interpret the data. To say that you will extract your model directly from the data is a radical claim," he said.

The collaboration is part of the Intel Parallel Computing Centers program, which provides funding to universities, institutions, and research labs to modernize key community codes used across a wide range of disciplines to run on industry-standard parallel architectures.
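The distinction drawn above between recognition and discovery can be illustrated with a toy sketch: it first matches a known template against a synthetic space-time field (recognition), then clusters unlabeled space-time patches of the same field so that recurring structures emerge without any template (discovery). This is only an illustration of the general idea, not the UC Davis group's actual algorithms, and all data are synthetic.

```python
# Toy contrast between pattern *recognition* (match against a known template)
# and pattern *discovery* (group recurring structures without any template).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic "climate field": a 1-D field evolving over time, containing two
# recurring kinds of local space-time structure plus noise.
t, x = 200, 64
field = rng.normal(scale=0.3, size=(t, x))
field[:, 10:20] += np.sin(np.linspace(0, 20, t))[:, None]          # structure A
field[:, 40:50] += np.sign(np.sin(np.linspace(0, 5, t)))[:, None]  # structure B

# Recognition: correlate the field against a template we already have.
template = np.sin(np.linspace(0, 20, t))
recognized = np.argmax(field.T @ template)   # column best matching the template

# Discovery: cut the field into small space-time patches and cluster them,
# letting recurring structures emerge without specifying them in advance.
patch = 8
patches = np.array([
    field[i:i + patch, j:j + patch].ravel()
    for i in range(0, t - patch, patch)
    for j in range(0, x - patch, patch)
])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(patches)

print("template best matches column", recognized)
print("discovered cluster sizes:", np.bincount(labels))
```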


News Article | April 17, 2017
Site: www.prweb.com

When people think about spinal cord injuries, thoughts generally turn toward Christopher Reeve, who was thrown from his horse during trial events for an equestrian competition in 1995, and Steven McDonald, who was shot three times in 1986 after serving two years as an officer with the New York Police Department. Reeve’s and McDonald’s heroic and visible survival stories brought the severity of spinal cord injuries into the international dialogue.

Today at the College of Staten Island (CSI), Maria Knikou, PhD, is holding clinical trials of her breakthrough research designed to develop effective rehabilitation strategies to improve the walking ability of persons with spinal cord injuries that have affected the function of the central nervous system. During her ongoing trials, she has recently worked with eight people with spinal cord injuries, including a 20-year-old who fell out of a golf cart and broke his neck nine months ago, and a Midwestern woman who broke her neck. These people, who have been diagnosed with tetraplegia (a spinal cord injury above the first thoracic vertebra, or within cervical segments C1-C8) and severe paralysis of the legs, came to CSI to participate in the research trials. After completing four to six weeks of therapy with Dr. Knikou, the patients saw motor function improve, with increased control and reduced spasticity.

According to spinalcord.com, "The spinal cord carries nerve fibers traveling both from the brain to the rest of the body and from the body back to the brain. Those coming from the brain are responsible for voluntary control of muscles. Those traveling toward the brain carry sensation." Dr. Knikou’s non-invasive therapy focuses on assessing the signal transfer from the brain to the legs in order to strengthen and enhance that pathway and provide gains in motor function. Patients who undergo the phase one therapy may be eligible for the phase two Robotic Gait Training, designed to further stimulate brain, spinal, and muscular health on a pathway to improved mobility. People who participate in the trials are provided a stipend, and certain expenses may be covered. Persons who are interested in learning whether they are eligible candidates for this unique therapeutic approach should contact Dr. Knikou, Professor of Human Neurophysiology in the Physical Therapy Department of the School of Health Sciences, at 718.982.3316 or maria.knikou(at)csi.cuny.edu. All trials are conducted on the Willowbrook campus of the College of Staten Island in New York City.

"Dr. Knikou's forward thinking and expertise in human neurophysiology have enabled her to be extremely successful, with ongoing grant support from New York State and other private foundations," commented Dean Maureen Becker, PhD. "She is one of the leading researchers in the School of Health Sciences at the College of Staten Island and her work, one day, will impact the lives of millions of individuals with spinal cord injury." Dr. Knikou’s research project is funded by the New York State Department of Health, Spinal Cord Injury Research Board, under the Project to Accelerate Research Translation (PART) award. She mentors high school, undergraduate, and graduate students, as well as postdoctoral research fellows and junior faculty. Dr. Knikou serves on several editorial boards and has published her research work in high-ranking, peer-reviewed scientific journals. For more information about the College of Staten Island School of Health Sciences, visit http://www.csi.cuny.edu/schoolofhealthsciences.
About the College of Staten Island

The College of Staten Island is a senior college of The City University of New York (CUNY) offering doctoral programs, advanced certificate programs, and master’s programs, as well as bachelor’s and associate’s degrees. CSI is ranked 3rd in New York State by MONEY magazine for Best Colleges and 6th in the nation on CollegeNet’s Social Mobility Index. CSI is also a “Top Master’s University,” as ranked by Washington Monthly; is in the top 15% for alumni salary potential according to PayScale; and has been named a Military Friendly School for seven consecutive years by GI Jobs magazine. The CUNY Interdisciplinary High-Performance Computing Center, one of the most powerful supercomputers in the New York City region, handles big-data analysis for faculty researchers and their student research teams, as well as researchers nationwide. The 204-acre park-like campus of CSI, the largest in NYC, is fully accessible and contains an advanced, networked infrastructure to support technology-based teaching, learning, and research. Dolphin Cove Residence Halls, the college’s new apartment-style luxury suites, celebrate their third year at full occupancy, housing students from across NYC, the United States, and the world.


News Article | May 1, 2017
Site: www.biosciencetechnology.com

Digital imaging is another area where HPC-enabled speedups are advancing clinical care. Panelist Simon K. Warfield described innovative imaging techniques his team is applying to increase understanding of the brain’s complex circuitry. Dr. Warfield is the Thorne Griscom Professor of Radiology at Harvard Medical School and the founder and director of the Computational Radiology Lab (CRL) at Boston Children's Hospital. CRL is an Intel Parallel Computing Center that is modernizing the algorithms and data structures of medical image computing on Intel Xeon and Intel Xeon Phi processors. The lab is improving cache performance, vectorization performance and multi-threading performance, as well as creating more sophisticated imaging and modeling strategies. CRL’s work can contribute to improved diagnosis and treatment of brain injuries, multiple sclerosis, depression, Alzheimer’s and many other conditions.

Consider the novel technique CRL has developed to show more clearly water’s diffusion through the brain—and pinpoint hindrances and restrictions to its flow. In contrast to traditional image processing approaches, CRL’s diffusion-weighted imaging infers new parametric maps from data measurements. Its computational model takes tens or hundreds of 3D images—each up to 10 million pixels—as its inputs. “This type of analysis is very computationally intensive,” Warfield said. “With the accelerated algorithm and the Intel Xeon Phi processors, we reduced the time needed from 48 hours to 15 minutes of calculations.” That speedup can translate to immediate benefits for critically ill patients facing brain surgery. That’s because, as Warfield put it, “When you’re talking about surgical planning, life is a matter of time.” Recently, one of the hospital’s neurosurgery teams realized on a Friday that their patient’s conventional magnetic resonance scan was not clear enough to allow them to proceed with a planned brain resection. With the surgery-planning meeting scheduled for Monday, they requested emergency use of CRL’s diffusion imaging algorithm. The patient had a new scan Saturday evening, the data was processed on Sunday, and the information was ready for the team’s decision on Monday.

The panel also highlighted precision medicine’s global reach—and its big data challenges. Fang Lin, Director of the Bioinformatics Center at BGI, described BGI’s use of the Lustre file system to help maintain storage performance as its data volumes grow. BGI is a global research leader as well as a provider of genetic testing products. It also operates the China National Genebank, putting it at the forefront of China’s five-year plan. BGI cranks out 20 terabytes of sequencing data every day. The institute stores 13 petabytes of genomic data and uses a 10 petabyte file system comprising Intel Enterprise Edition for Lustre Software and open source technologies.

Dr. David Torrents, a molecular biologist and research professor at the Barcelona Supercomputing Center, shone a spotlight on the importance of collaboration in advancing precision medicine. BSC provides resources to a variety of international centers and consortia. In addition, the institute conducts its own multidisciplinary research in computational biomedicine and related fields. BSC’s alliances also encompass a range of hospitals and medical centers, enabling it to validate and test its models and tools with data from clinical institutions. “We’re at an exciting moment,” Torrents said.
“We are not just developing new solutions for personalized medicine, but now are beginning a pilot program in January 2017 to bring them together and apply them in clinical settings, beginning in Catalonia and then throughout Spain.”

The panelists say continued leaps forward in precision medicine will come from faster and more sophisticated analysis of larger volumes of more varied data types. “What we want is a more holistic picture, and for that, it’s becoming absolutely critical to combine many diverse data types together for analysis,” said Lowey. To achieve that holistic picture, researchers want to use deep learning and other forms of artificial intelligence. They also want to apply those AI methods to genomic data in combination with imaging data, lifelong clinical records, population studies, environmental studies, and much more. Different aspects of the precision medicine workflow will have varying processing and storage requirements, so the push continues for faster performance with agile or heterogeneous platform architectures rather than a single “silver bullet” approach. Processors will continue as the primary workhorses, supplemented by embedded resources and FPGA accelerators for parts of the workflow. Distributed compute and storage resources will remain crucial, along with advances in applications and tools.

As to the clinical impact of these holistic approaches, look no further than Boston Children’s Hospital. Noninvasive prenatal genomic testing can indicate whether a fetus has the risk factors that predispose it to be born with a malformed heart. If genetic testing shows these factors are present, data-intensive digital imaging can reveal whether the heart is actually deformed. By combining genomic with other medical data in this way, clinicians can provide peace of mind for worried parents-to-be, or help them plan for their child’s future. “We’re starting to connect the genetics that predisposes an individual to heart disease, with the imaging to see if the defect is present, and use that information to influence current treatment,” said Warfield. “That information can also help us plan for the child’s longer-term future. We can predict how they’ll do as teenagers and begin to plan accordingly.”

Precision medicine is one of the most promising and meaningful applications of high-performance computing today. “It’s still early days, but we’re moving toward an exciting new era of predictive biology and personalized medicine,” said McManus. “Our panelists gave us a great taste of what’s on the horizon. With continued advances in platform technologies, artificial intelligence, and other areas, we create significant opportunities to increase the science of medicine and ultimately improve human health. Intel is excited to empower scientists and clinicians with technology innovations, resources and expertise as we collaborate to make this new era a reality.”

Jan Rowell writes about technology trends in HPC, healthcare, life sciences, and other industries.
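Returning to the diffusion-imaging speedup Warfield described, the sketch below illustrates the general kind of computation involved in turning a stack of diffusion-weighted 3D images into a parametric map. It fits the simple mono-exponential model S(b) = S0 * exp(-b * ADC) voxel by voxel on synthetic data; it is a minimal stand-in for this class of analysis, not CRL's far more sophisticated model, and all values are invented.

```python
# Minimal sketch: derive a parametric (ADC) map from a stack of
# diffusion-weighted volumes via a vectorized log-linear least-squares fit.
import numpy as np

def fit_adc(dwi, bvalues):
    """dwi: (n_b, X, Y, Z) signal at each b-value; returns an ADC map (X, Y, Z)."""
    dwi = np.asarray(dwi, dtype=float)
    b = np.asarray(bvalues, dtype=float)
    logs = np.log(np.clip(dwi, 1e-6, None)).reshape(len(b), -1)  # (n_b, n_voxels)
    # Log-linear least squares: log S = log S0 - b * ADC, solved for all voxels at once
    design = np.column_stack([np.ones_like(b), -b])
    coeffs, *_ = np.linalg.lstsq(design, logs, rcond=None)
    return coeffs[1].reshape(dwi.shape[1:])                      # the ADC map

# Synthetic 3-D volume with a true ADC of 0.8e-3 mm^2/s
b = [0, 500, 1000]
true_adc = 0.8e-3
vol = np.stack([1000.0 * np.exp(-bi * true_adc) * np.ones((16, 16, 8)) for bi in b])
print(fit_adc(vol, b).mean())   # ~0.0008
```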
