Overlook, United States

News Article | May 16, 2017
Site: www.eurekalert.org

MIAMI--Researchers believe they have found a new way to monitor the intensity and location of hurricanes from hundreds of miles away by detecting atmospheric waves radiating from the centers of these powerful storms. In a new study, scientists from the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science and the Hurricane Research Division of the National Oceanic and Atmospheric Administration (NOAA) presented direct observations of the waves, obtained by NOAA aircraft flying in hurricanes and by a research buoy in the Pacific Ocean.

The waves, known as atmospheric gravity waves, are produced by strong thunderstorms near the eye and radiate outward in expanding spirals. "These very subtle waves can sometimes be seen in satellite images," said David Nolan, professor in the Department of Atmospheric Sciences and lead author of the study. "We were able to measure them in aircraft data and surface instruments."

In addition, says Nolan, computer simulations performed at the UM Center for Computational Science can reproduce the waves, showing that wave strength can be related to the maximum wind speed in the core of the storm. These findings suggest that hurricanes and typhoons could be monitored from hundreds of miles away with relatively inexpensive instruments, such as barometers and anemometers, much as earthquakes around the world are monitored by seismometers.

The researchers analyzed data from 25 penetrations by NOAA P3 aircraft into five hurricanes in 2003 and 2004, as well as data from the Extreme Air-Sea Interaction (EASI) buoy deployed in the Pacific Ocean by UM Rosenstiel School scientists in 2010. "The waves cause very weak upward and downward motions, which are recorded by the NOAA P3 as it flies through the storm," said Jun Zhang of the Hurricane Research Division, a veteran of many hurricane flights. "But we were surprised at how clearly the waves could be detected at the surface."
"Of course, hurricanes are very well observed by satellites. But these waves can reveal processes occurring in the eyewall of a hurricane that are obscured from the view of satellites by thick clouds," said Nolan. "Any additional measurements, even if they provide similar information as satellites, can lead to better forecasts."

The study, titled "Spiral Gravity Waves Radiating from Tropical Cyclones," was published April 30, 2017, in the journal Geophysical Research Letters. The National Science Foundation (grant #AGS1132646) and the NOAA Hurricane Forecast Improvement Program (grant #NA14NWS4680028) funded the study.

The University of Miami is one of the largest private research institutions in the southeastern United States. The University's mission is to provide quality education, attract and retain outstanding students, support the faculty and their research, and build an endowment for University initiatives. Founded in the 1940s, the Rosenstiel School of Marine & Atmospheric Science has grown into one of the world's premier marine and atmospheric research institutions. Offering dynamic interdisciplinary academics, the Rosenstiel School is dedicated to helping communities better understand the planet, participating in the establishment of environmental policies, and aiding in the improvement of society and quality of life.
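The monitoring idea the researchers describe amounts to picking a weak, periodic pressure signal out of a surface barometer record. The sketch below is illustrative only (the study does not publish code, and the 600-second wave period and the amplitudes are invented for the demo): it estimates a dominant oscillation period from a pressure time series via a simple FFT power spectrum.

```python
import numpy as np

def dominant_wave_period(pressure_hpa, dt_s):
    """Estimate the dominant oscillation period (seconds) in a surface-pressure
    series: remove the linear trend, then find the peak of the power spectrum."""
    n = len(pressure_hpa)
    t_idx = np.arange(n)
    trend = np.polyval(np.polyfit(t_idx, pressure_hpa, 1), t_idx)
    spectrum = np.abs(np.fft.rfft(pressure_hpa - trend)) ** 2
    freqs = np.fft.rfftfreq(n, d=dt_s)
    peak = np.argmax(spectrum[1:]) + 1   # skip the zero-frequency bin
    return 1.0 / freqs[peak]

# Synthetic barometer record: a slow pressure fall plus a weak 600 s oscillation,
# standing in for a gravity-wave signal far from the storm (values are made up).
t = np.arange(0, 6 * 3600, 10.0)                      # 6 hours, 10 s sampling
p = 1005.0 - 1e-4 * t + 0.05 * np.sin(2 * np.pi * t / 600.0)
print(dominant_wave_period(p, 10.0))                  # ≈ 600 seconds
```

In a real deployment the signal would be buried in weather noise, so band-pass filtering around the expected wave periods and averaging across several stations would be needed before the spectral peak becomes meaningful.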




"The high performance and flexible sizing of DDN systems make them ideal for large-scale machine learning architectures," said Joel Zysman, director of advanced computing at the Center for Computational Science at the University of Miami. "With DDN, we can manage all our different applications from one centrally located storage array, which gives us both the speed we need and the ability to share information effectively. Plus, for our industrial partnership projects that each rely on massive amounts of instrument data in areas like smart cities and autonomous vehicles, DDN enables us to do mass transactions on a scale never before deemed possible. These levels of speed and capacity are capabilities that other providers simply can't match."

With storage appliances that can start at a few hundred terabytes and grow to roughly 10 PB in a single rack, DDN's machine learning customers can scale from test bed to production ramp and beyond on a single platform. DDN solutions enable customers to leverage machine learning applications to speed results and improve competitiveness, profitability, customer service, business intelligence and research effectiveness. For example, DDN storage allows machine learning algorithms to run faster and to include more data than any other system on the market, which enables researchers to accelerate algorithm testing, shorten development and refinement cycles, and ultimately decrease time to market for the "learned" results – a significant advantage in today's competitive markets.

"The uniqueness of DDN's architecture enables The University of Miami to save data being generated constantly from literally millions of sensors to address the entire storage needs for a smart city with up to 15,000 residents," Zysman added. "Equally impressive, we can do all that without impacting our other research, computations and simulations that are going on at the same time."

As huge amounts of processing power and large data repositories have become more affordable, a rich environment for the advancement of machine learning and deep learning has emerged. Machine learning applications are being created and implemented across a wide range of processes, replacing or improving human input and addressing problems that previously were not tackled because of the sheer volume of the data.

"To be successful, machine learning programs need to think big from the start," said Laura Shepard, senior director of product marketing at DDN. "Prototypes of programs that start by using mid-range enterprise storage or by adding drives to servers often find that these approaches are not sustainable when they need to ramp to production. With DDN, customers can transition easily with a single high-performance platform that scales massively. Because of this, DDN is experiencing tremendous demand from both research and enterprise organizations looking for high-performance storage solutions to support machine learning applications."

About DDN

DataDirect Networks (DDN) is the world's leading big data storage supplier to data-intensive, global organizations. For more than 18 years, DDN has designed, developed, deployed and optimized systems, software and storage solutions that enable enterprises, service providers, universities and government agencies to generate more value and to accelerate time to insight from their data and information, on premise and in the cloud. Organizations leverage the power of DDN storage technology and the deep technical expertise of its team to capture, store, process, analyze, collaborate on and distribute data, information and content at the largest scale in the most efficient, reliable and cost-effective manner.

DDN customers include many of the world's leading financial services firms and banks, healthcare and life science organizations, manufacturing and energy companies, government and research facilities, and web and cloud service providers. For more information, go to www.ddn.com or call 1-800-837-2298.

©2017 All rights reserved. DDN Storage and DDN are trademarks owned by DataDirect Networks. All other trademarks are the property of their respective owners. To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/ddn-technology-delivers-production-level-performance-for-machine-learning-success-at-scale-300463634.html


News Article | November 20, 2015
Site: www.scientificcomputing.com

Here they are — the five most-visited stories from the past week. One of the largest centralized academic cyberinfrastructures; more than a third of the HPC market leveraging HPE; speeding highly advanced engineering simulations to maximize real-life product performance; the most extreme 'entanglement' between pairs of photons ever seen in the lab; and the first data-driven map of Earth's hidden groundwater reserves, revealing that use exceeds renewal, are all among the top stories.

First Data-driven Map of Earth's Hidden Groundwater Reserves: Use Exceeds Renewal
Groundwater: it's one of the planet's most exploited, most precious natural resources. It ranges in age from months to millions of years old. Around the world, there's increasing demand to know how much we have and how long before it's tapped out. For the first time since a back-of-the-envelope calculation was attempted in the 1970s, hydrologists have produced a data-driven estimate of the Earth's total supply of groundwater.

Experiment Records Extreme Quantum Weirdness
Researchers at the National University of Singapore and the University of Seville in Spain have reported the most extreme 'entanglement' between pairs of photons ever seen in the lab. The achievement is evidence for the validity of quantum physics and will bolster confidence in schemes for quantum cryptography and quantum computing designed to exploit this phenomenon.

Speeding Highly Advanced Engineering Simulations to Maximize Real-life Product Performance
CD-adapco, the largest privately held computational fluid dynamics provider of computer-aided engineering software, has deployed more than one petabyte of Panasas ActiveStor storage to support the high-performance demands of CFD simulations. CD-adapco is pioneering an approach called Multidisciplinary Design Exploration (MDX) that uses engineering data from simulation results to improve a product through multiple design iterations.

Hewlett Packard Enterprise Reports Significant Momentum in High Performance Computing
Hewlett Packard Enterprise announced at SC15 in Austin that more than a third of the high-performance computing market is leveraging HPE Compute platforms to process, analyze and manage data securely across HPC workloads. The Pittsburgh Supercomputing Center, Texas Advanced Computing Center at the University of Texas at Austin, National Renewable Energy Laboratory, Ghent University and the Academic Computer Centre...

Closer to a Cure for Gastrointestinal Cancer
At the University of Miami's Center for Computational Science, more than 2,000 internal researchers and a dozen expert collaborators across academic and industry sectors worldwide are working together in workflow management, data management, data mining, decision support, visualization and cloud computing. CCS maintains one of the largest centralized academic cyberinfrastructures in the country, which fuels vital and critical discoveries.


News Article | February 16, 2017
Site: www.nanotech-now.com

Abstract: Francis (Frank) Alexander, a physicist with extensive management and leadership experience in computational science research, has been named Deputy Director of the Computational Science Initiative (CSI) at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory, effective February 1.

Alexander comes to Brookhaven Lab from DOE's Los Alamos National Laboratory, where he was the acting division leader of the Computer, Computational, and Statistical Sciences (CCS) Division. During his more than 20 years at Los Alamos, he held several leadership roles, including leader of the CCS Division's Information Sciences Group and leader of the Information Science and Technology Institute. Alexander first joined Los Alamos in 1991 as a postdoctoral researcher at the Center for Nonlinear Studies. He returned to Los Alamos in 1998 after doing postdoctoral work at the Institute for Scientific Computing Research at DOE's Lawrence Livermore National Laboratory and serving as a research assistant professor at Boston University's Center for Computational Science.

"I was drawn to Brookhaven by the exciting opportunity to strengthen the ties between computational science and the significant experimental facilities--the Relativistic Heavy Ion Collider, the National Synchrotron Light Source II, and the Center for Functional Nanomaterials [all DOE Office of Science User Facilities]," said Alexander. "The challenge of getting the most out of high-throughput and data-rich science experiments is extremely exciting to me. I very much look forward to working with the talented individuals at Brookhaven on a variety of projects, and am grateful for the opportunity to be part of such a respected institution."

In his new role as deputy director, Alexander will work with CSI Director Kerstin Kleese van Dam to expand CSI's research portfolio and realize its potential in data-driven discovery. He will serve as the primary liaison to national security agencies, as well as develop strategic partnerships with other national laboratories, universities, and research institutions. His current research interest is the intersection of machine learning and physics (and other domain sciences).

"We are incredibly happy that Frank decided to join our CSI team," said Kleese van Dam. "With his background in high-performance computing, data science, and computational and statistical physics, he is the ideal fit for Brookhaven."

Throughout his career, Alexander has worked in a variety of areas, including nonequilibrium physics and computational physics. More recently, he has focused on the optimal design of experiments as part of the joint DOE/National Cancer Institute collaboration on cancer research, as well as on uncertainty quantification and error analysis for the prediction of complex systems' behavior. Alexander has served on many committees and advisory panels, including those related to DOE's Laboratory Directed Research and Development program (http://science.energy.gov/lp/laboratory-directed-research-and-development/). Currently, he is on DOE's Computational Research Leadership Council and the editorial board of Computing in Science & Engineering magazine. Alexander received his PhD in physics in 1991 from Rutgers University and a BS in mathematics and physics in 1987 from The Ohio State University.

About Brookhaven National Laboratory

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


News Article | November 29, 2016
Site: www.eurekalert.org

CORAL GABLES, Fla. (November 16, 2016)--Your brain is never really at rest. Neither is it in chaos. Even when not engaged in some task, the brain naturally cycles through identifiable patterns of neural connections--sort of like always practicing your favorite songs when learning to play the guitar. Constantly cycling through brain region connections may make it easier to call on those networks when you need them for high-level cognitive processing, such as memory and attention.

The network connections are not all equal, either. Some are more flexible and adaptable than others. This is what Lucina Uddin and Jason Nomi, cognitive neuroscientists at the University of Miami College of Arts and Sciences, found when collaborating with researchers at the University of New Mexico on a study they hope will lay the groundwork for helping children with autism adapt to change more easily.

The scientists analyzed an extensive data set of brain region connectivity from the NIH-funded Human Connectome Project (HCP), which is mapping neural connections in the brain and makes its data publicly available. To better understand the human connectome, the HCP collected data from hundreds of people who underwent 56 minutes of resting-state functional magnetic resonance imaging (fMRI). A revolutionary tool in brain-mapping research, fMRI measures brain activity by detecting the changes in cerebral blood flow that accompany neural activation. The HCP also collected a number of other measurements, including the subjects' ages, IQs, and results on various mental tasks.

Nomi, Uddin, and their fellow researchers analyzed the HCP's resting-state fMRI data and, from potentially hundreds of configurations, teased apart five general brain patterns. They discovered that, most of the time, neural connections in the typical adult population are agile--alert yet fluid and flexible enough to take on whatever challenges or mental tasks are presented. Less frequently, the brain cycles through more rigid connections, where regions are linked in a very specific, less flexible way, says Uddin, assistant professor of psychology and principal investigator in the Brain Connectivity and Cognition Laboratory (BCCL).

The researchers then correlated the frequency of these five brain patterns with performance on executive-function tasks--completed outside of the fMRI brain scanner--that tap high-level cognition, such as sorting a deck of cards first by the printed image's color and then by its shape. They found that higher performers tend to have a natural propensity to be in the more flexible and fluid brain states. "People who do better on these tasks tend to have more of the relaxed, flexible brain configuration states and less of the more rigid configuration states," says Nomi, a postdoctoral fellow in the Department of Psychology and the BCCL.

With this better understanding of brain activity in a typical population, the researchers are now moving to the next step of their research: testing children with autism to see whether their brains have a natural propensity to spend more time in the more rigid network configurations, making it harder for them to adapt to change as they experience life. "The final step is determining what we can do to help them do better," Uddin says. "Is there a way to induce a brain state that helps children with autism more flexibly adapt? Are there training programs or behavioral therapies that help them become more flexible? And if there are, do they also help their brains become more flexible?"

Uddin, Nomi, and their fellow researchers who study the connection between neuroscience and behavior are excited about the direction neuroimaging has taken their field. "In the field of neuroimaging, before, we would have a snapshot of the brain. Now, we have a movie," says Uddin.

Neuroscientists are also making more data publicly available and building interdisciplinary collaborations to analyze big data. Uddin, Nomi, and their collaborators were able to analyze more than 80 gigabytes of data for the connectome study in weeks, rather than months, by using the supercomputing resources at UM's Center for Computational Science (CCS).

For the follow-up study on children with autism, Uddin and Nomi have been working closely with UM's Michael Alessandri, clinical professor of psychology and executive director of the University of Miami-Nova Southeastern University Center for Autism and Related Disabilities (UM-NSU CARD); Melissa Hale, clinical assistant professor of psychology and UM-NSU CARD's associate director; and Meaghan Parlade, a licensed psychologist at the Autism Spectrum Assessment Clinic (ASAC) in the Department of Psychology and coordinator of research and training for UM-NSU CARD. The team's UMiami Brain Development Lab is looking for children ages 7 to 12, typically developing or with autism, to help the researchers understand more about how the brain functions in both populations.

For their research study, "Chronnectomic Patterns and Neural Flexibility Underlie Executive Function," Nomi and Uddin worked with Shruti Gopal Vij, a biomedical engineer and postdoctoral researcher in the Brain Connectivity and Cognition Lab and The Mind Research Network in Albuquerque; Dina Dajani, a graduate student in psychology at UM's College of Arts and Sciences; Rosa Steimke, a visiting postdoctoral researcher in psychology in the Brain Connectivity and Cognition Lab; Eswar Damaraju and Srinivas Rachakonda, of The Mind Research Network; and Vince Calhoun, of The Mind Research Network and the Department of Electrical and Computer Engineering at the University of New Mexico.
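In general form, the "recurring patterns" analysis described above is a clustering problem: windowed connectivity snapshots are grouped into a small number of recurring states, and the fraction of time spent in each state is tallied. The sketch below is hypothetical and greatly simplified (toy four-element "connectivity" vectors and a two-state k-means, rather than the chronnectome pipeline the study actually used), but it shows the shape of the computation:

```python
import numpy as np

def two_state_kmeans(X, iters=20):
    """Tiny k-means with k=2 and deterministic seeds (first and last rows)."""
    centroids = np.stack([X[0], X[-1]]).astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each window to the nearest centroid, then update centroids
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(2):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

rng = np.random.default_rng(0)
# Toy "windowed connectivity" vectors: two recurring patterns plus noise,
# standing in for flattened region-by-region correlation matrices.
flexible = np.tile([0.2, 0.1, 0.3, 0.2], (60, 1))   # weak, diffuse coupling
rigid = np.tile([0.9, 0.8, 0.9, 0.7], (20, 1))      # strong, specific coupling
windows = np.vstack([flexible, rigid]) + 0.02 * rng.standard_normal((80, 4))

state = two_state_kmeans(windows)
occupancy = np.bincount(state, minlength=2) / len(state)
print(occupancy)   # roughly [0.75, 0.25]: the flexible state dominates
```

The per-subject occupancy fractions are the kind of quantity that could then be correlated with executive-function scores, as the study did with its five states.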


Martin A.,University of Kentucky | Martin A.,Center for Computational Science | Boyd I.D.,University of Michigan
Journal of Spacecraft and Rockets | Year: 2015

The steps necessary to achieve the strong coupling between a flowfield solver and a material response solver are presented. This type of coupling is required to accurately capture the complex aerothermodynamic physics occurring during hypersonic atmospheric entries. A blowing boundary condition for the flowfield solver is proposed. This allows the ablating gas calculated by the material response solver to be correctly injected in the boundary layer. A moving mesh algorithm for the flowfield solver that implicitly enforces the geometric conservation law is presented. Using that capability, a mesh movement procedure for surface recession and for accurate shock capturing is proposed. The entire technique is tested using a material response solver with surface ablation and pyrolysis coupled to a hypersonic solver for weakly ionized flows in thermochemical nonequilibrium. Results using the reentry trajectory of the IRV-2 test vehicle are presented, showing that the surface heat fluxes remain accurate as the vehicle geometry and freestream conditions change. Copyright © 2014 by Alexandre Martin and Iain D. Boyd.
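A minimal sketch of the coupling loop the abstract describes can help with orientation, though it is emphatically not the authors' method: the heating correlation, material constants, and steady-state ablation assumption below are illustrative stand-ins, and a real coupled solver exchanges full surface distributions and moves every mesh node, not a single scalar.

```python
# Hypothetical loosely coupled ablation loop; all constants are illustrative.

def heat_flux(v_inf, rho_inf, r_nose):
    """Toy stagnation-point heating estimate (Sutton-Graves-like form), W/m^2."""
    k = 1.83e-4                                   # illustrative constant
    return k * (rho_inf / r_nose) ** 0.5 * v_inf ** 3

def recession_rate(q_wall, h_abl=2.5e7, rho_s=1500.0):
    """Steady-state ablation: all wall heating is consumed by material removal.
    h_abl (J/kg) and rho_s (kg/m^3) are made-up material properties."""
    m_dot = q_wall / h_abl                        # surface mass loss, kg/(m^2 s)
    return m_dot / rho_s                          # wall recession speed, m/s

surface_x, dt = 0.0, 1.0                          # receded depth (m), step (s)
for step in range(100):
    v = 6000.0 - 30.0 * step                      # decelerating entry, m/s
    rho = 1e-4 * (1.0 + 0.05 * step)              # densifying atmosphere, kg/m^3
    q = heat_flux(v, rho, r_nose=0.1)             # "flow solver" wall output
    surface_x += recession_rate(q) * dt           # "material response" + mesh move
print(f"total recession: {surface_x * 1000:.2f} mm")
```

In the strongly coupled scheme of the paper, the injected ablation gases also feed back into the boundary layer through the blowing boundary condition, which this scalar loop omits entirely.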


News Article | January 25, 2016
Site: www.scientificcomputing.com

The aerial image of a Colombian shantytown appeared on the large display screen with such clarity that the 12 people gathered for the demonstration could identify the types and number of garments hanging from a clothesline on the rooftop of a shack. "Just imagine what can be done with hurricane tracks and climatological data," said Nick Tsinoremas, director of the University of Miami's Center for Computational Science (CCS), commending the visual and display capabilities of the 22-foot-long 2-D display monitor inside CCS's Visualization Lab.

From a bird's-eye view of a shantytown to an illustration of the branchlike projections of neurons called dendrites, the new lab allows faculty members, researchers, scientists and students to display high-resolution images, data, charts and other information in visually stunning formats. "This is a facility that will appeal to just about anyone on campus — architecture, business, the medical and marine schools," said Joel Zysman, director of high-performance computing for CCS. "Researchers can display their data like never before, but not only that, do something with that data, such as perform live analysis." A tie-in with CCS's Pegasus supercomputer makes that possible, allowing researchers to run simulations on the powerful machine and then display the results on screen for analysis and discussion. A smaller 3-D monitor is also available, but content for that system must be specially created, and special glasses must be worn to experience the 3-D effect.

Carie Penabad, associate professor in the School of Architecture, said she plans to use the Viz Lab at some point to present her ongoing research on shantytowns. With assistance from CCS, Penabad is using drones to map squatter settlements in Latin American countries such as Colombia and the Dominican Republic, using her charts to document and better understand areas that are not included on the official maps of local governments yet are home to hundreds of thousands of people living in horrid conditions.

"I can see all kinds of incredible projects that will be related to what our graduate students do," said Gina Maranto, director of the undergraduate program in ecosystem science and policy, who gathered some of her students to attend a demo session of the lab, located on the third floor of the Ungar Building. "They do a lot, especially the students who are working on things like vector-borne diseases," explains Maranto. "We have three or four students who have been doing visualization and looking at land cover and trying to correlate mosquito and land cover and dengue or malaria outbreaks. Compared to working on a little screen or even a fairly large Apple screen — this stuff [the Visualization Lab] is just incredible."

The CCS Viz Lab is a free resource for the UM community, but first-time use of the space requires an orientation session with a CCS support team.


Pitigala S., Center for Computational Science | Li C., MTSU
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2014

PubMed is the most comprehensive citation database in the field of biomedicine. It contains over 23 million citations from MEDLINE, life science journals and books. However, retrieving relevant information from PubMed is challenging due to its size and rapid growth. Keyword based information retrieval is not adequate in PubMed. Many tools have been developed to enhance the quality of information retrieval from PubMed. PubMed Related Article (PMRA) feature is one approach developed to help the users retrieve information efficiently. It finds highly related citations to a given citation. This study focuses on extending the PMRA feature to multiple citations in the context of personalized information retrieval. Our experimental results show that the extended PMRA feature using the words appearing in two or more citations is able to find more relevant articles than using the PMRA feature on individual PubMed citations. © 2014 Springer International Publishing Switzerland.
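A toy version of the extension the abstract describes, scoring candidates against terms shared by two or more query citations rather than against a single citation, might look like the following. The mini-corpus and the raw word-overlap scoring are invented for illustration; the real PMRA feature uses a weighted probabilistic term model.

```python
from collections import Counter

def shared_terms(citations, min_docs=2):
    """Terms that appear in at least `min_docs` of the query citations."""
    df = Counter()
    for text in citations:
        df.update(set(text.lower().split()))
    return {term for term, n in df.items() if n >= min_docs}

def overlap_score(candidate, query_terms):
    """Number of query terms found in a candidate abstract."""
    return len(set(candidate.lower().split()) & query_terms)

# Invented example: three citations a user marked as relevant...
profile = shared_terms([
    "gene expression in tumor cells",
    "tumor suppressor gene regulation",
    "expression profiling of tumor tissue",
])
# ...and two candidate abstracts ranked against the shared-term profile.
candidates = {
    "a": "tumor gene expression analysis",
    "b": "weather patterns over the atlantic",
}
ranked = sorted(candidates, key=lambda k: overlap_score(candidates[k], profile),
                reverse=True)
print(ranked[0])   # "a" — the candidate sharing gene/expression/tumor wins
```

The point of the multi-citation profile is that terms appearing in two or more of the user's citations are more likely to reflect the shared topic than terms from any single citation.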


Staaterman E.,University of Miami | Paris C.B.,University of Miami | Helgers J.,Center for Computational Science
Journal of Theoretical Biology | Year: 2012

Larval reef fish possess considerable swimming and sensory abilities, which could enable navigation towards settlement habitat from the open ocean. Due to their small size and relatively low survival, tagging individual larvae is not a viable option, but numerical modeling studies have proven useful for understanding the role of orientation throughout ontogeny. Here we combined the theoretical framework of the biased correlated random walk model with a very high resolution three-dimensional coupled biophysical model to investigate the role of orientation behavior in fish larvae. Virtual larvae of the bicolor damselfish (Stegastes partitus) were released daily during their peak spawning period from two locations in the Florida Keys Reef Tract, a region of complex eddy fields bounded by the strong Florida Current. The larvae began orientation behavior either before or during flexion, and only larvae that were within a given maximum detection distance from the reef were allowed to orient. They were subjected to ontogenetic vertical migration, increased their swimming speed during ontogeny, and settled on reefs within a flexible window of 24 to 32 days of pelagic duration. Early orientation, as well as a large maximum detection distance, increased settlement, implying that the early use of large-scale cues increases survival. Orientation behavior also increased the number of larvae that settled near their home reef, providing evidence that orientation is a mechanism driving self-recruitment. This study demonstrates that despite the low swimming abilities of the earliest larval stages, orientation during this "critical period" would have remarkable demographic consequences. © 2012 Elsevier Ltd.
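The biased correlated random walk at the heart of the model blends persistence in the current heading with attraction toward a goal, plus random turning error. The sketch below is a schematic stand-in, not the paper's implementation: the blending weights, noise level, and swimming parameters are invented, and the actual model draws turning angles from a von Mises distribution inside a 3-D circulation field.

```python
import math
import random

random.seed(1)

def bcrw_step(heading, goal_bearing, noise_sd=0.4, bias=0.5):
    """One turn of a toy biased correlated random walk: blend persistence
    (current heading) with attraction toward the goal, add Gaussian turning
    noise, and wrap the result into (-pi, pi]. bias=0 gives a purely
    correlated walk; bias=1 heads straight for the goal."""
    blended = (1.0 - bias) * heading + bias * goal_bearing \
              + random.gauss(0.0, noise_sd)
    return math.atan2(math.sin(blended), math.cos(blended))

# A virtual larva swimming toward a reef lying due east (bearing 0).
x = y = 0.0
heading = math.pi / 2                # start out heading north
speed, dt = 0.05, 1.0                # 5 cm/s swimming speed, 1 s steps
for _ in range(500):
    heading = bcrw_step(heading, goal_bearing=0.0)
    x += speed * dt * math.cos(heading)
    y += speed * dt * math.sin(heading)
print(x > 0)                         # net displacement is toward the reef
```

In the study, a step like this is superimposed on the advection from the biophysical ocean model, so orientation only biases, rather than dictates, each larva's trajectory.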
