Logan, UT, United States


News Article | February 15, 2017
Site: phys.org

NASA is inviting the public to help search for possible undiscovered worlds in the outer reaches of our solar system and in neighboring interstellar space. A new website, called Backyard Worlds: Planet 9, lets everyone participate in the search by viewing brief movies made from images captured by NASA's Wide-field Infrared Survey Explorer (WISE) mission. The movies highlight objects that have gradually moved across the sky. "There are just over four light-years between Neptune and Proxima Centauri, the nearest star, and much of this vast territory is unexplored," said lead researcher Marc Kuchner, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Because there's so little sunlight, even large objects in that region barely shine in visible light. But by looking in the infrared, WISE may have imaged objects we otherwise would have missed." WISE scanned the entire sky between 2010 and 2011, producing the most comprehensive survey at mid-infrared wavelengths currently available. With the completion of its primary mission, WISE was shut down in 2011. It was then reactivated in 2013 and given a new mission assisting NASA's efforts to identify potentially hazardous near-Earth objects (NEOs), which are asteroids and comets on orbits that bring them into the vicinity of Earth's orbit. The mission was renamed the Near-Earth Object Wide-field Infrared Survey Explorer (NEOWISE). The new website uses the data to search for unknown objects in and beyond our own solar system. In 2016, astronomers at Caltech in Pasadena, California, showed that several distant solar system objects possessed orbital features indicating they were affected by the gravity of an as-yet-undetected planet, which the researchers nicknamed "Planet Nine." If Planet Nine—also known as Planet X—exists and is as bright as some predictions suggest, it could show up in WISE data. The search also may discover more distant objects like brown dwarfs, sometimes called failed stars, in nearby interstellar space. "Brown dwarfs form like stars but evolve like planets, and the coldest ones are much like Jupiter," said team member Jackie Faherty, an astronomer at the American Museum of Natural History in New York. "By using Backyard Worlds: Planet 9, the public can help us discover more of these strange rogue worlds."
Unlike more distant objects, those in or closer to the solar system appear to move across the sky at different rates. The best way to discover them is through a systematic search of moving objects in WISE images. While parts of this search can be done by computers, machines are often overwhelmed by image artifacts, especially in crowded parts of the sky. These include brightness spikes associated with star images and blurry blobs caused by light scattered inside WISE's instruments. Backyard Worlds: Planet 9 relies on human eyes because we easily recognize the important moving objects while ignoring the artifacts. It's a 21st-century version of the technique astronomer Clyde Tombaugh used to find Pluto in 1930, a discovery made 87 years ago this week. On the website, people around the world can work their way through millions of "flipbooks," which are brief animations showing how small patches of the sky changed over several years. Moving objects flagged by participants will be prioritized by the science team for follow-up observations by professional astronomers. Participants will share credit for their discoveries in any scientific publications that result from the project. "Backyard Worlds: Planet 9 has the potential to unlock once-in-a-century discoveries, and it's exciting to think they could be spotted first by a citizen scientist," said team member Aaron Meisner, a postdoctoral researcher at the University of California, Berkeley, who specializes in analyzing WISE images. Backyard Worlds: Planet 9 is a collaboration between NASA, UC Berkeley, the American Museum of Natural History in New York, Arizona State University, the Space Telescope Science Institute in Baltimore, and Zooniverse, a collaboration of scientists, software developers and educators who collectively develop and manage citizen science projects on the internet. NASA's Jet Propulsion Laboratory in Pasadena, California, manages and operates WISE for NASA's Science Mission Directorate. 
The WISE mission was selected competitively under NASA's Explorers Program managed by the agency's Goddard Space Flight Center. The science instrument was built by the Space Dynamics Laboratory in Logan, Utah. The spacecraft was built by Ball Aerospace & Technologies Corp. in Boulder, Colorado. Science operations and data processing take place at the Infrared Processing and Analysis Center at Caltech, which manages JPL for NASA.




News Article | December 8, 2016
Site: phys.org

NASA's Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) launched Sept. 8, 2016, and is traveling to a near-Earth asteroid known as Bennu to harvest a sample of surface material and return it to Earth for study. A trio of cameras will capture it all. The OSIRIS-REx Camera Suite, or OCAMS, consists of three cameras. PolyCam is a high-resolution camera that will acquire the first images of Bennu and perform an initial mapping of the asteroid. MapCam is a medium-resolution camera that will map the asteroid in color and search for satellites and dust plumes. SamCam will document the sampling process. The spacecraft will store the images captured by OCAMS and send them to the OSIRIS-REx team every few days. Scientists designed the camera suite to be functionally redundant, meaning that if one of the cameras fails during the mission, the other two cameras can stand in. "When you have a critical mission like this, you want redundancy," said Christian d'Aubigny, OCAMS deputy instrument scientist at the University of Arizona, Tucson. "The cameras have some amount of overlap in their capabilities. They're not exact copies of each other, but if one fails, they can still get the job done." The first camera to see Bennu is called PolyCam. Similar to a polymath—a person skilled in many different fields—PolyCam can perform a wide range of optical tasks. PolyCam has a focus mechanism that allows it to refocus from infinity to about 500 feet (0.15 kilometers), which provides PolyCam with the ability to switch from detecting stars and asteroids from far away to resolving small pebbles on the surface of the asteroid. PolyCam has better visual acuity, or sharpness of vision, than an eagle. Several eagle species can see small objects such as prey from as far as 2 miles away.
But with its high resolution, PolyCam can acquire images of similarly sized objects on Bennu at a range of about 30 to 60 miles (48.2 to 96.5 kilometers) to determine its shape and figure out how the scientists can maneuver the spacecraft around the asteroid. Once PolyCam performs an initial mapping of the asteroid, scientists will use the camera to identify a site where the spacecraft might collect a sample of Bennu's surface that is as free of hazards as possible, such as boulders and dramatic slopes. "Already, at about 2 miles (3.5 kilometers), we're dividing the surface of the asteroid into 'go' and 'no go' places," said Bashar Rizk, OCAMS instrument scientist at the University of Arizona. "If a place is covered with hazards, we're just not going to go there because we don't want to risk damaging the spacecraft." The second camera to get a glimpse of Bennu is called MapCam. This camera has a wider field of view than PolyCam and is equipped with a number of color filters in its filter wheel to help the scientists identify locations on the asteroid where specific minerals of interest may be present, particularly those that may have once been in contact with liquid water. MapCam will also look for satellites and dust plumes, which may present a hazard to the spacecraft. There are a number of suspected mechanisms for plume formation, such as sublimation, in which a frozen substance transitions directly to a gas without first passing through the liquid phase, and electromagnetic levitation due to electrical charging from solar wind as the asteroid gets closer to the sun. "Asteroids are exposed to a lot of solar radiation because they have no atmosphere," Rizk said. "They're just mercilessly tortured by the sun every time they go around it." Due to a lack of water on the surface, the scientists predict that Bennu's regolith—a layer of loose material, including dust, soil and broken rock—is very dry, similar to the surface of the moon. 
The surface material can easily stick to things, increasing the risk of contaminating the OSIRIS-REx spacecraft during sampling. Dust contamination is of particular concern to the spacecraft's third camera—SamCam. This camera is a low-resolution, wide-angle camera designed to get up close and personal with the asteroid to document the sample acquisition. When it comes time to retrieve a sample, SamCam will need to retain its functionality for up to three attempts. To combat this potential threat, the team at the University of Arizona furnished SamCam with multiple copies of the same filter, which are placed in front of the camera optics to act as a cover. The filters help ensure that the camera has a dust-free, unobstructed view of the sampling event in case sampling needs to be repeated. The team also had to account for radiation from gamma rays and X-rays when designing OCAMS. Scientists housed the cameras in a suit of armor made from solid titanium and aluminum. These materials can block the radiation OSIRIS-REx will encounter during the mission. The lenses are made of materials, such as silicon dioxide, that are radiation resistant, as well as a number of other types of glass that are infused with cerium, which prevents the glass from turning opaque when exposed to high levels of radiation. "We tried to think of everything that the spacecraft might be subjected to and account for that," Rizk said. "It's a multi-step process of simulations, testing and design to ensure that the cameras work properly and that we get the best images we can." A collaborative team of engineers and scientists at the University of Arizona's Lunar and Planetary Laboratory and College of Optical Sciences and Utah State University's Space Dynamics Laboratory spent four and a half years designing and building OCAMS.
"In the end, the University of Arizona OCAMS team did an excellent job designing, building and testing the camera suite," said Brent Bos, OSIRIS-REx optics discipline lead at NASA's Goddard Space Flight Center in Greenbelt, Maryland. OSIRIS-REx's eyes are a critical part of retrieving an asteroid sample, which will help scientists investigate how planets formed and how life began, as well as improve our understanding of asteroids that might impact Earth.


Grant
Agency: National Aeronautics and Space Administration | Branch: | Program: STTR | Phase: Phase I | Award Amount: 124.99K | Year: 2013

The Conjugate Etalon Spectral Imaging (CESI) concept enables the development of miniature instruments with high spectral resolution, suitable for LEO missions aboard CubeSat or nanosat buses, including constellation missions providing global coverage and characterization of dynamic phenomena. Small size, low power, and a simplified instrument architecture support missions for earth observation, atmospheric science, and planetary science. Unlike prior-art hyperspectral and ultraspectral instruments that are much too large and complex for deployment on a nanosat, the CESI concept can be implemented in a small form factor using inexpensive components and requiring only a small optical aperture. CESI superimposes the interferogram from a conjugate Fabry-Perot etalon on the image of a scanned scene captured on a novel high-sensitivity, low-noise SWIR focal plane. Using image processing, high-resolution spectral characterization is performed independently for each point in the scene. The innovative focal plane and spectroscopic concepts have many promising scientific and commercial applications. The Scanned Etalon Methane Mapper (SEMM) is a CubeSat instrument that incorporates the CESI concept to perform global daytime mapping of atmospheric methane column density. Performance capabilities: ground resolution 100 m; concentration sensitivity 18 ppb; global revisit ~60 days.
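As background for the CESI concept, the spectral selectivity of a Fabry-Perot etalon follows the standard Airy transmission function. The sketch below uses illustrative gap, reflectance, and wavelength values (not the actual SEMM design) to show how sharply transmission peaks at resonance:

```python
import numpy as np

def etalon_transmission(wavelength_nm, gap_um=50.0, n=1.0, reflectance=0.9):
    """Airy transmission of an ideal Fabry-Perot etalon at normal incidence."""
    delta = 4.0 * np.pi * n * gap_um * 1e3 / wavelength_nm  # round-trip phase (gap in nm)
    finesse_coeff = 4.0 * reflectance / (1.0 - reflectance) ** 2
    return 1.0 / (1.0 + finesse_coeff * np.sin(delta / 2.0) ** 2)

# Transmission peaks occur where the gap is an integer number of half-waves:
# 2 * n * gap = m * lambda, so lambda = 2 * 50 um / m.
m = 62                      # an arbitrary interference order
peak_nm = 2 * 50.0e3 / m    # ~1612.9 nm, in the SWIR band mentioned above
print(etalon_transmission(peak_nm))          # ~1.0 at resonance
print(etalon_transmission(peak_nm * 1.001))  # far lower just off resonance
```

The narrow, periodic transmission peaks are what let a scanned etalon trade a simple optical path for high spectral resolution.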


Grant C.S.,L3 Communications Inc. | Moon T.K.,Utah State University | Gunther J.H.,Utah State University | Stites M.R.,Space Dynamics Laboratory | Williams G.P.,Brigham Young University
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | Year: 2012

Pattern recognition of amorphously shaped objects such as gas plumes, oil spills, or epidemiological spread is difficult because there is no definite shape to match. We consider detection of such amorphously shaped objects using a neighborhood model which operates on a concept of loose spatial contiguity: there is a significant probability that a pixel surrounded by the object of interest itself contains that object of interest, and boundaries tend to be smooth. These assumptions are distilled into a single-parameter prior probability model to use in a maximum a posteriori hypothesis test. The method is evaluated against synthetic data generated from hyperspectral imagery and DIRSIG simulation results. These tests indicate significant improvement on the ROC curves. © 2012 IEEE.
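The neighborhood model can be illustrated with a short iterated-conditional-modes sketch; the Gaussian shift likelihood, the 4-neighbor Ising-style prior, and all parameter values below are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def map_detect(loglik_ratio, beta=0.8, n_iter=10):
    """MAP-style detection with a loose-contiguity prior (ICM sketch).

    loglik_ratio: per-pixel log p(x|target)/p(x|background).
    beta: the single prior parameter, i.e. how strongly a pixel's label
    is pulled toward the majority label of its 4-neighborhood.
    """
    labels = (loglik_ratio > 0).astype(int)  # likelihood-only start
    for _ in range(n_iter):
        padded = np.pad(labels, 1)
        # count of 'target' neighbors in the 4-neighborhood
        nbr = (padded[:-2, 1:-1] + padded[2:, 1:-1]
               + padded[1:-1, :-2] + padded[1:-1, 2:])
        # (2*nbr - 4) is positive when 'target' neighbors are the majority
        labels = (loglik_ratio + beta * (2 * nbr - 4) > 0).astype(int)
    return labels

# Toy scene: an amorphous blob of elevated signal plus heavy noise.
rng = np.random.default_rng(1)
truth = np.zeros((40, 40))
truth[10:25, 12:30] = 1.0
obs = truth + rng.standard_normal((40, 40))
llr = obs - 0.5  # LLR for a unit-variance Gaussian mean shift of 1
print((map_detect(llr) != truth).mean(), ((llr > 0) != truth).mean())
```

Starting from the likelihood-only labeling, each pass nudges pixels toward the majority label of their neighborhood, suppressing isolated false alarms while keeping the blob's smooth boundary.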


Chatterjee K.,Space Dynamics Laboratory | Chatterjee K.,Utah State University
Journal of Computational Physics | Year: 2016

The objective of this paper is the extension and application of a newly developed Green's function Monte Carlo (GFMC) algorithm to the estimation of the derivative of the solution of the one-dimensional (1D) Helmholtz equation subject to Neumann and mixed boundary conditions. The traditional GFMC approach for the solution of partial differential equations subject to these boundary conditions involves "reflecting boundaries," resulting in relatively large computational times. My work, inspired by the work of K.K. Sabelfeld, is philosophically different in that there is no requirement for reflection at these boundaries. The underlying feature of this algorithm is the elimination of reflecting boundaries through the use of novel Green's functions that mimic the boundary conditions of the problem of interest. My past work has involved the application of this algorithm to the estimation of the solution of the 1D Laplace equation, the Helmholtz equation and the modified Helmholtz equation. In this work, the algorithm has been adapted to the estimation of the derivative of the solution, which is a very important development. In the traditional approach involving reflection, to estimate the derivative at a certain number of points, one has to a priori estimate the solution at a larger number of points. In a one-dimensional problem, for instance, to obtain the derivative of the solution at a point, one has to obtain the solution at two points, one on each side of the point of interest. These points have to be close enough that the first-order approximation to the derivative operator is valid, and at the same time the actual difference between the solutions at these two points has to be at least an order of magnitude larger than the statistical error in the estimation of the solution, thus requiring a significantly larger number of random walks than needed for estimating the solution alone.
In this new approach, the same amount of computational resources is needed irrespective of whether we are estimating the solution or its derivative. This is very significant in electromagnetic problems, where the scalar/vector potential is the unknown in the PDE of interest but the quantity of interest is the electric/magnetic field, and in heat conduction problems, where the temperature of an object is the unknown variable in the PDE but the quantity of interest is the spatial/temporal variation of the temperature. In this work, the algorithm is applied to the estimation of the derivative of the solution of the 1D Helmholtz equation, which is the frequency-domain version of both Maxwell's equations and the heat conduction equation. As a result, the algorithm is an important first step in the development of computationally efficient GFMC algorithms for Neumann and mixed boundary condition problems. The numerical results have been validated against an exact analytical solution, with very good agreement. The long-term goal of this research is the application of this methodology to the numerical solution of the F-region ionization problem in space plasma modeling. © 2016 Elsevier Inc.
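To illustrate the general flavor of walk-based PDE estimation (in the simplest Dirichlet setting, not the author's Neumann/mixed-boundary Green's function construction), a symmetric random walk on a grid estimates the harmonic solution of u'' = 0 on [0, 1] with u(0) = 0 and u(1) = 1; all parameters below are illustrative:

```python
import numpy as np

def walk_estimate(x0, h=0.05, n_walks=5000, seed=0):
    """Estimate u(x0) for u'' = 0 on [0, 1], u(0) = 0, u(1) = 1, by
    symmetric random walks on an integer grid with absorbing ends.
    The discrete solution is harmonic, so u(x0) = P(walk exits at 1) = x0.
    """
    rng = np.random.default_rng(seed)
    m = round(1.0 / h)      # number of grid cells
    start = round(x0 / h)   # starting grid index
    hits_right = 0
    for _ in range(n_walks):
        i = start
        while 0 < i < m:
            i += 1 if rng.random() < 0.5 else -1
        hits_right += i == m
    return hits_right / n_walks

print(walk_estimate(0.3))  # close to the exact value u(0.3) = 0.3
```

The statistical error shrinks as 1/sqrt(n_walks), which is why derivative estimation by differencing two such noisy estimates (the traditional route described above) is so expensive.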


Privalsky V.,Space Dynamics Laboratory
Earth System Dynamics | Year: 2015

Relationships between time series are often studied on the basis of cross-correlation coefficients and regression equations. This approach is generally incorrect for time series, irrespective of the cross-correlation coefficient value, because relations between time series are frequency-dependent. Multivariate time series should be analyzed in both the time and frequency domains, including fitting a parametric (preferably autoregressive) stochastic difference equation to the time series and then calculating functions of frequency such as spectra and coherent spectra, coherences, and frequency response functions. The example with the bivariate time series "Atlantic Multidecadal Oscillation (AMO) - sea surface temperature in Niño area 3.4 (SST3.4)" proves that even when the cross-correlation is low, the time series' components can be closely related to each other. A full time- and frequency-domain description of this bivariate time series is given. The AMO-SST3.4 time series is shown to form a closed-feedback loop system with a 2-year memory. The coherence between AMO and SST3.4 is statistically significant at intermediate frequencies, where the coherent spectra account for up to 55% of the total spectral densities. The gain factors are also described. Some recommendations are offered regarding time series analysis in climatology. © Author(s) 2015.
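The frequency-dependence argument can be demonstrated with synthetic data. The minimal Welch-style coherence estimate below (numpy only, with made-up series rather than the AMO or SST3.4 records) shows two series that are strongly coherent at low frequencies but essentially unrelated near the Nyquist frequency:

```python
import numpy as np

def coherence(x, y, seg=256):
    """Magnitude-squared coherence by averaging cross-spectra over
    non-overlapping Hann-windowed segments (a bare-bones Welch estimate)."""
    w = np.hanning(seg)
    n_seg = len(x) // seg
    sxx = syy = sxy = 0.0
    for k in range(n_seg):
        fx = np.fft.rfft(w * x[k * seg:(k + 1) * seg])
        fy = np.fft.rfft(w * y[k * seg:(k + 1) * seg])
        sxx = sxx + np.abs(fx) ** 2
        syy = syy + np.abs(fy) ** 2
        sxy = sxy + fx * np.conj(fy)
    return np.abs(sxy) ** 2 / (sxx * syy)

rng = np.random.default_rng(0)
n = 8192
common = rng.standard_normal(n)
slow = np.convolve(common, np.ones(25) / 25, mode="same")  # shared low-pass part
x = slow + 0.3 * rng.standard_normal(n)                    # independent noise
y = slow + 0.3 * rng.standard_normal(n)
coh = coherence(x, y)
print(coh[:5].mean(), coh[-5:].mean())  # high at low f, near zero at high f
```

Averaging over segments is essential: the coherence of a single segment is identically one, which is the usual pitfall with naive estimates.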


Grant
Agency: Department of Defense | Branch: Defense Advanced Research Projects Agency | Program: STTR | Phase: Phase II | Award Amount: 499.96K | Year: 2013

The 3-D SAR imaging of objects in a scene provides improved military target identification and weapons targeting accuracy. This capability can also potentially be combined with simultaneous SAR/GMTI surveillance modes to provide 3-D context information for moving targets in a scene. In Phase I, TSC demonstrated a promising approach for noncoherently combining 3-D tomographic SAR images generated from data cubes collected on multiple circular orbits at different elevations. This approach significantly reduced layover and provided improved vertical discrimination of target features. TSC also investigated coherent processing techniques including summing phase-corrected and registered data from each orbit. SDL collected X-band data on multiple orbits on several flight tests using their NuSAR and NRL's RASAR L-band sensor. In Phase II, our team will collect additional X- and L-band SAR data and investigate several promising coherent and noncoherent processing techniques for generating 3-D Holographic SAR products, and overcoming sparse aperture limitations. TSC will evaluate the performance using metrics including vertical resolution and sidelobe levels. All of the developed algorithm software will be delivered, and can be customized for high performance processors or sensors such as the GOTCHA Spiral II radar. In addition, TSC will investigate Holographic SAR imaging using simultaneous SAR/GMTI modes.


Grant
Agency: Department of Defense | Branch: Navy | Program: STTR | Phase: Phase I | Award Amount: 79.97K | Year: 2012

In Phase I the team of TSC and SDL will initiate an effort to collect polarimetric target and clutter signatures, identify polarimetric features and algorithms for detecting and classifying targets, and understand the sensitivity to target aspect angle, grazing angle and radar resolution. In performing the Phase I program, the team will: 1) identify a robust set of polarimetric decomposition and target classification algorithms for resolving small targets, wake signatures, sea clutter and littoral clutter features, 2) collect X-band small ship and clutter data in the Great Salt Lake area, and 3) initiate the modification (through CDR) of an existing SDL radar for operation at C-band. Under the bridge effort our team will: 1) form target signatures and generate polarimetric features from the database using HRR-D, SAR and ISAR processing, and establish the best target imaging strategies and radar parameters, 2) apply the polarimetric feature extraction and target classification algorithms to the collected radar data and to other databases to optimize their performance and minimize the processing time, 3) order the long-lead items for the C-band radar and initiate the radar modifications, and 4) assess alternative locations for performing Phase II data collections.
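One standard polarimetric decomposition from the PolSAR literature is the Pauli decomposition, which separates a pixel's scattering matrix into odd-bounce (surface-like), even-bounce (dihedral-like), and cross-pol (volume-like) mechanisms. The sketch below is illustrative background, not necessarily among the algorithms the team selected:

```python
import numpy as np

def pauli(shh, shv, svv):
    """Pauli decomposition of a monostatic scattering matrix (Shv = Svh).
    Returns powers attributed to odd-bounce, even-bounce, and cross-pol
    scattering mechanisms."""
    k = np.array([shh + svv, shh - svv, 2.0 * shv]) / np.sqrt(2.0)
    return np.abs(k) ** 2

# A trihedral (odd-bounce) reflector: Shh = Svv, Shv = 0.
print(pauli(1.0 + 0j, 0.0, 1.0 + 0j))   # power in the first channel only
# A dihedral (even-bounce) reflector: Shh = -Svv.
print(pauli(1.0 + 0j, 0.0, -1.0 + 0j))  # power in the second channel only
```

Features of this kind, computed per pixel, are what feed the downstream target classification algorithms the abstract describes.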


Grant
Agency: Department of Defense | Branch: Air Force | Program: STTR | Phase: Phase I | Award Amount: 99.98K | Year: 2011

ABSTRACT: In Phase I, TSC and SDL will develop holographic circular SAR (CSAR) processing algorithms that allow a scene to be viewed in three dimensions over 360 degrees in azimuth and with a large elevation angle extent. Our approach is to coherently combine a set of CSAR images collected with different grazing angles to provide vertical resolution using a DTED database as a reference. This holographic SAR imaging will provide target height estimates and additional features for target identification, and will greatly improve the positional accuracy of targets, buildings and terrain. Initial testing of the algorithms will be done using the Gotcha 2006 multiple-orbit data collection. Holographic SAR imaging will eliminate the problematic feature overlay between multiple objects by resolving scatterers in the third dimension. Thus holographic SAR should greatly enhance the Air Force's ISR capabilities. In Phase II the holographic CSAR algorithms will be fully developed and tested. Also, both L- and X-band CSAR data will be collected using the SDL NuSAR radar to demonstrate the performance of the holographic SAR at both low and high radar frequencies. The NuSAR will allow data collection alternatives such as stacked circles, concentric circles, spirals and helical flight paths to be tested. BENEFIT: Holographic CSAR will provide a significantly better target and terrain feature identification capability than a linear flight path SAR, as the target will be viewed over 360 degrees in azimuth and there will be no shadowed areas. Holographic CSAR will also be superior to a single-orbit CSAR, even one using elevation interferometry (i.e. IFSAR) to estimate the target height, because standard IFSAR does not resolve multiple scatterers in height; it merely locates the scatterers in three dimensions. If two scatterers that are displaced in height are not resolved, then the height estimate and horizontal positions of both scatterers will be incorrect in the IFSAR image.
In contrast, true holographic SAR imaging not only provides accurate height estimates for each pixel, it resolves the multiple scatterers displaced in height. It also places the scatterers at the correct geographic location in the image.
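The vertical resolution gained from an elevation aperture can be estimated with a standard wavenumber argument from SAR tomography; the wavelength and grazing-angle span below are illustrative numbers, not the program's measured performance:

```python
import numpy as np

def vertical_resolution(wavelength_m, graze_low_deg, graze_high_deg):
    """Rayleigh-type vertical resolution from a grazing-angle aperture.
    The vertical wavenumber spans dkz = (4*pi/lam) * d(sin psi), giving a
    resolution of dz = 2*pi / dkz = lam / (2 * d(sin psi))."""
    d_sin = (np.sin(np.radians(graze_high_deg))
             - np.sin(np.radians(graze_low_deg)))
    return wavelength_m / (2.0 * d_sin)

# X-band (~3 cm wavelength) with orbits spanning 30 to 45 degrees grazing:
print(vertical_resolution(0.03, 30.0, 45.0))  # roughly 0.07 m
```

The same formula shows why a single orbit (zero grazing-angle span) yields no height resolution at all, which is the gap the multi-orbit collections are designed to close.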
