Washington, DC, United States

The National Aeronautics and Space Administration (NASA) is the United States government agency responsible for the civilian space program as well as aeronautics and aerospace research. President Dwight D. Eisenhower established NASA in 1958 with a distinctly civilian orientation, encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA's predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA supports the International Space Station and oversees the development of the Orion Multi-Purpose Crew Vehicle, the Space Launch System, and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program, which provides oversight of launch operations and countdown management for unmanned NASA launches.

NASA science is focused on better understanding Earth through the Earth Observing System; advancing heliophysics through the efforts of the Science Mission Directorate's Heliophysics Research Program; exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons; and researching astrophysics topics, such as the Big Bang, through the Great Observatories and associated programs. NASA also shares data with various national and international organizations, such as data from the Greenhouse Gases Observing Satellite. (Source: Wikipedia)


Durante M., Helmholtz Center for Heavy Ion Research | Cucinotta F.A., NASA
Reviews of Modern Physics | Year: 2011

The health risks of space radiation are arguably the most serious challenge to space exploration, possibly preventing these missions due to safety concerns or increasing their costs to amounts beyond what would be acceptable. Radiation in space is substantially different from that on Earth: particles of high charge (Z) and energy (E), known as HZE particles, provide the main contribution to the equivalent dose in deep space, whereas γ rays and low-energy α particles are the major contributors on Earth. This difference produces high uncertainty in the estimated radiation health risk (including cancer and noncancer effects) and makes protection extremely difficult. In fact, shielding is very difficult in space: the very high energy of the cosmic rays and the severe mass constraints of spaceflight are serious hindrances to effective shielding. Here the physical basis of space radiation protection is described, including the most recent achievements in space radiation transport codes and shielding approaches. Although deterministic and Monte Carlo transport codes can now describe the interaction of cosmic rays with matter well, more accurate double-differential nuclear cross sections are needed to improve the codes. Models of energy deposition in biological molecules and the related effects should also be developed to achieve accurate risk models for long-term exploratory missions. Passive shielding can be effective for solar particle events; however, it is of limited use against galactic cosmic rays (GCR). Active shielding would have to overcome challenging technical hurdles to protect against GCR. Thus, improved risk assessment and genetic and biomedical approaches are a more likely solution to GCR radiation protection issues. © 2011 American Physical Society.
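For reference, the "equivalent dose" invoked above is the standard ICRP quantity that weights absorbed dose by radiation type; the sketch below restates that definition (the weighting-factor values are quoted from ICRP recommendations, not from this paper):

```latex
% Equivalent dose in a tissue or organ T, summed over radiation types R:
H_T = \sum_R w_R \, D_{T,R}
% D_{T,R}: mean absorbed dose (Gy) in T from radiation type R
% w_R: radiation weighting factor (about 1 for photons, up to 20 for
% alpha particles and heavy ions), which is why a comparatively sparse
% HZE flux can dominate the equivalent dose in deep space.
```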


Ostrikov K., CSIRO | Neyts E.C., University of Antwerp | Meyyappan M., NASA
Advances in Physics | Year: 2013

The unique plasma-specific features and physical phenomena in the organization of nanoscale solid-state systems over a broad range of elemental compositions, structures, and dimensionalities are critically reviewed. These effects make it possible to localize and control energy and matter at the nanoscale and to produce self-organized nano-solids with highly unusual and superior properties. A unifying conceptual framework based on the control of the production, transport, and self-organization of precursor species is introduced, and a variety of plasma-specific non-equilibrium and kinetics-driven phenomena across many temporal and spatial scales is explained. When the plasma is localized to micrometer and nanometer dimensions, new emergent phenomena arise. The examples range from semiconducting quantum dots and nanowires, chirality control of single-walled carbon nanotubes, and ultra-fine manipulation of graphene, nano-diamond, and organic matter to nano-plasma effects and nano-plasmas of different states of matter. © 2013 Taylor and Francis Group, LLC.


Shindell D.T., NASA
Nature Climate Change | Year: 2014

Understanding climate sensitivity is critical to projecting climate change in response to a given forcing scenario. Recent analyses have suggested that transient climate sensitivity is at the low end of the present model range when the reduced warming rates of the past 10-15 years, during which forcing increased markedly, are taken into account. In contrast, comparisons of modelled feedback processes with observations indicate that the most realistic models have higher sensitivities. Here I analyse results from recent climate modelling intercomparison projects to demonstrate that transient climate sensitivity to historical aerosols and ozone is substantially greater than the transient climate sensitivity to CO2. This enhanced sensitivity is primarily caused by more of the forcing being located at Northern Hemisphere middle to high latitudes, where it triggers more rapid land responses and stronger feedbacks. I find that accounting for this enhancement largely reconciles the two sets of results, and I conclude that the lowest end of the range of transient climate response to CO2 in present models and assessments (<1.3 °C) is very unlikely. © 2014 Macmillan Publishers Limited. All rights reserved.
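A minimal numerical sketch of the reasoning, using hypothetical round-number forcings and warming rather than values from the paper: in a simple energy-balance inference, giving aerosol and ozone forcing an enhanced transient efficacy raises the transient climate response (TCR) to CO2 implied by the same historical warming.

```python
# Illustrative energy-balance inference of TCR to CO2. All numbers
# below are hypothetical placeholders, not the paper's data.
F_2X = 3.7           # W m^-2, canonical forcing for doubled CO2
DT_OBS = 0.8         # K, assumed historical warming
DF_GHG = 2.8         # W m^-2, assumed greenhouse-gas forcing
DF_AER_O3 = -0.9     # W m^-2, assumed aerosol + ozone forcing

def implied_tcr(efficacy: float) -> float:
    """TCR to CO2 when the inhomogeneous (aerosol/ozone) forcing is
    scaled by a transient efficacy before entering the balance."""
    return F_2X * DT_OBS / (DF_GHG + efficacy * DF_AER_O3)

print(f"efficacy 1.0 -> TCR ~ {implied_tcr(1.0):.2f} K")  # ~1.56 K
print(f"efficacy 1.5 -> TCR ~ {implied_tcr(1.5):.2f} K")  # ~2.04 K
```

Because the enhanced forcing is negative, a larger efficacy shrinks the net forcing in the denominator, so the same observed warming implies a higher TCR to CO2.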


Comiso J.C., NASA
Journal of Climate | Year: 2012

The perennial ice area was drastically reduced to 38% of its climatological average in 2007 but recovered slightly in 2008, 2009, and 2010, with the areas being 10%, 24%, and 11% higher than in 2007, respectively. However, trends in extent and area remained strongly negative at -12.2% and -13.5% per decade, respectively. The thick component of the perennial ice, called multiyear ice, as detected by satellite data during the winters of 1979-2011, was studied, and results reveal that the multiyear ice extent and area are declining at an even more rapid rate of -15.1% and -17.2% per decade, respectively, with a record low value in 2008 followed by higher values in 2009, 2010, and 2011. Such a high rate of decline in the thick component of the Arctic ice cover means a reduction in the average ice thickness and an even more vulnerable perennial ice cover. The decline of the multiyear ice area from 2007 to 2008 was not as strong as that of the perennial ice area from 2006 to 2007, suggesting a strong role of second-year ice melt in the latter. The sea ice cover is shown to be strongly correlated with surface temperature, which is increasing at about 3 times the global average in the Arctic, but appears weakly correlated with the Arctic Oscillation (AO), which controls the atmospheric circulation in the region. An 8-9-yr cycle is apparent in the multiyear ice record, which could explain, in part, the slight recovery of the last 3 yr.
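As an aside on the units, a trend quoted in percent per decade is typically obtained by fitting a line to the annual series and normalizing the slope by the long-term mean; the sketch below shows that calculation on synthetic data, not on the satellite record used in the paper:

```python
import numpy as np

# Synthetic stand-in for an annual ice-extent series, 1979-2011.
years = np.arange(1979, 2012)
rng = np.random.default_rng(0)
extent = 7.0 - 0.08 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

# Least-squares linear trend, then normalize by the climatological mean.
slope, _ = np.polyfit(years, extent, 1)        # 10^6 km^2 per year
trend = 10.0 * slope / extent.mean() * 100.0   # % per decade
print(f"trend: {trend:+.1f}% per decade")
```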


Morton D.C., NASA
Philosophical transactions of the Royal Society of London. Series B, Biological sciences | Year: 2013

Recent drought events underscore the vulnerability of Amazon forests to understorey fires. The long-term impact of fires on biodiversity and forest carbon stocks depends on the frequency of fire damage and the deforestation rates of burned forests. Here, we characterized the spatial and temporal dynamics of understorey fires (1999-2010) and deforestation (2001-2010) in southern Amazonia using new satellite-based estimates of annual fire activity (greater than 50 ha) and deforestation (greater than 10 ha). Understorey forest fires burned more than 85,500 km² between 1999 and 2010 (2.8% of all forests). Forests that burned more than once accounted for 16 per cent of all understorey fires. Repeated fire activity was concentrated in Mato Grosso and eastern Pará, whereas single fires were widespread across the arc of deforestation. Routine fire activity in Mato Grosso coincided with annual periods of low night-time relative humidity, suggesting a strong climate control on both single and repeated fires. Understorey fires occurred in regions with active deforestation, yet the interannual variability of fire and deforestation was uncorrelated, and only 2.6 per cent of forests that burned between 1999 and 2008 had been deforested for agricultural use by 2010. Evidence from the past decade suggests that future projections of frontier landscapes in Amazonia should separately consider economic drivers to project future deforestation and climate to project fire risk.
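A quick back-of-envelope check of the headline numbers, using rounded figures from the abstract; the implied totals are derived arithmetic, not values stated in the paper, and the repeat-burn line interprets "16 per cent of all understorey fires" as a share of burned area:

```python
# Rounded figures from the abstract.
burned_km2 = 85_500        # understorey fire area, 1999-2010
burned_fraction = 0.028    # "2.8% of all forests"
repeat_share = 0.16        # assumed share of fire area burned more than once

# Implied forest area of the study region and repeat-burn area.
total_forest_km2 = burned_km2 / burned_fraction
print(f"implied forest area: {total_forest_km2 / 1e6:.2f} million km^2")
print(f"repeat-burn area:    {burned_km2 * repeat_share:,.0f} km^2")
# -> roughly 3.05 million km^2 of forest; ~13,700 km^2 burned at least twice
```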
