National Climatic Data Center

Asheville, NC, United States


News Article | May 24, 2017
Site: www.eurekalert.org

Over the past century, the Northeast has experienced an increase in the number of storms with extreme precipitation. A Dartmouth-led study finds that the increase in extreme Northeast storms occurred as an abrupt shift in 1996, particularly in the spring and fall, rather than as a steady change over several decades. The findings were published in an early online release of the American Meteorological Society's Journal of Hydrometeorology. (A pdf of the study is available upon request.)

With climate change, a warmer atmosphere is able to hold more moisture, which is likely to affect the frequency, intensity and location of extreme precipitation. Understanding historical changes in extreme storms, including in the Northeast, can improve our understanding of future precipitation projections under continued climate change.

"Looking at where the increases in extreme precipitation are occurring across the Northeast, interestingly we find that it's not just one part of the Northeast, say the coast, that is experiencing more heavy rainfall events; it's relatively uniform across the region," says Jonathan M. Winter, assistant professor of geography at Dartmouth and a co-author of the study.

The study defines the Northeast as Maine, New Hampshire, Vermont, Massachusetts, Connecticut, Rhode Island, New Jersey, New York, Pennsylvania, Maryland, Washington, D.C., Delaware, and West Virginia, and draws on weather station data from the Global Historical Climatology Network, which is compiled by the National Oceanic and Atmospheric Administration's National Climatic Data Center. The threshold for extreme precipitation events depends on the station but, averaged regionally, is about 2 inches or more of rain in a day.

Previous research, based on a linear analysis of the data, has characterized the increase in precipitation from 1901 to 2014 as a long-term increase that took place over several decades. This study takes a different approach, analyzing individual changepoints, places where the precipitation record "jumps," and finds that the changes were not consistent with a gradual long-term increase but were instead due to an abrupt shift in extreme precipitation in 1996. From 1996 to 2014, extreme precipitation in the Northeast was 53 percent higher than from 1901 to 1995. These increases applied to the entire Northeast region except for far western New York and Pennsylvania and a few areas in the mid-Atlantic. Given that the wetter period falls at the end of the study period, from 1996 on, the authors note that a linear analysis may not accurately represent the broader changes, because the apparent precipitation trend varies with the time period considered, especially the start date.

The study also examines changes in precipitation across all seasons, finding that the increases in extreme precipitation were driven by extreme storms particularly in the spring and fall. The amount of heavy rainfall from 1996 to 2014 was 83 percent and 85 percent higher in the spring and fall, respectively, than from 1901 to 1995. Tropical cyclones and nor'easters may be key drivers of these changes. The Northeast's weather is largely shaped by such seasonal systems: tropical cyclones in the fall, nor'easters in the winter and spring, and frontal systems in the summer.

In future work, the researchers plan to study what is driving the increases in total and extreme precipitation since 1996, and will look at the specific weather events associated with these changes.
Winter is available for comment at: Jonathan.M.Winter@dartmouth.edu. The study was co-authored by Huanping Huang and Erich C. Osterberg in the Department of Earth Sciences at Dartmouth, Radley M. Horton with the Center for Climate Systems Research at Columbia University and NASA Goddard Institute for Space Studies, and Brian Beckage in the Department of Plant Biology at the University of Vermont.
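
The shift the study describes is a changepoint rather than a trend. As a rough illustration of what such a detection involves (a simplified stand-in, not the authors' actual method), the sketch below scans an annual extreme-precipitation series for the single year that best splits it into two regimes, using a two-sample t-statistic; the series is synthetic, with a 53 percent jump inserted at 1996.

```python
import numpy as np
from scipy import stats

def find_changepoint(years, values):
    """Return (year, t_stat) for the most likely single mean shift."""
    best_year, best_t = None, 0.0
    for i in range(5, len(values) - 5):  # keep at least 5 points per side
        t, _ = stats.ttest_ind(values[:i], values[i:], equal_var=False)
        if abs(t) > abs(best_t):
            best_year, best_t = years[i], t
    return best_year, best_t

# Synthetic series: extreme-precipitation index with a 53% jump after 1996
rng = np.random.default_rng(0)
years = np.arange(1901, 2015)
index = np.where(years < 1996, 1.0, 1.53) + rng.normal(0, 0.25, years.size)
print(find_changepoint(years, index))  # expect a split year near 1996
```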



Landrum L.,U.S. National Center for Atmospheric Research | Otto-Bliesner B.L.,U.S. National Center for Atmospheric Research | Wahl E.R.,National Climatic Data Center | Conley A.,U.S. National Center for Atmospheric Research | And 3 more authors.
Journal of Climate | Year: 2013

An overview of a simulation referred to as the "Last Millennium" (LM) simulation of the Community Climate System Model, version 4 (CCSM4), is presented. The CCSM4 LM simulation reproduces many large-scale climate patterns suggested by historical and proxy-data records, with Northern Hemisphere (NH) and Southern Hemisphere (SH) surface temperatures cooling to the early 1800s Common Era by ~0.5°C (NH) and ~0.3°C (SH), followed by warming to the present. High latitudes of both hemispheres show polar amplification of the cooling from the Medieval Climate Anomaly (MCA) to the Little Ice Age (LIA) associated with sea ice increases. The LM simulation does not reproduce La Niña-like cooling in the eastern Pacific Ocean during the MCA relative to the LIA, as has been suggested by proxy reconstructions. Still, dry medieval conditions over the southwestern and central United States are simulated in agreement with proxy indicators for these regions. Strong global cooling is associated with large volcanic eruptions, with indications of multidecadal colder climate in response to larger eruptions. The CCSM4's response to large volcanic eruptions captures some reconstructed patterns of temperature changes over Europe and North America, but not those of precipitation in the Asian monsoon region. The Atlantic multidecadal oscillation (AMO) has higher variance at centennial periods in the LM simulation compared to the 1850 nontransient run, suggesting a long-term Atlantic Ocean response to natural forcings. The North Atlantic Oscillation (NAO), Pacific decadal oscillation (PDO), and El Niño-Southern Oscillation (ENSO) variability modes show little or no change. CCSM4 does not simulate a persistent positive NAO or a prolonged period of negative PDO during the MCA, as suggested by some proxy reconstructions. © 2013 American Meteorological Society.


Vose R.S.,National Climatic Data Center | Applequist S.,National Climatic Data Center | Menne M.J.,National Climatic Data Center | Williams C.N.,National Climatic Data Center | Thorne P.,Cooperative Institute for Climate and Satellites North Carolina
Geophysical Research Letters | Year: 2012

Temperature trends over 1979-2008 in the U.S. Historical Climatology Network (HCN) are compared with those in six recent atmospheric reanalyses. For the conterminous United States, the trend in the adjusted HCN (0.327°C dec⁻¹) is generally comparable to the ensemble mean of the reanalyses (0.342°C dec⁻¹). It is also well within the range of the reanalysis trend estimates (0.280 to 0.437°C dec⁻¹). The bias adjustments play a critical role, as the raw HCN dataset displays substantially less warming than all of the reanalyses. HCN has slightly lower maximum and minimum temperature trends than those reanalyses with hourly temporal resolution, suggesting the HCN adjustments may not fully compensate for recent non-climatic artifacts at some stations. Spatially, both the adjusted HCN and all of the reanalyses indicate widespread warming across the nation during the study period. Overall, the adjusted HCN is in broad agreement with the suite of reanalyses. © Copyright 2012 by the American Geophysical Union.
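
For readers who want to reproduce this kind of comparison, the trend figures above are ordinary least-squares slopes expressed per decade. A minimal sketch, with synthetic data standing in for the HCN or reanalysis series:

```python
import numpy as np

def trend_per_decade(years, temps):
    """OLS slope of annual-mean temperature, in °C per decade."""
    return np.polyfit(years, temps, 1)[0] * 10.0

years = np.arange(1979, 2009)
rng = np.random.default_rng(1)
temps = 0.0327 * (years - 1979) + rng.normal(0, 0.3, years.size)
print(f"{trend_per_decade(years, temps):+.3f} °C dec⁻¹")  # ≈ +0.327
```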


Tang Q.,CAS Beijing Institute of Geographic Sciences and Nature Resources Research | Leng G.,University of Chinese Academy of Sciences | Groisman P.Y.,National Climatic Data Center
Journal of Climate | Year: 2012

A pronounced summer warming has been observed in Europe since the 1980s, accompanied by an increase in the occurrence of heat waves. Water deficit that strongly reduces surface latent cooling is a widely accepted explanation for the causes of hot summers. The authors show that the variance of European summer temperature is partly explained by changes in summer cloudiness. Using observation-based products of climate variables, satellite-derived cloud cover, and radiation products, the authors show that, during the 1984-2007 period, Europe became less cloudy (except northeastern Europe) while the regions east of Europe became cloudier in summer daytime. In response, summer temperatures increased in the areas of total cloud cover decrease and stalled or declined in the areas of cloud cover increase. Trends in the surface shortwave radiation are generally positive (negative) in the regions with summer warming (cooling or stalled warming), whereas the signs of trends in top-of-atmosphere (TOA) reflected shortwave radiation are reversed. The authors' results suggest that total cloud cover is either an important local factor influencing the summer temperature changes in Europe or a major indicator of these changes. © 2012 American Meteorological Society.
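
The core of the comparison is per-gridcell trend estimation for two fields and a check of whether their signs oppose each other. A minimal sketch of that bookkeeping, on synthetic (year, lat, lon) arrays rather than the satellite and station products used in the paper:

```python
import numpy as np

def gridded_trends(data):
    """Least-squares slope at every grid cell; data has shape (years, lat, lon)."""
    x = np.arange(data.shape[0], dtype=float)
    x -= x.mean()
    return np.tensordot(x, data - data.mean(axis=0), axes=(0, 0)) / (x**2).sum()

rng = np.random.default_rng(2)
cloud = rng.normal(size=(24, 18, 36))  # 24 summers (1984-2007), 18x36 grid
temp = rng.normal(size=(24, 18, 36))
opposed = np.sign(gridded_trends(cloud)) != np.sign(gridded_trends(temp))
print(f"{opposed.mean():.0%} of grid cells show opposing cloud/temperature trends")
```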


Shi X.,University of Washington | Dery S.J.,University of Northern British Columbia | Groisman P.Y.,National Climatic Data Center | Lettenmaier D.P.,University of Washington
Journal of Climate | Year: 2013

Using the Variable Infiltration Capacity (VIC) land surface model forced with gridded climatic observations, the authors reproduce spatial and temporal variations of snow cover extent (SCE) reported by the National Oceanic and Atmospheric Administration (NOAA) Northern Hemisphere weekly satellite SCE data. Both observed and modeled North American and Eurasian snow cover in the pan-Arctic have statistically significant negative trends from April through June over the period 1972-2006. To diagnose the causes of the pan-Arctic SCE recession, the authors identify the role of surface energy fluxes generated in VIC and assess the relationships between 15 hydroclimatic indicators and NOAA SCE observations over each snow-covered sensitivity zone (SCSZ) for both North America and Eurasia. The authors find that surface net radiation (SNR) provides the primary energy source and sensible heat (SH) plays a secondary role in observed changes of SCE. As compared with SNR and SH, latent heat has only a minor influence on snow cover changes. In addition, these changes in surface energy fluxes resulting in the pan-Arctic snow cover recession are mainly driven by statistically significant decreases in snow surface albedo and increased air temperatures (surface air temperature, daily maximum temperature, and daily minimum temperature), as well as statistically significant increased atmospheric water vapor pressure. Contributions of other hydroclimate variables that the authors analyzed (downward shortwave radiation, precipitation, diurnal temperature range, wind speed, and cloud cover) are not significant for observed SCE changes in either the North American or Eurasian SCSZs. © 2013 American Meteorological Society.
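
The attribution step amounts to asking which indicator series co-vary most strongly with the SCE record. A toy version of that ranking (synthetic series; the paper's 15 indicators and VIC output are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 35  # 1972-2006
sce = -0.5 * np.arange(n) + rng.normal(0, 3, n)  # declining snow cover extent
indicators = {
    "surface_net_radiation": 0.4 * np.arange(n) + rng.normal(0, 3, n),
    "sensible_heat": 0.2 * np.arange(n) + rng.normal(0, 3, n),
    "latent_heat": rng.normal(0, 3, n),
}
ranked = sorted(indicators.items(),
                key=lambda kv: -abs(np.corrcoef(sce, kv[1])[0, 1]))
for name, series in ranked:
    print(f"{name:24s} r = {np.corrcoef(sce, series)[0, 1]:+.2f}")
```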


Menne M.J.,National Climatic Data Center | Durre I.,National Climatic Data Center | Vose R.S.,National Climatic Data Center | Gleason B.E.,National Climatic Data Center | Houston T.G.,National Climatic Data Center
Journal of Atmospheric and Oceanic Technology | Year: 2012

A database is described that has been designed to fulfill the need for daily climate data over global land areas. The dataset, known as Global Historical Climatology Network (GHCN)-Daily, was developed for a wide variety of potential applications, including climate analysis and monitoring studies that require data at a daily time resolution (e.g., assessments of the frequency of heavy rainfall, heat wave duration, etc.). The dataset contains records from over 80,000 stations in 180 countries and territories, and its processing system produces the official archive for U.S. daily data. Variables commonly include maximum and minimum temperature, total daily precipitation, snowfall, and snow depth; however, about two-thirds of the stations report precipitation only. Quality assurance checks are routinely applied to the full dataset, but the data are not homogenized to account for artifacts associated with the various eras in reporting practice at any particular station (i.e., for changes in systematic bias). Daily updates are provided for many of the station records in GHCN-Daily. The dataset is also regularly reconstructed, usually once per week, from its 201 data source components, ensuring that the dataset is broadly synchronized with its growing list of constituent sources. The daily updates and weekly reprocessed versions of GHCN-Daily are assigned a unique version number, and the most recent dataset version is provided on the GHCN-Daily website for free public access. Each version of the dataset is also archived at the NOAA/National Climatic Data Center in perpetuity for future retrieval.
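
GHCN-Daily is distributed as fixed-width ".dly" files, one line per station-month-element. A minimal reader for one record line, assuming the layout documented in the dataset's readme (11-character station ID, year, month, 4-character element code, then 31 value/flag groups of 8 characters each):

```python
def parse_dly_line(line):
    """Parse one GHCN-Daily record line into a dict of daily values."""
    rec = {
        "id": line[0:11],
        "year": int(line[11:15]),
        "month": int(line[15:17]),
        "element": line[17:21],  # e.g. PRCP (tenths of mm), TMAX (tenths of °C)
        "values": [],
    }
    for day in range(31):
        start = 21 + day * 8
        value = int(line[start:start + 5])   # -9999 means missing
        qflag = line[start + 6]              # non-blank: failed a QA check
        rec["values"].append(None if value == -9999 or qflag != " " else value)
    return rec
```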


News Article | September 7, 2016
Site: news.yahoo.com

With the midnight sun sinking lower in the sky each day, now is typically the time of year when the annual summer sea ice melt slows to a crawl in the Arctic. But 2016 is not your typical year in that part of the world. In fact, no year is "typical" anymore for a region that is warming about twice as fast as the rest of the globe.

Right now, broken ice and open water are inching closer to the geographic North Pole. This is extremely rare, but likely not unprecedented, said Mark Serreze, the director of the National Snow and Ice Data Center, in an interview.

The state of the sea ice pack at the top of the world is a sign of the rapid pace of warming taking place there. This year has been record warm across the Arctic, and has seen several unseasonably powerful storms swirl across the Arctic Ocean, churning the sea ice. The ice pack after these storms was more vulnerable to melting, since it was split into smaller chunks in greater contact with comparatively mild seawater. In addition, a late-season warm spell has propelled 2016 to run close to 2007 for the title of the second-lowest sea ice minimum on record.

The area of open water and broken ice chunks, which could be navigable to ships, is "quite near the Pole," Serreze said, nearing 87 degrees North. Satellite imagery indicates the broken ice and open water may even extend to 88 degrees North in some places. (Some of the areas of broken ice are known as "polynyas," spots of open water surrounded by ice.) "[It's] At a very high latitude now," he said. "It's creeping closer [to the Pole] every year."

A polar buoy network that had been operating in the region could have confirmed the presence of open water at or near the Pole, but funding cuts leave satellite imagery as the best source of data on how the melt season is progressing. There are still at least a couple of weeks left in the melt season, meaning that the broken ice pack and open water could make it to the Pole itself, although weather conditions will have the final say.

Serreze said this season will end with either the second- or third-lowest sea ice extent on record. "It's gonna be a race to the finish," he said, calling 2016 "another year in the new normal of the Arctic."


News Article | April 4, 2016
Site: news.yahoo.com

This photo from January 2015 depicts a nearly snowless Tioga Pass in California's Sierra Nevada Mountains, near Yosemite National Park. The image highlights the dramatic effect of extremely low precipitation and record-high temperatures upon California.

The weird weather pattern that hatched California's ongoing drought is becoming more common, and could bring more extreme dry spells in the future, a new study finds. California is suffering its worst drought in 1,200 years because of a persistent atmospheric "high" parked just offshore. This high-pressure ridge, aptly named the "ridiculously resilient ridge," deflects winter storms northward, away from California, according to the researchers. Winter storms are critical for California's water supply; the state receives 75 percent of its precipitation in the coldest months. The blocking pattern also triggers higher temperatures on land and in the coastal ocean.

The ridge appeared in 2012 and received its nickname in December 2013 from Stanford University Ph.D. student Daniel Swain. At its greatest extent, the "RRR" stretched along the entire West Coast, from California north to Alaska. This sort of high-pressure system has emerged more frequently in recent decades, according to new research from Swain and his colleagues. The results were published today (April 1) in the journal Science Advances.

Swain analyzed historical data from the U.S. National Climatic Data Center to identify unusual weather years in the past. Along with high temperatures and drought, the researchers also looked for other extreme weather events, such as very wet or very cold years. Then, Swain worked backward to find out what the atmospheric pressure patterns were like when the weather took a severe turn for the worse. "We're not using climate models; we're using real-world observations," Swain told Live Science. "We think it's critically important to consider the extremes, rather than changes in what's going on in the average, because for most practical purposes, a little above or a little below is manageable. The problems start to arise when you have these extreme events."

On average, California still receives about the same amount of precipitation as in decades past, the study reported. (The historical data cover climate observations from 1949 to 2015.) But the variability between wet and dry cycles has increased in recent decades, Swain said. The frequency of a specific North Pacific atmospheric pattern, one akin to the ridiculously resilient ridge, significantly increased over the 67-year period, the study reported. That means more drought years, though the frequency of extreme wet years stayed the same. "We have high confidence that specific dry and warm patterns increased in recent decades, but the wet patterns have not decreased and may have actually increased," Swain said. "The problem is that we see the extreme droughts or flood events more frequently."

The new findings could help forecasters better understand how California's weather may shift in response to global warming. "The next step is figuring out why we're seeing this and what is the real cause. Then, we can assess whether climate-model predictions for the future are consistent with what we really should expect. We might be able to have some year-to-year predictability, which could help in preparing for these events," Swain said.

The research fits with climate-model predictions of more frequent and intense weather events in the coming decades: drier droughts and heavier rains. California's current drought has been particularly severe because of rising temperatures, which several different research groups attribute to human-caused global warming. The heat bakes what little moisture there is right out of the ground. The ridiculously resilient ridge adds another level of aridity on top of these conditions. The stubborn ridge mostly disappeared in winter 2015-2016, a casualty of a huge El Niño changing Pacific Ocean weather patterns. But the system could reappear when ocean temperatures revert from warm to normal or even below-average temperatures, which is what happens during La Niña events. The two patterns are part of the El Niño Southern Oscillation, or ENSO, a natural climate fluctuation in the Pacific Ocean.
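
Swain's "work backward" step is essentially a composite analysis: pick the most extreme years from the surface record, then average the pressure field over just those years to see what circulation pattern they share. A minimal sketch with synthetic data (the study used observed records from 1949 to 2015):

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1949, 2016)
precip = rng.normal(size=years.size)              # statewide precipitation anomaly
pressure = rng.normal(size=(years.size, 20, 40))  # (year, lat, lon) pressure field

dry = precip < np.percentile(precip, 10)          # driest decile of years
composite = pressure[dry].mean(axis=0) - pressure.mean(axis=0)
print("composite anomaly field:", composite.shape)  # a ridge shows up as a high
```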


Cosmic ray detection at LOFAR runs continuously in the background during astronomical observations. When 16 out of the 20 scintillator stations of the LORA particle array detect a signal, a 'trigger' is issued and the ring buffers of all active antennas within about a 1-km radius are stored for offline analysis [14]. Which antennas are active depends on the settings of the astronomical observation. For this analysis, we selected showers that were measured with at least four antenna stations (corresponding to at least 192 antennas) in the low band (30–80 MHz after filtering). The trigger and selection criteria introduce a composition bias. This bias is removed by excluding certain showers on the basis of dedicated sets of simulations that are produced for each observed shower. Each of these sets contains 50 proton and 25 iron showers that span the whole range of possible shower depths. A shower is only accepted if all simulations in its set satisfy the trigger and selection criteria. This anti-bias exclusion removes many showers below 10¹⁷ eV, but only four above that energy. Consequently, we restrict our analysis to the higher-energy showers, imposing a minimum bound on the reconstructed shower energy of E = 10¹⁷ eV. Imposing this energy bound introduces another potential source of compositional bias, because the reconstructed energy might depend on the depth of the shower. However, in our reconstruction approach, this effect is very small because energy and X_max are fitted simultaneously. Extended Data Fig. 7 shows distributions of the ratio between true and reconstructed energy for proton and iron simulations. The systematic offset between the two particle types is of the order of 1%. We used data from the Royal Netherlands Meteorological Institute to check for lightning-storm conditions during our observations. When lightning strikes were detected in the north of the Netherlands within an hour of a detection, the event is flagged and excluded from the analysis. The presence of electric fields in the clouds can severely alter the radio emission even in the absence of lightning discharges [30]. The polarization angle of the radio pulse is very sensitive to the nature of the emission mechanism [15, 31] and is used as an additional veto against strong-field conditions. Finally, a quality criterion is imposed on the sample so that only showers with a core position and arrival direction that allow accurate reconstruction are included. We use the dedicated sets of simulations produced for each shower to derive uncertainties on core position, energy and X_max. These three values are highly correlated, so a single criterion based on the core uncertainty, σ_core < 5 m, is sufficient. The quality criterion is based on the dedicated sets of simulations. These sets are produced for a specific combination of core position and arrival direction. Therefore, the quality criterion is effectively a criterion on position and direction, and does not introduce a composition bias. There is no criterion on the quality of the reconstruction of the actual data. By applying the criteria described above we obtain a sample of 118 showers that are fitted to the simulations, yielding reduced χ² values in the range 0.9–2.9. Deviations from unity can be ascribed to uncertainties in the antenna response, atmospheric properties such as the index of refraction, or limitations of the simulation software. The energy and X_max of the shower are reconstructed with the technique described in ref. 18.
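
The anti-bias exclusion has a simple logical form: a shower enters the sample only if every simulation in its dedicated set would itself have passed the trigger and selection criteria. A sketch of that filter (the shower objects and the passes_criteria hook are hypothetical stand-ins for the actual pipeline):

```python
def unbiased(shower, sim_sets, passes_criteria):
    """True if all 75 dedicated simulations (50 p + 25 Fe) are detectable."""
    return all(passes_criteria(sim) for sim in sim_sets[shower])

def select_sample(showers, sim_sets, passes_criteria, e_min=1e17):
    """Apply the energy bound and the anti-bias exclusion."""
    return [s for s in showers
            if s.energy >= e_min  # reconstructed energy, eV (hypothetical field)
            and unbiased(s, sim_sets, passes_criteria)]
```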
The statistical uncertainty on the power measurements of individual antennas includes three contributions. First, there is a contribution from the background noise, which is a combination of system noise and the galactic background. Second, there is a contribution from uncertainties in the antenna response model. There can be differences between the responses of antennas, either because of antenna properties (for example, cross-talk between nearby antennas) or because of signal properties (for example, polarization). Because these fluctuations are different for each shower core position and arrival direction, they are essentially random and so are included as a 10% statistical uncertainty on the power. Third, there is a contribution due to the error introduced by interpolating the simulated pulse power. Strictly speaking this is not a measurement uncertainty, but it must be taken into account when fitting the data to simulation. The interpolation error is of the order of 2.5% of the maximum power [18]. The three contributions are added in quadrature and produce the 1σ error bars shown in Extended Data Figs 1–5.

The statistical uncertainty on X_max is given by the quadratic sum of the uncertainties due to the reconstruction technique and the atmospheric correction. The former is found by applying our analysis to simulated events with added Gaussian noise, where the noise level is determined from the data. In the CORSIKA simulations, the standard US atmosphere model was used. The reconstructed shower depth is corrected for variations in the atmosphere using data from the Global Data Assimilation System (GDAS) of the NOAA National Climatic Data Center. We follow a previously developed procedure [32], which typically leads to adjustments of the order of 5–20 g cm⁻². The remaining uncertainty after correction is of the order of 1 g cm⁻². The refractive index of air is a function of temperature, air pressure and relative humidity. Using local weather information, the final data were split into two groups of equal size, corresponding to conditions with relatively high or low refractive index. The mean reconstructed X_max values of these two subsets deviate from that of the total sample by ±5 g cm⁻²; we adopt this value as an additional statistical uncertainty. Because the refractivity used in simulation corresponds to dry air, there is also an associated systematic error (see below). The total statistical uncertainty on X_max is found by adding the above factors in quadrature. A distribution of the uncertainty for the showers in our final sample is shown in Extended Data Fig. 6. The energy resolution is 32% and is found by comparing energy scaling factors of the radio power and particle density fit (see Fig. 1).

The data have been subjected to several tests (outlined below) to determine the systematic uncertainty on the reconstructed values for X_max.

Zenith-angle dependence. The final data are split into two groups of equal size by selecting showers with a zenith angle below or above 32°. For both groups, the mean reconstructed X_max is calculated, yielding deviations from the mean value of the complete sample of ±8 g cm⁻². This spread is larger than is expected from random fluctuations alone and is included as a systematic uncertainty. The dependence on zenith angle may be related to atmospheric uncertainties (see below).

Refractive index of air. As explained above, the refractive index changes because of differences in atmospheric conditions.
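
The per-antenna error model translates directly into a quadrature sum. A minimal sketch, with the 10% response term and the 2.5%-of-maximum interpolation term taken from the description above and illustrative input values:

```python
import math

def power_uncertainty(power, noise, max_power):
    """1-sigma uncertainty on one antenna's pulse power (arbitrary units)."""
    sigma_response = 0.10 * power       # antenna response model, ~10%
    sigma_interp = 0.025 * max_power    # interpolation error, ~2.5% of maximum
    return math.sqrt(noise**2 + sigma_response**2 + sigma_interp**2)

print(power_uncertainty(power=1.0, noise=0.05, max_power=2.0))
```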
Fluctuations in X_max due to changing humidity are of the order of 5 g cm⁻² with respect to the mean. However, the refractive index that was used in the radio simulations corresponds to dry air, and is a lower bound on the actual value. Therefore, the real value of X_max can be higher than the reconstructed value, but not lower; we adopt an asymmetric systematic uncertainty of +10 g cm⁻².

Hadronic interaction model. Because the reconstruction technique is based on full Monte Carlo simulations, it is sensitive to the choice of hadronic interaction model. A comparison between QGSJETII.04, SIBYLL 2.1 and EPOS-LHC revealed that the uncertainty due to model dependence is about 5 g cm⁻². The uncertainty on the composition due to different models (in other words, on how to interpret the measured X_max values) is larger.

Radiation code. For this analysis we used the radiation code CoREAS, in which the contributions of all individual charges to the radiation field are added together. The advantage of this microscopic approach is that it is completely model-independent and based on first principles. ZHAireS [33] is another microscopic code, which gives very similar results [34]. To calculate the emission, CoREAS uses the end-point formalism [35], whereas ZHAireS is based on the ZHS algorithm [36]. Both formalisms are derived directly from Maxwell's equations and have been shown to be equivalent [37]. The other difference between CoREAS and ZHAireS is that they take the particle distribution from different air-shower propagation codes (CORSIKA and AIRES, respectively) that internally use different hadronic interaction models. Because the radiation formalisms themselves are equivalent, small differences between CoREAS and ZHAireS are probably due to differences in the hadronic interaction models used to simulate the particle interactions. Therefore, the choice of radiation code does not introduce additional systematic uncertainty on top of the uncertainty due to hadronic interaction models that is already included. A comparison study with LOFAR data did not show any evidence for a systematic offset between the codes (S.B. et al., in preparation).

The remaining small dependence of X_max on zenith angle is possibly related to the refractive index. Showers with different inclination angles have their shower maximum at different altitudes and, therefore, different local air pressures and refractive indices. Consequently, increasing the refractive index used in simulations will result in a zenith-dependent change in the reconstructed X_max. This could potentially remove the observed dependence of the composition on zenith angle. Correctly taking into account a complete atmospheric model for the profile of the refractivity of air is the subject of further study. Here, we treat the effect conservatively by linearly adding the first two contributions to the uncertainty. The other two contributions are independent and are added in quadrature, yielding a total systematic uncertainty of .

The systematic uncertainty in the energy reconstruction with the LORA particle detector array is 27%, which includes effects due to detector calibration, hadronic interaction models and the assumed slope of the primary cosmic-ray spectrum in the CORSIKA simulations [38, 39].

For each observed shower, we calculate the parameter a using equation (1): a = (X_max − ⟨X_max^p⟩) / (⟨X_max^Fe⟩ − ⟨X_max^p⟩), in which X_max is the reconstructed depth of shower maximum, and ⟨X_max^p⟩ and ⟨X_max^Fe⟩ are the mean values for proton and iron showers predicted by QGSJETII.04 [19]. Therefore, a is an energy-independent parameter that is mass-sensitive.
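
The combination rule just described is asymmetric: the zenith-angle and refractive-index terms are summed linearly (the text treats them as possibly related), and the remaining independent terms are added in quadrature. A sketch of that arithmetic, using the individual contributions quoted above; the resulting total is only illustrative:

```python
import math

def combine(linear_terms, quad_terms):
    """linear_terms: list of (plus, minus) in g/cm^2; quad_terms: symmetric."""
    lin_plus = sum(p for p, _ in linear_terms)
    lin_minus = sum(m for _, m in linear_terms)
    quad = math.sqrt(sum(t**2 for t in quad_terms))
    return math.hypot(lin_plus, quad), math.hypot(lin_minus, quad)

# zenith ±8, refractivity +10/-0 (added linearly); hadronic model ~5 (quadrature)
plus, minus = combine([(8.0, 8.0), (10.0, 0.0)], [5.0])
print(f"+{plus:.0f} / -{minus:.0f} g cm^-2")
```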
A pure proton composition results in a wide distribution of a centred around zero, whereas a pure iron composition would result in a narrower distribution centred around one. From the measurements we construct a cumulative distribution function (CDF) using the following Monte Carlo approach. A realization of the data is made by taking the measured values for the energy and X_max, adding random fluctuations based on the statistical uncertainty of these parameters, and calculating a and the corresponding CDF. By constructing a large number of realizations with different random fluctuations, we calculate the mean CDF and the region that contains 99% of all realizations. These are indicated in Fig. 3 as the solid blue line and the shaded region, respectively. We fit theoretical CDFs based on compositions with two or four mass components to the data. The test statistic in the fit is the maximum deviation between the data and the model CDFs. The p value represents the probability of observing this deviation, or a larger one, assuming the fitted composition model. We first use a two-component model of proton and iron nuclei, in which the mixing ratio is the only free parameter. The best fit is found for a proton fraction of 62%, but it describes the data poorly, with a p value of 1.1 × 10⁻⁶. A better fit is achieved with a four-component model (p, He, N and Fe), yielding p = 0.17. Although the best fit is found for a helium fraction of 80%, the fit quality deteriorates slowly when replacing helium by protons. This is demonstrated in Fig. 4, in which p is plotted for four-component fits with the fractions of helium and protons fixed, and the ratio between nitrogen and iron as the only free parameter. The solid line in Fig. 4 bounds the parameter space in which p > 0.01. We construct a 99% confidence interval on the total fraction of light elements (p and He) by finding the two extreme values of this fraction that lie within the p > 0.01 region. The total fraction of light elements (p and He) is in the range [0.38, 0.98] at the 99% confidence level, with a best-fit value of 0.8. The heaviest composition that is allowed within systematic uncertainties (see above) has a best-fit p + He fraction of 0.6 and a 99% confidence interval of [0.18, 0.82]. Data analysis was done with PyCRTools, free software available from http://usg.lofar.org/svn/code/trunk/src/PyCRTools, which can be redistributed and/or modified under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License or any later version.
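
A compact way to see the Monte Carlo CDF construction: draw many realizations of the 118 measured depths perturbed by their statistical uncertainties, convert each to the mass-sensitive parameter a, and take the mean and the 99% envelope of the resulting empirical CDFs. The mean depths below are illustrative stand-ins for the QGSJETII.04 predictions:

```python
import numpy as np

def a_param(xmax, mean_p, mean_fe):
    """a = 0 for an average proton shower, a = 1 for an average iron shower."""
    return (xmax - mean_p) / (mean_fe - mean_p)

rng = np.random.default_rng(5)
xmax = rng.normal(700.0, 60.0, 118)    # measured depths, g cm^-2 (synthetic)
sigma = rng.uniform(10.0, 20.0, 118)   # per-shower statistical uncertainties
draws = xmax + rng.normal(0.0, sigma, (1000, 118))  # 1000 realizations
a_sorted = np.sort(a_param(draws, mean_p=750.0, mean_fe=650.0), axis=1)
mean_cdf = a_sorted.mean(axis=0)                        # mean empirical CDF
lo, hi = np.percentile(a_sorted, [0.5, 99.5], axis=0)   # 99% envelope
```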
