Knorr W., Lund University |
Kaminski T., FastOpt GmbH |
Arneth A., KIT IMK-IFU |
Weber U., Max Planck Institute for Biogeochemistry
Biogeosciences | Year: 2014
Human impact on wildfires, a major Earth system component, remains poorly understood. While local studies have found more fires close to settlements and roads, assimilated charcoal records and analyses of regional fire patterns from remote-sensing observations point to a decline in fire frequency with increasing human population. Here, we present a global analysis combining three multi-year satellite-based burned-area products with parameter estimation and uncertainty analysis for a non-linear model. We show that at the global scale, the main effect of increasing population density is to reduce fire frequency. Only in areas with up to 0.1 people per km2 does fire frequency increase, by 10 to 20% relative to its value at zero population. The results are robust against the choice of burned-area data set and indicate that fire frequency is limited by human ignitions in only very few places on Earth. Applying the results to historical population estimates yields a moderate but accelerating decline of global burned area, by around 14% since 1800, with most of the decline occurring after 1950. © Author(s) 2014.
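The hump-shaped population response described in this abstract can be illustrated with a toy function. This is purely hypothetical and not the fitted model from the paper: fire frequency rises slightly at very low population densities, where human ignitions add to natural ones, then declines as suppression dominates.

```python
import numpy as np

def relative_fire_frequency(pop_density, a=17.2, b=10.0):
    """Relative fire frequency, normalised to 1.0 at zero population.

    Hypothetical hump-shaped form: an ignition gain (1 + a*p) times a
    suppression decay exp(-b*p). NOT the function fitted in the paper;
    a and b are invented so the peak gain is roughly 10-20%.
    """
    return (1.0 + a * pop_density) * np.exp(-b * pop_density)

# evaluate at a few population densities (people per km2)
densities = np.array([0.0, 0.04, 0.1, 1.0])
relative = relative_fire_frequency(densities)
```

With these invented constants the function peaks a little above 1.1 near 0.04 people per km2 and falls far below 1 by 1 person per km2, mimicking the qualitative shape the abstract describes.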
Kemp S., University of Bristol |
Scholze M., Lund University |
Ziehn T., CSIRO |
Kaminski T., FastOpt GmbH
Geoscientific Model Development | Year: 2014
Terrestrial ecosystem models are employed to calculate the sources and sinks of carbon dioxide between land and atmosphere. These models may be heavily parameterised. Where no reliable estimate is available for a parameter, it remains highly uncertain, and parameter uncertainty can contribute substantially to overall model output uncertainty. This paper builds on the work of the terrestrial Carbon Cycle Data Assimilation System (CCDAS), which here assimilates atmospheric CO2 concentrations to optimise 19 parameters of the underlying terrestrial ecosystem model (the Biosphere Energy Transfer and Hydrology scheme, BETHY). Previous experiments have shown that the identified minimum may contain non-physical parameter values. One way to combat this problem is constrained optimisation, which prevents the optimiser from searching non-physical regions. Another technique is to add penalty terms to the cost function whenever the optimisation searches outside a specified region. A third method is parameter transformation, where the optimisation is carried out in a transformed parameter space, ensuring that the optimal parameters at the minimum lie in the physical domain. We compare these methods of achieving meaningful parameter values, finding that the parameter transformation method shows consistent results whereas the other two provide no useful results. © Author(s) 2014.
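The parameter-transformation idea can be sketched with a toy example (invented cost function, not CCDAS itself): the optimiser works in an unbounded transformed space y, and the back-transform x = exp(y) guarantees that every parameter value it visits is positive and hence physical.

```python
import numpy as np

# Toy misfit with its physical minimum at x = 2.0; x must stay positive.
# The real CCDAS cost compares modelled and observed CO2 concentrations.
def cost(x):
    return (np.log(x) - np.log(2.0)) ** 2

def grad_y(y):
    # chain rule for cost(exp(y)): d/dy (y - log 2)^2 = 2*(y - log 2)
    return 2.0 * (y - np.log(2.0))

y = 0.0                       # start at x = exp(0) = 1, safely physical
for _ in range(100):
    y -= 0.1 * grad_y(y)      # plain gradient descent in transformed space

x_opt = np.exp(y)             # back-transform: positive by construction
```

Because the search happens entirely in y, no bound constraints or penalty terms are needed; the exponential map alone keeps the iterates in the physical domain.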
Rayner P.J., CEA Saclay Nuclear Research Center |
Koffi E., CEA Saclay Nuclear Research Center |
Scholze M., University of Bristol |
Kaminski T., FastOpt GmbH |
Dufresne J.-L., University Pierre and Marie Curie
Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences | Year: 2011
We use a carbon-cycle data assimilation system to estimate the terrestrial biospheric CO2 flux until 2090. The terrestrial sink increases rapidly and the increase is stronger in the presence of climate change. Using a linearized model, we calculate the uncertainty in the flux owing to uncertainty in model parameters. The uncertainty is large and is dominated by the impact of soil moisture on heterotrophic respiration. We show that this uncertainty can be greatly reduced by constraining the model parameters with two decades of atmospheric measurements. © 2011 The Royal Society.
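The linearized uncertainty propagation can be sketched as follows (all matrices invented for illustration): the flux variance is J C J^T, where J is the sensitivity of the flux to the parameters and C the parameter covariance, and assimilating observations replaces the prior covariance C0 with the smaller posterior (C0^-1 + H^T R^-1 H)^-1.

```python
import numpy as np

C0 = np.diag([1.0, 4.0, 9.0])      # prior parameter covariance (toy)
J = np.array([[0.5, 1.0, 2.0]])    # d(flux)/d(parameters) (toy)
H = np.array([[1.0, 0.0, 0.0],     # obs sensitivities to parameters (toy)
              [0.0, 1.0, 1.0]])
R = np.diag([0.1, 0.1])            # observation error covariance (toy)

# flux uncertainty from the prior parameter uncertainty
prior_var = (J @ C0 @ J.T).item()

# posterior parameter covariance after assimilating the observations
C_post = np.linalg.inv(np.linalg.inv(C0) + H.T @ np.linalg.inv(R) @ H)
post_var = (J @ C_post @ J.T).item()
```

With these invented numbers the projected flux variance drops by an order of magnitude, illustrating how atmospheric measurements constrain the parameters and, through them, the projected flux.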
Vossbeck M., FastOpt GmbH |
Clerici M., European Commission - Joint Research Center Ispra |
Kaminski T., FastOpt GmbH |
Lavergne T., Norwegian Meteorological Institute |
And 3 more authors.
Inverse Problems | Year: 2010
This paper presents an inverse model of radiative transfer processes in the solar domain in vegetation canopies. It uses a gradient method to minimize the misfit between model simulation and observed radiant fluxes plus the deviation from prior information on the unknown model parameters. The second derivative of the misfit approximates uncertainty ranges for the estimated model parameters. In a second step, uncertainties are propagated from parameters to simulated radiant fluxes via the model's first derivative. All derivative information is provided by a highly efficient code generated via automatic differentiation of the radiative transfer code. The paper further derives and evaluates an approach for avoiding secondary minima of the misfit. The approach exploits the smooth dependence of the solution on the observations and relies on a database of solutions for a discretized version of the observation space. © 2010 IOP Publishing Ltd.
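The misfit-plus-prior minimization and the Hessian-based uncertainty estimate can be sketched with a one-parameter toy problem. A linear "model" F(x) = g*x stands in for the radiative transfer code, and the derivatives are written by hand; in the paper they come from automatic differentiation of the actual code.

```python
import numpy as np

g, obs, sig_o = 2.0, 3.0, 0.1      # toy model gain, observation, obs error
x_prior, sig_p = 1.0, 1.0          # prior value and prior uncertainty

def cost(x):
    # misfit to the observation plus deviation from the prior
    return ((g * x - obs) / sig_o) ** 2 + ((x - x_prior) / sig_p) ** 2

def grad(x):
    return 2 * g * (g * x - obs) / sig_o**2 + 2 * (x - x_prior) / sig_p**2

x = x_prior
for _ in range(200):
    x -= 1e-3 * grad(x)            # gradient descent on the cost

# second derivative of the cost (constant for this linear toy model);
# its inverse gives the posterior variance of the estimated parameter
hess = 2 * g**2 / sig_o**2 + 2 / sig_p**2
sigma_x = np.sqrt(2.0 / hess)      # 1-sigma parameter uncertainty
```

Propagating this uncertainty forward through the model's first derivative, as the paper does in its second step, would here amount to multiplying sigma_x by the sensitivity g.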
Buchwitz M., University of Leicester |
Reuter M., University of Leicester |
Schneising O., University of Leicester |
Boesch H., SRON Netherlands Institute for Space Research |
And 46 more authors.
Remote Sensing of Environment | Year: 2015
The GHG-CCI project is one of several projects of the European Space Agency's (ESA) Climate Change Initiative (CCI). The goal of the CCI is to generate and deliver data sets of various satellite-derived Essential Climate Variables (ECVs) in line with GCOS (Global Climate Observing System) requirements. The "ECV Greenhouse Gases" (ECV GHG) is the global distribution of important climate relevant gases - atmospheric CO2 and CH4 - with a quality sufficient to obtain information on regional CO2 and CH4 sources and sinks. Two satellite instruments deliver the main input data for GHG-CCI: SCIAMACHY/ENVISAT and TANSO-FTS/GOSAT. The first order priority goal of GHG-CCI is the further development of retrieval algorithms for near-surface-sensitive column-averaged dry air mole fractions of CO2 and CH4, denoted XCO2 and XCH4, to meet the demanding user requirements. GHG-CCI focuses on four core data products: XCO2 from SCIAMACHY and TANSO and XCH4 from the same two sensors. For each of the four core data products at least two candidate retrieval algorithms have been independently further developed and the corresponding data products have been quality-assessed and inter-compared. This activity is referred to as "Round Robin" (RR) activity within the CCI. The main goal of the RR was to identify for each of the four core products which algorithms should be used to generate the Climate Research Data Package (CRDP). The CRDP will essentially be the first version of the ECV GHG. This manuscript gives an overview of the GHG-CCI RR and related activities. This comprises the establishment of the user requirements, the improvement of the candidate retrieval algorithms and comparisons with ground-based observations and models. The manuscript summarizes the final RR algorithm selection decision and its justification. 
Comparison with ground-based Total Carbon Column Observing Network (TCCON) data indicates that the "breakthrough" single-measurement precision requirement has been met for SCIAMACHY and TANSO XCO2 (< 3 ppm) and TANSO XCH4 (< 17 ppb). The achieved relative accuracy for XCH4 is 3-15 ppb for SCIAMACHY and 2-8 ppb for TANSO, depending on algorithm and time period. Meeting the 0.5 ppm systematic error requirement for XCO2 remains a challenge: approximately 1 ppm has been achieved at the validation sites, but larger differences have been found in regions remote from TCCON. More research is needed to identify the causes of the observed differences. In this context, GHG-CCI suggests taking advantage of the ensemble of existing data products, for example via the EnseMble Median Algorithm (EMMA). © 2013 Elsevier Inc.
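The ensemble-median idea behind EMMA can be sketched as follows (invented values; the real algorithm involves additional screening and harmonisation steps, only the median step is shown): for each location, the median across the available retrieval products damps an outlier from any single algorithm.

```python
import numpy as np

# Invented XCO2 values in ppm: rows are retrieval algorithms,
# columns are locations. Algorithm C has an outlier at the first site.
products = np.array([
    [400.1, 401.3, 399.8],   # algorithm A
    [400.4, 401.0, 399.5],   # algorithm B
    [402.9, 401.2, 399.9],   # algorithm C
])

# per-location median over the ensemble of products
ensemble_median = np.median(products, axis=0)
```

The outlier of 402.9 ppm at the first site leaves the median untouched at 400.4 ppm, whereas a per-location mean would have been pulled roughly 1 ppm high, which is exactly the kind of algorithm-specific systematic error the ensemble approach is meant to suppress.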