Cafaro M.,University of Salento | Cafaro M.,Euro Mediterranean Center for Climate Change | Tempesta P.,Complutense University of Madrid
Concurrency Computation Practice and Experience | Year: 2011

We present a deterministic parallel algorithm for the k-majority problem, which can be used to find frequent items in parallel, i.e. those whose multiplicity is greater than a given threshold, and is therefore useful for processing iceberg queries and in many other contexts of applied mathematics and information theory. The algorithm can be used both in the online (stream) context and in the offline setting. The difference is that in the former case we are restricted to a single scan of the input elements, so that verifying the frequent items that have been determined is not allowed (e.g. network traffic streams passing through internet routers), while in the latter a parallel scan of the input can be used to determine the actual k-majority elements. To the best of our knowledge, this is the first parallel algorithm solving the proposed problem. Copyright © 2011 John Wiley & Sons, Ltd.
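The sequential core of such frequent-items detection is typically a counter-based summary in the style of the Misra-Gries algorithm. The following is a minimal single-pass sketch in that spirit, not the authors' parallel algorithm; the optional second pass illustrates the offline verification step the abstract mentions:

```python
from collections import Counter

def misra_gries(stream, k):
    """One-pass candidate search for items with multiplicity > n/k.

    Keeps at most k - 1 counters; every item occurring more than n/k
    times is guaranteed to survive as a candidate, although some
    candidates may be false positives.
    """
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # Decrement every counter and drop those that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return set(counters)

def verify(stream, candidates, k):
    """Offline second pass: keep only the true k-majority elements."""
    counts = Counter(x for x in stream if x in candidates)
    n = len(stream)
    return {x for x, c in counts.items() if c > n / k}
```

In the streaming setting only the first pass is available, so the returned candidate set may contain false positives; the offline setting permits the verification pass.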

Tavoni M.,Euro Mediterranean Center for Climate Change | Tavoni M.,Fondazione Eni Enrico Mattei | de Cian E.,Euro Mediterranean Center for Climate Change | de Cian E.,Fondazione Eni Enrico Mattei | And 3 more authors.
Climatic Change | Year: 2012

This paper assesses the economic value associated with the development of various low-carbon technologies in the context of climate stabilization. We analyze the impact of restrictions on the development of specific mitigation technologies, comparing three integrated assessment models used in the RECIPE comparison exercise. Our results indicate that the diversification of the carbon mitigation portfolio is an important determinant of the feasibility of climate policy. Forgoing specific low-carbon technologies raises the cost of achieving the climate policy target, though at different rates. CCS and renewables are shown to have the highest value, given their flexibility and wide coverage. The costs associated with technology failure are shown to be related to the role that each technology plays in the stabilization scenario, but also to the expectations about its technological progress. In particular, the costs of restricting mature technologies can be partly compensated by more innovation and technological advancement. © 2011 Springer Science+Business Media B.V.

Jakob M.,Potsdam Institute for Climate Impact Research | Luderer G.,Potsdam Institute for Climate Impact Research | Steckel J.,Potsdam Institute for Climate Impact Research | Tavoni M.,Euro Mediterranean Center for Climate Change | Monjon S.,Centre International de Recherche sur l'Environnement et le Développement
Climatic Change | Year: 2012

This paper compares the results of the three state of the art climate-energy-economy models IMACLIM-R, ReMIND-R, and WITCH to assess the costs of climate change mitigation in scenarios in which the implementation of a global climate agreement is delayed or major emitters decide to participate in the agreement at a later stage only. We find that for stabilizing atmospheric GHG concentrations at 450 ppm CO2-only, postponing a global agreement to 2020 raises global mitigation costs by at least about half and a delay to 2030 renders ambitious climate targets infeasible to achieve. In the standard policy scenario-in which allocation of emission permits is aimed at equal per-capita levels in the year 2050-regions with above average emissions (such as the EU and the US alongside the rest of Annex-I countries) incur lower mitigation costs by taking early action, even if mitigation efforts in the rest of the world experience a delay. However, regions with low per-capita emissions which are net exporters of emission permits (such as India) can possibly benefit from higher future carbon prices resulting from a delay. We illustrate the economic mechanism behind these observations and analyze how (1) lock-in of carbon intensive infrastructure, (2) differences in global carbon prices, and (3) changes in reduction commitments resulting from delayed action influence mitigation costs. © 2011 Springer Science+Business Media B.V.

Luderer G.,Potsdam Institute for Climate Impact Research | de Cian E.,Euro Mediterranean Center for Climate Change | Hourcade J.-C.,Centre International de Recherche sur l'Environnement et le Développement | Leimbach M.,Potsdam Institute for Climate Impact Research | And 2 more authors.
Climatic Change | Year: 2012

This paper analyzes the regional distribution of climate change mitigation costs in a global cap-and-trade regime. Four stylized burden-sharing rules are considered, ranging from GDP-based permit allocations to schemes that foresee a long-term convergence of per-capita emission permits. The comparison of results from three structurally different hybrid, integrated energy-economy models allows us to derive robust insights as well as identify sources of uncertainty with respect to the regional distribution of the costs of climate change mitigation. We find that regional costs of climate change mitigation may deviate substantially from the global mean. For all models, the mitigation cost average of the four scenarios is higher for China than for the other macro-regions considered. Furthermore, China suffers above-world-average mitigation costs for most burden-sharing rules in the long term. A decomposition of mitigation costs into (a) primary (domestic) abatement costs and (b) permit trade effects reveals that the large uncertainty about the future development of carbon prices results in substantial uncertainties about the financial transfers associated with carbon trade for a given allocation scheme. This variation also implies large uncertainty about the regional distribution of climate policy costs. © 2012 Springer Science+Business Media B.V.

Vaccaro A.,University of Sannio | Mercogliano P.,Euro Mediterranean Center for Climate Change | Schiano P.,Euro Mediterranean Center for Climate Change | Villacci D.,University of Sannio
Electric Power Systems Research | Year: 2011

This paper proposes a novel framework for one-day-ahead wind power forecasting based on information amalgamation from multiple sources. The final objective is to provide a better solution than could be achieved from single-source data alone. The proposed framework combines multiple forecasting models and adaptive machine learning techniques for information processing. The input data sources, namely the wind forecast profiles computed by synoptic physical models and the measured data coming from meteorological stations, are amalgamated via an adaptive supervised learning system. The latter is based on a local learning algorithm called Lazy Learning (LL). This algorithm is updated sequentially in order to adapt the whole architecture to "new" operating conditions. Experimental results obtained on a one-year time scenario show the effectiveness of the proposed data fusion paradigm in addressing the problem of one-day-ahead wind power forecasting. © 2010 Elsevier B.V. All rights reserved.
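To illustrate the local-learning idea behind Lazy Learning, here is a minimal k-nearest-neighbour regression sketch; the actual LL algorithm and the fusion architecture of the paper are more elaborate, and the inverse-distance weighting and all names below are illustrative assumptions:

```python
import numpy as np

def lazy_predict(X_train, y_train, x_query, k=5):
    """Local (lazy) prediction: no global model is fitted; at query
    time, the k nearest stored examples are combined.

    X_train : (n, d) array of past inputs (e.g. model wind forecasts
              together with station measurements for the same hour).
    y_train : (n,) array of the corresponding observed wind power.
    x_query : (d,) input vector for the hour to be forecast.
    """
    dist = np.linalg.norm(X_train - x_query, axis=1)  # distances to query
    idx = np.argsort(dist)[:k]                        # k nearest neighbours
    w = 1.0 / (dist[idx] + 1e-9)                      # inverse-distance weights
    return float(np.sum(w * y_train[idx]) / np.sum(w))

def update(X_train, y_train, x_new, y_new):
    """Sequential update: append the newly observed input/output pair."""
    return np.vstack([X_train, x_new]), np.append(y_train, y_new)
```

The sequential update is what lets a lazy scheme adapt to "new" operating conditions: new observations immediately become candidate neighbours for the next query.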

Emanuel K.,Massachusetts Institute of Technology | Solomon S.,Massachusetts Institute of Technology | Folini D.,ETH Zurich | Davis S.,University of Colorado at Boulder | And 2 more authors.
Journal of Climate | Year: 2013

Virtually all metrics of Atlantic tropical cyclone activity show substantial increases over the past two decades. It is argued here that cooling near the tropical tropopause and the associated decrease in tropical cyclone outflow temperature contributed to the observed increase in tropical cyclone potential intensity over this period. Quantitative uncertainties in the magnitude of the cooling are important, but a broad range of observations supports some cooling. Downscalings of the output of atmospheric general circulation models (AGCMs) that are driven by observed sea surface temperatures and sea ice cover produce little if any increase in Atlantic tropical cyclone metrics over the past two decades, even though observed variability before roughly 1970 is well simulated by some of the models. Part of this shortcoming is traced to the failure of the AGCMs examined to reproduce the observed cooling of the lower stratosphere and tropical tropopause layer (TTL) over the past few decades. The authors caution against using sea surface temperature or proxies based on it to make projections of tropical cyclone activity as there can be significant contributions from other variables such as the outflow temperature. The proposed mechanisms of TTL cooling (e.g., ozone depletion and stratospheric circulation changes) are reviewed, and the need for improved representations of these processes in global models in order to improve projections of future tropical cyclone activity is emphasized. © 2013 American Meteorological Society.

Turco M.,University of Barcelona | Turco M.,Euro Mediterranean Center for Climate Change | Llasat M.C.,University of Barcelona | von Hardenberg J.,CNR Institute of Atmospheric Sciences and Climate | Provenzale A.,CNR Institute of Atmospheric Sciences and Climate
Climatic Change | Year: 2013

We analyse the impact of climate interannual variability on summer forest fires in Catalonia (northeastern Iberian Peninsula). The study period covers 25 years, from 1983 to 2007. During this period more than 16,000 fire events were recorded and the total burned area exceeded 240 kha, i.e. around 7.5% of the area of Catalonia. We show that the interannual variability of summer fires is significantly correlated with summer precipitation and summer maximum temperature. In addition, fires are significantly related to antecedent climate conditions, showing positive correlation with lagged precipitation and negative correlation with lagged temperatures, both with a time lag of two years, and negative correlation with the minimum temperature in the spring of the same year. The interaction between antecedent climate conditions and fire variability highlights the importance of climate in regulating not only fuel flammability but also fuel structure. On the basis of these results, we discuss a simple regression model that explains up to 76% of the variance of the burned area and up to 91% of the variance of the number of fires. This simple regression model produces reliable out-of-sample predictions of the impact of climate variability on summer forest fires and could be used to estimate fire response to different climate change scenarios, assuming that climate-vegetation-humans-fire interactions do not change significantly. © 2012 Springer Science+Business Media B.V.
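A regression of this kind, combining concurrent and lagged climate predictors, can be sketched as follows; the exact predictor set and the log transform are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def fit_fire_model(burned_area, precip, tmax, t_spring_min, lag=2):
    """Least-squares fit of log burned area on concurrent and lagged
    climate predictors.

    All series are annual; the first `lag` years serve only as
    antecedent conditions. Returns the coefficients and fitted values.
    """
    y = np.log(burned_area[lag:])        # log-transformed burned area
    X = np.column_stack([
        np.ones(len(y)),                 # intercept
        precip[lag:],                    # same-summer precipitation
        tmax[lag:],                      # same-summer maximum temperature
        precip[:-lag],                   # precipitation lagged by two years
        t_spring_min[lag:],              # same-year spring minimum temperature
    ])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, X @ beta
```

The signs the paper reports (positive for lagged precipitation, negative for lagged and spring temperatures) would appear in the fitted `beta` vector.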

Metzger M.J.,University of Edinburgh | Bunce R.G.H.,Estonian University of Life Sciences | Jongman R.H.G.,Wageningen University | Sayre R.,U.S. Geological Survey | And 3 more authors.
Global Ecology and Biogeography | Year: 2013

Aim: To develop a novel global spatial framework for the integration and analysis of ecological and environmental data. Location: The global land surface excluding Antarctica. Methods: A broad set of climate-related variables was considered for inclusion in a quantitative model, which partitions geographic space into bioclimate regions. Statistical screening produced a subset of relevant bioclimate variables, which were further compacted into fewer independent dimensions using principal components analysis (PCA). An ISODATA clustering routine was then used to classify the principal components into relatively homogeneous environmental strata. The strata were aggregated into global environmental zones based on the attribute distances between strata to provide structure and support a consistent nomenclature. Results: The global environmental stratification (GEnS) consists of 125 strata, which have been aggregated into 18 global environmental zones. The stratification has a 30 arcsec resolution (equivalent to 0.86 km2 at the equator). Aggregations of the strata were compared with nine existing global, continental and national bioclimate and ecosystem classifications using the Kappa statistic. Values range between 0.54 and 0.72, indicating good agreement in bioclimate and ecosystem patterns between existing maps and the GEnS. Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org). © 2012 Blackwell Publishing Ltd.
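The PCA-plus-clustering workflow can be sketched as follows. ISODATA, which also splits and merges clusters, is not part of standard numerical libraries, so plain k-means with a greedy farthest-point initialization stands in for it here; the result is a stratum label per grid cell, as in the GEnS:

```python
import numpy as np

def stratify(climate_vars, n_components=3, n_strata=5, iters=50):
    """PCA + k-means sketch of an environmental stratification.

    climate_vars : (n_cells, n_vars) matrix of bioclimate variables.
    Returns an integer stratum label for each grid cell.
    """
    # 1. Standardize, then project onto the leading principal components.
    X = (climate_vars - climate_vars.mean(0)) / climate_vars.std(0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    pcs = X @ Vt[:n_components].T

    # 2. Greedy farthest-point initialization of the cluster centers.
    centers = [pcs[0]]
    for _ in range(n_strata - 1):
        d = np.min([np.linalg.norm(pcs - c, axis=1) for c in centers], axis=0)
        centers.append(pcs[d.argmax()])
    centers = np.array(centers)

    # 3. Standard k-means iterations on the component scores.
    for _ in range(iters):
        dist = np.linalg.norm(pcs[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(1)
        for j in range(n_strata):
            if np.any(labels == j):
                centers[j] = pcs[labels == j].mean(0)
    return labels
```

A subsequent aggregation step, grouping strata by the distances between their centers, would yield the coarser environmental zones.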

Scarascia L.,University of Salento | Scarascia L.,Euro Mediterranean Center for Climate Change | Lionello P.,University of Salento | Lionello P.,Euro Mediterranean Center for Climate Change
Global and Planetary Change | Year: 2013

This study discusses the evolution of sea level (SL) in the Northern Adriatic Sea over the 20th and 21st centuries. A Linear Regression Model (LRM), which describes the effect of regional processes, is built and validated. The LRM computes the North Adriatic mean SL variations using three predictors: the Mean Sea Level Pressure (MSLP) in the Gulf of Venice, the mean Sea Temperature (ST) of the water column in the South Adriatic and the Upper Level Salinity (ULS) in the central part of the basin. SL data are provided by monthly values recorded at 7 tide gauges distributed along the Italian and Croatian coasts (available from the PSMSL, Permanent Service for Mean Sea Level). MSLP data are provided by the EMULATE data set. Mediterranean ST and ULS data are extracted from the MEDATLAS/2002 database. The study shows that annual SL variations at Northern Adriatic stations are highly coherent, so that the Northern Adriatic SL can be reconstructed since 1905 on the basis of only two stations: Venice and Trieste. The LRM is found to be robust, very successful at explaining interannual SL variations and consistent with the physical mechanisms responsible for SL evolution. Results show that the observed SL in the 20th century has a large trend, which cannot be explained by this LRM and is interpreted as the superposition of land movement and a remote cause (such as polar ice melting). When the LRM is used with the MSLP, ST and ULS from climate model projections for the end of the 21st century (A1B scenario), it produces an SL rise in the range from 2.3 to 14.1 cm, with a best estimate of 8.9 cm. However, the behavior of the remotely forced SL rise is the main source of future SL uncertainty, and extrapolating its present trend to the future would expand the range of SL uncertainty from 14 to 49 cm. © 2013 Elsevier B.V.
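A three-predictor LRM of this form can be sketched as follows; the variable names are illustrative and the paper's exact regression setup may differ:

```python
import numpy as np

def fit_lrm(sl, mslp, st, uls):
    """Fit SL = b0 + b1*MSLP + b2*ST + b3*ULS by least squares.

    sl, mslp, st, uls : equal-length 1-D arrays of the historical
    sea level and the three regional predictors.
    """
    X = np.column_stack([np.ones_like(sl), mslp, st, uls])
    beta, *_ = np.linalg.lstsq(X, sl, rcond=None)
    return beta

def project(beta, mslp, st, uls):
    """Apply the fitted LRM to scenario predictors, e.g. climate
    model output for the end of the 21st century."""
    return beta[0] + beta[1] * mslp + beta[2] * st + beta[3] * uls
```

Fitting on the historical record and then calling `project` with scenario predictors mirrors the two steps of the study; the remotely forced trend lies outside the model and must be added separately.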

Storto A.,Norwegian Meteorological Institute | Storto A.,Euro Mediterranean Center for Climate Change | Randriamampianina R.,Norwegian Meteorological Institute
Journal of Geophysical Research: Atmospheres | Year: 2010

Statistical objective analysis requires the explicit specification of the observation and background error covariances. This paper deals with the estimation of the latter within a high-latitude regional model. Four different approaches were adopted to simulate the error evolution in the analysis and forecast steps of the model: i) the widely adopted NMC method, applied to both a winter and a summer season data set; ii) global ensemble analyses projected forward to the 6-hour forecast range by the limited-area model itself; iii) limited-area ensemble variational assimilation with unperturbed lateral boundary conditions (LBCs); and iv) limited-area ensemble variational assimilation with perturbed LBCs. The structure of the four background error covariance matrices was extensively compared. The NMC-derived standard deviations had larger amplitudes at large scales, especially in the lower troposphere, than those derived using the ensemble technique, as well as much broader horizontal and large-scale vertical correlations. The contribution of lateral boundary perturbations is shown to be significant: neglecting them introduces unrealistic features, such as artificially small variances near the boundaries, which tend to propagate spuriously towards the inner area. Furthermore, humidity variances and the associated cross-covariances tend to be weaker in the ensemble assimilation experiments, in particular when the lateral boundary conditions are not perturbed. A one-month assimilation period allowed us to evaluate the impact of the different background error statistics on the forecasts: wind, geopotential and, partly, temperature clearly benefit from the use of ensemble-based background error covariances, and humidity skill scores improve noticeably when background error covariances from limited-area ensemble assimilation are used. The seasonality of the background error structures was also investigated and found to be non-negligible for the performance of the data assimilation and forecast systems. © 2010 by the American Geophysical Union.
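The two families of estimates can be sketched as sample covariances; this toy full-matrix version ignores the modelled, sparse representations used operationally, and the lead times in the NMC variant are illustrative:

```python
import numpy as np

def ensemble_b_matrix(ensemble):
    """Sample background error covariance from an ensemble of states.

    ensemble : (n_members, n_state) array of 6-hour forecast states.
    Returns the (n_state, n_state) covariance matrix B.
    """
    perts = ensemble - ensemble.mean(axis=0)      # deviations from ensemble mean
    return perts.T @ perts / (len(ensemble) - 1)  # unbiased sample covariance

def nmc_b_matrix(fcst_long, fcst_short, scale=1.0):
    """NMC method: covariance of differences between forecasts of two
    lead times (e.g. 48 h and 24 h) valid at the same time, over many
    dates; `scale` is an empirical tuning factor.
    """
    diffs = fcst_long - fcst_short
    diffs = diffs - diffs.mean(axis=0)
    return scale * diffs.T @ diffs / (len(diffs) - 1)
```

The comparison in the paper amounts to contrasting the variances (diagonal) and correlation structure (off-diagonal) of matrices built in these two ways.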
