Vogelsberger M., Massachusetts Institute of Technology | Zavala J., Copenhagen University | Simpson C., Heidelberg Institute for Theoretical Studies | Jenkins A., Durham University
Monthly Notices of the Royal Astronomical Society | Year: 2014

We present the first cosmological simulations of dwarf galaxies which include dark matter self-interactions and baryons. We study two dwarf galaxies within cold dark matter, and four different elastic self-interacting scenarios with constant and velocity-dependent cross-sections, motivated by a new force in the hidden dark matter sector. Our highest resolution simulation has a baryonic mass resolution of 1.8 × 10² M⊙ and a gravitational softening length of 34 pc at z = 0. In this first study we focus on the regime of mostly isolated dwarf galaxies with halo masses ~10¹⁰ M⊙, where dark matter dynamically dominates even at sub-kpc scales. We find that while the global properties of galaxies of this scale are minimally affected by allowed self-interactions, their internal structures change significantly if the cross-section is large enough within the inner sub-kpc region. In these dark-matter-dominated systems, self-scattering ties the shape of the stellar distribution to that of the dark matter distribution. In particular, we find that the stellar core radius is closely related to the dark matter core radius generated by self-interactions. Dark matter collisions lead to dwarf galaxies with larger stellar cores and smaller stellar central densities compared to the cold dark matter case. The central metallicity within 1 kpc is also larger by up to ~15 per cent in the former case. We conclude that the mass distribution and characteristics of the central stars in dwarf galaxies can potentially be used to probe the self-interacting nature of dark matter. © 2014 The Authors.
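The relevance of the inner sub-kpc region can be seen from the standard order-of-magnitude estimate of the self-scattering rate per particle, Γ ~ ρ (σ/m) v: only where the density is high does the rate become appreciable over a Hubble time. The sketch below is a back-of-the-envelope illustration of that estimate, not the paper's code; the density, velocity, and cross-section values are illustrative, not taken from the simulations.

```python
# Rough estimate of the dark matter self-scattering rate per particle,
# Gamma ~ rho * (sigma/m) * v, for a constant cross-section per unit mass.

def scattering_rate(rho_msun_kpc3, sigma_over_m_cm2_g, v_km_s):
    """Scattering rate per particle in 1/Gyr.

    rho_msun_kpc3       : local dark matter density in Msun / kpc^3
    sigma_over_m_cm2_g  : cross-section per unit mass in cm^2 / g
    v_km_s              : typical relative velocity in km / s
    """
    MSUN_G = 1.989e33   # grams per solar mass
    KPC_CM = 3.086e21   # centimetres per kpc
    GYR_S = 3.156e16    # seconds per Gyr
    rho_cgs = rho_msun_kpc3 * MSUN_G / KPC_CM**3   # g / cm^3
    v_cgs = v_km_s * 1e5                           # cm / s
    return rho_cgs * sigma_over_m_cm2_g * v_cgs * GYR_S

# Illustrative dense dwarf-galaxy centre: rho ~ 1e8 Msun/kpc^3, v ~ 30 km/s,
# and a large cross-section sigma/m = 10 cm^2/g: a few scatterings per Gyr.
rate = scattering_rate(1e8, 10.0, 30.0)
```

At these (assumed) central values the rate is of order a few per Gyr, while in the low-density outskirts it drops far below one per Hubble time, which is why self-interactions reshape only the inner halo.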


Schefzik R., Heidelberg Institute for Theoretical Studies
Monthly Weather Review | Year: 2016

Contemporary weather forecasts are typically based on ensemble prediction systems, which consist of multiple runs of numerical weather prediction models that vary with respect to the initial conditions and/or the parameterization of the atmosphere. Ensemble forecasts are frequently biased and show dispersion errors and thus need to be statistically postprocessed. However, current postprocessing approaches are often univariate and apply to a single weather quantity at a single location and for a single prediction horizon only, thereby failing to account for potentially crucial dependence structures. Nonparametric multivariate postprocessing methods based on empirical copulas, such as ensemble copula coupling or the Schaake shuffle, can address this shortcoming. A specific implementation of the Schaake shuffle, called the SimSchaake approach, is introduced. The SimSchaake method aggregates univariately postprocessed ensemble forecasts using dependence patterns from past observations. Specifically, the observations are taken from historical dates at which the ensemble forecasts resembled the current ensemble prediction with respect to a specific similarity criterion. The SimSchaake ensemble outperforms all reference ensembles in an application to ensemble forecasts for 2-m temperature from the European Centre for Medium-Range Weather Forecasts. © 2016 American Meteorological Society.
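The core reordering step of the Schaake shuffle is simple to state: sort the univariately postprocessed samples in each margin, then rearrange them so that their margin-wise ranks match those of a set of historical observation vectors. The sketch below illustrates that step under assumed notation (it is not the paper's code, and it omits the SimSchaake similarity-based selection of historical dates):

```python
import numpy as np

def schaake_shuffle(samples, historical):
    """Impose the rank structure of historical observations on postprocessed samples.

    samples    : (m, d) array, m univariately postprocessed samples for d margins
                 (e.g. several stations or lead times), generated independently
    historical : (m, d) array, m historical observation vectors for the same margins
    Returns an (m, d) array whose margin-wise ranks equal those of `historical`,
    so the output inherits the observed dependence structure.
    """
    ordered = np.sort(samples, axis=0)                           # order statistics per margin
    ranks = np.argsort(np.argsort(historical, axis=0), axis=0)   # ranks 0..m-1 per margin
    out = np.empty_like(ordered)
    for j in range(ordered.shape[1]):
        out[:, j] = ordered[ranks[:, j], j]                      # i-th output gets the
    return out                                                   # rank-matching order statistic
```

The margins of the output are exactly the sorted postprocessed samples, so the univariate calibration is untouched; only the joint dependence is changed.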


Ovcharov E.Y., Heidelberg Institute for Theoretical Studies
Journal of Machine Learning Research | Year: 2015

To discuss the existence and uniqueness of proper scoring rules one needs to extend the associated entropy functions as sublinear functions to the conic hull of the prediction set. In some natural function spaces, such as the Lebesgue Lp-spaces over Rd, the positive cones have empty interior. Entropy functions defined on such cones have directional derivatives only, which typically exist on large subspaces and behave similarly to gradients. Certain entropies may be further extended continuously to open cones in normed spaces containing signed densities. The extended entropies are Gâteaux differentiable except on a negligible set and have everywhere continuous subgradients due to the supporting hyperplane theorem. We introduce the necessary framework from analysis and algebra that allows us to give an affirmative answer to the titular question of the paper. As a result of this, we give a formal sense in which entropy functions have uniquely associated proper scoring rules. We illustrate our framework by studying the derivatives and subgradients of the following three prototypical entropies: Shannon entropy, Hyvärinen entropy, and quadratic entropy. © 2015 Evgeni Y. Ovcharov.
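The entropy–scoring-rule correspondence underlying the paper can be summarised by the standard subgradient representation, stated here in the positively oriented convention (sign conventions vary across the literature):

```latex
% A proper scoring rule S and its (convex) entropy G satisfy
\mathbb{E}_{\omega\sim p}\,S(q,\omega) \;\le\; \mathbb{E}_{\omega\sim p}\,S(p,\omega) \;=\; G(p),
% and where G is differentiable, S is recovered from G via
S(p,\omega) \;=\; G(p) + \big\langle G'(p),\, \delta_\omega - p \big\rangle .
```

For instance, the convex entropy G(p) = ∫ p log p recovers the logarithmic score S(p, ω) = log p(ω), and the quadratic entropy G(p) = ∫ p² recovers the quadratic score S(p, ω) = 2p(ω) − ∫ p²; the paper's question is when and in what sense this association is well defined and unique on cones with empty interior.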


Schefzik R., Heidelberg Institute for Theoretical Studies
Quarterly Journal of the Royal Meteorological Society | Year: 2016

State-of-the-art weather forecasts usually rely on ensemble prediction systems, accounting for the different sources of uncertainty. As ensembles are typically uncalibrated, they need to be statistically postprocessed. Several multivariate ensemble postprocessing techniques that additionally consider spatial, inter-variable and/or temporal dependences have been developed. These can be roughly divided into two groups. The first group comprises parametric, mainly low-dimensional, approaches that are tailored to specific settings. The second group involves non-parametric reordering methods that impose a specific dependence template on univariately postprocessed forecasts and are suitable in any dimension. In this article, these different strategies are combined, with the aim of exploiting the benefits of both concepts. Specifically, a high-dimensional postprocessing problem is divided into multiple low-dimensional instances, each of which is postprocessed via a suitable multivariate parametric method. From each postprocessed low-dimensional distribution, a sample is drawn, which is then reordered according to the corresponding multidimensional rank structure of an appropriately chosen dependence template. In this context, different ranking concepts for multivariate settings are discussed. Finally, all reordered samples are aggregated to obtain the overall postprocessed ensemble. The new approach is applied to ensemble forecasts for temperature and wind speed at several locations from the European Centre for Medium-Range Weather Forecasts, using a recent bivariate ensemble model output statistics postprocessing technique and a reordering based on the raw ensemble forecasts similar to the ensemble copula coupling method. It shows good predictive skill and outperforms reference ensembles. © 2016 Royal Meteorological Society.
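The divide–reorder–aggregate pipeline can be sketched in a few lines, under assumed notation (this is not the paper's code; the parametric postprocessing of each low-dimensional block is abstracted away as already-drawn samples, and the dependence template is taken to be the raw ensemble, as in ensemble copula coupling):

```python
import numpy as np

def reorder_to_template(samples, template):
    """Impose the margin-wise rank structure of `template` on `samples`.

    Both arguments are (m, d) arrays: m drawn samples / template members
    for a low-dimensional block of d margins.
    """
    ranks = np.argsort(np.argsort(template, axis=0), axis=0)   # ranks 0..m-1
    return np.take_along_axis(np.sort(samples, axis=0), ranks, axis=0)

def aggregate_blocks(block_samples, block_templates):
    """Reorder each low-dimensional block, then concatenate the blocks
    column-wise into the overall postprocessed ensemble."""
    return np.hstack([reorder_to_template(s, t)
                      for s, t in zip(block_samples, block_templates)])
```

Each block keeps the margins produced by its parametric postprocessing, while the template supplies the dependence both within and, via its own cross-block structure, across blocks.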


Stamatakis A., Heidelberg Institute for Theoretical Studies | Stamatakis A., Karlsruhe Institute of Technology
Bioinformatics | Year: 2014

Motivation: Phylogenies are increasingly used in all fields of medical and biological research. Moreover, because of the next-generation sequencing revolution, datasets used for conducting phylogenetic analyses grow at an unprecedented pace. RAxML (Randomized Axelerated Maximum Likelihood) is a popular program for phylogenetic analyses of large datasets under maximum likelihood. Since the last RAxML paper in 2006, it has been continuously maintained and extended to accommodate ever-growing input datasets and to serve the needs of the user community. Results: I present some of the most notable new features and extensions of RAxML, such as a substantial extension of substitution models and supported data types, the introduction of SSE3, AVX and AVX2 vector intrinsics, techniques for reducing the memory requirements of the code, and a plethora of operations for conducting post-analyses on sets of trees. In addition, an up-to-date 50-page user manual covering all new RAxML options is available. © The Author 2013. Published by Oxford University Press.
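The computational kernel that programs like RAxML evaluate millions of times, and that the vector intrinsics mentioned above accelerate, is the per-site likelihood of a tree, computed bottom-up by Felsenstein's pruning algorithm. The toy sketch below illustrates that recursion for a single alignment site under the simple Jukes-Cantor model (it is not RAxML's code, and the tree and branch lengths are made up for illustration):

```python
import math

BASES = "ACGT"

def jc_prob(i, j, t):
    """Jukes-Cantor transition probability for branch length t (substitutions/site)."""
    e = math.exp(-4.0 * t / 3.0)
    return 0.25 + 0.75 * e if i == j else 0.25 - 0.25 * e

def conditional(node):
    """Conditional likelihood 4-vector of a node, computed bottom-up.

    A node is either a leaf base, e.g. "A", or a tuple
    (left_child, left_branch_length, right_child, right_branch_length).
    """
    if isinstance(node, str):
        return [1.0 if b == node else 0.0 for b in BASES]
    left, tl, right, tr = node
    cl, cr = conditional(left), conditional(right)
    # L[x] = (sum_y P(x->y, tl) * L_left[y]) * (sum_y P(x->y, tr) * L_right[y])
    return [sum(jc_prob(x, y, tl) * cl[k] for k, y in enumerate(BASES)) *
            sum(jc_prob(x, y, tr) * cr[k] for k, y in enumerate(BASES))
            for x in BASES]

def site_log_likelihood(tree):
    """Log-likelihood of one site: average root conditionals over uniform base frequencies."""
    return math.log(sum(0.25 * v for v in conditional(tree)))

# One site observed as A, C, A at the tips of the rooted tree ((A:0.1, C:0.1):0.05, A:0.2)
tree = (("A", 0.1, "C", 0.1), 0.05, "A", 0.2)
ll = site_log_likelihood(tree)
```

A full analysis repeats this over every site and every candidate tree topology while optimizing branch lengths and model parameters, which is why the vectorized, memory-saving implementations described in the paper matter at scale.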
