The Helsinki Institute of Physics is a physics research institute operated by the University of Helsinki, Aalto University and the University of Jyväskylä. The institute is responsible for the Finnish research collaboration with CERN. Research is currently focused on the following programmes: Theory Programme (biophysics, cosmology, particle physics phenomenology, string theory and quantum field theory, ultrarelativistic heavy-ion collisions); High Energy Physics Programme (electron-positron physics, COMPASS, Detector Laboratory); CMS Programme (CMS software and physics, CMS tracker); Technology Programme (DataGrid); Nuclear Matter Programme (ALICE, ISOLDE).
Osterberg K., Helsinki Institute of Physics
International Journal of Modern Physics A | Year: 2014
At CERN's Large Hadron Collider, the detection of both leading protons with the special β* = 90 m optics opens up a unique possibility to study central exclusive processes, such as the production of low-mass resonances, charmonium states and jets, as well as to search for missing-mass or missing-momentum signatures. At √s = 13 TeV with this optics, the leading-proton acceptance of the TOTEM Roman Pots covers all diffractive masses, provided that the proton four-momentum transfer satisfies |t| ≥ 0.04 GeV². This paper describes the physics potential of common CMS and TOTEM data taking at β* = 90 m optics with integrated luminosities of 10 pb⁻¹ and 100 pb⁻¹. © World Scientific Publishing Company.
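In central exclusive production, the mass of the centrally produced system follows from the fractional momentum losses ξ of the two leading protons measured in the Roman Pots, via the standard relation M_X = √(ξ₁ ξ₂ s). A minimal sketch of this kinematic relation (the ξ values below are illustrative, not taken from the paper):

```python
import math

def central_mass(xi1: float, xi2: float, sqrt_s_gev: float) -> float:
    """Central diffractive mass M_X = sqrt(xi1 * xi2 * s), in GeV,
    from the fractional momentum losses xi of the two leading protons."""
    return math.sqrt(xi1 * xi2) * sqrt_s_gev

# Illustrative values: xi ~ 1e-2 for each proton at sqrt(s) = 13 TeV
m_x = central_mass(0.01, 0.01, 13000.0)
print(f"M_X = {m_x:.0f} GeV")  # -> M_X = 130 GeV
```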
Renk T., University of Jyvaskyla | Helsinki Institute of Physics
Physical Review C - Nuclear Physics | Year: 2012
Hard probes in the context of ultrarelativistic heavy-ion collisions represent a key class of observables studied to gain information about the QCD medium created in such collisions. In practice, however, so-called jet tomography has turned out to be more difficult than initially expected. One of the major obstacles in extracting reliable tomographic information from the data is that neither the parton-medium interaction nor the medium geometry is known with great precision, so a difference in model assumptions in the hard perturbative Quantum Chromodynamics (pQCD) modeling can usually be compensated by a corresponding change of assumptions in the soft bulk medium sector, and vice versa. The only way to overcome this problem is to study the full systematics of combinations of parton-medium interaction and bulk medium evolution models. This work presents a meta-analysis summarizing results from a number of such systematic studies and discusses in detail how certain data sets provide specific constraints on models. Combining all available information, only a small group of models exhibiting certain characteristic features consistent with a pQCD picture of parton-medium interaction is found to be viable given the data. In this picture, the dominant mechanism is medium-induced radiation combined with a surprisingly small component of elastic energy transfer into the medium. © 2012 American Physical Society.
Rasanen S., Helsinki Institute of Physics
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2012
We consider universes that are close to Friedmann-Robertson-Walker in the sense that metric perturbations, their time derivatives and their first spatial derivatives are small, while second spatial derivatives are not constrained. We show that if we, in addition, assume that the observer four-velocity is close to its background value and close to the four-velocity that defines the hypersurface of averaging, then the redshift and the average expansion rate remain close to the Friedmann-Robertson-Walker case. However, this is not true for the angular diameter distance. The four-velocity assumption implies certain conditions on second derivatives of the metric and/or the matter content. © 2012 American Physical Society.
Hotchkiss S., Helsinki Institute of Physics
Journal of Cosmology and Astroparticle Physics | Year: 2011
I show that the most common method of quantifying the likelihood that an extreme galaxy cluster could exist is biased and can result in false claims of tension with ΛCDM. This common method uses the probability that at least one cluster could exist above the mass and redshift of an observed cluster. I demonstrate the existence of the bias using sample cluster populations, describe its origin and explain how to remove it. I then suggest potentially more suitable, unbiased measures of the rareness of individual clusters; each measure is most sensitive to different possible types of new physics. I show how to generalise these measures to quantify the total 'rareness' of a set of clusters. It is seen that, when mass uncertainties are marginalised over, there is no tension between the standard ΛCDM cosmological model and the existence of any observed set of clusters. As a case study, I apply these rareness measures to sample cluster populations generated using primordial density perturbations with a non-Gaussian spectrum. © 2011 IOP Publishing Ltd and SISSA.
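The bias can be demonstrated with a toy Monte Carlo in the spirit of the abstract's sample populations: draw simulated cluster catalogues, compute for each cluster the "at least one cluster above this mass and redshift" probability, report the smallest value per catalogue, and check how often it falls below 0.05. An unbiased measure would give 5%; searching over the catalogue for the most extreme cluster inflates the fraction. A sketch under assumed toy distributions (unit-exponential "mass" and "redshift", Poisson catalogue size; none of these choices are from the paper):

```python
import math
import random

random.seed(1)
LAM = 50.0  # mean number of clusters per simulated catalogue (toy choice)

def poisson(lam: float) -> int:
    """Knuth's algorithm for a Poisson-distributed catalogue size."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < limit:
            return k
        k += 1

def corner_pvalue(m: float, z: float) -> float:
    """P(at least one cluster above both m and z), for the toy model of
    independent unit-exponential 'mass' m and 'redshift' z."""
    expected_above = LAM * math.exp(-m) * math.exp(-z)
    return 1.0 - math.exp(-expected_above)

def fraction_below(alpha: float = 0.05, n_trials: int = 5000) -> float:
    """Fraction of catalogues whose most 'extreme' cluster (smallest
    corner p-value) falls below alpha; an unbiased measure gives ~alpha."""
    hits = 0
    for _ in range(n_trials):
        n = poisson(LAM)
        ps = [corner_pvalue(random.expovariate(1.0), random.expovariate(1.0))
              for _ in range(n)]
        if ps and min(ps) < alpha:
            hits += 1
    return hits / n_trials

print(fraction_below())  # well above 0.05: the naive measure is biased
```

The inflation is a look-elsewhere effect: the observed cluster was selected as the most anomalous in the catalogue, but the corner probability treats its mass and redshift thresholds as if fixed in advance.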
Lebedev O., Helsinki Institute of Physics | Mambrini Y., University Paris-Sud
Physics Letters, Section B: Nuclear, Elementary Particle and High-Energy Physics | Year: 2014
We consider the possibility that fermionic dark matter (DM) interacts with the Standard Model fermions through an axial Z' boson. As long as the Z' decays predominantly into dark matter, the relevant LHC bounds are rather loose. Direct dark matter detection does not significantly constrain this scenario either, since dark matter scattering on nuclei is spin-dependent. As a result, for a range of Z' masses and couplings, the DM annihilation cross section is large enough to be consistent with the thermal history of the Universe. In this framework, the thermal WIMP paradigm, which currently finds itself under pressure, is perfectly viable. © 2014 The Authors.
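The "thermal WIMP paradigm" invoked here rests on the textbook freeze-out estimate Ω_DM h² ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩: an annihilation cross section near the canonical ⟨σv⟩ ≈ 3×10⁻²⁶ cm³ s⁻¹ reproduces the observed abundance. A back-of-the-envelope sketch (the rule of thumb is standard cosmology, not a result of this paper):

```python
OBSERVED_OMEGA_H2 = 0.12  # approximate observed dark matter density

def relic_density(sigma_v: float) -> float:
    """Freeze-out rule of thumb: Omega h^2 ~ 3e-27 cm^3/s divided by
    the thermally averaged annihilation cross section <sigma v>."""
    return 3e-27 / sigma_v

# The canonical thermal cross section, <sigma v> ~ 3e-26 cm^3/s:
print(relic_density(3e-26))  # roughly 0.1, close to the observed 0.12
```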