Catania, Italy

The University of Catania is a university in Catania, Italy, founded in 1434. It is the oldest university in Sicily, the 13th oldest in Italy and the 29th oldest in the world. With over 60,000 students, it is the main university in Sicily. (Source: Wikipedia)



News Article | May 4, 2017
Site: physicsworld.com

Taken from the May 2017 issue of Physics World.

Powerful computers are now allowing cosmologists to solve Einstein’s frighteningly complex equations of general relativity in a cosmological setting for the first time. Tom Giblin, James Mertens and Glenn Starkman describe how this new era of simulations could transform our understanding of the universe.

From the Genesis story in the Old Testament to the Greek tale of Gaia (Mother Earth) emerging from chaos and giving birth to Uranus (the god of the sky), people have always wondered about the universe and woven creation myths to explain why it looks the way it does. One hundred years ago, however, Albert Einstein gave us a different way to ask that question. Newton’s law of universal gravitation, which was until then our best theory of gravity, describes how objects in the universe interact. But in Einstein’s general theory of relativity, space–time (the marriage of space and time) itself evolves together with its contents. And so cosmology, which studies the universe and its evolution, became at least in principle a modern science – amenable to precise description by mathematical equations, able to make firm predictions, and open to observational tests that could falsify those predictions.

Our understanding of the mathematics of the universe has advanced alongside observations of ever-increasing precision, leading us to an astonishing contemporary picture. We live in an expanding universe in which the ordinary material of our everyday lives – protons, neutrons and electrons – makes up only about 5% of the contents of the universe. Roughly 25% is in the form of “dark matter” – material that behaves like ordinary matter as far as gravity is concerned, but is so far invisible except through its gravitational pull. The other 70% of the universe is something completely different, whose gravity pushes things apart rather than pulling them together, causing the expansion of the universe to accelerate over the last few billion years. Naming this unknown substance “dark energy” teaches us nothing about its true nature.

Now, a century into its work, cosmology is brimming with existential questions. If there is dark matter, what is it and how can we find it? Is dark energy the energy of empty space, also known as vacuum energy, or is it the cosmological constant, Λ, as first suggested by Einstein in 1917? He introduced the constant after mistakenly thinking it would stop the universe from expanding or contracting, and so – in what he later called his “greatest blunder” – failed to predict the expansion of the universe, which was discovered a dozen years later. Or is one or both of these invisible substances a figment of the cosmologist’s imagination, so that it is general relativity itself that must be changed?

At the same time as facing these fundamental questions, cosmologists are testing their currently accepted model of the universe – dubbed ΛCDM – to greater and greater precision observationally. (The CDM indicates that the dark-matter particles are cold: they must move slowly, like the molecules in a cold drink, so as not to evaporate from the galaxies they help bind together.) And yet, while we can use general relativity to describe how the universe expanded throughout its history, we are only just starting to use the full theory to model the specific details and observations of how galaxies, clusters of galaxies and superclusters are formed. The reason is simple – the equations of general relativity aren’t.
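As a rough illustration of the smooth ΛCDM expansion history described above, the short sketch below (not taken from the article, and using illustrative density fractions of roughly 30% matter and 70% dark energy based on the split quoted in the text) integrates nothing more than the homogeneous Friedmann equation; the sign of the deceleration parameter q shows dark energy taking over and the expansion beginning to accelerate. Everything that follows in the article is about going beyond this smooth background.

    # A minimal sketch, not the authors' code: the smooth, homogeneous LambdaCDM
    # background described above, with illustrative density fractions
    # (matter ~30%, dark energy ~70%) based on the rough split quoted in the text.
    import numpy as np

    Omega_m, Omega_L = 0.3, 0.7        # assumed flat universe: Omega_m + Omega_L = 1

    def E(a):
        """Dimensionless expansion rate H(a)/H0 from the Friedmann equation."""
        return np.sqrt(Omega_m / a**3 + Omega_L)

    def deceleration(a):
        """Deceleration parameter q(a); q < 0 means the expansion accelerates."""
        Om_a = Omega_m / a**3 / E(a)**2    # matter fraction at scale factor a
        OL_a = Omega_L / E(a)**2           # dark-energy fraction at scale factor a
        return 0.5 * Om_a - OL_a

    for a in (0.1, 0.3, 0.5, 0.7, 1.0):    # a = 1 corresponds to today
        print(f"a = {a:.1f}   H/H0 = {E(a):6.2f}   q = {deceleration(a):+.2f}")
    # q changes sign near a ~ 0.6: dark energy comes to dominate and the
    # expansion starts to accelerate, as described in the text.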
While they fit neatly onto a T-shirt or a coffee mug, Einstein’s field equations are horrible to solve even using a computer. The equations involve 10 separate functions of the four dimensions of space and time, which characterize the curvature of space–time in each location, along with 40 functions describing how those 10 functions change, as well as 100 further functions describing how those 40 changes change, all multiplied and added together in complicated ways. Exact solutions exist only in highly simplified approximations to the real universe.

So for decades cosmologists have used those idealized solutions and taken the departures from them to be small perturbations – reckoning, in particular, that any departures from homogeneity can be treated independently from the homogeneous part and from one another. This “first-order perturbation theory” has taught us a lot about the early development of cosmic structures – galaxies, clusters of galaxies and superclusters – from barely perceptible concentrations of matter and dark matter in the early universe. The theory also has the advantage that we can do much of the analysis by hand, and follow the rest on computer. But to track the development of galaxies and other structures from after they were formed to the present day, we’ve mostly reverted to Newton’s theory of gravity, which is probably a good approximation.

To make progress, we will need to improve on first-order perturbation theory, which treats cosmic structures as independent entities that are affected by the average expansion of the universe, but neither alter the average expansion themselves, nor influence one another. Unfortunately, higher-order perturbation theory is much more complicated – everything affects everything else. Indeed, it’s not clear there is anything to gain from using these higher-order approximations rather than “just solving” the full equations of general relativity instead.

Improving the precision of our calculations – how well we think we know the answer – is one thing. But the complexity of Einstein’s equations has made us wonder just how accurate the perturbative description really is. In other words, it might give us answers, but are they the right ones? Nonlinear equations, after all, can have surprising features that appear unexpectedly when you solve them in their full glory, and it is hard to predict surprises. Some leading cosmologists, for example, claim that the accelerating expansion of the universe, which dark energy was invented to explain, is caused instead by the collective effects of cosmic structures in the universe acting through the magic of general relativity. Other cosmologists argue this is nonsense. The only way to be sure is to use the full equations of general relativity.

And the good news is that computers are finally becoming fast enough that modelling the universe using the full power of general relativity – without the traditional approximations – is not such a crazy prospect. With some hard work, it may finally be feasible over the next decade.

Numerical general relativity itself is not new. As far back as the late 1950s, Richard Arnowitt, Stanley Deser and Charles Misner – together known as ADM – laid out a basic framework in which space–time could be carefully separated into space and time – a vital first step in solving general relativity with a computer.
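For reference, the compact "T-shirt" form of the field equations mentioned above, and the ADM split of space–time into space and time that makes them solvable on a computer, can be written in standard textbook notation (these expressions are not quoted from the article):

    % Einstein's field equations: 10 coupled, nonlinear equations for the
    % 10 independent components of the metric g_{\mu\nu}.
    G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

    % The ADM (3+1) line element, the starting point for numerical relativity:
    % N is the lapse, N^{i} the shift, and \gamma_{ij} the metric of the spatial
    % slices that are evolved forward in time step by step.
    ds^{2} = -N^{2}\,dt^{2} + \gamma_{ij}\left(dx^{i} + N^{i}\,dt\right)\left(dx^{j} + N^{j}\,dt\right)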
Other researchers also got in on the act, including Thomas Baumgarte, Stuart Shapiro, Masaru Shibata and Takashi Nakamura, who made important improvements to the numerical properties of the ADM system in the 1980s and 1990s so that the dynamics of systems could be followed accurately over long enough times to be interesting. Other techniques for obtaining such long-time stability were also developed, including one imported from fluid mechanics. Known as adaptive mesh refinement, it allowed scarce computer memory resources to be focused only on those parts of problems where they were needed most.

Such advances have allowed numerical relativists to simulate with great precision what happens when two black holes merge and create gravitational waves – ripples in space–time. The resulting images are more than eye candy; they were essential in allowing members of the US-based Laser Interferometer Gravitational-Wave Observatory (LIGO) collaboration to announce last year that they had directly detected gravitational waves for the first time. By modelling many different possible configurations of pairs of black holes – different masses, different spins and different orbits – LIGO’s numerical relativists produced a template of the gravitational-wave signal that would result in each case. Other researchers then compared those simulations over and over again to what the experiment had been measuring, until the moment came when a signal was found that matched one of the templates. The signal in question was coming to us from a pair of black holes a billion light-years away spiralling into one another and merging to form a single larger black hole.

Using numerical relativity to model cosmology has its own challenges compared to simulating black-hole mergers, which are just single astrophysical events. Some qualitative cosmological questions can be answered by reasonably small-scale simulations, and there are state-of-the-art “N-body” simulations that use Newtonian gravity to follow trillions of independent masses over billions of years to see where gravity takes them. But general relativity offers at least one big advantage over Newtonian gravity – it is local. The difficulty with calculating the gravity experienced by any particular mass in a Newtonian simulation is that you need to add up the effects of all the other masses. Even Isaac Newton himself regarded this “action at a distance” as a failing of his model, since it means that information travels from one side of the simulated universe to the other instantly, violating the speed-of-light limit. In general relativity, however, all the equations are “local”, which means that to determine the gravity at any time or location you only need to know what the gravity and matter distribution were nearby just moments before. This should, in other words, simplify the numerical calculations.

Recently, the three of us at Kenyon College and Case Western Reserve University showed that the cosmological problem is finally becoming tractable (Phys. Rev. Lett. 116 251301 and Phys. Rev. D 93 124059). Just days after our paper appeared, Eloisa Bentivegna at the University of Catania in Italy and Marco Bruni at the University of Portsmouth, UK, had similar success (Phys. Rev. Lett. 116 251302). The two groups each presented the results of low-resolution simulations, where grid points are separated by 40 million light-years, with only long-wavelength perturbations.
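Picking up the "action at a distance" point made earlier in this passage, here is a toy sketch (emphatically not one of the production N-body codes referred to above, with all values purely illustrative) of the direct-summation force step in a Newtonian simulation: every particle's acceleration requires a sum over every other particle in the box, which is exactly the non-local bookkeeping that a local, general-relativistic update avoids.

    # Toy sketch of a Newtonian direct-summation force calculation; not a
    # production N-body code. All numbers are illustrative placeholders.
    import numpy as np

    G = 6.674e-11                                  # Newton's constant (SI units)
    rng = np.random.default_rng(0)
    N = 256                                        # a tiny particle count for illustration
    pos = rng.uniform(-1e22, 1e22, size=(N, 3))    # positions in metres
    mass = np.full(N, 1e40)                        # equal masses in kilograms
    soft = 1e20                                    # softening length to avoid singular forces

    def accelerations(pos, mass):
        """O(N^2) sum: each particle feels every other particle, instantaneously."""
        acc = np.zeros_like(pos)
        for i in range(len(pos)):
            d = pos - pos[i]                       # separation vectors to all particles
            r2 = (d**2).sum(axis=1) + soft**2      # softened squared distances
            r2[i] = np.inf                         # exclude the self-interaction
            acc[i] = (G * mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
        return acc

    acc = accelerations(pos, mass)                 # one force evaluation; a leapfrog
                                                   # integrator would then update
                                                   # velocities and positions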
The simulations followed the universe for only a short time by cosmic standards – long enough only for the universe to somewhat more than double in size – but both tracked the evolution of these perturbations in full general relativity with no simplifications or approximations whatsoever. As the eminent Italian cosmologist Sabino Matarrese wrote in Nature Physics, “the era of general relativistic numerical simulations in cosmology ha[s] begun”.

These preliminary studies are still a long way from competing with modern N-body simulations for resolution, duration or dynamic range. To do so will require advances in the software so that the code can run on much larger computer clusters. We will also need to make the code more stable numerically so that it can model much longer periods of cosmic expansion. The long-term goal is for our numerical simulations to match as far as possible the actual evolution of the universe and its contents, which means using the full theory of general relativity. But given that our existing simulations using full general relativity have revealed no fluctuations driving the accelerated expansion of the universe, it appears instead that accelerated expansion will need new physics – whether dark energy or a modified gravitational theory. Both groups also observe what appear to be small corrections to the dynamics of space–time when compared with simple perturbation theory. Bentivegna and Bruni studied the collapse of structures in the early universe and suggested that they appear to coalesce somewhat more quickly than in the standard simplified theory.

Drawing specific conclusions about simulations is a subtle matter in general relativity. At the mathematical heart of the theory is the principle of “co-ordinate invariance”, which essentially says that the laws of physics should be the same no matter what set of labels you use for the locations and times of events. We are all familiar with milder versions of this symmetry: we wouldn’t expect the equations governing basic scientific laws to depend on whether we measure our positions in, say, New York or London, and we don’t need new versions of science textbooks whenever we switch from standard time to daylight savings time and back. Co-ordinate invariance in the context of general relativity is just a more extreme version of that, but it means we must ensure that any information we extract from our simulations does not depend on how we label the points in our simulations.

Our Ohio group has taken particular care with this subtlety by sending simulated beams of light, travelling at the speed of light through space–time from distant points in the distant past, to arrive at the here and now. We then use those beams to simulate observations of the expansion history of our universe. The universe that emerges exhibits an average behaviour that agrees with a corresponding smooth, homogeneous model, but with inhomogeneous structures on top. These additional structures contribute to deviations in observable quantities across the simulated observer’s sky that should soon be accessible to real observers.

This work is therefore just the start of a journey. Creating codes that are accurate and sensitive enough to make realistic predictions for future observational programmes – such as the all-sky surveys to be carried out by the Large Synoptic Survey Telescope or the Euclid satellite – will require us to study larger volumes of space.
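The coordinate-invariant observables mentioned above can be made concrete with a standard textbook expression (not one quoted from the papers): the redshift z measured along a simulated light beam is built from the photon 4-momentum and the 4-velocities of emitter and observer, and does not depend on how the grid points are labelled.

    % Redshift as a coordinate-invariant observable along a light beam:
    % k^{\mu} is the photon 4-momentum, u^{\mu} the 4-velocity of emitter or observer.
    1 + z = \frac{\left(g_{\mu\nu} k^{\mu} u^{\nu}\right)_{\mathrm{emitter}}}
                 {\left(g_{\mu\nu} k^{\mu} u^{\nu}\right)_{\mathrm{observer}}}

    % In an exactly homogeneous universe this reduces to the familiar
    % 1 + z = a(t_{\mathrm{obs}}) / a(t_{\mathrm{emit}}); the simulations measure
    % how inhomogeneous structures perturb such observables across the sky.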
These larger studies will also have to incorporate ultra-large-scale structures some hundreds of millions of light-years across, as well as much smaller-scale structures such as galaxies and clusters of galaxies. They will also have to follow these volumes for longer stretches of time than is currently possible. All this will require us to introduce some of the same refinements that made it possible to predict the gravitational-wave ripples produced by merging black holes, such as adaptive mesh refinement to resolve the smaller structures like galaxies, and N-body simulations to allow matter to flow naturally across these structures. These refinements will let us characterize more precisely and more accurately the statistical properties of galaxies and clusters of galaxies – as well as the observations we make of them – taking general relativity fully into account. Doing so will, however, require clusters of computers with millions of cores, rather than the hundreds we use now.

These improvements to code will take time, effort and collaboration. Groups around the world – in addition to the two mentioned – are likely to make important contributions. Numerical general-relativistic cosmology is still in its infancy, but the next decade will see huge strides to make the best use of the new generation of cosmological surveys that are being designed and built today. This work will either give us increased confidence in our own scientific genesis story – ΛCDM – or teach us that we still have a lot more thinking to do about how the universe got itself to where it is today.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: INFRAIA-1-2014-2015 | Award Amount: 9.98M | Year: 2015

HYDRALAB is an advanced network of environmental hydraulic institutes in Europe, which has been effective in providing the whole European scientific community with access to a suite of major and unique environmental hydraulic facilities. A continuation project will prepare environmental hydraulic modelling for the upcoming urgent technical challenges associated with adaptations for climate change. A multi-disciplinary approach is essential to meet these challenges. We denote the project HYDRALAB+, in recognition of the added value that will follow from our network changing to enhance the collaboration between specialists and engaging with a new range of stakeholders. The issues associated with climate change impacts on rivers and coasts are significant enough to ask the scientific community to which we open up our facilities to focus their research efforts on adaptations for climate change. We plan to issue themed calls for proposals for access to the facilities, with scientific merit as the main selection criterion, but with preference given to proposals that also address issues of adaptation to climate change impacts. In HYDRALAB+, with the prospect of climate change, we will build networking activities that also involve the wider hydraulic community in the process of generating the deliverables of the project. The first Workshop in the project will be devoted to working together with the larger European hydraulics community not directly involved in HYDRALAB. Increased emphasis will be placed by HYDRALAB+ on engagement with industry, a theme that will be delivered initially through the vehicle of a focussed Workshop between HYDRALAB researchers and industry. We will work together with industry to make HYDRALAB+ part of the innovation cycle by bringing developments to market (this is particularly relevant for the instruments we develop) and to involve industry in our range of project deliverables.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2014-ETN | Award Amount: 3.92M | Year: 2015

The proposed ITN, entitled “Modelling and computation of Shocks and Interfaces”, will focus on the training of young researchers in the general area of nonlinear hyperbolic and convection-dominated PDEs, with emphasis on innovative modelling and computational methods. The research program of the proposed ITN is centered on an important field, in terms of both history and scope, that is placed at the forefront of modern Computational and Applied Mathematics. Hyperbolic, convection-dominated PDEs are probably one of the very few areas within Computational and Applied Mathematics where modelling, physics, mechanics, analytical approaches and advanced computational methods have traditionally contributed in synergy to the achievements to date, which makes this field eminently suitable for training young researchers. These researchers can become research leaders in a wide area, as well as having an impact on both industry and non-academic scientific institutions. The network will consist of some of Europe's leading research groups on hyperbolic PDEs and includes experts on modelling, analysis and computation. A well-defined training program is outlined in the proposal. The training program emphasises the European and international dimension of the effort. The training design is expected to produce effective results, foster expertise on how to structure doctoral training at the European level, and enhance the innovation capacity of the individuals involved. The innovative techniques developed will be applied to diverse concrete problems ranging from fluid dynamics and geophysical flows to materials science. In the pursuance of this goal, the research groups will be assisted by experts in these areas of application and by non-academic partners, resulting in a significant enhancement of the impact of the research and training program.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-RISE | Phase: MSCA-RISE-2015 | Award Amount: 1.19M | Year: 2016

The project For a Better Tomorrow: Social Enterprises on the Move (FAB-MOVE) brings together researchers and practitioners in order to explore the question of how social enterprises can grow and flourish. These objectives will be achieved through a carefully crafted network of academic and non-academic partner organisations co-operating worldwide. Managers and practitioners of social enterprises often lack easy access to the frontiers of science. FAB-MOVE will significantly improve the transfer of knowledge between academics and non-academics and thus increase the practical applicability of research findings. For an enduring, sustainable impact, FAB-MOVE will develop a teaching tool to educate (future) managers of social enterprises on how to set up their enterprise in a specific environment, how to combine business with a social goal, and how to develop strategies for growth and scaling up. Currently there is a lack of knowledge about the influence of different social and economic environments on social enterprises. Local eco-systems and traditions have a decisive impact on the well-being, growth and potential for scaling up of social enterprises. FAB-MOVE focuses on the embeddedness of social enterprises and its impact on their evolution. It identifies crucial success factors for the sustainable development of these new and innovative organisations in an internationally comparative perspective. Thoroughly analysed case studies will serve as best practices by highlighting how social enterprises overcome crucial problems and manage to grow in different social areas and various regions around the world. In particular, the cases will shed light on how managers of social enterprises cooperate with stakeholders, and how their environment, composed of promoting actors and existing (political) structures, meets their needs in order to improve social cohesion all over Europe.


Grant
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: INFRADEV-02-2016 | Award Amount: 9.05M | Year: 2017

The European Solar Telescope (EST) will be a revolutionary Research Infrastructure that will play a major role in answering key questions in modern Solar Physics. This 4-meter-class solar telescope, to be located in the Canary Islands, will provide solar physicists with state-of-the-art observing tools to transform our understanding of the complex phenomena that drive solar magnetic activity. The principal objective of the present Preparatory Phase is to provide both the EST international consortium and the funding agencies with a detailed plan for the implementation of EST. The specific objectives of the proposed preparatory phase are: (1) to explore possible legal frameworks and related governance schemes that can be used by agencies to jointly establish, construct and operate EST as a new research infrastructure, with the implementation of an intermediate temporary organisational structure as a preliminary step towards future phases of the project; (2) to explore funding schemes and funding sources for EST, including a proposal of financial models to make possible the combination of direct financial and in-kind contributions towards the construction and operation of EST; (3) to compare the two possible sites for EST at the Canary Islands Astronomical Observatories and prepare final site agreements; (4) to engage funding agencies and policy makers in a long-term commitment that guarantees the construction and operation phases of the Telescope; (5) to involve industry in the design of EST key elements to the required level of definition and validation for their final production; and (6) to enhance and intensify outreach activities and strategic links with national agencies and the user communities of EST. To accomplish these goals, this 4-year project, promoted by the European Association for Solar Telescopes (EAST) and the PRE-EST consortium and encompassing 23 research institutions from 16 countries, will set up the Project Office.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EURO-4-2014 | Award Amount: 2.28M | Year: 2015

The European Union (EU) is facing the daunting prospect of transboundary crises: threats that escalate across national borders and policy domains. EU member states must collaborate to address these crises, and EU governance can play a pivotal role in facilitating a joint response. But does the EU have the institutional leadership capacities to deal with transboundary crises? The response to the financial crisis, a textbook example of a transboundary crisis, revealed deep problems with crisis leadership, including conflicting diagnoses, regulatory gaps, unclear political jurisdictions and responsibilities, a lack of problem-solving capacity, and blame-shifting. Growing euroscepticism has been directly related to the EU's role during this transboundary crisis. This project outlines the institutional requirements for effective and legitimate crisis leadership in the face of transboundary crises. We define crisis leadership as a set of strategic management functions, including the detection of impending threats, the collection and sharing of information, the coordination of partners, and communication with the public about the crisis and the response. The project analyses the capacities of political leaders in EU institutions and member states to fulfill these leadership functions. It will pinpoint the existing and required capacities to support these functions. It investigates the crisis management capacities of individual political leaders and EU institutions, explores the effects of political leadership at the member-state level, and studies how crisis management capacity is exercised in various policy sectors. The project will result in recommendations for effective and legitimate crisis leadership. It establishes a crisis management capital index that allows for an evidence-based assessment, and it proposes strategies to build support for transboundary crisis management in a multilevel system, reconnecting citizens with an idea of what the EU can do for them.


Grant
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: INFRASUPP-7-2014 | Award Amount: 1.34M | Year: 2015

In African Communities of Practice (CoPs), international collaboration and the pursuit of scientific endeavour have faced a major barrier: the lack of access to the e-Infrastructures and high-performance network infrastructure enjoyed by European counterparts. With AfricaConnect, the proposed AfricaConnect2 and other regional developments, this situation is changing rapidly. The project Teaming-up for exploiting e-Infrastructures potential to boost RTDI in Africa (eI4Africa) demonstrated clearly that it is possible to develop e-Infrastructures in Africa. It also demonstrated that, as in the rest of the world, easy-to-use web portals, or Science Gateways, are needed to help CoPs access e-Infrastructure facilities easily and, through these, collaborate with CoPs across the world. However, a major problem exists: it is very difficult for non-experts to develop Science Gateways and supporting e-Infrastructures. Elements of guides and supporting materials exist, but these are either written for different audiences or out of date. The present Coordination and Support Action, called Energising Scientific Endeavour through Science Gateways and e-Infrastructures in Africa (Sci-GaIA), therefore proposes to bring these materials together into clearly structured guides and educational documents that can be used to train and support representatives of NRENs, CoPs and, importantly, universities to develop Science Gateways and other e-Infrastructure services in Africa. Sci-GaIA plans to work with new and emerging CoPs to develop these exciting technologies, to strengthen e-Infrastructure service provision, especially in terms of open-access linked data, and to deliver training and dissemination workshops. This will give a sustainable foundation on which African e-Infrastructures can be developed and linked to scientific networks across Africa. Importantly, the results of our project will be usable by CoPs in Europe and the rest of the world.
