Pinelands, South Africa

Bernardi G., SKA SA | Bernardi G., Rhodes University | Bernardi G., Harvard-Smithsonian Center for Astrophysics | McQuinn M., University of California at Berkeley | Greenhill L.J., Harvard-Smithsonian Center for Astrophysics
Astrophysical Journal | Year: 2015

The most promising near-term observable of the cosmic dark age prior to widespread reionization (z ≈ 15–200) is the sky-averaged λ21 cm background arising from hydrogen in the intergalactic medium. Though an individual antenna could in principle detect the line signature, data analysis must separate foregrounds that are orders of magnitude brighter than the λ21 cm background (but that are anticipated to vary monotonically and gradually with frequency, i.e., they are considered "spectrally smooth"). Using more physically motivated models for foregrounds than in previous studies, we show that the intrinsic spectral smoothness of the foregrounds is likely not a concern, and that data analysis for an ideal antenna should be able to detect the λ21 cm signal after subtracting a fifth-order polynomial in log ν. However, we find that the foreground signal is corrupted by the angular and frequency-dependent response of a real antenna. The frequency dependence complicates modeling of foregrounds commonly based on the assumption of spectral smoothness. Our calculations focus on the Large-aperture Experiment to detect the Dark Age, which combines both radiometric and interferometric measurements. We show that statistical uncertainty remaining after fitting antenna gain patterns to interferometric measurements is not anticipated to compromise extraction of the λ21 cm signal for a range of cosmological models after fitting a seventh-order polynomial to radiometric data. Our results generalize to most efforts to measure the sky-averaged spectrum. © 2015 The American Astronomical Society. All rights reserved.
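The foreground-subtraction step described in the abstract, fitting a low-order polynomial in log ν to the sky-averaged spectrum and working with the residual, can be illustrated with a short sketch. This is a toy calculation, not the authors' pipeline: the foreground shape, the 21 cm trough, the frequency band, and all amplitudes below are assumed values, and whether the fit is performed in temperature or log-temperature varies between analyses.

```python
import numpy as np

# Frequency band for the sky-averaged spectrum (illustrative range, in MHz).
nu = np.linspace(40.0, 120.0, 200)
x = np.log(nu / 70.0)  # log-frequency about a reference of 70 MHz

# Toy "spectrally smooth" foreground: a power law with mild curvature in log-log,
# orders of magnitude brighter than the 21 cm signal (amplitudes are assumed).
T_fg = 2000.0 * np.exp(-2.5 * x - 0.05 * x ** 2)

# Toy global 21 cm absorption trough (a Gaussian dip, not a physical model).
T_21 = -0.1 * np.exp(-0.5 * ((nu - 70.0) / 10.0) ** 2)

T_sky = T_fg + T_21

# Fit a fifth-order polynomial in log-frequency and subtract it, mimicking the
# foreground-removal step the abstract describes for an ideal antenna.
coeffs = np.polyfit(x, np.log(T_sky), deg=5)
T_fit = np.exp(np.polyval(coeffs, x))

residual = T_sky - T_fit  # the 21 cm trough should largely survive here
print(f"deepest residual: {1000 * residual.min():.1f} mK "
      f"at {nu[residual.argmin()]:.1f} MHz")
```

In the paper's analysis the harder problem is that a real antenna's frequency-dependent response corrupts this smoothness, which is why a seventh-order fit to the radiometric data is needed once the gain pattern has been constrained interferometrically.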


News Article
Site: http://phys.org/physics-news/

Probably the nucleus of an iron atom, it carried about 3×10²⁰ electron volts (eV), the energy of a well-bowled cricket ball, but this was contained in a single particle (a quick arithmetic check of this comparison appears after the article). This is also way beyond the energy that the Large Hadron Collider (LHC) can give a particle, which is about 10¹⁵ eV. More of these ultra-high-energy particles have been seen in the past 25 years, but they are very rare, arriving at a rate of one per square kilometre per century. It's hard to reach such high energies within our galaxy, so the particles probably come from beyond it.

From a place far, far away

Being charged, cosmic-ray particles are deflected as they travel through our galaxy's magnetic fields, making it difficult to tell where they come from. But we might learn that by studying one of their by-products, neutrinos, which were the focus of this year's Nobel Prize for Physics. Cosmic rays with energies beyond 5×10¹⁹ eV should interact with the photons of the cosmic microwave background, producing high-energy neutrinos. Being uncharged, neutrinos travel in straight lines, and their direction of arrival points back towards their origin. Neutrinos are interesting in their own right, too, and can be used to test some of the more exotic theories of particle formation in the early universe.

Neutrinos interact very little with other matter. That means they can bring us astronomical information from the most distant reaches of the universe. But it also means that to find them you need a really large detector. Fortunately, we can use our moon. In 1962 a Russian-Armenian physicist, Gurgen Askaryan, predicted that neutrinos interacting with rocks under the moon's surface would generate a flash of radio waves lasting just a nanosecond – known as the Askaryan effect – and that this could be detected by a receiver on the moon. In 1992 two Russians, R. Dagkesamanskii and I. M. Zheleznykh, suggested that you wouldn't need to put a receiver on the moon: you could just point a ground-based radio telescope at it.

When I heard Zheleznykh talk about this in the early 1990s I realised that we could do the experiment – the first of its kind – with CSIRO's Parkes telescope in New South Wales. I put it together in 1995 with Tim Hankins, a US colleague, and John O'Sullivan, who had led CSIRO's development of Wi-Fi. That first experiment gave us a limit rather than any detection, but it also triggered a new interest in using radio observations to study high-energy particles.

Two US researchers ran a second such experiment in the early 2000s, using NASA's 70-metre antenna at Goldstone in California. This gained a lot of attention, but we knew we could do a better one, at least ten times more sensitive. This time we used CSIRO's Compact Array, a set of six 22-metre dishes in northwest NSW. The array had a big advantage over Goldstone: it let us distinguish between radio pulses coming from the moon and those from terrestrial signals (radio-frequency interference). But some effort was needed to adapt it for detecting extremely short pulses.

So for our third experiment we decided to try again with Parkes, and use a different way to handle the radio-frequency interference. Parkes has a radio receiver that lets it see 13 spots on the sky simultaneously. This, plus some technical wizardry from CSIRO engineer Paul Roberts, let us eliminate all the radio-frequency interference: a huge achievement.
We also used the timing signals from the GPS satellites to measure the density of free electrons in the ionosphere, the upper part of the atmosphere, and even a piezoelectric barbecue lighter, which made a radio impulse we used to calibrate signals.

In the past 15 years other research groups have entered the fray, but this second Parkes experiment was three times more sensitive than any previous one of its kind, and we pushed the limit on the flux of ultra-high-energy cosmic neutrinos down to its lowest level. There's a lot of wiggle room in the theories of how high-energy neutrinos are produced, but as observations tighten the limits the theorists have to gradually rule out some of their original ideas.

The IceCube experiment in Antarctica recently detected the first high-energy neutrinos from space, but these are still 10,000 times less energetic than the extremely rare ones we have been looking for. There are other experiments proposed or in train that might find these elusive particles, including a satellite that uses the whole of the atmosphere as its detector. The coming Square Kilometre Array (SKA) is the obvious instrument to try again to detect the neutrinos, and there are discussions about how it could be used in an experiment. With just a little more sensitivity than we had, which you could easily get with the SKA, the hope is that one day we could detect not only neutrinos but also the original cosmic rays interacting with the moon.
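The cricket-ball comparison at the start of the article can be checked with back-of-envelope arithmetic. Only the 3×10²⁰ eV figure comes from the article; the ball's mass and speed below are assumed, typical values.

```python
# Back-of-envelope check of the "well-bowled cricket ball" comparison.
eV_to_J = 1.602e-19        # joules per electron volt

E_particle_eV = 3e20       # energy quoted in the article for the single particle
E_particle_J = E_particle_eV * eV_to_J

m_ball = 0.16              # kg: regulation cricket ball (~160 g), assumed
v_ball = 25.0              # m/s: a brisk delivery (~90 km/h), assumed
E_ball_J = 0.5 * m_ball * v_ball ** 2

print(f"cosmic ray: {E_particle_J:.0f} J, cricket ball: {E_ball_J:.0f} J")
# Both come out near 50 J, which is why the comparison works.
```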


News Article | August 22, 2016
Site: http://phys.org/space-news/

A prototype of part of the software system that will manage data from the Square Kilometre Array (SKA) telescope has run on the world's second-fastest supercomputer, located in China.


News Article | April 13, 2016
Site: http://www.techtimes.com/rss/sections/science.xml

A mysterious alignment has been observed in a remote area of the universe: sixty-four supermassive black holes have been seen spinning out radio jets from their centers that all point in the same direction. Black holes are well known to produce radio emission, but this is the first time an alignment has been seen on such a large scale. This implies that whatever governs these black holes acts over much larger scales and much earlier times, hence the alignment has been linked to "primordial mass fluctuations" in the early universe.

"Since these black holes don't know about each other, or have any way of exchanging information or influencing each other directly over such vast scales, this spin alignment must have occurred during the formation of the galaxies in the early universe," said Professor Andrew Russ Taylor, joint UWC/UCT SKA Chair, Director of the recently launched Inter-University Institute for Data Intensive Astronomy, and principal author of the Monthly Notices study.

The astronomers are puzzled by the alignment and have put forward a few mechanisms that could have triggered such a large-scale phenomenon. Candidates include cosmic strings (theoretical fault lines in the universe), exotic particles such as axions, cosmic magnetic fields, or perhaps something else entirely that is yet to be identified. Experts said the observation of black hole alignment could provide evidence of the environmental influences that contributed to the formation and evolution of galaxies, as well as of the primordial fluctuations that brought about the structure of the universe.

The phenomenon was captured through three years of deep radio imaging carried out with the Giant Metrewave Radio Telescope (GMRT) in India. The alignment may hold clues about the early universe, when the black holes initially formed. The study was published in the Monthly Notices of the Royal Astronomical Society. © 2016 Tech Times, All rights reserved.
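The article does not describe how the alignment was quantified. One standard way to test whether jet position angles cluster around a preferred direction is a Rayleigh-style test on doubled angles (position angles are axial, so θ and θ + 180° describe the same jet axis). The sketch below uses hypothetical angles and is not the statistical method of the Monthly Notices study.

```python
import numpy as np

# Rayleigh-style test for a preferred jet position angle (illustrative only).
rng = np.random.default_rng(0)

# Hypothetical position angles (degrees) for 64 jets, clustered near 40 degrees.
pa_deg = (40.0 + 15.0 * rng.standard_normal(64)) % 180.0

# Position angles are axial, so double them before treating them as directions.
theta = np.deg2rad(2.0 * pa_deg)

n = len(theta)
R = np.hypot(np.cos(theta).sum(), np.sin(theta).sum()) / n
p_value = np.exp(-n * R ** 2)  # large-sample Rayleigh approximation

print(f"mean resultant length R = {R:.2f}, approximate p = {p_value:.2e}")
# A small p-value indicates the angles are unlikely to be uniformly distributed,
# i.e. evidence of an alignment.
```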


News Article | August 29, 2016
Site: http://www.scientificcomputing.com/rss-feeds/all/rss.xml/all

If you want to model weather systems, perform advanced computational mechanics, simulate the impact of climate change, study the interaction of lithium and manganese in batteries at the atomic level, or conduct the next experiment of your latest in vitro biomedical technique virtually — and you want to do it in Africa — then there is only one place to go: the Center for High Performance Computing (CHPC) in Cape Town.

Built and operated within the South African Council for Scientific and Industrial Research (CSIR), the CHPC is home to South Africa's newest (and only) supercomputer. Named "Lengau," which means "Cheetah" in Setswana, the system became fully operational in May 2016 and was ranked 121 on the June TOP500 list of the world's fastest supercomputers. Its mission: to make South Africa, and indeed Africa itself, a major player within the international community of HPC-driven scientific researchers while also boosting South Africa's burgeoning development in scientific and technical education.

Such world-class ambitions, however, require equally world-class technology. Based on Intel Xeon processors, the new system comprises 1,013 Dell PowerEdge servers totaling 19 racks of compute nodes and storage. It has a total storage capacity of five petabytes and uses Dell networking Ethernet switches and Mellanox FDR InfiniBand with a maximum interconnect speed of 56 Gb/s. With over 24,000 cores, the machine is the fastest computer on the African continent at roughly one petaflop (a thousand trillion floating point operations per second) — 15 times faster than CHPC's previous system (a rough sanity check of this figure appears after the article).

The person leading the effort to make the new supercomputer a reality was CHPC Director, Dr. Happy Sithole. For him, nothing less than world-class supercomputing power would suffice. "For us, it's no different from the rest of the world in terms of looking for opportunities where we need to accelerate competitiveness. I think high performance computing is vital for competitiveness in developed countries. In South Africa we also have that ambition to accelerate areas where we are competitive in industry and science."

Those research domains are quite broad, Dr. Sithole says. "They cover chemistry, bioinformatics, astronomy, computational mechanics, engineering applications or systems, and the earth sciences including climate change. The South African Weather Service is a key collaborator as well as the Agricultural Research Council. It's quite a broad spectrum of users."

But advancing scientific research is only one of the key benefits high performance computing offers South Africa, Dr. Sithole says. Helping industry is another. "The first key performance indicator for us is whether we are helping someone solve a problem faster. And the second is whether we demonstrate an impact to non-academic users — whether some of our industries can say we were able to do things much faster, we were able to generate more revenue, because of high performance computing."

Virtual prototyping is a prime example, he says. "The more you are able to do virtual prototypes the faster you can take your product to market. And here at CHPC we have an ongoing investment in virtual prototyping."

But if CHPC shares many of the same goals as other high performance computing centers, it also faces some unique challenges as well as opportunities.
"If you look at most centers around the world," Dr. Sithole says, "they have the option to focus on a specific area. But we don't have that luxury. We have some users who don't have access to any other computing resources. That is our uniqueness — that we are the only center in the country and on the continent. We have all those users with varied needs of computing and also of application requirements. But our unique geographical position also brings us unique opportunities and some very good partnerships."

A good example is climate change research. Like other countries, South Africa is very concerned about the future impact greenhouse gases will have on public health, agriculture, the availability of fresh water, and other areas. But what makes climate research here different is its focus on the Southern Hemisphere.

"Perhaps our biggest user," Dr. Sithole says, "is a climate modeling team from the CSIR, which absolutely depends on the CHPC for what they call the Variable Resolution Earth System Model, or VRESM. This is an earth systems model for climate prediction that contributes to global research efforts. It specifically focuses on the Southern Hemisphere whereas similar modeling efforts elsewhere only focus on the Northern Hemisphere. VRESM relies on the CHPC because of the level of computing resources they are accessing — 9,000 to 10,000 cores at a time — which they cannot get anywhere else. And where before their models were limited to an eight-kilometer resolution, today they are at one-kilometer resolution. This is something they could not do before."

Another example is materials science, particularly in fields like battery research and minerals differentiation (extracting precious metals from ores). South Africa ranks at or very near the top in deposits of metals like manganese, platinum, chromite, vanadium, and vermiculite. Here too the new system's increased computational power is having a clear impact. According to Dr. Sithole, "Materials science models that once took 11 days to finish now take only three-quarters of a day. That's a major improvement."

On the battery side, scientists use CHPC to model the interaction of atoms from different metals, like lithium and manganese, as a way to predict battery performance. "They're looking at lithium manganese dioxide," says Dr. Sithole. "In order to realistically show what happens in the actual battery system, researchers need to simulate a large number of lithium atoms traveling through the manganese. That means scaling the size of the battery system to millions of atoms. Where they could only model hundreds before, they have already surpassed 120,000 atoms and they now see they can push to millions."

CHPC will also play a key role in support of the world's largest radio telescope — the Square Kilometre Array (SKA) — scheduled to be deployed in South Africa's Karoo desert by the year 2020. It will be 50 times more sensitive and survey the sky 10,000 times faster than today's most powerful radio telescopes — and also generate record-setting amounts of astronomical data. The precursor to the SKA is the MeerKAT radio telescope, located in South Africa's Northern Cape. To give users close proximity to their data and to help balance MeerKAT's — and soon the SKA's — huge compute load, CHPC will support portions of MeerKAT's data analysis and hold its archives. CHPC will also participate in efforts to create Africa's first data-intensive cloud infrastructure as part of the country's new Inter-University Institute for Data Intensive Astronomy (IDIA).
Supporting these types of use cases would be impossible, Dr. Sithole says, without the help of vendor partners. "You would not be able to achieve this through working alone. We worked very closely with the Intel team, especially when it came to working with the Lustre vendors but also in looking at the libraries and other Intel-related dependencies. For example, some configurations under Intel Manager for Lustre software did not allow a number of files to be written at the same time. During this whole process their people were available all the time and were very helpful in resolving issues. Without companies like Intel we would not be able to achieve benefits like efficient parallelization or the introduction of new technologies. So partnerships with OEMs are very important when you are looking to build at scale."

That's just one of many lessons Dr. Sithole and his team learned in building out CHPC's new supercomputer. Another was the need "to identify low hanging fruit so you can start demonstrating impact early." Still another was to "start building expertise within your user base early and get users involved early and incrementally during the build-out process."

Thanks to leadership like that, South Africa now has its own role to play in the global community of high performance computing — while at the same time enjoying the singular opportunities that come from leveraging this continent's unique and abundant resources.

Randall Cronk of greatwriting, LLC is a technology writer with over 30 years' experience writing about complex systems, products, and services.
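The "roughly one petaflop" figure quoted for Lengau can be sanity-checked from the core count. The clock speed and per-core floating-point throughput below are assumed values for an Intel Xeon of that generation; only the core count comes from the article.

```python
# Rough sanity check of Lengau's "roughly one petaflop" peak performance.
cores = 24_000          # "over 24,000 cores", quoted in the article
clock_hz = 2.6e9        # assumed core clock (2.6 GHz); not stated in the article
flops_per_cycle = 16    # assumed double-precision FLOPs/cycle/core (AVX2 + FMA)

peak_flops = cores * clock_hz * flops_per_cycle
print(f"theoretical peak ≈ {peak_flops / 1e15:.2f} petaflops")
# ≈ 1.0 petaflop, consistent with the figure quoted in the article.
```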
