Kavli Institute for Cosmological Physics

Chicago, IL, United States


News Article | May 22, 2017
Site: www.rdmag.com

Scientists behind XENON1T, the largest dark matter experiment of its kind ever built, are encouraged by early results, describing them as the best so far in the search for dark matter. Dark matter is one of the basic constituents of the universe, five times more abundant than ordinary matter. Several astronomical measurements have corroborated the existence of dark matter, leading to an international effort to observe it directly. Scientists are trying to detect dark matter particles interacting with ordinary matter through the use of extremely sensitive detectors. Such interactions are so feeble that they have escaped direct detection to date, forcing scientists to build detectors that are ever more sensitive and have extremely low levels of radioactivity.

On May 18, the XENON Collaboration released results from a first, 30-day run of XENON1T, showing the detector has a record low radioactivity level, many orders of magnitude below surrounding material on Earth. “The care that we put into every single detail of the new detector is finally paying back,” said Luca Grandi, assistant professor of physics at the University of Chicago and member of the XENON Collaboration. “We have excellent discovery potential in the years to come because of the huge dimension of XENON1T and its incredibly low background. These early results already are allowing us to explore regions never explored before.”

The XENON Collaboration consists of 135 researchers from the United States, Germany, Italy, Switzerland, Portugal, France, the Netherlands, Israel, Sweden and the United Arab Emirates, who hope to one day confirm dark matter’s existence and shed light on its mysterious properties. Located deep below a mountain in central Italy, XENON1T features a 3.2-ton xenon dual-phase time projection chamber. This central detector sits fully submersed in the middle of a water tank, in order to shield it from natural radioactivity in the cavern.
A cryostat helps keep the xenon at a temperature of minus 95 degrees Celsius without freezing the surrounding water. The mountain above the laboratory further shields the detector, preventing it from being perturbed by cosmic rays. But shielding from the outer world is not enough, since all materials on Earth contain tiny traces of natural radioactivity. Thus extreme care was taken to find, select and process the materials making up the detector to achieve the lowest possible radioactive content. This allowed XENON1T to achieve the record “silence” necessary to detect the very weak signal of dark matter.

A particle interaction in the one-ton central core of the time projection chamber leads to tiny flashes of light. Scientists record and study these flashes to infer the position and the energy of the interacting particle—and whether it might be dark matter. Despite the brief 30-day science run, the sensitivity of XENON1T has already surpassed that of any other experiment in the field, probing unexplored dark matter territory. “For the moment we do not see anything unexpected, so we set new constraints on dark matter properties,” Grandi said. “But XENON1T just started its exciting journey, and since the end of the 30-day science run, we have been steadily accumulating new data.”

Grandi’s group is very active within XENON1T, contributing to several aspects of the program. After its initial involvement in the preparation, assembly and early operations of the liquid xenon chamber, the group shifted its focus in the last several months to the development of the computing infrastructure and to data analysis. “Despite its low background, XENON1T is producing a large amount of data that needs to be continuously processed,” said Evan Shockley, a graduate student working with Grandi.
“The raw data from the detector are directly transferred from Gran Sasso Laboratory to the University of Chicago, serving as the unique distribution point for the entire collaboration.” The framework, developed in collaboration with a group led by Robert Gardner, senior fellow at the Computation Institute, allows for the processing of data on both local and remote resources belonging to the Open Science Grid. The involvement of UChicago’s Research Computing Center, including Director Birali Runesha, allows members of the collaboration around the world to access processed data for high-level analyses.

Grandi’s group also has been heavily involved in the analysis that led to this first result. Christopher Tunnell, a fellow at the Kavli Institute for Cosmological Physics, is one of the two XENON1T analysis coordinators and corresponding author of the result. Recently, UChicago hosted about 25 researchers for a month to perform the analyses that led to the first results. “It has been a large, concentrated effort, and seeing XENON1T back on the front line makes me forget the never-ending days spent next to my colleagues looking at plots and distributions,” Tunnell said. “There is no better thrill than leading the way in our knowledge of dark matter for the coming years.”
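The detection principle described in the article, flashes of light whose timing encodes where a particle interacted, can be illustrated with a toy calculation. In a dual-phase time projection chamber, an interaction produces a prompt scintillation flash (S1); the freed electrons then drift upward at a roughly constant velocity and produce a second flash (S2) at the liquid surface, so the S1-to-S2 delay gives the interaction depth. This is a minimal sketch of that geometry; the drift velocity below is an illustrative number, not an actual XENON1T parameter.

```python
# Toy depth reconstruction for a dual-phase xenon time projection chamber.
# The delay between the prompt flash (S1) and the delayed flash (S2)
# is proportional to how deep in the liquid the interaction occurred.

DRIFT_VELOCITY_MM_PER_US = 1.5  # illustrative value, not a XENON1T spec

def interaction_depth_mm(s1_time_us: float, s2_time_us: float) -> float:
    """Depth below the liquid surface, from the S1-to-S2 drift time."""
    drift_time_us = s2_time_us - s1_time_us
    if drift_time_us < 0:
        raise ValueError("S2 must arrive after S1")
    return drift_time_us * DRIFT_VELOCITY_MM_PER_US

# An S2 arriving 400 microseconds after S1 implies a depth of 600 mm
# at this assumed drift velocity.
print(interaction_depth_mm(0.0, 400.0))
```

The horizontal position comes separately, from the pattern of S2 light across the photosensor array; only the depth follows from timing alone.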



News Article | January 4, 2016
Site: www.scientificcomputing.com

In this special feature, we have invited top astronomers to handpick the Hubble Space Telescope image that has the most scientific relevance to them. The images they’ve chosen aren’t always the colorful glory shots that populate the countless “best of” galleries around the internet; rather, their impact comes in the scientific insights they reveal.

My all-time favorite astronomical object is the Orion Nebula — a beautiful and nearby cloud of gas that is actively forming stars. I was a high school student when I first saw the nebula through a small telescope, and it gave me such a sense of achievement to manually point the telescope in the right direction and, after a fair bit of hunting, finally track it down in the sky (there was no automatic ‘go-to’ button on that telescope). Of course, what I saw on that long-ago night was an amazingly delicate and wispy cloud of gas in black and white. One of the wonderful things that Hubble does is to reveal the colors of the universe. And this image of the Orion Nebula is our best chance to imagine what it would look like if we could possibly go there and see it up close. So many of Hubble’s images have become iconic, and for me the joy is seeing its beautiful images bring science and art together in a way that engages the public. The entrance to my office features an enormous copy of this image wallpapered on a wall 4m wide and 2.5m tall. I can tell you, it’s a lovely way to start each working day.

The impact of the fragments of Comet Shoemaker-Levy 9 with Jupiter in July 1994 was the first time astronomers had advance warning of a planetary collision. Many of the world’s telescopes, including the recently repaired Hubble, turned their gaze onto the giant planet. The comet crash was also my first professional experience of observational astronomy. From a frigid dome on Mount Stromlo, we hoped to see Jupiter’s moons reflect light from comet fragments crashing into the far side of Jupiter.
Unfortunately we saw no flashes of light from Jupiter’s moons. However, Hubble got an amazing and unexpected view. The impacts on the far side of Jupiter produced plumes that rose so far above Jupiter’s clouds that they briefly came into view from Earth. As Jupiter rotated on its axis, enormous dark scars came into view. Each scar was the result of the impact of a comet fragment, and some of the scars were larger in diameter than our moon. For astronomers around the globe, it was a jaw-dropping sight.

NASA, ESA and Jonathan Nichols (University of Leicester), CC BY

This pair of images shows a spectacular ultraviolet aurora light show occurring near Saturn’s north pole in 2013. The two images were taken just 18 hours apart, but show changes in the brightness and shape of the auroras. We used these images to better understand how much of an impact the solar wind has on the auroras. We used Hubble photographs like these, acquired by my astronomer colleagues, to monitor the auroras while using the Cassini spacecraft, in orbit around Saturn, to observe radio emissions associated with the lights. We were able to determine that the brightness of the auroras is correlated with higher radio intensities. Therefore, I can use Cassini’s continuous radio observations to tell me whether or not the auroras are active, even if we don’t always have images to look at. This was a large effort including many Cassini investigators and Earth-based astronomers.

This far-ultraviolet image of Jupiter’s northern aurora shows the steady improvement in capability of Hubble’s scientific instruments. The Space Telescope Imaging Spectrograph (STIS) images showed, for the first time, the full range of auroral emissions that we were just beginning to understand. The earlier Wide Field Planetary Camera 2 (WFPC2) had shown that Jupiter’s auroral emissions rotated with the planet, rather than being fixed with the direction to the sun; thus Jupiter did not behave like the Earth.
We knew that there were aurorae from the mega-ampere currents flowing from Io along the magnetic field down to Jupiter, but we were not certain this would occur with the other satellites. While there were many ultraviolet images of Jupiter taken with STIS, I like this one because it clearly shows the auroral emissions from the magnetic footprints of Jupiter’s moons Io, Europa and Ganymede, and Io’s emission clearly shows the height of the auroral curtain. To me it looks three-dimensional.

Take a good look at these images of the dwarf planet Pluto, which show detail at the extreme limit of Hubble’s capabilities. A few days from now, they will be old hat, and no one will bother looking at them again. Why? Because in early May, the New Horizons spacecraft will be close enough to Pluto for its cameras to reveal better detail, as the craft nears its 14 July rendezvous. Yet this sequence of images — dating from the early 2000s — has given planetary scientists their best insights to date, the variegated colors revealing subtle variations in Pluto’s surface chemistry. That yellowish region prominent in the center image, for example, has an excess of frozen carbon monoxide. Why that should be is unknown. The Hubble images are all the more remarkable given that Pluto is only two-thirds the diameter of our own moon, but nearly 13,000 times farther away.

I once dragged my wife into my office to proudly show her the results of some imaging observations made at the Anglo-Australian Telescope with a (then) new and (then) state-of-the-art 8,192 x 8,192 pixel imager. The images were so large, they had to be printed out on multiple A4 pages, and then stuck together to create a huge black-and-white map of a cluster of galaxies that covered a whole wall. I was crushed when she took one look and said: “Looks like mould.” Which just goes to show the best science is not always the prettiest.
My choice of the greatest image from HST is another black-and-white image, from 2012, that also “looks like mould.” But buried in the heart of the image is an apparently unremarkable faint dot. However, it represents the confirmed detection of the coldest example of a brown dwarf then discovered: an object lurking less than 10 parsecs (32.6 light-years) away from the sun with a temperature of about 350 Kelvin (77 degrees Celsius) — colder than a cup of tea! And to this day it remains one of the coldest compact objects we’ve detected outside our solar system.

NASA/ESA/STScI, processing by Lucas Macri (Texas A&M University). Observations carried out as part of HST Guest Observer program 9810.

In 2004, I was part of a team that used the recently installed Advanced Camera for Surveys (ACS) on Hubble to observe a small region of the disk of a nearby spiral galaxy (Messier 106) on 12 separate occasions within 45 days. These observations allowed us to discover over 200 Cepheid variables, which are very useful for measuring distances to galaxies and ultimately determining the expansion rate of the universe (appropriately named the Hubble constant). This method requires a proper calibration of Cepheid luminosities, which can be done in Messier 106 thanks to a very precise and accurate estimate of the distance to this galaxy (24.8 million light-years, give or take 3%) obtained via radio observations of water clouds orbiting the massive black hole at its center (not included in the image). A few years later, I was involved in another project that used these observations as the first step in a robust cosmic distance ladder and determined the value of the Hubble constant with a total uncertainty of three percent.

NASA, ESA and H.E. Bond (STScI), CC BY

One of the images that excited me most — even though it never became famous — was our first one of the light echo around the strange explosive star V838 Monocerotis.
Its eruption was discovered in January 2002, and its light echo was discovered about a month later, both from small ground-based telescopes. Although light from the explosion travels straight to the Earth, it also goes out to the side, reflects off nearby dust, and arrives at Earth later, producing the “echo.” Astronauts had serviced Hubble in March 2002, installing the new Advanced Camera for Surveys (ACS). In April, we were one of the first to use ACS for science observations. I always liked to think that NASA somehow knew that the light from V838 was on its way to us from 20,000 light-years away, and got ACS installed just in time! The image, even in only one color, was amazing. We obtained many more Hubble observations of the echo over the ensuing decade, and they are some of the most spectacular of all, and VERY famous, but I still remember being awed when I saw this first one.

X-ray: NASA/CXC/Univ of Iowa/P. Kaaret et al.; Optical: NASA/ESA/STScI/Univ of Iowa/P. Kaaret et al., CC BY-NC

Galaxies form stars. Some of those stars end their “normal” lives by collapsing into black holes, but then begin new lives as powerful X-ray emitters powered by gas sucked off a companion star. I obtained this Hubble image (in red) of the Medusa galaxy to better understand the relation between black hole X-ray binaries and star formation. The striking appearance of the Medusa arises because it’s a collision between two galaxies — the “hair” is remnants of one galaxy torn apart by the gravity of the other. The blue in the image shows X-rays, imaged with the Chandra X-ray Observatory. The blue dots are black hole binaries. Earlier work had suggested that the number of X-ray binaries is simply proportional to the rate at which the host galaxy forms stars. These images of the Medusa allowed us to show that the same relation holds, even in the midst of galactic collisions.

NASA, ESA, the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration, and A. Evans (University of Virginia, Charlottesville/NRAO/Stony Brook University), CC BY

Some of the Hubble Space Telescope images that appeal to me a great deal show interacting and merging galaxies, such as the Antennae (NGC 4038 and NGC 4039), the Mice (NGC 4676), the Cartwheel galaxy (ESO 350-40), and many others without nicknames. These are spectacular examples of violent events that are common in the evolution of galaxies. The images provide us with exquisite detail about what goes on during these interactions: the distortion of the galaxies, the channeling of gas towards their centers, and the formation of stars. I find these images very useful when I explain to the general public the context of my own research, the accretion of gas by the supermassive black holes at the centers of such galaxies. Particularly neat and useful is a video put together by Frank Summers at the Space Telescope Science Institute (STScI), illustrating what we learn by comparing such images with models of galaxy collisions.

Our best computer simulations tell us galaxies grow by colliding and merging with each other. Similarly, our theories tell us that when two spiral galaxies collide, they should form a large elliptical galaxy. But actually seeing it happen is another story entirely! This beautiful Hubble image has captured a galaxy collision in action. This doesn’t just tell us that our predictions are good, but it lets us start working out the details because we can now see what actually happens. There are fireworks of new star formation triggered as the gas clouds collide and huge distortions going on as the spiral arms break up. We have a long way to go before we’ll completely understand how big galaxies form, but images like this are pointing the way.

This is the highest-resolution view of a collimated jet powered by a supermassive black hole in the nucleus of the galaxy M87 (the biggest galaxy in the Virgo Cluster, 55 million light-years from us).
The jet shoots out of the hot plasma region surrounding the black hole (top left) and we can see it streaming down across the galaxy, over a distance of 6,000 light-years. The white/purple light of the jet in this stunning image is produced by the stream of electrons spiraling around magnetic field lines at a speed of approximately 98% of the speed of light. Understanding the energy budget of black holes is a challenging and fascinating problem in astrophysics. When gas falls into a black hole, a huge amount of energy is released in the form of visible light, X-rays and jets of electrons and positrons traveling almost at the speed of light. With Hubble, we can measure the size of the black hole (a thousand times bigger than the central black hole of our galaxy), the energy and speed of its jet, and the structure of the magnetic field that collimates it.

NASA, Jayanne English (University of Manitoba), Sally Hunsberger (Pennsylvania State University), Zolt Levay (Space Telescope Science Institute), Sarah Gallagher (Pennsylvania State University), and Jane Charlton (Pennsylvania State University), CC BY

When my Hubble Space Telescope proposal was accepted in 1998, it was one of the biggest thrills of my life. To imagine that, for me, the telescope would capture Stephan’s Quintet, a stunning compact group of galaxies! Over the next billion years, Stephan’s Quintet galaxies will continue in their majestic dance, guided by each other’s gravitational attraction. Eventually they will merge, change their forms, and ultimately become one. We have since observed several other compact groups of galaxies with Hubble, but Stephan’s Quintet will always be special because its gas has been released from its galaxies and lights up in dramatic bursts of intergalactic star formation. What a fine thing to be alive at a time when we can build the Hubble and push our minds to glimpse the meaning of these signals from our universe. Thanks to all the heroes who made and maintained Hubble.
When Hubble was launched in 1990, I was beginning my PhD studies into gravitational lensing, the action of mass bending the paths of light rays as they travel across the universe. Hubble’s image of the massive galaxy cluster Abell 2218 brings this gravitational lensing into sharp focus, revealing how the massive quantity of dark matter present in the cluster — matter that binds the many hundreds of galaxies together — magnifies the light from sources many times more distant. As you stare deeply into the image, these highly magnified images are apparent as long thin streaks, the distorted views of baby galaxies that would normally be impossible to detect. It gives you pause to think that such gravitational lenses, acting as natural telescopes, use the gravitational pull from invisible matter to reveal amazing detail of the universe we cannot normally see!

NASA, ESA, J. Rigby (NASA Goddard Space Flight Center), K. Sharon (Kavli Institute for Cosmological Physics, University of Chicago), and M. Gladders and E. Wuyts (University of Chicago)

Gravitational lensing is an extraordinary manifestation of the effect of mass on the shape of space-time in our universe. Essentially, where there is mass the space is curved, and so objects viewed in the distance, beyond these mass structures, have their images distorted. It’s somewhat like a mirage; indeed, this is the term the French use for this effect. In the early days of the Hubble Space Telescope, an image appeared of the lensing effects of a massive cluster of galaxies: the tiny background galaxies were stretched and distorted but embraced the cluster, almost like a pair of hands. I was stunned. This was a tribute to the extraordinary resolution of the telescope, operating far above the Earth’s atmosphere. Viewed from the ground, these extraordinarily thin wisps of galactic light would have been smeared out and not distinguishable from the background noise.
My third-year astrophysics class explored the 100 Top Shots of Hubble, and they were most impressed by the extraordinary, but true, colors of the clouds of gas. However, I cannot go past an image displaying the effect of mass on the very fabric of our universe.

NASA, ESA, J. Richard (Center for Astronomical Research/Observatory of Lyon, France), and J.-P. Kneib (Astrophysical Laboratory of Marseille, France), CC BY

With general relativity, Einstein postulated that matter changes space-time and can bend light. A fascinating consequence is that very massive objects in the universe will magnify light from distant galaxies, in essence becoming cosmic telescopes. With the Hubble Space Telescope, we have now harnessed this powerful ability to peer back in time to search for the first galaxies. This Hubble image shows a hive of galaxies that have enough mass to bend light from very distant galaxies into bright arcs. My first project as a graduate student was to study these remarkable objects, and I still use Hubble today to explore the nature of galaxies across cosmic time.

To the human eye, the night sky in this image is completely empty — a tiny region no thicker than a grain of rice held at arm’s length. The Hubble Space Telescope was pointed at this region for 12 full days, letting light hit the detectors, and slowly, one by one, the galaxies appeared, until the entire image was filled with 10,000 galaxies stretching all the way across the universe. The most distant are tiny red dots tens of billions of light-years away, dating back to a time just a few hundred million years after the Big Bang. The scientific value of this single image is enormous. It revolutionized our theories both of how early galaxies could form and how rapidly they could grow. The history of our universe, as well as the rich variety of galaxy shapes and sizes, is contained in a single image.
To me, what truly makes this picture extraordinary is that it gives a glimpse into the scale of our visible universe. So many galaxies in so small an area implies that there are 100 thousand million galaxies across the entire night sky — one entire galaxy for every star in our Milky Way!

NASA, ESA, and J. Lotz, M. Mountain, A. Koekemoer, and the HFF Team (STScI), CC BY

This is what Hubble is all about. A single, awe-inspiring view can unmask so much about our universe: its distant past, its ongoing assembly, and even the fundamental physical laws that tie it all together. We’re peering through the heart of a swarming cluster of galaxies. Those glowing white balls are giant galaxies that dominate the cluster center. Look closely and you’ll see diffuse shreds of white light being ripped off them! The cluster is acting like a gravitational blender, churning many individual galaxies into a single cloud of stars. But the cluster itself is just the first chapter in the cosmic story being revealed here. See those faint blue rings and arcs? Those are the distorted images of other galaxies that sit far in the distance. The immense gravity of the cluster causes the space-time around it to warp. As light from distant galaxies passes by, it’s forced to bend into weird shapes, like a warped magnifying glass would distort and brighten our view of a faint candle. Leveraging our understanding of Einstein’s general relativity, Hubble is using the cluster as a gravitational telescope, allowing us to see farther and fainter than ever before possible. We are looking far back in time to see galaxies as they were more than 13 billion years ago!

As a theorist, I want to understand the full life cycle of galaxies — how they are born (small, blue, bursting with new stars), how they grow, and eventually how they die (big, red, fading with the light of ancient stars). Hubble allows us to connect these stages.
Some of the faintest, most distant galaxies in this image are destined to become monster galaxies like those glowing white in the foreground. We’re seeing the distant past and the present in a single glorious picture.

Tanya Hill, Honorary Fellow of the University of Melbourne and Senior Curator (Astronomy), Museum Victoria. This article was originally published on The Conversation. Read the original article.
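The whole-sky figure quoted above (10,000 galaxies in a rice-grain patch implying some 100 thousand million across the sky) is simple proportional arithmetic: divide the sky's solid angle by the patch observed and multiply by the galaxies counted. A minimal sketch, assuming a deep-field footprint of about 11 square arcminutes (roughly the Hubble Ultra Deep Field; this figure is an assumption, not stated in the article):

```python
# Back-of-envelope extrapolation from one deep-field patch to the whole sky.
FULL_SKY_DEG2 = 41_253        # solid angle of the celestial sphere, square degrees
ARCMIN2_PER_DEG2 = 3600       # (60 arcmin per degree) squared
PATCH_ARCMIN2 = 11            # assumed deep-field footprint (~HUDF)
GALAXIES_IN_PATCH = 10_000    # galaxies counted in the image

sky_arcmin2 = FULL_SKY_DEG2 * ARCMIN2_PER_DEG2
total_galaxies = GALAXIES_IN_PATCH * sky_arcmin2 / PATCH_ARCMIN2
print(f"{total_galaxies:.1e}")  # on the order of 1e11, i.e. ~100 billion
```

The estimate is only as good as the assumption that this one patch is typical of the sky, which is exactly why deep fields are chosen to be unremarkable regions.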


Adam Hadhazy, writer and editor for The Kavli Foundation, contributed this article to Space.com's Expert Voices: Op-Ed & Insights.

Astronomers are increasingly enlisting volunteer "citizen scientists" to help them examine a seemingly endless stream of images and measurements of the universe, and their combined efforts are having a powerful impact on the study of the cosmos. Just last November, a citizen science project called Space Warps announced the discovery of 29 new gravitational lenses, regions in the universe where massive objects bend the paths of photons (from galaxies and other light sources) as they travel toward Earth. As cosmic phenomena go, the lenses are highly prized by scientists because they offer tantalizing glimpses of objects too distant and dim to be seen through existing telescopes, as well as information on the objects that are acting as lenses.

The Space Warps haul of lenses is all the more impressive because of how it was obtained. During an eight-month period, about 37,000 volunteers individually combed more than 430,000 digital images in a huge, online photo library of deep space. Automated computer programs have identified most of the 500 gravitational lenses on astronomers' books. However, computers failed to flag the 29 lenses the Space Warps volunteers spotted, speaking to the unique skills we humans possess.

The Kavli Foundation spoke with three researchers, all co-authors of two papers published in the Monthly Notices of the Royal Astronomical Society (SPACE WARPS – I. Crowdsourcing the discovery of gravitational lenses; SPACE WARPS – II. New gravitational lens candidates from the CFHTLS discovered through citizen science) describing the Space Warps findings. In our roundtable, the researchers discussed the findings and the critical role citizen science is playing in furthering astronomical discovery. The participants were: The following is an edited transcript of the roundtable discussion.
The participants have been provided the opportunity to amend or edit their remarks. The Kavli Foundation: Anupreeta and Aprajita, where did you get the idea — along with your co-principal investigator Phil Marshall of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) at Stanford University — to put volunteers to work on identifying gravitational lenses starting back in 2013? Anupreeta More: A few years ago, Chris Lintott gave a talk on citizen science at the Kavli Institute for Cosmological Physics in Chicago, where I was working at the time. It got me thinking about a lens search by citizen scientists. Aprajita Verma: For Phil Marshall and me, Space Warps grew out of Galaxy Zoo. Soon after Galaxy Zoo launched, I started to look at some of the galaxies that were being posted on the Galaxy Zoo user forum that had potential lensed features surrounding them. This was a great by-product of the core Galaxy Zoo project. However, we realized that to find these incredibly rare sources, which are often confused with other objects, we really needed a tailored interface to efficiently find lenses. This grew into Space Warps. TKF: Chris, Galaxy Zoo itself was inspired by Stardust@home, the first astronomy-based citizen science project in which people played an active role. Until then, citizen scientists were often computer owners who offered up free processing power on their devices to aid in machine-driven data analysis. Were you concerned when you started Galaxy Zoo in 2007 that it would be hard to attract volunteers? Chris Lintott: Since Stardust@home involved people looking at images of a comet's dust grains brought back by NASA's Stardust space probe, we thought "Well, if people are willing to look at dust grains, then surely they'd be happy to look at our galaxies!" But that turned out to be almost beside the point. As we've done many of these citizen science projects over the years, we've discovered it's not the quality of the images that matters. 
After all, our galaxies aren't typically beautiful. They are not the Hubble Space Telescope shots that you’d expect to find on the front page of the New York Times. Our galaxies are often fuzzy, little, enigmatic blobs. The Space Warps images are pretty, but again they're not the kind of thing you would sell as a poster in the gift shop at the Kennedy Space Center. It's actually the ideas that get people excited. I think Space Warps and Galaxy Zoo have been successful because they have done a great job of explaining to people why we need their help. We're saying to them: "Look, if you do this simple task, it allows us to do science." This idea is best shown by Planet Hunters, a citizen science project that searches for exoplanets in data from NASA's Kepler spacecraft. Users are looking at graphs for fun. But because the idea is the discovery of exoplanets, people will put up with looking at data. TKF: What sort of unique science is made possible because of Space Warps? Verma: Gravitational lenses allow us to look at objects, such as very distant galaxies, that are too faint for the telescopes we have now, and to see them in much more detail. It's enabling the kind of science we'll be routinely doing with extremely large telescopes in the future. More: That's right. Something unique about gravitational lensing is that it acts like a natural telescope and allows us to study some really faint, distant galaxies which we wouldn't get to study otherwise. We're seeing these distant galaxies in the early stages of their life cycle, which helps us understand how galaxies evolve over time. Also, in a gravitational lens system, it's possible for us to study the properties of the foreground galaxies or galaxy groups that are gravitationally lensing the background sources. For example, we can measure the mass of these foreground galaxies and also study how mass is distributed in them. 
TKF: Space Warps and other citizen science projects flourish because computer programs sometimes struggle at identifying features in data. Why do computers have trouble spotting the characteristic arc or blobby shapes of gravitational lenses that humans can? More: The problem is that these arc-like images of distant galaxies can have very different shapes and profiles. The process of lensing magnifies these galaxies' images and can distort them. Also, these distant galaxies emit light at different wavelengths and can appear to have different colors. Furthermore, there are structures in these galaxies that can change the shape of the arcs. Verma: Also, lots of spiral galaxies have bluish spiral arms that can look like lenses. We call these objects "lens impostors" and we find many more of these false positives compared to rare, true gravitational lenses. More: All these differences make it difficult to automate the process for finding lenses. But human beings are very good at pattern recognition. The dynamic range that our eyes and our brains offer is much greater than a computer algorithm. Lintott: Another thing to bear in mind in astronomy, particularly in Space Warps, is that we're often looking for rare objects. A computer's performance depends very strongly on how many examples you have to "train" it with. When you're dealing with rare things, that's often very difficult to do. We can't assemble large collections of hundreds of thousands of examples of gravitational lenses because we don't have them yet. Also, people — unlike computers — check beyond what we are telling them to look for when they review images. One of the great Space Warps examples is the discovery of a "red ring" gravitational lens. All the example lenses on the Space Warps site are blue in color. But because we have human classifiers, they had no trouble noticing this red thing that looks a little like these blue things they've been taught to keep an eye out for. 
Humans have an ability to make intuitive leaps like that, and that's very important. Verma: I echo the point that it's very difficult to program diversity and adaptability into any computer algorithm, whereas we kind of get it for free from the citizen scientists! [Laughter] TKF: Aprajita and Anupreeta, what’s the importance of the red ring object Chris just mentioned that the Space Warps community discovered in 2014 and has nicknamed 9io9? Verma: This object was a really exciting find, and it's a classic example of something we hadn't seen before that citizen scientists quickly found. We think that inside the background galaxy there's both an active black hole, which is producing radio wave emissions, as well as regions of star-formation. They're both stretched by the lensing into these spectacular arcs. It's just a really nice example of what lensing can do. We're still putting in further observations to try and really understand what this object is like. More: In this particular case with 9io9, there is the usual, main lensing galaxy, but then there is also another, small, satellite galaxy, whose mass and gravity are also contributing to the lensing. The satellite galaxy produces visible effects on the lensed images and we can use this to study its mass distribution. There are no other methods besides gravitational lensing which can provide as accurate a mass estimate for galaxies at such great distances. TKF: Besides 9io9, citizen astrophysicists have turned up other bizarre, previously unknown phenomena. One example is Hanny’s Voorwerp, a galaxy-size gas cloud discovered in 2007 in Galaxy Zoo. More recently, in 2015, Planet Hunters spotted huge decreases in the starlight coming from a star called KIC 8462. The cause could be an eclipsing swarm of comets; another, albeit unlikely, possibility that has set off rampant speculation on the Internet is that an alien megastructure is blocking light from the star. 
Why does citizen science seemingly work so well at making completely unexpected discoveries? Lintott: I often talk about the human ability to be distracted as a good thing. If we're doing a routine task and something unusual comes along, we stop to pay attention to it. That's rather hard to develop with automated computer systems. They can look for anomalies, but in astronomy, most anomalies are boring, such as satellites crossing in front of the telescope, or the telescope's camera malfunctions. However, humans are really good at spotting interesting anomalies like Hanny's Voorwerp, which looks like either an amorphous green blob or an evil Kermit the Frog, depending on how you squint at it. [Laughter] The point is, it's something you want to pay attention to. The other great thing about citizen science is that the volunteers who find these unusual things start to investigate and become advocates for them. Citizen scientists will jump up and down and tell us professional scientists we should pay attention to something. The great Zooniverse discoveries have always been from that combination of somebody who's distracted and then asks questions about what he or she has found. TKF: Aprajita and Chris, you are both working on the Large Synoptic Survey Telescope (LSST). It will conduct the largest-ever scan of the sky starting in 2022 and should turn up tons of new gravitational lenses. Do you envision a Space Warps-style citizen science project for LSST? Verma: Citizens will play a huge role in the LSST, which is a game-changer for lensing. We know of about 500 lenses currently. LSST will find on the order of tens to hundreds of thousands of lenses. We will potentially require the skill that citizen scientists have in looking for exotic and challenging objects. Also, LSST’s dataset will have a time dimension. We're really going to make a movie of the universe, and this will turn up a number of surprises. 
I can see citizen scientists being instrumental in a lot of the discoveries LSST will make. Lintott: One thing that's challenging about LSST is the sheer size of the dataset. If you were a citizen scientist, say, who had subscribed to receive text message alerts for when objects change in the sky as LSST makes its movie of the universe, then you would end up with a couple of billion text messages a night. Obviously that would not work. So that means we need to filter the data. We'll dynamically decide whether to assign a task to a machine or to a citizen scientist, or indeed to a professional scientist. TKF: Chris, that comment reminds me of something you said to TIME magazine in 2008: "In many parts of science, we're not constrained by what data we can get, we're constrained by what we can do with the data we have. Citizen science is a very powerful way of solving that problem.” In this era of Big Data, how important do you all see citizen science being moving forward, given that computers will surely get better at visual recognition tasks? Lintott: In astronomy, if you're looking at things that are routine, like a spiral galaxy or a common type of supernova, I think the machines will take over. They will do so having been trained on the large datasets that citizen scientists will provide. But I think there will be citizen involvement for a long while and it will become more interesting as we use machines to do more of the routine work and filter the data. The tasks for citizen scientists will involve more varied things — more of the unusual, Hanny's Voorwerp-type of discoveries. Plus, a lot of unusual discoveries will need to be followed up, and I'd like to see citizen scientists get further into the process of analysis. Without them, I think we're going to end up with a pile of interesting objects which professional scientists just don't have time to deal with. 
Verma: We have already seen a huge commitment from citizen scientists, particularly those who've spent a long time on Galaxy Zoo and Space Warps. For example, on Space Warps, we have a group of people who are interested in doing gravitational lens modeling, which has long been the domain of the professional astronomer. So we know that there's an appetite there to do further analysis with the objects they’ve found. I think in the future, the citizen science community will work hand-in-hand with professional astronomers. TKF: Are there new citizen astrophysicist opportunities on the horizon related to your projects? Lintott: Galaxy Zoo has a new lease on life, actually. We just added in new galaxies from a telescope in Chile. These galaxies are relatively close and their images are beautiful. It's our first proper look at the southern sky, so we have an all-new part of the universe to explore. It gives users a chance to be the first to see galaxies — if they get over to Galaxy Zoo quickly! Verma: For Space Warps, we are expecting new data and new projects to be online next year. More: Here in Japan, we are leading an imaging survey called the Hyper Suprime-Cam (HSC) survey and it's going to be much larger and deeper than what we have been looking at so far. We expect to find more than an order of magnitude increase in the number of lenses. Currently, we are preparing images of the candidates from the HSC survey and hope to start a new lens search with Space Warps soon. TKF: Is it the thrill of discovery that entices most citizen scientist volunteers? Some of the images in Galaxy Zoo have never been seen before because they were taken by a robotic telescope and stored away. Volunteers therefore have the chance to see something no one else ever has. More: That discovery aspect is personal. I think it's always exciting for anyone. 
Lintott: When we set up Galaxy Zoo, we thought it would be a huge motivation to see something that's yours and be the first human to lay eyes on a galaxy. Exploring space in that way is something that until Galaxy Zoo only happened on "Star Trek." [Laughter] In the years since, we've also come to realize that citizen science is a collective endeavor. The people who've been through 10,000 images without finding anything have contributed to the discovery of something like the red ring galaxy just as much as the person who happens to stumble across it. You need to get rid of the empty data as well. I've been surprised by how much our volunteers believe that. It's a far cry from the traditional, public view of scientific discovery in which the lone genius makes the discovery and gets all the credit. Verma: We set out with Space Warps for citizen scientists to be part of our collaboration and they've really enabled us to produce important findings. They've inspired us with their dedication and productivity. We've learned from our analysis that basically anyone who joins Space Warps has an impact on the results. We are also especially grateful for a very dedicated, diligent group that has made most of the lens classifications. We look forward to welcoming everyone back in our future projects! Follow all of the Expert Voices issues and debates — and become part of the discussion — on Facebook, Twitter and Google+. The views expressed are those of the author and do not necessarily reflect the views of the publisher. This version of the article was originally published on Space.com. Copyright 2016 SPACE.com, a Purch company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.


Dark matter is one of the basic ingredients of the universe, and searches to detect it in laboratory-based experiments have been conducted for decades. However, to date dark matter has been observed only indirectly, via its gravitational interactions that govern the dynamics of the cosmos at all length-scales. It is expected that dark matter is made of a new, stable elementary particle that has escaped detection so far. "We expect that several tens of thousands of dark matter particles per second are passing through the area of a thumbnail," said Luca Grandi, a UChicago assistant professor in physics and a member of the Kavli Institute for Cosmological Physics. "The fact that we did not detect them yet tells us that their probability to interact with the atoms of our detector is very small, and that we need more sensitive instruments to find the rare signature of this particle." Grandi is a member of the XENON Collaboration, which consists of 21 research groups from the United States, Germany, Italy, Switzerland, Portugal, France, the Netherlands, Israel, Sweden and the United Arab Emirates. The collaboration's inauguration event took place Nov. 11 at the Laboratori Nazionali del Gran Sasso, one of the largest underground laboratories in the world. "We need to put our experiment deep underground, using about 1,400 meters of solid rock to shield it from cosmic rays," said Grandi, who participated in the inauguration along with guests from funding agencies as well as journalists and colleagues. About 80 visitors joined the ceremony at the laboratory's experimental site, which measures 110 meters long, 15 meters wide and 15 meters high. There, the new instrument is installed inside a 10-meter-diameter water shield to protect it from radioactive background radiation that originates from the environment. During introductory presentations, Elena Aprile, Columbia University professor and founder of the XENON project, illustrated the evolution of the program. 
It began with a 3-kilogram detector 15 years ago. The present-day instrument has a total mass of 3,500 kilograms. XENON1T employs the ultra-pure noble gas xenon as dark matter detection material, cooled down to –95 degrees Celsius to make it liquid. "In order to see the rare interactions of a dark matter particle in your detector, you need to build an instrument with a large mass and an extremely low radioactive background," said Grandi. "Otherwise you will have no chance to find the right events within the background signals." For this reason, the XENON scientists have carefully selected all materials used in the construction of the detector, ensuring that their intrinsic contamination with radioactive isotopes meets the low-background experiment's requirements. "One has to realize that objects without any radioactivity do not exist," Grandi explained. "Minute traces of impurities are present in everything, from simple things like metal slabs to the walls of the laboratory to the human body. We are trying to reduce and control these radioactive contaminants as much as possible." The XENON scientists measure tiny flashes of light and charge to reconstruct the position of the particle interaction within their detector, as well as the deposited energy and whether it might be induced by a dark matter particle or not. The light is observed by 248 sensitive photosensors, capable of detecting even single photons. A vacuum-insulated double-wall cryostat, resembling a gigantic version of a thermos flask, contains the cryogenic xenon and the dark matter detector. The xenon gas is cooled and purified from impurities in the three-story XENON building, an installation with a transparent glass facade next to the water shield, which allows visitors to view the scientists inside. A gigantic stainless-steel sphere equipped with pipes and valves is installed on the ground floor. "It can accommodate 7.6 tons of xenon in liquid and gaseous form," said Aprile. 
"This is more than two times the capacity we need for XENON1T, as we want to be prepared to swiftly increase the sensitivity of the experiment with a larger mass detector in the near future." Once fully operational, XENON1T will be the most sensitive dark matter experiment in the world. Grandi's group has been deeply involved in the preparation and assembly of the xenon Time Projection Chamber, the core of the detector. His group is also in charge for the development of the U.S. computing center for XENON1T data analysis via the UChicago Research Computing Center, directed by Birali Runesha, in close cooperation with Robert Gardner and his team at the Computation Institute. In addition to Columbia's Aprile, leading the other six U.S. institutions are Ethan Brown, Rensselaer Polytechnic Institute; Petr Chaguine, Rice University; Rafael Lang, Purdue University; Kaixuan Ni, University of California, San Diego; and Hanguo Wang, University of California, Los Angeles. XEON1T's first results are expected in early 2016. The collaboration expects the instrument to achieve most of its objectives within two years of data collection. The researchers then will move their project into a new phase. "Of course we want to detect the dark matter particle," Grandi said, "but even if we have only found some hints after two years, we are in an excellent position to move on as we are already now preparing the next step of the project, which will be the far more sensitive XENONnT."


Schmidt F.,California Institute of Technology | Rozo E.,University of Chicago | Rozo E.,Kavli Institute for Cosmological Physics
Astrophysical Journal | Year: 2011

Large catalogs of shear-selected peaks have recently become a reality. In order to properly interpret the abundance and properties of these peaks, it is necessary to take into account the effects of the clustering of source galaxies, among themselves and with the lens. In addition, the preferred selection of magnified galaxies in a flux- and size-limited sample leads to fluctuations in the apparent source density that correlate with the lensing field. In this paper, we investigate these issues for two different choices of shear estimators that are commonly in use today: globally normalized and locally normalized estimators. While in principle equivalent, in practice these estimators respond differently to systematic effects such as magnification and cluster member dilution. Furthermore, we find that the answer to the question of which estimator is statistically superior depends on the specific shape of the filter employed for peak finding; suboptimal choices of the estimator+filter combination can result in a suppression of the number of high peaks by orders of magnitude. Magnification and size bias generally act to increase the signal-to-noise ν of shear peaks; for high peaks the boost can be as large as Δν ≈ 1-2. Due to the steepness of the peak abundance function, these boosts can result in a significant increase in the observed abundance of shear peaks. A companion paper investigates these same issues within the context of stacked weak-lensing mass estimates. © 2011. The American Astronomical Society. All rights reserved.


Dodelson S.,Fermi National Accelerator Laboratory | Dodelson S.,University of Chicago | Dodelson S.,Kavli Institute for Cosmological Physics
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2010

One of the most promising ways of detecting primordial gravitational waves generated during inflation is to observe B-modes of polarization, generated by Thomson scattering after reionization, in the cosmic microwave background. Large scale foregrounds though are expected to be a major systematic issue, so-in the event of a tentative detection-an independent confirmation of large scale gravitational waves would be almost essential. Previous authors have suggested searching for the analogous mode of cosmic shear in weak lensing surveys but have shown that the signal to noise of this mode is marginal at best. This argument is reconsidered here, accounting for the cross correlations of the polarization and lensing B-modes. A lensing survey can potentially strengthen the argument for a detection of primordial gravitational waves, although it is unlikely to help constrain the amplitude of the signal. © 2010 The American Physical Society.


News Article | February 16, 2017
Site: www.scientificcomputing.com

The next generation of supercomputers will help researchers tackle increasingly complex problems through modeling large-scale systems, such as nuclear reactors or global climate, and simulating complex phenomena, such as the chemistry of molecular interactions. In order to be successful, these systems must be able to carry out vast numbers of calculations at extreme speeds, reliably store enormous amounts of information and be able to quickly deliver this information with minimal errors. To create such a system, computer designers have to first find ways to overcome limitations in existing high-performance computing systems and develop, design and optimize new software and hardware technologies to operate at exascale. ‘Exascale’ refers to high-performance computing systems capable of at least a billion billion calculations per second, about 50 times faster than the nation's most powerful supercomputers in use today. Computational scientists aim to use these systems to generate new insights and accelerate discoveries in materials science, precision medicine, national security and numerous other fields. As collaborators in four co-design centers created by the U.S. Department of Energy’s (DOE) Exascale Computing Project (ECP), researchers at the DOE's Argonne National Laboratory are helping to solve some of these complex challenges and pave the way for the creation of exascale supercomputers. The term ‘co-design’ describes the integrated development and evolution of hardware technologies, computational applications and associated software. In pursuit of ECP’s mission to help people solve realistic application problems through exascale computing, each co-design center targets different features and challenges relating to exascale computing. 
Co-design Center for Online Data Analysis and Reduction at the Exascale (CODAR) Ian Foster, a University of Chicago professor and Argonne Distinguished Fellow, is leading a co-design center on a mission to strengthen and optimize processes for data analysis and reduction for the exascale. “Exascale systems will be 50 times faster than existing systems, but it would be too expensive to build out storage that would be 50 times faster as well," he said. "This means we no longer have the option to write out more data and store all of it. And if we can’t change that, then something else needs to change." Foster and other researchers in CODAR are working to overcome the gap between computation speed and the limitations in the speed and capacity of storage by developing smarter, more selective ways of reducing data without losing important information. There are many powerful techniques for doing data reduction, and CODAR researchers are studying various approaches. One such approach, lossy compression, is a method whereby unnecessary or redundant information is removed to reduce overall data size. This technique is what’s used to transform the detail-rich images captured on our phone camera sensors into JPEG files, which are small in size. While data is lost in the process, the most important information ― the amount needed for our eyes to interpret the images clearly ― is maintained, and as a result, we can store hundreds more photos on our devices. “The same thing happens when data compression is used as a technique for scientific data reduction. The important difference here is that scientific users need to precisely control and check the accuracy of the compressed data with respect to their specific needs,” said Argonne computer scientist Franck Cappello, who is leading the data reduction team for CODAR. Other data reduction techniques include use of summary statistics and feature extraction. 
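Cappello's point about user-controlled accuracy can be made concrete. The sketch below is not CODAR code; it is a minimal Python illustration of the error-bounded quantization idea behind scientific lossy compressors (the function names and the 0.01 bound are our own choices): values are snapped to a uniform grid whose spacing guarantees that no reconstructed value differs from the original by more than the chosen bound.

```python
import numpy as np

def compress(data, abs_error):
    """Quantize values onto a grid of spacing 2*abs_error.

    Every reconstructed value is guaranteed to lie within
    abs_error of the original, which is the precise, checkable
    accuracy control the text describes.
    """
    step = 2.0 * abs_error
    codes = np.round(data / step).astype(np.int64)
    return codes, step

def decompress(codes, step):
    """Map integer codes back to representative values."""
    return codes * step

# A noisy signal: the integer codes take far fewer distinct values
# than the raw floats, so a downstream entropy coder shrinks them well.
rng = np.random.default_rng(0)
data = np.sin(np.linspace(0.0, 10.0, 1000)) + 0.01 * rng.standard_normal(1000)

codes, step = compress(data, abs_error=0.01)
restored = decompress(codes, step)

print("worst-case error:", np.max(np.abs(restored - data)))
print("distinct codes:", len(np.unique(codes)), "vs distinct floats:", len(np.unique(data)))
```

Unlike the JPEG analogy, where "good enough for the eye" is the criterion, here the error bound is explicit and can be verified after the fact, which is what scientific users need.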
Center for Efficient Exascale Discretizations (CEED) CEED is working to improve another feature for exascale computing ― how applications create computer models. More specifically, they’re looking at the process of discretization, in which the physics of the problem is represented as a finite number of grid points that represent the model of the system. “Determining the best layout of the grid points and representation of the model is important for rapid simulation,” said computational scientist Misun Min, the Argonne lead in CEED. Discretization is important for computer modeling and simulation because the process enables researchers to numerically represent physical systems, like nuclear reactors, combustion engines or climate systems. How researchers discretize the systems they're studying affects the amount and speed of computation at exascale. CEED is focused particularly on high-order discretizations that require relatively few grid points to accurately represent physical systems. “Our aim is to enable more efficient discretization while still maintaining a high level of accuracy for the researcher. Greater efficiency will help minimize the number of calculations needed, which would in turn reduce the overall size of computation, and also enable relatively fast relay of information,” said Paul Fischer, a professor at the University of Illinois at Urbana-Champaign and Argonne computational scientist involved in CEED. Co-design Center for Particle Applications (CoPA) Researchers behind CoPA are studying methods that model natural phenomena using particles, such as molecules, electrons or atoms. In high-performance computing, researchers can represent systems via discrete particles or smooth entities such as electromagnetic waves or sound waves; or they can combine the two techniques. Particle methods span a wide range of application areas, including materials science, chemistry, cosmology, molecular dynamics and turbulent flows. 
When using particle methods, researchers characterize the interactions of particles with other particles and with their environment in terms of short-range and long-range interactions. "The idea behind the co-design center is that, instead of everyone bringing their own specialized methods, we identify a set of building blocks, and then find the right way to deal with the common problems associated with these methods on the new supercomputers," said Salman Habib, the Argonne lead in CoPA and a senior member of the Kavli Institute for Cosmological Physics at the University of Chicago. "Argonne’s collaboration in this effort is in methods for long-range particle interactions as well as speeding up codes for short-range interactions; we work hard on what is needed to make codes run fast," he said. Block-Structured AMR Co-design Center The Block-structured AMR Co-design Center focuses on making computation more efficient using a technique known as adaptive mesh refinement, or AMR. AMR allows an application to achieve a higher level of precision at specific points or locations of interest within the computational domain and lower levels of precision elsewhere. In other words, AMR helps to focus the computing power where it is most effective to get the most precise calculations at the lowest cost. "Without AMR, calculations would require so much more resources and time," said Anshu Dubey, the Argonne lead in the Block-Structured AMR Center and a fellow of the Computation Institute. "AMR helps researchers to focus the computational resources on features of interest in their applications while enabling efficiency in computing." AMR is already used in applications such as combustion, astrophysics and cosmology; now researchers in the Block-Structured AMR co-design center are focused on enhancing and augmenting it for future exascale platforms.
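The refinement idea can be illustrated with a toy sketch (ours, not the center's code, and one-dimensional rather than the block-structured meshes the center targets): cells are split wherever a function changes rapidly, so resolution concentrates around a steep front while the rest of the domain stays coarse.

```python
import numpy as np

def refine(cells, f, tol, max_levels=5):
    """Toy 1-D adaptive refinement: split any cell whose jump in f
    across the cell exceeds tol; repeat up to max_levels times."""
    for _ in range(max_levels):
        new_cells = []
        for (a, b) in cells:
            if abs(f(b) - f(a)) > tol:
                mid = 0.5 * (a + b)
                new_cells += [(a, mid), (mid, b)]   # refine here
            else:
                new_cells.append((a, b))            # keep coarse
        if new_cells == cells:                      # nothing changed
            break
        cells = new_cells
    return cells

# A steep front near x = 0.5: cells cluster there, stay coarse elsewhere.
f = lambda x: np.tanh(50.0 * (x - 0.5))
coarse = [(i / 8, (i + 1) / 8) for i in range(8)]
mesh = refine(coarse, f, tol=0.1)

widths = [b - a for (a, b) in mesh]
print(len(mesh), "cells; smallest width", min(widths), "; largest width", max(widths))
```

A uniform grid at the finest width produced here would need hundreds of cells; the adaptive mesh spends them only near the front, which is the cost saving Dubey describes.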


Rozo E.,Kavli Institute for Cosmological Physics | Rozo E.,University of Chicago | Vikhlinin A.,Harvard - Smithsonian Center for Astrophysics | More S.,Kavli Institute for Cosmological Physics
Astrophysical Journal | Year: 2012

Sunyaev-Zeldovich (SZ) cluster surveys, such as Planck, the South Pole Telescope, and the Atacama Cosmology Telescope, will soon be publishing several hundred SZ-selected systems. The key ingredient required to transport the mass calibration from current X-ray-selected cluster samples to these SZ systems is the Y_SZ-Y_X scaling relation. We constrain the amplitude, slope, and scatter of the Y_SZ-Y_X scaling relation using SZ data from Planck and X-ray data from Chandra. We find a best-fit amplitude of ln(D_A^2 Y_SZ / C Y_X) = -0.202 ± 0.024 at the pivot point C Y_X = 8 × 10^-5 Mpc^2. This corresponds to a Y_SZ/Y_X ratio of 0.82 ± 0.024, in good agreement with X-ray expectations after including the effects of gas clumping. The slope of the relation is α = 0.916 ± 0.032, consistent with unity at ≈ 2.3σ. We are unable to detect intrinsic scatter, and find no evidence that the scaling relation depends on cluster dynamical state. © 2012. The American Astronomical Society. All rights reserved.


Mortonson M.J.,Ohio State University | Hu W.,Kavli Institute for Cosmological Physics
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2010

The recent detection of secondary CMB anisotropy by the South Pole Telescope limits temperature fluctuations from the optical depth-modulated Doppler effect to T_3000 < √13 μK at multipoles ℓ ∼ 3000. This bound is the first empirical constraint on optical depth fluctuations at arcminute scales, τ_3000 = 0.001 T_3000/μK, implying that these fluctuations are no more than a few percent of the mean. Modulation of the quadrupole source to polarization generates B modes that are bounded as B_3000 = 0.003 T_3000. The maximal extrapolation to the ℓ ∼ 100 gravitational wave regime yields B_100 = 0.1 T_3000 and remains in excess of gravitational lensing if the comoving size of ionized regions is R ≳ 80 Mpc. If patchy reionization produces much of the observed arcminute-scale temperature fluctuations, current bounds on B_100 already require R ≲ 200 Mpc, and limits on both T_3000 and B_100 can be expected to improve rapidly. © 2010 The American Physical Society.
