Yukawa Institute for Theoretical Physics

Kyoto, Japan


News Article | May 10, 2017
Site: www.scientificamerican.com

The origins of space and time are among the most mysterious and contentious topics in science. Our February 2017 article “Pop Goes the Universe” argues against the dominant idea that the early cosmos underwent an extremely rapid expansion called inflation. Its authors instead advocate for another scenario—that our universe began not with a bang but with a bounce from a previously contracting cosmos. In the letter below, a group of 33 physicists who study inflationary cosmology respond to that article. It is followed by a reply from the authors (an extended version of their reply can be found here).

In “Pop Goes the Universe,” by Anna Ijjas, Paul J. Steinhardt and Abraham Loeb, the authors (hereafter “IS&L”) make the case for a bouncing cosmology, as was proposed by Steinhardt and others in 2001. They close by making the extraordinary claim that inflationary cosmology “cannot be evaluated using the scientific method” and go on to assert that some scientists who accept inflation have proposed “discarding one of [science’s] defining properties: empirical testability,” thereby “promoting the idea of some kind of nonempirical science.” We have no idea what scientists they are referring to. We disagree with a number of statements in their article, but in this letter, we will focus on our categorical disagreement with these statements about the testability of inflation.

There is no disputing the fact that inflation has become the dominant paradigm in cosmology. Many scientists from around the world have been hard at work for years investigating models of cosmic inflation and comparing these predictions with empirical observations. According to the high-energy physics database INSPIRE, there are now more than 14,000 papers in the scientific literature, written by over 9,000 distinct scientists, that use the word “inflation” or “inflationary” in their titles or abstracts.
By claiming that inflationary cosmology lies outside the scientific method, IS&L are dismissing the research of not only all the authors of this letter but also that of a substantial contingent of the scientific community. Moreover, as the work of several major, international collaborations has made clear, inflation is not only testable, but it has been subjected to a significant number of tests and so far has passed every one.

Inflation is not a unique theory but rather a class of models based on similar principles. Of course, nobody believes that all these models are correct, so the relevant question is whether there exists at least one model of inflation that seems well motivated, in terms of the underlying particle physics assumptions, and that correctly describes the measurable properties of our universe. This is very similar to the early steps in the development of the Standard Model of particle physics, when a variety of quantum field theory models were explored in search of one that fit all the experiments. Although there is in principle a wide space of inflationary models to examine, there is a very simple class of inflationary models (technically, “single-field slow-roll” models) that all give very similar predictions for most observable quantities—predictions that were clearly enunciated decades ago. These “standard” inflationary models form a well-defined class that has been studied extensively. (IS&L have expressed strong opinions about what they consider to be the simplest models within this class, but simplicity is subjective, and we see no reason to restrict attention to such a narrow subclass.) Some of the standard inflationary models have now been ruled out by precise empirical data, and this is part of the desirable process of using observation to thin out the set of viable models. But many models in this class continue to be very successful empirically.
The standard inflationary models predict that the universe should have a critical mass density (that is, it should be geometrically flat), and they also predict the statistical properties of the faint ripples that we detect in the cosmic microwave background (CMB). First, the ripples should be nearly “scale-invariant,” meaning that they have nearly the same intensity at all angular scales. Second, the ripples should be “adiabatic,” meaning that the perturbations are the same in all components: the ordinary matter, radiation and dark matter all fluctuate together. Third, they should be “Gaussian,” which is a statement about the statistical patterns of relatively bright and dark regions. Fourth and finally, the models also make predictions for the patterns of polarization in the CMB, which can be divided into two classes, called E-modes and B-modes. The predictions for the E-modes are very similar for all standard inflationary models, whereas the levels of B-modes, which are a measure of gravitational radiation in the early universe, vary significantly within the class of standard models.

The remarkable fact is that, starting with the results of the Cosmic Background Explorer (COBE) satellite in 1992, numerous experiments have confirmed that these predictions (along with several others too technical to discuss here) accurately describe our universe. The average mass density of the universe has now been measured to an accuracy of about half of a percent, and it agrees perfectly with the prediction of inflation. (When inflation was first proposed, the average mass density was uncertain by at least a factor of three, so this is an impressive success.)
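For orientation, “nearly scale-invariant” has a standard quantitative form, which we spell out here (the parametrization below is the conventional one used in CMB analyses, not text from the letter itself):

```latex
% Primordial curvature power spectrum, with amplitude A_s, pivot scale k_*,
% and spectral index n_s; exact scale invariance corresponds to n_s = 1.
\[
  \mathcal{P}_{\mathcal{R}}(k) \;=\; A_s \left(\frac{k}{k_*}\right)^{n_s - 1}
\]
% Single-field slow-roll models predict n_s slightly below 1;
% Planck-era measurements give n_s \approx 0.965.
```

The small measured tilt away from exact scale invariance ($n_s = 1$) is itself a characteristic prediction of slow-roll inflation.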
The ripples of the CMB have been measured carefully by two more satellite experiments, the Wilkinson Microwave Anisotropy Probe (WMAP) and the Planck satellite, as well as many ground- and balloon-based experiments—all confirming that the primordial fluctuations are indeed nearly scale-invariant and very accurately adiabatic and Gaussian, precisely as predicted (ahead of time) by standard models of inflation. The B-modes of polarization have not yet been seen, which is consistent with many, though not all, of the standard models, and the E-modes are found to agree with the predictions. In 2016 the Planck satellite team (a collaboration of about 260 authors) summarized its conclusions by saying that “the Planck results offer powerful evidence in favour of simple inflationary models.” So if inflation is untestable, as IS&L would have us believe, why have there been so many tests of it and with such remarkable success?

While the successes of inflationary models are unmistakable, IS&L nonetheless make the claim that inflation is untestable. (We are bewildered by IS&L’s assertion that the dramatic observational successes of inflation should be discounted while they accuse the advocates of inflation of abandoning empirical science!) They contend, for example, that inflation is untestable because its predictions can be changed by varying the shape of the inflationary energy density curve or the initial conditions. But the testability of a theory in no way requires that all its predictions be independent of the choice of parameters. If such parameter independence were required, then we would also have to question the status of the Standard Model, with its empirically determined particle content and 19 or more empirically determined parameters. An important point is that standard inflationary models could have failed any of the empirical tests described above, but they did not.
IS&L write about how “a failing theory gets increasingly immunized against experiment by attempts to patch it,” insinuating that this has something to do with inflation. But despite IS&L’s rhetoric, it is standard practice in empirical science to modify a theory as new data come to light, as, for example, the Standard Model has been modified to account for newly discovered quarks and leptons. For inflationary cosmology, meanwhile, there has so far been no need to go beyond the class of standard inflationary models.

IS&L also assert that inflation is untestable because it leads to eternal inflation and a multiverse. Yet although the possibility of a multiverse is an active area of study, this possibility in no way interferes with the empirical testability of inflation. If the multiverse picture is valid, then the Standard Model would be properly understood as a description of the physics in our visible universe, and similarly the models of inflation that are being refined by current observations would describe the ways inflation can happen in our particular part of the universe. Both theories would remain squarely within the domain of empirical science. Scientists would still be able to compare newly obtained data—from astrophysical observations and particle physics experiments—with precise, quantitative predictions of specific inflationary and particle physics models. Note that this issue is separate from the loftier goal of developing a theoretical framework that can predict, without the use of observational data, the specific models of particle physics and inflation that should be expected to describe our visible universe. Like any scientific theory, inflation need not address all conceivable questions. Inflationary models, like all scientific theories, rest on a set of assumptions, and to understand those assumptions we might need to appeal to some deeper theory. This, however, does not undermine the success of inflationary models.
The situation is similar to the standard hot big bang cosmology: the fact that it left several questions unresolved, such as the near-critical mass density and the origin of structure (which are solved elegantly by inflation), does not undermine its many successful predictions, including its prediction of the relative abundances of light chemical elements. The fact that our knowledge of the universe is still incomplete is absolutely no reason to ignore the impressive empirical success of the standard inflationary models.

During the more than 35 years of its existence, inflationary theory has gradually become the main cosmological paradigm describing the early stages of the evolution of the universe and the formation of its large-scale structure. No one claims that inflation has become certain; scientific theories don’t get proved the way mathematical theorems do, but as time passes, the successful ones become better and better established by improved experimental tests and theoretical advances. This has happened with inflation. Progress continues, supported by the enthusiastic efforts of many scientists who have chosen to participate in this vibrant branch of cosmology. Empirical science is alive and well!

Alan H. Guth
Victor F. Weisskopf Professor of Physics, Massachusetts Institute of Technology
http://web.mit.edu/physics/people/faculty/guth_alan.html

David I. Kaiser
Germeshausen Professor of the History of Science and Professor of Physics, Massachusetts Institute of Technology
http://web.mit.edu/physics/people/faculty/kaiser_david.html

Andrei D. Linde
Harald Trap Friis Professor of Physics, Stanford University
https://physics.stanford.edu/people/faculty/andrei-linde

Yasunori Nomura
Professor of Physics and Director, Berkeley Center for Theoretical Physics, University of California, Berkeley
http://physics.berkeley.edu/people/faculty/yasunori-nomura

Charles L. Bennett
Bloomberg Distinguished Professor and Alumni Centennial Professor of Physics and Astronomy, Johns Hopkins University
Principal Investigator, Wilkinson Microwave Anisotropy Probe (WMAP) mission
Deputy Principal Investigator and Science Working Group member, Cosmic Background Explorer (COBE) mission
http://physics-astronomy.jhu.edu/directory/charles-l-bennett/

J. Richard Bond
University Professor, University of Toronto and Director, Canadian Institute for Advanced Research Cosmology and Gravity Program, Canadian Institute for Theoretical Astrophysics
Member of the Planck collaboration
http://www.cita.utoronto.ca/~bond/

François Bouchet
Director of Research, Institut d’Astrophysique de Paris, CNRS and Sorbonne Université-UPMC
Deputy Principal Investigator, Planck satellite HFI (High Frequency Instrument) Consortium and Member, Planck Science Team
http://savoirs.ens.fr/conferencier.php?id=145

Sean Carroll
Research Professor of Physics, California Institute of Technology
http://www.astro.caltech.edu/people/faculty/Sean_Carroll.html

George Efstathiou
Professor of Astrophysics, Kavli Institute for Cosmology, University of Cambridge
Member, Planck Science Team
http://www.ast.cam.ac.uk/~gpe/

Stephen Hawking
Lucasian Professor of Mathematics (Emeritus) and Dennis Stanton Avery and Sally Tsui Wong-Avery Director of Research, Department of Applied Mathematics and Theoretical Physics, University of Cambridge
http://www.damtp.cam.ac.uk/people/s.w.hawking/

Renata Kallosh
Professor of Physics, Stanford University
https://physics.stanford.edu/people/faculty/renata-kallosh

Eiichiro Komatsu
Director of the Department of Physical Cosmology, Max-Planck-Institut für Astrophysik, Garching
Member, Wilkinson Microwave Anisotropy Probe (WMAP) collaboration
http://wwwmpa.mpa-garching.mpg.de/~komatsu/

Lawrence Krauss
Foundation Professor in the School of Earth and Space Exploration and Department of Physics, and Director, The Origins Project at Arizona State University
http://krauss.faculty.asu.edu

David H. Lyth
Professor of Physics (Emeritus), Lancaster University
http://www.lancaster.ac.uk/physics/about-us/people/david-lyth

Juan Maldacena
Carl P. Feinberg Professor of Physics, Institute for Advanced Study
https://www.sns.ias.edu/malda

John C. Mather
Senior Astrophysicist and Goddard Fellow, NASA Goddard Space Flight Center and recipient of the Nobel Prize in Physics (2006)
Project Scientist, Cosmic Background Explorer (COBE) mission and Senior Project Scientist, James Webb Space Telescope
https://science.gsfc.nasa.gov/sed/bio/john.c.mather

Hiranya Peiris
Professor of Astrophysics, University College London and Director, Oskar Klein Centre for Cosmoparticle Physics, Stockholm
Member, Wilkinson Microwave Anisotropy Probe (WMAP) collaboration and Planck collaboration
http://zuserver2.star.ucl.ac.uk/~hiranya/

Malcolm Perry
Professor of Theoretical Physics, University of Cambridge
http://www.damtp.cam.ac.uk/people/m.j.perry/

Lisa Randall
Frank B. Baird, Jr., Professor of Science, Department of Physics, Harvard University
https://www.physics.harvard.edu/people/facpages/randall

Martin Rees
Astronomer Royal of Great Britain, former President of the Royal Society of London, and Professor (Emeritus) of Cosmology and Astrophysics, University of Cambridge
http://www.ast.cam.ac.uk/~mjr/

Misao Sasaki
Professor, Yukawa Institute for Theoretical Physics, Kyoto University
http://www2.yukawa.kyoto-u.ac.jp/~misao.sasaki/

Leonardo Senatore
Associate Professor of Physics, Stanford University
https://physics.stanford.edu/people/faculty/leonardo-senatore

Eva Silverstein
Professor of Physics, Stanford University
https://physics.stanford.edu/people/faculty/eva-silverstein

George F. Smoot III
Professor of Physics (Emeritus), Founding Director, Berkeley Center for Cosmological Physics, and recipient of the Nobel Prize in Physics (2006)
Principal Investigator, Cosmic Background Explorer (COBE) mission
http://physics.berkeley.edu/people/faculty/george-smoot-iii

Alexei Starobinsky
Principal Researcher, Landau Institute for Theoretical Physics, Moscow
http://www.itp.ac.ru/en/persons/starobinsky-aleksei-aleksandrovich/

Leonard Susskind
Felix Bloch Professor of Physics and Wells Family Director, Stanford Institute for Theoretical Physics, Stanford University
https://physics.stanford.edu/people/faculty/leonard-susskind

Michael S. Turner
Bruce V. Rauner Distinguished Service Professor, Department of Astronomy and Astrophysics and Department of Physics, University of Chicago
https://astro.uchicago.edu/people/michael-s-turner.php

Alexander Vilenkin
L. and J. Bernstein Professor of Evolutionary Science and Director, Institute of Cosmology, Tufts University
http://cosmos2.phy.tufts.edu/vilenkin.html

Steven Weinberg
Jack S. Josey-Welch Foundation Chair and Regental Professor and Director, Theory Research Group, Department of Physics, University of Texas at Austin, and recipient of the Nobel Prize in Physics (1979)
https://web2.ph.utexas.edu/~weintech/weinberg.html

Rainer Weiss
Professor of Physics (Emeritus), Massachusetts Institute of Technology
Chair, Science Working Group, Cosmic Background Explorer (COBE) mission
Co-Founder, Laser Interferometric Gravitational-wave Observatory (LIGO)
http://web.mit.edu/physics/people/faculty/weiss_rainer.html

Frank Wilczek
Herman Feshbach Professor of Physics, Massachusetts Institute of Technology, and recipient of the Nobel Prize in Physics (2004)
http://web.mit.edu/physics/people/faculty/wilczek_frank.html

Edward Witten
Charles Simonyi Professor of Physics, Institute for Advanced Study and recipient of the Fields Medal (1990)
https://www.sns.ias.edu/witten

Matias Zaldarriaga
Professor of Astrophysics, Institute for Advanced Study
https://www.sns.ias.edu/matiasz

THE AUTHORS REPLY: We have great respect for the scientists who signed the rebuttal to our article, but we are disappointed by their response, which misses our key point: the differences between the inflationary theory once thought to be possible and the theory as understood today. The claim that inflation has been confirmed refers to the outdated theory before we understood its fundamental problems.
We firmly believe that in a healthy scientific community, respectful disagreement is possible and hence reject the suggestion that by pointing out problems, we are discarding the work of all of those who developed the theory of inflation and enabled precise measurements of the universe.

Historically, the thinking about inflation was based on a series of misunderstandings. It was not understood that the outcome of inflation is highly sensitive to initial conditions. And it was not understood that inflation generically leads to eternal inflation and, consequently, a multiverse—an infinite diversity of outcomes. Papers claiming that inflation predicts this or that ignore these problems. Our point is that we should be talking about the contemporary version of inflation, warts and all, not some defunct relic.

Logically, if the outcome of inflation is highly sensitive to initial conditions that are not yet understood, as the respondents concede, the outcome cannot be determined. And if inflation produces a multiverse in which, to quote a previous statement from one of the responding authors (Guth), “anything that can happen will happen”—it makes no sense whatsoever to talk about predictions. Unlike the Standard Model, even after fixing all the parameters, any inflationary model gives an infinite diversity of outcomes with none preferred over any other. This makes inflation immune from any observational test. For more details, see our 2014 paper “Inflationary Schism” (preprint available at https://arxiv.org/abs/1402.6980).

We are three independent thinkers representing different generations of scientists. Our article was not intended to revisit old debates but to discuss the implications of recent observations and to point out unresolved issues that present opportunities for a new generation of young cosmologists to make a lasting impact. We hope readers will go back and review our article’s concluding paragraphs.
We advocated against invoking authority and for open recognition of the shortcomings of current concepts, a reinvigorated effort to resolve these problems and an open-minded exploration of diverse ideas that avoid them altogether. We stand by these principles.

Bombin H., Yukawa Institute for Theoretical Physics
Physical Review X | Year: 2016

Fault-tolerant quantum computation techniques rely on weakly correlated noise. Here, I show that it is enough to assume weak spatial correlations: time correlations can take any form. In particular, single-shot error correction techniques exhibit a noise threshold for quantum memories under spatially local stochastic noise.

De Felice A., Institute for Fundamental Study | De Felice A., Yukawa Institute for Theoretical Physics | Nakamura T., Kyoto University | Tanaka T., Yukawa Institute for Theoretical Physics
Progress of Theoretical and Experimental Physics | Year: 2014

We discuss graviton oscillations based on the ghost-free bi-gravity theory. We point out that this theory possesses a natural cosmological background solution that is very close to the case of general relativity. Furthermore, the interesting parameter range of the graviton mass, which can be explored by observations of gravitational waves, is not at all excluded by the constraint from solar system tests. Therefore, a graviton oscillation with a possible inverse chirp signal would be an interesting scientific target for KAGRA, Advanced LIGO, Advanced Virgo, and GEO. © The Author(s) 2014.

Chernykh M., TU Darmstadt | Feldmeier H., Helmholtz Center for Heavy Ion Research | Feldmeier H., Yukawa Institute for Theoretical Physics | Neff T., Helmholtz Center for Heavy Ion Research | And 2 more authors.
Physical Review Letters | Year: 2010

The pair decay width of the first excited 0⁺ state in ¹²C (the Hoyle state) is deduced from a novel analysis of the world data on inelastic electron scattering covering a wide momentum transfer range, thereby resolving previous discrepancies. The extracted value Γπ = (62.3 ± 2.0) μeV is independently confirmed by new data at low momentum transfers measured at the S-DALINAC and reduces the uncertainty of the literature values by more than a factor of 3. A precise knowledge of Γπ is mandatory for quantitative studies of some key issues in the modeling of supernovae and of asymptotic giant branch stars, the most likely site of the slow-neutron nucleosynthesis process. © 2010 The American Physical Society.

News Article | November 18, 2015
Site: www.nature.com

In early 2009, determined to make the most of his first sabbatical from teaching, Mark Van Raamsdonk decided to tackle one of the deepest mysteries in physics: the relationship between quantum mechanics and gravity. After a year of work and consultation with colleagues, he submitted a paper on the topic to the Journal of High Energy Physics. In April 2010, the journal sent him a rejection — with a referee’s report implying that Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, was a crackpot. His next submission, to General Relativity and Gravitation, fared little better: the referee’s report was scathing, and the journal’s editor asked for a complete rewrite. But by then, Van Raamsdonk had entered a shorter version of the paper into a prestigious annual essay contest run by the Gravity Research Foundation in Wellesley, Massachusetts. Not only did he win first prize, but he also got to savour a particularly satisfying irony: the honour included guaranteed publication in General Relativity and Gravitation. The journal published the shorter essay1 in June 2010.

Still, the editors had good reason to be cautious. A successful unification of quantum mechanics and gravity has eluded physicists for nearly a century. Quantum mechanics governs the world of the small — the weird realm in which an atom or particle can be in many places at the same time, and can simultaneously spin both clockwise and anticlockwise. Gravity governs the Universe at large — from the fall of an apple to the motion of planets, stars and galaxies — and is described by Albert Einstein’s general theory of relativity, announced 100 years ago this month. The theory holds that gravity is geometry: particles are deflected when they pass near a massive object not because they feel a force, said Einstein, but because space and time around the object are curved.
Both theories have been abundantly verified through experiment, yet the realities they describe seem utterly incompatible. And from the editors’ standpoint, Van Raamsdonk’s approach to resolving this incompatibility was strange. All that’s needed, he asserted, is ‘entanglement’: the phenomenon that many physicists believe to be the ultimate in quantum weirdness. Entanglement lets the measurement of one particle instantaneously determine the state of a partner particle, no matter how far away it may be — even on the other side of the Milky Way. Einstein loathed the idea of entanglement, and famously derided it as “spooky action at a distance”. But it is central to quantum theory. And Van Raamsdonk, drawing on work by like-minded physicists going back more than a decade, argued for the ultimate irony — that, despite Einstein’s objections, entanglement might be the basis of geometry, and thus of Einstein’s geometric theory of gravity. “Space-time,” he says, “is just a geometrical picture of how stuff in the quantum system is entangled.”

This idea is a long way from being proved, and is hardly a complete theory of quantum gravity. But independent studies have reached much the same conclusion, drawing intense interest from major theorists. A small industry of physicists is now working to expand the geometry–entanglement relationship, using all the modern tools developed for quantum computing and quantum information theory. “I would not hesitate for a minute,” says physicist Bartłomiej Czech of Stanford University in California, “to call the connections between quantum theory and gravity that have emerged in the last ten years revolutionary.”

Much of this work rests on a discovery2 announced in 1997 by physicist Juan Maldacena, now at the Institute for Advanced Study in Princeton, New Jersey. Maldacena’s research had led him to consider the relationship between two seemingly different model universes. One is a cosmos similar to our own.
Although it neither expands nor contracts, it has three dimensions, is filled with quantum particles and obeys Einstein’s equations of gravity. Known as anti-de Sitter space (AdS), it is commonly referred to as the bulk. The other model is also filled with elementary particles, but it has one dimension fewer and doesn’t recognize gravity. Commonly known as the boundary, it is a mathematically defined membrane that lies an infinite distance from any given point in the bulk, yet completely encloses it, much like the 2D surface of a balloon enclosing a 3D volume of air. The boundary particles obey the equations of a quantum system known as conformal field theory (CFT).

Maldacena discovered that the boundary and the bulk are completely equivalent. Like the 2D circuitry of a computer chip that encodes the 3D imagery of a computer game, the relatively simple, gravity-free equations that prevail on the boundary contain the same information and describe the same physics as the more complex equations that rule the bulk. “It’s kind of a miraculous thing,” says Van Raamsdonk. Suddenly, he says, Maldacena’s duality gave physicists a way to think about quantum gravity in the bulk without thinking about gravity at all: they just had to look at the equivalent quantum state on the boundary. And in the years since, so many have rushed to explore this idea that Maldacena’s paper is now one of the most highly cited articles in physics.

Among the enthusiasts was Van Raamsdonk, who started his sabbatical by pondering one of the central unsolved questions posed by Maldacena’s discovery: exactly how does a quantum field on the boundary produce gravity in the bulk? There had already been hints3 that the answer might involve some sort of relation between geometry and entanglement. But it was unclear how significant these hints were: all the earlier work on this idea had dealt with special cases, such as a bulk universe that contained a black hole.
So Van Raamsdonk decided to settle the matter, and work out whether the relationship was true in general, or was just a mathematical oddity. He first considered an empty bulk universe, which corresponded to a single quantum field on the boundary. This field, and the quantum relationships that tied various parts of it together, contained the only entanglement in the system. But now, Van Raamsdonk wondered, what would happen to the bulk universe if that boundary entanglement were removed? He was able to answer that question using mathematical tools4 introduced in 2006 by Shinsei Ryu, now at the University of Illinois at Urbana–Champaign, and Tadashi Takayanagi, now at the Yukawa Institute for Theoretical Physics at Kyoto University in Japan. Their equations allowed him to model a slow and methodical reduction in the boundary field’s entanglement, and to watch the response in the bulk, where he saw space-time steadily elongating and pulling apart (see ‘The entanglement connection’). Ultimately, he found, reducing the entanglement to zero would break the space-time into disjointed chunks, like chewing gum stretched too far.

The geometry–entanglement relationship was general, Van Raamsdonk realized. Entanglement is the essential ingredient that knits space-time together into a smooth whole — not just in exotic cases with black holes, but always. “I felt that I had understood something about a fundamental question that perhaps nobody had understood before,” he recalls: “Essentially, what is space-time?”

Quantum entanglement as geometric glue — this was the essence of Van Raamsdonk’s rejected paper and winning essay, and an idea that has increasingly resonated among physicists. No one has yet found a rigorous proof, so the idea still ranks as a conjecture. But many independent lines of reasoning support it. In 2013, for example, Maldacena and Leonard Susskind of Stanford published5 a related conjecture that they dubbed ER = EPR, in honour of two landmark papers from 1935.
ER, by Einstein and American-Israeli physicist Nathan Rosen, introduced6 what is now called a wormhole: a tunnel through space-time connecting two black holes. (No real particle could actually travel through such a wormhole, science-fiction films notwithstanding: that would require moving faster than light, which is impossible.) EPR, by Einstein, Rosen and American physicist Boris Podolsky, was the first paper to clearly articulate what is now called entanglement7. Maldacena and Susskind’s conjecture was that these two concepts are related by more than a common publication date. If any two particles are connected by entanglement, the physicists suggested, then they are effectively joined by a wormhole. And vice versa: the connection that physicists call a wormhole is equivalent to entanglement. They are different ways of describing the same underlying reality. No one has a clear idea of what this underlying reality is. But physicists are increasingly convinced that it must exist. Maldacena, Susskind and others have been testing the ER = EPR hypothesis to see if it is mathematically consistent with everything else that is known about entanglement and wormholes — and so far, the answer is yes.

Other lines of support for the geometry–entanglement relationship have come from condensed-matter physics and quantum information theory: fields in which entanglement already plays a central part. This has allowed researchers from these disciplines to attack quantum gravity with a whole array of fresh concepts and mathematical tools. Tensor networks, for example, are a technique developed by condensed-matter physicists to track the quantum states of huge numbers of subatomic particles. Brian Swingle was using them in this way in 2007, when he was a graduate student at the Massachusetts Institute of Technology (MIT) in Cambridge, calculating how groups of electrons interact in a solid material.
He found that the most useful network for this purpose started by linking adjacent pairs of electrons, which are most likely to interact with each other, then linking larger and larger groups in a pattern that resembled the hierarchy of a family tree. But then, during a course in quantum field theory, Swingle learned about Maldacena's bulk–boundary correspondence and noticed an intriguing pattern: the mapping between the bulk and the boundary showed exactly the same tree-like network.

Swingle wondered whether this resemblance might be more than just coincidence. And in 2012, he published [8] calculations showing that it was: he had independently reached much the same conclusion as Van Raamsdonk, thereby adding strong support to the geometry–entanglement idea. "You can think of space as being built from entanglement in this very precise way using the tensors," says Swingle, who is now at Stanford and has seen tensor networks become a frequently used tool to explore the geometry–entanglement correspondence.

Another prime example of cross-fertilization is the theory of quantum error-correcting codes, which physicists invented to aid the construction of quantum computers. These machines encode information not in bits but in 'qubits': quantum states, such as the up or down spin of an electron, that can take on values of 1 and 0 simultaneously. In principle, when the qubits interact and become entangled in the right way, such a device could perform calculations that an ordinary computer could not finish in the lifetime of the Universe. But in practice, the process can be incredibly fragile: the slightest disturbance from the outside world will disrupt the qubits' delicate entanglement and destroy any possibility of quantum computation. That need inspired quantum error-correcting codes, numerical strategies that repair corrupted correlations between the qubits and make the computation more robust.
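A toy illustration of the error-correction idea is the classical three-bit repetition code, the classical ancestor of the quantum bit-flip code: the logical bit is spread over three physical bits, and majority voting repairs any single flipped bit.

```python
def encode(bit):
    # Spread one logical bit across three physical bits.
    return [bit, bit, bit]

def corrupt(codeword, i):
    # Flip the physical bit at position i.
    c = list(codeword)
    c[i] ^= 1
    return c

def decode(codeword):
    # Majority vote: recovery draws on the whole block, so damage
    # at any single spot cannot destroy the logical bit.
    return int(sum(codeword) >= 2)

# Every single-bit error on either logical value is corrected.
for logical in (0, 1):
    for i in range(3):
        assert decode(corrupt(encode(logical), i)) == logical
print("all single-bit errors corrected")
```

Even in this classical miniature, the code's protection comes from spreading the information out: no single physical bit carries the logical bit on its own, a faint echo of the non-locality discussed next.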
One hallmark of these codes is that they are always 'non-local': the information needed to restore any given qubit has to be spread out over a wide region of space. Otherwise, damage in a single spot could destroy any hope of recovery. And that non-locality, in turn, accounts for the fascination that many quantum information theorists feel when they first encounter Maldacena's bulk–boundary correspondence: it shows a very similar kind of non-locality. The information that corresponds to a small region of the bulk is spread over a vast region of the boundary.

"Anyone could look at AdS–CFT and say that it's sort of vaguely analogous to a quantum error-correcting code," says Scott Aaronson, a computer scientist at MIT. But in work published in June [9], physicists led by Daniel Harlow at Harvard University in Cambridge and John Preskill of the California Institute of Technology in Pasadena argue for something stronger: that the Maldacena duality is itself a quantum error-correcting code. They have demonstrated that this is mathematically correct in a simple model, and are now trying to show that the assertion holds more generally. "People have been saying for years that entanglement is somehow important for the emergence of the bulk," says Harlow. "But for the first time, I think we are really getting a glimpse of how and why."

That prospect seems to be enticing for the Simons Foundation, a philanthropic organization in New York City that announced in August that it would provide US$2.5 million per year for at least 4 years to help researchers to move forward on the gravity–quantum information connection. "Information theory provides a powerful way to structure our thinking about fundamental physics," says Patrick Hayden, the Stanford physicist who is directing the programme. He adds that the Simons sponsorship will support 16 main researchers at 14 institutions worldwide, along with students, postdocs and a series of workshops and schools.
Ultimately, one major goal is to build up a comprehensive dictionary for translating geometric concepts into quantum language, and vice versa. This will hopefully help physicists to find their way to the complete theory of quantum gravity.

Still, researchers face several challenges. One is that the bulk–boundary correspondence does not apply in our Universe, which is neither static nor bounded; it is expanding and apparently infinite. Most researchers in the field do think that calculations using Maldacena's correspondence are telling them something true about the real Universe, but there is little agreement as yet on exactly how to translate results from one regime to the other.

Another challenge is that the standard definition of entanglement refers to particles only at a given moment. A complete theory of quantum gravity will have to add time to that picture. "Entanglement is a big piece of the story, but it's not the whole story," says Susskind. He thinks physicists may have to embrace another concept from quantum information theory: computational complexity, the number of logical steps, or operations, needed to construct the quantum state of a system. A system with low complexity is analogous to a quantum computer with almost all the qubits on zero: it is easy to define and to build. One with high complexity is analogous to a set of qubits encoding a number that would take aeons to compute.

Susskind's road to computational complexity began about a decade ago, when he noticed that a solution to Einstein's equations of general relativity allowed a wormhole in AdS space to get longer and longer as time went on. What, he wondered, did that correspond to on the boundary? What was changing there? Susskind knew that it couldn't be entanglement, because the correlations that produce entanglement between different particles on the boundary reach their maximum in less than a second [10].
In an article last year [11], however, he and Douglas Stanford, now at the Institute for Advanced Study, showed that as time progressed, the quantum state on the boundary would vary in exactly the way expected from computational complexity. "It appears more and more that the growth of the interior of a black hole is exactly the growth of computational complexity," says Susskind. If quantum entanglement knits together pieces of space, he says, then computational complexity may drive the growth of space — and thus bring in the elusive element of time.

One potential consequence, which he is just beginning to explore, could be a link between the growth of computational complexity and the expansion of the Universe. Another is that, because the insides of black holes are the very regions where quantum gravity is thought to dominate, computational complexity may have a key role in a complete theory of quantum gravity.

Despite the remaining challenges, there is a sense among the practitioners of this field that they have begun to glimpse something real and very important. "I didn't know what space was made of before," says Swingle. "It wasn't clear that question even had meaning." But now, he says, it is becoming increasingly apparent that the question does make sense. "And the answer is something that we understand," says Swingle. "It's made of entanglement."

As for Van Raamsdonk, he has written some 20 papers on quantum entanglement since 2009. All of them, he says, have been accepted for publication.

Igata T., Osaka City University | Harada T., Rikkyo University | Kimura M., Yukawa Institute for Theoretical Physics
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2012

We study high-energy charged-particle collisions near the horizon in an electromagnetic field around a rotating black hole and derive the fine-tuning condition required to obtain arbitrarily large center-of-mass (CM) energy. We demonstrate that the CM energy can be arbitrarily large as a uniformly magnetized rotating black hole approaches maximal rotation, in the situation where a charged particle plunges from the innermost stable circular orbit (ISCO) and collides with another particle near the horizon. Recently, Frolov proposed that the CM energy can be arbitrarily high if the magnetic field is arbitrarily strong, when a particle collides with a charged particle orbiting the ISCO with finite energy near the horizon of a uniformly magnetized Schwarzschild black hole. We show that a charged particle orbiting the ISCO around a spinning black hole needs arbitrarily high energy in the strong-field limit. This suggests that Frolov's process is unstable against the black hole spin. Nevertheless, we see that magnetic fields may substantially promote the capability of rotating black holes as particle accelerators in astrophysical situations. © 2012 American Physical Society.
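The divergence that such collision scenarios exploit can be read off from the general kinematic expression for the CM energy of two colliding particles of equal mass m with four-velocities u_1^μ and u_2^μ (this is the standard textbook formula, not the paper's specific fine-tuned solution):

```latex
E_{\mathrm{cm}}^{2} \;=\; 2m^{2}\left(1 - g_{\mu\nu}\,u_{1}^{\mu}\,u_{2}^{\nu}\right)
```

An arbitrarily large E_cm requires the inner product g_{μν} u_1^μ u_2^ν to diverge, which is precisely what fine-tuned near-horizon orbits around rapidly rotating or strongly magnetized black holes can arrange.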

Harada T., Rikkyo University | Kimura M., Yukawa Institute for Theoretical Physics
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2011

An inspiraling object of mass μ around a Kerr black hole of mass M (≫ μ) experiences a continuous transition near the innermost stable circular orbit from adiabatic inspiral to plunge into the horizon, as gravitational radiation extracts its energy and angular momentum. We investigate the collision of such an object with a generic counterpart around a Kerr black hole. We find that the angular momentum of the object is fine-tuned through gravitational radiation and that a high-velocity collision of the object with a generic counterpart naturally occurs around a nearly maximally rotating black hole. We also find that the center-of-mass energy can be far beyond the Planck energy for dark matter particles colliding around a stellar-mass black hole, and as high as 10^58 erg for stellar-mass compact objects colliding around a supermassive black hole, where the present transition formalism is well justified. Therefore, rapidly rotating black holes can accelerate objects inspiraling around them to energies high enough to be of great physical interest. © 2011 American Physical Society.
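For scale, the quoted figures can be checked against two familiar benchmarks, the Planck energy and the rest energy of one solar mass, both in erg. This is a rough order-of-magnitude cross-check in CGS units, not part of the paper's calculation:

```python
# Rough order-of-magnitude benchmarks in CGS units.
M_SUN = 1.989e33        # solar mass in g
C = 2.998e10            # speed of light in cm/s
E_PLANCK_GEV = 1.22e19  # Planck energy in GeV
ERG_PER_GEV = 1.602e-3  # erg per GeV

rest_energy_sun = M_SUN * C**2              # rest energy of one solar mass
planck_energy = E_PLANCK_GEV * ERG_PER_GEV  # Planck energy in erg

print(f"solar rest energy: {rest_energy_sun:.2e} erg")
print(f"Planck energy:     {planck_energy:.2e} erg")
```

The quoted 10^58 erg is thus several thousand solar rest energies, while the Planck energy is only about 2 × 10^16 erg, tiny on astrophysical scales.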

Bombin H., Yukawa Institute for Theoretical Physics | Bombin H., Copenhagen University
New Journal of Physics | Year: 2016

Topological stabilizer codes with different spatial dimensions have complementary properties. Here I show that the spatial dimension can be switched using gauge fixing. Combining 2D and 3D gauge color codes in a 3D qubit lattice, fault-tolerant quantum computation can be achieved with constant time overhead on the number of logical gates, up to efficient global classical computation, using only local quantum operations. Single-shot error correction plays a crucial role. © 2016 IOP Publishing Ltd and Deutsche Physikalische Gesellschaft.

Xu M., Nanjing University | Xu M., Yukawa Institute for Theoretical Physics | Nagataki S., Yukawa Institute for Theoretical Physics | Huang Y.F., Nanjing University
Astrophysical Journal | Year: 2011

We propose an off-axis relativistic jet model for the Type Ic supernova SN 2007gr. Most of the energy (2 × 10^51 erg) in the explosion is contained in non-relativistic ejecta, which produces the supernova (SN). The optical emission comes from the decay of ^56Ni synthesized in the bulk SN ejecta. Only very little energy (10^48 erg) is contained in the relativistic jet, whose initial velocity is about 0.94 times the speed of light. The radio and X-ray emission comes from this relativistic jet. With some typical parameters of a Wolf-Rayet star (the progenitor of a Type Ic SN), i.e., the mass-loss rate and the wind velocity v_w = 1.5 × 10^3 km s^-1, together with an observing angle of θ_obs = 63.3°, we can obtain multiband light curves that fit the observations well. All the observed data are consistent with our model. Thus, we conclude that SN 2007gr contains a weak relativistic jet and we observe the jet from off-axis. © 2011. The American Astronomical Society. All rights reserved.
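The off-axis geometry can be made concrete with the standard relativistic Doppler factor, taking β = 0.94 from the abstract and, for illustration, a viewing angle near 63°. This is a sketch of the kinematics only, not the paper's full light-curve model:

```python
import math

beta = 0.94                     # jet velocity in units of c (from the abstract)
theta_obs = math.radians(63.3)  # assumed off-axis viewing angle

# Bulk Lorentz factor and the relativistic Doppler factor for an
# observer at angle theta_obs from the jet axis.
gamma = 1.0 / math.sqrt(1.0 - beta**2)
doppler = 1.0 / (gamma * (1.0 - beta * math.cos(theta_obs)))

print(f"Lorentz factor: {gamma:.2f}")
print(f"Doppler factor: {doppler:.2f}")
```

The Doppler factor comes out below 1 at this angle, meaning the jet emission is de-boosted rather than beamed toward the observer, which is why a genuinely relativistic jet can appear weak when viewed off-axis.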

Xu M., Nanjing University | Xu M., Yukawa Institute for Theoretical Physics | Xu M., Yunnan University | Nagataki S., Yukawa Institute for Theoretical Physics | And 2 more authors.
Astrophysical Journal | Year: 2012

We show that the photospheres of "failed" gamma-ray bursts (GRBs), whose bulk Lorentz factors are much lower than 100, can lie outside of internal shocks. The resulting radiation from the photospheres is thermal and bright in the UV/soft X-ray band. The photospheric emission lasts for about 1000 s with a luminosity of several times 10^46 erg s^-1. These events can be observed by current and future satellites. It is also shown that the afterglows of failed GRBs are peculiar at the early stage, which makes it possible to distinguish failed GRBs from ordinary GRBs and from beaming-induced orphan afterglows. © 2012. The American Astronomical Society. All rights reserved.
