Balliol College

United Kingdom

News Article | November 7, 2016
Site: www.theguardian.com

My former colleague Palmer Newbould, who has died aged 87, was a champion of scientific nature conservation, an innovative university teacher and a generous, warm-hearted man with broad interests.

His nature conservation work was based mainly in Northern Ireland, where wide-ranging conservation legislation was introduced only in 1965. Palmer served on two statutory committees in the 1970s – the Nature Reserves Committee and Ulster Countryside Committee – before becoming chairman of the Council for Nature Conservation and the Countryside in 1989, for which he was appointed OBE. He was also a Northern Ireland representative on the UK's Joint Nature Conservation Committee and served on Ireland's Nuclear Energy Board.

Born in south-west London, he was the son of Dorothy (nee Pugh) and Alfred Newbould. His father played a significant role in the early cinema industry and was a Liberal MP between 1919 and 1922. Palmer was educated at Charterhouse school, Surrey, and Balliol College, Oxford. He began his academic career as a plant ecology lecturer at University College London in 1955, having earlier met Jo, his wife, while they were both botany PhD students. They were married in 1954.

Respect for his research on peat bogs and ecosystem productivity led to involvement with the International Biological Programme and appointment as vice-president of the British Ecological Society. At UCL he was convener of Europe's first MSc in conservation, which quickly became the model for similar courses elsewhere.

In 1968 he was made a professor of biology at the New University of Ulster (NUU), helping to introduce undergraduate courses in ecology and environmental science, among the first of their kind. His horticultural skills were invaluable in beginning the transformation of the exposed Coleraine campus into today's wooded landscape.

At NUU his good humour, integrity, and interest in the arts as well as the sciences made him a successful, if reluctant, administrator. As acting vice-chancellor during the merger between NUU and the Ulster Polytechnic, the trust he enjoyed among colleagues helped bring complex negotiations to a successful conclusion. The merged institution became the University of Ulster (UU) and Palmer became provost of UU's Coleraine campus.

In retirement, in Cirencester, Gloucestershire, he became a trustee of the National Heritage Memorial Fund, which distributes some of the National Lottery funds, and, with Jo, worked on a project monitoring biodiversity in Mallorca's S'Albufera wetlands.

He is survived by Jo and their children, Elizabeth, Andrew and Susan.


News Article | August 22, 2016
Site: www.scientificcomputing.com

Researchers at the University of Oxford have achieved a quantum logic gate with record-breaking 99.9% precision, reaching the benchmark required theoretically to build a quantum computer. Quantum computers, which function according to the laws of quantum physics, have the potential to dwarf the processing power of today's computers, able to process huge amounts of information all at once.

The team achieved the logic gate, which places two atoms in a state of quantum entanglement and is the fundamental building block of quantum computing, with a precision (or fidelity) substantially greater than the previous world record. Quantum entanglement – a phenomenon described by Einstein as 'spooky' but which is at the heart of quantum technologies – occurs when two particles stay connected, such that an action on one affects the other, even when they are separated by great distances. The research, carried out by scientists from the Engineering and Physical Sciences Research Council (EPSRC)-funded Networked Quantum Information Technologies Hub (NQIT), which is led by Oxford University, is reported in the journal Physical Review Letters.

Dr Chris Ballance, a research fellow at Magdalen College, Oxford and lead author of the paper, said: 'The development of a "quantum computer" is one of the outstanding technological challenges of the 21st century. A quantum computer is a machine that processes information according to the rules of quantum physics, which govern the behaviour of microscopic particles at the scale of atoms and smaller.

'An important point is that it is not merely a different technology for computing in the same way our everyday computers work; it is at a very fundamental level a different way of processing information. It turns out that this quantum-mechanical way of manipulating information gives quantum computers the ability to solve certain problems far more efficiently than any conceivable conventional computer. One such problem is related to breaking secure codes, while another is searching large data sets. Quantum computers are naturally well-suited to simulating other quantum systems, which may help, for example, our understanding of complex molecules relevant to chemistry and biology.'

Quantum technology is a complex area, but one analogy that has been used to explain the concept of quantum computing is that it is like being able to read all of the books in a library at the same time, whereas conventional computing is like having to read them one after another. This may be over-simplistic, but it is useful in conveying the way in which quantum computing has the potential to revolutionise the field.

Professor David Lucas, of Oxford University's Department of Physics and Balliol College, Oxford, a co-author of the paper, said: 'The concept of "quantum entanglement" is fundamental to quantum computing and describes a situation where two quantum objects – in our case, two individual atoms – share a joint quantum state. That means, for example, that measuring a property of one of the atoms tells you something about the other.

'A quantum logic gate is an operation which can take two independent atoms and put them into this special entangled state. The precision of the gate is a measure of how well this works: in our case, 99.9% precision means that, on average, 999 times out of 1,000 we will have generated the entangled state correctly, and one time out of 1,000 something went wrong.

'To put this in context, quantum theory says that – as far as anyone has found so far – you simply can't build a quantum computer at all if the precision drops below about 99%. At the 99.9% level you can build a quantum computer in theory, but in practice it could be very difficult and thus enormously expensive. If, in the future, a precision of 99.99% can be attained, the prospects look a lot more favourable.'

Professor Lucas added: 'Achieving a logic gate with 99.9% precision is another important milestone on the road to developing a quantum computer. A quantum logic gate on its own does not constitute a quantum computer, but you can't build the computer without them.

'An analogy from conventional computing hardware would be that we have finally worked out how to build a transistor with good enough performance to make logic circuits, but the technology for wiring thousands of those transistors together to build an electronic computer is still in its infancy.'

The method used by the Oxford team was invented at NIST in Boulder, USA, and, in a paper published alongside Oxford's in Physical Review Letters, the NIST team also reports the achievement of 99.9% precision.
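The entanglement and fidelity figures discussed above can be made concrete with a toy calculation. The sketch below is a simplified numpy illustration, not the trapped-ion implementation used by the Oxford and NIST teams: it applies a standard two-qubit entangling gate (a CNOT) to a product state to create a Bell state, then computes the fidelity of a deliberately imperfect output against the ideal state. The 0.1% error admixture is an arbitrary number chosen only to mimic a 99.9% figure.

```python
import numpy as np

# Single-qubit states and the Hadamard gate
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT: a standard two-qubit entangling gate (first qubit is the control)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, put the control qubit in superposition, then entangle
psi_in = np.kron(H @ zero, zero)     # (|00> + |10>)/sqrt(2)
bell_ideal = CNOT @ psi_in           # (|00> + |11>)/sqrt(2), a Bell state

# Crude model of an imperfect gate: mix in a small amplitude of an
# orthogonal state (illustrative only, not a physical error model)
epsilon = 0.001
wrong = (np.kron(zero, zero) - np.kron(one, one)) / np.sqrt(2)
psi_out = np.sqrt(1 - epsilon) * bell_ideal + np.sqrt(epsilon) * wrong

# State fidelity |<ideal|out>|^2, analogous to the quoted 99.9% precision
fidelity = abs(np.vdot(bell_ideal, psi_out)) ** 2
print(f"Fidelity with the ideal Bell state: {fidelity:.4f}")  # -> 0.9990
```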


Wallace D.,Balliol College | Wallace D.,University of Oxford
Studies in History and Philosophy of Science Part B - Studies in History and Philosophy of Modern Physics | Year: 2011

I argue against the currently prevalent view that algebraic quantum field theory (AQFT) is the correct framework for philosophy of quantum field theory and that "conventional" quantum field theory (CQFT), of the sort used in mainstream particle physics, is not suitable for foundational study. In doing so, I defend the position that AQFT and CQFT should be understood as rival programs to resolve the mathematical and physical pathologies of renormalization theory, and that CQFT has succeeded in this task and AQFT has failed. I also defend CQFT from recent criticisms made by Doreen Fraser. © 2011 Elsevier Ltd.


Harvey J.P.,Balliol College
Yale Journal of Biology and Medicine | Year: 2013

Synesthesia, the conscious, idiosyncratic, repeatable, and involuntary sensation of one sensory modality in response to another, is a condition that has puzzled both researchers and philosophers for centuries. Much time has been spent proving the condition's existence as well as investigating its etiology, but what can be learned from synesthesia remains a poorly discussed topic. Here, synesthesia is presented as a possible answer to, rather than simply another question about, the current gaps in our understanding of sensory perception. By first appreciating the similarities between normal sensory perception and synesthesia, one can use what is known about synesthesia, from behavioral and imaging studies, to inform our understanding of "normal" sensory perception. In particular, in considering synesthesia, one can better understand how and where the different sensory modalities interact in the brain, how different sensory modalities can interact without confusion – the binding problem – as well as how sensory perception develops. © 2013.


Schroeren D.P.B.,Balliol College
Foundations of Physics | Year: 2013

The decoherent histories formalism, developed by Griffiths, Gell-Mann, and Hartle (in Phys. Rev. A 76:022104, 2007; arXiv:1106.0767v3 [quant-ph], 2011; Consistent Quantum Theory, Cambridge University Press, 2003; arXiv:gr-qc/9304006v2, 1992) is a general framework in which to formulate a timeless, 'generalised' quantum theory and extract predictions from it. Recent advances in spin foam models allow for loop gravity to be cast in this framework. In this paper, I propose a decoherence functional for loop gravity and interpret existing results (Bianchi et al. in Phys. Rev. D 83:104015, 2011; Phys. Rev. D 82:084035, 2010) as showing that coarse grained histories follow quasiclassical trajectories in the appropriate limit. © 2013 Springer Science+Business Media New York.
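For orientation, the central object of the decoherent histories formalism is the decoherence functional. Its standard form for ordinary quantum theory (textbook material, not the spin-foam-specific functional proposed in the paper) is sketched below.

```latex
% Standard decoherence functional for histories \alpha, \alpha',
% with class operators C_\alpha built from chains of projectors
% and initial state \rho (not the loop-gravity functional of the paper):
D(\alpha', \alpha) = \mathrm{Tr}\!\left[ C_{\alpha'} \, \rho \, C_{\alpha}^{\dagger} \right],
\qquad
C_{\alpha} = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1).

% A set of histories decoheres (is consistent) when the off-diagonal
% terms vanish, so that probabilities add classically:
D(\alpha', \alpha) \approx 0 \ \ (\alpha' \neq \alpha),
\qquad
p(\alpha) = D(\alpha, \alpha).
```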


Morriss-Kay G.,University of Oxford | Morriss-Kay G.,Balliol College
Journal of Anatomy | Year: 2016

The Journal of Anatomy was launched 150 years ago as the Journal of Anatomy and Physiology, in an age when anatomy and physiology were not regarded as separate disciplines. European science in general was advancing rapidly at the time (it was 7 years after publication of Darwin's Origin of Species), and the recent demise of the Natural History Review meant that there was no English language publication covering these subjects. The founding editors were George Murray Humphry of Cambridge and William Turner of Edinburgh, together with Alfred Newton of Cambridge and Edward Perceval Wright of Dublin (the last two served only for a year). The pivotal event leading to the Journal's foundation was the 1866 meeting of the British Association, at which Humphry delivered the ‘Address in Physiology’ (printed in the first issue). Turner, who was also present at the 1866 British Association meeting, remained as a member of the editorial team for 50 years and was a major contributor of Journal articles. The title was changed to Journal of Anatomy in October 1916, when it was taken under the wing, in terms of both management and ownership, by the Anatomical Society. This article reviews the early years of the Journal’s publication in more detail than later years because of the historical interest of this less familiar material. The subject matter, which has remained surprisingly consistent over the years, is illustrated by examples from some notable contributions. The evolution of illustration techniques is surveyed from 1866 to the present day; the final section provides brief summaries of all of the chief editors. © 2016 Anatomical Society


Douglas T.,Balliol College | Douglas T.,University of Oxford
Neuroethics | Year: 2014

It is plausible that we have moral reasons to become better at conforming to our moral reasons. However, it is not always clear what means to greater moral conformity we should adopt. John Harris has recently argued that we have reason to adopt traditional, deliberative means in preference to means that alter our affective or conative states directly – that is, without engaging our deliberative faculties. One of Harris' concerns about direct means is that they would produce only a superficial kind of moral improvement. Though they might increase our moral conformity, there is some deeper kind of moral improvement that they would fail to produce, or would produce to a lesser degree than more traditional means. I consider whether this concern might be justified by appeal to the concept of moral worth. I assess three attempts to show that, even where they were equally effective at increasing one's moral conformity, direct interventions would be less conducive to moral worth than typical deliberative alternatives. Each of these attempts is inspired by Kant's views on moral worth. Each, I argue, fails. © 2013 The Author(s).


Wallace D.,Balliol College
Entropy | Year: 2014

I explore the reduction of thermodynamics to statistical mechanics by treating the former as a control theory: a theory of which transitions between states can be induced on a system (assumed to obey some known underlying dynamics) by means of operations from a fixed list. I recover the results of standard thermodynamics in this framework on the assumption that the available operations do not include measurements which affect subsequent choices of operations. I then relax this assumption and use the framework to consider the vexed questions of Maxwell's demon and Landauer's principle. Throughout, I assume rather than prove the basic irreversibility features of statistical mechanics, taking care to distinguish them from the conceptually distinct assumptions of thermodynamics proper. © 2014 by the authors.
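For a concrete sense of the Landauer's principle mentioned in the abstract (a standard result quoted here for context, not a calculation from the paper): erasing one bit of information must dissipate at least k_B T ln 2 of heat. A minimal sketch of the arithmetic, with an illustrative choice of temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # room temperature in kelvin (illustrative choice)

# Landauer bound: minimum heat dissipated per bit erased
E_per_bit = k_B * T * math.log(2)
print(f"Landauer bound at {T:.0f} K: {E_per_bit:.3e} J per bit")  # ~2.87e-21 J
```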


Pontzen A.,Oxford Astrophysics | Pontzen A.,Balliol College | Governato F.,University of Washington
Monthly Notices of the Royal Astronomical Society | Year: 2013

We use maximum entropy arguments to derive the phase-space distribution of a virialized dark matter halo. Our distribution function gives an improved representation of the end product of violent relaxation. This is achieved by incorporating physically motivated dynamical constraints (specifically on orbital actions) which prevent arbitrary redistribution of energy. We compare the predictions with three high-resolution dark matter simulations of widely varying mass. The numerical distribution function is accurately predicted by our argument, producing an excellent match for the vast majority of particles. The remaining particles constitute the central cusp of the halo (≤4 per cent of the dark matter). They can be accounted for within the presented framework once the short dynamical time-scales of the centre are taken into account. © 2013 The Authors. Published by Oxford University Press on behalf of the Royal Astronomical Society.
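As background, the generic shape of a maximum-entropy argument of this kind (an illustrative sketch only; the paper's distribution function adds physically motivated constraints on orbital actions) is to extremize the entropy of the phase-space distribution f subject to normalization and constraint terms, which yields an exponential family:

```latex
% Generic maximum-entropy setup (illustrative; the paper imposes
% additional dynamical constraints on orbital actions):
S[f] = -\int f(\mathbf{x}, \mathbf{v}) \, \ln f(\mathbf{x}, \mathbf{v}) \, \mathrm{d}^3x \, \mathrm{d}^3v ,
\qquad
\int f \, \mathrm{d}^3x \, \mathrm{d}^3v = 1 ,
\qquad
\int f \, C_i \, \mathrm{d}^3x \, \mathrm{d}^3v = \bar{C}_i .

% Extremizing S with Lagrange multipliers \lambda_i gives
f(\mathbf{x}, \mathbf{v}) \propto \exp\!\Big( - \sum_i \lambda_i \, C_i(\mathbf{x}, \mathbf{v}) \Big).
```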


Leistedt B.,University College London | Peiris H.V.,University College London | Mortlock D.J.,Imperial College London | Benoit-Levy A.,University College London | And 2 more authors.
Monthly Notices of the Royal Astronomical Society | Year: 2013

The angular power spectrum is a powerful statistic for analysing cosmological signals imprinted in the clustering of matter. However, current galaxy and quasar surveys cover limited portions of the sky, and are contaminated by systematics that can mimic cosmological signatures and jeopardize the interpretation of the measured power spectra. We provide a framework for obtaining unbiased estimates of the angular power spectra of large-scale structure surveys at the largest scales using quadratic estimators. The method is tested by analysing the 600 CMASS mock catalogues constructed for the Baryon Oscillation Spectroscopic Survey. We then consider the Richards et al. catalogue of photometric quasars from the sixth Data Release of the Sloan Digital Sky Survey, which is known to include significant stellar contamination and systematic uncertainties. Focusing on the sample of ultraviolet-excess sources, we show that the excess clustering power present on the largest scales can be largely mitigated by making use of improved sky masks and projecting out the modes corresponding to the principal systematics. In particular, we find that the sample of objects with photometric redshift 1.3 < z̃p < 2.2 exhibits no evidence of contamination when using our most conservative mask and mode projection. This indicates that any residual systematics are well within the statistical uncertainties. We conclude that, using our approach, this sample can be used for cosmological studies. © 2013 The Authors. Published by Oxford University Press on behalf of the Royal Astronomical Society.
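The "mode projection" idea can be conveyed with a toy example: given a map and a set of systematics templates, one removes (or marginalizes over) the template directions so they cannot leak into the clustering measurement. The sketch below is a simple least-squares deprojection with hypothetical inputs, not the quadratic-estimator implementation used in the paper.

```python
import numpy as np

def project_out_templates(data, templates):
    """Remove the best-fitting linear combination of systematics templates
    from a data vector (simple least-squares deprojection).

    data      : (n_pix,) observed overdensity map, flattened
    templates : (n_pix, n_t) columns are systematics templates
                (e.g. stellar density, seeing); hypothetical inputs here
    """
    coeffs, *_ = np.linalg.lstsq(templates, data, rcond=None)
    # Subtract the fitted component; the residual is orthogonal
    # to the template subspace
    return data - templates @ coeffs

# Toy usage: a fake map contaminated by one known template plus a signal
rng = np.random.default_rng(0)
n_pix = 1000
template = rng.normal(size=(n_pix, 1))
signal = rng.normal(scale=0.1, size=n_pix)
data = signal + 0.5 * template[:, 0]

cleaned = project_out_templates(data, template)
print("residual correlation with template:",
      np.corrcoef(cleaned, template[:, 0])[0, 1])  # ~0 after deprojection
```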
