Danish National Research Foundation

Denmark


News Article | December 20, 2016
Site: www.eurekalert.org

Researchers model how 'publication bias' does -- and doesn't -- affect the 'canonization' of facts in science

Arguing in a Boston courtroom in 1770, John Adams famously pronounced, "Facts are stubborn things," which cannot be altered by "our wishes, our inclinations or the dictates of our passion." But facts, however stubborn, must pass through the trials of human perception before being acknowledged -- or "canonized" -- as facts. Given this, some may be forgiven for looking at passionate debates over the color of a dress and wondering if facts are up to the challenge.

Carl Bergstrom believes facts stand a fighting chance, especially if science has their back. A professor of biology at the University of Washington, he has used mathematical modeling to investigate the practice of science, and how science could be shaped by the biases and incentives inherent to human institutions. "Science is a process of revealing facts through experimentation," said Bergstrom. "But science is also a human endeavor, built on human institutions. Scientists seek status and respond to incentives just like anyone else does. So it is worth asking -- with precise, answerable questions -- if, when and how these incentives affect the practice of science."

In an article published Dec. 20 in the journal eLife, Bergstrom and co-authors present a mathematical model that explores whether "publication bias" -- the tendency of journals to publish mostly positive experimental results -- influences how scientists canonize facts. Their results offer a warning that sharing positive results comes with the risk that a false claim could be canonized as fact. But their findings also offer hope by suggesting that simple changes to publication practices can minimize the risk of false canonization.
These issues have become particularly relevant over the past decade, as prominent articles have questioned the reproducibility of scientific experiments -- a hallmark of validity for discoveries made using the scientific method. But neither Bergstrom nor most of the scientists engaged in these debates are questioning the validity of heavily studied and thoroughly demonstrated scientific truths, such as evolution, anthropogenic climate change or the general safety of vaccination. "We're modeling the chances of 'false canonization' of facts on lower levels of the scientific method," said Bergstrom. "Evolution happens, and explains the diversity of life. Climate change is real. But we wanted to model if publication bias increases the risk of false canonization at the lowest levels of fact acquisition." Bergstrom cites a historical example of false canonization in science that lies close to our hearts -- or specifically, below them. Biologists once postulated that bacteria caused stomach ulcers. But in the 1950s, gastroenterologist E.D. Palmer reported evidence that bacteria could not survive in the human gut. "These findings, supported by the efficacy of antacids, supported the alternative 'chemical theory of ulcer development,' which was subsequently canonized," said Bergstrom. "The problem was that Palmer was using experimental protocols that would not have detected Helicobacter pylori, the bacteria that we know today causes ulcers. It took about a half century to correct this falsehood." While the idea of false canonization itself may cause dyspepsia, Bergstrom and his team -- lead author Silas Nissen of the Niels Bohr Institute in Denmark and co-authors Kevin Gross of North Carolina State University and UW undergraduate student Tali Magidson -- set out to model the risks of false canonization given the fact that scientists have incentives to publish only their best, positive results. 
The so-called "negative results," which show no clear, definitive conclusions or simply do not affirm a hypothesis, are much less likely to be published in peer-reviewed journals. "The net effect of publication bias is that negative results are less likely to be seen, read and processed by scientific peers," said Bergstrom. "Is this misleading the canonization process?" For their model, Bergstrom's team incorporated variables such as the rates of error in experiments, how much evidence is needed to canonize a claim as fact and the frequency with which negative results are published. Their mathematical model showed that the lower the publication rate is for negative results, the higher the risk for false canonization. And according to their model, one possible solution -- raising the bar for canonization -- didn't help alleviate this risk. "It turns out that requiring more evidence before canonizing a claim as fact did not help," said Bergstrom. "Instead, our model showed that you need to publish more negative results -- at least more than we probably are now." Since most negative results live out their obscurity in the pages of laboratory notebooks, it is difficult to quantify the proportion that is published. But clinical trials, which must be registered with the U.S. Food and Drug Administration before they begin, offer a window into how often negative results make it into the peer-reviewed literature. A 2008 analysis of 74 clinical trials for antidepressant drugs showed that scarcely more than 10 percent of negative results were published, compared to over 90 percent for positive results. "Negative results are probably published at different rates in other fields of science," said Bergstrom. "And new options today, such as self-publishing papers online and the rise of journals that accept some negative results, may affect this. But in general, we need to share negative results more than we are doing today."
Their model also indicated that negative results had the biggest impact as a claim approached the point of canonization. That finding may offer scientists an easy way to prevent false canonization. "By more closely scrutinizing claims as they achieve broader acceptance, we could identify false claims and keep them from being canonized," said Bergstrom. To Bergstrom, the model raises valid questions about how scientists choose to publish and share their findings -- both positive and negative. He hopes that their findings pave the way for more detailed exploration of bias in scientific institutions, including the effects of funding sources and the different effects of incentives on different fields of science. But he believes a cultural shift is needed to avoid the risks of publication bias. "As a community, we tend to say, 'Damn it, this didn't work, and I'm not going to write it up,'" said Bergstrom. "But I'd like scientists to reconsider that tendency, because science is only efficient if we publish a reasonable fraction of our negative findings." The research was funded by the Danish National Research Foundation, the University of Washington and the John Templeton Foundation. For more information, contact Bergstrom at cbergst@u.washington.edu or 206-685-3847.
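The mechanism the article describes can be illustrated with a toy simulation. This is a simplified sketch in the spirit of, but not reproducing, the eLife model: the error rates, canonization threshold and face-value update rule are all illustrative assumptions. A community accumulates log-odds evidence about a claim that is in fact false, but only ever sees the negative experiments that happen to be published:

```python
import math
import random

def simulate(p_neg_pub, fp=0.2, tp=0.8, threshold=5.0, max_steps=10_000, rng=random):
    """One community evaluating a claim that is actually FALSE.
    Experiments are noisy: a positive result occurs with probability fp
    (a false positive). Positives are always published; negatives are
    published only with probability p_neg_pub. Observers update log-odds
    at face value. Returns True if the false claim is canonized as fact."""
    log_odds = 0.0
    for _ in range(max_steps):
        positive = rng.random() < fp
        published = positive or rng.random() < p_neg_pub
        if not published:
            continue  # unpublished results never reach the community
        if positive:
            log_odds += math.log(tp / fp)
        else:
            log_odds += math.log((1 - tp) / (1 - fp))
        if abs(log_odds) >= threshold:
            return log_odds > 0  # canonized as true (wrongly) or as false
    return False

def false_canonization_rate(p_neg_pub, trials=2000, seed=0):
    """Fraction of communities that wrongly canonize the false claim."""
    rng = random.Random(seed)
    return sum(simulate(p_neg_pub, rng=rng) for _ in range(trials)) / trials
```

With p_neg_pub = 1.0 almost every community correctly rejects the false claim; with p_neg_pub = 0.05 the published stream is dominated by false positives and most communities canonize it, qualitatively matching the result described above.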


News Article | November 14, 2016
Site: www.eurekalert.org

New research demonstrates the highest plasmon energy ever observed in graphene plasmons and brings graphene into the regime of telecommunication applications

WASHINGTON - Graphene's unique properties can be both a blessing and a curse to researchers, especially to those at the intersection of optical and electronic applications. These single-atom thick sheets feature highly mobile electrons on their flexible profiles, making them excellent conductors, but in general graphene sheets do not interact with light efficiently. This is especially problematic for shorter-wavelength light -- photons in the near infrared region of the spectrum, where telecommunication applications become realizable.

In a paper published this week in the journal Optics Letters, from The Optical Society (OSA), researchers at the Technical University of Denmark have demonstrated, for the first time, efficient absorption enhancement at a wavelength of 2 micrometers by graphene, specifically by the plasmons of nanoscale graphene disks.

Much like water ripples arising from the energy of a dropped pebble, electronic oscillations can arise in freely moving conduction electrons by absorbing light energy. The resulting collective, coherent motions of these electrons are called plasmons, which also serve to amplify the strength of the absorbed light's electric field at close proximity. Plasmons are becoming increasingly commonplace in various optoelectronic applications where highly conductive metals can be easily integrated. Graphene plasmons, however, face an extra set of challenges unfamiliar to the plasmons of bulk metals. One of these challenges is the relatively long wavelength needed to excite them. Many efforts taking advantage of the enhancing effects of plasmons on graphene have demonstrated promise, but for low energy light.
"The motivation of our work is to push graphene plasmons to shorter wavelengths in order to integrate graphene plasmon concepts with existing mature technologies," said Sanshui Xiao, associate professor from the Technical University of Denmark. To do so, Xiao, Wang and their collaborators took inspiration from recent developments at the university's Center of Nanostructured Graphene (CNG), where they demonstrated a self-assembly method resulting in large arrays of graphene nanostructures. Their method primarily uses geometry to bolster the graphene plasmon effects at shorter wavelengths by decreasing the size of the graphene structures. Using lithographic masks prepared by a block copolymer based self-assembly method, the researchers made arrays of graphene nanodisks. They controlled the final size of the disks by exposing the array to oxygen plasma which etched away at the disks, bringing the average diameter down to approximately 18 nm. This is approximately 1000 times smaller than the width of a human hair. The array of approximately 18 nm disks, resulting from 10 seconds of etching with oxygen plasma, showed a clear resonance with 2 micrometer wavelength light, the shortest wavelength resonance ever observed in graphene plasmons. An assumption might be that longer etching times or finer lithographic masks, and therefore smaller disks, would result in even shorter wavelengths. Generally speaking this is true, but at 18 nm the disks already start requiring consideration of atomic details and quantum effects. Instead, the team plans to tune graphene plasmon resonances at smaller scales in the future using electrical gating methods, where the local concentration of electrons and electric field profile alter resonances. Xiao said, "To further push graphene plasmons to shorter wavelengths, we plan to use electrical gating. Instead of graphene disks, graphene antidots (i.e. 
graphene sheets with regular holes) will be chosen because it is easy to implement a back-gating technique." There are also fundamental limits to the physics that prevent shortening the graphene plasmon resonance wavelength with more etching. "When the wavelength becomes shorter, the interband transition will soon play a key role, leading to broadening of the resonance. Due to weak coupling of light with graphene plasmons and this broadening effect, it will become hard to observe the resonance feature," Xiao explained. This project is supported by the Danish National Research Foundation Center for Nanostructured Graphene (DNRF103). Paper: Z. Wang, T. Li, K. Almdal, N. Mortensen, S. Xiao and S. Ndoni, "Experimental demonstration of graphene plasmons working close to the near-infrared window," Opt. Lett. 41, 5345-5348. DOI: 10.1364/OL.41.005345 Optics Letters offers rapid dissemination of new results in all areas of optics with short, original, peer-reviewed communications. Optics Letters covers the latest research in optical science, including optical measurements, optical components and devices, atmospheric optics, biomedical optics, Fourier optics, integrated optics, optical processing, optoelectronics, lasers, nonlinear optics, optical storage and holography, optical coherence, polarization, quantum electronics, ultrafast optical phenomena, photonic crystals and fiber optics. Founded in 1916, The Optical Society (OSA) is the leading professional organization for scientists, engineers, students and business leaders who fuel discoveries, shape real-life applications and accelerate achievements in the science of light. Through world-renowned publications, meetings and membership initiatives, OSA provides quality research, inspired interactions and dedicated resources for its extensive global network of optics and photonics experts. For more information, visit osa.org/100.
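The size-to-wavelength relationship behind the etching strategy can be sketched with a back-of-envelope estimate. In a simple classical electrostatic picture a graphene disk's plasmon frequency scales as the square root of E_F/D, so at fixed doping the resonance wavelength grows like the square root of the diameter. The sketch below is illustrative only -- it is not the authors' calculation, and it ignores the quantum and atomistic corrections that the article notes become important near 18 nm -- calibrated to the reported 2 micrometer resonance of the 18 nm disks:

```python
import math

# Calibration point reported in the article: ~18 nm disks resonate near 2.0 um.
D0_NM, LAMBDA0_UM = 18.0, 2.0

def resonance_wavelength_um(diameter_nm):
    """Estimated graphene-nanodisk plasmon resonance wavelength, assuming
    the classical lambda ~ sqrt(D) scaling at fixed carrier density."""
    return LAMBDA0_UM * math.sqrt(diameter_nm / D0_NM)

# Halving the diameter shortens the wavelength only by a factor of sqrt(2),
# one reason etching alone cannot push the resonance much further.
print(round(resonance_wavelength_um(9.0), 2))  # ~1.41 um
```

The square-root scaling is why the team plans to turn to electrical gating, which changes E_F directly, rather than relying on ever-smaller disks.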


News Article | November 16, 2016
Site: www.eurekalert.org

Environmental DNA in seawater samples may provide accurate information about deepwater fish populations, according to a study published November 16, 2016 in the open-access journal PLOS ONE by Philip Francis Thomsen from the Centre for GeoGenetics at the Natural History Museum of Denmark, University of Copenhagen, Denmark, and colleagues. Fish in remote polar and deepwater habitats are threatened by climate change and increased fishing efforts, making it important to monitor populations. However, monitoring can be logistically difficult and currently depends on invasive techniques such as bottom trawling and unreliable reports of catches. Less invasive, more reliable monitoring techniques are therefore needed. To address this need, Thomsen and colleagues assessed an alternative monitoring technique that relies on sequencing environmental DNA (eDNA) in seawater samples. They collected seawater samples at sites off Southwest Greenland, at varying depths between 188 and 918 meters, sequencing the DNA in these samples to determine the fish species present. They compared these results to catch data obtained by simultaneous trawling at each site. Thomsen and colleagues found that data on fish biomass and abundance were correlated with eDNA sequence abundance. Twenty-six families of fish, including rays and halibut, were identified by both trawling and environmental DNA techniques, compared to just two families found only in trawling and three found only in eDNA. Environmental DNA sampling also detected a higher abundance of the Greenland Shark than trawling did, which may indicate that the technique can effectively detect large fish that may evade trawling nets. Environmental DNA sampling will need to undergo further testing to determine its effectiveness as a monitoring technique.
Nonetheless, the authors state their study demonstrates how eDNA could be used in non-invasive monitoring, for commercial fishing as well as to assess the impact of climate change on the biodiversity of these remote ecosystems. In your coverage please use this URL to provide access to the freely available paper: http://dx. Citation: Thomsen PF, Møller PR, Sigsgaard EE, Knudsen SW, Jørgensen OA, Willerslev E (2016) Environmental DNA from Seawater Samples Correlate with Trawl Catches of Subarctic, Deepwater Fishes. PLoS ONE 11(11): e0165252. doi:10.1371/journal.pone.0165252 Funding: Danish National Research Foundation funded the work. Greenland Institute of Natural Resources, Greenland Self-government, Department for education and research funded the work. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Competing Interests: The authors have declared that no competing interests exist.
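The reported biomass-eDNA relationship is a correlation between two quantities measured on very different scales, for which a rank statistic is a natural choice (the paper's actual statistical methods are not reproduced here). A self-contained sketch with invented read counts and trawl biomass values, computing Spearman's rank correlation:

```python
def rank(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rank correlation between two equal-length sequences."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative numbers only (not from the paper): eDNA read counts and
# trawl biomass (kg) for six hypothetical fish families at one site.
reads = [5210, 1300, 880, 450, 90, 12]
biomass = [410.0, 95.0, 120.0, 30.0, 4.5, 0.8]
print(round(spearman(reads, biomass), 2))  # strong positive rank correlation
```

A high rank correlation like this, replicated across sites and depths, is the kind of evidence that supports eDNA read abundance as a proxy for catch data.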


News Article | November 17, 2016
Site: www.eurekalert.org

Researchers who have sequenced the genome of a 5,310-year-old corn cob have discovered that the maize grown in central Mexico all those years ago was genetically more similar to modern maize than to its wild ancestor. For example, the ancient maize already carried genetic variants responsible for making kernels soft, a common feature of modern corn. The findings are reported in Current Biology on November 17. "Around 9,000 years ago in modern-day Mexico, people started collecting and consuming teosinte, a wild grass," says Nathan Wales of the Natural History Museum of Denmark. "Over the course of several thousand years, human-driven selection caused major physical changes, turning the unproductive plant into modern maize, commonly known as corn. Maize as we know it looks so different from its wild ancestor that a couple of decades ago scientists had not reached a consensus regarding the true ancestor of maize." To better understand the domestication history of the world's most produced crop, Wales and his colleagues, including Jazmín Ramos-Madrigal, sequenced the genome of a 5,310-year-old maize cob from central Mexico. The cob, known as Tehuacan162, was excavated from a cave in the Tehuacan Valley in the 1960s, during a major archaeological expedition led by Richard MacNeish. Fortunately, the Robert S. Peabody Museum in Andover, MA, took excellent care of the ancient maize specimen--one of the five oldest known in the world--for decades. Wales explains that this particular cob and the DNA within it had been unusually well preserved. "Archaeological specimens frequently have high levels of bacterial DNA due to decomposition and soil contaminants," he says. "However, during genetic testing of ancient cobs, we were astonished to find that 70 percent of the DNA from the Tehuacan162 cob was from the plant!" Most other ancient samples contain less than 10 percent plant DNA. Tehuacan162 didn't have hard seed coats like its wild ancestor would have.
But, the ancient cob is less than a tenth of the size of modern cobs, at less than two centimeters long. In addition, the ancient cob produced only eight rows of kernels, about half that of modern maize. That led the researchers to suspect that its genes would offer clues on the early stages of maize domestication. To make the most of the small sample, Wales and Ramos-Madrigal used cutting-edge paleogenomic techniques. They extracted DNA with a method designed to recover ultra-short DNA, taking special care to avoid losing any genetic material. As a result, the researchers were able to prepare sufficient DNA for sequencing while still preserving enough of the sample to determine the cob's precise age via radiocarbon dating. The new findings offer an informative snapshot in the 10,000-year evolutionary history of maize and its domestication, the researchers say. In addition to elucidating how maize provided a dietary foundation for ancient civilizations like the Maya, such studies can also aid in understanding and improving commercially important lines of modern maize, the researchers say. "This is only the beginning of the story," Ramos-Madrigal says. "Humans dispersed maize across the Americas very quickly and very successfully. We want to know how humans dispersed it, which routes they took, and how maize adapted to such diverse environments." This research was supported by the Lundbeck Foundation, the Danish Council for Independent Research, and the Danish National Research Foundation. Current Biology, Ramos-Madrigal et al.; "Genome Sequence of a 5,310-Year-Old Maize Cob Provides Insights into the Early Stages of Maize Domestication" http://www.cell.com/current-biology/fulltext/S0960-9822(16)31120-4 Current Biology (@CurrentBiology), published by Cell Press, is a bimonthly journal that features papers across all areas of biology.
Current Biology strives to foster communication across fields of biology, both by publishing important findings of general interest and through highly accessible front matter for non-specialists. Visit: http://www. . To receive Cell Press media alerts, contact press@cell.com.


Reinhard L., Danish National Research Foundation | Tidow H., Danish National Research Foundation | Clausen M.J., Danish National Research Foundation | Nissen P., Danish National Research Foundation
Cellular and Molecular Life Sciences | Year: 2013

The Na+,K+-ATPase, or sodium pump, is well known for its role in ion transport across the plasma membrane of animal cells. It carries out the transport of Na+ ions out of the cell and of K+ ions into the cell and thus maintains electrolyte and fluid balance. In addition to the fundamental ion-pumping function of the Na+,K+-ATPase, recent work has suggested additional roles for Na+,K+-ATPase in signal transduction and biomembrane structure. Several signaling pathways have been found to involve Na+,K+-ATPase, which serves as a docking station for a fast-growing number of protein interaction partners. In this review, we focus on Na+,K+-ATPase as a signal transducer, but also briefly discuss other Na+,K+-ATPase protein-protein interactions, providing a comprehensive overview of the diverse signaling functions ascribed to this well-known enzyme. © 2012 Springer Basel AG.


Rosen C.B., Danish National Research Foundation | Hansen D.J., Danish National Research Foundation | Gothelf K.V., Danish National Research Foundation
Organic and Biomolecular Chemistry | Year: 2013

Fluoride detection through hydrogen bonding or deprotonation is most commonly achieved using amide, urea or pyrrole derivatives. The sensor molecules are often complex constructs and several synthetic steps are required for their preparation. Here we report the discovery that simple arylaldoximes have remarkable properties as fluoride anion sensors, providing distinct colorimetric or fluorescent readouts, depending on the structure of the arylaldoxime. The oximes showed exceptional selectivity towards fluoride over other typical anions, and low detection limits for fluoride in both DMSO and DMSO-water mixtures were obtained. © 2013 The Royal Society of Chemistry.


News Article | December 8, 2015
Site: phys.org

All science students learn how human cell division takes place. The copying or replication of the genome, the cell's DNA, has until now been believed only to take place during the so-called S-phase in the cell cycle. The new results show that this is not the case, because some regions of the genome are copied only after the cell enters the next crucial phase in the cell cycle called mitosis. "It has radically altered our views and requires that the textbook view of the human cell cycle be revised", says Professor Ian Hickson, Director of the Centre for Chromosome Stability and affiliated with the Center for Healthy Aging. The research project was funded by the Danish National Research Foundation and was just published in the international scientific journal Nature. This unusual pathway for copying of the DNA occurs at specific regions of the human genome called fragile sites, and during mitosis, chromosomes in these fragile areas have a tendency to break. The fragile sites are conserved across species and are frequently associated with undesirable genome rearrangements in connection with the development of cancer. "We now know that these so-called 'chromosome breaks' are not actually broken, but instead comprise a region of DNA that is newly synthesized in mitosis. They appear broken because they are far less compacted than the rest of the chromosome," adds Professor Hickson. Cancer cells utilize this unusual form of DNA replication because one of the side effects of the genetic changes that cause cancer is so-called 'replication stress'. The scientists weren't specifically looking for this but fortunately they saw something very unusual when looking at human cancer cells under the microscope. "When we realized what was happening, it took us about 3 years to determine the mechanism underlying this phenomenon." "All science students learn that DNA is replicated in S-phase. 
Our results show that this is not the case, because some regions are replicated only after the cell enters mitosis," he adds. The scientists already know of two proteins that are essential for this unusual pathway for DNA replication, but now aim to define the full 'toolbox' of factors that are required. They can then proceed with studies to identify chemical compounds that block the process. This would constitute the first stage in identifying potential new treatments for cancer. "Although it has not yet been proven, it seems that the growth of many, or indeed most, cancers in humans is dependent on this process. Hence, the development of a reliable therapeutic drug strategy would likely have wide applicability in cancer therapy." "Our aim is to generate results that will lead to the development of new approaches to treatments of various types of cancer," concludes Professor Hickson. More information: Sheroy Minocherhomji et al. Replication stress activates DNA repair synthesis in mitosis, Nature (2015). DOI: 10.1038/nature16139


News Article | March 10, 2016
Site: www.scientificcomputing.com

Quantum technology has the potential to revolutionize computation, cryptography and simulation of quantum systems. However, quantum physics places a new demand on information processing hardware: quantum states are fragile, and so must be controlled without being measured. Researchers at the Niels Bohr Institute have demonstrated a key property of Majorana zero modes that protects them from decoherence. The result lends positive support to the existence of Majorana modes, and goes further by showing that they are protected, as predicted theoretically. The results have been published in the scientific journal Nature. Normal computers are limited in their ability to solve certain classes of problems. The limitation lies in the fact that the operation of a conventional computer is based on classical states, or bits, the fundamental unit of information that is either 0 or 1. In a quantum computer, data is stored in quantum bits, or qubits. According to the laws of quantum mechanics, a qubit can be in a superposition of states — a 0 and 1 at the same time. By taking advantage of this and other properties of quantum physics, a quantum computer made of interconnected qubits should be able to tackle certain problems much more efficiently than would be possible on a classical computer. There are many different physical systems that could, in principle, be used as quantum bits. The problem is that most quantum systems lose coherence very quickly — the qubit becomes a regular bit once measured. This is why researchers are still searching for the best implementation of quantum hardware. Enter the Majorana zero mode, a delocalized state in a superconductor that resists decoherence by sharing quantum information between separated locations. In a Majorana mode, the information is stored in such a way that a disturbance of either location leaves the quantum information intact.
“We are investigating a new kind of particle, called a Majorana zero mode, which can provide a basis for quantum information that is protected against measurement by a special and who knows, perhaps unique property of these particles. Majorana particles don’t exist as particles on their own, but they can be created using a combination of materials involving superconductors and semiconductors. What we find is that, first of all, the Majorana modes are present, verifying previous experiments, but more importantly that they are protected, just as theory predicts,” says Villum Kann Rasmussen Professor Charles Marcus, Director of the Center for Quantum Devices (QDev) and Station Q Copenhagen, at the Niels Bohr Institute, University of Copenhagen. The Center for Quantum Devices is a leading research center in quantum information technology — with activities in theory, experiment and materials research. Semiconductor nanowires around 10 micrometers long and around 0.1 micrometers in diameter, coated with superconducting aluminum were used to form isolated islands of various lengths. By applying a strong magnetic field along the axis of the wire, and cooling the wires to below a tenth of a kelvin, a new kind of superconducting state, called a topological superconductor, was formed. In 2012, physicists at Delft University in the Netherlands found the first signatures of Majorana zero modes in a similar system, with further evidence revealed in subsequent experiments around the world. Now, researchers at the Center for Quantum Devices have demonstrated critical predictions regarding their behavior, namely that their quantum states are protected in a fundamentally different manner from conventional quantum states. The experiments were carried out by Ph.D. 
Candidate Sven Albrecht and postdoc Andrew Higginbotham, now at the University of Colorado/NIST, using new superconductor-semiconductor hybrid nanowires developed by Assistant Professor Peter Krogstrup in collaboration with Marcus and Professor Jesper Nygard. “The protection is related to the exotic property of the Majorana mode that it simultaneously exists on both ends of the nanowire, but not in the middle. To destroy its quantum state, you have to act on both ends at the same time, which is unlikely,” says Albrecht. Albrecht explains that it was a challenging effort to demonstrate the protection experimentally. The researchers had to repeat their experiment many times with nanowires of different lengths in order to show that the protection improved with wire length. “Exponential protection is an important check as we continue our basic exploration, and ultimately application, of topological states of matter. Two things have pushed the field forward — from the first Majorana sightings at Delft to the present results — the first is strong interaction between theory and experiment. The second is remarkable materials development in Copenhagen, an effort that predates our Center. Without these new materials, the field was rather stuck. That’s behind us now,” says Charles Marcus. The research at the Center for Quantum Devices and Station Q Copenhagen was supported by Microsoft Research, the Danish National Research Foundation and the Villum Foundation.
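The "exponential protection" Marcus describes has a concrete form: theory predicts that the residual energy splitting between the paired Majorana end states decays as dE ~ A * exp(-L / xi), where L is the wire length and xi a coherence length. A minimal sketch of extracting xi from splitting-versus-length measurements follows; the numbers are invented for illustration and are not the measured values:

```python
import math

def fit_exponential(lengths_um, splittings_uev):
    """Least-squares fit of ln(dE) = ln(A) - L/xi to splitting data.
    Returns (A, xi): the prefactor and the decay (coherence) length."""
    xs = lengths_um
    ys = [math.log(s) for s in splittings_uev]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -1.0 / slope

# Invented, noise-free data following dE = 100 * exp(-L / 0.26) micro-eV:
lengths = [0.4, 0.7, 1.0, 1.5]                    # wire lengths, um
splittings = [100 * math.exp(-L / 0.26) for L in lengths]
A, xi = fit_exponential(lengths, splittings)       # recovers A and xi
```

In the actual experiment the splittings carry noise, so the fit comes with error bars; repeating it across many wires of different lengths is what allowed the team to show the protection improving exponentially with length.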


News Article | March 12, 2016
Site: www.nanotech-now.com

Abstract: Quantum technology has the potential to revolutionize computation, cryptography, and simulation of quantum systems. However, quantum physics places a new demand on information processing hardware: quantum states are fragile, and so must be controlled without being measured. Researchers at the Niels Bohr Institute have now demonstrated a key property of Majorana zero modes that protects them from decoherence. The result lends positive support to the existence of Majorana modes, and goes further by showing that they are protected, as predicted theoretically. The results have been published in the prestigious scientific magazine, Nature. Normal computers are limited in their ability to solve certain classes of problems. The limitation lies in the fact that the operation of a conventional computers is based on classical states, or bits, the fundamental unit of information that is either 0 or 1. In a quantum computer, data is stored in quantum bits, or qubits. According to the laws of quantum mechanics, a qubit can be in a superposition of states --- a 0 and 1 at the same time. By taking advantage of this and other properties of quantum physics, a quantum computer made of interconnected qubits should be able to tackle certain problems much more efficiently than would be possible on a classical computer. There are many different physical systems that could in principle be used as quantum bits. The problem is that most quantum systems lose coherence very quickly--the qubit becomes a regular bit once measured. This is why researchers are still searching for the best implementation of quantum hardware. Enter the Majorana zero mode, a delocalized state in a superconductor that resists decoherence by sharing quantum information between separated locations. In a Majorana mode, the information is stored in such a way that a disturbance of either location leaves the quantum information intact. 
"We are investigating a new kind of particle, called a Majorana zero mode, which can provide a basis for quantum information that is protected against measurement by a special, and who knows, perhaps unique, property of these particles. Majorana particles don't exist as particles on their own, but they can be created using a combination of materials involving superconductors and semiconductors. What we find is that, first of all, the Majorana modes are present, verifying previous experiments, but more importantly that they are protected, just as theory predicts," says Villum Kann Rasmussen Professor Charles Marcus, Director of the Center for Quantum Devices (QDev) and Station Q Copenhagen at the Niels Bohr Institute, University of Copenhagen.

Nanowires for quantum technology

The Center for Quantum Devices is a leading research center in quantum information technology, with activities in theory, experiment, and materials research. Semiconductor nanowires around 10 micrometers long and around 0.1 micrometers in diameter, coated with superconducting aluminum, were used to form isolated islands of various lengths. By applying a strong magnetic field along the axis of the wire and cooling the wires to below a tenth of a kelvin, a new kind of superconducting state, called a topological superconductor, was formed.

Quantum states are protected

In 2012, physicists at Delft University in the Netherlands found the first signatures of Majorana zero modes in a similar system, with further evidence revealed in subsequent experiments around the world. Now, researchers at the Center for Quantum Devices have demonstrated critical predictions regarding their behavior, namely that their quantum states are protected in a fundamentally different manner from conventional quantum states. 
The experiments were carried out by PhD candidate Sven Albrecht and postdoc Andrew Higginbotham, now at the University of Colorado/NIST, USA, using new superconductor-semiconductor hybrid nanowires developed by Assistant Professor Peter Krogstrup in collaboration with Marcus and Professor Jesper Nygard. "The protection is related to the exotic property of the Majorana mode that it simultaneously exists on both ends of the nanowire, but not in the middle. To destroy its quantum state, you have to act on both ends at the same time, which is unlikely," says Sven Albrecht. Albrecht explains that it was a challenging effort to demonstrate the protection experimentally. The researchers had to repeat their experiment many times with nanowires of different lengths in order to show that the protection improved with wire length. "Exponential protection is an important check as we continue our basic exploration, and ultimately application, of topological states of matter. Two things have pushed the field forward, from the first Majorana sightings at Delft to the present results: the first is strong interaction between theory and experiment. The second is remarkable materials development in Copenhagen, an effort that predates our Center. Without these new materials, the field was rather stuck. That's behind us now," says Charles Marcus.

The research at the Center for Quantum Devices and Station Q Copenhagen was supported by Microsoft Research, the Danish National Research Foundation and the Villum Foundation.
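The phrase "exponential protection" has a concrete quantitative meaning: theory predicts the residual energy splitting of a Majorana pair to shrink roughly as exp(-L/xi) with wire length L, so repeating the measurement at several lengths should trace a straight line on a semilog plot. The sketch below illustrates that relationship with entirely hypothetical numbers (the coherence length and prefactor are made up, not the paper's data):

```python
import math

# Assumed, illustrative parameters -- not measured values from the study.
XI_UM = 0.26    # coherence length, micrometers
A_UEV = 100.0   # prefactor, micro-electronvolts

def splitting_ueV(length_um):
    """Model Majorana splitting dE ~ A * exp(-L / xi) for wire length L."""
    return A_UEV * math.exp(-length_um / XI_UM)

# Repeating the measurement at several wire lengths, as the group did,
# shows the protection improving (the splitting shrinking) with length:
lengths = [0.4, 0.8, 1.2, 1.6]
for L in lengths:
    print(f"L = {L:.1f} um  ->  splitting ~ {splitting_ueV(L):.3f} ueV")

# On a semilog plot, log(dE) is linear in L: the fitted slope recovers
# -1/xi, which is the signature of exponential protection.
slope = (math.log(splitting_ueV(lengths[-1])) -
         math.log(splitting_ueV(lengths[0]))) / (lengths[-1] - lengths[0])
```

In the real experiment the splitting is extracted from transport data and carries noise, so the exponential trend emerges only across many devices of different lengths, which is why the measurements had to be repeated so many times.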


News Article | March 4, 2016
Site: www.scientificcomputing.com

Since researchers first succeeded in mapping the human genome back in 2003, the technology has developed at warp speed, and the process, which at that time took several years and billions of dollars, can now be performed in a few days. In the Klaus Hansen research group at the Biotech Research & Innovation Centre, University of Copenhagen, researchers have developed a new type of software that enables a much faster analysis and interpretation of the vast amounts of data provided by sequencing technology. "The amount of information that a genome researcher creates, and which forms the basis of his scientific work, has grown a million times during the last two decades. Today, the challenge does not consist in creating the data, but in exploring them and drawing meaningful conclusions. We believe that this analytical tool, which we have called 'EaSeq', can help researchers in doing so," said Associate Professor Klaus Hansen.

ChIP sequencing: an insight into the workflow of human cells

The EaSeq software has been developed for analysis of so-called ChIP sequencing. DNA sequencing is used for mapping the sequence of base pairs that our DNA consists of, and ChIP sequencing is a derived method in which the sequences are used to determine the presence of different cell components in the genome at a given time. "Roughly speaking, ChIP sequencing can be compared to a microscope, which enables us to observe the presence of different cell components in the entire genome at a given time. The method is still quite young and holds the potential to be applied within many more scientific fields, which can benefit from understanding how healthy and pathological cells control and use genes," said Associate Professor Mads Lerdrup. While ChIP sequencing has made it possible to produce enormous amounts of data very fast, the analysis of these data has, until now, been a tedious process. 
Most of the analytical software in use requires knowledge of computer programming, and researchers have therefore been dependent on specialists to decode and analyze their data. EaSeq offers a far more visual and intuitive alternative, which makes it possible for biomedical researchers to study and test hypotheses using their own data. This means that, instead of waiting for weeks for others to carry out an analysis, researchers will be able to perform the analyses themselves in a matter of hours.

Today, DNA sequencing is gaining ground in the clinic, where it is used, for example, for diagnosis and for targeting treatment in cancer. The developers of EaSeq see similar prospects for ChIP sequencing in clinical work, and in that context strong analytical tools will be pivotal. "The DNA sequence itself tells us very little about how cells actually decode the DNA and, to understand this, we need to map out which cell components are present in different parts of the genome at a specific time. It is our hope that we, by increasing feasibility, can enable researchers to faster uncover such knowledge and apply it clinically," said Associate Professor Mads Lerdrup. The research project has been financed by the Danish National Research Foundation, and the results have been published in the journal Nature Structural & Molecular Biology.
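The core bookkeeping that a ChIP-seq analysis tool like EaSeq automates can be illustrated in miniature: count how many sequenced reads cover each genomic position, then look for positions where reads pile up. The toy sketch below uses made-up read coordinates (real tools operate on whole genomes, handle strand and normalization, and scale far better than this nested loop):

```python
from collections import Counter

def coverage(reads, window_start, window_end):
    """Count read depth at each position of a genomic window.

    reads: iterable of (start, end) half-open intervals for aligned reads.
    Returns a Counter mapping position -> number of covering reads.
    """
    depth = Counter()
    for start, end in reads:
        # Clip each read to the window, then tally every covered position.
        for pos in range(max(start, window_start), min(end, window_end)):
            depth[pos] += 1
    return depth

# Hypothetical aligned reads: three overlap near position ~120, one is isolated.
reads = [(100, 136), (110, 146), (118, 154), (300, 336)]
depth = coverage(reads, 0, 400)

# Positions where many reads pile up ("peaks") suggest a protein of
# interest was bound there in many cells when the sample was prepared.
peak = max(depth, key=depth.get)
print(peak, depth[peak])
```

Interpreting such pileups genome-wide, across many samples and cell components, is exactly the step that used to require a programming specialist and that a visual tool aims to make interactive.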
