News Article | February 21, 2017
Scientists from the Broad Institute of MIT and Harvard in Cambridge, Massachusetts, have identified novel mutations in bacteria that promote the evolution of high-level antibiotic resistance. The findings, published in eLife, add to our understanding of how antibiotic resistance develops, which the team says is crucial for maintaining the effectiveness of both existing and future drugs. The rise of antibiotic-resistant bacteria is challenging clinicians, with some infections already resistant to nearly all available drugs. A 2013 report from the Centers for Disease Control and Prevention estimates that such infections kill at least 23,000 people each year in the United States alone*. Deborah Hung, senior author of the current study and Core Institute Member and Co-Director of the Infectious Disease and Microbiome Program at the Broad Institute, says: "Some species of bacteria, including mycobacteria, develop drug resistance as a result of mutations in their genes. We wanted to gain new insight into the molecular processes that promote resistance in these species by looking at the relationships between the concentration of antibiotics, their killing effects on bacteria, and the emergence of drug-resistant mutants." To do this, Hung and her team grew hundreds of cultures of the species Mycobacterium smegmatis (M. smegmatis), a cousin of the bacterium that causes tuberculosis. They exposed the bacteria to low antibiotic concentrations, where the drugs' microbe-killing effects were relatively slow. This allowed the team to monitor the killing of sensitive bacteria while isolating individual wells where mutants developed. "We detected the outgrowth of drug-resistant mutants in a fraction of our cultures," says first author James Gomez. "Each individual carried single mutations in different components of the ribosome, the complex molecular machine responsible for building proteins within cells." 
The team found that these novel ribosomal mutations granted the bacteria resistance to several different classes of antibiotics that do not even target the ribosome, and to which the mutants had never been exposed. They also enhanced resistance to two non-antibiotic stresses: heat shock and membrane stress. Gomez explains: "We did see a fitness cost to the bacteria in that the mutations reduced their growth rate. However, the reprogramming that occurred within the cells in response to the mutations made the bacteria much less sensitive to both antibiotic and non-antibiotic stresses. This suggests that, in species such as M. smegmatis, these types of mutations can enhance fitness in multidrug environments and serve as stepping stones toward the development of high-level drug resistance, despite the cost that the mutations have on growth." The team now wants to explore this phenomenon across diverse bacterial species, including Mycobacterium tuberculosis, by coupling experimental biological approaches with a thorough exploration of genome sequence information. A more complete understanding of how multidrug resistance emerges could help in the development or optimisation of new drugs for treating bacterial infections. The paper 'Ribosomal mutations promote the evolution of antibiotic resistance in a multidrug environment' can be freely accessed online at http://dx. . Contents, including text, figures, and data, are free to reuse under a CC BY 4.0 license. *CDC. Antibiotic Resistance Threats in the United States, 2013. Centers for Disease Control and Prevention, 2013: https:/ eLife is a unique collaboration between the funders and practitioners of research to improve the way important research is selected, presented, and shared. eLife publishes outstanding works across the life sciences and biomedicine -- from basic biological research to applied, translational, and clinical studies. All papers are selected by active scientists in the research community. 
Decisions and responses are agreed by the reviewers and consolidated by the Reviewing Editor into a single, clear set of instructions for authors, removing the need for laborious cycles of revision and allowing authors to publish their findings quickly. eLife is supported by the Howard Hughes Medical Institute, the Max Planck Society, and the Wellcome Trust. Learn more at elifesciences.org.
News Article | February 15, 2017
The novel mode filter for laser beams in the LG33 mode, developed at the AEI. Top: the mode filter in the laboratory. Bottom: schematic of the mode filter. Credit: Noack/Max Planck Institute for Gravitational Physics
One year ago, the first direct detection of gravitational waves was announced. Laser experts from the Max Planck Institute for Gravitational Physics (Albert Einstein Institute; AEI), from Leibniz Universität Hannover, and from the Laser Zentrum Hannover e.V. (LZH) played leading roles in this discovery, because their super-precise laser technology at the heart of the LIGO instruments in the USA enabled the detection of weak gravitational-wave signals. Now, AEI researchers have presented two new technologies capable of further increasing the sensitivity of future gravitational-wave detectors. The Max Planck Society is now strengthening the development of laser systems for third-generation gravitational-wave detectors: the AEI, in collaboration with the LZH, will receive 3.75 million euros in research funding over the next five years for the development of novel lasers and stabilization methods. "We have made two important breakthroughs," says Apl. Prof. Benno Willke, leader of the laser development group at the AEI. "Our work is another step towards using a novel type of laser beam profile in interferometric gravitational-wave detectors. Furthermore, we have shown how to increase the power stability of the high-power lasers used in the detectors. These are important steps towards the future of gravitational-wave astronomy." The results were published in the renowned science journal Optics Letters and were highlighted by the editors. The beams of all laser systems currently used in gravitational-wave detectors have higher intensity at the centre than at the edges. 
This leads to an undesirably strong influence of mirror surface fluctuations on the measurement precision of gravitational-wave detectors. This so-called thermal noise can be reduced by a more homogeneous laser intensity distribution. In 2013, a team with AEI involvement showed how more homogeneous high-power laser beams in the so-called LG33 mode can be created. Now, Andreas Noack, working on his MSc thesis in Benno Willke's team, has studied how these laser beams can be fed into future gravitational-wave detectors. The first step on the way into the detector is a device known as a pre-mode cleaner, which optimizes the beam profile and reduces beam jitter. Willke's team showed that the new LG33 beam is incompatible with the pre-mode cleaners currently in use. The researchers also showed how to solve this problem: they developed a new pre-mode cleaner that is compatible with the LG33 laser beams. "The design of the next-generation gravitational-wave detectors is not set," says Willke. "Therefore, we are testing different types of lasers to have as many options for new gravitational-wave detectors as possible. We have now made a big step ahead with the promising LG33 beams." All interferometric gravitational-wave detectors, such as LIGO, Virgo, and GEO600, rely on laser systems that keep their high output power stable over years and that show very little power fluctuation on short timescales. Benno Willke's research group plays a world-leading role in this research area. It constructed the laser systems for GEO600 and Advanced LIGO, without which the first direct detection of gravitational waves in September 2015 would not have been possible. Now, Jonas Junker, working on his MSc thesis in Willke's team, has further refined the existing power stabilization system. A part of the laser light is picked off and distributed onto multiple photodetectors to precisely determine the total laser power. If it varies, the main laser power is corrected accordingly. 
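The feedback principle just described — average the readings of several photodetectors monitoring a picked-off fraction of the beam, then correct the main laser power when the estimate drifts from its setpoint — can be sketched in a toy simulation. This is purely illustrative: the detector count, noise level, and loop gain below are assumptions for demonstration, not parameters of the AEI system.

```python
# Toy sketch of multi-photodetector power stabilization (illustrative only).
import random

random.seed(1)

def sense(power, n_detectors):
    """Average several noisy photodetector readings of the picked-off light.

    Averaging independent detectors lowers the sensing noise floor, which
    sets the best power stability the feedback loop can reach.
    """
    readings = [power * (1 + random.gauss(0, 0.01)) for _ in range(n_detectors)]
    return sum(readings) / n_detectors

setpoint = 1.0   # desired (normalized) laser power
power = 1.05     # laser starts 5% above the setpoint
gain = 0.5       # feedback gain (assumed value)

for _ in range(100):
    measured = sense(power, n_detectors=4)
    # Correct the main laser power toward the setpoint.
    power -= gain * (measured - setpoint)

residual = abs(power - setpoint)  # leftover error, limited by sensor noise
```

After the loop settles, the residual power error is set by the averaged detector noise rather than by the initial offset, which is the motivation for using an array of photodetectors rather than a single one.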
In their experiment, the scientists extended the current system by adding, among other things, another photodetector to control and correct the pointing of the laser beam. The improved power stabilization scheme has been successfully applied to the 35-watt laser system of the 10-metre prototype interferometer at the AEI. The prototype is used by researchers in Hannover for demonstrations and tests of technologies for the third generation of detectors and for research on quantum-mechanical effects in these instruments. The level of power stability reached is five times higher than that achieved in comparable experiments by other groups, and agrees very well with results from isolated table-top experiments. "An experiment in the well-isolated environment of an optical laboratory is completely different from a complex large-scale experiment like the 10-metre prototype. We have shown for the first time that it is possible to transfer the excellent stability level from a table-top experiment," says Willke. "We show that these photodiode arrays work as expected, meaning it should also be possible to achieve this high stability with the identical multi-photodetector arrays used in Advanced LIGO." More information: Andreas Noack et al. Higher-order Laguerre–Gauss modes in (non-)planar four-mirror cavities for future gravitational wave detectors, Optics Letters (2017). DOI: 10.1364/OL.42.000751
News Article | January 7, 2016
That's the scenario—done with some molecular time travel—that emerged from basic research in the lab of University of Oregon biochemist Ken Prehoda. The mutation and a change it brought in protein interactions are detailed in eLife, an open-access journal launched in 2012 with the support of the Howard Hughes Medical Institute, the Max Planck Society and the Wellcome Trust. The research helps to address several important questions that scientists have had about evolution, said Prehoda, a professor in the Department of Chemistry and Biochemistry and director of the UO's Institute of Molecular Biology. It also has implications for studying disease states, such as cancer, in which damaged cells no longer cooperate with other cells in our bodies and revert to a unicellular state where each is on its own. Mutations can lead to good or bad results, or even to a combination of the two, said Prehoda, whose laboratory primarily focuses on how proteins work inside cells. "Proteins are the workhorses of our cells, performing a wide variety of tasks such as metabolism," he said. "But how does a protein that performs one task evolve to perform another? And how do complex systems, like those that allow cells to work together in an organized way, evolve the many different proteins they require? Our work suggests that new protein functions can evolve with a very small number of mutations. In this case, only one was required." For the research, Prehoda's team began looking at choanoflagellates with the help of Nicole King's group at the University of California, Berkeley. Choanoflagellates are a group of free-living, single-celled organisms considered to be the closest living relatives of animals. These sponge-like, seawater-dwelling organisms have a short, outward-facing squiggly tail called a flagellum that allows them to move and gather food. Choanoflagellates exist both as single-celled solitary individuals and as multicellular colonies. 
Prehoda and colleagues then used ancestral protein reconstruction, a technique devised at the UO by co-author Joseph W. Thornton, a biologist now at the University of Chicago. By using gene sequencing and computational methods to move backward in the evolutionary tree, researchers can see molecular changes and infer how proteins performed in the deep past. In the new research, gene sequences from more than 40 other organisms were put into play. The team's journey identified a mutation that was important for opening the door to organized multicellular animals that eventually no longer needed their tails. They also found that the choanoflagellate flagellum is critical for organizing its multicellular colony, suggesting that this may also have been the case as our single-celled ancestor transitioned to a multicellular lifestyle. Prehoda's team suggests that the tail's role became less important when the gene for an enzyme duplicated within cells, and a single mutation allowed one of the copies to help orient and arrange newly made cells. The protein domain that resulted from this mutation is found today in all animal genomes and their close unicellular relatives, but is absent in other life forms. "This mutation is one small change that dramatically altered the protein's function, allowing it to perform a completely different task," Prehoda said. "You could say that animals really like these proteins because there are now over 70 of them inside of us." More information: Douglas P Anderson et al. Evolution of an ancient protein function involved in organized multicellularity in animals, eLife (2016). DOI: 10.7554/eLife.10147
News Article | February 22, 2017
Cecilia Lanny Winata knew almost nothing about Poland before she was invited to move there from Singapore. At a 2013 zebrafish conference in Barcelona, Spain — where she presented her work on developmental genomics — she was approached by Jacek Kuznicki, director of the International Institute of Molecular and Cell Biology (IIMCB) in Warsaw, who asked if she might like to do science at his institute. The first thing she did was to look Poland up on a map. A visit to the institute three months later persuaded her that the Polish capital is as good a place to start a lab as any better-known science hub in Asia, Europe or North America. Impressed by the IIMCB's state-of-the-art labs, and by the spirit of optimism and enthusiasm among its staff, she decided to take a chance. “Frankly, I was scared about exploring the possibility of moving to a country I knew so little about,” she says. “But I find Warsaw to be a great place to do science, and I feel that as a young group leader I can establish myself in Poland for at least a couple of years.” Stretching from the Baltic Sea in the north to the wild Tatra Mountains in the south, Poland is larger than Italy or the United Kingdom. Like most European countries, it has a long and noble tradition of science and scholarship, epitomized by the likes of Nicolaus Copernicus and Marie Curie. But Polish science, like its national history, has been turbulent. During more than four decades of communist rule, when the country was effectively a satellite state of the Soviet Union, Polish scientists were largely isolated from the rest of the world. Communist regimes generously supported scientific research that was often conducted in secrecy. But when communism imploded around 1990, science in Poland (and throughout Eastern Europe) suffered a dramatic financial collapse and an exodus of researchers. Those days of hardship are over. 
Poland's research intensity — the percentage of gross domestic product (GDP) spent on science — almost doubled between 2005 and 2015, to 1%. Its GDP grew even faster, so overall public and private science spending more than tripled, to €4.3 billion (US$4.6 billion). And since 2004, when the country was one of several from the former Eastern bloc to join the European Union, about €100 billion in EU infrastructure funding has been spent on modernizing roads, hospitals — and scientific facilities. Economically, Poland is already the most successful transition country in Eastern Europe. As for its standing in science, it seems to be en route to regaining lost strength and talent. Poland is already taking the lion's share of scientific publications produced in Eastern Europe. An influx of foreign researchers such as Winata and the creation of international centres and research facilities add a cosmopolitan touch to the country's science. And the proportion of funding through competitive grants is sharply rising. As a result, Poland's contribution to 68 leading science journals examined for Nature's 2016 rising-stars index leapt by 12.7% between 2014 and 2015 (see Nature 535, S56–S61; 2016). But although the brain drain is abating, the European Research Council is still awarding few of its prestigious grants to Polish science institutions — just 3 last year and 16 since 2007 — far fewer than have gone to the United Kingdom, Germany or Hungary. An ambitious scheme, launched last year, aims to create a network of independent research centres that will lure more top scientists. The IIMCB, established in 1999, is emblematic of the country's upswing. In line with international practice, group leaders are selected in open competition and employed on fixed-term contracts. Their research performance is regularly evaluated by an international advisory board. 
Thanks to the institute's reputation, Winata found it much easier than she had expected to quickly establish a competitive multinational group that includes Polish and Indian PhD students, two postdocs who have returned from abroad and one postdoc from Pakistan. Her exploratory visit had already convinced her that the institute's zebrafish facilities and experimental equipment are top-notch. Salaries, although on average about one-third of those in Western Europe, were no hindrance, she says. The IIMCB leads the EU-funded FishMed project — aimed at establishing zebrafish models for human diseases — which allows her to offer her lab members more money than universities and institutes that rely solely on Polish funding sources. The €3.6-million project also involves elite centres in Austria, Germany, the Netherlands, Switzerland and the United Kingdom. “Being able to closely collaborate and exchange staff and students with leading groups in other countries is a big plus,” she says. EU infrastructure funding has helped to refurbish existing labs and create new campuses and science parks across Poland. Warsaw is the country's main research hub, but other cities have benefited as well. The Małopolskie Centre of Biotechnology in Krakow, which received €24 million in structural funds, and the EIT+ science campus in Wrocław, which got more than €200 million, boast ample lab space and research equipment. But the international success of the IIMCB, whose researchers have published in Nature and other high-impact journals, isn't easily replicated (see 'In Pole position'). Poland needs brainpower to establish itself as a scientific nation of international rank, says Maciej Żylicz, president of the Foundation for Polish Science, the country's biggest independent research-funding agency. “We have fantastic labs by now, but we're lacking enough scientists,” he says. 
“We must attract more foreign talent.” Persuading aspiring foreign scientists to come and do science in Poland hasn't been easy. Some among the country's traditionally conservative academic community are suspicious of attempts to make science more international, more competitive and less hierarchical. At universities, the power over money, research directions and publications often still resides with established professors, who tend to resist change. Early-stage independence, which is crucial for a young scientist looking for a professional career in international science, is rare. “Any progress in science depends on courage to try new things, something that has been missing in Poland in the past,” says Olga Malinkiewicz, founder of the Wrocław-based Saule Technologies, a privately backed solar-energy company. “And if you never were in touch with good scientists abroad you can't really change things at home.” No one was studying photonics seriously in Poland in 2006, when Malinkiewicz left for graduate studies in Spain. While at the University of Valencia, she developed a type of efficient solar cell based on a promising material called perovskite (see Nature 513, 470; 2014). Potential investors began to line up before she finished her PhD, and when the opportunity arose in 2014 to start her own company in premises rented from the EIT+, she didn't think twice. Her expanding company has since moved to the Wrocław Technology Park, where commercialization of perovskite-based solar cells is set to start this year. Although no official figures have been made public, a Japanese investor is said to have provided $5.3 million. Meanwhile, the Foundation for Polish Science has launched an ambitious attempt to create a network of independent basic-research institutes run by international-calibre researchers. Ten planned centres will each operate in strategic partnership with an existing institute abroad, and will focus on emerging fields. 
The government has earmarked €126 million in EU structural funds for this International Research Agendas programme. What's new, says Żylicz, is that the centres will be built up around individual scientists who will have maximum freedom to define research directions and hire staff — a principle adopted from the German Max Planck Society, which has successfully applied it for decades. “An empty lab is not a good starting point for science,” he says. “That's why we will look for strong leaders first and then apply a tailor-made research structure around them.” Tomasz Dietl and Tomasz Wojtowicz, two Polish semiconductor physicists who last June won the first round of competition for funding, have received 40 million złoty ($9.8 million) for research on topological phases of matter and new classes of exotic materials. The windfall, says Dietl, will allow him to hire some 40 scientists and set up six groups at an international centre in Warsaw, hosted by the Polish Academy of Sciences' Institute of Physics. “We're not the centre of the universe,” he says. “But we can offer a great deal of scientific freedom, excellent research conditions, nice salaries and good connections with top institutes abroad.” Researchers at the institute will be able to rely on experimental facilities, such as equipment for crystal growth and low-temperature physics, that are on a par with those of the best centres in Western Europe, says solid-state physicist Laurens Molenkamp of the University of Würzburg in Germany. He has a long-standing collaboration with Dietl and will serve as an adviser to the new centre. “The molecular-beam facilities they now have in Warsaw are pretty unique in Europe,” he says. Winners of the second open call will be announced in April, and two more calls will follow. Żylicz hopes that the programme will eventually draw a few dozen principal investigators and hundreds of international postdocs to do science in Poland. Funding for each centre is limited to five years. 
“But as they learn to swim in Polish waters I hope that many newcomers will opt to stay longer,” he says. Competition for funds is much less fierce in Poland than in Germany and many other countries, says Austrian-born structural biologist Sebastian Glatt, who leads an independent research group funded by the Max Planck Society at the Małopolskie Centre of Biotechnology. Things there have turned out so well for him that he is considering extending his stay in Poland beyond the envisaged five years. Within a year of starting, his lab had grown to 16 members — including postdocs from Austria, Spain, Taiwan and Ukraine — and it is set to keep expanding. He has no teaching obligations and is pleased with his success in attracting foreign talent and securing grant money from Polish and European sources. “There is abundant grant money available in Poland now and it is easy for junior scientists with a good track record to get funded here,” he says. “That's a huge advantage — and from the large number of job applications I receive, I can see that many people are aware of it.” Scientists who consider moving to Poland, says Winata, should make sure that their host institute is prepared to help foreigners to acclimatize, for example by supporting them in dealing with authorities and landlords. They should also choose institutes that adopt an open-minded and communicative research culture. Glatt is keen for students to openly discuss their work in department seminars and for scientists to exchange ideas while meeting in core research facilities or during social events. “Office doors at our institutes are wide open all the time,” he says. The government is set to continue to enlarge and modernize Poland's research base. Teaming up with high-profile institutes in Western Europe will assist that effort, and will also help Polish science to get international recognition, says Żylicz. 
The Max Planck Society plans to expand its collaboration with Poland, and France, Switzerland and Spain are also potential partners. Outside the new labs and campuses, Poland has turned into a colourful place with liberal cities brimming with restaurants, bars and theatres. “Poland has become a much different country to the one I had left ten years ago,” says Malinkiewicz. “Something is happening here, and now is a perfect moment for scientists to come and grab their piece of cake.”
News Article | February 21, 2017
Scientists at the Max Planck Florida Institute for Neuroscience are working to understand how neurons in the cerebellum, a region in the back of the brain that controls movement, interact with each other. In a study published in Cell Reports in February 2017, Matt Rowan, Ph.D., a postdoctoral researcher in the lab of Dr. Jason Christie, sought to understand the molecular mechanisms behind a type of short-term neuronal plasticity that may be important for motor control. The team showed that this type of plasticity can impact neurotransmission in as little as 100 milliseconds and depends upon inactivation of Kv3 channels. Interestingly, the team also found that this type of plasticity occurs more readily in juvenile brains than in mature ones. Neuronal communication is frequently described as an all-or-nothing event: if a neuron is depolarized enough, it fires and releases neurotransmitters to communicate with another neuron; if it doesn't reach the threshold to fire, it doesn't send a signal at all. However, depolarizations that don't reach the threshold to make the neuron fire can still impact neurotransmission. The depolarization spreads throughout the neuron, and when the neuron does eventually reach the threshold to fire, it releases a stronger signal with more neurotransmitters. This is known as analog-to-digital facilitation, a type of short-term plasticity. "This has been seen before, and we're adding a molecular mechanism showing exactly the molecule you need to get this sort of facilitation," explained Rowan. Researchers were already aware that this type of short-term plasticity exists, but had struggled to view it directly because the axons that utilize it are difficult for scientists to access. This means that some of the molecular mechanisms behind the phenomenon remain mysterious. 
For the current study, the team used novel techniques for voltage imaging and patch-clamp recordings that allowed them to visualize and record from these tiny sections of individual neurons. The researchers observed analog-to-digital facilitation as it occurred in experimental models. They showed that subthreshold depolarization spreads from the body of the neuron down its axon, the long extension through which action potentials travel before causing the neuron to release neurotransmitters into a synapse. Here, subthreshold depolarizations impacted neurotransmission in the juvenile models by briefly making Kv3.4 channels unavailable, thereby increasing the duration of the presynaptic spike. The fact that the group observed less of this plasticity in mature models suggests that learning and experience may temper it as an animal matures. The team chose to study inhibitory interneurons in the cerebellum because they play an especially important role in the function of circuits throughout the cerebellum as well as the rest of the brain. Understanding this type of neuronal plasticity may have important implications for understanding motor disorders such as cerebellar ataxia, a disorder that can cause a variety of motor problems in humans, ranging from increased falling to difficulty with speech and swallowing. This work was supported by the National Institutes of Health (NS083127 and NS083894) and funding from the Max Planck Florida Institute for Neuroscience and the Max Planck Society. The Max Planck Florida Institute for Neuroscience (Jupiter, Florida, USA) specializes in the development and application of novel technologies for probing the structure, function, and development of neural circuits. It is the first research institute of the Max Planck Society in the United States.
News Article | February 21, 2017
Some organisms might have an interesting strategy for long-term survival: switching between two unsustainable forms of behaviour that, when left unchecked, can each cause them to wipe out their own homes. This discovery, published in the journal eLife, could provide insight into how some species, including humans, can survive and even thrive in harsh conditions and with limited environmental resources. During their life cycles, organisms such as slime moulds switch between living as single, free-ranging individuals (known as 'nomads') and living communally in a colony. To explore the benefits of this adaptation, researchers from the Singapore Institute of Technology and Yale University created a mathematical model that can be applied to such behaviour-switching organisms. Their model suggests that the strategy can ensure survival even when each behaviour would independently result in extinction. "This is an example of a counter-intuitive phenomenon called Parrondo's paradox, where two losing games, when played in a specific order, can surprisingly end in a victory," says first author Zong Xuan Tan, an undergraduate at Yale University. "Previous studies have demonstrated that the paradox can occur when organisms are faced with unpredictable environments. However, our research shows that externally caused environmental variation is not actually needed for organisms to display this behaviour - the paradox can also occur when they form colonies that destroy their own habitats." The model considers a situation where nomads live relatively independently and are unaffected by competition and cooperation, but are subject to steady extinction under poor environmental conditions. Colonists, on the other hand, live in close proximity and are subject to both competitive and cooperative effects. They can also deplete the resources of their habitat over time, resulting in their own extinction. 
"These two losing strategies can actually lead to survival because when the organisms switch from their destructive colonial form to live as nomads instead, this allows for habitat regeneration," says senior author Kang Hao Cheong, Assistant Professor in the Engineering Cluster at the Singapore Institute of Technology. "Once colonial population sizes are sufficiently small, environmental resources are allowed to recover. The nomads can then take advantage of the restored stocks by switching back to colonialism." Cheong explains that a variety of mechanisms might trigger this switching behaviour. For example, highly mobile nomadic organisms could frequently re-enter their original colonial habitat, thereby detecting whether resource levels are high enough for recolonization. Switching behaviour could also be genetically programmed, such that 'involuntary' individual sacrifice ends up promoting the long-term survival of the species. "The possibility of an ecological Parrondo's paradox could have wide-ranging applications across the fields of ecology and population biology," Cheong says. Tan and Cheong are already exploring ways to adapt their model to specific organisms, and to investigate the possible evolutionary origins of this behavioural phenomenon. The paper 'Nomadic-colonial life strategies enable paradoxical survival and growth despite habitat destruction' can be freely accessed online at http://dx. . Contents, including text, figures, and data, are free to reuse under a CC BY 4.0 license. 
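The alternation described here can be illustrated with a toy simulation. This sketch is not the authors' published model — the growth laws, switching thresholds, and parameter values below are invented for demonstration — but it reproduces the qualitative point: pure nomadism and pure colonialism each go extinct, while switching between them survives.

```python
# Illustrative sketch of an ecological Parrondo's paradox: two individually
# losing strategies, combined by switching, yield long-term survival.
# All equations and parameters are assumptions chosen for demonstration.

def simulate(strategy, steps=3000):
    pop, res = 10.0, 100.0            # population size, habitat resources
    colonial = (strategy != "nomad")
    for _ in range(steps):
        if strategy == "switching":
            if colonial and res < 30.0:
                colonial = False      # habitat depleted: disperse as nomads
            elif not colonial and res > 90.0:
                colonial = True       # habitat recovered: recolonise
        if colonial:
            # Colonists grow only while resources last, and degrade them.
            pop *= 0.90 + 0.20 * res / 100.0
            res = max(res - 0.2 * pop, 0.0)
        else:
            # Nomads decline slowly; the vacated habitat regenerates.
            pop *= 0.995
            res = min(res + 4.0, 100.0)
        if pop < 1e-3:
            return 0.0                # extinction
    return pop

# Each pure strategy drives itself extinct; alternating between them
# lets the habitat recover during nomadic phases, so the lineage persists.
nomad_only = simulate("nomad")
colony_only = simulate("colonial")
switching = simulate("switching")
```

In this sketch, the colony-only run exhausts its resources and collapses, the nomad-only run decays steadily, and only the switching run ends with a positive population — the "two losing games" combining into a win.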
Decisions and responses are agreed by the reviewers and consolidated by the Reviewing Editor into a single, clear set of instructions for authors, removing the need for laborious cycles of revision and allowing authors to publish their findings quickly. eLife is supported by the Howard Hughes Medical Institute, the Max Planck Society, and the Wellcome Trust. Learn more at elifesciences.org.
News Article | March 1, 2017
Glass artisans in medieval times exploited the effect long before it was even understood. They coloured the magnificent windows of Gothic cathedrals with nanoparticles of gold, which glowed red in the light. It was not until the middle of the 20th century that the underlying physical phenomenon was given a name: plasmons. These collective oscillations of free electrons are stimulated by the absorption of incident electromagnetic radiation. The smaller the metallic particles, the shorter the wavelength of the absorbed radiation. In some cases the resonance frequency, i.e. the absorption maximum, falls within the visible light spectrum. The unabsorbed part of the spectrum is then scattered or reflected, creating an impression of colour. The metallic particles, which usually appear silvery, copper-coloured or golden, then take on entirely new colours.

Researchers are also taking advantage of the effect to develop plasmonic printing, in which tailor-made square metal particles are arranged in specific patterns on a substrate. The edge length of the particles is on the order of less than 100 nanometres (100 billionths of a metre). This allows a resolution of 100,000 dots per inch – several times greater than what today's printers and displays can achieve. For metallic particles measuring several hundred nanometres across, the resonance frequency of the plasmons lies within the visible light spectrum. When white light falls on such particles, they appear in a specific colour, for example red or blue. The colour is determined by the size of the particles and their distance from each other. These adjustment parameters therefore serve the same purpose in plasmonic printing as the palette of colours in painting.

The trick with the chemical reaction

The Smart Nanoplasmonics Research Group at the Max Planck Institute for Intelligent Systems in Stuttgart also makes use of this colour variability. They are currently working on making plasmonic printing dynamic.
They have now presented an approach that allows them to alter the colours of the pixels predictably – even after an image has been printed. "The trick is to use magnesium. It can undergo a reversible chemical reaction in which the metallic character of the element is lost," explains Laura Na Liu, who leads the Stuttgart research group. "Magnesium can absorb up to 7.6% of hydrogen by weight to form magnesium hydride, or MgH2," Liu continues. The researchers coat the magnesium with palladium, which acts as a catalyst in the reaction.

During the continuous transition of metallic magnesium into non-metallic MgH2, the colour of some of the pixels changes several times. The colour change, and the rate at which it proceeds, follow a clear pattern. This is determined both by the size of the individual magnesium particles and the distance between them, as well as by the amount of hydrogen present. At total hydrogen saturation the colour disappears completely, and the pixels reflect all the white light that falls on them. This is because the magnesium is no longer present in metallic form but only as MgH2; hence, there are no longer any free metal electrons that can be made to oscillate.

The scientists demonstrated this dynamic colour behaviour on a plasmonic print of Minerva, the Roman goddess of wisdom, which also bore the logo of the Max Planck Society. They chose the size of their magnesium particles so that Minerva's hair first appeared reddish, the head covering yellow, the feather crest red, and the laurel wreath and outline of her face blue. They then washed the micro-print with hydrogen. A time-lapse film shows how the individual colours change: yellow turns red, red turns blue, and blue turns white. After a few minutes all the colours disappear, revealing a white surface instead of Minerva. The scientists also showed that this process is reversible by replacing the hydrogen stream with a stream of oxygen.
The oxygen reacts with the hydrogen in the magnesium hydride to form water, so that the magnesium particles become metallic again. The pixels then change back in reverse order, and in the end Minerva appears in her original colours. In a similar manner the researchers first made a micro image of a famous Van Gogh painting disappear and then reappear. They also produced complex animations that give the impression of fireworks.

The principle of a new encryption technique

Laura Na Liu can imagine using this principle in a new encryption technology. To demonstrate this, the group formed various letters with magnesium pixels. The addition of hydrogen then caused some letters to disappear over time, like the image of Minerva. "As for the rest of the letters, a thin oxide layer formed on the magnesium particles after the sample was exposed to air for a short time before palladium deposition," Liu explains. This layer is impermeable to hydrogen. The magnesium lying under the oxide layer therefore remains metallic - and visible - because light is able to excite the plasmons in the magnesium.

In this way it is possible to conceal a message, for example by mixing real and nonsensical information. Only the intended recipient is able to make the nonsensical information disappear and filter out the real message. For example, after decoding the message "Hartford" with hydrogen, only the words "art or" would remain visible. To make it more difficult to crack such encrypted messages, the group is currently working on a process that would require a precisely adjusted hydrogen concentration for deciphering.

Liu believes that the technology could also be used some day in the fight against counterfeiting. "For example, plasmonic security features could be printed on banknotes or pharmaceutical packs, which could later be checked or read only under specific conditions unknown to counterfeiters."
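The masking logic behind the "Hartford" example can be illustrated abstractly. The sketch below is a hypothetical analogy, not chemistry: a boolean mask stands in for the oxide-protected pixels, and the hydrogen wash simply blanks out every unprotected character.

```python
def reveal(printed, protected):
    """Blank out each character whose pixels lack the protective oxide layer."""
    return "".join(ch if keep else " " for ch, keep in zip(printed, protected))

# Hypothetical mask: 'H', 'f' and 'd' are printed without the oxide layer,
# so the hydrogen wash erases them and only "art or" survives.
mask = [False, True, True, True, False, True, True, False]
decoded = reveal("Hartford", mask).strip()
```

Only a recipient who knows which positions were oxide-protected (here, the mask) recovers the intended message from the mix of real and nonsensical characters.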
It doesn't necessarily have to be hydrogen

Laura Na Liu knows that the use of hydrogen makes some applications, such as mobile displays, difficult and impractical for everyday use. "We see our work as a starting signal for a new principle: the use of chemical reactions for dynamic printing," the Stuttgart physicist says. It is certainly conceivable that the research will soon lead to the discovery of chemical reactions for colour changes other than the phase transition between magnesium and magnesium hydride - for example, reactions that require no gaseous reactants.
News Article | February 15, 2017
Researchers from the University of Toronto, Canada, have discovered a reason why we often struggle to remember the smaller details of past experiences. Writing in the journal eLife, the team found that specific groups of neurons in the medial prefrontal cortex (mPFC) of the rat brain - the region most associated with long-term memory - develop codes to help store relevant, general information from multiple experiences while, over time, losing the more irrelevant, minor details unique to each experience. The findings provide new insight into how the brain collects and stores useful knowledge about the world that can be adapted and applied to new experiences.

"Memories of recent experiences are rich in incidental detail but, with time, the brain is thought to extract important information that is common across various past experiences," says Kaori Takehara-Nishiuchi, senior author and Associate Professor of Psychology at the University of Toronto. "We predicted that groups of neurons in the mPFC build representations of this information over the period when long-term memory consolidation is known to take place, and that this information has a larger representation in the brain than the smaller details."

To test their prediction, the team studied how two different memories with overlapping associative features are coded by neuron groups in the mPFC of rat brains, and how these codes change over time. Rats were given two experiences with an interval between each: one involving a light and tone stimulus, and the other involving a physical stimulus. This gave them two memories that shared a common stimulus relationship. The scientists then tracked the neuron activity in the animals' brains from the first day of learning to four weeks following their experiences.
"This experiment revealed that groups of neurons in the mPFC initially encode both the unique and shared features of the stimuli in a similar way," says first author Mark Morrissey, formerly a graduate researcher at the University of Toronto. "However, over the course of a month, the coding becomes more sensitive to the shared features and less sensitive to the unique features, which become lost."

Further experiments also revealed that the brain can immediately adapt the general knowledge gained from multiple experiences to a new situation. "This goes some way to answering the long-standing question of whether the formation of generalised memory is simply a result of the brain's network 'forgetting' incidental features," Morrissey explains. "On the contrary, we show that groups of neurons develop coding to store shared information from different experiences while, seemingly independently, losing selectivity for irrelevant details."

Morrissey adds that the unique coding property of the mPFC identified in the study may support its role in the formation, maintenance, and updating of associative knowledge structures that help support flexible and adaptive behaviour in rats and other animals.

The paper 'Generalizable knowledge outweighs incidental details in prefrontal ensemble code over time' can be freely accessed online at http://dx. . Contents, including text, figures, and data, are free to reuse under a CC BY 4.0 license.
Leydesdorff L., University of Amsterdam | Bornmann L., Max Planck Society
Journal of the American Society for Information Science and Technology | Year: 2011
In bibliometrics, the association of "impact" with central-tendency statistics is mistaken. Impacts add up, and citation curves therefore should be integrated instead of averaged. For example, the journals MIS Quarterly and Journal of the American Society for Information Science and Technology differ by a factor of 2 in terms of their respective impact factors (IF), but the journal with the lower IF has the higher impact. Using percentile ranks (e.g., top-1%, top-10%, etc.), an Integrated Impact Indicator (I3) can be based on integration of the citation curves, after normalization of the citation curves to the same scale. The results across document sets can be compared as percentages of the total impact of a reference set. The total number of citations, however, should not be used instead, because it fails to take the shape of the citation curves into account. I3 can be applied to any document set and any citation window. The results of the integration (summation) are fully decomposable in terms of journals or institutional units such as nations, universities, and so on, because percentile ranks are determined at the paper level. In this study, we first compare I3 with IFs for the journals in two Institute for Scientific Information subject categories ("Information Science & Library Science" and "Multidisciplinary Sciences"). The library and information science set is additionally decomposed in terms of nations. Policy implications of this possible paradigm shift in citation impact analysis are specified. © 2011 ASIS&T.
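The core construction can be sketched in a few lines. This is a simplified reading of the indicator - the published I3 aggregates weighted percentile-rank classes (top-1%, top-10%, etc.) rather than raw percentiles, and the function names are invented here - but it shows why summing paper-level percentile ranks differs from averaging: a larger set of modestly cited papers can out-integrate a small set of highly cited ones.

```python
def percentile_rank(cites, reference):
    """Share of the reference set cited less often, on a 0-100 scale."""
    below = sum(1 for r in reference if r < cites)
    return 100.0 * below / len(reference)

def i3(document_set, reference_set):
    """Integrate (sum) paper-level percentile ranks instead of averaging them."""
    return sum(percentile_rank(c, reference_set) for c in document_set)
```

Because each paper's contribution is computed at the paper level, the sum decomposes cleanly by journal, nation, or institution, as the abstract notes.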
Bornmann L., Max Planck Society
Scientometrics | Year: 2012
Purpose - This paper aims to look at the Hawthorne effect in editorial peer review. Design/methodology/approach - Discusses the quality evaluation of refereed scholarly journals. Findings - A key finding of this research was that, in the peer review process of one and the same manuscript, reviewers or editors, respectively, arrive at different judgements. This phenomenon is termed the "Hawthorne effect" because the different judgements depend on the specific conditions under which the peer review process at the individual journals takes place. Originality/value - Provides a discussion on the quality evaluation of scholarly journals. © 2011 Akadémiai Kiadó, Budapest, Hungary.