LMU
München, Germany


News Article | March 2, 2017
Site: www.eurekalert.org

This year ten researchers - four women and six men - will receive the Heinz Maier-Leibnitz Prize, the most important award for early career researchers in Germany. The recipients were chosen by a selection committee in Bonn appointed by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) and the Federal Ministry of Education and Research (BMBF). The prizewinners will each be presented with the €20,000 prize on 3 May in Berlin, followed by a celebration of the 40th anniversary of the Heinz Maier-Leibnitz Prize.

The prize has been awarded annually to outstanding early career researchers since 1977 - as both recognition and an incentive to continue pursuing a path of academic excellence. Since 1980 it has been named after the atomic physicist and former DFG President Heinz Maier-Leibnitz, during whose period in office (1973-1979) it was first awarded. Its standing extends beyond early career research: in a survey carried out by "bild der wissenschaft" magazine, the major research organisations voted the Heinz Maier-Leibnitz Prize the third most important research prize in Germany, after the Gottfried Wilhelm Leibniz Prize, presented by the DFG, and the Deutscher Zukunftspreis, awarded by the German President.

A total of 154 researchers representing all research areas were nominated for this year's prize; 14 of the nominees were then shortlisted. "We were delighted at the sheer number of nominations received in the prize's anniversary year," said the chair of the selection committee, mathematician and DFG Vice President Prof. Dr. Marlis Hochbruck. "The ten recipients are an outstanding example of the high standard of academic quality and qualification of many young researchers in Germany."

In his research, Andreas Geiger deals with the broad field of computer vision, in which he has already achieved international renown.
His work combines machine vision and robotics. Geiger's main aim is to understand the basic principles of autonomous intelligent systems, especially in the area of autonomous driving. His work is therefore highly relevant not only socially but also economically. Many of the algorithms he has developed are now being used by research teams and companies throughout the world, and his scientific papers have already won multiple awards. Since 2016 Geiger has led the independent Max Planck research group 'Autonomous Machine Vision'. In the same year he was offered an interim professorship at ETH Zurich, in one of the world's biggest and most renowned labs for computer vision.

As a postdoctoral researcher, Christian Gross was involved in the pioneering development of microscopes for the observation of single atoms in optical lattices. This enabled him to model a wide range of quantum systems experimentally and to answer questions at the boundary of statistical physics and quantum mechanics. Gross achieved important results relating to phase transitions, magnetic correlations and non-equilibrium systems. Another key area of his work is the physics of Rydberg superatoms, with which he has generated new types of quantum crystals, for example. In 2015 Gross received an ERC Starting Grant for his project 'Rydberg-dressed Quantum Many-Body Systems' in order to advance research with his team that could pave the way for the design of quantum magnets.

How do our attitudes influence our choices and our ability to make moral judgements? When do personal experiences turn into prejudices? Mandy Hütter seeks answers to questions like these. She demonstrates that not all attitudes are the result of conscious learning processes and that moral judgements also depend on 'situational cues'. Hütter has published her results in internationally respected journals.
In clinical practice her findings have proved useful in interventional approaches to phobias, and they are also yielding new insights into social prejudices, the study of democratic processes and the 'wisdom of the many'. Hütter, who also regularly presents her work to the general public, is a junior professor and leader of the Social and Organisational Psychology group at the University of Tübingen. She also leads an Emmy Noether independent junior research group.

Difficulty in dealing with emotions and in regulating them through reappraisal is not limited to people with a range of psychological disorders: the same applies to healthy people at increased risk of developing such disorders. This is one finding from the work of psychologist Philipp Kanske, who studies the influence of emotions on the way we think and perceive. He combines basic research with clinical studies, which enables him to adopt an original perspective on the topic at various psychological levels. With approximately 50 publications to date, Kanske has already had a notable impact on clinical-psychological neuroscience. In 2015 he was appointed to the Junge Akademie of the Berlin-Brandenburg Academy of Sciences and Humanities and the German National Academy of Sciences Leopoldina. At the Max Planck Institute in Leipzig he leads the Research Unit 'Psychopathology of the Social Brain'.

Since 2013 Christoph Kirchlechner has led the working group 'Nano-/Micromechanics of Materials' at the Max Planck Institute for Iron Research in Düsseldorf, where he and his team study the deformation and failure of materials in mesoscopic dimensions. The team's combination of micromechanical experiments and innovative methods for the characterisation of structures - including the so-called micro-Laue method - is unique. One measurement method co-developed by Kirchlechner makes it possible to investigate the influence of atomic defects on specific material properties.
It therefore provides answers to key questions in materials science and engineering, specifically the mechanisms of fine grain hardening and the formation of dislocation structures during fatigue. Kirchlechner is already considered an internationally recognised expert in micromechanical experiments at synchrotrons.

Olivier Namur collected a number of awards while still a student in Belgium and now publishes in his specialist field - the study of volcanic systems and magmatic processes on Earth, the Moon and Mercury - with remarkable impact in international journals. Namur has developed thermodynamic models not only of the crystallization of magmas but also of their physical properties. His research has also resulted in new experimental high-pressure, high-temperature methods. Another focus of Namur's research is the investigation and modelling of the textures of minerals in igneous rock, which contain information about the transport of materials and temporal processes in the Earth's deep crust. In recent years this has included crystal mushes - magmas with a very high crystalline content - which reach the surface as fragments during eruptions and could provide clues to the structure of the Earth's lower crust.

Ute Scholl's field is the study of hypertension, especially (pre)disposition to this condition due to genetic defects in ion channels and ion transporters. After writing her doctoral thesis on ClC-K chloride channels, which produced a number of highly regarded publications, in her postdoctoral phase she became the first researcher to describe a new syndrome - associated with epilepsy, inner ear hearing loss, ataxia and renal salt loss - and its genetic basis. Scholl's research has made a significant contribution to the understanding of the hormonal dysregulation that leads to secondary hypertension, with consequences such as cardiovascular disorders or stroke.
Since 2014 Scholl has been a junior professor in Experimental Nephrology and Hypertensiology at the University of Düsseldorf. In 2016 she served as deputy spokesperson of the Junges Kolleg of the North Rhine-Westphalian Academy of Sciences, Humanities and the Arts. Her work has won numerous awards, including the Walter Clawiter Prize and the Ingrid zu Solms Research Prize.

With his dissertation 'Verisimilitudo. Die epistemologischen Voraussetzungen der Gotteslehre Abaelards' and his habilitation thesis 'Theologie aus anthropologischer Ansicht. Der Entwurf Franz Oberthürs (1745-1831)', Michael Seewald established himself within a few years as an expert in dogmatics and ecumenical theology. The former won the Cardinal Wetter Prize of the Catholic Academy in Bavaria, while the latter was awarded the Karl Rahner Prize presented by the University of Innsbruck. Through his habilitation thesis in particular, Seewald produced a fundamental work on the reception of the European Enlightenment in the milieu of Catholic dogmatics - one which, through the lens of an individual, also sheds new light on the general relationship between the Catholic Church and modernity, filling an important gap in research. Since January 2016, Seewald has taught as a Privatdozent in dogmatics and ecumenical theology at LMU Munich.

Marion Silies began to study motion perception in Drosophila as a postdoctoral researcher at Stanford University. Since 2014 she has led the Emmy Noether independent junior research group 'The Cellular and Molecular Basis of Motion Perception' at the University of Göttingen. In this group she investigates the open question of how neural networks perform critical computations and how sensory systems use these computations to extract information from the environment and control behaviour. Among the tools Silies uses is a genetic 'toolbox' that she established and that is now used by numerous laboratories worldwide.
With this toolbox researchers can manipulate neural function in specific cells and thus identify the neural networks underlying motion perception. Silies has won multiple awards for her work. In 2016 she received an ERC Starting Grant for her project 'MicroCyFly'.

Within comparative literature, Evi Zemanek's fields of research range from antiquity to the present day. In cultural ecology and 'ecocriticism', which investigates literary texts in the context of ecological questions, she is considered a pioneer in German-language literary studies. In 2012 she established the DFG early career researcher network 'Ethics and Aesthetics of Literary Representations of Ecological Transformations', on behalf of which she organised six groundbreaking conferences. Since her dissertation 'Das Gesicht im Gedicht' (2010), intermediality research - especially the relationship of literature to painting, photography and architecture - has been another key aspect of her scholarly work. Zemanek is a junior professor of Modern German Literature and Intermediality at the University of Freiburg. In the winter semester 2016/2017 she will serve as an interim professor in the Institute of Media and Cultural Studies.

The 2017 Heinz Maier-Leibnitz Prize award ceremony, followed by a celebratory event, will be held on 3 May at 6 pm at the Berlin-Brandenburg Academy of Sciences and Humanities, Markgrafenstraße 38, 10117 Berlin. Representatives from the media are cordially invited to attend the award ceremony. Please register in advance with the DFG Press and Public Relations Office, tel. +49 228 885-2109, presse@dfg.de. More information about the prize and previous winners is available at: http://www.


The cell is the fundamental unit of all living organisms. Hence, in order to understand essential biological processes and the perturbations that give rise to disease, one must first dissect the functions of cells and the mechanisms that regulate them. Modern high-throughput protein and nucleic-acid sequencing techniques have become an indispensable component of this endeavor. In particular, single-cell RNA sequencing (scRNA-seq) permits one to determine the levels of RNA molecules - the transcribed gene copies - that are expressed in a given cell, and several versions of the methodology have been described in recent years. The spectrum of genes expressed in a given cell amounts to a molecular fingerprint, which yields a detailed picture of its current functional state. "For this reason, the technology has become an extraordinarily valuable tool, not only for basic research but also for the development of new approaches to treat diseases," says LMU biologist Wolfgang Enard. Enard and his team have now undertaken the first comprehensive comparative analysis of the various RNA sequencing techniques, with regard to their sensitivity, precision and cost efficiency. Their results appear in the leading journal Molecular Cell.

The purpose of scRNA-seq is to identify the relative amounts of the messenger RNA (mRNA) molecules present in the cells of interest. mRNAs are the blueprints that specify the structures of all the proteins made in the cell, and represent "transcribed" copies of the corresponding genetic information encoded in specific segments of the genomic DNA in the cell nucleus. In the cytoplasm surrounding the nucleus, the nucleotide sequences of mRNAs are "translated" into the amino-acid sequences of proteins by molecular machines called ribosomes. Thus a complete catalog of the mRNAs in a cell provides a comprehensive view of the proteins that it produces, and tells one what subset of the thousands of genes in the genome are active and how their activity is regulated.
Furthermore, aberrant patterns of gene activity point to disturbances in gene expression and cell function, and reveal the presence of specific pathologies. The scRNA-seq procedure itself can be carried out using commercially available kits, but many researchers prefer to assemble the components for their preferred formulations themselves. In order to ascertain which of the methods currently in use is most effective and economical, Enard and his colleagues applied six different methods to mouse embryonic stem cells and compared the spectra of mRNAs detected by each of them. They then used these data to compute how much it costs for each method to reliably detect differentially expressed genes between two cell types. "This comparison revealed that some of the commercial kits are ten times more expensive than the corresponding home-made versions," Enard says. However, the researchers point out that the choice of the optimal method largely depends on the conditions and demands of the individual experiment. "It does make a difference whether one wants to analyze the activity of hundreds of genes in thousands of individual cells, or thousands of genes in hundreds of cells," Enard says. "We were able to demonstrate which method is best for a given purpose, and we also obtained data that will be useful for the further development of the technology."

The new findings are of particular interest in the field of genomics. For example, scRNA-seq is a fundamental prerequisite for the success of the effort to assemble a Human Cell Atlas - one of the most ambitious international projects in genomics since the initial sequencing of the human genome. It aims to provide no less than a complete inventory of all the cell types and subtypes in the human body, at all stages of development from embryo to adult, on the basis of their patterns of gene activity. It is estimated that the total number of cells in the human body is on the order of 3.5 × 10¹³.
Scientists expect that such an atlas would revolutionize our knowledge of human biology and our understanding of disease processes.

More information: Christoph Ziegenhain et al., "Comparative Analysis of Single-Cell RNA Sequencing Methods," Molecular Cell (2017). DOI: 10.1016/j.molcel.2017.01.023
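The cost-efficiency calculation described above - cost per cell times the number of cells needed to reliably detect differentially expressed genes - can be sketched as follows. All method names and numbers here are invented for illustration only; the study's actual figures are in the Molecular Cell paper.

```python
# Toy cost-efficiency comparison in the spirit of the study: for each
# method, the total cost of an experiment is (cost per cell) times
# (cells needed to reach a target detection power). Invented numbers.

methods = {
    # name: (cost_per_cell_eur, cells_needed_for_target_power)
    "commercial_kit_A": (25.0, 80),
    "commercial_kit_B": (10.0, 120),
    "home_made_C":      (2.0, 150),
    "home_made_D":      (1.0, 250),
}

def total_cost(cost_per_cell, cells_needed):
    """Cost of one two-group comparison (both cell types sequenced)."""
    return cost_per_cell * cells_needed * 2

# Rank methods from cheapest to most expensive for this comparison.
ranked = sorted(methods.items(), key=lambda kv: total_cost(*kv[1]))
for name, (c, n) in ranked:
    print(f"{name}: {total_cost(c, n):.0f} EUR")
```

Note how a method with a higher per-cell sensitivity (fewer cells needed) can beat a nominally cheaper one, which is why the optimal choice depends on the experiment, as Enard points out.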


News Article | February 21, 2017
Site: www.eurekalert.org

A collaborative study describes a novel myoclonic epilepsy syndrome in dogs for the first time and identifies its genetic cause in the DIRAS1 gene. The affected dogs developed myoclonic seizures at a young age - on average at 6 months - and the seizures typically occur at rest. In some of the dogs the seizures could be triggered by light. The canine myoclonic epilepsy resembles human juvenile myoclonic syndrome in many respects, and the study therefore has meaningful implications for epilepsy research across species, says Professor Hannes Lohi from the canine gene research group, University of Helsinki. Myoclonic epilepsies are among the most common forms of epilepsy in humans, and the canine findings will not only help in diagnostics but also provide a novel entry point to understanding the pathophysiology of the disease. The identified DIRAS1 gene may play a role in cholinergic transmission in the brain and provides a novel target for the development of epilepsy treatments.

"We found a novel epilepsy gene, DIRAS1, which has not been linked to any neurological diseases before. The gene is poorly characterized so far, but some studies suggest that it may play a role in cholinergic neurotransmission, which could be a highly relevant pathway for the myoclonic epilepsies," explains MSc Sarviaho, co-first author of the study.

"The genetic backgrounds of myoclonic epilepsies are not yet well known, and our study provides a new candidate gene, which helps to further characterize the underlying pathophysiology in future studies. This would be important for the development of new treatment scenarios," summarizes Professor Lohi, senior author of the study. The affected dogs continue to serve as preclinical models as new treatment options are sought in ongoing studies. The results have implications for both veterinary diagnostics and breeding programs.
"We screened over 600 Rhodesian Ridgebacks and about 1,000 epileptic dogs in other breeds and found that the DIRAS1 defect was so far specific to juvenile myoclonic epilepsy in Rhodesian Ridgebacks," says MSc Sarviaho. With the help of the genetic test, veterinarians can diagnose this specific epilepsy in their canine patients, while breeders will be able to identify carriers and revise their breeding plans to avoid affected puppies in the future. "About 15% of the dogs in the breed carry the DIRAS1 mutation, and dogs all over Europe and beyond are affected," says DVM Franziska Wieländer from LMU Munich.

To characterize the clinical features, the researchers utilized a novel wireless video-EEG recording method, which allows real-time monitoring of the electrical events before, during and after a seizure episode in unsedated dogs. "All the wires from the electrodes are attached to a small portable device on the dog's back that transmits the data straight to our computers. Thus, the dog is free to move around and we can record the EEG for long periods in one go," explains Professor Fiona James, who previously developed the method at the University of Guelph, Ontario, Canada. Video-EEG is a routine approach in human epilepsy clinics but has only now been piloted in dogs.

"The beauty of the method is that we can easily correlate the behavioral changes with the recorded electroencephalographs and compare them to human EEG results. Indeed, with this technique we were able to identify epilepsy at an early stage, prior to the development of generalized tonic-clonic seizures. Moreover, we found EEG patterns in dogs strikingly similar to those described in human myoclonic epilepsy," describes Professor Andrea Fischer from LMU Munich. Compared to previous short, 20-minute interictal measurements under sedation, video-EEG is a powerful new approach for veterinary epilepsy research and gives much more accurate results, says Wieländer.
Careful clinical studies helped to establish proper study cohorts to identify the genetic cause. The study was published in Proceedings of the National Academy of Sciences of the USA (PNAS) on 20 February 2017.
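As a rough illustration of what the reported 15% carrier frequency implies for breeding, here is a back-of-the-envelope Hardy-Weinberg sketch. It assumes a simple autosomal recessive defect and random mating, which is a simplification of real pedigree structure and is not a calculation taken from the study itself.

```python
from math import sqrt

# Hardy-Weinberg sketch: what a 15% carrier frequency implies,
# assuming an autosomal recessive defect and random mating.
# Carriers are heterozygotes, so 2q(1 - q) = carrier frequency;
# solving the quadratic 2q^2 - 2q + c = 0 gives allele frequency q.

def allele_freq_from_carriers(c):
    return (2 - sqrt(4 - 8 * c)) / 4

q = allele_freq_from_carriers(0.15)   # defect allele frequency
affected_random = q * q               # expected affected pups, random mating
affected_carrier_cross = 0.25         # expected affected pups, carrier x carrier

print(f"allele frequency q = {q:.3f}")
print(f"affected fraction, random mating = {affected_random:.4f}")
print(f"affected fraction, carrier x carrier = {affected_carrier_cross}")
```

The contrast between the two mating scenarios (well under 1% of pups versus 25%) is why identifying carriers with the genetic test matters so much for breeding plans.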


News Article | February 21, 2017
Site: www.eurekalert.org

Every cell has its own individual molecular fingerprint, which is informative for its functions and regulatory states. Researchers from Ludwig-Maximilians-Universitaet (LMU) in Munich have now carried out a comprehensive comparison of methodologies that quantify the RNAs of single cells.


Every human cell contains some two meters of deoxyribonucleic acid (DNA), which encodes the genetic information that specifies the cell's structures and functions. Yet this "genomic" DNA fits into the cell nucleus, which is less than 10 micrometers in diameter. This means that the nuclear DNA must be highly compacted, primarily by interaction with specific proteins. The basic packing unit is a particle made of proteins called histones, around which the DNA is wrapped. Comparable to small spools, these structures are referred to as nucleosomes. Nucleosomes in turn are linked to one another by segments of DNA that extend between the core particles and are not wrapped around them. Viewed under the electron microscope, DNA packed in nucleosomes resembles beads on a string. The next level of packaging involves the mutual interaction of nucleosomes, and the resulting higher-order structures have not yet been completely characterized.

A team of scientists led by Hendrik Dietz of the Technical University of Munich and Philipp Korber at LMU's Biomedical Center has now taken a substantial step toward solving this puzzle: for the first time, they have succeeded in directly measuring the attractive forces that act between nucleosomes. Their results appear in the journals Science Advances and Nano Letters.

Dietz, holder of the Chair for Experimental Biophysics at TUM, uses DNA as a construction material to build molecular structures - a technology referred to as DNA origami. He and his team have now used the method to create structures consisting of two rigid DNA bars connected by a flexible joint that acts as a spring. These can be used like tweezers to measure the strength of the interactions between nucleosomes. One nucleosome is attached to each arm of the tweezers. "We can control the position and orientation of the nucleosomes in the DNA tweezers with a very high degree of precision," says Dietz.
"This is very important when it comes to really being able to measure the interactions." The LMU researchers took on the task of developing nucleosome structures that can be integrated into the tweezers. Philipp Korber, Privatdozent and Group Leader at the Chair of Molecular Biology at the BMC, explains: "Normally the two double-stranded ends of the DNA spooled around the nucleosome are very close to one another. But what we needed were two protruding single strands, closer to the middle. This was a significant issue, since such a configuration can destabilize the entire structure. Our team member Corinna Lieleg nevertheless succeeded in finding the right spots for these handles." The researchers were able to measure a very weak interaction between integrated nucleosomes, equivalent to an attractive force of 1.6 kcal/mol, at a range of about 6 nanometers (nm). The orientations of the nucleosomes relative to one another were found to have hardly any effect. However, particular chemical modifications in the histone proteins further weakened the interactions. The problem of the 30-nm fiber The result could help resolve a current scientific dispute. According to the current theory, nucleosomes form a type of super-spiral with a diameter of 30 nanometers, the so called 30-nm fiber. So far, however, this higher-order 30-nm structure has never been observed in living cells. Whether or not the chromatin really takes on the form of such a super-spiral is still highly controversial. Indeed, the minute attractive forces between the nucleosomes, which the researchers have now successfully measured, appear to contradict the theory. "Our data point to very soft structures that are easily deformed by external influences," says Dietz. How nucleosomes are organized into higher-order structures is a fundamentally important issue, as it has profound implications for the control of gene expression. 
Only those genes that lie within relatively non-compact chromatin are accessible to 'activation', which allows the proteins they encode to be produced by the cellular machinery.

Gene regulation through DNA packaging goes awry in cancer cells

"Over the past ten years it has become clear that many of the changes and mutations that transform cells into cancer cells take place at this level," Korber says. In a cancer cell, the normal mechanisms that determine which genes are active and which are inactive are disrupted. Genomic regions that should not be accessible are left open, and vice versa. "However, if only the packaging is defective, and not the gene itself, it should in principle be possible to restore the proper packaging." The researchers plan to use the molecular tweezer technique to investigate other structures. "In biology the orientation of structures with respect to one another is always important," says Korber. "Now we have a kind of molecular clamp which we can use to specifically control the spatial orientation of structures relative to one another."

More information: Jonas J. Funke et al., "Exploring Nucleosome Unwrapping Using DNA Origami," Nano Letters (2016). DOI: 10.1021/acs.nanolett.6b04169; J. J. Funke et al., "Uncovering the forces between nucleosomes using DNA origami," Science Advances (2016). DOI: 10.1126/sciadv.1600974
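Two of the numbers in this article - the roughly two meters of DNA per nucleus and the 1.6 kcal/mol nucleosome attraction - can be put in perspective with a short calculation. The genome size and the 0.34 nm rise per base pair are textbook values, not figures from the article, and the thermal energy kT is taken at room temperature.

```python
# Sanity-checking two figures from the article with textbook values.

BP_PER_DIPLOID_GENOME = 6.4e9   # assumed: diploid human genome, ~2 x 3.2e9 bp
RISE_NM_PER_BP = 0.34           # B-form DNA helical rise per base pair
KT_KCAL_PER_MOL = 0.593         # thermal energy kT at ~298 K

# Total contour length of nuclear DNA: ~2 m, as the article states.
total_length_m = BP_PER_DIPLOID_GENOME * RISE_NM_PER_BP * 1e-9
print(f"DNA per nucleus: {total_length_m:.2f} m")

# The measured 1.6 kcal/mol attraction is only a few kT, i.e. easily
# disrupted by thermal motion -- consistent with Dietz's "very soft
# structures" conclusion.
interaction_kT = 1.6 / KT_KCAL_PER_MOL
print(f"nucleosome attraction: {interaction_kT:.1f} kT")
```

An interaction of under 3 kT is of the same order as thermal fluctuations, which is why such weak nucleosome-nucleosome attraction argues against a rigid 30-nm super-spiral.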


News Article | February 27, 2017
Site: www.cemag.us

The mode of packaging of the genomic DNA in the cell nucleus determines patterns of gene expression. Munich researchers have used DNA-based nano-tweezers to measure the forces between nucleosomes, the basic packing units of nuclear DNA.
How nucleosomes are organized into higher-order structures is a fundamentally important issue, as it has profound implications for the control of gene expression. Only those genes that lie within relatively non-compact chromatin are accessible for "activation," which allows the proteins they encode to be produced by the cellular machinery. "Over the past ten years it has become clear that many of the changes and mutations that transform cells into cancer cells take place at this level," Korber says. In a cancer cell, the normal mechanisms that determine which genes are active and which are inactive are disrupted: genomic regions that should not be accessible are left open, and vice versa. "However, if only the packaging is defective, and not the gene itself, it should in principle be possible to restore the proper packaging again."

The researchers plan to use the molecular tweezer technique to investigate other structures. "In biology the orientation of structures with respect to one another is always important," says Korber. "Now we have a kind of molecular clamp which we can use to specifically control the spatial orientation of structures to one another." In another experiment, the researchers measured the force needed to unwind the DNA from the nucleosome. Thus, the DNA-based origami tweezers can be used to measure the forces acting both between and within nucleosomes.


News Article | March 2, 2017
Site: www.eurekalert.org

In spite of its limitations, automated journalism will expand. According to media researchers, this development underlines the need for critical, contextualised journalism.

Journalists and editors believe 'robo-journalists' do not have a good nose for news and produce one-dimensional stories, according to new research published today. Despite these limitations, however, the report reveals plans for the technology to be rolled out more widely, with the potential to replace "hundreds" of journalists at Thomson Reuters alone.

The researchers, Professor Neil Thurman (Ludwig-Maximilians-Universitaet München, LMU), Konstantin Dörr (University of Zurich) and Dr. Jessica Kunert (LMU Munich), interviewed journalists, editors, and executives from CNN, the BBC, Thomson Reuters, Trinity Mirror, and News UK in an exploratory study. The journalists were given hands-on experience with robo-writing software during a workshop.

Robo- (or automated) journalism is software that converts structured data into stories with little to no human intervention beyond the initial programming. It is used by news organisations including the Associated Press, the Los Angeles Times, and Forbes. The report is published in the international peer-reviewed journal Digital Journalism.

The journalists and editors in Professor Thurman's study believe that robo-journalism's reliance on data streams, and the need to program news angles in advance, means the stories produced lack the context, complexity, and creativity of much traditional reporting. Journalists also considered the need to template robo-written stories in advance a drawback. One, from the BBC, said "you can't get a reaction to those numbers, you can't explain or interrogate them because the story template was written before the numbers came out", and concluded, after using robo-writing technology first-hand, that it was not worth the BBC researching the technology further.
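The core mechanism described above, converting structured data into text via a pre-written template, can be sketched in a few lines. Everything here is illustrative: the field names, the wording, and the data record are hypothetical and not drawn from any real robo-journalism system.

```python
# Minimal illustration of template-based automated journalism: a pre-written
# story template is filled in from a structured data record. All field names
# and wording are hypothetical.

def generate_story(record: dict) -> str:
    # The possible angles ("rose"/"fell") must be anticipated when the
    # template is written -- the limitation the BBC journalist describes:
    # the template exists before the numbers come out.
    direction = "rose" if record["change_pct"] >= 0 else "fell"
    return (
        f"{record['company']} shares {direction} {abs(record['change_pct']):.1f}% "
        f"on {record['date']} after the firm reported quarterly revenue of "
        f"${record['revenue_bn']:.1f}bn."
    )

data = {"company": "ExampleCorp", "change_pct": -2.3,
        "date": "1 March 2017", "revenue_bn": 4.2}
print(generate_story(data))
```

Fast and cheap to run at scale, but the program can only report the angles its author anticipated, which is the one-dimensionality the interviewees describe.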
Despite these shortcomings, journalists believe robo-journalism has the potential to reduce costs and increase the speed and specificity of some reporting. Journalists at CNN and Reuters thought it could "reduce costs" by replacing "expensive staff" who are doing "fairly simplistic and time-consuming work". A Reuters journalist believed automation could improve speed and accuracy, saying "we are looking at it in all parts of the company". Another Reuters journalist said automation will be used for stories they do not "have the resources to cover manually" or for topics currently below the threshold of reportability.

Robo-journalism was seen as something that could both support and threaten journalistic objectivity. A journalist from The Sun thought it could enhance the accuracy of factual reporting. Another, from the BBC, was concerned that the volume of content that can be produced through automation could make it easier for "prejudiced" individuals or organisations to influence the news agenda.

"The increased volume of news resulting from automation may", Professor Thurman says, "make it more difficult to navigate a world already saturated with information and actually increase the need for the very human skills that good journalists embody -- news judgement, curiosity, and skepticism -- in order that we can all continue to be informed, succinctly, comprehensively, and accurately, about the world around us."

More information: When Reporters Get Hands-on with Robo-Writing, Digital Journalism. www.tandfonline.com/doi/full/10.1080/21670811.2017.1289819


News Article | February 16, 2017
Site: www.eurekalert.org

The evolution of cells and organisms is thought to have been preceded by a phase in which informational molecules like DNA could be replicated selectively. New work shows that hairpin structures make particularly effective DNA replicators.

In the metabolism of all living organisms there is a clear division of labor: nucleic acids (DNA and RNA) carry the information for the synthesis of proteins, and proteins provide the structural and executive functions required by cells, such as the controlled and specific catalysis of chemical reactions by enzymes. In recent decades, however, it has become clear that this distinction is by no means absolute. RNA in particular can cross the boundary outlined above and is known to play a catalytic role in many important processes. For example, certain RNA molecules can catalyze the replication of other nucleic acids, and this versatility could help to explain how life originated on Earth.

Nucleic acid molecules are made up of subunits called nucleotides, which differ in their so-called bases. The bases found in RNA are referred to as A, C, G and U (DNA uses T in place of U). These bases fall into two complementary pairs whose members specifically interact: A with T (or U), and G with C. This complementarity accounts for the stability of the DNA double helix and enables single strands of RNA to fold into complex shapes.

Life is thought to have emerged from a process of chemical evolution in which nucleic acid sequences could be selectively replicated, so that in prebiotic systems certain molecular "species" that carried information were reproduced at the expense of others. In biological systems, such selectivity is normally mediated by so-called primers: strands of nucleic acid that pair (as described above) with part of the molecule to be replicated, forming a short double helix. The primer provides a starting point for the extension of the double-stranded region to form a new daughter strand.
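The pairing rules just described can be expressed directly in code: a primer binds a stretch of target whose sequence is the primer's reverse complement. The sketch below uses the DNA alphabet from the article; the example sequence is made up for illustration.

```python
# Watson-Crick pairing rules from the text: A pairs with T, G pairs with C.
# A primer binds a target strand whose sequence is the primer's reverse
# complement (strands pair in antiparallel orientation).

PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    """Return the strand that would pair with `seq`."""
    return "".join(PAIR[base] for base in reversed(seq))

primer = "GATTACA"                       # hypothetical example sequence
target_site = reverse_complement(primer)
print(primer, "pairs with", target_site)  # GATTACA pairs with TGTAATC
```

Applying the function twice recovers the original sequence, which is why replicating a replicated strand regenerates the starting information.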
This process can also be reconstructed in the test-tube.

The pros and cons of hairpin replicators

Georg Urtel and Thomas Rind, members of the research group led by Dieter Braun (Professor of Systems Biophysics at LMU), have used such a system to identify properties that might favor the selective replication of DNA molecules. For their experiments, they chose a single-stranded DNA sequence that adopts a so-called hairpin structure. In these molecules, the base sequences at either end are complementary to each other, as are short stretches of sequence within the rest of the molecule. This distribution of complementary sequences causes the strand to fold into a hairpin-like conformation.

Thanks to the pairing rules outlined above, replication of a single strand of DNA produces a second strand whose sequence differs from that of the first. Each strand of a non-hairpin structure therefore needs its own primer for replication. With hairpins, by contrast, one primer suffices to prime synthesis of both the original and its complementary strand. "This means that hairpins are relatively simple replicators," Georg Urtel points out. The downside is that the hairpin structure makes primer binding more difficult, which in turn limits the replication rate. Molecular species devoid of hairpin structures don't have this problem.

In subsequent experiments the researchers discovered that two simple hairpin species could cooperate to give rise to a much more efficient replicator, which requires two primers for its amplification. The two hairpin species selected each required a different primer, but their sequences were in part identical. The switch to cooperative replication occurs when replication of one of the hairpins stalls. "As a rule, replication processes in nature are never perfect," says Dieter Braun. "Such a premature halt is not something that one needs to design into the system. It happens stochastically and we make use of it in our experiments."
The partially replicated hairpin can, however, bind to a molecule of the second species and serve as a primer that can be further elongated. The resulting product no longer forms a hairpin; in other words, it represents a new molecular species. Such so-called 'crossbreeds' need two primers for their replication, but can nevertheless be replicated significantly faster than either of their hairpin progenitors.

Further experiments showed that, upon serial dilution of the population, the hairpin DNAs soon become extinct. However, the sequence information they contained survives in the crossbreeds and can be replicated further. The converse experiment confirmed that information is indeed conserved: if crossbreeds are supplied with only one primer, the corresponding progenitor hairpin species can still be replicated by the kind of switching process mentioned above, but in the absence of the second primer the crossbreed dies out. "Thus, the crossbreeding process not only provides for the transition from 'simple and slow' replicators to more rapid replicators, it also makes it possible for the system to adapt to the prevailing conditions," Urtel explains. "It also suggests how early replicators could have cooperated with each other under prebiotic conditions prior to the origin of living systems."

More information: Georg C. Urtel et al., Reversible Switching of Cooperating Replicators, Physical Review Letters (2017). DOI: 10.1103/PhysRevLett.118.078102
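Why serial dilution drives the slow hairpin replicators extinct can be illustrated with a toy model. This is a sketch only: the exponential growth factors and dilution factor below are arbitrary assumptions, not the kinetics measured in the paper.

```python
# Toy model of selection by serial dilution, illustrating the outcome in the
# text: a faster "crossbreed" replicator overtakes slower "hairpin" species.
# Growth and dilution factors are illustrative assumptions, not measured values.

def serial_dilution(pops, growth, dilution=0.1, rounds=20):
    """Each round, every species multiplies by its growth factor and the
    whole mixture is then diluted by the same factor."""
    for _ in range(rounds):
        pops = {name: n * growth[name] * dilution for name, n in pops.items()}
    return pops

pops = {"hairpin_A": 1.0, "hairpin_B": 1.0, "crossbreed": 1.0}
growth = {"hairpin_A": 5.0, "hairpin_B": 5.0, "crossbreed": 20.0}  # per round

final = serial_dilution(pops, growth)
total = sum(final.values())
for name, n in final.items():
    print(f"{name}: {n / total:.2%} of the population")
# Any species whose growth does not outpace the dilution shrinks each round:
# the slow hairpins are diluted to extinction while the crossbreed,
# which still carries their sequence information, takes over.
```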


