Leipzig, Germany

Leipzig University, located in Leipzig in the Free State of Saxony, Germany, is one of the oldest universities in the world and the second-oldest university in Germany. Famous alumni include Leibniz, Goethe, Ranke, Nietzsche, Wagner, Angela Merkel, Raila Odinga and Tycho Brahe, and nine Nobel laureates are associated with the university. The university was founded on December 2, 1409 by Frederick I, Elector of Saxony, and his brother William II, Margrave of Meissen, and originally comprised the four scholastic faculties. Since its inception the university has engaged in teaching and research for over 600 years without interruption. (Wikipedia)



Patent
University of Leipzig | Date: 2016-07-11

The invention relates to modified antibiotic peptides, in particular derivatives of apidaecin and oncocin, preferably having increased stability, reduced immunoreaction and improved pharmacokinetics. In the invention, the peptide antibiotics are reversibly protected by means of a linker carrying the polymer polyethylene glycol (PEG). The peptide linker contains a recognition sequence for trypsin-like serum proteases. In the apidaecin derivatives, the linker and the PEG are bonded to a side chain. In serum, the linker is cleaved by serum proteases and the PEG is split off. The released peptide still carries remnants of the linker, which remain bonded to the amino group in the side chain. Remarkably, these remaining linker residues impair the activity of the antimicrobial peptide only slightly or not at all.


Kroy K.,University of Leipzig | Chakraborty D.,Indian Institute of Science | Cichos F.,University of Leipzig
European Physical Journal: Special Topics | Year: 2016

Hot microswimmers are self-propelled Brownian particles that exploit local heating for directed self-thermophoretic motion. We provide a pedagogical overview of the key physical mechanisms underlying this promising new technology, covering the hydrodynamics of swimming, thermophoresis and thermo-osmosis, hot Brownian motion, force-free steering, and dedicated experimental and simulation tools for analyzing hot Brownian swimmers. © 2016, EDP Sciences and Springer.
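In its simplest form, the self-thermophoretic motion described above is modeled as an active Brownian particle: a constant propulsion speed along a fluctuating orientation, plus translational noise. The following minimal sketch (all parameter values are illustrative and not taken from the paper) integrates these overdamped Langevin equations:

```python
import math
import random

def simulate_hot_swimmer(v=1.0, D_t=0.1, D_r=0.5, dt=1e-3, steps=10_000, seed=1):
    """Overdamped active Brownian particle in 2D: propulsion speed v along
    the orientation angle theta, translational diffusion D_t, rotational
    diffusion D_r. A common minimal model for hot (self-thermophoretic)
    microswimmers; parameter values here are purely illustrative."""
    rng = random.Random(seed)
    x = y = theta = 0.0
    for _ in range(steps):
        # Euler-Maruyama step: drift along orientation + Gaussian noise
        x += v * math.cos(theta) * dt + math.sqrt(2 * D_t * dt) * rng.gauss(0, 1)
        y += v * math.sin(theta) * dt + math.sqrt(2 * D_t * dt) * rng.gauss(0, 1)
        # Orientation performs free rotational diffusion
        theta += math.sqrt(2 * D_r * dt) * rng.gauss(0, 1)
    return x, y

x, y = simulate_hot_swimmer()
print(x, y)
```

Averaging many such trajectories reproduces the characteristic crossover from ballistic motion at short times to enhanced diffusion at long times.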


Bluher M.,University of Leipzig
Clinical Science | Year: 2016

The worldwide obesity epidemic has become a major health concern, because it contributes to higher mortality due to an increased risk for noncommunicable diseases including cardiovascular diseases, type 2 diabetes, musculoskeletal disorders and some cancers. Insulin resistance may link the accumulation of adipose tissue in obesity to metabolic diseases, although the underlying mechanisms are not completely understood. In the past decades, data from human studies and transgenic animal models have strongly suggested not only correlative but also causative associations between the activation of proinflammatory pathways and insulin resistance. Chronic inflammation in adipose tissue, in particular, seems to play an important role in the development of obesity-related insulin resistance. On the other hand, adipose tissue inflammation has been shown to be essential for healthy adipose tissue expansion and remodelling. However, whether adipose tissue inflammation represents a consequence or a cause of impaired insulin sensitivity remains an open question. A better understanding of the molecular pathways linking excess adipose tissue storage to chronic inflammation and insulin resistance may provide the basis for the future development of anti-inflammatory treatment strategies to improve the adverse metabolic consequences of obesity. In this review, potential mechanisms of adipose tissue inflammation and how adipose tissue inflammation may cause insulin resistance are discussed. © 2016 The Author(s).


Baumann R.,University of Leipzig | Strass H.,University of Leipzig
Principles of Knowledge Representation and Reasoning: Proceedings of the 15th International Conference, KR 2016 | Year: 2016

We consider knowledge representation (KR) formalisms as collections of finite knowledge bases with a model-theoretic semantics. In this setting, we show that for every KR formalism there is a formalism that characterizes strong equivalence in the original formalism, that is unique up to isomorphism and that has a model theory similar to classical logic.


Baumann R.,University of Leipzig
Principles of Knowledge Representation and Reasoning: Proceedings of the 15th International Conference, KR 2016 | Year: 2016

A central question in knowledge representation is the following: given some knowledge representation formalism, is it possible, and if so how, to simplify parts of a knowledge base without affecting its meaning, even in the light of additional information? The literature has coined the term strong equivalence for this: strongly equivalent knowledge bases can be locally replaced by each other within a larger theory without changing the semantics of the latter. In contrast to classical (monotone) logics, where standard and strong equivalence coincide, it is possible to find ordinarily but not strongly equivalent objects for any nonmonotonic formalism available in the literature. This paper addresses these questions in the context of abstract argumentation theory. Much effort has been spent on characterizing several argumentation-tailored equivalence notions w.r.t. extension-based semantics. In recent times labelling-based semantics have received increasing attention, for example in connection with algorithms computing extensions, proof procedures, dialogue games, dynamics in argumentation, and belief revision in general. Naturally, equivalence notions allowing for replacements are of high interest for these topics. In this paper we provide kernel-based characterization theorems for semantics based on complete labellings as well as admissible labellings w.r.t. eight different equivalence notions, including the aforementioned most prominent one, namely strong equivalence.
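In symbols, and writing $\sigma$ for the semantics of the formalism at hand, the notion of strong equivalence discussed above reads:

```latex
\mathcal{K}_1 \equiv_s \mathcal{K}_2
\quad :\Longleftrightarrow \quad
\sigma(\mathcal{K}_1 \cup \mathcal{K}) = \sigma(\mathcal{K}_2 \cup \mathcal{K})
\quad \text{for every knowledge base } \mathcal{K},
```

whereas ordinary equivalence only requires $\sigma(\mathcal{K}_1) = \sigma(\mathcal{K}_2)$, i.e. the special case $\mathcal{K} = \emptyset$.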


Popkova Y.,University of Leipzig | Schiller J.,University of Leipzig
Rapid Communications in Mass Spectrometry | Year: 2017

Rationale: Ion suppression is a known disadvantage in mixture analysis. Matrix-assisted laser desorption/ionization (MALDI) mass spectra of crude adipose tissue extracts are dominated by triacylglycerol (TAG) signals, while less abundant phospholipids such as phosphatidylcholines (PC) and particularly phosphatidylethanolamines (PE) are suppressed. It is suggested that the addition of an excess of cesium (Cs) ions helps to overcome this problem. Methods: Selected lipid mixtures of known compositions and organic adipose tissue extracts were investigated by positive ion MALDI time-of-flight mass spectrometry (TOF MS). 2,5-Dihydroxybenzoic acid (DHB) in methanol was used as the matrix. In selected cases the methanolic DHB solution was saturated by the addition of different solid alkali chlorides (such as NaCl, KCl, RbCl and CsCl). Studies on the solubilities of these salts in methanol and their interaction with DHB (by 13C NMR) were also performed. Results: Saturation of the DHB matrix with solid CsCl leads to tremendous intensity differences, i.e. the intensities of the TAG signals (which otherwise dominate the mass spectra) are significantly reduced. In contrast, the intensity of small phospholipid signals increases considerably. The decrease in TAG signal intensity is caused in particular by the considerable size of the Cs+ ion, which prevents successful analyte ionization. Conclusions: The addition of CsCl improves the detectability of otherwise invisible or weak phospholipid ions. This is a simple approach to detect small amounts of phospholipids in the presence of an excess of TAG. No laborious and time-consuming separation of the total lipid extract into the individual lipid classes is required. Copyright © 2016 John Wiley & Sons, Ltd.


Fuhrmann D.,University of Leipzig | Dietrich S.,University of Leipzig | Krautscheid H.,University of Leipzig
Chemistry - A European Journal | Year: 2017

Five copper zinc thiolate complexes [(iPr3PCu)2(ZnEt2)(edt)]2 (1-Et), [(iPr3PCu)2(Zn(iPr)2)(edt)]2 (1-iPr), [(iPr3PCu)4(edt)2(ZnMe2)]2 (2), [(iPr3PCu)3(ZnPh2)(ZnPh)(edt)2]2 (3), and [(iPr3PCu)2Zn2(edt)3]6 (4) were prepared by the reaction of [(iPr3PCu)2(edt)]2 with ZnR2 (R=Me, Et, Ph, iPr), with or without the addition of ethanedithiol (edt2-=ethane-1,2-dithiolate). The molecular structures of these complexes were determined by single crystal X-ray diffraction. The ethanedithiolate ligands coordinate in μ3-η1:η2:η1 (2, 4), μ4-η1:η1:η2:η1 (1-R, 3), and μ5-η1:η1:η2:η1:η1 (2) bridging modes; each sulfur atom binds to two or three metal atoms. Evidence for the presence of weak Zn-S bonds in solution was provided by NMR spectroscopy. Mixtures of 1-Et, 1-iPr, or 3 with Sn(edt)2 were examined by thermogravimetry up to 600°C, whereupon volatile thermolysis products were identified by mass spectrometry. In all thermolysis experiments, the formation of Cu2ZnSnS4 as the main product, besides small amounts of binary metal sulfides, was confirmed by X-ray powder diffraction (PXRD) and EDX (energy dispersive X-ray spectroscopy) analysis. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.


Babari P.,University of Leipzig | Droste M.,University of Leipzig
Journal of Computer and System Sciences | Year: 2017

Picture languages have been investigated by several research groups. Here, we define weighted two-dimensional on-line tessellation automata (w2ota) taking weights from a new weight structure called a picture valuation monoid. The behavior of this automaton model is a picture series, mapping pictures to values. As our first main result, we prove a Nivat theorem for w2ota, showing a connection between mappings and the recognizability of certain simple series. Then, we introduce a weighted MSO logic which can model the average density of pictures. As the second main result, we show that w2ota and a fragment of our weighted MSO logic are expressively equivalent. © 2017 Elsevier Inc.
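The paper's automata run over two-dimensional pictures, but the notion of "behavior" they rely on can be illustrated with the classical one-dimensional analogue: a weighted automaton over the reals, whose behavior assigns to each word the sum over all runs of the products of transition weights. The example below is purely illustrative and not taken from the paper:

```python
def behavior(word, init, trans, final):
    """Behavior of a weighted automaton over (R, +, *).
    init:  dict state -> initial weight
    trans: dict (state, letter) -> list of (next_state, weight)
    final: dict state -> final weight
    Returns the sum over all runs of the product of weights."""
    current = dict(init)
    for a in word:
        nxt = {}
        for q, w in current.items():
            for q2, w2 in trans.get((q, a), []):
                nxt[q2] = nxt.get(q2, 0.0) + w * w2
        current = nxt
    return sum(w * final.get(q, 0.0) for q, w in current.items())

# A classic example: an automaton whose behavior counts occurrences of 'a'.
# State 0 = "not yet marked an a", state 1 = "already marked one a";
# every run marks exactly one occurrence, so #runs = #a's.
init = {0: 1.0}
trans = {
    (0, 'a'): [(0, 1.0), (1, 1.0)],  # skip this a, or mark it
    (0, 'b'): [(0, 1.0)],
    (1, 'a'): [(1, 1.0)],
    (1, 'b'): [(1, 1.0)],
}
final = {1: 1.0}
print(behavior("abab", init, trans, final))  # 2.0 occurrences of 'a'
```

The w2ota of the paper generalize this idea to pictures, with the picture valuation monoid replacing the real semiring so that quantities such as average density become expressible.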


Patel P.N.,Space Applications Center | Quaas J.,University of Leipzig | Kumar R.,Space Applications Center
Atmospheric Chemistry and Physics | Year: 2017

In a previous study, Quaas et al. (2008) obtained the radiative forcing by anthropogenic aerosol due to aerosol-cloud interactions, RFaci, by a statistical analysis of satellite retrievals using a multilinear regression. Here we employ a new statistical approach to obtain the fitting parameters, determined using nonlinear least squares, for the relationship between planetary albedo and cloud properties and, further, for the relationship between cloud properties and aerosol optical depth. To verify the performance, the results from both statistical approaches (previous and present) were compared to results from radiative transfer simulations over three regions for different seasons. We find that the results of the new statistical approach agree well with the simulated results both over land and ocean. The new statistical approach increases the correlation by 21-23% and reduces the error compared to the previous approach. © Author(s) 2017.
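The regressions involved relate, for example, cloud properties to aerosol optical depth (AOD), often via power-law relationships fitted in log-log space. As a purely illustrative sketch (synthetic numbers, not the authors' data or their exact method), a least-squares fit of such a log-linear relationship looks as follows:

```python
import math

def fit_loglinear(aod, nd):
    """Ordinary least squares for ln(N_d) = a + b*ln(AOD).
    The slope b plays the role of an aerosol-cloud sensitivity
    in analyses of this kind (illustrative example only)."""
    xs = [math.log(x) for x in aod]
    ys = [math.log(y) for y in nd]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Synthetic, noise-free data following N_d = 100 * AOD**0.3
aod = [0.05, 0.1, 0.2, 0.4, 0.8]
nd = [100.0 * x ** 0.3 for x in aod]
a, b = fit_loglinear(aod, nd)
print(round(b, 3))  # recovers the exponent 0.3
```

A nonlinear least-squares approach, as used in the study, instead fits the relationship directly in linear space, which weights the data points differently and can reduce the error relative to the log-linear fit.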


Dose-response relationships for the training of working memory function in neurorehabilitation have not yet been systematically investigated. We review existing experimental studies and meta-analyses with respect to different subject groups (brain-damaged patients, healthy older and younger subjects, children with learning difficulties). Most of the studies show a positive correlation between training frequency and effect size. The effect of fewer than 10 training sessions is indistinguishable from retest effects in untreated control groups. To achieve long-lasting training effects and a transfer to other cognitive functions, a training period of at least 20 sessions is recommended based on the empirical evidence. In addition, the results suggest that distributed training is more effective than an intensive training schedule; thus 2-3 sessions per week, 30-60 minutes each, are recommended. However, these data are not based on studies with patients after brain injury. For some patients, shorter training periods may be useful depending on their capacity for concentration. Not all of the studies found positive dose-response correlations. Therefore, other variables such as methodological aspects, features of the training, or characteristics of the subjects should be taken into account. Implications for clinical practice are discussed. © Hippocampus Verlag 2017.


Zierenberg J.,University of Leipzig | Schierz P.,University of Leipzig | Janke W.,University of Leipzig
Nature Communications | Year: 2017

A common approach to study nucleation rates is the estimation of free-energy barriers. This usually requires knowledge about the shape of the forming droplet, a task that becomes notoriously difficult in macromolecular setups, starting with a proper definition of the cluster boundary. Here we demonstrate a shape-free determination of the free energy for temperature-driven cluster formation in particle as well as polymer systems. Combined with rigorous results on equilibrium droplet formation, this allows for a well-defined finite-size scaling analysis of the effective interfacial free energy at a fixed density. We first verify the theoretical predictions for the formation of a liquid droplet in a supersaturated particle gas by generalized-ensemble Monte Carlo simulations of a Lennard-Jones system. Going one step further, we then generalize this approach to cluster formation in a dilute polymer solution. Our results suggest an analogy with particle condensation, when the macromolecules are interpreted as extended particles.
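Free-energy estimates of the kind discussed above are commonly read off from sampled histograms of an order parameter via F(x)/kT = -ln P(x); barrier heights are then differences of F values. The following generic sketch (illustrative only, not the authors' code) shows the idea:

```python
import math
from collections import Counter

def free_energy_profile(samples, bin_width=1.0):
    """Estimate F(x)/kT = -ln P(x) from samples of an order parameter
    (e.g. a cluster-size observable). Barriers are read off as
    differences between F values of different bins."""
    counts = Counter(int(s // bin_width) for s in samples)
    total = len(samples)
    return {b * bin_width: -math.log(c / total)
            for b, c in sorted(counts.items())}

# Illustrative bimodal data: "gas" around 0, "droplet" around 5,
# with a rarely visited transition region around 2-3.
samples = [0.2] * 70 + [5.3] * 25 + [2.7] * 5
profile = free_energy_profile(samples)
barrier = profile[2.0] - profile[0.0]
print(round(barrier, 3))
```

In generalized-ensemble simulations the raw histogram must additionally be reweighted to the target ensemble before taking the logarithm; the shape-free aspect of the paper lies in choosing an order parameter that avoids any geometric droplet definition.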


Eichenberg D.,University of Leipzig | Pietsch K.,University of Leipzig | Meister C.,University of Leipzig | Ding W.,Zhejiang University | And 3 more authors.
Journal of Plant Ecology | Year: 2017

Aims We studied the influence of tree species diversity on the dynamics of coarse wood decomposition in developing forest communities in a natural, topographically heterogeneous landscape. Using the litter bag technique, we investigated how and to what extent canopy tree species richness or the exclusion of mesoinvertebrates and macroinvertebrates affected wood decomposition in the light of natural variations in the microclimate. We compared the relative importance of the two aspects (experimental treatment versus microclimate) for wood decay rates using Schima superba as a standard litter. Methods Coarse woody debris (CWD) was deposited in litter bags with two different mesh sizes in a total of 134 plots along a gradient of canopy tree species richness (0-24 species). Wood decomposition was assessed at two consecutive time points, one and three years after deposition in the field. Local climatic conditions were assessed throughout the duration of the experiment. Microclimatic conditions were assessed both directly in the field and indirectly via correlations with local topography. We used analysis-of-variance-based approaches to assess the relative importance of the treatments (community tree species richness and macroinvertebrate exclusion) and microclimatic conditions for wood decay. Important Findings No direct influence of tree species richness on wood decay could be detected. However, the exclusion of macroinvertebrates significantly decreased wood decomposition rates. In addition, microclimatic conditions accounted for a substantial proportion of explained variance in the observed data. Here, wood decomposition was negatively affected by low mean temperatures and high variations in local humidity and temperature. However, tree species richness as well as the respective species composition affected the presence of termites within forest communities. These, in turn, significantly increased the decay of CWD. 
The effects of both the experimental treatment and the microclimate grew stronger as decomposition progressed. We conclude that, while tree species richness per se has no direct influence on wood decomposition, its influence on the local arthropod decomposer community (especially the presence of termites) does have an effect. © The Author 2017. Published by Oxford University Press on behalf of the Institute of Botany, Chinese Academy of Sciences and the Botanical Society of China All rights reserved.
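Decay rates of the kind compared above are commonly derived from litter-bag mass loss via the single-exponential (Olson) model m(t) = m0·exp(-k·t), a standard choice in decomposition studies. A minimal sketch with made-up numbers (not data from this experiment):

```python
import math

def decay_constant(m0, mt, years):
    """Single-exponential (Olson) decay model for litter-bag data:
    m(t) = m0 * exp(-k * t). Returns the decay constant k in 1/years,
    given initial mass m0 and remaining mass mt after `years` years."""
    return -math.log(mt / m0) / years

# Hypothetical litter bag: 10 g deposited, 6 g remaining after 3 years
print(round(decay_constant(10.0, 6.0, 3.0), 4))  # k ≈ 0.1703 per year
```

Comparing k between mesh sizes (with vs. without macroinvertebrate access) and across the richness gradient is one standard way to express the treatment effects reported above.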


Pfeifer J.,University of Leipzig
Ghosts - or the (Nearly) Invisible: Spectral Phenomena in Literature and the Media | Year: 2016

Loa, ghosts in Haitian Vodou, coalesce African and Creole gods with the imagery of Catholic saints, all rooted in Haiti's colonial history as a Spanish and later French colony with slaves of different ethnicities from the west coast of Africa. Although the Vodou belief system rests upon a firm belief in a supreme being, Vodou serviteurs pray to, serve and ask for guidance from their loa. In order to understand loa and possession by a loa, the concept of the soul in Vodou will be examined. At Vodou ceremonies a loa will manifest him- or herself through possession in the head of a serviteur, who now becomes the "horse" being "ridden" by the loa. Which kind of loa is riding a person is made visible by the loa's unique dance rhythm, song, clothing, colour, talk, sacrificial food and drink.


Braun T.,University of Leipzig | Schmukle S.C.,University of Leipzig | Kunzmann U.,University of Leipzig
Psychology and Aging | Year: 2017

The primary goal of this study was to address the stability-despite-loss paradox of subjective well-being. Performance-based and self-evaluative measures of cognitive functioning were examined as predictors of subjective well-being in middle-aged and older adults using data from the Interdisciplinary Longitudinal Study of Adult Development (ILSE). Consistent with past work, subjective well-being remained relatively stable over a period of 12 years in both age groups, although performance-based and self-rated cognition declined over time. Cognitive status, as determined by standard psychometric tests of fluid cognitive abilities, was unrelated to longitudinal change in subjective well-being. A symmetrical measure of self-rated cognitive performance predicted intraindividual change in subjective well-being in middle-aged but not older adults. This pattern of findings helps clarify why many older people may be able to maintain their subjective well-being while their cognitive abilities decline. © 2017 American Psychological Association.


Wichmann G.,University of Leipzig
Recent Results in Cancer Research | Year: 2017

The human papillomaviruses (HPV) comprise a heterogeneous group of double-strand DNA viruses with variable potential to infect human epithelial cells and trigger neoplastic transformation. The 8 kb HPV genome encodes the proteins required for virus replication and the self-organized formation of infectious particles, as well as the early proteins E6 and E7, which can trigger neoplastic transformation. E6 and E7 of high-risk (HR) HPV subtypes can bind to p53 or release E2F and thereby abrogate replication control. Owing to variable amino acid sequences (AAS) in the binding sites of E6 and E7, particular HR-HPV variants within subtypes are markedly heterogeneous in their efficacy in triggering neoplastic transformation and cancer development. This could explain differences in the clinical course of HPV-driven head and neck cancer. © Springer International Publishing Switzerland 2017.


Schrey-Petersen S.,University of Leipzig | Stepan H.,University of Leipzig
Current Hypertension Reports | Year: 2017

Purpose of Review: Preeclampsia remains one of the most important complications of pregnancy worldwide. With this review, we aim to give an overview of important research findings of recent years and their effects on current clinical management. Recent Findings: The association between preeclampsia and altered angiogenesis is now widely accepted. Only during the last few years has the assessment of angiogenic factors such as the soluble fms-like tyrosine kinase-1-to-placental growth factor (sFlt-1/PlGF) ratio become available to everyday clinical practice, with commercially available automated measurements. With these, preeclampsia can be confirmed or ruled out in cases of diagnostic uncertainty, and a short-term prognosis can be given for patients with symptoms of preeclampsia. Pilot studies show that maternal serum levels of sFlt-1 can be reduced by therapeutic apheresis and that this might prolong pregnancy in cases of very early severe preeclampsia. Summary: The automated measurement of the sFlt-1/PlGF ratio is starting to influence the clinical management of preeclampsia. Apheresis might offer new treatment options but still needs to be evaluated in randomized trials. © 2017, Springer Science+Business Media New York.


Daneri S.,Friedrich - Alexander - University, Erlangen - Nuremberg | Szekelyhidi L.,University of Leipzig
Archive for Rational Mechanics and Analysis | Year: 2017

In this paper we address the Cauchy problem for the incompressible Euler equations in the periodic setting. We prove that the set of Hölder (Formula presented.) wild initial data is dense in (Formula presented.), where we call an initial datum wild if it admits infinitely many admissible Hölder (Formula presented.) weak solutions. We also introduce a new set of stationary flows which we use as a perturbation profile instead of Beltrami flows in order to show that a general form of the h-principle applies to Hölder-continuous weak solutions of the Euler equations. Our result indicates that in a deterministic theory of three dimensional turbulence the Reynolds stress tensor can be arbitrary and need not satisfy any additional closure relation. © 2017 Springer-Verlag Berlin Heidelberg
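For reference, the Cauchy problem in question is the standard incompressible Euler system, posed here on the torus $\mathbb{T}^3$ with initial datum $v_0$:

```latex
\begin{cases}
\partial_t v + \operatorname{div}(v \otimes v) + \nabla p = 0,\\
\operatorname{div} v = 0,\\
v(\,\cdot\,,0) = v_0 ,
\end{cases}
```

where $v$ is the velocity field and $p$ the pressure; "wild" initial data $v_0$ are those admitting infinitely many admissible Hölder-continuous weak solutions of this system.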


Choque Rivero A.E.,Universidad Michoacana de San Nicolás de Hidalgo | Madler C.,University of Leipzig
Complex Analysis and Operator Theory | Year: 2017

By using Schur transformed sequences and Dyukarev–Stieltjes parameters we obtain a new representation of the resolvent matrix corresponding to the truncated matricial Stieltjes moment problem. Explicit relations between orthogonal matrix polynomials and matrix polynomials of the second kind constructed from consecutive Schur transformed sequences are obtained. Additionally, a non-negative Hermitian measure for which the matrix polynomials of the second kind are the orthogonal matrix polynomials is found. © 2017 Springer International Publishing


Wei R.,University of Leipzig | Zimmermann W.,University of Leipzig
Microbial Biotechnology | Year: 2017

Petroleum-based plastics have replaced many natural materials in their former applications. With their excellent properties, they have found widespread uses in almost every area of human life. However, the high recalcitrance of many synthetic plastics results in their long persistence in the environment, and the growing amount of plastic waste ending up in landfills and in the oceans has become a global concern. In recent years, a number of microbial enzymes capable of modifying or degrading recalcitrant synthetic polymers have been identified. They are emerging as candidates for the development of biocatalytic plastic recycling processes, by which valuable raw materials can be recovered in an environmentally sustainable way. This review is focused on microbial biocatalysts involved in the degradation of the synthetic plastics polyethylene, polystyrene, polyurethane and polyethylene terephthalate (PET). Recent progress in the application of polyester hydrolases for the recovery of PET building blocks and challenges for the application of these enzymes in alternative plastic waste recycling processes will be discussed. © 2017 John Wiley & Sons Ltd and Society for Applied Microbiology.


Bouameur J.-E.,University of Leipzig | Magin T.M.,University of Leipzig
Sub-Cellular Biochemistry | Year: 2017

Cytoplasmic intermediate filaments (IFs) represent a major cytoskeletal network contributing to cell shape, adhesion and migration as well as to tissue resilience and renewal in numerous bilaterians, including mammals. The observation that IFs are dispensable in cultured mammalian cells, but cause tissue-specific, life-threatening disorders, has pushed the need to investigate their function in vivo. In keeping with human disease, the deletion or mutation of murine IF genes resulted in highly specific pathologies. Epidermal keratins, together with desmin, are essential to protect the corresponding tissues against mechanical force but also participate in stabilizing cell adhesion and in inflammatory signalling. Surprisingly, other IF proteins contribute to tissue integrity to a much lesser extent than anticipated, pointing towards their role in stress situations. In support, the overexpression of small chaperones or the interference with inflammatory signalling in several settings has been shown to rescue severe tissue pathologies that resulted from the expression of mutant IF proteins. It still remains an open issue whether the wide range of IF disorders share similar pathomechanisms. Moreover, we lack an understanding of how IF proteins participate in signalling processes. Now, with a large number of mouse models in hand, the next challenge will be to develop organotypic cell culture models to dissect pathomechanisms at the molecular level, to employ CRISPR/Cas-mediated genome engineering to optimize models and, finally, to combine available animal models with medicinal chemistry for the development of molecular therapies. © Springer International Publishing AG 2017.


Etzel M.,University of Leipzig | Morl M.,University of Leipzig
Biochemistry | Year: 2017

In synthetic biology, metabolic engineering, and gene therapy, there is a strong demand for orthogonal or externally controlled regulation of gene expression. Here, RNA-based regulatory devices represent a promising emerging alternative to proteins, allowing fast and direct control of gene expression, as no synthesis of regulatory proteins is required. Besides programmable ribozyme elements controlling mRNA stability, regulatory RNA structures in untranslated regions are highly interesting for engineering approaches. Riboswitches are especially well suited, as they show a modular composition of sensor and response elements, allowing a free combination of different modules in a plug-and-play-like manner. The sensor or aptamer domain specifically interacts with a trigger molecule as a ligand, modulating the activity of the adjacent response domain that controls the expression of the genes located downstream, in most cases at the level of transcription or translation. In this review, we discuss recent advances and strategies for designing such synthetic riboswitches based on natural or artificial components and readout systems, from trial-and-error approaches to rational design strategies. As the past several years have shown dramatic development in this fascinating field of research, we can give only a limited overview of the basic riboswitch design principles, and we apologize for not being able to consider every successful and interesting approach described in the literature. © 2017 American Chemical Society.


Hecker P.,University of Leipzig
Muslim Rap, Halal Soaps, and Revolutionary Theater: Artistic Developments in The Muslim World | Year: 2011

"i got no problem with religion or religious people. My problem is they got a problem with me," my counterpart with the long, blond dyed hair so aptly sums up. With his tattooed arms and the "pilot shades" on his head, he could easily be considered as the Turkish incarnation of American glam rock star Bret Michaels, who had just dropped by to have a couple of beers before hitting on the beautiful young women in the bar where we were doing the interview. Lighting another cigarette, he disdainfully adds: "You know, when they saw me on TV or out in the streets, they shit on me."1 Saying this, he refers to how he is being perceived in the eyes of the Turkish public. And indeed, the appearance and behavior of Turkish rockers and metalheads-with their long hair, black clothes, tattoos, earrings and piercings, and their love for Turkish raki and beer-are still often labeled as deviant and contradictory to prevalent concepts of morality and religion. If we look at Turkish heavy metal from a perspective of resistance and power (Karin van Nieuwkerk, introduction to this volume), we need to address contemporary discourses on secularism and Islamism in Turkish society. Islamic actors, who, for a long time, have found themselves in a marginalized position resisting the laicist doctrines of the Kemalist state, are blaming Turkish rockers and metalheads for their supposedly loose morals and disrespect to Islamic traditions. Today, however, political Islam no longer represents an oppositional counterpublic, but with the electoral victory of the Muslim conservative Justice and Development Party (AKP), has taken the dominant power position in state and society. While adherents of the Turkish hip-hop scene in the 1990s countered what they perceived as the domination of secularist Kemalism (Thomas Solomon, this volume), Turkish metalheads today, by the same token, see themselves in a marginalized position in which they resist the dominance of Islamic revivalism. 
The present government's Islamization policies are usually seen as evidence for its intentions to subvert the secularist principles of the Turkish state. Consequently, many Turkish metalheads openly speak of their fears of Turkey "becoming Iran" (see Farzaneh Hemmasi, this volume) and losing their individual freedoms to the orthodox interpretations of political Islam. This chapter aims to explore how particular cultural practices associated with heavy metal are contesting Islamic concepts of morality in Turkish society. After briefl y introducing the history of Turkish heavy metal and providing an insight into the Islamization policies of the present government, it examines the public discourse on heavy metal, shedding light on the different forms of moral subversiveness ascribed to it by the Turkish media. Finally, it investigates how heavy metal culture is contesting Islamic morality in everyday life. In this respect, the text refers to aspects of gender, religion, and anti-Christian blasphemy in a Muslim context. Copyright © 2011 by University of Texas Press. All rights reserved.


News Article | May 3, 2017
Site: www.eurekalert.org

Phthalates, which are used as plasticizers in plastics, can considerably increase the risk of allergies among children. This was demonstrated by UFZ researchers in conjunction with scientists from the University of Leipzig and the German Cancer Research Center (DKFZ) in a current study published in the Journal of Allergy and Clinical Immunology. According to this study, an increased risk of children developing allergic asthma exists if the mother has been particularly heavily exposed to phthalates during pregnancy and breastfeeding. The mother-child cohort from the LINA study was the starting and end point of this translational study. In our day-to-day lives, we come into contact with countless plastics containing plasticizers. These plasticizers, which also include the aforementioned phthalates, are used when processing plastics in order to make the products more flexible. Phthalates can enter our bodies through the skin, foodstuffs or respiration. "It is a well-known fact that phthalates affect our hormone system and can thereby have an adverse effect on our metabolism or fertility. But that's not the end of it," says UFZ environmental immunologist Dr Tobias Polte. "The results of our current study demonstrate that phthalates also interfere with the immune system and can significantly increase the risk of developing allergies." At the outset of the study, the team of UFZ researchers examined the urine of pregnant women from the LINA (lifestyle and environmental factors and their influence on the newborn-allergy-risk) mother-child cohort study and searched for metabolites of phthalates. The concentration level determined in each case was found to correlate with the occurrence of allergic asthma among the children. "There was a clearly discernible relationship between higher concentrations of the metabolite of benzylbutylphthalate (BBP) in the mother's urine and the presence of allergic asthma in their children", explains Dr Irina Lehmann, who heads the LINA study. 
Researchers were able to confirm the results from the mother-child cohort in a mouse model, in collaboration with colleagues from the Medical Faculty at the University of Leipzig. In these experiments, mice were exposed to a defined phthalate concentration during pregnancy and the lactation period, which led to urinary concentrations of the BBP metabolite comparable to those observed in heavily exposed mothers from the LINA cohort. The offspring showed a clear tendency to develop allergic asthma; even the third generation continued to be affected. Mice exposed only as adults, on the other hand, showed no increase in allergic symptoms. "The time factor is therefore decisive: if the organism is exposed to phthalates during the early stages of development, this may have effects on the risk of illness for the two subsequent generations," explains Polte. "The prenatal development process is thus clearly altered by the phthalate exposure." In order to establish precisely what may have been modified, Polte and his team, in collaboration with colleagues from the German Cancer Research Center (DKFZ), took a close look at the genes of the young mice born to exposed mothers. So-called methyl groups were found in the DNA of these genes - and to a greater extent than is usually the case. In the course of this so-called epigenetic modification of the DNA, methyl groups attach themselves to a gene like a kind of padlock and thus prevent its code from being read, meaning that the associated protein cannot be produced. After the researchers treated the mice with a special substance intended to crack the methyl "locks" on the affected genes, the mice demonstrated fewer signs of allergic asthma than before. Dr Polte concludes: "Phthalates apparently switch off decisive genes by means of DNA methylation, causing the activity of these genes to be reduced in the young mice." But which genes cause allergic asthma if they cannot be read?
So-called T-helper 2 cells play a central part in the development of allergies. These are kept in check by special opponents (repressors). If a repressor gene cannot be read as a result of being blocked by methyl groups, the T-helper 2 cells that are conducive to the development of allergies are no longer sufficiently inhibited, meaning that an allergy is likely to develop. "We surmise that this connection is decisive for the development of allergic asthma caused by phthalates," says Polte. "Furthermore, in the cell experiment, we were able to demonstrate that T-helper 2 cells form to a greater extent from the immune cells of the offspring of exposed mother mice than from those of non-exposed animals. This enabled us to establish an increased tendency towards allergies once again." In mice, the researchers were able to prove that a repressor gene that has been switched off due to DNA methylation is responsible for the development of allergic asthma. But does this mechanism also play a part in humans? In order to answer this question, the researchers consulted the LINA cohort once more. They searched for the corresponding gene among the children with allergic asthma and studied the degree of methylation and gene activity. Here, too, it became apparent that the gene was blocked by methyl groups and thus could not be read. "Thanks to our translational study approach - which led from humans via the mouse model and cell culture back to humans again - we have been able to demonstrate that epigenetic modifications are apparently responsible for the fact that children of mothers who had a high exposure to phthalates during pregnancy and breastfeeding have an increased risk of developing allergic asthma," says Polte. "The objective of our further research will be to understand exactly how specific phthalates give rise to the methylation of genes which are relevant for the development of allergies."
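The cohort analysis described above relates a maternal exposure measure to a binary outcome in the children. Purely as an illustration of how such an exposure-outcome association is quantified - with entirely invented counts, not the LINA study's data - the association strength is often summarized as an odds ratio with an approximate Woolf confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI (Woolf method) for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 30 of 100 highly exposed children with asthma,
# 10 of 100 less exposed children with asthma.
or_, lo, hi = odds_ratio_ci(30, 70, 10, 90)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A confidence interval excluding 1 would indicate an association; the actual study of course relied on more elaborate cohort statistics plus the epigenetic read-outs described above.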
Susanne Jahreis, Saskia Trump, Mario Bauer, Tobias Bauer, Loreen Thürmann, Ralph Feltens, Qi Wang, Lei Gu, Konrad Grützmann, Stefan Röder, Marco Averbeck, Dieter Weichenhan, Christoph Plass, Ulrich Sack, Michael Borte, Virginie Dubourg, Gerrit Schüürmann, Jan C. Simon, Martin von Bergen, Jörg Hackermüller, Roland Eils, Irina Lehmann, Tobias Polte (2017): Maternal phthalate exposure promotes allergic airway inflammation over two generations via epigenetic modifications, Journal of Allergy and Clinical Immunology; doi: 10.1016/j.jaci.2017.03.017. PD Dr Tobias Polte, Head of the Helmholtz University Research Group "Experimental Allergology and Immunology", Tel.: +49 341 235-1545, E-mail: tobias.polte@ufz.de. Dr Irina Lehmann, Head of the UFZ Department of Environmental Immunology, Tel.: +49 341 235-1216, E-mail: irina.lehmann@ufz.de


News Article | March 10, 2017
Site: www.techtimes.com

Ancient Egypt continues to fascinate, giving us something to marvel at. A recent archaeological find was especially monumental: a statue of a well-known pharaoh that had been sitting under the homes of modern Egyptians for some 3,000 years. The massive statue was found amid rising groundwater, industrial waste, and rubble under a residential area in eastern Cairo. Believed to depict the great Pharaoh Ramses II, the colossal statue is 26 feet tall, made of quartzite, and possibly about 3,000 years old. Though the statue is damaged and bears no insignia identifying the pharaoh it portrays, its close proximity to an ancient temple devoted to Ramses II suggests that the statue is of his likeness. What's more, the city where the statue was found was built above the ancient city of Heliopolis, which was devoted to the worship of the ancient Egyptian sun god Re, of whom Ramses was a worshipper. Ramses II was one of ancient Egypt's longest-ruling and best-known kings. During his reign of more than six decades, his military campaigns and exploits expanded Egypt's territory and advanced the growth and prosperity of what was already a powerful kingdom. His long and powerful rule earned him the enduring title of "Ramses the Great." His reputation as a great king lasted so far beyond his lifetime that nine more pharaohs took the name Ramses in his honor. By the time of his death at the age of about 90 in 1213 BC, he had enlarged Egypt's territory and riches and built a large number of statues and memorials all over Egypt. His preserved remains were found in a tomb in the Valley of the Kings but have since been moved to Cairo's Egyptian Museum, where they remain to this day. Some believe that among the many pharaohs named Ramses who ruled Egypt, Ramses II was the pharaoh of the Biblical book of Exodus, from whom Moses freed the Israelites.
The excavation project that led to the discovery of the massive statue is an ongoing joint effort between Egypt's Ministry of Antiquities and the University of Leipzig to salvage and preserve the archaeology of the ancient city of Heliopolis. Found at the same excavation site was a limestone statue of Ramses II's grandson Pharaoh Seti II, further supporting the belief that the massive statue is of the great ruler. Preservation projects continue throughout Egypt, both to protect and, it is hoped, to learn more about the great civilization - from the young pharaoh King Tut to lost civilizations and the great leader Ramses II. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


Ziemer M.,University of Leipzig | Kardaun S.H.,University of Groningen | Liss Y.,Albert Ludwigs University of Freiburg | Mockenhaupt M.,Albert Ludwigs University of Freiburg
British Journal of Dermatology | Year: 2012

Background Stevens-Johnson syndrome (SJS) and toxic epidermal necrolysis (TEN) are severe cutaneous adverse reactions with high morbidity and mortality. Some expressions of lupus erythematosus (LE) may cause enormous difficulties in differentiation from SJS and TEN by showing large areas of sheet-like epidermal necrosis. Objective To evaluate clinically and histopathologically probable or definite cases of SJS/TEN with a history of systemic or other LE [(S)LE]. Methods This was a retrospective analysis of validated cases of SJS/TEN with a history of (S)LE, based on a large population-based national registry. Results Among 1366 patients with SJS/TEN, 17 with a sufficiently documented history of (S)LE and representative histological material could be identified, suggesting a considerable over-representation of LE in patients with SJS/TEN. Eight of these showed clinically and/or histopathologically some LE-characteristic features interfering with the diagnosis of SJS/TEN. Differentiation could be established on clinical and histopathological grounds: four patients were classified as SJS/TEN with a preceding (S)LE exacerbation and/or LE-typical histopathological features, and four as 'TEN-like' (S)LE. Conclusion Most patients with SJS/TEN and a history of (S)LE demonstrate clinical and histopathological properties allowing clear differentiation. However, occasionally acute cutaneous manifestations of (S)LE and SJS/TEN can be phenotypically similar, caused by extensive epidermal necrosis. Although no feature by itself is conclusive, a combination of recent (S)LE exacerbation, evident photodistribution, annular lesions and absent or only mild focal erosive mucosal involvement may favour LE over SJS/TEN clinically.
Histopathologically, in particular, junctional vacuolar alteration and the presence of solitary necrotic keratinocytes at lower epidermal levels, combined with moderate to dense periadnexal and perivascular lymphocytic infiltrates with a variable presence of melanophages and mucin, point to an LE-related origin. © 2011 The Authors. BJD © 2011 British Association of Dermatologists.


Pazaitou-Panayiotou K.,Theagenion Cancer Hospital | Michalakis K.,Queen Mary, University of London | Paschke R.,University of Leipzig
Hormone and Metabolic Research | Year: 2012

Thyroid cancer can be associated with thyrotoxicosis caused by Graves' disease, toxic multinodular goiter, or autonomously functioning thyroid adenoma. The objective of this study was to summarize current evidence regarding the association of thyroid cancer and hyperthyroidism, particularly with respect to the type of hyperthyroidism found in some patients, and whether this affects the outcome of the patient. A PubMed search was performed up to August 2011. Articles were identified using combinations of the following keywords/phrases: thyroid cancer, papillary thyroid cancer, follicular thyroid cancer, medullary thyroid cancer, anaplastic thyroid cancer, hyperthyroidism, Graves' disease, autonomous adenoma, toxic thyroid nodule, and toxic multinodular goiter. Original research papers, case reports, and review articles were included. We concluded that the incidence, as well as the prognosis, of thyroid cancer associated with hyperthyroidism is a matter of debate. It seems that Graves' disease is associated with larger, multifocal, and potentially more aggressive thyroid cancers than single hot nodules or multinodular toxic goiter. Patients with Graves' disease and thyroid nodules are at higher risk of developing thyroid cancer than patients with diffuse goiter. Every suspicious nodule associated with hyperthyroidism should be evaluated carefully. © Georg Thieme Verlag KG Stuttgart · New York.


Von Reumont B.M.,Natural History Museum in London | Richter S.,University of Leipzig | Alvarez F.,National Autonomous University of Mexico | Bleidorn C.,University of Leipzig | Jenner R.A.,Natural History Museum in London
Molecular Biology and Evolution | Year: 2014

Animal venoms have evolved many times. Venomous species are especially common in three of the four main groups of arthropods (Chelicerata, Myriapoda, and Hexapoda), which together represent tens of thousands of species of venomous spiders, scorpions, centipedes, and hymenopterans. Surprisingly, despite their great diversity of body plans, there is no unambiguous evidence that any crustacean is venomous. We provide the first conclusive evidence that the aquatic, blind, and cave-dwelling remipede crustaceans are venomous and that venoms evolved in all four major arthropod groups. We produced a three-dimensional reconstruction of the venom delivery apparatus of the remipede Speleonectes tulumensis, showing that remipedes can inject venom in a controlled manner. A transcriptomic profile of its venom glands shows that they express a unique cocktail of transcripts coding for known venom toxins, including a diversity of enzymes and a probable paralytic neurotoxin very similar to one described from spider venom. We screened a transcriptomic library obtained from whole animals and identified a nontoxin paralog of the remipede neurotoxin that is not expressed in the venom glands. This allowed us to reconstruct its probable evolutionary origin and underlines the importance of incorporating data derived from nonvenom gland tissue to elucidate the evolution of candidate venom proteins. This first glimpse into the venom of a crustacean and primitively aquatic arthropod reveals conspicuous differences from the venoms of other predatory arthropods such as centipedes, scorpions, and spiders and contributes valuable information for ultimately disentangling the many factors shaping the biology and evolution of venoms and venomous species. © 2013 The Author 2013. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.


Schakel A.M.J.,University of Leipzig | Schakel A.M.J.,Federal University of Pernambuco
Annals of Physics | Year: 2011

The effective action describing the gapless Nambu-Goldstone, or Anderson-Bogoliubov, mode of a zero-temperature dilute Fermi gas at unitarity is derived up to next-to-leading order in derivatives from the microscopic theory. Apart from a next-to-leading order term that is suppressed in the BCS limit, the effective action obtained in the strong-coupling unitary limit is proportional to that obtained in the weak-coupling BCS limit. © 2010 Elsevier Inc.


Lordick F.,University of Leipzig | Kang Y.-K.,Asan Medical Center | Chung H.-C.,Yonsei University | Salman P.,Fundacion Arturo Lopez Perez | And 12 more authors.
The Lancet Oncology | Year: 2013

Background: Patients with advanced gastric cancer have a poor prognosis and few efficacious treatment options. We aimed to assess the addition of cetuximab to capecitabine-cisplatin chemotherapy in patients with advanced gastric or gastro-oesophageal junction cancer. Methods: In our open-label, randomised phase 3 trial (EXPAND), we enrolled adults aged 18 years or older with histologically confirmed locally advanced unresectable (M0) or metastatic (M1) adenocarcinoma of the stomach or gastro-oesophageal junction. We enrolled patients at 164 sites (teaching hospitals and clinics) in 25 countries, and randomly assigned eligible participants (1:1) to receive first-line chemotherapy with or without cetuximab. Randomisation was done with a permuted block randomisation procedure (variable block size), stratified by disease stage (M0 vs M1), previous oesophagectomy or gastrectomy (yes vs no), and previous (neo)adjuvant (radio)chemotherapy (yes vs no). Treatment consisted of 3-week cycles of twice-daily capecitabine 1000 mg/m2 (on days 1-14) and intravenous cisplatin 80 mg/m2 (on day 1), with or without weekly cetuximab (400 mg/m2 initial infusion on day 1 followed by 250 mg/m2 per week thereafter). The primary endpoint was progression-free survival (PFS), assessed by a masked independent review committee in the intention-to-treat population. We assessed safety in all patients who received at least one dose of study drug. This study is registered at EudraCT, number 2007-004219-75. Findings: Between June 30, 2008, and Dec 15, 2010, we enrolled 904 patients. Median PFS for 455 patients allocated capecitabine-cisplatin plus cetuximab was 4·4 months (95% CI 4·2-5·5) compared with 5·6 months (5·1-5·7) for 449 patients who were allocated to receive capecitabine-cisplatin alone (hazard ratio 1·09, 95% CI 0·92-1·29; p=0·32). 
369 (83%) of 446 patients in the chemotherapy plus cetuximab group and 337 (77%) of 436 patients in the chemotherapy group had grade 3-4 adverse events, including grade 3-4 diarrhoea, hypokalaemia, hypomagnesaemia, rash, and hand-foot syndrome. Grade 3-4 neutropenia was more common in controls than in patients who received cetuximab. Incidence of grade 3-4 skin reactions and acne-like rash was substantially higher in the cetuximab-containing regimen than in the control regimen. 239 (54%) of 446 in the cetuximab group and 194 (44%) of 436 in the control group had any grade of serious adverse event. Interpretation: Addition of cetuximab to capecitabine-cisplatin provided no additional benefit to chemotherapy alone in the first-line treatment of advanced gastric cancer in our trial. Funding: Merck KGaA. © 2013 Elsevier Ltd.
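Median progression-free survival figures like those above are read off Kaplan-Meier curves, which account for censored patients. A minimal sketch of the estimator on small made-up (time, event) data - not the trial's patient records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. events: 1 = progression/death, 0 = censored.
    Returns a list of (time, survival probability) at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # pool events and censorings tied at this time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= removed
    return curve

def median_survival(curve):
    """First time at which S(t) drops to 0.5 or below, else None (not reached)."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None

# Made-up months-to-progression with censoring flags (illustrative only)
times  = [1, 2, 2, 3, 4, 6]
events = [1, 0, 1, 1, 1, 0]
curve = kaplan_meier(times, events)
print(curve, median_survival(curve))
```

The hazard ratio reported in the trial compares the full hazard functions of the two arms (typically via Cox regression), which is why the HR and the pair of medians can tell slightly different stories.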


Richter D.,University of Leipzig | Kunzmann U.,University of Bamberg
Psychology and Aging | Year: 2011

This study investigated age differences in cognitive and affective facets of empathy: the ability to perceive another's emotions accurately, the capacity to share another's emotions, and the ability to behaviorally express sympathy in an empathic episode. Participants, 80 younger (Mage = 32 years) and 73 older (Mage = 59 years) adults, viewed eight film clips, each portraying a younger or an older adult thinking aloud about an emotionally engaging topic that was relevant to either younger adults or older adults. In comparison with their younger counterparts, older adults generally reported and expressed greater sympathy while observing the target persons, and they were better able to share the emotions of the target persons who talked about a topic that was relevant to older adults. Age-related deficits in the cognitive ability to accurately perceive another's emotions were evident only when the target person talked about a topic of little relevance to older adults. In sum, the present performance-based evidence speaks for multidirectional age differences in empathy. © 2010 American Psychological Association.


Sens-Schonfelder C.,German Research Center for Geosciences | Pomponi E.,University of Leipzig | Peltier A.,CNRS Paris Institute of Global Physics
Journal of Volcanology and Geothermal Research | Year: 2014

Activity of Piton de la Fournaise (PdF) volcano in La Réunion Island modifies the seismic velocities within the edifice. Using the 2010 and 2011 data from a network of 21 seismic stations in the vicinity of PdF, changes of seismic velocities are investigated using passive image interferometry, i.e. interferometry of seismic noise correlations. As noise correlations change significantly over time in response to volcanic activity, a method is presented that allows us to measure continuous long term velocity changes with high and constant accuracy by using multiple periods as reference. A long term velocity increase is found that averages about 0.25% per year. This trend is superimposed by short term changes that exhibit a clear connection with summit seismo-tectonic earthquakes indicating the effect of volcanic activity. Characteristic signatures of velocity changes are identified for post-eruptive periods of deflation that show an increase of velocity associated with subsidence observed by GPS. Periods of pre-eruptive inflation are characterized by decreasing velocity. Seismic crises can be associated with either increasing or decreasing velocity depending on whether the magma movement leads to deflation due to an eruption emptying the shallow plumbing system or to inflation caused by a non-eruptive intrusion. With a simple assumption about the spatial sensitivity of the measurements both processes are found to have the strongest effect in the central summit area of the volcano which also shows the strongest surface displacements during the time investigated here. We do not observe a dependence of the velocity change on the location of the erupting fissures, instead the distribution of changes for the three inflation periods and the two eruptions are similar indicating that the velocity changes observed here reflect the dynamics of a shallow magma reservoir rather than the effect of the eruption at the surface. © 2014.
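Passive image interferometry infers relative velocity changes by comparing a current noise-correlation function against a reference. One standard approach, the stretching method, resamples the reference by trial stretch factors and keeps the factor that maximizes the correlation coefficient; under the usual homogeneous-change convention dt/t = -dv/v, that factor is read as the velocity change. A toy numpy sketch on a synthetic waveform (not the PdF data):

```python
import numpy as np

t = np.linspace(0.0, 10.0, 2001)                       # lag-time axis (s)
ref = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t)   # synthetic reference correlation

# A small velocity increase compresses travel times: u(t) -> u(t * (1 + eps))
eps_true = 0.004                                       # 0.4 %, to be recovered
cur = np.interp(t * (1 + eps_true), t, ref)

def stretch_factor(ref, cur, t, grid):
    """Grid-search the stretch factor maximizing the correlation coefficient."""
    cc = [np.corrcoef(np.interp(t * (1 + e), t, ref), cur)[0, 1] for e in grid]
    return grid[int(np.argmax(cc))]

grid = np.linspace(-0.01, 0.01, 401)                   # trial factors, step 5e-5
eps_est = stretch_factor(ref, cur, t, grid)
print(eps_est)
```

The multi-reference scheme described in the abstract addresses a weakness of this basic version: a single fixed reference degrades accuracy as the medium drifts far from it over long observation periods.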


Borte S.,Karolinska University Hospital | Borte S.,University of Leipzig | Borte S.,Hospital St Georg Ggmbh Leipzig | Von Dobeln U.,Karolinska University Hospital | Hammarstrom L.,Karolinska University Hospital
Current Opinion in Hematology | Year: 2013

PURPOSE OF REVIEW: Technical possibilities for screening for inborn errors of immune function at the neonatal stage have been progressing rapidly, whereas the guidelines that apply to evaluating the benefits and concerns of expanding screening panels have not been broadly discussed for primary immunodeficiency diseases (PID). This review reflects on the assessment of severe combined immunodeficiencies (SCID), primary agammaglobulinaemias (such as X-linked agammaglobulinaemia) and inherited haemophagocytic syndromes (such as familial haemophagocytic lymphohistiocytosis) for inclusion in newborn screening (NBS) programmes. RECENT FINDINGS: Screening programmes in several states in the United States have been supplemented with the T-cell receptor excision circle assay during the past few years to identify children with SCID. The reported experience indicates that an efficient and validated screening approach for SCID is feasible on a population-based scale. SUMMARY: In the light of recent advances, severe PID ought to be discussed for rapid implementation in national NBS programmes based upon clinical, social and economic criteria as consolidated in the extended 22-item Wilson-Jungner framework. Although SCID currently most favourably fulfils these screening guidelines, other strong candidates can be identified among primary immunodeficiency disorders. Future efforts of healthcare professionals and policy makers are essential to improve the concept of neonatal screening for PID. © 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins.


Sarkar S.,Charité - Medical University of Berlin | Sarkar S.,University of Hamburg | Sarkar R.,Asklepios Westklinikum | Berg T.,University of Leipzig | Schaefer M.,Charité - Medical University of Berlin
British Journal of Psychiatry | Year: 2015

Background Antiviral therapy with interferon-alpha (IFN-α) for hepatitis C virus (HCV) infection is associated with increased risk for depression. Aims To identify clinical predictors for IFN-α-induced depression during antiviral therapy for HCV infection. Method Depression (defined with the Montgomery-Åsberg Depression Rating Scale (MADRS)) was evaluated before and during antiviral treatment in 91 people with chronic HCV infection without a history of psychiatric disorders. Cognitive function was evaluated using the Trail Making Test A/B (TMT A/B). (Trial registration at ClinicalTrials.gov: NCT00136318.) Results Depression during antiviral therapy was significantly associated with a baseline MADRS score of 3 or higher (P = 0.006). In total, 89% (n = 16) of patients who had a baseline score above 0 for the single item sadness developed depression. Poor baseline performance in the TMT A (P = 0.027) and TMT B (P = 0.033) was predictive for severe depression. Conclusions Pre-treatment screening for subthreshold depressive and cognitive symptoms will help to identify those at risk for IFN-α-associated depression among patients with chronic hepatitis C.
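The usefulness of a pre-treatment screen such as the baseline MADRS cut-off above is conventionally reported as sensitivity, specificity and positive predictive value. A sketch with invented scores and outcomes - not the study's patient data - using the cut-off of 3 mentioned in the abstract:

```python
def screen_performance(scores, outcomes, cutoff):
    """Sensitivity, specificity and PPV of predicting outcome=1 when score >= cutoff."""
    tp = sum(1 for s, o in zip(scores, outcomes) if s >= cutoff and o == 1)
    fp = sum(1 for s, o in zip(scores, outcomes) if s >= cutoff and o == 0)
    fn = sum(1 for s, o in zip(scores, outcomes) if s < cutoff and o == 1)
    tn = sum(1 for s, o in zip(scores, outcomes) if s < cutoff and o == 0)
    return tp / (tp + fn), tn / (tn + fp), tp / (tp + fp)

# Invented baseline MADRS scores and later depression outcomes (1 = depressed)
scores   = [0, 1, 2, 3, 4, 5, 6, 2, 3, 8]
outcomes = [0, 0, 0, 1, 1, 1, 0, 0, 1, 1]
sens, spec, ppv = screen_performance(scores, outcomes, cutoff=3)
print(sens, spec, ppv)
```

The trade-off between these quantities is what moving the cut-off adjusts: a lower threshold catches more future cases (higher sensitivity) at the cost of flagging more patients who would never develop depression.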


Bluher M.,University of Leipzig | Mantzoros C.S.,Boston Medical Health Center
Metabolism: Clinical and Experimental | Year: 2015

This year marks the 20th anniversary of the discovery of leptin, which has tremendously stimulated translational obesity research. The discovery of leptin has led to realizations that have established adipose tissue as an endocrine organ, secreting bioactive molecules including hormones now termed adipokines. Through adipokines, the adipose tissue influences the regulation of several important physiological functions including but not limited to appetite, satiety, energy expenditure, activity, insulin sensitivity and secretion, glucose and lipid metabolism, fat distribution, endothelial function, hemostasis, blood pressure, neuroendocrine regulation, and function of the immune system. Adipokines hold great potential for clinical use as therapeutics for obesity and obesity-related metabolic, cardiovascular and other diseases. After 20 years of intense research efforts, recombinant leptin and the leptin analog metreleptin are already available for the treatment of congenital leptin deficiency and lipodystrophy. Other adipokines are also emerging as promising candidates for urgently needed novel pharmacological treatment strategies, not only in obesity but also in other disease states associated with and influenced by adipose tissue size and activity. In addition, the prediction of reduced type 2 diabetes risk by high circulating adiponectin concentrations suggests that adipokines have the potential to be used as biomarkers for individual treatment success and disease progression, to monitor clinical responses and to identify non-responders to anti-obesity interventions. With the growing number of adipokines there is an increasing need to define their function, molecular targets and translational potential for the treatment of obesity and other diseases.
In this review we present research data on adipose tissue secreted hormones, the discovery of which followed the discovery of leptin 20 years ago pointing to future research directions to unravel mechanisms of action for adipokines. © 2015 Elsevier Inc. All rights reserved.


Stahl U.,Max Planck Institute for Biogeochemistry | Reu B.,University of Leipzig | Wirth C.,University of Leipzig | Wirth C.,German Center for Integrative Biodiversity Research iDiv Halle Jena Leipzig
Proceedings of the National Academy of Sciences of the United States of America | Year: 2014

Using functional traits to explain species' range limits is a promising approach in functional biogeography. It replaces the idiosyncrasy of species-specific climate ranges with a generic trait-based predictive framework. In addition, it has the potential to shed light on specific filter mechanisms creating large-scale vegetation patterns. However, its application to a continental flora, spanning large climate gradients, has been hampered by a lack of trait data. Here, we explore whether five key plant functional traits (seed mass, wood density, specific leaf area (SLA), maximum height, and longevity of a tree) - indicative of life history, mechanical, and physiological adaptations - explain the climate ranges of 250 North American tree species distributed from the boreal to the subtropics. Although the relationship between traits and the median climate across a species range is weak, quantile regressions revealed strong effects on range limits. Wood density and seed mass were strongly related to the lower but not upper temperature range limits of species. Maximum height affects the species range limits in both dry and humid climates, whereas SLA and longevity do not show clear relationships. These results allow the definition and delineation of climatic "no-go areas" for North American tree species based on key traits. As some of these key traits serve as important parameters in recent vegetation models, the implementation of trait-based climatic constraints has the potential to predict both range shifts and ecosystem consequences on a more functional basis. Moreover, for future trait-based vegetation models our results provide a benchmark for model evaluation.
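The quantile regressions used above estimate range *limits* rather than means by minimizing the pinball (quantile) loss; the tau-quantile of the data is exactly the constant that minimizes this loss, which is what makes the technique sensitive to the edges of a species' climate envelope. A small numpy illustration of that property on synthetic data (with covariates, the same loss underlies full quantile regression, e.g. statsmodels' QuantReg):

```python
import numpy as np

def pinball_loss(tau, y, q):
    """Mean pinball (quantile) loss of predicting the constant q for data y."""
    r = y - q
    return np.mean(np.maximum(tau * r, (tau - 1) * r))

y = np.arange(1.0, 101.0)            # synthetic values 1..100
tau = 0.9                            # target the upper range limit
candidates = np.arange(0.0, 101.0, 0.5)
losses = [pinball_loss(tau, y, q) for q in candidates]
best = candidates[int(np.argmin(losses))]
print(best)   # falls in the minimizing interval around the 0.9 quantile
```

Fitting this loss against a trait axis instead of a constant yields the trait-dependent upper (or lower) climate limit, which is how "no-go areas" can be delineated from trait values.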


Dahms S.O.,Leibniz Institute for Age Research | Kuester M.,Leibniz Institute for Age Research | Streb C.,Friedrich - Alexander - University, Erlangen - Nuremberg | Roth C.,University of Leipzig | And 2 more authors.
Acta Crystallographica Section D: Biological Crystallography | Year: 2013

Heavy-atom clusters (HA clusters) containing a large number of specifically arranged electron-dense scatterers are especially useful for experimental phase determination of large complex structures, weakly diffracting crystals or structures with large unit cells. Often, the determination of the exact orientation of the HA cluster and hence of the individual heavy-atom positions proves to be the critical step in successful phasing and subsequent structure solution. Here, it is demonstrated that molecular replacement (MR) with either anomalous or isomorphous differences is a useful strategy for the correct placement of HA cluster compounds. The polyoxometallate cluster hexasodium metatungstate (HMT) was applied in phasing the structure of death receptor 6. Even though the HA cluster is bound in alternate partially occupied orientations and is located at a special position, its correct localization and orientation could be determined at resolutions as low as 4.9 Å. The broad applicability of this approach was demonstrated for five different derivative crystals that included the compounds tantalum tetradecabromide and trisodium phosphotungstate in addition to HMT. The correct placement of the HA cluster depends on the length of the intramolecular vectors chosen for MR, such that both a larger cluster size and the optimal choice of the wavelength used for anomalous data collection strongly affect the outcome.


Reichenbach A.,University of Leipzig | Derouiche A.,University of Bonn | Kirchhoff F.,Max Planck Institute for Experimental Medicine | Kirchhoff F.,Research Center for Molecular Physiology of the Brain | Kirchhoff F.,Saarland University
Brain Research Reviews | Year: 2010

Astroglia constitute the major glial population of the brain. Highly branched and ramified protoplasmic astrocytes are the predominant form in grey matter and are found in almost all regions of the central nervous system. In cerebellum and retina, two forms of elongated radial glia exist (Bergmann glia and Müller cells, respectively) that share many features with protoplasmic astrocytes with respect to their perisynaptic association. Although these three astroglial cell types differ in their gross morphology, they are characterized by a polarized orientation of their processes. While one or only a few processes contact CNS boundaries such as capillaries and pia, an overwhelming number of thin filopodia- and lamellipodia-like process terminals contact and enwrap synapses, the sites of neuronal communication. The perisynaptic glial processes are the primary compartments that sense neuronal activity. After signal integration, they can also modulate synaptic transmission, thereby contributing to neural plasticity. Despite their importance, the mechanisms that (1) target astroglial processes toward pre- and postsynaptic compartments and (2) control this interaction during plastic events of the brain such as learning or injury are poorly understood. This review will summarize our current knowledge and highlight some open questions. © 2010 Elsevier B.V.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2010.2.3.3-2 | Award Amount: 21.90M | Year: 2010

RNA virus infections kill millions of humans annually, largely owing to the lack of suitable vaccines and drugs to control them. This problem is addressed in this FP7 call, and in response a consortium of Europe's and Asia's leading molecular virologists, structural biologists, medicinal chemists and bioinformaticians has been brought together to generate a state-of-the-art drug discovery and design programme. The project aims to identify Small molecule Inhibitor Leads Versus Emerging and neglected RNA viruses (SILVER). It will focus its activities on selected medically important RNA viruses for which the development of drugs is considered essential (dengue-, entero- and paramyxoviruses), whereas other relatively neglected and/or emerging RNA viruses will be explored to identify the most promising viral protein targets and antiviral compounds. A pipeline strategy has been developed to enable the inclusion in SILVER of viruses at all levels of existing knowledge. Targets for potential drugs include infectious virus, structurally characterised viral enzymes and other proteins. Leads for potential antiviral drugs have been identified by screening compound libraries in virus-infected cell culture systems and in vitro assays using purified viral enzymes. Selective inhibitors of viral replication have also been (and are being) derived using detailed structural knowledge of viral proteins and structure-based drug design. Hits will be assayed using individual viral protein targets and replicative proteins in complex with viral RNA. The potential protective activity of the most potent inhibitors that have a favourable (in vitro) ADME-tox profile will be assessed in relevant infection models in animals. Licences on promising compounds or compound classes will be offered to interested pharmaceutical industry partners. The SILVER consortium is well placed to play a major role in contributing to the international effort to develop strategies to improve world health.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.3.3 | Award Amount: 3.86M | Year: 2013

Biometric identification (ID) now protects a wide range of societal uses, from national security and controlled access to health care, banking and leisure, and this requires ever more reliable built-in ID detection systems. In this context, PiezoMAT proposes a new technology of high-resolution fingerprint sensors based on a matrix of interconnected piezoelectric nanowires (NWs). The long-term objective of PiezoMAT is to offer high-performance fingerprint sensors with minimal volume occupation for integration into built-in systems, able to compete on the market with the best existing products. PiezoMAT proceeds by local deformation of an array of individually contacted piezoelectric NWs and reconstruction from the generated potentials, whose amplitudes are proportional to the NW displacement. Each NW and its associated electronics constitute a sensor, or pixel. The sub-micron dimension of the NWs allows for high spatial-frequency sampling of every fingerprint feature, enabling extremely reliable fingerprint differentiation through detection of the smallest minutiae (pores and ridge shapes). Charge-collection efficiency depends strongly on the electrode configuration on each NW. PiezoMAT explores several possible configurations associated with gradual levels of technological challenges and risks, with a strong focus on developing reliable device design tools for present and future application-related adaptability. For the purposes of the PiezoMAT research, it is foreseen to collect the generated charges and analogue output signals via metal lines connected to remote electronics on a printed circuit board. This configuration does not allow for maximum NW integration density but is designed to yield sufficient resolution to demonstrate the concept, the major technological achievements and the actual performance increase as compared to the state of the art. Long-term developments will pursue full electronics integration for optimal sensor resolution.
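The read-out principle described above — each nanowire pixel generating a potential whose amplitude is proportional to the local displacement — can be sketched as a toy reconstruction. The sensitivity constant and frame values below are invented for illustration only; they are not PiezoMAT parameters:

```python
# Illustrative sketch (not project code): recovering a displacement map from
# nanowire pixel potentials, assuming a linear response V = k * d per pixel.

SENSITIVITY_MV_PER_NM = 0.5  # hypothetical piezoelectric sensitivity k

def reconstruct_displacements(potentials_mv, k=SENSITIVITY_MV_PER_NM):
    """Map a matrix of measured pixel potentials (mV) to displacements (nm)."""
    return [[v / k for v in row] for row in potentials_mv]

# A 3x3 patch of measured pixel potentials (mV); a ridge crosses the middle row.
frame = [
    [0.0, 0.5, 0.0],
    [1.0, 1.5, 1.0],
    [0.0, 0.5, 0.0],
]
displacements = reconstruct_displacements(frame)
print(displacements[1][1])  # centre-pixel displacement in nm -> 3.0
```

In the real device each pixel is an individually contacted nanowire, so the same per-pixel scaling applies, with calibration and cross-talk correction on top.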


Grant
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2008-1.1.1 | Award Amount: 10.84M | Year: 2009

Research infrastructures are part of the EC's current priorities in structuring the European research area. Among these, the importance of high magnetic field facilities has been recognized, as witnessed by the continued EC funding of many high magnetic field projects. Under FP6, three different projects were accepted. The Grenoble High Magnetic Field Laboratory has been running a RITA program (until 31/12/2007), whereas the TNA of all other European high field facilities is being coordinated by the I3 EuroMagNET (until 31/12/2008). The European pulsed high magnetic field laboratories are executing a Design Study for the next generation of pulsed field user facilities (until 31/3/2009). All these programs are running very satisfactorily and contribute to the excellence of Europe's high magnetic field research. For FP7, the principal actors of Europe's high magnetic field research, the Grenoble High Magnetic Field Laboratory (Grenoble, France), the High Field Magnet Laboratory (Nijmegen, the Netherlands), the Hochfeld-Labor Dresden (Dresden, Germany) and the Laboratoire National des Champs Magnétiques Pulsés (Toulouse, France), propose to unite all their transnational access, together with joint research activities and networking activities, into one I3, called EuroMagNET II. This I3 is considered a very important step towards full collaboration between Europe's high field facilities, which will bring European high magnetic field science to a level comparable with that in the USA. It is also a step towards the creation of a multi-site European Magnetic Field Laboratory (EMFL). Within the context of the ESFRI Roadmap Update, a proposal for such an EMFL is currently under consideration, for a planned realization in 2015.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2011.5.2 | Award Amount: 5.47M | Year: 2013

The Go-Smart project will build a generic open-source software simulation environment for the planning of image-guided percutaneous Minimally Invasive Cancer Treatment (MICT). MICT includes radiofrequency ablation (RFA), cryoablation, microwave ablation (MW), transarterial chemoembolisation (TACE), brachytherapy (BT) and, prospectively, irreversible electroporation (IRE). Apart from TACE, each type of MICT uses needles that are inserted into the tumour tissue, and the tissue is destroyed through heating, cooling, or the application of an electric field or radiation. These treatments are often combined with TACE. The commonalities between the different procedures allow for the development of a generic, reusable, robust simulation environment with the relevant physics and physiology needed to correctly predict the result of MICT in terms of lesion size and shape. The environment will incorporate patient data and appropriate physiological models to simulate tissue response to heat, cooling, hypoxia, radiation or electrical pulses. The models will account for multi-scale physiological dependencies between a full organ, its anatomical structures and tissue properties down to the cellular level. The software environment will be open-ended, with extendable interfaces to allow clinicians to add further patient data collected before, during and after MICTs. This data will be used by the research community to refine the existing physiological tissue models, thus transforming the environment into a user-driven, growing info-structure. The Go-Smart environment will allow Interventional Radiologists (IR) to select an optimal type of MICT by simulating the personalised result of the different treatments and medical protocols under patient-specific conditions. Bringing different MICTs into a unified simulation environment is a unique approach that will promote their systematic comparison and establish much-needed common standards and protocols for MICT in Europe.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.4.3 | Award Amount: 9.93M | Year: 2010

Over the last 3 years, the semantic web activity has gained momentum with the widespread publishing of structured data as RDF. The Linked Data paradigm has therefore evolved from a practical research idea into a very promising candidate for addressing one of the biggest challenges in the area of intelligent information management: the exploitation of the Web as a platform for data and information integration in addition to document search. To translate this initial success into a world-scale disruptive reality, encompassing the Web 2.0 world and enterprise data alike, the following research challenges need to be addressed: improve coherence and quality of data published on the Web, close the performance gap between relational and RDF data management, establish trust on the Linked Data Web and generally lower the entrance barrier for data publishers and users. With partners among those who initiated and strongly supported the Linked Open Data initiative, the LOD2 project aims at tackling these challenges by developing:
1. enterprise-ready tools and methodologies for exposing and managing very large amounts of structured information on the Data Web;
2. a testbed and bootstrap network of high-quality multi-domain, multi-lingual ontologies from sources such as Wikipedia and OpenStreetMap;
3. machine learning algorithms for automatically enriching, repairing, interlinking and fusing data from the Web;
4. standards and methods for reliably tracking provenance, ensuring privacy and data security as well as for assessing the quality of information;
5. adaptive tools for searching, browsing, and authoring of Linked Data.
We will integrate and syndicate linked data with large-scale, existing applications and showcase the benefits in the three application scenarios media & publishing, corporate data intranets and e-government.
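As a toy illustration of the Linked Data model this paragraph builds on — facts expressed as subject-predicate-object triples that can be merged from independent sources and queried with graph patterns — here is a minimal stdlib-only sketch. The URIs and figures are invented for the example, and a real deployment would use RDF tooling and SPARQL rather than this hand-rolled store:

```python
# Facts as (subject, predicate, object) triples, kept in a set so that
# identical statements published by different sources fuse automatically.
triples = set()

def add(subject, predicate, obj):
    triples.add((subject, predicate, obj))

# Two hypothetical sources publish overlapping structured data.
add("ex:Leipzig", "ex:locatedIn", "ex:Saxony")   # source A
add("ex:Leipzig", "ex:population", "600000")     # source A (made-up value)
add("ex:Leipzig", "ex:locatedIn", "ex:Saxony")   # source B (duplicate fuses away)

def query(s=None, p=None, o=None):
    """Match a single triple pattern, with None acting as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

print(query("ex:Leipzig", "ex:locatedIn", None))
# -> [('ex:Leipzig', 'ex:locatedIn', 'ex:Saxony')]
```

The "performance gap between relational and RDF data management" mentioned above arises because real stores must answer joins of many such patterns over billions of triples, which this linear scan obviously does not attempt.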


Grant
Agency: European Commission | Branch: FP7 | Program: CP-CSA | Phase: SPA.2011.1.5-02 | Award Amount: 27.65M | Year: 2011

MACC II (Monitoring Atmospheric Composition and Climate Interim Implementation) is designed to meet the requirements that have been expressed for prototype operational GMES services for the atmospheric domain. From late-2011 MACC II will continue the operation and development of the GMES service lines established by the MACC project and prepare for its transition in 2014 to become the atmospheric monitoring component of GMES Operations. MACC II will prepare for full operations in terms of continuity, sustainability and availability. It will maintain and further develop the efficiency and resilience of its end-to-end processing system, and will refine the quality of the products of the system. It will adapt the system to make use of observations from new satellites, in particular the first of the atmospheric Sentinels, and will interface with FP7 RTD projects that contribute towards long-term service improvement. MACC II will ensure that its service lines best meet both the requirements of downstream-service providers and end users, and the requirements of the global scientific user community. The service lines will cover air quality, climate forcing, stratospheric ozone and solar radiation. MACC II will deliver products and information that support the establishment and implementation of European policy and wider international programmes. It will acquire and assimilate observational data to provide sustained real-time and retrospective global monitoring of greenhouse gases, aerosols and reactive gases such as tropospheric ozone and nitrogen dioxide. It will provide daily global forecasts of atmospheric composition, detailed air-quality forecasts and assessments for Europe, and key information on long range transport of atmospheric pollutants. It will provide comprehensive web-based graphical products and gridded data. Feedback will be given to space agencies and providers of in situ data on the quality of their data and future observational requirements.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.2.4.2-3 | Award Amount: 3.85M | Year: 2013

More than 14 million Europeans suffer from heart failure (HF), of whom more than 50% have HF with preserved left ventricular (LV) ejection fraction (EF) (HFPEF, diastolic heart failure). HFPEF is the only cardiovascular disease with increasing prevalence and incidence, affecting 10-20% of the elderly and contributing substantially to hospitalizations of elderly HF patients. Currently, no medical treatment has been shown to be effective, and the economic, social and personal burden of HFPEF is enormous; this disease constitutes one of the most pressing unmet clinical needs. A cardinal feature of HFPEF is exercise intolerance. The pathophysiology of exercise intolerance in HFPEF depends on multiple factors in the heart, endothelium and skeletal muscles. From a pathophysiological point of view, exercise could by far outweigh any pharmacological intervention in this heterogeneous syndrome, since lifestyle-dependent risk factors, physical inactivity and physical deconditioning underlie and contribute to HFPEF. OptimEx will focus on the cardiovascular effects of exercise training as primary and secondary prevention of HFPEF. We will combine in vivo and in vitro studies in man and rats in serial experiments that will advance our understanding of the fundamental cellular and molecular mechanisms underpinning dose-dependent exercise-induced changes in the heart, blood vessels and skeletal muscles. This research aims to tackle one of the major health problems the developed world faces with its ageing societies and the increasing prevalence of HFPEF, and will support sustainable health systems in EU member states through improvements in the clinical management of a common and disabling disease. The project is therefore highly relevant to improving the health of European citizens and important for promoting healthy ageing and preventing disease.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: ENV.2011.1.1.2-2 | Award Amount: 3.81M | Year: 2011

ECLIPSE aims to develop and assess effective emission abatement strategies for short-lived climate agents in order to provide sound scientific advice on how to mitigate climate change while improving the quality of air. Current climate policy does not consider a range of short-lived gases and aerosols, and their precursors (including nitrogen oxides, volatile organic compounds, sulphate, and black carbon). These nevertheless make a significant contribution to climate change and directly influence air quality. There are fundamental scientific uncertainties in characterizing both the climate and air quality impacts of short-lived species, and many aspects (for example, the regional dependence) are quite distinct from those for the longer-lived climate gases already included in the Kyoto Protocol. ECLIPSE will bring together 11 institutes with established and complementary expertise for a closely co-ordinated 3-year programme. It will build on existing knowledge and use state-of-the-art chemistry and climate models to (i) improve understanding of key atmospheric processes (including the impact of short-lived species on cloud properties) and characterize existing uncertainties; (ii) evaluate model simulations of short-lived species and their long-range transport using ground-based and satellite observations; (iii) perform case studies on key source and receptor regions (focused on Southeastern Europe, China and the Arctic); (iv) quantify the radiative forcing and climate response due to short-lived species, incorporating the dependence on where the species are emitted; (v) refine the calculation of climate metrics, and develop novel metrics which, for example, consider the rate of climate warming and go beyond using global-mean quantities; (vi) clarify possible win-win and trade-off situations between climate policy and air quality policy; (vii) identify a set of concrete cost-effective abatement measures of short-lived species with large co-benefits.
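As background to point (v), one common family of climate metrics integrates the radiative forcing of a pulse emission that decays with the species' atmospheric lifetime. The following sketch uses a simple one-exponential model with made-up numbers, purely to illustrate why lifetime matters so much for short-lived species; it is not an ECLIPSE calculation:

```python
# Toy absolute global warming potential (AGWP): the time-integrated radiative
# forcing of a 1 kg pulse emission that decays exponentially with lifetime tau.
# All numerical values below are invented for illustration.
import math

def agwp(rf_per_kg, lifetime_years, horizon_years):
    """Integral of rf_per_kg * exp(-t / tau) dt from t = 0 to the time horizon."""
    return rf_per_kg * lifetime_years * (1 - math.exp(-horizon_years / lifetime_years))

# A short-lived species (tau = 0.03 yr, ~11 days) vs a longer-lived one (tau = 12 yr),
# assuming the same (hypothetical) forcing efficiency per kg:
short_lived = agwp(rf_per_kg=2.0e-12, lifetime_years=0.03, horizon_years=20)
long_lived = agwp(rf_per_kg=2.0e-12, lifetime_years=12.0, horizon_years=20)
print(short_lived < long_lived)  # -> True: lifetime dominates at this horizon
```

Metrics that "go beyond global-mean quantities" replace the single `rf_per_kg` with regionally resolved forcing and response, which is exactly where the regional dependence noted above enters.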


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-ITN-2008 | Award Amount: 3.29M | Year: 2009

Over the last 20 years it has become evident that glial cells are involved in virtually every aspect of nervous system function. Neurons and glia exchange chemical signals that are essential for the normal function of the nervous system and are crucial in disease. Recently it was discovered that glial cells also act as neural stem cells, both in early development and in adulthood, and participate in synapse formation. This rapid progress in glia research has generated not only many new insights but also numerous new questions, and it has become obvious that, to deal with the complexity of glia-neuron interactions and to formulate new concepts in the field, we need novel experimental paradigms and methodologies. The EdU-GLIA network aims at equipping young researchers with the most advanced skills and knowledge in glial cell research, in order to generate a new generation of scientists dedicated to resolving open questions of glia-neuron interactions. We will offer young investigators high-quality interdisciplinary projects that will be individually tailored, each supervised by two mentors. Our team of supervisors consists of leading experts in the field, including an industrial partner. EdU-GLIA is based upon current, original and promising research models ranging from basic science to clinical application, and on the most advanced and sophisticated research techniques. To complement the research projects, we will offer the young scientists a rich selection of courses, workshops, guest lectures and symposia. This programme will not only provide insights into research techniques and paradigms complementary to the individual projects, but will also strengthen professional skills such as writing papers, presenting data and ethical conduct, enhancing the career prospects of the fellows. In summary, the main objective of EdU-GLIA is to train promising young researchers for careers in basic as well as translational research, including clinical application and industry.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EINFRA-9-2015 | Award Amount: 8.22M | Year: 2016

The overall objective of READ is to implement a Virtual Research Environment where archivists, humanities scholars, computer scientists and volunteers collaborate with the ultimate goal of boosting research, innovation, development and usage of cutting-edge technology for the automated recognition, transcription, indexing and enrichment of handwritten archival documents. This Virtual Research Environment will not be built from the ground up, but will benefit from research, tools, data and resources generated in multiple national and EU-funded research and development projects and provide a basis for sustaining the network and the technology in the future. This ICT-based e-infrastructure will address the Societal Challenge mentioned in "Europe in a Changing World", namely the transmission of European cultural heritage and the uses of the past as one of the core requirements of a reflective society. Based on research and innovation enabled by the READ Virtual Research Environment we will be able to explore and access hundreds of kilometres of archival documents via full-text search and therefore be able to open up one of the last hidden treasures of Europe's rich cultural heritage.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2012.2.1.2-2 | Award Amount: 23.12M | Year: 2012

METACARDIS applies a systems medicine multilevel approach to identify biological relationships between gut microbiota, assessed by metagenomics, and host genome expression regulation, which will improve understanding and innovative care of cardiometabolic diseases (CMD) and their comorbidities. CMD comprise metabolic (obesity, diabetes) and heart diseases characterized by a chronic evolution in ageing populations and costly treatments. Therapies require novel integrated approaches taking into account CMD natural evolution. METACARDIS associates European leaders in metagenomics, who have been successful in establishing the structure of the human microbiome as part of the EU FP7 MetaHIT consortium, with clinical and fundamental researchers, SMEs, patient associations and food companies to improve the understanding of pathophysiological mechanisms, prognosis and diagnosis of CMD. We will use next-generation sequencing technologies and high-throughput metabolomic platforms to identify gut microbiota- and metabolomic-derived biomarkers and targets associated with CMD risks. The pathophysiological role of these markers will be tested in both preclinical models and replication cohorts, allowing the study of CMD progression in patients recruited at three European clinical centres of excellence. Their impact on host gene transcription will be characterised in patients selected for typical features of CMD evolution. Application of computational models and visualisation tools to complex datasets combining clinical information, environmental patterns and gut microbiome, metabolome and transcriptome data is a central integrating component in the research, which will be driven by world leaders in metagenomic and functional genomic data analysis. These studies will identify novel molecular targets, biomarkers and predictors of CMD progression, paving the way for personalized medicine in CMD.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.5.2 | Award Amount: 3.71M | Year: 2014

The ClinicIMPPACT proposal builds upon the successful completion of the IMPPACT project (Grant No. 223877, completed in February 2012), which resulted in the creation of a physiological radio-frequency ablation (RFA) model for liver cancer treatment. This preliminary RFA model has been tested in porcine animal studies with extensive histological workup and in a clinical study using patient data for retrospective simulations.
The main objectives of this proposal are: (i) to bring the existing IMPPACT RFA model for liver cancer treatment into clinical practice; (ii) to verify and refine the model in a small clinical study; (iii) to develop the model into a real-time patient-specific RFA planning and support system for Interventional Radiologists (IR) under special consideration of their clinical workflow needs; (iv) to establish a corresponding training procedure for IRs; (v) to evaluate the practicality and benefit of the model for routine clinical purposes by analyzing user surveys and running expert forums.
The proposed RFA planning and support tool is therefore unique, as it offers a validated software environment where IRs can interact with the virtual tumour ablation during the RFA treatment through the extensive use of simulation and visualisation technology.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2010.4.2-9-1 | Award Amount: 9.40M | Year: 2011

In the development of products for use by humans it is vital to identify compounds with toxic properties at an early stage of their development, to avoid spending time and resources on unsuitable and potentially unsafe candidate products. Human pluripotent stem cell lines offer a unique opportunity to develop a wide variety of human cell-based test systems because they may be expanded indefinitely and triggered to differentiate into any cell type. SCR&Tox aims at making use of these two attributes to provide in vitro assays for predicting toxicity of pharmaceutical compounds and cosmetic ingredients. The consortium has been designed to address all issues related to the biological and technological resources needed to meet that goal. In order to demonstrate the value of pluripotent stem cells for toxicology, the consortium will focus on four complementary aspects: Relevance, i.e. establishing and maintaining discrete cell phenotypes over long-term cultures, and providing large versatility to adapt to assays of specific pathways. Efficiency, for i) automated cell production and differentiation, ii) cell engineering for differentiation and selection, and iii) multi-parametric toxicology using functional genomics, proteomics and bioelectronics. Extension, i.e. i) scalability through production of cells and technologies for industrial-scale assays, and ii) diversity of phenotypes (5 different tissues) and of genotypes (over 30 different donors). Normalization: validation and demonstration of reproducibility and robustness of cell-based assays on industrial-scale platforms, to allow for secondary development in the pharmaceutical and cosmetic industry. SCR&Tox will be intricately associated with other consortia of the Alternative Testing call, sharing biological, technological and methodological resources. Proof of concept of the proposed pluripotent stem cell-based assays for toxicology will be provided on the basis of toxicity pathways and test compounds identified by other consortia.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-11-2015 | Award Amount: 7.42M | Year: 2016

Cancer is the second leading cause of mortality in EU member states, with ~90% of all cancer deaths caused by metastatic spread. Despite its significance, measuring metastatic potential as well as potential indicators of therapy efficacy remain unmet clinical challenges. Recently, it has been demonstrated in vitro that aggressive metastatic cells pull on their surroundings, suggesting that metastatic potential could be gauged by measuring the forces exerted by tumours. Furthermore, many solid tumours show a significantly increased interstitial fluid pressure (IFP), which prevents the efficient uptake of therapeutic agents. As a result, a reduction in IFP is recognized as a hallmark of therapeutic efficacy. Currently, there is no non-invasive modality that can directly image these forces in vivo. Our objective is the non-invasive measurement of both the IFP within tumours and the forces they exert on their surrounding environment. This will be used to predict a tumour's metastatic potential and, importantly, changes in these forces will be used to predict the therapeutic efficacy of drug therapy. To attain this goal, the biomechanical properties of the tumour and its neighbouring tissue will be measured via MR elastography at various measured deformation states. The resulting images will be used to reconstruct images of the internal and external forces acting on the tumour. We call this novel imaging modality Magnetic Resonance Force (MRF) imaging. We will calibrate MRF via cell cultures and pre-clinical models, and then test the method in breast, liver and brain cancer patients. Thereby, we will investigate whether MRF data can predict metastatic spread and measure IFP in patients. We will also investigate the potential to non-invasively modulate the force environment of cancer cells via externally applied shear forces, with the aim of impacting cell motility and proliferation. This could provide a novel mechanism for anticancer therapeutic agents via mechanotransduction.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2011-1.1.20. | Award Amount: 12.58M | Year: 2012

The Project promotes access to five European Research Infrastructures, and it is structured into nine Networking Activities, plus the Management of the Consortium, and fourteen Joint Research Activities. The Project will profit from the success of the previous HadronPhysics project in FP6 and the current HadronPhysics2 in FP7, and originates from the initiative of more than 2,500 European scientists working in the field of hadron physics. Hadron physics deals with the study of strongly interacting particles, the hadrons. Hadrons are composed of quarks and gluons. Their interaction is described by Quantum Chromodynamics, the theory of the strong force. Hadrons form more complex systems, in particular atomic nuclei. Under extreme conditions of pressure and temperature, hadrons may lose their identity and dissolve into a new state of matter similar to the primordial matter of the early Universe. The Networking Activities are related to the organization of experimental and theoretical collaborative work concerning both ongoing activities at present Research Infrastructures and planned experiments at future facilities. In hadron physics the close interaction between experimentalists and theoreticians is of paramount importance. The Joint Research Activities concentrate on technological innovations for present and future experiments. Applications in materials science, medicine, information technology, etc., represent natural spin-offs. The main objective of this Integrating Activity is to optimize the use and development of the Research Infrastructures existing in Europe working in the field of hadron physics. The Project also aims at structuring, on a European scale, the way Research Infrastructures operate, and at fostering their joint development in terms of capacity and performance. A bottom-up approach is used, to respond to the needs of the scientific community in all fields of science and technology.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2011.2.1.1-1 | Award Amount: 39.64M | Year: 2011

In response to the call for a high impact initiative on the human epigenome, the BLUEPRINT Consortium has been formed with the aim of generating at least 100 reference epigenomes and studying them to advance and exploit knowledge of the underlying biological processes and mechanisms in health and disease. BLUEPRINT will focus on distinct types of haematopoietic cells from healthy individuals and on their malignant leukaemic counterparts. Reference epigenomes will be generated by state-of-the-art technologies from highly purified cells for a comprehensive set of epigenetic marks, in accordance with quality standards set by IHEC. This resource-generating activity will be conducted at dedicated centres and complemented by confederated hypothesis-driven research into blood-based diseases, including common leukaemias and autoimmune disease (T1D), by epigenetic target and compound identification, and by discovery and validation of epigenetic markers for diagnostic use. By focussing on 100 samples of known genetic variation, BLUEPRINT will complete an epigenome-wide association study, maximizing the biomedical relevance of the reference epigenomes. Key to the success of BLUEPRINT will be the integration with other data sources (i.e. ICGC, 1000 Genomes and ENCODE), comprehensive bioinformatic analysis, and user-friendly dissemination to the wider scientific community. The involvement of innovative companies will energize epigenomic research in the private sector by creating new targets for compounds and the development of smart technologies for better diagnostic tests. BLUEPRINT will outreach through a network of associated members and form critical alliances with leading networks in genomics and epigenomics within Europe and worldwide. Through its interdisciplinarity and scientific excellence, combined with its strong commitment to networking, training and communication, BLUEPRINT strives to become the cornerstone of the EU contribution to IHEC.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-3.1-4 | Award Amount: 3.83M | Year: 2009

The main objective of the proposed project is to develop and to validate a system for measurement and feedback of outcome quality and support of decision making. The project will be executed in the area of postoperative pain management, which serves as an example for other fields of medicine with a high variation of care. The project will provide the medical community with a unique, user-friendly system to improve the treatment of patients with postoperative pain. We propose to develop and implement a web-based information system featuring three functions: a feedback and benchmarking system, which provides participating sites with continuously updated data and analyses about the quality of care they provide compared to other institutions and allows identification of best clinical practice; a Clinical Decision Support System for Post-Operative Pain, which responds to queries made by physicians for advice regarding the treatment of individual patients; and a Knowledge Library, which provides clinicians with easily accessible summaries of evidence-based recommendations tailored to specific post-operative situations. The first two functions will draw their information from a large database or registry. The registry will receive data about post-operative patients from x participating clinical sites across Europe. The third function, the Knowledge Library, will draw its information from published, peer-reviewed studies, and will be updated periodically. To increase the benefit of the system to end-users, the registry will be complemented with patient data on side effects and treatment costs. All of these will be integrated into the feedback system. The proposed project is the first comprehensive, concerted European effort in the field of improving clinical decision making. It integrates experience gained from national initiatives and the expertise of world-leading, European-based groups dealing with benchmarking, health outcomes and health care utilization research.
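The feedback-and-benchmarking function could, in its simplest form, compare one site's mean outcome score against the pooled scores of all participating sites. The sketch below uses invented site names and pain scores purely for illustration; the abstract does not specify the actual registry logic:

```python
# Hypothetical sketch of a site-vs-benchmark comparison for a pain registry.
# Site names and scores (0-10 numeric pain ratings) are invented.
from statistics import mean

scores = {
    "site_a": [3, 4, 2, 5],
    "site_b": [6, 7, 5, 6],
    "site_c": [4, 4, 3, 5],
}

def benchmark(site):
    """Return (site mean, all-sites pooled mean) for the given site."""
    all_scores = [s for site_scores in scores.values() for s in site_scores]
    return mean(scores[site]), mean(all_scores)

site_mean, overall_mean = benchmark("site_b")
print(site_mean > overall_mean)  # -> True: site_b reports worse pain control
```

A production system would of course risk-adjust for case mix before comparing sites, which is precisely why the registry also collects data on side effects and treatment costs.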


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: NMP.2012.1.3-3 | Award Amount: 49.52M | Year: 2013

The innovative and economic potential of Manufactured Nano Materials (MNMs) is threatened by a limited understanding of the related EHS issues. While toxicity data is continuously becoming available, its relevance to regulators is often unclear or unproven. The shrinking time to market of new MNMs drives the need for urgent action by regulators. NANoREG is the first FP7 project to deliver the answers needed by regulators and legislators on EHS by linking them to a scientific evaluation of data and test methods. Based on questions and requirements supplied by regulators and legislators, NANoREG will: (i) provide answers and solutions from existing data, complemented with new knowledge; (ii) provide a toolbox of relevant instruments for risk assessment, characterisation, toxicity testing and exposure measurements of MNMs; (iii) develop, for the long term, new testing strategies adapted to innovation requirements; (iv) establish a close collaboration among authorities, industry and science, leading to efficient and practically applicable risk management approaches for MNMs and products containing MNMs. The interdisciplinary approach involving the three main stakeholders (Regulation, Industry and Science) will significantly contribute to reducing the risks from MNMs in industrial and consumer products. NANoREG starts by analysing existing knowledge (from WPMN-, FP- and other projects). This is combined with a synthesis of the needs of the authorities and new knowledge covering the identified gaps, used to fill the validated NANoREG toolbox and database, conforming to ECHA's IUCLID DB structure. To answer regulatory questions and needs, NANoREG will set up liaisons with the regulatory and legislative authorities in the NANoREG partner countries, establish and intensify liaisons with selected industries and new enterprises, and develop liaisons with global standardisation and regulation institutions in countries such as the USA, Canada, Australia, Japan and Russia.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2012-ITN | Award Amount: 3.79M | Year: 2012

Renal and liver diseases are global public health problems, with the incidences of end-stage renal disease (ESRD) and end-stage liver disease (ESLD) rising annually. Due to the lack of donor kidneys, most ESRD patients depend on dialysis treatment using either an artificial kidney or the peritoneal membrane. Both modes are inefficient in removing uremic waste molecules and inadequately remove excess body fluids, potassium and phosphate, contributing significantly to severe patient health problems, poor quality of life and high mortality (15-20% per year). The impairment of liver function also has serious implications and is responsible for high rates of patient morbidity and mortality. Presently, liver transplantation remains the treatment of choice for ESLD patients, but it is limited by both the high costs and the severe shortage of donor organs. BIOART ITN will provide state-of-the-art multidisciplinary training for a cohort of 16 young researchers in order to equip them with the skills required to make a significant impact in the treatment of kidney and liver diseases. For this, BIOART ITN will develop:
- A prototype artificial kidney device enabling prolonged / continuous removal of uremic toxins
- A prototype bioartificial kidney device that utilizes human renal epithelial cells for removal of uremic toxins
- Prototype bioreactor devices to ensure the viability and function of hepatocyte cells
In this way, BIOART ITN will provide the European Union with specific multidisciplinary expertise in the area of (bio)artificial organs for the treatment of kidney and liver diseases. This will be achieved through the training of highly educated researchers able to understand and eventually manage all scientific, industrial and clinical aspects of these (bio)artificial organs. Fourteen individual RTD projects will be performed and thirty-five scientific training courses will be offered by the host organizations to the recruited young researchers. During four years, the re


Grant
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2012-IRSES | Award Amount: 428.40K | Year: 2013

BRASINOEU aims to study the translocation and nanosafety issues of engineered metal oxide nanoparticles (NPs). The new scientific and technological developments of nanotechnology require a deeper knowledge of the effects of nanotechnology-based products on human health. This knowledge is fundamental for the development of nanotechnology and for achieving its full acceptance. The concept of safe by design is based on the application of nanosafety principles to the design of nanomaterials in order to prevent or reduce their possible harm to humans and the environment. The project encompasses the synthesis of metal oxide NPs, with a focus on magnetic oxides, their surface modification and post-modification in biological fluids; immunological and genotoxicity studies; and translocation studies both in vitro and in vivo. The project will seek to establish relationships between designed NP properties and their translocation at the cellular and body level, as well as their immuno- and genotoxic response. This is a fundamental issue for the safe design of NPs. Also, the toxicological response will be studied as a function of the uptake dose of NPs. At the cellular level a battery of techniques will be applied for the localization and quantification of NPs: Transmission Electron Microscopy, Raman, Confocal Microscopy, Ion Beam Microscopy, etc. Positron Emission Tomography and Magnetic Resonance Imaging will be employed for biodistribution and quantification studies in animal models. BRASINOEU is formed by an international team with the required and complementary expertise to address the proposed work from an international and multidisciplinary perspective. The project gathers internationally recognized groups in immunology, genotoxicity, nanoparticle synthesis, surface chemistry, biophysics, imaging and materials science. The complementarity of the groups involved in the project will help to develop new highly skilled professionals and horizontal scientific connections.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP.2011.1.2-3 | Award Amount: 4.96M | Year: 2012

The shortage of drinking water in many regions of the planet constitutes a real problem and hazard. The use of seawater, brackish water and wastewater for human consumption is not a new concept. In spite of the success of membrane technology in water reclamation, membrane separation systems suffer from a serious problem: membrane fouling. The main downside is an inevitable increase in operation and maintenance costs as well as an adverse effect on the lifespan of the membrane (harsh cleaning treatments). LbLBRANE is an ambitious project ensuring competent input from the membrane concept through lab-scale production and optimisation to scale-up in pilot plants for end users. LbLBRANE applies novel nanotechnology tools, namely layer-by-layer (LbL) technology, to develop a versatile and generic procedure for the fast fabrication of low-cost, stable, chemical-resistant polyelectrolyte membranes. The LbL technology enables a bottom-up nano-engineered membrane whereby the modification is performed stepwise in a controlled manner: the thickness can be finely tuned by the number of layers deposited; the architecture of the film can be compartmentalised by incorporating functional species (polyelectrolytes as well as nanoparticles with specific functions, such as antibacterial properties); and the morphology of the film can be modulated via the pH, charge density and type of polyelectrolyte pairs to create a pore size (hence permeability) tailored to the specific needs of the membrane. Our focus is on high-performance, regenerable membranes that can be cleaned in situ, and on hybrid membranes combining extremely high flux with high permselectivity and mechanical robustness. The ultimate aim is the implementation of LbL on a large industrial scale, from module design and construction to the end user, especially for water reuse and metal/acid recovery.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.2.4.2-2 | Award Amount: 7.80M | Year: 2013

Cardiogenic shock (CS) complicating acute myocardial infarction (AMI) represents a major European health care concern, with mortality rates between 40 and 70%. Approximately 70-80% of these patients present with multivessel disease, defined as coronary lesions in more than one vessel. The clinician is faced with the decision to either 1) intervene only on the culprit lesion acutely responsible for the initiation of cardiogenic shock, or 2) also treat additional lesions considered hemodynamically significant but not acutely triggering the CS cascade. Current guidelines recommend percutaneous coronary intervention (PCI) of all critical lesions. However, due to a lack of randomized trials, these recommendations are based solely on registry data and pathophysiological considerations. The aim of the randomized CULPRIT-SHOCK trial is therefore to compare a) immediate multivessel PCI versus b) culprit-lesion-only PCI in patients with AMI complicated by CS. A total of 706 CS patients will be randomized in several European countries. The primary endpoint will be 30-day all-cause mortality and/or severe renal failure requiring renal replacement therapy. CULPRIT-SHOCK will therefore determine the optimal percutaneous revascularization strategy in patients with AMI and multivessel disease complicated by CS. In addition, a comprehensive array of efficacy, safety and socio-economic parameters for the chosen population will be assessed. Multiple secondary endpoints and several substudies (microcirculation, biomarkers, angiography) will serve to further elucidate the presumed differential effects of the two treatment arms and the underlying pathophysiology and prognostic markers. From these parameters, a multivariable regression model and a risk score for the prediction of clinical prognosis, as well as a cost-effectiveness model in AMI and CS, will be developed. Furthermore, CULPRIT-SHOCK will obtain data on CS patients not meeting the inclusion criteria by instituting a separate registry.
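The abstract does not specify the form of the planned multivariable regression model, but clinical risk scores of this kind are typically derived from a logistic regression whose linear predictor is mapped to an event probability. A minimal sketch of that construction; the predictors and coefficients below are hypothetical, invented for illustration, and are not trial results:

```python
import math

def logistic_risk(intercept, coefs, x):
    """Predicted event probability from a fitted logistic regression model."""
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients, NOT CULPRIT-SHOCK results.
# Assumed predictors: age in decades over 50, serum lactate (mmol/L), creatinine (mg/dL)
intercept = -2.0
coefs = [0.40, 0.25, 0.30]

patient = [2.0, 4.0, 1.5]  # e.g. a 70-year-old with lactate 4.0 and creatinine 1.5
p = logistic_risk(intercept, coefs, patient)
print(f"predicted 30-day event probability: {p:.1%}")  # prints 56.2% for these made-up inputs
```

In practice the fitted coefficients are often rounded into integer points to yield a bedside risk score; the probability mapping above is what such a score approximates.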


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: BIOTEC-1-2014 | Award Amount: 7.06M | Year: 2015

P4SB is about the utilization of the conceptual and material tools of contemporary Synthetic Biology to bring about the sustainable and environmentally friendly bioconversion of oil-based plastic waste into fully biodegradable counterparts by means of deeply engineered, whole-cell bacterial catalysts. These tools will be used to design tailor-made enzymes for the bio-depolymerization of PET (polyethylene terephthalate) and PU (polyurethane), but also for the custom design of a Pseudomonas putida Cell Factory capable of metabolizing the resulting monomers. Pseudomonas putida will undergo deep metabolic surgery to channel these diverse substrates efficiently into the production of polyhydroxyalkanoates (PHA) and derivatives. In addition, synthetic downstream processing modules based on the programmed non-lytic secretion of PHA will facilitate the release and recovery of the bioplastic from the bacterial biomass. These industry-driven objectives will help to address the market need for novel routes to valorise the gigantic plastic waste streams in the European Union and beyond, with direct opportunities for SME partners of P4SB spanning the entire value chain from plastic waste via Synthetic Biology to biodegradable plastic. As a result, we anticipate a completely biobased process reducing the environmental impact of plastic waste by establishing it as a novel bulk second-generation carbon source for industrial biotechnology, while at the same time opening new opportunities for the European plastic recycling industry and helping to achieve the ambitious recycling targets set by the European Union for 2020.


Grant
Agency: European Commission | Branch: H2020 | Program: CSA | Phase: SPACE | Award Amount: 5.00M | Year: 2014

MACC-III is the last of the pre-operational stages in the development of the Copernicus Atmosphere Service. Its overall institutional objective is to function as the bridge between the developmental precursor projects (GEMS, PROMOTE, MACC and MACC-II) and the Atmosphere Service envisaged to form part of Copernicus Operations. MACC-III will provide continuity of the atmospheric services provided by MACC-II. Its continued provision of coherent atmospheric data and information, either directly or via value-adding downstream services, is for the benefit of European citizens and helps meet global needs as a key European contribution to the Global Climate Observing System (GCOS) and the encompassing Global Earth Observation System of Systems (GEOSS). Its services cover in particular air quality, climate forcing, stratospheric ozone, UV radiation and solar-energy resources. MACC-III's services are freely and openly available to users throughout Europe and worldwide. MACC-III and its downstream service sector will enable European citizens at home and abroad to benefit from improved warning, advisory and general information services, and from improved formulation and implementation of regulatory policy. MACC-III, together with its scientific-user sector, also helps to improve the provision of science-based information for policy-makers and for decision-making at all levels. The most significant economic benefit by far identified in the ESA-sponsored Socio-Economic Benefits Analysis of Copernicus report, published in July 2006, was the long-term benefit from international policy on climate change. Long-term benefit from air quality information ranked second among all Copernicus benefits in terms of present value. Immediate benefits can be achieved through efficiency gains in relation to current policies. The estimated benefits substantially outweigh the costs of developing and operating the proposed services.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: ENV.2010.2.1.4-1 | Award Amount: 9.23M | Year: 2010

FunDivEUROPE (FUNctional significance of forest bioDIVersity in EUROPE) proposes to quantify the effects of forest biodiversity on ecosystem function and services in the major European forest types of the main bioclimatic regions of Europe. FunDivEUROPE will be based on four scientific platforms and seven cross-cutting Work Packages. The project will combine a global network of tree diversity experiments (Experimental Platform) with a newly designed network of observational plots in six focal regions within Europe (Exploratory Platform). Additionally, the project will integrate an in-depth analysis of inventory-based datasets from existing forest monitoring networks to extend the scope to larger spatial and temporal scales (Inventory Platform). FunDivEUROPE will thus combine the strengths of various scientific approaches to explore and quantify the significance of forest biodiversity for a wide range of ecosystem processes and services. Using modelling and state-of-the-art techniques for quantitative synthesis, the project will integrate information gained from the different platforms to assess the performance of pure and mixed-species stands under a changing climate. In addition to the three research platforms, FunDivEUROPE will set up a Knowledge Transfer Platform in order to foster communication, aggregation and synthesis of individual findings in the Work Packages, and communication with stakeholders, policy makers and the wider public. The information gained should thus enable forest owners, forest managers and forest policy makers to adapt policies and management for the sustainable use of forest ecosystems in a changing environment, capitalizing on the potential effects of biodiversity on ecosystem functioning. The experience gained within FunDivEUROPE will ultimately contribute to the development of the European Long-Term Ecosystem Research Network, complementing existing forest observation and monitoring networks.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2013.2.4.2-2 | Award Amount: 7.70M | Year: 2014

Coronary artery disease (CAD) is the leading cause of death in high-income countries. Invasive coronary angiography (ICA) is the reference standard for the diagnosis of CAD and allows immediate therapy. However, only 40% of patients undergoing ICA actually have obstructive CAD, and ICA carries relatively rare but considerable risks. Coronary computed tomography (CT) is the most accurate diagnostic test for CAD currently available. CT may become the most effective strategy to reduce the ca. 2 million annual negative ICAs in Europe by enabling early and safe discharge of the majority of patients with an intermediate risk of CAD. To evaluate this, we propose the DISCHARGE project, which will be implemented by a multinational European consortium. The core of the project is the DISCHARGE pragmatic randomised controlled trial. The primary hypothesis is that CT is superior to ICA with respect to major adverse cardiovascular events (cardiovascular death, nonfatal myocardial infarction and stroke) after a maximum follow-up of 4 years in a selected broad population of stable chest pain patients with an intermediate pretest likelihood of CAD. The trial will include 23 clinical sites from 18 European countries, ensuring broad geographical representation. Complementing work packages cover comparative effectiveness research, including gender-related analysis, a systematic review of the evidence, cost-effectiveness analysis, and health-related quality of life. DISCHARGE has the capability to influence current standards and guidelines as well as coverage decisions, and will raise awareness among patients, health care providers and decision-makers in Europe about the effectiveness and cost-effectiveness of coronary CT angiography.
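The rationale for targeting patients with an intermediate pretest likelihood follows from Bayes' rule: a negative result from a sensitive test drives the post-test probability of disease low enough for safe discharge only when the pretest likelihood is moderate. A minimal sketch of that calculation; the sensitivity and specificity figures are hypothetical placeholders, not DISCHARGE data:

```python
def post_test_prob_negative(pretest, sensitivity, specificity):
    """Probability of disease remaining after a NEGATIVE test result (Bayes' rule)."""
    p_neg_given_disease = 1.0 - sensitivity   # false-negative rate
    p_neg_given_healthy = specificity         # true-negative rate
    p_neg = pretest * p_neg_given_disease + (1.0 - pretest) * p_neg_given_healthy
    return pretest * p_neg_given_disease / p_neg

# Hypothetical figures: 30% pretest likelihood, CT sensitivity 95%, specificity 80%
residual = post_test_prob_negative(0.30, 0.95, 0.80)
print(f"post-test probability after negative CT: {residual:.1%}")  # prints 2.6%
```

Under these assumed figures a negative CT reduces a 30% pretest likelihood to under 3%, which is the kind of margin that would support early discharge without ICA.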


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-3.3-4 | Award Amount: 3.84M | Year: 2008

Suicide is a serious public health problem in the EU, calling for effective interventions. The aim of this project is to provide EU member states with an evidence-based prevention concept, concrete materials and instruments for running and evaluating these interventions, and recommendations for the proper implementation of the intervention. These aims will be achieved through the following objectives:
> Analysis of differences in suicide rates among European countries and harmonisation of procedures for the definition, assessment and evaluation of suicidality
> Development of a state-of-the-art intervention concept for the prevention of suicidality that considers current evidence-based best practices and international experience with multilevel interventions, such as that of the European Alliance Against Depression
> Implementation of comparable multilevel community-based prevention interventions in four European model regions
> Evaluation of the interventions in a pre-post, controlled and cross-nationally comparable design concerning effectiveness with respect to both suicides and non-fatal suicidal acts, efficiency (including health economic evaluations), the processes involved, and finally the interplay between the single intervention measures
> Distribution of an optimised suicide-preventive intervention concept, corresponding materials and instruments, and recommendations for implementation to policy makers and stakeholders


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: SSH.2013.5.2-1 | Award Amount: 6.39M | Year: 2014

Using an innovative interdisciplinary approach, MIME will generate an organised body of policy-relevant propositions addressing the full range of questions raised in the call. Our aim is to identify the language policies and strategies that best combine mobility and inclusion. MIME emphasises complementarity between disciplines, and brings together researchers from sociolinguistics, political science, sociology, history, geography, economics, education, translation studies, psychology, and law, who all have longstanding experience in the application of their discipline to language issues. The diverse concepts and methods are combined in an analytical framework designed to ensure their practice-oriented integration. MIME identifies, assesses and recommends measures for the management of trade-offs between the potentially conflicting goals of mobility and inclusion in a multilingual Europe. Rather than taking existing trade-offs as a given, we think that they can be modified, both in symbolic and in material/financial terms, and we argue that this objective can best be achieved through carefully designed public policies and the intelligent use of dynamics in civil society. Several partners have been involved in successful FP6 research, and key advances achieved there will guide the MIME project: languages are viewed as fluid realities in a context of high mobility of people, goods, services, and knowledge, influencing the way in which skills and identities are used and constantly re-shaped. The project integrates these micro-level insights into a macro-level approach to multilingual Europe. MIME results will be made widely available through a creative approach to dissemination, including training modules and the MIME Stakeholder Forum, allowing for sustained dialogue between academics, professional associations and local/regional authorities. The project culminates in a consensus conference where recommendations based on the project findings are adopted.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENV.2013.6.1-2 | Award Amount: 11.32M | Year: 2013

StratoClim will produce more reliable projections of climate change and stratospheric ozone by a better understanding and improved representation of key processes in the Upper Troposphere and Stratosphere (UTS). This will be achieved by an integrated approach bridging observations from dedicated field activities, process modelling on all scales, and global modelling with a suite of chemistry climate models (CCMs) and Earth system models (ESMs). At present, complex interactions and feedbacks are inadequately represented in global models with respect to natural and anthropogenic emissions of greenhouse gases, aerosol precursors and other important trace gases, the atmospheric dynamics affecting transport into and through the UTS, and chemical and microphysical processes governing the chemistry and the radiative properties of the UTS. StratoClim will (a) improve the understanding of the microphysical, chemical and dynamical processes that determine the composition of the UTS, such as the formation, loss and redistribution of aerosol, ozone and water vapour, and how these processes will be affected by climate change; (b) implement these processes and fully include the interactive feedback from UTS ozone and aerosol on surface climate in CCMs and ESMs. Through StratoClim new measurements will be obtained in key regions: (1) in a tropical campaign with a high altitude research aircraft carrying an innovative and comprehensive payload, (2) by a new tropical station for unprecedented ground and sonde measurements, and (3) through newly developed satellite data products. The improved climate models will be used to make more robust and accurate predictions of surface climate and stratospheric ozone, both with a view to the protection of life on Earth. Socioeconomic implications will be assessed and policy relevant information will be communicated to policy makers and the public through a dedicated office for communication, stakeholder contact and international co-operation.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: AAT-2007-3.4-02;AAT-2007-4.2-06 | Award Amount: 5.69M | Year: 2008

The safe use of complex engineering structures such as aircraft can only be guaranteed when efficient means of damage assessment are in place. Whereas the design of civil structures is nowadays based on a damage-tolerance approach and time-based inspection cycles, it is envisaged that the large cost associated with this approach can be drastically reduced by switching to a condition-based maintenance schedule. Structural health monitoring is a technology in which integrated sensors are used to enable continuous monitoring of structural integrity. In recent years there has been increasing interest in structural health monitoring systems for aircraft. Besides the expected enhancement of safety and maintenance performance, economic aspects also play an important role. This concerns, on the one hand, the reduction of unnecessary inspection costs and, on the other hand, the possible weight reduction of aircraft parts at the design phase of an aircraft. This project continues the project Aircraft Integrated Structural Health Assessment (AISHA; EU-FP6, priority 4, STREP project nr. 502907), which was dedicated to establishing the basic elements of a health monitoring system based on ultrasonic Lamb waves. Lamb waves are guided waves propagating in plate-like structures. Experiments at lab scale and on selected full-scale parts showed the ability of Lamb waves to give indications of correlations between acoustic parameters and damage in structural parts. The consortium is aware of the fact that a 42-month project is not sufficient to make the final step towards a ready-to-use system. Thus, based on the rich experience obtained in the running AISHA project, which will be finished in June 2007, we propose a new project with an adapted work plan and additional new partners.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-05-2014 | Award Amount: 6.46M | Year: 2015

Breast cancer affects more than 360,000 women per year in the EU and causes more than 90,000 deaths. Identification of women at high risk of the disease can lead to disease prevention through intensive screening, chemoprevention or prophylactic surgery. Breast cancer risk is determined by a combination of genetic and lifestyle risk factors. The advent of next-generation sequencing has opened up the opportunity for testing many disease genes, and diagnostic gene panel testing is being introduced in many EU countries. However, the cancer risks associated with most variants in most genes are unknown. This leads to a major problem in the appropriate counselling and management of women undergoing panel testing. In this project, we aim to build a knowledge base that will allow the identification of women at high risk of breast cancer, in particular through a comprehensive evaluation of DNA variants in known and suspected breast cancer genes. We will exploit the huge resources established through the Breast Cancer Association Consortium (BCAC) and ENIGMA (Evidence-based Network for the Interpretation of Germline Mutant Alleles). We will expand the existing datasets by sequencing all known breast cancer susceptibility genes in 20,000 breast cancer cases and 20,000 controls from population-based studies, and in 10,000 cases from multiple-case families. Sequence data will be integrated with in-silico and functional data, and with data on other known risk factors, to generate a comprehensive risk model that can provide personalised risk estimates. We will develop online tools to aid the interpretation of gene variants and provide risk estimates in a user-friendly format, to help genetic counsellors and patients worldwide make informed clinical decisions. We will evaluate the acceptability and utility of comprehensive gene panel testing in the clinical genetics context.
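Personalised risk estimates of the kind described are commonly built by scaling a baseline (population-average) absolute risk by the relative risks attached to an individual's variants and lifestyle factors. A toy sketch of that idea, assuming factors act multiplicatively and independently; all risk figures are invented for illustration and are not outputs of the project's model:

```python
def personalised_risk(baseline_risk, relative_risks):
    """Scale a baseline absolute risk by independent relative-risk factors,
    capping the result at 1.0 (a probability cannot exceed certainty)."""
    risk = baseline_risk
    for rr in relative_risks:
        risk *= rr
    return min(risk, 1.0)

# Hypothetical example: 12% baseline lifetime risk, a moderate-penetrance
# variant (RR 2.0) and an adverse lifestyle factor (RR 1.3)
risk = personalised_risk(0.12, [2.0, 1.3])
print(f"estimated lifetime risk: {risk:.1%}")  # prints 31.2%
```

Real models of this kind additionally recalibrate so that the population-average risk is preserved, and handle correlated factors jointly rather than as independent multipliers.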


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH.2011.2.4.2-1 | Award Amount: 14.96M | Year: 2012

The consortium, led by UKER and EuroHYP, the European Stroke Research Network for Hypothermia, proposes a large, multicentre clinical trial which will assess mild hypothermia as a novel treatment for ischemic stroke. Stroke is the second leading cause of death worldwide and the second leading cause of lost disability-adjusted life years in high-income countries. Stroke incidence rises exponentially with age, so its social and economic burden will grow with the ageing of the European population. Current treatment options for the 80 to 85% of all strokes due to cerebral ischaemia - around 900,000 events in Europe every year, or one every 40 seconds - are extremely limited. Systematic review of experimental studies suggests that hypothermia is the most promising intervention identified to date. Therapeutic cooling is effective in reducing ischaemic brain injury following cardiac arrest, and hypothermia is therefore considered by experts to be the most promising treatment for patients with acute ischaemic stroke, next to reperfusion strategies. The EuroHYP-1 trial is a pan-European, open, randomised, phase III clinical trial which will assess the benefit or harm of therapeutic cooling in 1500 awake adult patients with acute ischaemic stroke. In addition to efficacy and safety, the economic impact of therapeutic hypothermia will be assessed, along with several sub-studies involving imaging, ultrasound and biomarker methods. The investigators involved in the EuroHYP-1 consortium are leading European experts in statistical design and analysis, therapeutic hypothermia, imaging, health economics, ultrasound, biomarkers, and trial execution (implementation and monitoring). In addition to these academic experts, the consortium also involves European patient and family advocacy groups and small and medium-sized enterprises, and the joint endeavours of this extended team will ensure the successful enrolment of patients at eighty hospitals across 25 countries in Europe.


Chronic aortic aneurysms are permanent and localized dilations of the aorta that remain asymptomatic for long periods of time but continue to increase in diameter before they eventually rupture. Left untreated, the patient's prognosis is dismal, since the internal bleeding of the rupture brings about sudden death. Although successful treatment cures the disease, the risky procedures can result in paraplegia from spinal cord ischaemia or even death, particularly for aneurysms extending from the thoracic to the abdominal aorta and thus involving many segmental arteries to the spinal cord, i.e. thoracoabdominal aortic aneurysms of Crawford type II. Although various strategies have achieved a remarkable decrease in the incidence of paraplegia, it is still no less than 10 to 20%. However, it has been found that the deliberate occlusion of the segmental arteries to the paraspinous collateral network that ultimately supplies the spinal cord does not increase the rate of permanent paraplegia. A therapeutic option, minimally invasive segmental artery coil embolization, has been devised, which proceeds in a staged way to occlude groups of arteries under highly controlled conditions, after which time must be allowed for arteriogenesis to build a robust collateral blood supply. PAPA-ARTiS is a phase II trial to demonstrate that a staged treatment approach can reduce paraplegia and mortality dramatically. It can be expected to have a dramatic impact on the individual patient's quality of life if saved from a wheelchair, and also upon financial systems through savings in: 1) lower costs in EU health care; 2) lower pay-outs in disability insurance (est. at 500k in Year 1); and 3) reduced loss of economic output from unemployment. Approx. 2500 patients a year in Europe undergo these high-risk operations with a cumulative paraplegia rate of over 15%; therefore >100M per year in costs can be avoided, and significantly more considering the expected elimination of type II endoleaks.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: KBBE-2007-1-1-04 | Award Amount: 8.18M | Year: 2009

QUANTOMICS will deliver a step change in the availability of cutting-edge technologies and tools for the economic exploitation of livestock genomes. We will provide the tools to rapidly identify the causative DNA variation underpinning sustainability in livestock, and for industry to exploit high-density genomic information. Our adaptable quantitative and genomic tools, each based on cutting-edge technologies and valuable in itself, will together form a powerful integrated pipeline with wide application. To deliver these outcomes we will:
(i) use comparative genomics to annotate putatively functional features of the genomes of the EU's key farmed livestock species;
(ii) enhance existing molecular genetic tools (to include copy number variation, CNV);
(iii) deliver computationally optimised tools for genome-wide selection (GWS), including CNV;
(iv) apply these tools to important health and welfare traits in commercial populations of dairy cattle and broiler chickens, determining the benefits and constraints;
(v) use hyper-parallel resequencing of DNA within identified genomic features underlying loci of large effect in significant numbers of animals to catalogue variation;
(vi) develop new visualisation tools to make this variation publicly available via the Ensembl genome browser;
(vii) develop tools to prioritise the likely functionality of identified polymorphisms;
(viii) validate the utility of the putative causative haplotypes within commercial populations;
(ix) test the potential advances from combined GWS and gene-assisted selection in breeding programmes;
(x) explore new methods to manage molecular biodiversity;
(xi) assess the implications of these new tools for breeding programme design; and
(xii) disseminate the results of the project, achieving major competitive, animal health and welfare impacts across the EU livestock industry and ultimately for consumers.
QUANTOMICS will have wide application in all farmed species and leave a legacy of resources for future research.


Patients with cardiovascular risk factors, e.g. hypertension and obesity, are at risk of developing heart failure with preserved ejection fraction (HFpEF), a highly prevalent disease in the elderly, predominantly female, population. There is currently no specific, defined treatment for HFpEF beyond control of risk factors. Activation of cardiac and vascular Beta3-adrenergic receptors (B3AR) represents a new concept and a novel target for structural cardiac disease. B3AR expression and coupling have been demonstrated in human myocardium and vasculature. In pre-clinical models expressing the human receptor, its activation attenuates myocardial remodelling, i.e. decreases hypertrophy and fibrosis in response to neurohormonal or hemodynamic stress. Mirabegron is a new B3AR agonist available for human use that was recently introduced for a non-cardiovascular indication (overactive bladder disease). The primary objective of the project is to design and implement a multicentre, prospective, randomized, placebo-controlled clinical trial testing the additional beneficial effect of mirabegron versus placebo over 12 months, on top of standard treatment, in patients carrying structural cardiac disease without overt heart failure (stage B of the AHA classification); the co-primary end-points will be the quantitative change in myocardial hypertrophy, measured by cardiac MRI, and in diastolic ventricular function, measured by Doppler echocardiography (E/E′); in addition, exercise tolerance (peak VO2) will be measured, as well as circulating biomarkers reflecting both myocardial remodelling and function. In addition, we will test the effect of mirabegron on beige/brown fat activation and metabolism. Our proposal therefore combines a major conceptual advance with the repurposing of an original drug to validate pre-clinical discoveries in the context of a major health problem.


News Article | November 23, 2016
Site: www.newscientist.com

Try, for a moment, to envisage a world without countries. Imagine a map not divided into neat, coloured patches, each with clear borders, governments, laws. Try to describe anything our society does – trade, travel, science, sport, maintaining peace and security – without mentioning countries. Try to describe yourself: you have a right to at least one nationality, and the right to change it, but not the right to have none. Those coloured patches on the map may be democracies, dictatorships or too chaotic to be either, but virtually all claim to be one thing: a nation state, the sovereign territory of a “people” or nation who are entitled to self-determination within a self-governing state. So says the United Nations, which now numbers 193 of them. And more and more peoples want their own state, from Scots voting for independence to jihadis declaring a new state in the Middle East. Many of the big news stories of the day, from conflicts in Gaza and Ukraine to rows over immigration and membership of the European Union, are linked to nation states in some way. Even as our economies globalise, nation states remain the planet’s premier political institution. Large votes for nationalist parties in this year’s EU elections prove nationalism remains alive – even as the EU tries to transcend it. Yet there is a growing feeling among economists, political scientists and even national governments that the nation state is not necessarily the best scale on which to run our affairs. We must manage vital matters like food supply and climate on a global scale, yet national agendas repeatedly trump the global good. At a smaller scale, city and regional administrations often seem to serve people better than national governments. How, then, should we organise ourselves? Is the nation state a natural, inevitable institution? Or is it a dangerous anachronism in a globalised world? These are not normally scientific questions – but that is changing. 
Complexity theorists, social scientists and historians are addressing them using new techniques, and the answers are not always what you might expect. Far from timeless, the nation state is a recent phenomenon. And as complexity keeps rising, it is already mutating into novel political structures. Get set for neo-medievalism. Before the late 18th century there were no real nation states, says John Breuilly of the London School of Economics. If you travelled across Europe, no one asked for your passport at borders; neither passports nor borders as we know them existed. People had ethnic and cultural identities, but these didn’t really define the political entity they lived in. That goes back to the anthropology, and psychology, of humanity’s earliest politics. We started as wandering, extended families, then formed larger bands of hunter-gatherers, and then, around 10,000 years ago, settled in farming villages. Such alliances had adaptive advantages, as people cooperated to feed and defend themselves. But they also had limits. Robin Dunbar of the University of Oxford has shown that one individual can keep track of social interactions linking no more than around 150 people. Evidence for that includes studies of villages and army units through history, and the average tally of Facebook friends. But there was one important reason to have more friends than that: war. “In small-scale societies, between 10 and 60 per cent of male deaths are attributable to warfare,” says Peter Turchin of the University of Connecticut at Storrs. More allies meant a higher chance of survival. Turchin has found that ancient Eurasian empires grew largest where fighting was fiercest, suggesting war was a major factor in political enlargement. Archaeologist Ian Morris of Stanford University in California reasons that as populations grew, people could no longer find empty lands where they could escape foes. 
The losers of battles were simply absorbed into the enemy’s domain – so domains grew bigger. How did they get past Dunbar’s number? Humanity’s universal answer was the invention of hierarchy. Several villages allied themselves under a chief; several chiefdoms banded together under a higher chief. To grow, these alliances added more villages, and if necessary more layers of hierarchy. Hierarchies meant leaders could coordinate large groups without anyone having to keep personal track of more than 150 people. In addition to their immediate circle, an individual interacted with one person from a higher level in the hierarchy, and typically eight people from lower levels, says Turchin. These alliances continued to enlarge and increase in complexity in order to perform more kinds of collective actions, says Yaneer Bar-Yam of the New England Complex Systems Institute in Cambridge, Massachusetts. For a society to survive, its collective behaviour must be as complex as the challenges it faces – including competition from neighbours. If one group adopted a hierarchical society, its competitors also had to. Hierarchies spread and social complexity grew. Larger hierarchies not only won more wars but also fed more people through economies of scale, which enabled technical and social innovations such as irrigation, food storage, record-keeping and a unifying religion. Cities, kingdoms and empires followed. But these were not nation states. A conquered city or region could be subsumed into an empire regardless of its inhabitants’ “national” identity. “The view of the state as a necessary framework for politics, as old as civilisation itself, does not stand up to scrutiny,” says historian Andreas Osiander of the University of Leipzig in Germany. One key point is that agrarian societies required little actual governing. 
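Turchin's figures above imply simple arithmetic: if leaf groups are Dunbar-sized bands of roughly 150 people and each leader directly manages about eight subordinate units, the population a hierarchy can coordinate grows geometrically with its depth. A minimal sketch (the branching factor and group size are taken from the figures quoted above; everything else is illustrative):

```python
# Sketch of the hierarchy arithmetic: leaf groups are Dunbar-sized
# bands (~150 people); each leader above them directly manages about
# 8 subordinate units (Turchin's figure), so the coordinated
# population grows geometrically with hierarchy depth.
DUNBAR = 150     # rough ceiling on direct social ties per person
BRANCHING = 8    # subordinate units per leader (figure quoted above)

def coordinated_population(levels: int) -> int:
    """People coordinated by a hierarchy with `levels` layers above
    the Dunbar-sized leaf groups."""
    return (BRANCHING ** levels) * DUNBAR

for levels in range(5):
    print(levels, coordinated_population(levels))
# Four layers of hierarchy already coordinate 614,400 people, yet no
# individual tracks more than ~150 contacts plus a handful of
# superiors and subordinates.
```

Three or four layers are enough to reach city- and kingdom-scale populations, which is why hierarchy appears so universally in the archaeological record.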
Nine people in 10 were peasants who had to farm or starve, so were largely self-organising. Government intervened to take its cut, enforce basic criminal law and keep the peace within its undisputed territories. Otherwise its main role was to fight to keep those territories, or acquire more. Even quite late on, rulers spent little time governing, says Osiander. In the 17th century Louis XIV of France had half a million troops fighting foreign wars but only 2000 keeping order at home. In the 18th century, the Dutch and Swiss needed no central government at all. Many eastern European immigrants arriving in the US in the 19th century could say what village they came from, but not what country: it didn’t matter to them. Before the modern era, says Breuilly, people defined themselves “vertically” by who their rulers were. There was little horizontal interaction between peasants beyond local markets. Whoever else the king ruled over, and whether those people were anything like oneself, was largely irrelevant. Such systems are very different from today’s states, which have well-defined boundaries filled with citizens. In a system of vertical loyalties, says Breuilly, power peaks where the overlord lives and peters out in frontier territories that shade into neighbouring regions. Ancient empires are coloured on modern maps as if they had firm borders, but they didn’t. Moreover, people and territories often came under different jurisdictions for different purposes. Such loose control, says Bar-Yam, meant pre-modern political units were only capable of scaling up a few simple actions such as growing food, fighting battles, collecting tribute and keeping order. Some, like the Roman Empire, did this on a very large scale. But complexity – the different actions society could collectively perform – was relatively low. Complexity was limited by the energy a society could harness. For most of history that essentially meant human and animal labour. 
In the late Middle Ages, Europe harnessed more, especially water power. This boosted social complexity – trade increased, for example – requiring more government. A decentralised feudal system gave way to centralised monarchies with more power. But these were still not nation states. Monarchies were defined by who ruled them, and rulers were defined by mutual recognition – or its converse, near-constant warfare. In Europe, however, as trade grew, monarchs discovered they could get more power from wealth than war. In 1648, Europe’s Peace of Westphalia ended centuries of war by declaring existing kingdoms, empires and other polities “sovereign”: none was to interfere in the internal affairs of others. This was a step towards modern states – but these sovereign entities were still not defined by their peoples’ national identities. International law is said to date from the Westphalia treaty, yet the word “international” was not coined until 132 years later. By then Europe had hit the tipping point of the industrial revolution. Harnessing vastly more energy from coal meant that complex behaviours performed by individuals, such as weaving, could be amplified, says Bar-Yam, producing much more complex collective behaviours. This demanded a different kind of government. In 1776 and 1789, revolutions in the US and France created the first nation states, defined by the national identity of their citizens rather than the bloodlines of their rulers. According to one landmark history of the period, says Breuilly, “in 1800 almost nobody in France thought of themselves as French. By 1900 they all did.” For various reasons, people in England had an earlier sense of “Englishness”, he says, but it was not expressed as a nationalist ideology. By 1918, with the dismemberment of Europe’s last multinational empires such as the Habsburgs in the first world war, European state boundaries had been redrawn largely along cultural and linguistic lines. 
In Europe at least, the nation state was the new norm. Part of the reason was a pragmatic adaptation of the scale of political control required to run an industrial economy. Unlike farming, industry needs steel, coal and other resources which are not uniformly distributed, so many micro-states were no longer viable. Meanwhile, empires became unwieldy as they industrialised and needed more actual governing. So in 19th-century Europe, micro-states fused and empires split. These new nation states were justified not merely as economically efficient, but as the fulfilment of their inhabitants’ national destiny. A succession of historians has nonetheless concluded that it was the states that defined their respective nations, and not the other way around. France, for example, was not the natural expression of a pre-existing French nation. At the revolution in 1789, half its residents did not speak French. In 1860, when Italy unified, only 2.5 per cent of residents regularly spoke standard Italian. Its leaders spoke French to each other. One famously said that, having created Italy, they now had to create Italians – a process many feel is still taking place. Sociologist Siniša Maleševic of University College Dublin in Ireland believes that this “nation building” was a key step in the evolution of modern nation states. It required the creation of an ideology of nationalism that emotionally equated the nation with people’s Dunbar circle of family and friends. That in turn relied heavily on mass communication technologies. In an influential analysis, Benedict Anderson of Cornell University in New York described nations as “imagined” communities: they far outnumber our immediate circle and we will never meet them all, yet people will die for their nation as they would for their family. 
Such nationalist feelings, he argued, arose after mass-market books standardised vernaculars and created linguistic communities. Newspapers allowed people to learn about events of common concern, creating a large “horizontal” community that was previously impossible. National identity was also deliberately fostered by state-funded mass education. The key factor driving this ideological process, Maleševic says, was an underlying structural one: the development of far-reaching bureaucracies needed to run complex industrialised societies. For example, says Breuilly, in the 1880s Prussia became the first government to pay unemployment benefits. At first they were paid only in a worker’s native village, where identification was not a problem. As people migrated for work, benefits were made available anywhere in Prussia. “It wasn’t until then that they had to establish who a Prussian was,” he says, and they needed bureaucracy to do it. Citizenship papers, censuses and policed borders followed. That meant hierarchical control structures ballooned, with more layers of middle management. Such bureaucracy was what really brought people together in nation-sized units, argues Maleševic. But not by design: it emerged out of the behaviour of complex hierarchical systems. As people do more kinds of activities, says Bar-Yam, the control structure of their society inevitably becomes denser. In the emerging nation state, that translates into more bureaucrats per head of population. Being tied into such close bureaucratic control also encouraged people to feel personal ties with the state, especially as ties to church and village declined. As governments exerted greater control, people got more rights, such as voting, in return. For the first time, people felt the state was theirs. Once Europe had established the nation state model and prospered, says Breuilly, everyone wanted to follow suit. In fact it’s hard now to imagine that there could be another way. 
But is a structure that grew spontaneously out of the complexity of the industrial revolution really the best way to manage our affairs? According to Brian Slattery of York University in Toronto, Canada, nation states still thrive on a widely held belief that “the world is naturally made of distinct, homogeneous national or tribal groups which occupy separate portions of the globe, and claim most people’s primary allegiance”. But anthropological research does not bear that out, he says. Even in tribal societies, ethnic and cultural pluralism has always been widespread. Multilingualism is common, cultures shade into each other, and language and cultural groups are not congruent. Moreover, people always have a sense of belonging to numerous different groups based on region, culture, background and more. “The claim that a person’s identity and well-being is tied in a central way to the well-being of the national group is wrong as a simple matter of historical fact,” says Slattery. Perhaps it is no wonder, then, that the nation-state model fails so often: since 1960 there have been more than 180 civil wars worldwide. Such conflicts are often blamed on ethnic or sectarian tensions. Failed states, such as Syria right now, are typically riven by violence along such lines. According to the idea that nation states should contain only one nation, such failures have often been blamed on the colonial legacy of bundling together many peoples within unnatural boundaries. But for every Syria or Iraq there is a Singapore, Malaysia or Tanzania, getting along okay despite having several “national” groups. Immigrant states in Australia and the Americas, meanwhile, forged single nations out of massive initial diversity. What makes the difference? It turns out that while ethnicity and language are important, what really matters is bureaucracy. This is clear in the varying fates of the independent states that emerged as Europe’s overseas empires fell apart after the second world war. 
According to the mythology of nationalism, all they needed was a territory, a flag, a national government and UN recognition. In fact what they really needed was complex bureaucracy. Some former colonies that had one became stable democracies, notably India. Others did not, especially those such as the former Belgian Congo, whose colonial rulers had merely extracted resources. Many of these became dictatorships, which require a much simpler bureaucracy than democracies. Dictatorships exacerbate ethnic strife because their institutions do not promote citizens’ identification with the nation. In such situations, people fall back on trusted alliances based on kinship, which readily elicit Dunbar-like loyalties. Insecure governments allied to ethnic groups favour their own, while grievances among the disfavoured groups grow – and the resulting conflict can be fierce. Recent research confirms that the problem is not ethnic diversity itself, but not enough official inclusiveness. Countries with little historic ethnic diversity are now having to learn that on the fly, as people migrate to find jobs within a globalised economy. How that pans out may depend on whether people self-segregate. Humans like being around people like themselves, and ethnic enclaves can be the result. Jennifer Neal of Michigan State University in East Lansing has used agent-based modelling to look at the effect of this in city neighbourhoods. Her work suggests that enclaves promote social cohesion, but at the cost of decreasing tolerance between groups. Small enclaves in close proximity may be the solution. But at what scale? Bar-Yam says communities where people are well mixed – such as in peaceable Singapore, where enclaves are actively discouraged – tend not to have ethnic strife. Larger enclaves can also foster stability. 
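Neal's actual model is not described here, but the mechanism she studies is classically illustrated by a Schelling-style agent-based sketch, in which agents with only a mild preference for similar neighbours still produce enclaves. The grid size, tolerance threshold and step count below are illustrative assumptions, not parameters from her work:

```python
import random

def neighbours(grid, r, c):
    """Values of the up-to-8 occupied cells adjacent to (r, c)."""
    n = len(grid)
    out = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < n and 0 <= cc < n \
                    and grid[rr][cc] is not None:
                out.append(grid[rr][cc])
    return out

def same_fraction(grid, r, c):
    """Fraction of an agent's neighbours belonging to its own group."""
    nb = neighbours(grid, r, c)
    if not nb:
        return 1.0
    return sum(1 for g in nb if g == grid[r][c]) / len(nb)

def segregation(grid):
    """Mean same-group neighbour fraction over all agents."""
    vals = [same_fraction(grid, r, c)
            for r in range(len(grid)) for c in range(len(grid))
            if grid[r][c] is not None]
    return sum(vals) / len(vals)

def run(n=20, empty_frac=0.1, threshold=0.3, steps=30, seed=1):
    """Random start; each step, unhappy agents (fewer than `threshold`
    like-group neighbours) jump to a random empty cell."""
    rng = random.Random(seed)
    n_empty = int(n * n * empty_frac)
    agents = [0, 1] * ((n * n - n_empty) // 2)
    agents += [0] * (n * n - n_empty - len(agents))
    cells = agents + [None] * n_empty
    rng.shuffle(cells)
    grid = [cells[i * n:(i + 1) * n] for i in range(n)]
    before = segregation(grid)
    for _ in range(steps):
        moved = 0
        empties = [(r, c) for r in range(n) for c in range(n)
                   if grid[r][c] is None]
        for r in range(n):
            for c in range(n):
                if grid[r][c] is not None and \
                        same_fraction(grid, r, c) < threshold:
                    i = rng.randrange(len(empties))
                    er, ec = empties[i]
                    grid[er][ec], grid[r][c] = grid[r][c], None
                    empties[i] = (r, c)
                    moved += 1
        if moved == 0:
            break
    return before, segregation(grid)
```

With a tolerance threshold of just 0.3 (agents are content if only 30 per cent of their neighbours are like them), average neighbourhood similarity rises well above its random-mixing starting point, the counter-intuitive result Schelling first demonstrated: mild individual preferences aggregate into strong spatial segregation.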
Using mathematical models to correlate the size of enclaves with the incidence of ethnic strife in India, Switzerland and the former Yugoslavia, he found that enclaves 56 kilometres or more wide make for peaceful coexistence – especially if they are separated by natural geographical barriers. Switzerland’s 26 cantons, for example, which have different languages and religions, meet Bar-Yam’s spatial stability test – except one. A French-speaking enclave in German-speaking Berne experienced the only major unrest in recent Swiss history. It was resolved by making it a separate canton, Jura, which meets the criteria. Again, though, ethnicity and language are only part of the story. Lars-Erik Cederman of the Swiss Federal Institute of Technology in Zurich argues that Swiss cantons have achieved peace not by geographical adjustment of frontiers, but by political arrangements giving cantons considerable autonomy and a part in collective decisions. Similarly, using a recently compiled database to analyse civil wars since 1960, Cederman finds that strife is indeed more likely in countries that are more ethnically diverse. But careful analysis confirms that trouble arises not from diversity alone, but when certain groups are systematically excluded from power. Governments with ethnicity-based politics were especially vulnerable. The US set up just such a government in Iraq after the 2003 invasion. Exclusion of Sunnis by Shiites led to insurgents declaring a Sunni state in occupied territory in Iraq and Syria. True to nation-state mythology, it rejects the colonial boundaries of Iraq and Syria, as they force dissimilar “nations” together. Yet the solution cannot be imposing ethnic uniformity. Historically, so-called ethnic cleansing has been uniquely bloody, and “national” uniformity is no guarantee of harmony. In any case, there is no good definition of an ethnic group. 
Many people’s ethnicities are mixed and change with the political weather: the numbers who claimed to be German in the Czech Sudetenland territory annexed by Hitler changed dramatically before and after the war. Russian claims to Russian-speakers in eastern Ukraine now may be equally flimsy. Both Bar-Yam’s and Cederman’s research suggests one answer to diversity within nation states: devolve power to local communities, as multicultural states such as Belgium and Canada have done. “We need a conception of the state as a place where multiple affiliations and languages and religions may be safe and flourish,” says Slattery. “That is the ideal Tanzania has embraced and it seems to be working reasonably well.” Tanzania has more than 120 ethnic groups and about 100 languages. In the end, what may matter more than ethnicity, language or religion is economic scale. The scale needed to prosper may have changed with technology – tiny Estonia is a high-tech winner – but a small state may still not pack enough economic power to compete. That is one reason why Estonia is such an enthusiastic member of the European Union. After the devastating wars in the 20th century, European countries tried to prevent further war by integrating their basic industries. That project, which became the European Union, now primarily offers member states profitable economies of scale, through manufacturing and selling in the world’s largest single market. What the EU fails to inspire is nationalist-style allegiance – which Maleševic thinks nowadays relies on the “banal” nationalism of sport, anthems, TV news programmes, even song contests. That means Europeans’ allegiances are no longer identified with the political unit that handles much of their government. Ironically, says Jan Zielonka of the University of Oxford, the EU has saved Europe’s nation states, which are now too small to compete individually. 
The call by nationalist parties to “take back power from Brussels”, he argues, would lead to weaker countries, not stronger ones. He sees a different problem. Nation states grew out of the complex hierarchies of the industrial revolution. The EU adds another layer of hierarchy – but without enough underlying integration to wield decisive power. It lacks both of Maleševic’s necessary conditions: nationalist ideology and pervasive integrating bureaucracy. Even so, the EU may point the way to what a post-nation-state world will look like. Zielonka agrees that further integration of Europe’s governing systems is needed as economies become more interdependent. But he says Europe’s often-paralysed hierarchy cannot achieve this. Instead he sees the replacement of hierarchy by networks of cities, regions and even non-governmental organisations. Sound familiar? Proponents call it neo-medievalism. “The future structure and exercise of political power will resemble the medieval model more than the Westphalian one,” Zielonka says. “The latter is about concentration of power, sovereignty and clear-cut identity.” Neo-medievalism, on the other hand, means overlapping authorities, divided sovereignty, multiple identities and governing institutions, and fuzzy borders. Anne-Marie Slaughter of Princeton University, a former US assistant secretary of state, also sees hierarchies giving way to global networks primarily of experts and bureaucrats from nation states. For example, governments now work more through flexible networks such as the G7 (or 8, or 20) to manage global problems than through the UN hierarchy. Ian Goldin, head of the Oxford Martin School at the University of Oxford, which analyses global problems, thinks such networks must emerge. 
He believes existing institutions such as UN agencies and the World Bank are structurally unable to deal with problems that emerge from global interrelatedness, such as economic instability, pandemics, climate change and cybersecurity – partly because they are hierarchies of member states which themselves cannot deal with these global problems. He quotes Slaughter: “Networked problems require a networked response.” Again, the underlying behaviour of systems and the limits of the human brain explain why. Bar-Yam notes that in any hierarchy, the person at the top has to be able to get their head around the whole system. When systems are too complex for one human mind to grasp, he argues that they must evolve from hierarchies into networks where no one person is in charge. Where does this leave nation states? “They remain the main containers of power in the world,” says Breuilly. And we need their power to maintain the personal security that has permitted human violence to decline to all-time lows. Moreover, says Dani Rodrik of Princeton’s Institute for Advanced Study, the very globalised economy that is allowing these networks to emerge needs something or somebody to write and enforce the rules. Nation states are currently the only entities powerful enough to do this. Yet their limitations are clear, both in solving global problems and resolving local conflicts. One solution may be to pay more attention to the scale of government. Known as subsidiarity, this is a basic principle of the EU: the idea that government should act at the level where it is most effective, with local government for local problems and higher powers at higher scales. There is empirical evidence that it works: social and ecological systems can be better governed when their users self-organise than when they are run by outside leaders. However, it is hard to see how our political system can evolve coherently in that direction. 
Nation states could get in the way of both devolution to local control and networking to achieve global goals. With climate change, it is arguable that they already have. There is an alternative to evolving towards a globalised world of interlocking networks, neo-medieval or not, and that is collapse. “Most hierarchical systems tend to become top-heavy, expensive and incapable of responding to change,” says Marten Scheffer of Wageningen University in the Netherlands. “The resulting tension may be released through partial collapse.” For nation states, that could mean anything from the renewed pre-eminence of cities to Iraq-style anarchy. An uncertain prospect, but there is an upside. Collapse, say some, is the creative destruction that allows new structures to emerge. Like it or not, our societies may already be undergoing this transition. We cannot yet imagine there are no countries. But recognising that they were temporary solutions to specific historical situations can only help us manage a transition to whatever we need next. Whether or not our nations endure, the structures through which we govern our affairs are due for a change. Time to start imagining.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ENERGY.2013.5.1.2 | Award Amount: 10.50M | Year: 2014

The key objective of the M4CO2 project is to develop and prototype Mixed Matrix Membranes based on highly engineered Metal organic frameworks and polymers (M4) that outperform current technology for CO2 capture in pre- and post-combustion, meeting the energy and cost reduction targets of the European SET plan. By applying the innovative concept of M4 by a consortium of world key players, continuous separation processes of unsurpassed energy efficiency will be realized as a gas-liquid phase change is absent, reducing the energy penalty and resulting in smaller CO2 footprints. Further, gas separation membrane units are safer, environmentally friendly and, in general, have smaller physical footprints than other types of plants like amine stripping. In this way this project aims at a quantum leap in energy reduction for CO2 separation with associated cost efficiency and environmental impact reduction. The developed membranes will allow CO2 capture at prices below €15/ton CO2 (€10-15/MWh), amply meeting the targets of the European SET plan (90% of CO2 recovery at a cost lower than €25/MWh). This will be underpinned experimentally as well as through conceptual process designs and economic projections by the industrial partners. By developing optimized M4s, we will combine: i) easy manufacturing, ii) high fluxes per unit volume and iii) high selectivity through advanced material tailoring. The main barriers that we will take away are the optimization of the MOF-polymer interaction and selective transport through the composite, where chemical compatibility, filler morphology and dispersion, and polymer rigidity all play a key role. Innovatively, the project will be the first systematic, integral study of this type of membrane, with investigations at all relevant length scales, including the careful design of the polymer(s) and the tuning of MOF crystals targeting the application in M4s and the design of the separation process.
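As a rough plausibility check (not part of the proposal), the per-ton and per-MWh targets can be related through the emission intensity of the power plant. The intensity figure below, roughly 0.7-1.0 t CO2 per MWh for fossil generation, is an assumption, not a number from the project text:

```python
# Rough cross-check of the quoted cost targets. The emission
# intensity of fossil generation (~0.7-1.0 t CO2 per MWh) is an
# assumption, not a figure from the project abstract.
def capture_cost_per_mwh(eur_per_ton: float, tons_per_mwh: float) -> float:
    """Euro per MWh added by capturing all CO2 from one MWh."""
    return eur_per_ton * tons_per_mwh

for intensity in (0.7, 1.0):
    print(f"{capture_cost_per_mwh(15, intensity):.1f} EUR/MWh")
# 15 EUR/ton at 0.7-1.0 t/MWh gives about 10.5-15.0 EUR/MWh,
# consistent with the 10-15 EUR/MWh quoted in the abstract.
```

Under this assumption the two ways the abstract quotes its target, per ton captured and per MWh generated, are mutually consistent.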


Meusinger H.,Thuringer Landessternwarte Tautenburg | Balafkan N.,University of Leipzig
Astronomy and Astrophysics | Year: 2014

Aims. A tiny fraction of the quasar population shows remarkably weak emission lines. Several hypotheses have been developed, but the weak line quasar (WLQ) phenomenon still remains puzzling. The aim of this study was to create a sizeable sample of WLQs and WLQ-like objects and to evaluate various properties of this sample. Methods. We performed a search for WLQs in the spectroscopic data from the Sloan Digital Sky Survey Data Release 7 based on Kohonen self-organising maps for nearly 10^5 quasar spectra. The final sample consists of 365 quasars in the redshift range z = 0.6-4.2 (mean z = 1.50 ± 0.45) and includes in particular a subsample of 46 WLQs with equivalent widths W(Mg II) < 11 Å and W(C IV) < 4.8 Å. We compared the luminosities, black hole masses, Eddington ratios, accretion rates, variability, spectral slopes, and radio properties of the WLQs with those of control samples of ordinary quasars. Particular attention was paid to selection effects. Results. The WLQs have, on average, significantly higher luminosities, Eddington ratios, and accretion rates. About half of the excess comes from a selection bias, but an intrinsic excess remains, probably caused primarily by higher accretion rates. The spectral energy distribution shows a bluer continuum at rest-frame wavelengths ≲ 1500 Å. The variability in the optical and UV is relatively low, even taking the variability-luminosity anti-correlation into account. The percentage of radio-detected quasars and of core-dominant radio sources is significantly higher than for the control sample, whereas the mean radio loudness is lower. Conclusions. The properties of our WLQ sample can be consistently understood assuming that it consists of a mix of quasars at the beginning of a stage of increased accretion activity and of beamed radio-quiet quasars. The higher luminosities and Eddington ratios in combination with a bluer spectral energy distribution can be explained by hotter continua, i.e. higher accretion rates. 
If quasar activity consists of subphases with different accretion rates, a change towards a higher rate is probably accompanied by only a slow development of the broad line region. The composite WLQ spectrum can be reasonably matched by the ordinary quasar composite where the continuum has been replaced by that of a hotter disk. A similar effect can be achieved by an additional power-law component in relativistically boosted radio-quiet quasars, which may explain the high percentage of radio quasars. © 2014 ESO.
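The WLQ search above rests on Kohonen self-organising maps, which cluster high-dimensional spectra onto a 2D grid so that unusual objects land on isolated nodes. As a rough illustration (not the authors' pipeline; grid size, learning schedule, and neighbourhood function are illustrative assumptions), a minimal SOM in NumPy looks like this:

```python
import numpy as np

def train_som(spectra, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a minimal Kohonen self-organising map on spectra (n_samples, n_bins)."""
    rng = np.random.default_rng(seed)
    n, d = spectra.shape
    rows, cols = grid
    # Node weight vectors, initialised from randomly chosen training spectra
    weights = spectra[rng.integers(0, n, rows * cols)].astype(float)
    # Grid coordinates of each node, used by the neighbourhood function
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5  # shrinking neighbourhood
        for x in spectra[rng.permutation(n)]:
            # Best-matching unit: node whose weight vector is closest to x
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2 * sigma ** 2))    # Gaussian neighbourhood
            # Pull the BMU and its grid neighbours towards the input spectrum
            weights += lr * h[:, None] * (x - weights)
    return weights.reshape(rows, cols, d)
```

After training, spectra mapping to the same or adjacent nodes are similar; candidate outliers such as WLQs would populate sparsely occupied nodes.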


Hirrlinger J.,Max Planck Institute for Experimental Medicine | Hirrlinger J.,University of Leipzig | Hirrlinger J.,Carl Ludwig Institute for Physiology | Nave K.-A.,Max Planck Institute for Experimental Medicine
GLIA | Year: 2014

In the mammalian brain, the subcortical white matter comprises long-range axonal projections and their associated glial cells. Here, astrocytes and oligodendrocytes serve specific functions during development and throughout adult life, when they meet the metabolic needs of long fiber tracts. Within a short period of time, oligodendrocytes generate large amounts of lipids, such as cholesterol, and membrane proteins for building the myelin sheaths. After myelination has been completed, a remaining function of glial metabolism is the energetic support of axonal transport and impulse propagation. Astrocytes can support axonal energy metabolism under low glucose conditions by the degradation of stored glycogen. Recently, it has been recognized that the ability of glycolytic oligodendrocytes to deliver pyruvate and lactate is critical for axonal functions in vivo. In this review, we discuss the specific demands of oligodendrocytes during myelination and potential routes of metabolites between glial cells and myelinated axons. As examples, four specific metabolites are highlighted (cholesterol, glycogen, lactate, and N-acetyl-aspartate) that contribute to the specific functions of white matter glia. Regulatory processes are discussed that could be involved in coordinating metabolic adaptations and in providing feedback information about metabolic states. © 2014 Wiley Periodicals, Inc.


Zahn S.,University of Leipzig | Stark A.,TU Darmstadt
Physical Chemistry Chemical Physics | Year: 2015

The dissolution of 1-alkyl-3-methylimidazolium chloride ILs with short alkyl chains in trihexyltetradecylphosphonium chloride not only exhibits a large negative entropy; in the resulting mixtures, the phosphonium cation also diffuses faster than the much smaller imidazolium cation. Both unexpected features originate from the formation of a large symmetric ion cluster cage in which the imidazolium cation is caught by three chloride anions and four phosphonium cations. © the Owner Societies 2015.


Dagres N.,National and Kapodistrian University of Athens | Hindricks G.,University of Leipzig
European Heart Journal | Year: 2013

Patients who have experienced a myocardial infarction (MI) are at increased risk of sudden cardiac death (SCD). With the advent of implantable cardioverter-defibrillators (ICDs), accurate risk stratification has become very relevant. Numerous investigations have proven that a reduced left ventricular ejection fraction (LVEF) significantly increases the SCD risk. Furthermore, ICD implantation in patients with reduced LVEF confers significant survival benefit. As a result, LVEF is the cornerstone of current decision making for prophylactic ICD implantation after MI. However, LVEF as a standalone risk stratifier has major limitations: (i) the majority of SCD cases occur in patients with preserved or moderately reduced LVEF, (ii) only relatively few patients with reduced LVEF will benefit from an ICD (most will never experience a threatening arrhythmic event, others have a high risk for non-sudden death), (iii) a reduced LVEF is a risk factor for both sudden and non-sudden death. Several other non-invasive and invasive risk stratifiers, such as ventricular ectopy, QRS duration, signal-averaged electrocardiogram, microvolt T-wave alternans, markers of autonomic tone as well as programmed ventricular stimulation, have been evaluated. However, none of these techniques has unequivocally demonstrated efficacy when applied alone or in combination with LVEF. Apart from their limited sensitivity, most of them are risk factors for both sudden and non-sudden death. Considering the multiple mechanisms involved in SCD, it seems unlikely that a single test will prove adequate for all patients. A combination of clinical characteristics with selected stratification tools may significantly improve risk stratification in the future. © The Author 2013.


Hilbert A.,University of Leipzig | Braehler E.,University of Leipzig | Haeuser W.,Klinikum Saarbrucken | Zenger M.,University of Leipzig
Obesity | Year: 2014

Objective Weight bias has strong associations with psychopathology in overweight and obese individuals. However, self-evaluative processes, as conceptualized in the process model of self-stigma, and implications for other health-related outcomes, remain to be clarified. Design and Methods In a representative general population sample of N = 1158 overweight and obese individuals, the impact of core self-evaluation as a mediator between weight bias internalization and mental and global health outcomes, as well as between weight bias internalization and health care utilization, was examined using structural equation modeling. Results In overweight and obese individuals, greater weight bias internalization predicted lower core self-evaluation, which in turn predicted greater depression and anxiety, lower global health, and greater health care utilization. These mediational associations were largely stable in subsample analyses and after controlling for sociodemographic variables. Conclusions The results show that overweight and obese individuals with internalized weight bias are at risk for impaired health, especially if they experience low core self-evaluation, making them a group to target with interventions to reduce self-stigma. Weight bias internalization did not represent a barrier to health care utilization, but predicted greater health care utilization in association with greater health impairments. Copyright © 2013 The Obesity Society.


Schnabel S.,University of Georgia | Janke W.,University of Leipzig | Bachmann M.,Jülich Research Center
Journal of Computational Physics | Year: 2011

The investigation of freezing transitions of single polymers is computationally demanding, since surface effects dominate the nucleation process. In recent studies we have systematically shown that the freezing properties of flexible, elastic polymers depend on the precise chain length. Performing multicanonical Monte Carlo simulations, we faced several computational challenges in connection with liquid-solid and solid-solid transitions. For this reason, we developed novel methods and update strategies to overcome the arising problems. We introduce novel Monte Carlo moves and two extensions to the multicanonical method. © 2011 Elsevier Inc.
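The multicanonical method mentioned above replaces the Boltzmann weight with weights W(E) ≈ 1/g(E), so that the simulation performs a random walk in energy and crosses free-energy barriers at first-order-like transitions. A minimal sketch of the standard iterative weight estimation, applied to a hypothetical toy model of independent spins (where the "energy" is simply the number of up spins and the exact density of states is binomial), might look as follows; the naive update rule ln W ← ln W − ln H shown here is the simplest variant, not the authors' optimized scheme:

```python
import numpy as np

def multicanonical_weights(n_spins=20, iterations=8, sweeps=2000, seed=0):
    """Estimate multicanonical weights W(E) ~ 1/g(E) for a toy model of
    independent spins, where the energy E is the number of up spins.
    Iteratively flattens the sampled energy histogram H(E)."""
    rng = np.random.default_rng(seed)
    n_levels = n_spins + 1
    lnW = np.zeros(n_levels)                 # start from flat weights
    spins = rng.integers(0, 2, n_spins)
    for _ in range(iterations):
        hist = np.zeros(n_levels)
        E = spins.sum()
        for _ in range(sweeps):
            i = rng.integers(n_spins)
            E_new = E + (1 - 2 * spins[i])   # energy after flipping spin i
            # Metropolis acceptance with the current multicanonical weights
            if np.log(rng.random()) < lnW[E_new] - lnW[E]:
                spins[i] ^= 1
                E = E_new
            hist[E] += 1
        # Suppress frequently visited energies; unvisited bins stay unchanged
        lnW -= np.log(np.maximum(hist, 1))
        lnW -= lnW.max()                     # fix the overall normalisation
    return lnW
```

At convergence −lnW approximates ln g(E) up to a constant, so for this toy model it should reproduce the shape of ln C(n, E), with the strongest suppression at the entropy maximum E = n/2.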


Scheinert D.,University of Leipzig | Schulte K.-L.,Ev. Krankenhaus Konigin Elisabeth Herzberge | Zeller T.,Universitaets Herzzentrum Freiburg Bad Krozingen | Lammer J.,Medical University of Vienna | Tepe G.,Rosenheim Hospital
Journal of Endovascular Therapy | Year: 2015

Purpose: To evaluate the safety and efficacy of the novel Passeo-18 Lux paclitaxel-coated balloon compared with the Passeo-18 uncoated balloon in patients with symptomatic de novo or restenotic femoropopliteal lesions. Methods: Sixty patients (34 men; mean age 70.7±10.1 years) in 5 European centers were enrolled in the BIOLUX P-I trial (ClinicalTrials.gov identifier NCT01056120) and randomized 1:1 to either the paclitaxel-coated balloon or the uncoated balloon. The primary endpoint was late lumen loss at 6 months. Secondary endpoints were binary restenosis at 6 months, clinically driven target lesion revascularization (TLR), change in ankle-brachial index and Rutherford classification, and major adverse events at 6 and 12 months. Results: At 6 months, patients treated with paclitaxel-coated balloons had significantly lower late lumen loss (0.51±0.72 vs. 1.04±1.00 mm, p=0.033) and binary restenosis (11.5% vs. 34.6%, p=0.048) than the control group. Correspondingly, clinically driven TLR was lower in the paclitaxel-coated balloon group at 12 months [15.4% vs. 41.7% (p=0.064) for the intention-to-treat population and 16.0% vs. 52.9% (p=0.020) for the as-treated population]. No deaths and one minor amputation were observed in the paclitaxel-coated balloon group, compared with two deaths and two minor amputations in the control group. No major amputations or thrombosis were reported. Conclusion: The Passeo-18 Lux paclitaxel-coated balloon has been proven to be safe and effective in patients with femoropopliteal lesions, with superior performance outcomes compared with treatment with an uncoated balloon. © The Author(s) 2015.


Olbrich S.,University of Leipzig | Arns M.,University Utrecht | Arns M.,Research Institute Brainclinics
International Review of Psychiatry | Year: 2013

Major depressive disorder (MDD) has high population prevalence and is associated with substantial impact on quality of life, not least due to an unsatisfactory time span of sometimes several weeks from initiation of treatment to clinical response. Therefore, extensive research has focused on the identification of cost-effective and widely available electroencephalogram (EEG)-based biomarkers that not only allow distinguishing between patients and healthy controls but also have predictive value for treatment response across a variety of treatments. In this comprehensive overview of EEG research on MDD, biomarkers that are assessed either at baseline or during the early course of treatment, and that are helpful in discriminating patients from healthy controls and in predicting treatment outcome, are reviewed, covering recent decades up to the present. Reviewed markers include quantitative EEG (QEEG) measures, connectivity measures, EEG vigilance-based measures, sleep-EEG-related measures and event-related potentials (ERPs). Further, the value and limitations of these different markers are discussed. Finally, the need for integrated models of brain function and the necessity for standardized procedures in EEG biomarker research are highlighted to enhance future research in this field. © 2013 Institute of Psychiatry.


Moddel M.,University of Leipzig | Janke W.,University of Leipzig | Bachmann M.,Jülich Research Center
Physical Chemistry Chemical Physics | Year: 2010

In detailed microcanonical analyses of densities of states obtained by extensive multicanonical Monte Carlo computer simulations, we investigate the caloric properties of conformational transitions that adsorbing polymers experience near attractive substrates. For short chains and strong surface attraction, the microcanonical entropy turns out to be a convex function of energy in the transition regime, indicating that surface-entropic effects are relevant. Albeit known to be a continuous transition in the thermodynamic limit of infinitely long chains, the adsorption transition of nongrafted finite-length polymers thus exhibits a clear signature of a first-order-like transition, with coexisting phases of adsorbed and desorbed conformations. Another remarkable consequence of the convexity of the microcanonical entropy is that the transition is accompanied by a decrease of the microcanonical temperature with increasing energy. Since this is a characteristic physical effect, it should not be ignored in analyses of cooperative macrostate transitions in finite systems. © 2010 the Owner Societies.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.2.1-7 | Award Amount: 6.29M | Year: 2008

At present more than 5 million people in the EU suffer from dementia and other neurodegenerative diseases, and that number will grow as the average age of the population continues to increase. The efficacy of current medicines is limited and new therapeutic targets are sorely needed. Several independent lines of evidence have established an important role of prolyl oligopeptidase (PREP) in brain function and dysfunction. Aberrant PREP activity is involved in the progression of neurodegenerative disorders and PREP inhibitors are being developed for the treatment of memory and cognition deficits. Now a consortium of expert scientists from 8 academic institutes and 3 SMEs comes together for 4 years in this NEUROPRO project to boost European research aimed at 1) unravelling the biological role of PREP and PREP-like proteins in neuropathology, 2) determining the mode of action of PREP inhibitors and 3) firmly establishing their therapeutic potential. Specialists from different disciplines (cell and molecular biology, enzymology, chemistry, crystallography, biology and pharmacology) will work in a concerted and focussed way to achieve the goals using 6 work packages concentrating on PREP-regulated pathways in health and disease, PREP substrates, inhibitor target identification, drug development and validation, and generation of specific cell lines and animal models of neurodegenerative diseases. The SMEs involved are leaders in PREP inhibitor development and peptide analysis, and have in the past already brought novel therapeutics to the market. By the end of the project we expect to have proof of concept that PREP inhibition is a valid therapeutic target, which will ultimately lead to new methods for the early detection, prevention or restoration of PREP-related neurodegeneration. The project also comprises instruments to translate basic research into clinical applications and will thus broaden the scope of treatments available to Europe's ageing population.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: Fission-2007-3.1-01 | Award Amount: 6.70M | Year: 2008

Recent epidemiologic evidence suggests that moderate and low radiation doses to the heart may result in a moderate, but significant increase in cardiovascular mortality. So far, the pathogenesis of radiation induced heart disease has not been studied in detail. Pathohistologic studies suggest that microvascular damage plays a crucial role in the development of radiation induced cardiovascular disease. In addition, radiation may increase atherosclerotic lesions in the coronary arteries. The aim of this collaborative research project is to elucidate the pathogenesis of early and late alteration in the microcirculation of the heart and of atherosclerotic lesions in arteries after exposure to low radiation doses in comparison to high radiation doses. A major goal will be the investigation of early molecular, proinflammatory and prothrombotic changes as well as perfusion alteration, cardiac cell integrity and immunologic influences. To achieve this goal, in vivo as well as ex vivo and in vitro studies will be performed. A central component of the project will be the local irradiation of the heart with subsequent isolation of cardiomyocytes and cardiac endothelial cells to provide all participating groups with the same biological material for further study. In addition, structural, morphological and molecular studies will be complemented by functional assays and imaging methods.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2013.9.7 | Award Amount: 8.27M | Year: 2013

The DIADEMS project aims at exploiting the unique physical properties of NV color centres in ultrapure single-crystal CVD-grown diamond to develop innovative devices with unprecedented performance for ICT applications. By exploiting the atom-like structure of the NV centre, which exhibits spin-dependent optical transitions, DIADEMS will make optics-based magnetometry possible.
The objectives of DIADEMS are to develop:
- wide-field magnetic imagers with 1 nT sensitivity,
- scanning probe magnetometers with 10 nT sensitivity and 10 nm spatial resolution,
- sensor heads with 1 pT resolution.

To reach such performance, DIADEMS will:
- use new theoretical protocols for sensing,
- develop ultrahigh-purity diamond material with controlled single-nitrogen implantation to a precision better than 5 nm,
- process scanning probe tips with diameters in the 20 nm range and transfer them to AFM cantilevers,
- improve the emission properties of NV centres by coupling them to photonic cavities and photonic waveguides.

DIADEMS outputs will demonstrate new ICT functionalities that will boost applications with high impact on society:
- calibration and optimization of write/read magnetic heads for the future high-capacity (3 Tbit per square inch) storage disks required for intensive computing,
- imaging of electron spin in graphene and carbon nanotubes for the next generation of electronic components based on spintronics,
- non-invasive investigation of living neuronal networks to understand brain function,
- demonstration of magnetic resonance imaging of single spins, allowing single-protein imaging for medical research.

DIADEMS aims at integrating the efforts of the European community on NV centres to push further the limits of this promising technology and to keep Europe's prominent position.


Wostmann M.,Max Planck Institute for Human Cognitive and Brain Sciences | Wostmann M.,International Max Planck Research School on Neuroscience of Communication | Schroger E.,University of Leipzig | Obleser J.,Max Planck Institute for Human Cognitive and Brain Sciences
Journal of Cognitive Neuroscience | Year: 2015

The flexible allocation of attention enables us to perceive and behave successfully despite irrelevant distractors. How do acoustic challenges influence this allocation of attention, and to what extent is this ability preserved in normally aging listeners? Younger and healthy older participants performed a masked auditory number comparison while EEG was recorded. To vary selective attention demands, we manipulated perceptual separability of spoken digits from a masking talker by varying acoustic detail (temporal fine structure). Listening conditions were adjusted individually to equalize stimulus audibility as well as the overall level of performance across participants. Accuracy increased, and response times decreased with more acoustic detail. The decrease in response times with more acoustic detail was stronger in the group of older participants. The onset of the distracting speech masker triggered a prominent contingent negative variation (CNV) in the EEG. Notably, CNV magnitude decreased parametrically with increasing acoustic detail in both age groups. Within identical levels of acoustic detail, larger CNV magnitude was associated with improved accuracy. Across age groups, neuropsychological markers further linked early CNV magnitude directly to individual attentional capacity. Results demonstrate for the first time that, in a demanding listening task, instantaneous acoustic conditions guide the allocation of attention. Second, such basic neural mechanisms of preparatory attention allocation seem preserved in healthy aging, despite impending sensory decline. © 2015 Massachusetts Institute of Technology.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-RISE | Phase: MSCA-RISE-2014 | Award Amount: 1.22M | Year: 2015

HYMADE focuses on the development of capsules and engineered colloidal particles for drug delivery combining mesoporous colloids, the Layer by Layer (LbL) technique and virosomes. The capsules and particles have potential applications in cancer and inflammatory diseases such as rheumatoid arthritis and uveitis. The project is based on the secondments of Early Stage Researchers and Experienced Researchers and on networking and training activities between European and non-European academic institutions. HYMADE aims to profit from the combination of hybrid materials to fabricate advanced drug delivery systems with controlled release and targeting efficiency of biological entities. The project also aims to gain understanding of the self-assembly process of hybrid materials and the transport properties of the drug delivery systems. The biological fate, drug release, degradation and therapeutic efficiency of the drug delivery systems will be studied in vitro and in vivo with state-of-the-art imaging techniques. To achieve these goals we have gathered an international multidisciplinary team with scientists at the forefront of Material Science, Self-assembly, Physics, Chemistry, Biophysics and Imaging from Germany, Austria, France and Spain on the European side and from the United States of America, Argentina and Armenia on the non-European side.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2012-ITN | Award Amount: 3.69M | Year: 2013

MARATONE is a Marie Curie Initial Training Network proposal that directly addresses the need for high-level training and career pathways in mental health to increase the inter-sectoral and trans-national employability of young scientists in the academic, public and private sectors, to meet the enormous challenge of the 2009 EU Parliament Resolution on Mental Health. The Resolution set out recommendations for a comprehensive and integrated mental health strategy for Europe. MARATONE is designed to address the biggest challenge to implementing this ambitious strategy: the lack of training for career pathways for young scientists in multidisciplinary mental health research. MARATONE is built on the innovative theoretical premise of horizontal epidemiology: the view that psychosocial difficulties associated with mental health disorders are not exclusively determined by the diagnosis of the particular disorder in a vertical, silo-like pattern, but horizontally, in a manner that reflects commonalities in the lived experience of people with diverse mental health problems. Grounded in this theoretical foundation, MARATONE's multidisciplinary network of partners will collaboratively develop methodologies for measuring the individual and social impact of mental health disorders, so as to create strategies for social and private sector responses to mental ill health in the form of health promotion and prevention programmes and, at the national level, strategies for human rights protections in policies and programming. The consortium will provide young researchers with scientific expertise in mental health, as well as basic technical and communication skills, including research development and management, international human rights commitments, and commercial exploitation and dissemination.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2008-1.1.1 | Award Amount: 9.66M | Year: 2008

EUFAR is the Integrating Activity for airborne research in Geo-science. It will integrate the airborne community, to ensure that researchers have access to the infrastructure best suited to their needs, irrespective of the location of that infrastructure. The EUFAR consortium comprises 32 legal entities: 14 operators of airborne facilities and 18 experts in airborne research. They contribute to 9 Networking Activities, Trans-national Access to 26 installations, and 3 Joint Research Activities. A Scientific Advisory Committee, constituted of eminent scientists, contributes to a better integration of the users with the operators to tackle new user-driven developments. Trans-national Access coordination aims at providing wider and more efficient access to the infrastructures. The working group for the Future of the Fleet fosters the joint development of airborne infrastructures in terms of capacity and performance. The Expert Working Groups facilitate a wider sharing of knowledge and technologies across fields. The activity for Education and Training provides training courses to new users. The working group on Standards and Protocols contributes to better structuring the way research infrastructures operate. The development of a distributed database for airborne activities improves access to the data collected by the aircraft. All these activities rely on a unique web portal to airborne research in Europe. The working group on the Sustainable Structure aims at promoting solutions for the long-term sustainability of EUFAR. Among the Joint Research Activities, one will develop and characterize airborne hygrometers, the second will develop and implement quality layers in the processing chains of hyperspectral imagery, and the third will develop an airborne drop spectrometer based on a new principle.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: HEALTH-2007-2.1.2-3 | Award Amount: 16.09M | Year: 2008

EuroSyStem brings together elite European research teams to create a unique and world-leading programme in fundamental stem cell biology. By interconnecting complementary biological and computational expertise we will drive the generation of new knowledge on the characteristics of normal and abnormal stem cells. We will pave the way for application of systems methodology by measuring and modelling stem cell properties and behaviour. Information will be mined from studies in model organisms, but our primary focus is on the paradigmatic mammalian stem cells: haematopoietic, epithelial, neural and embryonic. We will compare cellular hierarchy, signalling, epigenetics, dysregulation, and plasticity. Niche dependence, asymmetric division, transcriptional circuitry and the decision between self-renewal and commitment are linked in a cross-cutting work package. A multidisciplinary approach combines transgenesis, real-time imaging, multi-parameter flow cytometry, transcriptomics, RNA interference, proteomics and single-cell methodologies. SMEs will contribute to the development of enhanced-resolution quantitative technologies. A platform work package will provide new computational tools and database resources, enabling implementation of novel analytical and modelling approaches. EuroSyStem will engage with and provide a focal point for the European stem cell research community. The targeted collaborations within the EuroSyStem research project will be augmented by federating European research excellence across different tissues and organisms. We will organise annual symposia, training workshops, summer schools, networking and research opportunities to promote a flourishing basic stem cell research community. This network will foster interaction and synergy, accelerating progress to a deeper and more comprehensive understanding of stem cell properties. In parallel, EuroSyStem will develop web resources, educational and outreach materials for scientists and the lay community.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-1.4-5 | Award Amount: 4.33M | Year: 2008

Formidable challenges remain to prevent and successfully treat neurodegenerative diseases. Traditional pharmacological approaches, as well as those using stem cells, have made progress but their impact remains limited. As suggested by clinical results in Canavan and Parkinson's disease, gene transfer offers substantial potential. However, this strategy of therapeutic intervention also brings unique obstacles, in particular the need to address feasibility, efficacy and safety. BrainCAV's foundation is the potential of canine adenovirus type 2 (CAV-2) vectors, which preferentially transduce neurons and undergo very efficient long-distance targeting via axonal transport. Moreover, the episomal long-term expression leads to safe, efficient, neuron-specific gene delivery. We propose a structured translational approach that spans basic research through pre-clinical model feasibility, efficacy and safety. To provide a proof-of-principle of the effectiveness of CAV-2, we tackle mucopolysaccharidosis type VII, a global, orphan disease commonly affecting children, and Parkinson's disease, a focal degeneration of dopaminergic neurones commonly affecting the aged population. To develop and execute this project, BrainCAV brings together an interdisciplinary combination of partners with unique expertise that will take CAV-2 vectors to the doorstep of clinical trials.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-TP | Phase: HEALTH-2007-2.4.3-6 | Award Amount: 5.38M | Year: 2008

Obesity is the fastest-growing health problem in the Western world. In the United States, obesity as a cause of preventable mortality (365,000 deaths in the year 2000) will soon overtake tobacco (435,000 deaths) and has already overtaken alcohol consumption (85,000 deaths), infectious diseases (75,000) and motor vehicle crashes (43,000). More importantly, it is also a problem in children and adolescents, and accordingly, one of the major future health problems. We hypothesize that the reduction of hormones/signals, released or blocked after food intake as a nutrition signal, significantly contributes to the feedback in food intake, and subsequently to the onset of obesity. Accordingly, the focus of this project is the understanding of the contribution of gastro-intestinal peptides to the onset of obesity. It is aimed to identify the most relevant hormones or combinations of hormones, and subsequently to develop anti-obesity drugs that are based on endogenous hormone agonists or antagonists. The project will include the following aspects and competences: production of endogenous hormones by synthetic approaches according to the guidelines of GMP, and investigation in human volunteers by fMRI. The most promising candidates will be studied in more detail. This will include ghrelin, orexin, GLP, PP and PYY. Experiments will be performed at the molecular, cellular and animal levels. Original and optimised hormones will be modified with DOTA and used for labelling, and subsequent analysis in PET studies, to allow following up their distribution and stability. In a medicinal chemistry concept, non-peptidic drugs and peptide mimetics will be further explored, first for PYY and PP as well as for ghrelin, orexin and GLP.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-01-2014 | Award Amount: 6.22M | Year: 2015

A large body of evidence supports associations between exposure to anthropogenic chemicals and endocrine disruptive effects, leading to disorders in humans and wildlife. Based on the scientific documentation it is beyond doubt that endocrine disrupting chemicals (EDCs) are of concern and need to be handled according to the risks they pose, as single chemicals or as mixtures. To develop chemical risk assessment to respond to these concerns, there is an urgent need to improve our understanding of the mechanisms and health effects of EDCs, in particular in mixtures. This will require selection, refinement and development of tools for assessment of EDC mixtures to bring current risk assessment procedures to a level where they can support risk management. This project is designed to promote the use of safe chemicals for the next generation. EDC-MixRisk aims to meet the societal need for improved decision-making regarding human exposure risks to mixtures of EDCs. EDC-MixRisk will determine risks for multiple adverse health outcomes based on molecular mechanisms involved after early life exposure to EDC mixtures. The task is approached through interdisciplinary cooperation between experts in epidemiology, experimental toxicology and molecular biology, and risk assessment. The value of this combined research effort is: i) Identification of EDC mixtures that are associated with multiple adverse health outcomes in three health domains (growth and metabolism, neurodevelopment and sexual development) in epidemiology; ii) Identification of molecular mechanisms and pathways underlying the associations between exposure and adverse health outcomes by the use and development of state-of-the-art experimental models and iii) Development of a transparent and systematic framework in risk assessment for integrating epidemiological and experimental research to facilitate the assessment of risk and societal impact, thus promoting better risk management for EDCs and their mixtures.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: ICT-03-2014 | Award Amount: 4.10M | Year: 2015

The LOMID project will define pathways to the manufacture of flexible OLED microdisplays with an exceptionally large area (16 mm x 20 mm, screen diagonal of 25.4 mm) at acceptably high yields (>65%). This will be achieved by developing a robust silicon-based chip design allowing high pixel counts (1024x1280 (SXGA)) and high spatial resolution (pixel sizes of 10 µm x 10 µm, corresponding to 2000 ppi). These display innovations will be coupled to highly reliable manufacturing of the backplane. Low-cost processes (e.g. based on 0.35 µm lithography) will be developed, and special attention will be given to the interface between the top metal electrode of the CMOS backplane and the subsequent OLED layers. All these developments will be done on a 200 mm wafer scale. Along with this, a new testing procedure for quality control of the CMOS wafer (prior to OLED deposition) will be developed and promoted for standardisation. The flexibility of the large-area microdisplays will be achieved by wafer thinning, enabling a bending radius of 45 mm. Along with the new functionality, the durability of the devices has to be guaranteed: despite bending, it must be comparable to that of rigid devices. The project will address this by improving OLED efficiency (e.g. operating lifetime > 15,000 hours) and by modifying the device encapsulation both to fulfil the necessary barrier requirements (WVTR < 10^-6 g/m^2 per day) and to give sufficient mechanical protection. The demand for and timeliness of these flexible, large-area microdisplays is shown by the strong interest of industrial integrators in demonstrating the benefits of the innovative OLED microdisplays. Within the project, industrial integrators will validate the project's microdisplays in smart glasses for virtual reality and to aid those with impaired vision.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2014-ETN | Award Amount: 3.73M | Year: 2015

The physical and psychosocial impact of armed conflict on children is immense, particularly so if these children are associated with the enemy. Overwhelming evidence suggests that the status of children born of war (CBOW), i.e. children fathered by foreign soldiers and born to local mothers, has been and continues to be a major obstacle to the successful integration of both the children and their mothers into post-conflict societies. At a global level, previous UN studies have further emphasized the lack of research on children born of forced pregnancies in armed conflict. The proposed network addresses these shortcomings by advancing the knowledge base through systematic analysis of the lived experiences of CBOW in a variety of 20th-century conflict and post-conflict situations. The main research goal is to further our understanding of how (if at all) CBOW in conflict and post-conflict situations are integrated into society; how (if at all) militaries, governments, and nongovernmental policy makers assist this integration process; and how the children's lived experiences reflect broader societal attitudes to memories of war, and vice versa. Our vision is to promote scientific excellence by exploiting the specific research expertise and infrastructure of the co-ordinating partner and all participants in order to advance the research competencies and employability of early career researchers. Their enhanced understanding of the challenges of CBOW in volatile societies will inform the normative debates and, ultimately, policies on the reintegration of CBOW into post-conflict societies. By combining historical, social empirical, psychiatric, political, legal, memory, public health and development studies with the discourse surrounding currently enacted humanitarian intervention, insights gained from this network will surpass existing knowledge and will help improve on current integration efforts.


News Article | December 1, 2016
Site: www.eurekalert.org

New Haven, Conn. - Scientists have taken spectroscopic snapshots of nature's most mysterious relay race: the passage of extra protons from one water molecule to another during conductivity. The finding represents a major benchmark in our knowledge of how water conducts a positive electrical charge, which is a fundamental mechanism found in biology and chemistry. The researchers, led by Yale chemistry professor Mark Johnson, report their discovery in the Dec. 1 edition of the journal Science. For more than 200 years, scientists have speculated about the specific forces at work when electricity passes through water -- a process known as the Grotthuss mechanism. It occurs in vision, for example, when light hits the eye's retina. It also turns up in the way fuel cells operate. But the details have remained murky. In particular, scientists have sought an experimental way to follow the structural changes in the web of interconnected water molecules when an extra proton is transferred from one oxygen atom to another. "The oxygen atoms don't need to move much at all," Johnson said. "It is kind of like Newton's cradle, the child's toy with a line of steel balls, each one suspended by a string. If you lift one ball so that it strikes the line, only the end ball moves away, leaving the others unperturbed." Johnson's lab has spent years exploring the chemistry of water at the molecular level. Often, this is done with specially designed instruments built at Yale. Among the lab's many discoveries are innovative uses of electrospray ionization, which was developed by the late Yale Nobel laureate John Fenn. Johnson and his team have developed ways to fast-freeze the chemical process so that transient structures can be isolated, revealing the contorted arrangements of atoms during a reaction. The practical uses for these methods range from the optimization of alternative energy technologies to the development of pharmaceuticals. 
In the case of the proton relay race, previous attempts to capture the process hinged on using infrared color changes to see it. But the result always came out looking like a blurry photograph. "In fact, it appeared that this blurring would be too severe to ever allow a compelling connection between color and structure," Johnson said. The answer, he found, was to work with only a few molecules of "heavy water" -- water made of the deuterium isotope of hydrogen -- and chill them to almost absolute zero. Suddenly, the images of the proton in motion were dramatically sharper. "In essence, we uncovered a kind of Rosetta Stone that reveals the structural information encoded in color," Johnson said. "We were able to reveal a sequence of concerted deformations, like the frames of a movie." Johnson's lab was assisted by the experimental group of Knut Asmis at the University of Leipzig and the theory groups of Ken Jordan of the University of Pittsburgh and Anne McCoy of the University of Washington. One area where this information will be useful is in understanding chemical processes that occur at the surface of water, Johnson noted. There is active debate among scientists regarding whether the surface of water is more or less acidic than the bulk of water. At present, there is no way to measure the surface pH of water. The paper's first author is Conrad Wolke, a former Yale doctoral student in Johnson's lab. Co-authors of the paper are from the University of Chicago, Ohio State University, the University of Pittsburgh, the University of Washington, the University of Leipzig, and the Fritz Haber Institute of the Max Planck Society. Financial support for the research came from the U.S. Department of Energy, the National Science Foundation, the Ohio Supercomputing Center, and the Collaborative Research Center of the German Research Foundation DFG.


News Article | November 23, 2015
Site: cen.acs.org

Last July, Carmen Bachmann had an idea. The University of Leipzig professor wanted to help the growing influx of refugees arriving in Germany from conflict-ridden Syria, as well as from Afghanistan, Iraq, and elsewhere. With the help of an undergraduate student, Bachmann built an online platform called “Chance for Science” designed to connect refugee scientists with local academics. Bachmann’s organization is one of several efforts reaching out to scientists and engineers thought to be among the hundreds of thousands of refugees, asylum seekers, and others in Europe looking for temporary safety from conflict in the Middle East. Many European countries are tightening their borders to refugees after this month’s deadly terrorist attacks in Paris. But given how many people have already arrived, nonprofits and government agencies reckon there must be some highly qualified scientists among them. Finding those trained in the sciences will be difficult. And figuring out how to help them might be even harder, given that the migrants’ challenges include finding housing, language barriers, and asylum applications. The United Nations High Commissioner for Refugees estimates that well over 800,000 refugees and asylum seekers arrived in Europe by sea alone so far this year, more than half of whom came from Syria and nearly one-fifth from Afghanistan. Those numbers dwarf efforts in the U.S., which accepted just 132 Syrian refugees in 2014, according to the U.S. Department of Health & Human Services. Earlier this year, President Barack Obama promised to accept 10,000 refugees from Syria in the next year. However, there has been a backlash against that plan in the U.S. after the Paris attacks. Refugees who want to stay in Europe—many hope to make it to Germany or Sweden—must obtain visas that allow either temporary asylum or permission to immigrate.
Only then can they think about procuring permission to work—and searching for jobs in what are already tight academic and industry markets. This is just the most recent of Europe’s waves of immigrants. Before the current flood of Syrians, for example, Iraqis arrived in droves in the 1990s and again in the 2000s. Statistics from the European Union show that since 2009, non-EU citizens’ rates of employment have dropped relative to those of EU citizens. These numbers suggest it will be harder for current refugees to find jobs if and when they receive permission to work. Last month, the European Commission’s research and innovation directorate announced its Science4Refugees initiative. The agency hopes its online platform—tied to its existing jobs portal, Euraxess—will act as a kind of matchmaker for refugees seeking science jobs and raise the profile of immigration issues among scientists. But it offers no financial support. Those working on the program say that it’s too soon to evaluate success; meanwhile, the initiative’s staff is working to boost awareness of the program among refugees. They acknowledge that finding their target audience will be difficult and take some time; they hope to find other networks and websites in different countries working on the same goals. As of mid-November, universities have already flagged some 150 jobs on Euraxess with the Science4Refugees label, highlighting their willingness to consider assisting applicants with visas and other red tape. But only around 20 CVs were posted by people with refugee status. That could be because many refugees are still getting settled. First-time asylum seekers have registered in Germany, Hungary, Austria, France, Italy, and Sweden in the greatest numbers. Germany is the number one destination of choice for current refugees: 343,000 applied for asylum between January and October. But getting a work visa could take several months to a year, even after getting asylum papers.
The situation is easier in Sweden, which welcomed more than 112,000 refugees from January to October. Although it can take months to grant an individual asylum-seeker status, individuals have permission to work in the country the day after receiving their papers. Earlier waves of migration to Sweden have led to assistance programs that eventually could be useful to recent arrivals. For example, the Swedish government offers to pay salaries for researchers in six-month internships with universities and companies; that offer is open to educated immigrants with the right to work as well as Swedes who have been unemployed for six months or more. But in Sweden, as in other EU countries, language could prove to be a stumbling block for even educated asylum seekers: state-sponsored intensive language courses for engineers, medical personnel, and others take a year and a half to complete and teach proficiency in workaday Swedish. Still, the language of science is English, and that may be the case in Lebanon or Syria just as much as it is in Sweden or Germany. Back at the University of Leipzig, refugees who may have had their studies disrupted can audit courses taught in German or English if they are proficient in these languages. (The university has no numbers yet on how many students are taking advantage of these resources.) The Free University of Berlin is also opening courses to refugees even though they are not officially enrolled as students. The university sees this as preparation for enrollment at a later stage, once they have received visas or other kinds of permission to attend classes. In the meantime, already-enrolled students can volunteer to assist refugees in getting familiar with the university, work that counts for credit in job-qualification courses. Meanwhile, Bachmann’s “Chance for Science” now has 200 people registered on the online platform. Only 20 are refugees who might have asylum status.
Bachmann is not sure that her efforts will make a difference, but she hopes she can help keep knowledgeable people in their fields.


News Article | February 3, 2016
Site: phys.org

Light microscopy image of a live Drosophila that was unable to produce enough of the growth factor idgf6 due to a genetic modification. As a result, defects can be seen in the respiratory organ as well as in the chitinous shell. Credit: Dr. Matthias Behr With their chitinous shells, insects seem almost invulnerable – but like Achilles' heel in Greek mythology, their impressive armor can still be attacked. Researchers at the universities of Bonn and Leipzig studied fruit flies (Drosophila) and discovered the molecular processes that can break through this protective casing. The enzyme chitinase 2 and the growth factor idgf6 are especially important in correctly forming the insects' shells. These findings are relevant for fighting parasites, and will be published in the journal Scientific Reports. What holds for fruit flies (Drosophila) – one of developmental biologists' favorite animals to study – can generally also be applied to other insects. Deactivation of the chitinase 2 and/or idgf6 genes results in a fragile shell that does not provide adequate protection for the larvae of fruit flies and very likely other insects such as mosquitoes. "Pathogens can then easily infiltrate the animals, and they usually die during the larval stage," says Assistant Professor Dr. Matthias Behr, who transferred from the Life & Medical Sciences (LIMES) Institute at his alma mater in Bonn to the Sächsische Inkubator für die klinische Translation (SIKT) at the University of Leipzig. The project was financed with funding from Special Research Area 645 at the University of Bonn. The current discovery offers completely new starting points for keeping agricultural parasites as well as dangerous disease-carrying insects in check. The enzyme chitinase 2 and the growth factor idgf6 are essential for shell formation in nearly all insects, as well as in arthropods like crabs and spiders.
"However, there are minor species-related differences that could allow us to develop tailor-made inhibitors that will prevent proper development of the chitinous shell in certain species," says first author Yanina-Yasmin Pesch from the LIMES Institute at the University of Bonn. Specially developed substances could be used to attack the chitinous covering of one arthropod species while leaving other species unharmed. Dr. Behr names two examples of possible applications: the spotted-wing drosophila (Drosophila suzukii) that recently migrated to Germany, and the new Zika virus pathogen. The spotted-wing drosophila causes enormous damage for the agricultural industry because it attacks a large volume of ripening fruit. The Zika virus is transmitted to people through mosquito bites. This virus is suspected of causing birth defects in Brazil, among other places. The researchers hope their discovery will make it easier to fight these kinds of dangerous insects in the future. The researchers from the universities of Bonn and Leipzig, as well as from the Max Planck Institute of Biophysical Chemistry in Göttingen, turned up one other surprising find: "Until now, scientists assumed that chitinase 2 was a degradation enzyme," reports Pesch. "But surprisingly, it has now been found that the enzyme is essential in forming the chitinous shell." When the protective casing is being created, chitinase shortens the chitin to the right length. The precisely tailored components are then combined with other materials to build the shell. As the team of researchers already showed in a previous study, the "Obstructor-A" protein plays a key role here. Like a construction-site manager, it makes sure that various building materials are added to the protective shell in the right places. "Step by step, our research is revealing molecular details about the insects' Achilles heel," says Dr. Behr. More information: Yanina-Yasmin Pesch et al. 
Chitinases and Imaginal disc growth factors organize the extracellular matrix formation at barrier tissues in insects, Scientific Reports (2016). DOI: 10.1038/srep18340


Rother T.,German Aerospace Center | Wauer J.,University of Leipzig
Applied Optics | Year: 2010

In this paper we discuss the influence of two different sets of weighting functions on the accuracy of T-matrix calculations for scalar scattering problems. The first set of weighting functions is related to one of Waterman's original approaches. The other set results in a least-squares scheme for the transmission problem. It is shown that the two sets of weighting functions produce results with a converse accuracy behavior in the near and far fields. Additional information, such as reciprocity and the fulfillment of the boundary condition, is needed to choose the set of weighting functions that is most appropriate for a certain application. The obtained criteria are applied afterward to an iterative T-matrix approach we developed to analyze scattering by regular particle geometries with an impressed but slight surface irregularity. Its usefulness is demonstrated in this paper by analyzing the far-field scattering behavior of Chebyshev particles of higher orders. © 2010 Optical Society of America.


Holstein A.,Clinic Lippe Detmold | Patzer O.M.,Clinic Lippe Detmold | Machalke K.,Clinic Lippe Detmold | Holstein J.D.,University of Leipzig | And 2 more authors.
Diabetes Care | Year: 2012

OBJECTIVE - To compare the incidences of severe hypoglycemia and the corresponding clinical circumstances in a German population between 2007-2010 and 1997-2000. RESEARCH DESIGN AND METHODS - A screening for severe hypoglycemia was performed in the Lippe-Detmold area in Germany to sensitively detect severe hypoglycemia. This was defined as a symptomatic event requiring treatment with intravenous glucose and confirmed by a blood glucose measurement of <50 mg/dL. RESULTS - Severe hypoglycemia increased considerably, from 264 events in 1997-2000 to 495 events in 2007-2010, which translated into an increase in the frequency of severe hypoglycemia among all emergency admissions from 0.68% to 0.83% (P = 0.015). This was mostly related to intensification of antihyperglycemic therapy, particularly in the increasingly morbid group of hypoglycemic patients with type 2 diabetes, indicated by lower HbA1c, more comedication (3.3 vs. 7.7 drugs), and more concomitant diseases (3.6 vs. 4.4) (all P values <0.001). CONCLUSIONS - Within a 10-year period, there was an intensification of antihyperglycemic therapy in increasingly comorbid subjects, leading to a considerably higher incidence of severe hypoglycemia. © 2012 by the American Diabetes Association.
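The reported rise in frequency (0.68% to 0.83% of emergency admissions, P = 0.015) can be sanity-checked with a simple two-proportion z-test. The admission denominators below are back-calculated from the reported event counts and frequencies, so this is an approximate reconstruction, not the authors' actual analysis (which may have used a different test):

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from the normal tail
    return z, p_value

# Event counts are from the abstract; the admission denominators are
# inferred from the reported frequencies (0.68% and 0.83%) and are
# therefore approximations.
n_1997 = round(264 / 0.0068)   # ~38,800 emergency admissions, 1997-2000
n_2007 = round(495 / 0.0083)   # ~59,600 emergency admissions, 2007-2010
z, p = two_proportion_ztest(264, n_1997, 495, n_2007)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these inferred denominators the difference comes out significant at the 5% level, in line with the reported P = 0.015.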


Nielsen J.C.,Aarhus University Hospital | Johannessen A.,Copenhagen University | Raatikainen P.,University of Tampere | Hindricks G.,University of Leipzig | And 7 more authors.
New England Journal of Medicine | Year: 2012

BACKGROUND: There are limited data comparing radiofrequency catheter ablation with antiarrhythmic drug therapy as first-line treatment in patients with paroxysmal atrial fibrillation. METHODS: We randomly assigned 294 patients with paroxysmal atrial fibrillation and no history of antiarrhythmic drug use to an initial treatment strategy of either radiofrequency catheter ablation (146 patients) or therapy with class IC or class III antiarrhythmic agents (148 patients). Follow-up included 7-day Holter-monitor recording at 3, 6, 12, 18, and 24 months. Primary end points were the cumulative and per-visit burden of atrial fibrillation (i.e., percentage of time in atrial fibrillation on Holter-monitor recordings). Analyses were performed on an intention-to-treat basis. RESULTS: There was no significant difference between the ablation and drug-therapy groups in the cumulative burden of atrial fibrillation (90th percentile of arrhythmia burden, 13% and 19%, respectively; P = 0.10) or the burden at 3, 6, 12, or 18 months. At 24 months, the burden of atrial fibrillation was significantly lower in the ablation group than in the drug-therapy group (90th percentile, 9% vs. 18%; P = 0.007), and more patients in the ablation group were free from any atrial fibrillation (85% vs. 71%, P = 0.004) and from symptomatic atrial fibrillation (93% vs. 84%, P = 0.01). One death in the ablation group was due to a procedure-related stroke; there were three cases of cardiac tamponade in the ablation group. In the drug-therapy group, 54 patients (36%) underwent supplementary ablation. CONCLUSIONS: In comparing radiofrequency ablation with antiarrhythmic drug therapy as first-line treatment in patients with paroxysmal atrial fibrillation, we found no significant difference between the treatment groups in the cumulative burden of atrial fibrillation over a period of 2 years. Copyright © 2012 Massachusetts Medical Society. All rights reserved.


Ahsanullah,Free University of Berlin | Rademann J.,University of Leipzig | Rademann J.,Leibniz Institute for Molecular Pharmacology
Angewandte Chemie - International Edition | Year: 2010

Support and guidance: Azidopeptidyl phosphoranes on a solid support react very efficiently through cyclative cleavage to yield cyclopeptides with an incorporated triazole ring. The solid support is advantageous as cyclization is favored strongly over oligomerization reactions and thus only cyclized products are released. © 2010 Wiley-VCH Verlag GmbH & Co. KGaA.


Masuda N.,University of Tokyo | Klemm K.,University of Leipzig | Eguiluz V.M.,Institute Fisica Interdisciplinar y Sistemas Complejos IFISC CSIC UIB
Physical Review Letters | Year: 2013

Interactions among units in complex systems occur in a specific sequential order, thus affecting the flow of information, the propagation of diseases, and general dynamical processes. We investigate the Laplacian spectrum of temporal networks and compare it with that of the corresponding aggregate network. First, we show that the spectrum of the ensemble average of a temporal network has identical eigenmodes but smaller eigenvalues than that of the aggregate network. In large networks without edge condensation, the expected temporal dynamics is a time-rescaled version of the aggregate dynamics. Even for single sequential realizations, diffusive dynamics is slower in temporal networks. These discrepancies are due to the noncommutability of interactions. We illustrate our analytical findings using a simple temporal motif, larger network models, and real temporal networks. Published by the American Physical Society.
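The role of noncommutability can be seen in a minimal numerical sketch (an illustrative toy example, not the paper's model): for a two-snapshot temporal network whose snapshot Laplacians do not commute, the sequenced diffusion propagator has a more slowly decaying non-uniform mode than the propagator of the aggregate network over the same span, i.e. temporal diffusion lags aggregate diffusion:

```python
import numpy as np

def lap(edges, n):
    """Combinatorial graph Laplacian L = D - A from an edge list."""
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def expm_sym(M):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.exp(w)) @ V.T

n, tau = 3, 1.0
L1 = lap([(0, 1)], n)          # snapshot 1: only edge 0-1 is active
L2 = lap([(1, 2)], n)          # snapshot 2: only edge 1-2 is active

P_temporal  = expm_sym(-tau * L2) @ expm_sym(-tau * L1)   # interactions in sequence
P_aggregate = expm_sym(-tau * (L1 + L2))                  # both edges acting at once

# Slowest-decaying non-uniform mode = second-largest eigenvalue magnitude
# of the propagator; larger means slower diffusion.
slow = lambda P: sorted(np.abs(np.linalg.eigvals(P)))[-2]
print(slow(P_temporal), slow(P_aggregate))  # temporal mode decays more slowly
```

Because L1 and L2 share node 1 they do not commute, so exp(-L2)exp(-L1) differs from exp(-(L1+L2)), and the sequenced dynamics is measurably slower, consistent with the abstract's claim.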


Holzwarth F.,University of Leipzig | Kahl A.,University of Leipzig | Bauhus J.,Albert Ludwigs University of Freiburg | Wirth C.,University of Leipzig
Journal of Ecology | Year: 2013

Partitioning tree mortality into different modes of death allows the tracing and mechanistic modelling of individual key processes of forest dynamics, each varying with site, species and individual risk factors. This, in turn, may improve long-term predictions of the development of old-growth forests. Six different individual-tree mortality modes (uprooted and snapped, both with or without rot as a predisposing factor, standing dead, and crushed by other trees) were analysed, and statistical models were derived for three tree species (European beech Fagus sylvatica, hornbeam Carpinus betulus and common ash Fraxinus excelsior) based on a repeated inventory of more than 13 000 trees in a 28 ha near-natural deciduous forest in Central Germany. The frequently described U-shaped curve of size-dependent mortality was observed in beech and hornbeam (but not ash) and could be explained by the joint operation of processes related to the six distinct mortality modes. The results for beech, the most abundant species, suggest that each mortality mode is prevalent in a different life-history stage: small trees died mostly standing or crushed, medium-sized trees had the highest chance of survival, and very large trees experienced increased rates of mortality, mainly by uprooting or snapping. Reduced growth as a predictor played a role only for standing death; all other mortality modes showed no relationship to tree growth. Synthesis. Tree mortality can be partitioned into distinct processes, and species tend to differ in their susceptibility to one or more of them. This forms a fundamental basis for understanding forest dynamics in natural forests, and mechanistic modelling of mortality in vegetation models could be improved by correctly addressing and formulating the various mortality processes. © 2012 The Authors. Journal of Ecology © 2012 British Ecological Society.
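How the joint operation of mode-specific risks produces a U-shaped total mortality curve can be sketched with a toy model: two logistic risk curves on log-diameter, one for modes prevalent in small trees and one for modes prevalent in large trees. All coefficients here are invented for illustration, not the fitted values from the inventory:

```python
import numpy as np

def p_logistic(x, a, b):
    """Logistic probability as a function of log tree diameter."""
    return 1.0 / (1.0 + np.exp(-(a + b * x)))

dbh = np.linspace(5, 120, 200)   # stem diameter in cm (illustrative range)
x = np.log(dbh)

# Hypothetical coefficients: early-life modes fall with size,
# late-life modes rise with size.
p_standing_or_crushed = p_logistic(x, 2.0, -1.5)   # dominates in small trees
p_uprooted_or_snapped = p_logistic(x, -9.0, 1.6)   # dominates in large trees

# A tree dies if either (assumed independent) mode kills it.
p_total = 1 - (1 - p_standing_or_crushed) * (1 - p_uprooted_or_snapped)

i_min = int(np.argmin(p_total))
print(f"mortality is lowest at dbh ~ {dbh[i_min]:.0f} cm")
```

The combined curve is high for small trees, drops to an interior minimum at intermediate sizes, and rises again for large trees, reproducing the qualitative U-shape described in the abstract.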


Raz R.,University of Leipzig | Raz R.,Leibniz Institute for Molecular Pharmacology | Rademann J.,University of Leipzig | Rademann J.,Free University of Berlin
Organic Letters | Year: 2012

Rapid and efficient preparation of peptide thioacids from 2-cyanoethyl peptide thioesters has been accomplished. S-2-Cyanoethyl peptide thioesters were obtained cleanly without the need for purification from resin-bound tert-butyl peptide thioesters using 3-mercaptopropionitrile as a nucleophile. Elimination of the 2-cyanoethyl group proceeded rapidly (t1/2 < 8 min) under mild conditions and furnished peptide thioacids up to the size of a 16-mer. Peptide thioacids could be isolated or formed in situ and reacted smoothly with electron-deficient azides yielding an amide as the ligation product. © 2012 American Chemical Society.
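A half-life below 8 minutes implies, under simple first-order kinetics, near-quantitative elimination within about an hour. A minimal sketch of the arithmetic (assuming clean first-order behavior and taking t1/2 = 8 min as the worst case; times in minutes):

```python
import math

def fraction_remaining(t_min, t_half_min):
    """First-order decay: fraction of starting material left after t minutes."""
    return 0.5 ** (t_min / t_half_min)

t_half = 8.0  # upper bound reported for the 2-cyanoethyl elimination (min)
for t in (8, 16, 60):
    print(f"after {t:3d} min: {fraction_remaining(t, t_half):.1%} remaining")

# Time to reach 99% conversion at this worst-case half-life:
t99 = t_half * math.log2(100)
print(f"99% conversion in <= {t99:.0f} min")
```

Since the measured half-life is an upper bound, the actual elimination is at least this fast.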


Borries C.,German Aerospace Center | Hoffmann P.,University of Leipzig
Journal of Geophysical Research: Space Physics | Year: 2010

A first description of the characteristics of F2-layer ionosphere oscillations with periods between 2 and 30 days (so-called planetary wave-type oscillations, PWTO) is derived using regional maps of total electron content (TEC) covering the northern hemisphere from 50°N to the North Pole. Oscillations forced by quasiperiodic variations of solar signals (e.g., extreme ultraviolet irradiation and the solar wind) are identified and separated using the wavelet transform. It is found that up to 50% of the PWTO intensity in the F2-layer ionosphere occurs due to solar variability. The signals that are not obviously related to solar variability are spectrally decomposed to characterize oscillations by their period and wave number. Climatological analyses of the oscillations occurring in TEC reveal similarities to stratospheric planetary waves (PW): typical PW periods, the strongest wave activity during winter, and enhanced wave activity at similar latitudes. But there are also major differences from stratospheric PW. In the ionosphere, zonal-mean oscillations are dominant, stationary waves are not observed, and the dominant periods are shorter than the typical periods of PW in the stratosphere. However, the obtained results are in good agreement with recent numerical modeling results that showed no stationary wave propagation into the F2-region ionosphere and only fast, short-period waves propagating up to the lower thermosphere. © 2010 by the American Geophysical Union.
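The idea of scanning a TEC time series for dominant periods in the 2-30 day planetary-wave band can be illustrated with a plain Fourier decomposition (a simpler stand-in for the paper's wavelet analysis). The series below is synthetic, with invented 16-day and 24-day oscillations and noise; amplitudes and units are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(480)                      # daily samples (length chosen so both
                                           # test periods fall on exact FFT bins)
# Synthetic TEC-like series: a 16-day planetary-wave-type oscillation
# plus a stronger 24-day oscillation, plus white noise.
tec = (2.0 * np.sin(2 * np.pi * days / 16.0)
       + 3.0 * np.sin(2 * np.pi * days / 24.0)
       + rng.normal(0.0, 0.5, days.size))

spectrum = np.abs(np.fft.rfft(tec - tec.mean()))
freqs = np.fft.rfftfreq(days.size, d=1.0)  # cycles per day

# Restrict to the 2-30 day planetary-wave band and pick the strongest periods.
band = (freqs > 1 / 30.0) & (freqs < 1 / 2.0)
order = np.argsort(spectrum * band)[::-1][:2]
print("dominant periods (days):", sorted(round(1 / f, 1) for f in freqs[order]))
# → dominant periods (days): [16.0, 24.0]
```

A wavelet transform, as used in the paper, additionally localizes these periods in time, which matters for separating quasiperiodic solar forcing from transient wave activity.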


Mohr F.W.,University of Leipzig | Morice M.-C.,Hopital Prive Jacques Cartier | Kappetein A.P.,Erasmus University Rotterdam | Feldman T.E.,Evanston Hospital | And 9 more authors.
The Lancet | Year: 2013

Background We report the 5-year results of the SYNTAX trial, which compared coronary artery bypass graft surgery (CABG) with percutaneous coronary intervention (PCI) for the treatment of patients with left main coronary disease or three-vessel disease, to confirm findings at 1 and 3 years. Methods The randomised, clinical SYNTAX trial with nested registries took place in 85 centres in the USA and Europe. A cardiac surgeon and interventional cardiologist at each centre assessed consecutive patients with de-novo three-vessel disease or left main coronary disease to determine suitability for study treatments. Eligible patients suitable for either treatment were randomly assigned (1:1) by an interactive voice response system to either PCI with a first-generation paclitaxel-eluting stent or to CABG. Patients suitable for only one treatment option were entered into either the PCI-only or CABG-only registries. We analysed a composite rate of major adverse cardiac and cerebrovascular events (MACCE) at 5-year follow-up by Kaplan-Meier analysis on an intention-to-treat basis. This study is registered with ClinicalTrials.gov, number NCT00114972. Findings 1800 patients were randomly assigned to CABG (n=897) or PCI (n=903). More patients who were assigned to CABG withdrew consent than did those assigned to PCI (50 vs 11). After 5 years' follow-up, Kaplan-Meier estimates of MACCE were 26.9% in the CABG group and 37.3% in the PCI group (p<0.0001). Estimates of myocardial infarction (3.8% in the CABG group vs 9.7% in the PCI group; p<0.0001) and repeat revascularisation (13.7% vs 25.9%; p<0.0001) were significantly increased with PCI versus CABG. All-cause death (11.4% in the CABG group vs 13.9% in the PCI group; p=0.10) and stroke (3.7% vs 2.4%; p=0.09) were not significantly different between groups.
28.6% of patients in the CABG group with low SYNTAX scores had MACCE versus 32.1% of patients in the PCI group (p=0.43), and 31.0% in the CABG group with left main coronary disease had MACCE versus 36.9% in the PCI group (p=0.12); however, in patients with intermediate or high SYNTAX scores, MACCE was significantly increased with PCI (intermediate score, 25.8% of the CABG group vs 36.0% of the PCI group; p=0.008; high score, 26.8% vs 44.0%; p<0.0001). Interpretation CABG should remain the standard of care for patients with complex lesions (high or intermediate SYNTAX scores). For patients with less complex disease (low SYNTAX scores) or left main coronary disease (low or intermediate SYNTAX scores), PCI is an acceptable alternative. All patients with complex multivessel coronary artery disease should be reviewed and discussed by both a cardiac surgeon and interventional cardiologist to reach consensus on optimum treatment. Funding Boston Scientific.
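The MACCE rates quoted above are Kaplan-Meier estimates, which account for patients censored before 5 years (withdrawals, loss to follow-up). A minimal, self-contained version of the estimator shows how censoring enters the survival curve; the data here are toy values, not trial data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  - follow-up time for each patient
    events - 1 if the endpoint (e.g. MACCE) occurred, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    at_risk, survival = len(data), 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = withdrawals = 0
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                deaths += 1
            else:
                withdrawals += 1
            i += 1
        if deaths:
            # Conventionally, censored patients at time t still count as at risk.
            survival *= (at_risk - deaths) / at_risk
            curve.append((t, survival))
        at_risk -= deaths + withdrawals
    return curve

# Toy follow-up data (months, event flag) -- illustrative only.
times  = [3, 6, 6, 12, 18, 24, 24, 24]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```

The event rate (1 minus the final survival value) is what trial reports quote as the cumulative MACCE estimate at the end of follow-up.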


Klein A.A.,Papworth Hospital | Skubas N.J.,New York Medical College | Ender J.,University of Leipzig
Anesthesia and Analgesia | Year: 2014

Transcatheter aortic valve replacement (TAVR) has been performed with increasing frequency in the United States since Food and Drug Administration approval in 2011. The procedure involves the replacement of a severely stenosed native or bioprosthetic aortic valve with a specially constructed valvular prosthesis that is mounted onto a stent, without the use of cardiopulmonary bypass and the complications of a major open surgical procedure. TAVR has been performed mostly in elderly patients with multiple comorbidities or who have undergone previous cardiac surgery. The most commonly used access routes are the femoral artery (transfemoral) or the cardiac apex (transapical), but the transaortic and transsubclavian approaches are also used with varying frequency. Conscious sedation may be used in patients undergoing transfemoral TAVR, but the use of general anesthesia has not been shown to carry greater risk and permits the use of transesophageal echocardiography to assist in valve positioning and diagnose complications. Cardiovascular instability during TAVR is relatively common, necessitating invasive monitoring and frequent use of vasoactive medications. Complications of the procedure are still relatively common; the most frequent is vascular injury to the access sites or the aorta. Cardiovascular collapse may be the result of major hemorrhage, pericardial effusion with tamponade, or coronary occlusion due to incorrect valve placement. Persistent hypotension, myocardial stunning, or injury requiring open surgical intervention may necessitate the use of cardiopulmonary bypass, the facilities for which should always be immediately available. Ongoing and planned trials comparing conventional surgery with TAVR in lower-risk and younger patients should determine the place of TAVR in the medium- to long-term future. © 2014 International Anesthesia Research Society.


Autorino R.,Cleveland Clinic | Autorino R.,The Second University of Naples | Kaouk J.H.,Cleveland Clinic | Stolzenburg J.-U.,University of Leipzig | And 4 more authors.
European Urology | Year: 2013

Context: Despite the increasing interest in laparoendoscopic single-site surgery (LESS) worldwide, the actual role of this novel approach in the field of minimally invasive urologic surgery remains to be determined. It has been postulated that robotic technology could be applied to LESS to overcome the current constraints. Objective: To summarize and critically analyze the available evidence on the current status and future of robotic applications in single-site surgery. Evidence acquisition: A systematic literature review was performed in April 2011 using PubMed and the Thomson-Reuters Web of Science. In the free-text protocol, the following terms were applied: robotic single site surgery, robotic single port surgery, robotic single incision surgery, and robotic laparoendoscopic single site surgery. Review articles, editorials, commentaries, and letters to the editor were included only if deemed to contain relevant information. In addition, cited references from the selected articles and from review articles retrieved in the search were assessed for significant manuscripts not previously included. The authors selected 55 articles according to the search strategy based on Preferred Reporting Items for Systematic Reviews and Meta-analysis criteria. Evidence synthesis: The volume of available clinical outcomes of robotic LESS (R-LESS) has considerably grown since the pioneering description of the first successful clinical series of single-port robotic procedures. So far, a cumulative number of roughly 150 robotic urologic LESS cases have been reported by different institutions across the globe with a variety of techniques and port configurations. The feasibility of robot-assisted single-incision colorectal procedures, as well as of many gynecologic procedures, has also been demonstrated. 
A novel set of single-site instruments specifically dedicated to LESS is now commercially available for use with the da Vinci Si surgical system, and both experimental and clinical use have been reported. However, the current robotic systems were not specifically designed for LESS. The ideal robotic platform should have a low external profile, the possibility of being deployed through a single access site, and the possibility of restoring intra-abdominal triangulation while maintaining the maximum degree of freedom for precise maneuvers and strength for reliable traction. Several purpose-built robotic prototypes for single-port surgery are being tested. Conclusions: Significant advances have been achieved in the field of R-LESS since the first reported clinical series in 2009. Given the several advantages offered by the current da Vinci system, it is likely that its adoption in this field will increase. The recent introduction of purpose-built instrumentation is likely to further foster the application of robotics to LESS. However, we are still far from the ideal robotic platform. Significant improvements are needed before this technique might reach widespread adoption beyond selected centers. Further advances in the field of robotic technology are expected to provide the optimal interface to facilitate LESS. © 2012 European Association of Urology. Published by Elsevier B.V. All rights reserved.


Zahn S.,University of Leipzig | Macfarlane D.R.,Monash University | Izgorodina E.I.,Monash University
Physical Chemistry Chemical Physics | Year: 2013

We present high-level benchmark calculations of interaction energies of 236 ion pair structures of ionic liquids constituting a new IL-2013 set. 33 different approaches using various basis sets are validated against these benchmark data. Overall, traditional functionals like B3LYP, without an explicit dispersion correction, should be avoided when investigating ionic liquids. We can recommend the third version of Grimme's empirical dispersion correction (DFT-D3) and the LC-BOP functional, as well as most functionals of the Minnesota family of the M0X type. Our results highlight the importance of diffuse basis set functions for the accurate prediction of the IL energetics using any DFT functional. The best combination of reasonable accuracy and reasonable cost was found to be the M06-L functional in combination with the 6-31++G** basis set, producing a remarkable mean absolute deviation of only 4.2 kJ mol⁻¹ and a maximum deviation of -12.5 kJ mol⁻¹. Second-order Møller-Plesset perturbation theory (MP2) in combination with counterpoise-corrected triple-ζ basis sets can also be recommended for reliable calculations of energetics of ionic liquids. © 2013 The Owner Societies.
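The reported error statistics are straightforward aggregates over the benchmark set. A sketch with invented interaction energies (not the IL-2013 values) showing how a mean absolute deviation and a signed maximum deviation are computed:

```python
import numpy as np

# Hypothetical ion-pair interaction energies in kJ/mol (illustrative only):
benchmark = np.array([-350.0, -362.5, -348.1, -371.9])  # high-level reference
dft       = np.array([-346.2, -358.0, -352.3, -369.4])  # tested functional

deviation = dft - benchmark
mad = np.mean(np.abs(deviation))                   # mean absolute deviation
max_dev = deviation[np.argmax(np.abs(deviation))]  # largest deviation, with sign
```

Reporting the maximum deviation with its sign, as the abstract does (-12.5 kJ mol⁻¹), shows whether the worst case over- or under-binds relative to the benchmark.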


Raz R.,Leibniz Institute for Molecular Pharmacology | Raz R.,Free University of Berlin | Rademann J.,University of Leipzig | Rademann J.,Leibniz Institute for Molecular Pharmacology
Organic Letters | Year: 2011

tert-Butyl thioesters display an astonishing stability toward secondary amines in basic milieu, in contrast to other alkyl and aryl thioesters. Exploiting this enhanced stability, peptide thioesters were synthesized in a direct manner, applying a tert-butyl thiol linker for Fmoc-based solid-phase peptide synthesis. © 2011 American Chemical Society.


Eden B.,University of Leipzig | Korchemsky G.P.,CEA Saclay Nuclear Research Center | Sokatchev E.,University of Savoy
Journal of High Energy Physics | Year: 2011

We study the correlation functions of half-BPS protected operators in N = 4 super-Yang-Mills theory, in the limit where the positions of adjacent operators become light-like separated. We compute the loop corrections by means of Lagrangian insertions. The divergences resulting from the light-cone limit are regularized by changing the dimension of the integration measure over the insertion points. Switching from coordinates to dual momenta, we show that the logarithm of the correlation function is identical with twice the logarithm of the matching MHV gluon scattering amplitude. We present a number of examples of this new relation, at one and two loops.


Klimchitskaya G.L.,North West Technical University | Bordag M.,University of Leipzig | Mostepanenko V.M.,Noncommercial Partnership Scientific Instruments
International Journal of Modern Physics A | Year: 2012

We analyze recent experiments on measuring the thermal Casimir force with account of possible background effects. Special attention is paid to the validity of the proximity force approximation (PFA) used in the comparison between the experimental data and computational results in experiments employing a sphere-plate geometry. The PFA results are compared with the exact results where they are available. The possibility to use fitting procedures in theory-experiment comparison is discussed. On this basis we reconsider experiments exploiting spherical lenses of centimeter-size radii. © 2012 World Scientific Publishing Company.
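The proximity force approximation discussed here relates the sphere-plate force to the parallel-plate energy per unit area, F(d) ≈ 2πR·E_pp(d). A sketch for ideal mirrors at zero temperature (illustrative numbers only; real theory-experiment comparison requires conductivity, roughness and thermal corrections):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_energy_per_area(d):
    """Ideal-mirror Casimir energy per unit area of parallel plates at separation d."""
    return -math.pi**2 * HBAR * C / (720.0 * d**3)

def pfa_sphere_plate_force(R, d):
    """PFA force between a sphere of radius R and a plate at separation d."""
    return 2.0 * math.pi * R * casimir_energy_per_area(d)

# Centimeter-size lens (R = 15 cm) at d = 1 micrometre separation.
F = pfa_sphere_plate_force(0.15, 1e-6)  # negative sign = attraction
```

The 1/d³ dependence of E_pp makes the PFA force scale as R/d³, which is why centimeter-size lens radii at micrometre separations yield readily measurable sub-nanonewton forces.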


Mueller M.,University of Leipzig | Johnston D.A.,Heriot - Watt University | Janke W.,University of Leipzig
Nuclear Physics B | Year: 2014

The three-dimensional purely plaquette gonihedric Ising model and its dual are investigated to resolve inconsistencies in the literature for the values of the inverse transition temperature of the very strong temperature-driven first-order phase transition that is apparent in the system. Multicanonical simulations of this model allow us to measure system configurations that are suppressed by more than 60 orders of magnitude compared to probable states. With the resulting high-precision data, we find excellent agreement with our recently proposed nonstandard finite-size scaling laws for models with a macroscopic degeneracy of the low-temperature phase by challenging the prefactors numerically. We find an overall consistent inverse transition temperature of β∞=0.551334(8) from the simulations of the original model both with periodic and fixed boundary conditions, and the dual model with periodic boundary conditions. For the original model with periodic boundary conditions, we obtain the first reliable estimate of the interface tension σ=0.12037(18), using the statistics of suppressed configurations. © 2014 The Authors.


Mueller M.,University of Leipzig | Janke W.,University of Leipzig | Johnston D.A.,Heriot - Watt University
Physical Review Letters | Year: 2014

We note that the standard inverse system volume scaling for finite-size corrections at a first-order phase transition (i.e., 1/L3 for an L×L×L lattice in 3D) is transmuted to 1/L2 scaling if there is an exponential low-temperature phase degeneracy. The gonihedric Ising model which has a four-spin interaction, plaquette Hamiltonian provides an exemplar of just such a system. We use multicanonical simulations of this model to generate high-precision data which provide strong confirmation of the nonstandard finite-size scaling law. The dual to the gonihedric model, which is an anisotropically coupled Ashkin-Teller model, has a similar degeneracy and also displays the nonstandard scaling. © 2014 American Physical Society.
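A practical consequence of the transmuted scaling law is that pseudo-transition couplings β(L) should extrapolate linearly in 1/L² rather than 1/L³. A minimal sketch of such an extrapolation on synthetic, noiseless data (values generated here for illustration, not simulation results):

```python
import numpy as np

# Synthetic pseudo-transition couplings obeying beta(L) = beta_inf + a / L^2.
L = np.array([8.0, 12.0, 16.0, 20.0, 24.0])
beta_inf_true, a_true = 0.551334, 0.9        # hypothetical parameters
beta_L = beta_inf_true + a_true / L**2

# Linear least-squares fit of beta(L) against 1/L^2 extrapolates to L -> infinity.
X = np.column_stack([np.ones_like(L), 1.0 / L**2])
(beta_inf_fit, a_fit), *_ = np.linalg.lstsq(X, beta_L, rcond=None)
```

Fitting the same data against 1/L³ instead would leave a systematic trend in the residuals, which is one way such a nonstandard scaling law shows up in practice.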


Chicherin D.,Saint Petersburg State University | Derkachov S.,Russian Academy of Sciences | Kirschner R.,University of Leipzig
Nuclear Physics B | Year: 2014

Yangian symmetry of amplitudes in N=4 super-Yang-Mills theory is formulated in terms of eigenvalue relations for monodromy matrix operators. The Quantum Inverse Scattering Method provides the appropriate tools to treat the extended symmetry and to recover as its consequences many known features like cyclic and inversion symmetry, BCFW recursion, Inverse Soft Limit construction, Grassmannian integral representation, R-invariants and on-shell diagram approach. © 2014 The Authors.


Koch J.,Schoen Klinik Bad Arolsen | Exner C.,University of Leipzig
Psychiatry Research | Year: 2015

While initial studies supported the hypothesis that cognitive characteristics that capture cognitive resources act as underlying mechanisms of memory deficits in obsessive-compulsive disorder (OCD), the influence of those characteristics on selective attention has not yet been studied. In this study, we examined the influence of cognitive self-consciousness (CSC), rumination and worrying on selective-attention performance in OCD and compared the results to a depressive and a healthy control group. We found that 36 OCD and 36 depressive participants were impaired in selective attention in comparison to 36 healthy controls. In all groups, hierarchical regression analyses demonstrated that age, intelligence and years in school significantly predicted performance in selective attention. Only in OCD, however, was the predictive power of the regression model improved when CSC, rumination and worrying were added as predictor variables. In contrast, in none of the three groups did the predictive power improve when indicators of severity of obsessive-compulsive (OC) and depressive symptoms and trait anxiety were introduced as predictor variables. Thus, our results support the assumption that mental characteristics that bind cognitive resources play an important role in the understanding of selective attention deficits in OCD and that this mechanism is especially relevant for OCD. © 2014 Elsevier Ireland Ltd.
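Hierarchical regression of the kind described adds predictor blocks stepwise and checks whether the explained variance improves. A sketch on simulated data (all variables and effect sizes are invented, not the study data), using plain least squares:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Step-1 predictors: age, intelligence, years in school (simulated).
age = rng.normal(40.0, 10.0, n)
iq = rng.normal(100.0, 15.0, n)
school = rng.normal(12.0, 2.0, n)
# Step-2 predictor: cognitive self-consciousness score (simulated).
csc = rng.normal(0.0, 1.0, n)
# Simulated selective-attention score with a genuine CSC effect built in.
y = 50.0 - 0.2 * age + 0.1 * iq + 0.5 * school + 2.0 * csc + rng.normal(0.0, 1.0, n)

def r_squared(predictors, y):
    """In-sample R^2 of an ordinary least-squares fit with intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_step1 = r_squared([age, iq, school], y)        # demographic block only
r2_step2 = r_squared([age, iq, school, csc], y)   # plus CSC
```

In-sample R² can never decrease when a predictor is added, so real studies judge the step-2 improvement with an F-test on the R² change rather than the raw difference shown here.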


Otto S.,University of Leipzig | Trautmann T.,German Aerospace Center | Wendisch M.,University of Leipzig
Atmospheric Chemistry and Physics | Year: 2011

Realistic size equivalence and shape of Saharan mineral dust particles are derived from in-situ particle, lidar and sun photometer measurements during SAMUM-1 in Morocco (19 May 2006), based on measured size- and altitude-resolved axis ratio distributions of assumed spheroidal model particles. The data were applied in simulations of optical properties, radiative effects, forcing and heating to quantify the realistic impact of particle non-sphericity. It turned out that volume-to-surface equivalent spheroids with prolate shape are most realistic: particle non-sphericity only slightly affects single scattering albedo and asymmetry parameter but may enhance the extinction coefficient by up to 10%. At the bottom of the atmosphere (BOA) the Saharan mineral dust always leads to a loss of solar radiation, while the sign of the forcing at the top of the atmosphere (TOA) depends on surface albedo: solar cooling/warming over a mean ocean/land surface. In the thermal spectral range the dust inhibits the emission of radiation to space and warms the BOA. The most realistic case of particle non-sphericity causes changes of total (solar plus thermal) forcing by 55/5% at the TOA over ocean/land and 15% at the BOA over both land and ocean, and enhances total radiative heating within the dust plume by up to 20%. Large dust particles significantly contribute to all the radiative effects reported. They strongly enhance the absorbing properties and forward scattering in the solar spectral range and predominantly increase, e.g., the total TOA forcing of the dust over land. © 2011 Author(s).


Vogel A.,Friedrich - Schiller University of Jena | Scherer-Lorenzen M.,Albert Ludwigs University of Freiburg | Weigelt A.,University of Leipzig
PLoS ONE | Year: 2012

The degree to which biodiversity may promote the stability of grasslands in the light of climatic variability, such as prolonged summer drought, has attracted considerable interest. Studies so far yielded inconsistent results and in addition, the effect of different grassland management practices on their response to drought remains an open question. We experimentally combined the manipulation of prolonged summer drought (sheltered vs. unsheltered sites), plant species loss (6 levels of 60 down to 1 species) and management intensity (4 levels varying in mowing frequency and amount of fertilizer application). Stability was measured as resistance and resilience of aboveground biomass production in grasslands against decreased summer precipitation, where resistance is the difference between drought treatments directly after drought induction and resilience is the difference between drought treatments in spring of the following year. We hypothesized that (i) management intensification amplifies biomass decrease under drought, (ii) resistance decreases with increasing species richness and with management intensification and (iii) resilience increases with increasing species richness and with management intensification. We found that resistance and resilience of grasslands to summer drought are highly dependent on management intensity and partly on species richness. Frequent mowing reduced the resistance of grasslands against drought and increasing species richness decreased resistance in one of our two study years. Resilience was positively related to species richness only under the highest management treatment. We conclude that low mowing frequency is more important for high resistance against drought than species richness. Nevertheless, species richness increased aboveground productivity in all management treatments both under drought and ambient conditions and should therefore be maintained under future climates. © 2012 Vogel et al.
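Resistance and resilience as defined above are simple contrasts between drought-treatment and ambient plots at two time points. A toy calculation with invented biomass values (g/m², not the experimental data):

```python
# Aboveground biomass (g/m^2); hypothetical numbers for illustration.
ambient_after_drought, drought_after_drought = 420.0, 310.0
ambient_next_spring, drought_next_spring = 380.0, 365.0

# Resistance: treatment difference directly after drought induction
# (closer to zero = more resistant).
resistance = drought_after_drought - ambient_after_drought

# Resilience: treatment difference in spring of the following year
# (closer to zero = better recovery).
resilience = drought_next_spring - ambient_next_spring
```

In this toy example the drought plots lose 110 g/m² immediately but are only 15 g/m² behind by the following spring: low resistance, high resilience.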


Mann M.,Albert Ludwigs University of Freiburg | Klemm K.,University of Leipzig
Physical Review E - Statistical, Nonlinear, and Soft Matter Physics | Year: 2011

Many physical and chemical processes, such as folding of biopolymers, are best described as dynamics on large combinatorial energy landscapes. A concise approximate description of the dynamics is obtained by partitioning the microstates of the landscape into macrostates. Since most landscapes of interest are not tractable analytically, the probabilities of transitions between macrostates need to be extracted numerically from the microscopic ones, typically by full enumeration of the state space or approximations using the Arrhenius law. Here, we propose to approximate transition probabilities by a Markov chain Monte Carlo method. For landscapes of the number partitioning problem and an RNA switch molecule, we show that the method allows for accurate probability estimates with significantly reduced computational cost. © 2011 American Physical Society.
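The idea of estimating macrostate transition probabilities by sampling rather than full enumeration can be illustrated on a toy landscape. Below, a plain Metropolis walk (a generic stand-in for the paper's method; the landscape and parameters are invented) on a 1-D double well, counting per-step transitions from macrostate A to B:

```python
import math
import random

random.seed(0)

E = [0, 1, 2, 3, 4, 4, 3, 2, 1, 0]  # toy double-well energies over 10 microstates
BETA = 1.0                           # inverse temperature

def macro(x):
    """Partition the microstates into two macrostates around the barrier."""
    return "A" if x < 5 else "B"

def metropolis_step(x):
    y = x + random.choice([-1, 1])
    if not 0 <= y < len(E):
        return x                     # reflecting boundaries
    dE = E[y] - E[x]
    if dE <= 0 or random.random() < math.exp(-BETA * dE):
        return y                     # accept the move
    return x                         # reject, stay put

# Estimate the per-step probability of an A -> B macrostate transition.
x, steps_in_a, a_to_b = 0, 0, 0
for _ in range(200_000):
    m = macro(x)
    x_next = metropolis_step(x)
    if m == "A":
        steps_in_a += 1
        if macro(x_next) == "B":
            a_to_b += 1
    x = x_next

p_ab = a_to_b / steps_in_a
```

The estimate is small because crossings require visiting the barrier top, whose Boltzmann weight is exponentially suppressed; the payoff of the sampling approach is that only visited microstates are ever evaluated, not the whole state space.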


Fedorova M.,University of Leipzig | Kuleva N.,Saint Petersburg State University | Hoffmann R.,University of Leipzig
Journal of Proteome Research | Year: 2010

Reactive oxygen species (ROS) play important roles in cellular signaling but can also modify and often functionally inactivate other biomolecules. Thus, cells have developed effective enzymatic and nonenzymatic strategies to scavenge ROS. However, under oxidative stress, ROS production can overwhelm the scavenging systems, increasing the levels of functionally impaired proteins. A major class of irreversible oxidative modifications is carbonylation, i.e. the introduction of reactive carbonyl groups. In this investigation, we have studied the production and clearance rates for skeletal muscle proteins in a rat model of acute oxidative stress over a time period of 24 h using a gel-based proteomics approach. Optimized ELISA and Western blots with 10-fold improved sensitivities showed that the carbonylation level was stable at 4 nmol per mg protein for 3 h following ROS induction. The carbonylation level then increased 3-fold over 6 h and then remained stable. In total, the oxidative stress changed the steady state levels of 20 proteins and resulted in the carbonylation of 38 skeletal muscle proteins. Carbonylation of these proteins followed diverse kinetics, with some proteins being highly carbonylated very quickly, whereas others peaked in the 9 h sample or continued to increase up to 24 h after oxidative stress was induced. © 2010 American Chemical Society.


Fedorova M.,University of Leipzig | Kuleva N.,Saint Petersburg State University | Hoffmann R.,University of Leipzig
Journal of Proteome Research | Year: 2010

Increased levels of reactive oxygen species (ROS) cause oxidative stress and are believed to play a key role in the development of age-related diseases and mammalian aging in general by oxidizing proteins, lipids, and DNA. In this study, we have investigated the effects of ROS on actin in an established rat model of acute oxidative stress using short-term X-ray irradiation. Relative to the control, the actin functions studied in vitro were reduced for (i) actin polymerization to a minimum of 33% after 9 h and (ii) actin-activated Mg²⁺-ATPase activity of myosin to 55% after 9 h. At 24 h, the activities had partially recovered to 64 and 80% of the control sample, respectively. The underlying oxidative modifications were also studied at the molecular level. The content of reactive carbonyl groups increased 4-fold within the studied 24 h period. Among the five cysteine residues of actin, Cys239 and Cys259 were oxidized to sulfenic (Cys-SOH), sulfinic (Cys-SO₂H), or sulfonic (Cys-SO₃H) acids by increasing amounts over the time periods studied. The content of methionine sulfoxides also increased for 15 of the 16 methionine residues, with Met44, Met47, and Met355 having the highest sulfoxide contents; Met82 was also further oxidized to the sulfone. Among the four tryptophan residues present in actin, only Trp79 and Trp86 appeared to undergo oxidation. The relative contents of hydroxy-tryptophan, N-formyl-kynurenine, and kynurenine increased after irradiation, reaching a maximum in the 9 h sample. © 2010 American Chemical Society.


Grant
Agency: European Commission | Branch: FP7 | Program: CSA | Phase: ICT-2013.9.1 | Award Amount: 1.34M | Year: 2013

Young researchers working in future and emerging technologies (FET) are critical to the success of strategically important areas of science and technology in Europe. However, to realise their full potential as individuals and as a collective resource, they need to optimise their capacity and capability to generate and realise breakthrough ideas and research. The aim of the EYE project is to build a lasting European community of high-potential young researchers (YRs) that are able to generate radical new ideas and build research collaborations in interdisciplinary areas. EYE will help them to develop their research potential and their ability to develop new curricula for FET. Specifically, EYE focuses on (a) S&T ideas of a higher-risk nature that can be generated through ideation and brainstorming, (b) collaboration between YRs across various disciplines and from different parts of Europe, and (c) the YRs themselves, by developing their leadership potential through networking and training in the specific methods used in European collaborative projects.

EYE will achieve its goal by implementing an integrated programme of complementary regional and European events:

(a) Lab Surfing workshops in 6 regions of Europe that inform YRs about the most advanced FET research across various disciplines, brainstorm future paradigms and enhance YRs' scientific administration skills;

(b) Europe-wide Blue Sky Conferences for YRs in 38 countries in Europe (EU member states and selected countries associated with FP7) to enable further consolidation of ideas at a European level and wider networking with academia, industry and policy makers;

(c) Science Incubator Summer Schools to assist selected YRs in bringing their ideas to a position where they might form the basis of future FET project proposals.

EYE will conduct two rounds of these events over 2 years in order to reach a wider group of YRs in Europe and to ensure the sustainability of the EYE action after the end of the project. The EYE activities are supported with an online platform (NOVA - Networking for Outstanding Visionaries & Academics) which serves as an operational tool to prepare the events and as a professional platform for ideation, networking, collaboration and discussion amongst YRs.

The project's thematic scope is broadly defined by 9 multidisciplinary research areas identified in the recent public consultation on future FET, as well as Horizon 2020 societal challenges. In particular, EYE will seek areas where Information and Communication Technologies (ICT) can bring new interdisciplinary research opportunities and will support both curiosity- and agenda-driven research. The project brings together a broad representation of the multidisciplinary research community in Europe with 11 participants from 9 countries including 7 universities, 2 strong research institutions, and 2 SMEs.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-IP | Phase: KBBE.2013.2.1-01 | Award Amount: 11.45M | Year: 2014

MooDFOOD is a Multi-country cOllaborative project on the rOle of Diet, Food-related behaviour, and Obesity in the prevention of Depression. Depression is one of the most prevalent, severe and disabling disorders in the EU and places a heavy burden on individuals and families. A large proportion of the EU population is overweight which increases depression risk. Improving food-related behaviour and nutrient status offer opportunities to prevent depression, specifically for people prone to being overweight. The MooDFOOD consortium combines expertise in nutrition, consumer behaviour, psychiatry and preventive psychology and uses a unique integrative approach. Existing high quality data of longitudinal prospective European cohort studies will be combined with new data from surveys, short-term experiments and a long-term preventive intervention study. This approach will provide insight in the causality of the link between diet and depression and underlying pathways, and will identify which modifications related to depression lead to beneficial dietary changes and lower the environmental burden of the diet. Knowledge on all these aspects will be integrated and used to develop novel nutritional strategies to prevent depression. The MooDFOOD consortium aims 1) to gain a better understanding of the psychological, lifestyle and environmental pathways underlying the multi-faceted, bidirectional links of food intake, nutrient status, food-related behaviour and obesity with depression and 2) to develop and disseminate innovative evidence-based, feasible, effective and sustainable nutritional strategies for the prevention of clinical depression. In close collaboration with stakeholders and experts MooDFOOD will transform these nutritional strategies into guidelines and practical tools to guide policy at EU- and Member State levels. Promotion through extensive European networks will lower the risk of depression and contribute to overall health of all EU citizens.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: ICT-35-2016 | Award Amount: 532.88K | Year: 2017

In this project, we propose an in-depth empirical investigation of privacy in the sharing economy. We will investigate three challenges in particular: privacy, participation/exclusion and power. First, sharing services come with compounded privacy risks extending beyond the informational into the physical realm. In addition, online sharing services entail both institutional and social privacy threats. Second, sharing services might exclude certain population segments and increase social inequality by systematically disadvantaging and discriminating against underprivileged groups (those living in remote areas, unemployed, impoverished, disabled, disconnected, elderly) and favoring privileged individuals. Third and finally, sharing services may disempower users by detaching them from their possessions, by relying on opaque algorithms and by creating new forms of distinction such as arbitrary rating systems, where manipulation is easy and possibilities to challenge the ratings are limited. We research the topic from a multi-disciplinary social science perspective and include a variety of methodological approaches as well as research contexts with our collaboration partners. To quantify these findings, we follow up with quantitative surveys that give us solid evidence on how power, privacy and participation are at play in sharing. By aggregating our findings into design principles for sharing platforms, we intend to bring the design of sharing platforms to a new level of maturity by supporting their user-centered, responsible and fair design.


Stamov D.R.,Karlsruhe Institute of Technology | Pompe T.,University of Leipzig
Soft Matter | Year: 2012

Collagen I is one of the most abundant molecules in vertebrates constituting major parts of the fibrillar extracellular matrix (ECM), thus providing structural integrity and mechanical resilience. It has therefore become an almost ubiquitous biomolecule to use in contemporary biomimetic cell culture scaffolds and in tissue engineering scenarios where new functions for biomedical applications are sought. As collagen I easily self-assembles into fibrillar structures, a number of approaches aim to integrate new functionalities by varying the compositional complexity of the developed scaffolds. Such composite matrices make use of the abundant knowledge about the fibrillar collagen I structure and its binding sites for other ECM molecules. This review gives an overview of the reconstitution of collagen I scaffolds by the implementation of other organic biomolecules. We focus on the self-assembly and structure of the collagen I fibrils affected by the interaction with cofactors and comment on mechanics and biomedical use of such composite scaffolds. © The Royal Society of Chemistry 2012.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-CSA-Infra | Phase: INFRA-2012-1.1.13. | Award Amount: 7.04M | Year: 2014

EUFAR aims at providing researchers with Open Access to the airborne facilities most suited to their needs. EUFAR thus allocates Transnational Access to 21 installations, develops a culture of co-operation between scientists and operators, and organizes training courses to attract young scientists to airborne research. To improve the quality of the service, EUFAR supports the experts on airborne measurements, constitutes a central database and develops standards and protocols for this database to be fully interoperable with Earth observation databases. EUFAR supports two Joint Research Activities dedicated to (i) the development of methodologies and tools for the integrated use of airborne hyperspectral imaging data and airborne laser scanning data and (ii) the development of robust calibration systems for the core gas-phase chemical measurements currently made on board research aircraft. To optimise the use and development of airborne research infrastructure, the EUFAR Strategy and European Integration will (i) constitute a Strategic Advisory Committee (SAC) in which representatives of research institutions will define scientific priorities, jointly support Open Access with in-kind contributions to the operation and the harmonized development of the European fleet, and (ii) constitute the EUFAR sustainable legal structure. Following the Innovation Union objectives, EUFAR will invite representatives of end-user industries to participate in the SAC and constitute a Technology Transfer Office to support both market-pull and technology-push driven innovation. Workshops will be organized as Innovation Conventions where EUFAR experts and SMEs will closely interact and develop partnerships to transfer airborne research instruments, methodologies and software into new products.


The main objective of this research proposal is to identify and elaborate those characteristics of ENM that determine their biological hazard potential. This potential includes the ability of ENM to induce damage at the cellular, tissue, or organism level by interacting with cellular structures, leading to impairment of key cellular functions. These adverse effects may be mediated by ENM-induced alterations in gene expression and translation, but may also involve epigenetic transformation of genetic functions. We believe that it will be possible to create a set of biomarkers of ENM toxicity that are relevant in assessing and predicting the safety and toxicity of ENM across species. The ENM-organism interaction is complex and depends not simply on the composition of the ENM core, but particularly on its physico-chemical properties. In fact, important physico-chemical properties are largely governed by the surface properties of the ENM. All of these factors determine the binding of different biomolecules on the surface of the ENM, i.e. the formation of a corona around the ENM core. Thus, any positive or negative biological effect of ENM in organisms may be dynamically modulated by the bio-molecule corona associated with or substituted into the ENM surface rather than by the ENM on its own. The bio-molecule corona of seemingly identical ENM cores may undergo dynamic changes during their passage through different biological compartments; in other words, their biological effects are governed by this complex surface chemistry. We propose that understanding the fundamental characteristics of ENM underpinning their biological effects will provide a sound foundation with which to classify ENM according to their safety. Therefore, the overarching objective of this research is to provide a means to develop a safety classification of ENM based on an understanding of their interactions with living organisms at the molecular, cellular, and organism levels, based on their material characteristics.


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.5.3 | Award Amount: 5.46M | Year: 2008

In 2006, over 45,000 European citizens died of cirrhosis of the liver and 44,000 more of liver cancer; in the same year, 48,700 new liver cancer cases were diagnosed. Surgical procedures remain the option offering the highest success rate against such pathologies. Regrettably, surgery is not performed frequently, due to several limitations. Indeed, eligibility for liver surgery is based on the minimum safe liver volume remaining after resection (standardized FLR), but this minimum value varies over time and from one patient to another according to the biological and mechanical properties of the liver. Since 1996, a large set of preoperative planning software has been developed, but all of it provides only the volume of the liver before and after resection. However interesting, this limited information is not sufficient to improve the rate of surgical eligibility. PASSPORT for Liver Surgery aims at overcoming these limitations by offering patient-specific modelling that combines anatomical, mechanical, appearance and biological preoperative information in a unified model of the patient. This first complete virtual liver will be developed in an Open Source framework allowing vertical integration of biomedical data, from macroscopic to microscopic patient information. From these models, dynamic liver modelling will provide the patient-specific minimum safe standardized FLR in an educational and preoperative planning simulator, allowing prediction of the feasibility of the procedure and of the surgeon's ability to perform it. Thus, any patient will be able to know the risk level of a proposed therapy. Finally, we expect to increase the rate of surgical treatment so as to save patients with liver pathologies. To reach these goals, PASSPORT comprises a high-level partnership between internationally renowned surgical teams, leading European research teams in surgical simulation and an international leading company in surgical instrumentation.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-2.1.2-5 | Award Amount: 3.99M | Year: 2008

The goal of the CANCERSYS project is to establish a multi-scale model for two major signalling pathways involved in the formation of hepatocellular carcinoma, the beta-catenin and ras signalling pathways. Integrative studies linking measurements in primary hepatocytes with effects at the organ level will address the impact of these signalling networks on proliferation, tissue organization and the formation of hepatocellular carcinoma. In a close collaboration of scientists from theoretical fields and the life sciences, our approach will combine dynamic modelling of signalling networks with spatio-temporal modelling of the liver microarchitecture. For this purpose, dynamic models of the beta-catenin and ras core modules and their interactions will be integrated into a single-cell-based three-dimensional model of the liver lobule. This model will be used to predict the impact of beta-catenin and ras activation on tissue organization, starting with early events, such as enhanced proliferation and micromotility of single cells, followed by the formation of nodules and finally of dedifferentiated hepatocarcinomas. Predictions obtained by the model will be validated in inducible Apclox/lox and Ha-rasQ61R mouse strains, which have already been established by members of our consortium and allow induction of hepatocellular carcinoma. In addition, the model predictions of the combined influence of active beta-catenin and ras signalling will be validated using a mouse strain that allows induction of both pathways in hepatocytes. In an iterative process, the model will be validated and adjusted to the in vivo situation. The model aims at the identification of systems properties of beta-catenin and ras signalling exploited during carcinogenesis and will foster the prediction of strategies for effective intervention, thereby facilitating the design of novel therapeutic strategies to combat hepatocellular carcinoma.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: KBBE-2009-3-5-01 | Award Amount: 4.52M | Year: 2010

MAGICPAH aims to explore, understand and exploit the catalytic activities of microbial communities involved in the degradation of persistent PAHs. It will integrate (meta-)genomic studies with in-situ activity assessment based on stable isotope probing, particularly in complex matrices of different terrestrial and marine environments. PAH degradation under various conditions of bioavailability will be assessed so as to improve the rational exploitation of the catalytic properties of bacteria for the treatment and prevention of PAH pollution. We will generate a knowledge base not only on the microbial catabolome for biodegradation of PAHs in various impacted environmental settings, based on genome gazing and the retrieval and characterization of specific enzymes, but also on the systems-related bioavailability of contaminant mixtures. MAGICPAH takes into account the tremendous undiscovered metagenomic resources through direct retrieval from genome/metagenome libraries and subsequent characterization of enzymes through activity screens. These screens will include a high-end functional small-molecule fluorescence screening platform and will allow us to directly access novel metabolic reactions, followed by their rational exploitation for biocatalysis and the re-construction of biodegradation networks. Results from (meta-)genomic approaches will be correlated with microbial in situ activity assessments, specifically dedicated to identifying key players and key reactions involved in anaerobic PAH metabolism. Key processes for PAH metabolism, particularly in marine and composting environments, and the kinetics of aerobic PAH degradation under different conditions of bioavailability will be assessed in model systems, the rational manipulation of which will allow us to deduce correlations between system performance and genomic blueprint. The results will be used to improve treatments of PAH-contaminated sites.


Grant
Agency: European Commission | Branch: FP7 | Program: BSG-SME | Phase: SME-1 | Award Amount: 1.47M | Year: 2008

To achieve energy generation from sustainable resources, the production of biomass and the hydrogen economy are now pursued worldwide with tremendous effort. The processes for producing and cleaning bio and hydrogen gas mixtures will be performed under elevated or high pressures. The design, operation and control of those processes will generate an increasing demand for in situ high-pressure concentration measurement. Up to now, no accurate instruments or even measuring methods have been available for this purpose. To perform high-pressure concentration measurements, the usual approach is to expand the fluid mixtures down to ambient or very low pressure, which has many disadvantages and may even be impossible in certain cases. The goal of the ProBio-HySens project is the development and combination of sensors for measuring optical, thermophysical and electromagnetic properties, to achieve in situ high-pressure concentration measurement in bio and hydrogen gas mixtures. To reach this main goal, the development of new high-pressure in situ sensors and of a high-pressure Gas Mixture Generating and Sensor Calibration apparatus (GMG-SC) is required. This instrumentation will allow, for the first time, the analysis of multi-component gas mixtures in situ under process conditions up to 200 °C and 20 MPa, possibly even 50 MPa. It avoids the devices for sampling, pressure reduction and control which have had to be used up to now, so that no blocking of valves and tubing by condensation, freezing or precipitation will occur any more. Moreover, it avoids sophisticated, time- and cost-consuming analyzers requiring intermediately high calibration efforts. The new sensor modules will be robust and reliable and will range from very economical versions to high-end solutions. They will be specially tailored for bio and hydrogen gas mixtures while also being applicable to all other kinds of fluid mixtures, including supercritical fluids, if a pressure range up to 50 MPa is reached.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EURO-3-2014 | Award Amount: 2.50M | Year: 2015

The current crisis has indirectly contributed to questioning the efficiency of financial markets and democratic institutions at the European and national levels. Recent data from the Eurobarometer (July 2013) show a continuous decrease in the trust that citizens of the European Union place in national governments and parliaments, which has fallen by more than 25 points in the last six years (European Commission, 2013). This situation is jeopardizing the European project, while at the same time a lively public debate about the meaning of European identity is taking place across Europe. Several social scientists have argued that the social and economic inequalities of the new global order are contributing to civil social reactions, based on solidarity, that aim to achieve a better society for all (Touraine, 2007; Wright, 2010). This project aims to analyze in depth the acts of solidarity being developed across Europe, the extent to which they respond to dialogic and inclusive processes, the related outcomes and the policy developments. The project starts from previous findings on successful actions which are combating the crisis by creating employment or improving access to health through acts of solidarity. These acts are thus contributing to the construction of more inclusive and prosperous societies, acting at both the macro level (social inequalities) and the micro level (psychological wellbeing). In this regard, the research will identify common elements among these acts in order to examine their transferability to different contexts. To meet this objective, the effects of these actions in five social areas will be studied in depth: housing, education, employment, engagement and health. Simultaneously, special attention will be paid to social investment policies which are supporting these initiatives.


News Article | March 2, 2017
Site: www.eurekalert.org

A core set of genes involved in the responses of honey bees to multiple diseases caused by viruses and parasites has been identified by an international team of researchers. The findings provide a better-defined starting point for future studies of honey-bee health, and may help scientists and beekeepers breed honey bees that are more resilient to stress. "In the past decade, honey-bee populations have experienced severe and persistent losses across the Northern Hemisphere, mainly due to the effects of pathogens, such as fungi and viruses," said Vincent Doublet, postdoctoral research fellow, University of Exeter. "The genes that we identified offer new possibilities for the generation of honey-bee stocks that are resistant to these pathogens." According to the researchers, recent advances in DNA sequencing have prompted numerous investigations of the genes involved in honey-bee responses to pathogens. Yet, until now, this vast quantity of data has been too cumbersome and idiosyncratic to reveal overarching patterns in honey-bee immunity. "While many studies have used genomic approaches to understand how bees respond to viruses and parasites, it has been difficult to compare across these studies to find the core genes and pathways that help the bee fight off stressors," said Distinguished Professor of Entomology Christina Grozinger, Penn State. "Our team created a new bioinformatics tool that has enabled us to integrate information from 19 different genomic datasets to identify the key genes involved in honey bees' response to diseases." Specifically, the team of 28 researchers, representing eight countries, created a new statistical technique, called directed rank-product analysis. The technique allowed them to identify the genes that were expressed similarly across the 19 datasets, rather than just the genes that were expressed more than others within a dataset. 
The scientists found that these similarly expressed genes included those that encode proteins responsible for the response to tissue damage by pathogens, and those that encode enzymes involved in the metabolism of carbohydrates from food, among many others. A decrease in carbohydrate metabolism, they suggested, may illustrate the cost of the infection on the organism. The researchers report their findings in today's (Mar. 2) issue of BMC Genomics. "Honey bees were thought to respond to different disease organisms in entirely different ways, but we have learned that they mostly rely on a core set of genes that they turn on or off in response to any major pathogenic challenge," said Robert Paxton, professor of zoology, German Centre for Integrative Biodiversity Research. "We can now explore the physiological mechanisms by which pathogens overcome their honey-bee hosts, and how honey bees can fight back against those pathogens." The implications of the findings are not limited to honey bees. The team found that the core genes are part of conserved pathways -- meaning they have been maintained throughout the course of evolution among insects and therefore are shared by other insects. According to Doublet, this means that the genes provide important knowledge for understanding pathogen interactions with other insects, such as bumble bees, and for using pathogens to control insect pests, such as aphids and certain moths. "This analysis provides unprecedented insight into the mechanisms that underpin the interactions between insects and their pathogens," said Doublet. "With this analysis, we generated a list of genes that will likely be an important source for future functional studies, for breeding more resilient honey-bee stocks and for controlling emerging bee diseases." This research was supported by iDiv, the German Center for Integrative Biodiversity Research, located in Leipzig, Germany. 
Other authors on the paper include Yvonne Poeschl, German Centre for Integrative Biodiversity Research; Andreas Gogol-Döring, Technische Hochschule Mittelhessen; Cédric Alaux, INRA; Desiderato Annoscia, Università degli Studi di Udine; Christian Aurori, University of Agricultural Sciences and Veterinary Medicine of Cluj-Napoca; Seth Barribeau, University of Liverpool; Oscar Bedoya-Reina, University of Edinburgh; Mark Brown, Royal Holloway University of London; James Bull, Swansea University; Michelle Flenniken, Montana State University; David Galbraith, Penn State; Elke Genersch, Institute for Bee Research of Hohen Neuendorf; Sebastian Gisder, Institute for Bee Research of Hohen Neuendorf; Ivo Grosse, Martin Luther University Halle-Wittenberg; Holly Holt, University of Minnesota; Dan Hultmark, Umeå University; H. Michael G. Lattorff, International Centre of Insect Physiology and Ecology; Yves Le Conte, INRA; Fabio Manfredini, Royal Holloway University of London; Dino McMahon, Freie Universität Berlin; Robin Moritz, Martin Luther University Halle-Wittenberg; Francesco Nazzi, Università degli Studi di Udine; Elina Niño, University of California, Davis; Katja Nowick, University of Leipzig; and Ronald van Rij, Radboud University.
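The rank-product idea behind the team's method can be sketched in a few lines. The snippet below implements the classic rank-product statistic: genes are ranked within each dataset, and a small product of ranks flags genes consistently near the top across datasets. The paper's "directed" variant additionally accounts for the direction of regulation, which this toy version omits, and all data below are invented for illustration.

```python
import math

def rank_product(expression_by_dataset):
    """Compute the rank-product statistic for each gene.

    expression_by_dataset: list of dicts mapping gene -> expression change.
    Genes are ranked within each dataset (rank 1 = strongest up-regulation);
    a small rank product marks genes consistently near the top everywhere.
    """
    genes = set.intersection(*(set(d) for d in expression_by_dataset))
    ranks = {g: [] for g in genes}
    for dataset in expression_by_dataset:
        ordered = sorted(genes, key=lambda g: dataset[g], reverse=True)
        for r, g in enumerate(ordered, start=1):
            ranks[g].append(r)
    k = len(expression_by_dataset)
    # Geometric mean of the ranks across the k datasets.
    return {g: math.prod(rs) ** (1.0 / k) for g, rs in ranks.items()}

# Toy example: three "datasets", gene A consistently most up-regulated.
data = [
    {"A": 3.0, "B": 1.0, "C": -0.5},
    {"A": 2.5, "B": -0.2, "C": 1.2},
    {"A": 4.1, "B": 0.3, "C": 0.1},
]
rp = rank_product(data)
```

Gene A is ranked first in every dataset, so its rank product is minimal, which is the behaviour such an analysis exploits to pick out a core response shared across heterogeneous experiments.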


Asorey M.,University of Zaragoza | Munoz-Castaneda J.M.,University of Leipzig
Nuclear Physics B | Year: 2013

The infrared behaviour of quantum field theories confined in bounded domains is strongly dependent on the shape and structure of space boundaries. The most significant physical effect arises in the behaviour of the vacuum energy. The Casimir energy can be attractive or repulsive depending on the nature of the boundary. We calculate the vacuum energy for a massless scalar field confined between two homogeneous parallel plates with the most general type of boundary conditions, depending on four parameters. The analysis provides a powerful method to identify which boundary conditions generate attractive or repulsive Casimir forces between the plates. At the interface between the two regimes we find a very interesting family of boundary conditions which do not induce any type of Casimir force. We also show that the attractive regime holds far beyond the identical boundary conditions for the two plates required by the Kenneth-Klich theorem, and that the strongest attractive Casimir force appears for periodic boundary conditions whereas the strongest repulsive Casimir force corresponds to anti-periodic boundary conditions. Most of the analysed boundary conditions are new and some of them can be physically implemented with metamaterials. © 2013 Elsevier B.V.
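As a point of reference for the four-parameter family analysed in the paper, the textbook special case of identical Dirichlet conditions on both plates gives, for a massless scalar field (a standard result, not specific to this work):

```latex
\frac{E(d)}{A} = -\frac{\pi^2 \hbar c}{1440\, d^3},
\qquad
\frac{F(d)}{A} = -\frac{\partial}{\partial d}\,\frac{E(d)}{A}
               = -\frac{\pi^2 \hbar c}{480\, d^4},
```

where $d$ is the plate separation and the negative force per unit area signals attraction. The general boundary conditions studied in the paper interpolate between this attractive case and repulsive ones, passing through the force-free family at the interface.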


Wessel D.L.,Childrens National Medical Center | Berger F.,German Heart Institute Berlin | Li J.S.,Duke Clinical Research Institute | Dahnert I.,University of Leipzig | And 4 more authors.
New England Journal of Medicine | Year: 2013

BACKGROUND: Infants with cyanotic congenital heart disease palliated with placement of a systemic-to-pulmonary-artery shunt are at risk for shunt thrombosis and death. We investigated whether the addition of clopidogrel to conventional therapy reduces mortality from any cause and morbidity related to the shunt. METHODS: In a multicenter, double-blind, event-driven trial, we randomly assigned infants 92 days of age or younger with cyanotic congenital heart disease and a systemic-to-pulmonary-artery shunt to receive clopidogrel at a dose of 0.2 mg per kilogram of body weight per day (467 infants) or placebo (439 infants), in addition to conventional therapy (including aspirin in 87.9% of infants). The primary efficacy end point was a composite of death or heart transplantation, shunt thrombosis, or performance of a cardiac procedure due to an event considered to be thrombotic in nature before 120 days of age. RESULTS: The rate of the composite primary end point did not differ significantly between the clopidogrel group (19.1%) and the placebo group (20.5%) (absolute risk difference, 1.4 percentage points; relative risk reduction with clopidogrel, 11.1%; 95% confidence interval, -19.2 to 33.6; P = 0.43), nor did the rates of the three components of the composite primary end point. There was no significant benefit of clopidogrel treatment in any subgroup, including subgroups defined by shunt type. Clopidogrel recipients and placebo recipients had similar rates of overall bleeding (18.8% and 20.2%, respectively) and severe bleeding (4.1% and 3.4%, respectively). CONCLUSIONS: Clopidogrel therapy in infants with cyanotic congenital heart disease palliated with a systemic-to-pulmonary-artery shunt, most of whom received concomitant aspirin therapy, did not reduce either mortality from any cause or shunt-related morbidity. (Funded by Sanofi-Aventis and Bristol-Myers Squibb; ClinicalTrials.gov number, NCT00396877.) Copyright © 2013 Massachusetts Medical Society.


Nam C.C.W.,Max Planck Institute for Meteorology | Nam C.C.W.,Laboratoire Of Meteorologie Dynamique | Quaas J.,Max Planck Institute for Meteorology | Quaas J.,University of Leipzig
Journal of Climate | Year: 2012

Observations from Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) and CloudSat satellites are used to evaluate clouds and precipitation in the ECHAM5 general circulation model. Active lidar and radar instruments on board CALIPSO and CloudSat allow the vertical distribution of clouds and their optical properties to be studied on a global scale. To evaluate the clouds modeled by ECHAM5 with CALIPSO and CloudSat, the lidar and radar satellite simulators of the Cloud Feedback Model Intercomparison Project's Observation Simulator Package are used. Comparison of ECHAM5 with CALIPSO and CloudSat found that large-scale features resolved by the model, such as the Hadley circulation, are captured well. The lidar simulator demonstrated that ECHAM5 overestimates the amount of high-level clouds, particularly optically thin clouds. High-altitude clouds in ECHAM5 consistently produced greater lidar scattering ratios compared with CALIPSO. Consequently, the lidar signal in ECHAM5 frequently attenuated high in the atmosphere. The large scattering ratios were due to an underestimation of effective ice crystal radii in ECHAM5. Doubling the effective ice crystal radii improved the scattering ratios and frequency of attenuation. Additionally, doubling the effective ice crystal radii improved the detection of ECHAM5's highest-level clouds by the radar simulator, in better agreement with CloudSat. ECHAM5 was also shown to significantly underestimate midlevel clouds and (sub)tropical low-level clouds. The low-level clouds produced were consistently perceived by the lidar simulator as too optically thick. The radar simulator demonstrated that ECHAM5 overestimates the frequency of precipitation, yet underestimates its intensity, compared with CloudSat observations. These findings imply that compensating mechanisms in ECHAM5 balance out the radiative imbalance caused by incorrect optical properties of clouds and consistently large hydrometeors in the atmosphere.
© 2012 American Meteorological Society.


Algergawy A.,University of Leipzig | Nayak R.,Queensland University of Technology | Saake G.,Otto Von Guericke University of Magdeburg
Information Sciences | Year: 2010

Schema matching plays a central role in a myriad of XML-based applications. There has been a growing need for developing high-performance matching systems in order to identify and discover semantic correspondences across XML data. XML schema matching methods face several challenges in the form of definition, adoption, utilization, and combination of element similarity measures. In this paper, we classify, review, and experimentally compare major methods of element similarity measures and their combinations. We aim at presenting a unified view which is useful when developing a new element similarity measure, when implementing an XML schema matching component, when using an XML schema matching system, and when comparing XML schema matching systems. © 2010 Elsevier Inc. All rights reserved.
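The combination of element similarity measures that the paper surveys can be illustrated with a minimal sketch: a weighted linear combination of a name measure and a datatype measure. The weights, the type-compatibility table, and the element representation below are invented for illustration; real matchers draw on many more measures and richer type hierarchies.

```python
from difflib import SequenceMatcher

# Hypothetical type-compatibility scores, keyed by sorted type pairs.
TYPE_SIM = {
    ("string", "string"): 1.0,
    ("decimal", "integer"): 0.8,
    ("integer", "string"): 0.2,
}

def name_similarity(a: str, b: str) -> float:
    """String similarity of element names, normalised to [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def type_similarity(a: str, b: str) -> float:
    """Look up a compatibility score for a pair of XML datatypes."""
    key = tuple(sorted((a, b)))
    return TYPE_SIM.get(key, 1.0 if a == b else 0.0)

def combined_similarity(elem_a, elem_b, w_name=0.7, w_type=0.3):
    """Weighted linear combination of two element similarity measures."""
    return (w_name * name_similarity(elem_a["name"], elem_b["name"])
            + w_type * type_similarity(elem_a["type"], elem_b["type"]))

a = {"name": "CustomerName", "type": "string"}
b = {"name": "custName", "type": "string"}
score = combined_similarity(a, b)
```

Choosing the weights and the aggregation function (linear, max, machine-learned) is exactly the "combination" design decision the survey compares across matching systems.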


Conrad K.,TU Dresden | Roggenbuck D.,GA Generic Assays GmbH | Reinhold D.,Otto Von Guericke University of Magdeburg | Sack U.,University of Leipzig
Autoimmunity Reviews | Year: 2012

Disease associated autoantibodies (AAB) are important biomarkers not only to confirm the diagnosis of the respective systemic autoimmune disease but also to diagnose the disease at very early stages (mono- or oligosymptomatic manifestations) or to diagnose the respective disease without the typical clinical manifestations (atypical forms). A confirmation of the diagnosis in early stages is required, if patients should benefit from early therapeutic intervention. Furthermore, AAB determinations are used for prognostic purposes and for monitoring of disease activity or response to therapy. For the advancement of autoantibody diagnostics in clinical practice the following aspects have to be considered: (i) The search for novel clinically relevant AAB and the identification of autoantigenic targets of AAB broadened the spectrum of autoimmune diagnostics and permit the diagnosis of former idiopathic diseases. (ii) To obtain steady diagnostic variables of clinically relevant AAB, the evaluation studies have to be standardized. (iii) Several special features and novel developments of autoantibody diagnostics make correct interpretation of antibody test results increasingly difficult. (iv) Beside standardization of AAB detection methods and quality management efforts the improvement of autoantibody diagnostics depends on further development of diagnostic algorithms including cost-effective multiparametric analyses. © 2011 Elsevier B.V.


Singer S.,University of Mainz | Singer S.,University of Leipzig | Dieng S.,OnkoZert | Wesselmann S.,German Cancer Society
Psycho-Oncology | Year: 2013

Background Over the last few years, a nationwide voluntary certification system for cancer centres has been established in Germany. To qualify for certification, cancer centres must provide psycho-oncological care to every patient who needs it. The aim of this study was to find out how many patients have been treated by a psycho-oncologist in the certified centres. Methods All cancer centres in Germany that were re-certified in 2010 provided data documenting how many patients with primary cancer received at least 30 min of psycho-oncological consultation in 2009. Results Data from n = 456 certified cancer centres were available. In the centres, a total of 36 165 patients were seen by a psycho-oncologist for at least 30 min, representing 37.3% of all patients in the centres. The highest percentage of patients who received psycho-oncological care was found in breast cancer centres (66.7%), and the lowest in prostate cancer centres (6.8%). Half of the patients (50.0%) in gynaecological cancer centres, 37.7% in colon cancer centres and 25.4% in lung cancer centres received psycho-oncological care. Conclusions Compared with non-certified centres, the proportion of patients receiving psycho-oncological care in certified cancer centres has increased. Copyright © 2012 John Wiley & Sons, Ltd.


Ertych N.,University of Gottingen | Stolz A.,University of Gottingen | Stenzinger A.,University of Heidelberg | Weichert W.,University of Heidelberg | And 6 more authors.
Nature Cell Biology | Year: 2014

Chromosomal instability (CIN) is defined as the perpetual missegregation of whole chromosomes during mitosis and represents a hallmark of human cancer. However, the mechanisms influencing CIN and its consequences on tumour growth are largely unknown. We identified an increase in microtubule plus-end assembly rates as a mechanism influencing CIN in colorectal cancer cells. This phenotype is induced by overexpression of the oncogene AURKA or by loss of the tumour suppressor gene CHK2, a genetic constitution found in 73% of human colorectal cancers. Increased microtubule assembly rates are associated with transient abnormalities in mitotic spindle geometry promoting the generation of lagging chromosomes and influencing CIN. Reconstitution of proper microtubule assembly rates by chemical or genetic means suppresses CIN and thereby, unexpectedly, accelerates tumour growth in vitro and in vivo. Thus, we identify a fundamental mechanism influencing CIN in cancer cells and reveal its adverse consequence on tumour growth. © 2014 Macmillan Publishers Limited. All rights reserved.


Tarnok A.,University of Leipzig | Ulrich H.,University of Sao Paulo | Bocsi J.,University of Leipzig
Cytometry Part A | Year: 2010

Stem cells have turned into promising tools for studying the mechanisms of development and regeneration, and for cell therapy of various disorders. Stem cells are found in the embryo and in most adult tissues, participating in endogenous tissue regeneration. They are capable of self-renewal, often maintain their multipotency of differentiation into various tissues of their germ line and are, therefore, ideal candidates for cellular therapy, provided that they can be unequivocally identified and isolated. In this review, we report stem cell marker expression used for identification of various stem cell lineages, including very small embryonic stem cells, neural, hematopoietic, mesenchymal, epithelial and limbal epithelial stem cells, endothelial progenitor cells, supraadventitial adipose stromal cells, adipose pericytes, and cancer stem cells. These cells usually cannot be distinguished by a single stem cell marker, because their expression partially overlaps between lineages. Recent advances in flow cytometry allowing the simultaneous detection of various markers have facilitated stem cell identification for clinical diagnosis and research. So far, experimental evidence suggests the existence of cells with different properties, i.e., the capability to differentiate into various cell types. Several studies indicate that expression of classical markers for stem cell classification, such as CD34, CD45, and CD133, may differ between virtually identical stem and progenitor cells, e.g., endothelial progenitor or mesenchymal stem cells, when they are obtained from different tissues. This finding raises the question of whether phenotypic differences are due to the source or are caused only by different isolation and experimental conditions. © 2009 International Society for Advancement of Cytometry.


Heino N.,University of Leipzig | Pan J.Z.,University of Aberdeen
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2012

Recent developments in hardware have shown an increase in parallelism as opposed to clock rates. In order to fully exploit these new avenues of performance improvement, computationally expensive workloads have to be expressed in a way that allows for fine-grained parallelism. In this paper, we address the problem of describing RDFS entailment in such a way. Different from previous work on parallel RDFS reasoning, we assume a shared memory architecture. We analyze the problem of duplicates that naturally occur in RDFS reasoning and develop strategies towards its mitigation, exploiting all levels of our architecture. We implement and evaluate our approach on two real-world datasets and study its performance characteristics on different levels of parallelization. We conclude that RDFS entailment lends itself well to parallelization but can benefit even more from careful optimizations that take into account intricacies of modern parallel hardware. © 2012 Springer-Verlag Berlin Heidelberg.
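The duplicate problem described above can be made concrete with a small, purely sequential sketch of RDFS entailment (rules rdfs9 and rdfs11 only; namespaces abbreviated). This is an illustration rather than the paper's parallel algorithm: the point is that naive rule application re-derives the same triple many times, so derived triples are pooled in a set, which silently discards duplicates.

```python
def rdfs_closure(triples):
    """Compute subclass-related RDFS entailment by fixed-point iteration."""
    inferred = set(triples)
    changed = True
    while changed:
        new = set()
        sub = [(s, o) for s, p, o in inferred if p == "rdfs:subClassOf"]
        # rdfs11: rdfs:subClassOf is transitive.
        for a, b in sub:
            for c, d in sub:
                if b == c:
                    new.add((a, "rdfs:subClassOf", d))
        # rdfs9: an instance of a subclass is an instance of the superclass.
        for s, p, o in inferred:
            if p == "rdf:type":
                for a, b in sub:
                    if o == a:
                        new.add((s, "rdf:type", b))
        before = len(inferred)
        inferred |= new          # set union drops duplicate derivations
        changed = len(inferred) > before
    return inferred

triples = {
    ("ex:Student", "rdfs:subClassOf", "ex:Person"),
    ("ex:Person", "rdfs:subClassOf", "ex:Agent"),
    ("ex:alice", "rdf:type", "ex:Student"),
}
closure = rdfs_closure(triples)
```

In a shared-memory parallel setting, many workers derive into such a pool concurrently, and the cost of detecting and discarding duplicates is precisely what the mitigation strategies in the paper address.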


Gehlot S.,Max Planck Institute for Meteorology | Gehlot S.,University of Hamburg | Quaas J.,Max Planck Institute for Meteorology | Quaas J.,University of Leipzig
Journal of Climate | Year: 2012

A process-oriented climate model evaluation is presented, applying the International Satellite Cloud Climatology Project (ISCCP) simulator to pinpoint deficiencies related to the cloud processes in the ECHAM5 general circulation model. A Lagrangian trajectory analysis is performed to track the transitions of anvil cirrus originating from deep convective detrainment to cirrostratus and thin cirrus, comparing ISCCP observations and the ECHAM5 model. Trajectories of cloudy air parcels originating from deep convection are computed for both the ISCCP observations and the model, over which the ISCCP joint histograms are used for analyzing the cirrus life cycle over 5 days. Cirrostratus and cirrus clouds originating from deep convective detrainment decay and gradually thin out over 3-4 days after the convective event. The effect of the convection-cirrus transitions in a warmer climate is analyzed in order to understand the climate feedbacks due to deep convective cloud transitions. An idealized climate change simulation is performed using a +2-K sea surface temperature (SST) perturbation. The Lagrangian trajectory analysis over the perturbed climate suggests that more and thicker cirrostratus and cirrus clouds occur in the warmer climate compared to the present-day climate. Stronger convection is noticed in the perturbed climate, which leads to increased precipitation, especially on days 2 and 3 after the individual convective events. The shortwave and the longwave cloud forcings both increase in the warmer climate, with an increase of net cloud radiative forcing (NCRF), leading to an overall positive feedback of the increased cirrostratus and cirrus clouds from a Lagrangian transition perspective. © 2012 American Meteorological Society.


Collura M.,CNRS Jean Lamour Institute | Collura M.,University of Leipzig | Karevski D.,CNRS Jean Lamour Institute
Physical Review Letters | Year: 2010

We analyze the coherent quantum evolution of a many-particle system after slowly sweeping a power-law confining potential. The amplitude of the confining potential is varied in time along a power-law ramp such that the many-particle system finally reaches or crosses a critical point. Under this protocol we derive general scaling laws for the density of excitations created during the nonadiabatic sweep of the confining potential. It is found that the mean excitation density follows an algebraic law as a function of the sweeping rate, with an exponent that depends on the space-time properties of the potential. We confirm our scaling laws by a first-order adiabatic calculation and by exact results for the Ising quantum chain with a varying transverse field. © 2010 The American Physical Society.
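For orientation, the homogeneous-ramp benchmark against which such excitation-density scaling laws are usually compared is the standard Kibble-Zurek prediction (textbook background, not the paper's generalized exponent):

```latex
n_{\mathrm{ex}} \sim \tau_Q^{-\frac{d\nu}{z\nu + 1}},
```

where $\tau_Q$ is the sweep time, $d$ the spatial dimensionality, and $\nu$ and $z$ the correlation-length and dynamical critical exponents. For the transverse-field Ising chain ($d = \nu = z = 1$) this gives $n_{\mathrm{ex}} \sim \tau_Q^{-1/2}$; the exponent derived in the paper modifies this form through the space-time profile of the confining potential.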


Klocke D.,Max Planck Institute for Meteorology | Klocke D.,European Center for Medium Range Weather Forecasts | Pincus R.,University of Colorado at Boulder | Quaas J.,Max Planck Institute for Meteorology | Quaas J.,University of Leipzig
Journal of Climate | Year: 2011

The distribution of model-based estimates of equilibrium climate sensitivity has not changed substantially in more than 30 years. Efforts to narrow this distribution by weighting projections according to measures of model fidelity have so far failed, largely because climate sensitivity is independent of current measures of skill in current ensembles of models. This work presents a cautionary example showing that measures of model fidelity that are effective at narrowing the distribution of future projections (because they are systematically related to climate sensitivity in an ensemble of models) may be poor measures of the likelihood that a model will provide an accurate estimate of climate sensitivity (and thus degrade distributions of projections if they are used as weights). Furthermore, it appears unlikely that statistical tests alone can identify robust measures of likelihood. The conclusions are drawn from two ensembles: one obtained by perturbing parameters in a single climate model and a second containing the majority of the world's climate models. The simple ensemble reproduces many aspects of the multimodel ensemble, including the distributions of skill in reproducing the present-day climatology of clouds and radiation, the distribution of climate sensitivity, and the dependence of climate sensitivity on certain cloud regimes. Weighting by error measures targeted on those regimes permits the development of tighter relationships between climate sensitivity and model error and, hence, narrower distributions of climate sensitivity in the simple ensemble. These relationships, however, do not carry into the multimodel ensemble. This suggests that model weighting based on statistical relationships alone is unfounded and perhaps that climate model errors are still large enough that model weighting is not sensible. © 2011 American Meteorological Society.
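The weighting step at issue can be sketched in a few lines. All numbers and the Gaussian weight kernel below are invented for illustration; this is not the authors' analysis, only a schematic of weighting ensemble members by an error measure.

```python
import math

# Hypothetical perturbed-physics ensemble: (error vs. observations, climate
# sensitivity in K). Values are invented; error and sensitivity are correlated
# by construction, mimicking the single-model ensemble described above.
members = [(0.2, 2.6), (0.4, 3.0), (0.6, 3.4), (0.9, 4.1), (1.2, 4.8)]

def stats(values, weights):
    """Weighted mean and standard deviation."""
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    var = sum(w * (v - mean) ** 2 for w, v in zip(weights, values)) / total
    return mean, math.sqrt(var)

sens = [s for _, s in members]
uniform = [1.0] * len(members)
skill = [math.exp(-(err / 0.5) ** 2) for err, _ in members]  # Gaussian weights

u_mean, u_std = stats(sens, uniform)
w_mean, w_std = stats(sens, skill)

# Weighting narrows the distribution here precisely because error and
# sensitivity were made to covary -- the relationship the paper shows does
# not carry over to the multimodel ensemble.
print(w_std < u_std)  # True for this constructed example
```

The point of the paper is exactly that such narrowing is not evidence of a better projection: the error-sensitivity relationship that makes the weights effective in one ensemble need not hold in another.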


Andersen S.K.,University of Aberdeen | Muller M.M.,University of Leipzig | Hillyard S.A.,University of California at San Diego
Journal of Neuroscience | Year: 2015

Experiments that study feature-based attention have often examined situations in which selection is based on a single feature (e.g., the color red). However, in more complex situations relevant stimuli may not be set apart from other stimuli by a single defining property but by a specific combination of features. Here, we examined sustained attentional selection of stimuli defined by conjunctions of color and orientation. Human observers attended to one of four concurrently presented, superimposed fields of randomly moving horizontal or vertical bars of red or blue color to detect brief intervals of coherent motion. Selective stimulus processing in early visual cortex was assessed by recordings of steady-state visual evoked potentials (SSVEPs) elicited by each of the flickering fields of stimuli. We directly contrasted attentional selection of single features and feature conjunctions and found that SSVEP amplitudes in conditions in which selection was based on a single feature only (color or orientation) exactly predicted the magnitude of attentional enhancement of SSVEPs when attending to a conjunction of both features. Furthermore, enhanced SSVEP amplitudes elicited by attended stimuli were accompanied by equivalent reductions of SSVEP amplitudes elicited by unattended stimuli in all cases. We conclude that attentional selection of a feature-conjunction stimulus is accomplished by the parallel and independent facilitation of its constituent feature dimensions in early visual cortex. © 2015, the authors.
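The abstract reports the single-feature-to-conjunction prediction qualitatively and does not give the combination rule; one common reading of "parallel and independent facilitation" is multiplicative gains, sketched here with invented amplitudes purely for illustration:

```python
# Invented SSVEP amplitudes (arbitrary units); the multiplicative-gain model
# below is an illustrative assumption, not the authors' stated analysis.
baseline = 1.00          # flicker-tagged stimulus, neither feature attended
gain_color = 1.30        # amplitude ratio when only its color is attended
gain_orientation = 1.20  # amplitude ratio when only its orientation is attended

# Parallel, independent facilitation of each feature dimension predicts that
# attending the color-orientation conjunction multiplies the two gains.
predicted_conjunction = baseline * gain_color * gain_orientation
print(round(predicted_conjunction, 2))  # 1.56
```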


Tinius M.,Center for Joint Surgery | Hepp P.,University of Leipzig | Becker R.,City Hospital
Knee Surgery, Sports Traumatology, Arthroscopy | Year: 2012

Purpose: Patients presenting with anterior cruciate ligament (ACL) deficiency and isolated osteoarthritis of the medial compartment are treated either with biplanar osteotomy or with total knee arthroplasty (TKA). However, these patients, typically in their forties and fifties, are often very active in daily life and feel limited by their knee. Following the idea of preserving as much of the joint as possible, the concept of unicondylar joint replacement in conjunction with ACL reconstruction has been pursued. Experience with this concept is still limited. The purpose of this follow-up study was to evaluate the midterm clinical and functional outcome. Methods: Twenty-seven patients were followed up for 53 months. The mean age of the 11 men and 16 women was 44 years. All patients were treated by combined unicompartmental knee arthroplasty (UKA) and anterior cruciate ligament reconstruction. Results: The Knee Society Score improved significantly from 77.1 ± 11.6 points to 166.0 ± 12.1 points (P ≤ 0.01). No revision surgery was required and no radiolucent lines were observed on the radiographs at the time of follow-up. The anterior translation was less than 5 mm in 24 patients and 5 mm in the remaining 3 patients. Conclusions: The midterm clinical data show that combined UKA and anterior cruciate ligament reconstruction yields promising results. The restored knee stability seems to prevent the failure of the UKA. However, long-term follow-up studies are required in these patients who received partial joint replacement fairly early in their life. Level of evidence: IV. © 2011 Springer-Verlag.


Braun G.,University of Leipzig | Fiorini S.,Free University of Colombia | Pokutta S.,Georgia Institute of Technology | Steurer D.,Cornell University
Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS | Year: 2012

We develop a framework for proving approximation limits of polynomial-size linear programs from lower bounds on the nonnegative ranks of suitably defined matrices. This framework yields unconditional impossibility results that are applicable to any linear program, as opposed to only programs generated by hierarchies. Using our framework, we prove that quadratic approximations for CLIQUE require linear programs of exponential size. (This lower bound applies to linear programs using a certain encoding of CLIQUE as a linear optimization problem.) Moreover, we establish a similar result for approximations of semidefinite programs by linear programs. Our main technical ingredient is a quantitative improvement of Razborov's rectangle corruption lemma (1992) for the high-error regime, which gives strong lower bounds on the nonnegative rank of certain perturbations of the unique disjointness matrix. © 2012 IEEE.
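The bridge from LP size to nonnegative rank that underlies this style of argument is Yannakakis' factorization theorem (1991), stated here in its standard form for a polytope P with slack matrix S_P:

```latex
% Yannakakis: the extension complexity of a polytope P (the minimum number of
% inequalities in any linear extended formulation) equals the nonnegative rank
% of its slack matrix S_P
\mathrm{xc}(P) \;=\; \operatorname{rank}_+(S_P),
\qquad
\operatorname{rank}_+(S) \;=\; \min\bigl\{\, r : S = TU,\;
T \in \mathbb{R}_{\ge 0}^{m \times r},\;
U \in \mathbb{R}_{\ge 0}^{r \times n} \,\bigr\}
```

Lower bounds on the nonnegative rank of the relevant matrices therefore translate directly into lower bounds on the size of any linear program for the problem.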


Allam Y.,Alexandria University | Silbermann J.,Waldklinikum | Riese F.,University of Leipzig | Greiner-Perth R.,Hochfranken
European Spine Journal | Year: 2013

Introduction: Although pedicle screw fixation is a well-established technique for the lumbar spine, screw placement in the thoracic spine is more challenging because of the smaller pedicle size and more complex 3D anatomy. The intraoperative use of image guidance devices may allow surgeons a safer, more accurate method for placing thoracic pedicle screws while limiting radiation exposure. This generic 3D imaging technique is a new-generation intraoperative CT imaging system designed without compromise to address the needs of a modern OR. Aim: The aim of our study was to check the accuracy of generic 3D-navigated pedicle screw implantation in the thoracic spine against the free-hand technique described by Roy-Camille, using CT scans. Materials and methods: The material of this study was divided into two groups: a free-hand group (group I) (18 patients; 108 screws) and a 3D group (27 patients; 100 screws). The patients were operated upon from January 2009 to March 2010. Screw implantation was performed during internal fixation for fractures, tumors, and spondylodiscitis of the thoracic spine as well as for degenerative lumbar scoliosis. Results: The accuracy rate in our work was 89.8 % in the free-hand group compared to 98 % in the generic 3D-navigated group. Conclusion: 3D navigation-assisted pedicle screw placement is superior to the free-hand technique in the thoracic spine. © Springer-Verlag 2012.


Engel C.,University of Leipzig | Fischer C.,University of Heidelberg
Breast Care | Year: 2015

BRCA1/2 mutation carriers have a considerably increased risk of developing breast and ovarian cancer. The personalized clinical management of carriers and other at-risk individuals depends on precise knowledge of the cancer risks. In this report, we give an overview of the present literature on empirical cancer risks, and we describe risk prediction models that are currently used for individual risk assessment in clinical practice. Cancer risks show large variability between studies. Breast cancer risks are 40-87% for BRCA1 mutation carriers and 18-88% for BRCA2 mutation carriers. For ovarian cancer, the risk estimates are in the range of 22-65% for BRCA1 and 10-35% for BRCA2. The contralateral breast cancer risk is high (10-year risk after the first cancer: 27% for BRCA1 and 19% for BRCA2). Risk prediction models have been proposed to provide more individualized risk prediction, using additional knowledge on family history, mode of inheritance of major genes, and other genetic and non-genetic risk factors. User-friendly software tools have been developed that serve as a basis for decision-making in family counseling units. In conclusion, further assessment of cancer risks and model validation is needed, ideally based on prospective cohort studies. To obtain such data, clinical management of carriers and other at-risk individuals should always be accompanied by standardized scientific documentation. © 2015 S. Karger GmbH, Freiburg.


Schmidt T.,University of Heidelberg | Lordick F.,University of Leipzig | Herrmann K.,University of Würzburg | Ott K.,Vascular and Thoracic Surgery
JNCCN Journal of the National Comprehensive Cancer Network | Year: 2015

In esophageal cancer, functional imaging using PET can provide important additional information beyond standard staging techniques that may eventually lead to therapeutic consequences. The most commonly used tracer is fluorodeoxyglucose (FDG), which has high avidity for both squamous cell cancer and adenocarcinoma of the esophagus. The value of FDG-PET is limited in early esophageal cancer, whereas additional information is provided in 15% to 20% of locally advanced tumors. Neoadjuvant treatment is currently the standard of care in locally advanced esophageal cancer in most countries because randomized studies have shown a significant survival benefit. Because responders and nonresponders have a significantly different prognosis, functional imaging to tailor preoperative treatment would be of interest. Metabolic imaging using FDG-PET is an established method of response evaluation in clinical trials. The value of metabolic response evaluation is known to depend on the histologic subtype and the type of preoperative treatment delivered. An association of FDG-PET-based metabolic response with clinical response and prognosis was shown for absolute standardized uptake value (SUV) or a decrease of SUV levelsbefore, during, and after therapy. However, contradictory findings exist in the literature and prospective validation is missing. Additionally,no consensus exists on time points or cutoff levels for metabolic response evaluation. Furthermore, correct prediction of a posttherapeutic pathologic complete remission is currently not possible using FDG-PET. Of high interest is early response monitoring during preoperative chemotherapy, with potential subsequent therapy modification. This tailored approach still needs validation in prospective multicenter trials. © 2015 by the National Comprehensive Cancer Network. All rights reserved.
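As a minimal illustration of SUV-decrease-based response classification: the 35% cutoff below is borrowed from earlier trial practice and is an assumption for this sketch only, since, as noted above, no consensus cutoff or time point exists.

```python
def metabolic_response(suv_baseline, suv_followup, cutoff=0.35):
    """Classify metabolic response by fractional SUV decrease.
    The 35% cutoff is an illustrative assumption, not a consensus value."""
    decrease = (suv_baseline - suv_followup) / suv_baseline
    return decrease >= cutoff

print(metabolic_response(10.0, 5.0))  # True: 50% decrease
print(metabolic_response(10.0, 8.0))  # False: 20% decrease
```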


Grant
Agency: European Commission | Branch: FP7 | Program: BSG-SME | Phase: SME-1 | Award Amount: 1.36M | Year: 2010

Colorectal cancer and lung cancer cause millions of deaths each year. Currently there is no suitable non-invasive method for the early detection of these types of cancer. The tumour suppressor gene BARD1 (BRCA1-associated RING domain protein) is aberrantly expressed in several types of cancer and could be a diagnostic target for early cancer diagnosis in blood samples. The overall objective of the project is to develop blood tests for the early detection of colorectal and lung cancer based on cancer-specific BARD1 isoforms. The outlined tests will analyse BARD1 isoforms at two levels: the expression of isoform-specific RNA in circulating tumour cells (CTC) and the presence of isoform-specific autoantibodies in serum. To reach these objectives, several technological challenges have to be overcome. The BARDiag consortium includes 3 SMEs and 4 research centres with the excellent expertise, specific knowledge, required lab infrastructure and necessary clinical materials that will enable them to conduct this project. Within the frame of the project, innovative methods for the isolation of CTCs in colorectal and lung cancer patients will be developed, the specific signatures of the BARD1 isoforms at both the mRNA and autoimmune levels will be defined, and assays for the detection of these isoforms will be established, validated with clinical samples and tested for their marketability. The results of the proposed project will have extensive impacts. Not only will more scientific knowledge on the expression of BARD1 isoforms in colorectal and lung cancer be obtained, improving our understanding of cancer, but non-invasive methods for the early detection of colorectal and lung cancer will also be made broadly available in the form of commercial test kits. The SMEs will extend their expertise and knowledge and thereby strengthen their economic power, which will contribute to increased European competitiveness.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-15-2014 | Award Amount: 6.33M | Year: 2015

Stem cell regenerative therapies hold great promise for patients suffering from a variety of disorders that are associated with tissue or organ injury. Regeneration relies on tissue or organ-specific stem and progenitor cells, but can also aim at promoting the endogenous repair capacity of the body. Mesenchymal stromal cells (MSC) are undergoing clinical testing in a variety of clinical conditions aiming at repair through direct or indirect mechanisms. Their ability to form bone or cartilage is used to directly repair these tissues. In other conditions their regenerative effects are based on endogenous repair through their anti-inflammatory properties. The latter mechanism is important in the treatment of acute Graft-versus-Host Disease (GvHD). We have been involved in the clinical development from the beginning and we have shown the therapeutic potential. However, no results of controlled randomized phase 3 studies have been published to date, thereby hampering safety and efficacy assessment. Within our consortium we have developed an academic infrastructure for the harmonized production of MSC. In the RETHRIM proposal this will be combined with our clinical expertise to conduct the first Europe-wide placebo controlled randomized phase III trial using MSC regenerative therapy for the treatment of steroid-resistant visceral GvHD. Central to the RETHRIM project is the clinical trial for which 150 patients will be recruited. All MSC products will be extensively analysed using molecular and functional markers, in order to develop a potency signature for the product and for the prediction of response. We also intend to collect data from additional quality of life, health technology assessment and ethical studies. We will apply for an Orphan Drug Designation in Europe and this may serve as a stepping-stone for the further commercialization of the MSC product, once a positive outcome has been obtained.


Flash Physics is our daily pick of the latest need-to-know developments from the global physics community selected by Physics World's team of editors and reporters. An abrupt transition in the electrical resistance of graphite at 350 K could be a signature of superconductivity occurring well above room temperature (293 K). That is the claim of Pablo Esquinazi and colleagues at the University of Leipzig in Germany and also in Brazil and Australia. The effect was spotted in samples of natural graphite that came from a mine in Brazil. While claims of room-temperature superconductivity in graphite have been made several times over the past 40 years, this is the first time that the transition temperature has been measured, according to Esquinazi. The team found that the transition went away when the graphite was exposed to a magnetic field – something that is indicative of superconductivity. The team believes that individual grains within their samples are tiny superconductors and the gaps between the grains act as Josephson junctions that allow supercurrents to flow from one grain to another. X-ray diffraction studies suggest that the grains have atomic structures that could support superconductivity, says Esquinazi. The research is described in New Journal of Physics. The fundamental physical constants have been measured with sufficient precision to allow the values to be used to redefine the International System of Units (SI), according to scientists at NIST in Gaithersburg, Maryland. These constants include the speed of light, the Planck and Boltzmann constants and the electrical charge of the electron. Metrologists are in the process of creating a completely new way of defining SI units – such as the metre, kilogram and second – in terms of the fundamental constants. This is unlike the current definition, which relies in part on artefacts such as the standard kilogram stored in Paris – which is losing mass over time. 
In the new system, the Planck constant – which is now known to 12 parts in one billion – would be used to define the kilogram. Other planned changes involve using the Boltzmann constant (6 parts in 10 million) to define the kelvin, which is currently defined using the triple point of water. "These now ultra-small uncertainties in the constants will allow the General Conference on Weights and Measures to revise the International System of Units so that the seven base units will be exactly defined in terms of fundamental constants," says NIST's Donald Burgess. An X-ray free-electron laser at SLAC in the US has been used to observe two important steps in photosynthesis in which water molecules are split to liberate oxygen atoms. The work was done by an international team of scientists that used X-ray pulses just 40 fs in duration to determine the structure of a protein complex called "photosystem II", which is involved in water splitting. Unlike previous studies of the process, which were done using frozen samples, the measurement was done at room temperature. The team was able to observe the steps in the four-step cycle by first firing pulses of green laser light at the liquid sample to initiate the splitting. Then, the X-ray pulses are used to measure the structure of photosystem II as the splitting proceeds. The team hopes its measurements will shed light on how water is split by a complex in photosystem II that contains manganese and calcium. "Learning how exactly this water-splitting process works will be a breakthrough in our understanding, and it can help in the development of solar fuels and renewable energy," explains team member Vittal Yachandra of Berkeley Lab. The research is reported in Nature.


Poordad F.,University of Texas Health Science Center at San Antonio | Hezode C.,University Paris Est Creteil | Trinh R.,AbbVie | Kowdley K.V.,Virginia Mason Medical Center | And 12 more authors.
New England Journal of Medicine | Year: 2014

BACKGROUND: Interferon-containing regimens for the treatment of hepatitis C virus (HCV) infection are associated with increased toxic effects in patients who also have cirrhosis. We evaluated the interferon-free combination of the protease inhibitor ABT-450 with ritonavir (ABT-450/r), the NS5A inhibitor ombitasvir (ABT-267), the nonnucleoside polymerase inhibitor dasabuvir (ABT-333), and ribavirin in an open-label phase 3 trial involving previously untreated and previously treated adults with HCV genotype 1 infection and compensated cirrhosis. METHODS: We randomly assigned 380 patients with Child-Pugh class A cirrhosis to receive either 12 or 24 weeks of treatment with ABT-450/r-ombitasvir (at a once-daily dose of 150 mg of ABT-450, 100 mg of ritonavir, and 25 mg of ombitasvir), dasabuvir (250 mg twice daily), and ribavirin administered according to body weight. The primary efficacy end point was a sustained virologic response 12 weeks after the end of treatment. The rate of sustained virologic response in each group was compared with the estimated rate with a telaprevir-based regimen (47%; 95% confidence interval [CI], 41 to 54). A noninferiority margin of 10.5 percentage points established 43% as the noninferiority threshold; the superiority threshold was 54%. RESULTS: A total of 191 of 208 patients who received 12 weeks of treatment had a sustained virologic response at post-treatment week 12, for a rate of 91.8% (97.5% CI, 87.6 to 96.1). A total of 165 of 172 patients who received 24 weeks of treatment had a sustained virologic response at post-treatment week 12, for a rate of 95.9% (97.5% CI, 92.6 to 99.3). These rates were superior to the historical control rate. The three most common adverse events were fatigue (in 32.7% of patients in the 12-week group and 46.5% of patients in the 24-week group), headache (in 27.9% and 30.8%, respectively), and nausea (in 17.8% and 20.3%, respectively). 
The hemoglobin level was less than 10 g per deciliter in 7.2% and 11.0% of patients in the respective groups. Overall, 2.1% of patients discontinued treatment owing to adverse events. CONCLUSIONS: In this phase 3 trial of an oral, interferon-free regimen evaluated exclusively in patients with HCV genotype 1 infection and cirrhosis, multitargeted therapy with the use of three new antiviral agents and ribavirin resulted in high rates of sustained virologic response. Drug discontinuations due to adverse events were infrequent. Copyright © 2014 Massachusetts Medical Society.
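The headline response rates and the superiority comparison can be checked arithmetically from the numbers reported above (the confidence-interval bounds themselves come from the trial data and are quoted, not recomputed):

```python
# Sustained virologic response rates in the two treatment arms
svr_12wk = 191 / 208   # 12-week arm
svr_24wk = 165 / 172   # 24-week arm
print(round(svr_12wk * 100, 1))  # 91.8
print(round(svr_24wk * 100, 1))  # 95.9

# Prespecified thresholds derived from the telaprevir historical control
noninferiority_threshold = 43.0  # percent
superiority_threshold = 54.0     # percent

# Superiority was declared because the reported lower 97.5% CI bounds
# (87.6 and 92.6) exceed the superiority threshold.
assert 87.6 > superiority_threshold and 92.6 > superiority_threshold
```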


News Article | November 23, 2016
Site: www.eurekalert.org

New research by an international team shows that the present thinning and retreat of Pine Island Glacier in West Antarctica is part of a climatically forced trend that was triggered in the 1940s. The team -- made up of scientists from Lawrence Livermore National Laboratory, the British Antarctic Survey, University of Copenhagen, University of Alaska, Naval Postgraduate School, NASA Goddard Space Flight Center, Lamont-Doherty Earth Observatory of Columbia University, University of Leipzig, University of Geneva and University of Cambridge -- analyzed sediment cores recovered beneath the floating Pine Island Glacier ice shelf. The team concluded that the grounding line had retreated from a prominent seafloor ridge by 1945 at the latest. The team also found that final ungrounding of the ice shelf from the ridge occurred in 1970. "Our results suggest that, even when climate forcing (such as El Niños, which create warmer water) weakened, ice-sheet retreat continued," said James Smith of the British Antarctic Survey and lead author of an article appearing in the Nov. 23 issue of the journal Nature. The West Antarctic Ice Sheet is one of the largest potential sources of water that will contribute to rising sea levels. Over the past 40 years, glaciers flowing into the Amundsen Sea sector of the ice sheet have thinned at an accelerating rate, and several numerical models suggest that unstable and irreversible retreat of the grounding line -- which marks the boundary between grounded ice and floating ice shelf -- is under way. Understanding this recent retreat requires a detailed knowledge of grounding-line history, but the locations of the grounding line before the advent of satellite monitoring in the 1990s are poorly dated. Pine Island Glacier, which drains into the Amundsen Sea, has retreated continuously throughout the short period for which there are observational records (from 1992 to the present). 
The coherent thinning of this and other glaciers along the Amundsen Sea coast indicates a response to external forcing and has been attributed to high basal melting of the floating ice shelves by warm circumpolar deep water. Thinner ice shelves are less able to buttress inland ice, leading to glacier acceleration and ice-sheet thinning. Evidence gathered by Autosub, an autonomous underwater vehicle operating beneath the ice shelf of Pine Island Glacier, revealed a prominent sea-floor ridge that probably acted as the most recent steady grounding-line position. The earliest visible satellite image, from 1973, showed a bump on the ice surface that was interpreted as the last point of grounding on the highest part of the ridge. The bump had disappeared several years later, suggesting that the present phase of thinning was already underway. "This finding provided the first hint that the recent retreat could be part of a longer-term process that started decades or even centuries before satellite observations became available," Smith said. For the study, the team drilled three 20-centimeter holes through the Pine Island Glacier ice shelf during December 2012 and January 2013 to access the ocean cavity below. Sediment cores were recovered at each site. Changes in the lithology and composition of sediment deposited beneath the glacier record the transition from grounded glacier to freely floating ice. Measurements of lead (Pb-210) and plutonium in the sediment were used to determine when the ice retreat began. Analyses of trace levels of global fallout plutonium in the sediment were performed by high-precision mass spectrometry at LLNL. The appearance of plutonium in the sediment marks the onset of above-ground testing of nuclear weapons in the 1950s, and indicates that ice-sheet retreat began before this time. The Lab's Amy Gaffney contributed to this portion of the study. 
"Despite a return to pre-1940s climatic conditions in the ensuing decades, thinning and glacier retreat has not stopped and is unlikely to be reversible without a major change in marine or glaciological conditions," Smith said. "A period of warming in the Antarctic shelf waters triggered a substantial change in the ice sheet, via the mechanism that we see today -- that is, ocean-driven thinning and retreat of ice shelves leads to inland glacier acceleration and ice-sheet thinning." Founded in 1952, Lawrence Livermore National Laboratory provides solutions to our nation's most important national security challenges through innovative science, engineering and technology. Lawrence Livermore National Laboratory is managed by Lawrence Livermore National Security, LLC for the U.S. Department of Energy's National Nuclear Security Administration.


Wolf T.,University of Leipzig | Meyer B.C.,TU Dortmund
Ecological Indicators | Year: 2010

Housing suburbanisation has led in past decades to problems caused by the deconcentration of population and intensive land consumption. Major social, economic and ecological functions for sustainable spatial decision support in the suburban landscape are described, functionalised by indicators and modelled using GIS, with the aim of minimising the problems related to suburbanisation. The indicators chosen include human-ecological functions, accessibility and infrastructure development, and the regulation and regeneration of populations and biocoenoses. Out of a balanced list of 11 indicators (one is used twice), the regulation of traffic noise immissions, the landscape accessibility to the nearest freeway and the habitat network integration of sites are modelled, assessed and discussed in detail. The indicator modelling operationalises a wide range of methods, including the analysis of travel costs, distance functions, visibility analysis and landscape metrics, on the basis of publicly available data (biotope types, digital elevation model and road data). The methods are applied to a suburban agricultural landscape northeast of Leipzig in Saxony, Germany (66 km2). Three scenarios developed for the aggregation of multiple considerations are demonstrated with maps, based on the status quo: the "(mono)-functional landscape", the "multifunctional landscape" and the "sustainable landscape". The scenarios aggregate an increasing number of indicators to form a comprehensive assessment. The resulting maps clearly show the areas suitable for private housing that fulfil functions such as quietness and recreation while simultaneously ensuring nature protection. The paper emphasises the integrative prospects of landscape functions for monitoring, indicator assessment and integration into land-use decision-making in the context of spatial planning. © 2009 Elsevier Ltd. All rights reserved.
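A minimal sketch of the kind of multi-indicator aggregation the three scenarios perform: cells, indicator names, scores and equal weights are all invented for illustration; the paper's actual GIS models (travel costs, visibility, landscape metrics) are far richer.

```python
# Hypothetical landscape cells, each scored 0..1 per indicator (invented values).
cells = {
    "A": {"noise_regulation": 0.9, "freeway_access": 0.4, "habitat_network": 0.8},
    "B": {"noise_regulation": 0.3, "freeway_access": 0.9, "habitat_network": 0.2},
    "C": {"noise_regulation": 0.7, "freeway_access": 0.5, "habitat_network": 0.7},
}

def suitability(cell, indicators):
    """Aggregate a subset of indicators into one score (equal weights here)."""
    return sum(cell[i] for i in indicators) / len(indicators)

# The scenarios aggregate an increasing number of indicators, as in the paper.
mono = ["freeway_access"]
multi = ["freeway_access", "noise_regulation"]
sustainable = ["freeway_access", "noise_regulation", "habitat_network"]

for name, inds in [("mono", mono), ("multi", multi), ("sustainable", sustainable)]:
    best = max(cells, key=lambda c: suitability(cells[c], inds))
    print(name, best)
```

With these toy numbers the best site shifts from the freeway-accessible cell under the mono-functional view to the quieter, better-connected habitat cell once all indicators count, which is the qualitative behaviour the scenario maps demonstrate.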


Keil A.,University of Florida | Muller M.M.,University of Leipzig
Brain Research | Year: 2010

This study examined the latency and amplitude of cortical processes associated with feature-based visual selective attention, using frequency-domain and time-domain measures derived from dense-array electroencephalography. Participants were asked to identify targets based on conjunctions of three types of object features (color, size, and completeness). This procedure aimed to examine (1) the modulation of sensory responses to one or more stimulus features characterizing an object and (2) the facilitation and reduction effects associated with competing features, attended and unattended, in the same object. The selection negativity, an event-related potential measure of sensory amplification for attended features, showed a parametric increase of amplitude as a function of the number of attended features. Late oscillations in the gamma band range were also smaller for stimuli with one or more non-attended visual features but were enhanced for stimuli sharing the overall gestalt with the target. The latency of this late gamma modulation was delayed when two target features were combined, compared to one single discriminative feature. Latency analyses also showed that late bursts of induced high-frequency oscillatory activity peaked around 60 ms later than the selection negativity. Oscillatory activity reflected both selective amplification and competition between object features. These results suggest that sensory amplification of selected features is followed by integrative processing in more widespread networks. Oscillatory activity in these networks is reduced by distraction and is enhanced when attended features can be mapped to specific action. © 2009 Elsevier B.V. All rights reserved.


Czech A.,University of Potsdam | Wende S.,University of Leipzig | Morl M.,University of Leipzig | Pan T.,University of Chicago | Ignatova Z.,University of Potsdam
PLoS Genetics | Year: 2013

Stress-induced changes of gene expression are crucial for survival of eukaryotic cells. Regulation at the level of translation provides the necessary plasticity for immediate changes of cellular activities and protein levels. In this study, we demonstrate that exposure to oxidative stress results in a quick repression of translation by deactivation of the aminoacyl ends of all transfer RNAs (tRNAs). An oxidative-stress-activated nuclease, angiogenin, cleaves first within the conserved single-stranded 3′-CCA termini of all tRNAs, thereby blocking their use in translation. This CCA deactivation is reversible and quickly repairable by the CCA-adding enzyme [ATP(CTP):tRNA nucleotidyltransferase]. Through this mechanism the eukaryotic cell dynamically represses and reactivates translation at low metabolic costs. © 2013 Czech et al.
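The cleave-and-repair cycle described above can be caricatured with string operations. The sequence is invented and the repair heuristic is deliberately naive (a body that happens to end in C would confuse it); real processing is of course enzymatic.

```python
def angiogenin_cleave(trna):
    """Nick within the 3'-CCA terminus, mimicking the deactivating cleavage."""
    assert trna.endswith("CCA"), "a mature tRNA carries a 3'-CCA end"
    return trna[:-2]  # leaves ...C: the tRNA can no longer be charged

def cca_repair(trna):
    """Restore the full CCA end, as the CCA-adding enzyme does."""
    if trna.endswith("CCA"):
        return trna
    if trna.endswith("CC"):
        return trna + "A"
    if trna.endswith("C"):
        return trna + "CA"
    return trna + "CCA"

mature = "GCGGAUUUAGCUCAGCCA"  # invented toy sequence with the universal CCA end
damaged = angiogenin_cleave(mature)
print(damaged.endswith("CCA"))        # False: translation is blocked
print(cca_repair(damaged) == mature)  # True: cheap, reversible reactivation
```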


Hellwig C.T.,University of Leipzig | Hellwig C.T.,Royal College of Surgeons in Ireland | Rehm M.,Royal College of Surgeons in Ireland
Molecular Cancer Therapeutics | Year: 2012

TRAIL and agonistic antibodies raised against TRAIL death receptors are highly promising new anticancer agents. In this brief review, we describe the recent advances in the molecular understanding of TRAIL signaling and the progress made in using TRAIL or agonistic antibodies clinically in mono- and combination therapies. Synergies have been reported in various scenarios of TRAIL-based multidrug treatments, and these can be used to potentiate the efficacy of therapies targeting TRAIL death receptors. We pay particular attention to structuring the current knowledge on the diverse molecular mechanisms that are thought to give rise to these synergies and describe how different signaling features evoking synergies can be associated with distinct classes of drugs used in TRAIL-based combination treatments. ©2012 AACR.


Laue R.,University of Leipzig | Awad A.,University of Potsdam
Journal of Visual Languages and Computing | Year: 2011

Business processes are commonly modeled using a graphical modeling language. The most widespread notation for this purpose is business process diagrams in the Business Process Modeling Notation (BPMN). In this article, we use the visual query language BPMN-Q for expressing patterns that are related to possible problems in such business process diagrams. We discuss two classes of problems that can be found frequently in real-world models: sequence flow errors and model fragments that can make the model difficult to understand. By using a query processor, a business process modeler is able to identify possible errors in business process diagrams. Moreover, the erroneous parts of the business process diagram can be highlighted when an instance of an error pattern is found. This way, the modeler gets easy-to-understand feedback in the visual modeling language he or she is familiar with. This is an advantage over current validation methods, which usually lack this kind of intuitive feedback. © 2011 Elsevier Ltd.
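To make the pattern idea concrete: one classic sequence-flow error is a deadlock in which branches opened by an exclusive (XOR) split are synchronized by a parallel (AND) join, which then waits forever for tokens that can never all arrive. The sketch below approximates such a pattern as a graph query over a process model; it is only an illustration of the concept, not BPMN-Q itself (whose queries are expressed graphically), and all node names and the model encoding are hypothetical:

```python
# Illustrative sketch (not BPMN-Q): flag a classic deadlock pattern in which
# more than one branch of an XOR split meets the same AND join.
# A process model is a dict: node -> (kind, list of successor nodes).
def reachable(model, start):
    """All nodes reachable from `start` via sequence flow (depth-first)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for succ in model[node][1]:
            if succ not in seen:
                seen.add(succ)
                stack.append(succ)
    return seen

def xor_split_to_and_join(model):
    """Return (split, join) pairs where >1 branch of an XOR split reaches
    the same AND join -- a potential deadlock worth highlighting."""
    hits = []
    for n, (kind, succs) in model.items():
        if kind == "xor_split" and len(succs) > 1:
            for m, (mkind, _) in model.items():
                if mkind == "and_join":
                    branches = sum(1 for s in succs
                                   if m == s or m in reachable(model, s))
                    if branches > 1:
                        hits.append((n, m))
    return hits

# Toy model: XOR split 'g1' with branches 'a', 'b' merged by AND join 'g2'.
model = {
    "start": ("task", ["g1"]),
    "g1": ("xor_split", ["a", "b"]),
    "a": ("task", ["g2"]),
    "b": ("task", ["g2"]),
    "g2": ("and_join", ["end"]),
    "end": ("task", []),
}
print(xor_split_to_and_join(model))  # [('g1', 'g2')]
```

A query processor built on this idea can return the matched nodes directly, which is what enables the highlighting of erroneous diagram fragments described above.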


Maringka M.,Diakoniekrankenhaus Henriettenstiftung | Giri S.,University of Leipzig | Bader A.,University of Leipzig
Biomaterials | Year: 2010

Artificial extracorporeal liver support (AEL) using primary porcine hepatocytes is a therapy that takes over the liver functions of patients in liver failure until their own organs have regenerated or until whole-organ transplantation. Current bioreactor designs for AEL vary considerably, and they may not reflect the in vivo architecture of the liver, in which each individual hepatocyte has direct contact with blood plasma for oxygen and nutrient supply and for detoxification. The present study, based on our flat membrane bioreactor (FMB), aimed to mimic the in vivo liver architecture and to meet authentic clinical levels of human plasma exposure. Since many existing preclinical AELs are based on commercial culture medium with or without non-human serum, they may not authentically reflect the clinical situation in human patients, and little research has been done on human plasma exposure in in vitro culture-based bioreactors. To address this situation, we examined liver-specific functions, namely albumin secretion, urea synthesis, glutamic oxaloacetic transaminase (GOT), glutamic pyruvic transaminase (GPT), cell membrane stability by the lactate dehydrogenase (LDH) test, and ammonia clearance, using human plasma and serum-free medium in long-term culture of primary porcine hepatocytes, to show the potential of our clinically relevant FMB. We observed that the organotypical double-gel (DG) culture is superior to conventional collagen-coated single-gel (SG) cultures. Liver-specific functions in the FMB remained stable long term, with intact cell morphology, for up to 20 days under both plasma exposure and serum-free medium. Our three focus points (long-term culture matching the time span of spontaneous liver regeneration, high-density culture, and an organotypical culture model using human plasma) may provide valuable clinical clues for AEL. © 2009 Elsevier Ltd. All rights reserved.


Ou Q.,China University of Geosciences | Shu D.,China University of Geosciences | Shu D.,Northwest University, China | Mayer G.,University of Leipzig
Nature Communications | Year: 2012

Cambrian lobopodians are important for understanding the evolution of arthropods, but despite their soft-bodied preservation, the organization of the cephalic region remains obscure. Here we describe new material of the early Cambrian lobopodian Onychodictyon ferox from southern China, which reveals hitherto unknown head structures. These include a proboscis with a terminal mouth, an anterior arcuate sclerite, a pair of ocellus-like eyes and branched, antenniform appendages associated with this ocular segment. These findings, combined with a comparison with other lobopodians, suggest that the head of the last common ancestor of fossil lobopodians and extant panarthropods comprised a single ocular segment with a proboscis and terminal mouth. The lack of specialized mouthparts in O. ferox and the involvement of non-homologous mouthparts in onychophorans, tardigrades and arthropods argue against a common origin of definitive mouth openings among panarthropods, whereas the embryonic stomodaeum might well be homologous at least in Onychophora and Arthropoda. © 2012 Macmillan Publishers Limited.


Gaunitz F.,University of Leipzig | Hipkiss A.R.,Aston University
Amino Acids | Year: 2012

The application of carnosine in medicine has been discussed for several years, but many claims of therapeutic effects have not been substantiated by rigorous experimental examination. In the present perspective, a possible use of carnosine as an anti-neoplastic therapeutic, especially for the treatment of malignant brain tumours such as glioblastoma, is discussed. Possible mechanisms by which carnosine may exert its anti-tumourigenic effects are outlined, and its expected bioavailability and possible negative and positive side effects are considered. Finally, alternative strategies are examined, such as treatment with other dipeptides or β-alanine. © 2012 Springer-Verlag.


Gorgens A.,University of Duisburg - Essen | Radtke S.,University of Duisburg - Essen | Mollmann M.,University of Duisburg - Essen | Cross M.,University of Leipzig | And 3 more authors.
Cell Reports | Year: 2013

The classical model of hematopoiesis predicts a dichotomous lineage restriction of multipotent hematopoietic progenitors (MPPs) into common lymphoid progenitors (CLPs) and common myeloid progenitors (CMPs). However, this idea has been challenged by the identification of lymphoid progenitors retaining partial myeloid potential (e.g., LMPPs), implying that granulocytes can arise within both the classical lymphoid and the myeloid branches. Here, we resolve this issue by using cell-surface CD133 expression to discriminate functional progenitor populations. We show that eosinophilic and basophilic granulocytes as well as erythrocytes and megakaryocytes derive from a common erythro-myeloid progenitor (EMP), whereas neutrophilic granulocytes arise independently within a lympho-myeloid branch with long-term progenitor function. These findings challenge the concept of a CMP and restore dichotomy to the classical hematopoietic model. © 2013 The Authors.


Widmann A.,University of Leipzig | Engbert R.,University of Potsdam | Schroger E.,University of Leipzig
Journal of Neuroscience | Year: 2014

The mental chronometry of the human brain's processing of sounds to be categorized as targets has intensively been studied in cognitive neuroscience. According to current theories, a series of successive stages consisting of the registration, identification, and categorization of the sound has to be completed before participants are able to report the sound as a target by button press after ~300-500 ms. Here we use miniature eye movements as a tool to study the categorization of a sound as a target or nontarget, indicating that an initial categorization is present already after 80-100 ms. During visual fixation, the rate of microsaccades, the fastest components of miniature eye movements, is transiently modulated after auditory stimulation. In two experiments, we measured microsaccade rates in human participants in an auditory three-tone oddball paradigm (including rare nontarget sounds) and observed a difference in the microsaccade rates between targets and nontargets as early as 142 ms after sound onset. This finding was replicated in a third experiment with directed saccades measured in a paradigm in which tones had to be matched to score-like visual symbols. Considering the delays introduced by (motor) signal transmission and data analysis constraints, the brain must have differentiated target from nontarget sounds as fast as 80-100 ms after sound onset in both paradigms. We suggest that predictive information processing for expected input makes higher cognitive attributes, such as a sound's identity and category, available already during early sensory processing. The measurement of eye movements is thus a promising approach to investigate hearing. © 2014 the authors.
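The microsaccade-rate modulation at the heart of this study is typically estimated by pooling stimulus-locked saccade onset times across trials and counting events in a sliding window. A minimal sketch follows (the function, window parameters, and toy data are hypothetical illustrations, not the authors' exact analysis):

```python
# Illustrative sketch: peri-stimulus microsaccade rate (events/s per trial)
# from stimulus-locked saccade onset times, using a sliding boxcar window.
import numpy as np

def microsaccade_rate(onsets_per_trial, t_start=-0.2, t_end=0.6,
                      win=0.05, step=0.01):
    """onsets_per_trial: one array of saccade onset times (s) per trial."""
    n_trials = len(onsets_per_trial)
    centers = np.arange(t_start, t_end + step / 2, step)
    pooled = np.concatenate([np.asarray(o, float) for o in onsets_per_trial])
    counts = np.array([np.sum(np.abs(pooled - c) <= win / 2) for c in centers])
    return centers, counts / (n_trials * win)   # normalize to events/s

# Two toy trials; the target-vs-nontarget effect reported in the study would
# appear as a difference between two such rate curves.
t, rate = microsaccade_rate([np.array([0.15, 0.40]), np.array([0.18])])
```

Comparing such curves between target and nontarget trials, sample by sample, is one simple way to localize the earliest time at which the two conditions diverge.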


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2015-ETN | Award Amount: 3.55M | Year: 2015

MASSTRPLAN will train the next generation of interdisciplinary research leaders in advanced molecular analytical techniques to detect oxidized phospholipids & proteins in biological & clinical samples, evaluate their biochemical roles in inflammation, and translate these findings to develop new diagnostic tools. Chronic inflammatory diseases such as diabetes, cardiovascular disease (CVD) & cancer are major causes of mortality and cost the EU economy dearly in healthcare and lost working time; CVD alone is estimated to be responsible for 47% of deaths and to cost the EU €196 billion a year. Scientists able to develop advanced analytical tools for detecting oxidative biomolecule modifications and assessing their contribution to cell dysfunction & disease are urgently needed. The objectives of MASSTRPLAN are to 1) train early stage researchers (ESRs) in advanced and novel chromatography, mass spectrometry, and complementary techniques including microscopy and bioinformatics to detect challenging heterogeneous biomolecule modifications and determine their functional effects; 2) give ESRs a broad perspective on the relevance & mechanisms of oxidative modifications in pathophysiology and biotechnology; 3) enable ESRs trained in technology development to engage effectively with the clinical sector; and 4) train ESRs in translational and development skills to produce new protocols, materials and commercializable diagnostic tools. The ETN will achieve this by bringing together 10 beneficiaries and 15 partners from academic, industrial and healthcare organizations working in analytical, bioinformatic, biological, clinical & biotech fields to provide multidisciplinary, cross-sector training. Extensive mobility, industrial secondments and network-wide training will yield a cohort of analytical scientists with the unique theoretical, technological, and entrepreneurial skill set to yield new understanding of oxidative inflammatory disorders, leading to better tools and therapies.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2010.2.3.3-3 | Award Amount: 3.92M | Year: 2011

The world is facing a variety of viral infections of high pathogenic potential. These are either novel or formerly only endemic in specific areas of the world. It is intrinsic to such emerging diseases that actions to prevent and fight them must be taken while the number of infections is still relatively low and geographically restricted. Therefore, research efforts are required well before large outbreaks occur. In addition, effective surveillance networks for a given emerging disease must be established in time. Only with tools for treatment and control (such as vaccines) will it be possible to avoid major uncontrolled outbreaks. This proposal aims at the development of these tools for the control and prevention of one of the most threatening vector-borne emerging diseases, West Nile Fever, caused by West Nile Virus (WNV), which has recently spread through North America. Although the viral strains are similar in America and Europe, different conditions for a WNV epidemic have to be taken into account, such as insect vectors, reservoir hosts (birds) and their endemic virus populations, as well as the specifics of European climate and geography. To achieve the goals of the call and to make a significant impact on enhancing Europe's preparedness for WNV, the consortium has defined three major scientific and technical objectives: firstly, to develop a diagnostic system for WNV infections that shows no cross-reaction with other common flavivirus infections; secondly, to develop a vaccine for humans; and, last but not least, to establish a scientific network to collect, investigate and standardize biological data associated with WNV records using standardized methods. Several European institutes, supported by US scientists experienced with the North American outbreak, will collaborate to fight the disease from a European perspective.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH.2011.2.4.3-2 | Award Amount: 7.90M | Year: 2012

The number of individuals with obesity and type 2 diabetes mellitus (T2DM) is increasing. An alarming aspect is the decline in the age of onset of T2DM, which is coupled to the rise in childhood obesity. Accentuated insulin secretion is observed early in young obese individuals. In many subjects insulin hypersecretion is evident even when insulin sensitivity is essentially normal. Based on these observations we propose insulin hypersecretion as an etiological factor promoting lipid deposition, insulin resistance, and cellular dysfunction and death in insulin-producing beta-cells and insulin-target brown adipocytes. Pharmacology-based treatment strategies are limited for this growing patient group, and the aim of the proposal is to identify novel strategies reducing insulin hypersecretion, which has not previously been considered a target for intervention in young obese individuals. To address the issue, pediatric obesity clinics and academic centres with a focus on beta-cell biology, brown adipocyte imaging, transcript and protein profiling, genetics, epidemiology and bioinformatics have formed a consortium with two SMEs specialized in biomarker discovery and clinical trials and one large drug company. In the project, well-characterized European patient cohorts of more than 3000 obese children will be further characterized with regard to insulin secretion and brown adipocyte mass. Currently used drugs and new principles of intervention based on novel genes, identified in the project and linked with insulin hypersecretion, will be examined for effects on insulin hypersecretion in translational work including the young obese subjects and isolated human islets. Following comprehensive analysis, candidate compounds and principles attenuating insulin hypersecretion will be selected, from which novel therapeutic strategies are expected to emerge. Such therapeutic strategies will be important for afflicted individuals and the European health economy and will lead to new opportunities for European industry.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP.2011.2.2-2 | Award Amount: 5.46M | Year: 2012

Age-related cancers, especially of the trachea, are neoplastic lesions that significantly impact the lives of thousands of European patients each year. Unfortunately, most patients present with inoperable lesions for which median survival is less than 12 months. Based on our previous clinically successful experience with in vivo completely tissue-engineered tracheal replacement in benign tracheal diseases, we recently applied this technology in 2 patients with otherwise inoperable primary tracheal cancers. The successful outcome observed confirms the unique opportunity to scale up an effective therapeutic approach into a widely accessible clinical technology, which could enhance not only the quality of life but even cure otherwise untreatable patients. However, a limitation of our current technology is the time it takes to re-populate the decellularized trachea. This may prove critical in the case of cancer patients. Further, the size of the transplant is currently limited by the fact that the transplanted tissue needs to be efficiently and rapidly vascularised to prevent necrosis in vivo. To surmount these limitations, we aim to: i) improve our current technique of in vivo tissue engineering of human tracheae in a small number of patients and subsequently begin a formal clinical trial, ii) develop pharmacological approaches to activate endogenous stem cells and stimulate tissue regeneration and vascularisation in situ, iii) develop a synthetic tracheal scaffold using a novel nanocomposite polymer as an alternative to natural human scaffolds, and iv) develop a good manufacturing practice process for safe, efficient and cost-effective commercial production. This research project aims to define a robust airway implantation technique assuring a better outcome for thousands of patients each year. Moreover, we aim to use these results as a starting point to develop clinical approaches that could improve the treatment of age-related cancers of other hollow organs.


Reservoir for heavy hydrogen: Molecules of the heavy hydrogen isotopes deuterium and tritium preferentially bind to copper atoms in a metal-organic framework compound. The metal atoms are therefore symbolically represented as shells in this image. Credit: University Leipzig / Thomas Häse Deuterium and tritium are substances with a future - but they are rare. The heavy isotopes of hydrogen not only have numerous applications in science but could also contribute to the energy mix of tomorrow as fuels for nuclear fusion. Deuterium is also contained in some drugs that are currently undergoing regulatory approval in the US. However, the process of filtering deuterium out of the natural isotopic mixture of hydrogen is at present both difficult and expensive. Scientists from the Max Planck Institute for Intelligent Systems, the Max Planck Institute for Solid State Research, the University of Leipzig, Jacobs University Bremen, the University of Augsburg, and Oak Ridge National Laboratory (USA) may be able to remedy this problem. They have presented a metal-organic framework compound that can be used to separate the two isotopes from normal hydrogen more efficiently than previous methods. In drugs, deuterium has a life-prolonging effect – albeit initially only for the active substance itself. The human metabolic system breaks down molecules carrying the deuterium isotope, which is twice as heavy as hydrogen, more slowly than the same substance incorporating normal hydrogen. Drugs containing deuterium can therefore be given in smaller doses, which means that their side effects are also reduced. Deuterium, like the even heavier radioactive hydrogen isotope tritium, also plays a role in nuclear fusion. This process, which makes stars shine, may some day fuel power plants in which atomic nuclei are fused together, releasing large amounts of energy in the process. 
Whereas deuterium has only been used in pharmaceuticals for a short time and its potential use in power plants still lies in the future, it has long been used in science, for example to track the path of nutrients through the metabolic system. "Deuterium and, to a certain extent, tritium are useful in some applications," says Michael Hirscher, who, as Leader of a Research Group at the Max Planck Institute for Intelligent Systems, has played a key role in the research. "To date, however, it has been very difficult to separate deuterium from light hydrogen," he says. Deuterium is obtained from heavy water, which occurs in natural water at a concentration of just about 0.015 percent. The heavy water is first isolated by a combination of chemical and physical methods, such as distillation, to obtain deuterium gas. The whole process is so intricate and energy-intensive that a gramme of deuterium with a purity of 99.8 percent costs around 100 euros, making hydrogen's heavy brother around three times more precious than gold, although deuterium is more than 300 times more abundant in the oceans and Earth's crust than gold. "Our metal-organic framework compound should make it easier and less energy-intensive to isolate deuterium from the naturally occurring mixture of hydrogen isotopes," says Dirk Volkmer, whose colleagues in the Department of Solid-State Chemistry at the University of Augsburg synthesized the material. In a metal-organic framework, or MOF for short, metal ions are linked by organic molecules to form a crystal with relatively large pores. Such substances are able to absorb large quantities of gas in relation to their weight. In the compound that the research team proposes for use as a deuterium and tritium filter, zinc and copper ions form the metallic nodes. As early as 2012 the scientists presented a metal-organic framework compound containing only zinc as the metallic component.
It was able to filter out deuterium, but only at a temperature of minus 223 degrees Celsius. The Augsburg-based chemists therefore replaced some of the zinc atoms with copper atoms, whose electron shells filter out deuterium more selectively and do so at higher temperatures. Michael Hirscher and his staff at the Max Planck Institute for Intelligent Systems and researchers at the Oak Ridge National Laboratory confirmed this property in various tests. Among other things, they determined the quantities of deuterium and normal hydrogen that the material absorbs from a mixture of equal parts of the two isotopes at various temperatures. They found that at minus 173 degrees Celsius it stores twelve times more deuterium than normal hydrogen. "At this temperature the separating process can be cooled with liquid nitrogen, which makes it more cost-effective than methods that only work at minus 200 degrees," says Michael Hirscher. The team of theoretical chemists headed by Thomas Heine, who has recently assumed a chair at the University of Leipzig after previously teaching at Jacobs University in Bremen, helped interpret the collected data. "Our calculations fitted the various parts of the experimental puzzle together into a coherent picture," the scientist says. The data for deuterium and normal hydrogen showed that the predictions of the calculations agreed very well with the experimental results. The theoreticians are therefore confident that those calculations which cannot easily be tested experimentally are just as valid. "Our calculations for tritium would probably be right too. But this can only be experimentally confirmed under stringent safety procedures," Thomas Heine says. The material also absorbs the radioactive hydrogen isotope very efficiently from a mixture of isotopes.
That could be a useful property in a particular application in which the aim is not to obtain the isotope but to get rid of it. Water from power plants, including the water that flooded the Fukushima reactors in the 2011 disaster, contains tritium. The new metal-organic framework compound may provide a way to dispose of this radioactive waste, although the radioactively contaminated water first has to undergo electrolysis to convert the tritium-containing water molecules into tritium-containing hydrogen gas. However, before tritium and deuterium can be filtered out of the isotope mixture using large-pore crystals in practice, the technique first has to be refined, not least so that it absorbs more gas. Neutrons are ideal for studying the adsorption of molecular hydrogen: neutron scattering is a very sensitive tool for probing the motion of hydrogen, and neutrons also distinguish the signals coming from different isotopes such as hydrogen and deuterium. "In the metal-organic framework, hydrogen molecules adsorb on different sites; by tracking the relative populations of hydrogen and deuterium in each site, neutrons clearly elucidated the mechanism of isotopic separation," says Timmy Ramirez-Cuesta from the Spallation Neutron Source at Oak Ridge National Laboratory. The research made use of ORNL's VISION spectrometer, the world's most powerful chemical neutron spectrometer. More information: Capture of heavy hydrogen isotopes in a metal-organic framework with active Cu(I) sites. Nature Communications, 28 February 2017; DOI: 10.1038/ncomms14496


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: LCE-31-2016-2017 | Award Amount: 4.00M | Year: 2016

ECHOES is a multi-disciplinary research project providing policy makers with comprehensive information, data, and policy-ready recommendations about the successful implementation of the Energy Union and the SET plan. Individual and collective energy choices and the social acceptance of energy transitions are analysed in a multi-disciplinary process that includes key stakeholders as co-constructors of the knowledge. To account for the rich contexts in which individuals and collectives administer their energy choices, ECHOES utilizes three complementary perspectives: 1) individual decision-making as part of collectives, 2) collectives constituting energy cultures and life-styles, and 3) formal social units such as municipalities and states. To reduce greenhouse gas emissions and create a better Energy Union, system change is required. While technological change is a key component of this change, its successful implementation relies on the multi-disciplinary social science knowledge that ECHOES produces. Three broad technological foci therefore run as cross-cutting issues and recurrent themes through ECHOES: smart energy technologies, electric mobility, and buildings. All three technology foci address high-impact areas that have been prioritised by national and international policies and are associated with great potential savings in greenhouse gas emissions. ECHOES' uniquely comprehensive methodological approach includes a representative multinational survey covering all 28 EU countries plus Norway and Turkey, syntheses of existing data and literature, policy assessments, as well as quantitative experiments, interviews, netnography, focus groups, workshops, site visits and case studies in eight countries. All data collected in the project will be systematised in a built-for-purpose database that will serve both as an analytical tool for the project and as a valuable resource for stakeholders and researchers after the project's lifetime.


Schade M.,Humboldt University of Berlin | Berti D.,University of Florence | Huster D.,University of Leipzig | Herrmann A.,Humboldt University of Berlin | Arbuzova A.,Humboldt University of Berlin
Advances in Colloid and Interface Science | Year: 2014

Lipophilic nucleic acids have become a versatile tool for the structuring and functionalization of lipid bilayers and biological membranes, as well as cargo vehicles to transport and deliver bioactive compounds, such as interference RNA, into cells by taking advantage of reversible hybridization with complementary strands. This contribution reviews the different types of conjugates of lipophilic nucleic acids and their physicochemical and self-assembly properties. Strategies for choosing a nucleic acid, a lipophilic modification, and a linker are discussed. Specific focal points of the review are the interaction of lipophilic nucleic acids with lipid membranes and their stability, dynamic structure and assembly upon embedding into biological membranes. A large diversity of conjugates, including lipophilic peptide nucleic acids and siRNA, provides tailored solutions for specific applications in bio- and nanotechnology as well as in cell biology and medicine, as illustrated through some selected examples. © 2014 Elsevier B.V.


Bernstein P.A.,Microsoft | Madhavan J.,Google | Rahm E.,University of Leipzig
Proceedings of the VLDB Endowment | Year: 2011

In a paper published at the 2001 VLDB Conference, we proposed treating generic schema matching as an independent problem. We developed a taxonomy of existing techniques, a new schema matching algorithm, and an approach to comparative evaluation. Since then, the field has grown into a major research topic. We briefly summarize the new techniques that have been developed and applications of the techniques in the commercial world. We conclude by discussing future trends and recommendations for further work. © 2011 VLDB Endowment.
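For readers new to the area, the core of many matchers in the surveyed taxonomy is a similarity function over schema elements plus an acceptance threshold. Below is a deliberately minimal, hypothetical sketch using only attribute-name similarity; real systems combine linguistic, structural, and instance-based evidence, and none of the names here come from the paper:

```python
# Illustrative sketch: a minimal name-based schema matcher using normalized
# string similarity between attribute names (one evidence source of many).
from difflib import SequenceMatcher

def match_schemas(attrs_a, attrs_b, threshold=0.6):
    """Return (a, b, score) triples: the best match in attrs_b for each
    attribute in attrs_a whose name similarity reaches the threshold."""
    matches = []
    for a in attrs_a:
        scored = [(SequenceMatcher(None, a.lower(), b.lower()).ratio(), b)
                  for b in attrs_b]
        score, best = max(scored)      # highest-similarity candidate
        if score >= threshold:
            matches.append((a, best, round(score, 2)))
    return matches

print(match_schemas(["CustName", "Addr"], ["CustomerName", "Address", "Phone"]))
# [('CustName', 'CustomerName', 0.8), ('Addr', 'Address', 0.73)]
```

Thresholding a single similarity measure like this is exactly the kind of element-level, schema-only technique the taxonomy classifies; hybrid and composite matchers layer further matchers on top.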


Hoehndorf R.,University of Cambridge | Harris M.A.,University of Cambridge | Herre H.,University of Leipzig | Rustici G.,European Bioinformatics Institute | Gkoutos G.V.,University of Cambridge
Bioinformatics | Year: 2012

Motivation: The systematic observation of phenotypes has become a crucial tool of functional genomics, and several large international projects are currently underway to identify and characterize the phenotypes that are associated with genotypes in several species. To integrate phenotype descriptions within and across species, phenotype ontologies have been developed. Applying ontologies to unify phenotype descriptions in the domain of physiology has been a particular challenge due to the high complexity of the underlying domain. Results: In this study, we present the outline of a theory and its implementation for an ontology of physiology-related phenotypes. We provide a formal description of process attributes and relate them to the attributes of their temporal parts and participants. We apply our theory to create the Cellular Phenotype Ontology (CPO). The CPO is an ontology of morphological and physiological phenotypic characteristics of cells, cell components and cellular processes. Its prime application is to provide terms and uniform definition patterns for the annotation of cellular phenotypes. The CPO can be used for the annotation of observed abnormalities in domains, such as systems microscopy, in which cellular abnormalities are observed and for which no phenotype ontology has been created. © The Author 2012. Published by Oxford University Press. All rights reserved.


Fletcher W.J.,Goethe University Frankfurt | Fletcher W.J.,University of Manchester | Zielhofer C.,University of Leipzig
Catena | Year: 2013

In this paper we explore the evidence for Holocene Rapid Climate Changes (RCCs) in Western Mediterranean records, examining similarities and differences in the timing and nature of impacts on different components of the natural environment (vegetation, fluvial and coastal sedimentation, fire activity, soil formation). Marine, lacustrine, and fluvial archives of the Western Mediterranean (Iberian Peninsula and Northwest Africa) provide evidence for both pervasive millennial-scale climatic variability and abrupt (decadal- to centennial-scale) transitions. We focus in particular on three RCCs characterised by high-latitude cooling, glacier advances and North Atlantic ice-rafting events: the mid-Holocene RCC interval 6-5 cal. ka BP, the late-Holocene RCC interval 3.5-2.5 cal. ka BP, and the historical RCC interval known as the Little Ice Age (LIA, AD 1300-1950). Evidence from multiple records indicates wide-ranging impacts of RCCs in the Western Mediterranean region. The three RCC intervals were characterised, however, by contrasting hydrological situations in the Western Mediterranean, with prevailing dry conditions including marked aridification events during the RCC intervals 6-5 and 3.5-2.5 cal. ka BP, and prevailing or recurrent wet conditions during the LIA. We examine issues of proxy sensitivity in palaeoecological and geomorphological records and evaluate examples of contrasting geomorphological responses to regional climatic triggers between humid and semi-arid sectors of the Western Mediterranean. Finally, we consider the long-term sensitivity of the region to rapid climate change, the role of threshold changes, and the extent to which this region represents a "fragile" landscape. © 2011 Elsevier B.V.


Urban P.,University of Leipzig | Simonov A.,ETH Zurich | Weber T.,ETH Zurich | Oeckler O.,University of Leipzig
Journal of Applied Crystallography | Year: 2015

Metastable Ge4Bi2Te7 is highly disordered; the average structure corresponds to the rocksalt type. The diffraction pattern shows diffuse streaks interconnecting Bragg reflections along all cubic ⟨111⟩ directions. These streaks exhibit satellite-like maxima and arise from vacancy ordering in non-periodically spaced defect layers. The atom layers near these vacancy layers are displaced with respect to the average structure: they tend to form α-GeTe-type double layers. The three-dimensional difference pair distribution function (3D-ΔPDF) method yields quantitative information on the distribution of defect layer spacings, which peaks at a value corresponding to Ge3Bi2Te6 building blocks. The cation distribution, along with the displacement of the atom layers, is also refined using a least-squares approach. Bi concentrates on cation positions next to the vacancy layers. © 2015 International Union of Crystallography.


Bader A.,University of Leipzig | Macchiarini P.,University of Florence
Journal of Cellular and Molecular Medicine | Year: 2010

In June 2008, the world's first whole tissue-engineered organ - the windpipe - was successfully transplanted into a 31-year-old woman, and about 18 months after surgery she is leading a near-normal life without immunosuppression. This outcome was achieved by employing three groundbreaking technologies of regenerative medicine: (i) a donor trachea first decellularized using a detergent (without denaturing the collagenous matrix), (ii) the two main autologous tracheal cell types, namely mesenchymal stem cell-derived cartilage-like cells and respiratory epithelial cells, and (iii) a specifically designed bioreactor that reseeds, before implantation, the in vitro pre-expanded and pre-differentiated autologous cells onto the desired surfaces of the decellularized matrix. Given the long-term safety, efficacy and effort involved in such a conventional approach, and the potential advantages of regenerative implants that would make them available to anyone, we have investigated a novel alternative concept of how to fully avoid in vitro cell replication, expansion and differentiation; to use the native human site as a micro-niche; and to potentiate the body's site-specific response by adding boosting, permissive and recruitment impulses, in full respect of sociological and regulatory prerequisites. This tissue-engineering approach and ongoing research in airway transplantation are reviewed and presented here. © 2010 The Authors Journal compilation © 2010 Foundation for Cellular and Molecular Medicine/Blackwell Publishing Ltd.


Pirozhenko I.G.,Joint Institute for Nuclear Research | Bordag M.,University of Leipzig
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2013

The electromagnetic vacuum energy is considered in the presence of a perfectly conducting plane and a ball with dielectric permittivity ε and magnetic permeability μ, μ≠1. The attention is focused on the Casimir repulsion in this system caused by the magnetic permeability of the sphere. In the case of a perfectly permeable sphere, μ=∞, the vacuum energy is estimated numerically. The short- and long-distance asymptotes corresponding to the repulsive force and respective low-temperature corrections and high-temperature limits are found for a wide range of μ. The constraints on the Casimir repulsion in this system are discussed. © 2013 American Physical Society.


Baldauf C.,Fritz Haber Institute of the Max Planck Society | Hofmann H.-J.,University of Leipzig
Helvetica Chimica Acta | Year: 2012

The enormous developments of computer technologies allow the broad employment of ab initio MO theory in foldamer research. In this review, we demonstrate the efficiency and reliability of ab initio MO methods for the description of the helix formation in oligomers of ω-amino acids on the basis of representative examples. Thus, ab initio MO theory successfully accompanies foldamer research by confirmation and interpretation of experimental results and stimulation of future experiments. The high predictive power of the methods opens the way to novel structure classes with special properties. Nowadays, ab initio MO theory has become an inherent part in the arsenal of methods applied in foldamer research. Copyright © 2012 Verlag Helvetica Chimica Acta AG, Zürich, Switzerland.


Loretz M.,ETH Zurich | Pezzagna S.,University of Leipzig | Meijer J.,University of Leipzig | Degen C.L.,ETH Zurich
Applied Physics Letters | Year: 2014

We present nanoscale nuclear magnetic resonance (NMR) measurements performed with nitrogen-vacancy (NV) centers located down to about 2 nm from the diamond surface. NV centers were created by shallow ion implantation followed by a slow, nanometer-by-nanometer removal of diamond material using oxidative etching in air. The close proximity of NV centers to the surface yielded large ¹H NMR signals of up to 3.4 μT-rms, corresponding to ∼330 statistically polarized or ∼10 fully polarized proton spins in a (1.8 nm)³ detection volume. © 2014 AIP Publishing LLC.
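The quoted spin number can be roughly checked with a back-of-envelope proton count. The proton density used below (~6 × 10²⁸ m⁻³, a typical value for organic liquids such as microscope immersion oil) is an illustrative assumption, not a figure taken from the paper:

```python
# Back-of-envelope check: number of protons in a (1.8 nm)^3 detection volume,
# assuming a proton density typical of organic liquids (~6e28 m^-3).
# The density is an assumed value for illustration only.
proton_density = 6e28          # protons per m^3 (assumed)
side = 1.8e-9                  # detection-volume edge length in m
volume = side ** 3             # m^3
n_protons = proton_density * volume
print(f"{n_protons:.0f} protons")  # order of magnitude of the ~330 spins quoted
```

With these inputs the estimate lands near 350 protons, i.e. on the order of the ~330 statistically polarized spins reported.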


Bellouin N.,UK Met Office | Bellouin N.,University of Reading | Quaas J.,University of Leipzig | Morcrette J.-J.,European Center for Medium range Weather Forecast | Boucher O.,University Pierre and Marie Curie
Atmospheric Chemistry and Physics | Year: 2013

The European Centre for Medium-Range Weather Forecasts (ECMWF) provides an aerosol re-analysis starting from year 2003 for the Monitoring Atmospheric Composition and Climate (MACC) project. The re-analysis assimilates total aerosol optical depth retrieved by the Moderate Resolution Imaging Spectroradiometer (MODIS) to correct for model departures from observed aerosols. The re-analysis therefore combines satellite retrievals with the full spatial coverage of a numerical model. Re-analysed products are used here to estimate the shortwave direct and first indirect radiative forcing of anthropogenic aerosols over the period 2003-2010, using methods previously applied to satellite retrievals of aerosols and clouds. The best estimate of globally-averaged, all-sky direct radiative forcing is −0.7 ± 0.3 W m−2. The standard deviation is obtained by a Monte-Carlo analysis of uncertainties, which accounts for uncertainties in the aerosol anthropogenic fraction, aerosol absorption, and cloudy-sky effects. Further accounting for differences between the present-day natural and pre-industrial aerosols provides a direct radiative forcing estimate of −0.4 ± 0.3 W m−2. The best estimate of globally-averaged, all-sky first indirect radiative forcing is −0.6 ± 0.4 W m−2. Its standard deviation accounts for uncertainties in the aerosol anthropogenic fraction, and in cloud albedo and cloud droplet number concentration susceptibilities to aerosol changes. The distribution of first indirect radiative forcing is asymmetric and is bounded by −0.1 and −2.0 W m−2. In order to decrease uncertainty ranges, better observational constraints on aerosol absorption and sensitivity of cloud droplet number concentrations to aerosol changes are required. © 2013 Author(s).
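The Monte-Carlo treatment of forcing uncertainty can be illustrated with a generic sketch. The simple product model and all input distributions below are invented placeholders, not the study's actual inputs; they only show how sampling uncertain factors yields a mean forcing and a standard deviation:

```python
import random

# Illustrative Monte-Carlo propagation for a product-type forcing estimate
# F = -E * f_anth, where the radiative efficiency E and the anthropogenic
# aerosol fraction f_anth are uncertain. All numbers are made-up placeholders;
# the study's actual model and inputs differ.
random.seed(0)

def sample_forcing():
    efficiency = random.gauss(1.4, 0.4)   # W m^-2 per unit AOD (assumed)
    f_anth = random.gauss(0.5, 0.15)      # anthropogenic fraction (assumed)
    return -efficiency * f_anth

samples = [sample_forcing() for _ in range(100_000)]
mean = sum(samples) / len(samples)
std = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
print(f"forcing = {mean:.2f} +/- {std:.2f} W m^-2")
```

The spread of the sampled forcings directly gives the quoted "best estimate ± standard deviation" form of the result.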


Franke H.,University of Leipzig | Verkhratsky A.,University of Manchester | Burnstock G.,University College London | Illes P.,University of Leipzig
Purinergic Signalling | Year: 2012

Astrocytes are fundamental for central nervous system (CNS) physiology and are the fulcrum of neurological diseases. Astroglial cells control development of the nervous system, regulate synaptogenesis, maturation, maintenance and plasticity of synapses and are central for nervous system homeostasis. Astroglial reactions determine progression and outcome of many neuropathologies and are critical for regeneration and remodelling of neural circuits following trauma, stroke, ischaemia or neurodegenerative disorders. They secrete multiple neurotransmitters and neurohormones to communicate with neurones, microglia and the vascular walls of capillaries. Signalling through release of ATP is the most widespread means of communication between astrocytes and other types of neural cells. ATP serves as a fast excitatory neurotransmitter and has pronounced long-term (trophic) roles in cell proliferation, growth and development. During pathology, ATP is released from damaged cells and acts both as a cytotoxic factor and a proinflammatory mediator, being a universal "danger" signal. In this review, we summarise contemporary knowledge on the role of purinergic receptors (P2Rs) in a variety of diseases in relation to changes of astrocytic functions and nucleotide signalling. We have focussed on the role of the ionotropic P2X and metabotropic P2Y receptors working alone or in concert to modify the release of neurotransmitters, to activate signalling cascades and to change the expression levels of ion channels and protein kinases. All these effects are of great importance for the initiation, progression and maintenance of astrogliosis, the conserved and ubiquitous glial defensive reaction to CNS pathologies. We highlighted specific aspects of reactive astrogliosis, especially with respect to the involvement of the P2X7 and P2Y1 receptor subtypes. Reactive astrogliosis exerts both beneficial and detrimental effects in a context-specific manner determined by distinct molecular signalling cascades.
Understanding the role of purinergic signalling in astrocytes is critical to identifying new therapeutic principles to treat acute and chronic neurological diseases. © 2012 Springer Science+Business Media B.V.


Heinke L.,University of Leipzig | Heinke L.,Fritz Haber Institute of the Max Planck Society | Heinke L.,Lawrence Berkeley National Laboratory | Karger J.,University of Leipzig
Physical Review Letters | Year: 2011

The rates of uptake and release of guest molecules in nanoporous solids are often strongly influenced or even controlled by transport resistances at the external surface ("surface barriers") rather than by intraparticle diffusion, which was assumed to be rate controlling in many of the earlier kinetic studies. By correlating the surface resistance with the intracrystalline diffusivity, we develop here a microkinetic model which closely reproduces the experimentally observed results for short-chain alkanes in Zn(tbip), a member of the novel metal-organic framework family of nanoporous materials. It seems likely that this mechanism, which is shown to provide a rational explanation of the commonly observed discrepancies between "macro" and "micro" measurements of intracrystalline diffusion, may be fairly general. © 2011 American Physical Society.
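The interplay of the two resistances can be sketched with the textbook first-moment expression for uptake by a spherical crystal with surface permeance α and intracrystalline diffusivity D, τ = R/(3α) + R²/(15D). This is the generic series-resistance picture, not the specific microkinetic model of the paper, and all parameter values below are arbitrary illustrations:

```python
# Which resistance dominates the mean uptake time
# tau = R/(3*alpha) + R^2/(15*D) for a spherical crystal?
# (Standard first-moment expression; all numbers are arbitrary.)
def uptake_contributions(R, alpha, D):
    surface = R / (3 * alpha)        # surface-barrier term (s)
    diffusion = R ** 2 / (15 * D)    # intracrystalline-diffusion term (s)
    return surface, diffusion

R = 1e-6        # m, crystal radius (assumed)
alpha = 1e-8    # m/s, surface permeance (assumed)
D = 1e-12       # m^2/s, intracrystalline diffusivity (assumed)
surface, diffusion = uptake_contributions(R, alpha, D)
print(f"surface barrier: {surface:.2f} s, diffusion: {diffusion:.3f} s")
# With these numbers the surface barrier dominates the uptake time by ~500x,
# even though diffusion alone would suggest much faster equilibration.
```

Such a dominant surface term is exactly the situation in which "macro" uptake measurements report much slower transport than "micro" measurements of intracrystalline diffusion.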


Bordag M.,University of Leipzig | Pirozhenko I.G.,Joint Institute for Nuclear Research
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2015

Within the Dirac model for the electronic excitations of graphene, we calculate the full polarization tensor with finite mass and chemical potential. It has, besides the (00)-component, a second form factor, which must be accounted for. We obtain explicit formulas for both form factors and for the reflection coefficients. Using these, we discuss the regions in the momentum-frequency plane where plasmons may exist and give numeric solutions for the plasmon dispersion relations. It turns out that plasmons exist for both transverse electric (TE) and transverse magnetic (TM) polarizations over the whole range of the ratio of mass to chemical potential, except for zero chemical potential, where only a TE plasmon exists. © 2015 American Physical Society.


Bordag M.,University of Leipzig | Pirozhenko I.G.,Joint Institute for Nuclear Research
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2010

The violation of the third law of thermodynamics for metals described by the Drude model and for dielectrics with finite dc conductivity is one of the most interesting problems in the field of the Casimir effect. It manifests itself as a nonvanishing entropy at vanishing temperature. We review the relevant calculations for plane surfaces and calculate the corresponding contributions for a ball in front of a plane. In this geometry, these appear in much the same way as for parallel planes. We conclude that the violation of the third law is not related to the infinite size of the planes. © 2010 The American Physical Society.


Kohler A.C.,University of Regensburg | Kohler A.C.,University of Leipzig | Sag C.M.,University of Regensburg | Maier L.S.,University of Regensburg
Journal of Molecular and Cellular Cardiology | Year: 2014

Reactive oxygen species (ROS) are highly reactive oxygen-derived chemical compounds that are by-products of aerobic cellular metabolism as well as crucial second messengers in numerous signaling pathways. In excitation-contraction-coupling (ECC), which links electrical signaling and coordinated cardiac contraction, ROS have a severe impact on several key ion handling proteins such as ion channels and transporters, but also on regulating proteins such as protein kinases (e.g. CaMKII, PKA or PKC), thereby pivotally influencing the delicate balance of this finely tuned system. While essential as second messengers, ROS may be deleterious when excessively produced due to a disturbed balance in Na+ and Ca2+ handling, resulting in Na+ and Ca2+ overload, SR Ca2+ loss and contractile dysfunction. This may, in the end, result in systolic and diastolic dysfunction and arrhythmias. This review aims to provide an overview of the single targets of ROS in ECC and to outline the role of ROS in major cardiac pathologies, such as heart failure and arrhythmogenesis. This article is part of a Special Issue entitled "Redox Signalling in the Cardiovascular System". © 2014 Elsevier Ltd.


Niederwieser D.,University of Leipzig | Schmitz S.,Onkologische Schwerpunktpraxis
European Journal of Haematology | Year: 2011

The regulation of biosimilars is a process that is still developing. In Europe, guidance regarding the approval and use of biosimilars has evolved with the products under consideration. It is now more than 3 years since the first biosimilar agents in oncology support, erythropoiesis-stimulating agents, were approved in the EU. More recently, biosimilar granulocyte colony-stimulating factors have received marketing approval in Europe. This review considers general issues surrounding the introduction of biosimilars and highlights current specific issues pertinent to their use in clinical practice in oncology. Information on marketing approval, extrapolation, labelling, substitution, immunogenicity and traceability of each biosimilar product is important, especially in oncology, where patients are treated in repeated therapy courses, often with complicated protocols, and where biosimilars are not used as a unique therapy for replacement of e.g. growth hormone or insulin. While future developments in the regulation of biosimilars will need to address multiple issues, in the interim physicians should remain aware of the inherent differences between biosimilar and innovator products. © 2011 John Wiley & Sons A/S.


Illes P.,University of Leipzig | Verkhratsky A.,University of Manchester | Burnstock G.,University College London | Franke H.,University of Leipzig
Neuroscientist | Year: 2012

Astrocytes are a class of neural cells that control homeostasis at all levels of the central and peripheral nervous system. There is a bidirectional neuron-glia interaction via a number of extracellular signaling molecules, glutamate and ATP being the most widespread. ATP activates ionotropic P2X and metabotropic P2Y receptors, which operate in both neurons and astrocytes. Morphological, biochemical, and functional evidence indicates the expression of astroglial P2X1/5 heteromeric and P2X7 homomeric receptors, which mediate physiological and pathophysiological responses. Activation of P2X1/5 receptors triggers a rapid increase of intracellular Na+ that initiates immediate cellular reactions, such as the depression of the glutamate transporter to keep high glutamate concentrations in the synaptic cleft, the activation of the local lactate shuttle to supply energy substrate to pre- and postsynaptic neuronal structures, and the reversal of the Na+/Ca2+ exchange resulting in additional Ca2+ entry. The consequences of P2X7 receptor activation are mostly but not exclusively mediated by the entry of Ca2+ and result in reorganization of the cytoskeleton, inflammation, apoptosis/necrosis, and proliferation, usually on a prolonged time scale. Thus, via P2X1/5 and P2X7 receptors, astroglia detect both the physiological concentrations of ATP secreted from presynaptic nerve terminals and the much higher concentrations of ATP attained under pathological conditions. © 2012 The Author(s).


Ho H.T.,Max Planck Institute for Human Cognitive and Brain Sciences | Schroger E.,University of Leipzig | Kotz S.A.,University of Manchester
Journal of Cognitive Neuroscience | Year: 2015

Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face–voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face–voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face–voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective, one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors. © 2015 Massachusetts Institute of Technology.


Connell L.,University of Manchester | Lynott D.,University of Manchester | Dreyer F.,University of Leipzig
PLoS ONE | Year: 2012

Theories of embodied cognition suggest that conceptual processing relies on the same neural resources that are utilized for perception and action. Evidence for these perceptual simulations comes from neuroimaging and behavioural research, such as demonstrations of somatotopic motor cortex activations following the presentation of action-related words, or facilitation of grasp responses following presentation of object names. However, the interpretation of such effects has been called into question by suggestions that neural activation in modality-specific sensorimotor regions may be epiphenomenal, and merely the result of spreading activations from "disembodied", abstracted, symbolic representations. Here, we present two studies that focus on the perceptual modalities of touch and proprioception. We show that in a timed object-comparison task, concurrent tactile or proprioceptive stimulation to the hands facilitates conceptual processing relative to control stimulation. This facilitation occurs only for small, manipulable objects, where tactile and proprioceptive information form part of the multimodal perceptual experience of interacting with such objects, but facilitation is not observed for large, nonmanipulable objects where such perceptual information is uninformative. Importantly, these facilitation effects are independent of motor and action planning, and indicate that modality-specific perceptual information plays a functionally constitutive role in our mental representations of objects, which supports embodied assumptions that concepts are grounded in the same neural systems that govern perception and action. © 2012 Connell et al.


Neumann M.,University of Regensburg | Zeitler K.,University of Regensburg | Zeitler K.,University of Leipzig
Chemistry - A European Journal | Year: 2013

Metal-free cooperation: The cooperative combination of Eosin Y as a photoredox catalyst with an organocatalytic thiourea allows for the highly diastereoselective construction of trans-1,2-cycloalkanes and heterocycles. This new, efficient cooperative organophotoredox/organocatalysis protocol presents a valuable alternative to metal-based photoredox approaches and is the first example of combining photoredox with hydrogen-bond catalysis (see scheme). Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.


Bickel T.,French National Center for Scientific Research | Zecua G.,University of Leipzig | Wurger A.,French National Center for Scientific Research
Physical Review E - Statistical, Nonlinear, and Soft Matter Physics | Year: 2014

We theoretically study the motion of surface-active Janus particles, driven by an effective slip velocity due to a nonuniform temperature or concentration field ψ. With parameters realized in thermal traps, we find that the torque exerted by the gradient ∇ψ inhibits rotational diffusion and favors alignment of the particle axes. In a swarm of active particles, this polarization adds a novel term to the drift velocity and modifies the collective behavior. Self-polarization in a nonuniform laser beam could be used for guiding hot particles along a given trajectory. © 2014 American Physical Society.


Holloczki O.,University of Leipzig | Nyulaszi L.,Budapest University of Technology and Economics
Topics in Current Chemistry | Year: 2014

In the last decade an explosive development has been observed in the fields of both ionic liquids (ILs) as potential chemically inert solvents with many possible technical applications, and N-heterocyclic carbenes (NHCs) as catalysts with superb performance. Since the cations of many ILs can be deprotonated by strong bases yielding NHCs, these two fields are inherently connected. It has only recently been recognized that some of the commonly used basic anions of the ILs (such as acetate) are able to deprotonate azolium cations. While the resulting NHC could clearly be observed in the vapor phase, in the liquid – where the mutual electrostatic interactions within the ion network stabilize the ion pairs – the neutral NHC cannot be detected by commonly used analytical techniques; however, from these ionic liquids NHCs can be trapped, e.g., by complex formation, or more importantly these ILs can be directly used as catalysts, since the NHC content is sufficiently large for these applications. Apart from imidazole-2-ylidenes, the formation of other highly reactive neutral species ("abnormal carbenes," 2-alkylideneimidazoles, pyridine-ylidenes or pyridinium-ylides) is feasible in highly basic ionic liquids. The cross-fertilizing overlap between the two fields may provide access to a great advance in both areas, and we give an overview here of the results published so far, and also of the remaining possibilities and challenges in the concept of "carbenes from ionic liquids." © Springer-Verlag Berlin Heidelberg 2013.


Ganzer R.,University of Regensburg | Stolzenburg J.-U.,University of Leipzig | Wieland W.F.,University of Regensburg | Brundl J.,University of Regensburg
European Urology | Year: 2012

Background: Many authors advocate a high anterior incision during nerve-sparing radical prostatectomy (RP) to improve potency results. Despite a growing number of studies describing autonomic nerves in the ventrolateral position of the prostate, little is known about their quality and their role in erectile function. Objective: The intention of this study was a detailed characterisation of the topographic distribution of periprostatic nerves, including immunohistochemical differentiation of proerectile parasympathetic from sympathetic nerves. Design, setting, and participants: A total of 228 whole-mount sections of 38 prostates (base, middle, apex) from patients following non-nerve-sparing laparoscopic RP were analysed. Immunohistochemical analysis was performed using antibodies against tyrosine hydroxylase for sympathetic and vesicular acetylcholine transporter for parasympathetic nerve fibre staining. Outcome measurements and statistical analysis: Quantification of periprostatic parasympathetic and sympathetic nerves was performed after defining prostatic regions via a digital grid. Differences among three independent variables were tested with the nonparametric Kruskal-Wallis test. Results and limitations: The total number of parasympathetic nerves did not decrease from the base to the apex. They were dispersed at the base and mainly located dorsolaterally at the apex, with 14.6% above the horizontal line at the base and only 1.5% at the apex. In contrast, the total number of sympathetic nerves decreased significantly from base to apex, with a constant proportion of ventrolateral nerves between 9% (base) and 6.2% (apex). This anatomic study is limited by the investigation of postprostatectomy specimens and the lack of functional results. Conclusions: Despite the presence of ventrolateral periprostatic nerves, only a minority of these nerves seems to have a parasympathetic proerectile quality. 
The arguments in favour of a high anterior incision during nerve-sparing prostatectomy might not only include preserved nerves but also other factors, such as reduced traction or improved anatomic support of the neural structures. © 2012 European Association of Urology.


Boosmann K.,University of Leipzig | Heinen A.,Heinrich Heine University Düsseldorf | Kury P.,Heinrich Heine University Düsseldorf | Engele J.,University of Leipzig
Journal of Cell Science | Year: 2010

The alternative SDF-1 (stromal cell derived factor-1) receptor, CXCR7, has been suggested to act as either a scavenger of extracellular SDF-1 or a modulator of the primary SDF-1 receptor, CXCR4. CXCR7, however, also directly affects the function of various tumor-cell types. Here, we demonstrate that CXCR7 is an active component of SDF-1 signalling in astrocytes and Schwann cells. Cultured cortical astrocytes and peripheral nerve Schwann cells exhibit comparable total and cell-surface levels of expression of both SDF-1 receptors. Stimulation of astrocytes with SDF-1 resulted in the temporary activation of Erk1/2, Akt and PKCζ/λ, but not p38 and PKCα/β. Schwann cells showed SDF-1-induced activation of Erk1/2, Akt and p38, but not PKCα/β and PKCζ/λ. The respective signalling pattern remained fully inducible in astrocytes from CXCR4-deficient mice, but was abrogated following depletion of astrocytic CXCR7 by RNAi. In Schwann cells, RNAi-mediated depletion of either CXCR4 or CXCR7 silenced SDF-1 signalling. The findings of the astrocytic receptor-depletion experiments were reproduced by CXCR7 antagonist CCX754, but not by CXCR4 antagonist AMD3100, both of which abolished astrocytic SDF-1 signalling. Further underlining the functional importance of CXCR7 signalling in glial cells, we show that the mitogenic effects of SDF-1 on both glial cell types are impaired upon depleting CXCR7.


Karges B.,RWTH Aachen | Rosenbauer J.,Heinrich Heine University Düsseldorf | Kapellen T.,University of Leipzig | Wagner V.M.,University of Lübeck | And 3 more authors.
PLoS Medicine | Year: 2014

Severe hypoglycemia is a major complication of insulin treatment in patients with type 1 diabetes, limiting full realization of glycemic control. It has been shown in the past that low levels of hemoglobin A1c (HbA1c), a marker of average plasma glucose, predict a high risk of severe hypoglycemia, but it is uncertain whether this association still exists. Based on advances in diabetes technology and pharmacotherapy, we hypothesized that the inverse association between severe hypoglycemia and HbA1c has decreased in recent years. We analyzed data of 37,539 patients with type 1 diabetes (mean age ± standard deviation 14.4 ± 3.8 y, range 1–20 y) from the DPV (Diabetes Patienten Verlaufsdokumentation) Initiative diabetes cohort prospectively documented between January 1, 1995, and December 31, 2012. The DPV cohort covers an estimated proportion of >80% of all pediatric diabetes patients in Germany and Austria. Associations of severe hypoglycemia, hypoglycemic coma, and HbA1c levels were assessed by multivariable regression analysis. From 1995 to 2012, the relative risk (RR) for severe hypoglycemia and coma per 1% HbA1c decrease declined from 1.28 (95% CI 1.19–1.37) to 1.05 (1.00–1.09) and from 1.39 (1.23–1.56) to 1.01 (0.93–1.10), respectively, corresponding to a risk reduction of 1.2% (95% CI 0.6–1.7, p<0.001) and 1.9% (0.8–2.9, p<0.001) each year, respectively. Risk reduction of severe hypoglycemia and coma was strongest in patients with HbA1c levels of 6.0%–6.9% (RR 0.96 and 0.90 each year) and 7.0%–7.9% (RR 0.96 and 0.89 each year). From 1995 to 2012, glucose monitoring frequency and the use of insulin analogs and insulin pumps increased (p<0.001). Our study was not designed to investigate the effects of different treatment modalities on hypoglycemia risk.
Limitations are that associations between diabetes education and physical activity and severe hypoglycemia were not addressed in this study. The previously strong association of low HbA1c with severe hypoglycemia and coma in young individuals with type 1 diabetes has substantially decreased in the last decade, allowing achievement of near-normal glycemic control in these patients. Please see later in the article for the Editors' Summary. © 2014 Karges et al.
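The reported per-year risk reductions can be approximately recovered from the endpoint relative risks by assuming a constant multiplicative trend over the 17 years between 1995 and 2012 (a simplification of the multivariable regression the study actually used):

```python
def annual_change(rr_start, rr_end, years):
    """Average per-year fractional decline, assuming a constant
    multiplicative trend between the two endpoint relative risks."""
    return 1 - (rr_end / rr_start) ** (1 / years)

# Severe hypoglycemia: RR per 1% HbA1c decrease fell from 1.28 to 1.05.
sh = annual_change(1.28, 1.05, 17)
# Hypoglycemic coma: RR fell from 1.39 to 1.01.
coma = annual_change(1.39, 1.01, 17)
print(f"severe hypoglycemia: {sh:.1%} per year")  # ~1.2%, as reported
print(f"coma: {coma:.1%} per year")               # ~1.9%, as reported
```

The constant-trend approximation reproduces the reported 1.2% and 1.9% annual risk reductions, illustrating how the endpoint RRs and the per-year figures fit together.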


Grant
Agency: European Commission | Branch: FP7 | Program: CSA | Phase: ICT-2011.4.4 | Award Amount: 1.38M | Year: 2012

BioASQ will push for a solution to the information access problem of biomedical experts by setting up a challenge on biomedical semantic indexing and question answering (QA). Biomedical knowledge is dispersed in hundreds of heterogeneous knowledge sources and databases; many of them are connected on the Linked Open Data cloud. Biomedical experts, on the other hand, are in constant need of highly specialized information, which they cannot easily obtain. To address their needs, an information system needs to "understand" the data and answer the experts' questions efficiently. Often, however, experts need responses that cannot be answered by a single information source. To integrate information from disparate sources, semantic indexing of the vast quantities of information is needed to bridge the experts' needs with the available data sources. Semantic indexing is currently achieved by manual annotation, and does not scale up. Automating this process requires large-scale classification of data into hierarchically organized concepts. QA methods capable of "interpreting" questions in terms of the same concepts are also needed. BioASQ will push towards improved biomedical semantic indexing and QA via ambitious, yet realistic challenge tasks. The challenge will run in two stages, designed to (a) adapt traditional semantic indexing and QA methods to the needs of biomedical experts, and (b) collect feedback and improve the experimental setting itself. A large computational infrastructure, already available to the consortium, will be used to evaluate competing systems. The required datasets and evaluation measures will be established before the challenge. Biomedical experts will participate in the consortium, both as partners and through a supporting network of third parties.


Grant
Agency: European Commission | Branch: H2020 | Program: MSCA-ITN-ETN | Phase: MSCA-ITN-2014-ETN | Award Amount: 2.25M | Year: 2015

The use of visible light energy to induce chemical transformations constitutes an interesting and green activation mode of organic molecules. However, implementation of this energy source in organic synthetic methodologies and in the industrial production of fine chemicals has been challenging. The Photo4Future Innovative Training Network establishes a training network with five beneficiaries from academia and five from industry to tackle the challenges associated with photochemistry in a coherent and comprehensive fashion. In total, 13 Early Stage Researchers (ESRs) will be recruited within the Photo4Future network. The network will provide them with opportunities to undertake research aimed at overcoming the current limitations on the applicability and scalability of photochemical transformations. This will be achieved through the rational design of novel photocatalytic methodologies, improved catalytic systems and innovative photoreactors. Furthermore, the ESRs will be trained in the Photo4Future graduate school, covering scientific, personal and complementary skills. All the ESRs will perform two secondments, of which at least one is carried out with an industrial partner. Consequently, the ESRs will have improved career prospects and higher employability. Due to the high degree of industrial participation, the Photo4Future network will provide an innovation-friendly environment where scientific results can grow and become products or services that will benefit European economies.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-ITN | Phase: FP7-PEOPLE-2012-ITN | Award Amount: 3.68M | Year: 2013

The SusPhos training network will bring about a paradigm shift in the teaching of sustainable phosphorus chemistry, in the training of multidisciplinary-competent scientists, and in the public's view of chemistry, in order to preserve the essential element phosphorus from depletion. SusPhos represents the first systematic investigation of the eco-friendly production, smart use, recycling and commercial exploitation of phosphorus-based processes and materials that use the precious element phosphorus in a sustainable manner. This approach should lead to fundamental insights into sustainable technologies and create an ideal platform for the training of young, ambitious researchers in a superb collaborative European setting. SusPhos will educate 14 broadly oriented researchers at the interface of synthetic chemistry, catalysis, materials science, process chemistry, industrial phosphorus chemistry, and technology transfer. SusPhos' intense training module combines the complementary strengths of nine academic and three industrial teams to promote the intersectoral mobility of top-class, multi-skilled researchers and to foster cross-fertilization and research synergies between the public and private European chemical sectors. In its dual-mentor programme, each of the ESRs and ERs will be supervised by one mentor from academia and one from industry to ensure outstanding training in both sectors. The training programme uses highly innovative and timely methodologies to provide comprehensive multidisciplinary training of a new generation of young researchers capable of understanding and applying green chemistry to the conservation of phosphorus by environmentally benign conversions. Our SME Magpie Polymers and the leading chemical companies Thermphos, Arkema and DSM will ensure rapid and effective technology transfer. As such, the network will facilitate Europe's continued global leadership in the sustainable use of phosphorus amid increasingly fierce competition for resources.


Grant
Agency: European Commission | Branch: FP7 | Program: MC-IRSES | Phase: FP7-PEOPLE-2013-IRSES | Award Amount: 545.80K | Year: 2014

With the statics of a wealth of model systems in statistical and condensed-matter physics now relatively well understood, research in these fields is increasingly channelled towards the understanding of dynamical phenomena. This agenda is all the more pressing as dynamic effects are of crucial importance for many experimentally observed and technologically important phenomena. In the first work package, we will use the concerted effort and abundant synergies of the globe-spanning team of research groups to study a wide range of dynamical phenomena in statistical and condensed-matter physics, ranging from magnetic systems, to model systems on complex networks, to complex electronic systems such as graphene, to applications in soft matter and biophysics. The results and techniques are expected to boost the understanding of dynamical effects in the socio-economic model systems of work package 2 which, in turn, is expected to offer new perspectives on the description of the systems investigated here. Socio-economic systems are increasingly studied with tools borrowed from statistical and condensed-matter physics. The theory of complex networks, in particular, has undergone an exciting development and has now found applications in a wide range of fields, including transport problems in traffic systems, disease spreading, and models of agents acting in online communities. In the second work package, researchers in the consortium will apply techniques from the wide array of tools used in statistical and condensed-matter physics, as exemplified by the work items collected in work package 1, to a range of socio-economic systems, with a focus on traffic and public transport models as well as the modeling of human behavior via the proxy of massively multiplayer online games.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP-2008-1.1-1;NMP-2008-2.1-1 | Award Amount: 3.37M | Year: 2009

The goal of the MultiPlat project is to develop biomimetic proton-conductive membranes with nanometer thickness (nanomembranes) through the convergence of a number of fields. The primary application of this multipurpose nanotechnological platform is the next generation of fuel cells, where it will replace the prevailing evolutionary modifications of state-of-the-art solutions. Secondary applications cover diverse fields, including photonics, sensorics, biointerfaces, medicine and others. Natural proton-conductive nanomembranes are among the most ubiquitous building elements in biology. The core concept of the project is to postulate, introduce and fabricate novel composite nanomembranes and to functionalize them through the integration of proton-conducting nanochannels in a manner analogous to that in biological cells. At the same time, the nanomembranes will be strengthened through the introduction of an inorganic component. This functionalisation is itself a complex and largely unsolved issue. In this way the nanomembranes will merge artificial and biological properties. We intend to exploit the convergence of diverse fields including physics, chemistry, biomimetics, and nanotechnology. The focus will be primarily on the use of various nanotechnology methods for nanomembrane fabrication and their functionalisation through lamination, surface patterning, inclusion of fillers and structural modification through the engineering of built-in nanochannels. The research should result in functional models and a breadboard model. The industrial partners will ensure the application of the results. The objectives of the project fully satisfy the call NMP-2008-1.1-1, Converging sciences and technologies (nano, bio, info, cogni). The expected impacts include breakthroughs in knowledge in the converging fields, important practical applications and industrial innovations, with major significance for clean energy production, environmental protection and welfare improvement.


Grant
Agency: European Commission | Branch: FP7 | Program: BSG-SME | Phase: SME-1 | Award Amount: 1.47M | Year: 2008

The aim of this project is to prepare and start up the commercial exploitation of the semantic collaboration software OntoWiki in three different target markets, namely Enterprise Knowledge Management and semantically enhanced content management for E-Learning and E-Tourism. OntoWiki is a comprehensive open-source platform for social semantic collaboration. It is developed at Universität Leipzig and has a large and active user base. Within the course of the project, OntoWiki will be further developed and adapted to the needs of the SME for exploitation of OntoWiki in the prospective target markets.


Grant
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FETPROACT-01-2016 | Award Amount: 5.82M | Year: 2017

Social media and the digitization of news and discussion fora are having far-reaching effects on the way individuals and communities communicate, organize, and express themselves. Can the information circulating on these platforms be tapped to better understand and analyze the enormous problems facing our contemporary society? Could this help us to better monitor the growing number of social crises due to cultural differences and diverging world-views? Would this facilitate early detection and perhaps even ways to resolve conflicts before they lead to violence? The Odycceus project answers all these questions affirmatively. It will develop the conceptual foundations, methodologies, and tools to translate this bold vision into reality and demonstrate its power in a large number of cases. Specifically, the project seeks conceptual breakthroughs in Global Systems Science, including a fine-grained representation of cultural conflicts based on conceptual spaces and sophisticated text analysis, extensions of game theory to handle games with both divergent interests and divergent mindsets, and new models of alignment and polarization dynamics. The project will also develop an open modular platform, called Penelope, that integrates tools for the complete pipeline, from data scraped from social media and digital sources, to visualization of the analyses and models developed by the project. The platform features an infrastructure allowing developers to provide new plug-ins for additional steps in the pipeline, share them with others, and jointly develop the platform as an open source community. Finally, the project will build two innovative participatory tools, the Opinion Observatory and the Opinion Facilitator, which allow citizens to monitor, visualize and influence the dynamics of conflict situations that involve heterogeneous cultural biases and non-transparent entanglements of multilateral interests.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: HEALTH-2007-3.1-5 | Award Amount: 3.88M | Year: 2009

Recombinant growth hormone (GH) has been used since 1985. Current indications for GH use in children include GH deficiency and an increasing number of conditions where childhood short stature is not primarily due to deficient GH secretion. Approximately 40,000 children in the EU are treated with daily injections of GH. The efficacy of GH in increasing adult height is undisputed in children with severe GH deficiency but is more limited in other indications, where current estimates suggest a gain of about 1 cm of adult height per year of treatment. The clinical significance of these height gains has been poorly evaluated. The possibility has been raised that GH use in childhood might increase the risk of cancer later in life; however, little data is available to explore this concern further. SAGhE is an integrated consortium of paediatric endocrinologists, epidemiologists and biostatisticians that will collect and analyse data to address the questions of appropriateness and safety of childhood GH treatments. The impact on both height and psychosocial components will be evaluated in a large unbiased metacohort of patients followed to adult height. Safety will be evaluated by analysing long-term mortality and long-term cancer incidence. The data obtained will then be integrated and disseminated to several levels of users. SAGhE will contribute to the aims of the FP7 Health work programme and to the new Community Action Programme of Public Health in the field of better use of medicines. It will advance the application of evidence-based medicine in Europe through the size and design of the study, the independence and scientific quality of the data analysis, and its translation into evidence-based guidelines. It will be comprehensive at the EU level and will test for national differences. It will address patient safety, one of the key points of the work programme. SAGhE is unique worldwide in its design, size and potential to answer important questions raised about childhood GH treatments.


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP-2008-2.6-2 | Award Amount: 691.73K | Year: 2009

The proposed work aims at developing the tools required for the intelligent choice and tuning of nano-porous materials with respect to a specific application. For this purpose, a combined theoretical-computational and experimental study is envisaged in order to digitally reconstruct the porous matrix of selected advanced materials, mainly for applications involving sorption of carbon dioxide and methane, by employing advanced statistical-mechanics-based computer simulation methods at both the atomistic (Monte Carlo, ab initio, and equilibrium and non-equilibrium Molecular Dynamics) and mesoscopic levels (Kinetic Monte Carlo and Lattice Gas Cellular Automata). The reasoning behind this strategy is that the structure of these materials spans a wide range of length scales, so that sorption and transport phenomena depend on both length and time scale; the proposed computational methodology therefore consists of several levels in order to address these phenomena properly. Moreover, a complementary approach to computer simulations is provided through direct comparison of two highly sophisticated methods for measuring the motion of guest molecules inside porous materials, namely quasi-elastic neutron scattering (QENS) and pulsed field gradient nuclear magnetic resonance (PFG NMR), carried out by the groups in Lyon and Leipzig, respectively. Such combined studies can be fully exploited in the proposed work to gain a fascinating insight into the relation between the material's interior and the sorption and transport mechanisms of sorbates such as carbon dioxide and methane, both of which are involved in the so-called greenhouse effect.
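As background for the two measurement techniques named above (standard textbook relations, not results of the proposal): three-dimensional self-diffusion obeys the Einstein relation, and PFG NMR extracts the diffusion coefficient D from the Stejskal-Tanner echo attenuation,

```latex
\bigl\langle |\mathbf{r}(t)-\mathbf{r}(0)|^{2} \bigr\rangle = 6\,D\,t,
\qquad
\frac{E(g)}{E(0)} = \exp\!\Bigl[-\gamma^{2} g^{2} \delta^{2}\bigl(\Delta - \tfrac{\delta}{3}\bigr)\,D\Bigr],
```

where γ is the gyromagnetic ratio, g the gradient strength, δ the gradient pulse duration, and Δ the diffusion (observation) time.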


Vladimirov A.A.,Joint Institute for Nuclear Research | Ihle D.,University of Leipzig | Plakida N.M.,RAS Institute for Nuclear Research
Physical Review B - Condensed Matter and Materials Physics | Year: 2011

A microscopic theory of the dynamic spin susceptibility (DSS) in the superconducting state within the t-J model is presented. It is based on an exact representation for the DSS obtained by applying the Mori-type projection technique for the relaxation function in terms of Hubbard operators. The static spin susceptibility is evaluated by a sum-rule-conserving generalized mean-field approximation, while the self-energy is calculated in the mode-coupling approximation. The spectrum of spin excitations is studied in a homogeneous phase of the underdoped and optimally doped regions. The DSS reveals a resonance mode (RM) at the antiferromagnetic wave vector Q=π(1,1) at low temperatures due to a strong suppression of the damping of spin excitations. This is explained by an involvement of spin excitations in the decay process in addition to the particle-hole continuum usually considered in random-phase-type approximations. The spin gap in the spin-excitation spectrum at Q plays a dominant role in limiting the decay in comparison with the superconducting gap, which results in the observation of the RM even above Tc in the underdoped region. A good agreement with inelastic neutron-scattering experiments on the RM in YBa2Cu3Oy compounds is found. © 2011 American Physical Society.


Patent
Tiergesundheitsdienst Bayern E.V. and University of Leipzig | Date: 2010-10-22

The present invention refers to a novel circovirus as causative agent of bone marrow aplasia with haemorrhagic disease in cattle. The present invention provides novel nucleic acid and protein sequences for diagnostic and therapeutic uses.


Harrison C.,Guy's Hospital | Kiladjian J.-J.,University Paris Diderot | Al-Ali H.K.,University of Leipzig | Gisslinger H.,Medical University of Vienna | And 9 more authors.
New England Journal of Medicine | Year: 2012

BACKGROUND: Treatment options for myelofibrosis are limited. We evaluated the efficacy and safety of ruxolitinib, a potent and selective Janus kinase (JAK) 1 and 2 inhibitor, as compared with the best available therapy, in patients with myelofibrosis. METHODS: We assigned 219 patients with intermediate-2 or high-risk primary myelofibrosis, post-polycythemia vera myelofibrosis, or post-essential thrombocythemia myelofibrosis to receive oral ruxolitinib or the best available therapy. The primary end point and key secondary end point of the study were the percentage of patients with at least a 35% reduction in spleen volume at week 48 and at week 24, respectively, as assessed with the use of magnetic resonance imaging or computed tomography. RESULTS: A total of 28% of the patients in the ruxolitinib group had at least a 35% reduction in spleen volume at week 48, as compared with 0% in the group receiving the best available therapy (P<0.001); the corresponding percentages at week 24 were 32% and 0% (P<0.001). At 48 weeks, the mean palpable spleen length had decreased by 56% with ruxolitinib but had increased by 4% with the best available therapy. The median duration of response with ruxolitinib was not reached, with 80% of patients still having a response at a median follow-up of 12 months. Patients in the ruxolitinib group had an improvement in overall quality-of-life measures and a reduction in symptoms associated with myelofibrosis. The most common hematologic abnormalities of grade 3 or higher in either group were thrombocytopenia and anemia, which were managed with a dose reduction, interruption of treatment, or transfusion. One patient in each group discontinued treatment owing to thrombocytopenia, and none discontinued owing to anemia. Nonhematologic adverse events were rare and mostly grade 1 or 2. Two cases of acute myeloid leukemia were reported with the best available therapy. 
CONCLUSIONS: Continuous ruxolitinib therapy, as compared with the best available therapy, was associated with marked and durable reductions in splenomegaly and disease-related symptoms, improvements in role functioning and quality of life, and modest toxic effects. An influence on overall survival has not yet been shown. (Funded by Novartis Pharmaceuticals; ClinicalTrials.gov number, NCT00934544.). Copyright © 2012 Massachusetts Medical Society.


Borchardt S.,University of Leipzig | Selmke M.,Princeton University
Applied Optics | Year: 2015

We describe the individual contributions to the intensity distribution of the parhelic circle for plate-oriented hexagonal crystals at exactly zero solar elevation using geometrical optics. An experimental as well as theoretical study of in-plane ray paths provides details on the mechanism for several halos, including the parhelia, the 120° and 90° parhelia, a blue edge, and the Liljequist parhelia. Azimuthal coordinates for associated characteristic features in the intensity distribution are compared with experimental data obtained using a spinning hexagonal glass prism. © 2015 Optical Society of America.
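For orientation, the azimuth of the ordinary parhelia at zero solar elevation follows from the standard minimum-deviation formula for a prism (a textbook relation included here as context; the paper's own azimuthal coordinates are more detailed):

```latex
D_{\min} = 2\arcsin\!\Bigl(n\,\sin\frac{A}{2}\Bigr) - A,
\qquad
n \approx 1.31,\; A = 60^{\circ}
\;\Rightarrow\;
D_{\min} \approx 22^{\circ}.
```

Here A is the effective prism angle between alternate side faces of the hexagonal crystal and n the refractive index of ice (or, in the experiment, of the glass prism).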


Brewka G.,University of Leipzig | Eiter T.,Vienna University of Technology | Truszczynski M.,University of Kentucky
Communications of the ACM | Year: 2011

Can solving hard computational problems be made easy? If we restrict the scope of the question to computational problems that can be stated in terms of constraints over binary domains, and if we understand "easy" as "using a simple and intuitive modeling language that comes with software for processing programs in the language," then the answer is Yes! Answer Set Programming (ASP, for short) fits the bill. While already well represented at research conferences and workshops, ASP has been around for barely more than a decade. Its origins, however, go back a long time; it is an outcome of years of research in knowledge representation, logic programming, and constraint satisfaction, areas that sought and studied declarative languages to model domain knowledge, as well as general-purpose computational tools for processing programs and theories that represent problem specifications in these languages. ASP borrows from each of these areas. © 2011 ACM.
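The "constraints over binary domains" framing can be made concrete with a toy example. The sketch below is illustrative Python, not part of the article: real ASP systems such as clingo state such problems declaratively in a rule language and search far more cleverly than this brute-force enumeration. Here, each binary variable says whether a vertex belongs to a vertex cover, and the constraint is that every edge has a covered endpoint:

```python
from itertools import product

def min_vertex_cover(vertices, edges):
    """Brute-force search over binary assignments: pick[v] = 1 iff v is in
    the cover. An ASP encoding would express the same constraint in two or
    three rules and delegate the search to the solver."""
    best = None
    for pick in product([0, 1], repeat=len(vertices)):
        chosen = {v for v, p in zip(vertices, pick) if p}
        # constraint: every edge must have at least one chosen endpoint
        if all(u in chosen or v in chosen for u, v in edges):
            if best is None or len(chosen) < len(best):
                best = chosen
    return best

# Example: the path a-b-c-d needs two vertices to cover its three edges
edges = [("a", "b"), ("b", "c"), ("c", "d")]
cover = min_vertex_cover(["a", "b", "c", "d"], edges)
print(len(cover))
```

The exponential loop over assignments is exactly what a dedicated solver avoids; the declarative problem statement, however, is the same.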


Castaneda J.M.M.,University of Leipzig | Guilarte J.M.,University of Salamanca | Mosquera A.M.,University of Salamanca | Mosquera A.M.,University of Caldas
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2013

In this paper the quantum vacuum energies induced by massive fluctuations of one real scalar field on a configuration of two partially transparent plates are analyzed. The physical properties of the infinitely thin plates are characterized by two Dirac-δ potentials. We find that an attractive or repulsive Casimir force arises between the plates when the weights of the δ's have equal or different signs. If one of the plates absorbs fluctuations below some energy threshold (the corresponding weight is negative), a minimum mass must be imposed on the scalar field fluctuations to preserve unitarity in the corresponding quantum field theory. Two repulsive δ interactions are compatible with massless fluctuations. The effect of Dirichlet boundary conditions at the endpoints of the interval (-a,a) on a massless scalar quantum field theory defined on this interval is tantamount to letting the weights of the repulsive δ interactions go to +∞. © 2013 American Physical Society.
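In textbook form (a standard presentation of δ backgrounds, not quoted from the paper), the two-plate configuration corresponds to the potential

```latex
V(x) = \lambda_1\,\delta(x+a) + \lambda_2\,\delta(x-a),
\qquad
\psi(x_0^{+}) = \psi(x_0^{-}),
\quad
\psi'(x_0^{+}) - \psi'(x_0^{-}) = \lambda\,\psi(x_0),
```

where each δ of weight λ located at x₀ leaves the field mode ψ continuous but imposes the stated jump in its derivative; the Dirichlet limit corresponds to λ → +∞, which forces ψ(x₀) → 0.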


Verlohren S.,Charité - Medical University of Berlin | Stepan H.,University of Leipzig | Dechend R.,Experimental and Clinical Research Center
Clinical Science | Year: 2012

The pathogenesis of pre-eclampsia is still not completely known; however, in the recent decade, there have been tremendous research efforts leading to impressive results highlighting the role of a disturbed angiogenic balance as one of the key features of the disease. Numerous studies have shown the key role of the placenta in the pathogenesis of pre-eclampsia. A shift in the sFlt-1 (soluble Fms-like tyrosine kinase-1)/PlGF (placental growth factor) ratio is associated with the disease. Although pre-eclampsia seems to be a clearly defined disease, clinical presentation, and particularly the dynamics of the clinical course, can vary enormously. The only available tools to diagnose pre-eclampsia are blood pressure measurement and urine protein sampling. However, these tools have a low sensitivity and specificity regarding the prediction of the course of the disease or maternal and perinatal outcomes. The only cure for the disease is delivery, although a timely diagnosis helps in decreasing maternal and fetal morbidity and mortality. The sFlt1/PlGF ratio is able to give additional valuable information on the status and progression of the disease and is apt to be implemented in the diagnostic algorithm of pre-eclampsia. In the present review, we aim to provide an overview of the vast literature on angiogenesis and anti-angiogenesis factors in pre-eclampsia that have been published over the last decade. We introduce work from basic research groups who have focused on the pathophysiological basis of the disease. Furthermore, we review studies with a clinical focus in which the sFlt-1/PlGF ratio has been analysed along with other candidates for routine clinical assessment of pre-eclampsia. © The Authors Journal compilation. © 2012 Biochemical Society.


Schneider D.,University of Leipzig | Valiullin R.,University of Leipzig | Monson P.A.,University of Massachusetts Amherst
Langmuir | Year: 2014

We have studied the filling dynamics of model capillaries using dynamic mean field theory for a confined lattice gas and Kawasaki dynamics simulations. We have found two different scenarios for filling of capped nanocapillaries from the vapor phase. As compared to channels with macroscopic width, in which the filling process occurs by the detachment of the meniscus from the cap, in mesoscopic channels there is an alternative mechanism associated with the spontaneous condensation of the liquid close to the pore opening and its subsequent growth toward the closed pore end. We show that these two scenarios have totally different filling dynamics, providing an additional mechanism for slow capillary condensation kinetics in nanoscopic objects. © 2014 American Chemical Society.
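A minimal sketch of the Kawasaki (particle-exchange) dynamics named above, for a periodic 1D lattice gas with illustrative parameter values; the paper's simulations involve confined pore geometries and wall interactions, which this toy omits. The defining property of Kawasaki moves, conservation of particle number, holds by construction:

```python
import math
import random

def kawasaki_step(lattice, beta, eps, rng):
    """One attempted Kawasaki move on a periodic 1D lattice gas with
    nearest-neighbour attraction -eps: propose swapping two adjacent sites
    and accept with the Metropolis probability. Because occupations are
    swapped rather than flipped, particle number is conserved."""
    n = len(lattice)
    i = rng.randrange(n)
    j = (i + 1) % n
    if lattice[i] == lattice[j]:
        return  # swapping identical occupations changes nothing

    def local_energy(k):
        left, right = lattice[(k - 1) % n], lattice[(k + 1) % n]
        return -eps * lattice[k] * (left + right)

    e_old = local_energy(i) + local_energy(j)
    lattice[i], lattice[j] = lattice[j], lattice[i]
    dE = local_energy(i) + local_energy(j) - e_old
    if dE > 0 and rng.random() >= math.exp(-beta * dE):
        lattice[i], lattice[j] = lattice[j], lattice[i]  # reject: undo swap

rng = random.Random(0)
lattice = [1] * 10 + [0] * 10  # half-filled channel
n_particles = sum(lattice)
for _ in range(5000):
    kawasaki_step(lattice, beta=1.0, eps=1.0, rng=rng)
assert sum(lattice) == n_particles  # conserved under Kawasaki dynamics
```

This conservation law is what distinguishes Kawasaki dynamics from spin-flip (Glauber) dynamics and makes it the natural choice for modeling mass transport during capillary filling.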


Bottke P.,University of Graz | Freude D.,University of Leipzig | Wilkening M.,University of Graz
Journal of Physical Chemistry C | Year: 2013

Two-dimensional (2D) 6Li exchange magic angle spinning (MAS) nuclear magnetic resonance (NMR) spectroscopy was used to probe extremely slow lithium hopping processes in a polycrystalline powder sample of lithium zirconate, Li2ZrO3. In agreement with the crystal structure of Li2ZrO3, the 6Li MAS NMR spectra recorded are composed of two signals (- 0.10 and 0.26 ppm) with equal intensity. They reflect the two magnetically (and electrically) inequivalent Li sites in Li2ZrO3. The mixing-time dependent 2D MAS NMR spectra, which were acquired at a bearing gas temperature of ca. 310 K, clearly reveal off-diagonal intensities indicating Li exchange processes with exchange rates as low as 60 jumps/hour. To our knowledge, this is by far one of the slowest Li solid-state diffusion processes ever probed by 6Li 2D exchange MAS NMR spectroscopy (submitted to J. Phys. Chem. C, 2013). © 2013 American Chemical Society.


Bregulla A.P.,University of Leipzig | Yang H.,Princeton University | Cichos F.,University of Leipzig
ACS Nano | Year: 2014

Force-free trapping and steering of single photophoretically self-propelled Janus-type particles using a feedback mechanism is experimentally demonstrated. Real-time information on particle position and orientation is used to switch the self-propulsion mechanism of the particle optically. The orientational Brownian motion of the particle thereby provides the reorientation mechanism for the microswimmer. The particle-size dependence of the photophoretic propulsion velocity reveals that photon nudging provides increased positioning accuracy for decreasing particle radius. The explored steering mechanism is suitable for navigation in complex biological environments and in-depth studies of collective swimming effects. © 2014 American Chemical Society.
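The feedback rule described above can be caricatured in a two-dimensional toy simulation: the swimmer's orientation diffuses freely, and self-propulsion is switched on only while the orientation happens to point toward the target. All numerical values below are illustrative, not taken from the experiment:

```python
import math
import random

def photon_nudging(start, target, steps, v, d_rot, dt, half_angle, rng):
    """Toy feedback steering: orientation theta undergoes rotational
    Brownian motion; propulsion at speed v is enabled only while theta
    points at the target within +/- half_angle (the 'nudging' rule)."""
    x, y = start
    theta = rng.uniform(0.0, 2.0 * math.pi)
    for _ in range(steps):
        # rotational diffusion reorients the particle; no torque is applied
        theta += math.sqrt(2.0 * d_rot * dt) * rng.gauss(0.0, 1.0)
        bearing = math.atan2(target[1] - y, target[0] - x)
        misalignment = abs((theta - bearing + math.pi) % (2.0 * math.pi) - math.pi)
        if misalignment < half_angle:  # feedback: heat (propel) only when aligned
            x += v * math.cos(theta) * dt
            y += v * math.sin(theta) * dt
    return x, y

rng = random.Random(42)
end = photon_nudging((0.0, 0.0), (10.0, 0.0), steps=5000,
                     v=1.0, d_rot=1.0, dt=0.01, half_angle=math.pi / 6, rng=rng)
```

Every propulsion step has a positive velocity component toward the target, so the scheme steers without exerting any external force or torque on the particle, which is the point of the experimental method.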


Katus T.,University of London | Katus T.,University of Leipzig | Muller M.M.,University of Leipzig | Eimer M.,University of London
Journal of Neuroscience | Year: 2015

To adaptively guide ongoing behavior, representations in working memory (WM) often have to be modified in line with changing task demands. We used event-related potentials (ERPs) to demonstrate that tactile WM representations are stored in modality-specific cortical regions, that the goal-directed modulation of these representations is mediated through hemispheric-specific activation of somatosensory areas, and that the rehearsal of somatotopic coordinates in memory is accomplished by modality-specific spatial attention mechanisms. Participants encoded two tactile sample stimuli presented simultaneously to the left and right hands, before visual retro-cues indicated which of these stimuli had to be retained to be matched with a subsequent test stimulus on the same hand. Retro-cues triggered a sustained tactile contralateral delay activity component with a scalp topography over somatosensory cortex contralateral to the cued hand. Early somatosensory ERP components to task-irrelevant probe stimuli (that were presented after the retro-cues) and to subsequent test stimuli were enhanced when these stimuli appeared at the currently memorized location relative to other locations on the cued hand, demonstrating that a precise focus of spatial attention was established during the selective maintenance of tactile events in WM. These effects were observed regardless of whether participants performed the matching task with uncrossed or crossed hands, indicating that WM representations in this task were based on somatotopic rather than allocentric spatial coordinates. In conclusion, spatial rehearsal in tactile WM operates within somatotopically organized sensory brain areas that have been recruited for information storage. © 2015 Katus et al.


Andersson A.,University of Leipzig | Andersson A.,Max Planck Institute for Mathematics in the Sciences
Letters in Mathematical Physics | Year: 2014

In this paper, we develop a rigorous observable- and symmetry generator-related framework for quantum measurement theory by applying operator deformation techniques previously used in noncommutative quantum field theory. This enables the conventional observables (represented by unbounded operators) to play a role also in the more general setting. In addition, it gives a way of explicitly calculating the so-called instrument of the measurement process. © 2013 Springer Science+Business Media Dordrecht.


Brewka G.,University of Leipzig | Dunne P.E.,University of Liverpool | Woltran S.,Vienna University of Technology
IJCAI International Joint Conference on Artificial Intelligence | Year: 2011

One criticism often advanced against abstract argumentation frameworks (AFs) is that they consider only one form of interaction between atomic arguments: specifically, that an argument attacks another. Attempts to broaden the class of relationships include bipolar frameworks, where arguments may also support others, and abstract dialectical frameworks (ADFs). The latter allow "acceptance" of an argument, x, to be predicated on a given propositional function, Cx, dependent on the corresponding acceptance of its parents, i.e. those y for which 〈y, x〉 occurs. Although ADFs offer a richly expressive formalism subsuming both standard and bipolar AFs, an issue that arises is whether this expressiveness is achieved in a manner that would be infeasible within standard AFs. Can the semantics used in ADFs be mapped to some AF semantics? How many arguments are needed in an AF to "simulate" an ADF? We show that (in a formally defined sense) any ADF can be simulated by an AF of similar size and that this translation can be realised by a polynomial-time algorithm.
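As a concrete instance of an AF semantics (standard textbook material, not this paper's construction), the grounded extension of a finite AF is the least fixed point of its characteristic function, computable by simple iteration. A short Python sketch:

```python
def grounded_extension(arguments, attacks):
    """Least fixed point of F(S) = {a : every attacker of a is itself
    attacked by some member of S}, iterated from the empty set.
    `attacks` is a set of (attacker, attacked) pairs."""
    attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}
    s = set()
    while True:
        # an argument is acceptable w.r.t. s if s attacks each of its attackers
        nxt = {a for a in arguments
               if all(attackers[b] & s for b in attackers[a])}
        if nxt == s:
            return s
        s = nxt

# a attacks b, b attacks c: a is unattacked, and a defends c against b,
# so the grounded extension is {a, c}
print(grounded_extension({"a", "b", "c"}, {("a", "b"), ("b", "c")}))
```

Because the characteristic function is monotone, the iteration from the empty set is guaranteed to terminate at the least fixed point; for mutually attacking arguments with no unattacked starting point it returns the empty set.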


Munoz-Castaneda J.M.,University of Leipzig | Mateos Guilarte J.,University of Salamanca
Physical Review D - Particles, Fields, Gravitation and Cosmology | Year: 2015

The effects induced by the quantum vacuum fluctuations of one massless real scalar field on a configuration of two partially transparent plates are investigated. The physical properties of the infinitely thin plates are simulated by means of Dirac-δ-δ′ point interactions. It is shown that the distortion caused on the fluctuations by this external background gives rise to a generalization of Robin boundary conditions. The T operator for potentials concentrated on points with nondefined parity is evaluated with total generality. The quantum vacuum interaction energy between the two plates is computed in several dimensions using the TGTG formula to find positive, negative, and zero Casimir energies. The parity properties of the δ-δ′ potential demands that one distinguish between opposite and identical objects. It is shown that between identical sets of δ-δ′ plates, repulsive, attractive, or null quantum vacuum forces arise. However, there is always attraction between a pair of opposite δ-δ′ plates. © 2015 American Physical Society.


Kunisch K.,University of Graz | Wagner M.,University of Leipzig
Nonlinear Analysis: Real World Applications | Year: 2012

For the monodomain approximation of the bidomain equations, a weak solution concept is proposed. We analyze it for the FitzHugh-Nagumo and the Rogers-McCulloch ionic models, obtaining existence and uniqueness theorems. Subsequently, we investigate optimal control problems subject to the monodomain equations. After proving the existence of global minimizers, the system of the first-order necessary optimality conditions is rigorously characterized. For the adjoint system, we prove an existence and regularity theorem as well. © 2011 Elsevier Ltd. All rights reserved.
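For reference, the monodomain model with a FitzHugh-Nagumo ionic current can be written, in one common normalized convention (the symbols below follow a standard textbook form and are not necessarily the paper's notation):

```latex
\begin{aligned}
\partial_t v &= \nabla \cdot (\sigma \nabla v) + v(v - a)(1 - v) - w + I_{\mathrm{stim}}, \\
\partial_t w &= \varepsilon (v - \gamma w),
\end{aligned}
```

where v is the transmembrane potential, w the recovery (gating) variable, σ the effective conductivity tensor, and I_stim the stimulus that serves as the control in the optimal control problems discussed above.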


Grant
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.5.3 | Award Amount: 4.55M | Year: 2008

IMPPACT will develop an intervention planning and monitoring application for Radiofrequency Ablation (RFA) of malignant liver tumours. RFA is a minimally invasive treatment for cancer that avoids open surgery: a needle is placed inside the malignancy, which is then destroyed through intensive heating. Though the advantages of this approach are obvious, the intervention is currently hard to plan, almost impossible to monitor or assess, and is therefore not the first choice of treatment. IMPPACT will develop a physiological model of the liver and simulate the intervention's result, accounting for patient-specific physiological factors. Gaps in the understanding of particular aspects of the RFA treatment will be closed by multi-scale studies on cells and animals. New findings will be evaluated microscopically and transformed into macroscopic equations. The long-established bio-heat equation will be extended to incorporate multiple scales. Validation will be performed at multiple levels. Images from ongoing patient treatment will be used to cross-check validity for human physiology. Final validation will be performed at macroscopic level through visual comparison of simulation and treatment results gathered in animal studies and during patient treatment. This extensive validation, together with a user-centred software design approach, will guarantee suitability of the solution for clinical practice. The consortium consists of two Hospitals, three Universities, one Research Institute and one industrial SME. The final project deliverables will be the patient-specific intervention planning system and an augmented reality training simulator for the RFA intervention.


Grant
Agency: European Commission | Branch: H2020 | Program: ERC-STG | Phase: ERC-2016-STG | Award Amount: 1.34M | Year: 2016

Combinatorics, and its interplay with geometry, has fascinated our ancestors as shown by early stone carvings in the Neolithic period. Modern combinatorics is motivated by the ubiquity of its structures in both pure and applied mathematics. The work of Hochster and Stanley, who realized the relation of enumerative questions to commutative algebra and toric geometry, made a vital contribution to the development of this subject. Their work was a central contribution to the classification of face numbers of simple polytopes, and the initial success led to a wealth of research in which combinatorial problems were translated to algebra and geometry and then solved using deep results such as Saito's hard Lefschetz theorem. As a caveat, this also made branches of combinatorics reliant on algebra and geometry to provide new ideas. In this proposal, I want to reverse this approach and extend our understanding of geometry and algebra guided by combinatorial methods. In this spirit I propose new combinatorial approaches to the interplay of curvature and topology, to isoperimetry, geometric analysis, and intersection theory, to name a few. In addition, while these subjects are interesting by themselves, they are also designed to advance classical topics, for example, the diameter of polyhedra (as in the Hirsch conjecture), arrangement theory (and the study of arrangement complements), Hodge theory (as in Grothendieck's standard conjectures), and realization problems of discrete objects (as in Connes' embedding problem for type II factors). This proposal is supported by the review of some already developed tools, such as relative Stanley-Reisner theory (which is equipped to deal with combinatorial isoperimetries), combinatorial Hodge theory (which extends the Kähler package to purely combinatorial settings), and discrete PDEs (which were used to construct counterexamples to old problems in discrete geometry).


Grant
Agency: European Commission | Branch: FP7 | Program: CP-FP | Phase: NMP-2008-1.3-2 | Award Amount: 2.93M | Year: 2009

Metal oxide and metal NPs are particularly dangerous for two reasons: their special catalytic activity, arising from the properties of their nanointerface, may interfere with numerous intracellular biochemical processes; and the decomposition of NPs and the resulting ion leakage could heavily disturb the intracellular free metal ion homeostasis, which is essential for cell metabolism. A very specific problem is the difficulty of localizing and quantifying them in cells. Obtaining dose-effect relationships is not simple, because of the unknown amount of material present in affected cells. The following main points will be addressed in this proposal:
1) Design and synthesis of metal oxide and metal NPs that can be traced by SPECT, PET, and fluorescence techniques, and the appropriate characterization of these NPs.
2) Application of label-free techniques, such as IBM and EM, to ensure that the radioactive and fluorescent constituents do not modify the cytological and organismic response by themselves.
3) Characterization of the uptake, distribution kinetics, and NP release at the level of the organism.
4) Study of the interaction of NPs with plasma components forming complexes with NPs, and assessment of their possible impact on uptake compared with that of bare or capped particles.
5) Quantification and localization of metallic NPs in immune-competent cells, a key task for the establishment of proper dose-response correlations. A technique applicable to living cells as the ultimate control will be IBM, capable of detecting single metal NPs in cells at different depths.
6) Development of sophisticated cell-physiological approaches focusing on the determination of oxidative activity, cytokine production, and adaptive processes concerning signalling pathways, going beyond standard vitality tests.
The research project will indicate toxic levels of various NPs, and sub-toxic effects will be investigated by analysing the signalling response of immune cells.


Kempter V.,Clausthal University of Technology | Kirchner B.,University of Leipzig
Journal of Molecular Structure | Year: 2010

In the first part of this report experimental results are discussed which focus on the importance of hydrogen atoms in the interaction of imidazolium-based ionic liquids. These include examples for the cation-anion interaction in neat ionic liquids as well as the interactions between ionic liquids and their molecular environment, water in particular. Most of the studies emphasize the importance of the C(2)-H group of the imidazolium ring for the intra- and intermolecular interactions; commonly, the interactions of the type C-H ... X (X = O, halide) are attributed to "hydrogen bonding". In the second part it is analyzed whether these interactions and their consequences fulfill the criteria set by standard definitions of hydrogen bonding. Two cation-anion co-conformations at the C(2)-H group are found. One co-conformer (in-plane) often resembles a hydrogen bond while the other one (on-top) points to a non-hydrogen-bonding behavior. Furthermore, the degree of hydrogen bonding for the in-plane structure is very dependent on the anion. Spatial distribution functions show that, in general, both co-conformations are occupied. However, the question of how long a particular co-conformer is populated in the liquid state has yet to be answered. Therefore, it is concluded that the term "hydrogen bond" should, at present, be treated with care to characterize the cation-anion contacts, because of the above-mentioned difficulties. Once more it must be stressed that oversimplifications and generalizations, even for this subclass of ionic liquids, have to be avoided, because these liquids are more complicated than they appear at first sight. © 2010 Elsevier B.V. All rights reserved.


Qian B.,Princeton University | Qian B.,Massachusetts Institute of Technology | Montiel D.,Princeton University | Bregulla A.,University of Leipzig | And 2 more authors.
Chemical Science | Year: 2013

A simple scheme is presented for remotely maneuvering individual microscopic swimmers by means of on-demand photo-induced actuation, where a laser gently and intermittently pushed the swimmer along its body axis (photon nudging) through a combination of radiation-pressure force and photophoretic pull. The proposed strategy utilized rotational random walks to reorient the micro-swimmer and turned on its propulsion only when the swimmer was aligned with the target location (adaptive control). A Langevin-type equation of motion was formulated, integrating these two ideas to describe the dynamics of the stochastically controlled swimmer. The strategy was examined using computer simulations and illustrated in a proof-of-principle experiment steering a gold-coated Janus micro-sphere moving in three dimensions. The physical parameters relevant to the two actuating forces under the experimental conditions were investigated theoretically and experimentally, revealing that a ∼7 K temperature differential on the micro-swimmer surface could generate a propelling photophoretic strength of ∼0.1 pN. The controllability and positioning error were discussed using both experimental data and Langevin dynamics simulations, where the latter was further used to identify two key unitless control parameters for manipulation accuracy and efficiency; they were the number of random-walk turns the swimmer experienced on the experimental timescale (the revolution number) and the photon-nudge distance within the rotational diffusion time (the propulsion number). A comparison of simulation and experiment indicated that a near-optimal micron-precision motion control was achieved. This journal is © The Royal Society of Chemistry 2013.
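The "nudge only when aligned" policy described above can be illustrated with a toy two-dimensional Langevin simulation (all parameter values, the acceptance half-angle, and the target geometry below are illustrative choices, not the experimental ones):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (not experimental) parameters: translational and rotational
# diffusion coefficients, propulsion speed, time step, acceptance half-angle.
D_t, D_r, v0, dt = 0.05, 0.5, 2.0, 1e-3
half_angle = np.pi / 6
target = np.array([5.0, 0.0])
pos, theta = np.zeros(2), 0.0

for _ in range(100_000):
    heading = np.array([np.cos(theta), np.sin(theta)])
    u = target - pos
    # "photon nudging": switch propulsion on only when the body axis
    # points into a cone around the target direction
    aligned = heading @ u > np.cos(half_angle) * np.linalg.norm(u)
    drift = v0 * heading * dt if aligned else 0.0
    # overdamped Langevin updates: drift plus translational noise,
    # free rotational diffusion of the orientation
    pos = pos + drift + np.sqrt(2 * D_t * dt) * rng.standard_normal(2)
    theta += np.sqrt(2 * D_r * dt) * rng.standard_normal()
    if np.linalg.norm(target - pos) < 0.2:
        break
```

In the language of the abstract, the ratio of simulated time to 1/D_r plays the role of the revolution number, and v0/D_r sets the photon-nudge distance per rotational diffusion time (the propulsion number).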


Weber J.,Max Planck Institute of Colloids and Interfaces | Schmidt J.,TU Berlin | Thomas A.,TU Berlin | Bohlmann W.,University of Leipzig
Langmuir | Year: 2010

The microporosity of two microporous polymer networks is investigated in detail. Both networks are based on a central spirobifluorene motif but have different linker groups, namely, imide and thiophene units. The microporosity of the networks is based on the "polymers of intrinsic microporosity (PIM)" design strategy. Nitrogen, argon, and carbon dioxide were used as sorbates in order to analyze the microporosity in greater detail. The gas sorption data was analyzed with respect to important parameters such as specific surface area, pore volume, and pore size (distribution). It is shown that the results can be strongly model dependent and that swelling effects have to be taken into account. 129Xe NMR was used as an independent technique for the estimation of the average pore size of the polymer networks. The results indicate that both networks are mainly ultramicroporous (pore sizes <0.8 nm) in the dry state, which was not expected based on the molecular design. Phase separation and network defects might influence the overall network morphology strongly. Finally, the observed swelling indicates that this "soft" microporous matter might have a different micropore size in the solvent-swollen/filled state than in the dry state. © 2010 American Chemical Society.


Schlemmer J.,University of Vienna | Zahn J.,University of Leipzig
Annals of Physics | Year: 2015

We review different definitions of the current density for quantized fermions in the presence of an external electromagnetic field. Several deficiencies in the popular prescription due to Schwinger and the mode sum formula for static external potentials are pointed out. We argue that Dirac's method, which is the analog of the Hadamard point-splitting employed in quantum field theory in curved space-times, is conceptually the most satisfactory. As a concrete example, we discuss vacuum polarization and the stress-energy tensor for massless fermions in 1+1 dimension. Also a general formula for the vacuum polarization in static external potentials in 3+1 dimensions is derived. © 2015 Elsevier Inc.


Jaeger M.,University of New South Wales | Jaeger M.,University of Leipzig | Lang E.W.,Red Cross
Neurocritical Care | Year: 2013

Background: To investigate the relationship between cerebrovascular pressure reactivity and cerebral oxygen regulation after head injury. Methods: Continuous monitoring of the partial pressure of brain tissue oxygen (PbrO2), mean arterial blood pressure (MAP), and intracranial pressure (ICP) in 11 patients. The cerebrovascular pressure reactivity index (PRx) was calculated as the moving correlation coefficient between MAP and ICP. For assessment of the cerebral oxygen regulation system a brain tissue oxygen response (TOR) was calculated, where the response of PbrO2 to an increase of the arterial oxygen through ventilation with 100 % oxygen for 15 min is tested. Arterial blood gas analysis was performed before and after changing ventilator settings. Results: Arterial oxygen increased from 108 ± 6 mmHg to 494 ± 68 mmHg during ventilation with 100 % oxygen. PbrO2 increased from 28 ± 7 mmHg to 78 ± 29 mmHg, resulting in a mean TOR of 0.48 ± 0.24. Mean PRx was 0.05 ± 0.22. The correlation between PRx and TOR was r = 0.69, P = 0.019. The correlation of PRx and TOR with the Glasgow outcome scale at 6 months was r = 0.47, P = 0.142; and r = -0.33, P = 0.32, respectively. Conclusions: The results suggest a strong link between cerebrovascular pressure reactivity and the brain's ability to control its extracellular oxygen content. Their simultaneous impairment indicates that the cerebral resistance vessels, their common actuating element for cerebral blood flow control, are equally impaired in their ability to regulate for MAP fluctuations and changes in brain oxygen. © 2013 Springer Science+Business Media New York.
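The PRx described above, a moving Pearson correlation between MAP and ICP, can be sketched as follows (the window length, sampling, and synthetic signals are illustrative, not the study's actual averaging protocol):

```python
import numpy as np

def prx(mabp, icp, window=30):
    """Mean of the moving Pearson correlation between MAP and ICP.
    Values near +1 suggest passive (impaired) pressure reactivity;
    values near 0 or negative suggest intact reactivity."""
    r = [np.corrcoef(mabp[i:i + window], icp[i:i + window])[0, 1]
         for i in range(len(mabp) - window + 1)]
    return float(np.mean(r))

# Synthetic slow MAP waves and two hypothetical ICP responses
t = np.linspace(0, 10, 300)
mabp = 90 + 5 * np.sin(t)            # slow MAP oscillation (mmHg)
icp_passive = 12 + 2 * np.sin(t)     # ICP passively follows MAP -> PRx near +1
icp_reactive = 12 - 2 * np.sin(t)    # ICP counteracts MAP -> PRx near -1
```

With these synthetic signals, `prx(mabp, icp_passive)` is close to +1 and `prx(mabp, icp_reactive)` close to -1, bracketing the near-zero mean PRx reported for this cohort.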


Reddy V.Y.,Mount Sinai School of Medicine | Mobius-Winkler S.,University of Leipzig | Miller M.A.,Mount Sinai School of Medicine | Neuzil P.,Homolka Hospital | And 4 more authors.
Journal of the American College of Cardiology | Year: 2013

Objectives The purpose of this study was to assess the safety and efficacy of left atrial appendage (LAA) closure in nonvalvular atrial fibrillation (AF) patients ineligible for warfarin therapy. Background The PROTECT AF (Watchman Left Atrial Appendage System for Embolic Protection in Patients With Atrial Fibrillation) trial demonstrated that LAA closure with the Watchman device (Boston Scientific, Natick, Massachusetts) was noninferior to warfarin therapy. However, the PROTECT AF trial only included patients who were candidates for warfarin, and even patients randomly assigned to the LAA closure arm received concomitant warfarin for 6 weeks after Watchman implantation. Methods A multicenter, prospective, nonrandomized study was conducted of LAA closure with the Watchman device in 150 patients with nonvalvular AF and CHADS2 (congestive heart failure, hypertension, age ≥75 years, diabetes mellitus, and prior stroke or transient ischemic attack) score ≥1, who were considered ineligible for warfarin. The primary efficacy endpoint was the combined events of ischemic stroke, hemorrhagic stroke, systemic embolism, and cardiovascular/unexplained death. Results The mean CHADS2 score and CHA2DS2-VASc (CHADS2 score plus 2 points for age ≥75 years and 1 point for vascular disease, age 65 to 74 years, or female sex) score were 2.8 ± 1.2 and 4.4 ± 1.7, respectively. History of hemorrhagic/bleeding tendencies (93%) was the most common reason for warfarin ineligibility. Mean duration of follow-up was 14.4 ± 8.6 months. Serious procedure- or device-related safety events occurred in 8.7% of patients (13 of 150 patients). All-cause stroke or systemic embolism occurred in 4 patients (2.3% per year): ischemic stroke in 3 patients (1.7% per year) and hemorrhagic stroke in 1 patient (0.6% per year). This ischemic stroke rate was less than that expected (7.3% per year) based on the CHADS2 scores of the patient cohort. 
Conclusions LAA closure with the Watchman device can be safely performed without a warfarin transition, and is a reasonable alternative to consider for patients at high risk for stroke but with contraindications to systemic oral anticoagulation. (ASA Plavix Feasibility Study With Watchman Left Atrial Appendage Closure Technology [ASAP]; NCT00851578).© 2013 by the American College of Cardiology Foundation Published by Elsevier Inc.


Gawel E.,Helmholtz Center for Environmental Research | Gawel E.,University of Leipzig | Purkus A.,Helmholtz Center for Environmental Research
Energy Policy | Year: 2013

With the share of renewable energies within the electricity sector rising, improving their market and system integration is of increasing importance. By offering plant operators a premium on top of the electricity market price, premium schemes represent an option to increase the alignment of renewable electricity production with market signals, and have been implemented by several EU member states. This paper examines the case study of the German market premium scheme adopted in 2012. Building on an evaluation of early experiences, we discuss whether the market premium contributes to the aims of market and/or system integration (effectiveness), and what potential efficiency gains and additional costs of "administering integration" are associated with it (efficiency). While exposing renewables to price risks is not the scheme's purpose, it has successfully increased participation in direct marketing. However, risks of overcompensating producers for marketing and balancing costs are high, and the benefits of gradually leading plant operators towards the market are questionable. Incentives for demand-oriented production are established, but they seem insufficient particularly in the case of intermittent renewable energy sources. To conclude, we provide an outlook on alternative designs of premium schemes, and discuss whether they seem better suited for addressing the challenges ahead. © 2013 Elsevier Ltd.


Lechner G.,University of Vienna | Lechner G.,University of Leipzig | Schutzenhofer C.,University of Vienna
Annales Henri Poincare | Year: 2014

The recent construction of integrable quantum field theories on two-dimensional Minkowski space by operator-algebraic methods is extended to models with a richer particle spectrum, including finitely many massive particle species transforming under a global gauge group. Starting from a two-particle S-matrix satisfying the usual requirements (unitarity, Yang-Baxter equation, Poincaré and gauge invariance, crossing symmetry,...), a pair of relatively wedge-local quantum fields is constructed which determines the field net of the model. Although the verification of the modular nuclearity condition as a criterion for the existence of local fields is not carried out in this paper, arguments are presented that suggest it holds in typical examples such as non-linear O(N) σ-models. It is also shown that for all models complying with this condition, the presented construction solves the inverse scattering problem by recovering the S-matrix from the model via Haag-Ruelle scattering theory, and a proof of asymptotic completeness is given. © 2013 Springer Basel.


Lehmann P.,Helmholtz Center for Environmental Research | Gawel E.,Helmholtz Center for Environmental Research | Gawel E.,University of Leipzig
Energy Policy | Year: 2013

In virtually all EU Member States, the EU Emissions Trading Scheme (EU ETS) is complemented by support schemes for electricity generation from renewable energy sources (RES-E). This policy mix has been subject to strong criticism. It is mainly argued that RES-E schemes contribute nothing to emissions reduction and undermine the cost-effectiveness of the EU ETS. Consequently, many scholars suggest the abolition of RES-E schemes. However, this conclusion rests on quite narrow and unrealistic assumptions about the design and performance of markets and policies. This article provides a systematic and comprehensive review and discussion of possible rationales for combining the EU ETS with RES-E support schemes. The first and most important reason may be restrictions to technology development and adoption. These may be attributed to the failure of markets as well as policies, and more generally to the path dependency in socio-technical systems. Under these conditions, RES-E schemes are required to reach sufficient levels of technology development. In addition, it is highlighted that, in contrast to the EU ETS, RES-E support schemes may provide benefits beyond mitigating climate change. © 2012 Elsevier Ltd.


Gawel E.,Helmholtz Center for Environmental Research | Bernsen K.,University of Leipzig
Environment and Planning C: Government and Policy | Year: 2013

Virtual water, the amount of water used along a good's value chain, has come under discussion. Fairness and efficiency problems are seen to arise in the reallocation of access to water resources through the means of international trade. Moral issues are attached to both imports and exports, and even to a country's own consumption of virtual water. Global institutional arrangements have therefore been suggested to regulate virtual water trade both efficiently and 'fairly'. In this paper we provide a short overview of the concept's history and findings, and an analysis from the perspective of economic trade theory, revisiting the old debate about the economic and environmental merits of free trade. The contribution of this paper is to examine the performance of virtual water concepts in advising business or policy decisions in the form of global governance arrangements. It must be concluded that the virtual water concept is of limited usefulness in providing policy advice. The usually applied normative criteria are inconsistent, implying governance schemes that improve neither efficiency nor sustainability. Water-related problems should be solved in the respective arenas and not by global governance schemes or trade barriers.


Papadopoulos P.,Max Planck Institute for Polymer Research | Kossack W.,University of Leipzig | Kremer F.,University of Leipzig
Soft Matter | Year: 2013

The intra- and inter-molecular interactions of salol and polystyrene, as low molecular weight and polymeric glass-forming model systems, are studied by Fourier-transform infrared (FTIR) spectroscopy and Broadband Dielectric Spectroscopy (BDS). By analysing the temperature dependencies of specific IR absorption bands it is demonstrated that each molecular moiety in the glass-formers has its own signature in the course of the dynamic glass transition: while some do not show any change at the calorimetric glass transition temperature others exhibit a pronounced kink. The effects cannot be attributed solely to microscopic thermal expansion, but instead indicate gradual conformational changes. The ease of application of this approach to a variety of systems in different geometries and external conditions can assist the modelling of glasses and the understanding of the coupling between the glass transition and molecular-level dynamics. © 2013 The Royal Society of Chemistry.
