The University of Konstanz is a university in Konstanz, Baden-Württemberg, Germany. It was founded in 1966, and its main campus on the Gießberg opened in 1972. The university is situated on the shore of Lake Constance, just four kilometres from the Swiss border. As one of eleven German Excellence Universities, the University of Konstanz is consistently ranked among the global top 250 by the Times Higher Education World University Rankings. Over 10,000 students from close to 100 countries are enrolled at the university, while more than 220 links to European partner universities and numerous exchange programmes facilitate international networking. Students can choose from more than 100 degree programmes. The university also cooperates with a large number of foreign universities, including Johns Hopkins University, Yale University, the University of Chicago and the University of Zurich. Its library is open 24 hours a day and holds more than two million books.
Diederichs K., University of Konstanz | Karplus P.A., Oregon State University
Acta Crystallographica Section D: Biological Crystallography | Year: 2013
In macromolecular X-ray crystallography, typical data sets have substantial multiplicity. This can be used to calculate the consistency of repeated measurements and thereby assess data quality. Recently, the properties of a correlation coefficient, CC1/2, that can be used for this purpose were characterized and it was shown that CC1/2 has superior properties compared with 'merging' R values. A derived quantity, CC*, links data and model quality. Using experimental data sets, the behaviour of CC1/2 and the more conventional indicators were compared in two situations of practical importance: merging data sets from different crystals and selectively rejecting weak observations or (merged) unique reflections from a data set. In these situations controlled 'paired-refinement' tests show that even though discarding the weaker data leads to improvements in the merging R values, the refined models based on these data are of lower quality. These results show the folly of such data-filtering practices aimed at improving the merging R values. Interestingly, in all of these tests CC1/2 is the one data-quality indicator for which the behaviour accurately reflects which of the alternative data-handling strategies results in the best-quality refined model. Its properties in the presence of systematic error are documented and discussed.
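The split-half construction behind CC1/2, and the CC* formula that links it to model quality, can be sketched numerically. The simulated intensities, error level and multiplicity below are illustrative assumptions, not data from the paper; only the two statistics themselves follow the definitions in the abstract (CC1/2 as the correlation between two randomly assigned half-data sets, and CC* = sqrt(2·CC1/2 / (1 + CC1/2))).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data set: 1000 unique reflections, each measured 6 times
# (multiplicity 6), with Gaussian measurement error on a true intensity.
true_I = rng.gamma(shape=2.0, scale=100.0, size=1000)
obs = true_I[:, None] + rng.normal(0.0, 60.0, size=(1000, 6))

# CC1/2: assign each reflection's observations to two half-data sets,
# average within each half, and correlate the two halves.
half1 = obs[:, :3].mean(axis=1)
half2 = obs[:, 3:].mean(axis=1)
cc_half = np.corrcoef(half1, half2)[0, 1]

# CC*: analytical estimate, derived from CC1/2, of the correlation of
# the full merged data set with the (unmeasurable) true signal.
cc_star = np.sqrt(2.0 * cc_half / (1.0 + cc_half))

print(f"CC1/2 = {cc_half:.3f}, CC* = {cc_star:.3f}")
```

Because CC* estimates the correlation with the true signal rather than between two noisy halves, it always exceeds CC1/2 for imperfect data.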
Aichem A.,University of Konstanz
Nature Communications | Year: 2010
The ubiquitin-like modifier FAT10 targets proteins for degradation by the proteasome and is activated by the E1 enzyme UBA6. In this study, we identify the UBA6-specific E2 enzyme (USE1) as an interaction partner of FAT10. Activated FAT10 can be transferred from UBA6 onto USE1 in vitro, and endogenous USE1 and FAT10 can be coimmunoprecipitated from intact cells. Small interfering RNA-mediated downregulation of USE1 mRNA resulted in a strong reduction of FAT10 conjugate formation under endogenous conditions, suggesting that USE1 is a major E2 enzyme in the FAT10 conjugation cascade. Interestingly, USE1 is not only the first E2 enzyme but also the first known substrate of FAT10 conjugation, as it was efficiently auto-FAT10ylated in cis but not in trans.
Rendle S.,University of Konstanz
ACM Transactions on Intelligent Systems and Technology | Year: 2012
Factorization approaches provide high accuracy in several important prediction problems, for example, recommender systems. However, applying factorization approaches to a new prediction problem is a nontrivial task and requires a lot of expert knowledge. Typically, a new model is developed, a learning algorithm is derived, and the approach has to be implemented. Factorization machines (FM) are a generic approach since they can mimic most factorization models just by feature engineering. This way, factorization machines combine the generality of feature engineering with the superiority of factorization models in estimating interactions between categorical variables of large domain. libFM is a software implementation for factorization machines that features stochastic gradient descent (SGD) and alternating least-squares (ALS) optimization, as well as Bayesian inference using Markov chain Monte Carlo (MCMC). This article summarizes the recent research on factorization machines both in terms of modeling and learning, provides extensions for the ALS and MCMC algorithms, and describes the software tool libFM. © 2012 ACM.
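The degree-2 factorization machine model that libFM implements can be written as y(x) = w0 + Σᵢ wᵢxᵢ + Σᵢ<ⱼ ⟨vᵢ, vⱼ⟩ xᵢxⱼ, and Rendle's key observation is that the pairwise-interaction sum can be evaluated in O(nk) rather than O(n²k). A minimal sketch of that reformulation, checked against the naive double sum (the function name `fm_predict` and the random test data are illustrative, not part of libFM's API):

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Degree-2 factorization machine prediction.

    x : (n,) feature vector; w0 : scalar bias; w : (n,) linear weights;
    V : (n, k) factor matrix, where row V[i] embeds feature i.
    """
    linear = w0 + w @ x
    # Pairwise interactions in O(n*k) via the reformulation
    #   sum_{i<j} <V[i], V[j]> x_i x_j
    #     = 0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ]
    s = V.T @ x                     # (k,) per-factor weighted sums
    s2 = (V ** 2).T @ (x ** 2)      # (k,) per-factor sums of squares
    return linear + 0.5 * float(np.sum(s ** 2 - s2))

# Sanity check against the naive O(n^2 * k) double sum on random data.
rng = np.random.default_rng(1)
n, k = 8, 3
x = rng.normal(size=n)
w0, w, V = 0.5, rng.normal(size=n), rng.normal(size=(n, k))
naive = w0 + w @ x + sum((V[i] @ V[j]) * x[i] * x[j]
                         for i in range(n) for j in range(i + 1, n))
assert np.isclose(fm_predict(x, w0, w, V), naive)
```

The same identity is what makes SGD, ALS and MCMC learning linear in the number of nonzero features, which is why FMs scale to the sparse one-hot encodings typical of recommender data.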
Winter R.F.,University of Konstanz
Organometallics | Year: 2014
This review is a cautionary note against the often purported direct relation between the half-wave potential splitting ΔE1/2 (or ΔE°) for stepwise, consecutive electron transfer from systems featuring two or more identical redox sites and the true electronic coupling HAB and charge or spin distribution in the ground state of the intermediate mixed-valent (MV) systems. Several examples where these different quantities go in parallel are contrasted with other examples where this is not the case. Different kinds of such "non-conformist" behavior are outlined with the aid of representative examples. These include cases of fairly strong electronic couplings and large degrees of ground-state delocalization despite small values of ΔE1/2 - sometimes just above the statistical limit or even below it - as well as examples of just the opposite behavior of no detectable electronic coupling despite appreciable electrochemical half-wave potential splitting. The crucial roles of the nominal bridges that interconnect the individual redox sites and of the environment (solvent, supporting electrolyte) in determining ΔE1/2 and HAB are emphasized. We also seek to provide some guidelines for the practitioner as to how to discriminate between these various types of behaviors and how to determine the strength of the electronic coupling between the redox sites. © 2014 American Chemical Society.
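The reason the half-wave potential splitting measures only the thermodynamic stability of the MV state, not the electronic coupling, can be made concrete with the standard textbook relation (not stated in the abstract itself) between ΔE1/2 and the comproportionation constant, Kc = exp(F·ΔE1/2 / RT) for two consecutive one-electron steps. The statistical limit mentioned above corresponds to Kc = 4, i.e. about 35.6 mV at 298 K. A small sketch under those standard assumptions:

```python
import math

F = 96485.332   # Faraday constant, C/mol
R = 8.314462    # gas constant, J/(mol*K)
T = 298.15      # temperature, K

def comproportionation_constant(delta_e_mv, temperature=T):
    """Kc = exp(F * dE1/2 / (R T)) for two consecutive one-electron steps."""
    return math.exp(F * delta_e_mv * 1e-3 / (R * temperature))

# The purely statistical splitting for two identical, non-interacting
# redox sites is (RT/F) ln 4, which corresponds to Kc = 4.
stat_limit_mv = 1e3 * R * T / F * math.log(4)
print(f"statistical limit: {stat_limit_mv:.1f} mV, "
      f"Kc = {comproportionation_constant(stat_limit_mv):.1f}")
```

Since Kc depends on every free-energy contribution to the stability of the MV state (solvation, ion pairing, electrostatics), a large ΔE1/2 can arise with negligible HAB, and vice versa, which is precisely the review's point.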
Stuermer C.A.O.,University of Konstanz
Trends in Cell Biology | Year: 2010
The proteins reggie-1 and reggie-2 were originally discovered in neurons during axon regeneration. Subsequently, they were independently identified as markers of lipid rafts in flotation assays and were hence named flotillins. Since then, reggie/flotillin proteins have been found to be evolutionarily conserved and are present in all vertebrate cells - yet their function has remained elusive and controversial. Recent results now show that reggie/flotillin proteins are indeed necessary for axon regeneration and growth: no axons form when reggies/flotillins are downregulated and signaling pathways controlling actin dynamics are perturbed. Their widespread expression and conservation, however, suggest that these proteins regulate basic cellular functions beyond regeneration. It is argued here that the reggie/flotillin proteins regulate processes vital to all cells - the targeted delivery of bulk membrane and specific membrane proteins from internal vesicle pools to strategically important sites including cell contact sites, the T cell cap, regenerating axons and growth cones and other protrusions. © 2009 Elsevier Ltd. All rights reserved.