Bozek K.,Max Planck Institute for Computer Science | Lengauer T.,Max Planck Institute for Computer Science | Sierra S.,University of Cologne | Kaiser R.,University of Cologne | Domingues F.S.,Center for Biomedicine
PLoS Computational Biology | Year: 2013

The relationship of HIV tropism with disease progression and the recent development of CCR5-blocking drugs underscore the importance of monitoring virus coreceptor usage. As an alternative to costly phenotypic assays, computational methods aim at predicting virus tropism based on the sequence and structure of the V3 loop of the virus gp120 protein. Here we present a numerical descriptor of the V3 loop encoding its physicochemical and structural properties. The descriptor allows for structure-based prediction of HIV tropism and identification of properties of the V3 loop that are crucial for coreceptor usage. Using the proposed descriptor for prediction yields a statistically significant improvement over prediction based solely on the V3 sequence, with 3 percentage points of improvement in AUC and 7 percentage points in sensitivity at the specificity of the 11/25 rule (95%). We additionally assessed the predictive power of the new method on clinically derived 'bulk' sequence data and obtained a statistically significant improvement in AUC of 3 percentage points over sequence-based prediction. Furthermore, we demonstrated the capacity of our method to predict therapy outcome by applying it to 53 samples from patients undergoing Maraviroc therapy. The analysis of structural features of the loop informative of tropism indicates the importance of two loop regions and their physicochemical properties. The regions are located on opposite strands of the loop stem, and the respective features are predominantly charge-, hydrophobicity-, and structure-related. These regions are in close proximity in the bound conformation of the loop, potentially forming a site that determines coreceptor binding. The method is available via a web server at http://structure.bioinf.mpi-inf.mpg.de/. © 2013 Bozek et al.
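For intuition, the general idea of a physicochemical sequence encoding can be sketched as follows; this minimal example maps each residue of a V3 sequence to two numbers using the published Kyte-Doolittle hydrophobicity scale and formal side-chain charge. The function name and feature choice are ours for illustration only; the paper's actual descriptor additionally encodes structural properties of the loop.

```python
# Hypothetical sketch of a per-residue physicochemical encoding; NOT the
# paper's descriptor, which also captures structural properties.
KYTE_DOOLITTLE = {  # hydrophobicity scale (Kyte & Doolittle, 1982)
    'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
    'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
    'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
    'Y': -1.3, 'V': 4.2,
}
CHARGE = {'R': 1.0, 'K': 1.0, 'H': 0.1, 'D': -1.0, 'E': -1.0}  # ~pH 7

def encode_v3(sequence):
    """Return a flat feature vector with (hydrophobicity, charge) per residue."""
    features = []
    for residue in sequence.upper():
        features.append(KYTE_DOOLITTLE[residue])
        features.append(CHARGE.get(residue, 0.0))
    return features
```

Such a vector could then be fed to any standard classifier; the paper's contribution lies in the far richer, structure-aware descriptor.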


Elhayek A.,Max Planck Institute for Computer Science | Welk M.,University of Medical Sciences and Technology | Weickert J.,Saarland University
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2011

Fluorescence microscopy methods are an important imaging technique in cell biology. Due to their depth sensitivity, they allow direct 3-D imaging. However, the resulting volume data sets are undersampled in depth, and the 2-D slices are blurred and noisy. Reconstructing the full 3-D information from these data is therefore a challenging task of high relevance for biological applications. We address this problem by combining deconvolution of the 3-D data set with interpolation of additional slices in an integrated variational approach. Our novel 3-D reconstruction model, Interpolating Robust and Regularised Richardson-Lucy reconstruction (IRRRL), merges the Robust and Regularised Richardson-Lucy deconvolution (RRRL) from [16] with variational interpolation. In this paper we develop the theoretical approach and its efficient numerical implementation using the Fast Fourier Transform and a coarse-to-fine multiscale strategy. Experiments on confocal fluorescence microscopy data demonstrate the high restoration quality and computational efficiency of our approach. © 2011 Springer-Verlag.
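The IRRRL model builds on Richardson-Lucy deconvolution; as a point of reference, the classical Richardson-Lucy iteration with FFT-based convolutions can be sketched as follows. This is only the plain baseline, without the robust data term, the regularisation, or the slice interpolation of the paper's model.

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
    """Classical Richardson-Lucy deconvolution via FFT convolution.
    `observed` and `psf` are float arrays of the same shape; the PSF is
    assumed centered in its array and normalized to sum to 1."""
    otf = np.fft.rfftn(np.fft.ifftshift(psf))        # optical transfer function
    estimate = np.full_like(observed, observed.mean())
    for _ in range(n_iter):
        blurred = np.fft.irfftn(np.fft.rfftn(estimate) * otf, s=observed.shape)
        ratio = observed / np.maximum(blurred, eps)  # data-fit correction
        # Correlation with the PSF equals convolution with the mirrored PSF,
        # realized here by the complex conjugate of the OTF.
        correction = np.fft.irfftn(np.fft.rfftn(ratio) * np.conj(otf),
                                   s=observed.shape)
        estimate *= correction                       # multiplicative update
    return estimate
```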


Gall J.,Max Planck Institute for Computer Science | Rosenhahn B.,Max Planck Institute for Computer Science | Brox T.,TU Dresden | Seidel H.-P.,Max Planck Institute for Computer Science
International Journal of Computer Vision | Year: 2010

Local optimization and filtering have been widely applied to model-based 3D human motion capture. Global stochastic optimization has recently been proposed as a promising alternative for tracking and initialization. In order to benefit from both optimization and filtering, we introduce a multi-layer framework that combines stochastic optimization, filtering, and local optimization. While the first layer relies on interacting simulated annealing and some weak prior information on physical constraints, the second layer refines the estimates by filtering and local optimization such that the accuracy is increased and ambiguities are resolved over time without imposing restrictions on the dynamics. In our experimental evaluation, we demonstrate the significant improvements achieved by the multi-layer framework and provide quantitative 3D pose tracking results for the complete HumanEva-II dataset. The paper further comprises a comparison of global stochastic optimization with particle filtering, annealed particle filtering, and local optimization.
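For readers unfamiliar with the first layer's optimizer, a generic simulated annealing loop is sketched below; note this is only the textbook single-chain scheme, whereas interacting simulated annealing, used in the paper, anneals a whole population of interacting pose hypotheses. All names and the cooling schedule are illustrative assumptions.

```python
import math
import random

def simulated_annealing(cost, propose, x0, t0=1.0, alpha=0.97, steps=2000):
    """Textbook single-chain simulated annealing (illustration only)."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = propose(x)                       # random local move
        fy = cost(y)
        # Accept improvements always; accept worse moves with
        # probability exp(-(fy - fx) / t) (Metropolis criterion).
        if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha                           # geometric cooling
    return best, fbest
```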


Doerr B.,Max Planck Institute for Computer Science
Information Processing Letters | Year: 2013

We give a simple deterministic O(log K / log log K) approximation algorithm for the Min-Max Selecting Items problem, where K is the number of scenarios. While our main goal is simplicity, this result also improves over the previous best approximation ratio of O(log K) due to Kasperski, Kurpisz, and Zieliński (2013) [4]. Despite using the method of pessimistic estimators, the algorithm also has polynomial runtime in the RAM model of computation. We further show that the LP formulation for this problem by Kasperski and Zieliński (2009) [6], which is the basis for the previous work and ours, has an integrality gap of at least Ω(log K / log log K). © 2013 Elsevier B.V.
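For context, the Min-Max Selecting Items problem in its usual formulation (following Kasperski and Zieliński) asks us to pick exactly p of n items so that the worst-case total cost over the K scenarios is minimized, where c_i^k denotes the cost of item i in scenario k:

```latex
\min_{S \subseteq \{1,\dots,n\},\ |S| = p} \ \max_{k \in \{1,\dots,K\}} \ \sum_{i \in S} c_i^{k}
```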


Doerr B.,Max Planck Institute for Computer Science | Goldberg L.A.,University of Liverpool
Algorithmica | Year: 2013

We show that, for any c > 0, the (1+1) evolutionary algorithm using an arbitrary mutation rate p = c/n finds the optimum of a linear objective function over bit strings of length n in expected time Θ(n log n). Previously, this was only known for c ≤ 1. Since previous work also shows that universal drift functions cannot exist for c larger than a certain constant, we instead define drift functions which depend crucially on the relevant objective functions (and also on c itself). Using these carefully-constructed drift functions, we prove that the expected optimisation time is Θ(n log n). By giving an alternative proof of the multiplicative drift theorem, we also show that our optimisation-time bound holds with high probability. © 2011 Springer Science+Business Media, LLC.
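As a concrete reference point for the analyzed process, a minimal (1+1) EA on a linear function with positive weights, using standard-bit mutation with rate p = c/n, might look as follows; all names are ours, and the loop simply restates the textbook algorithm.

```python
import random

def one_plus_one_ea(weights, c=1.0, max_iters=10**7):
    """(1+1) EA maximizing f(x) = sum_i w_i * x_i over bit strings.
    Assumes positive weights, so the optimum is the all-ones string;
    returns the number of iterations until the optimum is found."""
    n = len(weights)
    p = c / n
    x = [random.randint(0, 1) for _ in range(n)]
    fx = sum(w for w, b in zip(weights, x) if b)
    for t in range(1, max_iters + 1):
        y = [b ^ (random.random() < p) for b in x]  # flip each bit w.p. p
        fy = sum(w for w, b in zip(weights, y) if b)
        if fy >= fx:                                # keep the better offspring
            x, fx = y, fy
        if all(x):
            return t
    return max_iters
```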


Neumann S.,University of Vienna | Wiese A.,Max Planck Institute for Computer Science
Leibniz International Proceedings in Informatics, LIPIcs | Year: 2016

During the last twenty years, a great deal of research has been conducted on the sports elimination problem: given a sports league and its remaining matches, we have to decide whether a given team can still possibly win the competition, i.e., place first in the league at the end. Previously, the computational complexity of this problem was investigated only for games with two participating teams per game. In this paper we consider Debating Tournaments and Debating Leagues in the British Parliamentary format, where four teams participate in each game. We prove that it is NP-hard to decide whether a given team can win a Debating League, even if at most two matches remain for each team. This contrasts with settings like football, where two teams play in each game and this case remains polynomial-time solvable. We prove our result even for a fictitious restricted setting with only three teams per game. On the other hand, for the common setting of Debating Tournaments we show that this problem is fixed-parameter tractable when the parameter is the number of remaining rounds k. This also holds for the practically important question of whether a team can still qualify for the knock-out phase of the tournament, with the combined parameter k + b, where b denotes the threshold rank for qualifying. Finally, we show that the latter problem is polynomial-time solvable for any constant k and arbitrary values b that are part of the input. © Stefan Neumann and Andreas Wiese.
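To make the four-team setting concrete, the following naive brute force decides whether a team can still (weakly) place first by enumerating every ranking of every remaining game; it is exponential in the number of games and only illustrates the search space, not the paper's FPT algorithm. The 3/2/1/0 scoring and all names are our assumptions.

```python
from itertools import permutations, product

POINTS = (3, 2, 1, 0)  # assumed points for ranks 1..4 in a four-team game

def can_still_win(scores, games, team):
    """scores: dict team -> current points; games: list of 4-tuples of teams.
    Returns True if some outcome of the remaining games lets `team` finish
    with the (weakly) highest score; ties for first count as winning here."""
    for outcome in product(permutations(POINTS), repeat=len(games)):
        final = dict(scores)
        for game, pts in zip(games, outcome):
            for t, p in zip(game, pts):
                final[t] += p
        if final[team] >= max(final.values()):
            return True
    return False
```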


Wiese A.,Max Planck Institute for Computer Science
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2016

In the Independent Set of Convex Polygons problem we are given a set of weighted convex polygons in the plane and we want to compute a maximum-weight subset of non-overlapping polygons. This is a very natural and well-studied problem with applications in many different areas. Unfortunately, there is a very large gap between the known upper and lower bounds for this problem. The best polynomial-time algorithm we know has an approximation ratio of n^ε, and the best known lower bound shows only strong NP-hardness. In this paper we close this gap completely, assuming that we are allowed to shrink the polygons a little bit, by a factor 1 − δ for an arbitrarily small constant δ > 0, while the compared optimal solution cannot do this (resource augmentation). In this setting, we improve the approximation ratio from n^ε to 1 + ε, which matches the above lower bound that still holds if we can shrink the polygons. © Springer-Verlag Berlin Heidelberg 2016.
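The shrinking operation of the resource-augmentation model can be made concrete as follows; shrinking about the centroid of the vertices is one natural choice, and this helper (names ours) is only a sketch of the geometric operation, not of the approximation scheme itself.

```python
def shrink(polygon, delta):
    """Scale a convex polygon (list of (x, y) vertices) by a factor
    (1 - delta) about the centroid of its vertices."""
    n = len(polygon)
    cx = sum(x for x, _ in polygon) / n
    cy = sum(y for _, y in polygon) / n
    return [(cx + (1.0 - delta) * (x - cx), cy + (1.0 - delta) * (y - cy))
            for x, y in polygon]
```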


Doerr B.,Max Planck Institute for Computer Science | Pohl S.,Max Planck Institute for Computer Science
GECCO'12 - Proceedings of the 14th International Conference on Genetic and Evolutionary Computation | Year: 2012

We analyze the run-time of the (1 + 1) Evolutionary Algorithm optimizing an arbitrary linear function f : {0, 1, ..., r}^n → R. If the mutation probability of the algorithm is p = c/n, then (1 + o(1))(e^c/c) rn log n + O(r^3 n log log n) is an upper bound for the expected time needed to find the optimum. We also give a lower bound of (1 + o(1))(1/c) rn log n. Hence for constant c and all r slightly smaller than (log n)^{1/3}, our bounds deviate by only a constant factor, which is e(1 + o(1)) for the standard mutation probability of 1/n. The proof of the upper bound uses multiplicative adaptive drift analysis as developed in a series of recent papers. We cannot close the gap for larger values of r, but find indications that multiplicative drift is not the optimal analysis tool for this case. © 2012 ACM.
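For orientation, the multiplicative drift theorem, in the form commonly used in this literature, reads: if the non-negative random variables X_t take values in {0} ∪ [x_min, ∞) and satisfy E[X_t − X_{t+1} | X_t] ≥ δ X_t for some δ > 0, then the first time T at which X_t = 0 satisfies

```latex
\mathbb{E}[T] \le \frac{1 + \ln(X_0 / x_{\min})}{\delta}.
```

The adaptive refinement used here tailors the potential function to the objective function at hand, as described in the Doerr-Goldberg abstract above.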


Weikum G.,Max Planck Institute for Computer Science
Data Science Journal | Year: 2013

Discovery of documents, data sources, facts, and opinions is at the core of digital information and knowledge services. The ability to search, discover, compile, and analyze relevant information for a user's specific tasks is important in science, business, and society. Information discovery is based on search engines, which in turn have mostly focused on finding documents of various kinds, such as publications, Web pages, and news articles. Search engine technology has been developed both for Internet/Web search and, with different requirements, for enterprise search within companies and the intranets of organizations. A few commercial stakeholders such as Google and Microsoft have dominated Internet search. They provide excellent support for simple queries that satisfy popular information needs, as opposed to the expert-level needs of advanced users. The spectrum of search scopes and items to be discovered has been expanded by multimedia data, such as photos, videos, and music, and by social media, such as blogs, tweets, and online forums.


Bogojeska J.,Max Planck Institute for Computer Science
Advances in Neural Information Processing Systems 24: 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011 | Year: 2011

This paper presents an approach that predicts the effectiveness of HIV combination therapies while simultaneously addressing several problems affecting the available HIV clinical data sets: the different treatment backgrounds of the samples, the uneven representation of the levels of therapy experience, the missing treatment history information, the uneven therapy representation, and the unbalanced therapy outcome representation. The computational validation on clinical data shows that, compared to the most commonly used approach, which does not account for the issues mentioned above, our model has significantly higher predictive power. This is especially true for samples stemming from patients with longer treatment histories and samples associated with rare therapies. Furthermore, our approach is at least as powerful for the remaining samples.
