Saarbrücken, Germany

The Max Planck Institute for Informatics (MPII) is a research institute in computer science with a focus on algorithms and their applications in a broad sense. It hosts fundamental research as well as research for various application domains. It is part of the Max-Planck-Gesellschaft, Germany's largest society for fundamental research. The research institutes of the Max Planck Society have a national and international reputation as “Centres of Excellence” for pure research.

The institute consists of five departments and two research groups:

- The Algorithms and Complexity Department, headed by Prof. Dr. Kurt Mehlhorn
- The Computer Vision and Multimodal Computing Department, headed by Prof. Dr. Bernt Schiele
- The Department of Computational Biology and Applied Algorithmics, headed by Prof. Dr. Thomas Lengauer, Ph.D.
- The Computer Graphics Department, headed by Prof. Dr. Hans-Peter Seidel
- The Databases and Information Systems Department, headed by Prof. Dr. Gerhard Weikum
- The Research Group Automation of Logic, headed by Prof. Dr. Christoph Weidenbach
- The Independent Research Group Computational Genomics and Epidemiology, headed by Dr. Alice McHardy

Previously, the institute also included the Programming Logics Department, headed by Prof. Dr. Harald Ganzinger.

Members of the institute have received various awards. Professor Kurt Mehlhorn and Professor Hans-Peter Seidel received the Gottfried Wilhelm Leibniz Prize, Professor Kurt Mehlhorn and Professor Thomas Lengauer received the Konrad-Zuse-Medal, and in 2004 Professor Harald Ganzinger received the Herbrand Award.

The institute, along with the Max Planck Institute for Software Systems, the German Research Centre for Artificial Intelligence and the entire Computer Science department of Saarland University, is involved in the Internationales Begegnungs- und Forschungszentrum für Informatik. The International Max Planck Research School for Computer Science is the graduate school of the MPII and the MPI-SWS.
It was founded in 2000 and offers a fully funded PhD program in cooperation with Saarland University; its dean is Prof. Dr. Gerhard Weikum. (Wikipedia)

Schmidt J.M., Max Planck Institute for Informatics
SIAM Journal on Computing

One of the most noted construction methods for 3-vertex-connected graphs is due to Tutte and is based on the following fact: any 3-vertex-connected graph G = (V, E) on more than 4 vertices contains a contractible edge, i.e., an edge whose contraction generates a 3-connected graph. This implies the existence of a sequence of edge contractions from G to the complete graph K4 such that every intermediate graph is 3-vertex-connected. A theorem of Barnette and Grünbaum gives a similar sequence using removals of edges instead of contractions. We show how to compute both sequences in optimal time, improving the previously best known running times from O(|V|²) to O(|E|). This result has a number of consequences; an important one is a new linear-time test of 3-connectivity that is certifying; finding such an algorithm has been a major open problem in the design of certifying algorithms in recent years. The test is conceptually different from well-known linear-time 3-connectivity tests and uses a certificate that is easy to verify in time O(|E|). We show how to extend the results to an optimal certifying test of 3-edge-connectivity. © 2013 Society for Industrial and Applied Mathematics.
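Tutte's underlying fact can be illustrated at toy scale. The sketch below is my own brute-force illustration, not the paper's linear-time algorithm: it tests 3-vertex-connectivity by deleting every vertex pair, contracts an edge, and searches for a contractible one.

```python
from itertools import combinations

def is_connected(adj, removed=frozenset()):
    """BFS connectivity test on the graph minus the `removed` vertices."""
    nodes = [v for v in adj if v not in removed]
    if not nodes:
        return True
    seen, stack = {nodes[0]}, [nodes[0]]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in removed and w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(nodes)

def is_3_connected(adj):
    """True iff the graph has > 3 vertices and stays connected after
    deleting any pair of vertices (brute force, fine for small graphs)."""
    return len(adj) > 3 and all(
        is_connected(adj, frozenset(pair)) for pair in combinations(adj, 2))

def contract(adj, u, v):
    """Contract edge (u, v): merge v into u; drop loops and parallel edges."""
    new = {x: set(ns) for x, ns in adj.items() if x != v}
    for x in new:
        if v in new[x]:
            new[x].discard(v)
            new[x].add(u)
    new[u] |= adj[v] - {u, v}
    new[u].discard(u)
    return new

def find_contractible_edge(adj):
    """Return an edge whose contraction keeps the graph 3-connected.
    Tutte's fact guarantees one exists if adj is 3-connected on > 4 vertices."""
    for u in adj:
        for v in adj[u]:
            if u < v and is_3_connected(contract(adj, u, v)):
                return (u, v)
    return None

# Example: the 3-dimensional hypercube Q3 is 3-regular and 3-connected.
q3 = {i: {i ^ (1 << b) for b in range(3)} for i in range(8)}
```

On Q3, `find_contractible_edge` returns some edge whose contraction yields a 3-connected 7-vertex graph, as the theorem predicts; the exhaustive check costs O(n²·(n+m)) per test, which is exactly the kind of overhead the paper's O(|E|) algorithm avoids.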

Kratsch S., University Utrecht | Wahlström M., Max Planck Institute for Informatics
Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS

The existence of a polynomial kernel for Odd Cycle Transversal was a notorious open problem in parameterized complexity. Recently, this was settled by the present authors (Kratsch and Wahlström, SODA 2012) with a randomized polynomial kernel for the problem, using matroid theory to encode flow questions over a set of terminals in size polynomial in the number of terminals (rather than the total graph size, which may be superpolynomially larger). In the current work we further establish the usefulness of matroid theory for kernelization by showing applications of a result on representative sets due to Lovász (Combinatorial Surveys 1977) and Marx (TCS 2009). We show how representative sets can be used to give a polynomial kernel for the elusive Almost 2-SAT problem (where the task is to remove at most k clauses to make a 2-CNF formula satisfiable), solving a major open problem in kernelization. We further apply the representative sets tool to the problem of finding irrelevant vertices in graph cut problems, that is, vertices which can be made undeletable without affecting the status of the problem. This gives the first significant progress towards a polynomial kernel for the Multiway Cut problem; in particular, we get a polynomial kernel for Multiway Cut instances with a bounded number of terminals. Both these kernelization results have significant spin-off effects, producing the first polynomial kernels for a range of related problems. More generally, the irrelevant vertex results have implications for covering min-cuts in graphs. In particular, given a directed graph and a set of terminals, we can find a set of size polynomial in the number of terminals (a cut-covering set) which contains a minimum vertex cut for every choice of sources and sinks from the terminal set. Similarly, given an undirected graph and a set of terminals, we can find a set of vertices, of size polynomial in the number of terminals, which contains a minimum multiway cut for every partition of the terminals into a bounded number of sets. Both results are computable in polynomial time. We expect this to have further applications; in particular, we get direct, reduction-rule-based kernelizations for all problems above, in contrast to the indirect compression-based kernel previously given for Odd Cycle Transversal. All our results are randomized, with failure probabilities which can be made exponentially small in the size of the input, because applying the representative sets tool requires a representation of a matroid. © 2012 IEEE.
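The cut-covering notion can be made concrete on a toy example. The sketch below is an exponential-time brute force of my own (nothing like the paper's matroid-based construction): it finds a minimum set of non-terminal vertices separating chosen sources from sinks, the object a cut-covering set must contain for every source/sink choice.

```python
from itertools import combinations

def separates(adj, sources, sinks, cut):
    """True if deleting `cut` leaves no path from any source to any sink."""
    removed = set(cut)
    reach = set(sources) - removed
    stack = list(reach)
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in removed and w not in reach:
                reach.add(w)
                stack.append(w)
    return reach.isdisjoint(set(sinks) - removed)

def min_vertex_cut(adj, sources, sinks):
    """Smallest set of non-terminal vertices separating sources from sinks,
    found by exhaustive search over subsets (exponential; toy sizes only)."""
    candidates = [v for v in adj if v not in sources and v not in sinks]
    for k in range(len(candidates) + 1):
        for cut in combinations(candidates, k):
            if separates(adj, sources, sinks, cut):
                return set(cut)
    return None  # some source and sink are directly adjacent

# Path a-b-c-d-e with terminal set {a, e}: any single interior vertex
# is a minimum a/e vertex cut, so a cut-covering set need only be small.
graph = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b', 'd'},
         'd': {'c', 'e'}, 'e': {'d'}}
```

The paper's point is that a single vertex set of size polynomial in the number of terminals suffices to cover such minimum cuts for every choice of sources and sinks, without ever enumerating subsets as this sketch does.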

Bock C., Austrian Academy of Sciences | Bock C., Medical University of Vienna | Lengauer T., Max Planck Institute for Informatics
Nature Reviews Cancer

Drug resistance is a common cause of treatment failure for HIV infection and cancer. The high mutation rate of HIV leads to genetic heterogeneity among viral populations and provides the seed from which drug-resistant clones emerge in response to therapy. Similarly, most cancers are characterized by extensive genetic, epigenetic, transcriptional and cellular diversity, and drug-resistant cancer cells outgrow their non-resistant peers in a process of somatic evolution. Patient-specific combination of antiviral drugs has emerged as a powerful approach for treating drug-resistant HIV infection, using genotype-based predictions to identify the best matched combination therapy among several hundred possible combinations of HIV drugs. In this Opinion article, we argue that HIV therapy provides a 'blueprint' for designing and validating patient-specific combination therapies in cancer. © 2012 Macmillan Publishers Limited. All rights reserved.

Bringmann K., Max Planck Institute for Informatics
Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS

The Fréchet distance is a well-studied and very popular measure of similarity of two curves. Many variants and extensions have been studied since Alt and Godau introduced this measure to computational geometry in 1991. Their original algorithm to compute the Fréchet distance of two polygonal curves with n vertices has a runtime of O(n² log n). More than 20 years later, the state-of-the-art algorithms for most variants still take time more than O(n²/log n), but no matching lower bounds are known, not even under reasonable complexity-theoretic assumptions. To obtain a conditional lower bound, in this paper we assume the Strong Exponential Time Hypothesis or, more precisely, that there is no O*((2−δ)^N) algorithm for CNF-SAT for any δ > 0. Under this assumption we show that the Fréchet distance cannot be computed in strongly subquadratic time, i.e., in time O(n^(2−δ)) for any δ > 0. This means that finding faster algorithms for the Fréchet distance is as hard as finding faster CNF-SAT algorithms, and the existence of a strongly subquadratic algorithm can be considered unlikely. Our result holds for both the continuous and the discrete Fréchet distance. We extend the main result in various directions. Based on the same assumption we (1) show non-existence of a strongly subquadratic 1.001-approximation, (2) present tight lower bounds in case the numbers of vertices of the two curves are imbalanced, and (3) examine realistic input assumptions (c-packed curves). © 2014 IEEE.
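For intuition, the discrete Fréchet distance covered by the lower bound is exactly the quantity computed by the classic quadratic dynamic program, a standard textbook sketch (not the paper's contribution) whose O(n²) runtime is what the result shows to be essentially unavoidable under SETH:

```python
from functools import lru_cache
from math import hypot

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between point sequences P and Q via the
    classic O(|P|*|Q|) dynamic program: c(i, j) is the smallest leash
    length needed to jointly walk prefixes P[..i] and Q[..j]."""
    @lru_cache(maxsize=None)
    def c(i, j):
        d = hypot(P[i][0] - Q[j][0], P[i][1] - Q[j][1])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(P) - 1, len(Q) - 1)
```

For two parallel horizontal segments sampled at unit spacing, e.g. `discrete_frechet([(0, 0), (1, 0), (2, 0)], [(0, 1), (1, 1), (2, 1)])`, the optimal coupling advances both curves in lockstep and the distance is 1.0.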

Bringmann K., Max Planck Institute for Informatics
Computational Geometry: Theory and Applications

The measure problem of Klee asks for the volume of the union of n axis-parallel boxes in a fixed dimension d. We give an O(n^((d+2)/3))-time algorithm for the special case of all boxes being cubes or, more generally, fat boxes. Previously, the fastest run-time was n^(d/2) · 2^(O(log* n)), achieved by the general-case algorithm of Chan [SoCG 2008]. For the general problem our run-time would imply a breakthrough for the k-clique problem. © 2011 Elsevier B.V. All rights reserved.
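The problem itself is easy to state concretely. Here is a minimal 2-D sketch of my own via coordinate compression (a simple cubic-time brute force, unrelated to the paper's algorithm): the box coordinates induce a grid, and each grid cell is entirely inside or outside the union, so testing one point per cell suffices.

```python
def union_area(boxes):
    """Area of a union of axis-parallel rectangles (x1, y1, x2, y2), via
    coordinate compression: test one point per grid cell against all boxes."""
    xs = sorted({x for b in boxes for x in (b[0], b[2])})
    ys = sorted({y for b in boxes for y in (b[1], b[3])})
    area = 0.0
    for i in range(len(xs) - 1):
        for j in range(len(ys) - 1):
            cx = (xs[i] + xs[i + 1]) / 2   # cell midpoint
            cy = (ys[j] + ys[j + 1]) / 2
            if any(b[0] <= cx <= b[2] and b[1] <= cy <= b[3] for b in boxes):
                area += (xs[i + 1] - xs[i]) * (ys[j + 1] - ys[j])
    return area
```

For two overlapping 2x2 squares, `union_area([(0, 0, 2, 2), (1, 1, 3, 3)])` returns 7.0 (4 + 4 minus the 1x1 overlap); the challenge addressed by the paper is doing this in high dimension d without the curse of enumerating all O(n^d) grid cells.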
