IST Austria

Chatterjee K., IST Austria
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2014

We study two-player concurrent games on finite-state graphs played for an infinite number of rounds, where in each round the two players (player 1 and player 2) choose their moves independently and simultaneously; the current state and the two moves determine the successor state. The objectives are ω-regular winning conditions specified as parity objectives. We consider the qualitative analysis problems: the computation of the almost-sure and limit-sure winning sets of states, where player 1 can ensure winning with probability 1 and with probability arbitrarily close to 1, respectively. In general, almost-sure and limit-sure winning strategies require both infinite memory and infinite precision (to describe probabilities). While the qualitative analysis problem for concurrent parity games with infinite-memory, infinite-precision randomized strategies has been studied before, we study the bounded-rationality problem for qualitative analysis of concurrent parity games, where the strategy set of player 1 is restricted to bounded-resource strategies. In terms of precision, strategies can be deterministic, uniform, finite-precision, or infinite-precision; in terms of memory, strategies can be memoryless, finite-memory, or infinite-memory. We present a precise and complete characterization of the qualitative winning sets for all combinations of these classes of strategies. In particular, we show that uniform memoryless strategies are as powerful as finite-precision infinite-memory strategies, and that infinite-precision memoryless strategies are as powerful as infinite-precision finite-memory strategies. We show that the winning sets can be computed in O(n^(2d+3)) time, where n is the size of the game structure and 2d is the number of priorities (or colors), and our algorithms are symbolic. The membership problem of whether a state belongs to a winning set can be decided in NP ∩ coNP. Our symbolic algorithms are based on a characterization of the winning sets as μ-calculus formulas; however, our μ-calculus formulas are crucially different from those for concurrent parity games (without bounded rationality), and our memoryless witness strategy constructions are significantly different from the infinite-memory witness strategy constructions for concurrent parity games. © 2014 Springer-Verlag.
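For context, symbolic algorithms for parity objectives are typically derived from nested μ-calculus fixpoints. The sketch below shows only the generic alternation schema for priorities 0, …, 2d−1 with a placeholder predecessor operator Pre and priority sets P_i; it is not the paper's actual (crucially different) formulas, only an illustration of the fixpoint shape.

```latex
% Generic nested-fixpoint schema for a parity objective with priorities
% {0, ..., 2d-1}; Pre stands for a suitable game-dependent predecessor
% operator and P_i for the set of states with priority i.  This shows the
% alternation structure only -- the paper's formulas use refined
% predecessor operators specific to bounded-rationality strategies.
W \;=\; \nu Y_0\, \mu Y_1\, \nu Y_2 \cdots \mu Y_{2d-1}\,
        \bigcup_{i=0}^{2d-1} \bigl( P_i \cap \mathrm{Pre}(Y_i) \bigr)
```

Naively evaluating a formula of alternation depth 2d over n states takes on the order of n^(2d) evaluations of the innermost expression, each involving predecessor computations, which is consistent with a bound of the form O(n^(2d+3)).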


Gupta A., IST Austria
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2012

Unsatisfiability proofs find many applications in verification. Today, many SAT solvers are capable of producing resolution proofs of unsatisfiability. For efficiency, smaller proofs are preferred over larger ones. Solvers apply proof-reduction methods to remove redundant parts of the proofs, both during and after proof generation. One method of reducing resolution proofs is redundant resolution reduction, i.e., removing repeated pivots on the paths of a resolution proof (also known as pivot recycling). The known single-pass algorithm only tries to remove redundancies in the parts of the proof that are trees. In this paper, we present three modifications that improve the algorithm so that redundancies can also be found in the parts of the proofs that are DAGs. The first modified algorithm covers a greater number of redundancies than the known algorithm without incurring any additional cost. The second modified algorithm covers an even greater number of redundancies, but may have longer run times. Our third modified algorithm is parameterized and can trade off run time against the coverage of redundancies. We have implemented our algorithms in OpenSMT and applied them to unsatisfiability proofs of 198 examples from the plain MUS track of the SAT11 competition. Compared to the original algorithm, the first and second algorithms remove an additional 0.89% and 10.57% of clauses, respectively. For a certain value of the parameter, the third algorithm removes almost as many clauses as the second algorithm but is significantly faster. © 2012 Springer-Verlag.
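As a rough illustration of the idea behind pivot recycling, the sketch below flags resolution steps whose pivot variable already occurs on the path from the root, restricted to nodes used only once (the tree-shaped parts of the proof). The node structure and field names are hypothetical, and this is not the algorithm implemented in OpenSMT or any of the paper's extensions to DAG-shaped parts.

```python
# Illustrative sketch (not the OpenSMT implementation): flag resolution
# steps whose pivot already occurs on the path from the root, considering
# only single-parent nodes, i.e. the tree-shaped parts of the proof.
from dataclasses import dataclass
from typing import Optional, List, Set


@dataclass
class Node:
    pivot: Optional[int] = None      # None for input clauses (leaves)
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    ref_count: int = 1               # number of resolvents that use this node


def redundant_pivots(root: Node) -> List[Node]:
    """Return nodes whose pivot repeats on the root-to-node path,
    restricted to tree nodes as in a basic pivot-recycling pass."""
    redundant: List[Node] = []

    def visit(node: Optional[Node], seen: Set[int]) -> None:
        if node is None or node.pivot is None:
            return
        if node.pivot in seen and node.ref_count == 1:
            redundant.append(node)
        if node.ref_count == 1:
            new_seen = seen | {node.pivot}
        else:
            # A shared node can be reached along several paths with different
            # pivot sets, so the basic (tree-only) pass restarts the set here;
            # handling such nodes properly is what the modified algorithms do.
            new_seen = {node.pivot}
        visit(node.left, new_seen)
        visit(node.right, new_seen)

    visit(root, set())
    return redundant
```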


Pietrzak K., IST Austria
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2012

The (decisional) learning with errors problem (LWE) asks to distinguish "noisy" inner products of a secret vector with random vectors from uniform. The learning parities with noise problem (LPN) is the special case where the elements of the vectors are bits. In recent years, the LWE and LPN problems have found many applications in cryptography. In this paper we introduce a (seemingly) much stronger adaptive assumption, called "subspace LWE" (SLWE), where the adversary can learn the inner product of the secret and random vectors after they have been projected into an adaptively and adversarially chosen subspace. We prove that, surprisingly, the SLWE problem mapping into subspaces of dimension d is almost as hard as LWE using secrets of length d (the other direction is trivial). This result immediately implies that several existing cryptosystems whose security is based on the hardness of the LWE/LPN problems are provably secure in a much stronger sense than anticipated. As an illustrative example, we show that the standard way of using LPN for CPA-secure symmetric encryption is even secure against a very powerful class of related-key attacks. © 2012 Springer-Verlag.
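The following is a minimal sketch, over GF(2), of the flavor of samples involved: a standard LPN sample versus a "subspace" sample in which the secret is first passed through an adversarially chosen linear map. The exact SLWE/SLPN definition in the paper is more general (adaptive choices, maps applied to both the secret and the randomness, rank conditions); all names and the simplified sample form here are illustrative assumptions.

```python
# Minimal GF(2) sketch of LPN vs. subspace-LPN style samples.  The paper's
# SLWE/SLPN samples are more general; this only illustrates the idea of
# taking the inner product after projecting into a chosen subspace.
import numpy as np

rng = np.random.default_rng(0)
n, tau = 128, 0.1            # secret length and Bernoulli noise rate

s = rng.integers(0, 2, n)    # secret vector over GF(2)


def lpn_sample():
    """Standard LPN sample: (r, <r, s> + e mod 2)."""
    r = rng.integers(0, 2, n)
    e = rng.random() < tau
    return r, (int(r @ s) + int(e)) % 2


def subspace_sample(A):
    """A is an adversarially chosen n x n GF(2) matrix (of rank >= d);
    the sample uses the projected secret A @ s instead of s."""
    r = rng.integers(0, 2, n)
    e = rng.random() < tau
    return r, (int(r @ (A @ s % 2)) + int(e)) % 2


# Example adversarial choice: project onto the first d coordinates (rank d).
d = 64
A = np.zeros((n, n), dtype=int)
A[np.arange(d), np.arange(d)] = 1
print(lpn_sample()[1], subspace_sample(A)[1])
```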


Bruno S., A.O. Fatebenefratelli e Oftalmico | Shiffman M.L., Virginia Commonwealth University | Roberts S.K., Alfred Hospital | Gane E.J., Auckland Clinical Studies | and 3 more authors.
Hepatology | Year: 2010

The objective of this study is to determine the efficacy and safety of peginterferon alfa-2a (40KD)/ribavirin in patients with advanced fibrosis. Data from 341 genotype 1/4 patients (99 with bridging fibrosis/cirrhosis) treated for 48 weeks and 1547 genotype 2/3 patients (380 with bridging fibrosis/cirrhosis) treated for 16 or 24 weeks, enrolled in three randomized international studies, were analyzed. Sustained virological response (SVR) rates decreased progressively from 60% in genotype 1/4 patients without advanced fibrosis to 51% in those with bridging fibrosis and 33% in those with cirrhosis (trend test P = 0.0028), and from 76% to 61% and 57%, respectively, in genotype 2/3 patients treated for 24 weeks (trend test P < 0.0001). Irrespective of genotype, patients without advanced fibrosis were more likely to have an earlier response to treatment, which was associated with higher SVR rates and lower relapse rates during untreated follow-up. Among patients with or without a diagnosis of advanced fibrosis, rates of SVR and relapse were similar for patients with similar responses in the first 12 weeks. Conclusion: Compared with patients with less severe disease, SVR rates are significantly lower in patients with advanced fibrosis. However, irrespective of genotype and degree of fibrosis, the time to become hepatitis C virus (HCV) RNA undetectable was the strongest predictor of SVR. Copyright © 2009 by the American Association for the Study of Liver Diseases.


This paper generalizes the well-known Diffusion Curve Images (DCIs), which are composed of a set of Bézier curves with colors specified on either side. These colors are diffused as Laplace functions over the image domain, which results in smooth color gradients interrupted by the Bézier curves. Our new formulation allows for more color control away from the boundary, providing expressive power similar to that of recent Bilaplace image models without introducing the associated issues and computational costs. The new model is based on a special blending of Laplace functions and a new edge-blur formulation. We demonstrate that, given some user-defined boundary curves over an input raster image, fitting colors and edge blur from the image to the new model, as well as subsequent editing and animation, is as convenient as with DCIs. Numerous examples and comparisons to DCIs are presented. © 2016 The Authors. Computer Graphics Forum © 2016 The Eurographics Association and John Wiley & Sons Ltd.
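For readers unfamiliar with diffusion curves, the sketch below shows the basic machinery they build on: colors fixed on curve pixels (Dirichlet data) are diffused into the rest of the image by iterating the discrete Laplace equation. This is plain harmonic diffusion only, under simplified assumptions (a pixel mask instead of Bézier curves, periodic borders from np.roll); it does not implement the paper's Laplace-function blending or edge-blur model.

```python
# Minimal sketch of harmonic (Laplace) color diffusion: pixels where the
# mask is True hold fixed boundary colors, all other pixels converge to
# the average of their four neighbors (Jacobi iteration).  Borders wrap
# around because np.roll is periodic -- a simplification for brevity.
import numpy as np


def diffuse(colors, mask, iters=2000):
    """colors: HxWx3 array with boundary colors where mask is True;
    mask: HxW boolean array marking curve pixels."""
    img = colors.copy().astype(float)
    for _ in range(iters):
        up    = np.roll(img,  1, axis=0)
        down  = np.roll(img, -1, axis=0)
        left  = np.roll(img,  1, axis=1)
        right = np.roll(img, -1, axis=1)
        avg = (up + down + left + right) / 4.0
        img[~mask] = avg[~mask]   # only free pixels are updated
    return img


# Example: two vertical "curves", one red and one blue, diffused over a
# 64x64 canvas to produce a smooth gradient between them.
H, W = 64, 64
colors = np.zeros((H, W, 3))
mask = np.zeros((H, W), dtype=bool)
mask[:, 16] = True; colors[:, 16] = [1.0, 0.0, 0.0]
mask[:, 48] = True; colors[:, 48] = [0.0, 0.0, 1.0]
result = diffuse(colors, mask)
```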
