Gouttefarde M.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics | Daney D.,French Institute for Research in Computer Science and Automation | Merlet J.-P.,French Institute for Research in Computer Science and Automation
IEEE Transactions on Robotics | Year: 2011

This paper deals with the wrench-feasible workspace (WFW) of n-degree-of-freedom parallel robots driven by n or more than n cables. The WFW is the set of mobile platform poses for which the cables can balance any wrench of a given set of wrenches, such that the tension in each cable remains within a prescribed range. Requirements of nonnegative cable tensions, as well as maximum admissible tensions, are thus satisfied. The determination of the WFW is an important issue since its size and shape are highly dependent on the geometry of the robot and on the ranges of allowed cable tensions. The approach proposed in this paper is mainly based on interval analysis. Two sufficient conditions are presented, namely, a sufficient condition for a box of poses to be fully inside the WFW and a sufficient condition for a box of poses to be fully outside the WFW. These sufficient conditions are useful because they can be tested computationally; the means of testing them are discussed in the paper. Used within usual branch-and-prune algorithms, these tests enable WFW determinations in which full-dimensional sets of poses (volumes) are found to lie within or, on the contrary, to lie outside the WFW. This provides a useful alternative to a basic discretization, the latter consisting of testing a discrete (zero-dimensional) finite set of poses. In order to improve the efficiency of the computations, a means to mitigate the undesirable effects of the so-called wrapping effect is introduced. The paper also illustrates how the proposed approach is capable of dealing with small uncertainties on the geometric design parameters of a parallel cable-driven robot. © 2010 IEEE.
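The branch-and-prune idea above can be sketched on a toy one-dimensional problem. Everything here is hypothetical: the paper's real tests involve cable tension conditions, whereas this sketch substitutes a simple scalar feasibility function f(x) = x² with an exact interval extension; only the classify-or-bisect logic mirrors the approach described.

```python
# Toy branch-and-prune with interval sufficient conditions, in the spirit of
# the WFW computation.  The "workspace" here is the set of x with f(x) = x**2
# inside [fmin, fmax]; the robot-specific tests are replaced by this stand-in.
def f_interval(lo, hi):
    # Exact interval extension of f(x) = x**2 on [lo, hi].
    cands = [lo * lo, hi * hi]
    flo = 0.0 if lo <= 0.0 <= hi else min(cands)
    return flo, max(cands)

def classify(lo, hi, fmin=1.0, fmax=4.0):
    flo, fhi = f_interval(lo, hi)
    if fmin <= flo and fhi <= fmax:
        return "inside"      # sufficient condition: box fully feasible
    if fhi < fmin or flo > fmax:
        return "outside"     # sufficient condition: box fully infeasible
    return "unknown"

def branch_and_prune(lo, hi, eps=1e-3):
    inside, outside = [], []
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        tag = classify(a, b)
        if tag == "inside":
            inside.append((a, b))
        elif tag == "outside":
            outside.append((a, b))
        elif b - a > eps:
            m = 0.5 * (a + b)            # bisect undecided boxes
            stack += [(a, m), (m, b)]    # boxes below eps are discarded
    return inside, outside
```

Run on [0, 3], the certified inside boxes cover the true feasible interval [1, 2] up to eps-thin slivers at the boundary, illustrating how full-dimensional volumes (rather than sampled points) are classified.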

Richa R.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics
Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention | Year: 2010

In the context of minimally invasive cardiac surgery, active vision-based motion compensation schemes have been proposed for mitigating problems related to physiological motion. However, robust and accurate visual tracking is a difficult task. The purpose of this paper is to present a hybrid tracker that estimates the heart surface deformation using the outputs of multiple visual tracking techniques. In the proposed method, the failure of an individual technique can be circumvented by the success of others, enabling the robust estimation of the heart surface deformation with increased spatial resolution. In addition, for coping with the absence of visual information due to motion blur or occlusions, a temporal heart motion model is incorporated as an additional support for the visual tracking task. The superior performance of the proposed technique compared to existing techniques individually is demonstrated through experiments conducted on recorded images of an in vivo minimally invasive CABG using the DaVinci robotic platform.
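The fusion principle described above can be illustrated with a minimal sketch. The function name, the confidence threshold, and the per-tracker confidence scores are all hypothetical; the actual method estimates dense surface deformation, while this sketch fuses a single 2D point from several trackers and falls back on a temporal prediction when all trackers fail.

```python
# Hypothetical sketch: confidence-weighted fusion of several trackers'
# estimates of one surface point, with a temporal-model prediction as
# fallback when visual information is unusable (blur, occlusion).
def fuse(estimates, predicted, min_conf=0.2):
    # estimates: list of ((x, y), confidence) pairs from individual trackers
    usable = [(p, c) for p, c in estimates if c >= min_conf]
    if not usable:
        return predicted                       # fall back on the motion model
    wsum = sum(c for _, c in usable)
    x = sum(p[0] * c for p, c in usable) / wsum
    y = sum(p[1] * c for p, c in usable) / wsum
    return (x, y)
```

The point of the design is that a single failing tracker only lowers its weight, while the temporal model covers the case where every tracker fails at once.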

Lartillot N.,University of Montréal | Lartillot N.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics | Rodrigue N.,Agriculture and Agri Food Canada | Rodrigue N.,University of Ottawa | And 2 more authors.
Systematic Biology | Year: 2013

Modeling across-site variation of the substitution process is increasingly recognized as important for obtaining more accurate phylogenetic reconstructions. Both finite and infinite mixture models have been proposed and have been shown to significantly improve on classical single-matrix models. Compared with their finite counterparts, infinite mixtures have a greater expressivity. However, they are computationally more challenging. This has resulted in practical compromises in the design of infinite mixture models. In particular, a fast but simplified version of a Dirichlet process model over equilibrium frequency profiles implemented in PhyloBayes has often been used in recent phylogenomics studies, while more refined model structures, more realistic and empirically a better fit, have been practically out of reach. We introduce a message passing interface version of PhyloBayes, implementing the Dirichlet process mixture models as well as more classical empirical matrices and finite mixtures. The parallelization is made efficient thanks to the combination of two algorithmic strategies: a partial Gibbs sampling update of the tree topology and the use of a truncated stick-breaking representation for the Dirichlet process prior. The implementation shows close to linear gains in computational speed for up to 64 cores, thus allowing faster phylogenetic reconstruction under complex mixture models. PhyloBayes MPI is freely available from our website www.phylobayes.org. [Bayesian inference; Dirichlet process; mixture models; phylogenetics; phylogenomics.] © The Author(s) 2013.
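The truncated stick-breaking representation mentioned above is a standard construction; a minimal sketch of a single draw of mixture weights under a Dirichlet process prior (not PhyloBayes's actual implementation) looks like this:

```python
import random

# Truncated stick-breaking draw from a Dirichlet process prior:
# w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha), and the
# final weight absorbing the remaining stick mass at the truncation level.
def stick_breaking(alpha, truncation, rng):
    weights, remaining = [], 1.0
    for _ in range(truncation - 1):
        v = rng.betavariate(1.0, alpha)
        weights.append(v * remaining)
        remaining *= 1.0 - v
    weights.append(remaining)   # truncation: leftover mass goes to last class
    return weights
```

Truncation turns the infinite mixture into a fixed-length weight vector, which is what makes the conditional updates easy to distribute across cores.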

Gascuel O.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics | Steel M.,University of Canterbury
Systematic Biology | Year: 2014

Predicting the ancestral sequences of a group of homologous sequences related by a phylogenetic tree has been the subject of many studies, and numerous methods have been proposed for this purpose. Theoretical results are available that show that when the substitution rates become too large, reconstructing the ancestral state at the tree root is no longer feasible. Here, we also study the reconstruction of the ancestral changes that occurred along the tree edges. We show that, depending on the tree and branch length distribution, reconstructing these changes (i.e., reconstructing the ancestral state of all internal nodes in the tree) may be easier or harder than reconstructing the ancestral root state. However, results from information theory indicate that for the standard Yule tree, the task of reconstructing internal node states remains feasible, even for very high substitution rates. Moreover, computer simulations demonstrate that for more complex trees and scenarios, this result still holds. For a large variety of counting, parsimony- and likelihood-based methods, the predictive accuracy of a randomly selected internal node in the tree is indeed much higher than the accuracy of the same method when applied to the tree root. Moreover, parsimony- and likelihood-based methods appear to be remarkably robust to sampling bias and model mis-specification. [Ancestral state prediction; character evolution; majority rule; Markov model; maximum likelihood; parsimony; phylogenetic tree.] © The Author(s) 2014.
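As an illustration of the parsimony-based class of methods the abstract refers to, here is a minimal Fitch small-parsimony sketch for one character on a rooted binary tree (the paper evaluates many methods; this is just the textbook algorithm, not the study's specific procedure):

```python
# Fitch small-parsimony sketch: compute candidate ancestral state sets and
# the minimum number of changes for one character.  A tree is a nested pair
# of (left, right) subtrees, with tip states given as single-letter strings.
def fitch(node):
    if isinstance(node, str):               # tip: the observed state
        return {node}, 0
    (ls, lc), (rs, rc) = fitch(node[0]), fitch(node[1])
    inter = ls & rs
    if inter:                               # children agree: no extra change
        return inter, lc + rc
    return ls | rs, lc + rc + 1             # disagreement: count one change
```

For the tree ((A, A), (A, G)) the root set is {A} with one change, i.e. the internal nodes are confidently assigned even though one tip differs.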

Lartillot N.,University of Montréal | Lartillot N.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics
Molecular Biology and Evolution | Year: 2013

According to the nearly-neutral model, variation in long-term effective population size among species should result in correlated variation in the ratio of nonsynonymous over synonymous substitution rates (dN/dS). Previous empirical investigations in mammals have been consistent with this prediction, suggesting an important role for nearly-neutral effects on protein-coding sequence evolution. GC-biased gene conversion (gBGC), on the other hand, is increasingly recognized as a major evolutionary force shaping genome nucleotide composition. When sufficiently strong compared with random drift, gBGC may significantly interfere with a nearly-neutral regime and impact dN/dS in a complex manner. Here, we investigate the phylogenetic correlations between dN/dS, the equilibrium GC composition (GC*), and several life-history and karyotypic traits in placental mammals. We show that the equilibrium GC composition decreases with body mass and increases with the number of chromosomes, suggesting a modulation of the strength of biased gene conversion due to changes in effective population size and genome-wide recombination rate. The variation in dN/dS is complex and only partially fits the prediction of the nearly-neutral theory. However, specifically restricting estimation of the dN/dS ratio on GC-conservative transversions, which are immune from gBGC, results in correlations that are more compatible with a nearly-neutral interpretation. Our investigation indicates the presence of complex interactions between selection and biased gene conversion and suggests that further mechanistic development is warranted, to tease out mutation, selection, drift, and conversion. © The Author 2012.
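The filtering idea in the last step above is easy to make concrete: GC-conservative transversions are exactly the changes A↔T and C↔G, which neither alter GC content nor are subject to gBGC. A small sketch of such a substitution filter (function names are ours, not the paper's):

```python
# Keep only GC-conservative transversions (A<->T, C<->G) when counting
# substitutions, so that the resulting dN/dS estimate is immune to gBGC.
PURINES = {"A", "G"}

def is_transversion(a, b):
    # A transversion swaps a purine for a pyrimidine or vice versa.
    return (a in PURINES) != (b in PURINES)

def is_gc_conservative(a, b):
    gc = {"G", "C"}
    return (a in gc) == (b in gc)          # GC content unchanged

def keep_for_dnds(a, b):
    return a != b and is_transversion(a, b) and is_gc_conservative(a, b)
```

Only two of the twelve possible nucleotide changes pass this filter, which is the price paid for an estimator that cannot be confounded by biased gene conversion.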

Gouy M.,CNRS Biometry and Evolutionary Biology Laboratory | Guindon S.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics | Guindon S.,University of Auckland | Gascuel O.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics
Molecular Biology and Evolution | Year: 2010

We present SeaView version 4, a multiplatform program designed to facilitate multiple alignment and phylogenetic tree building from molecular sequence data through the use of a graphical user interface. SeaView version 4 combines all the functions of the widely used programs SeaView (in its previous versions) and Phylo-win, and expands them by adding network access to sequence databases, alignment with arbitrary algorithms, maximum-likelihood tree building with PhyML, and display, printing, and copy-to-clipboard of rooted or unrooted, binary or multifurcating phylogenetic trees. Among the wide range of tools and algorithms currently available for phylogenetic analyses, SeaView is especially useful for teaching and for occasional users of such software. SeaView is freely available at http://pbil.univ-lyon1.fr/software/seaview.

Villabona-Arenas C.J.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics
AIDS | Year: 2016

OBJECTIVE: In resource-limited countries (RLC), antiretroviral therapy (ART) has been scaled up, but individual monitoring is still sub-optimal. Here we studied whether or not ART had an impact on the frequency and selection of drug resistance mutations (DRMs) under these settings. We also examined whether differences exist between HIV-1 genetic variants. DESIGN: A total of 3,736 sequences from individuals failing standard first-line ART (n = 1,599, AZT/d4T+3TC+NVP/EFV) were analyzed and compared to sequences from reverse transcriptase inhibitor (RTI)-naïve individuals (n = 2,137) from 10 West and Central African countries. METHODS: Fisher exact tests and corrections for multiple comparisons were used to assess the significance of associations. RESULTS: All RTI DRMs from the 2015 IAS list, except F227C, and nine mutations from other expert lists were observed to confer extensive resistance and cross-resistance. Five additional independently selected mutations (I94L, L109I, V111L, T139R, T165L) were statistically associated with treatment. The proportion of sequences with multiple mutations, and the frequency of all TAMs, M184V, certain NNRTIs, I94L, and L109I, showed a substantial increase with time on ART. Only one nucleoside and two non-nucleoside RTI DRMs differed by subtype/CRF. CONCLUSIONS: This study validates the global robustness of the current DRM repertoire, in particular for CRF-02 predominating in West and Central Africa, despite our finding of five additional selected mutations. However, long-term ART without virological monitoring clearly leads to the accumulation of mutations and the emergence of additional variations, which limit drug options for treatment and can be transmitted. Improved monitoring and optimization of ART are necessary for the long-term effectiveness of ART. Copyright © 2016 Wolters Kluwer Health, Inc.
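The statistical procedure named in the METHODS section can be sketched generically. The counts below are invented for illustration, the test shown is the one-sided (enrichment) Fisher exact test built directly from the hypergeometric distribution, and the correction is plain Bonferroni; the paper's exact test variant and correction method are not specified here.

```python
from math import comb

# One-sided Fisher exact test on a 2x2 table [[a, b], [c, d]]:
# p = P(X >= a) under the hypergeometric null of no association.
def fisher_one_sided(a, b, c, d):
    n, row1, col1 = a + b + c + d, a + b, a + c
    denom = comb(n, col1)
    return sum(comb(row1, k) * comb(n - row1, col1 - k)
               for k in range(a, min(row1, col1) + 1)) / denom

# Bonferroni correction: divide alpha by the number of mutations tested.
def significant(tables, alpha=0.05):
    threshold = alpha / len(tables)
    return [fisher_one_sided(*t) <= threshold for t in tables]
```

With thousands of candidate mutations, the per-test threshold becomes very small, which is why only strongly treatment-associated mutations (like the five reported) survive the correction.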

Guindon S.,University of Auckland | Guindon S.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics
Systematic Biology | Year: 2013

The accuracy and precision of species divergence date estimation from molecular data strongly depend on the models describing the variation of substitution rates along a phylogeny. These models generally assume that rates randomly fluctuate along branches from one node to the next. However, for mathematical convenience, the stochasticity of such a process is ignored when translating these rate trajectories into branch lengths. This study addresses this shortcoming. A new approach is described that explicitly considers the average substitution rates along branches as random quantities, resulting in a more realistic description of the variations of evolutionary rates along lineages. The proposed method provides more precise estimates of the rate autocorrelation parameter as well as divergence times. Also, simulation results indicate that ignoring the stochastic variation of rates along edges can lead to significant overestimation of specific node ages. Altogether, the new approach introduced in this study is a step forward to designing biologically relevant models of rate evolution that are well suited to data sets with dense taxon sampling which are likely to present rate autocorrelation. © 2012 The Author(s).
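The abstract's central point, that the average rate along a branch is itself a random quantity, can be illustrated by Monte Carlo simulation. The model below is an assumption chosen for the sketch, not the paper's: the log-rate follows a discretized Brownian bridge between two fixed node rates, and we return the path average of the rate (which is what the branch length actually depends on). Even with identical endpoint rates, the average varies from draw to draw.

```python
import math
import random

# Sketch: sample the average substitution rate along one branch whose
# log-rate follows a Brownian bridge between fixed node log-rates.
# Deterministic treatments would use only r_start and r_end; the spread
# of the returned averages is the stochasticity the paper accounts for.
def average_rate(r_start, r_end, sigma, steps, rng):
    log_e = math.log(r_end)
    x = math.log(r_start)
    path = []
    for i in range(1, steps):
        remaining = steps - i + 1                   # steps left to the end
        mean = x + (log_e - x) / remaining          # bridge drift toward end
        sd = sigma * math.sqrt((1.0 / steps) * (remaining - 1) / remaining)
        x = rng.gauss(mean, sd)
        path.append(math.exp(x))
    return sum(path) / len(path)
```

Repeated draws with r_start = r_end = 1 give a distribution of branch-average rates, not a single value; ignoring that spread is exactly the simplification the paper removes.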

Lu A.,University of Auckland | Lu A.,The New Zealand Institute for Plant and Food Research Ltd | Guindon S.,University of Auckland | Guindon S.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics
Molecular Biology and Evolution | Year: 2014

The branch-site model is a popular approach that accommodates lineage- and site-specific heterogeneity of natural selection regimes among coding sequences. This model relies on prior knowledge of the (foreground) lineage(s) evolving under positive selection at some sites. Unfortunately, such prior information is not always available in practice. A more recent technique (Guindon S, Rodrigo A, Dyer K, Huelsenbeck J. 2004. Modeling the site-specific variation of selection patterns along lineages. Proc Natl Acad Sci USA 101:12957-12962) alleviates this issue by explicitly modeling the variability of selection patterns using a stochastic process. However, the performance of this approach for deciding whether a set of homologous sequences evolved under positive selection at some point has not yet been assessed. This study compares the sensitivity and specificity of tests for positive selection derived from both the standard and the stochastic approaches using extensive simulations. We show that the two methods have low proportions of type I errors, that is, they tend to be conservative when testing the null hypothesis of no positive selection if sequences truly evolve under neutral or negative selection regimes. Also, the standard approach is more powerful than the stochastic one when the prior knowledge on foreground lineages is correct. When this prior is incorrect, however, the stochastic approach outperforms the standard model in a broad range of conditions. Additional comparisons also suggest that the stochastic branch-site method compares favorably with the recently proposed mixed-effects model of evolution of Murrell et al. (Murrell B, Wertheim JO, Moola S, Weighill T, Scheffler K, Pond SLK. 2012. Detecting individual sites subject to episodic diversifying selection. PLoS Genet. 8:e1002764). Altogether, our results show that the standard branch-site model is well suited to confirmatory analyses, whereas the stochastic approach should be preferred over the standard or the mixed-effects ones for exploratory studies. © 2013 The Author.
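The hypothesis-testing step shared by these methods is a likelihood-ratio test of "no positive selection" against the richer alternative. A minimal sketch for the df = 1 chi-squared case follows; note this is a generic LRT, and branch-site tests in practice often compare the statistic to a 50:50 mixture of chi-squared distributions rather than plain chi-squared(1).

```python
import math

# Likelihood-ratio test sketch: 2 * (lnL_alt - lnL_null) compared to a
# chi-squared distribution with one degree of freedom.  The survival
# function of chi2(1) is erfc(sqrt(x / 2)).
def lrt_pvalue(lnl_null, lnl_alt, df=1):
    stat = max(0.0, 2.0 * (lnl_alt - lnl_null))   # clamp boundary cases
    if df != 1:
        raise NotImplementedError("sketch handles df = 1 only")
    return math.erfc(math.sqrt(stat / 2.0))
```

A log-likelihood gain of 2 units under the alternative yields a statistic of 4 and a p-value near 0.046, i.e. borderline significance at the 5% level.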

Romashchenko A.,CNRS Montpellier Laboratory of Informatics, Robotics and Microelectronics
Theory of Computing Systems | Year: 2014

We study probabilistic bit-probe schemes for the membership problem. Given a set A of at most n elements from a universe of size m, we organize a data structure such that queries of the type "x ∈ A?" can be answered very quickly. H. Buhrman, P.B. Miltersen, J. Radhakrishnan, and S. Venkatesh proposed a randomized bit-probe scheme that needs space of O(n log m) bits. That scheme has a randomized algorithm processing queries; it needs to read only one randomly chosen bit from the memory to answer a query. For every x the answer is correct with high probability (with two-sided errors). In this paper we slightly modify the bit-probe model of Buhrman et al. and consider schemes with a small amount of auxiliary information in "cache" memory. In this model, we show that for the membership problem there exists a bit-probe scheme with one-sided error that needs space of O(n log² m + poly(log m)) bits, which cannot be achieved in the model without cache. We also obtain a slightly weaker result (space of n^(1+δ)·poly(log m) bits and two bit probes for every query) for a scheme that is effectively encodable. © 2012 Springer Science+Business Media, LLC.
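The flavor of a one-probe randomized scheme can be conveyed with a toy filter (this is an illustration of the model, not the construction from the paper or from Buhrman et al.): the structure consists of several independently hashed bit rows, each element sets one bit per row, and a query reads a single bit from one randomly chosen row. Members are always answered "yes"; non-members err only when their probed bit collides with a stored element's bit, giving one-sided error.

```python
import random

# Toy one-probe membership filter: r hash rows of w bits each.  add() sets
# one bit per row; query() reads exactly ONE bit, from a randomly chosen
# row.  No false negatives; false positives occur only on hash collisions.
class OneProbeFilter:
    def __init__(self, rows, width, seed=0):
        self.width = width
        rng = random.Random(seed)
        self.salts = [rng.randrange(1 << 30) for _ in range(rows)]
        self.bits = [[0] * width for _ in range(rows)]

    def _h(self, i, x):
        # Toy salted hash; a real scheme would use explicit hash families.
        return hash((self.salts[i], x)) % self.width

    def add(self, x):
        for i in range(len(self.bits)):
            self.bits[i][self._h(i, x)] = 1

    def query(self, x, rng):
        i = rng.randrange(len(self.bits))     # the single random bit probe
        return self.bits[i][self._h(i, x)] == 1
```

With n stored elements and rows of width w, a non-member is wrongly accepted with probability at most n/w per probe, which is the trade-off between space and error that the paper's cache-augmented model improves upon.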
