Institut des Systèmes Complexes

France


Castro-Gonzalez C.,Technical University of Madrid | Castro-Gonzalez C.,CIBER ISCIII | Castro-Gonzalez C.,Massachusetts Institute of Technology | Luengo-Oroz M.A.,Technical University of Madrid | And 21 more authors.
PLoS Computational Biology | Year: 2014

A gene expression atlas is an essential resource for quantifying and understanding the multiscale processes of embryogenesis in time and space. The automated reconstruction of a prototypic 4D atlas for vertebrate early embryos, using multicolor fluorescence in situ hybridization with nuclear counterstain, requires dedicated computational strategies. To this end, we designed an original methodological framework implemented in a software tool called Match-IT. With only minimal human supervision, our system is able to gather gene expression patterns observed in different analyzed embryos, despite their phenotypic variability, and map them onto a series of common 3D templates over time, creating a 4D atlas. This framework was used to construct an atlas composed of 6 gene expression templates from a cohort of zebrafish early embryos spanning 6 developmental stages from 4 to 6.3 hpf (hours post fertilization). They included 53 specimens, 181,415 detected cell nuclei and the segmentation of 98 gene expression patterns observed in 3D for 9 different genes. In addition, an interactive visualization software tool, Atlas-IT, was developed to inspect, supervise and analyze the atlas. Match-IT and Atlas-IT, including user manuals, representative datasets and video tutorials, are publicly and freely available online. We also propose computational methods and tools for the quantitative assessment of the gene expression templates at the cellular scale, with the identification, visualization and analysis of coexpression patterns, synexpression groups and their dynamics through developmental stages. © 2014 Castro-González et al.
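
As an illustration of the kind of cellular-scale coexpression measurement such tools support, here is a minimal sketch that scores pairwise coexpression as a Jaccard index over binary per-cell expression masks; the gene names, masks and data layout are hypothetical, not Match-IT's actual data model.

```python
import numpy as np

def coexpression_jaccard(expr_a, expr_b):
    """Jaccard index between two binary per-cell expression masks."""
    a, b = np.asarray(expr_a, bool), np.asarray(expr_b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

# Hypothetical example: 3 genes scored over the same 6 template cells.
genes = {
    "gene_a": [1, 1, 0, 0, 1, 0],
    "gene_b": [1, 1, 1, 0, 0, 0],
    "gene_c": [0, 0, 0, 1, 1, 1],
}
names = list(genes)
for i, gi in enumerate(names):
    for gj in names[i + 1:]:
        print(gi, gj, round(coexpression_jaccard(genes[gi], genes[gj]), 2))
```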


Tonda A.P.,Institut des Systèmes Complexes | Lutton E.,University Paris-Sud | Reuillon R.,Institut des Systèmes Complexes | Squillero G.,Polytechnic University of Turin | Wuillemin P.-H.,LIP6
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2012

Bayesian networks are stochastic models, widely adopted to encode knowledge in several fields. One of the most interesting features of a Bayesian network is the possibility of learning its structure from a set of data and subsequently using the resulting model to make new predictions. Structure learning for such models is an NP-hard problem, for which the scientific community has developed two main approaches: score-and-search metaheuristics, often evolutionary-based, and deterministic dependency-analysis algorithms, based on statistical tests. State-of-the-art solutions have been presented in both domains, but all methodologies start from the assumption that large learning sets, often numbering thousands of samples, are available. This is not the case for many real-world applications, especially in the food processing and research industry. This paper proposes an evolutionary approach to the Bayesian structure learning problem, specifically tailored to learning sets of limited size. Falling into the category of score-and-search techniques, the methodology exploits an evolutionary algorithm able to work directly on graph structures, previously used for assembly language generation, and a scoring function based on the Akaike Information Criterion (AIC), a well-studied metric of stochastic model performance. Experimental results show that the approach is able to outperform a state-of-the-art dependency-analysis algorithm, providing better models for small datasets. © 2012 Springer-Verlag.
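
The scoring side of such a score-and-search approach can be made concrete with a short sketch: the AIC score of a candidate structure decomposes over variable families into a maximum-likelihood term minus a parameter-count penalty. The data, structure and helper below are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np
from collections import Counter

def aic_score(data, parents, arities):
    """AIC-style score: log-likelihood minus free-parameter count,
    summed over families. data: tuples of discrete values;
    parents: dict mapping variable index -> list of parent indices."""
    score = 0.0
    for v in range(len(arities)):
        ps = parents.get(v, [])
        joint = Counter((tuple(row[p] for p in ps), row[v]) for row in data)
        marg = Counter(tuple(row[p] for p in ps) for row in data)
        loglik = sum(c * np.log(c / marg[pa]) for (pa, _), c in joint.items())
        n_params = (arities[v] - 1) * int(np.prod([arities[p] for p in ps]))
        score += loglik - n_params  # penalty: one unit per free parameter
    return score

# Hypothetical: 2 binary variables, candidate structure 0 -> 1.
data = [(0, 0), (0, 0), (1, 1), (1, 1), (1, 0), (0, 1)] * 10
print(aic_score(data, {1: [0]}, arities=[2, 2]))
print(aic_score(data, {}, arities=[2, 2]))  # empty graph for comparison
```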


Cointet J.-P.,University Paris-Est Créteil | Cointet J.-P.,Institut des Systèmes Complexes | Mogoutov A.,University Paris-Est Créteil | Bourret P.,Aix-Marseille University | And 2 more authors.
Médecine/Sciences | Year: 2012

This paper examines the emergence and development of one of the key components of genomics, namely gene expression profiling. It does so using computer-based methods to analyze and visualize networks of scientific publications. Our results show the central role played by oncology in this domain, insofar as the initial proof-of-principle articles based on a plant model organism quickly led to the demonstration of the value of these techniques in blood cancers and to applications in the field of solid tumors, in particular breast cancer. The article also outlines the essential role played by novel bioinformatics and biostatistical tools in the development of the domain. These computational disciplines thus qualify as one of the three corners (in addition to the laboratory and the clinic) of the translational research triangle.
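
A minimal sketch of the underlying bibliometric technique, assuming each paper is reduced to a set of indexed terms (the toy corpus and the library choice are illustrative, not the authors' actual pipeline): terms that co-occur in papers form a weighted network whose central nodes map the domain.

```python
import networkx as nx
from itertools import combinations

# Hypothetical toy corpus: each paper is a set of indexed terms.
papers = [
    {"microarray", "plant model", "expression profiling"},
    {"microarray", "lymphoma", "expression profiling", "clustering"},
    {"breast cancer", "expression profiling", "prognosis"},
    {"lymphoma", "clustering", "prognosis"},
]
G = nx.Graph()
for terms in papers:
    for a, b in combinations(sorted(terms), 2):
        w = G.edges[a, b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

# Terms ranked by weighted degree hint at the domain's central topics.
print(sorted(G.degree(weight="weight"), key=lambda t: -t[1])[:3])
```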


Possoz C.,French National Center for Scientific Research | Junier I.,Institut Universitaire de France | Junier I.,Institut des Systèmes Complexes | Espeli O.,French National Center for Scientific Research
Frontiers in Bioscience | Year: 2012

Dividing cells have mechanisms to ensure that their genomes are faithfully segregated into daughter cells. In bacteria, the description of these mechanisms has improved considerably in recent years. This review focuses on the different aspects of bacterial chromosome segregation that can be understood thanks to studies performed with model organisms: Escherichia coli, Bacillus subtilis, Caulobacter crescentus and Vibrio cholerae. We describe the global positioning of the nucleoid in the cell and the specific localization and dynamics of different chromosomal loci, and we present kinetic and biophysical aspects of chromosome segregation. Finally, we present the key proteins involved in chromosome segregation.


Rizzi B.,French National Center for Scientific Research | Rizzi B.,Institut des Systèmes Complexes | Peyrieras N.,French National Center for Scientific Research | Peyrieras N.,Institut des Systèmes Complexes
Journal of Chemical Biology | Year: 2014

Embryogenesis is a dynamic process with an intrinsic variability whose understanding requires the integration of molecular, genetic, and cellular dynamics. Biological circuits function over time at the level of single cells and require a precise analysis of the topology, temporality, and probability of events. Integrative developmental biology is currently looking for appropriate strategies to capture the intrinsic properties of biological systems. The "-omic" approaches require disruption of the function of the biological circuit; they provide static information, with low temporal resolution, and usually with population averaging that masks fast or variable features at the cellular scale and in a single individual. These data should be correlated with cell behavior, as cells are the integrators of biological activity. Cellular dynamics are captured by in vivo microscopy of live organisms. This can be used to reconstruct the 3D + time cell lineage tree that serves as the basis for modeling the organism's multiscale dynamics. We discuss here the progress that has been made in this direction, starting with the reconstruction over time of three-dimensional digital embryos from in toto time-lapse imaging. Digital specimens provide the means for a quantitative description of the development of model organisms that can be stored, shared, and compared. They open the way to in silico experimentation and to a more theoretical approach to biological processes. We show, with some unpublished results, how the proposed methodology can be applied to sea urchin species that have been model organisms in the field of classical embryology and modern developmental biology for over a century. © 2013 The Author(s).
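
To make the notion of a digital lineage tree concrete, here is a minimal illustrative data structure for a 3D + time lineage; the class names, fields and toy values are hypothetical, not any published tool's format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Cell:
    """One node of a 3D + time lineage tree (illustrative structure only)."""
    ident: str
    t_birth: float                     # time of the division that created it
    track: List[tuple] = field(default_factory=list)   # (t, x, y, z) samples
    daughters: List["Cell"] = field(default_factory=list)

    def divide(self, t, name_a, name_b):
        self.daughters = [Cell(name_a, t), Cell(name_b, t)]
        return self.daughters

def divisions(root):
    """Yield (time, dividing cell id) for every division in the tree."""
    if root.daughters:
        yield (root.daughters[0].t_birth, root.ident)
        for d in root.daughters:
            yield from divisions(d)

zygote = Cell("ab", t_birth=0.0)
c1, c2 = zygote.divide(1.0, "ab.1", "ab.2")
c1.divide(2.0, "ab.1.1", "ab.1.2")
print(sorted(divisions(zygote)))
```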


Auger P.,IRD Center of Île-de-France | Lett C.,Institut des Systèmes Complexes | Moussaoui A.,Abou Bekr Belkaid University Tlemcen | Pioch S.,Caisse des Dépôts
Canadian Journal of Fisheries and Aquatic Sciences | Year: 2010

We present a mathematical model of artificial pelagic multisite fisheries. The model is a stock-effort dynamical model of a fishery subdivided into artificial fishing sites such as fish-aggregating devices (FADs) or artificial habitats (AHs). The objective of the work is to investigate the effects of the number of sites on the global activity of the fishery. We consider a linear chain of fishing sites in which fish are harvested by fishing vessels and a free stock that is unattached to the sites and not exploited. Fish movements between the sites and the free stock, as well as vessel displacements between the sites, are assumed to take place at a faster time scale than the variation of the stock and the change of the fleet size. We take advantage of these two time scales to derive a reduced model governing the dynamics of the total fish stock and the total fishing effort. We show that there exists an optimal number of fishing sites that maximizes the total catch at equilibrium. We finally extend the model to the situation in which both fish attached to the sites and fish in the free stock are exploited.
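
For readers unfamiliar with stock-effort models, the sketch below integrates the classical single-stock form (Smith-style dynamics), the general shape to which the multisite model reduces after time-scale separation; all parameter values are hypothetical, not the article's.

```python
# Classical stock-effort dynamics: logistic fish growth minus harvest,
# with fishing effort growing or shrinking with profit.
r, K = 1.0, 100.0                  # growth rate, carrying capacity
q, p, c, e = 0.05, 2.0, 1.0, 0.1   # catchability, price, cost, effort response

def step(B, E, dt=0.01):
    dB = r * B * (1 - B / K) - q * E * B   # stock: growth - catch
    dE = e * (p * q * B - c) * E           # effort follows profitability
    return B + dt * dB, E + dt * dE

B, E = 50.0, 1.0
for _ in range(200_000):                   # forward Euler to equilibrium
    B, E = step(B, E)
print(f"equilibrium stock ~ {B:.1f}, effort ~ {E:.1f}, catch ~ {q*E*B:.1f}")
```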


Reuillon R.,Institut des Systèmes Complexes | Leclaire M.,CITES | Rey-Coyrehourcq S.,CITES
Future Generation Computer Systems | Year: 2013

Complex systems display multiple levels of collective structure and organization. In such systems, the emergence of global behaviour from local interactions is generally studied through large-scale experiments on numerical models. This analysis generates heavy computational loads which require the use of multi-core servers, clusters or grid computing. Dealing with such large-scale executions is especially challenging for modellers who do not possess the theoretical and methodological skills required to take advantage of high performance computing environments. That is why we have designed a cloud approach for model experimentation. This approach has been implemented in OpenMOLE (Open MOdeL Experiment) as a Domain Specific Language (DSL) that leverages the naturally parallel aspect of model experiments. The OpenMOLE DSL has been designed to explore user-supplied models. It transparently delegates their numerous executions to remote execution environments. From a user perspective, those environments are viewed as services providing computing power, so no technical detail is ever exposed. This paper presents the OpenMOLE DSL through the example of a toy model exploration and through the automated calibration of a real-world complex-system model in the field of geography. © 2013 Elsevier B.V. All rights reserved.
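
OpenMOLE's actual DSL is Scala-based; as a language-neutral illustration of the pattern it captures (declare a parameter exploration once and let the engine farm the runs out), here is a Python analogue in which a local process pool stands in for the remote environments. The model function and parameters are hypothetical placeholders for a user-supplied simulation.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def model(density, seed):
    """Stand-in for a user-supplied simulation run (hypothetical)."""
    x = density
    for _ in range(1000):
        x = 3.57 * x * (1 - x)   # placeholder dynamics
    return density, seed, x

if __name__ == "__main__":
    # The exploration is naturally parallel: one task per parameter tuple.
    grid = product([i / 10 for i in range(1, 10)], range(5))
    with ProcessPoolExecutor() as pool:   # stand-in for cluster/grid backends
        for density, seed, out in pool.map(model, *zip(*grid)):
            print(density, seed, round(out, 4))
```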


Sanchez E.,Polytechnic University of Turin | Squillero G.,Polytechnic University of Turin | Tonda A.,Institut des Systèmes Complexes
Proceedings - International Workshop on Microprocessor Test and Verification | Year: 2012

The 40 years since the appearance of the Intel 4004 have deeply changed how microprocessors are designed. Today, essential steps in the validation process are performed on physical dies, analyzing the actual behavior under appropriate stimuli. This paper presents a methodology that can be used to devise assembly programs suitable for a range of on-silicon activities, such as speed debug, timing verification or speed binning. The methodology is fully automatic. It exploits the feedback from the microprocessor under examination and does not rely on information about its microarchitecture, nor does it require design-for-debug features. The experimental evaluation performed on an Intel Core i7-950 demonstrates the feasibility of the approach. © 2011 IEEE.
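
The feedback loop can be sketched as a simple evolutionary search over candidate programs, with a device measurement as fitness; the toy instruction pool and scoring proxy below are hypothetical stand-ins, not the paper's actual generator.

```python
import random

ISA = ["add", "mul", "load", "store", "jmp", "nop"]   # toy instruction pool

def random_program(length=20):
    return [random.choice(ISA) for _ in range(length)]

def on_silicon_score(program):
    """Stand-in for device feedback (e.g. a frequency measurement during
    speed debug); here just a hypothetical proxy for illustration."""
    return program.count("mul") + 0.5 * program.count("load")

def evolve(pop_size=30, generations=50):
    pop = [random_program() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=on_silicon_score, reverse=True)
        parents = pop[: pop_size // 2]          # keep the best half
        children = []
        for p in parents:                       # mutate one instruction each
            child = p.copy()
            child[random.randrange(len(child))] = random.choice(ISA)
            children.append(child)
        pop = parents + children
    return max(pop, key=on_silicon_score)

print(" ".join(evolve()))
```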


Barbieri D.,Institut des Systèmes Complexes | Citti G.,University of Bologna | Cocci G.,University of Bologna | Sarti A.,French School for Advanced Studies in the Social Sciences
Journal of Mathematical Imaging and Vision | Year: 2014

In this paper we develop a geometrical model of functional architecture for the processing of spatio-temporal visual stimuli. The model arises from the properties of the linear receptive-field dynamics of orientation- and speed-selective cells in the visual cortex, which can be embedded in the definition of a geometry where the connectivity between points is driven by the contact structure of a 5D manifold. We then compute the stochastic kernels that approximate two Fokker-Planck operators associated with the geometry, and implement them as facilitation patterns within a neural population activity model, in order to reproduce psychophysiological findings from the literature about the perception of contours in motion and trajectories of points. © Springer Science+Business Media New York 2014.
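
The kernels in question are fundamental solutions of Fokker-Planck (forward Kolmogorov) equations. In generic form, with drift b along the deterministic part of the dynamics and diffusion of strength sigma on the remaining variables (the specific 5D operators are defined in the paper; this is only the general template):

```latex
% Generic Fokker-Planck equation whose fundamental solution, started from a
% Dirac mass at x_0, gives a connectivity kernel over the manifold.
\partial_s\, p(x, s) = -\nabla \cdot \big( b(x)\, p(x, s) \big)
  + \frac{\sigma^2}{2}\, \Delta\, p(x, s),
\qquad p(x, 0) = \delta_{x_0}(x)
```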


Reuillon R.,Institut des Systèmes Complexes | Leclaire M.,Institut des Systèmes Complexes | Passerat-Palmbach J.,Imperial College London
Proceedings of the 2015 International Conference on High Performance Computing and Simulation, HPCS 2015 | Year: 2015

OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. In this work, we briefly expose the strong assets of OpenMOLE and demonstrate its efficiency at exploring the parameter set of an agent simulation model. We perform a multi-objective optimisation on this model using computationally expensive Genetic Algorithms (GA). OpenMOLE hides the complexity of designing such an experiment thanks to its DSL, and transparently distributes the optimisation process. The example shows how an initialisation of the GA with a population of 200,000 individuals can be evaluated in one hour on the European Grid Infrastructure. © 2015 IEEE.
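
The multi-objective side of such a calibration rests on Pareto dominance. Below is a minimal sketch of extracting the non-dominated front from evaluated individuals; the objective values are hypothetical, and this is not OpenMOLE's own GA implementation.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Keep individuals not dominated by any other (objectives as tuples)."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Hypothetical evaluated individuals: (calibration error, model complexity).
evaluated = [(0.9, 3), (0.4, 7), (0.6, 4), (0.4, 9), (1.2, 2), (0.6, 3)]
print(sorted(pareto_front(evaluated)))   # -> [(0.4, 7), (0.6, 3), (1.2, 2)]
```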
