Agency: European Commission | Branch: FP7 | Program: CSA | Phase: INFRA-2007-1.2.3;INFRA-2007-1.2-03 | Award Amount: 5.11M | Year: 2008

EELA-2 aims to build, on the current EELA e-Infrastructure, a high-capacity, production-quality, scalable Grid Facility providing round-the-clock, worldwide access to distributed computing, storage and network resources for a wide spectrum of applications from European and Latin American scientific communities. The project will provide an empowered Grid Facility with versatile services fulfilling application requirements, and will ensure the long-term sustainability of the e-Infrastructure beyond the term of the project. The specific EELA-2 objectives are:

- Build a Grid Facility by: expanding the current EELA e-Infrastructure to more production sites, mobilising more computing nodes and more storage space at the start of the project, with further growth of storage over its duration; providing, in collaboration with related projects (e.g. EGEE), the full set of Grid Services needed by all types of scientific applications; and supporting applications of various types (from classical off-line data processing up to control and data acquisition of scientific instruments), selected against well-defined criteria (including grid added value, suitability for Grid deployment, and outreach/potential impact).
- Ensure the sustainability of the Grid Facility by: working through already established and new contacts with policy/decision makers, collaborating with RedCLARA and the NRENs, and supporting the ongoing creation of e-Science Initiatives and/or National Grid Initiatives (NGIs); building up the support of the e-Infrastructure to provide a complete set of Global Services from a Central Operation Centre and to pave the way for the creation of Regional Operation Centres in Latin America; attracting new applications; making knowledge of the EELA-2 Grid Facility available to all potential users, developers, and decision makers through an extensive Training and Dissemination programme; and creating knowledge repositories federated with those of EGEE.

Agency: European Commission | Branch: H2020 | Program: RIA | Phase: SC1-PM-22-2016 | Award Amount: 15.59M | Year: 2016

ZIKAlliance is a multidisciplinary project with a global One Health approach, built on: a multi-centric network of clinical cohorts in the Caribbean, Central & South America; research sites in countries where the virus has been or is currently circulating (Africa, Asia, Polynesia) or is at risk for emergence (Reunion Island); a strong network of European and Brazilian clinical & basic research institutions; and multiple interfaces with other scientific and public health programmes. ZIKAlliance will address three key objectives relating to (i) the impact of Zika virus (ZIKV) infection during pregnancy and the short & medium term effects on newborns, (ii) the associated natural history of ZIKV infection in humans and their environment in the context of other circulating arboviruses, and (iii) building the overall capacity for preparedness research for future epidemic threats in Latin America & the Caribbean. The project will take advantage of large standardised clinical cohorts of pregnant women and febrile patients in regions of Latin America and the Caribbean where the virus is circulating, expanding a preexisting network established by the IDAMS EU project. It will also benefit from very strong expertise in basic and environmental sciences, with access to both field work and sophisticated technological infrastructures to characterise virus replication and pathophysiology mechanisms. To meet its 3 key objectives, the scientific project has been organised in 9 work packages, with WP2/3 dedicated to clinical research (cohorts, clinical biology, epidemiology & modeling), WP3/4 to basic research (virology & antivirals, pathophysiology & animal models), WP5/6 to environmental research (animal reservoirs, vectors & vector control), WP7/8 to social sciences & communication, and WP9 to management. The broad consortium set-up allows gathering the necessary expertise for a genuinely interdisciplinary approach, and operating in a range of countries with contrasting ZIKV epidemiological status.

Millan D.,Polytechnic University of Catalonia | Millan D.,Laboratorio Nacional Of Computacao Cientifica | Arroyo M.,Polytechnic University of Catalonia
Computer Methods in Applied Mechanics and Engineering | Year: 2013

Model reduction in computational mechanics is generally addressed with linear dimensionality reduction methods such as Principal Components Analysis (PCA). Hypothesizing that in many applications of interest the essential dynamics evolve on a nonlinear manifold, we explore here reduced order modeling based on nonlinear dimensionality reduction methods. Such methods are gaining popularity in diverse fields of science and technology, such as machine perception or molecular simulation. We consider finite deformation elastodynamics as a model problem, and identify the manifold where the dynamics essentially take place - the slow manifold - by nonlinear dimensionality reduction methods applied to a database of snapshots. Contrary to linear dimensionality reduction, the smooth parametrization of the slow manifold needs special techniques, and we use local maximum entropy approximants. We then formulate the Lagrangian mechanics on these data-based generalized coordinates, and develop variational time-integrators. Our proof-of-concept example shows that a few nonlinear collective variables provide similar accuracy to tens of PCA modes, suggesting that the proposed method may be very attractive in control or optimization applications. Furthermore, the reduced number of variables brings insight into the mechanics of the system under scrutiny. Our simulations also highlight the need to model the net effect of the disregarded degrees of freedom on the reduced dynamics at long times. © 2013 Elsevier B.V.
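The linear baseline the abstract contrasts against can be sketched in a few lines: assemble a snapshot matrix, take a truncated SVD, and project onto the leading modes. The data here are synthetic and the dimensions and mode count are illustrative choices, not taken from the paper.

```python
import numpy as np

# Hypothetical snapshot database: each column is a system state at one time.
rng = np.random.default_rng(0)
modes_true = rng.standard_normal((200, 3))             # 3 underlying modes
snapshots = modes_true @ rng.standard_normal((3, 50))  # 200 dofs, 50 snapshots

# Linear (PCA/POD) reduction: truncated SVD of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
k = 3                         # number of retained modes
basis = U[:, :k]              # reduced linear basis
coords = basis.T @ snapshots  # generalized coordinates (k x n_snapshots)

# Relative reconstruction error: how well the linear subspace captures the data.
err = np.linalg.norm(snapshots - basis @ coords) / np.linalg.norm(snapshots)
```

When the snapshots truly lie on a curved manifold, such a linear basis needs many modes to reach a given accuracy, which is the motivation for the nonlinear reduction pursued in the paper.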

Leiva J.S.,Instituto Balseiro | Blanco P.J.,Laboratorio Nacional Of Computacao Cientifica | Buscaglia G.C.,University of Sao Paulo
International Journal for Numerical Methods in Engineering | Year: 2010

In this article we address decomposition strategies especially tailored to perform strong coupling of dimensionally heterogeneous models, under the hypothesis that one wants to solve each submodel separately and implement the interaction between subdomains by boundary conditions alone. The novel methodology takes full advantage of the small number of interface unknowns in this kind of problem. Existing algorithms can be viewed as variants of the 'natural' staggered algorithm in which each domain transfers function values to the other and receives fluxes (or forces), and vice versa. This natural algorithm is known as Dirichlet-to-Neumann in the Domain Decomposition literature. Essentially, we propose a framework in which this algorithm is equivalent to applying Gauss-Seidel iterations to a suitably defined (linear or nonlinear) system of equations. It is then immediate to switch to other iterative solvers such as GMRES or other Krylov-based methods, which we assess through numerical experiments showing the significant gain that can be achieved. Indeed, the benefit is that an extremely flexible, automatic coupling strategy can be developed, which in addition leads to iterative procedures that are parameter-free and rapidly converging. Further, in linear problems they have the finite termination property. © 2009 John Wiley & Sons, Ltd.
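The reinterpretation described above can be illustrated on a toy interface system: a Gauss-Seidel sweep over a small linear system plays the role of the 'natural' Dirichlet-to-Neumann exchange. The matrix and right-hand side below are illustrative stand-ins for the interface unknowns, not data from the article.

```python
import numpy as np

# Toy interface system A x = b standing in for the coupled-interface equations.
A = np.array([[4.0, -1.0], [-1.0, 3.0]])
b = np.array([1.0, 2.0])

# 'Natural' staggered (Dirichlet-to-Neumann) exchange viewed as Gauss-Seidel:
# sweep through the interface unknowns, always using the freshest values.
x = np.zeros_like(b)
for _ in range(50):
    for i in range(len(b)):
        x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]

exact = np.linalg.solve(A, b)  # the solution a Krylov solver (e.g. GMRES) targets
```

Once the coupling is written as a system of equations like this, swapping the fixed-point sweep for a Krylov method is a purely algebraic change, which is the flexibility the article exploits.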

Coutinho D.F.,Pontifical Catholic University of Rio Grande do Sul | Fu M.,University of Newcastle | De Souza C.E.,Laboratorio Nacional Of Computacao Cientifica
IEEE Transactions on Automatic Control | Year: 2010

Although there has been a lot of research on analysis and synthesis of quantized feedback control systems, most results are developed for the case of a single quantizer (either measurement quantization or control signal quantization). In this technical note, we investigate the case of feedback control systems subject to both input and output quantization. This is motivated by the fact that it is common in remotely controlled systems that measurement and control signals are shared over a single digital network. More specifically, we consider a single-input single-output linear system with memoryless logarithmic quantizers. We firstly show that the output feedback quadratic stabilization problem in this setting can be addressed with no conservatism by means of a sector bound approach. Secondly, we provide a sufficient condition for quadratic stabilization via the solution of a scaled H∞ control problem. Finally, we analyze a problem of bandwidth allocation in the communication channel for finite-level input and output quantizers. © 2010 IEEE.
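A minimal sketch of a memoryless logarithmic quantizer and the sector bound underlying the analysis above; the density rho = 0.5 is an illustrative choice, and the cell construction follows the standard sector-bound definition rather than any code from the note.

```python
import math

def log_quantize(v, rho=0.5):
    """Memoryless logarithmic quantizer with levels +/- rho**i.

    Cells are chosen so the relative error never exceeds
    delta = (1 - rho) / (1 + rho), the sector bound used in the analysis.
    """
    if v == 0.0:
        return 0.0
    delta = (1.0 - rho) / (1.0 + rho)
    # Index of the quantization level whose cell contains |v|.
    i = math.floor(math.log(abs(v) * (1.0 - delta)) / math.log(rho))
    return math.copysign(rho ** i, v)

# Numerically check the sector bound |q(v) - v| <= delta * |v|.
rho = 0.5
delta = (1 - rho) / (1 + rho)
worst = max(abs(log_quantize(v, rho) - v) / abs(v)
            for v in (x / 100.0 for x in range(1, 1001)))
```

The sector bound is what lets the quantization error be absorbed into a norm-bounded uncertainty, so that quadratic stabilization reduces to a scaled H∞ problem as described above.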

Arruda N.C.B.,Laboratorio Nacional Of Computacao Cientifica | Loula A.F.D.,Laboratorio Nacional Of Computacao Cientifica | Almeida R.C.,Laboratorio Nacional Of Computacao Cientifica
Computer Methods in Applied Mechanics and Engineering | Year: 2013

We propose and analyze a stabilized hybrid finite element method for elliptic problems, consisting of locally discontinuous Galerkin problems in the primal variable coupled to a globally continuous problem in the multiplier. Numerical analysis shows that the proposed formulation preserves the main properties of the associated DG method, such as consistency, stability, boundedness and optimal rates of convergence in the energy norm, and in the L2(Ω) norm for adjoint consistent formulations. Because it uses an element-based data structure, it has essentially the same complexity and computational cost as classical conforming finite element methods. Convergence studies confirm the optimal rates of convergence predicted by the numerical analysis presented here, with accuracy equivalent or even superior to that of the corresponding DG approximations. © 2012.

Higashi S.,Laboratorio Nacional Of Computacao Cientifica
BMC genomics | Year: 2012

An essential step of a metagenomic study is taxonomic classification, that is, the identification of the taxonomic lineage of the organisms in a given sample. The taxonomic classification process involves a series of decisions. Currently, in the context of metagenomics, such decisions are usually based on empirical studies that consider one specific type of classifier. In this study we propose a general framework for analyzing the impact that several decisions can have on the classification problem. Instead of focusing on any specific classifier, we define a generic score function that provides a measure of the difficulty of the classification task. Using this framework, we analyze the impact of the following parameters on the taxonomic classification problem: (i) the length of the n-mers used to encode the metagenomic sequences, (ii) the similarity measure used to compare sequences, and (iii) the type of taxonomic classification, which can be conventional or hierarchical, depending on whether the classification process occurs in a single shot or in several steps according to the taxonomic tree. We defined a score function that measures the degree of separability of the taxonomic classes under a given configuration induced by the parameters above. We conducted an extensive computational experiment and found that reasonable values for the parameters of interest could be (i) intermediate values of n, the length of the n-mers; (ii) any similarity measure, because all of them resulted in similar scores; and (iii) the hierarchical strategy, which performed better in all of the cases. As expected, short n-mers generate lower configuration scores because they give rise to frequency vectors that represent distinct sequences in a similar way. On the other hand, large values of n result in sparse frequency vectors that represent genuinely similar metagenomic fragments differently, also leading to low configuration scores. Regarding the similarity measure, in contrast to our expectations, varying the measure did not change the configuration scores significantly. Finally, the hierarchical strategy was more effective than the conventional strategy, which suggests that, instead of using a single classifier, one should adopt multiple classifiers organized as a hierarchy.
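Parameters (i) and (ii) above can be made concrete with a short sketch: encode a DNA sequence as a frequency vector of overlapping n-mers and compare two profiles with a similarity measure. The function names, the cosine choice and n = 3 are hypothetical illustrations, not taken from the study.

```python
from itertools import product

def nmer_profile(seq, n=3):
    """Frequency vector of overlapping n-mers over the ACGT alphabet."""
    kmers = ["".join(p) for p in product("ACGT", repeat=n)]
    counts = {k: 0 for k in kmers}
    for i in range(len(seq) - n + 1):
        counts[seq[i:i + n]] += 1
    total = max(sum(counts.values()), 1)
    return [counts[k] / total for k in kmers]

def cosine(u, v):
    """One possible similarity measure between two n-mer profiles."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

p = nmer_profile("ACGTACGT")  # shares no 3-mers with the poly-A sequence
q = nmer_profile("AAAAAAAA")
sim = cosine(p, q)
```

The vector length grows as 4**n, which illustrates the sparsity problem the study reports for large n: long n-mers rarely repeat, so similar fragments end up with nearly disjoint profiles.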

Fernandes D.T.,Laboratorio Nacional Of Computacao Cientifica | Loula A.F.D.,Laboratorio Nacional Of Computacao Cientifica
International Journal for Numerical Methods in Engineering | Year: 2010

A quasi optimal finite difference method (QOFD) is proposed for the Helmholtz problem. The stencils' coefficients are obtained numerically by minimizing a least-squares functional of the local truncation error for plane wave solutions in any direction. In one dimension this approach leads to a nodally exact scheme, with no truncation error, for uniform or non-uniform meshes. In two dimensions, when applied to a uniform cartesian grid, a 9-point sixth-order scheme is derived with the same truncation error of the quasi-stabilized finite element method (QSFEM) introduced by Babuška et al. (Comp. Meth. Appl. Mech. Eng. 1995; 128:325-359). Similarly, a 27-point sixth-order stencil is derived in three dimensions. The QOFD formulation, proposed here, is naturally applied on uniform, non-uniform and unstructured meshes in any dimension. Numerical results are presented showing optimal rates of convergence and reduced pollution effects for large values of the wave number. © 2009 John Wiley & Sons, Ltd.
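The one-dimensional nodally exact property claimed above is easy to verify numerically: on a uniform mesh the stencil [1, -2 cos(kh), 1] annihilates plane-wave samples exactly, since exp(ik(x-h)) + exp(ik(x+h)) = 2 cos(kh) exp(ikx). The wave number and mesh size below are arbitrary illustrative values.

```python
import cmath

# Nodally exact 1D Helmholtz stencil on a uniform mesh: [1, -2*cos(k*h), 1].
# Applied to samples of a plane wave, the residual vanishes to machine precision.
k, h = 3.0, 0.1
c = -2.0 * cmath.cos(k * h)
xs = [j * h for j in range(1, 20)]
residuals = [abs(cmath.exp(1j * k * (x - h)) + c * cmath.exp(1j * k * x)
                 + cmath.exp(1j * k * (x + h))) for x in xs]
```

In higher dimensions no stencil is exact for plane waves in all directions simultaneously, which is why the QOFD coefficients are instead fitted by least squares over the directions.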

Portugal R.,Laboratorio Nacional Of Computacao Cientifica | Boettcher S.,Emory University | Falkner S.,Emory University
Physical Review A - Atomic, Molecular, and Optical Physics | Year: 2015

A coinless, discrete-time quantum walk possesses a Hilbert space of smaller dimension than that of the widely studied coined walk: coined walks require the direct product of the site basis with the coin space, whereas coinless walks operate purely in the site basis, which is clearly minimal. These coinless quantum walks have received considerable attention recently because their evolution operators can be obtained by a graphical method based on lattice tessellations and they have been shown to be as efficient as the best known coined walks when used as a quantum search algorithm. We argue that both formulations, in their most general form, are equivalent. In particular, we demonstrate how to transform the one-dimensional version of the coinless quantum walk into an equivalent extended coined version for a specific family of evolution operators. We present some of its basic, asymptotic features for the one-dimensional lattice with some examples of tessellations, and analyze the mixing time and limiting probability distributions on cycles. ©2015 American Physical Society.
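A small sketch of the tessellation construction mentioned above, under assumptions of our own: each tessellation operator is a direct sum of 2x2 reflections 2|u><u| - I over disjoint two-site cells of an even cycle, and one walk step composes the two interleaved tessellations. The one-parameter family of cell states |u> below is a hypothetical illustration, not the specific family studied in the paper.

```python
import numpy as np

def tessellation_op(n, offset, theta):
    """Reflection 2|u><u| - I on each 2-site cell of a tessellation of an
    n-cycle (n even); offset 0/1 selects one of the two interleaved
    tessellations, and theta parametrises |u> = cos(theta)|a> + sin(theta)|b>."""
    U = -np.eye(n, dtype=complex)
    u = np.array([np.cos(theta), np.sin(theta)])
    for c in range(n // 2):
        a, b = (2 * c + offset) % n, (2 * c + 1 + offset) % n
        for i, vi in zip((a, b), u):
            for j, vj in zip((a, b), u):
                U[i, j] += 2 * vi * vj
    return U

# One step of the coinless walk composes the two tessellation reflections.
n = 8
U = tessellation_op(n, 0, np.pi / 3) @ tessellation_op(n, 1, np.pi / 3)
psi = np.zeros(n, dtype=complex)
psi[0] = 1.0                      # walker starts localized at site 0
for _ in range(5):
    psi = U @ psi                 # unitary evolution, norm is conserved
```

Note that the state lives entirely in the n-dimensional site basis, with no coin factor, which is the dimension saving the abstract emphasizes.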

Blanco P.J.,Laboratorio Nacional Of Computacao Cientifica | Feijoo R.A.,Laboratorio Nacional Of Computacao Cientifica
Medical Engineering and Physics | Year: 2013

In the present work a computational model of the entire cardiovascular system is developed using heterogeneous mathematical representations. This model integrates different levels of detail for the blood circulation. The arterial tree is described by a one-dimensional model in order to simulate the wave propagation phenomena that take place in the larger arterial vessels. The inflow and outflow locations of this 1D model are coupled with lumped-parameter descriptions of the remaining part of the circulatory system, closing the loop. The four cardiac valves are represented by a valve model which allows for stenosis and regurgitation phenomena. In addition, full 3D geometrical models of arterial districts are embedded in this closed-loop circuit to model the local blood flow in specific vessels. This kind of detailed closed-loop network for the cardiovascular system allows hemodynamic analyses of patient-specific arterial districts, naturally delivering the appropriate boundary conditions for different cardiovascular scenarios. An example of application involving the effect of aortic insufficiency on the local hemodynamics of a cerebral aneurysm is provided as a motivation to reproduce, through numerical simulation, the hemodynamic environment in patients suffering from infective endocarditis and mycotic aneurysms. The need for incorporating homeostatic control mechanisms is also discussed in view of the large sensitivity observed in the results, noting that this kind of integrative modeling allows such incorporation. © 2012 IPEM.
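The lumped-parameter ('0D') compartments that close the loop around the 1D arterialial tree can be sketched with the simplest such model, a two-element Windkessel; the parameter values, time step and inflow below are illustrative, not taken from the paper.

```python
def windkessel(q_in, R=1.0, C=1.0, dt=1e-3, p0=0.0):
    """Two-element Windkessel: C dP/dt = Q(t) - P/R.

    Forward-Euler sketch of a lumped-parameter ('0D') compartment of the kind
    used to terminate the 1D arterial tree and close the circulatory loop.
    """
    p = p0
    ps = []
    for q in q_in:
        p += dt * (q - p / R) / C   # explicit Euler update of the pressure
        ps.append(p)
    return ps

# A constant inflow drives the pressure toward the steady state P = Q * R.
ps = windkessel([2.0] * 20000)
```

In the closed-loop model, the 1D tree supplies the inflow Q(t) at each outlet and the 0D compartment returns the pressure, which is exactly the boundary-condition exchange the abstract describes.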
