Garcia-Laencina P.J., Centro Universitario de la Defensa de San Javier | Sancho-Gomez J.-L., Technical University of Cartagena | Figueiras-Vidal A.R., Charles III University of Madrid
Expert Systems with Applications | Year: 2013

Datasets with missing values are frequent in real-world classification problems. Imputation of missing values can be regarded as a set of secondary tasks, while classification remains the main purpose of any machine dealing with these datasets. Consequently, Multi-Task Learning (MTL) schemes offer an interesting alternative for solving missing data problems. In this paper, we propose an MTL-based method for training and operating a modified Multi-Layer Perceptron (MLP) architecture that works in incomplete data contexts. The proposed approach achieves a balance between classification and imputation by exploiting the advantages of MTL. Extensive experimental comparisons with well-known imputation algorithms show that this approach provides excellent results: the method is never worse than the traditional algorithms, an important robustness property, and it clearly outperforms them on several problems. © 2012 Elsevier Ltd. All rights reserved.
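As a rough illustration of the multi-task idea (not the authors' actual architecture; all shapes, weights and names below are hypothetical), a shared hidden layer can feed both a classification head and an imputation head, so that a joint loss balances the two tasks:

```python
import numpy as np

# Minimal multi-task sketch: one shared hidden layer, two output heads.
rng = np.random.default_rng(0)

n_features, n_hidden, n_classes = 4, 8, 2
W_shared = rng.normal(size=(n_features, n_hidden))  # shared representation
W_class = rng.normal(size=(n_hidden, n_classes))    # main task: classification
W_impute = rng.normal(size=(n_hidden, n_features))  # secondary task: imputation

def forward(x, missing_mask):
    """x: input vector with missing entries zero-filled; missing_mask: 1 where missing."""
    h = np.tanh((x * (1.0 - missing_mask)) @ W_shared)  # shared hidden layer
    logits = h @ W_class                                # classification output
    x_hat = h @ W_impute                                # estimates of the input features
    return logits, x_hat

x = np.array([0.5, 0.0, -1.2, 0.3])    # second feature missing (zero-filled)
mask = np.array([0.0, 1.0, 0.0, 0.0])
logits, x_hat = forward(x, mask)
# A joint training loss would weight both tasks, e.g. (pseudocode):
# loss = cross_entropy(logits, y) + lam * (((x_hat - x) ** 2) * (1 - mask)).sum()
```

The balance the abstract mentions would then be controlled by the relative weight (`lam` above) between the classification and imputation terms.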


Urda D., University of Malaga | Subirats J.L., University of Malaga | Garcia-Laencina P.J., Centro Universitario de la Defensa de San Javier | Franco L., University of Malaga | And 2 more authors.
Computer Methods and Programs in Biomedicine | Year: 2012

The imputation of unknown or missing data is a crucial task in the analysis of biomedical datasets. In many situations it is necessary to classify or identify instances from incomplete vectors, and the presence of missing values can greatly degrade the performance of the algorithms used for classification/recognition. Learning accurately from incomplete data raises a number of issues, some of which have not been completely solved in machine learning applications, so effective missing value estimation methods are required. Several methods for missing data imputation exist, but selecting the appropriate technique usually involves testing several methods, comparing them and choosing the best one. Furthermore, applying these methods is often not straightforward, as they involve several technical details; in particular, when dealing with microarray datasets, their application requires large computational resources. To our knowledge, no public software application provides the computing capabilities required for this imputation task. This paper presents a new public tool for missing data imputation that is attached to a computer cluster in order to execute computationally demanding tasks. WIMP (Web IMPutation) is a publicly available web site where registered users can create, execute, analyze and store their simulations related to missing data imputation. © 2012 Elsevier Ireland Ltd.
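A toy sketch of why comparing imputation methods matters: even two simple column-wise rules (mean vs. median, hypothetical stand-ins for the methods a tool like WIMP would benchmark) already disagree on the same incomplete data:

```python
import numpy as np

# Small dataset with missing entries encoded as NaN.
X = np.array([[1.0, 2.0],
              [np.nan, 4.0],
              [3.0, np.nan],
              [5.0, 8.0]])

def impute(X, stat):
    """Replace NaNs column-wise with a per-column statistic (e.g. mean or median)."""
    X_out = X.copy()
    for j in range(X.shape[1]):
        col = X[:, j]
        fill = stat(col[~np.isnan(col)])   # statistic of the observed values only
        X_out[np.isnan(col), j] = fill
    return X_out

X_mean = impute(X, np.mean)      # column 1 observed values: 2, 4, 8 -> mean 14/3
X_median = impute(X, np.median)  # same column -> median 4.0: a different estimate
```

The two imputed matrices differ in the second column, which is exactly why the abstract notes that choosing a technique involves running and comparing several candidates.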


Garcia-Cascales J.R., Technical University of Cartagena | Velasco F.J.S., Centro Universitario de la Defensa de San Javier | Oton-Martinez R.A., Technical University of Cartagena | Espin-Tolosa S., Technical University of Cartagena | And 3 more authors.
Fusion Engineering and Design | Year: 2015

DUST is a CFD code developed by the Technical University of Cartagena, Spain, and the French Institute for Radiological Protection and Nuclear Safety (IRSN) with the objective of assessing the dust explosion hazard in the vacuum vessel of ITER. The DUST code permits the analysis of dust spatial distribution, remobilisation and entrainment, explosion, and combustion. Assumptions such as particle incompressibility and a negligible effect of pressure on the solid phase make the model appealing from the mathematical point of view, as the systems of equations that characterise the behaviour of the solid and gaseous phases are decoupled. The objective of this work is to present the model implemented in the code to characterise metal combustion. To evaluate its ability to analyse reactive mixtures of multicomponent gases and multicomponent solids, two combustion problems are studied, namely H2/N2/O2/C and H2/N2/O2/W mixtures. The system of equations considered and the finite volume approach are briefly presented. The closure relationships used are discussed, with special attention paid to the reaction rate correlations in the model. The numerical results are compared with those obtained experimentally at the IRSN/CNRS facility in Orleans; they are discussed and some conclusions are drawn. © 2015 Elsevier B.V.


Monreal-Gonzalez G., Technical University of Cartagena | Oton-Martinez R.A., Technical University of Cartagena | Oton-Martinez R.A., EXPAL Systems S.A. MAXAM Defence | Velasco F.J.S., Centro Universitario de la Defensa de San Javier | And 4 more authors.
Proceedings - 28th International Symposium on Ballistics, BALLISTICS 2014 | Year: 2014

This paper presents a 1D code to assist in the analysis of internal ballistics problems. It is based on a conservative formulation of the model proposed by Gough in the seventies, which consists of seven partial differential equations corresponding to the balances of mass, momentum and energy of each phase, plus a constitutive law for the surface regression length of the solid phase. The authors adopt an Eulerian approach based on a finite volume approximation in which the conserved variables are determined explicitly. A splitting technique is applied, solving the system of equations in several steps: first the convective part of the homogeneous system, then the system of ODEs that includes the source terms. For the convective part, numerical fluxes are evaluated by means of approximate Riemann solvers; the Rusanov scheme and the AUSM family of schemes are extended for use in this context. Source terms are calculated both explicitly and implicitly, making the model quite robust for the type of problems studied so far. The constitutive equations used, namely interfacial drag, interfacial heat transfer and the combustion law, are briefly studied. The solid phase is considered incompressible and the Noble-Abel equation of state is used to characterise the thermodynamic state of the gas phase; these are satisfactory approaches for this type of problem. The robustness of the proposed model for analysing internal ballistics problems is shown by studying some experimental tests.
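The splitting strategy described above can be sketched in miniature (a hypothetical illustration, not the paper's model: scalar linear advection with a relaxation source stands in for the full two-phase system, and all names are made up). The convective sub-step uses a finite volume update with Rusanov fluxes; the source sub-step integrates a stiff relaxation term implicitly:

```python
import numpy as np

def rusanov_flux(uL, uR, f, smax):
    """Rusanov numerical flux: central flux plus dissipation scaled by max wave speed."""
    return 0.5 * (f(uL) + f(uR)) - 0.5 * smax * (uR - uL)

def split_step(u, dx, dt, f, dfdu, tau):
    # --- step 1: convective part of the homogeneous system (explicit FV) ---
    smax = np.max(np.abs(dfdu(u)))                 # maximum wave speed estimate
    up, um = np.roll(u, -1), np.roll(u, 1)         # periodic neighbours
    F_right = rusanov_flux(u, up, f, smax)
    F_left = rusanov_flux(um, u, f, smax)
    u_star = u - dt / dx * (F_right - F_left)
    # --- step 2: source ODE s(u) = -u / tau, integrated implicitly ---
    return u_star / (1.0 + dt / tau)

# Linear advection f(u) = a*u as the simplest concrete example.
a = 1.0
f = lambda u: a * u
dfdu = lambda u: a * np.ones_like(u)

x = np.linspace(0.0, 1.0, 100, endpoint=False)
u = np.exp(-100.0 * (x - 0.5) ** 2)                # smooth initial pulse
u_new = split_step(u, dx=x[1] - x[0], dt=0.005, f=f, dfdu=dfdu, tau=0.1)
```

For linear advection the Rusanov flux reduces to upwinding, so with CFL number a*dt/dx = 0.5 the convective sub-step is a convex combination of neighbours and the update stays bounded regardless of how the source sub-step is chosen.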


Garcia-Cascales J.R., Technical University of Cartagena | Oton-Martinez R.A., Technical University of Cartagena | Vera-Garcia F., Technical University of Cartagena | Amat-Plata S., Technical University of Cartagena | And 4 more authors.
AIP Conference Proceedings | Year: 2012

This work describes the use of splitting methods in the analysis of two-phase mixtures of gas and particles under conditions that make the source terms very stiff; here, these conditions are low pressure and very dense solids. The proposed integration of the source terms helps to tackle this type of problem without difficulty. Interfacial friction and heat transfer are the two closure laws included in the model. The gas phase is treated as a perfect gas and the solid phase is assumed incompressible. Some numerical results complete this work. © 2012 American Institute of Physics.
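A minimal illustration of why stiff source terms motivate the split, implicit treatment (the scalar relaxation below is a hypothetical stand-in for the interfacial friction term, not the paper's model): when the product of the rate constant and the time step is large, explicit Euler blows up while backward Euler stays bounded.

```python
# Stiff model problem du/dt = -k*u with k*dt >> 1.
k, dt, steps = 1000.0, 0.01, 50   # k*dt = 10: well inside the stiff regime

u_exp = 1.0   # explicit (forward) Euler iterate
u_imp = 1.0   # implicit (backward) Euler iterate
for _ in range(steps):
    u_exp = u_exp + dt * (-k * u_exp)   # amplification factor (1 - k*dt) = -9
    u_imp = u_imp / (1.0 + k * dt)      # amplification factor 1/(1 + k*dt) = 1/11

# The explicit iterate grows like 9**n, while the implicit one
# decays monotonically toward the exact solution (zero).
```

This is the behaviour that makes an implicit (or mixed explicit/implicit) source-term step attractive once drag and heat transfer time scales become much shorter than the convective time step.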
