Adorjan A.,ORT Uruguay University
Advances in Intelligent Systems and Computing | Year: 2017
In introductory math courses, especially Calculus 1, dropout and failure rates are generally high, and creating activities that increase retention and motivate students to obtain better final results is a challenge. In order to develop several competencies in our Software Engineering students, Calculus 1 at Universidad ORT Uruguay focuses on synthesis, abstraction, and problem solving (based on the ACM/IEEE Curriculum Guidelines for Undergraduate Degree Programs in Software Engineering). Every semester we reflect on our practice and try to answer the following research question: What kind of activities can we design in Calculus 1 to retain students and obtain better results? This paper explores students’ perspectives on writing an article in IEEE conference format on one of the course topics and presenting the work in class using a poster. Preliminary results show an increase in retention and significant differences in final course results compared with two control groups. Collaborative learning activities using online editing tools encourage students to become self-learners, and our role as teachers has shifted from being the center of the class to that of a moderator, with the students as the principal figures. © Springer International Publishing AG 2017.
Paganini F.,ORT Uruguay University |
Tang A.,Cornell University |
Ferragut A.,ORT Uruguay University |
Andrew L.L.H.,Swinburne University of Technology
IEEE Transactions on Automatic Control | Year: 2012
Rate allocation among a fixed set of end-to-end connections in the Internet is carried out by congestion control, which has a well-established model: it optimizes a concave network utility, a particular case of which is the alpha-fair bandwidth allocation. This paper studies the slower dynamics of the connections themselves, which arrive randomly in the network and are served at the allocated rate. It has been shown that under the condition that the mean offered load at each link is less than its capacity, the resulting queueing system is stochastically stable, for the case of exponentially distributed file sizes. The conjecture that the result holds for general file-size distributions has remained open, and is very relevant since heavy-tailed distributions are often the best models of Internet file sizes. In this paper, building on existing fluid models of the system, we use a partial differential equation to characterize the dynamics. The equation keeps track of residual file size and is therefore suitable for general file-size distributions. For alpha-fair bandwidth allocation, with any positive alpha parameter, a Lyapunov function is constructed with negative drift when the offered load is less than capacity. With this tool we answer the conjecture affirmatively in the fluid sense: we prove asymptotic convergence to zero of the fluid model for general file-size distributions of finite mean, and finite-time convergence for those with a finite moment of order p > 1. In the stochastic sense, we build on recent work that relates fluid and stochastic stability subject to a certain light-tailed restriction. We further provide the supplementary fluid stability argument to establish the conjecture for this class, which includes phase-type distributions. Results are supplemented by illustrative network simulations at the packet level. © 2011 IEEE.
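For context, the alpha-fair utility family referred to in the abstract is the standard one from this literature (the notation here is the conventional choice, assumed rather than taken from the paper):

```latex
U_\alpha(x) =
\begin{cases}
  \dfrac{x^{1-\alpha}}{1-\alpha}, & \alpha > 0,\ \alpha \neq 1, \\[4pt]
  \log x, & \alpha = 1,
\end{cases}
```

where $x$ is the rate allocated to a connection; $\alpha \to 1$ recovers proportional fairness and $\alpha \to \infty$ approaches max-min fairness.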
Betancor L.,ORT Uruguay University |
Johnson G.R.,Air Force Research Lab |
Luckarift H.R.,Air Force Research Lab |
Luckarift H.R.,Universal Technology Corporation
ChemCatChem | Year: 2013
Typically, the use of heterogeneous enzyme catalysis is aimed at sustainability, reusability, or enhanced functionality of the biocatalyst and is achieved by immobilizing enzymes onto a support matrix or at a defined interface. Controlled enzyme immobilization is particularly important in bioelectrocatalysis because the catalyst must be effectively connected to a transducer to exploit its activity. This Review discusses what must be addressed for coupling biocatalysts to an electrode and the toolbox of methods that are available for achieving this outcome. As an illustration, we focus on the immobilization and stabilization of laccases at electronic interfaces. Historically, laccases have been used for the decolorization of dyes and for the synthesis of bio-organic compounds; however, more recently, they have been applied to the fields of sensing and energy harvesting [1-3]. There is an ever-increasing focus on the development of new energy technologies, in which laccases find application (e.g., as cathodic catalysts in enzymatic fuel cells). Herein, we discuss the heterogeneous laccase biocatalysts that have been reported over the past 10-15 years and discuss why laccases continue to be biotechnologically relevant enzymes. Various methods for the immobilization of laccases are described, including the use of nanoscale supports and a range of encapsulation and cross-linking chemistries. We consider the application of immobilized laccases to the food industry, in the synthesis of pharmaceuticals, and in environmental applications, specifically in cases in which stabilization through heterogenization of the enzyme is critical to the application. We also include a consideration of electrochemical biosensors and the specific incorporation of laccases on the surfaces of transducers. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Calegari D.,University of the Republic of Uruguay |
Szasz N.,ORT Uruguay University
Electronic Notes in Theoretical Computer Science | Year: 2013
Within the Model-Driven Engineering paradigm, software development is based on the definition of models providing different views of the system to be constructed, together with model transformations supporting a (semi)automatic development process. The verification of models and model transformations is crucial in order to improve the quality and reliability of the products developed using this paradigm. In this context, the verification of a model transformation has three main components: the transformation itself, the properties of interest addressed, and the verification techniques used to establish the properties. In this paper we present an exhaustive review of the literature on the verification of model transformations, analyzing these three components. We also take a problem-based approach, exemplifying those aspects of interest that could be verified on a model transformation and showing how this can be done. Finally, we conclude that an integrated environment is needed for addressing the heterogeneous verification of model transformations. © 2013 Elsevier B.V.
Garbarino-Alberti H.,ORT Uruguay University
International Journal of Human Capital and Information Technology Professionals | Year: 2013
Information Technology (IT) plays an important role in organizations, particularly in small and medium-sized enterprises (SMEs). These firms have a simple structure with less specialized tasks and tight human, financial, and material resources, so it is particularly important to apply an IT governance (ITG) framework appropriate to such enterprises. This paper shows the results of applying an ITG framework designed for SMEs in a case study focused on IT Human Resources (IT HR), along with the lessons learned. Conclusions highlight the importance of the quality of IT HR, together with the key role played by related enterprise policies. Copyright © 2013, IGI Global.
Lopez-Vazquez C.,ORT Uruguay University
International Journal of Remote Sensing | Year: 2016
The choice of the best interpolation algorithm for data gathered at a finite number of locations has been a persistently relevant topic. Typical papers take a single data set, a single set of data points, and a handful of algorithms. The process considers a subset I of the data points as known, builds the interpolant with each algorithm, applies it to the points of another subset C, and evaluates the MAE (mean absolute error), the RMSE (root mean square error), or some other metric over those points. The lower these statistics, the better the algorithm, so a deterministic ranking between methods (without a confidence level) can be derived from them. Ties between methods are usually not considered. In this article a complete protocol is proposed in order to build, with modest additional effort, a ranking with a confidence level. To illustrate this point, the results of two tests are shown. In the first one, a simple Monte Carlo experiment was devised using irregularly distributed points taken from a reference DEM (digital elevation model) in raster format. Different metrics led to different rankings, suggesting that the choice of the metric defining the ‘best interpolation algorithm’ involves a trade-off. The second experiment used mean daily radiation data from an international interpolation comparison exercise and RMSE as the metric of success. Only five simple interpolation methods were employed. The ranking using this protocol correctly anticipated the first and second places, which were afterwards confirmed using independent control data. © 2016 Informa UK Limited, trading as Taylor & Francis Group.
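The split-evaluate-rank loop described above can be sketched as a small Monte Carlo experiment. This is a minimal illustration of the general idea, not the paper's protocol: the 1-D data set, the two toy interpolators, and all parameter choices here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D data set standing in for the paper's scattered DEM points.
x = np.sort(rng.uniform(0.0, 10.0, 200))
y = np.sin(x) + 0.05 * rng.normal(size=x.size)

def nearest(xi, yi, xq):
    """Nearest-neighbour interpolation."""
    return yi[np.abs(xi[None, :] - xq[:, None]).argmin(axis=1)]

def linear(xi, yi, xq):
    """Piecewise-linear interpolation (xi must be sorted)."""
    return np.interp(xq, xi, yi)

methods = {"nearest": nearest, "linear": linear}

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Monte Carlo protocol: repeatedly split the data into known points I and
# control points C, rank the methods by RMSE over C each time, and tally
# how often each method wins -- yielding a ranking with a confidence level.
trials, wins = 200, {m: 0 for m in methods}
for _ in range(trials):
    perm = rng.permutation(x.size)
    I, C = np.sort(perm[:50]), perm[50:]
    scores = {m: rmse(f(x[I], y[I], x[C]), y[C]) for m, f in methods.items()}
    wins[min(scores, key=scores.get)] += 1

confidence = {m: wins[m] / trials for m in methods}
```

Swapping RMSE for MAE in the inner loop is enough to check whether the ranking is metric-dependent, which is exactly the trade-off the first experiment illustrates.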
Ferragut A.,ORT Uruguay University |
Paganini F.,ORT Uruguay University
IEEE/ACM Transactions on Networking | Year: 2014
This paper studies network resource allocation between users that manage multiple connections, possibly through different routes, where each connection is subject to congestion control. We formulate a user-centric Network Utility Maximization problem that takes into account the aggregate rate a user obtains from all connections, and we propose decentralized means to achieve this fairness objective. In a first proposal, cooperative users control their number of active connections based on congestion prices from the transport layer to emulate suitable primal-dual dynamics in the aggregate rate; we show this control achieves asymptotic convergence to the optimal user-centric allocation. For the case of noncooperative users, we show that network stability and user-centric fairness can be enforced by a utility-based admission control implemented at the network edge. We also study stability and fairness issues when routing of incoming connections is enabled at the edge router. We obtain in this case a characterization of the stability region of loads that can be served with routing alone and a generalization of our admission control policy to ensure user-centric fairness when the stability condition is not met. The proposed algorithms are implemented at the packet level in ns2 and demonstrated through simulation. © 2013 IEEE.
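A user-centric Network Utility Maximization problem of the kind described can be written generically as follows (the notation here is a conventional choice, assumed rather than copied from the paper):

```latex
\max_{x \ge 0} \;\; \sum_{s} U_s\!\Big( \sum_{j \in J(s)} x_j \Big)
\quad \text{subject to} \quad
\sum_{j \,:\, l \in r(j)} x_j \le c_l \quad \forall l,
```

where $J(s)$ is the set of connections of user $s$, $x_j$ is the rate of connection $j$, $r(j)$ is its route, and $c_l$ is the capacity of link $l$. The key user-centric feature is that the utility $U_s$ is applied to the user's aggregate rate across connections, not to each connection separately.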
Solari M.,ORT Uruguay University
International Symposium on Empirical Software Engineering and Measurement | Year: 2013
The Empirical Software Engineering community is taking a growing interest in replicating experiments, but replications still pose a challenge for researchers. To be able to do better replications, we require detailed knowledge of what happens when a replication is performed. This article introduces a procedure for empirically evaluating replications designed to identify experimental incidents. The evaluation employs a qualitative method to analyze interviews with the principal investigator and other sources of information. We evaluated five replications of the same experiment run by different experimenters at different sites. We identified 49 incident types that occurred between 1 and 4 times across the evaluated replications. Although the replications were conducted within an experimentation network which was set up with collaborative instruments, we identified incidents in all the experimental activities. If experimenters know which incidents are likely to occur in replications, they will be able to focus on the identified problems and improve the replications that they perform. This report is a first step to create and improve experiment communication instruments such as laboratory packages. © 2013 IEEE.
Lopez-Vazquez C.,ORT Uruguay University
Proceedings - 2015 International Workshop on Data Mining with Industrial Applications, DMIA 2015: Part of the ETyC 2015 | Year: 2015
The main body of the literature states that Artificial Neural Networks (ANNs) must be regarded as a "black box" without further interpretation, due to the inherent difficulty of analyzing the weights and bias terms. Some authors claim that an ANN trained as a regression device tends to organize itself by specializing some neurons to learn the main relationships embedded in the training set, while other neurons are more concerned with the noise. We suggest here a rule to identify the "noise-related" neurons in multilayer perceptron ANNs, and we assume that those neurons are activated only when some unusual values (or combinations of values) are present. We consider those events as candidates to hold an outlier. The use of the ANN as an outlier detector does not require further training and can be easily applied. © 2015 IEEE.
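The abstract does not spell out the rule itself, so the following is only a sketch of the general idea under stated assumptions: an already-trained single-hidden-layer perceptron (the weights here are illustrative stand-ins, not from the paper), and a frequency-based criterion in which a neuron counts as "noise-related" if it saturates on only a tiny fraction of the training set.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a trained MLP hidden layer: h = tanh(X @ W1 + b1).
n_samples, n_in, n_hidden = 500, 4, 8
X = rng.normal(size=(n_samples, n_in))
X[0] += 12.0                        # plant one gross outlier in row 0
W1 = 0.2 * rng.normal(size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
H = np.tanh(X @ W1 + b1)            # hidden-layer activations

# Assumed rule: a neuron is "noise-related" if it saturates
# (|activation| > 0.9) on less than 2% of the training samples.
strong = np.abs(H) > 0.9
activation_rate = strong.mean(axis=0)
noise_neurons = np.where(activation_rate < 0.02)[0]

# Samples that saturate any noise-related neuron are flagged as
# candidate outliers -- no retraining of the network is required.
candidates = np.where(strong[:, noise_neurons].any(axis=1))[0]
```

The 0.9 saturation threshold and the 2% rarity cutoff are arbitrary illustrative parameters; any concrete application would need to calibrate both against the training set at hand.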
Feijer D.,ORT Uruguay University |
Paganini F.,ORT Uruguay University
Automatica | Year: 2010
This paper considers dynamic laws that seek a saddle point of a function of two vector variables, by moving each in the direction of the corresponding partial gradient. This method has old roots in the classical work of Arrow, Hurwicz and Uzawa on convex optimization, and has seen renewed interest with its recent application to resource allocation in communication networks. This paper brings other tools to bear on this problem, in particular Krasovskii's method for finding Lyapunov functions, and recently obtained extensions of the LaSalle invariance principle for hybrid systems. These methods are used to obtain stability proofs of these primal-dual laws in different scenarios, and applications to cross-layer network optimization are exhibited. © 2010 Elsevier Ltd. All rights reserved.
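The saddle-point dynamics in question take the classical gradient form (standard notation, assumed here): for a function $f(x, y)$ convex in $x$ and concave in $y$,

```latex
\dot{x} = -\nabla_x f(x, y), \qquad \dot{y} = +\nabla_y f(x, y),
```

whose equilibria are precisely the saddle points of $f$. In the network-utility interpretation, $x$ collects the primal rates and $y$ the dual congestion prices of the associated Lagrangian.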