Jabangwe R.,Dundalk Institute of Technology | Jabangwe R.,Blekinge Institute of Technology | Smite D.,Blekinge Institute of Technology | Hessbo E.,HCL Technologies
Information and Software Technology | Year: 2016

Context: Offshore outsourcing collaborations can result in distributed development, which has been linked to quality-related concerns. However, few studies focus on the implications of distributed development for quality, and those that do report inconsistent findings using different proxies for quality. Thus, there is a need for more studies, as well as for identifying useful proxies for particular distributed contexts. The empirical study presented here was performed in a context involving offshore outsourcing vendors in a multisite distributed development setting. Objective: The aim of the study is to investigate how quality changes during evolution in a distributed development environment that undergoes organizational changes in terms of the number of companies involved. Method: A case study approach is followed. Only post-release defects are used as a proxy for external quality, because pre-release defect data, such as defects reported during integration, proved unreliable. Focus group meetings were also held with practitioners. Results: The results suggest that practices grouped into product, people, and process categories can help ensure post-release quality. However, post-release defects are insufficient for showing a conclusive impact of the development setting on quality, because the development teams worked independently as isolated distributed teams; integration defects would better reflect the impact of the development setting on quality. Conclusions: The mitigation practices identified can be useful to practitioners planning to engage in similar globally distributed development projects. Finally, it is important to consider the arrangement of distributed teams in global projects, and to use the context to identify appropriate quality proxies in order to draw correct conclusions about the implications of that context. This would help provide practitioners with well-founded findings about the impact of globally distributed development settings on quality. © 2015 Elsevier B.V. All rights reserved.

Bajpai N.,HCL Technologies | Dang S.,Jaypee Institute of Information Technology | Sharma S.K.,Jaypee Institute of Information Technology
International Journal of Pharmaceutical and Clinical Research | Year: 2015

With the advent of modifications to Indian regulations for clinical trials, there is a need to standardize the individual clinical data management steps. This will not only save cost by mitigating the risk of data corruption but will also help the Indian regulations evolve towards globalization. This report shares the authors' view, based on experience gained by implementing clinical trial data management procedures for vaccine trials in an Indian biopharmaceutical company. © 2015, International Journal of Pharmaceutical and Clinical Research.

Dash S.,North Orissa University | Dash A.,HCL Technologies
2014 14th International Conference on Hybrid Intelligent Systems, HIS 2014 | Year: 2014

Research on feature selection techniques for identifying informative genes in high-dimensional microarray datasets has received considerable attention. Numerous researchers have proposed optimized solutions, applying various computational tools to reduce noise and redundancy in the data and to enhance the accuracy and generalization of the classification model. High-dimensional microarray gene expression datasets limit the generalization and effectiveness of many feature selection techniques. A robust feature selection technique needs to be designed that can remove irrelevant data, increase learning accuracy, and improve the comprehensibility of the experimental results. In this work, a novel correlation-based feature selection algorithm using symmetrical uncertainty and a multilayer perceptron is proposed. This method can identify the relevance of features to the class, as well as their redundancy with respect to all other relevant features of the dataset. It also evaluates the worth of a set of attributes by measuring its symmetrical uncertainty with respect to another set of attributes. The effectiveness of the method is validated against various correlation-based feature selection techniques on multi-category high-dimensional microarray datasets. © 2014 IEEE.
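Symmetrical uncertainty, the correlation measure this abstract builds on, is a normalized form of mutual information: SU(X, Y) = 2·I(X; Y) / (H(X) + H(Y)), ranging from 0 (independent) to 1 (perfectly correlated). The sketch below is an illustration of the measure itself for discrete features, not a reproduction of the paper's algorithm; the example feature and class vectors are hypothetical.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a discrete value sequence."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def symmetrical_uncertainty(x, y):
    """SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), in [0, 1]."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))  # joint entropy H(X, Y)
    mi = hx + hy - hxy              # mutual information I(X; Y)
    denom = hx + hy
    return 2 * mi / denom if denom else 0.0

# A feature identical to the class labels scores 1.0;
# a feature independent of the class scores 0.0.
cls = [0, 0, 1, 1]
print(symmetrical_uncertainty([0, 0, 1, 1], cls))  # 1.0
print(symmetrical_uncertainty([0, 1, 0, 1], cls))  # 0.0
```

A correlation-based feature selection scheme of the kind described would rank features by SU against the class and discard features whose SU with an already-selected feature exceeds their SU with the class.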

Choudhary P.K.,HCL Technologies | Routray S.,Institute of Management Technology
International Journal of Web Based Communities | Year: 2016

The features of popular social networking sites (SNSs) have been evolving rapidly. This raises interesting questions that warrant investigation: does user perception of SNSs change as their features evolve? Which features of SNSs really matter to users? Are features of SNSs important reasons for using these sites? To investigate these questions, the authors conducted an analytical study of SNS usage in India. The study followed a three-stage qualitative process: a detailed literature review, expert opinion elicitation, and in-depth focus group discussions with a select panel of 12 SNS users. The study corroborates the existing literature on the SNS usage context in terms of usage purpose, usage impact of user profile and age group, and user presence on different SNS platforms, and also presents new findings on user perceptions of feature evolution. Copyright © 2016 Inderscience Enterprises Ltd.

Kalpana M.,Sethu Institute of Technology | Dhanalakshmi R.,HCL Technologies | Parthiban P.,National Institute of Technology Tiruchirappalli
Scientific World Journal | Year: 2014

This research work proposes a mathematical model for the lifetime of wireless sensor networks (WSNs). It also proposes an energy-efficient routing algorithm for WSNs, the hierarchical energy tree based routing algorithm (HETRA), based on a hierarchical energy tree constructed from the energy available at each node. Energy efficiency is further improved by reducing packet drops using an exponential congestion control algorithm (TCP/EXP). The algorithms are evaluated in WSNs interconnected to a fixed network with seven distribution patterns, simulated in ns-2, and compared with existing algorithms on parameters such as number of data packets, throughput, network lifetime, and the data packets average network lifetime product. Evaluation and simulation results show that the combination of HETRA and TCP/EXP maximizes network lifetime in all the patterns. The lifetime of the network with the HETRA algorithm increased to approximately 3.2 times that of the network implemented with AODV. © 2014 M. Kalpana et al.
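The abstract does not spell out how the hierarchical energy tree is built, so the following is only a generic sketch of energy-aware tree construction under an assumed rule (each node joins the tree through the reachable neighbor with the highest residual energy, so high-energy nodes become interior routing nodes); it is not the published HETRA algorithm, and the topology and energy values are hypothetical.

```python
def build_energy_tree(neighbors, energy, sink):
    """Build a routing tree rooted at the sink.

    neighbors: node -> set of adjacent nodes (the radio connectivity graph)
    energy:    node -> residual energy (assumed known at construction time)
    Returns a child -> parent map; forwarding a packet toward its parent
    eventually delivers it to the sink.
    """
    parent = {sink: None}
    frontier = [sink]
    while frontier:
        # Expand the frontier node with the highest residual energy first,
        # so energy-rich nodes attract children and carry relay traffic.
        frontier.sort(key=lambda n: energy[n], reverse=True)
        node = frontier.pop(0)
        for nb in neighbors[node]:
            if nb not in parent:
                parent[nb] = node
                frontier.append(nb)
    return parent

# Hypothetical 4-node topology: node 3 can reach the sink (node 0)
# via node 1 (low energy) or node 2 (high energy).
graph = {0: {1, 2}, 1: {0, 3}, 2: {0, 3}, 3: {1, 2}}
energy = {0: 5.0, 1: 1.0, 2: 4.0, 3: 2.0}
tree = build_energy_tree(graph, energy, sink=0)
print(tree)  # node 3 is attached under node 2, the higher-energy relay
```

Recomputing the tree periodically as residual energies drain is one plausible way such a scheme would spread relay load and extend network lifetime, which is the effect the paper reports.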
