Cambridge Institute of Technology

Ranchi, India

Yang X.-S.,Middlesex University | Deb S.,Cambridge Institute of Technology | Fong S.,University of Macau
Journal of Multiple-Valued Logic and Soft Computing | Year: 2014

The efficiency of any metaheuristic algorithm largely depends on how it balances local intensive exploitation against global diverse exploration. Studies show that the bat algorithm can provide a good balance between these two key components with superior efficiency. In this paper, we first review some commonly used metaheuristic algorithms and then compare the performance of the bat algorithm with the so-called intermittent search strategy. From simulations, we found that the bat algorithm outperforms the optimal intermittent search strategy. We also analyse the comparison results and their implications for higher-dimensional optimization problems. In addition, we apply the bat algorithm to business optimization and engineering design problems. ©2014 Old City Publishing, Inc.
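The bat algorithm's core loop is compact enough to sketch. The minimal Python version below minimizes a sphere function; the objective, bounds and parameter values are illustrative assumptions, not the paper's experimental setup:

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def bat_algorithm(obj, dim=2, n_bats=15, iters=300, fmin=0.0, fmax=2.0, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    best = min(pos, key=obj)[:]
    loudness, pulse_rate = 0.8, 0.5
    for _ in range(iters):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * rng.random()  # frequency tuning
            for d in range(dim):
                vel[i][d] += (pos[i][d] - best[d]) * freq
                pos[i][d] += vel[i][d]
            if rng.random() > pulse_rate:               # local walk near the best
                pos[i] = [b + 0.01 * rng.gauss(0, 1) for b in best]
            if obj(pos[i]) < obj(best) and rng.random() < loudness:
                best = pos[i][:]                        # greedy acceptance
    return best, obj(best)

best, val = bat_algorithm(sphere)
```

The frequency-driven velocity update pulls bats toward the current best (exploration), while the small Gaussian walk around the best provides local exploitation; the balance between these two moves is exactly the trade-off the paper analyses.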

Fong S.,University of Macau | Deb S.,Cambridge Institute of Technology | Yang X.-S.,Middlesex University | Li J.,University of Macau
IT Professional | Year: 2014

The purpose of classification in medical informatics is to predict the presence or absence of a particular disease as well as disease types from historical data. Medical data often contain irrelevant features and noise, and an appropriate subset of the significant features can improve classification accuracy. Therefore, researchers apply feature selection to identify and remove irrelevant and redundant features. The authors propose a versatile feature selection approach called Swarm Search Feature Selection (SS-FS), based on stochastic swarm intelligence. It is designed to overcome NP-hard combinatorial search problems such as the selection of an optimal feature subset from an extremely large array of features - which is not uncommon in biomedical data. SS-FS is demonstrated to be a feasible computing tool in achieving high accuracy in classification via testing with two empirical biomedical datasets. This article is part of a special issue on life sciences computing. © 2014 IEEE.
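The swarm-search idea can be illustrated as a toy wrapper search over binary feature masks. The fitness function below is a stand-in (the real SS-FS scores masks by classifier accuracy on biomedical data), and the search scheme is a simplified sketch, not the authors' exact algorithm:

```python
import random

INFORMATIVE = {0, 1, 2}

def fitness(mask):
    """Stand-in fitness: reward informative features, penalise noise ones."""
    hits = sum(1 for j, on in enumerate(mask) if on and j in INFORMATIVE)
    noise = sum(1 for j, on in enumerate(mask) if on and j not in INFORMATIVE)
    return hits - 0.5 * noise

def swarm_search_fs(fitness, n_features, n_agents=8, iters=50, seed=7):
    rng = random.Random(seed)
    agents = [[rng.random() < 0.5 for _ in range(n_features)]
              for _ in range(n_agents)]
    best = max(agents, key=fitness)[:]
    for _ in range(iters):
        for i, mask in enumerate(agents):
            cand = best[:]                  # explore around the global best
            flip = rng.randrange(n_features)
            cand[flip] = not cand[flip]     # flip one feature in or out
            if fitness(cand) > fitness(mask):
                agents[i] = cand
                if fitness(cand) > fitness(best):
                    best = cand[:]
    return best

selected = swarm_search_fs(fitness, n_features=10)
```

Because the agents probe single-bit flips of the best mask, the swarm performs a stochastic search over the 2^n subset space without enumerating it, which is the point of using swarm intelligence for this NP-hard selection problem.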

Arulprakasajothi M.,JNTUA | Elangovan K.,Cambridge Institute of Technology | HemaChandra Reddy K.,JNTUA | Suresh S.,National Institute of Technology Tiruchirappalli
Materials Today: Proceedings | Year: 2015

This paper presents a heat transfer study of TiO2/water nanofluids at different concentrations. Nanofluids have emerged as an exciting new class of nanotechnology-based heat transfer fluids and have grown enormously in the past few years. Nanofluids improve the performance of heat-exchange devices relative to conventional working fluids. In the present work, TiO2/water nanofluids with volume concentrations of 0.1%, 0.25%, 0.5% and 0.75% were prepared using the two-step method for the heat transfer study. Stability, thermal conductivity and viscosity measurements were conducted using zeta potential, a KD2 Pro and a Brookfield viscometer, respectively. Finally, an experiment was conducted with TiO2/water nanofluids as the working fluid in a tube heat exchanger to study heat transfer performance. The experimental results show that the Nusselt number increases with particle volume fraction, and a Nusselt number enhancement of 13.2% over the base fluid was observed for a volume concentration of 0.75%. From the experimental observations, the enhancement in Nusselt number is larger than the enhancement in friction factor. © 2015 Elsevier Ltd.
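For context, the reported 13.2% Nusselt-number enhancement can be set against a standard single-phase baseline. The sketch below uses the classical Dittus-Boelter correlation at an assumed operating point; the Reynolds and Prandtl numbers are illustrative, not the paper's measured values:

```python
def dittus_boelter(re, pr):
    """Classical single-phase Nusselt correlation for turbulent tube flow."""
    return 0.023 * re ** 0.8 * pr ** 0.4

# assumed operating point for water (illustrative, not the paper's data)
re, pr = 10_000.0, 7.0
nu_base = dittus_boelter(re, pr)
nu_nano = nu_base * 1.132   # the reported 13.2% enhancement at 0.75 vol%
```

Since Nu = hD/k, a 13.2% rise in Nusselt number at fixed tube diameter and fluid conductivity translates directly into a proportional rise in the convective heat transfer coefficient.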

Rashid E.,Cambridge Institute of Technology
International Journal of Services, Technology and Management | Year: 2015

To make case-based reasoning (CBR) effective and efficient, I have introduced some new features, i.e., renovation of the knowledgebase (KBS) and reduction of the maintenance cost by removing ambiguous cases from the KBS. Renovation of the knowledgebase is the process of removing duplicate records stored in the knowledgebase as well as adding new problems along with new solutions. This paper explores this improvement of case-based reasoning and its application to software fault prediction. The system predicts the error level with respect to LOC and development time, both dependent variables that affect the quality level. At the outset, it deals with the possibility that lines of code and development time from any language may be compared and used as a uniform metric. Five different similarity measures have been used to find the best method for increasing accuracy. The system is able to retrieve information from the existing knowledgebase by using an information retrieval (IR) technique. The experimental results reveal that the CBR method with these similarity measures is a viable technique for fault prediction, with practical advantages. In order to obtain the results, I have used an indigenous tool. Copyright © 2015 Inderscience Enterprises Ltd.
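The retrieval step of such a CBR system can be sketched as nearest-case lookup under interchangeable similarity measures. The case base, the (LOC, development time) features and the two measures below are illustrative; the paper compares five measures:

```python
import math

# toy case base: (LOC, development time in days) -> observed error level
CASES = {(1200, 30): "low", (3000, 60): "medium", (5400, 90): "high"}

def euclidean(a, b):
    return math.dist(a, b)

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def retrieve(query, distance):
    """Return the error level of the nearest stored case."""
    nearest = min(CASES, key=lambda case: distance(query, case))
    return CASES[nearest]

prediction = retrieve((1100, 28), euclidean)
```

Swapping the `distance` argument is all that is needed to compare similarity measures, which mirrors how the accuracy of different measures can be evaluated against the same knowledgebase.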

Akhtar M.A.K.,Cambridge Institute of Technology | Sahoo G.,Birla Institute of Technology
Procedia Computer Science | Year: 2015

A MANET is a cooperative network in which every node is responsible for routing and forwarding and, as a result, consumes more battery power and bandwidth. To save its own battery power and bandwidth, noncooperation is a genuine temptation for a node. Cooperation can be enhanced by reducing resource consumption, involving a limited number of nodes in routing activities rather than all of them. Several works in the literature propose selecting nodes to define a backbone, but they rest on impractical assumptions that are not feasible for MANETs. In this paper we present the Backbone Group (BG) model, which involves a minimum number of nodes, called a BG, in routing activities instead of all nodes. A BG is a minimal set of nodes that efficiently connects the network. We divide a MANET into single-hop neighborhoods called locality groups (LGs). In an LG we have a cluster head (CH), a set of regular nodes (RNs) and one or more border nodes (BNs). The CHs are responsible for the creation and management of LGs and BGs. A CH uses a BG for a threshold time and then switches to another BG, so that all nodes take part in network participation over time. The proposed model shows its effectiveness in reducing routing overhead by a ratio of (n² : n²/k), where k is the number of LGs. © 2015 The Authors.
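Selecting a small set of nodes that covers every single-hop neighborhood is essentially a dominating-set problem. The greedy sketch below illustrates CH/backbone selection on a toy topology; it is an illustration of the idea, not the authors' exact BG construction:

```python
def greedy_backbone(adj):
    """Pick cluster heads greedily until every node is in some head's
    single-hop neighborhood (a dominating-set heuristic)."""
    uncovered = set(adj)
    backbone = []
    while uncovered:
        head = max(adj, key=lambda n: len(({n} | adj[n]) & uncovered))
        backbone.append(head)
        uncovered -= {head} | adj[head]
    return backbone

# toy 6-node MANET topology: node -> single-hop neighbors
adj = {
    0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1},
    3: {1, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
heads = greedy_backbone(adj)
```

Here only 2 of the 6 nodes carry routing duties, which is the source of the overhead reduction: routing traffic scales with the backbone size rather than with the whole network.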

Yang X.-S.,Middlesex University | Deb S.,Cambridge Institute of Technology
Neural Computing and Applications | Year: 2014

Cuckoo search (CS) is a relatively new algorithm, developed by Yang and Deb in 2009, which has been found to be efficient in solving global optimization problems. In this paper, we review the fundamental ideas of cuckoo search and the latest developments, as well as its applications. We analyze the algorithm to gain insight into its search mechanisms and find out why it is efficient. We also discuss the essence of such algorithms and their link to self-organizing systems, and finally we propose some important topics for further research. © 2013 Springer-Verlag London.
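The essentials of CS — Lévy-flight moves around the best nest plus abandonment of a fraction `pa` of the worst nests — fit in a short sketch. The step scale, bounds and sphere objective below are illustrative assumptions:

```python
import math, random

def levy_step(rng, beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = rng.gauss(0, sigma), rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(obj, dim=2, n_nests=15, iters=400, pa=0.25, seed=3):
    rng = random.Random(seed)
    nests = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=obj)[:]
    for _ in range(iters):
        for i in range(n_nests):
            # Levy flight around the current best, compared to a random nest
            cand = [x + 0.5 * levy_step(rng) * (x - b)
                    for x, b in zip(nests[i], best)]
            j = rng.randrange(n_nests)
            if obj(cand) < obj(nests[j]):
                nests[j] = cand
        nests.sort(key=obj)                  # abandon the worst fraction pa
        for i in range(int((1 - pa) * n_nests), n_nests):
            nests[i] = [rng.uniform(-5, 5) for _ in range(dim)]
        best = min(nests + [best], key=obj)[:]
    return best, obj(best)

best, val = cuckoo_search(lambda x: sum(t * t for t in x))
```

The heavy-tailed Lévy steps occasionally make long jumps (global exploration), while most steps stay short (local refinement); the abandonment step re-seeds diversity, which is one reason CS needs so few tuning parameters.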

Akhtar M.A.K.,Cambridge Institute of Technology | Sahoo G.,Birla Institute of Technology
Smart Innovation, Systems and Technologies | Year: 2016

Energy and bandwidth are scarce resources in a wireless network. To prolong its own life, a node drops other nodes' packets to save these resources; this scarcity is the major cause of selfish misbehavior or noncooperation. To enforce node cooperation, this paper presents a reduction in resource consumption using compressive sensing. Our model compresses sparse neighborhood data such as routing-table updates and other advertisements. We divide a MANET into neighborhoods called neighborhood groups (NGs). Sparse data are compressed by the neighborhood nodes and then forwarded to the leader node. The leader node joins all the neighborhood data to reconstruct the original data and then broadcasts it in its neighborhood. This reduces resource consumption because the major computations are performed at the leader end, which saves the battery power of the neighborhood nodes. Compressing sparse data before transmission reduces the amount of data transmitted in the network, which lowers total energy consumption and prolongs the life of the network. It also protects against several attacks, because individual nodes do not accept advertisements and updates directly but instead use information processed by the leader node. © Springer India 2016.
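The compress-then-reconstruct idea can be illustrated in the simplest case: a routing-table update with a single changed entry (a 1-sparse vector) recovered from fewer random measurements than entries. This is a one-step matching-pursuit sketch, not the paper's scheme:

```python
import random

def measure(x, phi):
    """Compressed measurements y = phi @ x."""
    return [sum(p * xi for p, xi in zip(row, x)) for row in phi]

def recover_one_sparse(y, phi):
    """Recover a 1-sparse signal: one matching-pursuit step picks the
    measurement-matrix column best correlated with y."""
    n = len(phi[0])
    cols = [[row[j] for row in phi] for j in range(n)]
    def score(c):
        return sum(ci * yi for ci, yi in zip(c, y)) ** 2 / sum(ci * ci for ci in c)
    j = max(range(n), key=lambda k: score(cols[k]))
    coef = (sum(ci * yi for ci, yi in zip(cols[j], y))
            / sum(ci * ci for ci in cols[j]))
    x_hat = [0.0] * n
    x_hat[j] = coef
    return x_hat

rng = random.Random(0)
n, m = 12, 6                 # 12 entries compressed into 6 measurements
phi = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(m)]
x = [0.0] * n
x[4] = 3.0                   # a sparse update: a single changed entry
x_hat = recover_one_sparse(measure(x, phi), phi)
```

Only the 6 measurement values travel over the air instead of 12 table entries; the reconstruction cost is borne by the leader node, which matches the paper's division of labor between neighborhood nodes and the leader.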

Dalali S.,Cambridge Institute of Technology | Suresh L.,Cambridge Institute of Technology
Procedia Computer Science | Year: 2016

Information technology plays an important role in our daily life, with vast and fast development, and storing significant information is a big challenge for researchers. Face recognition is one of the most important fields in information technology, where storing minimal significant information while recognizing accurately is most challenging. This paper proposes a Daubechies-wavelet-based approach with a modified Local Binary Pattern (LBP) for face recognition using minimal significant information. Applying various Daubechies wavelets with single-level decomposition and considering only the approximation wavelet coefficients reduces the storage requirement while retaining significant information. A modified LBP is then applied to the significant wavelet coefficients for simple, fast and accurate face recognition. The proposed work shows a higher face recognition rate using fewer features with an effective threshold. © 2016 The Authors. Published by Elsevier B.V.
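The modified-LBP stage builds on the basic 8-neighbor LBP code, which is easy to state exactly. The sketch below uses one common bit-ordering convention (the paper's modified variant may differ):

```python
def lbp_code(patch):
    """8-neighbor LBP code of the centre of a 3x3 patch: each neighbor
    not darker than the centre contributes one bit, clockwise from the
    top-left (one common bit-ordering convention)."""
    centre = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2),
             (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, c) in enumerate(order):
        if patch[r][c] >= centre:
            code |= 1 << bit
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
code = lbp_code(patch)   # -> 241 for this patch
```

Each pixel thus maps to a single byte describing its local texture, and histograms of these codes over the wavelet approximation coefficients give the compact feature vector used for matching.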

Kumar P.,Cambridge Institute of Technology | Chandra M.,Birla Institute of Technology
Proceedings of the 2011 World Congress on Information and Communication Technologies, WICT 2011 | Year: 2011

In this paper, Wavelet Based Mel Frequency Cepstral Coefficient (WMFCC) features are proposed for speaker verification. The performance of WMFCC features is evaluated and compared with that of Mel Frequency Cepstral Coefficient (MFCC) features. A database of ten Hindi digits from sixteen speakers is used in the simulations. Gaussian Mixture Models (GMMs) are used for maximum log-likelihood calculation during verification. The proposed features show an improvement of 1.18% over MFCC features for a text-dependent speaker verification system. © 2011 IEEE.
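The GMM scoring step can be sketched independently of the front-end features: a claimed-speaker model is compared against a background model via an average log-likelihood ratio. The toy 1-D models and frames below are assumptions; real systems score multi-dimensional WMFCC/MFCC vectors:

```python
import math

def gmm_loglik(frames, gmm):
    """Average per-frame log-likelihood under a 1-D GMM given as
    (weight, mean, variance) triples."""
    total = 0.0
    for x in frames:
        p = sum(w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                for w, m, v in gmm)
        total += math.log(p)
    return total / len(frames)

def verify(frames, claimed, background, threshold=0.0):
    """Accept the identity claim if the log-likelihood ratio is positive."""
    llr = gmm_loglik(frames, claimed) - gmm_loglik(frames, background)
    return llr > threshold

speaker_gmm = [(0.5, 0.0, 1.0), (0.5, 4.0, 1.0)]   # toy claimed-speaker model
background_gmm = [(1.0, 10.0, 4.0)]                # toy background model
frames = [0.1, 3.9, 0.2, 4.1]                      # toy 1-D feature frames
accepted = verify(frames, speaker_gmm, background_gmm)
```

The threshold trades off false acceptances against false rejections; the paper's 1.18% gain corresponds to WMFCC features making this ratio more separable than MFCC features do.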

Yang X.-S.,Middlesex University | Deb S.,Cambridge Institute of Technology | Loomes M.,Middlesex University | Karamanoglu M.,Middlesex University
Neural Computing and Applications | Year: 2013

The performance of any algorithm will largely depend on the setting of its algorithm-dependent parameters. The optimal setting should allow the algorithm to achieve the best performance for solving a range of optimization problems. However, such parameter tuning itself is a tough optimization problem. In this paper, we present a framework for self-tuning algorithms so that an algorithm to be tuned can be used to tune the algorithm itself. Using the firefly algorithm as an example, we show that this framework works well. It is also found that different parameters may have different sensitivities and thus require different degrees of tuning. Parameters with high sensitivities require fine-tuning to achieve optimality. © 2013 Springer-Verlag London.
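A plain firefly loop makes the tuning problem concrete: `alpha` (randomisation), `beta0` (attractiveness) and `gamma` (light absorption) are exactly the kind of algorithm-dependent parameters the self-tuning framework would fold into the search itself; here they are simply fixed by hand, and the objective and bounds are illustrative:

```python
import math
import random

def firefly(obj, dim=2, n=12, iters=150, seed=5):
    rng = random.Random(seed)
    alpha, beta0, gamma = 0.2, 1.0, 1.0   # the parameters one would tune
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        light = [obj(x) for x in pop]      # lower objective = brighter
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:    # move i toward the brighter j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a) + alpha * rng.uniform(-0.5, 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    light[i] = obj(pop[i])
        alpha *= 0.97                      # anneal the randomisation
    return min(pop, key=obj)

best = firefly(lambda x: sum(t * t for t in x))
```

In the self-tuning framework, a tuple like `(alpha, beta0, gamma)` would be appended to the search space so the algorithm optimizes its own parameters while solving the problem; the differing sensitivities the paper reports would then show up as how sharply performance varies along each added dimension.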
