NorthCap University

Gurgaon, India


Chhikara R.R.,NorthCap University | Kumari M.,NorthCap University
2016 International Conference on Computation System and Information Technology for Sustainable Solutions, CSITSS 2016 | Year: 2016

Steganalysis is capable of identifying carriers in which information has been hidden in such a way that its very existence is concealed. In this paper we propose a neural-network classification system for image steganalysis that reduces computational complexity through a pre-processing step: feature selection based on the Bhattacharyya distance. This approach identifies a relevant subset of the original features extracted from both the spatial and transform domains, and mitigates the 'curse of dimensionality' by removing redundant features before the dataset is classified. The experiments are performed on datasets produced by four steganography algorithms (outguess, steghide, PQ and nsF5) with two classifiers, Support Vector Machine and back-propagation neural networks. Combining either classifier with the Bhattacharyya distance filter feature selection approach shows an improvement of 2-20% over using the full feature set. © 2016 IEEE.
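The filter step described above can be sketched as follows. This is a minimal illustration that assumes each feature is approximately Gaussian within the cover and stego classes; the function names and the exact ranking procedure are illustrative, not taken from the paper:

```python
import numpy as np

def bhattacharyya_distance(x1, x2):
    """Bhattacharyya distance between two univariate Gaussian class
    distributions; larger values indicate more separable classes."""
    m1, m2 = x1.mean(), x2.mean()
    v1, v2 = x1.var() + 1e-12, x2.var() + 1e-12  # guard against zero variance
    return (0.25 * np.log(0.25 * (v1 / v2 + v2 / v1 + 2))
            + 0.25 * (m1 - m2) ** 2 / (v1 + v2))

def select_features(X, y, k):
    """Rank features by Bhattacharyya distance between cover (y=0) and
    stego (y=1) samples and keep the k most separable ones."""
    scores = np.array([bhattacharyya_distance(X[y == 0, j], X[y == 1, j])
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]  # indices of the top-k features
```

The selected feature indices would then be used to train the SVM or back-propagation classifier on the reduced feature matrix `X[:, idx]`.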


Patel S.,NorthCap University | Kaur J.,NorthCap University
Proceeding - IEEE International Conference on Computing, Communication and Automation, ICCCA 2016 | Year: 2016

A metric is a quantitative measure of the extent to which a system, component or process possesses a given attribute. Metrics are required to measure, improve and predict software quality. Various approaches are available for developing software metrics, such as the object-oriented, component-based and distributed approaches. Component-based software engineering is a recent approach to developing software. The main purpose of component-based metrics is to support reusability and to decrease cost and development time; these metrics are used for evaluating quality and managing risk. The aim of this paper is to study component-based metrics. The paper compares several component-based metrics on the basis of functional and non-functional characteristics of software, and discusses how this approach differs from other approaches to software development. © 2016 IEEE.


Adlakha A.,NorthCap University | Chhikara R.R.,NorthCap University
Proceeding - IEEE International Conference on Computing, Communication and Automation, ICCCA 2016 | Year: 2016

Steganalysis aims at detecting the presence of steganography, with or without knowledge of the steganographic technique used. Feature-based steganalysis determines the presence of secret data by comparing the statistical features of cover and stego images. Using a large number of features relative to the training set size may reduce classification accuracy and also increases computational complexity. Feature selection chooses the subset of features from a given dataset that maximizes classifier performance. In this paper we apply four different filter feature selection approaches to features extracted from the transform domain and compare their performance using five different classifiers. © 2016 IEEE.
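A filter approach of the kind compared above scores each feature independently of any classifier and keeps the top-ranked ones. The sketch below uses the Fisher criterion as the filter and a nearest-centroid rule as a stand-in classifier; the paper's four filters and five classifiers are not specified here, so these are illustrative substitutes:

```python
import numpy as np

def fisher_score(X, y):
    """Fisher criterion per feature for a two-class problem:
    (mean difference)^2 / (sum of class variances); larger = more separable."""
    a, b = X[y == 0], X[y == 1]
    return (a.mean(0) - b.mean(0)) ** 2 / (a.var(0) + b.var(0) + 1e-12)

def nearest_centroid_accuracy(Xtr, ytr, Xte, yte):
    """Accuracy of a minimal nearest-centroid classifier, used here only
    to compare feature subsets."""
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    pred = (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return float((pred == yte).mean())
```

Ranking `fisher_score(X, y)` and retraining on the top-k columns reproduces the filter-then-classify pipeline the paper evaluates.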


Kumari A.C.,NorthCap University | Srinivas K.,Dayalbagh Educational Institute
Information and Software Technology | Year: 2016

Context: In the requirements engineering phase of the software development life cycle, one of the main concerns of software engineers is to select a set of software requirements for implementation in the next release of the software from the many requirements proposed by customers, while balancing budget and customer satisfaction. Objective: To analyse the efficacy of the Quantum-inspired Elitist Multi-objective Evolutionary Algorithm (QEMEA), the Quantum-inspired Multi-objective Differential Evolution Algorithm (QMDEA) and Multi-objective Quantum-inspired Hybrid Differential Evolution (MQHDE) in solving the software requirements selection problem. Method: The paper reports on an empirical evaluation of the performance of three quantum-inspired multi-objective evolutionary algorithms along with the Non-dominated Sorting Genetic Algorithm-II (NSGA-II). The comparison includes the obtained Pareto fronts, three performance metrics - Generational Distance, Spread and Hypervolume - the attained boundary solutions, and the size of the Pareto front. Results: The results reveal that MQHDE outperformed the other methods in producing high-quality solutions, while QMDEA is able to produce well-distributed solutions with extreme boundary solutions. Conclusion: The hybridization of Differential Evolution with Genetic Algorithms coupled with quantum computing concepts (MQHDE) provided a means to effectively balance the two issues of multi-objective optimization - convergence and diversity. © 2016 Elsevier B.V. All rights reserved.
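At the core of every algorithm compared above is the same formulation: a candidate release is a binary vector over requirements, and the two objectives (cost, customer satisfaction) are compared by Pareto dominance. A minimal sketch of that formulation follows; the requirement data and function names are hypothetical:

```python
# Hypothetical requirements: (implementation cost, customer-satisfaction score)
REQS = [(10, 8), (4, 3), (7, 9), (3, 2), (8, 7), (5, 6)]

def objectives(mask):
    """(cost, -satisfaction): both expressed as minimisation objectives."""
    cost = sum(c for (c, _), bit in zip(REQS, mask) if bit)
    sat = sum(s for (_, s), bit in zip(REQS, mask) if bit)
    return (cost, -sat)

def dominates(a, b):
    """a Pareto-dominates b: no worse in every objective, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep the candidates that no other candidate dominates."""
    objs = [objectives(m) for m in candidates]
    return [m for m, o in zip(candidates, objs)
            if not any(dominates(p, o) for p in objs)]
```

NSGA-II and the quantum-inspired variants differ in how they generate and evolve the candidate population; the dominance test and objective evaluation above are common to all of them.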


Swamee P.K.,Northcap University | Rathie P.N.,Federal University of Ceará
Journal of Irrigation and Drainage Engineering | Year: 2016

Normal depth is a key parameter in open-channel hydraulics. For all practical canal sections, the open-channel resistance equation is implicit in the normal depth, so determining it requires a tedious trial-and-error procedure. This paper presents an accurate approximate explicit equation for the normal depth of parabolic open-channel sections. © 2016 American Society of Civil Engineers.
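The trial-and-error procedure that the paper's explicit equation avoids can be sketched as a bisection on Manning's equation. The sketch assumes SI units, a parabolic profile y = a·x², and a common shallow-channel approximation for the wetted perimeter; the function name and bracketing bounds are illustrative, not from the paper:

```python
import math

def normal_depth_parabolic(Q, n, S, a, tol=1e-8):
    """Iteratively (bisection) solve Manning's equation
    Q = (1/n) * A * (A/P)**(2/3) * sqrt(S) for the normal depth of a
    parabolic channel y = a*x**2. The paper's contribution is an explicit
    approximation that makes this iteration unnecessary."""
    def discharge(y):
        T = 2.0 * math.sqrt(y / a)        # top width at depth y
        A = (2.0 / 3.0) * T * y           # flow area of a parabolic section
        P = T + 8.0 * y * y / (3.0 * T)   # approximate wetted perimeter
        return (1.0 / n) * A * (A / P) ** (2.0 / 3.0) * math.sqrt(S)
    lo, hi = 1e-6, 50.0                   # assumed bracketing depths, metres
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if discharge(mid) < Q else (lo, mid)
    return 0.5 * (lo + hi)
```

Because discharge increases monotonically with depth, the bisection always converges, but it needs dozens of evaluations where an explicit equation needs one.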


Dixit S.,Northcap University | Badgaiyan A.J.,Vindhya Institute of Management and Research
Resources, Conservation and Recycling | Year: 2016

Considering the enormous adverse impact of improper e-waste disposal on greenhouse gas emissions and global climate change, it is imperative to develop an improved understanding of reverse logistics. However, since the majority of consumers prefer to store their e-waste at home rather than return it to the producer, thereby limiting the successful implementation of reverse logistics, it is important to understand the psychological determinants of consumers' intention to return e-waste so that effective strategies can be designed accordingly. This research aims to strengthen e-waste acquisition from consumers by determining the psychological determinants of the intention to return e-waste. For this, a survey instrument was administered to 750 mobile phone users in India and structural equation modeling was used to analyze the responses. The findings show that return intention acts as a mediating variable in the prediction of return behavior. Further, perceived behavioral control, subjective norms, moral norms and willingness to sacrifice were identified as antecedents of return intention. Given the possibility of increasing the proportion of e-waste returns by strengthening behavioral intentions, the findings and suggested inputs may help firms fulfil their "Extended Producer Responsibility". © 2015 Elsevier B.V. All rights reserved.


Tyagi C.,NorthCap University | Sharma A.,NorthCap University
Journal of Physics D: Applied Physics | Year: 2016

The present work deals with the preparation and characterization of 2-mercaptoethanol-capped cadmium selenide (CdSe) nanoparticles dispersed in aqueous poly(diallyl dimethyl ammonium chloride) (PDADMAC) polyelectrolyte solution. X-ray diffraction, scanning electron microscopy and energy-dispersive X-ray analysis have been used to determine the structure, particle size (d), surface morphology and composition of the various constituents. The absorption spectra of pure PDADMAC and the CdSe/PDADMAC polymer nanocomposite (PNC) are analyzed to determine the absorption coefficient (α) and the energy band gap (Eg), found to be 4 eV for the pure polymer and 3.26 eV for the PNC. A red shift in the spectrum of the PNC, as compared to the pure polymer, has been observed. With the addition of CdSe nanoparticles to the PDADMAC polyelectrolyte, a remarkable change in the optical parameters of the pure polymer is observed. The refractive index (n), obtained using Swanepoel's method, decreases for the PNC as compared to the pure polymer. The static refractive index (n0) is found to be 4.29 for the pure polymer and 1.52 for the PNC. The extinction coefficient, dielectric constants, optical conductivity and relaxation time have been evaluated. The Wemple-DiDomenico model has been used to evaluate dispersion parameters such as the average energy gap (E0) and dispersion energy (Ed). The nonlinear refractive index (n2) of the pure polymer and the PNC has been determined using the theoretical approaches suggested by Boling and by Tichy and Ticha. n2 increases for the PNC, consistent with its decreased energy band gap. Photoluminescence (PL) spectra have been studied to explore the energy band structure and the interaction between CdSe nanoparticles and PDADMAC. The PL peaks at 437 nm and 461 nm correspond to the pure polymer, whereas the peak at 577 nm is attributed to CdSe. © 2016 IOP Publishing Ltd.
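A band gap of the kind quoted above is commonly extracted from absorption spectra via a Tauc plot: for a direct-gap material, (αhν)² is linear in photon energy near the absorption edge, and extrapolating the linear region to zero gives Eg. The sketch below shows that standard procedure on synthetic data; the paper's exact fitting method is not specified, so this is only an assumed illustration:

```python
import numpy as np

def tauc_band_gap(energy_eV, alpha, fit_window):
    """Estimate a direct optical band gap from a Tauc plot: fit
    (alpha * E)**2 vs photon energy E over the linear region selected by
    fit_window = (Emin, Emax) and return the x-intercept, i.e. Eg."""
    y = (alpha * energy_eV) ** 2
    m = (energy_eV >= fit_window[0]) & (energy_eV <= fit_window[1])
    slope, intercept = np.polyfit(energy_eV[m], y[m], 1)
    return -intercept / slope  # zero crossing of the linear extrapolation
```

Choosing the fit window is the delicate step in practice: it must cover only the linear part of the edge, which is usually judged from the plot itself.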


Nagpal A.,Northcap University | Gaur D.,Northcap University
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2015

Feature subset selection extracts a highly relevant subset of the original features of a dataset. In this paper, we propose a new algorithm that filters features from the dataset using a greedy stepwise forward selection technique. The proposed algorithm uses gain ratio as the greedy evaluation measure and utilizes a multiple-feature correlation technique to remove redundant features from the dataset. Experiments evaluating the proposed algorithm are based on the number of features, runtime and classification accuracy of three classifiers, namely Naïve Bayes, the tree-based C4.5 and the instance-based IB1. The results have been compared with two other feature selection algorithms, the Fast Correlation-Based Filter Solution (FCBS) and the fast clustering-based feature selection algorithm (FAST), over datasets of different dimensionality and domains. A unified metric that combines all three parameters (number of features, runtime, classification accuracy) has also been used to compare the algorithms. The results show that the proposed algorithm improves significantly over the other feature selection algorithms on large-dimensional data when working on a dataset from the image domain. © Springer International Publishing Switzerland 2015.
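The gain-ratio-driven greedy step can be sketched as follows for discrete features. This simplified version ranks and picks features one at a time and omits the paper's multiple-feature correlation check for redundancy; the function names are illustrative:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a sequence of discrete values."""
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def gain_ratio(feature, target):
    """Information gain of `feature` about `target`, normalised by the
    feature's own entropy (C4.5-style gain ratio)."""
    h_y, n, cond = entropy(target), len(target), 0.0
    for v in set(feature):
        idx = [i for i, f in enumerate(feature) if f == v]
        cond += len(idx) / n * entropy([target[i] for i in idx])
    h_x = entropy(feature)
    return (h_y - cond) / h_x if h_x > 0 else 0.0

def greedy_forward_select(features, target, k):
    """Greedy stepwise forward selection: repeatedly add the remaining
    feature with the highest gain ratio. `features` maps name -> values."""
    remaining, chosen = dict(features), []
    for _ in range(k):
        best = max(remaining, key=lambda name: gain_ratio(remaining[name], target))
        chosen.append(best)
        del remaining[best]
    return chosen
```

In the paper's algorithm, a feature highly correlated with an already-chosen one would additionally be discarded at each step rather than kept as a candidate.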


Kaushal H.,NorthCap University | Kaddoum G.,NorthCap University
IEEE Communications Surveys and Tutorials | Year: 2016

In recent years, free space optical (FSO) communication has gained significant importance owing to its unique features: large bandwidth, license-free spectrum, high data rate, easy and quick deployability, and low power and mass requirements. FSO communication uses an optical carrier in the near-infrared (IR) band to establish terrestrial links within the Earth's atmosphere, inter-satellite/deep-space links, or ground-to-satellite/satellite-to-ground links. It also finds applications in remote sensing, radio astronomy, military systems, disaster recovery, last-mile access, backhaul for wireless cellular networks and many more. However, despite the great potential of FSO communication, its performance is limited by the adverse effects (viz., absorption, scattering and turbulence) of the atmospheric channel. Of these three effects, atmospheric turbulence is a major challenge that may lead to serious degradation in the bit error rate (BER) performance of the system and make the communication link infeasible. This paper presents a comprehensive survey of the challenges faced by FSO communication systems for ground-to-satellite/satellite-to-ground and inter-satellite links, and details various mitigation techniques for achieving high link availability and reliability. The first part of the paper focuses on the various impairments that pose a serious challenge to the performance of optical communication systems for ground-to-satellite/satellite-to-ground and inter-satellite links. The latter part provides the reader with an exhaustive review of techniques, both at the physical layer and at the other layers (link, network or transport), to combat the adverse effects of the atmosphere.
It also presents a recently developed technique using orbital angular momentum for exploiting the high-capacity advantage of the optical carrier in space-based and near-Earth optical communication links, and provides comprehensive details on the use of space-based optical backhaul links to deliver high-capacity, low-cost backhaul solutions. © 1998-2012 IEEE.
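The strength of the atmospheric turbulence discussed above is conventionally quantified by the Rytov variance; values well below 1 indicate the weak-fluctuation regime, values near or above 1 indicate moderate-to-strong turbulence where BER degrades severely. A sketch of the standard plane-wave expression for a horizontal link follows (the parameter names are illustrative):

```python
import math

def rytov_variance(cn2, wavelength_m, path_m):
    """Plane-wave Rytov variance, the standard measure of
    turbulence-induced scintillation strength for a horizontal FSO link:
        sigma_R^2 = 1.23 * Cn^2 * k^(7/6) * L^(11/6)
    where k = 2*pi/lambda is the optical wavenumber, Cn^2 the
    refractive-index structure parameter (m^-2/3) and L the path length (m)."""
    k = 2.0 * math.pi / wavelength_m
    return 1.23 * cn2 * k ** (7.0 / 6.0) * path_m ** (11.0 / 6.0)
```

For example, a 1 km link at 1550 nm with a weak-turbulence Cn² of 1e-15 m^(-2/3) yields a Rytov variance of a few percent, comfortably in the weak-fluctuation regime; the variance grows rapidly with path length.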


An asymmetric scheme is proposed for optical double-image encryption in the gyrator wavelet transform (GWT) domain. Grayscale and binary images are encrypted separately using double random phase encoding (DRPE) in the GWT domain. Phase masks based on devil's vortex Fresnel lenses (DVFLs) and random phase masks (RPMs) are used jointly in the spatial as well as the Fourier plane. The images to be encrypted are first gyrator transformed and then single-level discrete wavelet transformed (DWT) to decompose them into the LL, HL, LH and HH matrices of approximation, horizontal, vertical and diagonal coefficients. The resulting DWT coefficients are multiplied by further RPMs and the results are passed through the inverse discrete wavelet transform (IDWT) to obtain the encrypted images. The images are recovered from their encrypted counterparts by using the correct parameters of the GWT and DVFL; the digital implementation has been performed using MATLAB 7.6.0 (R2008a). The mother wavelet family, the DVFL and the gyrator transform orders associated with the GWT act as extra keys that increase the difficulty for an attacker, so the scheme is more secure than conventional techniques. The efficacy of the proposed scheme is verified by computing the mean squared error (MSE) between the recovered and original images, and its sensitivity to the encryption parameters and to noise attacks is also verified. © 2016 Elsevier Ltd.
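The DRPE building block at the heart of the scheme can be sketched in its classical Fourier-domain form: one random phase mask in the spatial plane, one in the Fourier plane. The paper replaces the plain Fourier transform with the gyrator wavelet transform and uses DVFL-based masks, so the following numpy sketch is only the underlying principle, not the proposed scheme:

```python
import numpy as np

def drpe_encrypt(img, phase1, phase2):
    """Classical double random phase encoding. phase1/phase2 are arrays in
    [0, 1) acting as the spatial-plane and Fourier-plane random masks."""
    field = img * np.exp(2j * np.pi * phase1)                    # spatial mask
    spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * phase2)  # Fourier mask
    return np.fft.ifft2(spectrum)                                # ciphertext (complex)

def drpe_decrypt(cipher, phase1, phase2):
    """Invert both masks; recovery is exact only with the correct keys."""
    spectrum = np.fft.fft2(cipher) * np.exp(-2j * np.pi * phase2)
    field = np.fft.ifft2(spectrum) * np.exp(-2j * np.pi * phase1)
    return np.abs(field)
```

Decrypting with an incorrect Fourier-plane mask leaves the output noise-like, which is exactly the key sensitivity the paper quantifies via the MSE between recovered and original images.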
