Gupta R., BAHRA University |
Advances in Intelligent Systems and Computing | Year: 2017
Free-space optics (FSO) has emerged in recent years as a strong alternative to radio-frequency (RF) communication. The performance of an FSO channel is affected by varying climatic conditions such as fog, haze, and rain, which strongly degrade the quality of a laser beam propagating through the atmosphere. Attenuation under foggy conditions has a particularly severe effect on the received power, and this is one reason FSO has not yet achieved mass-market success. The quality of an FSO system is analysed in terms of its signal-to-noise ratio. In this paper, performance is analysed for different foggy weather conditions (dense, continental, maritime, stable, advection, and dense haze) using the Kim and Kruse models. The density of the fog is governed by the liquid water content (LWC) present in the atmosphere. The achievable data rate at an optical wavelength of 1550 nm is also studied. © Springer Science+Business Media Singapore 2017.
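The Kim model named above predicts fog attenuation from visibility. A minimal sketch of that relationship follows; the specific visibility and wavelength values in the test are arbitrary illustrations, not the paper's experimental settings:

```python
def kim_q(v_km):
    """Size-distribution exponent q in the Kim model (visibility in km)."""
    if v_km > 50:
        return 1.6
    if v_km > 6:
        return 1.3
    if v_km > 1:
        return 0.16 * v_km + 0.34
    if v_km > 0.5:
        return v_km - 0.5
    return 0.0  # dense fog: attenuation becomes wavelength-independent

def fog_attenuation_db_per_km(v_km, wavelength_nm=1550):
    """Specific attenuation (dB/km) from visibility, via 3.91/V * (lambda/550)^-q."""
    q = kim_q(v_km)
    return (3.91 / v_km) * (wavelength_nm / 550.0) ** (-q)
```

At 1550 nm the longer wavelength helps only in light fog; once visibility drops below 0.5 km, q = 0 and all wavelengths are attenuated equally, which matches the severe dense-fog penalty discussed above.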
Kaur P., UIET |
Gupta A., University of Punjab
Advances in Intelligent Systems and Computing | Year: 2017
Telecommunication networks have grown tremendously with the emergence of various communication techniques, and optical communications have been a major contributor to this growth. Free-space optical (FSO) communication is a technique that transmits data by propagating light through free space. It is an effective way to transmit data at high bit rates over short distances, with the added advantages of easy, fast installation and high security. Hybrid RF/FSO techniques improve the overall reliability of the system. This paper presents an aerial application of the RF/FSO system, namely the airborne Internet, which uses optical links in networks of unmanned aerial vehicles (UAVs). Different formations of UAV swarms, methods to combat the problems faced by UAVs, and high-altitude platforms (HAPs) for research work are discussed. A model is also proposed to improve the reliability of the swarm network. © Springer Science+Business Media Singapore 2017.
Proceeding - IEEE International Conference on Computing, Communication and Automation, ICCCA 2016 | Year: 2016
Adverse drug reactions (ADRs) represent a major health problem all over the world. An ADR is any injury caused by taking a drug, by an overdose, or by a combination of two or more drugs. Detecting ADRs is important because they affect a large number of people; early detection can raise warnings against adverse effects and help medical practitioners make treatment effective and timely. In today's digital era, a huge amount of data on the adverse effects of drugs is being collected at hospitals, at drug retail stores, and by drug manufacturers. This data can be used to uncover hidden relationships between drugs and their adverse reactions, but its sheer volume makes manual analysis impossible. Data mining is the process of extracting meaningful and useful patterns hidden in large amounts of data, and its techniques can be applied in the medical domain to extract relationships between drugs and adverse reactions, saving cost compared with experimental detection of such relationships. We present a survey of how data mining techniques can be utilized for detecting the adverse effects of drugs, organized in a comprehensible way. The papers we reviewed focus mainly on detecting adverse effects after a drug has been launched in the market. © 2016 IEEE.
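One simple way data mining surfaces drug-reaction relationships is through association-style support and confidence counts over reported cases. A minimal sketch under assumed inputs (the drug and reaction names below are invented, not taken from any surveyed paper):

```python
# Hypothetical (drug taken, reaction reported) case records
records = [
    ("drugA", "nausea"), ("drugA", "nausea"), ("drugA", "rash"),
    ("drugB", "nausea"), ("drugB", "headache"), ("drugA", "nausea"),
]

def rule_stats(records, drug, reaction):
    """Support and confidence of the association rule drug -> reaction."""
    n = len(records)
    with_drug = [r for d, r in records if d == drug]
    both = sum(1 for r in with_drug if r == reaction)
    support = both / n                                  # how common the pair is overall
    confidence = both / len(with_drug) if with_drug else 0.0  # P(reaction | drug)
    return support, confidence
```

A rule whose confidence greatly exceeds the reaction's background rate is a candidate ADR signal worth clinical review; real systems add significance measures to filter coincidences.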
Deep G., Chandigarh Engineering College |
Kaur L., Punjabi University |
Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization | Year: 2017
In this paper a new feature descriptor, the local quantised extrema quinary pattern (LQEQryP), is proposed for biomedical image indexing and retrieval. Binary and non-binary codings such as local binary patterns (LBP), local ternary patterns (LTP) and local quinary patterns (LQP) encode the gray-scale relationship between the centre pixel and its surrounding neighbours in a two-dimensional (2D) local region of an image, whereas the proposed method encodes the spatial relation between any pair of neighbours in a local region along the given directions (i.e. 0°, 45°, 90° and 135°) for a given centre pixel. The novelty of the proposed method is that it uses quinary pattern features from a horizontal-vertical-diagonal-anti-diagonal (HVDA7) structure of directional local extrema values to encode more spatial structure information, which leads to better retrieval. LQEQryP also provides a significant increase in discriminative power by allowing larger local pattern neighbourhoods. Experiments have been carried out on three benchmark biomedical databases: (i) the computed tomography (CT) lung image databases LIDC-IDRI-CT and VIA/I-ELCAP-CT, and (ii) the brain magnetic resonance imaging (MRI) database OASIS-MRI. The results demonstrate the superiority of the proposed method in terms of average retrieval precision (ARP) and average retrieval rate (ARR) over state-of-the-art feature extraction techniques such as LBP, LTP and LQEP. © 2017 Informa UK Limited, trading as Taylor & Francis Group
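For context, a generic local quinary coding (in the spirit of LQP, not the authors' exact LQEQryP/HVDA7 construction) quantises gray-level differences into five levels using two thresholds. A minimal sketch, with threshold values chosen arbitrarily for illustration:

```python
def quinary_code(diff, t1, t2):
    """Five-level quantisation of a gray-level difference (requires 0 < t1 < t2)."""
    if diff >= t2:
        return 2
    if diff >= t1:
        return 1
    if diff > -t1:
        return 0
    if diff > -t2:
        return -1
    return -2

def local_quinary_pattern(center, neighbors, t1=2, t2=5):
    """Quinary code of each neighbor relative to the center pixel."""
    return [quinary_code(n - center, t1, t2) for n in neighbors]
```

The five levels give finer discrimination than LBP's two or LTP's three, at the cost of longer pattern histograms; descriptors like LQEQryP trade this off by restricting which pixel pairs are encoded.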
Arora A., JUIT Solan |
International Journal of Pharmacy and Technology | Year: 2016
This work is concerned with the synthesis of quantum dots (QDs) by the solution-growth method at room temperature and with their characterization. Excellent structural, optical, and surface properties have been attained, such as high luminescence and a stable structure. The prepared quantum dot and core/shell quantum dot structures are cadmium selenide (CdSe), zinc sulfide (ZnS), and cadmium selenide/zinc sulfide (CdSe/ZnS). © 2016, International Journal of Pharmacy and Technology. All rights reserved.
Singh J., DAV Institute of Engineering and Technology |
Singh S., UIET |
Singh D., Control Engg. |
Uddin M., Dr. B.R. Ambedkar N.I.T
Signal Processing: Image Communication | Year: 2011
Image compression plays a pivotal role in minimizing data size and reducing transmission costs. Many coding techniques have been developed, but the most widely used is JPEG compression. However, images reconstructed from JPEG compression show noticeable degradations near block boundaries, called blocking artifacts, particularly at high compression ratios. This paper presents a method to detect and reduce these artifacts without smoothing images and without removing perceptual features. A low-computation deblocking filter with four modes is proposed: three frequency-related modes (smooth, non-smooth, and intermediate) and a corner mode for the corner of four blocks. Extensive experiments and comparisons with other deblocking methods have been conducted on the basis of PSNR, MSSIM, SF, and MOS to justify the effectiveness of the proposed method. The proposed algorithm keeps computation low while achieving better detail preservation and artifact removal. © 2011 Elsevier B.V.
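Mode decisions of this kind typically classify each 8x8 block boundary by the pixel activity around it, so that flat regions get strong filtering and detailed regions are left alone. The criterion and thresholds below are illustrative assumptions, not the paper's exact filter:

```python
def boundary_mode(left_col, right_col, t_smooth=2, t_nonsmooth=10):
    """Classify a vertical block boundary from the pixel columns on either
    side of it (a generic activity-based sketch, not the paper's criterion)."""
    # activity: mean absolute difference across the boundary
    n = len(left_col)
    activity = sum(abs(a - b) for a, b in zip(left_col, right_col))
    if activity <= t_smooth * n:
        return "smooth"        # flat area: blocking is visible, filter strongly
    if activity >= t_nonsmooth * n:
        return "non-smooth"    # real detail: filter weakly or not at all
    return "intermediate"
```

Keeping the decision to a few absolute differences per boundary is what makes such filters cheap enough for real-time post-processing.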
Deep G., IET Bhaddal |
Kaur L., Punjabi University |
Procedia Computer Science | Year: 2016
This paper compares two newly proposed pattern descriptors, the local mesh ternary pattern (LMeTerP) and the directional local ternary quantized extrema pattern (DLTerQEP), for biomedical image indexing and retrieval. The standard local binary patterns (LBP) and local ternary patterns (LTP) encode the gray-scale relationship between the center pixel and its surrounding neighbors in a two-dimensional (2D) local region of an image. In contrast, the former descriptor encodes the gray-scale relationship among the neighbors of a given center pixel along three selected directions of mesh patterns generated from the 2D image, while the latter encodes the spatial relation between any pair of neighbors in a local region along the given directions (i.e., 0°, 45°, 90° and 135°) for a given center pixel. The novelty of the proposed descriptors is that they use ternary patterns to encode more spatial structure information, which leads to better retrieval. The experimental results demonstrate the superiority of the new techniques in terms of average retrieval precision (ARP) and average retrieval rate (ARR) over state-of-the-art feature extraction techniques (such as LBP, LTP, LQEP and LMeP) on three benchmark biomedical databases. © 2016 The Authors. Published by Elsevier B.V.
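The standard LBP baseline mentioned above can be sketched for a single 3x3 patch as follows (the clockwise neighbor ordering is one common convention; rotated orderings give equally valid codes):

```python
def lbp_code(patch):
    """Standard 3x3 local binary pattern: threshold the 8 neighbors against
    the center pixel and read them as an 8-bit code."""
    c = patch[1][1]
    # neighbors taken clockwise starting from the top-left corner
    nbrs = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
            patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum(1 << i for i, n in enumerate(nbrs) if n >= c)
```

Descriptors like LMeTerP and DLTerQEP refine this baseline by replacing the binary center-versus-neighbor test with ternary codes over neighbor-to-neighbor relations, which is where the extra spatial structure information comes from.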
MATEC Web of Conferences | Year: 2016
Cloud computing is an emerging technology that has spread widely among researchers. It provides users with infrastructure, platform, and software as services that are easily accessible via the web. A cloud is a kind of parallel and distributed system consisting of a collection of virtualized computers used to execute tasks so as to achieve good execution times, meet deadlines, and make good use of resources. The scheduling problem can be seen as finding an optimal assignment of tasks over the available set of resources such that the desired objectives for the tasks are achieved. This paper presents an algorithm for scheduling tasks that minimizes their waiting time as a QoS parameter. The algorithm is simulated using the CloudSim simulator, and experiments are carried out to help clients identify the bottleneck of running a number of virtual machines in parallel. © Owned by the authors, published by EDP Sciences, 2016.
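Waiting-time-aware scheduling can be sketched with a simple greedy earliest-free-VM heuristic; this is an illustrative baseline, not necessarily the paper's algorithm, and the task lengths in the test are arbitrary:

```python
def schedule_min_waiting(task_lengths, n_vms):
    """Assign each task to the VM that becomes free first and
    return the per-task waiting times."""
    free_at = [0.0] * n_vms           # time at which each VM becomes free
    waits = []
    for length in task_lengths:
        vm = min(range(n_vms), key=lambda i: free_at[i])
        waits.append(free_at[vm])     # the task waits until its VM is free
        free_at[vm] += length         # VM is then busy for the task's length
    return waits
```

Plotting the mean of `waits` against `n_vms` is exactly the kind of experiment that exposes the point beyond which adding virtual machines no longer reduces waiting time.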
Goyal A., C DAC |
International Conference on Computing, Communication and Automation, ICCCA 2015 | Year: 2015
MapReduce is a programming model developed specifically for the management and processing of 'Big Data': extremely large amounts of data that demand high-level analytic capabilities. Every day, large volumes of data are generated and collected from data sources across the planet, and this data must be analyzed in terms of both its volume and the speed at which it moves to and from data management systems. MapReduce executes programs efficiently on large clusters by exploiting parallelism. Google's MapReduce framework has so far been considered the most successful implementation for Big Data, and a number of other implementations of the MapReduce programming model have been proposed. This paper discusses various emerging implementations of the MapReduce model, with emphasis on the strengths and weaknesses of each. © 2015 IEEE.
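The model itself can be illustrated with the classic word-count example, sketched here in plain Python; real frameworks distribute the same three phases (map, shuffle, reduce) across a cluster:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    """Map: emit a (word, 1) pair for every word in a document."""
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    """Shuffle: group all values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's values into a final count."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big clusters", "data moving fast"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(d) for d in docs)))
```

Because map calls are independent and reduce works per key, both phases parallelize trivially, which is the property the implementations surveyed in the paper all exploit.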
Marriwala N., U.I.E.T |
Proceedings of the 2012 World Congress on Information and Communication Technologies, WICT 2012 | Year: 2012
A wireless sensor network consists of small devices, called sensor nodes, equipped with sensors to monitor physical and environmental conditions such as pressure, temperature, humidity, motion, and speed. The nodes are battery powered, so one of the most important issues in a wireless sensor network is the limited battery power of the sensor nodes: a node whose power is exhausted drops out of the network, which affects the overall network lifetime. Minimizing energy dissipation and maximizing network lifetime are therefore central issues in the design of applications and protocols for sensor networks. This paper improves the lifetime of a wireless sensor network, measured as the number of nodes remaining alive, by using a different approach to selecting the cluster head. Cluster-head selection is based on maximum residual energy and minimum distance, and an optimal path is chosen between the cluster heads for transmission to the base station. © 2012 IEEE.
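The selection criterion described (maximum residual energy, minimum distance) can be sketched as a score-based choice; the combined score below is an illustrative assumption, not the paper's exact formula:

```python
import math

def select_cluster_head(nodes, base_station):
    """Pick the node that best combines high residual energy with a short
    distance to the base station (score = energy / (1 + distance))."""
    def dist(node):
        return math.hypot(node["x"] - base_station[0], node["y"] - base_station[1])
    # higher residual energy raises the score, greater distance lowers it
    return max(nodes, key=lambda node: node["energy"] / (1.0 + dist(node)))
```

Rotating this choice each round spreads the energy-hungry cluster-head role across the network, which is what keeps more nodes alive for longer.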