LNMIIT

Jaipur, India

Nahar S.,LNMIIT | Joshi M.V.,Dhirubhai Ambani Institute of ICT
Proceedings - International Conference on Pattern Recognition | Year: 2017

In this paper, we propose a new approach for dense disparity estimation in a global energy minimization framework. We combine the feature matching cost, defined using learned hierarchical features of the given left and right stereo images, with the pixel-based intensity matching cost to form the data term. The features are learned in an unsupervised way using a deep deconvolutional network. Our regularization term consists of an inhomogeneous Gaussian Markov random field (IGMRF) prior that captures smoothness while preserving sharp discontinuities in the disparity map. An iterative two-phase algorithm is proposed to minimize the energy function in order to estimate the dense disparity map. In phase one, the IGMRF parameters are computed keeping the disparity map fixed, and in phase two, the disparity map is refined by minimizing the energy function using graph cuts with the other parameters fixed. Experimental results on the Middlebury stereo benchmarks demonstrate the effectiveness of the proposed approach. © 2016 IEEE.
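
To make the alternation concrete, here is a minimal sketch of the two-phase loop in Python. It is an illustrative outline, not the authors' implementation: compute_data_cost, estimate_igmrf_params, and graph_cut_expansion are hypothetical helpers standing in for the combined feature/intensity matching cost, the IGMRF parameter update, and an alpha-expansion graph-cut solver.

import numpy as np

def estimate_disparity(left, right, max_disp, n_iters=5):
    """Two-phase alternating minimization (illustrative sketch)."""
    # data_cost[d, y, x]: feature-matching + intensity-matching cost
    data_cost = compute_data_cost(left, right, max_disp)        # hypothetical helper
    disparity = np.argmin(data_cost, axis=0).astype(float)      # simple winner-take-all init
    for _ in range(n_iters):
        # Phase 1: IGMRF edge parameters from the current disparity gradients
        igmrf_params = estimate_igmrf_params(disparity)         # hypothetical helper
        # Phase 2: refine the disparity by discrete energy minimization (graph cuts)
        disparity = graph_cut_expansion(data_cost, igmrf_params, init=disparity)  # hypothetical helper
    return disparity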


Sharma H.,Manipal University India | Khatri N.,LNMIIT
2017 International Conference on Nascent Technologies in Engineering, ICNTE 2017 - Proceedings | Year: 2017

A novel algorithm for colour image encryption is proposed in this paper. The proposed algorithm is based on image scrambling and the linear canonical transform (LCT). The image scrambling is accomplished using a chaotic function iterated along the length of the image to be scrambled. The LCT is then used to encrypt all three components of the colour image, i.e. red, green, and blue, at the same time. Applying the LCT to these components keeps them from affecting each other, so the correlation between the red, green, and blue components is reduced in the encrypted image, which increases the security of the colour image. Simulation results show that the proposed algorithm is well suited for colour image encryption and is robust against various attacks. © 2017 IEEE.
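
A minimal sketch of the two ingredients, assuming a logistic map as the chaotic function (used to generate a pixel permutation) and a chirp-FFT-chirp approximation of the discrete LCT (valid for LCT parameter b != 0); the parameter values are illustrative, not those of the paper.

import numpy as np

def logistic_permutation(n, x0=0.3141, r=3.99):
    """Pixel scrambling order driven by a logistic-map chaotic sequence."""
    x, seq = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return np.argsort(seq)

def discrete_lct(signal, a, b, d):
    """Rough discrete LCT via chirp multiplication, FFT, chirp multiplication (b != 0)."""
    n = len(signal)
    t = np.arange(n)
    pre = np.exp(1j * np.pi * (a / b) * t**2)
    post = np.exp(1j * np.pi * (d / b) * t**2)
    return post * np.fft.fft(signal * pre) / np.sqrt(1j * b * n)

def encrypt_channel(channel, a, b, d):
    """Scramble one colour channel chaotically, then encrypt it with the LCT."""
    flat = channel.astype(float).ravel()
    scrambled = flat[logistic_permutation(flat.size)]
    return discrete_lct(scrambled, a, b, d).reshape(channel.shape)

# The red, green and blue channels are encrypted independently, e.g.
# cipher = [encrypt_channel(img[..., c], 1.0, 0.5, 1.0) for c in range(3)]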


Choudhary P.,LNMIIT | Kant V.,LNMIIT | Dwivedi P.,Motilal Nehru National Institute of Technology
ACM International Conference Proceeding Series | Year: 2017

A recommender system (RS), a web personalization tool, attempts to generate suitable recommendations to users based on their preferences. Generally, recommender systems work on overall ratings, but these ratings do not reflect the actual user preferences. Therefore, incorporating multiple criteria ratings into an RS can capture user preferences accurately and produce effective recommendations. Multi-criteria recommender systems (MCRS) generate recommendations based on the aggregation of similarities computed on multiple criteria using collaborative filtering. However, capturing the optimal weights of various users on different criteria during similarity aggregation is a major concern. Further, the selection of an appropriate similarity measure is another challenge for employing collaborative filtering. Our work in this paper is an attempt towards developing multi-criteria recommender systems by utilizing various similarity measures and particle swarm optimization to learn the optimal weights. Experimental results reveal that our proposed approaches outperform other traditional approaches. © 2017 ACM.
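
As an illustration of the weight-learning step (not the paper's implementation), the sketch below aggregates per-criterion user-user similarity matrices with a weight vector and uses a basic particle swarm optimizer to search for weights that minimize a user-supplied error function (e.g. MAE on held-out overall ratings); the PSO constants are assumptions.

import numpy as np

def aggregate_similarity(criterion_sims, weights):
    """Weighted combination of per-criterion similarity matrices."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wk * sk for wk, sk in zip(w, criterion_sims))

def pso_learn_weights(error_fn, n_criteria, n_particles=20, n_iters=50):
    """Basic PSO: error_fn(weights) -> prediction error to be minimized.

    error_fn would typically build predictions from aggregate_similarity(...)
    internally and score them against held-out overall ratings.
    """
    pos = np.random.rand(n_particles, n_criteria)
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([error_fn(p) for p in pos])
    gbest = pbest[np.argmin(pbest_err)]
    for _ in range(n_iters):
        r1, r2 = np.random.rand(2)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        err = np.array([error_fn(p) for p in pos])
        improved = err < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], err[improved]
        gbest = pbest[np.argmin(pbest_err)]
    return gbest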


Sureka A.,Indian Institute of Technology Delhi | Lal S.,Indian Institute of Technology Delhi | Agarwal L.,LNMIIT
Proceedings - Asia-Pacific Software Engineering Conference, APSEC | Year: 2011

Defect tracking systems such as Bugzilla and JIRA and source code version control systems such as CVS and SVN are widely used applications to support software development and maintenance activities. Previous studies show that bug databases and version databases are often used as standalone and separate repositories without explicit linkages between issue reports and corresponding commit transactions. This is because developers often do not explicitly mention or tag commit transactions with the relevant bug report IDs. The lack of explicit links between these two databases has been identified as a serious process data quality issue (incomplete and biased data) with implications for predictive model building (such as defect density and error proneness computation) and hypothesis testing based on the dataset. Researchers have proposed solutions to link the two databases and performed experiments on open source projects such as Mozilla Firefox. We review previous approaches and propose a novel technique, based on the Fellegi-Sunter (FS) model for record linkage, to automatically integrate the two databases, which overcomes some of the drawbacks of traditional methods. We validate the proposed approach by performing experiments on publicly available bug and version datasets obtained from two open-source projects (Apache HTTP Server and WikiMedia). The results of our experiments demonstrate that the proposed solution is effective in recovering traceability links (missing links) between bug-fixing commits and corresponding bug reports. © 2011 IEEE.
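
The Fellegi-Sunter scoring step can be illustrated as follows; the comparison features (bug ID mentioned in the commit message, time proximity, textual overlap) and the m/u probabilities are hypothetical placeholders, not the values estimated in the paper.

import math

# Hypothetical agreement probabilities: P(feature agrees | true link) and
# P(feature agrees | non-link), one pair per comparison feature.
M_PROBS = {"bug_id_in_message": 0.90, "time_within_window": 0.80, "text_overlap": 0.70}
U_PROBS = {"bug_id_in_message": 0.01, "time_within_window": 0.20, "text_overlap": 0.30}

def fs_weight(comparison):
    """Sum of log likelihood ratios for one (bug report, commit) pair.

    comparison: dict mapping feature name -> bool (does the feature agree?).
    """
    w = 0.0
    for feat, agrees in comparison.items():
        m, u = M_PROBS[feat], U_PROBS[feat]
        w += math.log(m / u) if agrees else math.log((1 - m) / (1 - u))
    return w

def classify(comparison, upper=2.0, lower=-2.0):
    """Link / possible link / non-link decision; the thresholds are illustrative."""
    w = fs_weight(comparison)
    return "link" if w > upper else ("non-link" if w < lower else "possible link")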


Nathani A.,Dhirubhai Ambani Institute of ICT | Chaudhary S.,Dhirubhai Ambani Institute of ICT | Somani G.,LNMIIT
Future Generation Computer Systems | Year: 2012

At present, most Infrastructure as a Service (IaaS) clouds use simple resource allocation policies such as immediate and best effort. The immediate allocation policy allocates the resources if they are available; otherwise the request is rejected. The best-effort policy also allocates the requested resources if available; otherwise the request is placed in a FIFO queue. With finite resources, a cloud provider cannot satisfy all requests at a given time. Haizea is a resource lease manager that tries to address these issues by introducing more sophisticated resource allocation policies. Haizea uses resource leases as the resource allocation abstraction and implements these leases by allocating virtual machines (VMs). Haizea supports four kinds of resource allocation policies: immediate, best effort, advanced reservation, and deadline sensitive. This work provides a better way to support deadline-sensitive leases in Haizea while minimizing the total number of leases it rejects. The proposed dynamic-planning-based scheduling algorithm is implemented in Haizea and can admit new leases and re-plan the schedule whenever a new lease can be accommodated. Experimental results show that it improves resource utilization and the acceptance of leases compared to the existing algorithm of Haizea. © 2010 Elsevier B.V. All rights reserved.
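
A simplified admission test conveying the dynamic-planning idea (Haizea itself schedules multi-node VM leases, so this single-resource earliest-deadline-first check is only an illustrative stand-in, not Haizea's code): when a new deadline-sensitive lease arrives, the scheduler tentatively re-plans all accepted leases and admits the new one only if every deadline can still be met.

from dataclasses import dataclass

@dataclass
class Lease:
    lease_id: int
    duration: float   # required run time
    deadline: float   # absolute deadline

def edf_feasible(leases, now=0.0):
    """True if every lease can finish by its deadline on one resource (EDF order)."""
    t = now
    for lease in sorted(leases, key=lambda l: l.deadline):
        t += lease.duration
        if t > lease.deadline:
            return False
    return True

def try_admit(accepted, new_lease, now=0.0):
    """Dynamic planning on arrival: re-plan and admit only if still feasible."""
    if edf_feasible(accepted + [new_lease], now):
        accepted.append(new_lease)
        return True
    return False   # lease rejected; the existing schedule is left untouched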


Kant V.,LNMIIT | Dwivedi P.,Motilal Nehru National Institute of Technology
17th International Conference on Information Integration and Web-Based Applications and Services, iiWAS 2015 - Proceedings | Year: 2015

Memory-based collaborative filtering (CF) techniques have been widely implemented in recommender systems (RS) for predicting ratings for unseen items by aggregating the ratings of similar users or items. Usually, sufficient ratings from similar users or similar items are not available in the rating matrix due to the data sparsity problem. Further, these techniques suffer from correlation-based problems inherent in the similarity measures used. Consequently, higher prediction accuracy cannot be achieved. In this paper, we propose the use of a fuzzy Naïve Bayesian (FNB) classifier for user-based and item-based CF, which implicitly computes the similarity between users as well as items on the basis of conditional probabilities, and develop FNB user-based CF (FNB-UB-CF) and FNB item-based CF (FNB-IB-CF). We further develop a hybrid RS (FNB-UB-IB-CF) by combining the proposed FNB-UB-CF and FNB-IB-CF. This combination helps alleviate sparsity because both user ratings and item ratings are employed. Experimental results on the popular MovieLens dataset demonstrate that the proposed methods are indeed more robust against data sparsity and give better recommendation quality. © 2015 ACM.
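
A compact sketch of how fuzzy conditional probabilities can drive a user-based prediction; the triangular membership functions, the 1-5 rating scale, and the single-neighbour setting are assumptions made for illustration (a Naïve Bayes combination over many neighbours would multiply such evidence), and this is not the authors' implementation.

import numpy as np

LEVELS = np.arange(1, 6)   # assumed 1-5 rating scale

def membership(r, level, width=1.0):
    """Triangular fuzzy membership of rating r in the class centred at `level`."""
    return max(0.0, 1.0 - abs(r - level) / width)

def fnb_conditional(target, neighbour):
    """P(target rates l | neighbour rates k) from fuzzy co-occurrence counts.

    target, neighbour: aligned rating vectors over co-rated items.
    """
    counts = np.ones((len(LEVELS), len(LEVELS)))   # Laplace smoothing
    for rt, rn in zip(target, neighbour):
        for i, lt in enumerate(LEVELS):
            for j, ln in enumerate(LEVELS):
                counts[i, j] += membership(rt, lt) * membership(rn, ln)
    return counts / counts.sum(axis=0, keepdims=True)   # columns sum to one

def predict(target, neighbour, neighbour_new_rating):
    """Expected rating of the target user on an item the neighbour has rated."""
    cond = fnb_conditional(target, neighbour)
    evidence = np.array([membership(neighbour_new_rating, l) for l in LEVELS])
    posterior = cond @ (evidence / evidence.sum())
    return float(np.dot(LEVELS, posterior))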


Jadhav T.,IIT Indore, Madhya Pradesh | Misra R.,IIT Indore, Madhya Pradesh | Biswas S.,LNMIIT | Sharma G.D.,R and D Center for Science and Engineering
Physical Chemistry Chemical Physics | Year: 2015

In this study, we have used three D-A type carbazole-substituted BODIPY (carbazole connected to the meso position of BODIPY) small molecules as donors, along with PC71BM as an electron acceptor, for the fabrication of solution-processed bulk heterojunction organic solar cells. The devices based on the as-cast active layer showed power conversion efficiencies in the range of 2.20-2.70%, with high open circuit voltages (Voc) in the range of 0.94-1.08 V. The high Voc is related to the deeper highest occupied molecular orbital energy level of these small molecules. The power conversion efficiency (PCE) of devices based on thermally annealed (TA) and solvent vapor annealed (TSVA) 3a:PC71BM and 3c:PC71BM active layers improved up to 5.05% and 4.80%, respectively, attributed to the improved light harvesting ability of the active layers, better phase separation for exciton dissociation, and balanced charge transport induced by the TA and TSVA treatments. This journal is © the Owner Societies.


Shekhawat G.K.,LNMIIT | Karmakar P.,LNMIIT
International Conference on Ubiquitous and Future Networks, ICUFN | Year: 2016

Cooperative Spectrum Sensing (CSS) is a reliable detection approach that performs better over faded channels. In CSS, all Secondary Users (SUs) have equal weight. We have studied the performance of Centralized CSS (CCSS) using a Normal Factor Graph (NFG) model with logical OR and Neyman-Pearson based Likelihood Ratio Test (LRT) fusion rules. Here, we propose a novel Weighted CCSS (WCCSS) scheme using an NFG-based probabilistic inference model. The Sum-Product Algorithm (SPA) is used for message passing, and different weight assignment strategies are considered. The performance of the WCCSS approach in a Cognitive Radio Network (CRN) under different channels has been studied through simulation. © 2016 IEEE.
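
The full factor-graph inference is more involved, but the weighted soft combining at the fusion centre can be sketched as below, with the hard OR rule shown for comparison; the weights, threshold, and example statistics are illustrative assumptions, not values from the paper.

import numpy as np

def or_rule(local_decisions):
    """Hard OR fusion: declare the primary user present if any SU reports it."""
    return int(any(local_decisions))

def weighted_llr_fusion(llrs, weights, threshold=0.0):
    """Weighted soft fusion of per-SU log-likelihood ratios (Neyman-Pearson style).

    llrs:    per-SU log-likelihood ratios of the sensed statistic.
    weights: per-SU reliability weights (e.g. reflecting channel quality);
             equal weights recover the unweighted CCSS case.
    """
    w = np.asarray(weights, dtype=float)
    statistic = float(np.dot(w / w.sum(), llrs))
    return int(statistic > threshold)

# Example: three SUs, the second on a badly faded channel gets a small weight.
# decision = weighted_llr_fusion([1.2, -0.4, 0.9], weights=[1.0, 0.2, 1.0])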


Nahar S.,LNMIIT | Joshi M.V.,Dhirubhai Ambani Institute of ICT
Proceedings - 3rd IAPR Asian Conference on Pattern Recognition, ACPR 2015 | Year: 2015

In this work, we propose to use an inhomogeneous Gaussian Markov random field (IGMRF) prior and a sparsity-based prior in a regularization framework in order to estimate the dense disparity map. The IGMRF prior captures the spatial variation among disparities locally while preserving sharp discontinuities. The sparsity prior captures additional structure, namely the sparseness of the disparity map. The sparseness of the disparities is represented over an overcomplete dictionary learned from the estimated disparity map of the given stereo pair using the K-singular value decomposition (K-SVD) algorithm, so the dictionary atoms are adaptive to the disparities of the given stereo pair. This sparse representation of disparities is used as a prior and combined with the IGMRF prior in an energy minimization framework for estimating the disparity map. The disparity map is estimated using a two-phase iterative algorithm. In phase one, the IGMRF parameters are computed at each pixel location, the dictionary is learned, and the sparse representation of the disparities is obtained while keeping the disparity map fixed; in phase two, the disparity map is estimated keeping the other parameters fixed. Experimental results on the standard dataset demonstrate the effectiveness of the proposed approach. © 2015 IEEE.
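
To show where the sparsity prior enters, the sketch below sparse-codes disparity patches over an already-learned dictionary with a plain orthogonal matching pursuit and uses the reconstruction as the sparsity-regularized estimate; the patch size, sparsity level, and the K-SVD training of the dictionary itself are assumptions or omissions, and this is not the authors' code.

import numpy as np

def omp(D, y, n_nonzero=5):
    """Plain orthogonal matching pursuit: sparse code of signal y over dictionary D."""
    residual, support = y.copy(), []
    x = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x

def sparsify_disparity(disparity, D, patch=8):
    """Replace non-overlapping disparity patches by their sparse reconstructions."""
    out = disparity.astype(float)
    h, w = disparity.shape
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            y = out[i:i + patch, j:j + patch].ravel()   # D must have patch*patch rows
            out[i:i + patch, j:j + patch] = (D @ omp(D, y)).reshape(patch, patch)
    return out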


Saxena N.,Sungkyunkwan University | Sahu B.J.R.,LNMIIT | Han Y.S.,Sungkyul University
IEEE Communications Letters | Year: 2014

Commercial deployment of 4G LTE networks and the rapid penetration of smartphones have exponentially increased wireless data traffic, thereby increasing energy consumption and greenhouse gas (CO2) emissions. The concept of green 4G LTE networks lies in developing energy-efficient LTE systems to reduce greenhouse gas emissions as well as operators' energy bills. In this letter, we first identify the complexity of optimal traffic awareness in LTE networks and subsequently design a cooperative communication framework for traffic-aware energy optimization. The LTE eNBs use an information-theoretic approach to capture the dynamics and uncertainty of network traffic. Subsequently, using an online, stochastic game-theoretic algorithm, the eNBs communicate amongst themselves to optimize traffic awareness. Optimal traffic awareness helps reduce network energy consumption. Simulation results demonstrate that our framework yields almost 22% (40 kWh) daily energy savings across an LTE network of 400 cells. © 2012 IEEE.
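
As a rough illustration of the information-theoretic ingredient only (the cooperative game among eNBs is omitted, and this is not the paper's algorithm), an eNB can summarize the uncertainty of its observed traffic with the empirical Shannon entropy of binned load samples; the bin count and the thresholds below are assumptions.

import numpy as np

def traffic_entropy(load_samples, n_bins=10):
    """Empirical Shannon entropy (in bits) of an eNB's observed traffic load."""
    hist, _ = np.histogram(load_samples, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def sleep_candidates(cell_loads, entropy_threshold=1.0, load_threshold=0.2):
    """Cells whose traffic is both low and predictable are candidates for
    energy-saving modes, subject to neighbouring cells absorbing their load."""
    candidates = []
    for cell_id, samples in cell_loads.items():
        samples = np.asarray(samples, dtype=float)
        if samples.mean() < load_threshold and traffic_entropy(samples) < entropy_threshold:
            candidates.append(cell_id)
    return candidates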
