Delhi, India

Priyadarshi P.,GGSIPU | Rai C.S.,University of Delhi
Proceeding - IEEE International Conference on Computing, Communication and Automation, ICCCA 2016 | Year: 2016

In this paper the authors address problems in the implementation of blind adaptation techniques. Blind adaptation techniques show poorer convergence properties than supervised techniques, which use training sequences for adaptation, e.g. the least mean square algorithm. Gradient-descent-based adaptation/estimation is one of the most widely used blind channel adaptation/estimation schemes. The most common gradient-descent-based blind channel adaptation/estimation algorithm is the Constant Modulus Algorithm (CMA), which suffers from poor convergence and is also phase blind. In this work, the authors present a new modified CMA for blind equalization, whose update equation is based on a mean fourth error criterion. MATLAB simulation results demonstrate faster convergence and lower BER than the standard CMA in a noisy environment. © 2016 IEEE.
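
The paper's modified update equation is not given in the abstract; for orientation only, the following is a minimal sketch of the standard CMA tap update for a linear equalizer, with the tap count, step size and dispersion constant as assumed example values.

import numpy as np

def cma_equalize(x, num_taps=11, mu=1e-3, R2=1.0):
    """Standard CMA tap update for a linear equalizer (sketch).

    x        : received complex baseband samples
    num_taps : equalizer length
    mu       : step size
    R2       : constant-modulus dispersion constant of the source constellation
    """
    w = np.zeros(num_taps, dtype=complex)
    w[num_taps // 2] = 1.0                      # centre-spike initialisation
    y = np.zeros(len(x) - num_taps, dtype=complex)
    for n in range(len(y)):
        u = x[n:n + num_taps][::-1]             # regressor vector
        y[n] = np.dot(w.conj(), u)              # equalizer output
        e = y[n] * (np.abs(y[n]) ** 2 - R2)     # constant-modulus error term
        w -= mu * e.conj() * u                  # stochastic-gradient tap update
    return y, w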


Mudgil P.,Banasthali University | Sharma A.K.,YMCAU | Gupta P.,GGSIPU
Proceedings - 5th International Conference on Computational Intelligence and Communication Networks, CICN 2013 | Year: 2013

Owing to the dynamic nature of the web, it is difficult for a search engine to find the documents relevant to a user query. For this purpose, the search engine maintains an index of the downloaded documents stored in its local repository. Whenever a query arrives, the search engine searches the index to find the relevant matching results to be presented to the user. The quality of the matched results depends on the information stored in the index: the more efficient the structure of the index, the better the performance of the search engine. Generally, inverted indexes are based solely on the frequency of keywords across documents. To improve the efficiency of the search engine, an improved indexing mechanism for web documents is proposed that keeps context-related information integrated with keyword frequency. The structure is implemented using a trie. Implementation results on various documents show that the proposed index stores documents efficiently and that search is fast. © 2013 IEEE.
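
The abstract does not specify the trie layout or how context is attached to each posting; the sketch below shows one plausible arrangement, a character trie whose terminal nodes hold postings of (document id, frequency, context tag) triples. All class and field names here are illustrative assumptions, not the authors' design.

class TrieNode:
    def __init__(self):
        self.children = {}           # char -> TrieNode
        self.postings = []           # list of (doc_id, frequency, context)

class ContextIndex:
    """Toy context-aware inverted index backed by a character trie."""
    def __init__(self):
        self.root = TrieNode()

    def add(self, keyword, doc_id, freq, context):
        node = self.root
        for ch in keyword.lower():
            node = node.children.setdefault(ch, TrieNode())
        node.postings.append((doc_id, freq, context))

    def lookup(self, keyword):
        node = self.root
        for ch in keyword.lower():
            node = node.children.get(ch)
            if node is None:
                return []
        # rank postings by keyword frequency within each document
        return sorted(node.postings, key=lambda p: p[1], reverse=True)

index = ContextIndex()
index.add("python", doc_id=1, freq=5, context="programming")
index.add("python", doc_id=2, freq=2, context="zoology")
print(index.lookup("python"))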


Sharma A.K.,YMCAU | Gupta P.,GGSIPU | Singh S.K.,JIITU
Proceedings - International Conference on Communication Systems and Network Technologies, CSNT 2012 | Year: 2012

The organization and retrieval of information from hyper-linked documents is a challenging task for a search engine, which is expected to satisfy user queries with relevant content in the first few results displayed to the user. This requirement implies a need to maintain an index of relevant, context-based pages. This paper provides a novel technique for indexing web documents based on the context of keywords, which helps a search engine serve a query with more specific documents and relevant content. © 2012 IEEE.


Gosain A.,GGSIPU | Heena,GGSIPU
Procedia Computer Science | Year: 2016

Storing pre-computed views in a data warehouse can substantially reduce query processing cost for decision support queries. The problem is to choose an optimal set of materialized views. Various frameworks, such as lattices, MVPP and AND-OR graphs, and algorithms such as heuristic-based, greedy and stochastic algorithms have been proposed in the literature for materialized view selection. Heuristic and greedy algorithms become slow in high-dimensional search spaces, while stochastic algorithms do not guarantee a globally optimal solution but reach a near-optimal solution quickly and efficiently. In this paper we implement the Particle Swarm Optimization (PSO) algorithm, one of the stochastic algorithms, on the lattice framework to select an optimal set of views for materialization in a data warehouse by minimizing query processing cost. We compare our results with a genetic algorithm to demonstrate the effectiveness of PSO over the genetic algorithm. © 2016 The Authors.
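
The abstract does not give the particle encoding or the cost model used on the lattice; the sketch below assumes a common binary-PSO formulation in which each particle is a 0/1 vector over lattice views and a caller-supplied cost function returns total query processing cost. Parameter values and the toy cost function are assumptions for illustration.

import numpy as np

def pso_view_selection(cost_fn, num_views, swarm=20, iters=100,
                       w=0.7, c1=1.5, c2=1.5, seed=0):
    """Binary-PSO sketch: each particle marks which lattice views to
    materialize; cost_fn maps a 0/1 selection vector to a query cost."""
    rng = np.random.default_rng(seed)
    pos = rng.integers(0, 2, size=(swarm, num_views)).astype(float)
    vel = rng.normal(0, 1, size=(swarm, num_views))
    pbest = pos.copy()
    pbest_cost = np.array([cost_fn(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, swarm, num_views))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # sigmoid transfer function turns velocities into bit probabilities
        pos = (rng.random((swarm, num_views)) < 1 / (1 + np.exp(-vel))).astype(float)
        cost = np.array([cost_fn(p) for p in pos])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
        if pbest_cost.min() < cost_fn(gbest):
            gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

# Toy cost: penalize storage (number of materialized views) and empty selections.
toy_cost = lambda sel: 5 * sel.sum() + 100 * (sel.sum() == 0)
print(pso_view_selection(toy_cost, num_views=15))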


Gosain A.,GGSIPU | Heena,GGSIPU
Procedia Computer Science | Year: 2015

The quality of a data warehouse is crucial for managerial strategic decisions. Multidimensional data modeling has been accepted as the basis for data warehousing, so data model quality has a great impact on the overall quality of a data warehouse. Metrics act as a tool to measure the quality of a data warehouse model. Various authors have proposed metrics to assess quality attributes of conceptual data models for data warehouses, such as understandability and maintainability. This related work motivates us to investigate the metrics proposed to measure data warehouse data model quality and the quality factors they assess, and to provide groundwork for further research in this field. A total of 22 studies were selected and analyzed to identify the validation techniques used to establish the usage and practical utility of the metrics and the quality factors measured by them. Opportunities for future work lie in the gaps found in the validation of the metrics and in the quality factors not yet measured. © 2015 The Authors. Published by Elsevier B.V.


Aggarwal B.,GGSIPU | Gupta M.,University of Delhi | Gupta A.K.,National Institute of Technology Kurukshetra
Proceedings of the 2013 International Conference on Advances in Computing, Communications and Informatics, ICACCI 2013 | Year: 2013

A new low-voltage level-shifted flipped voltage follower (FVF) current mirror is proposed. The proposed current mirror has very low input resistance and high output resistance, and is shown to have a wider operating range than the FVF-based current mirror. Simulations of the proposed current mirror are carried out using Mentor Graphics Eldo SPICE based on TSMC 0.18 μm CMOS technology, for an input current range of 0-280 μA. The proposed current mirror operates with a single supply voltage of 1 V and consumes 402.5 μW of power at a 50 μA input current. The simulation results show input and output resistances of 26 Ω and 562 kΩ respectively. © 2013 IEEE.


Nagpal S.,Netaji Subhas Institute of Technology | Gosain A.,GGSIPU | Sabharwal S.,Netaji Subhas Institute of Technology
International Journal of Systems Assurance Engineering and Management | Year: 2013

Structural complexity metrics have been widely used to assess the quality of an artefact. Researchers have previously defined complexity metrics to assess the quality of multidimensional models for data warehouses. These metrics consider various elements such as facts, dimensions and dimension hierarchies, but do not take into account the relationships among these elements. In our previous work, a comprehensive complexity metric for multidimensional models was proposed that considers not only the complexity due to the elements but also the structural complexity due to the relationships among them. However, that proposal lacked theoretical and empirical validation, so the practical utility of the metric could not be established. This paper validates the proposed metric both theoretically and empirically. The theoretical validation using Briand's framework shows that the proposed metric satisfies most of the properties required of a complexity measure. Empirical validation is carried out to observe the relationship between the complexity metric and understandability, a sub-characteristic of maintainability, of multidimensional models. The results show that the metric has a significant positive correlation with the understandability of multidimensional models. A predictive model based on ordinal regression proposed in this work indicates that the complexity metric may act as an objective indicator of understandability, as the accuracy of the model is 86.3 %, which is quite high. © 2013 The Society for Reliability Engineering, Quality and Operations Management (SREQOM), India and The Division of Operation and Maintenance, Lulea University of Technology, Sweden.
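
The abstract does not include the data, the correlation test or the exact ordinal regression setup; the sketch below only illustrates the general workflow of such an empirical validation, using a rank correlation followed by a proportional-odds ordinal regression on entirely hypothetical metric values and understandability scores.

import numpy as np
from scipy.stats import spearmanr
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical data: one complexity-metric value per multidimensional model
# and an ordinal understandability score (0 = low, 1 = medium, 2 = high).
metric = np.array([4, 7, 9, 12, 15, 18, 21, 25, 28, 31], dtype=float)
underst = np.array([0, 0, 1, 0, 1, 1, 2, 1, 2, 2])

rho, p = spearmanr(metric, underst)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

# Proportional-odds (ordinal logistic) regression of the score on the metric.
res = OrderedModel(underst, metric[:, None], distr="logit").fit(
    method="bfgs", disp=False)
probs = np.asarray(res.predict(metric[:, None]))
print("classification accuracy:", (probs.argmax(axis=1) == underst).mean())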


Aggarwal T.,GGSIPU | Furqan A.,GGSIPU | Kalra K.,GGSIPU
2015 International Conference on Advances in Computing, Communications and Informatics, ICACCI 2015 | Year: 2015

This paper presents a computational system for the detection and classification of lung nodules from chest CT scan images. In this study we consider the case of primary lung cancer. Optimal thresholding and gray-level characteristics are used to segment lung nodules from the lung volume. After detection of lung mass tissue, geometrical features are extracted. Simple image processing techniques such as filtering and morphological operations are applied to CT images collected from the Cancer Imaging Archive database to make the study effective and efficient. To distinguish nodules from normal pulmonary structure, the geometrical features are combined with an LDA (linear discriminant analysis) classifier. The GLCM technique is used to calculate statistical features. The results show that the proposed methodology successfully detects and classifies nodules and normal anatomical structure based on geometrical, statistical and gray-level characteristics, achieving 84 % accuracy, 97.14 % sensitivity and 53.33 % specificity. © 2015 IEEE.
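
The abstract does not list the exact GLCM properties, preprocessing steps or feature set; the following is a minimal sketch of GLCM texture features feeding an LDA classifier, using scikit-image and scikit-learn, with the patch size, property choice and the randomly generated training data as stand-in assumptions.

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def glcm_features(patch):
    """GLCM statistics for an 8-bit grayscale candidate-region patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")]

# Hypothetical training data: candidate-region patches (e.g. cut out after
# optimal thresholding) and labels (1 = nodule, 0 = normal structure).
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 2, size=40)

X = np.array([glcm_features(p) for p in patches])
clf = LinearDiscriminantAnalysis().fit(X, labels)
print("training accuracy:", clf.score(X, labels))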


Rawal Y.,GGSIPU | Basra V.,GGSIPU | Ahuja A.,GGSIPU | Garg B.,GGSIPU
2015 International Conference on Computing for Sustainable Global Development, INDIACom 2015 | Year: 2015

Our approach seeks a suitable means of accessing and traversing a graph whose edges have continuously changing weights assigned to them. We studied combinatorial properties of graphs, allowing us to arrive at a newer and easier method to fully traverse a dynamic graph that simulates a real-time environment in which edge weights change continuously. Our approach yields a fully dynamic algorithm for general directed graphs with non-negative edge weights that are revalued continuously. The sequence of operations takes O(n²) amortized time per update and unit worst-case time per distance query, where n is the number of vertices. The focus of our study is primarily to ensure that, in an environment where several nodes co-exist and the weights and paths on the edges change irregularly, there is still a practical method to calculate the shortest path. Our algorithm is deterministic and uses simple data structures, and is therefore efficient for its goal. © 2015 IEEE.
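
The paper's actual update procedure is not described in the abstract; the sketch below only illustrates the interface of a fully dynamic all-pairs shortest-path structure with O(1) distance queries, using a naive Floyd-Warshall recomputation on every weight update, which costs O(n³) per update and therefore does not achieve the paper's O(n²) amortized bound.

import numpy as np

class DynamicAPSP:
    """Naive fully dynamic all-pairs shortest paths: O(1) queries,
    O(n^3) recompute per weight update (a baseline, not the paper's method)."""
    def __init__(self, weights):
        self.w = np.array(weights, dtype=float)   # adjacency matrix, np.inf = no edge
        self._recompute()

    def _recompute(self):
        d = self.w.copy()
        np.fill_diagonal(d, 0.0)
        for k in range(len(d)):                   # Floyd-Warshall relaxation
            d = np.minimum(d, d[:, [k]] + d[[k], :])
        self.dist = d

    def update_edge(self, u, v, weight):
        self.w[u, v] = weight                     # edge weight revalued
        self._recompute()

    def query(self, u, v):
        return self.dist[u, v]

INF = float("inf")
g = DynamicAPSP([[0, 3, INF], [INF, 0, 1], [2, INF, 0]])
print(g.query(0, 2))       # 4.0 via vertex 1
g.update_edge(0, 2, 1.0)
print(g.query(0, 2))       # 1.0 after the update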


Agarwal S.,GGSIPU | Singh A.P.,GGSIPU | Anand N.,AIACTR
2013 4th International Conference on Computing, Communications and Networking Technologies, ICCCNT 2013 | Year: 2013

The paper reviews and introduces state-of-the-art nature-inspired metaheuristic algorithms for optimization, including the Firefly algorithm, the PSO algorithm and the ABC algorithm. By implementing these algorithms in MATLAB, we use worked examples to show how each algorithm works. Many noisy, non-linear mathematical optimization problems can be solved effectively by metaheuristic algorithms. Mathematical optimization, or programming, is the study of such planning and design problems using mathematical tools, and computer simulations have become an indispensable aid for solving them with efficient search algorithms. Nature-inspired algorithms are among the most powerful algorithms for optimization. The Firefly Algorithm (FA) is a recent nature-inspired, swarm intelligence optimization algorithm that simulates the flashing behaviour and characteristics of fireflies. In this context, three metaheuristics, the Artificial Bee Colony (ABC) algorithm, Particle Swarm Optimization (PSO) and the Firefly algorithm, were applied to find optimal solutions of noisy non-linear continuous mathematical models. A series of computational experiments was conducted with each algorithm, and the simulation results were analyzed and compared against the best solutions found so far; the Firefly algorithm appears to perform better and more efficiently on each noisy non-linear optimization function. © 2013 IEEE.
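
The paper implements these metaheuristics in MATLAB; as orientation only, here is a minimal Python sketch of the standard Firefly algorithm update (attractiveness decaying with distance plus a random walk), with the population size, iteration count and coefficient values as assumed defaults rather than the paper's settings.

import numpy as np

def firefly(obj, dim, n=25, iters=200, alpha=0.2, beta0=1.0, gamma=1.0,
            lower=-5.0, upper=5.0, seed=0):
    """Minimal Firefly algorithm for minimizing obj over a box."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, size=(n, dim))
    f = np.array([obj(p) for p in x])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:                    # j is brighter, so i moves toward j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                    x[i] = np.clip(x[i], lower, upper)
                    f[i] = obj(x[i])
        alpha *= 0.97                              # gradually reduce the random walk
    best = f.argmin()
    return x[best], f[best]

# Example: a noisy sphere function as a stand-in objective.
xbest, fbest = firefly(lambda v: np.sum(v ** 2) + 0.01 * np.random.randn(), dim=2)
print(xbest, fbest)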
