Trabelsi S.,Communication Networks and Security Research Laboratory | Boudriga N.,Communication Networks and Security Research Laboratory
2010 ACS/IEEE International Conference on Computer Systems and Applications, AICCSA 2010 | Year: 2010

The fourth generation of mobile wireless networks (4G) is expected to be the most promising architecture for QoS provision due to its scalability, its convenience for mobility support, and its capability of interworking heterogeneous radio access networks, which ensures both session continuity and QoS support. One major design issue in 4G is the support of optimized handoff functionalities. More specifically, the total disruption during a handoff should be minimized and its complexity hidden from end users. This article focuses on dynamic predictive resource reservation in 4G in order to maximize the handoff success probability. We discuss how to reserve radio resources according to the future mobile terminal location, expressed in a probabilistic way, to the load conditions of the target Base Station/Access Point (BS/AP), and to the specific data structures of each access network. Several resource reservation algorithms are devised in this paper. The objective is to utilize the wireless radio resources efficiently, to enhance handoff performance, and thereby to improve overall system performance. Results from a detailed performance evaluation study are also presented to demonstrate the efficacy of the proposed algorithms.
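The core of the reservation idea above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the function name, the probability/capacity representation, and the capping rule are all assumptions made for clarity.

```python
def predictive_reservation(candidates, demand):
    """Reserve bandwidth at candidate target BS/APs in proportion to the
    probability that the mobile terminal will hand off to each of them,
    capped by each cell's remaining capacity.

    candidates: list of (handoff_probability, free_capacity) pairs,
                one per candidate target BS/AP
    demand:     bandwidth the ongoing session needs after handoff
    """
    reservations = []
    for prob, free in candidates:
        # Weight the request by the predicted handoff probability,
        # then cap it at what the target cell can actually spare.
        requested = prob * demand
        reservations.append(min(requested, free))
    return reservations
```

A lightly loaded, likely target thus gets a large reservation, while an overloaded or unlikely one ties up little of its capacity.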


Djemaiel Y.,Communication Networks and Security Research Laboratory | Boudriga N.,Communication Networks and Security Research Laboratory
Proceedings of the 24th International Business Information Management Association Conference - Crafting Global Competitive Economies: 2020 Vision Strategic Planning and Smart Implementation | Year: 2014

Nowadays, information systems are subject to several kinds of attacks that threaten their normal behavior and may even lead to the loss of provided services. These attacks directly affect the quality of information systems, which should therefore be monitored continuously in order to detect possible degradation of quality metrics and to react efficiently so that acceptable values for these metrics, and hence the required services, are maintained. The degradation of quality metrics can be a useful means of detecting malicious activities that are not reported by intrusion detection systems. This paper addresses this need by proposing a novel Petri Net-based approach that enables the detection of attacks even in the presence of missing or false alerts. It introduces a novel kind of transition that evaluates the degradation of the quality of IS resources and identifies the most probable transitions. The efficiency of the proposed scheme is evaluated on a Petri Net that models a sales management information system together with its attached quality parameters. The proposed scheme enables the prediction of the set of actions that lead to the degradation of the quality of the IS, as well as the localization of the IS resources that should be protected.
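The key idea of a quality-aware transition can be sketched as a transition that fires on metric degradation rather than on an alert token. This toy sketch assumes nothing about the paper's actual Petri Net model; the class and function names, the 0-to-1 quality scale, and the threshold rule are illustrative.

```python
class QualityTransition:
    """Toy transition that is enabled when a monitored quality metric of
    an IS resource degrades below an acceptable threshold, even if no
    intrusion-detection alert (input token) was ever raised."""

    def __init__(self, resource, threshold):
        self.resource = resource
        self.threshold = threshold

    def enabled(self, metrics):
        # metrics maps resource name -> current quality value in [0, 1];
        # an unmonitored resource is assumed healthy (1.0).
        return metrics.get(self.resource, 1.0) < self.threshold


def degraded_resources(transitions, metrics):
    """Resources whose quality transitions are enabled, i.e. the IS
    resources flagged as candidates for protection or reaction."""
    return [t.resource for t in transitions if t.enabled(metrics)]
```

Firing such a transition stands in for an alert that the intrusion detection system missed, which is what lets the model keep tracking an attack despite missing or false alerts.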


Essaddi N.,Communication Networks and Security Research Laboratory | Hamdi M.,Communication Networks and Security Research Laboratory | Habib S.,Kuwait University | Boudriga N.,Communication Networks and Security Research Laboratory
International Journal of Communication Networks and Distributed Systems | Year: 2011

Wireless sensor networks (WSNs) have undergone significant development in recent years. Most of the applications deployed over WSNs impose strong coverage requirements, especially those related to the detection and tracking of distributed events. In this paper, we use the Voronoi tessellation of the region of interest to formulate and solve an evolutionary optimisation problem modelling the activation of the deployed sensors. The major idea behind our approach is to adapt the spatial sensor distribution to the local probability of target presence. We show, through the results of our experiments, that our method allows a non-uniform deployment of the sensor nodes, which is better suited to tracking applications. © 2011 Inderscience Enterprises Ltd.
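The activation principle above, matching sensor density to the local probability of target presence, can be illustrated with a greedy stand-in for the paper's evolutionary optimisation. All names and the ranking rule here are assumptions for illustration only.

```python
def select_active_sensors(sensors, presence_prob, budget):
    """Pick which deployed sensors to activate so that coverage is
    denser where targets are more likely to appear.

    sensors:       list of (sensor_id, position) pairs
    presence_prob: callable mapping a position to the local probability
                   of target presence (e.g. from a Voronoi-cell model)
    budget:        number of sensors that may be active simultaneously
    """
    # Rank sensors by the target-presence probability at their location
    # and activate the top `budget` of them.
    ranked = sorted(sensors, key=lambda s: presence_prob(s[1]), reverse=True)
    return [sensor_id for sensor_id, _ in ranked[:budget]]
```

The evolutionary formulation in the paper searches over activation patterns instead of ranking greedily, but the objective is the same: the resulting active set is non-uniform, concentrated where tracking is most needed.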


Essaddi N.,Communication Networks and Security Research Laboratory | Hamdi M.,Communication Networks and Security Research Laboratory | Habib S.,Kuwait University | Boudriga N.,Communication Networks and Security Research Laboratory
2010 ACS/IEEE International Conference on Computer Systems and Applications, AICCSA 2010 | Year: 2010

Wireless Sensor Networks (WSNs) have inspired tremendous research interest since the mid-1990s. Advances in wireless communication and micro-electromechanical systems (MEMS) have enabled the development of low-cost, low-power, multi-functional, tiny sensor nodes that can sense the environment, perform data processing, and communicate with each other untethered over short distances. Most of the applications deployed over WSNs impose strong coverage requirements, especially those related to the detection and tracking of distributed events. Moreover, detected events are forwarded to the analysis center(s) through a set of sink nodes that locally gather the data emanating from the elementary sensors. This paper proposes a coverage control scheme adapted to the situation where multiple sink nodes are deployed within the monitored area. In contrast to traditional coverage approaches, which aim at guaranteeing a uniform density distribution, we place the sensor nodes in a manner that increases the coverage degree according to their proximity to a sink node. To reduce the complexity of the optimization process, we consider a discrete search space by structuring the monitored area into a uniform grid. An evolutionary algorithm is then used to decide whether or not to activate the sensor nodes within every cell of the grid. We conducted a set of simulations in order to evaluate the performance of the proposed strategy, mainly in ensuring multi-target tracking.
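A fitness function of the kind an evolutionary algorithm could use for this grid formulation might look as follows. The exact scoring in the paper is not specified here; the inverse-distance weighting and all names are illustrative assumptions.

```python
import math


def fitness(active_cells, sinks, grid):
    """Score a candidate activation pattern over a uniform grid: each
    active (covered) cell contributes more to the score the closer it
    lies to its nearest sink node.

    active_cells: set of grid cells in which sensors are activated
    sinks:        list of sink-node positions
    grid:         list of all cell positions (the discrete search space)
    """
    score = 0.0
    for cell in grid:
        if cell in active_cells:
            # Distance to the nearest sink decides this cell's weight.
            d = min(math.dist(cell, s) for s in sinks)
            score += 1.0 / (1.0 + d)  # proximity-weighted coverage
    return score
```

An evolutionary loop would then mutate and recombine activation bit-patterns over the grid cells, keeping those with the highest such score, which naturally concentrates coverage near the sinks.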


Abdallah W.,Communication Networks and Security Research Laboratory | Hamdi M.,Communication Networks and Security Research Laboratory | Noureddine B.,Communication Networks and Security Research Laboratory | Obaidat M.S.,Monmouth University
Simulation | Year: 2010

Optical burst switching (OBS) is a promising method for data transfer in photonic networks based on wavelength division multiplexing (WDM) technology. Transmission Control Protocol (TCP)-based applications generate the majority of data traffic on the Internet; thus, understanding and improving the performance of TCP over OBS networks is critical. In this paper, we develop a novel burst-dropping strategy to improve the quality of service provided by TCP over OBS networks. Our approach relies on random segment dropping according to the capacity of a special optical component, called the optical virtual memory, which is used for buffering purposes within the optical switches. The core node predicts incipient congestion by computing the average blocking duration in the optical virtual memories. When this average exceeds a threshold, segments are randomly dropped. Simulation results show that the proposed method performs better than common techniques in terms of burst loss probability and transmission delay. © 2010 The Society for Modeling and Simulation International.
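The threshold-triggered random dropping described above resembles RED-style active queue management; a minimal sketch follows. The class name, the EWMA smoothing of the blocking duration, and the fixed drop probability are assumptions, not the paper's exact scheme.

```python
import random


class SegmentDropper:
    """Early-dropping sketch: track a smoothed (exponentially weighted
    moving average) blocking duration of the optical virtual memory and,
    once it crosses a threshold, drop arriving segments at random."""

    def __init__(self, threshold, alpha=0.1, drop_prob=0.2):
        self.threshold = threshold  # blocking duration signaling congestion
        self.alpha = alpha          # EWMA smoothing factor
        self.drop_prob = drop_prob  # drop probability once congested
        self.avg_blocking = 0.0

    def observe(self, blocking_duration):
        # Update the average blocking duration with the latest sample.
        self.avg_blocking = ((1 - self.alpha) * self.avg_blocking
                             + self.alpha * blocking_duration)

    def should_drop(self):
        # Below the threshold, never drop; above it, drop at random so
        # TCP sources back off before the buffer overflows entirely.
        if self.avg_blocking <= self.threshold:
            return False
        return random.random() < self.drop_prob
```

Dropping probabilistically before the optical virtual memory saturates gives TCP senders early congestion signals, which is what reduces burst loss and delay relative to drop-on-overflow.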
