
Bennour I.E.,National School in Computer Science
International Design and Test Workshop | Year: 2017

SystemC has been developed as a standard system-level language intended to enable transaction-level modeling (TLM) and the exchange of intellectual property (IP) blocks at multiple abstraction levels. To reuse formal analysis and verification methods on SystemC code, the code has to be translated into a formal representation. The Petri net is one of several mathematical modeling languages for describing communication protocols and programs written in process-oriented parallel languages. In a previous work we dealt with the translation of a SystemC TLM module into a Petri net. In this paper, we extend the translation to verify the consistency of the TLM2 protocol used by a module. © 2016 IEEE.
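The Petri net semantics underlying such a translation can be sketched in a few lines (an illustrative token-game interpreter, not the paper's SystemC translation; the place names are hypothetical):

```python
# A marking maps each place to its token count. A transition is enabled
# when every input place holds enough tokens; firing consumes tokens from
# the inputs and produces tokens on the outputs.
def enabled(marking, inputs):
    return all(marking.get(p, 0) >= n for p, n in inputs.items())

def fire(marking, inputs, outputs):
    assert enabled(marking, inputs), "transition not enabled"
    m = dict(marking)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] = m.get(p, 0) + n
    return m

# A request/response handshake, in the spirit of a TLM transaction:
m0 = {"idle": 1, "req": 0, "resp": 0}
m1 = fire(m0, {"idle": 1}, {"req": 1})   # initiator issues a request
m2 = fire(m1, {"req": 1}, {"resp": 1})   # target answers
assert m2 == {"idle": 0, "req": 0, "resp": 1}
```

Protocol-consistency checks then amount to asking whether any reachable marking violates the expected request/response ordering.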


Nguyen T.T.S.,National School in Computer Science
ACM International Conference Proceeding Series | Year: 2016

In this paper, we propose a novel framework that integrates the word2vec model, a deep learning method, with a book ontology in order to enhance semantic book search. The idea starts from constructing a book ontology for reasoning efficiently about book information. A deep learning method, namely the word2vec model, is then used to represent the words occurring in book descriptions as vectors. These vectors help find the books most relevant to a given query string. The integration of the word2vec model and the book ontology achieves high performance in book search. A database of Amazon books is used to evaluate the proposed method against an advanced keyword-matching method. The experimental results show that the proposed method produces more accurate search results. © 2016 ACM.
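The vector-based ranking step can be sketched as follows (a minimal sketch: the toy word vectors are hypothetical stand-ins for embeddings trained by word2vec on the description corpus, and the ontology reasoning stage is omitted):

```python
import numpy as np

# Toy pre-trained word vectors; in practice these would come from a
# word2vec model trained on the book-description corpus.
vectors = {
    "wizard":  np.array([0.9, 0.1, 0.0]),
    "magic":   np.array([0.8, 0.2, 0.1]),
    "finance": np.array([0.0, 0.9, 0.3]),
    "markets": np.array([0.1, 0.8, 0.4]),
}

def embed(text):
    """Average the word vectors of the known words in a text."""
    words = [w for w in text.lower().split() if w in vectors]
    if not words:
        return np.zeros(3)
    return np.mean([vectors[w] for w in words], axis=0)

def cosine(a, b):
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    return 0.0 if na == 0 or nb == 0 else float(a @ b / (na * nb))

def search(query, descriptions):
    """Rank book descriptions by cosine similarity to the query embedding."""
    q = embed(query)
    return sorted(descriptions, key=lambda d: cosine(q, embed(d)), reverse=True)

books = ["a wizard learns magic", "finance and markets explained"]
print(search("magic wizard", books)[0])  # the most relevant description
```

Because similarity is computed in embedding space rather than by exact keyword matching, descriptions sharing no literal query term can still rank highly when their words are semantically close.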


Bouabana-Tebibel T.,National School in Computer Science | Rubin S.H.,Systems Center Pacific
Information Sciences | Year: 2013

Weak sequencing is the implicit composition operator for interactions defined by the OMG specification. Accordingly, most semantics retain this operator to compose a CombinedFragment with the rest of the interactions, but all of them use only formalisms based on trace or interleaving semantics. True-concurrency-based formalisms ignore the standard interpretation and introduce synchronization on entering and exiting fragments. In this paper, we propose to revise the formal semantics of CombinedFragments using a formalism that offers high expressive power for describing execution traces with regard to true concurrency as well as interleaving. We define an appropriate semantics, in accordance with the UML 2.4 specification regarding the event ordering over the operands and the evaluation of constraints. For this purpose, we propose an approach to translate CombinedFragments into Colored Petri Nets (CPNs). The derived specification is value-oriented, composed of identified objects and events, thus allowing a more precise analysis of the model's behavior. It is verified by model checking. A case study illustrates the approach throughout the paper. © 2013 Elsevier Inc. All rights reserved.


Saadaoui F.,Higher Institute of Applied science and Technology | Rabbouch H.,National School in Computer Science
Expert Systems with Applications | Year: 2014

The paper proposes a parsimonious nonlinear framework for modeling bivariate stochastic processes. The method is a vector autoregressive-like approach equipped with a wavelet-based feedforward neural network, allowing practitioners dealing with extremely random two-dimensional information to make predictions and plan more precisely. Artificial Neural Networks (ANNs) are recognized as powerful computing devices and universal approximators that have proved valuable for a wide range of univariate time series problems. We expand their coverage to handle nonlinear bivariate data. Wavelet techniques are used to strengthen the procedure, since they break up process information into a finite number of sub-signals, from which microscopic patterns can subsequently be extracted in both the time and frequency domains. The proposed model can be especially valuable when modeling nonlinear econophysical systems with a high degree of volatility. © 2014 Elsevier Ltd. All rights reserved.
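The wavelet stage can be illustrated with one level of a Haar transform, splitting each component of a bivariate series into a coarse approximation and a detail sub-signal (a minimal sketch: in the paper's framework these sub-signals would feed the feedforward network, which is omitted here; the data are illustrative):

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-frequency content
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-frequency content
    return approx, detail

def haar_inverse(approx, detail):
    """Reconstruct the original series from its sub-signals."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

series = np.array([[1.0, 3.0, 2.0, 4.0],    # component 1 of the process
                   [0.5, 0.5, 1.5, 2.5]])   # component 2
for comp in series:
    a, d = haar_step(comp)
    assert np.allclose(haar_inverse(a, d), comp)  # decomposition is lossless
```

The decomposition is invertible, so nothing is lost by modeling the approximation and detail sub-signals separately and recombining the forecasts.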


Gould S.,National School in Computer Science
IEEE Transactions on Pattern Analysis and Machine Intelligence | Year: 2015

Markov random fields containing higher-order terms are becoming increasingly popular due to their ability to capture complicated relationships as soft constraints involving many output random variables. In computer vision an important class of constraints encode a preference for label consistency over large sets of pixels and can be modeled using higher-order terms known as lower linear envelope potentials. In this paper we develop an algorithm for learning the parameters of binary Markov random fields with weighted lower linear envelope potentials. We first show how to perform exact energy minimization on these models in time polynomial in the number of variables and number of linear envelope functions. Then, with tractable inference in hand, we show how the parameters of the lower linear envelope potentials can be estimated from labeled training data within a max-margin learning framework. We explore three variants of the lower linear envelope parameterization and demonstrate results on both synthetic and real-world problems. © 2015 IEEE.
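A lower linear envelope potential can be evaluated directly from its definition as a pointwise minimum of linear functions (a minimal sketch under the assumed form psi(y) = min_k (a_k * W(y) + b_k) with W(y) = sum_i w_i * y_i over binary variables; the energy-minimization and max-margin learning machinery is not shown):

```python
def lower_linear_envelope(y, w, lines):
    """Evaluate the envelope for binary labels y, per-variable weights w,
    and linear pieces given as (slope, intercept) pairs."""
    W = sum(wi * yi for wi, yi in zip(w, y))
    return min(a * W + b for a, b in lines)

# A concave envelope over three pixels with uniform weights: the potential
# is cheap when the pixels agree (W near 0 or 3) and costly in between,
# encoding a preference for label consistency over the set.
w = [1.0, 1.0, 1.0]
lines = [(1.0, 0.0), (-1.0, 3.0)]   # rises, then falls: peak mid-range
print(lower_linear_envelope([0, 0, 0], w, lines))  # consistent labeling
print(lower_linear_envelope([1, 1, 0], w, lines))  # mixed labeling costs more
```

Learning, in this view, amounts to estimating the slopes and intercepts of the pieces from labeled data while keeping the envelope concave.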


Kieu T.D.,National School in Computer Science | Chang C.-C.,Feng Chia University
Expert Systems with Applications | Year: 2011

Recently, Zhang and Wang proposed a steganographic scheme by exploiting modification direction (EMD) to embed one secret digit d in the base-(2 × n + 1) notational system into a group of n cover pixels at a time. Therefore, the hiding capacity of the EMD method is log2(2 × n + 1)/n bits per pixel (bpp). In addition, its visual quality is not optimal. To overcome the drawbacks of the EMD method, we propose a novel steganographic scheme that exploits eight modification directions to hide several secret bits in a cover pixel pair at a time. In this way, the proposed method can achieve various hiding capacities of 1, 2, 3, 4, and 4.5 bpp and good visual qualities of 52.39, 46.75, 40.83, 34.83, and 31.70 dB, respectively. The experimental results show that the proposed method outperforms three recently published works, namely Mielikainen's, Zhang and Wang's, and Yang et al.'s methods. © 2011 Published by Elsevier Ltd.
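The EMD capacity formula quoted above is easy to tabulate, which makes the motivation for a pixel-pair scheme concrete:

```python
import math

# The EMD scheme embeds one base-(2n+1) digit into a group of n pixels,
# so its capacity is log2(2n + 1) / n bits per pixel. The capacity falls
# quickly as the group size n grows.
def emd_capacity(n):
    return math.log2(2 * n + 1) / n

for n in (1, 2, 3, 4):
    print(f"n={n}: {emd_capacity(n):.3f} bpp")
# Already at n=2 (pixel pairs) EMD drops to about 1.16 bpp, well below the
# 2-4.5 bpp range the eight-direction pixel-pair method reports.
```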


Bouabana-Tebibel T.,National School in Computer Science
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2012

Presently, the main concern of ad hoc routing protocols is no longer to find an optimal route to a given destination, but to find a safe route free from malicious attackers. Several secure ad hoc routing protocols proposed in the literature are based on public-key cryptography, whose drawback is that it consumes many more resources and consequently degrades network performance. In this paper, we propose a secure routing scheme for the DSR protocol. The proposed scheme combines hash chains and digital signatures to provide a high level of security while reducing the cost of hop-by-hop signature generation and verification. The proposed protocol is analyzed using the NS-2 simulator. © 2012 Springer-Verlag.
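The cost saving of hash chains over per-hop signatures can be sketched generically (an illustrative construction, not the paper's exact protocol): the source signs only the chain anchor once, and each forwarded packet reveals the next preimage, which any node can verify with a few cheap hashes instead of a signature check.

```python
import hashlib

def make_chain(seed, length):
    """Build a hash chain; chain[-1] is the anchor, distributed once
    under a digital signature."""
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verify_link(revealed, anchor, steps):
    """Check that hashing `revealed` `steps` times yields the signed anchor."""
    h = revealed
    for _ in range(steps):
        h = hashlib.sha256(h).digest()
    return h == anchor

chain = make_chain(b"secret-seed", 4)
anchor = chain[-1]
assert verify_link(chain[1], anchor, 3)        # genuine preimage verifies
assert not verify_link(b"forged" * 5, anchor, 3)  # a forgery does not
```

Because SHA-256 is one-way, revealing chain[i] authenticates the sender without letting an attacker compute the earlier, not-yet-revealed values.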


Chakchouk N.,National School in Computer Science
IEEE Communications Surveys and Tutorials | Year: 2015

The great advances made in wireless technology have enabled the deployment of wireless communication networks in some of the harshest environments, such as volcanoes, hurricane-affected regions, and underground mines. In such challenging environments, which lack infrastructure, traditional routing is not efficient and sometimes not even feasible. Moreover, the exponential growth in the number of wireless connected devices has created the need for a new routing paradigm that could benefit from the potential offered by these heterogeneous wireless devices. Hence, in order to overcome the limitations of traditional routing and to increase the capacity of current dynamic heterogeneous wireless networks, the opportunistic routing paradigm has been proposed and developed in recent research works. Motivated by the great interest this new paradigm has attracted within the last decade, we provide a comprehensive survey of the existing literature related to opportunistic routing. We first study the main design building blocks of opportunistic routing. Then, we provide a taxonomy of opportunistic routing proposals, based on their routing objectives as well as the optimization tools and approaches used in the routing design. Five opportunistic routing classes are thus defined and studied in this paper, namely geographic opportunistic routing, link-state-aware opportunistic routing, probabilistic opportunistic routing, optimization-based opportunistic routing, and cross-layer opportunistic routing. We also review the main protocols proposed in the literature for each class. Finally, we identify and discuss the main future research directions related to opportunistic routing design, optimization, and deployment. © 2015 IEEE.


Shivakumara P.,National School in Computer Science | Phan T.Q.,National School in Computer Science | Tan C.L.,National School in Computer Science
IEEE Transactions on Pattern Analysis and Machine Intelligence | Year: 2011

In this paper, we propose a method based on the Laplacian in the frequency domain for video text detection. Unlike many other approaches, which assume that text is horizontally oriented, our method is able to handle text of arbitrary orientation. The input image is first filtered with a Fourier-Laplacian filter. K-means clustering is then used to identify candidate text regions based on the maximum difference. The skeleton of each connected component helps to separate the different text strings from each other. Finally, text string straightness and edge density are used for false positive elimination. Experimental results show that the proposed method is able to handle graphics text and scene text of both horizontal and non-horizontal orientation. © 2011 IEEE.
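The frequency-domain filtering step can be sketched as follows (a minimal sketch of only the first stage of the pipeline; the clustering, skeleton, and false-positive stages are omitted, and the test image is illustrative). By the Fourier transform's derivative property, applying the Laplacian corresponds to multiplying the spectrum by -(u² + v²):

```python
import numpy as np

def fourier_laplacian(image):
    """Apply the Laplacian to an image via the frequency domain."""
    F = np.fft.fft2(image)
    u = np.fft.fftfreq(image.shape[0])[:, None]
    v = np.fft.fftfreq(image.shape[1])[None, :]
    return np.fft.ifft2(-(u**2 + v**2) * F).real

# High-contrast, text-like edges produce large absolute responses, which
# the subsequent clustering stage would pick out as candidate regions.
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0                 # a bright block on a dark background
resp = np.abs(fourier_laplacian(img))
assert resp.max() > resp.mean()     # edges dominate the response
```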


Milovanov A.,National School in Computer Science
Theory of Computing Systems | Year: 2016

Algorithmic statistics is a part of algorithmic information theory (Kolmogorov complexity theory) that studies the following task: given a finite object x (say, a binary string), find an ‘explanation’ for it, i.e., a simple finite set that contains x and in which x is a ‘typical element’. Both notions (‘simple’ and ‘typical’) are defined in terms of Kolmogorov complexity. It is known that this cannot be achieved for some objects: there are “non-stochastic” objects that do not have good explanations. In this paper we study the properties of maximally non-stochastic objects, which we call “antistochastic”. We demonstrate that antistochastic strings have the following property (Theorem 6): if an antistochastic string x has complexity k, then any k bits of information about x are enough to reconstruct x (with logarithmic advice). In particular, if we erase all but k bits of this antistochastic string, the erased bits can be restored from the remaining ones (with logarithmic advice). As a corollary we obtain the existence of good list-decoding codes for erasures (or other ways of deleting part of the information). Antistochastic strings can also be used as a source of counterexamples in algorithmic information theory. We show that the symmetry of information property fails for total conditional complexity on antistochastic strings. An extended abstract of this paper was presented at the 10th International Computer Science Symposium in Russia (Milovanov, 2015). © 2016 Springer Science+Business Media New York
