Key Laboratory of Computer Network and Information Integration Southeast University

China


Wu J.,Nanjing University of Posts and Telecommunications | Wu J.,Key Laboratory of Computer Network and Information Integration Southeast University | Wu J.,University of Victoria | Zhu Y.,Nanjing University of Posts and Telecommunications | And 4 more authors.
2016 IEEE Global Communications Conference, GLOBECOM 2016 - Proceedings | Year: 2016

Delay-Tolerant Networks (DTNs) are wireless mobile networks in which nodes are sparse and end-to-end connectivity is rare. Since DTN nodes are mostly energy-limited devices, energy-efficient routing protocols are needed so that the network performs better and functions longer. Moreover, in the real world, the people carrying the nodes form communities around shared interests and behave with social selfishness. Improving energy efficiency in such multi-community scenarios is therefore an important problem. In this paper, we analytically model the performance of epidemic routing protocols in multi-community scenarios with social selfishness using Ordinary Differential Equations (ODEs). We further propose an energy-efficient copy-limit-optimized algorithm for epidemic routing based on Box's complex method, which determines the optimal copy limit in multiple communities and improves energy efficiency effectively. Finally, both numerical and simulation results show that the routing protocol with the proposed algorithm reduces energy consumption effectively, and the impact of social selfishness is also analyzed. © 2016 IEEE.
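The classic single-community ODE model underlying epidemic-routing analyses can be sketched as below. This is a minimal illustration, not the paper's multi-community model with selfishness or copy limits; the pairwise meeting rate `beta`, the node count, and the Euler step are assumed toy values.

```python
# Classic ODE model for epidemic routing: I'(t) = beta * I(t) * (N - I(t)),
# where I(t) is the number of nodes holding a copy of the message.
# All parameters below are illustrative assumptions.

def epidemic_copies(n_nodes, beta, t_end, dt=0.01):
    """Integrate I'(t) = beta * I * (N - I) with forward Euler."""
    i = 1.0  # the source initially holds the only copy
    t = 0.0
    while t < t_end:
        i += dt * beta * i * (n_nodes - i)
        t += dt
    return i

copies = epidemic_copies(n_nodes=100, beta=0.005, t_end=20.0)
```

The logistic growth of `copies` toward `n_nodes` is what makes unbounded epidemic routing energy-hungry, and is the quantity a copy limit caps.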


Cai Z.,Nanjing University of Science and Technology | Cai Z.,Key Laboratory of Image and Video Understanding for Social Safety | Cai Z.,Key Laboratory of Computer Network and Information Integration Southeast University | Li X.,Nanjing Southeast University | And 2 more authors.
Future Generation Computer Systems | Year: 2017

Bag-of-Tasks (BoT) workflows are widespread in many big-data analysis fields, yet very few cloud resource provisioning and scheduling algorithms are tailored to them. Furthermore, existing algorithms fail to consider the stochastic task execution times of BoT workflows, which leads to deadline violations and increased resource renting costs. In this paper, we propose a dynamic cloud resource provisioning and scheduling algorithm that aims to meet the workflow deadline by using the sum of the task execution time's expectation and standard deviation to estimate real task execution times. A bag-based delay scheduling strategy and a single-type-based virtual machine interval renting method are presented to decrease the resource renting cost. The proposed algorithm is evaluated with ElasticSim, a cloud simulator extended from CloudSim. The results show that, compared to existing algorithms, the dynamic algorithm decreases the resource renting cost while guaranteeing the workflow deadline. © 2017 Elsevier B.V.
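The estimation rule described in the abstract, expectation plus standard deviation, can be sketched as follows. The sample data and the weighting factor `k` are hypothetical; the paper's abstract implies `k = 1`.

```python
# Hedged sketch of the conservative runtime estimate used for
# deadline-aware provisioning: mean + k * stdev of observed runtimes.
import statistics

def estimate_runtime(samples, k=1.0):
    """Conservative runtime estimate = mean + k * stdev of samples."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples) if len(samples) > 1 else 0.0
    return mu + k * sigma

est = estimate_runtime([10.0, 12.0, 11.0, 14.0, 13.0])
```

Padding the mean by one standard deviation buys slack against stochastic slowdowns, which is how the scheduler reduces deadline violations at a modest extra renting cost.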


Wu J.,Nanjing University of Posts and Telecommunications | Wu J.,Key Laboratory of Computer Network and Information Integration Southeast University | Wu J.,University of Victoria | Wang J.,Nanjing University of Posts and Telecommunications | And 4 more authors.
IEEE Wireless Communications and Networking Conference, WCNC | Year: 2016

Node mobility and end-to-end disconnections in Delay Tolerant Networks (DTNs) greatly weaken the effectiveness of data transmission. Although social-based strategies can address the problem, most existing approaches forward messages with a multi-copy strategy, which inevitably adds unnecessary cost. One of the most important issues is selecting the best intermediate node to forward messages toward the destination. In this paper, we focus on a quality metric for identifying better relays, evaluated by our proposed Reachable Probability Centrality (RPC). RPC combines the contact matrix with multi-hop forwarding probability based on the weighted social network, ensuring effective relay selection. We also propose a distributed RPC-based routing algorithm, which demonstrates the applicability of our scheme in the decentralized environment of DTNs. Extensive trace-driven simulations show that RPC outperforms other centrality measures, and our routing algorithm significantly reduces the data forwarding cost while achieving delivery ratio and delay comparable to those of Epidemic routing. © 2016 IEEE.


Chen Y.,Nanjing Southeast University | Chen Y.,Key Laboratory of Computer Network and Information Integration Southeast University | Yang J.,Beijing Institute of Technology | Shu H.,Nanjing Southeast University | And 11 more authors.
PLoS ONE | Year: 2014

An effective approach termed Recursive Gaussian Maximum Likelihood Estimation (RGMLE) is developed in this paper to suppress 2-D impulse noise. Two algorithms, RGMLE-C and RGMLE-CS, are derived using spatially-adaptive variances, which are estimated from certainty information and from joint certainty and similarity information, respectively. To make the implementation of RGMLE-C and RGMLE-CS reliable, a novel recursion-stopping strategy is proposed that evaluates the estimation error of uncorrupted pixels. Numerical experiments at different noise densities show that the two proposed algorithms yield significantly better results than some typical median-type filters. Efficient implementation is also realized via GPU (Graphics Processing Unit)-based parallelization. © 2014 Chen et al.


Wang Z.,Nanjing Southeast University | Wang Z.,Key Laboratory of Computer Network and Information Integration Southeast University | Wang Z.,Chuzhou University | Li B.,Nanjing Southeast University | And 5 more authors.
SEKE 2011 - Proceedings of the 23rd International Conference on Software Engineering and Knowledge Engineering | Year: 2011

A common problem in object-oriented software integration testing is determining the order in which classes are integrated and tested. In this paper, we first survey related work in the current literature, organized by objective, and then provide some analysis and evaluation.


Liu Q.,Nanjing Southeast University | Liu Q.,Key Laboratory of Computer Network and Information Integration Southeast University | Gao Z.,Nanjing Southeast University | Gao Z.,Key Laboratory of Computer Network and Information Integration Southeast University | And 2 more authors.
Knowledge-Based Systems | Year: 2016

Opinion target extraction, also called aspect extraction, aims to extract fine-grained opinion targets from opinion texts, such as customer reviews of products and services. This task is important because opinions without targets are of limited use. It is one of the core tasks of the popular aspect-oriented opinion mining, and is also among the most challenging tasks tackled by opinion mining researchers. Previous work has shown that the syntactic-based approach, which employs extraction rules about grammar dependency relations between opinion words and aspects (or targets), performs quite well. This approach is highly desirable in practice because it is unsupervised and domain-independent. Its drawback is that the extraction rules must be carefully selected and tuned manually so as not to produce too many errors. Although it is easy to evaluate the accuracy of each rule automatically, it is not easy to select the set of rules that produces the best overall result, due to the overlapping coverage of the rules. In this paper, we propose two approaches to select an effective set of rules. The first employs a greedy algorithm, and the second employs a local search algorithm, specifically simulated annealing. Our experimental results show that the proposed approaches can select a subset of a given rule set that achieves significantly better results than the full rule set and the existing state-of-the-art CRF-based supervised method. © 2016.
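The greedy flavor of rule-subset selection can be sketched as below: repeatedly add the rule that most improves an overall score on held-out annotations. The scoring function (a toy F1 over extracted target sets), the rule names, and the data are hypothetical placeholders, not the paper's rule set.

```python
# Illustrative greedy selection of an extraction-rule subset,
# in the spirit of the paper's first approach. Rules and gold
# targets below are hypothetical toy data.

def f1(extracted, gold):
    """F1 between an extracted target set and the gold target set."""
    if not extracted:
        return 0.0
    tp = len(extracted & gold)
    if tp == 0:
        return 0.0
    p, r = tp / len(extracted), tp / len(gold)
    return 2 * p * r / (p + r)

def greedy_select(rules, gold):
    """rules: dict name -> set of targets that rule extracts."""
    chosen, extracted, best = [], set(), 0.0
    improved = True
    while improved:
        improved = False
        for name in sorted(rules):
            if name in chosen:
                continue
            score = f1(extracted | rules[name], gold)
            if score > best:
                best, pick = score, name
                improved = True
        if improved:
            chosen.append(pick)
            extracted |= rules[pick]
    return chosen, best

gold = {"battery", "screen", "price"}
rules = {"r1": {"battery", "screen"},
         "r2": {"price", "noise"},
         "r3": {"noise", "weight"}}
chosen, best = greedy_select(rules, gold)
```

On this toy data the greedy loop stops after `r1` and `r2`, since adding the noisy `r3` would lower F1 — exactly the overlapping-coverage effect that makes a well-chosen subset beat the full rule set.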


Tao C.,Nanjing Southeast University | Tao C.,Key Laboratory of Computer Network and Information Integration Southeast University | Tao C.,San Jose State University | Li B.,Nanjing Southeast University | And 2 more authors.
Journal of Software | Year: 2013

Today, component-based software engineering is widely used in software construction to reduce project cost and shorten the software development cycle. Because software changes with each new release or component update, regression testing is needed to assure system quality. When changes are made to a component, the component itself may be affected, and the changes may also impact the entire system. We first identify diverse changes made to components and the system based on models, then perform change impact analysis, and finally refresh the regression test suite using a state-based testing practice. Related existing research has not addressed systematic regression testing of component-based software, especially at the system level. The paper also reports a case study based on a realistic component-based software system, which shows that the approach is feasible and effective. © 2013 ACADEMY PUBLISHER.


Liu F.,Nanjing Southeast University | Li B.,Key Laboratory of Computer Network and Information Integration Southeast University
SEKE 2011 - Proceedings of the 23rd International Conference on Software Engineering and Knowledge Engineering | Year: 2011

This paper presents a novel method for flow-sensitive pointer analysis of multithreaded programs based on Petri nets. The method borrows the causal data-flow analysis idea of Azadeh Farzan. A Petri net is used to describe the control-flow structure of the multithreaded program, and points-to information is propagated along the causal dependencies of events in the partial-order execution of the Petri net. The pointer analysis problem is thereby reduced to the coverability problem on the Petri net.


Wen W.,Nanjing Southeast University | Wen W.,Key Laboratory of Computer Network and Information Integration Southeast University | Li B.,Nanjing Southeast University | Li B.,Key Laboratory of Computer Network and Information Integration Southeast University | And 4 more authors.
SEKE 2011 - Proceedings of the 23rd International Conference on Software Engineering and Knowledge Engineering | Year: 2011

Spectrum-based fault localization mainly uses test coverage information to calculate the suspiciousness of each program element in order to find the faulty element. However, this technique does not fully take into consideration the dependences between program elements, which limits its capacity for efficient fault localization. This paper combines program slicing with the program spectrum technique and proposes a program slicing spectrum-based software fault localization (PSS-SFL) technique. First, PSS-SFL analyzes dependences between program elements and deletes elements unrelated to the failed test outputs; then it builds the program slicing spectrum model and defines a novel suspiciousness metric for each slice element; finally, the faulty element is located according to the suspiciousness metric results. Experimental results show that PSS-SFL can be more effective and precise at locating faults than the program spectrum-based Tarantula technique.
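For reference, the baseline Tarantula suspiciousness metric the paper compares against can be sketched as below (the PSS-SFL metric itself is not reproduced here; this is only the standard published formula).

```python
# Tarantula suspiciousness for a program element e:
#   susp(e) = (failed(e)/F) / (failed(e)/F + passed(e)/P)
# where F and P are the total numbers of failed and passed tests.

def tarantula(passed_e, failed_e, total_passed, total_failed):
    pass_ratio = passed_e / total_passed if total_passed else 0.0
    fail_ratio = failed_e / total_failed if total_failed else 0.0
    if pass_ratio + fail_ratio == 0:
        return 0.0
    return fail_ratio / (pass_ratio + fail_ratio)

# An element covered by every failed test and no passed test is
# maximally suspicious:
s = tarantula(passed_e=0, failed_e=3, total_passed=7, total_failed=3)
```

Because the score depends only on coverage counts, it ignores dependences between elements, which is the limitation PSS-SFL targets by slicing away elements unrelated to the failed outputs.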


Wang Z.,Northeastern University China | Zhao Y.,Northeastern University China | Zhao Y.,Key Laboratory of Computer Network and Information Integration Southeast University | Wang G.,Northeastern University China | Cheng Y.,Northeastern University China
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2015

Discriminative subgraph mining from a large collection of graph objects is a crucial problem for graph classification. Several main-memory-based approaches have been proposed to mine discriminative subgraphs, but they lack scalability and are not suitable for large-scale graph databases. Based on the MapReduce model, we propose an efficient method, MRGAGC, for discriminative subgraph mining. MRGAGC employs the iterative MapReduce framework: each map step applies evolutionary computation with three evolutionary strategies to generate a set of locally optimal discriminative subgraphs, and the reduce step aggregates all the discriminative subgraphs and outputs the result. The iteration loop terminates when the stopping-condition threshold is met. Finally, we employ subgraph coverage rules to build graph classifiers using the discriminative subgraphs mined by MRGAGC. Extensive experiments on both real and synthetic datasets show that MRGAGC clearly outperforms the other approaches in both classification accuracy and runtime efficiency. © Springer International Publishing Switzerland 2015.
