Pan Z.,Jilin University | Pan Z.,Key Laboratory of Computation and Knowledge Engineering | Ouyang D.,Jilin University | Ouyang D.,Key Laboratory of Computation and Knowledge Engineering
Journal of Computational Information Systems | Year: 2014

In this letter, we propose a novel Object Shrunken (OS) algorithm for the image classification task. Unlike the prior art, this letter considers the foreground, i.e., the object location in the image, to obtain a more discriminative image-level representation. The OS algorithm provides a straightforward procedure for boxing the object location: it first applies a Weighted Local Outlier Factor (WLOF) to remove interest-point outliers, and then locates the object according to the distribution of the remaining interest points. We evaluate the proposed algorithm on the well-known Caltech-101 dataset, where the OS algorithm outperforms state-of-the-art approaches in the image classification task. 1553-9105/Copyright © 2014 Binary Information Press
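The outlier-removal-then-box procedure described in the abstract can be sketched roughly as follows. The paper's exact WLOF weighting scheme is not given here, so this minimal Python illustration uses a plain (unweighted) LOF-style score; the function names, the `k` neighbourhood size, and the `threshold` cutoff are all hypothetical choices for illustration:

```python
import numpy as np

def lof_scores(points, k=5):
    """Plain LOF-style outlier score per point (higher = more outlying)."""
    # pairwise Euclidean distances between all interest points
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    # indices of the k nearest neighbours (column 0 is the point itself)
    idx = np.argsort(d, axis=1)[:, 1:k + 1]
    kdist = d[np.arange(len(points))[:, None], idx]
    # local density: inverse of the mean k-NN distance
    lrd = 1.0 / (kdist.mean(axis=1) + 1e-12)
    # LOF: average neighbour density divided by the point's own density
    return lrd[idx].mean(axis=1) / lrd

def shrink_to_object(points, k=5, threshold=1.5):
    """Drop outlying interest points, then box the remaining ones."""
    keep = points[lof_scores(points, k) < threshold]
    (x0, y0), (x1, y1) = keep.min(axis=0), keep.max(axis=0)
    return x0, y0, x1, y1
```

On a tight cluster of interest points plus a stray detection, the stray point scores well above 1 and the returned box shrinks to the cluster.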


Cui X.,Jilin University | Cui X.,Key Laboratory of Computation and Knowledge Engineering | Ouyang D.,Jilin University | Ouyang D.,Key Laboratory of Computation and Knowledge Engineering | And 5 more authors.
Journal of Computational Information Systems | Year: 2012

OWL ontologies may change continually to meet users' dynamic requests, and such changes may damage the integrity of the ontology. There is therefore an urgent need for an effective strategy to maintain the integrity of a continually changing ontology. In this paper, we propose a framework for integrity maintenance of continually changing OWL ontologies. On the basis of the connectedness between ontology data and integrity constraints, we define the maximal relevant subontology of the changed ontology and perform integrity checking on this maximal relevant subontology instead of on the entire ontology. Furthermore, for the different change operations, integrity checking is realized by finding the corresponding maximal relevant subontologies. Finally, for ontology data that violate integrity, an ontology satisfying all the integrity constraints is obtained by modifying the ontology axioms, thereby maintaining the integrity of the continually changing ontology. © 2012 Binary Information Press.
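The core idea of checking only a relevant subontology can be sketched as follows. The abstract does not define "connectedness" precisely, so this illustration assumes the common reading: an axiom is relevant if it shares a term with the changed entities, directly or transitively through other relevant axioms. The axiom representation and function name are hypothetical:

```python
from collections import defaultdict, deque

def relevant_subontology(axioms, seed_terms):
    """axioms: list of (name, set_of_terms); seed_terms: terms touched by a change.
    Returns the names of axioms reachable from the seed via shared terms."""
    by_term = defaultdict(set)
    for name, terms in axioms:
        for t in terms:
            by_term[t].add(name)
    terms_of = dict(axioms)
    frontier = deque(seed_terms)
    seen_terms, relevant = set(seed_terms), set()
    while frontier:
        t = frontier.popleft()
        for name in by_term[t]:
            if name not in relevant:
                relevant.add(name)
                for t2 in terms_of[name]:
                    if t2 not in seen_terms:
                        seen_terms.add(t2)
                        frontier.append(t2)
    return relevant
```

Integrity constraints would then be checked only against this returned axiom set rather than the whole ontology.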


Wang Y.,Jilin University | Wang Y.,Key Laboratory of Computation and Knowledge Engineering | Zuo W.,Jilin University | Zuo W.,Key Laboratory of Computation and Knowledge Engineering | And 5 more authors.
Communications in Computer and Information Science | Year: 2011

The Deep Web contains a significant amount of information; in order to effectively guide users to the appropriate searchable web databases, we need to organize it by domain. Ontology plays an important role in locating Deep Web content; therefore, this paper proposes a new ontology-based Deep Web database selection framework. Firstly, we construct a domain ontology content model (DOCM); then we design an ontology-assisted similarity algorithm, which adds semantic information when forming the feature vectors; lastly, we select the mapped relational databases as domain-specific databases. Experiments show that the method can effectively select Deep Web databases. © 2011 Springer-Verlag Berlin Heidelberg.
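The "ontology-assisted similarity" idea, adding semantic information before comparing feature vectors, can be sketched as follows. The DOCM itself is not detailed in the abstract, so a tiny hand-made term-to-concept map stands in for the domain ontology; all names and the example vocabulary are hypothetical:

```python
from collections import Counter
import math

# toy stand-in for a domain ontology: surface term -> canonical concept
ONTOLOGY = {"car": "automobile", "auto": "automobile", "automobile": "automobile",
            "plane": "aircraft", "aircraft": "aircraft"}

def semantic_vector(terms):
    # map each term to its ontology concept when known, keep it otherwise
    return Counter(ONTOLOGY.get(t, t) for t in terms)

def cosine(u, v):
    # cosine similarity between two sparse count vectors
    dot = sum(u[t] * v[t] for t in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Two databases described with different surface terms for the same concepts score high after the ontology mapping, whereas a purely lexical comparison would score them as unrelated.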


Wang Y.,Jilin University | Wang Y.,Key Laboratory of Computation and Knowledge Engineering | Li H.,Jilin University | Zuo W.,Jilin University | And 4 more authors.
Advanced Science Letters | Year: 2012

A significant amount of information can only be accessed through query interfaces, so integrating these interfaces into a global interface that allows uniform access to disparate relevant sources plays an important role. In this paper, a novel ontology-based approach to integrating Deep Web query interfaces is proposed, which comprises three aspects: schema extraction, schema matching, and schema merging. Firstly, a schema extraction algorithm based on visual features is designed to grasp and analyze the labels and control constraints of semantic attributes. Then, semantic relationships among attributes of different query interfaces are captured by exploiting the "bridging" effect during ontology-based attribute matching. Lastly, the source query interfaces are merged into a unified schema based on the mapping relationships identified during schema matching. A detailed experimental evaluation shows that the ontology-assisted approach to interface integration is feasible and effective. © 2012 American Scientific Publishers. All Rights Reserved.
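The matching-then-merging steps can be sketched as follows, under the assumption (the usual reading of the "bridging" effect) that two attribute labels from different interfaces match when the ontology maps both to the same concept, even without a direct lexical match. The concept map and all names here are hypothetical illustrations:

```python
# hypothetical concept map standing in for the ontology
CONCEPTS = {"author": "creator", "writer": "creator", "writtenby": "creator",
            "title": "title", "booktitle": "title"}

def merge_interfaces(interfaces):
    """interfaces: list of attribute-label lists, one per source query interface.
    Group labels by ontology concept; each group becomes one unified attribute."""
    unified = {}
    for attrs in interfaces:
        for label in attrs:
            key = label.lower().replace(" ", "")
            concept = CONCEPTS.get(key, key)  # bridge via the ontology when possible
            unified.setdefault(concept, set()).add(label)
    return unified
```

"Author" and "Writer" never co-occur on one interface, yet both bridge to the concept `creator`, so the unified schema exposes a single attribute covering both sources.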


Zuo W.,Jilin University | Zuo W.,Key Laboratory of Computation and Knowledge Engineering | Wang Y.,Jilin University | Wang Y.,Key Laboratory of Computation and Knowledge Engineering | And 5 more authors.
Journal of Computational Information Systems | Year: 2010

The Deep Web contains a significant amount of information; in order to make full use of it, we need to organize it by domain. It is therefore imperative that Deep Web databases be classified by domain automatically. In this paper, a new Deep Web database classification framework is proposed, which adds semantic information to the feature vectors and the centroid vector by extracting the synsets of terms from WordNet and replacing the terms with their corresponding synsets in the feature vectors and centroid vector, thereby reducing the dimensionality of the vectors. Lastly, the semantic feature vectors are highlighted by the semantic centroid vector, and the highlighted semantic feature vectors are classified by a classification algorithm. Experiments show that Experiment 3, which combines Experiments 1 and 2, can effectively improve the classification accuracy of Deep Web databases. Copyright © 2010 Binary Information Press.
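The centroid-based classification step at the end of the pipeline can be sketched as follows. The abstract does not name the classification algorithm, so this illustration assumes a simple nearest-centroid (Rocchio-style) classifier over the already-built semantic feature vectors; the function names and toy data are hypothetical:

```python
import numpy as np

def centroids(X, y):
    # mean (semantic) feature vector per class label
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def classify(x, cents):
    # assign the class whose centroid has the highest cosine similarity
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(cents, key=lambda c: cos(x, cents[c]))
```

A new database's feature vector is simply compared against each domain centroid, so classification cost grows with the number of domains, not the number of training databases.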
