Peng X., Shanghai Normal University | Peng X., Scientific Computing Key Laboratory of Shanghai Universities
Expert Systems with Applications | Year: 2010

The twin support vector hypersphere (TSVH) is a novel and efficient pattern recognition tool: it determines a pair of hyperspheres by solving two related SVM-type problems, each of which is smaller than the problem solved by a classical SVM. In this paper we formulate a least squares version of this classifier, termed the least squares twin support vector hypersphere (LS-TSVH). The formulation leads to an extremely simple and fast algorithm for generating a binary classifier based on a pair of hyperspheres. Because the formulation uses equality constraints, the solution follows from two systems of nonlinear equations instead of the two dual quadratic programming problems (QPPs) of TSVH. We show that the two systems of nonlinear equations can be solved with the well-known Newton downhill algorithm. The effectiveness of the proposed LS-TSVH is demonstrated by experimental results on several artificial and benchmark datasets. © 2010 Elsevier Ltd. All rights reserved.
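
For orientation only, the sketch below implements a generic damped ("downhill") Newton iteration for a nonlinear system F(x) = 0, which is the solver named in the abstract; the toy system, function names and tolerances are invented for illustration and are not the LS-TSVH formulation itself.

```python
import numpy as np

def damped_newton(F, J, x0, tol=1e-8, max_iter=100):
    """Damped (downhill) Newton iteration for a nonlinear system F(x) = 0.

    F : callable returning the residual vector.
    J : callable returning the Jacobian matrix of F.
    The step length is halved until the residual norm decreases,
    which is the 'downhill' safeguard.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        step = np.linalg.solve(J(x), -r)   # full Newton step
        t = 1.0
        while np.linalg.norm(F(x + t * step)) >= np.linalg.norm(r) and t > 1e-10:
            t *= 0.5                        # damp until the residual decreases
        x = x + t * step
    return x

# Toy usage: solve x0^2 + x1 - 3 = 0, x0 + x1^2 - 5 = 0 (solution (1, 2))
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
J = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])
print(damped_newton(F, J, x0=[1.0, 1.0]))
```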


Peng X., Shanghai Normal University | Peng X., Scientific Computing Key Laboratory of Shanghai Universities
Neurocomputing | Year: 2012

In this paper, an efficient twin parametric insensitive support vector regression (TPISVR) is proposed. The TPISVR determines the regression function indirectly through a pair of nonparallel parametric-insensitive up- and down-bound functions obtained by solving two smaller sized support vector machine (SVM)-type problems. As a result, the TPISVR not only learns faster than the classical SVR but is also well suited to many settings, especially when the noise is heteroscedastic, that is, when the noise strongly depends on the input value. The proposed method has the additional advantage that the ratio of the parameters ν and c controls the bounds on the fractions of support vectors and errors. Experimental results on several artificial and benchmark datasets indicate that the TPISVR not only has fast learning speed but also shows good generalization performance. © 2011.
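
TPISVR itself is not available in standard libraries; as a loose stand-in, the sketch below uses scikit-learn's classical ν-SVR on heteroscedastic toy data only to illustrate the property the abstract highlights, namely that ν bounds the fractions of errors and support vectors. All data and parameter values are invented.

```python
import numpy as np
from sklearn.svm import NuSVR

# Heteroscedastic toy data: the noise level grows with the input value.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 4.0, size=200)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.05 + 0.1 * X.ravel())

# In nu-SVR, nu upper-bounds the fraction of errors and lower-bounds the
# fraction of support vectors -- the role the abstract attributes to the
# nu/c ratio in TPISVR.
for nu in (0.1, 0.3, 0.6):
    model = NuSVR(nu=nu, C=1.0, kernel="rbf", gamma=1.0).fit(X, y)
    frac_sv = len(model.support_) / len(X)
    print(f"nu={nu:.1f}  fraction of support vectors={frac_sv:.2f}")
```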


Peng X., Shanghai Normal University | Peng X., Scientific Computing Key Laboratory of Shanghai Universities
Neurocomputing | Year: 2010

Twin support vector regression (TSVR) obtains a faster learning speed than classical support vector regression (SVR) by solving a pair of smaller sized support vector machine (SVM)-type problems. In this paper, a primal version of TSVR, termed primal TSVR (PTSVR), is first presented. By introducing a quadratic function to approximate the loss function, PTSVR directly optimizes the pair of quadratic programming problems (QPPs) of TSVR in the primal space through a series of systems of linear equations. PTSVR clearly improves the learning speed of TSVR without loss of generalization. To improve the prediction speed, a greedy sparse TSVR (STSVR) in the primal space is further proposed. STSVR uses a simple back-fitting strategy to iteratively select its basis functions and update the augmented vectors. Computational results on several synthetic as well as benchmark datasets confirm the merits of PTSVR and STSVR. © 2010 Elsevier B.V.
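
The following is a minimal sketch, under stated assumptions, of the two ingredients the abstract describes in general terms: re-solving a small regularized linear system in the primal and greedily adding basis functions with a back-fitting-style selection. It illustrates the idea only, not the PTSVR/STSVR algorithms; the kernel, selection score and regularization are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_sparse_fit(X, y, n_basis=10, lam=1e-3, gamma=1.0):
    """Greedily select kernel basis centres; after each selection the small
    regularized least-squares problem over the chosen centres is re-solved
    from a linear system (a stand-in for the primal linear systems and the
    back-fitting selection described in the abstract)."""
    K = rbf_kernel(X, X, gamma)
    chosen, residual = [], y.copy()
    for _ in range(n_basis):
        # pick the unselected centre whose column best matches the residual
        scores = np.abs(K.T @ residual)
        scores[chosen] = -np.inf
        chosen.append(int(np.argmax(scores)))
        Kc = K[:, chosen]
        # solve the small regularized normal equations for the coefficients
        alpha = np.linalg.solve(Kc.T @ Kc + lam * np.eye(len(chosen)), Kc.T @ y)
        residual = y - Kc @ alpha
    return np.array(chosen), alpha

# Toy usage on synthetic data
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(150, 1))
y = np.sinc(X).ravel() + 0.05 * rng.normal(size=150)
centres, coef = greedy_sparse_fit(X, y, n_basis=8)
print("selected centres:", centres)
```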


Ceng L.-C., Shanghai Normal University | Ceng L.-C., Scientific Computing Key Laboratory of Shanghai Universities | Ansari Q.H., Aligarh Muslim University | Schaible S., Chung Yuan Christian University
Journal of Global Optimization | Year: 2012

In this paper, we introduce and analyze a new hybrid extragradient-like iterative algorithm for finding a common solution of a generalized mixed equilibrium problem, a system of generalized equilibrium problems and a fixed point problem of infinitely many nonexpansive mappings. Under some mild conditions, we prove the strong convergence of the sequence generated by the proposed algorithm to a common solution of these three problems; this solution also solves an optimization problem. Several special cases are also discussed. The results presented in this paper supplement, extend, improve and generalize previously known results in this area. © 2011 Springer Science+Business Media, LLC.
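
For background, "extragradient-like" refers to schemes built on Korpelevich's classical extragradient step for a monotone operator $A$ over a closed convex set $C$ with metric projection $P_C$; the display below is that classical step, not the authors' hybrid iteration:

\[
y_n = P_C\bigl(x_n - \lambda A x_n\bigr), \qquad x_{n+1} = P_C\bigl(x_n - \lambda A y_n\bigr), \qquad \lambda > 0.
\]

According to the abstract, the hybrid algorithm combines steps of this type with the equilibrium and fixed-point subproblems and achieves strong convergence.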


Peng X., Shanghai Normal University | Peng X., Scientific Computing Key Laboratory of Shanghai Universities
Information Sciences | Year: 2011

Twin support vector machines (TSVM) train faster than classical support vector machines (SVM); however, their augmented vectors lose sparsity. In this paper, a rapid sparse twin support vector machine (STSVM) classifier in the primal space is proposed to improve the sparsity and robustness of TSVM. Based on a simple back-fitting strategy, the STSVM iteratively builds each nonparallel hyperplane by adding one support vector (SV) from the corresponding class at a time, and the process is terminated by an adaptive and stable stopping criterion. STSVM learning is carried out by solving systems of linear equations, obtained by introducing a quadratic function to approximate the empirical risk. Computational results on several synthetic and benchmark datasets indicate that the STSVM obtains a sparse separating hyperplane at low cost without sacrificing generalization performance. © 2011 Elsevier Inc. All rights reserved.
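
As background for the twin-hyperplane idea, the sketch below shows the decision rule shared by TSVM-style classifiers, namely assigning a point to the class whose nonparallel hyperplane is nearest. The hyperplane parameters here are made up for illustration, and the training step (the paper's STSVM procedure) is not reproduced.

```python
import numpy as np

def twin_svm_predict(X, w1, b1, w2, b2):
    """Decision rule used by TSVM-style classifiers: assign each point to
    the class whose (nonparallel) hyperplane it lies closer to.
    The hyperplane parameters are assumed to have been trained already."""
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

# Toy usage with made-up hyperplane parameters
X = np.array([[0.5, 1.0], [2.0, -1.0], [-1.0, 0.2]])
print(twin_svm_predict(X, w1=np.array([1.0, 0.5]), b1=-0.2,
                          w2=np.array([-0.3, 1.0]), b2=0.4))
```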
