Publications by authors named "Junyou Ye"

ν-one-class support vector classification (ν-OCSVC) has attracted significant attention for its strong performance in single-class classification and anomaly detection. Nonetheless, the model does not yield a unique decision boundary, and its learning performance can be compromised when the training data are contaminated by outliers or mislabeled observations. This paper presents a novel C-parameter version of bounded one-class support vector classification (C-BOCSVC) that determines a unique decision boundary.
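To make the role of the ν parameter concrete: in ν-style one-class models, ν upper-bounds the fraction of training points treated as outliers. The sketch below is not C-BOCSVC itself, only a heavily simplified spherical-boundary stand-in (all function names are illustrative) that shows how a ν-controlled quantile produces a decision boundary flagging roughly a ν-fraction of points:

```python
import numpy as np

def fit_sphere(X, nu=0.1):
    """Simplified stand-in, not the paper's model: center = mean,
    radius = (1 - nu)-quantile of distances, so at most ~nu of the
    training points fall outside the learned ball."""
    center = X.mean(axis=0)
    dists = np.linalg.norm(X - center, axis=1)
    radius = np.quantile(dists, 1.0 - nu)
    return center, radius

def predict(X, center, radius):
    """+1 inside the learned ball (normal), -1 outside (anomaly)."""
    dists = np.linalg.norm(X - center, axis=1)
    return np.where(dists <= radius, 1, -1)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))            # nominal single-class data
center, radius = fit_sphere(X, nu=0.1)
labels = predict(X, center, radius)
print((labels == -1).mean())             # fraction flagged, at most ~nu
```

The actual ν-OCSVC/C-BOCSVC models solve a convex program over hyperplanes in feature space rather than fitting a ball, but the ν-versus-outlier-fraction trade-off illustrated here is the same.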


Twin support vector machine (TSVM) is a practical machine learning algorithm, but the traditional TSVM can perform poorly on data containing outliers or noise. To address this problem, we propose a novel TSVM with a symmetric LINEX loss function (SLTSVM) for robust classification. Our method has several advantages: (1) the symmetric LINEX loss function improves the performance of the proposed SLTSVM on data with outliers or noise.
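For orientation, the classical LINEX loss is L(u) = b(e^{au} − au − 1) with a ≠ 0, b > 0, which penalizes positive and negative residuals asymmetrically. One natural symmetrization, shown below, averages L(u) and L(−u); this is an assumption for illustration and may not match the exact definition used in the SLTSVM paper:

```python
import numpy as np

def linex(u, a=1.0, b=1.0):
    """Classical LINEX loss: b * (exp(a*u) - a*u - 1)."""
    return b * (np.exp(a * u) - a * u - 1.0)

def symmetric_linex(u, a=1.0, b=1.0):
    """One possible symmetrization (illustrative, not necessarily the
    paper's definition): average the loss of u and -u, giving an even
    function that vanishes only at u = 0."""
    return 0.5 * (linex(u, a, b) + linex(-u, a, b))

u = np.linspace(-3.0, 3.0, 601)
L = symmetric_linex(u)
print(np.allclose(L, symmetric_linex(-u)))  # even in u: True
print(L[300])                               # zero loss at u = 0
```

The symmetry means residuals of either sign are penalized identically, which is the property the abstract attributes to the loss when handling noisy labels on both sides of the separating hyperplanes.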


For multi-class classification problems, a new kernel-free nonlinear classifier is presented, called the hard quadratic surface least squares regression (HQSLSR). It combines the benefits of the least squares loss function and the quadratic kernel-free trick. The optimization problem of HQSLSR is convex and unconstrained, making it easy to solve.
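The "quadratic kernel-free trick" can be sketched as follows: instead of choosing a kernel, fit f(x) = ½xᵀWx + bᵀx + c directly by expanding each sample into its explicit quadratic monomials and solving an ordinary least-squares problem, which is convex and unconstrained. The helper names below are illustrative, not from the HQSLSR paper:

```python
import numpy as np

def quad_features(X):
    """Map each row x to [x_i * x_j (i <= j), x_i, 1], the explicit
    coordinates of a quadratic surface; no kernel function is needed."""
    n, d = X.shape
    cols = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    cols += [X[:, i] for i in range(d)]
    cols.append(np.ones(n))
    return np.column_stack(cols)

def fit_quadratic_lsr(X, y):
    """Least-squares fit of the surface coefficients: a single convex,
    unconstrained problem solved in closed form."""
    Z = quad_features(X)
    w, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = X[:, 0] ** 2 - X[:, 1] + 0.5        # a target that IS a quadratic surface
w = fit_quadratic_lsr(X, y)
pred = quad_features(X) @ w
print(np.max(np.abs(pred - y)))          # near-zero residual
```

For multi-class use, the paper's approach would fit one such surface per class (e.g. against one-hot targets) and predict by the largest response; the key point shown here is that no kernel or kernel parameter is ever selected.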


In this paper, a kernel-free quadratic surface support vector regression with non-negative constraints (NQSSVR) is proposed for the regression problem. The task of the NQSSVR is to find a quadratic function to serve as the regression function. By utilizing the quadratic surface kernel-free technique, the model avoids the difficulty of choosing a kernel function and its parameters, and offers a degree of interpretability.
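The combination of a kernel-free quadratic surface with non-negativity constraints can be sketched as the constrained least-squares problem min ‖Zw − y‖² subject to w ≥ 0, where Z holds the explicit quadratic features. The projected-gradient solver below is an illustrative choice, not the algorithm from the NQSSVR paper:

```python
import numpy as np

def quad_features(X):
    """Explicit quadratic-surface coordinates: [x_i * x_j (i <= j), x_i, 1]."""
    n, d = X.shape
    cols = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    cols += [X[:, i] for i in range(d)]
    cols.append(np.ones(n))
    return np.column_stack(cols)

def nn_least_squares(Z, y, iters=2000):
    """Projected gradient descent for min ||Zw - y||^2 s.t. w >= 0:
    take a gradient step, then clip negative coefficients to zero.
    (Illustrative solver; the paper's optimization may differ.)"""
    L = np.linalg.norm(Z, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(Z.shape[1])
    for _ in range(iters):
        grad = Z.T @ (Z @ w - y)
        w = np.maximum(w - grad / L, 0.0)  # projection onto w >= 0
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 2))
y = 0.5 * X[:, 0] ** 2 + X[:, 1] ** 2 + 1.0   # true coefficients all >= 0
Z = quad_features(X)
w = nn_least_squares(Z, y)
print(w.min() >= 0.0)                          # constraints satisfied
```

The non-negativity of the fitted coefficients is what gives the model its interpretability: each quadratic or linear term can only contribute with a known sign.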
