This paper addresses a large class of nonsmooth nonconvex stochastic DC (difference-of-convex functions) programs where endogenous uncertainty is involved and i.i.d.
IEEE Trans Neural Netw Learn Syst, May 2024
Stochastic algorithms are well-known for their performance in the era of big data. In this article, we study nonsmooth stochastic Difference-of-Convex functions (DC) programs, a major class of nonconvex stochastic optimization problems with a variety of applications in diverse domains, in particular machine learning. We propose new online stochastic algorithms based on the state-of-the-art DC Algorithm (DCA), a powerful approach in nonconvex programming, designed for the online context of streaming data continuously generated by some (unknown) source distribution.
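For orientation, the classical DCA iteration that such online variants build on linearizes the concave part and solves a convex subproblem at each step. Below is a minimal sketch; the quadratic-minus-ℓ1 toy instance is assumed purely for illustration and is not taken from the paper.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's code) of the classical DCA scheme
# for f = g - h with g, h convex: take a subgradient y_k of h at x_k, then
# minimize the convex function g(x) - <y_k, x>.
#
# Toy instance assumed here: g(x) = 0.5*||A x - b||^2, h(x) = lam*||x||_1,
# so the convex subproblem has a closed-form solution.

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
lam = 0.1

def dca(x0, n_iter=100):
    x = x0.copy()
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(n_iter):
        y = lam * np.sign(x)                 # y is a subgradient of h at x
        x = np.linalg.solve(AtA, Atb + y)    # argmin_x g(x) - <y, x>
    return x

x_star = dca(np.zeros(10))
```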
We consider the minimization of a large sum of DC (Difference of Convex) functions, a problem that arises in several areas, especially stochastic optimization and machine learning. Two DCA (DC Algorithm)-based algorithms are proposed: stochastic DCA and inexact stochastic DCA. We prove that both algorithms converge to a critical point with probability one.
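A rough sketch of the stochastic idea, under the same toy instance as above: store one subgradient of each concave part h_i and refresh only a random minibatch of them per iteration before solving the convex subproblem. This is a simplification for illustration; the paper's exact update rules and its inexact variant differ in details.

```python
import numpy as np

# Simplified sketch (not the paper's exact scheme) of a stochastic DCA idea for
# min (1/n) * sum_i [ g_i(x) - h_i(x) ]: keep stored subgradients of the h_i and
# refresh a random minibatch of them at each iteration, then solve the convex
# subproblem in g exactly.
# Toy instance assumed: g_i(x) = 0.5*(a_i @ x - b_i)^2, h_i(x) = lam*||x||_1.

rng = np.random.default_rng(1)
n, d, lam = 200, 20, 0.05
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def stochastic_dca(n_iter=300, batch=16):
    x = np.zeros(d)
    Y = np.zeros((n, d))                       # stored subgradients of the h_i
    AtA, Atb = A.T @ A / n, A.T @ b / n
    for _ in range(n_iter):
        idx = rng.choice(n, size=batch, replace=False)
        Y[idx] = lam * np.sign(x)              # refresh minibatch subgradients
        y_bar = Y.mean(axis=0)                 # estimate of a subgradient of (1/n) sum_i h_i
        x = np.linalg.solve(AtA, Atb + y_bar)  # exact convex subproblem (quadratic g)
    return x

x_hat = stochastic_dca()
```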
Neural Comput, April 2020
We investigate an approach based on DC (Difference of Convex functions) programming and DCA (DC Algorithm) for online learning techniques. The prediction problem of an online learner can be formulated as a DC program, to which online DCA is applied. We propose two versions of the online DCA scheme, a complete and an approximate one, and prove that they achieve logarithmic and sublinear regret, respectively.
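As a reminder, the regret bounded by such results compares the cumulative loss of the online iterates against the best fixed decision in hindsight. The small helper below is an assumed toy setup, not taken from the paper.

```python
import numpy as np

# Illustrative helper (assumed setup, not from the paper): cumulative regret of
# an online learner's iterates x_t against the best fixed decision in hindsight,
# taken over a finite candidate set.

def regret(losses, iterates, candidates):
    online = sum(f(x) for f, x in zip(losses, iterates))
    best_fixed = min(sum(f(u) for f in losses) for u in candidates)
    return online - best_fixed

# Toy usage: squared-loss rounds on a 1-D stream.
stream = [0.2, 0.4, 0.1, 0.5]
losses = [lambda x, z=z: (x - z) ** 2 for z in stream]
iterates = [0.0, 0.2, 0.3, 0.23]              # whatever the online algorithm produced
candidates = np.linspace(0, 1, 101)
print(regret(losses, iterates, candidates))
```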
The need to select groups of variables arises in many statistical modeling problems and applications. In this paper, we consider the ℓ-norm regularization for enforcing group sparsity and investigate a DC (Difference of Convex functions) approximation approach for solving the ℓ-norm regularization problem. We show that, with suitable parameters, the original and approximate problems are equivalent.
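One DC approximation commonly used in this line of work is the capped-ℓ1 function; the abstract does not say which approximation family the paper adopts, so the sketch below is only meant to make the idea concrete for group norms.

```python
import numpy as np

# Capped-l1 approximation of the step function behind l0-type penalties:
#   phi(t) = min(1, t/theta) = t/theta - max(t/theta - 1, 0)   for t >= 0,
# a difference of two convex functions. Applied to group norms ||x_g||_2, it
# approximates the number of nonzero groups. (Assumed choice for illustration;
# not necessarily the approximation used in the paper.)

def capped_l1_group_penalty(x, groups, theta=0.5):
    """Approximate count of nonzero groups via a DC decomposition."""
    t = np.array([np.linalg.norm(x[g]) for g in groups])  # group norms
    convex_part = t / theta                                # first convex piece
    concave_shift = np.maximum(t / theta - 1.0, 0.0)       # second convex piece
    return np.sum(convex_part - concave_shift)             # phi = difference

x = np.array([0.0, 0.0, 1.3, -0.2, 0.0])
groups = [[0, 1], [2, 3], [4]]
print(capped_l1_group_penalty(x, groups))  # 1.0 here: one nonzero group
```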
This letter proposes a novel approach using the ℓ0-norm regularization for the sparse covariance matrix estimation (SCME) problem. The objective function of the SCME problem is composed of a nonconvex part and the ℓ0 term, which is discontinuous and difficult to tackle. Appropriate DC (difference of convex functions) approximations of the ℓ0-norm are used, resulting in approximate SCME problems that are still nonconvex.
In this letter, we consider the nonnegative matrix factorization (NMF) problem and several NMF variants. Two approaches based on DC (difference of convex functions) programming and DCA (DC algorithm) are developed. The first approach follows the alternating framework that requires solving, at each iteration, two nonnegativity-constrained least squares subproblems for which DCA-based schemes are investigated.
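A minimal sketch of that alternating framework, with each nonnegativity-constrained least squares subproblem handled by a few projected gradient steps as a simple stand-in for the paper's DCA-based schemes:

```python
import numpy as np

# Alternating framework for NMF X ~ W H with W, H >= 0: fix one factor and
# approximately solve the nonnegativity-constrained least squares subproblem
# for the other. Projected gradient steps are used here only as a stand-in
# for the DCA-based subproblem solvers investigated in the paper.

def nmf_alternating(X, rank, outer=50, inner=10):
    rng = np.random.default_rng(0)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(outer):
        # H-subproblem: min_{H>=0} 0.5*||X - W H||_F^2
        L = np.linalg.norm(W.T @ W, 2) + 1e-12           # gradient Lipschitz constant
        for _ in range(inner):
            H = np.maximum(H - (W.T @ (W @ H - X)) / L, 0.0)
        # W-subproblem: min_{W>=0} 0.5*||X - W H||_F^2
        L = np.linalg.norm(H @ H.T, 2) + 1e-12
        for _ in range(inner):
            W = np.maximum(W - ((W @ H - X) @ H.T) / L, 0.0)
    return W, H

X = np.abs(np.random.default_rng(1).standard_normal((30, 20)))
W, H = nmf_alternating(X, rank=5)
```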
Automatic discovery of community structures in complex networks is a fundamental task in many disciplines, including physics, biology, and the social sciences. The most widely used criterion for characterizing the existence of a community structure in a network is modularity, a quantitative measure proposed by Newman and Girvan (2004). Community discovery can be formulated as the so-called modularity maximization problem, which consists of finding a partition of the nodes of a network with the highest modularity.
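For reference, the Newman-Girvan modularity of a given partition can be evaluated directly from the adjacency matrix; the helper below only evaluates the criterion that the modularity maximization problem optimizes, it does not perform the optimization.

```python
import numpy as np

# Newman-Girvan modularity for an undirected graph with adjacency matrix A and
# node-to-community assignment c:
#   Q = (1/2m) * sum_{ij} [ A_ij - k_i * k_j / (2m) ] * [c_i == c_j],
# where k_i are node degrees and m is the number of edges.

def modularity(A, c):
    k = A.sum(axis=1)                      # degrees
    two_m = k.sum()                        # 2m = sum of degrees
    same = (c[:, None] == c[None, :])      # indicator of same community
    return float(((A - np.outer(k, k) / two_m) * same).sum() / two_m)

# Two triangles joined by one edge, split into their natural communities.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
print(modularity(A, np.array([0, 0, 0, 1, 1, 1])))   # about 0.357
```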
In this paper, we consider the problem of feature selection for linear SVMs on uncertain data; uncertainty is inherently present in almost all datasets. Using principles of Robust Optimization, we propose robust schemes to handle data under ellipsoidal and box models of uncertainty. The difficulty of treating the ℓ0-norm in the feature selection problem is overcome by using appropriate approximations together with Difference of Convex functions (DC) programming and DC Algorithms (DCA).
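To make the robust schemes concrete, here is a standard worked identity from robust optimization (assumed as a toy; the paper's exact models may differ): under box-bounded input perturbations the worst-case hinge loss gains a δ‖w‖1 term, and under a ball (an ellipsoid with identity shape matrix) it gains an r‖w‖2 term.

```python
import numpy as np

# Worst-case hinge loss for a single sample (x, y) under bounded input
# perturbations u of x (standard robust-optimization identities, used here
# only as an illustration of the box and ellipsoidal uncertainty models):
#   box  ||u||_inf <= delta :  max(0, 1 - y*(w@x + b) + delta*||w||_1)
#   ball ||u||_2   <= r     :  max(0, 1 - y*(w@x + b) + r*||w||_2)

def robust_hinge_box(w, b, x, y, delta):
    return max(0.0, 1.0 - y * (w @ x + b) + delta * np.sum(np.abs(w)))

def robust_hinge_ball(w, b, x, y, r):
    return max(0.0, 1.0 - y * (w @ x + b) + r * np.linalg.norm(w))

w, b = np.array([1.0, -2.0]), 0.5
x, y = np.array([0.3, 0.1]), 1
print(robust_hinge_box(w, b, x, y, delta=0.1), robust_hinge_ball(w, b, x, y, r=0.1))
```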
We investigate difference of convex functions (DC) programming and the DC algorithm (DCA) to solve the block clustering problem in the continuous framework, which traditionally requires solving a hard combinatorial optimization problem. DC reformulation techniques and exact penalty in DC programming are developed to build an appropriate equivalent DC program of the block clustering problem. They lead to an elegant and explicit DCA scheme for the resulting DC program.