A general class of nonconvex optimization problems is considered, in which the penalty is the composition of a linear operator with a nonsmooth, nonconvex mapping that is concave on the positive real line. The necessary optimality condition of a regularized version of the original problem is solved by a monotonically convergent scheme. Such problems arise in continuum mechanics, for instance in cohesive fracture, where singular behaviour is usually modelled by nonsmooth nonconvex energies. The proposed algorithm is successfully tested on fracture mechanics problems, and its performance is compared to two alternative algorithms for nonsmooth nonconvex optimization arising in optimal control and mathematical imaging.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6944259
DOI: http://dx.doi.org/10.1007/s10957-019-01545-4
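The monotone scheme described in the abstract above can be illustrated in spirit by a majorize-minimize (MM) iteration for a concave penalty. This is a generic sketch, not the paper's algorithm: the specific penalty sqrt(t^2 + eps), the least-squares data term, and all parameter names are illustrative assumptions.

```python
import numpy as np

def mm_concave_penalty(A, b, L, lam=0.1, eps=1e-6, iters=50):
    """MM sketch for F(u) = 0.5*||Au - b||^2 + lam * sum_i sqrt((Lu)_i^2 + eps).

    The penalty is concave as a function of (Lu)_i^2, so its tangent at the
    current iterate majorizes it; each step minimizes the resulting weighted
    least-squares surrogate, which makes F(u_k) monotonically nonincreasing.
    """
    n = A.shape[1]
    u = np.zeros(n)
    F = lambda v: 0.5 * np.sum((A @ v - b) ** 2) + lam * np.sum(np.sqrt((L @ v) ** 2 + eps))
    history = [F(u)]
    for _ in range(iters):
        # Tangent weights of the concave penalty at the current iterate.
        w = 1.0 / (2.0 * np.sqrt((L @ u) ** 2 + eps))
        # Minimize the quadratic surrogate: (A^T A + 2*lam*L^T W L) u = A^T b.
        u = np.linalg.solve(A.T @ A + 2.0 * lam * L.T @ (w[:, None] * L), A.T @ b)
        history.append(F(u))
    return u, history
```

The returned `history` lets one verify the monotone decrease of the objective, the defining property of schemes of this type.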
Neural Netw
January 2025
Department of Mathematics, Harbin Institute of Technology, Weihai, China.
Nonsmooth nonconvex optimization problems are pivotal in engineering practice due to the inherent nonsmooth and nonconvex characteristics of many real-world complex systems and models. The nonsmoothness and nonconvexity of the objective and constraint functions pose significant challenges for the design and convergence analysis of optimization algorithms. This paper presents a smooth gradient approximation neural network for such optimization problems, in which a smooth approximation technique with a time-varying control parameter is introduced to handle nonsmooth, nonregular objective functions.
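The smoothing idea can be sketched numerically (this is not the paper's neural-network dynamics): replace |t| by sqrt(t^2 + mu^2) with a control parameter mu_k that shrinks over time, and run plain gradient descent on the surrogate. The schedule mu_k = mu0/sqrt(1+k) and the step size are illustrative assumptions.

```python
import numpy as np

def smoothed_descent(x0, mu0=1.0, lr=0.01, steps=2000):
    """Gradient descent on a smooth approximation of f(x) = sum_i |x_i|.

    Each |t| is replaced by sqrt(t^2 + mu^2); as the time-varying parameter
    mu_k decreases, the smooth surrogate approaches the nonsmooth objective.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        mu = mu0 / np.sqrt(1.0 + k)                   # time-varying smoothing parameter
        x = x - lr * x / np.sqrt(x ** 2 + mu ** 2)    # exact gradient of the surrogate
    return x
```

Because the surrogate is differentiable everywhere, ordinary gradient dynamics apply at every iterate, even though the limiting objective is nonsmooth at zero.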
Neural Netw
December 2024
College of Life Science, Hunan Normal University, Changsha, PR China.
IEEE Trans Pattern Anal Mach Intell
August 2024
Zeroth-order (a.k.a. derivative-free) methods are a class of effective optimization methods for solving complex machine learning problems where gradients of the objective functions are unavailable or computationally prohibitive to evaluate.
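A common zeroth-order building block is the two-point random-direction gradient estimator, which uses only function evaluations. The sketch below is generic (estimator form, smoothing radius, and step sizes are standard choices, not taken from the article):

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, n_dirs=20, rng=None):
    """Two-point zeroth-order gradient estimate: only values of f are used."""
    rng = np.random.default_rng(rng)
    g = np.zeros_like(x)
    for _ in range(n_dirs):
        u = rng.standard_normal(x.shape)              # random probe direction
        g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    return g / n_dirs

def zo_descent(f, x0, lr=0.1, steps=200, seed=0):
    """Plain descent driven by the zeroth-order gradient estimate."""
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        x = x - lr * zo_gradient(f, x, rng=seed + k)
    return x
```

Averaging over several probe directions reduces the variance of the estimate; in expectation the estimator recovers the gradient of a Gaussian-smoothed version of f.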
J Mach Learn Res
January 2024
Department of Radiology, University of Chicago, Chicago, IL 60637, USA.
The alternating direction method of multipliers (ADMM) is a powerful and flexible tool for complex optimization problems of the form min f(x) + g(z) subject to Ax + Bz = c. ADMM exhibits robust empirical performance across a range of challenging settings, including nonsmoothness and nonconvexity of the objective functions f and g, and provides a simple and natural approach to the inverse problem of image reconstruction for computed tomography (CT) imaging. From the theoretical point of view, existing convergence results in the nonconvex setting generally assume smoothness of at least one of the component functions in the objective.
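A minimal textbook ADMM instance, for the canonical split min f(x) + g(z) subject to x = z with a least-squares data term and an l1 penalty, looks as follows. This is a generic sketch, not the article's CT reconstruction method; the l1 choice of g and all parameters are illustrative.

```python
import numpy as np

def admm_l1(A, b, lam=0.1, rho=1.0, iters=200):
    """ADMM for min_x 0.5*||Ax - b||^2 + lam*||z||_1  subject to  x - z = 0."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u is the scaled dual variable
    Q = np.linalg.inv(A.T @ A + rho * np.eye(n))        # cache the x-update solve
    for _ in range(iters):
        x = Q @ (A.T @ b + rho * (z - u))               # quadratic (smooth) step
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # prox of lam*||.||_1
        u = u + x - z                                   # scaled dual ascent
    return z
```

The z-update is a closed-form proximal step (soft-thresholding), which is exactly how ADMM sidesteps the nonsmoothness of the l1 term without ever differentiating it.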
Entropy (Basel)
April 2024
Department of Electrical and Computer Engineering, Princeton University, Princeton, NJ 08544, USA.
In this paper, the problem of joint transmission and computation resource allocation for a multi-user probabilistic semantic communication (PSC) network is investigated. In the considered model, users employ semantic information extraction techniques to compress their large-sized data before transmitting them to a multi-antenna base station (BS). Our model represents large-sized data through substantial knowledge graphs, utilizing shared probability graphs between the users and the BS for efficient semantic compression.