Separable nonlinear models arise frequently in research fields such as machine learning and system identification. The variable projection (VP) approach is an efficient way to optimize such models. In this paper, we study several VP algorithms based on different matrix decompositions. Unlike previous methods that approximate the Jacobian matrix by finite differences, we use its analytical expression, which improves the efficiency of the VP algorithms. In particular, based on the modified Gram-Schmidt (MGS) method, we introduce a more robust implementation of the VP algorithm for separable nonlinear least-squares problems. In numerical experiments, we compare the performance of five different implementations of the VP algorithm. The results show the efficiency and robustness of the proposed MGS-based VP algorithm.
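To make the variable projection idea concrete, below is a minimal sketch (not the paper's implementation) of VP for a separable model y ≈ Φ(α)c: the linear coefficients c are eliminated by a linear least-squares solve for each candidate of the nonlinear parameters α, and only α is optimized. The two-exponential model, the parameter names, and the use of NumPy/SciPy are illustrative assumptions; in particular, the Jacobian of the reduced residual is approximated here by finite differences, whereas the paper advocates its analytical form.

```python
# Minimal variable-projection sketch for a separable nonlinear least-squares
# problem y ~ Phi(alpha) @ c (illustrative example, not the paper's code).
import numpy as np
from scipy.optimize import least_squares

def basis(alpha, t):
    """Columns of Phi(alpha): one decaying exponential per nonlinear rate."""
    return np.exp(-np.outer(t, alpha))           # shape (len(t), len(alpha))

def projected_residual(alpha, t, y):
    """Residual of the reduced problem r(alpha) = y - Phi(alpha) c*(alpha),
    where c*(alpha) is the linear least-squares solution for fixed alpha."""
    Phi = basis(alpha, t)
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # eliminate linear parameters
    return y - Phi @ c

# Synthetic data: y = 2 exp(-0.5 t) + 1 exp(-2 t) + noise (assumed example).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 100)
y = 2.0 * np.exp(-0.5 * t) + np.exp(-2.0 * t) + 0.01 * rng.standard_normal(t.size)

# Optimize only the nonlinear rates; the linear coefficients are recovered
# afterwards from one more least-squares solve.
sol = least_squares(projected_residual, x0=[0.3, 1.0], args=(t, y))
Phi = basis(sol.x, t)
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("nonlinear rates:", sol.x, "linear coefficients:", c)
```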
DOI: http://dx.doi.org/10.1109/TNNLS.2018.2884909