This work demonstrates the efficiency of machine learning in correcting spectral intensity variations in laser-induced breakdown spectroscopy (LIBS) caused by changes in the laser pulse energy; in our experiment, such changes spanned a wide range, from 7.9 to 71.1 mJ. The developed multivariate correction model enabled precise determination of the concentration of a minor element (magnesium in this case) in the samples (aluminum alloys in this work), with a precision of 6.3% (relative standard deviation, RSD), from LIBS spectra affected by the pulse-energy change. A comparison with classical univariate corrections based on laser pulse energy, total spectral intensity, ablation crater volume, and plasma temperature further highlights the significance of the developed method.
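As a rough illustration of the comparison described above, the sketch below contrasts a univariate pulse-energy normalization with a multivariate machine-learning correction on synthetic data. It is not the authors' pipeline: the synthetic spectra, the Mg channel index, and the use of scikit-learn's MLPRegressor as a generic stand-in for the paper's model are all assumptions made for this example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 shots x 500 spectral channels, with pulse
# energies covering the 7.9-71.1 mJ range quoted in the abstract.
n_shots, n_channels = 200, 500
pulse_energy = rng.uniform(7.9, 71.1, n_shots)            # mJ
mg_conc = rng.uniform(0.1, 5.0, n_shots)                  # wt.% (illustrative)
spectra = (pulse_energy[:, None] * rng.random((n_shots, n_channels))
           + 10.0 * mg_conc[:, None])                     # toy intensity model

# Univariate correction: normalize an assumed Mg line by pulse energy.
mg_line = spectra[:, 250]                                 # hypothetical Mg channel
uni = (mg_line / pulse_energy).reshape(-1, 1)

X_tr, X_te, u_tr, u_te, y_tr, y_te = train_test_split(
    spectra, uni, mg_conc, random_state=0)

# Univariate calibration: straight line through the normalized intensity.
pred_uni = LinearRegression().fit(u_tr, y_tr).predict(u_te)

# Multivariate correction: a neural-network regressor trained on the full
# spectrum, so it can learn the energy-dependent response on its own.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64,),
                                   max_iter=3000, random_state=0))
pred_multi = model.fit(X_tr, y_tr).predict(X_te)

def rsd(pred, truth):
    """Precision as relative standard deviation of the residuals, in %."""
    return 100.0 * np.std(pred - truth) / np.mean(truth)

print(f"univariate RSD:   {rsd(pred_uni, y_te):.1f}%")
print(f"multivariate RSD: {rsd(pred_multi, y_te):.1f}%")
```

With data of this kind, the multivariate model typically yields a lower RSD than the univariate normalization, mirroring the comparison the abstract reports; the numbers themselves depend entirely on the synthetic data and carry no physical meaning.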
DOI: http://dx.doi.org/10.1364/OE.392176