Self-absorption seriously degrades the accuracy and stability of quantitative analysis in laser-induced breakdown spectroscopy (LIBS). To reduce its influence, we investigated the temporal evolution of the self-absorption effect by establishing exponential calibration curves, and we also examined the mechanism underlying this evolution. The results indicated that self-absorption is weak at the early stage of plasma expansion. Taking the determination of manganese (Mn) in steel as an example, the upper-bound-of-linearity concentration (C) was 2.000 wt. % at the early stage of plasma expansion (a time window of 0.2-0.4 μs), much higher than the 0.363 wt. % obtained in a traditionally optimized time window (2-3 μs). The accuracy and stability of quantitative analysis in the 0.2-0.4 μs window were also much better than in the 2-3 μs window. This work provides a simple method for improving quantitative analysis performance in LIBS while avoiding the self-absorption effect.
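To illustrate how an exponential calibration curve can flag the onset of self-absorption, the minimal Python sketch below fits a saturation-type model I(C) = a(1 - exp(-bC)) to synthetic Mn intensity data and estimates the concentration at which the curve departs from its initial linear slope. The model form, the 5% deviation criterion, and all numerical values are illustrative assumptions, not the paper's actual data or procedure.

import numpy as np
from scipy.optimize import curve_fit

def exp_calibration(c, a, b):
    # Saturation-type curve: ~a*b*c for small b*c, flattening as self-absorption grows.
    return a * (1.0 - np.exp(-b * c))

# Synthetic Mn calibration data (wt. %); values are illustrative, not from the paper.
conc = np.array([0.1, 0.25, 0.5, 1.0, 1.5, 2.0, 3.0])
rng = np.random.default_rng(seed=0)
intensity = exp_calibration(conc, 5000.0, 0.3) * (1.0 + 0.02 * rng.standard_normal(conc.size))

(a_fit, b_fit), _ = curve_fit(exp_calibration, conc, intensity, p0=(intensity.max(), 0.1))

# Hypothetical criterion: flag self-absorption where the fitted curve falls
# more than 5% below its initial linear slope a*b*C.
c_grid = np.linspace(1e-3, conc.max(), 2000)
rel_dev = 1.0 - exp_calibration(c_grid, a_fit, b_fit) / (a_fit * b_fit * c_grid)
c_ubl = c_grid[np.argmax(rel_dev > 0.05)] if np.any(rel_dev > 0.05) else c_grid[-1]
print(f"fit: a={a_fit:.0f}, b={b_fit:.3f}; estimated linear range up to ~{c_ubl:.2f} wt. %")

In practice, curves of this type would be fitted in successive gate-delay windows and the recovered linear ranges compared, which is the spirit of the time-resolved comparison reported above.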

Source: http://dx.doi.org/10.1364/OE.27.004261
