Research on a spectral line screening method based on optical computation for laser-induced breakdown spectroscopy
Abstract
The screening of interfering spectral lines is of great significance for improving quantitative accuracy in laser-induced breakdown spectroscopy (LIBS) analysis. This paper proposes a novel spectral line screening method based on optical computation, in which a reference spectral line intensity is calculated from optical formulae. First, during the initial screening, the plasma temperature was determined using the Boltzmann two-line method, which keeps the temperature estimate independent of the measured intensities of the candidate lines. Next, the reference intensity of each line was computed from the formula for the intensity of emission from its upper energy level. Finally, the ratio of the measured intensity to the reference intensity was calculated, and the characteristic spectral lines were screened according to this ratio. The screened spectral line data were then used as inputs to an artificial neural network (ANN). The results demonstrate a significant improvement in the coefficient of determination (R²), from 0.6378 to 0.9992, while the root mean square error (RMSE) was reduced from 2.9098 to 0.1135. This study provides a feasible approach to spectral interference in LIBS by integrating theoretical calculation with machine learning models.
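The screening pipeline described above can be sketched in a few lines of code. This is a minimal illustration under simplifying assumptions (local thermodynamic equilibrium, a common proportionality factor across lines of one species, and a hypothetical tolerance threshold); the function names, line parameters, and the median-ratio screening rule are illustrative, not the paper's exact implementation.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def two_line_temperature(line1, line2):
    """Boltzmann two-line method: infer plasma temperature from two lines
    of the same species. Each line is a tuple
    (wavelength_nm, E_upper_eV, g_upper, A_per_s, measured_intensity)."""
    lam1, e1, g1, a1, i1 = line1
    lam2, e2, g2, a2, i2 = line2
    # I ~ (g*A/lam) * exp(-E / (k*T))  =>  solve the intensity ratio for T
    ratio = (i1 * g2 * a2 * lam1) / (i2 * g1 * a1 * lam2)
    return (e2 - e1) / (K_B * math.log(ratio))

def reference_intensity(lam, e_up, g, a, temp):
    """Relative reference intensity from the Boltzmann emission formula,
    up to a constant factor (number density / partition function)."""
    return (g * a / lam) * math.exp(-e_up / (K_B * temp))

def screen_lines(lines, temp, tol=0.3):
    """Flag each line as kept (True) or interfered (False): a line is kept
    when its measured/reference intensity ratio lies within a fractional
    tolerance `tol` of the median ratio (tol is an assumed threshold)."""
    ratios = [i / reference_intensity(lam, e, g, a, temp)
              for lam, e, g, a, i in lines]
    median = sorted(ratios)[len(ratios) // 2]
    return [abs(r / median - 1.0) <= tol for r in ratios]
```

For example, with synthetic lines generated at 10 000 K, `two_line_temperature` recovers the temperature exactly, and a line whose measured intensity is inflated by an overlapping interference shows a measured/reference ratio far from the median and is screened out.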