Influence of self-absorption on the performance of laser-induced breakdown spectroscopy (LIBS)

Michael A. Player*, John Watson, Jolyon M. O. De Freitas

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review


Abstract

LIBS is based on atomic emission from a plasma formed by laser ablation and excitation. It offers non-contact and nearly non-destructive elemental analysis, but limited analytical accuracy, so an empirical power-law 'calibration curve' is usually required. Our work, together with recent work by Gornushkin et al., shows that this behaviour arises from self-absorption. Assuming local thermal equilibrium (LTE), the irradiance is found from integrals over the Voigt profile, which we compute using the complex error function. Calibration curves show a break between linear and power-law regions, with a square-root dependence at high concentrations. The irradiance depends on the width of the Lorentz (pressure-broadened) component, and the simple Boltzmann temperature dependence is modified. Gornushkin et al. extended calibration curves into the linear region, obtaining the Voigt parameter, but more typically this region is inaccessible. Self-absorption theory should provide improved temperature measurement in the power-law region and, although absolute concentration determination requires the Lorentz width, its known temperature and pressure dependence should reduce the ad hoc nature of calibration curves.
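The central numerical step described in the abstract, evaluating the Voigt profile through the complex error (Faddeeva) function and integrating over it, can be illustrated with a short sketch. The Python code below is not the authors' implementation: it assumes scipy.special.wofz for the Faddeeva function and a simple homogeneous-slab model, I(tau0) = integral of (1 - exp(-tau0 * V(x))) dx with peak optical depth tau0 standing in for concentration times path length, to reproduce the linear-to-square-root break in the calibration curve.

```python
import numpy as np
from scipy.special import wofz
from scipy.integrate import trapezoid

def voigt(x, sigma, gamma):
    """Voigt profile (unit area): convolution of a Gaussian of standard
    deviation sigma with a Lorentzian of HWHM gamma, evaluated via the
    real part of the Faddeeva function w(z)."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

def integrated_irradiance(tau0, sigma=1.0, gamma=0.5, x_max=500.0, n=40001):
    """Frequency-integrated emission of a homogeneous self-absorbed line:
    I(tau0) = integral of (1 - exp(-tau0 * V(x))) dx."""
    x = np.linspace(-x_max, x_max, n)
    v = voigt(x, sigma, gamma)
    v /= trapezoid(v, x)                 # enforce unit area numerically
    return trapezoid(1.0 - np.exp(-tau0 * v), x)

# Calibration curve on a log-log grid of optical depths.
taus = np.logspace(-2, 3, 11)
irr = np.array([integrated_irradiance(t) for t in taus])

# Local power-law exponent d(log I)/d(log tau0): ~1 when optically thin,
# tending to ~0.5 once the Lorentz wings dominate (self-absorption).
exponent = np.gradient(np.log(irr), np.log(taus))
for t, i, m in zip(taus, irr, exponent):
    print(f"tau0 = {t:8.2e}   I = {i:9.3e}   exponent = {m:4.2f}")
```

At small tau0 the local exponent is close to 1 (the optically thin, linear region); at large tau0 the Lorentz wings dominate and the exponent approaches 0.5, the square-root dependence the abstract describes at high concentrations.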

Original language: English
Pages (from-to): 260-268
Number of pages: 9
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 4076
Publication status: Published - 31 Aug 2000
Event: Optical Diagnostics for Industrial Applications - Glasgow, UK
Duration: 22 May 2000 - 24 May 2000

