To meet a growing need for fieldable mass spectrometer systems for precise elemental and isotopic analyses, the liquid sampling-atmospheric pressure glow discharge (LS-APGD) offers a number of very promising characteristics. One key set of attributes awaiting validation concerns performance with respect to isotope ratio (IR) precision and accuracy. Owing to its availability and this research team's prior experience with it, the initial evaluation of IR performance was performed on a Thermo Scientific Exactive Orbitrap instrument. While the mass accuracy and resolution of Orbitrap analyzers are well documented, no detailed evaluations of their IR performance have been published. The efforts described here involve two variables: the inherent IR precision and accuracy delivered by the LS-APGD microplasma and the inherent IR measurement qualities of Orbitrap analyzers. Because they bear on IR performance, the operating parameters of the Orbitrap sampling interface, the high-energy collisional dissociation (HCD) stage, and the ion injection/data acquisition steps were evaluated. IR performance was determined for a range of elements, including natural, depleted, and enriched uranium isotopes. In all cases, precision and accuracy are degraded when measuring low-abundance isotopes (<0.1% isotope fraction). In the best case, IR precision on the order of 0.1% RSD can be achieved, with values of 1%-3% RSD observed for low-abundance species. The results suggest that the LS-APGD is a promising candidate for field-deployable MS analysis and that the high resolving powers of the Orbitrap may be complemented by a heretofore unknown capacity to deliver high-precision IRs.
DOI: http://dx.doi.org/10.1007/s13361-016-1402-4