Sensitivity, Prediction Uncertainty, and Detection Limit for Artificial Neural Network Calibrations.

Anal Chem

Departamento de Química Analítica, Facultad de Ciencias Bioquímicas y Farmacéuticas, Universidad Nacional de Rosario, Instituto de Química de Rosario (IQUIR-CONICET), Suipacha 531, Rosario S2002LRK, Argentina.

Published: August 2016

With the proliferation of multivariate calibration methods based on artificial neural networks, expressions for the estimation of figures of merit such as sensitivity, prediction uncertainty, and detection limit are urgently needed. This would bring nonlinear multivariate calibration methodologies to the same status as their linear counterparts in terms of comparability. Currently, only the average prediction error or the ratio of performance to deviation for a test sample set is employed to characterize and promote neural network calibrations. It is clear that additional information is required. We report for the first time expressions that easily allow one to compute three relevant figures of merit: (1) the sensitivity, which turns out to be sample-dependent, as expected, (2) the prediction uncertainty, and (3) the detection limit. The approach resembles that employed for linear multivariate calibration, i.e., partial least-squares regression, specifically adapted to neural network calibration scenarios. As usual, both simulated and real (near-infrared) spectral data sets serve to illustrate the proposal.
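As a rough illustration of the general idea only (not the authors' exact expressions, which are given in the article), the sketch below estimates sample-dependent sensitivity, prediction uncertainty, and detection limit for a toy single-hidden-layer network by local linearization, i.e., by propagating an assumed iid instrumental noise level through the network gradient at each test sample. The weights `W1`, `b1`, `w2`, `b2`, the noise level `sigma_x`, and the 3.3 coverage factor in the detection-limit line are all illustrative assumptions.

```python
import numpy as np

# Stand-in for a trained single-hidden-layer network (tanh hidden layer,
# linear output). In practice W1, b1, w2, b2 would come from the fitted
# calibration model; random values are used here only so the sketch runs.
rng = np.random.default_rng(0)
n_inputs, n_hidden = 50, 5
W1 = rng.normal(scale=0.1, size=(n_hidden, n_inputs))
b1 = np.zeros(n_hidden)
w2 = rng.normal(scale=0.1, size=n_hidden)
b2 = 0.0

def predict(x):
    """Network output y = w2 . tanh(W1 x + b1) + b2 (predicted concentration)."""
    return w2 @ np.tanh(W1 @ x + b1) + b2

def gradient(x):
    """Analytical gradient dy/dx of the network at sample x (local linearization)."""
    h = np.tanh(W1 @ x + b1)
    return (w2 * (1.0 - h ** 2)) @ W1          # shape: (n_inputs,)

def figures_of_merit(x, sigma_x):
    """Sample-dependent figures of merit from a first-order expansion of the
    network around x, assuming iid instrumental noise of level sigma_x."""
    g = gradient(x)
    sen = 1.0 / np.linalg.norm(g)              # sensitivity (signal / concentration)
    s_y = sigma_x * np.linalg.norm(g)          # propagated prediction uncertainty
    lod = 3.3 * sigma_x / sen                  # detection limit (concentration units)
    return sen, s_y, lod

# Example: figures of merit for one synthetic test spectrum.
x_test = rng.normal(scale=0.5, size=n_inputs)
sen, s_y, lod = figures_of_merit(x_test, sigma_x=0.01)
print(f"y = {predict(x_test):.3f}, SEN = {sen:.3f}, s(y) = {s_y:.4f}, LOD = {lod:.4f}")
```

In this sketch the sample dependence of the sensitivity arises naturally: the gradient, and hence 1/||g||, changes with the test spectrum, in contrast to the single global sensitivity of a linear model such as partial least-squares.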

Source: http://dx.doi.org/10.1021/acs.analchem.6b01857
