On the 'optimal' density power divergence tuning parameter.

J Appl Stat

School of Mathematics and Statistics, The Open University, Milton Keynes, UK.

Published: March 2020

The density power divergence, indexed by a single tuning parameter α, has proved to be a very useful tool in minimum distance inference. The family of density power divergences provides a generalized estimation scheme which includes likelihood-based procedures (represented by the choice α = 0 for the tuning parameter) as a special case. However, under data contamination, this scheme provides several more stable choices for model fitting and analysis (provided by positive values of the tuning parameter α). As larger values of α necessarily lead to a drop in model efficiency, determining the optimal value of α that provides the best compromise between model efficiency and stability against data contamination in any real situation is a major challenge. In this paper, we provide a refinement of an existing technique with the aim of eliminating the dependence of the procedure on an initial pilot estimator. Numerical evidence is provided to demonstrate the very good performance of the method. Our technique has a general flavour, and we expect that similar tuning parameter selection algorithms will work well for other M-estimators, or any robust procedure that depends on the choice of a tuning parameter.
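To make the efficiency/robustness trade-off concrete, the following is a minimal Python sketch of minimum density power divergence estimation for a normal model under contamination. It implements the standard DPD estimating criterion from the literature, not the tuning-parameter selection procedure proposed in the paper; the simulated data, the grid of α values, and the function names are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): minimum density power divergence (DPD)
# estimation for a normal model N(mu, sigma^2), showing how the fit changes with
# the tuning parameter alpha when the data are contaminated.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(theta, x, alpha):
    """Empirical DPD objective for N(mu, sigma^2), alpha > 0:
    H_n(theta) = integral(f^(1+alpha)) - (1 + 1/alpha) * mean(f(X_i)^alpha)."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                      # keep sigma positive
    f_x = norm.pdf(x, loc=mu, scale=sigma)
    # Closed form of the integral term for the normal density
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f_x ** alpha)

def mdpde(x, alpha):
    """Minimum DPD estimate of (mu, sigma) for a given alpha."""
    mad = np.median(np.abs(x - np.median(x))) + 1e-6
    start = np.array([np.median(x), np.log(mad)])  # robust starting values
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 90% of the data from N(0, 1), 10% contamination centred at 10
    x = np.concatenate([rng.normal(0, 1, 90), rng.normal(10, 1, 10)])
    for alpha in (0.1, 0.25, 0.5, 1.0):
        mu_hat, sigma_hat = mdpde(x, alpha)
        print(f"alpha={alpha:4.2f}: mu_hat={mu_hat:6.3f}, sigma_hat={sigma_hat:6.3f}")
```

Small α keeps the estimator close to maximum likelihood (efficient but sensitive to the outlying cluster), while larger α downweights the contamination at some cost in efficiency; selecting the α that best balances the two is the problem the paper addresses.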

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9042124
DOI: http://dx.doi.org/10.1080/02664763.2020.1736524

Publication Analysis

Top Keywords (term, frequency)
tuning parameter (24)
density power (12)
power divergence (8)
choice tuning (8)
data contamination (8)
tuning (6)
parameter (6)
'optimal' density (4)
divergence tuning (4)
parameter density (4)
