Abstract
Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (in comparison with a competing fitted model) is correctly specified and certain regularity conditions hold. While the properties of the KLD of Wang and Ghosh (2011) have been investigated in the Bayesian framework, this paper further explores its properties in the frequentist framework using four application examples, each fitted by two competing non-nested models.
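As background for the abstract, a minimal sketch of the classical Kullback-Leibler divergence between two fitted non-nested models may be helpful. The sketch below is illustrative only: it estimates the textbook KLD, D(f || g) = E_f[log f(X) − log g(X)], by Monte Carlo for a gamma model versus a lognormal model (an assumed example pair; it is not the specific criterion of Wang and Ghosh (2011) or Goutis and Robert (1998), and the data, seed, and sample sizes are arbitrary choices).

```python
import numpy as np
from scipy import stats

# Illustrative sketch: classical KLD D(f || g) = E_f[log f(X) - log g(X)],
# estimated by Monte Carlo with draws from the reference model f.
# Gamma vs. lognormal is an assumed pair of non-nested competitors;
# this is NOT the specific KLD criterion studied in the paper.

rng = np.random.default_rng(0)

# Simulated data from a gamma distribution (hypothetical data set).
data = rng.gamma(shape=2.0, scale=1.5, size=500)

# Fit both competing models by maximum likelihood (location fixed at 0).
a, _, scale_g = stats.gamma.fit(data, floc=0)
s, _, scale_ln = stats.lognorm.fit(data, floc=0)

f = stats.gamma(a, loc=0, scale=scale_g)      # reference fitted model
g = stats.lognorm(s, loc=0, scale=scale_ln)   # competing fitted model

# Monte Carlo estimate of D(f || g): average log-density ratio
# over draws from the reference model.
x = f.rvs(size=100_000, random_state=rng)
kld_hat = np.mean(f.logpdf(x) - g.logpdf(x))
print(f"Estimated KLD(gamma || lognormal): {kld_hat:.4f}")
```

A larger Monte Carlo sample tightens the estimate; because the two models are non-nested, neither log-density ratio is sign-constrained, which is what makes a divergence-based comparison natural here.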
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 409-429 |
| Number of pages | 21 |
| Journal | Statistical Modelling |
| Volume | 13 |
| Issue number | 5-6 |
| DOIs | |
| State | Published - Oct 2013 |
Keywords
- Kullback-Leibler divergence
- comparison of non-nested models
- information criterion
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty