Applications of a Kullback-Leibler divergence for comparing non-nested models

Chen Pin Wang, Booil Jo

Research output: Contribution to journal › Article › peer-review



Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (in comparison with a competing fitted model) is correctly specified and certain regularity conditions hold. While the properties of the KLD of Wang and Ghosh (2011) have been investigated in the Bayesian framework, this paper further explores the properties of this KLD in the frequentist framework using four application examples, each fitted by two competing non-nested models.
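To illustrate the kind of comparison the abstract describes, the following is a minimal sketch (not the paper's own procedure) of comparing two non-nested fitted models via the average log-likelihood ratio, a sample analogue of the difference in KL divergences from the data-generating density. The normal-vs.-Laplace pairing and all function names are illustrative assumptions, not taken from the article:

```python
# Hedged sketch: compare two non-nested models (normal vs. Laplace),
# each fitted by maximum likelihood, using the average log-likelihood
# ratio as a sample analogue of the KL divergence difference.
# Models, names, and data are illustrative, not from the paper.
import math
import random

def normal_logpdf(x, mu, sigma):
    # log density of N(mu, sigma^2)
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def laplace_logpdf(x, loc, b):
    # log density of Laplace(loc, b)
    return -math.log(2 * b) - abs(x - loc) / b

def fit_normal(xs):
    # ML estimates: sample mean and (biased) standard deviation
    n = len(xs)
    mu = sum(xs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in xs) / n)
    return mu, sigma

def fit_laplace(xs):
    # ML estimates: median location, mean absolute deviation scale
    s = sorted(xs)
    n = len(s)
    loc = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    b = sum(abs(x - loc) for x in xs) / n
    return loc, b

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(5000)]  # normal-generated data

mu, sigma = fit_normal(data)
loc, b = fit_laplace(data)

# Average log-likelihood ratio; positive values favour the normal model.
klr = sum(normal_logpdf(x, mu, sigma) - laplace_logpdf(x, loc, b)
          for x in data) / len(data)
print(f"average log-likelihood ratio: {klr:.4f}")
```

Because the data here are normal-generated, the ratio comes out positive, favouring the correctly specified model; with Laplace-generated data the sign would flip.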

Original language: English (US)
Pages (from-to): 409-429
Number of pages: 21
Journal: Statistical Modelling
Issue number: 5-6
State: Published - Oct 2013


Keywords

  • Kullback-Leibler divergence
  • comparison of non-nested models
  • information criterion

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


