Deviance information criterion for Bayesian model selection: Justification and variation


Bibliographic Details
Main Authors: LI, Yong; YU, Jun; ZENG, Tao
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2017
Subjects:
AIC
DIC
Online Access:https://ink.library.smu.edu.sg/soe_research/1927
https://ink.library.smu.edu.sg/context/soe_research/article/2926/viewcontent/DICTheory10.pdf
Institution: Singapore Management University
Description
Summary: The deviance information criterion (DIC) has been extensively used for Bayesian model selection. It is a Bayesian analogue of AIC and asymptotically chooses the model that minimizes the expected Kullback-Leibler divergence between the data generating process (DGP) and a predictive distribution. We show that when the plug-in predictive distribution is used, DIC has a rigorous decision-theoretic justification under regularity conditions. An alternative expression for DIC, based on the Bayesian predictive distribution, is proposed. The new DIC has a smaller penalty term than the original DIC and is easy to compute from the MCMC output. It is invariant to reparameterization and asymptotically yields a smaller expected loss than the original DIC.
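For readers unfamiliar with the quantity being discussed, the classic DIC of Spiegelhalter et al. (2002) is straightforward to compute from posterior draws. The sketch below is an illustrative assumption, not the paper's proposed variant: it uses a toy Normal(mu, 1) model with draws generated from the exact conjugate posterior standing in for MCMC output.

```python
import numpy as np

# Illustrative sketch (not the paper's new DIC): compute the classic DIC
# from posterior draws for a toy Normal(mu, 1) model with known variance.
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=50)  # toy data, assumed for illustration

# Stand-in for MCMC output: draws from the exact posterior of mu under a
# flat prior, mu | y ~ Normal(ybar, 1/n).
draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=5000)

def log_lik(mu):
    """Log-likelihood log p(y | mu) for the Normal(mu, 1) model."""
    return np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (y - mu) ** 2)

deviances = np.array([-2.0 * log_lik(mu) for mu in draws])
d_bar = deviances.mean()              # posterior mean deviance
d_hat = -2.0 * log_lik(draws.mean())  # deviance at the posterior mean
p_d = d_bar - d_hat                   # effective number of parameters
dic = d_bar + p_d                     # equivalently: 2 * d_bar - d_hat

print(f"p_D = {p_d:.2f}, DIC = {dic:.2f}")
```

With one free parameter, p_D should come out close to 1; the model with the smallest DIC is preferred. The paper's contribution concerns the justification of this criterion and a variant with a smaller penalty term.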