
ePrints (Open Access)

An objective Bayes factor with improper priors

Lookup NU author(s): Dr Cristiano Villa (ORCiD)


Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


© 2021 Elsevier B.V.

A new look at the use of improper priors in Bayes factors for model comparison is presented. As is well known, in this case the Bayes factor is only defined up to an arbitrary constant. Most current methods overcome the problem by using part of the sample either to train the Bayes factor (the Fractional Bayes Factor) or to transform the improper prior into a proper distribution (the Intrinsic Bayes Factor), with the remainder of the sample used for the model comparison. An alternative approach is provided which relies on matching divergences between density functions so as to establish a value for the constant appearing in the Bayes factor. These are the Kullback–Leibler divergence and the Fisher information divergence, the latter being crucial as it does not depend on an unknown normalizing constant. The performance of the proposed method is demonstrated through numerous illustrations and comparisons with existing methods; its main advantage is that it requires no input from the experimenter, making it fully automated.
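The difficulty the abstract refers to can be made concrete with a small sketch. This is not the paper's method, only a hypothetical toy comparison of M0: x ~ N(0, 1) (no free parameter) against M1: x ~ N(θ, 1) under the improper flat prior π(θ) = c; the marginal likelihood under M1 scales with the arbitrary constant c, so the resulting "Bayes factor" does too:

```python
import math

def normal_pdf(x, mean=0.0, sd=1.0):
    """Density of N(mean, sd^2) evaluated at x."""
    z = (x - mean) / sd
    return math.exp(-0.5 * z * z) / (sd * math.sqrt(2.0 * math.pi))

def marginal_m1(x, c, half_width=40.0, n=20000):
    """Marginal likelihood under M1: x ~ N(theta, 1) with improper prior
    pi(theta) = c, approximated by trapezoidal quadrature over
    [x - half_width, x + half_width] (the integrand is negligible outside)."""
    h = 2.0 * half_width / n
    total = 0.0
    for i in range(n + 1):
        theta = x - half_width + i * h
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * normal_pdf(x, mean=theta)
    return c * h * total

def bayes_factor_01(x, c):
    """B01 = m0(x) / m1(x), where m0 is the fixed N(0, 1) density."""
    return normal_pdf(x) / marginal_m1(x, c)

x = 1.3
b_with_c1 = bayes_factor_01(x, c=1.0)
b_with_c10 = bayes_factor_01(x, c=10.0)
# The two "Bayes factors" differ by exactly the ratio of the arbitrary
# constants, so on their own they carry no absolute evidential meaning.
print(b_with_c1 / b_with_c10)
```

The ratio printed is (up to quadrature error) exactly 10, the ratio of the two choices of c. Training-sample methods such as the Fractional and Intrinsic Bayes Factors, and the divergence-matching approach of this paper, are all ways of pinning down this otherwise arbitrary constant.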

Publication metadata

Author(s): Villa C, Walker SG

Publication type: Article

Publication status: Published

Journal: Computational Statistics and Data Analysis

Year: 2022

Volume: 168

Print publication date: 01/04/2022

Online publication date: 22/11/2021

Acceptance date: 18/11/2021

ISSN (print): 0167-9473

ISSN (electronic): 1872-7352

Publisher: Elsevier B.V.


DOI: 10.1016/j.csda.2021.107404

