
Gradient-Free Kernel Stein Discrepancy

Lookup NU author(s): Matthew Fisher, Professor Chris Oates

Downloads

Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


Abstract

© 2023 Neural Information Processing Systems Foundation. All rights reserved.

Stein discrepancies have emerged as a powerful statistical tool, being applied to fundamental statistical problems including parameter inference, goodness-of-fit testing, and sampling. The canonical Stein discrepancies require the derivatives of a statistical model to be computed, and in return provide theoretical guarantees of convergence detection and control. However, for complex statistical models, the stable numerical computation of derivatives can require bespoke algorithmic development and render Stein discrepancies impractical. This paper focuses on posterior approximation using Stein discrepancies, and introduces a collection of non-canonical Stein discrepancies that are gradient-free, meaning that derivatives of the statistical model are not required. Sufficient conditions for convergence detection and control are established, and applications to sampling and variational inference are presented.
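To make the abstract's contrast concrete: a canonical kernel Stein discrepancy requires the score ∇ log p of the statistical model, whereas a gradient-free variant can replace it with the score of a tractable auxiliary density q and reweight samples by the ratio q/p. The following is a minimal sketch under those assumptions; the Gaussian base kernel, the choice of auxiliary density, and all function names are illustrative, not the paper's exact estimators.

```python
# Minimal sketch: canonical vs. gradient-free kernel Stein discrepancy.
# Assumes a Langevin-Stein kernel with a Gaussian base kernel; the
# gradient-free route differentiates an auxiliary density q instead of
# the target p, and reweights samples by q/p (self-normalised here).
import numpy as np

def rbf(x, y, ell=1.0):
    """Gaussian base kernel and the derivatives the Stein kernel needs."""
    d = x - y
    k = np.exp(-np.dot(d, d) / (2 * ell**2))
    gx = -d / ell**2 * k                                 # grad_x k
    gy = d / ell**2 * k                                  # grad_y k
    gxy = (len(x) / ell**2 - np.dot(d, d) / ell**4) * k  # tr(grad_x grad_y k)
    return k, gx, gy, gxy

def stein_kernel(x, y, score):
    """Langevin-Stein kernel built from a score function grad log(density)."""
    k, gx, gy, gxy = rbf(x, y)
    sx, sy = score(x), score(y)
    return gxy + np.dot(gx, sy) + np.dot(gy, sx) + k * np.dot(sx, sy)

def ksd2(xs, score_p):
    """Canonical squared KSD: needs the derivative grad log p of the model."""
    n = len(xs)
    return np.mean([stein_kernel(xs[i], xs[j], score_p)
                    for i in range(n) for j in range(n)])

def gf_ksd2(xs, log_p, log_q, score_q):
    """Gradient-free variant: only point evaluations of p are used;
    derivatives are taken of the auxiliary q, and samples are
    reweighted by q/p."""
    w = np.exp([log_q(x) - log_p(x) for x in xs])
    w /= w.mean()  # self-normalise, so p and q may be unnormalised
    n = len(xs)
    return np.mean([w[i] * w[j] * stein_kernel(xs[i], xs[j], score_q)
                    for i in range(n) for j in range(n)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xs = rng.normal(size=(200, 1))           # samples to be assessed
    log_p = lambda x: -0.5 * np.dot(x, x)    # target: standard normal, unnormalised
    score_p = lambda x: -x                   # grad log p (canonical route only)
    log_q = lambda x: -0.25 * np.dot(x, x)   # auxiliary: N(0, 2), unnormalised
    score_q = lambda x: -0.5 * x             # grad log q (gradient-free route)
    print("canonical KSD^2    :", ksd2(xs, score_p))
    print("gradient-free KSD^2:", gf_ksd2(xs, log_p, log_q, score_q))
```

When the samples are drawn from the target, both quantities should be close to zero. The point of the gradient-free route is visible in the signatures: gf_ksd2 never touches the derivative of log p, at the price of point evaluations of p (up to a constant, via the self-normalised weights) and a differentiable auxiliary q.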


Publication metadata

Author(s): Fisher MA, Oates CJ

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: Advances in Neural Information Processing Systems 37 (NeurIPS 2023)

Year of Conference: 2023

Acceptance date: 02/04/2023

Publisher: Neural Information Processing Systems Foundation

URL: https://proceedings.neurips.cc/paper_files/paper/2023/hash/4b4d25dc0c52d3cf43d5b203cdfdf241-Abstract-Conference.html

