

Integration in Reproducing Kernel Hilbert Spaces of Gaussian Kernels

Lookup NU author(s): Professor Chris Oates


Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


© 2021 American Mathematical Society

The Gaussian kernel plays a central role in machine learning, uncertainty quantification and scattered data approximation, but has received relatively little attention from a numerical analysis standpoint. The basic problem of finding an algorithm for efficient numerical integration of functions reproduced by Gaussian kernels has not been fully solved. In this article we construct two classes of algorithms that use N evaluations to integrate d-variate functions reproduced by Gaussian kernels and prove the exponential or super-algebraic decay of their worst-case errors. In contrast to earlier work, no constraints are placed on the length-scale parameter of the Gaussian kernel. The first class of algorithms is obtained via an appropriate scaling of the classical Gauss–Hermite rules. For these algorithms we derive lower and upper bounds on the worst-case error of the forms exp(-c_1 N^(1/d)) N^(1/(4d)) and exp(-c_2 N^(1/d)) N^(-1/(4d)), respectively, for positive constants c_1 and c_2. The second class of algorithms we construct is more flexible and uses worst-case optimal weights for points that may be taken as a nested sequence. For these algorithms we derive upper bounds of the form exp(-c_3 N^(1/(2d))) for a positive constant c_3.
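To make the first construction concrete: a minimal illustrative sketch of a scaled Gauss-Hermite rule in one dimension is given below. This is not the authors' exact construction — the scaling factor `alpha` is a hypothetical free parameter standing in for the length-scale-dependent scaling derived in the paper — but it shows the basic mechanism of taking classical Gauss-Hermite nodes and dilating them before evaluating the integrand.

```python
import numpy as np

def scaled_gauss_hermite(f, n, alpha=1.0):
    """Approximate E[f(Z)], Z ~ N(0, 1), with an n-point Gauss-Hermite
    rule whose nodes are dilated by a factor alpha.

    alpha = 1 recovers the classical Gauss-Hermite rule; alpha != 1
    is an illustrative stand-in for the length-scale-dependent node
    scaling analysed in the paper.
    """
    # hermgauss targets the weight exp(-x^2); the change of variables
    # x -> sqrt(2) x converts it to the standard normal density.
    x, w = np.polynomial.hermite.hermgauss(n)
    nodes = alpha * np.sqrt(2.0) * x
    weights = w / np.sqrt(np.pi)
    return np.sum(weights * f(nodes))
```

With `alpha = 1` the rule is exact for polynomials of degree up to 2n - 1, e.g. `scaled_gauss_hermite(lambda x: x**2, 5)` returns the second moment of the standard normal, 1. A d-variate rule would be obtained by tensorisation, at the cost of the N^(1/d) scaling visible in the error bounds above.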

Publication metadata

Author(s): Karvonen T, Oates CJ, Girolami M

Publication type: Article

Publication status: Published

Journal: Mathematics of Computation

Year: 2021

Volume: 90

Issue: 331

Pages: 2209-2233

Print publication date: 01/09/2021

Online publication date: 18/06/2021

Acceptance date: 16/12/2020

ISSN (print): 0025-5718

ISSN (electronic): 1088-6842

Publisher: American Mathematical Society


DOI: 10.1090/mcom/3659

