Deep state-space Gaussian processes

Zheng Zhao*, Muhammad Emzir, Simo Särkkä

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

This paper is concerned with a state-space approach to deep Gaussian process (DGP) regression. We construct the DGP by hierarchically placing transformed Gaussian process (GP) priors on the length scales and magnitudes of the next level of GPs in the hierarchy. The idea of the state-space approach is to represent the DGP as a non-linear hierarchical system of linear stochastic differential equations (SDEs), where each SDE corresponds to a conditional GP. The DGP regression problem then becomes a state estimation problem, and we can estimate the state efficiently with sequential methods by using the Markov property of the state-space DGP. The computational complexity scales linearly with the number of measurements. Based on this, we formulate state-space MAP as well as Bayesian filtering and smoothing solutions to the DGP regression problem. We demonstrate the performance of the proposed models and methods on synthetic non-stationary signals and apply the state-space DGP to the detection of gravitational waves from LIGO measurements.
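To make the state-space idea concrete, below is a minimal sketch of the shallow (single-layer) building block the abstract describes: a Matérn-3/2 GP written as a two-dimensional linear SDE, discretised, and regressed against data with a Kalman filter in O(n) time. This is an illustration of the general state-space GP technique, not the paper's deep construction — the deep model additionally places GP priors on the length scale and magnitude, whereas here both are fixed constants chosen for the example.

```python
import numpy as np
from scipy.linalg import expm

def matern32_ssm(lengthscale, magnitude, dt):
    """Discretised state-space form of a Matern-3/2 GP prior.

    State is (f, df/dt); the SDE is dx = F x dt + L dW with stationary
    covariance Pinf, discretised exactly over a step of length dt.
    """
    lam = np.sqrt(3.0) / lengthscale
    F = np.array([[0.0, 1.0],
                  [-lam**2, -2.0 * lam]])
    Pinf = np.diag([magnitude**2, lam**2 * magnitude**2])
    A = expm(F * dt)                  # exact transition matrix
    Q = Pinf - A @ Pinf @ A.T         # matched process-noise covariance
    return A, Q, Pinf

def kalman_filter(ys, dt, lengthscale=1.0, magnitude=1.0, noise_var=0.1):
    """Sequential GP regression: one predict/update per measurement,
    so the cost is linear in the number of measurements."""
    A, Q, Pinf = matern32_ssm(lengthscale, magnitude, dt)
    H = np.array([[1.0, 0.0]])        # observe the function value only
    m, P = np.zeros(2), Pinf.copy()
    means = []
    for y in ys:
        # Predict through the SDE dynamics.
        m, P = A @ m, A @ P @ A.T + Q
        # Update with the noisy measurement y = H x + r.
        S = H @ P @ H.T + noise_var   # innovation variance
        K = P @ H.T / S               # Kalman gain
        m = m + (K * (y - H @ m)).ravel()
        P = P - K @ H @ P
        means.append(m[0])
    return np.array(means)
```

Running the filter over a sampled signal returns the filtering posterior mean at each time point; a smoothing pass (e.g. Rauch–Tung–Striebel) would refine these estimates using all measurements, which is the role the filtering and smoothing solutions play in the paper.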

Original language: English
Article number: 75
Journal: Statistics and Computing
Volume: 31
Issue number: 6
State: Published - Nov 2021
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2021, The Author(s).

Keywords

  • Deep Gaussian process
  • Gaussian filtering and smoothing
  • Gaussian process regression
  • Gravitational wave detection
  • Maximum a posteriori estimate
  • Particle filter
  • State space
  • Stochastic differential equation

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Computational Theory and Mathematics
