Consider a noisy chirp signal $$ \begin{equation} \begin{split} Y_k &= \alpha(t_k) \sin\bigg( \int^{t_k}_0 2 \, \pi \, f(s) \diff s \bigg) \\ &\quad+ \xi_k. \end{split} \nonumber \end{equation} $$ This repository offers an SDE-GP approach to estimating the instantaneous frequency function $f$ from measurements $\lbrace Y_1, Y_2,\ldots \rbrace$.
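To make the measurement model concrete, here is a minimal NumPy sketch that simulates $Y_k$ for a hypothetical linearly increasing frequency $f$ and constant amplitude $\alpha$; the function names and parameter choices are illustrative, not part of the repository's API.

```python
import numpy as np

def chirp_measurements(f, alpha, ts, noise_std, rng):
    """Simulate Y_k = alpha(t_k) * sin(2*pi * int_0^{t_k} f(s) ds) + xi_k.

    The integral of f is approximated with the cumulative trapezoidal rule.
    """
    fs = f(ts)
    # cumulative phase integral from 0 to each t_k (trapezoidal rule)
    phase = 2 * np.pi * np.concatenate(
        ([0.0], np.cumsum(0.5 * (fs[1:] + fs[:-1]) * np.diff(ts)))
    )
    return alpha(ts) * np.sin(phase) + noise_std * rng.standard_normal(ts.shape)

rng = np.random.default_rng(0)
ts = np.linspace(0.0, 1.0, 1000)
ys = chirp_measurements(lambda t: 5.0 + 20.0 * t,   # linearly increasing frequency
                        lambda t: np.ones_like(t),  # constant amplitude
                        ts, noise_std=0.1, rng=rng)
```

The estimation task is then the inverse problem: recover $f$ from `ys` alone.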

This repository contains the code that reproduces the experiments and figures in my doctoral thesis, as well as the LaTeX source files that compile the thesis.

Generative diffusions are powerful models for sampling from complex distributions. How can we develop an MCMC sampler for conditional sampling with them?

This repo features a beginner-level Rust implementation of the Kalman filter and RTS smoother.
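The repo itself is in Rust; as a language-agnostic illustration of the two recursions it implements, here is a scalar Kalman filter followed by a Rauch--Tung--Striebel smoothing pass in NumPy (a sketch with hypothetical names, not the repo's API).

```python
import numpy as np

def kf_rts(ys, A, Q, H, R, m0, P0):
    """Scalar Kalman filter + RTS smoother for the model
    x_k = A x_{k-1} + q_k (var Q),  y_k = H x_k + r_k (var R)."""
    n = len(ys)
    mf, Pf = np.empty(n), np.empty(n)
    m, P = m0, P0
    for k, y in enumerate(ys):
        # predict
        m, P = A * m, A * P * A + Q
        # update with measurement y_k
        S = H * P * H + R
        K = P * H / S
        m, P = m + K * (y - H * m), P - K * S * K
        mf[k], Pf[k] = m, P
    # backward RTS smoothing pass
    ms, Ps = mf.copy(), Pf.copy()
    for k in range(n - 2, -1, -1):
        Pp = A * Pf[k] * A + Q              # predicted variance at k+1
        G = Pf[k] * A / Pp                  # smoother gain
        ms[k] = mf[k] + G * (ms[k + 1] - A * mf[k])
        Ps[k] = Pf[k] + G * (Ps[k + 1] - Pp) * G
    return mf, Pf, ms, Ps
```

Note that the smoothed variances `Ps` never exceed the filtered ones `Pf`, since smoothing conditions on the full data record.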

Consider the filtering problem for a model $$ \begin{equation} \begin{split} X_k \cond X_{k-1} &\sim \PP_{X_k \cond X_{k-1}}, \\ Y_k \cond X_k &\sim p_{Y_k \cond X_k}. \end{split} \nonumber \end{equation} $$ We provide an asymptotically exact algorithm to compute the moment $\mathbb{E}[X_k^n \mid Y_{1:k}]$ for arbitrary order $n$.
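For comparison, a bootstrap particle filter also gives a consistent (as the particle count grows) estimate of $\mathbb{E}[X_k^n \mid Y_{1:k}]$; the sketch below does this for a hypothetical linear-Gaussian instance of the model, and is a baseline rather than the repository's algorithm.

```python
import numpy as np

def pf_moment(ys, n_order, J, rng):
    """Bootstrap particle filter estimate of E[X_k^n | Y_{1:k}] for the
    example model  X_k = 0.9 X_{k-1} + N(0, 1),  Y_k | X_k ~ N(X_k, 0.5^2)."""
    xs = rng.standard_normal(J)                         # samples from the N(0, 1) prior
    moments = []
    for y in ys:
        xs = 0.9 * xs + rng.standard_normal(J)          # propagate through the dynamics
        logw = -0.5 * ((y - xs) / 0.5) ** 2             # Gaussian log-likelihood (unnormalised)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        moments.append(np.sum(w * xs ** n_order))       # weighted moment estimate
        xs = rng.choice(xs, size=J, p=w)                # multinomial resampling
    return np.array(moments)
```

Such Monte Carlo baselines are useful for checking the moment estimates, though they are only exact in the limit of infinitely many particles.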

Consider a Bayesian neural network where we only put priors on a subset of its parameters. How do we efficiently train such a neural network within the Feynman--Kac formalism?

You can do Gaussian process regression in $O(\log N)$ time, far faster than the standard cubic $O(N^3)$! This is achieved by parallelising the regression. The implementation supports Matérn, RBF, quasi-periodic, and other covariance functions, but it currently handles temporal Gaussian processes only. It could be extended to more general GPs, provided that their covariance functions are separable.
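The speed-up rests on rewriting GP regression as Kalman filtering of an equivalent state-space model, whose recursion the parallel algorithm then evaluates as an associative scan in $O(\log N)$ span. The sketch below shows only the underlying sequential $O(N)$ filter for a Matérn-3/2 temporal GP (assumed hyperparameters and function names, not the repository's API).

```python
import numpy as np
from scipy.linalg import expm

def matern32_ss(ell, sigma):
    """State-space form of the Matern-3/2 covariance function."""
    lam = np.sqrt(3.0) / ell
    F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])   # SDE drift matrix
    Pinf = np.diag([sigma**2, lam**2 * sigma**2])       # stationary covariance
    H = np.array([[1.0, 0.0]])                          # observe the function value
    return F, Pinf, H

def gp_filter(ts, ys, ell, sigma, noise_var):
    """Sequential O(N) Kalman filter for temporal GP regression."""
    F, Pinf, H = matern32_ss(ell, sigma)
    m, P = np.zeros(2), Pinf
    means = []
    for k in range(len(ts)):
        if k > 0:
            A = expm(F * (ts[k] - ts[k - 1]))
            m = A @ m
            P = A @ P @ A.T + (Pinf - A @ Pinf @ A.T)   # exact discretisation noise
        S = (H @ P @ H.T)[0, 0] + noise_var
        K = (P @ H.T)[:, 0] / S
        m = m + K * (ys[k] - (H @ m)[0])
        P = P - np.outer(K, K) * S
        means.append(m[0])
    return np.array(means)
```

The parallel version replaces this for-loop with a scan over associatively combinable filtering elements, which is what brings the span down to $O(\log N)$.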

Given a signal corrupted by Gaussian measurement noise, how can we compute its spectrogram? We can put priors on its Fourier coefficients, then use the Kalman filter and smoother to find their posterior distributions. This yields a continuous-time spectrogram for evenly or unevenly sampled data, in contrast to traditional window-based methods such as the STFT or Mel spectrogram.
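A minimal sketch of the idea (filter only, no smoother; the frequency grid, random-walk prior, and noise levels are all illustrative assumptions): each frequency contributes a pair of Fourier coefficients $(a_j, b_j)$ that evolve as random walks, and a Kalman filter tracks their posterior from the samples.

```python
import numpy as np

def kalman_spectrogram(ts, ys, freqs, q=1e-3, r=0.1):
    """Track time-varying Fourier coefficients with a Kalman filter.

    State: (a_j, b_j) per frequency f_j, modelled as random walks;
    measurement: y_k = sum_j a_j cos(2 pi f_j t_k) + b_j sin(2 pi f_j t_k) + noise.
    Returns posterior amplitudes sqrt(a_j^2 + b_j^2) at every sample time,
    i.e. a spectrogram defined in continuous time (ts may be unevenly spaced).
    """
    n = 2 * len(freqs)
    m, P = np.zeros(n), np.eye(n)
    amps = []
    t_prev = ts[0]
    for t, y in zip(ts, ys):
        P = P + q * (t - t_prev) * np.eye(n)       # random-walk prediction step
        t_prev = t
        H = np.empty(n)                            # time-varying measurement row
        H[0::2] = np.cos(2 * np.pi * freqs * t)
        H[1::2] = np.sin(2 * np.pi * freqs * t)
        S = H @ P @ H + r
        K = P @ H / S
        m = m + K * (y - H @ m)
        P = P - np.outer(K, K) * S
        amps.append(np.hypot(m[0::2], m[1::2]))
    return np.array(amps)                          # shape (len(ts), len(freqs))
```

Running an RTS smoother backwards over the same model would refine these estimates further; the filter alone already concentrates the amplitude on the active frequency.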

How do you find a good prior for your latent function? You can try the state-space deep Gaussian process (SS-DGP) -- a general class of non-stationary processes. In particular, if your latent function is irregular (e.g., discontinuous) or is assumed to have time-varying features (e.g., frequency or volatility), SS-DGP can be a powerful tool.

This is the theme from which this website is generated. Suitable for researchers working in academia.

Let $X$ be a stochastic process governed by an SDE. How to compute $$ \begin{equation} \mathbb{E} [\phi(X(t)) \mid X(s)] \nonumber \end{equation} $$ for any test function $\phi$ of interest and times $t\geq s$? Apart from the commonly used Euler--Maruyama and Monte Carlo methods, we can also use the Taylor moment expansion (TME) method. This gives closed-form, asymptotic approximations suitable for a large class of $\phi$.
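For reference, the Euler--Maruyama Monte Carlo baseline mentioned above looks like this (a generic sketch, not the TME method and not this package's API; the model and step counts are illustrative):

```python
import numpy as np

def em_conditional_expectation(drift, diff, x_s, s, t, phi,
                               n_steps=100, n_mc=10000, rng=None):
    """Monte Carlo estimate of E[phi(X(t)) | X(s) = x_s] for the SDE
    dX = drift(X) dt + diff(X) dW, simulated with Euler-Maruyama steps."""
    rng = rng or np.random.default_rng(0)
    dt = (t - s) / n_steps
    x = np.full(n_mc, x_s, dtype=float)
    for _ in range(n_steps):
        x = x + drift(x) * dt + diff(x) * np.sqrt(dt) * rng.standard_normal(n_mc)
    return phi(x).mean()
```

TME instead expands the conditional moment in powers of $t - s$, trading the sampling error of such estimators for a truncation error that is controlled asymptotically.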

This package features continuous-discrete stochastic filtering and smoothing using the TME method. It also includes a few general implementations, such as EKFS and sigma-point methods.