Abstract
In this paper we address the problem of modelling time series with irregular intervals by incorporating a continuous-time version of the Kalman filter into a neural network architecture. Building on the idea of Recurrent Kalman Networks (RKNs), we use an encoder-decoder structure to learn a latent observation space and a latent state space in which the dynamics of the data can be approximated linearly. Within this latent space, a recurrent Kalman component alternates between continuous latent state propagation and Bayesian updates from incoming observations. This allows the model to react instantaneously to observations as they arrive at arbitrary time steps, while the learned nonlinear encoder and decoder provide sufficient expressive power to model nonlinear dynamics. Experiments on synthetic data show that the model is indeed able to capture continuous, nonlinear dynamics.
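The alternation the abstract describes is the classical continuous-discrete Kalman filter: between observations, the latent state is propagated under a linear stochastic differential equation dx = Fx dt + noise (which admits an exact discretization over any interval Δt), and at each observation the usual Bayesian update is applied. The following is a minimal numpy/scipy sketch of that filtering loop, not the paper's implementation; the transition matrix `F`, diffusion `Qc`, and observation model `H`, `R` are illustrative placeholders for what the paper learns with its encoder-decoder.

```python
import numpy as np
from scipy.linalg import expm

def discretize(F, Qc, dt):
    """Exact discretization of dx = F x dt + dw, Cov(dw) = Qc dt,
    over an arbitrary interval dt (van Loan's matrix-exponential method)."""
    n = F.shape[0]
    M = np.zeros((2 * n, 2 * n))
    M[:n, :n] = F
    M[:n, n:] = Qc
    M[n:, n:] = -F.T
    Phi = expm(M * dt)
    Ad = Phi[:n, :n]              # discrete transition matrix
    Qd = Phi[:n, n:] @ Ad.T       # discrete process noise covariance
    return Ad, 0.5 * (Qd + Qd.T)  # symmetrize for numerical safety

def predict(m, P, F, Qc, dt):
    """Propagate the latent Gaussian belief forward by dt (continuous step)."""
    Ad, Qd = discretize(F, Qc, dt)
    return Ad @ m, Ad @ P @ Ad.T + Qd

def update(m, P, y, H, R):
    """Bayesian update of the belief given observation y (discrete step)."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = np.linalg.solve(S, H @ P).T     # Kalman gain P H^T S^{-1}
    m_new = m + K @ (y - H @ m)
    P_new = (np.eye(len(m)) - K @ H) @ P
    return m_new, P_new
```

Because `discretize` takes the interval length as an argument, the same filter handles arbitrarily spaced observations: each incoming measurement triggers one `predict` over the elapsed gap followed by one `update`, exactly the alternation described above.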
References
Chen, T.Q., Rubanova, Y., Bettencourt, J., Duvenaud, D.: Neural ordinary differential equations. In: Advances in Neural Information Processing Systems 31, pp. 6571–6583. Curran Associates, Inc. (2018)
Rubanova, Y., Chen, T.Q., Duvenaud, D.: Latent ordinary differential equations for irregularly-sampled time series. In: Advances in Neural Information Processing Systems 32, pp. 5320–5330. Curran Associates, Inc. (2019)
Kidger, P., Morrill, J., Foster, J., Lyons, T.: Neural controlled differential equations for irregular time series. arXiv preprint arXiv:2005.08926 (2020)
De Brouwer, E., Simm, J., Arany, A., Moreau, Y.: GRU-ODE-Bayes: continuous modeling of sporadically-observed time series. In: Advances in Neural Information Processing Systems 32, pp. 7379–7390. Curran Associates, Inc. (2019)
Kalman, R.E.: A new approach to linear filtering and prediction problems. Trans. ASME J. Basic Eng. 82, 35–45 (1960)
Jazwinski, A.H.: Stochastic Processes and Filtering Theory. Academic Press, New York (1970)
Axelsson, P., Gustafsson, F.: Discrete-time solutions to the continuous-time differential Lyapunov equation with applications to Kalman filtering. IEEE Trans. Autom. Control 60(3), 632–643 (2014)
Fraccaro, M., Kamronn, S., Paquet, U., Winther, O.: A disentangled recognition and nonlinear dynamics model for unsupervised learning. In: Advances in Neural Information Processing Systems, pp. 3601–3610. NeurIPS (2017)
Becker, P., Pandya, H., Gebhardt, G., Zhao, C., Taylor, C.J., Neumann, G.: Recurrent Kalman networks: factorized inference in high-dimensional deep feature spaces. In: International Conference on Machine Learning, pp. 544–552. PMLR (2019)
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Schirmer, M., Eltayeb, M., Rudolph, M. (2021). Continuous-Discrete Recurrent Kalman Networks for Irregular Time Series. In: Kamp, M., et al. Machine Learning and Principles and Practice of Knowledge Discovery in Databases. ECML PKDD 2021. Communications in Computer and Information Science, vol 1524. Springer, Cham. https://doi.org/10.1007/978-3-030-93736-2_23
DOI: https://doi.org/10.1007/978-3-030-93736-2_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-93735-5
Online ISBN: 978-3-030-93736-2
eBook Packages: Computer Science (R0)