The Kalman filter update as presented on Wikipedia is inefficient when the latent space dimension n is larger than the observation space dimension m. This is because the covariance update involves matrix multiplications in the full latent space, which cost O(n³). Here is a faster way (all notation as on Wikipedia):
$$
\begin{aligned}
C_k &\leftarrow H P_{k|k-1} \\
S_k &\leftarrow R_k + C_k H^\top \\
K_k &\leftarrow (S_k^{-1} C_k)^\top \\
P_{k|k} &\leftarrow P_{k|k-1} - K_k C_k
\end{aligned}
$$
where $C_k = H P_{k|k-1}$ is an $m \times n$ intermediate value.
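As a sketch, the update above translates directly into NumPy (the function name and interface here are my own; a linear solve replaces the explicit inverse $S_k^{-1}$):

```python
import numpy as np

def kalman_cov_update(P_pred, H, R):
    """Gain and covariance update reusing C = H P_{k|k-1}.

    P_pred: (n, n) prior covariance P_{k|k-1}
    H:      (m, n) observation matrix
    R:      (m, m) observation noise covariance
    Returns (K, P_post): gain K_k (n, m) and posterior covariance P_{k|k}.
    """
    C = H @ P_pred                # m x n intermediate, O(m n^2)
    S = R + C @ H.T               # innovation covariance, m x m, O(m^2 n)
    K = np.linalg.solve(S, C).T   # (S^{-1} C)^T; solve, don't invert S
    P_post = P_pred - K @ C       # covariance update, O(m n^2)
    return K, P_post
```

Since $S_k$ is symmetric, $(S_k^{-1} C_k)^\top = C_k^\top S_k^{-1} = P_{k|k-1} H^\top S_k^{-1}$, which is exactly the standard gain.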
Complexity is dominated by the O(mn²) multiplication in the last step. This is in contrast to the standard update, which includes four O(n³) multiplications in the covariance update. A speed-up of O(n/m) may seem minor, but this form of the update also improves constant factors, and the speed-up was substantial in my application. This Technical Note on Manipulating Multivariate Gaussian Distributions may be helpful.
Rotating the latent space so that the projection H reduces to dropping rows/columns yields further speed-ups, since it removes two more matrix multiplications. In this form, the update uses only one O(m²n) and one O(mn²) operation, in contrast to the eleven matrix multiplications of a naive implementation of the standard covariance update.
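To illustrate the rotated form, suppose the latent space has been rotated so that H = [I_m, 0], i.e. the observation is the first m latent coordinates. Then H P is plain row slicing and no projection multiplications remain. This is a sketch under that assumption (names are mine):

```python
import numpy as np

def kalman_cov_update_selector(P_pred, R, m):
    """Covariance update when H = [I_m, 0] after a latent-space rotation.

    H P_{k|k-1} is just the first m rows of P_pred, so the two
    projection multiplications disappear entirely.
    """
    C = P_pred[:m, :]             # H P_{k|k-1} by slicing, no multiplication
    S = R + C[:, :m]              # R + H P H^T = R + top-left m x m block
    K = np.linalg.solve(S, C).T   # O(m^2 n) back-substitution
    P_post = P_pred - K @ C       # the single O(m n^2) multiplication
    return K, P_post
```

Only the solve (O(m²n) per the text, after an O(m³) factorization of S) and the final O(mn²) product remain.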