A numerical time-stepping algorithm for ordinary or partial differential equations is proposed that adaptively modifies the dimensionality of the underlying modal basis expansion. Specifically, the method takes advantage of any underlying low-dimensional manifolds or subspaces in the system by using dimensionality-reduction techniques, such as the proper orthogonal decomposition, to adaptively represent the solution in an optimal set of basis modes. The method can provide significant computational savings for systems with low-dimensional manifolds, since the reduction can lower the dimensionality of the underlying high-dimensional system by orders of magnitude. A comparison of the computational efficiency and error of this method is given, showing the algorithm to be potentially of great value for simulations of high-dimensional dynamical systems, especially where slow-manifold dynamics are known to arise. The method is envisioned to automatically take advantage of any potential computational savings associated with dimensionality reduction, much as adaptive time-steppers automatically take advantage of large step sizes whenever possible.
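To illustrate the general idea (not the paper's specific algorithm), the following is a minimal sketch of a POD-adapted time stepper for a linear system du/dt = A u, assuming explicit Euler steps. The function names, the energy threshold, and the alternation between short full-order snapshot bursts and cheap reduced-order steps are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: adaptive POD-based time stepping for du/dt = A u.
# Snapshots of the full solution seed an SVD; the rank r is chosen from the
# singular-value energy, and subsequent steps are taken in reduced coordinates.
import numpy as np

def pod_basis(snapshots, energy=0.9999):
    """Return the leading POD modes capturing the requested energy fraction."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :min(r, U.shape[1])]

def step_full(u, A, dt):
    """One explicit Euler step of the full high-dimensional system."""
    return u + dt * (A @ u)

def adaptive_pod_stepper(u0, A, dt, n_steps, n_snap=20, recheck=100):
    """Alternate between full-order snapshot bursts and reduced-order stepping."""
    u, history, k = u0.copy(), [u0.copy()], 0
    while k < n_steps:
        # Collect a short burst of full-order snapshots to (re)build the basis.
        snaps = []
        for _ in range(min(n_snap, n_steps - k)):
            u = step_full(u, A, dt)
            snaps.append(u.copy()); history.append(u.copy()); k += 1
        Phi = pod_basis(np.column_stack(snaps))
        A_r = Phi.T @ A @ Phi            # Galerkin-projected operator
        a = Phi.T @ u                    # reduced coordinates
        # Step cheaply in the low-dimensional space until the next basis re-check.
        for _ in range(min(recheck, n_steps - k)):
            a = a + dt * (A_r @ a)
            history.append(Phi @ a); k += 1
        u = Phi @ a                      # lift back to the full state
    return np.column_stack(history)

# Example: a stiff linear system whose dynamics collapse onto a slow manifold.
rng = np.random.default_rng(0)
n = 200
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
rates = -np.concatenate([np.linspace(0.1, 1.0, 5), np.linspace(50, 500, n - 5)])
A = Q @ np.diag(rates) @ Q.T             # 5 slow directions, the rest fast
sol = adaptive_pod_stepper(rng.standard_normal(n), A, dt=1e-3, n_steps=2000)
```

In this toy setting the fast directions decay quickly, so after the initial transient the POD basis retains only a handful of modes and most of the time stepping happens in the small reduced system, which is the kind of saving the abstract describes.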