Description
Many model order reduction (MOR) methods rely on the computation of an orthonormal basis of a subspace onto which the large full order model is projected. Numerically, this entails the orthogonalization of a set of vectors. The nature of the MOR process imposes several requirements on the orthogonalization. Firstly, MOR is often performed in an adaptive or iterative manner, where the quality of the reduced order model, i.e., the dimension of the reduced subspace, is decided on the fly. It is therefore important that the orthogonalization routine can be executed iteratively. Secondly, one may have to deal with high-dimensional arrays of abstract vectors that do not allow explicit access to their individual entries, making it difficult to employ so-called orthogonal triangularization algorithms such as Householder QR. For these reasons, (modified) Gram-Schmidt-type algorithms are commonly used in applications. These methods belong to the category of triangular orthogonalization algorithms, which do not rely on element-wise access to the vectors and can easily be updated.
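As a reading aid only (not the speakers' implementation), the following sketch shows how a modified Gram-Schmidt step extends an existing orthonormal basis by one vector; the helper name `mgs_extend` and the deflation tolerance are assumptions made for illustration.

```python
import numpy as np

def mgs_extend(Q, v, tol=1e-12):
    """Extend an orthonormal basis Q (m x k) by the new vector v (length m).

    Modified Gram-Schmidt only needs inner products and vector updates,
    so it works on abstract vectors and can be applied one vector at a
    time as the reduced subspace grows.
    """
    q = v.astype(float)
    for _ in range(2):                        # second pass re-orthogonalizes
        for j in range(Q.shape[1]):
            q -= (Q[:, j] @ q) * Q[:, j]      # remove component along j-th basis vector
    nrm = np.linalg.norm(q)
    if nrm <= tol * np.linalg.norm(v):        # heuristic: v numerically lies in span(Q)
        return Q
    return np.column_stack([Q, q / nrm])
```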
Recently, shifted-Cholesky-QR-type algorithms have gained attention. These also belong to the aforementioned category and have proven their aptitude for MOR algorithms in previous studies. A key benefit of these methods is that they are communication-avoiding, which leads to vastly superior performance on memory-bandwidth-limited problems and on parallel or distributed architectures. This work formulates an efficient updating scheme for Cholesky-QR-type algorithms and proposes an improved shifting strategy for highly ill-conditioned matrices. Driven by the MOR applications, we further introduce in the numerical experiments the dominant subspace angle as a quality measure, in addition to classical measures such as the deviation from orthogonality and the reconstruction error.
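For orientation, a minimal sketch of a shifted CholeskyQR iteration is given below. The shift formula used here is a commonly cited heuristic and stands in for, rather than reproduces, the improved shifting strategy of the talk.

```python
import numpy as np

def shifted_cholesky_qr(X, sweeps=2):
    """Shifted CholeskyQR sketch computing X = Q @ R.

    Each sweep needs only one Gram-matrix product (a single global
    reduction), which is what makes the approach communication-avoiding.
    The shift below is a standard heuristic safeguard for ill-conditioned
    X, not the improved strategy proposed in the talk.
    """
    m, n = X.shape
    Q, R = X.astype(float), np.eye(n)
    for _ in range(sweeps):
        G = Q.T @ Q                                        # Gram matrix, one reduction
        shift = 11 * (m * n + n * (n + 1)) * np.finfo(float).eps * np.linalg.norm(G, 2)
        L = np.linalg.cholesky(G + shift * np.eye(n))      # G + shift*I = L @ L.T
        Q = np.linalg.solve(L, Q.T).T                      # Q <- Q @ inv(L).T (triangular solve)
        R = L.T @ R                                        # accumulate the triangular factor
    return Q, R
```

Because the only operations touching the tall matrix are the Gram-matrix product and a triangular solve, few global reductions are required, which is the sense in which such methods avoid communication.

Similarly, the two quality measures mentioned above can be sketched as follows; the choice of returning the largest principal angle between the dominant subspaces is an assumption made for illustration.

```python
import numpy as np

def orthogonality_deviation(Q):
    """Classical quality measure: || I - Q.T @ Q || in the Frobenius norm."""
    k = Q.shape[1]
    return np.linalg.norm(np.eye(k) - Q.T @ Q)

def dominant_subspace_angle(Q_ref, Q, k):
    """Largest principal angle (radians) between the dominant k-dimensional
    subspaces spanned by the leading columns of Q_ref and Q.

    Assumes both inputs have orthonormal columns; the cosines of the
    principal angles are the singular values of the cross-Gram matrix.
    """
    s = np.linalg.svd(Q_ref[:, :k].T @ Q[:, :k], compute_uv=False)
    return np.arccos(np.clip(np.min(s), -1.0, 1.0))
```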