Parallel algorithms for downdating the least-squares estimator of the regression model
Computationally efficient parallel algorithms for downdating the least-squares estimator of the ordinary linear model (OLM) are proposed. The algorithms are block versions of sequential Givens strategies and efficiently exploit the triangular structure of the matrices. The first strategy utilizes the orthogonal matrix derived from the QR decomposition of the initial data matrix; this orthogonal matrix is updated and explicitly computed. The second approach is based on hyperbolic transformations and is equivalent to updating the model with the data to be deleted treated as imaginary. An efficient distribution of the matrices over the processors is proposed. Furthermore, the new algorithms require no inter-processor communication. Theoretical complexities are derived, and experimental results are presented and analyzed. The parallel strategies are scalable and highly efficient for solving large-scale downdating least-squares problems.
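To make the second strategy concrete, the following is a minimal NumPy sketch of the underlying sequential hyperbolic downdating step, not the paper's block-parallel algorithm: given the triangular factor R of the data matrix, a sequence of hyperbolic (J-orthogonal) rotations annihilates the row being deleted, which is algebraically the same as updating with that row treated as imaginary. The function name hyperbolic_downdate and the well-posedness check are our own; we assume the downdated problem remains of full rank.

```python
import numpy as np

def hyperbolic_downdate(R, z):
    """Downdate the triangular factor R after deleting observation z,
    so that on return R.T @ R equals (old R.T @ R) - np.outer(z, z).
    Sketch only; assumes |z[k]| < |R[k, k]| at every step, i.e. the
    downdated least-squares problem stays well posed.
    """
    R = R.astype(float).copy()
    z = z.astype(float).copy()
    n = R.shape[0]
    for k in range(n):
        t = z[k] / R[k, k]              # hyperbolic tangent of the rotation
        if abs(t) >= 1.0:
            raise ValueError("downdating would make the factor singular")
        c = np.sqrt(1.0 - t * t)
        rk = R[k, k:].copy()            # save row before overwriting the view
        R[k, k:] = (rk - t * z[k:]) / c  # J-orthogonal rotation of the row pair
        z[k:] = (z[k:] - t * rk) / c     # annihilates z[k]
    return R

# Usage: remove the first observation from a small least-squares problem.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))
R = np.linalg.qr(X, mode="r")
R_down = hyperbolic_downdate(R, X[0])
R_ref = np.linalg.qr(X[1:], mode="r")
assert np.allclose(np.abs(R_down), np.abs(R_ref))  # equal up to row signs
```

Each rotation H = (1 - t^2)^(-1/2) [[1, -t], [-t, 1]] satisfies H J H^T = J with J = diag(1, -1), so the quantity R^T R - z z^T is invariant at every step; this is the sense in which downdating is equivalent to an update with imaginary data.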