In this work we address regularization parameter estimation for ill-posed linear inverse problems with an ℓ₂ penalty. Selecting the regularization parameter is of utmost importance for all inverse problems, and in practice it often relies on the experience of the practitioner. For regularization with an ℓ₂ penalty, many parameter selection methods exist that exploit the fact that the solution and the residual can be written in explicit form. These methods are functionals of the regularization parameter whose minimizer is the desired parameter, which should lead to a good solution. Evaluating these functionals still requires solving the inverse problem multiple times; efficient evaluation can be achieved through model order reduction. Two popular model order reduction techniques are Lanczos-based methods (a class of Krylov subspace methods) and the Randomized Singular Value Decomposition (RSVD). In this work we compare the two approaches. We derive error bounds for the parameter selection methods using the RSVD, and we compare the performance of the Lanczos process with that of the RSVD for efficient parameter selection. The RSVD algorithm we use is based on the Adaptive Randomized Range Finder, which allows for easy determination of the dimension of the reduced-order model. Some parameter selection methods also require evaluating the trace of a large matrix. We compare the use of a randomized trace estimator with the use of the Ritz values from the Lanczos process. The examples we use for our experiments are two model problems from the geosciences.
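To illustrate the two randomized building blocks mentioned above, the sketch below is a minimal Python/NumPy illustration, not the implementation used in the paper: a simplified adaptive range finder driving an RSVD, together with a Hutchinson-style randomized trace estimator. The function names, block size, and the crude stopping rule are illustrative assumptions only.

```python
import numpy as np

def adaptive_range_finder(A, tol=1e-6, block_size=10, max_rank=200, rng=None):
    """Grow an orthonormal basis Q block by block until Q captures the range
    of A to roughly the tolerance tol (a simplified variant of the Adaptive
    Randomized Range Finder; the stopping rule here is only a crude proxy)."""
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    Q = np.zeros((m, 0))
    while Q.shape[1] < min(max_rank, min(m, n)):
        Omega = rng.standard_normal((n, block_size))   # random test matrix
        Y = A @ Omega                                  # sample the range of A
        Y -= Q @ (Q.T @ Y)                             # remove what Q already captures
        if np.linalg.norm(Y, 2) < tol:                 # residual block is negligible
            break
        Qi, _ = np.linalg.qr(Y)
        Q = np.hstack([Q, Qi])
    return Q

def rsvd(A, **kwargs):
    """Truncated SVD of A restricted to the subspace found by the range finder."""
    Q = adaptive_range_finder(A, **kwargs)
    B = Q.T @ A                                        # small k-by-n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub, s, Vt

def hutchinson_trace(matvec, n, n_samples=30, rng=None):
    """Randomized (Hutchinson) trace estimate using only matrix-vector products."""
    rng = np.random.default_rng() if rng is None else rng
    est = 0.0
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=n)            # Rademacher probe vector
        est += z @ matvec(z)
    return est / n_samples
```

With the factors U, s, Vt from such an RSVD, Tikhonov-type solutions and residuals can be formed for many values of the regularization parameter at negligible extra cost, which is what makes repeated evaluation of the parameter selection functionals cheap; the randomized trace estimator plays the analogous role for selection methods that involve the trace of a large matrix.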

doi.org/10.1016/j.cageo.2020.104427
Computers & Geosciences
Computational Imaging

Luiken, N., & van Leeuwen, T. (2020). Comparing RSVD and Krylov methods for linear inverse problems. Computers & Geosciences, 137, 104427. doi:10.1016/j.cageo.2020.104427