Description: Many computational problems depend on solving systems of linear equations. The Conjugate Gradient method (CG) is a widely used iterative solver for such systems. Terminating CG early saves computational resources at the cost of accuracy.
The Bayesian Conjugate Gradient method (BayesCG) is a probabilistic generalization of CG: it solves the linear system and additionally produces a probability distribution, called the posterior, that models the error. One input to BayesCG is a probability distribution, called the prior, that models the user's initial belief about the solution. The choice of prior affects both how well the posterior models the error and how quickly BayesCG converges to the solution. We develop a structure-exploiting prior under which BayesCG matches the fast convergence of CG, nearly matches the low computational cost of CG, and produces an informative posterior distribution. We also discuss how BayesCG can be used to analyze the propagation of CG error through a computational pipeline.
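For readers who want to experiment before the talk, the sketch below implements the basic BayesCG recursion as published by Cockayne, Oates, Ipsen, and Girolami (2019) in NumPy, for a symmetric positive definite matrix A and a Gaussian prior N(x0, Sigma0). The function name `bayescg`, the stopping tolerance, and the low-rank covariance factor it returns are illustrative choices, not the speakers' implementation, and the structure-exploiting prior described above is a separate construction not shown here.

```python
import numpy as np

def bayescg(A, b, x0, Sigma0, tol=1e-8, maxiter=None):
    """Minimal BayesCG sketch: solve A x = b under the prior N(x0, Sigma0).

    Returns the posterior mean x and a factor F such that the posterior
    covariance is Sigma0 - F @ F.T (a rank-m downdate after m steps).
    """
    maxiter = maxiter if maxiter is not None else b.size
    x = x0.astype(float).copy()
    r = b - A @ x                      # residual of the current posterior mean
    s = r.copy()                       # first search direction
    rTr = r @ r
    cols = []                          # columns of the covariance downdate factor
    for _ in range(maxiter):
        if np.sqrt(rTr) <= tol * np.linalg.norm(b):
            break
        u = Sigma0 @ (A.T @ s)         # Sigma0 A^T s
        w = A @ u                      # A Sigma0 A^T s
        E2 = s @ w                     # squared norm of s in the A Sigma0 A^T inner product
        alpha = rTr / E2
        x += alpha * u                 # update posterior mean
        r -= alpha * w                 # update residual
        cols.append(u / np.sqrt(E2))   # rank-1 covariance downdate direction
        rTr_new = r @ r
        s = r + (rTr_new / rTr) * s    # next A Sigma0 A^T-conjugate direction
        rTr = rTr_new
    F = np.column_stack(cols) if cols else np.zeros((b.size, 0))
    return x, F
```

As a sanity check on the sketch: choosing Sigma0 = A^{-1} (the so-called inverse prior) makes u = s above, so the posterior means coincide with the CG iterates, which is the sense in which BayesCG generalizes CG. Since applying A^{-1} is as hard as solving the system itself, a practical prior, such as the structure-exploiting prior in the abstract, aims for comparable behavior at much lower cost.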