Description
The Primal-Dual Hybrid Gradient (PDHG) algorithm is relevant to image reconstruction because it can handle non-smooth penalties. It also serves as the basis for the “learned primal dual” method, which enables an AI-based, physics-inspired reconstruction. A particular challenge in emission tomography is that the objective is the Poisson likelihood, which often implies slower convergence. In this study, we compare the convergence properties of the preconditioned PDHG with the commonly used Maximum Likelihood Expectation Maximization (ML-EM) method in Positron Emission Tomography (PET). We provide theoretical considerations and simulations performed on an idealized 2D setup. Our findings indicate that, unlike for ML-EM, the convergence speed of PDHG is independent of signal contrast. With a diagonal preconditioner, we achieved performance comparable to ML-EM. However, we found that the scaling of the data significantly impacts the convergence rate of PDHG: the optimal scaling is reached when the average image values are of the order of 1. This behavior appears to stem from the differing magnitudes of the Hessians of the primal and dual problems, but it can be addressed by appropriate scaling before reconstruction.
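The following is a minimal sketch, not the authors' code, of the two algorithms compared in the abstract: ML-EM and diagonally preconditioned PDHG (step-size rule in the style of Pock and Chambolle, 2011) applied to the Poisson likelihood. The toy system matrix `A`, data `y`, problem sizes, and iteration counts are illustrative assumptions; background terms (scatter, randoms) are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model (assumption): random nonnegative system matrix, Poisson data.
n_pix, n_bins = 64, 128
A = rng.uniform(0.0, 1.0, size=(n_bins, n_pix))
x_true = rng.uniform(0.5, 2.0, size=n_pix)
y = rng.poisson(A @ x_true).astype(float)

def mlem(A, y, n_iter=100):
    """Classic ML-EM: multiplicative updates that preserve nonnegativity."""
    sens = A.T @ np.ones(len(y))                    # sensitivity image A^T 1
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)        # data / current projection
        x *= (A.T @ ratio) / sens
    return x

def pdhg_poisson(A, y, n_iter=100):
    """Diagonally preconditioned PDHG for
    min_x sum_i [(Ax)_i - y_i log (Ax)_i]  s.t. x >= 0,
    using the closed-form prox of the conjugate KL data term."""
    tau = 1.0 / np.maximum(A.sum(axis=0), 1e-12)    # primal steps: 1 / column sums
    sigma = 1.0 / np.maximum(A.sum(axis=1), 1e-12)  # dual steps: 1 / row sums
    x = np.ones(A.shape[1])
    x_bar = x.copy()
    u = np.zeros(len(y))
    for _ in range(n_iter):
        # Dual prox of the conjugate of z -> sum(z - y log z).
        t = u + sigma * (A @ x_bar)
        u = 0.5 * (t + 1.0 - np.sqrt((t - 1.0) ** 2 + 4.0 * sigma * y))
        # Primal step plus nonnegativity projection, then over-relaxation.
        x_new = np.maximum(x - tau * (A.T @ u), 0.0)
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

x_em = mlem(A, y)
x_pdhg = pdhg_poisson(A, y)
print("ML-EM vs PDHG max abs difference:", np.abs(x_em - x_pdhg).max())
```

Multiplying `x_true` (and hence `y`) by a constant in this sketch is one way to probe the scale dependence the abstract describes: ML-EM is invariant to such rescaling, while the PDHG iterates are not.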
| Field | Software and quantification |
| --- | --- |