Poster
Diffusion Sampling Correction via Approximately 10 Parameters
Guangyi Wang · Wei Peng · Lijiang Li · Wenyu Chen · Yuren Cai · Song-Zhi Su
East Exhibition Hall A-B #E-3304
While powerful for generation, Diffusion Probabilistic Models (DPMs) face slow sampling challenges, for which various distillation-based methods have been proposed. However, these methods typically require significant additional training costs and model parameter storage, limiting their practicality. In this work, we propose PCA-based Adaptive Search (PAS), which optimizes existing solvers for DPMs with minimal additional costs. Specifically, we first employ PCA to obtain a few basis vectors that span the high-dimensional sampling space, which enables us to learn just a set of coordinates to correct the sampling direction. Furthermore, based on the observation that the cumulative truncation error exhibits an "S"-shape, we design an adaptive search strategy that further enhances sampling efficiency and reduces the number of stored parameters to approximately 10. Extensive experiments demonstrate that PAS can significantly enhance existing fast solvers in a plug-and-play manner with negligible costs. For example, on CIFAR10, PAS optimizes DDIM's FID from 15.69 to 4.37 (NFE=10) using only 12 parameters and sub-minute training on a single A100 GPU. Code is available at https://github.com/onefly123/PAS.
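The core idea of correcting a sampling direction with a handful of learned coordinates over a PCA basis can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the function names and the toy coordinate values are illustrative assumptions, and in PAS the coordinates would be the small set of learned parameters.

```python
import numpy as np

def pca_basis(directions, k=3):
    """Extract k orthonormal principal basis vectors from a stack of
    sampling directions (each flattened to a vector) via SVD-based PCA."""
    X = directions - directions.mean(axis=0)
    # Rows of Vt are orthonormal principal directions of the data.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:k]  # shape (k, dim)

def corrected_direction(d, basis, coords):
    """Correct a solver's sampling direction d by adding a learned linear
    combination of the PCA basis vectors. `coords` plays the role of the
    ~10 stored parameters; the additive form here is an assumption."""
    return d + coords @ basis

# Toy usage: 8 sampling directions in a 16-dimensional space.
rng = np.random.default_rng(0)
dirs = rng.standard_normal((8, 16))
B = pca_basis(dirs, k=3)                # 3 orthonormal basis vectors
coords = np.array([0.1, -0.05, 0.02])   # hypothetical learned coordinates
d_new = corrected_direction(dirs[0], B, coords)
print(d_new.shape)  # (16,)
```

Because only the few coordinates (plus time-step choices) are stored rather than any network weights, the per-solver overhead stays at roughly a dozen scalars, which is what makes the approach plug-and-play.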
While powerful for generation, diffusion models face slow sampling challenges. We propose PAS, a plug-and-play training paradigm designed to accelerate diffusion model sampling with minimal costs. PAS uses PCA to extract a few basis vectors that span the high-dimensional sampling space, allowing the sampling direction to be corrected with only a small set of coordinates. PAS also includes an adaptive search strategy to enhance sampling efficiency and reduce storage requirements. With only approximately 10 parameters and under an hour of training, PAS greatly improves sampling quality across various datasets, making diffusion models more practical for real-world applications.