Score-based generative models have emerged as state-of-the-art generative models. In this paper, we introduce a novel sampling scheme that can be combined with pretrained score-based diffusion models to speed up sampling by a factor of two to five in terms of the number of function evaluations (NFEs), with a superior Fréchet Inception distance (FID) compared to annealed Langevin dynamics in the noise-conditional score network (NCSN) and the improved noise-conditional score network (NCSN++). The proposed sampling algorithm is inspired by momentum-based accelerated gradient descent used in convex optimization techniques. We validate the sampling efficiency of the proposed algorithm in terms of FID on the CIFAR-10 and CelebA datasets.
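The abstract does not specify the exact update rule, but the core idea of adding momentum to Langevin-style sampling can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it applies a heavy-ball momentum term (with a hypothetical momentum coefficient `beta`) to annealed-Langevin-style updates on a toy distribution whose score is known in closed form (a standard Gaussian, where the score is simply -x).

```python
import numpy as np


def score_std_normal(x):
    # Score (gradient of the log-density) of a standard normal: grad log p(x) = -x.
    return -x


def momentum_langevin_sample(score_fn, dim=2, n_steps=200, step_size=0.01,
                             beta=0.9, seed=0):
    """Toy momentum-accelerated Langevin sampler (a sketch, not the paper's
    exact scheme). A heavy-ball momentum buffer `v` accumulates the score
    direction across steps, which can allow larger effective moves per
    function evaluation than plain Langevin dynamics."""
    rng = np.random.default_rng(seed)
    x = 5.0 * rng.standard_normal(dim)   # start far from the mode
    v = np.zeros(dim)                    # momentum buffer
    for _ in range(n_steps):
        # Heavy-ball update: mix previous velocity with the current score step.
        v = beta * v + step_size * score_fn(x)
        # Langevin-style position update with injected Gaussian noise.
        x = x + v + np.sqrt(2.0 * step_size) * rng.standard_normal(dim)
    return x


sample = momentum_langevin_sample(score_std_normal)
```

With `beta = 0` this reduces to a standard (unadjusted) Langevin step; the intuition for the speedup is the same as in Nesterov/Polyak acceleration for gradient descent, which the abstract cites as the inspiration.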