Denoising Diffusion models have demonstrated their proficiency for generative sampling. However, generating good samples often requires many iterations. Consequently, techniques such as binary time-distillation (BTD) have been proposed to reduce the number of network calls for a fixed architecture. In this paper, we introduce TRAnsitive Closure Time-distillation (TRACT), a new method that extends BTD. For single-step diffusion, TRACT improves FID by up to 2.4x on the same architecture, and achieves a new single-step Denoising Diffusion Implicit Models (DDIM) state-of-the-art FID (7.4 for ImageNet64, 3.8 for CIFAR10). Finally, we tease apart the method through extended ablations. The PyTorch implementation will be released soon.
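To illustrate the idea behind step-count distillation referenced above, here is a minimal sketch of a BTD-style training loss. All names here (`btd_loss`, `ddim_step`, the network signatures) are hypothetical placeholders, not the paper's actual implementation: the student network is trained to reproduce, in a single step, the result of two consecutive teacher steps, halving the number of network calls per distillation phase.

```python
import numpy as np


def btd_loss(student, teacher, ddim_step, x_t, t, dt):
    """Illustrative binary time-distillation loss (hypothetical API).

    student, teacher: callables net(x, t) -> prediction
    ddim_step: callable (net, x, t, dt) -> x at time t - dt
    x_t: current noisy sample, t: current time, dt: student step size
    """
    # Teacher target: two consecutive half-steps of size dt / 2.
    # (In a real framework this would run without gradient tracking.)
    x_mid = ddim_step(teacher, x_t, t, dt / 2)
    x_target = ddim_step(teacher, x_mid, t - dt / 2, dt / 2)
    # Student prediction: a single full step of size dt.
    x_pred = ddim_step(student, x_t, t, dt)
    # Mean-squared error between the one-step and two-step endpoints.
    return np.mean((x_pred - x_target) ** 2)
```

Minimizing this loss over sampled `(x_t, t)` pairs yields a student whose one step matches two teacher steps; repeating the procedure log2(N) times distills an N-step sampler down to a single network call.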