Foundational Research
- DDPM => Denoising Diffusion Probabilistic Models
- DDIM => Denoising Diffusion Implicit Models
- NCSN => Generative Modeling by Estimating Gradients of the Data Distribution
- Score-Based SDE => Score-Based Generative Modeling through Stochastic Differential Equations
- Classifier Guidance => Diffusion Models Beat GANs on Image Synthesis
- Classifier-Free Guidance => Classifier-Free Diffusion Guidance
- LDM => High-Resolution Image Synthesis with Latent Diffusion Models
- EDM => Elucidating the Design Space of Diffusion-Based Generative Models
- GLIDE => GLIDE: Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models
- iDDPM => Improved Denoising Diffusion Probabilistic Models
- Analytic-DPM => Analytic-DPM: An Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models
- DiT => Scalable Diffusion Models with Transformers
- Flow Matching => Flow Matching for Generative Modeling
- Rectified Flow => Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow
- Shortcut Models => One Step Diffusion via Shortcut Models
- Mean-Flow => Mean Flows for One-step Generative Modeling
- ShortDF => Optimizing for the Shortest Path in Denoising Diffusion Model
- RayFlow => RayFlow: Instance-Aware Diffusion Acceleration via Adaptive Flow Trajectories
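Most of the entries above build on the same reverse-time update. As a minimal sketch (not any paper's exact code: `eps` stands in for the network's noise prediction, and `abar_t` for the cumulative noise-schedule term ᾱ_t), the deterministic DDIM step looks like:

```python
import numpy as np

def ddim_step(x_t, eps, abar_t, abar_prev):
    """Deterministic DDIM update (eta = 0): predict x0 from the
    noise estimate, then re-noise it to the previous step's level."""
    x0_pred = (x_t - np.sqrt(1.0 - abar_t) * eps) / np.sqrt(abar_t)
    return np.sqrt(abar_prev) * x0_pred + np.sqrt(1.0 - abar_prev) * eps

# With a zero noise estimate the step simply rescales x_t.
x_prev = ddim_step(np.ones(4), np.zeros(4), abar_t=0.5, abar_prev=0.8)
```

A quick sanity check: with `abar_prev == abar_t` the step is the identity.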
Accelerated Sampling
Training-Free
Solver-Based
- iPNDM => Pseudo Numerical Methods for Diffusion Models on Manifolds
- DEIS => Fast Sampling of Diffusion Models with Exponential Integrator
- DPM-Solver => DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling in Around 10 Steps
- DPM-Solver++ => DPM-Solver++: Fast Solver for Guided Sampling of Diffusion Probabilistic Models
- AM => Boosting Diffusion Models with an Adaptive Momentum Sampler
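The training-free solvers above all integrate the probability-flow ODE with higher-order methods. A toy second-order (Heun) pass in the EDM-style sigma parameterization, where `denoise` is a hypothetical stand-in for the trained denoiser rather than any specific paper's update rule, could look like:

```python
import numpy as np

def heun_sample(denoise, sigmas, x):
    """Integrate dx/dsigma = (x - denoise(x, sigma)) / sigma over a
    decreasing sigma schedule with Heun's 2nd-order method."""
    for s, s_next in zip(sigmas[:-1], sigmas[1:]):
        d = (x - denoise(x, s)) / s
        x_euler = x + (s_next - s) * d
        if s_next > 0:
            d_next = (x_euler - denoise(x_euler, s_next)) / s_next
            x = x + (s_next - s) * 0.5 * (d + d_next)
        else:
            x = x_euler  # final step to sigma = 0 falls back to Euler
    return x

# Toy case: for a point mass at 3.0 the exact denoiser returns the mean,
# so the sampler should land exactly on 3.0.
out = heun_sample(lambda x, s: np.full_like(x, 3.0),
                  [10.0, 5.0, 1.0, 0.0], np.array([13.0]))
```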
Distillation-Based
Feature Reuse / Model Simplification
- FreeU => FreeU: Free Lunch in Diffusion U-Net
- DeepCache => DeepCache: Accelerating Diffusion Models for Free
- Skip-Tuning => The Surprising Effectiveness of Skip-Tuning in Diffusion Sampling
- Faster Diffusion => Faster Diffusion: Rethinking the Role of the Encoder
- Increment-Calibrated Caching => Accelerating Diffusion Transformer via Increment-Calibrated Caching with Channel-Aware Singular Value Decomposition
- BlockDance => BlockDance: Reuse Structurally Similar Spatio-Temporal Features to Accelerate Diffusion Transformers
- CacheQuant => CacheQuant: Comprehensively Accelerated Diffusion Models
- DreamCache => DreamCache: Finetuning-Free Lightweight Personalized Image Generation via Feature Caching
- PFDiff => PFDiff: Training-Free Acceleration of Diffusion Models Combining Past and Future Scores
- DiffCR => Layer and Timestep-Adaptive Differentiable Token Compression Ratios for Efficient Diffusion Transformers
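Feature-reuse methods such as DeepCache exploit the fact that deep features change slowly across adjacent denoising steps. A toy caching wrapper sketches the idea (names like `deep_block` and `interval` are illustrative assumptions, not the papers' APIs):

```python
class CachedBlock:
    """Recompute an expensive sub-network only every `interval`-th
    call and reuse the cached features in between (DeepCache-style)."""
    def __init__(self, deep_block, interval=3):
        self.deep_block = deep_block   # hypothetical expensive module
        self.interval = interval
        self.cache = None
        self.step = 0
        self.recomputes = 0

    def __call__(self, h):
        if self.step % self.interval == 0:
            self.cache = self.deep_block(h)
            self.recomputes += 1
        self.step += 1
        return self.cache

block = CachedBlock(lambda h: h * 2, interval=3)
outs = [block(h) for h in range(6)]  # recomputed at steps 0 and 3 only
```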
Timestep Search
- AutoDiffusion => AutoDiffusion: Training-Free Optimization of Time Steps and Architectures for Automated Diffusion Model Acceleration
- GITS => On the Trajectory Regularity of ODE-based Diffusion Sampling
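Timestep-search methods such as AutoDiffusion treat the sampling schedule itself as a search space. A toy exhaustive search in that spirit keeps the coarse schedule whose trajectory best tracks a fine-grained reference (the `tanh` denoiser here is a made-up stand-in):

```python
import itertools
import math

def euler(x, sigmas, denoise):
    # Euler integration of dx/dsigma = (x - denoise(x, sigma)) / sigma
    for s, s_next in zip(sigmas[:-1], sigmas[1:]):
        x = x + (s_next - s) * (x - denoise(x, s)) / s
    return x

def search_schedule(denoise, full, keep, x0):
    """Exhaustively pick `keep` interior timesteps whose coarse
    trajectory best matches the full-schedule result."""
    ref = euler(x0, full, denoise)
    best = None
    for interior in itertools.combinations(full[1:-1], keep):
        sched = [full[0], *interior, full[-1]]
        err = abs(euler(x0, sched, denoise) - ref)
        if best is None or err < best[1]:
            best = (sched, err)
    return best

denoise = lambda x, s: math.tanh(x)   # toy denoiser (assumption)
full = [5.0, 4.0, 3.0, 2.0, 1.0, 0.0]
sched, err = search_schedule(denoise, full, keep=2, x0=2.0)
```

Because the exhaustive search includes every candidate, its error is never worse than any fixed coarse schedule.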
Training-Based
Solver-Based
Distillation-Based
- CM => Consistency Models
- Two-stage distillation of classifier-free guided models => On Distillation of Guided Diffusion Models
- CTM => Consistency Trajectory Models: Learning Probability Flow ODE Trajectory of Diffusion
- SFD => Simple and Fast Distillation of Diffusion Models
- ARD => Autoregressive Distillation of Diffusion Transformers
- NitroFusion => NitroFusion High-Fidelity Single-Step Diffusion through Dynamic Adversarial Training
- Random Conditioning for Diffusion Model Compression with Distillation
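The consistency-model line above trains a student to map any point on the ODE trajectory to its endpoint. Schematically (a sketch assuming a point-mass toy distribution and an Euler teacher step, not the papers' training code), the distillation loss pairs the student at one noise level with a frozen target one solver step closer to the data:

```python
import numpy as np

def solver_step(x, s, s_next, denoise):
    # one Euler step of the probability-flow ODE toward the data
    return x + (s_next - s) * (x - denoise(x, s)) / s

def consistency_loss(f_student, f_target, denoise, x, s, s_next):
    """Student at (x, s) must match the frozen target evaluated one
    teacher solver step closer to the data."""
    x_next = solver_step(x, s, s_next, denoise)
    return float(np.mean((f_student(x, s) - f_target(x_next, s_next)) ** 2))

# For a point mass at 3.0 the exact consistency function maps everything
# to 3.0, so the exact solution incurs zero loss.
mu = 3.0
denoise = lambda x, s: np.full_like(x, mu)
f_exact = lambda x, s: np.full_like(x, mu)
loss = consistency_loss(f_exact, f_exact, denoise, np.array([7.0]), 4.0, 2.0)
```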
Training-Efficient
Solver-Based
Distillation-Based
- AMED => Fast ODE-based Sampling for Diffusion Models in Around 5 Steps
- PAS => Diffusion Sampling Correction via Approximately 10 Parameters
- Morse => Morse: Dual-Sampling for Lossless Acceleration of Diffusion Models
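AMED and PAS both attach a handful of learned parameters to an otherwise fixed solver. A closed-form toy in that spirit fits one scalar per step, by least squares, to rescale the coarse update toward a fine-grained reference (all names here are illustrative):

```python
import numpy as np

def fit_scale(d, r):
    """Closed-form least-squares scalar c minimizing ||c*d - r||^2."""
    return float(d @ r / (d @ d))

d = np.array([1.0, 0.0])   # coarse solver's update direction (toy)
r = np.array([2.0, 1.0])   # reference update from many fine steps (toy)
c = fit_scale(d, r)        # learned per-step correction
```

By construction the corrected update `c * d` is at least as close to the reference as the raw one.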
Accelerated Training
Training-Timestep-Based
- SpeeD => A Closer Look at Time Steps is Worthy of Triple Speed-Up for Diffusion Model Training
- Adaptive Non-Uniform Timestep Sampling for Diffusion Model Training (CVPR 2025)
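Both entries above reweight which timesteps the training loop visits. A schematic importance-sampling routine (an assumption-laden sketch, not either paper's exact scheme) draws timesteps in proportion to a running per-timestep loss estimate and returns inverse-probability weights so the objective stays unbiased:

```python
import numpy as np

def sample_timesteps(loss_ema, batch, rng):
    """Draw timesteps proportionally to a running per-timestep loss
    estimate; inverse-probability weights keep the loss unbiased."""
    p = loss_ema / loss_ema.sum()
    t = rng.choice(len(loss_ema), size=batch, p=p)
    w = 1.0 / (len(loss_ema) * p[t])
    return t, w

rng = np.random.default_rng(0)
t, w = sample_timesteps(np.ones(10), batch=4, rng=rng)  # uniform case
```

In the uniform case the weights collapse to 1, recovering plain uniform timestep sampling.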