
[Preprint] Generative Modeling with Explicit Memory


LINs-lab/GMem


GMem: Generative Modeling with Explicit Memory

Teaser image

Yi Tang 👨‍🎓, Peng Sun 👨‍🎨, Zhenglin Cheng 👨‍🎓, Tao Lin ⛷️

[arXiv] 📄 | [BibTeX] 🏷️

Abstract

Recent studies indicate that the denoising process in deep generative diffusion models implicitly learns and memorizes semantic information from the data distribution. These findings suggest that capturing more complex data distributions requires larger neural networks, leading to a substantial increase in computational demands, which in turn becomes the primary bottleneck in both training and inference of diffusion models. To this end, we introduce Generative Modeling with Explicit Memory (GMem), which leverages an external memory bank in both the training and sampling phases of diffusion models. This approach preserves semantic information from the data distribution, reducing the reliance on neural network capacity for learning and generalizing across diverse datasets. The results are significant: GMem improves training efficiency, sampling efficiency, and generation quality. For instance, on ImageNet at $256 \times 256$ resolution, GMem accelerates SiT training by over $46.7\times$, matching the performance of a SiT model trained for $7M$ steps in fewer than $150K$ steps. Compared to the most efficient existing method, REPA, GMem still offers a $16\times$ speedup, attaining an FID of 5.75 within $250K$ steps, whereas REPA requires over $4M$ steps. Additionally, our method achieves state-of-the-art generation quality, with an FID of 3.56 without classifier-free guidance on ImageNet $256\times256$.
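Conceptually, sampling with an external memory bank amounts to retrieving a stored semantic vector and conditioning every denoising step on it, so the network itself does not have to memorize that information. The following is a minimal illustrative sketch of that idea only; all function and variable names are hypothetical and this is not GMem's actual API:

```python
import numpy as np

def sample_with_memory(bank, denoise_step, num_steps=50, rng=None):
    """Draw one semantic vector from the memory bank and condition each
    denoising step on it (illustrative sketch, not the GMem implementation)."""
    rng = np.random.default_rng(rng)
    z = bank[rng.integers(len(bank))]   # retrieve a stored semantic vector
    x = rng.standard_normal(z.shape)    # start from Gaussian noise
    for t in range(num_steps, 0, -1):
        x = denoise_step(x, t, z)       # network conditioned on the memory entry
    return x

# Toy usage with a stand-in "denoiser" that nudges x toward the memory vector.
bank = np.stack([np.full(4, float(i)) for i in range(3)])
out = sample_with_memory(bank, lambda x, t, z: x + 0.1 * (z - x), rng=0)
```

The key design point is that the semantic conditioning signal comes from an external, non-parametric bank rather than from network weights, which is what lets a smaller or less-trained backbone reach a given quality level faster.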


Requirements

  • Python and PyTorch:

    • 64-bit Python 3.10 or later.
    • PyTorch 2.4.0 or later (earlier versions might work but are not guaranteed).
  • Additional Python Libraries:

    • A complete list of required libraries is provided in the requirements.txt file.
    • To install them, execute the following command:
      pip install -r requirements.txt

Getting Started

To reproduce the results from the paper, run the following script:

bash scripts/sample-gmem-xl.sh

Important: make sure to change --ckpt to the correct checkpoint path.


Pre-trained Models and Memory Bank

We provide the following pre-trained model and memory bank:

GMem Checkpoints

Backbone | Training Steps | Dataset | Bank Size | Training Epochs | Download
SiT-XL/2 | 2M | ImageNet $256\times 256$ | 640,000 | 5 | HuggingFace

Additional Information

  • Up next: the training code and scripts for GMem.

Bibliography

If you find this repository helpful for your project, please consider citing our work:

@article{tang2024generative,
  title={Generative Modeling with Explicit Memory},
  author={Tang, Yi and Sun, Peng and Cheng, Zhenglin and Lin, Tao},
  journal={arXiv preprint arXiv:2412.08781},
  year={2024}
}

Acknowledgement

This code is mainly built upon the SiT, edm2, and REPA repositories.
