![large-model-parallelism-illustration]

# Model parallelism 101

Learn how model parallelism enables training models like Stable Diffusion and ChatGPT, in under 300 lines of code. This notebook provides practical, runnable local implementations of the main model parallelism methods. It explores three approaches: data parallelism, tensor parallelism, and pipeline parallelism, using a 2-layer MLP example that extends naturally to more complex models. A small sketch of the tensor-parallel case is shown below.
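For a quick taste, here is a minimal tensor-parallelism sketch on a 2-layer MLP. It is not taken from the notebook: it simulates two "devices" with plain NumPy arrays, and all shapes are hypothetical. The first layer's weights are split by columns, the second layer's by rows, and a sum stands in for the all-reduce.

```python
import numpy as np

# Hypothetical shapes, chosen only for illustration.
rng = np.random.default_rng(0)
batch, d_in, d_hidden, d_out = 4, 8, 16, 8

x = rng.normal(size=(batch, d_in))
W1 = rng.normal(size=(d_in, d_hidden))
W2 = rng.normal(size=(d_hidden, d_out))

# Reference: ordinary single-device forward pass of the 2-layer MLP.
ref = np.maximum(x @ W1, 0) @ W2

# Tensor-parallel split: W1 by columns, W2 by rows, one shard per "device".
W1_shards = np.split(W1, 2, axis=1)  # each device holds (d_in, d_hidden/2)
W2_shards = np.split(W2, 2, axis=0)  # each device holds (d_hidden/2, d_out)

# Each device computes its partial output independently
# (ReLU is elementwise, so it can be applied per shard)...
partials = [np.maximum(x @ w1, 0) @ w2 for w1, w2 in zip(W1_shards, W2_shards)]

# ...and an all-reduce (here, a plain sum) combines the partial results.
out = sum(partials)

assert np.allclose(out, ref)  # sharded forward matches the reference
```

Each "device" only ever stores half of each weight matrix and exchanges a single activation-sized tensor at the end, which is the essential trade-off tensor parallelism makes.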

Reading this notebook will give you a solid overview of model parallelism techniques and an intuition for how to implement them.

Pull requests are welcome. The illustration above was generated with Lexica's Aperture model.