This repository contains the code for developing, pretraining, and finetuning a GPT-like LLM and is the official code repository for the book Build a Large Language Model (From Scratch).
(If you downloaded the code bundle from the Manning website, please consider visiting the official code repository on GitHub at https://github.com/rasbt/LLMs-from-scratch.)
In Build a Large Language Model (From Scratch), you'll discover how LLMs work from the inside out. In this book, I'll guide you step by step through creating your own LLM, explaining each stage with clear text, diagrams, and examples.
The method described in this book for training and developing your own small-but-functional model for educational purposes mirrors the approach used in creating large-scale foundational models such as those behind ChatGPT.
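To give a first taste of the from-scratch approach, below is a minimal sketch of the kind of building block the book develops: a single-head causal self-attention layer in PyTorch (chapters 3 and 4 build this up carefully, including multi-head variants). All class names, variable names, and dimensions here are illustrative stand-ins, not the book's actual code.

```python
import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    """Illustrative single-head causal self-attention,
    the core ingredient of a GPT-like model."""

    def __init__(self, d_in, d_out, context_length):
        super().__init__()
        self.W_query = nn.Linear(d_in, d_out, bias=False)
        self.W_key = nn.Linear(d_in, d_out, bias=False)
        self.W_value = nn.Linear(d_in, d_out, bias=False)
        # Upper-triangular mask so each token only attends to itself
        # and to earlier tokens (the "causal" part)
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1),
        )

    def forward(self, x):  # x: (batch, seq_len, d_in)
        _, seq_len, _ = x.shape
        queries, keys, values = self.W_query(x), self.W_key(x), self.W_value(x)
        # Scaled dot-product attention scores
        scores = queries @ keys.transpose(1, 2) / keys.shape[-1] ** 0.5
        scores = scores.masked_fill(
            self.mask[:seq_len, :seq_len].bool(), float("-inf")
        )
        weights = torch.softmax(scores, dim=-1)
        return weights @ values  # (batch, seq_len, d_out)

# Example: a batch of 8 random 6-token embedding sequences
attn = CausalSelfAttention(d_in=16, d_out=16, context_length=6)
out = attn(torch.randn(8, 6, 16))
print(out.shape)  # torch.Size([8, 6, 16])
```

The causal mask is what makes this layer suitable for GPT-style next-token prediction: each position can only attend to positions that come before it.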
- Official source code repository: https://github.com/rasbt/LLMs-from-scratch
- Early access version at Manning: https://www.manning.com/books/build-a-large-language-model-from-scratch
- ISBN 9781633437166
- Publication in early 2025 (estimated)
Please note that this README.md file is a Markdown (.md) file. If you have downloaded this code bundle from the Manning website and are viewing it on your local computer, I recommend using a Markdown editor or previewer for proper viewing. If you haven't installed a Markdown editor yet, MarkText is a good free option.
Alternatively, you can view this and other files on GitHub at https://github.com/rasbt/LLMs-from-scratch.
| Chapter Title | Main Code (for quick access) | All Code + Supplementary |
|---|---|---|
| Ch 1: Understanding Large Language Models | No code | - |
| Ch 2: Working with Text Data | ch02.ipynb, dataloader.ipynb (summary), exercise-solutions.ipynb | ./ch02 |
| Ch 3: Coding Attention Mechanisms | ch03.ipynb, multihead-attention.ipynb (summary), exercise-solutions.ipynb | ./ch03 |
| Ch 4: Implementing a GPT Model from Scratch | ch04.ipynb, gpt.py (summary), exercise-solutions.ipynb | ./ch04 |
| Ch 5: Pretraining on Unlabeled Data | ch05.ipynb, train.py (summary), generate.py (summary; see the sketch below the table) | ./ch05 |
| Ch 6: Finetuning for Text Classification | Q2 2024 | ... |
| Ch 7: Finetuning with Human Feedback | Q2 2024 | ... |
| Ch 8: Using Large Language Models in Practice | Q2/Q3 2024 | ... |
| Appendix A: Introduction to PyTorch | code-part1.ipynb, code-part2.ipynb, DDP-script.py, exercise-solutions.ipynb | ./appendix-A |
| Appendix B: References and Further Reading | No code | - |
| Appendix C: Exercises | No code | - |
| Appendix D: Adding Bells and Whistles to the Training Loop | appendix-D.ipynb | ./appendix-D |
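As a taste of what chapter 5 covers, here is a minimal sketch of greedy next-token generation, the core loop behind a text-generation script like generate.py. The stand-in model below (an untrained embedding plus a linear output head) is purely for illustration; the actual chapter code generates text with the trained GPT model from chapter 4.

```python
import torch
import torch.nn as nn

torch.manual_seed(123)

vocab_size, emb_dim, context_length = 50257, 32, 16

# Stand-in "model": token embedding followed by a linear head that
# maps each position to next-token logits. The real chapter 5 code
# uses the full GPT architecture from chapter 4 instead.
model = nn.Sequential(
    nn.Embedding(vocab_size, emb_dim),
    nn.Linear(emb_dim, vocab_size),
)

def generate_greedy(model, token_ids, max_new_tokens):
    """Append the most likely next token, one step at a time."""
    for _ in range(max_new_tokens):
        context = token_ids[:, -context_length:]  # crop to context window
        with torch.no_grad():
            logits = model(context)  # (batch, seq_len, vocab_size)
        # Greedy decoding: pick the highest-scoring token at the last position
        next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        token_ids = torch.cat([token_ids, next_id], dim=1)
    return token_ids

print(generate_greedy(model, torch.tensor([[1, 2, 3]]), max_new_tokens=5))
```

Greedy decoding is the simplest strategy; the book also discusses sampling-based refinements such as temperature scaling and top-k sampling in the training-loop extras.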
> [!TIP]
> Please see the setup-related folders in this repository if you need more guidance on installing Python and Python packages.
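If you want to verify your environment before diving into the chapters, a quick check along these lines can help (this assumes PyTorch is among the installed requirements, which the setup instructions cover):

```python
import sys
import torch  # the main dependency of the book's code examples

# Print the versions that the notebooks and scripts will run against
print(f"Python:  {sys.version.split()[0]}")
print(f"PyTorch: {torch.__version__}")
print(f"CUDA available: {torch.cuda.is_available()}")
```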
Shown below is a mental model summarizing the contents covered in this book.
Several folders contain optional materials as a bonus for interested readers:
- Appendix A
- Chapter 2
- Chapter 3
- Chapter 5
Below are interesting projects by readers of the Build a Large Language Model (From Scratch) book:

- https://github.com/Intelligence-Manifesto/LLMs-from-scratch, a fork of this repository with a Chinese translation
If you find this book or code useful for your research, please consider citing it:
```bibtex
@book{build-llms-from-scratch-book,
  author    = {Sebastian Raschka},
  title     = {Build A Large Language Model (From Scratch)},
  publisher = {Manning},
  year      = {2023},
  isbn      = {978-1633437166},
  url       = {https://www.manning.com/books/build-a-large-language-model-from-scratch},
  note      = {Work in progress},
  github    = {https://github.com/rasbt/LLMs-from-scratch}
}
```