# Road to LLM

A learning roadmap from the tensor to large language models (LLMs). Inspired by fromthetensor and ai-notebooks.

## Quickstart Guide

A virtual environment is highly recommended.

```shell
python3 -m venv env
source env/bin/activate
pip install -e .
```

## Roadmap

### Section 1: Introduction

- Introduction to tinygrad

### Section 2: Vision

- Deep Learning in Neural Networks (2014)
- An Introduction to Convolutional Neural Networks (2015)
- Deep Residual Learning for Image Recognition (2015)
- Transformers for Image Recognition at Scale (2020)

### Section 3: Language

- Attention Is All You Need (2017)
- Language Models are Unsupervised Multitask Learners (2019)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2019)
- Language Models are Few-Shot Learners (2020)
- LLaMA: Open and Efficient Foundation Language Models (2023)