
Hands-On Large Language Models

Welcome! In this repository you will find the code for all examples throughout the book Hands-On Large Language Models, written by Jay Alammar and Maarten Grootendorst, which we playfully dubbed:

"The Illustrated LLM Book"

Through its highly visual approach and more than 250 custom-made figures, this book teaches you the practical tools and concepts you need to use Large Language Models today!


The digital version of the book is available on:

Note that the book has been sent to the printer and will be released in print format in the coming weeks!

Table of Contents

We advise running all examples through Google Colab for the easiest setup. Google Colab allows you to use a T4 GPU with 16GB of VRAM for free. All examples were built and tested mainly in Google Colab, so it should be the most stable platform. However, any other cloud provider should work as well.
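To confirm your Colab runtime actually has the T4 GPU attached, a quick sanity check like the one below can help. This is a minimal sketch, not part of the book's notebooks; it assumes the `nvidia-smi` CLI is present on GPU runtimes (it ships with the NVIDIA driver) and absent on CPU-only machines.

```python
import shutil
import subprocess

# Check whether an NVIDIA GPU is visible to this runtime (e.g. Colab's
# free T4 with 16 GB of VRAM). nvidia-smi ships with the NVIDIA driver,
# so it is present on Colab GPU runtimes and absent on CPU-only machines.
if shutil.which("nvidia-smi") is None:
    gpu_info = "No GPU found; in Colab, choose Runtime > Change runtime type > T4 GPU."
else:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    gpu_info = result.stdout.strip()
print(gpu_info)
```

On a Colab T4 runtime this prints the GPU name and total memory; on a CPU-only runtime it prints a reminder to switch the runtime type.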

Chapter Notebook
Chapter 1: Introduction to Language Models Open In Colab
Chapter 2: Tokens and Embeddings Open In Colab
Chapter 3: Looking Inside Transformer LLMs Open In Colab
Chapter 4: Text Classification Open In Colab
Chapter 5: Text Clustering and Topic Modeling Open In Colab
Chapter 6: Prompt Engineering Open In Colab
Chapter 7: Advanced Text Generation Techniques and Tools Open In Colab
Chapter 8: Semantic Search and Retrieval-Augmented Generation Open In Colab
Chapter 9: Multimodal Large Language Models Open In Colab
Chapter 10: Creating Text Embedding Models Open In Colab
Chapter 11: Fine-tuning Representation Models for Classification Open In Colab
Chapter 12: Fine-tuning Generation Models Open In Colab

Tip

You can check the setup folder for a quick-start guide to installing all packages locally, and the conda folder for a complete guide on how to set up your environment, including conda and PyTorch installation. Note that depending on your OS, Python version, and dependencies, your results might differ slightly. However, they should be similar to the examples in the book.
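After following the local setup guide, a short import check can verify that the core libraries are in place. The package names below are an assumption based on the book's topics, not the official requirements list from the setup folder, so adjust them to match your installation.

```python
import importlib.util

# Check whether each package is importable without actually importing it.
# These names are illustrative; the authoritative list lives in the
# repository's setup folder.
packages = ["torch", "transformers", "sentence_transformers"]
statuses = {pkg: importlib.util.find_spec(pkg) is not None for pkg in packages}
for pkg, installed in statuses.items():
    print(f"{pkg}: {'installed' if installed else 'missing'}")
```

`importlib.util.find_spec` returns `None` for packages that are not installed, so this reports missing dependencies without triggering slow imports or errors.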

Reviews

"Jay and Maarten have continued their tradition of providing beautifully illustrated and insightful descriptions of complex topics in their new book. Bolstered with working code, timelines, and references to key papers, their book is a valuable resource for anyone looking to understand the main techniques behind how Large Language Models are built."

Andrew Ng - founder of DeepLearning.AI


"This is an exceptional guide to the world of language models and their practical applications in industry. Its highly-visual coverage of generative, representational, and retrieval applications of language models empowers readers to quickly understand, use, and refine LLMs. Highly recommended!"

Nils Reimers - Director of Machine Learning at Cohere | creator of sentence-transformers


"I can’t think of another book that is more important to read right now. On every single page, I learned something that is critical to success in this era of language models."

Josh Starmer - StatQuest


"If you’re looking to get up to speed in everything regarding LLMs, look no further! In this wonderful book, Jay and Maarten will take you from zero to expert in the history and latest advances in large language models. With very intuitive explanations, great real-life examples, clear illustrations, and comprehensive code labs, this book lifts the curtain on the complexities of transformer models, tokenizers, semantic search, RAG, and many other cutting-edge technologies. A must read for anyone interested in the latest AI technology!"

Luis Serrano, PhD - Founder and CEO of Serrano Academy


"Hands-On Large Language Models brings clarity and practical examples to cut through the hype of AI. It provides a wealth of great diagrams and visual aids to supplement the clear explanations. The worked examples and code make concrete what other books leave abstract. The book starts with simple introductory beginnings, and steadily builds in scope. By the final chapters, you will be fine-tuning and building your own large language models with confidence."

Leland McInnes - Researcher at the Tutte Institute for Mathematics and Computing | creator of UMAP and HDBSCAN


Additional Resources

We attempted to put as much information as possible into the book without it becoming overwhelming. However, even with a 400-page book, there is still much to discover! If you are interested in similar illustrated/visual guides we have created, these might be of interest to you:

  • A Visual Guide to Mamba
  • A Visual Guide to Quantization
  • The Illustrated Stable Diffusion

Citation

Please consider citing the book if you find it useful for your research:

@book{hands-on-llms-book,
  author       = {Jay Alammar and Maarten Grootendorst},
  title        = {Hands-On Large Language Models},
  publisher    = {O'Reilly},
  year         = {2024},
  isbn         = {978-1098150969},
  url          = {https://www.oreilly.com/library/view/hands-on-large-language/9781098150952/},
  github       = {https://github.com/HandsOnLLM/Hands-On-Large-Language-Models}
}

About

Official code repo for the O'Reilly Book - "Hands-On Large Language Models"
