
Update ch07.ipynb
rasbt authored Nov 8, 2024
1 parent 4a251c4 commit 8e86702
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion ch07/01_main-chapter-code/ch07.ipynb
@@ -2757,7 +2757,7 @@
" 1. [Direct Preference Optimization (DPO) for LLM Alignment (From Scratch)](../04_preference-tuning-with-dpo/dpo-from-scratch.ipynb) implements a popular preference tuning mechanism to align the model from this chapter more closely with human preferences\n",
" 2. [Llama 3.2 From Scratch (A Standalone Notebook)](../../ch05/07_gpt_to_llama/standalone-llama32.ipynb), a from-scratch implementation of Meta AI's popular Llama 3.2, including loading the official pretrained weights; if you are up to some additional experiments, you can replace the `GPTModel` model in each of the chapters with the `Llama3Model` class (it should work as a 1:1 replacement)\n",
" 3. [Converting GPT to Llama](../../ch05/07_gpt_to_llama) contains code with step-by-step guides that explain the differences between GPT-2 and the various Llama models\n",
" 4. [Understanding the Difference Between Embedding Layers and Linear Layers](../../ch03/03_bonus_embedding-vs-matmul/embeddings-and-linear-layers.ipynb) is a conceptual explanation illustrating that the `Embedding` layer in PyTorch, which we use at the input stage of an LLM, is mathematically equivalent to a linear layer applied to one-hot encoded data\n",
" 4. [Understanding the Difference Between Embedding Layers and Linear Layers](../../ch02/03_bonus_embedding-vs-matmul/embeddings-and-linear-layers.ipynb) is a conceptual explanation illustrating that the `Embedding` layer in PyTorch, which we use at the input stage of an LLM, is mathematically equivalent to a linear layer applied to one-hot encoded data\n",
"- Happy further reading!"
]
}
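The equivalence mentioned in item 4 of the changed cell is easy to verify directly: a PyTorch `Embedding` lookup produces the same result as a bias-free `Linear` layer applied to one-hot encoded token IDs, provided both layers share the same weight matrix. A minimal sketch (not part of this commit or the linked bonus notebook; the sizes and variable names are illustrative):

```python
import torch

torch.manual_seed(123)

vocab_size, emb_dim = 6, 3
token_ids = torch.tensor([2, 0, 5])

embedding = torch.nn.Embedding(vocab_size, emb_dim)
linear = torch.nn.Linear(vocab_size, emb_dim, bias=False)

# nn.Linear stores its weight as (out_features, in_features), so the
# embedding matrix of shape (vocab_size, emb_dim) must be transposed.
linear.weight = torch.nn.Parameter(embedding.weight.T)

onehot = torch.nn.functional.one_hot(token_ids, num_classes=vocab_size).float()

# Both paths yield the same (3, 3) tensor of embeddings.
print(torch.allclose(embedding(token_ids), linear(onehot)))  # True
```

The embedding lookup is just the cheaper way to compute this product, since multiplying by a one-hot vector simply selects one row of the weight matrix.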
