
Dataflowr

Code and notebooks for the deep learning course dataflowr. Here is the schedule followed at École Polytechnique in 2023:

🌻Session1️⃣ Finetuning VGG

Things to remember
  • you do not need to understand everything to run a deep learning model! But the main goal of this course is to come back to each step done today and understand it.
  • to use the DataLoader from PyTorch, you need to follow its API (i.e. for classification, store your dataset in one folder per class); see the sketch after this list.
  • adapting a pretrained model to a similar task is easy.
  • if you do not understand why we take this loss, that's fine, we'll cover that in Module 3.
  • even with a GPU, avoid unnecessary computations!
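A minimal sketch of these points, not the course's exact code (it assumes torchvision ≥ 0.13 for the `weights` argument; the `data/train` path and the two-class setup are made up for the example):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# ImageFolder follows the API mentioned above: one sub-folder per class.
transform = transforms.Compose([
    transforms.Resize((224, 224)),   # VGG expects 224x224 inputs
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("data/train", transform=transform)  # hypothetical path
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Load a VGG16 pretrained on ImageNet and freeze its convolutional features.
model = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
for param in model.features.parameters():
    param.requires_grad = False

# Adapt the pretrained model: replace the last classifier layer
# to match our (hypothetical) 2 classes.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)

# Only pass the parameters that still require gradients to the optimizer:
# no unnecessary computation, even on a GPU.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3, momentum=0.9
)
```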

🌻Session2️⃣ PyTorch tensors and Autodiff

Things to remember
  • PyTorch tensors = NumPy on GPU + gradients!
  • in deep learning, broadcasting is used everywhere. The rules are the same as for NumPy.
  • automatic differentiation is not only the chain rule! The backpropagation algorithm (or dual numbers) is a clever way to implement automatic differentiation; see the sketch after this list.
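A minimal sketch of these three points using only the standard PyTorch API:

```python
import torch

# PyTorch tensors = NumPy arrays + GPU support + gradients.
x = torch.randn(3, 4)
b = torch.ones(4)
y = x + b          # broadcasting: (3, 4) + (4,) -> (3, 4), same rules as NumPy

# Autodiff: tensors with requires_grad=True record the operations applied to
# them; backward() runs backpropagation through that recorded graph.
w = torch.tensor([1.0, 2.0], requires_grad=True)
loss = (w ** 2).sum()
loss.backward()
print(w.grad)      # tensor([2., 4.]) = d(w1^2 + w2^2)/dw
```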

🌻Session3️⃣

Things to remember
  • loss vs accuracy: know your loss for a classification task!
  • know your optimizer (Module 4)
  • know how to build a neural net with torch.nn.Module (Module 5)
  • know how to use convolution and pooling layers (kernel, stride, padding)
  • know how to use dropout; see the sketch after this list.
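A minimal sketch tying these points together (the `SmallConvNet` name and the 32x32 input size are made up for the example):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallConvNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Convolution: kernel 3, stride 1, padding 1 keeps the spatial size.
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, stride=1, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, stride=1, padding=1)
        self.pool = nn.MaxPool2d(kernel_size=2)   # halves height and width
        self.dropout = nn.Dropout(p=0.5)
        self.fc = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))      # 32x32 -> 16x16
        x = self.pool(F.relu(self.conv2(x)))      # 16x16 -> 8x8
        x = x.flatten(1)
        x = self.dropout(x)
        return self.fc(x)                          # raw logits

# Loss vs accuracy: nn.CrossEntropyLoss takes raw logits (it applies
# log-softmax internally); accuracy is a separate, non-differentiable metric.
model = SmallConvNet()
targets = torch.tensor([0, 1, 2, 3])
logits = model(torch.randn(4, 3, 32, 32))
loss = nn.CrossEntropyLoss()(logits, targets)
accuracy = (logits.argmax(dim=1) == targets).float().mean()
```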

🌻Session4️⃣

Things to remember
  • know how to use a DataLoader
  • to deal with categorical variables in deep learning, use embeddings
  • in the case of word embeddings, starting from an unsupervised setting, we built a supervised task (i.e. predicting central/context words in a window) and learned the representation thanks to negative sampling
  • know your batchnorm
  • architectures with skip connections allow deeper models; see the sketch after this list.
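A minimal sketch of embeddings and of a batchnorm + skip-connection block (the `ResidualBlock` name, the vocabulary size, and the shapes are made up for the example):

```python
import torch
import torch.nn as nn

# Categorical variables: nn.Embedding maps integer ids to dense vectors.
vocab_size, embed_dim = 10_000, 128               # hypothetical sizes
embedding = nn.Embedding(vocab_size, embed_dim)
word_ids = torch.tensor([[2, 45, 7]])             # a batch of token ids
vectors = embedding(word_ids)                     # shape (1, 3, 128)

# A residual block: batchnorm + a skip connection, the pattern that lets
# architectures such as ResNet go much deeper.
class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)                 # the skip connection

block = ResidualBlock(16)
y = block(torch.randn(2, 16, 8, 8))               # same shape in and out
```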

🌻Session5️⃣

🌻Session6️⃣

🌻Session7️⃣

🌻Session8️⃣

🌻Session9️⃣

For more updates: Twitter URL

🌻 All notebooks

Usage

If you want to run the notebooks locally, follow the instructions in Module 0 - Running the notebooks locally.

2020 version of the course

Archives are available on the archive-2020 branch.