Added checkpoints and results for MAE (#321)
vturrisi authored Jan 8, 2023
1 parent 469674b commit 92ec35f
Showing 8 changed files with 90 additions and 20 deletions.
25 changes: 8 additions & 17 deletions README.md
@@ -18,6 +18,7 @@ The library is self-contained, but it is possible to use the models outside of s
---

## News
* **[Jan 07 2023]**: :diving_mask: Added results, checkpoints and configs for MAE on ImageNet. Thanks to [HuangChiEn](https://github.com/HuangChiEn).
* **[Dec 31 2022]**: :stars: Shiny new logo! Huge thanks to [Luiz](https://www.instagram.com/linhaaspera/)!
* **[Sep 27 2022]**: :pencil: Brand new config system using OmegaConf/Hydra. Adds more clarity and flexibility. New tutorials will follow soon!
* **[Aug 04 2022]**: :paintbrush: Added [MAE](https://arxiv.org/abs/2111.06377) with support for finetuning the backbone via `main_linear.py`, mixup, cutmix and [random augment](https://arxiv.org/abs/1909.13719).
@@ -280,23 +281,13 @@ All pretrained models available can be downloaded directly via the tables below o

### ImageNet

| Method | Backbone | Epochs | Dali | Acc@1 (online) | Acc@1 (offline) | Acc@5 (online) | Acc@5 (offline) | Checkpoint |
|--------------|:--------:|:------:|:------------------:|:--------------:|:---------------:|:--------------:|:---------------:|:----------:|
| Barlow Twins | ResNet50 | 100 | :heavy_check_mark: | 67.18 | 67.23 | 87.69 | 87.98 | [:link:](https://drive.google.com/drive/folders/1IQUIrCOSduAjUJ31WJ1G5tHDZzWUIEft?usp=sharing) |
| BYOL | ResNet50 | 100 | :heavy_check_mark: | 68.63 | 68.37 | 88.80 | 88.66 | [:link:](https://drive.google.com/drive/folders/1-UXo-MttdrqiEQXfV4Duc93fA3mIdsha?usp=sharing) |
|DeepCluster V2| ResNet50 | 100 | :heavy_check_mark: | | | | | |
| DINO | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| MoCo V2+ | ResNet50 | 100 | :heavy_check_mark: | 62.61 | 66.84 | 85.40 | 87.60 | [:link:](https://drive.google.com/drive/folders/1NiBDmieEpNqkwrgn_H7bMnEDVAYc8Sk7?usp=sharing) |
| MoCo V3 | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| NNCLR | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| ReSSL | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| SimCLR | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| Simsiam | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| SupCon | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| SwAV | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| VIbCReg | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| VICReg | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| W-MSE | ResNet50 | 100 | :heavy_check_mark: | | | | | |
| Method | Backbone | Epochs | Dali | Acc@1 (online) | Acc@1 (offline) | Acc@5 (online) | Acc@5 (offline) | Checkpoint | Finetuned Checkpoint |
|--------------|:--------:|:------:|:------------------:|:--------------:|:---------------:|:--------------:|:---------------:|:----------:|:----------:|
| Barlow Twins | ResNet50 | 100 | :heavy_check_mark: | 67.18 | 67.23 | 87.69 | 87.98 | [:link:](https://drive.google.com/drive/folders/1IQUIrCOSduAjUJ31WJ1G5tHDZzWUIEft?usp=sharing) | |
| BYOL | ResNet50 | 100 | :heavy_check_mark: | 68.63 | 68.37 | 88.80 | 88.66 | [:link:](https://drive.google.com/drive/folders/1-UXo-MttdrqiEQXfV4Duc93fA3mIdsha?usp=sharing) | |
| MoCo V2+ | ResNet50 | 100 | :heavy_check_mark: | 62.61 | 66.84 | 85.40 | 87.60 | [:link:](https://drive.google.com/drive/folders/1NiBDmieEpNqkwrgn_H7bMnEDVAYc8Sk7?usp=sharing) | |
| MAE | ViT-B/16 | 100 | :x: | ~ | 81.60 (finetuned) | ~ | 95.50 (finetuned) | [:link:](https://drive.google.com/drive/folders/1OuaXCnQ7WeqyKPxfJibAkXoVTx7S8Hbb) | [:link:](https://drive.google.com/drive/folders/1c9DGhmLsTTtOu2vc9rodqm89wKtp40C5) |



## Training efficiency for DALI
3 changes: 2 additions & 1 deletion scripts/finetune/imagenet-100/mae.yaml
@@ -21,7 +21,7 @@ data:
  dataset: imagenet100
  train_path: "./datasets/imagenet-100/train"
  val_path: "./datasets/imagenet-100/val"
-  format: "dali"
+  format: "image_folder"
  num_workers: 4
optimizer:
  name: "adamw"
@@ -31,6 +31,7 @@ optimizer:
  layer_decay: 0.75
scheduler:
  name: "warmup_cosine"
+  warmup_start_lr: 0.0
checkpoint:
  enabled: True
  dir: "trained_models"
52 changes: 52 additions & 0 deletions scripts/finetune/imagenet/mae.yaml
@@ -0,0 +1,52 @@
defaults:
  - _self_
  - wandb: private.yaml
  - override hydra/hydra_logging: disabled
  - override hydra/job_logging: disabled

# disable hydra outputs
hydra:
  output_subdir: null
  run:
    dir: .

name: "mae-imagenet-finetune"
pretrained_feature_extractor: None
backbone:
  name: "vit_base"
  kwargs:
    drop_path_rate: 0.1
pretrain_method: "mae"
data:
  dataset: "imagenet"
  train_path: "./datasets/imagenet/train"
  val_path: "./datasets/imagenet/val"
  format: "image_folder"
  num_workers: 4
optimizer:
  name: "adamw"
  batch_size: 64
  lr: 5e-4
  weight_decay: 0.05
  layer_decay: 0.75
scheduler:
  name: "warmup_cosine"
  warmup_start_lr: 0.0
checkpoint:
  enabled: True
  dir: "trained_models"
  frequency: 1
auto_resume:
  enabled: True
label_smoothing: 0.1
mixup: 0.8
cutmix: 1.0
finetune: True

# overwrite PL stuff
max_epochs: 100
devices: [0, 1, 2, 3, 4, 5, 6, 7]
sync_batchnorm: True
accelerator: "gpu"
strategy: "ddp"
precision: 16
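For context, a finetuning run with this new config could be launched roughly as sketched below. This is only a sketch: it assumes the `main_linear.py` Hydra entry point mentioned in the README news accepts the standard `--config-path`/`--config-name` flags and plain `key=value` overrides; the checkpoint path is a placeholder.

```bash
# Hypothetical launch command for the new ImageNet MAE finetuning config.
# Assumes main_linear.py is a Hydra entry point taking --config-path/--config-name
# and standard key=value overrides; the pretrained checkpoint path is a placeholder.
python main_linear.py \
    --config-path scripts/finetune/imagenet \
    --config-name mae \
    pretrained_feature_extractor=/path/to/mae_pretrained.ckpt
```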
3 changes: 3 additions & 0 deletions scripts/finetune/imagenet/wandb/mhug.yaml
@@ -0,0 +1,3 @@
enabled: True
entity: unitn-mhug
project: "solo-learn"
3 changes: 3 additions & 0 deletions scripts/finetune/imagenet/wandb/private.yaml
@@ -0,0 +1,3 @@
enabled: True
entity: None
project: "solo-learn"
3 changes: 2 additions & 1 deletion scripts/pretrain/imagenet-100/mae.yaml
@@ -25,7 +25,7 @@ data:
  dataset: imagenet100
  train_path: "./datasets/imagenet-100/train"
  val_path: "./datasets/imagenet-100/val"
-  format: "dali"
+  format: "image_folder"
  num_workers: 4
optimizer:
  name: "adamw"
@@ -37,6 +37,7 @@ optimizer:
    betas: [0.9, 0.95]
scheduler:
  name: "warmup_cosine"
+  warmup_start_lr: 0.0
checkpoint:
  enabled: True
  dir: "trained_models"
3 changes: 2 additions & 1 deletion scripts/pretrain/imagenet/mae.yaml
@@ -28,7 +28,7 @@ data:
  dataset: imagenet
  train_path: "./datasets/imagenet/train"
  val_path: "./datasets/imagenet/val"
-  format: "dali"
+  format: "image_folder"
  num_workers: 4
optimizer:
  name: "adamw"
@@ -40,6 +40,7 @@ optimizer:
    betas: [0.9, 0.95]
scheduler:
  name: "warmup_cosine"
+  warmup_start_lr: 0.0
checkpoint:
  enabled: True
  dir: "trained_models"
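Analogously, MAE pretraining with the updated ImageNet config could be started as below — a sketch assuming a `main_pretrain.py` Hydra entry point that mirrors `main_linear.py` (swap the path to `scripts/pretrain/imagenet-100` for the ImageNet-100 config).

```bash
# Hypothetical pretraining launch; assumes a main_pretrain.py Hydra entry point
# with the same --config-path/--config-name interface as main_linear.py.
python main_pretrain.py \
    --config-path scripts/pretrain/imagenet \
    --config-name mae
```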
18 changes: 18 additions & 0 deletions zoo/imagenet.sh
@@ -23,3 +23,21 @@ cd mocov2plus
gdown https://drive.google.com/uc?id=1BBauwWTJV38BCf56KtOK9TJWLyjNH-mP
gdown https://drive.google.com/uc?id=1JMpGSYjefFzxT5GTbEc_2d4THxOxC3Ca
cd ..

# MAE
mkdir mae
cd mae

mkdir pretrain
cd pretrain
gdown https://drive.google.com/uc?id=1WfkMVNGrQB-NK12XPkcWWxJsFy1H_0TI
gdown https://drive.google.com/uc?id=1EAeZy3lyr35wVcPBISKQXjHFxXtxA0DY
cd ..

mkdir finetune
cd finetune
gdown https://drive.google.com/uc?id=1buWWhf7zPJtpL3qOG_LRePfnwurjoJtw
gdown https://drive.google.com/uc?id=1n6symLssKGolf_WQd5I1RS-Gj5e-go92
cd ..

cd ..
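The same MAE checkpoints could also be fetched folder-by-folder instead of file-by-file — a sketch assuming a gdown release that supports `--folder` and an `-O` output directory, using the Drive folder links listed in the README table above.

```bash
# Alternative sketch: pull each MAE Drive folder in one call.
# Assumes gdown with --folder and -O support; the URLs are the MAE pretrained
# and finetuned folders from the README table.
gdown --folder "https://drive.google.com/drive/folders/1OuaXCnQ7WeqyKPxfJibAkXoVTx7S8Hbb" -O mae/pretrain
gdown --folder "https://drive.google.com/drive/folders/1c9DGhmLsTTtOu2vc9rodqm89wKtp40C5" -O mae/finetune
```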
