Thanks for your error report and we appreciate it a lot.

Checklist
I have searched related issues but cannot get the expected help.
The bug has not been fixed in the latest version.

Describe the bug
When pretraining with FastConvMAE on the ImageNet-1k dataset, I get the error shown in the screenshot below.
I checked, and the image mentioned in the error does exist. I don't understand why the label is appended to the path in the error message, and the reported path stays the same no matter how I change data_train_root (even when it is empty). The same error also occurs if I change the directory structure and put the images directly under the imagenet folder instead of imagenet\train.

To Reproduce
What command or script did you run?

python -m torch.distributed.launch --nproc_per_node=3 --master_port=29930 \
    tools/train.py \
    configs/selfsup/fast_convmae/fast_convmae_vit_base_patch16_8xb64_50e.py \
    --work_dir ./work_dir \
    --launcher pytorch

Did you make any modifications on the code or config? Did you understand what you have modified?
The ImageNet-1k directory layout is as follows:

EasyCV/data/imagenet
├── train
│   ├── n01440764
│   ├── n01443537
│   └── ...
├── val
│   ├── n01440764
│   ├── n01443537
│   └── ...
└── meta
    ├── train.txt
    ├── val.txt
    └── ...
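For context on the symptom described above (a label appended to the image path): in this kind of layout, each line of meta/train.txt typically holds a relative path followed by a class label, and the dataset loader is expected to split the line before joining it with the data root. The sketch below is a hypothetical illustration, not EasyCV's actual loader code, and the function name and example line are assumptions; it shows how reading the whole line as a filename would produce exactly the "path + label" error reported.

```python
import os

def parse_meta_line(root, line):
    # Each line in meta/train.txt is assumed to look like:
    #   n01440764/n01440764_10026.JPEG 0
    # Split off the trailing label before joining with the data root.
    # If the whole line were used as the filename instead, the label
    # would end up appended to the path, as in the error above.
    rel_path, label = line.strip().rsplit(maxsplit=1)
    return os.path.join(root, rel_path), int(label)

path, label = parse_meta_line(
    "EasyCV/data/imagenet/train",
    "n01440764/n01440764_10026.JPEG 0\n",
)
# path  -> "EasyCV/data/imagenet/train/n01440764/n01440764_10026.JPEG"
# label -> 0
```

If the error path ends with the label, it may be worth checking that the configured data source is one that parses this "path label" list format, rather than treating each line as a bare filename.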
Environment
Python: 3.7.16 (default, Jan 17 2023, 22:20:44) [GCC 11.2.0]
CUDA available: True
CUDA_HOME: /usr/local/cuda-11.7
NVCC: Build cuda_11.7.r11.7/compiler.31442593_0
GPU 0,1,2: NVIDIA Graphics Device
GCC: gcc (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0
PyTorch: 1.8.0+cu111
PyTorch compiling details: PyTorch built with:
TorchVision: 0.9.0+cu111
OpenCV: 4.9.0
MMCV: 1.4.4
EasyCV: 0.11.6