Hi, can you please tell how much GPU memory is required to fine-tune the model? I am trying to fine-tune it on an Nvidia 2080 Ti with 12 GB of memory, but I am getting a CUDA out-of-memory error.
Hi @DhavalTaunk08,
Would you mind expanding on your settings a bit?
What dataset are you fine-tuning on? How large is your batch size? What sequence length do you use? Are you running with mixed precision (fp16), and does your GPU support it with NVIDIA Apex installed?
Hi @MarkusSagen, I am using my own custom-built dataset in the same format as the Wikisum dataset. I have tried batch sizes from 2 to 16 and sequence lengths from 2 to 4096. I am using mixed precision (fp16), and the GPU is set up correctly.
I think it is unlikely to work with those specs, unfortunately. Just initializing the xlm-r weights and training on a minimal text sample would, I estimate, take more than 18 GB.
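If you still want to try on 12 GB, the usual memory-saving knobs are a per-device batch size of 1 with gradient accumulation, fp16, and gradient checkpointing. Below is a minimal sketch using the Hugging Face `Trainer`, not the repo's actual training script; the checkpoint name `xlm-roberta-base`, the task head, and the `train_dataset` variable are placeholder assumptions you would swap for your own setup.

```python
# Minimal memory-saving sketch (assumptions: HF Trainer, xlm-roberta-base,
# a pre-tokenized `train_dataset` you supply yourself).
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained("xlm-roberta-base")
model.gradient_checkpointing_enable()  # trade extra compute for less activation memory

args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=1,   # keep the per-step footprint as small as possible
    gradient_accumulation_steps=16,  # recover an effective batch size of 16
    fp16=True,                       # mixed precision roughly halves activation memory
    num_train_epochs=3,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```

Even with all of these enabled, long sequences (e.g. 4096 tokens) may still not fit on a single 12 GB card, since attention memory grows with sequence length.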