OutOfMemoryError with PrototypicalCalibrationBlock #73
The solution is to locate the PCB module at /path/defrcn/defrcn/evaluation/calibration_layer.py and, in the `build_prototypes` function, add `features = None` right after the line `all_features.append(features.cpu().data)`.
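For reference, a minimal, self-contained sketch of the pattern behind that fix (the helper name and loop are illustrative, not the repo's exact code; in DeFRCN the loop lives inside `PrototypicalCalibrationBlock.build_prototypes`). Dropping the GPU reference as soon as the CPU copy is taken keeps per-image activations from accumulating in GPU memory across the loop:

```python
import torch

def collect_support_features(extract_roi_features, samples):
    """Gather per-image ROI features on the CPU without hoarding GPU memory."""
    all_features = []
    for img, boxes in samples:
        features = extract_roi_features(img, boxes)   # tensor on the GPU
        all_features.append(features.cpu().data)      # keep only the CPU copy
        features = None                               # the fix: release the GPU reference
        torch.cuda.empty_cache()                      # optional: hand freed blocks back to the driver
    return all_features
```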
Thanks for your reply! This works!
However, this error still occurs from time to time.
The features are built from the novel classes of your custom dataset. You can solve this by reducing the number of novel classes, or by generating the features offline: instead of loading the novel data to build them every time the model is validated, save them once via the pickle module and modify the code to load the precomputed file directly during validation.
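A rough sketch of that offline approach, assuming a `PrototypicalCalibrationBlock`-like object whose `build_prototypes()` returns the prototype tensors (the cache path and wrapper function are hypothetical, not repo code):

```python
import os
import pickle

PROTO_CACHE = "checkpoints/pcb_prototypes.pkl"  # hypothetical cache location

def get_prototypes(pcb):
    """Load cached prototypes if available; otherwise build them once and cache."""
    if os.path.exists(PROTO_CACHE):
        with open(PROTO_CACHE, "rb") as f:
            return pickle.load(f)
    prototypes = pcb.build_prototypes()  # the GPU-heavy step, done only once
    with open(PROTO_CACHE, "wb") as f:
        pickle.dump(prototypes, f)
    return prototypes
```

During validation the calibration block would then call something like `get_prototypes(self)` instead of rebuilding the prototypes from the novel data each time.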
The device I use is an A800 80G, and my novel setting is 10-shot with 13 classes. When the model is loaded together with the PCB module, GPU memory usage reaches 53 GB, and before the modification even 80 GB was not enough.
Hello, when training on my own dataset with DeFRCN I ran into an issue: the base training goes smoothly, but as soon as I attempt K-shot finetuning I keep getting an OutOfMemoryError.
While trying to solve it, I found that the error doesn't occur when PCB_ENABLE is set to False.
However, with PCB_ENABLE set to True I still hit the OutOfMemoryError on an A100-40G, even after reducing IMS_PER_BATCH to 1 (the config keys involved are sketched below the question).
Has anyone else experienced a similar issue? How was it resolved?
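For anyone looking for where these switches live, a sketch of the relevant overrides, assuming the yacs-style config used by detectron2-based repos such as DeFRCN (the import path, config file name, and exact key names are assumptions; check the repo's config defaults and your own YAML):

```python
from defrcn.config import get_cfg  # assumption: DeFRCN exposes a detectron2-style get_cfg

cfg = get_cfg()
cfg.merge_from_file("configs/your_fsod_config.yaml")  # hypothetical config path
cfg.SOLVER.IMS_PER_BATCH = 1    # reducing the batch size alone did not avoid the OOM here
cfg.TEST.PCB_ENABLE = False     # disabling PCB sidesteps the OOM, at the cost of the calibration step
```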