Replies: 4 comments
-
Hey @rcslight, thanks for using NeuralForecast! By default, our methods check whether a GPU is available and, if so, run on it. If you are using your own GPU, make sure that you have a correct CUDA installation and that your Python environment detects the GPU.
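A quick way to confirm that the environment actually sees the GPU is plain PyTorch, nothing NeuralForecast-specific:

```python
import torch

# If this prints False, PyTorch does not see a usable CUDA device,
# and NeuralForecast will fall back to CPU.
print(torch.cuda.is_available())
print(torch.cuda.device_count())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```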
-
Hi, thanks for the insanely quick answer! I'm using my own device and it all checks out; however, whenever I instantiate the model it still seems to default to CPU. Funnily enough, when I test it with my own dataset, nvidia-smi does show GPU usage. Trying to work out the cause of these observations, I was able to nail down the behavior by pausing the training and checking the device.
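Roughly the check I was doing, as a self-contained sketch (the h/input_size values are just illustrative):

```python
from neuralforecast.models import NBEATSx

# Instantiate a model and inspect which device its parameters live on.
# NBEATSx is a torch.nn.Module underneath, so this works directly.
model = NBEATSx(h=12, input_size=24)
print(next(model.parameters()).device)
# Prints "cpu" right after instantiation; the trainer only moves the
# module to the GPU once fitting actually starts.
```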
-
The GPU memory usage in your nvidia-smi readings shows that you are already using the GPU.
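If you want to double-check from inside the process as well, PyTorch reports the same information:

```python
import torch

# Bytes currently allocated/reserved by tensors on the default GPU;
# nonzero values during training confirm the model is on the GPU.
print(torch.cuda.memory_allocated())
print(torch.cuda.memory_reserved())
```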
-
Great, thanks for the assistance!
-
Hello everyone!
I've been following the NBEATSx example usage but can't seem to force GPU as the device. I've scoured the lib and the docs and didn't find an obvious way to do it, or whether it's even possible.
I've tried adding GPU as the accelerator, to no avail:
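Roughly what I tried, as a sketch (the values are illustrative, and I'm assuming from my read of the source that extra keyword arguments get forwarded to the underlying PyTorch Lightning Trainer):

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATSx

# Illustrative horizon/input sizes; the point is passing `accelerator`,
# which I understand is forwarded to the PyTorch Lightning Trainer.
model = NBEATSx(h=12, input_size=24, max_steps=100, accelerator='gpu')
nf = NeuralForecast(models=[model], freq='M')
# nf.fit(df=train_df)  # train_df: long-format frame with unique_id, ds, y
```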
I can use the GPU in the same environment with plain PyTorch without any trouble.
I've been having fun with the lib so far on my use case, and I hope to contribute with y'all in the future! Cheers!