Running umxhq on a large test track (Georgia Wonder - Siren) blows up memory >64GB #113
Comments
I just saw the other suggestion to do inference in 30 s chunks, so I'll do it that way.
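A minimal sketch of that chunked-inference workaround, assuming the torch.hub entry point returns the waveform-in/waveform-out Separator; `siren.wav` is a placeholder path, and the 30 s chunk length and naive concatenation are illustrative only:

```python
import torch
import torchaudio

# pretrained umxhq separator: (batch, channels, samples) in,
# (batch, targets, channels, samples) out
separator = torch.hub.load("sigsep/open-unmix-pytorch", "umxhq")
separator.eval()

audio, rate = torchaudio.load("siren.wav")  # placeholder input file
audio = audio.unsqueeze(0)                  # -> (1, channels, samples)

chunk = 30 * rate                           # 30 s worth of samples
outputs = []
with torch.no_grad():                       # no gradients needed at inference
    for start in range(0, audio.shape[-1], chunk):
        segment = audio[..., start:start + chunk]
        outputs.append(separator(segment))

estimates = torch.cat(outputs, dim=-1)      # stitch chunks back together
```

Hard chunk boundaries can leave audible seams, since the Wiener/EM statistics are estimated per chunk; crossfading overlapping chunks is the usual refinement.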
hmmm, could you check whether the batch-size parameter is actually being used? If not, it means that the system is trying to process the whole track in one go, which may be the source of the problem
Yes, it is being used (the default of 200).
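For context, a toy sketch of what that frame batching buys in the Wiener step. This is a single-pass ratio mask in NumPy, not the project's multichannel EM; the point is only that the per-slice temporaries scale with `batch_size` (default 200 frames) rather than with the full track length:

```python
import numpy as np

def masked_wiener_in_batches(v, x, batch_size=200, eps=1e-10):
    """Toy stand-in for batched Wiener filtering.

    v: (frames, bins, sources) non-negative source magnitudes
    x: (frames, bins) complex mixture STFT
    Returns y: (frames, bins, sources) complex source estimates.
    """
    y = np.zeros(v.shape, dtype=x.dtype)
    for pos in range(0, v.shape[0], batch_size):
        sl = slice(pos, pos + batch_size)
        # temporaries below are (batch_size, bins, ...) rather than
        # (total_frames, bins, ...), which bounds peak memory
        total = v[sl].sum(axis=-1, keepdims=True) + eps
        y[sl] = (v[sl] / total) * x[sl][..., None]
    return y
```

At 44.1 kHz with a 4096-point FFT and 1024 hop (assumed here), a 7:10 track is roughly 18,500 STFT frames, so slices of 200 shrink those temporaries by about 90x.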
ok, and when you have 0 iterations, it works fine?
Yes, I'm just surprised because it's the first time I've had an issue running a full evaluation.
ok. Oh, I guess I should fix the memory usage
what's the length of the track?
If you would like, I can take a look with memory_profiler and see if I can find any savings to contribute to this project. The song looks like it's 7:10.
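A sketch of how that line-by-line profiling could look with memory_profiler (`pip install memory-profiler`); `separate_track` and the file path are hypothetical wrappers around the call being investigated:

```python
from memory_profiler import profile

import torch
import torchaudio

@profile  # prints per-line memory increments after the call returns
def separate_track(path):
    audio, rate = torchaudio.load(path)
    separator = torch.hub.load("sigsep/open-unmix-pytorch", "umxhq")
    with torch.no_grad():
        return separator(audio.unsqueeze(0))

if __name__ == "__main__":
    # or run under `mprof run this_script.py` then `mprof plot`
    # for a memory-over-time view
    separate_track("siren.wav")
```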
well ok, we could do that together! Thanks (I'm not super available these days, but I'm curious about it). Normally this batch-size parameter should be saving quite a lot of RAM, so could you profile as a start to see which tensors are exploding? Are you tracking gradients, btw?
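For reference, a minimal sketch of the distinction that last question is getting at: turning off `requires_grad` on the input tensor alone doesn't stop autograd from recording the graph if any model parameter requires grad, whereas wrapping the call in `torch.no_grad()` disables the bookkeeping entirely (the module and shapes below are placeholders):

```python
import torch

model = torch.nn.Linear(1024, 1024)             # placeholder model
x = torch.randn(1, 1024).requires_grad_(False)  # grads off on the input only

y1 = model(x)         # graph is still built: the Linear's weights require grad
print(y1.grad_fn)     # -> <AddmmBackward0 ...>

with torch.no_grad():  # no graph at all; intermediate buffers can be freed
    y2 = model(x)
print(y2.grad_fn)     # -> None
```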
I just tried disabling grad on my audio tensor; it didn't save much. Some heavy lines from my profiling:
I thought I could be smart and only apply Wiener on
Running the umxhq separator with the default Wiener separation (niter=1) really blows up my memory usage when I run umx on the CPU. Is it really supposed to do that?
I could swear this used to run fine before, and I've never had more than 64 GB of RAM. It sounds like a conspiracy, but I wonder whether an ffmpeg version upgrade could silently be causing more memory to be used?
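One way to pin the blame on the Wiener step would be to compare peak RSS at niter=0 vs niter=1. A sketch using the standard-library resource module (Unix only); note that forwarding `niter` through the torch.hub entry point is an assumption to verify against the current hubconf, and `track.wav` is a placeholder path:

```python
import resource
import sys

import torch
import torchaudio

# run once per setting (e.g. `python peak_rss.py 0`, then `python peak_rss.py 1`):
# ru_maxrss is a process-wide high-water mark, so runs must not share a process
niter = int(sys.argv[1])

audio, _ = torchaudio.load("track.wav")
separator = torch.hub.load("sigsep/open-unmix-pytorch", "umxhq", niter=niter)
with torch.no_grad():
    separator(audio.unsqueeze(0))

peak_gib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024 ** 2
print(f"niter={niter}: peak RSS {peak_gib:.1f} GiB")  # ru_maxrss is KiB on Linux
```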