Due diligence
I have done my due diligence in trying to find the answer myself.
Topic
The PyTorch implementation
Question
When I attempted to reproduce the MIMI codec, I found that my codebook utilization was very low in every layer except the first (distillation) layer. In contrast, the codebook utilization of all RVQ layers in the official checkpoint is very high.
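For concreteness, by codebook utilization I mean the fraction of entries in a layer's codebook that are actually selected. A minimal sketch of how it can be measured from a layer's code indices (the function name and shapes here are illustrative, not from the Mimi codebase):

```python
import torch

@torch.no_grad()
def codebook_utilization(indices: torch.Tensor, codebook_size: int) -> float:
    """Fraction of codebook entries selected at least once.

    indices: LongTensor of code indices from one RVQ layer,
             e.g. shape (batch, frames), accumulated over a validation set.
    """
    counts = torch.bincount(indices.flatten(), minlength=codebook_size)
    return (counts > 0).float().mean().item()
```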
Is there any technique to improve the utilization of these codebooks, especially for the RVQ layers other than the distillation layer? I’m looking forward to your reply. Thank you!
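For context, one remedy I have seen in EMA-based RVQ training (e.g., EnCodec-style quantizers) is expiring dead codes: entries whose EMA usage falls below a threshold are re-seeded with encoder outputs sampled from the current batch. Below is a minimal sketch of that idea under those assumptions; it is my own simplified version, not Mimi's actual quantizer, and all names and defaults are illustrative. Other options I have seen in similar codebases include k-means initialization of each codebook from the first batches.

```python
import torch
import torch.nn.functional as F

class EMAQuantizer(torch.nn.Module):
    """Minimal EMA vector quantizer with dead-code re-initialization.

    Codes whose EMA usage drops below `dead_threshold` are replaced with
    randomly sampled encoder outputs, keeping the whole codebook alive.
    """

    def __init__(self, codebook_size: int, dim: int,
                 decay: float = 0.99, dead_threshold: float = 2.0):
        super().__init__()
        self.decay = decay
        self.dead_threshold = dead_threshold
        self.register_buffer("codebook", torch.randn(codebook_size, dim))
        self.register_buffer("ema_count", torch.ones(codebook_size))
        self.register_buffer("ema_sum", self.codebook.clone())

    def forward(self, x: torch.Tensor):
        # x: (n, dim) flattened encoder outputs (or residuals) for this stage.
        indices = torch.cdist(x, self.codebook).argmin(dim=-1)
        quantized = self.codebook[indices]

        if self.training:
            with torch.no_grad():
                onehot = F.one_hot(indices, self.codebook.shape[0]).type_as(x)
                # Exponential moving averages of per-code counts and sums.
                self.ema_count.mul_(self.decay).add_(onehot.sum(0), alpha=1 - self.decay)
                self.ema_sum.mul_(self.decay).add_(onehot.t() @ x, alpha=1 - self.decay)
                self.codebook.copy_(
                    self.ema_sum / self.ema_count.clamp(min=1e-5).unsqueeze(-1))

                # Re-seed "dead" codes from random vectors in the batch.
                dead = self.ema_count < self.dead_threshold
                if dead.any():
                    samples = x[torch.randint(0, x.shape[0], (int(dead.sum()),))]
                    self.codebook[dead] = samples
                    self.ema_sum[dead] = samples
                    self.ema_count[dead] = self.dead_threshold

        # Straight-through estimator so gradients reach the encoder.
        return x + (quantized - x).detach(), indices
```

The threshold-based expiry is the part aimed at utilization; the EMA updates and straight-through return are just the standard surrounding machinery.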