About EmbeddingBag of torch #149
henryqin1997 started this conversation in Community | Ideas
In PyTorch, EmbeddingBag is itself a module described as "Computes sums or means of 'bags' of embeddings, without instantiating the intermediate embeddings". DLRM makes use of this module, and Colossal-AI has not yet adapted it. Is PyTorch's EmbeddingBag supposed to work directly with Colossal-AI layers, or will it be adapted later?
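For context, here is a minimal, self-contained illustration (not part of the original post) of what nn.EmbeddingBag computes: the same result as an nn.Embedding lookup followed by a per-bag reduction, just without keeping the intermediate per-token embeddings. The table size and indices below are arbitrary.

```python
import torch
import torch.nn as nn

num_embeddings, embedding_dim = 10, 4
weight = torch.randn(num_embeddings, embedding_dim)

bag = nn.EmbeddingBag.from_pretrained(weight, mode="sum")
emb = nn.Embedding.from_pretrained(weight)

# Two "bags" given as a flat index tensor plus offsets:
# bag 0 = indices[0:3], bag 1 = indices[3:]
indices = torch.tensor([1, 2, 4, 4, 3])
offsets = torch.tensor([0, 3])

out_bag = bag(indices, offsets)             # shape (2, embedding_dim)
out_manual = torch.stack([
    emb(indices[0:3]).sum(dim=0),           # reduce bag 0
    emb(indices[3:]).sum(dim=0),            # reduce bag 1
])
print(torch.allclose(out_bag, out_manual))  # True
```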
Replies: 2 comments
-
@kurisusnowdeng Can you answer @henryqin1997's query?
-
Hi, @henryqin1997. Thank you for your interest. We are sorry that we do not yet support EmbeddingBag. You can use the PyTorch EmbeddingBag directly if no tensor parallelism is involved. Otherwise, to use a tensor-parallel EmbeddingBag, you may have to use our tensor-parallel Embedding and aggregate the results manually (see the sketch below). We will add this layer to our system very soon.
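Below is a sketch of the suggested workaround, assuming you already have an embedding layer (Colossal-AI's tensor-parallel embedding in practice; a plain nn.Embedding is used here as a stand-in). The helper name and the way it handles the lookup output are assumptions for illustration, not code from this thread.

```python
import torch
import torch.nn as nn


def embedding_bag_from_embedding(embedding: nn.Module,
                                 indices: torch.Tensor,
                                 offsets: torch.Tensor,
                                 mode: str = "sum") -> torch.Tensor:
    """Emulate EmbeddingBag: look up per-token vectors, then reduce each bag."""
    token_vectors = embedding(indices)                      # (num_tokens, dim)
    num_bags = offsets.numel()
    # Length of each bag, derived from the offsets into the flat index tensor.
    lengths = torch.diff(offsets, append=torch.tensor([indices.numel()]))
    # Bag id of every token position, e.g. offsets=[0, 3] -> [0, 0, 0, 1, 1]
    bag_ids = torch.repeat_interleave(torch.arange(num_bags), lengths)
    out = torch.zeros(num_bags, token_vectors.size(-1), dtype=token_vectors.dtype)
    out.index_add_(0, bag_ids, token_vectors)               # per-bag sum
    if mode == "mean":
        out = out / lengths.clamp(min=1).unsqueeze(-1).to(out.dtype)
    return out


# Usage with a plain nn.Embedding standing in for the parallel layer.
emb = nn.Embedding(10, 4)
indices = torch.tensor([1, 2, 4, 4, 3])
offsets = torch.tensor([0, 3])        # bag 0 = indices[0:3], bag 1 = indices[3:]
pooled = embedding_bag_from_embedding(emb, indices, offsets, mode="sum")
print(pooled.shape)                   # torch.Size([2, 4])
```

Note that with a genuinely tensor-parallel embedding, the lookup output may still be sharded or may require a cross-rank reduction before or after the per-bag pooling; how to combine the partial results depends on how the Colossal-AI layer partitions the table, which this thread does not specify.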