Hi! I was wondering if a GPT Code Clippy model could generate embeddings instead of generated output?
The purpose is to embed code in a semantic space, so that the embedding can be used as a feature for another neural network. I have done the same with BERT (more as a baseline, since that model is not trained on code) and with the OpenAI Codex model (via a paid API), and would therefore love to use one of your models as well.
Thank you!
Yes, you could use our models for embeddings. However, I would not recommend it. I'd suggest checking out GraphCodeBERT from Microsoft for this instead. It gets good performance across a ton of code-related tasks: https://huggingface.co/microsoft/graphcodebert-base
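For what it's worth, the usual recipe with either model is to run the tokenized code through the encoder (e.g. via Hugging Face `transformers` with `output_hidden_states=True`) and pool the last-layer token vectors into a single fixed-size vector. A common pooling choice is masked mean pooling, which averages only the real (non-padding) tokens. Here is a minimal numpy sketch of just the pooling step; the arrays are toy stand-ins for actual model outputs, and the function name is illustrative, not from any library:

```python
import numpy as np

def mean_pool(hidden_states, attention_mask):
    """Masked mean pooling: average token vectors, ignoring padding.

    hidden_states: (batch, seq_len, dim) last-layer token vectors
    attention_mask: (batch, seq_len) with 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, :, None].astype(hidden_states.dtype)  # (batch, seq_len, 1)
    summed = (hidden_states * mask).sum(axis=1)                    # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                 # avoid divide-by-zero
    return summed / counts

# Toy example: one sequence of 3 tokens, the last one padding, dim 2.
h = np.array([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
m = np.array([[1, 1, 0]])
print(mean_pool(h, m))  # → [[2. 3.]]  (padding token is ignored)
```

The resulting (batch, dim) array can be fed directly as a feature vector to a downstream network, the same way you would use a Codex or BERT embedding.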