This repository has been archived by the owner on Sep 24, 2024. It is now read-only.
Hi! I'm currently trying to deploy this awesome model on Heroku. The code works fine locally, and I get no particular error message when I deploy it on Heroku. I have upgraded the dynos, so RAM should not be an issue (I'm not warned about it either). I have also deployed some fine-tuned BERT models on other dynos, and they work fine. But for Pegasus, the build completes, yet the deployment fails, and the only error I have to work with is an H12 when requesting paraphrasing through Postman.
Does anyone else have experience deploying Pegasus on Heroku? The model is uploaded to S3 storage and hence does not affect the slug size. Also, if anyone knows whether Hugging Face can be used for a similar API task, I would gladly hear about it!
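For context: H12 is Heroku's request-timeout error, raised when a request takes longer than the router's 30-second limit. One common cause when serving a large model is doing the expensive load (e.g. pulling the weights from S3) inside the request handler, so the first request blows past the timeout. A minimal sketch of the load-once pattern, using only the standard library; `load_model` here is a hypothetical stand-in for the real Pegasus load, not the actual code from this repo:

```python
import threading
import time

_model = None
_lock = threading.Lock()

def load_model():
    # Hypothetical stand-in for the slow part: downloading weights
    # from S3 and constructing the Pegasus model.
    time.sleep(0.1)  # simulate a slow load
    return {"name": "pegasus-paraphrase"}

def get_model():
    # Load the model once, on first use; every later request reuses
    # the cached object instead of re-fetching from S3.
    global _model
    if _model is None:
        with _lock:
            if _model is None:  # double-checked locking
                _model = load_model()
    return _model

def handle_request(text):
    model = get_model()  # fast after the first call
    # Real code would call model.generate(...) here.
    return f"paraphrased({text}) by {model['name']}"
```

In a real dyno you would typically call `get_model()` once at worker boot (for example via gunicorn's `--preload`) so that no request pays the load cost at all; and if the paraphrase inference itself can exceed 30 seconds, the work has to move to a background worker with a polling endpoint, since the Heroku router will cut the request regardless.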
Best regards and thanks in advance