Initial commit #28
base: main
Conversation
Hey @krokozyab, thank you for the interest in this challenge and for your solution.
As you can see, our GitHub Actions failed to build your solution; we suspect the base image is too large (i.e. python:3.11). Could you please try using a pytorch/pytorch:*-cuda*-cudnn*-runtime image (we generally use those for this purpose), or, if you want the base to be pure Python, python:3.11-slim should do the trick. Thank you in advance!
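For reference, a minimal Dockerfile along these lines could look as follows; the concrete pytorch tag, the main:app entrypoint, and requirements.txt are illustrative assumptions, not taken from the submission:

```dockerfile
# CUDA-enabled runtime base (pick a current tag from Docker Hub).
FROM pytorch/pytorch:2.1.0-cuda11.8-cudnn8-runtime
# Alternatively, for a pure-Python base without CUDA:
# FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```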
Hey @krokozyab, thank you for the fix, the tests now ran as expected. A few notes:
- Here are our test results on a Grafana dashboard. The errors you see are HTTP-400 errors returned for empty texts (these are intended and need to be handled as valid input).
- You forgot to remove FROM python:3.11-slim from the Dockerfile, and the runtime CUDA was missing (Docker "ignored" the first FROM). Commenting out the 3rd line fixes this issue, as sketched below.
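To illustrate the pitfall (the pytorch tag and surrounding lines are stand-ins, not the PR's actual Dockerfile):

```dockerfile
# With two FROM lines, Docker treats the file as a multi-stage build and builds
# the final image from the LAST stage, so the earlier CUDA base is discarded.
FROM pytorch/pytorch:2.1.0-cuda11.8-cudnn8-runtime
WORKDIR /app
# FROM python:3.11-slim    <- the stray FROM; commenting it out keeps the CUDA runtime
COPY . .
RUN pip install --no-cache-dir -r requirements.txt
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```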
Once again, thank you for this beautiful solution. If you want, you can improve or optimise it and re-request a review; please address the aforementioned issues if you plan to do so. Every contribution will count when choosing a challenge winner.
2. Fixed the Dockerfile. 3. Introduced FastAPI workers.
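For context, running a FastAPI app with several uvicorn worker processes typically looks like the sketch below; the main:app import string, port, and worker count are assumptions, not the PR's actual values:

```python
# Minimal sketch: serve a FastAPI app with multiple uvicorn worker processes.
import uvicorn

if __name__ == "__main__":
    # uvicorn requires an import string (not the app object) when workers > 1.
    uvicorn.run("main:app", host="0.0.0.0", port=8000, workers=4)
```

The same can be done from the command line with uvicorn main:app --workers 4.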
@krokozyab thank you for these additions, here are the test results, and here is one example of a text that returns HTTP-500 that I was able to catch:
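One possible way to avoid unhandled HTTP-500s on empty or otherwise problematic inputs is to validate the payload up front and return the intended HTTP-400 instead. The sketch below assumes an illustrative /embed endpoint, EmbedRequest model, and embed_texts helper rather than the submission's actual API:

```python
# Sketch: reject empty texts explicitly so the service answers with a
# controlled HTTP 400 rather than failing deeper in the model code.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class EmbedRequest(BaseModel):
    texts: list[str]

def embed_texts(texts: list[str]) -> list[list[float]]:
    # Placeholder for the actual model call.
    return [[float(len(t))] for t in texts]

@app.post("/embed")
def embed(req: EmbedRequest):
    # Empty payloads are expected test inputs; handle them explicitly.
    if not req.texts or any(not t.strip() for t in req.texts):
        raise HTTPException(status_code=400, detail="texts must be non-empty")
    return {"embeddings": embed_texts(req.texts)}
```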
Hi, I guess (only a guess) the test does not simulate multiple sessions, and all requests stick to just one worker, so the multi-worker setup just makes the solution worse.
Hello, please test it.