
Example usage of kafka endpoints as an input and output for triton server using Python API #92

Merged
merged 13 commits into triton-inference-server:main on Jul 24, 2024

Conversation

@pthalasta (Contributor) commented:

This PR provides an example of using Kafka endpoints as the input and output for Triton server via the Python API. As example code, we deploy the preprocessing stage of a tokenizer as a custom model.

Input: each message in the Kafka topic
Output: a JSON string of the result produced by passing the input through the deployed model

This PR adds an example supporting #3131 using Triton's Python API.
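
For readers skimming the conversation, here is a minimal sketch of the pattern the example follows: consume messages from a Kafka topic, run each message through a model served with Triton's in-process Python API, and publish the result as JSON to an output topic. The topic names, broker address, model name (`tokenizer`), and tensor names below are illustrative assumptions rather than the values used in the example, and the exact `tritonserver` and `confluent_kafka` calls should be checked against the versions pinned in the repository.

```python
import json

import numpy as np
import tritonserver
from confluent_kafka import Consumer, Producer

# Start Triton in-process and grab the deployed preprocessing model.
# The repository path and model name are placeholders.
server = tritonserver.Server(model_repository="/models")
server.start(wait_until_ready=True)
model = server.model("tokenizer")

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "triton-kafka-example",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["input-topic"])
producer = Producer({"bootstrap.servers": "localhost:9092"})

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        # Each Kafka message becomes one inference request; the input tensor
        # name ("TEXT") is an assumption about the tokenizer model's config.
        text = np.array([msg.value().decode("utf-8")], dtype=object)
        for response in model.infer(inputs={"TEXT": text}):
            # Serialize each output tensor to a JSON-friendly list; numeric
            # CPU outputs are assumed here (BYTES outputs need a string decode).
            outputs = {
                name: np.from_dlpack(tensor).tolist()
                for name, tensor in response.outputs.items()
            }
            producer.produce("output-topic", json.dumps(outputs).encode("utf-8"))
        producer.flush()
finally:
    consumer.close()
    server.stop()
```

Running Triton in-process this way keeps the Kafka consumer loop and the inference call in a single Python process, which is what makes a queue-driven (rather than HTTP/gRPC-driven) deployment straightforward.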

@pthalasta changed the title from "[DRAFT] Example usage of kafka endpoints as an input and output for triton server using Python API" to "Example usage of kafka endpoints as an input and output for triton server using Python API" on May 10, 2024
@harryskim previously approved these changes on Jul 19, 2024
@harryskim left a comment:


LGTM :) Thanks for putting this together!

@fpetrini15 (Collaborator) left a comment:


Outstanding work, thanks @pthalasta!

@fpetrini15 fpetrini15 merged commit a4bd15b into triton-inference-server:main Jul 24, 2024
3 checks passed