Is it possible to retrieve tokens from OpenAI? #115

Answered by trebedea
JesusMF23 asked this question in Q&A
Hi @JesusMF23 ,

We track LLM stats for each call; here is how they are used:

log.info("--- :: Stats: %s" % llm_stats)

At the moment, we reset the stats at the beginning of each generate_async call.

But you can easily read the stats after each call and accumulate the total usage for an app. Is this what you are looking for?
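The accumulation pattern described above can be sketched as follows. This is a hypothetical example, not the library's API: the `UsageTotals` class and the stat field names (`total_tokens`, `prompt_tokens`, `completion_tokens`, mirroring OpenAI's usage format) are assumptions, and the real `llm_stats` object may expose different names.

```python
# Hypothetical sketch: snapshot the per-call LLM stats right after each
# generate_async call (before the next call resets them) and accumulate
# them into app-level totals. Field names follow the OpenAI usage format
# and are assumptions, not the library's actual attribute names.
from dataclasses import dataclass


@dataclass
class UsageTotals:
    calls: int = 0
    total_tokens: int = 0
    prompt_tokens: int = 0
    completion_tokens: int = 0

    def add(self, stats: dict) -> None:
        # `stats` is the per-call snapshot taken after the call returns.
        self.calls += 1
        self.total_tokens += stats.get("total_tokens", 0)
        self.prompt_tokens += stats.get("prompt_tokens", 0)
        self.completion_tokens += stats.get("completion_tokens", 0)


totals = UsageTotals()
# After each generate_async call, feed the snapshot into the accumulator:
totals.add({"total_tokens": 42, "prompt_tokens": 30, "completion_tokens": 12})
totals.add({"total_tokens": 10, "prompt_tokens": 7, "completion_tokens": 3})
print(totals.calls, totals.total_tokens)  # 2 52
```

Snapshotting after every call sidesteps the per-call reset: the app-level totals live outside the library and survive across calls.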

Replies: 1 comment, 4 replies
Answer selected by JesusMF23