
Non-streamed responses aren't reflected in the dashboard #938

Open
manuelmorales opened this issue Feb 5, 2025 · 4 comments

Comments

@manuelmorales

Describe the issue

When chat completion requests are sent in non-streaming mode, they are not reflected in the dashboard at all. The issue was first reported in a Discord thread.

Steps to Reproduce

You can use the script below to reproduce the issue.

#!/bin/bash
set -e
# Load OPENAI_API_BASE and OPENAI_API_KEY from the environment file
source .env
set -x

curl "$OPENAI_API_BASE/chat/completions" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    -d '{
        "model": "gpt-4o-mini",
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Write a haiku that explains the concept of recursion."
            }
        ]
    }'

The request above returns 200 OK with a correct chat completion, but it doesn't show up in the dashboard. I don't know whether it's scanned for secrets either.

It happens with Anthropic too.

Adding "stream": true to the request body is enough to make the problem go away.
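For reference, a minimal sketch of the workaround request body (the field names and values below simply mirror the reproduction script above; the only change is the added "stream": true). The snippet validates the body before it would be sent:

```shell
#!/bin/bash
# Workaround sketch: the body is identical to the reproduction script,
# with only "stream": true added.
payload='{
    "model": "gpt-4o-mini",
    "stream": true,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a haiku that explains the concept of recursion."}
    ]
}'

# Sanity-check that the body is valid JSON before sending it.
# To send, reuse the same curl invocation as above, e.g.:
#   curl "$OPENAI_API_BASE/chat/completions" \
#       -H "Content-Type: application/json" \
#       -H "Authorization: Bearer $OPENAI_API_KEY" \
#       -d "$payload"
echo "$payload" | python3 -m json.tool
```

With that single extra field, responses appear in the dashboard as expected.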

Operating System

Linux (Intel)

IDE and Version

curl

Extension and Version

No extension was used.

Provider

OpenAI

Model

Tested with GPT-4o-mini and Claude Sonnet, but I suspect it applies to all models.

Codegate version

v0.1.16 (docker image ghcr.io/stacklok/codegate latest a20bef562ed0)

Logs

No response

Additional Context

No response

@kantord
Member

kantord commented Feb 5, 2025

I can confirm the same problem

@dussab
Member

dussab commented Feb 6, 2025

@kantord @manuelmorales I think PR #967 should fix this.

@aponcedeleonch
Contributor

aponcedeleonch commented Feb 10, 2025

@dussab I just tested this: the prompt and alerts are now recorded correctly, but the outputs still aren't. We can't close the issue yet :(

@lukehinds
Contributor

thanks @aponcedeleonch
