Support LLM token count and metadata for Vercel AI SDK #999
Comments
Thanks for flagging @zirkelc. We are hoping to fix at least the token counting issue very soon.
Any updates on this? This is quite a critical feature to have for customer-facing applications.
Hey @Boohi, we've just shipped 0.2.1 with a new OTEL-based approach. Docs will follow soon, but here's an example in Node.js:

```ts
import { AISDKExporter } from "langsmith/vercel";
import { Client } from "langsmith";

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";

const client = new Client();

// Register the LangSmith exporter with the OpenTelemetry Node SDK.
const sdk = new NodeSDK({
  traceExporter: new AISDKExporter({ client }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

const res = await generateText({
  model: openai("gpt-4o-mini"),
  messages: [
    {
      role: "user",
      content: "What color is the sky?",
    },
  ],
  // Attach a run name and metadata so they show up on the LangSmith trace.
  experimental_telemetry: AISDKExporter.getSettings({
    runName: "langsmith_traced_call",
    metadata: { userId: "123", language: "english" },
  }),
});

await sdk.shutdown();
```

If you are using any LangChain packages, we recommend bumping them all to latest along with this.
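As a related sketch (not from this thread): in environments where `@vercel/otel` manages the OpenTelemetry setup, such as a Next.js `instrumentation.ts` file, the same exporter can reportedly be passed to `registerOTel` instead of constructing `NodeSDK` directly. The service name below is illustrative; check the current `@vercel/otel` and LangSmith docs for the exact options.

```ts
// instrumentation.ts (Next.js) — a sketch, assuming @vercel/otel accepts a
// custom traceExporter; verify against the current @vercel/otel API.
import { registerOTel } from "@vercel/otel";
import { AISDKExporter } from "langsmith/vercel";

export function register() {
  registerOTel({
    serviceName: "my-app", // illustrative name
    traceExporter: new AISDKExporter(),
  });
}
```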
Feature request
LangSmith does not show the token count for traces created from `wrapAISDKModel` for the Vercel AI SDK. The counts are sent to LangSmith; however, they are not rendered in the UI. Also, any metadata added via `experimental_telemetry.metadata` is not sent to LangSmith. (A minimal sketch of this setup is shown below.)
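For context, a minimal sketch of the wrapper-based setup being described, assuming `wrapAISDKModel` is imported from `langsmith/wrappers/vercel` (verify the path against your langsmith version); the metadata values are illustrative:

```ts
import { wrapAISDKModel } from "langsmith/wrappers/vercel";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Wrap the AI SDK model so calls are traced to LangSmith.
const model = wrapAISDKModel(openai("gpt-4o-mini"));

const res = await generateText({
  model,
  messages: [{ role: "user", content: "What color is the sky?" }],
  // Token usage is reported by the AI SDK; the metadata below is what
  // does not currently reach LangSmith, per this issue.
  experimental_telemetry: {
    isEnabled: true,
    metadata: { userId: "123" },
  },
});
```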
Motivation

I'm exploring the Vercel AI SDK while keeping the good DX from LangSmith.