Incompatible type for .withStructuredOutput between different LLM providers #6795
I found a similar closed issue that might be relevant to your problem: Zod and Enum support. This issue was resolved by updating to version 0.0.59 from 0.0.55 [1].

The type error you are encountering when using `withStructuredOutput` can be caught at runtime with a guard like this:

```typescript
if (typeof llm.withStructuredOutput !== "function") {
  throw new Error(
    "The specified LLM does not support 'withStructuredOutput'. Please ensure you are using an LLM that supports this feature."
  );
}
```

Additionally, some providers restrict which structured-output method is available. The Google integration, for example, rejects JSON mode:

```typescript
if (method === "jsonMode") {
  throw new Error(`Google only supports "functionCalling" as a method.`);
}
```

Ensure that the LLM provider you are using supports the `withStructuredOutput` method. In LangChain.js, the method is declared on the base language model as:

```typescript
withStructuredOutput?<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  schema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<boolean>
): Runnable<BaseLanguageModelInput, RunOutput> | Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }>;
```

Make sure the LLM you are using extends from a base model class that implements this method. To resolve this issue, ensure that the LLM you are using is compatible with `withStructuredOutput`.
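For reference, a minimal usage sketch (the schema and model name here are illustrative placeholders, not from the original report):

```typescript
import { z } from "zod";
import { ChatOpenAI } from "@langchain/openai";

// Hypothetical schema, purely for illustration.
const Joke = z.object({
  setup: z.string().describe("The setup of the joke"),
  punchline: z.string().describe("The punchline of the joke"),
});

// Model name is a placeholder; any chat model that implements
// withStructuredOutput works the same way.
const model = new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 });

// The result of invoke() is typed as { setup: string; punchline: string }.
const structured = model.withStructuredOutput(Joke);
const joke = await structured.invoke("Tell me a joke about TypeScript");
console.log(joke.setup, joke.punchline);
```
|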
I am using an LLM which is compatible with `withStructuredOutput`. If I use just one of the LLMs, either Groq or OpenAI, it works fine. It only creates a type problem when the llm type is the union of Groq and OpenAI, roughly as sketched below.
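A minimal sketch of what I mean (with `SomeZodSchema` standing in for my actual schema):

```typescript
import { z } from "zod";
import type { ChatGroq } from "@langchain/groq";
import type { ChatOpenAI } from "@langchain/openai";

const SomeZodSchema = z.object({ answer: z.string() });

// Stand-in for the conditionally created model.
declare const llm: ChatGroq | ChatOpenAI;

// Compiles when llm is ChatGroq alone or ChatOpenAI alone; on the union,
// TypeScript typically cannot reconcile the two overload sets and flags
// this call as a type error.
llm.withStructuredOutput(SomeZodSchema);
```
|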
To resolve the TypeScript type incompatibility issue when using `withStructuredOutput` on a conditionally created LLM, it helps to look at the method's overload signatures. Here is the relevant code snippet (this implementation is from `ChatBedrockConverse`, but the overload shape is similar across providers):

```typescript
withStructuredOutput<
RunOutput extends Record<string, any> = Record<string, any>
>(
outputSchema:
| z.ZodType<RunOutput>
| Record<string, any>,
config?: StructuredOutputMethodOptions<false>
): Runnable<BaseLanguageModelInput, RunOutput>;
withStructuredOutput<
RunOutput extends Record<string, any> = Record<string, any>
>(
outputSchema:
| z.ZodType<RunOutput>
| Record<string, any>,
config?: StructuredOutputMethodOptions<true>
): Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }>;
withStructuredOutput<
RunOutput extends Record<string, any> = Record<string, any>
>(
outputSchema:
| z.ZodType<RunOutput>
| Record<string, any>,
config?: StructuredOutputMethodOptions<boolean>
):
| Runnable<BaseLanguageModelInput, RunOutput>
| Runnable<
BaseLanguageModelInput,
{ raw: BaseMessage; parsed: RunOutput }
> {
const schema: z.ZodType<RunOutput> | Record<string, any> = outputSchema;
const name = config?.name;
const description = schema.description ?? "A function available to call.";
const method = config?.method;
const includeRaw = config?.includeRaw;
if (method === "jsonMode") {
throw new Error(`ChatBedrockConverse does not support 'jsonMode'.`);
}
// Additional implementation details...
}
```

Ensure that the schema and config you pass match one of these overloads. If you are conditionally generating the LLM object, you might need to use a type assertion or a type guard so that TypeScript understands that the returned LLM object supports the `withStructuredOutput` method. In your case, with a factory like this:

```typescript
export function getLangChainLlm(m?: Models, p?: ProvidersType): ChatGroq | ChatOpenAI {
const { model, provider } = getLlmModelAndProvider(m, p);
switch (provider) {
case Providers.Enum.groq:
return new ChatGroq({
model,
temperature: 0,
maxRetries: 2,
});
case Providers.Enum.openai:
return new ChatOpenAI({
model,
temperature: 0,
maxRetries: 2,
});
default:
throw new Error(`Unknown provider: ${provider}`);
}
}
```

you can then guard the call site:

```typescript
const llm = getLangChainLlm();
if ("withStructuredOutput" in llm) {
  llm.withStructuredOutput(SomeZodSchema);
}
```

This approach ensures that TypeScript recognizes that the returned LLM object supports the `withStructuredOutput` method.
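Alternatively, a sketch of another option (an assumption on my part, not from the original thread: it relies on `BaseChatModel` in `@langchain/core` providing a common `withStructuredOutput` implementation): widen the factory's declared return type to the shared base class, so the call resolves against a single signature rather than a union of provider-specific overload sets.

```typescript
import { z } from "zod";
import { BaseChatModel } from "@langchain/core/language_models/chat_models";
import { ChatGroq } from "@langchain/groq";
import { ChatOpenAI } from "@langchain/openai";

const SomeZodSchema = z.object({ answer: z.string() });

// Widening the return type to BaseChatModel (instead of ChatGroq | ChatOpenAI)
// gives the call site a single withStructuredOutput signature to resolve
// against. Function name and model names are placeholders.
function getLangChainLlmWidened(provider: "groq" | "openai"): BaseChatModel {
  switch (provider) {
    case "groq":
      return new ChatGroq({ model: "llama-3.1-8b-instant", temperature: 0, maxRetries: 2 });
    case "openai":
      return new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0, maxRetries: 2 });
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}

const llm = getLangChainLlmWidened("openai");
const structured = llm.withStructuredOutput(SomeZodSchema); // type-checks
```

The trade-off is that provider-specific options (such as the OpenAI-only arguments mentioned later in this thread) are no longer visible on the widened type.
|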
@dosu |
This seems like it could be a bug that needs your attention, @jacoblee93, could you take a look? |
Hey! Sorry you're running into this - can you confirm you've got the latest versions of @langchain/groq and OpenAI? We just pushed 0.3.0 of core yesterday, updated docs will be coming soon but you'll need the latest for all packages to use it. |
Hi @jacoblee93 Thanks for your quick reply. Yes, I updated to 0.3.0 before creating this issue. Here is how my package.json looks:
I checked the type defs of both packages: https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-openai/src/chat_models.ts#L1716 The only difference that I see is that the OpenAI one has an extra type in the union. |
Got it - will have a look but OpenAI does accept a few unique args so it may be expected |
Thanks @jacoblee93. In that case, can we declare the types in a way that lets TS type inference work correctly? This seems to be more a problem of TS not being able to infer the type properly. |
Checked other resources

Example Code

I generate the langchain LLM object conditionally like this (the `getLangChainLlm` function quoted in the comments above):

Error Message and Stack Trace (if applicable)

While trying to use `llm.withStructuredOutput(SomeZodSchema)`, I get this type error.

Description

I think the langchain doc says it provides a unified interface for `withStructuredOutput` across different LLMs. In that case, shouldn't the types be compatible?

System Info

platform: Mac
Node: 20.10.0
npm: 10.2.3