chore: deprecate ai/react (#4816)
lgrammel authored Feb 10, 2025
1 parent fb18dec commit dc49119
Showing 90 changed files with 189 additions and 137 deletions.
5 changes: 5 additions & 0 deletions .changeset/two-rats-mix.md
@@ -0,0 +1,5 @@
+ ---
+ 'ai': patch
+ ---
+
+ chore: deprecate ai/react
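The bulk of this commit is a mechanical import migration from the deprecated `ai/react` entry point to `@ai-sdk/react`. A minimal sketch of that rewrite as a hypothetical codemod helper (not part of this commit — the docs below were edited directly):

```typescript
// Hypothetical helper illustrating the migration in this commit:
// rewrite deprecated `ai/react` import specifiers to `@ai-sdk/react`.
function migrateImports(source: string): string {
  // Match `from 'ai/react'` or `from "ai/react"` and swap the module name,
  // preserving whichever quote style the file uses.
  return source.replace(/from\s+(['"])ai\/react\1/g, 'from $1@ai-sdk/react$1');
}

const before = "import { useChat } from 'ai/react';";
console.log(migrateImports(before));
// → import { useChat } from '@ai-sdk/react';
```

A regex-based rewrite like this covers the simple `import … from 'ai/react'` forms seen in this diff; a real codemod would operate on the AST to handle re-exports and dynamic imports.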
@@ -69,7 +69,7 @@ Let's create a simple chat interface with `useChat`. You will call the `/api/cha
```tsx filename='app/page.tsx'
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';
import Image from 'next/image';

export default function Chat() {
4 changes: 2 additions & 2 deletions content/cookbook/01-next/120-stream-assistant-response.mdx
@@ -8,12 +8,12 @@ tags: ['next', 'streaming', 'assistant']

## Client

- Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.
+ Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useAssistant` hook from `@ai-sdk/react` to stream the messages and status.

```tsx filename='app/page.tsx'
'use client';

- import { Message, useAssistant } from 'ai/react';
+ import { Message, useAssistant } from '@ai-sdk/react';

export default function Page() {
const { status, messages, input, submitMessage, handleInputChange } =
@@ -6,7 +6,7 @@ tags: ['next', 'streaming', 'assistant']

# Stream Assistant Response with Tools

- Let's create a simple chat interface that allows users to send messages to the assistant and receive responses and give it the ability to use tools. You will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.
+ Let's create a simple chat interface that allows users to send messages to the assistant and receive responses and give it the ability to use tools. You will integrate the `useAssistant` hook from `@ai-sdk/react` to stream the messages and status.

You will need to provide the list of tools on the OpenAI [Assistant Dashboard](https://platform.openai.com/assistants). You can use the following schema to create a tool to convert celsius to fahrenheit.

@@ -29,12 +29,12 @@ You will need to provide the list of tools on the OpenAI [Assistant Dashboard](h

## Client

- Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.
+ Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useAssistant` hook from `@ai-sdk/react` to stream the messages and status.

```tsx filename='app/page.tsx'
'use client';

- import { Message, useAssistant } from 'ai/react';
+ import { Message, useAssistant } from '@ai-sdk/react';

export default function Page() {
const { status, messages, input, submitMessage, handleInputChange } =
4 changes: 2 additions & 2 deletions content/cookbook/01-next/122-caching-middleware.mdx
@@ -10,12 +10,12 @@ Let's create a simple chat interface that uses [`LanguageModelMiddleware`](/docs

## Client

- Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useChat` hook from `ai/react` to stream responses.
+ Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useChat` hook from `@ai-sdk/react` to stream responses.

```tsx filename='app/page.tsx'
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit, error } = useChat();
4 changes: 2 additions & 2 deletions content/cookbook/01-next/20-stream-text.mdx
@@ -14,12 +14,12 @@ Text generation can sometimes take a long time to complete, especially when you'

## Client

- Let's create a simple React component that imports the `useCompletion` hook from the `ai/react` module. The `useCompletion` hook will call the `/api/completion` endpoint when a button is clicked. The endpoint will generate text based on the input prompt and stream it to the client.
+ Let's create a simple React component that imports the `useCompletion` hook from the `@ai-sdk/react` module. The `useCompletion` hook will call the `/api/completion` endpoint when a button is clicked. The endpoint will generate text based on the input prompt and stream it to the client.

```tsx filename="app/page.tsx"
'use client';

- import { useCompletion } from 'ai/react';
+ import { useCompletion } from '@ai-sdk/react';

export default function Page() {
const { completion, complete } = useCompletion({
4 changes: 2 additions & 2 deletions content/cookbook/01-next/21-stream-text-with-chat-prompt.mdx
@@ -25,12 +25,12 @@ Chat completion can sometimes take a long time to finish, especially when the re

## Client

- Let's create a React component that imports the `useChat` hook from the `ai/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client.
+ Let's create a React component that imports the `useChat` hook from the `@ai-sdk/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client.

```tsx filename='app/page.tsx'
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Page() {
const { messages, input, setInput, append } = useChat();
@@ -56,7 +56,7 @@ You can replace the `imageUrl` with the actual URL of the image you want to send
```typescript filename='app/page.tsx' highlight="18-20"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;
@@ -82,7 +82,7 @@ Finally, on the client, use the `useChat` hook to manage the chat state and rend
```typescript filename='app/page.tsx'
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';
import { MemoizedMarkdown } from '@/components/memoized-markdown';

export default function Page() {
4 changes: 2 additions & 2 deletions content/cookbook/01-next/40-stream-object.mdx
@@ -66,7 +66,7 @@ Please note the code for handling `undefined` values in the JSX.
```tsx filename='app/page.tsx'
'use client';

- import { experimental_useObject as useObject } from 'ai/react';
+ import { experimental_useObject as useObject } from '@ai-sdk/react';
import { notificationSchema } from './api/use-object/schema';

export default function Page() {
@@ -126,7 +126,7 @@ You can also use the `stop` function to stop the object generation process.
```tsx filename='app/page.tsx' highlight="7,16,21,24"
'use client';

- import { experimental_useObject as useObject } from 'ai/react';
+ import { experimental_useObject as useObject } from '@ai-sdk/react';
import { notificationSchema } from './api/use-object/schema';

export default function Page() {
4 changes: 2 additions & 2 deletions content/cookbook/01-next/70-call-tools.mdx
@@ -28,14 +28,14 @@ Some models allow developers to provide a list of tools that can be called at an

## Client

- Let's create a React component that imports the `useChat` hook from the `ai/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client. If the assistant responds with a tool call, the hook will automatically display them as well.
+ Let's create a React component that imports the `useChat` hook from the `@ai-sdk/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client. If the assistant responds with a tool call, the hook will automatically display them as well.

We will use the `maxSteps` option to specify the maximum number of steps (i.e., LLM calls) that can be made to prevent infinite loops. In this example, you will set it to `2` to allow for two backend calls to happen.

```tsx filename='app/page.tsx'
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Page() {
const { messages, input, setInput, append } = useChat({
4 changes: 2 additions & 2 deletions content/cookbook/01-next/71-call-tools-in-parallel.mdx
@@ -28,14 +28,14 @@ Some language models support calling tools in parallel. This is particularly use

## Client

- Let's create a React component that imports the `useChat` hook from the `ai/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client. If the assistant responds with a tool call, the hook will automatically display them as well.
+ Let's create a React component that imports the `useChat` hook from the `@ai-sdk/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client. If the assistant responds with a tool call, the hook will automatically display them as well.

You will use the `maxSteps` option to specify the maximum number of steps that can be made before the model or the user responds with a text message. In this example, you will set it to `2` to allow for another call with the tool result to happen.

```tsx filename='app/page.tsx'
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Page() {
const { messages, input, setInput, append } = useChat({
4 changes: 2 additions & 2 deletions content/cookbook/01-next/72-call-tools-multiple-steps.mdx
@@ -10,14 +10,14 @@ Some language models are great at calling tools in multiple steps to achieve a m

## Client

- Let's create a React component that imports the `useChat` hook from the `ai/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client. If the assistant responds with a tool call, the hook will automatically display them as well.
+ Let's create a React component that imports the `useChat` hook from the `@ai-sdk/react` module. The `useChat` hook will call the `/api/chat` endpoint when the user sends a message. The endpoint will generate the assistant's response based on the conversation history and stream it to the client. If the assistant responds with a tool call, the hook will automatically display them as well.

To call tools in multiple steps, you can use the `maxSteps` option to specify the maximum number of steps that can be made before the model or the user responds with a text message. In this example, you will set it to `5` to allow for multiple tool calls.

```tsx filename='app/page.tsx'
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Page() {
const { messages, input, setInput, append } = useChat({
6 changes: 3 additions & 3 deletions content/cookbook/01-next/75-human-in-the-loop.mdx
@@ -17,7 +17,7 @@ On the frontend, use the `useChat` hook to manage the message state and user int
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -150,7 +150,7 @@ You can check if the tool requiring confirmation has been called and, if so, pre
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit, addToolResult } =
@@ -559,7 +559,7 @@ Finally, update the frontend to use the new `getToolsRequiringConfirmation` func
```tsx filename="app/page.tsx"
'use client';

- import { Message, useChat } from 'ai/react';
+ import { Message, useChat } from '@ai-sdk/react';
import {
APPROVAL,
getToolsRequiringConfirmation,
@@ -32,7 +32,7 @@ This can be useful if you want to reduce the amount of data sent to the server.
```typescript filename='app/page.tsx' highlight="7-10"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat({
@@ -45,7 +45,7 @@ Let's build an assistant that gets the weather for any city by calling the `getW
'use client';

import { ToolInvocation } from 'ai';
- import { Message, useChat } from 'ai/react';
+ import { Message, useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit, addToolResult } =
8 changes: 4 additions & 4 deletions content/docs/02-getting-started/02-nextjs-app-router.mdx
@@ -38,7 +38,7 @@ Navigate to the newly created directory:

### Install dependencies

- Install `ai` and `@ai-sdk/openai`, the AI package and AI SDK's [ OpenAI provider ](/providers/ai-sdk-providers/openai) respectively.
+ Install `ai`, `@ai-sdk/react`, and `@ai-sdk/openai`, the AI package, AI SDK's React hooks, and AI SDK's [ OpenAI provider ](/providers/ai-sdk-providers/openai) respectively.

<Note>
The AI SDK is designed to be a unified interface to interact with any large
@@ -125,7 +125,7 @@ Update your root page (`app/page.tsx`) with the following code to show a list of
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -240,7 +240,7 @@ To display the tool invocations in your UI, update your `app/page.tsx` file:
```tsx filename="app/page.tsx" highlight="12-16"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -287,7 +287,7 @@ Modify your `app/page.tsx` file to include the `maxSteps` option:
```tsx filename="app/page.tsx" highlight="7"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat({
8 changes: 4 additions & 4 deletions content/docs/02-getting-started/03-nextjs-pages-router.mdx
@@ -38,7 +38,7 @@ Navigate to the newly created directory:

### Install dependencies

- Install `ai` and `@ai-sdk/openai`, the AI SDK's OpenAI provider.
+ Install `ai`, `@ai-sdk/react`, and `@ai-sdk/openai`, the AI package, AI SDK's React hooks, and AI SDK's [ OpenAI provider ](/providers/ai-sdk-providers/openai) respectively.

<Note>
The AI SDK is designed to be a unified interface to interact with any large
@@ -129,7 +129,7 @@ Now that you have an API route that can query an LLM, it's time to setup your fr
Update your root page (`pages/index.tsx`) with the following code to show a list of chat messages and provide a user message input:

```tsx filename="pages/index.tsx"
- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -238,7 +238,7 @@ Notice the blank response in the UI? This is because instead of generating a tex
To display the tool invocations in your UI, update your `pages/index.tsx` file:

```tsx filename="pages/index.tsx" highlight="11-15"
- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -283,7 +283,7 @@ To solve this, you can enable multi-step tool calls using the `maxSteps` option
Modify your `pages/index.tsx` file to include the `maxSteps` option:

```tsx filename="pages/index.tsx" highlight="6"
- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat({
8 changes: 4 additions & 4 deletions content/docs/02-guides/01-rag-chatbot.mdx
@@ -252,9 +252,9 @@ This function will take an input string and split it by periods, filtering out a

You will use the AI SDK to create embeddings. This will require two more dependencies, which you can install by running the following command:

- <Snippet text="pnpm add ai @ai-sdk/openai" />
+ <Snippet text="pnpm add ai @ai-sdk/react @ai-sdk/openai" />

- This will install the [AI SDK](https://sdk.vercel.ai/docs) and the [OpenAI provider](/providers/ai-sdk-providers/openai).
+ This will install the [AI SDK](https://sdk.vercel.ai/docs), AI SDK's React hooks, and AI SDK's [OpenAI provider](/providers/ai-sdk-providers/openai).

<Note>
The AI SDK is designed to be a unified interface to interact with any large
@@ -381,7 +381,7 @@ Replace your root page (`app/page.tsx`) with the following code.
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -551,7 +551,7 @@ Let’s make a few changes in the UI to communicate to the user when a tool has
```tsx filename="app/page.tsx" highlight="14-22"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
10 changes: 5 additions & 5 deletions content/docs/02-guides/02-multi-modal-chatbot.mdx
@@ -50,13 +50,13 @@ Install `ai` and `@ai-sdk/openai`, the Vercel AI package and the AI SDK's [ Open
<div className="my-4">
<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>
- <Snippet text="pnpm add ai @ai-sdk/openai" dark />
+ <Snippet text="pnpm add ai @ai-sdk/react @ai-sdk/openai" dark />
</Tab>
<Tab>
- <Snippet text="npm install ai @ai-sdk/openai" dark />
+ <Snippet text="npm install ai @ai-sdk/react @ai-sdk/openai" dark />
</Tab>
<Tab>
- <Snippet text="yarn add ai @ai-sdk/openai" dark />
+ <Snippet text="yarn add ai @ai-sdk/react @ai-sdk/openai" dark />
</Tab>
</Tabs>
</div>
@@ -133,7 +133,7 @@ Update your root page (`app/page.tsx`) with the following code to show a list of
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
@@ -183,7 +183,7 @@ Update your root page (`app/page.tsx`) with the following code:
```tsx filename="app/page.tsx" highlight="4-5,10-11,19-33,39-49,51-61"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';
import { useRef, useState } from 'react';
import Image from 'next/image';

2 changes: 1 addition & 1 deletion content/docs/02-guides/03-llama-3_1.mdx
@@ -239,7 +239,7 @@ export async function POST(req: Request) {
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Page() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
2 changes: 1 addition & 1 deletion content/docs/02-guides/04-o1.mdx
@@ -206,7 +206,7 @@ export async function POST(req: Request) {
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Page() {
const { messages, input, handleInputChange, handleSubmit, error } = useChat();
2 changes: 1 addition & 1 deletion content/docs/02-guides/04-o3.mdx
@@ -181,7 +181,7 @@ Finally, update the root page (`app/page.tsx`) to use the `useChat` hook:
```tsx filename="app/page.tsx"
'use client';

- import { useChat } from 'ai/react';
+ import { useChat } from '@ai-sdk/react';

export default function Page() {
const { messages, input, handleInputChange, handleSubmit, error } = useChat();