
generateText and useChat work as expected, but streamText doesn't work at all! #3154

Closed
heinergiehl opened this issue Sep 28, 2024 · 8 comments

Comments

@heinergiehl

Description

Using useChat and streamText from "ai" leaves the /api/chat API route stuck in a pending state.
Here is my package.json:

```json
{
  "name": "clerk",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "dev": "next dev --turbo",
    "build": "next build",
    "start": "next start",
    "lint": "next lint"
  },
  "dependencies": {
    "@ai-sdk/openai": "^0.0.62",
    "@aws-sdk/client-s3": "^3.658.1",
    "@aws-sdk/s3-request-presigner": "^3.658.1",
    "@clerk/nextjs": "^5.6.3",
    "@hookform/resolvers": "^3.9.0",
    "@langchain/community": "^0.3.3",
    "@langchain/core": "^0.3.3",
    "@pinecone-database/doc-splitter": "^0.0.1",
    "@pinecone-database/pinecone": "^3.0.3",
    "@prisma/client": "5.20.0",
    "@radix-ui/react-accordion": "^1.2.0",
    "@radix-ui/react-alert-dialog": "^1.1.1",
    "@radix-ui/react-aspect-ratio": "^1.1.0",
    "@radix-ui/react-avatar": "^1.1.0",
    "@radix-ui/react-checkbox": "^1.1.1",
    "@radix-ui/react-collapsible": "^1.1.0",
    "@radix-ui/react-context-menu": "^2.2.1",
    "@radix-ui/react-dialog": "^1.1.1",
    "@radix-ui/react-dropdown-menu": "^2.1.1",
    "@radix-ui/react-hover-card": "^1.1.1",
    "@radix-ui/react-icons": "^1.3.0",
    "@radix-ui/react-label": "^2.1.0",
    "@radix-ui/react-menubar": "^1.1.1",
    "@radix-ui/react-navigation-menu": "^1.2.0",
    "@radix-ui/react-popover": "^1.1.1",
    "@radix-ui/react-progress": "^1.1.0",
    "@radix-ui/react-radio-group": "^1.2.0",
    "@radix-ui/react-scroll-area": "^1.1.0",
    "@radix-ui/react-select": "^2.1.1",
    "@radix-ui/react-separator": "^1.1.0",
    "@radix-ui/react-slider": "^1.2.0",
    "@radix-ui/react-slot": "^1.1.0",
    "@radix-ui/react-switch": "^1.1.0",
    "@radix-ui/react-tabs": "^1.1.0",
    "@radix-ui/react-toast": "^1.2.1",
    "@radix-ui/react-toggle": "^1.1.0",
    "@radix-ui/react-toggle-group": "^1.1.0",
    "@radix-ui/react-tooltip": "^1.1.2",
    "@react-pdf-viewer/core": "^3.12.0",
    "@react-pdf-viewer/default-layout": "^3.12.0",
    "@tanstack/react-query": "^5.56.2",
    "@tanstack/react-query-devtools": "^5.58.0",
    "@types/canvas-confetti": "^1.6.4",
    "@types/md5": "^2.3.5",
    "add": "^2.0.6",
    "canvas-confetti": "^1.9.3",
    "class-variance-authority": "^0.7.0",
    "clsx": "^2.1.1",
    "cmdk": "1.0.0",
    "cobe": "^0.6.3",
    "date-fns": "^4.1.0",
    "embla-carousel-react": "^8.3.0",
    "framer-motion": "^11.9.0",
    "input-otp": "^1.2.4",
    "install": "^0.13.0",
    "langchain": "^0.3.2",
    "lucide-react": "^0.446.0",
    "md5": "^2.3.0",
    "next": "14.2.5",
    "next-themes": "^0.3.0",
    "openai-edge": "^1.2.2",
    "pdf-parse": "^1.1.1",
    "prisma": "^5.20.0",
    "react": "^18.3.1",
    "react-day-picker": "8.10.1",
    "react-dom": "^18.3.1",
    "react-dropzone": "^14.2.3",
    "react-hook-form": "^7.53.0",
    "react-icon-cloud": "^4.1.4",
    "react-resizable-panels": "^2.1.3",
    "react-spring": "^9.7.4",
    "react-tweet": "^3.2.1",
    "recharts": "^2.12.7",
    "shadcn": "^2.1.0",
    "sonner": "^1.5.0",
    "tailwind-merge": "^2.5.2",
    "tailwindcss-animate": "^1.0.7",
    "vaul": "^1.0.0",
    "zod": "^3.23.8"
  },
  "devDependencies": {
    "@types/node": "^20.16.10",
    "@types/react": "^18.3.10",
    "@types/react-dom": "^18.3.0",
    "eslint": "^8.57.1",
    "eslint-config-next": "14.2.5",
    "postcss": "^8.4.47",
    "tailwindcss": "^3.4.13",
    "typescript": "^5.6.2"
  }
}
```

Code example

ChatComponent:

```tsx
"use client"
import React from "react"
import { Input } from "./ui/input"
import { useChat } from "ai/react"
import { Button } from "./ui/button"
import { Send } from "lucide-react"
import MessageList from "./MessageList"
import { useQuery } from "@tanstack/react-query"
import { getMessages } from "@/app/actions/get-messages"
type Props = { chatId: number }
const ChatComponent = ({ chatId }: Props) => {
  const { data, isLoading } = useQuery({
    queryKey: ["chat", chatId],
    queryFn: async () => {
      const response = await getMessages({ chatId })
      return response
    },
  })
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat",
    body: {
      chatId,
    },
    initialMessages: data || [],
  })
  React.useEffect(() => {
    const messageContainer = document.getElementById("message-container")
    if (messageContainer) {
      messageContainer.scrollTo({
        top: messageContainer.scrollHeight,
        behavior: "smooth",
      })
    }
  }, [messages])
  return (
    <div className="relative h-screen" id="message-container">
      {/* header */}
      <div className="sticky top-0 inset-x-0 p-2 bg-white h-fit">
        <h3 className="text-xl font-bold">Chat</h3>
      </div>
      {/* message list */}
      <MessageList messages={messages} isLoading={isLoading} />
      <form
        onSubmit={handleSubmit}
        className="sticky bottom-0 inset-x-0 px-2 py-4 bg-white"
      >
        <div className="flex">
          <Input
            value={input}
            onChange={handleInputChange}
            placeholder="Ask any question..."
            className="w-full"
          />
          <Button className="bg-blue-600 ml-2">
            <Send className="h-4 w-4" />
          </Button>
        </div>
      </form>
    </div>
  )
}
export default ChatComponent
```

and API route:

```ts
import { getContext } from "@/lib/context"
import prisma from "@/lib/db"
import { Message, Role } from "@prisma/client"
import { openai } from "@ai-sdk/openai"
import { convertToCoreMessages, streamText, generateText, StreamData } from "ai"
import { NextRequest, NextResponse } from "next/server"
// Allow streaming responses up to 10 seconds
export const maxDuration = 10
export const POST = async (req: NextRequest) => {
  type Props = {
    messages: Message[]
    chatId: number
  }
  const data = await req.json()
  const { messages, chatId } = data
  console.log("messages", messages, "chatId", chatId)
  const chats = await prisma.chat.findMany({
    where: {
      id: chatId,
    },
  })
  const chat = chats[0]
  const fileKey = chat.fileKey
  const lastMessage = messages[messages.length - 1]
  const context = await getContext(lastMessage.content, fileKey)
  const prompt = `AI assistant is a brand new, powerful, human-like artificial intelligence.
      The traits of AI include expert knowledge, helpfulness, cleverness, and articulateness.
      AI is a well-behaved and well-mannered individual.
      AI is always friendly, kind, and inspiring, and he is eager to provide vivid and thoughtful responses to the user.
      AI has the sum of all knowledge in their brain, and is able to accurately answer nearly any question about any topic in conversation.
      AI assistant is a big fan of Pinecone and Vercel.
      START CONTEXT BLOCK
      ${context}
      END OF CONTEXT BLOCK
      AI assistant will take into account any CONTEXT BLOCK that is provided in a conversation.
      If the context does not provide the answer to the question, the AI assistant will say, "I'm sorry, but I don't know the answer to that question".
      AI assistant will not apologize for previous responses, but will instead indicate that new information was gained.
      AI assistant will not invent anything that is not drawn directly from the context.
      `
  const input = {
    system: prompt,
    model: openai("gpt-3.5-turbo"),
    messages: convertToCoreMessages(messages),
  }
  // update user message database
  await prisma.message.create({
    data: {
      content: lastMessage.content,
      chatId,
      role: Role.user,
    },
  })
  try {
    const result = await streamText(input)
    await prisma.message.create({
      data: {
        content: await result.text,
        chatId,
        role: Role.system,
      },
    })
    return result.toDataStreamResponse({})
  } catch (error) {
    console.log("error calling openai api", error)
    throw error
  }
}
```

In the Network tab of my Chrome browser I only see a pending request, and then nothing happens. No errors, nothing.

Additional context

No response

@heinergiehl
Copy link
Author

heinergiehl commented Sep 28, 2024

Here you see the pending status.
(screenshot)

And here is some more information about the request:

```
Request URL:
http://localhost:3000/api/chat
Referrer Policy:
strict-origin-when-cross-origin
accept:
*/*
accept-encoding:
gzip, deflate, br, zstd
accept-language:
en-US,en;q=0.9
connection:
keep-alive
content-length:
2908
content-type:
application/json
cookie:
_ga=GA1.1.1671624404.1723726453; sb-xrijkesniomonfjjzpnb-auth-token=base64-eyJhY2Nlc3NfdG9rZW4iOiJleUpoYkdjaU9pSklVekkxTmlJc0ltdHBaQ0k2SW5aTlUwbGFXbXQ0VDBVMWFtdzRhRFVpTENKMGVYQWlPaUpLVjFRaWZRLmV5SnBjM01pT2lKb2RIUndjem92TDNoeWFXcHJaWE51YVc5dGIyNW1hbXA2Y0c1aUxuTjFjR0ZpWVhObExtTnZMMkYxZEdndmRqRWlMQ0p6ZFdJaU9pSTNaV1ZoWW1Nd1pTMDJNall6TFRRek5qTXRZbVJpWlMxaFpqYzVZell6TkdJMVpERWlMQ0poZFdRaU9pSmhkWFJvWlc1MGFXTmhkR1ZrSWl3aVpYaHdJam94TnpJME56azFPVFUwTENKcFlYUWlPakUzTWpRM09USXpOVFFzSW1WdFlXbHNJam9pYUdWcGJtVnlMbWRwWldoc1FIUjFMV1J2Y25SdGRXNWtMbVJsSWl3aWNHaHZibVVpT2lJaUxDSmhjSEJmYldWMFlXUmhkR0VpT25zaWNISnZkbWxrWlhJaU9pSmxiV0ZwYkNJc0luQnliM1pwWkdWeWN5STZXeUpsYldGcGJDSmRmU3dpZFhObGNsOXRaWFJoWkdGMFlTSTZleUpsYldGcGJDSTZJbWhsYVc1bGNpNW5hV1ZvYkVCMGRTMWtiM0owYlhWdVpDNWtaU0lzSW1WdFlXbHNYM1psY21sbWFXVmtJanBtWVd4elpTd2ljR2h2Ym1WZmRtVnlhV1pwWldRaU9tWmhiSE5sTENKemRXSWlPaUkzWldWaFltTXdaUzAyTWpZekxUUXpOak10WW1SaVpTMWhaamM1WXpZek5HSTFaREVpZlN3aWNtOXNaU0k2SW1GMWRHaGxiblJwWTJGMFpXUWlMQ0poWVd3aU9pSmhZV3d4SWl3aVlXMXlJanBiZXlKdFpYUm9iMlFpT2lKbGJXRnBiQzl6YVdkdWRYQWlMQ0owYVcxbGMzUmhiWEFpT2pFM01qUTNPVEl6TlRSOVhTd2ljMlZ6YzJsdmJsOXBaQ0k2SW1JME9HWTRZMkl4TFdNNFlXRXROR00xWkMwNE1EVmxMV05pTkdJM09ERmpaV1EzWlNJc0ltbHpYMkZ1YjI1NWJXOTFjeUk2Wm1Gc2MyVjkudVNqNEpLblBydGN6V3lJS0FsalkxN2JIMU15WXppYThjV0RIaFhKT3pENCIsInRva2VuX3R5cGUiOiJiZWFyZXIiLCJleHBpcmVzX2luIjozNjAwLCJleHBpcmVzX2F0IjoxNzI0Nzk1OTU0LCJyZWZyZXNoX3Rva2VuIjoibnBmUTRzMUpRTzd6VldHdVlTWHkwUSIsInVzZXIiOnsiaWQiOiI3ZWVhYmMwZS02MjYzLTQzNjMtYmRiZS1hZjc5YzYzNGI1ZDEiLCJhdWQiOiJhdXRoZW50aWNhdGVkIiwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJlbWFpbCI6ImhlaW5lci5naWVobEB0dS1kb3J0bXVuZC5kZSIsImVtYWlsX2NvbmZpcm1lZF9hdCI6IjIwMjQtMDgtMjdUMjA6NTk6MTMuODk1Njc4WiIsInBob25lIjoiIiwiY29uZmlybWF0aW9uX3NlbnRfYXQiOiIyMDI0LTA4LTI3VDIwOjU4OjU0LjkxNjgzOFoiLCJjb25maXJtZWRfYXQiOiIyMDI0LTA4LTI3VDIwOjU5OjEzLjg5NTY3OFoiLCJsYXN0X3NpZ25faW5fYXQiOiIyMDI0LTA4LTI3VDIwOjU5OjE0LjQxNjgyMzQwMVoiLCJhcHBfbWV0YWRhdGEiOnsicHJvdmlkZXIiOiJlbWFpbCIsInByb3ZpZGVycyI6WyJlbWFpbCJdfSwidXNlcl9tZXRhZGF0YSI6eyJlbWFpbCI6Imhla
W5lci5naWVobEB0dS1kb3J0bXVuZC5kZSIsImVtYWlsX3ZlcmlmaWVkIjpmYWxzZSwicGhvbmVfdmVyaWZpZWQiOmZhbHNlLCJzdWIiOiI3ZWVhYmMwZS02MjYzLTQzNjMtYmRiZS1hZjc5YzYzNGI1ZDEifSwiaWRlbnRpdGllcyI6W3siaWRlbnRpdHlfaWQiOiIwZWJkMTU3OC1kNTIzLTRjOGQtYjUzYS1jZGM5MjFiNWExMTIiLCJpZCI6IjdlZWFiYzBlLTYyNjMtNDM2My1iZGJlLWFmNzljNjM0YjVkMSIsInVzZXJfaWQiOiI3ZWVhYmMwZS02MjYzLTQzNjMtYmRiZS1hZjc5YzYzNGI1ZDEiLCJpZGVudGl0eV9kYXRhIjp7ImVtYWlsIjoiaGVpbmVyLmdpZWhsQHR1LWRvcnRtdW5kLmRlIiwiZW1haWxfdmVyaWZpZWQiOmZhbHNlLCJwaG9uZV92ZXJpZmllZCI6ZmFsc2UsInN1YiI6IjdlZWFiYzBlLTYyNjMtNDM2My1iZGJlLWFmNzljNjM0YjVkMSJ9LCJwcm92aWRlciI6ImVtYWlsIiwibGFzdF9zaWduX2luX2F0IjoiMjAyNC0wOC0yN1QyMDo1ODo1NC44Nzk5MjZaIiwiY3JlYXRlZF9hdCI6IjIwMjQtMDgtMjdUMjA6NTg6NTQuODc5OTgxWiIsInVwZGF0ZWRfYXQiOiIyMDI0LTA4LTI3VDIwOjU4OjU0Ljg3OTk4MVoiLCJlbWFpbCI6ImhlaW5lci5naWVobEB0dS1kb3J0bXVuZC5kZSJ9XSwiY3JlYXRlZF9hdCI6IjIwMjQtMDgtMjdUMjA6NTg6NTQuODQ5NDVaIiwidXBkYXRlZF9hdCI6IjIwMjQtMDgtMjdUMjA6NTk6MTQuNDI0MDYyWiIsImlzX2Fub255bW91cyI6ZmFsc2V9fQ; __clerk_db_jwt_gTmQwYFN=dvb_2lGCWFnGrRK84DTLrSC7SA2wNaq; __stripe_mid=6bc66d0d-2e0d-4f05-b523-d442a4cd9278b6f4d9; _ga_8M37TENBJS=GS1.1.1727195997.7.1.1727197257.0.0.0; __clerk_db_jwt_61-uoEHA=dvb_2mcsfOBwq6i64Qwv3NR9ysWMkKT; __session_61-uoEHA=eyJhbGciOiJSUzI1NiIsImNhdCI6ImNsX0I3ZDRQRDExMUFBQSIsImtpZCI6Imluc18ybWNzU29ndmpQTU9MSzVhV1A3dndBdGlIQXAiLCJ0eXAiOiJKV1QifQ.eyJhenAiOiJodHRwOi8vbG9jYWxob3N0OjMwMDAiLCJleHAiOjE3MjczOTc4NzAsImlhdCI6MTcyNzM5NzgxMCwiaXNzIjoiaHR0cHM6Ly9zdHVubmluZy1xdWFpbC04Ny5jbGVyay5hY2NvdW50cy5kZXYiLCJuYmYiOjE3MjczOTc4MDAsInNpZCI6InNlc3NfMm1jc3c4UkJaYndmSG9mbEwzVWRLVlZpVGp6Iiwic3ViIjoidXNlcl8ybWNzd0FXNWhqT0lISlNBdXVJU3BIYWRyOWsifQ.WPaGiiBbKuOUyfuGXMJSFuV8ORaQYysaY4t9jLdVsGhZNcgIkSKNCQbzyVK4VG3etoj3CAloPf9Vy7-KsSMG-IOyeWCTuRAl5jO9aLqn6fgvirJA1x41eirJY3MfdqvYHg0O3N-kzqZzSH-XiW-bGqWmmm6xqCh80uXuUU68HFnNztHwZNZB_TM3zdZQj2bopOuC6aL_8U44-ujM2j3rViX6jSR-GncuWAd0R9dHbG0U2MfzN1i2w_AmhMCIGitSYRrzFwLcbcB7v5b1KMGCJaw8c1AdiY3KvLuPDelDvHqGtJek-MhVE2x6KBIZmdh5NRqkrrBbUbJzR-J3CfsQ0g; 
__clerk_db_jwt=dvb_2lGCWFnGrRK84DTLrSC7SA2wNaq; __client_uat_61-uoEHA=1727387434; __session_gTmQwYFN=eyJhbGciOiJSUzI1NiIsImNhdCI6ImNsX0I3ZDRQRDExMUFBQSIsImtpZCI6Imluc18ybEdDQllGbjJlWnRPMnR6MTdYZ2t0NmJyUDkiLCJ0eXAiOiJKV1QifQ.eyJhenAiOiJodHRwOi8vbG9jYWxob3N0OjMwMDAiLCJleHAiOjE3Mjc0ODQ4ODAsImlhdCI6MTcyNzQ4NDgyMCwiaXNzIjoiaHR0cHM6Ly9ndWlkZWQtcm9iaW4tODAuY2xlcmsuYWNjb3VudHMuZGV2IiwibmJmIjoxNzI3NDg0ODEwLCJzaWQiOiJzZXNzXzJtUUFmWUM2dXRFSjFwclhRVDBkTTZnU1gwSSIsInN1YiI6InVzZXJfMm05ZDFYY3h6OHlvQVd0bWxXdnlJQ2pGdkJ2In0.y-OwB53_A6hZlgqumZGronQOp4iuHecZ2CoQFSwAPTyeQvPQ0CqRrfcp9Ahlj20tF3x0e7V26uexm5pCEc2svK2laKtW-cSOf2PuM_BLC6PNZhqCg2JuEO4_lVCYy6RQQNj9sWdJ7f_n3d_kkY0TeIT0siXkXuemE0FH7sN-Ek-7QJImKgCQL5UcjZDg9N1OSoY_Ka6lMuEr0D5RMCD8Djh_67tTAssKR-miLPd7axjg8V6ny82ztR79Um2tFwrEQZcYz061Q3bu_590j55kh-sOfotJuVW6tB-tCBAvZr80k7VzITzhtm99h-xAnLp1BaekgNfKRWbt3D9UP2Ikog; __session=eyJhbGciOiJSUzI1NiIsImNhdCI6ImNsX0I3ZDRQRDExMUFBQSIsImtpZCI6Imluc18ybEdDQllGbjJlWnRPMnR6MTdYZ2t0NmJyUDkiLCJ0eXAiOiJKV1QifQ.eyJhenAiOiJodHRwOi8vbG9jYWxob3N0OjMwMDAiLCJleHAiOjE3Mjc0ODQ4ODAsImlhdCI6MTcyNzQ4NDgyMCwiaXNzIjoiaHR0cHM6Ly9ndWlkZWQtcm9iaW4tODAuY2xlcmsuYWNjb3VudHMuZGV2IiwibmJmIjoxNzI3NDg0ODEwLCJzaWQiOiJzZXNzXzJtUUFmWUM2dXRFSjFwclhRVDBkTTZnU1gwSSIsInN1YiI6InVzZXJfMm05ZDFYY3h6OHlvQVd0bWxXdnlJQ2pGdkJ2In0.y-OwB53_A6hZlgqumZGronQOp4iuHecZ2CoQFSwAPTyeQvPQ0CqRrfcp9Ahlj20tF3x0e7V26uexm5pCEc2svK2laKtW-cSOf2PuM_BLC6PNZhqCg2JuEO4_lVCYy6RQQNj9sWdJ7f_n3d_kkY0TeIT0siXkXuemE0FH7sN-Ek-7QJImKgCQL5UcjZDg9N1OSoY_Ka6lMuEr0D5RMCD8Djh_67tTAssKR-miLPd7axjg8V6ny82ztR79Um2tFwrEQZcYz061Q3bu_590j55kh-sOfotJuVW6tB-tCBAvZr80k7VzITzhtm99h-xAnLp1BaekgNfKRWbt3D9UP2Ikog; __client_uat_gTmQwYFN=1726998529; __client_uat=1726998529
host:
localhost:3000
origin:
http://localhost:3000
referer:
http://localhost:3000/chat/29
sec-ch-ua:
"Google Chrome";v="129", "Not=A?Brand";v="8", "Chromium";v="129"
sec-ch-ua-mobile:
?0
sec-ch-ua-platform:
"Windows"
sec-fetch-dest:
empty
sec-fetch-mode:
cors
sec-fetch-site:
same-origin
user-agent:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36
```

Thanks for the help :)

@lgrammel
Collaborator

You are not importing `ai` in your package.json?

@heinergiehl
Author

> You are not importing `ai` in your package.json?

Thx MrSir, I would not have seen this. You are correct, and I installed it. However, the same problem still persists :(

@heinergiehl
Author

> You are not importing `ai` in your package.json?

Here is the updated package.json:
```json
{
  "name": "clerk",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "dev": "next dev --turbo",
    "build": "next build",
    "start": "next start",
    "lint": "next lint"
  },
  "dependencies": {
    "@ai-sdk/openai": "^0.0.62",
    "@aws-sdk/client-s3": "^3.658.1",
    "@aws-sdk/s3-request-presigner": "^3.658.1",
    "@clerk/nextjs": "^5.6.3",
    "@hookform/resolvers": "^3.9.0",
    "@langchain/community": "^0.3.3",
    "@langchain/core": "^0.3.3",
    "@pinecone-database/doc-splitter": "^0.0.1",
    "@pinecone-database/pinecone": "^3.0.3",
    "@prisma/client": "5.20.0",
    "@radix-ui/react-accordion": "^1.2.0",
    "@radix-ui/react-alert-dialog": "^1.1.1",
    "@radix-ui/react-aspect-ratio": "^1.1.0",
    "@radix-ui/react-avatar": "^1.1.0",
    "@radix-ui/react-checkbox": "^1.1.1",
    "@radix-ui/react-collapsible": "^1.1.0",
    "@radix-ui/react-context-menu": "^2.2.1",
    "@radix-ui/react-dialog": "^1.1.1",
    "@radix-ui/react-dropdown-menu": "^2.1.1",
    "@radix-ui/react-hover-card": "^1.1.1",
    "@radix-ui/react-icons": "^1.3.0",
    "@radix-ui/react-label": "^2.1.0",
    "@radix-ui/react-menubar": "^1.1.1",
    "@radix-ui/react-navigation-menu": "^1.2.0",
    "@radix-ui/react-popover": "^1.1.1",
    "@radix-ui/react-progress": "^1.1.0",
    "@radix-ui/react-radio-group": "^1.2.0",
    "@radix-ui/react-scroll-area": "^1.1.0",
    "@radix-ui/react-select": "^2.1.1",
    "@radix-ui/react-separator": "^1.1.0",
    "@radix-ui/react-slider": "^1.2.0",
    "@radix-ui/react-slot": "^1.1.0",
    "@radix-ui/react-switch": "^1.1.0",
    "@radix-ui/react-tabs": "^1.1.0",
    "@radix-ui/react-toast": "^1.2.1",
    "@radix-ui/react-toggle": "^1.1.0",
    "@radix-ui/react-toggle-group": "^1.1.0",
    "@radix-ui/react-tooltip": "^1.1.2",
    "@react-pdf-viewer/core": "^3.12.0",
    "@react-pdf-viewer/default-layout": "^3.12.0",
    "@tanstack/react-query": "^5.56.2",
    "@tanstack/react-query-devtools": "^5.58.0",
    "@types/canvas-confetti": "^1.6.4",
    "@types/md5": "^2.3.5",
    "add": "^2.0.6",
    "ai": "^3.4.7",
    "canvas-confetti": "^1.9.3",
    "class-variance-authority": "^0.7.0",
    "clsx": "^2.1.1",
    "cmdk": "1.0.0",
    "cobe": "^0.6.3",
    "date-fns": "^4.1.0",
    "embla-carousel-react": "^8.3.0",
    "framer-motion": "^11.9.0",
    "input-otp": "^1.2.4",
    "install": "^0.13.0",
    "langchain": "^0.3.2",
    "lucide-react": "^0.446.0",
    "md5": "^2.3.0",
    "next": "14.2.5",
    "next-themes": "^0.3.0",
    "openai-edge": "^1.2.2",
    "pdf-parse": "^1.1.1",
    "prisma": "^5.20.0",
    "react": "^18.3.1",
    "react-day-picker": "8.10.1",
    "react-dom": "^18.3.1",
    "react-dropzone": "^14.2.3",
    "react-hook-form": "^7.53.0",
    "react-icon-cloud": "^4.1.4",
    "react-resizable-panels": "^2.1.3",
    "react-spring": "^9.7.4",
    "react-tweet": "^3.2.1",
    "recharts": "^2.12.7",
    "shadcn": "^2.1.0",
    "sonner": "^1.5.0",
    "tailwind-merge": "^2.5.2",
    "tailwindcss-animate": "^1.0.7",
    "vaul": "^1.0.0",
    "zod": "^3.23.8"
  },
  "devDependencies": {
    "@types/node": "^20.16.10",
    "@types/react": "^18.3.10",
    "@types/react-dom": "^18.3.0",
    "eslint": "^8.57.1",
    "eslint-config-next": "14.2.5",
    "postcss": "^8.4.47",
    "tailwindcss": "^3.4.13",
    "typescript": "^5.6.2"
  }
}
```

@nicoalbanese
Contributor

Try removing the code below from your try block as it's blocking the streaming response.

    await prisma.message.create({
      data: {
        content: await result.text,
        chatId,
        role: Role.system,
      },
    })

You can instead use `onFinish` with `streamText` to add the message to the database.
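To see why the `await result.text` call blocks streaming: `result.text` only resolves after the entire generation is complete, so the route never returns the response until the model has finished. The toy model below illustrates the two patterns; it is not the AI SDK's actual implementation, and `generate` is a hypothetical stand-in for the model:

```javascript
// `generate` stands in for the model: it yields chunks one at a time.
function* generate() {
  yield "Hello"
  yield ", "
  yield "world"
}

// Blocking pattern (the bug): collect the full text before responding.
// The client receives nothing until every chunk has been produced, so the
// request just sits in "pending".
function blockingResponse() {
  const events = []
  let full = ""
  for (const chunk of generate()) full += chunk // waits for everything
  events.push(["respond", full]) // first byte reaches the client only here
  return events
}

// Streaming pattern (the fix): forward each chunk as it arrives, and do
// bookkeeping (e.g. the Prisma insert) in an onFinish-style callback that
// runs after the stream has been sent.
function streamingResponse(onFinish) {
  const events = []
  let full = ""
  for (const chunk of generate()) {
    events.push(["send", chunk]) // client sees tokens immediately
    full += chunk
  }
  onFinish(full) // persist the completed message without blocking the stream
  return events
}
```

The key difference: in the streaming pattern the first `["send", …]` event happens before the text is complete, whereas the blocking pattern emits a single event only after the whole generation.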

@heinergiehl
Author

> Try removing the code below from your try block as it's blocking the streaming response.
>
>     await prisma.message.create({
>       data: {
>         content: await result.text,
>         chatId,
>         role: Role.system,
>       },
>     })
>
> You can instead use onFinish with streamText to add the message to the database.

Thanks, MrSir, you 100% nailed it. It had nothing to do with the beautiful Vercel AI package; it was due to Prisma. Putting the Prisma call in the `onFinish` callback fixed it completely, and streaming is working like a charm.

```ts
const input = {
  model: google("gemini-1.5-flash", {}),
  system: prompt,
  messages: convertToCoreMessages(messages),
  onFinish: async (data: any) => {
    console.log("onFinish", data)
    const message = await prisma.message.create({
      data: {
        content: data.text, // text is already a string inside onFinish
        chatId,
        role: Role.system,
      },
    })
    // Trigger a Pusher event
    await pusherServer.trigger(`chat-${chatId}`, "new-message", message)
  },
}
```

@arvindanta

Hi folks, can we use useChat with generateText?
useChat hits an endpoint at /api/chat. How should I return the response from generateText?

For streamText, I am able to do something like `response.toDataStreamResponse()`.

Note: I am using the Next.js app router.

@lgrammel
Collaborator

@arvindanta you can set useChat to the text protocol, and then just return a response that contains the text.
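A sketch of what that could look like. Assumptions: a recent AI SDK version where `useChat` accepts a `streamProtocol: "text"` option; the client/server wiring in the comments is illustrative, and only the plain-text response shape below is runnable (no API key needed):

```javascript
// Server sketch (hypothetical route; generateText needs a real model/key):
//   const { text } = await generateText({ model, messages })
//   return new Response(text)
//
// Client sketch:
//   useChat({ api: "/api/chat", streamProtocol: "text" })
//
// The text protocol carries no data-stream framing: the body is just the
// text. A stand-in for the server's return value (Response is a global in
// Node 18+ and in Next.js route handlers):
function textProtocolResponse(text) {
  return new Response(text, {
    headers: { "content-type": "text/plain; charset=utf-8" },
  })
}

const res = textProtocolResponse("generated answer")
```

Since the body is plain text, the non-streaming `generateText` result can be returned directly; the client simply renders whatever text arrives.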
