Chat With Gemini Streaming Langchain #121

Open

wants to merge 22 commits into base: main

Changes from all commits (22 commits)
b5398f6
added chat-with-openai-streaming-langchain
skushagra9 Dec 19, 2023
eb5a09a
Merge branch 'stackwiseai:main' into new
skushagra9 Dec 19, 2023
0fc6a7d
deleted yarn.lock
skushagra9 Dec 19, 2023
771f2e4
added edge function to chat-with-openai-streaming-langchain
skushagra9 Dec 19, 2023
9420ddd
Update route.ts
skushagra9 Dec 19, 2023
bc4b382
Merge branch 'stackwiseai:main' into new
skushagra9 Dec 19, 2023
6b3d520
Merge branch 'stackwiseai:main' into new
skushagra9 Dec 20, 2023
068d203
Added Streaming Functionality with Langchain
skushagra9 Dec 20, 2023
57baff2
Update chat-with-openai-streaming-langchain.tsx
skushagra9 Dec 20, 2023
b521325
Update route.ts
skushagra9 Dec 20, 2023
5c9b77f
Merge branch 'stackwiseai:main' into new
skushagra9 Dec 20, 2023
307c592
Create chat-with-openai-streaming-langchain.png
skushagra9 Dec 20, 2023
443687c
Some Basic Functionality
skushagra9 Dec 20, 2023
89a729b
Merge branch 'new' of https://github.com/skushagra9/stackwise into new
skushagra9 Dec 20, 2023
8b14016
Revert "Some Basic Functionality"
skushagra9 Dec 20, 2023
839f445
Merge branch 'stackwiseai:main' into new
skushagra9 Dec 21, 2023
3d2cbc5
Added the chat with gemini streaming langchain
skushagra9 Dec 21, 2023
239de68
File changes
skushagra9 Dec 21, 2023
0a4b20d
Merge branch 'stackwiseai:main' into new
skushagra9 Dec 21, 2023
f327cf3
Create chat-with-gemini-streaming-langchain.png
skushagra9 Dec 21, 2023
f42eafa
Merge branch 'new' of https://github.com/skushagra9/stackwise into new
skushagra9 Dec 21, 2023
f234f13
Merge branch 'stackwiseai:main' into new
skushagra9 Dec 26, 2023
47 changes: 47 additions & 0 deletions ui/app/api/chat-with-gemini-streaming-langchain/route.ts
@@ -0,0 +1,47 @@
import { NextRequest, NextResponse } from "next/server";
import { Message as VercelChatMessage, StreamingTextResponse } from "ai";
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
import { BytesOutputParser } from "langchain/schema/output_parser";
import { PromptTemplate } from "langchain/prompts";

// Run this route on the Edge runtime so the response can be streamed.
export const runtime = "edge";

// Flatten a Vercel AI SDK message into a single "role: content" line for the prompt.
const formatMessage = (message: VercelChatMessage) => {
  return `${message.role}: ${message.content}`;
};

const TEMPLATE = `
Current conversation:
{chat_history}

User: {input}
AI:`;

export async function POST(req: NextRequest) {
  try {
    const body = await req.json();
    const messages = body.messages ?? [];
    // Everything except the last message becomes the chat history; the last one is the new user turn.
    const formattedPreviousMessages = messages.slice(0, -1).map(formatMessage);
    const currentMessageContent = messages[messages.length - 1].content;
    const prompt = PromptTemplate.fromTemplate(TEMPLATE);

    // No API key is passed here, so the Gemini client is expected to pick it up from the environment.
    const model = new ChatGoogleGenerativeAI({
      temperature: 0.8,
    });

    // Pipe prompt -> Gemini -> bytes so the chain emits a byte stream the
    // Vercel AI SDK can forward straight to the client.
    const outputParser = new BytesOutputParser();
    const chain = prompt.pipe(model).pipe(outputParser);

    const stream = await chain.stream({
      chat_history: formattedPreviousMessages.join("\n"),
      input: currentMessageContent,
    });

    return new StreamingTextResponse(stream);
  } catch (e: any) {
    return NextResponse.json({ error: e.message }, { status: 500 });
  }
}
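
For reference, a minimal sketch of calling this route directly and reading the streamed bytes, without the useChat hook used by the component below. The askGemini helper is hypothetical; the request body matches the { messages } array the route reads, and it is assumed the server has a Google API key available in its environment (for example GOOGLE_API_KEY), since the route does not pass one explicitly.

// Hypothetical helper: POST a message list to the route and read the streamed reply.
// Assumes the route above is deployed in the same app and authenticated via env vars.
async function askGemini(question: string): Promise<string> {
  const res = await fetch('/api/chat-with-gemini-streaming-langchain', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      messages: [{ role: 'user', content: question }],
    }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Request failed with status ${res.status}`);
  }

  // The route streams plain text bytes; decode chunks as they arrive.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let reply = '';
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    reply += decoder.decode(value, { stream: true });
  }
  return reply;
}

Calling askGemini('Hello') should resolve once the stream finishes, returning the concatenated assistant reply.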
76 changes: 76 additions & 0 deletions ui/app/components/stacks/chat-with-gemini-streaming-langchain.tsx
@@ -0,0 +1,76 @@
import React, { useState, useEffect } from 'react';
import { IoSend } from 'react-icons/io5';
import { useChat } from 'ai/react';

export const ChatWithGeminiStreaming = () => {
  const [inputValue, setInputValue] = useState('');
  // useChat wires the input, message list, and submit handler to the streaming route above.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat-with-gemini-streaming-langchain',
  });

  const [loading, setLoading] = useState(false);

  // Only the most recent assistant message is rendered in the output panel.
  const latestAssistantResponse = messages
    .filter((m) => m.role === 'assistant')
    .map((m) => m.content)
    .pop();

  const handleFormSubmit = async (event) => {
    event.preventDefault();
    setLoading(true);

    try {
      await handleSubmit(event);
    } finally {
      setLoading(false);
    }
  };

  // Clear the local input once a new assistant response has arrived.
  useEffect(() => {
    if (latestAssistantResponse !== undefined) {
      setInputValue('');
    }
  }, [latestAssistantResponse]);

  return (
    <div className="w-3/4 md:w-1/2">
      <form onSubmit={handleFormSubmit} className="flex flex-col">
        <div className="relative w-full">
          <input
            type="text"
            value={inputValue || input}
            onChange={(e) => {
              setInputValue(e.target.value);
              handleInputChange(e);
            }}
            placeholder="Ask anything..."
            className="focus:shadow-outline w-full rounded-full border border-gray-400 py-2 pl-4 pr-10 focus:outline-none"
            onKeyDown={(e) => {
              if (e.key === 'Enter') handleFormSubmit(e);
            }}
          />
          <button
            type="submit"
            className={`focus:shadow-outline absolute right-0 top-0 h-full cursor-pointer rounded-r-full px-4 font-bold text-black focus:outline-none ${
              loading ? 'cursor-not-allowed opacity-50' : ''
            }`}
            disabled={loading}
          >
            <IoSend />
          </button>
        </div>
      </form>
      <div className="min-h-4 mt-4 max-h-96 w-full overflow-auto rounded-md bg-[#faf0e6] p-4 md:max-h-[28rem]">
        {loading ? (
          <span className="text-sm text-gray-400">Generating... </span>
        ) : latestAssistantResponse ? (
          latestAssistantResponse
        ) : (
          <p className="text-sm text-gray-400">Output here...</p>
        )}
      </div>
    </div>
  );
};

export default ChatWithGeminiStreaming;
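
As a usage sketch (not part of this PR), the stack can be mounted like any other client component. The import path and the @/ alias below are assumptions based on the file location shown in the diff; the page wrapper is only illustrative.

'use client';

// Illustrative page that mounts the stack; the surrounding layout is an assumption.
import ChatWithGeminiStreaming from '@/app/components/stacks/chat-with-gemini-streaming-langchain';

export default function Page() {
  return (
    <main className="flex min-h-screen items-center justify-center">
      <ChatWithGeminiStreaming />
    </main>
  );
}

Because the component is the default export, the local import name can be anything; the component only needs to render inside a client boundary since it relies on React state and the useChat hook.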
69 changes: 0 additions & 69 deletions ui/app/components/stacks/kushagra-stack.tsx

This file was deleted.
