Add characterization tests for basic langchain text responses and stream data. #676
Merged
Commits (14)
446c81c  Introduce x-flush-delay header to speed up tests. (lgrammel)
e89405f  Add langchain dev dependency for testing. (lgrammel)
24e1dbd  Add basic text and stream data text tests for langchain. (lgrammel)
86a5745  Change test to cts to work around error in edge environment. (lgrammel)
43ef736  Revert "Change test to cts to work around error in edge environment." (lgrammel)
ed4f311  Exclude langchain test in edge environment. (lgrammel)
bbb36d2  Make port configurable. (lgrammel)
43c29d7  Use different port. (lgrammel)
4eae30c  Switch to dynamic imports for Node 16 environment. (lgrammel)
b2805df  Merge branch 'main' into lg/langchain-tests (lgrammel)
aac8f39  Update data position. (lgrammel)
c9913f8  Remove Node 16 code. (lgrammel)
e2c0a85  Mock uuid. (lgrammel)
71be52f  Merge branch 'main' into lg/langchain-tests (lgrammel)
New test file (+145 lines):

import { createClient } from '../tests/utils/mock-client';
import { setup } from '../tests/utils/mock-service';

describe('LangchainStream', () => {
  let server: ReturnType<typeof setup>;
  beforeAll(() => {
    server = setup(3031);
  });
  afterAll(() => {
    server.teardown();
  });

  // Make uuid deterministic so generated ids are stable across test runs.
  jest.mock('uuid', () => {
    let count = 0;
    return {
      v4: () => {
        return `uuid-${count++}`;
      },
    };
  });

  // Loaded via require() after jest.mock so the uuid mock is in effect.
  const { LangChainStream, StreamingTextResponse, experimental_StreamData } =
    require('.') as typeof import('.');
  const { ChatOpenAI } =
    require('langchain/chat_models/openai') as typeof import('langchain/chat_models/openai');
  const { HumanMessage } =
    require('langchain/schema') as typeof import('langchain/schema');

  function readAllChunks(response: Response) {
    return createClient(response).readAll();
  }

  it('should be able to parse SSE and receive the streamed response', async () => {
    const { stream, handlers } = LangChainStream();

    const llm = new ChatOpenAI({
      streaming: true,
      openAIApiKey: 'fake',
      configuration: {
        baseURL: server.api,
        defaultHeaders: {
          'x-mock-service': 'openai',
          'x-mock-type': 'chat',
          'x-flush-delay': '5',
        },
      },
    });

    llm.call([new HumanMessage('hello')], {}, [handlers]).catch(console.error);

    const response = new StreamingTextResponse(stream);

    expect(await readAllChunks(response)).toEqual([
      '',
      'Hello',
      ',',
      ' world',
      '.',
      '',
    ]);
  });

  describe('StreamData protocol', () => {
    it('should send text', async () => {
      const data = new experimental_StreamData();

      const { stream, handlers } = LangChainStream({
        onFinal() {
          data.close();
        },
        experimental_streamData: true,
      });

      const llm = new ChatOpenAI({
        streaming: true,
        openAIApiKey: 'fake',
        configuration: {
          baseURL: server.api,
          defaultHeaders: {
            'x-mock-service': 'openai',
            'x-mock-type': 'chat',
            'x-flush-delay': '5',
          },
        },
      });

      llm
        .call([new HumanMessage('hello')], {}, [handlers])
        .catch(console.error);

      const response = new StreamingTextResponse(stream, {}, data);

      // '0:' lines carry text chunks in the stream data protocol.
      expect(await readAllChunks(response)).toEqual([
        '0:""\n',
        '0:"Hello"\n',
        '0:","\n',
        '0:" world"\n',
        '0:"."\n',
        '0:""\n',
      ]);
    });

    it('should send text and data', async () => {
      const data = new experimental_StreamData();

      data.append({ t1: 'v1' });

      const { stream, handlers } = LangChainStream({
        onFinal() {
          data.close();
        },
        experimental_streamData: true,
      });

      const llm = new ChatOpenAI({
        streaming: true,
        openAIApiKey: 'fake',
        configuration: {
          baseURL: server.api,
          defaultHeaders: {
            'x-mock-service': 'openai',
            'x-mock-type': 'chat',
            'x-flush-delay': '5',
          },
        },
      });

      llm
        .call([new HumanMessage('hello')], {}, [handlers])
        .catch(console.error);

      const response = new StreamingTextResponse(stream, {}, data);

      // '2:' lines carry the appended data, '0:' lines the text chunks.
      expect(await readAllChunks(response)).toEqual([
        '2:"[{\\"t1\\":\\"v1\\"}]"\n',
        '0:""\n',
        '0:"Hello"\n',
        '0:","\n',
        '0:" world"\n',
        '0:"."\n',
        '0:""\n',
      ]);
    });
  });
});
With Langchain, there is an empty text chunk at the beginning and the end. Is this desired?
No, but it shouldn't be a problem, and it's not worth investigating until we finalize the complex mode API imo.
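If the leading and trailing empty chunks ever did need to go away, one option (purely illustrative, not part of this PR, and assuming the stream yields plain string chunks as the first test's expectations suggest) would be to filter them out with a TransformStream before wrapping the stream in StreamingTextResponse:

// Sketch only: drop empty chunks from a text stream. If the underlying
// stream carries encoded bytes instead of strings, it would need a
// TextDecoderStream / TextEncoderStream pair around this transform.
function dropEmptyChunks(
  stream: ReadableStream<string>,
): ReadableStream<string> {
  return stream.pipeThrough(
    new TransformStream<string, string>({
      transform(chunk, controller) {
        // Forward only non-empty chunks; the '' at the start and end is skipped.
        if (chunk.length > 0) {
          controller.enqueue(chunk);
        }
      },
    }),
  );
}

// Hypothetical usage with the helpers from the test above:
// const { stream, handlers } = LangChainStream();
// const response = new StreamingTextResponse(dropEmptyChunks(stream));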