In the dkgInsert.ts file, the knowledge asset is currently constructed based only on the latest user query. It should instead be based on the whole conversation up to that point: extract the full conversation from the state in place of _state.currentPost, and pass it to the constructKnowledgeAsset function.
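A minimal sketch of the proposed change. The field names (`recentMessagesData`, `content.text`) and the `constructKnowledgeAsset` signature are assumptions for illustration; the real types live in the Eliza codebase.

```typescript
// Hypothetical shapes -- stand-ins for the framework's State/Memory types.
interface Memory {
    userId: string;
    content: { text: string };
}

interface State {
    currentPost?: string;          // previously the only input used
    recentMessagesData?: Memory[]; // assumed to hold the full conversation
}

// Flatten the whole conversation into one string instead of relying
// on _state.currentPost (the latest user query only).
function extractConversation(state: State): string {
    const messages = state.recentMessagesData ?? [];
    return messages
        .map((m) => `${m.userId}: ${m.content.text}`)
        .join("\n");
}

// Usage inside the handler (constructKnowledgeAsset is hypothetical here):
// const ka = await constructKnowledgeAsset(extractConversation(_state));
```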
* add livepeer on index.ts as llm provider
* updated livepeer models
* add livepeer as llm provider
* add retry logic on livepeer img gen
* add handlelivepeer
* update test
* add livepeer model keys on .example.env
* Merge pull request #2 from Titan-Node/livepeer-doc-updates
Updated docs for Livepeer LLM integration
* add endpoint on livepeer on models.ts
* edit livepeer model config at model.ts
* Add Livepeer to image gen plugin environments
Fixes this error
```
Error handling message: Error: Image generation configuration validation failed:
: At least one of ANTHROPIC_API_KEY, NINETEEN_AI_API_KEY, TOGETHER_API_KEY, HEURIST_API_KEY, FAL_API_KEY, OPENAI_API_KEY or VENICE_API_KEY is required
at validateImageGenConfig (file:///root/eliza-test/eliza-livepeer-integration/packages/plugin-image-generation/dist/index.js:38:19)
```
* add comments on livepeer model sizes
* remove retry logic from livepeer generate text and img
* Fixed .env naming convention and fixed mismatch bug within code
* add bearer on livepeer calls
* change in parsing to accommodate new livepeer update
* add nineteen api key on the message
---------
Co-authored-by: Titan Node <[email protected]>
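The "add bearer on livepeer calls" commit above can be sketched as follows. The gateway path, env var names, and payload shape are assumptions for illustration, not the plugin's actual API surface.

```typescript
// Minimal request shape, so the sketch stays self-contained.
interface LivepeerRequestInit {
    method: string;
    headers: Record<string, string>;
    body: string;
}

// Build a request with a bearer token attached, as the commit describes.
function buildLivepeerRequest(prompt: string, apiKey: string): LivepeerRequestInit {
    return {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            // Bearer auth header added to every Livepeer call.
            Authorization: `Bearer ${apiKey}`,
        },
        body: JSON.stringify({ prompt }),
    };
}

// Usage (hypothetical endpoint):
// await fetch(`${process.env.LIVEPEER_GATEWAY_URL}/text-to-image`,
//             buildLivepeerRequest("a cat", process.env.LIVEPEER_API_KEY ?? ""));
```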