
Commit

Update GroqLLMService to use llama-3.3-70b-versatile as the default model
markbackman committed Feb 9, 2025
1 parent 32b9de5 commit f2b0727
Showing 2 changed files with 5 additions and 2 deletions.
CHANGELOG.md (3 additions & 0 deletions)
@@ -19,6 +19,9 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Updated foundation example `14f-function-calling-groq.py` to use
   `GroqSTTService` for transcription.
 
+- Updated `GroqLLMService` to use `llama-3.3-70b-versatile` as the default
+  model.
+
 - `RTVIObserver` doesn't handle `LLMSearchResponseFrame` frames anymore. For
   now, to handle those frames you need to create a `GoogleRTVIObserver` instead.
 
src/pipecat/services/groq.py (2 additions & 2 deletions)
@@ -22,7 +22,7 @@ class GroqLLMService(OpenAILLMService):
     Args:
         api_key (str): The API key for accessing Groq's API
         base_url (str, optional): The base URL for Groq API. Defaults to "https://api.groq.com/openai/v1"
-        model (str, optional): The model identifier to use. Defaults to "llama-3.1-70b-versatile"
+        model (str, optional): The model identifier to use. Defaults to "llama-3.3-70b-versatile"
         **kwargs: Additional keyword arguments passed to OpenAILLMService
     """

@@ -31,7 +31,7 @@ def __init__(
         *,
         api_key: str,
         base_url: str = "https://api.groq.com/openai/v1",
-        model: str = "llama-3.1-70b-versatile",
+        model: str = "llama-3.3-70b-versatile",
         **kwargs,
     ):
         super().__init__(api_key=api_key, base_url=base_url, model=model, **kwargs)
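
For reference, a minimal usage sketch against the constructor shown in the diff above (the `GROQ_API_KEY` environment variable name is an assumption for illustration): passing only an API key now selects `llama-3.3-70b-versatile`, and the `model` argument still overrides the default.

```python
import os

from pipecat.services.groq import GroqLLMService

# Picks up the new default model, llama-3.3-70b-versatile.
llm = GroqLLMService(api_key=os.getenv("GROQ_API_KEY"))

# Any other Groq model, including the previous default, can still be set explicitly.
pinned_llm = GroqLLMService(
    api_key=os.getenv("GROQ_API_KEY"),
    model="llama-3.1-70b-versatile",
)
```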
