Merge branch 'llms/fake' of github.com:devalexandre/langchaingo into llms/fake
devalexandre committed Jun 27, 2024
2 parents c4cbd03 + 6a5321c commit 7418a9a
Showing 206 changed files with 3,951 additions and 598 deletions.
1 change: 1 addition & 0 deletions .golangci.yaml
@@ -30,6 +30,7 @@ linters:
- ireturn # disabling temporarily
- perfsprint
- musttag
- tagalign # Impractical for schema-defined output parser, which relies heavily on struct tagging.

linters-settings:
cyclop:
@@ -32,6 +32,7 @@ func textToSplit() []schema.Document {

log.Println("Document loaded: ", len(docs))
}
```
<DocCardList />
89 changes: 89 additions & 0 deletions docs/docs/modules/model_io/models/llms/Integrations/groq.mdx
@@ -0,0 +1,89 @@
---
sidebar_label: Groq
---

# Groq

## Overview

This page provides an overview and technical guidance for integrating Groq's language models with the langchaingo library. Groq exposes an OpenAI-compatible API, so Go developers can reuse langchaingo's `openai` client to access Groq-hosted models for natural language processing, text generation, and more.

## Prerequisites

- Go programming language installed on your machine (version 1.22.0 or higher recommended).
- A valid Groq API key. Obtain it by creating an account on the Groq platform and generating a new token.

## Installation

Groq support is provided through langchaingo's OpenAI-compatible client. To add langchaingo to your Go project, run:

```bash
go get github.com/tmc/langchaingo
```

Ensure that your Groq API key is set as an environment variable:

```bash
export GROQ_API_KEY=your-api-key
```

Alternatively, you can store the API key in a `.env` file and load it in your Go application.

`.env` file:

```bash
GROQ_API_KEY=your-api-key
```

Note that `.env` files are not loaded automatically; use a package such as [godotenv](https://github.com/joho/godotenv) to load one, as the usage example below does.


## Usage

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/joho/godotenv"
	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// Load environment variables from a .env file, if one is present.
	if err := godotenv.Load(); err != nil {
		log.Println("no .env file found; using existing environment variables")
	}
apiKey := os.Getenv("GROQ_API_KEY")
llm, err := openai.New(
openai.WithModel("llama3-8b-8192"),
openai.WithBaseURL("https://api.groq.com/openai/v1"),
openai.WithToken(apiKey),
)
if err != nil {
log.Fatal(err)
}
ctx := context.Background()
_, err = llms.GenerateFromSinglePrompt(ctx,
llm,
"Write a long poem about how golang is a fantastic language.",
llms.WithTemperature(0.8),
llms.WithMaxTokens(4096),
llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
fmt.Print(string(chunk))
return nil
}),
)
fmt.Println()
if err != nil {
log.Fatal(err)
}
}
```
@@ -7,25 +7,26 @@ import WatsonxExample from "@examples/watsonx-llm-example/watsonx_example.go";

# watsonx

Integration support for [IBM watsonx](https://www.ibm.com/watsonx) foundation models.
Integration support for [IBM watsonx](https://www.ibm.com/watsonx) foundation models with [`watsonx-go`](https://github.com/IBM/watsonx-go).

## Setup

You will need to set the following environment variables to use the watsonx AI API.

- `IBMCLOUD_API_KEY`: generate from your [IBM Cloud account](https://cloud.ibm.com/iam/apikeys).
- `WATSONX_API_KEY`: generate from your [IBM Cloud account](https://cloud.ibm.com/iam/apikeys).
- `WATSONX_PROJECT_ID`: copy from your [watsonx project settings](https://dataplatform.cloud.ibm.com/projects/?context=wx).
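In a shell session, the variables can be exported directly (placeholder values shown):

```bash
export WATSONX_API_KEY=your-watsonx-api-key
export WATSONX_PROJECT_ID=your-watsonx-project-id
```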

Alternatively, these can be passed into the model on creation:

```go
import (
wx "github.com/h0rv/go-watsonx/models"
wx "github.com/IBM/watsonx-go/pkg/models"
"github.com/tmc/langchaingo/llms/watsonx"
)
...
llm, _ := watsonx.New(
wx.WithIBMCloudAPIKey("YOUR IBM CLOUD API KEY"),
wx.WithWatsonxAPIKey("YOUR WATSONX API KEY"),
wx.WithWatsonxProjectID("YOUR WATSONX PROJECT ID"),
)
```
47 changes: 45 additions & 2 deletions examples/README.md
@@ -1,3 +1,46 @@
# 🎉 Examples
# LangChain Go Examples 🚀

This directory tree contains examples that are independently buildable and runnable.
Welcome to the exciting set of LangChain Go examples! 🎉 This directory tree is packed with fun and practical demonstrations of how to use LangChain with various language models and tools. Whether you're a seasoned AI developer or just starting out, there's something here for everyone!

## What's Inside? 📦

This collection includes examples for:

- Different Language Models: OpenAI, Anthropic, Cohere, Ollama, and more!
- Vector Stores: Chroma, Pinecone, Weaviate, and others for efficient similarity searches.
- Chains and Agents: See how to build complex AI workflows and autonomous agents.
- Tools and Integrations: Explore connections with Zapier, SQL databases, and more.
- Memory Systems: Learn about various memory implementations for contextual conversations.

## Key Features 🌟

1. **Diverse LLM Integration**: Examples showcasing integration with multiple language models.
2. **Vector Store Demonstrations**: Practical uses of vector databases for semantic search and data retrieval.
3. **Chain and Agent Construction**: Learn to build sophisticated AI workflows and autonomous agents.
4. **Tool Usage**: See how to leverage external tools and APIs within your AI applications.
5. **Memory Management**: Explore different ways to maintain context in conversations.

## How to Use 🛠️

Each example is contained in its own directory with a dedicated README and Go files. To run an example:

1. Navigate to the example's directory.
2. Read the README for specific instructions and requirements.
3. Run the Go file(s) as instructed.
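For instance, to run the Anthropic completion example from the repository root (assuming the relevant API key is already exported):

```bash
cd examples/anthropic-completion-example
go run anthropic_completion_example.go
```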

## Getting Started 🚀

1. Clone this repository.
2. Ensure you have Go installed on your system.
3. Set up any required API keys or environment variables as specified in individual examples.
4. Dive into the example that interests you most!

## Contribute 🤝

Feel free to contribute your own examples or improvements! We love seeing creative uses of LangChain Go.

## Have Fun! 😄

Remember, the world of AI is vast and exciting. These examples are just the beginning. Feel free to experiment, modify, and build upon these examples to create your own amazing AI applications!

Happy coding, and may your AI adventures be ever thrilling! 🚀🤖
38 changes: 38 additions & 0 deletions examples/anthropic-completion-example/README.md
@@ -0,0 +1,38 @@
# Anthropic Completion Example

Hello there, fellow Go enthusiasts and AI adventurers! 👋 Welcome to this exciting example of using the Anthropic API with Go!

## What's in this directory?

This directory contains a simple yet powerful example of how to use the Anthropic API to generate text completions using Go. Here's what you'll find:

1. `anthropic_completion_example.go`: This is the main Go file that demonstrates how to use the Anthropic API. It's a great starting point for your AI-powered adventures!

## What does the code do?

The `anthropic_completion_example.go` file showcases how to:

- Initialize an Anthropic LLM (Language Model) client
- Generate text completions using the Claude 3.5 Sonnet model
- Stream the generated text in real-time

It even includes a fun prompt asking Claude to write a poem about Golang-powered AI systems! 🤖📝
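A condensed sketch of the pattern the example follows (not the example file itself; it assumes the `ANTHROPIC_API_KEY` environment variable is set and cannot run without a live API key):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/anthropic"
)

func main() {
	// Create an Anthropic client; the model name mirrors the updated example.
	llm, err := anthropic.New(anthropic.WithModel("claude-3-5-sonnet-20240620"))
	if err != nil {
		log.Fatal(err)
	}

	// Stream a completion to the console, chunk by chunk.
	_, err = llms.GenerateFromSinglePrompt(context.Background(), llm,
		"Write a poem about Golang-powered AI systems.",
		llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}),
	)
	if err != nil {
		log.Fatal(err)
	}
}
```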

## How to use this example

1. Make sure you have Go installed on your system.
2. Set up your Anthropic API key as an environment variable.
3. Run the example using `go run anthropic_completion_example.go`.
4. Watch as the AI-generated poem streams to your console!

## Dependencies

This project uses the fantastic `langchaingo` library to interact with the Anthropic API. It's a great tool for building AI-powered applications in Go!

## What to expect

When you run the example, you'll see a poem about Golang-powered AI systems being generated and printed to your console in real-time. It's like watching an AI poet at work! 🎭

## Have fun!

We hope this example inspires you to create amazing AI-powered applications using Go and Anthropic's powerful language models. Happy coding! 🚀🎉
Expand Up @@ -11,7 +11,7 @@ import (

func main() {
llm, err := anthropic.New(
anthropic.WithModel("claude-3-opus-20240229"),
anthropic.WithModel("claude-3-5-sonnet-20240620"),
)
if err != nil {
log.Fatal(err)
2 changes: 1 addition & 1 deletion examples/anthropic-completion-example/go.mod
@@ -4,7 +4,7 @@ go 1.22.0

toolchain go1.22.1

require github.com/tmc/langchaingo v0.1.11
require github.com/tmc/langchaingo v0.1.12-pre.0

require (
github.com/dlclark/regexp2 v1.10.0 // indirect
4 changes: 2 additions & 2 deletions examples/anthropic-completion-example/go.sum
@@ -12,8 +12,8 @@ github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZb
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg=
github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/tmc/langchaingo v0.1.11 h1:QXK6T8zQzRSKispzeUk66yWcwVV3Igkq7R15y9NtZxM=
github.com/tmc/langchaingo v0.1.11/go.mod h1:MFQg4CUOwjT5VTYjorSmXHhQju6XqdJkSbdpBcZf7SQ=
github.com/tmc/langchaingo v0.1.12-pre.0 h1:8s7GU3qUYwDdMDu+gdBqBXUqNbvRpOoAjSDczRb340c=
github.com/tmc/langchaingo v0.1.12-pre.0/go.mod h1:cd62xD6h+ouk8k/QQFhOsjRYBSA1JJ5UVKXSIgm7Ni4=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
43 changes: 43 additions & 0 deletions examples/anthropic-tool-call-example/README.md
@@ -0,0 +1,43 @@
# Anthropic Tool Call Example 🌟

Welcome to the Anthropic Tool Call Example! This fun little Go program demonstrates how to use the Anthropic API to create an AI assistant that can answer questions about the weather using function calling. Let's dive in and see what it does!

## What Does This Example Do? 🤔

This example showcases the following cool features:

1. **AI-Powered Weather Assistant**: It creates an AI assistant using Anthropic's Claude model that can answer questions about the weather in different cities.

2. **Function Calling**: The assistant can use a special tool (function) called `getCurrentWeather` to fetch weather information for specific locations.

3. **Conversation Flow**: It demonstrates a back-and-forth conversation between a human and the AI assistant, including multiple queries about weather in different cities.

4. **Tool Execution**: When the AI assistant needs to use the weather tool, the program executes it and provides the results back to the assistant.

## How It Works 🛠️

1. The program starts by creating an Anthropic client using the Claude 3 Haiku model.

2. It then initiates a conversation by asking about the weather in Boston.

3. The AI assistant recognizes the need for weather information and calls the `getCurrentWeather` function.

4. The program executes the function call, fetching mock weather data for Boston.

5. The AI assistant receives the weather data and formulates a response.

6. The conversation continues with additional questions about weather in Chicago, demonstrating the assistant's ability to handle multiple queries and retain context.

## Fun Features 🎉

- **Mock Weather Data**: The example uses a simple map to provide mock weather data for Boston and Chicago. It's not real-time data, but it's perfect for demonstrating how the system works!

- **Flexible Conversations**: You can easily modify the conversation flow by adding more questions or changing the cities mentioned.

- **Tool Definition**: The `availableTools` slice defines the `getCurrentWeather` function, which the AI can use to fetch weather information.
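A minimal sketch of what such a mock lookup might look like (the function name matches the README, but the signature, data values, and JSON shape are illustrative assumptions, not copied from the example):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// getCurrentWeather returns mock weather data for a location as a JSON
// string, standing in for a real weather API call.
func getCurrentWeather(location string) (string, error) {
	mockData := map[string]string{
		"boston":  "72F and sunny",
		"chicago": "65F and windy",
	}
	weather, ok := mockData[location]
	if !ok {
		return "", fmt.Errorf("no weather data for %q", location)
	}
	b, err := json.Marshal(map[string]string{
		"location": location,
		"weather":  weather,
	})
	if err != nil {
		return "", err
	}
	return string(b), nil
}

func main() {
	out, err := getCurrentWeather("boston")
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // prints {"location":"boston","weather":"72F and sunny"}
}
```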

## Try It Out! 🚀

Run the example and watch as the AI assistant cheerfully answers questions about the weather in different cities. Feel free to modify the code to add more cities or even create your own tools for the AI to use!

Happy coding, and may your weather always be sunny! ☀️
@@ -41,6 +41,53 @@ func main() {
log.Fatal(err)
}
fmt.Println(resp.Choices[0].Content)

// populate ai response
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI, resp.Choices[0].Content)
messageHistory = append(messageHistory, assistantResponse)

fmt.Println("asking again...")
// Human asks again
humanQuestion := llms.TextParts(llms.ChatMessageTypeHuman, "How about the weather in chicago?")
messageHistory = append(messageHistory, humanQuestion)

// Send Request
resp, err = llm.GenerateContent(ctx, messageHistory, llms.WithTools(availableTools))
if err != nil {
log.Fatal(err)
}

// Perform Tool call
messageHistory = executeToolCalls(ctx, llm, messageHistory, resp)
fmt.Println("Querying with tool response...")
resp, err = llm.GenerateContent(ctx, messageHistory, llms.WithTools(availableTools))
if err != nil {
log.Fatal(err)
}
fmt.Println(resp.Choices[0].Content)

// populate ai response
assistantResponse = llms.TextParts(llms.ChatMessageTypeAI, resp.Choices[0].Content)
messageHistory = append(messageHistory, assistantResponse)

	// Compare responses
humanQuestion = llms.TextParts(llms.ChatMessageTypeHuman, "How do these compare?")
messageHistory = append(messageHistory, humanQuestion)

// Send Request
resp, err = llm.GenerateContent(ctx, messageHistory, llms.WithTools(availableTools))
if err != nil {
log.Fatal(err)
}
// Perform Tool call
messageHistory = executeToolCalls(ctx, llm, messageHistory, resp)
fmt.Println("Asking for comparison...")
resp, err = llm.GenerateContent(ctx, messageHistory, llms.WithTools(availableTools))
if err != nil {
log.Fatal(err)
}
fmt.Println(resp.Choices[0].Content)

}

// executeToolCalls executes the tool calls in the response and returns the
@@ -50,9 +97,18 @@ func executeToolCalls(ctx context.Context, llm llms.Model, messageHistory []llms
for _, toolCall := range choice.ToolCalls {

// Append tool_use to messageHistory
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI)
for _, tc := range choice.ToolCalls {
assistantResponse.Parts = append(assistantResponse.Parts, tc)
assistantResponse := llms.MessageContent{
Role: llms.ChatMessageTypeAI,
Parts: []llms.ContentPart{
llms.ToolCall{
ID: toolCall.ID,
Type: toolCall.Type,
FunctionCall: &llms.FunctionCall{
Name: toolCall.FunctionCall.Name,
Arguments: toolCall.FunctionCall.Arguments,
},
},
},
}
messageHistory = append(messageHistory, assistantResponse)

2 changes: 1 addition & 1 deletion examples/anthropic-tool-call-example/go.mod
@@ -2,7 +2,7 @@ module github.com/tmc/langchaingo/examples/anthropic-tool-call-example

go 1.22.0

require github.com/tmc/langchaingo v0.1.11
require github.com/tmc/langchaingo v0.1.12-pre.0

require (
github.com/dlclark/regexp2 v1.10.0 // indirect
4 changes: 2 additions & 2 deletions examples/anthropic-tool-call-example/go.sum
@@ -12,8 +12,8 @@ github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZb
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg=
github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/tmc/langchaingo v0.1.11 h1:QXK6T8zQzRSKispzeUk66yWcwVV3Igkq7R15y9NtZxM=
github.com/tmc/langchaingo v0.1.11/go.mod h1:MFQg4CUOwjT5VTYjorSmXHhQju6XqdJkSbdpBcZf7SQ=
github.com/tmc/langchaingo v0.1.12-pre.0 h1:8s7GU3qUYwDdMDu+gdBqBXUqNbvRpOoAjSDczRb340c=
github.com/tmc/langchaingo v0.1.12-pre.0/go.mod h1:cd62xD6h+ouk8k/QQFhOsjRYBSA1JJ5UVKXSIgm7Ni4=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=