Update code sample for new Ollama approach (#43258)
* Update packages and code sample for new Ollama approach
alexwolfmsft authored Oct 30, 2024
1 parent 8b79865 commit d8ababc
Showing 3 changed files with 52 additions and 40 deletions.
43 changes: 3 additions & 40 deletions docs/ai/quickstarts/quickstart-local-ai.md
Original file line number Diff line number Diff line change
@@ -54,10 +54,11 @@ Complete the following steps to create a .NET console app that will connect to y
dotnet new console
```

1. Add the Semantic Kernel SDK package to your app:
1. Add the [Semantic Kernel SDK](https://www.nuget.org/packages/Microsoft.SemanticKernel) and the [Semantic Kernel Ollama Connector](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Ollama/1.25.0-alpha) packages to your app:

```dotnetcli
dotnet add package Microsoft.SemanticKernel
dotnet add package Microsoft.SemanticKernel.Connectors.Ollama
```

1. Open the new app in your editor of choice, such as Visual Studio Code.
@@ -72,45 +73,7 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod
1. Open the _Program.cs_ file and replace the contents of the file with the following code:
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
// Create a kernel with OpenAI chat completion
#pragma warning disable SKEXP0010
Kernel kernel = Kernel.CreateBuilder()
.AddOpenAIChatCompletion(
modelId: "phi3:mini",
endpoint: new Uri("http://localhost:11434"),
apiKey: "")
.Build();
var aiChatService = kernel.GetRequiredService<IChatCompletionService>();
var chatHistory = new ChatHistory();
while (true)
{
// Get user prompt and add to chat history
Console.WriteLine("Your prompt:");
var userPrompt = Console.ReadLine();
chatHistory.Add(new ChatMessageContent(AuthorRole.User, userPrompt));
// Stream the AI response and add to chat history
Console.WriteLine("AI Response:");
var response = "";
await foreach(var item in
aiChatService.GetStreamingChatMessageContentsAsync(chatHistory))
{
Console.Write(item.Content);
response += item.Content;
}
chatHistory.Add(new ChatMessageContent(AuthorRole.Assistant, response));
Console.WriteLine();
}
```
> [!NOTE]
> The `#pragma warning disable SKEXP0010` line is included due to the experimental state of some Semantic Kernel SDK features.
:::code language="csharp" source="snippets/local-ai/Program.cs" :::
The preceding code accomplishes the following tasks:
- Creates a `Kernel` object and uses it to retrieve a chat completion service.
34 changes: 34 additions & 0 deletions docs/ai/quickstarts/snippets/local-ai/Program.cs
Original file line number Diff line number Diff line change
@@ -0,0 +1,34 @@
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Create a kernel with Ollama chat completion
// Warning due to the experimental state of some Semantic Kernel SDK features.
#pragma warning disable SKEXP0070
Kernel kernel = Kernel.CreateBuilder()
    .AddOllamaChatCompletion(
        modelId: "phi3:mini",
        endpoint: new Uri("http://localhost:11434"))
    .Build();

var aiChatService = kernel.GetRequiredService<IChatCompletionService>();
var chatHistory = new ChatHistory();

while (true)
{
    // Get user prompt and add to chat history
    Console.WriteLine("Your prompt:");
    var userPrompt = Console.ReadLine();
    chatHistory.Add(new ChatMessageContent(AuthorRole.User, userPrompt));

    // Stream the AI response and add to chat history
    Console.WriteLine("AI Response:");
    var response = "";
    await foreach (var item in
        aiChatService.GetStreamingChatMessageContentsAsync(chatHistory))
    {
        Console.Write(item.Content);
        response += item.Content;
    }
    chatHistory.Add(new ChatMessageContent(AuthorRole.Assistant, response));
    Console.WriteLine();
}
15 changes: 15 additions & 0 deletions docs/ai/quickstarts/snippets/local-ai/ollama.csproj
Original file line number Diff line number Diff line change
@@ -0,0 +1,15 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.SemanticKernel" Version="1.25.0" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.Ollama" Version="1.22.0-alpha" />
  </ItemGroup>

</Project>
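To try the updated quickstart end to end, the sample above can be run against a local Ollama instance. This is a sketch that assumes Ollama is installed and serving on its default port, which matches the `http://localhost:11434` endpoint in the snippet:

```shell
# Download the model referenced by the sample
ollama pull phi3:mini

# Start the Ollama server if it is not already running
# (it listens on port 11434 by default)
ollama serve &

# From the project directory, restore packages and run the console app
dotnet run
```

If Ollama was installed as a desktop app or system service, the server may already be running, in which case the `ollama serve` step can be skipped.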
