# Ollama C# Playground

This lab is designed to test Phi-3 with C# samples directly in GitHub Codespaces, an easy way for anyone to try out SLMs (small language models) entirely in the browser.

## How to create the C# + Ollama + Phi-3 Codespace

1. Create a new Codespace using the **Code** button at the top of the repository, then select **[+ New with options ...]**.

2. From the options page, select the configuration named **Ollama with Phi-3 for C#** to create the Codespace.

3. Once the Codespace is loaded, it should have Ollama pre-installed, the latest Phi-3 model downloaded, and .NET 8 installed.

4. (Optional) Using the Codespace terminal, ask Ollama to run the phi3 model:

    ```
    ollama run phi3
    ```

5. You can send a message to the model from the prompt:

    ```
    >>> Write a joke about kittens
    ```

6. After several seconds, you should see a response stream in from the model.

7. To learn about different techniques used with language models, check the sample projects in the `.\src` folder:

| Project | Description |
| --- | --- |
| Sample01 | A sample project that uses the Phi-3 model hosted in Ollama to answer a question. |
| Sample02 | A sample project that implements a console chat using Semantic Kernel. |
| Sample03 | A sample project that implements RAG using local embeddings and Semantic Kernel. Check the details of the local RAG here. |
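Before diving into the samples, it can help to see what talking to Ollama looks like at the HTTP level. The sketch below is not one of the shipped samples; it is a minimal, hedged illustration that calls Ollama's local REST API (which listens on port 11434 by default) directly with `HttpClient`, asking phi3 the same kitten-joke prompt used above:

```csharp
// Minimal sketch: query the local Ollama REST API directly.
// Assumes Ollama is running on its default port (11434) with phi3 pulled.
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;

using var client = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

// /api/generate is Ollama's completion endpoint;
// "stream = false" returns the whole answer as a single JSON object.
var response = await client.PostAsJsonAsync("/api/generate", new
{
    model = "phi3",
    prompt = "Write a joke about kittens",
    stream = false
});
response.EnsureSuccessStatusCode();

// The generated text comes back in the "response" field.
var json = await response.Content.ReadFromJsonAsync<JsonElement>();
Console.WriteLine(json.GetProperty("response").GetString());
```

The samples in `.\src` wrap this same local endpoint behind higher-level abstractions such as Semantic Kernel, so you rarely need to call it by hand.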

## How to run a sample

1. Open a terminal and navigate to the desired project. For example, let's run Sample02, the console chat:

    ```
    cd .\src\Sample02\
    ```

2. Run the project with the command:

    ```
    dotnet run
    ```

3. The project Sample02 defines a custom system message:

    ```csharp
    var history = new ChatHistory();
    history.AddSystemMessage("You are a useful chatbot. If you don't know an answer, say 'I don't know!'. Always reply in a funny way. Use emojis if possible.");
    ```

4. So when the user asks a question, like "What is the capital of Italy?", the chat replies using the local model.

    The output is similar to this one:
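To give a feel for how Sample02 uses that system message, here is a hedged sketch of a console chat loop built on Semantic Kernel's chat abstractions. The `chatService` variable is assumed to be an `IChatCompletionService` already wired to the local Ollama endpoint (the exact connector setup may differ from the shipped sample):

```csharp
// Hedged sketch of a Sample02-style console chat loop.
// Assumes `chatService` is an IChatCompletionService configured
// to use the local Ollama-hosted phi3 model.
using Microsoft.SemanticKernel.ChatCompletion;

var history = new ChatHistory();
history.AddSystemMessage("You are a useful chatbot. If you don't know an answer, say 'I don't know!'. Always reply in a funny way. Use emojis if possible.");

while (true)
{
    Console.Write("Q: ");
    var question = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(question)) break;

    // Add the user's turn, ask the model, and keep the reply in history
    // so later turns have the full conversation as context.
    history.AddUserMessage(question);
    var reply = await chatService.GetChatMessageContentAsync(history);
    history.Add(reply);

    Console.WriteLine(reply.Content);
}
```

Because every turn is appended to `history`, the system message set at the start keeps shaping all of the model's replies throughout the session.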

## Video Tutorials

If you want to learn more about how to use Codespaces with Ollama in a GitHub repository, watch the following 3-minute video:

Watch the video