switch to Mistral 7B model
deleolajide committed Nov 22, 2023
1 parent 43ae80f commit 2462ae4
Showing 4 changed files with 11 additions and 1 deletion.
6 changes: 6 additions & 0 deletions README.md
@@ -23,6 +23,12 @@ copy llama.jar to the plugins folder
### Enable LLaMA
Enables or disables the plugin. Reload the plugin or restart Openfire if this or any of the other settings are changed.

### Use Hosted LLaMA server
This causes the plugin to use a remote llama.cpp server instead of the local server running in Openfire.

### Hosted URL
The URL of the remote llama.cpp server to use. The plugin assumes that the remote server has the correct LLaMA model and configuration, and will send requests to this URL.

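For orientation, the request shape for a llama.cpp server's `/completion` endpoint can be sketched as follows. This is a minimal illustration using the field names from llama.cpp's example server API; the exact payload this plugin sends, and the helper name `build_completion_request`, are assumptions, not code from the plugin.

```python
import json

def build_completion_request(prompt: str, n_predict: int = 128) -> str:
    """Build a JSON body in the shape of llama.cpp's /completion API.

    Field names follow llama.cpp's example server; the payload this
    plugin actually sends is an assumption, not confirmed here.
    """
    payload = {
        "prompt": prompt,
        "n_predict": n_predict,   # maximum number of tokens to generate
        "temperature": 0.7,       # illustrative sampling setting
    }
    return json.dumps(payload)

body = build_completion_request("Hello, LLaMA!")
```

A body like this would be POSTed to `<Hosted URL>/completion` by whatever client code talks to the remote server.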
### Username/Password
This is the Openfire username/password for the user that will act as the chatbot for LLaMA. By default the user will be “llama” and the password will be a random string. If you are using LDAP, or your Openfire user manager is in read-only mode and a new user cannot be created, then you must create the user yourself and specify the username and password here…

Binary file modified docs/llama-settings.png
4 changes: 4 additions & 0 deletions readme.html
@@ -46,6 +46,10 @@ <h2 id="configuration">Configuration</h2>
<img src="https://igniterealtime.github.io/openfire-llama-plugin/llama-settings.png">
<h3 id="enable-llama">Enable LLaMA</h3>
<p>Enables or disables the plugin. Reload the plugin or restart Openfire if this or any of the other settings are changed.</p>
<h3 id="enable-hosted">Use Hosted LLaMA server</h3>
<p>This causes the plugin to use a remote llama.cpp server instead of the local server running in Openfire. The plugin assumes that the remote server has the correct LLaMA model and configuration, and will send requests to this URL.</p>
<h3 id="hosted-url">Hosted URL</h3>
<p>The URL of the remote llama.cpp server to use.</p>
<h3 id="usernamepassword">Username/Password</h3>
<p>This is the Openfire username/password for the user that will act as the chatbot for LLaMA. By default the user will be “llama” and the password will be a random string. If you are using LDAP, or your Openfire user manager is in read-only mode and a new user cannot be created, then you must create the user yourself and specify the username and password here…</p>
<h3 id="alias">Alias</h3>
2 changes: 1 addition & 1 deletion src/java/org/ifsoft/llama/openfire/LLaMA.java
@@ -144,7 +144,7 @@ public static String getHostedUrl() {
}

public static String getModelUrl() {
-        return "https://huggingface.co/TheBloke/Llama-2-7b-Chat-GGUF/blob/main/llama-2-7b-chat.Q5_K_M.gguf";
+        return "https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/resolve/main/mistral-7b-instruct-v0.1.Q5_K_M.gguf?download=true";
}

public static String getSystemPrompt() {
Expand Down
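The new model URL points at a GGUF file on a Hugging Face `resolve` endpoint, with a `?download=true` query string appended. Deriving the local file name from such a URL can be sketched as below; this is an illustrative helper, not the plugin's actual download logic.

```python
from urllib.parse import urlsplit
import posixpath

def model_filename(url: str) -> str:
    """Extract the GGUF file name from a Hugging Face resolve URL.

    Query parameters such as ?download=true are dropped along with the
    scheme and host; only the last path segment is returned.
    """
    path = urlsplit(url).path          # path only, query string excluded
    return posixpath.basename(path)    # last path segment

url = ("https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF"
       "/resolve/main/mistral-7b-instruct-v0.1.Q5_K_M.gguf?download=true")
print(model_filename(url))  # mistral-7b-instruct-v0.1.Q5_K_M.gguf
```

Note that the previous URL used `/blob/main/`, which serves an HTML page on Hugging Face; `/resolve/main/` serves the raw file, so the new URL is also a fix for direct downloading.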
