
Wrong LLMs in backend #10

Open
mmarti-tsch opened this issue Apr 22, 2024 · 5 comments

Comments

@mmarti-tsch

It seems that upon creation of a chatbot with create-tsi, the backend doesn't always use the LLM that was selected during the creation process.
When asked, the LLM reveals that it is using ChatGPT 3.5, even though we selected Mixtral.

@andrej-schreiner
Collaborator

@marcusschiesser any idea?

@marcusschiesser
Collaborator

@mmarti-tsch what's the MODEL value in the .env file? If that's correct my guess is it's the LLM API
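For reference, a `.env` with Mixtral selected would contain a line like the following (the `MODEL` variable name comes from the comment above; the exact value is an illustrative example, not confirmed from the create-tsi output):

```
MODEL=mixtral-8x7b-instruct
```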

@Phalanx-SoftDev

I experienced the same problem

[two screenshots of the chat UI]

Even though I selected Mistral as the LLM, it identifies itself as GPT-3.

@mohdamir
Collaborator

While it reports its name as GPT-3.5, it is still Mixtral: Mixtral is selected in the .env file, and LLM Hub is using only that model. We need to look into this hallucination. When I asked about the creator of the model, the following was the response.

[Screenshot 2024-04-23 at 4:44:32 PM]

@andrej-schreiner
Collaborator

@tattrongvu is this expected behaviour or do we need to tweak the system prompts?
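If a prompt tweak is the chosen fix, one common way to reduce this kind of identity hallucination is to pin the model's name in the system prompt, so the assistant doesn't fall back on identities seen in its training data. A minimal sketch, assuming an OpenAI-style message list; the function and prompt wording here are illustrative, not taken from the create-tsi codebase:

```python
# Sketch: prepend a system message stating the deployed model's identity,
# so the assistant doesn't claim to be GPT-3.5. Names are illustrative.

def build_messages(user_prompt: str, model_name: str = "Mixtral-8x7B") -> list[dict]:
    """Build an OpenAI-style chat message list with a pinned model identity."""
    system_prompt = (
        f"You are a helpful assistant powered by the {model_name} model. "
        "If asked which model you are, answer with this name; do not claim "
        "to be GPT-3.5 or any other model."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages("Which LLM are you?")
```

This doesn't change which model actually runs (that is controlled by the `.env` configuration), but it makes the model's self-reported identity consistent with it.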

Labels: none · Projects: none · Development: no branches or pull requests

5 participants