Replies: 2 comments
-
It seems you are using a chat model; for autocompletion you need a FIM (fill-in-the-middle) model.
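The difference can be sketched in code. Autocomplete extensions like Twinny wrap the code before and after the cursor in special FIM sentinel tokens and send that as a raw prompt; a base/FIM model (e.g. the `deepseek-coder:1.3b-base` variant on Ollama) was trained on this format, while a chat-tuned model was not and answers in prose instead. A minimal sketch, assuming Ollama's `/api/generate` endpoint with `raw` mode; the sentinel token strings below are placeholders, not the real tokens — check your model's card for the exact ones it was trained with.

```python
# Sketch of how an editor extension builds a fill-in-the-middle (FIM) request.
# The sentinel tokens here are illustrative placeholders; each FIM-capable
# model defines its own. A chat-tuned model never saw such tokens in
# training, so it replies with conversational prose instead of code.

def build_fim_prompt(prefix: str, suffix: str,
                     begin: str = "<FIM_BEGIN>",
                     hole: str = "<FIM_HOLE>",
                     end: str = "<FIM_END>") -> str:
    """Wrap the code before/after the cursor in FIM sentinel tokens."""
    return f"{begin}{prefix}{hole}{suffix}{end}"

def build_ollama_payload(model: str, prompt: str) -> dict:
    """Payload for Ollama's /api/generate; raw=True bypasses the chat
    template so the sentinel tokens reach the model unmodified."""
    return {"model": model, "prompt": prompt, "raw": True, "stream": False}

# Example: the cursor sits after "return " inside a half-written function.
prefix = "def add(a, b):\n    return "
suffix = "\n\nprint(add(1, 2))"
prompt = build_fim_prompt(prefix, suffix)
payload = build_ollama_payload("deepseek-coder:1.3b-base", prompt)
```

With a chat model in the same slot, this raw prompt is treated as ordinary text to converse about, which matches the irrelevant-prose suggestions described in the question below.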
-
Hello,
I am using Visual Studio Code with the Twinny extension. On one of my PCs, I host Ollama running the deepseek-coder:1.3b model (I also have codellama and codeqwen available). On my other PC, I have connected Twinny to the LLM hosted on the first PC.
While the chat functionality in the extension works fine, autocompletion is problematic. It sometimes suggests the wrong programming language, or returns irrelevant prose instead of code (e.g., "It seems like you've posted a piece of JavaScript code that includes several functions..."). The code completions it does provide are not very helpful.
My question is: What could be causing these issues? Is it the model, the settings, or something else I am doing wrong? I would greatly appreciate any assistance in resolving this problem.
(I'm not sure what additional context would help diagnose this, so please just ask for anything you need. :) )
Thank you!