local-apps: update node-llama-cpp snippet #1169
Conversation
Amazing! Let's wait for @ngxson to give an LGTM too and merge!
  ` --model "hf:${model.id}/${filepath ?? "{{GGUF_FILE}}"}" \\`,
  ` --prompt 'Hi there!'`,
].join("\n"),
content: `npx -y node-llama-cpp chat hf:${model.id}${tagName}`,
nice, this looks quite clean 👍
@ngxson Let me know if you need me to make changes to the PR to merge it
Do you need any input from me to merge this PR? It'd be a bummer to see this PR get stale...
@ngxson i think we can merge if you approve
The failed CI doesn't relate to this PR, so I assume it's ok to merge?
yes imo
This PR updates the node-llama-cpp code snippet to use a tag-based model URI, like in llama.cpp and Ollama. The implementation relies on the Ollama support on Hugging Face, and is based on this PR: ggml-org/llama.cpp#11195
Examples: