Issues: SciSharp/LLamaSharp
[BUG]: llamasharp 0.20.0 can not be used with visual studio 2019 (#1076, opened Feb 2, 2025 by futureflsl)
[Feature]: DeepSeek-R1-Distill-Qwen or similar distilled DeepSeek gguf support (label: enhancement) (#1059, opened Jan 28, 2025 by zsogitbe)
[BUG]: System.AccessViolationException on KernelMemory Example in Llama.Examples (#1058, opened Jan 27, 2025 by JLeaman99)
Argument out of range exception when running any prompt through DeepSeek-R1-Distill-Llama-8B-Q8_0 (#1053, opened Jan 21, 2025 by wased89)
[BUG]: Outputting Chinese characters may result in incomplete UTF8 encoding, causing garbled text (#1048, opened Jan 18, 2025 by jxq1997216)
[BUG]: Failed to load ./runtimes/win-x64/native/cuda12/llama.dll (#1014, opened Dec 7, 2024 by vadsavin)
[BUG]: Unhandled exception. System.Text.Json.JsonException: '0x00' is an invalid start of a value (#1008, opened Dec 2, 2024 by biapar)
[BUG]: loglevel 0 and 1 from llama.cpp doesn't seem to be supported (#995, opened Nov 25, 2024 by LoicDagnas)
How to publish a self-contained, single runtime, with multiple backends? (#977, opened Nov 12, 2024 by jasonliv)