Description

According to the content of my messages, it is impossible for the model to ever reply with the anti-prompt "Never gonna give you up", so generation should only stop at an EOS token. But every time the model encounters an EOS token, the token's text itself appears at the end of the response instead of being stripped out.

Reproduction Steps

Use this code:
using LLama;
using LLama.Common;
using LLama.Sampling;
using LLama.Transformers;
using System.Text;

namespace LlamaTest
{
    internal class Program
    {
        static async Task Main(string[] args)
        {
            string modelPath = @"C:\Users\Xavier\.ollama\models\blobs\sha256-60cfdbde0472c3b850493551288a152f0858a0d1974964d6925c2b908035db76";

            var parameters = new ModelParams(modelPath)
            {
                ContextSize = 1024, // The longest length of chat as memory.
                GpuLayerCount = 5,
            };
            using var model = LLamaWeights.LoadFromFile(parameters);
            using var context = model.CreateContext(parameters);
            var executor = new InteractiveExecutor(context);

            var chatHistory = new ChatHistory();
            ChatSession session = new ChatSession(executor, chatHistory);
            session.WithHistoryTransform(new PromptTemplateTransformer(model, true));

            InferenceParams inferenceParams = new InferenceParams()
            {
                //MaxTokens = 256, // No more than 256 tokens should appear in answer. Remove it if antiprompt is enough for control.
                AntiPrompts = new List<string> { "Never gonna give you up" }, // Stop generation once antiprompts appear.
                SamplingPipeline = new DefaultSamplingPipeline(),
            };

            while (true)
            {
                var input = Console.ReadLine();
                var message = new ChatHistory.Message(AuthorRole.User, input);
                await foreach (var text in session.ChatAsync(message, true, inferenceParams))
                {
                    Console.Write(text);
                }
            }
        }
    }
}
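To make the stray token easier to spot while reproducing, the output loop can be changed to bracket each streamed chunk; this diagnostic variant is my addition, not part of the original repro:

// Diagnostic variant of the output loop: bracket every streamed chunk so a
// trailing EOS string stands out from the normal response text.
await foreach (var text in session.ChatAsync(message, true, inferenceParams))
{
    Console.Write($"[{text}]");
}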
And here is the model: https://ollama.org.cn/library/deepseek-llm
Environment & Configuration
Known Workarounds
No response
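One idea I have not tried yet (hence the "No response" above): LLamaSharp's LLamaTransforms.KeywordTextOutputStreamTransform can hide given strings from a session's streamed output, so registering the EOS text as a keyword might suppress it. The EOS string below is only a guess for deepseek-llm; replace it with whatever actually appears at the end of the response.

// Untested sketch: hide the (assumed) EOS string from the streamed text.
// "<|end_of_sentence|>" is only a guess at deepseek-llm's EOS text; swap in
// the string that actually shows up at the end of the response.
session.WithOutputTransform(new LLamaTransforms.KeywordTextOutputStreamTransform(
    new[] { "<|end_of_sentence|>" }, // keyword(s) to strip from the output stream
    redundancyLength: 5));           // extra characters held back while matching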