Minor Bug fix with ollama #556

Merged
3 commits merged into Byaidu:main on Feb 1, 2025
Conversation

damaoooo
Contributor

  1. Improved the prompt so that the model outputs only the translated text.
  2. Removed the chain-of-thought (CoT) output from the DeepSeek r1 model.

This partially mitigates #545 and #535, but the “Response Too long” infinite loop is not solved and needs more work. A rough sketch of both changes is included below.
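A minimal sketch of what the two changes amount to, assuming the r1 chain-of-thought is wrapped in `<think>...</think>` tags; the function and variable names here are illustrative, not the actual identifiers changed in this PR:

```python
import re

# Matches the chain-of-thought block that DeepSeek r1 emits before its answer.
# Assumption: the reasoning is delimited by <think>...</think> tags.
THINK_BLOCK = re.compile(r"<think>.*?</think>", re.DOTALL)

def strip_reasoning(response: str) -> str:
    """Drop the CoT block so only the final translated text remains."""
    return THINK_BLOCK.sub("", response).strip()

def build_prompt(text: str, lang_out: str = "Simplified Chinese") -> str:
    """Prompt that asks the model to return the translation only,
    with no explanation or extra commentary."""
    return (
        f"Translate the following text into {lang_out}. "
        "Output only the translated text, with no explanation:\n\n"
        f"{text}"
    )
```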


Byaidu merged commit b79b6e9 into Byaidu:main on Feb 1, 2025
2 checks passed