
Ollama run error #1547

Closed
unsiao opened this issue Dec 2, 2024 · 0 comments

Comments


unsiao commented Dec 2, 2024

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\unsia\anaconda3\Scripts\interpreter.exe\__main__.py", line 7, in <module>
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 612, in main
    start_terminal_interface(interpreter)
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 471, in start_terminal_interface
    interpreter = profile(
                  ^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\terminal_interface\profiles\profiles.py", line 64, in profile
    return apply_profile(interpreter, profile, profile_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\terminal_interface\profiles\profiles.py", line 148, in apply_profile
    exec(profile["start_script"], scope, scope)
  File "<string>", line 1, in <module>
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\core.py", line 145, in local_setup
    self = local_setup(self)
           ^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\terminal_interface\local_setup.py", line 314, in local_setup
    interpreter.computer.ai.chat("ping")
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\computer\ai\ai.py", line 134, in chat
    for chunk in self.computer.interpreter.llm.run(messages):
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\llm\llm.py", line 86, in run
    self.load()
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\llm\llm.py", line 397, in load
    self.interpreter.computer.ai.chat("ping")
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\computer\ai\ai.py", line 134, in chat
    for chunk in self.computer.interpreter.llm.run(messages):
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\llm\llm.py", line 322, in run
    yield from run_tool_calling_llm(self, params)
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\llm\run_tool_calling_llm.py", line 178, in run_tool_calling_llm
    for chunk in llm.completions(**request_params):
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\llm\llm.py", line 466, in fixed_litellm_completions
    raise first_error  # If all attempts fail, raise the first error
    ^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\site-packages\interpreter\core\llm\llm.py", line 443, in fixed_litellm_completions
    yield from litellm.completion(**params)
  File "C:\Users\unsia\anaconda3\Lib\site-packages\litellm\llms\ollama.py", line 455, in ollama_completion_stream
    raise e
  File "C:\Users\unsia\anaconda3\Lib\site-packages\litellm\llms\ollama.py", line 433, in ollama_completion_stream
    function_call = json.loads(response_content)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\unsia\anaconda3\Lib\json\decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
               ^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 2 (char 1)

This error message indicates a failure while parsing JSON data. Specifically, `json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 2 (char 1)` means the JSON string was not properly terminated.

According to the stack trace, the error occurs where the LLM (Large Language Model) response is handled, specifically when the response content is parsed as a JSON object (`function_call = json.loads(response_content)` in litellm's `ollama.py`). Likely causes include:

  1. Malformed data from the server: the LLM service may have returned data that is not valid JSON.
  2. Data lost or corrupted in transit: packets may have been dropped or damaged while streaming the response from the LLM service.
  3. A client-side bug: the code that receives and parses the response may contain a logic error (e.g., parsing a partial chunk of a streamed response).
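The failure mode is easy to reproduce in isolation: feeding `json.loads` a JSON object whose string value is cut off mid-stream raises exactly the error shown in the traceback. This is a minimal illustration, not the actual payload Ollama returned:

```python
import json

# A truncated payload, as might happen if a streamed response is cut off
# after the opening quote of a key.
truncated = '{"name'

try:
    json.loads(truncated)
except json.JSONDecodeError as e:
    # Matches the traceback: Unterminated string starting at: line 1 column 2 (char 1)
    print(e)
```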

To diagnose the problem further, try the following steps:

  • Check the LLM service's logs to confirm it is running normally and returning well-formed JSON.
  • Add debug output that prints the raw response content, so its format can be checked against expectations.
  • Confirm the network connection is stable, to rule out interrupted or corrupted transfers.
  • Review the client code to make sure all request and response handling logic is correct.
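As a sketch of the second bullet, a hypothetical wrapper around `json.loads` could log the raw payload when it is malformed instead of crashing (litellm's actual code in `ollama.py` calls `json.loads` directly, so this would have to be patched in for debugging):

```python
import json
import logging

def parse_tool_call(response_content: str):
    """Parse a model response expected to be a JSON tool call.

    Hypothetical debugging helper: on malformed input it logs the raw
    payload and returns None rather than raising JSONDecodeError.
    """
    try:
        return json.loads(response_content)
    except json.JSONDecodeError as e:
        logging.error("Malformed JSON from LLM: %r (%s)", response_content, e)
        return None
```

Seeing the logged raw content makes it clear whether the server sent non-JSON text or the stream was truncated mid-string.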
@unsiao unsiao closed this as completed Dec 2, 2024