
GraphRag #5

Open
aviranmz opened this issue Jul 17, 2024 · 4 comments

@aviranmz

I started LiteLLM:

INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:4000 (Press CTRL+C to quit)
INFO: 127.0.0.1:55848 - "GET / HTTP/1.1" 200 OK
INFO: 127.0.0.1:55848 - "GET /openapi.json HTTP/1.1" 200 OK
INFO: 127.0.0.1:55850 - "GET /models HTTP/1.1" 200 OK

Also, on http://localhost:4000/models I get:

{"data":[{"id":"ollama/llama3:latest","object":"model","created":1677610602,"owned_by":"openai"}],"object":"list"}

Then in the UI I enter only "hello", and I get this:

2024-07-17 22:29:37 - Retrying request to /chat/completions in 0.906669 seconds
2024-07-17 22:29:38 - Retrying request to /chat/completions in 1.630011 seconds
2024-07-17 22:29:39 - Connection error.
Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
yield
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
resp = self._pool.handle_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
raise exc from None
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
response = connection.handle_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_sync/http_proxy.py", line 207, in handle_request
return self._connection.handle_request(proxy_request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
raise exc
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
stream = self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 122, in _connect
stream = self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
with map_exceptions(exc_map):
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/contextlib.py", line 158, in exit
self.gen.throw(value)
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 8] nodename nor servname provided, or not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 978, in _request
response = self._client.send(
^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
with map_httpcore_exceptions():
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/contextlib.py", line 158, in exit
self.gen.throw(value)
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 8] nodename nor servname provided, or not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/chainlit/utils.py", line 44, in wrapper
return await user_function(**params_values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/aviranm/workspace/autogen_graphRAG/autogen_graphRAG/appUI.py", line 165, in run_conversation
await cl.make_async(user_proxy.initiate_chat)( manager, message=CONTEXT, )
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/asyncer/_main.py", line 358, in wrapper
return await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/asyncio/futures.py", line 287, in await
yield self # This tells Task to wait for completion.
^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
future.result()
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/asyncio/futures.py", line 203, in result
raise self._exception.with_traceback(self._exception_tb)
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1018, in initiate_chat
self.send(msg2send, recipient, silent=silent)
File "/Users/aviranm/workspace/autogen_graphRAG/autogen_graphRAG/utils/chainlit_agents.py", line 76, in send
super(ChainlitUserProxyAgent, self).send(
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 655, in send
recipient.receive(message, self, request_reply, silent)
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 818, in receive
reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1972, in generate_reply
final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/groupchat.py", line 1052, in run_chat
reply = speaker.generate_reply(sender=self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1972, in generate_reply
final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1340, in generate_oai_reply
extracted_response = self._generate_oai_reply_from_client(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1359, in _generate_oai_reply_from_client
response = llm_client.create(
^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/oai/client.py", line 722, in create
response = client.create(params)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/autogen/oai/client.py", line 320, in create
response = completions.create(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 643, in create
return self._post(
^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 1266, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 942, in request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 1002, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 1079, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 1002, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 1079, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniforge/base/envs/RAG_agents/lib/python3.12/site-packages/openai/_base_client.py", line 1012, in _request
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
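
For context, "[Errno 8] nodename nor servname provided, or not known" is macOS's getaddrinfo (name resolution) failure: the client never reached LiteLLM because the hostname in the request could not be resolved. A minimal sketch reproducing the same error class, using a deliberately misspelled hostname purely as an illustration:

import socket

# Name resolution of an unknown host raises the same EAI_NONAME error
# that httpx/httpcore surface above as ConnectError / APIConnectionError.
try:
    socket.getaddrinfo("localhose", 4000)  # hypothetical misspelled host
except socket.gaierror as e:
    print(e)  # [Errno 8] nodename nor servname provided, or not known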

aviranmz changed the title from "Readme" to "GraphRag" on Jul 22, 2024
@karthik-codex (Owner)

Hey sorry, I couldn't get back to this sooner. Did you get a chance to fix this?

@punit461

Stuck on the same thing.

Could this be due to running all of this on CPU?

I have tried running agents for different projects multiple times and have not found anything that confirms my theory, but with every agentic approach I run into this problem.

@hemanthaar

I am running into this issue now and haven't been able to solve it myself. Do you have any suggestions?

@hemanthaar

If you change the IP address from 0.0.0.0:4000 to localhost:4000 in the appUI.py file for the autogen LLM config, it will work again:

# Llama3 LLM from the LiteLLM server for the agents
llm_config_autogen = {
    "seed": 42,  # change the seed for different trials
    "temperature": 0,
    "config_list": [
        {
            "model": "litellm",
            "base_url": "http://localhost:4000/",
            "api_key": "ollama",
        },
    ],
    "timeout": 60000,
}
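
Before wiring the config into autogen, a quick way to confirm the endpoint is reachable is to call the LiteLLM proxy directly with the OpenAI client (a minimal sketch; the base_url, api_key, and model id mirror the config and the GET /models output above):

from openai import OpenAI

# Point the OpenAI client at the local LiteLLM proxy.
client = OpenAI(base_url="http://localhost:4000", api_key="ollama")

response = client.chat.completions.create(
    model="ollama/llama3:latest",  # model id reported by GET /models
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)

If this succeeds but the UI still fails, the problem is in the autogen config rather than the LiteLLM server.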
