Replies: 14 comments 6 replies
-
no goals for assistant, my money is on Marvin AI doing a great job
-
I would also love for the Assistants API to be supported
-
what is the application you're looking for?
-
Developing a full application is much easier with the Assistants API because OpenAI does all the state management for you. The application only needs to keep the entity ids: assistant_id, thread_id, run_id. All entities also support "metadata", which can store additional application-specific and situational attributes. This is perfect for serverless apps. I am working on a multi-agent system where each agent is an Assistant. These Assistants share the same Thread, so I can switch Assistants based on the current state of the Thread.
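The switching idea above can be sketched as a small router that keeps only ids in app state and picks which Assistant runs next on the shared thread. `ASSISTANTS`, the placeholder ids, and `route_assistant` are all illustrative names for this sketch, not part of any real API:

```python
# Hypothetical sketch: the app stores only entity ids, and a tiny router
# decides which Assistant should handle the next run on the shared thread.
ASSISTANTS = {
    "triage": "asst_triage_id",    # placeholder ids, not real ones
    "billing": "asst_billing_id",
    "support": "asst_support_id",
}

def route_assistant(last_message: str) -> str:
    """Return the assistant_id to run next, based on the thread's last message."""
    text = last_message.lower()
    if "invoice" in text or "refund" in text:
        return ASSISTANTS["billing"]
    if "error" in text or "crash" in text:
        return ASSISTANTS["support"]
    return ASSISTANTS["triage"]

# The chosen id would then be passed to
# client.beta.threads.runs.create(thread_id=thread_id,
#                                 assistant_id=route_assistant(last_message))
```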
-
right, but what about
-
The current functionality of Instructor is super useful. If the same functionality (structured data extraction, etc.) were ported to the Assistants API, it could be used for extracting structured data in apps that are already built on the Assistants API.
-
@jxnl For me, the application is replacing prompts like "Choose one of the options below. Include no other text in your answer". This proves unreliable, especially for models below GPT-4: the LLM often adds extra detail, breaking the expected output structure. For this type of classification problem, which I imagine is a common use case for the new Assistants API, having a library that enforces structured output would be amazing!
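The enforcement described above can be sketched with a Pydantic model whose field is a closed `Literal` set, which is the same validation pattern instructor applies to chat completions. `TicketLabel` and its label values are illustrative names for this sketch, not part of instructor's API:

```python
# Hypothetical sketch: a closed label set enforced at parse time.
# Tool-call arguments arrive as a JSON string; validation rejects any
# answer outside the allowed labels instead of letting extra prose through.
from typing import Literal

from pydantic import BaseModel, ValidationError

class TicketLabel(BaseModel):
    label: Literal["bug", "feature_request", "question"]

# A well-formed answer parses cleanly.
ok = TicketLabel.model_validate_json('{"label": "bug"}')
assert ok.label == "bug"

# A chatty answer fails fast rather than silently breaking downstream code.
rejected = False
try:
    TicketLabel.model_validate_json('{"label": "Sure! The answer is bug."}')
except ValidationError:
    rejected = True
assert rejected
```

With instructor-style retries, the validation error can also be fed back to the model so it corrects itself instead of the application crashing.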
-
I think the chat API will be deprecated in the next few months; as the name implies, it was initially intended for a narrow use case. A simple implementation could hook in at Assistant creation/update time and at run creation time, by adding the schema as a tool and an instruction.
-
😭 this is so much responsibility. Open to some sketch PRs!
-
I have played around with assistants and have modified instructor locally to have some things (e.g. something similar to
-
@covitof I'd be interested in seeing that if you're able to show us
-
Basically I've used the assistants as described in the documentation, and then, once I get the run back, I call this classmethod:

```python
@classmethod
def from_run(
    cls,
    run: Run,
    tool_index: int = 0,
    validation_context=None,
    strict: bool = None,
):
    """Execute the function from the result of an assistant run.

    Parameters:
        run (Run): The result object of an assistant run
        tool_index (int): The index of the tool call to execute
        validation_context (dict): The validation context to use for validating the response
        strict (bool): Whether to use strict JSON parsing

    Returns:
        cls (OpenAISchema): An instance of the class
    """
    submit_tool_outputs = run.required_action.submit_tool_outputs
    tool_call = submit_tool_outputs.tool_calls[tool_index]
    assert (
        tool_call.function.name == cls.openai_schema["name"]
    ), "Tool name does not match"
    return cls.model_validate_json(
        tool_call.function.arguments,
        context=validation_context,
        strict=strict,
    )
```

The type …
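For reference, the attribute path `from_run` walks can be mirrored with plain dataclasses. These stand-ins are purely illustrative, not the real `openai` types:

```python
# Hypothetical stand-ins mirroring the shape of the Run object that
# from_run() consumes:
#   run.required_action.submit_tool_outputs.tool_calls[i].function.{name, arguments}
from dataclasses import dataclass

@dataclass
class FunctionCall:
    name: str
    arguments: str  # JSON string of the tool-call arguments

@dataclass
class ToolCall:
    function: FunctionCall

@dataclass
class SubmitToolOutputs:
    tool_calls: list

@dataclass
class RequiredAction:
    submit_tool_outputs: SubmitToolOutputs

@dataclass
class FakeRun:
    required_action: RequiredAction

# A run that from_run(run, tool_index=0) could parse, assuming the schema
# class's tool is named "UserDetail":
run = FakeRun(
    RequiredAction(
        SubmitToolOutputs(
            [ToolCall(FunctionCall("UserDetail", '{"name": "James", "age": 28}'))]
        )
    )
)
tool_call = run.required_action.submit_tool_outputs.tool_calls[0]
```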
-
I wonder if just using create_and_run would work:

```python
import time

from instructor import OpenAISchema, patch, Mode
from openai import OpenAI

client = patch(OpenAI(), Mode.TOOLS)

class UserDetail(OpenAISchema):
    age: int
    name: str

# create assistant
assistant = client.beta.assistants.create(
    name="User Detail Assistant",
    description="You are great at extracting name and age of users.",
    model="gpt-3.5-turbo-1106",
)

run = client.beta.threads.create_and_run(
    assistant_id=assistant.id,
    instructions="This person is called James and is 28 years old.",
    tools=[{"type": "function", "function": UserDetail.openai_schema}],
)

sec = 0
while run.status == "queued" or run.status == "in_progress":
    run = client.beta.threads.runs.retrieve(
        thread_id=run.thread_id,
        run_id=run.id,
    )
    sec += 1
    print(f"Waiting for run to complete ({sec}s)", run.status)
    time.sleep(1)

print(UserDetail.from_run(run))

# cancel since there is no need to wait for the run to complete,
# then clean up the thread and assistant
client.beta.threads.runs.cancel(thread_id=run.thread_id, run_id=run.id)
client.beta.threads.delete(thread_id=run.thread_id)
client.beta.assistants.delete(assistant_id=assistant.id)
```

Possibly you could wrap the above logic in a single function, extract(content: str, schema: OpenAISchema). An async version would also help, since you have to wait for the run status to transition to "requires_action".
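The polling part of such a wrapper could be factored out as below. This is a sketch with the client injected so the loop can be exercised without hitting the API; `wait_for_action` is an illustrative name, not part of instructor:

```python
# Hypothetical polling helper: blocks until the run leaves the
# queued/in_progress states (e.g. reaches "requires_action"), with a timeout
# so a stuck run cannot hang the caller forever.
import time

def wait_for_action(client, thread_id, run_id, poll=1.0, timeout=60.0):
    """Poll a run until it leaves queued/in_progress, then return it."""
    deadline = time.monotonic() + timeout
    while True:
        run = client.beta.threads.runs.retrieve(
            thread_id=thread_id, run_id=run_id
        )
        if run.status not in ("queued", "in_progress"):
            return run
        if time.monotonic() > deadline:
            raise TimeoutError(f"run {run_id} still {run.status} after {timeout}s")
        time.sleep(poll)
```

An async variant would look the same with `await asyncio.sleep(poll)` and an async client, which matters once several runs are in flight at once.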
-
Any update about Assistants API integration?
-
What are your thoughts on supporting the Assistants API? Parallel function calls are one advantage.
I feel the API may change over the next few months, so maybe we should wait.