How can we use o1 or o1-mini with instructor?
When I tried, I received:
```
Error code: 404 - {'error': {'message': 'tools is not supported in this model. For a list of supported models, refer to https://platform.openai.com/docs/guides/function-calling#models-supporting-function-calling.', 'type': 'invalid_request_error', 'param': None, 'code': None}}
```
This is my code:

```python
# for exponential backoff
import instructor
from langfuse.openai import OpenAI
from pydantic import BaseModel
from tenacity import (
    retry,
    stop_after_attempt,
    wait_random_exponential,
)

from retrieval.core.interfaces import BasePromptTemplate, LLMInterface


class Gpt(LLMInterface):
    def __init__(self, model: str):
        self.model = model
        # Wrap the Langfuse-instrumented OpenAI client with instructor
        # (defaults to tool/function calling mode).
        self.llm = instructor.from_openai(OpenAI())

    @retry(wait=wait_random_exponential(min=1, max=60), stop=stop_after_attempt(6))
    def completion_with_backoff(self, llm, **kwargs):
        return llm.chat.completions.create(**kwargs)

    def get_answer(
        self,
        prompt: BasePromptTemplate,
        formatted_instruction: BaseModel,
        temperature=0,
        *args,
        **kwargs,
    ):
        formatted_prompt = prompt.create_template(*args, **kwargs)
        answer = self.completion_with_backoff(
            llm=self.llm,
            model=self.model,
            temperature=temperature,
            response_model=formatted_instruction,
            messages=[
                {
                    "role": "user",
                    "content": formatted_prompt,
                },
            ],
        )
        if formatted_instruction:
            return answer.dict()
        else:
            return answer.choices[0].message.content
```
Thanks, Alex
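
For context on the 404: instructor defaults to OpenAI tool calling (`Mode.TOOLS`), and o1 and o1-mini do not accept the `tools` parameter. A minimal sketch of a possible workaround, assuming a recent instructor release that ships the o1-oriented JSON mode (`instructor.Mode.JSON_O1`; if your version predates it, `instructor.Mode.MD_JSON` may be worth trying instead):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel

# Assumption: instructor.Mode.JSON_O1 exists in the installed instructor
# version. It embeds the response schema in the prompt instead of sending
# the `tools` parameter that o1/o1-mini reject. On older releases, try
# instructor.Mode.MD_JSON instead.
client = instructor.from_openai(OpenAI(), mode=instructor.Mode.JSON_O1)


class Answer(BaseModel):
    text: str


# Depending on the model snapshot, the o1 family may also reject system
# messages and non-default temperature, so neither is set here.
answer = client.chat.completions.create(
    model="o1-mini",
    response_model=Answer,
    messages=[{"role": "user", "content": "Reply with a short greeting."}],
)
print(answer.text)
```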