
Retry on error #66

Merged
merged 3 commits into main on Feb 15, 2024

Conversation

granawkins
Member

  1. Add retries to the config, to retry generating a script if it raises an exception.
  2. Update the prompt to explain how to handle errors and to let them propagate.
  3. Update all examples with the new prompt and remove try/except blocks.
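The retry behavior described above can be sketched roughly as follows. This is a hedged illustration, not rawdog's actual code: the names `run_with_retries` and `generate_script` are hypothetical, and the error is simply passed back as context for the next attempt.

```python
def run_with_retries(generate_script, retries=2):
    """Call generate_script up to retries+1 times, feeding the previous
    error back in so the model can correct itself (illustrative sketch)."""
    error = None
    for _attempt in range(retries + 1):
        try:
            # On the first attempt error is None; afterwards it carries
            # the message from the failed attempt.
            return generate_script(error)
        except Exception as e:
            error = str(e)
    raise RuntimeError(f"Failed after {retries} retries: {error}")
```

With `retries` read from the config, a script that fails once but succeeds on the second attempt completes without surfacing an error to the user.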

@jakethekoenig jakethekoenig (Member) left a comment

Looks good to me except for my one comment.

@@ -35,9 +36,14 @@ def rawdog(prompt: str, config, llm_client):
                print(message)
        except KeyboardInterrupt:
            error = "Execution interrupted by user"
        except Exception as e:
            error = str(e)

Now that the Python script is run in a subprocess, there aren't many exceptional cases left: basically just litellm errors and dry-run-rejected scripts. In my opinion we shouldn't try to recover from either of them. If the user doesn't accept the edits, instead of throwing an exception we should break from the loop. Presumably if they didn't like the LLM's first attempt they might rather give new information; if they do want to rerun exactly, they can ask the same question again.
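The suggestion above, sketched under assumed names (`interact`, `get_script`, `user_accepts`, and `execute` are all hypothetical, not rawdog's API): on a dry-run rejection, break out of the interaction loop rather than raising, so the user can rephrase instead of watching a retry of the same attempt.

```python
def interact(get_script, user_accepts, execute):
    """Run one generate/confirm/execute cycle; a rejected script exits
    the loop quietly instead of raising (illustrative sketch)."""
    while True:
        script = get_script()
        if not user_accepts(script):
            # User rejected the dry run: stop and let them give new
            # information rather than retrying the same prompt.
            break
        execute(script)
        break
```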

@jakethekoenig jakethekoenig merged commit b466737 into main Feb 15, 2024
2 checks passed