
Releases: cloudbridgeuy/c

2023-08-15T1614

15 Aug 16:14

Release Notes

Documentation

  • Updated main README.md

Refactors

  • The anthropic and openai commands' SessionOptions.model is now optional
  • Model::default() is used if model is None (see the sketch after the Bug Fixes list)

Bug Fixes

  • anthropic, openai, and vertex commands now stop the spinner after receiving the response
  • Added #[serde(rename = "claude-2")] attribute to Model::Claude2
  • Citation fields are now optional
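
A minimal sketch of the optional-model handling described above: the Claude2 variant serializes as "claude-2" and a missing model falls back to Model::default(). SessionOptions is trimmed down to just the model field here, and everything beyond the Claude2 variant and the claude-2 rename is illustrative, not taken from the codebase.

```rust
use serde::{Deserialize, Serialize};

#[derive(Clone, Copy, Debug, Default, Serialize, Deserialize)]
pub enum Model {
    #[default]
    #[serde(rename = "claude-2")]
    Claude2,
}

#[derive(Clone, Debug, Default, Serialize, Deserialize)]
pub struct SessionOptions {
    pub model: Option<Model>,
}

fn main() {
    let options = SessionOptions::default(); // model is None here
    let model = options.model.unwrap_or_default(); // falls back to Model::default()
    println!("{}", serde_yaml::to_string(&model).unwrap()); // prints: claude-2
}
```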

Dependencies

  • No changes

2023-08-15T1513

15 Aug 15:13

Release Notes

Bug Fixes

  • anthropic, openai, and vertex commands now stop the spinner after receiving the response
  • Added #[serde(rename = "claude-2")] attribute to Model::Claude2
  • Citation fields are now optional

Documentation

  • Updated c README.md
  • Updated main README.md

Refactors

  • RequestOptions.instances is now built from Session.options.context instead of message history
  • Removed logic to map Message.Role to VertexMessage.Author
  • trim_messages() now only trims if total token count exceeds max and message count is odd
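
A hedged sketch of the trimming rule in the last item: trim only when the total token count exceeds the maximum AND the number of messages is odd. The token estimate and the choice of which message gets dropped are assumptions; the notes do not specify either.

```rust
#[derive(Debug)]
struct Message {
    content: String,
}

/// Very rough token estimate used only for this sketch (~4 characters per token).
fn estimate_tokens(message: &Message) -> usize {
    message.content.len() / 4
}

/// Only trim when the history is over budget AND the message count is odd.
fn trim_messages(messages: &mut Vec<Message>, max_tokens: usize) {
    let total_tokens: usize = messages.iter().map(estimate_tokens).sum();
    if total_tokens > max_tokens && messages.len() % 2 == 1 {
        messages.remove(0); // drop the oldest message (assumed strategy)
    }
}

fn main() {
    let mut history = vec![
        Message { content: "a".repeat(4000) },
        Message { content: "b".repeat(4000) },
        Message { content: "c".repeat(4000) },
    ];
    trim_messages(&mut history, 2000);
    assert_eq!(history.len(), 2); // over budget and odd count, so one message was dropped
}
```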

Dependencies

  • No changes

2023-08-15T1257

15 Aug 12:58

Release Notes

New Features

  • Added Vertex subcommand and commands::vertex module
  • Added Model enum to represent Vertex AI models
  • Added RequestOptions, SessionOptions, Candidate, Citation, CitationMeta, SafetyAttributes, Prediction, TokenCount, TokenMetadata, Metadata and Response structs for the Vertex API
  • Added parse_temperature, parse_top_k and parse_top_p functions to parse float command line arguments
  • Added run() function to handle the vertex subcommand
  • Added complete_messages() and trim_messages() helper functions
  • Added merge_options() function to merge command options into session options
  • Added complete() function to send a completion request to the Vertex API
  • Added print_output() function to print the Vertex API response
  • Added a pub fn anonymous(vendor: Vendor, options: T, max_supported_tokens: u32) -> Session function to Session
  • anonymous() generates a random ID, sets the path to ~/.c/sessions/anonymous/{id}.yaml, and returns a Session (see the sketch after this list)
  • Added system option to Session
  • Added logic to handle system option
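
Session::anonymous() might look roughly like the sketch below. The notes only give the signature and the path format; the generic options type, the Vendor stand-in, the field names, and the use of the ulid crate for the random ID (Ulid is mentioned elsewhere in these notes) are assumptions.

```rust
use ulid::Ulid;

#[derive(Clone, Debug, Default)]
pub struct Vendor; // stand-in for the real Vendor enum

#[derive(Clone, Debug)]
pub struct Session<T> {
    pub id: String,
    pub path: String,
    pub vendor: Vendor,
    pub options: T,
    pub max_supported_tokens: u32,
}

impl<T> Session<T> {
    /// Build a throwaway session with a random ID whose state lives at
    /// ~/.c/sessions/anonymous/{id}.yaml.
    pub fn anonymous(vendor: Vendor, options: T, max_supported_tokens: u32) -> Session<T> {
        let id = Ulid::new().to_string();
        let home = std::env::var("HOME").unwrap_or_default();
        let path = format!("{home}/.c/sessions/anonymous/{id}.yaml");
        Session { id, path, vendor, options, max_supported_tokens }
    }
}
```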

Bug Fixes

  • Removed logic to map Message.Role to VertexMessage.Author
  • trim_messages() now only trims if total token count exceeds max and message count is odd
  • merge_options() now always sets the endpoint field
  • SessionOptions.endpoint is now optional with a default
  • Added context field to SessionOptions and CommandOptions
  • merge_options() now sets the context field
  • complete() now handles the optional endpoint field
  • Made Role enum lowercase
  • Added as_str() method to Model enum
  • Changed model option type to Model enum
  • Changed OpenAI API URL
  • Updated complete() and complete_stream() methods

Refactors

  • RequestOptions.instances is now built from Session.options.context instead of message history
  • Vendor now implements Clone + Default
  • Meta now implements Clone + Default
  • Session now implements Clone + Default
  • Removed unused use ulid::Ulid; imports from anthropic.rs and openai.rs
  • Changed let session: Session = Session::new(Ulid::new().to_string(), ...); to let session: Session = Session::anonymous(...); in run() for both anthropic.rs and openai.rs
  • Changed model: String to model: Model in RequestOptions
  • Added model: Option<Model> to SessionOptions
  • Changed model: options.model.to_string() to model: Some(options.model) in the From impl for SessionOptions
  • Changed session.options.model = options.model.unwrap().as_str().to_string(); to session.options.model = options.model; in merge_options()
  • Changed model: session.options.model.to_string() to model: session.options.model.unwrap_or_default() in complete_stream() and complete()
  • Added let max_tokens_to_sample = session.options.max_tokens_to_sample.unwrap_or(1000); and used that instead of session.options.max_tokens_to_sample in complete_stream() and complete()
  • Added SessionOptions struct for session options
  • Added a From conversion impl for SessionOptions
  • Changed Session to Session
  • Replaced all uses of last_request_options with options
  • Added SessionOptions to the Session struct
  • Removed commented out SessionOptions struct
  • Removed println!("options: {:?}", options); from impl From<CommandOptions> for RequestOptions
  • openai command now creates a Session instead of a custom struct
  • Session stores request options, metadata, and message history
  • Session can be loaded from and saved to the filesystem (see the persistence sketch after this list)
  • Completion prompt is built from the session history
  • Message.Role.Human is mapped to CompletionMessage.Role.User
  • Added complete_messages() to trim messages to max supported tokens
  • Removed Options, CompleteRequestBody, HistoryMessage, and trim_messages()
  • Added Session, Message, Vendor, Role, and Meta structs
  • anthropic command now creates a Session instead of a custom struct
  • Session stores request options, metadata, and message history
  • Session can be loaded/saved to/from the filesystem
  • Completion prompt is built from the session history
  • No variable renames
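
A minimal sketch of the session persistence mentioned in this list, assuming serde_yaml since sessions are stored as .yaml files; the struct fields and function names are placeholders, not the real ones.

```rust
use color_eyre::eyre::Result;
use serde::{Deserialize, Serialize};

#[derive(Clone, Debug, Default, Serialize, Deserialize)]
pub struct Session {
    pub id: String,
    pub history: Vec<String>, // stand-in for the real Message type
}

impl Session {
    /// Load a session from a YAML file on disk.
    pub fn load(path: &str) -> Result<Session> {
        let contents = std::fs::read_to_string(path)?;
        Ok(serde_yaml::from_str(&contents)?)
    }

    /// Write the session back to disk as YAML.
    pub fn save(&self, path: &str) -> Result<()> {
        std::fs::write(path, serde_yaml::to_string(self)?)?;
        Ok(())
    }
}
```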

Dependencies

  • No changes

2023-08-08T0129

08 Aug 01:29

Release notes for 2023-08-08T0129

New Features

  • Added system option to Session

Bug Fixes

  • Made the Role enum lowercase
  • Added an as_str() method to the Model enum (both are sketched after this list)
  • Changed model option type to Model enum
  • Changed OpenAI API URL
  • Updated complete() and complete_stream() methods
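
A hedged sketch of the two changes above, assuming "lowercase" refers to serde's lowercase renaming of the variants; the Role and Model variant names shown are hypothetical.

```rust
use serde::{Deserialize, Serialize};

#[derive(Clone, Copy, Debug, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum Role {
    System,
    User,
    Assistant,
}

#[derive(Clone, Copy, Debug, Default)]
pub enum Model {
    #[default]
    Gpt35Turbo, // hypothetical variant names
    Gpt4,
}

impl Model {
    pub fn as_str(&self) -> &'static str {
        match self {
            Model::Gpt35Turbo => "gpt-3.5-turbo",
            Model::Gpt4 => "gpt-4",
        }
    }
}
```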

Refactors

  • Moved all the openai code into a single file
  • Moved all the anthropic code into a single file

Release 2023-08-04T1330

04 Aug 13:30

Release Notes

v1.1.0

Features

  • Added support for Claude v2 and the new Anthropic streaming API.
  • Added the Claude2 model enum variant.
  • Started working on the OpenAI chat complete command.

Fixes

  • Handled stop reasons in streaming responses.
  • Made max_tokens_to_sample required and changed type to u32.
  • Defaulted max_supported_tokens to 4096 and calculated max_tokens_to_sample if not provided (see the sketch after this list).
  • Removed spinner logic from openai.rs.
  • Removed truncated field from Chunk.
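
A minimal sketch of the defaulting logic described above. Only the 4096 default comes from the notes; computing the remaining budget as "supported tokens minus prompt tokens" is an assumption.

```rust
/// Default max_supported_tokens to 4096 and derive max_tokens_to_sample from
/// it when the latter is not provided.
fn resolve_max_tokens_to_sample(
    max_supported_tokens: Option<u32>,
    max_tokens_to_sample: Option<u32>,
    prompt_tokens: u32,
) -> u32 {
    let max_supported = max_supported_tokens.unwrap_or(4096);
    max_tokens_to_sample.unwrap_or_else(|| max_supported.saturating_sub(prompt_tokens))
}

fn main() {
    // With no explicit options and a 1000-token prompt, up to 3096 tokens may be sampled.
    assert_eq!(resolve_max_tokens_to_sample(None, None, 1000), 3096);
}
```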

Refactors

  • Updated CompleteCreateCommand in anthropic.rs.
  • Cleaned up the anthropic crate by removing unused code.

v1.0.0

01 Aug 00:54

Release Notes

First version of my new AI CLI: c! 🚀

v1.0.0

Features

  • Added streaming support to the anthropic crate and c command.
  • Added a spinner to show progress while waiting on streaming responses (see the sketch after this list).
  • Started working on a new version of the CLI.
  • Handled streaming in the CLI to properly handle OpenAI streaming responses.
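
A rough sketch of the spinner behavior described above, assuming a crate like indicatif; the notes do not say which spinner implementation c actually uses, so treat the details as illustrative.

```rust
use std::time::Duration;

use indicatif::ProgressBar;

/// Run `work` while a spinner ticks, then clear the spinner.
fn with_spinner<T>(work: impl FnOnce() -> T) -> T {
    let spinner = ProgressBar::new_spinner();
    spinner.set_message("Waiting for the model...");
    spinner.enable_steady_tick(Duration::from_millis(100));
    let result = work();
    // Stop the spinner once the response has been received.
    spinner.finish_and_clear();
    result
}

fn main() {
    let answer = with_spinner(|| {
        // Placeholder for the real streaming request.
        std::thread::sleep(Duration::from_secs(1));
        "done"
    });
    println!("{answer}");
}
```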

Refactors

  • Cleaned up the anthropic crate by removing unused code and making some fields optional.
  • Improved the usage of sessions to define completion behavior.
  • Moved most logic into the c crate.
  • Replaced anyhow with color_eyre and log with tracing in the anthropic crate.

Fixes

  • Applied the changes made to the other services.

Chores

  • Added the tracing dependency.
  • Added bacon to the project.

2023-07-01T2342

01 Jul 23:42

Release notes for 2023-07-01T2342

2023-06-25T2348

25 Jun 23:48

Release notes for 2023-06-25T2348

2023-06-14T1306

14 Jun 13:06

Release notes for 2023-06-14T1306

2023-06-09T2037

09 Jun 20:37

Release Notes

New Features

  • Integrate the anthropic module to enable conversations with Claude
  • Add the anthropic case to interact with Claude
  • Add messages to ChatsApi struct to store conversation history with OpenAI
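
A hedged sketch of keeping conversation history on the ChatsApi struct; apart from the ChatsApi name and the messages field, every type, field, and method below is an assumption.

```rust
#[derive(Clone, Debug)]
pub struct ChatMessage {
    pub role: String,    // hypothetical: e.g. "user" or "assistant"
    pub content: String,
}

#[derive(Clone, Debug, Default)]
pub struct ChatsApi {
    pub api_key: String,            // hypothetical field
    pub messages: Vec<ChatMessage>, // conversation history kept with the session
}

impl ChatsApi {
    /// Append a message so later requests can send the full history.
    pub fn push_message(&mut self, role: &str, content: &str) {
        self.messages.push(ChatMessage {
            role: role.to_string(),
            content: content.to_string(),
        });
    }
}
```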

Dependencies

  • Add anthropic and anyhow as new dependencies

Breaking Changes

  • The b chats create command no longer uses a default model if the model option is not provided. For new chat sessions, you must specify the model to use. Existing chat sessions in the sessions file must be updated to include the model field.

Commits

  • (ad134be) feat(b): integrate anthropic to b
  • (60012a1) feat(anthropic): create the anthropic case to interact with Claude
  • (45ab11d) feat(openai): add messages to ChatsApi struct to store conversation history
  • (8e2a7a2) chore: add new dependencies anthropic and anyhow to Cargo.lock file
  • (a5ca6a2) feat: add the anthropic module