Releases: cloudbridgeuy/c
2023-08-15T1614
Release Notes
Documentation
- Updated main README.md
Refactors
- The anthropic and openai commands' SessionOptions.model is now optional
- Model::default() is used if model is None
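The optional-model fallback described above can be sketched as follows. This is not the project's actual code; the enum variants and the resolve_model() helper are illustrative stand-ins for how Model::default() fills in a missing model.

```rust
// Sketch: an optional model that falls back to Model::default().
// Variant names are illustrative, not taken from the source.
#[derive(Debug, Clone, Copy, PartialEq, Default)]
enum Model {
    #[default]
    Claude2,
    ClaudeInstant1,
}

fn resolve_model(model: Option<Model>) -> Model {
    // If the session did not specify a model, use the default.
    model.unwrap_or_default()
}

fn main() {
    assert_eq!(resolve_model(None), Model::Claude2);
    assert_eq!(resolve_model(Some(Model::ClaudeInstant1)), Model::ClaudeInstant1);
    println!("ok");
}
```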
Bug Fixes
- anthropic, openai, and vertex commands now stop the spinner after receiving the response
- Added #[serde(rename = "claude-2")] attribute to Model::Claude2
- Citation fields are now optional
Dependencies
- No changes
2023-08-15T1513
Release Notes
Bug Fixes
- anthropic, openai, and vertex commands now stop the spinner after receiving the response
- Added #[serde(rename = "claude-2")] attribute to Model::Claude2
- Citation fields are now optional
Documentation
- Updated c README.md
- Updated main README.md
Refactors
- RequestOptions.instances is now built from Session.options.context instead of message history
- Removed logic to map Message.Role to VertexMessage.Author
- trim_messages() now only trims if total token count exceeds max and message count is odd
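The trim_messages() condition above can be sketched like this. The Message struct and the token estimate are simplified stand-ins (the real code presumably uses a proper tokenizer); only the gating logic — trim when over the token budget and the message count is odd — is taken from the note.

```rust
// Sketch of the trim_messages() gate described above: trim only while the
// total token count exceeds the maximum AND the message count is odd.
#[derive(Debug, Clone)]
struct Message {
    content: String,
}

fn token_count(m: &Message) -> u32 {
    // Crude stand-in for a real tokenizer: ~1 token per 4 characters.
    (m.content.len() as u32 / 4).max(1)
}

fn trim_messages(messages: &mut Vec<Message>, max_tokens: u32) {
    loop {
        let total: u32 = messages.iter().map(token_count).sum();
        if total > max_tokens && messages.len() % 2 == 1 {
            messages.remove(0); // drop the oldest message first
        } else {
            break;
        }
    }
}

fn main() {
    let mut msgs: Vec<Message> = (0..3)
        .map(|_| Message { content: "x".repeat(40) })
        .collect();
    trim_messages(&mut msgs, 8);
    // One message dropped; the loop stops once the count is even.
    assert_eq!(msgs.len(), 2);
    println!("ok");
}
```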
Dependencies
- No changes
2023-08-15T1257
Release Notes
New Features
- Added Vertex subcommand and commands::vertex module
- Added Model enum to represent Vertex AI models
- Added RequestOptions, SessionOptions, Candidate, Citation, CitationMeta, SafetyAttributes, Prediction, TokenCount, TokenMetadata, Metadata and Response structs for the Vertex API
- Added parse_temperature, parse_top_k and parse_top_p functions to parse float command line arguments
- Added run() function to handle the vertex subcommand
- Added complete_messages() and trim_messages() helper functions
- Added merge_options() function to merge command options into session options
- Added complete() function to send a completion request to the Vertex API
- Added print_output() function to print the Vertex API response
- Added pub fn anonymous(vendor: Vendor, options: T, max_supported_tokens: u32) -> Session function to Session
- anonymous() function generates a random ID, sets the path to ~/.c/sessions/anonymous/{id}.yaml and returns a Session
- Added system option to Session
- Added logic to handle the system option
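A float-parsing helper like the parse_temperature() mentioned above might look like the sketch below. The accepted range (0.0 to 1.0) is an assumption for illustration, not taken from the source; the Result<f32, String> signature matches the shape clap-style value parsers expect.

```rust
// Hypothetical sketch of a parse_temperature()-style CLI argument parser:
// parse the string as f32 and validate an (assumed) 0.0..=1.0 range.
fn parse_temperature(s: &str) -> Result<f32, String> {
    let value: f32 = s
        .parse()
        .map_err(|_| format!("`{s}` is not a valid number"))?;
    if !(0.0..=1.0).contains(&value) {
        return Err(format!("temperature must be between 0.0 and 1.0, got {value}"));
    }
    Ok(value)
}

fn main() {
    assert_eq!(parse_temperature("0.7"), Ok(0.7));
    assert!(parse_temperature("abc").is_err());
    assert!(parse_temperature("1.5").is_err());
    println!("ok");
}
```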
Bug Fixes
- Removed logic to map Message.Role to VertexMessage.Author
- trim_messages() now only trims if total token count exceeds max and message count is odd
- merge_options() now always sets the endpoint field
- SessionOptions.endpoint is now optional with a default
- Added context field to SessionOptions and CommandOptions
- merge_options() now sets the context field
- complete() now handles the optional endpoint field
- Made Role enum lowercase
- Added as_str() method to Model enum
- Changed model option type to Model enum
- Changed OpenAI API URL
- Updated complete() and complete_stream() methods
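An as_str() method on the Model enum, consistent with the #[serde(rename = "claude-2")] wire name mentioned earlier, might look like this sketch. The second variant and its string are illustrative assumptions.

```rust
// Sketch of an as_str() mapping on Model, returning API wire names.
// Only "claude-2" is confirmed by the notes; the rest is illustrative.
#[derive(Debug, Clone, Copy)]
enum Model {
    Claude2,
    ClaudeInstant1,
}

impl Model {
    fn as_str(&self) -> &'static str {
        match self {
            Model::Claude2 => "claude-2",
            Model::ClaudeInstant1 => "claude-instant-1",
        }
    }
}

fn main() {
    assert_eq!(Model::Claude2.as_str(), "claude-2");
    assert_eq!(Model::ClaudeInstant1.as_str(), "claude-instant-1");
    println!("ok");
}
```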
Refactors
- RequestOptions.instances is now built from Session.options.context instead of message history
- Changed Vendor to derive Clone + Default
- Changed Meta to derive Clone + Default
- Changed Session to derive Clone + Default
- Removed unused use ulid::Ulid; imports from anthropic.rs and openai.rs
- Changed let session: Session = Session::new(Ulid::new().to_string(), ...); to let session: Session = Session::anonymous(...); in run() for both anthropic.rs and openai.rs
- Changed model: String to model: Model in RequestOptions
- Added model: Option&lt;Model&gt; to SessionOptions
- Changed model: options.model.to_string() to model: Some(options.model) in the From impl for SessionOptions
- Changed session.options.model = options.model.unwrap().as_str().to_string(); to session.options.model = options.model; in merge_options()
- Changed model: session.options.model.to_string() to model: session.options.model.unwrap_or_default() in complete_stream() and complete()
- Added let max_tokens_to_sample = session.options.max_tokens_to_sample.unwrap_or(1000); and used that instead of session.options.max_tokens_to_sample in complete_stream() and complete()
- Added SessionOptions struct for session options
- Added a From impl for SessionOptions
- Changed Session to Session
- Replaced all uses of last_request_options with options
- Added SessionOptions to the Session struct
- Removed commented out SessionOptions struct
- Removed println!("options: {:?}", options); from impl From&lt;CommandOptions&gt; for RequestOptions
- openai command now creates a Session instead of a custom struct
- Session stores request options, metadata, and message history
- Session can be loaded/saved to/from the filesystem
- Completion prompt is built from the session history
- Message.Role.Human is mapped to CompletionMessage.Role.User
- Added complete_messages() to trim messages to max supported tokens
- Removed Options, CompleteRequestBody, HistoryMessage, and trim_messages()
- Added Session, Message, Vendor, Role, and Meta structs
- anthropic command now creates a Session instead of a custom struct
- Session stores request options, metadata, and message history
- Session can be loaded/saved to/from the filesystem
- Completion prompt is built from the session history
- No variable renames
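The Session::anonymous() behavior described above — generate a random ID and point the session at ~/.c/sessions/anonymous/{id}.yaml — can be sketched as follows. The real code uses a ULID; a timestamp-based id stands in here so the sketch stays std-only, and the Session fields are simplified.

```rust
// Sketch of Session::anonymous(): random id + fixed path template.
// The timestamp id is a std-only stand-in for Ulid::new().to_string().
use std::time::{SystemTime, UNIX_EPOCH};

#[derive(Debug, Default)]
struct Session {
    id: String,
    path: String,
}

fn anonymous_session() -> Session {
    let id = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("clock before unix epoch")
        .as_nanos()
        .to_string();
    // Path template taken from the release note above.
    let path = format!("~/.c/sessions/anonymous/{id}.yaml");
    Session { id, path }
}

fn main() {
    let s = anonymous_session();
    assert!(s.path.starts_with("~/.c/sessions/anonymous/"));
    assert!(s.path.ends_with(".yaml"));
    assert!(!s.id.is_empty());
    println!("ok");
}
```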
Dependencies
- No changes
2023-08-08T0129
Release notes for 2023-08-08T0129
New Features
- Added system option to Session
Bug Fixes
- Made Role enum lowercase
- Added as_str() method to Model enum
- Changed model option type to Model enum
- Changed OpenAI API URL
- Updated complete() and complete_stream() methods
Refactors
- Moved all the openai code into a single file
- Moved all the anthropic code into a single file
Release 2023-08-04T1330
Release Notes
v1.1.0
Features
- Added support for Claude v2 and the new Anthropic streaming API.
- Added the Claude2 model enum variant.
- Started working on the OpenAI chat complete command.
Fixes
- Handled stop reasons in streaming responses.
- Made max_tokens_to_sample required and changed its type to u32.
- Defaulted max_supported_tokens to 4096 and calculated max_tokens_to_sample if not provided.
- Removed spinner logic from openai.rs.
- Removed truncated field from Chunk.
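The defaulting described above can be sketched like this: max_supported_tokens falls back to 4096, and max_tokens_to_sample is derived from it when not provided. Subtracting the prompt's token count is an assumption about how the derivation works; the notes only say it is "calculated".

```rust
// Sketch of the max_tokens_to_sample fallback: default max_supported_tokens
// to 4096, then derive the sample budget. The prompt_tokens subtraction is
// an assumed derivation, not confirmed by the release notes.
fn resolve_max_tokens_to_sample(
    max_tokens_to_sample: Option<u32>,
    max_supported_tokens: Option<u32>,
    prompt_tokens: u32,
) -> u32 {
    max_tokens_to_sample.unwrap_or_else(|| {
        let max_supported = max_supported_tokens.unwrap_or(4096);
        max_supported.saturating_sub(prompt_tokens)
    })
}

fn main() {
    // An explicit value wins; otherwise derive from the 4096 default.
    assert_eq!(resolve_max_tokens_to_sample(Some(1000), None, 0), 1000);
    assert_eq!(resolve_max_tokens_to_sample(None, None, 96), 4000);
    println!("ok");
}
```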
Refactors
- Updated CompleteCreateCommand in anthropic.rs.
- Cleaned up the anthropic crate by removing unused code.
v1.0.0
Release Notes
First version of my new AI CLI: c! 🚀
Features
- Added streaming support to the anthropic crate and the c command.
- Added a spinner to show progress for streaming responses.
- Started working on a new version of the CLI.
- Handled streaming in the CLI to properly handle OpenAI streaming responses.
Refactors
- Cleaned up the anthropic crate by removing unused code and making some fields optional.
- Improved the usage of sessions to define completion behavior.
- Moved most logic into the c crate.
- Replaced anyhow with color_eyre and log with tracing in the anthropic crate.
Fixes
- Applied the changes made to the other services.
Chores
- Added the tracing dependency.
- Added bacon to the project.
2023-07-01T2342
Release notes for 2023-07-01T2342
2023-06-25T2348
Release notes for 2023-06-25T2348
2023-06-14T1306
Release notes for 2023-06-14T1306
2023-06-09T2037
Release Notes
New Features
- Integrate the anthropic module to enable conversations with Claude
- Add the anthropic case to interact with Claude
- Add messages to ChatsApi struct to store conversation history with OpenAI
Dependencies
- Add anthropic and anyhow as new dependencies
Breaking Changes
- The b chats create command no longer uses a default model if the model option is not provided. For new chat sessions, you must specify the model to use. Existing chat sessions in the sessions file must be updated to include the model field.
Commits
- (ad134be) feat(b): integrate anthropic to b
- (60012a1) feat(anthropic): create the anthropic case to interact with Claude
- (45ab11d) feat(openai): add messages to ChatsApi struct to store conversation history
- (8e2a7a2) chore: add new dependencies anthropic and anyhow to Cargo.lock file
- (a5ca6a2) feat: add the anthropic module