
Commit

Merge pull request #118 from danbev/with_main_gpu
add with_main_gpu to LlamaModelParams
MarcusDunn authored Feb 29, 2024
2 parents 52687ca + dd264e7 commit 446d16d
Showing 1 changed file with 7 additions and 0 deletions.
7 changes: 7 additions & 0 deletions llama-cpp-2/src/model/params.rs
@@ -56,6 +56,13 @@ impl LlamaModelParams {
self
}

/// sets the main GPU
#[must_use]
pub fn with_main_gpu(mut self, main_gpu: i32) -> Self {
self.params.main_gpu = main_gpu;
self
}

/// sets `vocab_only`
#[must_use]
pub fn with_vocab_only(mut self, vocab_only: bool) -> Self {

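For context, a minimal usage sketch of the new setter alongside the existing builder methods. This assumes the crate's `LlamaModelParams::default()` constructor and the module path `llama_cpp_2::model::params` (matching the changed file); the GPU index 0 is purely illustrative.

use llama_cpp_2::model::params::LlamaModelParams;

fn main() {
    // Builder-style configuration: with_main_gpu selects which device index
    // llama.cpp should treat as the main GPU. Default() and index 0 are
    // assumptions for illustration, not part of this diff.
    let params = LlamaModelParams::default()
        .with_main_gpu(0)
        .with_vocab_only(false); // existing setter shown in the diff context

    // `params` would then typically be passed on to model loading.
    let _ = params;
}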