Remove double negations for checkCancellation function (#98)
# Remove double negations for `checkCancellation` function


## ♻️ Current situation & Problem

As mentioned in issue #83 as well as in the PR [#81
(discussion)](#81 (comment)),
the SpeziLLM package currently uses a double negation (`guard`
followed by `!`) for every `checkCancellation()` function call. As
double negations make the code harder to read, we could use `if`
instead.
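The change can be sketched as follows. This is a minimal, self-contained illustration: the real `checkCancellation(on:)` in SpeziLLM takes a stream continuation, while this stand-in drops that parameter, and the two `generate` functions are hypothetical wrappers that only show the control-flow difference.

```swift
// Stand-in for SpeziLLM's `checkCancellation(on:)`; returns `true`
// if the surrounding Task was cancelled.
func checkCancellation() async -> Bool {
    Task.isCancelled
}

func generateBefore() async {
    // Before: double negation, a `guard` combined with `!`.
    guard await !checkCancellation() else {
        return
    }
    // ... continue inference ...
}

func generateAfter() async {
    // After: a plain `if` states the positive condition directly.
    if await checkCancellation() {
        return
    }
    // ... continue inference ...
}
```

Both versions return early on cancellation; the `if` form simply reads as "if cancelled, stop" rather than "unless not cancelled, stop".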

## ⚙️ Release Notes 
* Remove the double negations for `checkCancellation` by replacing the
`guard` statements followed by a not operator with `if` statements.

## 📝 Code of Conduct & Contributing Guidelines 

By submitting this pull request, you agree to follow our [Code
of
Conduct](https://github.com/StanfordSpezi/.github/blob/main/CODE_OF_CONDUCT.md)
and [Contributing
Guidelines](https://github.com/StanfordSpezi/.github/blob/main/CONTRIBUTING.md):
- [x] I agree to follow the [Code of
Conduct](https://github.com/StanfordSpezi/.github/blob/main/CODE_OF_CONDUCT.md)
and [Contributing
Guidelines](https://github.com/StanfordSpezi/.github/blob/main/CONTRIBUTING.md).
Seb-Ltz authored Feb 15, 2025
1 parent 4a86cbf commit 3f26ab4
Showing 6 changed files with 9 additions and 9 deletions.
4 changes: 2 additions & 2 deletions Sources/SpeziLLM/Mock/LLMMockSession.swift
@@ -48,7 +48,7 @@ public final class LLMMockSession: LLMSession, @unchecked Sendable {
self.state = .loading
}
try? await Task.sleep(for: .seconds(1))
- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
return
}

@@ -60,7 +60,7 @@ public final class LLMMockSession: LLMSession, @unchecked Sendable {
let tokens = ["Mock ", "Message ", "from ", "SpeziLLM!"]
for token in tokens {
try? await Task.sleep(for: .milliseconds(500))
- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
return
}
await injectAndYield(token, on: continuation)
2 changes: 1 addition & 1 deletion Sources/SpeziLLMFog/LLMFogSession+Generation.swift
@@ -37,7 +37,7 @@ extension LLMFogSession {

do {
for try await streamResult in chatStream {
- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
Self.logger.debug("SpeziLLMFog: LLM inference cancelled because of Task cancellation.")
return
}
4 changes: 2 additions & 2 deletions Sources/SpeziLLMLocal/LLMLocalSession+Generate.swift
@@ -51,7 +51,7 @@ extension LLMLocalSession {

MLXRandom.seed(self.schema.parameters.seed ?? UInt64(Date.timeIntervalSinceReferenceDate * 1000))

- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
return
}

@@ -163,7 +163,7 @@ extension LLMLocalSession {

for token in tokens {
try? await Task.sleep(for: .seconds(1))
- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
return
}
continuation.yield(token)
2 changes: 1 addition & 1 deletion Sources/SpeziLLMLocal/LLMLocalSession.swift
@@ -132,7 +132,7 @@ public final class LLMLocalSession: LLMSession, @unchecked Sendable {
}
}

- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
return
}

4 changes: 2 additions & 2 deletions Sources/SpeziLLMOpenAI/LLMOpenAISession+Generation.swift
@@ -50,7 +50,7 @@ extension LLMOpenAISession {
continue
}

- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
Self.logger.debug("SpeziLLMOpenAI: LLM inference cancelled because of Task cancellation.")
return
}
@@ -148,7 +148,7 @@ extension LLMOpenAISession {
// Errors thrown by the functions are surfaced to the user as an LLM generation error
functionCallResponse = try await function.execute()
} catch is CancellationError {
- guard await !self.checkCancellation(on: continuation) else {
+ if await self.checkCancellation(on: continuation) {
Self.logger.debug("SpeziLLMOpenAI: Function call execution cancelled because of Task cancellation.")
throw CancellationError()
}
2 changes: 1 addition & 1 deletion Sources/SpeziLLMOpenAI/LLMOpenAISession.swift
@@ -140,7 +140,7 @@ public final class LLMOpenAISession: LLMSession, @unchecked Sendable {
}
}

- guard await !checkCancellation(on: continuation) else {
+ if await checkCancellation(on: continuation) {
return
}
