fix copy-pasted block comment
daniel-geon-park committed Jan 28, 2025
1 parent a9592ca commit 63fee4f
Showing 1 changed file with 2 additions and 4 deletions.

python/sglang/srt/layers/attention/hip_radix_attention.py
@@ -1,10 +1,8 @@
 from __future__ import annotations

 """
-Support different attention backends.
-Now there are two backends: FlashInfer and Triton.
-FlashInfer is faster and Triton is easier to customize.
-Each backend supports two operators: extend (i.e. prefill with cached prefix) and decode.
+HiP Attention Backend for SGLang
+https://arxiv.org/pdf/2406.09827
 """

 import logging
