
vLLM backend #8

Triggered via pull request May 16, 2024 05:57
Status: Cancelled
Total duration: 2m 28s

Workflow file: test_cli_cuda_vllm.yaml (on: pull_request)
Job: run_cli_cuda_pytorch_tests (2m 11s)
Annotations

2 errors in run_cli_cuda_pytorch_tests:
- Canceling since a higher priority waiting request for 'CLI CUDA vLLM Tests-196' exists
- The operation was canceled.
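The "higher priority waiting request" error is the message GitHub Actions emits when a run is superseded within a concurrency group that has `cancel-in-progress` enabled: a newer run (here, presumably for the same pull request) entered the group, so this one was canceled. A minimal sketch of what such a `concurrency` block might look like in `test_cli_cuda_vllm.yaml` — the group expression, runner labels, and test command below are assumptions for illustration, not taken from the actual workflow file:

```yaml
# Hypothetical sketch of test_cli_cuda_vllm.yaml.
# The group name "CLI CUDA vLLM Tests-196" in the cancellation message
# suggests the group is the workflow name plus the PR number.
name: CLI CUDA vLLM Tests

on: pull_request

concurrency:
  # Allow one run per workflow per pull request; a newer push to the PR
  # cancels any run already in progress for the same group.
  group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
  cancel-in-progress: true

jobs:
  run_cli_cuda_pytorch_tests:
    runs-on: [self-hosted, gpu]   # assumption: CUDA tests need a GPU runner
    steps:
      - uses: actions/checkout@v4
      - run: pytest tests/        # assumed test command
```

With this configuration, the canceled status seen above is expected behavior rather than a test failure: the superseding run carries the newer commit, so only its results matter.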