
Run eval/inference on any Ollama or Llamafile or vLLM hosted locally #1179

Triggered via pull request: January 31, 2025 15:49
Status: Success
Total duration: 12s

Job: Prevent merge (0s)