
Fix vLLM v1 compatibility #1121

Merged

merged 2 commits into argilla-io:develop from Atry:patch-1 on Feb 27, 2025
Conversation

@Atry commented on Feb 15, 2025

Don't delete model_executor if it does not exist, e.g. when VLLM_USE_V1 is set.

See https://docs.vllm.ai/en/stable/serving/env_vars.html and https://github.com/vllm-project/vllm/blob/main/vllm/v1/engine/llm_engine.py
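The shape of the fix can be sketched as a guarded deletion. This is a hypothetical helper (the names `cleanup_engine`, `v0_llm`, and `v1_llm` are illustrative, not from the PR diff): with `VLLM_USE_V1` set, the v1 `LLMEngine` does not expose a `model_executor` attribute, so the cleanup code must check for it before deleting rather than assume it exists.

```python
import gc
from types import SimpleNamespace


def cleanup_engine(llm) -> None:
    """Release engine resources before freeing the model.

    Hypothetical sketch of the fix: on the vLLM v1 engine
    (enabled via VLLM_USE_V1), `llm_engine` has no
    `model_executor` attribute, so guard the deletion.
    """
    engine = llm.llm_engine
    # v0 engine only: drop the executor so its resources can be freed.
    if hasattr(engine, "model_executor"):
        del engine.model_executor
    gc.collect()


# Stand-ins for the two engine shapes (no GPU needed to illustrate):
v0_llm = SimpleNamespace(llm_engine=SimpleNamespace(model_executor=object()))
v1_llm = SimpleNamespace(llm_engine=SimpleNamespace())  # no model_executor

cleanup_engine(v0_llm)  # deletes the attribute on the v0-style engine
cleanup_engine(v1_llm)  # no AttributeError on the v1-style engine
```

The unconditional `del engine.model_executor` that this replaces would raise `AttributeError` as soon as `VLLM_USE_V1` routed requests through the v1 engine.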

Atry and others added 2 commits February 14, 2025 19:34
Don't delete model_executor if it does not exist, e.g. when VLLM_USE_V1 is set

codspeed-hq bot commented on Feb 27, 2025

CodSpeed Performance Report

Merging #1121 will not alter performance

Comparing Atry:patch-1 (9df6612) with main (1b6c101)

Summary

✅ 1 untouched benchmark

@plaguss plaguss changed the base branch from main to develop February 27, 2025 10:47
@plaguss (Contributor) commented on Feb 27, 2025

Thanks for the fix!

@plaguss plaguss merged commit a66d894 into argilla-io:develop Feb 27, 2025
3 of 8 checks passed
@Atry Atry deleted the patch-1 branch February 27, 2025 16:55
2 participants