
ci: update Transformers to v4.48.0 #1282

Draft
wants to merge 1 commit into base: main
Conversation

dvrogozh (Contributor) commented Jan 13, 2025

We no longer need to manually exclude some CUDA-specific tests for models, since this was addressed on the Hugging Face side for v4.48.0.

The previously failing test_prompt_lookup_decoding_matches_greedy_search test in the fuyu model is now skipped.

See: huggingface/transformers#35269
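
For context, the kind of manual exclusion that v4.48.0 makes unnecessary could look roughly like the hypothetical conftest.py hook below. The file name, the test list, and the skip logic are illustrative assumptions, not this repository's actual CI code:

```python
# conftest.py -- hypothetical sketch of a manual CUDA-only test exclusion,
# the sort of workaround that Transformers v4.48.0 makes unnecessary.
import pytest
import torch

# Illustrative list; a real workaround would collect these from CI failures.
CUDA_ONLY_TESTS = {
    "test_prompt_lookup_decoding_matches_greedy_search",
}

def pytest_collection_modifyitems(config, items):
    # Only intervene when running on a non-CUDA device (e.g. XPU).
    if torch.cuda.is_available():
        return
    skip_cuda_only = pytest.mark.skip(reason="CUDA-specific test on a non-CUDA device")
    for item in items:
        # Strip any parametrization suffix such as "[0]" before matching.
        if item.name.split("[")[0] in CUDA_ONLY_TESTS:
            item.add_marker(skip_cuda_only)
```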

dvrogozh marked this pull request as ready for review January 13, 2025 22:00
dvrogozh marked this pull request as draft January 14, 2025 01:02
Commit message:

We no longer need to manually exclude some CUDA-specific tests for models since this was addressed on the Hugging Face side for v4.48.0.

The previously failing test_prompt_lookup_decoding_matches_greedy_search test in the fuyu model is now skipped.

See: huggingface/transformers#35269
Signed-off-by: Dmitry Rogozhkin <[email protected]>
dvrogozh (Contributor, Author) commented Jan 14, 2025

Transformers v4.48.0 has a regression in tests:

```
# TRANSFORMERS_TEST_DEVICE_SPEC=spec.py python3 -m pytest tests/models/marian/test_modeling_marian.py -k backbone

E   ModuleNotFoundError: No module named 'transformers.models.marian.convert_marian_to_pytorch'
```

I don't see this issue on main at c23a1c193. It might be reasonable to wait for v4.49.0, or to raise this issue with HF if it persists.
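
For reference, the spec.py passed via TRANSFORMERS_TEST_DEVICE_SPEC above uses the Transformers test-device spec mechanism; as I understand the Transformers testing docs, such a file must define DEVICE_NAME and may override a few backend helpers. The XPU values below are an assumption for illustration, not the actual file used here:

```python
# spec.py -- assumed device spec for running the Transformers test suite on
# a non-CUDA accelerator; only DEVICE_NAME is required, the *_FN entries
# override the default per-backend helpers used by the tests.
import torch

DEVICE_NAME = "xpu"  # assumption: Intel GPU via the PyTorch XPU backend

MANUAL_SEED_FN = torch.xpu.manual_seed_all
EMPTY_CACHE_FN = torch.xpu.empty_cache
DEVICE_COUNT_FN = torch.xpu.device_count
```

The failing command above would then pick this file up through TRANSFORMERS_TEST_DEVICE_SPEC=spec.py.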
