
Fix transformers/torch errors on nn.RMSNorm by pinning transformers. #6458

Closed
loadams wants to merge 4 commits from the loadams/pin-hpu-transformers branch

Conversation

loadams
Contributor

loadams commented Aug 28, 2024

The HPU pipeline was failing due to an updated transformers package that appears to depend on torch 2.4: https://github.com/microsoft/DeepSpeed/actions/runs/10586654075/job/29335873815

Opened an issue with transformers to investigate: huggingface/transformers#33176

CC: @nelyahu - we can also revisit this when/if there is an HPU package with torch 2.4 support.
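For context, the failure comes down to transformers referencing `torch.nn.RMSNorm`, which only exists in torch >= 2.4. A minimal stdlib-only sketch of that version gate (the helper names here are hypothetical illustrations, not code from this PR or from transformers):

```python
def parse_version(v: str) -> tuple:
    """Parse a 'major.minor.patch' string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split(".")[:3])

def has_rmsnorm(torch_version: str) -> bool:
    """torch.nn.RMSNorm was introduced in torch 2.4.0."""
    return parse_version(torch_version) >= (2, 4, 0)

print(has_rmsnorm("2.3.1"))  # False - older HPU builds fall in this range
print(has_rmsnorm("2.4.0"))  # True
```

Pinning transformers to a release that predates the `nn.RMSNorm` reference sidesteps the gate entirely until the HPU stack ships torch 2.4.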

@loadams
Contributor Author

loadams commented Aug 28, 2024

This should be fixed in the latest PR to HF Transformers - we won't need to complete this PR, but I'm leaving it open until the transformers PR is merged.

@loadams loadams changed the title Fix HPU/transformers/torch errors on nn.RMSNorm by pinning transformers. Fix transformers/torch errors on nn.RMSNorm by pinning transformers. Aug 28, 2024
@nelyahu
Contributor

nelyahu commented Aug 29, 2024

@loadams on 1.18.0 we will upgrade to pt2.4.0.
CC: @vshekhawat-hlab

@loadams
Contributor Author

loadams commented Aug 29, 2024

@loadams on 1.18.0 we will upgrade to pt2.4.0. CC: @vshekhawat-hlab

Thanks @nelyahu - it turns out this is a bug in transformers, and we will need that fix regardless, but good to know, thanks!

@loadams
Contributor Author

loadams commented Aug 29, 2024

PR in transformers is here: huggingface/transformers#33177

@loadams loadams closed this Aug 29, 2024
@loadams loadams deleted the loadams/pin-hpu-transformers branch August 29, 2024 16:42
@nelyahu
Contributor

nelyahu commented Sep 1, 2024

@loadams it seems that the Gaudi2 pipeline is still broken.
Are you planning to wait until the transformers PR is merged and included in a release?
I think it might take a while, no?

@loadams
Copy link
Contributor Author

loadams commented Sep 3, 2024

The PR has been merged into transformers (it took longer than anticipated), and the pipeline looks to be running again, so there is no need to pin anything.
