Related issues were discussed in huggingface/transformers#28731 and Lightning-AI/litgpt#327. The fix adopted here was to upgrade `torch` to `2.2.1`, as referenced in this [comment](huggingface/transformers#28731 (comment)) and the [Colab release notes](https://colab.research.google.com/notebooks/relnotes.ipynb#scrollTo=59f6f87f).

For example, the notebooks listed below encountered this problem and had to temporarily work around the error with the following code:

```python
import torch

# Disable the SDPA backends that trigger the error on older torch builds
torch.backends.cuda.enable_mem_efficient_sdp(False)
torch.backends.cuda.enable_flash_sdp(False)
```

- [Ghost 7B Alpha - Playground using Transformers, en](https://www.kaggle.com/code/lamhieu/ghost-7b-alpha-playground-using-transformers-en)
- [Ghost 7B Alpha - Playground using Transformers, vi](https://www.kaggle.com/code/lamhieu/ghost-7b-alpha-playground-using-transformers-vi)

---------

Co-authored-by: Dustin H <[email protected]>
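For reference, a minimal sketch of the adopted fix from inside a notebook, assuming a Kaggle or Colab environment where `pip` is available to the running interpreter; the exact install command is an assumption and not part of this commit:

```python
# Sketch (assumption, not from the original commit): pin torch to 2.2.1 so the
# SDPA workaround above is no longer needed. A kernel restart may be required if
# an older torch was already imported in the current session.
import subprocess
import sys

subprocess.check_call([sys.executable, "-m", "pip", "install", "torch==2.2.1"])

import torch
print(torch.__version__)  # e.g. "2.2.1+cu121" depending on the build
```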