
linear layer backward #228

Closed
ayoub-louati opened this issue Jan 3, 2023 · 1 comment · May be fixed by #233
Assignees: ayoub-louati
Labels: enhancement (New feature or request), feature

Comments

@ayoub-louati
Contributor

Add a backward pass for the linear layer kernel; see the HazyResearch implementation:

https://github.com/HazyResearch/flash-attention/blob/v0.2.2/flash_attn/ops/triton/linear.py#L285
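For reference, the math the kernel has to implement is the standard linear-layer backward: given `y = x @ w.T + b` and the upstream gradient `dy`, the gradients are `dx = dy @ w`, `dw = dy.T @ x`, and `db = dy.sum(0)`. A minimal NumPy sketch (function names here are illustrative, not the Triton kernel's API):

```python
import numpy as np

def linear_forward(x, w, b):
    # x: (batch, in_features), w: (out_features, in_features), b: (out_features,)
    return x @ w.T + b

def linear_backward(x, w, dy):
    # dy: gradient of the loss w.r.t. the output, shape (batch, out_features)
    dx = dy @ w          # gradient w.r.t. the input:   (batch, in_features)
    dw = dy.T @ x        # gradient w.r.t. the weights: (out_features, in_features)
    db = dy.sum(axis=0)  # gradient w.r.t. the bias:    (out_features,)
    return dx, dw, db
```

The Triton version fuses these matmuls into custom kernels (and handles activations), but any implementation should agree with this reference on the same inputs.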

@ayoub-louati ayoub-louati self-assigned this Jan 3, 2023
@ayoub-louati ayoub-louati added the feature and enhancement (New feature or request) labels and removed the model (Model scope, HF, etc.) label on Jan 3, 2023
@ayoub-louati
Contributor Author

ayoub-louati commented Mar 9, 2023

Linked to this PR: #233
Added the backward function for the linear layer. Tests pass without errors, but we still see errors when enabling cuda_graphs (which may mean the regular tests hide errors when cuda_graphs is disabled).
