
Commit

will start working on linear attention again
lucidrains committed May 23, 2023
1 parent 3a27dd2 commit a26ebc1
Showing 4 changed files with 40 additions and 1 deletion.
README.md: 10 changes (9 additions & 1 deletion)
@@ -26,10 +26,18 @@ CUDA implementation of autoregressive linear attention, with all the latest rese…

```bibtex
@article{Zhai2021AnAF,
-title = {An Attention Free Transformer}, // it is not really attention free, blatant clickbait. just linear attention with some gating added in
+title = {An Attention Free Transformer},
author = {Shuangfei Zhai and Walter A. Talbott and Nitish Srivastava and Chen Huang and Hanlin Goh and Ruixiang Zhang and Joshua M. Susskind},
journal = {ArXiv},
year = {2021},
volume = {abs/2105.14103}
}
```

```bibtex
@inproceedings{Peng2023RWKVRR,
title = {RWKV: Reinventing RNNs for the Transformer Era},
author = {Bo Peng and Eric Alcaide and Quentin Anthony and Alon Albalak and Samuel Arcadinho and Huanqi Cao and Xin Cheng and Michael Chung and Matteo Grella and GV KranthiKiran and Xuzheng He and Haowen Hou and Przemyslaw Kazienko and Jan Kocon and Jiaming Kong and Bartlomiej Koptyra and Hayden Lau and Krishna Sri Ipsit Mantri and Ferdinand Mom and Atsushi Saito and Xiangru Tang and Bolun Wang and Johan S. Wind and Stanislaw Wozniak and Ruichong Zhang and Zhenyuan Zhang and Qihang Zhao and Peng Zhou and Jian Zhu and Rui-Jie Zhu},
year = {2023}
}
```
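For context on what this package sets out to accelerate: below is a minimal PyTorch sketch of the causal (autoregressive) linear attention recurrence that a fused CUDA kernel would target. It is an illustrative assumption, not this repository's API; the `elu + 1` feature map, the tensor shapes, and the name `causal_linear_attention` are choices made for the example, and the gating used by AFT/RWKV is omitted.

```python
# Illustrative sketch only -- not this repository's kernel or API.
import torch
import torch.nn.functional as F

def causal_linear_attention(q, k, v, eps = 1e-6):
    # q, k, v: (batch, heads, seq, dim); v may use a different last dimension
    # a positive feature map (one common choice) keeps the running sums well behaved
    q, k = F.elu(q) + 1, F.elu(k) + 1

    b, h, n, d = q.shape

    # running state over the past: sum_i (k_i outer v_i), plus sum_i k_i for the normalizer
    kv_state = torch.zeros(b, h, d, v.shape[-1], dtype = q.dtype, device = q.device)
    k_state = torch.zeros(b, h, d, dtype = q.dtype, device = q.device)

    out = torch.empty_like(v)

    for t in range(n):
        kv_state = kv_state + k[:, :, t, :, None] * v[:, :, t, None, :]
        k_state = k_state + k[:, :, t]

        num = torch.einsum('bhd,bhde->bhe', q[:, :, t], kv_state)
        den = torch.einsum('bhd,bhd->bh', q[:, :, t], k_state) + eps
        out[:, :, t] = num / den[..., None]

    return out
```

The per-timestep loop is what makes a dedicated kernel attractive: each step is O(1) in sequence length, but running the recurrence eagerly in Python is memory bound, so a fused (or chunked) CUDA implementation is the natural target.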
Empty file.
Empty file.
setup.py: 31 changes (31 additions & 0 deletions)
@@ -0,0 +1,31 @@
from setuptools import setup, find_packages

setup(
  name = 'autoregressive-linear-attention-cuda',
  packages = find_packages(exclude=[]),
  version = '0.0.1',
  license='MIT',
  description = 'Autoregressive Linear Attention CUDA kernel',
  author = 'Phil Wang',
  author_email = '[email protected]',
  long_description_content_type = 'text/markdown',
  url = 'https://github.com/lucidrains/autoregressive-linear-attention-cuda',
  keywords = [
    'artificial intelligence',
    'deep learning',
    'transformers',
    'attention mechanism',
    'linear attention',
    'cuda'
  ],
  install_requires=[
    'torch>=1.6'
  ],
  classifiers=[
    'Development Status :: 4 - Beta',
    'Intended Audience :: Developers',
    'Topic :: Scientific/Engineering :: Artificial Intelligence',
    'License :: OSI Approved :: MIT License',
    'Programming Language :: Python :: 3.6',
  ],
)
