
Releases: lucidrains/recurrent-memory-transformer-pytorch

0.5.6

11 Feb 18:28

Full Changelog: 0.5.5...0.5.6

0.5.5

31 Aug 21:14
address https://github.com/lucidrains/recurrent-memory-transformer-py…

0.5.4

31 Aug 19:39
address https://github.com/lucidrains/recurrent-memory-transformer-py…

0.5.3

29 Aug 16:50
allow for customizing whether read memory is stop-gradded, the one ad…
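Making the stop-gradient on read memory configurable presumably amounts to detaching the incoming memory tokens before the current segment consumes them, which truncates backpropagation-through-time across segments. A minimal sketch of such a toggle (`prepare_read_memory` and `stop_grad` are illustrative names, not the library's API):

```python
import torch

def prepare_read_memory(read_mem: torch.Tensor, stop_grad: bool = True) -> torch.Tensor:
    # detaching blocks gradients from flowing back into the previous
    # segment's computation graph (truncated BPTT across segments)
    return read_mem.detach() if stop_grad else read_mem

mem = torch.randn(4, 16, requires_grad=True)
detached = prepare_read_memory(mem, stop_grad=True)
attached = prepare_read_memory(mem, stop_grad=False)
```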

0.5.2

29 Aug 16:30
reinject the write memory positions

0.5.1

29 Aug 16:26
address https://github.com/lucidrains/recurrent-memory-transformer-py…

0.5.0

09 Aug 14:18
give a null key / value to protect against entirely masked out row, a…
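The null key / value trick is commonly implemented by appending one extra key/value position that is never masked, so that softmax over an otherwise fully masked-out row has at least one finite target instead of producing NaNs. A hedged sketch under that assumption (single-head, unbatched; `attend` is an illustrative name):

```python
import torch
import torch.nn.functional as F

def attend(q, k, v, mask):
    # q: (n, d); k, v: (m, d); mask: (m,) bool, True = may attend
    # prepend a zero null key/value that is always visible, so a row
    # whose entire mask is False still attends somewhere finite
    null_k = torch.zeros(1, k.shape[-1])
    null_v = torch.zeros(1, v.shape[-1])
    k = torch.cat((null_k, k), dim=0)
    v = torch.cat((null_v, v), dim=0)
    mask = F.pad(mask, (1, 0), value=True)  # null position always visible

    sim = (q @ k.t()) / (q.shape[-1] ** 0.5)
    sim = sim.masked_fill(~mask, float('-inf'))
    attn = sim.softmax(dim=-1)  # no NaNs even if the original mask is all False
    return attn @ v

q = torch.randn(2, 8)
k = torch.randn(4, 8)
v = torch.randn(4, 8)
out = attend(q, k, v, torch.zeros(4, dtype=torch.bool))  # entirely masked row
```

With the all-False mask, every query collapses onto the null position, yielding finite (here, zero) outputs rather than NaN.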

0.4.3

08 Aug 21:04
address https://github.com/lucidrains/recurrent-memory-transformer-py…

0.4.2

08 Aug 15:55
address https://github.com/lucidrains/recurrent-memory-transformer-py…

0.4.1

27 May 01:17
turns out ResiDual is prone to overflowing in fp16. add the scaling s…
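ResiDual maintains a second, un-normalized residual stream whose magnitude grows with depth, which can exceed fp16's representable range (max ~65504). The scaling referred to here likely downscales each branch's contribution to that duplicate stream. A toy sketch under that assumption (`ResiDualBlock` and `resi_dual_scale` are illustrative names):

```python
import torch
from torch import nn

class ResiDualBlock(nn.Module):
    # toy block: the second (pre-norm) residual stream accumulates raw
    # branch outputs; downscaling them keeps the running sum in fp16 range
    def __init__(self, dim, resi_dual_scale=0.1):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.ff = nn.Linear(dim, dim)
        self.scale = resi_dual_scale

    def forward(self, x, residual):
        branch = self.ff(self.norm(x))
        x = x + branch
        # accumulate a downscaled copy in the duplicate residual stream;
        # the model would divide by self.scale before the final norm
        residual = residual + branch * self.scale
        return x, residual

block = ResiDualBlock(8)
x = torch.randn(2, 8)
out, resid = block(x, torch.zeros(2, 8))
```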