
Releases: lucidrains/recurrent-memory-transformer-pytorch

0.1.9 (26 Apr 13:36)

bring in a trick from the CogView paper
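The release note does not say which CogView trick; in lucidrains' repositories it is usually the attention-logit stabilization from that paper (shifting the scores by their per-row max before softmax). A minimal sketch, assuming that is the trick meant:

```python
import numpy as np

def stable_softmax(scores):
    # Softmax is invariant to subtracting a constant per row, but doing
    # so keeps exp() from overflowing when attention logits grow large.
    shifted = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(shifted)
    return weights / weights.sum(axis=-1, keepdims=True)

# Logits this large overflow a naive softmax; the shifted one is finite.
scores = np.array([[1000.0, 1001.0, 1002.0]])
attn = stable_softmax(scores)
```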

0.1.8 (26 Apr 03:29)

fix custom causal mask for memories and main sequence
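A causal mask for a sequence carrying memory tokens might be built like the sketch below. The layout here (memory tokens prepended and visible in both directions, main sequence causal) is an assumption for illustration, not necessarily the repo's exact scheme:

```python
import numpy as np

def causal_mask_with_memories(num_mem, seq_len):
    # True = attention allowed. Hypothetical layout: `num_mem` memory
    # tokens prepended to a causal main sequence of `seq_len` tokens.
    total = num_mem + seq_len
    # Start from a standard causal (lower-triangular) mask ...
    mask = np.tril(np.ones((total, total), dtype=bool))
    # ... then let every position attend to all memory tokens,
    # and let memory tokens attend everywhere.
    mask[:, :num_mem] = True
    mask[:num_mem, :] = True
    return mask

mask = causal_mask_with_memories(num_mem=2, seq_len=4)
```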

0.1.7 (26 Apr 02:42)

final tweak, so network can differentiate better between read and wri…

0.1.6 (26 Apr 02:02)

use absolute positional embedding and token shift as default
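Token shift is a cheap way to mix neighbouring positions: half of each position's feature dimensions are replaced by the previous position's. The repo operates on batched torch tensors; this numpy version is only an illustrative sketch:

```python
import numpy as np

def token_shift(x):
    # Split the feature dimension in half and shift one half back by
    # one timestep, so position t mixes in features from position t-1.
    seq_len, dim = x.shape
    x_keep, x_shift = x[:, : dim // 2], x[:, dim // 2 :]
    shifted = np.zeros_like(x_shift)
    shifted[1:] = x_shift[:-1]  # position t receives features from t-1
    return np.concatenate([x_keep, shifted], axis=-1)

x = np.arange(12, dtype=float).reshape(3, 4)
out = token_shift(x)
```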

0.1.5 (25 Apr 22:57)

knock off a todo

0.1.4 (25 Apr 21:35)

eval decorator never used

0.1.2 (25 Apr 21:14)

actually can do away with initial read memory

0.1.1 (25 Apr 20:26)

labels can be directly passed in, if training encoder

0.1.0 (25 Apr 19:10)

complete the memory replay backprop technique from the memformer pape…
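Memory replay backprop (from the Memformer paper) avoids holding the full graph across segments: memories are stored detached during the forward pass, then segments are replayed in reverse while chaining the gradient flowing into each memory. A toy scalar sketch of the idea, with a hypothetical one-parameter "segment model" `m_t = w * m_{t-1} + x_t` and loss `sum(m_t ** 2)` — not the repo's implementation:

```python
def mrbp_grad(w, xs, m0=0.0):
    # Forward pass over segments, storing only the (detached) memory
    # values handed between segments -- not a full computation graph.
    mems = [m0]
    for x in xs:
        mems.append(w * mems[-1] + x)
    loss = sum(m * m for m in mems[1:])

    # Replay backward: walk segments in reverse, recomputing each
    # segment's local gradients and chaining the gradient that flows
    # into its output memory from the segment after it.
    grad_w = 0.0
    grad_mem = 0.0  # dL/dm flowing back from later segments
    for t in range(len(xs) - 1, -1, -1):
        grad_m = 2.0 * mems[t + 1] + grad_mem  # local loss + downstream
        grad_w += grad_m * mems[t]             # d(w*m + x)/dw = m
        grad_mem = grad_m * w                  # d(w*m + x)/dm = w
    return loss, grad_w

loss, grad_w = mrbp_grad(0.5, [1.0, 2.0, 3.0])
```

The replayed gradient can be checked against a finite-difference estimate of the loss, which is how one would sanity-test the real technique as well.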

0.0.14 (25 Apr 17:11)