Releases: lucidrains/recurrent-memory-transformer-pytorch
0.1.9
bring in a trick from the cogview paper
0.1.8
fix custom causal mask for memories and main sequence
0.1.7
final tweak, so network can differentiate better between read and wri…
0.1.6
use absolute positional embedding and token shift as default
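The token shift mentioned here is, in lucidrains' repositories, usually the cheap trick of shifting half of each token's feature dimensions forward by one position, so every token gets a free view of its predecessor. A minimal sketch of that idea (not the repo's exact code; function name and the half-split are assumptions):

```python
import torch
import torch.nn.functional as F

def token_shift(x):
    # x: (batch, seq_len, dim)
    # split features in half: one half is shifted, the other passes through
    x_shift, x_pass = x.chunk(2, dim=-1)
    # pad one zero position at the front of the sequence and drop the last,
    # i.e. position t now sees the shifted features of position t - 1
    x_shift = F.pad(x_shift, (0, 0, 1, -1))
    return torch.cat((x_shift, x_pass), dim=-1)
```

Applied before attention or the feedforward, this gives a token-level convolution-like inductive bias at essentially zero parameter cost.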
0.1.5
knock off a todo
0.1.4
eval decorator never used
0.1.2
actually can do away with initial read memory
0.1.1
labels can be directly passed in, if training encoder
0.1.0
complete the memory replay backprop technique from the memformer pape…
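Memory replay backprop, as described in the Memformer paper, trains across segments without keeping every segment's graph in memory: run all segments forward without gradients while caching each segment's input memory, then replay segments in reverse one at a time, passing the memory gradient backwards between replays. A hedged sketch under assumed interfaces (the toy model and the `model(segment, memory) -> (loss, next_memory)` signature are illustrative, not this repo's API):

```python
import torch
import torch.nn as nn

class ToySegmentModel(nn.Module):
    # hypothetical stand-in for a recurrent memory transformer step:
    # consumes a segment plus memory, returns a loss and the next memory
    def __init__(self, dim=8):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, segment, memory):
        h = self.proj(segment + memory)          # memory broadcasts over seq
        loss = h.pow(2).mean()
        next_memory = h.mean(dim=1, keepdim=True).expand_as(memory)
        return loss, next_memory

def memory_replay_backprop(model, segments, memory):
    # 1) no-grad forward over all segments, caching each segment's input memory
    cached = []
    with torch.no_grad():
        for seg in segments:
            cached.append(memory)
            _, memory = model(seg, memory)

    # 2) replay in reverse with grad enabled; the gradient flowing into a
    #    segment's output memory comes from the segment replayed just before
    mem_grad = torch.zeros_like(memory)
    for seg, mem_in in zip(reversed(segments), reversed(cached)):
        mem_in = mem_in.detach().requires_grad_()
        loss, mem_out = model(seg, mem_in)
        # (mem_out * mem_grad).sum() injects the downstream memory gradient
        (loss + (mem_out * mem_grad).sum()).backward()
        mem_grad = mem_in.grad
```

Peak memory stays at roughly one segment's activation graph regardless of how many segments are chained, at the cost of a second forward pass per segment.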
0.0.14