doc for row-wise adagrad optimizer
YazhiGao committed Dec 18, 2020
1 parent 55becd8 commit 3593be9
Showing 2 changed files with 7 additions and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -364,7 +364,7 @@ Version
-------
0.1 : Initial release of the DLRM code
-1.0 : DLRM with distributed training
+1.0 : DLRM with distributed training, cpu support for row-wise adagrad optimizer
Requirements
------------
6 changes: 6 additions & 0 deletions optim/rwsadagrad.py
@@ -1,3 +1,9 @@
+# Copyright (c) Facebook, Inc. and its affiliates.
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+
+
import torch
from torch.optim import Optimizer

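For context, `optim/rwsadagrad.py` (the file receiving the license header above) holds the row-wise Adagrad optimizer that the README line now advertises CPU support for. Below is a minimal sketch of what "row-wise" means, assuming the standard Adagrad recurrence with a single accumulator per embedding row; the function name, signature, and the per-row mean-of-squares accumulator are illustrative assumptions, not the module's actual API.

```python
import torch


def rowwise_adagrad_step(param, grad, row_state, lr=0.01, eps=1e-10):
    """One row-wise Adagrad update (illustrative sketch, not the repo's API).

    param:     (num_rows, dim) embedding table
    grad:      (num_rows, dim) dense gradient for this step
    row_state: (num_rows,) running sum of per-row mean squared gradients
    """
    # Row-wise Adagrad keeps ONE accumulator per embedding row instead of one
    # per element: here, the mean of the squared gradient across the row.
    row_state += grad.pow(2).mean(dim=1)
    # Each row's effective learning rate decays with the square root of its
    # accumulator; unsqueeze broadcasts the per-row scale over the row dim.
    std = row_state.sqrt() + eps                  # shape: (num_rows,)
    param -= lr * grad / std.unsqueeze(1)
    return param, row_state


# Hypothetical usage on a small table:
table = torch.randn(1000, 64)
g = torch.randn(1000, 64)
acc = torch.zeros(1000)
table, acc = rowwise_adagrad_step(table, g, acc)
```

Keeping one accumulator per row shrinks optimizer state from O(rows × dim) to O(rows), which is the usual motivation for the row-wise variant on large embedding tables like DLRM's.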
