Commit: add activations docs and add 1.6 changelog entry

rsokl committed Jun 21, 2020
1 parent 698c747 commit b469c78

Showing 33 changed files with 135 additions and 79 deletions.
17 changes: 17 additions & 0 deletions docs/source/changes.rst
Original file line number Diff line number Diff line change
@@ -6,6 +6,23 @@ This is a record of all past mygrad releases and what went into them,
in reverse chronological order. All previous releases should still be available
on pip.

.. _v1.6.0:

------------------
1.6.0 - 2020-06-21
------------------

New features:

- Adds :func:`~mygrad.nnet.activations.elu`
- Adds :func:`~mygrad.nnet.activations.glu`
- Adds :func:`~mygrad.nnet.activations.leaky_relu`
- Adds :func:`~mygrad.nnet.activations.selu`
- Adds :func:`~mygrad.nnet.activations.soft_sign`

Big thanks to David Mascharka!
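For reference, the math that these five new activation functions compute can be sketched in plain Python. This is illustrative only, not mygrad's actual tensor-based implementation, and the default `alpha`/`slope` values shown here are assumptions:

```python
import math

def elu(x, alpha=1.0):
    # Exponential linear unit: x for x > 0, alpha*(exp(x) - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1)

def leaky_relu(x, slope=0.1):
    # Leaky ReLU: x for x > 0, slope*x otherwise
    return x if x > 0 else slope * x

def selu(x):
    # Scaled ELU, using the fixed constants from Klambauer et al. (2017)
    scale, alpha = 1.0507009873554805, 1.6732632423543772
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1))

def soft_sign(x):
    # Soft sign: x / (1 + |x|)
    return x / (1 + abs(x))

def glu(x):
    # Gated linear unit: split the input in half, then gate the first
    # half with the sigmoid of the second half (Dauphin et al., 2016)
    half = len(x) // 2
    sigmoid = lambda v: 1 / (1 + math.exp(-v))
    return [a * sigmoid(b) for a, b in zip(x[:half], x[half:])]
```

Unlike these scalar sketches, the mygrad versions operate elementwise on tensors and participate in backpropagation.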


.. _v1.5.0:

-------------------
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.elu.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.elu
===========================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: elu
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.glu.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.glu
===========================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: glu
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.hard_tanh.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.hard_tanh
=================================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: hard_tanh
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.leaky_relu.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.leaky_relu
==================================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: leaky_relu
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.logsoftmax.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.logsoftmax
==================================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: logsoftmax
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.relu.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.relu
============================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: relu
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.selu.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.selu
============================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: selu
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.sigmoid.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.sigmoid
===============================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: sigmoid
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.soft_sign.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.soft_sign
=================================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: soft_sign
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.softmax.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.softmax
===============================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: softmax
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.activations.tanh.rst
@@ -0,0 +1,6 @@
mygrad.nnet.activations.tanh
============================

.. currentmodule:: mygrad.nnet.activations

.. autofunction:: tanh
6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.batchnorm.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.conv_nd.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.gru.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.hard_tanh.rst

This file was deleted.

6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.layers.batchnorm.rst
@@ -0,0 +1,6 @@
mygrad.nnet.layers.batchnorm
============================

.. currentmodule:: mygrad.nnet.layers

.. autofunction:: batchnorm
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.layers.conv_nd.rst
@@ -0,0 +1,6 @@
mygrad.nnet.layers.conv\_nd
===========================

.. currentmodule:: mygrad.nnet.layers

.. autofunction:: conv_nd
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.layers.gru.rst
@@ -0,0 +1,6 @@
mygrad.nnet.layers.gru
======================

.. currentmodule:: mygrad.nnet.layers

.. autofunction:: gru
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.layers.max_pool.rst
@@ -0,0 +1,6 @@
mygrad.nnet.layers.max\_pool
============================

.. currentmodule:: mygrad.nnet.layers

.. autofunction:: max_pool
6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.logsoftmax.rst

This file was deleted.

@@ -0,0 +1,6 @@
mygrad.nnet.losses.margin\_ranking\_loss
========================================

.. currentmodule:: mygrad.nnet.losses

.. autofunction:: margin_ranking_loss
6 changes: 6 additions & 0 deletions docs/source/generated/mygrad.nnet.losses.multiclass_hinge.rst
@@ -0,0 +1,6 @@
mygrad.nnet.losses.multiclass\_hinge
====================================

.. currentmodule:: mygrad.nnet.losses

.. autofunction:: multiclass_hinge
@@ -0,0 +1,6 @@
mygrad.nnet.losses.softmax\_crossentropy
========================================

.. currentmodule:: mygrad.nnet.losses

.. autofunction:: softmax_crossentropy
6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.margin_ranking_loss.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.max_pool.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.multiclass_hinge.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.relu.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.sigmoid.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.softmax.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.softmax_crossentropy.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/source/generated/mygrad.nnet.tanh.rst

This file was deleted.

11 changes: 10 additions & 1 deletion docs/source/nnet.rst
@@ -1,7 +1,7 @@
Neural network operations (:mod:`mygrad.nnet`)
**********************************************

.. currentmodule:: mygrad.nnet
.. currentmodule:: mygrad.nnet.layers


Layer operations
@@ -14,6 +14,7 @@ Layer operations
max_pool
gru

.. currentmodule:: mygrad.nnet.losses

Losses
------
@@ -25,15 +26,23 @@ Losses
margin_ranking_loss


.. currentmodule:: mygrad.nnet.activations

Activations
-----------
.. autosummary::
:toctree: generated/


elu
glu
hard_tanh
leaky_relu
logsoftmax
selu
sigmoid
softmax
soft_sign
relu
tanh

