
[Graph] change of inplace-type setting method #2794

Merged (1 commit) on Nov 22, 2024

Conversation

@baek2sm (Contributor) commented on Nov 13, 2024:

The method of setting the in-place type has been redefined.

In-place processing becomes complicated because a multi-out layer shares its output variables, so it must be determined whether each following layer can actually run in-place.

In practice, the only layers that can run in-place after a multi-out layer are no-operation layers (no-op layers): identity, reshape, and flatten.

Other layers, even if they support in-place execution, cannot run in-place when a multi-out layer is in front of them.

Note that no-op layers connected to a multi-out layer share memory with it, so they have the same properties as the multi-out layer. This property is expressed as RESTRICTING in our code.

Based on these definitions, I've redesigned the method of setting the in-place type.

  1. By default, initialize the in-place type of each layer: if supportInPlace is true, it is initialized as NON_RESTRICTING; otherwise, it is initialized as NONE.
  2. However, not all layers are initialized this way. Multi-out layers and no-op layers are initialized as RESTRICTING when supportInPlace is true (a no-op layer is later changed to NON_RESTRICTING in network_graph.cpp if it is not connected to a multi-out layer).
  3. After initialization, the input connections are checked in network_graph.cpp to determine the final in-place type. See the source code for the details of this part.
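The initialization rules above can be sketched roughly as follows. This is a minimal illustration, not the actual nntrainer code: the enum values match the names used in this PR, but the function names and the bool-based interface are assumptions for the sake of the example.

```cpp
#include <cassert>

// Hypothetical enum mirroring the in-place types discussed above.
enum class InPlaceType { NONE, NON_RESTRICTING, RESTRICTING };

// Step 1: the default rule. A layer that supports in-place execution
// starts as NON_RESTRICTING; any other layer starts as NONE.
InPlaceType initializeDefault(bool supportInPlace) {
  return supportInPlace ? InPlaceType::NON_RESTRICTING : InPlaceType::NONE;
}

// Step 2: the exception for multi-out and no-op layers. They start as
// RESTRICTING; a no-op layer is relaxed back to NON_RESTRICTING later by
// the graph pass (step 3) if no multi-out layer is connected to it.
InPlaceType initializeMultiOutOrNoOp(bool supportInPlace) {
  return supportInPlace ? InPlaceType::RESTRICTING : InPlaceType::NONE;
}
```

Step 3 is the graph-level pass in network_graph.cpp, which inspects the input connections of each node to settle the final type.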

Self evaluation:

  1. Build test: [X]Passed [ ]Failed [ ]Skipped
  2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Seungbaek Hong [email protected]

@taos-ci commented on Nov 13, 2024:

📝 TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2794. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments quickly from reviewers. Your PR must pass all verification processes of cibot before a review by reviewers can start. If you are a new member joining this project, please read the manuals in the documentation folder and the wiki page. To monitor the progress of your PR in more detail, visit http://ci.nnstreamer.ai/.

@taos-ci commented on Nov 13, 2024:

:octocat: cibot: @baek2sm, nntrainer/layers/identity_layer.h does not include Doxygen tags such as @file @brief @author @bug. You must include the Doxygen tags in the source code. Please refer to a Doxygen manual at http://github.com/nnstreamer/TAOS-CI/blob/main/ci/doc/doxygen-documentation.md

@taos-ci left a comment:

@baek2sm, 💯 All CI checkers are successfully verified. Thanks.

@baek2sm baek2sm changed the title [Graph] change of inplace-type setting method [WIP][Graph] change of inplace-type setting method Nov 14, 2024
@baek2sm baek2sm added WIP and removed Need Review labels Nov 14, 2024
@baek2sm baek2sm changed the title [WIP][Graph] change of inplace-type setting method [Graph] change of inplace-type setting method Nov 14, 2024
@baek2sm baek2sm added Need Review and removed WIP labels Nov 14, 2024
@taos-ci left a comment:

@baek2sm, 💯 All CI checkers are successfully verified. Thanks.

@skykongkong8 (Member) left a comment:

LGTM

@@ -796,7 +713,8 @@ setInplaceSharedMemoryConfigByLayer(const std::shared_ptr<LayerNode> &lnode,
shared_var = true;
shared_grad = true;
}
/** @todo for addition layer, variables are not shared but gradients are */
/** @todo for addition layer, variables are not shared but gradients are
Collaborator left a comment:
Could you check the indentation in clang-format?

Contributor Author (@baek2sm):

This comment needed to be deleted. I've removed it. Thank you!

Comment on lines 942 to 945
/**
* @brief
*
*/
Contributor (@EunjuYang) left a comment:
It seems to be an unintentional comment.

Suggested change
/**
* @brief
*
*/

Contributor Author (@baek2sm):

@EunjuYang I removed it. Thanks!

* @brief Initialize the in-place type of the layer
* @return InPlaceType
*/
InPlaceType initializeInPlaceType() override {
Contributor @EunjuYang commented on Nov 19, 2024:

A simple question!
FlattenLayer, which inherits from ReshapeLayer, seems to override initializeInPlaceType() with the same code as its parent class. If it is always the same, what about defining this function (ReshapeLayer's initializeInPlaceType()) as final instead of override, to reduce code duplication?

Contributor Author (@baek2sm):

@EunjuYang That's a good point! I made that function of the reshape layer final instead of override, and removed the function from the flatten layer. Thank you.
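The agreed change can be illustrated with a minimal sketch. The class and member names follow the discussion above, but the surrounding declarations and return values are simplified assumptions, not the real nntrainer class hierarchy.

```cpp
#include <cassert>

// Simplified stand-in for the in-place types used in this PR.
enum class InPlaceType { NONE, NON_RESTRICTING, RESTRICTING };

struct Layer {
  virtual ~Layer() = default;
  virtual InPlaceType initializeInPlaceType() { return InPlaceType::NONE; }
};

// ReshapeLayer marks its override `final`, so subclasses inherit this
// body and can no longer re-implement it.
struct ReshapeLayer : Layer {
  InPlaceType initializeInPlaceType() final {
    return InPlaceType::RESTRICTING; // no-op layers start as RESTRICTING
  }
};

// FlattenLayer no longer declares initializeInPlaceType(); trying to
// override it here would now be a compile-time error.
struct FlattenLayer : ReshapeLayer {};
```

This removes the duplicated body from FlattenLayer while the compiler guarantees no subclass silently diverges from ReshapeLayer's behavior.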

@taos-ci left a comment:
@baek2sm, 💯 All CI checkers are successfully verified. Thanks.

Contributor @EunjuYang left a comment:
LGTM!

Contributor @djeong20 left a comment:
LGTM

@jijoongmoon jijoongmoon merged commit 2c34be6 into nnstreamer:main Nov 22, 2024
41 checks passed