
[layer] add pow operation layer @open sesame 12/06 20:13 #2801

Merged
merged 1 commit into nnstreamer:main on Jan 23, 2025

Conversation

baek2sm
Contributor

@baek2sm baek2sm commented Nov 20, 2024

Added a pow operation layer.

There was already an example pow layer among the custom layers, so I changed the key of the custom pow layer to "custom_pow" to avoid a duplicate layer key.

Self evaluation:

  1. Build test: [X]Passed [ ]Failed [ ]Skipped
  2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Seungbaek Hong [email protected]

@taos-ci

taos-ci commented Nov 20, 2024

📝 TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2801. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments from reviewers quickly. Your PR must pass all verification processes of cibot before a review by reviewers can start. If you are a new member joining this project, please read the manuals in the documentation folder and wiki pages. To monitor the progress of your PR in more detail, visit http://ci.nnstreamer.ai/.

@baek2sm baek2sm force-pushed the pow_layer branch 7 times, most recently from 0653289 to ac31a83 Compare November 20, 2024 11:14
@taos-ci

taos-ci commented Nov 20, 2024

:octocat: cibot: @baek2sm, the build check could not be completed because one of the checkers did not finish. To find out the reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2801-202411202014550.55517911911011-ac31a83628513b209b423e494584b2097a450335/.

Comment on lines 29 to 32
void PowLayer::forwarding_operation(const Tensor &input, Tensor &hidden) {
float exp = std::get<props::Exponent>(pow_props).get();
input.pow(exp, hidden);
}
Member


Quick question! This isn't really related to this PR, but:
Is there any difference from multiplying the tensor by itself when the given exponent is 2? (Or the inv_sqrt function vs. pow with a -0.5 exponent?)
I find that almost every pow or pow_i usage in nntrainer uses 2.0, 0.5, or -0.5 as the exponent, and these cases are handled with a naive loop in a Tensor member function.

Contributor Author

@baek2sm baek2sm Nov 21, 2024


@skykongkong8 There is no difference in computation between multiplying the tensor by itself and setting the exponent to 2. However, for the cases you mentioned, I will handle them by adding square, sqrt, and rsqrt functions instead of using the pow function. Thanks!

@baek2sm baek2sm changed the title [Wait for #2797,#2800][layer] add pow operation layer [WIP][Wait for #2797,#2800][layer] add pow operation layer Nov 21, 2024

@taos-ci taos-ci left a comment


@baek2sm, 💯 All CI checkers are successfully verified. Thanks.

@baek2sm baek2sm changed the title [WIP][Wait for #2797,#2800][layer] add pow operation layer [Wait for #2797,#2800][layer] add pow operation layer Nov 21, 2024
@baek2sm baek2sm changed the title [Wait for #2797,#2800][layer] add pow operation layer [Wait for #2797][layer] add pow operation layer Nov 29, 2024

@taos-ci taos-ci left a comment


@baek2sm, 💯 All CI checkers are successfully verified. Thanks.

@baek2sm baek2sm changed the title [Wait for #2797][layer] add pow operation layer [Wait for #2797][layer] add pow operation layer @open sesame 12/02 14:40 Dec 2, 2024

@taos-ci taos-ci left a comment


@baek2sm, 💯 All CI checkers are successfully verified. Thanks.


@baek2sm baek2sm changed the title [Wait for #2797][layer] add pow operation layer @open sesame 12/02 14:40 [Wait for #2797][layer] add pow operation layer @open sesame 12/06 20:13 Dec 6, 2024
@@ -9,6 +9,7 @@ layer_sources = [
'subtract_layer.cpp',
'multiply_layer.cpp',
'divide_layer.cpp',
'pow_layer.cpp',
Contributor


This is a simple suggestion (not directly related to this PR, though).
What about using a prefix like 'op_' for the operation layers?
It might reduce confusion between layer types (e.g., add_layer vs. addition_layer).

Contributor Author


That sounds like a good idea. I'll reflect it in a later PR. Thanks!

}

void PowLayer::forwarding_operation(const Tensor &input, Tensor &hidden) {
float exp = std::get<props::Exponent>(pow_props).get();
Contributor

@EunjuYang EunjuYang Dec 9, 2024


Can we change the name exp? (An exp function is already defined in cmath.)

Contributor Author


I've modified it (exp -> exponent). Thanks a lot!

Contributor

@EunjuYang EunjuYang left a comment


LGTM

@myungjoo
Member

unittest_models_v2.tar.gz

  1. You have two .tar.gz files with the same name.
  2. I don't see any scripts that access these .tar.gz files.

@baek2sm
Contributor Author

baek2sm commented Dec 23, 2024

@myungjoo

  1. The inclusion of the duplicate file was my mistake. I've removed one, thanks!
  2. That file contains the golden data for the unit tests, and it is extracted by the test/unittest/meson.build script (which was already included before this commit). The test/unittest/unittest_models.cpp script then uses these golden data files. Thanks a lot.

Collaborator

@jijoongmoon jijoongmoon left a comment


LGTM

@baek2sm baek2sm changed the title [Wait for #2797][layer] add pow operation layer @open sesame 12/06 20:13 [layer] add pow operation layer @open sesame 12/06 20:13 Jan 7, 2025
@jijoongmoon jijoongmoon merged commit 091c496 into nnstreamer:main Jan 23, 2025
22 checks passed
6 participants