
Draft: [onert-micro] Bring up SparseCrossEntropy Loss #13431

Draft · ljwoo94 wants to merge 5 commits into master

Conversation

ljwoo94 (Contributor) commented Jul 15, 2024

What

  • This commit adds a feature to support Sparse Cross Entropy as a loss function (a minimal sketch follows this list).
  • This commit also adds a Sparse Cross Entropy Accuracy metric to check the accuracy of the above loss.
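
A minimal sketch of the two pieces described above, assuming softmax probabilities laid out as batch_size x num_classes and integer class indices as targets. The function names and signatures are illustrative only, not onert-micro's actual API:

#include <cassert>
#include <cmath>
#include <cstddef>
#include <cstdint>

// Sparse cross entropy: each target is a single class index rather than a
// one-hot vector, so only the predicted probability at that index
// contributes to the loss.
float sparseCrossEntropyLoss(const float *probs, const uint32_t *targets,
                             size_t batch_size, size_t num_classes)
{
  float loss = 0.0f;
  for (size_t b = 0; b < batch_size; ++b)
  {
    assert(targets[b] < num_classes);
    float p = probs[b * num_classes + targets[b]];
    if (p < 1e-10f)
      p = 1e-10f; // clamp to avoid log(0)
    loss -= std::log(p);
  }
  return loss / static_cast<float>(batch_size);
}

// Matching accuracy metric: a sample counts as correct when the argmax of
// the predicted distribution equals the sparse target index.
float sparseCrossEntropyAccuracy(const float *probs, const uint32_t *targets,
                                 size_t batch_size, size_t num_classes)
{
  size_t correct = 0;
  for (size_t b = 0; b < batch_size; ++b)
  {
    size_t argmax = 0;
    for (size_t c = 1; c < num_classes; ++c)
    {
      if (probs[b * num_classes + c] > probs[b * num_classes + argmax])
        argmax = c;
    }
    if (argmax == targets[b])
      ++correct;
  }
  return static_cast<float>(correct) / static_cast<float>(batch_size);
}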

ToDo

Jungwoo Lee added 4 commits July 11, 2024 15:58
This test case is added to test the SparseCrossEntropy feature that will be added.

Signed-off-by: Jungwoo Lee <[email protected]>
ljwoo94 (Contributor, Author) commented Jul 15, 2024

TF result: 0.95
onert-micro result: 0.95

BalyshevArtem (Contributor) left a comment

Overall LGTM, thank you!
Let's split this draft into PRs and merge them.

Comment on lines +67 to +70
if (loss_type == SPARSE_CROSS_ENTROPY)
{
  offset = batch_num * data_type_size;
}

ljwoo94 (Contributor, Author) replied:

@BalyshevArtem
Is it OK to change the offset according to the loss_type?
I couldn't come up with a way to refactor this code to apply sparse cross entropy... :'(
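
A hedged reading of why the offset differs for the sparse case, assuming (not verified against this diff) that dense targets are stored one-hot with num_classes values per sample while sparse targets are a single class index per sample; num_classes is a hypothetical name here:

// Assumed target-buffer layouts, for illustration only.
// Dense (one-hot) cross entropy: num_classes values per sample.
//   offset = batch_num * num_classes * data_type_size;
// Sparse cross entropy: one class index per sample.
//   offset = batch_num * data_type_size;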
