
Bug fix for the schedule after the update_optimizer #272

Merged
merged 6 commits into analogdevicesinc:develop on Nov 30, 2023

Conversation

oguzhanbsolak
Contributor

After update_optimizer runs, the updated optimizer needs to be reflected in the compression scheduler. Otherwise, the scheduler keeps using the old optimizer and the learning rate is not scheduled correctly. This PR fixes the issue by traversing each policy and replacing the old optimizer object with the new one.
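
To make the failure mode concrete, here is a small, hypothetical illustration in plain PyTorch (not code from this repository): an LR scheduler built around the old optimizer keeps stepping the old optimizer's parameter groups, so a newly created optimizer never sees the scheduled learning rate.

        import torch

        # Hypothetical illustration of the bug, not repository code: the scheduler
        # was constructed with the old optimizer and keeps a reference to it.
        model = torch.nn.Linear(4, 2)
        old_opt = torch.optim.SGD(model.parameters(), lr=0.1)
        sched = torch.optim.lr_scheduler.StepLR(old_opt, step_size=1, gamma=0.5)

        # Stand-in for what update_optimizer does: a fresh optimizer object.
        new_opt = torch.optim.SGD(model.parameters(), lr=0.1)

        sched.step()
        print(old_opt.param_groups[0]['lr'])  # 0.05 -- the stale optimizer is scheduled
        print(new_opt.param_groups[0]['lr'])  # 0.1  -- the optimizer actually in use is not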

train.py Outdated
@@ -128,6 +128,7 @@

def main():
"""main"""
# pylint: disable=too-many-branches
Contributor


please remove and fix limits in .pylintrc and .github/linters/.python-lint

Contributor Author


Addressed in the latest commit.

Contributor

@rotx-eva left a comment


Please remove line 131 and fix limits in .pylintrc and .github/linters/.python-lint instead

Contributor

@rotx-eva left a comment


Please also change .pylintrc

Contributor

@rotx-eva left a comment


Shouldn't this be:

        if compression_scheduler:
            for ep, _ in enumerate(compression_scheduler.policies):
                for pol in compression_scheduler.policies[ep]:
                    for attr_key in dir(pol):
                        attr = getattr(pol, attr_key)
                        if hasattr(attr, 'optimizer'):
                            attr.optimizer = optimizer

@rotx-eva
Contributor

Has this been tested with anything other than lr_scheduler?

@oguzhanbsolak
Contributor Author

Hi Robert,

Has this been tested with anything other than lr_scheduler?

I tested this change with lr_schedulers and with the RelationBasedKDPolicy policy. In the current train.py structure, we only use KnowledgeDistillationPolicy, RelationBasedKDPolicy, and lr_scheduler, and of these only lr_scheduler has an "optimizer" attribute. Therefore, this change shouldn't affect the other policies.

Shouldn't this be:

        if compression_scheduler:
            for ep, _ in enumerate(compression_scheduler.policies):
                for pol in compression_scheduler.policies[ep]:
                    for attr_key in dir(pol):
                        attr = getattr(pol, attr_key)
                        if hasattr(attr, 'optimizer'):
                            attr.optimizer = optimizer

When we don't define a scheduler policy, train.py creates a default compression_scheduler. Therefore, there will always be a compression_scheduler by the time this snippet runs, and I believe we don't need to check whether one exists.

For readability, your suggested version seems much better, so I am planning to push a new commit like the one below:

        for ep, _ in enumerate(compression_scheduler.policies):
            for pol in compression_scheduler.policies[ep]:
                for attr_key in dir(pol):
                    attr = getattr(pol, attr_key)
                    if hasattr(attr, 'optimizer'):
                        attr.optimizer = optimizer
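
As a quick sanity check on the approach (again a hedged, hypothetical snippet in plain PyTorch, not repository code): a PyTorch LR scheduler stores the optimizer it drives as `.optimizer`, which is exactly the attribute the loop above finds via hasattr and rebinds.

        import torch

        # Hypothetical check, not repository code: rebinding .optimizer makes the
        # existing scheduler drive the newly created optimizer's learning rate.
        model = torch.nn.Linear(4, 2)
        old_opt = torch.optim.SGD(model.parameters(), lr=0.1)
        sched = torch.optim.lr_scheduler.StepLR(old_opt, step_size=1, gamma=0.5)

        new_opt = torch.optim.Adam(model.parameters(), lr=0.1)
        sched.optimizer = new_opt  # what the loop above does per policy attribute
        sched.step()
        print(new_opt.param_groups[0]['lr'])  # 0.05 -- now scheduled on the new optimizer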

Contributor

@ermanok left a comment


Looks good.

@rotx-eva merged commit 74bb060 into analogdevicesinc:develop on Nov 30, 2023
1 check passed