🏂 Bert SNPE conversion & evaluation case #925
Conversation
Force-pushed from 1dfbf0a to b7561bc
Thanks for the changes
 ) -> FileListDataLoader:
     if isinstance(dataloader, FileListDataLoader):
         return dataloader
-    return FileListCommonDataLoader(dataloader, model.io_config, batch_size=file_list_batch_size)
+    return FileListCommonDataLoader(dataloader, model.io_config, batch_size=file_chunk_size)
Do we need to rename the `batch_size` parameter of `FileListCommonDataLoader` to `file_chunk_size` as well?
Maybe not. There are still many other cases that create the file list data loader directly. Say a user prepares the local input text (like we did for the mobilenet/inception/vgg cases); they still need the concept of `batch_size` to run batched inputs.
In the mobilenet/inception/vgg cases, is `batch_size` used as a normal dataset batch size? Looking at the source code (https://github.com/microsoft/Olive/blob/main/olive/platform_sdk/qualcomm/utils/data_loader.py#L73), the batch size is used as a chunk size, no?
For the file list data loader, the batch size always behaves like a file chunk size. But I am thinking that in the Mobilenet case, when a user creates the local files and the file list data loader and then iterates over it, `batch_size` may read better, as it is a common concept in any data loader.
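The distinction under discussion can be sketched as follows. This is an illustrative standalone example, not the actual Olive implementation: a file list loader does not stack samples into a batch tensor the way a typical `DataLoader` does; it merely splits the list of input files into groups of N that are processed per invocation, which is why "chunk" describes the behavior more accurately than "batch".

```python
# Hypothetical sketch: in a file-list-style loader, "batch_size" just
# partitions the list of input file paths into fixed-size chunks.
# The function name and setup below are illustrative, not Olive's API.

def chunk_file_list(file_list, chunk_size):
    """Yield successive chunks of file paths, each of length `chunk_size`
    (the last chunk may be shorter)."""
    for start in range(0, len(file_list), chunk_size):
        yield file_list[start:start + chunk_size]

files = [f"input_{i}.raw" for i in range(5)]
chunks = list(chunk_file_list(files, chunk_size=2))
# Three chunks: two of size 2 and one trailing chunk of size 1.
```

No tensors are collated here, so calling the parameter `file_chunk_size` would signal that difference to users familiar with conventional data loaders.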
If you insist on choosing `file_chunk_size`, maybe I can do it in a separate PR? I do not know how many example configs would need to be updated.
Does the parameter change require updating the config JSONs?
I am just thinking that `batch_size` is a common term associated with data loaders. If we use the same term for a different purpose, we'd better use a different name.
Please split `batch_size` and `file_chunk_size` in the next PR.