
Why is the training-saved checkpoint larger than the official pre-trained checkpoint? #8970

Answered by RangeKing
RangeKing asked this question in Q&A

Because the optimizer state is also stored in the training-saved checkpoint. If you want to remove those parameters, you can use the tools/model_converters/publish_model.py script. For details, refer to the doc: prepare-a-model-for-publishing.
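Conceptually, the publish step keeps only the model weights (and metadata) and drops the optimizer state, then tags the output filename with a short content hash. A minimal sketch of that idea, using a plain dict as a stand-in for a real checkpoint (the key names `state_dict`, `meta`, and `optimizer` follow the common MMCV/MMEngine checkpoint layout; check your own checkpoint's keys, and use the actual publish_model.py script for real models):

```python
import hashlib
import pickle

def strip_optimizer(ckpt: dict) -> dict:
    # Keep only the weights and metadata; the dropped optimizer state
    # (momentum buffers, etc.) is what makes training checkpoints larger.
    return {k: v for k, v in ckpt.items() if k in ("state_dict", "meta")}

# Toy checkpoint standing in for a real torch.save()'d training checkpoint.
ckpt = {
    "meta": {"epoch": 12},
    "state_dict": {"backbone.conv1.weight": [0.1, 0.2]},
    "optimizer": {"state": {0: {"momentum_buffer": [0.0, 0.0]}}},
}

published = strip_optimizer(ckpt)

# publish_model.py also appends a short hash of the saved file to its name,
# e.g. model-abcd1234.pth; sketched here with pickle + sha256.
blob = pickle.dumps(published)
short_hash = hashlib.sha256(blob).hexdigest()[:8]
filename = f"model-{short_hash}.pth"
```

The published dict is strictly smaller than the original, since the optimizer entry (which can roughly double the file size for momentum-based optimizers) is gone.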


RangeKing (Collaborator) · Oct 8, 2022

Answer selected by RangeKing