Why is the training-saved checkpoint larger than the official pre-trained checkpoint?
Answered by RangeKing on Oct 8, 2022
Because the optimizer state is also stored in the training-saved checkpoint. If you want to remove those parameters, you can use the `tools/model_converters/publish_model.py` script. For details, refer to the doc: prepare-a-model-for-publishing.
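For reference, the underlying idea can also be shown by hand. The sketch below is a minimal illustration, not the `publish_model.py` script itself: it assumes the checkpoint is a dict that keeps the optimizer state under an `optimizer` key (the usual MMCV/MMEngine layout), and the file paths are placeholders.

```python
import torch

# Minimal sketch: strip the optimizer state from a training checkpoint.
# Assumes the checkpoint is a dict holding the model weights (e.g. under
# 'state_dict') alongside an 'optimizer' entry; paths are placeholders.
ckpt = torch.load("work_dirs/my_exp/latest.pth", map_location="cpu")

# Drop the entry that only matters for resuming training; the remaining
# weights-only checkpoint is what gets published.
ckpt.pop("optimizer", None)

torch.save(ckpt, "my_exp_weights_only.pth")
```

In practice, prefer the script, which is typically invoked as `python tools/model_converters/publish_model.py in.pth out.pth` (see the linked doc for the exact arguments); it also recomputes the SHA256 hash suffix on the output filename, which the manual approach above does not.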