Hello, thanks for sharing. I have a question about multi-worker training. When training across multiple machines with Keras and tf.distribute.experimental.MultiWorkerMirroredStrategy(), the global batch size is batch_size * worker_num. Does the input data (in TFRecord format) need to be split into worker_num shards in advance? If it is not split and every worker reads the full dataset, will this distribution strategy automatically shard the data across workers?
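For context on what auto-sharding means here: when a tf.data pipeline is used under a multi-worker strategy, TensorFlow's auto-shard policy (tf.data.experimental.AutoShardPolicy) can shard either by file or by data element; under the DATA policy, every worker reads the full stream but keeps only every worker_num-th element, offset by its own worker index. The following is a minimal plain-Python sketch of that element-level sharding idea (an assumption about the behavior being asked about, not TensorFlow's actual implementation); the `shard` helper and `records` list are hypothetical names for illustration.

```python
def shard(records, num_workers, worker_id):
    """Keep only the elements assigned to this worker.

    Sketch of element-level (DATA-policy-style) sharding: worker w out of
    n keeps elements at positions w, w+n, w+2n, ... so the workers' slices
    are disjoint and together cover the whole dataset, with no need to
    pre-split the input files.
    """
    return [r for i, r in enumerate(records) if i % num_workers == worker_id]


records = list(range(10))  # stand-in for a stream of TFRecord examples
workers = [shard(records, 3, w) for w in range(3)]
print(workers)  # each worker sees a disjoint subset covering all records
```

The takeaway of the sketch: even if every worker opens the same input, a deterministic index-modulo rule is enough to give each worker a disjoint share, which is why pre-splitting into worker_num files is not strictly required (though file-level sharding avoids each worker reading data it will discard).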