Fix small typo
PiperOrigin-RevId: 574223535
grasskin authored and copybara-github committed Oct 17, 2023
1 parent c6fe18c commit d742984
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion site/en/tutorials/distribute/custom_training.ipynb
@@ -364,7 +364,7 @@
 "\n",
 " * Input batches shorter than `GLOBAL_BATCH_SIZE` create unpleasant corner cases in several places. In practice, it often works best to avoid them by allowing batches to span epoch boundaries using `Dataset.repeat().batch()` and defining approximate epochs by step counts, not dataset ends. Alternatively, `Dataset.batch(drop_remainder=True)` maintains the notion of epoch but drops the last few examples.\n",
 "\n",
-" For illustration, this example goes the harder route and allows short batches, so that each training epoch contains each trainig example exactly once.\n",
+" For illustration, this example goes the harder route and allows short batches, so that each training epoch contains each training example exactly once.\n",
 " \n",
 " Which denominator should be used by `tf.nn.compute_average_loss()`?\n",
 "\n",
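
For context on the cell being edited, here is a minimal sketch (not part of the commit) of the two batching options the notebook text describes and of the global-batch-size loss scaling it asks about. The dataset size and the GLOBAL_BATCH_SIZE value are made-up values for illustration.

import tensorflow as tf

GLOBAL_BATCH_SIZE = 4
dataset = tf.data.Dataset.range(10)  # 10 examples, not an exact multiple of 4

# Option 1: let batches span epoch boundaries, so every batch is full and
# "epochs" are approximated by step counts rather than by dataset ends.
spanning = dataset.repeat().batch(GLOBAL_BATCH_SIZE)

# Option 2: keep the notion of an epoch but drop the final short batch.
dropped = dataset.batch(GLOBAL_BATCH_SIZE, drop_remainder=True)

# The tutorial instead keeps short batches and scales the per-example loss by
# the global batch size, which is what tf.nn.compute_average_loss does:
per_example_loss = tf.constant([0.5, 0.25])  # e.g. a short final batch of 2
loss = tf.nn.compute_average_loss(per_example_loss,
                                  global_batch_size=GLOBAL_BATCH_SIZE)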
