diff --git a/docs/batching.html b/docs/batching.html
index 3306898a4..44e2b0efc 100644
--- a/docs/batching.html
+++ b/docs/batching.html
@@ -76,4 +76,4 @@
The need for different mesh batch modes is inherent to the way PyTorch operators are implemented. To fully utilize the optimized PyTorch ops, the Meshes data structure allows for efficient conversion between the different batch modes. This is crucial for a fast and efficient training cycle. Mesh R-CNN is an example: within a single forward pass, different parts of the network expect different input formats, which are obtained by converting between the batch modes. In particular, vert_align expects a padded input tensor, while graph_conv, which follows immediately after, expects a packed input tensor.
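To make the padded/packed distinction concrete, here is a minimal, dependency-free sketch of the two batch modes. It is not the PyTorch3D implementation (which operates on torch tensors via methods such as `verts_padded()` and `verts_packed()` on `Meshes`); the helper names `to_packed` and `to_padded` are hypothetical, and each "vertex" is just a list of 3 floats.

```python
def to_packed(verts_list):
    """Packed mode: concatenate the vertices of all meshes into one flat
    list, and record the index at which each mesh's vertices start."""
    packed, first_idx = [], []
    for verts in verts_list:
        first_idx.append(len(packed))
        packed.extend(verts)
    return packed, first_idx


def to_padded(verts_list, pad_value=0.0):
    """Padded mode: a (num_meshes, max_verts, 3) layout where meshes with
    fewer vertices are padded with pad_value rows."""
    max_verts = max(len(v) for v in verts_list)
    return [
        list(verts) + [[pad_value] * 3] * (max_verts - len(verts))
        for verts in verts_list
    ]


# A batch of two meshes with 2 and 3 vertices respectively.
verts_list = [
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
]
packed, first_idx = to_packed(verts_list)   # 5 vertices total, starts at [0, 2]
padded = to_padded(verts_list)              # shape (2, 3, 3), mesh 0 padded with one zero row
```

The trade-off this illustrates: padded tensors have a regular shape, which ops like vert_align need for batched indexing, while packed tensors waste no memory on padding, which suits per-vertex ops like graph_conv.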
-