Merge branch 'update_hugectr_version_23.6.0' into 'main'
Update new version: 23.6.0

See merge request dl/hugectr/hugectr!1388
minseokl committed Jun 14, 2023
2 parents fda5fed + f860a3b commit cfe69e2
Showing 28 changed files with 50 additions and 50 deletions.
2 changes: 1 addition & 1 deletion HugeCTR/include/common.hpp
@@ -58,7 +58,7 @@
namespace HugeCTR {

#define HUGECTR_VERSION_MAJOR 23
-#define HUGECTR_VERSION_MINOR 5
+#define HUGECTR_VERSION_MINOR 6
#define HUGECTR_VERSION_PATCH 0

#define WARP_SIZE 32
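For reference, the macros above are what downstream code reads to report the library version. A minimal sketch of how they map to the `23.6.0` version string in the commit message and the zero-padded `23.06` container tag used elsewhere in this diff (the mapping is an assumption inferred from this commit, not code from the repository):

```python
# Hypothetical sketch: how the version macros in common.hpp relate to
# HugeCTR's calendar version string and the matching NGC container tag.
# The zero-padded tag format ("23.06") is an assumption based on the
# docker commands changed in this same commit.

MAJOR, MINOR, PATCH = 23, 6, 0  # values after this commit

version = f"{MAJOR}.{MINOR}.{PATCH}"    # library version, e.g. "23.6.0"
container_tag = f"{MAJOR}.{MINOR:02d}"  # NGC image tag, e.g. "23.06"

print(version, container_tag)
```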
2 changes: 1 addition & 1 deletion README.md
@@ -42,7 +42,7 @@ If you'd like to quickly train a model using the Python interface, do the follow

1. Start a NGC container with your local host directory (/your/host/dir mounted) by running the following command:
```
-docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

**NOTE**: The **/your/host/dir** directory is just as visible as the **/your/container/dir** directory. The **/your/host/dir** directory is also your starting directory.
4 changes: 2 additions & 2 deletions docs/source/hierarchical_parameter_server/profiling_hps.md
@@ -67,13 +67,13 @@ To build HPS profiler from source, do the following:
Pull the container using the following command:

```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

3. Here is an example of how you can build HPS Profiler using the build options:
2 changes: 1 addition & 1 deletion docs/source/hugectr_user_guide.md
@@ -83,7 +83,7 @@ The following sample command pulls and starts the Merlin Training container:

```shell
# Run the container in interactive mode
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

### Building HugeCTR from Scratch
2 changes: 1 addition & 1 deletion hps_tf/hps_cc/config.hpp
@@ -15,7 +15,7 @@
*/
#pragma once

-// TODO: The configurations are not needed anymore in merlin-base:23.05
+// TODO: The configurations are not needed anymore in merlin-base:23.06
// #include <absl/base/options.h>
// #undef ABSL_OPTION_USE_STD_STRING_VIEW
// #define ABSL_OPTION_USE_STD_STRING_VIEW 0
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hierarchical_parameter_server_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.05 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05`.\n",
+"The HPS Python module is preinstalled in the 23.06 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hps_multi_table_sparse_input_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.05 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05`.\n",
"The HPS Python module is preinstalled in the 23.06 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hps_pretrained_model_training_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.05 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05`.\n",
+"The HPS Python module is preinstalled in the 23.06 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hps_table_fusion_demo.ipynb
@@ -57,7 +57,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.05 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05`.\n",
"The HPS Python module is preinstalled in the 23.06 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
14 changes: 7 additions & 7 deletions hps_tf/notebooks/hps_tensorflow_triton_deployment_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.05 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05`.\n",
+"The HPS Python module is preinstalled in the 23.06 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -854,9 +854,9 @@
"INFO:tensorflow:Automatic mixed precision has been deactivated.\n",
"2022-11-23 01:37:23.028482: I tensorflow/core/grappler/devices.cc:66] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 1\n",
"2022-11-23 01:37:23.028568: I tensorflow/core/grappler/clusters/single_machine.cc:358] Starting new session\n",
-"2022-11-23 01:37:23.051909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
-"2022-11-23 01:37:23.058593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
-"2022-11-23 01:37:23.059761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
+"2022-11-23 01:37:23.061909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
+"2022-11-23 01:37:23.068593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
+"2022-11-23 01:37:23.069761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
"\n",
"################################################################################\n",
"TensorRT unsupported/non-converted OP Report:\n",
@@ -872,9 +872,9 @@
"For more information see https://docs.nvidia.com/deeplearning/frameworks/tf-trt-user-guide/index.html#supported-ops.\n",
"################################################################################\n",
"\n",
-"2022-11-23 01:37:23.059860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
-"2022-11-23 01:37:23.059893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
-"2022-11-23 01:37:23.050667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
+"2022-11-23 01:37:23.069860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
+"2022-11-23 01:37:23.069893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
+"2022-11-23 01:37:23.060667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
]
},
{
2 changes: 1 addition & 1 deletion hps_tf/notebooks/sok_to_hps_dlrm_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get SOK from NGC\n",
"\n",
-"Both SOK and HPS Python modules are preinstalled in the 23.05 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05`.\n",
+"Both SOK and HPS Python modules are preinstalled in the 23.06 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
2 changes: 1 addition & 1 deletion hps_trt/notebooks/demo_for_hugectr_trained_model.ipynb
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
-"The HPS TensorRT plugin is preinstalled in the 23.05 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:23.05`.\n",
+"The HPS TensorRT plugin is preinstalled in the 23.06 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
2 changes: 1 addition & 1 deletion hps_trt/notebooks/demo_for_pytorch_trained_model.ipynb
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
-"The HPS TensorRT plugin is preinstalled in the 23.05 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.05`.\n",
+"The HPS TensorRT plugin is preinstalled in the 23.06 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
2 changes: 1 addition & 1 deletion hps_trt/notebooks/demo_for_tf_trained_model.ipynb
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
-"The HPS TensorRT plugin is preinstalled in the 23.05 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05`.\n",
+"The HPS TensorRT plugin is preinstalled in the 23.06 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
6 changes: 3 additions & 3 deletions notebooks/README.md
@@ -19,16 +19,16 @@ git clone https://github.com/NVIDIA/HugeCTR
Pull the container using the following command:

```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

-> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:23.05` container.
+> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:23.06` container.
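Version bumps like this commit rewrite the same image tag across dozens of commands. In your own launch scripts, keeping the tag in one variable avoids that churn. A minimal sketch (the variable names and command shape are illustrative, not from the repository):

```python
# Hypothetical sketch: centralize the NGC tag so a version bump means
# editing one line instead of every docker command in the docs.
TAG = "23.06"  # the tag introduced by this commit

hugectr_image = f"nvcr.io/nvidia/merlin/merlin-hugectr:{TAG}"
tensorflow_image = f"nvcr.io/nvidia/merlin/merlin-tensorflow:{TAG}"

# Example: assemble the interactive launch command used in the notebooks README.
docker_cmd = (
    "docker run --gpus all --rm -it --cap-add SYS_NICE "
    f"-v $(pwd):/HugeCTR -w /HugeCTR {hugectr_image}"
)
print(docker_cmd)
```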

## 3. Customized Building (Optional)

4 changes: 2 additions & 2 deletions notebooks/movie-lens-example.ipynb
@@ -1242,8 +1242,8 @@
"[HCTR][08:54:59.305][INFO][RK0][main]: Iter: 10000 Time(1000 iters): 14.0335s Loss: 0.0474014 lr:0.099995\n",
"[HCTR][08:55:08.171][INFO][RK0][main]: Iter: 11000 Time(1000 iters): 8.8579s Loss: 0.0336978 lr:0.0950576\n",
"[HCTR][08:55:16.985][INFO][RK0][main]: Iter: 12000 Time(1000 iters): 8.80581s Loss: 0.0208526 lr:0.0902453\n",
-"[HCTR][08:55:23.050][INFO][RK0][main]: Evaluation, AUC: 0.990911\n",
-"[HCTR][08:55:23.050][INFO][RK0][main]: Eval Time for 1000 iters: 5.11441s\n",
+"[HCTR][08:55:23.060][INFO][RK0][main]: Evaluation, AUC: 0.990911\n",
+"[HCTR][08:55:23.060][INFO][RK0][main]: Eval Time for 1000 iters: 5.11441s\n",
"[HCTR][08:55:30.936][INFO][RK0][main]: Iter: 13000 Time(1000 iters): 13.9421s Loss: 0.0173013 lr:0.0855579\n",
"[HCTR][08:55:39.769][INFO][RK0][main]: Iter: 14000 Time(1000 iters): 8.82507s Loss: 0.0128202 lr:0.0809955\n",
"[HCTR][08:55:48.619][INFO][RK0][main]: Iter: 15000 Time(1000 iters): 8.84112s Loss: 0.0100981 lr:0.0765581\n",
6 changes: 3 additions & 3 deletions release_notes.md
@@ -1,6 +1,6 @@
# Release Notes

-## What's New in Version 23.05
+## What's New in Version 23.06
In this release, we have fixed issues and enhanced the code.

+ **3G Embedding Updates**:
@@ -130,7 +130,7 @@ In this release, we have fixed issues and enhanced the code.

```{important}
In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.3`.
-Afterward, the library will use calendar versioning only, such as `v23.05`.
+Afterward, the library will use calendar versioning only, such as `v23.06`.
```

+ **Support for BERT and Variants**:
@@ -212,7 +212,7 @@ The [HugeCTR Training and Inference with Remote File System Example](https://nvi

```{important}
In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.2`.
-Afterward, the library will use calendar versioning only, such as `v23.05`.
+Afterward, the library will use calendar versioning only, such as `v23.06`.
```

+ **Change to HPS with Redis or Kafka**:
4 changes: 2 additions & 2 deletions samples/criteo/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/criteo_multi_slots/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running this command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/dcn/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/deepfm/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/din/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/dlrm/README.md
@@ -16,11 +16,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.05
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.06
```

### Build the HugeCTR Docker Container on Your Own ###