Commit

Fix inf container tag in getting started TF-inf nb and polish exp REA…
rnyak authored Apr 18, 2022
1 parent c96e554 commit b54063e
Showing 2 changed files with 5 additions and 6 deletions.
7 changes: 3 additions & 4 deletions examples/README.md
@@ -60,18 +60,17 @@ You can run the example notebooks by [installing NVTabular](https://github.com/N
- Merlin-Tensorflow-Training (contains NVTabular with TensorFlow)
- Merlin-Pytorch-Training (contains NVTabular with PyTorch)
- Merlin-Training (contains NVTabular with HugeCTR)
- Merlin-Inference (contains NVTabular with TensorFlow and Triton Inference support)
- Merlin-Tensorflow-Inference (contains NVTabular with TensorFlow and Triton Inference support)

To run the example notebooks using Docker containers, do the following:

1. Pull the container by running the following command:
```
docker run --runtime=nvidia --rm -it -p 8888:8888 -p 8797:8787 -p 8796:8786 --ipc=host <docker container> /bin/bash
docker run --gpus all --rm -it -p 8888:8888 -p 8797:8787 -p 8796:8786 --ipc=host <docker container> /bin/bash
```

**NOTES**:

- If you are running on Docker version 19 and higher, change ```--runtime=nvidia``` to ```--gpus all```.
- If you are running the `Getting Started with MovieLens`, `Advanced Ops with Outbrain`, or `Tabular Problems with Rossmann` example notebooks, you need to add `-v ${PWD}:/root/` to the `docker run` command above. Here `PWD` is a local directory on your system, and the same directory should also be mounted to the `merlin-inference` container if you would like to run the inference example (see the sketch after these notes). Please follow the `start and launch triton server` instructions given in the inference notebooks.
- If you are running the `Training-with-HugeCTR` notebooks, please add `--cap-add SYS_NICE` to the `docker run` command to suppress the `set_mempolicy: Operation not permitted` warnings.
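
Putting the notes above together, a full command for one of these notebooks might look like the following sketch (the image name is a placeholder, as in step 1, and `--cap-add SYS_NICE` is only needed for the HugeCTR notebooks):

```
# Sketch: mounts the current directory and adds SYS_NICE to suppress set_mempolicy warnings (HugeCTR notebooks).
docker run --gpus all --rm -it --cap-add SYS_NICE \
  -p 8888:8888 -p 8797:8787 -p 8796:8786 --ipc=host \
  -v ${PWD}:/root/ \
  <docker container> /bin/bash
```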

@@ -80,7 +79,7 @@ To run the example notebooks using Docker containers, do the following:
root@2efa5b50b909:
```

2. Install jupyter-lab with `pip` by running the following command:
2. If jupyter-lab is not already installed, install it with `pip` by running the following command:
```
pip install jupyterlab
```
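
Afterwards, JupyterLab still has to be started inside the container; a typical invocation, assuming the 8888 port mapping from step 1, is the following sketch:

```
# Sketch: listen on all interfaces so the notebook is reachable through the mapped 8888 port.
jupyter lab --allow-root --ip='0.0.0.0' --no-browser --port 8888
```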
4 changes: 2 additions & 2 deletions (getting started TensorFlow inference notebook)
@@ -69,7 +69,7 @@
"metadata": {},
"source": [
"```\n",
"docker run -it --gpus device=0 -p 8000:8000 -p 8001:8001 -p 8002:8002 -v ${PWD}:/model/ nvcr.io/nvidia/merlin/merlin-inference:21.11\n",
"docker run -it --gpus device=0 -p 8000:8000 -p 8001:8001 -p 8002:8002 -v ${PWD}:/model/ nvcr.io/nvidia/merlin/merlin-tensorflow-inference:22.04\n",
"```\n"
]
},
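
Once the inference container is up, one way to confirm that Triton Inference Server is listening on the mapped HTTP port is a readiness probe such as the following sketch, which uses Triton's standard HTTP/REST health endpoint:

```
# Sketch: returns HTTP 200 when the server is ready to accept inference requests.
curl -i localhost:8000/v2/health/ready
```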
@@ -351,7 +351,7 @@
" model.savedmodel/\n",
" <saved-model files>\n",
"```\n",
"Let's check out our model repository layout. You can install tree library with apt-get install tree, and then run `!tree /model/models/` to print out the model repository layout as below:\n",
"Let's check out our model repository layout. You can install tree library with `apt-get install tree`, and then run `!tree /model/models/` to print out the model repository layout as below:\n",
" \n",
"```\n",
"/model/models/\n",
