Moving systems examples to Merlin (#1034)
* moving systems examples to Merlin
* serve ranking models
* move tests and update ranking_serving test
* update nb
* update paths and rename nb
* fix typo
* move traditional-ml out of ranking
Showing 7 changed files with 3,011 additions and 0 deletions.
@@ -0,0 +1,47 @@
# Training and Deploying Ranking models with Merlin

Ranking models are probably the most common use case in recommender systems. The examples in this folder demonstrate how to build, train, and evaluate a ranking model (for example, DLRM) with Merlin Models and how to deploy it on [Triton Inference Server](https://github.com/triton-inference-server/server) with Merlin Systems. Currently we support models built with the TensorFlow framework, traditional ML models such as XGBoost, and Python-based models trained on implicit datasets. Examples built with the PyTorch framework are under development and will be added here soon.

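The build–train–evaluate flow these examples walk through looks roughly like the sketch below. This is an illustrative outline only, not the notebook code: the parquet paths and the `click` target column are assumptions, and the exact `merlin.models.tf` API can vary slightly between releases.

```python
import merlin.models.tf as mm
from merlin.io import Dataset

# Load NVTabular-preprocessed data (paths are illustrative placeholders).
train = Dataset("/workspace/data/train/*.parquet")
valid = Dataset("/workspace/data/valid/*.parquet")

# Build a DLRM ranking model from the dataset schema;
# "click" is an assumed binary target column.
model = mm.DLRMModel(
    train.schema,
    embedding_dim=64,
    bottom_block=mm.MLPBlock([128, 64]),
    top_block=mm.MLPBlock([128, 64, 32]),
    prediction_tasks=mm.BinaryClassificationTask("click"),
)

model.compile(optimizer="adam", run_eagerly=False)
model.fit(train, validation_data=valid, batch_size=1024)
metrics = model.evaluate(valid, batch_size=1024, return_dict=True)
```

The notebooks additionally cover exporting the trained model for serving, which is the part handled by Merlin Systems.
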
To learn more about ranking models, please visit this documentation [page](https://nvidia-merlin.github.io/Merlin/stable/guide/recommender_models.html#).

## Running the Example Notebooks

Docker containers are available from the NVIDIA GPU Cloud.
We use the latest stable version of the [merlin-tensorflow](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow/tags) container to run the example notebooks. To run the example notebooks using Docker containers, perform the following steps:

1. Pull and start the container by running the following command:

   ```shell
   docker run --gpus all --rm -it \
     -p 8888:8888 -p 8797:8787 -p 8796:8786 --ipc=host \
     nvcr.io/nvidia/merlin/merlin-tensorflow:23.XX /bin/bash
   ```

   > You can find the release tags and more information on the [merlin-tensorflow](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow) container page.

   The container opens a shell when the run command execution is completed.
   Your shell prompt should look similar to the following example:

   ```shell
   root@2efa5b50b909:
   ```

2. Start the JupyterLab server by running the following command:

   ```shell
   jupyter-lab --allow-root --ip='0.0.0.0'
   ```

   View the messages in your terminal to identify the URL for JupyterLab.
   The messages in your terminal show lines similar to the following example:

   ```shell
   Or copy and paste one of these URLs:
      http://2efa5b50b909:8888/lab?token=9b537d1fda9e4e9cadc673ba2a472e247deee69a6229ff8d
    or http://127.0.0.1:8888/lab?token=9b537d1fda9e4e9cadc673ba2a472e247deee69a6229ff8d
   ```

3. Open a browser and use the `127.0.0.1` URL provided in the messages by JupyterLab.

4. After you log in to JupyterLab, navigate to the `/Merlin/examples/ranking` directory to try out the example notebooks.
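
For the deployment half mentioned in the introduction (serving a trained ranking model on Triton Inference Server with Merlin Systems), the notebooks generally follow a pattern like the sketch below. The file paths are placeholders and the exact operator wiring in the notebooks may differ; treat this as an assumption-laden outline rather than the notebook code.

```python
import tensorflow as tf
from nvtabular.workflow import Workflow
from merlin.systems.dag.ensemble import Ensemble
from merlin.systems.dag.ops.tensorflow import PredictTensorflow
from merlin.systems.dag.ops.workflow import TransformWorkflow

# Load the fitted NVTabular workflow and the trained ranking model
# (both paths are illustrative placeholders).
workflow = Workflow.load("/workspace/data/workflow")
model = tf.keras.models.load_model("/workspace/data/dlrm")

# Chain feature transformation and model inference into one serving graph.
serving_ops = (
    workflow.input_schema.column_names
    >> TransformWorkflow(workflow)
    >> PredictTensorflow(model)
)

# Export the graph as a Triton Inference Server ensemble, which can then be
# loaded with `tritonserver --model-repository=/workspace/ensemble`.
ensemble = Ensemble(serving_ops, workflow.input_schema)
ensemble.export("/workspace/ensemble")
```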