diff --git a/README.md b/README.md
index 75320936f..11cb37f74 100644
--- a/README.md
+++ b/README.md
@@ -100,39 +100,28 @@ cornac.Experiment(eval_method=rs, models=models, metrics=metrics, user_based=Tru
 For more details, please take a look at our [examples](examples) as well as [tutorials](tutorials). For learning purposes, this list of [tutorials on recommender systems](https://github.com/PreferredAI/tutorials/tree/master/recommender-systems) will be more organized and comprehensive.
 
-## Simple model serving
+## Model serving
 
-Here, we provide a simple way to serve a Cornac model by launching a standalone web service. While this will not be an optimized service for model deployment in production, it is quite handy for testing or creating a demo application. Supposed that we use the trained BPR from previous example, we first need to save the model:
+Here, we provide a simple way to serve a Cornac model by launching a standalone web service with Flask. It is quite handy for testing or creating a demo application. Suppose we use the trained BPR model from the previous example; we first need to save it:
 
 ```python
 bpr.save("save_dir")
 ```
 
 The model can be deployed easily by triggering Cornac serving module:
 
 ```bash
-$ python -m cornac.serving --model_dir save_dir/BPR --model_class cornac.models.BPR
+$ FLASK_APP='cornac.serving.app' \
+  MODEL_DIR='save_dir/BPR' \
+  MODEL_CLASS='cornac.models.BPR' \
+  flask run --host localhost --port 8080
 
-# Serving BPR at port 8080
+# Running on http://localhost:8080
 ```
 
 Here we go, our model service is now ready. Let's get `top-5` item recommendations for the user `"63"`:
 
 ```bash
-$ curl -X GET "http://127.0.0.1:8080/recommend?uid=63&k=5&remove_seen=false"
+$ curl -X GET "http://localhost:8080/recommend?uid=63&k=5&remove_seen=false"
 
 # Response: {"recommendations": ["50", "181", "100", "258", "286"], "query": {"uid": "63", "k": 5, "remove_seen": false}}
 ```
 
-If we want to remove seen items during training, we need to provide `train_set` when starting the serving service.
-```bash
-$ python -m cornac.serving --help
-
-usage: serving.py [-h] --model_dir MODEL_DIR [--model_class MODEL_CLASS] [--train_set TRAIN_SET] [--port PORT]
-
-Cornac model serving
-
-options:
-  -h, --help            show this help message and exit
-  --model_dir MODEL_DIR
-                        path to directory where the model was saved
-  --model_class MODEL_CLASS
-                        class of the model being deployed
-  --train_set TRAIN_SET
-                        path to pickled file of the train_set (to remove seen items)
-  --port PORT           service port
-```
+If we want to remove items seen during training, we need to provide `TRAIN_SET` when starting the serving app. We can also leverage a [WSGI server](https://flask.palletsprojects.com/en/3.0.x/deploying/) for model deployment in production.
 
 ## Efficient retrieval with ANN search
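As a usage note for the `/recommend` endpoint served above: the curl query can also be issued from Python. A minimal sketch using only the standard library — the host, port, and query parameters mirror the README's curl example, and the `urlopen` call is left commented because it assumes the Flask serving app is already running locally:

```python
from urllib.parse import urlencode

# Query parameters mirroring the curl example; the serving app is
# assumed (not guaranteed) to be listening at localhost:8080.
params = {"uid": "63", "k": 5, "remove_seen": "false"}
url = "http://localhost:8080/recommend?" + urlencode(params)
print(url)  # -> http://localhost:8080/recommend?uid=63&k=5&remove_seen=false

# With the server running, the request could then be issued as:
# import json
# from urllib.request import urlopen
# with urlopen(url) as resp:
#     print(json.load(resp)["recommendations"])
```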