
Commit

Add simple model serving (#540)
tqtg authored Nov 4, 2023
1 parent fbc4b57 commit e792f22
Showing 2 changed files with 167 additions and 6 deletions.
45 changes: 39 additions & 6 deletions README.md
@@ -79,11 +79,10 @@ ml_100k = cornac.datasets.movielens.load_feedback()
rs = RatioSplit(data=ml_100k, test_size=0.2, rating_threshold=4.0, seed=123)

# initialize models, here we are comparing: Biased MF, PMF, and BPR
models = [
MF(k=10, max_iter=25, learning_rate=0.01, lambda_reg=0.02, use_bias=True, seed=123),
PMF(k=10, max_iter=100, learning_rate=0.001, lambda_reg=0.001, seed=123),
BPR(k=10, max_iter=200, learning_rate=0.001, lambda_reg=0.01, seed=123),
]
mf = MF(k=10, max_iter=25, learning_rate=0.01, lambda_reg=0.02, use_bias=True, seed=123)
pmf = PMF(k=10, max_iter=100, learning_rate=0.001, lambda_reg=0.001, seed=123)
bpr = BPR(k=10, max_iter=200, learning_rate=0.001, lambda_reg=0.01, seed=123)
models = [mf, pmf, bpr]

# define metrics to evaluate the models
metrics = [MAE(), RMSE(), Precision(k=10), Recall(k=10), NDCG(k=10), AUC(), MAP()]
@@ -104,6 +103,40 @@ cornac.Experiment(eval_method=rs, models=models, metrics=metrics, user_based=Tru
For more details, please take a look at our [examples](examples) as well as [tutorials](tutorials). For learning purposes, this list of [tutorials on recommender systems](https://github.com/PreferredAI/tutorials/tree/master/recommender-systems) will be more organized and comprehensive.


## Simple model serving

Here, we provide a simple way to serve a Cornac model by launching a standalone web service. While this will not be an optimized service for model deployment in production, it is quite handy for testing or creating a demo application. Suppose we use the trained BPR model from the previous example; we first need to save it:
```python
bpr.save("save_dir")
```
The model can then be deployed easily by launching Cornac's serving module:
```bash
$ python -m cornac.serving --model_dir save_dir/BPR --model_class cornac.models.BPR

# Serving BPR at port 8080
```
There we go, our model service is now ready. Let's get top-5 item recommendations for the user `"63"`:
```bash
$ curl -X GET "http://127.0.0.1:8080/recommend?uid=63&k=5&remove_seen=false"

# Response: {"recommendations": ["50", "181", "100", "258", "286"], "query": {"uid": "63", "k": 5, "remove_seen": false}}
```
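The same request can be made from Python. A minimal sketch, where `recommend_url` is a hypothetical helper (not part of Cornac) that builds the query string shown in the `curl` call above, and the sample JSON mirrors the service's response format:

```python
import json
from urllib.parse import urlencode

def recommend_url(uid, k=5, remove_seen=False, host="http://127.0.0.1:8080"):
    """Build the /recommend URL for the serving endpoint (host/port assumed
    to match the defaults used by cornac.serving)."""
    query = urlencode({"uid": uid, "k": k, "remove_seen": str(remove_seen).lower()})
    return f"{host}/recommend?{query}"

url = recommend_url("63", k=5)

# The service replies with JSON shaped like the response above;
# parse out the recommended item ids:
sample = '{"recommendations": ["50", "181"], "query": {"uid": "63", "k": 5, "remove_seen": false}}'
items = json.loads(sample)["recommendations"]
```

In a live setting, `urllib.request.urlopen(url)` (or any HTTP client) would fetch the real response instead of the sample string.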
If we want to remove items already seen during training, we need to provide the `train_set` when starting the serving service.
```bash
$ python -m cornac.serving --help

usage: serving.py [-h] --model_dir MODEL_DIR [--model_class MODEL_CLASS] [--train_set TRAIN_SET] [--port PORT]

Cornac model serving

options:
-h, --help show this help message and exit
--model_dir MODEL_DIR path to directory where the model was saved
--model_class MODEL_CLASS class of the model being deployed
--train_set TRAIN_SET path to pickled file of the train_set (to remove seen items)
--port PORT service port
```
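The `--train_set` argument expects a pickled file, which `cornac.serving` loads back with `pickle.load`. A minimal sketch of producing such a file, using a plain dict as a stand-in since the real object would be `rs.train_set` from the experiment above:

```python
import pickle

# Stand-in for rs.train_set from the experiment above; any picklable
# object is written to disk the same way.
train_set = {"uir_tuples": [("63", "50", 4.0)]}

# Write the pickle file to pass via --train_set
with open("train_set.pkl", "wb") as f:
    pickle.dump(train_set, f)

# The serving module loads it back the same way at startup
with open("train_set.pkl", "rb") as f:
    restored = pickle.load(f)
```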

## Models

The recommender models supported by Cornac are listed below. Why don't you join us to lengthen the list?
@@ -117,7 +150,7 @@ The recommender models supported by Cornac are listed below. Why don't you join
| | [Hybrid neural recommendation with joint deep representation learning of ratings and reviews (HRDR)](cornac/models/hrdr), [paper](https://www.sciencedirect.com/science/article/abs/pii/S0925231219313207) | [requirements.txt](cornac/models/hrdr/requirements.txt) | [hrdr_example.py](examples/hrdr_example.py)
| | [LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation](cornac/models/lightgcn), [paper](https://arxiv.org/pdf/2002.02126.pdf) | [requirements.txt](cornac/models/lightgcn/requirements.txt) | [lightgcn_example.py](examples/lightgcn_example.py)
| 2019 | [Embarrassingly Shallow Autoencoders for Sparse Data (EASEᴿ)](cornac/models/ease), [paper](https://arxiv.org/pdf/1905.03375.pdf) | N/A | [ease_movielens.py](examples/ease_movielens.py)
| | [Neural Graph Collaborative Filtering](cornac/models/ngcf), [paper](https://arxiv.org/pdf/1905.08108.pdf) | [requirements.txt](cornac/models/ngcf/requirements.txt) | [ngcf_example.py](examples/ngcf_example.py)
| | [Neural Graph Collaborative Filtering (NGCF)](cornac/models/ngcf), [paper](https://arxiv.org/pdf/1905.08108.pdf) | [requirements.txt](cornac/models/ngcf/requirements.txt) | [ngcf_example.py](examples/ngcf_example.py)
| 2018 | [Collaborative Context Poisson Factorization (C2PF)](cornac/models/c2pf), [paper](https://www.ijcai.org/proceedings/2018/0370.pdf) | N/A | [c2pf_exp.py](examples/c2pf_example.py)
| | [Graph Convolutional Matrix Completion (GCMC)](cornac/models/gcmc), [paper](https://www.kdd.org/kdd2018/files/deep-learning-day/DLDay18_paper_32.pdf) | [requirements.txt](cornac/models/gcmc/requirements.txt) | [gcmc_example.py](examples/gcmc_example.py)
| | [Multi-Task Explainable Recommendation (MTER)](cornac/models/mter), [paper](https://arxiv.org/pdf/1806.03568.pdf) | N/A | [mter_exp.py](examples/mter_example.py)
128 changes: 128 additions & 0 deletions cornac/serving.py
@@ -0,0 +1,128 @@
# Copyright 2018 The Cornac Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================

"""CLI entry point for model serving.
"""

import argparse
import sys
import json
import pickle
import http.server
import socketserver

from urllib.parse import urlparse, parse_qs


class ModelRequestHandler(http.server.BaseHTTPRequestHandler):
def _set_response(self, status_code=200, content_type="application/json"):
self.send_response(status_code)
self.send_header("Content-type", content_type)
self.end_headers()

def do_GET(self):
if self.path == "/":
self._set_response()
response_data = {"message": "Cornac model serving."}
self.wfile.write(json.dumps(response_data).encode())
elif self.path.startswith("/recommend"):
parsed_query = parse_qs(urlparse(self.path).query)

# TODO: input validation
user_id = str(parsed_query["uid"][0])
k = -1 if "k" not in parsed_query else int(parsed_query["k"][0])
remove_seen = (
False
if "remove_seen" not in parsed_query
else parsed_query["remove_seen"][0].lower() == "true"
)

response_data = {
"recommendations": self.server.model.recommend(
user_id=user_id,
k=k,
remove_seen=remove_seen,
train_set=self.server.train_set,
),
"query": {"uid": user_id, "k": k, "remove_seen": remove_seen},
}

self._set_response()
self.wfile.write(json.dumps(response_data).encode())
else:
self.send_error(404, "Endpoint not found")


def import_model_class(model_class):
components = model_class.split(".")
mod = __import__(".".join(components[:-1]), fromlist=[components[-1]])
klass = getattr(mod, components[-1])
return klass


def parse_args():
parser = argparse.ArgumentParser(description="Cornac model serving")
parser.add_argument(
"--model_dir",
type=str,
required=True,
help="path to directory where the model was saved",
)
parser.add_argument(
"--model_class",
type=str,
default="cornac.models.Recommender",
help="class of the model being deployed",
)
parser.add_argument(
"--train_set",
type=str,
default=None,
help="path to pickled file of the train_set (to remove seen items)",
)
parser.add_argument(
"--port",
type=int,
default=8080,
help="service port",
)

return parser.parse_args(sys.argv[1:])


def main():
args = parse_args()

    # Create the server, then attach the loaded model (and train_set, if provided)
httpd = socketserver.TCPServer(("", args.port), ModelRequestHandler)
httpd.model = import_model_class(args.model_class).load(args.model_dir)
httpd.train_set = None
if args.train_set is not None:
with open(args.train_set, "rb") as f:
httpd.train_set = pickle.load(f)

# Start service
try:
print(f"Serving {httpd.model.name} at port {args.port}")
httpd.serve_forever()
except KeyboardInterrupt:
pass

httpd.server_close()
print("Server stopped.")


if __name__ == "__main__":
main()
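As an aside, the query handling in `do_GET` above defaults `k` to -1 (meaning all items) and `remove_seen` to false when the parameters are absent. That defaulting logic can be sketched standalone; `parse_recommend_query` is a hypothetical helper mirroring the handler, not part of Cornac:

```python
from urllib.parse import urlparse, parse_qs

def parse_recommend_query(path):
    """Mirror the parameter defaulting done in ModelRequestHandler.do_GET."""
    q = parse_qs(urlparse(path).query)
    uid = str(q["uid"][0])  # required, as in the handler
    k = int(q["k"][0]) if "k" in q else -1
    remove_seen = q["remove_seen"][0].lower() == "true" if "remove_seen" in q else False
    return uid, k, remove_seen
```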
