A simple example of a Python web service for real-time machine learning model deployment. It is based on this post.
This includes Docker integration and SHAP explanations for the deployed model.
- docker
- docker-compose (Recommended)
Make sure that you have a model in the main directory. To create a quick classification model for the example, run:
$ python example/build_linear_binary.py
or one of the other scripts in the example folder.
- variables.env: Controls API parameters via environment variables
- requirements.txt: Controls Python packages installed inside the container
- model.joblib: Model saved inside a dictionary with the following format:
  {
      "model": trained_model,
      "metadata": {
          "features": [
              {"name": "feature1", "type": "numeric"},
              {"name": "feature2", "type": "numeric", "default": -1},
              {"name": "feature3", "type": "numeric"}
          ]
      }
  }
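As a rough illustration of how such a file can be produced (a minimal sketch assuming scikit-learn and synthetic data, not the actual example/build_linear_binary.py script):

import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy data with three numeric features, purely for illustration.
X = np.random.rand(100, 3)
y = (X[:, 0] + X[:, 2] > 1).astype(int)

model = RandomForestClassifier(n_estimators=10).fit(X, y)

# Save the trained model together with its feature metadata
# in the dictionary format expected by the service.
joblib.dump(
    {
        "model": model,
        "metadata": {
            "features": [
                {"name": "feature1", "type": "numeric"},
                {"name": "feature2", "type": "numeric", "default": -1},
                {"name": "feature3", "type": "numeric"},
            ]
        },
    },
    "model.joblib",
)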
Build the image (this has to be done every time the code or the model changes)
$ docker-compose build
Create and run the container
$ docker-compose up
The production setup uses uWSGI and nginx.
Build the image (this has to be done every time the code or the model changes)
$ docker-compose -f docker-compose-production.yml build
Create and run the container
$ docker-compose -f docker-compose-production.yml up
Create the environment
$ conda create -n flask_ml_template python=3
$ conda activate flask_ml_template
Install requirements
$ pip install -r ./requirements-service.txt
$ pip install -r ./requirements.txt
Run the API service
$ python service.py
These examples assume that the API was launched with the default parameters (localhost, port 5000) and that it is serving the example model.
Endpoint: /health
$ curl -X GET http://localhost:5000/health
up
Endpoint: /ready
$ curl -X GET http://localhost:5000/ready
ready
Endpoint: /service-info
$ curl -X GET http://localhost:5000/service-info
{
"debug": true,
"running-since": 1563355369.6482198,
"serving-model": "model.joblib",
"version-template": "1.2.0"
}
Endpoint: /info
$ curl -X GET http://localhost:5000/info
{
"metadata": {
"features": [
{
"default": -1,
"importance": 0.2,
"name": "feature1",
"type": "numeric"
},
{
"default": -1,
"importance": 0.1,
"name": "feature2",
"type": "numeric"
},
{
"default": -1,
"importance": 0.3,
"name": "feature3",
"type": "numeric"
}
]
},
"model": {
"class": "<class 'sklearn.ensemble.forest.RandomForestClassifier'>",
"cls_name": "RandomForestClassifier",
"cls_type": "<class 'sklearn.ensemble.forest.RandomForestClassifier'>",
"is_explainable": false
}
}
Endpoint: /predict
$ curl -d '[{"feature1": 1, "feature2": 1, "feature3": 2}, {"feature1": 1, "feature2": 1, "feature3": 2}]' -H "Content-Type: application/json" -X POST http://localhost:5000/predict
{
"prediction": [0, 0]
}
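The same call can also be made programmatically. A minimal sketch using the requests library (assumed to be installed; it is not part of the service requirements):

import requests

# Two rows with the three features expected by the example model.
rows = [
    {"feature1": 1, "feature2": 1, "feature3": 2},
    {"feature1": 1, "feature2": 1, "feature3": 2},
]

response = requests.post("http://localhost:5000/predict", json=rows)
print(response.json())  # e.g. {"prediction": [0, 0]}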
Endpoint: /predict?proba=1 or /predict_proba
$ curl -d '{"feature1": 1, "feature2": 1, "feature3": 2}' -H "Content-Type: application/json" -X POST "http://localhost:5000/predict?proba=1"
{
"prediction": [{
"0": 0.8,
"1": 0.2
}]
}
Endpoint: /features
$ curl -X GET "http://localhost:5000/features"
[
{
"default": -1,
"importance": 0.2,
"name": "feature1",
"type": "numeric"
},
{
"default": -1,
"importance": 0.1,
"name": "feature2",
"type": "numeric"
},
{
"default": -1,
"importance": 0.3,
"name": "feature3",
"type": "numeric"
}
]
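The feature metadata can be used to build a valid prediction payload, filling each feature with its declared default value. A minimal sketch, again assuming the requests library:

import requests

# Fetch the feature metadata and build a row from the default values.
features = requests.get("http://localhost:5000/features").json()
row = {feature["name"]: feature.get("default") for feature in features}
print(row)  # e.g. {'feature1': -1, 'feature2': -1, 'feature3': -1}

prediction = requests.post("http://localhost:5000/predict", json=[row]).json()
print(prediction)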
Endpoint: /predict?proba=1&explain=1 or /explain
$ curl -d '{"feature1": 1, "feature2": 1, "feature3": 2}' -H "Content-Type: application/json" -X POST "http://localhost:5000/predict?proba=1&explain=1"
{
"explanation": {
"feature1": 0.10000000149011613,
"feature2": 0.03333333383003871,
"feature3": -0.1666666691501935
},
"prediction": [{
"0": 0.7,
"1": 0.3
}]
}
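As with the other endpoints, the explanation can be consumed programmatically. A minimal sketch assuming the requests library, ranking features by the magnitude of their SHAP contribution:

import requests

row = {"feature1": 1, "feature2": 1, "feature3": 2}
response = requests.post(
    "http://localhost:5000/predict",
    params={"proba": 1, "explain": 1},
    json=row,
).json()

print(response["prediction"])

# Rank features by the absolute size of their contribution.
ranked = sorted(response["explanation"].items(), key=lambda kv: abs(kv[1]), reverse=True)
for name, contribution in ranked:
    print(f"{name}: {contribution:+.4f}")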