
Publishing Machine Learning API with Python Flask


Flask is fun and easy to set up, as the Flask website says. And that's true. This microframework for Python offers a powerful way to annotate a Python function with a REST endpoint. I'm using Flask to publish an ML model API so that it is accessible to third-party business applications.
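
For instance (a minimal, standalone sketch, not part of the diabetes example below), a plain Python function becomes a REST endpoint just by decorating it with @app.route:

from flask import Flask, jsonify

app = Flask(__name__)

# A plain function exposed as a GET endpoint through the route decorator
@app.route("/ping", methods=['GET'])
def ping():
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(port=5000)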

This example is based on XGBoost.

For better code maintenance, I would recommend using a separate Jupyter notebook where the ML model API will be published. Import the Flask module along with Flask-CORS:

from flask import Flask, jsonify, request
from flask_cors import CORS, cross_origin
import pickle
import pandas as pd

The model is trained on the Pima Indians Diabetes Database. The CSV data can be downloaded from here. To construct the Pandas data frame variable used as input for the model predict function, we need to define an array of dataset columns:

# Get headers for payload
headers = ['times_pregnant', 'glucose', 'blood_pressure', 'skin_fold_thick', 'serum_insuling', 'mass_index', 'diabetes_pedigree', 'age']
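
The training step itself is outside the scope of this post. As a rough sketch (assuming the xgboost scikit-learn wrapper, a local pima-indians-diabetes.csv file with the columns above, and an 'outcome' label column; the file name and label column are assumptions, not from the original post), the diabetes-model.pkl file could be produced along these lines:

# Sketch only: CSV file name and 'outcome' column are assumptions
import pickle
import pandas as pd
from xgboost import XGBClassifier

data = pd.read_csv('pima-indians-diabetes.csv', names=headers + ['outcome'])
X = data[headers]
y = data['outcome']

model = XGBClassifier()
model.fit(X, y)

with open('diabetes-model.pkl', 'wb') as f:
    pickle.dump(model, f)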

The previously trained and saved model is loaded using Pickle:

# Use pickle to load in the pre-trained model
with open('diabetes-model.pkl', 'rb') as f:
    model = pickle.load(f)

It is always good practice to do a test run and check whether the model performs well. Construct a data frame with the array of column names and an array of data (using new data that is not present in the train or test datasets), and call two functions: model.predict and model.predict_proba. I often prefer model.predict_proba, as it returns a probability describing how likely the result is to be 0 or 1, which helps interpret the outcome against a certain range (0.25 to 0.75, for example). A Pandas data frame is constructed with a sample payload and then the model prediction is executed:

# Test model with data frame
input_variables = pd.DataFrame([[1, 106, 70, 28, 135, 34.2, 0.142, 22]],
                                columns=headers, 
                                dtype=float,
                                index=['input'])
# Get the model's prediction
prediction = model.predict(input_variables)
print("Prediction: ", prediction)
prediction_proba = model.predict_proba(input_variables)
print("Probabilities: ", prediction_proba)

Now the Flask API itself. Make sure CORS is enabled, otherwise the API call will not work from another host. Write the annotation before the function you want to expose through the REST API, providing an endpoint name and the supported REST methods (POST in this example). The payload data is retrieved from the request, a Pandas data frame is constructed, and the model predict_proba function is executed:

app = Flask(__name__)
CORS(app)

@app.route("/katana-ml/api/v1.0/diabetes", methods=['POST'])
def predict():
    # Parse the comma-separated values from the JSON payload
    payload = request.json['data']
    values = [float(i) for i in payload.split(',')]

    input_variables = pd.DataFrame([values],
                                   columns=headers,
                                   dtype=float,
                                   index=['input'])
    # Get the model's prediction (probability of class 1)
    prediction_proba = model.predict_proba(input_variables)
    prediction = prediction_proba[0][1]

    ret = '{"prediction":' + str(float(prediction)) + '}'

    return ret

# running REST interface, port=5000 for direct test
if __name__ == "__main__":
    app.run(debug=False, host='0.0.0.0', port=5000)

The response JSON string is constructed and returned as the function result. I'm running Flask in a Docker container, which is why 0.0.0.0 is used as the host. Port 5000 is mapped as an external port, allowing calls from outside the container.

While it works to start the Flask interface directly in the Jupyter notebook, I would recommend converting it to a Python script and running it from the command line as a service. Use the Jupyter nbconvert command to convert the notebook to a Python script:

jupyter nbconvert --to python diabetes_redsamurai_endpoint_db.ipynb

The Python script with the Flask endpoint can be started as a background process with the PM2 process manager. This allows the endpoint to run as a service, alongside other processes on different ports. PM2 start command:

pm2 start diabetes_redsamurai_endpoint_db.py


pm2 monit helps to display info about running processes:


The ML model classification REST API can be called from Postman through the endpoint served by Flask.
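
The same call can also be reproduced from Python with the requests library; a minimal sketch, assuming the service is running locally on port 5000 and using the comma-separated payload format expected by the endpoint:

# Client-side sketch: assumes the Flask service is running locally on port 5000
import requests

url = 'http://localhost:5000/katana-ml/api/v1.0/diabetes'
payload = {'data': '1,106,70,28,135,34.2,0.142,22'}

response = requests.post(url, json=payload)
print(response.text)  # JSON string such as {"prediction": <probability>}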

