How to create a Python Text Classifier as a Cloud Function?

Previous reading: https://onesaitplatform.atlassian.net/wiki/spaces/DOCT/pages/2283668726

 

In this post, we are going to see an example of how to migrate a Flask application to a Cloud Function.

This is a text classifier with a series of pre-trained models that exposes a REST endpoint to return a classification for a given text.

Code structure

The structure of the code is as follows:

The entry point is contained in the application.py file, where the Flask application is launched and the “/predict” endpoint is declared.

#!/usr/bin/env python
# coding: utf-8

# # Imports
from flask import Flask, request, Response
import numpy as np
import pandas as pd
import pickle
import json
import base64

# # Load models
from model import load_model, predict

primario_model, primario_vectorizer, primario_encoder = load_model()

# Instantiate a flask application
app = Flask(__name__)

# Define the paths where we are going to work
APP_PATH = app.root_path


@app.route('/predict', methods=['POST'])
def predict_post():
    if request.method == 'POST':
        result = {}

        request_data = request.data.decode('latin1')
        request_data = json.loads(request_data, strict=False)

        content_str = request_data.get('text', '')
        base64_content = request_data.get('base64', False)
        if base64_content:
            content_str = base64.b64decode(content_str)

        if content_str:
            result = predict(content_str, primario_model, primario_vectorizer, primario_encoder)

        return Response(json.dumps(result), mimetype='application/json')


# Main function that starts the service
if __name__ == '__main__':
    app.run(host='localhost', port=8000, debug=False)

This is the code we will have to adapt in order to deploy the Flask application as a function. The rest of the pickles and components of the application will be packaged inside the function, so they will remain accessible from the entry point without any modification.
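The model helpers themselves are not reproduced in this guide. As a purely hypothetical sketch of what model.py could look like (the pickle file names, the folder layout and the scikit-learn-style objects are all assumptions, only the function names come from the imports above), it would be something along these lines:

import pickle


def load_model():
    # Hypothetical paths: the pickles are packaged next to the code,
    # so relative paths keep working inside the function
    with open('models/primario_model.pkl', 'rb') as f:
        model = pickle.load(f)
    with open('models/primario_vectorizer.pkl', 'rb') as f:
        vectorizer = pickle.load(f)
    with open('models/primario_encoder.pkl', 'rb') as f:
        encoder = pickle.load(f)
    return model, vectorizer, encoder


def predict(text, model, vectorizer, encoder):
    # Vectorize the raw text, predict the class and map the index back to its label
    features = vectorizer.transform([text])
    class_index = model.predict(features)[0]
    label = encoder.inverse_transform([class_index])[0]
    return {'prediction': str(label)}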

Adaptation of the main

Following the official documentation of the repository, we will adapt the entry point so it can be deployed as a function.

We will need to create a new file, “func.py” (the name doesn't matter, but we will have to reference it in the Dockerfile later).

This file will be almost identical to "application.py", except that we will no longer use Flask, so those dependencies can be removed. Instead, we will read the input data through a "handler" method that the Python FDK invokes:

def handler(ctx, data: io.BytesIO = None):

In the latest versions of the FDK, data is passed as a byte stream (io.BytesIO) instead of a string, which gives more flexibility when decoding the input content (e.g. non-UTF-8 encodings).

This method will contain the body of the original predict_post() method, adapted to read the input from the “data” argument, and it will return the response through the FDK API instead of the Flask one.

# # Imports
import io
import json
import logging
import numpy as np
import pandas as pd
import pickle
import base64

from fdk import response

# # Load models
from model import load_model, predict

primario_model, primario_vectorizer, primario_encoder = load_model()


def handler(ctx, data: io.BytesIO = None):
    result = {}

    byte_str = data.read()
    request_data = byte_str.decode('latin1')
    request_data = json.loads(request_data, strict=False)

    content_str = request_data.get('text', '')
    base64_content = request_data.get('base64', False)
    if base64_content:
        content_str = base64.b64decode(content_str)

    if content_str:
        result = predict(content_str, primario_model, primario_vectorizer, primario_encoder)

    return response.Response(
        ctx,
        response_data=json.dumps(result),
        headers={"Content-Type": "application/json"}
    )

Yaml of the function

The next step is to create a YAML descriptor (func.yaml) for Fn to process when building and deploying the function.

In this YAML file we must also define the endpoint through which the function will be invoked.

Depending on the (estimated) memory required by the models, we may want to change the “memory” limit.
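The descriptor itself is shown as a screenshot in the original page. A minimal sketch of what such a func.yaml could look like is shown below; the schema version, names, memory value and trigger path are placeholders taken from the standard Fn Project examples, not the actual values of this classifier:

schema_version: 20180708
name: text-classifier
version: 0.0.1
runtime: docker            # built from our own Dockerfile instead of the default python runtime
memory: 1024               # adjust to the estimated memory required by the models
triggers:
  - name: predict
    type: http
    source: /predict       # endpoint used to invoke the function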

Dockerfile of the function

Lastly, we will need to define a Dockerfile to deploy the function. A generic version of this Dockerfile is provided in the Fn documentation.

We can modify the generic Dockerfile to choose the Python version we want, as well as to install additional dependencies or custom libraries.

In this case, we have used version 3.7, which was compatible with the declared versions of the dependencies.

As we can see, the Python dependencies are installed from the requirements.txt file, so it is important that this file includes the fdk dependency.
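As an illustration only (the exact libraries and versions are not listed in this guide, so treat them as assumptions), the requirements.txt could look like this:

# fdk must be present so the handler can be served inside the function
fdk
numpy
pandas
scikit-learn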

You can also see that the only extra line added to the generic Dockerfile is the one marked with a comment (see the sketch below):

This is because this library needs to be installed first before the others.
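The Dockerfile is shown as an image in the original page; the sketch below follows the generic multi-stage Dockerfile from the Fn Project Python examples, with a placeholder for the library that has to be installed beforehand (its real name is not reproduced here):

FROM fnproject/python:3.7-dev as build-stage
WORKDIR /function
ADD requirements.txt /function/

# Extra line: this (placeholder) library must be installed before the rest of the requirements
RUN pip3 install --target /python/ --no-cache --no-cache-dir <library-to-install-first>

RUN pip3 install --target /python/ --no-cache --no-cache-dir -r requirements.txt && \
    rm -fr ~/.cache/pip /tmp* requirements.txt func.yaml Dockerfile .venv
ADD . /function/

FROM fnproject/python:3.7
WORKDIR /function
COPY --from=build-stage /python /python
COPY --from=build-stage /function /function
ENV PYTHONPATH=/function:/python
ENTRYPOINT ["/python/bin/fdk", "/function/func.py", "handler"]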

Create the function in platform

Git repository

In order to create the function and deploy it to the platform, we will need to upload the code to a Git repository.

Create the application

With the Git code uploaded, we will create the serverless application in the platform (review the previous guide for these steps).

Create the function

Finally, we will create the function indicating the name and the path to “func.yaml”, taking into account that the structure of the repository is as follows:
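The repository layout is shown as a screenshot in the original page; an illustrative layout consistent with the files discussed above (folder and repository names are assumptions, the rest of the files have already been mentioned) would be:

text-classifier/
├── func.py              # FDK handler (entry point of the function)
├── func.yaml            # function descriptor
├── Dockerfile
├── requirements.txt     # must include fdk
├── application.py       # original Flask entry point (no longer used by the function)
├── model.py             # load_model / predict helpers
└── models/              # pre-trained pickles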

Deploy the function

Once the function is defined within the application, we will deploy it.

Test the function

Finally, we will test the function deployed through the defined endpoint:
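As a reference, assuming the function is exposed under the /predict path defined earlier (the host is a placeholder for your serverless environment), an invocation could look like this:

# Hypothetical call; replace the host with the one of your serverless application
curl -X POST "https://<serverless-host>/predict" \
     -H "Content-Type: application/json" \
     -d '{"text": "Example text to classify", "base64": false}'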

The first time the function is called, it will take a long time, as in the screenshot above (4.52 seconds), because the function instance has to start up first (a cold start), but subsequent calls will be very fast: