(Client APIs) Installing and using the Python client


New Release 1.4

On April 13, 2021, version 1.4 of the Python client library was released on PyPI.


This tutorial shows the installation of the Python REST client for OnesaitPlatform, as well as some use cases.


This Platform client library contains several classes that implement the functionalities of different platform clients:

  • Iot-Broker Client: Queries and insertions of ontology data.
  • Api-Manager client: Management of REST APIs and calls.
  • File-Manager Client: Upload and download of binary files.
  • MQTT Client: Queries and insertions of ontology data using MQTT protocol.
  • BaseModelService: Management of Machine/Deep Learning models.

With these clients, simple communication with the Platform can be established from any Python environment.

Library installation from PyPI

The easiest way to install the library is from the PyPI repository using pip. To install it, just run the command:

>> pip install onesaitplatform-client-services

It is recommended to use the --upgrade flag for all installations, so that the dependencies are updated in case they are already installed:

>> pip install --upgrade onesaitplatform-client-services

Client use examples

Sample notebooks can be found in the examples folder of the library installation (/examples/*.ipynb).

NOTE (very important): for every IoTBrokerClient, always call join() at the beginning to open the connection and leave() at the end to close it.

Digital Client

Create Client

To create a Digital Client, a connection configuration is required. We will next explain how to properly set up the parameters.

If the client connects from within the Platform's network (from platform notebooks or internal microservices), then the configuration to connect to the Digital Broker (Iot Broker) is as follows:

from onesaitplatform.iotbroker import DigitalClient
HOST = "iotbrokerservice"
PORT = 19000
IOT_CLIENT = <digital client name>
IOT_CLIENT_TOKEN = <digital client token>
PROTOCOL = 'http'
AVOID_SSL_CERTIFICATE = True

client = DigitalClient(host=HOST, port=PORT, iot_client=IOT_CLIENT, iot_client_token=IOT_CLIENT_TOKEN)
client.protocol = PROTOCOL
client.avoid_ssl_certificate = AVOID_SSL_CERTIFICATE

However, if the client connects from outside the Platform's network (a local PC or any other network location), then the configuration is as follows:

The PORT parameter is redirected automatically, so setting it is not strictly necessary, but you can set PORT = 443 (the default HTTPS port).

The PROTOCOL parameter is "https" by default, but you can set it explicitly with PROTOCOL = "https".

The AVOID_SSL_CERTIFICATE parameter is True by default, but it can be set explicitly with AVOID_SSL_CERTIFICATE = True.

import json
from onesaitplatform.iotbroker import DigitalClient

HOST = "lab.onesaitplatform.com"
PORT = 443
IOT_CLIENT = "RestaurantTestClient"
IOT_CLIENT_TOKEN = "f633128219f54a37b23409e7f4173100"
PROTOCOL = 'https'
AVOID_SSL_CERTIFICATE = True

client = DigitalClient(host=HOST, port=PORT, iot_client=IOT_CLIENT, iot_client_token=IOT_CLIENT_TOKEN)
client.protocol = PROTOCOL
client.avoid_ssl_certificate = AVOID_SSL_CERTIFICATE

Join / Leave

To start a connection, you need to do either a join() or a connect(), which uses the iot_client and iot_client_token credentials. At the end, you must always use either leave() or disconnect().
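Because every join() must be paired with a leave() even when an intermediate call fails, wrapping the connection in a try/finally (or a small context manager) is a safe pattern. The sketch below uses a stand-in stub class instead of a real DigitalClient, purely to illustrate the open/close flow; digital_session and StubClient are hypothetical names, not part of the library:

```python
from contextlib import contextmanager

@contextmanager
def digital_session(client):
    """Open the broker connection on entry; always close it on exit."""
    client.join()
    try:
        yield client
    finally:
        client.leave()

# Stand-in for a real DigitalClient, used here only to show the flow
class StubClient:
    def __init__(self):
        self.connected = False
    def join(self):
        self.connected = True
    def leave(self):
        self.connected = False

stub = StubClient()
with digital_session(stub):
    assert stub.connected   # queries and inserts would go here
print(stub.connected)       # False: leave() ran automatically
```

With a real client, `with digital_session(client): ...` guarantees the connection is closed even if a query raises an exception.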




Query

The supported query formats are "SQL" and "NATIVE", for SQL and MongoDB syntax respectively.

query = "select * from Restaurants limit 3"
ok_query, results_query = client.query(ontology="Restaurants", query=query, query_type="SQL")

query = "db.Restaurants.find().limit(3)"
ok_query, results_query = client.query(ontology="Restaurants", query=query, query_type="NATIVE")

Query batch

This query makes successive requests of smaller size, which is optimal when you want to request a lot of data. The requests are built as query + " offset N limit batch_size" or query + ".skip(N).limit(batch_size)" for SQL and MongoDB respectively.

query_batch = "select c from Restaurants as c"
ok_query, results_query = client.query_batch(ontology="Restaurants", query=query_batch, query_type="SQL", batch_size=50)

query_batch = "db.Restaurants.find()"
ok_query, results_query = client.query_batch(ontology="Restaurants", query=query_batch, query_type="NATIVE", batch_size=50)
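The paging that query_batch performs can be sketched in plain Python. The helper functions below are illustrative only (hypothetical names, not part of the library); they simply enumerate the successive queries described above:

```python
def sql_batches(query, total, batch_size):
    """Successive SQL queries: query + ' offset N limit batch_size'."""
    return [f"{query} offset {n} limit {batch_size}"
            for n in range(0, total, batch_size)]

def native_batches(query, total, batch_size):
    """Successive MongoDB queries: query + '.skip(N).limit(batch_size)'."""
    return [f"{query}.skip({n}).limit({batch_size})"
            for n in range(0, total, batch_size)]

# With 100 matching records and batch_size=50, two requests are issued
print(sql_batches("select c from Restaurants as c", 100, 50))
print(native_batches("db.Restaurants.find()", 100, 50))
```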


Insert

new_restaurant = {
    'Restaurant': {
        'address': {
            'building': '2780',
            'coord': [-73.98241999999999, 40.579505],
            'street': 'Stillwell Avenue',
            'zipcode': '11224'
        },
        'borough': 'Brooklyn',
        'cuisine': 'American',
        'grades': [
            {'date': '2014-06-10T00:00:00Z', 'grade': 'A', 'score': 5}
        ],
        'name': 'Riviera Caterer 18',
        'restaurant_id': '40356118'
    }
}
new_restaurant_str = json.dumps(new_restaurant)
new_restaurants = [new_restaurant]
ok_insert, res_insert = client.insert("Restaurants", new_restaurants)
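Since insert() takes a list of plain Python dicts, a quick JSON round-trip with the standard library is an easy sanity check that a record serializes cleanly before sending it. This is standard-library Python only; the record mirrors the example above:

```python
import json

record = {
    'Restaurant': {
        'address': {'building': '2780', 'street': 'Stillwell Avenue', 'zipcode': '11224'},
        'borough': 'Brooklyn',
        'cuisine': 'American',
        'name': 'Riviera Caterer 18',
        'restaurant_id': '40356118',
    }
}

payload = json.dumps([record])           # serialize the list of instances
assert json.loads(payload)[0] == record  # round-trip preserves the record
print(len(json.loads(payload)))
```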

Api-Manager Client

Create Client

Find APIs

List APIs

Make API request

File-Manager Client

Create Client

When the client is created, it is possible to use either a Bearer token (APIs) or a user token (My APIs > User tokens):

import json
from onesaitplatform.files import FileManager

HOST = "www.lab.onesaitplatform.com"
USER_TOKEN = "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJwcmluY2lwYWwiOiJianJhbWlyZXoiLCJjbGllbnRJZCI6Im9uZXNhaXRwbGF0Zm9ybSIsInVzZXJfbmFtZSI6ImJqcmFtaXJleiIsInNjb3BlIjpbIm9wZW5pZCJdLCJuYW1lIjoiYmpyYW1pcmV6IiwiZXhwIjoxNjE3ODI2NjkzLCJncmFudFR5cGUiOiJwYXNzd29yZCIsInBhcmFtZXRlcnMiOnsidmVydGljYWwiOm51bGwsImdyYW50X3R5cGUiOiJwYXNzd29yZCIsInVzZXJuYW1lIjoiYmpyYW1pcmV6In0sImF1dGhvcml0aWVzIjpbIlJPTEVfREFUQVNDSUVOVElTVCJdLCJqdGkiOiJmNGM2NDUzZC0xYTEyLTRkMGUtYTVlNy05ZmNlMDY4OTY1NDYiLCJjbGllbnRfaWQiOiJvbmVzYWl0cGxhdGZvcm0ifQ.Nz5cDvMjh361z4r6MMD2jUOpYSmUKVLkMThHDK0sg6o"
file_manager = FileManager(host=HOST, user_token=USER_TOKEN)
file_manager.protocol = "https"

Upload file

uploaded, info = file_manager.upload_file("dummy_file.txt", "./dummy_file.txt")

Download file

downloaded, info = file_manager.download_file("5ccc4b34f2df81000b8f49a")


BaseModelService

BaseModelService is a class that manages the whole lifecycle of ML/DL models that can be trained and deployed in Onesait Platform. It is a general-purpose base class intended to be the parent of more specific classes that deal with concrete models. For example, to manage a Sentiment Analysis model, a SentimentAnalysisModelService class can be created; this second class must inherit from BaseModelService. BaseModelService is in charge of the interaction with the different components of the Platform: getting datasets from the File Repository or from ontologies, saving the resulting models in the File Repository, reporting the creation of a new model in the corresponding ontology, and so on. The developer of the specific SentimentAnalysisModelService class doesn't need to worry about those issues and can focus on the training and inference code:

from onesaitplatform.model import BaseModelService

class SentimentAnalysisModelService(BaseModelService):
    """Service for Sentiment Analysis models"""

    def __init__(self, **kargs):
        super().__init__(**kargs)

    def load_model(self, model_path=None, hyperparameters=None):
        """Loads a previously trained model and saves it in one or more object attributes"""
        ...

    def train(self, dataset_path=None, hyperparameters=None, model_path=None):
        """
        Trains a model given a dataset and saves it in a local path.
        Returns a dictionary with the obtained metrics.
        """
        ...
        return metrics

    def predict(self, inputs=None):
        """Predicts given a model and an array of inputs"""
        ...
        return results

Once the class has been implemented, it can be used to build an object able to train models and predict with them:

PARAMETERS = {
    'PLATFORM_HOST': "lab.onesaitplatform.com",
    'PLATFORM_PORT': 443,
    'PLATFORM_DIGITAL_CLIENT': "SentimentAnalysisDigitalClient",
    'PLATFORM_DIGITAL_CLIENT_TOKEN': "534f2eb845c746bd9a50cfab30273317",
    'PLATFORM_ONTOLOGY_MODELS': "SentimentAnalysisModels",
    'PLATFORM_USER_TOKEN': "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJwcmluY2lwYWwiOiJianJhbWlyZXoiLCJjbGllbnRJZCI6Im9uZXNhaXRwbGF0Zm9ybSIsInVzZXJfbmFtZSI6ImJqcmFtaXJleiIsInNjb3BlIjpbIm9wZW5pZCJdLCJuYW1lIjoiYmpyYW1pcmV6IiwiZXhwIjoxNjE3ODI2NjkzLCJncmFudFR5cGUiOiJwYXNzd29yZCIsInBhcmFtZXRlcnMiOnsidmVydGljYWwiOm51bGwsImdyYW50X3R5cGUiOiJwYXNzd29yZCIsInVzZXJuYW1lIjoiYmpyYW1pcmV6In0sImF1dGhvcml0aWVzIjpbIlJPTEVfREFUQVNDSUVOVElTVCJdLCJqdGkiOiJmNGM2NDUzZC0xYTEyLTRkMGUtYTVlNy05ZmNlMDY4OTY1NDYiLCJjbGllbnRfaWQiOiJvbmVzYWl0cGxhdGZvcm0ifQ.Nz5cDvMjh361z4r6MMD2jUOpYSmUKVLkMThHDK0sg6o",
    'TMP_FOLDER': '/tmp/',
    'NAME': "SentimentAnalysis"
}

sentiment_analysis_model_service = SentimentAnalysisModelService(config=PARAMETERS)

MODEL_NAME = 'sentiment_analysis'
MODEL_VERSION = '1'  # example version identifier
MODEL_DESCRIPTION = 'First version of the model for sentiment analysis'
DATASET_FILE_ID = '605360b7cfb6d70134a3b1a0'
HYPERPARAMETERS = {
    'NUM_WORDS': 10000,
    'BATCH_SIZE': 16,
    'EPOCHS': 10,
    'DROPOUT': 0.2,
    'LEARNING_RATE': 0.001,
}

metrics = sentiment_analysis_model_service.train(
    name=MODEL_NAME, version=MODEL_VERSION, description=MODEL_DESCRIPTION,
    dataset_file_id=DATASET_FILE_ID, hyperparameters=HYPERPARAMETERS
)
sequences = ['Esta es una opinión muy buena', 'Esta es una opinión muy mala']
results = sentiment_analysis_model_service.predict(inputs=sequences)