Kafka notification topics for ontologies

This functionality is available from version 2.1.0-gradius onwards.

Introduction

Onesait Platform already integrates Kafka as a way to insert data into our ontologies. But what if we want to be notified via Kafka every time new data is inserted into an ontology?

Well, in this Q2 we are introducing Kafka notification topics for ontologies, which let you be aware of each insert through a Kafka topic.

How to use it

This new functionality is enabled per ontology. To activate it, create or edit an ontology, go to the “Advanced settings“ tab, and check the “ALLOWS CREATE NOTIFICATION KAFKA TOPIC?“ property.

This will create a topic with the following naming convention: ONTOLOGY_OUTPUT_<ontology_name>

Once the ontology is created/updated, every insertion sent to it (from any source, such as APIs, the CRUD, Digital Clients, …) will also write the data to the notification topic.
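If you have shell access to a Kafka broker, you can check that the notification topic exists by listing the topics. This is just a sketch: the broker address (localhost:9092) and the availability of the Kafka CLI scripts on the PATH are assumptions that depend on your deployment.

```shell
# List all topics and keep only the ontology notification topics.
# localhost:9092 is a placeholder for your broker address.
kafka-topics.sh --bootstrap-server localhost:9092 --list | grep '^ONTOLOGY_OUTPUT_'
```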

Here is an example of how to activate a Kafka notification topic and how to retrieve the notification data from it.

  1. Activate the Kafka notification topic for the ontology “KafkaNotifOnt“

    Select an ontology to edit


    Check “ALLOWS CREATE NOTIFICATION KAFKA TOPIC?“ and save changes

     

  2. Create/update a digital client and allow it to read from the ontology “KafkaNotifOnt“. This will grant ACL permissions on the topic to that client:

    Select the digital client to update:


    Add “KafkaNotifOnt” with “READ” or “ALL” access level and save changes.

    Remember the token, as we will be using it in the following step.

  3. Create a Kafka consumer on the notification topic. In this example we will use a console consumer with SASL authentication:

    First, we will create a JAAS config file, using the name of the digital client as the user and its token as the password, and save it to a file:

    echo '
    KafkaClient{
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="KafkaClient"
        password="2abb5497223344969bb45077a071b8b7";
    };
    '>/etc/kafka/kafka_client_jass.conf


    Then we will export the KAFKA_OPTS variable pointing to its location:

    export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_client_jass.conf"


    After that, we need to create a consumer properties file to enable SASL as the authentication method, and save it to a file:

    echo '
    security.protocol=SASL_PLAINTEXT
    sasl.mechanism=PLAIN
    '>consumer.properties


    Finally we can start our consumer and leave it running
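As a sketch, assuming the Kafka CLI scripts are on the PATH and the broker listens on localhost:9092 (both deployment-specific assumptions), the console consumer for our example ontology can be started like this:

```shell
# Consume the notification topic of the KafkaNotifOnt ontology,
# authenticating via SASL with the consumer.properties file created above
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic ONTOLOGY_OUTPUT_KafkaNotifOnt \
  --consumer.config consumer.properties
```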


     

  4. Insert data into the ontology

    In this example we will use the platform's CRUD tool.

    Then insert some data
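For reference, each inserted record is a JSON document matching the ontology schema, and that same document is what gets written to the notification topic. The record below is purely hypothetical; the field names are illustrative, since they depend on how KafkaNotifOnt was defined:

```shell
# Print a hypothetical KafkaNotifOnt record (field names are illustrative only)
echo '{"KafkaNotifOnt":{"device":"sensor-01","temperature":21.5}}'
```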



  5. Check the consumer: