A Look at the Control Panel and other Platform Concepts

From version 3.1.0 onwards, the Control Panel has renamed Ontologies to Entities. This does not alter any functionality; only the nomenclature has changed, for a better understanding of the concept.

Control Panel Introduction

Control Panel is the central web console for managing all platform concepts.

It provides a set of tools and capabilities that allow complete management of the solutions.

A Look at the Control Panel

Login

The login screen is configurable depending on the Identity Manager used in the installation. By default, OAuth2 is installed on the ConfigDB.

You can also use the advanced Identity Manager based on Keycloak, use an LDAP as the user repository, or extend the functionality through plugins.

Authentication protocols supported by the platform

All users whose account is activated can access the control panel with their username and password.

On this screen, you can also complete the registration of a new user who is not already registered.

After this step, you will receive an email to confirm the email account.

Initial Control Panel screen

  • The menus appear depending on the user's role. They are configurable by an administrator user in the menu management option.

  • The side menu has a search engine.

    • Administration (Administrator role only): gives access to platform configuration such as user management, tenants, templates, ...

    • Development: It groups the most common functions for a developer.

    • Visualization and GIS: Maps, reports and dashboards.

    • Analytical tools: It groups the data analytics tools, such as:

      • Notebooks: provides data scientists with a multi-user web environment in which to build analytical models on the information stored in the Platform, interactively and using their favorite languages (Python, Spark, R, TensorFlow, Keras, SQL, ...).

      • DataFlows: allows you to visually create and configure data flows between sources and destinations, for both ETL/ELT and streaming processes.

    • Clients and digital twins:

      • Clients: management of the clients that connect to the platform through the client APIs (Java, Arduino, JavaScript, ...).

      • Digital Twins:

        • A Digital Twin is a digital representation of a real-world entity or system. It does not act as a replacement for the physical object or system it represents, but as a replica of it, allowing communication with (testing, monitoring, control of) the physical device without having to be attached to it.

    • Data governance:

      • Lineage: With this functionality, you can query the relationships of an element (Ontology, Dashboard...) with all the other elements of the Platform.

      • Data Refiner: The objective of this component is to "refine" the information that is loaded or extracted from the platform.

    • Open information: access to the open data portal and to the management of its resources.

    • Development tools: It groups tools such as cache management, centralized configuration....

    • Tools:

      • query tool for data queries on entities.

      • tools for generating user interfaces with low code from a Figma design.

      • system audit.

      • Load tool: allows you to obtain metrics of data loads on the platform.

  • In the top bar, we have:

    • access to perform searches in the developer portal.

    • language switching.

    • access to the platform's APIs.

    • user information.

User Roles

Roles & Access to the Platform Features by Role

  • USER (ROLE_USER): This role has access to the platform in query mode, i.e., it can consume platform information generated by others but not upload information; therefore, it can consume dashboards and APIs, consult ontologies, ...

  • DEVELOPER (ROLE_DEVELOPER): It can create ontologies, APIs, rules, ... It is the platform's typical user type, and the one created by default. It has limited access to the AI capabilities of the platform, to control the resource consumption of the installation.

  • ANALYTICS (ROLE_DATASCIENTIST): This role extends the capabilities of the Developer role, allowing access to the analytical and AI tools; therefore, it has access to the DataFlow, Notebooks, Models, ...

  • ADMINISTRATOR (ROLE_ADMINISTRATOR): This role has administration access to the Platform’s Control Panel. From there, it can manage all the concepts of an instance of the platform created by other users, including user management, ontologies, permissions, ....

  • Example of user creation:

Steps:

  1. Fill in the form with the user's information and assign a role.

  2. After this, you go to the next option to enter a password.

  3. If you want, you can add additional properties in JSON format, for example: { "organization": "city council", "type": "1" }.

Multitenant

This option must be configured in the installation.

Multitenancy is a software architecture principle in which a single instance of the application is able to serve multiple clients or organizations (tenants).

This model differs from architectures with multiple instances, where each organization or client has its own installed instance of the application: in a multitenant architecture, the application can virtually partition its data and configuration so that each client has a virtual instance adapted to its requirements.

Multitenant functionality in the Platform is based on two concepts:

  • Vertical: it represents a specific product and project. Suppose the Platform is deployed in an organization that offers products in SaaS mode; in that case, we could have different verticals deployed on a single instance, for example a Smart Home vertical, a Waste Management vertical, ...

  • Tenant: it represents a customer to whom the organization serves its products; for example, we could offer the Smart Home solution to Carrefour, Leroy Merlin, ...

With the support of Onesait Platform, we could serve several verticals and tenants with a single platform instance.

Projects

This feature allows you to group users and share a series of resources with them. It is available for developing verticals on the platform.

It is important to note that only DEVELOPERS and ANALYTICS users can create Projects.

Example of a project created by a developer user.

With a developer user, we go to My Projects:

 

fill in the form with the project data:

and create it. After this, we will be able to edit it.

We will add other users to the project; in this case, we have added an analytics user:

We will define who has access to which resources and how.

 

Firstly, look for the type of resource, in this case Dashboard.

Two dashboards will appear. We will share them with the new user: one with view access and the other with manage access, which will allow them to modify it.

If the manage type of access is given, you will be asked whether the resource you are sharing with the user has associated resources.

Finally, the list of shared resources and their type of access will appear.

Realms

This functionality allows you to create security domains for your applications and use these domains from your applications through the authentication and authorization of the Identity Manager.

In a domain, you can create your own roles and assign platform users to them.

Entities

Introduction to Entities

video course: Onesait Platform Ontologies training

In the simplest case, an entity can be compared to a table in a relational database, but an entity can contain a complete domain model, which would require in a relational database a set of related tables.

In the platform, a schema in JSON-Schema format is used to define the entities. This schema defines the structure of the information that the entity will store, allowing semantic validations on the received data.
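
As an illustration, a minimal schema for a hypothetical TemperatureMeasurement entity could look like the sketch below (the entity name and fields are invented for this example):

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "TemperatureMeasurement",
  "type": "object",
  "required": [ "assetId", "value", "timestamp" ],
  "properties": {
    "assetId": { "type": "string" },
    "value": { "type": "number" },
    "timestamp": { "type": "string", "format": "date-time" }
  }
}

Any JSON document inserted into such an entity would then be validated against these three required properties.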

List of ontologies

  1. Ontology list description:

  2. Description of the list options:

Templates/Data Models: for standardization purposes

A Data Model represents the structure of your data and relationships, therefore it organizes the elements and standardizes how they relate to each other.

Data Models are essential in the IoT or Smart Cities environment, where we deal with physical assets, measurements, devices, processes, people, ... and our Data Model must be able to model all these concepts.

An administrator user can create these data models to make them available for the rest of users.

Example:

 

We will create a template using the following schema, for later use.

{ "$schema": "http://json-schema.org/draft-07/schema#", "title": "DataModel Alarm Schema", "type": "object", "required": [ "Alarm" ], "properties": { "Alarm": { "type": "string", "$ref": "#/datos" } }, "datos": { "description": "Properties for DataModel Alarm", "type": "object", "required": [ "timestamp", "assetId", "severity" ], "properties": { "timestamp": { "type": "object", "required": [ "$date" ], "properties": { "$date": { "type": "string", "format": "date-time" } }, "additionalProperties": false }, "assetId": { "type": "string" }, "severity": { "type": "string", "enum": [ "LOW", "MEDIUM", "CRITICAL" ] }, "alarmSource": { "type": "string" }, "details": { "type": "string" }, "status": { "type": "string" } } } }

Ontology creation

File uploading

 

It is possible to create ontologies from a file. This can be done via API:

Importing data into an ontology from a file via API

or from the control panel:

 

Example:

We create an ontology, "severity", by importing it from the file below:

Fields: type (low, medium, high), code (0, 1, 2).
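
For example, a JSON file with that structure could contain the following records (the mapping of codes to types simply follows the field description above):

[
  { "type": "low", "code": 0 },
  { "type": "medium", "code": 1 },
  { "type": "high", "code": 2 }
]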

We select the file.

We generate the schema and click on “create”. With this, we have created a new ontology with the imported data.
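
The generated schema would then look roughly like this (a sketch only; the actual output of the schema generator may differ and include additional metadata):

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "severity",
  "type": "object",
  "required": [ "type", "code" ],
  "properties": {
    "type": { "type": "string" },
    "code": { "type": "integer" }
  }
}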

Step by step Ontology creation

Example:

We select this option in the ontology creation.

We load the general information and select the options.

Data models: We select the previously created template, and we also show how to use an empty template and generate the schema.

We show relationships between ontologies (with the severity ontology).

Advanced options

Here we must bear in mind several points:

Whether you want the records to be deleted from time to time, and whether you want to save them as files before deleting them.

Non-persistent ontologies

Triggering processes such as:

  • Launching a Rule.

  • Execution of a Flow from the FlowEngine.

  • Execution of a DataFlow Pipeline.

  • Execution of a microservice.

  • Communicating with the Kafka Broker for insertion and notification.

  • (In the future) Execution of a function in the FaaS support.

Also bear in mind that these Kafka options should be selected for large volumes of information and low latency.

Kafka notification topics for ontologies

How to send & receive data to/from the Kafka Broker of the Platform
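
As a rough sketch, and assuming that the insertion and notification topics simply carry the entity instance as JSON, a message for the hypothetical TemperatureMeasurement entity sketched earlier could look like this (values are illustrative):

{
  "assetId": "sensor-01",
  "value": 21.7,
  "timestamp": "2023-05-20T10:15:00Z"
}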

KPI

A KPI (Key Performance Indicator) is basically a metric that helps us to measure and quantify progress based on a set of goals and objectives, i.e., a KPI is designed to show how a particular process or product is progressing.

Example: Let's create a KPI for the HelsinkiPopulation ontology.

We will use the query that returns the population average:
SELECT avg(c.Helsinki.population) as media FROM HelsinkiPopulation AS c

We will assign a frequency to the KPI to get this measure based on the frequency.
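
Assuming the KPI stores the result of each execution in a target entity, every run would insert a document shaped by the query's projection, something like the following (the value is purely illustrative):

{ "media": 628736.5 }

The field name media comes from the alias defined in the SELECT clause above.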

ONTOLOGY REST API

Example: We will create an ontology whose origin is a REST API of the platform.

We have 2 options:

  • Manually fill in the REST API definition.

  • Obtain the data automatically through a Swagger JSON that defines the external API to be used.

We will choose the second option.

 

Base url:

https://lab.onesaitplatform.com/controlpanel/

swagger.json :

https://lab.onesaitplatform.com/controlpanel/v2/api-docs?group=Gadgets

For security, we will opt for user and password.

After the import, it is convenient to assign each method its Default Operation Type, so that the ontology works correctly.

Time Series Ontologies

Example:

  1. We create an ontology of this type:

     tag: identifier

     field: signal

  2. Auxiliary data: creates another ontology where the last data received is recorded, to speed up queries.

  3. We select the time windows.

  4. We create the ontology.

  5. From the CRUD, we add a record (a sample record is sketched after this list) and check that two entries have been created, one for each window.

  6. In the stats ontology, we also see how the data has been loaded.
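
As a rough sketch, keeping the tag and field names from the example above, the record added from the CRUD could look like this (the exact structure depends on how the entity was defined, so treat it only as an illustration):

{
  "timestamp": { "$date": "2023-05-20T10:15:00Z" },
  "identifier": "sensor-01",
  "signal": 21.7
}

After inserting it, one aggregated entry per configured time window should appear, as described in the steps above.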


Control Panel Utilities

Entities CRUD

From the entity management list, we have access to the CRUD of each Entity.

Query Tool

In addition to the menu option that is available, you can also access it via API:

API QUERY TOOL