Platform Integration and Interoperability Mechanisms
- 1 Introduction
- 2 Platform Capabilities
- 3 Platform Modules that Support Integration
- 3.1 Acquisition Layer
- 3.1.1 Digital Broker
- 3.1.2 Kafka Broker
- 3.1.3 DataFlow
- 3.1.4 Digital Twin Broker
- 3.1.5 Video Broker
- 3.2 Interoperability Layer
- 3.2.1 API Manager
- 3.2.2 Dashboard Engine
- 3.2.3 Open Data Portal
- 3.2.4 SDKs and APIs
Introduction
In many scenarios, the Platform acts as the organization's central platform (e.g. a Smart City Platform), and there is a set of verticals/developments/applications that must either dump certain information on the platform (e.g. alarm values, KPIs, ...) or consume information managed by the platform (e.g. asset information).
The correct integration of the Vertical Services with the Platform is essential to ensure that the information flowing through each vertical is incorporated into the global information flow provided by the platform, thus allowing the information to be exploited in a holistic way.
The objective of this entry is to describe the integration mechanisms offered by the platform through its technical components and its interoperability capabilities.
Platform Capabilities
Integration, whether unidirectional or bidirectional, will be carried out thanks to Onesait Platform's out-of-the-box integration and interoperability capabilities, which allow integration with any device or IT system in a very simple and agile way. These include:
Connectors: the Platform offers connectors for multiple protocols that allow sending, receiving and subscribing to the data managed by the platform; these include REST APIs, the MQTT Bus, the Kafka Bus and WebSockets (a minimal sketch follows this list). In addition, the platform allows you to create connectors for the Digital Broker and deploy them as plugins, or to connect easily to existing buses and platforms through the DataFlow.
NGSI-v2 FIWARE protocol support, which allows interoperability with FIWARE platforms through its Orion Context Broker. See details: FIWARE NGSIv2 compatibility in Onesait Platform.
Multi-language client APIs: the Platform offers multi-language APIs to facilitate the development of clients that want to communicate with the Platform, including Java, Kafka, Python, JavaScript, Node.js, Android and Node-RED clients. See details: https://onesaitplatform.atlassian.net/wiki/spaces/DOC/pages/2215907321
Open-source platform: all platform components are open source, including the client APIs (GitHub - onesaitplatform/onesait-cloud-platform-clientlibraries: client libraries to interact with the Onesait Platform cloud side, especially the Digital Broker) and the components that support integration (GitHub - onesaitplatform/onesaitplatform-cloud: Onesait Platform Community Edition is a free, open-source digital platform that anyone can download and use to build a complete solution on top of it; this repo contains the cloud side of the Platform).
Semantic approach: the Platform's semantic approach allows information to be modeled as Ontologies, making it possible to manage information in a way that is completely agnostic of the technological protocol used to send the data: regardless of the protocol, the information is managed in the same way. This facilitates not only the integration between the services and the Platform, but also the integration and global management of data across the different services. Within this approach, the platform supports JSON-Schema and, for linked data, JSON-LD (see JSON-LD support in the Platform), including http://schema.org compliant definitions.
Digital Twins: a Digital Twin is a digital representation of a real-world entity or system that does not act as a replacement of the physical object or system it represents, but as a replica of it, allowing communication with the physical device (testing, monitoring, commanding) without having to be attached to it. See details: Applicability and Importance of Digital Twins in the Smart Cities domain.
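As a rough illustration of the connector and client-API capabilities listed above, the sketch below pushes one ontology instance to the Digital Broker over REST from Python. The base URL, the join/insert paths, the "AirQualityObserved" ontology and the token handling are assumptions made for this example, not the documented contract; the multi-language client APIs linked above are the supported way to do this.

```python
import requests

# All of the following are illustrative placeholders, not the documented
# Digital Broker contract: base URL, join/insert paths, ontology name, token.
BROKER_URL = "https://platform.example.com/iot-broker"

# 1. Join: authenticate the vertical/device against the Digital Broker.
join = requests.get(
    f"{BROKER_URL}/rest/client/join",
    params={
        "token": "<client-token>",
        "clientPlatform": "AirQualityVertical",
        "clientPlatformId": "sensor-001",
    },
    timeout=10,
)
join.raise_for_status()
session_key = join.json()["sessionKey"]

# 2. Insert: send an instance of the ontology as plain JSON. Because the model
#    is semantic, the same payload could arrive over MQTT, Kafka or WebSockets
#    and would be handled identically by the platform.
measurement = {"AirQualityObserved": {"station": "sensor-001", "NO2": 42.1}}
insert = requests.post(
    f"{BROKER_URL}/rest/ontology/AirQualityObserved",
    json=measurement,
    headers={"Authorization": session_key},
    timeout=10,
)
insert.raise_for_status()
print(insert.json())
```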
Platform Modules that Support Integration
The image shows the Platform components organized by layers, with the Acquisition and Publication Layers highlighted; these layers include the components used to communicate with the platform (sending, querying and subscribing):
The described Layers correspond to the Layers Model of the Smart City Platform of the UNE-178104 standard. More info: https://onesaitplatform.atlassian.net/wiki/spaces/DOC/pages/2215903386
Acquisition Layer
This layer provides the mechanisms for capturing data from the Acquisition Systems. It is also in charge of enabling the interconnection with other external systems that only consume data, and it abstracts the information from the Acquisition Systems with a standard semantic approach.
Digital Broker
It is the platform's Broker and the default data acquisition mechanism:
- It offers multi-protocol gateways (REST, MQTT, WebSockets, ...).
- It provides bidirectional communication with the platform clients.
- It offers multi-language APIs (Java, JavaScript, Python, Node.js, Android, ...) that allow each vertical to develop in its preferred language and platform.
This is the mechanism with which systems and devices are typically integrated.
Kafka Broker
The platform integrates a Kafka cluster that allows communication with systems that use this exchange protocol, generally because they handle a large volume of information and need low latency.
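As an illustration of this path, the sketch below publishes one JSON record to a Kafka topic with a standard Python client. The broker address and topic name are placeholders; the actual endpoint, security settings and topic naming convention depend on how the platform's Kafka cluster is exposed in each installation.

```python
import json

from kafka import KafkaProducer  # kafka-python client; any Kafka client works

# Placeholder address and topic: the real values come from your installation.
producer = KafkaProducer(
    bootstrap_servers="platform.example.com:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# A high-volume source publishes ontology instances directly to the topic;
# the platform consumes them and handles the data like any other input.
producer.send("air-quality-measurements", {"station": "sensor-001", "NO2": 42.1})
producer.flush()
```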
DataFlow
This component allows you to configure data flows from a web interface. These flows are composed of a source (which can be files, databases, TCP services, HTTP, queues, ... or the Platform Digital Broker), one or more transformations (processors in Python, Groovy, Javascript, ...) and one or more destinations (the same options as the source).
This is the mechanism to use when the platform must collect the information from the source system (rather than the system pushing it to the platform), or when the system/device is not integrated directly and dumps its information on an external bus, in which case a conversion on the platform is needed.
Among its main features:
- It includes a large number of input and output connectors (databases, web services, Google Analytics, REST services, brokers, ...), which allows sources and destinations to be decoupled in a simple, integrated way through a single tool (see Connectors). Among the main DataFlow connectors are Big Data connectors for Hadoop and Spark, FTP, files, REST endpoints, JDBC, NoSQL databases, Kafka, Azure cloud services, AWS, Google, ...
- It offers a large number of connectors for IoT scenarios, such as OPC, CoAP, MQTT, ...
- It allows an external system to communicate with the platform through a supported protocol (MQTT, REST, Kafka, JMS, ...); the platform orchestrates this data and routes it to another system or incorporates it into the platform.
In addition, there is a palette of components that support more specific protocols that the platform administrator can enable as needed:
There are specific connectors for Amazon, Google Cloud, Azure, etc.
All information flowing through the component is monitored, audited and made available for exploitation; in addition, the DataFlow component offers traceability and online monitoring:
Digital Twin Broker
This Broker allows communication between the Digital Twins and the platform, and among the Digital Twins themselves. It supports REST and WebSockets as protocols.
The platform implements the W3C Web of Things (Home - Web of Things (WoT)), which defines how a Digital Twin has to be modeled and the communication APIs it must provide, and gives full support to materialize these concepts in digital systems that can connect with each other and collaborate according to rules visually established by a municipal management operator.
For this purpose, the following capabilities are provided from the platform:
Modeling of a Digital Twin from the Platform Control Panel: so that a user can precisely define the interface (inputs, outputs and status) of their Digital Twin; a simplified example is shown after this list. Modeling can use the semantics included in the platform (inputs, outputs and state can in turn be ontologies).
Simulation of the Digital Twin: so that the behavior of the Digital Twin (DT) can be tested, making use of the platform's AI modules where needed.
Digital Twin implementation: once the DT has been modeled, the platform is able to generate code in different languages that implements the functionality required to use the DT in operation:
Digital Twin status: the Digital Twins are securely connected to the Platform, and the platform keeps a Shadow of their status:
Orchestration of Digital Twins: once they are modeled, implemented and running, the platform allows an orchestration of Digital Twins to be built visually, so that the output of one Digital Twin can be mapped to the input of another and one reacts to the state changes of the other.
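As a simplified illustration of what such a model contains, the sketch below describes a hypothetical street-light Digital Twin using the W3C Web of Things Thing Description vocabulary mentioned above (properties for state, actions for inputs, events for outputs). How the platform itself serializes and stores Digital Twin models may differ.

```python
# Simplified, illustrative model of a hypothetical street-light Digital Twin,
# following the W3C WoT Thing Description vocabulary. The platform's own
# serialization of Digital Twin models may differ.
street_light_twin = {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "title": "StreetLight-042",
    # State exposed by the twin (what the platform keeps as its Shadow).
    "properties": {
        "status": {"type": "string", "enum": ["on", "off", "failure"]},
        "powerConsumption": {"type": "number", "unit": "W"},
    },
    # Inputs: commands the twin accepts.
    "actions": {
        "switchOn": {},
        "switchOff": {},
    },
    # Outputs: notifications the twin emits.
    "events": {
        "lampFailure": {"data": {"type": "string"}},
    },
}
```

An orchestration could then, for example, map the lampFailure event of one twin to an action of another, which is what the visual orchestration described above enables.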
Video Broker
This component allows connecting to cameras through the WebRTC protocol and processing the video stream by associating it with an algorithm (person detection, OCR, etc.).
The result of this processing can be represented in the platform dashboards in a simple way:
Interoperability Layer
This layer offers interfaces over the Knowledge Layer, establishing security policies and connectors so that external systems can access the Platform and vice versa, and it allows services to be built from the Platform data. For this purpose, one of the APIs offered to developers is the native API for accessing Knowledge Layer data.
API Manager
This module exposes the information managed by the platform through REST interfaces. These APIs are created from the Control Panel and can be used for queries or updates. It also offers an API Portal for API consumption and an API Gateway to invoke the APIs.
The Platform also integrates the Gravitee API Manager for scenarios where more advanced control of the REST APIs is needed, for example custom security policies, throughput control, ...
This is the typical mechanism for integrating web applications, portals and mobile applications with the platform, and current BI tools already offer connectors, as can be seen in these examples:
How to represent data from a Platform REST API in QlikView?
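As a rough sketch of how a web application or BI tool might consume one of these APIs through the API Gateway, the example below queries a hypothetical API published over an ontology. The gateway path, API name and authentication header are placeholders defined when the API is published from the Control Panel, not fixed values.

```python
import requests

# Placeholders: the gateway base path, API name/version and authentication
# header are defined when the API is published from the Control Panel.
GATEWAY = "https://platform.example.com/api-manager/server/api"

response = requests.get(
    f"{GATEWAY}/v1/airquality",                # hypothetical API over an ontology
    params={"station": "sensor-001"},          # query parameter defined in the API
    headers={"X-OP-APIKey": "<user-token>"},   # assumed header name; check the API Portal
    timeout=10,
)
response.raise_for_status()
for record in response.json():
    print(record)
```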
Dashboard Engine
This module allows, in a simple way, the generation and visualization of powerful dashboards on the information managed by the platform, consumable from different types of devices and with analytical and data discovery capabilities. All of this is centrally orchestrated through the Onesait Platform control panel, and these dashboards can be made public or shared with other platform users.
These dashboards are based on the philosophy of powerful component frameworks such as React, Angular or Vue, and are built from simple autonomous and reusable components called gadgets.
Open Data Portal
Onesait Platform integrates the CKAN open-source Open Data portal among its components.
CKAN is an open-source data portal that provides tools to publish, share, find and use data. Its basic unit is the dataset, where data are published; each dataset is made up of different resources and metadata. The resources store the data in different formats (CSV, XML, JSON, ...), and thanks to the integration with the platform the complete system gains functionalities such as the following (a minimal consumption sketch is shown after the list):
- Unified management from the Platform Control Panel.
- Complete management of datasets and resources.
- Publication of ontologies as datasets.
- Publication of APIs as datasets.
- Integration with Platform Dashboards.
- Full integration with Platform security: authentications, authorizations, etc.
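As a minimal consumption sketch, the example below uses the standard CKAN Action API (package_list, package_show), which also applies to the integrated portal; the portal URL and dataset id are hypothetical.

```python
import requests

# The portal URL and dataset id are placeholders; package_list and package_show
# are standard CKAN Action API endpoints, so they apply to the integrated portal too.
CKAN_URL = "https://opendata.example.com"

# List the published datasets (e.g. ontologies or APIs exposed as datasets).
datasets = requests.get(f"{CKAN_URL}/api/3/action/package_list", timeout=10).json()
print(datasets["result"])

# Inspect one dataset and its resources (CSV, JSON, ... files or API links).
detail = requests.get(
    f"{CKAN_URL}/api/3/action/package_show",
    params={"id": "air-quality-measurements"},
    timeout=10,
).json()
for resource in detail["result"]["resources"]:
    print(resource["format"], resource["url"])
```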
SDKs and APIs
The platform offers REST APIs to access all the components of the platform, both at the operation and management levels, as well as multi-language SDKs to communicate with the platform in a simple way:
See detail: https://onesaitplatform.atlassian.net/wiki/spaces/DOC/pages/2215907321