Published Release 2.0.0 (fireball) of onesait Platform


On April 6, 2020, release 2.0.0 of the onesait Cloud Platform was published (aka the fireball version, following our version policy: /wiki/spaces/PT/pages/60326117).

You can follow our roadmap on the Roadmap onesait Platform page.

Below you can see this release's main features:

Engine

This version includes support for application and microservice development, for which the platform offers a centralized web console that supports the administration, configuration and development of all types of applications, integrating application creation, deployment on the CaaS infrastructure, visualization, etc.

The following features have been incorporated in this release:

  • Full integration of Ontologies with relational databases: from this release on, in addition to connecting to a relational database and creating the ontology that represents a table (more info), you can create an Ontology using the UI wizard or by providing a JSON-schema, and the platform will then create the tables and FKs on the database configured in a JDBC connection.

For this, select:

1. Creation from an external relational database.

2. When creating the ontology, select a configured JDBC connection and NEW TABLE:

3. For every attribute of the ontology, provide its full configuration (name, type, etc.):

as well as its constraints (PK, FK, ...),

where even an attribute from another existing table can be selected (as an FK).

4. When all the configuration is done, select the UPDATE SQL button. It will show the SQL to be executed on the remote DB, in case any modifications are needed:

5. Finally, select GENERATE SCHEMA to update the JSON-Schema definition and press NEW:

The new table will then have been created in the DB.

You can find the full guide on using this functionality here: /wiki/spaces/PT/pages/371490825
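
The preview shown by the UPDATE SQL step is essentially a JSON-schema-to-DDL translation. As an illustration only (the type mapping and helper function below are hypothetical, not the platform's actual code), the idea can be sketched like this:

```python
# Minimal sketch: derive a CREATE TABLE statement from a JSON-schema,
# roughly what the UPDATE SQL step previews. The type mapping is an assumption.
TYPE_MAP = {"string": "VARCHAR(255)", "integer": "INTEGER",
            "number": "DOUBLE", "boolean": "BOOLEAN"}

def schema_to_ddl(table, schema, primary_key=None):
    cols = []
    required = set(schema.get("required", []))
    for name, spec in schema["properties"].items():
        col = f"{name} {TYPE_MAP[spec['type']]}"
        if name in required:
            col += " NOT NULL"      # required attributes become NOT NULL
        cols.append(col)
    if primary_key:
        cols.append(f"PRIMARY KEY ({primary_key})")
    return f"CREATE TABLE {table} ({', '.join(cols)});"

schema = {
    "properties": {"id": {"type": "integer"}, "name": {"type": "string"}},
    "required": ["id"],
}
print(schema_to_ddl("client", schema, primary_key="id"))
# → CREATE TABLE client (id INTEGER NOT NULL, name VARCHAR(255), PRIMARY KEY (id));
```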

  • GeoJSON support in Ontologies on PostGIS: complementing the previous functionality, this new version allows you to establish a connection with a PostGIS database and create an ontology from an existing table with GEOMETRY-type columns, or to create the table from the selected ontology, as in this case:

The Platform will return the ontology's GIS data in GeoJSON format (using the ST_AsText PostGIS function) so that these attributes can be used in the viewers.

We can see it in the QueryTool, for example:

And we can create GIS Layers on these ontologies

and add them to our GIS viewers:

This post explains the functionality in more detail: /wiki/spaces/PT/pages/491782269
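
As an illustration of what a client receives, here is a minimal sketch of consuming a GeoJSON geometry attribute returned for such an ontology (the field names are invented for the example):

```python
import json

# Illustrative only: a row returned by the platform with an assumed "geometry"
# attribute serialized as GeoJSON. Field names are invented for this sketch.
row = '{"name": "sensor-1", "geometry": {"type": "Point", "coordinates": [-3.7038, 40.4168]}}'

feature = json.loads(row)
lon, lat = feature["geometry"]["coordinates"]  # longitude first, per the GeoJSON spec
print(lon, lat)
```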


  • Mongo 4.x SQL support in the Semantic DataHub: up to this version, support for executing SQL queries over MongoDB was limited to Mongo 3.x versions. This version includes a new SQL engine that works with Mongo 4.x versions. This engine is based on the open-source library sql-to-mongo-db-query-converter, with which we have collaborated to expand and improve its functionality and, in turn, contribute these improvements back to the community.

This new SQL engine can also be connected with Mongo Atlas.

When a solution built on the platform with Mongo 3.x wants to migrate to Mongo 4.x, we recommend opening a support request to proceed with the migration, which basically involves:

- Installation of Mongo 4 and the new SQL engine.

- Execution of the solution's SQL queries in the Query Tool to guarantee backward compatibility (and to identify any queries that need review or modification).

- Once compatibility is certified, migration of the data stored in Mongo 3 to Mongo 4, followed by the definitive deactivation of Mongo 3 and the previous SQL engine.
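
To illustrate the kind of translation the new engine's underlying library performs, here is a toy converter for simple equality filters (a sketch only; this is not the library's API and covers a tiny subset of SQL):

```python
import re

def sql_to_mongo(sql):
    # Toy translation of "SELECT * FROM coll WHERE field = 'value'" into a
    # Mongo collection/filter pair for find(). Real SQL coverage (projections,
    # joins, aggregations, operators) is far broader in the actual library.
    m = re.match(r"SELECT \* FROM (\w+)(?: WHERE (\w+) = '([^']*)')?$", sql)
    if not m:
        raise ValueError("unsupported SQL in this sketch")
    coll, field, value = m.groups()
    return coll, ({field: value} if field else {})

print(sql_to_mongo("SELECT * FROM Ticket WHERE status = 'open'"))
# → ('Ticket', {'status': 'open'})
```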


  • Identity Manager Plugin Architecture: starting in this version, systems built on the Platform can take advantage of its plugin architecture, which allows adding functionalities and extensions to different platform modules simply by adding a Java library (JAR) that follows some rules.

This functionality is part of the Platform's Enterprise version and will be published in the next version (Q2 2020), although it can already be tested in PREVIEW MODE in the Identity Manager module.

Once the plugin has been developed and packaged, it is referenced through an environment variable in the deployment:

  so the module loads it automatically:

The post How to extend the Platform with plugins explains how to develop and deploy your own plugins.
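
The drop-in loading pattern behind this can be sketched in Python as an analogy (the platform's real mechanism loads Java JARs declared via an environment variable; all names below are invented for illustration):

```python
import importlib.util
import pathlib
import tempfile

# Analogy of the plugin pattern: modules dropped into a known location are
# discovered and loaded at startup, then the module calls into them.
plugin_dir = pathlib.Path(tempfile.mkdtemp())
(plugin_dir / "hello_plugin.py").write_text(
    "def extend(module):\n    return module + '+hello'\n"
)

plugins = []
for path in sorted(plugin_dir.glob("*.py")):
    spec = importlib.util.spec_from_file_location(path.stem, path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)   # load the plugin code
    plugins.append(mod)

result = plugins[0].extend("identity-manager")
print(result)  # identity-manager+hello
```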


  • SQL/JsonPath subscriptions via the Digital Broker: this functionality allows a platform client to subscribe to changes in ontologies, so that when a condition is met, the platform notifies the subscribed client.

  These subscriptions can be defined from the Control Panel:

   

   Once a subscription is defined, a client can subscribe to these changes either by MQTT or by REST (in this case indicating a callback).

   This guide explains the functionality in detail: /wiki/spaces/OP/pages/429228173
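
A subscription condition of this kind boils down to evaluating a path expression against each incoming ontology instance. A minimal sketch, assuming a simple `$.a.b` path syntax (the platform's engine supports much more):

```python
import json

# Sketch: evaluate a JsonPath-style condition against an incoming instance.
# Only dotted child access ($.a.b) is handled here; this is not the real engine.
def matches(instance, path, expected):
    node = instance
    for key in path.lstrip("$.").split("."):
        node = node[key]
    return node == expected

instance = json.loads('{"sensor": {"zone": "A", "temperature": 31}}')
print(matches(instance, "$.sensor.zone", "A"))  # True
```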


  • Microservices generation from a ZIP file or a Git repository: from this version on, you can choose to create a microservice from a ZIP file or from a Git repository. This makes it possible to maintain a catalog of Git repositories that can be used as templates:

When creating the microservice from a Git repository, it will ask for the necessary information to connect to that repository. If everything is correct, a new microservice will be created from that configuration.


  • Advanced management of platform user credentials: in this version, numerous improvements have been included in the management of a platform user's credentials, among which we can highlight:
    • Setting the minimum required password length and complexity.
    • Blocking the password after multiple failed attempts.
    • Checking against previously used passwords.
    • Forcing a password change every X days.
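
These rules can be pictured as a validation routine like the following sketch (the thresholds and function names are assumptions for illustration, not the platform's configuration):

```python
import re
from datetime import date, timedelta

# Illustrative sketch of the credential rules above; thresholds are invented.
MIN_LENGTH = 10
MAX_AGE_DAYS = 90

def check_password(new_pwd, previous_pwds, last_change):
    errors = []
    if len(new_pwd) < MIN_LENGTH:
        errors.append("too short")
    if not (re.search(r"[A-Z]", new_pwd) and re.search(r"\d", new_pwd)):
        errors.append("needs an uppercase letter and a digit")
    if new_pwd in previous_pwds:
        errors.append("previously used")
    if date.today() - last_change > timedelta(days=MAX_AGE_DAYS):
        errors.append("password expired, change required")
    return errors

print(check_password("Short1", ["OldPass123"], date.today()))
# → ['too short']
```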


  • Complete Kafka integration with platform security: the platform supports Kafka as one of its ingestion protocols; until now, security had focused on authenticating the connected client. In this version, ACL-based security control has been added to manage data authorization when Kafka ingestion topics are used. This way, when creating a client and granting permissions on ontologies, the same authorizations (read / write / both) are enforced on the Kafka topics. All this is managed by the platform on the integrated Kafka bus.
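
The mapping from ontology permissions to topic ACLs can be sketched as follows (the topic naming convention and data structures are assumptions for illustration, not the platform's internals):

```python
# Sketch: map an ontology authorization to Kafka ACL operations on the
# ingestion topic. Names and the topic convention are invented for the example.
PERMISSION_TO_OPERATIONS = {
    "read": ["READ"],
    "write": ["WRITE"],
    "both": ["READ", "WRITE"],
}

def acl_bindings(client_id, ontology, permission):
    topic = f"ONTOLOGY_{ontology.upper()}"  # assumed topic naming convention
    return [
        {"principal": f"User:{client_id}", "topic": topic, "operation": op}
        for op in PERMISSION_TO_OPERATIONS[permission]
    ]

print(acl_bindings("device-01", "Temperature", "both"))
```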


  • Flow Engine migration to Node-RED 1.0.3: the FlowEngine execution engine has been updated to the latest version of Node-RED (1.0.3). This brings numerous updates to the component, both at the flow editor level and in asynchronous message passing, among others. You can check the new features of this version at the following link: https://nodered.org/blog/2019/09/30/version-1-0-released


  • ConfigDB management via the Query Tool: a deployment-level option has been enabled (by default, only in development environments) that allows a user with the Administrator role in the Control Panel to fully manage the ConfigDB from the QueryTool itself.

Thus, an administrator user can connect to the ConfigDB, access the tables

And query their data, in addition to being able to perform INSERT, UPDATE and DELETE operations on this configuration data.
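
The kind of management this enables is plain SQL. As an offline illustration (here against an in-memory SQLite database with an invented table; the real ConfigDB runs on MariaDB and its schema differs):

```python
import sqlite3

# Illustration of the INSERT/UPDATE/SELECT management now possible from the
# QueryTool. The table and columns are invented for this sketch.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE configuration (id INTEGER PRIMARY KEY, key TEXT, value TEXT)")
db.execute("INSERT INTO configuration (key, value) VALUES ('log.level', 'INFO')")
db.execute("UPDATE configuration SET value = 'DEBUG' WHERE key = 'log.level'")
rows = db.execute("SELECT key, value FROM configuration").fetchall()
print(rows)  # [('log.level', 'DEBUG')]
```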


  • In addition, this release includes a large number of corrections and performance improvements identified through your requests to the platform support mailbox, which help us improve the platform day by day.

Intelligence

This version of the platform focuses on supporting the development of systems and applications that need to use the platform's analytical, AI or distributed ingestion capabilities.

In this quarter we have focused on:

  • BigQuery integration with the Platform's Ontologies: BigQuery support has been incorporated into the Semantic DataHub in this version, which makes it possible to manage ontologies persisted in this Google database:

The integration with BigQuery has been done through the Agents architecture described in the next point.

  • Agents architecture for JDBC connections: the Platform has incorporated a new JDBC Agent architecture that allows connecting to various persistence technologies that offer a JDBC interface. Through the JDBC connection offered by the agent, the platform supports the creation of ontologies, which makes it possible to use many databases of different types, with different drivers, or even incompatible versions of the same database.

Once we have the agent created for the source database, a JDBC connection can be created on the Platform

and from there generate the ontology:

and be able to query it like any other:

You can find the details of this functionality in this entry: How to create an Ontology over a JDBC Agent of the Platform?


  • Dashboard Engine support for cluster deployment: the JavaScript library that manages the connection to the Dashboard Engine has been updated. Now, in cluster deployments of the Dashboard, if any node becomes inoperative, the connection is transparently re-established against another available node.


  • Support for additional resources in Reports: from this version on, additional resources can be associated with a Report, such as subreports, datasources and other assets, as can be seen in the image:

This tutorial explains in detail how to include additional resources, including fonts: /wiki/spaces/OP/pages/323813377


  • Support for ontology creation from tables with the same name: from now on, when you create an ontology from a JDBC connection,

it will suggest an ontology name that combines the connection name and the table name. You can in any case select another name for the ontology, since the platform stores this assignment:

Things

This version of the platform is focused on the creation of IoT systems in which the platform's Edge and Cloud capabilities can collaborate, allowing bidirectional communication with devices, modeling and deployment of Digital Twins, etc.

In this quarter we have focused the work in these lines:

  • Full Synoptics review: within this task we have resolved various errors and incorporated improvements identified by our colleagues (thanks, Sergio). This engine is already being used in several of our products:

  • Device security integration with Kafka: the Kafka integration with the platform has been improved, so that platform Devices that handle ontologies transfer their permissions to Kafka ACLs.

DevOps

This category includes all the tools that support the development and operation of the Platform.

In this quarter we have incorporated these new tools:

  • K8s deployment with Helm Charts: Helm charts have been created

that allow automating the deployment of the platform on OpenShift:


  • Kafka deployment over the Kubernetes network (instead of the host network): in this version, the images for the platform's Kafka services (Zookeeper, Kafka, KSQL, monitoring) have been modified to use the Docker / Kubernetes network. We have also simplified the configuration of all of them, allowing you to configure the services with the same environment variables that Kafka documents for its base images.

 For more information you can follow the following link: https://docs.confluent.io/current/installation/docker/config-reference.html


  • Review and improvements in Export / Import Tool: a review of the Export / Import tool has been completed, also adding the ability to export and import Notebooks.

You can find the detail of this functionality here: /wiki/spaces/PT/pages/16973827

  • MariaDB as the default ConfigDB database: from Platform version 2.0 onwards, MariaDB is the reference technology for the ConfigDB. This does not prevent the use of other databases, including MySQL, since the platform can support other databases for storing configuration data.

  • Web Projects and Datasources management APIs: several new REST APIs have been created for managing elements such as Web Projects and DataSources. This allows, for example, uploading a web project's ZIP file from a continuous integration environment by invoking this API.
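
For example, a CI step might package the web project in memory before invoking the upload API. A sketch, with a placeholder endpoint (the URL shown is not the documented API route):

```python
import io
import zipfile

# Sketch: package a web project as a ZIP in memory, ready to be uploaded to
# the Web Projects management API from a CI pipeline.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("index.html", "<html><body>Hello</body></html>")

payload = buf.getvalue()
upload_url = "https://<controlpanel>/api/webprojects"  # placeholder endpoint
print(zipfile.ZipFile(io.BytesIO(payload)).namelist())  # ['index.html']
```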

Open Source Initiative

This line of work includes all tasks related to the Platform's Open Source version (Onesait Platform Community).

In this quarter we have continued the line of work launched in Q3 2019, specifically:

  • Release 1.1.0-ce of Onesait Platform published on GitHub: the open-source version of the platform is available on GitHub (https://github.com/onesaitplatform/). In addition, new manuals have been included and existing ones have been improved, so that running and using the platform is easier for users.