Onesait Platform as a Low Code Platform



What is a Low Code Platform?

An LCAP (Low Code Application Platform) is a platform that supports rapid development, automated deployment, and the execution and management of applications, using high-level programming abstractions (either visual or declarative).
These platforms support the development of user interfaces, business logic and data services, and they improve productivity when compared to conventional application platforms.
An enterprise LCAP must support the development of enterprise applications, which require high performance, scalability, high availability, disaster recovery, security, SLAs, resource usage tracking and vendor support, and access to local and cloud services.


Value proposition of the Onesait Platform as a Low Code platform

Onesait Platform stands on top of four pillars, one of which specifically refers to the agile development of solutions:

Thus, the Platform provides capabilities in the four phases of a project: Start-up (architecture, provisioning, etc.), Development, Operation (DevOps) and Change management.
Next, we will look at the differences between approaching these phases with traditional development and with development based on Onesait Platform:


Traditional development

Start-up:
- Choice and testing of technologies.
- Need to provision infrastructure and base software.
- Complexity of sharing environments and infrastructure between different systems.

Development:
- Large amount of code.
- Several teams developing on each layer.
- More difficult reuse.

Operation:
- Effort required to deploy to environments.
- Development environment different from the execution environment.
- Provisioning of infrastructure and base components.
- Difficulty deploying on premise and in the cloud.

Change management:
- Complexity in understanding and evolving code over time.
- Dependence on the specific developer.
- Complexity in modifying the infrastructure.

Development with Onesait Platform

Start-up:
- Automatic provisioning of environments.
- Proven, integrated and ready-to-use technologies.
- Web environment that centralises development and administration.
- Management of several projects and teams in a single environment.

Development:
- Visual development on ready-to-use components.
- Common platform for all layers.
- Reusability between projects.

Operation:
- Automated deployment.
- Independence at deployment level.
- Infrastructure and base components deployed and supported.

Change management:
- More maintainable visual development.
- Less dependency on the developer.
- Automatic scalability.
- Support in development and production, with the ability to prioritise.

Some of Onesait Platform’s Low Code Capabilities

At this point, we are going to detail some of the capabilities offered by Onesait Platform as a Low Code Platform. We will not cover all of them, just those that give an idea of the added value it offers.
Also, remember that Onesait Platform is a Data-Centric platform, which makes it possible to manage the information life cycle by enabling the construction of applications with "data at the core". We will analyse these capabilities following the data processing flow:

In the Persistence Layer

In this layer, the Platform provides capabilities such as multistore modelling, which gives independence from the underlying database, and the import of data from files.

Let’s see some of these capabilities in detail.

Multistore modelling

The Platform can handle, in a virtually transparent way, databases of different types: any relational database with a JDBC driver, NoSQL databases such as MongoDB or Elasticsearch, Big Data repositories such as Hive and Kudu, etc. It can also treat a REST API or a Digital Twin as just another data source.
This abstraction is achieved through the concept of ontology, which makes the model independent from the underlying database, always offering an SQL engine to access it (even when using NoSQL databases such as MongoDB or Elasticsearch).

All this implies that:

  • Entity modelling is done from the Platform’s Control Panel itself: when you create an entity in the Platform (the famous ontologies), the Platform creates the necessary tables or elements in the selected database.
  • The Platform is in charge of provisioning or connecting to the selected databases, thus isolating the development team from this complexity.
  • It also allows connecting to pre-existing databases and creating ontologies from existing tables.
  • The ontology is represented as a JSON-Schema and the instances of the ontologies are JSON documents. This is independent of the selected storage engine, as the Platform takes care of converting to and from this model, as sketched below.
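To make this concrete, here is a minimal sketch of an ontology and one of its instances; the ontology name and fields are invented for the example:

    # Illustrative sketch only: the ontology "TemperatureMeasurement" and its
    # fields are invented for this example.
    import json

    # An ontology is described by a JSON-Schema, independent of the storage engine.
    ontology_schema = {
        "$schema": "http://json-schema.org/draft-04/schema#",
        "title": "TemperatureMeasurement",
        "type": "object",
        "properties": {
            "device": {"type": "string"},
            "value": {"type": "number"},
            "timestamp": {"type": "string", "format": "date-time"}
        },
        "required": ["device", "value", "timestamp"]
    }

    # Each instance is a plain JSON document conforming to that schema; the
    # Platform maps it to a MongoDB document, a relational row, etc.
    instance = {"device": "sensor-01", "value": 21.5,
                "timestamp": "2021-03-01T10:00:00Z"}
    print(json.dumps(instance))

    # Whatever the store, the entity can then be queried with SQL, e.g.:
    #   SELECT device, value FROM TemperatureMeasurement WHERE value > 20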


More information on the concept of Ontologies is available at /wiki/spaces/PT/pages/333316099.

Import from file

The Platform also offers a UI to easily upload a file in JSON, CSV or XML format. The Platform detects the file format, automatically generates the ontology in the selected repository and loads the data into it:
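Conceptually, the inference that the Platform performs on upload is similar to this simplified sketch (this is not the Platform's actual code, just an illustration of the idea):

    # Simplified illustration of schema inference from a CSV file; the data
    # and the ontology name are invented.
    import csv, io, json

    raw = "device,value\nsensor-01,21.5\nsensor-02,19.8\n"
    rows = list(csv.DictReader(io.StringIO(raw)))

    def infer_type(sample):
        """Guess a JSON-Schema type from a sample value."""
        try:
            float(sample)
            return "number"
        except ValueError:
            return "string"

    schema = {
        "title": "ImportedFile",  # hypothetical ontology name
        "type": "object",
        "properties": {k: {"type": infer_type(v)} for k, v in rows[0].items()}
    }
    print(json.dumps(schema, indent=2))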

In the Ingestion Layer

In this layer, the Platform offers several capabilities, among which DataRefiner and DataFlow stand out.

Data Refiner

The objective of this component is to "refine" the information that is loaded or extracted from the Platform. To this end, the tool offers an Excel-like interface that allows:

  • An end-user to load, from a UI, data from various places, such as from their own PC, from a URL or from information stored in the Platform itself.

  • Data to be uploaded in the main formats, including XLS, XLSX, XML, JSON, CSV, etc.

  • The user to work with these data to perform data profiling, including data cleansing, enhancement, restructuring or reconciliation.
  • The "refined" data to be downloaded as files, or uploaded to the Platform as an ontology.


More information on this component is available at /wiki/spaces/PT/pages/1080066100.

Data Flow

This module allows you to create and configure data flows between sources and destinations in a visual way, both for ETL/ELT processes and for streaming flows, including transformations and data quality processes along the way.
Let’s see a couple of examples:

  • Ingestion into Hadoop of the tail of a file, with a field-removal step:

  • Ingestion from a REST endpoint and upload to the Platform’s Semantic DataHub, with a data quality step:


Beyond this:

  • It offers a large number of connectors for specific communications, both inbound and outbound, as well as processors (in the Platform's Developer Portal you can see the full list: /wiki/spaces/PT/pages/116556146). Among the main DataFlow connectors are Big Data connectors (Hadoop, Spark), FTP, files, REST endpoints, JDBC, NoSQL databases, Kafka, Azure cloud services, AWS, Google, etc.
  • All the creation, development, deployment and monitoring of the flows is done from the Platform’s web console (the so-called Control Panel).


And in debug mode:

More information on this component is available at /wiki/spaces/PT/pages/293077588.

In the Analytics Layer

In this third layer, the Platform allows for the creation of AI models from a centralized environment, as well as publishing these models as microservices and APIs. Let's see what this layer offers:

Centralised development of ML and AI models

The module called Notebooks provides data scientists with a multi-user web environment in which they can interactively develop analysis models, in their favourite languages (Python, Spark, R, TensorFlow, Keras, SQL, etc.), on the information stored in the Platform.
The notebooks are defined and managed from the Platform’s own Control Panel, from where you can create, edit and execute your models (notebooks). The image shows an example in Python:

This module lets data scientists create algorithms in various languages depending on their needs and preferences, supporting Python, Spark, R, SparkSQL, SQL, Hive SQL and Shell, among others. Next you can see an example of data exchange between different languages:

Thanks to the REST API that the Platform offers for notebooks, these can be orchestrated and executed from the FlowEngine, allowing execution control within the solution logic:
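Outside the FlowEngine, the same REST API can be called from any client. The sketch below shows the idea; the endpoint path, token and parameter names are hypothetical placeholders, not the Platform's documented API:

    # Hypothetical sketch: the endpoint path, token and parameter names are
    # placeholders, not the Platform's documented API.
    import requests

    PLATFORM = "https://lab.onesaitplatform.com"  # example environment
    NOTEBOOK_ID = "my-notebook-id"                # placeholder identifier

    resp = requests.post(
        f"{PLATFORM}/controlpanel/api/notebooks/run/{NOTEBOOK_ID}",  # hypothetical path
        headers={"Authorization": "Bearer <token>"},                 # placeholder token
        json={"params": {"year": 2021}},                             # example input parameters
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json())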

This allows you to create visualisations of different types, even adding HTML code via Angular to build very sophisticated visualisations. These can be added as gadgets to the Dashboard engine, so that both data analytics and the presentation of results are facilitated, either in a report or exposed as a Platform gadget. An example of data visualisation using graphs could be:

Another example of visualisation, this time of data with a spatial component on cartography:

You can find more information on this module at /wiki/spaces/PT/pages/49807646 in the Developer Portal.

Publication of Notebooks as Microservices/APIs

The Platform allows you to create a Model from a notebook, indicating the input parameters and managing the execution results of each model:

It also allows you to compare the results of a model:

The Platform also allows a model to be published as a microservice: the Platform compiles it, generates the Docker image and deploys it on the infrastructure. In other words, the Platform takes care of all these steps:

In the Publication Layer

This layer offers a rich set of tools which we will see next.

Visual development of business logic

The Platform integrates a component called Flow Engine, which allows the creation of process flows, visually orchestrating pre-created components and customised logic in a simple way.


We offer a large number of connectors, ranging from protocols used at the enterprise level, such as HTTP, Web Services or Kafka, to protocols used in devices, such as MODBUS or Profibus, plus communication with social networks and components to integrate with the Platform’s capabilities.
This component also integrates with the rest of the Platform’s components, allowing you, for example, to schedule the sending of emails, trigger a flow on the arrival of an ontology instance, orchestrate DataFlow pipelines, orchestrate notebooks, etc.

Visual creation of REST APIs

The Platform integrates an API Manager that makes REST APIs available, visually and without any programming, on the elements managed by the Platform.
These APIs are defined and managed from the Platform's Control Panel itself:

For example, this is how an API created on an ontology would be displayed:

Associated with each API, an OpenAPI descriptor is created, which allows the API to be tested and invoked to check its operation.
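To give an idea of what consuming one of these generated APIs looks like, here is a sketch; the API path, version and authentication header are hypothetical placeholders:

    # Hypothetical sketch: the API path, version and auth header are placeholders.
    import requests

    PLATFORM = "https://lab.onesaitplatform.com"  # example environment

    resp = requests.get(
        f"{PLATFORM}/api-manager/server/api/v1/measurements",  # hypothetical API over an ontology
        headers={"X-OP-APIKey": "<user-token>"},               # placeholder header and token
        timeout=30,
    )
    resp.raise_for_status()
    for item in resp.json():  # each item is a JSON instance of the ontology
        print(item)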
In addition, the Platform allows APIs to be created from flows built with the FlowEngine module. The following image shows how to create a REST API to send emails, encapsulating the actual provider:

Visual Creation of Dashboards

The Platform includes a Dashboard module that allows, in a simple way, the generation and visualisation of powerful dashboards on the information managed by the Platform, consumable from different types of devices and with analytical and Data Discovery capabilities.
These Dashboards are built from the Platform Control Panel itself and, based on queries made on the Platform entities (DataSources) and Gadgets, complete dashboards can be built.
A practical example of this can be seen in the following Dashboard on the incidence of COVID-19 in Portugal, created by our colleague Pedro Moura (https://lab.onesaitplatform.com/web/covid19pt/#).

The Platform offers a set of pre-created gadgets for business users, as well as the ability to create new gadgets through the concept of Templates, in which you can incorporate JavaScript libraries and build your own gadgets.

This module allows for the creation of Dashboards on any data managed as an ontology, independently of its repository, and it allows interaction between gadgets of different types, so that analysis and drill-down can be performed on the information in the different gadgets and their exploration dimensions.

Detailed information and examples of this component are available at /wiki/spaces/PT/pages/49807669 in the Developer Portal.

UI development from FIGMA sketches

To finish with the publication layer section, we will talk about a recently added capability that allows you to generate UIs in Vue.js from a FIGMA layout.
This is as easy as binding the FIGMA layout to names in the visual and action elements:

We then import the FIGMA design into the Control Panel, which recognises the bindings and requests additional information for the mapping, such as mapping methods and variables to the APIs defined in the Platform:

With just this, the Platform will generate a fully operational Vue.js application:

This guide explains the process in detail, or, if you prefer, we also have a video example (in Spanish).

Other Utilities

Finally, the Platform offers a set of tools that allow you to speed up development and focus on the business rather than on the necessary technical services.
The bulk of the utilities can be found at /wiki/spaces/PT/pages/1388675154, but we will highlight some of them here.

Centralised log

The Platform integrates Graylog as a solution to centralise the logs of its own modules and of other modules, storing them in an Elasticsearch instance embedded in the Platform, and it offers a UI to exploit this information.

Dropbox-style file management

This utility, called File Repository, allows uploading and sharing all kinds of files, either through the UI or through a REST API.

This utility is integrated with other components, such as the DataFlow, which can access these files.
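As a sketch of what the REST route could look like from a client (the endpoint path and authentication header are hypothetical placeholders):

    # Hypothetical sketch: the upload endpoint and auth header are placeholders.
    import requests

    PLATFORM = "https://lab.onesaitplatform.com"  # example environment

    with open("report.pdf", "rb") as f:
        resp = requests.post(
            f"{PLATFORM}/controlpanel/api/files",        # hypothetical upload endpoint
            headers={"Authorization": "Bearer <token>"}, # placeholder token
            files={"file": f},
            timeout=60,
        )
    resp.raise_for_status()
    print(resp.json())  # e.g. an identifier used to share or retrieve the file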

WebApps Manager

This utility allows you to deploy web applications (a complete HTML5 website: HTML, JS, CSS, images, etc.) from the Control Panel itself, so that the Platform makes the website available through an integrated web server.

Microservice development and deployment

The Platform provides a tool to create and manage the lifecycle of applications and microservices (Spring and others) from the Control Panel, following the standards of the CI/CD cycle: creation, compilation, test execution, generation of Docker images and deployment on a CaaS.

The functionalities provided across this lifecycle are listed below, followed by a sketch of the equivalent manual steps:

  • Creation of microservices.
  • Code download and development.
  • Code compilation and generation of Docker images from Jenkins.
  • Deployment of Docker images on the CaaS platform.
  • Update of a Microservice.
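For context, this is a sketch of the manual steps that the Platform automates; the build tool, image name and registry are generic examples, not the Platform's internals:

    # Manual equivalent of what the Platform automates; names are placeholders.
    import subprocess

    subprocess.run(["mvn", "package"], check=True)  # compile (e.g. a Spring microservice)
    subprocess.run(["docker", "build", "-t",
                    "registry.example.com/my-service:1.0", "."],
                   check=True)                      # generate the Docker image
    subprocess.run(["docker", "push",
                    "registry.example.com/my-service:1.0"],
                   check=True)                      # publish the image
    # The Platform then deploys the image on the CaaS and redeploys it on updates.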


All this can be carried out easily from the management UI: