This module allows you to visually create and configure data flows between sources and destinations, for both ETL/ELT processes and streaming flows, including transformations and data-quality steps within those flows.
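To make the idea concrete, here is a minimal, hypothetical sketch (not the platform's actual API) of what such a flow does conceptually: records move from a source through a transformation step and a data-quality filter before reaching a destination. All names (`transform`, `quality_check`, `run_pipeline`, the sample records) are illustrative assumptions.

```python
def transform(record):
    # Illustrative transformation step: normalize field formats.
    return {
        "sensor": record["sensor"].strip().lower(),
        "value": float(record["value"]),
    }

def quality_check(record):
    # Illustrative data-quality rule: keep only plausible readings.
    return 0.0 <= record["value"] <= 100.0

def run_pipeline(source):
    # Apply the transformation, then drop records that fail the quality rule.
    for raw in source:
        rec = transform(raw)
        if quality_check(rec):
            yield rec

source = [
    {"sensor": " Temp-01 ", "value": "21.5"},
    {"sensor": "temp-02", "value": "250"},  # fails the quality check
]
clean = list(run_pipeline(source))
print(clean)  # [{'sensor': 'temp-01', 'value': 21.5}]
```

In the visual editor, each of these steps (source, processor, destination) would be a configurable node rather than hand-written code.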
Let's see a couple of examples:
It also offers a large number of input and output connectors, as well as processors (you can see all the connectors in the Platform Developer Portal: http://bit.ly/2rwWZ1N ).
The main DataFlow connectors include Big Data connectors for Hadoop and Spark, as well as FTP, files, REST endpoints, JDBC, NoSQL databases, Kafka, and cloud services from Azure, AWS and Google, among others.
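As a rough illustration of the kind of flow these connectors enable, the sketch below loads a file-style source into a relational destination, in the spirit of a Files-to-JDBC flow. This is an assumption-laden stand-in, not platform code: the inline CSV text stands in for a source file, and Python's `sqlite3` stands in for a JDBC-accessible database.

```python
import csv
import io
import sqlite3

# Stand-in for a source file picked up by a file connector.
csv_text = "id,city\n1,Madrid\n2,Sevilla\n"

# Stand-in for a JDBC destination (an in-memory SQLite database).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cities (id INTEGER, city TEXT)")

# Extract and load: parse the source rows and insert them into the table.
reader = csv.DictReader(io.StringIO(csv_text))
rows = [(int(r["id"]), r["city"]) for r in reader]
conn.executemany("INSERT INTO cities VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM cities").fetchone()[0]
print(count)  # 2
```

In the actual module, the source and destination would simply be configured as connector nodes; the point here is only the shape of the data movement.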
All flow creation, development, deployment and monitoring is performed from the platform's web console (ControlPanel):
List of DataFlows as seen by a user with the Administrator role (who can see other users' flows):
It is fully integrated with the main Big Data technologies, including HDFS, Hive, Spark, SparkSQL, Kafka, ..., allowing you to work with them easily and centrally:
It also includes connectors in areas such as IoT (OPC, CoAP, MQTT) and social networks, among others.