Developing with Operators


The EnOS Stream Processing Service provides a set of packaged operators that developers can use to build customized stream data processing jobs for complex business scenarios. Developers can also upload and install custom operator libraries for developing customized stream processing pipelines.

Overview

EnOS Stream Processing Service provides a user-friendly drag-and-drop UI for designing data processing pipelines. You can quickly configure a pipeline by adding operators (stages) to it, completing data ingestion, filtering, processing, and storage tasks without any programming.

A data processing pipeline usually consists of multiple stages connected by arrows that define the data stream, through which the data flows sequentially. Each stage performs a read, write, or processing operation on the data, and together the stages form a stream data processing job. A pipeline can include the following stages (a minimal sketch of this model follows the list).

  • Source

    The stage that specifies the data source and passes the output data to later stages, such as the Kafka Consumer stage.

  • Processor

    The stage for data conversion, where the input data is normalized or changed (for example, data filtering, transformation, or calculation).

  • Destination

    The stage for storing the processed data in the target storage or sending data for further processing.
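
EnOS pipelines are assembled in the drag-and-drop UI rather than in code, but the Source → Processor → Destination model can be illustrated with a short sketch. The classes and names below (Source, Processor, Destination, the sample records) are hypothetical and are not part of any EnOS SDK; they only show how records flow sequentially through the stages.

    # Hypothetical sketch of the Source -> Processor -> Destination model.
    # These classes are illustrative only and do not belong to the EnOS SDK.
    from typing import Callable, Iterable, Iterator, Optional


    class Source:
        """Produces records, e.g. what a Kafka Consumer stage would do."""
        def __init__(self, records: Iterable[dict]):
            self._records = records

        def read(self) -> Iterator[dict]:
            yield from self._records


    class Processor:
        """Normalizes or transforms each record (filter, transform, calculate)."""
        def __init__(self, fn: Callable[[dict], Optional[dict]]):
            self._fn = fn

        def process(self, records: Iterator[dict]) -> Iterator[dict]:
            for record in records:
                result = self._fn(record)
                if result is not None:  # None means the record was filtered out
                    yield result


    class Destination:
        """Writes the processed records to the target storage."""
        def write(self, records: Iterator[dict]) -> None:
            for record in records:
                print("stored:", record)


    # Wire the stages together: data flows Source -> Processor -> Destination.
    source = Source([{"temp": 21.5}, {"temp": -999.0}, {"temp": 23.0}])
    drop_invalid = Processor(lambda r: r if r["temp"] > -100 else None)
    Destination().write(drop_invalid.process(source.read()))

Running the sketch prints only the two valid records, mirroring how a processor stage can filter data out of the stream before it reaches the destination.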


EnOS Stream Processing Service supports developing general pipelines and advanced pipelines. For detailed information, refer to the following topics: