FlowMaker Basics
FlowMaker orchestrates industrial data pipelines as directed graphs of flow boxes. A run starts in the Launcher, is validated and resolved into a full RunRequest, and then executes in the stateless Runner over persistent ZeroMQ worker connections.
[INFO!hub/RUN START GUARANTEE] Flow execution only begins after node roles, versions, and IO contracts are resolved; most structural mistakes are rejected before runtime traffic starts.
In this section
- Core Concepts: Sources, Pipes, and Sinks
- Inputs and Outputs
- Backpressure and Data Flow Control
- DataCatalog Integration, Source Connections, and Payload Conventions
Quick Overview
[NOTE!api/THREE NODE ROLES] Every worker implements exactly one role: Source (ingests data), Pipe (transforms data), or Sink (persists/outputs data). This separation keeps workers single-purpose and reusable across pipelines.
[NOTE!link/CONTRACT-FIRST DESIGN] Inputs and outputs are declared upfront as typed ports. Connections between nodes are validated before any data flows, catching mismatches during deployment rather than at runtime.
[NOTE!speed/FLOW CONTROL] Workers block until downstream consumers are ready. This natural backpressure prevents memory bloat and keeps long-running industrial pipelines stable under variable load.
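The blocking behavior described above can be sketched as a bounded channel: the producer awaits whenever the buffer is full, and a consuming step frees a slot and wakes one blocked producer. This is a minimal illustration of the backpressure idea, not the FlowMaker SDK API; the `BoundedChannel` name and its methods are assumptions for this sketch.

```typescript
// Hypothetical sketch of backpressure via a bounded async channel.
// NOT the real FlowMaker API; names and shapes are assumed.
class BoundedChannel<T> {
  private buffer: T[] = [];
  private waiters: (() => void)[] = [];
  constructor(private capacity: number) {}

  // Awaits while the buffer is full: upstream naturally slows to
  // match downstream, so memory use stays bounded.
  async send(item: T): Promise<void> {
    while (this.buffer.length >= this.capacity) {
      await new Promise<void>((resolve) => this.waiters.push(resolve));
    }
    this.buffer.push(item);
  }

  // Consuming an item frees a slot and wakes one blocked producer.
  receive(): T | undefined {
    const item = this.buffer.shift();
    const wake = this.waiters.shift();
    if (wake) wake();
    return item;
  }
}
```

The key design point is that the producer never drops or buffers unboundedly; it simply waits, which is what keeps long-running pipelines stable under variable load.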
[NOTE!database/CENTRALIZED CONFIG] Database credentials, API endpoints, and connection parameters live in the DataCatalog as source connections. Workers reference them by ID, keeping secrets out of code and enabling environment-specific overrides.
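Putting the three callouts above together, a box registration might look roughly like the following. This is a hedged sketch only: the `BoxRegistrationInfo` shape, field names, and the `sourceConnectionId` mechanism are assumptions for illustration, not the actual contents of `flowbox-registration-info.ts`.

```typescript
// Hypothetical registration shape (names assumed, not the real SDK):
// a single-role box declares typed ports up front and references a
// DataCatalog source connection by ID instead of embedding credentials.
interface BoxRegistrationInfo {
  role: "source" | "pipe" | "sink";        // exactly one role per worker
  inputs: { name: string; dataType: string }[];
  outputs: { name: string; dataType: string }[];
  sourceConnectionId?: string;             // resolved from the DataCatalog
}

const plcIngest: BoxRegistrationInfo = {
  role: "source",
  inputs: [],                               // sources ingest from outside the graph
  outputs: [{ name: "counters", dataType: "MachineCounters" }],
  sourceConnectionId: "plc-line-4",         // credentials live in the DataCatalog
};
```

Because ports are declared as typed contracts, a connection to an incompatible input can be rejected at deployment time, and because only a connection ID appears here, the same box can run against different PLCs per environment without code changes.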
Mental model (end-to-end)
- Design a graph (FlowDef) with nodes and connections.
- Resolve each node to a registered box and live worker.
- Initialize each worker with runtime context + connected IO.
- Route data from source outputs to destination inputs.
- Control throughput through blocking acknowledgements (backpressure).
- Stop/complete with observable run state transitions.
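The first two steps of this model can be sketched in code: a minimal `FlowDef` with nodes and connections, plus a pre-runtime validation pass that rejects structural mistakes before any data flows. The type shapes and the `validateFlow` helper are hypothetical, chosen only to make the design-then-validate step concrete.

```typescript
// Hypothetical sketch of a FlowDef and its pre-runtime validation.
// Shapes are assumed for illustration, not taken from the SDK.
type Role = "source" | "pipe" | "sink";

interface PortDef { name: string; dataType: string }

interface NodeDef {
  id: string;
  role: Role;
  inputs: PortDef[];
  outputs: PortDef[];
}

// A connection endpoint is a [nodeId, portName] pair.
interface Connection { from: [string, string]; to: [string, string] }

interface FlowDef { nodes: NodeDef[]; connections: Connection[] }

// Reject unknown ports and type mismatches before runtime traffic starts.
function validateFlow(def: FlowDef): string[] {
  const errors: string[] = [];
  const byId = new Map(def.nodes.map((n) => [n.id, n] as const));
  for (const c of def.connections) {
    const out = byId.get(c.from[0])?.outputs.find((p) => p.name === c.from[1]);
    const inp = byId.get(c.to[0])?.inputs.find((p) => p.name === c.to[1]);
    if (!out || !inp) {
      errors.push(`unknown port in connection ${JSON.stringify(c)}`);
    } else if (out.dataType !== inp.dataType) {
      errors.push(`type mismatch: ${out.dataType} -> ${inp.dataType}`);
    }
  }
  return errors;
}
```

Running this kind of check during resolution is what backs the run-start guarantee: a flow with a dangling or mistyped connection never reaches the Runner.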
Real-world baseline example
A packaging line flow can be modeled as:
- Source: PLC ingestion box emits machine counters and alarm states.
- Pipe: normalization box converts vendor-specific payloads into a common schema.
- Pipe: quality rule box adds OEE/defect metrics.
- Sink: historian sink stores time series; alert sink pushes anomalies to operations.
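The packaging-line graph above has a useful shape worth making explicit: a linear source-to-pipe chain that fans out to two sinks. A rough sketch of that topology (node IDs are invented for this example, not real box names):

```typescript
// Hypothetical topology sketch of the packaging line flow.
// Node IDs are invented for illustration.
const packagingLine = {
  nodes: [
    { id: "plc-ingest", role: "source" },    // emits counters and alarm states
    { id: "normalize", role: "pipe" },       // vendor payloads -> common schema
    { id: "quality-rules", role: "pipe" },   // adds OEE/defect metrics
    { id: "historian", role: "sink" },       // stores time series
    { id: "alerts", role: "sink" },          // pushes anomalies to operations
  ],
  // [fromNodeId, toNodeId] pairs; quality-rules fans out to both sinks.
  connections: [
    ["plc-ingest", "normalize"],
    ["normalize", "quality-rules"],
    ["quality-rules", "historian"],
    ["quality-rules", "alerts"],
  ],
};
```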
[NOTE!account_tree/TRANSFERABLE PATTERN] When onboarding a new domain, keep the graph shape and control loop intact first, then specialize only box implementations and options.
Where this comes from
- readme.md
- platform-backend/docs/runtime-v2.md
- sdk/worker/repo-management/README.md
- sdk/typescript/src/flowbox-registration-info.ts
- sdk/typescript/src/flowbox-init-params.ts