Concepts

Note

The Stream Connector is deprecated and we recommend using the MQTT connector for inter-task and inter-instance communications.

Data flow

The Stream Connector exchanges data between instances of Dataristix running on different computers and serves live values in JSON format for use in other applications such as Microsoft Excel.

The connector acts as a data sink (output stream) or a data source (input stream). An output stream receives topics, tags, and tag values from tasks, then either serves the data in JSON format or forwards it to the input streams of another Dataristix installation elsewhere on the intranet or Internet.

For example, you may have Dataristix instances running at various sites collecting machine data, each with a task configured to send the data to an output stream. At your headquarters, a Dataristix instance with input streams receives the data from the various sites for reporting.

Output streams may relay topics and tags for a single task or for multiple tasks. An individual input stream must ‘pick’ one task from an output stream and remains bound to that task’s set of topics. This ensures that only one input stream is affected when individual tasks are started or stopped on the sender side, which aids troubleshooting. Topics received by the input stream can be processed just like the topics of other connectors, meaning you can configure local tasks to store received data in databases or otherwise process the data.
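The documentation does not show the exact JSON layout an output stream serves. As an illustration only, assuming a hypothetical payload that groups tags and live values under topic names, received data could be consumed like this:

```python
import json

# Hypothetical example payload; the actual JSON schema served by a
# Dataristix output stream may differ.
payload = json.loads("""
{
  "topics": [
    {
      "name": "Site1.Machine1",
      "tags": [
        {"name": "Temperature", "value": 72.5},
        {"name": "Pressure", "value": 1.8}
      ]
    }
  ]
}
""")

# Flatten topic/tag pairs into "topic.tag" keys for easy lookup.
values = {
    f"{topic['name']}.{tag['name']}": tag["value"]
    for topic in payload["topics"]
    for tag in topic["tags"]
}

print(values["Site1.Machine1.Temperature"])  # 72.5
```

A consuming application such as Excel would typically look up values by topic and tag name in this way.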

../_images/stream-concept1.svg

One output stream per task

../_images/stream-concept2.svg

Multiple tasks sending data via the same output stream

When a task that sends data to an output stream is started, the task’s topic names, tag names, and live values become available to input streams elsewhere. On the receiving side, matching topics and tags are created for the input stream to receive the live values sent by the output stream. Should there be any changes on the output stream side (for example, tags may be added or removed), the input stream can be kept in sync via a ‘Refreshed from Source’ function. During the refresh, with the task sending data to the output stream running, the input stream adjusts to the new configuration. Errors may be reported if the input stream is out of sync, for example if data is received for an unexpected topic or tag.
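The out-of-sync condition amounts to a set comparison: data arriving for a tag the input stream does not know about indicates a refresh is needed. A minimal sketch (the function and tag names are illustrative, not part of the Dataristix API):

```python
def find_unexpected_tags(configured_tags, received_tags):
    """Return tags present in received data but missing from the
    input stream's configuration (these would trigger sync errors)."""
    return sorted(set(received_tags) - set(configured_tags))

# After a tag was added on the output stream side but before a refresh:
configured = {"Temperature", "Pressure"}
received = {"Temperature", "Pressure", "FlowRate"}

print(find_unexpected_tags(configured, received))  # ['FlowRate']
```

Running the refresh with the sending task active resolves the mismatch by recreating the topic and tag configuration on the receiving side.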

Client or Server

Input streams and output streams exchange data over binary WebSocket connections (TCP/IP) at a configurable URL. One stream acts as the server, awaiting connections; the other acts as the client, initiating connections.

While it is more intuitive to think of output streams as servers and input streams as clients, it is possible to reverse these roles. This may be preferable where, for example, data is produced in a factory-floor system for relay to a business system. With an output stream configured as “client”, the factory-floor system only needs to open an outbound port in its firewall to send data to an input stream “server” on the business system side.
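The role reversal only changes which side opens the connection; the direction of data flow stays the same. This can be illustrated with a plain TCP socket (this is not the Dataristix wire protocol, just a sketch of connection direction versus data direction):

```python
import socket
import threading

# The "input stream" side acts as the server: it listens for a
# connection but *receives* data once connected.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # ephemeral port
server.listen(1)
port = server.getsockname()[1]

received = []

def input_stream_server():
    conn, _ = server.accept()
    received.append(conn.recv(1024))
    conn.close()

t = threading.Thread(target=input_stream_server)
t.start()

# The "output stream" side acts as the client: it initiates the
# outbound connection (only an outbound firewall rule is needed)
# and *sends* data.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b'{"Temperature": 72.5}')
client.close()

t.join()
server.close()
print(received[0])  # b'{"Temperature": 72.5}'
```

Here the sender connects outward while the receiver listens, mirroring the “output stream as client” arrangement described above.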

If any streams operate in server mode, the local stream service requires configuration as described in the next section.