What is the core concept of edge computing? It's to establish data connectivity and filtering between the devices at the network edge that generate big data, and the systems in the cloud that need to consume that data for machine learning, predictive analytics, and so on.
In several previous posts we discussed the EdgeX project from The Linux Foundation. We saw how its microservices are designed to collect data from industrial devices and systems and then move that data from the local network to the cloud.
In this post we’ll focus on how that final step of transferring real-world data from our industrial assets to the cloud actually happens—through the Export Services layer of EdgeX.
The EdgeX platform is meant to operate independently from other systems when required. This is what we want, for example, in SCADA networks with intermittent or low-availability connectivity, because data from field devices can be buffered locally until network connectivity is reestablished.
The EdgeX framework is fully capable of operating autonomously when disconnected from cloud-based management applications (at one end), or sensors and devices (at the other).
The Export Services layer provides a set of microservices that allow clients to register for data of interest coming from devices and systems attached to EdgeX. Those microservices also dictate where and when the data should be delivered.
For example, these microservices may send a temperature value from a sensor every five minutes to a REST address in whatever format the application that needs that information requires—like JSON data in compressed form for a predictive analytics application. The data comes from the core microservices of EdgeX.
As data is sent by sensors and devices up through the device services and into core data, that data is then funneled to an export facility.
The export facility comprises two microservices:
- Client Registration—Allows clients to register for data of interest.
- Export Distro—Filters and transforms the data as needed and then transfers the data to the registered clients.
The Export Services layer can then deliver the data to clients residing on the same platform where the EdgeX framework runs, or on other systems:
- Local clients might be analytics services, critical event processors, or rules engine services. Local services often perform some type of command response or control on the attached devices, based on the data they're sending up to the EdgeX framework.
- Cloud-based clients running on other systems might be hosted services or cloud-based applications. Cloud-based systems are often the historical repositories for data and may also provide deeper analytics capabilities.
The Client Registration microservice offers a simple REST API to allow clients to establish new requests, update existing requests, and remove requests for core data. While there is no user interface directly associated with this service, UI consoles and other system management systems can easily provide this capability using the APIs provided by the EdgeX framework.
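As a concrete sketch, the registration request can be built and posted with nothing but the standard library. The port (48071), path (`/api/v1/registration`), and field names below follow the defaults of early EdgeX releases and are assumptions here; check them against your own deployment before use.

```python
import json
from urllib import request

# Assumed default endpoint of the Client Registration microservice
# in early EdgeX releases; verify against your deployment.
EXPORT_CLIENT_URL = "http://localhost:48071/api/v1/registration"

def build_registration(name, address, port, path):
    """Build a minimal registration body for a REST destination."""
    return {
        "name": name,
        "addressable": {            # where Export Distro should push data
            "name": name + "-addr",
            "protocol": "HTTP",
            "address": address,
            "port": port,
            "path": path,
        },
        "format": "JSON",           # or "XML"
        "destination": "REST_ENDPOINT",
        "enable": True,
    }

def register(body):
    """POST the registration to the export-client service."""
    req = request.Request(
        EXPORT_CLIENT_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

body = build_registration("analytics-client", "192.168.1.50", 8080, "/edgedata")
print(json.dumps(body, indent=2))
# register(body)  # uncomment with a running export-client service
```

The same API can later be called with PUT or DELETE to update or remove the registration.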
The registration service allows clients to provide several details about what data they may want, how data is formatted, and where to deliver the data. Let's take a closer look at those.
What Data To Request
The "what" is about designating filters for the data—weeding out what the client doesn't want and sending only data of interest to this client. Filtering is important because by default, all data collected by the gateway that passes through the core data microservices is sent to each client.
Two filters can be set up on data coming from core microservices:
- Filter by device id or device name. A client will get data only from specific devices or sensors if a collection of device ids or names is specified with the client registration.
- Filter by value descriptor id or value descriptor name. A client will get only data that's tagged with specific value descriptors if a collection of value descriptor ids or names is specified with a client registration. Value descriptors describe the type and makeup of data. For example, a value descriptor called "temperature" probably describes thermostat or other temperature reading data.
When no filters are supplied with the registration, all data is forwarded to the client. So filtering provides some real benefits: it avoids unnecessary network traffic, improves data security, and makes processing data easier for the client by providing only what the client needs.
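The filtering logic described above can be sketched in a few lines. The reading field names (`device`, `name`) are modeled loosely on EdgeX core-data readings and are illustrative, not an exact reproduction of Export Distro's internals.

```python
# Sketch of per-registration filtering: if a registration lists device
# names or value-descriptor names, only matching readings pass through;
# an empty filter passes everything, as EdgeX does by default.

def passes_filter(reading, device_filter=(), descriptor_filter=()):
    """Return True if a reading survives the registration's filters."""
    if device_filter and reading["device"] not in device_filter:
        return False
    if descriptor_filter and reading["name"] not in descriptor_filter:
        return False
    return True

readings = [
    {"device": "thermostat-01", "name": "temperature", "value": "72"},
    {"device": "pump-02", "name": "pressure", "value": "31"},
]

# A client registered with a "temperature" value-descriptor filter
# receives only the thermostat reading.
wanted = [r for r in readings
          if passes_filter(r, descriptor_filter={"temperature"})]
print(wanted)
```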
How Data Is Formatted
How the data is delivered to clients is also dictated by the client registration. Clients can currently request that data be sent in a particular format, that it be encrypted, and/or that it be compressed. Here are the options in EdgeX today:
- Format—JSON or XML
- Encryption—No encryption or AES
- Compression—No compression, GZIP, or ZIP
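The format and compression steps can be illustrated with the standard library alone. This is a minimal sketch of the JSON + GZIP combination; AES encryption would be applied in the same pipeline but is omitted here because it requires a third-party crypto library.

```python
import gzip
import json

# A sample reading as it might leave core data (illustrative fields).
reading = {"device": "thermostat-01", "name": "temperature", "value": "72"}

payload = json.dumps(reading).encode("utf-8")   # Format: JSON
compressed = gzip.compress(payload)             # Compression: GZIP

# The receiving client reverses the steps to recover the reading.
restored = json.loads(gzip.decompress(compressed))
print(restored == reading)  # True
```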
Where To Deliver Data
EdgeX clients can also ask that data be pushed to an endpoint of their choice. Details about the endpoint (URL, credentials, etc.) must be provided when registering. Today, EdgeX supports two types of destinations: push to a REST endpoint or publish to a designated MQTT broker topic. (Learn more about MQTT.)
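For the MQTT case, the broker details ride along in the registration's addressable. The field names and the `MQTT_TOPIC` destination value below follow the early EdgeX export-client API as best understood; treat them as assumptions and confirm against your release's API reference.

```json
{
  "name": "mqtt-analytics-client",
  "addressable": {
    "name": "mqtt-broker-addr",
    "protocol": "TCP",
    "address": "broker.example.com",
    "port": 1883,
    "publisher": "EdgeXExportPublisher",
    "topic": "edgex/temperature",
    "user": "exporter",
    "password": "secret"
  },
  "format": "JSON",
  "destination": "MQTT_TOPIC",
  "enable": true
}
```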
So that's an overview of the microservices EdgeX uses to complete the transfer of data from the network edge to the cloud, as well as an introduction to some of the data filtering capabilities EdgeX offers.
With its base in open-source technologies and its support by The Linux Foundation, EdgeX is poised to provide a solid framework for the industrial Internet of Things (IIoT). Whether you're already working on an IIoT application or just want to be informed for the future, you'll want to keep an eye on EdgeX Foundry and framework development.
To get more updates on EdgeX as they become available, subscribe to the Opto 22 blog.