Edge computing is commonly defined as computing performed 'near the source of the data,' or at the edge of the network.
This is in reference to the Industrial Internet of Things (IIoT), of course; what isn't these days? But with it come various services and potential cost reductions. So, what is edge computing, really?
Any PLC can be an edge controller if all you are looking for is normal PLC-type actions. In the OEM field this may not mean very much, but, in the end-user world, it can make a big difference.
As an OEM, you may be required to be “edge,” but that’s for another time.
I have been introduced to two new edge-type controllers in the past month—the groov Edge Programmable Industrial Controller (EPIC) from Opto 22 and the Open Secure Automation (OSA) Remote from Bedrock Automation.
They both speak to the advantages of being on the edge, which, to me, means the last node on whatever network you're connected to—the outer edge.
In some cases, the outer edge is a remote RTU site, which both products aim to address; the EPIC also targets larger PLC-based applications, such as machine control, at the edge.
The groov EPIC is different from a standard automation controller in a few ways. EPIC collects, processes and visualizes local and remote data, and it shares that data and provides remote connectivity to monitor all of the above.
The hardware looks very impressive and is Class I, Div. 2-compliant, which is important in many industries. It has the full array of I/O modules with common wiring arms for all. The big difference here is the built-in LCD touchscreen running Ignition Edge from Inductive Automation or groov View. A full-fledged HMI/SCADA application built-in is very cool. Since it is an open platform, any Java-based application can run in the system.
With Ignition Edge, you have access to multiple drivers, so you can bring network-based data from different sources onto the same screen.
EPIC can use the open-standard MQTT publish/subscribe (P/S) model of data handling, in addition to the producer/consumer model. The cloud uses P/S, so the system can and does integrate with the cloud seamlessly.
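To make the publish/subscribe idea concrete, here is a minimal sketch using the open-source Eclipse paho-mqtt Python client (1.x API assumed); the broker address, client ID and topic are hypothetical placeholders, not anything specific to EPIC or groov.

```python
# Minimal MQTT publish sketch (assumes the Eclipse paho-mqtt 1.x Python client).
# Broker address, client ID and topic below are hypothetical placeholders.
import json
import time

import paho.mqtt.client as mqtt

BROKER = "broker.example.com"        # hypothetical broker or cloud endpoint
TOPIC = "plant/line1/temperature"    # hypothetical topic

client = mqtt.Client(client_id="edge-node-01")
client.connect(BROKER, port=1883, keepalive=60)
client.loop_start()

# Publish a small JSON payload; any subscriber to the topic receives it
# without ever polling the device.
payload = json.dumps({"value": 72.4, "units": "degF", "ts": time.time()})
client.publish(TOPIC, payload, qos=1)

client.loop_stop()
client.disconnect()
```

The point is that the device pushes data when it has something to say, rather than waiting to be polled, which is what makes the model a natural fit for cloud services.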
For those bit bangers who want to use scripting, it is available, as well as full root access to the Linux operating system through a secure-shell (SSH) portal. Proprietary applications can be developed and run on EPIC, which may be attractive to some original equipment manufacturers (OEMs) and system integrators (SIs).
The system comes with complete diagnostics, including the ability to display wiring diagrams and system configuration, under the groov Manage application.
The only drawback is the use of flowcharting for its main programming platform. Support for IEC 61131-3 is coming, but not just yet. And testing the EPIC in a small application shouldn't be too taxing despite the flowcharting issue, but that's just my personal opinion.
All in all, EPIC is a leap from normal automation controllers and PLCs. It’s worth your time to investigate.
Enter Bedrock Automation's OSA Remote PLC/remote terminal unit (RTU)/edge controller. This controller boasts some of the same hardware specifications as its big brother, including a metal enclosure for high-energy pulse protection (EMP immunity). The I/O is universal and can be configured for analog, digital and/or protocols such as HART, EtherNet/IP or Modbus TCP.
Bedrock believes that having a lower-cost, intrinsically secure device at the edge with smaller I/O counts will aid in the deployment of edge devices instead of standard PLCs and RTU devices.
Its SCADA connectivity is handled by embedded OPC UA. MQTT P/S and data distribution service (DDS) support for cloud-based services will be forthcoming later in 2018.
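For a feel of what that OPC UA connectivity looks like from the SCADA or historian side, here is a minimal read sketch using the open-source python-opcua client; the endpoint URL and node ID are hypothetical placeholders rather than Bedrock-specific values.

```python
# Minimal OPC UA read sketch (assumes the open-source python-opcua package).
# The endpoint URL and node ID below are hypothetical placeholders.
from opcua import Client

ENDPOINT = "opc.tcp://192.168.1.50:4840"   # hypothetical controller endpoint
NODE_ID = "ns=2;s=Pump1.Pressure"          # hypothetical tag node

client = Client(ENDPOINT)
try:
    client.connect()
    node = client.get_node(NODE_ID)
    print("Current value:", node.get_value())
finally:
    client.disconnect()
```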
What is cool about the device is its size. At 45 square inches and 2.3 inches deep, it can fit just about anywhere. The temperature range of -40 to +80 °C allows it to be used in some extreme environments. It will use Bedrock’s standard IEC 61131 software platform for configuration, which is free to all.
So, can a normal PLC be an edge controller? If the MQTT protocol and the P/S model are in your future, then probably not. MQTT is a very lean protocol, designed to move small pieces of data over small transfer pipes. It is very compact and is used by all kinds of devices, such as wireless temperature sensors.
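As a rough illustration of how lean that exchange can be, here is the subscriber side of the earlier publish sketch, again using the paho-mqtt 1.x client with hypothetical broker and topic names; each message carries little more than a short topic string and a few bytes of payload.

```python
# Minimal MQTT subscribe sketch (paho-mqtt 1.x assumed); pairs with the
# earlier publish example. Broker address and topic filter are hypothetical.
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"
TOPIC_FILTER = "plant/+/temperature"   # '+' matches any single topic level

def on_connect(client, userdata, flags, rc):
    # Subscribe once the broker acknowledges the connection.
    client.subscribe(TOPIC_FILTER, qos=1)

def on_message(client, userdata, msg):
    # The payload is just the few bytes the publisher sent.
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.loop_forever()
```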
Do we need it? Yes, we do, since we want data everywhere at any time on any device. The cloud will do this for us, as long as we do things securely. I still subscribe to the one-door-to-the-floor mindset to control access and authenticate users, and the edge brings more complexity to that scenario, which has to be managed. However, MQTT can bring simplicity to edge deployments, if properly applied.
In a lot of cases, the edge requires more security, not less. Be aware.