Agentic AI needs SDA to bring a faster pace and more innovation to the world of automation
Davy Demeyer is founder of Acceleer, a Belgian company specializing in collaborative design specifications, code generation and automated deployment to test, staging and production. Demeyer has been talking about software-defined automation (SDA) for years. He launched Acceleer in 2024 specifically to help speed up the transition to SDA.
What is the primary focus of software-defined automation (SDA)?
Davy Demeyer, founder, Acceleer: The two main targets are decoupling the software runtimes from the hardware, which is something that, for example, Open Process Automation (OPA) accomplishes, and making applications and data manageable through software instead of through graphical user interfaces (GUIs).
Most of the focus today is on the former, but the biggest gains will come from the latter, especially since we're moving to a world where artificial-intelligence (AI) agents will help us manage both the engineering and production workflows.
Graphical user interfaces are not a hard limit for AI agents, but they will simply be too slow, so there will be a tendency to select applications that perform well in automated workflows.
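To make that concrete, here is a minimal sketch of what GUI-free management could look like: a deployment described as plain data and pushed through an HTTP API using Python's requests library. The endpoint, payload fields and names are hypothetical, not any particular vendor's interface.

```python
# Minimal sketch of GUI-free application management: the deployment is plain
# data pushed to an HTTP API. Endpoint, fields and names are hypothetical.
import requests

deployment = {
    "controller": "line-3-plc",        # hypothetical target runtime
    "application": "filling-machine",  # hypothetical application name
    "version": "2.4.1",
    "environment": "staging",
}

# An AI agent or a CI pipeline can make this call just as easily as an engineer.
response = requests.post(
    "https://automation.example.com/api/v1/deployments",  # hypothetical endpoint
    json=deployment,
    timeout=10,
)
response.raise_for_status()
print("Deployment accepted:", response.json())
```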
What are the primary benefits of software-defined automation?
Davy Demeyer, founder, Acceleer: The main benefits are speed, scalability and flexibility.
For scalability, it is not only about the scalability of the automation layer itself, but also about the applications that build on the automation layer, like data analytics, digital twins and the manufacturing execution system (MES).
One example benefit is that it will be very easy to quickly spin up testing and staging environments that are almost identical to the production environment, as sketched below.
Another example is that, in a big organization, it will be much easier for the global engineering team to get a quick overview of all deployed control systems worldwide, even if the organization spans hundreds of plants.
A final example is that it will allow engineering workflows to be compressed and time to market shortened.
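Returning to the staging and testing example above, the following sketch uses the Docker SDK for Python to start a staging copy of a containerized control stack; the image names and settings are hypothetical placeholders.

```python
# Minimal sketch of spinning up a staging copy of a production control stack
# with the Docker SDK for Python (pip install docker). Image names and
# environment variables are hypothetical placeholders.
import docker

client = docker.from_env()

services = {
    "plc-runtime": "example/virtual-plc:1.8",  # hypothetical soft-PLC image
    "historian": "example/historian:5.2",      # hypothetical historian image
}

for service, image in services.items():
    client.containers.run(
        image,
        name=f"staging-{service}",
        detach=True,
        environment={"ENVIRONMENT": "staging"},  # same config keys as production
    )

print("Staging environment started")
```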
How does software-defined automation figure in the convergence of IT and OT?
Davy Demeyer, founder, Acceleer: The meaning of SDA that's most connected to IT-OT convergence is linking data between the two worlds. There is another meaning: bringing the benefits and best practices we are used to in IT to our world of OT, while still taking into account the specific environments and requirements we have in OT.
These benefits are the same—speed, scalability and flexibility.
Which standards and protocols will be affected most or increase/decrease in use because of software-defined automation?
Davy Demeyer, founder, Acceleer: A general concern and risk for many of the industrial standards is that they are not openly available. SDA will bring a faster pace and more innovation to the world of automation, and agentic AI will increasingly be used to keep up with that pace. Agentic AI, in turn, needs SDA to work efficiently.
It is already becoming clear that standards that are not known by the main language models don't get recommended. This is something the main standards organizations in our field—the International Society of Automation (ISA) and the International Electrotechnical Commission (IEC), but also newer organizations such as the Open Process Automation Forum (OPAF)—need to start thinking about.
Staying with the theme of agentic AI, one of the standards that has a high chance of wrapping all our industrial engineering and production applications is the recently proposed Model Context Protocol (MCP).
According to the MCP website introduction:
“MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.”
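To give a feel for what wrapping an industrial application in MCP could look like, here is a minimal server sketch that assumes the official Python MCP SDK and its FastMCP helper; the tool, tag names and values are made up for illustration.

```python
# Minimal MCP server sketch exposing a plant data source as a tool an AI agent
# can call. Assumes the Python MCP SDK (pip install mcp); tags are hypothetical.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("plant-historian")

@mcp.tool()
def read_tag(tag_name: str) -> float:
    """Return the current value of a process tag (dummy values for illustration)."""
    dummy_values = {"TIC-101.PV": 72.4, "FIC-205.SP": 13.0}
    return dummy_values.get(tag_name, 0.0)

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio so an MCP client can connect
```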
A similar idea, but without the explicit AI focus, is the proposal for a common API by CESMII.
Another set of standards that will probably see increased use is those defined as OPC UA nodesets.
Many of them are still not very commonly used, but the combination of SDA, agentic AI and network effects will probably increase their adoption.
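As one hedged illustration, an OPC UA server built with the Python asyncua package can load such a nodeset directly; the endpoint URL and nodeset file name below are assumptions made for the sketch.

```python
# Minimal sketch: an OPC UA server that imports a companion-specification
# nodeset, using the asyncua package (pip install asyncua). The endpoint URL
# and nodeset file name are assumptions for illustration.
import asyncio
from asyncua import Server

async def main():
    server = Server()
    await server.init()
    server.set_endpoint("opc.tcp://0.0.0.0:4840/sda-demo/")
    # Load an information model defined as an OPC UA nodeset XML file
    await server.import_xml("Opc.Ua.Machinery.NodeSet2.xml")
    async with server:
        await asyncio.sleep(3600)  # keep the server running for an hour

asyncio.run(main())
```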
When mentioning OPC UA, we should also mention message queuing telemetry transport (MQTT), which seems to be here to stay.
In process automation, two interesting standards are Data Exchange in the Process Industry (DEXPI), because piping and instrumentation diagrams (P&IDs) are the source of our engineering workflows, and Module Type Package (MTP), because integrating individual machines and skids into bigger control or distributed control systems is one of the key bottlenecks to be resolved.
Of course, ISA-95 and ISA-88 are here to stay, and ISA-106 is probably picking up more popularity for continuous processes.
I should also mention the Asset Administration Shell (AAS), because some end users see it as part of the solution for an SDA future.
For version control, everything will probably move to Git, simply because it's the de facto standard in software development, and SDA will build on the tooling and workflows we have in the software world.
Also worth noting are Linux containers running on Docker/Kubernetes; we already see organizations making vendor decisions based on this capability.
Which components will see the biggest impact from software-defined automation?
Davy Demeyer, founder, Acceleer: Definitely the control modules/CPUs will be impacted. Controllers that are not hardware-agnostic will quickly disappear. Virtual runtimes and programmable logic controllers (PLCs) will run on open hardware. In OPA, these are called distributed control nodes (DCNs). There might even be a trend to run virtual PLCs on classical server hardware. In any case, everything becomes very flexible.
For the integrated development environments (IDEs) used to program PLCs, we will see a shift to IDEs built around the code, instead of the code being locked away inside the IDE. This will even happen for distributed control systems.
In both cases, the GUIs will look much as they do today, still allowing graphical ways of programming such as ladder logic, function block diagram (FBD), continuous function chart (CFC) and sequential function chart (SFC); it's just that underlying everything will be text files.
For the applications in production, interfaces will become open, and the speed of these interfaces will become very important. And everything will run in containers.
In what ways does software-defined automation allow machine builders more flexibility in hardware selection and management?
Davy Demeyer, founder, Acceleer: One of the main reasons machine builders only propose one automation brand is that it takes too much engineering effort to switch between and maintain multiple ways of programming.
In the software world, it would be like teams switching between Java and .NET: both solid programming standards, but no team that wants to be efficient considers switching between them.
By making the software runtimes hardware-agnostic, machine vendors will have much more flexibility.
One case is where an end user asks for a specific hardware brand because they want to keep all hardware the same for maintenance reasons.
Another is where machine builders can switch between hardware brands when there are supply issues, like those we saw during COVID.
Finally, hardware will become much more powerful than what's available today for the same price, allowing multiple workloads to run on it—for example, one PLC runtime and maybe a separate software application to help manage the machine.
How can machine builders prepare for and leverage software-defined automation?
Davy Demeyer, founder, Acceleer: For hardware-software decoupling, first have conversations with the main customers and understand what their near- and longer-term expectations are.
Second, have conversations with the main automation vendor. Ask what SDA-based products can be bought today, and at least require the option to swap out the hardware for another vendor's.
For open software applications, think about which software-based functionalities would make sense to add to the machine if they did not require repeated engineering effort.
What often happens with machine vendors is that they have good ideas, but it makes no financial sense if they have to repeat the engineering effort for every individual machine.
If, instead, a configuration of each machine could be fed to an automated configuration/engineering/deployment workflow, then it could really help to drive innovation on the machines.
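A minimal sketch of that idea, using only Python's standard library: a per-machine configuration is fed into a code template so the engineering effort is not repeated for every machine. The template, tag names and generated structure are hypothetical.

```python
# Minimal sketch of a configuration-driven engineering workflow: a per-machine
# configuration is fed into a code template instead of re-engineering each
# machine by hand. Template text and tag names are hypothetical.
from string import Template

machine_config = {"machine": "Filler_01", "num_valves": 3}

valve_template = Template(
    "PROGRAM ${machine}_Valves\n"
    "  VAR\n"
    "${valve_vars}"
    "  END_VAR\n"
    "END_PROGRAM\n"
)

# Generate one variable declaration per configured valve
valve_vars = "".join(
    f"    Valve_{i:02d} : BOOL;\n"
    for i in range(1, machine_config["num_valves"] + 1)
)

code = valve_template.substitute(
    machine=machine_config["machine"],
    valve_vars=valve_vars,
)
print(code)
```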
How does software-defined automation build on existing IT and network infrastructure in factories and plants?
Davy Demeyer, founder, Acceleer: Workloads will move toward containers and virtual runtimes. Windows systems will slowly disappear from the factory floors.
And, since everything will become software-defined, it will become standard practice to have a staging environment that will be a full duplicate of the production environment.
Tell us about your company’s state-of-the-art product that involves software-defined automation.
Davy Demeyer, founder, Acceleer: Acceleer brings Design-Ops to the world of process automation engineering.
It simplifies and scales the automation engineering workflows, just as DevOps has done for mainstream software development.
The big difference with DevOps is that we start from the design stage. The future of automation engineering is based on an open ecosystem, where different engineering applications can be linked together through flexible interfaces.
Upstream, most of the engineering vendors have agreed to export their P&IDs into the DEXPI XML format. Acceleer allows an automatic import of all the equipment defined on these P&IDs.
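As a very rough sketch of such an import, the snippet below walks a DEXPI-style XML export and collects equipment tags; the element and attribute names are simplified placeholders, not the exact DEXPI/Proteus schema, and Acceleer's actual implementation may differ.

```python
# Rough sketch of pulling equipment tags out of a P&ID exported as DEXPI XML.
# Element and attribute names are simplified placeholders and should be checked
# against the actual DEXPI/Proteus schema before use.
import xml.etree.ElementTree as ET

tree = ET.parse("pid_export.xml")  # hypothetical DEXPI export file
root = tree.getroot()

equipment_tags = []
for equipment in root.iter("Equipment"):
    tag = equipment.get("TagName")  # placeholder attribute name
    if tag:
        equipment_tags.append(tag)

print(f"Imported {len(equipment_tags)} equipment items:", equipment_tags)
```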
Next, process and automation engineers collaborate in defining the detailed functional specifications, defining how the code in the control system will work.
Once the functional specifications are ready, the code is automatically generated, using a template-based approach very common in software automation workflows.
The code is then automatically imported into the integrated engineering environment of the selected control system, either through the software development kit (SDK) or directly into the text files containing the code and configuration. From there, the users can complete any code that can't be auto-generated and finally deploy to testing, staging and production environments.
The overall workflow stays as close as possible to the workflows our engineers are used to today, except it avoids any repeated data entry, which often takes up a big part of the engineering time.
Design-Ops accelerates the overall engineering workflow, making the entire process more predictable and repeatable. It allows users to take back ownership of the functionality of their plants, which today is often locked away inside the PLCs and distributed control systems.
Design-Ops only works if everything becomes software-defined. Software-defined does not only mean decoupling the software runtimes from the hardware. It also means that applications and the data they contain become manageable through software instead of through GUIs.
Automation systems are becoming more open. We already see this in PLC vendors moving to the next generation of IDEs, such as Siemens’ Simatic AX, CoDeSys go!, Beckhoff’s TwinCAT PLC++ and B&R Automation Studio Code. The same is happening in the distributed control system (DCS) world, with OPA ready for deployment and supported by most of the main vendors.