Three Key Considerations for Deploying Edge Computing
Edge computing is transforming analytics, artificial intelligence (AI), and machine learning (ML) applications. Significant investments in libraries and frameworks have opened new opportunities for engineers and operations specialists to integrate these capabilities with process control. While the potential for valuable applications is vast, security, management, and scalability must be carefully considered during the design phase.
When deploying edge solutions, companies should focus on three main aspects: the applications running at the edge, the supporting infrastructure, and the security and orchestration of edge devices.
1. Edge Computing Applications
Edge computing involves running applications close to control processes or machinery. An edge device typically resides on the same network as operational technology (OT) devices, facilitating low-latency data collection and response communication. An edge device can be a physical appliance or a virtual machine (VM) with low network latency to the process equipment.
The choice between physical devices and VMs comes down to a tradeoff between processing power and flexibility. Physical devices offer low latency and significant processing power but lack flexibility in resource allocation and are often dedicated to a single application. VMs, on the other hand, can be dynamically allocated, providing flexibility but potentially introducing additional latency.
Edge computing offers numerous benefits, including low response times, efficient bandwidth use, real-time data handling, and scalable applications. By localizing applications, data does not need to be sent to a central server, reducing latency and enabling near-real-time solutions. Scaling edge infrastructure involves adding edge devices to handle increasing demand, along with improved load balancing and node management.
Edge applications are typically specific to their use case and are often purpose-built. Examples include analytics or models to optimize process control, reduce scrap, improve yield, lower utility consumption, and predict maintenance needs. Data is processed locally at the edge, with raw or aggregated data potentially sent to enterprise servers or the cloud for further analysis. Aggregating data locally can reduce transfer and storage costs in the cloud.
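As a concrete illustration of local aggregation, the sketch below collapses a window of raw sensor samples into a small summary record before it leaves the edge device. The window size, sample values, and summary fields are hypothetical; a real application would use its own tag names and statistics.

```python
# Minimal sketch of edge-side aggregation, assuming a hypothetical
# stream of raw temperature readings sampled once per second.
from statistics import mean

def aggregate_window(readings):
    """Reduce a window of raw readings to summary statistics
    before forwarding them to the enterprise server or cloud."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 3),
    }

# One minute of raw samples (60 values) collapses to a single
# 4-field summary, sharply reducing transfer and storage volume.
raw = [72.0 + 0.1 * i for i in range(60)]
summary = aggregate_window(raw)
```

Sending only the summary record upstream is one way the edge tier reduces cloud transfer and storage costs while keeping full-resolution data available locally for model execution.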
2. Edge Computing Infrastructure
The infrastructure supporting edge applications is often the most uncertain aspect of integrating these solutions. A detailed scope of the edge application is necessary to understand which process parameters need to be collected, what the application will output, and what defines success in the process.
Other considerations, like the process's readiness for data collection and advanced analytics, can be assessed using an analytics maturity model. Advanced process analytics require investments in data collection, cleansing, contextualization, and storage, all of which must be supported by the infrastructure. Not all organizations are prepared for advanced process analytics, so working with analytics experts is crucial to implementing these solutions effectively.
As devices are deployed and applications scale, the network load increases. Understanding where data is being transferred is essential to minimize load issues with the existing infrastructure. Upgrading network switches to support larger data volumes may be necessary to prevent network interruptions due to bandwidth contention. Consulting with plant floor network integrators during the planning phases can help identify and provide solutions to mitigate these issues.
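A back-of-the-envelope load estimate can flag bandwidth contention before deployment. The figures below (device count, tags per device, sample size, and rate) are hypothetical placeholders; real values come from the application's actual tag lists and scan rates.

```python
# Rough estimate of steady-state uplink load for a fleet of edge
# devices; all per-device figures here are illustrative assumptions.

def fleet_bandwidth_mbps(devices, tags_per_device, bytes_per_sample, samples_per_sec):
    """Estimate aggregate network load in megabits per second."""
    bytes_per_sec = devices * tags_per_device * bytes_per_sample * samples_per_sec
    return bytes_per_sec * 8 / 1_000_000  # bytes -> bits -> Mbps

# 50 edge devices, 200 tags each, 16-byte samples at 10 Hz
load = fleet_bandwidth_mbps(50, 200, 16, 10)  # 12.8 Mbps
```

Comparing this estimate against switch and uplink capacity during planning, ideally alongside a plant floor network integrator, shows whether existing infrastructure can absorb the new traffic or needs upgrading.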
3. Edge Computing Security and Orchestration
Security for edge devices should be prioritized from the start of the design process. The functional requirements of an edge application help determine the necessary security measures for processing data. Common security requirements include data encryption (both in transit and at rest), transport layer security (TLS) communications, and system patching. Orchestration platforms can enhance security by managing the lifecycle of edge devices, updating them and their applications, and providing these services at scale. This allows for efficient management actions, such as updating application versions, deploying new analytic models, or applying security patches across a large fleet of edge devices.
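For encryption in transit, one possible starting point is the sketch below using Python's standard library: a default TLS context that verifies server certificates and hostnames, with legacy protocol versions rejected. The endpoint name is hypothetical, and encryption at rest would be handled separately (for example, with disk or database encryption).

```python
# Sketch of TLS-enforced transport for edge-to-server traffic,
# using only Python's standard library.
import ssl

# A default context verifies server certificates and hostnames
# against the system trust store.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocols

# The context would then wrap the socket the edge device uses, e.g.:
# conn = http.client.HTTPSConnection("historian.example.com", context=context)
```

Centralizing settings like this in an orchestration platform's device profile keeps the security posture consistent as the fleet grows.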
Using an orchestration platform can accelerate application development and scalability. Small proof-of-concept or pilot solutions can be evaluated and then deployed across the fleet of edge devices. The rapid innovation facilitated by orchestration platforms, combined with built-in security features, enables developers to focus on delivering solutions rather than managing deployments and architectures.
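The pilot-to-fleet progression can be sketched as a staged rollout: updates go out in fixed-size waves so a bad release can be halted before it reaches every device. The device names and wave size below are hypothetical; an actual orchestration platform would manage this scheduling itself.

```python
# Hypothetical sketch of a staged fleet rollout, illustrating how an
# orchestration platform might promote a piloted application version
# to the full device fleet in controlled waves.

def rollout_waves(devices, wave_size):
    """Split the fleet into fixed-size update waves so problems
    surface on a few devices before the release reaches them all."""
    return [devices[i:i + wave_size] for i in range(0, len(devices), wave_size)]

fleet = [f"edge-{n:03d}" for n in range(1, 11)]  # ten devices
waves = rollout_waves(fleet, 4)                  # waves of 4, 4, and 2
```

Pausing between waves to check application health is a common pattern; the same mechanism applies equally to new analytic models and security patches.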
Edge computing, particularly when integrated with AI/ML, introduces new possibilities and challenges in controls engineering. Edge applications support low-latency communications and can be tailored to meet various business goals. Investing in the supporting infrastructure is crucial to ensure readiness for both analytics and network scaling. A security-first approach and a robust orchestration platform ensure that edge devices are managed and secured effectively.
By prioritizing security, orchestration, and infrastructure, companies can harness AI/ML solutions at the edge to optimize plant operations.
This article was initially published in Control Engineering.