What cloud vendors often call “the edge” is in reality the center of your universe and your customers’ operations. If you can obtain a clear view of the state of your equipment at every moment, along with details about the environment in which it operates, there is tremendous opportunity to optimize productivity, reduce costs, and increase revenue. To generate business value, however, there are critical decisions to consider when designing your solution. The ultimate value of your connected product system is a function of the quality of your architecture at the edge: the insights derived by your analytics and other services are only as good as the data you feed them. In this article we’ll walk through the edge architecture challenges most likely to stand in the way of your enterprise digital transformation, and what you can do to solve them.
A Kingdom of Isolation
The first issue you’re likely to discover is that your current equipment software and drivers aren’t compatible with the latest cloud connectivity clients. Library incompatibilities between the two mean your drivers simply can’t talk to them, leaving a wall at the edge between your machines and public cloud vendor services. Fortunately, containers provide a way to sidestep these incompatibilities and enable end-to-end remote monitoring and control, machine learning, and enterprise workflow integration. Proper use of containers and other best practices can help your team bring intelligence to the edge, speed up delivery, and increase the value of your products in the market, all while leveraging your investment in your existing device drivers and control logic.
The Balancing Act
Containers provide a way forward for resolving incompatibilities and enabling complex solutions that span multiple systems. They still require thoughtful orchestration and execution: how the containers are organized, and what each one will include. Additionally, these containers will be running on resource-constrained edge devices and gateways, so you’ll need to balance the resource needs of the components inside each container.
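One practical way to respect whatever budget your orchestrator grants each container is to have components discover their own limit at startup and size their buffers to fit. Below is a minimal sketch in Python, assuming a Linux gateway that exposes cgroup v2 limits inside the container; the file path, fallback value, and sizing heuristic are illustrative assumptions, not part of any particular product.

```python
from pathlib import Path

# cgroup v2 exposes the container's memory limit here; cgroup v1 hosts use
# /sys/fs/cgroup/memory/memory.limit_in_bytes instead. Both paths are
# assumptions about how the gateway is configured.
CGROUP_V2_LIMIT = Path("/sys/fs/cgroup/memory.max")
DEFAULT_LIMIT = 256 * 1024 * 1024  # fallback when no limit is readable


def container_memory_limit() -> int:
    """Return the memory limit (in bytes) imposed on this container, if any."""
    try:
        raw = CGROUP_V2_LIMIT.read_text().strip()
        return DEFAULT_LIMIT if raw == "max" else int(raw)
    except (OSError, ValueError):
        return DEFAULT_LIMIT


def history_buffer_entries(bytes_per_sample: int = 128,
                           budget_fraction: float = 0.25) -> int:
    """Size the in-memory sensor history to a fraction of the container's budget."""
    return int(container_memory_limit() * budget_fraction) // bytes_per_sample


if __name__ == "__main__":
    print(f"Keeping {history_buffer_entries()} samples in memory")
```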
Division of Labor, Separation of Concerns
As a best practice, we recommend isolating your I/O and any control loops inside a single container. Use a separate container for normalizing data and providing local access to history. Your client for communicating with your cloud IoT services should be in a separate container as well. If you need a local Human Machine Interface (HMI), this too should be in its own container. Why all the isolation? When you separate your edge components by major function, your team gains agility and control that shows up in the speed of delivery and the flexibility of the software produced.
Through this approach, each container can be owned by a different team, thereby decoupling dependencies at both the human and technology levels. Versioning and driver incompatibilities are removed from the equation, and project managers can focus on challenges specific to their domain without impacting, or being impacted by, other teams with every change. Much like microservices in the cloud, the containers interact through well-defined interfaces, each with its own contract. These interfaces enable independent evolution and maintenance.
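As an illustration of such a contract, here is a minimal sketch of the data normalization container, assuming a local MQTT broker running in its own container and the paho-mqtt 2.x client library; the topic names and payload fields are hypothetical, not a prescribed schema.

```python
import json

import paho.mqtt.client as mqtt  # assumes paho-mqtt >= 2.0

RAW_TOPIC = "edge/raw/#"               # published by the I/O container
NORMALIZED_TOPIC = "edge/normalized"   # consumed by the cloud client and HMI containers
BROKER_HOST = "broker"                 # hostname of the local broker container


def on_message(client, userdata, message):
    """Translate a raw device reading into the normalized contract payload."""
    raw = json.loads(message.payload)
    normalized = {
        "device_id": message.topic.split("/")[-1],
        "timestamp": raw["ts"],
        "temperature_c": (raw["temp_f"] - 32) * 5 / 9,  # illustrative unit conversion
    }
    client.publish(NORMALIZED_TOPIC, json.dumps(normalized), qos=1)


client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect(BROKER_HOST, 1883)
client.subscribe(RAW_TOPIC, qos=1)
client.loop_forever()
```

The teams on either side only need to agree on the topics and payload fields: the I/O container can swap drivers, and the cloud client can swap vendors, without touching this code.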
Rates of Maturation
Containerization also enables each component to move forward over the life of your products at different rates. Your I/O container will iterate at the speed of your internal engineering team, whereas your HTML5 user interfaces can improve at the speed of the web. Meanwhile, your cloud client components can zip ahead at the speed of the cloud providers themselves, enabling your system to take advantage of their latest offerings regardless of how much effort your team puts into the other areas.
Depending on your bill of materials (BOM) constraints and available computing power, your system may only support a limited number of containers. Resource management across containers can be difficult. Sensor history rotation and memory management must be considered up front, including the trade-offs required when physical constraints limit you to just two containers.
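One way to keep local history bounded on such a device is a rolling retention policy enforced at write time. Here is a minimal sketch using SQLite from the Python standard library; the table layout, file location, and row cap are assumptions you would tune to your own storage budget.

```python
import sqlite3
import time

DB_PATH = "history.db"   # in a container this would live on a mounted data volume
MAX_ROWS = 50_000        # illustrative cap; tune to your storage budget

conn = sqlite3.connect(DB_PATH)
conn.execute(
    "CREATE TABLE IF NOT EXISTS samples ("
    " ts REAL NOT NULL, device_id TEXT NOT NULL, value REAL NOT NULL)"
)


def record_sample(device_id: str, value: float) -> None:
    """Insert a reading, then discard the oldest rows beyond the cap."""
    with conn:
        conn.execute(
            "INSERT INTO samples (ts, device_id, value) VALUES (?, ?, ?)",
            (time.time(), device_id, value),
        )
        conn.execute(
            "DELETE FROM samples WHERE rowid IN ("
            " SELECT rowid FROM samples ORDER BY ts DESC LIMIT -1 OFFSET ?)",
            (MAX_ROWS,),
        )
```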
The Not So Final Word
Another piece of advice for teams developing edge components for their connected product system is to expect new challenges after deployment. At a minimum you’ll need to keep your web components updated for security and browser compatibility reasons, and these updates may call for more resources than the original code required. Design your solution, both the software and hardware components, for a future at least 18-24 months out. Adding a little to your initial BOM cost to give your platform some headroom will pay off down the road with the flexibility to solve future problems without forcing updates to your hardware. Certification is hard, and the longer you can prolong the lifespan of your equipment in the field through software updates, the more profitable your business is likely to be.
Edge offerings from a range of vendors can be included on your gateway, providing additional functionality and intelligence at the edge. With proper container isolation, your solutions can always take advantage of the latest innovations from these vendors without requiring updates to any other components.
Your connected product solution is critical to your digital transformation as an enterprise, and your edge architecture is critical for providing the industrial data you need to generate insights and drive actions for your business and your customers. With proper planning and strategy at the source of this data from the beginning, your organization can set itself up for success in a changing and increasingly connected world.
James Branigan is Co-Founder and CTO of Bright Wolf.