Edge vs. Cloud Computing

Three reasons to use edge computing when deploying IoT.

Tom Kovanic

Cloud computing has been in vogue for a number of years. In fact, one may have come to believe that deploying a data center using the cloud is the best, and maybe the only, way to go. Recently, a new way to deploy storage and compute resources has emerged: edge computing. What is edge computing, and why should one consider it when deploying IoT applications?

Edge Computing – What is it?

With edge computing, the compute, storage, and application resources are located close to the user of the data or the source of the data. This moves the application platform closer to the IoT devices or sensors. In some cases, they could be in the same physical location.

Cloud computing does the exact opposite. It moves the compute and storage resources away from the IoT devices. When using a public cloud provider, one generally does not know where those resources are located. This makes it difficult for the cloud to support real-time IoT applications. The latency, or the time it takes to respond to an event, may exceed the time requirement for an IoT deployment.

Why Edge Computing vs Cloud Computing for IoT?

Cloud computing was developed to reduce the equipment and operational costs of deploying a data center. It also provides flexibility in that computing and storage resources can be added, or removed, as needed. This prevents the underutilization of capital assets, as well as situations where demand would outstrip the capacity of a privately owned data center. The term "the cloud" came about because network diagrams used the image of a cloud to signify that the true location of the resources was not known.

These shared pools of resources are referred to as the cloud because one does not know where the resources are physically located. They can be across the street, across town, across the country, or on the other side of the planet. In fact, the resources can be moved among the cloud provider's various facilities for load balancing.

IoT deployments that support real-time applications are sensitive to latency. The arbitrary relocation of compute and storage resources can lengthen the latency to the point where the real-time application ceases to function. The cloud was not designed to support critical, real-time manufacturing applications.

The very nature of cloud computing makes it difficult to be adopted for real-time IoT applications.

With edge computing, the location of the compute and storage resources is known and does not move. Latency is constant, and network jitter is largely eliminated. The cloud's ability to bring additional resources on-line quickly, or to reduce the number of resources employed, is not needed for real-time IoT applications. If the needs of the real-time IoT application are fairly static, the supporting resources at the edge can be fairly static as well.

Advantages of Edge Computing

Edge computing has three advantages over cloud computing when it comes to deploying IoT applications:

Reduces Latency

Several factors contribute to the latency of a network: the propagation delay through the physical media of the network; the time it takes to route data through the networking equipment (switches, routers, firewalls, etc.); and the time it takes to process the data. Adopting edge computing and locating the processing and storage resources close to the IoT devices makes the delay through the media almost negligible. Additionally, there are fewer layers of switching, and those switches are smaller, with an accompanying reduction in switching time.
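
To put rough numbers on these components, here is a minimal back-of-the-envelope sketch in Python. The distances, hop counts, and per-hop delays are illustrative assumptions, not measurements from any particular deployment.

SPEED_IN_FIBER_KM_PER_MS = 200  # signals travel roughly 200 km per millisecond in fiber

def one_way_latency_ms(distance_km, hops, per_hop_ms, processing_ms):
    # One-way latency = propagation through the media + per-hop switching + processing.
    propagation = distance_km / SPEED_IN_FIBER_KM_PER_MS
    switching = hops * per_hop_ms
    return propagation + switching + processing_ms

# Hypothetical edge deployment: resources a few hundred meters away, one small switch.
edge = one_way_latency_ms(distance_km=0.3, hops=1, per_hop_ms=0.05, processing_ms=1.0)

# Hypothetical cloud deployment: a regional data center, many routing hops.
cloud = one_way_latency_ms(distance_km=800, hops=12, per_hop_ms=0.2, processing_ms=1.0)

print(f"edge:  {edge:.2f} ms")   # about 1.05 ms
print(f"cloud: {cloud:.2f} ms")  # about 7.40 ms

Even with these generous assumptions, propagation and switching dominate the cloud path; shrinking the distance and the hop count is what pulls the edge figure down.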

Enhances Reliability

Shortening the network and removing hops through switches improve the reliability of the network. There is less chance of data becoming corrupted, which would force packets to be retransmitted and add to latency.

Network jitter is also greatly reduced. With edge computing, the location of the resources is fixed. In addition, an IP network is inherently jittery: packets can be routed over different paths to the destination and can even arrive out of order. This uncertainty adds to network jitter. Shortening the path between the IoT device and the compute and storage resources greatly reduces the jitter inherent in IP networking. Excessive jitter can make it difficult to support real-time applications if the latency drifts outside the required limits.
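
As a rough illustration of how that variation can be quantified, the sketch below applies the smoothed interarrival-jitter estimator described in RFC 3550 to two hypothetical sets of per-packet transit times. The sample values are invented for illustration, not measured.

def interarrival_jitter_ms(transit_times_ms):
    # Smoothed estimate of packet-delay variation (jitter) per RFC 3550:
    # each new deviation moves the running estimate by 1/16 of the difference.
    jitter = 0.0
    for prev, curr in zip(transit_times_ms, transit_times_ms[1:]):
        deviation = abs(curr - prev)
        jitter += (deviation - jitter) / 16
    return jitter

# Hypothetical per-packet transit times (ms): a long multi-hop cloud path vs. a short edge path.
cloud_path = [7.4, 9.1, 6.8, 12.3, 7.0, 10.5, 8.2]
edge_path  = [1.05, 1.10, 1.02, 1.08, 1.04, 1.07, 1.06]

print(f"cloud jitter: {interarrival_jitter_ms(cloud_path):.2f} ms")
print(f"edge jitter:  {interarrival_jitter_ms(edge_path):.3f} ms")

The shorter, fixed edge path shows transit times that barely vary, so the running jitter estimate stays near zero, while the multi-hop path accumulates noticeably more delay variation.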

Enhances Security

Edge computing offers the opportunity to provide a more secure environment regardless of how one deploys: in a co-location facility or on directly owned equipment. Co-location facilities are physically secure sites. If one deploys the edge computing equipment near the factory floor where the IoT devices are located, the resources can easily be secured through door access controls, video monitoring, and other physical security measures.

Edge Computing Is Made for Real-time IoT

A typical data center deployed using cloud resources will most likely not be able to support the needs of a real-world IoT deployment on the factory floor. Cloud computing's typical latency is too long, and the network is jittery. Edge computing solves those problems by locating the compute and storage resources close to the IoT devices. It reduces latency and jitter, improves the reliability of the network, and has the added benefit of increased security.

Tom Kovanic is a Technical Marketing Manager at Panduit.
