What is Edge Computing?


Edge computing is what it sounds like: computing that takes place at the edge of corporate networks, with “the edge” being defined as the place where end devices access the rest of the network – things like phones, laptops, industrial robots, and sensors. The edge used to be a place where these devices connected so they could deliver data to, and receive instructions and software updates from, a centrally located data center or the cloud.

Edge computing is a distributed computing paradigm that brings computation and data storage closer to the location where it is needed, to improve response times and save bandwidth.

Now, with the explosion of the Internet of Things, that model has shortcomings. IoT devices gather so much data that the sheer volume requires larger and more expensive connections to data centers and the cloud. The nature of the work performed by these IoT devices is also creating a need for much faster connections between the data center or cloud and the devices. For example, if sensors in valves at a petroleum refinery detect dangerously high pressure in the pipes, shutoffs must be triggered as soon as possible. With analysis of that pressure data taking place at distant processing centers, the automatic shutoff instructions may come too late.

Latency

With processing power placed local to the end devices, latency is lower and that round-trip time can be significantly reduced, potentially saving downtime, damage to property and even lives. Even with the introduction of edge devices that provide local computing and storage, there will still be a need to connect them to data centers, whether they are on premises or in the cloud. For example, temperature and humidity sensors in agricultural fields gather valuable data, but that data doesn’t have to be analyzed or stored in real time. Edge devices can collect, sort and perform preliminary analysis of the data, then send it along to where it needs to go: to centralized applications or some form of long-term storage, again either on-prem or in the cloud. Because this traffic may not be time-sensitive, slower, less expensive connections – possibly over the internet – can be used. And because the data is pre-sorted, the volume of traffic that needs to be sent at all may be reduced.
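As a minimal sketch of this act-locally, forward-later pattern, the snippet below shows a hypothetical edge node that triggers a shutoff the moment a pressure reading crosses a threshold (as in the refinery example above) while batching routine readings for upload over a slower link. The sensor interface, threshold value and upload call are assumptions made purely for illustration, not details of any particular product.

```python
# Hypothetical edge-node loop: act locally on urgent readings,
# batch routine ones for the data center or cloud.
PRESSURE_LIMIT_KPA = 900.0   # assumed safety threshold
BATCH_SIZE = 100             # readings per upload

def run_edge_node(read_pressure, trigger_shutoff, upload_batch):
    """read_pressure, trigger_shutoff and upload_batch are placeholders
    for device- and backend-specific calls supplied by the integrator."""
    batch = []
    while True:
        reading = read_pressure()        # e.g. pressure in kPa from a valve sensor
        if reading >= PRESSURE_LIMIT_KPA:
            trigger_shutoff()            # decided locally, no round trip to the cloud
        batch.append(reading)
        if len(batch) >= BATCH_SIZE:
            upload_batch(batch)          # a slower, cheaper link is fine for this traffic
            batch = []
```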

What are the benefits of edge computing?

The upside of edge computing is faster response times for applications that require them, and slower growth in expensive long-haul connections to processing and storage centers.

What are the drawbacks of edge computing?

The downside can be security. With data being collected and analyzed at the edge, it’s important to include security for the IoT devices that connect to the edge devices and for the edge devices themselves. They contain valuable data, but they are also network elements that, if exploited, could compromise other devices that contain stores of valuable assets.

With edge computing becoming more essential, it’s also important to make sure that the edge devices themselves don’t become a single point of failure. Network architects need to build in redundancy and provide failover contingencies to avoid crippling downtime if a primary node goes down. The industry has already gone a long way toward addressing the demands of edge computing, and it is becoming mainstream. Its importance is likely to grow even more as the use of real-time applications becomes more prevalent.
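To make the redundancy point concrete, here is a minimal failover sketch, assuming a primary and a backup edge node reachable through interchangeable handler functions; the handlers and the simulated outage are invented for illustration.

```python
def process(reading, primary, backup):
    """Failover sketch: try the primary edge node first and fall back
    to a backup node if the primary is unreachable."""
    try:
        return primary(reading)
    except ConnectionError:
        return backup(reading)   # avoids a single point of failure

def failing_primary(reading):
    raise ConnectionError("primary edge node is down")   # simulated outage

def backup_node(reading):
    return f"handled reading {reading} on the backup node"

print(process(42, failing_primary, backup_node))
```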

Difference between edge and cloud computing

Edge computing

Advantages:
- Real-time response
- Low latency
- Can work without the cloud and improves data security
- Its distributed structure reduces network traffic, storage, and bandwidth costs

Disadvantages:
- Limited storage capacity
- Needs proprietary networks
- IoT devices have high power consumption

Cloud computing

Advantages:
- Scalable
- Big data processing

Disadvantages:
- Slow response time
- High latency
- No offline mode
- Difficult to maintain data security
- High costs of data storage and transmission

Use cases

There are probably dozens of ways to characterize use cases. But here are some examples to help clarify thinking and highlight opportunities for collaboration. Four major categories of workload requirements that benefit from a distributed architecture are analytics, compliance, security, and NFV.

DATA COLLECTION AND ANALYTICS

IoT, where data is often collected from a large network of microsites, is an example of an application that benefits from the edge computing model. Sending masses of data over often limited network connections to an analytics engine located in a centralized data center is counterproductive; it may not be responsive enough, could contribute to excessive latency, and wastes precious bandwidth. Since edge devices can also produce terabytes of data, it can be more cost-effective to analyze the data near its source and send only small batches of condensed information back to the centralized systems. There is a trade-off here—balancing the cost of transporting data to the core against losing some information.
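One way to picture “condensed information” is a simple window summary computed at the edge; the fields and the window size below are assumptions chosen for illustration.

```python
from statistics import mean

def summarize_window(readings):
    """Condense a window of raw sensor readings into a small summary
    record suitable for a limited backhaul link."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 3),
    }

# Example: 1,000 raw samples shrink to a four-field record.
window = [20.1 + 0.01 * i for i in range(1000)]
print(summarize_window(window))
```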

SECURITY

Unfortunately, as edge devices proliferate – including mobile handsets and IoT sensors – new attack vectors are emerging that take advantage of the growing number of endpoints. Edge computing offers the ability to move security elements closer to the originating source of an attack, enables higher-performance security applications, and increases the number of layers that help defend the core against breaches and risk.

COMPLIANCE REQUIREMENTS

Compliance covers a broad range of requirements, ranging from geofencing and data sovereignty to copyright enforcement. Restricting access to data based on geography and political boundaries, limiting data streams depending on copyright limitations, and storing data in places with specific regulations are all achievable and enforceable with edge computing infrastructure.
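A geofencing or data-sovereignty check at the edge can be as simple as consulting a policy table before a record is forwarded; the data classes, region codes and policy below are invented purely for illustration.

```python
# Hypothetical data-sovereignty policy: which destination regions each
# class of data may be forwarded to from this edge site.
POLICY = {
    "personal_data": {"eu-west"},          # must stay in-region
    "telemetry": {"eu-west", "us-east"},   # may leave the region
}

def may_forward(data_class, destination_region):
    """Return True if this edge site is allowed to send the given
    class of data to the given region."""
    return destination_region in POLICY.get(data_class, set())

print(may_forward("personal_data", "us-east"))  # False: blocked at the edge
print(may_forward("telemetry", "us-east"))      # True: allowed to leave
```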

NETWORK FUNCTION VIRTUALIZATION (NFV)

Network Function Virtualization (NFV) is at its heart the quintessential edge computing application because it provides infrastructure functionality. Telecom operators are looking to transform their service delivery models by running virtual network functions as part of, or layered on top of, an edge computing infrastructure. To maximize efficiency and minimize cost/complexity, running NFV on edge computing infrastructure makes sense.

REAL-TIME

Real-time applications, such as AR/VR, connected cars, telemedicine, the tactile internet, Industry 4.0 and smart cities, are unable to tolerate more than a few milliseconds of latency and can be extremely sensitive to jitter, or latency variation. As an example, connected cars will require low latency and high bandwidth, and depend on computation and content caching near the user, making edge capacity a necessity. In many scenarios, particularly where closed-loop automation is used to maintain high availability, response times in tens of milliseconds are needed and cannot be met without edge computing infrastructure.
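A rough back-of-the-envelope check shows why such budgets rule out distant data centers: propagation delay alone, assuming roughly 200,000 km/s signal speed in optical fiber and illustrative distances, consumes much of a tens-of-milliseconds budget before any processing or queuing is counted.

```python
# Rough latency-budget check: round-trip propagation delay only,
# assuming ~200,000 km/s signal speed in optical fiber.
FIBER_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    """Round-trip propagation delay in milliseconds for a given one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

print(round_trip_ms(1500))  # 15.0 ms to a distant data center, before any processing
print(round_trip_ms(50))    # 0.5 ms to a nearby edge site
```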

IMMERSIVE

Edge computing expands bandwidth capabilities, unlocking the potential of new immersive applications. Some of these include AR/VR, 4K video, and 360° imaging for verticals like healthcare. Caching and optimizing content at the edge is already becoming a necessity, since protocols like TCP don’t respond well to sudden changes in radio network traffic. Edge computing infrastructure, tied into real-time access to radio/network information, can reduce stalls and delays in video by up to 20% during peak viewing hours, and can also vary the video feed bitrate based on radio conditions.
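The bitrate variation described above can be sketched as a lookup from measured radio throughput to the highest sustainable video profile; the profile ladder and the 80% headroom factor are assumptions for illustration, not figures from any deployment.

```python
# Hypothetical ladder of video profiles: (vertical resolution, bitrate in kbit/s).
PROFILES = [(1080, 6000), (720, 3000), (480, 1200), (360, 600)]
HEADROOM = 0.8   # assumed: use at most 80% of the measured throughput

def pick_profile(measured_kbps):
    """Choose the highest-resolution profile whose bitrate fits within
    the throughput the radio network is currently delivering."""
    budget = measured_kbps * HEADROOM
    for resolution, bitrate in PROFILES:
        if bitrate <= budget:
            return resolution, bitrate
    return PROFILES[-1]   # fall back to the lowest profile

print(pick_profile(4000))  # (720, 3000) under these assumptions
```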

NETWORK EFFICIENCY

Many applications are not sensitive to latency and do not require large amounts of nearby compute or storage capacity, so they could theoretically run in a centralized cloud, but the bandwidth requirements and/or compute requirements may still make edge computing a more efficient approach. Some of these workloads are common today, including video surveillance and IoT gateways, while others, including facial recognition and vehicle number plate recognition, are emerging capabilities. With many of these, the edge computing infrastructure not only reduces bandwidth requirements, but can also provide a platform for functions that enable the value of the application—for example, video surveillance motion detection and threat recognition. In many of these applications, 90% of the data is routine and irrelevant, so sending it to a centralized cloud is prohibitively expensive and wasteful of often scarce network bandwidth. It makes more sense to sort the data at the edge for anomalies and changes, and only report on the actionable data.
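A minimal sketch of that sort-at-the-edge idea, assuming a known baseline value and a tolerance chosen for illustration, is to forward only the readings that deviate enough to be actionable:

```python
def actionable(readings, baseline, tolerance=0.05):
    """Keep only readings that deviate from the baseline by more than
    the tolerance; routine values never leave the edge."""
    return [r for r in readings if abs(r - baseline) / baseline > tolerance]

samples = [50.0, 50.2, 49.9, 61.5, 50.1]    # one anomalous spike
print(actionable(samples, baseline=50.0))   # only 61.5 gets reported upstream
```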

SELF-CONTAINED AND AUTONOMOUS SITE OPERATIONS

Many environments, even today, have limited, unreliable or unpredictable connectivity. These could include transportation (planes, buses, ships), mining operations (oil rigs, pipelines, mines), power infrastructure (wind farms, solar power plants), and even environments that should typically have good connectivity, like stores. Edge computing neatly supports such environments by allowing sites to remain semi-autonomous and functional when needed or when network connectivity is not available. The best example of this approach is the need for retail locations to keep their point-of-sale (POS) systems running even when there is temporarily no network connectivity.
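A store-and-forward queue is the usual way to keep a site like this functional offline; the sketch below assumes a send callback supplied by the backend and an explicit online flag, both simplifications for illustration.

```python
from collections import deque

class OfflineQueue:
    """Minimal store-and-forward sketch: transactions are recorded
    locally and flushed whenever connectivity returns."""

    def __init__(self, send):
        self.send = send          # placeholder for the backend call
        self.pending = deque()

    def record(self, transaction, online):
        self.pending.append(transaction)
        if online:
            self.flush()

    def flush(self):
        while self.pending:
            self.send(self.pending.popleft())

# Usage: sales keep being recorded while the link is down.
q = OfflineQueue(send=print)
q.record({"sale": 1, "total": 9.99}, online=False)  # queued locally
q.record({"sale": 2, "total": 4.50}, online=True)   # both flushed now
```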

PRIVACY

Enterprises may have needs for edge computing capacity depending on workloads, connectivity limits and privacy. For example, medical applications that need to anonymize personal health information (PHI) before sending it to the cloud could do this using edge computing infrastructure.

Another way to look at requirements that would benefit from cloud edge computing is by the type of company that would deploy them. Operator applications are workloads put on edge computing infrastructure that is built and managed by operators—telecommunications companies, for example. Third-party applications are built by organizations to run on existing edge infrastructure, in order to leverage others’ edge computing infrastructure. It is worth noting that any application could leverage any or all of the capabilities provided by a cloud—compute, block storage, object storage, virtual networking, bare metal, or containers.
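Returning to the PHI example in the Privacy paragraph, a hedged sketch of edge-side anonymization might replace direct identifiers with salted hashes before anything leaves the site; the field names and salt are assumptions, and a real deployment would follow the applicable de-identification rules.

```python
import hashlib

# Assumed identifier fields; a real PHI schema is defined by the application
# and the applicable regulations.
IDENTIFIERS = {"name", "ssn", "address"}

def anonymize(record, salt="edge-site-salt"):
    """Replace direct identifiers with salted hashes so the record can
    be sent to the cloud without exposing who it belongs to."""
    out = {}
    for key, value in record.items():
        if key in IDENTIFIERS:
            out[key] = hashlib.sha256((salt + str(value)).encode()).hexdigest()
        else:
            out[key] = value
    return out

print(anonymize({"name": "Jane Doe", "heart_rate": 72}))
```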