
Edge Computing Explained


July 31, 2023

Edge computing is a school of thought in networking. Its core principle is that computation should happen as close as possible to the source of the data, with the goal of reducing latency and bandwidth usage.

Simply put, edge computing means running fewer processes in the cloud and moving them instead to locations closer to the user. These include:

  • The user’s own computer.
  • An edge server.
  • An IoT device.

Bringing these computing demands to the network’s edge keeps long-distance communication between client and server to a minimum.


What is meant by the ‘network edge’?


For internet devices, the network edge is the point where the device, or the local network that contains it, communicates with the wider internet. It is a somewhat ambiguous term, as the network edge can be identified in different ways under this definition:

  • The user’s router, ISP and local edge server are considered the edge.
  • But the processor inside an IoT camera could also be considered the network edge.

What really matters is that the edge of the network must be in close physical proximity to the device. This differs from origin servers and cloud servers, which are often a long distance from the devices they communicate with.


How is edge computing different from other computing models?


Early computers were large and bulky. As personal computers came into existence, the options for distributed computing became more varied. For a time, personal computing was the dominant model: data was stored and applications were run locally on the user’s device, or in an on-site data centre.

Today, with the advent of cloud computing, this model has been shaken up. With services centralised in a vendor-managed cloud, data and applications can be accessed remotely via the internet from any connected device. The disadvantage of this model is latency, which stems from the distance between users and the data centres that host the cloud services.

Edge computing aims to keep computation close to end users. With minimal distance for the data to travel, the latency issue is addressed, while the centralised nature of cloud computing is, in essence, retained.
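To see why distance matters, consider a back-of-the-envelope sketch. The distances and the fibre signal speed used here are illustrative assumptions, not measurements:

```python
# Rough propagation delay in optical fibre, where signals travel at
# about two-thirds the speed of light (~200 km per millisecond).
SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    """Best-case round-trip propagation time, ignoring processing delays."""
    return 2 * distance_km / SPEED_KM_PER_MS

cloud_rtt = round_trip_ms(3000)  # a distant cloud data centre
edge_rtt = round_trip_ms(15)     # a nearby edge server
print(cloud_rtt, edge_rtt)       # 30.0 0.15
```

Real-world latency also includes routing, queueing and processing delays, so the gap in practice is often even larger than this best-case comparison suggests.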


What is an example of edge computing?

Imagine a building that is secured with many IoT video cameras. These cameras output a raw video signal that is continuously streamed to a cloud server, where motion-detection software filters the feed so that only footage containing movement is saved to the database. This puts a constant strain on the building’s internet infrastructure: a large amount of bandwidth is consumed transferring the video streams, and the cloud server bears a heavy burden processing footage from multiple cameras at once.

Now, imagine moving that motion-detection computing to the network edge. Each camera has its own internal computer running the motion-detection software and sends footage to the cloud server only when motion is detected. The result is a substantial reduction in bandwidth use, because most of the footage never has to leave the building.

What’s more, the cloud server now only has to store the important footage as it arrives. This lets it serve a greater number of cameras without the risk of being overwhelmed. This is the goal of edge computing.
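The edge-side filtering described above can be sketched in a few lines of Python. This is a minimal illustration rather than a production motion detector: frames are assumed to arrive as 2-D grids of brightness values, and the frame_delta helper and threshold are hypothetical.

```python
def frame_delta(reference, frame):
    """Mean absolute brightness difference between two frames."""
    total = sum(abs(a - b)
                for ref_row, row in zip(reference, frame)
                for a, b in zip(ref_row, row))
    return total / (len(frame) * len(frame[0]))

def frames_with_motion(stream, background, threshold=10.0):
    """Keep only frames that differ noticeably from the static background;
    everything else never leaves the device."""
    return [f for f in stream if frame_delta(background, f) > threshold]

# Example: three quiet frames and one where something moved.
background = [[50, 50], [50, 50]]
moved = [[50, 50], [200, 200]]
stream = [background, background, moved, background]

to_upload = frames_with_motion(stream, background)
# Only the single frame containing motion would be sent to the cloud.
```

Running this filter on the camera itself means the cloud server only ever receives the frames in `to_upload`, which is exactly the bandwidth saving the example describes.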


What other possible use cases are there for edge computing?


There are many products, processes, applications and services where edge computing could be deployed. Here are a few possibilities to consider:


  • Self-driving Cars: Autonomous vehicles must be able to react in real time. Latency caused by sending data and receiving instructions from a server could create the risk of accidents.
  • IoT Devices: Get more efficient user interactions by having smart devices run code themselves rather than having to rely on a cloud server.
  • Video Conferencing: When you run an interactive live video link, the bandwidth requirements are very high. Moving backend processes into closer proximity to the video’s source could decrease instances of lag and latency.
  • Medical Monitoring Devices: Medical devices need to be able to respond instantly. This can be compromised by having to wait for instructions from a cloud server.


What are the advantages of edge computing?


Edge computing offers several key benefits over other models. These include:


Cost Savings

With lower bandwidth use and less strain on servers, businesses can reduce their investments in these resources. The more IoT and similar devices there are, the more computation will be required for homes and businesses to operate. Moving large swathes of this computation to the edge could be the only feasible solution.


Improved Performance

Reduced latency is another key benefit of edge computing. Distant servers cause delays, even for relatively simple tasks like two co-workers chatting over an IM platform. If those processes are brought to the edge, there will be far swifter communications between devices, leading to improved performance.


New Functionality

Edge computing can enable functionality that was not previously possible. For example, companies can process data in real time at the edge, something that would take longer under a cloud-based model.


What are the disadvantages of edge computing?


No model is without its problems. For all the advantages of edge computing, there are some downsides to it as well.



Security Risks

Edge computing can increase the number of attack vectors. Every additional smart-enabled device in the mix is a new opportunity for malicious attackers to compromise, making users more vulnerable to attack.


More Local Hardware

Edge computing requires more local infrastructure to implement. IoT devices with built-in computing capabilities are more sophisticated and therefore cost more to purchase. While hardware costs are generally decreasing, implementing edge computing will still be a significant investment for some companies and individuals.


Final thoughts


The concept of edge computing is gaining traction as people seek solutions to the rising computational demands of an increasingly digital world. Edge servers, for example, can mitigate the need for extra on-device hardware. If latency, bandwidth and server resources come up frequently in conversations at your home or business, it may be time to start considering the benefits of edge computing.
