What is edge computing?

A CGI render of an automated, smart manufacturing floor using Industrial Internet of Things (IoT) sensors. Machine arms construct a car, while holographic lines and Wi-Fi symbols denote a private network of devices within the factory.
(Image credit: Getty Images)

Edge computing, as the name suggests, involves the collection and processing of data by a dedicated application, system, or node closer to the user or data collection point – rather than sending the data back to a central server for processing.

For example, it's possible for a robotic arm to send reports on how many units it has picked and packed from the production line, or for an app on your phone to send performance data to the developer for analysis, giving insights into which features people are and aren't using.

Traditionally, the collection and synthesis of data happened far apart (across the world in many cases), and so far that's been fine – most data processing tasks weren't time or network-sensitive enough to warrant anything else. However, today we live in a world of connected devices and tools capable of producing an enormous amount of data, at a rate far faster than previous generations of technology.

Just imagine an Internet of Things (IoT)-enabled factory where every tool, belt, machine and device has an affixed computer that collects information about its work and performance. There's too much data being produced too quickly for collection and transport to an offsite processing hub, and many industries need systems to pivot quickly and can't let network latency or data volume slow them down.

Edge computing is the answer – where data processing and analysis is done in small, on-site, sometimes micro data centres or, increasingly, in the device itself.
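As a rough illustration of that idea, an edge node might aggregate raw sensor readings locally and transmit only a compact summary upstream, rather than shipping every sample to a central server. The field names and alert threshold below are hypothetical – this is a sketch of the pattern, not any particular product's implementation.

```python
from statistics import mean

def summarise_readings(readings, threshold=80.0):
    """Aggregate raw sensor readings on the device and return only a
    compact summary. Field names and threshold are illustrative."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": sum(1 for r in readings if r > threshold),
    }

# One minute of temperature samples collected on the device
samples = [72.1, 73.4, 71.9, 85.2, 74.0]
summary = summarise_readings(samples)
print(summary)  # a handful of bytes instead of the full sample stream
```

Only the summary dictionary ever leaves the device, which is what cuts the bandwidth and latency costs described above.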

Edge computing real world examples

Neither centralised off-site cloud nor on-premises network architectures were built to operate right alongside data sources. Yet in many cases applications need extremely low data latency, and bringing computing closer to the point of data origin also reduces bandwidth needs (and costs).

Just some examples of sectors where it's finding a home include:

Manufacturing

Managers and process owners are increasingly reliant on data to adapt to changing conditions quickly in manufacturing. Known as smart manufacturing or Industry 4.0, information emerging from the elements of the data-enabled plant can be analysed and used to improve workflows, respond to weak links or choke points, and pre-empt breakdowns.

For example, robots in smart manufacturing environments will operate autonomously, and augmented or virtual reality systems will help staff better visualise metrics about processes and improve safety.

Systems like the above not only need data (and a lot of it), but the processing needs to happen as close as possible to the data source. That's giving edge computing a new niche in manufacturing as computing is brought ever-closer to the machines and tools on factory floors.

Oil and gas

Some edge computing hardware is specifically made to operate in tough environments, which makes it ideal for resources industries such as oil and gas. With high temperatures, extreme atmospheric conditions, intensive and constant motion, dust, and a host of other effects, these industries use edge computers to ensure the smooth running of processes and workflows in some of the most challenging environments.

In the case of an offshore oil rig, sending data via satellite to an onshore data processing centre is not only prohibitively expensive, given the amount of data produced, but takes much longer in machine terms than processing it right on site.

Autonomous vehicles

When it comes to road safety, a few milliseconds can often mean the difference between life and death. Autonomous vehicles are required to make decisions near-instantly, and sending information off to a server and waiting for a recommended action to be beamed back would never be practical.

On-board data processing can synthesise information from on-board cameras and other sensors with near-immediacy, and as the infrastructure of autonomous vehicles and smart roads expands, it will do the same with data from other cars, traffic signals and smart signs as it passes them, processing it all locally.

Agriculture

Just like oil and gas, farming and agriculture tend to happen far away from the data processing infrastructure we mostly find in cities or urban areas. That distance also means any data collection or signal hardware is exposed to the vagaries of the outdoors – rain, sunlight, heat, wind, and so on.

But while immediacy of action isn't as necessary for the sector, the cost of transporting such vast datasets to a cloud or even an on-premises processing environment back at the farmhouse control centre can be high. That means computing done at the source of data collection – using technology like digital thermometers, rain gauges, and pH meters – can make a farm or greenhouse much more efficient.

The new workplace

But before you think edge computing is only for intensive or highly physical environments, it's also found a surprising new home in the shadow of COVID lockdowns. As it became easier to deploy and manage, many businesses around the world adopted edge computing to help facilitate more dispersed working practices.

Edge computing has let people work not just from home but also from remote locations that aren't close to widespread network infrastructure, especially if they perform data-intensive duties and the average home broadband connection isn't sufficient for heavy data transport.

And because it means better access not just to general office applications but also to higher-fidelity, better-connected video content for meetings, seminars, and the like, it's also an area many think will both drive and be further developed by wireless technologies like 5G.

Security risks at the edge

Mockup of a padlock covered in blue and red neon code denoting ransomware, malware, and security

(Image credit: Getty Images)

Of course, moving any process, application or tool outside the confines of the secured domain in an organisation poses a risk. It might be as simple as a user accessing company data on their own handheld or mobile device while unaware something they've installed for personal use has a keylogger.

The more signal devices and tools we add to networks, the more we potentially expose them to bad actors, especially in the IoT age.

Unauthorised access remains one of the most pervasive threat vectors. Many devices are deployed with default security credentials, so hackers or cyber criminals may already know the out-of-the-box passwords needed to access them – piggybacking on the device's own network access to infiltrate the whole operation and potentially getting their hands on sensitive data.
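One simple way to audit for that risk is to sweep a device inventory for factory-default username/password pairs. The credential list and device records below are hypothetical, and real audits would check live devices rather than a static inventory – this is just a sketch of the idea.

```python
# Hypothetical audit: flag devices still using factory-default credentials.
DEFAULT_CREDENTIALS = {("admin", "admin"), ("admin", "password"), ("root", "root")}

def flag_default_credentials(devices):
    """Return the IDs of devices whose (username, password) pair matches
    a known factory default. Records here are illustrative."""
    return [d["id"] for d in devices
            if (d["username"], d["password"]) in DEFAULT_CREDENTIALS]

fleet = [
    {"id": "cnc-01", "username": "admin", "password": "admin"},
    {"id": "cam-02", "username": "ops", "password": "x7!kQ2"},
]
print(flag_default_credentials(fleet))  # ['cnc-01']
```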

Edge computing also means potentially exposing a network of distributed system gateways, and every attached node, to attack. Edge computing owners need to put more effort into establishing and maintaining standardized security policies across the entire device network, and each tool might have operation-specific vulnerabilities that demand another level of security awareness.

Physical access points like IoT devices also give hackers a whole new attack surface to target – a cyber criminal posing as a potential customer, getting a factory tour, and placing a signal transmitter on a CNC machine might sound like a spy thriller, but it offers a relatively simple and low-cost method of infiltrating a network.

Edge computing and AI

AI concept image showing data infrastructure symbolized by digitized blocks.

(Image credit: Getty Images)

Just like almost every other technology, edge computing will be ineluctably changed by artificial intelligence.

The advantages and benefits you're using edge computing for – predictive maintenance, anomaly detection, etc – are supercharged with AI. Fixes and changes created and deployed by machine learning algorithms can happen right there on the device.

And because data transport and latency concerns are all but eliminated, AI in edge computing will help usher in time-sensitive applications like self-driving vehicles or automated manufacturing that responds to input levels or market conditions in real time.

In fact, AI is even capable of improving the infrastructure that edge computing relies on. First, it can further streamline operations by predicting and allocating computing resources among edge computing nodes on the fly, as variables in the system change. Second, AI can spot and respond to anomalous activity that might indicate a cyber breach – and shut it down pending further investigation – much faster than human security staff.
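The anomaly detection described above can be sketched as a rolling statistical check that runs entirely on the device – a minimal stand-in for the machine learning models real deployments would use. The window size, threshold, and readings here are illustrative assumptions.

```python
from collections import deque
from statistics import mean, stdev

class EdgeAnomalyDetector:
    """Minimal on-device anomaly detector using a rolling z-score.
    Window size and threshold are illustrative, not tuned values."""
    def __init__(self, window=20, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def is_anomalous(self, value):
        if len(self.history) >= 3:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                return True  # flagged locally, no round trip to the cloud
        self.history.append(value)  # normal values extend the baseline
        return False

detector = EdgeAnomalyDetector()
readings = [50.1, 50.3, 49.9, 50.2, 50.0, 95.0]  # last value is a spike
flags = [detector.is_anomalous(r) for r in readings]
print(flags)  # [False, False, False, False, False, True]
```

Because the check happens on the node itself, the device can act on a flagged reading immediately instead of waiting on a central server.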

Edge computing and 5G

Worker is installing 5G Telecommunication Base Station

(Image credit: Getty Images)

One of the most compelling examples of edge computing in action is in the rollout of 5G technology, and many see the two technologies as inextricably linked.

Although the next generation of networking technology boasts never-before-seen speeds, the reality is that the average customer and business won't be able to take full advantage of these, largely because where and how you access 5G will make an enormous difference to how well the network performs.

In dense urban areas, for example, 5G networks are heavily reliant on small cell technology in order to relay signals around physical objects that otherwise block access. In rural settings, you may find that a 5G connection is no better than previous generations if infrastructure is lacking.

Even if a business is able to secure a strong 5G connection, the technology is still sending data to a data centre for processing, which may not be the best option for those requiring instantaneous data – even with the faster transfer speeds. That's where edge computing can come in. Many 5G companies have already started to develop their own edge development platforms that allow for the building of applications capable of using the instant data processing of edge computing alongside the ultra-fast connections supported by 5G.

What next for edge computing?

Overall, edge computing is skyrocketing. Driven by increased data production and demand for real-time processing, lower latency, and greater processing efficiency, the edge computing market will grow to $33.8bn by 2028, compounding at 29.8% per year, according to a recent report.

It's also expanding along with the growth of the IoT market, with 30 billion devices expected to be online by next year, many of which will need the local, on-board processing edge computing offers to foster real-time decision making.

Healthcare, manufacturing, smart city design and resources are all set to benefit from and further drive edge computing growth. In just one example, one forecaster predicts healthcare will lead the way through the use of applications like remote patient monitoring and real-time diagnoses.

As mentioned above, expanded network capabilities thanks to technologies like 5G will also usher the edge computing age in faster, with observers and hardware makers as varied as Gartner and Cisco touting the benefits and opportunities of streamlined data transfer and connected devices.

Drew Turney
Freelance journalist

Drew Turney is a freelance journalist who has been working in the industry for more than 25 years. He has written on a range of topics including technology, film, science, and publishing.

At ITPro, Drew has written on the topics of smart manufacturing, cyber security certifications, computing degrees, data analytics, and mixed reality technologies. 

Since 1995, Drew has written for publications including MacWorld, PCMag, io9, Variety, Empire, GQ, and the Daily Telegraph. In all, he has contributed to more than 150 titles. He is an experienced interviewer, features writer, and media reviewer with a strong background in scientific knowledge.
