Shifting computing power to the edge of your network

by Paul Gysen - Cloud & IM Consulting Lead

If the history of the computer industry is indeed, as is so often claimed, characterised by periodic swings between centralised and distributed computing, the advent of edge computing presents us with another shift towards a more distributed computing model. But what is driving this new swing of the pendulum, expanding the now well-established paradigm of cloud computing?


After the move away from the central mainframe of the 1970s to the distributed client-server model of the 1980s’ PC era, we witnessed a gradual return to a more centralised computing model from the 1990s onwards. The driving forces behind that paradigm shift were mobile computing and cloud computing. Together, these two concepts allowed us to centralise our computation once again, much as we did in the mainframe era, while using our smartphones and other thin client devices as simple terminals. It felt like a match made in IT heaven, offering the best of both worlds: virtually unlimited computing capacity combined with ubiquitous connectivity.

Edge computing: what’s in a name?

These days, with the rise of edge computing, the pendulum is swinging in the other direction again. Simply put, edge computing means that part of the data processing is carried out closer to the source of the data, instead of first sending all of it halfway around the world to a data centre for processing in the cloud. Apart from lowering bandwidth consumption and latency, this can also improve security and reduce costs.
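As a toy illustration of this idea (the function and field names below are my own, not taken from any particular IoT platform), an edge node might aggregate a window of raw sensor readings locally and forward only a compact summary to the cloud, rather than every individual sample:

```python
def summarise_window(readings):
    """Reduce a window of raw sensor readings to the small summary
    payload that actually gets sent upstream to the cloud."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# 1,000 raw temperature samples collected at the edge...
raw = [20.0 + (i % 10) * 0.1 for i in range(1000)]

# ...become a single four-field payload for the cloud.
summary = summarise_window(raw)
print(summary["count"], "readings reduced to one summary:", summary)
```

The bandwidth saving here is the ratio between the raw window and the summary; the trade-off is that the cloud no longer sees the individual samples.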

Judging from this definition, edge computing is in a certain sense an extension of the cloud. It does not, however, spell the end of cloud computing, as some might like to proclaim. As is often the case with new technology concepts, cloud computing and edge computing each remain relevant to specific applications or use cases, and will therefore work very well alongside each other.

Main driver: Internet of Things

Now that the Internet of Things (IoT) has entered the game, the trend towards edge computing is accelerating significantly. Smart IoT devices and sensors produce ever-growing volumes of data that need to be processed and therefore exchanged with your computing environment. The responsiveness of these exchanges can be critical to the optimal functioning of the IoT end-point devices and sensors themselves.

Processing large amounts of data in a traditional cloud data centre requires high-speed, high-bandwidth connectivity, which is not available or sufficiently stable everywhere – or only at a hefty price. That is where edge computing comes in: its key driver is reducing the network latency between end-point devices and the computing environment that serves them. The more interactions those devices request from that environment, the closer its services ought to be brought to them.
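To make the latency argument concrete, here is a back-of-the-envelope sketch. The distances and the fibre speed are illustrative assumptions (light in optical fibre travels at roughly 200,000 km/s), and real-world latency also includes routing, queueing, and processing delays on top of pure propagation:

```python
# Rough propagation-delay comparison: distant cloud vs. nearby edge node.
# Assumption: signals travel through fibre at ~200,000 km/s.
FIBRE_SPEED_KM_S = 200_000

def round_trip_ms(distance_km):
    """Round-trip propagation delay in milliseconds over a given distance."""
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000

cloud_rtt = round_trip_ms(6000)  # distant cloud data centre, ~6,000 km away
edge_rtt = round_trip_ms(50)     # nearby edge node, ~50 km away

print(f"cloud: {cloud_rtt:.1f} ms round trip, edge: {edge_rtt:.2f} ms round trip")
```

Even before any congestion or processing overhead, a device talking to a node 6,000 km away pays a physics-imposed round-trip delay over a hundred times larger than one talking to a node 50 km away.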

From a networking perspective, a traditional cloud computing environment is usually quite distant from the actual end user, since it is centralised in the data centre of some cloud service provider. Edge computing, by contrast, sits close to where it is needed. And that, in a nutshell, is its key strength.

In my next blog post I will present some of the potential benefits offered by edge computing.