What is edge computing?
Simply put, edge computing is the practice of using computational power at a remote network end-point device to process the data the device generates and send back only the key information.
The purpose of this is to reduce network traffic and shift processing away from the network core, so that the core can support greater numbers of devices without impacting latency or bandwidth.
The result is greater availability of critical information, actionable intelligence, increased network reach, and higher device counts.
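The idea above can be made concrete with a minimal sketch. This is not any particular product's code; the window size, alert threshold, and summary fields are illustrative assumptions. It shows an edge device summarising a window of raw sensor readings locally and transmitting a single small payload instead of the full stream.

```python
from statistics import mean

# Illustrative assumptions: one reading per second, summarised per minute,
# with a hypothetical alert threshold (e.g. temperature in degrees Celsius).
WINDOW_SIZE = 60
ALERT_THRESHOLD = 80.0

def summarise_window(readings):
    """Reduce a window of raw readings to the key information an
    edge device would actually transmit upstream."""
    return {
        "mean": mean(readings),
        "max": max(readings),
        "alert": max(readings) > ALERT_THRESHOLD,
    }

# Instead of streaming all 60 raw readings, the device sends one summary.
raw = [72.0 + (i % 10) for i in range(WINDOW_SIZE)]
summary = summarise_window(raw)
```

One small dictionary replaces sixty raw values on the wire, which is exactly the traffic reduction the core network benefits from.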
Is edge computing the future?
To understand this, we need to consider the evolution of computing technology, the route it has been taking, the cycles it endures, and the way in which it evolves.
Much like human evolution, computing progresses in stages and has developed patterns, cyclic behaviour, and a degree of predictability. It is already clear that edge computing is not the future in itself but part of a greater picture.
Edge computing represents a few steps in the journey towards truly ubiquitous computing, blending dynamic adaptive mesh networking and distributed processing to generate intelligent responses and gather information across what will likely become AI neural networks.
Does this sound a bit far-fetched? It's not, actually. The purpose of edge processing is to provide usable data from IoT and to minimise the amount of data that needs to be sent for decisions to be made.
Reducing data beyond compression, down to actionable intelligence, requires algorithms that distil the data and create the inferences needed to drive decision making. As processing capability increases and evolves, many IoT and edge devices cross the line from simple processing platforms to AI processing platforms.
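To illustrate what "distilling data into inferences" can mean in practice, here is a minimal sketch of on-device inference: a rolling statistical test that turns a stream of raw readings into a single actionable judgement (anomalous or not). The window size and z-score threshold are illustrative assumptions, not values from any specific deployment.

```python
from collections import deque
from statistics import mean, pstdev

class EdgeInference:
    """Distil raw readings into an inference instead of forwarding them."""

    def __init__(self, window=30, z_threshold=3.0):
        # Keep only a bounded history on the device itself.
        self.history = deque(maxlen=window)
        self.z_threshold = z_threshold

    def observe(self, value):
        """Return an inference for this reading, not the raw value."""
        if len(self.history) >= 2:
            mu, sigma = mean(self.history), pstdev(self.history)
            anomalous = sigma > 0 and abs(value - mu) / sigma > self.z_threshold
        else:
            anomalous = False  # not enough history to judge yet
        self.history.append(value)
        return {"anomaly": anomalous}

# A steady signal followed by a spike: only the spike is flagged.
detector = EdgeInference()
results = [detector.observe(v) for v in [10.0] * 20 + [10.2, 55.0]]
```

A production device might replace the z-score test with a trained model, which is the step the text describes as crossing from simple processing platforms to AI processing platforms.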
What about traditional networking infrastructure?
So, this explains what happens at the edge itself, but why will we see traditional networking infrastructure continue to change and evolve?
Logic dictates that, even with AI, there will come a point where it becomes beneficial to combine processing from multiple nodes, and where high-bandwidth nodes begin to compete for resources in increasingly congested networks.
New challenges require new solutions, and solutions are what I, like many others, have dedicated my working life to: creating technical hardware and software solutions that enable the evolution of smart edge and IoT technology to meet the demands of today, tomorrow, and the future.
The solution for edge computing then becomes enabling nodes to work together, not only to provide greater processing power but also to negotiate alternative data transport routes with the network infrastructure itself and, if needed, pass data from node to node to bypass network congestion.
None of these activities is in fact new, but the capability to use AI to coordinate autonomously, maximising data potential and network capacity, will change the nature of edge computing as devices become both end-points and, when working together, data transport hubs.
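The node-to-node bypass described above can be sketched with a classic shortest-path search over link congestion costs. The mesh topology and cost values here are hypothetical, and in a real deployment nodes would negotiate these costs dynamically; the sketch only shows the routing decision itself.

```python
import heapq

def least_congested_path(links, source, dest):
    """Dijkstra's algorithm over per-link congestion costs.
    links: {node: [(neighbour, congestion_cost), ...]}"""
    queue = [(0, source, [source])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dest:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, link_cost in links.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + link_cost, neighbour, path + [neighbour]))
    return None

# Hypothetical mesh: the direct link A -> core is congested (cost 10),
# so traffic is relayed node-to-node via B and C instead.
mesh = {
    "A": [("core", 10), ("B", 1)],
    "B": [("C", 1)],
    "C": [("core", 1)],
}
cost, path = least_congested_path(mesh, "A", "core")
```

The relay path costs 3 against the congested direct link's 10, so node A forwards its data through B and C, acting as the end-point-plus-transport-hub the text describes.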
There is massive investment at the edge because it can provide long-term value and savings; however, it requires new thinking and new generations of performance and hardware in the devices and associated equipment to facilitate the journey.
However, this is not the whole story: we are already augmenting the edge with AI, further reducing the traffic required to determine outcomes and decisions, and reducing or enhancing data beyond simple compression techniques.
Edge computing, quite simply, is the start of a big shift in distributed computing.