
Decentralized Computing: A New Competitive Edge

Co-authored by Moustafa Moustafa and Simon Clark


In today’s digital age, data can be a by-product of any interaction or activity. Storing, processing and presenting this data back is fast becoming a crucial capability for organisations to thrive, as discussed in our recent blog “Will data replace cash as the lifeblood of an organisation”. More and more organisations are awash with data, requiring ever more computing power and network connectivity to process it and draw out meaning to inform business strategies and decisions.


In this blog we:

  • Summarise the current cloud computing model, its uses and drawbacks

  • Describe the elements of edge computing that enable better decision making

  • Explore some exciting applications of edge computing

  • Provide a Julius and Clark perspective on barriers organisations need to overcome in adopting the edge computing model


Cloud Computing: Limitations For Growth

Most of us will agree we are firmly in the age of cloud computing. Both individuals and organisations rely on cloud services from a variety of providers for activities such as storing files, streaming videos online, cloud accounting software and online collaboration. All of this can be done without occupying large amounts of space on an individual’s hard drive. In fact, cloud computing is more common than you may think: when you use video conferencing software, or ask Siri about the weather, data is compressed, transferred globally to cloud servers, decompressed and processed, and a response is sent back to the end user in near real time.


A major advantage of cloud computing is that its infrastructure does not require active management by end users; it is managed by the providers, usually computing giants such as Amazon’s AWS, Google Cloud, IBM Cloud and Microsoft Azure.

However, as our world continues to change, flaws of cloud computing are starting to emerge.


Latency and bandwidth are the most significant challenges with cloud computing. Latency is the delay between sending data and receiving a response. Bandwidth is the rate at which data can be transferred.


A good analogy is a return motorway journey between two cities. Bandwidth is the number of lanes on the motorway, determining how much traffic (data) can flow at once, while latency is how long the journey takes. You might have high bandwidth from your organisation to the cloud (the ability to send a lot of data at once), but if the return route is restricted (fewer motorway lanes available), it will take longer to get a response back. If bandwidth is too restricted, the usability of software suffers, e.g. a glitchy video conference. With organisations generating data at a scale never seen before, those that require real-time feedback for data-heavy, time-critical operations are looking at new ways to maximise bandwidth and reduce latency. This is where edge computing comes in.
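As a rough illustration of how these two quantities combine, a back-of-the-envelope model might look like the sketch below. The function name and the example figures are ours for illustration, not drawn from any specific provider:

```python
def round_trip_seconds(payload_mb, uplink_mbps, downlink_mbps, latency_ms):
    """Rough model: total time = propagation delay in both directions
    plus the time to transmit the payload each way."""
    uplink_time = (payload_mb * 8) / uplink_mbps      # seconds to send
    downlink_time = (payload_mb * 8) / downlink_mbps  # seconds to receive
    return 2 * (latency_ms / 1000) + uplink_time + downlink_time

# A 10 MB payload: wide lanes both ways vs. a restricted return route
fast = round_trip_seconds(10, uplink_mbps=100, downlink_mbps=100, latency_ms=40)
slow = round_trip_seconds(10, uplink_mbps=100, downlink_mbps=10, latency_ms=40)
```

Even with an identical uplink and latency, the restricted downlink makes the round trip several times slower, which is exactly the “fewer motorway lanes on the way back” effect described above.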



The New Era: Living Life On The Edge

What then is “edge computing” and what opportunities does it bring? Edge computing involves bringing computational processing and storage (edge nodes) geographically closer to the Internet of Things (IoT) devices generating data (edge devices), such as sensors, smartphones and even connected security cameras. Having computational power closer to the data-generating edge devices reduces latency, as you are not reliant on a central server hundreds or even thousands of miles away. This allows faster reactions and decision making, enabled by the near real-time data processing of local edge nodes.


Applications Of Edge Computing

Applications of edge computing are incredibly broad. From a production-site perspective, think of sensors that monitor the condition and health of manufacturing equipment to aid predictive maintenance, or an internet-connected thermal video camera screening employees for potential COVID-19 infection as they enter a building.


Whilst a few devices producing data on a network can transmit that data quite easily, problems arise as the number of devices transmitting at the same time grows. Instead of one or two sensors or a few thermal video cameras transmitting footage, imagine hundreds or thousands of devices across a production site all transmitting and receiving data at once.


As the number of IoT devices grows, the cost of the increased cloud service bandwidth could be astronomical, and the quality of services such as live video feeds could suffer due to latency issues.


Edge computing can solve the above problems. An edge node can process data from many edge devices (smartphones, sensors, video cameras etc.) and send only the relevant data up to the cloud, reducing bandwidth needs. In addition, an edge node can process the relevant data and send results back to the edge device extremely quickly. Where facial recognition or thermal measurements of employees are required, this local processing would be significantly quicker and cheaper than a cloud-based application. Algorithms on a local edge node could assess locally whether an employee is sick, and only the faces of potentially infected employees would need to be sent to the organisation’s central cloud database, so that the organisation can track and monitor cases per production site and support employees in a coordinated way.
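The filtering step described above can be sketched in a few lines. The `Reading` class, the threshold value and the device names here are illustrative assumptions, not a real API:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    temperature_c: float

FEVER_THRESHOLD_C = 38.0  # assumed alert threshold, for illustration only

def filter_for_cloud(readings):
    """Keep only readings that need central attention; everything
    else is handled (and discarded) locally at the edge node."""
    return [r for r in readings if r.temperature_c >= FEVER_THRESHOLD_C]

batch = [Reading("cam-1", 36.6), Reading("cam-2", 38.4), Reading("cam-3", 36.9)]
to_cloud = filter_for_cloud(batch)  # only the flagged reading is forwarded
```

Here the edge node ingests every reading but forwards only one of three to the cloud, which is how the bandwidth saving arises in practice.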



Barriers To Overcome For Edge Computing

Julius and Clark see the following barriers organisations must overcome to adopt edge computing:

1. Network Bandwidth

Historically, organisations have allocated higher bandwidth to central servers and data centres. With more computation and data being processed and stored at the edge, organisations will need to adjust bandwidth across their networks to reflect the fact that more network activity will be carried out geographically closer to where devices are located. Whilst 5G brings the promise of increased connectivity, bandwidth allocation strategies will still be important in making effective use of that improved connectivity.

2. Device Diversity

As a greater variety of edge devices comes online, organisations will need to ensure that each device is compatible with the others. Alongside an end-to-end architecture approach, architects and developers can use technologies such as robotic process automation to facilitate such integration. The transition to edge computing will require a robust network infrastructure able to translate between the different protocols and data formats these devices use.


3. Device Monitoring and Maintenance

By bringing elements of the IT infrastructure back in-house, maintenance becomes an important factor. In such a data-driven world, any period during which a local server is unavailable has a significant impact on edge computing. This will require advanced IT monitoring solutions and increased local IT skills. In addition, IT teams will need procedures in place for when an edge device or edge node fails, e.g. a local cache or standby edge nodes.


4. Security

When an organisation’s IT infrastructure is centralised around a few data centres, both software and physical security can be standardised. With edge computing, cloud-to-node-to-device interactions are carried out at a local level. This challenges organisations to apply the network security models and physical security that would normally be observed in centralised data centres. For example, if an edge node or edge device on the network edge is hacked, all the devices connected to that node could also be compromised.


5. Data Storage and Backup

Part of the identified need for edge computing comes from the fact that more and more locations are collecting large amounts of data, much of which is sent to the cloud at considerable cost. If less data is sent to the cloud, organisations need a data strategy that accounts for all local data and stores it in an accessible way, in accordance with data handling rules and laws.



Conclusion

With more IoT devices coming online, the sheer volume of data being moved around the world means an edge computing model could be the best way forward for organisations looking to save money and speed up business operations compared with cloud-only offerings.


Even as edge computing begins to pick up, cloud computing will remain crucial for digital infrastructures across the globe. The 2020 pandemic has tested just how robust global digital infrastructures are, as more people than ever before work from remote locations. Cloud providers have so far done a good job of swiftly redesigning the architecture of these data ecosystems to ensure a steady flow of data.


The question remains how sustainable a cloud-only approach will be in the long run, as more devices come online and decisions need to be made faster and closer to where data is generated.


