Drivers and benefits of Edge Computing
By Andrew Kirker, General Manager, Data Centre, Schneider Electric
Monday, 11 July, 2016
Edge Computing can provide a better end-user computing experience and reduced data costs.
The age of the Internet of Things (IoT) is in full swing and gaining momentum. Cisco predicts there will be 50 billion connected things — mobile devices, smart appliances, sensors, cars and industrial machines — by 2020. Our connected lives are generating unprecedented demand for computing power, connectivity and quality of service, especially in terms of latency. This is driving the placement of more data centres closer to the user or data source, an approach known as the Edge.
Generally regarded as the architecture of the future, the rise of Edge Computing is gaining popularity as an alternative to conventional approaches where the data centre can be remote and geographically distant from the user.
At the same time, mobile telecom networks and data networks are converging into a cloud computing architecture. To support this, computing power and storage are being deployed out at the network edge to reduce data transport time and increase availability.
Welcome to the Edge
Edge Computing places data acquisition and control functions, storage of high-bandwidth content and applications closer to the end user and devices (such as smartphones, tablets and sensors). It is inserted into a logical end point of a network (internet or private network), as part of a larger cloud computing architecture.
In general, there are three types of Edge Computing: local devices, localised data centres and regional data centres.
Local devices are sized to accommodate a defined and specified purpose. Deployment is immediate and they are suitable for home or small office applications, eg, a building security system or local video content stored on a digital video recorder. Another example is a cloud storage gateway, a device that enables users to integrate cloud storage into applications without moving the applications into the cloud itself.
Localised data centres have 1–10 racks, provide significant processing and storage capabilities and are fast to deploy in existing environments. These centres are often available as configure-to-order systems, pre-engineered and then assembled on-site. Another form is the prefabricated micro data centre, which is assembled in a factory and dropped on-site.
These single-enclosure systems can be fitted with standard IT enclosures suited to an office environment, or with rugged enclosures that are rainproof, corrosion-proof and fireproof. The single-rack versions can leverage a building's existing power and cooling to save on CAPEX, rather than requiring a new dedicated site; installation is a matter of choosing a location close to the building's power and fibre source. The multirack versions are more capable and flexible due to their scale, but require more planning and installation time and need their own dedicated cooling. These 1- to 10-rack systems suit a broad base of applications requiring low latency, high bandwidth or added security.
Regional data centres have more than 10 racks and are located closer to the user and data source than centralised cloud data centres. Due to their scale, they have more processing and storage capabilities than localised data centres. Even if they are prefabricated they will take longer to build due to construction, permitting and local compliance issues. They’ll also need dedicated power and cooling sources. Latency will be dependent on the physical proximity to the users and data as well as the number of hops in between.
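The latency relationship described above can be roughed out with back-of-the-envelope arithmetic. The sketch below is illustrative only: the per-hop delay and the figure of roughly 200 km per millisecond for light in optical fibre are assumed round numbers, not measurements.

```python
# Rough round-trip latency model: propagation delay plus per-hop
# processing delay. All figures are illustrative assumptions.

FIBRE_KM_PER_MS = 200.0  # light in optical fibre covers roughly 200 km/ms

def estimate_rtt_ms(distance_km, hops, per_hop_ms=0.5):
    """Estimate round-trip time to a data centre distance_km away
    across the given number of network hops."""
    propagation = 2 * distance_km / FIBRE_KM_PER_MS  # out and back
    processing = 2 * hops * per_hop_ms               # each hop, both ways
    return propagation + processing

# A localised edge site 50 km away over 3 hops...
edge_rtt = estimate_rtt_ms(50, 3)      # 0.5 ms propagation + 3 ms processing
# ...versus a distant cloud data centre 2000 km away over 12 hops.
cloud_rtt = estimate_rtt_ms(2000, 12)  # 20 ms propagation + 12 ms processing
```

Even with generous assumptions, the distant site's round trip is dominated by distance and hop count, which is exactly what moving compute to the Edge removes.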
Applications and use cases
So what does living on the edge look like? Let’s take a look at three applications of Edge Computing and the benefits delivered:
Application #1: High-bandwidth content distribution. Excessive latency creates traffic jams that prevent data from filling the network to capacity. The impact of latency on network bandwidth can be temporary, like a traffic light, or constant, like a single-lane bridge. The greatest sources of network congestion are video on demand, 4K TV and video streaming, the fastest-growing high-bandwidth applications.
In order to relieve network congestion, service providers are connecting a system of computers on the internet that caches the content closer to the user. This enables the content to be deployed rapidly to numerous users by duplicating the content on multiple servers and directing the content to users based on proximity.
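Directing users to content by proximity can be sketched as a nearest-node lookup. The node names and coordinates below are hypothetical, and straight-line distance on raw latitude/longitude stands in for a real network-distance metric, which would account for routing and hops.

```python
import math

# Hypothetical cache nodes: name -> (latitude, longitude)
CACHE_NODES = {
    "sydney": (-33.87, 151.21),
    "perth": (-31.95, 115.86),
    "brisbane": (-27.47, 153.03),
}

def nearest_node(user_loc, nodes=CACHE_NODES):
    """Return the name of the cache node closest to the user.

    Euclidean distance on lat/lon is a crude stand-in for the
    network-proximity metric a real content network would use."""
    return min(nodes, key=lambda name: math.dist(user_loc, nodes[name]))

# A user in Melbourne would be directed to the Sydney cache.
chosen = nearest_node((-37.81, 144.96))
```

In practice this decision is made by DNS or request routing inside the content network, but the principle is the same: serve each request from the replica closest to the user.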
Application #2: Edge Computing as IoT aggregation and control point. The technologies that will enable ‘smart’ everything in the future — whether cities, agriculture, cars or health — will require the massive deployment of IoT sensors. IoT can automate operations in two main ways: by gathering information about equipment and devices to monitor status or behaviour, and by using that information to provide visibility and control to optimise processes and resources.
The Industrial Internet of Things (IIoT), which harnesses sensor data, machine-to-machine communication, and control and automation technologies, will also generate large amounts of data and network traffic. As part of this, many savvy organisations are adopting operational intelligence (OI) to provide visibility and insight into their business operations. OI applies data-driven, real-time analytics to transform workplace processes, embed knowledge in systems and minimise the effect of workplace generational shifts.
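A common pattern at the aggregation point is for an edge gateway to reduce raw sensor readings to a compact summary locally and forward only that summary upstream, cutting network traffic. The sketch below is a minimal, hypothetical illustration of that idea.

```python
def summarise_readings(readings):
    """Reduce a batch of raw sensor readings to a compact summary
    that can be sent upstream instead of every individual sample."""
    if not readings:
        return None
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# A batch of one-second temperature samples collapses to one record,
# while still exposing the spike (24.9) that a control rule might act on.
temps = [21.0, 21.2, 21.1, 24.9, 21.0, 21.3]
summary = summarise_readings(temps)
```

Acting on the summary locally (for example, triggering an alarm on the maximum) is what makes the edge node a control point rather than a mere relay.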
Application #3: On-premise applications. Edge Computing transforms cloud computing into a more distributed architecture. The main advantage is that any kind of disruption is limited to only one point in the network. For example, a distributed denial-of-service (DDoS) attack or a long-lasting power outage would be limited to the Edge Computing device and the local applications on that device, as opposed to all applications running on a centralised cloud data centre.
Companies that have migrated to off-premise cloud computing can take advantage of Edge Computing for increased redundancy and availability. Business-critical applications, or applications needed to operate the core functions of the business, can be duplicated on-site.
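Duplicating a business-critical application on-site amounts to a simple failover rule: try the cloud instance first and fall back to the local edge replica if it is unreachable. A minimal sketch, with hypothetical service callables:

```python
def call_with_failover(primary, fallback):
    """Invoke the primary (cloud) service; on any failure,
    fall back to the local edge replica."""
    try:
        return primary()
    except Exception:
        return fallback()

# Hypothetical services: here the cloud endpoint is down,
# so the on-site replica answers instead.
def cloud_service():
    raise ConnectionError("cloud unreachable")

def local_replica():
    return "served locally"

result = call_with_failover(cloud_service, local_replica)
```

Real deployments would add health checks, timeouts and state synchronisation between the replicas, but the availability argument is captured by this fallback path.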
The future on the Edge
The exponential growth of data, driven by the IoT, is causing major bandwidth concerns for organisations as they struggle to understand where and how to best manage and process their data. This growing quantity of data will need to be processed and analysed in real time — Edge Computing can help by handling that processing close to where the data is generated.
Ultimately, Edge Computing can solve latency challenges by moving data closer to the end user, enabling companies to better leverage a cloud computing architecture and providing greater availability and access to data. The result is a better end-user computing experience and reduced data costs.