Docker - the next step in application virtualisation

By Andrew Sjoquist, CEO, ASE IT
Monday, 04 May, 2015


Docker will be the next revolution to overtake data centres and cloud computing.

Developed by a company of the same name, Docker is designed to build, ship and run distributed applications. It’s named after the now ubiquitous shipping containers used to haul cargo around the globe. Shipping containers made it easy to maximise cargo on ships through standardisation, and Docker technology is intended to make it easy to standardise distributed applications across the web.

There’s no argument that virtualisation has enabled the web as we know it today - without it, massive services such as Amazon, Azure and Google, as well as popular applications such as Dropbox, Facebook and others, would simply not be possible.

Docker, however, is the next level of virtualisation.

One of the issues with virtual machines is that they can waste a lot of resources. For example, a data centre host might be running 100 Windows virtual machines, with a large portion of the compute power going into simply running 100 copies of the operating system rather than the applications hosted on the VMs. What you end up with in the data centre is a sprawl of virtual machines - it works, but it's not the most efficient way of doing things.

Containers have been part of the Linux operating system since about 2008, and what they enable is operating-system-level virtualisation: a virtual environment with its own process and network namespaces, instead of a fully fledged virtual machine with its own kernel.
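As a minimal sketch of that kernel facility (assuming a Linux host with util-linux's `unshare` command and root privileges; this demonstrates namespaces directly, not Docker itself):

```shell
# Start a shell in fresh PID and network namespaces (requires root).
# Inside, the process tree and network stack are isolated from the host -
# this is the mechanism containers build on.
sudo unshare --pid --net --mount-proc --fork /bin/sh -c '
  echo "PID inside the namespace: $$"   # the shell sees itself as PID 1
  ip link show                          # only a loopback interface is visible
'
```

No hypervisor or guest operating system is involved: the isolated shell is just an ordinary process with a restricted view of the host's kernel.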

Docker takes an application and packages it together with all its dependencies so that it doesn't need a full virtual machine to run. This means we can load more applications and services onto each machine, and it also makes the apps more resilient and reliable. Overheads are reduced too: booting a Docker-ised application takes a couple of seconds, whereas booting a full virtual machine can mean a wait of up to a minute or so before it's up and running.
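As a rough sketch of what that packaging looks like (the base image, file names and port here are hypothetical, chosen purely for illustration), a Dockerfile declares the application together with its dependencies:

```dockerfile
# Hypothetical example: package a small Python web application and its
# dependencies into a single image. Names and versions are illustrative.
FROM python:3-slim                      # minimal base image, no full guest OS
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt     # dependencies baked into the image
COPY . .
EXPOSE 8000
CMD ["python", "app.py"]                # starts in seconds - no OS boot sequence
```

Built with `docker build -t myapp .` and started with `docker run -p 8000:8000 myapp`, the resulting container shares the host's kernel, which is why start-up is near-instant compared with booting a virtual machine.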

So what’s the upshot? Docker means you don’t need to spin up a full virtual machine for every single application you want to run. More of the server farm’s processing power goes to the applications themselves rather than to running guest operating systems. Overheads in the data centre drop, more applications can run and more can be done with fewer compute resources.

Docker is taking the technology world by storm. Giants like Microsoft, Amazon and Google have all expressed a lot of interest in it, and are falling over themselves to make the technology available on their cloud infrastructure.

Application virtualisation using technologies such as Docker is the next revolution that’s going to overtake data centres and cloud computing. Over the next few years, look for more efficiency and greater availability from cloud infrastructure - all thanks to the little application container that could.

Andrew Sjoquist is the founder and Chief Executive Officer of Sydney-based ASE and intabank. Building on 15 years' experience, including involvement in the federal government’s 2007 SkillsONE initiative and the Australian Digital Datacasting trials, he is widely engaged in various facets of the internet, telecommunications and IT systems industry.

  • All content Copyright © 2024 Westwick-Farrow Pty Ltd