Thinkpiece: Do we really need ‘the cloud’?

By Andrew Collins
Friday, 02 September, 2011


Sometimes in IT, it seems like vendors and users have totally different languages.

In such scenarios, vendors will describe their products in terms that are completely unrelated to how IT managers and IT implementers see the world.

Usually this is pretty benign, though annoying. For example, a bundle of word processing and spreadsheet software may, in the eyes of a vendor, become a “dynamic, multilayered set of office solutions that leverages your staff’s core competencies and allows intelligent information capture within the enterprise ecosystem”.

In most cases, those at the receiving end of this nonsense expect this sort of thing, and can separate marketing rubbish from reality easily enough. But sometimes there is a real disconnect between how vendors talk about a technology, and how it’s perceived by those that buy and use it.

This is increasingly the case for cloud.

While the concept is not new - depending on how liberal you are with your interpretation, you can potentially trace it back to the 1960s - the term ‘cloud computing’ has been repeatedly reinterpreted and qualified over the last ten years. We now have public clouds, private clouds, virtual private clouds, hybrid clouds, software-as-a-service, infrastructure-as-a-service, the Intercloud, and so on.

Where once the term ‘cloud computing’ meant one specific thing, it can now refer to any one of these disparate things - and each of them is a fundamentally different technology.

While there are clear definitions of the cloud (see NIST's definition of cloud computing), the term is an abstraction. It is a conceptual umbrella term that encompasses many different technologies that are philosophically related, but practically very different in terms of what they are, what they do and how they work.

As a consequence, when someone discusses the cloud, they are usually referring to one of the specific technologies that comes under the cloud banner, rather than the abstract concept of ‘the cloud’. This can lead to a great deal of confusion.

For example, it’s difficult to have meaningful discussions about the benefits of ‘the cloud’, because when you talk about the cloud you’re not talking about one singular thing - you’re talking about many different technologies, each of which has its own ups and downs, and its own best use cases. There are very few generalisations you can make that apply to each thing that lives under the cloud umbrella.

This confusion just makes life difficult for those who end up implementing and using these technologies. Just Google 'the cloud is' alongside your favourite expletive, and you’ll find a variety of colourful message-board discussions of these problems.

Given the confusion and frustration that stems from pushing all these disparate technologies and products under the vague banner of the cloud, we are left with a question: what value is there in lumping these things together?

While it’s nice for vendors to be able to group these things together under a conceptual banner, what value is there in this for end users, when each technology or product is markedly different? Aren’t we better off considering each in its own context, with its own pros, cons, and best use cases?

In other words: do we really need ‘the cloud’?

What do you think? Let us know.
