Australia's AI ambitions are being slowed by storage


By George Dragatsis, CTO & Director of Technical Sales ANZ, Hitachi Vantara
Wednesday, 14 June, 2023


Australia, like many countries, is rapidly growing its use of AI. So much so that the country has had a national AI action plan since mid-2021, and adoption has expanded from there.

Gartner research shows that on a global basis, 40% of organisations already have “thousands” of AI models in production. These are just the models that pass some sort of approval process — meeting quality, accuracy or value targets, for example. Forty-six per cent of built models never see the light of day.

The aggregate impact is a world where there are already hundreds of millions of digital apps and services in existence, the vast majority making use of AI.

In 2019, IDC predicted the global economy would reach “digital supremacy” in 2023, with over 500 million digital apps and services developed and deployed using cloud-native approaches. This was expected to rise to 750 million by 2025, with at least 90% being AI-enabled.

The figures speak to a growing interest in experimenting and understanding the potential applications of AI in different business domains — but also to the looming challenges many organisations must now confront.

Year on year, a growing proportion of organisations think of themselves as AI underachievers. These organisations, according to Deloitte, have done a “significant amount of development and deployment activity [but] haven’t adopted enough leading practices to help them effectively achieve more meaningful outcomes”.

In other words, the more organisations experiment with AI, the more they learn and the more accurately they can benchmark their progress. On the flip side, they also come to understand their weaknesses and where the key bottlenecks to increased scale and production use of the technology lie.

The AI bottleneck

In the evolution of enterprise AI, organisations have gone straight to experimentation and deployment of models at scale, without having some of the underlying foundational elements in place first.

The early focus of any AI program tends to be on the models themselves. It’s at this point where organisations start to encounter challenges. What worked on a small scale and with a limited training set may not scale up linearly for production use.

Model tweaking alone requires considerable effort and results in progressively smaller gains over time. The larger performance gains are locked up in what underpins the models: the source data, supporting infrastructure or a combination of the two.

Data preparation problems are common. It’s become a rite of passage for AI adopters to have to step back and clean up their source data before it gets ingested into a model.

An area for performance optimisation that receives comparatively less attention is the suitability of data infrastructure to support an organisation’s AI ambitions.

Delivering AI models and insights at the right speed depends on the performance of the data centre. Infrastructure may not be set up in a way that is optimal for AI use. Infrastructure teams may be more accustomed to configuring infrastructure to drive consistent and stable performance of monolithic applications. But AI is a different world, and its requirements quickly become apparent when dealing with how data is stored and accessed by the AI models.

Block storage has traditionally been used for high-throughput, high-I/O workloads such as ERP systems and other IT workloads, where low latency is important to achieving high performance and resiliency. But block-based platforms have been found lacking in their ability to meet the scale-out demands that the computational side of modern data workloads, such as AI and machine learning, creates.

Putting in place proper storage systems is as strategic as the advanced computational platform used to create and run the AI models and the cleanliness of the data that feeds the models.

Underpowering a complex distributed computation platform for AI or ML will deliver lower-performing results, diminishing the quality of the outcomes.

Specifying a target-state architecture

To support AI platforms and use cases, it’s important to ensure that the storage technology utilised can meet the demands of the AI-based models throughout the entirety of the data lifecycle.

It’s important to select a storage technology that can meet the scale requirements of the data needed to support model training efforts. If the storage system supports only a single tier of storage media, a separate data plane is required to move data between storage technologies. Otherwise, AI platforms will suffer from not having access to the data they need at any given time.
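The tiering behaviour a data plane provides can be sketched in a few lines. The following is a minimal, purely illustrative Python sketch (the class names, tier labels and time window are all assumptions, not any vendor's API): recently accessed objects are promoted to a fast tier, and cold objects are demoted back to capacity storage.

```python
from dataclasses import dataclass, field
import time

@dataclass
class DataObject:
    name: str
    tier: str = "capacity"  # "fast" (e.g. NVMe) or "capacity" (e.g. object store)
    last_access: float = field(default_factory=time.monotonic)

class DataPlane:
    """Toy data plane: promotes objects to the fast tier on access
    and demotes objects not touched within the hot window."""

    def __init__(self, hot_window_s: float = 60.0):
        self.hot_window_s = hot_window_s
        self.objects: dict[str, DataObject] = {}

    def access(self, name: str) -> DataObject:
        obj = self.objects.setdefault(name, DataObject(name))
        obj.last_access = time.monotonic()
        obj.tier = "fast"  # promote on access so training reads stay fast
        return obj

    def rebalance(self) -> None:
        now = time.monotonic()
        for obj in self.objects.values():
            if now - obj.last_access > self.hot_window_s:
                obj.tier = "capacity"  # demote cold data to cheaper storage
```

Real data planes make this decision from richer telemetry (I/O patterns, QoS policy, capacity pressure), but the promote/demote loop is the core idea.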

Ideally, AI modelling work is underpinned by a scale-out architecture where storage and traffic capacity can be easily increased, and where multiple compute nodes can connect as a single scale-out cluster, enabling workloads to run in parallel and expand with compute nodes as needed.
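The benefit of a scale-out design is that adding nodes raises aggregate throughput without changing how the workload is written. A hedged sketch of the idea, in Python with stand-in node names and a dummy read function (none of this reflects a specific product's interface): shard reads are fanned out across all nodes in parallel, so growing the node list grows capacity and bandwidth together.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical cluster membership; names are illustrative only.
NODES = ["node-0", "node-1", "node-2"]

def read_shard(node: str, shard_id: int) -> bytes:
    # Stand-in for a networked read from one storage node.
    return f"{node}:shard-{shard_id}".encode()

def parallel_read(num_shards: int) -> list[bytes]:
    """Fan shard reads out across all nodes in parallel.

    Adding entries to NODES increases concurrency without any
    change to the calling code — the essence of scale-out."""
    with ThreadPoolExecutor(max_workers=len(NODES)) as pool:
        futures = [
            pool.submit(read_shard, NODES[i % len(NODES)], i)
            for i in range(num_shards)
        ]
        return [f.result() for f in futures]
```

In a scale-up system, by contrast, the same loop would funnel every read through one controller, and feeding more GPUs would eventually mean replacing that controller rather than adding a node.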

There are two approaches that can help. The first is adoption of software-defined storage for block (scale-out) that features a native data plane to enable data to move seamlessly between scale-up and scale-out storage. A second solution that may be considered is a scale-out filesystem that supports object-based storage.

These promise the greatest economic return for AI adopters, reducing the labour needed to maintain the systems and removing the brittleness of traditional data movement, so that AI models are fed data seamlessly and can always produce their best work.
