Overcoming economic headwinds: CIOs' top priorities

By Ed Lenta, Senior Vice President and General Manager, Asia Pacific and Japan, Databricks
Monday, 19 June, 2023


The pandemic accelerated tech adoption, but it also caused many organisations to incur a great deal of technical debt. Now, as things have settled from the pandemic-related chaos, many businesses and their CIOs are focused on building IT infrastructure that is resilient and sustainable.

However, the current economic climate makes digital transformation especially challenging. Rising costs, driven by inflationary pressures and high interest rates, together with an acute Australia-wide tech talent shortage, are forcing CIOs to narrow their scope of work. IT chiefs must therefore focus squarely on activities that optimise business efficiency and deliver cost savings.

Below are five key priorities for CIOs to keep track of as they work to drive business goals in challenging conditions.

Moving beyond legacy systems

More than half of Australian organisations’ workloads are expected to be hosted on public clouds by 2025, according to specialist IT research and advisory firm ADAPT. Cloud migration enables organisations to innovate faster and be more agile. Of the Australian digital leaders who have already overseen their organisations’ cloud migration, 80% say they are satisfied with the returns their cloud transformation programs have delivered to date, research by consulting firm KPMG suggests.

By contrast, the cost of keeping workloads on legacy systems can be enormous. On-premises platforms such as Hadoop, along with legacy data management solutions like Netezza and Teradata, impose a heavy cost burden. For large enterprises, the licensing fees alone needed to maintain legacy systems can amount to millions of dollars. Hardware maintenance and utilisation can add as much as 20% to the total cost, on top of the DevOps burden of supporting such systems.

To remain competitive, businesses must make the switch from legacy systems to modern infrastructure.

Consolidating data

Data is any organisation’s most prized asset, but not every business is able to leverage its data effectively. In fact, 64% of Australian digital heads have stated that having data spread across disparate systems and applications is a key challenge to their data strategies. Data consolidation is needed to make sense of all the information an organisation holds.

AusNet Services exemplifies how organisations can execute an effective data strategy. Managing over 11 billion electricity, gas and network connection assets, the organisation needed a consolidated view of the energy and equipment used by its 1.5 million customers in order to serve them better.

The energy services provider centralised all of its data by migrating to the Databricks Lakehouse Platform on Azure. Since then, AusNet has achieved three times faster data processing and 50% cost savings.

AusNet Services’ experience highlights the urgent need for other organisations to untangle data to drive greater customer value and ensure that they are making data-driven decisions.

Building a unified data management architecture

Consolidating data to gain actionable insights is an important step towards streamlining valuable information and making sense of it. However, organisations must still address the overarching challenge of disparate technologies and multiple tech stacks, which are not sustainable in this economic environment.

Maintaining such data silos over time, with their associated infrastructure and plumbing, is now seen as an unnecessary overhead that doesn’t provide any additional value or competitive advantage. Adopting a single technology vision and architecture in today’s market is a must.

A data lakehouse approach to data management is a strategy that more and more companies are utilising to consolidate their business intelligence (BI) and artificial intelligence (AI) infrastructure. A data lakehouse is an open data management architecture that combines the flexibility, cost-efficiency and scale of a data lake with the data management capability of a data warehouse.

Industry analyst firm Gartner expects lakehouse adoption to reach critical mass in the next two to five years, according to the Gartner Hype Cycle for Data Management, 2022.

Mastering AI

In recent months, the excitement around ChatGPT has propelled organisations across industries to adopt AI. Interest in large language models (LLMs) surged by 1310% between the end of November 2022 and the beginning of May 2023, according to Databricks’ 2023 State of Data and AI report. The fact is, data- and AI-driven organisations outperform their peers. It is therefore imperative for organisations to master AI, whether generative AI or AI more broadly, to optimise business processes and create a competitive advantage over other organisations.

This is even more critical as businesses must prioritise cost containment and act with caution regarding digital investments amidst rising economic pressures.

For most businesses, building an LLM from scratch is not feasible, yet using third-party platforms poses a risk to proprietary information. To democratise generative AI, we launched Dolly 2.0: an open-source, instruction-following LLM, fine-tuned on a human-generated instruction dataset licensed for both research and commercial use. This type of technology empowers organisations to build their own LLMs.

CIOs must understand internal priorities and then match them with the right digital solutions that meet business needs and fit within budgetary constraints.

Effectively managing risk

With a series of recent high-profile data breaches in Australia, local businesses are under more data security scrutiny than ever before. More organisations now understand that risk, governance, compliance and security are fundamental data challenges.

Gartner estimates that poor data quality costs organisations an average of $12.9 million every year. An effective data governance strategy can help here, as long as it includes a focus on data quality. This can ensure that the provenance of data can be known, rules can be enforced on the data, and changes can be tracked to build trust in the results of data queries and analysis.

Data governance encapsulates the policies and practices needed to securely manage the data assets within an organisation. Leveraging a unified approach to managing data and analytics helps data leaders address the most common challenges when modernising their risk management practices.

Businesses can adopt a more agile approach when their data can serve multiple risk management use cases and they are no longer restricted to the narrow view of their individual use case.

CIOs are under immense pressure as they lead from the front and help their organisations navigate current conditions. However, by consolidating their data and utilising it effectively, businesses can enhance operational efficiency and deliver greater value.

  • All content Copyright © 2024 Westwick-Farrow Pty Ltd