Storage maps the future of digital data


By Angus MacDonald*
Thursday, 07 April, 2011


The one constant in the digital world is growth. It is estimated that from 2009 to 2020 the size of the digital universe will have increased 44-fold, which equates to roughly a 41% increase in capacity every year. Storing, locating and extracting value from high volumes of data will become increasingly complex. Oracle’s Angus MacDonald outlines a storage strategy to cope with this avalanche of data.

As the digitally enabled business world evolves, the mix of data and its anticipated uses will change as well. There is already an increased diversity of data types, with 80% of today’s data being unstructured, and the re-use of data is shrinking, with 80% of data never touched again after 90 days. At the same time, regulation and compliance dictate that data is adequately archived for long periods, in some cases a hundred years or more.

The fallout from the way data storage is currently handled is significant, and the environmental impact is a major part of it. Storage already consumes 40% of data centre power, and it is predicted that within 10 years the total energy consumed by storage solutions could be more than six times what it is today. Based on these predictions, storage could represent over 75% of the energy consumed within the data centre; and given that 80% of data is never looked at again after three months, storage is a major IT driver of wasted energy.

Another consequence is cost: the added expense of managing ever-growing volumes of data. Explosive data growth is causing storage costs to skyrocket, forcing IT organisations to look for more cost-effective archival and backup solutions. As the data explosion continues, the resources available to manage that data are not keeping up.

Data centre managers are often required to absorb 20 to 40% annual increases in data with the same, or a smaller, headcount and capital budget. In addition, the business-critical nature of data is driving up storage management costs by 25% per year, so in the long term storage will become the number one cost within many data centres. It is therefore increasingly important to align the value of data with the capabilities and cost of the storage it resides on.

Looking forward, storage management must become simple, easily accessible, cost efficient, environmentally friendly and streamlined, so organisations can operate faster and perform better.

Striving for nirvana

There are three essential elements that must be considered when formulating a storage strategy to meet growing data demands - the evolving function of the data centre, business drivers and the ‘nirvana’ storage solution.

Today’s typical data centre is migrating from a physical, static and heterogeneous set-up, to a grid-based virtualised infrastructure, to a cloud computing environment that enables self-service, policy-based resource management and capacity planning. Along the way, the storage solution must be able to support this style of data centre, so it is critical that the storage system is dynamic enough to support the difficult-to-predict demands of these application environments through a tiered approach.

Reducing cost was at the top of the CIO’s agenda yesterday; today it is business growth and profitability, and the storage strategy must fall in line with these objectives. Regardless of an organisation’s size, the storage solution must be able to scale to solve larger, more complex business problems, and it has to perform in real time so organisations can react and make business decisions immediately. Likewise, the infrastructure has to be efficient, so complex business problems can be solved at lower cost and greater speed, and data integrity must be built in to meet long-term business and regulatory compliance requirements.

Finally, there is the liberating exercise of defining a ‘storage nirvana’, should cost and incumbent infrastructure be no object. For a CIO, this would probably include on-demand secure data access, application-aware storage optimisation, unlimited capacity, scalable performance, appliance-like rapid deployment, and integrated application, system and storage management. Although this nirvana is still some distance away, these ideas should guide organisations onto a path of accelerated performance, improved profitability and lower IT costs.

A pyramid strategy

To make the strategy a reality, companies must shift away from the traditional approach of managing islands of storage and move to an automated, tiered and unified storage infrastructure.

By adopting a formula whereby certain data to be stored is assigned to certain storage pools, organisations will improve the price, performance, capacity and functionality of their storage infrastructure.

A typical tiered storage model has four tiers. The newly emerged Tier 0 uses flash memory, is extremely high performing and stores high-value information that must be captured, analysed and presented at high speed. Primary storage, classified as Tier 1, is based on fibre channel disk systems and should deliver high performance and high availability, with near-zero downtime and fast recovery, to support customer-facing and revenue-generating applications. Tier 2 storage should run on low-cost, high-capacity disks capable of handling broad business applications such as databases, backup, email and file systems. Finally, Tier 3, based on more cost-effective, energy-efficient tape technology, stores high-volume archival data that is retained for regulatory purposes and does not require immediate access.

To cope with today’s explosive data growth, many IT organisations are using a tiered storage approach that balances the cost of different types of storage media against application performance requirements.

However, to optimise the tiered storage architecture, companies must classify and value the data of the business, then map and assign it to the best-fit tier. Data can be classified into four categories: I/O-intensive data is assigned to Tier 0; mission-critical data, such as revenue- and customer-based applications, to Tier 1; vital data that does not require immediate recovery for the business to continue operating to Tier 2; and low-activity archival data with long-term retention periods to Tier 3.
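
As a rough illustration of this classification step, the sketch below maps data sets to tiers in Python. The category attributes, thresholds and tier assignments are assumptions made for the example, not part of any particular product; in practice the classification criteria would be defined by the business and its compliance obligations.

```python
from dataclasses import dataclass
from enum import IntEnum


class Tier(IntEnum):
    """Storage tiers from the four-tier model described above."""
    TIER_0 = 0   # flash: I/O-intensive, high-value data needed at speed
    TIER_1 = 1   # fibre channel disk: mission-critical applications
    TIER_2 = 2   # low-cost, high-capacity disk: vital business data
    TIER_3 = 3   # tape: low-activity, long-retention archival data


@dataclass
class DataSet:
    # Hypothetical attributes used to classify a data set; real criteria
    # would come from the business's own data-value assessment.
    name: str
    io_intensive: bool = False
    mission_critical: bool = False
    needs_fast_recovery: bool = False
    days_since_last_access: int = 0


def assign_tier(d: DataSet, archive_after_days: int = 90) -> Tier:
    """Map a data set to the best-fit tier, highest-value tier first."""
    if d.io_intensive:
        return Tier.TIER_0
    if d.mission_critical:
        return Tier.TIER_1
    if d.needs_fast_recovery or d.days_since_last_access < archive_after_days:
        return Tier.TIER_2
    return Tier.TIER_3


if __name__ == "__main__":
    samples = [
        DataSet("realtime-analytics", io_intensive=True),
        DataSet("order-processing-db", mission_critical=True),
        DataSet("current-email-store", days_since_last_access=30),
        DataSet("regulatory-archive", days_since_last_access=400),
    ]
    for s in samples:
        print(f"{s.name}: {assign_tier(s).name}")
```

Run as a script, the example assigns the analytics workload to Tier 0, the order database to Tier 1, the recently used email store to Tier 2 and the regulatory archive to Tier 3, mirroring the mapping described above.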

Leveraging economic prosperity

Leveraging a tiered storage environment has significant economic advantages. Research has shown that a single-tier storage environment has an average lifetime cost of $15,000 per terabyte, a two-tier environment $8,000 per terabyte and a four-tier structure $4,000 per terabyte. With the majority of the data residing in the archival Tier 3, which is built on tape, costs naturally fall. Likewise, an automated, systematic data-value mapping and distribution approach requires less administration and maintenance at the low end of the storage pyramid, reducing costs and freeing up staff time to focus on the mission-critical data.
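
To see how such a blended figure arises, the short calculation below works through a hypothetical example. Both the per-tier lifetime costs and the distribution of data across tiers are assumptions chosen purely for illustration; they are not figures from the research cited above.

```python
# Illustrative blended lifetime cost per terabyte across four tiers.
# Assumed per-tier costs ($/TB) and an assumed share of total data on
# each tier, with most data sitting on cheap archival tape (Tier 3).
tier_cost_per_tb = {0: 30_000, 1: 15_000, 2: 5_000, 3: 1_000}
data_distribution = {0: 0.02, 1: 0.13, 2: 0.25, 3: 0.60}

blended = sum(tier_cost_per_tb[t] * data_distribution[t] for t in tier_cost_per_tb)
print(f"Blended lifetime cost: ${blended:,.0f} per TB")
# With these assumptions: 600 + 1950 + 1250 + 600 = $4,400 per TB,
# versus $15,000 per TB if everything sat on a single high-end tier.
```

The exact numbers will differ for every organisation, but the mechanism is the same: the more data that can be pushed down the pyramid onto cheaper media, the lower the blended cost per terabyte.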

Such an approach to storage also reduces compliance risk and improves business continuity, as organisations will be able to satisfy legal and audit requirements more easily, which in turn improves service levels. Ultimately, organisations will see performance improvements as upgrades become easier, stale data is removed from production resources and there is less disruption to the production environment.

With growth, performance and profitability high on the C-level agenda, storage management can play a significant role in helping organisations to achieve these objectives.

*Angus MacDonald is the Chief Technology Officer for Oracle’s Systems Line of Business in Australia and New Zealand. He is responsible for articulating Oracle’s Systems product strategy and roadmap, collecting customer feedback and requirements, and driving the adoption of Oracle’s Systems technologies.
