Big data's need for data centre efficiency

Schneider Electric IT Australia

By Francois Vazille, Vice President of IT Business, Pacific Zone, Schneider Electric
Wednesday, 15 October, 2014



Our need to process and store information is increasing exponentially. It is estimated that Australian internet traffic will nearly triple from 2013 to 2018. Globally, we’ve moved into a period termed ‘the internet of everything’, or the interconnection of computing-like devices with existing internet infrastructure. This means that by 2020 there will be 50 billion connected devices - nearly seven devices per person.

This proliferation of devices, among other trends, is fuelling larger and more complex data sets - big data - which many organisations are using to solve challenging business problems. The big data and services market is growing rapidly: IDC forecasts it will expand at a 27% compound annual growth rate through 2017 - six times faster than the overall ICT (information and communications technology) market.

Big data has great potential, but to be of use it must be processed and analysed, and that requires data centres. Left unchecked, this growth will drive unnecessary energy consumption - much of it from electricity generated by fossil fuels - with poor outcomes both for underprepared companies and for the environment.

Big data and the data centre

While reliability and latency - the delay between a request reaching the data centre and its response - remain crucial to data centre operators, energy efficiency has become a priority as companies try to rein in the large cost of energy. Gartner estimates that energy accounts for 12% of all data centre costs, and that percentage is rising fast.

One of the best ways to improve data centre efficiency is to reduce the energy consumed by power, cooling and lighting infrastructure. Greenfield sites - completely new data centres built from the ground up - can be planned and designed for high efficiency from the outset, using modern hardware and software to build a stable infrastructure and push the PUE (power usage effectiveness) ratio down towards the ideal of 1.0.
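PUE is simply total facility power divided by the power delivered to IT equipment, so a ratio of 1.0 would mean zero overhead from cooling, power distribution and lighting. A minimal sketch of the calculation (the kilowatt figures below are illustrative only, not from the article):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT load.

    1.0 means every watt reaches the IT equipment; higher values
    mean more overhead from cooling, power delivery and lighting.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical loads: 1000 kW of IT equipment in two facilities.
print(round(pue(1800.0, 1000.0), 2))  # 1.8 - older, less efficient site
print(round(pue(1150.0, 1000.0), 2))  # 1.15 - efficient modern design
```

Tracking this ratio over time shows whether infrastructure upgrades are actually reducing non-IT overhead.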

For brownfield facilities that re-use elements of existing sites and make the most of existing building infrastructure, companies can adopt a 'data centre life cycle approach' to become more efficient and better placed to meet the demands brought on by big data. This approach examines each stage of the data centre's life span - from planning and design through to operations and assessment - to identify energy efficiency improvements under the loads and conditions the facility actually experiences.

Many specific tactics and upgrades will benefit the operations of each life cycle stage, including lower data centre operating temperatures, use of hot-aisle/cold-aisle configurations with better air containment, equipment with ‘economiser’ modes of operations and use of data centre infrastructure management (DCIM) software.

Companies wishing to use the data centre life cycle approach to help manage big data can start with assessments and baselining, comparing current performance against a historical metric. This examination can help pinpoint what's needed most.
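Baselining amounts to recording a reference figure and expressing current consumption as a change against it. A minimal sketch, with hypothetical monthly energy figures (the numbers are assumptions for illustration, not from the article):

```python
def efficiency_change(baseline_kwh: float, current_kwh: float) -> float:
    """Percentage change in energy use versus a historical baseline.

    Negative values indicate improvement (less energy consumed
    than in the baseline period).
    """
    if baseline_kwh <= 0:
        raise ValueError("baseline must be positive")
    return (current_kwh - baseline_kwh) / baseline_kwh * 100.0

# Hypothetical figures: last year's monthly average vs this month.
monthly_baseline_kwh = 42_000.0
monthly_current_kwh = 37_800.0
print(f"{efficiency_change(monthly_baseline_kwh, monthly_current_kwh):+.1f}%")
# -10.0% - a 10% reduction against the baseline
```

Running the same comparison per life cycle stage (design targets vs operational readings, for example) is one way to pinpoint where upgrades would pay off most.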

Modular, containerised data centres also bring many benefits including cost savings up front and - over time - increased reliability, improved efficiency and more. They are faster to deploy and have lower total cost of ownership (TCO) than ‘stick-built’ data centres. Recent analysis suggests that customers can expect a 30% reduction in TCO, a 13% decline in first-installed costs and a 60% decrease in time to deploy.

Considering the steep growth curve expected for big data, addressing data retention inefficiencies will also deliver real sustainability benefits for the environment. Relying on old techniques and aiming for average levels of energy efficiency is no longer an attractive option given the rapid growth in big data.

Organisations that adopt new approaches and services to ensure their facilities can handle the rapidly increasing volume - and cost - of big data retention will save money and cut down on inefficiencies. Those services must be sufficiently large, flexible, scalable and energy efficient to cope with the new demands.




  • All content Copyright © 2024 Westwick-Farrow Pty Ltd