Big data's need for data centre efficiency

Schneider Electric IT Business

By Francois Vazille, Vice President of IT Business, Pacific Zone, Schneider Electric
Wednesday, 15 October, 2014



Our need to process and store information is increasing exponentially. It is estimated that Australian internet traffic will nearly triple from 2013 to 2018. Globally, we’ve moved into a period termed ‘the internet of everything’, or the interconnection of computing-like devices with existing internet infrastructure. This means that by 2020 there will be 50 billion connected devices - nearly seven devices per person.

This proliferation of devices, among other trends, is fuelling larger and more complex data sets - big data - which many organisations are using to solve challenging business problems. Big data and related services are growing rapidly as a market. IDC forecasts that it will expand at a 27% compound annual growth rate through 2017 - six times faster than the overall ICT (information and communications technology) market.

Big data has great potential, but to be of use it must be processed and analysed, and that work happens in data centres. Unless efficiency improves, this growth will drive unnecessary energy consumption - much of it from electricity generated by fossil fuels - with poor outcomes both for underprepared companies and for the environment.

Big data and the data centre

While reliability and latency - the delay between a request reaching the data centre and its response - remain crucial to data centre operators, energy efficiency has become a priority as companies try to rein in the large cost of energy. Gartner estimates that energy accounts for 12% of all data centre costs, and this percentage is rising fast.

One of the best ways to improve data centre efficiency is to reduce the energy consumed by power, cooling and lighting infrastructure. Greenfield sites - completely new data centres built from the ground up - allow modern hardware and software technologies to be designed in from the outset. This involves planning and designing highly efficient facilities that push the PUE (power usage effectiveness) down towards the ideal ratio of 1.0.
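PUE is simply the ratio of total facility energy to the energy consumed by the IT equipment alone, so a ratio of 1.0 would mean every watt drawn goes to computing rather than cooling, power conversion or lighting. A minimal sketch, using hypothetical meter readings:

```python
# PUE = total facility energy / IT equipment energy
# Figures below are hypothetical monthly meter readings, not vendor data.
total_facility_kwh = 150_000   # everything the site draws, incl. cooling and lighting
it_equipment_kwh = 100_000     # servers, storage and network gear only

pue = total_facility_kwh / it_equipment_kwh
print(f"PUE = {pue:.2f}")  # prints "PUE = 1.50"
```

Here a third of the facility's energy is overhead; driving the overhead down moves the ratio towards 1.0.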

For Brownfield facilities that re-use elements of existing sites and make the most of existing building infrastructure, companies can adopt a ‘data centre life cycle approach’ to help those facilities become much more efficient and better placed to meet the demands brought on by big data. This approach examines each stage of the data centre’s life span to improve energy efficiency. The method takes into account the different output and conditions data centres may experience from planning and design, through to operations and assessment of the facility.

Many specific tactics and upgrades will benefit the operations of each life cycle stage, including lower data centre operating temperatures, use of hot-aisle/cold-aisle configurations with better air containment, equipment with ‘economiser’ modes of operations and use of data centre infrastructure management (DCIM) software.

Companies wishing to utilise the data centre life cycle approach to aid in the management of big data can start with assessments and baselining, comparing current performance against a historical metric. This examination can help pinpoint what's needed most.
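Baselining can be as simple as comparing each subsystem's current consumption against its historical figure and flagging the biggest deviations. A sketch with hypothetical readings (the subsystem names and kWh values are illustrative, not from the article):

```python
# Hypothetical monthly energy readings (kWh) per subsystem
baseline = {"power": 90_000, "cooling": 50_000, "lighting": 5_000}
current = {"power": 95_000, "cooling": 62_000, "lighting": 4_500}

# Percentage change vs the historical baseline for each subsystem
for subsystem in baseline:
    change = (current[subsystem] - baseline[subsystem]) / baseline[subsystem] * 100
    print(f"{subsystem}: {change:+.1f}% vs baseline")
```

In this example cooling has drifted 24% above its baseline, pointing to containment or economiser upgrades as the likely first investment.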

Modular, containerised data centres also bring many benefits including cost savings up front and - over time - increased reliability, improved efficiency and more. They are faster to deploy and have lower total cost of ownership (TCO) than ‘stick-built’ data centres. Recent analysis suggests that customers can expect a 30% reduction in TCO, a 13% decline in first-installed costs and a 60% decrease in time to deploy.
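To see what the cited percentages mean in practice, they can be applied to hypothetical baseline figures for a stick-built facility (the dollar amounts and deployment time below are assumptions for illustration; only the 30%, 13% and 60% reductions come from the article):

```python
# Assumed stick-built baseline figures (illustrative only)
stick_built_tco = 10_000_000        # 10-year total cost of ownership, $
stick_built_first_cost = 4_000_000  # first-installed cost, $
stick_built_deploy_weeks = 50       # time to deploy, weeks

# Apply the cited reductions for a modular, containerised build
modular_tco = stick_built_tco * (1 - 0.30)            # 30% lower TCO
modular_first_cost = stick_built_first_cost * (1 - 0.13)  # 13% lower first cost
modular_deploy_weeks = stick_built_deploy_weeks * (1 - 0.60)  # 60% faster

print(f"TCO: ${modular_tco:,.0f}, first cost: ${modular_first_cost:,.0f}, "
      f"deploy: {modular_deploy_weeks:.0f} weeks")
```

On these assumed figures, the modular build would save $3 million over its life and cut deployment from roughly a year to about five months.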

Considering the steep growth curve expected for big data, addressing the inefficiencies of data retention will also deliver a sustainability benefit for the environment. Relying on old techniques and aiming for merely average energy efficiency is no longer an attractive option given the rapid growth in big data.

Organisations that focus on new approaches and services to ensure their facilities can handle the rapidly increasing costs of retaining big data will save money and cut down on inefficiencies. Those facilities must be sufficiently large, flexible, scalable and energy efficient to cope with these new demands.

Image courtesy Intel Free Press under CC


  • All content Copyright © 2017 Westwick-Farrow Pty Ltd