Why ones and zeros are a catch-22

The data center is travelling down an increasingly complicated road, thanks to rising amounts of data. Consolidation, automation and standardization will ensure it can cope.

28 November 2011 by Penny Jones - DatacenterDynamics

Dr Wing To

At the outset it may seem like just another challenge for the data center, but growing volumes of data are creating one of the biggest catch-22s presented to data center operators today - and it is set to get worse. Much like the US Army Air Forces bombardier that author Joseph Heller writes about in his book Catch-22, data center operators increasingly face a situation with no logical way out. Data is driving growth and innovation in the data center, but the more of it you have, the more oversight is required to manage it.

 
Data is also changing the way data centers are run and the role of those running them, according to IBM VP of product development for software Dr Wing To. So much so that high density, in data terms, is fast becoming the norm. “Higher workloads are bringing about the need for more computing capacity and this is giving rise to a number of new challenges in the data center,” To says. “We would not even describe it as ‘high density’ anymore, but a side-effect of this high density is the need for more optimal use of computing capacity, and the need for faster rates of innovation and new services.”
 
With data volumes steadily growing, proper management of data workloads is becoming crucial. Businesses have been quick to seize on falling costs and the growing traction of technologies such as radio frequency identification (RFID) and deeper analytics. They are also enjoying the benefits that mobile devices in the hands of staff can bring - no one can deny that BlackBerrys and laptops have changed the way businesses operate. Now the data center is being asked to manage the workload all these extra ones and zeros command.
 
Data is keeping data centers in business, but how much should the data center operator understand about the business side of data?
 
Moving towards consolidation
According to To, the pathway for coping with higher data densities is already being forged. For many data centers it is starting with a move towards consolidation - bringing current and future resources together. This is followed by automation (setting workload activities) and standardization of data and processes (optimizing to provide flexibility in the deployment of resources). This can be seen in the vendor space, with new technologies being introduced for networking, storage and overall compute capacity, especially with capabilities relating to virtualization.
 
“At IBM, we are promoting the idea that to have complete control over your systems you need to have full visibility and automation. Do you know what is happening in the systems being used, and especially how all virtual infrastructure is used?” To asks.
 
“It is no longer just about the server. Services or applications require the use of the server, network and storage, and all of these can be virtualized today. This is why the technology that allows you to see how data is exchanged between virtual machines is so important. The data center of the future will have automation across the entire data center, not just on the hypervisors.”
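What does that visibility look like at ground level? As a minimal sketch - using the open-source libvirt toolkit rather than anything IBM-specific, with all details below assumed for illustration - a short Python script can inventory every running virtual machine on a host and read its per-interface traffic counters:

    # A minimal visibility sketch using the open-source libvirt toolkit -
    # an illustration of the idea, not IBM's own management tooling.
    import xml.etree.ElementTree as ET

    import libvirt  # pip install libvirt-python

    conn = libvirt.open('qemu:///system')    # connect to the local hypervisor

    for dom_id in conn.listDomainsID():      # every running virtual machine
        dom = conn.lookupByID(dom_id)
        state, max_mem, mem, vcpus, cpu_time = dom.info()
        print('%s: %d vCPUs, %d MiB RAM' % (dom.name(), vcpus, mem // 1024))

        # Walk the domain XML to find its virtual NICs, then read the
        # per-interface traffic counters - how data is actually exchanged.
        tree = ET.fromstring(dom.XMLDesc(0))
        for target in tree.findall('./devices/interface/target'):
            dev = target.get('dev')
            rx_bytes, _, _, _, tx_bytes, _, _, _ = dom.interfaceStats(dev)
            print('  %s: rx=%d bytes, tx=%d bytes' % (dev, rx_bytes, tx_bytes))

    conn.close()

A production monitor would feed these counters into trending and automation; the point is simply that the hypervisor already exposes how virtual machines use the network.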
 
When To talks about data centers he does not mince his words. He says the need for high-density thinking covers data centers run by utilities, by banks, by hospitals - every data center imaginable. He believes we will see a growing requirement for a more data-focused approach to dealing with the business, and to managing the data center.
 
“Today you have financial modelling platforms, traffic systems for toll bridges and companies using technology to carry out rail repairs. The world is becoming more instrumented, and as time goes on it finds more and more things to add to this list. What the data center has to do is work out how to optimize, so that the right workload runs on the right resources,” To says.
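As a flavour of what such optimization can mean, here is a deliberately simple sketch in Python: a greedy best-fit placer that assigns each workload to the host with the tightest remaining capacity. The hosts, workloads and capacity units are invented for illustration; nothing here describes IBM’s scheduling.

    # A toy placement sketch: greedy best-fit assignment of workloads to
    # hosts by spare capacity. Names and figures are illustrative only.
    def place(workloads, hosts):
        """Map each workload to the host whose remaining capacity fits
        it most tightly; both arguments are {name: capacity-units}."""
        free = dict(hosts)
        placement = {}
        # Place the largest workloads first so small ones fill the gaps.
        for name, demand in sorted(workloads.items(), key=lambda w: -w[1]):
            candidates = [h for h in free if free[h] >= demand]
            if not candidates:
                raise RuntimeError('no host can take ' + name)
            best = min(candidates, key=lambda h: free[h] - demand)
            free[best] -= demand
            placement[name] = best
        return placement

    print(place({'billing': 8, 'web': 3, 'batch': 5},
                {'host-a': 10, 'host-b': 8}))
    # {'billing': 'host-b', 'batch': 'host-a', 'web': 'host-a'}

Real placement engines weigh far more dimensions - memory, I/O, licensing, affinity - but the catch-22 is visible even here: each workload placed shrinks the room left for the next decision.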
 
Dealing with data
Sean Flannagan is the technical leader for IBM’s systems and technology group in the UK, where he works with some of IBM’s largest business partners. He happily pushes IBM’s Smarter Planet ideology - its smarter computing story - which puts data into every nook and cranny of today’s society, touting its benefits from the Watson computer to integrated road-traffic solutions for governments. “Data is key for decision-making. The problem is that the amount of data out there is so large and, in many cases, difficult to capture,” Flannagan says.
 
IBM’s own product range has changed in recent years to accommodate growing data. Its V7000 midrange storage technology is just one example: the range has been developed to pack 480 terabytes into two clustered units - a direct product of high-density thinking.
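How do the numbers stack up? A rough reconstruction - and the configuration details here are assumptions based on the Storwize V7000 of the day, not figures supplied by IBM - runs as follows: two clustered control enclosures, each driving nine expansion enclosures, gives 20 enclosures of 12 nearline 3.5in drives; at 2 terabytes per drive that is 20 × 12 × 2 = 480 terabytes.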
 
“We have seen 61% growth in what I’d call provisioning for high density by the companies we work with. This is both for high-performance computing and for companies centralizing and growing their mainframe footprint,” Flannagan says.
 
“I think the future is about keeping pace. The data center will need to be ready for new workloads coming in, and that will drive more infrastructure and power requirements while posing new challenges. It is going to catch some of our clients out and, hopefully, drive new business for IBM.”
 
