Cloud Computing: Data Center Efficacy – Cracking the 20 Percent Code
March 10, 2014. Grazed from Midsize Insider. Author: David Bonderud.
Cloud-based data centers suffer from an efficiency problem. Despite massive resource pools, many offer utilization rates that are no higher than 20 percent, which in turn drives up cloud costs for midsize businesses and increases server sprawl. Now, a team of researchers from Stanford University may have found a way to improve data center efficacy while still meeting strict performance goals.
No More Reservations
According to a recent Stanford News article, Associate Professor Christos Kozyrakis and doctoral student Christina Delimitrou have developed a cluster management system called Quasar that can theoretically triple server efficiency without affecting service reliability. How? It starts with a move away from the "reservation model." Used by many cloud providers, this model starts with a question to IT professionals: How much resource capacity does a specific application need? IT administrators understandably overestimate in their efforts to mitigate potential issues, but that creates a new problem: data centers bursting with resources that no one is using…
The Stanford team’s project aims to change all that by moving to a performance-based model, in which developers and midsize IT professionals input data about the performance their application requires. The Quasar system then assigns data center resources to meet response-time or data-volume goals, rather than leaving "as-needed" resources standing idle…
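To make the contrast concrete, here is a minimal Python sketch of the two approaches. It is an illustration only: the application name, the numbers, the safety factor, and the simple latency model are hypothetical assumptions, not Quasar's actual interface or algorithm.

    # Illustrative sketch (not Quasar's real API): reservation-based vs.
    # performance-target-based resource assignment. All names and numbers
    # below are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class App:
        name: str
        target_latency_ms: float   # performance goal stated by the developer

    def reservation_request(guessed_cores: int) -> int:
        """Reservation model: the admin guesses capacity and pads it to be
        safe, which is what pushes utilization down toward 20 percent."""
        safety_factor = 2          # over-provisioning "just in case"
        return guessed_cores * safety_factor

    def estimated_latency_ms(cores: int) -> float:
        """Toy performance model: latency improves as cores are added.
        A real system would profile the application to learn this curve."""
        return 200.0 / cores

    def performance_based_request(app: App, max_cores: int = 64) -> int:
        """Performance model: assign the smallest allocation estimated to
        meet the stated response-time goal, instead of a padded reservation."""
        for cores in range(1, max_cores + 1):
            if estimated_latency_ms(cores) <= app.target_latency_ms:
                return cores
        return max_cores

    if __name__ == "__main__":
        web_app = App(name="web-frontend", target_latency_ms=25.0)
        print("reservation model:   ", reservation_request(guessed_cores=16), "cores")
        print("performance model:   ", performance_based_request(web_app), "cores")

In this toy setup the performance-based request needs 8 cores where the padded reservation ties up 32, which mirrors the kind of utilization gain the Stanford team is after.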
Read more from the source @ http://midsizeinsider.com/en-us/article/data-center-efficacy-cracking-the-20-pe