2016: The Year Converged and Hyper-Converged Infrastructure Becomes Mainstream

January 29, 2016  By David
Grazed from VMblog.com. Author: Todd Pavone

Article Written by Todd Pavone, Executive Vice President, Product Strategy & Development, VCE  

In 2015 we heard a lot about DevOps, Big Data, containerization, and the Internet of Things (IoT), but what do all of these macro trends mean for the modern data center? What kind of impact will these massive disruptive forces have in shaping 2016? Most important, in a world of lean teams and flat or shrinking budgets, how can you be prepared? Here are a few predictions from VCE’s point of view on how CIOs can embrace and apply digital transformation initiatives to drive competitive differentiation and success in the coming year.

Forget the Hype: Hyper-Scale Is For Real

Hyper-scale data centers – the assembly of massive computing infrastructures to support today’s global Internet and always-connected businesses – are disrupting the traditional data center value chain by creating new and more complex computing ecosystems. But, before you think we’ve witnessed the full effects of this transition, realize we are just at the beginning of these changes. A longer and more complex evolution of converged and hyper-converged infrastructure is just getting started.

Certainly, virtualization, software-defined storage, containers, and microservices have already had big effects on how the modern data center is built: virtual machine collections are now the norm, not the exception; storage pools are elastic; and hypervisors have become more flexible and powerful. But that is not enough.

According to IDC Research, hyper-scale data centers will become the primary adopters of new compute and storage technologies: its report projects that 70 percent of new storage technologies and 50 percent of new compute technologies will reside in hyper-scale data centers by 2017. By 2018, there will be 10 times more CPU cores than people, and IDC also predicts 77 billion data center cores and a doubling of their annual growth. We have to move beyond yesterday’s generation of infrastructure and better position our businesses for this future growth in computing resources.

Orchestration and Automation: The Enterprise’s Best Kept Secret

Orchestration is going to be the biggest mitigating factor as data centers grow beyond thousands of servers. It will be the glue that keeps everything together and running smoothly. Automated orchestration makes a difference when it comes time to scale the number of servers up and down to satisfy spot demands. We are still in the early days of understanding how these kinds of tools can deliver a full complement of services; expect them to become more capable and flexible in the coming years.
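To make that concrete, here is a minimal sketch of the kind of decision loop an automated orchestrator runs when scaling a server pool against spot demand. The utilization thresholds, pool limits, and example numbers are assumptions chosen for illustration, not VCE product behavior.

# Minimal autoscaling sketch. The thresholds and pool limits below are
# hypothetical, chosen only to illustrate the decision logic.

def desired_node_count(current_nodes, avg_utilization,
                       scale_up_at=0.75, scale_down_at=0.30,
                       min_nodes=2, max_nodes=1000):
    """Return how many servers the pool should run, given average utilization."""
    if avg_utilization > scale_up_at:
        # Demand spike: grow in proportion to how far over the threshold we are.
        target = int(current_nodes * (avg_utilization / scale_up_at)) + 1
    elif avg_utilization < scale_down_at:
        # Demand lull: shrink, but never below the configured floor.
        target = int(current_nodes * (avg_utilization / scale_down_at))
    else:
        target = current_nodes
    return max(min_nodes, min(target, max_nodes))

# A 40-node pool running hot at 90 percent average utilization grows to 49 nodes;
# the same pool idling at 10 percent utilization shrinks to 13.
print(desired_node_count(40, 0.90))  # 49
print(desired_node_count(40, 0.10))  # 13

An orchestrator layers this kind of policy on top of the provisioning APIs, so that capacity tracks demand without an administrator resizing the pool by hand.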

Balancing the different elements of the data center is going to be the next great challenge. While it is impressive that we can virtualize our CPUs along with our networks and storage pools, we need to manage them together so that we can expand, contract, and make trade-offs across all three elements to meet changing workloads. This is because each element depends on the others to deliver the most value. Next-generation systems will need to continually adapt to additional use cases and variable workloads through intelligent sizing and orchestration tools.
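As a rough illustration of what such sizing logic has to weigh, the sketch below derives a node count from whichever of the three resources is the bottleneck. The per-node capacities and workload figures are made-up assumptions, not VCE sizing data.

def nodes_required(workload, node_capacity):
    """Size a converged pool: the scarcest resource dictates the node count."""
    per_resource = {
        resource: -(-demand // node_capacity[resource])  # ceiling division
        for resource, demand in workload.items()
    }
    return max(per_resource.values()), per_resource

# Hypothetical per-node capacity and aggregate workload demand
node_capacity = {"vcpus": 48, "storage_tb": 20, "network_gbps": 25}
workload      = {"vcpus": 900, "storage_tb": 450, "network_gbps": 300}

total, breakdown = nodes_required(workload, node_capacity)
print(breakdown)  # {'vcpus': 19, 'storage_tb': 23, 'network_gbps': 12}
print(total)      # 23: storage, not CPU, is the constraint for this workload

Re-running that calculation as workloads shift is exactly the continual adaptation described above: the moment the bottleneck moves from storage to CPU or network, the sizing answer changes with it.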

##

About the Author

Todd Pavone is the Executive Vice President of Product Strategy and Development at VCE. He is responsible for product engineering, product management, solutions, and technology alliances. Pavone and VCE are working to deliver next-generation Converged Solutions to transform the economics, agility, and profitability of enterprises and service providers as they transition to cloud-enabled business models.