HPE Acquires SimpliVity for $650 Million, Another Proof Point for HCI
January 30, 2017
This month, Hewlett Packard Enterprise (HPE) announced an agreement to acquire hyperconverged infrastructure (HCI) vendor SimpliVity for $650 million. The acquisition highlights the growing popularity of consolidated computing systems, which CIOs are adopting as an alternative to public cloud services, and suggests that on-premises computing remains a crucial option for many organizations.
SimpliVity is one of many companies offering hyperconverged systems, which bundle computing, storage and networking onto the same server. The company's flagship product is the OmniCube hyperconverged infrastructure appliance, and it has been working with hardware vendors such as Lenovo, Dell, Cisco and Huawei to bring the software portion of its solution to those vendors' hardware platforms.
SimpliVity was founded in 2009 and has since raised almost $276 million across four funding rounds led by Waypoint Capital, Accel Partners, Charles River Ventures, DFJ Growth, Kleiner Perkins Caufield & Byers (KPCB) and Meritech Capital Partners. When it raised its $175 million Series D round in March 2015, the company was estimated to be worth as much as $1 billion.
The acquisition announcement was big news in the market, despite an exit price that fell far short of the company's once-estimated valuation. While the purchase price may have raised a few eyebrows and prompted some questions, the sale itself wasn't much of a surprise to anyone who had been following the industry. Data centers are evolving.
"We believe that HPE’s acquisition of SimpliVity is another proof point that customers are demanding simplicity in the way IT operations are procured, installed and managed," said Hans O’Sullivan, CEO & Founder of StorMagic. "The leading vendors in server and storage hardware didn’t invent the concept of hyperconverged, but they are realizing that integrating compute, storage and networking and eliminating external SANs isn’t just a future fantasy, it’s already happening."
O’Sullivan added, "Of course, SANs have their place in the world of IT, but for small environments that lack IT resources and have tight budgets, a software-defined storage solution running on industry-standard servers can save millions of dollars and simplify management."
Chuck Dubuque, VP of Product Marketing at Tintri, wasn’t surprised by the announced acquisition either. He explained, "It’s getting harder for vendors to differentiate themselves in an intensely competitive storage market. What further complicates the landscape is the growing need for organizations to create and support a new generation of applications in the cloud. The IT team wants more simplicity and automation so they aren’t stuck at their desks twisting dials to tune storage. And the Cloud Architects want all the agility of public cloud with the predictability and control of their enterprise data center."
According to Jeff Ready, CEO of Scale Computing, HPE was losing business to hyperconverged players. And with Dell/EMC and VMware now in control of the entire stack, HPE needed to start making some moves in the space.
"This was a storage-centric acquisition," explained Ready. "And HCI goes well beyond storage, so I don’t think this gets them all the way to where they need to be. The bigger issue I see is that every time HPE sells hyperconvergence, money goes into Dell’s pocket because of VMware. This deal doesn’t change that."
What’s all the fuss about HCI?
From a market perspective, HCI continues to grow at the expense of the traditional server and storage market because of its modular, scale-out architecture. The demand for hyperconverged systems is being fueled by organizations with private, on-premises data center environments that want to be more flexible and cloud-like without sacrificing control and security. And organizations are adopting hyperconverged systems to gain more efficiencies from their existing virtualization technology.
IDC says the market for hyperconverged systems topped $2.6 billion in 2016 and could reach $6.4 billion by 2020. And according to market research from Gartner, HCI will be the fastest-growing segment of the overall market for integrated systems, reaching almost $5 billion by 2019, propelling it toward mainstream use in the next five years.
From a customer perspective, hyperconvergence offers a very practical, cost-effective approach to deploying IT infrastructure. HCI provides faster deployment, better economics and greater ease of use, allowing customers to grow their infrastructures incrementally over time without forcing them to "rip and replace" data center assets. These trends seem highly likely to continue for the next several years.
"HCI has quickly become the primary way of deploying infrastructure," explained Ready. "HPE wouldn’t have made this acquisition if they weren’t losing to HCI, and so the proof is in the deal. The early thinking around HCI was that it was a storage technology, but it really goes well beyond that. Some companies like us at Scale Computing provide the whole stack, from hypervisor through to the hardware, and integrate the whole thing. Dell-VMware has this with VXRail as well."
HPE said it expects to close the SimpliVity acquisition deal by the end of next month and plans to offer the company’s software with its ProLiant DL380 servers. Later in the year, the company plans to sell an integrated HPE SimpliVity hyperconverged system based on its ProLiant series.
Analysts are now turning their attention to other HCI vendors, chief among them Nutanix, valued at around $4 billion and currently estimated to hold more than 50 percent of hyperconverged system revenue, with over 3,000 customers worldwide.
So how do Cisco, Dell/EMC, IBM and Lenovo respond? Will there be other acquisitions in the near term? Or will there simply be a reshuffling of partnerships? And what exactly is the broader HCI strategy at HPE?
It was business as usual for StorMagic after the SimpliVity acquisition was announced. Company representatives told VMblog they would continue to partner with the industry’s leading server providers to deliver their software-defined storage to customers and eliminate the over-provisioning that typically comes with the appliance model approach.
"By analyzing customers’ exact performance requirements," explained O’Sullivan, "we build customized solutions that tolerate any environment in any location, significantly lower CAPEX and OPEX and future-proof for easy expansion."
For Scale Computing, positioning will remain the same as well. The company positions itself as an alternative to VMware-based infrastructures, which, it says, comes with a number of added benefits such as lower upfront costs and licensing savings. But most importantly, there's the time saved in running the system.
About the Author
David Marshall is an industry-recognized virtualization and cloud computing expert, a seven-time recipient of the VMware vExpert distinction, and has been heavily involved in the industry for the past 16 years. To help solve industry challenges, he co-founded and helped start several successful virtualization software companies, including ProTier, Surgient, Hyper9 and Vertiscale. He also spent a few years transforming desktop virtualization while at Virtual Bridges.
David is also a co-author of two very popular server virtualization books, "Advanced Server Virtualization: VMware and Microsoft Platforms in the Virtual Data Center" and "VMware ESX Essentials in the Virtual Data Center," and served as Technical Editor on Wiley's "Virtualization for Dummies" and "VMware VI3 for Dummies." He has authored countless articles for a number of well-known technical publications, including InfoWorld, Virtual-Strategy and TechTarget. In 2004, he founded the oldest independent virtualization and cloud computing news site, VMblog.com, which he still operates today.
Follow David Marshall