Well, I certainly gave the hornet’s nest a good, healthy smack with my recent post (“Ten Reasons Why VMware Is Leading The Hyperconverged Market”).
Never underestimate the power of a well-written blog post to shake things up :)
In addition to hearing from dozens of enraged Nutanix employees, the usual round of pundits is now weighing in with their perspectives as well. Everyone is entitled to their opinions, and there seems to be no shortage of those.
I do find it interesting, though, that no one has yet attempted to refute any of the facts behind the ten arguments I presented. That's typical in these situations: lots of passionate emotion, very little discussion of the underlying facts.
The good news: I had a few great conversations with intelligent, non-partisan folks who said they were thinking differently after reading my post.
Thinking differently is always a good thing :)
The basic question: if hyperconverged is truly “strategic”, what bigger and better world should it lead to? Or, is it an end unto itself in its present form, and thus not strategic?
Clearly, this is a great question, and worth a blog post or two to discuss …
The Fuss Over Hyperconverged
The first wave of hyperconverged was presented as an appliance that didn’t need an external storage array. The value proposition was heavily weighted towards “convenient consumption”.
Bring it in, rack it up, connect the network and presto! Given the typical complex state of affairs in standing up IT infrastructure, it seems almost as easy as calling up your service provider and having them fire up a few new instances on your behalf. And I’ve learned to never underestimate the appeal of convenient consumption. VMware's EVO:RAIL hyperconverged offering clearly targets this model.
Yes, there are still IT shops that prefer the “convenient consumption” benefit, but I see a growing number who now see the potential to do more with the technology — both now and in the future.
As a result, the criteria change in their minds — weighted less toward “the box” and immediate gratification, and more toward “the strategy”: how their short-term choices play into the broader evolution of the IT landscape.
How Strategic Technology Usually Plays Out In Enterprise IT
The new technology is brought in to ostensibly solve an immediate short-term requirement with obvious justification. But at the same time, there is full awareness that this same technology has the potential to play a broader and more transformative role changing the way things are done in IT.
A familiar example?
VMware virtualization got its start by solving an immediate data center problem: rampant server sprawl, with racks of underutilized physical machines. The pitch at the time was dead simple: save money with virtualization. However, over time, people realized that — once virtualized — vSphere could fundamentally change the way IT was done from an operational perspective: provisioning, management and more. And that was a really big deal.
The answer to a tactical problem built the foundation for a great strategic outcome. And it wasn't dumb luck on the part of IT shops, either. They saw what we saw.
Another example from the storage world?
When flash was introduced, it was seen as the solution to a very narrow but very demanding set of workloads, e.g. databases with very high transaction rates — a tactical solution to a specific pain point.
As prices dropped, many shops decided on a ‘flash first’ strategy — use it just about anywhere performance could be an issue.
The result was that users got spectacular performance, and IT could get out of the storage-performance-problem-resolution business — arguably transformative in its own way.
Words Fail Me
So if we’re going to think of hyperconverged as one of these “two-fers” (tactical today, strategic tomorrow) what does the longer term picture look like?
At VMware, we describe future-state data center architectures as “software-defined”, e.g. the software-defined data center, or SDDC. Other labels also get used around similar concepts: devops, cloud, software-defined infrastructure, etc.
Why? The core technology ideas behind each are similar: heavy use of commodity technologies, everything programmable and thus able to be automated by software, driven by application-centric policies, dynamic and flexible, ease of consumption for end users, an enterprise-class operational model, etc.
Here’s the observation: with this perspective, well-considered hyperconverged solutions can easily deliver a “two-fer” for enterprise IT.
The tactical problem they solve is a cost-effective answer to an immediate infrastructure requirement. The strategic benefit they can potentially create is a pathway to SDDC — or whatever you’d prefer to call your next-generation environment.
But if we want to exploit that second part, our evaluation criteria may have to evolve.
Getting To SDDC
If you’ve ever sat down with a customer responsible for a large, complex enterprise IT environment, pitching the attractiveness of something like SDDC isn’t hard. On paper, it’s easy to get agreement that it’s a great vision of how IT ought to work.
The fun part starts in putting in place a realistic plan to get there.
Not surprisingly, there are a *lot* of moving parts that are highly resistant to change. Legacy investments and legacy vendors. Operational models and processes that have existed for perhaps decades. Entrenched organizations complete with factions, tribes and internal politics.
And, of course, precious little time between firefighting episodes to actually work on anything.
Much as we’d like to believe that all it takes is magic software and a quick implementation plan, the reality is usually quite different.
Can Hyperconverged Be A Short-Cut To A Better Place?
Let’s say you’d like to introduce SDDC-like concepts quickly into your data center, but do so with a minimum of cost, hassle and inevitable organizational impact. It’d be hard to imagine an easier or more powerful way of doing so than standing up a modest vSphere+VSAN hyperconverged environment. Or maybe an EVO:RAIL if you're looking for something even simpler. Same basic software technology in both.
You’d get the very best in hyperconverged software technology: vSphere, VSAN, vRealize Automation Suite, vRealize Operations, NSX, etc. etc. You’d have vSphere admins on your staff who didn’t need a long learning curve. You’d already have support relationships in place, etc.
You could evaluate for yourself — and quickly — what the new technology can offer, with almost no downside. Better yet, you’d be getting:
- proven functionality, processes and tools that can scale to truly large enterprises
- the ability to accommodate and leverage existing infrastructure choices (servers, storage, network)
- the ability to extend in new directions, like OpenStack, containers or whatever new thing comes along next that looks attractive
- well-supported interoperability between all essential components
Let’s Turn It Around
OK, so I get involved in a lot of VSAN sales calls. Let me share something interesting that I’m finding as a result.
Right now, it’s breaking 50/50 between people who are clearly looking for a tactical solution, and people who clearly want to take the first step towards a strategic outcome along the lines I’ve described.
Yes, there are the inevitable feature/function questions as you’d expect. But it’s clear to this crowd that they’re not just looking for a like-for-like replacement for a traditional storage array. No, they see something much more attractive just over the horizon.
They probably don’t use the word “hyperconverged” to describe their first step. I think that’s partly because the term has been unfairly chained to a specific piece of hardware and its consumption model, and that’s not what they’re ultimately after.
They’re looking for a software model that fundamentally changes the way IT is done.
That being said, some of this group are still interested in the ease-of-consumption that comes with an appliance model such as EVO:RAIL — but it’s clear that’s not their desired end state.
Where Does That Leave Us?
Will hyperconverged continue to be associated with a specific vendor appliance, focused on ease of consumption?
Or will more people realize that hyperconverged is essentially software, representative of a large-scale design pattern made easy to consume, and thus can be an attractive short-cut to a much nicer place?
I know which outcome I’m betting on.