There was a certain crowd that thought I’d be part of the EMC Federation forever. There was another crowd that couldn’t truly believe I’d join Oracle, and enthusiastically at that.
Gratifyingly, there were also plenty of well-wishers -- that was nice!
At the time, I really couldn’t go into the reasons why this made such logical sense to me. It had a lot to do with what I was seeing in enterprise IT – both on the supply and demand side.
When the world changes, you need to change as well.
So I did.
Obligatory disclaimer: these are entirely my personal views – they have not been reviewed nor approved by my employer. I take full personal responsibility for everything I say here.
A Jaundiced View Of Enterprise IT Organizations
I’ve enjoyed studying literally thousands of IT organizations from multiple angles. The more I look, the more new patterns emerge. I find it fascinating. I guess I’m easily entertained.
Order processing, for example, generates revenue for a company, so it’s pretty close to the top of the enterprise value pyramid. VDI, while nice, is more of an optimization than a money-maker for the company, so it’s typically somewhat lower down in the food chain.
My belief is that – when resources are limited – IT organizations will tend to invest in the applications that are of high value to the enterprise, and tend to starve the others.
Through that admittedly jaundiced lens of “what’s valuable to the business?”, so much of what we talk about in enterprise IT quickly becomes less interesting.
For example, hypervisors are certainly cool, but they essentially save money; they don’t make money. Containers are also cool, but only if they help bring new revenue-generating applications to market faster. Otherwise, they’re just a distraction.
A flash array lashed to a critical database can do more work, potentially making money – but flash arrays are quickly losing meaningful differentiation, and are now competing on price. At the end of the day, flash is flash -- unless someone can come up with some real secret sauce.
Show me an application or technology that clearly makes money (or avoids serious risks) for a company that uses it, and I’ll show you something enterprise IT is willing to invest in – ahead of everything else.
A Similarly Jaundiced View Of Enterprise IT Vendors
Whether we’re talking on-premises technology, public clouds or both – everyone is slowly yet inevitably getting pulled into commoditization.
Even the over-marketed startups.
How many tech pitches do we see about “saving money?” Not that saving money isn’t important, but you can’t save your way to prosperity.
How many tech firms are routinely laying off staff, writing off dodgy acquisitions, starting to pay dividends, aggressively buying back shares, dealing with activist investors, struggling with IPOs, etc.?
None of these are signs of growth.
I believe customers win with commoditization in the short term, but not so much in the long term.
Yes, prices go down, but so do vendor margins. And it’s a good portion of those margins that would otherwise get reinvested in R&D – making better stuff. Commoditization is like kryptonite to innovation. It gets harder for a large company to innovate in any form when it’s getting sucked into the black hole of commoditization.
Back To Enterprise IT Shops
That same black hole of commoditization is also affecting enterprise IT organizations, but in different ways. Many are fashioning themselves to look like internal IT service providers, and benchmarking their services against external alternatives. They have no choice in the matter.
Budgets are tight, and CIOs routinely tell me that finding (and paying for) good people is getting increasingly harder – even in the really big shops. Face it: a career in enterprise IT isn’t as glamorous as it once was, and many bright young graduates are choosing different paths.
These enterprise IT shops would rely more on vendor expertise, but it’s now a pay-for-what-you-get model, thanks to commoditization. So more contractors and consultants are used, hollowing out the core of enterprise IT knowledge even further.
"Science projects" are most decidedly becoming less fashionable. There's no money, no time, and talent is scarce. IT shops now want complete solutions that deliver proven and quantifiable value, so they can focus on more important things.
It’s not pretty on either side of the fence.
What I Concluded
Witness the server vendors, storage vendors and network vendors. They are all attempting to reinvent themselves -- as what, I don’t know. I do think VMware is in a far better position than most, but will still have to paddle awfully fast in the coming years.
No, applications make money for the enterprise, and unless – as a vendor -- you’ve got a solid base of revenue-generating applications and supporting technologies that directly and visibly help, gravity will suck you in.
I believe that the more complete and integrated the solution a vendor can offer, the more resistant they will be.
Unless you can show your customers a wide range of complementary cloud services to go with your applications and supported technologies, you as a vendor will be sucked in even faster.
No compatible cloud option? Back of the line for you …
HP doesn’t appear to have much of a cloud that offers compatible consumption options that mirror what's sold on-premises. Nor do Dell, Cisco, EMC/VCE, or NetApp – and IBM’s cloud doesn’t look particularly healthy to me. I think Microsoft has a reasonable cloud strategy, because it complements their “applications and technologies” portfolio quite well – but it doesn't appear to have a chance of addressing critical applications in decent-sized enterprises.
Fewer shops can justify building IT infrastructure on their own. It’s expensive, it’s time-consuming, and it’s risky. Especially when it comes to the infrastructure supporting those revenue-generating applications that keep the company running.
Witness the rise of reference architectures, converged systems, hyperconverged systems and their ilk.
But none of these are really integrated with the important databases and applications. To be specific, they are not tightly integrated with those parts of IT that actually make money. So these newer animals end up being a cost-savings play for their customers; and not so much a revenue-generation play.
Nice, but gravity sucks, doesn’t it?
And how many of these reference architectures, converged systems, hyperconverged systems, etc. – have a precisely compatible consumption option in the public cloud?
None – as far as I know.
As a result, I believe that they too will be drawn into the black hole of commoditization – it’s only a matter of time.
People will pay for things that help make them money before they pay for things that save them money.
When Demand And Supply Meet
Enterprise IT organizations are getting hollowed out. Most will have to prioritize.
At the top of the list: those critical applications that are essential to the business – taking orders, delivering product, supporting customers, analytics, etc. Make them run faster, make them run better, secure them, remove risks, etc.
These shops aren’t going to have the luxury of hand-crafting critical IT infrastructure for much longer. It’s too hard, and too important.
They will look at cloud options at the periphery for now, but over time more critical enterprise apps will want to go there.
I believe they will prefer to work with suppliers who understand and can deliver applications, databases and the infrastructure that makes it all hum along – offered as complete engineered solutions, not as science projects.
They will prefer having precisely compatible cloud consumption options – perhaps more as an attractive option than an immediate need.
Those suppliers, in turn, will charge a justifiable premium for the value they create. That premium can be reinvested in creating better answers to enterprise IT challenges.
Gravity can be overcome.
And, in a nutshell, that’s why I joined Oracle.