The Changing Face of Information -- Virtualization Changes Everything
I feel bad writing yet another post about how virtualization is transforming how we think about computing -- and IT in general -- but, given that I'm in the middle of a series of related posts here, there's no way I can escape covering the topic.
Just to bring you up to speed, in addition to an intro, I've written about the growing case for information governance, identified information risk management as the new frontier in security, how the unmet needs of knowledge workers are creating a crisis, and how a changing definition of applications will change IT.
How could I ignore virtualization?
So, rather than rehash the obvious, I'm going to try and draw a different view; one that speaks to information directly, rather than computing.
This Is Not A VMware-Specific Discussion
Just to be clear, I'm talking about any and all virtualization technologies here: servers, desktops, networks, storage -- it's all up for grabs. Yes, many of these concepts can be best exemplified by what VMware does today; but I want to think more in terms of the complete picture for this post.
Simply put: virtualization abstracts physical resources. As a result, very interesting things happen.
First (and most obviously), consolidation savings can result. I think this case has pretty much been made already.
Secondly, IT resources become "liquid". They lose their direct association with distinct physical entities, and can be relocated and resized dynamically, as needed. A relatively new concept, but perhaps ultimately more powerful than simple cost savings.
And finally, the orchestration of IT (and information delivery) can be unified around a single management paradigm: dealing with abstracted virtual entities, and largely ignoring physical realities. That's mostly potential these days, but it'll be reality before too much longer. Ultimately, this may be the biggest concept of all.
The first characteristic leads to cost savings.
The second characteristic leads to improved responsiveness of IT.
And the final characteristic leads to better service delivery.
Better, faster, cheaper: it does all three.
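The "liquidity" idea above can be made concrete with a small sketch. This is a hypothetical illustration, not any vendor's API: physical hosts are abstracted into a single pool, and workloads are placed against the pool rather than against a specific box. All names (`Host`, `Pool`, `place`) are invented for the example.

```python
# Minimal sketch of resource abstraction: the caller asks the pool for
# capacity and never names a physical host -- the pool decides.

class Host:
    """A physical server with fixed CPU and memory capacity."""
    def __init__(self, name, cpu, mem_gb):
        self.name, self.cpu, self.mem_gb = name, cpu, mem_gb
        self.vms = []

    def free_cpu(self):
        return self.cpu - sum(vm["cpu"] for vm in self.vms)

    def free_mem(self):
        return self.mem_gb - sum(vm["mem_gb"] for vm in self.vms)

class Pool:
    """One management paradigm over many physical hosts."""
    def __init__(self, hosts):
        self.hosts = hosts

    def place(self, vm):
        # Pick the host with the most free memory that can fit the VM.
        for host in sorted(self.hosts, key=Host.free_mem, reverse=True):
            if host.free_cpu() >= vm["cpu"] and host.free_mem() >= vm["mem_gb"]:
                host.vms.append(vm)
                return host.name
        raise RuntimeError("pool exhausted")

pool = Pool([Host("h1", cpu=8, mem_gb=32), Host("h2", cpu=8, mem_gb=64)])
placed_on = pool.place({"name": "desktop-42", "cpu": 2, "mem_gb": 8})
print(placed_on)
```

The point of the sketch is the shape of the interface: resize or relocate logic lives in the pool, so the consumer of capacity never has to care which physical entity is underneath.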
Why Is This Important In The Information Discussion?
If you draw a picture with one end being the information (presumably on a storage device) and the other end being a user (wanting to see the information in some form), everything in between can potentially be thought of as "friction".
The information is constantly changing. What users want to see is constantly changing. And all the IT gloop in the middle struggles mightily to keep up with the other two endpoints.
Much like superconductivity has the potential to change how electricity is delivered, virtualization has the potential to change how information is delivered. I know, I'm stretching an analogy here, but it's useful.
Less friction is good, and leads to all sorts of wonderful things.
Let's Look At Some Examples
My favorite current example of how all three play out centers around VMware's VDI -- the ability to capture a complete desktop image (e.g. Windows XP Pro) and run it on a server, rather than a desktop.
Cost savings play out when the total cost of delivering and supporting desktops and laptops is considered; the beefier the machines being replaced, the bigger the savings.
Value generation comes from "liquidity". Need more memory, CPU, storage performance? It's yours, on demand. Not using it? It's available for someone else. And, by the way, I can get the exact same superior experience on just about any desktop device, including the one at home.
Service delivery? I don't have to worry about my desktop barfing on me unexpectedly. As long as I've got a decent network, I've got all the performance I can use. My files are always there, and backed up. I don't have to wait 15 minutes to power up and log in first thing in the morning. I don't have to wait 15 minutes while IT pushes yet another massive update over the wire to my struggling desktop.
And so on.
Better, faster, cheaper. Less friction.
Or let's take traditional outsourcing. Part of the high cost associated with any outsourcing proposition sits at the front end (moving it all over to the outsourcer) -- and, occasionally, at the back end, moving it all back or to another outsourcer.
Now, let's say that the outsourcer starts by virtualizing most of your environment. Clear benefits there. But, once done, it's much easier to move the environment, both into the outsourcer, and if needed, somewhere else.
The outsourcer (assuming they're up to speed!) can deliver a better service level, for less money, and be more responsive to changes you might want in your environment. And, through the act of virtualization, much of the "switching cost" has been dramatically reduced.
Better, faster, cheaper. Less friction.
Moving away from VMware-like discussions, let's look at our old friend storage virtualization. Storage can be pooled and consolidated using storage virtualization, so there's the potential for cost savings (although I would argue that there are probably easier ways to get at that particular benefit).
Storage resources are now "liquid"; done right, an arbitrary application can be moved to an arbitrary array at any arbitrary time. When things change, IT can react much faster than before.
And, finally, service levels can be aligned to what the business needs. Someone needs a whole lotta storage performance for an end-of-month sprint? Simply move the workload over to something really, really fast (flash?) and then move it off when you're done.
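That month-end scenario can be sketched in a few lines. Again, this is a hypothetical illustration of policy-driven placement, not a real product's behavior: the tier names, IOPS budgets, and `rebalance` function are all invented for the example.

```python
# Sketch of storage "liquidity": a workload is mapped to a tier by policy,
# not hard-wired to a physical array.

TIERS = {"flash": 100_000, "fc": 20_000, "sata": 5_000}  # rough IOPS budgets

placement = {"billing-db": "sata"}  # current (cheap) home

def rebalance(workload, needed_iops):
    """Move the workload to the cheapest tier that meets its current need."""
    for tier in sorted(TIERS, key=TIERS.get):        # slowest/cheapest first
        if TIERS[tier] >= needed_iops:
            placement[workload] = tier
            return tier
    raise RuntimeError("no tier fast enough")

rebalance("billing-db", 80_000)   # month-end sprint: lands on flash
rebalance("billing-db", 2_000)    # sprint over: back to cheap sata
```

Because the workload addresses a virtual volume rather than a physical array, the move is a management decision rather than a migration project -- which is exactly the friction reduction being described.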
Better, faster, cheaper. Less friction.
More Opportunities For Reducing Friction
Several vendors are working on the idea of distributing pre-tested "stacks" for applications, all neatly wrapped in a virtualization layer. Some are going even further, toying with the idea of keeping that stack patched and updated on behalf of the IT organization.
Once again, more friction out of the information delivery system.
I'm guessing that, before too long, we'll see things virtualized that we really hadn't considered. One early example is EMC's Avamar: sure, it's optimized to back up VMware images using client-side dedupe. But you may not have noticed that the back end can run as a completely self-contained set of virtual machines.
Now, there will be those who argue that running this in a virtual machine isn't as efficient or as optimized as a dedicated piece of hardware.

Maybe they're right, and maybe they aren't. But once people consider the ease of installation, management, and so on under a single VMware-centric paradigm, they may not be as concerned about squeezing the last 10% out of the environment.
Putting Avamar's back-end in a virtual machine reduces friction in a VMware environment.
So, What Does This Mean For IT?
The future of IT is virtual. Physical entities will exist only to support virtual abstractions, and little else. Much of what we assume can only be done on physical devices will find its way to the virtual world, and this will likely happen much faster than we think.
Infrastructure will change to reflect the new reality. IT processes will change to exploit the new capabilities. Friction will be dramatically reduced.
Nick Carr draws an excellent analogy between power generation and corporate IT in his book "The Big Switch".
Going a bit further: if you were in the power generation business, wouldn't you be very interested in superconductors?