You're reading a post that's part of a larger sequence. I'm taking a look at several of the core issues that we'll all be facing in the coming years as we rapidly move to an information economy.
In addition to introducing the topic, I've taken a look at information governance, information risk management, and the growing importance of knowledge workers.
In this post, I want to dig into a fundamental question -- what is an application, or -- more importantly -- what will be an application?
The Application As The Quintessential IT Paradigm
You ask someone what IT is all about, and you'll frequently get an answer like "we deliver the applications that the business needs". The concept of an application is intimately bound up in how IT thinks of itself, on multiple levels.
Did something break? They'll tell you what applications are affected. Got a new project? It'll often be either a new application, or improving an existing one.
But, if the fundamental concept of an application is changing (which it is), does this mean that IT will have to change as well?
The New Application
Jeff Nick (our uber-CTO) has an elegant description of an application, which I hopefully won't mangle too badly here: an application is a composition of services that meets a business requirement.
Like most things Jeff says, it is at once incredibly simple and incredibly deep. So let's take it apart.
No More Monolithic Applications
Those of us of a certain age grew up in an era of monolithic applications: big, freakin' monsters with code that looked more like archaeology than architecture.
Data integration? Easy: extract what you need and move it around between these behemoths. Or, if that didn't work, maybe do some screen-scraping to give users a composite view of multiple applications. Or, if the problem was bad enough, ditch the existing applications and build a new, bigger, scarier monster.
On many levels, it's clear that those views are no longer in vogue. We're now trying to live in a world of composed services.
What Is A Service?
In this context, it's something like an information source (e.g. customer data), business logic (e.g. billing) or some other logical component that is designed to be an element in something bigger. A service exposes or publishes its capabilities to others who might want to use it and, at the same time, hides the complexity of how things actually get done.
Now, remember, I'm oversimplifying here, so take it easy on me.
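To make that a little more concrete, here's a minimal sketch (in Python) of what a "customer data" service might look like. All of the names, fields and the in-memory store are hypothetical; the only point is that a service publishes a small capability and hides how it's actually fulfilled.

```python
# A minimal, hypothetical sketch of a "customer data" service.
# Everything here (names, fields, the in-memory store) is invented,
# purely to illustrate publishing a capability while hiding the details.

class CustomerService:
    """Exposes one small, published capability: look up customers."""

    def __init__(self):
        # The messy details (database, caching, calls into legacy systems)
        # stay hidden behind this boundary; a dict stands in for them here.
        self._store = {
            "c-1001": {"name": "Acme Corp", "status": "active"},
        }

    def get_customer(self, customer_id: str) -> dict:
        """The published capability: callers see only this method,
        never how the record is actually fetched."""
        record = self._store.get(customer_id)
        if record is None:
            raise KeyError(f"unknown customer: {customer_id}")
        return record


if __name__ == "__main__":
    service = CustomerService()
    print(service.get_customer("c-1001"))
```

The interesting part isn't the lookup itself; it's that anything composing this service never needs to know (or care) what sits behind it.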
More interesting is when we think about services that might live outside the firewall, whether provided by the likes of Google or Amazon, or -- perhaps -- by specialized service providers (e.g. credit risk ratings).
What Is A Composition?
Just about anything that strings things together to accomplish something useful to someone. The range of compositions includes ad-hoc API usage, formal SOA compositions, SOAP and REST protocols -- the whole shooting match.
Rather than get into a debate as to which is better (and why), I think it's safe to say that we'll see many compositions in the future.
Ideally, the act of creating a new application would mean sitting down with a library of existing services, connecting the plumbing a bit, maybe writing a few new services or a bit of glue-ware, and -- presto! -- you'd have it.
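As a rough illustration (the service names and billing logic below are entirely hypothetical), such a composition might be little more than a thin layer of glue over things that already exist:

```python
# A sketch of composing existing services into a small "application".
# The service names and logic are invented; the point is that the new
# application is mostly plumbing between things that already exist.

def customer_service(customer_id: str) -> dict:
    """Stand-in for an existing customer-data service."""
    return {"id": customer_id, "name": "Acme Corp", "plan": "standard"}

def billing_service(plan: str, usage_units: int) -> float:
    """Stand-in for an existing billing-logic service."""
    rates = {"standard": 0.10, "premium": 0.25}
    return rates[plan] * usage_units

def monthly_invoice_app(customer_id: str, usage_units: int) -> dict:
    """The 'new application': a composition plus a bit of glue-ware."""
    customer = customer_service(customer_id)
    amount = billing_service(customer["plan"], usage_units)
    return {"customer": customer["name"], "amount_due": round(amount, 2)}

if __name__ == "__main__":
    print(monthly_invoice_app("c-1001", usage_units=420))
```

Notice that the "application" itself is the smallest piece of the picture; most of the value lives in the services being composed.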
How Does This Change Things For IT?
If an application is many services, composed in many ways, how does this affect IT?
Profoundly, I'd offer.
First, the concept of an "application" (funding, management, process, governance, etc.) is now very squishy. As an example, what's more important: delivering the final app, or getting the underlying services right for future use?
Second, how do we feel about services (and compositions) that happen outside the firewall? Not all useful services and compositions will live safely behind four data center walls, will they?
Third, how do we think about traditional IT processes (provisioning, change management, help desk, service level delivery, etc.) in this world of composed services? I'd argue that most of our thinking was formed in a world where applications were monolithic entities, and not distributed compositions.
Fourth, the infrastructure needs to be considerably rethought. In a world of multiple services, how do I dynamically size resources based on fluctuating requirements? How do I capture a consistent state for recovery? How do I secure things adequately in a multi-tenant world? And so on ...
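To give one narrow example of what "dynamically sizing resources" might mean in practice, here's a deliberately naive sketch of a scaling decision. The thresholds, metric and limits are all invented for illustration; real infrastructure would also have to deal with recovery state, tenancy isolation and the rest.

```python
# A deliberately naive, hypothetical sketch of a scaling decision for a
# shared service. Thresholds, metrics and limits are invented.

def desired_instances(current: int, avg_cpu_pct: float,
                      min_instances: int = 2, max_instances: int = 20) -> int:
    """Return how many instances a service should run, given recent load."""
    if avg_cpu_pct > 75.0:      # running hot: add capacity
        target = current + 1
    elif avg_cpu_pct < 25.0:    # mostly idle: shed capacity
        target = current - 1
    else:
        target = current
    # Clamp to sane bounds so the service never scales to zero or runs away.
    return max(min_instances, min(max_instances, target))

if __name__ == "__main__":
    print(desired_instances(current=4, avg_cpu_pct=82.0))  # -> 5
    print(desired_instances(current=4, avg_cpu_pct=12.0))  # -> 3
```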
There's more, but I think I've illustrated my point -- in a world where the concept of an "application" changes, many aspects of IT thinking have to change as well.
Putting Users In Charge Of Applications
The idea of a self-service IT world (or one that's more self-service than today!) is inherently appealing, albeit frightening to some. But it's not that new an idea, if you think about it.
Due to my age, I can remember a workplace where a manager would write some thoughts on a pad of paper and hand them over to an administrative assistant, who would enter the work into a word processor, bring it back for review, and so on. Somewhere along the way, we all learned how to use word processors and spreadsheets, to a certain extent.
At other times in my career, I was doing a fair amount of traditional analytics. After working with an IT analyst, I realized that I'd be better off learning the tools, and just doing what I needed myself.
Recently, I've met a few large organizations that have self-serve environments set up for developers: they use a web tool to request a virtual server (or desktop), pre-populated with their development stack and templates.
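Purely as an illustration, a self-serve request like that might boil down to something as simple as the sketch below. The endpoint, field names and template names are all invented; any real provisioning portal would look different.

```python
# Hypothetical sketch of a self-serve request for a pre-populated
# development server. The endpoint, fields and template names are invented.
import json
import urllib.request

request_body = {
    "requested_by": "dev.user@example.com",
    "template": "java-web-dev",   # pre-populated development stack
    "cpu_cores": 2,
    "memory_gb": 8,
    "lifetime_days": 30,          # auto-expire to help contain sprawl
}

req = urllib.request.Request(
    "https://provisioning.example.com/api/virtual-servers",
    data=json.dumps(request_body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# In a real environment you'd submit the request and track the ticket:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
print("Would request:", json.dumps(request_body, indent=2))
```

The point isn't the mechanics; it's that the developer never files a ticket or waits on IT to rack a server.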
Mashable data is popping up in a few corners. I've even met business analysts (not IT people) who routinely construct new application logic using SOA principles, and turn it over for implementation.
As we all become more proficient (especially with the newer generation of web tools), perhaps the IT investment model should slowly evolve -- to one where services, logic, resources and information are directly exposed to users for their creative use.
Now, we're all familiar with the laws of unintended consequences. As an example, the proliferation of spreadsheet technology means that you'll have some interesting meetings where you try to discern who has the "right" numbers. Or, going back to the self-service developer example, they'll admit that their infrastructure was swamped with people spinning up all sorts of development environments. And, in a world of mashable data, I'm sure we'll see some pretty entertaining results.
But that's not a reason to avoid moving in this general direction, all things being equal.
What Kind Of IT Organization Are You Building?
In a world where every interesting application is a composition of services, some hard questions get raised about the IT organization itself.
Is its job delivering applications, or creating robust services that can be composed?
Should we think of infrastructure as enabling this new model, or hampering it?
And -- ultimately -- how far will we go towards putting tools in the hands of users, and letting them figure out how to get what they need?