VISTA Enterprise Network - Successful Implementation, World Class Support

Thursday, September 17, 2009

Interlude: Shelley on Hybris


I met a traveller from an antique land
Who said: Two vast and trunkless legs of stone
Stand in the desert. Near them on the sand,
Half sunk, a shatter'd visage lies, whose frown
And wrinkled lip and sneer of cold command
Tell that its sculptor well those passions read
Which yet survive, stamp'd on these lifeless things,
The hand that mock'd them and the heart that fed.
And on the pedestal these words appear:
"My name is Ozymandias, king of kings:
Look on my works, ye Mighty, and despair!"
Nothing beside remains. Round the decay
Of that colossal wreck, boundless and bare,
The lone and level sands stretch far away.

Wednesday, September 16, 2009

Four Problems with Master Plans: (2) Imprecision

Dear Reader,

Yes, after the first criticism, this second one doesn't seem fair, but it's true. Life, which isn't a logical syllogism, is filled with seeming contradictions. Here's one: not only are master plans too precise; paradoxically, they also aren't precise enough.

Specifically, while they're too precise about what the future solutions should look like, they're far too vague about the nature of the problems they're supposed to solve. As a result, even in theory master plans address only some parts of the problem and not others.

This is bad but understandable.

Master plans can't possibly identify, let alone address, the full scope of the problems they're supposed to solve because they're created at the beginning, before any serious effort has been made to solve the problem, and thus before Murphy's Law has had its hundreds of chances to teach us how wrong we are about the full extent of the problem. At the time we write a master plan, we just don't know enough about the problem to get its details clear in our heads, and without those details we can't really solve the problem, only take a clumsy swipe at it.

This is understandable but bad.

How bad?

Sometimes it's inconvenient.

Maybe while planning your day, you fail to pay attention to what you're doing at the moment and lock your keys in the car, thus ruining your plans.

Maybe to save three million dollars you replace a deep, stiff bridge design that lets the wind pass through it with a cheaper, shallower, more flexible one that makes the wind pass over and under it, creating aerodynamic lift so it flutters in the wind, tears itself apart, and collapses into the deeps.

Maybe you overplan your bombing runs by putting precisely the right amount of gas in the planes for them to fly out, bomb their targets, and return, but then they encounter a headwind that makes them burn up extra gas so they drop out of the skies before returning home, forcing the pilots to parachute to safety.

Sometimes it's catastrophic.

Maybe you use an explosive gas for airship buoyancy and then try to control all the many scenarios that could cause it to ignite, only to discover that sooner or later your planning or control slips, killing thirty-six people.

Maybe it seems inconvenient to standardize your fire-hose couplings, so you don't, so when your city catches fire the fire engines from neighboring cities arrive only to find they can't help you, and your city burns to the ground.

Maybe your regulations only require a total lifeboat capacity of 1,178 people, even though your ship can carry 3,547, causing the deaths of 1,517 people.

Details matter.

It is the details of problems that make them problems. Something as small as an O-ring seal can kill you if you get it wrong. Engineering is the discipline where "Don't sweat the small stuff" is a recipe for disaster (and where ". . . and it's all small stuff" is criminal negligence).

In medical informatics, more starkly obvious than in much of life, the problems we face seem to shift when we discover their nature is different from what we at first thought, and they also actually shift as the nature of medicine and software shifts around us. The details of our solutions have to be able to shift along with the problems or we can't solve them.

Master plans prevent that dance of solutions with problems in four ways.

First, they emphasize the big picture over the details, wasting energy on prophecy that should have been spent on understanding the problems better, so the necessary details of the solution are missing or vague.

Second, what details they do specify are shaped by design to create (i.e., to serve) the totality, not by discovery to actually solve the problems, so they tend to be out of sync with the reality of the problems (i.e., wrong).

Third, they rigidly lock in those details, preventing them from moving with the problems as the problems shift, soon making any accurate details outdated and incorrect.

Fourth, the nature and interactions of the problems we aim to solve are far too complex to capture in a plan to begin with, ensuring that no matter how much effort goes into a medical-informatics master plan, its worldview always ends up being a ridiculous cartoon of the situation it is meant to deal with.

Any one of these characteristics would be enough to cripple a master plan. Together, they create irresistible forces that cause all VISTA master plans to converge on the same brief lifecycle of (1) ballyhoo, (2) bogging down, and (3) breakdown. And yet the inevitability of failure evidently can't compete with the intoxicating feeling of control a master plan creates, judging by the relentless parade of VISTA-replacement (or "modernization") boondoggles. It would be tragic if after fifteen years it weren't so drearily risible.

Still, at least we can serve as an object lesson to validate Alexander's point:

Master plans fit the shape of the problems with too little precision, and so they fail.

Yours truly,

Postscript: And so, Alexander's critique of architectural master plans holds even more true for enterprise-scale medical informatics than it does for architecture: "Thus, as a source of organic order, a master plan is both too precise, and not precise enough. The totality is too precise: the details are not precise enough. It fails because each part hinges on a conception of a 'totality,' which cannot respond to the inevitable accidents of time and still maintain its order. And it fails because as a result of its rigidity, it cannot afford to guide the details around the buildings which really matter; if drawn in detail, these details would be absurdly rigid."

Tuesday, September 15, 2009

Interlude: Why Some People Just Don't See It

Dear Reader,

In response to my post on 30 August 2009, "Principle 1: Organic Order, part 1: The Three Kinds of Order", on August 31, 2009 at 6:58 AM Die Anyway wrote:

"re: 'it's precisely because VA turned away from those processes toward more totalitarian ones that it lost the ability to effectively manage or develop VISTA.'

"I see it and you see it, so why don't the ivory-tower pointy-hairs see it? Or do they see it and ignore it because they have an entirely different agenda?

"In any case, as a biologist and programmer I like the idea of organic design even if it did originate in those pre-historic times of the mid '70s.

"Eat well, stay fit, die anyway!"

My answer doesn't fit within a blogger comment, so I'll reply here, as this post:

Dear Die Anyway,

The VISTA managers in VA who try to overly centralize and control VISTA - i.e., who try to impose totalitarian rather than organic order - (1) are not necessarily the majority, just the most powerful or most in-favor managers (I know some good VISTA supervisors and managers in the VA and elsewhere), and (2) do not see what we see because they lack the proper understanding of their context.

They think they're facing an entirely different kind of problem than they actually are. If you interpret their actions from the standpoint of their worldview, their approach makes sense for a while. As we'll explore later in this weblog, their approach leads to a gradually increasing breakdown that eventually becomes so dire that even they can see things aren't working. By then they're usually too burned out and despondent to be capable of taking responsibility for their actions by steering the organization in a healthier direction. There's a certain amount of morale a manager has to have to be effective, and unfortunately when one is committed to a false idea one tends to use up one's effective energies on the dead ends.

The problem with the actual cosmos is that its forces and principles are subtle, easily overlooked. Any reductionist intellectual can mentally reduce an intricate organic system to a trivial mechanical one. When one is tasked with something impossible, like managing VISTA while pleasing Congress, one's mind finds otherwise implausible oversimplifications oddly irresistible, because they offer a desperately needed false hope.

Once people decide on an interpretation of reality, they have an astonishing ability to see every "fact" that confirms that interpretation, and an equally astonishing ability not to see everything else that disproves that interpretation. To the outside observer this seems to result in irrational behavior, but from within the interpreter's reality bubble, within the framework of interpretations, the behavior may be completely rational, even inevitable.

It is a great disappointment to discover that rationality has been overhyped. Reason can be used to reach the falsest or vilest conclusions through irresistible logic drawn from false premises.

Rationality is worthless, even dangerous, unless it is harnessed to two things grossly undervalued - even invisible - in our culture:

(1) profound insight into the essential principles at work in creating each situation, and

(2) mature, discriminating taste capable of balancing priorities among competing values to figure out which good must give way for which other good.

Insight is vital because the truth of hardly any situation is visible on the surface, but instead must be sought in the nuanced and subtle but powerful forces that create that surface.

Taste is vital because contrary to just about everything we teach through schools, the arts, and common sense, the bad things in the world do not result from a great conflict between good and evil, but from conflicts between various goods.

Values conflict - the classical Greeks knew this, though we have forgotten it - and which good deserves priority over the others shifts and flows from situation to situation depending on the hidden forces at work.

Neither the truth about the source of problems nor the two vital qualities needed to deal with that truth end up in position descriptions, evaluation criteria, or the law. Instead, managers inside the government and out are held to simple-minded, mechanical criteria, and if they do not bend themselves to fit those laws, criteria, and requirements then they cannot thrive in their careers. They have to believe what they're doing is right, so they do.

In short, I think your important question is answered best by Upton Sinclair:

"It is difficult to get a man to understand something when his salary depends on his not understanding it."

Part of why I love biology is that it is difficult to explain in simplistic mechanistic terms, so its study tends to compel you to develop an appreciation of deeper, essential principles.

Thank you for your comments.

Yours truly,

Tuesday, September 1, 2009

Four Problems with Master Plans: (1) Precision

Dear Reader,

"It is simply not possible to fix today what the environment should be like twenty years from today, and then to steer the piecemeal process of development toward that fixed, imaginary world."

This escapes us when we agree to create a master plan. We forget the simple truth that we're not gods or prophets; predicting the future is never our forte but often our downfall.

Our grand designs fail in four ways.

Today we discuss the first: a master plan is too precise. By predicting the precise shape of the future - the following buildings or software modules will be built in the following ways, and will contribute to the design like so - it leads us into precise collisions with reality when some parts of the plan prove impossible, as always happens.

Murphy's Law isn't very funny to software engineers; it's the reality under which we operate, a reality that's too easy to ignore during planning but impossible to ignore later when our designs come crashing down around all our heads at great expense.

VISTA programming is designed to take advantage of Murphy's Law. It's a highly adaptive process in which we immerse ourselves in the problem with our adopters as guides, in which we do not set out to predict the future, only to solve some specific, immediate problem. A nonprogrammer simply cannot imagine how much failure is involved before one achieves success, but that string of failures is why the VISTA model works and master planning doesn't.

By making planning a cumbersome, expensive process that comes at the beginning, before Murphy's Law has taught us the many ways we're wrong about what's possible, master planning builds a profound ignorance about what's possible into the development process from the start. It saves the exposure of all its failures for the end, when the differences between what's possible and what the plan proposes finally accumulate so greatly that everything collapses in a great crash. Ironically, it's the act of struggling to preserve the plan's success that increases the scale of the failure, since covering up problems and investing more time and money only intensify the inevitable collapse.

By contrast, the VISTA methodology begins making mistakes up front, one at a time, while they're still small - a string of little crashes. We continue making mistakes the whole way, from beginning to end, but they remain small because (1) there is no big, expensive investment in a master plan to try to protect so we can and do change directions repeatedly, (2) the future user of the software is sitting right there the whole time saying "No, we can't do it like that," and (3) instead of struggling to adhere to an increasingly irrelevant plan the team is building up a map of the terrain of the project, a practical guide to many different ways to succeed or fail.

The contrast is stark. Master plans are prophecies. VISTA plans are histories. Master plans are expensive, produced separately from the software. VISTA plans are cheap, produced alongside the software. Master plans are written and finished at the beginning, when we know as little as possible about the problem and its solution. VISTA plans are written during programming and not finished until the end, when we know as much as possible about what didn't work and what did.

The excessive precision of a master plan comes from its focus on the solution, which means trying to pin down the future. VISTA plans avoid that by focusing on understanding the problem and letting the solution emerge gradually and often unexpectedly.

That is, in the master-planning approach we create the illusion of knowing where we're going without first really understanding the nature of the problems we claim to be solving. In the VISTA planning approach, we acknowledge from the start that we have only a vague idea where we're going, knowing only that we're going to solve a specific problem by getting to know it very well.

The VISTA approach involves a lot of false starts and backtracking, a lot of mistakes - it looks sloppy and loose compared to the professionalism of a master plan - but that's its strength and the weakness of a master plan.

The VISTA methodology was first advocated by John D. Chase, Jack Brooks, and Ted O'Neill. Here's what Ted O'Neill and Marty Johnson had to say about the difference between the master-planning approach and their own, in their memo to Ken Dickie back on June 10, 1981:

Requirement specifications can be derived effectively and quickly by developing a basic functioning system with the intended user, then further refining the system through user-suggested modifications. The resulting system, in daily use, will provide a much more complete and accurate reflection of user requirements than any narrative description of a system that has not been built. Moreover, by using available modern techniques, such as natural language application generators, database management systems and other software tools, fully operational prototype systems can usually be produced more rapidly than can paper descriptions thereof. Requirement specifications that are produced before any prototype testing activity has been undertaken are now recognized by most computer professionals as inadequate.

Now go back and reread Alexander's quote from the top of this blog entry.

These are two different ways of saying the same thing. They have the same criticism of master plans, independently arrived at from two different perspectives. Our own history bears out the truth of this convergent revelation.

Master plans predict the shape of the future with too much precision, and so they fail.

Yours truly,

Postscript: Master planning isn't the only kind of planning. The Principle of Diagnosis involves a very different form of up-front planning that helps to create organic order instead of totalitarian order. There are also forms of community planning that focus on studying the problems communities face - the trends, developing problems, and so on - and then draw general, reasonable conclusions to help communities plan. Likewise, there are dynamic forms of project planning designed to accommodate the unpredictable path of research and development. Each of these approaches is the opposite of the master-planning approach because they focus on what we can know best - the nature of our present and developing problems - and sketch lightly and prudently in recommending how and when those problems can be solved.