(This article was originally published in the Spring ’96 ING Component Architecture newsletter – ICA)
Ten years or so back, when I started dabbling in object technology, I could afford to be some kind of a “software hippie”. I’m sure you know the kind. They pop up every now and then, even in respected software companies (I didn’t name one, did I?). They radiate an unrelentingly optimistic attitude that melts the software crisis, budget restrictions and management dedicated to the old ways like an overheated nuclear reactor. They seem, however, at least from my point of view, to be getting rarer. But then my point of view has of course changed. What was a new, untested, and indeed overly optimistic and enthusiastic “new wave” has now become a respected and accepted technique. I have less and less difficulty advocating this “new way” to the decision makers. More and more it is middle and higher management acting as object-oriented evangelists, instead of the bright and curious technical people who often work as programmers in the larger companies. So I started wearing ties, stopped trying to convince, and put my efforts into the “how” instead of the “if”.
What is OMT?
Don’t be afraid that I am going to give a full synopsis of OMT here. I will give only a short introduction in general terms, to lay the groundwork for my arguments later in this article. I am not assuming any previous knowledge of object-oriented methods.
Origins of OMT
OMT, short for Object Modeling Technique, is one of the methods that survived the years to become a de facto standard in software engineering. Developed at General Electric as an in-house modeling technique, it quickly grew, after its introduction to the world with the book [1], into the most popular analysis method (or, as the Americans say, methodology, which seems semantically incorrect).
OMT growing pains
The first publication arrived in a vacuum that filled up very quickly with numerous analysis and design methods. At a conference last year, one methodologist reported having counted 34 such methods, which was for him reason enough to announce the demise of his own method.
Of course this proliferation of methods only reflected the uncertainty that companies faced in working with the new object-oriented tools and languages. Frantically looking for solutions, they seemed prepared to try anything. For some of us this created a feeling of déjà vu: didn’t we see the same sequence of events with the rise of structured methods?
For the chief methodologists this created something of a dilemma, which was reflected in the main publications, where they started efforts towards unification and warned against a “methods war”. The success of some methods, however, was undeniable. OMT in particular was easier to adopt because of its many similarities with structured analysis and design, and its close pictographic similarity to Entity-Relationship modeling.
Merging of methods
It became obvious that most of these methods concentrated on only some of the phases of software engineering. OMT was strong in analysis, weak in design, and in fact completely lacked tools for requirements analysis. Another player in the methods field, Grady Booch, who had also published a book on object-oriented design, took up the gauntlet and did the obvious thing: he hired the main methodologists behind two of the most popular methods, James Rumbaugh, who had become the main spokesman of the OMT method, and Ivar Jacobson, who had created a method called Objectory.
Objectory
The power of Objectory came from a surprising direction: process modeling. Its main concept was the use case, which describes in a more or less formal way the interactions of external agents (called actors) with the proposed system. This approach has the advantage that it becomes much easier to communicate the requirements of the future system with its (potential) users.
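To give a flavour, here is a minimal sketch of what such a use case might look like; the cash-withdrawal scenario is my own illustration, not taken from Jacobson’s work:

Use case: Withdraw Cash
Actor: Account Holder
1. The account holder inserts a bank card and enters a PIN.
2. The system verifies the PIN and asks for an amount.
3. The account holder enters the desired amount.
4. The system checks the balance, dispenses the cash and updates the account.
Exception: if the balance is insufficient, the system refuses the withdrawal and returns the card.

Even in this informal form, a (potential) user can read the scenario and point out immediately where it misses the mark.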
Unified Method
The merging of these three methods promises a synergistic result: together they cover the whole process of software development in a more or less complete way. The method resulting from this merger, however, has taken a long time to mature and has still not been completely published. A white paper exists on the Internet (www.rational.com), but it concentrates mainly on the notational aspects. Only incidental publications have appeared in the main object-oriented magazine [2]. It is understandable, therefore, that most organizations are deferring the use of this method and continue to work with more or less adapted versions of their old methods.
Applications of methods
Before I continue it is perhaps good to emphasize that, although there may be many methods, there are more similarities than differences. They all contain a central repository called the Object Model. And they all emphasize that modeling the problem domain, while perhaps not the largest portion of the work to be done (that is usually the user interface), is the most important portion.
This brings me to the main argument of my article: the emphasis on modeling. Some of you may have encountered the old-style computer analyst. This used to be a person with a computer science degree or something similar, who was invited into your organization and quickly assumed to know your problem domain (say, banking) more and better than you did. That arrogance is one of the main disadvantages for object-oriented analysts today: they have to cope with the mistakes of their predecessors. In object-oriented modeling several things are happening at the same time:
- tools are provided that put the emphasis on abstraction and problem domain modeling
- the role of the user in creating these models is central; that of the analyst is more that of an enabler
- the evolution of these models is much more controllable from the user’s point of view, because of the continuous and consistent feedback loop with the user or domain expert (for example through prototypes)
Users of an object-oriented application are thus in a much more empowered position than before. This has a heavy impact on the whole development process, which might be the subject of another article.
This empowerment of the user has an unexpected result, though: the users themselves take a more active interest in modeling. And here is where object orientation is beginning to prove its power of abstraction. Whether they are astronomers, biologists, economists or carpenters, it appears that capturing domain knowledge in object-oriented models is easier and more effective than with other techniques. So it is easy to understand why there is such an interest in business objects: these object-oriented business models are becoming repositories of critical business concepts. Perhaps industrial espionage of the (near) future will concentrate on business models, and maybe we will see underground vaults guarding the most valuable assets of businesses: their object models.
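To make that claim a little more concrete, here is a minimal sketch of how a fragment of such a banking model might record business concepts. The sketch is written in Java; the Customer and Account classes and the overdraft rule are my own illustrative assumptions, not taken from any real business model:

// Illustrative sketch only: a tiny banking domain model in which the
// classes and their relations encode business knowledge (a customer
// holds accounts; an account refuses overdrafts), independent of any
// user interface or database.
import java.util.ArrayList;
import java.util.List;

class Account {
    private long balanceInCents;

    void deposit(long cents) {
        balanceInCents += cents;
    }

    // A business rule captured directly in the model: no overdrafts.
    void withdraw(long cents) {
        if (cents > balanceInCents) {
            throw new IllegalStateException("insufficient balance");
        }
        balanceInCents -= cents;
    }

    long balance() {
        return balanceInCents;
    }
}

class Customer {
    private final String name;
    private final List<Account> accounts = new ArrayList<>();

    Customer(String name) {
        this.name = name;
    }

    // Opening an account is a domain operation, not a database insert.
    Account openAccount() {
        Account account = new Account();
        accounts.add(account);
        return account;
    }
}

The point is not the code itself, but that the vocabulary of the domain (customers, accounts, an overdraft rule) is recorded in one place; that is exactly what makes such models valuable, and worth guarding.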