Discussions

News: Modeling Usage Low; Developers Confused About UML 2.0, MDA

  1. Many major software tool vendors are promoting formal modeling and the Object Management Group's MDA initiative, as well as supporting mappings from it to J2EE architectures. However, MDA has not taken off. Only about one-third of developers recently surveyed said they use UML, and not a single respondent believes that code generated from models is generally production-ready.

    Read Modeling Usage Low; Developers Confused About UML 2.0, MDA.

    A slew of vendors are implementing MDA-based modeling suites that can generate J2EE architectures. Just yesterday, Compuware announced that they had fully implemented MDA in their OptimalJ IDE. This allows you to model your application in a platform-neutral, OO manner and have your complete three-tier J2EE architecture generated for you. Your model can eventually even be converted into a .NET application simply via re-generation.

    If you watch Jack Greenfield's tech talk, it sounds like model-driven development is the way of the future. But why hasn't it taken off?

    Thoughts?

    Threaded Messages (121)

  2. Hi,

    I just attended an extensive demo of OptimalJ by Compuware. Though their MDA implementation is quite fascinating, I still wonder if it suits most real-world application requirements. They successfully demonstrated creating a Customer-Order-OrderItems relationship in less than 30 minutes, all the way from JSPs to session facade to CMP 2.0 beans, yet the functionality was very much CRUD operations.
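    For concreteness, the service layer generated in such a demo is essentially a CRUD facade over the entities. A rough plain-Java sketch of that shape (this is not OptimalJ's actual output; the class and method names are invented for illustration, and a HashMap stands in for the CMP beans):

      import java.util.HashMap;
      import java.util.Map;

      // Sketch of the kind of CRUD facade an MDA tool generates for a Customer
      // entity; backed here by a HashMap instead of CMP 2.0 entity beans.
      public class CustomerFacade {
          private final Map customers = new HashMap();   // id -> name
          private long nextId = 1;

          public Long createCustomer(String name) {
              Long id = new Long(nextId++);
              customers.put(id, name);
              return id;
          }

          public String findCustomer(Long id) {
              return (String) customers.get(id);
          }

          public void updateCustomer(Long id, String name) {
              customers.put(id, name);
          }

          public void removeCustomer(Long id) {
              customers.remove(id);
          }

          public static void main(String[] args) {
              CustomerFacade facade = new CustomerFacade();
              Long id = facade.createCustomer("Acme Corp");
              facade.updateCustomer(id, "Acme Corporation");
              System.out.println(facade.findCustomer(id));
              facade.removeCustomer(id);
          }
      }

    The demo's point is that all of this plumbing (plus the JSPs and CMP beans behind it) falls out of the model; the open question is how far beyond CRUD it goes.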

    Anyone evaluated OptimalJ for a real transaction based application?

    It's also focused more on generating the services (EJB) layer and has very little to offer in the presentation layer. A major problem I see is the tight coupling with the NetBeans IDE. We have developers using IDEA, Kawa and Eclipse, and I don't see how somebody can convince developers to leave their favourite IDE just to benefit from OptimalJ.

    comments??

    /tahir

  3. >>Anyone evaluated OptimalJ for a real transaction based
    >>application

    I spent a little bit of time looking at their tool (I was almost forced to, in order to get them off my back - the sales people were annoyingly pushy).

    (My earlier comments on OptimalJ).

    As for UML usage, I think that UML is like any other documentation. It should be used to aid communication. Big up-front UML design goes against the grain of the lighter, more agile, XP principles of evolving design to meet evolving requirements.

    I think that the primary reason why UML usage is so low is because the tools have been so poor. Flaky "reverse-engineering" has meant that the UML was never synced with the code and just got left behind. Let's face it, when given a choice between "updating the UML" and getting the system to production, it's a no-brainer which task the team sidelines.

    Together's tool is great at keeping your model and source in sync. It's about the only tool I would use, since it gives you the capability to simultaneously work at the code level and the UML level. It's just a pity it costs as much as a family sedan.

    As for MDA, I don't see how these tools will ever deal with the low-level details of getting a real-world system developed.

    -Nick

  4. I suggest you try IBM's WSAD, based on Eclipse and VAJ, and Rational Rose ... for J2EE applications.
  5. It is interesting that for web based applications, we have taken great effort to separate the work of HTML developers and coders. That's because coders have realized that there's no way that they can develop nice looking professional HTML on their own.

    However, this acknowledgement doesn't seem to be present when we talk about UML. Clearly, we need a way for the business experts to express their intent, and we need to incorporate that information into the programs we develop. When those business intents change, our programs should change correspondingly. The process should be iterative and adaptive.

    To gain some insight into this problem we only need to look at how it's being done for web-based development. The state-of-the-art practice is to not write any code inside JSP pages but instead provide tag libraries with built-in functionality. If we extrapolate this to UML, then implementation details need to be clearly separated from the UML diagram. The classes that you see in UML may or may not map directly into Java classes. A developer's job then would be to map a purely logical UML diagram into an implementation. The mapping should be robust to changes, so that when diagrams change, code should change appropriately.
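    To make the tag-library analogy concrete, here is a minimal sketch of a handler class written against the JSP tag API (the tag and attribute names are hypothetical, and it still needs a TLD entry and a servlet container to run); the page itself then contains only the tag, no Java:

      import java.io.IOException;
      import javax.servlet.jsp.JspException;
      import javax.servlet.jsp.tagext.TagSupport;

      // Handler for a hypothetical <shop:greeting user="..."/> tag; the logic
      // lives here, so the JSP page stays free of Java code.
      public class GreetingTag extends TagSupport {
          private String user;

          public void setUser(String user) {   // called by the container for the "user" attribute
              this.user = user;
          }

          public int doStartTag() throws JspException {
              try {
                  pageContext.getOut().print("Welcome back, " + user);
              } catch (IOException e) {
                  throw new JspException(e.getMessage());
              }
              return SKIP_BODY;                // the tag has no body to evaluate
          }
      }

    The analogous move for UML would be to keep the business-level model free of implementation detail and let a well-defined mapping, rather than hand edits, carry changes through to code.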

    Business software development without input from business stakeholders is doomed to failure. It's about time that we give business experts some control of the software artifact, just like we give HTML developers control of the display. Maybe we trust the HTML developers to do a good job, but we don't have the same trust in the business experts?
      
  6. I know what you're saying. I think the role you describe would be somewhat fulfilled by systems analysts. Systems analysts translate the business requirements into more formal functional requirements. Many systems analysts do use UML, but I think this is the wrong tool for them. Systems analysts need to be experts at deciphering business processes, not determining how this will map into code (that's an architect's job). UML is too code-oriented for them. What systems analysts need is a tool that can more easily represent the business model (is there such a thing as a Business Modeling Language??) They should then leave it up to architects and programmers to translate this into a program structure.
  7. I just read Nick's post about Jack Greenfield. Nick faulted an MDA tool, by writing: "Everything sounds great when development is portrayed as a waterfall (ie little change). However, try to make a change in direction during development and you suddenly have lots of problems on your hands."

    Um, the waterfall approach emerges on almost every project, no matter what process is supposedly followed. So one of the goals of a CASE tool should be to support the waterfall approach, among others such as the iterative approach.

    One difference between the waterfall and iterative approaches is *what* work is lost if requirements change at a third of the way through project development. In the case of waterfall, only analysis and design are lost since the coding phase wouldn't have begun. In the case of iterative, code is also lost. This is potentially a fault of the iterative approach if coding is time-consuming, as with coding by hand. But if the code was push-button generated by an MDA tool, then the thrown away code is no expense. So MDA could accelerate development, which is one of automation's greatest prizes.
  8. I've got news for you, requirements change all of the time. Not just at the beginning of the project. Not just a third of the way through the project. They are constantly changing.

    Taking your example, if you use the waterfall approach and you spend the first third of the project in analysis and design, you have no working example of the software to show your customer a third of the way through. If you extrapolate this to a six-month project, you go two months with no code? So, you have spent a third of the project without getting any meaningful feedback from your customer other than their requirements.

    It has been my experience that requirements are a "notion" of what the customer wants. They don't REALLY know what they want until you start showing them some working software. Once they see something working, they can start steering it more in the direction they want. You are NOT going to get the design perfect the first time without a massive amount of effort and (wasted) time. With an iterative approach, you start getting feedback very soon as to whether or not you are heading in the same direction as your customer's vision.

    The waterfall approach does not have to emerge in every project. I would fight hard to keep this from happening. It just starts friction between the business folks and development folks when you start "freezing" requirements and you have hard boundaries around phases. If the business necessitates a change in the design, it should be allowed. That's what the software is for - to support the business and its customers, not to follow a rigid workflow. And if the code is done correctly (loose coupling, well-managed dependencies), it should be malleable enough to allow change.

    I have to admit I have never used an MDA tool (like BridgePoint) and I am not intimately familiar with the Shlaer-Mellor methodology. From what I have read/seen of both, it looks like most/all of the work is done in models. If that's the case, don't all of the details of the requirement have to be represented in a model, including very fine-grained details? If so, why not just jump into the code? I don't see the value.

    Ryan
  9. Ryan wrote:
    >
    > I've got news for you, requirements change all of the time.
    > Not just at the beginning of the project. Not just a third
    > of the way through the project. They are constantly
    > changing.

    Yes, exactly. This is why alleviating the laborious crafting of hand code is so important. So code generation can be very helpful. If requirements change, and the code were hand written, then the loss of hand work includes analysis, design, *and* code. But if the code is generated, then only analysis and design are lost.

    > The waterfall approach does not have to emerge in every
    > project.

    The very nature of development practice makes analysis precede design, and design precede code. That's an inescapable waterfall for any project.

    > I have to admit I have never used an MDA tool (like
    > BridgePoint) ...

    I have. I learned code generation with BridgePoint, and then I helped write a faster code generator for it. At no time did the model translation strike me as unsound, and usually I recognized value.

    > If that's the case, don't all of the details of the
    > requirement have to be represented in a model, including
    > very fine-grained details?

    An executable model has three ways of satisfying a requirement:

    1) Model it entirely.

    2) Add a hand-coded library, and let the models delegate to it. Shlaer-Mellor refers to that library as a hand-coded domain.

    3) Model it partially, and modify the code generator, if it isn't a black box. For example, make the generator recognize transaction boundaries within the model. In Shlaer-Mellor this design activity is known as 'coloring', where, in the MDA tool, the user marks various actions as, say, yellow, and the code generator can be modified to recognize yellow-colored actions as transaction boundaries. This is similar in principle to J2EE deployment descriptors, though those are textual.
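    To make the deployment-descriptor analogy concrete, the boundary a generator might emit for a 'yellow' action looks like the hand-coded JTA sketch below. The service and account names are invented, a UserTransaction is only available inside a container, and in practice the generator would more likely emit a container-transaction entry in the descriptor instead:

      import javax.transaction.UserTransaction;

      // What a generator might wrap around an action colored as transactional:
      // begin before the action body, commit on success, roll back on failure.
      public class TransferAction {
          public void execute(UserTransaction tx, AccountService accounts) throws Exception {
              tx.begin();
              try {
                  accounts.debit("A-100", 50);   // the modeled action body
                  accounts.credit("B-200", 50);
                  tx.commit();
              } catch (Exception e) {
                  tx.rollback();
                  throw e;
              }
          }
      }

      // Hypothetical collaborator, shown only so the sketch compiles.
      interface AccountService {
          void debit(String account, int amount) throws Exception;
          void credit(String account, int amount) throws Exception;
      }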


  10. Brian,

    It seems that you like the Waterfall approach to developing software.

    If this is the case, then I couldn't DISagree with you more.

    However, rather than me attempt to convince you on this forum why the waterfall approach to software is completely crap, I really recommend you read a couple of very good books:

    (I think every developer should read these books before touching a keyboard)

    First read (for a project management perspective):
    Book 1

    Then read (for a developers perspective):
    Book 2

    They are both short reads (less than 10mm thick)

    With regard to your post:
    >> In the case of waterfall, only analysis and design are
    >> lost since the coding phase wouldn't have begun

    If 1/3 of the way through a project, you haven't written any code, the project is doomed.
    You can easily spend all your time "changing design" and you will never have anything to show. Your business sponsors will (and should!) can it.

    >> But if the code was push-button generated by an MDA tool,
    >> then the thrown away code is no expense.

    I think this is hugely optimistic. For any real-world problem, I don't see tools that can do this successfully (or even come close). Code generation is great for generic stuff - not so good for the complexity you find in real-world problems.

    Cheers,
    Nick
  11. Nick wrote:
    >
    > However, rather than me attempt to convince you on this
    > forum why the waterfall approach to software is
    > completely crap...

    I never claimed otherwise.

    > If 1/3 of the way through a project, you haven't written
    > any code, the project is doomed.

    That's why most of us prefer iterative development, which in the face of changing requirements is accelerated by MDA, as I already described.

  12. >> That's why most of us prefer iterative

    OK, then we agree on something at least (not bad considering all the "silly", "invalid" and "irrelevant" points being made).

    >> which in the face of changing requirements is
    >> accelerated by MDA

    And here is where we will have to wait and see. What tools do you have experience with?

    -Nick
  13. "What tools do you have experience with?" -- Nick

    BridgePoint for general MDA. Also two generations of domain-specific MDA tool, much of it designed by me. But my resume has tons of hand-coding experience on it too, which is why I know the cost of hand code.
  14. Nick,

    Where are you located? Sorry to hear you had problems with a sales person from Compuware... Most of them are OK folks; the Optimal product line is relatively new, and naturally when one is selling missionary-style it can come across as being pushy.

    Mike
  15. declarative programming

    I read lots and lots about MDA and executable UMLs, etc. Traditionally we do
    Design ------ Development --------------- Testing -------------- Deployment
    but instead we could do something like this:
    Design ------ Testing -------------- Deployment

    straight from static declarative class models!
    Check this out, interesting product:
    www.gorillalogic.com

    I wonder how many years it will take to be adopted by the mainstream!
  16. Maybe the problem is that it's too "futuristic". It's a nice idea, but it's something extremely difficult to implement in practice. In fact, I don't believe the state of the art today is able to implement MDA in a workable way. There are just too many details that haven't been fleshed out.

    Today's tools are far more powerful than yesterday's; just look at the capabilities of Eclipse, IDEA, JBuilder. They've added extremely sophisticated refactoring, browsing, code assist, etc. However, the vision of MDA is still a few more years in the future.
  17. "However, the vision of MDA is still a few more years in the future."

    I'm surprised the software development scene has taken so long to catch up. MDA has been documented and successfully applied for at least a decade. Coworkers and I were using the Shlaer-Mellor methodology to automatically translate model diagrams for simulation or execution back in 1994. We demoed for Ivar Jacobson, a co-inventor of UML. Apparently UML and OMG evolve at a snail's pace if it has taken so long to embrace what other methodologies (Shlaer-Mellor, ROOM) had proven before the Web even existed.
  18. Brian,

    <quote>
    I'm surprised the software development scene has taken so long to catch up. MDA has been documented and successfully applied for at least a decade.
    </quote>

    You're right. MDA is not new. We started with a generative approach around the early nineties. What is new is the label "MDA" and the fact that it builds an umbrella over several standards. That hopefully provides us with tool independence to some degree.

    Years ago there was, for example, the standard CASE tool interchange format CDIF, which is now superseded by XMI. In my view the technical benefit of XMI over CDIF is small. But XMI is widely accepted by tool vendors, so that is the great benefit for us.

    The approach helps you stay as technology-independent as possible. But for now you are still tied to your development environment. It's currently not possible to seamlessly migrate your model from one tool to another. With the evolution of the repository standard (MOF) and the stream-based interchange format (XMI), the ambiguity decreases and the interoperability improves.

    The need for wide acceptance of these standards is what causes the snail's pace.

    cheers,

    Phillip Ghadir.
  19. Just some thoughts...

    Thought 1:
    In my mind there is almost always a hard tradeoff between flexibility and simplicity. You want it simple, then you must trade off some flexibility. You want it flexible, then you must buy that by adding complexity. You cannot have a single button that can do anything you want when you push it. Probably someone could state this in a formal way within the context of information theory or something, I am sure (I am not that person).

    Thought 2:
    Imagine that you could do everything through MDA tools (I know this is not their current goal; this is a thought experiment). Not a single speck of Java would be required. Then essentially the MDA would be just another programming language (a visual programming language). In order for the MDA to be simpler than the original language we are trying to replace, we must limit flexibility. Configurable pre-built components would be built (they are not infinitely configurable and thus flexibility is reduced) that could be hooked into by the MDA tools to enable them to build comparable applications. Would the MDA approach be simpler? Maybe, but unless the MDA language turns out to be a superior 'language' we haven't really gained much. We could have just as easily written these components and exposed them to the original language (a library) without sacrificing flexibility.

    Thought 3:
    UML sucks (ok so maybe 'thought 3' is maybe more like opinion 3 ;-) UML looks enticing when you look at a small example, much like state diagrams look like a cool visual way to represent state transitions, but just like state diagrams, this appearance quickly fades once your system becomes 'real world' complex (which is why you will probably never see a state diagram for a language compiler ;-) When all your blocks for a particular problem area can no longer fit on a single page, the advantage of visual representations over simpler text-based mechanism seems to disappear (if there ever was an advantage). I've never really understood why anyone thought a box with attribute and method names in it was superior to a simple textual representation (written in IDL or whatever your favorite interface definition language is). "But what about the relationships between classes?" you say. Well, any half-complex UML diagram I have ever seen omitted 90% of the relationship lines to avoid creating an unintelligible mess of criss-crossing lines anyway. You're better off creating a more conceptual-level diagram (where omitting such connections is valid).
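    To be concrete about what I mean by a "simple textual representation", here is a declaration-only sketch in plain Java (the names are invented); it carries the same information as a class box with a line drawn to another box:

      // Attributes, operations, and an association, all as plain text.
      interface Order {
          String getId();              // attribute
          double getTotal();           // attribute
          Customer getCustomer();      // association to Customer (the "line" in UML)
      }

      interface Customer {
          String getName();
      }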
  20. I don't think that 'UML sucks' just because somebody tries to fit too much information on a single diagram. You can split your design into several diagrams that will show independent parts of the (sub)system or focus on different details of the same part.
    In good tools (TCC, for instance) you can link different parts to have them naturally connected. We do that to connect web-layer logic with the appropriate EJB-layer logic in sequence diagrams.
    A diagram should not describe all the coding issues. It should give you the possibility to look at the design of the (sub)system in general. Having such an opportunity really helps in large projects. Also, following the rule "design first" makes your (sub)system simpler, with a cleaner, more understandable, extensible design.
  21. Alexey,

    Yes, definitely you can split into multiple views, but then the advantage of the visual approach (as compared to textual) is pretty much gone in my opinion. That is just me; I'm sure others may still find it valuable. However, the textual approach is so much easier to produce and maintain as well. I personally don't see the advantage of a visual approach when you have so many pieces you must place them on many separate views.

    Consider the W3C DOM for example. The W3C committee decided to use IDL to represent the interfaces and relationships. It was easier for everyone to view/read and understand. I actually found that to be far easier to understand and navigate than an equivalent collection of UML diagrams.
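    For a feel of how those textual contracts read in use, here is a short walk over a DOM tree using the standard Java binding of the same IDL-defined interfaces (org.w3c.dom, with the JAXP parser bundled in JDK 1.4):

      import java.io.StringReader;
      import javax.xml.parsers.DocumentBuilderFactory;
      import org.w3c.dom.Document;
      import org.w3c.dom.Node;
      import org.w3c.dom.NodeList;
      import org.xml.sax.InputSource;

      // Parses a tiny document and prints every element name with its depth,
      // using only methods declared by the DOM interfaces themselves.
      public class DomWalk {
          public static void main(String[] args) throws Exception {
              String xml = "<order id=\"42\"><item>book</item><item>pen</item></order>";
              Document doc = DocumentBuilderFactory.newInstance()
                      .newDocumentBuilder()
                      .parse(new InputSource(new StringReader(xml)));
              walk(doc.getDocumentElement(), 0);
          }

          static void walk(Node node, int depth) {
              if (node.getNodeType() == Node.ELEMENT_NODE) {
                  System.out.println(depth + ": " + node.getNodeName());
              }
              NodeList children = node.getChildNodes();
              for (int i = 0; i < children.getLength(); i++) {
                  walk(children.item(i), depth + 1);
              }
          }
      }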
  22. Why do we feel that UML for software modeling is so complex while other engineering disciplines use similar complex blueprints? Do you think that the tools other fields use are more mature?
    Lack of expert UML designers could be the reason why managers don't want to spend time and money on diagrams. The tool support, on the other hand, could be better. I've been trying to get hold of Rose for a long time but it costs an arm and a leg. Moreover, the process wars seem to be causing some confusion. RUP and XP, for example. Is software development still maturing?
  23. "Why do we feel that UML for software modeling is so complex while other engg. disciplines use similar complex blueprints? Do you think that the tools other fields use are more mature?"

    Comparing software development to other engineering disciplines is not accurate for this discussion. You can't "refactor" a highway, airplane, chemical plant, etc.
    It is imperative that your design be accurate at a very granular level before implementation begins. (Most) Software is not this strict. You can begin coding with a rough design in place, learn as you code, and let the design evolve. Software's flexibility in this area allows for early implementation and short iterations where almost all other engineering disciplines do not.


    "Moreover the process wars seem to be causing some confusion . RUP and XP, for example. Is software development still maturing? "

    I would say more like evolving. It is MUCH greener than other engineering disciplines, no? How long has software engineering really been around? Thirty years or so? Forty? People have been building huge physical structures for thousands of years - I would hope this process is more mature.

    As for conflicting methodologies - they both have their place. Unless a very rigorous process must be followed (government software? life-critical software), I would argue that lighter methods work just as well. I am glad to see the XP movement garnering so much momentum and attention.

    Ryan
  24. Ryan Breidenbach wrote:

    "You can begin coding with a rough design in place, learn as you code, and let the design evolve."

    Agreed. Indeed, if you are doing almost anything original, i.e., building new components and/or architectures rather than putting together well-understood components in similar ways in cookie-cutter systems, you are compelled to work this way.

    "Unless a very rigorous process must be followed (government software? life-critical software), I would argue that lighter methods work just as well. I am glad to see the XP movement garnering so much momentum and attention."

    I like XP and would jump at the chance to work in an XP shop. Unfortunately it's my feeling that XP is ill-suited to the current structure of the industry. I believe XP is best suited to in-house development teams which largely do not exist any more. XP postulates a degree of trust between developer and customer which I have not seen anywhere this past 8 years or more.

    XP requires far more than developer buy-in; that is the easy part. It requires enormous changes in the way software development projects are organized and paid for. I may be wrong, but I do not see that degree of trust forming spontaneously.
  25. "I believe XP is best suited to in-house development teams which largely do not exist any more." -- Don

    In-house projects constitute the bulk of Java development, so your comment doesn't make sense here on a Java chat site.
  26. "Thought 3:
    UML sucks (ok so maybe 'thought 3' is maybe more like opinion"

    I think UML is very useful. The problem I see is that developers and managers can't seem to understand the need for design and deliberation.
    This is where I see XP as lacking. It focuses on code-writing collaboration instead of on design. Having 2 people together jamming out code isn't that much better than 1 person.

    "When all your blocks for a particular problem area can no longer fit on a single page, the advantage of visual representations over simpler text-based mechanism seems to disappear "

    That may be an illustration of a design problem. Systems should have clear lines of segmentation. There are limits, clearly, on screen and page real estate that make larger diagrams hard to deal with. But UML does provide packaging constructs that help in this area.

    If your design has "an unintelligible mess of criss-crossing lines" then that's a clue to a design problem. If your code is really "an unintelligible mess of criss-crossing" dependencies then you have a code problem. Seeing it in a diagram first can keep you from writing that awful code.

  27. Sartoris,

    <quote>
    If your design has "an unintelligible mess of criss-crossing lines" then that's a clue to a design problem. If your code is really "an unintelligible mess of criss-crossing" dependencies then you have a code problem. Seeing it in a diagram first can keep you from writing that awful code.
    </quote>

    I essentially agree. A mass of criss-crossing lines probably means your code is too tightly coupled and you have not split up the problem domain properly. However, UML is sold as a way to architect large complex projects. In practice I haven't found the value in this. In practice I find one needs several layers of abstraction (i.e. view from 50,000 ft, 1000 ft, 50 ft). UML seems to mostly be at the layer just above code (10 ft, say ;-) I feel at this 'distance' you are better off using IDL or some other textual way to describe the system.

    Look at the W3C specifications, RFCs and open source projects. From the small sample of these I have seen, I have not seen a single case where they used UML. The simplicity of a textual approach (creation, version control, portability, maintenance, 'free' price) makes it superior to a graphical one in my mind. But that is just me.
  28. Anick,

    you wrote:

    <quote>
    ... UML seems to mostly be at the layer just above code (10 ft, say ;-) I feel at this 'distance' you are better off using IDL or some other textual way to describe the system.
    </quote>

    I cannot agree with that. UML definitely allows the modelling of systems from arbitrary viewpoints and with various abstraction levels. If you'd said that UML models reverse-engineered from existing code are almost at code level, I'd have agreed.
    The basic concept of MDA relies on the fact that UML is appropriate for expressing models on more than one level of abstraction - that is, the distinction between Platform-Independent Models and Platform-Specific Models.

    The MDA - Architectural Overview document describes a concept for correlating models from different abstraction levels called "zooming".

    Actually we're using UML to specify systems at a very high level, from which we're creating transient PSMs and source code. With iQgen and a not-yet-published platform (codename: Gemini) we are able to create, from one abstract model:
    - two different UIs and
    - back end with,
      - simple object streaming persistence,
      - Castor/JDO, or
      - EJB 2.0 CMP.

    It depends on configuration which front-end and back-end you wish to generate, so the model is definitely much more abstract than the code.
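    Independent of any particular tool, the idea is that the model fixes one abstract contract and the generator supplies interchangeable realizations. A hand-written analogue with invented names (a generator would produce the object-streaming, Castor/JDO or CMP variants behind the same interface instead of the in-memory one shown here):

      import java.util.HashMap;
      import java.util.Map;

      // One abstract contract, many possible generated back ends.
      interface CustomerStore {
          void save(String id, String name);
          String load(String id);
      }

      // The simplest realization; alternatives would target object streaming,
      // Castor/JDO, or EJB 2.0 CMP behind the very same interface.
      class InMemoryCustomerStore implements CustomerStore {
          private final Map data = new HashMap();
          public void save(String id, String name) { data.put(id, name); }
          public String load(String id) { return (String) data.get(id); }
      }

      public class StoreDemo {
          public static void main(String[] args) {
              CustomerStore store = new InMemoryCustomerStore();  // chosen by configuration
              store.save("42", "Acme Corporation");
              System.out.println(store.load("42"));
          }
      }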
     
    cheers,

    Phillip Ghadir
  29. Phillip,

    I would be interested in seeing an example of this "zooming" capability.

    So far, in all the projects that I have viewed in some way (companies I have worked for, companies that I have consulted for, 'community' OS projects, etc.), I have yet to see 'UML' used in more than a 'toy' capacity or as a way for TAs/SAs and others to feel like they are professionals ("You can't question my design, I used UML!"). I don't pretend that my limited anecdotal experience represents the world, but so far UML seems like this thing that everyone talks about but that no one really uses in a serious way. It's the thing people feel obligated to create in their planning documents so they can appear to know what they are doing and then is quickly ignored and forgotten once real development starts.

    I would love to see solid counter-examples. If I saw how UML could be used in a real and productive manner I might change my mind about it. I have searched the web several times looking for comprehensive UML examples (not just small artificial 'tutorial' examples). So far I have not had any luck finding a single good UML example. I track quite a few open source projects and no open source project I know of uses UML (why not?). So far I find documents with 'words' and the odd custom block diagram far easier to follow and understand. If UML is about more effective communication, then I would like to see an example where it 'more effectively' communicates than old-style word-heavy, diagram-light documentation.

    Cheers.
  30. Hi Anick,

    <quote>
    I would love to see solid counter-examples. If I saw how UML could be used in a real and productive manner I might change my mind about it. I have searched the web several times looking for comprehensive UML examples (not just small artificial 'tutorial' examples).
    </quote>

    I'm not sure if I've got you right. Once I saw a large-scale federated UML model containing about 600 classes from the business domain alone. But of course such a real-life model is too valuable to make it publicly available.
    There are some high-level UML patterns modelled in Fowler's Analysis Patterns book (at least you can get the UML models from his website). But I don't know whether this is what you are looking for.

    <quote>
    I track quite a few open source projects and no open source project I know of uses UML (why not?).
    </quote>

    I don't know, but quite often I've asked myself the same question. Maybe they are only drawing sketches on paper. Some people might prefer simple hacking, too.

    <quote>
    I would be interested in seeing an example of this "zooming" capability.
    </quote>

    One example that comes close to that is the sample model that comes with Innovator. It shows an analysis model of online banking as well as a design overview and a detailed design.

    cheers,

    Phillip Ghadir.
  31. Phillip,

    Thanks for the reply.

    You have seen a real-world UML model that contained 600 classes, so it's good to know that they do exist somewhere. Of course there is no guarantee that it was of benefit over a text-based approach. Obviously I would have to look at it myself and see if I went "Wow, that is a lot easier to understand".

    I will try to find the diagram examples you mention on Fowler's website.

    Thanks.
  32. I agree with Anick. Please visit the following URLs:

    DOMAIN-SPECIFIC MODELLING: 10 TIMES FASTER THAN UML

    http://www.metacase.com/papers/Domain-specific_modelling_10X_faster_than_UML.pdf

    OR

    http://www.metacase.com/papers/sdc2000/sld001.htm

    OR

    Magazine article in Embedded Systems Engineering (2002)

    http://www.esemagazine.co.uk/common/viewer/archive/2002/May/14/feature5.phtm

    In my project, I used a domain-specific model expressed in XML and then generated EJB/SQL/XML code by using XSLT. We did not use expensive CASE tools. Out of approximately 2480 files in the project, 1520 are generated ==> 61% code generation. The serverside.com paper that I authored with Craig Cleveland is:

    http://www.theserverside.com/resources/article.jsp?l=XMLCodeGen
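    For anyone wanting to try the same approach without a CASE tool, the pipeline is just JAXP's Transformer applied to the model file. A minimal driver sketch (the file names are placeholders for your own model and stylesheet, and the output directory must already exist):

      import java.io.File;
      import javax.xml.transform.Transformer;
      import javax.xml.transform.TransformerFactory;
      import javax.xml.transform.stream.StreamResult;
      import javax.xml.transform.stream.StreamSource;

      // Runs one code-generation stylesheet over one domain model file.
      public class GenerateFromModel {
          public static void main(String[] args) throws Exception {
              Transformer codegen = TransformerFactory.newInstance()
                      .newTransformer(new StreamSource(new File("entity-to-ejb.xsl")));
              codegen.transform(new StreamSource(new File("domain-model.xml")),
                                new StreamResult(new File("generated/CustomerBean.java")));
          }
      }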

    Soumen Sarkar.
  33. It is nice to see I am not alone in questioning the productivity gains of UML.

    I visited the sites and papers that you mentioned and enjoyed them. I would like to see more articles. It made me realize that I am focusing too much on code and not on proper analysis of the problem domain first. UML does nothing to aid in this effort that I can see. UML is not so much a universal modelling language as a universal way to describe code. Since it is 'barely' above code, it doesn't really offer much value. For those that cannot see the difference, consider using UML to model the propagation of sound waves in water. UML can describe the 'code' that you would use, but the real model (the wave equation, reflection, refraction, diffraction, ray theory, etc.) is either in your head or in the textbooks you must refer to.

    Cheers.
  34. "UML is not so much a universal modelling language as a universal way to describe code. Since it is 'barely' above code, it doesn't really offer much value." -- Anick

    Are you claiming that UML's use case diagrams are "barely above code"? Hmmm.
  35. I can represent use cases much more effectively with text than with big useless use-case diagrams. Use cases are valuable, sure. Before UML, use cases were called 'functional requirements'. UML just gave them a fancy new name and a really inefficient way to represent them. I don't see the value personally.
  36. Anick,

    "I can represent use cases much more effectively with text than with big useless use-case diagrams."

    I agree. I think I remember Fowler agreeing with you in his book "UML Distilled", but don't quote me on that.

    However, I don't feel that way about the sequence diagram. I have found that sequence diagrams (supported by other program specification documents) are excellent at communicating requirements to the programmer. JMO.

    Mike

     
  37. Mike,

    Sequence diagrams are definitely one of the more interesting and unique elements of UML. I have only created a few sequence diagrams; they were pretty simple and small and were not created for any 'real' need, but mostly just used to add a technical-sophistication type 'look' to my otherwise text-heavy documents ;-) So my experience with them is minimal. Anyway, it is an area I haven't really thought about, pro or con.

    What value do they add? (I'm asking this question of myself as I write this). Is there any sequence diagram that can't be represented by a simple list of function calls or steps? Actually, now that I think about it, why do people use sequence diagrams rather than old-style flow charts? And why use flow charts when you can just use simple pseudo-code? Sequence diagrams seem less powerful than flow charts to me (they cannot represent decisions and branching) and flow charts have less information density than pseudo-code (so pseudo-code wins! ;-)

    The sequence diagram is 2-D, but the information is 1-D (a linear 'sequence' of function calls) - or is it? Perhaps in sequence diagrams one can represent asynchronous execution, in which case a simple 'list' would not do. I don't know. Actually, I doubt a sequence diagram is adequate for representing asynchronous execution now that I think about it (techniques like those listed in "Concurrency" by Magee and Kramer are what are really required). I guess you could use it to illustrate a 'possible' asynchronous sequence that would be only one of many, but then you could do this with a list also. Hmmm. Maybe the diagram as it is laid out makes it easier for us to mentally keep track of what is what? As long as the diagram doesn't get too big this might be true. As situations arise I will ask myself "could a sequence diagram display this more effectively?". Perhaps.

    Cheers.
  38. <quote>What value do they add? (I'm asking this question of myself as I write this). Is there any sequence diagram that can't be represented by a simple list of function calls or steps? Actually, now that I think about it, why do people use sequence diagrams rather than old-style flow charts? And why use flow charts when you can just use simple pseudo-code? Sequence diagrams seem less powerful than flow charts to me (they cannot represent decisions and branching) and flow charts have less information density than pseudo-code (so pseudo-code wins! ;-) </quote>

    Most often I personally prefer activity diagrams to sequence diagrams. They 'speak louder' than sequence diagrams and are easier to create. In an activity diagram you can describe decisions, alternative flows of control, forking, joining and synchronization in an easy way that really communicates. Complex sequence diagrams are for super-humans to read and for gods to create.

    - Hejin
  39. "UML can describe the 'code' that you would use, but the real model (the wave equation, reflection, refraction, defraction, ray theory, etc) is either in your head or in the text books you must refer to." -- Anick

    The thrust of object-oriented software is that real-world phenomena are readily abstracted into capsules of data and behavior. If science can be taught, then it can also be modeled as objects. Modeling, like math, is just a language more formal than English. The conceptual content is the same regardless of the description language, which is why I'm surprised that you might claim that some descendant of Algol, such as C or Java, could be the only feasible way to model data and behavior. Is there some emotional reason why you'd want hand code to predominate?
  40. Brian,

    Huh?

    I think you misunderstand me. I'm all for modeling. Domain-based modeling. What I'm saying is that UML models code and is not the best model of the particular chosen domain. For this one should use the natural language and models of the domain and not directly jump into the concept of objects and relationships. I gave the example of the propagation of sound in water as an extreme example (OO has absolutely no concept of 'fields', for example, or anything remotely like a field, which is of central importance to this model). You would not 'jump' to UML to model this. UML is not the best language to model this domain. For this particular domain there is essentially one model (with many simplifications such as ray theory for speed of calculation) - the "wave model". If you think you could bypass or replace this model by jumping directly to UML I would REALLY love to see the result. Few things fit as neatly into objects and object relationships as you might think. Think about topics of interest in physics, mathematics, law, linguistics, psychology, literature, history, business theory, economics, chemistry, ... (the list goes on and on). You would be hard-pressed to naturally model any of the models from any of the textbooks from any of these disciplines. Can they be shoe-horned into an OO world, or a procedure-based world, or a relational-based world? Yes of course (or sometimes anyway); this is like asking "Can you write a program to represent the model?", but you should not 'start' there, thinking first about the 'code'. You should not start doing any UML or coding at all until you firmly understand the problem domain (have a natural model for it). In our own very limited domain, OO doesn't even model relational models very well, which would seem like a pretty close cousin to OO, you would think. If it did, then there would be no need for the plethora of relational mapping tools and techniques out there, with which no one is yet really satisfied. You wouldn't need them if OO was a natural way to model relational data.

    Which brings us back to the true value of UML, modeling code, which in my opinion is small. Representing a class as a visual box does not provide significant value over representing the class textually (in my opinion). This is just my opinion. I doubt I would convince anyone who believes otherwise, but I do point to the open source community and standards (RFC, W3C, etc.) and ask "how come they don't appear to be using UML?" (at least not that I know of). Perhaps they are all fools or are irresponsible. Who knows.

    Cheers.
  41. Anick wrote:
    >
    > ...OO doesn't even model relational models very well...

    Interesting. Shlaer-Mellor is an object-oriented methodology that uses enhanced entity-relation diagrams for all of its static modeling. I prefer this.

    > Representing a class as a visual box does not provide
    > significant value over representing the class textually
    > (in my opinion).

    No, of course a box lacks value by itself. But the real benefit of diagrams is in presenting information networks, as with object associations, data flow, control flow, lifecycle, and interactions.

    > ...I do point to the open source community, and
    > standards (RFC, W3C, etc) and ask how come they don't
    > appear to be using UML? ...

    The utility of diagramming is severely limited without an executable model, so much so that I only voluntarily diagram if the model is executable. A methodology like Shlaer-Mellor or OMG's MDA suggests an executable model. RUP and UML don't.
  42. Brian,

    I think I have taken too harsh a stance against UML and I apologize. I regret taking such an antagonistic stance. Really I would like to see examples of how UML has worked for people. When I have used UML myself, or I have seen the results of others who have used UML, I didn't find it useful and it seemed like a lot of work with little benefit. I'm the type of person who has to see the value personally and doesn't use the latest tool or jump on the latest fad unless I can see obvious and real improvements over the previous system.

    Perhaps I and my colleagues are not using UML correctly. Perhaps if I saw a larger real-world example and real added value (through MDA tools or otherwise), I would change my mind. I have spent some time looking for this, but because most people use UML for proprietary in-house projects, it is hard to find real examples.

    So I will wait for now. I will focus on the natural domain model, which I feel has been somewhat ignored by most, and pioneers like yourself can prove and improve UML and MDA tools, and eventually I may also see the light.

    Cheers.
  43. "Brain, I think I have taken to harsh a stance against UML and I appologize. I regret taken such an antagonistic stance. Really I would like to see example of how UML has worked for people." -- Anick

    I'm satisfied with my past success at executable models. Though for this I haven't used UML, since unlike Shlaer-Mellor or ROOM, UML wasn't designed with executable models in mind, even though UML's inventors were aware of the concept at the time. I have no appreciation for UML beyond it being a de-facto standard. I consider UML's notation too complex and its static semantics too close to code. Still, UML will surely become further entrenched now that OMG has designated it the official language of MDA. And MDA's automation is too significant a productivity boost to grumble about which style of artwork to use.
  44. Brian,

    I'm not familiar with executable models, but I imagine them to be a type of 'simulation' language. If this is so then definitely I could see the power of simulation tools where the computer can basically run through scenarios for you and where you can quickly try out ideas and find and fix areas of weakness. Basically this would be a type of high level prototyping and I'm all for prototyping.

    Cheers.
  45. "I'm not familiar with executable models, but I imagine them to be a type of 'simulation' language. If this is so then definitely I could see the power of simulation tools where the computer can basically run through scenarios for you and where you can quickly try out ideas and find and fix areas of weakness. Basically this would be a type of high level prototyping and I'm all for prototyping." -- Anick

    You totally grasp the intent of executable models, for which it's unnecessary to write code merely to prototype and validate behavior. For example, if the behavior of a bunch of actors is modeled with state diagrams, then it's possible to mechanically debug their interactions prior to coding. It's a developer productivity boost to have correct behavior prior to procedural coding. That's one example of how MDA can save labor by avoiding rework, but it goes even further when considering also the reduction of required hand code.

    At a minimum, state models can be automatically encoded for deployment in re-usable machinery, such as a state transition table. Still better, if each state's action is internally modeled with a graphical notation that details processing abstractly, then most of a program's business logic no longer needs manual coding in a low-level target syntax, such as Java.
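    As a down-to-earth illustration of that re-usable machinery, here is a state transition table for a toy order lifecycle hand-coded in plain Java. The states and events are invented for the example; the point is that a generator would emit the table from the state model rather than a person typing it:

      // Order lifecycle encoded as a state transition table.
      public class OrderStateMachine {
          static final int CREATED = 0, PAID = 1, SHIPPED = 2, CANCELLED = 3;
          static final int PAY = 0, SHIP = 1, CANCEL = 2;

          // NEXT[state][event]; -1 means the event is ignored in that state.
          static final int[][] NEXT = {
              //              PAY     SHIP     CANCEL
              /* CREATED   */ { PAID,  -1,      CANCELLED },
              /* PAID      */ { -1,    SHIPPED, CANCELLED },
              /* SHIPPED   */ { -1,    -1,      -1        },
              /* CANCELLED */ { -1,    -1,      -1        },
          };

          private int state = CREATED;

          public void handle(int event) {
              int next = NEXT[state][event];
              if (next != -1) {
                  state = next;
              }
          }

          public int getState() {
              return state;
          }

          public static void main(String[] args) {
              OrderStateMachine order = new OrderStateMachine();
              order.handle(PAY);
              order.handle(SHIP);
              System.out.println("final state = " + order.getState());  // prints 2 (SHIPPED)
          }
      }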

    For example, a Java method's body makes heavy use of local temporary variables, and typically programmers explicitly declare them, which is tedious and a common source of compilation errors. But if processing is modeled as a data flow diagram, then temporary variables are abstracted away without loss of expressivity or executability. And readability is greatly enhanced too. With Java source it's straining to find all the subsequent uses of a temporary variable. Whereas, with a data flow diagram, the distribution of temporary data is readily apparent visually as a consequence of the notation.

    The trick is arriving at a graphical notation that rigorously models abstract processing. It's a gaping void in most modeling notations, including UML, and I've dedicated myself as hopeful inventor at this. That is, I take processing artwork (such as control-flow and data-flow diagrams) seriously, and in my spare time I'm implementing a diagrammatic CASE tool for round-tripping with Java source. My inspiration is Shlaer-Mellor process diagrams and Prograph, but the semantics are entirely Java, and the notation is novel. Such process diagramming is a crucial technology before any application can be specified solely via executable graphical models, which I consider an inevitable innovation. So, yes, I hope to make hand coding Java's source text files obsolete.
  46. <brian-miller>
    At a minimum, state models can be automatically encoded for deployment in re-usable machinery...
    </brian-miller>

    Correct. I overlooked the mention of state models in my previous post. These are also important for MDD.

    Have you looked at Rose Real-Time (RT)? I believe that it uses state models heavily.

    Bran Selic of Rational also came up with a very simple but powerful notation for modeling state. One can actually bypass other notable UML models in favor of using it. The model type was invented for RT, but can be used for non-RT development. It deserves a look.

    Vaughn
  47. Brian,

    I think you might be setting the bar too high (hey, if you can hit it, great though). Personally I think the value of the tools you talk about is in their simulation capability for prototyping. Programming simulation tools seem like they would be a great idea (really rapid prototyping). The mapping to code and 'round-tripping' you describe is probably unnecessary though in my mind. Write the prototype in some high-level simulation language (or visually, whatever), UNDERSTAND the system, and then it should be a piece of cake to write the actual code. This would be a one-way mapping (prototype --> write code). Round-tripping that actually works seems unrealistically hard for the real world (almost like AI).

    Cheers.
  48. <Anick>
    The mapping to code and 'round-tripping' you describe is probably unnecessary though in my mind.
    </Anick>

    Roundtrip engineering becomes unnecessary when the "model is the code." That's because Java or whatever serves as what we think of today as object code or p-code or byte-code or whatever.

    Again, consider static structure, interaction, and state diagrams along with the Action Semantics language as the 100% solution for the future of development. Frameworks will plug and play as models, not as code. So when you purchase a framework, you get the model, because the model is the code.

    Think of the certifications that could exist because "code" (models) can play itself through the use-case-based test cases that ship with the model (because they are also model/code). It will be the end of vaporware and the end of software that only claims to have a given feature. End users would actually be able to demo the product from the architectural and design model in order to verify a use case they need.

    That's what Jack Greenfield (great name for this space!) and others refer to as moving toward Software Fabrication. At that time engineers will truly _have_ to be engineers.

    Futuristic? Of course. But it is on the way, and not that far down the road. So coders, look both ways before you cross the road if you don't want to be flattened by the MDA/MDD race cars. :-)

    Vaughn
  49. Interesting concept. If I understand things correctly it sounds like you are saying that the model IS the code, so essentially this is an attempt to create a new programming language. In this case there is no point in even mentioning Java, other than that the model might 'compile' (or whatever it does) to Java byte-code or something, but it could just as well compile to something else or be interpreted directly, etc. If the model is done through visually drawn diagrams and such then the 'language' is a visual language. If it is in XML or such then it is a programming language based on XML (of which there are some others, XSLT for example). I don't think the programmer disappears because the language is made visual. The same skills would be required to program in this language as any other, I would imagine. Theoretically this would just be a higher-level language that would bring greater efficiencies and such (like the higher-level languages before it). Visual programming languages have been tried before (I remember a Borland product that was a visual programming language; I don't remember the name) without much success. That doesn't mean that visual programming languages can't work, just that the previous attempts weren't able to get it quite right.

    Cheers.
  50. "If the model is done through visually drawn diagrams and such than the 'language' is a visual language. If it is in XML or such that it is a programming language based on XML (of which there are some others, XSLT for example)." -- Anick

    Ah, but dialects such as SVG have proven that drawings and text are entirely interchangeable through automatic translation. So the distinction between diagrams and text with regard to automation is moot. What matters most is that humans are shown their favorite representation, which for me happens to be pictures, with each being worth a thousand words.
  51. <quote>
    This is where I see XP as lacking. It focuses on code-writing collaboration instead of on design. Having 2 people together jamming out code isn't that much better than 1 person.
    </quote>

    Apparently, you don't fully understand XP. XP has never been against design. The four activities XP encourages are coding, testing, listening and DESIGN. So before you comment on something, please get the facts straight.

    That said, XP does not suggest Big Upfront Design. This is where many projects fall down. You can make all sorts of UML diagrams all day long, but eventually you are going to have to get your hands dirty and code. And it is not until you code that you discover the limitations/errors/incorrect assumptions of your original design. That is when you go back, design some more (now that you have learned) and then code. Wash. Rinse. Repeat.

    And XP is not against modeling, either. Scott Ambler has written tons on this very thing - see http://www.agilemodeling.com/. The idea is that UML is a communication tool, not an end product. I think we can all agree that the end product of software development is SOFTWARE. If it isn't, what the hell are we developing? UML is a way to communicate software's design, interactions, etc.
    And sometimes, a whiteboard and a marker are worth as much as a $X,XXX CASE tool.

    Personally, I see the value of UML. If my team is deliberating on a design, I know I can go up to the whiteboard, sketch out a sequence diagram of an idea, and I know that everybody will understand my thinking. Sort of the same thing with design pattern catalogs - it creates a common vocabulary for software developers.

    And, yes, XP encourages pair programming. People are quite opinionated about this. I have done quite a bit of both. I would simply say that until you have tried pair programming (I mean REALLY tried it. For weeks. With different people), don't knock it. I'm not going to get into the pros/cons of this - that is an entirely different subject. I am just saying pair programming isn't for everybody, but it certainly has its advantages over lone-ranger coding.

    Ryan
  52. "Apparently, you don't fully understand XP. XP has never been against design. The four activities XP encourages are coding, testing, listening and DESIGN. So before you comment on something, please get the facts straight. "

    My original post
    <quote>
    This is where I see XP as lacking. It focuses on code-writing collaboration instead of on design. Having 2 people together jamming out code isn't that much better than 1 person.
    </quote>

    The keyword is "focuses". I stand by that statement. The emphasis on pair programming is greater than the emphasis on design. By the time code is being written a design should already be in place.
  53. Sartoris Snopes said:

    The keyword is "focuses". I stand by that statement. The emphasis on pair programming is greater than the emphasis on design. By the time code is being written a design should already be in place.

    -----------------

    OK. To me, that is implying that XP starts coding before a design is in place. This is simply not the case. However, XP does start coding before the entire application has been completely designed. I think this is a good thing. I think it is a foolish assumption to think you can design an entire system before writing a single line of code and get it right. Perhaps it can be done (although I am VERY skeptical), I just think that is a huge, up-front waste of time.

    In the XP approach, you take the most critical element of your application and design this first. Then you implement this in code. Then you take the next most critical element, and you design this. If this does not completely mesh with the first thing you implemented, you refactor some and implement them both, etc. This evolutionary design changes as you add more things, because you aren't always going to get it right the first time. This does not mean you make foolish design decisions without considering other parts of the system. You make the design flexible (design patterns play a big role here) to reduce dependencies between different parts of the application. This does not mean you have to think of everything first before you start coding.

    I would argue that this approach takes just about as long as the 'diagram everything first' approach, but you start getting working code faster. Also, compare these two approaches:

    design->design->design->design->code->code->code->code
    design->code->design->code->design->code->design->code

    Now let's say the customer wants to see what you have halfway through the project. With approach one, you can show them some pictures of what their system will look like down the road. With approach two, you can show them working software with their most critical pieces already implemented.

    Also, all of this (and most of this thread) is assuming green-field projects. In my experience, this makes up a small percentage of software development. The majority of development seems to be enhancements/replacements/defects. Do MDA/model-to-code tools work well in this space? If you take over a project that was developed by other consultants (I have done this several times), would an MDA tool work well in this circumstance? Let's assume you don't have any pre-existing UML to work with. What do you do in this circumstance?

    Ryan

    P.S. For a more tongue-in-cheek look at this debate, check out The Fragile Manifesto

    http://www.sdmagazine.com/documents/s=7468/sdm0208j/
  54. Hi Ryan,

    Good post. In the old days, back when we had to walk miles to school in the snow :), there were all kinds of monolithic development methodologies (anyone remember STRADIS?). Yeah, I'm way old. They suffered from exactly what you are describing here... way too much time spent dotting every i and crossing every t before you ever started building the system. In defense of these techniques, procedural languages did not lend themselves to the "refactoring" techniques that are possible today. Then the new buzzword of the time appeared... prototyping. Prototyping was basically what the new buzz "refactoring" is. Build a little, rework, and rinse and repeat (stole that from you or someone else... I like it). Prototyping came in a couple of different flavors... prototypes that you intended to present to users and learn from, and then throw away and start over. In other cases, you did not intend to throw away the prototype and start over. In extreme cases, maybe you even built the prototype in a language you didn't intend to use in production (e.g. prototype in VB and then rewrite in C++). The moral of the story is to find the right balance between upfront analysis and design and actual development. This is more of an art than a science. The old method of designing everything upfront is a loser and a thing of the past in my opinion. Also, jumping straight to coding will be a loser. I have read a bit about XP, and like much of it. Of course, a lot of it is sold as something new, and in reality it's just some rehashing of old ideas with fresh paint. One thing I have bought into that does seem new is writing tests before you code. That is an excellent idea, in my opinion. We will just have to agree to disagree on the pair programming. I don't have to try everything before I can make a judgement on its merits.

    Mike
  55. "To me, that is implying that XP starts coding before a design in place."

    That's not my point. My opinion is that XP does not focus enough on design. XP doesn't forgo design entirely but it puts more emphasis on 'pair programming' than collaberative design.

    "I think it is foolish assumption to think you can design an entire system before writing a single line of code and get it right."

    This is true.

    "In the XP approach, you take the most critical element of you application and design this first. Then you implement this in code."

    This is bad. It ignores the elaboration phase. I've worked on several projects that skipped this phase. Grand assumptions were made, and many lines of code were written, only to discover very late in the game that the design would not work. The code had to be almost totally ditched.
  56. "This is bad. This ignores the elaboration phase. I've worked on several projects that skipped this phase. Grand assumptions were made, many lines of code was written only to discover very late in the game that the design would not work. The code had to be almost totally ditched."

    What exactly is the "elaboration" phase? I am not completely familiar with this term. Explain to me what happens there that helps solve this problem. Is this phase specific to a particular methodology?

    Regarding XP...I don't think not designing the whole system first is bad. When you are designing components of a system, the design is never made in a vacuum. You have to take other parts of the system into consideration, but you don't get into the details until later. You always have a high-level idea of how it all fits together. But how you flesh out the core components may impact how you want to design the peripheral components.

    Part of what makes this process successful is simply good OO practice. By communicating with other pieces through interfaces, you are not forced to design the other piece completely, just to figure out what interface it should expose. On the project I am working on right now, the GUI and the backend were both designed in an evolving manner, completely independent of each other (using pair programming too!). The integration of these two pieces, while not entirely seamless, went quite smoothly.
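
    To make that concrete, here is a tiny made-up sketch of what I mean (the names are hypothetical, it's just to illustrate the idea): the GUI team only has to agree on the interface, not on how the backend is built.

    // Hypothetical interface the GUI and backend teams agree on up front.
    public interface OrderService {
        double totalFor(String orderId);
    }

    // The GUI codes against OrderService only...
    public class OrderPanel {
        private final OrderService service;
        public OrderPanel(OrderService service) { this.service = service; }
        public String renderTotal(String orderId) {
            return "Total: " + service.totalFor(orderId);
        }
    }

    // ...while the backend evolves independently behind the interface.
    public class InMemoryOrderService implements OrderService {
        public double totalFor(String orderId) {
            return 42.0; // stub until the real pricing logic is fleshed out
        }
    }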

    I am also partial to the "No Big Upfront Design" camp because on the projects I have worked on that used big upfront design, it was unsuccessful. When we started actual coding, we had TONS of diagrams, models, etc. Granted, we did not use a CASE/MDA tool to generate the code. Still, we found out very quickly that the original design was worthless.

    Both processes, if implemented correctly, probably have their place in particular types/sizes of projects. Most projects I have worked on recently (small; 4-10 developers) lend themselves nicely to XP. No complaints so far.

    Ryan
  57. "What is exactly the "elaboration" phase?" -- Ryan

    I think it's part of the Rational Unified Process, or something. Extreme Programming is a minimal actual example of RUP, as is documented. Personally, I prefer Shlaer/Mellor's process because its diagram notation is most clear, and its automation phases are the most clear.
  58. "Regarding XP...I don't think not designing the whole system first is bad." -- Ryan

    Agreed. The beauty of XP is that it allows limited end-to-end testing as early as possible in the process. This is the so-called "steel thread" which is so important for doing continuous integration. XP totally kicks booty on this point, that and RAD prototyping.


  59. " I think UML is very useful. The problem I see is that developers and managers can't seem to understand the need for design and deliberation.
    This is where I see XP as lacking. It focuses on code writing collaberation instead of on design. Having 2 people together jamming out code isn't that much better than 1 person. "

    XP doesn't obviate the need for design (this is one of the misconceptions). In fact, it ensures that design is a continuous process. Design *needs* to be a continuous process in order to keep in step with ever-moving requirements. Refactoring (a core aspect of XP) is the act of evolving the design.

    The problem with UML's usage to date is that it has been centered around a big up-front design phase... a la waterfall.
    (And as I have said before, this has been partially a result of crap tools. It's also been due to a particularly flawed idea of what software development is all about.)

    Generally (unless you are in the military or aerospace or industrial control industries) this kind of approach fails the project rather than aids it.

    UML isn't crap - it's an important communication medium. However, at the end of the day, it's the code that executes. What the MDA stuff is trying to do is get the UML to execute... I just think it is too high a level of abstraction to be able to do anything useful.


    -Nick
  60. "UML isnt crap - its an important communication medium. However, at the end of the day, its the code that executes. What the MDA stuff is trying to do is get the UML to execute.... I just think it is too high a level of abstraction to be able to do anything useful. "

    Actually Nick, your post shows exactly how bizarre this entire thread is. It combines a whole lot of people with only slightly intersecting knowledge bases - XP, UML, MDA, JDO, EJB, etc. - and has them trying to create a solid discussion.

    Your line about "get the UML to execute" is quite simply wrong. MDA doesn't require that the UML not be executable - take Bold from BoldSoft (another MDA product), for example. The UML model _is executable_. You also put your OCL in there for invariant constraints (as well as pre- and post-conditions and derived attributes) and it becomes executable - all constraints you put into the model become a fundamental part of your application.
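
    (For anyone who hasn't seen OCL: a constraint attached to the model looks something like this - a made-up invariant on an imaginary Order class, purely to illustrate what "putting your OCL in there" means:)

    context Order
      inv: self.items->notEmpty()
      inv: self.total >= 0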

    BoldSoft views the UML world in three parts - UML for communication ("UML sucks"), UML for code generation, and UML for execution. BoldSoft uses the latter, and they have many case studies where people are saying they are building their applications 5-10 times faster using the MDA approach. It certainly isn't the same as what Steve has been talking about, and I have noticed that most Java MDA products don't make much use of OCL (I wish they would; models created with UML alone are simply imprecise, and OCL is pretty easy to support), but I could not say with certainty that none do.

    So this is another one of those "from J2EE to oblivion" kinds of amusing discussions - with no participant knowing quite the same set of information as everyone else and making assumptions because of it. I would say, though, to those who believe that UML has no place in the development lifecycle: MDA will prove you unbelievably wrong. If you don't grok it now, at least start looking into it - you can never know enough. Another site, besides those that have already been mentioned, covers Convergent Architecture (ArcStyler, which has already been talked about, is a product mentioned there). The site... www.convergentarchitecture.com

    Richard
  61. "So this is another one of those "from J2EE to oblivion" kinds of amusing discussons - with all participants not knowing quite the same set of information everyone else does and making assumptions because of it." -- Richard.

    Well, J2EE makes a fine target platform for MDA, as is the case with Sun's Ace project. And JSPs are well suited as template-based code generators.
  62. "...your post almost precisely shows exactly how bizarre this entire thread is. It appears to combine a whole lot of people with slightly intersecting knowledge bases and have them trying to create a solid discussion - from XP, UML, MDA, JDO, EJB, etc"

    Fortunately, however, your post unifies and clarifies the discussion for everyone :-)

    MDA suggests/implies a design methodology and execution platform. Comparison to other methodologies like XP, and to other platforms like EJB is fairly natural.

    "Your line about "get the UML to execute" is quite simply wrong. MDA doesn't require the UML not to be executable"

    I dont quite understand your point. Perhaps mine was misunderstood. (at least, I dont understand what exactly you are pointing out to be "quite simply wrong").

    My two points were:
    + It's the code that executes, not the documentation.
    + Trying to program in a high level language like UML limits your ability to deal with low-level details - and hence the ability to solve many real world problems (unless you can step outside the tool to access the low-level code).

    "and they have many case studies where people are saying they are building their applications 5-10 times faster using the MDA approach"

    5-10 times faster than what?? (I am sorry, I am not taking a shot at you personally, but this is the same marketing-speak that I got from Compuware)

    At no point am I saying that UML and MDA have no value or no future. Simply, the ambitions of MDA are quite high, and I have yet to see a tool that we could use.

    -Nick

  63. "Fortunately, however, your post unifies and clarifies the discussion for everyone :-)"

    I know you meant that amusingly, but I truly do not believe that a discussion like this has much point - so much further understanding needs to be had.

    When you say "misunderstood", I suppose so. You say "It's the code that executes, not the documentation." - well, isn't the UML code documentation? Isn't OCL considered documentation? Bold, for example, executes the model itself: it ensures that during the execution of your application based on that model, all constraints (modelled or placed in OCL) are enforced all of the time. I'm not trying to push Bold particularly, as it is not a Java platform and not really relevant here, but it is an example of how it can be done.

    As you say, you do finally have to develop the code, there is no question of that, but MDA is as appropriate to XP as it is to a fully bloated RUP process. MDA just gives you architectural options. MDA says "do a platform-independent model", then "use something to make it platform-dependent". It doesn't say it will write your code for you, but if a lot of what you do is CRUD, why not have the option?

    5-10 times faster than before. I have to agree with their assessment; they have a case study of the Swedish Parliament on the OMG/MDA site that said exactly that. They have people on their newsgroups who say that (plants maybe?) - I have used it myself and it was quite a revelation, and I am one of the most jaded people there is when it comes to UML and Java! Certainly much faster than a non-MDA approach.

    I have yet to evaluate all of the tools (iQgen I will examine as I have time), so I cannot comment on that. There is a general distaste amongst Java developers for anything that gives them less access to the "code" (as a friend of mine once said, "yes, all very well, but I _want_ to write code!")

    I only know what I do because I have to do a presentation on it next month; otherwise I would probably have skipped this entire conversation, assuming it would be just another one of these topics. I just hope developers will be open-minded enough to accept it when its value is clearly demonstrated.

    Richard
  64. "You can not have a single button that can do anything you want when you push it."

    But we can have a button press that does something as simple and routine as generating programs from diagrams. Very straightforward, really.
  65. "Imagine that you could do everything through MDA tools (I know this is not their current goal, this is a thought experiment). Not a single speck of Java would be required. Then essentially the MDA would be just another programming language (a visual programming language)."

    Distancing the developer from text source is as important for boosting creativity today as abandoning assembly language was decades ago.

    Also, years ago I made two MDA tools, and I gave both of them backdoors for attaching hand code when the tools were otherwise too inflexible to model crucial application details. So like, manually writing text source isn't completely vanishing any time soon.

    Hand code is particularly important in the OMG's MDA specification, which describes levels of possible MDA automation, with entirely hand translating diagrams into code as the most primitive example of a valid MDA.
  66. If MDA is like a visual programming language as you say, then great! It may or may not work out, though. Being 'visual' does not automatically make it superior. I remember Borland made some sort of visual programming tool where you did the programming in a visual flow chart. It worked, it was very innovative, but it didn't take off. Again, once things got complex and you couldn't see all the blocks of a particular operation on one page, you were no better off than with text (worse, because you saw less information per 'view' than with text).

    On the other hand, I have seen some digital signal processing tools that were 'visually' programmed (you dropped various blocks into the processing flow) that seemed mildly successful for that particular niche. The same arrangement could have been done in XML or a properties file with minimal complexity, though not nearly as sexy.

    Visual programming is appealing because it seems like it might be easier to do and to understand, but in the end I feel you can actually represent more information in the same space with text than you can with blocks and lines. Blocks and lines take up a lot of space and don't add a lot of additional information. In addition, with text you have the advantages of ease of creation and maintenance, being free, portable, etc., that I have mentioned in some previous posts.

    I'm not saying that somebody won't eventually build a next-generation language that is visual and that is superior. Maybe. I'll wait and see. Meanwhile I will use text-based approaches (IDL, text documents and explanations, just like every W3C or RFC document out there that I have ever seen).

    Cheers.
  67. "In order for the MDA to be simplier than the original language we are trying to replace we must limit flexiblity."

    This is no different from the loss of flexibility when switching from assembly language to C++, or from C++ to Java. For mainstream development, the increased abstraction has always been worth the slight loss of flexibility.

    "Would the MDA approach be simpler? Maybe, but unless the MDA language turns out to be a superior 'language' we haven't really gained much."

    If you believe in model diagrams, such as UML, then I don't see how you can avoid MDA as ordinary design tools evolve. One reason that diagrammatic modeling couldn't be popularized was that UML tools lacked an emphasis on MDA automation. Maybe that's changing now.

    But really, all a diagramming tool needs to support MDA is a model interchange format, such as XML. That allows an external code generation script to automatically translate the models into executables. This has been feasible with modeling tools for years.
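
    For example, even something as crude as the following gets you from an exported model to class skeletons. It's only a sketch - it assumes the tool writes UML classes as elements named "UML:Class" with a "name" attribute, which varies between XMI versions and vendors:

    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    import java.io.File;

    // Minimal sketch of an external generator: read an exported XMI file and
    // print an empty Java class skeleton for every UML class it finds.
    public class SkeletonGenerator {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(new File(args[0]));
            NodeList classes = doc.getElementsByTagName("UML:Class");
            for (int i = 0; i < classes.getLength(); i++) {
                String name = ((Element) classes.item(i)).getAttribute("name");
                System.out.println("public class " + name + " {");
                System.out.println("    // TODO: generated skeleton - fill in behaviour");
                System.out.println("}");
            }
        }
    }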
  68. With the current state of affairs, I see a number of challenges for the MDA effort. Here are two that are at the top of my list:

    1) This confirms my experience - most developers do not use UML on a regular basis. So right off the bat, we have a barrier to entry. Furthermore, using UML as a convenient notation is one thing, but using it in such a way that executable implementations can be generated directly raises the learning curve significantly. And tools today are not even close to helping developers here. We can't even seem to get plain ol' UML tools done right in this industry, let alone what is required for MDA!

    2) In the Java space, we have a vast number of choices regarding the frameworks we use at implementation time. To make things manageable, I would think an MDA vendor would need to lock its customers into one set of frameworks (and they would likely be proprietary for maximum control). Otherwise, they are facing an enormous number of possible transformations, all of which require some user input. This is simply unacceptable to the general body of developers who have invested time learning the frameworks that aren't supported.

    I think efforts like MDA offer interesting ideas, but let's be realistic. The existence of tools that generate robust enterprise systems from UML for the general body of developers seems akin to jumping the Grand Canyon at this point.
  69. I actually thought the replies to Jack Greenfield's tech talk provided many reasons why "code generation" tools are not likely to be successful. I thought Nick Minutello in particular nailed it.

    http://www.theserverside.com/home/thread.jsp?thread_id=13034

    I'm not sure why shops wouldn't adopt UML. It serves the same purpose that entity modeling serves: simple graphical diagrams that clearly communicate requirements. That's not the same thing as requiring an expensive investment in complex tools like Rational. In my opinion, you could go a very long way with a free UML tool and good manual procedures. JMO. I don't think it's realistic to generate "production ready" J2EE applications. However, I do think any tool that would generate templates or design pattern object shells, where the developer fills in the details, does make sense. Who wouldn't want some of the monkey work taken care of? I haven't used XDoclet, but I thought that was its purpose.

    Although I may be a skeptic about using generated code in a production system, the thought does occur to me that such tools may serve as a good J2EE teaching aid. Probably not exactly what the vendors are hoping for. :)

    Mike
  70. Hi Mike and the Community,

    <quote>
    In my opinion, you could go a very long way with a free UML tool and good manual procedures. JMO. I don't think it's realistic to generate "production ready" J2EE applications. However, I do think any tool that would generate templates or design pattern object shells, where the developer fills in the details, does make sense. Who wouldn't want some of the monkey work taken care of? I haven't used XDoclet, but I thought that was its purpose.
    </quote>

    I could not agree with you more than I already do: "any tool that would generate templates or design pattern object shells, where the developer fills in the details, does make sense."

    To get relief from that same itch and make productive use of MDA possible, I recently wrote UML2EJB, an Open Source code generator that produces EJBs from UML models. It is very very small and simple, yet template-driven. Use any XMI-enabled CASE tool to model your classes, get the best architect that you will find on your project and make him write the templates, and: let UML2EJB do the "monkey work"! :-)

    UML2EJB is based on XDoclet so that you get EJBs that can readily be deployed into the most popular application servers.
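
    To give an idea of what the templates expand into, the generated beans carry XDoclet javadoc tags along these lines (a simplified sketch, not the literal output - the exact tags depend on the templates you write and on the XDoclet version):

    /**
     * @ejb.bean name="Customer" type="CMP" cmp-version="2.x" view-type="local"
     */
    public abstract class CustomerBean implements javax.ejb.EntityBean {
        /** @ejb.interface-method view-type="local" */
        public abstract String getName();
        public abstract void setName(String name);
        // EntityBean lifecycle callbacks omitted for brevity
    }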

    Try it and give me some feedback.
    You can find more info at:
    http://uml2ejb.sourceforge.net/
    http://sourceforge.net/projects/uml2ejb/

    Have fun!
    Matthias

    -----------

    Matthias Bohlen,
    Consulting that helps software teams to succeed...
    http://www.mbohlen.de/
  71. " I actually thought the replies to Jack Greenfields tech talk provided many reasons why "code generation" tools are not likely to be successful. I thought Nick Minutello in particular nailed it.
    http://www.theserverside.com/home/thread.jsp?thread_id=13034 "

    Nick's post was way too pessimistic. Every complaint he made was either invalid or irrelevant. He shouldn't presume the superiority of hand code based on its current prevalence.
  72. Brian,

    "Nick's post was way too pessimistic. Every complaint he made was either invalid or irrelevant. He shouldn't presume superiority of hand code based on it's current prevalance."

    We will just have to agree to disagree. :) I do think building software is fundamentally different from producing widgets on an assembly line. And I'm that pain in the butt on every project who keeps yelling that we have to have standards and try to turn this project into an assembly line "LIKE" process.

    And besides, even if he was completely wrong, "irrelevant" is too harsh. Isn't expressing opinions the relevant point of TSS?

    Mike

  73. "And besides, even if he was completely wrong, "irrelevant" is to harsh."

    Nick made three claims, and the last was, "Building distributed enterprise applications that are performant and scalable is a very complex task." But complexity is exactly what design automation remedies, and it seemed a silly point. Developers strive to tame complexity with automation, and MDA is just automation.
  74. Brian,

    "Developers strive to tame complexity with automation, and MDA is just automation."

    I agree, and appreciate your input. This site is absolutely awesome because intelligent guys like you AND Nick express their views. No need to be PC here. :)

    On the issue of automation, it's all a matter of where you feel comfortable with the dependencies. There are many layers of trust a developer is basically agreeing to. I agree to trust the Java compiler, the OS, and the application server, for example. Some of these things are just a given (i.e. it's not practical not to depend on them). Other areas like persistence (entity EJB vs JDO), IDEs and MDA-type tools are not clear cut in my opinion. I have to weigh any potential dependencies (marrying in) against the benefits. In the current economic climate, it may not even be based on the merits of your tool. No matter how good your product is, I'm (or the client) in trouble if you disappear (HP, WebGain). That's sad, because you are dead on that automation is needed with J2EE. A J2EE application, MDAed or not, will still be complex underneath. At what point do you allow your developers to be dependent on the tool vs deeply understanding the technology?

    Remember, I said early on that I like to play devil's advocate. I appreciate the debate.

    Mike



  75. Mike,

    you wrote:

    <quote>
    No matter how good your product is, I'm (or the client) in trouble if you disappear (HP, WebGain).
    </quote>

    Of course you're right, and that's why it's important to use standards that give you vendor independence. XMI/MOF/UML already cover the modeling side, J2EE and CORBA (some might add .NET, but not me ;-)) the implementation basis. Regarding transformation, work is currently being done by the OMG with the MOF 2.0 Query / Views / Transformations RFP.

    <quote>
    That's sad, because you are dead on that automation is needed with J2EE. A J2EE application, MDAed or not, will still be complex underneath. At what point do you allow your developers to be dependent on the tool vs deeply understanding the technology?
    </quote>

    I don't believe that a project team will ever be totally isolated from the underlying technology. I don't believe in tools that 'automagically' do all your creative work for you. Using a template-driven approach, you need a very good understanding of the technology to develop the templates; it's just that you don't need to do what somebody else in this thread called 'monkey work'.

    Stefan Tilkov
  76. "A J2EE application, MDAed or not, will still be complex underneath." -- Mike

    The difference between that MDA project and the hand-made one is that the MDA project is the one with meaningful art.

    Certainly, by allowing formal translation to code, standardized diagrams can be used directly as application blueprints. And that makes visual designs valuable to developers.


  77. Brian,

    Firstly, I am so glad to hear that I am in the business of making invalid, irrelevant and silly points. I do my utmost to make them as silly, invalid and completely irrelevant as possible.

    The point you seem to be making is that complexity can be hidden by automation.

    To a limited degree, I agree (j2ee is evidence of this).

    But, how exactly is this automation driven? Is it also automated?

    At some point, a human must instruct a piece of silicon to perform some work using 1's and 0's. I am not being patronising here - my point is that it's all about what level this human works at to write these instructions.

    It's a natural inevitability that the higher the level of abstraction, the less control over detail you have. My point, quite simply, is that in complex systems these details matter - and there are lots of them. Unless they are generic and repetitive, it is very difficult to abstract them away.

    I would be interested to see what aspects of building enterprise applications you feel can be successfully automated.

    -Nick


  78. Let me give a thought on each of the two uses of UML:

    - code generation: the problem is not that there are many things you cannot do, but what are the things that would be better handled by a graphical language instead of text? Also, the UML concepts must be supported by the programming language: for instance, class hierarchies are well represented in UML, but generated state diagrams look crappy because there is no specific keyword for them in Java (see the sketch after this list)

    - code documentation: everybody has his own perspective on what is important at a given time, and you end up representing the same things several times, slightly differently (for instance, sequence diagrams and collaboration diagrams could be synchronized) depending on the user; I think there should be a single model, with queries used to dynamically generate diagrams at the level you want, for instance based on packages or modification date (it may require some 3D displays like www.thebrain.com and graph layout algorithms like http://www.ilog.com/products/jviews/graphlayout/ and http://www.tomsawyer.com/glt/); until then, "grep" and "javadoc" will remain the most powerful tools to navigate through my code!
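
    (What I mean about state diagrams: since Java has no notion of states and transitions, a generator typically has to fall back on something like the State pattern, which is far more verbose than the diagram it comes from. A rough hand-written sketch, not the output of any particular tool:)

    // A three-state diagram with two transitions, expanded into Java.
    interface OrderState {
        OrderState pay();      // transition: Open -> Paid
        OrderState cancel();   // transition: Open -> Cancelled
    }
    class Open implements OrderState {
        public OrderState pay()    { return new Paid(); }
        public OrderState cancel() { return new Cancelled(); }
    }
    class Paid implements OrderState {
        public OrderState pay()    { return this; }   // already paid: ignore
        public OrderState cancel() { return this; }   // not allowed after payment
    }
    class Cancelled implements OrderState {
        public OrderState pay()    { return this; }   // ignore
        public OrderState cancel() { return this; }   // ignore
    }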

    Loïc
  79. Why UML when we have XML Schema + XMLSpy?
  80. I have to disagree wholeheartedly, which is to be expected since our company sells an MDA compliant, model-driven software generator.

    We developed this product (called iQgen) because we have successfully used an MDA approach in a number of projects, well before the term was even invented, and wanted to turn our experience into a tool set that we could use to jump-start large project development efforts. I strongly believe that every project that exceeds a basic level of complexity will profit immensely from this way of developing software.

    Some reasons for using MDA and code generation technology are presented in our whitepaper (which covers our product but is valid for at least some of our competitors as well).

    I think that a lot of people don't realize that MDA is not the kind of code generation they are used to from their experience with CASE tools. I can see no reason why you would want to model a 1:1 mapping of your design in UML instead of your target language; models are intended to allow people to communicate on a more abstract level. What we and other MDA supporters do is to separate a business model from an architectural mapping, and thus, ultimately, the target environment. Of course that mapping has got to be under complete control of the architect. MDA is not about tools that force you to use their "reference" or "standard" architecture (since there is no single architecture suitable for every purpose). It's about enabling *you* to define your own architectural transformation in one place instead of having it sneak into a thousand places all over your code.
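
    As a deliberately simplified illustration (the names are made up, and a real generator works on the model rather than on strings), the point is that the architectural decision lives in exactly one place - the mapping - instead of being copied into every class:

    // The same platform-independent model element, two architectural mappings.
    // The architect controls which mapping is applied; the business model never changes.
    interface ArchitecturalMapping {
        String generate(String entityName);
    }
    class JdoMapping implements ArchitecturalMapping {
        public String generate(String e) {
            return "public class " + e + " { /* plain class, persisted via JDO metadata */ }";
        }
    }
    class CmpEntityBeanMapping implements ArchitecturalMapping {
        public String generate(String e) {
            return "public abstract class " + e + "Bean implements javax.ejb.EntityBean { /* CMP 2.0 */ }";
        }
    }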

    Some examples: One of our customers developed a system prototype based on the EJB 2.0 PFD 1 (using dependent objects in their CMP model). When PFD 2 came out and major parts of the spec changed, they were able to change their architecture and regenerate all of the system, without changing any business code, without making changes to their UML models, in a matter of days. Another example is a project I worked on where a rather naive CORBA architecture (everything was a CORBA object - anyone remember the intergalactic object b******t?) was changed to a more sensible one in a week. The project included millions of lines of code (developed during a year by up to 180 developers).

    It's an illusion that you can define an architecture at the beginning of a large project so perfectly that you won't want (and need) to evolve it during the system's lifetime, often even during the system's development. MDA allows you to do just that: keep your business model, save your existing (functional) implementation code, and change the architecture when needed.

    Stefan Tilkov
    CEO, innoQ

  82. I completely agree with you Stefan. This is exactly the experience I had with good MDA Tools. They are not yet perfect, but just think about how long it took the traditional IDE to get to the state they are now in.

    In the MDA tool market there are several different approaches which fit different problems. Think about TCC, ArcStyler or in contrast iUML/iCCG - there are HUGE differences.

    Felix



  83. UML TODAY
    The problem with UML today is (as quoted in the article) that there are no real benefits for the programmer (I don't use "developer" here!) because most often UML is used as documentation that is not really coupled to the system. Very often the documentation (UML) is created after the system has been developed.
    This use of UML really sucks, since it adds some more work at the end of a project and has no short-term benefit - at least not for the programmers who drew the UML diagrams.

    VISION OF MDA
    What is MDA really about? It's about transforming one model into another model, where the final model typically is code. To achieve this we need good tools that do these transformations automatically and semantically correctly. To achieve semantically correct transformations it is necessary to have a set of well-defined rules (which are part of an Architectural Style).

    MDA TODAY
    One clear advantage of model-driven development is the encapsulation of complexity. Different levels of complexity are encapsulated in different models, while each model represents the actual system.
    To repeat: in the context of MDA, UML is not only documentation - the models really represent the system. You change something in the model - the actual system is changed in the way you changed the model. At least in theory!
    There is no MDA tool around (at least I don't know of one) that really supports this 100% - but the important question is whether they support this idea in a way that really improves the ROI. I believe there are tools that have a dramatic impact on the ROI. While I have worked with only three of the current "MDA tools" (IMHO only one has the characteristics to claim the "MDA" label), I came to the conclusion that, wisely used, one of these tools will improve your productivity by up to 20% (which is BIIIG money in large projects) in the context of J2EE (EJB, dynamic web interfaces).
    One problem with today's MDA tools is that you will end up with very HUGE models, but the possibilities to e.g. debug these models (yeah - instead of debugging Java it will be debugging UML now) are still very limited, if they exist at all.

    MDA HYPE
    Just want to mention that you should really check whether a product is vaporware or really exists, and whether there are any projects that have successfully used that product.

    STATEMENT
    I believe that the concepts of MDA have the potential to change the way software is developed. I strongly believe that MDA may have an impact comparable to the introduction of higher-level programming languages some decades ago.


    Regards,
    Felix



  84. A major problem that I've experienced is the horrendous price of many of the UML CASE tools. I've found this to be a significant inhibitor when trying to introduce UML-based modelling practices to companies and to the developers I work with. Ideally I like all developers in a team to have access to a CASE tool so that they can contribute to the design and make best use of any forward and reverse engineering facilities available. Inevitably, however, a compromise has to be made and only a select few end up with the full CASE tool; the others have to make do with read-only versions or HTML pages :-( Fortunately, extremely good and realistically priced CASE tools are now becoming available, so perhaps things may change :-)

    David.......................
  85. Domain Specific MDA tools?

    If I could summarize, are we all saying "The idea is great, but the execution is lacking"? I, for one, see the value of working at different abstraction levels and from different viewpoints or perspectives. The big question is, are the tools up to snuff? Is this technology really ready for prime time?

    I can see UML and its meta-model framework being very beneficial for defining your class structure. I've seen sequence diagrams useful for validating/testing code. I've seen state diagrams used in protocol and GUI design. Activity diagrams seem very useful for workflow and B2B systems. There are definitely many things that are useful in UML, and I've seen bits and pieces of them embedded in different kinds of products supporting different kinds of domains.

    With today's technology I don't think you can build a universal tool. However, if you have a specific domain in mind, then it should be possible and be extremely effective. So, now I have a new question to MDA vendors, "Who is providing tools to help build domain specific MDA tools?".

  86. Domain Specific MDA tools?

    In my opinion the domain doesn't matter at all if you are using the MDA approach.

    If the code generator is flexible enough (e.g. using templates), you should be able to meet your requirements regarding code.
    UML is just a unified notation that helps us communicate with each other on an abstract level, but it knows nothing domain-specific.

  87. Domain Specific MDA tools?

    "If the code generator is flexible enough (e. g. using templates), you should be able to meet yout requirements regarding code."

    Totally. A good OOA/OOD tool focuses on mapping requirements to code.
  88. Domain Specific MDA tools?

    Carlos Perez wrote:

    "With today's technology I don't think you can build a universal tool. However, if you have a specific domain in mind, then it should be possible and be extremely effective. So, now I have a new question to MDA vendors, "Who is providing tools to help build domain specific MDA tools?"."

    Sort of like Martin Fowler's book about reusable patterns a few years ago? It's a damn good idea.

    I use UML to document architectural and design choices. It's an admirable tool for showing high-level design. It is necessary to know UML (at a minimum) to keep up in this profession because most book authors use it to document their designs. There seems to be major potential to use UML to document reusable architectural and design model 'templates', but I don't see that happening yet.

    Thus far I've worked on one project using Rational Rose (purportedly the RUP). The theory was that a fully-documented Rose model would either generate usable code (it didn't) or that, once the perfect model was complete, it would somehow *magically* inspire the developers to write great code without further input. That didn't happen either.

    There were many problems with that project and it would be a mistake to blame either UML or the RUP for problems which stemmed from analysis paralysis, team bloat, and general mismanagement. But it sometimes seems that vendors and methodologists oversell their products as *magic bullets* which allow projects to do without highly skilled and experienced programmers and designers, and/or will allow one to 'design it right the first time' (aka use the waterfall model instead of an iterative approach).

    The idea should not be to produce crap software faster but rather to produce better software in a normal timeframe.

    My major memory of the RR model I constructed (without meaningful guidelines) was that by the time the designers got around to reviewing the model it was 3 days before the deadline. Quelle horreur, there were major problems with the model (which I had constructed for my own guidance rather than for anyone else's use)! And those problems MUST be redressed immediately! Pressure was brought to bear, with the result that about 10 unscheduled hours were consumed in that last week.

    During the last week before a delivery, time is short. The result was that I stayed until after midnight rather than until 10 PM every day that week.

    The next time I'll be tempted not to do the model at all unless copious time is explicitly scheduled for construction and review of a model. Why give them something to shoot at?

    Some training in the RUP and RR is also necessary. UML used as a language is relatively straightforward, but doing a RR model with UML under the RUP is not obvious at all I'm afraid. Especially if you aim to do it *right*.

  89. Hi David,
    roughly you are right.

    YES
    Most tools are priced very high and will provide a good ROI only if applied correctly. Not every tool will increase your ROI; some will even bring your project to a halt!

    KNOWLEDGE
    Therefore it is important to understand the concepts of MDA and the inherent problems of the different approaches (of different MDA tools) to reach the vision that is described by MDA. You will need developers who know how to apply the tools / the concepts.

    BUT
    You are referring to a "UML CASE tool" - I would associate something like Rational Rose 2001 with that expression. Take this: Rational Rose 2001 alone is not an MDA tool -> core concepts are not implemented / supported. Of course you can also use Rose in an MDA project ... but most of the work must be done "by hand" in such a scenario. Got it? In this case -> no increased ROI (no improved quality or productivity).


    REVERSE ENGINEERING
    In complex environments reverse engineering is very limited and will cause problems with a homogeneous architecture. Some MDA tools support reverse engineering while others prefer strict forward engineering. This is a point where you have to decide which skills are available in your development team and what you want to do with an MDA tool. Please consider the limitations of either reverse engineering or forward engineering and make your decision.

  90. It's very simple. Modelling tools will become more accepted and used in the industry when the Fat Cats in charge realize the truth -- that documentation and modelling are more important than hacking code. Obviously, this epiphany cannot be achieved until managers and directors become enlightened with the idea that UML diagrams are worth more $$$ than the codebase.
    The MDA tools have done their part: they guide the typical manager into the green grass on the other side with great features such as code generation. Now it's time for IT companies to do THEIR part and start taking modelling a great deal more seriously than before. These tools will ultimately replace a slew of "hands-on" tasks with instant results. They will also usher the abstract thinker into the low-level world of programming, which is something that all development departments sorely need.

    And by the way, of the 33% of developers who claim to be using UML, you can pretty much put down your next paycheque on the assumption that half of them are kidding themselves and misleading the person who took the survey just so they could look smart, and most of the other half do not have UML diagrams of "quality" material. Until this phenomenon truly changes and we all start taking UML more seriously, these tools will go nowhere.

    Basil.
  91. Hey Basil,

    Why aren't you more reluctant to trust your code base to a tool? Doesn't that mean marrying into a vendor? Using a tool to eliminate repetitive tasks ("monkey work") makes a lot of sense to me (like automatically creating interfaces), but I don't want to drive my business logic with a CASE type tool. It just makes more sense to leave that in the language domain, and go code it. JMO.

    Mike
  92. Hi Mike,

    "...Why aren't you more reluctant to trust your code base to a tool. Doesn't that mean marrying in to a vendor..."

       I have no problem marrying a vendor so long as the market in which the vendor conducts business is, at the very least, an oligopoly that can never be consolidated further. For example, we can only buy middle-class cars from three different automakers right now -- Ford, Chrysler, and GM. I don't see any reason why we should change our buying habits just to get a fourth or fifth vendor in that market, and I use that analogy for software too. Yes, I know... all this rhetoric about Microsoft owning *everything* if we're not careful does indeed have some definite truths to it, but I am not worried, because time has shown that Microsoft will ALWAYS have decent competitors. Bottom line: I don't mind tying myself to a vendor by buying their code-generator/modelling tool so long as the tool generates code that can be used for multiple languages/platforms, which is what this article is about.

    "...I don't want to drive my business logic with a CASE type tool..."

    I do. I am hoping that in the future the positions of modeller, architect, and programmer all merge into one person using one tool. I know that sounds crazy now, but that's where the industry is heading. The reason it's heading that way is that most of the problems associated with software development are not caused by the limitations of the code or the tools; it's the human factor. The only solution is to merge tasks traditionally associated with a team of 10 - 20 people into one "do-able" task that can be performed by an intelligent and capable professional.

    Basil
  93. > Why aren't you more reluctant to trust your code base to a tool? Doesn't that mean marrying into a vendor? Using a tool to eliminate repetitive tasks ("monkey work") makes a lot of sense to me (like automatically creating interfaces), but I don't want to drive my business logic with a CASE type tool. It just makes more sense to leave that in the language domain, and go code it. JMO.


    Yes of course! Nobody claims that you should generate 100% of your implementation. Use the representation that is most appropriate for the type of information - e.g. a UML model to describe package and class relationships, and code for your implementation. I believe that most, if not all, tools that claim to be MDA compliant do not just generate new code from an existing model, but also merge existing business code into the new implementation. In general, you generate code from two input sources, one being the model, the other being the existing implementation.
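
    One common way such a merge works (a sketch of the "generation gap" idea, not necessarily how any particular tool implements it) is to regenerate an abstract base class from the model on every run and keep the hand-written business code in a subclass that is never overwritten:

    // Regenerated from the model on every run - never edited by hand.
    public abstract class CustomerBase {
        private String name;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public abstract boolean isCreditWorthy();   // business rule left to the developer
    }

    // Written by hand once, preserved across regenerations.
    public class Customer extends CustomerBase {
        public boolean isCreditWorthy() {
            return getName() != null;   // the "real" business logic goes here
        }
    }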

    Stefan Tilkov
  94. Stefan,

    I just read your following post again:

    "Yes of course! Nobody claims that you should generate 100% of your implementation. Use the representation that is most appropriate for the type of information - e.g. a UML model to describe package and class relationships, and code for your implementation. I believe that most, if not all, tools that claim to be MDA compliant do not just generate new code from an existing model, but also merge existing business code into the new implementation. In general, you generate code from two input sources, one being the model, the other being the existing implementation."

    We may not be that far off after all. I think I had missed your point. If an MDA tool served as a framework/code-shell type of generator, and not a business logic generator, I could be convinced. I would still be watching out for tool dependency. With J2EE and this economy, it would seem unwise to become too tool dependent (i.e. if I had to on my project, I had better be able to go back to the old manual approach if the vendor disappears ... WebGain, HP, etc).

    Mike


  95. I agree with Basil that it's time for the IT management to take the technology on board.

    The problem that I see is that the managers look to their senior developers and architects for the advice on strategy and get a confused and inconsistent response.

    From a developer's perspective the problem for me has always been that whilst you CAN use XML model technologies for generating and defining systems, the tools available (such as they are) are so painful to use that it's very tempting to slip back into tried and tested grooves once deadlines start to threaten.

    Until an established and slick development paradigm is developed for this kind of model-based development, management is never going to take it seriously.

    A colleague of mine and I have been looking at the problem of what kind of environment would provide the kind of ease of use in this scenario that we are already seeing with standard development environments.

    Check it out at: http://www.metacoder.com

    We'd greatly value any comments and opinions you guys have on the ideas we have so far.

    Cheers - Steve
  96. I see UML diagrams as a map of the project. They give you an overview of the project and help you communicate with other people. They also help you locate the details.
    I don't see the necessity for modeling diagrams to contain all the details, since those details are already available in the code; keeping them in the diagrams as well would require an extensive synchronization effort.
    Generating production code from a small specification is very unlikely. Just think about your last two projects. How much do they have in common? How much do they have in common with other projects in other industries? Maybe less than 10%. A lot of applications contain business logic that is specific to the business domain, and sometimes to the specific company you are programming for. This business logic cannot be generated right now and has to be specified in a programming language (visual or not). The only things you can generate are EJB code, class skeletons, method signatures, etc. Those things are common across projects and well specified, but they will not represent the entire project most of the time.
  97. Let me present a challenge for the purpose of debate. I don't necessarily believe my challenge. But then again, maybe I do. :)

    Disclaimer: I haven't been on a J2EE project yet, but have participated in n-tier development with limited use of UML (sequence diagrams).

    Challenge: A KISS approach to an architectural J2EE framework, bare-minimum UML/requirements/design documentation, and some solid manual procedures are the best approach to J2EE software development. You do not need expensive UML or MDA-type tools.

    Let me complement my challenge with a hypothetical. Imagine I have just been hired by a company to manage J2EE project staffing and act as a lead architect on the project (like I said, hypothetical :). This is the first J2EE project for the company, so they are starting from scratch. The following are concepts/tasks that seem like logical starting points.

    1) Use UML on the project. You have to use something to document requirements, and UML seems very suited to the task; plus it is a standard skill (i.e. you should be able to hire UML expertise, as opposed to someone having to come in and learn a "home grown" approach). My point about minimum documentation above is this: I do expect the documentation to provide a contract for the life of the software. If an application modification is required, you should be able to start with the UML to determine the impact and domain of the problem. The documentation should be "kept up to date" along with the code base. Obviously, this will never happen if you create a million diagrams per use case. Actually, I view the code base as part of the documentation. I just need a logical map from the UML (and maybe supporting home-grown documentation like Word docs with further programming specs) to get me to the code. In my opinion, with this type of approach, a free UML tool like Poseidon (or ....) and MS Word would suffice. I have found that a developer can perform a lot of manual code and documentation changes fairly quickly when procedures and coding techniques are standardized.

    2) I would think a typical "first J2EE effort" from a company would involve agreeing on a J2EE framework, and then hopefully applying that framework in a "prototype" first, before attacking the full production system (yeah, I know in the real world that doesn't always happen). By framework I mean something like STRUTS -> EJB (no entity EJB and a finite number of design patterns) -> JDO. Like I have said before on TSS, I think J2EE "is too much". The first thing a company needs to do with its first J2EE development effort is trim the huge J2EE world down to a subset of "what is required for this project". Another way to say that is "as KISS as humanly possible". If the company doesn't need to deal with web services, SOAP, or "name your own buzzword here", don't add that complexity to your framework. Obviously many design decisions need to be made with an eye to the future, but you get my point. So again, the point of all of this is: if one takes the time to keep things as KISS as possible, why would one need a code generator?

    3) Along with requirements documentation living with the software, I also believe "testing" should be kept current (e.g. JUnit). Let's imagine I have an application in production supported by a large "test code" base. One of the comments above mentioned the ability, with an MDA-type approach, to re-architect your application. Will that procedure also re-architect my "test code"?

    So again, I think the most important aspect of all J2EE development is KISS for everything (documentation, framework, companion tests). Given that, wouldn't a manual approach to slinging code and maintenance work just fine?

    As Bill O'Reilly says... "Where am I going wrong?"

    Cheers,

    Mike

    note: I am playing devil's advocate here. Although I approach tools with some skepticism, I become a huge fan when they actually help me do my job.

  98. > As Bill O'Reilly says... "Where am I going wrong?"

    Listening to Bill O'Reilly, for starters...
  99. "Listening to Bill O'Rielly, for starters..."

    :-)!!!! Good one.
  100. Since I was the one who introduced the re-architecting notion into this discussion, I think it's worth elaborating a bit. Let's say that - following your KISS approach - you start with an architecture that uses JDO for persistence, perhaps because you think that EJB 2.0 CMP is not yet implemented well enough or not supported by your application server. You develop your application, implement lots of code in your entities (not beans, just classes), and get a satisfactory result. You and your users are happy, and you start using the system for a year or two.

    Now you develop another application in the same business domain, but with different non-functional requirements. EJB 2.1 might be there, app server support for CMP might have grown to be excellent ... so you decide on a different architecture, using entity beans.

    Given that the basic requirements and the business domain logic are the same, you now have a problem: you need to rewrite your code at least once (and possibly upgrade the first system to the new architecture as well). Why should you have to do that? Why should a change in a purely technical aspect of your system prevent you from using your existing, already implemented, already tested application?

    This is where the MDA approach pays off: You re-use the models, you re-use the business code, and just define a new mapping from your platform independent model (PIM) to a platform specific model (PSM).
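
    To make the PIM/PSM distinction concrete (illustrative names only): the platform-independent model stays a plain description of the business, and only the mapping decides what it turns into.

    // Platform-independent: a business entity described without any J2EE types.
    public class Order {
        private java.util.List items = new java.util.ArrayList();
        public void addItem(Object item) { items.add(item); }
        public int itemCount() { return items.size(); }
    }
    // A JDO mapping could persist this class as-is via metadata; a CMP 2.0 mapping
    // would instead generate an abstract OrderBean with container-managed fields.
    // Either way, the platform-independent Order above is untouched.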

    OK, now it might be the case that you don't ever implement two systems from the same business domain. But even in this situation, there'll be a huge benefit: You'll not be able to re-use your business domain model, but you can apply your defined mapping to a different domain. You'll have a working architecture (including the supporting frameworks) that has proved its value, so why shouldn't you be able to apply it easily a second or third time?

    In my experience, model-driven code generation will save you time even in a single, simple project; starting from that, the benefits will grow exponentially with application size and functionality.

    Stefan Tilkov
  101. Hey Stefan,

    I actually think that, given the complexity of J2EE applications, it would be unlikely that you would chase the "flavor of the month". If it ain't broke, don't fix it. Maybe in your example, you build the second project with JDO also? Now, I would give you this... in J2EE land it IS likely to be broke down the road. :)

    Maybe I didn't understand you, but are you saying using MDA one would be able to take an application that used Castor JDO, and launch a wizard or something that would take the same code base and be "ready to go so to speak" under WebSphere 4, or maybe WebSphere 5? If so, wouldn't that imply that the tool is having to keep up with application server and persistence manager proprietary features? I'm sure I have misunderstood.

    Mike
  102. I'm going to do something a bit strange. Respond to my own post. :(

    I said, responding to Stefan,

    "Maybe I didn't understand you, but are you saying using MDA one would be able to take an application that used Castor JDO, and launch a wizard or something that would take the same code base and be "ready to go so to speak" under WebSphere 4, or maybe WebSphere 5? If so, wouldnt' that imply that the tool is having to keep up with application server and persistence manager proprietary features. I'm sure I have misunderstood."

    Actually, the application proprietary features mainly have to do with "deployment", doesn't it? I'm guessing these types of tools do not deal with deployment. Is that correct?

    Mike
  103. Mike,

    you wrote:

    >Maybe I didn't understand you, but are you saying using
    >MDA one would be able to take an application that used
    >Castor JDO, and launch a wizard or something that
    >would take the same code base and be "ready to go so
    >to speak" under WebSphere 4, or maybe
    >WebSphere 5?

    Essentially, yes. In fact, this very example is one that we have just demonstrated in an internal prototype. Of course, the JDO implementation has to be the result of a forward-generation step for this to work, or it would have to be adapted so that it fits the generation engine's conventions.

    BTW, I don't care too much for wizards, since the term seems to imply that something magical is happening, or at least something that is very well hidden from me. Given the understandable skepticism about code generation that I know many developers have, the way things work needs to be very clear and understandable.

    >If so, wouldn't that imply that the tool is having to keep
    >up with application server and persistence manager
    >proprietary features? I'm sure I have misunderstood.
     
    I think you've hit *the* most important point: If the tool did that, I would very much consider it to be worthless. After all, you might have reasons for a specific approach to your technical architecture - if you relied on a vendor to provide you with exactly what you need, you'd probably be quite unhappy. The key is that the tool should be an engine that allows you to start with an existing mapping from business model to implementation, and change that according to your needs. The mapping and the engine need to be separate parts of your solution. And the mapping needs to be under your complete control.

    A final thing about vendor independence: I think the code that is generated should be written at least as well as, and probably even better than, what you would have written manually.

    In another message you said:
    >Actually, the application proprietary features mainly
    >have to do with "deployment", doesn't it. I'm guessing
    >these types of tools do not deal with deployment.
    >Is that correct?

    I'm not too sure that I understand what you're saying here. Could you elaborate?

    Stefan Tilkov
  104. What is this - a junior jumble?

    Unjumble this:
    Modeling Usage Low; Developers Confused About UML 2.0, MDA

    Answer:
    UML 2.0 Modeling Usage Low; Developers Confused About MDA Somehow systems still developed

    Would you use a design tool to make automated enhancements to a deployed system? No. Remember when CASE tool sales reps said you could do this? It failed. Would you use a design tool to conceptualize and prototype new development? Yes.

    People do model, although the respondents of this statistically challenged study must have been Visual Basic developers. Visio?

    Somewhere between what Ambler's Agile Modeling book espouses (1 brain, 1 dry-erase marker, 1 whiteboard, 1 camera) and the Rational "Ironmaiden" suite of tools, people seem to be converging on common design practices, albeit incrementally and maybe not fast enough for the tool reps. You can even design verbally in pattern-speak and not write anything down.

    Next question: UML on Windows XP or OS X. Agile or extreme?
  105. I fall into the camp that is firmly against complete modelling in UML and as such against using MDA for everything. I tend to regard UML as an essential tool with utility in 3 major areas, only one of which has been talked about here.

    1/ As a whiteboard language. Along with patterns, it gives an effective visual and linguistic shorthand for communicating design ideas in a meeting, and it also facilitates recording the design decisions reached during the meeting.

    2/ Documentation by example. A 'generic' UML diagram is a fantastic tool for establishing the standard way of approaching common design areas. Everyone uses it extensively in recording patterns; I also use it to communicate to other designers/developers how a set of similar components is designed before, during and after development.

    3/ Documentation by exception. This picks up where documentation by example leaves off. There will always be components that do not behave like any other or that embody non-obvious design decisions, and I use UML to record these.

    In general this approach to UML documentation does what I need and scales well to large projects where the UML documentation set will be smaller and easier to keep up to date.

    I have used tools from Rational and Together, and of the two I have been most impressed by how Together manages the round trip, tightly integrating the code and the UML. However, I find that using an MDA tool exclusively can limit your options when it comes to non-conventional solutions and (for now) can make it very difficult to migrate to brand-new standards and component types that the MDA tool has yet to embrace.
  106. Robert,

    I'm not sure whether we fall into different camps although I am a strong proponent of an MDA approach. I'd be very interested to see whether we can agree on something. You mention three uses of UML:

    1) white board language
    2) documentation by example and
    3) documentation by exception

    On the first point we can probably agree very easily, since nearly everybody I know draws class and other diagrams in UML notation - it's the one graphical language that all of us understand.

    I'd like to discuss what you wrote under 2):
    <quote>
    A 'generic' UML diagram is a fantastic tool for establishing the standard way of approaching common design areas. Everyone uses it extensively in recording patterns, I also use it to communicate with other designers/developers how a set of similar components are designed both before, during and after development.
    </quote>
    (BTW, why can't TSS turn this '<quote>' pseudo-markup into meaningful HTML?)
    Intentionally oversimplifying, I view what you describe here as "Architecture". If you take a look at the architecture definitions collected by the SEI, you'll find one from a Sun architect that reads:

    "A software architecture is the assignment of specific transformations that are applied to convert a purely logical model of a system that satisfies all functional requirements into a model of a system that satisfies both functional and non-functional requirements."

    To me, this is the most useful definition I have seen so far, because architecture is about solving similar problems in a similar way. That's why you define how to persist your objects, which patterns to use for GUI interaction, how to implement control flow, and thus make decisions about the structural patterns of your system.

    Would you agree that what you call "documentation by example" is done in exactly this context? If so, here's the way my company goes about developing systems when we are contracted to do so. You need to define three things:

    1 - The logical, architectural building blocks of your domain architecture. In UML 1.x, you do this by agreeing on stereotypes and tagged values to describe what *your* logical concepts are. The result is a UML profile. With future UML and/or MOF versions, this might evolve into the ability to define your own meta-model. You might also decide to use a proprietary way to describe this.

    2 - A set of technical decisions and, possibly, development efforts: e.g. which application server to use, which persistence layer to use (CMP or JDO or ...), whether to use a web app framework like Struts, WebWorks or Tapestry, and so on.

    3 - A technical architecture that describes how to translate a model that has been developed based on the profile to an implementation that uses/attaches to the tools and frameworks you selected.

    Step 3 is the one that you either do manually, i.e. following established guidelines, or automatically, by using a tool that does (large parts of) this transformation for you. My experience with the manual approach convinces me that it is time-consuming, tedious, and not much fun. Who likes a simple change to a method signature to require updating different files in different formats in dozens of places? I don't. Whether you do this manually or automatically, there is still a lot of code to be written, because specifying algorithms is best done at the code level, not in the model (otherwise the model becomes only a different syntactical representation of your code).
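
    As a purely hypothetical illustration of how steps 1 and 3 fit together (the stereotype and tagged-value names below are invented, not taken from any published profile), an element marked <<BusinessEntity>> with a tagged value for its table name could be expanded roughly like this:

    // Sketch only: expanding one stereotyped model element into a class skeleton.
    public class ProfileMappingSketch {

        /** What step 1 gives us: a model element plus its stereotype and tagged values. */
        static String applyBusinessEntityMapping(String elementName, String tableName) {
            StringBuffer out = new StringBuffer();
            out.append("/** Generated from <<BusinessEntity>> ").append(elementName).append(" */\n");
            out.append("public abstract class ").append(elementName).append("Bean {\n");
            out.append("    // mapped to table ").append(tableName).append("\n");
            out.append("    // abstract CMP-style accessors would be generated here\n");
            out.append("}\n");
            return out.toString();
        }

        public static void main(String[] args) {
            // The tagged value "table" would normally be read from the XMI export.
            System.out.println(applyBusinessEntityMapping("Customer", "CUSTOMER"));
        }
    }

    The decisions from step 2 (CMP or JDO, Struts or Tapestry) then determine which concrete mapping of this kind is applied.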

    We can probably easily agree on your third point: there'll always be exceptions, places where the 'standard' architecture doesn't fit, and those should be documented rigorously.

    I believe that the major thing about MDA - although you'll find nothing like that anywhere on the OMG site - is not the use of UML. It's the idea to decouple different parts of system design and development. If you developed a tool with a fancy GUI that saves information about your system's logical aspects in XML, and generated (and merged) code based on this, you would be following the same approach.

    I think MDA (which, BTW, is not something the OMG guys invented, but only try to standardize) is a great idea that should not be rejected because of bad experiences with UML (or rather, with the way popular tools support it).

    Stefan Tilkov
  107. Stefan,

    I apologise for not making myself entirely clear. I haven't had a bad experience with UML, more with the limitations of the MDA-like tools that use UML as their graphical notation.

    I do understand that MDA as a concept could greatly reduce the effort in engineering and re-engineering software. But I was trying to express that the current tools in this area are too constrictive and overcomplicated, and that, in my opinion, the current efforts in MDA are falling into the same trap: they do not seem to be good at automating the move from the general to the specific (I would be very interested to see how things such as generics will affect this), and they try to do too much (generating production-ready code).

    Part of my problem with MDA is that I am reacting to the extreme enthusiasts' (of which I don't think you are one, just an enthusiast) claim that it can do everything; in my experience of the current state of the art, a 60-40 or at very best an 80-20 rule applies.

    On your definition (your step 3), we are already most of the way there with things like EJB CMP (imagine writing and rewriting all that persistence code by hand), XDoclet, etc.; all we need are more pretty GUIs.
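
    For anyone who hasn't seen that attribute-oriented style, a CMP bean marked up for XDoclet looks roughly like the sketch below. I'm writing the tags from memory, so treat the exact tag names and attributes as approximate rather than as something you could paste straight into a build (it also assumes the J2EE API on the classpath).

    /**
     * Approximate XDoclet-style markup; tag names and attributes may differ by version.
     *
     * @ejb.bean name="Customer"
     *           type="CMP"
     *           cmp-version="2.x"
     *           view-type="local"
     * @ejb.persistence table-name="CUSTOMER"
     */
    public abstract class CustomerBean implements javax.ejb.EntityBean {

        /**
         * @ejb.pk-field
         * @ejb.persistent-field
         * @ejb.interface-method view-type="local"
         */
        public abstract String getCustomerId();

        public abstract void setCustomerId(String customerId);

        // EntityBean lifecycle callbacks omitted for brevity; the point is that
        // the local interface, the abstract schema and the deployment descriptors
        // are all generated from these javadoc tags.
    }

    The point is the same one the MDA people make: the repetitive artifacts are derived from a small amount of annotated source - only here the "model" lives in javadoc tags instead of in UML.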

    Anything to reduce my time spent on the boring stuff is good, but I remain unconvinced by the claims and demonstrations so far.
  108. Robert,

    you wrote:

    <quote>
    Part of my problem with MDA is that I am reacting to the extreme enthusiasts' (of which I don't think you are one, just an enthusiast) claim that it can do everything; in my experience of the current state of the art, a 60-40 or at very best an 80-20 rule applies.
    </quote>

    I agree with all of that - I definitely feel enthusiastic about this approach, but I do hope I'm not too extreme. And I also have the same opinion regarding the 60/40 or 80/20 split. There's still a lot to do, but I believe things are going to improve a lot in the near future.

    Any tool that claims to be able to generate 100% of the business logic is either incomplete or forces you to trade one programming language for another. I remember a software engineering professor of mine who - about 10 years ago - made us "design" software with detailed Nassi-Shneiderman charts, which was probably the worst CASE tool experience I ever had. You might call it "visual" to draw charts on the same semantic level as code, but it was also annoyingly stupid.

    <quote>
    Anything to reduce my time spent on the boring stuff is good, but I remain unconvinced by the claims and demonstrations so far.
    </quote>

    I can perfectly understand this view. To shamelessly insert a plug for our own product again, iQgen aims to provide a solution that lives up to its (pragmatic) promises. If you can find the time to check it out (and have access to an XMI-exporting CASE tool), we'd be happy for any comments you might have.

    <quote>
    On your definition (your step 3), we are already most of the way there with things like EJB CMP (imagine writing and rewriting all that persistence code by hand), XDoclet, etc.; all we need are more pretty GUIs.
    </quote>

    I think it would be very interesting to compare different approaches to code generation, e.g. comparing XDoclet to an MDA approach. Perhaps somebody with a good grasp of XDoclet would like to team up to write an article for TSS?

    Stefan Tilkov
  109. I would just like to add a few points to the point of the "model _is_ the code" sub thread.

    First, I believe that at some point this _will_ occur. I believe that it is at the point where C++ was in the early to mid 1980s. The engineers who believed in it early took the knocks, but we all know the rest of the story.

    Almost exactly one year ago I heard Greenfield, Jacobson, Rumbaugh, Booch, Wojtek Kozaczynski, and others promoting this approach at the Rational User's Conference (RUC). And to a _minimal degree_ it has been delivered in Rational XDE using the Reusable Asset Specification. It is a start, but it will grow.

    For example, regarding the 60-40 or 80-20 problem: (this thread is long, so I may have missed it if someone else brought up the point, but...) have any of you heard of the Action Semantics language for UML? This is the way being promoted by MDA/MDD vendors to solve the 80-20 problem that Stefan brings up. Check out the Action Semantics site. One MDA company doing this stuff is Kabira, and Rational is working on an approach with Action Semantics as well. This is early, but I believe the leading-edge developers will be using this in one more year, and the rest within 5-10 years.

    Here are the layers at which I perceive MDA working:

     1 - Static Structure model code gen
     2 - Interaction model code gen
     3 - Action Semantics code gen

    Both 1 & 2 should be possible today, to help reach the 80-20 stage. Layer 3 is here today from some vendors, and should become more widespread within 12 to 18 months.
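
    To show what I mean by those layers, here is roughly which part of a class each one could plausibly generate. The class and method names are made up purely for the example - this is a sketch, not output from any actual tool.

    // Illustration only: which parts each generation layer could plausibly produce.
    public class OrderService {

        // Layer 1 - static structure: fields and signatures from the class diagram.
        private final InventoryGateway inventory;
        private final InvoiceFactory invoices;

        public OrderService(InventoryGateway inventory, InvoiceFactory invoices) {
            this.inventory = inventory;
            this.invoices = invoices;
        }

        // Layer 2 - interaction model: the call sequence from a sequence diagram.
        public Invoice placeOrder(Order order) {
            inventory.reserve(order);                     // message 1 in the diagram
            Invoice invoice = invoices.createFor(order);  // message 2 in the diagram
            // Layer 3 - Action Semantics (or hand-written code today): the actual
            // business rules, e.g. discount calculation, would go here.
            return invoice;
        }
    }

    // Supporting types so the sketch is self-contained; in a real tool these would
    // come out of the same static structure model.
    class Order {}
    class Invoice {}
    interface InventoryGateway { void reserve(Order order); }
    interface InvoiceFactory { Invoice createFor(Order order); }

    Today layer 3 is where you drop down to hand-written Java; with Action Semantics that last comment block becomes part of the model as well.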

    I took a look at the iQgen stuff. It looks reasonable enough, but it seems to only work with static structure models. Am I missing something, Stefan? Can it generate code based on interaction models? What do you think of the upcoming Action Semantics spec?

    The model will be the code. The debugger will be integrated with the "code" editor. The question is, how soon will we as individual developers embrace it? We can get our feet wet today using iQgen, and if you have big bucks, with Kabira.

    Thoughts?

    Vaughn
  110. "What do you think of the upcoming Action Semantics spec?" -- Vaughn

    Thanks for the tip! I just now skimmed umlactionsemantics.org's final submission, which is an excellent regurgitation of the process modeling notation used in the Shlaer-Mellor and ROOM methodologies for at least a decade. Apparently it takes OMG to get the software industry's attention. At least now a standards-based market is being defined for those of us who wish to benefit from it, either as contributors or users. Lack of standards has always been a drag for Shlaer-Mellor.

    "The model will be the code. The debugger will be integrated with the "code" editor. The question is, how soon will we as individual developers embrace it?" -- Vaughn

    Perhaps the biggest problem with the OMG's specification of MDA and action semantics is its inability to reverse engineer legacy Java source. As such, the OMG's stuff can only be applied to greenfield subsystems. It isn't that this would deny MDA its inevitable dominance in mainstream software, but rather merely slow its adoption. My hobby, by contrast, involves working toward unlimited round-tripping between Java source and process diagrams, so that legacy Java projects can be visually manipulated on an ad-hoc basis at any time in a codebase's maturation and without any lasting commitment to diagramming.
  111. "What do you think of the upcoming Action Semantics spec?" -- Vaughn

    I just re-skimmed the final submission, and I'm impressed with their descriptive formalisms, which I consider a solid foundation for standardizing process modeling in an abstract sense, though not necessarily graphically. I had hoped that the final submission would include example action diagrams, but I was disappointed. Appendix A offers only metamodel diagrams, which are useless for intuitively evaluating the readability of a language.

    The final submission has no graphical process notation at all, despite pages of textual description pointing toward one. Clearly these folks have missed a perfect opportunity to offer a visual notation, and the intent seems to be that a visual notation will be specified later, perhaps by another party. They do detail an interactive medium, including navigation and incremental validation. But they offer no tangible artwork. How strange.
  112. MDA definitely needs Action Semantics, after all what does it mean to execute an MDA model without it?

    I see two problems, however:

    (1) The Action Semantics specification does NOT have a notation, visual or textual. So there's no standardized form. It's like having the JVM but not having Java!

    (2) It's pretty new (finalized Nov. 2001), so I'm not too sure about its maturity. Without actual implementations in the public, I can't validate whether it works well or even works at all!

    In short, MDA is a good idea, but it'll take a few more years for it to become viable. It takes time for new ideas to be implemented well. Refactoring, for example, was known for a long time in Smalltalk circles; Fowler's book for Java was published in 1999, yet it was only in late 2001 that Java-based refactoring tools appeared.

    The specs have been written; the ball is now in the vendors' court. It's time to deliver useful tools, that is, of course, if it's possible.
  113. <Carlos-Perez>
    It's pretty new ... Without actual implementations in the public, I can't validate if it works well or even works at all! ... In short, MDA is a good idea, but it'll take a few more years for it to become viable.
    </Carlos-Perez>

    Kabira has Action Semantics working today (actually, it's a year old or more). They have their own syntax. Expect Rational to fight a battle over that, because Kabira, while hosting their MDA tool in Rose, will probably be viewed as a competitor in a few years. However, all of this may be moot if Rational purchases them, as they have other leading tools they sell. If Rational decides to compete, then expect Rational to win the battle on syntax.

    A few years? I say a year to 18 months, because that's when the leading-edge guys (like us?) will start to use it. Then full adoption will take place over 5-10 years. The point is that it is coming.

    Vaughn
  114. Rational, ... UML 2.0, MDA

    "If Rational decides to compete, then expect Rational to win the battle on syntax." -- Vaughn

    Sad, but true. I say sad since all of the specifications Rational has wrought reek of the complexity inherent in C++. The same is true of OMG. It's like a cultural thing. Other organizations have done better at keeping it simple.
  115. Prograph, ... UML 2.0, MDA

    "MDA definitely needs Action Semantics, after all what does it mean to execute an MDA model without it?" -- Carlos

    Spiritually I agree. But notice that OMG's MDA specification doesn't require rigorous action modeling, does it? I think OMG mentions it as an option.

    "It's time to delivery useful tools, that's of course if it's possible." -- Carlos

    Over a decade ago, Prograph proved the commercial viability of procedural programming entirely through process diagrams. So it's most definitely possible.
  116. <brian-miller>
    Action Semantics... But they offer no tangible artwork. How strange.
    </brian-miller>

    I agree that at some point a graphical representation would be nice. But I believe that the initial target is to have developers author code using Action Semantics as a _very_ high-level OO programming language.

    Painful? Probably not too bad, since we are only talking about 10-20% of the entire app being developed in it. And when you consider that you can inject Java, C#, or C++ object code (you may as well select a very fast target language) into any given architectural container (J2EE, .NET, CORBA, whatever), that 20% of Action Semantics code seems a lot easier to write. And the language wars are over, or will take on a new pictorial dimension.

    It will probably be a nice break to write some code after doing all that model building. Sort of nostalgic.

    Vaughn
  117. Robert,

    in my opinion there is no real benefit in round-trip engineering. I don't know the later versions of TogetherJ, but all the round-trip approaches I know couple code and model too tightly, so they don't support my needs that well.

    I think the applications of UML you mentioned are very important in any software development effort. MDA just requires using modelling for one more purpose: specifying your system.

    Well-designed MDA applications must allow you to refactor your software easily. For now there are some limitations to that.
    But MDA provides a concept of model transformations - not only in the sense of forward but also of reverse engineering. With next-generation reverse engineering tools it will hopefully be possible to refactor software on all abstraction layers in parallel: in the PIM, the PSM and the code.
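
    A toy illustration of what I mean by "all abstraction layers in parallel" (all names invented for the example): the same logical property exists at the PIM, PSM and code level, and a refactoring is only safe if the transformations can carry it through all three.

    // Toy illustration only - "a Customer has a name" at three levels.

    // PIM level: platform independent; in a tool this would be a model element,
    // shown here as a plain value object.
    class CustomerPim {
        String name;
    }

    // PSM level: the same property mapped to a CMP-2.x-style abstract accessor pair.
    abstract class CustomerPsm {
        abstract String getName();
        abstract void setName(String name);
    }

    // Code level: what finally runs.
    class CustomerImpl extends CustomerPsm {
        private String name;
        String getName() { return name; }
        void setName(String name) { this.name = name; }
    }

    // Renaming "name" to "fullName" only works if forward and reverse
    // transformations can propagate the change through all three representations.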

    Maybe then round-trip engineering will be what I need.

    cheers,

    Phillip Ghadir
    The real reason modeling usage is low is the incredibly high prices that Rational and some of the other vendors charge for their products.

    My manager was willing to buy our Rose licenses because he believes in design. So do I. I'm not a fanatic about it, but I like to keep the model up to date. It's a good way to look at what you're building in a new way.

    Doug
  119. "I'm not a fanatic about it, but I like to keep the model up to date." -- Doug

    The downfall of modeling without an executable model is the destructive tendency of the model and the code to diverge monotonically. With an executable model, the model always matches the code, since the two are synchronized with the push of a button. I don't even bother with modeling if the model can't be executed.
    What about people who use modeling as a communicative tool and not as the end product? For a lot of developers, modeling is a way to communicate design ideas or to provide a road map of the software that is being produced. If that model becomes outdated as the code develops - scrap the model. It serves no purpose.

    I don't see modeling as an all-or-nothing proposition. It certainly has its transient uses as well. Models CAN be created, implemented, and thrown out after they have served their purpose. Limiting modeling to developers who generate all their code from models really limits the use of modeling.

    Ryan
  121. "For a lot of developers, modeling is a way to communicate design ideas or to provide a road map of the software that is being produced." -- Ryan

    Yes. Throw-away whiteboarding is perhaps my only use of UML, for which it's well suited.
  122. AndroMDA is free

    In this thread there have been some complaints about the cost of UML tools.

    Grab yourself the cheapest edition of MagicDraw or Poseidon, or the free community edition of Poseidon. Go to the http://www.andromda.org website for an open source MDA tool, and give it a try (it's hosted on SourceForge).

    Sometimes it makes sense to use a bike and sometimes it's better just to walk. It all depends on the terrain. That about sums up how I feel about MDA and using AndroMDA. It's not a matter of one being better or worse.