Discussions

News: Design Patterns are Code Smells

  1. Design Patterns are Code Smells (97 messages)

    Stuart Halloway has posted "Design Patterns are Code Smells," pointing out that 'when you are doing design patterns, implementation language matters,' and that design patterns do not work well as code recipes.
    Language advances kill patterns-as-recipes. Back in 1998, Peter Norvig argued that most of the original GoF patterns were invisible or simpler in Dylan or Lisp. Since then, Greg Sullivan has made the same point for Scheme, and Jan Hannemann has demonstrated the same for Java+AspectJ. Design patterns do not perform well as recipes. They are seasonal at best. At the code level, most design patterns are code smells. When programmers see a design pattern in a code review, they slip into somnolent familiarity. Wake up! Is that a design pattern, or a stale recipe from a moldy language?
    So, readers, the obvious questions are: "Where do you get your Java EE recipes from? Are they the right approach?"

    Threaded Messages (97)

  2. I think Peter Norvig's examples explain it best from a dynamic language perspective
  3. There were already some patterns not mentioned in GoF because they were invisible in C++: apparently, the GoF said in their intro that if they had used a procedural programming language, they "might have included design patterns called 'Inheritance', 'Encapsulation' and 'Polymorphism'". But because those could be taken for granted, they didn't bother. With a more advanced language, GoF's patterns can be taken for granted, or at least trivially implemented in a reusable way. Scala, for example, has explicit language support for a Singleton (see the sketch below). Programming in Patterns is like Painting by Numbers: great for learning, but you don't want to look at the results.
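    For contrast, a rough sketch of the classic Java boilerplate that Scala's object keyword makes unnecessary (class name hypothetical):

    // Classic hand-written Java Singleton: a recipe you re-type every time.
    // In Scala, the single declaration "object Config" gives you the same thing.
    public final class Config {
        private static final Config INSTANCE = new Config();

        private Config() {
            // private constructor prevents outside instantiation
        }

        public static Config getInstance() {
            return INSTANCE;
        }
    }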
  4. I think Peter Norvig's examples explain it best from a
    dynamic language perspective
    Wow, wonderful presentation page, must be written in RoR.
  5. I think Peter Norvig's examples explain it best from a
    dynamic language perspective


    Wow, wonderful presentation page, must be written in RoR.
    No, bitter boy. If it was written in RoR it would be all Ajaxified up for your viewing pleasure.
  6. I think Peter Norvig's examples explain it best from a
    dynamic language perspective


    Wow, wonderful presentation page, must be written in RoR.
    Peter Norvig is a LISP guy. His "Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp" should be mandatory reading for all developers. You learn what LISP has had since the early days and you find out what all higher-level languages since have been striving to achieve. Norvig's LISP book will teach you more about Java/Ruby programming than any Design Patterns for Dummies type book you could pick up at Barnes & Noble. He strikes me as a practical guy, so my guess is that the website is what he intended: dense content with little graphical styling to get in the way.
  7. Peter Norvig is a LISP guy. His "Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp" should be mandatory reading for all developers. You learn what LISP has had since the early days and you find out what all higher-level languages since have been striving to achieve. Norvig's LISP book will teach you more about Java/Ruby programming than any Design Patterns for Dummies type book you could pick up at Barnes & Noble.
    +1 The book is a treasure.
  8. I think Peter Norvig's examples explain it best from a
    dynamic language perspective


    Wow, wonderful presentation page, must be written in RoR.
    A quick "view source" yields: Why the Rails slam?
  9. I think Peter Norvig's examples explain it best from a
    dynamic language perspective


    Wow, wonderful presentation page, must be written in RoR.


    A quick "view source" yields:

    Why the Rails slam?
    What scares people makes them lash out. But then again it is Friday fun day so jokes are welcome.
  10. What scares people makes them lash out. But then again it is Friday fun day so jokes are welcome.
    Is anyone surprised? It kind of reminds me of Javalobby and Circle the Wagons Friday, where nervous Nelly Java developers are given a chance by the trolling editors to lash out at anything they don't understand or are scared of.
  11. What scares people makes them lash out. But then again it is Friday fun day so jokes are welcome.


    Is anyone surprised? It kind of reminds me of Javalobby and Circle the Wagons Friday, where nervous Nelly Java developers are given a chance by the trolling editors to lash out at anything they don't understand or are scared of.
    Hi Frank, thanks for the pointer. Peter Norvig is new to me. Rather than call the GoF Design Patterns code smells, I'd rather think of them as antidotes to what I would describe as language smells. In other languages many of the GoF patterns just aren't needed. There is an interesting (if old) discussion on this very subject on the c2 wiki (http://c2.com/cgi/wiki?LanguageSmell). Like you say, for many ignorance is bliss... Paul.
  12. Rather than call the GoF Design Patterns code smells, I'd rather think of them as antidotes to what I would describe as language smells. In other languages many of the GoF patterns just aren't needed. There is an interesting (if old) discussion on this very subject on the c2 wiki:
    I think that may be true of the GoF patterns, and probably many others. However, I think more expressive languages greatly increase the need for developers to be able to recognize and formalize patterns. I personally find that my Python code starts on a downward slope, in terms of my productivity at continued enhancement, at around 1000 lines (which isn't much). I don't think I'd want to write a Python program of more than a few thousand lines. Maybe I'm just dumb, but the lack of static type checking (and other static validations) really starts to kill me. The only way to combat this is to keep the code small. So I look for patterns and try to encapsulate them in metaclasses, descriptors, function objects, plain old inheritance, etc. This can substantially reduce the code size and provide a productivity boost by raising the level of abstraction. IMHO if you don't do this, the code will just collapse under its own weight. Languages like Java and C++ require you to build more structure, and consequently you don't have to hold as much of the code in your head at once (the IDE does it for you...). So I contend that patterns are far more important in expressive dynamic languages; the difference is that instead of formalizing them in words and diagrams, you formalize them in code. Of course, if you want expressiveness and structure, there's always Scala.
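    As a rough Java 5 illustration of "formalizing a pattern in code rather than in words" (all names made up): the map-over-a-collection recipe captured once as a reusable function object, instead of being restated at every call site.

    import java.util.ArrayList;
    import java.util.Collection;
    import java.util.List;

    // A tiny "function object" interface (hypothetical names throughout).
    interface Transformer<I, O> {
        O transform(I input);
    }

    final class Transforms {
        // The recurring map-over-a-collection pattern, written once and reused.
        static <I, O> List<O> map(Collection<I> in, Transformer<I, O> fn) {
            List<O> out = new ArrayList<O>(in.size());
            for (I item : in) {
                out.add(fn.transform(item));
            }
            return out;
        }
    }

    // A call site supplies only the varying part, e.g.:
    // List<Integer> lengths = Transforms.map(names,
    //     new Transformer<String, Integer>() {
    //         public Integer transform(String s) { return s.length(); }
    //     });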
  13. Good points. Scala is nice. Groovy also gives you the choice of going static or dynamic.
  14. Groovy Design Patterns

    Groovy also gives you the choice of going static or dynamic.
    Some Groovy design pattern info: http://groovy.codehaus.org/Design+Patterns+with+Groovy I believe most of these patterns are still relevant, though if you read through the details, many of them change because you are using a modern dynamic language.
  15. Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes. That should get rid of most of those smelly design patterns.
  16. Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.
    I think your sarcasm is misplaced. Over-engineering is still rife in Java land. What percentage of interfaces end up having one and only one implementation? A high percentage in my experience. Why not start with no interfaces, then add them when they are needed? Apart from anything else, it is tedious when you Ctrl-click in an IDE only to be taken to the interface declaration...
  17. Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes. That should get rid of most of those smelly design patterns.
    I think your sarcasm is misplaced. Over-engineering is still rife in Java land. What percentage of interfaces end up having one and only one implementation? A high percentage in my experience. Why not start with no interfaces, then add them when they are needed?
    Because, sometimes, your experience tells you that "You are going to need it". See here for more details on when and why this happens. As for myself, I found Corby's post not just sarcastic but downright funny :-) -- Cedric http://testng.org
  18. "You are going to need it". See here for more details when and why this happens.

    As for myself, I found Corby's post not just sarcastic but downright funny :-)

    --
    Cedric
    http://testng.org
    I see what you're saying about YAGNI, but what they should really say is, "If you do need it, add it." Why are people so afraid of refactoring? I'm sitting on top of a simple portal app that has way too much code, just because they thought they might need it. Three years later, there are a couple of places where they might have needed it, but now it's a maintenance nightmare for two or three cases. Can't people just think on their feet? Code for what's in front of you and roll with the punches.
  19. "You are going to need it". See here for more details when and why this happens.

    As for myself, I found Corby's post not just sarcastic but downright funny :-)

    --
    Cedric
    http://testng.org


    I see what you're saying about YAGNI, but what they should really say is, "If you do need it, add it." Why are people so afraid of refactoring?

    I'm sitting on top of a simple portal app that has way too much code, just because they thought they might need it. Three years later, there are a couple of places where they might have needed it, but now it's a maintenance nightmare for two or three cases.

    Can't people just think on their feet? Code for what's in front of you and roll with the punches.
    Not afraid, but often there is no option. We build a public API; we have to make it future-proof, because once the API is published customers are going to use it and we can't change it. Were we to expose a concrete class we'd be suckered: we'd never be able to put an interface in front of it. So we always expose an interface, even if initially there's only a single implementation class (or none at all; maybe we leave implementation to the customer...). The principle of only introducing interfaces once you get to multiple implementation classes is a nice one, but in the real world it often doesn't hold up, because as soon as the API is complete you no longer have the freedom to make such changes. Same with a lot of refactoring. While you can refactor implementation all you like, a public API cannot be refactored without breaking code you have no control over, code that may make customers paying a lot of money to use your API very unhappy if it breaks.
  20. What percentage of interfaces end up having one and only one implementation?
    Precious few, if any. I guess you don't practice TDD.
  21. What percentage of interfaces end up having one and only one implementation?


    Precious few, if any. I guess you don't practice TDD.
    Whether I practice TDD is irrelevant. Over-engineering existed in Java long before TDD came to the forefront. Every project I come onto (commercial/financial), it is the same story. Java 'architects' over-engineer their creations and that code stays the same for years. 90% of interfaces in these types of scenarios only have one implementation and only ever will. My experience tells me to add interfaces to implementations as and when they are needed, not before.
  22. Interfaces and TDD

    What percentage of interfaces end up having one and only one implementation?


    Precious few, if any. I guess you don't practice TDD.


    Whether I practice TDD is irrelevant. Over-engineering existed in Java long before TDD came to the forefront. Every project I come onto (commercial/financial), it is the same story. Java 'architects' over-engineer their creations and that code stays the same for years. 90% of interfaces in these types of scenarios only have one implementation and only ever will.

    My experience tells me to add interfaces to implementations as and when they are needed, not before.
    I think the point of the TDD comment is that TDD forces you to consider using interfaces to reduce coupling between classes. Yes, near 100% of my data access interfaces have exactly one implementation. But I never create these interfaces because, gosh, I may want to change from Oracle to Some-New-Cool-Database in the future. But suppose I create a data access class without an interface. Then I want to create a class (a service, web component, whatever) that uses this data access class. Well, crap, since there is no abstraction to my data access class, I am tightly coupled to the data layer by referencing the class directly. Now any time I want to test my service/web component/whatever I have to fire up a connection to a database. If only I had an interface, then I could mock my data access layer to write an isolated unit test. Interfaces provide a means for multiple implementations, but they (arguably more importantly) provide a layer of abstraction. And as a side note, they are also quite helpful when using a proxy-based AOP framework (such as Spring's) to provide services not even considered when your class is first created (caching???).
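    A minimal sketch of the point being made, with hypothetical names and a hand-rolled mock rather than any particular mocking library:

    // Hypothetical domain class used by the example.
    class Order {
        private final boolean shipped;
        Order(boolean shipped) { this.shipped = shipped; }
        boolean isShipped() { return shipped; }
    }

    // The abstraction over the data layer.
    interface OrderDao {
        Order findOrder(long id);
    }

    // The service depends only on the interface, never on a JDBC class.
    class OrderService {
        private final OrderDao dao;
        OrderService(OrderDao dao) { this.dao = dao; }
        boolean isShipped(long id) { return dao.findOrder(id).isShipped(); }
    }

    // In a unit test, no database connection is needed:
    class OrderServiceTest {
        void testIsShipped() {
            OrderDao mockDao = new OrderDao() {
                public Order findOrder(long id) {
                    return new Order(true); // canned test data
                }
            };
            assert new OrderService(mockDao).isShipped(42L);
        }
    }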
  23. Re: Interfaces and TDD

    Yes, near 100% of my data access interfaces have exactly one implementation.
    It's stupid to have one DAO interface for each DAO implementation. There should only be one DAO interface for all of your DAO implementations.

    select(long dong)
    select(int i)
    select(Object object)
    insert(Object object)
    update(Object object)
    delete(Object object)

    DAO is simple stuff. Keep it that way with simple methods. If you feel the need for more elaborate methods, put them in the service/business layer that uses DAO. KISS.
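    Read as a sketch, the suggestion amounts to a single shared interface along these lines (one possible formulation, not taken from the post):

    // One DAO interface for all DAO implementations, kept deliberately simple.
    public interface Dao {
        Object select(long id);
        Object select(int i);
        Object select(Object example);
        void insert(Object object);
        void update(Object object);
        void delete(Object object);
    }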
  24. Re: Interfaces and TDD

    Yes, near 100% of my data access interfaces have exactly one implementation.


    It's stupid to have one DAO interface for each DAO implementation. There should only be one DAO interface for all of your DAO implementations.

    select(long dong)
    select(int i)
    select(Object object)
    insert(Object object)
    update(Object object)
    delete(Object object)

    DAO is simple stuff. Keep it that way with simple methods. If you feel the need for more elaborate methods, put them in the service/business layer that uses DAO. KISS.
    Who you callin' stupid? I'd say whether or not your design is better depends on what you call a DAO and what you call the service API. It also depends on language, framework, style, other compromises, data storage technology, etc. We've been using Spring's JDBC helpers a lot lately so we end up with a service API that hooks up to a bunch of fine-grained DAOs. Each DAO ends up being a one method class that implements a Spring interface like SqlUpdate, MappingSqlQuery, etc. I guess the DAOlets are similar to commands in the command pattern (Oops I said the p-word). So is my DAO the consolidating class or are they the individual implementations of the JDBCTemplate helpers?
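    For readers who have not seen the approach, a rough sketch of one such fine-grained "DAOlet" on top of Spring's MappingSqlQuery (the Customer class, table and SQL are made up; this assumes the pre-generics Spring API, where mapRow returns Object):

    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Types;
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.SqlParameter;
    import org.springframework.jdbc.object.MappingSqlQuery;

    // One query, one small command-like object.
    public class CustomerByIdQuery extends MappingSqlQuery {

        public CustomerByIdQuery(DataSource ds) {
            super(ds, "SELECT id, name FROM customer WHERE id = ?");
            declareParameter(new SqlParameter(Types.BIGINT));
            compile();
        }

        protected Object mapRow(ResultSet rs, int rowNum) throws SQLException {
            return new Customer(rs.getLong("id"), rs.getString("name"));
        }
    }

    // usage:
    // Customer c = (Customer) customerByIdQuery.findObject(
    //         new Object[] { new Long(42) });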
  25. Re: Interfaces and TDD

    Yes, near 100% of my data access interfaces have exactly one implementation.


    It's stupid to have one DAO interface for each DAO implementation. There should only be one DAO interface for all of your DAO implementations.

    select(long dong)
    select(int i)
    select(Object object)
    insert(Object object)
    update(Object object)
    delete(Object object)

    DAO is simple stuff. Keep it that way with simple methods. If you feel the need for more elaborate methods, put them in the service/business layer that uses DAO. KISS.
    In 2007, with JDO and Hibernate, are we still talking about DAO?
    Or do you really think you'll be able to use the DAO pattern to switch between different ORM tools?
    Or, better, to switch from an RDBMS-based implementation to an XML one?
    Anyway, the decision for an interface or a fully abstract class should be driven by analysis of the role (essence) of the entity in the problem domain.
    Certain entities are well represented by interfaces, no matter how many implementations will be produced at the end.
    I don't think that this criterion is always the right one.
    It looks like deciding architectures, technologies and tools before the analysis of the problem.
    Guido.
  26. Re: Interfaces and TDD

    Yes, near 100% of my data access interfaces have exactly one implementation.


    It's stupid to have one DAO interface for each DAO implementation. There should only be one DAO interface for all of your DAO implementations.

    select(long dong)
    select(int i)
    select(Object object)
    insert(Object object)
    update(Object object)
    delete(Object object)

    DAO is simple stuff. Keep it that way with simple methods. If you feel the need for more elaborate methods, put them in the service/business layer that uses DAO. KISS.

    In 2007, with JDO and Hibernate, are we still talking about DAO?
    Or do you really think you'll be able to use the DAO pattern to switch between different ORM tools?
    Or, better, to switch from an RDBMS-based implementation to an XML one?
    Anyway, the decision for an interface or a fully abstract class should be driven by analysis of the role (essence) of the entity in the problem domain.
    Certain entities are well represented by interfaces, no matter how many implementations will be produced at the end.
    I don't think that this criterion is always the right one.
    It looks like deciding architectures, technologies and tools before the analysis of the problem.

    Guido.
    I haven't used DAO for quite a while; rather, I prefer the Repository pattern outlined in DDD by Evans. I think the patterns overlap in many instances, with Repository more centered on abstracting domain aggregates, their construction, and managing creation-related invariants. I think you're misled to believe that DAO was only there to provide abstraction from the persistence API and that with JPA that is no longer relevant. The truth is that it abstracts the creation and persistence of domain objects (in the case of Repository, aggregates), which can encapsulate various caching strategies and various other concerns. Ilya
  27. Re: Interfaces and TDD

    Yes, near 100% of my data access interfaces have exactly one implementation.


    It's stupid to have one DAO interface for each DAO implementation. There should only be one DAO interface for all of your DAO implementations.

    select(long dong)
    select(int i)
    select(Object object)
    insert(Object object)
    update(Object object)
    delete(Object object)

    DAO is simple stuff. Keep it that way with simple methods. If you feel the need for more elaborate methods, put them in the service/business layer that uses DAO. KISS.

    In 2007, with JDO and Hibernate, are we still talking about DAO?
    Or do you really think you'll be able to use the DAO pattern to switch between different ORM tools?
    Or, better, to switch from an RDBMS-based implementation to an XML one?
    Anyway, the decision for an interface or a fully abstract class should be driven by analysis of the role (essence) of the entity in the problem domain.
    Certain entities are well represented by interfaces, no matter how many implementations will be produced at the end.
    I don't think that this criterion is always the right one.
    It looks like deciding architectures, technologies and tools before the analysis of the problem.

    Guido.


    I haven't used DAO for quite a while; rather, I prefer the Repository pattern outlined in DDD by Evans. I think the patterns overlap in many instances, with Repository more centered on abstracting domain aggregates, their construction, and managing creation-related invariants.

    I think you're misled to believe that DAO was only there to provide abstraction from the persistence API and that with JPA that is no longer relevant. The truth is that it abstracts the creation and persistence of domain objects (in the case of Repository, aggregates), which can encapsulate various caching strategies and various other concerns.

    Ilya
    In fact! 99% of the people you ask for an example of the DAO pattern will show you the "classical" 1 domain class -> 1 DAO interface approach. I have been working with ORM tools since 1997 and I have always used the repository approach (even if I called it StorageManager). The problem here is that DAO is not strictly a pattern, because it means different solutions to different people. Guido
  28. Re: Interfaces and TDD

    In 2007, with JDO and Hibernate, are we still talking about DAO?
    And what do you do? Do you let the JDO/Hibernate APIs leak into your application code?
  29. Re: Interfaces and TDD

    In 2007, with JDO and Hibernate, are we still talking about DAO?

    And what do you do? Do you let the JDO/Hibernate APIs leak into your application code?
    It depends on the meaning you give to DAO (see other posts too). Guido
  30. I think people make a fundamental mistake when they assume that Java interfaces are needed to mock dependencies. The truth, however, is that the tools commonly used for mocking objects in developer testing are simply too limited. Specifically, such tools require that the dependencies to be mocked either implement an interface or at least that the class is non-final. Some time ago I wanted to write unit tests for presentation layer classes in a business application, but those classes were directly instantiating concrete (and final) service classes in the domain layer. Obviously, I would normally have to change the design of the application. Instead, I created a Java mocking tool that would allow me to write those unit tests without any change at all in the current application design or implementation. So, here is the tool: JMockit. Rogério Liesenfeld
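    To make the constraint concrete, a hypothetical sketch of the kind of code described: a final class created with "new" leaves no seam for interface- or subclass-based mocking tools.

    import java.math.BigDecimal;

    // Domain layer: a concrete, final service class (hypothetical).
    final class InvoiceService {
        BigDecimal totalFor(long customerId) {
            // the real implementation would hit the database
            return BigDecimal.ZERO;
        }
    }

    // Presentation layer: instantiates the service directly.
    class InvoicePage {
        String renderTotal(long customerId) {
            InvoiceService service = new InvoiceService(); // no injection point
            return "Total: " + service.totalFor(customerId);
        }
    }

    // A proxy-based mocking tool cannot substitute InvoiceService here:
    // the class is final (no subclassing) and the reference is created with
    // "new" (nothing to inject). Redefining the class at the bytecode level,
    // as JMockit does, is one way to test InvoicePage without changing it.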
  31. I think people make a fundamental mistake when they assume that Java interfaces are needed to mock dependencies.

    The truth, however, is that the tools commonly used for mocking objects in developer testing are simply too limited. Specifically, such tools require that the dependencies to be mocked either implement an interface or at least that the class is non-final.
    That's not true, you can do it with implementation classes at least using EasyMock.


    Some time ago I wanted to write unit tests for presentation layer classes in a business application, but those classes were directly instantiating concrete (and final) service classes in the domain layer. Obviously, I would normally have to change the design of the application. Instead, I created a Java mocking tool that would allow me to write those unit tests without any change at all in the current application design or implementation.

    So, here is the tool: JMockit


    Rogério Liesenfeld
    So you compensated for the bad design with a framework that allows you to test such a design? Sorry, I'm not trying to be harsh; it might be that I'm misunderstanding you. Change in application design is called refactoring, which you and everyone should practice religiously in any agile environment. Also, object lifecycle containers that are able to manage dependencies and provide wrapper services (AOP), like Spring or Guice, would have resolved your dependency issues where you instantiate a concrete implementation. Ilya
  32. That's not true, you can do it with implementation classes at least using EasyMock.
    Hmm, are you sure? Last time I looked, there was no way I could mock a final class using EasyMock. Would you have any link to EasyMock documentation about this possibility, a sample, something?
    So you compensated for the bad design with a framework that allows you to test such a design? Change in application design is called refactoring, which you and everyone should practice religiously in any agile environment.
    Hehe, I am not surprised someone would come out accusing the supposed "bad design". To say otherwise sure is heresy! Just for the record, I do practice refactoring "religiously". I even bought, out of my own pocket, a personal license for IntelliJ IDEA 6 just so I can use its superior refactoring capabilities. And for that application's design and architecture, I am quite comfortable defending its liberal use of the "new" operator, final classes and methods, static facades, and base classes. There are only a few Java interfaces, for the cases where they were truly justified. I suggest books such as "Patterns of Enterprise Application Architecture" and "Domain-Driven Design" for those interested in similar ideas.
    Also, object lifecycle containers that are able to manage dependencies and provide wrapper services (AOP), like Spring or Guice, would have resolved your dependency issues where you instantiate a concrete implementation.
    I am familiar with Spring, AOP and even Guice, but can't seem to find a good use for them in business applications of the kind I usually develop. Why bother with dependency injection when I can simply use "new", without any practical loss? Why care about declarative transaction demarcation, if I can have a few lines of code in a single base class that solve 99% of my real world needs? Really, Java developers have traditionally been known for over-engineering, at least when it comes to the design and architecture of applications. This is sad, and worries me when I hear managers entertaining the possibility of going to .NET, Flex or some other non-Java alternative, because they see us Java developers as less productive. Rogério
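    As a rough illustration of the "few lines in a single base class" idea (plain JDBC, hypothetical names, error handling trimmed):

    import java.sql.Connection;
    import java.sql.SQLException;
    import javax.sql.DataSource;

    // Template Method: the base class owns commit/rollback once;
    // subclasses supply only the work to run inside the transaction.
    public abstract class TransactionalOperation {

        protected abstract void doInTransaction(Connection con) throws SQLException;

        public final void execute(DataSource ds) throws SQLException {
            Connection con = ds.getConnection();
            try {
                con.setAutoCommit(false);
                doInTransaction(con);
                con.commit();
            } catch (SQLException e) {
                con.rollback();
                throw e;
            } finally {
                con.close();
            }
        }
    }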
  33. That's not true, you can do it with implementation classes at least using EasyMock.


    Hmm, are you sure? Last time I looked, there was no way I could mock a final class using EasyMock. Would you have any link to EasyMock documentation about this possibility, a sample, something?

    Sorry, I didn't take the final classes into consideration. Yeah, it currently doesn't support final classes or private constructors if you're using static factory methods.
    And for that application's design and architecture, I am quite comfortable defending its liberal use of the "new" operator, final classes and methods, static facades, and base classes. There are only a few Java interfaces, for the cases where they were truly justified. I suggest books such as "Patterns of Enterprise Application Architecture" and "Domain-Driven Design" for those interested in similar ideas.
    Nothing is wrong with the new operator in many instances, especially when you don't deal with class inheritance hierarchies and different implementations. But with Spring/Guice, which are effectively object factories, these patterns are not as prevalent. Most of what you described is completely fine IMO and use of interfaces based on justification is perfectly fine as well. But in your case it seems like introducing an interface was completely justified, since for testing/mocking purposes you were effectively trying to create a different interface implementation and inject it at run time.
    I am familiar with Spring, AOP and even Guice, but can't seem to find a good use for them in business applications of the kind I usually develop.
    So you develop dependency injection frameworks yourself? I mean, the claim that Spring is a bloated container framework is nonsensical, since it's distributed in a modular fashion and you can easily enjoy only the lightweight dependency injection functionality with a very small footprint.
    Why bother with dependency injection when I can simply use "new", without any practical loss? Why care about declarative transaction demarcation, if I can have a few lines of code in a single base class that solve 99% of my real world needs?
    Because implementations change and software life cycles are volatile; these container frameworks therefore allow you to enjoy a higher level of abstraction and really adhere to separation-of-concerns principles.
    Really, Java developers have traditionally been known for over-engineering, at least when it comes to the design and architecture of applications.
    I think this statement is true for any developer. Well, other than procedural scripters that traditionally just repeated themselves and didn't care about abstractions on any level higher than function reuse.
    This is sad, and worries me when I hear managers entertaining the possibility of going to .NET, Flex or some other non-Java alternative, because they see us Java developers as less productive.

    Rogério
    Managers entertain things that have nothing to do with any thorough knowledge of technology, so you can't say that their choices are in part due to any practical experience. That's not the case when it comes to some, of course, but I think for the most part there is more strategic/political force involved than there should be. With that said, since you mentioned Flex, I'll take this opportunity to mention that it really is what view-layer development should be all about. It's the most compelling and enjoyable technology I've worked with in a very long time.
  34. And for that application's design and architecture, I am quite comfortable defending its liberal use of the "new" operator, final classes and methods, static facades, and base classes. There are only a few Java interfaces, for the cases where they were truly justified.
    ...use of interfaces based on justification is perfectly fine as well. But in your case it seems like introducing an interface was completely justified, since for testing/mocking purposes you were effectively trying to create a different interface implementation and inject it at run time.
    In the situation I mentioned previously, I had to create a unit test for a class A (in the presentation layer) that called a method on a class B (in the domain layer). Therefore, it was necessary to "mock out" the dependency of A on that specific method of B. I fail to see the need for an entirely different implementation of class B in such a test. All I really need is to replace the method of class B called in A by a mock method defined in the unit test itself. That's exactly what the new tool JMockit allowed me to do, in a very simple and effective way, compared to how complex it would have been if using the "conventional" approach: a separated interface, dependency injection, and the use of EasyMock or jMock.
    I am familiar with Spring, AOP and even Guice, but can't seem to find a good use for them in business applications of the kind I usually develop.
    So you develop dependency injection frameworks yourself? I mean, the claim that Spring is a bloated container framework is nonsensical, since it's distributed in a modular fashion and you can easily enjoy only the lightweight dependency injection functionality with a very small footprint.
    Oh no, of course not! :^) I created JMockit because there was nothing similar available, and I would probably use Guice if it were useful for a given application. Actually, I probably would prefer a ServiceLocator instead of dependency injection; after all, Martin Fowler, in the article where the term DI was coined, concluded that the ServiceLocator was probably a better choice for services used in a single application (funny how developers never seem to consider that; a minimal sketch follows this message).
    Why bother with dependency injection when I can simply use "new", without any practical loss? Why care about declarative transaction demarcation, if I can have a few lines of code in a single base class that solve 99% of my real world needs?
    Because implementations change and software life cycles are volatile; these container frameworks therefore allow you to enjoy a higher level of abstraction and really adhere to separation-of-concerns principles.
    What you are saying is only valid in certain situations, and this is the core question on which we seem to disagree. For me, situations where "implementations change" (in the sense of replacing a class that implements a separated interface with another entire class) are very rarely justified in single-purpose applications. When the application use cases or UI change a lot, what we really need is to facilitate the work of the average maintenance developer, who will typically make more localized changes than complete class replacements (not to mention that the interfaces themselves will change a lot). For me, the "higher level of abstraction" consists of the overall application architecture and the components in the application infrastructure layer, which (hopefully) evolves at a much slower and more gradual pace (I have been doing just that for months). Individual use cases simply follow the architecture, with the domain model expressing business-specific concepts such as "Person", "Protocol", "ProtocolMaintenance", "PersonEditScreen" and so on. Finally, if I assume correctly, the "separation of concerns" you mentioned refers to the concern of obtaining concrete implementations for the abstractions (represented by separated interfaces). Of course I agree that it makes sense to use design patterns (Abstract Factory, Service Locator, etc.) or a dependency injection framework for separating that concern, but only if it really is a concern worth separating. So, today we have developers writing tons of "object wiring configuration" code, be it XML, annotations or Java/Groovy/etc. But how many can truly justify it on technical grounds, if pressed? I suspect not many. From my experience, in most cases one could do away with all that extra baggage and gain a simpler, more elegant and maintainable code base, without sacrificing OO or testability.
    With that said, since you mentioned Flex, I'll take this opportunity to mention that it really is what view-layer development should be all about. It's the most compelling and enjoyable technology I've worked with in a very long time.
    I believe you, but IMO there is an even better alternative, at least for Java-based web apps: Google's GWT. We are implementing all of our new use cases in it, and also rewriting a few of the existing use cases. I (and others on the team) have been impressed so far, for it actually works well in IE and Firefox (I initially had fears) and productivity is high. The only downside is that it takes a little getting used to the many GWT modules we have to create, and the several ways of running them. But apart from that, in my experience the GWT SDK is the most compelling one since the JDK itself! Rogério
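    For reference, the Service Locator alternative mentioned in this message can be as small as the following sketch (names hypothetical):

    import java.util.HashMap;
    import java.util.Map;

    // A minimal Service Locator: services are registered once at startup
    // and looked up by interface wherever they are needed.
    public final class ServiceLocator {

        private static final Map<Class<?>, Object> services =
                new HashMap<Class<?>, Object>();

        private ServiceLocator() {}

        public static <T> void register(Class<T> type, T implementation) {
            services.put(type, implementation);
        }

        public static <T> T get(Class<T> type) {
            return type.cast(services.get(type));
        }
    }

    // startup:   ServiceLocator.register(OrderDao.class, new JdbcOrderDao(ds));
    // call site: OrderDao dao = ServiceLocator.get(OrderDao.class);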
  35. In the situation I mentioned previously, I had to create a unit test for a class A (in the presentation layer) that called a method on a class B (in the domain layer). Therefore, it was necessary to "mock out" the dependency of A on that specific method of B. I fail to see the need for an entirely different implementation of class B in such a test. All I really need is to replace the method of class B called in A by a mock method defined in the unit test itself.
    Rogerio, I don't disagree with the usefulness of your framework and will keep it in mind when and if I run into the scenario of creating mock objects from classes that are final and don't implement an interface... With that said, when you say that you don't see a reason for another implementation and thus no reason for an interface, we could get into a heated debate on what a mock object really is. To me it's another implementation, though weaved at runtime. So in theory you do have a different implementation of that domain object to support your testing infrastructure. But let's not get into that religious debate. I see many scenarios where a final concrete class would need to be mocked. I personally also don't create interfaces for core domain entities, unless I see multiple implementations right away or am refactoring away from conditionals. But I also admit that I don't make them final, though there is the religious debate on the use of the final class modifier and its usefulness in restricting inheritance. I religiously use it for field mutability reasons, though. That is very project/design dependent, so I'm sure what you're doing is right.
    Oh no, of course not! :^) I created JMockit because there was nothing similar available, and I would probably use Guice if it were useful for a given application. Actually, I probably would prefer a ServiceLocator instead of dependency injection; after all, Martin Fowler, in the article where DI was coined, concluded that the ServiceLocator was probably a better choice for services used in a single application (funny how developers never seem to consider that).
    IMO the service locator is fine, but it doesn't scale well without a central configuration (i.e. JNDI) as your services grow. You either roll your own configuration framework, you're stuck with an unmanageable service locator, or you have multiple service locators. The service locator was/is prevalent in the J2EE world due to central JNDI configuration. With JNDI you lose some of the benefits of the Spring-based ApplicationContext. Well, the ApplicationContext is the service locator in the non-EJB application case, so why implement one yourself on top of your own configuration implementation? I admit I stay away from EJB containers lately, and most apps are stateless service-exposing backends with stateful clients (i.e. Flex or Ajax). With Ajax it is harder to keep state on the client than it is in Flex, though the server piece still maintains session-related information.
    So, today we have developers writing tons of "object wiring configuration" code, be it XML, annotations or Java/Groovy/etc. But how many can truly justify it on technical grounds, if pressed? I suspect not many. From my experience, in most cases one could do away with all that extra baggage and gain a simpler, more elegant and maintainable code base, without sacrificing OO or testability.
    On the grounds of injecting AOP-based concerns into the application, I think it's worth it; there are many other benefits as well. I agree that the wiring files eventually become a management nightmare if not correctly structured/componentized. With Spring 2.0 this is less of a nightmare with the mix of annotation/XML configurations. Either way, I'm not claiming Spring is the end of all problems, but I think it makes life a lot easier. How does that sacrifice OO or testing? It's the same service locator/abstract factory that you'd roll on your own in certain situations. I understand that some situations don't dictate an insurmountable need for IoC, but it's there, it's lightweight and relatively easy to master/use. I do admit that I've written frameworks before that dictated rolling my own rudimentary DI implementation.
    I believe you, but IMO there is an even better alternative, at least for Java-based web apps: Google's GWT.
    Yeah, I've heard and read a lot about it, but haven't had a chance to use it yet. I might do so, hopefully sooner rather than later. I'm sort of eager, having explored most other mainstream component frameworks. Ilya
  36. JMockit looks pretty neat, I just tried using it with Groovy and had it up and running in minutes: http://groovy.codehaus.org/Using+JMockit+with+Groovy The example on that link has also been done in plain Groovy, JMock, JMock2, EasyMock, RMock, GSpec, Instinct, JBehave and JDummy. The JMockit solution certainly does look very compact and easy to understand.
  37. I think people make a fundamental mistake when they assume that Java interfaces are needed to mock dependencies.

    The truth, however, is that the tools commonly used for mocking objects in developer testing are simply too limited. Specifically, such tools require that the dependencies to be mocked either implement an interface or at least that the class is non-final.


    That's not true, you can do it with implementation classes at least using EasyMock.



    Some time ago I wanted to write unit tests for presentation layer classes in a business application, but those classes were directly instantiating concrete (and final) service classes in the domain layer. Obviously, I would normally have to change the design of the application. Instead, I created a Java mocking tool that would allow me to write those unit tests without any change at all in the current application design or implementation.

    So, here is the tool: JMockit


    Rogério Liesenfeld


    So you compensated for the bad design with a framework that allows you to test such a design? Sorry, I'm not trying to be harsh; it might be that I'm misunderstanding you. Change in application design is called refactoring, which you and everyone should practice religiously in any agile environment.

    Also, object lifecycle containers that are able to manage dependencies and provide wrapper services (AOP), like Spring or Guice, would have resolved your dependency issues where you instantiate a concrete implementation.

    Ilya
    JMockit seems like exactly what I've been looking for. The notion that a design is automatically better because it meets the needs of limited mocking tools like EasyMock has always struck me as myopic. Refactoring is great, but it should be done as needed -- not a priori due to requirements introduced solely by tooling limitations. Adding complexity up front to work around testing-tool issues seems like an anti-pattern.
  38. Whether I practice TDD is irrelevant. Over-engineering existed in Java long before TDD came to the forefront. Every project I come onto (commercial/financial), it is the same story. Java 'architects' over-engineer their creations and that code stays the same for years. 90% of interfaces in these types of scenarios only have one implementation and only ever will.
    Isn't one of the goals of "engineering code" to make it so it won't need to be changed for years? I'd say that's an accomplishment - engineering code that can adapt to changing requirements without changes.
  39. What percentage of interfaces end up having one and only one implementation?


    Precious few, if any. I guess you don't practice TDD.


    Whether I practice TDD is irrelevant. Over-engineering existed in Java long before TDD came to the forefront. Every project I come onto (commercial/financial), it is the same story. Java 'architects' over-engineer their creations and that code stays the same for years. 90% of interfaces in these types of scenarios only have one implementation and only ever will.

    My experience tells me to add interfaces to implementations as and when they are needed, not before.
    +1
  40. What percentage of interfaces end up having one and only one implementation?


    Precious few, if any. I guess you don't practice TDD.


    Whether I practice TDD is irrelevant. Over-engineering existed in Java long before TDD came to the forefront. Every project I come onto (commercial/financial), it is the same story. Java 'architects' over-engineer their creations and that code stays the same for years. 90% of interfaces in these types of scenarios only have one implementation and only ever will.

    My experience tells me to add interfaces to implementations as and when they are needed, not before.
    Agreed! Moreover, adding interfaces, etc., only for TDD strikes me as a silliness in TDD that will pass with time -- at least in many languages. Why? Well, in Ruby, etc., one can use open classes to override methods as need be on the fly for testing purposes. In Java one can use AspectJ to do similar things. Why folk are in favor of adding complexity to their code for test cases, when aspects, etc., could do so unobtrusively without muddying their production code, is beyond me.
  41. Even if there is only one implementation of an interface, don't you see any benefit in interface-based programming? Isn't it nice to be able to change the implementation without having to impact a whole bunch of code?
  42. What percentage of interfaces end up having one and only one implementation?


    Precious few, if any. I guess you don't practice TDD.
    Snobby tone aside, out of curiosity, how does TDD or the lack of it weigh into how many implementations of an interface are "typical"? (Assuming, which might be a mistake, that implementations strictly for the purpose of testing aren't counted.)
  43. TDD means Test-Driven Development. Of course, with such a methodology of development, tests are an important part of the code, and not counting implementations written strictly for the purpose of testing is a mistake. It's very easy to mock an interface and thus be able to test a class/component depending on an instance of this interface. If your class/component depends on the physical implementation of this interface, testing becomes more difficult. That's why TDD makes use of many interfaces, even if there is only one non-mock implementation of the interface.
  44. Interfaces only when required

    What percentage of interfaces end up having one and only one implementation? A high percentage in my experience. Why not start with no interfaces, then add them when they are needed?
    I thought I was alone in recommending this. I strongly agree, why introduce an interface when it is not required. Lack of interfaces has not prevented me doing TDD.
  45. I am a bit out of date with the Java world, and must admit that I don't really follow how you go about creating a good suite of unit tests without using interfaces and mock implementations. But I will have a look at the various mocking tools (including JMockit) when I get a chance.

    But what I really wanted to offer is perhaps a strategic way of looking at the usage of interfaces to split modules/layers, even when there is only one implementation of the interface. I consider that one of the biggest difficulties with the delivery of large projects is the utilisation of a large number of developers; it seems that the more developers involved, the lower their productivity. I have thought for some time that liberally using interfaces in your application design to separate modules/layers can simplify the design/development process when utilising a team of designers/developers.

    To my mind, regardless of whether you are trying to waterfall-design everything out first, or are utilising a more agile process, the most effective way to design/develop is to delegate responsibility for specific modules/layers to individual developers and then separate the modules with interfaces. Then, whenever the designer/developer finds themselves needing or wanting to change an interface, this is the point where they need to raise and agree their proposed changes with the developers responsible for what's on the other side of the interface (and whoever has overarching architecture responsibility). I realise you could always use an actual class at the intersection between the modules/layers here, but that complicates things, as you then have two (or more) people responsible for the actual implementation details of the class, and that makes it much more difficult. When you use an interface, you are reducing the amount of stuff that multiple people are responsible for making decisions on. I think what I have mentioned above is a key benefit of reducing the coupling in a system, and of course it can be accomplished in many different ways; e.g. inter-application/system coupling with a messaging backbone has proved its worth in the enterprise.

    I think there is also another key benefit to this approach: the management of a project's dependencies. If the design has split the project up into a number of modules or layers connected by interfaces, you can order the project however you want, by creating mock implementations first for any dependencies that your module has prior to developing your module (see the sketch below). In other words, you don't need full implementations of the modules you depend on to start the development/testing of your module. (TDD of course provides this benefit to a degree, but I suppose what I am suggesting is coarse-grained TDD only, although individual module developers could be doing hard-out TDD.) I think that not only does this make the project easier to schedule and manage, but more importantly it allows you to focus attention on the modules with the highest risk early in the project.

    The delegation of responsibility to individuals and the very clear separation of who is responsible for what in this approach enables whoever has responsibility for the overarching architecture and design to avoid micro-design and micro-management and really concentrate on the interfaces, and hence the module/layer couplings. I realise though that this might be important for smaller projects with few developers.
  46. Oops, I thought I proof-read that carefully; the last line should of course read: I realise though that this might not be important for smaller projects with few developers.
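    A small sketch of the module-boundary idea from message 45 (names hypothetical): the interface is the agreed contract between two developers, and a throwaway stub lets a dependent module proceed before the real implementation exists.

    // Agreed contract between the pricing module and its consumers.
    public interface PricingModule {
        double quote(String productCode, int quantity);
    }

    // Throwaway stub, so the ordering module can be built and tested
    // before the real pricing implementation is scheduled.
    public class StubPricingModule implements PricingModule {
        public double quote(String productCode, int quantity) {
            return 9.99 * quantity; // canned value, good enough for now
        }
    }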
  47. In my experience many developers follow what is laid down before them without really thinking about the merits or applicability of a particular pattern. Formulaic pattern adoption and proliferation doesn't always guarantee a well-designed solution. Sometimes the original intent, reuse, encapsulation etc., gets lost amidst the factories and locators; a pattern wasteland. I think that patterns have a lifespan that is in part influenced by language and evolving technology. Sometimes it's better to over-engineer up front rather than under-engineer and have to patch. My experience with product development is that it's rare that you get an opportunity to revisit a design, a pattern choice, or to change an API interface etc. Once the code is out there you can't take it back. I mean, imagine the complaints if log4j completely redesigned their interfaces. In this scenario, attention to design and the choice of patterns is made up front, often a long time before roll-out, and sometimes, hey, you guess wrong. But sometimes, hey, you guess right, and just before a demo or a rollout you're damn happy you've got that interface in place to customize or repurpose. Perhaps it isn't that the old patterns are crap but that we need GoF edition II?
  48. Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.


    I think your sarcasm is misplaced. Over-engineering is still rife in Java land.

    What percentage of interfaces end up having one and only one implementation? A high percentage in my experience. Why not start with no interfaces, then add them when they are needed ...
    Because that could end up being far more work to change. No offense, but if you think interfaces are over-engineering, I'd hate to work on your stuff. Seems like you've got a bit of duplication in there. It takes next to no time to create an interface, and when you need it, it's there. By your logic, if you favor ArrayList (which I do), why use List at all? Why not make everything concrete? In addition, any IDE I've used allows you to go directly to implementations, so don't blame the tool for the user.
  49. Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.


    I think your sarcasm is misplaced. Over-engineering is still rife in Java land.

    What percentage of interfaces end up having one and only one implementation? A high percentage in my experience. Why not start with no interfaces, then add them when they are needed ...


    Because that could end up being far more work to change. No offense, but if you think interfaces are over-engineering, I'd hate to work on your stuff. Seems like you've got a bit of duplication in there.


    It takes next to no time to create an interface, and when you need it, it's there. By your logic, if you favor ArrayList (which I do), why use List at all? Why not make everything concrete?

    In addition, any IDE I've used allows you to go directly to implementations, so don't blame the tool for the user.
    Interfaces are great and very useful when used where they are needed. They are the mainstays of any good maintainable/flexible/extendable piece of software. Maybe it's your projects I keep coming onto, where I am forced to deal with mountains of layers, interfaces, abstract classes, and a whole host of remarkably intricate twists and turns in the code: an amazing number of Java files, a lot of which sit there in superb isolation, providing untold genericity which will never be needed. Try going into a large code base like this to do a 'small' change. It's the over-selling of 'design for future requirements' (including generating an interface as a matter of course for any object) that is a direct cause of many 'clumps' of code garbage that exist in financial institutions in London today. And it is over-engineering, and the people who propagate an over-zealous design philosophy, that is the root cause of this. People like you, I guess.
  50. Maybe it's your projects I keep coming onto, where I am forced to deal with mountains of layers, interfaces, abstract classes, and a whole host of remarkably intricate twists and turns in the code: an amazing number of Java files, a lot of which sit there in superb isolation, providing untold genericity which will never be needed. Try going into a large code base like this to do a 'small' change. It's the over-selling of 'design for future requirements' (including generating an interface as a matter of course for any object) that is a direct cause of many 'clumps' of code garbage that exist in financial institutions in London today. And it is over-engineering, and the people who propagate an over-zealous design philosophy, that is the root cause of this. People like you, I guess.
    I think it is unnecessary to get personal here. I'm not scolding; I'm just suggesting we could all stop now and the thread would be a lot better.

    I've worked with code that was a lot like what I think you are describing. There were layers upon layers upon layers, many of which seemed to exist only to create pointless objects and move them to the next layer. Oh, and to break business logic up in nonsensical ways that created many painful non-API logic dependencies.

    My personal take was that the designers thought they knew what they were doing but really didn't. I've never seen an application that was harder to change. The extra layers and insane dead-end 100-stack-level-deep calls were clearly intended to help with future needs but actually had the exact opposite result. The code stagnated in a semi-working state because of the vast over-complexity of the code. Small changes would result in errors or unexpected behavior changes in seemingly unrelated parts of the system. Nothing could be modified without a complete analysis of the entire system (which was fairly large).

    But the problem here is not design patterns or interfaces. It was inexperienced developers with a GoF book and some other things they read but didn't really understand.

    I seriously believe that until this industry starts mentoring new developers as a rule, we will always be pointing at last year's hot technique as the cause of all our problems. Many of these techniques have proved to be useful and valuable, but almost all have been misunderstood, misused, and overdone. Basically, anything that is being touted as the solution for much of the problems of today will be considered the cause of the problems of tomorrow. I fully expect to see a 'solution' to Spring in the near future.

    Really, as far as I am concerned, the problem is a lack of brains in the right places.
  51. Maybe it's your projects I keep coming onto, where I am forced to deal with mountains of layers, interfaces, abstract classes, and a whole host of remarkably intricate twists and turns in the code: an amazing number of Java files, a lot of which sit there in superb isolation, providing untold genericity which will never be needed. Try going into a large code base like this to do a 'small' change. It's the over-selling of 'design for future requirements' (including generating an interface as a matter of course for any object) that is a direct cause of many 'clumps' of code garbage that exist in financial institutions in London today. And it is over-engineering, and the people who propagate an over-zealous design philosophy, that is the root cause of this. People like you, I guess.


    I think it is unnecessary to get personal here. I'm not scolding. I'm just suggesting we could all just stop now and the thread would be a lot better.

    I've worked with code that was a lot like what I think you are describing. There were layers upon layers upon layers, many of which seemed to exist only to create pointless objects and move them to the next layer. Oh, and break business logic up in nonsensical ways that created many painful non-API logic dependencies.

    My personal take was that the designers thought they knew what they were doing but really didn't. I've never seen an application that was harder to change. The extra layers and insane dead-end 100-stack-level-deep calls were clearly intended to help with future needs but actually had the exact opposite result. The code stagnated in a semi-working state because of the vast over-complexity of the code. Small changes would result in errors or unexpected behavior changes in seemingly unrelated parts of the system. Nothing could be modified without a complete analysis of the entire system (which was fairly large).

    But the problem here is not design patterns or interfaces. It was inexperienced developers with a GoF book and some other things they read but didn't really understand.

    I seriously believe that until this industry starts mentoring new developers as a rule, we will always be pointing at last year's hot technique as the cause of all our problems. Many of these techniques have proved to be useful and valuable, but almost all have been misunderstood, misused, and overdone. Basically, anything that is being touted as the solution for much of the problems of today will be considered the cause of the problems of tomorrow. I fully expect to see a 'solution' to Spring in the near future.

    Really, as far as I am concerned, the problem is a lack of brains in the right places.
    Big +1 to the mentoring part.
  52. Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.


    I think your sarcasm is misplaced. Over-engineering is still rife in Java land.

    What percentage of interfaces end up having one and only one implementation? A high percentage, in my experience. Why not start with no interfaces, then add them when they are needed ...


    Because that could end up being far more work to change. No offense, but if you think interfaces are over-engineering, I'd hate to work on your stuff. Seems like you've got a bit of duplication in there.


    It takes next to no time to create an interface, and when you need it, it's there. By your logic, if you favor ArrayList (which I do), why use List at all - why not make everything concrete?

    In addition, any IDE I've used allows you to go directly to implementations, so don't blame the tool for the user.
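    To make that concrete, a minimal sketch (hypothetical names) of coding to the interface - the caller only sees List, so the concrete class can be swapped later without touching it:

    import java.util.ArrayList;
    import java.util.LinkedList;
    import java.util.List;

    public class ListDemo {
        // Depends only on the List interface, not on a concrete class.
        static int countLong(List<String> names) {
            int n = 0;
            for (String name : names) {
                if (name.length() > 10) n++;
            }
            return n;
        }

        public static void main(String[] args) {
            List<String> names = new ArrayList<String>(); // or new LinkedList<String>()
            names.add("Decorator");
            names.add("AbstractFactory");
            System.out.println(countLong(names)); // prints 1
        }
    }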


    Interfaces are great and very useful when used where they are needed. They are the mainstays of any good maintainable/flexible/extendable piece of software. Maybe it's your projects I keep coming onto, where I am forced to deal with mountains of layers, interfaces, abstract classes, and a whole host of remarkably intricate twists and turns in the code - an amazing number of Java files, a lot of which sit there in superb isolation, providing untold genericity which will never be needed. Try going into a large code base like this to do a 'small' change. It's the over-selling of 'design for future requirements' (including generating an interface as a matter of course for any object) that is a direct cause of the many 'clumps' of code garbage that exist in financial institutions in London today. And it is over-engineering and the people who propagate an overzealous design philosophy that are the root cause of this. People like you, I guess.
    I've never done any work in London, so it wouldn't be mine. Perhaps we are saying the same thing. I certainly have objects that don't use interfaces because they don't need them. I just don't think the use of interfaces constitutes over-engineering. I think that people abuse tools from a lack of understanding or a particular philosophy. You implied that it is difficult to get to an implementation, but in IDEA, at least, you can go directly to a single implementation with Ctrl+B. If you have multiple, it lists them, so I don't believe this is difficult. That said, where the code is used would determine whether I default to an interface when I'm not sure. If the code is in a single project, I would probably not use an interface if I'm not sure, because I can refactor the code with an "Extract Interface" and be confident that everything is fairly self-contained. If I was writing a class for a module that was to be used in different projects, I would err on the side of the interface and provide an abstract default implementation. If you think that is too much, well, I guess we'll never work together, and that is humanity's loss. :-)
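    For what it's worth, a minimal sketch (hypothetical names) of that last style - an interface for cross-project use plus an abstract default implementation that carries the shared behavior:

    // Compiled together here for brevity; in a real module these would be separate files.
    interface Validator {
        boolean isValid(String input);
        String describe();
    }

    // Abstract default implementation: implementors only fill in the one
    // method they actually care about.
    abstract class AbstractValidator implements Validator {
        public String describe() {
            return getClass().getSimpleName();
        }
    }

    class NotBlankValidator extends AbstractValidator {
        public boolean isValid(String input) {
            return input != null && input.trim().length() > 0;
        }
    }

    public class ValidatorDemo {
        public static void main(String[] args) {
            Validator v = new NotBlankValidator();
            System.out.println(v.describe() + ": " + v.isValid(" hello ")); // NotBlankValidator: true
        }
    }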
  53. That's a good point

    I spent the last year working in a scripting language for a reason: to break myself from the bonds of the really, really smelly MVC patterns that occur in the places where I've worked. I wanted to see if I could make simple updates to an app without updating 3 really crappy layers of code that only existed to make the patterns happy. And in most cases, in heavily patterned, over-engineered apps, all the abstraction goes to waste. By the time there's something to "switch that interface to point to something else," it's not Java or anything similar, and the interface never mattered. Apps don't change much architecturally over time without a huge cost (patterns or no), so all the patterns are overhead. So I went "less is more," and when I came back after a year I did appreciate my ability to remember how to overcome some frailties of oversimplification using simple patterns. But instead of following a book, I was only abstracting to what I needed, and I didn't have to think about it much. That's what learning is about. Patterns aren't the solution. Neither are fancy language implementations. They are tools in your toolbox, nothing else.
  54. by the way

    The article's point is complete bull***t. Languages have nothing to do with patterns. What a joke.
  55. Re: Design Patterns are Code Smells

    Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.
    To Stu's point, you don't need that sort of thing in Ruby. :)
  56. Re: Design Patterns are Code Smells

    Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.


    To Stu's point, you don't need that sort of thing in Ruby. :)
    You don't need that sort of thing in any dynamic language - what's your point? The debate over interfaces and their usefulness should be taken to the debate of strongly vs. weakly typed languages. There are benefits and drawbacks to both. I find that I love working with strongly typed languages for larger-scale applications. For small, quick hack apps, dynamic languages are great. That's not to say you can't build a robust enterprise application in a dynamic language; it's just that you have to enforce certain development practices that you wouldn't in a strongly typed language. Since Ruby (or any weakly typed language) has no type enforcement, imagine getting a reference to some implementation that changed its public interface. Hmmm, looks like I'll be busy scanning my code for things that come out of the box with strongly typed languages and their IDEs. Also, refactoring in a weakly typed language is not as straightforward.

    Ilya
  57. ..You don't need that sort of thing in any dynamic language - what's your point? The debate over interfaces and their usefulness should be taken to the debate of strongly vs. weakly typed languages. There are benefits and drawbacks to both. ..
    I don't think that was Bob's point either, Ilya. One of the referenced arguments of the article pointed towards the (kind of obvious, but whatever...) fact that these mysterious patterns take different forms or have no meaning depending on the language you attempt to apply them to. There is no point in getting defensive about dynamic vs. static (dead) languages, as that wasn't the point of anything I read. (Well, sort of.)
  58. ..You don't need that sort of thing in any dynamic language - what's your point? The debate over interfaces and their usefulness should be taken to the debate of strongly vs. weakly typed languages. There are benefits and drawbacks to both. ..


    I don't think that was Bob's point either, Ilya. One of the referenced arguments of the article pointed towards the (kind of obvious, but whatever...) fact that these mysterious patterns take different forms or have no meaning depending on the language you attempt to apply them to.

    There is no point in getting defensive about dynamic vs. static (dead) languages, as that wasn't the point of anything I read. (Well, sort of.)
    I don't think static is dead, but maybe you're referring to the new hooks from statically typed languages into dynamic languages, allowing them to become more hybrid. I don't think static languages are dead, nor will they die for a very long time; they are definitely more useful in large OO design. Either way, the best hybrid approach I've come across, though not without its limitations, is ActionScript 3. It allows you to mix static and dynamic typing at the levels you prefer and does it very elegantly. They still have some catching up to do in the OO facilities, though AS3 supports most mainstream OO idioms.
  59. Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.
    Funny. I think there is a good point that Stuart makes (I've seen him make the same points at a no-fluff conference): design patterns are really the result of deficiencies in a language, library, or framework. Case in point: the Singleton pattern in Java. The GoF singleton pattern was needed in Java because Java didn't have a built-in way to enforce it. Using a dependency injection framework has (for me) deprecated the need for the traditional singleton pattern (private constructor plus a static instance). Actually, using a dependency injection framework has pretty much deprecated the Factory pattern, since that is what a DI framework does so well. I don't think anybody should feel defensive about Stuart's comments. All languages have deficiencies, some of them intentional, so all languages require common patterns to work around those deficiencies.
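    For readers following along, the hand-rolled version being deprecated here looks something like this (a minimal sketch; the class name is made up):

    public class ConnectionPool {
        // The one shared instance, created eagerly when the class loads.
        private static final ConnectionPool INSTANCE = new ConnectionPool();

        // Private constructor: nothing else can instantiate this class.
        private ConnectionPool() { }

        public static ConnectionPool getInstance() {
            return INSTANCE;
        }
    }

    With a DI container such as Spring, ConnectionPool can become a plain class with a public constructor, and the container's default singleton scope hands out the single shared instance - the private-constructor plumbing simply disappears.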
  60. Corby Page, I hope you were being sarcastic! (smile)
  61. OO Smell

    Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.
    And change all your methods to static to make sure they can't be overridden, and make sure all fields are static as well. No constructors needed. Easier to debug.
  62. Re: OO Smell

    Thanks for the heads up! I just eliminated all of the interfaces from my application, and changed all of my references to point to concrete implementation classes.

    That should get rid of most of those smelly design patterns.


    And change all your methods to static to make sure they can't be overridden, and make sure all fields are static as well. No constructors needed. Easier to debug.
    Oh god, I think I'm having flashbacks to some code I inherited ~5 years ago. Be sure to share one database connection with autocommit turned on across all sessions with no synchronization while you're at it, and pretend the only container class is Vector.
  63. Don't scoff at GOF

    If someone, say 5 years ago, showed me code that implemented a GoF pattern straight from the book, then I would give them points for applying higher-level concepts. If they showed me the same today, I would point them to frameworks that could do a better job and suggest refactoring. The statement "Design Patterns are Code Smells" has no historical context. Languages, frameworks, and best practices change over time. C# and RoR have the advantage of baking patterns into convention. Java doesn't have the luxury of refactoring itself. It has to move incrementally on a 10+ year old design.
  64. Re: Don't scoff at GOF

    If someone, say 5 years ago, showed me code that implemented a GoF pattern straight from the book, then I would give them points for applying higher-level concepts. If they showed me the same today, I would point them to frameworks that could do a better job and suggest refactoring.
    I don't really understand this conventional wisdom. It seems very popular this year, but what do the framework authors use? Other frameworks? It strikes me as a 'run along, little one' sort of attitude. I'm all for reuse, and I use frameworks when they make sense, but I'm not convinced that developers should never try to solve problems on their own. That's not what I call a developer. That's a glorified typist. If no one ever tries to create something that a framework doesn't already provide, who will come up with the approaches that lead to future frameworks? Do we really believe that our current frameworks have reached perfection?
  65. Context is important

    If no one ever tries to create something that a framework doesn't already provide, who will come up with the approaches that lead to future frameworks? Do we really believe that our current frameworks have reached perfection?
    It depends if your job is developing frameworks or solving business problems. If somebody asked you to deliver a pile of packages, would you build a van or hire one?
  66. Re: Context is important

    If no one ever tries to create something that a framework doesn't already provide, who will come up with the approaches that lead to future frameworks? Do we really believe that our current frameworks have reached perfection?


    It depends if your job is developing frameworks or solving business problems.

    If somebody asked you to deliver a pile of packages, would you build a van or hire one?
    If I delivered packages every day, I'd buy a van, maybe more than one. When you make a sandwich, do you use an automatic sandwich-making machine? This is a common but flawed analogy. Frameworks don't do anything by themselves. Have you ever considered who creates a lot of these open source frameworks? It's often not that person's job. It also makes a flawed assumption that using something that is made by someone else is always better and cheaper than doing it yourself. Often it's actually more expensive and produces inferior results. A lot of the most successful businesses have beaten their competition by doing things themselves. I'm not against frameworks, and I use them all the time. I'd never try to create a Java-backed web site without a framework. There are lots of reasons to use a framework, and if they save time and effort, they should be used. But the conventional wisdom that all code must be written using a DI framework is just ridiculous bullshit. It will be replaced by the next 'all X must Y' brain-dead dogma. What we don't need is more developers who can't do anything without their favorite crutch.
  67. Re: Context is important

    It also makes a flawed assumption that using something that is made by someone else is always better and cheaper than doing it yourself. Often it's actually more expensive and produces inferior results. A lot of the most successful businesses have beaten their competition by doing things themselves.
    Can I quote you on that? Maybe the fact that someone else is saying it will help make it believable. And can you tell me what the competition is doing? I don't think anyone will do anything that the competition isn't already doing. Maybe if I can cite someone else saying what the competition is doing, someone will really listen to me!
  68. Re: Context is important

    It also makes a flawed assumption that using something that is made by someone else is always better and cheaper than doing it yourself. Often it's actually more expensive and produces inferior results. A lot of the most successful businesses have beaten their competition by doing things themselves.


    Can I quote you on that? Maybe the fact that someone else is saying it will help make it believable.

    And can you tell me what the competition is doing? I don't think anyone will do anything that the competition isn't already doing. Maybe if I can cite someone else saying what the competition is doing, someone will really listen to me!
    That's why I always write my own java.lang.* implementations - to stay ahead of the competition.
  69. Re: Context is important

    It also makes a flawed assumption that using something that is made by someone else is always better and cheaper than doing it yourself. Often it's actually more expensive and produces inferior results. A lot of the most successful businesses have beaten their competition by doing things themselves.


    Can I quote you on that? Maybe the fact that someone else is saying it will help make it believable.

    And can you tell me what the competition is doing? I don't think anyone will do anything that the competition isn't already doing. Maybe if I can cite someone else saying what the competition is doing, someone will really listen to me!


    That's why I always write my own java.lang.* implementations - to stay ahead of the competition.
    I'm pretty sure you don't need to be told that the negation of "don't do anything yourself" is not "do everything yourself".
  70. Re: Context is important

    It also makes a flawed assumption that using something that is made by someone else is always better and cheaper than doing it yourself. Often it's actually more expensive and produces inferior results. A lot of the most successful businesses have beaten their competition by doing things themselves.


    Can I quote you on that? Maybe the fact that someone else is saying it will help make it believable.

    And can you tell me what the competition is doing? I don't think anyone will do anything that the competition isn't already doing. Maybe if I can cite someone else saying what the competition is doing, someone will really listen to me!
    I'm not sure if you are being serious or not, but the kind of thing I am talking about is one of the ways WalMart became successful. Setting aside any feelings people have about WalMart, WalMart implemented its own distribution system. Current conventional wisdom would say that WalMart isn't in the logistics business and therefore should not have done that. But the fact remains that doing so gave them a distinct competitive advantage. There are also good examples where outsourcing everything has resulted in disaster. Literally. For example, ValuJet's outsourcing of almost everything was a contributing factor to its unusually high rate of airplane crashes.

    I am not saying that implementing everything from scratch is a good idea. Actually, that's a really bad idea. What I'm saying is that picking something out and saying 'this isn't our business' is completely arbitrary. It's really just a specious argument. These types of decisions should be made based on weighing the costs and advantages of doing things yourself against using an outside source. If we really believe that "don't do it yourself" must be followed without exception, why are we even using frameworks? Shouldn't we always buy fully working applications off-the-shelf? You should be solving business problems, not writing code, right?

    That's actually what I'm dealing with right now. The organization I work for has purchased many millions of dollars worth of COTS tools in the name of saving money. Some of this has been successful, but a lot of it has been a disaster. We've ended up with unhappy users, corrupt data, missed deadlines, and indefinitely delayed strategic initiatives, and on top of that we've had to hire more permanent employees and more contractors to support these tools. A number of these things could have been done in house by pulling together some frameworks and writing some custom code at a much lower cost (up-front and over time), but that's 'doing it yourself' and must be avoided at extreme expense.
  71. Re: Context is important

    James - Are you under a pseudonym? Because on several occasions I could have sworn you were describing my employer, but you're not in the corporate address book. That's a rhetorical question.

    Build vs. buy is always a very hard decision. IMHO the track record of off-the-shelf enterprise applications is horrendous. But so is the record of their bespoke counterparts. My personal opinion is that while significant progress has been made at componentizing solutions for the technical problems associated with building business applications, little to no progress has been made at building actual business components. Just blather about SOA that ignores all of the hard problems.

    Take dates and times, for example. The JDK classes for them are awful. Joda Time solves more technical problems associated with dates, times, timezones, etc. than I ever knew existed, all in a nice extensible library. It provides a solid foundation that bears little resemblance to the way actual human beings think about time. So it's the programmer's job to translate between the fuzzy, illogical world of human time and the rigorous definitions being used in the software. Oftentimes both lose out.

    Dealing with dates, weeks, months, etc. is not new. Not by a long shot. But our default representation of a Date is still the number of milliseconds since the Java Epoch (I wonder if historians will record that...). Maybe this is all my ignorance. Someone please point me at a reusable library where I can create a nice, typesafe object that represents "the second Monday of the month, unless that Monday is a holiday in which case use the next working day after that Monday."
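    For what it's worth, here is a sketch of that rule in Java (using java.time, which arrived years after this thread; the holiday set is hypothetical). Note that this is still ad hoc logic, not the reusable, typesafe rule object being asked for:

    import java.time.DayOfWeek;
    import java.time.LocalDate;
    import java.time.temporal.TemporalAdjusters;
    import java.util.Collections;
    import java.util.Set;

    public class SecondMondayRule {
        // Hypothetical holiday calendar; real data would come from the business.
        static final Set<LocalDate> HOLIDAYS =
                Collections.singleton(LocalDate.of(2024, 1, 8));

        // "The second Monday of the month, unless that Monday is a holiday,
        // in which case use the next working day after that Monday."
        static LocalDate secondMonday(int year, int month) {
            LocalDate day = LocalDate.of(year, month, 1)
                    .with(TemporalAdjusters.dayOfWeekInMonth(2, DayOfWeek.MONDAY));
            while (HOLIDAYS.contains(day)
                    || day.getDayOfWeek() == DayOfWeek.SATURDAY
                    || day.getDayOfWeek() == DayOfWeek.SUNDAY) {
                day = day.plusDays(1);
            }
            return day;
        }

        public static void main(String[] args) {
            // Jan 8, 2024 is the second Monday but a holiday -> rolls to Jan 9.
            System.out.println(secondMonday(2024, 1)); // 2024-01-09
        }
    }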
  72. Re: Context is important

    James -
    Are you under a pseudonym? Because on several occasions I could have sworn you were describing my employer, but you're not in the corporate address book.
    It's like this everywhere, it seems.
    That's a rhetorical question.

    Build vs. buy is always a very hard decision. IMHO the track record of off-the-shelf enterprise applications is horrendous. But so is the record of their bespoke counterparts.

    My personal opinion is that while significant progress has been made at componentizing solutions for the technical problems associated with building business applications, little to no progress has been made at building actual business components.
    I think this is an intractable problem to some degree. To create a tool that handles everyone's needs, it has to be so configurable that the scope of the configuration becomes equivalent to that of a general-purpose language. Generally speaking, these tools were not intended to be general-purpose languages and, unsurprisingly, they perform badly in that role.
    Just blather about SOA that ignores all of the hard problems.
    It's unfortunate that SOA has become such a meaningless buzzword in so many circles because underneath all the marketing BS, there's actually a really great architectural approach. Most people who talk about SOA don't know anything about that, though. SOA might as well stand for 'Software On Acid' based on some of the descriptions I've heard.
    Take dates and times, for example. The JDK classes for them are awful. Joda Time solves more technical problems associated with dates, times, timezones, etc. than I ever knew existed, all in a nice extensible library. It provides a solid foundation that bears little resemblance to the way actual human beings think about time.
    Actually, I think the Calendar class is a pretty awesome tool for writing complex date-based logic. It's ugly, but you can do just about anything you can think of with it. The real crime is that the Date class has been so broken for so long. It's absurd, really.
    So it's the programmer's job to translate between the fuzzy, illogical world of human time and the rigorous definitions being used in the software.
    I believe that this is what the role of a programmer is and will always be. The problem is that a lot of people want to turn programmers into technical typists (and/or mouse-clickers) by buying tools that supposedly remove the need for real programming. I've not yet seen anything that suggests this is achievable.
    Oftentimes both lose out.

    Dealing with dates, weeks, months, etc. is not new. Not by a long shot. But our default representation of a Date is still the number of milliseconds since the Java Epoch (I wonder if historians will record that...)
    I thought Jan 1, 1970 was common across many computer platforms as the beginning of the current epoch. Maybe I'm wrong.
  73. Re: Context is important

    I believe that this is what the role of a programmer is and will always be. The problem is that a lot of people want to turn programmers into technical typists (and/or mouse-clickers) by buying tools that supposedly remove the need for real programming. I've not yet seen anything that suggests this is achievable.
    I agree in the general case but not in the specific case. If we don't start building widely reusable abstractions for extremely common business concepts I think we'll hit (or have already hit) a brick wall in terms of productivity for small scale applications. Yes, it's not that hard to do whatever you want with the available date/calendar classes, which is probably why no one that I know of has written a library that brings it up to the business level. It's too hard to get right in the general case and too easy to implement in the specific case. But I think we're really paying for it in terms of productivity and reliability.
  74. Re: Context is important

    I believe that this is what the role of a programmer is and will always be. The problem is that a lot of people want to turn programmers into technical typists (and/or mouse-clickers) by buying tools that supposedly remove the need for real programming. I've not yet seen anything that suggests this is achievable.


    I agree in the general case but not in the specific case. If we don't start building widely reusable abstractions for extremely common business concepts I think we'll hit (or have already hit) a brick wall in terms of productivity for small scale applications.

    Yes, it's not that hard to do whatever you want with the available date/calendar classes, which is probably why no one that I know of has written a library that brings it up to the business level. It's too hard to get right in the general case and too easy to implement in the specific case. But I think we're really paying for it in terms of productivity and reliability.
    Reusable components are good, and if you can get something that does what you need at a lower cost than you can do it yourself, you should do so. I don't think there is much to argue about over that. One thing that seems to be missed, however, is that the total cost over the life of the system is what matters.

    What I have a problem with is taking this to the extreme and saying that everything must be done with reusable components from outside. This is never going to work. Take your date example. There is no way that any vendor or project is going to provide a tool that will handle every specific date logic desired by every organization in the world. It's never going to happen.

    What I see often is that when a quality tool cannot be found, either business processes are changed to fit some software written by people who are not in that business (they are in software), or a product of poor quality is chosen, or both. This is madness. I cringe every time I hear someone suggesting that our customers change their processes because the pathetic piece of garbage we call 'a solution' does not fit their current process and the vendor wants a king's ransom to make it so that it does. Then we wonder why our customers think we are a bunch of clowns.

    No one outside of IT cares whether you "do it yourself" or not. All they care about is whether it works and how much it costs. We'd be wise to start thinking in those terms too, instead of making decisions based upon specious arguments and flawed analogies. Reusable components from outside the organization are one way to save time and money. Other ways are internally created reusable components and using powerful tools that make it possible to get things done correctly and quickly.
  75. Re: Context is important

    Take your date example. There is no way that any vendor or project is going to provide a tool that will handle every specific date logic desired by every organization in the world. It's never going to happen.
    Not every piece of logic, but I'm fairly certain the level of abstraction could be increased...or maybe the impedance mismatch between context-based human conceptions and context-free formal constructs could be lowered. You "just" need to determine what the facets of the context are, and make a way to easily blend the context type and the formal construct type together into something that a lay person would recognize.
  76. Re: Context is important

    Take your date example. There is no way that any vendor or project is going to provide a tool that will handle every specific date logic desired by every organization in the world. It's never going to happen.


    Not every piece of logic, but I'm fairly certain the level of abstraction could be increased...or maybe the impedance mismatch between context-based human conceptions and context-free formal constructs could be lowered. You "just" need to determine what the facets of the context are, and make a way to easily blend the context type and the formal construct type together into something that a lay person would recognize.
    The problem is that most lay people can't even express what it is that they want in plain terms. A lay person will say "I want X," and when X is provided they will say that it's not what they asked for. If another human cannot understand their request, a computer will not either - not until AI makes some major leaps, anyway. And I don't believe that any AI will be able to know that the lay person really wanted Y, with Z in cases A, B, and C, when they said they wanted X. I think that programming languages can be improved but this will mainly make developers more productive and lower the barrier of entry for people to become developers. But the real work of development: taking fuzzy human desires and telling the computer how to satisfy them is not going to be solved by better languages. Actually better languages will increase the amount of time developers spend doing these things instead of solving language-related issues. In reality, a lot of better languages already exist. Java is good for certain tasks, but it's not suited for a lot of the things it's commonly used for.
  77. Re: Context is important

    I think that programming languages can be improved but this will mainly make developers more productive and lower the barrier of entry for people to become developers.
    Would you call me cynical if I said I think we should be raising the barrier? Unless someone comes up with a programming language that somehow forces organized, logical thought on people without destroying their ability to communicate with the outside world.
    But the real work of development: taking fuzzy human desires and telling the computer how to satisfy them is not going to be solved by better languages.
    Very true.
    Actually better languages will increase the amount of time developers spend doing these things instead of solving language-related issues.
    Yes.
    In reality, a lot of better languages already exist. Java is good for certain tasks, but it's not suited for a lot of the things it's commonly used for.
    Yes. This isn't a language issue, although a good language can help. What makes you think that a company can create an entire ERP system (just to pick an example) that is suitable for a large portion of businesses while thinking that it is impossible to come up with a good, general purpose date/time library that has a low impedance mismatch with the normal human conception of time? ...or maybe you don't think that such applications can be built. You never said that they can, so I shouldn't put words in your mouth.

    Ok, I'm going to take a different approach. I often use acyclic directed graphs and trees. I end up hand-coding the logic on my objects to maintain the DAG-ness or the tree-ness because while there are libraries for dealing with graphs, I'm not aware of any that let you nicely mix them in with your regular business-type objects. You could go even simpler - bidirectional associations.

    I suspect this is because most people write...uhh...what I'll call floppy object models, which I'll define as meaning that it is possible (or, say, easy) to put an object graph into an invalid state by simply using its public interface. This would also include constructors that return invalid, half-initialized objects that are then configured through a bunch of public setter methods (note: I think private/protected/package constructors that return half-initialized objects with "non-validating" non-public setters that are then configured by a factory object or DI are ok, as long as by the time "user code" gets the object it is valid and will stay valid).
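    As an aside, a minimal sketch (hypothetical names) of the non-floppy alternative for the bidirectional case - the only public mutator keeps both sides of the association consistent, so the public interface cannot produce a half-linked graph:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    class Department {
        private final List<Employee> members = new ArrayList<Employee>();

        // Read-only view: callers cannot bypass the invariant.
        public List<Employee> getMembers() {
            return Collections.unmodifiableList(members);
        }

        // Package-private: only Employee.moveTo() may call these.
        void link(Employee e)   { members.add(e); }
        void unlink(Employee e) { members.remove(e); }
    }

    class Employee {
        private Department department;

        public Department getDepartment() { return department; }

        // The single mutator updates both ends of the association.
        public void moveTo(Department target) {
            if (department != null) {
                department.unlink(this);
            }
            department = target;
            if (target != null) {
                target.link(this);
            }
        }
    }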
  78. Re: Context is important

    What makes you think that a company can create an entire ERP system (just to pick an example) that is suitable for a large portion of businesses while thinking that it is impossible to come up with a good, general purpose date/time library that has a low impedance mismatch with the normal human conception of time?

    ...or maybe you don't think that such applications can be built. You never said that they can so I shouldn't put words in your mouth.
    I don't think an ERP system can be built that is suitable for a large portion of businesses. I think most of an ERP system can be built, because a lot of stuff is similar across most businesses. A lot of vendors try to address this with customization, but they have done it really poorly, in my opinion. I think this may be getting better. Most of our tools allow you to plug in custom Java classes to some degree. But in general, the models provided for tailoring these COTS applications are clunky at best.
    Ok, I'm going to take a different approach. I often use acyclic directed graphs and trees. I end up hand-coding the logic on my objects to maintain the DAG-ness or the tree-ness because while there are libraries for dealing with graphs, I'm not aware of any that let you nicely mix them in with your regular business-type objects.
    This is precisely the kind of thing that I mean. There's a library out there that does, say, a big part of what you want but doesn't provide an effective way to add the rest of what you want to do. It then becomes either worthless or a burden.
    You could go even simpler - bidirectional associations.

    I suspect this is because most people write...uhh...what I'll call floppy object models, which I'll define as meaning that it is possible (or, say, easy) to put an object graph into an invalid state by simply using its public interface. This would also include constructors that return invalid, half-initialized objects that are then configured through a bunch of public setter methods (note: I think private/protected/package constructors that return half-initialized objects with "non-validating" non-public setters that are then configured by a factory object or DI are ok, as long as by the time "user code" gets the object it is valid and will stay valid).
    I've kind of lost track of your point but I know exactly what you mean here.
  79. Re: Context is important

    I think that programming languages can be improved but this will mainly make developers more productive and lower the barrier of entry for people to become developers.


    Would you call me cynical if I said I think we should be raising the barrier? Unless someone comes up with a programming language that somehow forces organized, logical thought on people without destroying their ability to communicate with the outside world.
    Well, that's a little cynical, but I think I know where you are coming from. What I mean is that there are a lot of smart people who could be good business-logic developers but haven't been initiated into the world of programming. I don't think we will see a language in our lifetimes that will take any schmuck off the street and make them able to program properly, but I think it's possible there could be one that lets people with a firm grasp of logic program without a lot of gotchas.
  80. Re: Context is important

    No one outside of IT cares whether you "do it yourself" or not. All they care about is whether it works and how much it costs.
    That's almost certainly not true. You're right that we'd be wise to start thinking in those terms, but so would everyone else. The rest of the world may not care about all the techno mumbo-jumbo that IT people like to spew (as do other professionals in their respective domains), but they certainly have opinions about build vs buy. I see non-IT people crawl into bed with software vendors and defend those vendors' products as if they were the One True Religion(tm) on a regular basis. I also watch vendors feed this like there was no tomorrow, creating little cults around their products. Then two people who prefer opposing products (or one COTS and one bespoke) get together and it's like a chicken fight.
  81. Re: Context is important

    No one outside of IT cares whether you "do it yourself" or not. All they care about is whether it works and how much it costs.


    That's almost certainly not true. You're right that we'd be wise to start thinking in those terms, but so would everyone else. The rest of the world may not care about all the techno mumbo-jumbo that IT people like to spew (as do other professionals in their respective domains), but they certainly have opinions about build vs buy. I see non-IT people crawl into bed with software vendors and defend those vendors' products as if they were the One True Religion(tm) on a regular basis. I also watch vendors feed this like there was no tomorrow, creating little cults around their products. Then two people who prefer opposing products (or one COTS and one bespoke) get together and it's like a chicken fight.
    OK, you've got me there. What I should have said is that nobody outside of IT 'should' care.
  82. Re: Context is important

    I thought Jan 1, 1970 was common across many computer platforms as the beginning of the current epoch. Maybe I'm wrong.
    It is, but try googling "Java Epoch" and you'll see what I mean. I noticed it in some API docs the other day (I don't remember for what...) and thought "wow, isn't that Java Centric."
  83. Because we aren't computer scientists

    Any framework or abstraction you or I create is a known factor to the people who ARE computer scientists. THEY (the computer scientists) just aren't that interested in OUR problems. And why should they be? They are too busy trying to get the most out of computers to cure diseases and solve the problems of physical science. Which results in faster computers for us to putz around with, playing with these silly patterns of crap, which are no more than the musings of children who have been in the playpen too long. We are not computer scientists just because we are software engineers. We are the ones stuck with the last rev of what anyone in computer science cared about before they got on to more important business.
  84. Just my 2 cents. I've seen a number of people (in the past) who develop with the GoF book in hand, just copying almost everything from it (down to names, etc.). That's obviously clinical. But Java implements/provides many base patterns in the JDK, and we use them on a daily basis (sometimes without realizing it). My personal opinion is that many of the patterns are rather over-elaborated, and they certainly depend on the language in use. It's nicely summarized here: http://en.wikipedia.org/wiki/Design_pattern_(computer_science)#Criticism (URL is broken by TSS parser)... Best, Nikita. GridGain – Open Source Java Grid Computing
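    Two everyday examples of GoF patterns already baked into the JDK - Decorator in java.io and Factory Method on Calendar (the file name below is made up):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.Calendar;

    public class JdkPatterns {
        public static void main(String[] args) throws IOException {
            // Decorator: BufferedReader wraps any Reader and adds buffering.
            BufferedReader in = new BufferedReader(new FileReader("data.txt"));
            System.out.println(in.readLine());
            in.close();

            // Factory Method: callers never name the concrete Calendar subclass.
            Calendar now = Calendar.getInstance();
            System.out.println(now.getTime());
        }
    }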
  85. Salaam to all,

    Stuart's comments are perfectly reasonable in the context of an arts-and-crafts approach to building technological artifacts. Which is precisely where we have been, and where it regrettably appears we will remain in the near future, in the so-called 'software industry'. However, they do betray a clear misunderstanding regarding the value of the added complexity of an 'architectural' approach to software development. (The subtitle of the GoF book, let's remember, contains the word "reusable"...)

    The GoF book, if it is not clear, is based on the lessons drawn from other disciplines, specifically architecture and urban design, and the pioneering ideas and works of Christopher Alexander, as an alternative approach to the prevalent approach to effective use of modern industrial processes and materials. Prior to Alexander's work, which can fairly be characterized as a (chunked) bottom-up approach to design, the high doctrine of design in Architecture was that articulated by the Modern school, including Corbusier, Mies, and the Bauhaus. The Modern approach can effectively be described as a theory of Types, quite similar to what comprehensive 'frameworks' are in our field. (RoR, for example, is a platform for quick assembly of a 'type'.)

    Alexander's movement was part of a historic reactionary movement against the top-down Type approach in architecture, which was clearly seen to promote look-alike, soul-less, and ultimately inhuman environments. One reactionary movement, Post-Modernism, chose to simply 'dress' the Modern top-down design Type, or employ 'irony' in the misuse of a Type for a function, or through unexpected juxtaposition of types. Another reactionary movement, typified by Bruce Goff, chose to entirely do away with the industrial vocabulary, and affected an embrace of 'organic forms' and 'organic methodologies'. (This is somewhat similar to the philosophical motivation of XP.) None of the above 'solutions', of course, solved the problem which initially engendered the Modern Movement in Architecture: how to effectively utilize modern materials and processes to design and build environments in the industrial age.

    Alexander's approach was to reject the notion of (overall) type, and instead embrace 'localized patterns' as 'guidelines' in the design of an overall structure or (urban) environment, which would then incorporate a 'realization' of a set of patterns in its final design and implementation. A pattern is a 'conceptual' design. It is not a concrete design, and it is independent of the target platform, just as the conceptual design pattern of a 'courtyard' is entirely silent on the topic of whether concrete or bricks are used.

    Recipes are the vocabulary of contractors and construction workers. Patterns are the vocabulary of architects, engineers, and designers. And the smell signifies the conflation of these two distinct roles in our "industry". There will never be a software "industry" until high-school dropouts in some 3rd-world nation sit around an assembly line assembling high-technology software components of which they are completely clueless. Until that day, we will continue to have intelligent people wasting their lives crafting (yet another) book-keeping or ledger 'type' application and, in order to keep their creative and intellectual needs satisfied, jumping from one technology to another in boredom.
  86. You used a lot of words but I'm struggling to figure out what your point is.
    The GoF book, if it is not clear, is based on the lessons drawn from other disciplines, specifically architecture and urban design, and the pioneering ideas and works of Christopher Alexander, as an alternative approach to the prevalent approach to effective use of modern industrial processes and materials.
    I've always taken the GoF book as the result of identifying commonly recurring software design issues across many projects and giving them a name. That they chose to name a software pattern after a similar physical architecture pattern is, as you state, probably not an accident. The authors did not invent the patterns as much as they discovered the need to document that similar design challenges showed up across diverse projects: needing to have a single point of reference, needing to interface with existing legacy code, needing to isolate lower-level implementation from higher-level logic.

    I don't know if you are blowing smoke in the next six paragraphs or if you have a real point. There may be something interesting there, but it is lost to me - maybe because a J2EE bulletin board isn't the appropriate place to post a first draft of your thesis? If you have a finished paper that goes into detail and provides references to back up what you are asserting, then that would be an interesting read. I guess my problem here is I have no idea if you are separating the idea of software architecture and physical architecture at all. My opinion has been that it is dangerous to assume that these two share anything other than a handful of ideas at the conceptual level. I'd go further in saying the only reason physical architectural terminology is used in software at all is because of our human limitation to create new ideas mostly out of what we already know. My guess is that as we move forward, talking about software will have less to do with "building blocks" and more to do with fluids, waves, motion, fuzziness, etc.

    Your last paragraph scares me. It almost sounds like you want the dismal assembly-line approach to software development, which isn't an appropriate analogy anyway. Assembly lines are used to mass-produce the same widget at the lowest possible cost. Since the cost of replicating a software component is zero, there will never be a need for the assembly line.

    I'd be interested in hearing your point of view, if you can state it more succinctly and clearly. I just didn't get about 90% of what you said.
  87. Food Smells. :)

    I am an experienced developer. I survived many projects before I knew any design patterns, then learned them afterwards. After I really understood them, I felt ashamed of some code I had written before, even though it ran perfectly well for the business operations. Last year, I started to work on a project and noticed something similar to my old code. A huge class did everything for 4 kinds of calculation, so for everything big or small, this class had to be visited and changed. I spent one month and turned this one class into 94 classes collaborating with each other, using the Abstract Factory, Prototype, and Singleton patterns for object creation and the Strategy pattern for handling the variant situations of each kind of calculation, and then added a fifth kind without touching the other 4. Previous to my change, there were many if-else branches and long methods doing complicated decision making; after the change, each class is organized by its functionality and product family. If you think of these classes as the tools you use to fix your home: previous to my change, all the tools were stored in a bag without any order; after the change, they are clearly labeled and put on a tool shelf. This is achieved by using appropriate design patterns. Of course, we can't overcome language limits by using design patterns - we know that. That is the same as people not being able to overcome their life limits by eating food; we can't just say the food smells. :)
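    In miniature, the shape of that refactoring looks like this (a tiny sketch with hypothetical names - the real one had 94 classes and several creational patterns on top):

    import java.util.HashMap;
    import java.util.Map;

    interface Calculation {
        double calculate(double input);
    }

    class FlatFee implements Calculation {
        public double calculate(double input) { return input + 25.0; }
    }

    class PercentFee implements Calculation {
        public double calculate(double input) { return input * 1.03; }
    }

    public class Calculator {
        private final Map<String, Calculation> strategies =
                new HashMap<String, Calculation>();

        public Calculator() {
            // A fifth kind of calculation is one new class and one line here,
            // instead of another branch in a giant if-else method.
            strategies.put("flat", new FlatFee());
            strategies.put("percent", new PercentFee());
        }

        public double run(String kind, double input) {
            return strategies.get(kind).calculate(input);
        }

        public static void main(String[] args) {
            System.out.println(new Calculator().run("percent", 100.0)); // 103.0
        }
    }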
  88. IMO, the article puts it way too black or white. True, a design pattern can be, and sometimes is, abstracted into a programming language. For instance, the factory pattern is provided in dynamic languages implementing meta-programming facilities. However, you have to remember that the more abstractions you add to a language, the more bloated and complex the language can become. For instance, would you want to see the State pattern implemented directly in Java? In the end, I believe the real question is: Is the design pattern so widely used now that it should be abstracted into the language, i.e., given the status of an axiom, as Inheritance and Polymorphism were in OO? Or is it still not used enough to consider making an abstraction out of it? AOP is a nice example, trying to integrate the proxy design pattern (based on the dynamic strategy) into the core language. Some people believe it is the future of Java, while others think it shouldn't be implemented at the language level and that the abstraction model is not right. There are no absolute right answers here, contrary to what some people pretend. So in the end it's more about choosing what should be raised to an axiom of the language. This isn't as easy a decision as the author seems to imply. You have to decide not only whether the new abstraction is going to make your programming language more powerful but mainly whether it's going to be worth the readability lost to this new abstraction (an abstraction always means more indirection). Think how people were against OO at first because it made it too hard to follow the program execution flow, and see how this argument is brought up once more against AOP. A language is based on a paradigm, and a paradigm is just a view and a way to represent your program execution, and there is no one true way to represent something. It's more about trying to find the more powerful and yet readable, simple paradigm convenient for the majority of people out there.
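    The JDK itself already ships a slice of this: java.lang.reflect.Proxy implements the proxy pattern dynamically, which is roughly what AOP interceptors build on. A minimal sketch (hypothetical interface):

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;

    public class LoggingProxyDemo {
        public interface Greeter {
            String greet(String name);
        }

        public static void main(String[] args) {
            final Greeter target = new Greeter() {
                public String greet(String name) { return "Hello, " + name; }
            };
            // The handler is the "advice": it runs around every method call.
            Greeter proxy = (Greeter) Proxy.newProxyInstance(
                    Greeter.class.getClassLoader(),
                    new Class[] { Greeter.class },
                    new InvocationHandler() {
                        public Object invoke(Object p, Method m, Object[] a)
                                throws Throwable {
                            System.out.println("calling " + m.getName());
                            return m.invoke(target, a);
                        }
                    });
            System.out.println(proxy.greet("world")); // logs, then prints Hello, world
        }
    }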
  89. It's more about trying to find the more powerful and yet readable, simple paradigm convenient for the majority of people out there.
    and I found VB :)
  90. It's more about trying to find the more powerful and yet readable, simple paradigm convenient for the majority of people out there.


    and I found VB :)
    You forgot the powerful part ;)
  91. It's more about trying to find the more powerful and yet readable, simple paradigm convenient for the majority of people out there.


    and I found VB :)


    You forgot the powerful part ;)
    But what makes you think VB is not powerful? It is procedural, and doesn't allow she-bang OO architectural framework development, but it sure got the job done - to the point of creating DCOM components easily enough, and that is distributed computing... Yes, it was not a 'general purpose' language in the sense that C/C++ is (you don't write a TCP server in VB), but you could write components that could be hosted on COM+/MTS and would scale as much as the Windows platform would. 'Powerful' is not sufficiently well defined. I was sitting in one of Rod Johnson's sessions hearing about the 'power' of Spring. I had to ask him to clarify what he meant by 'power' - was it scalability/performance, or was it flexibility/maintainability? I need to know your concept of 'power' before deciding why VB is not powerful.
  92. Interesting discussion sudhir http://www.jyog.com
  93. I've read the article again - and read some of the posts here. I basically agree with the article, but I wouldn't use the label "code smell". Design patterns in themselves are neither good nor bad. What I think we are really talking about is the designer. Patterns are language specific, and they can stagnate, as the article's author describes, around the limits of a given language.

    I see many posts here arguing at different levels of abstraction and from different perspectives. The good thing about patterns is that they provide a common language with which we can all communicate about design. The problem IMO is that our common "design" language today is so poor. The mainstream understanding of OO design stems from C++ and its limits and associated patterns. These patterns have been "borrowed" by the Java community also. As a consequence, our mainstream OO "pattern language" is relatively poor. Many people (as many of the posters here) have stumbled on to better ways themselves from within the constraints of a given language (the JMockit discussion comes to mind here). But in my opinion, the thinking of programmers and the common understanding of what OO design should be all about won't improve until more programmers are exposed to more languages (Smalltalk, Self, Lisp, Scheme, etc). C++ is just a bad intro to OO in my opinion, and Java is not much better. Perhaps it would help if the Universities stopped "vocational training" and went back to teaching "computer science".

    When I pair program with junior programmers, they are surprised when I describe language limitations as language smells and show how certain patterns could be made redundant with language changes. This shouldn't be a surprise; it should be common knowledge. A good example of this is the use of iterators versus closures (see the sketch at the end of this post). How else are we ever going to make appropriate language choices? For example, now that Java is open source, how is the community going to provide adequate stewardship for the language if most developers can't distinguish good from bad?

    Having a pattern language isn't the problem IMO. The problem IMO is that our common design language today is pretty crude and based on a limited (wrong?) set of programming languages. We need to incorporate more programming languages into our common design language, extending it into something that is more expressive.

    Paul.
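    To illustrate the iterators-versus-closures example (the second half uses Java 8 syntax, which postdates this thread):

    import java.util.Arrays;
    import java.util.Iterator;
    import java.util.List;

    public class IteratorsVsClosures {
        public static void main(String[] args) {
            List<String> names = Arrays.asList("Norvig", "Alexander", "Halloway");

            // The GoF Iterator pattern spelled out by hand:
            for (Iterator<String> it = names.iterator(); it.hasNext(); ) {
                System.out.println(it.next());
            }

            // The same traversal as a closure: the pattern dissolves
            // into the language.
            names.forEach(name -> System.out.println(name));
        }
    }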
  94. Paul,
    But in my opinion, the thinking of programmers and the common understanding of what OO design should be all about won't improve until more programmers are exposed to more languages (Smalltalk, Self, Lisp, scheme etc).
    I think one of the Pragmatic Programmers said you should learn one new language a year (maybe that's already been mentioned in this thread...). I'm not sure that's enough, but it would be a start.
    Perhaps it would help if the Universities stopped "vocational training" and went back to teaching "computer science".
    +1...although defining computer science is probably tricky. I don't think any of my CS professors would have agreed on a definition.
  95. At least there is one huge benefit with design patterns, and that is that they give people (developing in different languages) a common "language" to discuss code design. Imagine a Java guy discussing the pros and cons of Singleton and DAO with a C++ guy without a common terminology. Of course, we can all start using "the thing formerly known as a pattern formerly known as a singleton" instead... Also, perhaps it is a new fashion to skip interfaces and factories, but I promise you that the guy who maintains your 2-million-line enterprisy app 15 years from now will love you more if you stick with them.
  96. I may be way out of the discussion. I read the article, and it seems many of the members here are way out of the discussion too. I understand patterns as being of three types: Architectural Styles, Design Patterns, and Language Idioms. As you can see, the styles are architectural patterns that do not imply any language implementation, since they sit at the architectural level. The Design Patterns (mostly GoF derived) are just that: design. That means the idea of the solution is given in higher detail, at least the essence of it, but it is not code. Lastly, the Idioms are aimed at particular languages and their capabilities.

    So, talking about patterns as code smells is like looking for construction problems in the bricks by studying the blueprints. The patterns are not going to go away. The idea of the Factory may need to be implemented with lots of tricks in one language, and may be natural and built in in another one, but the idea is the same. The real problem is the overuse of patterns as recipes. The pattern solves a problem, under a certain environment, and has some consequences. Thus, applying patterns is an art of design, not development (codification). We are talking about inventing the apple pie versus peeling the apple.

    William Martinez Pomares
  97. +1. I wish I had a link to back this up, but somewhere I read that the GoF (or one of them) admitted that including concrete examples in their book was a big mistake. It made many assume that the patterns were recipes instead of language-neutral concepts. Off subject a bit, but I've been involved in many interviews lately where knowing "Design Patterns" is listed on the resume as if it were a technology. "You know how to implement a Singleton in Java? You're hired!"
  98. By the way, I meant idioms, not axioms, in my answer. London beer, I guess ;-)