Opinion: Using Bleeding Edge Technology in a Team



  1. Opinion: Using Bleeding Edge Technology in a Team (23 messages)

    Cedric brings up some interesting questions in his weblog, where he talks about using bleeding-edge technology in the real world.

    "It's easy for talented developers to juggle and assimilate innovative notions such as Aspect-Oriented Programming and integrate those in their real job, but what about the rest of the team? Or do these people work solo?"

    Do you have any answers to his questions?

    - Are you using bleeding-edge technology in your every-day job?
    - If you are, are you the only one on your team?
    - If you are not, how does the team feel about this and was it easy to convince them?

    Read Cedric's entry

    In my experience, the majority of teams are using legacy technologies and "playing" with new things. Only some of the luckier (and often smaller) teams and companies are using the bleeding- or even leading-edge toys.

    Threaded Messages (23)

  2. I work for the Northwest Alliance for Computational Science and Engineering (NACSE for short), which is affiliated with the engineering school at Oregon State University. We are primarily a computer engineering research group whose work includes supercomputing, parallel tools, user interfaces, and middleware. We work with the federal government, DOD, other universities, and scientific researchers in CS and other disciplines. I really only deal with the middleware side of our research, and we do a lot with J2EE (which I am mostly responsible for here).

    Since we are a research group, we want to and need to work with the bleeding edge. It's our job to come up with new uses and developments in technology. Having said that, there are some big caveats. First, we are funded by research grants, so obviously that means different things than development in the commercial space. In fact, this is my first experience in research, and it was a big paradigm shift for me.

    One of the big reasons that we get to play with the bleeding edge is that we get things cheap. Universities typically get massive discounts on software and hardware from almost every company. This gives us the ability to try out all sorts of new things that commercial companies wouldn't, because the potential sunk cost would likely be too great.

    Depending on the project, I am sometimes the only person on the team. Sometimes I'm with a team of 10 or 20 people. Of course it is easier to work with the bleeding edge when you're alone. However, we do work in a lot of collaborative environments for development. I've found that when we work with other researchers like the San Diego Supercomputing Center (SDSC) or others like it, it's much easier to convince them to try new things and see what happens.

    When we work closely with commercial companies, I've found that they are more reluctant to do this. That makes perfect sense though: They are in the business of making money, while researchers like me are in the business of coming up with new stuff (although I certainly like getting paid too).

    I usually don't have too hard a time convincing others on the NACSE team, as long as I have strong justification for doing something. I just have to back up my reasons. Sometimes things work, and sometimes they don't.

    One of the hardest things about the bleeding edge is managing its impact on things you've already done. Say you're using a leading-edge toolset; then the toolset changes and no longer supports the features/APIs you used earlier. You now have the sunk cost plus the additional cost of re-engineering. Worse yet, a whole class of products can go away. Remember in the late 90's when everyone and their best buddies were hyping Application Service Providers? What happened to that? That was one bandwagon that never rolled (and I'm glad to say we didn't get on it).

    I guess you just have to pick carefully and take the calculated risk if you're going to get into bleeding edge. I'm not trying to imply that research or working with bleeding edge is just one big risk. We still have to deliver, and the stuff has to work. But you do have to calculate the upside and downside of risk even more carefully with bleeding edge.

    It sure is fun though to play with the bleeding edge, but there is a reason it's called "bleeding."

    -Jason McKerr
    Northwest Alliance for Computational Science and Engineering
  3. I think what we've seen for the last couple of years is a shift from using bleeding-edge technologies to using technologies that are right for the given task. Let's stress it - technologies that *proved* to be right, or working. If two or three years ago the amount of money one would get from VCs depended on how many XMLs, XSLs, EJBs and J2EEs you had on your flag, now it depends on how *fast*, and with what quality, a team or a company can deliver results. And this notion of effectiveness defines the choice of technologies.

    Bottom line: Today the best technologies and best practices are in use. Effectively, this leaves bleeding-edge things aside until they prove to be best. Though it's pretty funny that even best practices require buy-in when introduced to a team.

    BTW, I'm wondering how these bleeding-edge things would prove to be best if no one is using them? :)
  4. You are an idiot

    Since when is AOP bleeding edge? It has been around for decades in a variety of systems, from Emacs with defadvice to CLOS with the MOP.
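    For what it's worth, the advice idea mentioned here carries over directly to modern languages. Below is a minimal Python sketch of before/after advice in the spirit of Emacs' defadvice; the `advise` decorator and the example function are mine, purely illustrative, not from any real AOP framework:

```python
import functools

def advise(before=None, after=None):
    """Wrap a function with before/after advice, in the spirit of
    Emacs defadvice: the original function body is untouched, and
    the advice runs around each call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if before:
                before(fn.__name__, args)
            result = fn(*args, **kwargs)
            if after:
                after(fn.__name__, result)
            return result
        return wrapper
    return decorator

calls = []

@advise(before=lambda name, args: calls.append(("before", name)),
        after=lambda name, result: calls.append(("after", name)))
def double(n):
    return n * 2

double(21)
# calls is now [("before", "double"), ("after", "double")]
```

    The point of the sketch is that the advised function never mentions the advice, which is exactly the property defadvice (and later AOP systems) trade on.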
  5. > Since when is AOP bleeding edge? It has been around for decades in a variety of systems, from Emacs with defadvice to CLOS with the MOP.

    I have limited computer education, but some computer use.

    I am good at thinking up projects and have a website that needs good planning, but I likely need very selective education - some great reading material about how to go on with my website - and now I am thinking about giving up on my hosts and trying to somehow get the right machines, etc.

    I am also not the best reader, but I do well with images. Why do you not link to images? Most people are actually right-brain dominant and do well with images.
  6. Everybody keeps talking about the weather, but nobody ever DOES anything about it.

    For all the ink that has been spilled over the dynamics of team programming, no consensus has been reached. Here and there, someone stands on a soapbox, strenuously advocating the universal adoption of something that they saw work, with their own eyes--once, at one shop, on one project. Funny how each of those proposed panaceas is so unique, fascinatingly different from all the others.

    Software development is partly mechanical and partly creative. Insofar as it is mechanical, it yields to the same kinds of schemes for partitioning work among team members that have long been used in mechanical engineering. Insofar as it is creative, it is inherently not a team activity, for the same reasons that great symphonies or novels are not written by teams. Creative people have to have ownership.

    The boundary between mechanism and creativity is determined by several things. Two of the most important are the management of complexity, and the maturity (~= predictability) of tools and platforms.

    Complexity management is typically botched. This is the largest single reason why there is still a premium on creativity in our industry. Only a creative mind can cope with the huge and crazily-shaped lumps of complexity that emerge from the typical "design" process.

    Mature technologies can be methodically learned, but making use of immature technologies depends on making discoveries, which is to say on the kinds of things--intuition, non-linear reasoning--that are associated with creativity. Two heads can be better than one when it comes to brainstorming workarounds or reading documentation between the lines, but they must both be creative heads--which, in turn, means that when the fires have finally been put out, they will no longer be able to work seamlessly together.
  7. That sounds like elitism to me. Does this mean that the rest of the crew you are working with are morons? What is so innovative in AOP that makes a regular team member unable to grasp it?

    > My point is not so much with Rickard but about the reality of "bleeding-edge" technology in general. It's easy for talented developers to juggle and assimilate innovative notions such as Aspect-Oriented Programming and integrate those in their real job, but what about the rest of the team? Or do these people work solo?

  8. I don't think that is necessarily elitist, I am yet to see any aspect of human endeavour in which everyone is equally proficient or interested.

    I think it is a bit naïve to think that everyone has the same ability or willingness to learn and use so-called bleeding edge technologies. Indeed I am sure many of us have come across developers who are generally disinterested in what they do.
  9. "Indeed I am sure many of us have come across developers who are generally disinterested in what they do."

    Yes, and I have never enjoyed working with these people. They rarely contribute creative input to a project and their work is usually of poorer quality. I think this goes for all "professional" lines of work. I would not want my children taught by a teacher who is generally disinterested in their work. I would not want to go to a doctor...get my point?

    Besides, what is wrong with elitism? Some people are smarter than others. Some people have certain abilities while others do not. That's life. Software development requires a certain amount of intellect to grasp the logic behind it and, of course, a good deal of problem-solving skills. If somebody is in this field and does not possess these skills, we should not coddle them and say, "Gee, they just don't like development like I do". They should find a different line of work because, chances are, they are not going to contribute much value.

    Somebody compared disinterested developers to a person that shovels asphalt or sorts potatoes for a living. I hardly think these are remotely the same thing. Some people get this line of work. Some don't. I don't want to work with those who don't. If that is elitist, so be it. This isn't for everybody. If it was, it would pay minimum wage.


    - Side note: An interesting book on this topic is "In Defense of Elitism" by William A. Henry.
  10. > Indeed I am sure many of us have come across developers who are generally disinterested in what they do.

    I worked with quite a few who were absolutely disinterested in what they did, but they produced excellent results anyway, just like professionals are supposed to do.

    Also, I worked with people who were absolutely obsessed with "bleeding edge" stuff and wasted lots of their own and others' time trying to utilize each and every magic bullet they came across. It's even worse than the "not invented here" syndrome.

    My point is that if "bleeding edge" stuff does make sense - with regard to the project goals, of course - then usually it is not a problem to adopt it in a team, not just by a select few. That's what mentoring, knowledge sharing, etc. are for.

  11. > I would not want to go to a doctor...get my point?

    Yes. You definitely do not want to go to a curious and enthusiastic doctor who is itching to try the latest and greatest "bleeding edge" stuff on you. You want to go to the old and bored one, who knows what he is doing ;-)

  12. It's not the rest of the team that worries me. While the guru is on the team it will still work.

    But when the project goes into maintenance and the guru moves on, there is a big risk. Especially if the technology that was chosen was simply a blip or proprietary, and is no longer mainstream.

    Like everything in software it comes down to management. You need to let folks play otherwise your team deskills/leaves. You also need to ensure that those playful components do not create risk for the project. Which is what project/people management is all about.

    On the other hand, JFDI still works if the team is good enough. Then it becomes a simple audit process to ensure they aren't rewriting C++ into a Smalltalk look-alike (for instance).

  13. Dimitri,

    I never once mentioned "bleeding edge" technology in my last post (although that IS what this thread is about). I was responding to a comment about disinterested developers. So, to expand on your analogy...

    Yes, I would happily go to a doctor who uses the latest technology - if they know what they are doing. Sometimes bleeding edge is the ONLY thing that can help you when it comes to medicine. And no, I would never want to be treated by a doctor who is "bored" with what they do. From my experience, boredom/disinterest leads to lack of focus, which leads to mistakes. All else being equal, give me the doctor who gives a damn about his work over the one who couldn't care less.

    To bring this full circle back to software...I do not have a problem with using bleeding-edge software. Most likely, this new software was created to better address a problem. AOP addresses issues that are not cleanly handled with OOP. Am I advocating using bleeding edge all the time in production? No. But it doesn't hurt to take a look at the new stuff that is out there. Change is good. Change is progress. Today's bleeding edge is tomorrow's leading edge and the next day's current technology.
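    That point about AOP and OOP can be made concrete: a cross-cutting concern like logging, which plain OOP forces you to repeat in every method body, can be woven in from one place. A minimal Python sketch, with a hypothetical `logged` class decorator standing in for a real AOP tool:

```python
import functools

log = []

def logged(cls):
    """Class decorator: weave a logging concern into every public
    method, instead of scattering log calls through each method."""
    for name, attr in list(vars(cls).items()):
        if callable(attr) and not name.startswith("_"):
            def make_wrapper(fn):
                @functools.wraps(fn)
                def wrapper(self, *args, **kwargs):
                    log.append(fn.__name__)   # the cross-cutting concern
                    return fn(self, *args, **kwargs)
                return wrapper
            setattr(cls, name, make_wrapper(attr))
    return cls

@logged
class Account:
    # Note: neither method mentions logging at all.
    def __init__(self):
        self.balance = 0
    def deposit(self, amount):
        self.balance += amount
    def withdraw(self, amount):
        self.balance -= amount

a = Account()
a.deposit(100)
a.withdraw(30)
# log == ["deposit", "withdraw"], a.balance == 70
```

    Real AOP frameworks go much further (pointcuts, around advice, weaving at load time), but the separation of the concern from the business code is the core idea.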

    Perhaps I am a little bit biased because I do enjoy learning bleeding edge technologies. Maybe I just don't understand how somebody can do something for a living, but have no desire to learn more about it or get any better.


    P.S. Grady Booch wrote an article about AOP being the "next big thing" in SD magazine a year and a half ago, so is AOP really bleeding edge?
  14. > But it doesn't hurt to take a look at the new stuff that is out there. Change is good. Change is progress. Today's bleeding edge is tomorrow's leading edge and the next day's current technology.

    Yes, but the point of the original article was that "bleeding edge" stuff is problematic, because only a select few can comprehend it. Cedric also mentions "I would also have to sell this to my team" - well, just go ahead and "sell" it. Do not assume that you are surrounded by brainless idiots.

  15. > That sounds like elitism to me. Does this mean that the rest of the crew you are working with are morons? What is so innovative in AOP that makes a regular team member unable to grasp it?

    Some people just do work. 8 hours a day, 5 days a week. Just like digging up the asphalt to lay phone cables. Like sorting potatoes in the food market. Etc.

    They are not morons. They may be good fellows. Caring fathers, involved mothers. They are not excited by the programming craft, which doesn't make them morons in any way. Programming provides them with bi-monthly paychecks. Those paychecks are decent compared to other boring jobs. So, they hang on. You can't expect them to spend hours of personal time learning exotic technologies. What for?
  16. Now take the next step: characterize the output of the clock-punchers vs. the output of the ones who are fascinated by the technologies. Then pretend to be a manager: which kind do you want working for you? Or both? And why?
  17. > Now take the next step: characterize the output of the clock-punchers vs. the output of the ones who are fascinated by the technologies. Then pretend to be a manager: which kind do you want working for you? Or both? And why?

    Both. Why? Because curiosity doesn't make you a better programmer.
  18. "Because curiosity doesn't make you a better programmer."

    Wow, that's a bold statement. Care to elaborate?

    Certainly curiosity isn't the only characteristic of great programmers, but I would suggest it's a common one.
  19. > "Because curiosity doesn't make you a better programmer."

    > Wow, that's a bold statement. Care to elaborate?

    How many programming languages do you know?

    There are hundreds of them. There are MANY theories, methodologies, processes, frameworks, etc. Curious people try a large percentage of them. Maybe they have time. I don't. I'm not curious. I don't want to spend hours messing with Yet Another XXX Oriented Programming. I know there's a 99.99% probability that it'll go unnoticed for good. It deserves it. Let it be forgotten forever.

    Does it make me a worse programmer? In no way. Because I spent my spare time mastering a few things which I use or anticipate using in the near future. I do read a lot. However, I believe that the CONCEPT is more important than the details, and it takes 15 minutes to get into a concept. Otherwise, it's not a concept, it's CRAP.

    Example A: Once upon a time, I went to a bookshop. That day I had gotten my monthly stipend at the university. I found a book called "Abstraction and Specification in Program Development" by B. Liskov. I read the first pages and immediately bought the book, despite its high cost. There were a few very clear concepts; iterators are among them. I liked them so much that I've been using them ever since, in many situations.
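    The iterator concept from that book - producing a collection's elements one at a time without exposing its representation - is a good test of the 15-minute claim, since it now fits in a few lines of Python. The `Bag` class below is hypothetical, purely to illustrate:

```python
class Bag:
    """A tiny collection that hides its representation but hands
    out its elements one at a time through an iterator."""
    def __init__(self):
        self._items = []

    def add(self, item):
        self._items.append(item)

    def __iter__(self):
        # A generator: callers see elements, never the list inside.
        for item in self._items:
            yield item

bag = Bag()
for n in (3, 1, 4):
    bag.add(n)

# Any client can iterate without ever touching _items:
total = sum(x for x in bag)
# total == 8
```

    Swap the internal list for a tree or a file and every client loop keeps working - which is exactly the abstraction Liskov's book argues for.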

    Example B: Once upon a time, one of my friends told me that he was working with the latest release of Turbo Pascal (late 80's, I believe). He said it had "objects", and that they were cool. When he told me that objects are "data structures bound with operations", somehow it related in my mind to the book from Example A. It took me about a minute or so to understand what he was talking about, and I immediately fell in love with it. Next, he gave me an ASCII txt file with Stroustrup's C++ book (first edition, translated from English). I liked it. The most terribly written book :P

    My point is that every NEW THING can be explained in 5-15 minutes, or 1 to 5 pages. You spend those 15 minutes, then decide either to feed your unhealthy curiosity and spend 6 months on it, or just skip it. Curious people don't have the strength to skip the CRAP.

    You can't invent something SO-O-O NEW that it takes 500 pages and 6 months of time to grasp it. If you think you can, then you are God, or at least one of his relatives.

    Example C: How much time does it take to explain the essence of fractals? Try explaining them to someone who doesn't know about them (if you can find that poor fella). You'll be surprised :) And it's a very complex and powerful CONCEPT.
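    To back up the 15-minute claim with the fractal example: the Koch curve is defined by one rule (replace each segment with four segments, each a third as long), and its striking property - length growing without bound - falls out in a few lines. The function below is mine, purely illustrative:

```python
def koch_length(initial, iterations):
    """Length of a Koch curve after n iterations: each step replaces
    every segment with 4 segments, each 1/3 the length, so the total
    length is multiplied by 4/3 per iteration."""
    length = initial
    for _ in range(iterations):
        length *= 4 / 3
    return length

# Starting from a unit segment, the length grows without bound:
koch_length(1.0, 0)   # 1.0
koch_length(1.0, 3)   # (4/3)**3, roughly 2.37
```

    One sentence of rule, one multiplication per step - and you already have a curve of infinite length in a finite box, which is the whole surprise of the concept.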
  20. Curiosity has made ME a better programmer. I cannot speak for anyone else.
  21. There is a need to distinguish between bleeding-edge technologies and bleeding-edge paradigms (are there any?). In my short developer life, I've seen several paradigms come and go, but they all seem to "play" like discussions about religion - what works for some does not work for others, and may not work again.

    Technologies, however, are trapped in an absorption cycle. The technologies that become predominant are the ones more easily leveraged, which can be absorbed by key team members in the shortest period of time. Thus, bleeding edge becomes leading edge.

    Once absorbed, these technologies develop a kind of inertia - once your team is familiar with Oracle (or whatever), there is resistance to change. Are there gains in stepping out of the comfort zone again? J2EE is nice, because you can step out onto the bleeding edge while leveraging existing code and systems. Is this the design of the future, the catch-all paradigm/technology?

    For my part, I've been using other people's bleeding edge tools and paying their penalties. For me to use bleeding edge technology again, I'll have to be the one designing it, or at least have some input. The price is too high and the learning curve too steep to strap on someone else's skis.
  22. Yes, we are using fairly new technology. Yes, we are a small company. Yes, I guess I'm lucky.

    This is a nasty game we have gotten ourselves into. We learn something new (say, EJBs), invest tons of our free time really getting to know the technology (books, magazines, certifications), successfully develop an application using it, and then one year later we have to start all over just to stay bleeding edge.

    Another thing I love is these job descriptions "must have over 5 years of experience with EJBs" (in 2002)! Right out of a Dilbert page.
  23. This is a fascinating question, even without the "in a team" angle. I think just about the hardest challenge in software is knowing when to go bleeding edge and when not to.

    A case in point: a couple of years ago, I had one very bright developer working for me who was almost entirely useless. One of the main reasons was that he got bored with anything once he felt he understood it (which was always well before he'd completed a real project with it). And he would always look for the next thing and try to sneak new technologies in, regardless of the consequences for maintenance.

    Yet I would not wish to work with developers who were not curious. While naturally not everyone aspires to be a guru, I don't think there's any place for even junior developers who have no interest in technology beyond work. After all, even in the present climate, we all earn a lot of money compared to the great majority of people. We are already paid for some research time. (Btw, I'm married to a doctor, and she and many of the doctors we know spend a considerable amount of their own time reading journals and generally keeping up to date. They certainly aren't bored by their discipline, and I would hate to go to a doctor who was.)

    I think it's important to understand what's on the bleeding edge, and to be able to judge when it's cost-effective to apply it. Developers should be able to answer any questions skeptical management might have about a particular technology without dismissing them as ignorance. After all, if a developer can't explain why using EJB/XML/XSLT/AOP/whatever will save money, improve quality or achieve other obviously desirable outcomes (in non-technical terms), he's probably just having fun. I talk a bit about this in my new book, Expert One-on-One J2EE Design and Development.

  24. To sum up: I do think the software industry is too happy to embrace unproven "advances" just because they're new and hyped. For example, EJB has been seen as a magic bullet and used in many cases where it wasn't warranted, with poor results. (I say a lot about this in my book.)

    Yet curiosity is essential. Which is why most medical boards have recently set up "maintenance of professional standards" programs to stop old, bored doctors making too many mistakes. And they do make mistakes. My wife is a specialist, and most specialists will tell you that it is the old, bored doctors who continue to prescribe drugs that have been known for ten years not to work well or to have bad side effects, because they can't be bothered to keep up with the literature. Sorry to keep on with the medical analogy, but someone did bring it up...