Requirement Coverage - Another dimension to Unit Testing


  1. Technobuff has an article called "Requirement Coverage - Another dimension to Unit Testing" that - as the title implies - shows how many requirements have been fulfilled for a project through the use of JRequire (a Technobuff product).
    The coverage of requirements is a fundamental need throughout the software life cycle. Whether during design, coding, or testing, the ability to ensure that the software meets the expected requirements is something every developer and project manager aspires to. But how do we, as developers or project managers, ensure that the piece of software delivered actually met all the requirements? As a client, how do I make sure that the outsourced requirements are being delivered? Moreover, as the software matures and goes through several iterations of enhancements and bug fixes, ensuring requirement coverage becomes more and more daunting.

    Several techniques and tools have been invented to target requirement coverage. Some tools offer the ability to map requirements to other project artifacts, such as use cases, test cases, and design documents. Some projects prefer a simple spreadsheet-based traceability matrix linking requirements to the various other artifacts. As the project enters the coding phase, maintaining requirement coverage becomes even more challenging. Often the individual developers are not the requirements experts, yet they are expected to deliver what the requirements demand of the software. They may also not be using, on a daily basis, the same requirement coverage tools or techniques that were used in the earlier phases of the project.
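    The spreadsheet-style traceability matrix mentioned above can be sketched in a few lines of code. This is only an illustration of the idea; the requirement IDs and artifact names are invented.

```java
import java.util.*;

// Minimal sketch of a requirements-to-artifacts traceability matrix,
// the kind a project might otherwise keep in a spreadsheet.
// All requirement IDs and artifact names below are hypothetical.
public class TraceabilityMatrix {
    private final Map<String, Set<String>> matrix = new TreeMap<>();

    // Record that an artifact (use case, test case, design doc) traces
    // back to a requirement.
    public void link(String requirementId, String artifact) {
        matrix.computeIfAbsent(requirementId, k -> new TreeSet<>()).add(artifact);
    }

    // A requirement counts as "covered" once at least one artifact
    // traces to it; everything else is reported as a gap.
    public List<String> uncovered(Collection<String> allRequirements) {
        List<String> result = new ArrayList<>();
        for (String req : allRequirements) {
            if (!matrix.containsKey(req)) result.add(req);
        }
        return result;
    }

    public static void main(String[] args) {
        TraceabilityMatrix m = new TraceabilityMatrix();
        m.link("REQ-001", "UseCase: Login");
        m.link("REQ-001", "TestCase: LoginTest");
        m.link("REQ-002", "DesignDoc: SessionHandling");
        // prints "Uncovered: [REQ-003]"
        System.out.println("Uncovered: " + m.uncovered(List.of("REQ-001", "REQ-002", "REQ-003")));
    }
}
```

    Even this toy version shows the article's point: the hard part is not storing the matrix, but keeping the links honest as code and requirements drift apart.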

    In a nutshell, while it is understood that requirement coverage is critical to the successful delivery of a project, the means to achieve it are not so obvious or straightforward.

    How do you track how your codebase fulfills a set of requirements? Do you track requirements formally? At what point is requirements tracking necessary for a project?

    Threaded Messages (15)

  2. Pricing

    This will be useful if enterprise-level projects stick to separating their business logic into easily testable utility classes; nevertheless, the concept is good. It will definitely appeal to any project manager. A detailed cookbook would be nice.
  3. Pricing

    Looks like a good idea. But I think the reason JUnit took off explosively is that it was aggressively free/open source software.

    I just hope that no one made any financial predictions on growth like that of JUnit adoption.

    But it is great that there is a commercial market for really clever and useful testing tools. For years we had uninteresting "enterprise"-class test software that just isn't that helpful, yet carries the "enterprise" price tag.
  4. How do you track how your codebase fulfills a set of requirements?

    Unfortunately, not really. Traceability is quickly lost between requirements and implementation. I think this is an absolute travesty.
    Do you track requirements formally?

    Absolutely! Typically with two sets of books - one maintained by the "customers" and one maintained by the "implementers." It sounds like a pain, and maybe in an ideal world it wouldn't be necessary, but a customer who doesn't track his own requirements deserves what he gets, and an implementer who doesn't track the requirements deserves to be sued.
    At what point is requirements tracking necessary for a project?

    At all points. It never stops. Unfortunately, when systems are deployed, a lot of people want to redefine the requirements as whatever the system does. This is very bad, because systems are never deployed with 100% requirements coverage, and implementations of misinterpreted requirements always exist.

    Linking requirements coverage to automated tests makes sense (although it can't be the exclusive way of validating coverage), but if you follow a "classic" definition of unit tests (testing one class in isolation), I don't think it will work. Requirements need to be in the language of the user, and users usually can't comprehend the behavior of individual classes in an effective manner.
  5. Annotations?

    Well, maybe something like that could be done via annotations/xdoclet and some discipline...

    Speaking of an ideal world...
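    One way the annotation idea could look, as a sketch only (this is not JRequire's actual mechanism; the @Requirement annotation, the test class, and the REQ-* IDs are all invented here):

```java
import java.lang.annotation.*;
import java.lang.reflect.Method;
import java.util.*;

// Tag test methods with the requirement IDs they claim to verify,
// then collect the requirement-to-test mapping via reflection.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Requirement {
    String[] value();
}

public class RequirementScanner {

    // Hypothetical test class for illustration.
    static class LoginTests {
        @Requirement({"REQ-001", "REQ-002"})
        public void testSuccessfulLogin() { /* ... */ }

        @Requirement("REQ-003")
        public void testLockoutAfterThreeFailures() { /* ... */ }

        public void testHelperBehavior() { /* not tied to a requirement */ }
    }

    // Map each requirement ID to the test methods that claim to cover it.
    public static Map<String, List<String>> coverage(Class<?> testClass) {
        Map<String, List<String>> byRequirement = new TreeMap<>();
        for (Method m : testClass.getDeclaredMethods()) {
            Requirement r = m.getAnnotation(Requirement.class);
            if (r == null) continue;
            for (String id : r.value()) {
                byRequirement.computeIfAbsent(id, k -> new ArrayList<>()).add(m.getName());
            }
        }
        return byRequirement;
    }

    public static void main(String[] args) {
        System.out.println(coverage(LoginTests.class));
    }
}
```

    The "some discipline" part is the real cost: the mapping is only as trustworthy as the developers' habit of tagging every test.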
  6. Why restrict unit tests to only gauge the robustness of your code?

    It's been my experience that developers don't write effective functional tests because most bugs come from misinterpreting requirements rather than poor coding. If a developer writes a test for a requirement based on the same interpretation that they used to implement the requirement, all they're really doing is testing robustness anyway. Besides, it is seldom that a single developer does all of the coding for a requirement. That's why developers do unit testing in white-box fashion -- i.e. measured by coverage -- and Q/A testers do functional testing -- i.e. measured by requirement coverage.

    You would do better to integrate with tools that are more suitable for functional testing than JUnit.

    Just my opinion...

    -- Les Walker
  7. It's been my experience that developers don't write effective functional tests because most bugs come from misinterpreting requirements rather than poor coding.


    And I envy you if your organization has the maturity to call a misinterpreted requirement a "bug." We release near bug-free software with lots of "issues."
  8. Why restrict unit tests to only gauge the robustness of your code?
    (This was in the original article, not from Les Walker)

    Maybe because "in computer programming, a unit test is a procedure used to verify that a particular module of source code is working properly." There are nice, accepted terms for testing things that are not units of code, such as "integration tests", "acceptance tests", "black box tests", and so on.

    Also, not every JUnit test is a unit test, and vice versa. Even if we continental Europeans usually pronounce them the same.
  9. Although I am not connected to this project, I would like to draw your attention to a not-yet-released open source tool, "Eudibamus", from the German Fraunhofer Institute. (I am just longing for it to be released.)

    It tracks requirements XP-style in a JSP wiki. With the help of Eclipse plugins it can provide the crucial connection to real code artefacts.

    By doing this, project stakeholders of any kind can easily set and track the status and readiness of requirements. Plus, we get very efficient documentation, because high-level descriptions are directly linked to code.

    I hope you will find this topic so closely related that you will have a look at the flash demo of Eudibamus at Further information on this project can be found at

    I am looking forward to interesting discussions.
  10. Eudibamus looks great.

    However, judging by the posting date on (2 years ago), I wouldn't hold my breath for a release.
  11. Release of Eudibamus this year

    I totally agree. From the dates on the webpage and the lack of up-to-date information, you cannot assume that development is still under way. But SnipSnap is just the JSP-based wiki that is the foundation of Eudibamus. It was released some years ago.

    The requirement tracking tool on top of SnipSnap - namely Eudibamus - is much younger.

    The project leaders lately announced an upcoming evaluation phase for the public and a first downloadable release at the end of this year.

    So I am still holding my breath. ;-)
  12. Check out ( based on Fit ). This project is actually released and likely covers a lot of the same ground.
  13. This tracing is a good idea and an even better practice. A well-implemented tool may be a great help.

    But let's take a look at the interesting projects, the bigger ones. Who really knows the requirements? Perhaps as a developer you know the use cases. But are those the business use cases, i.e. the requirements? Who knows the real non-functional requirements? I'm not speaking of the always-mentioned maintainability, ... I'm speaking of the ones relevant to the project or the product! What about usability, learnability, political ones, ...?
    Gathering these is a little difficult. JRequire does not help. It's good for developers, but the people writing the requirements seldom use Eclipse!

    Martin Prischmann (
  14. A very good idea.

    This tool creeps into the territory of acceptance testing. It would often be helpful for development organizations to creep into acceptance testing, so there are fewer surprises.
  15. Unusual Use of Coverage

    This is closer to a JUnit add-on than it is to a 'coverage' tool. I'd have to spend more time with it to form a strong opinion, but based on a quick glance, it looks as if it assesses coverage based on whether or not unit tests fail, which doesn't really tell you whether or not the requirements are covered, only whether or not the related tests are passing.

    Let me put it another way: code/test coverage tools are usually intended to tell you how much of your code your tests exercise. So when I read 'Requirements Coverage', I wonder how a tool will tell me how much of my requirements were 'exercised' or 'implemented' in some fashion. This doesn't seem to be the intent of /this/ tool.
  16. Using JUnit tests for requirements

    First, a disclaimer: I haven't read the article yet, maybe ideas I express here are close to those in the article...

    In one of the companies I've worked for, I was a developer for 2 years and a business analyst for another 1.5. I've been on both sides of the fence - I had to interpret incomplete requirements, and I had to express requirements for others. Coming from development, and with that experience of misinterpreting requirements because they were just not there, my requirements tended to stretch down to a very low level, to the level some people call design.

    The requirements document had the high-level use cases (unfortunately not really the business ones, but say the next level down) and 2-3 levels of more detailed ones. Yes, at a certain level of detail a more or less complete set of requirements for a unit of code would emerge, but it would be really scattered through the document.

    All this led me to what I do now as a developer (again): I capture requirements as JUnit (not unit) tests, before development if possible. These tests also include performance testing right away, with JUnitPerf. They are maintained in a similar manner to unit tests, which I very often write after the coding ;) As soon as someone comes up with an additional (forgotten) requirement - in it goes. The Javadoc for these tests contains the full use case description.
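    A sketch of what one such requirement-level test might look like. The use case, the names, the stubbed OrderService, and the 2-second budget are all invented for illustration; in the setup described above, JUnitPerf's TimedTest decorator would supply the timing around a plain JUnit test, rather than the manual check shown here:

```java
// Sketch of capturing a requirement as a test: the use case lives in the
// Javadoc, and a performance budget is checked alongside the functional
// assertion. Everything here is hypothetical; with JUnitPerf one would
// wrap the functional test in a TimedTest decorator instead.
public class Req012_ExpressCheckoutTest {

    /** Stand-in for the real service under test. */
    static class OrderService {
        long placeOrder(String item, int qty) { return 42L; }
    }

    /**
     * REQ-012: Express checkout.
     * Use case: a registered customer with a stored payment method places
     * an order in one step; the order must be confirmed within 2 seconds.
     */
    public void testExpressCheckout() {
        OrderService service = new OrderService();
        long start = System.nanoTime();
        long orderId = service.placeOrder("book", 1);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        if (orderId <= 0) throw new AssertionError("REQ-012: order was not confirmed");
        if (elapsedMs > 2000) throw new AssertionError("REQ-012: timing budget exceeded: " + elapsedMs + " ms");
    }

    public static void main(String[] args) {
        new Req012_ExpressCheckoutTest().testExpressCheckout();
        System.out.println("REQ-012 covered");
    }
}
```

    The point is that one test class corresponds to one requirement, so a green/red test report doubles as a requirement coverage report.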

    So far it has been helping me a lot. Ideally, I guess, someone other than the developer should write these tests (I tend to think a tester and a BA together, in a pair-programming way), which certainly and unfortunately is very often unrealistic, as it would require Java/JUnit knowledge from them. However, for myself it is still easy to switch between the BA and developer hats :)

    The report from the test-running tool can be used to figure out how many requirements are definitely covered. I agree with one of the previous posts that if the tests fail, it may just mean that the code is faulty. I address that by running the [real] unit tests first.

    Also, reports from test coverage tools applied only to these requirements tests can show you the complexity of the code - how much the code does beyond what the requirements demand :)