Article: Approaches to Mocking


  1. Article: Approaches to Mocking (13 messages)

    Simon Stewart has written an article about the various approaches to mocking objects. He discusses static mocks (MockMaker), dynamic mocks (Dynamocks/jMock, EasyMock), and even a little on the pros and cons of using AOP for mocking.

    Introduction

    "Everyone knows what a mock is, just from the name, but as with many seemingly simple ideas, there is more to them than first meets the eye. This article explores the two types of mocks that exist and covers some of the problems inherent in their use. Finally, it considers the reason why a developer might chose to use mocks. After all, common understanding holds that mocks are used for unit testing, a key part of Test Driven Design, but that isn't necessarily about testing at all."

    Read Approaches to Mocking
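    The two styles the article contrasts can be sketched in plain Java. Below, a hand-written ("static") mock implements a collaborator interface directly, the style tools like MockMaker generate, while a dynamic mock builds the same canned behavior at runtime with java.lang.reflect.Proxy, the JDK mechanism that dynamic mock libraries such as EasyMock and jMock build on. All of the names here (UserStore and so on) are illustrative inventions, not code from the article:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class MockStyles {
    // A hypothetical collaborator we want to mock out in tests.
    interface UserStore {
        boolean exists(String name);
    }

    // Static mock: a hand-written class with canned answers and call recording.
    static class StaticMockUserStore implements UserStore {
        int calls = 0;                        // records how the mock was used
        public boolean exists(String name) {
            calls++;
            return "alice".equals(name);      // canned answer
        }
    }

    // Dynamic mock: the same stub assembled at runtime via a proxy.
    static UserStore dynamicMock() {
        InvocationHandler handler = (proxy, method, args) ->
            "exists".equals(method.getName()) && "alice".equals(args[0]);
        return (UserStore) Proxy.newProxyInstance(
            UserStore.class.getClassLoader(),
            new Class<?>[] { UserStore.class },
            handler);
    }

    public static void main(String[] args) {
        StaticMockUserStore stub = new StaticMockUserStore();
        System.out.println(stub.exists("alice"));        // true
        System.out.println(dynamicMock().exists("bob")); // false
    }
}
```

    The trade-off the article explores falls out of this sketch: the static mock is easy to read but must be maintained by hand as the interface changes, while the dynamic mock needs no extra class but pushes behavior into reflective plumbing.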

    Threaded Messages (13)

  2. For Those About To Mock

    There are two situations where I find mocks useful:

    1) Mocking an environment is necessary to eliminate performance problems from testing the actual environment.
    2) You do not have sufficient access or control over the mocked environment to create a predictable test environment.

    But if you don't have a compelling case for using mocks, by default you should not use them. I shudder at the fetishists who create testing infrastructures that are unstable and difficult to maintain because of their quest for 'completely pure' unit tests.

    For example, in the article Mr. Stewart poses the problem of how to properly unit test a UserManager object. His solution... refactor all of the interesting logic into the User object and then mock the User object! What the hell did that accomplish? He increased the complexity of his codebase, while reducing the usefulness of his unit test to near zero.

    Good Lord, Apache Cactus provides a complex mock infrastructure to simulate your Stateless Session Bean container! If deploying and running in-container tests on SLSB's is so inefficient that it justifies using the invasive and complicated Cactus infrastructure to run a unit test, then you need to migrate off of Weblogic 3.0.

    I am a huge fan of TDD as a design process, and as a means of building a robust, repeatable testing infrastructure. But some of the zealots need to stand back and recognize when they are investing more effort into the testing infrastructure than into the application it was intended to support.
  3. more on mock testing

    This thread has a lot of commentary on mocks and testing.

    http://www.artima.com/forums/flat.jsp?forum=106&thread=30031
  4. Don't mock other people's code

    Corby,

    I have been trying to do what you say:

    "Mock the environment"

    If by that you mean:

    "mock JDBC"
    "mock the Servlet API"
    "mock external library X that my app depends on"

    Then I'm sure you have had a hell of a time.

    The thing is: Don't do it. This is mock abuse. It complicates your tests and you often end up implementing the goddarned API you are mocking all over again. It is not the recommended way to do it.

    Instead, make sure your code doesn't depend on "other people's code". (Like JDBC, Servlet API, Foo - etc.) Because they are too hard to mock - they simply weren't designed for it and are usually too low level. Instead, create a thin adapter layer on top of whatever layer you want to abstract away in your tests. Then mock that layer. Your own.

    -And assert interop with the real JDBC/Servlet/whatever API in your integration tests - which don't use mocks, but the real thing.
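    As a sketch of that advice (the interface and class names below are my own invention, not Aslak's): instead of mocking java.sql directly, put a narrow adapter you own in front of it, expressed in domain terms, and mock that adapter in unit tests. The JDBC-backed implementation is then only exercised by integration tests against a real database.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// The thin adapter layer: a narrow interface you own, in domain terms.
interface UserGateway {
    boolean passwordMatches(String username, String password);
}

// Production implementation: the only code that touches JDBC directly.
// Verified by integration tests against the real driver, not by mocks.
class JdbcUserGateway implements UserGateway {
    private final Connection connection;
    JdbcUserGateway(Connection connection) { this.connection = connection; }

    public boolean passwordMatches(String username, String password) {
        String sql = "SELECT 1 FROM users WHERE name = ? AND password = ?";
        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            ps.setString(1, username);
            ps.setString(2, password);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next();
            }
        } catch (SQLException e) {
            throw new RuntimeException(e);
        }
    }
}

// Unit tests mock the one-method adapter, never the sprawling JDBC API.
class MockUserGateway implements UserGateway {
    public boolean passwordMatches(String username, String password) {
        return "alice".equals(username) && "secret".equals(password);
    }
}
```

    The mock is trivial precisely because the adapter is narrow; a mock of java.sql.Connection would have to stub dozens of methods just to get one query through.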

    I share this view with the pioneers of the whole mocking practice: Nat Pryce, Tim Mackinnon, Steve Freeman and Joe Walnes.

    Repeat after me:

    don't mock other people's code.

    Aslak Hellesøy
  5. Don't mock other people's code

    > Instead, create a thin adapter layer on top of whatever layer you want to abstract away in your tests. Then mock that layer. Your own.

    This is a very good idea. I use this approach to test JDBC code.
    The adapter has no "CannotImplementThisInMockException". It saves a lot of time, and it is useful for other things too.
  6. One Rule to Rule Them All

    Someone told me (or did I read it somewhere?) that most software solutions derive from one root: just add one or more levels of indirection. And if you think about it for a minute, there's some truth in it: relational database normalization, N-tier solutions, the facade pattern, network protocols, etc...

    If something stands between you and the solution, add a level of indirection and abstract it away! Gone! :)

    Cheers!
    Henrique Steckelberg
  7. Don't mock other people's code

    > Instead, create a thin adapter layer on top of whatever layer you want to abstract away in your tests. Then mock that layer. Your own.

    Yes, certainly that is how all of my mocks are done. I still stand by my original premise; justify the use of mocks, and don't implement them only for knee-jerk 'keep my unit tests pure' reasons. I feel that way because:

    1) When testing infrastructures become unnecessarily complex, you increase the cost of refactoring (tests and mocks must be refactored/rewritten in addition to the application code). Once refactoring becomes overly expensive, I find that team members tend to avoid it, even though we know better. Human nature.

    2) As you can tell, I'm not especially religious about the distinction between unit tests and integration tests. For me, the purpose of unit tests is to test an encapsulated API offered by a given class or component. When you strictly enforce that a class cannot interact with other (non-mock) classes during its unit test, you are breaking that encapsulation by requiring the unit test to structure itself around the class's internal implementation details. In a good design, a class's external dependencies can be minimized, but I don't sweat them when they exist.

    [Vincent, sorry about misusing the mock/in-container terminology. I was trying to contrast the heavier testing infrastructure supplied by Cactus with a more lightweight infrastructure offered by apps like JunitEE.]
  8. For Those About To Mock

    > Good Lord, Apache Cactus provides a complex mock infrastructure to simulate your Stateless Session Bean container! If deploying and running in-container tests on SLSB's is so inefficient that it justifies using the invasive and complicated Cactus infrastructure to run a unit test, then you need to migrate off of Weblogic 3.0.

    Hi Corby,

    Just a little correction: Cactus does not create a mock infrastructure. It runs tests in the container.
  9. We Salute You

    > For example, in the article Mr. Stewart poses the problem of how to properly unit test a UserManager object. His solution... refactor all of the interesting logic into the User object and then mock the User object!

    Actually, that wasn't the problem posed or the solution. The problem was how to test the canUserLogin method of some other class. And the refactoring did not move any logic into the User object; the refactoring split up the original canUserLogin method by

    - Using IoC to obtain UserManager object (added setUserManager)
    - Using IoC to obtain the User object (added overloaded canUserLogin)

    User defined the same validatePassword method before and after the refactoring.

    > What the hell did that accomplish?

    It allowed him to directly test the logic of the canUserLogin without involving and (indirectly) testing the logic/infrastructure of

    a) The UserManager object
    b) the User object

    I think this gives at least two benefits:

    1) Unit tests will run faster, since the infrastructure of UserManager (and/or User) would presumably access a database, directory server, or something like that.
    2) Unit tests for canUserLogin won't fail due to bugs in the UserManager or User classes - as they would with the original code. canUserLogin is supposed to return false if the user object is null, and otherwise return the result of user.validatePassword. With his refactoring + MockUser, that is exactly what he can (begin to) test.

    He should eventually test the full integration of canUserLogin with non-mock versions of UserManager and User, but we can do that in integration testing rather than unit testing. One advantage of that approach is that fast tests facilitate TDD more than slow tests.
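    The refactoring described in this exchange might look roughly like this. The class and method names are reconstructed from the thread (the article's own hosting class isn't named here, so LoginService is a stand-in), and the bodies are an illustrative sketch, not the article's actual code:

```java
// Reconstructed sketch of the refactoring discussed above.
class User {
    private final String password;
    User(String password) { this.password = password; }
    boolean validatePassword(String candidate) {
        return password != null && password.equals(candidate);
    }
}

class UserManager {
    // In the article's scenario this would hit a database or directory server.
    User findUser(String name) {
        throw new UnsupportedOperationException("not needed for the unit test");
    }
}

class LoginService {
    private UserManager userManager;

    // IoC: the collaborator is injected rather than constructed internally.
    void setUserManager(UserManager userManager) { this.userManager = userManager; }

    boolean canUserLogin(String name, String password) {
        return canUserLogin(userManager.findUser(name), password);
    }

    // The overload under test: pure logic, no infrastructure involved.
    boolean canUserLogin(User user, String password) {
        return user != null && user.validatePassword(password);
    }
}
```

    A unit test can now call the overload directly with a MockUser (or a plain stub User, as above), asserting false for a null user and delegation to validatePassword otherwise; the setter-injected UserManager path is left for integration tests.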
  10. We Salute You

    Mike, who is writing these integration tests? Nobody. It's like people who say we'll write unit tests after we write all the code. It doesn't happen. Unit tests will get written if they are developed as part of the development process. The same applies to integration-level tests.

    If the user manager is too slow for testing, it will be too slow for deployment.

    If you aren't testing your assumptions about the user manager during your testing, and the test fails during deployment or system testing, then that is not acceptable code quality in my mind. Saying a test passed with your mocks is not a credible defense.

    Nobody cares about your mocks. They aren't real. What we want is a functional system. Caring about test speed at the expense of correctness misses the point.
  11. Integration Testing Does Happen

    > Mike, who is writing these integration tests? Nobody. It's like people who say we'll write unit tests after we write all the code. It doesn't happen. Unit tests will get written if they are developed as part of the development process. The same applies to integration level tests.

    I think you're assuming way too much here. Just because you haven't been doing integration testing on your projects, doesn't mean nobody is. For example, we are using the SilkTest tool to do end-to-end, black-box integration testing on my current project. Also, you will see in the Cactus documentation that Cactus tests are typically a form of (what I would call white-box) integration testing, since they run in-container.

    > If the user manager is too slow for testing it will be too slow for deployment.

    Not necessarily. Suppose UserManager takes a second to retrieve user data. A user of the system can wait a second to log in to the application without complaint. But when you're frequently running a bunch of test cases - as is done in TDD - the wait is more of a burden because

    a) It grows with the number of test cases
    b) You are waiting more often - each time you run your tests

    Furthermore, if you need to test in-container (with something like Cactus) then you have to rebuild and redeploy each time you run your tests. That hardly promotes doing TDD with cycles of test -> code -> refactor -> test again to make sure the tests still pass.

    > If you aren't testing your assumption about the user manager during your testing and the test fails during deployment or system testing then that is not acceptable code quality in my mind. Saying a test passed with your mocks is not a credible defense.

    True. But as I said above, you are operating from the flawed assumption that it isn't feasible to separate true "unit testing" from some type of "integration testing" and do both on a project.

    > Nobody cares about your mocks. They aren't real. What we want is a functional system. Caring about test speed at the expense of correctness misses the point.

    I agree. I just think that the solution is to separate unit tests - which should be very fast and run very often - from integration tests - which will run slower and thus less often (maybe once or twice a day).
  12. Integration Testing Does Happen

    > I think you're assuming way too much here.
    > ...
    > I agree. I just think that the solution is to separate unit tests

    How much unit testing gets done when it isn't part of the development process? Sure, some people will do it sometimes. Most people won't do it, and they usually won't do it as well as if they had developed it during development. I have seen this over and over again. Saying it doesn't have to be that way doesn't change human nature.

    It's clear people have different ideas on what constitutes a unit boundary and acceptable testing.

    I know integration testing won't be performed. Even if it is done, it doesn't help isolate and solve problems immediately, which is a great win for unit testing. Waiting until later argues for skipping integration testing and relying purely on system testing.

    I still wonder how anyone thinks that any but the most unpredictable failure is acceptable once they release their code. If you don't test all your assumptions in all situations, then you will have failures based on incorrect assumptions about all the code you are using. Saying we'll find it in integration testing is never ok in my book.

    > For example, we are using the SilkTest tool to do end-to-end, black-box integration testing on my current project.

    I think those are system tests, not integration tests. You are testing the entire system, not smaller integration units that are going through a specific set of interaction tests.

    You can make the argument that good system tests mean you don't need integration tests or unit tests if your code mostly works, which it should. Unlike unit tests, these tests make it very hard to pinpoint what went wrong. In unit testing we have the opposite.
  13. Integration Testing Does Happen

    > How much unit testing gets done when it isn't part of the development process? Sure, some people will do it sometimes.

    If unit testing isn't part of the development process, then perhaps using mocks isn't the right thing to do. As the article and your posts point out, using mocks as a testing technique leaves a lot to be desired. Mocks are only a tool that can be used to help build better code; like any tool they can be misused, so if you don't need them, don't use them.

    > If you don't test all your assumptions in all situations then you will have failures based on incorrect assumptions about all the code you are using. Saying we'll find it in integration testing is never ok in my book.

    Which is why the XPers suggest "testing in depth". Don't rely on just the unit tests, or just the integration tests in order to catch bugs because, as you say, it's not going to do the job well enough.
  14. Mocks must die

    Isn't it simply wrong and against your very basic instincts? Mocking implies bad coupling. If all frameworks were designed with unit-testability in mind, we would get rid of mock objects one day. Servlets must die, and so must MockServlet.