Colorado Software Summit

Dan Bergh Johnsson
Omegapoint AB
Sweden

Effective Unit Testing of In-Container Components

Unit testing is good. Almost everybody agrees on this. However, comparatively few people and projects perform unit testing in practice. One big reason is the significant gap between the textbook example of testing a Money class and the reality of testing an EJB implementation class, where the functionality you want to test is intermingled with container callbacks, naming lookups and inter-component dependencies. The result is often little or no testing at all.

In this session we investigate hands-on techniques and strategies for turning your almost-untestable EJBs, Struts actions or other server-side components into designs that can be effectively tested with fast-running unit-test suites. En route we will negotiate JNDI lookups, dependencies on other components, database connections and other obstacles.
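To make the idea concrete, here is a minimal sketch of one such refactoring: the session bean's JNDI lookup is pushed behind an overridable method, so a plain JUnit test can substitute an in-memory stub without starting a container. The OrderBean and PricingService names and the JNDI path are hypothetical examples, not taken from any particular project.

import junit.framework.TestCase;

// OrderBean's business logic depends on a PricingService that is normally
// obtained through JNDI. The lookup is isolated in a protected method so a
// subclass created in the test can override it.
class OrderBean /* implements javax.ejb.SessionBean in the real bean */ {

    protected PricingService lookupPricingService() throws Exception {
        javax.naming.Context ctx = new javax.naming.InitialContext();
        return (PricingService) ctx.lookup("java:comp/env/ejb/PricingService");
    }

    public double totalFor(int quantity) throws Exception {
        // The calculation below is the logic we actually want to unit test.
        return quantity * lookupPricingService().unitPrice();
    }
}

interface PricingService {
    double unitPrice();
}

public class OrderBeanTest extends TestCase {

    public void testTotalUsesUnitPriceFromPricingService() throws Exception {
        OrderBean bean = new OrderBean() {
            // Override the JNDI lookup with an in-memory stub.
            protected PricingService lookupPricingService() {
                return new PricingService() {
                    public double unitPrice() { return 2.5; }
                };
            }
        };
        assertEquals(10.0, bean.totalFor(4), 0.0001);
    }
}

A test like this runs in milliseconds and touches neither JNDI nor the EJB container; the real lookup code is exercised separately by in-container or integration tests.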

Real Life Automated Tests – A Case Study

One of the prime success factors for projects is the practice of automated testing. However, automated testing can be done at many different levels (unit tests, functional tests, testability tests, etc.) and in many different ways (plain old JUnit tests, with or without mocking, in-container frameworks such as Cactus, or client-driven tests with tools such as HttpUnit). It is not always obvious which combination is best. Another challenge is how to follow up and evaluate the test results. Test coverage tools like Clover can provide some metrics, but it still takes interpretation to turn those metrics into new strategies for improving the testing. And the map is filled with dead-end roads.
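As a rough illustration of the client-driven end of that spectrum, the sketch below shows an HttpUnit test driving a deployed web application over HTTP; the URL and the expected page title are made-up examples.

import com.meterware.httpunit.GetMethodWebRequest;
import com.meterware.httpunit.WebConversation;
import com.meterware.httpunit.WebRequest;
import com.meterware.httpunit.WebResponse;
import junit.framework.TestCase;

// A client-driven test: it exercises the application end to end through
// HTTP against a running server, in contrast to the in-process unit test
// sketched for the first session.
public class WelcomePageTest extends TestCase {

    public void testWelcomePageHasExpectedTitle() throws Exception {
        WebConversation conversation = new WebConversation();
        WebRequest request =
                new GetMethodWebRequest("http://localhost:8080/shop/welcome.jsp");
        WebResponse response = conversation.getResponse(request);

        assertEquals("Welcome", response.getTitle());
    }
}

Such tests cover a lot of code per test case but run slowly and require a deployed application, which is exactly the kind of trade-off the case study examines.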

In this session we follow a system that evolved through more than two years of development and several major releases. We describe how different techniques were used during different phases, how they worked in practice, the obstacles found, and the ideas tried and trashed. In other words, a story of what worked, what did not, and how we tried to improve. To spoil the ending: it closes with a big "to be continued".

Photo of Dan Johnsson

Dan Bergh Johnsson is working on a toolbox of technologies, tools, and methods for building secure system architectures. He does this in his role as an architecture and security consultant at Omegapoint AB, one of Sweden's leading IT-security companies, where he leads the effort to apply the company's security expertise in the Java/J2EE space.

From his background in programming, Dan has developed a broader interest in architecture in general and system integration in particular. This is also the background for his interest in high-quality practices such as unit testing. Through consulting and teaching he has played a pioneering part in the introduction of Java and J2EE in Sweden.

In his spare time Dan is a dedicated dancer and a member of the swing dance show troupe Sanslös Swing.

Email: dan.johnsson@omegapoint.se
