dbunit vs recreate schema

As I continue writing integration tests for the back end of JavaRanch’s JForum install, I faced the decision of how to guarantee a clean database.  A previous post on the topic covers how to clone a PostgreSQL database via the command line.

I was originally thinking about using dbUnit to import data.  I may still do that if I find myself needing a lot of data.  However, I realized it was more important to be able to recreate the table structure each time.  Our database schema changes regularly as we enhance tables, and I don’t want people to have to update the exported dataset. The concept is to have the test create the schema/tables/data from scratch each time it runs.

Some interesting things in this infrastructure:

  1. JUnit suite-wide setup – I had so much to say on this topic that I wrote a separate blog post on it.
  2. Main method – The main method is interesting because it provides the starting point. It sets up the database (more on that later) and kicks off all the JUnit tests. It uses the JUnit runner to benefit from JUnit’s reporting mechanism. I also added logging for the database setup in case it took a long time on someone’s machine. This turned out not to be a problem. At the moment, I’m leaving it “just in case.”
    public static void main(String[] args) throws Exception {
        long start = System.currentTimeMillis();
        setUp(args);
        long end = System.currentTimeMillis();
        System.out.println();
        System.out.println("Done setting up database in " + (end - start) + "ms.  Now running tests.");
        System.out.println();
        JUnitCore.main("com.javaranch.test.functional.All_JForum_Functional_Tests");
    }
    
  3. Pointing to a test database – Many developers, including myself, have test data in their “jforum” database and wouldn’t appreciate the integration tests mucking around with it. As a result, the integration tests use a special “jforum_integration_test” database, whose schema is recreated on each run of the tests. Conveniently, JForum has a property for the database name that all the code uses. Having the code update this in setup is sufficient.
    SystemGlobals.setValue(ConfigKeys.DATABASE_CONNECTION_DBNAME, "jforum_integration_test");
    
  4. Tear down first – A common pattern when integration testing is to call tear down before setup. Tear down drops the “public” schema (which is where PostgreSQL stores everything by default). Setup creates the schema and loads the DDL/SQL. This is done in a DatabaseController class to keep the All_XXX_Tests class uncluttered; a rough sketch appears after this list. For example:

    conn.prepareStatement("DROP SCHEMA public CASCADE;").executeUpdate();
    
  5. Running a DDL file – This is a blog entry in and of itself, which is good because I made it one.
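
To make the last two items concrete, here is a minimal sketch of what a DatabaseController along these lines could look like. The class name comes from the description above, but the method names, the JDBC plumbing, and the assumption that the DDL file uses semicolon-separated statements are my own illustration rather than JForum’s actual test code.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;

    // Hypothetical sketch only; the real DatabaseController may differ.
    public class DatabaseController {

        // Tear down first: drop the whole "public" schema so nothing from a previous run survives.
        public static void tearDown(Connection conn) throws SQLException {
            try (Statement statement = conn.createStatement()) {
                statement.execute("DROP SCHEMA public CASCADE");
                statement.execute("CREATE SCHEMA public");
            }
        }

        // Setup: rebuild the tables by running every statement in the DDL file.
        // Assumes semicolons only appear as statement separators (no functions/procedures in the file).
        public static void setUp(Connection conn, String ddlPath) throws SQLException, IOException {
            String ddl = new String(Files.readAllBytes(Paths.get(ddlPath)), StandardCharsets.UTF_8);
            try (Statement statement = conn.createStatement()) {
                for (String sql : ddl.split(";")) {
                    if (!sql.trim().isEmpty()) {
                        statement.execute(sql);
                    }
                }
            }
        }
    }

The suite-level setup can then call tearDown followed by setUp with the path to the DDL file from SVN before kicking off the tests.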

Conclusion
I’ve been using this test structure for a couple of weeks now. It has served the purpose well, and I expect it to live on. A nice side effect is that we find out very quickly if the DDL checked into SVN is incorrect!
