multiple ways of using soft asserts in junit 5

Do you think this test passes?

    @Test
    void mystery() {
        var softly = new SoftAssertions();
        softly.assertThat("robot").isEqualTo("izzy");
        softly.assertThat(126).isLessThanOrEqualTo(125);
    }

It does! Since softly is never told that it is done, the test doesn’t fail even though both assertions are wrong. I’ve been using the approach of calling assertAll() at the end to get my tests to fail:

    @Test
    void callingAssertAll() {
        var softly = new SoftAssertions();
        softly.assertThat("robot").isEqualTo("izzy");
        softly.assertThat(126).isLessThanOrEqualTo(125);
        softly.assertAll();
    }

Today I learned there are a few other approaches. One is using an autocloseable version. This one is great if you are using a local variable:

    @Test
    void autocloseable() {
        try (var softly = new AutoCloseableSoftAssertions()) {
            softly.assertThat("robot").isEqualTo("izzy");
            softly.assertThat(126).isLessThanOrEqualTo(125);
        }
    }

Alternatively, you can use the lambda version. I like the autocloseable one better as it doesn’t encourage cramming stuff into a lambda. I could call another method inside assertSoftly with the actual asserts (sketched after this example), but that doesn’t seem better than using try-with-resources.

    @Test
    void lambda() {
        SoftAssertions.assertSoftly(s -> {
            s.assertThat("robot").isEqualTo("izzy");
            s.assertThat(126).isLessThanOrEqualTo(125);
        });
    }
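
For illustration, here’s a rough sketch of that extract-a-method variant. The helper name verifyIzzy is made up; assertSoftly takes a Consumer&lt;SoftAssertions&gt;, so a method reference fits right in:

    @Test
    void lambdaCallingMethod() {
        // hand the SoftAssertions off to a named method instead of inlining the asserts
        SoftAssertions.assertSoftly(this::verifyIzzy);
    }

    private void verifyIzzy(SoftAssertions softly) {
        softly.assertThat("robot").isEqualTo("izzy");
        softly.assertThat(126).isLessThanOrEqualTo(125);
    }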

There’s another way with an extension (that comes with assertj-core). I like this approach as it uses an instance variable and doesn’t require figuring out a place to call assertAll; the extension calls it for you after each test.

@ExtendWith(SoftAssertionsExtension.class)
class SoftAssertionsExtensionTest {

    @InjectSoftAssertions
    SoftAssertions softly;

    @Test
    void field() {
        softly.assertThat("robot").isEqualTo("izzy");
        softly.assertThat(126).isLessThanOrEqualTo(125);
    }
}

The same can be done with a method parameter. It’s nice to have choices!

@ExtendWith(SoftAssertionsExtension.class)
class SoftAssertionsExtensionTest {

    @Test
    void parameter(SoftAssertions softly) {
        softly.assertThat("robot").isEqualTo("izzy");
        softly.assertThat(126).isLessThanOrEqualTo(125);
    }

}

creating a new junit 5 project in gradle in intellij

I haven’t created a new JUnit 5/Gradle project in IntelliJ in a long time. When doing so today, I ran into a number of problems. I wound up just starting over, but I’m writing up what I encountered/learned.

Java 16 vs 17

IntelliJ supports Java 17. When I set up Gradle, it told me that Gradle doesn’t yet support Java 17. I confirmed on the Gradle website that this is true. Ok, that’s fair. Java 17 isn’t fully released for another month. While there might be an early adopter package that does work with Java 17, my focus here is JUnit 5, not Java 17. So I went with Java 16.

String not found – “Package java.lang is declared in module java.base, which is not in the module graph”

This surprised me. I’m not using modules in this project. Or at least I don’t want to be, nor do I have a module-info.java file. I tried invalidating the IntelliJ cache and restarting the IDE. That didn’t work. I then tried deleting the .gradle and .idea directories and reopening IntelliJ. That worked. IntelliJ even asked if I wanted to trust the Gradle project as it recreated one.

This approach lost all my IntelliJ settings, including the Java version, Gradle version, and source/test directories.

Recreating the project

At this point, I decided to delete my build.gradle and settings.gradle files in addition to the .idea and .gradle directories. My plan was to start over. I then created a new Gradle project in the same directory. I created a dummy file and had IntelliJ create a JUnit 5 test for me. That updated the Gradle file in a way that the IDE was happy with. Now to commit to git lest anything else happen.
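
For reference, the regenerated build.gradle ended up along these lines. This is a sketch from memory rather than the exact file; the group, versions, and layout IntelliJ generates will vary:

plugins {
    id 'java'
}

group 'com.example'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

dependencies {
    // Jupiter API to compile tests against, plus the engine to run them
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.2'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.2'
}

// run tests on the JUnit Platform instead of the legacy JUnit 4 runner
test {
    useJUnitPlatform()
}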

ant and junit 5 – outputting test duration and failure to the log

In JUnit 5, you use the junitlauncher Ant task rather than the junit Ant task. Unfortunately, this isn’t a drop-in replacement. In addition to needing a workaround to set system properties, you also need a workaround to write the test results to the console log.

JUnit 4

With JUnit 4 and Ant, you got output that looked like this for each test run. In the console. In real time while the build was running. This was really useful.

[junit] Running com.javaranch.jforum.MissingHeadTagTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.685 sec

The problem

JUnit 5 itself provides a number of built-in options for this. However, the Ant integration renders this “not very useful” for a few reasons.

    1. I want to use the legacy listeners. For example:

      <listener type="legacy-xml" sendSysOut="true" sendSysErr="true" />
      <listener type="legacy-plain" sendSysOut="true" sendSysErr="true" />

      However, if I enable sendSysOut and sendSysErr on them, all the listeners present redirect away from System.out. This means if I try to use the built-in LoggingListener, it also redirects to the file and I don’t see it in the console. So I have to choose between the legacy listeners and seeing real-time data.

    2. Ant seems to pass the fileset to the launcher as individual test classes. This means that running this code prints a summary for each test class rather than one for the whole run at the end.

      <junitlauncher haltOnFailure="true" printsummary="true">
          <classpath refid="test.classpath" />
          <testclasses outputdir="build/test-report">
              <fileset dir="build/test">
                  <include name="**/*Tests.class" />
              </fileset>
          </testclasses>
      </junitlauncher>
      

      It looks like this. The summary tells me about each run, but not the test names, so it’s not useful anyway.

      [junitlauncher] Test run finished after 117 ms
      [junitlauncher] [         3 containers found      ]
      [junitlauncher] [         0 containers skipped    ]
      [junitlauncher] [         3 containers started    ]
      [junitlauncher] [         0 containers aborted    ]
      [junitlauncher] [         3 containers successful ]
      [junitlauncher] [         0 containers failed     ]
      [junitlauncher] [        15 tests found           ]
      [junitlauncher] [         0 tests skipped         ]
      [junitlauncher] [        15 tests started         ]
      [junitlauncher] [         0 tests aborted         ]
      [junitlauncher] [        15 tests successful      ]
      [junitlauncher] [         0 tests failed          ]
      [junitlauncher] Test run finished after 14 ms
      [junitlauncher] [         2 containers found      ]
      [junitlauncher] [         0 containers skipped    ]
      [junitlauncher] [         2 containers started    ]
      [junitlauncher] [         0 containers aborted    ]
      [junitlauncher] [         2 containers successful ]
      [junitlauncher] [         0 containers failed     ]
      [junitlauncher] [        10 tests found           ]
      [junitlauncher] [         0 tests skipped         ]
      [junitlauncher] [        10 tests started         ]
      [junitlauncher] [         0 tests aborted         ]
      [junitlauncher] [        10 tests successful      ]
      [junitlauncher] [         0 tests failed          ]
    3. If I use junitlauncher’s haltOnFailure property, it fails the build on the first failing test class rather than telling me about all of them. (I don’t use this feature anyway.)
    4. Custom extensions also have their system out/err redirected to the legacy listeners.

The solution

This feels like a hacky workaround. But I only have one project that uses Ant, so I don’t have to worry about duplicate code. The legacy listeners are useful, so I don’t want to get rid of them.

I wrote a custom listener that stores the results in memory for each test class and writes them to disk after the class runs. That way the file gets all the data, not just the last class. Then Ant loads the file and echoes it to the console at the end of the build.

<target name="test.junit.launcher" depends="compile">
    <junitlauncher haltOnFailure="false" printsummary="false">
        <classpath refid="test.classpath" />
        <testclasses outputdir="build/test-report">
            <fileset dir="build/test">
                <include name="**/*Tests.class" />
            </fileset>
            <listener type="legacy-xml" sendSysOut="true" sendSysErr="true" />
            <listener type="legacy-plain" sendSysOut="true" sendSysErr="true" />
            <listener classname="com.example.project.CodeRanchListener" />
        </testclasses>
    </junitlauncher>
    <loadfile property="summary" srcFile="build/status-as-tests-run.txt" />
    <echo>${summary}</echo>
</target>

And the listener:

package com.example.project;

import java.io.*;
import java.time.*;

import org.junit.platform.engine.*;
import org.junit.platform.engine.TestDescriptor.*;
import org.junit.platform.engine.TestExecutionResult.*;
import org.junit.platform.launcher.*;

public class CodeRanchListener implements TestExecutionListener {
	
	private StringWriter inMemoryWriter = new StringWriter();

	private int numSkippedInCurrentClass;
	private int numAbortedInCurrentClass;
	private int numSucceededInCurrentClass;
	private int numFailedInCurrentClass;
	private Instant startCurrentClass;

	private void resetCountsForNewClass() {
		numSkippedInCurrentClass = 0;
		numAbortedInCurrentClass = 0;
		numSucceededInCurrentClass = 0;
		numFailedInCurrentClass = 0;
		startCurrentClass = Instant.now();
	}

	@Override
	public void executionStarted(TestIdentifier testIdentifier) {
		if ("[engine:junit-jupiter]".equals(testIdentifier.getParentId().orElse(""))) {
			println("Ran " + testIdentifier.getLegacyReportingName());
			resetCountsForNewClass();
		}
	}

	@Override
	public void executionSkipped(TestIdentifier testIdentifier, String reason) {
		numSkippedInCurrentClass++;
	}

	@Override
	public void executionFinished(TestIdentifier testIdentifier, TestExecutionResult testExecutionResult) {
		if ("[engine:junit-jupiter]".equals(testIdentifier.getParentId().orElse(""))) {
			int totalTestsInClass = numSucceededInCurrentClass + numAbortedInCurrentClass
					+ numFailedInCurrentClass + numSkippedInCurrentClass;
			Duration duration = Duration.between(startCurrentClass, Instant.now());
			double numSeconds = duration.toMillis() / 1000.0; // toMillis() keeps whole seconds; getNano() alone would drop them
			String output = String.format("Tests run: %d, Failures: %d, Aborted: %d, Skipped: %d, Time elapsed: %f sec",
					totalTestsInClass, numFailedInCurrentClass, numAbortedInCurrentClass,
					numSkippedInCurrentClass, numSeconds);
			println(output);

		}
		// don't count containers since looking for legacy JUnit 4 counting style
		if (testIdentifier.getType() == Type.TEST) {
			if (testExecutionResult.getStatus() == Status.SUCCESSFUL) {
				numSucceededInCurrentClass++;
			} else if (testExecutionResult.getStatus() == Status.ABORTED) {
				println("  ABORTED: " + testIdentifier.getDisplayName());
				numAbortedInCurrentClass++;
			} else if (testExecutionResult.getStatus() == Status.FAILED) {
				println("  FAILED: " + testIdentifier.getDisplayName());
				numFailedInCurrentClass++;
			}
		}
	}
	
	private void println(String str) {
		inMemoryWriter.write(str + "\n");
	}
	
	/*
	 * Append to a file on disk since the listener can't write to System.out (because the legacy listeners are enabled)
	 */
	private void flushToDisk() {
		try (FileWriter writer = new FileWriter("build/status-as-tests-run.txt", true)) {
			writer.write(inMemoryWriter.toString());
		} catch (IOException e) {
			throw new UncheckedIOException(e);
		}
	}

	@Override
	public void testPlanExecutionFinished(TestPlan testPlan) {
		flushToDisk();
	}
}
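
Based on the format strings in the listener, the echoed summary should come out looking something like this for each test class (a hypothetical run, reusing the class name from the JUnit 4 example above):

Ran com.javaranch.jforum.MissingHeadTagTest
Tests run: 1, Failures: 0, Aborted: 0, Skipped: 0, Time elapsed: 1.685000 sec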