switching the coderanch build to gitlab

After having some “trouble” upgrading Jenkins on the CodeRanch server, we concluded it would be easier to switch to GitLab for the build than to fix it. After all, we already use GitLab SaaS (software as a service) for source control. While I’ve done GitLab pipelines before, this was my first time using Ant in one, so it was interesting. Which means a blog post.

Why we use Ant and our custom deployment model

We have a few CodeRanch moderators who work on the forum software. (Fewer than five, which is convenient, as that’s how many people can be in a GitLab org for free.) One of those moderators lives in a country with less-than-reliable internet. This means using Maven (or even Ant with Ivy) is a problem because it expects more internet than he may have available at a given moment.

Additionally, uploading large files is sometimes a problem, so we don’t deploy a .war file. We instead deploy a loosefiles.zip file, which contains our code but not all the dependencies. The dependencies are uploaded only when they change.
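
As a sketch of the packaging idea (the target, directory, and file names here are invented for illustration, not our actual build):

<!-- hypothetical packaging target: zip up our code, but not the dependency jars -->
<target name="loosefiles" depends="compile">
    <zip destfile="dist/loosefiles.zip">
        <fileset dir="build/classes" />
        <fileset dir="web" excludes="WEB-INF/lib/**" />
    </zip>
</target>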

I don’t recommend any company operate like this, but it meets our needs. And since it is a hobby, it also gives us fun technical challenges.

Fun fact: when I started working on the forum software (17 years ago), I had dialup internet. It was reliable, but I also personally benefited from not having to upload a war-file-sized artifact.

The main build part of the pipeline

Ant isn’t supported by Auto DevOps, so I didn’t consider that approach. The main part of the build was fairly straightforward:

image: eclipse-temurin:21

variables:
  FF_TIMESTAMPS: 1

ant-dist:
  stage: build
  before_script:
    - apt-get update && apt-get install -y ant
  script:
    - ant dist
  artifacts:
    paths:
      - qa/
      - dist/
    reports:
      junit: qa/reports/*.xml
    expire_in: 1 week
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH == "master"'

The pipeline uses a Java 21 image (the latest LTS as of when this was set up). I added FF_TIMESTAMPS so the log output tells me how long everything takes. (The free plan gives you a certain number of build minutes per month, so this is important.) Using this information, we decided not to have the build create the deployment artifact (which minifies files and zips them up), as that took a bunch of time and the people who deploy always run that step locally anyway.

The apt-get takes about 15 seconds to install Ant. (I checked this because I would have included Ant in the repo if it were slow.) Next comes actually running the dist target of the build, which compiles, runs the JUnit tests, and runs PMD for static analysis.
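
For context, the dist target boils down to ordinary Ant target chaining; this is an illustrative sketch rather than our real build file, so the target names and paths are made up:

<!-- illustrative sketch of the target chain; the real build has more targets -->
<target name="compile">
    <javac srcdir="src" destdir="build/classes" includeantruntime="false" />
</target>
<target name="test" depends="compile">
    <!-- junitlauncher runs the JUnit tests (see the listener discussion below) -->
</target>
<target name="pmd" depends="compile">
    <!-- the PMD Ant task writes its reports under qa/ -->
</target>
<target name="dist" depends="test, pmd" />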

Next, the pipeline makes the qa (build reports) and dist (binaries) directories available for browsing/downloading. It also publishes the JUnit output, which allows the merge request and pipeline pages to conveniently show test data.

Finally, the triggers are merge requests and master.

Setting up semgrep

Since SAST is free on GitLab, I set that up as well. The remainder of the pipeline is:

# based on https://semgrep.dev/docs/semgrep-ci/sample-ci-configs#sample-gitlab-cicd-configuration-snippet
semgrep:
  # A Docker image with Semgrep installed.
  image: semgrep/semgrep
  # Run the "semgrep ci" command on the command line of the docker image.
  script: semgrep ci --config auto --include src --gitlab-sast --output=gl-sast-report.json --text-output=semgrep.txt --json-output=semgrep.json --sarif-output=semgrep.sarif || true

  variables:
    # Upload findings to GitLab SAST Dashboard:
    SEMGREP_GITLAB_JSON: "1"

  artifacts:
    paths:
      - semgrep.txt
      - semgrep.json
      - semgrep.sarif
    reports:
      sast: gl-sast-report.json
    expire_in: 1 week

  rules:
    # Scan changed files in MRs (diff-aware scanning):
    - if: $CI_MERGE_REQUEST_IID
    # Scan mainline (default) branches and report all findings:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH

While most of this code came from the sample, semgrep was far more interesting. You can publish to semgrep.dev and see the results in a nice UI. It says that is free for up to 10 committers. Cool, we have fewer than that. However, when the project comes from GitLab, it requires a GitLab group token with admin access. I was less enthusiastic about that. But even then, I still couldn’t use it because GitLab’s free plan doesn’t allow you to set up group access tokens.

You might be wondering why there are so many output formats. Free GitLab basically tells you whether there are new findings, but doesn’t give you a visual display of the full report. And I’m not sure what the other developers will want to use, so I provided everything. I plan to use SARIF. There are two free visualizers for it.

ant and junit 5 – outputting test duration and failure to the log

In JUnit 5, you use the junitlauncher Ant task rather than the junit Ant task. Unfortunately, this isn’t a drop-in replacement. In addition to needing a workaround to set system properties, you also need a workaround to write the test results to the console log.

JUnit 4

With JUnit 4 and Ant, you got output that looked like this for each test run. In the console, in real time, while the build was running. This was really useful.

[junit] Running com.javaranch.jforum.MissingHeadTagTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.685 sec

The problem

JUnit 5 itself provides a number of built-in options for this. However, the Ant integration renders them “not very useful” for a few reasons.

    1. I want to use the legacy listeners. For example:
      <listener type="legacy-xml" sendSysOut="true" sendSysErr="true" />
      <listener type="legacy-plain" sendSysOut="true" sendSysErr="true" />

      However, if I enable sendSysOut and sendSysErr on them, all the listeners present redirect away from System.out. This means if I try to use the built-in LoggingListener, it also redirects to the file and I don’t see it in the console. So I have to choose between the legacy listeners and seeing real-time data.

    2. Ant seems to pass each class in a fileset as an individual test run. This means that running this code prints a summary per test class rather than one for the whole thing at the end.
      <junitlauncher haltOnFailure="true" printsummary="true">
        <classpath refid="test.classpath" />
        <testclasses outputdir="build/test-report">
          <fileset dir="build/test">
            <include name="**/*Tests.class" />
          </fileset>
        </testclasses>
      </junitlauncher>

      It looks like this. The summary tells me about each run, but without the test names, so it isn’t useful anyway.

      [junitlauncher] Test run finished after 117 ms
      [junitlauncher] [         3 containers found      ]
      [junitlauncher] [         0 containers skipped    ]
      [junitlauncher] [         3 containers started    ]
      [junitlauncher] [         0 containers aborted    ]
      [junitlauncher] [         3 containers successful ]
      [junitlauncher] [         0 containers failed     ]
      [junitlauncher] [        15 tests found           ]
      [junitlauncher] [         0 tests skipped         ]
      [junitlauncher] [        15 tests started         ]
      [junitlauncher] [         0 tests aborted         ]
      [junitlauncher] [        15 tests successful      ]
      [junitlauncher] [         0 tests failed          ]
      [junitlauncher] Test run finished after 14 ms
      [junitlauncher] [         2 containers found      ]
      [junitlauncher] [         0 containers skipped    ]
      [junitlauncher] [         2 containers started    ]
      [junitlauncher] [         0 containers aborted    ]
      [junitlauncher] [         2 containers successful ]
      [junitlauncher] [         0 containers failed     ]
      [junitlauncher] [        10 tests found           ]
      [junitlauncher] [         0 tests skipped         ]
      [junitlauncher] [        10 tests started         ]
      [junitlauncher] [         0 tests aborted         ]
      [junitlauncher] [        10 tests successful      ]
      [junitlauncher] [         0 tests failed          ]
    3. If I use junitlauncher’s haltOnFailure attribute, the build fails on the first failing test class rather than telling me about all of them. (I don’t use this feature anyway.)
    4. Custom extensions also have their system out/err redirected to the legacy listeners.

The solution

This feels like a hacky workaround. But I only have one project that uses Ant, so I don’t have to worry about duplicate code. The legacy listeners are useful, so I don’t want to get rid of them.

I wrote a custom listener that stores the results in memory and appends them to a file on disk after each test class runs. That way the file gets all the data, not just the last class. Then Ant echoes the file to the build log.

<target name="test.junit.launcher" depends="compile">
    <junitlauncher haltOnFailure="false" printsummary="false">
        <classpath refid="test.classpath" />
        <testclasses outputdir="build/test-report">
            <fileset dir="build/test">
                <include name="**/*Tests.class" />
            </fileset>
            <listener type="legacy-xml" sendSysOut="true" sendSysErr="true" />
            <listener type="legacy-plain" sendSysOut="true" sendSysErr="true" />
            <listener classname="com.example.project.CodeRanchListener" />
        </testclasses>
    </junitlauncher>
    <loadfile property="summary" srcFile="build/status-as-tests-run.txt" />
    <echo>${summary}</echo>
</target>

And the listener:

package com.example.project;

import java.io.*;
import java.time.*;

import org.junit.platform.engine.*;
import org.junit.platform.engine.TestDescriptor.*;
import org.junit.platform.engine.TestExecutionResult.*;
import org.junit.platform.launcher.*;

public class CodeRanchListener implements TestExecutionListener {
	
	private StringWriter inMemoryWriter = new StringWriter();

	private int numSkippedInCurrentClass;
	private int numAbortedInCurrentClass;
	private int numSucceededInCurrentClass;
	private int numFailedInCurrentClass;
	private Instant startCurrentClass;

	private void resetCountsForNewClass() {
		numSkippedInCurrentClass = 0;
		numAbortedInCurrentClass = 0;
		numSucceededInCurrentClass = 0;
		numFailedInCurrentClass = 0;
		startCurrentClass = Instant.now();
	}

	@Override
	public void executionStarted(TestIdentifier testIdentifier) {
		if ("[engine:junit-jupiter]".equals(testIdentifier.getParentId().orElse(""))) {
			println("Ran " + testIdentifier.getLegacyReportingName());
			resetCountsForNewClass();
		}
	}

	@Override
	public void executionSkipped(TestIdentifier testIdentifier, String reason) {
		numSkippedInCurrentClass++;
	}

	@Override
	public void executionFinished(TestIdentifier testIdentifier, TestExecutionResult testExecutionResult) {
		if ("[engine:junit-jupiter]".equals(testIdentifier.getParentId().orElse(""))) {
			int totalTestsInClass = numSucceededInCurrentClass + numAbortedInCurrentClass
					+ numFailedInCurrentClass + numSkippedInCurrentClass;
			Duration duration = Duration.between(startCurrentClass, Instant.now());
			// use the full duration (not just the nano component) as fractional seconds
			double numSeconds = duration.toMillis() / 1000.0;
			String output = String.format("Tests run: %d, Failures: %d, Aborted: %d, Skipped: %d, Time elapsed: %f sec",
					totalTestsInClass, numFailedInCurrentClass, numAbortedInCurrentClass,
					numSkippedInCurrentClass, numSeconds);
			println(output);

		}
		// don't count containers since looking for legacy JUnit 4 counting style
		if (testIdentifier.getType() == Type.TEST) {
			if (testExecutionResult.getStatus() == Status.SUCCESSFUL) {
				numSucceededInCurrentClass++;
			} else if (testExecutionResult.getStatus() == Status.ABORTED) {
				println("  ABORTED: " + testIdentifier.getDisplayName());
				numAbortedInCurrentClass++;
			} else if (testExecutionResult.getStatus() == Status.FAILED) {
				println("  FAILED: " + testIdentifier.getDisplayName());
				numFailedInCurrentClass++;
			}
		}
	}
	
	private void println(String str) {
		inMemoryWriter.write(str + "\n");
	}
	
	/*
	 * Append to the file on disk since the listener can't write to System.out (because the legacy listeners are enabled)
	 */
	private void flushToDisk() {
		try (FileWriter writer = new FileWriter("build/status-as-tests-run.txt", true)) {
			writer.write(inMemoryWriter.toString());
		} catch (IOException e) {
			throw new UncheckedIOException(e);
		}
	}

	@Override
	public void testPlanExecutionFinished(TestPlan testPlan) {
		flushToDisk();
	}
}


ant and junit 5 – simulating sysproperty

In JUnit 5, you use the junitlauncher Ant task rather than the junit Ant task. Unfortunately, this isn’t a drop-in replacement. For example, junitlauncher doesn’t offer the option to fork and run the JUnit tests in a separate JVM. As a result, it also doesn’t have the nested sysproperty tag that lets you pass system properties. This is a problem.

For the CodeRanch, we set a system property for the default file encoding. Since developers are around the world, we can’t assume everyone “just has” the encoding set.
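
For reference, the old junit task made this easy because it forked a separate JVM; here is a sketch of the JUnit 4 style setup (the classpath reference and directory names are illustrative):

<!-- JUnit 4 style: the forked JVM accepts per-run system properties -->
<junit fork="true" printsummary="true">
    <sysproperty key="file.encoding" value="ISO8859_1" />
    <classpath refid="test.classpath" />
    <batchtest todir="build/test-report">
        <fileset dir="build/test" includes="**/*Tests.class" />
    </batchtest>
</junit>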

Disclaimer

Since JUnit 5 functionality for Ant was introduced this year, I’m hoping what I did in this post is a short term workaround.

Option 1 – pass to Ant

You can pass the properties to Ant itself as described on Stack Overflow. For example:

_JAVA_OPTIONS=-Dfile.encoding=ISO8859_1

pros:

  • simple

cons:

  • subpar – all the developers need to remember to do this. The reason we had it in Ant in the first place was so folks wouldn’t need to remember.
  • for some use cases, the desired system properties could be derived and not known when calling Ant

Option 2 – Nashorn code

Since JUnit 5 is run in the same process as Ant itself, you need to set the system property in memory. Luckily, Ant allows you to run scripting in various languages. I chose Nashorn because it is built into Java. (There are other variants of this; see below.)

<script language="javascript">
  <![CDATA[
    var imports = new JavaImporter(java.lang.System);
    imports.System.setProperty('file.encoding', 'ISO8859_1')
  ]]>
</script>

pros:

  • Short and simple

cons:

  • Requires Java 8 or higher
  • Nashorn has been deprecated for removal since Java 11. This means at some point it will be removed. (I’m hopeful that the Ant task itself will support system properties by then.)
  • The system property is set for the remainder of the build (you could write another code block to null it out after the tests if this is a problem; see the sketch below)
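
If the lingering property is a problem, here is an untested sketch of saving and restoring the original value around the tests, using the project reference that Ant’s script task exposes:

<script language="javascript">
  <![CDATA[
    var imports = new JavaImporter(java.lang.System);
    // remember the original encoding in an Ant property before changing it
    project.setProperty('original.file.encoding',
        imports.System.getProperty('file.encoding'));
    imports.System.setProperty('file.encoding', 'ISO8859_1');
  ]]>
</script>

<!-- ... the test target runs here ... -->

<script language="javascript">
  <![CDATA[
    var imports = new JavaImporter(java.lang.System);
    // put the original encoding back so the rest of the build is unaffected
    imports.System.setProperty('file.encoding',
        project.getProperty('original.file.encoding'));
  ]]>
</script>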

Variants of option 2

If you are running Ant with a version of Java below 8, you could use this technique with Rhino instead. I didn’t test this, but I think the code is:

importClass(java.lang.System);
System.setProperty('file.encoding', 'ISO8859_1');

That said, JUnit 5 itself requires Java 8, so nobody should be in the situation of running pre-Java 8 and trying to use this blog post.

If you are running Ant with a version of Java where Nashorn has been removed, you could use Groovy or Jython as the embedded language. The code is simpler. I didn’t choose this because it requires adding another jar to the Ant directory. I prefer to minimize these setup extensions, especially for a feature like this, which is likely to be temporary.

System.setProperty('file.encoding', 'ISO8859_1')
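
Wrapped in the Ant script task, that would look something like this (assuming the Groovy jar has been added to Ant’s lib directory):

<script language="groovy">
    // same idea as the Nashorn version, just with Groovy's simpler syntax
    System.setProperty('file.encoding', 'ISO8859_1')
</script>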