
You can automatically detect and manage flaky tests in your Nightwatch projects by integrating with Trunk. This document explains how to configure Nightwatch to output JUnit XML reports that can be uploaded to Trunk for analysis.

Checklist

By the end of this guide, you should have completed the following steps before configuring your CI provider:
  • Generate a compatible test report
  • Configure the report file path or glob
  • Disable retries for better detection accuracy
  • Test uploads locally
Once your reports are generated correctly, you're ready to move on to configuring uploads in CI.

Generating Reports

Nightwatch automatically reports test results in multiple formats, including JUnit XML. You can configure the output location in the nightwatch.conf.cjs config file:

```js
module.exports = {
  output_folder: 'test-reports',
  // ...
}
```
You can also specify the output folder at runtime with the command-line option --output <OUTPUT_FOLDER>:

```shell
nightwatch --output ./test-reports
```

Report File Path

Nightwatch outputs multiple reports for each test suite under the specified output folder. If you configured your output folder to be under ./test-reports, the JUnit XML files will be found under ./test-reports/**. You can upload multiple JUnit reports by using a glob like ./test-reports/**/*.xml.
Duplicate Uploads: When using globs, it's important to clean up old test reports between test runs. If your glob path matches stale JUnit files, uploading old test results can cause tests to be mislabeled.
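One simple way to avoid stale reports is to delete the output folder before each run. A minimal sketch, assuming the ./test-reports folder configured above (the stale file below is a hypothetical example):

```shell
# Simulate a stale report left over from a previous run (hypothetical file).
mkdir -p ./test-reports/suite-a
touch ./test-reports/suite-a/old-results.xml

# Delete the whole output folder so the glob ./test-reports/**/*.xml
# only matches reports produced by the upcoming run.
rm -rf ./test-reports

# Then run the suite fresh, e.g.:
# npx nightwatch --output ./test-reports
```

In CI this cleanup is usually unnecessary on fresh runners, but it matters on self-hosted or reused workspaces.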

Disable Retries

Retries compromise the accurate detection of flaky tests, so you need to disable automatic retries if you previously enabled them. Nightwatch doesn't retry failed or flaky tests by default, but if you enabled retries (for example, through CLI flags or a custom wrapper), remember to disable them.
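If you're not sure whether retries are configured anywhere, you can search the files that typically define your test command. A quick sketch, assuming Nightwatch's --retries and --suiteRetries CLI flags and that your test command lives in package.json or nightwatch.conf.cjs:

```shell
# Look for Nightwatch retry flags in the usual places; prints a
# confirmation if none are found (or if the files don't exist).
grep -nE -e '--(retries|suiteRetries)' package.json nightwatch.conf.cjs 2>/dev/null \
  || echo "No Nightwatch retry flags found"
```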

Try It Locally

The Validate Command

You can validate your test reports using the Trunk CLI. If you don’t have it installed already, you can install and run the validate command like this:
```shell
SKU="trunk-analytics-cli-x86_64-unknown-linux.tar.gz"
curl -fL --retry 3 \
  "https://github.com/trunk-io/analytics-cli/releases/latest/download/${SKU}" \
  | tar -xz

chmod +x trunk-analytics-cli
./trunk-analytics-cli validate --junit-paths "./junit.xml"
```
This command does not upload anything to Trunk. To improve detection accuracy, address all errors and warnings before proceeding to the next steps.
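If you want a quick end-to-end check before wiring up real reports, you can validate against a hand-written JUnit file. A minimal sketch (the XML below is an illustrative example, not real Nightwatch output):

```shell
# Write a tiny, well-formed JUnit XML report to validate against.
cat > ./junit.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="example" tests="1" failures="0" errors="0" time="0.01">
    <testcase classname="example" name="loads the homepage" time="0.01"/>
  </testsuite>
</testsuites>
EOF

# Then run: ./trunk-analytics-cli validate --junit-paths "./junit.xml"
```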

Test Upload

Before modifying your CI jobs to automatically upload test results to Trunk, try uploading a single test run manually:

```shell
./trunk-analytics-cli upload --junit-paths "./test-reports/**/*.xml" \
    --org-url-slug <TRUNK_ORG_SLUG> \
    --token <TRUNK_ORG_TOKEN>
```
You can find your Trunk organization slug and token in your organization settings. After uploading, verify in the Uploads tab that Trunk has received and processed the run successfully. Warnings will be displayed if the report has issues.

Next Steps

Configure your CI to upload test runs to Trunk. Find the guides for your CI framework below:
  • Azure DevOps Pipelines
  • BitBucket Pipelines
  • Buildkite
  • CircleCI
  • Drone CI
  • GitHub Actions
  • GitLab
  • Jenkins
  • Semaphore
  • TeamCity
  • Travis CI
  • Other CI Providers