# Always throw error when build fails during test (#64116)
## What?

When creating the Next.js test instance fails for whatever reason (most
commonly a failing `next build`), the helper currently exits the process,
which prevents Jest from reporting the affected tests as failed.

This PR ensures that whenever creating the instance fails, an error is
thrown instead. Jest catches the error and reports the tests as failed.

Correct reporting matters because our monitoring tools (e.g. DataDog and
the Turbopack test manifest) rely on all failing tests being reported.
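
To illustrate the difference, here is a minimal, hypothetical Jest sketch (none of these names come from the repository): when setup throws, Jest attributes the failure to the suite and reports it, whereas `process.exit(1)` during setup terminates the worker before any result is recorded.

```ts
// Hypothetical, self-contained sketch (not the real e2e-utils code); it only
// illustrates why throwing beats process.exit(1) during test setup.

// Stand-in for the real work of running `next build` and booting the app.
async function buildAndStart(): Promise<{ url: string }> {
  throw new Error('next build failed') // simulate a build failure
}

async function createInstanceOrThrow(): Promise<{ url: string }> {
  try {
    return await buildAndStart()
  } catch (err) {
    console.error('Failed to create next instance', err)
    // Re-throwing lets Jest attribute the failure to this suite.
    // Calling process.exit(1) here would kill the worker before any
    // test result is reported.
    throw err
  }
}

describe('example suite', () => {
  let instance: { url: string }

  beforeAll(async () => {
    instance = await createInstanceOrThrow()
  })

  it('is reported as failed when setup throws', () => {
    expect(instance.url).toBeDefined()
  })
})
```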


Closes NEXT-3016
timneutkens authored Apr 5, 2024
1 parent a00bed8 commit 411846a
Showing 2 changed files with 5 additions and 9 deletions.
### .github/workflows/test_e2e_deploy.yml (1 addition, 1 deletion)
```diff
@@ -51,7 +51,7 @@ jobs:
       - run: node scripts/run-e2e-test-project-reset.mjs
         name: Reset test project

-      - run: docker run --rm -v $(pwd):/work mcr.microsoft.com/playwright:v1.35.1-jammy /bin/bash -c "cd /work && NODE_VERSION=${{ env.NODE_LTS_VERSION }} ./scripts/setup-node.sh && corepack enable > /dev/null && NEXT_JUNIT_TEST_REPORT=true DATADOG_API_KEY=${DATADOG_API_KEY} DD_ENV=ci VERCEL_TEST_TOKEN=${{ secrets.VERCEL_TEST_TOKEN }} VERCEL_TEST_TEAM=vtest314-next-e2e-tests NEXT_TEST_JOB=1 NEXT_TEST_MODE=deploy TEST_TIMINGS_TOKEN=${{ secrets.TEST_TIMINGS_TOKEN }} NEXT_TEST_CONTINUE_ON_ERROR=1 xvfb-run node run-tests.js --type e2e --timings -g ${{ matrix.group }}/2 -c 1 >> /proc/1/fd/1"
+      - run: docker run --rm -v $(pwd):/work mcr.microsoft.com/playwright:v1.35.1-jammy /bin/bash -c "cd /work && NODE_VERSION=${{ env.NODE_LTS_VERSION }} ./scripts/setup-node.sh && corepack enable > /dev/null && NEXT_JUNIT_TEST_REPORT=true DATADOG_API_KEY=${DATADOG_API_KEY} DD_ENV=ci VERCEL_TEST_TOKEN=${{ secrets.VERCEL_TEST_TOKEN }} VERCEL_TEST_TEAM=vtest314-next-e2e-tests NEXT_TEST_JOB=1 NEXT_TEST_MODE=deploy TEST_TIMINGS_TOKEN=${{ secrets.TEST_TIMINGS_TOKEN }} xvfb-run node run-tests.js --type e2e --timings -g ${{ matrix.group }}/2 -c 1 >> /proc/1/fd/1"
         name: Run test/e2e (deploy)

       - name: Upload test report
```
### test/lib/e2e-utils.ts (4 additions, 8 deletions)
```diff
@@ -213,16 +213,12 @@ export async function createNext(
   } catch (err) {
     require('console').error('Failed to create next instance', err)
     try {
-      nextInstance.destroy()
+      await nextInstance?.destroy()
     } catch (_) {}

-    if (process.env.NEXT_TEST_CONTINUE_ON_ERROR) {
-      // Other test should continue to create new instance if NEXT_TEST_CONTINUE_ON_ERROR explicitly specified.
-      nextInstance = undefined
-      throw err
-    } else {
-      process.exit(1)
-    }
+    nextInstance = undefined
+    // Throw instead of process exit to ensure that Jest reports the tests as failed.
+    throw err
   } finally {
     flushAllTraces()
   }
```
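
Roughly how an e2e suite consumes this helper (a sketch; exact imports, options, and `NextInstance` methods vary per test): because `createNext` now re-throws, a `next build` failure inside `beforeAll` surfaces as a failed suite in Jest rather than an aborted process.

```ts
import { createNext, NextInstance } from 'e2e-utils'

describe('my-feature', () => {
  let next: NextInstance

  beforeAll(async () => {
    // If `next build` fails in here, createNext now throws instead of
    // calling process.exit(1), so Jest marks these tests as failed.
    next = await createNext({
      files: {
        'pages/index.js': `export default function Page() { return <p>hello world</p> }`,
      },
    })
  })

  afterAll(() => next.destroy())

  it('should render the page', async () => {
    const html = await next.render('/')
    expect(html).toContain('hello world')
  })
})
```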