This repository contains the code for the UK Export Finance Trade Finance Service. This documentation provides a comprehensive overview of the UKEF Digital Trade Finance Service (DTFS), including prerequisites, technology stack, setup instructions, testing procedures, deployment guidelines, and other essential information for developers.
Status

CI

CD
Prerequisites

- Node.js (version 20 or later) with npm
- Docker and Docker Compose

Tech stack

- Node.js, npm
- MongoDB
- Microsoft SQL Server (with TypeORM)
- Docker
- Cypress (E2E tests)
- Webpack
- GOV.UK and MOJ design systems
- Nunjucks (UI templates)
- Connect-Flash
Setup

- Clone this repository.
- Run `nvm install VERSION_NUMBER` with the Node.js version number above to ensure you have the correct Node.js version (then `nvm use VERSION_NUMBER` to use it).
- Create a single `.env` file in the project root, using `.env.sample` as a base. Some sensitive variables may need to be shared within the team.
- Generate JWT key pairs with `secrets/set_jwt_keypair.sh` (use `bash secrets/set_jwt_keypair.sh` for Windows).
- Base64 encode the generated public and private keys and add them to your portal-api `.env` file as follows: `JWT_SIGNING_KEY=your_private_key` and `JWT_VALIDATING_KEY=your_public_key`.
- Set UKEF TFM environment variables in your terminal: `UKEF_TFM_API_SYSTEM_KEY` and `UKEF_TFM_API_REPORTS_KEY`.
- Run `npm run env:copy` to copy your root `.env` file into all the individual projects that need it.
- Run `npm ci` in the root folder of the repository. (Note: this will install dependencies for the entire project, including those specified in sub-packages; see the npm workspaces docs for details.)
- Start your local environment with `npm run start`.
- Run migrations on the MSSQL Server database (see the SQL DB docs for details).
- Create mock data by running `npm run load` from the root folder of the repository. This should generate mocks in your database (both Mongo and MSSQL). For more details on what this does, see the utils docs.
Recommended: Install a MongoDB client such as Compass or Robo 3T, and an MSSQL DB client such as Azure Data Studio.
Note: If you're on Windows and experiencing issues with MongoDB, install mongosh for command-line debugging.
npm run start
Several services are built:
| Service | URL |
| --- | --- |
| Portal UI | http://localhost:5000 |
| Portal API | http://localhost:5001 |
| External API | http://localhost:5002 |
| TFM UI | http://localhost:5003 |
| TFM API | http://localhost:5004 |
| Central API | http://localhost:5005 |
| GEF | http://localhost:5006 |
| MongoDB | root:r00t@localhost:27017 (connect via MongoDB client) |
| MSSQL Server DB | SSMS: `Server=localhost:1433;Database=DTFS;User Id=dtfs;Password=AbC!2345`; DataGrip: `jdbc:sqlserver://localhost:1433;database=DTFS;user=dtfs;password=AbC!2345` |
To access GEF locally, use http://localhost.
To stop the local environment, simply exit the running terminal and run:
npm run stop
- For Portal (BSS & GEF) mock users: `utils/mock-data-loader/portal-users/index.js`
- For Trade Finance Manager (TFM) mock users: `utils/mock-data-loader/tfm/mocks/users.js` (use the `username` to log in, as opposed to the `email`)
As this project interfaces with various third-party APIs, it requires a range of environment variables to manage them and to work with the repository locally. All variables are listed in a private spreadsheet, which should be shared with new engineers and updated as necessary.
These variables are stored as secrets in the GitHub repository. When deploying to an Azure environment, Azure automatically retrieves the GitHub secrets and updates the environment accordingly. To update a secret:
- Update the secret in the spreadsheet.
- Update the secret in GitHub secrets.
- Deploy to the development environment.
- Deploy to the test environment.
With Docker running, execute all tests using the following command:
npm run pipeline
From the respective folder (./e2e-tests/portal, ./e2e-tests/gef, ./e2e-tests/ukef, ./e2e-tests/tfm):

Run all E2E tests:
npx cypress run --config video=false

Run a single E2E test:
npx cypress run --spec "cypress/e2e/**/my-test.spec.js" --config video=false

Open the Cypress test runner:
npx cypress open .

From the respective folder (./portal-api, ./dtfs-central-api, ./trade-finance-manager-api):

Run all API tests:
npm run api-test

Run a single API test file:
npm run api-test "**/*/deals-party-db.api-test.js"

From the respective folder (./portal, ./gef-ui, ./trade-finance-manager-ui):

Run all unit tests:
npm run unit-test

Run a single unit test file:
npm run unit-test /path/to/file.test.js
The `gef-ui`, `portal` and `trade-finance-manager-ui` folders/services all have a `public` folder which contains the compiled/minified CSS and JS used in the running application.
These CSS and JS files are built from SCSS and JS source files using Webpack. You can check which SCSS and JS source files are used in the `webpack.common.config.js` file (each relevant service has one). In general, each of the three services has:

- A `scripts` folder containing the source JS.
- A `styles` folder containing the source SCSS.

After making any changes to the source files or their dependencies, run `npm run build` inside the service in question to recompile the CSS and JS in the `public` folder.
Sub-resource integrity (SRI)
Client-side JavaScript files are protected by the SRI security feature, which allows the browser to verify the authenticity of the JavaScript files in use.
Whenever a client-side JavaScript file is recompiled with `npm run build`, a new file hash will need to be generated; otherwise, the script will not be executed.
This can be done by updating the `integrity` attribute in any HTML/Nunjucks `script` tags that use the file to reflect the new hash of the recompiled file (a good place to check for these `script` tags is the `templates/index.njk` file in the service).
There are two ways to update these:
- (Preferred) Render a template that uses the script in a browser; a console error should give you the hash of the recompiled file.
- Run `cat FILENAME.js | openssl dgst -sha512 -binary | openssl base64 -A` on each file.

Note: GEF UI references Portal UI's JavaScript files when using the reverse proxy, so use the hashes from Portal UI's `.js` files in GEF UI.
In the root directory or any service, run:
npm run lint
- Create a branch and PR clearly describing the change, along with the Jira ticket number.
- The PR will run tests for the affected services.
- Once the PR tests pass, another engineer reviews and approves the PR.
- The PR is then merged into the main branch.
GitHub Actions will automatically run a build and push of container images to Azure, where they will be picked up and deployed in the Dev environment.
E2E tests for GHA have been set up to run in parallel. When they run, you will see duplicates of each job with a number denoting the instance.
Several environments are used for CI/CD:
The GEF test environment is hosted on the same URL as Portal v2. The following steps allow access to the GEF portal:
- Log in to Portal v2: https://tfs-xxx-fd.azurefd.net
- Manually navigate to the GEF URL to create a new GEF application: https://tfs-xxx-fd.azurefd.net/gef/mandatory-criteria
- Alternatively, visit an existing GEF deal by ID: http://tfs-xxx-fd.azurefd.net/gef/deals/1
All environments require a manual trigger to ensure stability, free from CI/CD interference, and ready for QA and user testing.
You can check the latest deployed commit by looking at the test/dev branch or visiting the health check endpoint, e.g., https://tfs-xxx-fd.azurefd.net/healthcheck.
After deployment, manually check if everything is working correctly by submitting a deal to TFM and verifying that the deal has data populated. Please note that there is currently an issue where the Number Generator Azure Function App may not work correctly after deployment. If you encounter this issue, restart the Number Generator Function App in the Azure Portal. This should resolve the problem, and newly submitted deals will work as expected. Make sure to wipe the `durable-functions-log` collection to remove any dead documents.
Refer to the `/utils/mock-data-loader` README for instructions.
A Microsoft Azure storage account is required for working with file uploads and Azure functions. You can create a storage account within the Azure Portal GUI:
Home > Services > Storage accounts > Create
Ensure that you select the UK South region and the dev/test resource group.
Each deal and facility submitted to TFM requires a unique ukefID, which is obtained from the APIM MDM Number Generator API. A background process is started to fetch the ID, and this process is managed by the Number Generator Azure Durable Function. The steps involved are as follows:
- A deal is created in Portal/GEF and submitted to TFM.
- TFM calls the Number Generator Azure Function, stores the status endpoint for the function call in the Durable Functions log, and returns a `ukefID` of "PENDING".
- The Number Generator Function attempts to generate the number a maximum of 5 times before declaring a failure.
- A scheduled job on `tfm-api` polls the status endpoint for each running job until a result is received.
- If the result is successful, the deal and facilities are updated with the generated IDs.
- If the result is an error, the entry in the Durable Functions log collection is updated with the error.
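The polling step above can be sketched as follows. Names like `getStatus` and the status values are illustrative of the Durable Functions status API, not the actual `tfm-api` code:

```javascript
// Poll a Durable Functions status endpoint until the orchestration completes,
// mirroring the scheduled job on tfm-api. `getStatus` stands in for an HTTP
// call to the stored status endpoint.
async function pollNumberGenerator(getStatus, maxAttempts = 5) {
  for (let attempt = 1; attempt <= maxAttempts; attempt += 1) {
    const { runtimeStatus, output } = await getStatus();
    if (runtimeStatus === 'Completed') return { ok: true, ukefId: output };
    if (runtimeStatus === 'Failed') return { ok: false, error: output };
    // Otherwise the orchestration is still running; try again on the next pass.
  }
  return { ok: false, error: 'No result after max attempts' };
}
```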
When a deal is submitted to TFM, many external API calls are currently made in the TFM submission controller. This process can be slow, resource-intensive, and prone to failures if one of the API calls encounters an issue. Retries are not currently configured.
To address these issues, the plan is to move all these API calls into background processes with automatic retries. This will enhance the user experience, make the process fail-safe, and improve the development workflow.
Email notifications are sent through MDM APIM using GOV.UK Notify at various stages, such as:
- When a deal status changes in Portal.
- When TFM acknowledges a deal submission.
- When a deal is approved or declined in TFM.
- When a TFM task is ready to start.
Each service that triggers an email has its own "send email" function, including Portal (BSS & GEF) and TFM. Each function requires the following:
- Template ID.
- Email address.
- Email variables (an object of properties/values to display in the template).
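As a rough illustration of the shape these three inputs take (the property names below are hypothetical; the real "send email" functions live in each service):

```javascript
// Illustrative only: bundle the three required inputs for a Notify email.
function buildSendEmailRequest(templateId, emailAddress, emailVariables) {
  return {
    templateId,
    sendToEmailAddress: emailAddress,
    emailVariables, // object of properties/values rendered into the template
  };
}
```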
Notify team members must be listed in the UKEF Notify "team members" page to receive emails and edit templates.
To test emails locally, replace the bank's email address associated with the user(s) involved in the deal creation and submission process with your own email address, provided that it's listed in the Notify team members. Specifically, update the following:

- `banks` MongoDB collection > `emails` array (typically, use the bank with ID 9).
- `users` MongoDB collection > `bank.emails` array (typically, use the user `BANK1_MAKER1`).
- TFM task emails are sent to the teams responsible for the tasks. To test these emails, replace the team's email in the `tfm-teams` MongoDB collection.
Notify currently has limited support for complex, conditional content, and it does not support iteration. For emails with lists of facilities, a workaround is used to generate a single string with HTML/Notify encodings that render lists in the Notify template. This single string is passed as a single email variable to Notify, as implemented in `notify-template-formatters.js` within the TFM API.
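The list workaround can be sketched like this. It is a simplified illustration, not the actual `notify-template-formatters.js` logic:

```javascript
// Notify templates cannot iterate, so a list of facilities is flattened into a
// single string using Notify's markdown-style list encoding, then passed to
// the template as one email variable.
const formatFacilitiesList = (facilities) =>
  facilities.map((f) => `* ${f.type}: ${f.ukefFacilityId}`).join('\n');
```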
After some time, Docker can consume a significant amount of hard drive space. To clean it up, run the following command:
docker system prune --volumes
Cookies are used for persistent sessions (login) and CSRF protection. They are configured with the following flags and names:

- Secure
- HTTP only
- SameSite set to `Strict`
- `__Host-` prefix (session cookie only)

Cookie names:

- Session: `__Host-dtfs-session`
- CSRF: `_csrf`
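Put together, these flags correspond to a session-cookie configuration along these lines. This is a hypothetical `express-session`-style sketch, not the repo's actual config:

```javascript
// Sketch of the cookie settings described above. The __Host- prefix requires
// Secure, Path=/, and no Domain attribute, which is why it applies to the
// session cookie.
const sessionCookieOptions = {
  name: '__Host-dtfs-session',
  cookie: {
    secure: true,       // Secure: only sent over HTTPS
    httpOnly: true,     // HTTP only: not readable from client-side JS
    sameSite: 'strict', // SameSite=Strict: not sent on cross-site requests
    path: '/',          // required for the __Host- prefix
  },
};
```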