diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 5028fcdb28..06d710eb20 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -2,11 +2,11 @@
We'd love for you to contribute to our source code and to make the Forest even better than it is today! Here are the guidelines we'd like you to follow:
- - [Question or Problem?](#question)
- - [Issues and Bugs](#issue)
- - [Feature Requests](#feature)
- - [Submission Guidelines](#submit)
- - [Further Info](#info)
+* [Question or Problem?](#question)
+* [Issues and Bugs](#issue)
+* [Feature Requests](#feature)
+* [Submission Guidelines](#submit)
+* [Further Info](#info)
## Got a Question or Problem?
@@ -27,7 +27,6 @@ You can request a new feature by submitting an issue to our [Github Repository][
* **Major Changes** that you wish to contribute to the project should be discussed first on our [Slack group][slack] so that we can better coordinate our efforts, prevent duplication of work, and help you to craft the change so that it is successfully accepted into the project.
* **Small Changes** can be crafted and submitted to the [Github Repository][github] as a Pull Request.
-
## Want a Doc Fix?
If you want to help improve the docs, it's a good idea to let others know what you're working on to minimize duplication of effort. Create a new issue (or comment on a related existing one) to let others know what you're working on.
@@ -37,6 +36,7 @@ For large fixes, please build and test the documentation before submitting the M
## Submission Guidelines
### Submitting an Issue
+
Before you submit your issue search the archive, maybe your question was already answered.
If your issue appears to be a bug, and hasn't been reported, open a new issue. Help us to maximize the effort we can spend fixing issues and adding new features, by not reporting duplicate issues. Providing the following information will increase the chances of your issue being dealt with quickly:
@@ -58,18 +58,18 @@ Before you submit your merge request consider the following guidelines:
* Make your changes in a new git branch:
```shell
- git checkout -b my-fix-branch master
+ git checkout -b my-fix-branch develop
```
* Create your patch, **including appropriate test cases**.
* Run the test suite and ensure that all tests pass.
-* Add a line in the CHANGELOG.md under Unreleased. This will be used form generating the release notes.
+* Install [pre-commit hooks](https://pre-commit.com/). The hooks run some basic checks and update the docs. The commit will run the hooks; you can also invoke them manually with `pre-commit run --all-files`.
* Commit your changes using a descriptive commit message.
```shell
git commit -a
```
+
Note: the optional commit `-a` command line option will automatically "add" and "rm" edited files.
* Build your changes locally to ensure all the tests pass:
@@ -79,7 +79,7 @@ Before you submit your merge request consider the following guidelines:
git push origin my-fix-branch
```
-In Github, send a pull request to original master branch: f.e. `terraform-aws-vpc:master`.
+In GitHub, send a pull request to the original develop branch, e.g. `terraform-aws-github-runner:develop`.
If we suggest changes, then:
* Make the required updates.
@@ -89,10 +89,10 @@ If we suggest changes, then:
If the PR gets too outdated we may ask you to rebase and force push to update the PR:
-```shell
-git rebase master -i
-git push origin my-fix-branch -f
-```
+ ```shell
+ git rebase develop -i
+ git push origin my-fix-branch -f
+ ```
_WARNING: Squashing or reverting commits and force-pushing thereafter may remove Github comments on code that were previously made by you or others in your commits. Avoid any form of rebasing unless necessary._
@@ -109,10 +109,10 @@ from the main (upstream) repository:
git push origin --delete my-fix-branch
```
-* Check out the master branch:
+* Check out the develop branch:
```shell
- git checkout master -f
+ git checkout develop -f
```
* Delete the local branch:
@@ -121,10 +121,10 @@ from the main (upstream) repository:
git branch -D my-fix-branch
```
-* Update your master with the latest upstream version:
+* Update your develop branch with the latest upstream version:
```shell
- git pull --ff upstream master
+ git pull --ff upstream develop
```
## Info
@@ -136,5 +136,5 @@ Use the badge to sign-up.
[![Slack](https://philips-software-slackin.now.sh/badge.svg)](https://philips-software-slackin.now.sh)
[contribute]: CONTRIBUTING.md
-[github]: https://github.com/philips-lam/terraform-aws-github-runner/issues
+[github]: https://github.com/philips-labs/terraform-aws-github-runner/issues
[slack]: https://philips-software.slack.com/home
diff --git a/examples/default/README.md b/examples/default/README.md
index ebf481abc4..9cf612ce17 100644
--- a/examples/default/README.md
+++ b/examples/default/README.md
@@ -1,10 +1,10 @@
# Action runners deployment default example
-This modules shows how to create GitHub action runners. Lambda release will be downloaded from GitHub.
+This module shows how to create GitHub action runners. The Lambda releases are downloaded from GitHub.
## Usages
-Steps for the full setup, such as creating a GitHub app can be found in the root module's [README](../../README.md). First download the Lambda releases from GitHub. Alternatively you can build the lambdas locally with Node or Docker, there is a simple build script in `/.ci/build.sh`. In the `main.tf` you can simple remove the location of the lambda zip files, the default location will work in this case.
+Steps for the full setup, such as creating a GitHub app, can be found in the root module's [README](../../README.md). First download the Lambda releases from GitHub. Alternatively you can build the lambdas locally with Node or Docker; there is a simple build script in `/.ci/build.sh`. In `main.tf` you can simply remove the location of the lambda zip files; the default location will work in this case.
> Ensure you have set the version in `lambdas-download/main.tf` for running the example. The version needs to be set to a GitHub release version, see https://github.com/philips-labs/terraform-aws-github-runner/releases
diff --git a/examples/permissions-boundary/README.md b/examples/permissions-boundary/README.md
index 940b83d725..60d21345f7 100644
--- a/examples/permissions-boundary/README.md
+++ b/examples/permissions-boundary/README.md
@@ -1,10 +1,10 @@
# Action runners deployed with permissions boundary
-This modules shows how to create GitHub action runners with permissions boundaries and paths used in role, policies, and instance profiles.
+This module shows how to create GitHub action runners with permissions boundaries and paths used in roles, policies, and instance profiles.
## Usages
-Steps for the full setup, such as creating a GitHub app can be find the module [README](../../README.md). First create the deploy role and boundary policies. This steps required an admin user.
+Steps for the full setup, such as creating a GitHub app, can be found in the module [README](../../README.md). First create the deploy role and boundary policies. These steps require an admin user.
> Ensure you have set the version in `lambdas-download/main.tf` for running the example. The version needs to be set to a GitHub release version, see https://github.com/philips-labs/terraform-aws-github-runner/releases
diff --git a/examples/permissions-boundary/setup/outpus.tf b/examples/permissions-boundary/setup/outputs.tf
similarity index 100%
rename from examples/permissions-boundary/setup/outpus.tf
rename to examples/permissions-boundary/setup/outputs.tf
diff --git a/examples/ubuntu/README.md b/examples/ubuntu/README.md
index 0672ddf7a6..211e60d2e7 100644
--- a/examples/ubuntu/README.md
+++ b/examples/ubuntu/README.md
@@ -1,10 +1,10 @@
# Action runners deployment ubuntu example
-This modules shows how to create GitHub action runners using an Ubuntu AMI. Lambda release will be downloaded from GitHub.
+This module shows how to create GitHub action runners using an Ubuntu AMI. The Lambda releases are downloaded from GitHub.
## Usages
-Steps for the full setup, such as creating a GitHub app can be found in the root module's [README](../../README.md). First download the Lambda releases from GitHub. Alternatively you can build the lambdas locally with Node or Docker, there is a simple build script in `/.ci/build.sh`. In the `main.tf` you can simple remove the location of the lambda zip files, the default location will work in this case.
+Steps for the full setup, such as creating a GitHub app, can be found in the root module's [README](../../README.md). First download the Lambda releases from GitHub. Alternatively you can build the lambdas locally with Node or Docker; there is a simple build script in `/.ci/build.sh`. In `main.tf` you can simply remove the location of the lambda zip files; the default location will work in this case.
> Ensure you have set the version in `lambdas-download/main.tf` for running the example. The version needs to be set to a GitHub release version, see https://github.com/philips-labs/terraform-aws-github-runner/releases
diff --git a/examples/ubuntu/providers.tf b/examples/ubuntu/providers.tf
index fd82b3cd00..5e0aedc8fe 100644
--- a/examples/ubuntu/providers.tf
+++ b/examples/ubuntu/providers.tf
@@ -13,7 +13,7 @@ terraform {
provider "aws" {
region = local.aws_region
- // If you use roles with specific permissons please add your role
+  // If you use roles with specific permissions, please add your role
// assume_role {
// role_arn = "arn:aws:iam::123456789012:role/MyAdminRole"
// }
diff --git a/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/lambda.ts b/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/lambda.ts
index d8c7ebc94b..85ea9622bd 100644
--- a/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/lambda.ts
+++ b/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/lambda.ts
@@ -2,6 +2,10 @@ import { handle } from './syncer/handler';
// eslint-disable-next-line
export const handler = async (event: any, context: any, callback: any): Promise<void> => {
- await handle();
- callback();
+ try {
+ await handle();
+ callback(null);
+ } catch (e) {
+ callback(e);
+ }
};
diff --git a/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.test.ts b/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.test.ts
index 34f923f952..6958429ca7 100644
--- a/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.test.ts
+++ b/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.test.ts
@@ -56,6 +56,30 @@ describe('Synchronize action distribution.', () => {
expect(mockS3.upload).toBeCalledTimes(0);
});
+ it('Distribution is up-to-date with latest release when there are no prereleases.', async () => {
+ process.env.GITHUB_RUNNER_ALLOW_PRERELEASE_BINARIES = 'true';
+ const releases = listReleases.slice(1);
+
+ mockOctokit.repos.listReleases.mockImplementation(() => ({
+ data: releases,
+ }));
+ mockS3.getObjectTagging.mockImplementation(() => {
+ return {
+ promise() {
+ return Promise.resolve({ TagSet: [{ Key: 'name', Value: 'actions-runner-linux-x64-2.272.0.tar.gz' }] });
+ },
+ };
+ });
+
+ await handle();
+ expect(mockOctokit.repos.listReleases).toBeCalledTimes(1);
+ expect(mockS3.getObjectTagging).toBeCalledWith({
+ Bucket: bucketName,
+ Key: bucketObjectKey,
+ });
+ expect(mockS3.upload).toBeCalledTimes(0);
+ });
+
it('Distribution is up-to-date with latest prerelease.', async () => {
process.env.GITHUB_RUNNER_ALLOW_PRERELEASE_BINARIES = 'true';
mockS3.getObjectTagging.mockImplementation(() => {
@@ -95,6 +119,32 @@ describe('Synchronize action distribution.', () => {
expect(s3JsonBody['Tagging']).toEqual('name=actions-runner-linux-x64-2.272.0.tar.gz');
});
+ it('Distribution should update to release if there are no pre-releases.', async () => {
+ process.env.GITHUB_RUNNER_ALLOW_PRERELEASE_BINARIES = 'true';
+ const releases = listReleases.slice(1);
+
+ mockOctokit.repos.listReleases.mockImplementation(() => ({
+ data: releases,
+ }));
+ mockS3.getObjectTagging.mockImplementation(() => {
+ return {
+ promise() {
+ return Promise.resolve({ TagSet: [{ Key: 'name', Value: 'actions-runner-linux-x64-0.tar.gz' }] });
+ },
+ };
+ });
+
+ await handle();
+ expect(mockOctokit.repos.listReleases).toBeCalledTimes(1);
+ expect(mockS3.getObjectTagging).toBeCalledWith({
+ Bucket: bucketName,
+ Key: bucketObjectKey,
+ });
+ expect(mockS3.upload).toBeCalledTimes(1);
+ const s3JsonBody = mockS3.upload.mock.calls[0][0];
+ expect(s3JsonBody['Tagging']).toEqual('name=actions-runner-linux-x64-2.272.0.tar.gz');
+ });
+
it('Distribution should update to prerelease.', async () => {
process.env.GITHUB_RUNNER_ALLOW_PRERELEASE_BINARIES = 'true';
mockS3.getObjectTagging.mockImplementation(() => {
diff --git a/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.ts b/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.ts
index 374cc09d1d..8bef7e1fd6 100644
--- a/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.ts
+++ b/modules/runner-binaries-syncer/lambdas/runner-binaries-syncer/src/syncer/handler.ts
@@ -49,7 +49,7 @@ async function getLinuxReleaseAsset(
const latestReleaseIndex = assetsList.data.findIndex((a) => a.prerelease === false);
let asset = undefined;
- if (fetchPrereleaseBinaries && latestPrereleaseIndex < latestReleaseIndex) {
+ if (fetchPrereleaseBinaries && latestPrereleaseIndex != -1 && latestPrereleaseIndex < latestReleaseIndex) {
asset = assetsList.data[latestPrereleaseIndex];
} else if (latestReleaseIndex != -1) {
asset = assetsList.data[latestReleaseIndex];
@@ -86,6 +86,7 @@ async function uploadToS3(s3: S3, cacheObject: CacheObject, actionRunnerReleaseA
});
}).catch((error) => {
console.error(`Exception: ${error}`);
+ throw error;
});
}
diff --git a/modules/runners/README.md b/modules/runners/README.md
index d98c614ffd..d8c4715e1d 100644
--- a/modules/runners/README.md
+++ b/modules/runners/README.md
@@ -6,11 +6,11 @@ This module creates resources required to run the GitHub action runner on AWS EC
### Action runners on EC2
-The action runners are created via a launch template, on launch template only the subnet needs to be provided. During launch the installation is handled via a user data script. The configuration is fetched from SSM parameter store.
+The action runners are created via a launch template; in the launch template only the subnet needs to be provided. During launch the installation is handled via a user data script. The configuration is fetched from SSM parameter store.
### Lambda scale up
-The scale up lambda is triggered by events on a SQS queue. Events on this queued are delayed, which will will give the workflow some time to start running on available runners. For each event the lambda will check the workflow is still queued and no other limits are reached. In that case the lambda will create a new EC2 instance. The lambda only needs to know which launch template to use and which subnets are available. From the available subnets a random one will be chosen. Once the instance is created the event is assumed as handled, and we assume the workflow wil start at some moment once the created instance is ready.
+The scale up lambda is triggered by events on an SQS queue. Events on this queue are delayed, which will give the workflow some time to start running on available runners. For each event the lambda will check if the workflow is still queued and no other limits are reached. In that case the lambda will create a new EC2 instance. The lambda only needs to know which launch template to use and which subnets are available. From the available subnets a random one will be chosen. Once the instance is created the event is considered handled, and we assume the workflow will start once the created instance is ready.
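The decision flow can be summarized with a minimal sketch; the helper functions below are hypothetical stand-ins passed in as parameters, not the module's actual API.

```typescript
// Minimal sketch of the scale-up decision flow, assuming hypothetical helpers.
interface QueuedJobEvent {
  repositoryOwner: string;
  repositoryName: string;
}

interface ScaleUpDeps {
  isJobStillQueued: (event: QueuedJobEvent) => Promise<boolean>;
  countCurrentRunners: (event: QueuedJobEvent) => Promise<number>;
  launchRunnerInstance: (launchTemplate: string, subnetId: string) => Promise<void>;
}

async function handleQueuedJob(
  event: QueuedJobEvent,
  launchTemplate: string,
  subnetIds: string[],
  maximumRunners: number,
  deps: ScaleUpDeps,
): Promise<void> {
  // The SQS delivery is delayed, so the workflow may already have started
  // on an available runner; in that case there is nothing to do.
  if (!(await deps.isJobStillQueued(event))) return;

  // Respect the configured maximum number of runners.
  if ((await deps.countCurrentRunners(event)) >= maximumRunners) return;

  // Pick a random subnet and launch an instance from the launch template;
  // the runner installs and registers itself via the user data script.
  const subnetId = subnetIds[Math.floor(Math.random() * subnetIds.length)];
  await deps.launchRunnerInstance(launchTemplate, subnetId);
}
```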
### Lambda scale down
diff --git a/modules/runners/lambdas/runners/src/lambda.ts b/modules/runners/lambdas/runners/src/lambda.ts
index 156ac39f26..fd24e09db9 100644
--- a/modules/runners/lambdas/runners/src/lambda.ts
+++ b/modules/runners/lambdas/runners/src/lambda.ts
@@ -18,7 +18,7 @@ export const scaleUp = async (event: SQSEvent, context: Context, callback: any):
export const scaleDown = async (event: ScheduledEvent, context: Context, callback: any): Promise<void> => {
try {
- scaleDownAction();
+ await scaleDownAction();
callback(null);
} catch (e) {
console.error(e);
diff --git a/modules/runners/lambdas/runners/src/scale-runners/runners.test.ts b/modules/runners/lambdas/runners/src/scale-runners/runners.test.ts
index 13e4f8e428..d5916e88c3 100644
--- a/modules/runners/lambdas/runners/src/scale-runners/runners.test.ts
+++ b/modules/runners/lambdas/runners/src/scale-runners/runners.test.ts
@@ -1,6 +1,6 @@
-import { listRunners, createRunner } from './runners';
+import { listRunners, createRunner, terminateRunner, RunnerInfo } from './runners';
-const mockEC2 = { describeInstances: jest.fn(), runInstances: jest.fn() };
+const mockEC2 = { describeInstances: jest.fn(), runInstances: jest.fn(), terminateInstances: jest.fn() };
const mockSSM = { putParameter: jest.fn() };
jest.mock('aws-sdk', () => ({
EC2: jest.fn().mockImplementation(() => mockEC2),
@@ -102,6 +102,23 @@ describe('list instances', () => {
});
});
+describe('terminate runner', () => {
+ const mockTerminateInstances = { promise: jest.fn() };
+ beforeEach(() => {
+ jest.clearAllMocks();
+ mockEC2.terminateInstances.mockImplementation(() => mockTerminateInstances);
+ mockTerminateInstances.promise.mockReturnThis();
+ });
+ it('calls terminate instances with the right instance ids', async () => {
+ const runner: RunnerInfo = {
+ instanceId: 'instance-2',
+ };
+ await terminateRunner(runner);
+
+ expect(mockEC2.terminateInstances).toBeCalledWith({ InstanceIds: [runner.instanceId] });
+ });
+});
+
describe('create runner', () => {
const mockRunInstances = { promise: jest.fn() };
const mockPutParameter = { promise: jest.fn() };
diff --git a/modules/runners/lambdas/runners/src/scale-runners/runners.ts b/modules/runners/lambdas/runners/src/scale-runners/runners.ts
index 082836dbcd..bbad2a498e 100644
--- a/modules/runners/lambdas/runners/src/scale-runners/runners.ts
+++ b/modules/runners/lambdas/runners/src/scale-runners/runners.ts
@@ -2,15 +2,15 @@ import { EC2, SSM } from 'aws-sdk';
export interface RunnerInfo {
instanceId: string;
- launchTime: Date | undefined;
- repo: string | undefined;
- org: string | undefined;
+ launchTime?: Date;
+ repo?: string;
+ org?: string;
}
export interface ListRunnerFilters {
runnerType?: 'Org' | 'Repo';
runnerOwner?: string;
- environment: string | undefined;
+ environment?: string;
}
export interface RunnerInputParameters {
diff --git a/modules/runners/lambdas/runners/src/scale-runners/scale-down-config.test.ts b/modules/runners/lambdas/runners/src/scale-runners/scale-down-config.test.ts
index 56bda839de..fbdbe42194 100644
--- a/modules/runners/lambdas/runners/src/scale-runners/scale-down-config.test.ts
+++ b/modules/runners/lambdas/runners/src/scale-runners/scale-down-config.test.ts
@@ -6,15 +6,11 @@ const DEFAULT_IDLE_COUNT = 1;
const now = moment.tz(new Date(), 'America/Los_Angeles');
function getConfig(cronTabs: string[]): ScalingDownConfigList {
- const result: ScalingDownConfigList = [];
- for (const cron of cronTabs) {
- result.push({
- cron: cron,
- idleCount: DEFAULT_IDLE_COUNT,
- timeZone: DEFAULT_TIMEZONE,
- });
- }
- return result;
+ return cronTabs.map((cron) => ({
+ cron: cron,
+ idleCount: DEFAULT_IDLE_COUNT,
+ timeZone: DEFAULT_TIMEZONE,
+ }));
}
describe('scaleDownConfig', () => {
diff --git a/modules/runners/lambdas/runners/src/scale-runners/scale-down.test.ts b/modules/runners/lambdas/runners/src/scale-runners/scale-down.test.ts
index 6d956d735b..cd1ac53807 100644
--- a/modules/runners/lambdas/runners/src/scale-runners/scale-down.test.ts
+++ b/modules/runners/lambdas/runners/src/scale-runners/scale-down.test.ts
@@ -130,7 +130,7 @@ describe('scaleDown', () => {
},
}));
- mockOctokit.paginate.mockImplementationOnce(() => {
+ mockOctokit.paginate.mockImplementation(() => {
return DEFAULT_REGISTERED_RUNNERS;
});
@@ -171,8 +171,8 @@ describe('scaleDown', () => {
expect(listRunners).toBeCalledWith({
environment: environment,
});
- expect(terminateRunner).not;
- expect(mockOctokit.apps.getRepoInstallation).not;
+ expect(terminateRunner).not.toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).not.toBeCalled();
});
it('No runners for org.', async () => {
@@ -181,56 +181,8 @@ describe('scaleDown', () => {
expect(listRunners).toBeCalledWith({
environment: environment,
});
- expect(terminateRunner).not;
- expect(mockOctokit.apps.getRepoInstallation).not;
- });
- });
-
- describe('on repo level', () => {
- beforeAll(() => {
- process.env.ENABLE_ORGANIZATION_RUNNERS = 'false';
- process.env.SCALE_DOWN_CONFIG = '[]';
- const mockListRunners = mocked(listRunners);
- mockListRunners.mockImplementation(async () => {
- return DEFAULT_RUNNERS;
- });
- });
-
- it('Terminate 3 of 5 runners for repo.', async () => {
- await scaleDown();
- expect(listRunners).toBeCalledWith({
- environment: environment,
- });
-
- expect(mockOctokit.apps.getRepoInstallation).toBeCalled();
- expect(terminateRunner).toBeCalledTimes(3);
- for (const toTerminate of DEFAULT_RUNNERS_TO_BE_REMOVED) {
- expect(terminateRunner).toHaveBeenCalledWith(toTerminate);
- }
- });
- });
-
- describe('on org level', () => {
- beforeAll(() => {
- process.env.ENABLE_ORGANIZATION_RUNNERS = 'true';
- process.env.SCALE_DOWN_CONFIG = '[]';
- const mockListRunners = mocked(listRunners);
- mockListRunners.mockImplementation(async () => {
- return DEFAULT_RUNNERS;
- });
- });
-
- it('Terminate 3 of 5 runners for org.', async () => {
- await scaleDown();
- expect(listRunners).toBeCalledWith({
- environment: environment,
- });
-
- expect(mockOctokit.apps.getOrgInstallation).toBeCalled();
- expect(terminateRunner).toBeCalledTimes(3);
- for (const toTerminate of DEFAULT_RUNNERS_TO_BE_REMOVED) {
- expect(terminateRunner).toHaveBeenCalledWith(toTerminate);
- }
+ expect(terminateRunner).not.toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).not.toBeCalled();
});
});
@@ -299,8 +251,8 @@ describe('scaleDown', () => {
});
it('Terminate 1 of runners for org.', async () => {
- await scaleDown();
process.env.ENABLE_ORGANIZATION_RUNNERS = 'true';
+ await scaleDown();
expect(listRunners).toBeCalledWith({
environment: environment,
@@ -314,14 +266,14 @@ describe('scaleDown', () => {
});
it('Terminate 1 of runners for repo.', async () => {
- await scaleDown();
process.env.ENABLE_ORGANIZATION_RUNNERS = 'false';
+ await scaleDown();
expect(listRunners).toBeCalledWith({
environment: environment,
});
- expect(mockOctokit.apps.getOrgInstallation).toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).toBeCalled();
expect(terminateRunner).toBeCalledTimes(1);
for (const toTerminate of RUNNERS_TO_BE_REMOVED_WITH_AUTO_SCALING_CONFIG) {
expect(terminateRunner).toHaveBeenCalledWith(toTerminate);
@@ -352,7 +304,7 @@ describe('scaleDown ghes', () => {
},
}));
- mockOctokit.paginate.mockImplementationOnce(() => {
+ mockOctokit.paginate.mockImplementation(() => {
return DEFAULT_REGISTERED_RUNNERS;
});
@@ -386,8 +338,8 @@ describe('scaleDown ghes', () => {
expect(listRunners).toBeCalledWith({
environment: environment,
});
- expect(terminateRunner).not;
- expect(mockOctokit.apps.getRepoInstallation).not;
+ expect(terminateRunner).not.toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).not.toBeCalled();
});
it('No runners for org.', async () => {
@@ -396,8 +348,8 @@ describe('scaleDown ghes', () => {
expect(listRunners).toBeCalledWith({
environment: environment,
});
- expect(terminateRunner).not;
- expect(mockOctokit.apps.getRepoInstallation).not;
+ expect(terminateRunner).not.toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).not.toBeCalled();
});
});
@@ -514,8 +466,8 @@ describe('scaleDown ghes', () => {
});
it('Terminate 1 of runners for org.', async () => {
- await scaleDown();
process.env.ENABLE_ORGANIZATION_RUNNERS = 'true';
+ await scaleDown();
expect(listRunners).toBeCalledWith({
environment: environment,
@@ -529,14 +481,14 @@ describe('scaleDown ghes', () => {
});
it('Terminate 1 of runners for repo.', async () => {
- await scaleDown();
process.env.ENABLE_ORGANIZATION_RUNNERS = 'false';
+ await scaleDown();
expect(listRunners).toBeCalledWith({
environment: environment,
});
- expect(mockOctokit.apps.getOrgInstallation).toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).toBeCalled();
expect(terminateRunner).toBeCalledTimes(1);
for (const toTerminate of RUNNERS_TO_BE_REMOVED_WITH_AUTO_SCALING_CONFIG) {
expect(terminateRunner).toHaveBeenCalledWith(toTerminate);
diff --git a/modules/runners/lambdas/runners/src/scale-runners/scale-down.ts b/modules/runners/lambdas/runners/src/scale-runners/scale-down.ts
index cd8eb20401..41b0226b1d 100644
--- a/modules/runners/lambdas/runners/src/scale-runners/scale-down.ts
+++ b/modules/runners/lambdas/runners/src/scale-runners/scale-down.ts
@@ -20,13 +20,6 @@ function createGitHubClientForRunnerFactory(): (runner: RunnerInfo, orgLevel: bo
const cache: Map<string, Octokit> = new Map();
return async (runner: RunnerInfo, orgLevel: boolean) => {
- const ghesBaseUrl = process.env.GHES_URL;
- let ghesApiUrl = '';
- if (ghesBaseUrl) {
- ghesApiUrl = `${ghesBaseUrl}/api/v3`;
- }
- const ghAuth = await createGithubAuth(undefined, 'app', ghesApiUrl);
- const githubClient = await createOctoClient(ghAuth.token, ghesApiUrl);
const repo = getRepo(runner, orgLevel);
const key = orgLevel ? repo.repoOwner : repo.repoOwner + repo.repoName;
const cachedOctokit = cache.get(key);
@@ -37,6 +30,13 @@ function createGitHubClientForRunnerFactory(): (runner: RunnerInfo, orgLevel: bo
}
console.debug(`[createGitHubClientForRunner] Cache miss for ${key}`);
+ const ghesBaseUrl = process.env.GHES_URL as string;
+ let ghesApiUrl = '';
+ if (ghesBaseUrl) {
+ ghesApiUrl = `${ghesBaseUrl}/api/v3`;
+ }
+ const ghAuth = await createGithubAuth(undefined, 'app', ghesApiUrl);
+ const githubClient = await createOctoClient(ghAuth.token, ghesApiUrl);
const installationId = orgLevel
? (
await githubClient.apps.getOrgInstallation({
@@ -163,7 +163,6 @@ export async function scaleDown(): Promise {
const githubAppClient = await createGitHubClientForRunner(ec2runner, enableOrgLevel);
- const repo = getRepo(ec2runner, enableOrgLevel);
const ghRunners = await listGithubRunners(githubAppClient, ec2runner, enableOrgLevel);
let orphanEc2Runner = true;
for (const ghRunner of ghRunners) {
@@ -174,6 +173,7 @@ export async function scaleDown(): Promise {
idleCounter--;
console.debug(`Runner '${ec2runner.instanceId}' will kept idle.`);
} else {
+ const repo = getRepo(ec2runner, enableOrgLevel);
await removeRunner(ec2runner, ghRunner.id, repo, enableOrgLevel, githubAppClient);
}
}
diff --git a/modules/runners/lambdas/runners/src/scale-runners/scale-up.test.ts b/modules/runners/lambdas/runners/src/scale-runners/scale-up.test.ts
index 062af95daa..6f82aedcdb 100644
--- a/modules/runners/lambdas/runners/src/scale-runners/scale-up.test.ts
+++ b/modules/runners/lambdas/runners/src/scale-runners/scale-up.test.ts
@@ -168,14 +168,15 @@ describe('scaleUp with GHES', () => {
process.env.RUNNERS_MAXIMUM_COUNT = '1';
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
expect(mockOctokit.actions.createRegistrationTokenForOrg).not.toBeCalled();
+ expect(mockOctokit.actions.createRegistrationTokenForRepo).not.toBeCalled();
});
it('creates a token when maximum runners has not been reached', async () => {
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
- expect(mockOctokit.actions.createRegistrationTokenForOrg).toBeCalled();
expect(mockOctokit.actions.createRegistrationTokenForOrg).toBeCalledWith({
org: TEST_DATA.repositoryOwner,
});
+ expect(mockOctokit.actions.createRegistrationTokenForRepo).not.toBeCalled();
});
it('does not retrieve installation id if already set', async () => {
@@ -267,11 +268,13 @@ describe('scaleUp with GHES', () => {
it('does not create a token when maximum runners has been reached', async () => {
process.env.RUNNERS_MAXIMUM_COUNT = '1';
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
+ expect(mockOctokit.actions.createRegistrationTokenForOrg).not.toBeCalled();
expect(mockOctokit.actions.createRegistrationTokenForRepo).not.toBeCalled();
});
-
+
it('creates a token when maximum runners has not been reached', async () => {
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
+ expect(mockOctokit.actions.createRegistrationTokenForOrg).not.toBeCalled();
expect(mockOctokit.actions.createRegistrationTokenForRepo).toBeCalledWith({
owner: TEST_DATA.repositoryOwner,
repo: TEST_DATA.repositoryName,
@@ -303,6 +306,10 @@ describe('scaleUp with GHES', () => {
const spy = jest.spyOn(ghAuth, 'createGithubAuth');
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA_WITH_ZERO_INSTALL_ID);
expect(mockOctokit.apps.getOrgInstallation).not.toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).toBeCalledWith({
+ owner: TEST_DATA.repositoryOwner,
+ repo: TEST_DATA.repositoryName,
+ });
expect(spy).toHaveBeenNthCalledWith(1, undefined, 'app', 'https://github.enterprise.something/api/v3');
expect(spy).toHaveBeenNthCalledWith(
2,
@@ -316,8 +323,6 @@ describe('scaleUp with GHES', () => {
process.env.RUNNER_EXTRA_LABELS = 'label1,label2';
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
expectedRunnerParams.runnerServiceConfig = expectedRunnerParams.runnerServiceConfig + `--labels label1,label2`;
- expectedRunnerParams.runnerType = 'Repo';
- expectedRunnerParams.runnerOwner = `${TEST_DATA.repositoryOwner}/${TEST_DATA.repositoryName}`;
expect(createRunner).toBeCalledWith(expectedRunnerParams, 'lt-1');
});
@@ -325,11 +330,7 @@ describe('scaleUp with GHES', () => {
process.env.RUNNER_EXTRA_LABELS = 'label1,label2';
process.env.RUNNER_GROUP_NAME = 'TEST_GROUP_IGNORED';
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
- expectedRunnerParams.runnerServiceConfig =
- `--url ` +
- `https://github.enterprise.something/${TEST_DATA.repositoryOwner}/${TEST_DATA.repositoryName} ` +
- `--token 1234abcd --labels label1,label2`;
- expectedRunnerParams.runnerOwner = `${TEST_DATA.repositoryOwner}/${TEST_DATA.repositoryName}`;
+ expectedRunnerParams.runnerServiceConfig = expectedRunnerParams.runnerServiceConfig + `--labels label1,label2`;
expect(createRunner).toBeCalledWith(expectedRunnerParams, 'lt-1');
});
@@ -341,6 +342,16 @@ describe('scaleUp with GHES', () => {
expect(createRunner).toHaveBeenNthCalledWith(1, expectedRunnerParams, 'lt-1');
expect(createRunner).toHaveBeenNthCalledWith(2, expectedRunnerParams, 'lt-2');
});
+
+ it('all launch templates fail', async () => {
+ const mockCreateRunners = mocked(createRunner);
+ mockCreateRunners.mockRejectedValue(new Error('All launch templates failed'));
+ await expect(scaleUpModule.scaleUp('aws:sqs', TEST_DATA)).rejects.toThrow('All launch templates failed');
+ expect(createRunner).toBeCalledTimes(2);
+ expect(createRunner).toHaveBeenNthCalledWith(1, expectedRunnerParams, 'lt-1');
+ expect(createRunner).toHaveBeenNthCalledWith(2, expectedRunnerParams, 'lt-2');
+ mockCreateRunners.mockReset();
+ });
});
});
@@ -370,6 +381,7 @@ describe('scaleUp with public GH', () => {
it('retrieves installation id if not set', async () => {
const spy = jest.spyOn(ghAuth, 'createGithubAuth');
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA_WITH_ZERO_INSTALL_ID);
+ expect(mockOctokit.apps.getOrgInstallation).toBeCalled();
expect(mockOctokit.apps.getRepoInstallation).not.toBeCalled();
expect(spy).toHaveBeenNthCalledWith(1, undefined, 'app', '');
expect(spy).toHaveBeenNthCalledWith(2, TEST_DATA.installationId, 'installation', '');
@@ -404,14 +416,15 @@ describe('scaleUp with public GH', () => {
process.env.RUNNERS_MAXIMUM_COUNT = '1';
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
expect(mockOctokit.actions.createRegistrationTokenForOrg).not.toBeCalled();
+ expect(mockOctokit.actions.createRegistrationTokenForRepo).not.toBeCalled();
});
it('creates a token when maximum runners has not been reached', async () => {
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
- expect(mockOctokit.actions.createRegistrationTokenForOrg).toBeCalled();
expect(mockOctokit.actions.createRegistrationTokenForOrg).toBeCalledWith({
org: TEST_DATA.repositoryOwner,
});
+ expect(mockOctokit.actions.createRegistrationTokenForRepo).not.toBeCalled();
});
it('creates a runner with correct config', async () => {
@@ -465,11 +478,13 @@ describe('scaleUp with public GH', () => {
it('does not create a token when maximum runners has been reached', async () => {
process.env.RUNNERS_MAXIMUM_COUNT = '1';
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
+ expect(mockOctokit.actions.createRegistrationTokenForOrg).not.toBeCalled();
expect(mockOctokit.actions.createRegistrationTokenForRepo).not.toBeCalled();
});
-
+
it('creates a token when maximum runners has not been reached', async () => {
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA);
+ expect(mockOctokit.actions.createRegistrationTokenForOrg).not.toBeCalled();
expect(mockOctokit.actions.createRegistrationTokenForRepo).toBeCalledWith({
owner: TEST_DATA.repositoryOwner,
repo: TEST_DATA.repositoryName,
@@ -488,6 +503,7 @@ describe('scaleUp with public GH', () => {
const spy = jest.spyOn(ghAuth, 'createGithubAuth');
await scaleUpModule.scaleUp('aws:sqs', TEST_DATA_WITH_ZERO_INSTALL_ID);
expect(mockOctokit.apps.getOrgInstallation).not.toBeCalled();
+ expect(mockOctokit.apps.getRepoInstallation).toBeCalled();
expect(spy).toHaveBeenNthCalledWith(1, undefined, 'app', '');
expect(spy).toHaveBeenNthCalledWith(2, TEST_DATA.installationId, 'installation', '');
});
diff --git a/modules/runners/variables.tf b/modules/runners/variables.tf
index ecc71dc1ee..99411ddf99 100644
--- a/modules/runners/variables.tf
+++ b/modules/runners/variables.tf
@@ -14,7 +14,7 @@ variable "subnet_ids" {
}
variable "overrides" {
- description = "This maps provides the possibility to override some defaults. The following attributes are supported: `name_sg` overwrite the `Name` tag for all security groups created by this module. `name_runner_agent_instance` override the `Name` tag for the ec2 instance defined in the auto launch configuration. `name_docker_machine_runners` override the `Name` tag spot instances created by the runner agent."
+  description = "This map provides the possibility to override some defaults. The following attributes are supported: `name_sg` overrides the `Name` tag for all security groups created by this module. `name_runner_agent_instance` overrides the `Name` tag for the ec2 instance defined in the auto launch configuration. `name_docker_machine_runners` overrides the `Name` tag of spot instances created by the runner agent."
type = map(string)
default = {
@@ -70,7 +70,7 @@ variable "instance_types" {
}
variable "ami_filter" {
- description = "List of maps used to create the AMI filter for the action runner AMI."
+ description = "Map of lists used to create the AMI filter for the action runner AMI."
type = map(list(string))
default = {
@@ -91,13 +91,13 @@ variable "userdata_template" {
}
variable "userdata_pre_install" {
- description = "User-data script snippet to insert before GitHub acton runner install"
+ description = "User-data script snippet to insert before GitHub action runner install"
type = string
default = ""
}
variable "userdata_post_install" {
- description = "User-data script snippet to insert after GitHub acton runner install"
+ description = "User-data script snippet to insert after GitHub action runner install"
type = string
default = ""
}
@@ -172,7 +172,7 @@ variable "role_permissions_boundary" {
}
variable "role_path" {
- description = "The path that will be added to the role, if not set the environment name will be used."
+ description = "The path that will be added to the role; if not set, the environment name will be used."
type = string
default = null
}
@@ -218,7 +218,7 @@ variable "logging_retention_in_days" {
}
variable "enable_ssm_on_runners" {
- description = "Enable to allow access the runner instances for debugging purposes via SSM. Note that this adds additional permissions to the runner instances."
+ description = "Enable to allow access to the runner instances for debugging purposes via SSM. Note that this adds additional permissions to the runner instances."
type = bool
}
@@ -238,7 +238,7 @@ variable "runners_lambda_s3_object_version" {
}
variable "create_service_linked_role_spot" {
- description = "(optional) create the serviced linked role for spot instances that is required by the scale-up lambda."
+ description = "(optional) create the service linked role for spot instances that is required by the scale-up lambda."
type = bool
default = false
}
@@ -262,7 +262,7 @@ variable "cloudwatch_config" {
}
variable "runner_log_files" {
- description = "(optional) List of logfiles to send to cloudwatch, will only be used if `enable_cloudwatch_agent` is set to true. Object description: `log_group_name`: Name of the log group, `prefix_log_group`: If true, the log group name will be prefixed with `/github-self-hosted-runners/`, `file_path`: path to the log file, `log_stream_name`: name of the log stream."
+  description = "(optional) List of logfiles to send to CloudWatch; will only be used if `enable_cloudwatch_agent` is set to true. Object description: `log_group_name`: Name of the log group, `prefix_log_group`: If true, the log group name will be prefixed with `/github-self-hosted-runners/`, `file_path`: path to the log file, `log_stream_name`: name of the log stream."
type = list(object({
log_group_name = string
prefix_log_group = bool
diff --git a/modules/webhook/lambdas/webhook/src/lambda.ts b/modules/webhook/lambdas/webhook/src/lambda.ts
index 4227ff0ef2..29a96dd0bd 100644
--- a/modules/webhook/lambdas/webhook/src/lambda.ts
+++ b/modules/webhook/lambdas/webhook/src/lambda.ts
@@ -3,8 +3,12 @@ import { APIGatewayEvent, Context } from 'aws-lambda';
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export const githubWebhook = async (event: APIGatewayEvent, context: Context, callback: any): Promise<void> => {
- const statusCode = await handle(event.headers, event.body);
- callback(null, {
- statusCode: statusCode,
- });
+ try {
+ const statusCode = await handle(event.headers, event.body);
+ callback(null, {
+ statusCode: statusCode,
+ });
+ } catch (e) {
+ callback(e);
+ }
};
diff --git a/modules/webhook/lambdas/webhook/src/sqs/index.ts b/modules/webhook/lambdas/webhook/src/sqs/index.ts
index 2d83646de4..1a6e75e808 100644
--- a/modules/webhook/lambdas/webhook/src/sqs/index.ts
+++ b/modules/webhook/lambdas/webhook/src/sqs/index.ts
@@ -1,5 +1,4 @@
-import { SQS } from 'aws-sdk';
-import AWS from 'aws-sdk';
+import AWS, { SQS } from 'aws-sdk';
AWS.config.update({
region: process.env.AWS_REGION,
diff --git a/modules/webhook/variables.tf b/modules/webhook/variables.tf
index 0e48c3a36f..d9187f9aa4 100644
--- a/modules/webhook/variables.tf
+++ b/modules/webhook/variables.tf
@@ -45,7 +45,7 @@ variable "role_permissions_boundary" {
}
variable "role_path" {
- description = "The path that will be added to the role, if not set the environment name will be used."
+ description = "The path that will be added to the role; if not set, the environment name will be used."
type = string
default = null
}