
(cdk deploy): cdk deploy fails when shared assets are deployed to different accounts because Docker assets are not built locally for all targets #26446

Open
igirardi opened this issue Jul 20, 2023 · 4 comments

Describe the bug

What is the issue?

In version 2.87.0 of the AWS CDK CLI, Docker assets are not properly built locally and the push of the assets to Amazon ECR fails. This happens when Docker assets are shared between stacks that are deployed to different accounts.
Deployment fails with the following error:

 ❌ Deployment failed: Error: Failed to publish asset ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1
    at Deployments.publishSingleAsset (.../container-deploy-issue-typescript/node_modules/aws-cdk/lib/index.js:415:11819)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Object.publishAsset (../container-deploy-issue-typescript/node_modules/aws-cdk/lib/index.js:415:151100)
    at async .../container-deploy-issue-typescript/node_modules/aws-cdk/lib/index.js:415:137075

Failed to publish asset ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1

Deployment with version 2.79.1 works fine. These findings are related to a change in asset deployment behavior in recent AWS CDK CLI versions.

What is the impact?

If a CDK app contains two stacks that are deployed to different accounts and use the same asset, the cdk deploy command fails because the local asset is not built for all target accounts.

Who is affected?

Users of recent CDK versions, for example version 2.87.0.

How do I resolve this?

The workaround with the recent AWS CDK CLI is to force a separate asset build per target by using the extraHash functionality. This workaround can be complex for customers deploying several stacks that share assets. A good choice for the extraHash string is the target account id, which avoids unnecessary rebuilds of assets deployed to the same account, as sketched below.
Customers need to explicitly force the asset build with extra properties, while this was the default behavior with older AWS CDK CLI versions.
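
A minimal sketch of that idea (my assumption of how it could be wired up, not code from this issue): derive extraHash from the stack's target account so that each account gets its own local image build, while stacks deployed to the same account still share a single build.

import * as path from 'path';
import * as cdk from 'aws-cdk-lib';
import { DockerImageCode } from 'aws-cdk-lib/aws-lambda';

// Hypothetical helper: use the stack's target account id as extraHash so the
// asset hash differs per account and the CLI builds and pushes it for each one.
function imageCodeForStack(stack: cdk.Stack): DockerImageCode {
  return DockerImageCode.fromImageAsset(path.join(__dirname, '..'), {
    extraHash: stack.account,            // target account id
    invalidation: { extraHash: true },   // include extraHash in asset invalidation
  });
}

// Usage inside a stack's constructor:
//   new DockerImageFunction(this, 'Function', { code: imageCodeForStack(this) });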

Additional information

This ticket has been opened to document the issue for customers hitting similar problems and to guide them toward a proper workaround. It would be nice to restore the previous AWS CDK behavior, or to provide an explanation of the reasons for this change.

Expected Behavior

Running cdk deploy should create/update CloudFormation stacks.

Current Behavior

The following is the output from a cdk deploy run that should have triggered cdk to create the app's stacks in CloudFormation.

npx cdk deploy --all --require-approval never


✨  Synthesis time: 16.82s

StackA:  start: Building c829d2f8b374d8d213cbaa23cccd3c80ba5f4316350325efa25e2d793bdb1c0f:222222222222-us-east-1
StackA:  success: Built c829d2f8b374d8d213cbaa23cccd3c80ba5f4316350325efa25e2d793bdb1c0f:222222222222-us-east-1
StackA:  start: Publishing c829d2f8b374d8d213cbaa23cccd3c80ba5f4316350325efa25e2d793bdb1c0f:222222222222-us-east-1
StackA:  start: Building cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376:222222222222-us-east-1
StackA:  success: Published c829d2f8b374d8d213cbaa23cccd3c80ba5f4316350325efa25e2d793bdb1c0f:222222222222-us-east-1
#1 [internal] load build definition from Dockerfile
#1 sha256:f7b2b52a7ce80056b73686850c29a73e53fedfce0c9b1da09b3be3659fc88ab1
#1 transferring dockerfile: 114B 0.0s done
#1 DONE 0.0s

#2 [internal] load .dockerignore
#2 sha256:50bfcb487b58a21b1d1c4c2597aa42bf590ef4184b125e86d089e63b100fb2b0
#2 transferring context: 2B done
#2 DONE 0.0s

#3 [internal] load metadata for public.ecr.aws/lambda/nodejs:18
#3 sha256:77a2ab281b7f878597f9b30e118a6922165ee5de0c42562d46739318263a5435
#3 DONE 2.3s

#5 [internal] load build context
#5 sha256:5c08f97c5ec8d24149dfe0d0da978ce8298b39f43aa85954939c8934ec9ccc1b
#5 transferring context: 105B done
#5 DONE 0.0s

#4 [1/2] FROM public.ecr.aws/lambda/nodejs:18@sha256:7e4c6440bff92dcd89f2d54791cbe5242ff67663759d6be18b608e21093e2e4e
#4 sha256:8189d61f6cdec23f30e03df1102de51df6cc76f370f59c696995ee78020d03fa
#4 resolve public.ecr.aws/lambda/nodejs:18@sha256:7e4c6440bff92dcd89f2d54791cbe5242ff67663759d6be18b608e21093e2e4e 0.0s done
#4 sha256:31cbd26f980c7938bd7fb2bc3c6054b47e04ccdf6ff7df00b09efa5cef53062b 1.58kB / 1.58kB done
#4 sha256:fd13fe3cc20a63345c037bc33e36dfe4ae0c2fd2f33514dd0c88820cb6c82bbf 3.00kB / 3.00kB done
#4 sha256:8f5a40c0a4dcf5e0b17b775694cb809745581fc5b21e19109f2c5ae953620f08 0B / 417B 0.2s
#4 sha256:7e4c6440bff92dcd89f2d54791cbe5242ff67663759d6be18b608e21093e2e4e 772B / 772B done
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 0B / 104.79MB 0.2s
#4 sha256:6f5513fa990158392d544d4a3321d57f449bf85bc1b4d9a4968393cdf6efb09f 0B / 87.36kB 0.2s
#4 sha256:8f5a40c0a4dcf5e0b17b775694cb809745581fc5b21e19109f2c5ae953620f08 417B / 417B 0.5s done
#4 sha256:79a77e7c1be9a2c4f77ead609e8d8b7162377bb6905b2a244c7964d74d8c8762 0B / 2.51MB 0.5s
#4 sha256:6f5513fa990158392d544d4a3321d57f449bf85bc1b4d9a4968393cdf6efb09f 87.36kB / 87.36kB 0.6s done
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 0B / 49.83MB 0.7s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 11.53MB / 104.79MB 0.9s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 16.78MB / 104.79MB 1.1s
#4 sha256:79a77e7c1be9a2c4f77ead609e8d8b7162377bb6905b2a244c7964d74d8c8762 1.05MB / 2.51MB 1.1s
#4 sha256:79a77e7c1be9a2c4f77ead609e8d8b7162377bb6905b2a244c7964d74d8c8762 2.10MB / 2.51MB 1.2s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 24.12MB / 104.79MB 1.3s
#4 sha256:79a77e7c1be9a2c4f77ead609e8d8b7162377bb6905b2a244c7964d74d8c8762 2.51MB / 2.51MB 1.2s done
#4 sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 0B / 21.65MB 1.3s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 29.36MB / 104.79MB 1.5s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 38.80MB / 104.79MB 1.8s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 3.15MB / 49.83MB 1.8s
#4 sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 3.15MB / 21.65MB 1.8s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 46.14MB / 104.79MB 2.1s
#4 sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 7.34MB / 21.65MB 2.1s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 6.29MB / 49.83MB 2.2s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 52.24MB / 104.79MB 2.3s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 9.44MB / 49.83MB 2.5s
#4 sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 10.49MB / 21.65MB 2.5s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 62.91MB / 104.79MB 2.8s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 11.96MB / 49.83MB 2.8s
#4 sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 15.73MB / 21.65MB 2.8s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 71.30MB / 104.79MB 3.2s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 15.73MB / 49.83MB 3.2s
#4 sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 18.87MB / 21.65MB 3.2s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 79.69MB / 104.79MB 3.5s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 18.87MB / 49.83MB 3.5s
#4 sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 21.65MB / 21.65MB 3.4s done
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 91.23MB / 104.79MB 3.8s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 22.02MB / 49.83MB 3.8s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 101.71MB / 104.79MB 4.2s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 25.17MB / 49.83MB 4.2s
#4 sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 104.79MB / 104.79MB 4.4s done
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 28.31MB / 49.83MB 4.5s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 31.46MB / 49.83MB 4.7s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 35.65MB / 49.83MB 5.0s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 38.80MB / 49.83MB 5.2s
#4 extracting sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 45.09MB / 49.83MB 5.5s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 49.83MB / 49.83MB 5.7s
#4 sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 49.83MB / 49.83MB 5.7s done
#4 extracting sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 5.1s
#4 extracting sha256:ccb06c2479628c5ed70908212e91f0f7216645ac1ab120c10dd44de8fb15a2b5 5.8s done
#4 extracting sha256:6f5513fa990158392d544d4a3321d57f449bf85bc1b4d9a4968393cdf6efb09f 0.1s done
#4 extracting sha256:8f5a40c0a4dcf5e0b17b775694cb809745581fc5b21e19109f2c5ae953620f08 done
#4 extracting sha256:79a77e7c1be9a2c4f77ead609e8d8b7162377bb6905b2a244c7964d74d8c8762
#4 extracting sha256:79a77e7c1be9a2c4f77ead609e8d8b7162377bb6905b2a244c7964d74d8c8762 0.2s done
#4 extracting sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3
#4 extracting sha256:c57e9c8d50115d283e4e5a806aabefe2ec4f457fb173d0699d7e2531e6540cd3 2.5s done
#4 extracting sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 0.1s
#4 extracting sha256:89d5eae598d04269637500b21ff01dde1ed216be9f71e5183c50ffc9b01468fb 4.5s done
#4 DONE 19.3s

#6 [2/2] COPY source /var/task
#6 sha256:bdcb68b71e1cdb29ca83de9e5f92c9df309e3dc357926476c828816a45059621
#6 DONE 1.1s

#7 exporting to image
#7 sha256:e8c613e07b0b7ff33893b694f7759a10d42e180f2b4dc349fb57dc6b71dcab00
#7 exporting layers 0.0s done
#7 writing image sha256:ad5cbf70907697488070a9327e6b3d79de8fa66eb3bae365483f629bd65c5e11 done
#7 naming to docker.io/library/cdkasset-cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376 done
#7 DONE 0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
StackA:  success: Built cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376:222222222222-us-east-1
StackB:  start: Publishing cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376:111111111111-us-east-1
StackA:  start: Publishing cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376:222222222222-us-east-1
The push refers to repository [222222222222.dkr.ecr.us-east-1.amazonaws.com/cdk-hnb659fds-container-assets-222222222222-us-east-1]
5c17ca08cef5: Preparing
715361e4e2ee: Preparing
746612719f34: Preparing
15dd6c63f3a2: Preparing
a2151dd95dc1: Preparing
b5fbba070eb2: Preparing
2a9a5b39a041: Preparing
b5fbba070eb2: Waiting
2a9a5b39a041: Waiting
StackB:  start: Building ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1
StackB:  success: Built ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1
StackB:  start: Publishing ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1
15dd6c63f3a2: Layer already exists
715361e4e2ee: Layer already exists
a2151dd95dc1: Layer already exists
746612719f34: Layer already exists
b5fbba070eb2: Layer already exists
2a9a5b39a041: Layer already exists
The push refers to repository [111111111111.dkr.ecr.us-east-1.amazonaws.com/cdk-hnb659fds-container-assets-111111111111-us-east-1]
An image does not exist locally with the tag: 111111111111.dkr.ecr.us-east-1.amazonaws.com/cdk-hnb659fds-container-assets-111111111111-us-east-1
StackB:  fail: docker push 111111111111.dkr.ecr.us-east-1.amazonaws.com/cdk-hnb659fds-container-assets-111111111111-us-east-1:cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376 exited with error code 1: An image does not exist locally with the tag: 111111111111.dkr.ecr.us-east-1.amazonaws.com/cdk-hnb659fds-container-assets-111111111111-us-east-1
5c17ca08cef5: Pushed
StackB:  success: Published ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1
cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376: digest: sha256:06183095909d1f487f198fae8ad1a173505341b523b3f40b42b505311f1363a8 size: 1788
StackA:  success: Published cbda9eabff497b95ede734f4867ebcc5e6e7b541a7a24e9383a269999f571376:222222222222-us-east-1

 ❌ Deployment failed: Error: Failed to publish asset ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1
    at Deployments.publishSingleAsset (.../container-deploy-issue-typescript/node_modules/aws-cdk/lib/index.js:415:11819)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Object.publishAsset (.../container-deploy-issue-typescript/node_modules/aws-cdk/lib/index.js:415:151100)
    at async .../container-deploy-issue-typescript/node_modules/aws-cdk/lib/index.js:415:137075

Failed to publish asset ed41e37f8f22191abee7e48aac735c5537b770fc51669a4e98d878567518bb2c:111111111111-us-east-1

Reproduction Steps

CDK app folder structure:

jest.config.js
tsconfig.json
package.json
package-lock.json
source/index.js
cdk.json
Dockerfile
bin/cdk-app.ts

Content of jest.config.js:

module.exports = {
  testEnvironment: 'node',
  roots: ['<rootDir>/test'],
  testMatch: ['**/*.test.ts'],
  transform: {
    '^.+\\.tsx?$': 'ts-jest'
  }
};

Content of tsconfig.json:

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "lib": [
      "es2020",
      "dom"
    ],
    "declaration": true,
    "strict": true,
    "noImplicitAny": true,
    "strictNullChecks": true,
    "noImplicitThis": true,
    "alwaysStrict": true,
    "noUnusedLocals": false,
    "noUnusedParameters": false,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": false,
    "inlineSourceMap": true,
    "inlineSources": true,
    "experimentalDecorators": true,
    "strictPropertyInitialization": false,
    "typeRoots": [
      "./node_modules/@types"
    ]
  },
  "exclude": [
    "node_modules",
    "cdk.out"
  ]
}

Content of package.json:

{
  "name": "container-deploy-issue-typescript",
  "version": "0.1.0",
  "bin": {
    "container-deploy-issue-typescript": "bin/container-deploy-issue-typescript.js"
  },
  "scripts": {
    "build": "tsc",
    "watch": "tsc -w",
    "test": "jest",
    "cdk": "cdk"
  },
  "devDependencies": {
    "@types/jest": "^29.5.1",
    "@types/node": "20.1.7",
    "jest": "^29.5.0",
    "ts-jest": "^29.1.0",
    "aws-cdk": "2.87.0",
    "ts-node": "^10.9.1",
    "typescript": "~5.1.3"
  },
  "dependencies": {
    "aws-cdk-lib": "2.87.0",
    "constructs": "^10.0.0",
    "source-map-support": "^0.5.21"
  }
}

Content of source/index.js:

exports.handler = async () => {};

Content of cdk.json:

{
  "app": "npx ts-node --prefer-ts-exts bin/cdk-app.ts",
  "watch": {
    "include": [
      "**"
    ],
    "exclude": [
      "README.md",
      "cdk*.json",
      "**/*.d.ts",
      "**/*.js",
      "tsconfig.json",
      "package*.json",
      "yarn.lock",
      "node_modules",
      "test"
    ]
  },
  "context": {
    "@aws-cdk/aws-lambda:recognizeLayerVersion": true,
    "@aws-cdk/core:checkSecretUsage": true,
    "@aws-cdk/core:target-partitions": [
      "aws",
      "aws-cn"
    ],
    "@aws-cdk-containers/ecs-service-extensions:enableDefaultLogDriver": true,
    "@aws-cdk/aws-ec2:uniqueImdsv2TemplateName": true,
    "@aws-cdk/aws-ecs:arnFormatIncludesClusterName": true,
    "@aws-cdk/aws-iam:minimizePolicies": true,
    "@aws-cdk/core:validateSnapshotRemovalPolicy": true,
    "@aws-cdk/aws-codepipeline:crossAccountKeyAliasStackSafeResourceName": true,
    "@aws-cdk/aws-s3:createDefaultLoggingPolicy": true,
    "@aws-cdk/aws-sns-subscriptions:restrictSqsDescryption": true,
    "@aws-cdk/aws-apigateway:disableCloudWatchRole": true,
    "@aws-cdk/core:enablePartitionLiterals": true,
    "@aws-cdk/aws-events:eventsTargetQueueSameAccount": true,
    "@aws-cdk/aws-iam:standardizedServicePrincipals": true,
    "@aws-cdk/aws-ecs:disableExplicitDeploymentControllerForCircuitBreaker": true,
    "@aws-cdk/aws-iam:importedRoleStackSafeDefaultPolicyName": true,
    "@aws-cdk/aws-s3:serverAccessLogsUseBucketPolicy": true,
    "@aws-cdk/aws-route53-patters:useCertificate": true,
    "@aws-cdk/customresources:installLatestAwsSdkDefault": false,
    "@aws-cdk/aws-rds:databaseProxyUniqueResourceName": true,
    "@aws-cdk/aws-codedeploy:removeAlarmsFromDeploymentGroup": true,
    "@aws-cdk/aws-apigateway:authorizerChangeDeploymentLogicalId": true,
    "@aws-cdk/aws-ec2:launchTemplateDefaultUserData": true,
    "@aws-cdk/aws-secretsmanager:useAttachedSecretResourcePolicyForSecretTargetAttachments": true,
    "@aws-cdk/aws-redshift:columnId": true,
    "@aws-cdk/aws-stepfunctions-tasks:enableEmrServicePolicyV2": true,
    "@aws-cdk/aws-ec2:restrictDefaultSecurityGroup": true,
    "@aws-cdk/aws-apigateway:requestValidatorUniqueId": true,
    "@aws-cdk/aws-kms:aliasNameRef": true,
    "@aws-cdk/core:includePrefixInUniqueNameGeneration": true
  }
}

Content of Dockerfile:

FROM public.ecr.aws/lambda/nodejs:18
COPY source ${LAMBDA_TASK_ROOT}

Content of bin/cdk-app.ts:

#!/usr/bin/env node

import 'source-map-support/register';
import path = require('path');
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { NodejsFunction } from 'aws-cdk-lib/aws-lambda-nodejs';
import { DockerImageFunction, DockerImageCode } from 'aws-cdk-lib/aws-lambda';

const envA  = { account: '222222222222', region: 'us-east-1' };
const envB  = { account: '111111111111', region: 'us-east-1' };


class StackA extends cdk.Stack {
  functionA: NodejsFunction;

  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    new DockerImageFunction(this, 'Function', {
      functionName: 'FunctionA',
      code: DockerImageCode.fromImageAsset(path.join(__dirname, '..')),
    });
  }
}

class StackB extends cdk.Stack {
  constructor(scope: Construct, id: string, props: cdk.StackProps) {
    super(scope, id, props);

    new DockerImageFunction(this, 'Function', {
      functionName: 'FunctionB',
      code: DockerImageCode.fromImageAsset(path.join(__dirname, '..')),
    });
  }
}

const app = new cdk.App();
new StackA(app, 'StackA', { env: envA });
new StackB(app, 'StackB', { env: envB });

app.synth()

A similar project and code structure can be found in the Reproduction Steps of issue #25714.

Run npm install to generate the package-lock.json and install the npm packages locally.

For deployment to multiple target accounts, make sure to execute cdk bootstrap with the --trust and --cloudformation-execution-policies flags, for example as shown below.
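
Hypothetical example (the 333333333333 deployment account and the AdministratorAccess execution policy are placeholders, not values from this issue):

# Bootstrap each target account, trusting the account you deploy from.
npx cdk bootstrap aws://222222222222/us-east-1 \
  --trust 333333333333 \
  --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess
npx cdk bootstrap aws://111111111111/us-east-1 \
  --trust 333333333333 \
  --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess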

Run npx cdk deploy --all --require-approval never to reproduce the issue.

Possible Solution

Introduce extraHash to force a separate asset build when the same asset has to be built locally and deployed to multiple target accounts.

Content of bin/cdk-app.ts:

#!/usr/bin/env node

import 'source-map-support/register';
import path = require('path');
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { NodejsFunction } from 'aws-cdk-lib/aws-lambda-nodejs';
import { AssetImageCodeProps, DockerImageFunction, DockerImageCode } from 'aws-cdk-lib/aws-lambda';

const envA  = { account: '222222222222', region: 'us-east-1' };
const envB  = { account: '111111111111', region: 'us-east-1' };


class StackA extends cdk.Stack {
  functionA: NodejsFunction;

  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const assetImageCodeProps: AssetImageCodeProps = {
      cmd: ["index.handler"],
      extraHash: "A",
      invalidation: {
        extraHash: true,
      },
    }
    new DockerImageFunction(this, 'Function', {
      functionName: 'FunctionA',
      code: DockerImageCode.fromImageAsset(path.join(__dirname, '..'), assetImageCodeProps),
    });
  }
}

class StackB extends cdk.Stack {
  constructor(scope: Construct, id: string, props: cdk.StackProps) {
    super(scope, id, props);

    const assetImageCodeProps: AssetImageCodeProps = {
      cmd: ["index.handler"],
      extraHash: "B",
      invalidation: {
        extraHash: true,
      },
    }

    new DockerImageFunction(this, 'Function', {
      functionName: 'FunctionB',
      code: DockerImageCode.fromImageAsset(path.join(__dirname, '..'), assetImageCodeProps),
    });
  }
}

const app = new cdk.App();
new StackA(app, 'StackA', { env: envA });
new StackB(app, 'StackB', { env: envB });

app.synth()

Additional Information/Context

To test the deployment with AWS CDK version 2.79.1, change the versions in package.json:

{
  "name": "container-deploy-issue-typescript",
  "version": "0.1.0",
  "bin": {
    "container-deploy-issue-typescript": "bin/container-deploy-issue-typescript.js"
  },
  "scripts": {
    "build": "tsc",
    "watch": "tsc -w",
    "test": "jest",
    "cdk": "cdk"
  },
  "devDependencies": {
    "@types/jest": "^29.5.1",
    "@types/node": "20.1.7",
    "jest": "^29.5.0",
    "ts-jest": "^29.1.0",
    "aws-cdk": "2.79.1",
    "ts-node": "^10.9.1",
    "typescript": "~5.1.3"
  },
  "dependencies": {
    "aws-cdk-lib": "2.79.1",
    "constructs": "^10.0.0",
    "source-map-support": "^0.5.21"
  }
}

Install the packages with npm install and run npx cdk deploy --all --require-approval never to verify that the deployment succeeds with the older AWS CDK version.

CDK CLI Version

2.87.0 (build 9fca790)

Framework Version

No response

Node.js Version

v18.16.1

OS

OS X & Ubuntu & Amazon Linux 2

Language

TypeScript

Language Version

5.1.6

Other information

Thanks for the hard work on CDK! I'm a very happy user.

@igirardi igirardi added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jul 20, 2023
@github-actions github-actions bot added the @aws-cdk/assets Related to the @aws-cdk/assets package label Jul 20, 2023

naataaniitsosie commented Jul 20, 2023

I receive a similar error when publishing a single Docker image to multiple regions in the same account via a TarballImageAsset construct.


ecs.ContainerImage.fromTarball(path.join(__dirname, "..", "..", "..", "apps", "api", "image.tar")),

Regions: us-east-1 and us-west-2
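
A minimal sketch of that setup (my assumption of the shape, not the commenter's actual code; 'XXXXXXXXXX' stands for the redacted account id above): the same tarball asset is referenced by ECS services in two stacks that target different regions of the same account.

import * as path from 'path';
import * as cdk from 'aws-cdk-lib';
import * as ecs from 'aws-cdk-lib/aws-ecs';
import { Construct } from 'constructs';

class ApiStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props: cdk.StackProps) {
    super(scope, id, props);
    const cluster = new ecs.Cluster(this, 'Cluster');
    const taskDefinition = new ecs.FargateTaskDefinition(this, 'TaskDef');
    // The same local tarball image asset is used by both regional stacks.
    taskDefinition.addContainer('api', {
      image: ecs.ContainerImage.fromTarball(
        path.join(__dirname, '..', '..', '..', 'apps', 'api', 'image.tar')),
    });
    new ecs.FargateService(this, 'Service', { cluster, taskDefinition });
  }
}

const app = new cdk.App();
new ApiStack(app, 'ApiUsEast1', { env: { account: 'XXXXXXXXXX', region: 'us-east-1' } });
new ApiStack(app, 'ApiUsWest2', { env: { account: 'XXXXXXXXXX', region: 'us-west-2' } });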

Error:

fail: docker push XXXXXXXXXX.dkr.ecr.us-west-2.amazonaws.com/cdk-hnb659fds-container-assets-XXXXXXXXXX-us-west-2:16a6587ca24fb32b32d096b40b6d2a99615c0320829bbe41dc76861ba859b1dd exited with error code 1: tag does not exist: XXXXXXXXXX.dkr.ecr.us-west-2.amazonaws.com/cdk-hnb659fds-container-assets-XXXXXXXXXX-us-west-2:16a6587ca24fb32b32d096b40b6d2a99615c0320829bbe41dc76861ba859b1dd

Using an older CDK version (^2.70.0) avoids the issue, as previously mentioned by @igirardi. However, I want to use the latest CDK version for other reasons.


tmokmss commented Jul 21, 2023

Related: #25962


pahud commented Jul 21, 2023

Thank you @tmokmss for the insights #25962 (comment)

@pahud pahud added p1 effort/medium Medium work item – several days of effort and removed needs-triage This issue or PR still needs to be triaged. labels Jul 21, 2023
@pahud pahud added p2 and removed p1 labels Jun 11, 2024
@kadishmal

As a workaround, I cleared all Docker caches locally via the following commands, which forces the images to be rebuilt and pushed to Amazon ECR.

docker rmi $(docker images -a --filter=dangling=true -q)
docker rm $(docker ps --filter=status=exited --filter=status=created -q)
docker system prune -a
