Add configurable working directory #1231
Conversation
LGTM

@cartermckinnon This PR fixes the issues we were having with hardened Linux builds. Thanks again!

One question inline. LGTM otherwise!

Hey @cartermckinnon, any idea when this might be merged to master? Thanks again for the work.

Hi @cartermckinnon, another friendly ping, as this PR solves some issues we've been having and we're pinned on it until it's merged. Let me know if there's anything I can do to help work towards merging this. Thanks!

Hi @cartermckinnon, any update on when this is going to get merged?

Sorry folks, I was out of the office and had some other priorities. I'll resolve the conflicts and wrap this one up.

Hi, any update on this PR?
```json
"mkdir -p {{user `working_dir`}}",
"mkdir -p {{user `working_dir`}}/log-collector-script"
```
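The "combine into one line" question hinges on `mkdir -p` semantics, which this sketch illustrates in plain shell; `WORKING_DIR` here is a hypothetical stand-in for Packer's `{{user `working_dir`}}` value, not code from the PR:

```shell
#!/bin/sh
# WORKING_DIR stands in for the Packer `working_dir` user variable (illustrative value).
WORKING_DIR="$(mktemp -d)/worker"

# Two explicit calls, as in the provisioner above:
mkdir -p "$WORKING_DIR"
mkdir -p "$WORKING_DIR/log-collector-script"

# A single combined call would have the same effect, because `mkdir -p`
# creates all missing parent directories and succeeds if they already exist:
mkdir -p "$WORKING_DIR/log-collector-script"

ls "$WORKING_DIR"   # → log-collector-script
```

Because `-p` is idempotent, the two-line form is purely for readability, which matches the reasoning in the replies below.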
Can't this just be combined into one line?
Yeah, I think I separated this to make it very obvious what the intent of this provisioner is; I don't feel too strongly either way.
Ya, I had mixed feelings. It's unnecessary, but it removes the illusion that we're creating this just for the log collector script.
```json
{
  "type": "shell",
  "inline": [
    "rm -rf {{user `working_dir`}}"
  ]
}
```
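The "can bad things happen here?" concern could be mitigated with a guard like this sketch; the function name and allowed prefixes are illustrative assumptions, not part of the PR:

```shell
#!/bin/sh
# Hypothetical cleanup guard (not from the PR): only delete the staging
# directory if it lives under an expected temporary prefix.
cleanup_working_dir() {
  case "$1" in
    /tmp/*|/var/tmp/*)
      rm -rf "$1"
      ;;
    *)
      echo "refusing to delete unexpected path: $1" >&2
      return 1
      ;;
  esac
}

# Demo: a staging dir under /tmp is removed; anything else is refused.
demo="/tmp/eks-ami-demo-$$/worker"
mkdir -p "$demo"
cleanup_working_dir "$demo" && echo "removed $demo"
```

A templated `rm -rf` is only as safe as the variable feeding it, so a prefix check like this is a cheap belt-and-suspenders measure on top of the default discussed below.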
Is it possible bad things will happen here? I think not, since we're creating {{user `remote_folder`}}/worker. I don't see how we could accidentally delete anything too problematic.
Yup, we're using `/tmp` right now and deleting it in the cleanup script. I added this to directly mirror the provisioner up top where we create this dir.
LGTM
Thanks for the good PR to allow using this repo with CIS-hardened Amazon Linux 2. Hope it merges soon and works again.
@cartermckinnon Still I am getting this error: /sandeep/amazon-eks-ami/scripts/generate-version-info.sh Command : EKS version 1.27
@Sandeepsac Doesn't look like it's one of the errors that this PR is meant to fix. Looks like it's a new one. You probably need to look at the containerd logs. Looks like containerd hasn't started.
@Sandeepsac Are you using the default for `working_dir`?
Yes @cartermckinnon, I am using the default for `working_dir`.
@Sandeepsac My guess is there's a permissions error at
@cartermckinnon I will check the permissions, thanks. I am referring to this blog for custom AMI creation with a CIS-hardened AMI: CIS Amazon Linux 2 Benchmark - Level 2
Hi there, is there an ETA on when this is going to get tagged?
@cartermckinnon Any update for this error?

2023-07-03T06:53:02Z: ==> amazon-ebs: Provisioning with shell script: /eks-1.26/amazon-eks-ami/scripts/generate-version-info.sh
==> Wait completed after 5 minutes 47 seconds
==> Some builds didn't complete successfully and had errors:
==> Builds finished but no artifacts were created.
Issue #, if available:
Closes #1229
Description of changes:
This aims to address issues brought up in #1230 and #1198.
The underlying issue is that the base working directory used throughout the AMI template (`/tmp`) may not be suitable for all use cases. Resources are staged in this directory (`/tmp/worker`) before being moved into their final position (such as the contents of `files/`), and this directory is also used to stash the AWS CLI installer, among other things.

After this change:
- `remote_folder` can still be used to control the location of Packer scripts on the builder instance.
- `working_dir` will default to a sub-directory of `remote_folder`, and can be adjusted as needed. All provisioners that create files on the builder instance will do so within this directory. This directory is deleted at the end of the build.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
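The default relationship between the two variables can be sketched in plain shell; the variable names mirror the Packer user variables and the `worker` sub-directory appears earlier in this conversation, but the concrete values and the script itself are illustrative assumptions:

```shell
#!/bin/sh
# Sketch (not the PR's actual template logic): working_dir falls back to a
# sub-directory of remote_folder when not set explicitly.
REMOTE_FOLDER="${REMOTE_FOLDER:-/tmp}"
WORKING_DIR="${WORKING_DIR:-$REMOTE_FOLDER/worker}"
echo "$WORKING_DIR"   # → /tmp/worker with the defaults
```

Either variable can still be overridden independently, which is what the "Different `remote_folder`" and "Different `working_directory`" test cases below exercise.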
Testing Done

Defaults succeed:

Different `remote_folder` succeeds:

Different `working_directory` succeeds: