
Allow a local tarball to be uploaded instead of fetching from S3 #17

Merged · 2 commits · Mar 27, 2018
scripts/on-prem-archive.sh: 22 changes (18 additions & 4 deletions)
@@ -17,11 +17,18 @@
 # after itself by deleting the intermediate files that were created during its run.
 # Setting this variable to any value will cause the cleanup to be skipped. By
 # default, the script will clean up after itself.
+#
+# Additionally, if you're using this script to populate an existing depot and you
+# don't have network connectivity to download a tarball from S3, you can pass the
+# path to your existing tarball as the third argument, and it will be used to
+# upload packages instead. Note that this script expects the tarball passed in to be
+# in the same format as the one this script generates - it can't have an arbitrary
+# internal structure.

set -euo pipefail

usage() {
echo "Usage: on-prem-archive.sh {create-archive|populate-depot <DEPOT_URL>}"
echo "Usage: on-prem-archive.sh {create-archive|populate-depot <DEPOT_URL> [PATH_TO_EXISTING_TARBALL]}"
Contributor:

This looks fine - one thing that would be useful is to have a command like "download-archive" that can be used to retrieve the latest tarball. Then in the README instructions above, we can tell them to ensure they are using a tarball that has been downloaded using that command.
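For illustration, such a download-archive branch might look roughly like this inside the script's existing case statement, reusing the $s3_root_url and $marker variables seen in the diff below; this exact implementation is hypothetical and not part of this PR:

    download-archive)
        # Hypothetical sketch; assumes $s3_root_url and $marker are set
        # the same way the populate-depot branch sets them.
        echo "Fetching latest package bootstrap file."
        curl -O "$s3_root_url/$marker"
        ;;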

Contributor:

Also, minor: create-archive doesn't need the <DEPOT_URL> param, so that could be marked as optional in the usage as well.

@raskchanky (Contributor, Author) · Mar 26, 2018:

I can add a download-archive command to download the latest tarball.

The create-archive command doesn't take any arguments at all, which is why none are listed 😄 The pipe in the usage means "or", so the usage reads as create-archive OR populate-depot <DEPOT_URL> [PATH_TO_EXISTING_TARBALL], with the angle brackets <> denoting required arguments and the square brackets [] denoting optional ones. If the create-archive command took any arguments, they would appear before the pipe.

These are the same conventions we follow with the hab CLI, and are generally followed by most unix command line utilities. Maybe adding some spaces around the pipe would make this more clear?
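For illustration, with that spacing the usage line would read:

    Usage: on-prem-archive.sh {create-archive | populate-depot <DEPOT_URL> [PATH_TO_EXISTING_TARBALL]}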

Contributor:

ah ok, I was reading the usage incorrectly then :)

exit 1
}
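For reference, example invocations of the two modes described in the header comment above; the depot URL and tarball path are hypothetical:

    # Create the bootstrap tarball
    ./scripts/on-prem-archive.sh create-archive

    # Populate an existing depot from a local tarball, skipping the S3 download
    # (URL and path are hypothetical examples)
    ./scripts/on-prem-archive.sh populate-depot https://depot.example.com ./hab_bootstrap.tar.gz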

@@ -202,9 +209,16 @@ case "${1:-}" in
s3_root_url="${HAB_ON_PREM_BOOTSTRAP_S3_ROOT_URL:-https://s3-us-west-2.amazonaws.com}/$bucket"
tmp_dir=$(mktemp -d)

cd "$tmp_dir"
echo "Fetching latest package bootstrap file."
curl -O "$s3_root_url/$marker"
if [ -f "${3:-}" ]; then
echo "Skipping S3 download and using existing file $3 instead."
mv "$3" "$tmp_dir/$marker"
cd "$tmp_dir"
else
echo "Fetching latest package bootstrap file."
cd "$tmp_dir"
curl -O "$s3_root_url/$marker"
fi

tar zxvf $marker

echo
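One note on the "${3:-}" test in the new branch: with set -u in effect, a bare $3 would abort the script when no third argument is given, while the :- default expansion substitutes an empty string instead. A standalone illustration of the idiom (messages here are illustrative):

    #!/bin/bash
    set -u
    # "${3:-}" expands to the third positional argument, or to the empty
    # string when it is unset, so the -f test simply fails instead of
    # triggering an "unbound variable" error under set -u.
    if [ -f "${3:-}" ]; then
        echo "Using existing tarball: $3"
    else
        echo "No usable tarball supplied; falling back to S3 download."
    fi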