Add a content parameter to aws_s3 put operation #20

Merged · 35 commits · Nov 16, 2020
Changes from all commits (35 commits)
9592d10
Document usage of aws_s3 content parameter
srgoni Feb 4, 2020
528c873
Added utility function to calculate s3 etag from content instead of file
srgoni Feb 4, 2020
c17b732
Implemented s3 put from bytes content
srgoni Feb 4, 2020
437cf42
Moved positional parameters
srgoni Feb 4, 2020
0974fa6
Added missing imports
srgoni Feb 4, 2020
01474a1
Replace file md5 utility call with direct calculation
srgoni Feb 4, 2020
b6f203b
pylint nitpick
srgoni Feb 4, 2020
9afd3c8
Removed file open
srgoni Feb 4, 2020
10e205a
Moved other use of positional param
srgoni Feb 4, 2020
404f556
Added potentially missing version parameter
srgoni Feb 4, 2020
1a2c345
Added unit tests for static content and template put
srgoni Feb 4, 2020
33698c8
Changed content parameter to string and removed src requirement
srgoni Feb 4, 2020
471eb15
Remove trailing newline from unit test
srgoni Feb 4, 2020
6b49ef4
Fix unit test cleanup
srgoni Feb 4, 2020
72de30b
Encode content string to utf-8
srgoni Feb 4, 2020
de02783
Document implicit conversion to UTF-8
onitake Feb 5, 2020
4a44493
Add idempotency test
srgoni May 8, 2020
5bc9e88
Validate src as path
srgoni May 8, 2020
37d37a4
Don't validate src when None
srgoni May 8, 2020
55721eb
Change src/content validation logic
srgoni May 8, 2020
2d1091c
Fix PEP8 issue
srgoni May 8, 2020
3eacaf9
Fix idempotency test
srgoni May 8, 2020
6123be8
Encode the content string only once for hashing and upload
srgoni May 8, 2020
4f6c89a
Only try to encode when content is set
srgoni May 8, 2020
aa6e402
Don't skip src check if we get a falsy value, only if unset
srgoni May 8, 2020
9b7b793
Bind local variable
srgoni May 8, 2020
7ee3970
Updated feature version
srgoni Jul 10, 2020
e40e935
Add new content_base64 parameter
srgoni Jul 10, 2020
f9da75c
Add test case for S3 upload from b64 encoded binary
srgoni Jul 10, 2020
baecef6
Fixed type of src argument
srgoni Jul 10, 2020
6e339e8
Fixed copy pasta
srgoni Jul 10, 2020
6c1fce7
Use role_path in unit tests
srgoni Jul 10, 2020
4cbc8c3
Dropped obsolete aws_connection_info
srgoni Sep 28, 2020
a6dcf34
Update version_added (stale)
tremble Nov 16, 2020
54f765e
Add changelog fragment
tremble Nov 16, 2020
2 changes: 2 additions & 0 deletions changelogs/fragments/20-aws_s3-content.yml
@@ -0,0 +1,2 @@
minor_changes:
- aws_s3 - Add support for uploading templated content (https://github.com/ansible-collections/amazon.aws/pull/20).
33 changes: 33 additions & 0 deletions plugins/module_utils/s3.py
@@ -48,3 +48,36 @@ def calculate_etag(module, filename, etag, s3, bucket, obj, version=None):
return '"{0}-{1}"'.format(digest_squared.hexdigest(), len(digests))
else: # Compute the MD5 sum normally
return '"{0}"'.format(module.md5(filename))


def calculate_etag_content(module, content, etag, s3, bucket, obj, version=None):
if not HAS_MD5:
return None

if '-' in etag:
# Multi-part ETag; a hash of the hashes of each part.
parts = int(etag[1:-1].split('-')[1])
digests = []
offset = 0

s3_kwargs = dict(
Bucket=bucket,
Key=obj,
)
if version:
s3_kwargs['VersionId'] = version

for part_num in range(1, parts + 1):
s3_kwargs['PartNumber'] = part_num
try:
head = s3.head_object(**s3_kwargs)
except (BotoCoreError, ClientError) as e:
module.fail_json_aws(e, msg="Failed to get head object")
length = int(head['ContentLength'])
digests.append(md5(content[offset:offset + length]))
offset += length

digest_squared = md5(b''.join(m.digest() for m in digests))
return '"{0}-{1}"'.format(digest_squared.hexdigest(), len(digests))
else: # Compute the MD5 sum normally
return '"{0}"'.format(md5(content).hexdigest())
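The new calculate_etag_content() mirrors calculate_etag() but hashes in-memory bytes instead of a file. For multipart uploads, S3's ETag is not the MD5 of the whole object: it is the MD5 of the concatenated per-part MD5 digests, suffixed with the part count. That is why the helper queries head_object(PartNumber=...) to recover each part's real length. A minimal sketch of the format, assuming a fixed part size (illustrative only, not part of the PR):

from hashlib import md5

def multipart_etag(content, part_size=8 * 1024 * 1024):
    # Single-part upload: the ETag is simply the hex MD5 of the body.
    if len(content) <= part_size:
        return '"{0}"'.format(md5(content).hexdigest())
    # Multipart upload: MD5 each part, then MD5 the concatenation of the
    # binary digests, and append the number of parts after a dash.
    digests = [md5(content[i:i + part_size]).digest()
               for i in range(0, len(content), part_size)]
    return '"{0}-{1}"'.format(md5(b''.join(digests)).hexdigest(), len(digests))

# A 20 MiB body in 8 MiB parts yields an ETag of the form '"<hex>-3"'.
print(multipart_etag(b'\x00' * (20 * 1024 * 1024)))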
75 changes: 62 additions & 13 deletions plugins/modules/aws_s3.py
@@ -126,6 +126,22 @@
src:
description:
- The source file path when performing a PUT operation.
- Either I(content), I(content_base64) or I(src) must be specified for a PUT operation. Ignored otherwise.
type: path
content:
description:
- The content to PUT into an object.
- The parameter value will be treated as a string and converted to UTF-8 before sending it to S3.
To send binary data, use the I(content_base64) parameter instead.
- Either I(content), I(content_base64) or I(src) must be specified for a PUT operation. Ignored otherwise.
version_added: "1.3.0"
type: str
content_base64:
description:
- The base64-encoded binary data to PUT into an object.
- Use this when you need to PUT raw binary data; the value must already be base64-encoded.
- Either I(content), I(content_base64) or I(src) must be specified for a PUT operation. Ignored otherwise.
version_added: "1.3.0"
type: str
ignore_nonexistent_bucket:
description:
@@ -155,6 +171,13 @@
src: /usr/local/myfile.txt
mode: put

- name: PUT operation from a rendered template
aws_s3:
bucket: mybucket
object: /object.yaml
content: "{{ lookup('template', 'templates/object.yaml.j2') }}"
mode: put

- name: Simple PUT operation in Ceph RGW S3
amazon.aws.aws_s3:
bucket: mybucket
@@ -275,7 +298,9 @@

import mimetypes
import os
import io
from ssl import SSLError
import base64

try:
import botocore
@@ -294,6 +319,7 @@
from ..module_utils.ec2 import get_aws_connection_info
from ..module_utils.s3 import HAS_MD5
from ..module_utils.s3 import calculate_etag
from ..module_utils.s3 import calculate_etag_content

IGNORE_S3_DROP_IN_EXCEPTIONS = ['XNotImplemented', 'NotImplemented']

@@ -319,9 +345,12 @@ def key_check(module, s3, bucket, obj, version=None, validate=True):
return True


def etag_compare(module, local_file, s3, bucket, obj, version=None):
def etag_compare(module, s3, bucket, obj, version=None, local_file=None, content=None):
s3_etag = get_etag(s3, bucket, obj, version=version)
local_etag = calculate_etag(module, local_file, s3_etag, s3, bucket, obj, version)
if local_file is not None:
local_etag = calculate_etag(module, local_file, s3_etag, s3, bucket, obj, version)
else:
local_etag = calculate_etag_content(module, content, s3_etag, s3, bucket, obj, version)

return s3_etag == local_etag

@@ -479,7 +508,7 @@ def option_in_extra_args(option):
return allowed_extra_args[temp_option]


def upload_s3file(module, s3, bucket, obj, src, expiry, metadata, encrypt, headers):
def upload_s3file(module, s3, bucket, obj, expiry, metadata, encrypt, headers, src=None, content=None):
if module.check_mode:
module.exit_json(msg="PUT operation skipped - running in check mode", changed=True)
try:
@@ -500,13 +529,19 @@ def upload_s3file(module, s3, bucket, obj, src, expiry, metadata, encrypt, headers):
extra['Metadata'][option] = metadata[option]

if 'ContentType' not in extra:
content_type = mimetypes.guess_type(src)[0]
content_type = None
if src is not None:
content_type = mimetypes.guess_type(src)[0]
if content_type is None:
# s3 default content type
content_type = 'binary/octet-stream'
extra['ContentType'] = content_type

s3.upload_file(Filename=src, Bucket=bucket, Key=obj, ExtraArgs=extra)
if src is not None:
s3.upload_file(Filename=src, Bucket=bucket, Key=obj, ExtraArgs=extra)
else:
f = io.BytesIO(content)
s3.upload_fileobj(Fileobj=f, Bucket=bucket, Key=obj, ExtraArgs=extra)
except (botocore.exceptions.ClientError, botocore.exceptions.BotoCoreError) as e:
module.fail_json_aws(e, msg="Unable to complete PUT operation.")
try:
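In the content branch above, upload_s3file() wraps the byte string in io.BytesIO so that boto3's managed transfer can treat it like an open file, switching to multipart transparently for large payloads. A standalone sketch of that call path (bucket and key names are placeholders):

import io
import boto3

s3 = boto3.client('s3')
body = 'rendered template output'.encode('utf-8')  # same UTF-8 encoding the module applies

# upload_fileobj() accepts any file-like object; BytesIO adapts in-memory bytes.
s3.upload_fileobj(
    Fileobj=io.BytesIO(body),
    Bucket='mybucket',   # placeholder
    Key='object.yaml',   # placeholder
    ExtraArgs={'ContentType': 'binary/octet-stream'},
)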
@@ -652,17 +687,20 @@ def main():
s3_url=dict(aliases=['S3_URL']),
dualstack=dict(default='no', type='bool'),
rgw=dict(default='no', type='bool'),
src=dict(),
src=dict(type='path'),
content=dict(),
content_base64=dict(),
ignore_nonexistent_bucket=dict(default=False, type='bool'),
encryption_kms_key_id=dict()
)
module = AnsibleAWSModule(
argument_spec=argument_spec,
supports_check_mode=True,
required_if=[['mode', 'put', ['src', 'object']],
required_if=[['mode', 'put', ['object']],
['mode', 'get', ['dest', 'object']],
['mode', 'getstr', ['object']],
['mode', 'geturl', ['object']]],
mutually_exclusive=[['content', 'content_base64', 'src']],
)

bucket = module.params.get('bucket')
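With the two new parameters as alternatives to src, the put mode's required_if entry now demands only object, while the mutually_exclusive rule rejects any call that supplies more than one data source. A standalone sketch of the equivalent check (illustrative; AnsibleAWSModule enforces this internally):

def validate_put_source(params):
    # At most one of the three data sources may be set; None means "not given".
    provided = [name for name in ('src', 'content', 'content_base64')
                if params.get(name) is not None]
    if len(provided) > 1:
        raise ValueError('mutually exclusive parameters: %s' % ', '.join(provided))
    return provided[0] if provided else None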
@@ -683,6 +721,8 @@
dualstack = module.params.get('dualstack')
rgw = module.params.get('rgw')
src = module.params.get('src')
content = module.params.get('content')
content_base64 = module.params.get('content_base64')
ignore_nonexistent_bucket = module.params.get('ignore_nonexistent_bucket')

object_canned_acl = ["private", "public-read", "public-read-write", "aws-exec-read", "authenticated-read", "bucket-owner-read", "bucket-owner-full-control"]
@@ -762,10 +802,10 @@ def main():
else:
module.fail_json(msg="Key %s does not exist." % obj)

if path_check(dest) and overwrite != 'always':
if dest and path_check(dest) and overwrite != 'always':
if overwrite == 'never':
module.exit_json(msg="Local object already exists and overwrite is disabled.", changed=False)
if etag_compare(module, dest, s3, bucket, obj, version=version):
if etag_compare(module, s3, bucket, obj, version=version, local_file=dest):
module.exit_json(msg="Local and remote object are identical, ignoring. Use overwrite=always parameter to force.", changed=False)

try:
@@ -779,8 +819,10 @@
# if putting an object in a bucket yet to be created, acls for the bucket and/or the object may be specified
# these were separated into the variables bucket_acl and object_acl above

if not path_check(src):
module.fail_json(msg="Local object for PUT does not exist")
if content is None and content_base64 is None and src is None:
module.fail_json(msg='Either content, content_base64 or src must be specified for PUT operations')
if src is not None and not path_check(src):
module.fail_json(msg='Local object "%s" does not exist for PUT operation' % (src))

if bucketrtn:
keyrtn = key_check(module, s3, bucket, obj, version=version, validate=validate)
@@ -790,14 +832,21 @@
module.params['permission'] = bucket_acl
create_bucket(module, s3, bucket, location)

# the content will be uploaded as a byte string, so we must encode it first
bincontent = None
if content is not None:
bincontent = content.encode('utf-8')
if content_base64 is not None:
bincontent = base64.standard_b64decode(content_base64)

if keyrtn and overwrite != 'always':
if overwrite == 'never' or etag_compare(module, src, s3, bucket, obj):
if overwrite == 'never' or etag_compare(module, s3, bucket, obj, version=version, local_file=src, content=bincontent):
# Return the download URL for the existing object
get_download_url(module, s3, bucket, obj, expiry, changed=False)

# only use valid object acls for the upload_s3file function
module.params['permission'] = object_acl
upload_s3file(module, s3, bucket, obj, src, expiry, metadata, encrypt, headers)
upload_s3file(module, s3, bucket, obj, expiry, metadata, encrypt, headers, src=src, content=bincontent)

# Delete an object from a bucket, not the entire bucket
if mode == 'delobj':
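Since content is always encoded as UTF-8 text, raw binary data must travel through content_base64 instead. One way to produce a suitable value outside Ansible (illustrative; the integration tests below use the slurp module, which already returns base64):

import base64

with open('test.png', 'rb') as f:  # placeholder path
    encoded = base64.standard_b64encode(f.read()).decode('ascii')

# `encoded` can be passed as the module's content_base64 parameter; aws_s3
# reverses this with base64.standard_b64decode() before uploading.
print(encoded[:60])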
Binary file added tests/integration/targets/aws_s3/files/test.png
111 changes: 111 additions & 0 deletions tests/integration/targets/aws_s3/tasks/main.yml
@@ -526,6 +526,108 @@
- result is not changed
when: ansible_system == 'Linux' or ansible_distribution == 'MacOSX'

- name: create an object from static content
aws_s3:
bucket: "{{ bucket_name }}"
object: put-content.txt
mode: put
content: >-
test content
register: result

- assert:
that:
- result is changed

- name: ensure idempotency on static content
aws_s3:
bucket: "{{ bucket_name }}"
object: put-content.txt
mode: put
overwrite: different
content: >-
test content
register: result

- assert:
that:
- result is not changed

- name: fetch test content
aws_s3:
bucket: "{{ bucket_name }}"
mode: getstr
object: put-content.txt
register: result

- assert:
that:
- result.contents == "test content"

- set_fact:
put_template_text: test template

- name: create an object from a template
aws_s3:
bucket: "{{ bucket_name }}"
object: put-template.txt
mode: put
content: "{{ lookup('template', 'templates/put-template.txt.j2') }}"
register: result

- assert:
that:
- result is changed

- name: fetch template content
aws_s3:
bucket: "{{ bucket_name }}"
mode: getstr
object: put-template.txt
register: result

- assert:
that:
- result.contents == "{{ lookup('template', 'templates/put-template.txt.j2') }}"

# at present, there is no lookup that can process binary data, so we use slurp instead
- slurp:
src: "{{ role_path }}/files/test.png"
register: put_binary

- name: create an object from binary data
aws_s3:
bucket: "{{ bucket_name }}"
object: put-binary.bin
mode: put
content_base64: "{{ put_binary.content }}"
register: result

- assert:
that:
- result is changed

- name: fetch binary content
aws_s3:
bucket: "{{ bucket_name }}"
mode: get
dest: "{{ tmpdir.path }}/download_binary.bin"
object: put-binary.bin
register: result

- name: stat the files so we can compare the checksums
stat:
path: "{{ item }}"
get_checksum: yes
loop:
- "{{ role_path }}/files/test.png"
- "{{ tmpdir.path }}/download_binary.bin"
register: binary_files

- assert:
that:
- binary_files.results[0].stat.checksum == binary_files.results[1].stat.checksum

always:
- name: remove uploaded files
aws_s3:
@@ -537,6 +639,9 @@
- delete.txt
- delete_encrypt.txt
- delete_encrypt_kms.txt
- put-content.txt
- put-template.txt
- put-binary.bin
ignore_errors: yes

- name: delete temporary files
@@ -550,3 +655,9 @@
bucket: "{{ bucket_name }}"
mode: delete
ignore_errors: yes

- name: delete the dot bucket
aws_s3:
bucket: "{{ bucket_name + '.bucket' }}"
mode: delete
ignore_errors: yes
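The new integration tests exercise all three put paths: static content with an idempotency re-run under overwrite=different, templated content read back via getstr, and binary data round-tripped through content_base64 and compared by checksum. A condensed boto3 sketch of the binary round trip (bucket name and paths are placeholders):

import hashlib
import boto3

s3 = boto3.client('s3')
bucket, key = 'mybucket', 'put-binary.bin'  # placeholders

with open('files/test.png', 'rb') as f:     # placeholder path
    original = f.read()

s3.put_object(Bucket=bucket, Key=key, Body=original)
downloaded = s3.get_object(Bucket=bucket, Key=key)['Body'].read()

# The playbook compares stat checksums; hashing both copies is equivalent.
assert hashlib.sha1(original).hexdigest() == hashlib.sha1(downloaded).hexdigest()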
2 changes: 2 additions & 0 deletions tests/integration/targets/aws_s3/templates/put-template.txt.j2
@@ -0,0 +1,2 @@
template:
{{ put_template_text }}