s3api --sse-customer-key cannot accept binary data #815
Comments
So the problem here is that file:// is typically meant for text types, which by default we read using the user's configured encoding from their locale settings. However, there are situations where a user may want to indicate that the data is binary and should not be decoded, as shown in this issue. I think there are two general approaches:
1. Pass the file contents through untouched, as binary data, without decoding.
2. Add a way for the user to explicitly specify the encoding to use when reading the file.
Option 2 is an additive change, so there shouldn't be any breaking changes, but it does put the burden on the user to specify their encodings. Option 1 has potential impact for things like EC2's user-data: today, even if the local encoding != utf-8, we decode using the local encoding and then encode to utf-8, whereas with option 1 we'd pass the file through in its local encoding, untouched. You could make the case that this is what it should have been doing all along, though. Any thoughts?
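To make the trade-off concrete, here is a minimal, hypothetical sketch (not aws-cli's actual code) of what happens under each approach when the parameter file holds a random 256-bit SSE-C key:

```python
import locale
import os

key_path = "sse-c.key"                      # hypothetical file holding a random key
with open(key_path, "wb") as f:
    f.write(os.urandom(32))                 # 256 bits of random data; almost never valid text

with open(key_path, "rb") as f:
    raw = f.read()

# Today's behavior: decode with the locale's preferred encoding,
# which usually fails for random bytes.
try:
    raw.decode(locale.getpreferredencoding())
except UnicodeDecodeError as exc:
    print("decoding as text fails:", exc)

# Approach 1 (blob): hand the bytes over untouched.
param_as_blob = raw

# Approach 2 (explicit encoding): only helps if the file really is text
# in the encoding the user names; a random key still won't decode.
```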
@jamesls thanks for the feedback. I'll sleep on it and get back to you tomorrow.
As for #2, as you described, you would have to provide a way to specify the encoding per file. If we switch to #1, the current behavior could cause trouble, but that behavior has only existed for a month or so (since 1.3.11 / May 15, #765), so we don't need to worry too much about breaking backward compatibility.

The Mercurial project had the same encoding issues and has a good write-up of its encoding strategy: http://mercurial.selenic.com/wiki/EncodingStrategy

For user inputs, its approach is the same as aws-cli's. For files, its approach is your #1 plan. I'm +1 for the #1 (blob) approach.
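A rough sketch of that split, with hypothetical helper names rather than aws-cli's real internals: user input is decoded with the locale encoding, while file-backed parameters stay raw bytes.

```python
import locale

def read_user_input(value):
    # User-supplied arguments are text: decode them with the locale's
    # preferred encoding if they arrive as bytes (same spirit as aws-cli today).
    if isinstance(value, bytes):
        return value.decode(locale.getpreferredencoding())
    return value

def read_file_param(path):
    # The "#1 (blob)" approach: file contents are opaque bytes, never decoded.
    with open(path, "rb") as f:
        return f.read()
```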
Pull Request #1010 should fix this issue. Closing.
This report relates to the following feature:
http://aws.amazon.com/about-aws/whats-new/2014/06/12/amazon-s3-now-supports-server-side-encryption-with-customer-provided-keys-sse-c/
When you use S3 server-side encryption with customer-provided keys (SSE-C), the aws cli's --sse-customer-key parameter can only accept data that can be decoded as text (UTF-8). For AES-256 you need a 256-bit key, and when you generate one from /dev/urandom (as described here: http://docs.oracle.com/cd/E19253-01/816-4557/scftask-10/index.html), you end up with a UnicodeDecodeError. My understanding is that an AES key can consist of arbitrary bytes as long as the key length is correct (which is why the x-amz-copy-source-server-side-encryption-customer-key request header is base64-encoded).
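A short illustrative sketch of that point (using only Python's standard library, not the aws cli itself): the key is arbitrary bytes, and it is the base64-encoded key and its base64-encoded MD5 digest that S3's SSE-C request headers carry.

```python
import base64
import hashlib
import os

# A 256-bit SSE-C key is arbitrary bytes and almost certainly not valid UTF-8 text.
key = os.urandom(32)

# S3's SSE-C headers expect the key, and the MD5 digest of the key,
# in base64 form, which is how binary data survives an HTTP header.
key_b64 = base64.b64encode(key).decode("ascii")
key_md5_b64 = base64.b64encode(hashlib.md5(key).digest()).decode("ascii")

print("x-amz-server-side-encryption-customer-key:", key_b64)
print("x-amz-server-side-encryption-customer-key-MD5:", key_md5_b64)
```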
How to reproduce
AWS Version
Related Issues