The downloaded data did not match the data from the server #654
:( |
Any thoughts @stephenplusplus? |
@beshkenadze sorry that it's taken so long to get on this. It's hard to say what could be causing this or if it's related to #651. The error is being returned because either an MD5 or CRC32c validation check isn't passing. In other words, the data you've received isn't matching the data that's stored in the bucket. You can work around this by disabling validation:

bucket.file("reviews/reviews_com.sample.android_201409.csv").download({
  destination: './reviews.cvs',
  validation: false
},
function(err, content) {
  // No more mismatch error (hopefully)
});

If you want to do some debugging, I put up a branch you can swap out your gcloud dependency for. It just does a little console.log-ing to help see what's going on:

$ npm install --save stephenplusplus/gcloud-node#spp--654 |
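The same workaround also applies to the streaming API; a minimal sketch using today's @google-cloud/storage (the successor to gcloud-node), with placeholder bucket and object names:

// Sketch: stream a download with integrity validation disabled.
// Bucket and object names below are placeholders, not from this thread.
const { Storage } = require('@google-cloud/storage');
const fs = require('fs');

const bucket = new Storage().bucket('pubsite_prod_rev_example');
bucket.file('reviews/reviews_com.sample.android_201409.csv')
  .createReadStream({ validation: false }) // skip MD5/CRC32c verification
  .on('error', console.error)
  .pipe(fs.createWriteStream('./reviews.csv'));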
Hey @stephenplusplus, |
Not sure I understand the question @beshkenadze ... To give some background (sorry if you already know this, just adding for context):
I don't see anything that would indicate that we're looking at the uncompressed file, as we're treating the data as nothing more than bytes and ignoring the file type altogether. It could be possible that Play (when uploading the data) is somehow bypassing the part where they set the CRC32 and MD5 hash for the file (which would cause this error to happen on all Play-uploaded files). Could you tell us what you get back in the headers that start with x-goog-? /cc @stephenplusplus |
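For checking the stored hashes without a debug build, the object's metadata exposes them directly; a minimal sketch against the current @google-cloud/storage API, with placeholder names:

// Sketch: print the server-side hashes the client validates against.
// Bucket/object names are placeholders.
const { Storage } = require('@google-cloud/storage');

async function printStoredHashes() {
  const file = new Storage().bucket('my-bucket').file('reviews/report.csv');
  const [metadata] = await file.getMetadata();
  console.log('md5Hash:        ', metadata.md5Hash);
  console.log('crc32c:         ', metadata.crc32c);
  console.log('contentEncoding:', metadata.contentEncoding); // 'gzip' is the interesting case here
}

printStoredHashes().catch(console.error);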
Now I'll find the real file and try to show an example. |
That's what this will do:

$ npm install --save stephenplusplus/gcloud-node#spp--654 |
@stephenplusplus version ("version": "0.8.1") too old :) |
That branch (spp--654) is tracking master: https://github.com/stephenplusplus/gcloud-node/tree/spp--654 |
I used request-debug and got this:
{
response: {
debugId: 1,
headers: {
'x-guploader-uploadid': 'XXXX',
expires: 'Mon, 20 Jul 2015 13:40:35 GMT',
date: 'Mon, 20 Jul 2015 13:40:35 GMT',
'cache-control': 'private, max-age=0',
'last-modified': 'Sun, 19 Jul 2015 18:53:59 GMT',
etag: 'W/"XXXX"',
'x-goog-generation': '1437332039288000',
'x-goog-metageneration': '1',
'x-goog-stored-content-encoding': 'gzip',
'x-goog-stored-content-length': '5939',
'content-type': 'text/csv; charset=utf-16le',
'x-goog-hash': 'crc32c=66rJzQ==, md5=2T/NKanU9vTItoiF7+tMAA==',
'x-goog-storage-class': 'STANDARD',
vary: 'Accept-Encoding',
'content-length': '24148',
server: 'UploadServer',
'alternate-protocol': '443:quic,p=1',
connection: 'close'
},
statusCode: 200
}
} |
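For reference, a guess at how a dump like that gets wired up; it assumes the old request HTTP library (which gcloud-node depended on at the time), and that the instrumented instance is the one the library actually uses:

// Sketch: instrument the request module and log response metadata.
const request = require('request');
require('request-debug')(request, function (type, data) {
  // type is 'request', 'response', 'redirect' or 'auth'
  if (type === 'response') {
    console.log(JSON.stringify({ response: data }, null, 2));
  }
});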
Nice :) Using my branch will show the hashes that are being built locally as well. |
Headers: { 'x-guploader-uploadid': 'XXXX',
expires: 'Mon, 20 Jul 2015 13:46:35 GMT',
date: 'Mon, 20 Jul 2015 13:46:35 GMT',
'cache-control': 'private, max-age=0',
'last-modified': 'Sun, 19 Jul 2015 18:53:59 GMT',
etag: 'W/"XXXX"',
'x-goog-generation': '1437332039288000',
'x-goog-metageneration': '1',
'x-goog-stored-content-encoding': 'gzip',
'x-goog-stored-content-length': '5939',
'content-type': 'text/csv; charset=utf-16le',
'x-goog-hash': 'crc32c=66rJzQ==, md5=2T/NKanU9vTItoiF7+tMAA==',
'x-goog-storage-class': 'STANDARD',
vary: 'Accept-Encoding',
'content-length': '24148',
server: 'UploadServer',
'alternate-protocol': '443:quic,p=1',
connection: 'close' }
Local CRC32c Hash: Fw==
Local MD5 Hash: Hwt6cw9joXTy4EOtQqh0pg==
crypto.js:126
return this._handle.digest(outputEncoding);
^
Error: Not initialized
at Error (native) |
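The crypto.js crash at the end looks like a separate problem in the debug branch itself; presumably digest() is being called a second time on a hash that has already been finalized. A small demonstration (the exact message depends on the Node version):

// Demonstration: digest() can only be called once per Hash instance.
const crypto = require('crypto');

const hash = crypto.createHash('md5');
hash.update('some bytes');
console.log(hash.digest('base64')); // first call is fine
console.log(hash.digest('base64')); // second call throws: "Not initialized" on old Node,
                                    // "Digest already called" on current versions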
Those don't match even a little bit! Like you pointed out @beshkenadze, I think we're running into issues because the file is stored gzip-compressed (x-goog-stored-content-encoding: gzip), so the stored hashes describe the compressed bytes while we're hashing the decompressed data we receive.
|
Can we run a quick test of uploading a zipped file and seeing whether the verification fails? I'd feel much more confident if this was happening on a non-magic bucket (the one in question here is actually owned by Google Play, so it might be doing something wonky).
|
Yep, just confirmed it still happens. |
OK cool - so that's going to be tricky. It means anyone who downloads a file that's stored zipped will get a content mismatch error (when really, there is no mismatch).
|
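To make the failure mode concrete: the stored hashes describe the gzipped bytes, while the client hashes the decompressed bytes it receives, so they can never agree. A self-contained illustration using only local data:

// Illustration: hash the compressed vs. decompressed bytes of the same content.
// These always differ, which is what validation sees when an object is stored
// with Content-Encoding: gzip and served decompressed.
const crypto = require('crypto');
const zlib = require('zlib');

const original = Buffer.from('some,csv,data\n1,2,3\n');
const gzipped = zlib.gzipSync(original);

const md5 = (buf) => crypto.createHash('md5').update(buf).digest('base64');

console.log('MD5 of gzipped bytes (what the stored hash describes):', md5(gzipped));
console.log('MD5 of decompressed bytes (what the client receives): ', md5(original));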
I think idea no. 3 is probably the best way to handle this. I'll take a stab at it and PR soon. |
👍 |
Hi, |
@adielmil Could you post a new issue on https://github.com/googleapis/nodejs-storage? Be sure to fill out the issue template and provide sample code/files that we can use to reproduce. |
Started happening to us too (also with a
|
Just to throw another angle on this issue: I'm currently seeing this in the Firebase Storage emulator when developing locally. So far, not in production. |
@acSpock Hit the same problem using the Firebase emulator. What helped me was to use validation: false in the download options. |
Perhaps a slightly better approach would be...

bucket.file(filePath).download({
  validation: !process.env.FUNCTIONS_EMULATOR,
});

So then you keep the validation in prod. In addition, if you're using TypeScript, you can satisfy the type checking by following the suggestion in this SO post: https://stackoverflow.com/a/53981706/6506026 |
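For context, here is roughly how that flag fits into a Cloud Functions setup; a sketch assuming firebase-admin with a default storage bucket configured, and an illustrative helper name:

// Sketch: emulator-aware download inside a Cloud Functions codebase.
// downloadReport, filePath and destination are illustrative names.
const admin = require('firebase-admin');
admin.initializeApp(); // assumes a default storageBucket is configured

async function downloadReport(filePath, destination) {
  await admin.storage().bucket().file(filePath).download({
    destination,
    // FUNCTIONS_EMULATOR is set when running under the local emulator,
    // so validation stays enabled in production and is skipped locally.
    validation: !process.env.FUNCTIONS_EMULATOR,
  });
}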
@LB-Digital is |
@JarnoRFB yes, you need it for firebase-admin to know you're running the local emulator |
Got the same error as explained in this issue: googleapis/google-cloud-node#654 (comment)
I'm running into this error when using the emulator as well |
Hey,
Any file that I get from the bucket "pubsite_prod_rev_" gets the error code CONTENT_DOWNLOAD_MISMATCH.