MultiPart Upload #11588
Please provide a complete example - I don't really understand exactly what you're trying to achieve, and a concrete code example will make it a lot clearer.

(You can definitely overwrite the contents of an object later, but you can't overwrite them within that upload session. That's what the docs are trying to say.)
Download process:

```csharp
const int chunkSizeInBytes = 10 * 1024 * 1024;
ulong offset = 0;
while (offset < fileSize)
{
    int bytesRead;
    using (var downloadStream = new MemoryStream())
    {
        await _cloudStorage.Download(fileName, downloadStream, new DownloadObjectOptions
        {
            Range = new RangeHeaderValue((long?)offset, (long?)(offset + chunkSizeInBytes - 1))
        });
        bytesRead = (int)downloadStream.Length;
        var endOfPart = bytesRead < chunkSizeInBytes;
        // Rewind the stream before re-reading it for the upload.
        downloadStream.Position = 0;
        await _cloudStorage.UploadMultiPart(uploadPath, downloadStream, endOfPart);
    }
    offset += (ulong)bytesRead;
}
```

Upload process:

```csharp
public async Task UploadMultiPart(string uploadPath, Stream memoryStream, bool endOfPart)
{
    UploadObjectOptions options = new UploadObjectOptions
    {
        PredefinedAcl = PredefinedObjectAcl.PublicRead
    };
    if (!_sessions.TryGetValue(uploadPath, out Uri uploadUri))
    {
        ObjectsResource.InsertMediaUpload tempUploader = _storageClient.CreateObjectUploader(
            _bucketName, uploadPath, "application/octet-stream", memoryStream, options);
        uploadUri = await tempUploader.InitiateSessionAsync();
        _sessions.TryAdd(uploadPath, uploadUri);
    }
    IProgress<IUploadProgress> progress = new Progress<IUploadProgress>(
        p => Console.WriteLine($"bytes: {p.BytesSent}, status: {p.Status}"));
    ResumableUpload actualUploader = ResumableUpload.CreateFromUploadUri(uploadUri, memoryStream);
    actualUploader.ProgressChanged += progress.Report;
    await actualUploader.UploadAsync();
    if (endOfPart)
    {
        _ = _sessions.TryRemove(uploadPath, out _);
    }
}
```

For example, here I want to download a 200 MB file in 10 MB parts and upload each downloaded part. I cannot overwrite the file in this way.
It's still not clear to me what you want the result to be though, or why you want to upload it in parts at all. At the moment I think you're effectively trying to "not complete" the upload after each part, in which case this issue is just a duplicate of googleapis/google-api-dotnet-client#2480 - please could you check whether that describes what you're trying to do?
@jskeet I'm having a similar problem. I'm streaming large data-lake files (~1 GB each) from Amazon S3, doing some processing and filtering, and then writing them to Google Cloud Storage. Currently I need to store those files to disk before uploading them to Google Cloud. I think what is needed in the code is:

```csharp
content.Headers.ContentRange =
    isLastChunk
        ? new ContentRangeHeaderValue(totalWritten, totalWritten + chunk.Length, totalWritten + chunk.Length)
        : new ContentRangeHeaderValue(totalWritten, totalWritten + chunk.Length);
```
@pkese: "I think what is needed in the code" - which code, exactly? (I very much doubt that implementing this is just a single statement change.) Please note the final comment in googleapis/google-api-dotnet-client#2480 - we'd like to get to this at some point, but it's not high on our priority list at the moment. I'm going to close this issue as I believe it's a duplicate of the linked one, and I'd really prefer to avoid multiple issues getting separate comment threads. If you believe it's not a duplicate of that, please let me know and I can reopen this one.
Hello, everyone,
I want to use your library to upload large files without having to write them to my disk. For example, I have a 1 GB bucket. I want to download files from this bucket in 10 MB chunks, write them to a stream, and then upload them back in zip format.
I tried this with CreateObjectUploader, but it won't let me overwrite the file. When the upload is done, I only have a 10 MB file in the bucket.
https://cloud.google.com/storage/docs/performing-resumable-uploads
The document says this is not possible, but I wanted to ask in case there is a different method. Is there a way to make an upload continue from a certain point, like the Range option in DownloadObjectOptions?
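The resumable-upload protocol described in the linked docs can be driven by hand over HTTP, which is one workaround while the client library lacks this feature: initiate a session, PUT each chunk with a `Content-Range` header, and state the total size on the last chunk. A rough sketch under stated assumptions (Python for brevity; the session URI comes from initiating a resumable session as per the docs; every chunk except the last must be a multiple of 256 KiB; error handling is minimal):

```python
import io
import urllib.request
from urllib.error import HTTPError

CHUNK = 10 * 1024 * 1024  # 10 MiB, a multiple of 256 KiB as the protocol requires

def chunk_header(offset: int, chunk_len: int, last: bool) -> str:
    """Content-Range for one chunk; the total stays '*' until the last chunk."""
    end = offset + chunk_len - 1
    total = str(offset + chunk_len) if last else "*"
    return f"bytes {offset}-{end}/{total}"

def put(session_uri: str, body: bytes, content_range: str) -> int:
    req = urllib.request.Request(session_uri, data=body, method="PUT",
                                 headers={"Content-Range": content_range})
    try:
        return urllib.request.urlopen(req).status
    except HTTPError as e:
        return e.code  # 308 "Resume Incomplete" arrives as an HTTPError

def upload_in_chunks(session_uri: str, stream: io.BufferedIOBase) -> None:
    offset = 0
    while True:
        chunk = stream.read(CHUNK)
        if not chunk:
            # Size was an exact multiple of CHUNK: finalize by stating the total.
            put(session_uri, b"", f"bytes */{offset}")
            return
        last = len(chunk) < CHUNK
        status = put(session_uri, chunk, chunk_header(offset, len(chunk), last))
        if status in (200, 201):   # object fully written
            return
        if status != 308:          # 308: server stored the chunk, wants more
            raise RuntimeError(f"upload failed with HTTP {status}")
        offset += len(chunk)
```

This keeps only one chunk in memory at a time, so a 1 GB object never touches the local disk; the trade-off is managing the session URI and chunk bookkeeping yourself rather than relying on the library's `ResumableUpload`.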