Do not check for S3 blob to exist before writing (#31128)
In #19749 an extra check was added before writing each blob to ensure that we would not be overwriting an existing blob. Due to S3's weak consistency model, this check was only best effort. To make matters worse, it resulted in a HEAD request being issued before every PUT, including when PUTting a brand-new object. The approach taken in #19749 thus worsened our consistency guarantees for follow-up snapshot actions, as it made it less likely that newly written files would be visible to subsequent reads.

This commit therefore removes the extra check. Given S3's weak consistency model, the check was a best-effort measure anyway, and there is currently no way to reliably prevent accidental overwrites on S3.
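
For context, a minimal sketch of the check-then-write pattern this commit removes, assuming the blobExists check maps onto the AWS SDK's doesObjectExist (a HEAD request) and the write onto putObject; the class, bucket name, and method names below are illustrative, not the repository's actual code.

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.FileAlreadyExistsException;

// Illustrative only: shows why check-then-write is both costly and unsafe on S3.
public class CheckThenWriteSketch {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    private final String bucket = "snapshot-bucket"; // hypothetical bucket name

    // Old behaviour: HEAD, then PUT. The HEAD adds a round trip to every write, and
    // another writer can still PUT the same key between the two calls (check-then-act
    // race), so the FileAlreadyExistsException gives no real overwrite protection.
    void writeBlobWithCheck(String key, InputStream data, long size) throws IOException {
        if (s3.doesObjectExist(bucket, key)) {            // HEAD request
            throw new FileAlreadyExistsException("Blob [" + key + "] already exists, cannot overwrite");
        }
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(size);
        s3.putObject(bucket, key, data, metadata);        // PUT request
    }

    // New behaviour: a single PUT, which is what this commit leaves in place.
    void writeBlob(String key, InputStream data, long size) {
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(size);
        s3.putObject(bucket, key, data, metadata);
    }
}

Dropping the HEAD halves the number of requests per blob write. It also matters for consistency: at the time, S3 only guaranteed read-after-write consistency for a new object if no HEAD or GET had been issued for that key before it existed, so the extra HEAD actively downgraded the consistency of subsequent reads, which is what the message above refers to.
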
ywelsch committed Jun 6, 2018
1 parent 0b89edd commit 29a4406
Showing 2 changed files with 5 additions and 4 deletions.
@@ -96,10 +96,6 @@ public InputStream readBlob(String blobName) throws IOException {
 
     @Override
     public void writeBlob(String blobName, InputStream inputStream, long blobSize) throws IOException {
-        if (blobExists(blobName)) {
-            throw new FileAlreadyExistsException("Blob [" + blobName + "] already exists, cannot overwrite");
-        }
-
         SocketAccess.doPrivilegedIOException(() -> {
             if (blobSize <= blobStore.bufferSizeInBytes()) {
                 executeSingleUpload(blobStore, buildKey(blobName), inputStream, blobSize);
@@ -96,6 +96,11 @@ protected BlobStore newBlobStore() throws IOException {
             new ByteSizeValue(10, ByteSizeUnit.MB), "public-read-write", "standard");
     }
 
+    @Override
+    public void testVerifyOverwriteFails() {
+        assumeFalse("not implemented because of S3's weak consistency model", true);
+    }
+
     public void testExecuteSingleUploadBlobSizeTooLarge() throws IOException {
         final long blobSize = ByteSizeUnit.GB.toBytes(randomIntBetween(6, 10));
         final S3BlobStore blobStore = mock(S3BlobStore.class);
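
The test change above uses the common JUnit pattern of overriding an inherited contract test and skipping it with an assumption, so the suite reports it as skipped (with a reason) rather than failed. A minimal sketch of that pattern under hypothetical class names (this is not Elasticsearch's actual test hierarchy):

// File: BlobStoreContractTests.java (hypothetical)
import org.junit.Test;

// Base class defining a contract test that every blob store implementation inherits.
public abstract class BlobStoreContractTests {
    @Test
    public void testVerifyOverwriteFails() {
        // ... write a blob, then expect a second write to the same name to be rejected ...
    }
}

// File: S3BlobStoreContractTests.java (hypothetical)
import org.junit.Test;
import static org.junit.Assume.assumeFalse;

// S3-backed subclass: the overwrite contract cannot be honoured, so the inherited
// test is turned into a skipped test with an explanatory message.
public class S3BlobStoreContractTests extends BlobStoreContractTests {
    @Override
    @Test
    public void testVerifyOverwriteFails() {
        assumeFalse("not implemented because of S3's weak consistency model", true);
    }
}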
