Allow custom endpoints with Google Cloud Storage. #648

Merged
merged 1 commit into from
Nov 21, 2019
Merged
Show file tree
Hide file tree
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension


Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
1 change: 1 addition & 0 deletions AUTHORS
@@ -37,6 +37,7 @@ By order of apparition, thanks:
 * Jumpei Yoshimura (S3 docs)
 * Jon Dufresne
 * Rodrigo Gadea (Dropbox fixes)
+* Martey Dodoo



5 changes: 5 additions & 0 deletions docs/backends/gcloud.rst
@@ -149,6 +149,11 @@ must fit in memory. Recommended if you are going to be uploading large files.

Sets Cache-Control HTTP header for the file, more about HTTP caching can be found `here <https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching#cache-control>`_

+``GS_CUSTOM_ENDPOINT`` (optional: default is ``None``)
+
+Sets a `custom endpoint <https://cloud.google.com/storage/docs/request-endpoints>`_
+that will be used instead of ``https://storage.googleapis.com`` when generating URLs for files.

``GS_LOCATION`` (optional: default is ``''``)

Subdirectory in which the files will be stored.
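
A minimal sketch of how the new GS_CUSTOM_ENDPOINT setting might sit alongside the rest of the backend configuration; the bucket name and endpoint below are placeholders, not values from this pull request:

# settings.py -- illustrative values only
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'

GS_BUCKET_NAME = 'my-media-bucket'                # hypothetical bucket name
GS_CUSTOM_ENDPOINT = 'https://media.example.com'  # used instead of https://storage.googleapis.com

With a publicRead default ACL, url() then builds https://media.example.com/<object-name> directly; for non-public files the signed URL is generated against the same endpoint.
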
2 changes: 1 addition & 1 deletion setup.py
@@ -19,7 +19,7 @@ def read(filename):
'boto': ['boto>=2.32.0'],
'boto3': ['boto3>=1.4.4'],
'dropbox': ['dropbox>=7.2.1'],
-'google': ['google-cloud-storage>=0.22.0'],
+'google': ['google-cloud-storage>=1.15.0'],

Owner:

Is this change required?

Contributor Author:

The code in this pull request relies on the fact that Blob.generate_signed_url has an api_access_endpoint parameter (introduced in googleapis/google-cloud-python#7460 and google-cloud-storage 1.15.0).

It would theoretically be possible to keep the code compatible with google-cloud-storage 0.22.0, but given the substantial differences between google-cloud-storage before and after 1.15.0, doing so would take considerable work and significantly increase the complexity of this pull request.
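
For illustration, a minimal sketch of the call this change relies on; the bucket and object names are placeholders, and the api_access_endpoint keyword is only accepted by google-cloud-storage 1.15.0 and later:

# Sketch only: assumes service-account credentials and an existing bucket.
from datetime import timedelta
from google.cloud.storage import Client

blob = Client().bucket('my-media-bucket').blob('reports/2019.pdf')
signed = blob.generate_signed_url(
    expiration=timedelta(hours=1),
    api_access_endpoint='https://media.example.com',  # not available before 1.15.0
)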

Owner:

That change was merged in April. I'm not sure if I can unilaterally require that upgrade.

Is the major difference that non-public URLs wouldn't be able to be signed?

'libcloud': ['apache-libcloud'],
'sftp': ['paramiko'],
},
17 changes: 15 additions & 2 deletions storages/backends/gcloud.py
@@ -16,6 +16,7 @@

try:
    from google.cloud.storage import Blob, Client
+   from google.cloud.storage.blob import _quote
    from google.cloud.exceptions import Conflict, NotFound
except ImportError:
    raise ImproperlyConfigured("Could not load Google Cloud Storage bindings.\n"
@@ -88,6 +89,7 @@ class GoogleCloudStorage(Storage):
    project_id = setting('GS_PROJECT_ID')
    credentials = setting('GS_CREDENTIALS')
    bucket_name = setting('GS_BUCKET_NAME')
+   custom_endpoint = setting('GS_CUSTOM_ENDPOINT', None)
    location = setting('GS_LOCATION', '')
    auto_create_bucket = setting('GS_AUTO_CREATE_BUCKET', False)
    auto_create_acl = setting('GS_AUTO_CREATE_ACL', 'projectPrivate')
@@ -263,9 +265,20 @@ def url(self, name):
        name = self._normalize_name(clean_name(name))
        blob = self.bucket.blob(self._encode_name(name))

-       if self.default_acl == 'publicRead':
+       if not self.custom_endpoint and self.default_acl == 'publicRead':
            return blob.public_url
-       return blob.generate_signed_url(self.expiration)
+       elif self.default_acl == 'publicRead':
+           return '{storage_base_url}/{quoted_name}'.format(
+               storage_base_url=self.custom_endpoint,
+               quoted_name=_quote(name, safe=b"/~"),
+           )
+       elif not self.custom_endpoint:
+           return blob.generate_signed_url(self.expiration)
+       else:
+           return blob.generate_signed_url(
+               expiration=self.expiration,
+               api_access_endpoint=self.custom_endpoint,
+           )

    def get_available_name(self, name, max_length=None):
        name = clean_name(name)
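
Taken together, a rough usage sketch of what url() returns once the setting is in place; it assumes a Django project already configured with GS_BUCKET_NAME, valid service-account credentials, and the placeholder endpoint from the settings example above:

# Usage sketch only -- file names and URLs are illustrative.
from storages.backends.gcloud import GoogleCloudStorage

storage = GoogleCloudStorage()          # picks up GS_CUSTOM_ENDPOINT from settings

storage.default_acl = 'publicRead'
storage.url('reports/2019.pdf')         # 'https://media.example.com/reports/2019.pdf'

storage.default_acl = 'projectPrivate'
storage.url('reports/2019.pdf')         # signed URL generated against the custom endpoint
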
16 changes: 16 additions & 0 deletions tests/test_gcloud.py
@@ -372,6 +372,22 @@ def test_url_not_public_file_with_custom_expires(self):
        self.assertEqual(url, 'http://signed_url')
        blob.generate_signed_url.assert_called_with(timedelta(seconds=3600))

+   def test_custom_endpoint(self):
+       self.storage.custom_endpoint = "https://example.com"
+
+       self.storage.default_acl = 'publicRead'
+       url = "{}/{}".format(self.storage.custom_endpoint, self.filename)
+       self.assertEqual(self.storage.url(self.filename), url)
+
+       signed_url = 'https://signed_url'
+       self.storage.default_acl = 'projectPrivate'
+       self.storage._bucket = mock.MagicMock()
+       blob = mock.MagicMock()
+       generate_signed_url = mock.MagicMock(return_value=signed_url)
+       blob.generate_signed_url = generate_signed_url
+       self.storage._bucket.blob.return_value = blob
+       self.assertEqual(self.storage.url(self.filename), signed_url)

    def test_get_available_name(self):
        self.storage.file_overwrite = True
        self.assertEqual(self.storage.get_available_name(