docs: add clarification to batch module #1045

Merged · 5 commits · Jun 6, 2023
22 changes: 21 additions & 1 deletion google/cloud/storage/batch.py
@@ -13,7 +13,21 @@
# limitations under the License.
"""Batch updates / deletes of storage buckets / blobs.

See https://cloud.google.com/storage/docs/json_api/v1/how-tos/batch
A batch request is a single standard HTTP request containing multiple Cloud Storage JSON API calls.
The main request contains multiple parts; the body of each part is itself a complete HTTP request, with its own verb, URL, headers, and body.

Note that Cloud Storage does not support batch operations for uploading or downloading.
Additionally, the current batch design does not support library methods whose return values
depend on the response payload. For more details, see the [Sending Batch Requests official guide](https://cloud.google.com/storage/docs/batch).

Examples of methods that make sense to call within a batch context:
``blob.patch()``
``blob.update()``
``blob.delete()``
``bucket.delete_blob()``
``bucket.patch()``
``bucket.update()``
"""
from email.encoders import encode_noop
from email.generator import Generator
@@ -131,6 +145,12 @@ def content(self):
class Batch(Connection):
"""Proxy an underlying connection, batching up change operations.

.. warning::

Cloud Storage does not support batch operations for uploading or downloading.
Additionally, the current batch design does not support library methods whose
return values depend on the response payload.

:type client: :class:`google.cloud.storage.client.Client`
:param client: The client to use for making connections.

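
As a rough illustration of the batching pattern described in the docstring above, here is a minimal sketch using the public ``Client.batch()`` context manager; the bucket and object names are placeholders and are not part of this change.

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")  # placeholder bucket name

# Requests issued inside the context manager are buffered and sent as one
# multipart/mixed HTTP request when the block exits.
with client.batch():
    for name in ["logs/a.txt", "logs/b.txt", "logs/c.txt"]:
        bucket.delete_blob(name)  # one of the deferrable operations listed above
```
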
4 changes: 4 additions & 0 deletions google/cloud/storage/blob.py
@@ -3450,6 +3450,10 @@ def rewrite(
If :attr:`user_project` is set on the bucket, bills the API request
to that project.

.. note::

``rewrite`` is not supported in a ``Batch`` context.

:type source: :class:`Blob`
:param source: blob whose contents will be rewritten into this blob.

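
To illustrate why ``rewrite`` is excluded from batching: each call returns a rewrite token taken from the response payload, which the caller must pass back until the copy completes, so the call cannot simply be deferred. A minimal sketch outside any batch, with placeholder bucket and object names:

```python
from google.cloud import storage

client = storage.Client()
source_bucket = client.bucket("source-bucket")            # placeholder names
destination_bucket = client.bucket("destination-bucket")

source_blob = source_bucket.blob("data/large-object")
destination_blob = destination_bucket.blob("data/large-object-copy")

# rewrite() returns (token, bytes_rewritten, total_bytes); keep calling with
# the token until it comes back as None.
token, bytes_rewritten, total_bytes = destination_blob.rewrite(source_blob)
while token is not None:
    token, bytes_rewritten, total_bytes = destination_blob.rewrite(
        source_blob, token=token
    )
```
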
13 changes: 10 additions & 3 deletions google/cloud/storage/bucket.py
@@ -1482,7 +1482,8 @@ def delete(
If ``force=True`` and the bucket contains more than 256 objects / blobs
this will cowardly refuse to delete the objects (or the bucket). This
is to prevent accidental bucket deletion and to prevent extremely long
runtime of this method.
runtime of this method. Also note that ``force=True`` is not supported
in a ``Batch`` context.

If :attr:`user_project` is set, bills the API request to that project.

@@ -1675,6 +1676,7 @@ def delete_blobs(
Called once for each blob raising
:class:`~google.cloud.exceptions.NotFound`;
otherwise, the exception is propagated.
Note that ``on_error`` is not supported in a ``Batch`` context.

:type client: :class:`~google.cloud.storage.client.Client`
:param client: (Optional) The client to use. If not passed, falls back
@@ -1801,6 +1803,8 @@ def copy_blob(
:param preserve_acl: DEPRECATED. This argument is not functional!
(Optional) Copies ACL from old blob to new blob.
Default: True.
Note that ``preserve_acl`` is not supported in a
``Batch`` context.

:type source_generation: long
:param source_generation: (Optional) The generation of the blob to be
@@ -1932,8 +1936,11 @@ def rename_blob(
old blob. This means that with very large objects renaming
could be a very (temporarily) costly or a very slow operation.
If you need more control over the copy and deletion, instead
use `google.cloud.storage.blob.Blob.copy_to` and
`google.cloud.storage.blob.Blob.delete` directly.
use ``google.cloud.storage.blob.Blob.copy_to`` and
``google.cloud.storage.blob.Blob.delete`` directly.

Also note that this method is not fully supported in a
``Batch`` context.

:type blob: :class:`google.cloud.storage.blob.Blob`
:param blob: The blob to be renamed.
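
As a companion to the ``on_error`` note in the ``delete_blobs`` hunk above, here is a minimal sketch of the callback used outside a batch context; the bucket name, prefix, and callback are placeholders.

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")  # placeholder bucket name


def log_missing(blob):
    # Invoked once for each blob that raises NotFound; other errors propagate.
    print(f"already deleted: {blob.name}")


blobs = list(client.list_blobs(bucket, prefix="tmp/"))  # placeholder prefix
bucket.delete_blobs(blobs, on_error=log_missing)
```
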
9 changes: 8 additions & 1 deletion samples/snippets/storage_batch_request.py
@@ -28,7 +28,14 @@


def batch_request(bucket_name, prefix=None):
"""Use a batch request to patch a list of objects with the given prefix in a bucket."""
"""
Use a batch request to patch a list of objects with the given prefix in a bucket.

Note that Cloud Storage does not support batch operations for uploading or downloading.
Additionally, the current batch design does not support library methods whose return values
depend on the response payload.
See https://cloud.google.com/python/docs/reference/storage/latest/google.cloud.storage.batch
"""
# The ID of your GCS bucket
# bucket_name = "my-bucket"
# The prefix of the object paths
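
For readers who do not open the sample itself, a minimal sketch of the pattern ``storage_batch_request.py`` demonstrates, patching object metadata inside one batch; the metadata key and value are placeholders.

```python
from google.cloud import storage


def batch_request(bucket_name, prefix=None):
    """Patch objects under ``prefix`` using a single batch request."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blobs = list(client.list_blobs(bucket, prefix=prefix))

    # Every patch() issued inside the context manager is deferred and sent
    # together as one multipart request when the block exits.
    with client.batch():
        for blob in blobs:
            blob.metadata = {"your-metadata-key": "your-metadata-value"}
            blob.patch()
```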