Remove doc references to Gcloud::Backoff #801

Merged · 1 commit · Jul 22, 2016

2 changes: 1 addition & 1 deletion lib/gcloud.rb
@@ -222,7 +222,7 @@ def pubsub scope: nil, retries: nil, timeout: nil
# * `https://www.googleapis.com/auth/bigquery`
# @param [Integer] retries Number of times to retry requests on server error.
# The default value is `3`. Optional.
-# @param [Integer] timeout Default timeout to use in requests. Optional.
+# @param [Integer] timeout Default request timeout in seconds. Optional.
#
# @return [Gcloud::Bigquery::Project]
#
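
For reference, a minimal sketch of how the `retries` and `timeout` options documented above are passed when constructing a service object (values are illustrative):

```ruby
require "gcloud"

gcloud = Gcloud.new
# Retry failed requests up to 5 times; time out each request after 60 seconds.
bigquery = gcloud.bigquery retries: 5, timeout: 60
```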

23 changes: 11 additions & 12 deletions lib/gcloud/bigquery.rb
@@ -369,24 +369,23 @@ def self.bigquery project = nil, keyfile = nil, scope: nil, retries: nil,
# BigQuery](https://cloud.google.com/bigquery/exporting-data-from-bigquery)
# for details.
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# bigquery = gcloud.bigquery retries: 10, timeout: 120
# ```
#
# See the [BigQuery error

23 changes: 11 additions & 12 deletions lib/gcloud/datastore.rb
@@ -519,24 +519,23 @@ def self.datastore project = nil, keyfile = nil, scope: nil, retries: nil,
# end
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# datastore = gcloud.datastore retries: 10, timeout: 120
# ```
#
# See the [Datastore error

23 changes: 11 additions & 12 deletions lib/gcloud/dns.rb
@@ -312,24 +312,23 @@ def self.dns project = nil, keyfile = nil, scope: nil, retries: nil,
# zone.export "path/to/db.example.com"
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# dns = gcloud.dns retries: 10, timeout: 120
# ```
#
module Dns

23 changes: 11 additions & 12 deletions lib/gcloud/logging.rb
@@ -314,24 +314,23 @@ def self.logging project = nil, keyfile = nil, scope: nil, retries: nil,
# logger.info "Job started."
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# logging = gcloud.logging retries: 10, timeout: 120
# ```
#
module Logging

23 changes: 11 additions & 12 deletions lib/gcloud/pubsub.rb
@@ -395,24 +395,23 @@ def self.pubsub project = nil, keyfile = nil, scope: nil, retries: nil,
# end
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# pubsub = gcloud.pubsub retries: 10, timeout: 120
# ```
#
# See the [Pub/Sub error codes](https://cloud.google.com/pubsub/error-codes)

23 changes: 11 additions & 12 deletions lib/gcloud/resource_manager.rb
@@ -207,24 +207,23 @@ def self.resource_manager keyfile = nil, scope: nil, retries: nil,
# resource_manager.undelete "tokyo-rain-123"
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# resource_manager = gcloud.resource_manager retries: 10, timeout: 120
# ```
#
# See the [Resource Manager error

23 changes: 11 additions & 12 deletions lib/gcloud/storage.rb
@@ -427,24 +427,23 @@ def self.storage project = nil, keyfile = nil, scope: nil, retries: nil,
# file.acl.public!
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# storage = gcloud.storage retries: 10, timeout: 120
# ```
#
# See the [Storage status and error

2 changes: 1 addition & 1 deletion lib/gcloud/storage/bucket.rb
@@ -327,7 +327,7 @@ def update
# The bucket must be empty before it can be deleted.
#
# The API call to delete the bucket may be retried under certain
-# conditions. See {Gcloud::Backoff} to control this behavior.
+# conditions. See {Gcloud#storage} to control this behavior.
#
# @return [Boolean] Returns `true` if the bucket was deleted.
#

2 changes: 1 addition & 1 deletion lib/gcloud/storage/project.rb
@@ -167,7 +167,7 @@ def bucket bucket_name
# bucket. See {Bucket::Cors} for details.
#
# The API call to create the bucket may be retried under certain
-# conditions. See {Gcloud::Backoff} to control this behavior.
+# conditions. See {Gcloud#storage} to control this behavior.
#
# You can pass [website
# settings](https://cloud.google.com/storage/docs/website-configuration)
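
Similarly, a minimal sketch of creating a bucket with retries configured via {Gcloud#storage} (assuming the gem's `create_bucket` method; the bucket name is hypothetical):

```ruby
require "gcloud"

gcloud = Gcloud.new
storage = gcloud.storage retries: 5, timeout: 120
# The create request may be retried on transient errors, per the retries setting.
bucket = storage.create_bucket "my-new-bucket"
```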

23 changes: 11 additions & 12 deletions lib/gcloud/translate.rb
@@ -242,24 +242,23 @@ def self.translate key = nil, retries: nil, timeout: nil
# languages[0].name #=> "Afrikaans"
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# translate = gcloud.translate retries: 10, timeout: 120
# ```
#
module Translate

23 changes: 11 additions & 12 deletions lib/gcloud/vision.rb
@@ -249,24 +249,23 @@ def self.vision project = nil, keyfile = nil, scope: nil, retries: nil,
# annotation = vision.annotate image, faces: 5
# ```
#
-# ## Configuring Backoff
+# ## Configuring retries and timeout
#
-# The {Gcloud::Backoff} class allows users to globally configure how Cloud API
-# requests are automatically retried in the case of some errors, such as a
-# `500` or `503` status code, or a specific internal error code such as
-# `rateLimitExceeded`.
+# You can configure how many times API requests may be automatically retried.
+# When an API request fails, the response will be inspected to see if the
+# request meets criteria indicating that it may succeed on retry, such as
+# `500` and `503` status codes or a specific internal error code such as
+# `rateLimitExceeded`. If it meets the criteria, the request will be retried
+# after a delay. If another error occurs, the delay will be increased before a
+# subsequent attempt, until the `retries` limit is reached.
#
-# If an API call fails, the response will be inspected to see if the call
-# should be retried. If the response matches the criteria, then the request
-# will be retried after a delay. If another error occurs, the delay will be
-# increased incrementally before a subsequent attempt. The first retry will be
-# delayed one second, the second retry two seconds, and so on.
+# You can also set the request `timeout` value in seconds.
#
# ```ruby
# require "gcloud"
-# require "gcloud/backoff"
#
-# Gcloud::Backoff.retries = 5 # Raise the maximum number of retries from 3
+# gcloud = Gcloud.new
+# vision = gcloud.vision retries: 10, timeout: 120
# ```
#
module Vision