PR #17971: [keras/legacy_tf_layers] Standardise docstring usage of "Default to"

Imported from GitHub PR #17971

This is one of many PRs. Discussion + request to split into multiple PRs @ #17748
Copybara import of the project:

--
5331dac by Samuel Marks <[email protected]>:

[keras/legacy_tf_layers/base.py,keras/legacy_tf_layers/migration_utils.py,keras/legacy_tf_layers/variable_scope_shim.py] Standardise docstring usage of "Default to"

--
1801651 by Samuel Marks <[email protected]>:

[keras/legacy_tf_layers/migration_utils.py] Move docstring from method to class and document `seed`

Merging this change closes #17971

FUTURE_COPYBARA_INTEGRATE_REVIEW=#17971 from SamuelMarks:keras.legacy_tf_layers-defaults-to 1801651
PiperOrigin-RevId: 535274600
SamuelMarks authored and tensorflower-gardener committed May 31, 2023
1 parent 2722812 commit 86fa511
Showing 3 changed files with 15 additions and 11 deletions.
4 changes: 2 additions & 2 deletions keras/legacy_tf_layers/base.py
@@ -365,8 +365,8 @@ def add_weight(
         or "non_trainable_variables" (e.g. BatchNorm mean, stddev).
         Note, if the current variable scope is marked as non-trainable
         then this parameter is ignored and any added variables are also
-        marked as non-trainable. `trainable` defaults to `True` unless
-        `synchronization` is set to `ON_READ`.
+        marked as non-trainable. `trainable` becomes `True` unless
+        `synchronization` is set to `ON_READ`. Defaults to `True`.
       constraint: constraint instance (callable).
       use_resource: Whether to use `ResourceVariable`.
       synchronization: Indicates when a distributed variable will be
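The `trainable`/`synchronization` interaction described in the reworded docstring above can be sketched in plain Python. This is a minimal sketch, not Keras's implementation: the `VariableSynchronization` enum below only mirrors the relevant members of `tf.VariableSynchronization`, and `resolve_trainable` is a hypothetical helper name introduced for illustration.

```python
from enum import Enum


class VariableSynchronization(Enum):
    # Mirrors the relevant members of tf.VariableSynchronization.
    AUTO = 0
    ON_READ = 3


def resolve_trainable(trainable=None,
                      synchronization=VariableSynchronization.AUTO):
    """Resolve the effective `trainable` flag as the docstring describes.

    `trainable` defaults to `True` unless `synchronization` is set to
    `ON_READ`, in which case the variable is forced to be non-trainable.
    """
    if synchronization == VariableSynchronization.ON_READ:
        if trainable:
            # An explicitly trainable ON_READ variable is contradictory.
            raise ValueError(
                "Synchronization value can be set to ON_READ only for "
                "non-trainable variables."
            )
        return False
    return True if trainable is None else trainable
```

The point of the wording change is visible here: `True` is only a *default*; an `ON_READ` synchronization overrides it.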
7 changes: 5 additions & 2 deletions keras/legacy_tf_layers/migration_utils.py
@@ -43,11 +43,14 @@ class DeterministicRandomTestTool(object):
     This applies both to the stateful random operations used for creating and
     initializing variables, and to the stateful random operations used in
     computation (such as for dropout layers).
+
+    Args:
+      mode: Set mode to 'constant' or 'num_random_ops'. Defaults to
+        'constant'.
+      seed: The random seed to use.
     """

     def __init__(self, seed: int = 42, mode="constant"):
-        """Set mode to 'constant' or 'num_random_ops'. Defaults to
-        'constant'."""
         if mode not in {"constant", "num_random_ops"}:
             raise ValueError(
                 "Mode arg must be 'constant' or 'num_random_ops'. "
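The constructor contract that this hunk moves onto the class docstring can be sketched without TensorFlow. A minimal sketch only: the mode validation matches the diff above, while the `operation_seed` attribute and the tail of the truncated error message are assumptions added for illustration.

```python
class DeterministicRandomTestTool:
    """Skeleton of the migration tool's constructor contract.

    Args:
      mode: Set mode to 'constant' or 'num_random_ops'. Defaults to
        'constant'.
      seed: The random seed to use.
    """

    def __init__(self, seed: int = 42, mode: str = "constant"):
        if mode not in {"constant", "num_random_ops"}:
            # The message tail after the period is an assumption; the
            # diff above truncates the original string.
            raise ValueError(
                "Mode arg must be 'constant' or 'num_random_ops'. "
                f"Got: {mode}"
            )
        self.seed = seed
        self.mode = mode
        # Assumed attribute: a per-operation counter used in
        # 'num_random_ops' mode to vary seeds between random ops.
        self.operation_seed = 0
```

Documenting `mode` and `seed` on the class (rather than on `__init__`) keeps both parameters visible in the rendered class docs, which is what the second commit in this PR does.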
15 changes: 8 additions & 7 deletions keras/legacy_tf_layers/variable_scope_shim.py
@@ -215,7 +215,7 @@ def get_variable(
     Args:
       name: The name of the new or existing variable.
       shape: Shape of the new or existing variable.
-      dtype: Type of the new or existing variable (defaults to `DT_FLOAT`).
+      dtype: Type of the new or existing variable. Defaults to `DT_FLOAT`.
       initializer: Initializer for the variable.
       regularizer: A (Tensor -> Tensor or None) function; the result of
         applying it on a newly created variable will be added to the
@@ -226,16 +226,16 @@
         always forced to be False.
       trainable: If `True` also add the variable to the graph collection
         `GraphKeys.TRAINABLE_VARIABLES` (see `tf.Variable`). `trainable`
-        defaults to `True`, unless `synchronization` is set to `ON_READ`, in
-        which case it defaults to `False`.
+        becomes `True`, unless `synchronization` is set to `ON_READ`, in
+        which case it becomes `False`. Defaults to `True`.
       collections: List of graph collections keys to add the `Variable` to.
         Defaults to `[GraphKeys.GLOBAL_VARIABLES]` (see `tf.Variable`).
       caching_device: Optional device string or function describing where
-        the Variable should be cached for reading. Defaults to the
+        the Variable should be cached for reading. `None` to use the
         Variable's device. If not `None`, caches on another device.
         Typical use is to cache on the device where the Ops using the
         `Variable` reside, to deduplicate copying through `Switch` and other
-        conditional statements.
+        conditional statements. Defaults to `None`.
       partitioner: Optional callable that accepts a fully defined
         `TensorShape` and dtype of the `Variable` to be created, and returns
         a list of partitions for each axis (currently only one axis can be
@@ -245,8 +245,9 @@
         initial_value must be known.
       use_resource: If False, creates a regular Variable. If True, creates
         instead an experimental ResourceVariable which has well-defined
-        semantics. Defaults to False (will later change to True). When eager
-        execution is enabled this argument is always forced to be true.
+        semantics. When starting off as False it will later change to True.
+        When eager execution is enabled this argument is always True.
+        Defaults to `False`.
       custom_getter: Callable that takes as a first argument the true
         getter, and allows overwriting the internal get_variable method. The
         signature of `custom_getter` should match that of this method, but
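The convention this PR standardises on is visible across all three hunks: describe the parameter's behaviour first, then close with a trailing `Defaults to <value>.` sentence. That shape is mechanically checkable. A small sketch, assuming a hypothetical `trailing_defaults_to` helper and a stub docstring; neither is part of Keras.

```python
import re


def trailing_defaults_to(docstring: str) -> list:
    """Return the 'Defaults to ...' sentences found in a docstring.

    The standardised form ends each argument description with a
    trailing 'Defaults to <value>.' sentence, rather than embedding
    the default mid-sentence like '(defaults to `DT_FLOAT`)'.
    """
    return re.findall(r"Defaults to [^.]+\.", docstring)


def get_variable_doc_stub(dtype=None):
    """Stub illustrating the convention (not the real get_variable).

    Args:
      dtype: Type of the new or existing variable. Defaults to `DT_FLOAT`.
    """
```

Keeping the default in a uniform trailing sentence makes it easy to grep for, and to lint, across a large codebase.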
