Commit
Merge pull request #18226 from VaishnaviMudaliar:master
PiperOrigin-RevId: 540968715
tensorflower-gardener committed Jun 16, 2023
2 parents dc46601 + bedd434 commit 8e63b2f
Showing 1 changed file (README.md) with 2 additions and 2 deletions.
@@ -87,7 +87,7 @@ model.compile(loss='categorical_crossentropy',
 ```
 
 If you need to, you can further configure your optimizer. The Keras philosophy is to keep simple things simple,
-while allowing the user to be fully in control when they need to (the ultimate control being the easy extensibility of the source code via subclassing).
+while allowing the user to be fully in control when they need to be (the ultimate control being the easy extensibility of the source code via subclassing).
 
 ```python
 model.compile(loss=tf.keras.losses.categorical_crossentropy,
@@ -121,7 +121,7 @@ Keras follows the principle of **progressive disclosure of complexity**: it make
 yet it makes it possible to handle arbitrarily advanced use cases,
 only requiring incremental learning at each step.
 
-In much the same way that you were able to train & evaluate a simple neural network above in a few lines,
+In pretty much the same way that you were able to train & evaluate a simple neural network above in a few lines,
 you can use Keras to quickly develop new training procedures or exotic model architectures.
 Here's a low-level training loop example, combining Keras functionality with the TensorFlow `GradientTape`:
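The README's actual `GradientTape` training-loop example is collapsed in this diff view. A minimal sketch of such a loop, assuming an illustrative toy model and random data (not the README's elided code), might look like:

```python
import tensorflow as tf

# Illustrative model, optimizer, and loss; the README's real example differs.
model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# One batch of random data standing in for a real dataset.
x = tf.random.normal((32, 4))
y = tf.random.uniform((32,), maxval=10, dtype=tf.int32)

# Record the forward pass on the tape, then take gradients of the loss
# with respect to the model's trainable variables and apply them.
with tf.GradientTape() as tape:
    logits = model(x, training=True)
    loss = loss_fn(y, logits)
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

This is the pattern the README refers to: `model.compile()`/`model.fit()` are skipped entirely, and each optimization step is written out by hand.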
