Commit

Update 4.5Optimization.md
KuangYu authored Nov 7, 2023
1 parent de7f928 commit 34ed56b
Showing 1 changed file with 1 addition and 3 deletions.
4 changes: 1 addition & 3 deletions docs/user_guide/4.5Optimization.md
@@ -6,14 +6,12 @@ Automatic differentiation is a crucial component of DMFF and plays a significant

## 2. Function module

Imports: the module imports the necessary functions from `jax` and `optax` (see the sketch below).
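
For orientation, a minimal sketch of the imports such a module builds on, assuming a standard `jax`/`optax` setup (not copied verbatim from DMFF):

```python
# Assumed typical imports for an optax-based optimization module (illustrative).
import jax
import jax.numpy as jnp
import optax
```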

Function `periodic_move`:
- Creates a function that performs a periodic update on parameters: if an update pushes a parameter outside a given range, it is wrapped back around in a periodic manner, like an angle wrapping past 360 degrees (see the sketch below).
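
A minimal sketch of how such a periodic wrapping transformation could be written against the public `optax` interface; the name `periodic_move`, its signature, and the wrapping rule shown here are illustrative assumptions, not DMFF's actual implementation:

```python
import jax
import jax.numpy as jnp
import optax

def periodic_move(lower, upper):
    """Illustrative sketch: keep parameters inside [lower, upper) after each update."""
    period = upper - lower

    def init_fn(params):
        # The wrapping rule is stateless, so no optimizer state is required.
        return optax.EmptyState()

    def update_fn(updates, state, params=None):
        # Shift each proposed update so that params + updates lands inside [lower, upper),
        # analogous to an angle wrapping around after 360 degrees.
        def wrap(u, p):
            return jnp.mod(p + u - lower, period) + lower - p

        wrapped = jax.tree_util.tree_map(wrap, updates, params)
        return wrapped, state

    return optax.GradientTransformation(init_fn, update_fn)
```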

Function `genOptimizer`:
- Generates an optimizer based on user-specified options (a sketch follows this list).
- Depending on the arguments, it can produce various optimization schemes, such as SGD, Nesterov momentum, and Adam.
- Supports learning-rate schedules such as exponential decay and warmup exponential decay.
- The optimizer can be further augmented with gradient clipping, periodic parameter wrapping, and a non-negativity constraint on parameters.
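
The sketch below shows how such a factory could be assembled from `optax` building blocks; the function name `gen_optimizer`, its arguments, and its defaults are illustrative assumptions and do not reproduce DMFF's actual `genOptimizer` signature:

```python
import optax

def gen_optimizer(method="adam", learning_rate=1e-3, decay_rate=None,
                  transition_steps=100, clip=None, nonnegative=False):
    """Illustrative optimizer factory built from optax primitives (signature assumed)."""
    schedule = learning_rate
    if decay_rate is not None:
        # Exponential learning-rate decay; a warmup variant could use
        # optax.warmup_exponential_decay_schedule instead.
        schedule = optax.exponential_decay(init_value=learning_rate,
                                           transition_steps=transition_steps,
                                           decay_rate=decay_rate)

    # Choose the core optimization scheme.
    if method == "sgd":
        core = optax.sgd(schedule)
    elif method == "nesterov":
        core = optax.sgd(schedule, momentum=0.9, nesterov=True)
    else:
        core = optax.adam(schedule)

    # Optionally augment the optimizer with extra transformations.
    transforms = []
    if clip is not None:
        transforms.append(optax.clip(clip))                  # element-wise gradient clipping
    transforms.append(core)
    if nonnegative:
        transforms.append(optax.keep_params_nonnegative())   # keep parameters >= 0
    return optax.chain(*transforms)
```

A typical training loop would call `optimizer.init(params)` once, then in each step compute gradients with `jax.grad`, apply `optimizer.update(grads, opt_state, params)`, and update the parameters with `optax.apply_updates(params, updates)`.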

