
Add optimizers covering complex functional conditions that cross curvature anomalies #605

Open
linjing-lab opened this issue Nov 28, 2024 · 0 comments


@linjing-lab

Convergence under complex curvature

Jittor's optim.py contains the SGD, RMSprop, Adam, AdamW, and Adan optimizers, which memorize past gradients to improve the update direction and keep runtime frame rates high. By contrast, optimtool, built on partial differentiation, takes a whole dense set of functional parameters as the training objective and simply describes the vector items as single- or multi-correlated functional loss measures. Jittor still works step-wise, where curvature anomalies enter through the differentiation and the update directions, so I propose adding optimizers inspired by real simulation.
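To make the step-wise, direction-memorizing behavior of these optimizers concrete, here is a minimal SGD-with-momentum step in plain Python. This is a generic textbook sketch, not Jittor's actual optim.py code; the function name and hyperparameters are illustrative.

```python
def sgd_momentum_step(param, grad, velocity, lr=0.01, momentum=0.9):
    """One SGD-with-momentum step: the velocity buffer 'memorizes'
    previous gradient directions and smooths the new update."""
    velocity = momentum * velocity + grad
    param = param - lr * velocity
    return param, velocity

# Minimize f(x) = x**2 (gradient 2 * x), starting from x = 5.0
x, v = 5.0, 0.0
for _ in range(300):
    x, v = sgd_momentum_step(x, 2 * x, v)
# x is now close to 0
```

The velocity buffer is exactly the "memorized gradient direction": each step blends the fresh gradient with an exponentially decayed history of past gradients.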

Logs of the same differentiation behaviors

Some statistically complex combinations failed to converge during training at the diff-kernel level, while secant-based training converged and avoided the curvature anomalies: secant theory covers a long geometric distance in a single optimization step. There are gradient-distributed optimizers in the file, and the logs recorded memory usage and chains of cluster reactions.
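The secant convergence mentioned above can be illustrated with a one-dimensional secant iteration applied to the derivative of a loss: the finite-difference slope stands in for exact curvature, and a single step can cover a long distance. This is a generic sketch, not code from optimtool or Jittor; all names are illustrative.

```python
def secant_minimize(dfx, x0, x1, tol=1e-10, max_iter=100):
    """Find a stationary point of f by applying the secant method to f'.
    The slope (dfx(x1) - dfx(x0)) / (x1 - x0) approximates curvature
    without ever computing a second derivative."""
    for _ in range(max_iter):
        g0, g1 = dfx(x0), dfx(x1)
        if abs(g1 - g0) < 1e-15:          # slope degenerate; stop
            break
        x2 = x1 - g1 * (x1 - x0) / (g1 - g0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# f(x) = (x - 3)**2 + 1, so f'(x) = 2 * (x - 3); minimizer at x = 3
xmin = secant_minimize(lambda x: 2 * (x - 3), 0.0, 1.0)
# xmin is 3.0 (exact here, because f' is linear for a quadratic loss)
```

On a quadratic loss the secant step is exact in one iteration, which is the "long geometric distance in one-time optimization" behavior the report describes.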

Minimal Reproduce

Across many tasks, including vision, video, text, multi-modal, and sparse distributed sequence training, reality objects always differentiate differently when the sequences follow distinct mapping rules. My deployment for such cross-curvature targets therefore produced one framework built around the BFGS and L-BFGS methods, with the DFP method limited by its matrix cycles. The BFGS method converged on almost every loss defined over functional parameters while waiting in the differentiation queue.
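As a rough illustration of why a BFGS loop converges on most smooth parameter-based losses, here is a minimal textbook BFGS iteration on a quadratic. This is a generic sketch under simplifying assumptions (unit step length, no line search), not the framework described in this issue.

```python
import numpy as np

def bfgs(grad, x, max_iter=50, tol=1e-8):
    """Minimal BFGS: the inverse-Hessian estimate H is updated so that
    it satisfies the secant condition H @ y = s after every step."""
    n = x.size
    H = np.eye(n)                      # initial inverse-Hessian estimate
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        x_new = x + p                  # unit step; real code would line-search
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Quadratic loss f(x) = 0.5 * x @ A @ x - b @ x, with gradient A @ x - b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bfgs(lambda x: A @ x - b, np.zeros(2))
```

The curvature check `sy > 1e-12` is what lets BFGS skip updates across curvature anomalies instead of corrupting the inverse-Hessian estimate.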

Expected behavior

The optimizers are expected to leverage runtime RAM efficiently when covering production workloads composed of complex functional conditions that may introduce new curvature anomalies. A trial in a new environment on Dask Machine Learning showed RAM filling up while efficiency stayed low in completing the cycles of differentiation, or the timing-only task.
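For the RAM concern, one standard way to bound memory while keeping curvature information is the L-BFGS two-loop recursion, which stores only the last m (s, y) pairs instead of a dense n-by-n inverse Hessian. This is a generic sketch, unrelated to Dask or any framework named in this issue; the driver loop and its fixed unit step are illustrative assumptions.

```python
import numpy as np

def two_loop_direction(g, s_list, y_list):
    """L-BFGS two-loop recursion: computes an approximate -H @ g using
    only the stored (s, y) pairs, so memory stays O(m * n) instead of
    the O(n * n) a dense inverse-Hessian estimate would need."""
    q = g.astype(float).copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:                                             # scale by latest curvature
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest first
        rho = 1.0 / (s @ y)
        beta = rho * (y @ q)
        q += (a - beta) * s
    return -q

# Drive it on a quadratic f(x) = 0.5 * x @ A @ x - b @ x (gradient A @ x - b)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.zeros(2)
g = A @ x - b
s_hist, y_hist = [], []
for _ in range(50):
    if np.linalg.norm(g) < 1e-8:
        break
    d = two_loop_direction(g, s_hist, y_hist)
    x_new = x + d                          # unit step; real code would line-search
    g_new = A @ x_new - b
    if (x_new - x) @ (g_new - g) > 1e-12:  # curvature condition before storing
        s_hist.append(x_new - x)
        y_hist.append(g_new - g)
        s_hist, y_hist = s_hist[-5:], y_hist[-5:]   # cap history at m = 5 pairs
    x, g = x_new, g_new
```

Capping the history at m pairs is exactly the trade the issue asks for: bounded runtime RAM in exchange for an approximate, rather than exact, inverse-Hessian estimate.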
