
Added docs for ROM, added docs for Cayley (and optimized routine), simplified calculation in attention layer. #76

Merged
michakraus merged 41 commits into main from new_activation_function on Sep 20, 2023

Conversation

benedict-96 (Collaborator)

This PR is a bit of a mess. I started on several minor fixes and didn't think they warranted separate branches, but they grew into larger fixes and additions. Most of what is new affects the docs, so it should still be possible to follow what has changed. Short description:

  1. In kernels/inverses/inverse_kernel.jl the computation of the new activation function got slightly cheaper: slicing now uses @view, which avoids allocating copies (see the first sketch after this list).
  2. data_loader/data_loader.jl now has a constructor that can be called with a single matrix (with the relevant helper functions in data_loader/matrix_assign.jl). This is used for the ROM autoencoders; a usage sketch follows below.
  3. There is a new directory symplectic_autoencoders with a script to generate data (generate_data.jl) and a script for training (training.jl). The example is taken from arXiv:2112.10815.
  4. The Cayley transform (in optimizers/manifold_related/retractions.jl) is fixed and now has corresponding documentation (see the last sketch below).
  5. The rest is documentation, especially regarding reduced-order modeling (autoencoder.md and symplectic_autoencoder.md).

Regarding point 3: this seems to work really well. Where PSD gives an error of 0.3, our symplectic autoencoders easily reach 0.03.
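To illustrate point 1, here is a minimal, self-contained sketch of the @view optimization (generic Julia, not the actual kernel code): slicing with @view returns a view into the parent array instead of allocating a copy.

```julia
A = rand(Float32, 128, 128)

col_copy = A[:, 1]        # plain indexing allocates a new vector
col_view = @view A[:, 1]  # a view aliases A's memory, no allocation

sum(col_copy) ≈ sum(col_view)  # true: same values either way
```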
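For point 2, a hedged usage sketch of the new constructor. Only the fact that DataLoader now accepts a single matrix is taken from this PR; the element type and dimensions are made up for illustration.

```julia
using GeometricMachineLearning

# a snapshot matrix: each column is one sample of the full-order system
snapshots = rand(Float32, 20, 100)

# build a DataLoader from a single matrix (the new constructor from point 2)
dl = DataLoader(snapshots)
```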
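For point 4, a short, self-contained reminder of what the Cayley transform computes (a sketch, not the code in retractions.jl): it maps a skew-symmetric matrix V to the orthogonal matrix (I - V/2)⁻¹(I + V/2), which is what makes it usable for retractions.

```julia
using LinearAlgebra

# Cayley transform: for skew-symmetric V, (I - V/2) \ (I + V/2) is orthogonal
cayley(V::AbstractMatrix) = (I - V / 2) \ (I + V / 2)

V = [0.0 1.0; -1.0 0.0]            # a skew-symmetric matrix
Q = cayley(V)
Q' * Q ≈ Matrix{Float64}(I, 2, 2)  # true: Q is orthogonal
```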

codecov bot commented Sep 18, 2023

Codecov Report

Merging #76 (2f51853) into main (e36e9c4) will decrease coverage by 0.69%.
The diff coverage is 52.17%.

@@            Coverage Diff             @@
##             main      #76      +/-   ##
==========================================
- Coverage   72.71%   72.02%   -0.69%     
==========================================
  Files          93       94       +1     
  Lines        2294     2320      +26     
==========================================
+ Hits         1668     1671       +3     
- Misses        626      649      +23     
Files Changed                                    Coverage   Δ
src/GeometricMachineLearning.jl                  100.00%  <ø> (ø)
src/data/data_training.jl                         79.71%  <ø> (ø)
src/data_loader/matrix_assign.jl                   0.00%  <0.00%> (ø)
src/data_loader/data_loader.jl                    64.58%  <15.00%> (-35.42%) ⬇️
src/kernels/inverses/inverse_kernel.jl            80.00%  <100.00%> (-0.36%) ⬇️
src/layers/attention_layer.jl                     97.61%  <100.00%> (ø)
src/layers/multi_head_attention.jl               100.00%  <100.00%> (ø)
src/layers/psd_like_layer.jl                      63.15%  <100.00%> (ø)
src/layers/stiefel_layer.jl                      100.00%  <100.00%> (ø)
src/optimizers/manifold_related/retractions.jl    88.88%  <100.00%> (+9.47%) ⬆️

... and 3 files with indirect coverage changes


michakraus merged commit 1fd44b7 into main on Sep 20, 2023
michakraus deleted the new_activation_function branch on Sep 20, 2023, 07:14