Improve directory setup #23

Open · benedict-96 wants to merge 24 commits into main from improve-directory-setup

Commits (24)
All 24 commits are by benedict-96.

6521a74 (Dec 13, 2024) Moved build_function functions to separate directory.
6f59167 (Dec 13, 2024) Moved pullback.jl into derivatives directory.
c659bbe (Dec 13, 2024) Moved files to new directory symbolic_neuralnet.
c64a7d6 (Dec 13, 2024) Made spacing uniform.
8acf5cb (Dec 13, 2024) renamed utils to custom_definitions_and_extensions and moved relevant…
7789dac (Dec 13, 2024) Removed HamiltonianNN use.
7b332c2 (Dec 13, 2024) Moved chain.jl in separate directory.
e438730 (Dec 13, 2024) Removed tests that aren't really needed anymore.
1b5f9e4 (Dec 13, 2024) Adjusted tests layout to how they appear in src.
3ff82f4 (Dec 13, 2024) 1 -> begin.
7683404 (Dec 14, 2024) Introduced a new constructor for SymbolicNeuralNetwork (using Neural …
5a60aac (Dec 14, 2024) Deleted extra specific layer constructors that aren't needed anymore.
d26cf27 (Dec 15, 2024) Renamed file build_function2.jl to build_function_double_input.jl.
85aa3f0 (Dec 15, 2024) Added tests for build_nn_function (with input and with input&output);…
cc8f1dd (Dec 15, 2024) Moved test from docstring to test/.../build_function_arrays.jl.
4194d0b (Dec 15, 2024) Now importing correct structs/functions.
02c67ea (Dec 16, 2024) Removed reference to HSNN.
f495dc8 (Dec 16, 2024) Moved remaining tests from docstring tests to build_function directory.
a260ae5 (Dec 16, 2024) Added test that compares symbolic pullback to zygote pullback.
30a0045 (Dec 16, 2024) Fixed docstring problem.
53643c5 (Dec 16, 2024) Remove doctests completely.
fe4dfa7 (Dec 17, 2024) params now used as function instead of as a keyword.
49baf61 (Dec 17, 2024) Merge branch 'main' into improve-directory-setup
94e9da1 (Dec 17, 2024) Resolved merge conflict.
5 changes: 3 additions & 2 deletions Project.toml
@@ -14,9 +14,9 @@ Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
AbstractNeuralNetworks = "0.3, 0.4, 0.5"
Documenter = "1.8.0"
ForwardDiff = "0.10.38"
GeometricMachineLearning = "0.3.7"
Latexify = "0.16.5"
RuntimeGeneratedFunctions = "0.5"
SafeTestsets = "0.1"
Symbolics = "5, 6"
Zygote = "0.6.73"
julia = "1.6"
@@ -29,6 +29,7 @@ Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SafeTestsets = "1bc83da4-3b8d-516f-aca4-4fe02f6d838f"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
GeometricMachineLearning = "194d25b2-d3f5-49f0-af24-c124f4aa80cc"

[targets]
test = ["Test", "ForwardDiff", "Random", "Documenter", "Latexify", "SafeTestsets", "Zygote"]
test = ["Test", "ForwardDiff", "Random", "Documenter", "Latexify", "SafeTestsets", "Zygote", "GeometricMachineLearning"]
4 changes: 2 additions & 2 deletions docs/src/double_derivative.md
@@ -18,7 +18,7 @@
```@example jacobian_gradient
using AbstractNeuralNetworks
using SymbolicNeuralNetworks
using SymbolicNeuralNetworks: Jacobian, Gradient, derivative
using SymbolicNeuralNetworks: Jacobian, Gradient, derivative, params
using Latexify: latexify

c = Chain(Dense(2, 1, tanh; use_bias = false))
@@ -92,7 +92,7 @@ x = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad W = \begin{bmatrix} 1 & 0 \\ 0 &
```

```@example jacobian_gradient
built_function = build_nn_function(derivative(g), nn.params, nn.input)
built_function = build_nn_function(derivative(g), params(nn), nn.input)

x = [1., 0.]
ps = NeuralNetworkParameters((L1 = (W = [1. 0.; 0. 1.], b = [0., 0.]), ))
2 changes: 1 addition & 1 deletion docs/src/hamiltonian_neural_network.md
@@ -35,7 +35,7 @@ z_data = randn(T, 2, n_points)
nothing # hide
```

We now specify a pullback [`HamiltonianSymbolicNeuralNetwork`](@ref):
We now specify a pullback `HamiltonianSymbolicNeuralNetwork`

```julia hnn
_pullback = SymbolicPullback(nn)
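
Reviewer note: the call shape of the `SymbolicPullback` constructed here can be seen in `scripts/pullback_comparison.jl` further down. A minimal hedged sketch, assuming `nn`, `input`, and `output` as defined in that script:

```julia
using SymbolicNeuralNetworks: SymbolicPullback
using AbstractNeuralNetworks: params

_pullback = SymbolicPullback(nn)
# the pullback returns a tuple whose second element is a closure
# mapping an output sensitivity to parameter gradients
grads = _pullback(params(nn), nn.model, (input, output))[2](1.0)
```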
6 changes: 3 additions & 3 deletions docs/src/symbolic_neural_networks.md
@@ -6,7 +6,7 @@ We first call the symbolic neural network that only consists of one layer:

```@example snn
using SymbolicNeuralNetworks
using AbstractNeuralNetworks: Chain, Dense
using AbstractNeuralNetworks: Chain, Dense, params

input_dim = 2
output_dim = 1
@@ -23,7 +23,7 @@ using Symbolics
using Latexify: latexify

@variables sinput[1:input_dim]
soutput = nn.model(sinput, nn.params)
soutput = nn.model(sinput, params(nn))

soutput
```
@@ -101,7 +101,7 @@ We now compare the neural network-approximated curve to the original one:
fig = Figure()
ax = Axis3(fig[1, 1])

surface!(x_vec, y_vec, [c([x, y], nn_cpu.params)[1] for x in x_vec, y in y_vec]; alpha = .8, colormap = :darkterrain, transparency = true)
surface!(x_vec, y_vec, [c([x, y], params(nn_cpu))[1] for x in x_vec, y in y_vec]; alpha = .8, colormap = :darkterrain, transparency = true)
fig
```

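
The recurring change across these docs is replacing the field access `nn.params` with the `params` accessor function. A minimal sketch of the new pattern, assembled from the doctests in this PR:

```julia
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork, params
using SymbolicNeuralNetworks: SymbolicNeuralNetwork

c = Chain(Dense(2, 1, tanh))
nn = NeuralNetwork(c)
snn = SymbolicNeuralNetwork(nn)  # new constructor introduced in commit 7683404

ps = params(nn)      # previously written as nn.params
c([1.0, 2.0], ps)    # evaluate the chain with concrete parameters
```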
12 changes: 6 additions & 6 deletions scripts/pullback_comparison.jl
@@ -19,16 +19,16 @@ output = rand(1, batch_size)
# output sensitivities
_do = 1.

# spb(nn_cpu.params, nn.model, (input, output))[2](_do)
# zpb(nn_cpu.params, nn.model, (input, output))[2](_do)
# @time spb_evaluated = spb(nn_cpu.params, nn.model, (input, output))[2](_do)
# @time zpb_evaluated = zpb(nn_cpu.params, nn.model, (input, output))[2](_do)[1].params
# spb(params(nn_cpu), nn.model, (input, output))[2](_do)
# zpb(params(nn_cpu), nn.model, (input, output))[2](_do)
# @time spb_evaluated = spb(params(nn_cpu), nn.model, (input, output))[2](_do)
# @time zpb_evaluated = zpb(params(nn_cpu), nn.model, (input, output))[2](_do)[1].params
# @assert values(spb_evaluated) .≈ values(zpb_evaluated)

function timenn(pb, params, model, input, output, _do = 1.)
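# run the pullback once to trigger compilation, then time the second, hot call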
pb(params, model, (input, output))[2](_do)
@time pb(params, model, (input, output))[2](_do)
end

timenn(spb, nn_cpu.params, nn.model, input, output)
timenn(zpb, nn_cpu.params, nn.model, input, output)
timenn(spb, params(nn_cpu), nn.model, input, output)
timenn(zpb, params(nn_cpu), nn.model, input, output)
34 changes: 9 additions & 25 deletions src/SymbolicNeuralNetworks.jl
@@ -15,42 +15,26 @@ module SymbolicNeuralNetworks

RuntimeGeneratedFunctions.init(@__MODULE__)

include("equation_types.jl")
include("custom_definitions_and_extensions/equation_types.jl")

export symbolize
include("utils/symbolize.jl")
include("symbolic_neuralnet/symbolize.jl")

export AbstractSymbolicNeuralNetwork
export SymbolicNeuralNetwork, SymbolicModel
export HamiltonianSymbolicNeuralNetwork, HNNLoss
export architecture, model, params, equations, functions
export SymbolicNeuralNetwork

# make symbolic parameters (`NeuralNetworkParameters`)
export symbolicparameters
include("layers/abstract.jl")
include("layers/dense.jl")
include("layers/linear.jl")
include("chain.jl")

export evaluate_equations
include("symbolic_neuralnet.jl")

export symbolic_hamiltonian
include("hamiltonian.jl")
include("symbolic_neuralnet/symbolic_neuralnet.jl")

export build_nn_function
include("utils/build_function.jl")
include("utils/build_function2.jl")
include("utils/build_function_arrays.jl")
include("build_function/build_function.jl")
include("build_function/build_function_double_input.jl")
include("build_function/build_function_arrays.jl")

export SymbolicPullback
include("pullback.jl")
include("derivatives/pullback.jl")

include("derivatives/derivative.jl")
include("derivatives/jacobian.jl")
include("derivatives/gradient.jl")

include("custom_equation.jl")

include("utils/latexraw.jl")
include("custom_definitions_and_extensions/latexraw.jl")
end
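
Taken together, the `include` paths above imply this reorganized `src/` layout (reconstructed from the diff, not from the full repository tree):

```
src/
├── SymbolicNeuralNetworks.jl
├── custom_definitions_and_extensions/
│   ├── equation_types.jl
│   └── latexraw.jl
├── symbolic_neuralnet/
│   ├── symbolize.jl
│   └── symbolic_neuralnet.jl
├── build_function/
│   ├── build_function.jl
│   ├── build_function_double_input.jl
│   └── build_function_arrays.jl
└── derivatives/
    ├── pullback.jl
    ├── derivative.jl
    ├── jacobian.jl
    └── gradient.jl
```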
@@ -19,7 +19,7 @@ The functions mentioned in the implementation section were adjusted ad-hoc to de
Other problems may occur. In case you bump into one please [open an issue on github](https://github.com/JuliaGNI/SymbolicNeuralNetworks.jl/issues).
"""
function build_nn_function(eq::EqT, nn::AbstractSymbolicNeuralNetwork)
build_nn_function(eq, nn.params, nn.input)
build_nn_function(eq, params(nn), nn.input)
end

function build_nn_function(eq::EqT, sparams::NeuralNetworkParameters, sinput::Symbolics.Arr)
Expand All @@ -39,25 +39,26 @@ Build a function that can process a matrix. This is used as a starting point for
# Examples

```jldoctest
using SymbolicNeuralNetworks: _build_nn_function, symbolicparameters
using Symbolics
using AbstractNeuralNetworks
using SymbolicNeuralNetworks: _build_nn_function, SymbolicNeuralNetwork
using AbstractNeuralNetworks: params, Chain, Dense, NeuralNetwork
import Random
Random.seed!(123)

c = Chain(Dense(2, 1, tanh))
params = symbolicparameters(c)
@variables sinput[1:2]
eq = c(sinput, params)
built_function = _build_nn_function(eq, params, sinput)
ps = NeuralNetwork(c).params
input = rand(2, 2)

(built_function(input, ps, 1), built_function(input, ps, 2)) .≈ (c(input[:, 1], ps), c(input[:, 2], ps))
nn = NeuralNetwork(c)
snn = SymbolicNeuralNetwork(nn)
eq = c(snn.input, params(snn))
built_function = _build_nn_function(eq, params(snn), snn.input)
built_function([1. 2.; 3. 4.], params(nn), 1)

# output

(true, true)
1-element Vector{Float64}:
-0.9999967113439513
```

Note that we have to supply an extra argument (index) to `_build_nn_function` that we do not have to supply to [`build_nn_function`](@ref).
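
A hedged sketch of that contrast, reusing `eq`, `snn`, and `nn` from the doctest above (call shapes inferred from the docstrings in this diff, not verified against the full API):

```julia
# `_build_nn_function` works column-by-column and needs an explicit index:
f_single = _build_nn_function(eq, params(snn), snn.input)
f_single([1. 2.; 3. 4.], params(nn), 1)  # first column only

# `build_nn_function` wraps this and maps over the whole input array itself:
f_all = build_nn_function(eq, params(snn), snn.input)
f_all([1. 2.; 3. 4.], params(nn))        # all columns at once
```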

# Implementation

This first calls `Symbolics.build_function` with the keyword argument `expression = Val{true}` and then modifies the generated code by calling:
Expand Down
@@ -1,32 +1,29 @@
"""
build_nn_function(eqs::AbstractArray{<:NeuralNetworkParameters}, sparams, sinput...)

Build an executable function based on `eqs` that potentially also has a symbolic output.
Build an executable function based on an array of symbolic equations `eqs`.

# Examples

```jldoctest
using SymbolicNeuralNetworks: build_nn_function, SymbolicNeuralNetwork
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork, params
import Random
Random.seed!(123)

ch = Chain(Dense(2, 1, tanh))
nn = SymbolicNeuralNetwork(ch)
eqs = [(a = ch(nn.input, nn.params), b = ch(nn.input, nn.params).^2), (c = ch(nn.input, nn.params).^3, )]
funcs = build_nn_function(eqs, nn.params, nn.input)
nn = NeuralNetwork(ch)
snn = SymbolicNeuralNetwork(nn)
eqs = [(a = ch(snn.input, params(snn)), b = ch(snn.input, params(snn)).^2), (c = ch(snn.input, params(snn)).^3, )]
funcs = build_nn_function(eqs, params(snn), snn.input)
input = [1., 2.]
ps = NeuralNetwork(ch).params
a = ch(input, ps)
b = ch(input, ps).^2
c = ch(input, ps).^3
funcs_evaluated = funcs(input, ps)

(funcs_evaluated[1].a, funcs_evaluated[1].b, funcs_evaluated[2].c) .≈ (a, b, c)
funcs_evaluated = funcs(input, params(nn))

# output

(true, true, true)
2-element Vector{NamedTuple}:
(a = [-0.9999386280616135], b = [0.9998772598897417])
(c = [-0.9998158954841537],)
```
"""
function build_nn_function(eqs::AbstractArray{<:Union{NamedTuple, NeuralNetworkParameters}}, sparams::NeuralNetworkParameters, sinput::Symbolics.Arr...)
@@ -47,25 +44,21 @@ Return a function that takes an input, (optionally) an output and neural network

```jldoctest
using SymbolicNeuralNetworks: build_nn_function, SymbolicNeuralNetwork
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork, params
import Random
Random.seed!(123)

c = Chain(Dense(2, 1, tanh))
nn = SymbolicNeuralNetwork(c)
eqs = (a = c(nn.input, nn.params), b = c(nn.input, nn.params).^2)
funcs = build_nn_function(eqs, nn.params, nn.input)
nn = NeuralNetwork(c)
snn = SymbolicNeuralNetwork(nn)
eqs = (a = c(snn.input, params(snn)), b = c(snn.input, params(snn)).^2)
funcs = build_nn_function(eqs, params(snn), snn.input)
input = [1., 2.]
ps = NeuralNetwork(c).params
a = c(input, ps)
b = c(input, ps).^2
funcs_evaluated = funcs(input, ps)

(funcs_evaluated.a, funcs_evaluated.b) .≈ (a, b)
funcs_evaluated = funcs(input, params(nn))

# output

(true, true)
(a = [-0.9999386280616135], b = [0.9998772598897417])
```

# Implementation
Expand All @@ -90,16 +83,17 @@ Return an executable function for each entry in `eqs`. This still has to be proc

```jldoctest
using SymbolicNeuralNetworks: function_valued_parameters, SymbolicNeuralNetwork
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork, params
import Random
Random.seed!(123)

c = Chain(Dense(2, 1, tanh))
nn = SymbolicNeuralNetwork(c)
eqs = (a = c(nn.input, nn.params), b = c(nn.input, nn.params).^2)
funcs = function_valued_parameters(eqs, nn.params, nn.input)
nn = NeuralNetwork(c)
snn = SymbolicNeuralNetwork(nn)
eqs = (a = c(snn.input, params(snn)), b = c(snn.input, params(snn)).^2)
funcs = function_valued_parameters(eqs, params(snn), snn.input)
input = [1., 2.]
ps = NeuralNetwork(c).params
ps = params(nn)
a = c(input, ps)
b = c(input, ps).^2

@@ -13,7 +13,7 @@
See the *extended help section* of [`build_nn_function(::EqT, ::AbstractSymbolicNeuralNetwork)`](@ref).
"""
function build_nn_function(eqs, nn::AbstractSymbolicNeuralNetwork, soutput)
build_nn_function(eqs, nn.params, nn.input, soutput)
build_nn_function(eqs, params(nn), nn.input, soutput)

[Codecov (codecov/patch) warning on src/build_function/build_function_double_input.jl#L16: added line not covered by tests]
end

function build_nn_function(eq::EqT, sparams::NeuralNetworkParameters, sinput::Symbolics.Arr, soutput::Symbolics.Arr)
5 changes: 0 additions & 5 deletions src/chain.jl

This file was deleted.

File renamed without changes.
File renamed without changes.