Commit

Merge pull request #17 from JuliaGNI/compathelper/new_version/2024-12-06-00-46-23-264-00684279103

CompatHelper: bump compat for AbstractNeuralNetworks to 0.5, (keep existing compat)
michakraus authored Dec 17, 2024
2 parents 8d610cc + 4bd6cb2 commit 42aa39a
Showing 9 changed files with 20 additions and 20 deletions.
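
Apart from the `[compat]` bump in `Project.toml`, the diffs below all apply the same API migration that this version bump suggests: parameters are no longer obtained via `initialparameters` (optionally wrapped in `NeuralNetworkParameters`), but read from a constructed `NeuralNetwork` via its `params` field. A minimal sketch of the pattern, using the same one-layer `Chain` that appears in the doctests below:

```julia
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork

c = Chain(Dense(2, 1, tanh))

# old pattern (AbstractNeuralNetworks 0.3/0.4):
#   ps = initialparameters(c, Float64) |> NeuralNetworkParameters
# new pattern (0.5): construct the network, then read off its parameters
ps = NeuralNetwork(c, Float64).params
```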
4 changes: 2 additions & 2 deletions Project.toml
@@ -11,7 +11,7 @@
RuntimeGeneratedFunctions = "7e49a35a-f44a-4d26-94aa-eba1b4ca6b47"
Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"

[compat]
AbstractNeuralNetworks = "0.3, 0.4"
AbstractNeuralNetworks = "0.3, 0.4, 0.5"
Documenter = "1.8.0"
ForwardDiff = "0.10.38"
Latexify = "0.16.5"
@@ -31,4 +31,4 @@
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

[targets]
-test = ["Test", "ForwardDiff", "Random", "Documenter", "Latexify", "SafeTestsets", "Zygote"]
+test = ["Test", "ForwardDiff", "Random", "Documenter", "Latexify", "SafeTestsets", "Zygote"]
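
(In Julia's `[compat]` notation each entry is a caret specifier, so `"0.3, 0.4, 0.5"` admits any version in `[0.3.0, 0.4.0)`, `[0.4.0, 0.5.0)`, or `[0.5.0, 0.6.0)`; the new entry widens the admissible range without dropping older releases, which is what "keep existing compat" refers to.)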
4 changes: 2 additions & 2 deletions docs/src/hamiltonian_neural_network.md
@@ -5,7 +5,7 @@
Here we build a Hamiltonian neural network as a symbolic neural network.
```julia hnn
using SymbolicNeuralNetworks
using GeometricMachineLearning
-using AbstractNeuralNetworks: Dense, initialparameters, UnknownArchitecture, Model
+using AbstractNeuralNetworks: Dense, UnknownArchitecture, Model
using LinearAlgebra: norm
using ChainRulesCore
using KernelAbstractions
@@ -45,7 +45,7 @@
nothing # hide
We can now train the network:

```julia hnn
-ps = NeuralNetworkParameters(initialparameters(c, T))
+ps = NeuralNetwork(c, T).params
dl = DataLoader(z_data, hvf_analytic(z_data))
o = Optimizer(AdamOptimizer(.01), ps)
batch = Batch(200)
2 changes: 1 addition & 1 deletion docs/src/symbolic_neural_networks.md
@@ -6,7 +6,7 @@
We first call the symbolic neural network that only consists of one layer:

```@example snn
using SymbolicNeuralNetworks
-using AbstractNeuralNetworks: Chain, Dense, initialparameters
+using AbstractNeuralNetworks: Chain, Dense
input_dim = 2
output_dim = 1
4 changes: 2 additions & 2 deletions src/derivatives/jacobian.jl
@@ -45,7 +45,7 @@
We can use `Jacobian` together with [`build_nn_function`](@ref):
```jldoctest
using SymbolicNeuralNetworks
using SymbolicNeuralNetworks: Jacobian, derivative
-using AbstractNeuralNetworks: Dense, Chain, initialparameters
+using AbstractNeuralNetworks: Dense, Chain, NeuralNetwork
using Symbolics
import Random
@@ -59,7 +59,7 @@
nn = SymbolicNeuralNetwork(c)
□ = SymbolicNeuralNetworks.Jacobian(nn)
# here we need to access the derivative and convert it into a function
jacobian1 = build_nn_function(derivative(□), nn)
-ps = initialparameters(c, Float64)
+ps = NeuralNetwork(c, Float64).params
input = rand(input_dim)
#derivative
Dtanh(x::Real) = 4 * exp(2 * x) / (1 + exp(2x)) ^ 2
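(Note that `Dtanh` as defined here equals `1 - tanh(x)^2`, the derivative of the layer's activation, so the part of the doctest hidden below the fold presumably checks the compiled symbolic Jacobian against this analytic derivative.)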
4 changes: 2 additions & 2 deletions src/pullback.jl
@@ -15,7 +15,7 @@
c = Chain(Dense(2, 1, tanh))
nn = SymbolicNeuralNetwork(c)
loss = FeedForwardLoss()
pb = SymbolicPullback(nn, loss)
-ps = initialparameters(c) |> NeuralNetworkParameters
+ps = NeuralNetwork(c).params
pv_values = pb(ps, nn.model, (rand(2), rand(1)))[2](1) |> typeof
# output
@@ -50,7 +50,7 @@
c = Chain(Dense(2, 1, tanh))
nn = SymbolicNeuralNetwork(c)
loss = FeedForwardLoss()
pb = SymbolicPullback(nn, loss)
-ps = initialparameters(c) |> NeuralNetworkParameters
+ps = NeuralNetwork(c).params
input_output = (rand(2), rand(1))
loss_and_pullback = pb(ps, nn.model, input_output)
pv_values = loss_and_pullback[2](1)
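(Both doctests treat the result of `pb(ps, nn.model, input_output)` as a `(loss, pullback)` pair in the spirit of the ChainRules convention: indexing with `[2]` extracts the pullback closure, and calling it with the seed `1` returns the gradient-valued `pv_values` with respect to the parameters.)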
2 changes: 1 addition & 1 deletion src/utils/build_function.jl
@@ -48,7 +48,7 @@
params = symbolicparameters(c)
@variables sinput[1:2]
eq = c(sinput, params)
built_function = _build_nn_function(eq, params, sinput)
-ps = initialparameters(c)
+ps = NeuralNetwork(c).params
input = rand(2, 2)
(built_function(input, ps, 1), built_function(input, ps, 2)) .≈ (c(input[:, 1], ps), c(input[:, 2], ps))
12 changes: 6 additions & 6 deletions src/utils/build_function_arrays.jl
@@ -7,7 +7,7 @@
Build an executable function based on `eqs` that potentially also has a symbolic
```jldoctest
using SymbolicNeuralNetworks: build_nn_function, SymbolicNeuralNetwork
-using AbstractNeuralNetworks: Chain, Dense, initialparameters, NeuralNetworkParameters
+using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
import Random
Random.seed!(123)
@@ -16,7 +16,7 @@
nn = SymbolicNeuralNetwork(ch)
eqs = [(a = ch(nn.input, nn.params), b = ch(nn.input, nn.params).^2), (c = ch(nn.input, nn.params).^3, )]
funcs = build_nn_function(eqs, nn.params, nn.input)
input = [1., 2.]
-ps = initialparameters(ch) |> NeuralNetworkParameters
+ps = NeuralNetwork(ch).params
a = ch(input, ps)
b = ch(input, ps).^2
c = ch(input, ps).^3
@@ -47,7 +47,7 @@
Return a function that takes an input, (optionally) an output and neural network
```jldoctest
using SymbolicNeuralNetworks: build_nn_function, SymbolicNeuralNetwork
-using AbstractNeuralNetworks: Chain, Dense, initialparameters, NeuralNetworkParameters
+using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
import Random
Random.seed!(123)
@@ -56,7 +56,7 @@
nn = SymbolicNeuralNetwork(c)
eqs = (a = c(nn.input, nn.params), b = c(nn.input, nn.params).^2)
funcs = build_nn_function(eqs, nn.params, nn.input)
input = [1., 2.]
-ps = initialparameters(c) |> NeuralNetworkParameters
+ps = NeuralNetwork(c).params
a = c(input, ps)
b = c(input, ps).^2
funcs_evaluated = funcs(input, ps)
@@ -90,7 +90,7 @@
Return an executable function for each entry in `eqs`. This still has to be proc
```jldoctest
using SymbolicNeuralNetworks: function_valued_parameters, SymbolicNeuralNetwork
-using AbstractNeuralNetworks: Chain, Dense, initialparameters, NeuralNetworkParameters
+using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
import Random
Random.seed!(123)
Expand All @@ -99,7 +99,7 @@ nn = SymbolicNeuralNetwork(c)
eqs = (a = c(nn.input, nn.params), b = c(nn.input, nn.params).^2)
funcs = function_valued_parameters(eqs, nn.params, nn.input)
input = [1., 2.]
-ps = initialparameters(c) |> NeuralNetworkParameters
+ps = NeuralNetwork(c).params
a = c(input, ps)
b = c(input, ps).^2
4 changes: 2 additions & 2 deletions test/neural_network_derivative.jl
@@ -1,6 +1,6 @@
using Test, SymbolicNeuralNetworks
using SymbolicNeuralNetworks: Jacobian, derivative
-using AbstractNeuralNetworks: Chain, Dense, initialparameters, NeuralNetworkParameters
+using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
using LinearAlgebra: norm
import Symbolics, Random, ForwardDiff

@@ -26,7 +26,7 @@
function test_jacobian(n::Integer, T = Float32)
nn = SymbolicNeuralNetwork(c)
g = Jacobian(nn)

-params = initialparameters(c, T) |> NeuralNetworkParameters
+params = NeuralNetwork(c, T).params
input = rand(T, n)
@test build_nn_function(g.output, nn)(input, params) ≈ c(input, params)
@test build_nn_function(derivative(g), nn)(input, params) ≈ ForwardDiff.jacobian(input -> c(input, params), input)
4 changes: 2 additions & 2 deletions test/symbolic_gradient.jl
@@ -15,7 +15,7 @@
function test_symbolic_gradient(input_dim::Integer = 3, output_dim::Integer = 1,
@assert second_dim > 1 "second_dim must be greater than 1!"
c = Chain(Dense(input_dim, hidden_dim, tanh), Dense(hidden_dim, output_dim, tanh))
sparams = symbolicparameters(c)
-ps = initialparameters(c, T) |> NeuralNetworkParameters
+ps = NeuralNetwork(c, T).params
@variables sinput[1:input_dim]
sout = norm(c(sinput, sparams)) ^ 2
sdparams = symbolic_differentials(sparams)
@@ -40,7 +40,7 @@
Also checks the parallelization, but for the full function.
function test_symbolic_gradient2(input_dim::Integer = 3, output_dim::Integer = 1, hidden_dim::Integer = 2, T::DataType = Float64, second_dim::Integer = 1, third_dim::Integer = 1)
c = Chain(Dense(input_dim, hidden_dim, tanh), Dense(hidden_dim, output_dim, tanh))
sparams = symbolicparameters(c)
-ps = initialparameters(c, T) |> NeuralNetworkParameters
+ps = NeuralNetwork(c, T).params
@variables sinput[1:input_dim]
sout = norm(c(sinput, sparams)) ^ 2
input = rand(T, input_dim, second_dim, third_dim)
