refactor: move type-pirated function from BoundaryValueDiffEq here, use Accessors.jl #696
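For context on the Accessors.jl part of the title, here is a minimal sketch of the pattern (the `Cache` struct below is hypothetical, purely for illustration): `@set` produces an updated copy of an immutable struct, which avoids defining setter methods on types owned by another package (type piracy).

```julia
using Accessors

# Hypothetical immutable container, standing in for a solver cache.
struct Cache
    abstol::Float64
    reltol::Float64
end

c = Cache(1e-6, 1e-3)

# `@set` returns an updated copy; the original struct is untouched.
c2 = @set c.abstol = 1e-8
@assert c2.abstol == 1e-8 && c.abstol == 1e-6
```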
Conversation
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff            @@
##           master     #696       +/- ##
===========================================
- Coverage   41.53%   30.82%   -10.71%
===========================================
  Files          55       55
  Lines        4582     4577        -5
===========================================
- Hits         1903     1411      -492
- Misses       2679     3166      +487
```

☔ View full report in Codecov by Sentry.
Force-pushed from 249e69f to 6f91b3f (Compare)
Fix conflict?
Force-pushed from 908c197 to d74e879 (Compare)
I'm looking into what is cancelling these tests.
test/downstream/symbol_indexing.jl (Outdated)
```julia
@test getp(sys, b)(sol) ≈ 100
@test sol.ps[a] ≈ 1
@test sol.ps[b] ≈ 100
sol = @test_throws ArgumentError solve(prob, GradientDescent())
```
what's the error here?
```
ERROR: ArgumentError: Fminbox(GradientDescent{LineSearches.InitialPrevious{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Optim.var"#13#15"}(LineSearches.InitialPrevious{Float64}
  alpha: Float64 1.0
  alphamin: Float64 0.0
  alphamax: Float64 Inf
, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}
  delta: Float64 0.1
  sigma: Float64 0.9
  alphamax: Float64 Inf
  rho: Float64 5.0
  epsilon: Float64 1.0e-6
  gamma: Float64 0.66
  linesearchmax: Int64 50
  psi3: Float64 0.1
  display: Int64 0
  mayterminate: Base.RefValue{Bool}
, nothing, Optim.var"#13#15"(), Flat())) requires gradients, since you didn't use `OptimizationFunction` with a valid AD backend https://docs.sciml.ai/Optimization/stable/API/ad/ the lower and upper bounds thus will be ignored.
```
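For reference, a minimal sketch of what this error asks for, following the Optimization.jl docs (the objective and bounds below are illustrative, not this PR's test setup): declaring an AD backend on the `OptimizationFunction` lets `Fminbox(GradientDescent())` compute gradients and honor the bounds.

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

# Illustrative objective; any differentiable function works here.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

u0 = zeros(2)
p = [1.0, 100.0]

# The AD backend declared here is what the ArgumentError above asks for.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, u0, p; lb = [-1.0, -1.0], ub = [1.5, 1.5])

sol = solve(prob, Fminbox(GradientDescent()))
```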
I had messaged Vaibhav about it when it came up; not sure if he has been able to fix it yet.
It was fixed; I have created a release for it now.
Requires SciML/RecursiveArrayTools.jl#384
Force-pushed from 821f586 to 317adb1 (Compare)
This avoids loading `ADTypes.jl` before downstream tests are precompiled
Force-pushed from 317adb1 to 55d81af (Compare)
Checklist
- The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.

Additional context
Add any other context about the problem here.