Describe the bug 🐞

I was playing around with optimizing a problem using some code that looks as follows:

using Optimization
using OptimizationOptimJL

function cost(u, p)
    # define the cost function here
end

prob = Optimization.OptimizationProblem(
    Optimization.OptimizationFunction(cost),
    [2e-5],           # u0
    (p1, p2, p3, p4), # p
    lb = [1e-8], ub = [1e-4]
)

ρ = OptimizationOptimJL.solve(prob, NelderMead(), maxiters = 50, maxtime = 10^5)

But when I try to run it, it unexpectedly asks for derivatives.

Error & Stacktrace ⚠️

ERROR: ArgumentError: Fminbox(NelderMead{Optim.AffineSimplexer, Optim.AdaptiveParameters}(Optim.AffineSimplexer(0.025, 0.5), Optim.AdaptiveParameters(1.0, 1.0, 0.75, 1.0))) requires gradients, use `OptimizationFunction` either with a valid AD backend https://docs.sciml.ai/Optimization/stable/API/ad/ or a provided `grad` function.

Environment (please complete the following information):
using Pkg; Pkg.status()
using Pkg; Pkg.status(; mode = PKGMODE_MANIFEST)
versioninfo()
Additional context
Add any other context about the problem here.
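For reference, the error message suggests attaching an AD backend via OptimizationFunction. A minimal sketch of that workaround is below; the one-line placeholder cost and the choice of AutoForwardDiff are illustrative assumptions, not the original problem. The error arises because, once lb/ub bounds are given, Optim wraps NelderMead in Fminbox, which apparently needs gradients even though NelderMead itself is derivative-free.

using Optimization
using OptimizationOptimJL

# Placeholder cost for illustration only; the real cost function is elided above.
cost(u, p) = sum(abs2, u)

# Attach an AD backend so the Fminbox wrapper can compute the gradients it asks for.
optf = Optimization.OptimizationFunction(cost, Optimization.AutoForwardDiff())
prob = Optimization.OptimizationProblem(optf, [2e-5], nothing, lb = [1e-8], ub = [1e-4])
sol = OptimizationOptimJL.solve(prob, NelderMead(), maxiters = 50)

Passing a hand-written `grad` to OptimizationFunction instead of an AD backend should also satisfy the requirement, per the error text.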