Issues: JuliaSmoothOptimizers/ADNLPModels.jl
Document use of conditionals for sparsity detection [documentation]
#283, opened Jul 31, 2024 by gdalle
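Issue #283 asks for documentation on how conditionals in objective or constraint code interact with sparsity detection. Below is a minimal sketch of the situation with an entirely hypothetical constraint; the constructor and `jac` call are the standard ADNLPModels/NLPModels API, while how a given detector treats the branch is exactly what the issue wants documented.

```julia
using ADNLPModels, NLPModels

x0 = [1.0, 2.0]
f(x) = (x[1] - 1)^2 + (x[2] - 2)^2

# A constraint containing a branch: the sparsity pattern recorded for its
# Jacobian may depend on how the detection pass traces this if/else.
c_branch(x) = [x[1] > 0 ? x[1]^2 : x[2]]

# A branch-free alternative via `ifelse`, which evaluates both arguments;
# whether this keeps both variables visible to the detector depends on the
# detector in use, which is the behavior #283 asks to document.
c_ifelse(x) = [ifelse(x[1] > 0, x[1]^2, x[2])]

nlp = ADNLPModel(f, x0, c_ifelse, [0.0], [Inf])
J = jac(nlp, x0)   # sparse constraint Jacobian at x0
```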
Compatibility issue with OrdinaryDiffEq due to sparsity analysis in v0.8.3
#280, opened Jul 23, 2024 by abavoil
Unit tests fail on QuadraticModels with ADNLPModels 0.8 [bug]
#255, opened Jun 24, 2024 by tmigot
Add performance benchmark (9 of 15 tasks)
- run gradient benchmark
- run Hessian benchmark
- run Hessian product benchmark
- run Jacobian benchmark
- run Jacobian product benchmark
#241, opened Jun 6, 2024 by tmigot
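The checklist in #241 maps directly onto the NLPModels derivative API. A rough sketch of what such micro-benchmarks could look like with BenchmarkTools; the test problem and sizes are made up here and are not the repository's actual benchmark suite.

```julia
using ADNLPModels, NLPModels, BenchmarkTools

n = 100
f(x) = sum(100 * (x[i+1] - x[i]^2)^2 + (x[i] - 1)^2 for i = 1:n-1)  # Rosenbrock-like objective
c(x) = [sum(x) - 1]                                                 # one equality constraint
nlp = ADNLPModel(f, zeros(n), c, [0.0], [0.0])

x, v = rand(n), rand(n)

@btime grad($nlp, $x)        # gradient benchmark
@btime hess($nlp, $x)        # Hessian benchmark
@btime hprod($nlp, $x, $v)   # Hessian product benchmark
@btime jac($nlp, $x)         # Jacobian benchmark
@btime jprod($nlp, $x, $v)   # Jacobian product benchmark
```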
Sparse Jacobian/Hessian not GPU-compatible [bug] (4 tasks)
#226, opened May 7, 2024 by tmigot
hprod/jprod not GPU-compatible [bug] (4 tasks)
#225, opened May 7, 2024 by tmigot
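Both GPU issues above (#226 and #225) concern the same entry points. A hypothetical reproduction sketch, assuming CUDA.jl and a model built from GPU arrays; the point of the issues is precisely that some of these calls are not expected to succeed on GPU arrays yet.

```julia
using CUDA, ADNLPModels, NLPModels

f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
c(x) = [x[1] + x[2] - 1]
x0 = CuArray([1.0, 2.0])
v  = CuArray([1.0, -1.0])

# Building the model from GPU arrays; passing lcon/ucon as CuArray is an assumption here.
nlp = ADNLPModel(f, x0, c, CuArray([0.0]), CuArray([0.0]))

Hv = hprod(nlp, x0, v)   # reported as not GPU-compatible in #225
Jv = jprod(nlp, x0, v)   # reported as not GPU-compatible in #225
H  = hess(nlp, x0)       # sparse Hessian, reported as not GPU-compatible in #226
J  = jac(nlp, x0)        # sparse Jacobian, reported as not GPU-compatible in #226
```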
Add more documentation on how to make our own backend [documentation]
#194, opened Jul 19, 2023 by tmigot
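For context on #194, the user-facing side is backend selection through keyword arguments of the constructors. A hedged sketch, assuming the `gradient_backend` keyword and the `ADNLPModels.ReverseDiffADGradient` backend type exist under those names; check the current docs before relying on them.

```julia
using ADNLPModels, NLPModels

f(x) = (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2
x0 = [-1.2, 1.0]

# Swap only the gradient backend; the other derivative backends keep their defaults.
nlp = ADNLPModel(f, x0; gradient_backend = ADNLPModels.ReverseDiffADGradient)

grad(nlp, x0)
```

Writing a fully custom backend presumably means subtyping the package's abstract backend type and implementing the matching derivative methods, which is the part #194 asks to document in more detail.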
Test FastDifferentiation.jl [enhancement]
#189, opened Jul 19, 2023 by amontoison
Using a given NLPModel allocates with Lagrange multipliers [bug]
#181, opened Jul 13, 2023 by tmigot
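A minimal sketch of how the allocation reported in #181 could be checked, using the in-place Hessian-vector product with Lagrange multipliers from the NLPModels API; the problem data is made up.

```julia
using ADNLPModels, NLPModels

f(x) = (x[1] - 1)^2 + (x[2] - 2)^2
c(x) = [x[1]^2 + x[2]^2 - 1]
nlp = ADNLPModel(f, [0.5, 0.5], c, [0.0], [0.0])

x  = [0.5, 0.5]
y  = [1.0]            # Lagrange multipliers
v  = [1.0, -1.0]
Hv = similar(x)

hprod!(nlp, x, y, v, Hv)              # first call compiles
@allocated hprod!(nlp, x, y, v, Hv)   # ideally 0 bytes; #181 reports it is not
```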
Improve error message in ADNLSModel constructor [documentation, good first issue]
#100, opened Feb 20, 2023 by tmigot
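For reference on #100, the constructor in question takes the residual function, the starting point, and the number of residuals. A short hypothetical example of correct usage and of the mismatch whose error message the issue wants improved:

```julia
using ADNLPModels

F(x) = [x[1] - 1, x[2] - x[1]^2]   # two residuals
x0 = [-1.2, 1.0]

nls = ADNLSModel(F, x0, 2)   # nequ = 2 matches length(F(x0))
# ADNLSModel(F, x0, 3)       # mismatched residual count: the kind of misuse
#                            # where #100 asks for a clearer error message
```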
Clarify linequ to indicate linear *residuals* in ADNLSModel [documentation, good first issue]
#73, opened Aug 4, 2022 by tmigot