From ea76fabf71a857c98d382d5bfb198ba8da80f860 Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Wed, 10 Jul 2024 22:57:11 -0400
Subject: [PATCH 1/8] New tutorials and docs

---
 docs/src/getting_started.md                   | 44 +++++-----
 docs/src/index.md                             |  5 +-
 .../src/optimization_packages/optimization.md | 39 ++++++++
 docs/src/tutorials/certification.md           | 88 +++++++++++++++++++
 docs/src/tutorials/ensemble.md                | 30 +++++++
 5 files changed, 181 insertions(+), 25 deletions(-)
 create mode 100644 docs/src/optimization_packages/optimization.md
 create mode 100644 docs/src/tutorials/certification.md
 create mode 100644 docs/src/tutorials/ensemble.md

diff --git a/docs/src/getting_started.md b/docs/src/getting_started.md
index b6fef6a44..a868cea8d 100644
--- a/docs/src/getting_started.md
+++ b/docs/src/getting_started.md
@@ -1,41 +1,39 @@
-# Getting Started with Optimization in Julia
+# Getting Started with Optimization.jl
 
 In this tutorial, we introduce the basics of Optimization.jl by showing
-how to easily mix local optimizers from Optim.jl and global optimizers
-from BlackBoxOptim.jl on the Rosenbrock equation. The simplest copy-pasteable
-code to get started is the following:
+how to easily mix local optimizers and global optimizers on the Rosenbrock equation.
+The simplest copy-pasteable code using a quasi-Newton method (LBFGS) to solve the Rosenbrock problem is the following:
 
 ```@example intro
 # Import the package and define the problem to optimize
 using Optimization, Zygote
 rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
 u0 = zeros(2)
 p = [1.0, 100.0]
 
-prob = OptimizationProblem(rosenbrock, u0, p)
-
-# Import a solver package and solve the optimization problem
-using OptimizationOptimJL
-sol = solve(prob, NelderMead())
+optf = OptimizationFunction(rosenbrock, AutoZygote())
+prob = OptimizationProblem(optf, u0, p)
 
-# Import a different solver package and solve the optimization problem a different way
-using OptimizationBBO
-prob = OptimizationProblem(rosenbrock, u0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
-sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
+sol = solve(prob, Optimization.LBFGS())
 ```
 
-Notice that Optimization.jl is the core glue package that holds all the common
-pieces, but to solve the equations, we need to use a solver package. Here, OptimizationOptimJL
-is for [Optim.jl](https://github.com/JuliaNLSolvers/Optim.jl) and OptimizationBBO is for
-[BlackBoxOptim.jl](https://github.com/robertfeldt/BlackBoxOptim.jl).
+## Import a different solver package and solve the problem
+
+OptimizationOptimJL is a wrapper for [Optim.jl](https://github.com/JuliaNLSolvers/Optim.jl) and OptimizationBBO is a wrapper for [BlackBoxOptim.jl](https://github.com/robertfeldt/BlackBoxOptim.jl).
 
-The output of the first optimization task (with the `NelderMead()` algorithm)
-is given below:
+First, let's use NelderMead, a derivative-free solver from Optim.jl:
 
 ```@example intro
-optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
-prob = OptimizationProblem(optf, u0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
-sol = solve(prob, NelderMead())
+using OptimizationOptimJL
+sol = solve(prob, Optim.NelderMead())
+```
+
+BlackBoxOptim.jl offers derivative-free global optimization solvers that require the bounds to be set via `lb` and `ub` in the `OptimizationProblem`.
+Let's use the BBO_adaptive_de_rand_1_bin_radiuslimited() solver:
+
+```@example intro
+using OptimizationBBO
+prob = OptimizationProblem(rosenbrock, u0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
+sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
 ```
 
 The solution from the original solver can always be obtained via `original`:

diff --git a/docs/src/index.md b/docs/src/index.md
index f6e894be2..141967dbd 100644
--- a/docs/src/index.md
+++ b/docs/src/index.md
@@ -29,8 +29,9 @@ Pkg.add("Optimization")
 
 The packages relevant to the core functionality of Optimization.jl will be imported
 accordingly and, in most cases, you do not have to worry about the manual
-installation of dependencies. However, you will need to add the specific optimizer
-packages.
+installation of dependencies. [Optimization.jl](@ref) natively offers an LBFGS solver
+but for more solver choices (discussed below in Optimization Packages), you will need
+to add the specific wrapper packages.
 
 ## Contributing

diff --git a/docs/src/optimization_packages/optimization.md b/docs/src/optimization_packages/optimization.md
new file mode 100644
index 000000000..028a9e5bf
--- /dev/null
+++ b/docs/src/optimization_packages/optimization.md
@@ -0,0 +1,39 @@
+# Optimization.jl
+
+There are some solvers that are available in the Optimization.jl package directly, without the need to install any of the solver wrappers.
+
+## Methods
+
+`LBFGS`: The popular quasi-Newton method that leverages a limited-memory BFGS approximation of the inverse of the Hessian. It is implemented as a wrapper over the [L-BFGS-B](https://users.iems.northwestern.edu/%7Enocedal/lbfgsb.html) Fortran routine, accessed from the [LBFGSB.jl](https://github.com/Gnimuc/LBFGSB.jl/) package, and directly supports box constraints.
+
+It can also handle arbitrary nonlinear constraints through an Augmented Lagrangian method with bound constraints, described in section 17.4 of Numerical Optimization by Nocedal and Wright, thus serving as a general-purpose nonlinear optimization solver available directly in Optimization.jl.
+
+## Examples
+
+### Unconstrained Rosenbrock problem
+
+```@example L-BFGS
+using Optimization, Zygote
+
+rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
+x0 = zeros(2)
+p = [1.0, 100.0]
+
+optf = OptimizationFunction(rosenbrock, AutoZygote())
+prob = Optimization.OptimizationProblem(optf, x0, p)
+sol = solve(prob, Optimization.LBFGS())
+```
+
+### With nonlinear and bound constraints
+
+```@example L-BFGS
+function con2_c(res, x, p)
+    res .= [x[1]^2 + x[2]^2, (x[2] * sin(x[1]) + x[1]) - 5]
+end
+
+optf = OptimizationFunction(rosenbrock, AutoZygote(), cons = con2_c)
+prob = OptimizationProblem(optf, x0, p, lcons = [1.0, -Inf],
+    ucons = [1.0, 0.0], lb = [-1.0, -1.0],
+    ub = [1.0, 1.0])
+res = solve(prob, Optimization.LBFGS(), maxiters = 100)
+```

diff --git a/docs/src/tutorials/certification.md b/docs/src/tutorials/certification.md
new file mode 100644
index 000000000..7c4d9ae8c
--- /dev/null
+++ b/docs/src/tutorials/certification.md
@@ -0,0 +1,88 @@
+# Using SymbolicAnalysis.jl for convexity certificates
+
+In this tutorial, we will show how to use automatic convexity certification of the optimization problem using [SymbolicAnalysis.jl](https://github.com/Vaibhavdixit02/SymbolicAnalysis.jl).
+
+This works with the `structural_analysis` keyword argument to `OptimizationProblem`.
+This tells the package to try to trace through the objective and constraints with symbolic variables (for more details on this, look at the [Symbolics documentation](https://symbolics.juliasymbolics.org/stable/manual/functions/#function_registration)). This relies on the Disciplined Programming approach and hence necessitates the use of "atoms" from the SymbolicAnalysis.jl package.
+
+
+
+We'll use a simple example to illustrate the convexity structure certification process.
+
+```@example symanalysis
+using SymbolicAnalysis, Zygote, LinearAlgebra, Optimization, OptimizationMOI
+
+function f(x, p = nothing)
+    return exp(x[1]) + x[1]^2
+end
+
+optf = OptimizationFunction(f, Optimization.AutoForwardDiff())
+prob = OptimizationProblem(optf, [0.4], structural_analysis = true)
+
+sol = solve(prob, Optimization.LBFGS(), maxiters = 1000)
+```
+
+The result can be accessed as the `analysis_results` field of the solution's cache.
+
+```@example symanalysis
+sol.cache.analysis_results.objective
+```
+
+Relatedly, you can enable structural analysis in Riemannian optimization problems (supported only on the SPD manifold).
+
+We'll look at the Riemannian center of mass of SPD matrices, which is known to be a geodesically convex problem on the SPD manifold.
+
+```@example symanalysis
+using Optimization, OptimizationManopt, Symbolics, Manifolds, Random, LinearAlgebra, SymbolicAnalysis
+
+M = SymmetricPositiveDefinite(5)
+m = 100
+σ = 0.005
+q = Matrix{Float64}(LinearAlgebra.I(5)) .+ 2.0
+
+data2 = [exp(M, q, σ * rand(M; vector_at = q)) for i = 1:m];
+
+f(x, p = nothing) = sum(SymbolicAnalysis.distance(M, data2[i], x)^2 for i = 1:5)
+optf = OptimizationFunction(f, Optimization.AutoZygote())
+prob = OptimizationProblem(optf, data2[1]; manifold = M, structural_analysis = true)
+
+opt = OptimizationManopt.GradientDescentOptimizer()
+sol = solve(prob, opt, maxiters = 100)
+```
\ No newline at end of file
diff --git a/docs/src/tutorials/ensemble.md b/docs/src/tutorials/ensemble.md
new file mode 100644
index 000000000..4df6cbf22
--- /dev/null
+++ b/docs/src/tutorials/ensemble.md
@@ -0,0 +1,30 @@
+# Multistart optimization with EnsembleProblem
+
+The `EnsembleProblem` in SciML serves as a common interface for running a problem on multiple sets of initializations. In the context
+of optimization, this is useful for performing multistart optimization.
+
+This can be useful for complex, low-dimensional problems. We demonstrate this, again, on the Rosenbrock function.
+
+```@example ensemble
+using Optimization, OptimizationOptimJL, Random
+
+Random.seed!(100)
+
+rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
+x0 = zeros(2)
+
+optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
+prob = OptimizationProblem(optf, x0, [1.0, 100.0])
+@time sol1 = Optimization.solve(prob, OptimizationOptimJL.BFGS(), maxiters = 5)
+
+@show sol1.objective
+
+ensembleprob = Optimization.EnsembleProblem(
+    prob, [x0, x0 .+ rand(2), x0 .+ rand(2), x0 .+ rand(2)])
+
+@time sol = Optimization.solve(ensembleprob, OptimizationOptimJL.BFGS(),
+    EnsembleThreads(), trajectories = 4, maxiters = 5)
+@show findmin(i -> sol[i].objective, 1:4)[1]
+```
+
+With the same number of iterations (5), we get a much lower (roughly 1/100th) objective value by using multiple initial points. The initialization strategy used here was fairly trivial, but approaches based on Quasi-Monte Carlo sampling are typically more effective, as sketched below.
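+
+As a minimal sketch of such a strategy (assuming the QuasiMonteCarlo.jl package is available and using a hypothetical search box of `[-2, 2]` in each coordinate), the random perturbations above could be replaced with a low-discrepancy sample:
+
+```@example ensemble
+using QuasiMonteCarlo
+
+# Assumed box for drawing start points; pick bounds suited to your problem.
+lb, ub = [-2.0, -2.0], [2.0, 2.0]
+
+# Draw 4 Latin hypercube points in the box, one column per start point.
+starts = QuasiMonteCarlo.sample(4, lb, ub, LatinHypercubeSample())
+
+ensembleprob = Optimization.EnsembleProblem(prob, [starts[:, i] for i in 1:4])
+sol = Optimization.solve(ensembleprob, OptimizationOptimJL.BFGS(),
+    EnsembleThreads(), trajectories = 4, maxiters = 5)
+@show findmin(i -> sol[i].objective, 1:4)[1]
+```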
From c4075b002fe519efdec5867a65636685220f6cbd Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Tue, 6 Aug 2024 21:16:55 -0400
Subject: [PATCH 2/8] Fix lbfgsb iterations stuff

---
 src/lbfgsb.jl | 18 ++++--------------
 1 file changed, 4 insertions(+), 14 deletions(-)

diff --git a/src/lbfgsb.jl b/src/lbfgsb.jl
index 2c20e25fd..8a055582f 100644
--- a/src/lbfgsb.jl
+++ b/src/lbfgsb.jl
@@ -57,15 +57,6 @@ function __map_optimizer_args(cache::Optimization.OptimizationCache, opt::LBFGS;
     return mapped_args
 end
 
-function SciMLBase.__init(prob::SciMLBase.OptimizationProblem,
-        opt::LBFGS,
-        data = Optimization.DEFAULT_DATA; save_best = true,
-        callback = (args...) -> (false),
-        progress = false, kwargs...)
-    return OptimizationCache(prob, opt, data; save_best, callback, progress,
-        kwargs...)
-end
-
 function SciMLBase.__solve(cache::OptimizationCache{
         F,
         RC,
@@ -94,15 +85,13 @@ function SciMLBase.__solve(cache::OptimizationCache{
 }
     if cache.data != Optimization.DEFAULT_DATA
         maxiters = length(cache.data)
-        data = cache.data
     else
        maxiters = Optimization._check_and_convert_maxiters(cache.solver_args.maxiters)
-        data = Optimization.take(cache.data, maxiters)
     end
 
     local x
 
-    solver_kwargs = __map_optimizer_args(cache, cache.opt; cache.solver_args...)
+    solver_kwargs = __map_optimizer_args(cache, cache.opt; maxiters, cache.solver_args...)
 
     if !isnothing(cache.f.cons)
         eq_inds = [cache.lcons[i] == cache.ucons[i] for i in eachindex(cache.lcons)]
@@ -251,8 +240,9 @@ function SciMLBase.__solve(cache::OptimizationCache{
     opt_ret = deduce_retcode(stop_reason)
 
     t1 = time()
-    stats = Optimization.OptimizationStats(; iterations = maxiters,
-        time = t1 - t0, fevals = maxiters, gevals = maxiters)
+
+    # isave[30] holds the iteration count and isave[34] the total number of
+    # function and gradient evaluations reported by the L-BFGS-B routine.
+    stats = Optimization.OptimizationStats(; iterations = optimizer.isave[30],
+        time = t1 - t0, fevals = optimizer.isave[34], gevals = optimizer.isave[34])
 
     return SciMLBase.build_solution(cache, cache.opt, res[2], res[1],
         stats = stats, retcode = opt_ret, original = optimizer)

From 04d72755705aa7a186d9d9aa4cba743dd406b64d Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Tue, 6 Aug 2024 21:22:29 -0400
Subject: [PATCH 3/8] format docs

---
 docs/src/getting_started.md         |  2 +-
 docs/src/index.md                   |  4 ++--
 docs/src/tutorials/certification.md | 12 ++++++------
 docs/src/tutorials/ensemble.md      |  2 +-
 4 files changed, 10 insertions(+), 10 deletions(-)

diff --git a/docs/src/getting_started.md b/docs/src/getting_started.md
index a868cea8d..03e3dec66 100644
--- a/docs/src/getting_started.md
+++ b/docs/src/getting_started.md
@@ -1,7 +1,7 @@
 # Getting Started with Optimization.jl
 
 In this tutorial, we introduce the basics of Optimization.jl by showing
-how to easily mix local optimizers and global optimizers on the Rosenbrock equation. 
+how to easily mix local optimizers and global optimizers on the Rosenbrock equation.
 The simplest copy-pasteable code using a quasi-Newton method (LBFGS) to solve the Rosenbrock problem is the following:
 
 ```@example intro

diff --git a/docs/src/index.md b/docs/src/index.md
index 141967dbd..019afbd09 100644
--- a/docs/src/index.md
+++ b/docs/src/index.md
@@ -29,8 +29,8 @@ Pkg.add("Optimization")
 
 The packages relevant to the core functionality of Optimization.jl will be imported
 accordingly and, in most cases, you do not have to worry about the manual
-installation of dependencies. [Optimization.jl](@ref) natively offers an LBFGS solver 
-but for more solver choices (discussed below in Optimization Packages), you will need 
+installation of dependencies. [Optimization.jl](@ref) natively offers an LBFGS solver
+but for more solver choices (discussed below in Optimization Packages), you will need
+to add the specific wrapper packages.
 
 ## Contributing

diff --git a/docs/src/tutorials/certification.md b/docs/src/tutorials/certification.md
index 7c4d9ae8c..35d8f34ec 100644
--- a/docs/src/tutorials/certification.md
+++ b/docs/src/tutorials/certification.md
@@ -1,6 +1,6 @@
 # Using SymbolicAnalysis.jl for convexity certificates
 
-In this tutorial, we will show how to use automatic convexity certification of the optimization problem using [SymbolicAnalysis.jl](https://github.com/Vaibhavdixit02/SymbolicAnalysis.jl). 
+In this tutorial, we will show how to use automatic convexity certification of the optimization problem using [SymbolicAnalysis.jl](https://github.com/Vaibhavdixit02/SymbolicAnalysis.jl).
 
 This works with the `structural_analysis` keyword argument to `OptimizationProblem`. This tells the package to try to trace through the objective and constraints with symbolic variables (for more details on this, look at the [Symbolics documentation](https://symbolics.juliasymbolics.org/stable/manual/functions/#function_registration)). This relies on the Disciplined Programming approach and hence necessitates the use of "atoms" from the SymbolicAnalysis.jl package.
@@ -43,7 +43,6 @@ sol = solve(prob, Optimization.LBFGS())
 ```
 -->
-
 We'll use a simple example to illustrate the convexity structure certification process.
 
 ```@example symanalysis
 using SymbolicAnalysis, Zygote, LinearAlgebra, Optimization, OptimizationMOI
 
 function f(x, p = nothing)
     return exp(x[1]) + x[1]^2
 end
 
 optf = OptimizationFunction(f, Optimization.AutoForwardDiff())
 prob = OptimizationProblem(optf, [0.4], structural_analysis = true)
 
 sol = solve(prob, Optimization.LBFGS(), maxiters = 1000)
 ```
@@ -70,19 +69,20 @@ Relatedly, you can enable structural analysis in Riemannian optimization problems
 
 We'll look at the Riemannian center of mass of SPD matrices, which is known to be a geodesically convex problem on the SPD manifold.
 
 ```@example symanalysis
-using Optimization, OptimizationManopt, Symbolics, Manifolds, Random, LinearAlgebra, SymbolicAnalysis
+using Optimization, OptimizationManopt, Symbolics, Manifolds, Random, LinearAlgebra,
+    SymbolicAnalysis
 
 M = SymmetricPositiveDefinite(5)
 m = 100
 σ = 0.005
 q = Matrix{Float64}(LinearAlgebra.I(5)) .+ 2.0
 
-data2 = [exp(M, q, σ * rand(M; vector_at = q)) for i = 1:m];
+data2 = [exp(M, q, σ * rand(M; vector_at = q)) for i in 1:m];
 
-f(x, p = nothing) = sum(SymbolicAnalysis.distance(M, data2[i], x)^2 for i = 1:5)
+f(x, p = nothing) = sum(SymbolicAnalysis.distance(M, data2[i], x)^2 for i in 1:5)
 optf = OptimizationFunction(f, Optimization.AutoZygote())
 prob = OptimizationProblem(optf, data2[1]; manifold = M, structural_analysis = true)
 
 opt = OptimizationManopt.GradientDescentOptimizer()
-sol = solve(prob, opt, maxiters = 100)
-``` 
\ No newline at end of file
+sol = solve(prob, opt, maxiters = 100)
+```

diff --git a/docs/src/tutorials/ensemble.md b/docs/src/tutorials/ensemble.md
index 4df6cbf22..3700b3dcd 100644
--- a/docs/src/tutorials/ensemble.md
+++ b/docs/src/tutorials/ensemble.md
@@ -10,7 +10,7 @@ using Optimization, OptimizationOptimJL, Random
 
 Random.seed!(100)
 
-rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2 
+rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)

From 7a63f627c094f4cb31c116218e920cfaf6e2752b Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Wed, 7 Aug 2024 01:04:24 -0400
Subject: [PATCH 4/8] Update solvers list and add symbolics to docs deps

---
 docs/Project.toml |   1 +
 docs/src/index.md | 158 +++++++++++++++++++++++++++++++++++++++-------
 2 files changed, 136 insertions(+), 23 deletions(-)

diff --git a/docs/Project.toml b/docs/Project.toml
index dbe5dd65c..c5e9a746e 100644
---
a/docs/Project.toml +++ b/docs/Project.toml @@ -35,6 +35,7 @@ OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed" ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267" SciMLBase = "0bca4576-84f4-4d90-8ffe-ffa030f20462" SciMLSensitivity = "1ed8b502-d754-442c-8d5d-10ac956f44a1" +Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7" Tracker = "9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c" Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f" diff --git a/docs/src/index.md b/docs/src/index.md index 019afbd09..8649e328f 100644 --- a/docs/src/index.md +++ b/docs/src/index.md @@ -49,29 +49,141 @@ to add the specific wrapper packages. + On the [Julia Discourse forums](https://discourse.julialang.org) + See also [SciML Community page](https://sciml.ai/community/) -## Overview of the Optimizers - -| Package | Local Gradient-Based | Local Hessian-Based | Local Derivative-Free | Box Constraints | Local Constrained | Global Unconstrained | Global Constrained | -|:----------------------- |:--------------------:|:-------------------:|:---------------------:|:---------------:|:-----------------:|:--------------------:|:--------------------:| -| BlackBoxOptim | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ ✅ | -| CMAEvolutionaryStrategy | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ | -| Evolutionary | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | 🟡 | -| Flux | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | -| GCMAES | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ | -| MathOptInterface | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | 🟡 | -| MultistartOptimization | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ | -| Metaheuristics | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | 🟡 | -| NOMAD | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | 🟡 | -| NLopt | ✅ | ❌ | ✅ | ✅ | 🟡 | ✅ | 🟡 | -| Optim | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | -| PRIMA | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ | ❌ | -| QuadDIRECT | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ | - -✅ = supported - -🟡 = supported in downstream library but not yet implemented in `Optimization`; PR to add this functionality are welcome - -❌ = not supported +## Overview of the solver packages in alphabetical order + +
+<details>
+  <summary>BlackBoxOptim</summary>
+  - **Global Methods**
+    - Zeroth order
+    - Unconstrained
+    - Box Constraints
+</details>
+ +
+<details>
+  <summary>CMAEvolutionaryStrategy</summary>
+  - **Global Methods**
+    - Zeroth order
+    - Unconstrained
+    - Box Constraints
+</details>
+ +
+<details>
+  <summary>Evolutionary</summary>
+  - **Global Methods**
+    - Zeroth order
+    - Unconstrained
+    - Box Constraints
+    - Non-linear Constraints
+</details>
+ +
+<details>
+  <summary>GCMAES</summary>
+  - **Global Methods**
+    - First order
+    - Box Constraints
+    - Unconstrained
+</details>
+ +
+<details>
+  <summary>Manopt</summary>
+  - **Local Methods**
+    - First order
+    - Second order
+    - Zeroth order
+    - Box Constraints
+    - Constrained 🟡
+  - **Global Methods**
+    - Zeroth order
+    - Unconstrained
+</details>
+ +
+<details>
+  <summary>MathOptInterface</summary>
+  - **Local Methods**
+    - First order
+    - Second order
+    - Box Constraints
+    - Constrained
+  - **Global Methods**
+    - First order
+    - Second order
+    - Constrained
+</details>
+ +
+<details>
+  <summary>MultistartOptimization</summary>
+  - **Global Methods**
+    - Zeroth order
+    - First order
+    - Second order
+    - Box Constraints
+</details>
+ +
+<details>
+  <summary>Metaheuristics</summary>
+  - **Global Methods**
+    - Zeroth order
+    - Unconstrained
+    - Box Constraints
+</details>
+ +
+<details>
+  <summary>NOMAD</summary>
+  - **Global Methods**
+    - Zeroth order
+    - Unconstrained
+    - Box Constraints
+    - Constrained 🟡
+</details>
+ +
+<details>
+  <summary>NLopt</summary>
+  - **Local Methods**
+    - First order
+    - Zeroth order
+    - Second order 🟡
+    - Box Constraints
+    - Local Constrained 🟡
+  - **Global Methods**
+    - Zeroth order
+    - First order
+    - Unconstrained
+    - Constrained 🟡
+</details>
+ +
+<details>
+  <summary>Optim</summary>
+  - **Local Methods**
+    - Zeroth order
+    - First order
+    - Second order
+    - Box Constraints
+    - Constrained
+  - **Global Methods**
+    - Zeroth order
+    - Unconstrained
+    - Box Constraints
+</details>
+ +
+<details>
+  <summary>PRIMA</summary>
+  - **Local Methods**
+    - Derivative-free
+  - **Constraints**
+    - Box Constraints
+    - Local Constrained
+</details>
+ +
+<details>
+  <summary>QuadDIRECT</summary>
+  - **Constraints**
+    - Box Constraints
+  - **Global Methods**
+    - Unconstrained
+</details>
+
+🟡 = supported in downstream library but not yet implemented in `Optimization.jl`; PRs to add this functionality are welcome
+
 ## Citation

From 63f22dc7ecbb5d789a8140baf3e68afae04dd17b Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Wed, 7 Aug 2024 09:59:55 -0400
Subject: [PATCH 5/8] Add symbolicAnalysis to docs project

---
 docs/Project.toml | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/Project.toml b/docs/Project.toml
index c5e9a746e..b6ad9c17c 100644
--- a/docs/Project.toml
+++ b/docs/Project.toml
@@ -35,6 +35,7 @@ OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
 ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267"
 SciMLBase = "0bca4576-84f4-4d90-8ffe-ffa030f20462"
 SciMLSensitivity = "1ed8b502-d754-442c-8d5d-10ac956f44a1"
+SymbolicAnalysis = "4297ee4d-0239-47d8-ba5d-195ecdf594fe"
 Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
 Tracker = "9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c"
 Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

From b41807794ed6a99bf8d5db0975459ea8a0f7167d Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Wed, 7 Aug 2024 13:32:38 -0400
Subject: [PATCH 6/8] delete downgrade runs for now

---
 .github/workflows/Downgrade.yml | 53 ---------
 1 file changed, 53 deletions(-)
 delete mode 100644 .github/workflows/Downgrade.yml

diff --git a/.github/workflows/Downgrade.yml b/.github/workflows/Downgrade.yml
deleted file mode 100644
index 8a0fb32f9..000000000
--- a/.github/workflows/Downgrade.yml
+++ /dev/null
@@ -1,53 +0,0 @@
-name: Downgrade
-on:
-  pull_request:
-    branches:
-      - master
-    paths-ignore:
-      - 'docs/**'
-  push:
-    branches:
-      - master
-    paths-ignore:
-      - 'docs/**'
-jobs:
-  test:
-    runs-on: ubuntu-latest
-    strategy:
-      fail-fast: false
-      matrix:
-        version: ['1']
-        group:
-          - Core
-          - OptimizationBBO
-          - OptimizationCMAEvolutionStrategy
-          - OptimizationEvolutionary
-          - OptimizationFlux
-          - OptimizationGCMAES
-          - OptimizationMetaheuristics
-          - OptimizationMOI
-          - OptimizationMultistartOptimization
-          - OptimizationNLopt
-          #- OptimizationNonconvex
-          - OptimizationNOMAD
-          - OptimizationOptimJL
-          - OptimizationOptimisers
-          - OptimizationPRIMA
-          - OptimizationQuadDIRECT
-          - OptimizationSpeedMapping
-          - OptimizationPolyalgorithms
-    steps:
-      - uses: actions/checkout@v4
-      - uses: julia-actions/setup-julia@v2
-        with:
-          version: ${{ matrix.version }}
-      - uses: julia-actions/julia-downgrade-compat@v1
-#        if: ${{ matrix.version == '1.6' }}
-        with:
-          skip: Pkg,TOML
-      - uses: julia-actions/julia-buildpkg@v1
-      - if: ${{ matrix.group == 'OptimizationQuadDIRECT' }}
-        run: julia --project -e 'using Pkg; Pkg.Registry.add(RegistrySpec(url = "https://github.com/HolyLab/HolyLabRegistry.git")); Pkg.add("QuadDIRECT")'
-      - uses: julia-actions/julia-runtest@v1
-        env:
-          GROUP: ${{ matrix.group }}

From c43e0cba5159035fa2db116217e53ac9e79be9e9 Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Wed, 7 Aug 2024 16:54:20 -0400
Subject: [PATCH 7/8] rem commented out example

---
 docs/src/tutorials/certification.md | 39 -----
 1 file changed, 39 deletions(-)

diff --git a/docs/src/tutorials/certification.md b/docs/src/tutorials/certification.md
index 35d8f34ec..9ecdc0c35 100644
--- a/docs/src/tutorials/certification.md
+++ b/docs/src/tutorials/certification.md
@@ -4,45 +4,6 @@ In this tutorial, we will show how to use automatic convexity certification of t
 
 This works with the `structural_analysis` keyword argument to `OptimizationProblem`.
 This tells the package to try to trace through the objective and constraints with symbolic variables (for more details on this, look at the [Symbolics documentation](https://symbolics.juliasymbolics.org/stable/manual/functions/#function_registration)). This relies on the Disciplined Programming approach and hence necessitates the use of "atoms" from the SymbolicAnalysis.jl package.
-
 We'll use a simple example to illustrate the convexity structure certification process.
 
 ```@example symanalysis

From 92e9273c3b4e6102680e254f2be5f694cea99a58 Mon Sep 17 00:00:00 2001
From: Vaibhav Dixit
Date: Wed, 7 Aug 2024 19:25:42 -0400
Subject: [PATCH 8/8] format docs

---
 docs/src/index.md | 14 --------------
 1 file changed, 14 deletions(-)

diff --git a/docs/src/index.md b/docs/src/index.md
index 8649e328f..c1f7b69f4 100644
--- a/docs/src/index.md
+++ b/docs/src/index.md
@@ -58,7 +58,6 @@ to add the specific wrapper packages.
     - Unconstrained
     - Box Constraints
 </details>
-
 <details>
   <summary>CMAEvolutionaryStrategy</summary>
   - **Global Methods**
     - Zeroth order
     - Unconstrained
     - Box Constraints
 </details>
-
 <details>
   <summary>Evolutionary</summary>
   - **Global Methods**
     - Zeroth order
     - Unconstrained
     - Box Constraints
     - Non-linear Constraints
 </details>
-
 <details>
   <summary>GCMAES</summary>
   - **Global Methods**
     - First order
     - Box Constraints
     - Unconstrained
 </details>
-
 <details>
   <summary>Manopt</summary>
   - **Local Methods**
     - First order
     - Second order
     - Zeroth order
     - Box Constraints
     - Constrained 🟡
   - **Global Methods**
     - Zeroth order
     - Unconstrained
 </details>
-
 <details>
   <summary>MathOptInterface</summary>
   - **Local Methods**
     - First order
     - Second order
     - Box Constraints
     - Constrained
   - **Global Methods**
     - First order
     - Second order
     - Constrained
 </details>
-
 <details>
   <summary>MultistartOptimization</summary>
   - **Global Methods**
     - Zeroth order
     - First order
     - Second order
     - Box Constraints
 </details>
-
 <details>
   <summary>Metaheuristics</summary>
   - **Global Methods**
     - Zeroth order
     - Unconstrained
     - Box Constraints
 </details>
-
 <details>
   <summary>NOMAD</summary>
   - **Global Methods**
     - Zeroth order
     - Unconstrained
     - Box Constraints
     - Constrained 🟡
 </details>
-
 <details>
   <summary>NLopt</summary>
   - **Local Methods**
     - First order
     - Zeroth order
     - Second order 🟡
     - Box Constraints
     - Local Constrained 🟡
   - **Global Methods**
     - Zeroth order
     - First order
     - Unconstrained
     - Constrained 🟡
 </details>
-
 <details>
   <summary>Optim</summary>
   - **Local Methods**
     - Zeroth order
     - First order
     - Second order
     - Box Constraints
     - Constrained
   - **Global Methods**
     - Zeroth order
     - Unconstrained
     - Box Constraints
 </details>
-
 <details>
   <summary>PRIMA</summary>
   - **Local Methods**
     - Derivative-free
   - **Constraints**
     - Box Constraints
     - Local Constrained
 </details>
-
 <details>
   <summary>QuadDIRECT</summary>
   - **Constraints**
     - Box Constraints
   - **Global Methods**
     - Unconstrained
 </details>
-
 🟡 = supported in downstream library but not yet implemented in `Optimization.jl`; PRs to add this functionality are welcome
-
 ## Citation
 ```