Merge branch 'master' into NewBranin
ChrisRackauckas authored Dec 13, 2023
2 parents 0fc03e7 + 33d3eaa commit 9a0ef6f
Showing 56 changed files with 632 additions and 467 deletions.
3 changes: 3 additions & 0 deletions .github/dependabot.yml
@@ -5,3 +5,6 @@ updates:
directory: "/" # Location of package manifests
schedule:
interval: "weekly"
ignore:
- dependency-name: "crate-ci/typos"
update-types: ["version-update:semver-patch"]
1 change: 0 additions & 1 deletion .github/workflows/CI.yml
@@ -15,7 +15,6 @@ jobs:
- Core
version:
- '1'
- '1.6'
steps:
- uses: actions/checkout@v4
- uses: julia-actions/setup-julia@v1
10 changes: 9 additions & 1 deletion .github/workflows/Documentation.yml
@@ -16,7 +16,15 @@ jobs:
with:
version: '1'
- name: Install dependencies
run: julia --project=docs/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
run: julia --project=docs/ -e 'using Pkg;
Pkg.develop(PackageSpec(path=pwd()));
Pkg.develop(PackageSpec(path=joinpath(pwd(), "lib", "SurrogatesAbstractGPs")));
Pkg.develop(PackageSpec(path=joinpath(pwd(), "lib", "SurrogatesFlux")));
Pkg.develop(PackageSpec(path=joinpath(pwd(), "lib", "SurrogatesMOE")));
Pkg.develop(PackageSpec(path=joinpath(pwd(), "lib", "SurrogatesPolyChaos")));
Pkg.develop(PackageSpec(path=joinpath(pwd(), "lib", "SurrogatesRandomForest")));
Pkg.develop(PackageSpec(path=joinpath(pwd(), "lib", "SurrogatesSVM")));
Pkg.instantiate()'
- name: Build and deploy
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # For authentication with GitHub Actions token
13 changes: 13 additions & 0 deletions .github/workflows/SpellCheck.yml
@@ -0,0 +1,13 @@
name: Spell Check

on: [pull_request]

jobs:
typos-check:
name: Spell Check with Typos
runs-on: ubuntu-latest
steps:
- name: Checkout Actions Repository
uses: actions/checkout@v4
- name: Check spelling
uses: crate-ci/[email protected]
2 changes: 2 additions & 0 deletions .typos.toml
@@ -0,0 +1,2 @@
[default.extend-words]
ND = "ND"
6 changes: 3 additions & 3 deletions Project.toml
@@ -1,7 +1,7 @@
name = "Surrogates"
uuid = "6fc51010-71bc-11e9-0e15-a3fcc6593c49"
authors = ["SciML"]
version = "6.6.0"
version = "6.8.0"

[deps]
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
@@ -21,10 +21,10 @@ Flux = "0.12, 0.13"
GLM = "1.3"
IterativeSolvers = "0.9"
PolyChaos = "0.2"
QuasiMonteCarlo = "=0.2.16"
QuasiMonteCarlo = "0.3"
Statistics = "1"
Zygote = "0.4, 0.5, 0.6"
julia = "1.6"
julia = "1.9"

[extras]
Cubature = "667455a9-e2ce-5579-9412-b964f529a492"
2 changes: 1 addition & 1 deletion docs/Project.toml
@@ -15,7 +15,7 @@ Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

[compat]
AbstractGPs = "0.5.13"
Documenter = "0.27"
Documenter = "1"
Flux = "0.13.7, 0.14"
Plots = "1.36.2"
QuadGK = "2.6.0"
18 changes: 6 additions & 12 deletions docs/make.jl
@@ -10,17 +10,11 @@ using Plots
include("pages.jl")

makedocs(sitename = "Surrogates.jl",
strict = [
:doctest,
:linkcheck,
:parse_error,
:example_block,
# Other available options are
# :autodocs_block, :cross_references, :docs_block, :eval_block, :example_block, :footnote, :meta_block, :missing_docs, :setup_block
],
format = Documenter.HTML(analytics = "UA-90474609-3",
assets = ["assets/favicon.ico"],
canonical = "https://docs.sciml.ai/Surrogates/stable/"),
pages = pages)
linkcheck = true,
warnonly = [:missing_docs],
format = Documenter.HTML(analytics = "UA-90474609-3",
assets = ["assets/favicon.ico"],
canonical = "https://docs.sciml.ai/Surrogates/stable/"),
pages = pages)

deploydocs(repo = "github.com/SciML/Surrogates.jl.git")
78 changes: 39 additions & 39 deletions docs/pages.jl
@@ -1,40 +1,40 @@
pages = ["index.md"
"Tutorials" => [
"Basics" => "tutorials.md",
"Radials" => "radials.md",
"Kriging" => "kriging.md",
"Gaussian Process" => "abstractgps.md",
"Lobachevsky" => "lobachevsky.md",
"Linear" => "LinearSurrogate.md",
"InverseDistance" => "InverseDistance.md",
"RandomForest" => "randomforest.md",
"SecondOrderPolynomial" => "secondorderpoly.md",
"NeuralSurrogate" => "neural.md",
"Wendland" => "wendland.md",
"Polynomial Chaos" => "polychaos.md",
"Variable Fidelity" => "variablefidelity.md",
"Gradient Enhanced Kriging" => "gek.md",
"GEKPLS" => "gekpls.md",
"MOE" => "moe.md",
"Parallel Optimization" => "parallel.md"
]
"User guide" => [
"Samples" => "samples.md",
"Surrogates" => "surrogate.md",
"Optimization" => "optimizations.md",
]
"Benchmarks" => [
"Sphere function" => "sphere_function.md",
"Lp norm" => "lp.md",
"Rosenbrock" => "rosenbrock.md",
"Tensor product" => "tensor_prod.md",
"Cantilever beam" => "cantilever.md",
"Water Flow function" => "water_flow.md",
"Welded beam function" => "welded_beam.md",
"Branin function" => "BraninFunction.md",
"Improved Branin function" => "ImprovedBraninFunction.md",
"Ackley function" => "ackley.md",
"Gramacy & Lee Function" => "gramacylee.md",
"Salustowicz Benchmark" => "Salustowicz.md",
"Multi objective optimization" => "multi_objective_opt.md",
]]
"Tutorials" => [
"Basics" => "tutorials.md",
"Radials" => "radials.md",
"Kriging" => "kriging.md",
"Gaussian Process" => "abstractgps.md",
"Lobachevsky" => "lobachevsky.md",
"Linear" => "LinearSurrogate.md",
"InverseDistance" => "InverseDistance.md",
"RandomForest" => "randomforest.md",
"SecondOrderPolynomial" => "secondorderpoly.md",
"NeuralSurrogate" => "neural.md",
"Wendland" => "wendland.md",
"Polynomial Chaos" => "polychaos.md",
"Variable Fidelity" => "variablefidelity.md",
"Gradient Enhanced Kriging" => "gek.md",
"GEKPLS" => "gekpls.md",
"MOE" => "moe.md",
"Parallel Optimization" => "parallel.md",
]
"User guide" => [
"Samples" => "samples.md",
"Surrogates" => "surrogate.md",
"Optimization" => "optimizations.md",
]
"Benchmarks" => [
"Sphere function" => "sphere_function.md",
"Lp norm" => "lp.md",
"Rosenbrock" => "rosenbrock.md",
"Tensor product" => "tensor_prod.md",
"Cantilever beam" => "cantilever.md",
"Water Flow function" => "water_flow.md",
"Welded beam function" => "welded_beam.md",
"Branin function" => "BraninFunction.md",
"Improved Branin function" => "ImprovedBraninFunction.md",
"Ackley function" => "ackley.md",
"Gramacy & Lee Function" => "gramacylee.md",
"Salustowicz Benchmark" => "Salustowicz.md",
"Multi objective optimization" => "multi_objective_opt.md",
]]
4 changes: 2 additions & 2 deletions docs/src/InverseDistance.md
@@ -15,15 +15,15 @@ default()

### Sampling

We choose to sample f in 25 points between 0 and 10 using the `sample` function. The sampling points are chosen using a Low Discrepancy, this can be done by passing `LowDiscrepancySample()` to the `sample` function.
We choose to sample f at 25 points between 0 and 10 using the `sample` function. The sampling points are chosen with a low-discrepancy sequence, which can be done by passing `HaltonSample()` to the `sample` function.

```@example Inverse_Distance1D
f(x) = sin(x) + sin(x)^2 + sin(x)^3
n_samples = 25
lower_bound = 0.0
upper_bound = 10.0
x = sample(n_samples, lower_bound, upper_bound, LowDiscrepancySample(;base=2))
x = sample(n_samples, lower_bound, upper_bound, HaltonSample())
y = f.(x)
scatter(x, y, label="Sampled points", xlims=(lower_bound, upper_bound), legend=:top)
2 changes: 1 addition & 1 deletion docs/src/ackley.md
@@ -58,7 +58,7 @@ The fit looks good. Let's now see if we are able to find the minimum value using
optimization methods:

```@example ackley
surrogate_optimize(ackley,DYCORS(),lb,ub,my_rad,UniformSample())
surrogate_optimize(ackley,DYCORS(),lb,ub,my_rad,RandomSample())
scatter(x, y, label="Sampled points", xlims=(lb, ub), ylims=(0, 30), legend=:top)
plot!(xs, ackley.(xs), label="True function", legend=:top)
plot!(xs, my_rad.(xs), label="Radial basis optimized", legend=:top)
2 changes: 1 addition & 1 deletion docs/src/gekpls.md
@@ -80,7 +80,7 @@ This next example demonstrates how this can be accomplished.
y = sphere_function.(x)
g = GEKPLS(x, y, grads, n_comp, delta_x, lb, ub, extra_points, initial_theta)
x_point, minima = surrogate_optimize(sphere_function, SRBF(), lb, ub, g,
UniformSample(); maxiters = 20,
RandomSample(); maxiters = 20,
num_new_samples = 20, needs_gradient = true)
println(minima)
47 changes: 25 additions & 22 deletions docs/src/index.md
@@ -107,62 +107,65 @@ my_lobachevsky = LobachevskySurrogate(x,y,lb,ub,alpha=alpha,n=n)
value = my_lobachevsky(5.0)
#Adding more data points
surrogate_optimize(f,SRBF(),lb,ub,my_lobachevsky,UniformSample())
surrogate_optimize(f,SRBF(),lb,ub,my_lobachevsky,RandomSample())
#New approximation
value = my_lobachevsky(5.0)
```
## Reproducibility

```@raw html
<details><summary>The documentation of this SciML package was built using these direct dependencies,</summary>
```

```@example
using Pkg # hide
Pkg.status() # hide
```

```@raw html
</details>
```

```@raw html
<details><summary>and using this machine and Julia version.</summary>
```

```@example
using InteractiveUtils # hide
versioninfo() # hide
```

```@raw html
</details>
```

```@raw html
<details><summary>A more complete overview of all dependencies and their versions is also provided.</summary>
```

```@example
using Pkg # hide
Pkg.status(;mode = PKGMODE_MANIFEST) # hide
Pkg.status(; mode = PKGMODE_MANIFEST) # hide
```

```@raw html
</details>
```
```@raw html
You can also download the
<a href="
```
```@eval
using TOML
version = TOML.parse(read("../../Project.toml",String))["version"]
name = TOML.parse(read("../../Project.toml",String))["name"]
link = "https://github.com/SciML/"*name*".jl/tree/gh-pages/v"*version*"/assets/Manifest.toml"
```
```@raw html
">manifest</a> file and the
<a href="
```

```@eval
using TOML
version = TOML.parse(read("../../Project.toml",String))["version"]
name = TOML.parse(read("../../Project.toml",String))["name"]
link = "https://github.com/SciML/"*name*".jl/tree/gh-pages/v"*version*"/assets/Project.toml"
```
```@raw html
">project</a> file.
using Markdown
version = TOML.parse(read("../../Project.toml", String))["version"]
name = TOML.parse(read("../../Project.toml", String))["name"]
link_manifest = "https://github.com/SciML/" * name * ".jl/tree/gh-pages/v" * version *
"/assets/Manifest.toml"
link_project = "https://github.com/SciML/" * name * ".jl/tree/gh-pages/v" * version *
"/assets/Project.toml"
Markdown.parse("""You can also download the
[manifest]($link_manifest)
file and the
[project]($link_project)
file.
""")
```
3 changes: 1 addition & 2 deletions docs/src/moe.md
@@ -92,7 +92,7 @@ end
lb = [-1.0, -1.0]
ub = [1.0, 1.0]
n = 150
x = sample(n, lb, ub, SobolSample())
x = sample(n, lb, ub, RandomSample())
y = discont_NDIM.(x)
x_test = sample(10, lb, ub, GoldenSample())
@@ -110,7 +110,6 @@ rbf = RadialBasis(x, y, lb, ub)
rbf_pred_vals = rbf.(x_test)
rbf_rmse = rmse(true_vals, rbf_pred_vals)
println(rbf_rmse > moe_rmse)
```

### Usage Notes - Example With Other Surrogates
2 changes: 1 addition & 1 deletion docs/src/optimizations.md
@@ -28,5 +28,5 @@ surrogate_optimize(obj::Function,sop1::SOP,lb::Number,ub::Number,surrSOP::Abstra
To add another optimization method, you just need to define a new
SurrogateOptimizationAlgorithm and write its corresponding algorithm, overloading the following:
```
surrogate_optimize(obj::Function,::NewOptimizatonType,lb,ub,surr::AbstractSurrogate,sample_type::SamplingAlgorithm;maxiters=100,num_new_samples=100)
surrogate_optimize(obj::Function,::NewOptimizationType,lb,ub,surr::AbstractSurrogate,sample_type::SamplingAlgorithm;maxiters=100,num_new_samples=100)
```
16 changes: 8 additions & 8 deletions docs/src/parallel.md
@@ -17,24 +17,24 @@ To ensure that points of interest returned by `potential_optimal_points` are suf

The following strategies are available for virtual point selection for all optimization algorithms:

- "Minimum Constant Liar (CLmin)":
- "Minimum Constant Liar (MinimumConstantLiar)":
- The virtual point is assigned using the lowest known value of the merit function across all evaluated points.
- "Mean Constant Liar (CLmean)":
- "Mean Constant Liar (MeanConstantLiar)":
- The virtual point is assigned using the mean of the merit function across all evaluated points.
- "Maximum Constant Liar (CLmax)":
- "Maximum Constant Liar (MaximumConstantLiar)":
- The virtual point is assigned using the greatest known value of the merit function across all evaluated points.

For Kriging surrogates specifically, the above strategies are available, along with the following:

- "Kriging Believer (KB)":
- "Kriging Believer (KrigingBeliever)":
- The virtual point is assigned using the mean of the Kriging surrogate at the virtual point.
- "Kriging Believer Upper Bound (KBUB)":
- "Kriging Believer Upper Bound (KrigingBelieverUpperBound)":
- The virtual point is assigned using 3$\sigma$ above the mean of the Kriging surrogate at the virtual point.
- "Kriging Believer Lower Bound (KBLB)":
- "Kriging Believer Lower Bound (KrigingBelieverLowerBound)":
- The virtual point is assigned using 3$\sigma$ below the mean of the Kriging surrogate at the virtual point.


In general, CLmin and KBLB tend to favor exploitation while CLmax and KBUB tend to favor exploration. CLmean and KB tend to be a compromise between the two.
In general, MinimumConstantLiar and KrigingBelieverLowerBound tend to favor exploitation while MaximumConstantLiar and KrigingBelieverUpperBound tend to favor exploration. MeanConstantLiar and KrigingBeliever tend to be a compromise between the two.
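The constant-liar mechanism described above can be sketched in a few lines. The following is an illustrative plain-Python sketch with a toy 1D nearest-neighbor "surrogate", not the Surrogates.jl implementation; the names `nearest_value`, `constant_liar_batch`, and `lie_of` are hypothetical.

```python
# Illustrative sketch of the constant-liar batching idea (hypothetical names,
# not the Surrogates.jl implementation).

def nearest_value(xs, ys, x):
    """Toy 1D 'surrogate': predict the y of the nearest sampled x."""
    i = min(range(len(xs)), key=lambda j: abs(xs[j] - x))
    return ys[i]

def constant_liar_batch(xs, ys, candidates, batch_size, lie_of):
    """Pick a batch of points, faking each pick's value with lie_of(ys)."""
    xs, ys = list(xs), list(ys)          # work on copies; real data stays untouched
    candidates = list(candidates)
    batch = []
    for _ in range(batch_size):
        lie = lie_of(ys)                 # e.g. min, max, or mean of known values
        # greedily pick the candidate with the lowest surrogate prediction
        x_new = min(candidates, key=lambda c: nearest_value(xs, ys, c))
        batch.append(x_new)
        candidates.remove(x_new)
        xs.append(x_new)
        ys.append(lie)                   # the "lie": pretend x_new was evaluated,
                                         # so the next pick reacts to this fake value
    return batch

# With lie_of=min (the MinimumConstantLiar analogue) the lie is low, so later
# picks cluster near earlier ones (exploitation); lie_of=max pushes them apart.
batch = constant_liar_batch([0.0, 5.0, 10.0], [4.0, 1.0, 9.0], range(11), 3, min)
print(batch)  # the first pick is 3, the first candidate nearest the best point
```

Re-fitting the real surrogate after each lie, as Kriging Believer does with the surrogate mean instead of a constant, follows the same loop structure.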

## Examples

@@ -50,7 +50,7 @@ y = f.(x)
my_k = Kriging(x, y, lb, ub)
for _ in 1:10
new_x, eis = potential_optimal_points(EI(), lb, ub, my_k, SobolSample(), 3, CLmean!)
new_x, eis = potential_optimal_points(EI(), MeanConstantLiar(), lb, ub, my_k, SobolSample(), 3)
add_point!(my_k, new_x, f.(new_x))
end
```
4 changes: 2 additions & 2 deletions docs/src/polychaos.md
@@ -9,7 +9,7 @@ we are trying to fit. Under the hood, PolyChaos.jl has been used.
It is possible to specify a type of polynomial for each dimension of the problem.
### Sampling

We choose to sample f in 25 points between 0 and 10 using the `sample` function. The sampling points are chosen using a Low Discrepancy, this can be done by passing `LowDiscrepancySample()` to the `sample` function.
We choose to sample f at 25 points between 0 and 10 using the `sample` function. The sampling points are chosen with a low-discrepancy sequence, which can be done by passing `HaltonSample()` to the `sample` function.

```@example polychaos
using Surrogates
@@ -20,7 +20,7 @@ default()
n = 20
lower_bound = 1.0
upper_bound = 6.0
x = sample(n,lower_bound,upper_bound,LowDiscrepancySample(2))
x = sample(n,lower_bound,upper_bound,HaltonSample())
f = x -> log(x)*x + sin(x)
y = f.(x)
scatter(x, y, label="Sampled points", xlims=(lower_bound, upper_bound), legend=:top)