typos CI #450

Merged · 1 commit · Dec 12, 2023
3 changes: 3 additions & 0 deletions .github/dependabot.yml
@@ -5,3 +5,6 @@ updates:
    directory: "/" # Location of package manifests
    schedule:
      interval: "weekly"
+    ignore:
+      - dependency-name: "crate-ci/typos"
+        update-types: ["version-update:semver-patch"]
13 changes: 13 additions & 0 deletions .github/workflows/SpellCheck.yml
@@ -0,0 +1,13 @@
name: Spell Check

on: [pull_request]

jobs:
  typos-check:
    name: Spell Check with Typos
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Actions Repository
        uses: actions/checkout@v3
      - name: Check spelling
        uses: crate-ci/[email protected]
2 changes: 2 additions & 0 deletions .typos.toml
@@ -0,0 +1,2 @@
[default.extend-words]
ND = "ND"
2 changes: 1 addition & 1 deletion docs/src/optimizations.md
@@ -28,5 +28,5 @@ surrogate_optimize(obj::Function,sop1::SOP,lb::Number,ub::Number,surrSOP::Abstract
To add another optimization method, you just need to define a new
SurrogateOptimizationAlgorithm and write its corresponding algorithm, overloading the following:
```
-surrogate_optimize(obj::Function,::NewOptimizatonType,lb,ub,surr::AbstractSurrogate,sample_type::SamplingAlgorithm;maxiters=100,num_new_samples=100)
+surrogate_optimize(obj::Function,::NewOptimizationType,lb,ub,surr::AbstractSurrogate,sample_type::SamplingAlgorithm;maxiters=100,num_new_samples=100)
```
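As an illustration of the extension pattern this documentation passage describes, a minimal sketch is below. The `GreedyRandomSearch` type and its search logic are invented for this example, and it assumes the surrogate stores its samples in `x` and `y` fields; it is not one of the package's own methods.

```julia
# Hypothetical sketch only: `GreedyRandomSearch` is not part of Surrogates.jl.
using Surrogates

struct GreedyRandomSearch <: Surrogates.SurrogateOptimizationAlgorithm end

function Surrogates.surrogate_optimize(obj::Function, ::GreedyRandomSearch, lb, ub,
        surr::AbstractSurrogate, sample_type::SamplingAlgorithm;
        maxiters = 100, num_new_samples = 100)
    y_best, i_best = findmin(surr.y)        # assumes the surrogate exposes x/y fields
    x_best = surr.x[i_best]
    for _ in 1:maxiters
        candidates = sample(num_new_samples, lb, ub, sample_type)
        x_new = candidates[argmin(surr.(candidates))]  # most promising point under the surrogate
        y_new = obj(x_new)                             # evaluate the true objective there
        add_point!(surr, x_new, y_new)                 # refine the surrogate with the new sample
        if y_new < y_best
            x_best, y_best = x_new, y_new
        end
    end
    return x_best, y_best
end
```

The real methods in src/Optimization.jl are considerably more involved; this only shows where the new type and the `surrogate_optimize` overload slot in.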
2 changes: 1 addition & 1 deletion docs/src/randomforest.md
@@ -32,7 +32,7 @@ plot!(f, label="True function", xlims=(lower_bound, upper_bound), legend=:top)

With our sampled points we can build the Random forests surrogate using the `RandomForestSurrogate` function.

-`randomforest_surrogate` behaves like an ordinary function which we can simply plot. Addtionally you can specify the number of trees created
+`randomforest_surrogate` behaves like an ordinary function which we can simply plot. Additionally you can specify the number of trees created
using the parameter num_round
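For context, a usage sketch of the call this paragraph describes might look like the following. The objective, sample data, bounds, and `num_round = 2` are assumptions rather than the tutorial's actual values, and `RandomForestSurrogate` is assumed to come from the SurrogatesRandomForest lib package; the tutorial's own example block is collapsed in this diff.

```julia
# Illustrative sketch, not the tutorial's collapsed example block.
using Surrogates
using SurrogatesRandomForest   # assumed home of RandomForestSurrogate

f = x -> sin(x) + 0.1 * x                  # stand-in objective
lower_bound, upper_bound = 0.0, 10.0
x = sample(32, lower_bound, upper_bound, SobolSample())
y = f.(x)

# num_round controls how many trees are built, as the paragraph above notes.
randomforest_surrogate = RandomForestSurrogate(x, y, lower_bound, upper_bound; num_round = 2)

randomforest_surrogate(5.0)                # callable like an ordinary function
```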

```@example RandomForestSurrogate_tutorial
2 changes: 1 addition & 1 deletion docs/src/surrogate.md
@@ -48,7 +48,7 @@ It's great that you want to add another surrogate to the library!
You will need to:

1. Define a new mutable struct and a constructor function
-2. Define add\_point!(your\_surrogate::AbstactSurrogate,x\_new,y\_new)
+2. Define add\_point!(your\_surrogate::AbstractSurrogate,x\_new,y\_new)
3. Define your\_surrogate(value) for the approximation
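A minimal sketch of those three steps follows; `MeanSurrogate` and its internals are invented for illustration and are not the example from the documentation, which is collapsed in this diff.

```julia
# Hypothetical toy surrogate that predicts the mean of the observed values.
using Surrogates: AbstractSurrogate
import Surrogates: add_point!

# 1. A new mutable struct and a constructor function
mutable struct MeanSurrogate <: AbstractSurrogate
    x::Vector{Float64}
    y::Vector{Float64}
end
# Bounds are kept for API symmetry but unused in this toy.
MeanSurrogate(x, y, lb, ub) = MeanSurrogate(collect(float.(x)), collect(float.(y)))

# 2. add_point! to incorporate new samples (scalar samples assumed for this 1-D toy)
function add_point!(s::MeanSurrogate, x_new, y_new)
    push!(s.x, x_new)
    push!(s.y, y_new)
    return s
end

# 3. Make the surrogate callable for the approximation
(s::MeanSurrogate)(value) = sum(s.y) / length(s.y)
```

A real surrogate would, of course, use `value` and the stored samples in its prediction; the point here is only where each of the three pieces goes.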

**Example**
2 changes: 1 addition & 1 deletion src/GEK.jl
@@ -93,7 +93,7 @@ end

function GEK(x, y, lb::Number, ub::Number; p = 1.0, theta = 1.0)
if length(x) != length(unique(x))
println("There exists a repetion in the samples, cannot build Kriging.")
println("There exists a repetition in the samples, cannot build Kriging.")
return
end
mu, b, sigma, inverse_of_R = _calc_gek_coeffs(x, y, p, theta)
6 changes: 3 additions & 3 deletions src/GEKPLS.jl
@@ -201,8 +201,8 @@ function _ge_compute_pls(X, y, n_comp, grads, delta_x, xlimits, extra_points)
bb_vals = bb_vals .* grads[i, :]'
_y = y[i, :] .+ sum(bb_vals, dims = 2)

-#_pls.fit(_X, _y) # relic from sklearn versiom; retained for future reference.
-#coeff_pls[:, :, i] = _pls.x_rotations_ #relic from sklearn versiom; retained for future reference.
+#_pls.fit(_X, _y) # relic from sklearn version; retained for future reference.
+#coeff_pls[:, :, i] = _pls.x_rotations_ #relic from sklearn version; retained for future reference.

coeff_pls[:, :, i] = _modified_pls(_X, _y, n_comp) #_modified_pls returns the equivalent of SKLearn's _pls.x_rotations_
if extra_points != 0
@@ -304,7 +304,7 @@ end
######end of bb design######

"""
-We subtract the mean from each variable. Then, we divide the values of each
+We subtract the mean from each variable. Then, we divide the values of each
variable by its standard deviation.

Parameters
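The standardization that docstring describes (subtract each variable's mean, then divide by its standard deviation) amounts to the following column-wise operation. The matrix here is made up for illustration; this is not the package's internal routine.

```julia
# Column-wise standardization as described in the docstring above (illustrative only).
using Statistics

X = [1.0 10.0; 2.0 20.0; 3.0 30.0; 4.0 40.0]   # 4 samples, 2 variables

X_mean = mean(X, dims = 1)
X_std = std(X, dims = 1)
X_scaled = (X .- X_mean) ./ X_std               # each column now has mean 0 and std 1
```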
2 changes: 1 addition & 1 deletion src/Kriging.jl
@@ -104,7 +104,7 @@ Constructor for type Kriging.
function Kriging(x, y, lb::Number, ub::Number; p = 2.0,
theta = 0.5 / max(1e-6 * abs(ub - lb), std(x))^p)
if length(x) != length(unique(x))
println("There exists a repetion in the samples, cannot build Kriging.")
println("There exists a repetition in the samples, cannot build Kriging.")
return
end

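For reference, the one-dimensional constructor touched here can be exercised as sketched below. The sample values are made up; the repeated-sample case simply prints the message corrected in this PR and returns nothing.

```julia
# Usage sketch for the 1-D Kriging constructor shown in the hunk above (values are illustrative).
using Surrogates

x = [0.0, 0.5, 1.0, 1.5, 2.0]
y = sin.(x)
krig = Kriging(x, y, 0.0, 2.0)   # lb = 0.0, ub = 2.0; p and theta take the defaults from the signature
krig(0.75)                        # surrogate prediction at a new point

# A repeated sample triggers the guard in the hunk above and the constructor returns nothing.
Kriging([0.0, 0.0, 1.0], [1.0, 1.0, 2.0], 0.0, 1.0) === nothing
```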
2 changes: 1 addition & 1 deletion src/Optimization.jl
@@ -1701,7 +1701,7 @@ function surrogate_optimize(obj::Function, sopd::SOP, lb, ub, surrSOPD::Abstract
new_points_y[i] = y_best
end

-#new_points[i] is splitted in new_points_x and new_points_y now contains:
+#new_points[i] is split in new_points_x and new_points_y now contains:
#[x_1,y_1; x_2,y_2,...,x_{num_new_samples},y_{num_new_samples}]

#2.4 Adaptive learning and tabu archive