
Make theta a user-supplied array param #370

Merged

Conversation

vikram-s-narayan (Contributor)

Addresses #366 by enabling users to supply their own theta array as a param. This will allow users to do hyperparameter optimization outside of the GEKPLS system.
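For context, a minimal sketch of what the new signature enables. The data shapes, limits, and theta values below are illustrative, and exporting GEKPLS from Surrogates is assumed; only the argument order follows the constructor touched in this PR.

```julia
using Surrogates

# Illustrative 2-D training data; in practice X, y, and grads come from the user's model.
n, d = 50, 2
X = rand(n, d)
y = vec(sum(X .^ 2, dims = 2))
grads = 2 .* X                       # gradients of the toy function at each sample
xlimits = [0.0 1.0; 0.0 1.0]         # assumed layout: one row of (lower, upper) per dimension

n_comp = 2                           # number of PLS components
delta_x = 0.0001
extra_points = 2
theta = fill(0.01, n_comp)           # user-supplied theta array; length assumed to equal n_comp

g = GEKPLS(X, y, grads, n_comp, delta_x, xlimits, extra_points, theta)
```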

codecov bot commented Jul 8, 2022

Codecov Report

Merging #370 (ade62b4) into master (2de1b9d) will increase coverage by 0.21%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #370      +/-   ##
==========================================
+ Coverage   78.76%   78.97%   +0.21%     
==========================================
  Files          16       16              
  Lines        2270     2269       -1     
==========================================
+ Hits         1788     1792       +4     
+ Misses        482      477       -5     
Impacted Files        Coverage Δ
src/GEKPLS.jl         92.12% <100.00%> (+0.88%) ⬆️
src/Optimization.jl   72.34% <0.00%> (+0.27%) ⬆️


vikram-s-narayan changed the title from "make theta a user-supplied array param" to "Make theta a user-supplied array param" on Jul 8, 2022
ranjanan (Contributor) commented Jul 8, 2022

Is the plan to add a gradient-free optimization method for the user for now while you work on #371?

@@ -35,15 +35,14 @@ function bounds_error(x, xl)
 end
 
 #constructor for GEKPLS Struct
-function GEKPLS(X, y, grads, n_comp, delta_x, xlimits, extra_points, θ)
+function GEKPLS(X, y, grads, n_comp, delta_x, xlimits, extra_points, theta)
Member
We should docstring these. Anyways, that's for the future.

ChrisRackauckas (Member)

I'm not sure the hyperparameter optimization needs to be in the library with this feature. You just create and run an OptimizationProblem tuning theta and it should be fine.

ChrisRackauckas merged commit 8cebcbe into SciML:master on Jul 8, 2022
vikram-s-narayan (Contributor, Author) commented Jul 9, 2022

> Is the plan to add a gradient-free optimization method for the user for now while you work on #371?

Users can use BlackBoxOptim.jl or an equivalent package on their own (outside of the GEKPLS system) until we make GEKPLS differentiable. After that, users will still do hyperparameter optimization on their own outside of the system, with their choice of gradient-based optimization method.
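A rough sketch of that external workflow with BlackBoxOptim.jl; the loss function, the held-out `Xtest`/`ytest` arrays, the callable-surrogate prediction, and the search range are all assumptions for illustration.

```julia
using Surrogates, BlackBoxOptim

# Assumed to already exist: X, y, grads, xlimits, plus held-out Xtest, ytest.
n_comp, delta_x, extra_points = 2, 0.0001, 2

# Score a candidate theta by the validation error of the surrogate it produces.
function theta_loss(theta)
    g = GEKPLS(X, y, grads, n_comp, delta_x, xlimits, extra_points, collect(theta))
    preds = [g(Xtest[i, :]) for i in 1:size(Xtest, 1)]   # assumes the surrogate is callable
    return sum(abs2, preds .- ytest)
end

res = bboptimize(theta_loss;
                 SearchRange = (1e-4, 1.0),   # per-component bounds on theta
                 NumDimensions = n_comp,
                 MaxSteps = 2000)
best_theta = best_candidate(res)
```

Only `theta_loss` touches GEKPLS, so the same loop works with any other derivative-free optimizer.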

@ChrisRackauckas
Copy link
Member

Set it up to make an Optimization.jl OptimizationProblem and it'll then have access to all optimizers. Then it will just fail if you try to AutoZygote or something, which would then be fixed by diff support.
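A sketch of that setup, reusing the illustrative `theta_loss` from the previous snippet; NelderMead via OptimizationOptimJL is just one derivative-free choice, not part of this PR.

```julia
using Optimization, OptimizationOptimJL

# Optimization.jl objectives take (u, p); the parameter slot p is unused here.
obj = OptimizationFunction((theta, p) -> theta_loss(theta))

theta0 = fill(0.01, n_comp)            # initial guess, one entry per PLS component
prob = OptimizationProblem(obj, theta0)

sol = solve(prob, NelderMead())        # derivative-free until GEKPLS is differentiable
best_theta = sol.u
```

Once differentiability lands, the same problem could presumably be built with an AD backend (e.g. `OptimizationFunction(f, Optimization.AutoZygote())`) and handed to a gradient-based solver instead.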

vikram-s-narayan deleted the make_theta_param_as_array branch on December 19, 2022