Custom stopping criteria and loss functions #211
I'm not sure this is provided by this package, but you can get it using the MLJ wrapper:

Is this what you're after?
Mmm... I see warm restart has not been implemented for the wrapper, which will make run time very slow for large numbers of iterations. I've posted JuliaAI/MLJDecisionTreeInterface.jl#40 in response.
I am trying to use a random forest classifier with:
Similar features are implemented in other packages such as LightGBM. See, for instance, the links below:
I was hoping to be able to do something similar with DecisionTree.jl directly.
Yes, I understand. I just don't think that functionality exists here. I'll leave the issue open, and perhaps someone will add it. For my part, I'd rather prioritise model-generic solutions for controlling iterative models, which is what MLJIteration provides. That way we avoid a lot of duplication of effort.
@ablaom I think I figured out how to do it using native APIs. In the case of classification trees, this is easy enough: all you need to do is something along the lines of

In the case of a random forest, things are a little more complicated. Is there any way around writing custom versions of DecisionTree.jl/src/classification/main.jl Line 378 in f57a156 and DecisionTree.jl/src/classification/main.jl Line 394 in f57a156, I suppose.

EDIT: I think it could be nice to extend DecisionTree, or to have a small package with more flexible versions of
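The approach described above — growing the forest one tree at a time with a custom stopping check, instead of calling `build_forest` — can be sketched using only the public `build_tree`/`apply_tree` API. This is a rough illustration, not package code: the function name `grow_forest_early_stop` and the parameters `max_trees` and `patience` are my own choices, and the stopping rule (out-of-bag error failing to improve) is just one possible criterion.

```julia
using DecisionTree, Random, Statistics

# Sketch: grow a forest tree-by-tree, stopping when the out-of-bag (OOB)
# misclassification rate has not improved for `patience` consecutive trees.
# Illustrative only; none of these names are part of the DecisionTree.jl API.
function grow_forest_early_stop(labels::AbstractVector, features::AbstractMatrix;
                                max_trees=100, patience=5, rng=Random.default_rng())
    n = length(labels)
    trees = Any[]
    votes = [Dict{eltype(labels),Int}() for _ in 1:n]   # OOB vote tallies
    best_err, since_best = Inf, 0
    for _ in 1:max_trees
        idx = rand(rng, 1:n, n)                          # bootstrap sample
        tree = build_tree(labels[idx], features[idx, :])
        push!(trees, tree)
        for i in setdiff(1:n, idx)                       # OOB rows for this tree
            y = apply_tree(tree, features[i, :])
            votes[i][y] = get(votes[i], y, 0) + 1
        end
        scored = [i for i in 1:n if !isempty(votes[i])]
        err = mean(argmax(votes[i]) != labels[i] for i in scored)
        if err < best_err
            best_err, since_best = err, 0
        else
            since_best += 1
            since_best >= patience && break              # custom stopping criterion
        end
    end
    return trees, best_err
end
```

Any other criterion (validation-set loss, a wall-clock budget, etc.) can be swapped into the same loop; majority voting over the collected trees then replaces `apply_forest` at prediction time.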
Yes, I also recently discovered that the
@ablaom I have almost finished writing a custom implementation that allows for custom bootstrapping as well (e.g., stratified sampling). Do you think it would be best to keep it separate, or would you accept a pull request with it as well?
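For readers following along, the stratified bootstrapping mentioned here amounts to resampling with replacement within each class so that every resample preserves the original class proportions. A minimal sketch (the helper name `stratified_bootstrap` is hypothetical, not an existing API):

```julia
using Random

# Sketch: bootstrap indices stratified by class label, so each resample
# has the same class frequencies as `labels`. Illustrative name only.
function stratified_bootstrap(labels::AbstractVector; rng=Random.default_rng())
    idx = Int[]
    for c in unique(labels)
        pool = findall(==(c), labels)            # row indices of class c
        append!(idx, rand(rng, pool, length(pool)))
    end
    return shuffle!(rng, idx)
end
```

The returned indices can be used in place of the plain bootstrap sample when building each tree.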
Glad to hear about the progress. I think to reduce the maintenance burden on this package I'd prefer not to add model-generic functionality within the package itself. MLJ and other toolboxes provide for things like stratified resampling. For example:
I suggest that if MLJ has a feature you're missing that you open an issue there - and maybe even help provide it. The impact will be greater and the maintenance burden lower.
@fipelle When this PR merges you will be able to (efficiently) control early stopping (and more) through the MLJ interface. A RandomForestClassifier example is given in the PR. Another example is this notebook.
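For context, controlling early stopping through MLJ uses its documented `IteratedModel` wrapper, which treats `n_trees` as the iteration parameter and applies generic controls from IterationControl.jl. A hedged sketch, assuming the warm-restart support in the PR above is available (otherwise the iteration parameter may need to be passed explicitly):

```julia
using MLJ

# Load the DecisionTree.jl random forest via its MLJ interface.
RandomForestClassifier = @load RandomForestClassifier pkg=DecisionTree

forest = RandomForestClassifier()

# Wrap the model so that trees are added in steps and training stops
# when the holdout log-loss stops improving. Control settings here
# (Step(5), Patience(3), NumberLimit(50)) are illustrative choices.
iterated_forest = IteratedModel(
    model=forest,
    resampling=Holdout(fraction_train=0.8),
    measure=log_loss,
    controls=[Step(5),          # add 5 trees per control cycle
              Patience(3),      # stop after 3 non-improving cycles
              NumberLimit(50)], # hard cap on control cycles
    retrain=true)               # refit on all data once stopped

X, y = @load_iris
mach = machine(iterated_forest, X, y)
fit!(mach)
```

Custom loss functions slot in through the `measure` keyword, so the same wrapper covers both requests in this issue.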
Hi,
I can't seem to find the documentation for creating custom stopping criteria (ideally for ensembles) and loss functions. Could you please point me in the right direction? Thanks!