linearGAM with sklearn gridsearchCV #247
Comments
@hongkahjun unfortunately this is a somewhat deep issue. When writing the new […] This will require a deeper fix right now :/
Getting what looks like the same problem when using […]. Admittedly, knowing absolutely nothing about the motivations for the changes you made, it seems like diverging from such an important package's requirements would be a bad idea? I know that if I can't get it to play nice with my company's existing sklearn infrastructure, I'm probably going to have to abandon it. Is there an older version that keeps to sklearn's requirements?
I see two solutions to this issue: […]
Please let me know if either of these options is preferable and I will happily throw together a PR to fix this issue. Also, let me know if I've missed something!
Actually, having looked at this further, this does not seem to have anything at all to do with […]. I believe this was the root of this bug. When I instead apply the changes in #267, GridSearchCV succeeds with no issues. Could you please look at this PR and see if it is an acceptable fix?
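For anyone hitting this: the failure mode the thread is circling can be reproduced without pygam at all. sklearn's `clone` rebuilds an estimator by calling `type(est)(**est.get_params())`, so every key that `get_params()` reports must be an accepted `__init__` argument. Below is a minimal pure-Python sketch of that contract; the class names (`BaseGAM`, `BrokenLinearGAM`) and the `naive_clone` helper are illustrative stand-ins, not pygam's or sklearn's actual code.

```python
# Sketch of sklearn's clone contract (illustrative names, not pygam's code).
# clone(est) effectively does: type(est)(**est.get_params()), so a subclass
# whose __init__ drops a parameter that get_params() still reports will fail.

class BaseGAM:
    def __init__(self, lam=0.6, callbacks=('deviance',)):
        self.lam = lam
        self.callbacks = callbacks

    def get_params(self, deep=True):
        # report all stored hyperparameters, as sklearn expects
        return {'lam': self.lam, 'callbacks': self.callbacks}

class BrokenLinearGAM(BaseGAM):
    # analogue of the reported bug: the subclass signature omits
    # 'callbacks', yet get_params() (inherited) still reports it
    def __init__(self, lam=0.6):
        super().__init__(lam=lam)

def naive_clone(estimator):
    # the essence of sklearn.base.clone, stripped down
    return type(estimator)(**estimator.get_params(deep=False))

gam = BrokenLinearGAM(lam=0.3)
try:
    naive_clone(gam)
    print('clone succeeded')
except TypeError as exc:
    # clone fails: __init__ got an unexpected keyword argument 'callbacks'
    print('clone failed:', exc)
```

Any `__init__`/`get_params` mismatch of this shape will surface the moment GridSearchCV tries to clone the estimator for cross-validation.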
It seems this can be fixed simply by adding "callbacks=callbacks," to line 2267 of pygam.py.
Also having this issue
Is this not fixed yet? |
Here is the complete correction to pygam.py (line 2461): super(LinearGAM, self).__init__(
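The fix sketched in the two comments above (forwarding `callbacks` through the subclass `__init__` to the parent) can be modeled in a few lines. This is a hedged sketch with assumed class names and a reduced argument list, not pygam's real signature; the point is only that once the subclass accepts and forwards `callbacks`, reconstruction from `get_params()` round-trips again.

```python
# Sketch of the proposed fix (assumed names/signatures, not pygam's exact
# code): the subclass accepts 'callbacks' and forwards it to the base
# class, so get_params() and __init__ stay in sync and cloning works.

class BaseGAM:
    def __init__(self, lam=0.6, callbacks=('deviance',)):
        self.lam = lam
        self.callbacks = callbacks

    def get_params(self, deep=True):
        return {'lam': self.lam, 'callbacks': self.callbacks}

class FixedLinearGAM(BaseGAM):
    def __init__(self, lam=0.6, callbacks=('deviance',)):
        # the one-line fix: pass callbacks through to super().__init__
        super().__init__(lam=lam, callbacks=callbacks)

def naive_clone(estimator):
    # the essence of sklearn.base.clone
    return type(estimator)(**estimator.get_params(deep=False))

copy = naive_clone(FixedLinearGAM(lam=0.3))
print(copy.lam, copy.callbacks)  # 0.3 ('deviance',)
```

With the real library, the equivalent check is `sklearn.base.clone(LinearGAM())`, which is exactly what GridSearchCV performs before fitting each parameter combination.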
Hi,
I tried implementing LinearGAM with sklearn's GridSearchCV and got an error when GridSearchCV tried to clone the estimator. The code is below:
gam_rank, gam_cv_results = gam(x_all, y_all)
I get the error […]
The dataset I used was sklearn's California housing dataset.