AutoKeras with flexible activation function and learning rate schedule #1405
-
Hi Haifeng and anyone who may help, I have had a very fun experience working with AutoKeras, but it seems the activation function is fixed to ReLU and a constant learning rate is used. We are using feed-forward neural networks for regression, and would like to use AutoKeras to discover more neural network architectures with flexible activation functions (such as LeakyReLU) and learning rate schedules (such as exponential decay). Is this possible? Looking forward to exploring more. Thank you. Meng
-
You may implement it through the `callbacks` argument passed to the `fit` function, though I am not 100% sure whether it works. For weight decay, we support Adam with weight decay in the search space, but it is not configurable by the user.
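A minimal sketch of what the callback approach could look like for an exponential learning rate schedule. The decay function itself is plain Python; the AutoKeras call in the comments is an assumption based on the reply above (that `fit` forwards `callbacks` to the underlying Keras training loop) and is not verified here. `StructuredDataRegressor` and the data variables are illustrative.

```python
def exponential_decay(initial_lr=1e-3, decay_rate=0.9):
    """Return a schedule function mapping epoch -> learning rate.

    lr(epoch) = initial_lr * decay_rate ** epoch
    """
    def schedule(epoch, lr=None):
        return initial_lr * decay_rate ** epoch
    return schedule


schedule = exponential_decay()
print(schedule(0))  # 0.001 at epoch 0
print(schedule(5))  # decayed rate at epoch 5

# With TensorFlow and AutoKeras installed, the schedule could be wrapped in
# a Keras callback and passed to fit (assumption: callbacks are forwarded
# to the Keras training loop, per the reply above):
#
#   import tensorflow as tf
#   import autokeras as ak
#
#   reg = ak.StructuredDataRegressor(max_trials=10)
#   reg.fit(x_train, y_train,
#           callbacks=[tf.keras.callbacks.LearningRateScheduler(schedule)])
```

Note that this only varies the learning rate during each trial's training; the schedule itself is not part of the search space.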