Major upgrade
In TPOT 0.4, we've made some major changes to the internals of TPOT and added some convenience functions. We've summarized the changes below.
- Added new sklearn models and preprocessors
    - AdaBoostClassifier
    - BernoulliNB
    - ExtraTreesClassifier
    - GaussianNB
    - MultinomialNB
    - LinearSVC
    - PassiveAggressiveClassifier
    - GradientBoostingClassifier
    - RBFSampler
    - FastICA
    - FeatureAgglomeration
    - Nystroem
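For illustration, here is a hand-built scikit-learn pipeline that combines one of the newly supported preprocessors (FastICA) with one of the newly supported classifiers (ExtraTreesClassifier). This is only a sketch of the kind of pipeline TPOT can now assemble; the dataset and parameter values are arbitrary, not TPOT output.

```python
# Illustrative only: a hand-built pipeline in the shape TPOT can now discover.
from sklearn.datasets import load_iris
from sklearn.decomposition import FastICA
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.pipeline import make_pipeline

X, y = load_iris(return_X_y=True)

pipeline = make_pipeline(
    FastICA(n_components=3, random_state=42),                 # newly supported preprocessor
    ExtraTreesClassifier(n_estimators=100, random_state=42),  # newly supported classifier
)
pipeline.fit(X, y)
print(pipeline.score(X, y))
```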
- Added an operator that inserts virtual features for the count of features with a value of zero
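Conceptually, the operator appends a "virtual" column holding each row's count of zero-valued features. The sketch below illustrates that idea; it is an assumption about the concept, not TPOT's actual implementation.

```python
# Minimal sketch (not TPOT's internals): append a per-row zero count as a new column.
import numpy as np

def add_zero_count_feature(X):
    X = np.asarray(X, dtype=float)
    zero_counts = (X == 0).sum(axis=1, keepdims=True)  # number of zero-valued features per row
    return np.hstack([X, zero_counts])

X = np.array([[0.0, 1.5, 0.0],
              [2.0, 0.0, 3.0]])
print(add_zero_count_feature(X))
# [[0.  1.5 0.  2. ]
#  [2.  0.  3.  1. ]]
```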
- Reworked parameterization of TPOT operators
    - Reduced parameter search space with information from a scikit-learn benchmark
    - TPOT no longer generates arbitrary parameter values, but uses a fixed parameter set instead
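The difference can be sketched as follows; the parameter names and values here are illustrative assumptions, not the grid TPOT actually uses.

```python
# Conceptual sketch: arbitrary values vs. a fixed, benchmark-informed parameter set.
import random

# Before: any value in a continuous range could be generated.
arbitrary_C = random.uniform(0.0001, 100.0)

# After: values are drawn from a predefined grid (illustrative values only).
LINEAR_SVC_PARAMS = {
    "C": [0.01, 0.1, 1.0, 10.0, 100.0],
    "penalty": ["l1", "l2"],
}
fixed_choice = {name: random.choice(values) for name, values in LINEAR_SVC_PARAMS.items()}
print(arbitrary_C, fixed_choice)
```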
- Removed XGBoost as a dependency
    - Too many users were having install issues with XGBoost
    - Replaced with scikit-learn's GradientBoostingClassifier
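GradientBoostingClassifier ships with scikit-learn, so no separate install step is needed. A minimal usage sketch (the parameters are illustrative, not the values TPOT searches over):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_iris(return_X_y=True)
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
clf.fit(X, y)
print(clf.score(X, y))
```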
- Improved the descriptiveness of the TPOT command-line parameter documentation
- Removed min/max/avg details during fit() when verbosity > 1
    - Replaced with a tqdm progress bar
    - Added tqdm as a dependency
- Added `fit_predict()` convenience function
- Added `get_params()` function so TPOT can operate in scikit-learn's `cross_val_score` & related functions