We are using Databricks AutoML for a regression problem. The job runs for around 5 minutes and then fails with the following error:
ERROR databricks.automl.base_learner: AutoML run with experiment id: 1264215502939848 failed with non-AutoML error Exception('Unable to generate notebook at /mlworkspace/mlflow_experiments/23-01-24-07:55-16. Model_Train_Automl-8af8fe13/23-01-24-07:55-DataExploration-6daa65a552c058ab075213cdd68e2ece using format JUPYTER: {"error_code":"MAX_NOTEBOOK_SIZE_EXCEEDED","message":"File size imported is (61906255 bytes), exceeded max size (50000000 bytes)"}\n')
The dimensions of the dataset: (1160, 22)
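For context, the run is kicked off roughly like this. This is a minimal sketch: the table name and target column are placeholders, not our actual schema.

```python
from databricks import automl

# Load the training data (table and column names are placeholders)
df = spark.table("my_schema.training_data").toPandas()

# Start the AutoML regression run; this is the call that fails
# roughly 5 minutes in with MAX_NOTEBOOK_SIZE_EXCEEDED while
# AutoML tries to import the generated data-exploration notebook
summary = automl.regress(
    dataset=df,
    target_col="target",
    timeout_minutes=60,
)
```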
[Screenshot from the run]
However, I notice that when I had the same error a couple of weeks ago, the reported "max size" was about 10 MB, whereas yours is 50 MB. I wonder whether only the size limit was increased, without fixing the underlying issue of AutoML generating such a large notebook for a small dataset.