Commenting out the train_data.head command will not improve model performance.
Describe the issue linked to the documentation
The documentation reads:
For example: you can comment out the train_data.head command or increase subsample_size to train using a larger dataset, increase the num_epochs and num_boost_round hyperparameters, and increase the time_limit (which you should do for all code in these tutorials).
https://auto.gluon.ai/stable/tutorials/tabular/tabular-indepth.html
But train_data.head is only used in a print statement, so commenting it out will not improve model performance.
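For reference, a minimal sketch of the tutorial pattern in question (the dataset URL and label column are taken from the AutoGluon tabular tutorials and should be treated as illustrative assumptions): train_data.head only feeds a print statement, while subsample_size and time_limit are what actually constrain training.

```python
from autogluon.tabular import TabularDataset, TabularPredictor

# Load the tutorial dataset (URL as used in the AutoGluon tabular tutorials)
train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')

# Subsampling is what limits how much data the model sees; increase it for a larger training set
subsample_size = 500
train_data = train_data.sample(n=subsample_size, random_state=0)

# Display only: commenting this out has no effect on model fit
print(train_data.head())

# Fit quality is governed by the training data size and time_limit (label column assumed here)
predictor = TabularPredictor(label='occupation').fit(train_data, time_limit=60)
```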
Suggest a potential alternative/fix
Rewrite:
For example: you can increase subsample_size to train using a larger dataset, increase the num_epochs and num_boost_round hyperparameters, and increase the time_limit (which you should do for all code in these tutorials).