Viktor
I have the same problem. Is there any way to train CatBoost on very large files (20-30 GB+)? The data does not fit into my memory, and there has to be some simple...
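A workflow that may help here (a minimal sketch, not something verified at this scale; file names are hypothetical): build the Pool straight from the file so CatBoost parses it itself, quantize it once, save the compact quantized pool, and train on it via the `quantized://` scheme. Note that the first step still has to hold the raw Pool in memory once.

```python
from catboost import CatBoostRegressor, Pool

# Build the Pool directly from the TSV file; CatBoost reads it itself,
# so no extra pandas copy is kept around ("train.tsv" is hypothetical)
pool = Pool("train.tsv", column_description="col_descr.csv", delimiter="\t")

# Quantize in place: float features become small bin indices,
# which usually shrinks the pool considerably
pool.quantize()

# Save the quantized pool once; later runs skip raw parsing entirely
pool.save("train.bin")

# Train on the saved quantized pool via the quantized:// scheme
model = CatBoostRegressor(iterations=100)
model.fit(Pool("quantized://train.bin"))
```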
@Evgueni-Petrov-aka-espetrov I have tried the solution with Pool, but it loads all the data into RAM. Does Pool support loading data partially into RAM? Maybe the reason is that one column is Auxiliary...
This is how I tried to load the data and column description:

```python
import pandas as pd

# One row per column: tab-separated column index and CatBoost column type
col_types = ["Num", "Auxiliary", "Num", "Num", "Categ", "Label"] + ["Num"] * 1449
pd.DataFrame(col_types).to_csv("col_descr.csv", header=None, sep='\t')
```

```python
from...
```
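The second snippet was cut off; presumably it built the Pool from the dataset file using this column description. A minimal sketch of that step, assuming the data lives in a hypothetical "data.tsv":

```python
from catboost import Pool

# Hypothetical reconstruction of the truncated snippet: let CatBoost
# read the file itself, with the column description written above
pool = Pool("data.tsv", column_description="col_descr.csv", delimiter="\t")
```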
@Evgueni-Petrov-aka-espetrov the quantizer needs a loaded Pool, but I cannot load one, since Pool creation fails due to the RAM limit.
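For what it's worth, recent CatBoost releases expose a quantization helper in `catboost.utils` that is meant to build a quantized pool straight from the dataset file in a memory-efficient way, so the raw Pool never has to be fully materialized; treat the exact availability and parameters as an assumption to check against your installed version.

```python
from catboost.utils import quantize

# Assumption: catboost.utils.quantize (present in recent CatBoost
# versions) reads and quantizes the file with limited memory use,
# returning an already-quantized Pool; file names are hypothetical
quantized_pool = quantize(
    data_path="train.tsv",
    column_description="col_descr.csv",
    delimiter="\t",
)

# quantized_pool can then be passed to fit() or saved with .save()
```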
@praeclarum have you solved it?