Miguel Crispim Romao

Results: 24 comments of Miguel Crispim Romao

Interested in ML training and inference as well. The overhead of transferring to SageMaker is too high; we just train models on EC2 GPU boxes and then use a CPU runtime...

I notice this as well: a huge memory leak that does not improve by invoking Python's garbage collector.
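For reference, this is the kind of manual collection I mean (a minimal sketch; the allocation itself happens in whatever library call leaks):

```python
import gc

# Force a full collection pass across all generations and report what
# was found; in my case resident memory kept growing regardless.
unreachable = gc.collect()
print(f"collector found {unreachable} unreachable objects")
```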

Hi @fbarrios, I've noticed the memory leak when attempting to summarise/extract keywords from many small documents. It seems that it might be retaining in memory things that it...
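A minimal sketch of the access pattern that triggers it for me, assuming the `summa` package's `summarizer.summarize` and `keywords.keywords` entry points (`docs` is a hypothetical list of short texts):

```python
import gc
from summa import summarizer, keywords

def process(docs):
    """Summarise and extract keywords from many small documents.

    Memory usage grows across iterations even though no reference to
    the per-document results is kept and the collector is invoked.
    """
    for text in docs:
        summarizer.summarize(text)  # result discarded on purpose
        keywords.keywords(text)
        gc.collect()                # does not release the leaked memory
```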

This is not working for me, and the INFO output is quite annoying in longer projects.
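In case it helps others, this is the workaround I'd expect to silence the messages when the library logs through Python's standard `logging` module (a sketch; `"some_library"` is a placeholder for whichever logger is emitting the INFO spam):

```python
import logging

# Raise the threshold on the offending logger so INFO records are
# dropped; replace "some_library" with the actual logger name.
logging.getLogger("some_library").setLevel(logging.WARNING)
```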

This happens to me as well: https://gist.github.com/romanovzky/a7fd3a7ede09c8c81b06bc5965c34402

The best model:

```python
{'GRU': 2,
 'GRU_1': 1,
 'epochs': 1,
 'recurrent_dropout': 0.7371698374615214,
 'recurrent_dropout_1': 0.6517968154887782,
 'recurrent_dropout_2': 0.4371162594318422}
```

You can see that both the...

I have this problem as well with `ExponentialCyclicalLearningRate`; will there be a fix? This is incredibly off-putting when running several experiments trying out optimiser and learning-rate details...
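For context, this is how I'm wiring the schedule into an optimiser (a minimal sketch; the hyperparameter values are placeholders, not the ones from my experiments):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Cyclical learning-rate schedule from TensorFlow Addons, passed to a
# standard Keras optimiser.
schedule = tfa.optimizers.ExponentialCyclicalLearningRate(
    initial_learning_rate=1e-4,
    maximal_learning_rate=1e-2,
    step_size=2000,  # half-cycle length, in training steps
    gamma=0.96,      # per-step exponential decay of the cycle amplitude
)
optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)
```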

@m-zheng I thought about a similar solution, but I don't know which checkpoints are no longer being used or won't be used in the next iterations. How did you solve...

OK, so this is an a posteriori solution. My problem is that the growing storage footprint during training is preventing me from finalising a full optimisation round.
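What I'd need is something like the following running inside the optimisation loop rather than after it (a sketch; `ckpt_dir`, the `*.ckpt` glob, and `keep` are assumptions about the checkpoint layout, and it only works if one can tell which checkpoints are safe to drop):

```python
from pathlib import Path

def prune_checkpoints(ckpt_dir: str, keep: int = 3) -> None:
    """Delete all but the `keep` most recent checkpoint files.

    Assumes one file per checkpoint matching '*.ckpt'; adjust the glob
    to the actual naming scheme.
    """
    ckpts = sorted(Path(ckpt_dir).glob("*.ckpt"),
                   key=lambda p: p.stat().st_mtime)
    for stale in ckpts[:-keep]:
        stale.unlink()
```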

I guess no progress has been made? The difficulty here is that imbalanced-learn applies `fit` and `sample`; notice the latter is not `transform`, as it does not change the features (transformations),...
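To illustrate why `sample` doesn't fit the transformer contract: it resamples `X` and `y` together, which is why imbalanced-learn ships its own sampler-aware `Pipeline` (a minimal sketch with stand-in steps; the estimators chosen here are just examples):

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline  # sampler-aware, unlike sklearn's
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# imblearn's Pipeline calls fit_resample on sampler steps during fit
# (changing both X and y) and skips them entirely at predict time.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("resample", SMOTE(random_state=0)),
    ("clf", LogisticRegression()),
])
```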