pytorch_cnn_trainer
Train and Valid split for CSVDataset
🚀 Feature
Sometimes we only get train.csv and test.csv, so it would be best for CSVDataset to split train.csv into a train_set and valid_set by default, the way create_folder_dataset does.
Users can split it into train_set and valid_set after creating a CSVDataset, as below, but it would be good to have this out of the box.
split = 0.8
complete_dataset = CSVDataset(df, data_dir, target, transform)
# random_split expects integer lengths that sum to len(dataset),
# so floor the train length and give the remainder to valid.
train_len = int(len(complete_dataset) * split)
valid_len = len(complete_dataset) - train_len
train_set, valid_set = torch.utils.data.random_split(
    complete_dataset, [train_len, valid_len]
)
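A minimal sketch of how the split could work out of the box: the dataset constructor (or a helper it calls) shuffles the row indices once and cuts them at the requested fraction. The helper name and seed parameter here are hypothetical, not part of the actual CSVDataset API; it uses only the standard library so the index logic is easy to follow.

```python
import random

def train_valid_indices(n, split=0.8, seed=42):
    """Hypothetical helper: shuffle the n row indices and cut them
    into train/valid lists, as CSVDataset could do internally."""
    indices = list(range(n))
    random.Random(seed).shuffle(indices)  # fixed seed -> reproducible split
    cut = int(n * split)  # floor, so the two parts always sum to n
    return indices[:cut], indices[cut:]

train_idx, valid_idx = train_valid_indices(1000, split=0.8)
print(len(train_idx), len(valid_idx))  # 800 200
```

The two index lists could then back `torch.utils.data.Subset` objects, giving the same result as `random_split` while letting the dataset return both splits from a single constructor call.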
Hi @hassiahk, thanks for the issue. I agree this could be better.
Currently, I'm not maintaining/developing this repo anymore. I know this project was quite good and people seemed to like it (2k+ downloads on PyPI!).
The entire code from this repository will be moved to a bigger repo, which I will open source in a few days. I would be really glad to accept contributions to it.
Sorry for this, but I will keep you informed.
Merging your other PR though 😄
That is a complete CV package, much bigger and better, with the same API as this repo.
I would be glad to discuss PRs / Issues over there.
Hi @oke-aditya. No problem, I would be more than happy to contribute to the new one. :smiley:
Also let me know if you need any help in open sourcing your product. I am more than happy to help. :smile:
I always wanted to get involved in a project like that. :sweat_smile:
Hi @hassiahk!! Happy to say that this entire codebase, and much more, has been moved to the complete CV package under work.
Here is the link to it.
It has some rough edges right now. I will soon create a Slack channel, API docs, examples (mostly the same as these), and documentation for the package.
Anyway, feel free to PM me or open/help with issues on GitHub. I would be really glad for support on the new package.
Hello @oke-aditya. I will look into this and see if I can contribute to it in any way. Thank you so much for sharing it. :smile: