
Loading custom datasets

Open · pavanvyn opened this issue 5 years ago • 10 comments


Hello, I am using your code for my project. I have a few queries:

  1. My data consists of a training directory with labelled images (train --> class1, class2, ... --> images) and a testing directory of unlabelled images (test --> images). I have not been able to find a decent method to load them into the code. Do I use a csv file to load them, or is it to be done directly? (A rough sketch of the kind of loading I have in mind is at the end of this comment.)
  2. How exactly would I have to configure/edit my data module to be the equivalent of mnist.py or cifar10.py (datasets directory)?
  3. As of now, I have edited the datasets and networks directories (I built a custom architecture). Am I right in assuming that the modules in base, utils and optim need not be edited in any way? If not, how should I go about doing it?

Any suggestion would be appreciated! Thank you.
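For reference, a minimal sketch of reading such a layout with plain PyTorch/torchvision, separate from this repo's own dataset classes; the directory paths, image size, and the -1 dummy label for the unlabelled test images are placeholders, not anything taken from the repository:

```python
# Sketch only: a labelled train directory (train/class1, train/class2, ...)
# and a flat, unlabelled test directory (test/*.jpg). Paths are placeholders.
import os
from PIL import Image
from torch.utils.data import Dataset, DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((32, 32)),
    transforms.ToTensor(),
])

# Labelled training images: ImageFolder infers the class label from the subfolder name.
train_set = datasets.ImageFolder(root='data/train', transform=transform)


class UnlabelledImages(Dataset):
    """Flat directory of test images without labels; returns -1 as a dummy label."""

    def __init__(self, root, transform=None):
        self.paths = sorted(
            os.path.join(root, f) for f in os.listdir(root)
            if f.lower().endswith(('.png', '.jpg', '.jpeg'))
        )
        self.transform = transform

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        img = Image.open(self.paths[idx]).convert('RGB')
        if self.transform is not None:
            img = self.transform(img)
        return img, -1


test_set = UnlabelledImages('data/test', transform=transform)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
test_loader = DataLoader(test_set, batch_size=64, shuffle=False)
```

The repo wraps its datasets in its own classes (see the later comments in this thread), but the underlying loading step boils down to something like this.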

pavanvyn · Jul 04 '19 18:07

@pavanvyn Did you find a way to load custom datasets?

jetjodh · Aug 03 '19 06:08

Yes, I did. I wrote a program to extract data using a csv file.

pavanvyn · Aug 03 '19 16:08

@pavanvyn Can you share the code?

jetjodh · Aug 05 '19 10:08

Sorry for the late reply. I used code from another GitHub repository to do it: https://github.com/utkuozbulak/pytorch-custom-dataset-examples. Use this.
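For anyone who also finds the link hard to follow: the core pattern in that repo is a torch.utils.data.Dataset that reads image paths and labels out of a CSV. A rough sketch of that idea (the column order and the RGB conversion here are assumptions, not taken from either repo):

```python
# Sketch of a CSV-driven Dataset in the spirit of the linked examples.
# Assumes each CSV row holds an image path in the first column and an
# integer label in the second; adjust to your own file.
import pandas as pd
from PIL import Image
from torch.utils.data import Dataset


class CsvImageDataset(Dataset):
    def __init__(self, csv_file, transform=None):
        self.frame = pd.read_csv(csv_file)
        self.transform = transform

    def __len__(self):
        return len(self.frame)

    def __getitem__(self, idx):
        path = self.frame.iloc[idx, 0]
        label = int(self.frame.iloc[idx, 1])
        img = Image.open(path).convert('RGB')
        if self.transform is not None:
            img = self.transform(img)
        return img, label
```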

pavanvyn · Aug 13 '19 12:08

Hello, can you give me some instructions on how to use my own data? I don't know how to use the link you sent. Thank you very much!

JCCVW · Jun 22 '20 10:06

Go through my GitHub repo (https://github.com/utkuozbulak/pytorch-custom-dataset-examples is the pattern; my project is https://github.com/pavanvyn/Galaxy-classification). I have used Lukas Ruff's anomaly detection program with custom datasets loaded from CSV files. My CSV files for extracting the data look like this:

Location                        Label
/full/location/to/image_1.jpg   0
/full/location/to/image_2.jpg   0
/full/location/to/image_3.jpg   1
/full/location/to/image_4.jpg   2
/full/location/to/image_5.jpg   1
/full/location/to/image_6.jpg   0
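Below is a rough sketch of how a CSV like this could be turned into an equivalent of mnist.py / cifar10.py. It assumes the repo's usual pattern: the dataset class subclasses TorchvisionDataset (from base/torchvision_dataset.py), only has to fill in self.train_set and self.test_set, and its items are (image, label, index) tuples as in the bundled MNIST/CIFAR-10 wrappers. The whitespace delimiter, column names, image size and grayscale conversion are either taken from the example above or are placeholders:

```python
# Sketch of a datasets/custom.py in the style of mnist.py / cifar10.py.
import pandas as pd
from PIL import Image
import torchvision.transforms as transforms
from torch.utils.data import Dataset

from base.torchvision_dataset import TorchvisionDataset  # import path as in this repo (assumed)


class CSVImageSet(Dataset):
    def __init__(self, csv_file, transform=None):
        # delim_whitespace handles the space-separated "Location Label" layout above;
        # drop it if your file is comma-separated.
        self.frame = pd.read_csv(csv_file, delim_whitespace=True)
        self.transform = transform

    def __len__(self):
        return len(self.frame)

    def __getitem__(self, index):
        path = self.frame.iloc[index]['Location']
        label = int(self.frame.iloc[index]['Label'])
        img = Image.open(path).convert('L')  # use 'RGB' if your network expects 3 channels
        if self.transform is not None:
            img = self.transform(img)
        return img, label, index  # the index is part of the tuple the Deep SVDD trainer unpacks


class Custom_Dataset(TorchvisionDataset):
    def __init__(self, root, train_csv, test_csv):
        super().__init__(root)
        transform = transforms.Compose([transforms.Resize((32, 32)),
                                        transforms.ToTensor()])
        self.train_set = CSVImageSet(train_csv, transform=transform)
        self.test_set = CSVImageSet(test_csv, transform=transform)
```

Depending on how much of the original driver code is kept, the new dataset (and a matching network) will likely also need to be registered wherever the main script selects datasets and networks by name.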

Hope this helps.

pavanvyn · Jul 08 '20 08:07

Hello dear @pavanvyn,

How do you split the train and test datasets? As far as I understood, you made a single dataset file called train_test_dataset.csv.

Thank you.
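For a generic approach (not necessarily what the Galaxy-classification repo actually does), a single labelled CSV can be split randomly into separate train and test files, e.g.:

```python
# Sketch: random 80/20 split of a single whitespace-separated "Location Label"
# CSV into train and test files. File names and the split ratio are placeholders.
import pandas as pd

df = pd.read_csv('train_test_dataset.csv', delim_whitespace=True)
train_df = df.sample(frac=0.8, random_state=0)   # 80% of rows for training
test_df = df.drop(train_df.index)                # remaining 20% for testing
train_df.to_csv('train_dataset.csv', sep=' ', index=False)
test_df.to_csv('test_dataset.csv', sep=' ', index=False)
```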

omid-ghozatlou · Jun 07 '21 08:06

@pavanvyn Can you share the code again?...

raymondlimw · Sep 17 '21 08:09

@pavanvyn Could you please share the code again? Thank you.

ChanganLeo · Oct 28 '21 13:10