compatibility-family-learning
Compatibility Family Learning for Item Recommendation and Generation

Prerequisites
- Linux
- Python 3
- CPU or NVIDIA GPU + CUDA CuDNN
Getting Started
Installation
- Install TensorFlow from https://www.tensorflow.org/install/.
- Install Caffe from http://caffe.berkeleyvision.org/installation.html (for the Amazon co-purchase experiments).
- Install Keras from https://keras.io/ (for the Polyvore experiments).
- Install the required Python packages:
  `pip install tqdm smart-open boto3 scipy numpy`
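A quick way to confirm the Python packages above are importable before running any experiments (a minimal sketch; the package list is taken directly from the installation steps):

```python
# Check that the Python dependencies listed above can be imported.
# Note: the smart-open pip package installs the module "smart_open".
import importlib.util

required = ["tqdm", "smart_open", "boto3", "scipy", "numpy"]
missing = [name for name in required if importlib.util.find_spec(name) is None]
print("missing:", missing or "none")
```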
Recommendation Experiments
Fashion-MNIST+1+2 Experiments
- Run `./experiments/fashion_30/convert_fashion_30.sh`.
- Run `./experiments/fashion_30/run.sh`.
- Run `./experiments/fashion_30/eval.sh`.
Amazon also_viewed/bought Experiments
- Put `image_features_Clothing_Shoes_and_Jewelry.b` and `meta_Clothing_Shoes_and_Jewelry.json.gz` from http://jmcauley.ucsd.edu/data/amazon/ into `data/amazon`.
- Run `./experiments/monomer/convert_amazon.sh`.
- Put `also_bought.txt.gz`, `also_viewed.txt.gz`, `duplicate_list.txt.gz`, and `productMeta_simple.txt.gz` from Learning Compatibility Across Categories for Heterogeneous Item Recommendation (https://sites.google.com/a/eng.ucsd.edu/ruining-he/) into `data/monomer`.
- Run `./experiments/monomer/unzip_monomer.sh`.
- Download `Monomer.tar.gz` from the same page and put it into `./Monomer`.
- Run `./experiments/monomer/prepare_monomer.sh`.
- Run `./experiments/monomer/split_monomer.sh`.
- Run `./experiments/monomer/process_monomer.sh`.
- Run `./experiments/monomer/run.sh`.
- Run `./experiments/monomer/eval.sh`.
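The Monomer inputs above are gzipped text files. If you want to inspect them before converting, a minimal sketch of streaming one line by line (the exact column layout of files such as `also_bought.txt.gz` depends on the Monomer release, so the whitespace split below is only illustrative):

```python
import gzip

def iter_gz_lines(path):
    """Yield whitespace-split fields from a gzipped text file.

    The per-line format is an assumption here; check the actual
    Monomer files for their real column layout.
    """
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                yield line.split()
```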
Amazon Co-purchase Experiments
- Put `train.txt`, `val.txt`, `test.txt`, `train_ids.txt`, `val_ids.txt`, and `test_ids.txt` from Learning Visual Clothing Style with Heterogeneous Dyadic Co-occurrences into `data/dyadic/`; put `googlenet-siamese-final.caffemodel` into `models/`.
- Put `metadata.json.gz` from http://jmcauley.ucsd.edu/data/amazon/ into `data/amazon`.
- Run `./experiments/dyadic/preprocess_dyadic.sh`.
- Crawl all images from Learning Visual Clothing Style with Heterogeneous Dyadic Co-occurrences with Scrapy and put them on S3. Check `./data/dyadic/all_id_pairs.txt` for image paths, and see `./experiments/dyadic/amazon_crawler` as an example.
- Extract images: run `python -m cfl.scripts.copy_images --items-store ITEMS_S3_STORE_PATH --images-store IMAGES_S3_STORE_PATH --output-path IMAGES_S3_PATH --input-file data/dyadic/all_id_pairs.txt`.
- Fetch the images to local storage: run `aws s3 sync IMAGES_S3_DIR data/dyadic/original_images`.
- Preprocess the dyadic dataset: run `./experiments/dyadic/preprocess_dyadic_latent.sh`.
- Predict dyadic latents: run `./experiments/dyadic/predict_dyadic_latent.sh` under the Caffe environment.
- Convert the dyadic dataset: run `./experiments/dyadic/convert_dyadic_latent.sh`.
- Run `./experiments/dyadic/run.sh`.
- Run `./experiments/dyadic/eval.sh`.
Polyvore Experiments
- Crawl all images; put the images in `IMAGES_DIR` and the items in `ITEMS_S3_STORE_PATH`. See `./experiments/polyvore/polyvore_crawler` as an example.
- Run `python -m cfl.scripts.preprocess_polyvore --items-store ITEMS_S3_STORE_PATH --image-dir IMAGES_DIR --output-dir data/polyvore`.
- Run `python -m cfl.keras.extract_v3 --input-dir data/polyvore/images --output-dir data/polyvore/latents`.
- Run `./experiments/polyvore/convert_polyvore.sh`.
- Run `./experiments/polyvore/run.sh`.
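Once item latents have been extracted, compatibility-based recommendation reduces to scoring candidate items against a query item in latent space. A minimal sketch using cosine similarity (the actual scoring in this repo is learned by the model; `rank_by_cosine` is a hypothetical helper for inspecting raw feature vectors, assuming they are dense float arrays such as Inception-v3 features):

```python
import numpy as np

def rank_by_cosine(query, candidates):
    """Rank candidate latent vectors by cosine similarity to a query vector.

    query: shape (d,); candidates: shape (n, d). Returns (order, scores),
    where order[0] is the index of the most similar candidate.
    """
    q = query / np.linalg.norm(query)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    scores = c @ q
    return np.argsort(-scores), scores
```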
Generation Experiments
Note that you must run the data preprocessing in the Recommendation Experiments section before running these experiments.
MNIST+1+2 Experiments
- Run `./experiments/mnist_30/convert_mnist_30.sh`.
- Run `./experiments/mnist_30/run_gen.sh`.
- Run `./experiments/mnist_30/run_cgan.sh`.
Amazon Co-purchase Experiments
- Convert the dyadic dataset: run `./experiments/dyadic/preprocess_dyadic_gen.sh`.
- Run `./experiments/dyadic/run_gen.sh`.
- Run `python -m cfl.scripts.convert_disco --input-dir parsed_data/dyadic_gen_all --output-dir parsed_data/dyadic_disco` for DiscoGAN.
- Run `python -m cfl.scripts.convert_pix2pix --input-dir parsed_data/dyadic_gen_all --disco-dir parsed_data/dyadic_disco --output-dir parsed_data/dyadic_pix2pix` for pix2pix.
- Run DiscoGAN & pix2pix.
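For context on the conversion step above: pix2pix's aligned-dataset format stores each source/target pair as a single image with the two halves concatenated side by side, which is presumably what `convert_pix2pix` produces. A minimal NumPy sketch of that pairing step (`make_pair` is a hypothetical helper for illustration, not part of this repo):

```python
import numpy as np

def make_pair(src, dst):
    """Concatenate a source/target image pair horizontally into one
    (H, W_src + W_dst, C) array, as in pix2pix's aligned AB format.
    Heights and channel counts must match."""
    if src.shape[0] != dst.shape[0] or src.shape[2] != dst.shape[2]:
        raise ValueError("images must share height and channel count")
    return np.concatenate([src, dst], axis=1)
```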
Polyvore Experiments
- Run `./experiments/polyvore/run_gen.sh`.
- Run `python -m cfl.scripts.convert_disco --input-dir parsed_data/polyvore_random/top_to_other --output-dir parsed_data/polyvore_random/top_to_other_disco` for DiscoGAN.
- Run `python -m cfl.scripts.convert_pix2pix --input-dir parsed_data/polyvore_random/top_to_other --disco-dir parsed_data/polyvore_random/top_to_other_disco --output-dir parsed_data/polyvore_random/top_to_other_pix2pix` for pix2pix.
- Run DiscoGAN & pix2pix.
Citation
If you use this code for your research, please cite our paper.
@inproceedings{shih2018compatibility,
author = {Shih, Yong-Siang and Chang, Kai-Yueh and Lin, Hsuan-Tien and Sun, Min},
title = {Compatibility Family Learning for Item Recommendation and Generation},
booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence (AAAI)},
pdf = {https://arxiv.org/pdf/1712.01262},
arxiv = {http://arxiv.org/abs/1712.01262},
year = {2018},
month = feb
}