# Fashion Hash Net
## Description
This repository contains the code and dataset for the CVPR 2019 paper "Learning Binary Code for Personalized Fashion Recommendation".
## Required Packages
- pytorch
- torchvision
- PIL
- numpy
- pandas
- tqdm: A Fast, Extensible Progress Bar for Python and CLI
- lmdb: A universal Python binding for the LMDB 'Lightning' Database.
- yaml: PyYAML is a full-featured YAML framework for the Python programming language.
- visdom: To start a visdom server, run `python -m visdom.server`.
I upgraded PyTorch to 1.2.0, and the package dependencies are resolved automatically by conda. The last four packages can be installed via conda:

```bash
conda install python-lmdb pyyaml visdom tqdm -c conda-forge
```
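As a quick sanity check of the environment, the snippet below (illustrative only, not part of the repository) imports the packages listed above and prints the PyTorch version:

```python
# Sanity check (illustrative only): all required packages import cleanly,
# and PyTorch reports the version used for this code (1.2.0).
import torch, torchvision, PIL, numpy, pandas, tqdm, lmdb, yaml, visdom

print("torch:", torch.__version__)             # expected 1.2.0 or newer
print("torchvision:", torchvision.__version__)
```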
## How to Use the Code
The main script `scripts/run.py` currently supports the following functions:
```python
ACTION_FUNS = {
    # train models
    "train": train,
    # run the FITB task
    "fitb": fitb,
    # evaluate pair accuracy
    "evaluate-accuracy": evalute_accuracy,
    # evaluate NDCG and AUC
    "evaluate-rank": evalute_rank,
    # compute the binary codes
    "extract-features": extract_features,
}
```
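For reference, here is a minimal sketch of how such an action table is typically dispatched from the command line; the actual argument handling in `scripts/run.py` may differ, and the `--cfg` flag is taken from the examples below.

```python
# Minimal dispatch sketch (not the repository's actual scripts/run.py):
# pick an action name from ACTION_FUNS and pass it a YAML config path.
import argparse

parser = argparse.ArgumentParser(description="Fashion Hash Net runner (sketch)")
parser.add_argument("action", choices=sorted(ACTION_FUNS))
parser.add_argument("--cfg", required=True, help="path to a YAML configuration file")
args = parser.parse_args()

ACTION_FUNS[args.action](args.cfg)   # e.g. train("./cfg/train/FHN_VSE_T3_630.yaml")
```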
### Configurations
There are three main modules in `polyvore`:

- `polyvore.data`: module for the polyvore dataset
- `polyvore.model`: module for the fashion hash net
- `polyvore.solver`: module for training
For configurations, see `polyvore.param`; some examples are given in the `cfg` folder. The configuration files are written in YAML format.
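As a rough illustration, a config file can be inspected with PyYAML as shown below; the actual option names and parsing logic live in `polyvore.param`, so the keys printed are simply whatever the file defines.

```python
# Load a configuration file from the cfg/ folder and inspect it.
# yaml.safe_load parses the file into plain dicts/lists; the meaning of
# each option is defined by polyvore.param, not by this snippet.
import yaml

with open("cfg/train/FHN_VSE_T3_630.yaml") as f:
    cfg = yaml.safe_load(f)

print(sorted(cfg))   # top-level option names set by this file
```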
### Train
To train `FHN-T3` with both visual and semantic features, run the following script:

```bash
scripts/run.py train --cfg ./cfg/train/FHN_VSE_T3_630.yaml
```
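Training progress can be watched in the visdom server started earlier with `python -m visdom.server`. Below is a minimal, repository-independent sketch of pushing a curve to visdom on its default port; the actual logging hooks belong to `polyvore.solver`.

```python
# Send a toy loss curve to a running visdom server (default port 8097).
# The values here are placeholders; the real training loop reports its own.
import numpy as np
import visdom

vis = visdom.Visdom()   # connects to http://localhost:8097 by default
for step, loss in enumerate([0.9, 0.7, 0.6]):
    vis.line(X=np.array([step]), Y=np.array([loss]),
             win="train-loss", update="append" if step else None,
             opts={"title": "training loss (sketch)"})
```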
### Evaluate
To evaluate the accuracy of positive-negative pairs:

```bash
scripts/run.py evaluate-accuracy --cfg ./cfg/evalute/FHN_VSE_T3_630.yaml
```

To evaluate the rank quality:

```bash
scripts/run.py evaluate-rank --cfg ./cfg/evaluate-rank/FHN_VSE_T3_630.yaml
```

To evaluate the FITB task:

```bash
scripts/run.py fitb --cfg ./cfg/fitb/FHN_VSE_T3_630.yaml
```
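For orientation, the two ranking metrics reported by `evaluate-rank` can be summarized with their standard definitions; this is a generic NumPy sketch and not the repository's exact implementation.

```python
# Generic definitions of AUC and NDCG over outfit scores (illustrative only).
import numpy as np

def auc(pos_scores, neg_scores):
    """Probability that a positive outfit outranks a negative one (ties count half)."""
    pos = np.asarray(pos_scores, dtype=float)[:, None]
    neg = np.asarray(neg_scores, dtype=float)[None, :]
    return ((pos > neg).sum() + 0.5 * (pos == neg).sum()) / (pos.size * neg.size)

def ndcg(labels_sorted_by_score):
    """NDCG of binary relevance labels, already sorted by predicted score (descending)."""
    rel = np.asarray(labels_sorted_by_score, dtype=float)
    discounts = np.log2(np.arange(2, rel.size + 2))
    dcg = ((2 ** rel - 1) / discounts).sum()
    idcg = ((2 ** np.sort(rel)[::-1] - 1) / discounts).sum()
    return dcg / idcg if idcg > 0 else 0.0
```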
## How to Use the Polyvore-$U$s
- Download the data from OneDrive and put the `polyvore` folder under `data`;
- Unzip `polyvore/images/291x291.tar.gz`;
- Use `script/build_polyvore.py` to convert the images and save them in `data/polyvore/lmdb` (a rough sketch of the conversion follows the command below):

```bash
script/build_polyvore.py data/polyvore/images/291x291 data/polyvore/images/lmdb
```
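The conversion step boils down to writing each image file into an LMDB environment, roughly as sketched below with the `lmdb` package; key naming, map size, and any re-encoding are handled by `script/build_polyvore.py` itself and may differ.

```python
# Rough sketch of an image-folder -> LMDB conversion (illustrative only).
import os
import lmdb

src = "data/polyvore/images/291x291"
dst = "data/polyvore/images/lmdb"

env = lmdb.open(dst, map_size=2 ** 36)   # generous map size (~64 GB)
with env.begin(write=True) as txn:
    for name in os.listdir(src):
        with open(os.path.join(src, name), "rb") as f:
            txn.put(name.encode("utf-8"), f.read())   # key: file name, value: raw bytes
env.close()
```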
The `lmdb` format accelerates image loading and is set as the default in the configuration. If you don't want to use the `lmdb` format, change the setting to `use_lmdb: false` in the `yaml` files.
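For context, reading an image back from the LMDB store looks roughly like the following; the repository's actual reader (presumably in `polyvore.data`) handles keys and decoding on its own, and the key below is hypothetical.

```python
# Rough sketch of loading one image from the LMDB store (illustrative only).
import io
import lmdb
from PIL import Image

env = lmdb.open("data/polyvore/images/lmdb", readonly=True, lock=False)
with env.begin() as txn:
    buf = txn.get(b"example.jpg")   # hypothetical key mirroring a file name
img = Image.open(io.BytesIO(buf)).convert("RGB")
env.close()
```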
See <data/README.md> for details.
## How to Cite
```bibtex
@inproceedings{Lu:2019tk,
  author    = {Lu, Zhi and Hu, Yang and Jiang, Yunchao and Chen, Yan and Zeng, Bing},
  title     = {{Learning Binary Code for Personalized Fashion Recommendation}},
  booktitle = {CVPR},
  year      = {2019}
}
```
## Contact
Email: [email protected]