Fan Bao
Here is the reference to issue #29
Thanks for the reply. The command I run on the training-cluster head node is `python tools/train_self_v2.py --config-file ./configs/cifar10/spice_self.py --all 0`. The config file is:
```
num_cluster = 20
model_name = "spice_self_{}".format(num_cluster)
#...
```
We use [this set of hyperparameters](https://github.com/baofff/Analytic-DPM/blob/main/cifar_imagenet_codes/profiles/ddpm/cifar10/train.py) to train the CIFAR10 model.
You can download it manually from Hugging Face: https://huggingface.co/gpt2. Then replace 'gpt2' with the local directory where the files are saved.
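For example, with the Hugging Face `transformers` library (a minimal sketch; the local path below is a placeholder for wherever you saved the downloaded files):
```
# Minimal sketch, assuming the standard Hugging Face transformers API.
from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Hypothetical path: the directory containing the files from https://huggingface.co/gpt2
local_dir = "/path/to/local/gpt2"

# Pass the local directory instead of the hub name "gpt2".
tokenizer = GPT2Tokenizer.from_pretrained(local_dir)
model = GPT2LMHeadModel.from_pretrained(local_dir)
```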
This is limited by the CLIP model we use, so we currently do not support arbitrary length.
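For reference, the standard CLIP text encoder has a fixed context length of 77 tokens. A minimal sketch with the Hugging Face tokenizer (assuming the `openai/clip-vit-large-patch14` checkpoint, which may differ from the exact CLIP variant used here):
```
# Minimal sketch: inspect the CLIP text-length limit via the Hugging Face tokenizer.
from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
print(tokenizer.model_max_length)  # 77 tokens for the standard CLIP text encoder

# Longer prompts are truncated to this limit, so arbitrary-length text is not supported.
ids = tokenizer("a very long prompt ...", truncation=True,
                max_length=tokenizer.model_max_length)["input_ids"]
```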
Yes. Many fine-tuning algorithms for transformers are already available, such as [https://arxiv.org/abs/2110.04366](https://arxiv.org/abs/2110.04366).
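As one illustration, a bottleneck adapter (one of the parameter-efficient fine-tuning methods unified in the linked paper) can be sketched roughly as follows; this is a generic PyTorch sketch, not code from this repository, and the dimensions are placeholders:
```
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Residual bottleneck adapter inserted after a frozen transformer sublayer."""
    def __init__(self, dim=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)   # project down to a small bottleneck
        self.up = nn.Linear(bottleneck, dim)     # project back up to the model width
        self.act = nn.ReLU()

    def forward(self, h):
        # Only these small projections are trained; the pretrained transformer weights stay frozen.
        return h + self.up(self.act(self.down(h)))
```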