
Results: 7 train-CLIP-FT issues

Hello, I am going to train CLIP with my own dataset, but I encountered an exception: > raise MisconfigurationException("`train_dataloader` must be implemented to be used with the Lightning Trainer")...
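
That exception means the Lightning Trainer was started without any training data attached. A minimal sketch of one way to satisfy it, assuming a hypothetical image-text dataset class (not part of this repo) whose DataLoader is passed directly to `trainer.fit`:

```python
# Hypothetical sketch: hand the Trainer a train DataLoader explicitly so it no
# longer raises "train_dataloader must be implemented to be used with the
# Lightning Trainer". The dataset class and paths below are placeholders.
import pytorch_lightning as pl
from torch.utils.data import DataLoader

from my_dataset import ImageTextDataset  # hypothetical dataset yielding (image, text) pairs

train_ds = ImageTextDataset(folder="data")
train_dl = DataLoader(train_ds, batch_size=256, shuffle=True, num_workers=4)

model = ...  # the LightningModule constructed in train_finetune.py

trainer = pl.Trainer(max_epochs=10)
trainer.fit(model, train_dl)  # alternatively, implement train_dataloader() on the module
```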

This repo greatly helped my finetuning process. Thanks. I am wondering if these resulting checkpoints, which are produced by this code in `lightning_logs/version_#/checkpoints/~~~.ckpt`, are compatible with `clip.load`? I could `torch.load`...
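
For reference, the `.ckpt` file holds the LightningModule's full state dict, so the CLIP weights usually sit under a prefix rather than in the layout `clip.load` expects. A hedged sketch, assuming the wrapper keeps the CLIP model under the attribute name `model` (so keys look like `model.visual.conv1.weight`): strip that prefix and load the result into a stock CLIP model.

```python
# Sketch: convert a Lightning checkpoint into weights a clip.load() model accepts.
# The checkpoint path is a placeholder; the "model." prefix is an assumption about
# how the LightningModule names its CLIP submodule.
import torch
import clip

ckpt = torch.load("lightning_logs/version_0/checkpoints/your_checkpoint.ckpt",
                  map_location="cpu")
state_dict = ckpt["state_dict"]

# Keep only the CLIP submodule's weights and drop the "model." prefix.
clip_state = {k[len("model."):]: v for k, v in state_dict.items() if k.startswith("model.")}

model, preprocess = clip.load("ViT-B/32", device="cpu", jit=False)
model.load_state_dict(clip_state)
model.eval()
```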

Thank you for your excellent work. I have trained on my own dataset, but I cannot load the trained model. How can I test the model?
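
If the fine-tuned weights can be recovered as in the sketch above, testing follows the standard openai/CLIP usage: encode an image and some candidate captions and compare the similarity scores. The image path and prompts below are placeholders.

```python
# Smoke test for a fine-tuned CLIP model using the standard openai/CLIP API.
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device, jit=False)
# model.load_state_dict(clip_state)  # load the fine-tuned weights first (see above)

image = preprocess(Image.open("example.jpg")).unsqueeze(0).to(device)
texts = clip.tokenize(["a photo of a cat", "a photo of a dog"]).to(device)

with torch.no_grad():
    logits_per_image, logits_per_text = model(image, texts)
    probs = logits_per_image.softmax(dim=-1)

print(probs)  # higher probability = better match between image and caption
```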

Is there a reason for this? All other parameters shared by `model.model` and `clp` (in train_finetune.py) are the same. Thank you!
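
For comparing the two models, a small hedged sketch of how one could list the parameters that actually differ between the fine-tuned CLIP (`model.model`) and the freshly loaded reference (`clp`), assuming both expose standard `state_dict()`s:

```python
# Sketch: report which parameter tensors differ between two CLIP instances.
import torch

def diff_params(a, b, atol=1e-6):
    """Return the names of parameters whose values differ between modules a and b."""
    sa, sb = a.state_dict(), b.state_dict()
    return [k for k in sa if k in sb and not torch.allclose(sa[k].float(), sb[k].float(), atol=atol)]

# Example usage inside train_finetune.py:
# print(diff_params(model.model, clp))
```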

After installing the following packages on Colab: `pip install transformers`, `pip install git+https://github.com/openai/CLIP.git`, `pip install 'git+https://github.com/katsura-jp/pytorch-cosine-annealing-with-warmup'`, `pip install pytorch-lightning`, and running the command `python train_finetune.py --folder data --batch_size 512 --gpu`...

It seems the code trains all of the parameters. How do you train the last (output) layer only? BTW- Thank you for the code!
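
The script does optimize every parameter by default. One common way to restrict fine-tuning to the output layer is to freeze the backbone and re-enable gradients only for the final projections; the attribute names below follow the openai/CLIP model (`visual.proj`, `text_projection`), so adjust them to however the wrapper here exposes the underlying CLIP network.

```python
# Sketch: freeze everything, then unfreeze only the output projections.
import torch

for p in clip_model.parameters():          # clip_model: the underlying CLIP network
    p.requires_grad = False

clip_model.visual.proj.requires_grad = True       # image-side output projection
clip_model.text_projection.requires_grad = True   # text-side output projection

# Give the optimizer only the parameters that remain trainable.
trainable = [p for p in clip_model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-5)
```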