viyjy
Dear author, please consider adding the following paper and its code; thanks. You can put it under `Image-based VL-PTMs/Representation Learning`. Vision-Language Pre-Training with Triple Contrastive Learning, arXiv link: https://arxiv.org/abs/2202.10401...
Hi, when I download the LAION2B-multi dataset using Spark, the job stops at some point but reports no error. I then rerun the code with `incremental_mode="incremental"` and get...
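For context, incremental mode resumes by skipping shards whose output already exists, so a rerun picks up where a stalled run left off. A minimal sketch of that idea (hypothetical file names and helper; not img2dataset's actual implementation):

```python
import os


def pending_shards(shard_ids, output_dir):
    """Return the shards that still need downloading.

    A shard counts as done if its marker file (here, a hypothetical
    <id>_stats.json) already exists in output_dir.
    """
    done = set()
    for name in os.listdir(output_dir):
        if name.endswith("_stats.json"):
            done.add(name[: -len("_stats.json")])
    return [s for s in shard_ids if s not in done]
```

Under this model, a rerun with `incremental_mode="incremental"` only processes the shards a check like this returns, which is why restarting after a stall should not redo completed work.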
Hi, thanks for your awesome work. I am using AWS EMR (with Spark) to download the LAION5B dataset by following this [distributed mode](https://github.com/rom1504/img2dataset/blob/main/dataset_examples/laion5B.md) guide. However, when I run download.py in...
Yunsheng, Would you please provide the SSL model and hyper-parameters for the VGG-based models? Thanks.
## 🚀 Feature Request
I found that MLFlowLogger cuts throughput to half that of WandBLogger. I saw that there are many `import mlflow` statements in https://github.com/mosaicml/composer/blob/dev/composer/loggers/mlflow_logger.py; could that be the root cause?...
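One detail worth noting when profiling this: repeated `import` statements are cheap after the first one, because Python caches modules in `sys.modules`. A self-contained sketch of how to measure that (using a stdlib module as a stand-in, since the timing idea is the same):

```python
import sys
import time


def timed_import(name):
    """Import a module and return the elapsed wall-clock seconds."""
    start = time.perf_counter()
    __import__(name)
    return time.perf_counter() - start


# The first call may pay a real import cost (unless already cached);
# the second hits the sys.modules cache and is near-instant.
first = timed_import("json")
second = timed_import("json")
```

So a large count of `import mlflow` lines is unlikely to be the bottleneck by itself; timing the logger's actual logging calls would be a more direct way to find the slowdown.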
Hi, thanks for your work. I am wondering what the difference is between the two CogVLM models in Table 2 and Table 4. The reason I am asking is that the performance...
Hi, thanks for your work. May I ask whether you fine-tuned the VE and MLP adaptor at both the pre-training stage and the SFT stage? Thanks.
1. mathvista_testmini:
```
{
  "results": {
    "mathvista_testmini": {
      " ": " ",
      "alias": "mathvista_testmini"
    },
    "mathvista_testmini_cot": {
      "alias": " - mathvista_testmini_cot",
      "gpt_eval_score,none": 29.2,
      "gpt_eval_score_stderr,none": "N/A",
      "submission,none": [],
      "submission_stderr,none": []
    },
    ...
```
Hi, thanks for your great work. When using a single node for model training and saving intermediate checkpoints, I can use `resume_from_checkpoint` to continue training. However, when using multiple nodes...
Hi, for example, I am training a job using this [yaml](https://github.com/mosaicml/diffusion/blob/main/yamls/hydra-yamls/SD-2-base-512.yaml); how do I continue training if the job fails? Thanks.
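The generic pattern behind resumable training, which both questions above rely on, is to write numbered checkpoints during the run and load the latest one on restart. A minimal plain-Python sketch of that pattern (not Composer's actual API; the trainer's checkpoint-loading option name depends on the version, so check the docs):

```python
import json
import os


def save_checkpoint(ckpt_dir, step, state):
    """Write training state for `step` as a numbered JSON checkpoint."""
    os.makedirs(ckpt_dir, exist_ok=True)
    path = os.path.join(ckpt_dir, f"ckpt_{step:08d}.json")
    with open(path, "w") as f:
        json.dump({"step": step, "state": state}, f)


def load_latest_checkpoint(ckpt_dir):
    """Return (step, state) from the newest checkpoint, or (0, None)."""
    if not os.path.isdir(ckpt_dir):
        return 0, None
    ckpts = sorted(n for n in os.listdir(ckpt_dir) if n.startswith("ckpt_"))
    if not ckpts:
        return 0, None
    with open(os.path.join(ckpt_dir, ckpts[-1])) as f:
        data = json.load(f)
    return data["step"], data["state"]
```

Zero-padding the step number in the filename makes lexicographic `sorted()` agree with numeric order, so the last entry is always the newest checkpoint.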