LISA
Project Page for "LISA: Reasoning Segmentation via Large Language Model"
Hi @X-Lai, @tianzhuotao, @yukang2017, @yanwei-li, @xbkaishui. I am evaluating your method with your provided model (xinlai/LISA-13B-llama2-v1) as part of my research studies. I see a significant difference between...
I am trying to retrain LISA with the 7B model, and I encounter this warning: Token indices sequence length is longer than the specified maximum sequence length for this model (565...
Hi there! I'm exploring whether LISA is available on Hugging Face. I came across one link [here](https://huggingface.co/xinlai/LISA-13B-llama2-v0-explanatory), but unfortunately, there's no description provided. My interest lies in Hugging Face because...
Hi, great work! I was wondering what the difference is between the various model files, e.g. `bf16_zero_pp_rank_1_mp_rank_00_optim_states.pt`, `mp_rank_00_model_states.pt`, the `pytorch_model.bin` obtained after running `zero_to_fp32.py`, and the file obtained after `merge_lora_weights_and_save_hf_model.py`....
Hi, thanks for such great and interesting work. I would like to ask how long it takes to run inference on one picture from the refcocog dataset using LISA-7B on a...
Hi, thank you for your impressive work. In order to reproduce the results of your paper and do further research, could you let me know if the testing set of...
Dear author, Thank you for your significant contribution to the community; the performance of your model is truly impressive. I am curious to know if you have considered replacing the...
Hi, thank you for your excellent work. I trained my own LISA model and merged the weights, but when I try to run chat.py for inference, an error occurs...
While training the model, I frequently receive the following warning: `Token indices sequence length is longer than the specified maximum sequence length for this model (590 > 512). Running this...
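The warning reported in the two threads above is emitted by the Hugging Face tokenizer when a sequence exceeds the tokenizer's advertised `model_max_length` (here 512); it does not by itself truncate anything. The sketch below reproduces that length check with plain lists in place of a real tokenizer; the function name `check_and_truncate` and the constant are assumptions for illustration, not part of the LISA codebase.

```python
import warnings

MODEL_MAX_LENGTH = 512  # the limit advertised by the tokenizer config

def check_and_truncate(token_ids, max_length=MODEL_MAX_LENGTH, truncate=False):
    """Warn, as the tokenizer does, when a sequence exceeds max_length;
    optionally cut it down to max_length."""
    if len(token_ids) > max_length:
        warnings.warn(
            f"Token indices sequence length is longer than the specified "
            f"maximum sequence length for this model "
            f"({len(token_ids)} > {max_length})."
        )
        if truncate:
            return token_ids[:max_length]
    return token_ids

ids = list(range(590))                      # a 590-token sequence, as in the warning
kept = check_and_truncate(ids, truncate=True)
print(len(kept))                            # truncated to 512
```

Whether the warning matters depends on the actual context window of the underlying model: if the model accepts longer inputs than the tokenizer config claims, the message can be benign; otherwise the overlong tail is silently ignored or must be truncated as above.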
Hi, when running chat.py, I can get the text response, but the shape of the mask is [0, 1, x, x], how can I fix it? Thanks in advance!
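A mask shaped `[0, 1, H, W]` has a leading dimension of zero, i.e. the model decoded zero masks for that reply; a plausible cause is that no segmentation token was produced for the prompt. A minimal defensive sketch, assuming `pred_masks` is a NumPy array shaped `[num_masks, 1, H, W]` (the helper name `first_mask_or_none` is hypothetical):

```python
import numpy as np

def first_mask_or_none(pred_masks):
    """Return the first mask as a 2-D boolean array, or None when the
    model produced no masks (leading dimension of zero)."""
    if pred_masks.shape[0] == 0:
        return None  # nothing to draw; the reply contained no mask
    return pred_masks[0, 0] > 0  # threshold logits into a boolean mask

empty = np.zeros((0, 1, 4, 4))
print(first_mask_or_none(empty))  # None: guard instead of indexing into an empty batch
```

Checking for the empty case before indexing avoids a crash in the visualisation code and makes the "no mask returned" situation explicit, so it can be handled by rephrasing the prompt rather than debugging a shape error.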