NGUYEN, Van Tuan

Results: 4 comments by NGUYEN, Van Tuan

Hi guys. Have you solved this problem? I get the same error when evaluating the test set. I see that it throws the error at the last batch of...

Hi @LeonYang95. I think there is an error in `module/Attention.py`, class `LinearAttention`: at https://github.com/LeonYang95/PLELog/blob/c8bb56b08fe6368f3b3c71ff88de8a87c48c7607/module/Attention.py#L275, `combined_tensors.squeeze(1)` removes the input dimension when it has size 1. So,...
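
To illustrate the shape issue, here is a minimal sketch, assuming `combined_tensors` is laid out as `(batch, seq_len, hidden)`; the sizes below are made up for demonstration:

```python
import torch

# squeeze(1) only drops dimension 1 when it happens to be size 1, so the
# rank of the result depends on the sequence length of the current batch.
long_batch  = torch.randn(8, 50, 300)   # (batch, seq_len=50, hidden)
short_batch = torch.randn(8, 1, 300)    # (batch, seq_len=1,  hidden)

print(long_batch.squeeze(1).shape)      # torch.Size([8, 50, 300]) -- unchanged
print(short_batch.squeeze(1).shape)     # torch.Size([8, 300])     -- seq dim removed

# Downstream code that expects a 3-D tensor then fails only on batches whose
# sequences have length one, e.g. the last batch of an evaluation set.
```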

> Hi @dino-chiio,
>
> Your comment about the shape issue is correct: the squeeze will produce this error when the sequence length is one. But for other situations, the...
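
One possible defensive pattern is sketched below (purely my assumption, not the repository's actual fix; `ensure_3d` and the shapes are hypothetical): restore the sequence dimension whenever the squeeze has collapsed it.

```python
import torch

def ensure_3d(combined_tensors: torch.Tensor) -> torch.Tensor:
    # If squeeze(1) has reduced a length-one sequence to (batch, hidden),
    # add the sequence dimension back so later code always sees 3-D input.
    if combined_tensors.dim() == 2:
        combined_tensors = combined_tensors.unsqueeze(1)
    return combined_tensors

out = torch.randn(8, 1, 300).squeeze(1)  # collapses to (8, 300)
print(ensure_3d(out).shape)              # torch.Size([8, 1, 300])
```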

Hi everyone. I have implemented the [BLIP-VQA-BASE](https://huggingface.co/Salesforce/blip-vqa-base) model for the VQA task [here](https://github.com/dino-chiio/blip-vqa-finetune). I hope this implementation helps you, and I would welcome any advice on it.
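
For quick reference, here is a minimal inference sketch using the Hugging Face `transformers` API (separate from the fine-tuning code in the linked repository; the image path and question are placeholders):

```python
from PIL import Image
from transformers import BlipProcessor, BlipForQuestionAnswering

# Load the pretrained processor and VQA model from the Hub.
processor = BlipProcessor.from_pretrained("Salesforce/blip-vqa-base")
model = BlipForQuestionAnswering.from_pretrained("Salesforce/blip-vqa-base")

# Placeholder image and question -- replace with your own data.
image = Image.open("example.jpg").convert("RGB")
question = "How many dogs are in the picture?"

inputs = processor(image, question, return_tensors="pt")
output_ids = model.generate(**inputs)
print(processor.decode(output_ids[0], skip_special_tokens=True))
```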