Sebastian.W
The garbled answers should have nothing to do with the value of `padding_side`; please see my reply at https://github.com/Vision-CAIR/MiniGPT-4/issues/146#issuecomment-1523063250
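For context on what `padding_side` actually controls, here is a minimal sketch in plain Python (not the real Hugging Face tokenizer API) of left- vs. right-padding a batch of token ids:

```python
def pad(ids, length, pad_id=0, side="right"):
    """Pad a token-id list to `length` on the given side."""
    padding = [pad_id] * (length - len(ids))
    return ids + padding if side == "right" else padding + ids

print(pad([5, 6, 7], 5, side="right"))  # [5, 6, 7, 0, 0]
print(pad([5, 6, 7], 5, side="left"))   # [0, 0, 5, 6, 7]
```

Either choice yields a valid batch as long as the attention mask matches, which is why the padding side alone is unlikely to cause garbled output.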
Maybe this is helpful: #164
Update GPU information:

```
Thu Apr 27 18:02:17 2023
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.161.03    Driver Version: 470.161.03    CUDA Version: 11.4   |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr....
```
I have solved the problem by modifying `train_configs/minigpt4_stage2_finetune.yaml` as below:

```yaml
iters_per_epoch: 20
batch_size_train: 2
batch_size_eval: 4
num_workers: 2
warmup_steps: 20
```

I didn't dive deeper to find the...
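As a quick sanity check on these numbers (`max_epoch` below is a made-up value for illustration, not from the repo; use the one in your own config), the warmup length relative to the total number of optimizer steps works out as:

```python
# Values mirroring the yaml above; max_epoch is an assumption for illustration.
iters_per_epoch = 20
warmup_steps = 20
max_epoch = 5

total_steps = iters_per_epoch * max_epoch
print(total_steps)                  # 100
print(warmup_steps / total_steps)   # 0.2 -> warmup spans the entire first epoch
```

With `warmup_steps` equal to `iters_per_epoch`, the learning rate only reaches its peak after the first epoch, which keeps early fine-tuning steps gentle on small runs like this.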
```
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.30.02              Driver Version: 530.30.02    CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                Persistence-M  | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf          Pwr:Usage/Cap  |         Memory-Usage...
```
> dataset/README_2_STAGE.md

I read the document, and then I raised the two questions. It may be quite clear to you, but could you answer my questions or show me an example of how...
> Hello! Our stage 2 focuses on teaching the model to talk in a natural way. If your own dataset can also serve for this purpose, you can ignore our...
@TsuTikgiau Thanks a lot, really helpful!
> Could you successfully fine-tune the 7B model? I met a problem about padding.

Yes, I ran the fine-tuning stage (stage 2) successfully, but the result is not good so far,...
My demo also produced this kind of nonsense the first time. I later found that, when editing `minigpt4/configs/models/minigpt4.yaml`, I had mistakenly pointed `llama_model` at the Vicuna delta files. Your situation may not be the same as mine, but I'd suggest carefully checking whether the config file is correct.
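One quick way to catch this mistake is a heuristic check on the configured path: the Vicuna *delta* releases usually carry "delta" in the directory name, while MiniGPT-4 expects the weights after merging the delta into the base LLaMA checkpoint. A small sketch (the paths and the helper are made up for illustration):

```python
from pathlib import Path

def looks_like_delta(llama_model_path: str) -> bool:
    """Heuristic: Vicuna delta releases usually have 'delta' in the name."""
    return "delta" in Path(llama_model_path).name.lower()

# Hypothetical paths; MiniGPT-4 needs the merged checkpoint, not the delta.
print(looks_like_delta("/weights/vicuna-7b-delta-v1.1"))  # True  -> wrong config
print(looks_like_delta("/weights/vicuna-7b-merged"))      # False -> looks OK
```

This is only a name-based guess, of course; the reliable check is to confirm the directory was produced by the delta-merging step rather than downloaded directly.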