Hi, I've been using it recently and noticed that wandb still logs memory information even when the trainer's `args.skip_memory_tracker=True`. How can I turn this off on the wandb side?
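A hedged sketch, not an official answer: the trainer flag only controls the trainer's own memory tracking, while the memory charts in the W&B dashboard typically come from wandb's background system-metrics monitor, which is configured separately through `wandb.Settings`. The exact field name has changed across wandb releases (older versions use `_disable_stats`, newer ones `x_disable_stats`), and the project name below is illustrative:

```python
import wandb

# Disable wandb's background system-metrics collection (CPU/GPU/memory).
# NOTE: the field name varies by wandb release — older versions accept
# `_disable_stats`, newer ones `x_disable_stats`; check your version's docs.
run = wandb.init(
    project="my-project",  # illustrative project name
    settings=wandb.Settings(_disable_stats=True),
)
```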
I changed the LLM to llava-v1.6-mistral-7b, but the model's output is empty.
Hello, could you share the pre-training loss curve and the fine-tuning loss curve? I have some questions about my reproduction results. Thank you!
### 📚 The doc issue

How do I run inference on the video modality with lmdeploy?

### Suggest a potential alternative/fix

_No response_
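A minimal sketch of one common workaround, assuming your model supports multi-image prompts (e.g. InternVL-family VLMs served by lmdeploy): sample frames from the video and pass them as an image list. The model name, frame count, and file path below are illustrative, not lmdeploy's official video API:

```python
# pip install lmdeploy opencv-python pillow
import cv2
from PIL import Image
from lmdeploy import pipeline


def sample_frames(video_path: str, num_frames: int = 8) -> list[Image.Image]:
    """Uniformly sample `num_frames` RGB frames from a video file."""
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    frames = []
    for i in range(num_frames):
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i * total / num_frames))
        ok, frame = cap.read()
        if ok:
            frames.append(Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)))
    cap.release()
    return frames


pipe = pipeline('OpenGVLab/InternVL2-8B')  # illustrative multi-image VLM
frames = sample_frames('demo.mp4')
# lmdeploy accepts (prompt, [images]) tuples for multi-image inference;
# the frames are treated as an ordered image sequence.
response = pipe(('Describe what happens in this video.', frames))
print(response.text)
```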
Hello! I wonder whether RWKV7 used a sequence packing strategy during pre-training. If so, do the packed samples need to be masked from each other?
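For context on what "masking samples from each other" means here: since RWKV is recurrent, isolating packed documents usually means resetting the recurrent state at document boundaries rather than applying an attention mask; in attention-based models, packing instead uses a block-diagonal causal mask. A generic, illustrative sketch of that mask (not RWKV7's actual pipeline; all names are made up):

```python
import torch


def block_diagonal_causal_mask(lengths: list[int]) -> torch.Tensor:
    """Boolean attention mask for a packed sequence.

    Each token may only attend causally within its own document
    (block). `lengths` holds the per-document token counts; their
    sum is the packed sequence length.
    """
    total = sum(lengths)
    # doc_ids[i] = index of the document that token i belongs to
    doc_ids = torch.repeat_interleave(
        torch.arange(len(lengths)), torch.tensor(lengths)
    )
    same_doc = doc_ids[:, None] == doc_ids[None, :]
    causal = torch.tril(torch.ones(total, total, dtype=torch.bool))
    return same_doc & causal  # True = attention allowed


# Example: two packed documents of lengths 3 and 2.
print(block_diagonal_causal_mask([3, 2]).int())
```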
Great work! I'd like to ask: was your MQAR data generated with the scripts in the Zoology repo?