necrophagists

Results: 7 issues from necrophagists

Hi, I've been using it recently and noticed that wandb logs memory information even when the trainer's `args.skip_memory_tracker=True`. How can I turn off this setting in wandb?
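The trainer's memory-tracking flag only controls the trainer's own reporting; wandb separately samples system metrics (including memory) in a background thread. A minimal, version-dependent sketch of turning that sampling off, assuming a wandb release where the setting is named `_disable_stats` (newer releases renamed it `x_disable_stats`; the project name below is hypothetical):

```python
import wandb

# Sketch: disable wandb's background system-metric sampling, which is
# what records CPU/GPU memory usage. The setting name depends on the
# installed wandb version: `_disable_stats` in older releases,
# `x_disable_stats` in newer ones -- check your version's docs.
run = wandb.init(
    project="my-project",  # hypothetical project name
    settings=wandb.Settings(_disable_stats=True),
)
```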

![image](https://github.com/hpcaitech/Open-Sora/assets/120618287/6a2b6237-c7c1-4f42-b918-a98aaf6ef8dd) I changed the LLM to llava-v1.6-mistral-7b, but the model's output is empty.

Hello, can you share the pre-training loss curve and the fine-tuning loss curve? I have some questions about my reproduction results. Thank you!

### 📚 The doc issue

How can lmdeploy be used for inference on the video modality?

### Suggest a potential alternative/fix

_No response_

![微信截图_20240923214125](https://github.com/user-attachments/assets/d0ba934e-e018-49cf-ac2d-92b146506b29)

Hello! I wonder whether RWKV7 used a sequence-packing strategy during pre-training. If so, do the packed samples need to be masked from each other?
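For context on the masking part of the question: when multiple samples are packed into one row, the usual fix is a causal, block-diagonal attention mask so tokens from one sample cannot attend to tokens from another. A minimal illustrative sketch (not RWKV's actual code, and RWKV's recurrent formulation may handle this differently):

```python
# Illustrative sketch: build a causal, block-diagonal attention mask
# for a packed row, so attention stays within each sample.

def packed_causal_mask(sample_lengths):
    """Return mask[i][j] = True iff position i may attend to position j.

    `sample_lengths` lists the length of each sample packed into the row.
    """
    # assign a sample id to every position in the packed row
    ids = []
    for sid, n in enumerate(sample_lengths):
        ids.extend([sid] * n)
    total = len(ids)
    # allowed iff same sample (block-diagonal) and not a future position (causal)
    return [[ids[i] == ids[j] and j <= i for j in range(total)]
            for i in range(total)]

mask = packed_causal_mask([2, 3])
# positions 0-1 belong to sample 0, positions 2-4 to sample 1
assert mask[1][0] is True   # within-sample, past position: allowed
assert mask[2][1] is False  # cross-sample attention: blocked
assert mask[1][2] is False  # future position: blocked
```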

Great work! I'd like to ask: was your MQAR data generated with the scripts in the Zoology repository?
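For readers unfamiliar with the task format behind the question: MQAR (multi-query associative recall) presents a context of key-value pairs followed by repeated key queries, and the model must recall each key's value. A hedged sketch of that format, assuming nothing about the Zoology repository's actual script (`make_mqar_example` is a hypothetical helper):

```python
import random

# Illustrative generator for the MQAR task format: emit a context of
# key-value token pairs (k1 v1 k2 v2 ...) followed by key queries whose
# training targets are the values bound to those keys in the context.

def make_mqar_example(num_pairs, num_queries, vocab_size, seed=0):
    rng = random.Random(seed)
    keys = rng.sample(range(vocab_size), num_pairs)   # distinct keys
    values = [rng.randrange(vocab_size) for _ in keys]
    kv = dict(zip(keys, values))
    context = [tok for k, v in zip(keys, values) for tok in (k, v)]
    queries = [rng.choice(keys) for _ in range(num_queries)]
    targets = [kv[q] for q in queries]
    return context + queries, targets

seq, targets = make_mqar_example(num_pairs=4, num_queries=2, vocab_size=64)
assert len(seq) == 4 * 2 + 2  # 4 key-value pairs plus 2 queries
# each query's target is the value bound to that key in the context
kv = dict(zip(seq[0:8:2], seq[1:8:2]))
assert all(kv[q] == t for q, t in zip(seq[-2:], targets))
```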