WangRongsheng
> The dev version really is a nice surprise. Wow, great.

Where can I find the dev version?
@yihuiluo235 Hi, how can I get access to the dev version?
> https://i.328888.xyz/2023/01/17/HfL23.png

Hello, I have the same problem: it loads fine in the Mini Program developer tools, but as soon as I debug or preview on a real device it stays stuck on the loading screen. How can I fix this? @HunterXuan
Hello, I have confirmed that "Do not verify valid domain names" is enabled. Isn't the model loaded locally (from the model folder)? Why would there be a network issue at all?
I printed and inspected the tensors; nothing looks obviously wrong. It still reports:
```
RuntimeError: Tensors must be CUDA and dense
```
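For context, this error message usually comes from a `torch.distributed` collective (e.g. broadcast or all_reduce) receiving a tensor that lives on CPU or has a sparse layout. A small check like the sketch below (the `model` object and the printout format are only illustrative) can flag the offending parameters before the collective runs:

```python
import torch

def check_collective_ready(named_tensors):
    """Flag tensors that would trigger 'Tensors must be CUDA and dense'
    in torch.distributed collectives: anything on CPU or with a sparse layout."""
    for name, t in named_tensors:
        if not t.is_cuda:
            print(f"{name}: on {t.device}, collectives expect CUDA tensors")
        if t.is_sparse:
            print(f"{name}: sparse layout, collectives expect dense tensors")

# Example usage with a hypothetical model object:
# check_collective_ready(model.named_parameters())
```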
Could you share your versions of CUDA, cuDNN, peft, transformers, bitsandbytes, accelerate, PyTorch, torchvision, and so on?
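If it helps, here is a quick way to dump all of these in one go (a sketch; it assumes the packages are importable in the same environment used for training):

```python
import importlib
import torch

# Print the framework/CUDA versions plus the libraries asked about above.
print("torch:", torch.__version__)
print("cuda (torch build):", torch.version.cuda)
print("cudnn:", torch.backends.cudnn.version())
for pkg in ("peft", "transformers", "bitsandbytes", "accelerate", "torchvision"):
    try:
        mod = importlib.import_module(pkg)
        print(f"{pkg}:", getattr(mod, "__version__", "unknown"))
    except ImportError:
        print(f"{pkg}: not installed")
```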
Error scenario 1:
1. sft: LoRA
2. rm: LoRA
3. ppo: QLoRA

Error scenario 2:
1. sft: QLoRA
2. rm: QLoRA
3. ppo: QLoRA or LoRA

Working scenario:
1. sft: LoRA
2. rm: LoRA
3. ppo: LoRA
This will help you: https://colab.research.google.com/drive/1OK4kYsZphwt5DXchKkzMBjYF6jnkqh4R?usp=sharing
> Hi @jerryjliu If `ChatGPTLLMPredictor` is going to be deprecated, how can we customize the ChatGPT prompt using `langchain's LLM wrapper`? Could you give a simple demo? Thanks!...
@jerryjliu @madawei2699 @timurka Maybe you can take a look at the chatgpt_refine_prompt examples: https://github.com/jerryjliu/llama_index/pull/733
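For what it's worth, here is a minimal sketch of customizing the ChatGPT prompt purely through langchain's chat LLM wrapper (independent of llama_index; the model name, prompt wording, and variable names are only illustrative, and the actual llama_index integration follows the examples in the PR above):

```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMChain
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

# Wrap gpt-3.5-turbo via langchain's chat model wrapper.
llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

# A custom prompt: the system message carries the instructions that the
# default ChatGPT QA prompt would otherwise provide.
prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template(
        "Answer strictly based on the context below.\nContext:\n{context_str}"
    ),
    HumanMessagePromptTemplate.from_template("{query_str}"),
])

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run(context_str="...", query_str="What does the document say?"))
```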