TaskWeaver
CodeSpace is disabled
May I know why I am unable to create a codespace from the TaskWeaver GitHub repo?
Thank you!
Hi @cswangxiaowei, could you check any configuration related to the codespace? I can see the codespace option after switching to my private GitHub account.
My codespace is still disabled. Besides, may I know whether I can use this format to call the baichuan model through QWen's API? I tried but failed:

```json
{
  "llm.api_type": "qwen",
  "llm.model": "baichuan2-7b-chat-v1",
  "llm.api_key": "YOUR_API_KEY"
}
```

P.S. I tested this API locally and it works, so I think there must be some problem with my configuration format. Thank you!
I solved the problem of creating a codespace. Please just review my question about the API format:

```json
{
  "llm.api_type": "qwen",
  "llm.model": "baichuan2-7b-chat-v1",
  "llm.api_key": "YOUR_API_KEY"
}
```

May I know if this one will work?
We don't directly support the Baichuan model, but you can configure it through the `litellm` api_type. In `litellm`, you can follow the instructions on this page to connect via the `vllm` provider, which now supports the Baichuan2 model.
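For illustration, a TaskWeaver config routed through `litellm` to a self-hosted vLLM endpoint might look roughly like the sketch below. This is only a guess based on the config pattern above: the `litellm` api_type value, the `vllm/` model prefix, and the `llm.api_base` key are assumptions, so please check the litellm and TaskWeaver docs for the authoritative key names.

```json
{
  "llm.api_type": "litellm",
  "llm.model": "vllm/baichuan-inc/Baichuan2-13B-Chat",
  "llm.api_base": "http://localhost:8000/v1",
  "llm.api_key": "EMPTY"
}
```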
The QWen LLM API includes the Baichuan model in its model list. May I know if I can use the QWen API with the Baichuan model in TaskWeaver?
Thanks for sharing the information. For the Baichuan model, are you referring to the open-source one or the commercial one? Do you have any docs about Baichuan on QWen? Thanks!
No problem, I want to use baichuan2-13b-chat-v1, which is free now. https://help.aliyun.com/zh/dashscope/developer-reference/baichuan-metering-and-billing?spm=a2c4g.11186623.0.i10
Hi @cswangxiaowei, the issue was triggered because Baichuan2 on QWen does not support streaming mode. However, we also found that the context length of Baichuan2 7B/13B on QWen is only 2048 tokens (not sure why), which is not sufficient to accommodate our prompt, so it will fail. A better option may be to use the QWen model itself or to self-host a Baichuan model with vLLM.
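To see why a 2048-token window falls short, here is a minimal back-of-the-envelope sketch. The ~4-characters-per-token ratio is a crude heuristic (an assumption, not the model's real tokenizer), and the prompt size is illustrative only, but it shows how a prompt of TaskWeaver's scale overflows the window.

```python
# Rough sketch: why a long system prompt may not fit a 2048-token
# context window. The 4-chars-per-token ratio is a heuristic estimate,
# not the output of the model's actual tokenizer.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token for English)."""
    return max(1, len(text) // 4)

CONTEXT_LIMIT = 2048  # reported window of Baichuan2 7B/13B on QWen

# Stand-in for a large system prompt (illustrative size only)
prompt = "x" * 20000  # ~20 KB of prompt text

tokens = estimate_tokens(prompt)
fits = tokens <= CONTEXT_LIMIT
print(tokens, fits)  # estimate far exceeds the 2048-token window
```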
Thank you!