sunxAI
The default max tokens for command-r is 128k, but the UI only displays 4096. I modified api/core/model_runtime/model_providers/cohere/llm/command-r.yaml and mapped it into the Docker container. Dify still runs, but messages cannot be sent. Report...
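For anyone hitting the same wall, here is a minimal sketch of the kind of edit described above, assuming Dify's usual model provider YAML layout and a standard docker-compose volume mount; the field names, values, and in-container path are illustrative and may differ in your version:

```yaml
# command-r.yaml (sketch): raise the max_tokens ceiling that the UI reads.
# Keys follow Dify's model provider YAML convention; values are assumptions.
model: command-r
model_type: llm
model_properties:
  mode: chat
  context_size: 128000          # command-r context window
parameter_rules:
  - name: max_tokens
    use_template: max_tokens
    default: 1024
    min: 1
    max: 128000                 # was 4096 in the stock file
---
# docker-compose.yaml (sketch): mount the edited file over the copy in the image.
# The in-container path is an assumption; check where your image keeps provider YAMLs.
services:
  api:
    volumes:
      - ./command-r.yaml:/app/api/core/model_runtime/model_providers/cohere/llm/command-r.yaml:ro
```

Note that raising the ceiling in the YAML only changes what the UI allows; it does not change what the upstream API accepts, so requests can still fail if the value exceeds the provider's real output limit.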

GPT-4 Turbo has a 128k context and it is the same situation: I can only set it to 4096. Is this restricted by Dify or by OpenAI? @Yeuoly Please help...
Their open-source Linux version doesn't provide video placeholder export, and neither does the enterprise edition. Back then it took a lot of effort to build it myself: image placeholder export, video placeholder export, and text placeholder export. After I finished, the company stopped using that part anyway. Sigh.
I've been using it and it works really well; I'm slowly accumulating knowledge. https://sunx.ai welcomes friend-link submissions from sites that are also based on innei.in.