[Question] Are you considering supporting the model service at dashscope.aliyun.com?
Required prerequisites
- [X] I have read the documentation https://camel-ai.github.io/camel/camel.html.
- [X] I have searched the Issue Tracker and Discussions to confirm this hasn't already been reported. (+1 or comment there if it has.)
- [X] Consider asking first in a Discussion.
Questions
Currently, Alibaba's dashscope.aliyun.com model service supports two access methods: its own proprietary SDK and an OpenAI-compatible interface.
I have made some small modifications to the Camel code, and I can now call the Qwen models on dashscope through the OpenAI-compatible interface.
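For context, here is a minimal sketch of what calling Qwen through DashScope's OpenAI-compatible endpoint looks like with the standard `openai` client. The base URL, model name, and environment variable name below are assumptions based on DashScope's compatible-mode documentation, not part of the Camel changes themselves.

```python
# Minimal sketch: Qwen on DashScope via the OpenAI-compatible endpoint.
# base_url, model name, and env var name are assumptions; adjust for your account.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DASHSCOPE_API_KEY"],  # assumed env var name
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)

response = client.chat.completions.create(
    model="qwen-plus",  # any Qwen model exposed by your DashScope account
    messages=[{"role": "user", "content": "Say hello from Qwen."}],
)
print(response.choices[0].message.content)
```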
Aside from adding the appropriate model configuration in camel/models/model_factory.py and camel/types/enums.py, the only obstacle is that camel/utils/token_counting.py checks the model name and throws an exception on a mismatch. Could it fall back to a default tokenizer and log a warning instead of raising? A sketch of that fallback follows below.
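The following is a hedged sketch of the fallback being proposed, not Camel's actual implementation: if the model name is unknown to tiktoken, log a warning and use a default encoding rather than raising. The helper name and the choice of `cl100k_base` are illustrative.

```python
# Illustrative fallback: warn and use a default encoding for unknown model names
# instead of raising, so OpenAI-compatible backends (e.g. Qwen) still work.
import logging

import tiktoken

logger = logging.getLogger(__name__)


def get_encoding_with_fallback(model_name: str) -> tiktoken.Encoding:
    try:
        # Succeeds for model names tiktoken knows (e.g. gpt-4, gpt-3.5-turbo).
        return tiktoken.encoding_for_model(model_name)
    except KeyError:
        # Unknown model (e.g. "qwen-plus"): warn and fall back to a default.
        logger.warning(
            "Unknown model '%s' for token counting; falling back to cl100k_base.",
            model_name,
        )
        return tiktoken.get_encoding("cl100k_base")
```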
Being new to Camel, I'm not sure whether relying on the OpenAI-compatible interface is the right approach: many model services provide one, but their support for function calling varies widely.
Hey @coolbeevip, thanks for the question! It's a great idea to add support for more models that offer an OpenAI-compatible interface. I will create a PR for this and add you as a reviewer. To support more model types, I won't define the model type as an enum value; instead, users will pass the model type string directly. I'll set OpenAI's token counter as the default, but allow users to switch the token counter based on their needs.
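A rough sketch of the approach described above (not the actual PR): the model type is accepted as a plain string, and an optional token counter can be injected, defaulting to an OpenAI-style one. All class and parameter names here are hypothetical placeholders.

```python
# Hypothetical sketch of the plan: string model type + pluggable token counter.
from typing import Optional, Protocol


class TokenCounter(Protocol):
    def count_tokens(self, text: str) -> int: ...


class OpenAITokenCounter:
    """Default counter based on tiktoken's cl100k_base encoding."""

    def __init__(self) -> None:
        import tiktoken

        self._encoding = tiktoken.get_encoding("cl100k_base")

    def count_tokens(self, text: str) -> int:
        return len(self._encoding.encode(text))


class OpenAICompatibleModel:
    def __init__(
        self,
        model_type: str,  # free-form string, e.g. "qwen-plus", not an enum
        token_counter: Optional[TokenCounter] = None,
    ) -> None:
        self.model_type = model_type
        # Default to the OpenAI-style counter, but let callers swap it out.
        self.token_counter = token_counter or OpenAITokenCounter()
```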
@coolbeevip PR created: https://github.com/camel-ai/camel/pull/815
Thanks, LGTM!