FastChat
Support Bailing model
We hope to make our LLM, named Bailing, one of the optional models on chat.lmsys.org and have it join the chat on the website. We have set up our own HTTP endpoint for the model's inference, Bailing is already compatible with the OpenAI client, and I have passed the tests described in the FastChat documentation in my local environment.
I can open a PR to add my code to fastchat/serve/api_provider.py. Is that all I need to do? Thank you.
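For context, "compatible with the OpenAI client" means the endpoint accepts standard chat-completion request bodies. Below is a minimal sketch of such a payload; the model name "bailing" and the endpoint URL are placeholders for illustration, not the actual deployment details:

```python
import json

# Hypothetical endpoint URL -- a placeholder, not the real deployment.
BASE_URL = "https://example.com/v1"

def build_chat_request(model: str, user_message: str) -> str:
    """Serialize an OpenAI-style /v1/chat/completions request body."""
    payload = {
        "model": model,  # placeholder model name
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }
    return json.dumps(payload)

body = build_chat_request("bailing", "Hello!")
print(body)
```

Any endpoint that accepts this request shape can be queried with the official OpenAI client by pointing its `base_url` at the custom server.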