langserve
How can I use a local model?
I have started serving a model with vLLM, and I want to use the web-based LangSmith together with my local model. The code is below. I'm curious how I should customize the model and then use it on the web page.
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model_name="Qwen2.5-14B-Instruct",
    base_url="http://xxx:9009/v1",
    api_key="EMPTY",
    temperature=0,
).bind(response_format={"type": "json_object"})
```