
Only allowed now, your model ChatModel

nygula opened this issue 1 year ago • 1 comment

```
D:\Local-LLM-Server\demos\dotnet-demo\bin\Debug\net6.0>dotnet-demo.exe
Unhandled exception. Microsoft.SemanticKernel.HttpOperationException: Service request failed.
Status: 400 (Bad Request)

Content: {"object":"error","message":"Only allowed now, your model ChatModel","code":40301}

Headers:
  Date: Tue, 30 Jan 2024 06:08:50 GMT
  Server: uvicorn
  Content-Length: 83
  Content-Type: application/json

 ---> Azure.RequestFailedException: Service request failed.
```

```
D:\Local-LLM-Server>python startup.py
2024-01-30 14:12:46 | ERROR | stderr | INFO:     Started server process [9664]
2024-01-30 14:12:46 | ERROR | stderr | INFO:     Waiting for application startup.
2024-01-30 14:12:46 | ERROR | stderr | INFO:     Application startup complete.
2024-01-30 14:12:46 | ERROR | stderr | INFO:     Uvicorn running on http://127.0.0.1:21001 (Press CTRL+C to quit)
2024-01-30 14:12:47 | ERROR | stderr | INFO:     Started server process [19276]
2024-01-30 14:12:47 | ERROR | stderr | INFO:     Waiting for application startup.
2024-01-30 14:12:47 | ERROR | stderr | INFO:     Application startup complete.
2024-01-30 14:12:47 | ERROR | stderr | INFO:     Uvicorn running on http://127.0.0.1:21000 (Press CTRL+C to quit)
2024-01-30 14:12:52 | INFO | model_worker | Loading the model ['ChatModel'] on worker b8b48c0d ...
2024-01-30 14:13:04 | INFO | stdout | INFO:     127.0.0.1:50817 - "POST /list_models HTTP/1.1" 200 OK
2024-01-30 14:13:04 | INFO | stdout | INFO:     127.0.0.1:50816 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
2024-01-30 14:13:09 | INFO | stdout | INFO:     127.0.0.1:50820 - "POST /list_models HTTP/1.1" 200 OK
2024-01-30 14:13:09 | INFO | stdout | INFO:     127.0.0.1:50819 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
```

nygula · Jan 30 '24

Please wait for the server-started message:

Local-LLM-Server is successfully started, please use http://127.0.0.1:21000 to access the OpenAI interface
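For reference, a minimal Python sketch of that workaround (my own illustration, not code from the repo): poll the server until the model worker is registered, then send the chat request. It assumes the OpenAI-compatible GET /v1/models and POST /v1/chat/completions routes are served on port 21000, and `wait_until_ready` is a hypothetical helper name.

```python
# Readiness-check sketch (not from the repo). Assumes the server on port 21000
# exposes the standard OpenAI-compatible GET /v1/models and
# POST /v1/chat/completions routes, and uses the `requests` package.
import time

import requests

BASE_URL = "http://127.0.0.1:21000/v1"


def wait_until_ready(timeout_s: int = 300) -> None:
    """Block until the server lists at least one model, or raise on timeout."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            resp = requests.get(f"{BASE_URL}/models", timeout=5)
            # An OpenAI-style /v1/models response lists models under "data";
            # you could additionally check that "ChatModel" is among the ids.
            if resp.ok and resp.json().get("data"):
                return
        except requests.ConnectionError:
            pass  # server process not accepting connections yet
        time.sleep(2)
    raise TimeoutError("Local-LLM-Server did not become ready in time")


wait_until_ready()

# Once the model worker is registered, the request that previously returned
# 400 should succeed with model name "ChatModel".
reply = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "ChatModel",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=60,
)
print(reply.status_code, reply.json())
```

The same idea applies to the dotnet demo: don't send the first chat completion until the "successfully started" message appears in the startup.py console.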

feiyun0112 · Jan 31 '24