byzer-llm
Easy, fast, and cheap pretraining, fine-tuning, and serving for everyone
The chat model can be deployed with the following command:

```
byzerllm deploy --pretrained_model_type saas/official_openai \
--cpus_per_worker 0.01 \
--gpus_per_worker 0 \
--num_workers 1 \
--infer_params saas.api_key=xxxxx saas.model=llama2 saas.base_url="http://localhost:11434/v1/" \
--model ollama_llama2_chat
```

What about embedding and reranking? I deployed an embedding model and tested it with: `byzerllm query --model bge-m3 --query...`
false document implication
> qianfan.errors.InvalidArgumentError: The provided model `ERNIE-3.5-128K` is not in the list of supported models. If this is a recently added model, try using the `endpoint` arguments and create an issue...
The English and Chinese links in the README are swapped: | [English](https://github.com/allwefantasy/byzer-llm/blob/master/docs/zh) | [中文](https://github.com/allwefantasy/byzer-llm/blob/master/docs/en) |
[data_副本.txt](https://github.com/allwefantasy/byzer-llm/files/15233131/data_.txt) The following exception is raised when running the code below:

```
/Users/xxxxxx/miniforge3/lib/python3.10/site-packages/langchain/__init__.py:29: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
/Users/xxxxxx/miniforge3/lib/python3.10/site-packages/langchain/__init__.py:29: UserWarning: Importing PromptTemplate from langchain root module is no longer...
```
```
# https://platform.openai.com/docs/guides/function-calling
from openai import OpenAI
import json

client = OpenAI(
    base_url="http://127.0.0.1:8000/v1",
    api_key="simple"
)

# Example dummy function hard coded to return the same weather
# In production, this...
```
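The truncated snippet above follows the OpenAI function-calling guide it links to. As a point of reference, here is a minimal self-contained sketch of the parts the excerpt cuts off; the dummy `get_current_weather` function and the `tools` schema below are assumptions modeled on that guide, not code taken from this issue:

```python
import json

# Dummy function hard-coded to return the same weather for any location,
# matching the comment in the truncated example above.
def get_current_weather(location, unit="celsius"):
    return json.dumps({"location": location, "temperature": "22", "unit": unit})

# Tool schema that would be passed to client.chat.completions.create(..., tools=tools)
# so the model can decide to call get_current_weather.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name, e.g. Beijing"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}]
```

When the model responds with a tool call, the client parses the arguments, invokes the local function, and sends the JSON result back in a follow-up message.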
As the title says, is macOS currently unsupported? The versions are as follows:

```
(base) ➜ ~ byzerllm --version
Traceback (most recent call last):
  File "/Users/mintisan/miniconda3/bin/byzerllm", line 5, in <module>
    from byzerllm.byzerllm import main
  File "/Users/mintisan/miniconda3/lib/python3.11/site-packages/byzerllm/__init__.py", line 3, in <module>
    from pyjava.api.mlsql...
```