
[Feature]: Would it be possible to support llamafile.cpp or Ollama?

Open · cloudherder opened this issue 1 year ago · 8 comments

Class | Type

Large Language Model

Feature Request

Running large models with llamafile.cpp or Ollama is very convenient now. Could you consider adding support for them?

cloudherder · Feb 25 '24

+1

FLongWang · Mar 01 '24

+1, urgently hoping for Ollama support

NeverOccurs · Mar 03 '24

Never mind, if you want something done, do it yourself.

cloudherder · Mar 11 '24

Brother, if you get it working, could you share the repo? I've been too busy lately and have no time to write the interface myself 🙏

FLongWang · Mar 12 '24

> Brother, if you get it working, could you share the repo? I've been too busy lately and have no time to write the interface myself 🙏

Here you go: https://github.com/cloudherder/models_for_gpt_academic . It barely works, but it works.

cloudherder · Mar 12 '24

Solid work, bro! +999

FLongWang · Mar 12 '24

Before starting Ollama, run: export OLLAMA_HOST="0.0.0.0:11434"

Version v37.4 already supports this directly; you only need to change a few settings in config.py:

API_KEY = "ollama-key" LLM_MODEL = "one-api-qwen:14b(max_token=32768)"

API_URL_REDIRECT = {"https://api.openai.com/v1/chat/completions": "http://your_ip:11434/v1/chat/completions"}

AVAIL_LLM_MODELS = ["one-api-qwen:14b(max_token=32768)"]

CUSTOM_API_KEY_PATTERN = "ollama-key"
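
Not part of the original comment, but as a quick sanity check it can help to hit the redirected endpoint directly before pointing gpt_academic at it. The sketch below assumes Ollama is reachable at your_ip:11434 (the placeholder used in API_URL_REDIRECT above) and that the underlying Ollama model tag is qwen:14b; both are assumptions you should adjust to your setup.

```python
import requests

# Placeholder host and model tag: replace with your own Ollama server address
# and a tag that `ollama list` actually shows (e.g. after `ollama pull qwen:14b`).
url = "http://your_ip:11434/v1/chat/completions"
payload = {
    "model": "qwen:14b",  # the raw Ollama tag, without gpt_academic's "one-api-" prefix
    "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
}
# Ollama ignores the key, but sending one mirrors what gpt_academic does with API_KEY.
headers = {"Authorization": "Bearer ollama-key"}

resp = requests.post(url, json=payload, headers=headers, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this prints a reply, the config above should be able to reach the same endpoint.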

zerotoone01 · Apr 25 '24

llama3 configuration:

LLM_MODEL = "ollama-llama3(max_token=4096)"

AVAIL_LLM_MODELS = ["one-api-claude-3-sonnet-20240229(max_token=100000)", "ollama-llama3(max_token=4096)"]  # if your model is llama2, fill in llama2; be careful not to get this wrong

API_URL_REDIRECT = {"http://localhost:11434/api/chat": "http://:11434/api/chat"}  # your address

Below is the reason, for anyone interested. The essence of the model call is that the requests library must send a concrete model name to get a normal response; "ollama" itself is the manager, not a model name.

import requests

url = 'http://*******:11434/api/chat'
data = {
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "why is the sky blue?"}
    ]
}
response = requests.post(url, json=data)

# Print the response content
print(response.text)
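
One more aside, not from the original comment: Ollama's /api/tags endpoint lists the exact model names it will accept, which is an easy way to avoid the name mismatch described above. The sketch assumes Ollama is listening locally on the default port.

```python
import requests

# List the models this Ollama instance knows about; the "model" field of a chat
# request must match one of these names exactly.
resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])
```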

lin-uice · Jun 22 '24