gpt_academic
[Bug]: Error when using one-api + Azure as API_URL_REDIRECT
Installation Method
Docker (Linux)
Version
Latest
OS
Docker
Describe the bug
Traceback (most recent call last):
  File "./request_llm/bridge_chatgpt.py", line 199, in predict
    if ('data: [DONE]' in chunk_decoded) or (len(json.loads(chunk_decoded[6:])['choices'][0]["delta"]) == 0):
                                                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^
IndexError: list index out of range
data: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"choices":[],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"role":"assistant"},"content_filter_results":{}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":"Hello"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":"!"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" How"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" can"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" I"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" assist"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" 
you"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":" today"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":null,"delta":{"content":"?"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":null}data: {"id":"chatcmpl-7xrADmwJUiAKBMFmoNq5FIQpxi2qO","object":"chat.completion.chunk","created":1694500069,"model":"gpt-35-turbo","choices":[{"index":0,"finish_reason":"stop","delta":{},"content_filter_results":{}}],"usage":null}data: [DONE]
Services such as fastgpt and chatgpt-web work fine with the same one-api configuration and API key; only this project fails.
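For context, a minimal sketch (not project code; the payload is abbreviated from the first chunk above) showing why the line-199 check raises this IndexError: indexing ['choices'][0] on an empty list fails before the delta length is ever tested.

import json

# Abbreviated copy of the first chunk Azure (via one-api) streams back:
# it carries prompt_filter_results but an empty "choices" list.
chunk_decoded = ('data: {"id":"","object":"","created":0,"model":"",'
                 '"prompt_filter_results":[{"prompt_index":0}],'
                 '"choices":[],"usage":null}')

try:
    # Same shape as the check at bridge_chatgpt.py line 199:
    # [6:] strips the leading "data: ", then ['choices'][0] indexes an empty list.
    finished = ('data: [DONE]' in chunk_decoded) or \
               (len(json.loads(chunk_decoded[6:])['choices'][0]["delta"]) == 0)
except IndexError as err:
    print('IndexError:', err)  # -> list index out of range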
Screen Shot
Terminal Traceback & Material to Help Reproduce Bugs
No response
Strange. Is this a reverse-proxied Azure? The project supports Azure directly.
Yes, it's reverse-proxied Azure. I don't use Azure AI directly because only one Azure AI instance can be configured; with one-api I can configure multiple Azure AI instances and rotate through them, connect different endpoints, and avoid hitting rate limits. Another benefit is that one-api can front multiple AI services, which makes them easier to manage.
The problem here is that one-api returns a chunk with "choices":[], which neither the official OpenAI API nor the official Azure API ever does.
cross reference: https://github.com/songquanpeng/one-api/issues/521
https://platform.openai.com/docs/api-reference/chat/create
The official response does include choices:
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "gpt-3.5-turbo-0613",
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": "\n\nHello there, how may I assist you today?"
    },
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
It does have choices, but the list can't be empty.
Then the bug is that the implementation doesn't check whether the index is out of range; I'd recommend fixing that.
As for the empty-choices case, I'll check it later. In principle it shouldn't happen, since the conversion is one-to-one; if it's empty, the upstream returned empty as well.
Could you adapt to it? This one-api is really quite good.
The main thing is I don't dare change it: third-party interfaces come in all shapes, and as soon as I fix one, another breaks.
I think the official OpenAI API should be the yardstick.
Besides, it's not that one-api has never been adapted for; I've adapted until I'm numb. With the very same one-api, gpt-3 and gpt-4 already use different formats, and now Azure uses yet another format different from both.
Let me check again. For the missing DONE case, is there an example I can look at?
The official API ends the stream with this chunk:
"choices":[{"index":0,"finish_reason":"stop","delta":{}}]
Depending on the model, one-api sometimes omits this chunk entirely and the stream just ends abruptly; this is common with gpt-4.
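For illustration, a check that tolerates the missing terminator could accept either signal rather than relying on [DONE] alone. This is only a sketch; stream_finished is a hypothetical name, not code from gpt_academic or one-api.

import json

def stream_finished(chunk_decoded: str) -> bool:
    """Treat either an explicit 'data: [DONE]' line or a chunk whose first
    choice carries finish_reason == 'stop' as the end of the stream."""
    if 'data: [DONE]' in chunk_decoded:
        return True
    try:
        choices = json.loads(chunk_decoded.lstrip('data:').strip())['choices']
    except (json.JSONDecodeError, KeyError):
        return False
    return bool(choices) and choices[0].get('finish_reason') == 'stop'

# If one-api omits both signals, the only remaining cue is the HTTP stream
# closing, so the reading loop must also stop when the chunk iterator is exhausted.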
@songquanpeng Also, would you be interested in joining the project QQ group 610599535, or adding my QQ 505030475? People have been running into many similar issues lately, and for fiddly problems like this I think real-time communication is more efficient.
OK
So is there a way to fix this?
Regarding "gpt-3 and gpt-4 use different formats": in One API, 3.5 and 4 go through the same code logic.
And for the official interface, One API doesn't parse the response at all; it relays it verbatim, and the only thing it does is tally the consumed quota:
Did Azure change its API output recently? It still worked fine a few days ago, but now it doesn't. Will this bug be fixed? Otherwise I'll have to hack my own adaptation. It is indeed caused by "choices":[].
The stream returned by the Azure endpoint has chunks at the beginning and the end that don't match the code's original checks (they're missing keys). I filtered those two lines out, and after a quick test everything works.
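The filtering described here might look roughly like the sketch below; iter_content_deltas is a hypothetical helper, not the commenter's actual change.

import json

def iter_content_deltas(sse_lines):
    """Yield only chunks that contain a non-empty delta, dropping the leading
    Azure content-filter chunk ("choices":[]) and the trailing empty-delta chunk."""
    for line in sse_lines:
        if not line.startswith('data:') or '[DONE]' in line:
            continue
        payload = json.loads(line[len('data:'):].strip())
        choices = payload.get('choices') or []
        if not choices or not choices[0].get('delta'):
            continue  # nothing usable in this chunk
        yield choices[0]['delta']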
Feel free to open a PR; I'll check whether it can stay stably compatible with the other interfaces.
So for now it can't work with the one-api interface?
I'm not using one-api; I'm using the official endpoint I applied for from Azure. Without any changes it throws the error.
Any progress on this, everyone?
This was fixed long ago @VectorZhao
When I use it, the chat-with-PDF feature throws an error; normal chat is fine.
Same here: using the plugin to comprehend a PDF throws an error, while chat works fine.
File ".\request_llms\bridge_chatgpt.py", line 117, in predict_no_ui_long_connection json_data = json.loads(chunk.lstrip('data:'))['choices'][0]
json_data = json.loads(chunk.lstrip('data:'))['choices'][0] 修改为
start
json_data = json.loads(chunk.lstrip('data:')) if 'choices' not in json_data or not json_data['choices']: continue # 如果choices不存在或者为空,跳过当前迭代 json_data = json_data['choices'][0] #end
测试了az可以用,也不影响官方的接口使用。
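The traceback at the top of this issue points at the streaming path (predict, line 199), so presumably the same guard is needed there as well. Below is a sketch of a more tolerant version of that check; classify_chunk is a hypothetical name, not anything that exists in bridge_chatgpt.py.

import json

def classify_chunk(chunk_decoded: str) -> str:
    """Return 'done', 'skip', or 'content' for one decoded SSE line."""
    if 'data: [DONE]' in chunk_decoded:
        return 'done'
    chunkjson = json.loads(chunk_decoded[6:])        # strip the "data: " prefix
    if not chunkjson.get('choices'):
        return 'skip'    # e.g. Azure's prompt_filter_results chunk with "choices":[]
    delta = chunkjson['choices'][0].get('delta', {})
    return 'done' if len(delta) == 0 else 'content'  # empty delta marks finish_reason "stop"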