澄潭

82 issues by 澄潭

## Why do you need it?
When using the streaming interface of deepseek-r1 from dashscope, reasoningContent also needs to be output in a streaming manner; otherwise, the API will wait...

type/enhancement
sig/wasm
area/ai
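For context, a minimal Go sketch (not the actual ai-proxy plugin code) of the behavior this issue asks for: each upstream SSE chunk that carries reasoning content is re-emitted as soon as it arrives instead of being buffered until the answer starts. The struct layout and the `reasoning_content` field name follow the OpenAI-style delta format that DeepSeek-R1 exposes; everything else here is an illustrative assumption.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Delta mirrors the OpenAI-style streaming delta; DeepSeek-R1 adds
// reasoning_content alongside content (field name assumed here).
type Delta struct {
	Content          string `json:"content,omitempty"`
	ReasoningContent string `json:"reasoning_content,omitempty"`
}

type Choice struct {
	Delta Delta `json:"delta"`
}

type ChunkResp struct {
	Choices []Choice `json:"choices"`
}

// emitSSE writes one SSE event; in a real plugin this would go to the
// downstream response stream rather than stdout.
func emitSSE(chunk ChunkResp) {
	b, _ := json.Marshal(chunk)
	fmt.Printf("data: %s\n\n", b)
}

func main() {
	// Forward each upstream chunk immediately: reasoning_content chunks
	// first, then normal content chunks. Buffering the reasoning phase
	// would stall the stream until the model finishes "thinking".
	upstream := []ChunkResp{
		{Choices: []Choice{{Delta: Delta{ReasoningContent: "step 1..."}}}},
		{Choices: []Choice{{Delta: Delta{ReasoningContent: "step 2..."}}}},
		{Choices: []Choice{{Delta: Delta{Content: "final answer"}}}},
	}
	for _, c := range upstream {
		emitSSE(c)
	}
}
```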

Progress on enhancing the ai-proxy plugin's e2e tests based on the LLM Mock Server is as follows.
### Already supported
- baidu
- doubao
- minimax
### To be supported
#### Response format differs from OpenAI, so the corresponding response needs to be implemented in the Mock Server
- [ ] #1717
- [...

area/e2e
area/ai


## Why do you need it?
Because running model services in an e2e environment or accessing the LLM provider's API directly is not very feasible. The e2e testing of plugins for...

area/e2e
kind/HEP
sig/wasm
area/ai
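A minimal sketch of the idea behind this proposal, assuming a plain Go test with net/http/httptest standing in for the LLM provider: the test receives a deterministic canned completion, so no running model service or provider API key is needed. The test and field names are illustrative, not the project's actual e2e harness.

```go
package e2e

import (
	"encoding/json"
	"io"
	"net/http"
	"net/http/httptest"
	"testing"
)

// TestChatCompletionAgainstMock spins up an in-process stand-in for the
// LLM provider and asserts on the canned response, so the test needs
// neither a real model nor provider credentials.
func TestChatCompletionAgainstMock(t *testing.T) {
	mock := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		io.WriteString(w, `{"choices":[{"message":{"role":"assistant","content":"hello from mock"}}]}`)
	}))
	defer mock.Close()

	resp, err := http.Post(mock.URL+"/v1/chat/completions", "application/json", nil)
	if err != nil {
		t.Fatal(err)
	}
	defer resp.Body.Close()

	var body struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&body); err != nil {
		t.Fatal(err)
	}
	if len(body.Choices) == 0 || body.Choices[0].Message.Content == "" {
		t.Fatalf("unexpected mock response: %+v", body)
	}
}
```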


Proxy the ByteDance Coze-related APIs and implement the corresponding LLM mock server logic: https://www.coze.cn/open/docs/developer_guides/chat_v3

help wanted
area/e2e
area/ai

## Why do you need it?
The previous cors plugin implementation was not rigorous enough, specifically:
1. When deciding whether to directly generate a response to a preflight request, it did not check the Access-Control-Request-* headers;
2. Requests that carry an Origin header but whose method/headers do not match the configuration were answered with 403 directly. That is unnecessary, because the browser performs the corresponding check itself: https://github.com/alibaba/higress/blob/b997e6fd265aaaf5b9de6a8d44acb2a3079fa05e/plugins/wasm-go/extensions/cors/config/cors_config.go#L281-L310
3. A Vary: Origin response header should be added so that browser caching does not break cross-origin requests.

level/normal
sig/wasm
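To make the three points concrete, a minimal Go sketch written against plain net/http rather than the wasm-go plugin SDK; the function name, the single allowed origin, and the allowed method list are illustrative assumptions, not the plugin's configuration model.

```go
package main

import "net/http"

// corsMiddleware illustrates the three fixes described in the issue:
//  1. only treat an OPTIONS request as a preflight when it also carries
//     Access-Control-Request-Method;
//  2. do not reject (403) requests whose Origin/method/headers fall
//     outside the configuration -- just omit the CORS headers and let
//     the browser enforce the policy;
//  3. always add "Vary: Origin" so caches keep per-origin responses apart.
func corsMiddleware(allowedOrigin string, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		origin := r.Header.Get("Origin")
		w.Header().Add("Vary", "Origin")

		if origin == allowedOrigin {
			w.Header().Set("Access-Control-Allow-Origin", origin)
		}

		// A real preflight carries both OPTIONS and Access-Control-Request-Method.
		if r.Method == http.MethodOptions && r.Header.Get("Access-Control-Request-Method") != "" {
			if origin == allowedOrigin {
				w.Header().Set("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE")
				w.Header().Set("Access-Control-Allow-Headers", r.Header.Get("Access-Control-Request-Headers"))
			}
			w.WriteHeader(http.StatusNoContent)
			return
		}

		// Non-matching ordinary requests pass through without CORS
		// headers instead of being answered with 403.
		next.ServeHTTP(w, r)
	})
}

func main() {
	http.ListenAndServe(":8080", corsMiddleware("https://example.com",
		http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			w.Write([]byte("ok"))
		})))
}
```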