kakaluote0098
**_Question:_** The official docs give this configuration for using the Azure OpenAI service:

```shell
export MIDSCENE_USE_AZURE_OPENAI=1
export MIDSCENE_AZURE_OPENAI_SCOPE="https://cognitiveservices.azure.com/.default"
export MIDSCENE_AZURE_OPENAI_INIT_CONFIG_JSON='{"apiVersion": "2024-11-01-preview", "endpoint": "...", "deployment": "..."}'
```

How should MIDSCENE_USE_AZURE_OPENAI and the related variables be configured? Every company reaches Azure through a different endpoint, so does MIDSCENE_AZURE_OPENAI_SCOPE need to change accordingly? **I ran the connectivity-test, but the current test file targets OpenAI, not Azure** — could you provide an Azure demo? My configuration is `OPENAI_API_KEY=xxxxxxxxxxx`, where `xxxxxxxxxxx` is the Azure key. **The error is:** FAIL tests/connectivity.test.ts > Use OpenAI SDK directly...
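As a partial sketch of an answer (not a verified setup): `https://cognitiveservices.azure.com/.default` is the standard Microsoft Entra ID token scope for Azure Cognitive Services, so it normally does not vary by company; what does vary is the `endpoint` and `deployment` inside MIDSCENE_AZURE_OPENAI_INIT_CONFIG_JSON. A minimal sketch assuming key-based auth, with placeholder resource and deployment names:

```shell
# Enable Midscene's Azure OpenAI mode
export MIDSCENE_USE_AZURE_OPENAI=1

# Standard Entra ID token scope for Azure Cognitive Services; usually left as-is
export MIDSCENE_AZURE_OPENAI_SCOPE="https://cognitiveservices.azure.com/.default"

# The company-specific part: your own resource endpoint and deployment name
# (both values below are placeholders)
export MIDSCENE_AZURE_OPENAI_INIT_CONFIG_JSON='{
  "apiVersion": "2024-11-01-preview",
  "endpoint": "https://<your-resource>.openai.azure.com",
  "deployment": "<your-deployment-name>"
}'

# Azure API key (placeholder)
export OPENAI_API_KEY="<your-azure-key>"
```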
```
  File "D:\daima\OmniParser\.venv\Lib\site-packages\transformers\models\auto\auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\daima\OmniParser\.venv\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 1112, in from_pretrained
    raise ValueError(
ValueError: Unrecognized model in weights/icon_caption_florence. Should have a `model_type` key in...
```
In practice, the HTTP link cannot be opened.
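The ValueError above is what `transformers` raises when the model directory `weights/icon_caption_florence` has no `config.json` containing a `model_type` key — typically a sign that the model weights were not fully downloaded. A small diagnostic sketch (the directory path is taken from the traceback; `check_model_dir` is a hypothetical helper, not part of OmniParser):

```python
import json
from pathlib import Path

def check_model_dir(model_dir: str) -> str:
    """Diagnose why transformers' AutoConfig.from_pretrained might fail
    on a local model directory."""
    cfg_path = Path(model_dir) / "config.json"
    if not cfg_path.exists():
        return "missing config.json"
    cfg = json.loads(cfg_path.read_text())
    if "model_type" not in cfg:
        return "config.json has no model_type key"
    return f"ok: model_type={cfg['model_type']}"

# Path from the traceback above
print(check_model_dir("weights/icon_caption_florence"))
```

If this reports a missing `config.json`, re-downloading the weights into that directory is the usual fix.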