lobe-chat
[Request] Can the Ollama provider be set not to be displayed by default?
🥰 Description of requirements
Could whether the Ollama provider is enabled by default be left to the environment to control, as with the other model providers? Most people are unlikely to install Ollama on the machine hosting the service, so there is no need to enable it by default; alternatively, this could be controlled through an environment variable.
🧐 Solution
Use an environment variable to control whether the provider is enabled by default.
📝 Supplementary information
No response
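To make the request concrete, here is a minimal TypeScript sketch of what environment-controlled defaults could look like. ENABLED_OLLAMA and the simplified ModelProviderCard shape are assumptions for illustration only; neither this issue nor the codebase specifies them.

```ts
// Hypothetical sketch of the requested behavior (ENABLED_OLLAMA is an
// assumed variable name; this issue does not specify one): derive the
// provider card's default `enabled` flag from the environment instead
// of hard-coding it.
interface ModelProviderCard {
  id: string;
  enabled: boolean;
}

const OllamaProviderCard: ModelProviderCard = {
  id: 'ollama',
  // Disabled unless the deployment opts in, since most deployments do
  // not run Ollama on the machine hosting the service.
  enabled: process.env.ENABLED_OLLAMA === '1',
};
```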
👀 @yincangshiwei
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.
For some reason, the enabled flag on ModelProviderCard does not actually take effect in initialSettingsState.
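For readers unfamiliar with the symptom, the following sketch (with invented names, not LobeChat's actual store code) shows one way a card-level default can be shadowed: a merge that prefers values already present in the initial settings state.

```ts
// Illustrative-only sketch of how a card-level default can be lost
// during state initialization: if initialSettingsState carries its own
// per-provider defaults and the merge prefers that state, the card's
// `enabled` value never applies.
interface ProviderSetting {
  enabled: boolean;
}

// What the provider card declares as its default.
const cardDefaults: Record<string, ProviderSetting> = {
  ollama: { enabled: false },
};

// A stale default baked into the initial settings state.
const initialSettingsState: Record<string, ProviderSetting> = {
  ollama: { enabled: true },
};

// A shallow merge that prefers the initial state silently discards the
// card-level value, matching the symptom described above.
const effective = { ...cardDefaults, ...initialSettingsState };
console.log(effective.ollama.enabled); // -> true, not the card's false
```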
Just set OLLAMA_MODEL_LIST to -all in the environment variables.
ref: https://lobehub.com/docs/self-hosting/environment-variables/model-provider
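Concretely, the model-provider docs linked above define -all as a list entry that removes all models. Assuming a Docker-style self-hosted deployment, the workaround is a single environment line:

```
# .env (or pass with `-e` to docker run)
# "-all" removes every default model from the Ollama list.
OLLAMA_MODEL_LIST=-all
```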
✅ @yincangshiwei
This issue is closed. If you have any questions, you can comment and reply.
:tada: This issue has been resolved in version 0.160.0 :tada:
The release is available on:
Your semantic-release bot :package::rocket: