Can we support the chatglm2-6b LLM?
chatglm2-6b is distributed as multiple model files rather than a single one.
Hi, @zhengxiang5965, can we make sure the model's license is good for use? A license such as Apache-2.0 or MIT is more flexible for us.
:warning::warning::warning::warning::warning:
Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!
but... I can also be funny or helpful :smile_cat: and I can generally provide good tips or pointers to the documentation or the code based on what you wrote in the issue.
Don't engage in conversation with me; I don't support replying (yet)!
:warning::warning::warning::warning::warning:
I can see that you are having issues with the LocalAI open source project. I am sorry to hear that you are experiencing difficulties. Can you please provide more information about the issue you are facing? What specific error message are you getting? Have you tried any troubleshooting steps so far?
Sources:
- https://localai.io/index.html
- https://github.com/go-skynet/LocalAI/tree/master/README.md
- https://localai.io/basics/news/index.html
- https://github.com/go-skynet/LocalAI/tree/master/go.sum
Hi, @Aisuko, if LocalAI encounters a model split across multiple files, how can it load them directly? Currently, the documentation only provides examples for loading a single model file.
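For reference, a minimal sketch of what a model config for a split model might look like. This assumes the model has been converted to GGUF shards following llama.cpp's split naming convention and served through a llama.cpp-based backend; the file names, shard count, and `backend` value below are illustrative assumptions, not something confirmed by the LocalAI docs for chatglm2-6b:

```yaml
# chatglm2-6b.yaml - hypothetical model config dropped into LocalAI's models directory.
# Assumption: the model was converted to sharded GGUF files using llama.cpp's
# split naming scheme; llama.cpp-based backends typically locate the remaining
# shards automatically when pointed at the first one.
name: chatglm2-6b
backend: llama-cpp   # assumption: a llama.cpp-compatible conversion of chatglm2-6b exists
parameters:
  # Point at the first shard; the sibling shards (...-00002-of-00003.gguf, etc.)
  # must sit in the same directory.
  model: chatglm2-6b-00001-of-00003.gguf
```

Requests would then target the config's `name` (e.g. `"model": "chatglm2-6b"` in an OpenAI-style chat completion call); whether chatglm2-6b actually needs a dedicated backend is exactly what this issue would have to settle.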
This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.