LocalAI
macOS/build-locally instructions
This issue is about fixing up the docs on building locally and/or finding an easier way to run LocalAI on a Mac, where Docker containers can't be used.
I just installed this today on a Mac. So far, there's not much to say: I used brew to make sure I had the basics (the Xcode command-line compilers, wget, cmake, git, and golang were the dependencies I recall adding), git cloned the repo, and ran make build. The error messages were helpful for figuring out what was missing (cmake, in my case), and it compiled with a similar level of warnings as when I've built it in a Linux Docker container. I started the application with ./local-ai --models-path=./models. Requests to the "ggml-gpt4all-j" model work as expected. Note that, by default, the service is exposed to external devices on the network.
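For reference, the steps above boil down to something like the following. This is a sketch of my session, not official instructions; the brew formula names are my best guess at what I installed, so adjust to your setup:

```shell
# Install build dependencies (assumes Homebrew and the Xcode
# command-line tools are already set up)
brew install cmake wget git go

# Fetch the repo and build the binary
git clone https://github.com/go-skynet/LocalAI.git
cd LocalAI
make build

# Run the server, pointing it at a local models directory
./local-ai --models-path=./models
```

If make build fails, the error output was in my experience specific enough to tell you which tool was missing.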
In general, it's pretty easy to get started on a Mac now!
I think this issue can be closed now @mudler, since we added instructions for building on Mac? Maybe some of the dependencies that @camperdave mentions need to be added to the docs tho, but I suspect I already had those on my machine.
I'm getting a Mac Studio M2 Ultra this week and am planning to use it as a server. When I set up LocalAI I'll write up some instructions and put in a PR to localai-website.
:warning::warning::warning::warning::warning:
Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!
but.... I can also be funny or helpful :smile_cat: and I can generally provide good tips or point to relevant places in the documentation or code based on what you wrote in the issue.
Don't engage in conversation with me, I don't support (yet) replying!
:warning::warning::warning::warning::warning:
To build LocalAI locally without using Docker, you can follow the instructions in the "Build" section of the documentation. For Mac users who cannot use Docker, the recommended method is to install the required dependencies using brew and then build the binary manually, as described in the "Build on Mac" section of the documentation. This should work for both M1 and M2 processors. If you encounter any issues during the build process or when running the server, please provide more details so that we can assist further.
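Once the server is up, it can be exercised with an OpenAI-style request. A minimal sketch, assuming the server listens on the default port 8080 and that a ggml-gpt4all-j model file is present in the models directory (both assumptions, so check your own configuration):

```shell
# Send a chat completion request to the locally built server
# (assumes default port 8080 and ggml-gpt4all-j in ./models)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

Because the API is OpenAI-compatible, existing OpenAI client libraries can usually be pointed at the local endpoint instead.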
Sources:
- https://github.com/go-skynet/LocalAI/tree/master/examples/langchain/docker-compose.yaml
- https://github.com/go-skynet/LocalAI/tree/master/Earthfile
- https://localai.io/basics/build/index.html
Here is the documentation for building on macOS (M1 and M2): https://localai.io/basics/build/index.html#build-on-mac. It also supports Metal on macOS (see the bottom of that page), it does not use Docker, and it has been tested by community users. So, we can close this one now.
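If I read the linked page correctly, enabling Metal is a matter of setting a build flag. A sketch; the exact variable name and value come from that page, so verify against the current docs before relying on it:

```shell
# Build with Metal acceleration enabled on Apple Silicon
# (BUILD_TYPE value as described in the linked build docs)
make BUILD_TYPE=metal build
```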