
Task list for Intel Mac support?

Open FrankDMartinez opened this issue 2 years ago • 5 comments

Hi.

I hope you are having a great new year so far.

Is there a task list for what is needed for Intel Mac support? Maybe some of us could help?

Thanks in advance.

FrankDMartinez avatar Apr 06 '23 16:04 FrankDMartinez

Why? They have been replaced by the shitty ARM-based proprietary stuff. Other than that, follow the Linux instructions.

cooperdk avatar Apr 13 '23 00:04 cooperdk

Tried that; it didn’t work and that’s why I asked.

FrankDMartinez avatar Apr 13 '23 00:04 FrankDMartinez

"It didn't work" is not much to go on. What response do you get? Other than that, a Mac is really not optimal for AI, though I suppose an Intel-based system fares better.

You should try the alpaca.cpp method instead. It's a lot better anyway. Read the readme; there's a link. It's command-line based and lets you adjust settings, which you cannot do in the GUI.

cooperdk avatar Apr 13 '23 04:04 cooperdk

While I appreciate the concern, fixing the build errors on an Intel Mac is not per se what I'm after; rather, I want to identify what the project needs in order to support Intel Macs in general. That might sound like a distinction without a difference, but I am talking about the larger picture beyond a single successful build. The readme suggests there may be a larger set of needs beyond simply building on an Intel Mac, which is why I asked for the task list.

FrankDMartinez avatar Apr 13 '23 12:04 FrankDMartinez

To my understanding, the only requirements for a successful build are those listed.

If it will not build on an Intel Mac, it may be because it's simply not possible. That said, the app is really only a graphical shell around the alpaca.cpp code, which you can build on a Mac. On its own, the app provides nothing but a fancy browser window for entering requests. The same can be achieved by running the chat CLI program with the -i argument.
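If you want to try that route, a minimal sketch might look like the following. The repository URL and model filename are assumptions based on the alpaca.cpp project as it existed at the time; check the project's readme for the current instructions and for where to obtain a model file.

```shell
# Sketch: build alpaca.cpp from source and run the chat CLI directly.
# (Assumes you have git, make, and a C++ toolchain installed.)
git clone https://github.com/antimatter15/alpaca.cpp
cd alpaca.cpp
make chat

# Run the chat program against a quantized model file you have downloaded.
# The model filename below is illustrative; use whatever model you have.
./chat -m ggml-alpaca-7b-q4.bin
```

This gives you the same functionality as the GUI, plus access to the command-line flags for tuning generation settings.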

I would do that instead, since this method also allows adjustments. Even better, I would use llama.cpp, which supports larger models. I use it with the Alpaca 13B q4 model, which is a lot better.
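For the llama.cpp route, the build is similar. Again, this is a sketch under assumptions (the repository URL and the model path are illustrative; the exact binary name and flags may differ between versions, so consult the project's readme):

```shell
# Sketch: build llama.cpp and run a larger quantized model interactively.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# -m points at a 4-bit quantized model file you supply yourself;
# -i starts interactive (chat-style) mode.
./main -m ./models/ggml-model-q4_0.bin -i
```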

It still has a response limit of only around 3 KB if you ask it to write a story, though, which is the main drawback of this model.

cooperdk avatar Apr 13 '23 13:04 cooperdk