
provides binary/installer to ease the installation/onboarding of ramalama

Open benoitf opened this issue 9 months ago • 10 comments

Proposal: Provide Self-Contained Installers for Windows and macOS

Problem

Currently, Python is not installed by default on Windows and macOS. Since the ramalama package requires a Python runtime, installation is more complex than it would be with a self-contained binary.

  • The package is available on PyPI, but users need a proper Python installation.
  • Homebrew can be used on macOS, but not all users have it installed.
  • Windows users need to install Python separately before using the package.

Suggested Solution

To improve accessibility, I am thinking of:

  1. A .pkg installer for macOS and an .exe installer for Windows
  2. A self-contained binary for Windows and macOS (with a potential startup delay due to unpacking) or a directory to unpack

Potential Approach

It seems that PyInstaller can generate these self-contained packages out of the box. Using it to create platform-specific installers might simplify installation and adoption.
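For illustration, a minimal sketch of what a PyInstaller build could look like (untested; the bin/ramalama entry-point path is an assumption):

```sh
# Minimal sketch, not a tested recipe. Assumes a checkout of the ramalama
# source and that the CLI entry script lives at bin/ramalama (assumption).
pip install pyinstaller
# --onefile produces a single self-extracting executable (the source of the
# startup delay mentioned above); --onedir would produce a directory to unpack.
pyinstaller --onefile --name ramalama bin/ramalama
# The result lands in dist/ramalama (dist/ramalama.exe on Windows).
```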

Benefits

  • Easier installation process without requiring users to set up Python manually
  • Broader accessibility for non-developer users
  • Reduces friction in adoption

benoitf avatar Feb 13 '25 20:02 benoitf

@lsm5 I wonder if this is something we could execute via github actions, when we generate a release?

rhatdan avatar Feb 13 '25 21:02 rhatdan

macOS is a packaging effort.

Could we consider running RamaLama inside podman-machine or WSL2 for Windows? Porting it to Windows will be a significant effort.

Note that if we run RamaLama directly on Windows and/or macOS, you lose all the container features of RamaLama, which are a key goal of RamaLama and Podman Desktop.

ericcurtin avatar Feb 13 '25 22:02 ericcurtin

Note that if we run RamaLama directly on Windows and/or macOS, you lose all the container features of RamaLama, which are a key goal of RamaLama and Podman Desktop.

Hello, I'm not sure I follow. It's only a packaging concern, so the Python runtime is included. I don't see why we wouldn't be able to run any of the container features?

benoitf avatar Feb 14 '25 07:02 benoitf

Note that if we run RamaLama directly on Windows and/or macOS, you lose all the container features of RamaLama, which are a key goal of RamaLama and Podman Desktop.

Hello, I'm not sure I follow. It's only a packaging concern, so the Python runtime is included. I don't see why we wouldn't be able to run any of the container features?

Because containers don't exist on Windows or macOS, but if you run RamaLama inside a Linux VM like podman-machine or WSL2 (WSL2 should already have the GPU passthrough necessary on Windows), you are in a Linux environment where you can run containers.

ericcurtin avatar Feb 14 '25 09:02 ericcurtin

Because containers don't exist on Windows or macOS, but if you run RamaLama inside a Linux VM like podman-machine or WSL2 (WSL2 should already have the GPU passthrough necessary on Windows), you are in a Linux environment where you can run containers.

If you have a podman machine on macOS or Windows, you have the podman CLI on your host (which is a podman-remote), so the podman command you're running is what launches the container. Why would you go inside the podman machine to run the command when it's available on the host?

benoitf avatar Feb 14 '25 09:02 benoitf

Because containers don't exist on Windows or macOS, but if you run RamaLama inside a Linux VM like podman-machine or WSL2 (WSL2 should already have the GPU passthrough necessary on Windows), you are in a Linux environment where you can run containers.

If you have a podman machine on macOS or Windows, you have the podman CLI on your host (which is a podman-remote), so the podman command you're running is what launches the container. Why would you go inside the podman machine to run the command when it's available on the host?

Because RamaLama makes all sorts of assumptions that the base OS is Unix-like, which is a fair assumption for a container-oriented tool...

It's not only a packaging issue for Windows: if you try to execute RamaLama on Windows, it will fail in multiple ways.

You also don't need to package python3 when you run inside podman-machine, as you can depend on the Linux distro's packaging.
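
For instance, running the existing CLI inside the VM might look roughly like this (a hypothetical sketch; the machine name, distro name, and model name are assumptions, and ramalama must already be installed inside the VM):

```sh
# Hypothetical sketch: invoke RamaLama inside the Linux VM rather than on the host.
# macOS/Windows with the default podman machine running:
podman machine ssh podman-machine-default ramalama run tinyllama
# Windows with a WSL2 distribution (distro name is an assumption):
wsl -d Fedora -- ramalama run tinyllama
```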

ericcurtin avatar Feb 14 '25 10:02 ericcurtin

If someone is good with macOS and/or Windows installers, help here would be appreciated.

rhatdan avatar Apr 02 '25 11:04 rhatdan

@lsm5 I wonder if this is something we could execute via github actions, when we generate a release?

We publish some binaries for podman via GitHub Actions. I guess it could be doable here too. @ashley-cui and @l0rd did most of the GitHub Actions and Windows / macOS work IIUC, so they'd be the right people to check with.

lsm5 avatar Apr 02 '25 13:04 lsm5

It would not be a terrible idea just to ship this with Podman Desktop / podman machine; it's small, like we do with krunkit.

ericcurtin avatar Apr 02 '25 14:04 ericcurtin

If anyone wants to take a stab at it, here are some examples that might point you in the right direction: podman's macOS installer code lives here, the Windows installer lives here, and the GitHub action to build it on tag pushes lives here. Both installers need to be signed as well, so you'll need to get a signing key.
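
As a rough starting point, a tag-triggered workflow could look something like this (a hypothetical sketch only, not taken from podman's actual setup; action versions, paths, and the PyInstaller approach are assumptions, and code signing is omitted):

```yaml
# Hypothetical sketch of a release workflow building self-contained binaries.
name: build-installers
on:
  push:
    tags: ['v*']
jobs:
  build:
    strategy:
      matrix:
        os: [macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - run: pip install . pyinstaller
      # Entry-point path is an assumption; signing/notarization would go here.
      - run: pyinstaller --onefile --name ramalama bin/ramalama
      - uses: actions/upload-artifact@v4
        with:
          name: ramalama-${{ matrix.os }}
          path: dist/
```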

ashley-cui avatar Apr 02 '25 14:04 ashley-cui

@benoitf Do you think this issue should be closed?

rhatdan avatar Jul 22 '25 13:07 rhatdan

A friendly reminder that this issue had no activity for 30 days.

github-actions[bot] avatar Aug 25 '25 00:08 github-actions[bot]

Still an open issue, and would love for someone to step up and take this on.

rhatdan avatar Aug 26 '25 12:08 rhatdan

A friendly reminder that this issue had no activity for 30 days.

github-actions[bot] avatar Sep 27 '25 00:09 github-actions[bot]

A friendly reminder that this issue had no activity for 30 days.

github-actions[bot] avatar Nov 18 '25 00:11 github-actions[bot]