aicommits
Add support for custom OpenAI base url (LocalAI integration)
Feature request
Adding an OpenAI base URL setting would allow integration with https://localai.io/, a locally run API that is compatible with the OpenAI API specification.
Why?
Running the AI server locally provides enhanced data privacy.
Alternatives
No response
Additional context
No response
maybe the most important feature request I've seen
In the meantime:
- Locate the npm package for aicommits. On Windows, with a standard Node.js installation, it's C:\Program Files\nodejs\node_modules\aicommits
- Edit /dist/cli.mjs: open this file, find api.openai.com, and replace it with your URL. There is only one occurrence of this text.
Run npm root -g to get the global node_modules path.
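The workaround above can be sketched as a one-off shell patch. This is demonstrated on a stand-in file; for the real thing, point sed at "$(npm root -g)/aicommits/dist/cli.mjs", and note that the replacement host (localhost:8080) is an assumption — use whatever your LocalAI/ollama endpoint is:

```shell
# Stand-in for the bundled CLI file containing the hard-coded host
tmp=$(mktemp)
printf 'const host = "api.openai.com";\n' > "$tmp"

# Swap the OpenAI host for a local endpoint (assumed URL)
sed -i 's|api.openai.com|localhost:8080|' "$tmp"

cat "$tmp"
```

Keep in mind this edit is overwritten whenever the package is reinstalled or updated, which is why a proper config option is the better fix.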
Well, almost: I also had to set up nginx, tell Node to ignore my self-signed cert, and swap out gpt-3.5-turbo for a model I had already pulled with ollama. I did this on NixOS with something like this:
{ config, lib, pkgs, ... }:
let
in
{
  services = {
    nginx = {
      enable = true;
      virtualHosts = let
        base = locations: {
          inherit locations;
          forceSSL = true;
          #enableACME = true;
        };
        proxy = port: base {
          "/".proxyPass = "http://127.0.0.1:" + toString(port) + "/";
        };
      in {
        # Reverse-proxy https://localhost to the ollama API on 127.0.0.1:11434
        "localhost" = proxy 11434 // {
          default = true;
          sslCertificate = /etc/localhost/localhost.pem;
          sslCertificateKey = /etc/localhost/localhost-key.pem;
        };
      };
    };
    ollama = {
      enable = true;
      acceleration = "cuda";
    };
  };
  systemd.services = {
    ollama.serviceConfig.DynamicUser = lib.mkForce false;
  };
  environment.systemPackages = with pkgs; [
    ollama
  ];
}
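The "tell Node to ignore my self-signed cert" part can be done with an environment variable rather than a code change — a sketch, assuming aicommits runs under Node.js and the nginx proxy above serves the self-signed localhost cert:

```shell
# Disable Node's TLS certificate verification for child processes.
# Only safe for loopback traffic to a local self-signed endpoint.
export NODE_TLS_REJECT_UNAUTHORIZED=0

# Confirm the variable is set before invoking aicommits
echo "$NODE_TLS_REJECT_UNAUTHORIZED"
```

Node prints a warning when this variable is set, which is a useful reminder not to leave it enabled globally.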