Built-in SSL / TLS termination
Is your feature request related to a problem? Please describe. When thinking about using LocalAI in a production environment to serve an open-source LLM with an OpenAI-compatible API, things like SSL/TLS termination come to mind. I searched the website and the GitHub repository for built-in support in LocalAI but could not find any mention of it. Maybe I missed something, but I would like to ask before putting a reverse proxy in front of it for this task.
Describe the solution you'd like A launch option for built-in SSL/TLS termination in LocalAI.
Describe alternatives you've considered An architectural approach using a reverse proxy such as nginx.
Additional context We would like to use LocalAI in a production context, connecting it to a llama.cpp backend service and serving an OpenAI-compatible API for an open-source LLM.
:warning::warning::warning::warning::warning:
Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!
but... I can also be funny or helpful :smile_cat: and I can, generally speaking, provide good tips or point to places to look in the documentation or the code, based on what you wrote in the issue.
Don't engage in conversation with me, I don't support (yet) replying!
:warning::warning::warning::warning::warning:
I understand that you are looking for a way to enable SSL/TLS termination in LocalAI so you can serve an open-source LLM with an OpenAI-compatible API. I can confirm that this is not currently supported out of the box, but it is definitely a feature request that we can consider implementing.

As an alternative, you could use a reverse proxy like nginx to handle SSL/TLS termination and forward the requests to LocalAI. This keeps security and inference as separate concerns, which can be useful in a production environment.

To implement this approach, configure nginx to listen on port 443 (or another appropriate port) and terminate SSL/TLS connections there, then add a location block that proxies requests to LocalAI on a different port (e.g. 8080). Here is an example configuration for nginx:
server {
    listen 443 ssl;
    ssl_certificate     /path/to/certificate;
    ssl_certificate_key /path/to/key;
    # Other SSL/TLS settings as needed

    location / {
        proxy_pass http://localhost:8080;
        # Other proxy settings as needed
    }
}
You would then need to ensure that LocalAI is listening on port 8080 (the default) and that it accepts incoming requests from the proxy. The listen address can be set with the `--address` flag on the CLI or via a config file.
I hope this helps! Let me know if you have any further questions or concerns.
Sources:
- https://github.com/go-skynet/LocalAI/tree/master/README.md
- https://localai.io/index.html
- https://localai.io/advanced/index.html
@timlohse1104 if you deploy this using the helm chart, it is as simple as adding a cluster issuer annotation to the values.yaml file and defining your ingress, provided you configured a cluster CA:
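For reference, a sketch of what that could look like in the chart's values.yaml. The annotation key, issuer name, hostname, and secret name below are all illustrative and depend on your cert-manager setup and the chart version:

```yaml
ingress:
  enabled: true
  annotations:
    # Name of the ClusterIssuer you configured (assumption: cert-manager is installed)
    cert-manager.io/cluster-issuer: letsencrypt-prod
  hosts:
    - host: localai.example.com
      paths:
        - path: /
          pathType: Prefix
  tls:
    - secretName: localai-tls   # cert-manager stores the issued certificate here
      hosts:
        - localai.example.com
```

With that in place, the ingress controller terminates TLS using the certificate cert-manager provisions into the secret.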
If a kubernetes cluster doesn't make sense for your use case, using a reverse proxy is an easy alternative - Envoy is quite easy to get working for SSL termination; nginx is a little more involved.
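As a rough sketch of the Envoy route, here is a minimal static v3 config that terminates TLS on 443 and forwards to LocalAI on 8080. Certificate paths and names are placeholders:

```yaml
static_resources:
  listeners:
    - address:
        socket_address: { address: 0.0.0.0, port_value: 443 }
      filter_chains:
        - transport_socket:
            name: envoy.transport_sockets.tls
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.transport_sockets.tls.v3.DownstreamTlsContext
              common_tls_context:
                tls_certificates:
                  - certificate_chain: { filename: /etc/envoy/cert.pem }
                    private_key: { filename: /etc/envoy/key.pem }
          filters:
            - name: envoy.filters.network.http_connection_manager
              typed_config:
                "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
                stat_prefix: ingress_http
                route_config:
                  name: local_route
                  virtual_hosts:
                    - name: localai
                      domains: ["*"]
                      routes:
                        - match: { prefix: "/" }
                          route: { cluster: localai }
                http_filters:
                  - name: envoy.filters.http.router
                    typed_config:
                      "@type": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router
  clusters:
    - name: localai
      type: STRICT_DNS
      connect_timeout: 1s
      load_assignment:
        cluster_name: localai
        endpoints:
          - lb_endpoints:
              - endpoint:
                  address:
                    socket_address: { address: 127.0.0.1, port_value: 8080 }
```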
Having built-in support for HTTPS within the LocalAI server is something I'm interested in as well. While using a reverse proxy technically does provide HTTPS support, the communication on loopback is still unencrypted and in some environments will not be acceptable during a security audit or by policy. Many organizations require encryption regardless if the traffic is transmitted over a publicly accessible IP address/port, or over the internal (not publicly accessible) loopback interface.
Agree. k8s ingress is nice and all, but (a) not everyone will be using k8s, and (b) to do it right in k8s you really need either what this ticket asks for or a service-mesh-like sidecar; ingress alone is still insufficient for zero trust.
This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.