[Bug]: lmstudio and ollama providers not working with HTTPS
What did you do when it broke?
I added the ollama and lmstudio (openai compatible) providers in the config:
and added the models under language models:
How did it break?
I created a notebook and sent a message to the chat with models from ollama and lmstudio
Logs or Screenshots
With a model from ollama I get the following in the logs:
2025-11-26 20:18:41.992 | ERROR | api.routers.chat:execute_chat:384 - Error executing chat: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1010)
and with a model from lmstudio I get the following:
2025-11-26 20:30:07.454 | ERROR | api.routers.chat:execute_chat:384 - Error executing chat: Connection error.
Open Notebook Version
v1-latest (Docker)
Environment
- OS: Ubuntu 24.04
Additional Context
SSL verification
Both ollama and lmstudio run on a separate machine behind a reverse proxy (caddy) with an mkcert SSL certificate. Is there any way to disable SSL verification, maybe via an environment variable?
Different error
If I go into the container itself, I can curl both endpoints (without verifying the certificate, -k):
root@b319a5e97ec5:/app# curl -k https://lmstudio.macstudio.lan/v1/models | head -4
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  2407  100  2407    0     0  80380      0 --:--:-- --:--:-- --:--:-- 83000
{
  "data": [
    {
      "id": "mistralai/devstral-small-2507",
root@b319a5e97ec5:/app# curl -k https://ollama.macstudio.lan/v1/models | head -1
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   455  100   455    0     0  18016      0 --:--:-- --:--:-- --:--:-- 18200
{"object":"list","data":[{"id":"gpt-oss:20b","object":"model","created":1763592286,"owned_by":"library"},{"id":"llama3.1:latest","object":"model","created":1747989461,"owned_by":"library"},{"id":"nomic-embed-text:latest","object":"model","created":1747475699,"owned_by":"library"},{"id":"bge-m3:latest","object":"model","created":1747395886,"owned_by":"library"},{"id":"nomic-embed-text:v1.5","object":"model","created":1747395768,"owned_by":"library"}]}
So I'm not sure why ollama and lmstudio give two different errors.
Contribution
- [ ] I am a developer and would like to work on fixing this issue (pending maintainer approval)
Unfortunately, even after adding my root CA to the container and running update-ca-certificates (I can now curl https://ollama.macstudio.lan/v1/models without any certificate verification issue), I still get the same errors in the logs: Connection error. for lmstudio/openai-compatible and [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1010) for ollama, both of which result in a 500 Internal Server Error.
Is there maybe some kind of certificate pinning going on by one of the libraries?
This seems to be on your reverse proxy side instead of the app itself. Did you try it without https to check?
Hi, thanks for getting back to me. I'm not sure how the error could be on the reverse proxy side, since the container can reach both lmstudio and ollama through the reverse proxy. Yes, without SSL it works normally.
Thanks for the detailed report! This is indeed a user environment/setup issue rather than a bug in Open Notebook, but I understand it's frustrating.
Root Cause
The issue is that Python's SSL verification uses its own certificate store (via the certifi package) rather than the system's certificate store. Even after running update-ca-certificates in the container, Python doesn't pick up those changes.
The two different error messages you're seeing are because:
- Ollama: Uses our custom httpx-based client → explicit SSL error
- LMStudio (OpenAI-compatible): Uses the OpenAI SDK which wraps the error → generic "Connection error"
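A quick way to see the mismatch from inside the container (just a sanity check, not part of the fix; python may be python3 depending on the image, and the second command only works if certifi is installed):
python -c "import ssl; print(ssl.get_default_verify_paths())"   # where OpenSSL's system defaults point
python -c "import certifi; print(certifi.where())"              # the bundle certifi ships with
update-ca-certificates only refreshes the first set of paths; anything that pins to certifi's bundle never sees the CA you added there.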
Workarounds You Can Use Today
Option 1: Use HTTP (simplest)
As you discovered, using HTTP instead of HTTPS works. If your services are on a trusted local network, this may be acceptable.
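For example, pointing the Ollama provider at plain HTTP via the same variable used later in this thread (adjust the LMStudio/OpenAI-compatible base URL the same way wherever you configured it):
environment:
  - OLLAMA_API_BASE=http://ollama.macstudio.lan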
Option 2: Set Python SSL Environment Variables
You can point Python to your custom CA bundle by setting these environment variables in your Docker container:
SSL_CERT_FILE=/path/to/your/ca-bundle.pem
REQUESTS_CA_BUNDLE=/path/to/your/ca-bundle.pem
You'll need to:
- Mount your CA certificate into the container
- Set these environment variables in your docker-compose.yml or docker.env
Example in docker-compose.yml:
services:
  open-notebook:
    environment:
      - SSL_CERT_FILE=/certs/my-ca-bundle.pem
      - REQUESTS_CA_BUNDLE=/certs/my-ca-bundle.pem
    volumes:
      - /path/to/your/ca-bundle.pem:/certs/my-ca-bundle.pem:ro
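After recreating the container, you can confirm the variables reached the process environment with something like this (substitute your actual container name):
docker exec -it open-notebook env | grep -E 'SSL_CERT_FILE|REQUESTS_CA_BUNDLE'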
Option 3: Add your CA to Python's certifi store
Inside the container, you can append your CA certificate to certifi's bundle:
# Find certifi's certificate bundle
python -c "import certifi; print(certifi.where())"
# Append your CA certificate to it
cat /path/to/your/root-ca.pem >> $(python -c "import certifi; print(certifi.where())")
Note: This would need to be done each time the container is recreated, so you might want to add this to a startup script or custom Dockerfile.
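If you go the startup-script route, a minimal sketch could look like this (assuming your root CA is mounted at /certs/my-root-ca.pem, python is on PATH with certifi installed, and supervisord is the image's normal entrypoint as shown later in this thread):
#!/bin/sh
# append the mounted root CA to certifi's bundle, then start the app as usual
cat /certs/my-root-ca.pem >> "$(python -c 'import certifi; print(certifi.where())')"
exec /usr/bin/supervisord -c /etc/supervisor/conf.d/supervisord.conf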
Future Fix
I'm working on adding SSL configuration options to the Esperanto library (which handles the LLM provider connections). This will allow you to:
- Disable SSL verification via config or environment variable (ESPERANTO_SSL_VERIFY=false)
- Specify a custom CA bundle (ESPERANTO_SSL_CA_BUNDLE=/path/to/ca.pem)
This is tracked in the Esperanto repo and will be available in a future release.
Let me know if any of the workarounds help!
Update: Fix Available!
The SSL verification configuration feature has been merged into Esperanto. Once we release a new version of Open Notebook with the updated Esperanto dependency, you'll be able to use these environment variables:
Option 1: Custom CA Bundle (recommended)
ESPERANTO_SSL_CA_BUNDLE=/path/to/your/ca-bundle.pem
For Docker, mount your certificate and set the environment variable:
services:
  open-notebook:
    environment:
      - OLLAMA_API_BASE=https://ollama.macstudio.lan
      - ESPERANTO_SSL_CA_BUNDLE=/certs/ca-bundle.pem
    volumes:
      - /path/to/your/ca-bundle.pem:/certs/ca-bundle.pem:ro
Option 2: Disable SSL Verification (development only)
ESPERANTO_SSL_VERIFY=false
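In docker-compose terms this takes the same shape as the example above, just with verification turned off (again, development only):
services:
  open-notebook:
    environment:
      - OLLAMA_API_BASE=https://ollama.macstudio.lan
      - ESPERANTO_SSL_VERIFY=false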
Documentation has been updated:
I'll keep this issue open until we release the updated version with this fix.
Fix pushed. Version 1.2.3 is building for Docker right now.
@lfnovo Thanks a lot!! I'll check it out once it's available. :)
It already is
@lfnovo I saw that a new v1-latest was pushed https://hub.docker.com/r/lfnovo/open_notebook, but I don't see 1.2.3. I pulled the new v1-latest, but even with ESPERANTO_SSL_VERIFY=false I still see the error:
I see 1.2.2 in the logs though, not 1.2.3 yet:
2025-11-28 09:45:38.572 | INFO | api.routers.config:get_latest_version_cached:66 - Checking for latest version from GitHub...
2025-11-28 09:45:38.858 | INFO | api.routers.config:get_latest_version_cached:74 - Latest version from GitHub: 1.2.2, Current version: 1.2.2
2025-11-28 09:45:38.859 | INFO | api.routers.config:get_latest_version_cached:85 - Version check complete. Update available: False
@lfnovo I just tested it, with ESPERANTO_SSL_VERIFY=false I see SSL verification being disabled, but unfortunately still the same errors (same result when setting ESPERANTO_SSL_CA_BUNDLE):
/app/.venv/lib/python3.12/site-packages/esperanto/providers/llm/openai.py:51: UserWarning: SSL verification is disabled. This is insecure and should only be used in development/testing environments. For production, use ssl_ca_bundle to specify a custom CA certificate instead.
self._create_http_clients()
2025-12-03 19:23:06.991 | ERROR | api.routers.chat:execute_chat:384 - Error executing chat: Connection error.
INFO: 172.18.0.2:40922 - "POST /api/chat/execute HTTP/1.1" 500 Internal Server Error
INFO: 172.18.0.2:44886 - "GET /api/sources?notebook_id=notebook:3czggxk8y3kmlluv4z1d HTTP/1.1" 200 OK
INFO: 172.18.0.2:44902 - "OPTIONS /api/chat/sessions/chat_session%3Aez0r78mezbzmpkxlguby HTTP/1.1" 200 OK
INFO: 172.18.0.2:44902 - "PUT /api/chat/sessions/chat_session%3Aez0r78mezbzmpkxlguby HTTP/1.1" 200 OK
INFO: 172.18.0.2:44902 - "GET /api/chat/sessions/chat_session%3Aez0r78mezbzmpkxlguby HTTP/1.1" 200 OK
INFO: 172.18.0.2:44914 - "GET /api/chat/sessions?notebook_id=notebook:3czggxk8y3kmlluv4z1d HTTP/1.1" 200 OK
INFO: 172.18.0.2:44914 - "POST /api/chat/context HTTP/1.1" 200 OK
/app/.venv/lib/python3.12/site-packages/esperanto/providers/llm/ollama.py:50: UserWarning: SSL verification is disabled. This is insecure and should only be used in development/testing environments. For production, use ssl_ca_bundle to specify a custom CA certificate instead.
self._create_http_clients()
2025-12-03 19:25:32.605 | ERROR | api.routers.chat:execute_chat:384 - Error executing chat: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1010)
INFO: 172.18.0.2:44914 - "POST /api/chat/execute HTTP/1.1" 500 Internal Server Error
Trying option 3: that doesn't seem to apply here, as certifi is not installed, and I'd also prefer a non-runtime solution.
Trying option 2: if I set my root CA this way, all other requests fail:
Building open-notebook @ file:///app
2025-12-03 19:37:29,961 INFO success: api entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2025-12-03 19:37:29,961 INFO success: api entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
× Failed to download `identify==2.6.15`
├─▶ Failed to fetch:
│ `https://files.pythonhosted.org/packages/0f/1c/e5fd8f973d4f375adb21565739498e2e9a1e54c858a97b9a8ccfdc81da9b/identify-2.6.15-py2.py3-none-any.whl`
├─▶ Request failed after 3 retries
├─▶ error sending request for url
│ (https://files.pythonhosted.org/packages/0f/1c/e5fd8f973d4f375adb21565739498e2e9a1e54c858a97b9a8ccfdc81da9b/identify-2.6.15-py2.py3-none-any.whl)
├─▶ client error (Connect)
╰─▶ invalid peer certificate: UnknownIssuer
help: `identify` (v2.6.15) was included because `open-notebook:dev` (v1.2.3)
depends on `pre-commit` (v4.5.0) which depends on `identify`
2025-12-03 19:37:32,458 WARN exited: api (exit status 1; not expected)
2025-12-03 19:37:32,458 WARN exited: api (exit status 1; not expected)
Making a cert bundle would be an option, but it would either be static (which is not great) or would need to be built at runtime.
I am checking other options, but overall it seems that the esperanto option doesn't work as it should?
To anyone checking, this is how I solved it as the esperanto env variable didn't work:
open_notebook:
  image: lfnovo/open_notebook:v1-latest
  container_name: open_notebook
  ports:
    - "8502:8502" # Next.js Frontend
    - "5055:5055" # REST API
  env_file:
    - ./open-notebook/.env
  environment:
    - REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt
    - SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt
  depends_on:
    - surrealdb
  volumes:
    - ./open-notebook/notebook_data:/app/data # Application data
    - /etc/ssl/certs/mkcert-root_macstudio.pem:/usr/local/share/ca-certificates/mkcert-root_macstudio.crt:ro
  command: >
    sh -c "update-ca-certificates && exec /usr/bin/supervisord -c /etc/supervisor/conf.d/supervisord.conf"
The important bits are mounting the root CA as a volume, executing update-ca-certificates before the entry point, and the env variables.
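A quick way to verify the CA actually landed in the system store is to curl one of the endpoints from inside the container without -k (container name and URL as in my setup above):
docker exec open_notebook curl -s https://ollama.macstudio.lan/v1/models | head -1
If the certificate weren't trusted, curl would fail with a verification error instead of returning the model list.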