jupyter-ai
There seems to be a problem with the chat backend, please look at the JupyterLab server logs
Description
I get the error message "404 GET /api/ai/chats?token=[secret] (efd31bb56cc24686bea06aa9c4fb330b@::1) 5.90ms referer=None" in the terminal, and the error message "There seems to be a problem with the chat backend, please look at the JupyterLab server logs" in JupyterLab.
Reproduce
conda create -n jupyter-ai python=3.11
conda activate jupyter-ai
pip install jupyter
pip install jupyter_ai
jupyter lab
Context
- Operating System and version: MacBook Pro (M1 Pro)
- Browser and version: Version 119
- JupyterLab version: 4.0.9
Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! :hugs:
If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template as it helps other community members to contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! :wave:
Welcome to the Jupyter community! :tada:
@chenxuewei-ihealth Thank you for reaching out to us! Can you try running the following commands in the same conda environment where you run JupyterLab?
jupyter --version
jupyter labextension list
jupyter server extension list
You should have both a client (lab) extension and a server extension installed for Jupyter AI.
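As a rough cross-check (a sketch, not official jupyter-ai tooling), you can also confirm from Python that the server-side package is importable in the environment JupyterLab runs in — `jupyter_ai` will only be found if you run this inside that same conda environment:

```python
import importlib.util

def package_importable(name: str) -> bool:
    """True if `name` can be found on the current environment's import path."""
    return importlib.util.find_spec(name) is not None

# Demo with a stdlib module so the snippet runs anywhere;
# replace "json" with "jupyter_ai" in your conda environment.
print(package_importable("json"))  # True
```

If this prints `False` for `jupyter_ai`, the server extension is installed in a different environment than the one JupyterLab is running from.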
Thank you for your quick response.
1. jupyter --version
(jupyter-ai) ➜ ~ jupyter --version
Selected Jupyter core packages...
IPython          : 8.18.1
ipykernel        : 6.27.1
ipywidgets       : not installed
jupyter_client   : 8.6.0
jupyter_core     : 5.5.0
jupyter_server   : 2.12.1
jupyterlab       : 4.0.9
nbclient         : 0.9.0
nbconvert        : 7.12.0
nbformat         : 5.9.2
notebook         : not installed
qtconsole        : not installed
traitlets        : 5.14.0
2. jupyter labextension list
(jupyter-ai) ➜ ~ jupyter labextension list
JupyterLab v4.0.9
/Users/chenxuewei/opt/anaconda3/envs/jupyter-ai/share/jupyter/labextensions
    jupyterlab_pygments v0.3.0 enabled OK (python, jupyterlab_pygments)
    @jupyter-ai/core v2.6.0 enabled OK (python, jupyter_ai)
3. jupyter server extension list
(jupyter-ai) ➜ ~ jupyter server extension list
Config dir: /Users/chenxuewei/.jupyter
Config dir: /Users/chenxuewei/opt/anaconda3/envs/jupyter-ai/etc/jupyter
jupyter_lsp enabled
- Validating jupyter_lsp...
A _jupyter_server_extension_points function was not found in jupyter_lsp. Instead, a _jupyter_server_extension_paths function was found and will be used for now. This function name will be deprecated in future releases of Jupyter Server.
jupyter_lsp 2.2.1 OK
jupyter_ai enabled
- Validating jupyter_ai...
Extension package jupyter_ai took 1.8570s to import
jupyter_ai 2.6.0 OK
jupyter_server_terminals enabled
- Validating jupyter_server_terminals...
jupyter_server_terminals 0.4.4 OK
jupyterlab enabled
- Validating jupyterlab...
jupyterlab 4.0.9 OK
notebook_shim enabled
- Validating notebook_shim...
A _jupyter_server_extension_points function was not found in notebook_shim. Instead, a _jupyter_server_extension_paths function was found and will be used for now. This function name will be deprecated in future releases of Jupyter Server.
notebook_shim OK
Config dir: /usr/local/etc/jupyter
@chenxuewei-ihealth Can you include the terminal output after running jupyter lab and trying to use Jupyter AI?
@dlqqq, hello, I have the same issue :( Could it be because I'm in Russia, or does that not matter for the jupyter-ai extension? Thank you for your attention.
Hi, I had the same problem. My problem was that I had selected a GPT4All model as my embedding model even though I had not installed it. It worked after I installed the model. From there, I could select other models.
I have the same problem.
There seems to be a problem with the Chat backend, please look at the JupyterLab server logs or contact your administrator to correct this problem.
and still no idea how to fix this.
JupyterLab Version 4.0.11
Same, but in a docker container:
Same error:
On a local M2 Mac: deleted the entire directory /Users/%myname%/Library/Jupyter/. Building a Docker container with:
FROM jupyter/all-spark-notebook:spark-3.5.0. Map port 8888 to localhost, navigate to Jupyter, select JupyterLab, install jupyter-ai, click chat, and get: There seems to be a problem with the Chat backend, please look at the JupyterLab server logs or contact your administrator to correct this problem.
Any update here? I am facing the same issue as well
Jupyter Lab version - 4.1.5
Server Logs -
[W 2024-04-03 06:18:30.888 ServerApp] 404 GET /id-9t24hyusqk8zs44z-jupyter/api/ai/chats ([email protected]) 1.53ms referer=None
[W 2024-04-03 06:18:34.315 ServerApp] 404 GET /id-9t24hyusqk8zs44z-jupyter/api/ai/chats ([email protected]) 1.56ms referer=None
[W 2024-04-03 06:18:38.881 ServerApp] 404 GET /id-9t24hyusqk8zs44z-jupyter/api/ai/chats ([email protected]) 1.76ms referer=None
Can you paste full logs, especially anything shown just after startup? This fragment is likely irrelevant.
Saw this log in jupyter server startup
cannot import name 'Executable' from 'sqlalchemy' (/home/ray/anaconda3/lib/python3.10/site-packages/sqlalchemy/__init__.py)). Are you sure the extension is installed
Updated sqlalchemy version and the extension seems to be working fine now @krassowski
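Before upgrading, a quick hedged check of the installed SQLAlchemy version can confirm whether it is the culprit (the `(1, 4)` floor below is illustrative — the thread doesn't state the exact minimum jupyter-ai requires):

```python
from importlib.metadata import PackageNotFoundError, version

def parse_version(v: str) -> tuple:
    """Parse the leading numeric components of a version string,
    e.g. '2.0.25' -> (2, 0, 25)."""
    parts = []
    for piece in v.split("."):
        if not piece.isdigit():
            break
        parts.append(int(piece))
    return tuple(parts)

def dist_version(dist: str):
    """Installed version of `dist` as a tuple, or None if not installed."""
    try:
        return parse_version(version(dist))
    except PackageNotFoundError:
        return None

# e.g. if dist_version("sqlalchemy") is older than needed, upgrade with:
#   python -m pip install -U sqlalchemy
print(parse_version("2.0.25") >= (1, 4))  # True
```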
I fixed my issue. When starting JupyterLab, it showed some models were not installed, so I ran the pip install below (by the way, I use -m to ensure it stays in the conda environment). I originally installed the extension through the Jupyter extension manager.
I first tried the above suggestion of installing sqlalchemy. That didn't fix it.
python -m pip install langchain_anthropic langchain_openai langchain_google_genai langchain_nvidia_ai_endpoints
original log file
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `anthropic`. Please install the `langchain_anthropic` package.
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `anthropic-chat`. Please install the `langchain_anthropic` package.
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `azure-chat-openai`. Please install the `langchain_openai` package.
[I 2024-04-05 13:56:29.855 AiExtension] Registered model provider `cohere`.
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `gemini`. Please install the `langchain_google_genai` package.
[I 2024-04-05 13:56:29.855 AiExtension] Registered model provider `gpt4all`.
[I 2024-04-05 13:56:29.855 AiExtension] Registered model provider `huggingface_hub`.
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `nvidia-chat`. Please install the `langchain_nvidia_ai_endpoints` package.
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `openai`. Please install the `langchain_openai` package.
[W 2024-04-05 13:56:29.856 AiExtension] Unable to load model provider `openai-chat`. Please install the `langchain_openai` package.
[I 2024-04-05 13:56:29.856 AiExtension] Registered model provider `qianfan`.
[I 2024-04-05 13:56:29.856 AiExtension] Registered model provider `sagemaker-endpoint`.
[I 2024-04-05 13:56:29.856 AiExtension] Registered model provider `togetherai`.
[I 2024-04-05 13:56:29.859 AiExtension] Registered embeddings model provider `bedrock`.
[I 2024-04-05 13:56:29.859 AiExtension] Registered embeddings model provider `cohere`.
[I 2024-04-05 13:56:29.859 AiExtension] Registered embeddings model provider `gpt4all`.
[I 2024-04-05 13:56:29.859 AiExtension] Registered embeddings model provider `huggingface_hub`.
[E 2024-04-05 13:56:29.859 AiExtension] Unable to load embeddings model provider class from entry point `openai`: No module named 'langchain_openai'.
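These `AiExtension` warnings name the missing packages explicitly, so a small helper (a sketch against the log format shown here, not part of jupyter-ai) can turn a captured log into a pip command:

```python
import re

# A few representative lines from the AiExtension startup log above.
SAMPLE_LOG = """\
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `anthropic`. Please install the `langchain_anthropic` package.
[W 2024-04-05 13:56:29.855 AiExtension] Unable to load model provider `gemini`. Please install the `langchain_google_genai` package.
[E 2024-04-05 13:56:29.859 AiExtension] Unable to load embeddings model provider class from entry point `openai`: No module named 'langchain_openai'.
"""

def missing_packages(log: str) -> list:
    """Collect package names mentioned in AiExtension load failures."""
    pkgs = set(re.findall(r"install the `([\w-]+)` package", log))
    pkgs |= set(re.findall(r"No module named '([\w-]+)'", log))
    return sorted(pkgs)

print("python -m pip install " + " ".join(missing_packages(SAMPLE_LOG)))
# prints: python -m pip install langchain_anthropic langchain_google_genai langchain_openai
```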
Thank you so much! This worked for me. Hope this helps others as well!
python -m pip install langchain_anthropic langchain_openai langchain_google_genai langchain_nvidia_ai_endpoints
If these dependencies are necessary, then why are they not part of the install docs (or the conda recipe)? This leads to a bad UX, given that the user just sees "Something is wrong" when they follow the official install docs and then try to use the jupyter-ai chat for the first time.
What solved my issue was that I apparently had both JupyterLab and Jupyter-AI installed concurrently. Removing both and reinstalling Jupyter-AI solved the issue.