local-deep-researcher
Failed to load assistants
After installing the repo and starting the UI service, opening https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 gives an error:
Failed to load assistants. Please verify that the API server is running and accessible from the browser. TypeError: Load failed
The documentation endpoint at http://127.0.0.1:2024/docs works, however.
The terminal logs also look OK.
How to fix this?
Ollama is running smoothly and I have the following .env file:
OLLAMA_BASE_URL='http://localhost:11434' # the endpoint of the Ollama service, defaults to http://localhost:11434 if not set
OLLAMA_MODEL='deepseek-r1:8b' # the name of the model to use, defaults to 'llama3.2' if not set
# Which search service to use, either 'duckduckgo' or 'tavily' or 'perplexity'
SEARCH_API='duckduckgo'
# Web Search API Keys (choose one or both)
TAVILY_API_KEY=tvly-xxxxx # Get your key at https://tavily.com
PERPLEXITY_API_KEY=pplx-xxxxx # Get your key at https://www.perplexity.ai
MAX_WEB_RESEARCH_LOOPS=3
FETCH_FULL_PAGE=true
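As a sanity check on the configuration above, here is a minimal sketch (not the project's actual code) of how these variables might be read, using the defaults stated in the .env comments:

```python
import os

# Hypothetical sketch: read the .env settings with the fallbacks
# described in the comments of the .env file above.
ollama_base_url = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
ollama_model = os.getenv("OLLAMA_MODEL", "llama3.2")
search_api = os.getenv("SEARCH_API", "duckduckgo")
max_loops = int(os.getenv("MAX_WEB_RESEARCH_LOOPS", "3"))
fetch_full_page = os.getenv("FETCH_FULL_PAGE", "false").lower() == "true"

print(ollama_base_url, ollama_model, search_api, max_loops, fetch_full_page)
```

If any of these print unexpected values, the .env file is probably not being loaded in the shell where `langgraph dev` runs.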
I have the error too. Using grok3 to analyze it, I get a 404 Error:
GET /assistants/081587fc-586a-4ce3-8bed-9b97031bb8ad/schemas repeatedly returns 404 Not Found:
2025-02-26T07:43:04.930926Z [info] GET /assistants/081587fc-586a-4ce3-8bed-9b97031bb8ad/schemas 404 0ms
2025-02-26T07:43:06.171427Z [info] GET /assistants/081587fc-586a-4ce3-8bed-9b97031bb8ad/schemas 404 0ms
The request headers show the origin is https://smith.langchain.com, indicating the request comes from the LangSmith Studio UI and carries x-auth-scheme: langsmith.
Adding @dqbd and @lc-arjun.
@a1095753788 the error you mentioned is unrelated and should be resolved now
The same problem
Same here.
Try disabling adblock or security extensions.
This happened to me using Safari. I switched to chrome, that fixed it.
> try the disabling adblock or security extensions

It doesn't work.
Same here using Firefox on Mac connecting to an instance of ollama-deep-research running locally (but on a different machine) in a Linux VM.
Ollama is up and running and I can reach it from my Firefox. I can also reach the /docs endpoint of ollama-deep-research. Not using any privacy extensions or ad blockers in my Firefox. It also doesn't work in Chrome, but the error message is slightly different: "TypeError: Failed to fetch" instead of "TypeError: NetworkError when attempting to fetch resource".
Try replacing 127.0.0.1 with localhost, at least it works for me. https://smith.langchain.com/studio/thread?baseUrl=http%3A%2F%2Flocalhost%3A2024
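Note that the baseUrl query parameter in that link is percent-encoded. A quick standard-library sketch of how the encoded Studio URL above is built:

```python
from urllib.parse import quote

# Percent-encode the local API server address (safe="" also encodes "/")
# so it can be passed as the baseUrl query parameter of the Studio URL.
base_url = quote("http://localhost:2024", safe="")
studio_url = f"https://smith.langchain.com/studio/thread?baseUrl={base_url}"
print(studio_url)
# -> https://smith.langchain.com/studio/thread?baseUrl=http%3A%2F%2Flocalhost%3A2024
```

Swapping `localhost` for `127.0.0.1` in the string above produces the original failing link.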
Still not fixed; not even Firefox works for me. At first it partially worked: the error didn't show until I entered a prompt, SearXNG couldn't be reached, and I started troubleshooting.
Since then I get this error every time and seemingly nothing works. I also think the installation guidance for the Docker container is otherwise outdated.
Please try to reproduce and fix this error.
EDIT: deployed it using the LangGraph Studio method. The Docker method does not work.
Can confirm, Failed to load assistants with Safari, works with Chrome/Firefox.
> This happened to me using Safari. I switched to chrome, that fixed it.
>
> https://smith.langchain.com/studio/thread?baseUrl=http%3A%2F%2Flocalhost%3A2024

Man, your fix works, thanks!
For me, Brave's adblocker and security features were blocking it.
Same here with Safari, but Chrome worked fine.
I faced a similar issue running from local VS Code; it turned out the env vars needed to be set correctly:
- Generate the required API keys.
- Open an integrated terminal in the main project folder.
- Run (PowerShell): $env:LANGSMITH_API_KEY = "your api key" and $env:OPENAI_API_KEY = "your api key"
- Check that it is set using: echo $env:LANGSMITH_API_KEY
- From the SAME terminal, navigate to the studio folder.
- Run: langgraph dev
This fixed the issue for me
Just use the --tunnel flag; it serves the local API through an HTTPS tunnel, which avoids the browser (notably Safari) blocking plain-HTTP requests to 127.0.0.1 from the HTTPS Studio page:
langgraph dev --tunnel
> https://smith.langchain.com/studio/thread?baseUrl=http%3A%2F%2Flocalhost%3A2024
>
> man your fix works bro thanks

It's working for me.