Calling `langgraph up`, I get the error `langgraph-api-1 | FileNotFoundError: [Errno 2] No such file or directory: '/deps/langgraph-example-pyproject/my_agent\\agent.py'`
Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangGraph/LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangGraph/LangChain rather than my code.
- [X] I am sure this is better as an issue rather than a GitHub discussion, since this is a LangGraph bug and not a design question.
Example Code
langgraph up
Error Message and Stack Trace (if applicable)
langgraph-api-1 | 2024-10-09T06:31:32.686169Z [error ] Traceback (most recent call last):
langgraph-api-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 693, in lifespan
langgraph-api-1 | async with self.lifespan_context(app) as maybe_state:
langgraph-api-1 | File "/usr/local/lib/python3.11/contextlib.py", line 210, in __aenter__
langgraph-api-1 | return await anext(self.gen)
langgraph-api-1 | ^^^^^^^^^^^^^^^^^^^^^
langgraph-api-1 | File "/api/langgraph_api/lifespan.py", line 23, in lifespan
langgraph-api-1 | File "/api/langgraph_api/shared/graph.py", line 205, in collect_graphs_from_env
langgraph-api-1 | File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/config.py", line 590, in run_in_executor
langgraph-api-1 | return await asyncio.get_running_loop().run_in_executor(
langgraph-api-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
langgraph-api-1 | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
langgraph-api-1 | result = self.fn(*self.args, **self.kwargs)
langgraph-api-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
langgraph-api-1 | File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/config.py", line 581, in wrapper
langgraph-api-1 | return func(*args, **kwargs)
langgraph-api-1 | ^^^^^^^^^^^^^^^^^^^^^
langgraph-api-1 | File "/api/langgraph_api/shared/graph.py", line 233, in _graph_from_spec
langgraph-api-1 | File "<frozen importlib._bootstrap_external>", line 936, in exec_module
langgraph-api-1 | File "<frozen importlib._bootstrap_external>", line 1073, in get_code
langgraph-api-1 | File "<frozen importlib._bootstrap_external>", line 1130, in get_data
langgraph-api-1 | FileNotFoundError: [Errno 2] No such file or directory: '/deps/langgraph-example-pyproject/my_agent\\agent.py'
langgraph-api-1 | [uvicorn.error] api_revision=e09c235 api_variant=licensed
langgraph-api-1 | 2024-10-09T06:31:32.686561Z [error ] Application startup failed. Exiting. [uvicorn.error] api_revision=e09c235 api_variant=licensed
langgraph-api-1 exited with code 3
Description
I want to test the graph locally first before deploying, and when I run `langgraph up` I get the following error: `langgraph-api-1 | FileNotFoundError: [Errno 2] No such file or directory: '/deps/langgraph-example-pyproject/my_agent\\agent.py'`. When I encountered this issue I cloned the example project and attempted to serve it up, but I got the same error. Might it be because I am using Windows? The `\\` in the path looks like a Windows path separator, but it shouldn't matter since the server runs in a Docker container.
System Info
langgraph.json
```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./my_agent/agent.py:graph"
  },
  "env": ".env"
}
```
pyproject.toml
```toml
[tool.poetry]
name = "my_agent"
version = "0.1.0"
description = "Example LangGraph project for deployment to LangGraph Cloud"
authors = ["langchain-ai"]
packages = [
    { include = "my_agent" },
]

[tool.poetry.dependencies]
python = ">=3.9.0,<3.13"
langgraph = "^0.2.0"
langchain_anthropic = "^0.1.0"
langchain_core = "^0.2.0"
langchain_openai = "^0.1.0"
tavily-python = "^0.3.0"
langchain_community = "^0.2.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```
OS Name: Microsoft Windows 10 Enterprise, Version 10.0.19045, Build 19045

❯ langchain --version
langchain-cli 0.0.31
This is the config for the other project:
```toml
[tool.poetry]
name = "quinn_agent"
version = "0.1.0"
description = ""
readme = "README.md"
packages = [
    { include = "quinn_agent" },
]

[tool.poetry.dependencies]
python = ">=3.12,<3.13"
langchain = "^0.3.2"
langchain-openai = "^0.2.2"
langchainhub = "^0.1.21"
langchain-pinecone = "^0.2.0"
python-dotenv = "^1.0.1"
langgraph = "^0.2.34"
neo4j = "^5.25.0"
lark = "^1.2.2"
pydantic = "^2.9.2"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```
{ "python_version": "3.12", "dependencies": [ "." ], "graphs": { "agent": "./quinn_agent/agent.py:graph" }, "env": ".env" }
File structure
@Andrei-Tocut yes, Windows is not currently supported but we're working on adding support
+1
+1 I see the error below... it tries to use npm to install the Python dependencies instead of pip:
[langgraph-api 3/5] RUN cd /deps/langgraph_001 && npm i:
This should be fixed now, but feel free to reopen if you're still running into it. We also now recommend using langgraph dev for local testing: https://langchain-ai.github.io/langgraph/tutorials/langgraph-platform/local-server/
I am still facing the same relative-path error on Windows; any help would be appreciated.
As a temporary fix, you can create a `main.py` in the same directory as `langgraph.json` with:
```python
from agents.graph import create_workflow

# useful for LangGraph Server
graph = create_workflow()
```
And tweak the config to match the path with:

```json
{
  "python_version": "3.12",
  "dockerfile_lines": [],
  "dependencies": [
    "."
  ],
  "graphs": {
    "agent": "./main.py:graph"
  },
  "env": ".env"
}
```
PS: This works perfectly fine for me
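For anyone adapting this workaround: below is a minimal, hypothetical sketch of what the `agents/graph.py` module imported by the `main.py` above could look like. The `State` schema and `respond` node are placeholders for illustration, not the actual project code.

```python
# agents/graph.py (hypothetical) — exposes create_workflow() so that main.py
# can build the compiled graph that langgraph.json points at ("./main.py:graph").
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class State(TypedDict):
    messages: list


def respond(state: State) -> dict:
    # Placeholder node: echo the input messages; replace with real agent logic.
    return {"messages": state["messages"]}


def create_workflow():
    builder = StateGraph(State)
    builder.add_node("respond", respond)
    builder.add_edge(START, "respond")
    builder.add_edge("respond", END)
    return builder.compile()
```

The key point is that `create_workflow()` returns a compiled graph, so the `graph` variable in `main.py` is exactly what the `"./main.py:graph"` entry in `langgraph.json` expects.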
> This should be fixed now, but feel free to reopen if you're still running into it. We also now recommend using langgraph dev for local testing
@vbarda Looks like the fix (I'm presuming it's https://github.com/langchain-ai/langgraph/pull/2480) was merged into a separate branch, not main, and hasn't been shipped in a release. Can we re-open this issue until the fix has made its way into a release?
Edit: Looks like it was fixed for `langgraph dev`, but if you run `langgraph build -t <image_name>` and then use a docker-compose file to run the graph, it still fails.
In the output of `langgraph build -t <img_name>`:
For Windows users, the workaround is to run the same `langgraph build` command from WSL (Windows Subsystem for Linux).
It has been several months; hasn't this been fixed yet? I still encounter this problem now.
bumping this, facing a similar issue
Will resolve next week. If you're blocked by this you can use WSL in the meantime (https://en.wikipedia.org/wiki/Windows_Subsystem_for_Linux).
Should be fixed in https://github.com/langchain-ai/langgraph/pull/3318
I have a similar problem, shown below, when I deploy my graph to LangGraph Platform Studio.
My pyproject.toml:

```toml
[tool.poetry]
name = "sample_agent"
version = "0.1.0"
description = "Starter"
authors = ["Markus Ecker [email protected]"]
license = "MIT"

[project]
name = "sample_agent"
version = "0.0.1"
requires-python = ">=3.10,<4.0"
dependencies = [
    "langchain-openai>=0.2.1",
    "langchain-anthropic>=0.2.1",
    "langchain>=0.3.1",
    "openai>=1.51.0",
    "langchain-community>=0.3.1",
    "copilotkit==0.1.39",
    "uvicorn>=0.31.0",
    "python-dotenv>=1.0.1",
    "langchain-core>=0.3.25",
    "langgraph-cli[inmem]>=0.1.64",
    "langchain-mcp-adapters (==0.0.5)",
    "fastmcp>=0.4.1",
    "langgraph (>=0.3.11,<0.4.0)",
]

[build-system]
requires = ["setuptools >= 61.0"]
build-backend = "setuptools.build_meta"

[tool.poetry.dependencies]
python = ">=3.10,<4.0"
langchain-openai = "^0.2.1"
langchain-anthropic = "^0.2.1"
langchain = "^0.3.1"
openai = "^1.51.0"
langchain-community = "^0.3.1"
copilotkit = "0.1.39"
uvicorn = "^0.31.0"
python-dotenv = "^1.0.1"
langchain-core = "^0.3.25"
langgraph-cli = {extras = ["inmem"], version = "^0.1.64"}
langchain-mcp-adapters = "^0.0.3"
fastmcp = "^0.4.1"
langgraph = "^0.3.5"

[tool.poetry.scripts]
demo = "sample_agent.demo:main"

[tool.ruff]
lint.select = [
    "E",    # pycodestyle
    "F",    # pyflakes
    "I",    # isort
    "D",    # pydocstyle
    "D401", # First line should be in imperative mood
    "T201",
    "UP",
]
lint.ignore = [
    "UP006",
    "UP007",
    # We actually do want to import from typing_extensions
    "UP035",
    # Relax the convention by not requiring documentation for every function parameter.
    "D417",
    "E501",
]

[tool.ruff.lint.per-file-ignores]
"tests/*" = ["D", "UP"]

[tool.ruff.lint.pydocstyle]
convention = "google"

[tool.mypy]
ignore_missing_imports = true
```

Below is the error message. But LangGraph works normally when running locally.
```
[ERROR] Background run failed
Traceback (most recent call last):
  File "/api/langgraph_api/worker.py", line 128, in worker
  File "/usr/local/lib/python3.12/asyncio/tasks.py", line 520, in wait_for
    return await fut
           ^^^^^^^^^
  File "/api/langgraph_api/stream.py", line 293, in consume
  File "/api/langgraph_api/stream.py", line 283, in consume
  File "/api/langgraph_api/stream.py", line 234, in astream_state
  File "/api/langgraph_api/asyncio.py", line 72, in wait_if_not_done
  File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2313, in astream
    async for _ in runner.atick(
  File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/runner.py", line 527, in atick
    _panic_or_proceed(
  File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/runner.py", line 619, in _panic_or_proceed
    raise exc
  File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/retry.py", line 128, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 583, in ainvoke
    input = await step.ainvoke(input, config, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 371, in ainvoke
    ret = await asyncio.create_task(coro, context=context)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/deps/mcp_langgraph2025/src/custom_agent/graph.py", line 47, in chat_node
    async with MultiServerMCPClient(mcp_config) as mcp_client:
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_mcp_adapters/client.py", line 256, in __aenter__
    await self.connect_to_server_via_stdio(server_name, **connection_dict)
  File "/usr/local/lib/python3.12/site-packages/langchain_mcp_adapters/client.py", line 196, in connect_to_server_via_stdio
    stdio_transport = await self.exit_stack.enter_async_context(stdio_client(server_params))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/contextlib.py", line 659, in enter_async_context
    result = await _enter(cm)
             ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/mcp/client/stdio.py", line 100, in stdio_client
    process = await anyio.open_process(
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/anyio/_core/_subprocesses.py", line 190, in open_process
    return await get_async_backend().open_process(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 2561, in open_process
    process = await asyncio.create_subprocess_exec(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/asyncio/subprocess.py", line 224, in create_subprocess_exec
    transport, protocol = await loop.subprocess_exec(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "uvloop/loop.pyx", line 2841, in subprocess_exec
  File "uvloop/loop.pyx", line 2799, in __subprocess_run
  File "uvloop/handles/process.pyx", line 611, in uvloop.loop.UVProcessTransport.new
  File "uvloop/handles/process.pyx", line 112, in uvloop.loop.UVProcess._init
FileNotFoundError: [Errno 2] No such file or directory
During task with name 'chat_node' and id '11b2b974-3362-762e-375d-e5a1df378a6e'
```
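For reference, the `FileNotFoundError` in this trace is raised while the MCP stdio transport tries to spawn the configured server command as a subprocess inside the API container. Below is a minimal sketch of the kind of `mcp_config` involved; the server name, command, and path are assumptions for illustration, not the actual config from this project.

```python
# Hypothetical mcp_config passed to MultiServerMCPClient in chat_node above.
# With transport="stdio" the client spawns `command` as a local subprocess,
# so that executable must exist inside the LangGraph API Docker image; if it
# only exists on the host machine (e.g. "npx" or an absolute host path), the
# spawn fails with FileNotFoundError even though the same config works locally.
mcp_config = {
    "my_tools": {
        "command": "python",                         # must be on PATH inside the image
        "args": ["/deps/my_project/mcp_server.py"],  # path as seen inside the container
        "transport": "stdio",
    },
    # Alternative sketch: connect to an MCP server over SSE so no subprocess is spawned.
    # "remote_tools": {
    #     "url": "http://my-mcp-server:8000/sse",
    #     "transport": "sse",
    # },
}
```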
@willy20040711 Did you find a solution for this yet? I am running into the same error when running the Docker image. It works fine with `langgraph dev`.