
LLM writes into a non-existent file, and prompts may be leaking

yufansong opened this issue 1 year ago • 5 comments

Describe the bug

In #1063 and in some other feedback, the LLM sometimes outputs a wrong file path. Per the discussion in #1070, the LLM is supposed to work in our Docker env, so it should see the actual file directory. Opening this issue to track the related problems.

Setup and configuration

Current version:

commit e0c7492609bf8ebdbd7270c9e9261bf2944f3bc6 (HEAD -> main, origin/main, origin/HEAD)
Author: Boxuan Li <[email protected]>
Date:   Fri Apr 12 12:01:52 2024 -0700

    Traffic Control: Add new config MAX_CHARS (#1015)

    * Add new config MAX_CHARS

    * Fix mypy linting issues


My operating system:

My environment vars and other configuration (be sure to redact API keys):

LLM_API_KEY="ollama"
LLM_MODEL="ollama/atlas:002"
LLM_EMBEDDING_MODEL="local"
LLM_BASE_URL="http://localhost:11434"
WORKSPACE_DIR="./workspace/"
SANDBOX_TYPE="exec"

My model and agent (you can see these settings in the UI):

  • Model: A local model
  • Agent:

Commands I ran to install and run OpenDevin:

cd ~/$OpenDevin/ && poetry run python opendevin/main.py -d ./workspace/ -t "create a script that says hi"

Steps to Reproduce:

  1. Run OpenDevin.
  2. Run into a bunch of file operation path errors like the one below.

Logs, error messages, and screenshots:

==============
STEP 9

PLAN:
write a bash script that prints hi

ACTION:
FileWriteAction(path='./workspace/memory.txt', content="I'm going to create a file called 'memory.txt' to store my thoughts.", action=<ActionType.WRITE: 'write'>)

ERROR:
[Errno 2] No such file or directory: './workspace/./workspace/memory.txt'
Traceback (most recent call last):
  File "/home/atlas/OpenDevin/opendevin/controller/agent_controller.py", line 186, in step
    if inspect.isawaitable(action.run(self)):
                           ^^^^^^^^^^^^^^^^
  File "/home/atlas/OpenDevin/opendevin/action/fileop.py", line 43, in run
    with open(whole_path, "w", encoding="utf-8") as file:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: './workspace/./workspace/memory.txt'

OBSERVATION:
[Errno 2] No such file or directory: './workspace/./workspace/memory.txt'
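
The failing path suggests fileop.py joins the configured workspace directory onto an action path that already starts with ./workspace/, and the resulting ./workspace/./workspace/ directory does not exist on disk. Below is a minimal sketch of one possible guard; the resolve_in_workspace and write_file helpers are hypothetical, not the actual fileop.py code:

from pathlib import Path

WORKSPACE_DIR = "./workspace/"  # same value as in the config above

def resolve_in_workspace(path: str) -> Path:
    # Anchor an LLM-supplied path inside the workspace, stripping a repeated
    # "workspace" prefix so we never build "./workspace/./workspace/...".
    base = Path(WORKSPACE_DIR).resolve()
    parts = [p for p in Path(path).parts if p not in (".", "/", "")]
    if parts and parts[0] == base.name:
        parts = parts[1:]
    return base.joinpath(*parts)

def write_file(path: str, content: str) -> None:
    target = resolve_in_workspace(path)
    # Creating parent directories also avoids the FileNotFoundError seen above.
    target.parent.mkdir(parents=True, exist_ok=True)
    with open(target, "w", encoding="utf-8") as f:
        f.write(content)

# The path the agent produced in Step 9:
write_file("./workspace/memory.txt", "I'm going to create a file called 'memory.txt'.")
# -> writes ./workspace/memory.txt instead of failing on ./workspace/./workspace/memory.txt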

Additional Context

yufansong, Apr 16 '24 17:04

Thanks @yufansong! Can you share which model/agent?

I'm trying to figure out how to set up logging so we can see the full prompt/response again--that would be super helpful. We need to figure out where ./workspace got sent to the LLM.

My guess is that there was an error message higher above Step 9, which included ./workspace

rbren, Apr 16 '24 17:04

Looks like LLM logs are stored in ./logs/llm/$DATE/--if you're able to look through those that'd be super helpful

rbren, Apr 16 '24 17:04

> Thanks @yufansong! Can you share which model/agent?

From the user's configuration, it's ollama/atlas:002. I've added it to the description.

> I'm trying to figure out how to set up logging so we can see the full prompt/response again--that would be super helpful. We need to figure out where ./workspace got sent to the LLM.
>
> My guess is that there was an error message higher above Step 9, which included ./workspace

I also cannot reproduce it locally, but I remember seeing related issues like this several times. I will check with the user for more details and update this issue with any new progress.

yufansong, Apr 16 '24 18:04

> Looks like LLM logs are stored in ./logs/llm/$DATE/--if you're able to look through those that'd be super helpful

You also need DEBUG=1 in .env, or otherwise enabled in the environment. It's undocumented; I'll add it to Development.md now.
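
For anyone digging through those dumps, here is a minimal sketch, assuming DEBUG=1 was enabled and the files under ./logs/llm/$DATE/ are plain-text prompt/response dumps (the exact layout may differ), that looks for where ./workspace leaked into a prompt:

from pathlib import Path

LOG_ROOT = Path("./logs/llm")

# Scan every dumped prompt/response file and report the ones that mention
# the workspace prefix, so we can see which message first sent it to the LLM.
for log_file in sorted(LOG_ROOT.rglob("*")):
    if log_file.is_file() and "./workspace" in log_file.read_text(errors="ignore"):
        print(log_file)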

enyst, Apr 16 '24 18:04

> I'm trying to figure out how to set up logging so we can see the full prompt/response again--that would be super helpful.

That must be because it wasn't working right. It has had a round of fixes now.

enyst, Apr 16 '24 20:04