
Goose Mode Bug with Databricks Llama: Model response did not respect the required format #1161

Open wendytang opened this issue 9 months ago • 1 comment

Describe the bug
When using GOOSE_MODE=approve with the model databricks-meta-llama-3-3-70b-instruct, I run into this error:

/mode approve
◇  Goose would like to call the above tool, do you approve?
│  Yes
│
─── text_editor | developer ──────────────────────────
path: /tmp/hi.txt
command: write
new_str: hi)','file_text':'hi

◇  Goose would like to call the above tool, do you approve?
│  Yes
│

─── text_editor | developer ──────────────────────────
path: /tmp/hi.txt
command: write
new_str: ,

─── text_editor | developer ──────────────────────────
path: /tmp/hi.txt
command: write
new_str: ,


◇  Goose would like to call the above tool, do you approve?
│  Yes
│

◐  Wrangling widgets...
2025-03-07T20:00:10.977662Z ERROR goose::agents::truncate: Error: Request failed: Request failed with status: 400 Bad Request. Message: Error: Model response did not respect the required format. Please consider retrying or using a more straightforward prompt.

    at crates/goose/src/agents/truncate.rs:444

I ran into this error: Request failed: Request failed with status: 400 Bad Request. Message: Error: Model response did not respect the required format. Please consider retrying or using a more straightforward prompt.

Related to Issue 1161

To Reproduce
Steps to reproduce the behavior:

  1. goose configure // configure databricks llama model
  2. goose session
  3. /mode approve
  4. write an example text file for me

Expected behavior
It looks like a timeout or issue with

Screenshots
If applicable, add screenshots to help explain your problem.

Please provide the following information:

  • OS & Arch: [e.g. mac m1 max]
  • Interface: [UI]
  • Version: [latest]
  • Extensions enabled: [none]
  • Provider & Model: [databricks-meta-llama-3-3-70b-instruct]

Additional context

  • I wonder if anyone knows whether this is a Llama-specific problem.

wendytang · Mar 07 '25 20:03

I think it is a Llama-specific issue; other users have reported it as well. It seems like Llama doesn't make a proper structured tool call and instead returns the tool call as JSON text in its response, but I may be wrong.
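To illustrate what I mean: a well-formed response carries the tool call in the provider's structured tool-call field, whereas what I suspect Llama is doing is emitting that same tool call as plain JSON text. Below is a rough sketch, not goose's actual code, of a client-side fallback that tries to reinterpret such a text reply as a tool call; the parse_text_as_tool_call helper and the "name"/"arguments" field names are assumptions made up for illustration.

    // Hypothetical sketch only; requires serde_json as a dependency.
    use serde_json::Value;

    /// Try to interpret a plain-text model reply as a tool call.
    /// Returns Some((tool_name, arguments)) when the text is a JSON object
    /// with the assumed "name" and "arguments" fields.
    fn parse_text_as_tool_call(text: &str) -> Option<(String, Value)> {
        let value: Value = serde_json::from_str(text.trim()).ok()?;
        let name = value.get("name")?.as_str()?.to_string();
        let arguments = value.get("arguments")?.clone();
        Some((name, arguments))
    }

    fn main() {
        // A reply of the kind described above: tool-call JSON emitted as text.
        let reply = r#"{"name": "text_editor", "arguments": {"command": "write", "path": "/tmp/hi.txt", "file_text": "hi"}}"#;
        match parse_text_as_tool_call(reply) {
            Some((name, args)) => println!("recovered tool call: {name} {args}"),
            None => println!("reply is not a recognizable tool call"),
        }
    }

This only shows the shape of the mismatch; since the 400 above comes back from the Databricks endpoint itself, the real fix may need to happen in the request/response handling rather than in a client-side parse.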

yingjiehe-xyz · Mar 07 '25 20:03