
[Bug]: Reprompting broken for local llm

Open Wladastic opened this issue 10 months ago • 7 comments

Version

VisualStudio Code extension

Operating System

Windows 11

What happened?

On Mac, Linux, and Windows 11 with WSL2 Ubuntu, I get the following bug.

Whenever a longer output is expected from an agent, GPT-Pilot forces it way beyond its token limit. With Hermes-Mistral-7B-Pro, for example, the output looks like this:

                },
                "human_intervention_description": "Create a directory in root.",
            },
        },
    ]
}
}

 
"
"Tower
##
##
                
##
## 1/
##  "Add  "Rem
##
    "Data
​
###  "Redissh
##  "L1
   "Depar---designer
    "and
  "Designb1 0 proposed
##
   Dynamic showed
"
    "Current 
    "F###
"D
  "##
#### 

##
## <dummy00014>w4 1
d  ion
##
##
   A------- - -F
 
## 
   	Accid0   Custom
        "
    */
                
## 	b   B This
##
                
   A 

I have no idea how to fix this yet. Oobabooga, for example, shows 17k tokens even though .env says 8192. Same with LM Studio and Ollama.
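One way to stop the server from running past the configured context window is to clamp the completion budget on the client side before the request is sent. This is only a sketch of the idea, not gpt-pilot's actual code; the helper name and the 8192 limit are illustrative, mirroring the .env value mentioned above:

```python
# Clamp the completion budget so that prompt + completion never exceeds
# the context window configured in .env (e.g. MAX_TOKENS=8192).
def clamp_max_tokens(context_limit: int, prompt_tokens: int, requested: int) -> int:
    remaining = max(context_limit - prompt_tokens, 0)
    return min(requested, remaining)

# Example payload for an OpenAI-compatible local server (Oobabooga,
# LM Studio, Ollama all expose this style of API).
payload = {
    "model": "hermes-mistral-7b-pro",
    "max_tokens": clamp_max_tokens(8192, prompt_tokens=7000, requested=4096),
    "messages": [{"role": "user", "content": "..."}],
}
```

With a prompt already at 7000 tokens, the request is capped at 1192 completion tokens instead of the full 4096, so the server never has a reason to generate past the window.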

Wladastic avatar Mar 30 '24 18:03 Wladastic

I have experimented a bit with context length. Cranking up the alpha value seems to have helped a ton (image attached).

Wladastic avatar Mar 30 '24 18:03 Wladastic

Finally found the error in the log:

2024-03-30 19:54:10,788 [llm_connection.py:516 - stream_gpt_completion() ] ERROR: Unable to decode line: : ping - 2024-03-30 18:54:10.748186 Expecting value
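The "Expecting value" error suggests the stream parser is feeding SSE keep-alive lines (`: ping - <timestamp>`) straight into `json.loads`. Lines beginning with a colon are comments under the Server-Sent Events spec and should be skipped before decoding. A minimal sketch of such a filter, assuming an OpenAI-style stream (this is not gpt-pilot's actual `stream_gpt_completion` code):

```python
import json


def parse_sse_line(line: str):
    """Decode one line of an SSE stream.

    Returns the JSON payload of a ``data:`` line, or None for blank
    lines, comments (e.g. ": ping - <timestamp>" keep-alives), and the
    "[DONE]" end-of-stream sentinel.
    """
    line = line.strip()
    if not line or line.startswith(":"):  # SSE comment / keep-alive ping
        return None
    if line.startswith("data:"):
        line = line[len("data:"):].strip()
    if line == "[DONE]":  # OpenAI-style end-of-stream marker
        return None
    return json.loads(line)
```

With this guard, the ping line from the log above is silently dropped instead of raising a JSON decode error mid-stream.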

Wladastic avatar Mar 30 '24 18:03 Wladastic

Issue solved, therefore closing.

techjeylabs avatar Apr 19 '24 14:04 techjeylabs

It's actually not solved yet; I fixed it locally and will push the code changes later this week.

Wladastic avatar Apr 19 '24 14:04 Wladastic

Sorry for the confusion, I was trying to bring some order to the chaos. Waiting for your pull request :)

techjeylabs avatar Apr 19 '24 17:04 techjeylabs