
Error: Requested tokens exceed context window of 512

Open kelheor opened this issue 1 year ago • 3 comments

I'm trying to use the Vector Store example. I changed some nodes, so I'm using TextLoader, HuggingFaceEmbeddings, and LlamaCpp instead of OpenAI.

When I try to ask a simple question about the text, I get this error:

```
C:\Users\kelhe\anaconda3\lib\site-packages\langflow\api\chat_manager.py:215 in process_graph

    212   # Generate result and thought
    213   try:
    214       logger.debug("Generating result and thought")
❱   215       result, intermediate_steps = await get_result_and_steps(
    216           langchain_object, chat_message.message or "", websocket=we
    217       )
    218       logger.debug("Generated result and intermediate_steps")

C:\Users\kelhe\anaconda3\lib\site-packages\langflow\interface\run.py:198 in get_result_and_steps

    195       )
    196       thought = format_actions(intermediate_steps) if intermediate_s
    197   except Exception as exc:
❱   198       raise ValueError(f"Error: {str(exc)}") from exc
    199   return result, thought

ValueError: Error: Requested tokens exceed context window of 512
```

I guess the problem is this parameter: `llama_model_load_internal: n_ctx = 512`, but I can't find any place to change it. Setting the Max Tokens param to 2048 did not change anything.

How can I increase the context window?
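For reference, in LangChain's `LlamaCpp` wrapper the context window is the `n_ctx` parameter, which is fixed when the model is loaded; it is separate from `max_tokens`, which only caps how many tokens are generated per response. A minimal sketch of what I'd expect to be able to configure (the model path is hypothetical):

```python
from langchain.llms import LlamaCpp

llm = LlamaCpp(
    model_path="/path/to/model.bin",  # hypothetical local model path
    n_ctx=2048,      # context window size; this is what shows up as n_ctx = 512 in the log
    max_tokens=256,  # tokens to generate per response, NOT the context window
)
```

So raising "Max Tokens" in the UI changes `max_tokens`, but the error here comes from `n_ctx`, which Langflow's LlamaCpp node doesn't currently expose.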

kelheor avatar Jun 03 '23 03:06 kelheor

Same issue here.

DaiZack avatar Jun 08 '23 13:06 DaiZack

I am having the exact same issue. In Langflow, only ChatGPT works; it's a conspiracy to use ChatGPT only! LOL... I love the concept of Langflow for prototyping, but I expected it to work...

maggenti avatar Jun 08 '23 19:06 maggenti

Hey @kelheor

Our next release will give you a lot more flexibility to choose which fields appear in the node.

ogabrielluiz avatar Jun 08 '23 21:06 ogabrielluiz

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] avatar Jul 23 '23 22:07 stale[bot]

This is already fixed on the latest release.

lucaseduoli avatar Jul 25 '23 14:07 lucaseduoli