Error: Requested tokens exceed context window of 512
I'm trying to use the Vector Store example. I changed some nodes, so I'm using TextLoader, HuggingFaceEmbeddings, and LlamaCpp instead of OpenAI.
When I try to ask a simple question about the text, I get this error:
```
C:\Users\kelhe\anaconda3\lib\site-packages\langflow\api\chat_manager.py:215 in process_graph

    212     # Generate result and thought
    213     try:
    214         logger.debug("Generating result and thought")
❱   215         result, intermediate_steps = await get_result_and_steps(
    216             langchain_object, chat_message.message or "", websocket=we
    217         )
    218         logger.debug("Generated result and intermediate_steps")

C:\Users\kelhe\anaconda3\lib\site-packages\langflow\interface\run.py:198 in get_result_and_steps

    195         )
    196         thought = format_actions(intermediate_steps) if intermediate_s
    197     except Exception as exc:
❱   198         raise ValueError(f"Error: {str(exc)}") from exc
    199     return result, thought
    200
    201

ValueError: Error: Requested tokens exceed context window of 512
```
I guess the problem is this parameter: `llama_model_load_internal: n_ctx = 512`, but I can't find any place to change it. Setting the Max Tokens param to 2048 did not change anything.
How can I increase the context window?
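In case it helps while waiting for a fix: as far as I can tell, llama.cpp sets its context window at model-load time via `n_ctx` (which defaults to 512), while Max Tokens only caps the length of the generated output. Here is a minimal sketch outside the Langflow UI, assuming LangChain's `LlamaCpp` wrapper and a placeholder model path:

```python
# Minimal sketch: the llama.cpp context window is fixed when the model is loaded (n_ctx);
# max_tokens only limits the generated completion, which is why changing it had no effect.
from langchain.llms import LlamaCpp

llm = LlamaCpp(
    model_path="path/to/your/model.bin",  # placeholder path to a local GGML/GGUF model
    n_ctx=2048,      # context window; the default of 512 triggers the error above
    max_tokens=256,  # output length cap; does not change the context window
)

print(llm("What is the document about?"))
```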
Same issue here.
I am having the exact same issue. In Langflow, only ChatGPT works; it's a conspiracy to use ChatGPT only! LOL... I love the concept of Langflow for prototyping, but I expected it to work...
Hey @kelheor
Our next release will give a lot more flexibility in choosing which fields appear in the node.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This is already fixed on the latest release.