
Flow returns cached output from playground

Open sangdth opened this issue 1 year ago • 3 comments

Bug Description

After building the flow and testing it in the Playground, the API call returns the last response from the Playground instead of a fresh result.

Reproduction

  1. Create the flow.
  2. Run a test in the Playground.
  3. Call the flow through the API link.
  4. The API still returns the old Playground output.
[Screenshots attached: 2024-07-23 at 13:24:54 and 13:25:13]

Extract links (1).json

Expected behavior

Expect to see the correct output.

Who can help?

No response

Operating System

Ubuntu

Langflow Version

1.0.11

Python Version

None

sangdth avatar Jul 23 '24 10:07 sangdth

To resolve the issue where the API call returns cached output from the Playground instead of the expected new output, you need to ensure that the API call is not using cached data. This can be done by adding cache-busting techniques to your API call. One common method is to append a unique query parameter to the API request URL.

In the provided code, you can modify the getComponent function call to include a timestamp or a random number as a query parameter to prevent caching. Here is an example of how you can do this:

async function getFlowData() {
  const res = await getComponent(`${id!}?_=${new Date().getTime()}`);
  const newFlow = cloneFLowWithParent(res, res.id, false, true);
  return newFlow;
}

By appending ?_=${new Date().getTime()} to the URL, you ensure that each request URL is unique, thereby preventing the browser or any intermediate cache from serving a cached response.

Additionally, in the Langflow Workspace, you can use the Refresh All option in the top navigation bar to refresh all components and delete cache. This can be particularly useful if you are working within the Langflow interface and want to ensure that all components are up-to-date and not using any cached data [1][2].
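If the stale response is coming from the browser's HTTP cache, a sketch of an alternative to the timestamp trick is the standard Fetch API `cache` option (the function name and URL here are illustrative, not Langflow code):

```javascript
// Sketch (assumption): ask the browser to bypass its HTTP cache
// entirely via the standard Fetch API `cache: "no-store"` option,
// instead of appending a cache-busting query parameter.
async function getFlowDataNoCache(url) {
  const res = await fetch(url, { cache: "no-store" });
  return res.json();
}
```

This only helps when the caching happens client-side; it has no effect on any caching done by the server itself.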

To continue talking to Dosu, mention @dosu.

dosubot[bot] avatar Jul 23 '24 10:07 dosubot[bot]

Hi @sangdth,

Could you please share your flow? If you are using any components with caching mechanisms (such as an LLM), that might be causing the issue. If not, it could be a bug. Sharing your flow will help us address this as soon as possible.

Thank you!

anovazzi1 avatar Jul 24 '24 21:07 anovazzi1

Hi @anovazzi1, I attached the JSON in the issue; you can scroll up a bit to see it. Or I can re-upload it here: Extract links (1).json

Is there a way to disable the cache? Our system uses it internally, so we do not have a high number of requests.

sangdth avatar Jul 25 '24 06:07 sangdth

I apologize for the delay. I have identified the root cause of the issue. The problem does not lie with the cache but with our API code. The input type you are using is a Text Input, not a Chat Input, so the value "input_type": "chat" is incorrect and should be "input_type": "text". I will fix this in our generated code. Thank you for bringing this to our attention.
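For reference, the corrected request can be sketched like this (the base URL, flow ID, and endpoint path are placeholders; copy the exact snippet from your flow's API pane):

```javascript
// Hypothetical base URL and flow ID; replace with your own values.
const BASE_URL = "http://localhost:7860";
const FLOW_ID = "your-flow-id";

function buildRunPayload(inputValue) {
  // The key fix from this thread: the flow uses a Text Input
  // component, so input_type must be "text", not "chat".
  return {
    input_value: inputValue,
    input_type: "text",
    output_type: "chat",
  };
}

console.log(JSON.stringify(buildRunPayload("Extract links")));
// To send the request:
// fetch(`${BASE_URL}/api/v1/run/${FLOW_ID}`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildRunPayload("Extract links")),
// });
```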

anovazzi1 avatar Aug 01 '24 21:08 anovazzi1

Thanks @anovazzi1, I specified the "input_type": "text" and it works well now!

sangdth avatar Aug 08 '24 17:08 sangdth