Langflow API slow compared to Playground.
Bug Description
I'm running langflow-1.0.12 as a Docker container, and the Playground seems much faster than invoking the same flow through the webhook. I thought LangSmith was the cause, but although the Playground is much faster without LangSmith, the problem still persists when going through the API.
Reproduction
Run any flow from the Playground and then trigger the same flow via the webhook, timing each as in the sketch below.
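For reference, one rough way to put a number on the API side is to time the request directly. This is only a sketch, assuming Langflow is reachable on the default port and the flow is triggered through the documented /api/v1/run/&lt;flow_id&gt; endpoint; the flow ID, payload, and API key below are placeholders for your own values.

# Rough latency check for a flow invoked through the Langflow API.
# Placeholders: flow ID, API key, and payload must be replaced with your own.
import time
import requests

BASE_URL = "http://localhost:7860"
FLOW_ID = "your-flow-id"                      # placeholder
HEADERS = {"x-api-key": "your-api-key"}       # only needed if auth is enabled

payload = {"input_value": "hello", "input_type": "chat", "output_type": "chat"}

start = time.perf_counter()
resp = requests.post(f"{BASE_URL}/api/v1/run/{FLOW_ID}", json=payload, headers=HEADERS, timeout=120)
elapsed = time.perf_counter() - start

print(f"status={resp.status_code} elapsed={elapsed:.2f}s")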
Expected behavior
The inference time is roughly 3 times slower via the webhook.
Who can help?
No response
Operating System
Ubuntu 22.04
Langflow Version
1.0.12
Python Version
3.12
Screenshot
No response
Flow File
No response
Hey @Frenz86! I'm here to help you with any bugs, questions, or contributions. Let's work together to solve the issue you're facing.
The slower inference time via the webhook in Langflow version 1.0.12, running as a Docker container on Ubuntu 22.04 with Python 3.12, is likely due to the Dockerfile using the python:3.10-slim base image instead of Python 3.12. This mismatch can lead to performance issues.
To resolve this, you should modify the Dockerfile to use Python 3.12. Here is the updated Dockerfile:
# Build stage: reuse the prebuilt Langflow backend wheel
FROM langflowai/backend_build as backend_build
# Runtime stage: switch the base image to Python 3.12
FROM python:3.12-slim
WORKDIR /app
# Install git, which the pip install step below may need
RUN apt-get update && apt-get install git -y
# Copy the wheel from the build stage, install it, then remove it
COPY --from=backend_build /app/dist/*.whl /app/
RUN pip install langflow-*.whl
RUN rm *.whl
# The server listens on 7860 (see CMD), so expose that port
EXPOSE 7860
CMD [ "uvicorn", "--host", "0.0.0.0", "--port", "7860", "--factory", "langflow.main:create_app" ]
By ensuring the Docker container uses Python 3.12, you can align the environment more closely with the one used in the Playground, potentially improving the inference time [1].
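If it helps, one way to confirm the rebuilt image really runs on Python 3.12 is to open a Python shell inside the container (for example with docker exec -it &lt;container&gt; python, where the container name is a placeholder) and check which interpreter the installed package uses:

# Run inside the container to verify the interpreter and the installed package.
import sys
import langflow  # the wheel installed in the Dockerfile above

print(sys.version)        # should report 3.12.x after the base-image change
print(langflow.__file__)  # location of the installed langflow package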
To continue talking to Dosu, mention @dosu.
@Frenz86
Do you need any assistance with this case? If not, please let us know if this issue can be closed.
Try the new version, 1.0.15.
Thank you for your contribution! This issue will be closed. If you have any questions or encounter another problem, please open a new issue and we will be ready to assist you.