WrenAI
feature(wren-ai-service): refactor the query understanding pipe with async
This is a draft PR to allow members to test and compare the performance of synchronous and asynchronous LLM components.
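As background, the rough idea behind the comparison is sketched below. The helper names are hypothetical and are not the actual wren-ai-service components; they only illustrate why non-blocking LLM calls can improve throughput when several requests arrive concurrently.

```python
import asyncio
import time

# Hypothetical stand-ins for a blocking vs. a non-blocking LLM call.
def sync_generate(prompt: str) -> str:
    time.sleep(1)  # simulates a blocking LLM request
    return f"answer to: {prompt}"

async def async_generate(prompt: str) -> str:
    await asyncio.sleep(1)  # simulates a non-blocking LLM request
    return f"answer to: {prompt}"

async def main() -> None:
    prompts = ["q1", "q2", "q3"]

    # Synchronous calls run one after another (~3 s in total here).
    start = time.perf_counter()
    for p in prompts:
        sync_generate(p)
    print(f"sync:  {time.perf_counter() - start:.2f}s")

    # Asynchronous calls overlap on the event loop (~1 s in total here).
    start = time.perf_counter()
    await asyncio.gather(*(async_generate(p) for p in prompts))
    print(f"async: {time.perf_counter() - start:.2f}s")

asyncio.run(main())
```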
- The Locust file is located in the `tools` folder. You can execute the following command to start Locust (a minimal sketch of such a file appears after this list): `poetry run locust -f ./tools/locust_pipe.py`
- To try the synchronous LLM, check out commit `aca00d6`.
- The asynchronous LLM is in the current head commit.
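For reference, here is a minimal sketch of what a Locust file for this kind of test could look like. The endpoint path, payload, and class name are placeholders for illustration; the real task definitions live in `tools/locust_pipe.py`.

```python
from locust import HttpUser, task, between


class QueryUnderstandingUser(HttpUser):
    """Simulated client that repeatedly sends natural-language questions."""

    wait_time = between(1, 2)  # pause 1-2 s between tasks

    @task
    def ask(self):
        # Placeholder endpoint and payload; see tools/locust_pipe.py for the real ones.
        self.client.post("/v1/asks", json={"query": "How many orders were placed last month?"})
```

Running `poetry run locust -f ./tools/locust_pipe.py` starts the Locust web UI (by default at http://localhost:8089), where you can set the user count and spawn rate, then compare latency and throughput between the synchronous commit and the current head.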