Chainlit deployment with multiple workers for production
Hi Team,
Thanks a lot for the wonderful work. I know more documentation will be available in the future, but here are some questions I have, if you can help me.
Is there a way to use multiple workers when deploying to production? Is there a default setting for the number of workers when we run 'chainlit run myapp.py'? Where is the session stored? (If I run with multiple workers, can I assume that a given session will always find its session variables? In the cloud, the worker assigned to a request can change within a single session unless session affinity is in place.)
This is not possible atm, we would have to use gunicorn to spawn uvicorn workers I guess.
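(Presumably the catch is that Chainlit keeps per-session state in the worker process's memory, so with several workers a request could land on a worker that does not hold that session's variables unless you add session affinity or an external store; treat that as an assumption, not a confirmed detail.)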
Thanks!
Could you please help me understand how to run it using gunicorn? Let's say I have my chainlit program available as app.py; I can run it using 'chainlit run app.py'. Now, how do I run it with gunicorn? Are any code changes required?
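A minimal sketch of the kind of wrapper this would involve, assuming a hypothetical wrapper.py next to your app.py and the mount_chainlit helper described at https://docs.chainlit.io/deploy/api; this is not an official recipe, and multi-worker session behaviour is exactly what is in question here:

```python
# wrapper.py (hypothetical) -- expose Chainlit as part of a FastAPI app
# so that a process manager such as gunicorn can import it.
from fastapi import FastAPI
from chainlit.utils import mount_chainlit

app = FastAPI()

# Serve the existing Chainlit script under /chainlit.
mount_chainlit(app=app, target="app.py", path="/chainlit")
```

You would then launch it with something like `gunicorn wrapper:app -w 4 -k uvicorn.workers.UvicornWorker` instead of `chainlit run app.py`.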
uvicorn.Config has a 'workers' parameter; how about exposing it in chainlit?
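To illustrate what that could look like today, here is a sketch that launches the hypothetical wrapper module from earlier in the thread programmatically; note that `workers` only takes effect when the app is passed as an import string, and nothing here is Chainlit-specific:

```python
# run.py (hypothetical) -- start the wrapper app with several uvicorn workers.
import uvicorn

if __name__ == "__main__":
    # The import-string form is required so each worker process can
    # re-import the application on its own.
    uvicorn.run("wrapper:app", host="0.0.0.0", port=8000, workers=4)
```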
Hi Team
We could expose it in chainlit. But can we do something so that, after a certain time, say 10 minutes, session variables are automatically deleted? Please suggest how we could do that.
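There is no built-in expiry that I know of, but as a rough sketch you could stamp the session on each message and clear your own keys once it has been idle too long; the key names ("last_active", "history") and the 10-minute threshold are just placeholders, and the cleanup only runs when the next message arrives, not on a background timer:

```python
import time

import chainlit as cl

SESSION_TTL_SECONDS = 600  # assumed 10-minute idle limit


def reset_if_stale() -> None:
    """Drop our own session values if the user has been idle too long."""
    last_active = cl.user_session.get("last_active")
    if last_active is not None and time.time() - last_active > SESSION_TTL_SECONDS:
        cl.user_session.set("history", None)  # only clear keys we set ourselves
    cl.user_session.set("last_active", time.time())


@cl.on_message
async def on_message(message: cl.Message):
    reset_if_stale()
    history = cl.user_session.get("history") or []
    history.append(message.content)
    cl.user_session.set("history", history)
    await cl.Message(content=f"{len(history)} message(s) kept this session").send()
```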
Yeah, I cannot deploy to multiple Fargate tasks because gunicorn has some underlying issues. Related to:
https://github.com/benoitc/gunicorn/issues/1194
I'm looking forward to an answer to this to implement a workaround.
Could we just set the WEB_CONCURRENCY env variable to >1 to get multiple workers? @willydouhard
https://www.uvicorn.org/settings/
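For reference, uvicorn's own CLI does resolve `--workers` from `$WEB_CONCURRENCY` (per the settings page linked above), so something like `WEB_CONCURRENCY=4 uvicorn wrapper:app` would start four workers for a mounted app; whether `chainlit run` passes that through to its embedded server is exactly what is being asked here, so treat it as an assumption to verify.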
Support this idea. This is needed for scaling up.
This feature is really needed to horizontally scale Chainlit.
By running your web server with multiple worker processes, you can effectively utilize multiple CPU cores, which can lead to better performance and the ability to handle more concurrent requests.
Wouldn't this be solvable by using mount_chainlit with a FastAPI app and just running the FastAPI app with uvicorn, setting the desired number of workers?
https://docs.chainlit.io/deploy/api
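Concretely, that would be the same wrapper sketched earlier in the thread, started with something like `uvicorn wrapper:app --workers 4` instead of gunicorn; again, this is untested here, and how Chainlit's session handling and auth behave across worker processes is the open question.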
I tried that, but could not make it work with auth. If there is an example/cookbook entry, can you post it?
@shivsant I don't have a working example; I was just assuming that would work. I'd say the fact that it doesn't is a bug and should be reported.