Open-Assistant
Add aws deployment configuration for backend
We can deploy the Open-Assistant website to AWS for scalability, but we can't yet deploy the backend.
This requires:
- [x] Support for a backend specific PostgresDB
- [ ] Support for Redis (not yet used but soon to be)
- [x] Support for a load balanced webservice
We can deploy the dockerized backend with its supporting services (Postgres, Redis) on ECS or EKS using Terraform/Terragrunt.
> We can deploy dockerized backend with supporting services (Postgres, Redis) on ECS or EKS with terraform/terragrunt.

True, but it would probably be much more cost-effective and much easier to manage if we used something like Fargate.
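For reference, a Fargate-based setup could look roughly like the Terraform sketch below. This is only an illustration: the cluster/service names, CPU/memory sizes, container image URL, and the `var.*` inputs are all placeholder assumptions, not the project's actual configuration.

```hcl
# Hypothetical sketch of an ECS Fargate service for the backend container.
resource "aws_ecs_cluster" "backend" {
  name = "open-assistant-backend" # placeholder name
}

resource "aws_ecs_task_definition" "backend" {
  family                   = "open-assistant-backend"
  requires_compatibilities = ["FARGATE"]
  network_mode             = "awsvpc"
  cpu                      = 256 # 0.25 vCPU, smallest Fargate size
  memory                   = 512 # MiB

  container_definitions = jsonencode([{
    name      = "backend"
    image     = "ghcr.io/example/open-assistant-backend:latest" # placeholder image
    essential = true
    portMappings = [{ containerPort = 8080 }] # assumed backend port
  }])
}

resource "aws_ecs_service" "backend" {
  name            = "open-assistant-backend"
  cluster         = aws_ecs_cluster.backend.id
  task_definition = aws_ecs_task_definition.backend.arn
  launch_type     = "FARGATE"
  desired_count   = 2 # two tasks behind a load balancer for redundancy

  network_configuration {
    subnets         = var.private_subnet_ids # assumed to be defined elsewhere
    security_groups = [var.backend_sg_id]    # assumed to be defined elsewhere
  }
}
```

With Fargate there are no EC2 instances to manage, and cost scales with the task sizes and `desired_count` rather than with provisioned nodes, which is what makes it attractive for a small deployment like this.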
If someone wants to compare price points for a few different setups, that'd be super helpful.
For the database, there's either
For Redis, there's either
For the Postgres database, I'm pretty sure a small, simple RDS Postgres instance will be cheapest. It's also easier to work with, since we can give it a public IP and access it directly (versus Aurora). I doubt we can set up a Postgres container that'll run cheaper than AWS's hosted instances, but if someone can demonstrate this being possible, it's worth considering.
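As a rough illustration of that RDS option, a minimal Terraform definition for a small, publicly reachable Postgres instance might look like the sketch below. The identifier, instance class, storage size, and `var.*` credentials are assumptions for illustration only.

```hcl
# Hypothetical sketch of a small RDS Postgres instance with a public endpoint.
resource "aws_db_instance" "backend" {
  identifier        = "open-assistant-backend" # placeholder name
  engine            = "postgres"
  engine_version    = "14"
  instance_class    = "db.t4g.micro" # smallest burstable class, cheapest tier
  allocated_storage = 20             # GiB

  db_name  = "oasst"           # placeholder database name
  username = var.db_user       # assumed to be defined elsewhere
  password = var.db_password   # assumed to be defined elsewhere

  publicly_accessible = true # allows direct access, as discussed above
  skip_final_snapshot = true # acceptable for a staging database
}
```

A setup like this keeps the database directly reachable for development, at the cost of needing a security group that restricts which IPs can connect.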
FYI: I have Aurora Serverless working for my staging instance, but I'll likely switch it all to RDS shortly.