[WIP] Add workflows triggered by realtime events
This PR implements the workflows we discussed earlier.
Workflows are defined using the Amazon States Language and stored in a database controlled by realtime. The database table is mirrored into an ETS table, and we use realtime events to keep the two tables in sync.
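To make the synchronisation idea concrete, here is a minimal sketch of a registry process that applies realtime change events for the workflows table to an ETS cache. All module, table, and field names here are illustrative, not the actual ones from this PR:

```elixir
defmodule Realtime.Workflows.Registry do
  use GenServer

  @table :workflows_cache

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  @impl true
  def init(_opts) do
    # :protected + read_concurrency lets the trigger read the cache
    # directly, without a GenServer call on the hot path.
    :ets.new(@table, [:named_table, :set, :protected, read_concurrency: true])
    {:ok, nil}
  end

  @doc "Apply a realtime change event for the workflows table to the ETS cache."
  def handle_change(%{"type" => "INSERT", "record" => wf}), do: GenServer.cast(__MODULE__, {:put, wf})
  def handle_change(%{"type" => "UPDATE", "record" => wf}), do: GenServer.cast(__MODULE__, {:put, wf})
  def handle_change(%{"type" => "DELETE", "old_record" => wf}), do: GenServer.cast(__MODULE__, {:delete, wf})

  @impl true
  def handle_cast({:put, %{"id" => id} = wf}, state) do
    :ets.insert(@table, {id, wf})
    {:noreply, state}
  end

  def handle_cast({:delete, %{"id" => id}}, state) do
    :ets.delete(@table, id)
    {:noreply, state}
  end
end
```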
The workflow trigger (`realtime/workflows/trigger.ex`) is responsible for looking up which workflows match a given event and starting their execution.
Workflows can be transient or persistent. Transient workflows are executed inside a Task; if the realtime server restarts or if they crash, they are not restarted. Persistent workflows are stored as Oban jobs, so they survive restarts.
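Roughly, the two execution modes differ like this (a sketch with hypothetical module names; the supervisor, worker, and interpreter modules are assumptions, not the actual code):

```elixir
defmodule Realtime.Workflows.Execution do
  # Transient: run under a Task.Supervisor. Nothing is persisted, so a
  # crash or a server restart simply loses the execution.
  def start_transient(workflow, event) do
    Task.Supervisor.start_child(Realtime.Workflows.TaskSupervisor, fn ->
      Realtime.Workflows.Interpreter.run(workflow, event)
    end)
  end

  # Persistent: enqueue an Oban job. Oban stores the job in Postgres,
  # so it is retried on failure and picked up again after a restart.
  def start_persistent(workflow, event) do
    %{workflow_id: workflow.id, event: event}
    |> Realtime.Workflows.Worker.new()
    |> Oban.insert()
  end
end
```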
The interpreter
The current interpreter implementation compiles a gen_statem at runtime from a quoted module. This is less than ideal, not least because it means we need to unload the module once we are finished executing the workflow.
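The gist of that approach, stripped down to an illustrative sketch (the module body here is a placeholder, not the generated code from this PR):

```elixir
# Build a quoted gen_statem module from the workflow definition.
quoted =
  quote do
    # ... state functions generated from the workflow's states would go here ...
    def callback_mode, do: :state_functions
    def init(args), do: {:ok, :start, args}
  end

# Compile it under a unique name at runtime.
{:module, mod, _bytecode, _result} =
  Module.create(
    :"Elixir.WorkflowStateMachine#{System.unique_integer([:positive])}",
    quoted,
    Macro.Env.location(__ENV__)
  )

# ... run the execution via :gen_statem.start_link(mod, input, []) ...

# Once execution finishes, the module has to be unloaded by hand:
:code.purge(mod)
:code.delete(mod)
```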
We need to implement persistent workflows in such a way that execution resumes from the state where it was interrupted. This can be done by "breaking" the interpreter into several persistent jobs, each one interpreting one state and transitioning to the next. The challenge is dealing with Parallel and Map states, since they run a number of sub-workflows in parallel. My idea is to enqueue a persistent job for each sub-workflow, plus one job that polls for their completion. This approach has one downside (two if you don't like polling): pruning old jobs also requires checking that a job's result is not needed by any other job.
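A sketch of the "one job per state" idea, using Oban workers. The worker names and the helper functions are hypothetical placeholders; only the enqueue-next-state and snooze-until-done patterns are the point here:

```elixir
defmodule Realtime.Workflows.StateWorker do
  use Oban.Worker, queue: :workflows

  @impl Oban.Worker
  def perform(%Oban.Job{args: %{"execution_id" => exec, "state" => name, "input" => input}}) do
    {output, next} = interpret_state(exec, name, input)

    case next do
      :end ->
        record_result(exec, output)

      next_state ->
        # A state transition is just enqueueing the next job. Because Oban
        # persists jobs in Postgres, an interrupted execution resumes from
        # the last state that completed.
        %{execution_id: exec, state: next_state, input: output}
        |> new()
        |> Oban.insert()
    end

    :ok
  end

  defp interpret_state(_exec, _name, input), do: {input, :end} # placeholder
  defp record_result(_exec, _output), do: :ok                  # placeholder
end

defmodule Realtime.Workflows.JoinWorker do
  use Oban.Worker, queue: :workflows

  # Polls for the completion of the sub-workflow branches spawned by a
  # Parallel or Map state.
  @impl Oban.Worker
  def perform(%Oban.Job{args: %{"branch_ids" => ids}}) do
    if Enum.all?(ids, &branch_done?/1) do
      # ... collect branch results and enqueue the parent's next state ...
      :ok
    else
      # Not done yet: reschedule this job to poll again in a few seconds.
      {:snooze, 5}
    end
  end

  defp branch_done?(_id), do: true # placeholder
end
```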
What's left to do
- [x] Add a new set of configurations for the workflows database. Users should be able to select the schema.
- [x] Validate REST requests and return meaningful error responses
- [x] Define a plugin-like architecture for ASL Resources (e.g. one for AWS Lambda functions, one for simple HTTP POST requests, etc.)
- [x] Ignore changes happening on workflows schema/tables by default? It's very easy to trigger a workflow in response to events generated by another workflow.
- [ ] Implement an interpreter, possibly re-using parts of the `states_language` library
- [ ] Implement HTTP POST request resources. Workflow metadata should be included in the headers (`X-Supabase-Workflow-Id`, `X-Supabase-Execution-Id`, `X-Supabase-Workflow-Name`, etc.). To implement this the interpreter needs to carry around a context object with all this information.
- [ ] Log state events to a table (optional, if `log_type == :postgres`). These events can be used by users to debug their workflows.
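The HTTP POST resource item could look something like the sketch below. The module name and the `ctx` struct fields are assumptions (the context object the interpreter would thread through execution); `:httpc` is used only because it ships with OTP, and it requires `:inets` to be started:

```elixir
defmodule Realtime.Workflows.Resources.HttpPost do
  # ctx is a hypothetical context carried by the interpreter, e.g.
  # %{workflow_id: ..., execution_id: ..., workflow_name: ...}
  def invoke(url, json_body, ctx) do
    headers = [
      {String.to_charlist("x-supabase-workflow-id"), String.to_charlist(ctx.workflow_id)},
      {String.to_charlist("x-supabase-execution-id"), String.to_charlist(ctx.execution_id)},
      {String.to_charlist("x-supabase-workflow-name"), String.to_charlist(ctx.workflow_name)}
    ]

    :httpc.request(
      :post,
      {String.to_charlist(url), headers, String.to_charlist("application/json"), json_body},
      [],
      []
    )
  end
end
```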
I will try to keep this list updated as we progress through it.
@fracek I left ya a couple of comments. Awesome work on this so far!!! 🔥
I updated the first comment. Overall, you can use this code to test how workflows work. I need to figure out a good implementation for the persistent workflow interpreter, and then it's in good shape! Event logging is a non-issue, and handling resources is already implemented (just need to set the correct headers).
Closing. Working on Workflows somewhere else now. To be announced!