[RFC] Add config option to parse the payload into a map with atoms
Hey,
I just connected my job to my existing logic and it errored quite fast because the map keys are not atoms.
The fix was quite simple, but maybe a configurable option in task_bunny would be more user-friendly:
def perform(data) do
  data
  # Round-trip through JSON so the map comes back with atom keys
  |> Poison.encode!()
  |> Poison.decode!(keys: :atoms)
  |> Indexer.index_by_job()
end
I don't want to have implicit atom conversion at the TaskBunny layer (even if it's configurable) because of the potential memory leak. Also, if users are able to notice a configuration option, they can just as easily do the trick themselves by defining a custom base job.
# Caveat: I haven't tried it. Just sketching the idea.
defmodule BaseJob do
  defmacro __using__(_options \\ []) do
    quote do
      use TaskBunny.Job

      def perform(payload) do
        # Convert the string keyed payload into an atom keyed map
        atom_key_map = for {key, val} <- payload, into: %{}, do: {String.to_atom(key), val}

        perform_with_atom(atom_key_map)
      end
    end
  end
end
defmodule MyJob do
  use BaseJob

  def perform_with_atom(payload) do
    Indexer.index_by_job(payload)
  end
end
However, I also feel that passing around a string keyed map is kind of tedious. For example, let's imagine we have a job that sends an email to a user.
defmodule EmailJob do
  use TaskBunny.Job

  def perform(%{"user_id" => user_id, "email_id" => email_id}) do
    ...
  end
end
iex> EmailJob.enqueue(%{"user_id" => user.id, "email_id" => email.id})
If we could write it like this instead, it would be more readable and cleaner:
defmodule EmailJob do
  use TaskBunny.Job

  def perform(user_id, email_id) do
    ...
  end
end
iex> EmailJob.enqueue(user.id, email.id)
Since Elixir doesn't support variadic functions, the implementation won't be straightforward. I haven't had a chance to tackle a solution yet, but maybe we can do something with a macro. I also want to avoid over-engineering. However, I'm sharing the idea because I can see how the interface of perform could be improved.
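A very rough, untested sketch of the macro direction (defperform, PositionalJob, and Mailer are names I'm making up here, not TaskBunny API):

defmodule PositionalJob do
  # Generates perform/1 that matches the string keyed payload and binds each
  # value to a variable named after the key, so the job body reads positionally.
  defmacro defperform(keys, do: block) do
    vars = Enum.map(keys, &Macro.var(&1, nil))
    pairs = Enum.zip(Enum.map(keys, &Atom.to_string/1), vars)

    quote do
      def perform(%{unquote_splicing(pairs)}) do
        unquote(block)
      end
    end
  end
end

defmodule EmailJob do
  use TaskBunny.Job
  import PositionalJob

  # Expands to perform(%{"user_id" => user_id, "email_id" => email_id})
  defperform [:user_id, :email_id] do
    Mailer.send(user_id, email_id)
  end
end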
Valid points.
For the atom keyed map: maybe it's worth noting that in the README? As you can see, my perform function is just the entry point to my real application logic, and I didn't even think about the atom key issue until I saw it error.
For the perform function improvements: I like your idea, as long as the map version (the current behavior) won't be removed. Maybe it can be solved via pattern matching and some sort of special data structure in the JSON object in RabbitMQ to identify whether it's just a map or parameters that should be passed to the perform action. Not sure how much complexity and technical debt this would introduce, though.
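For instance, something along these lines (untested; the "_args" key and the module name are made up, not anything TaskBunny does today):

defmodule PayloadSketch do
  # enqueue(user.id, email.id) could serialize positional arguments under a
  # reserved key, while plain map payloads keep the current format.
  def wrap_args(args) when is_list(args), do: %{"_args" => args}

  # The worker side dispatches on the shape of the decoded JSON.
  def dispatch(job, %{"_args" => args}) when is_list(args) do
    apply(job, :perform, args)   # positional interface
  end

  def dispatch(job, payload) when is_map(payload) do
    job.perform(payload)         # current map interface stays as-is
  end
end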
👍 for the documentation improvement. I will add the information to the README and to TaskBunny.Job.
There's String.to_existing_atom/1, which prevents memory leaks. Maybe that's worth considering (as an opt-in option).
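For example, the conversion in the base job sketch above could become something like this (untested; AtomizeKeys is just an illustrative name):

defmodule AtomizeKeys do
  # String.to_existing_atom/1 raises ArgumentError for keys that were never
  # defined as atoms, so the atom table cannot grow without bound.
  def atomize_keys(payload) when is_map(payload) do
    for {key, val} <- payload, into: %{}, do: {String.to_existing_atom(key), val}
  end
end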