silverback
More dynamic broker configuration
Overview
I'd like to be able to use more dynamic broker configuration on initialization. Right now, we can kind of sort of provide one string as the first argument to __init__() here:
https://github.com/ApeWorX/silverback/blob/8e89008921761bbdccb909dde51c4a94d4b9c5dd/silverback/settings.py#L72-L73
But brokers don't all share the same signature, and some may have more complex configuration options. For instance, this PR in taskiq-sqs adds two extra params to configure how it interacts with SQS.
Specification
I'm not entirely sure how we'd want to go about this, which is why I'm creating an issue. Either we'd have to somehow hack together some way to dynamically provide this config via env vars, or maybe we need a way to provide the broker to the application without using pydantic settings at all.
Thoughts?
One of the options would seem to be a config file of some sort. For ape core, one thing I want to do in 0.9 is allow config via pyproject.toml "tools" tables, which pydantic-settings supports: https://docs.pydantic.dev/latest/concepts/pydantic_settings/#pyprojecttoml
I like the method @fubuloubu gave, but historically I've always used environment variables or a .env file to get pulled in and managed by pydantic settings. Then all of the settings handle building your config. I'd be for either method of dealing with this
I think what essentially you can do is something like this:
```python
class SilverbackConfig(BaseSettings):
    BROKER_CLASS: str
    BROKER_ARGS: dict[str, Any] = {}
```

```shell
$ SILVERBACK_BROKER_CLASS="..." SILVERBACK_BROKER_ARGS='{"uri": "https://..."}' silverback run ...
```
and then it would essentially give you a dict like {"uri": "https://..."} you can destructure with **:
```python
broker = broker_class(**broker_args)
```
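Putting those pieces together, here's a rough sketch of how the CLASS/ARGS pair could be resolved at startup. Note that `load_broker` and the dotted `"module:Class"` convention are hypothetical illustrations, not existing silverback API:

```python
import importlib
import json
import os


def load_broker(class_path: str, broker_args: dict):
    """Resolve a "module:ClassName" path and instantiate it with the given args.

    Hypothetical helper: the name and path convention are illustrative only.
    """
    module_name, _, class_name = class_path.partition(":")
    module = importlib.import_module(module_name)
    broker_class = getattr(module, class_name)
    return broker_class(**broker_args)


# With SILVERBACK_BROKER_ARGS='{"uri": "https://..."}' exported in the
# environment, pydantic-settings would hand you the already-parsed dict;
# the raw equivalent looks like:
broker_args = json.loads(os.environ.get("SILVERBACK_BROKER_ARGS", "{}"))
```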
Never really been a fan of the .env file pattern. Secret settings that override env vars in a hidden file that sometimes accidentally get committed to the repo isn't super ideal.
I'm a little curious if we could have a config file that merges with our settings, but not sure that really works well either. The problem isn't that we can't use env vars (we can), but how do we provide arbitrary arguments to the constructor using said settings.
For instance, here's the new signature that brought this on
```python
def __init__(
    self,
    sqs_queue_url: str,
    wait_time_seconds: int = 0,  # used for long polling
    max_number_of_messages: int = 1,  # size of batch to receive from the queue
    result_backend: Optional[AsyncResultBackend] = None,
    task_id_generator: Optional[Callable[[], str]] = None,
) -> None:
```
If we used something like env vars we'd need to do some fancy parsing to get that to work.
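Though with the JSON-encoded env var approach, a single variable could cover that whole signature with no fancy parsing on our end. A stdlib illustration of what that parsing amounts to (`init_broker` here is just a stand-in echoing the SQS broker's parameters, and the queue URL is made up):

```python
import json

# What a user might export; pydantic-settings treats the value as JSON:
raw = '{"sqs_queue_url": "https://sqs.example.com/my-queue", "wait_time_seconds": 20}'


def init_broker(sqs_queue_url, wait_time_seconds=0, max_number_of_messages=1):
    # Stand-in for the broker's __init__; just echoes what it received.
    return {
        "queue": sqs_queue_url,
        "wait": wait_time_seconds,
        "batch": max_number_of_messages,
    }


broker_args = json.loads(raw)        # plain dict, with ints already typed
broker = init_broker(**broker_args)  # unpacks straight into the signature
```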
I don't think there is a good solution for parsing complex classes as arg types that doesn't involve doing something complex to handle it.
Perhaps that's just never how these tools are configured to work: you'd be expected to do the conversion upstream, before initializing a class that needs a more complex class passed as an arg.
oops, just saw your reply.
```python
class SilverbackConfig(BaseSettings):
    BROKER_CLASS: str
    BROKER_ARGS: dict[str, Any] = {}
```
That's really interesting. Had no idea we could provide complex types as env vars. From the pydantic docs:
Complex types like list, set, dict, and sub-models are populated from the environment by treating the environment variable's value as a JSON-encoded string.
Another way to populate nested complex variables is to configure your model with the env_nested_delimiter config setting, then use an environment variable with a name pointing to the nested model fields. What it does is simply explode your variable into nested models or dicts. So if you define a variable FOO__BAR__BAZ=123, it will convert it into FOO={'BAR': {'BAZ': 123}}. If you have multiple variables with the same structure, they will be merged.
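That "explode" behavior can be sketched in a few lines of plain Python. This is only an illustration of the effect described in the docs, not pydantic-settings' actual implementation:

```python
def explode(env: dict, delimiter: str = "__") -> dict:
    """Turn FOO__BAR__BAZ=123-style vars into nested dicts, merging shared prefixes."""
    result: dict = {}
    for key, value in env.items():
        parts = key.split(delimiter)
        node = result
        for part in parts[:-1]:
            # Walk/create intermediate dicts; shared prefixes merge naturally.
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return result


nested = explode({"FOO__BAR__BAZ": "123", "FOO__BAR__QUX": "7"})
```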
We could even provide them individually.
This at least wouldn't require much in the way of changes to support immediately. :+1:
We could even provide them individually.
I think this requires a specific submodel for it to work; it doesn't work with a generic dict.