Results: 98 comments of David Blain

> Airflow is not a processing tool (stream or batch)
>
> but if you still want to run and control batching manually in Airflow then;
> ...

> I don't understand.
>
> The second operator I'm talking about is just a PythonOperator splitting a list of "work" into N sublists. I've updated the example DAG is...

> `after your first task add an intermediary task that create N batch`
>
> ```python
> with DAG(
>     "a",
>     default_args=DEFAULT_ARGS,
>     schedule_interval=timedelta(hours=24),
> ):
>     distinct_users_ids_task = ...
> ```
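To make that intermediary step concrete, below is a minimal, runnable sketch of how the splitting task could look. The DAG id, task ids, `N_BATCHES` and the placeholder upstream query are illustrative assumptions, not code from the actual example DAG in the thread.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

N_BATCHES = 4  # hypothetical batch count, purely for illustration


def _split_into_batches(ti):
    # Pull the list of ids produced by the upstream task and slice it into N sublists;
    # batch i gets every N-th element, starting at offset i.
    user_ids = ti.xcom_pull(task_ids="distinct_users_ids") or []
    return [user_ids[i::N_BATCHES] for i in range(N_BATCHES)]


with DAG(
    "split_batches_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=timedelta(hours=24),
    catchup=False,
):
    distinct_users_ids_task = PythonOperator(
        task_id="distinct_users_ids",
        python_callable=lambda: list(range(100)),  # placeholder for the real query
    )
    split_into_batches_task = PythonOperator(
        task_id="split_into_batches",
        python_callable=_split_into_batches,
    )
    distinct_users_ids_task >> split_into_batches_task
```

On Airflow 2.3+ the returned list of sublists is also what you would typically feed into dynamic task mapping (`.expand(...)`), so that one mapped task instance processes each batch.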

> I'll go back to the list with this feedback, but this implementation is tantamount to a parallel scheduler, executor and triggerer implementation so is very unlikely to be accepted...

This PR will be closed in favor of [AIP-88](https://github.com/apache/airflow/pull/51391/)

@eladkal and @potiuk what do you think about this implementation? I would ideally want to use entry points to register the dialects, so that additional dialects can be loaded through...
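To sketch what I mean by entry-point registration (the group name and loader function below are hypothetical, not the mechanism that was ultimately merged): installed provider packages would advertise their dialect class under a dedicated entry point group, and the core would discover them lazily, roughly like this:

```python
from importlib.metadata import entry_points

# Hypothetical entry point group; not an actual Airflow group name.
DIALECT_ENTRY_POINT_GROUP = "airflow.dialects"


def load_registered_dialects() -> dict:
    """Discover dialect classes advertised by installed distributions.

    Uses the Python 3.10+ ``entry_points(group=...)`` selection API.
    """
    dialects = {}
    for ep in entry_points(group=DIALECT_ENTRY_POINT_GROUP):
        dialects[ep.name] = ep.load()  # import and return the referenced class
    return dialects
```

A provider would then only need to declare something like `mssql = <module>:<DialectClass>` under that group in its packaging metadata, and additional dialects become loadable without the core having to know about them up front.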

> We actually already use entrypoints - the `provider.yaml` "subset" is already exposed in providers and various providers' capabilities are available this way. They are even automatically extracted from provider.yaml's...
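For context, the existing mechanism being referred to is, as far as I understand it, the `apache_airflow_provider` entry point that each provider distribution exposes: loading it returns the provider-info dict generated from that provider's `provider.yaml`, which is what the ProvidersManager consumes. A rough sketch of reading it directly:

```python
from importlib.metadata import entry_points


def iter_provider_info():
    # Each installed provider package exposes an 'apache_airflow_provider' entry point
    # whose callable returns the dict generated from that provider's provider.yaml.
    # Uses the Python 3.10+ entry_points(group=...) selection API.
    for ep in entry_points(group="apache_airflow_provider"):
        get_provider_info = ep.load()
        yield ep.value, get_provider_info()


if __name__ == "__main__":
    for source, info in iter_provider_info():
        print(source, "->", info.get("package-name"))
```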

> I think in mssql provider, we do have a few "loosely related" things in providers already and cross-provider dependencies (both explicit and implicit) and sometimes where things are "between"...

@eladkal @potiuk I'm getting the following error; I know what it means, but I don't understand what causes it:

```
Found 5 errors in providers
Error: The `airflow.providers.apache.hive.transfers.mssql_to_hive` object in transfers list...
```

@potiuk As I needed to add the notion of dialects to the ProvidersManager, and the common sql provider now needs the dialects from the ProvidersManager, I also had to change the min...