Sven Thoms

Results 87 comments of Sven Thoms

@c-h-russell-walker Thank you for letting me know about 0.1.0. I will have to make changes in the Elyra JupyterLab plugin for parsing anyway, so good to know. It will be a...

@c-h-russell-walker What makes my environment special from an architectural perspective: we use both Airflow 2.x and Open Data Hub Kubeflow notebooks, as well as DataRobot (I...

@eladkal @potiuk Similar error in Airflow 2.8.2 dag-processor pod/container:

```
{dag_processor_job_runner.py:60} INFO - Starting the Dag Processor Job
[2024-06-18T02:50:07.781+0000] {validators.py:101} ERROR - Invalid stat name: dag_processing.last_duration.random error 2-0424133757V
/python3.8/site-packages/airflow/metrics/validators.py", line...
```
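For context, the stat name is rejected because the DAG file name ("random error 2-0424133757V") contains spaces. Below is a minimal sketch, not Airflow's actual implementation, of the kind of character check that produces this error; the regex, function name, and length limit here are assumptions:

```python
import re

# Hypothetical re-creation of a stat-name check similar in spirit to
# Airflow's metrics validators: only letters, digits, '_', '.', '-' allowed.
ALLOWED = re.compile(r"^[a-zA-Z0-9_.\-]+$")

def is_valid_stat_name(name: str, max_length: int = 250) -> bool:
    """Return True if the stat name is short enough and uses only allowed characters."""
    return len(name) <= max_length and bool(ALLOWED.match(name))

# A DAG file name containing spaces yields an invalid stat name:
print(is_valid_stat_name("dag_processing.last_duration.random error 2-0424133757V"))  # False
print(is_valid_stat_name("dag_processing.last_duration.my_dag"))  # True
```

Renaming the DAG file to avoid spaces sidesteps the error without touching the metrics configuration.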

I am now using the AWS S3 env vars in Elyra itself if, and only if, the KUBERNETES_SECRET auth_type is used. I am also no longer requiring, before saving the runtime config, the...
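A minimal sketch of the conditional described above, reading the AWS S3 env vars only when the KUBERNETES_SECRET auth_type is in use. The helper name and the exact set of env var keys are assumptions, not Elyra's actual API:

```python
import os

def s3_env_defaults(auth_type: str) -> dict:
    """Hypothetical helper: pre-fill S3 settings from the environment
    only when the runtime config uses KUBERNETES_SECRET auth."""
    if auth_type != "KUBERNETES_SECRET":
        return {}
    # Assumed env var names; real deployments may use different keys.
    keys = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_S3_ENDPOINT")
    return {k: os.environ[k] for k in keys if k in os.environ}
```

For any other auth_type, the helper returns an empty dict, so the user's explicit runtime-config values are never overridden.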

@lresende Does Kubeflow Pipelines also work with Git integration, or only with direct API calls? In Elyra terms, I mean. I assume Kubeflow running on K8s uses S3 for pipeline .tar.gz exchange between...

@harshad16 @lresende @caponetto @jiridanek OK, so I built an Elyra wheel file from my fork branch and tested it with JupyterLab 4.x and an Airflow runtime config. Before I started...

Testing the case where no env vars are present in JupyterLab while using auth type KUBERNETES_SECRET: to delete the env vars, you have to restart JupyterLab / the workbench and essentially not...

Hi, just a positive note from me here: it is great that there is work on this, as this issue is also important when, for example, using S3 connections in Airflow steps....

> Unless you are ok with all tasks in your pipeline having the same label(s). Then simply use add_pod_label on the very first task in alphabetical order

This is what...
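The quoted workaround relies on picking the alphabetically first task and attaching the label(s) there. A pure-Python sketch of that selection logic, with no KFP dependency; the task names, labels, and dict-based task model are illustrative only:

```python
# Sketch of the "label the alphabetically first task" workaround:
# pick the first task by name and attach the pod labels to it.
def label_first_task(tasks: dict, labels: dict) -> str:
    """Attach `labels` to the pod-label dict of the alphabetically first task.

    `tasks` maps task name -> task spec; each spec holds a 'pod_labels' dict.
    Returns the name of the task that received the labels.
    """
    first = min(tasks)  # min over dict keys = alphabetical order
    tasks[first].setdefault("pod_labels", {}).update(labels)
    return first

tasks = {
    "load_data": {"pod_labels": {}},
    "a_extract": {"pod_labels": {}},
    "train": {"pod_labels": {}},
}
labeled = label_first_task(tasks, {"team": "data-eng"})
print(labeled)  # a_extract
```

In KFP v1 DSL the equivalent step would be calling `add_pod_label(name, value)` on that one task object rather than on every task in the pipeline.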

I laid this out conceptually in terms of where the system-level env vars apply, i.e. JupyterLab, the Airflow/KFP runtime, or both (scope). @harshad16 what should I write...