Liran
You don't need to define anything as an input. Just use the table names from Glue inside your SQL queries.
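To illustrate, a minimal Metorikku metric file might look like the sketch below. The database and table names (`my_glue_db.events`), column names, and output path are illustrative assumptions, not taken from this thread:

```yaml
steps:
  - dataFrameName: daily_counts
    # Reference the Glue catalog table directly in SQL;
    # no explicit input definition is needed
    sql: >-
      SELECT event_date, COUNT(*) AS cnt
      FROM my_glue_db.events
      GROUP BY event_date
output:
  - dataFrameName: daily_counts
    outputType: Parquet
    outputOptions:
      saveMode: Overwrite
      path: daily_counts.parquet
```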
Hi @cyrillay, can you share how you are running the Metorikku job from EMR? Are you running it as a step? What is the command? Also, which Metorikku version are you...
Can you also add:
```
--conf spark.sql.catalogImplementation=hive
--conf hive.metastore.client.factory.class=com.amazonaws.glue.catalog.metastore.AWSGlueDataCatalogHiveClientFactory
```
?
Unfortunately Metorikku comes bundled with the AWS SDK (due to the DQ integration)... Can you try adding the following jar to the command via `--jars`? https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-core/1.11.961/aws-java-sdk-core-1.11.961.jar Let me know if that helps.
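Putting the suggestions from this thread together, a full `spark-submit` invocation might look like the sketch below. The jar filename `metorikku.jar` and the config filename `job_config.yaml` are illustrative assumptions; `com.yotpo.metorikku.Metorikku` is Metorikku's standard entry-point class, and `-c` points at the job config file:

```shell
spark-submit \
  --class com.yotpo.metorikku.Metorikku \
  --conf spark.sql.catalogImplementation=hive \
  --conf hive.metastore.client.factory.class=com.amazonaws.glue.catalog.metastore.AWSGlueDataCatalogHiveClientFactory \
  --jars https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-core/1.11.961/aws-java-sdk-core-1.11.961.jar \
  metorikku.jar \
  -c job_config.yaml
```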
Cool idea! I would actually just create a new type of step that continues or exits based on the query outcome.
We'll add it soon to the code, but for now:
```yaml
# Path to metric file
metric: "some_metric.yaml"
# Paths to all the different mock inputs
mocks:
  # Name of...
```
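For context, a complete mock-settings file of this shape might look like the following sketch. Every name, path, and column here is an illustrative assumption (the thread's snippet is truncated, so this is not the authoritative schema):

```yaml
# Path to metric file (assumed name)
metric: "some_metric.yaml"
# Paths to all the different mock inputs
mocks:
  # Hypothetical mock: the name should match the table
  # referenced in the metric's SQL
  - name: events
    path: mocks/events.jsonl
# Expected results per output dataframe (hypothetical)
tests:
  daily_counts:
    - event_date: "2021-01-01"
      cnt: 2
```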