Julian Seither
@austiezr The `expect_column_values_to_not_exceed_timerange` included in the following class is our custom expectation, and it is probably what triggers the exception (because of the timestamps handled in it): ``` class CustomSparkDFDataset(SparkDFDataset): #...
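For context, since the class body above is truncated: a custom timestamp-range expectation on a `SparkDFDataset` might look like the sketch below. This is an assumption based on the legacy Great Expectations V2 custom-expectation pattern; the decorator usage, the parameter names `min_ts`/`max_ts`, and the body are hypothetical, not the author's actual code.

```python
# Hypothetical sketch only; the real class body was truncated above.
import pyspark.sql.functions as F
from great_expectations.dataset import MetaSparkDFDataset, SparkDFDataset


class CustomSparkDFDataset(SparkDFDataset):
    _data_asset_type = "CustomSparkDFDataset"

    @MetaSparkDFDataset.column_map_expectation
    def expect_column_values_to_not_exceed_timerange(self, column, min_ts, max_ts):
        # `column` is a single-column Spark DataFrame; a column_map_expectation
        # must return it with a boolean `__success` column appended.
        # min_ts/max_ts are assumed to be ISO-8601 strings here.
        name = column.columns[0]
        return column.withColumn(
            "__success",
            (F.col(name) >= F.to_timestamp(F.lit(min_ts)))
            & (F.col(name) <= F.to_timestamp(F.lit(max_ts))),
        )
```

If the timestamps in the data are malformed (e.g. epoch millis parsed as seconds), the comparison above is where that would surface.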
@austiezr Can you say anything about my issue here? I'm still struggling, and according to the stacktrace it occurs directly in Great Expectations core (I accidentally truncated it in the...
@itsvikramagr Sorry for pushing this, but we use this connector in production and need some advice on how to avoid those LimitExceeded errors on "empty" streams. After some retries, our...
@itsvikramagr It happens only for empty streams. There is one more consumer, but since it happens only for empty streams, this might not be a problem of too many read...
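One generic way to soften these throttling errors (a sketch, not the connector's own retry mechanism) is exponential backoff around the DescribeStream call, which AWS limits per stream and which is a common source of `LimitExceededException` when a stream is polled frequently. boto3 is used here purely for illustration; the function name and retry count are made up.

```python
# Sketch: back off exponentially when Kinesis throttles DescribeStream.
import time

import boto3


def describe_stream_with_backoff(stream_name, max_retries=5):
    client = boto3.client("kinesis")
    for attempt in range(max_retries):
        try:
            return client.describe_stream(StreamName=stream_name)
        except client.exceptions.LimitExceededException:
            # Throttled: wait 1s, 2s, 4s, ... before retrying.
            time.sleep(2 ** attempt)
    raise RuntimeError(f"DescribeStream still throttled after {max_retries} retries")
```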
I attached a log from one of our jobs. I realized that I had misunderstood what `ShardEnd` means: it means that a shard is at the end of its life. That's not the...
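For anyone who wants to verify this themselves: a shard that has reached `ShardEnd` is a closed shard, which the Kinesis API exposes as a set `EndingSequenceNumber` in the shard's `SequenceNumberRange`. A quick boto3 sketch (the helper name is hypothetical):

```python
# Sketch: list shards of a stream that are closed ("at end of life").
import boto3


def closed_shards(stream_name):
    client = boto3.client("kinesis")
    shards = client.list_shards(StreamName=stream_name)["Shards"]
    # A closed shard has an EndingSequenceNumber; an open one does not.
    return [
        shard["ShardId"]
        for shard in shards
        if "EndingSequenceNumber" in shard["SequenceNumberRange"]
    ]
```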
To be honest: I didn't follow up on pail for this project. In the end, I implemented the data model with Hive tables and a Data Vault-like approach.