deequ
[BUG] Row-level filtering marks records as pass when null values are present in the column
I am working on filtering data based on row-level checks. It works as expected when the columns contain non-null values, but records are incorrectly marked as pass when null values are present.

For example:
import sparkSession.implicits._

val df = Seq[(Int, String, Integer)](
  (1, "a", 1),
  (2, "b", 3),
  (3, null, null),
  (4, "c", 5),
  (5, null, null),
  (6, "d", 7)
).toDF("item", "att1", "att2")
I applied the rules below:

rule1: .isPrimaryKey("att1", "att2")
rule2: .isGreaterThan("att2", "att1")
rule3: .isGreaterThanOrEqualTo("att2", "att1")
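For reference, here is a minimal sketch of how I wire these rules up and produce the per-row results. It assumes a recent deequ version that supports row-level results via VerificationResult.rowLevelResultsAsDataFrame; the check description and variable names are only illustrative, and df/sparkSession refer to the data frame and session from the snippet above.

import com.amazon.deequ.{VerificationResult, VerificationSuite}
import com.amazon.deequ.checks.{Check, CheckLevel}

// One check carrying the three rules listed above
val check = Check(CheckLevel.Error, "row level checks")
  .isPrimaryKey("att1", "att2")            // rule1
  .isGreaterThan("att2", "att1")           // rule2
  .isGreaterThanOrEqualTo("att2", "att1")  // rule3

val result = VerificationSuite()
  .onData(df)
  .addCheck(check)
  .run()

// Attach each constraint's pass/fail outcome to the input rows
val rowLevelResults =
  VerificationResult.rowLevelResultsAsDataFrame(sparkSession, result, df)
rowLevelResults.show()

Running this produces the row-level results below: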
+----+----+----+-----+-----+-----+
|item|att1|att2|rule1|rule2|rule3|
+----+----+----+-----+-----+-----+
| 1| a| 1| true|false| true|
| 2| b| 3| true| true| true|
| 3|null|null| true| true| true|
| 4| c| 5| true| true| true|
| 5|null|null| true| true| true|
| 6| d| 7| true| true| true|
+----+----+----+-----+-----+-----+
When the column values are null, the row-level check status is reported as true, but it should be false.