filipeo2-mck
Setting minimal versions of `numba` and `visions` worked for me (for Python 3.9):

```txt
# numba>=0.59.0 removed `numba.decorated_jit`
numba>=0.59.0
visions>=0.7.6
```

Edit: It looks like the above works if you...
+1, installing it under Docker (`linux/amd64` arch)
@NeerajMalhotra-QB
I believe that both solutions are complementary, as each available infrastructure differs from the others. In some scenarios caching will be better; in others, checkpoints. I would like to have...
Hi! @NeerajMalhotra-QB, the current setup uses only parquet files and I tested it locally only 👍 @kasperjanehag, agreed, env vars will give the user enough flexibility.
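To make the idea concrete, here is a minimal sketch of how env vars could toggle between caching and checkpointing before validation. The variable names and the `persist_for_validation` helper are illustrative assumptions, not the PR's actual API:

```python
import os

from pyspark.sql import DataFrame


def persist_for_validation(df: DataFrame) -> DataFrame:
    """Persist the DataFrame before validation, driven by environment variables.

    PANDERA_CACHE_DATAFRAME and PANDERA_CHECKPOINT_DATAFRAME are assumed
    names, used only to illustrate the env-var-driven approach; the real
    configuration keys depend on what the PR exposes.
    """
    if os.getenv("PANDERA_CACHE_DATAFRAME", "False") == "True":
        df = df.cache()
    elif os.getenv("PANDERA_CHECKPOINT_DATAFRAME", "False") == "True":
        # checkpoint() requires spark.sparkContext.setCheckpointDir(...) first
        df = df.checkpoint()
    return df
```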
@dom-mcloughlin, I just opened PR #1570, could you take a look at it, please?
Watching... It would be great to be able to export both the complete StructType object and the [condensed DDL format](https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrameReader.load.html), for example:

```
col0 INT, col1 DOUBLE, col3 STRING
```
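For reference, a minimal sketch of how that condensed DDL string could be derived from a StructType. The `struct_to_ddl` helper is hypothetical (not part of pandera or PySpark) and the naive uppercasing only handles simple types:

```python
from pyspark.sql.types import (
    DoubleType,
    IntegerType,
    StringType,
    StructField,
    StructType,
)


def struct_to_ddl(schema: StructType) -> str:
    """Build a condensed DDL string by joining each field's name with its
    data type's simpleString()."""
    return ", ".join(
        f"{field.name} {field.dataType.simpleString().upper()}"
        for field in schema.fields
    )


schema = StructType(
    [
        StructField("col0", IntegerType()),
        StructField("col1", DoubleType()),
        StructField("col3", StringType()),
    ]
)
print(struct_to_ddl(schema))  # -> col0 INT, col1 DOUBLE, col3 STRING
```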
@Garett601 and @vladgrish, I just opened PR #1570, could you take a look at it, please?
#1570 was merged; I believe this can be closed, @cosmicBboy
What about ensuring uniqueness of values over a composite primary key of a table, for example? I understand that all three `id_*` columns below should be taken into account when applying...
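As a sketch of what such a check amounts to in plain PySpark (the `id_*` column names are hypothetical, since the original column list is truncated above): a composite key is unique when no combination of the key columns appears more than once.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical composite-key columns, for illustration only.
key_cols = ["id_a", "id_b", "id_c"]

df = spark.createDataFrame(
    [(1, 10, 100), (1, 10, 100), (2, 20, 200)],
    key_cols,
)

# Any key combination appearing more than once breaks composite uniqueness.
duplicated_keys = df.groupBy(*key_cols).count().filter(F.col("count") > 1)
print(duplicated_keys.count())  # -> 1 (the (1, 10, 100) combination is repeated)
```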