dlt-with-debug
A lightweight helper utility that enables interactive pipeline development by letting developers keep a single, unified source code for both DLT pipeline runs and non-DLT interactive notebook runs.
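For context, the dual-mode usage the description refers to looks roughly like the sketch below, based on the usage pattern the project documents; the decorator arguments and table body are illustrative placeholders, not a definitive example.

```python
# Sketch of the unified-source pattern (illustrative; decorator arguments
# and the table body are placeholders).
from dlt_with_debug import dltwithdebug, pipeline_id, showoutput

if pipeline_id:
    import dlt                      # real DLT pipeline run
else:
    from dlt_with_debug import dlt  # interactive notebook run uses the stub signatures

@dlt.create_table(comment="example table")
@dltwithdebug(globals())
def example_table():
    # `spark` is the pre-defined SparkSession in a Databricks notebook
    return spark.range(10)

showoutput(example_table)  # interactive helper to display the result
```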
I've been happily using the `dlt-with-debug` library, but I'm running into an issue when importing the dlt signatures without an active SparkSession. I'm trying: `import dlt_with_debug.dlt_signatures as dlt` ...
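If the failure is only due to the missing session, one hypothetical workaround (an assumption, not a confirmed fix) is to create a local SparkSession before attempting the import from the issue above:

```python
# Hypothetical workaround (assumption): start a local SparkSession first,
# then attempt the import quoted in the issue.
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("dlt-debug").getOrCreate()

import dlt_with_debug.dlt_signatures as dlt
```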
Has this been implemented yet? Also, how is this different from the standard dlt.create_table or dlt.create_view?
Hello :) Thanks for this useful lib! Will it be available not only for PySpark but also for SQL? Thanks,
addresses #6
…support for table by overwriting globals every time display is called.
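As a rough illustration of the idea in that commit message (a hypothetical helper, not the library's actual implementation), "overwriting globals after display" could look like this:

```python
# Hypothetical sketch (assumption, not the library's code): run a
# table-defining function, display its output, then rebind the function's
# name in the caller's globals to the resulting DataFrame so later cells
# can refer to the materialized result directly.
def display_and_overwrite(func, caller_globals):
    df = func()                         # execute the table-defining function
    df.show()                           # interactive display of the result
    caller_globals[func.__name__] = df  # overwrite the global binding
    return df
```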
Add unit tests for the functions and get coverage.
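A starting point could be a small pytest module along these lines (hypothetical names; real tests would target the library's own decorators and helpers):

```python
# Hypothetical pytest sketch (assumption): verify that a globals-overwriting
# helper both returns the function's result and rebinds the name.
def register_result(func, namespace):
    result = func()
    namespace[func.__name__] = result
    return result

def test_register_result_overwrites_binding():
    namespace = {}

    def my_table():
        return [1, 2, 3]

    assert register_result(my_table, namespace) == [1, 2, 3]
    assert namespace["my_table"] == [1, 2, 3]
```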
For example:

```python
@dlt.create_table(name='table1')
def my_func():
    return spark.read.csv()
```

Tie "table1" to the function name "my_func".
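One possible way to implement that tie-in is sketched below (a hypothetical decorator, not the library's actual code): when `name` is omitted, fall back to the decorated function's `__name__`.

```python
import functools

# Hypothetical sketch (assumption): default the table name to the decorated
# function's name when `name` is not supplied.
def create_table(name=None, **table_props):
    def decorator(func):
        table_name = name if name is not None else func.__name__  # e.g. "my_func"

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            df = func(*args, **kwargs)
            # ... register `df` under `table_name` here ...
            return df

        wrapper.table_name = table_name
        return wrapper
    return decorator
```

With such a default, decorating `my_func` with a bare `@create_table()` would register the table as "my_func", while an explicit `name='table1'` would still take precedence.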