Jeff Chiu
The incremental materialization in 1.1.x of this adapter is a little out of date with changes in dbt-core 1.1.x.
- handle cases where unique_key is a list (see the sketch below)
- dbt-core handles...
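A minimal sketch of what list handling could look like, assuming a delete+insert strategy; the macro name `sqlite__build_incremental_delete_sql` and its arguments are placeholders for illustration, not the adapter's actual interface:

```sql
{# hypothetical helper, not the adapter's real macro: builds the delete
   statement for delete+insert when unique_key is either a string or a list #}
{% macro sqlite__build_incremental_delete_sql(target, source, unique_key) %}
    {% if unique_key is sequence and unique_key is not string %}
        delete from {{ target }}
        where exists (
            select 1 from {{ source }}
            where
            {% for key in unique_key %}
                {{ source }}.{{ key }} = {{ target }}.{{ key }}
                {%- if not loop.last %} and {% endif %}
            {% endfor %}
        )
    {% else %}
        delete from {{ target }}
        where {{ unique_key }} in (select {{ unique_key }} from {{ source }})
    {% endif %}
{% endmacro %}
```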
Several tests in `test_data_types.py` are currently failing, and the issues around types are also what's behind the failures in TestDocsGenerateSqlite and TestDocsGenReferencesSqlite. Here are some things the adapter needs to account...
The closest thing in sqlite is [group_concat](https://www.sqlite.org/lang_aggfunc.html). Maybe there's an extension that provides this? sqlean doesn't have it.
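Assuming "this" refers to dbt's cross-database `listagg` macro, a rough sketch of how `group_concat` might back it; `group_concat` can't honor an order-by clause or a limit, so this would only be a partial substitute:

```sql
{# sketch only: group_concat ignores ordering, so order_by_clause/limit_num
   are rejected rather than silently dropped #}
{% macro sqlite__listagg(measure, delimiter_text, order_by_clause, limit_num) %}
    {% if order_by_clause or limit_num %}
        {{ exceptions.raise_compiler_error("SQLite group_concat does not support order_by_clause or limit_num") }}
    {% endif %}
    group_concat({{ measure }}, {{ delimiter_text }})
{% endmacro %}
```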
Not sure how feasible this is.
I think this is doable? The slightly tricky part is determining whether the passed-in value is a date or a timestamp.
There is a `sqlite__datediff_broken` macro that needs to be fixed, more fully implemented, renamed, and tested.
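A sketch of what a fixed macro might look like, assuming it would be renamed to `sqlite__datediff` and that day/month/year granularity is enough to start with; `julianday()` accepts both date and timestamp strings, which sidesteps the date-vs-timestamp question for the day case:

```sql
{# sketch only: supports a few dateparts; anything else raises a compiler error #}
{% macro sqlite__datediff(first_date, second_date, datepart) %}
    {% if datepart == 'day' %}
        cast(julianday({{ second_date }}) - julianday({{ first_date }}) as integer)
    {% elif datepart == 'month' %}
        (cast(strftime('%Y', {{ second_date }}) as integer)
          - cast(strftime('%Y', {{ first_date }}) as integer)) * 12
        + (cast(strftime('%m', {{ second_date }}) as integer)
          - cast(strftime('%m', {{ first_date }}) as integer))
    {% elif datepart == 'year' %}
        cast(strftime('%Y', {{ second_date }}) as integer)
          - cast(strftime('%Y', {{ first_date }}) as integer)
    {% else %}
        {{ exceptions.raise_compiler_error("datepart '" ~ datepart ~ "' is not supported in this sketch") }}
    {% endif %}
{% endmacro %}
```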
See https://docs.getdbt.com/docs/contributing/building-a-new-adapter#other-files

Couldn't figure out how to do the schemas_and_paths and extensions parts:

```yaml
fixed:
  type: sqlite
  threads: 1
  database: "database"
  schema: "main"
prompts:
  schemas_and_paths:
    main:
      hint: '/my_project/data/etl.db'
  schema_directory:
    hint:...
```
Due to some personal issues, I unfortunately don't have time to maintain this project anymore. This adapter works with dbt 1.4.0, but there have been two minor versions since then...
Since using SQLite is always going to be local, we can (mis)use dbt's python models to run code, probably in a subprocess.
This PR fixes the following issue: if you have a facet field defined as follows:

```ruby
config.add_facet_field 'example_query_facet_field', label: 'Publish Date', :query => {
  :years_5 => { label: 'within 5...
```