Error on saving table to BigQuery - type mismatch
Given a source table in BigQuery with the schema shown in the comments below, attempting to persist a subset of it throws an error:
connect("bigquery:///gcp-project-id-here")
calendar = import_table("eCom_US.calendar")
temp = calendar{
Calendar_Date, // DATE
Fin_Year, // INTEGER
Fin_Year_Label // STRING
}[..5]
print(temp)
print(columns(temp))
// {Calendar_Date: string?, Fin_Year: int?, Fin_Year_Label: string?}
table test_data.temp_cal = temp
// Got error: 400 Query column 1 has type DATE which cannot be inserted into column Calendar_Date, which has type STRING
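Note that `columns(temp)` reports `Calendar_Date` as `string?` even though BigQuery stores it as `DATE`; Preql evidently creates the destination table from those introspected types, while the query it generates still returns a native `DATE`, hence the 400. To double-check what BigQuery itself thinks the column types are, you can inspect the table with the standard `google-cloud-bigquery` client (a minimal sketch, assuming that client is installed and authenticated, and reusing the names from this thread):

```python
# Minimal sketch: list the source table's column types on the BigQuery
# side, to compare against what Preql's columns() reports.
from google.cloud import bigquery

client = bigquery.Client(project="gcp-project-id-here")
table = client.get_table("eCom_US.calendar")  # dataset.table from the thread

for field in table.schema:
    # Expected here: Calendar_Date DATE, Fin_Year INTEGER, Fin_Year_Label STRING
    print(field.name, field.field_type)
```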
Pushed a fix to the `schemas` branch. The code you gave should work now.
Right now the support for dates needs improvement. I will try to straighten it out in the coming weeks. If you get any more errors, let me know!
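Until that lands, one possible stopgap (not Preql, just a sketch reusing the table names from this thread) is to create the subset table directly through the BigQuery client, so the `DATE` type is preserved end to end:

```python
# Stopgap sketch: build the subset table with BigQuery DDL instead of
# persisting through Preql, so Calendar_Date keeps its DATE type.
# Assumes google-cloud-bigquery is installed and authenticated.
from google.cloud import bigquery

client = bigquery.Client(project="gcp-project-id-here")
client.query(
    """
    CREATE OR REPLACE TABLE `test_data.temp_cal` AS
    SELECT Calendar_Date, Fin_Year, Fin_Year_Label
    FROM `eCom_US.calendar`
    LIMIT 5
    """
).result()  # block until the job finishes
```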
I tried the same code and got a new error; traceback below. I checked in the BigQuery UI: the dataset with that name exists and has tables inside.
```
Traceback (most recent call last):
  File "/PyCharmVenv/PreQL/bin/preql", line 8, in <module>
    sys.exit(main())
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/__main__.py", line 76, in main
    p('import ' + args.module)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/api.py", line 25, in inner
    return f(*args, **kwargs)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/api.py", line 197, in __call__
    res = self._run_code(code, '<inline>', args)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/api.py", line 193, in _run_code
    return self._interp.execute_code(code + "\n", source_name, pql_args)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/interpreter.py", line 36, in inner
    return f(interp, *args, **kwargs)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/interpreter.py", line 86, in _execute_code
    last = execute(stmt)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/evaluate.py", line 384, in execute
    return stmt._execute() or objects.null
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/evaluate.py", line 365, in _execute
    module = import_module(context.state, r)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/evaluate.py", line 349, in import_module
    i.include(module_path)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/interpreter.py", line 36, in inner
    return f(interp, *args, **kwargs)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/interpreter.py", line 96, in _include
    self._execute_code(f.read(), fn)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/interpreter.py", line 86, in _execute_code
    last = execute(stmt)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/evaluate.py", line 384, in execute
    return stmt._execute() or objects.null
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/evaluate.py", line 168, in _execute
    set_var(table_def.name, t)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/state.py", line 220, in set_var
    return context.state.set_var(name, value)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/state.py", line 203, in set_var
    return self.ns.set_var(name, value)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/state.py", line 43, in set_var
    ns = ns.get_var(p)
  File "/PyCharmVenv/PreQL/lib/python3.9/site-packages/preql/core/state.py", line 36, in get_var
    raise NameNotFound(name)
preql.core.state.NameNotFound: test_data
```
Yes, I understand why this is happening. I will fix it soon.
Sorry for the hiccups, and thank you for your patience!
Okay, so I misunderstood your problem, sorry! I now realize you're running a script with `-m` or `-f`, in which case the tables aren't automatically available. (The reasoning is that querying the schema every time you run a script would cause unnecessary lag.)
One solution (requires the latest `master`) is to declare each table at the top:

```
table test_data {...}
```
Another solution, which should already work, is to use `connect()`:

```
connect("bigquery:///...", load_all_tables: true)
```
I'm also thinking of adding a command-line switch to always load all tables.