Gareth
Polygons you draw should be validated as actual polygons before being saved; note that polygons can have cutouts (see spec), and that the export data structure must be correct (polygon should...
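A minimal sketch of the kind of validation meant here, assuming a polygon is stored as a shell ring plus optional cutout rings. The `shell`/`holes` layout and the helper names are illustrative, not the actual spec:

```python
def ring_area(ring):
    """Signed shoelace area; zero means the ring is degenerate."""
    area = 0.0
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return area / 2.0

def is_valid_ring(ring):
    # An actual polygon ring needs at least 3 vertices and must
    # enclose a nonzero area (i.e. not be collinear/degenerate).
    return len(ring) >= 3 and abs(ring_area(ring)) > 0.0

def is_valid_polygon(polygon):
    """polygon = {'shell': [(x, y), ...], 'holes': [[(x, y), ...], ...]}"""
    if not is_valid_ring(polygon['shell']):
        return False
    return all(is_valid_ring(hole) for hole in polygon.get('holes', []))
```

A full validator would also reject self-intersecting rings and check that each cutout lies inside the shell; a library like shapely (`Polygon(shell, holes).is_valid`) covers those cases.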
Hi @cometbus, you shouldn't need to upgrade; a single pod holds close to a million vectors. It's possible that a previous run of the script did not close out...
##Found Answer## I have the time right now, but I'm not super familiar with the code. Basically, what's the fastest way to get the table meta given the database and...
Here is the function I plan on using: `civis.APIClient.get_table_id`
Initial code where `table_id` is passed from `read_civis` into `read_civis_sql`:

```python
if use_pandas:
    table_meta = client.tables.get(table_id)
    # The split is to handle e.g. DECIMAL(17,9)
    redshift_types = {col['name']: col['sql_type'].split('(')[0]
                      for col...
```
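For reference, here is what the `split('(')[0]` normalization does on mock column metadata. The `cols` list below only mimics the shape of the columns returned by `client.tables.get(table_id)`; the column names are made up:

```python
# Mock of the column metadata structure from the tables endpoint.
cols = [
    {'name': 'amount', 'sql_type': 'DECIMAL(17,9)'},
    {'name': 'label', 'sql_type': 'VARCHAR(256)'},
    {'name': 'n_rows', 'sql_type': 'BIGINT'},
]

# Strip the parameter list so DECIMAL(17,9) -> DECIMAL, VARCHAR(256) -> VARCHAR.
redshift_types = {col['name']: col['sql_type'].split('(')[0] for col in cols}
# {'amount': 'DECIMAL', 'label': 'VARCHAR', 'n_rows': 'BIGINT'}
```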
I think we may need to change the place where `self.input_targets_` is defined in order to use the dataset API
Were these just CPU benchmarks? I'd expect the largest gains to come from GPU. Large batch sizes were probably competing with the model for RAM, which wouldn't be the...
I imagine the dataset API is designed for GPU models, since that is where you are most likely to hit the disk I/O bottleneck you'd want to solve...
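The bottleneck-hiding idea behind the dataset API can be sketched without TensorFlow: overlap disk reads with compute by filling a small buffer on a background thread. The `prefetch` helper below is purely illustrative, not the `tf.data` API:

```python
import queue
import threading

def prefetch(generator, buffer_size=2):
    """Iterate `generator` on a background thread, keeping up to
    `buffer_size` items ready so compute never waits on I/O."""
    q = queue.Queue(maxsize=buffer_size)
    end = object()  # sentinel marking exhaustion

    def worker():
        for item in generator:
            q.put(item)  # blocks when the buffer is full
        q.put(end)

    threading.Thread(target=worker, daemon=True).start()
    while True:
        item = q.get()
        if item is end:
            return
        yield item

# Toy stand-in for batches read from disk.
result = list(prefetch(iter(range(5))))
# [0, 1, 2, 3, 4]
```

In `tf.data` the same overlap comes from chaining `.prefetch(...)` onto the dataset, which keeps the GPU fed while the next batch is read.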
There seems to be an update in TF 1.5, soon to be released. However, there appears to be a workaround in 1.4. I'll experiment sometime in the next week...