Gaurav Sheni
@TalAmuyal Switch to pyspark, which supports Python 3.9: https://pypi.org/project/pyspark/
Full stack trace:
```
ValueError                                Traceback (most recent call last)
in
      7     index='id',
      8     time_index='datetime',
----> 9     accuracy=0.50)

~/lib/python3.6/site-packages/autonormalize/autonormalize.py in auto_entityset(df, accuracy, index, name, time_index)
    133         entityset (ft.EntitySet) : created...
```
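For reference, a minimal reproduction sketch of the call shown in the trace. The `autonormalize` import path follows the module in the traceback, but the CSV filename and the `read_csv` step are placeholders for the attached dataset:

```python
import pandas as pd
from autonormalize import autonormalize

# Placeholder for the attached dataset
df = pd.read_csv("attached_dataset.csv", parse_dates=["datetime"])

# Same call as in the traceback above; fails inside auto_entityset()
es = autonormalize.auto_entityset(
    df,
    index="id",
    time_index="datetime",
    accuracy=0.50,
)
```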
The branch does allow autonormalize to progress further (now 13/13), but with the above (attached) dataset I got the following error: `KeyError: 'Variable: country not found in entity'`
```
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 13/13 [24:53
     10     accuracy=0.50)

~/autonormalize/autonormalize/autonormalize.py in auto_entityset(df, accuracy, index, name, time_index)
    133         entityset (ft.EntitySet) : created entity set
    134         """
--> 135         return make_entityset(df, find_dependencies(df, accuracy, index), name,...
```
Yes, Woodwork supports Dask DataFrames in the current DataTables approach and in the upcoming Accessor implementation. You can see how to use Dask DataFrames [here](https://woodwork.alteryx.com/en/latest/guides/using_woodwork_with_dask_and_koalas.html). The docs also talk...
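As a rough sketch of the DataTables approach with a Dask DataFrame (the column names, toy data, and `transactions` table name are placeholders, and the exact API may differ by Woodwork version):

```python
import dask.dataframe as dd
import pandas as pd
import woodwork as ww

# Toy data for illustration, partitioned into a Dask DataFrame
pdf = pd.DataFrame({
    "id": [0, 1, 2, 3],
    "amount": [10.5, 20.0, 7.25, 3.0],
    "category": ["a", "b", "a", "c"],
})
ddf = dd.from_pandas(pdf, npartitions=2)

# DataTables approach: wrap the Dask DataFrame directly
dt = ww.DataTable(ddf, name="transactions", index="id")

# Inspect the inferred typing information and get the Dask DataFrame back out
print(dt.types)
out_ddf = dt.to_dataframe()
```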
@chukarsten @asniyaz Can we prioritize this and add it to the next EvalML sprint? It is affecting our current work.
@cp2boston or @chukarsten Can we fix the lint error and release notes conflict here?
@tamargrey sounds good. We will be addressing this issue in the upcoming sprint.
@rwedge ice-boxing this for now.
We should make this section easy to read for first-time users: a short, concise section that lets users see how we do inference.