Rafal Wojdyla
I started working on this feature some time ago - I can probably upload what I have right now (it's far from complete). That said, if anyone feels like working...
To start, I used Fig to create a fresh development environment with Python 3.4 on Ubuntu Trusty. Unfortunately, everything broke at the very beginning:
```
pip install -r requirements-dev.txt
```
...
wat!? protobuf 2.6.0? The supported protobuf version for Hadoop is 2.5.0. I guess this ticket requires some investigation - how do you want to tackle this?
True - if I change the dependency to >=2.6.0 on Ubuntu with Python 3.4, I can install all the dev requirements. `setup.py test` doesn't work though, and neither does a simple `snakebite -v`. Due to incompatibility...
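For reference, the dependency change being described would look something like this in `setup.py` - a minimal sketch only; the actual layout of snakebite's `setup.py` and its other requirements are assumptions here:

```python
# Hypothetical excerpt from setup.py - only the protobuf pin is the point.
from setuptools import setup

setup(
    name="snakebite",
    install_requires=[
        # Hadoop ships protobuf 2.5.0, but installing the dev requirements
        # under Python 3.4 only works once the pin is relaxed to >=2.6.0.
        "protobuf>=2.6.0",
    ],
)
```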
Maybe we can swig it: http://www.swig.org/
@jacobtomlinson please correct me if I'm wrong - does dask/distributed#4377 really address this issue? Looking briefly at it, it seems like it's more about the number of Dask processes and thread...
@jacobtomlinson thanks for the prompt response. I understand, but do you think `dask-cloudprovider` could set sensible defaults, given that it has information about the number of Dask workers/threads and the VM types (number of CPUs)?...
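To make the suggestion concrete, here is a sketch of the kind of defaulting logic meant here - the function name, the heuristic, and the 4-threads-per-worker target are all assumptions, not `dask-cloudprovider`'s actual behavior:

```python
import math

def default_worker_layout(n_cpus: int, target_threads_per_worker: int = 4):
    """Split a VM's CPUs into (worker processes, threads per worker).

    Hypothetical heuristic: aim for roughly ``target_threads_per_worker``
    threads per worker process, without oversubscribing the CPUs.
    """
    nprocs = max(1, math.ceil(n_cpus / target_threads_per_worker))
    nthreads = max(1, n_cpus // nprocs)
    return nprocs, nthreads

# e.g. a 16-CPU VM -> (4 processes, 4 threads each)
print(default_worker_layout(16))
```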
To check whether the client is running within GCE, we could use the util from #232.
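For context, GCE detection is commonly done by probing the metadata server; a minimal sketch of what such a util might look like (the util in #232 may be implemented differently):

```python
import requests

def is_gce(timeout: float = 1.0) -> bool:
    """Return True if we appear to be running on Google Compute Engine.

    Probes the well-known metadata server; off GCE the hostname does not
    resolve, so the request raises and we return False.
    """
    try:
        resp = requests.get(
            "http://metadata.google.internal/computeMetadata/v1/",
            headers={"Metadata-Flavor": "Google"},
            timeout=timeout,
        )
        return resp.headers.get("Metadata-Flavor") == "Google"
    except requests.RequestException:
        return False
```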
For those looking for a temp workaround until #153 gets merged, in your sphinx `conf.py` file:

```python
import logging as pylogging
from sphinx.util import logging

# Workaround for https://github.com/agronholm/sphinx-autodoc-typehints/issues/123
# ...
```
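The snippet above is cut off; a hedged completion of where it was likely going - a logging filter attached to the `sphinx_autodoc_typehints` logger that drops the spurious warnings. The filter class name and the matched message prefix are assumptions here; check the linked issue thread for the exact code:

```python
import logging as pylogging
from sphinx.util import logging

# Workaround for https://github.com/agronholm/sphinx-autodoc-typehints/issues/123
class FilterForIssue123(pylogging.Filter):
    def filter(self, record: pylogging.LogRecord) -> bool:
        # Drop only the spurious warnings. The message prefix matched
        # here is an assumption and may need adjusting; consider making
        # the check more specific so real warnings aren't filtered out.
        return not record.getMessage().startswith("Cannot handle as a local function")

# sphinx.util.logging.getLogger returns an adapter; .logger is the
# underlying stdlib logger that the filter must be attached to.
logging.getLogger("sphinx_autodoc_typehints").logger.addFilter(FilterForIssue123())
```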
FYI @jthielen, in case you want to look into this.