Bas Nijholt

Search results: 219 issues by Bas Nijholt

In the server, sending an object takes too long. When many (~1000) clients connect, I don't want the program to slow down. Blocked by https://github.com/cloudpipe/cloudpickle/issues/374. The current implementation is in `client.py`...
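One common way to keep many clients from each triggering a fresh, slow serialization is to pickle the object once on the server and hand every client the cached bytes. A minimal sketch using the stdlib `pickle` (the real server presumably uses `cloudpickle`, and `SerializedCache` is a hypothetical helper, not part of adaptive_scheduler):

```python
import pickle


class SerializedCache:
    """Serialize an object once and reuse the bytes for every client
    (hypothetical helper, not the adaptive_scheduler API)."""

    def __init__(self, obj):
        # Pay the serialization cost a single time, up front.
        self._payload = pickle.dumps(obj)

    @property
    def payload(self) -> bytes:
        # Each connecting client receives the same cached bytes.
        return self._payload


cache = SerializedCache({"learner": list(range(1000))})
# A client deserializes the cached payload on its end.
obj = pickle.loads(cache.payload)
```

This trades a little memory for not re-serializing per connection, which is where the slowdown with ~1000 clients would come from.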

Also allow passing these kwargs:

```python
def save_dataframe(
    fname: str | list[str],
    *,
    format: _DATAFRAME_FORMATS = "parquet",  # noqa: A002
    save_kwargs: dict[str, Any] | None = None,
    expand_dicts: bool...
```

- Show serialization time in live info
- Show serialized function size in info
- Show total size of files (data/df/learners)
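The items above could be computed roughly like this. A sketch using the stdlib `pickle`, `time`, and `pathlib`; the actual live-info code may use cloudpickle, and the field names here are made up:

```python
import pickle
import time
from pathlib import Path


def serialization_info(obj) -> dict:
    # Time one serialization pass and record the payload size,
    # as the live info display could show (field names are made up).
    t0 = time.perf_counter()
    payload = pickle.dumps(obj)
    return {
        "serialization_time": time.perf_counter() - t0,
        "serialized_size": len(payload),
    }


def total_size(folder: str) -> int:
    # Sum of file sizes under a folder (e.g. data/), for the
    # "total size of files" item.
    return sum(p.stat().st_size for p in Path(folder).rglob("*") if p.is_file())


info = serialization_info(list(range(10_000)))
```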

works:

```python
import adaptive_scheduler
from pathos.multiprocessing import ProcessPool

learner = adaptive_scheduler.utils.fname_to_learner("data/offset_0.0__width_0.01.pickle")
ex = ProcessPool()
fut = ex.map(learner.function, [0])
fut
```

We can probably copy the SLURM and PBS setup from https://github.com/dask/dask-jobqueue/blob/master/.travis.yml