Bas Nijholt

503 comments by Bas Nijholt

Thanks for your detailed look! I've been able to make a minimal example where, after creating an `ipyparallel.Client`, the exception is raised with the following code:

```python
from ipyparallel import...
```

I've simplified the above code; actually, just importing `ipyparallel` makes `loky` fail!
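The snippets above are truncated, so here is a hedged sketch of what the reported interaction could look like; `get_reusable_executor` is loky's public entry point, and the claim that the bare import is enough to trigger the failure comes from the comment above:

```python
# Minimal sketch: at the time of the report, merely importing ipyparallel
# was said to break a subsequent, otherwise ordinary, loky call.
import ipyparallel  # the import alone reportedly triggered the failure

from loky import get_reusable_executor

executor = get_reusable_executor(max_workers=2)
print(executor.submit(sum, [1, 2]).result())  # raised an exception back then
```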

@pierreglaser, I am relatively sure that this is an `ipyparallel` problem. It's fixed by https://github.com/ipython/ipyparallel/pull/379, which hasn't made it into a release yet, unfortunately. It does, however, mean that `loky`...

```python
(py38) basnijholt-imac ➜ ~ python
Python 3.8.2 | packaged by conda-forge | (default, Mar 23 2020, 17:55:48)
[Clang 9.0.1 ] on darwin
Type "help", "copyright", "credits" or "license" for...
```

It would be great to make `lru_cache` work. For now, I have fixed it by making a cache that is shared in memory: [docs](https://adaptive-scheduler.readthedocs.io/en/latest/reference/adaptive_scheduler.utils.html#adaptive_scheduler.utils.LRUCachedCallable), [source](https://github.com/basnijholt/adaptive-scheduler/blob/63f12486ad2d319196f0a530df148502bc7628c8/adaptive_scheduler/utils.py#L546-L629).
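The linked source has the real implementation; below is only a simplified sketch of the idea, with illustrative names (`SharedLRUCache` is hypothetical) and FIFO-style eviction for brevity: results live in a `multiprocessing.Manager` dict so that every worker process sees the same cache.

```python
from multiprocessing import Manager


class SharedLRUCache:
    """Wrap a single-argument function with a process-shared cache (sketch)."""

    def __init__(self, function, max_size=128):
        self.function = function
        self.max_size = max_size
        self._manager = Manager()           # keep a reference so proxies stay alive
        self._cache = self._manager.dict()
        self._order = self._manager.list()  # insertion order, used for eviction

    def __call__(self, x):
        if x in self._cache:
            return self._cache[x]
        result = self.function(x)
        if len(self._order) >= self.max_size:
            del self._cache[self._order.pop(0)]  # FIFO eviction; a real LRU
        self._cache[x] = result                  # would also reorder on hits
        self._order.append(x)
        return result
```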

This is because of https://github.com/cloudpipe/cloudpickle/issues/178 and can be reproduced with

```python
import cloudpickle
from functools import lru_cache

@lru_cache
def g(x):
    return x

dump = cloudpickle.dumps(g)
del g
g = cloudpickle.loads(dump)
...
```
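One possible workaround, sketched here under the assumption that the decorated function is what fails to round-trip (this sketch is not from the linked issue): pickle the plain function and re-apply `lru_cache` on the receiving side, accepting that the cache contents themselves are not transferred.

```python
import cloudpickle
from functools import lru_cache


def g(x):
    return x


dump = cloudpickle.dumps(g)  # the undecorated function pickles fine
g_cached = lru_cache(cloudpickle.loads(dump))  # decorate after loading
assert g_cached(3) == 3
```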

@mcg1969, could you take another look at this? Several team members have reported that `conda info --json` takes ≈25 seconds because this package blocks things.
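For reference, a quick way to reproduce that measurement (this snippet assumes `conda` is on `PATH`; the ≈25-second figure comes from the comment above, not from running this):

```python
import subprocess
import time

start = time.monotonic()
subprocess.run(["conda", "info", "--json"], capture_output=True)
print(f"conda info --json took {time.monotonic() - start:.1f} s")
```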

That would be pretty cool indeed :) Thanks for the suggestions on how to save the data; I've been thinking about a good way of saving (and restoring) the learners...

The instructions are inside the file, and the icon is set inside BTT. The instructions aren't too clear, though 😅

Yes, see https://github.com/snoack/flake8-per-file-ignores/blame/master/README.md#L6-L8. @snoack, the fact that this isn't obvious to @v-goncharenko is precisely the reason to archive this 😄