Andrew Wagner

Results 23 comments of Andrew Wagner

I am also spending a lot of time figuring out what the output types of your functions are. It would be nice to have static types for the outputs of:...
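A minimal sketch of what static output types could look like: returning a `NamedTuple` instead of a bare dict or tuple lets both the reader and the type checker see the output shape. The names `DetectionResult` and `detect` here are hypothetical, not from the library in question.

```python
from typing import List, NamedTuple

class DetectionResult(NamedTuple):
    """Hypothetical statically-typed output for a detection function."""
    boxes: List[List[float]]   # [x1, y1, x2, y2] per detection
    scores: List[float]        # one confidence score per box

def detect() -> DetectionResult:
    # Placeholder values purely for illustration.
    return DetectionResult(boxes=[[0.0, 0.0, 1.0, 1.0]], scores=[0.9])

result = detect()
print(result.scores)  # field access is self-documenting, unlike result[1]
```

Because `NamedTuple` is still a tuple, this kind of change can usually be made without breaking callers that unpack positionally.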

pycocotools' dataset abstraction does not provide an API that's ergonomic for non-COCO data, but this library does; that's why I'm trying it out. Make a Data instance for your ground...

First of all, since this is my first issue, thanks for open-sourcing Earthly! It seems really ergonomic for those who already know Docker. I have always hated Docker's cache...

I am hitting it in 3.6.8 in a conda environment. Running `sudo PATH=$PATH PYTHONPATH=$PYTHONPATH /home/awagner/miniconda3/bin/pyflame -o pyflame_profile.txt --threads --trace python3 ../common/bench.py` fails with `Failed to PTRACE_PEEKDATA at 0x55628625c5a8: Input/output error` ...

Following another issue report, things seem to work with 3.6.5 but not 3.6.6+, so this is a Python version issue, not a conda issue. Workaround: > conda install python=3.6.5

There is pyflif: https://github.com/szborows/pyflif/issues It's currently licensed as GPL rather than LGPL like the reference FLIF, though. I filed an issue requesting a license switch so I can consider flif...

Shouldn't something like this be a 1.0 blocker? It seems like you don't want people rolling their own solutions to get around a pain point in something as fundamental as...

Looking at code in pytorch master: https://github.com/pytorch/pytorch/blob/480bb7d35656ac98fec3cabd77b5ef59f9d1a021/torch/jit/annotations.py#L273 It seems like Dict[Tuple[str,str]] should be covered in ~that recursion... but maybe not in: https://github.com/pytorch/pytorch/blob/c480eebf958a306afdb4bcdf15afe89ce6a38731/torch/csrc/jit/python/pybind_utils.h#L273

Update: all the above was with torch.jit.script() which is actually a sort of transpiler, rather than a tracer. I am progressing with torch.jit.trace().
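A small sketch of the distinction, assuming a trivial module: `torch.jit.script` compiles the Python source of `forward` (transpiler-like), while `torch.jit.trace` records the ops actually executed on an example input.

```python
import torch

class AddOne(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + 1

m = AddOne()
scripted = torch.jit.script(m)               # parses the source of forward()
traced = torch.jit.trace(m, torch.zeros(2))  # records ops run on this example input
```

Tracing only captures the path taken for the example input, so data-dependent control flow is one place where the two can diverge.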

Thanks for pointing that out; I'll see if I can adapt my code to use it (I currently inherit from torch.nn.Module). One obstacle might be that my code uses torch.nn.ModuleList....
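For what it's worth, `torch.jit.script` can handle `torch.nn.ModuleList` when the list is iterated in `forward`; a minimal sketch (module names hypothetical):

```python
import torch

class Stack(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        # A ModuleList of submodules, iterated in order in forward().
        self.layers = torch.nn.ModuleList(
            [torch.nn.Linear(4, 4) for _ in range(2)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scripting supports plain iteration over a ModuleList.
        for layer in self.layers:
            x = layer(x)
        return x

scripted = torch.jit.script(Stack())
```

Indexing a `ModuleList` with a runtime-computed index is more restricted under scripting than simple iteration, so code that does that may still need restructuring.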