lizz

136 comments of lizz

How about inferring? One cannot omit samples.

Yeah, one cannot easily generate long tuples, so if we change the interface to `Vector`, things will be easier.

Check out this fork: https://github.com/innerlee/face.evoLVe.PyTorch

## Speed Comparison

### original

```python
In [1]: from PIL import Image
   ...: from detector import detect_faces

In [2]: img = Image.open('../disp/Fig1.png').convert('RGB')

In [3]: %time...
```
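The snippet above is truncated at the `%time` magic, which only works inside IPython. A minimal sketch of the same kind of speed comparison in plain Python, using `time.perf_counter` and a stand-in workload (the `benchmark` helper and the `sum` workload are illustrative, not from the fork — there it would be `detect_faces(img)` on a PIL image):

```python
import time

def benchmark(fn, *args, repeats=10):
    """Average seconds per call over `repeats` runs,
    similar in spirit to IPython's %time / %timeit."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    return (time.perf_counter() - start) / repeats

# Stand-in workload; substitute the function you want to compare,
# e.g. the original vs. the forked detect_faces.
avg = benchmark(sum, range(100_000))
print(f"{avg * 1e3:.3f} ms per call")
```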

There are also lots of code optimizations.

These are purely speed changes; the bottleneck is not model inference.

I use the provided weights for inference. Haven't tried training :shrug:

> ninja -v

Which project is being built? ninja is not used in this repo, btw.

The `.zshrc` in this repo is not a "just works" one; it is provided for reference. Please do not use it to replace your own `.zshrc`.

Side note:

* The dircolors can be installed using `dircolor.sh`
* The thefuck command can be installed manually
* etc. etc.

What's your OS, and is anaconda on the PATH?
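One quick way to check this yourself from Python is via the standard library; a minimal sketch (the exact diagnosis depends on your shell setup, and finding `conda` on `PATH` is only a heuristic for a working anaconda install):

```python
import os
import shutil

# Look for the conda executable anywhere on PATH.
conda_path = shutil.which("conda")
print("conda found at:", conda_path)

# List PATH entries that mention a conda/anaconda install directory.
hits = [p for p in os.environ.get("PATH", "").split(os.pathsep)
        if "conda" in p.lower()]
print("PATH entries mentioning conda:", hits)
```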