Kimmy

37 comments by Kimmy

Is there a reason why the fq language itself couldn't be used to implement custom decoders? For instance, suppose I'm investigating DOOM WAD files with `fq`. These are a collection...
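For context, here's a rough Python sketch (not fq syntax; the function name is made up) of the fixed 12-byte header such a WAD decoder would have to parse before walking the lump directory:

```python
import struct

# Hypothetical illustration of what a custom WAD decoder must understand:
# 4-byte magic ("IWAD" or "PWAD"), lump count, and directory offset.
def read_wad_header(path):
    with open(path, "rb") as f:
        magic, num_lumps, dir_offset = struct.unpack("<4sii", f.read(12))
    assert magic in (b"IWAD", b"PWAD"), "not a WAD file"
    return num_lumps, dir_offset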

All great points. I imagine performance could be quite limiting if you had dozens of jq-like filters probing for file type support, heh :)

This looks like a docopt issue. Adding a blank line in `__doc__` between the `pok (push)` line and the `Options:` section seems to make this issue go away for me.
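For illustration, a minimal sketch of the workaround (the original `__doc__` isn't shown here, so the `pok (push | pull)` pattern is a guess):

```python
"""Pok.

Usage:
  pok (push | pull) <name>

Options:
  -h --help  Show this help.
"""
from docopt import docopt

if __name__ == "__main__":
    # Note the blank line between the usage pattern and "Options:"; without
    # it, docopt can misparse the section boundaries.
    print(docopt(__doc__))
```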

Removing the `break` hides the issue. The [diff between the two generated code versions](https://gist.github.com/gcr/118290c3bd2b49f8719c4741bb3b3a28/revisions) is illuminating. Without the `break`, the flow jumps much farther to `BeforeRet_` and the uninitialized value isn't...

!nim c

```nim
proc foo() =
  for _ in @[1, 3, 5]:
    discard "abcde"[25..
```

Hm. This might be a bug in openresty... I recommend you check with that package, if you're still having issues. (Also sorry, this library might be pretty unsupported. I've honestly...

I see! Thank you -- I wasn't sure how to express this network in the usual Torch way, so I had to use the `nngraph` module. This example makes that...

Hmm. There could be something worth investigating here. One reason for having two convolutional layers was that we could do some dimensionality reduction to add a bottleneck. This forces each...
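The project itself is Torch/Lua, but as a rough analogy, here's what such a two-conv-layer bottleneck looks like in PyTorch (channel sizes invented for illustration):

```python
import torch.nn as nn

# Hypothetical sketch, not the repo's actual architecture: the 1x1 conv
# squeezes the representation down before the 3x3 conv, forming a bottleneck.
bottleneck = nn.Sequential(
    nn.Conv2d(256, 64, kernel_size=1),             # dimensionality reduction
    nn.ReLU(inplace=True),
    nn.Conv2d(64, 256, kernel_size=3, padding=1),  # expand back out
)
```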

Oops! That comment, and the one near line 159, are left over from a broken multi-GPU version. The entire thing runs on a single GPU. You can set the GPU to...

Oh, interesting! I'll add a link to this issue in the README, if you don't mind. What is the 'scale&bias layer'? In Torch, batch normalization layers have learnable `weight` and...
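As a rough PyTorch analogy (the project itself is Torch/Lua), a "scale & bias" corresponds to batch norm's learnable affine parameters:

```python
import torch.nn as nn

# With affine=True (the default), batch norm learns a per-channel scale
# (weight, i.e. gamma) and bias (beta) applied after normalization.
bn = nn.BatchNorm2d(64, affine=True)
print(bn.weight.shape)  # torch.Size([64]) -- the scale (gamma)
print(bn.bias.shape)    # torch.Size([64]) -- the bias  (beta)
```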