Mike J Innes

334 comments by Mike J Innes

1. Yes, definitely. The XLA work initially targets training, but assuming there are deployment tools we can easily compile to, deployment is an easy corollary. Mjolnir is a much...

It's a little clumsy right now, but here's how you can get a graph for the forward pass of a simple model, ready to deploy:

```julia
(xla-test) pkg> add Flux...
```

Hey @ayush1999, this package works a bit better now: it has some basic docs and works on 0.6. So hopefully it's a bit easier to start hacking on.

Having a few examples up on the website would be really great.

I think we just need to add support for the reshape op. It's a bit of a shame that the error messages for missing ops are so bad, but I'm...

Yeah, we need a primitive definition that tells FluxJS how to map Julia convolutions to JS ones; something like [the softmax one](https://github.com/FluxML/FluxJS.jl/blob/6a0464d8a6a3ad42519896652bb1588ded4f6ca4/src/lib.jl#L32).

Yeah, it's pretty easy to merge all return blocks into a final block with a phi node.
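For concreteness, the transform can be sketched in a language-neutral way. The toy Python below is not IRTools' actual API or data layout (the block/phi encodings here are made up for illustration); it just shows the idea: rewrite every returning block to branch to a fresh exit block, and have that exit block return a phi over the incoming values.

```python
# Toy sketch: merge all return blocks into one exit block with a phi.
# NOTE: this is NOT IRTools' API -- the block/phi representation is
# invented for illustration only.

def merge_returns(blocks):
    """blocks: list of dicts {"id": int, "stmts": [...],
    "term": ("return", value) | ("branch", target_id)}.
    Returns a new block list containing exactly one return block."""
    returning = [b for b in blocks if b["term"][0] == "return"]
    if len(returning) <= 1:
        return blocks  # already a single exit; nothing to do

    exit_id = max(b["id"] for b in blocks) + 1
    # The phi records (predecessor block, value) pairs: the merged
    # return value depends on which block control arrived from.
    phi = [(b["id"], b["term"][1]) for b in returning]

    new_blocks = []
    for b in blocks:
        if b["term"][0] == "return":
            b = {**b, "term": ("branch", exit_id)}  # redirect to exit
        new_blocks.append(b)

    new_blocks.append({
        "id": exit_id,
        "stmts": [("phi", phi)],
        "term": ("return", ("phi", phi)),
    })
    return new_blocks

# Two blocks that each return a different value...
blocks = [
    {"id": 1, "stmts": [], "term": ("return", "%x")},
    {"id": 2, "stmts": [], "term": ("return", "%y")},
]
# ...become two branches into one exit block returning a phi.
merged = merge_returns(blocks)
```

After the transform, blocks 1 and 2 both branch to block 3, whose phi selects `%x` or `%y` depending on the predecessor.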

You probably just need to pin IRTools at 0.1, or whatever version is listed in the IRTools manifest. Hopefully @Maaarcocr can update it for recent changes in IRTools at some point.

The use of `@inbounds return ...` [here](https://github.com/JuliaLang/julia/blob/v1.7.3/base/dict.jl) leads to a redundant trailing block in the IR:

```
1: (%1, %2, %3)
  %4 = Base.ht_keyindex(%2, %3)
  %5 = $(Expr(:inbounds, true))
  %6...
```

What you're describing isn't just similar to the Hydra transform, it *is* the Hydra transform, so we should be able to make this easy for you. Taking an example from...