Valentin Churavy
Offsets will still be wrong... We may be able to fix this in Julia proper, but right now cross-compilation from 64-bit to 32-bit is not a supported feature.
With opaque pointers, `gep` will also use byte offsets. We haven't turned on opaque pointers yet, but eventually we must; that might happen in Julia 1.11.
Great idea! I don't think this is currently possible, but I would be excited to receive such a pull-request.
> KA.jl-heavy code probably would benefit much more (unless KA.jl itself assumes Int64)

Currently it does, but we can change that.
Yes, it should be straightforward but will require updating the documentation in many places.
Have you had a chance to look at what KA 0.9 does? It might not be as complete as you want, but right now you can adapt/allocate/synchronize w.r.t. to...
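For context, here is a minimal sketch (not code from this thread) of the backend-generic pattern KA 0.9 enables: allocate on a backend, copy host data over, launch a kernel, and synchronize against that backend. The names `axpy!` and `run_axpy` are made up for illustration.

```julia
using KernelAbstractions

# A simple kernel: y .= a .* x .+ y
@kernel function axpy!(y, a, @Const(x))
    i = @index(Global)
    @inbounds y[i] = a * x[i] + y[i]
end

# Backend-generic driver: `backend` could be CPU(), CUDABackend(), etc.
function run_axpy(backend, a, x_host::Vector{Float32})
    n = length(x_host)
    # Allocate storage on whatever backend was passed in
    x = KernelAbstractions.allocate(backend, Float32, n)
    y = KernelAbstractions.allocate(backend, Float32, n)
    copyto!(x, x_host)
    fill!(y, 0f0)
    # Launch with a workgroup size of 64 over n work items
    axpy!(backend, 64)(y, a, x; ndrange = n)
    # Synchronize w.r.t. the backend before using the result
    KernelAbstractions.synchronize(backend)
    return y
end

y = run_axpy(CPU(), 2f0, rand(Float32, 1024))
```

Backend packages also ship Adapt.jl rules, so host data can be adapted to a backend rather than copied by hand.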
I like the way files are handled with `do...end` blocks, but I don't quite know how that would pan out in terms of memory management, and normally one shouldn't create...
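For reference, this is the shape of that pattern in plain Julia (a hypothetical `with_resource`/`Resource`, not an API from this package): the `do` block scopes the resource's lifetime, so cleanup happens deterministically instead of waiting on the GC.

```julia
# Mirrors how Base's `open(path) do io ... end` works: the `do` block becomes `f`.
struct Resource
    handle::Int
end

acquire() = Resource(rand(Int))                     # stand-in for allocating a native handle
release!(r::Resource) = @info "released" r.handle   # stand-in for freeing it

"""
    with_resource(f)

Create a `Resource`, pass it to `f`, and release it when `f` returns or throws.
"""
function with_resource(f)
    r = acquire()
    try
        return f(r)
    finally
        release!(r)   # deterministic cleanup, independent of garbage collection
    end
end

with_resource() do r
    # use `r` here; it is guaranteed to be released afterwards
    r.handle
end
```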
Which it is: https://github.com/dmlc/MXNet.jl/blob/a2164ae43ab70d8be7708b7dc9974a5a6a360a8e/src/model.jl#L186 I was wondering if it would be possible to use the same executor and _update_ the weights of the model?
As an example of how this can be useful, let's say you want to freeze the first few layers of a network:

```julia
"""
Find nodes that are step...
```
replaced by #534