Michael Abbott


For `norm` you can just write it yourself:

```julia
julia> using LinearAlgebra

julia> A = randn(3, 5);

julia> sqrt.(sum(abs2, A; dims=1)) ≈ mapslices(norm, A, dims=1)
true

julia> sum(abs, A; dims=1) ≈ mapslices(x -> norm(x, 1), A, dims=1)
true
```

Sadly not -- the status when I last looked was that my "better way" seemed not to be called at all, and I wasn't sure why.

Is this quicker than just reshaping and broadcasting, with something like this?

```julia
function kron2d(A, B)
    # line up the axes so broadcasting multiplies every pair of elements
    A4 = reshape(A, 1, size(A,1), 1, size(A,2))
    B4 = reshape(B, size(B,1), 1, size(B,2), 1)
    reshape(A4 .* B4, size(A,1)*size(B,1), size(A,2)*size(B,2))
end
```
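For what it's worth, a quick check that this matches `kron` from the standard library (assuming the final `reshape` line completed above is what the truncated comment contained):

```julia
using LinearAlgebra

A = randn(3, 4); B = randn(2, 5)
kron2d(A, B) ≈ kron(A, B)  # true: the column-major reshape reproduces kron's layout
```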

> I agree - honestly didn't know reshape would work this way.

You may be interested in TensorCast.jl, my package for taking this seriously. It's possible that some broadcasting implementation...
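For instance, here is a sketch of how a Kronecker product might be written with TensorCast's index notation (assuming its `@cast` macro and the convention that the first index inside `(k,i)` varies fastest):

```julia
using TensorCast, LinearAlgebra

A = randn(3, 4); B = randn(2, 5)
# combined output indices (k,i) and (l,j) fuse two axes into one,
# with k and l varying fastest, matching kron's layout
@cast C[(k, i), (l, j)] := A[i, j] * B[k, l]
C ≈ kron(A, B)  # true
```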

Maybe something like OutMacro would also work? Then it would be pretty obvious what `using OutMacro` at the top of some script is doing. [noblock]

This has no description of what it does, and seems to contain one function and zero tests.

I made [a package](https://github.com/mcabbott/TensorSlice.jl) which does this, and some other things. The notation you suggest works, as does the more compact `\` for combining indices:

```julia
using TensorOperations; using TensorSlice
...
```
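As a sketch of that combined-index notation (shown here with TensorCast.jl, the later name of TensorSlice.jl, where `i\j` and `(i,j)` both denote a fused index with the first index varying fastest):

```julia
using TensorCast

A = rand(2, 3)
# fuse the two indices of A into one, column-major (i varies fastest)
@cast v[i\j] := A[i, j]
v == vec(A)  # true
```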

I had a go at making this happen, and have an almost-working implementation which acts on Flux's `TrackedArray`s, and defines gradients for each of the three basic functions, by calling...

I am new to all this too, but am beginning to understand a little of how Flux's model works. The essence of this is to overload functions like `add!(α, A::TrackedArray, ...` so...
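The general shape of that overloading pattern, in Tracker-era Flux, might look roughly like this (a hypothetical sketch: `myscale` and its gradient rule are made up for illustration, not TensorOperations' actual `add!`):

```julia
using Flux.Tracker: TrackedArray, track, data, @grad

# A made-up primitive, standing in for add!/trace!/contract!:
myscale(α, A::AbstractArray) = α .* A

# Intercept tracked inputs: record the call on the tape instead of running it directly.
myscale(α, A::TrackedArray) = track(myscale, α, A)

# Supply the forward value and the pullback for the recorded call.
@grad function myscale(α, A)
    myscale(α, data(A)), Δ -> (nothing, α .* Δ)
end
```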

The issue is that TrackedArrays are immutable. Perhaps one could still cheat... My strategy was instead that `add!` etc. act on the Array which will become `C.data`, but return a...