Kyle Daruwalla
@BioTurboNick reported on Slack that with the default Flux initialization, his model would get stuck in an all-zeros state, but not with the PyTorch init.
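For anyone reproducing this, a minimal sketch of the comparison (using `kaiming_uniform` as a rough stand-in for PyTorch's default scheme, not an exact match):
```julia
using Flux

# Same layer, two initializations: Flux's default glorot_uniform vs a
# PyTorch-style kaiming_uniform. Illustrative only.
d_default = Dense(128 => 64, relu)                               # glorot_uniform
d_kaiming = Dense(128 => 64, relu; init = Flux.kaiming_uniform)  # PyTorch-like
```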
I think that's the only one left
I'm not entirely sure, but I think `@allowscalar` inserts a `try` block? That might help narrow things down.
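For context, `@allowscalar` wraps an expression to permit scalar indexing on GPU arrays, e.g.:
```julia
using CUDA

x = CUDA.rand(10)
# Scalar indexing on a CuArray errors by default; @allowscalar permits it
# for this one expression, which is handy when bisecting where it happens.
v = CUDA.@allowscalar x[1]
```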
I think [these signatures](https://github.com/FluxML/OneHotArrays.jl/blob/d27d037110b704afda170a9b6940b4cb91d4636b/src/array.jl#L65-L76) need to be made more specific to include GPUArray + Int + Vararg{Int} as a combo.
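Roughly what I mean by "more specific", with a hypothetical `f` standing in for whichever function those lines define:
```julia
using GPUArraysCore: AbstractGPUArray

# Sketch: dispatching on the GPU array together with Int + Vararg{Int}
# dims makes this method more specific than a generic AbstractArray one.
f(x::AbstractGPUArray, d::Int, dims::Vararg{Int}) = similar(x, Bool, d, dims...)
```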
I would refer to https://github.com/FluxML/Flux.jl/pull/1448 and the related issues and PRs. In particular, the storage change here reverts to Flux's original implementation. This PR is a bit more...
I have commented on #36 to follow up that work. Unfortunately, I think this PR is going to be a non-starter, since the primary motivation for the current storage is having...
The following is type-stable:
```julia
function ohaxis(data::AbstractArray{T, N}, ::Val{D}) where {T, N, D}
    if D == 1
        return data
    else
        # Lazily move the one-hot axis (dimension 1) to position D.
        perm = (ntuple(i -> i + 1, D - 1)..., 1, ntuple(i -> i + D, N - D)...)
        return PermutedDimsArray(data, perm)
    end
end
```
We can offer the `Val`...
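For example, assuming the sketch above:
```julia
using OneHotArrays

y  = onehotbatch([1, 3, 2], 1:4)  # 4×3 OneHotMatrix, hot axis is dim 1
yt = ohaxis(y, Val(2))            # lazy PermutedDimsArray of size (3, 4)
```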
Yeah, it should check that the one-hot property is preserved or error. So if the first index isn't `Colon`, then it should error. Otherwise, check that the data itself...
I was thinking that `setindex!` with the first index as `Colon` should accept a `v` that is a genuine one-hot vector. We can have a faster path when `v`...
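Concretely, something like this sketch (it leans on the internal `indices` field, so not final code; the size checks and error types are illustrative):
```julia
using OneHotArrays

# Writing a whole column must preserve the one-hot invariant or error.
function Base.setindex!(A::OneHotMatrix, v::AbstractVector{Bool}, ::Colon, j::Integer)
    length(v) == size(A, 1) || throw(DimensionMismatch("column length mismatch"))
    count(v) == 1 || throw(ArgumentError("assigned column must be one-hot"))
    A.indices[j] = findfirst(v)
    return v
end

# Faster path when `v` is itself a OneHotVector: reuse its stored index.
function Base.setindex!(A::OneHotMatrix, v::OneHotVector, ::Colon, j::Integer)
    length(v) == size(A, 1) || throw(DimensionMismatch("column length mismatch"))
    A.indices[j] = v.indices
    return v
end
```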
I am also concerned about fragility. The implementation itself is sensible, but as written it seems like it will need frequent updates to track internal changes. The core idea is...