Lux.jl

Add a faster activation path
- ~~`sigmoid_fast` fails on GPU as pointed out in the NNlib PR. Need to work around that.~~
- ~~Specialize `WrappedFunction` on `Broadcast.BroadcastFunction`~~ --> can't do this; the mutation would violate purity w.r.t. the input/output arrays.
- #570 pulls out the easy part of this (a rough sketch of the fast-activation substitution follows below).
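
For context, a minimal sketch of the fast-activation substitution, assuming NNlib's `fast_act` (which swaps a known activation for its cheaper scalar variant, e.g. `tanh` -> `tanh_fast`, and otherwise returns the function unchanged). The `apply_activation` helper is hypothetical and not Lux's internal API; the array argument is the hook that would let GPU array types opt back out while the `sigmoid_fast` GPU failure above is unresolved.

```julia
using NNlib

# Hypothetical helper, not Lux's actual internals: pick a faster variant of
# the activation (if NNlib knows one) and broadcast it over the input.
function apply_activation(σ, x::AbstractArray)
    σfast = NNlib.fast_act(σ, x)  # e.g. tanh -> tanh_fast; the array argument lets array types override the choice
    return σfast.(x)
end

x = randn(Float32, 4, 8)
apply_activation(tanh, x)       # broadcasts the fast variant
apply_activation(identity, x)   # unknown activations pass through unchanged
```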
 
Remaining Problems:
- [ ] Type Stability failure for DEQs (see the inference-check sketch below)
- [ ] NeuralPDE failure
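
Not DEQ-specific, but a rough sketch of how a type-stability failure like the first item would typically be surfaced: `Test.@inferred` errors when the return type of a layer call cannot be inferred. A plain `Dense` layer stands in here, since the failing DEQ model is not shown in this issue.

```julia
using Lux, Random, Test

rng = Random.default_rng()
model = Dense(2 => 4, tanh)      # stand-in model; the real failure involves a DEQ
ps, st = Lux.setup(rng, model)
x = randn(Float32, 2, 8)

# Throws if the return type of the forward pass is not inferable.
@inferred model(x, ps, st)
```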