flash-linear-attention
Using operators without having `transformers` installed
I'm currently trying to use just the operators defined in `fla.ops`; however, because of the main package's `__init__.py`, it's not possible to do this without also importing things from the HF `transformers` package, which makes the import slower (and broke it entirely until I upgraded the package).
It would be nice if there were a way to just import the operators without the layer modules or anything else.
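For concreteness, here's a rough sketch of one way the top-level `__init__.py` could support this: keep the operator subpackage import unconditional and only import the `transformers`-dependent pieces when that package is actually available. The subpackage names (`layers`, `models`) and the guard are purely illustrative; I haven't checked how the current `__init__.py` is organized:

```python
# fla/__init__.py -- sketch only, assuming the layer/model modules are what
# pull in `transformers`; the names and structure here are illustrative.
import importlib.util

# The operator subpackage only needs torch/triton, so it can always be imported.
from . import ops  # noqa: F401

# Import the transformers-dependent modules only when transformers is installed.
if importlib.util.find_spec("transformers") is not None:
    from . import layers, models  # noqa: F401
```

With something like that, `import fla.ops` (or importing a specific kernel from it) would work in an environment that only has torch/triton installed.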