flash-linear-attention

Using operators without having `transformers` installed

Open · Ronsor opened this issue 1 year ago · 1 comment

I'm currently trying to use just the operators defined in `fla.ops`; however, because of the main package's `__init__.py`, it's not possible to do this without also importing things from the HF `transformers` package, which makes the import slower (and broke it entirely until I upgraded the package).

It would be nice if there were a way to just import the operators without the layer modules or anything else.
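For reference, a minimal sketch of what I have in mind for `fla/__init__.py` (illustrative only, and the `layers`/`models` subpackage names are just assumed here): keep the operator re-exports unconditional and only import the `transformers`-dependent parts when that package is installed.

```python
# fla/__init__.py -- illustrative sketch only, not the actual file
import importlib.util

# The operators depend only on torch/triton, so re-export them unconditionally.
from . import ops  # noqa: F401

# Layer/model classes need `transformers`; import them only when it is available.
if importlib.util.find_spec("transformers") is not None:
    from . import layers, models  # noqa: F401  (subpackage names assumed)
```

With a guard like this, `from fla.ops import ...` would work in an environment without `transformers`, while the layers and models would still be exposed whenever it is installed.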

Ronsor avatar Apr 20 '24 21:04 Ronsor

@Ronsor Hi, may I know why you need this? I think it's hard to use fla without `transformers` anyway. Currently this package is heavily tied to 🤗 transformers, e.g., we use its off-the-shelf utility functions to check Triton versions and raise some warnings, and we also make use of some activations from transformers.
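For reference, the two pieces mentioned above could in principle look roughly like this without transformers (a rough sketch only, not the package's actual code; it assumes `packaging` is installed):

```python
# Rough sketch only -- not what fla actually does.
from importlib.metadata import version

import torch.nn.functional as F
from packaging.version import parse  # assumes `packaging` is installed

def triton_at_least(minimum: str) -> bool:
    """Return True if the installed Triton version is at least `minimum`."""
    try:
        return parse(version("triton")) >= parse(minimum)
    except Exception:
        return False

# Torch-native stand-ins for an ACT2FN-style activation lookup.
ACT2FN = {"relu": F.relu, "gelu": F.gelu, "silu": F.silu, "swish": F.silu}
```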

yzhangcs avatar Apr 21 '24 12:04 yzhangcs

This issue is stale because it has been open for 30 days with no activity.

github-actions[bot] avatar Jun 23 '24 00:06 github-actions[bot]

This issue was closed because it has been inactive for 7 days since being marked as stale.

github-actions[bot] avatar Jun 30 '24 00:06 github-actions[bot]