bion howard

Results 184 comments of bion howard

```sh
> rg 'imp\.' setup.py
70: version=imp.load_source('_metadata', 'bsuite/_metadata.py').__version__,
```

Might be easy to fix, I only see one spot. Workaround: you can clone the repo and edit setup.py

```py
#...
```

https://gist.github.com/bionicles/76775dd1feaf9149132149f0048d820b if anybody wants the code here

# Compiler Diagnostic for "Future Not Send"

This screwed me up today with server-side rendering, regarding [clippy::future_not_send](https://rust-lang.github.io/rust-clippy/master/index.html#future_not_send), because I'm trying to cache the JSONs so I can skip re-building...

Yes, I'm down to help review.

Like this? I didn't check the other nnx notebooks, they might need fixes also, checking ...

didn't see any links in the other notebooks in the docs/nnx folder

Ok, makes sense. I'd still suggest the re-export, because the module is marketed as a "drop-in replacement", so having the aliases could save numerous users a lot of time....
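To illustrate the re-export idea (the names below are hypothetical, not the actual module's API): a package that bills itself as a drop-in replacement can alias its new classes under the legacy names, so existing import statements keep resolving:

```python
# hypothetical sketch, not the actual module's code:
# the new implementation lives under a new name ...
class NnxLinear:
    """Stand-in for a rewritten layer class."""


# ... but re-exporting it under the legacy name means user code
# written against the old API keeps working without any edits
Dense = NnxLinear
```

With an alias like this in the package's `__init__.py`, `from mypkg import Dense` still works even though the class was renamed, which is the time-saver for users migrating.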

Maybe I'm wrong, I thought I saw that somewhere. If it's not a goal then that's fine. Ah, here's where I found it, Stack Overflow: ![image](https://github.com/user-attachments/assets/51509c3e-4c1c-4fe7-89a6-2c72a9d525c7) Looks like...

> It will be significantly slower than fp32

How do we know fp8 would be slower than fp32? Seems like there would be significantly less calculation involved if there were...

[ARM docs](https://developer.arm.com/documentation/102374/0102/Data-processing---floating-point/Support-for-8-bit-and-16-bit-floating-point) [ARM blog](https://community.arm.com/arm-community-blogs/b/announcements/posts/arm-supports-fp8-a-new-8-bit-floating-point-interchange-format-for-neural-network-processing) [FP8 paper](https://arxiv.org/abs/2209.05433) [tons of fp8 instructions in AArch64](https://developer.arm.com/documentation/ddi0602/2025-03/SIMD-FP-Instructions?lang=en) **?**
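For reference, the E4M3 layout from the FP8 paper linked above (1 sign bit, 4 exponent bits with bias 7, 3 mantissa bits; in the "fn" flavor the only NaN is S.1111.111 and there is no infinity) can be sketched as a decode-only toy. Whether fp8 is actually faster than fp32 in practice comes down to whether the hardware has native fp8 instructions like the ARM ones linked; without them, software emulation along these lines would indeed be slower:

```python
# rough decode-only sketch of the E4M3(fn) fp8 format;
# for illustration, not a production implementation
import math


def decode_e4m3(byte: int) -> float:
    """Decode one E4M3(fn) byte to a Python float."""
    sign = -1.0 if (byte >> 7) & 1 else 1.0
    exp = (byte >> 3) & 0xF   # 4 exponent bits, bias 7
    mant = byte & 0x7         # 3 mantissa bits
    if exp == 0xF and mant == 0x7:
        return math.nan                     # the single NaN encoding
    if exp == 0:
        value = (mant / 8.0) * 2.0 ** -6    # subnormals
    else:
        value = (1.0 + mant / 8.0) * 2.0 ** (exp - 7)
    return sign * value
```

Decoding `0x38` gives 1.0, and `0x7E` gives 448.0, the largest normal E4M3(fn) value, which matches the dynamic range described in the FP8 paper.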